IMMERSIVE PHOTOGRAPHY

Origins

Stitching of images and viewing them on digital devices
 
Between 1990 and 1994 the foundations of immersive photography were laid within Apple's Human Interface Group. Dan O'Sullivan created the first panoramas, shot in Paris.

«The first series of panoramic images made with Eric Chen's new QTVR stitching technique. All twenty scenes were shot with sound. I collaborated with Lili Cheng and Michael Chen of the Apple Computer Advanced Graphics group on the development of the software and interface to control these images and synchronize them with sound.
Technical note: the photos were taken on film and then digitized onto Kodak Photo CD. The panoramas were stitched using MPW (Macintosh Programmer's Workshop). The viewing software was written in HyperCard.»

Dan O'Sullivan - June 30, 1992

Following the presentation by Eric Chen (and others) at SIGGRAPH 1995 (QuickTime VR - An Image-Based Approach to Virtual Environment Navigation), Apple brought to market a software package called QuickTime VR Authoring Studio, which includes these three functions:

1) A function for stitching the images and correcting their perspective deformation.
[Image: Eric Chen]
As the image below shows, the source images must necessarily be warped before they can be stitched.
[Image: distortion]
 
2) A function for viewing the stitched image without deformation (a numerical sketch of this reprojection is given a few paragraphs below). The projection is cylindrical; the software is in fact limited to creating panoramas of approximately 360° x 160°.
[Image: Apple QTVR]
The cylindrical panorama and its undistorted rendering on the computer monitor.
 
3) A tool for creating hotspots (interactive areas).
With this feature the user can move from one "node" (a single panorama) to another, shifting the point of view; a minimal sketch of such a node graph is given just below.
[Image: Apple QTVR]
Three nodes connected
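As a purely illustrative aside, the node-and-hotspot structure can be thought of as a small graph. The sketch below (in Python; the node names, angles and field layout are invented for the example and are not the QTVR file format) shows one way to represent three connected nodes:

# Minimal sketch of a hotspot graph: each node is one panorama and each
# hotspot is a direction (yaw/pitch) that jumps to another node.
# Node names and fields are illustrative only, not Apple's format.

nodes = {
    "courtyard": {
        "image": "courtyard.jpg",
        "hotspots": [{"yaw": 90.0, "pitch": 0.0, "target": "hallway"}],
    },
    "hallway": {
        "image": "hallway.jpg",
        "hotspots": [
            {"yaw": -90.0, "pitch": 0.0, "target": "courtyard"},
            {"yaw": 10.0, "pitch": -5.0, "target": "studio"},
        ],
    },
    "studio": {"image": "studio.jpg", "hotspots": []},
}

def follow(node_name, hotspot_index):
    """Return the node reached by clicking a given hotspot."""
    return nodes[node_name]["hotspots"][hotspot_index]["target"]

print(follow("courtyard", 0))  # -> hallway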
The client-side viewing software is included with QuickTime 2.5 and is called QuickTime Virtual Reality, or QTVR.
In April 2001 it was updated to display full spherical panoramas of 360° x 180°: this is called Cubic VR, since the mapping is performed on a cube rather than on a sphere.
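To give an idea of how such a viewer shows a cylindrical panorama "without deformation", here is a minimal numerical sketch in Python: for every pixel of the flat on-screen window it computes the corresponding pixel of the cylindrical image, and panning simply changes the yaw offset. The image sizes, field of view and function names are arbitrary example values, not Apple's actual implementation; Cubic VR applies the same per-pixel remapping idea, but to six rectilinear cube faces instead of a cylinder.

import math

# Sketch of viewing a cylindrical panorama on a flat window: each
# screen pixel is mapped back to a pixel of the cylindrical image.
# Sizes and field of view below are illustrative example values.

PAN_W, PAN_H = 4096, 1200          # cylindrical panorama (360 deg wide)
VIEW_W, VIEW_H = 640, 480          # flat window shown to the user
HFOV = math.radians(60)            # horizontal field of view of the window

F_VIEW = (VIEW_W / 2) / math.tan(HFOV / 2)   # focal length of the view, px
R_CYL = PAN_W / (2 * math.pi)                # cylinder radius, px

def view_to_pano(i, j, view_yaw):
    """Map screen pixel (i, j) to (u, v) in the cylindrical panorama.

    view_yaw is the current panning angle in radians.
    """
    # Ray through the screen pixel, camera looking along +z.
    x = i - VIEW_W / 2
    y = j - VIEW_H / 2
    z = F_VIEW
    # Yaw of the ray and its height on the cylinder (tan of its pitch).
    yaw = view_yaw + math.atan2(x, z)
    height = y / math.hypot(x, z)
    # Wrap the yaw into panorama coordinates.
    u = (yaw / (2 * math.pi)) % 1.0 * PAN_W
    v = PAN_H / 2 + R_CYL * height
    return u, v

# Example: the centre of the window always lands on the panorama column
# pointed at by view_yaw.
print(view_to_pano(VIEW_W / 2, VIEW_H / 2, math.radians(45)))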
 

In 1998 Prof. Helmut Dersch of the University of Furtwangen, Germany, began research to develop open-source software capable of stitching both rectilinear photographs (produced by normal, wide-angle or telephoto lenses) and distorted ones, i.e. those produced by fisheye lenses.
The software package is called Panotools.
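The key difference Panotools has to bridge is the lens projection: a rectilinear lens places a point seen at angle theta from the optical axis at radius f*tan(theta) on the sensor, while an idealized (equidistant) fisheye places it at f*theta. The short Python sketch below compares the two; the focal length and angles are arbitrary example values, and real fisheye lenses deviate somewhat from the ideal model.

import math

# Comparison of the two lens projections Panotools has to reconcile:
# rectilinear r = f*tan(theta), equidistant fisheye r = f*theta.
# The focal length and angles are arbitrary example values.

f = 8.0  # focal length in mm

def rectilinear_radius(theta):
    return f * math.tan(theta)

def fisheye_radius(theta):          # idealised equidistant fisheye
    return f * theta

for deg in (10, 45, 80, 90):
    theta = math.radians(deg)
    rect = ("off the sensor (infinite)" if deg >= 90
            else f"{rectilinear_radius(theta):6.2f} mm")
    print(f"{deg:3d} deg  rectilinear: {rect:>26}   "
          f"fisheye: {fisheye_radius(theta):5.2f} mm")

This is why a single fisheye frame can record a full hemisphere while a rectilinear one cannot, which is also what makes the dual-fisheye cameras mentioned below possible.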
A legal war opened with an American company, IPIX, which claimed patents on the algorithms for warping and stitching images produced by fisheye lenses, in particular for stitching just two fisheye images.
Professor Dersch backed off, but did not give up his research, partly because the patents were registered in the USA and had no force in Europe.
Ford Oxaal, in turn, complained that IPIX had fraudulently patented part of his own work.
IPIX went bankrupt in 2006, and by 2015 manufacturers of some importance, such as the Japanese Ricoh, were putting on the market cameras with two opposing fisheye lenses and internal stitching software.
Evidently any patent restrictions had by then lapsed.

[Image: IPIX]


Prof. Dersch's research also extended to a Java-based viewing engine which, as early as 1999 (two years before QuickTime's Cubic VR), allowed 360° x 180° panoramas to be viewed.

The Panotools image-stitching code was adopted, and enriched with graphical interfaces, by various enthusiast software developers: PTGui, PTMac, Hugin and Easypano are all programs that were initially based on the Panotools stitching code.
More recently, software such as Autopano and Autopano Giga (for high-resolution panoramas) by Kolor appeared (development stopped in 2019).
Further progress was made in image blending, in the automatic detection of control points (needed to identify the overlap areas between two adjacent photos), in the production of panoramic HDR images, and in the merging (enfusing) of multiple exposures.
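As a minimal illustration of the blending step, the sketch below (in Python with NumPy) feathers two already-aligned images across a known horizontal overlap: inside the overlap the left image fades out linearly while the right one fades in. The function name and the toy images are invented for the example; real stitchers use more sophisticated multi-band blending, but the idea of weighting the overlap region is the same.

import numpy as np

# Minimal sketch of linear "feather" blending of two aligned images
# that share `overlap` columns.  Real stitchers use multi-band
# blending, but the weighting idea is the same.

def feather_blend(left, right, overlap):
    """left, right: HxWx3 float arrays; overlap: width of the shared strip."""
    h, w_l, _ = left.shape
    _, w_r, _ = right.shape
    out_w = w_l + w_r - overlap
    out = np.zeros((h, out_w, 3), dtype=left.dtype)

    out[:, : w_l - overlap] = left[:, : w_l - overlap]      # left only
    out[:, w_l:] = right[:, overlap:]                       # right only

    alpha = np.linspace(1.0, 0.0, overlap)[None, :, None]   # fade ramp
    out[:, w_l - overlap : w_l] = (
        alpha * left[:, w_l - overlap :] + (1 - alpha) * right[:, :overlap]
    )
    return out

# Toy usage: two flat-coloured 4x6 "photos" with a 2-pixel overlap.
a = np.full((4, 6, 3), 0.2)
b = np.full((4, 6, 3), 0.8)
print(feather_blend(a, b, overlap=2).shape)   # (4, 10, 3)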
Since the advent of smartphones, around 2007, many apps have been produced for stitching photos into both partial and full 360° panoramas.
Today (2020) many smartphones include built-in software that turns the phone into a potential 360° camera.

As for software for viewing immersive photos on the web, with the rapid decline of QuickTime, from 2005 viewers based on Adobe Flash began to be developed: Flash Panoramas, and then Krpano, Pano2VR, Lucid, Panosalado and Panotour (based on Krpano).

On the Java side, F. Seniore optimized Prof. Dersch's PTViewer applet, while Immervision and Easypano (also in Flash) continued to market Java applets.

Also worth mentioning is Spi-V, software developed by Aldo Hoeben and based on Adobe Shockwave.

Currently, with the arrival on the market of the iPad, the iPhone and Android phones, viewing software has abandoned Adobe Flash (which until 2015 was installed on 99% of computers) and uses HTML5 and WebVR (for display on HMD devices).

Today (2020), with numerous 360° cameras on the market, viewing software has multiplied, and many viewers on the web display 360° stereo panoramas and 360° video.


 
NOTES
We recall Janie Fitzgerald and Scott Highton among the first professional photographers to use Apple's new software.

REFERENCES
- Chen, S. E., QuickTime VR - An Image-Based Approach to Virtual Environment Navigation, Proceedings of SIGGRAPH '95, pp. 29-38, 1995
- Panotools
- Cape, D., A Short History of QuickTime VR, Dorénavant Blog

 


© Toni Garbasso