A dream of the future – and past

Sometime earlier this year, early Spring I think, I had a vivid and detailed dream during a slow waking-up hour. It was the kind of dream which feels rational, not random. I knew what I was doing in it – in control!

This time, I described the dream to my wife and son; he knows a lot about this stuff, and thought it was an accurate dream – it was all possible. Now Sony is about to release the camera I was using in the dream.

Here is the dream.

I am walking across a kind of pier or boardwalk construction at the edge of water. It’s not in Britain. It’s warm and sunny, and it could be in the USA. The boards are raised above what would be the shore, and there are wooden buildings left and right of me. Ahead, I can see the lake water, and boat moorings with a jetty. To the left of me is the largest building, which is a shop or museum; something to visit. There are ornamental shrubs in planters or pots, and there are some notices or signs on the building. To the right, the wooden building is functional; it could be a boat house, a yacht club, or something like that. There are pine woods beyond.

My job is to move to the four corners of this scene, and other positions, taking care to make a complete set of images from a range of camera placements and angles. I’m using a wide-angle lens, and my camera is equipped with GPS which records the exact position and orientation of the camera for every shot.

I do not worry about people in the pictures, because the software will ignore them, nor about the light, though it is a beautiful day anyway. I am taking the pictures for a project and this is paid work. This is actually what I do for a living (in the dream). I am visiting hundreds of the most frequently-photographed places in the world, and producing a set of pictures of each one.

But it’s not what I am doing which is the interesting bit. It’s what I know about it. In the dream, I have all the knowledge about what I am doing that I would have if it was real.

Time and space

Here’s how my pictures are being used. Each set of images with its GPS coordinates is fed into a system which constructs a 3D model of the environment. It is capable of recognising identical elements seen from different angles, and uses the GPS data to help identify them. With two 2D views of a building from different positions, it can use the focus distance and lens angle information to compensate for small inaccuracies in the GPS data, and wireframe the exact design and scale of the structure.
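
Purely as an illustration (the dream software is obviously not a real product), the core geometric step described here – fixing a point on a structure from two views taken at known positions – can be sketched as simple ray triangulation in Python. The camera positions, ray directions and function name below are all invented for the example.

    # Minimal two-view triangulation sketch, not any real system's code.
    # Each camera is reduced to a position (from GPS) and a unit ray towards
    # the same feature, derived from pixel position, focal length and the
    # recorded camera orientation.
    import numpy as np

    def triangulate_midpoint(c1, d1, c2, d2):
        """Return the point closest to both rays c + t*d (d unit length)."""
        c1, d1, c2, d2 = map(np.asarray, (c1, d1, c2, d2))
        # Least-squares ray parameters t1, t2 minimising the gap between rays
        A = np.array([[d1 @ d1, -d1 @ d2],
                      [d1 @ d2, -d2 @ d2]])
        b = np.array([(c2 - c1) @ d1, (c2 - c1) @ d2])
        t1, t2 = np.linalg.solve(A, b)
        return ((c1 + t1 * d1) + (c2 + t2 * d2)) / 2

    # Two camera stations 10 m apart, both sighting the same roof apex
    apex = triangulate_midpoint([0, 0, 1.6], [0.71, 0.71, 0.0],
                                [10, 0, 1.6], [-0.71, 0.71, 0.0])
    print(apex)  # roughly [5, 5, 1.6]

In a real pipeline this step would be repeated for every matched key point and refined by bundle adjustment; the focus distance and lens angle mentioned above would act as extra constraints on the slightly inaccurate GPS positions.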

It identifies textures and objects like foliage, text on signs, clouds, and people. Once my entire set of images from this place has been processed (I am aware they are being transmitted as I take the pictures), new photographs which never existed can be created. A virtual camera can be positioned anywhere within the area I photographed, and my few dozen still images from fixed positions enable a new view to be constructed with complete accuracy.

I’ve used the result (in my dream) and it has incredibly high resolution because of the correlated image information. It’s a bit like Sony’s multi-shot or HDR or panorama technology, but instead of aligning two very similar images, it maps the coincident key points of entirely different views of the same scene. Where a walk-through VR allows viewing all angles from one position, this allows viewing any angle from any position.
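
As a sketch only of the ‘coincident key points’ idea, not of Sony’s or anyone else’s actual technology, today’s feature detectors can find the same corners and edges in two quite different views of a scene. A minimal OpenCV example, with hypothetical file names, might look like this:

    # Illustrative key-point matching between two different views of one scene.
    import cv2

    img1 = cv2.imread("boathouse_from_jetty.jpg", cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread("boathouse_from_shop.jpg", cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(nfeatures=2000)           # detect corner-like key points
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    # Match descriptors and keep only the strongest correspondences
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:100]

    # Each match pairs a pixel in one view with the same physical point in the
    # other; a structure-from-motion solver would turn these into 3D geometry.
    for m in matches[:5]:
        print(kp1[m.queryIdx].pt, "<->", kp2[m.trainIdx].pt)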

And it goes beyond that to add a timeline.

The system I’m working for gathers millions of photographs from people all over the world. I’m photographing these key locations because they are the most photographed in the world. Camera phone images now record GPS data, and also record the date. So (at this future time) do most digital cameras and video cameras.

The system can find images matching every location by trawling the web; from Flickr, Facebook or whatever is out there. It can analyse the images to see whether they actually match the location they appear to be from. For every location, the system gathers as many more pictures as it can find.
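
A rough sketch of that ‘does this found photo really belong here?’ check, assuming the gathered pictures carry GPS tags: compare each photo’s embedded position with the survey point and keep only those within a plausible radius. The coordinates and file names below are invented.

    # Haversine great-circle distance between two lat/lon points, in metres.
    from math import radians, sin, cos, asin, sqrt

    def haversine_m(lat1, lon1, lat2, lon2):
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = (sin((lat2 - lat1) / 2) ** 2
             + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371000 * asin(sqrt(a))

    survey_point = (47.6205, -122.3493)             # hypothetical boardwalk location
    found_photos = [
        ("flickr_1234.jpg", 47.6207, -122.3490),    # a few tens of metres away
        ("facebook_5678.jpg", 47.6580, -122.4010),  # several kilometres away
    ]
    nearby = [name for name, lat, lon in found_photos
              if haversine_m(lat, lon, *survey_point) < 150]
    print(nearby)  # only the first photo passes the 150 m test

The position filter is only the cheap first pass; as the article says, the system would then analyse the image content itself to confirm the match.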

The first result of this is more detail. The second is that the viewer can change the season or weather conditions in which the location is seen. It can be viewed at night, in snow, in rain, at sunset; whatever. My image-set provides the framework, but seasonal changes can be created from the ‘found’ images of the place.

The third result is the timeline. Old photographs of these places have been fed into the system. For some popular spots, it’s possible to track the environment backwards for over 100 years. Trees change size, buildings appear and disappear. By turning on ‘people’ (which the software can remove), the crowds, groups or individuals who were in the scene at any time can be shown. And the 3D environment is still enabled because all the old photographs are co-ordinate mapped to the new information.

I do not have to work all this out in my dream, because I already know it. I am working with this awareness. The entire thing is known to me, without having to think about it. I also know that future pictures captured from the internet will continue to add to the timeline and the ‘people’ function, so in five years’ time the seasons and the visitors to this place can be viewed almost by the minute.

The dark side

Because this is a dream, I do not have to think or rationalise to get this understanding; it was included with the dream. As I wake up, I realise what I have been dreaming and then make an effort to ‘save to memory’. That also kicks in the thinking process.

I start to wonder who was hiring me to do this survey-type photography, because in the dream that is one thing I don’t know. I realise how exciting it is to be able to use this Google Earth or Google Street View type application to view not only any part and any angle of these tourist locations, but any season or time of day, and many past times in their history.

When I describe it to him, Richard suggests it’s probably Microsoft. He likes the collation of web-sourced images covering seasons, and maybe decades of past time. He thinks it is all possible and the core technology exists right now. I should patent it and give it a name!

But there is one thing which I understood just as I was waking up; the system can recognise people. Not just as people to be ‘removed’ from a scene or turned back on; it can recognise faces. The movements of one individual can be reconstructed within that location, and it can use a ‘cloud’ of gathered pictures taken at the same time to do so. This is not just virtual tourism and virtual history. In other locations – not beautiful waterside boardwalk quays – it is surveillance brought to a new level.

Sony A55 and A580

Sony’s new models with built-in GPS are the first cameras which will record the data my dream required. The GPS data is not just the typical latitude and longitude. It also records height above sea level (elevation) and the direction the camera is pointing (orientation). The camera data records the focus distance and point of focus, the angle of view of the lens (focal length), the time, and the measured light level and apparent colour temperature. Maybe in the A55 the spirit level function also records horizon tilt and position.
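
As a hedged illustration of how much of this metadata is already readable, the standard EXIF and GPS blocks of a geotagged JPEG can be inspected with Pillow in Python; the file name is hypothetical and exactly which tags are written varies from camera to camera.

    # Read the EXIF and GPS tags of a geotagged JPEG (illustrative file name).
    from PIL import Image
    from PIL.ExifTags import TAGS, GPSTAGS

    img = Image.open("DSC00001.JPG")
    exif = {TAGS.get(t, t): v for t, v in (img._getexif() or {}).items()}
    gps = {GPSTAGS.get(t, t): v for t, v in exif.get("GPSInfo", {}).items()}

    print(gps.get("GPSLatitude"), gps.get("GPSLatitudeRef"))   # degrees, minutes, seconds
    print(gps.get("GPSAltitude"))                              # height above sea level
    print(gps.get("GPSImgDirection"))                          # compass bearing, if written
    print(exif.get("FocalLength"), exif.get("DateTimeOriginal"))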

OK, the camera I was using in the dream was more like a 5 x 4 on a tripod. But that could be just a dream – like the giant fish which leapt on to the boards and brought the jetty crashing down into the water a second before I woke up…

– David Kilpatrick

5 comments to A dream of the future – and past

  • EarMaster

    Look at some of the Synths that have a higher synthiness and make sure that you are actually looking at a photosynth and not a panorama (in the explore section you can select this on the left side).

    Here is an example that works really well in my opinion: //photosynth.net/view.aspx?cid=e24e7ec8-b6ff-4946-9cf1-26602b9335f5
    You can select “Point cloud” and “Overhead” from the view button on the bottom of the interface (second from the right). This allows you to see how all the images are being arranged corresponding to their content and therefore create a 3D model. You can also hover over each point with your mouse and see that Photosynth even recognized the position of the photographer and his viewing angle. The images in this example are also not taken from a single point.

    This TED talk by Blaise Aguera y Arcas, showing one of the first public releases of Photosynth, gives some details on the technique and capabilities: //www.ted.com/talks/lang/eng/blaise_aguera_y_arcas_demos_photosynth.html

    • admin

      Yes, it’s pretty remarkable but nothing I had seen before, so it’s not a memory of Photosynth – and not all that closely related. In the dream I only had to take four photographs of any one area and nothing looking up or down. Probably more inspired by graphic adventure games on PlayStation or stuff like that. What I was doing was photogrammetric, more like a surveyor’s job than a photographer’s. I guess the equivalent with the Bridges photosynth would be if someone had created a topographical wireframe, and the colour/texture information from the photos was then mapped and wrapped to the wireframe. It sounds a bit silly to say it, but this dream was so clear that I knew what I was doing. I was being very careful to include key points like roof apex, wall corners etc and everything had to be very straight/level. I was also aware that the boardwalk/decking was going to make a good result.

      About this time I had been doing some NEX panoramas and those make you very aware of how the foreground matters, and I was also working on a 3D commission so I had been lining up coincident points in images and experimenting with camera positions. Also, going back a long way to 1979, I was one of the still life studio photographers employed by Nimslo to use the Computrak, and to shoot motordrive 3D aerial sequences from a moving helicopter (and also multishot full frame 35mm with their large cameras). And my site progress photography in the 1980s involved exactly what I was doing in the dream – using an architecturally correct large format or rollfilm camera with a lens covering more than 90 degrees, to photograph set areas from four exact repeated positions (also over a period of time). All those memories plus stuff like:

      //www.360cities.net/profile/daniel-oi

      Probably had more influence than Photosynth or whatever I have previously seen of it.

  • Steve4D

    Very interesting David….. You were very fortunate to be able to recall your dream so vividly and then express it so graphically for the rest of us.

    Some may ask the question: ‘What were you smoking?’ However, in most of what you expressed the technology is already here: GPS, 360 panoramas and the rest is sheer computing power to collect and then generate images in the time and space continuum for a given location. The fact that you mention that you are taking photos of the most frequently photographed places in the world makes all this very logical and even achievable, because there is already a huge database to draw from and it is growing larger every day.

    Maybe the ‘people’ on/off switch thing is revealing, in a Freudian way – representing your desire to have the ability to remove distracting people from both the subject matter and the task at hand, with the mere push of a button?

    The notion that the device being used is 5 x 4 (rather large) must mean that it was a clunky beta version and was not yet fully refined to fit into an A55 sized package.

    You were actually on a ‘PAID’ assignment working for Sony, and this is also precisely when the fish leapt onto the boards and you woke up 🙂

    Fascinating and a good read – Cheers Steve

  • EarMaster

    Well, your dream isn’t so much future as you might think. Check out Photosynth (//photosynth.net/). It’s a technology Microsoft purchased some time ago and created an application which does in fact match photographs of a place together and arrange them in a 3D space corresponding to the part of the place they are showing. It does not create a 3D model of the place but you can navigate through it in a way that feels a lot like a 3D version of the place. You will need Microsoft Silverlight to experience this fascinating technique, but it’s worth it.

    • admin

      Photosynth appears to be a web version of Hugin – 360 degree panorama stitching. I do not know why they use the word 3D. Someone else pointed me to this and I had a look; all I can find are normal Hugin-style stitchers, and not as good as the Hugin examples I have seen in the past, like Daniel Oi’s great examples from Glasgow at night in winter. All the pictures of the place come from one photographer standing in one position and shooting a stitcher. Unless I am missing something? I’ve probably seen Photosynth before as I had an earlier version of Silverlight installed.