With Computational Photography, 95% of what you shoot is no longer captured but generated

Apple Keynote, Sept 10, 2019, Steve Jobs Theater

The iPhone 11 and its « Deep Fusion » mode leave no doubt that photography is now software.

Everybody in techdom has noticed that Apple's Sept 10, 2019 keynote was still deeply focused on the iPhone, and more precisely on what the company calls its « camera system ». The company insisted almost dramatically on the « extraordinary » capabilities of the new product line when it comes to photo and video.

I will focus on the iPhone 11 Pro, which brings the most radical changes (3+ optics are no longer new, though) but also seems to cater, in the keynote at least, to the category of « Pro photographers ». It relies on 3 cameras that boast « exceptional » features in Apple's usual jargon, yet the real revolution is in what you can do with them: 6 pro photographers' examples were displayed on screen as proof. Same with the « totally redesigned » Photos app. Capturing light was one thing; it is now eclipsed by processing pixels.

Still, the most prominent features showcased yesterday were « Night Mode », « Deep Fusion », and the video shooting modes, which all rely on the 2 or 3 cameras and « pre-shoot » before the A13 Bionic chip combines and optimizes everything for final rendering.

Capturing was one thing, it is now eclipsed by processing.

In the case of « Deep Fusion » – which will come, interestingly, as a software update later in November – the details unveiled in the keynote mentioned 8 shots being taken just before the shutter button is pressed, then combined with the main photo and processed by the A13 Bionic chip's neural engine to generate, in near real time, an optimized picture that has been calculated pixel by pixel.
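Apple does not document the pipeline, so the actual Deep Fusion algorithm is unknown; but the general idea of multi-frame fusion can be sketched with a toy example – here a simple per-pixel average of several noisy captures of the same scene, which is only a stand-in for whatever the neural engine really does:

```python
# Toy sketch of multi-frame fusion (NOT Apple's actual Deep Fusion,
# which is unpublished): averaging several noisy captures of the same
# scene pixel by pixel reduces noise in the fused result.
import random

def fuse(frames):
    """Per-pixel mean of equally sized grayscale frames (lists of rows)."""
    height, width = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / len(frames)
             for x in range(width)]
            for y in range(height)]

# Simulate 9 captures (8 pre-shots + the main photo) of a flat gray
# scene whose true value is 128, each corrupted by Gaussian noise.
random.seed(0)
frames = [[[128 + random.gauss(0, 10) for _ in range(4)]
           for _ in range(3)]
          for _ in range(9)]

fused = fuse(frames)  # every fused pixel is now much closer to 128
```

Averaging 9 frames divides the noise standard deviation by 3; the real pipeline presumably does far more (alignment, per-pixel selection, detail transfer), but the noise-reduction intuition is the same.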

Apple is always sparing with figures; the tech Kommentariat was frustrated by the absence of a mAh figure for the battery, and by the very bizarre chart with no axes supporting the idea that the new in-house A13 Bionic chip was the best ever included in any smartphone. Googling deeper across the web does not bring much either: for instance, a similar « 1 trillion operations per second » claim was already made at the previous iPhone XS launch, relating to the previous A12 chip (even if the A13 is supposed to have more transistors, deliver more, and consume less). Setting this aside, simply dividing 1 trillion (as 10^12) by 12 million pixels results in a whopping 83,000 operations per pixel in a second.
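The back-of-the-envelope arithmetic is easy to check:

```python
# Back-of-the-envelope check of the keynote numbers: 1 trillion
# operations per second spread across a 12-megapixel image.
ops_per_second = 10**12          # the « 1 trillion operations » claim
pixels = 12 * 10**6              # a 12-megapixel photo

ops_per_pixel_per_second = ops_per_second / pixels
print(f"{ops_per_pixel_per_second:,.0f}")  # → 83,333
```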

Computational photography delivers post-processed images recreated from pre-processed shots, according to an AI « taste ».

Photography has therefore entered a new era where you take a picture (if the word still has a meaning) first, and adjust parameters such as focus, exposure, and depth of field afterwards. As a result you do not get « one shot » but are offered one optimized computation among tons of others. In the case of « Deep Fusion », what you receive is a post-processed image that has been totally recreated from pre-processed shots, all of this according to an Artificial Intelligence « taste ».

Now (even) « Pro » photographers will start trading freedom of choice for convenience. Pictures have become shots and I start missing my Ilford rolls. Sometimes.

Disclosure: I modestly contributed to this 10 years ago when I joined imsense, bringing single-picture dynamic-range processing to the iPhone, and later helping bring Apple's HDR mode to decency. We honestly just wanted to rebalance the light by recovering « eye-fidelity ».

Apple acquired us – the true story of imsense

After 18 months of strict silence (no oral or written comment on the rumors that circulated on the subject) and 6 additional months of written discretion, I regained my freedom of speech in July 2012, once the 2nd anniversary of Apple's acquisition of imsense had passed.

The main stakes were related to intellectual property, as illustrated over the summer by the mega lawsuit between Apple and Samsung.

Below is the exclusive interview I gave to Olivier Frigara for the 115th edition of his show « On refait le Mac », along with my personal favorite: @SaneBox, a service that automatically sorts out important emails.

Other links:

A Magazine is an iPad that does not work

An amazing video showing how Operating Systems and User Experiences can shape our behavior very early in the process. And a tribute to Steve Jobs.

From the author of this video

« Technology codes our minds, changes our OS. Apple products have done this extensively. The video shows how magazines are now useless and impossible to understand, for digital natives. It shows a real-life clip of a 1-year-old, growing up among touch screens and print. And how the latter becomes irrelevant. Medium is message. Humble tribute to Steve Jobs, by the most important person: a baby. »

Monitoring and Optimizing your iPhoto folder size

I noticed recently that my free disk space had shrunk by several, if not a dozen, GB. The phenomenon shortly followed a smooth upgrade to OS X Lion, which quickly became suspect #1. In vain.

Then I turned towards the other usual suspects, namely caches. For these, a simple restart does most of the cleanup and reclaims several GB if your machine has been up and running for a long while (several days if not weeks).

My attention really turned to iPhoto when I realized that some changes made to pictures within iPhoto were not reflected on my iPad 2: the pics were preserved, but did not show up in their latest versions.

So I opened the iPhoto package (Ctrl-click, Show Package Contents) in my Pictures folder, sorted its content by size, and found that the iPod Photo Cache was 19.6 GB, i.e. 25% of my total iPhoto library size.

This folder holds all the resized versions of the pictures you sync with your various iDevices: in my case, many of the iPhones I had used in my former business lives, along with two iPads and a now old iPod Photo… Even though it sits inside the iPhoto package, the iPod Photo Cache content is actually manipulated by iTunes at each synchronization session.
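To keep an eye on that cache yourself, a few lines of Python can walk the folder and total the file sizes (the path below is just an example of a typical iPhoto Library location; adjust it to yours):

```python
# Measure the size of the iPod Photo Cache (or any folder) by walking
# it and summing the sizes of all regular files.
import os

def folder_size(path):
    """Total size in bytes of all regular files under `path`."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(path):
        for name in filenames:
            full = os.path.join(dirpath, name)
            if os.path.isfile(full):  # skip broken symlinks
                total += os.path.getsize(full)
    return total

# Example path; your iPhoto Library may live elsewhere.
cache = os.path.expanduser(
    "~/Pictures/iPhoto Library/iPod Photo Cache")
if os.path.isdir(cache):
    print(f"iPod Photo Cache: {folder_size(cache) / 1e9:.1f} GB")
```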


So I quit iPhoto, trashed the folder, and went through 2 long sync sessions (iPhone and iPad) as the whole cache had to be regenerated. Now my pics are in sync again, and the new cache folder is 85% smaller.

Tell me about yours.

Delighting Views on the iPad

The original post was published on imphotonow.com at imphotonow.com/2010/01/delighting-views-on-the-ipad; it has been very slightly edited to fit this blog's format. Updates and revisions are annotated accordingly.

So the party is over, and everybody is waking up with the usual mixed feelings, debating whether the wait was, after all, better than the catch. In the particular case of Apple, we have seen the usual comments, ranging from the self-congratulatory « I predicted it »s to disappointment over the inflated expectations built on feature lists. And while this party is over, another wait just begins, with a new array of thoughts about how long the Apple Store queues will be in 60 days.

Among the most interesting comments I could read (I switched off really quickly from the « real time » video coverage, which was both empty and self-flattering), a few key points should, imho, cut through the obfuscatory « missing feature » bashing and the « I want it » enthusiasm.

1. It’s the user experience, stupid

It's definitely not about features. Apple has by now accustomed us to products that are designed by engineers for the mass market, while most of the rest of the industry still sells products designed by mass marketers but, at the end of the day, suited for engineers.
I myself would have loved to see a camera in the device (I even read intelligent comments like « after all, it's just $1 more on the BOM » – but they miss the point), then started to wonder about the practicality of the form factor for such a camera…

Apple's omissions from the feature list may be disappointing, but they are conscious by necessity and have certainly been given lots of thought. Like Mashable's Stan Schroeder, we might miss a big point if we think we would have done this better. Remember, great design is about choice, and about deliberate omissions. Who recalls now that the iPhone lacked copy/paste, MMS support, or GPS for a while?

So I expect all this to be balanced, or offset, by the experiences in browsing, reading, and emailing, as happened with the iPhone thanks to a superior design of each feature and a superb integration across them. Paradoxically, that last part reminds me of what Apple was already trying to achieve in 1994 on the Newton: I was lucky enough to work on the platform at the time and remember very well that despite mid-90s hardware limitations (and probably a form factor error, apparently proven by the Palm Pilot's later success), the NewtonOS was already bringing coherence and even cohesion across applications in order to strengthen the sum of the parts…

2. It’s the industrial approach, stupid

As I previously wrote, Apple products are designed by engineers. The really stunning part of this week's announcement is how Apple is apparently pushing hardware/software integration to the next level. Very few people noted at the time that the iPhone was the first hardware platform deliberately dimensioned to host 3 consecutive major releases of an operating system, or that the unique form factor choice (as opposed to the usual bet-hedging, try-it-all spread from other phone vendors) allowed not only economies of scale, but also predictability for application vendors (anybody who has developed S60 apps is probably smiling at these lines).

The same industrial approach prevails in the use of aluminum, where Apple has gained great design skills through its line of MacBooks, as well as in the casing and shaping, which give the iPad an immediate kinship with the rest of the Apple family.

In the iPad's case, Apple is again playing the one (new) size fits all game, and we should expect this form factor to be maintained across the following revisions. There will be revisions for sure; the whens and the whats will be secrets as always, but we can expect some of the missing items on our shopping lists to be ticked once Apple has adapted the user interface: a camera (probably front-facing, for video calls), charging-mat support, or transparent solar panels…

Meanwhile, the form factor itself will open a vast array of possibilities when (re)designing applications for the iPad, as evidenced by the effort Apple itself undertook for existing applications as well as for the new iWork suite. (Incidentally, iWork on the iPad will provide an interesting answer to the naysayers who have been claiming for the past decade that a good smartphone was one on which you could edit Word documents, Excel spreadsheets, and PowerPoint presentations on the go.)

A few people noted the appearance of Apple's own processor design to power the iPad without draining its battery. The chip must be very optimized to keep the device up for 10 hours, yet very powerful to keep the graphics flowing across such a large screen, and it is all the more difficult to compare with anything else given the still-kicking megahertz myth.