Where Is Computational Photography Going Next?

Computer trickery makes for better snapshots

Key Takeaways

  • Google’s new Pixel 6 cameras have more computer trickery than ever.
  • Finally, non-white people are a priority in Google’s algorithms.
  • You still have to know where to point the camera to get a great shot. 
Four Google Pixel 6 smartphones lined up, face down.


Google's new Pixel 6 phones are cameras with a phone attached

Like the iPhone, Google's new Pixels come in regular and Pro tiers and have huge, unmissable camera bumps around the back. In Google's case, the bump is a bar that stretches across the width of the device. It looks cool and houses an impressive array of new lenses and sensors.

But, also like the iPhone, it's what's inside that counts. Computational photography is taking over photography, but where is it going?

"Those promoting computational (and AI-based) photography have for years promised algorithms and technology that can turn the average photograph into something that a pro photographer would be proud of, but this is still some way off," professional photographer Tim Daniels told Lifewire via email. 

Computational Photography

Computational photography started as a way to massage the awful images from early smartphone cameras into pictures you could look at and enjoy. The tiny lenses and sensors in phones struggled in low light and had trouble capturing intricate detail.

But then dedicated image-processing chips like Apple's Neural Engine, capable of trillions of operations per second, transformed those images. Now we have background-blurring portrait modes; night modes that pull amazing images out of near darkness; "sweater mode," which combines several exposures into one picture with better texture detail; and magic tricks like blink detection, which means half-closed eyes never spoil group shots.
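The multi-frame idea behind night modes is simple to sketch: sensor noise is random from frame to frame, so averaging a burst of aligned frames cancels much of the noise while the scene detail survives. Here's a minimal illustration in Python with NumPy; the frames are synthetic, and a real pipeline would also align the frames and merge them in RAW:

```python
import numpy as np

rng = np.random.default_rng(0)

# A "true" low-light scene: a simple brightness gradient.
scene = np.tile(np.linspace(10, 60, 64), (64, 1))

# Simulate a burst of 8 noisy exposures of the same scene.
frames = [scene + rng.normal(0, 15, scene.shape) for _ in range(8)]

# Averaging N frames cuts random noise by roughly sqrt(N).
stacked = np.mean(frames, axis=0)

noise_single = np.std(frames[0] - scene)
noise_stacked = np.std(stacked - scene)
print(f"single-frame noise: {noise_single:.1f}")
print(f"stacked noise:      {noise_stacked:.1f}")
```

With eight frames, the residual noise drops to roughly a third of a single exposure's, which is why night shots from a burst look so much cleaner than any one frame.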

The beauty of all this trickery is that all you have to do is frame the shot, and the phone delivers a perfect picture every time. On the other hand, photographers don't always want a "perfect" shot. 

"Personally, I can't see computational photography ever having a large market share among hobbyist photographers like myself. We enjoy photography for its own sake—selecting exposure, aperture, framing, and the like—and ceding this to an algorithm would take away a lot of the fun of photography," said Daniels.

The Pixel 6

The cameras inside the new phones are impressive. Both models get wide and ultra-wide cameras, and the Pro adds a 4x telephoto, but the hardware is only part of the story. 

For example, Magic Eraser lets you remove distracting elements from the photo. Not only that, but the camera detects these elements automatically and suggests removal. You just confirm with a tap.

Example of what Magic Eraser can do in a photograph.

Or how about face unblur? If your subject is moving fast in low light, this feature attempts to sharpen their face. It's perfect for indoor snapshots of fast-moving kids, which is to say all kids who aren't asleep. And Motion Mode does the opposite, deliberately blurring moving elements for effect. 
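The streaky look Motion Mode produces can be approximated by smearing pixels along the direction of movement, much like a slow shutter speed would. Below is a hypothetical sketch in Python with NumPy; the real feature segments the moving subject and blurs only that region, whereas this toy version blurs the whole image horizontally:

```python
import numpy as np

def motion_blur_horizontal(image, length=9):
    """Smear each pixel across `length` horizontal neighbors,
    approximating the streaking of a slow shutter speed."""
    kernel = np.ones(length) / length
    # Convolve each row independently; 'same' keeps the image size.
    return np.array([np.convolve(row, kernel, mode="same") for row in image])

# A sharp vertical edge: dark on the left, bright on the right.
img = np.zeros((4, 20))
img[:, 10:] = 100.0

blurred = motion_blur_horizontal(img, length=9)
# The hard edge becomes a gradual ramp -- the "motion streak".
print(blurred[0, 6:15])
```

The design choice worth noting is that blur length maps directly to apparent speed: a longer kernel reads as faster motion, which is why these modes usually expose it as a single intensity slider.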

Perhaps the best feature is the most subtle. Real Tone lets cameras render any skin tone properly. "With Pixel 6, we vastly improved our camera tuning models and algorithms to more accurately highlight the nuances of diverse skin tones," says Google's camera blog.

Google worked with Black, Indigenous, and People of Color (BIPOC) photographers to create the images used to train the algorithms. Given the ethnic bias that has been built into photography since the early film days, this is a big deal. 

Better Pictures, Less Effort

Computational photography seems to have two purposes right now. One is to give you an amazing photo, no matter the conditions. The other is to mimic techniques that often take a lot of knowledge and skill to achieve "by hand" on a regular camera. 

In some ways, this risks making all our photos look the same. On the other hand, take a look at the photos from any camera club's members over the decades, and they're just as full of clichés: the rule of thirds, slow shutter speeds that blur waterfalls, and the almost unshakeable instinct to have people smile in photos. 

Photography examples from the Google Pixel 6.


"Although the Pixel 6 is the next evolution of this technology, it is still in the first stages of computational photography and does not mean that you can take high-quality photos without skill," says Daniels.

For those who prefer to push through these cliches, nothing changes. But for folks who just want great pictures of family, friends, places, and breakfast, computational photography is the best thing ever. Imagine how different your world would be if all those old printed snaps in family albums were as good as the pictures you take with your phone.
