Don’t Replace Your Camera With a Smartphone Just Yet

Real talk about smartphone photography

By Lance Ulanoff, Editor-in-Chief, Lifewire.com
Published July 15, 2019. Updated August 12, 2019, 11:44 AM EDT.

On my Instagram is a gorgeous shot of the waxing gibbous moon. Actually, my Instagram is full of moon shots, illustrating my endless fascination with Earth’s orbiting satellite. When I showed it to a friend, he started and asked if I had used ultra-zoom to capture it. I looked at him for a beat and then realized he was asking about what mythical iPhone camera setting I had used to capture the shot. Even if I tied my iPhone to a SpaceX rocket and remotely triggered a 2X zoom photo, I still couldn’t get this shot (below). I laughed and explained that this stark image was captured with my Sony A6000, a mirrorless, prosumer digital camera fitted with a 200 mm telephoto lens.

It’s not the first time I’ve gotten such a question. Most people I know assume that if you post a photo on Instagram (or even Facebook or Twitter), you took it with the one camera you always have on you: your smartphone camera.

Lifewire / Lance Ulanoff

Get Real

I worry sometimes that I’ve misled people when it comes to smartphone cameras and their photographic capabilities. Don’t get me wrong: Apple, Samsung, and Google have done some stellar mobile photography development. Apple in particular has led the way on lenses, aperture, and image signal processing (ISP). An iPhone XS performs literally trillions of computations on each image. As a result, the iPhone XS punches above its weight when it comes to capturing detail and motion, even in some pretty challenging lighting situations.
Samsung’s Galaxy S9 and Galaxy S10+ are equally able pocket shooters, with strong low-light performance and rich (sometimes over-rich) color saturation. Similarly, Google’s Pixel 3 does a remarkable job of capturing image detail even in the lowest-light situations. In optimal lighting and subject situations, any of these smartphone cameras is capable of poster-worthy photography.

In most other environments, Apple, Google, Samsung, and other smartphone manufacturers perform algorithmic acrobatics (including High Dynamic Range image layering) to combat the very real physical limitations of virtually all smartphone cameras. Smartphones have tiny lenses, but that’s not as much of a limitation as you might think. Yes, my A6000 can use much larger lenses to focus far more light into the body of the camera, but it’s what’s on the receiving end of that light that probably has the greatest impact on overall image quality and photographic capability.

Sensing a Theme

Take a look at any iFixit teardown of your favorite smartphone and you’ll understand just how much work each manufacturer does to squeeze a tremendous amount of powerful technology, plus the battery (often the largest component), into your smartphone. The image sensor, usually with a healthy stack of lenses in front of it, is just one small part of the whole. Keep in mind, too, that today’s smartphones can include as many as five cameras: that’s five sets of lens stacks and sensors.

The main 12 MP sensor in my iPhone XS measures approximately 7 mm x 5.8 mm. My A6000’s APS-C 24-megapixel sensor is, by contrast, 23.5 mm x 15.6 mm, a little smaller than an old-school 35 mm negative. Tiny sensors do not automatically translate into poor image quality, but the closer Samsung, Apple, and others get to SLR-level megapixels on sub-10 mm image sensors, the more pixels they squeeze into a tiny space, making each pixel smaller and smaller.
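That pixel squeeze is easy to put in numbers using the sensor dimensions quoted above. Here’s a quick back-of-the-envelope calculation in Python, assuming square pixels spread evenly across the sensor (the helper name is just illustrative):

```python
import math

def pixel_pitch_um(width_mm, height_mm, megapixels):
    """Approximate pixel pitch (the side length of one pixel, in microns),
    assuming square pixels evenly tiling the sensor area."""
    area_mm2 = width_mm * height_mm
    pixel_count = megapixels * 1_000_000
    return math.sqrt(area_mm2 / pixel_count) * 1000  # mm -> microns

# Sensor figures quoted above:
iphone_xs = pixel_pitch_um(7.0, 5.8, 12)    # iPhone XS main 12 MP sensor
a6000 = pixel_pitch_um(23.5, 15.6, 24)      # Sony A6000 APS-C 24 MP sensor

print(f"iPhone XS pixel pitch: ~{iphone_xs:.1f} microns")  # ~1.8
print(f"A6000 pixel pitch:     ~{a6000:.1f} microns")      # ~3.9
```

By this rough math, each A6000 pixel is about twice as wide as an iPhone XS pixel, so it collects roughly four times the light. That is the physical gap all the computational photography has to paper over.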
Tinier pixels can result in interference and image-quality issues. In addition, these tiny sensors can’t always capture all the information gathered by the lens, essentially cropping images on the fly. To their credit, smartphone manufacturers have developed image-processing tricks that all but obliterate such noise.

The sunset was spectacular, but I had to digitally zoom on my iPhone XS to capture it. Lifewire / Lance Ulanoff

Making Adjustments

I want to be clear: most of the time, I am just like you, happily shooting what I think are sometimes excellent photos with my smartphone. But every once in a while, I kick myself for leaving behind my most capable digital camera.

The other night, we took a stroll on the Jones Beach boardwalk. It was an unusually clear night, with a spectacular sunset on one side of me and, on the other, the nearly full moon producing a lovely reflection on the ocean below. I started taking photos with the iPhone XS. The iPhone lens, like the one on most other smartphone cameras, is a wide-angle lens, which means whatever looked close to me (like the beautiful red sunset and the bright moon) looked like distant, smudgy vistas and featureless orbs on the iPhone display.

I loved how the moon reflected on the water below, but in this iPhone XS shot, it ended up looking like a distant light bulb. Lifewire / Lance Ulanoff

More Control, Please

The moon, which becomes the brightest object in the sky as the sun sets, overmatched my smartphone camera. Apple’s iPhone XS main camera comes with 2X optical zoom, which is generally a godsend, but it wasn’t enough to bring the moon and the ocean horizon close enough to create much of a composition. So I pinched out to go past optical and into digital zoom, knowing that this does more to degrade the image than it does to bring my subject closer. I also tried tapping on the screen to access the basic exposure control, then swiped down until the moon was no longer a blown-out white dot.
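Why does pinching past the optical range degrade the shot? Digital zoom is a center crop that gets upscaled, so real captured detail falls off with the square of the digital-zoom factor. A rough model in Python (a sketch of the general principle, not of Apple’s actual pipeline; the function name is hypothetical):

```python
def effective_megapixels(native_mp, optical_zoom, total_zoom):
    """Rough model of digital zoom: zoom beyond the optical range is a
    center crop that gets upscaled, so the real captured detail shrinks
    with the square of the digital-zoom factor."""
    digital_zoom = max(total_zoom / optical_zoom, 1.0)
    return native_mp / digital_zoom ** 2

# A 12 MP camera with 2X optical zoom, pinched out to 6X total zoom:
print(effective_megapixels(12, 2.0, 6.0))  # only ~1.3 MP of real detail
print(effective_megapixels(12, 2.0, 2.0))  # within optical range: full 12 MP
```

In other words, by 6X total zoom the phone is inventing most of the pixels it shows you, which is why the moon turns into a smudge.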
In the end I couldn’t get the shot I saw with my own naked eyes. Samsung at least includes manual photo controls in its camera app that let you control shutter speed, ISO (speed of the digital film; a higher number collects more light but introduces more grain, a lower number pulls in less light but makes the image crystal clear), and manual focus. On my iPhone I can install apps like Camera + and Moment for more manual controls. The latter company also sells a collection of physical lens add-ons that can further improve your smartphone photographic results. Still none of these tools can rival the precision control possible on a camera like the Sony A6000. In addition to interchangeable lenses (a 200 mm lens is fantastic for optically pulling in distant objects like the moon), they let you adjust every aspect of image capture. To get my best moon shots, I adjust the shutter speed to roughly 125 of a second (the amount of time the shutter is open), turn up the f-stop (the size of the aperture, a tighter one increases focus depth) to f11 or above, keep my ISO low to reduce grain, manually focus by looking through the viewfinder to see my subject in crisp detail, and then I stand as still as possible (if I didn’t bring a tripod) before I take the shot. With the moon a little larger and the sky a bit brighter, I got better iPhone XS results. Lifewire / Lance Ulanoff What's Possible I can do some of these things with my smartphone, but not all. There’s no way to manually control a smartphone’s aperture through each stop. Focus control on the iPhone usually consists of tapping on the screen to bring one subject or another into focus (those aforementioned apps can help with more granular focus). Soon, Samsung, Apple and others will start unveiling their latest smartphones, featuring even better, if not bigger, camera sensors, lenses, and artificial intelligence-infused image processors. The images they’ll show you will be stunning. 
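As an aside for the mathematically inclined: my moon settings (f/11, about 1/125 of a second, low ISO) can be sanity-checked with the standard exposure-value formula, EV = log2(N²/t) − log2(ISO/100). The sketch below just evaluates that formula; the result lands near the old "Looney 11" rule of thumb for photographing a sunlit moon.

```python
import math

def exposure_value(f_stop, shutter_s, iso=100):
    """Exposure value referenced to ISO 100:
    EV = log2(f_stop^2 / shutter_s) - log2(iso / 100)."""
    return math.log2(f_stop ** 2 / shutter_s) - math.log2(iso / 100)

# Moon settings: f/11, 1/125 s, ISO 100
print(f"EV = {exposure_value(11, 1 / 125):.1f}")  # EV = 13.9

# Doubling the ISO buys back exactly one stop of exposure:
print(exposure_value(11, 1 / 125, iso=200))  # exactly one EV lower
```

Note that the formula treats f-stop, shutter speed, and ISO as interchangeable levers, which is exactly the kind of trade-off a phone’s auto mode makes for you and a manual camera lets you make yourself.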
But until these manufacturers can match the sensor sizes, lenses, and setting-control options of DSLRs, they will not surpass prosumer-level camera photography.