Here’s how the Pixel’s AI zoom compares to a real 100x lens


In case you missed it last week among other big news items, Google shipped a phone camera with a zoom feature that uses generative AI. That’s right: the Pixel 10 Pro comes with AI right inside the camera app that cleans up otherwise crappy digital zoom images all the way up to 100x. It’s a what-is-a-photo nightmare, but it’s also pretty good — at least it seems to be. But it’s hard to be completely sure what the thing you’re photographing is supposed to look like when it’s miles away. So I brought in a ringer for some side-by-side comparisons: the Nikon Coolpix P1100.

For those unfamiliar, the P1100 is a massive ultrazoom camera with an equivalent range of 24-3000mm. When you have optics like that, you don't need to do any upscaling like the Pixel 10 Pro does. The camera applies some noise reduction, sharpening, and color adjustments, sure. But it doesn't have to completely guess at what any individual pixel should look like, because it has real information to start with.
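The zoom number on a camera's spec sheet is simply the long end of its equivalent focal range divided by the short end, and for the P1100's 24-3000mm that works out to 125x, every bit of it optical. The math is trivial, but for the record:

```python
# Advertised zoom factor = longest / shortest equivalent focal length.
short_mm, long_mm = 24, 3000           # the P1100's equivalent range
print(f"{long_mm / short_mm:.0f}x")    # -> 125x, all of it optical
```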

Digital zoom, which is what the Pixel 10 Pro uses, is a different story. Upscaling an image 10 or 20 or 100 times without the benefit of optical magnification leaves a lot of gaps to fill in. Algorithms can make pretty good guesses, but they are just that: guesses. The Pixel 10 Pro's Pro Res Zoom makes those guesses with the help of generative AI.
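To see why, here's a minimal sketch of what plain digital zoom has to do, in Python with NumPy. This isn't Google's pipeline, and real cameras use smarter interpolation than the nearest-neighbor repetition below, but every classical upscaler shares the same limit: it can only stretch the pixels it was given.

```python
import numpy as np

def digital_zoom(frame: np.ndarray, factor: int) -> np.ndarray:
    """Crop the center 1/factor of the frame, then stretch it back up."""
    h, w = frame.shape[:2]
    ch, cw = h // factor, w // factor
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = frame[top:top + ch, left:left + cw]
    # Nearest-neighbor upscale: each captured pixel gets smeared across a
    # factor x factor block, so (factor**2 - 1) of every factor**2 output
    # pixels carry no new information.
    return np.repeat(np.repeat(crop, factor, axis=0), factor, axis=1)

# A stand-in for a 12-megapixel sensor readout (random values, not a photo).
frame = np.random.randint(0, 256, size=(3000, 4000, 3), dtype=np.uint8)
zoomed = digital_zoom(frame, 10)   # a "10x" digital zoom
print(zoomed.shape)                # (3000, 4000, 3): full size, 1% real data
```

At 10x, only one output pixel in a hundred comes from the sensor. A generative model fills the rest with plausible invention rather than interpolation, which is why it can render crisp detail and also why it can get that detail wrong. And if we're taking AI zoom photos, what better subject to start with than the moon?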

[Image gallery, 1 of 3: Taken with Pixel 10 Pro at 100x, no AI processing.]

It is asking a lot of a smartphone camera to take a picture of the moon, and Google isn't the first phone maker to bring AI to the fight. The Pro Res Zoom version certainly looks moon-like, but AI gives it a strange, spongy texture that doesn't look quite right, especially compared to the P1100's version.

[Image gallery, 1 of 3: Taken with Pixel 10 Pro at 100x, no AI processing.]

The images above of Lumen Field's exterior were taken from an overlook in downtown Seattle near Pike Place Market, about a mile away. It was a hazy, overcast day, so apologies for the drab images, but they give a better idea of where Pro Res Zoom excels and where it falls down. The AI model makes the numbers on the signs readable and cleans up edges really well, but it basically erases the metal cladding on the side of the building, like overly aggressive noise reduction. And as is so often the case with generative AI, it doesn't know what to do with writing.

[Image gallery, 1 of 3: Taken with Pixel 10 Pro at 100x, no AI processing.]

These photos of Starbucks headquarters, a mile south of Lumen, were taken from the same viewpoint. On a small screen the AI version seems alright, but if you look closely you can see where it turned some lamps into windows and gave the clock on the tower a little Salvador Dalí treatment.

[Image gallery, 1 of 3: Taken with Pixel 10 Pro at 100x, no AI processing.]

On a sunnier day I pointed both cameras at another Seattle landmark. I was about three miles away from the Space Needle and encountered another enemy of long-range photography: heat haze. The AI didn’t quite know what to do with the distorted lines and created Tim Burton’s The Space Needle instead. But you can see that the P1100 didn’t fare much better, what with all the hot atmosphere between the lens and the subject.

[Image gallery, 1 of 3: Taken with Pixel 10 Pro at 100x, no AI processing.]

Heat haze was clearly a problem in this situation, too. I wasn't standing far from the planes at Boeing Field in the images above, but there was a lot of hot asphalt between us churning up heat waves. This, though, is where AI shines. In fact, it might be your only option if you're trying to correct for something as tricky as heat haze.

This is where everything gets complicated. Generative AI has existed in photo editing tools for years now, and it’s extremely useful for things like removing noise from a photo taken with an old DSLR. Heat haze is an even nastier problem; the random distortions and waves are all but impossible to correct with traditional digital photo editing tools. Landscape and wildlife photographers are already embracing AI editing tools that can do things your regular Lightroom sliders can only dream of.

Is it different when AI is inside the camera app, not just in the professional image editor you’d use after the fact? Absolutely. Does Pro Res Zoom get things wrong a lot? Also yes. But this has been an illuminating exercise, and I don’t think this is the last we’ll hear of generative AI being used in the image capture tool itself.
