Samsung has found itself at the centre of the perennial debate around truth in photography — again.
Earlier this week, a Reddit user accused the company of adding extra details to a blurred photograph of the moon using the Space Zoom feature on its flagship phone.
The tech in question has raised eyebrows since it was introduced on the S20 Ultra in 2020. Sceptics have accused Samsung of essentially overlaying pre-stored textures on pics of the moon, including craters and shadows that were never present in the original images. Some have labelled the feature a gimmick, while others have accused the company of deceptive marketing.
The furore indicates that the debate around authenticity in photography is still raging — and may never be put to rest due to the rapid acceleration of AI tech. That’s because AI sensors have become the norm in modern handsets, from Apple’s iPhones to Google’s Pixel devices.
While the tech is useful for those of us who need a helping hand with our snaps — and sometimes produces jaw-dropping results — it also raises thorny questions for purists about accuracy in smartphone photography.
Beautifying TikTok filters and photoshopped pics are one thing, but are the powerful sensors on our phones just spitting out fake images? To find out, let’s dig into the debate.
How does Samsung Space Zoom work?
Samsung says the 100x Space Zoom feature on its Galaxy Ultra phones can capture crisp, detailed images of the moon. Exactly how it achieves this feat remains unclear.
To paraphrase the company’s official statement, when you take a photo of the moon, its AI tech recognises it as the main object in the frame. The phone takes multiple shots for multi-frame composition, after which the AI enhances the detail, image quality and colours.
All of this is done using tech with fancy names such as “scene optimiser,” “super resolution,” and a “digital enhancement engine”.
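The multi-frame composition step Samsung describes is not spelled out, but the general technique is well known: capture several aligned frames of the same scene and combine them so that random sensor noise cancels out while genuine detail, present in every frame, survives. The sketch below is a deliberately simplified illustration of that idea using plain averaging — the function name and the toy 4x4 "scene" are invented for the example, not anything from Samsung.

```python
import random

def multi_frame_composite(frames):
    """Average several aligned frames pixel-by-pixel.

    A crude stand-in for multi-frame composition: averaging
    suppresses random per-frame sensor noise, while detail that
    appears in every frame is preserved.
    """
    n = len(frames)
    height, width = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / n for x in range(width)]
            for y in range(height)]

# Simulate five noisy captures of the same 4x4 scene.
random.seed(0)
scene = [[(x + y) * 10 for x in range(4)] for y in range(4)]
frames = [[[px + random.gauss(0, 8) for px in row] for row in scene]
          for _ in range(5)]

stacked = multi_frame_composite(frames)

# Mean absolute error against the true scene shrinks after stacking.
err_single = sum(abs(frames[0][y][x] - scene[y][x])
                 for y in range(4) for x in range(4)) / 16
err_stacked = sum(abs(stacked[y][x] - scene[y][x])
                  for y in range(4) for x in range(4)) / 16
print(err_single, err_stacked)
```

Crucially, this kind of stacking can only recover detail that was actually captured in at least some of the frames — a distinction that matters in the controversy below.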
Most importantly in this case, Samsung says “it does not apply any image overlaying to the photo”. In a possible nod to the latest allegations, the company adds that it “continues to improve Scene Optimizer to reduce any potential confusion that may occur between the act of taking a picture of the real moon and an image of the moon.”
Although the scene optimisation feature is automatic, you can turn it off by tapping the moving icon on the screen when you use the camera. Of course, you should do this only if you don’t want the tech to enhance your photos.
The Samsung Space Zoom controversy explained
The latest chapter in the Samsung Space Zoom controversy involves a viral Reddit post in which a user accused the firm of faking its moon shots. To back up the claim, the user known as u/ibreakphotos created an intentionally blurry photo of the moon, displayed it on a computer, and then snapped a pic of it with a Samsung Galaxy S23 Ultra.
As you can see in the side-by-side pics above, the phone appears to have added extra details and textures to what was originally a “blurry mess,” according to the Reddit poster. “Samsung is using AI/ML (neural network trained on 100s of images of the moon) to recover/add the texture of the moon on your moon pictures, and while some think that’s your camera’s capability, it’s actually not,” the user said.
“And it’s not sharpening, it’s not adding detail from multiple frames because in this experiment, all the frames contain the same amount of detail.”
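The logic of the experiment is worth unpacking: blurring discards fine detail irreversibly, so if a phone shows crisp craters where the source image had none, that detail must have come from somewhere other than the optics. The sketch below illustrates the "strip the detail out first" step with a simple box blur (the Reddit poster used a Gaussian blur; the checkerboard "crater" pattern and the detail metric are invented for this illustration).

```python
def box_blur(img, radius=1):
    """Blur by averaging each pixel with its neighbours,
    destroying the fine detail in the source image."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[ny][nx]
                    for ny in range(max(0, y - radius), min(h, y + radius + 1))
                    for nx in range(max(0, x - radius), min(w, x + radius + 1))]
            out[y][x] = sum(vals) / len(vals)
    return out

def detail(img):
    """Rough detail metric: mean absolute difference between
    horizontally adjacent pixels."""
    return sum(abs(row[x + 1] - row[x]) for row in img
               for x in range(len(row) - 1)) / (len(img) * (len(img[0]) - 1))

# A tiny high-contrast stand-in for a cratered moon surface.
moon = [[255 if (x + y) % 2 else 0 for x in range(8)] for y in range(8)]
blurred = box_blur(moon)
print(detail(moon), detail(blurred))
```

Because every frame the phone captured of the on-screen image contained only this reduced level of detail, any crisp texture in the output could not have come from sharpening or frame stacking alone — which is the poster’s point.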
YouTube star MKBHD — whose original clip of the Space Zoom feature wowed Twitter (and its owner Elon Musk) — followed the same steps. He noticed “sharp details” in the Samsung pic that weren’t in the original source image.
But he concluded that the results point to the broader trend with smartphone cameras, which is that they “don’t deliver reality” as much as “a computer’s interpretation of what it thinks you’d like reality to look like”.
How is AI used in smartphones?
If you’ve taken a photo with a smartphone in the past few years, the chances are some AI magic has been sprinkled on top of your pic. The latest handsets all come with tech that can spruce up your images, from iPhones that combine multiple pics with computer vision to deliver better photos in low-light conditions, to Google Pixel phones that can unblur people’s faces.
But there is still arguably a considerable difference between what these phones are doing and what Samsung’s Space Zoom function achieves, says the Evening Standard’s resident photography buff, Barry Collins. Specifically, Samsung’s tech appears to be adding in details that were never present in an image in the first place, presumably from a library of existing photos of the moon.
“Does that cross the line? That’s debatable,” Collins said. “But my view is that if they’re adding extra elements to photos that did not appear in the original shot, the consumer should be warned.”
In that sense, Space Zoom is akin to features like sky replacement offered by photo-editing apps, Collins added. Both appear to take “elements that weren’t in the original shot and patch them in,” he said.
Beyond debates over authenticity, the AI feature could also pose issues around copyright. “If Samsung is (and I have no evidence that it is) inserting other people’s work into your own, then what happens if you republish that photo online as your own work? It all gets very murky,” Collins said.