For years, Samsung’s Space Zoom-enabled phones have been known for their ability to take incredibly detailed photos of the moon. But a recent Reddit post made clear just how much computer processing the company does, and given the evidence provided, it feels like we should go ahead and say it: Samsung’s images of the moon are fake.
But what exactly does “fake” mean here? It’s a tricky question that becomes increasingly important and complex as more computational techniques are incorporated into the photographic process. We can say with certainty that our understanding of what makes a photograph fake will soon change, just as it has in the past to accommodate digital cameras, Photoshop, Instagram filters, and more. But let’s stick with Samsung and the moon for now.
The test of Samsung’s phones by Reddit user u/ibreakphotos was brilliant in its simplicity. They created an intentionally blurred photo of the moon, displayed it on a computer screen, and then photographed that screen with a Samsung S23 Ultra. As you can see below, the image on the screen showed no detail at all, but the resulting picture was a sharp and clear “photo” of the moon. The S23 Ultra added detail that simply wasn’t there before. There was no upscaling of blurry pixels and no retrieval of seemingly lost data. There was just a new moon, a fake one.
Here is the blurred image of the moon that was used:
A GIF of the recording process:
And the resulting “photo”:
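For anyone who wants to repeat the experiment, the blurring step is easy to reproduce. Here is a minimal sketch using Pillow; the synthetic stand-in image, the 170×170 size, and the blur radius are illustrative assumptions, not the exact parameters of the original test:

```python
from PIL import Image, ImageDraw, ImageFilter

# Build a stand-in "moon": a bright disc with some hard-edged detail.
# (In the real test, this would be a high-resolution moon photograph.)
img = Image.new("L", (1024, 1024), 0)
draw = ImageDraw.Draw(img)
draw.ellipse((112, 112, 912, 912), fill=200)
for x in range(200, 800, 40):  # fake "craters": dark stripes
    draw.rectangle((x, 400, x + 8, 600), fill=60)

# Step 1 of the test: downscale hard, then Gaussian-blur, so that no
# crater-level detail survives in the image shown on the monitor.
small = img.resize((170, 170), Image.LANCZOS)
blurred = small.filter(ImageFilter.GaussianBlur(radius=4))

# Display this full-screen, then photograph the monitor with the phone.
blurred.save("moon_blurred.png")
```

Any sharp crater detail in the phone’s output then cannot have come from the scene; the scene no longer contains it.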
This is not a new controversy. People have been asking questions about Samsung’s moon photography since the company unveiled a 100x “Space Zoom” feature in its S20 Ultra in 2020. Some have accused the company of simply copying and pasting pre-saved textures onto images of the moon to create its photos, but Samsung says the process is more complicated than that.
In 2021, Input Mag published a lengthy feature on the “fake detailed moon photos” captured by the Galaxy S21 Ultra. Samsung told the publication that “no image overlays or texture effects are applied when taking a photo”; instead, the company uses AI to detect the moon’s presence and “then offers a detail-enhancing function by reducing blurs and noises.”
The company offered a bit more information in a later blog post (translated from Korean by Google). But the heart of the explanation, the crucial step that takes us from a photo of a blurry moon to a photo of a sharp one, is described only in vague terms. Samsung simply says it uses a “detail enhancement engine feature” to “effectively remove noise and maximize the details of the moon to complete a bright and clear picture of the moon.” What does that mean? We just don’t know.
A “detail enhancement engine feature” is responsible for all this.
The generous interpretation is that Samsung’s process captures blurry details in the original photo and then uses AI to upscale them. This is an established technique with its own problems (see: Xerox copiers changing numbers when copying blurry originals), and I don’t think the resulting photo would count as fake. But as the Reddit tests show, Samsung’s process is more intrusive than that: it doesn’t just sharpen blurry details, it creates them. At this point, I think most people would agree that the resulting image is, for better or worse, fake.
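The difference between the two interpretations is easy to demonstrate: classical upscaling only interpolates between the pixels it is given and cannot recover detail that was never captured. A quick sketch with Pillow (the synthetic test pattern and all sizes are illustrative):

```python
from PIL import Image, ImageDraw

# A detailed "ground truth": fine vertical stripes standing in for
# crater-level moon detail.
truth = Image.new("L", (512, 512), 0)
draw = ImageDraw.Draw(truth)
for x in range(0, 512, 8):
    draw.rectangle((x, 0, x + 3, 511), fill=255)

# Simulate a blurry, low-resolution capture of that scene.
capture = truth.resize((32, 32), Image.LANCZOS)

# Classical upscaling: interpolate back up to full size.
upscaled = capture.resize((512, 512), Image.BICUBIC)

# The stripes do not come back. The 32x32 capture no longer contains
# them, and bicubic interpolation cannot invent information. An AI
# "detail engine" that paints sharp craters here is generating content,
# not restoring it.
```

The upscaled result is a smooth gray field, not the original stripes; anything sharper than that has to be synthesized from outside knowledge.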
The difficulty here is that “fake” is a spectrum rather than a binary. (As are all the categories we use to classify the world.) For photography, the standard of authenticity is usually defined by the information received by an optical sensor: the light captured when the photograph is taken. You can then edit this information fairly extensively, the way professional photographers tweak RAW images by adjusting color, exposure, contrast, and so on, and the end result still isn’t fake. In this particular case, however, the moon images captured by the Samsung phone appear to be the product of a computational process rather than of optical data alone. In other words, they are more generated images than photographs.
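For contrast, here is what editing inside the “not fake” zone looks like: remapping values the sensor actually recorded, without adding content. A sketch with Pillow (the flat gray stand-in image and the enhancement factors are arbitrary):

```python
from PIL import Image, ImageEnhance

# Stand-in for a captured frame (a real workflow would start from a RAW
# or JPEG file straight off the sensor).
photo = Image.new("RGB", (64, 64), (90, 100, 110))

# These operations remap pixel values that were actually captured;
# they change tone and punch, but invent nothing new in the scene.
brighter = ImageEnhance.Brightness(photo).enhance(1.3)  # roughly +exposure
punchier = ImageEnhance.Contrast(brighter).enhance(1.2)
warmer = ImageEnhance.Color(punchier).enhance(1.1)      # saturation tweak
```

Every output pixel here is a function of the corresponding captured pixel, which is what keeps such edits on the “real” side of the spectrum.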
Some may disagree with this definition, and that’s fine. The distinction will also become much harder to draw in the future. As manufacturers use computational techniques to overcome the limitations of smartphones’ small camera sensors, the mix of “optically captured” and “software-generated” data in their output keeps shifting. We are certainly heading toward a future in which techniques like Samsung’s “detail enhancement engine” become common and widespread. You could train such engines on all kinds of data, e.g. on the faces of your family and friends so you never take a bad picture of them, or on famous landmarks to enhance your vacation snaps. In time, we will likely forget that we ever labeled such images fake.
Samsung says “no image overlays or texture effects are applied when taking a photo”
But for now, Samsung’s moon images stand out, and I think that’s because they are a particularly convenient application for this type of computational photography. First, moon photography is visually simple: the moon looks more or less the same in every picture taken from Earth (ignoring libration and rotational differences), and while it has surface detail, it lacks depth. That makes it relatively easy to apply AI enhancements to. Second, moon photography is marketing catnip, because a) everyone knows phones take bad pictures of the moon and b) everyone can test the feature for themselves. That has made it easy for Samsung to showcase the photographic prowess of its phones. Just look at this ad for the S23 Ultra, with a moon zoom at the 11-second mark:
It’s this viral appeal that has landed the company in trouble. Without properly explaining the feature, Samsung has let many people mistake its AI-enhanced images for the product of a physics-defying optical zoom that couldn’t possibly fit in a smartphone. That, in turn, has prompted others to debunk the images (because the tech world loves a scandal). Samsung never quite claims that its moon shots are representative of all its zoom photography, but a consumer could be forgiven for thinking so, so it’s worth stressing what’s really going on.
Ultimately, photography is changing, and with it our understanding of what constitutes a “real photo.” For now, though, it seems fair to conclude that Samsung’s moon photos are more fake than real, even if that may no longer be true in a few years. Samsung did not immediately respond to The Verge’s request for comment, but we’ll update this article if the company gets back to us. In the meantime, if you want to take an unembellished photo of the moon with your Samsung device, just turn off the Scene Optimizer feature and get ready to capture a blurry circle in the sky.