Is the Moon fake? Some say yes
The Galaxy S23 Ultra uses AI for better Moon photos, and some say that makes them fake.
There’s a debate breaking out online over whether these photos of the Moon are real. No, I’m not talking about these photos specifically, but all of the well-detailed, close-up photos of the Moon shot on Samsung’s smartphones.
Over the years, Samsung has released a handful of smartphones with a feature called Space Zoom. This, combined with the company’s various camera processing algorithms, has allowed owners of the Galaxy S21 Ultra, S22 Ultra, and S23 Ultra to capture photos like the ones you see above with an amazing glimpse at the big rock in the sky we see every night.
Only it’s not an accurate representation of what the camera sensor sees. It’s what Samsung wants you to see.
You see, the physics of smartphone cameras can only take their quality so far. You can’t strap a DSLR sensor to the back of a phone, and it’s not a good idea to include a camera that can physically zoom in and out (although it’s been tried before). Instead, companies are left with periscope lenses, 1-inch sensors, and super high megapixel counts to improve your photos. It’s the only way manufacturers can evolve the mobile photography experience given the constraints of the form factor.
Samsung has been at the forefront of this trend, bringing all of those features and more to its Ultra series that launched in 2020. But hardware is only part of the story when it comes to photo quality; without decent software processing, the photos these tiny smartphone cameras take wouldn’t look all that good.
Each company has its own processing style, with some more zealous than others about how true to life their images are. Admittedly, Samsung’s phones have always strayed from a natural color profile toward one that’s a lot more saturated and vibrant than the scene looked in real life. That’s not really a bad thing - whether that’s better than a more natural image is a subjective call.
Sometimes, Samsung takes its processing a little too far, like when it boosts saturation for flowers and completely destroys the red hues, or when you’re taking a picture of your breakfast and the most vibrant colors are pushed to unhealthy levels. In spite of all those quirks, though, I’m not sure Samsung has ever gone further than it has with processing photos of the Moon.
Over the weekend, Reddit user u/ibreakphotos discovered that Samsung applies an aggressive amount of AI processing to pictures of the Moon taken on the S23 Ultra. To determine this, the Redditor took a picture of a 170x170 image of the Moon displayed on a computer monitor - upscaled by 4x to fill the screen - from the opposite side of a room in their home. The resulting photo packs in details and sharpening that simply didn’t exist in the heavily downscaled source image.
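If you want to recreate that test yourself, the image prep takes only a few lines of Python with Pillow. The file names below are placeholders, but the idea is exactly what the Redditor described: shrink a detailed Moon photo down to 170x170, then blow it back up so whatever ends up on your monitor genuinely contains no fine detail.

```python
from PIL import Image

# Start from a reasonably detailed photo of the Moon (file names are placeholders).
src = Image.open("moon_source.png").convert("L")

# Shrink it to 170x170 so almost all fine surface detail is thrown away...
small = src.resize((170, 170), Image.LANCZOS)

# ...then blow it back up 4x (680x680) to fill more of the monitor. Upscaling
# can't invent detail, so the displayed image stays soft and blurry.
displayed = small.resize((680, 680), Image.BILINEAR)
displayed.save("moon_displayed.png")

# Put moon_displayed.png full screen, walk across the room, and shoot it with
# Space Zoom. Any crisp craters in the phone's photo weren't on the monitor.
```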
Here’s a comparison between the original 170x170 image and the photo they took on the S23 Ultra.
As you can see, there’s an incredibly stark difference in quality. The photo on the right has been given a red hue, gobs of extra detail, and an astonishing level of clarity.
This is what’s leading people to believe that these photos are nothing more than fakes. Samsung’s aggressive AI tuning results in photos of the Moon that look anything but true to what the S23 Ultra’s cameras can actually see.
The way it works is quite interesting. It looks for the Moon in a sea of darkness, which usually means it’s nighttime and you’ve zoomed in. It spots the glowing white light in the sky, reduces the exposure, and does its best to focus on the object. From there, it uses a database of Moon textures and details to improve the clarity of the shot. As long as there’s nothing else in the frame (external lights, reflections, etc.), it can focus on the Moon pretty easily to match up its details with what the AI process wants to do.
It’s an easy thing to accomplish, oddly enough. Since we only ever see one side of the Moon from Earth, Samsung can tune its algorithm to look for the same craters and imperfections on the Moon’s surface, accounting for slight differences in angle.
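Samsung hasn’t published how any of this actually works, so treat the following as nothing more than a toy sketch of the idea described above, written with OpenCV. Every threshold, blend weight, and file name is my own assumption; the point is simply how little it takes to pick out a bright disc in a dark frame and lean on a stored reference of the near side.

```python
import cv2
import numpy as np

# A zoomed-in night shot: one bright object surrounded by darkness (assumed file name).
frame = cv2.imread("zoomed_night_shot.jpg", cv2.IMREAD_GRAYSCALE)

# Step 1: find the bright blob. We assume the Moon is the only bright thing in frame.
_, mask = cv2.threshold(frame, 200, 255, cv2.THRESH_BINARY)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
moon_contour = max(contours, key=cv2.contourArea)
x, y, w, h = cv2.boundingRect(moon_contour)
moon_crop = frame[y:y + h, x:x + w]

# Step 2: "reduce the exposure." On a real phone this happens at capture time;
# here we just pull the highlights down so crater edges aren't blown out.
toned_down = cv2.convertScaleAbs(moon_crop, alpha=0.8, beta=0)

# Step 3: blend in detail from a stored reference of the near side. Because we
# always see the same face, a single roughly aligned texture goes a long way.
reference = cv2.imread("near_side_reference.png", cv2.IMREAD_GRAYSCALE)
reference = cv2.resize(reference, (w, h))
enhanced = cv2.addWeighted(toned_down, 0.6, reference, 0.4, 0)

cv2.imwrite("enhanced_moon.jpg", enhanced)
```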
The results are nothing short of amazing, albeit pretty misleading. Recently, we had a low-hanging blood Moon in my area, so I grabbed my S23 Ultra and headed outside to capture it. Here’s a picture using 30x zoom and one using 10x zoom, keeping a lot of Atlantic City still in the frame.
Clearly, there’s some AI processing applied to the Moon in the second photo, but it had a harder time balancing it with the lights of AC and the reflections on the water.
Still, this is a huge amount of detail for a smartphone camera to capture, and it’s clear that Samsung’s artificial processing works. But here’s the question: is any of this a big deal? Is it worth the outrage? Should we all be calling the S23 Ultra’s camera a fraud?
I don’t think so. For one, this isn’t a new controversy. The S21 Ultra from 2021 was accused by Input Mag of simply replacing objects in the sky it thought were the Moon with pre-existing images of the Moon to boost their detail. Samsung denied this, saying that there wasn’t any “image overlaying or texture effects” being applied to the final image.
However, it did admit that it uses “a detail enhancing function” that reduces blur and noise in the resulting photograph. This lines up with Samsung’s blog post detailing how it handles Moon photography. In it, the company admits to using the technology since the days of the Galaxy S10 series, applying “learned data,” “multi-frame synthesis,” and “deep learning-based AI technology” to make each photo of the Moon look a little better. The post also mentions that obstructions like clouds can stop the processing from kicking in, and that shooting under a still-bright early evening sky will cause the camera to render the sky pitch black in order to focus solely on the Moon.
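“Multi-frame synthesis” sounds exotic, but at its simplest it’s just stacking a burst of frames so random noise averages out. The sketch below is a generic version of that idea, not Samsung’s pipeline; it assumes the frames are already aligned, uses placeholder file names, and uses a basic unsharp mask as a stand-in for the “detail enhancing function.”

```python
import cv2
import numpy as np

# Load a burst of (assumed pre-aligned) grayscale frames of the Moon.
frames = [
    cv2.imread(f"moon_frame_{i}.jpg", cv2.IMREAD_GRAYSCALE).astype(np.float32)
    for i in range(8)
]

# Averaging N frames reduces random sensor noise by roughly sqrt(N),
# which is why burst stacking is a staple of computational photography.
stacked = np.mean(frames, axis=0)

# A mild unsharp mask: sharpen the stack by subtracting a blurred copy.
blurred = cv2.GaussianBlur(stacked, (0, 0), sigmaX=3)
sharpened = cv2.addWeighted(stacked, 1.5, blurred, -0.5, 0)

cv2.imwrite("moon_stacked.jpg", np.clip(sharpened, 0, 255).astype(np.uint8))
```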
But here’s the thing: even though there’s a lot of AI processing going on here, there’s already a great deal of AI in every photo we take. From landscapes to portraits to close-ups of our meals, every modern smartphone camera has some level of AI that tunes colors, details, and contrast to levels that aren’t very true to life, leaving you with a false impression of the scene you just captured.
Is that a bad thing? Not really. Like I said earlier, smartphone cameras are severely limited by their size and the devices they live on. There’s only so much you can do to improve the physical optics, so companies are left with tuning the software, and how good that tuning looks is purely subjective.
Some people prefer how Apple’s iPhone photos look, with their more natural aesthetic. Some prefer Google’s Pixel photos, which are more contrasty and vibrant. OnePlus, on the other hand, tries to hit somewhere in the middle.
Then there’s Samsung, which, at any time of day, sticks to its ways of boosted saturation and aggressive highlight handling. It just so happens that at night, when your camera is zoomed in on the Moon, it uses AI to boost clarity to unnatural levels.
Does that make photos of the Moon captured on Galaxy S phones fake? That’s up to you to decide. No one picture from a smartphone will represent reality accurately, so you may as well pick the camera phone that appeals to you the most, even if that means taking “fake” pictures of the Moon.
Footnote
If you’re a Galaxy S smartphone user and wish to take “real” pictures of the Moon, turning off “Scene optimizer” in the camera settings will disable all the AI tuning, at least according to Samsung (via Digital Trends). Interestingly, the photos I took of the blood Moon were taken with Scene optimizer turned off. I’ve kept Scene optimizer off since the day I got my S23 Ultra, so I guess the standard post-processing algorithm is still enough to apply these effects to photos you take of the Moon.