And unlike, for example, the Eiffel Tower, its appearance will not change dramatically depending on the lighting. Shooting the moon usually only happens at night, and Samsung’s processing breaks down if the moon is partially obscured by clouds.
One of the clearest ways Samsung’s processing plays with the moon is by manipulating the contrast of the midtones, making its topography more pronounced. However, it is clearly also capable of introducing the appearance of texture and detail not present in the raw photo.
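Samsung hasn’t published its pipeline, but the midtone-contrast half of that trick is easy to approximate. Here’s a minimal sketch, assuming a simple S-curve centred on mid-gray; the curve shape and strength value are illustrative, not Samsung’s actual tuning:

```python
import numpy as np
from PIL import Image

def boost_midtones(path, strength=0.6):
    """Apply an S-curve centred on mid-gray: midtone contrast rises while
    shadows and highlights are left largely untouched. Illustrative only;
    a real pipeline would operate on the full color image, not luminance."""
    img = np.asarray(Image.open(path).convert("L"), dtype=np.float32) / 255.0
    # Smoothstep remap: steepest at 0.5, flat at the extremes.
    s_curve = img * img * (3.0 - 2.0 * img)
    # Blend with the identity curve so `strength` controls the effect.
    out = (1.0 - strength) * img + strength * s_curve
    return Image.fromarray((np.clip(out, 0.0, 1.0) * 255).astype(np.uint8))

boost_midtones("moon.jpg").save("moon_contrast.jpg")
```

A curve like this makes craters and maria pop without clipping the bright lunar disc, which is why it’s such a cheap win for moon shots in particular.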
Samsung does this because the 100x zoom images of the Galaxy S21, S22, and S23 Ultra phones suck. Of course they do. They involve a massive crop of a small 10MP sensor. Periscope zooms in phones are great, but they’re not magic.
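To put rough numbers on it: these phones pair their 10x optical periscope with a 10MP sensor, so hitting 100x means a further 10x digital crop. That keeps only 1/10² = 1/100 of the frame, roughly 0.1MP of genuine data upscaled back to full output size.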
Credible theories
Huawei is the other major company accused of faking its moon photos, with the otherwise brilliant Huawei P30 Pro from 2019. It was Huawei’s last flagship released before the company was blacklisted in the United States, destroying the appeal of its phones in the West.
Android Authority claimed the phone pasted a stock image of the moon into your photos. Here’s how the company responded: “Moon Mode works on the same principle as other AI Master Modes, in that it recognizes and optimizes details in an image to help individuals take better pictures. It does not in any way replace the image, which would require unrealistic storage space since the AI mode recognizes more than 1,300 scenarios. Based on machine learning principles, the camera recognizes a scenario and helps optimize focus and exposure to improve details such as shapes, colors and highlights/lowlights.”
Familiar, right?
You won’t see these techniques from many other brands, but not for any noble reason. If a phone doesn’t have a long-range zoom of at least 5x, a Moon mode is largely useless.
Trying to photograph the moon with an iPhone is difficult. Even the iPhone 14 Pro Max doesn’t have the zoom range for that, and the phone’s auto exposure will turn the moon into a blob of searing white. From a photographer’s perspective, the S23’s exposure control alone is excellent. But just how “fake” are the S23 moon images really?
The more generous interpretation is that Samsung is taking actual image data from the camera and simply applying its machine-learning know-how to massage the processing. It could, for example, help it trace the outlines of the Sea of Serenity and the Sea of Tranquility when trying to pull a greater sense of detail out of a blurry source.
However, that interpretation is stretched by the way the final image renders the positions of the Kepler, Aristarchus, and Copernicus craters with odd-seeming accuracy when these small features are not discernible in the source. While you can infer where moon features are from a fuzzy source, that’s next-level stuff.
Still, it’s easy to overestimate the head start the Samsung Galaxy S23 gets here. Its moon photos might look okay at a glance, but they’re still bad. A recent head-to-head video pitting the S23 Ultra against the Nikon P1000 shows what a decent sub-DSLR superzoom camera is capable of.
A matter of trust
The fury over this moon question is understandable. Samsung uses lunar imagery to promote its 100x camera mode, and the images are, to some extent, synthesized. But it has really just stuck a toe outside the ever-expanding AI Overton window that has driven innovation in phone photography for the past decade.
Each of these technical tricks, whether you call them AI or not, was designed to do what would have been impossible with the raw output of a phone camera. One of the first, and arguably the most important, was HDR (High Dynamic Range). Apple built HDR into its camera app in iOS 4.1, released in 2010, the year of the iPhone 4.
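HDR of this kind merges a bracket of differently exposed frames into one image. As a minimal sketch of the general technique, using OpenCV’s Mertens exposure fusion (one common method, not necessarily what Apple ships):

```python
import cv2

# Three frames of the same scene shot at different exposures;
# the filenames are placeholders for a real bracket.
frames = [cv2.imread(name) for name in ("under.jpg", "normal.jpg", "over.jpg")]

# Mertens fusion weights each pixel by contrast, saturation, and
# well-exposedness, then blends the bracket into a single image.
fused = cv2.createMergeMertens().process(frames)

# The result is float, roughly in [0, 1]; scale back to 8-bit.
cv2.imwrite("hdr.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```

Fusion approaches like this are appealing on phones because they skip radiometric calibration and tone mapping entirely, blending the frames directly into a displayable result.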