Samsung’s moon shots force us to question how much AI is too much.


And unlike, say, the Eiffel Tower, its appearance doesn’t change dramatically depending on the light. Moon shots usually only happen at night, and if the moon is partially obscured by clouds, Samsung’s processing breaks down.

One of the most obvious things Samsung’s processing does to the moon is boost mid-tone contrast, making its landscape look more vivid. However, it can also introduce texture and detail that simply isn’t present in the raw photo.
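To make the first, less contentious part of that concrete, here is a minimal sketch of a mid-tone contrast boost: an S-curve applied to the luminance channel, which pulls mid-tones apart while leaving deep shadows and highlights mostly alone. This is a generic illustration, not Samsung’s actual pipeline; the file name and curve strength are hypothetical.

```python
import numpy as np
from PIL import Image

def midtone_contrast(img, strength=0.6):
    """Boost mid-tone contrast with an S-curve, blended with the identity."""
    arr = np.asarray(img.convert("L"), dtype=np.float32) / 255.0
    s_curve = arr * arr * (3.0 - 2.0 * arr)  # smoothstep: steepest around 0.5
    out = (1.0 - strength) * arr + strength * s_curve
    return Image.fromarray((np.clip(out, 0.0, 1.0) * 255).astype(np.uint8))

boosted = midtone_contrast(Image.open("moon.jpg"))  # hypothetical file name
boosted.save("moon_contrast.jpg")
```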

Samsung does this because the Galaxy S21, S22 and S23 Ultra phones advertise 100x zoom, and that means cropping heavily into a small 10MP sensor. Periscope zoom phones are great, but they aren’t magic.
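As a rough illustration of what that cropping means in practice, the sketch below simulates pushing a 10x optical frame to “100x” by centre-cropping and upscaling. It deliberately ignores the multi-frame processing real phones apply; the file names and factors are hypothetical.

```python
from PIL import Image

def digital_zoom(img, factor):
    """Centre-crop by `factor`, then upscale back to the original size."""
    w, h = img.size
    cw, ch = int(w / factor), int(h / factor)
    left, top = (w - cw) // 2, (h - ch) // 2
    crop = img.crop((left, top, left + cw, top + ch))
    return crop.resize((w, h), Image.LANCZOS)

# A 10x optical frame pushed to "100x" implies roughly a further 10x digital crop.
frame_100x = digital_zoom(Image.open("tele_10x.jpg"), 10)
frame_100x.save("digital_100x.jpg")
```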

Credible theories

Huawei is another big company accused of faking moon photos, with its 2019 flagship, the Huawei P30 Pro. It was the last flagship the company released before being blacklisted in the US.

Android Authority claimed that the phone superimposed an image of the moon onto your photos. Here’s Huawei’s response: “Moon Mode works on the same principle as other mainstream AI modes, recognizing and optimizing details in the image to help individuals take better photos. It doesn’t replace the image in any way – that would require an unrealistic amount of storage space, since AI mode recognizes more than 1,300 scenarios. Based on machine learning principles, the camera recognizes a situation and helps improve focus and exposure to enhance details such as shapes, colors and highlights/lowlights.”

Sound familiar?

You don’t see these techniques used by many other brands, but that’s not down to any high-mindedness. A moon mode is largely pointless if a phone doesn’t have at least a 5x long-throw zoom.

Trying to shoot the moon with an iPhone is hard. Even the iPhone 14 Pro Max doesn’t have the zoom range for it, and the phone’s auto exposure turns the moon into a white blob. From a photographer’s point of view, the S23’s exposure control alone is excellent. But how “fake” are the S23’s moon images, really?

The most generous interpretation is that Samsung uses the camera’s actual image data and only applies its machine-learning know-how to massage the processing. That could help, for example, in picking out the outlines of the Sea of Serenity and the Sea of Tranquility when trying to extract a greater sense of detail from a blurry source.
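The classical, non-ML version of that kind of massaging is an unsharp mask, which can only exaggerate detail the sensor actually recorded. The snippet below is a simplified stand-in for whatever learned enhancement Samsung actually applies, and the file name is hypothetical.

```python
from PIL import Image, ImageFilter

src = Image.open("moon_crop.jpg")  # hypothetical blurry telephoto crop
# Blur a copy, treat the difference as "detail", and add it back on top of
# the original, exaggerating edges the sensor actually captured.
sharpened = src.filter(ImageFilter.UnsharpMask(radius=3, percent=150, threshold=2))
sharpened.save("moon_sharpened.jpg")
```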

However, a line is crossed when the final image renders the positions of the Kepler, Aristarchus and Copernicus craters with almost unbelievable accuracy, given that these small features appear unrecognizable in the source. While you can infer roughly where the moon’s features are from a dim source, this goes a step beyond that.

Still, it’s easy to overestimate how much of a leg-up the Samsung Galaxy S23 really gets here. Its moon photos may look okay at first glance, but they’re still poor. A recent Versus video featuring the S23 Ultra and the Nikon P1000 shows what a decent sub-DSLR consumer superzoom camera can do.

A question of trust

The anger over this lunar issue is understandable. Samsung uses images of the moon to promote its 100x camera mode, and the resulting photos are, to some extent, synthesized. But this is just a finger poking out of the ever-expanding Overton window of AI that has driven phone photography innovation for the past decade.

All these technical tricks, whether you call them AI or not, are designed to do things that would be impossible with the raw output of a phone camera’s hardware. One of the first, and arguably the most important, was HDR (High Dynamic Range), which merges multiple exposures of the same scene. Apple built HDR into the Camera app in iOS 4.1, released in 2010 for the iPhone 4.
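For context, the core HDR idea is simply combining bracketed exposures so that shadow and highlight detail both survive. Here is a minimal sketch using OpenCV’s Mertens exposure fusion; the file names are hypothetical, and this is not Apple’s implementation.

```python
import cv2

# Three bracketed frames of the same scene: underexposed, normal, overexposed.
exposures = [cv2.imread(name) for name in ("under.jpg", "normal.jpg", "over.jpg")]

# Mertens exposure fusion keeps the best-exposed detail from each frame.
fused = cv2.createMergeMertens().process(exposures)  # float image, roughly in [0, 1]
cv2.imwrite("hdr_fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```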
