How much lunar detail can your smartphone store?


I love this question from YouTuber Marques Brownlee, who runs the channel MKBHD. “What is a photo?” he asks. It’s a deep question.

Think about how early black-and-white film cameras worked. Point the camera at a tree and press a button. This opens the shutter to let light pass through a lens (or more than one lens) and capture the image of the tree on the film. Once the film was developed, it showed an image of the tree. But that photo is just a representation of what was actually there, or of what the photographer saw with their own eyes. The color is gone. The photographer adjusts settings such as the camera’s focus, depth of field, or shutter speed, and selects film that affects the brightness or sharpness of the image. Adjusting the camera and film settings is the photographer’s job; that’s what makes photography an art form.

Now jump forward in time. We’re using digital smartphone cameras instead of film, and these phones have made big improvements: better sensors, multiple lenses, and features like image stabilization, longer exposure times, and higher dynamic range, where the phone takes multiple photos at different exposures and combines them into a single, more detailed image.
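The multi-exposure idea can be sketched in a few lines of code. This is a minimal, hypothetical exposure-fusion example (not Samsung’s actual pipeline): each pixel is blended from a dark frame and a bright frame, with more weight given to well-exposed mid-tones than to pixels near pure black or pure white.

```python
import numpy as np

def fuse_exposures(frames):
    """Naive exposure fusion: weight each frame's pixels by how
    'well exposed' they are (distance from 0.0 and 1.0), then blend.
    frames: list of float arrays in [0, 1] with the same shape."""
    stack = np.stack(frames)                   # shape (n, H, W)
    weights = 1.0 - np.abs(stack - 0.5) * 2.0  # mid-tones get weight near 1
    weights += 1e-6                            # avoid division by zero
    return (stack * weights).sum(0) / weights.sum(0)

# A dark frame and a bright frame of the same two-pixel "scene"
dark = np.array([[0.05, 0.40]])
bright = np.array([[0.45, 0.95]])
fused = fuse_exposures([dark, bright])
```

The fused result leans on whichever frame exposed each pixel best, which is the basic intuition behind HDR; production pipelines use far more sophisticated weighting and frame alignment.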

But these phones can also do something that was previously the photographer’s job: their software can edit the image. In one video, Brownlee uses the camera on his Samsung Galaxy S23 Ultra, at 100x zoom, to take a photo of the moon. The shot is sharp and stable. Maybe too good.

The video responds to a post on Reddit by a user who goes by “ibreakphotos.” In an experiment, they used the camera to take a picture of a deliberately blurred image of the moon displayed on a computer monitor, and the phone still produced a sharp, detailed image. What was going on?
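The logic of that experiment can be sketched in code. This is a hypothetical approximation of the test stimulus, not the exact Reddit methodology: start with a detailed image, blur it so the fine detail is provably gone, and display the result on a monitor. Any fine detail that then appears in the phone’s photo of the monitor must come from software, not optics.

```python
import numpy as np

def box_blur(img, radius=8):
    """Box-blur a 2D grayscale array by averaging each pixel over a
    (2*radius+1)^2 neighborhood, destroying fine detail."""
    k = 2 * radius + 1
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

# A high-frequency test pattern standing in for lunar detail:
# after blurring, the checkerboard collapses toward uniform gray.
moon = np.indices((64, 64)).sum(0) % 2.0
blurred = box_blur(moon)
```

After the blur, pixel values cluster around 0.5: the original high-frequency detail is gone from the signal, so a camera that “recovers” it is synthesizing it.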

In a follow-up video, Brownlee repeated the test and got the same results. The detail is a product of the camera’s AI software, not just its optics, he said. The camera’s processing “basically AI-upscales what you see in the viewfinder into what it knows the moon should look like,” he said in the video. Ultimately, he says, “what comes out of a smartphone camera isn’t so much reality as it is an interpretation of what a computer thinks you want it to look like.”

(When WIRED’s Gear Team covered the moon-shot dustup, a Samsung spokesperson told them: “When a user takes a photo of the moon, the AI-based scene optimization technology recognizes the moon as the main object and takes multiple shots for multi-frame composition, after which AI enhances the details of the image quality and colors.” Samsung has also explained how the scene-optimizer function works and how to turn it off when taking pictures of the moon. Read more from the Gear Team on computational photography here, and see more from Brownlee on the topic.)
