It’s true that software and machine learning are more important than hardware when it comes to delivering the best smartphone cameras. But it’s also true that hardware plays a part, and that over the past few years Samsung has been unusually conservative with its smartphone camera technology. And it’s also true that you absolutely could not level that accusation at the Galaxy S20 Ultra.
We’ll have to spend more time with the device to see if Samsung’s software game has improved. But it’s hard to imagine the company making much more effort to fill out the S20 phones’ spec sheets. The Ultra in particular has crammed two notable firsts (for the US market, at least) into its sizable camera bump: a “periscope” telephoto lens for unprecedented zooming capability, and a 108-megapixel sensor for the main wide-angle camera.
That makes the Galaxy S20 Ultra the most mainstream phone yet to adopt the broader global trend of high-megapixel sensors, which Chinese phone makers have been all over for more than a year now. In the past, this would have been a recipe for disaster — jacking up the megapixel count often made photos look worse, not better, as the smaller pixels resulted in less light-gathering ability. But as phone cameras, processors, and software have evolved, it’s gotten harder to evaluate sensor technology with traditional metrics. While a 108-megapixel phone camera might sound ridiculous to many photographers, it makes more sense when you understand how Samsung is using the technology.
The Galaxy S20 Ultra isn’t actually the first 108-megapixel phone on the market — that honor falls to Xiaomi’s Mi Note 10, which is also known as the CC9 Pro in China. The Mi Note 10 uses a Samsung-designed sensor that’s very similar to the part in the Galaxy S20 Ultra, though there are a couple of differences. Samsung views its sensor business as an important area of future growth and has been pushing various high-resolution designs, including 48-megapixel and 64-megapixel parts, since last year; these latest 108-megapixel sensors are an attempt to further differentiate itself from market leader Sony.
The first thing to understand about image sensors is that, all things being equal, the bigger the better. Cameras need light to create photos, and physically larger sensors are able to capture more of it. Each sensor, however, is divided into millions of pixels that all gather light independently of one another. This means that the higher the resolution of a given sensor, the smaller the pixels will be, and consequently the lower the chance of each one accurately recording color information. That’s why photos in low light exhibit color noise, and why that noise tends to be particularly prevalent on small-sensor cameras with high megapixel counts.
The Galaxy S20 Ultra is definitely a small-sensor camera with a high megapixel count. But in the context of phone cameras, the sensor is actually pretty huge. Samsung is using the same 0.8-micron pixel size from its 48- and 64-megapixel sensors, then scaling up the physical surface area to fit 108 million of them. The result is a 1/1.3-inch sensor, which is bigger than the one in Nokia’s iconic Lumia 1020 (though still a little smaller than the one in its 808 PureView predecessor). Given apertures of around f/1.8, these 108-megapixel phones should have more pure light-gathering capability than basically anything on the market.
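You can sanity-check that sensor-size math yourself. A quick sketch, assuming the 108-megapixel sensor reads out a 4:3 grid of 12032 × 9024 pixels (that exact resolution is my assumption for illustration) at a 0.8-micron pitch:

```python
import math

# Assumed full-resolution grid for the 108-megapixel sensor (4:3 aspect);
# the exact pixel counts here are an assumption for illustration.
width_px, height_px = 12032, 9024
pixel_pitch_mm = 0.0008  # 0.8 microns

megapixels = width_px * height_px / 1e6
sensor_w = width_px * pixel_pitch_mm   # sensor width in mm
sensor_h = height_px * pixel_pitch_mm  # sensor height in mm
diagonal = math.hypot(sensor_w, sensor_h)

# "1/x-inch" optical-format naming descends from old vidicon camera tubes;
# a common rule of thumb is roughly 16 mm of diagonal per "inch" of type size.
optical_format = 16 / diagonal

print(f"{megapixels:.1f} MP, {sensor_w:.1f} x {sensor_h:.1f} mm sensor, "
      f"diagonal {diagonal:.1f} mm -> roughly a 1/{optical_format:.2f}-inch type")
```

Under those assumptions the diagonal works out to about 12 mm, which lands right around the 1/1.3-inch class the spec sheets quote.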
That doesn’t mean noise isn’t a concern, and 0.8 microns is still pretty small for an individual pixel. For comparison, the iPhone 11 has 1.4-micron pixels, though its 12-megapixel sensor is much smaller at 1/2.55 inches and obviously a lot lower in resolution. To avoid noise, higher-resolution phone sensors combine data from neighboring pixels into one, reducing the resolution of the resulting image but theoretically increasing image quality. In situations with bright light, meanwhile, you have the option to shoot at full resolution for greater detail.
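The size gap between those pixels is bigger than the raw numbers suggest, because light gathered scales with pixel area, not width. A rough sketch of the difference (ignoring microlenses, sensor efficiency, and lens differences):

```python
# Light collected per pixel scales with its area (pitch squared), all else equal.
# This ignores microlenses, quantum efficiency, and aperture differences.
small_pitch_um = 0.8  # the 108-megapixel Samsung sensor
large_pitch_um = 1.4  # the iPhone 11's 12-megapixel sensor

area_ratio = (large_pitch_um / small_pitch_um) ** 2
print(f"A 1.4-micron pixel collects roughly {area_ratio:.1f}x "
      f"the light of a 0.8-micron pixel")
```

That roughly threefold per-pixel deficit is exactly what pixel binning is meant to claw back.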
Here’s a photo I took with the Xiaomi Mi Note 10 this afternoon, for example. (Yes, you can turn off the obnoxious watermark, but I left it on for easy reference.)
And here’s how it looks when fully cropped in, alongside a comparison to the class-leading iPhone 11.
As you can see, you get a clear improvement in detail with the 108-megapixel image. That’s a specific mode you have to switch to on the Mi Note 10, though — the default setting combines four pixels into one and turns out 27-megapixel photos. Here’s an example of that:
And again, next to the iPhone 11:
The pixel-binned Xiaomi photo wins again. Samsung’s newest 108-megapixel sensor actually goes further and combines nine pixels into one for 12-megapixel photos. That’ll make for an even more interesting comparison to the iPhone when the Galaxy S20 Ultra is available. But without making judgments on Xiaomi’s image processing, it’s clear at the very least that 108-megapixel image sensors allow for the possibility of significantly more detailed photos than conventional phone cameras in situations with good lighting.
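At its core, binning is just collapsing blocks of neighboring pixels into one. Here’s a toy NumPy version, assuming a single-channel image whose dimensions divide evenly by the bin factor (real sensors bin same-colored pixels within the color filter array, which is more involved):

```python
import numpy as np

def bin_pixels(img: np.ndarray, factor: int) -> np.ndarray:
    """Average each factor x factor block of a single-channel image into one pixel."""
    h, w = img.shape
    assert h % factor == 0 and w % factor == 0
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# A tiny 6 x 6 "sensor" stands in for the full-resolution readout.
full = np.arange(36, dtype=float).reshape(6, 6)

quad = bin_pixels(full, 2)  # 4-to-1, like the Mi Note 10's 27-megapixel default
nona = bin_pixels(full, 3)  # 9-to-1, like the S20 Ultra's 12-megapixel default

print(quad.shape, nona.shape)  # (3, 3) (2, 2)
print(108 / 4, 108 / 9)        # 27.0 12.0 -- the resolution arithmetic above
```

The effective pixel after 9-to-1 binning is 2.4 microns across, which is why Samsung can start from tiny 0.8-micron pixels and still talk about low-light performance.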
Low light is a more complicated comparison, because the state of the art in that regard relies heavily on software and algorithms that differ greatly between manufacturers even when the same sensor is used. It’s entirely possible that Samsung will match the likes of Google, Apple, and Huawei with the Galaxy S20’s night mode, but if that does happen the sensor won’t be the deciding factor.
Dim lighting, then, will probably be more of a workout for these sensors’ pixel-binning solutions — you can’t rely on night mode, but it won’t be light enough for full resolution. Here’s a picture I took on the Mi Note 10 in a poorly lit room.
And here’s a crop next to the iPhone 11’s equivalent photo.
You’ll see that while the Mi Note 10’s 27-megapixel image is higher resolution than the iPhone’s, it doesn’t resolve as cleanly at the pixel level and allows more color noise to creep in. There are purple pixels that just shouldn’t be there, for example, whereas the iPhone picture is remarkably sharp and consistent. But I should also note that the iPhone picture activated Apple’s automatic Deep Fusion processing, which is specifically designed to handle fine detail in dim lighting, so this is another situation where hardware alone might not be the most important element. In theory, too, Samsung’s pixel-binned 12-megapixel images should handle noise better than Xiaomi’s 27-megapixel shots.
Physically larger sensors also affect depth of field, or the degree to which a given image is in focus. Blurry backgrounds are caused by a combination of longer focal lengths and larger lens apertures, with the former having a greater impact. In order to achieve the same field of view on a larger sensor, you need to use a longer lens. The Mi Note 10 has a 6.72mm lens versus the iPhone’s 4.25mm, plus a slightly faster f/1.7 aperture to the iPhone’s f/1.8, which results in a shallower depth of field.
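One way to put numbers on that depth-of-field gap is the "equivalent aperture": each camera’s f-number multiplied by its crop factor relative to a full-frame sensor. A rough sketch, using the 16 mm-per-type-inch approximation for sensor diagonals (the exact diagonals are assumptions; real modules deviate a bit from the optical-format convention):

```python
import math

FULL_FRAME_DIAG_MM = math.hypot(36, 24)  # ~43.3 mm full-frame diagonal

def equivalent_aperture(f_number: float, sensor_type: float) -> float:
    """f-number scaled by crop factor, assuming ~16 mm diagonal per 'inch' of type."""
    diagonal = 16 / sensor_type
    crop = FULL_FRAME_DIAG_MM / diagonal
    return f_number * crop

mi_note_10 = equivalent_aperture(1.7, 1.33)  # 1/1.33-inch type, f/1.7
iphone_11 = equivalent_aperture(1.8, 2.55)   # 1/2.55-inch type, f/1.8

print(f"Mi Note 10 ~f/{mi_note_10:.1f} equivalent, "
      f"iPhone 11 ~f/{iphone_11:.1f} equivalent")
```

Under these assumptions the Mi Note 10 lands around f/6 equivalent to the iPhone’s roughly f/12, about a two-stop difference, which lines up with the shallower depth of field in the comparison shots.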
Here’s a comparison:
These days, of course, shallow depth of field is handled by phones’ “portrait” modes, which use algorithms to calculate and render the out-of-focus areas. But they’re still not truly reliable, unlike the laws of physics that dictate how light refracts through a lens and onto a sensor. You’re never going to get background-obliterating portrait shots from hardware this tiny, but it’s still useful for closeups of food, pets, and so on.
The bottom line is, we don’t know how good Samsung’s Galaxy S20 Ultra camera really is. For that, you’ll need to stay tuned for our review and subsequent comparisons. But we do know that high-megapixel sensors aren’t a gimmick, and can’t really be judged in the way we’d evaluate traditional cameras. Don’t call this a reboot of the megapixel wars that plunged consumer point-and-shoots into irrelevance. This 108-megapixel sensor from Samsung is comparatively huge next to its competitors and should be able to capture unprecedented detail with the right implementation.
Samsung has given its flagship phone a serious camera hardware upgrade this year. Now we just need to find out what it’s done to the software.