I’m of the firm opinion that for most users, cameras are the backbone of smartphone upgrade cycles. Better performance is always nice to have, but with the sheer grunt on hand with even mid-range hardware, performance metrics aren’t nearly as critical a buying decision anymore. Imaging, on the other hand, offers the most visible improvement year on year. Ever since the launch of the first Pixel, Google has had a laser focus on photography.
Ironically, despite the popularity of its smartphones resting largely on imaging prowess, Google’s hardware development on the camera front has been surprisingly sluggish.
Did you know that the Pixel series has been using the same camera sensor since the Pixel 3 launched way back in 2018? That sensor wasn’t all that different from the Pixel 2 before it either. Or take, for example, the Pixel 5, which finally tossed in an ultra-wide sensor, but didn’t include table stakes like a telephoto sensor. Instead, Google insisted on using its software-based Super Res Zoom technique that worked to a degree, but couldn’t hold a candle to true optical zoom. Elsewhere, the year before, the company opted for a telephoto lens on the Pixel 4 but chose not to include an ultra-wide sensor, something you definitely can’t replicate with software.
The Pixel series is a classic example of Google building a consumer product with an engineer’s mindset.
Google’s strategy towards imaging, and smartphones in general, has been diametrically opposed to what nearly every other OEM pushes in the Android space — specs. In an almost Apple-like fashion, Google has spent the better part of the last four years squeezing the best it can out of the Pixel’s camera sensor and building a consumer product with an engineer’s mindset. Except, even Apple opts to use hardware solutions instead of reinventing the wheel.
That’s all set to change with the Pixel 6 and Pixel 6 Pro, and that’s a very exciting prospect.
Why Google fell behind the camera curve
Let’s start with the obvious — it is clear that Google has been pushing the aging IMX363 sensor to its limits. Our own testing revealed just how far the Pixel 5 is falling behind the competition. From HDR noise to limited zoom and a lackluster ultra-wide camera, there are some things even software can’t overcome.
You could blame ex-camera chief Marc Levoy for this aversion to change. In an interview around the launch of the Pixel 4, Levoy stated that he wasn’t convinced that pixel binning, and the resulting increase in signal-to-noise ratio from a high-resolution sensor, would bring about a tangible improvement in imaging. That might’ve been true in 2019, but it has since been proven wrong by the multitude of phones that have used these sensors to great effect.
While few have matched Google’s software prowess, improvements in sensors have allowed its competitors to overcome many hardware limitations. Huawei has been pioneering the use of RYYB sensors that enable night vision-like capabilities, while Sony is roping in the expertise of its camera division to improve color science. Others like OnePlus opted to partner with traditional camera manufacturers like Hasselblad to up their game.
Where Google’s software once made up for hardware deficiencies, camera sensors have since caught up and surpassed it.
Elsewhere, the BBK group has invested heavily into imaging, and phones like the Oppo Find X3 include a bevy of camera sensors to cover any possible use case. Xiaomi has also jumped into the ring and the Mi 11 Ultra is one of the best-equipped camera flagships around, not just for the hardware, but also for its excellent camera tuning.
Where Google was leading by a country mile, it is now in lockstep with the competition at best, and behind it in more ways than one.
A new sensor gives Google’s software the hardware it needs to shine
If there’s one thing Google’s thought process behind the Pixel series has shown us, it’s that the company isn’t interested in competing a quarter-mile at a time. Instead, it prefers to take big leaps ahead and refine the hardware to perfection. With Levoy no longer at the helm, it appears that Google might have realized the error in its earlier thinking.
An upgraded camera sensor is exactly what the Pixel 6 series needed to up its game, and it is exactly what it is getting. Don’t get me wrong, Google’s software prowess is what continues to ensure Pixel phones are some of the best camera phones around. The software has pushed the hardware to the best of its capabilities, but we already know that Google’s imaging algorithms shine on higher-end hardware.
Ports of Google’s camera app already exist for phones with sensors that are generations ahead, and the results are telling. With Google opting for an up-to-date sensor, its already excellent software can finally reap the benefits of years of hardware advancements. Google’s commitment to an AI- and machine learning-infused future with the Tensor chipset will only elevate this further. However, the benefits extend well beyond a tangible but expected upgrade in image quality.
The newer sensor will bring along basic enhancements like faster focusing. While Google has yet to reveal exactly which camera sensor it will use, leaks from the latest Android 12 beta suggest that the Pixel 6 series could be using the Samsung ISOCELL GN1 as its primary wide-angle camera. Notably, this isn’t Samsung’s latest sensor; that would be the ISOCELL GN2, which can not only use a full sensor read-out for phase detection but can also compare pixels in multiple directions to assist with focusing. This is still Google, remember, so it’s not too surprising that we might not see the most up-to-date tech. We’ll take what we can get.
There’s plenty the GN1 could provide, regardless. For example, Google’s astrophotography mode already captures stellar images. Increasing the pixel size through pixel binning can greatly increase the amount of light falling on the sensor. The increased light capture means that you should be able to get similar results in a much shorter time — possibly even handheld.
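To put rough numbers on that, here’s a small simulation sketch of why summing a 2x2 block of pixels roughly doubles the signal-to-noise ratio under photon shot noise. The photon counts are assumed values for illustration, not measured sensor data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed low-light level for illustration: each photosite collects a
# photon count that follows a Poisson distribution (shot noise).
mean_photons = 25
pixels = rng.poisson(mean_photons, size=(2000, 2000)).astype(float)

# 2x2 binning: sum each 2x2 block into one larger effective pixel,
# quadrupling the light gathered per output pixel.
binned = pixels.reshape(1000, 2, 1000, 2).sum(axis=(1, 3))

snr_native = mean_photons / pixels.std()
snr_binned = (4 * mean_photons) / binned.std()

print(snr_native)  # SNR of the native-resolution pixels
print(snr_binned)  # roughly double after 2x2 binning
```

Signal grows fourfold while shot noise only doubles (noise scales with the square root of the photon count), which is exactly why binned sensors fare so much better in the dark.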
Or how about Night Sight? In low light, exposure times on the Pixel 5 can vary anywhere from three to five seconds, on occasion even longer. Increasing the photosensitivity reduces the amount of exposure time needed, and, combined with Google’s software techniques, you could get near-instant low-light shots with the same fidelity that has made the Pixel a popular choice for low-light photography.
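As a back-of-the-envelope illustration (the figures below are assumptions, not Google’s numbers), the exposure needed to reach the same signal level scales inversely with the light gathered per effective pixel:

```python
# Hypothetical numbers for illustration only: if binning quadruples the
# light collected per effective pixel, the exposure time needed to reach
# the same signal level drops proportionally.
def required_exposure(base_exposure_s: float, light_gain: float) -> float:
    """Exposure time needed to collect the same total light."""
    return base_exposure_s / light_gain

# A hypothetical 4-second Night Sight frame on the old sensor...
print(required_exposure(4.0, 4.0))  # prints 1.0: one second at 4x light capture
```

A second-long exposure is far easier to hold steady by hand than a four-second one, which is where the “near-instant” part comes from.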
Related: The best camera phones you can get
The natural depth of field afforded by a large, high-resolution sensor, combined with excellent portrait mode algorithms, should, on paper, lead to more natural-looking bokeh. Meanwhile, the 4x optical zoom included on the Pixel 6 Pro, combined with Super Res Zoom enhancements and a high-res sensor, could enable seamless scaling from 1x to well beyond the limits of the optical zoom, with no visible loss in quality along the way.
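The crop-to-zoom math behind that is simple enough to sketch. The numbers below are assumptions for illustration (a 50MP sensor binning down to a 12.5MP output), not confirmed Pixel 6 specifications:

```python
import math

def max_lossless_zoom(sensor_mp: float, output_mp: float) -> float:
    """Zoom factor at which a center crop of the sensor still fills
    the output resolution, i.e. no upscaling is needed."""
    # Zoom scales linearly with image width, while megapixels scale
    # with area, hence the square root.
    return math.sqrt(sensor_mp / output_mp)

# A hypothetical 50MP sensor producing 12.5MP binned output:
print(max_lossless_zoom(50, 12.5))  # prints 2.0
```

Up to that crop factor, “zoom” is just reading a smaller region of the sensor at full detail; beyond it, Super Res Zoom’s multi-frame tricks would have to make up the difference.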
Speaking of zoom, the Pixel 6 Pro finally sports the standard trifecta of cameras, with both the aforementioned telephoto shooter and an ultra-wide. This combination has been common on nearly every premium flagship for several years now, but Google flip-flopped between zoom and ultra-wide photography across the Pixel 4 and Pixel 5 generations. The Pixel 6 may miss out on a telephoto camera, but at least the top-tier offering has all bases covered.
Custom Tensor silicon plays to Google’s software strengths
Hardware isn’t the only area where the Pixel 6 series is getting a welcome shot in the arm: its storied computational photography is also getting some upgrades.
The Pixel’s entire photography experience has been infused with AI goodness right from the beginning, from HDR+ on the original Pixel to the Pixel Visual Core that accelerated HDR processing on the Pixel 2. Tight AI integration in Google’s custom Tensor silicon should take this to the next level, enabling many more capabilities.
Compositing and processing up to 10 high-resolution images can bog down a general-purpose processor. Purpose-built silicon designed to do just that? Not so much. Imagine a burst mode that captures every frame with the same fidelity as an individual HDR+ shot.
In fact, Pixel 6 demos seem to allude to a very interesting future. For example, one demo talks about sharpening a blurred-out face using data from a secondary camera. The same techniques could also be extended to better object removal.
The upgraded camera hardware will finally place the Pixel series on a level playing field against the competition.
Video has never been the Pixel series’ forte, but the upgraded sensor should help bring the Pixel up to par in both quality and capabilities. It took years for Google to jump on the 4K/60fps capture bandwagon, and slow-motion capture is still limited to just 120fps at Full HD. Newer sensors and processors, however, can push this all the way to 480fps, with better quality to boot.
Meanwhile, Samsung, Xiaomi, and others are already pushing 8K video capture. Do you need 8K footage? Probably not. But the possibility of downsampling 8K footage to 4K for better detail and colors is exciting, and switching to a faster ISP combined with the Tensor chipset could enable this future.
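The appeal of downsampling is easy to demonstrate with a toy example. This sketch uses synthetic noise and assumed values (and a scaled-down frame, to keep it light) to show how averaging each 2x2 block of a higher-resolution frame into one output pixel halves the per-pixel noise:

```python
import numpy as np

rng = np.random.default_rng(1)

# Scaled-down synthetic stand-in for a noisy high-res luma frame: an
# assumed flat scene value of 100 plus Gaussian sensor noise (std 8).
frame_hi = 100 + rng.normal(0, 8, size=(2160, 3840))

# Average each 2x2 block, halving width and height — the same idea as
# supersampling 8K capture down to a 4K output.
frame_lo = frame_hi.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))

print(frame_hi.std())  # ~8: native per-pixel noise
print(frame_lo.std())  # ~4: averaging four samples halves the noise
```

Less noise per pixel means cleaner detail and smoother color gradients in the 4K output than a native 4K capture would manage.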
Or how about HDR video infused with the same software magic that makes the Pixel’s still camera so special?
The Pixel camera has always been reliable — the Pixel 6 makes it exciting again
I’m excited by the possibilities here. It’s been a hot minute since Google did something fresh with the Pixel series, and no, I’m not counting the failed Soli experiment. The Pixel 6 Pro, with its ground-up redesign both inside and out, is a much-needed breath of fresh air for Google’s hardware efforts, nowhere more so than with its cameras. It remains to be seen just how much of a difference the fancy neural networks will make, but the upgraded camera hardware alone will place the Pixel 6 Pro on a level playing field against the competition. And that should go a long way towards charting the next few years of camera innovation.
Are you excited about the Pixel 6 series or is the camera in your current phone good enough for you? Let us know in the comments.