Google Pixel 6 camera features: What should you expect?

Google Pixel 6 Pro camera

Google surprised us all this week when it revealed some official details of the Pixel 6 series, ranging from the new in-house Tensor SoC to design and more. But one of the main tidbits is that the new phones are finally delivering big camera upgrades.

The Pixel 6 Pro in particular is finally getting a triple rear camera for the first time in the history of the Pixel series, albeit, in typical Google fashion, a few years later than competing Android brands. That means a standard camera, an ultra-wide-angle shooter, and a telephoto lens on the same phone, unlike previous Google flagships, where you had to choose between an ultra-wide-angle and a telephoto camera.

There are more upgrades here than just a triple camera system, however, and they could allow for some really nifty additions and enhancements to the Google Pixel camera experience. Here's what we expect from the Pixel 6 and Pixel 6 Pro cameras.

Related: The best camera phones you can get

The main sensor is finally getting an upgrade

The dual camera module of the Pixel 4.

Perhaps the most notable photo upgrade is that the Pixel 6 series gets a brand new main camera sensor. This is a major deal as Google has been using the same 12 MP IMX363 sensor since the Pixel 3 days, with the Pixel 2’s IMX362 sensor also being very similar.

A new, larger sensor could offer several improvements over the old IMX363, such as improved dynamic range and better detail. The latter was a particular sore point in our Pixel 5 review, where we noted that images of detailed scenes “look busy and excessively high-contrast”.

According to Google, the new camera sensor captures 150% more light than the old one (via The Verge), which means we should expect improved photos at night. The Pixel camera app may not need to switch to Night Sight as often as its predecessors did, as the improved light gathering and faster processing should deliver brighter shots in the standard photo mode.
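As a quick back-of-the-envelope check (our own arithmetic, based only on Google's 150% figure), "150% more light" works out to about 2.5 times the light, or roughly 1.3 stops of extra exposure headroom:

```python
import math

# "150% more light" means the new sensor gathers 2.5x as much light
# as the old IMX363 for the same exposure time.
light_ratio = 1.0 + 1.5

# Each photographic "stop" doubles the light, so the gain in stops
# is the base-2 logarithm of the ratio.
stops_gained = math.log2(light_ratio)

print(f"Light ratio: {light_ratio:.1f}x")          # 2.5x
print(f"Exposure gain: ~{stops_gained:.2f} stops") # ~1.32 stops
```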

We've already seen Google offer a Night Sight portrait mode on the Pixel 5, so improved light gathering could also mean more low-light portraits that don't need Night Sight at all, or Night Sight portraits in situations that would be too dark for the Pixel 5.

A new main camera sensor means Google may not have to rely on Night Sight as often.

We'd also expect better results for astrophotography shots and astro time-lapses. Hopefully we won't have to wait 15 seconds for a single astrophotography frame and four minutes for the full exposure. Improved light gathering also opens the door to more ultra-low-light features, such as low-light video recording and night time-lapses.
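For context on those timings, the current Pixel astrophotography mode stacks a series of long-exposure frames. Plugging in the numbers above (the frame count here is our own inference from those timings, not a confirmed spec) gives a sense of where a brighter sensor could help:

```python
# Assumed values based on the timings mentioned above: ~15 s per frame
# and ~4 minutes of total capture for an astrophotography shot.
frame_exposure_s = 15
total_capture_s = 4 * 60

frames = total_capture_s // frame_exposure_s
print(f"Frames stacked: {frames}")            # 16 frames
print(f"Total capture:  {total_capture_s} s") # 240 s

# A sensor that gathers 2.5x more light could, in principle, reach a
# similar signal level with shorter per-frame exposures.
equivalent_exposure_s = frame_exposure_s / 2.5
print(f"Equivalent per-frame exposure: {equivalent_exposure_s:.0f} s")  # 6 s
```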

There's no word on the specific camera sensor yet, but sensors like the 50MP Samsung ISOCELL GN2 also offer Dual Pixel Pro autofocus technology and support for more video quality options (e.g. 4K at 120fps and 8K recording). Google has generally lagged behind competitors on high-resolution video options, but we'd definitely expect improved autofocus if it uses such a sensor.

Improved autofocus (and exposure) would also match the philosophy Google camera engineer Isaac Reynolds espoused when the Pixel 4 series launched: that users should be able to simply open the camera app and get the right shot without tapping the viewfinder.

Great software zoom meets great hardware

The Super Res Zoom function of the Google Pixel 3.

The Pixel 6 Pro also gets a 4x telephoto camera, a major upgrade over the 2x telephoto lens in the Pixel 4 series. We don't know any further details about the 4x camera yet, but it still has us excited about the zoom on the new Pixels.

With the combination of Super Res Zoom and a 4x telephoto camera, we can expect the Pixel 6 Pro to deliver great images beyond 4x zoom, too. Every phone has a threshold where zoom goes from good to bad, but the Pixel 6 Pro's threshold should sit significantly higher than the Pixel 5's and Pixel 4's thanks to the longer native zoom.

One potential challenge we've seen with phones that pair a 4x or 5x telephoto camera with a main camera is that intermediate zoom (between 1x and the telephoto's native magnification) tends to suffer. Companies like Huawei, Oppo, and others have turned to image fusion technology to combat this, combining results from the main and telephoto cameras to deliver solid 2x to 3x shots.

We suspect Google could do the same for sub-4x shots by combining the main camera, the telephoto camera, and Super Res Zoom. However, we've seen some phones suffer from reduced detail around the edges of the frame when using image fusion techniques like this, so Google may need to watch out for that too.
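To make the fusion idea concrete, here's a minimal, hypothetical sketch of zoom-weighted blending between the main camera and the telephoto at intermediate zoom levels. It isn't Google's, Huawei's, or Oppo's actual pipeline (real implementations align frames and fuse detail per region), but it shows the basic principle of leaning more heavily on the telephoto as the requested zoom approaches its native 4x magnification.

```python
import numpy as np

def fuse_intermediate_zoom(main_crop: np.ndarray,
                           tele_frame: np.ndarray,
                           zoom: float,
                           tele_native_zoom: float = 4.0) -> np.ndarray:
    """Blend an upscaled main-camera crop with the telephoto frame.

    Hypothetical illustration: both inputs are assumed to be already
    aligned and resized to the same output resolution. The telephoto
    frame gets more weight as the requested zoom approaches its
    native 4x magnification.
    """
    # Weight ramps from 0.0 at 1x (main camera only) to 1.0 at 4x.
    w_tele = np.clip((zoom - 1.0) / (tele_native_zoom - 1.0), 0.0, 1.0)
    fused = (1.0 - w_tele) * main_crop + w_tele * tele_frame
    return fused.astype(main_crop.dtype)

# Example with dummy image data (H x W x 3, float32 values in [0, 1]).
main_crop = np.random.rand(720, 1280, 3).astype(np.float32)
tele_frame = np.random.rand(720, 1280, 3).astype(np.float32)
fused = fuse_intermediate_zoom(main_crop, tele_frame, zoom=2.5)
print(fused.shape)  # (720, 1280, 3)
```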

The new promise of machine learning silicon

Google Tensor Processor

The Pixel 6 series will also be the first Google phones powered entirely by an in-house chipset, called Tensor. We expect Google to use Arm CPU and GPU technology, but the big news here is that the chipset has a TPU (Tensor Processing Unit). This is a machine learning processor that promises to enable a wide variety of on-device features that would normally require an internet connection.

However, this isn't the first time we've seen dedicated Google silicon in a Pixel phone, as the company previously used the Pixel Visual Core and Pixel Neural Core. Those chips did everything from speeding up HDR+ processing and voice inference to enabling 3D face unlock. But the TPU promises an even bigger upgrade, delivering more horsepower than Google's earlier chips while being an integral part of the overall chipset.

See also: Why the Pixel 6's Tensor chip is actually a big deal (and why it isn't)

Google is also using the TPU to improve camera functionality, and the company showed The Verge several demos. One such demo involved a blurry photo of a toddler, with Google running the image through the TPU to deblur the child's face. The Pixel maker says that, in addition to the existing multi-frame processing on the main camera, it can use the ultra-wide camera as part of the process, resulting in a sharper face.

Another demo Google touted for the TPU is improved HDR video recording. This demo pitted the Pixel 6 against the Pixel 5 and iPhone 12 Pro Max, recording an HDR video of a beach (in 4K at 30fps). The Verge found that the Pixel 6 came out on top, as it didn't artificially lift the shadows the way Apple's device did, while also looking more natural than both older phones.

Google told the outlet that it's using the same HDR processing for video that it uses for still images. We can't help but feel this opens the door to an improved burst mode, something that's been missing on Pixel devices for a while.
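As a rough illustration of the multi-frame idea behind HDR+ (a simplified sketch under our own assumptions, not Google's actual pipeline, which also aligns frames, rejects motion, and tone-maps far more carefully): merging several short, underexposed frames cuts noise, which then lets the shadows be lifted without amplifying that noise as much.

```python
import numpy as np

def merge_and_tonemap(frames: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Toy HDR+-style merge: average aligned, underexposed frames to cut
    noise, then apply a gamma curve to brighten the shadows.

    `frames` is an (N, H, W) stack of grayscale images in [0, 1],
    assumed to be already aligned (real pipelines align frames and
    reject moving pixels before merging).
    """
    merged = frames.mean(axis=0)          # noise drops roughly with sqrt(N)
    return np.clip(merged, 0.0, 1.0) ** (1.0 / gamma)

# Simulate 8 noisy, underexposed captures of the same dark scene.
rng = np.random.default_rng(0)
scene = np.full((480, 640), 0.05)                      # very dark scene
frames = scene + rng.normal(0.0, 0.02, (8, 480, 640))  # shot-to-shot noise
result = merge_and_tonemap(frames)
print(result.mean())  # brighter than the 0.05 input, with less noise
```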

Google demonstrated features like deblurring faces and improved HDR video, but more powerful machine learning silicon could bring many more features.

Another possibility opened up by faster AI silicon is that we could finally see the object removal technology Google touted back in 2017. At the time, the company showed a picture taken through a chain-link fence, using machine learning to remove the fence for a clearer shot. We haven't seen that technology since, but the TPU could in theory enable it, as well as features like general object erasing (as on Samsung phones) and reflection removal (as on Huawei devices).
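For a sense of how object removal works in general, here is a minimal sketch using OpenCV's off-the-shelf inpainting. This is a generic classical technique, not the machine learning approach Google demoed; it simply fills the masked pixels from their surroundings, whereas a learned model can synthesize far more plausible detail.

```python
import cv2
import numpy as np

# Dummy photo and a mask marking the unwanted object (non-zero = remove).
photo = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
mask = np.zeros((480, 640), dtype=np.uint8)
cv2.rectangle(mask, (200, 150), (320, 330), 255, thickness=-1)

# Fill the masked region from surrounding pixels (Telea's method).
cleaned = cv2.inpaint(photo, mask, inpaintRadius=5, flags=cv2.INPAINT_TELEA)
cv2.imwrite("cleaned.jpg", cleaned)
```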

Google has also previously used machine learning hardware to enable enthusiast-focused features like dual exposure controls. This nifty feature, first introduced on the Pixel 4 range, lets users adjust the overall exposure and shadow levels separately before taking a shot. Hopefully the TPU enables more convenient controls like this in the viewfinder, giving users one less reason to open an editing app.
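To illustrate what those two controls conceptually do, here's a simplified, hypothetical tone adjustment: one parameter scales overall exposure while the other lifts or deepens only the shadows. Google's real implementation feeds into its HDR+ tone mapping rather than a simple curve like this.

```python
import numpy as np

def dual_exposure(image: np.ndarray, brightness: float, shadows: float) -> np.ndarray:
    """Toy version of dual exposure controls.

    `image` holds linear values in [0, 1]. `brightness` is an overall
    exposure multiplier; `shadows` > 1 lifts dark regions, < 1 deepens
    them, while leaving highlights mostly untouched.
    """
    exposed = np.clip(image * brightness, 0.0, 1.0)
    # Weight the shadow adjustment toward the darkest pixels.
    shadow_weight = (1.0 - exposed) ** 2
    lifted = exposed * (1.0 + (shadows - 1.0) * shadow_weight)
    return np.clip(lifted, 0.0, 1.0)

frame = np.random.rand(480, 640).astype(np.float32)
preview = dual_exposure(frame, brightness=0.9, shadows=1.4)
print(preview.min(), preview.max())
```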

Former Google employee Marc Levoy admitted back in 2019 that the company had yet to solve the challenge of capturing both a detailed moon and a moonlit landscape in one shot. The computational photography lead attributed the difficulty to the extreme dynamic range between the bright moon and the dark landscape, and urged people to “stay tuned”. It stands to reason that Google either put this work on hold or was simply waiting for better camera hardware and machine learning silicon. And guess what we're getting with the Pixel 6 series?


There are still a lot of unknowns surrounding the Pixel 6 cameras, but there's plenty to be excited about. Are you impressed by the Pixel 6 Pro's camera hardware so far? Let us know in the comments.
