Google’s Pixel 2 and Pixel 2 XL may have had some teething issues with their displays, but one area where they didn’t fail to impress is their cameras. Following in the footsteps of the original Pixel duo launched a year earlier, the new 12.2-megapixel sensors in Google’s latest smartphones are already a treat to use, but their full potential hasn’t been exploited yet, as many promised features are still waiting to be enabled via future software updates.
Gadgets 360 had a Hangouts session with Brian Rakowski, VP of Product Management at Google, and Timothy Knight, who leads camera development for the Pixel 2, to talk specifically about the camera and what makes it tick. We are all aware of some of the more publicised issues with the new Pixels, such as the audio issues when recording video and over Bluetooth, and odd screen flashes, but we have had some issues with the camera too, which we hoped to get some clarity on from the Google duo, no pun intended.
The Pixel 2 does a fantastic job stabilising video, but in low light, especially at 4K, the footage tends to be quite noisy. This is mainly because the Pixel 2 tries to brighten up the scene as much as possible by boosting the ISO, which gives you a brighter scene for sure, but at the cost of noise. This is done intentionally, Knight explains.
“That is really a tradeoff we think a lot about. We tried to strike a balance of both,” he states. “If you compare the Pixel 2 camera to other mobile cameras, you’ll see that we’re brighter. It’s easy to make the noise go away if you just make the image dim. We decided that we’d rather let the user see the scene more clearly, by making it brighter, even if that means there is some more noise.” Knight also states that 1080p video should be a bit less noisy compared to 4K, since there’s more headroom to do heavyweight processing.
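The tradeoff Knight describes can be sketched with a toy sensor model. This is purely our own illustration (the function and numbers are hypothetical, not Google's actual pipeline): raising the ISO gain amplifies the captured signal and its noise together, so the frame gets brighter but noisier, and the signal-to-noise ratio doesn't improve.

```python
def capture(scene_luminance, iso_gain, read_noise=2.0):
    """Toy sensor model: higher ISO gain amplifies both the signal
    and the noise, so the frame is brighter AND noisier."""
    shot_noise = scene_luminance ** 0.5      # shot noise scales with sqrt(signal)
    brightness = scene_luminance * iso_gain  # what the viewer sees
    noise = (shot_noise + read_noise) * iso_gain
    return brightness, noise

dim = capture(scene_luminance=100, iso_gain=1)     # darker frame, less noise
bright = capture(scene_luminance=100, iso_gain=8)  # brighter frame, more noise
```

In this simplified model the brighter frame carries proportionally more noise, which is why dimming the image is the "easy" way to hide noise that Knight mentions.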
Another feature that’s missing on the Pixel 2 is 60fps support at 4K, something that the iPhone 8 Plus and iPhone X boast of. “4K in 60[fps], sadly, is not something we’re going to bring to Pixel 2,” says Knight. “For future products, we’ll certainly consider it. But for Pixel 2, 4K 30 and 1080p 60 are the video modes we plan to support.” This limitation seems to have more to do with Qualcomm’s Snapdragon 835 chipset than anything else, however.
If you have looked in the settings of the Pixel 2’s camera app, you will notice that enabling manual control for HDR+ gives you a second option in the viewfinder, called HDR+ enhanced. When we reviewed the Pixel 2 and the Pixel 2 XL, we didn’t actually notice any quality difference between the two modes, other than the fact that it takes longer to process the HDR+ enhanced photo. Turns out, we were right.
“In the large majority of instances, there is no difference. From a user perspective, HDR+ and HDR+ enhanced will take the same photograph,” explains Knight. “In a few conditions, HDR+ enhanced might take a photograph which has a little more dynamic range.” The reason the enhanced mode takes longer to process is that in regular HDR+ mode, Zero Shutter Lag (ZSL) is on, whereas in the enhanced mode, it’s off. Shutter lag is the delay between the moment you press the camera button and when the picture is actually recorded and saved. Zero Shutter Lag typically gives you near-instantaneous shots, with almost zero delay.
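The way Zero Shutter Lag achieves its near-instant shots is usually a ring buffer: the camera keeps capturing frames while the viewfinder is open, and pressing the shutter simply saves a frame that was already captured. A minimal sketch of that idea (the `ZslCamera` class is hypothetical, for illustration only):

```python
from collections import deque

class ZslCamera:
    """Toy ZSL model: continuously buffer recent viewfinder frames,
    and on shutter press return one that already exists instead of
    starting a new exposure."""
    def __init__(self, buffer_size=4):
        self.ring = deque(maxlen=buffer_size)  # old frames drop off automatically

    def on_new_frame(self, frame):
        # Called for every frame while the viewfinder runs.
        self.ring.append(frame)

    def press_shutter(self):
        # Zero lag: hand back the newest already-buffered frame.
        return self.ring[-1]

cam = ZslCamera()
for frame_id in range(10):   # viewfinder streams frames 0..9
    cam.on_new_frame(frame_id)
shot = cam.press_shutter()   # the most recent frame, no new exposure needed
```

With ZSL off, as in HDR+ enhanced, the camera is free to start a fresh, longer capture after the button press, which plausibly explains the extra processing time.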
We initially assumed that the Pixel 2’s Visual Core imaging chip would help speed this process up once it’s activated in the Android 8.1 update, but that does not seem to be the case. The Visual Core SoC’s primary purpose is to enable third-party camera apps to use the HDR+ feature. “When third parties use the camera API, they’ll be able to get the high-quality processed pictures as a result,” says Rakowski.
Finally, the absence of manual controls and RAW file support is another bummer in the new camera app. This is an area that other Android makers like Samsung and HTC have really mastered over the years. Not everyone needs manual controls, but it’s nice to have the option, especially when you want to take some artistic shots, and it’s very helpful in low light. Having this feature would also help control the exposure in video, for those who prefer to capture the scene for what it is instead of brightening things up. However, Knight isn’t convinced that simply putting in sliders for ISO, aperture, and so on is the best interface for a phone. He further claims that in doing so, users would not be able to benefit from HDR+, so image quality would suffer.
Google might add some amount of manual control in the future, “but at the moment, don’t expect to see a manual slider anytime soon,” says Knight. It appears that Google is relying heavily on its machine learning to improve photos and make them look as good as they do, which might explain why it isn’t willing to relinquish control to the user. This applies to RAW file support too.
“We’ve received similar feedback from other users too [about RAW support]. We don’t have any updates today, but we are looking into it,” says Knight.