
Instagram, Snapchat, and WhatsApp get better photos on the Pixel 2

The Pixel 2 cameras.

Ron Amadeo

Google is opening up the Pixel 2's Google-designed machine learning SoC, the Pixel Visual Core, to third-party applications. The first apps to take advantage of the chip are Snapchat and Facebook's stable of social media apps: Facebook, Instagram, and WhatsApp. With the February Android security update for the Pixel 2, each app will use Google's HDR+ photo processing for its in-app photos.

With the launch of Android 8.1 Oreo, Google activated the Pixel Visual Core in the Pixel 2 and Pixel 2 XL and added a "Neural Networks API" to Android. The new API lets apps tap into whatever machine learning acceleration hardware is present in a device, and the Pixel Visual Core is one of the first examples. Google's HDR+ photo algorithm is some of the first software written for the Pixel Visual Core, and it is now open to more than just the Google Camera app.
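For developers, the Neural Networks API is exposed as a native C API and through frameworks that sit on top of it, such as TensorFlow Lite. The following is only a rough Kotlin sketch, assuming a TensorFlow Lite model bundled in the app's assets under the hypothetical name "model.tflite", of how an app might route inference through NNAPI-capable hardware:

    import android.content.Context
    import org.tensorflow.lite.Interpreter
    import org.tensorflow.lite.nnapi.NnApiDelegate
    import java.io.FileInputStream
    import java.nio.MappedByteBuffer
    import java.nio.channels.FileChannel

    // Load a TensorFlow Lite model from the app's assets (filename is illustrative).
    fun loadModel(context: Context, assetName: String): MappedByteBuffer {
        val fd = context.assets.openFd(assetName)
        FileInputStream(fd.fileDescriptor).channel.use { channel ->
            return channel.map(FileChannel.MapMode.READ_ONLY, fd.startOffset, fd.declaredLength)
        }
    }

    // Build an interpreter that delegates supported operations to the
    // Neural Networks API, which may dispatch them to an accelerator
    // (such as the Pixel Visual Core) when the device and driver allow it.
    fun buildAcceleratedInterpreter(context: Context): Interpreter {
        val nnApiDelegate = NnApiDelegate()
        val options = Interpreter.Options().addDelegate(nnApiDelegate)
        return Interpreter(loadModel(context, "model.tflite"), options)
    }

Whether any given operation actually lands on the Pixel Visual Core is up to the NNAPI drivers on the device; the app just asks for acceleration.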

Google's HDR+ algorithm takes a burst of photos with short exposure times, aligns them to account for any movement, and averages them together. The result is a much better image, with less noise and higher dynamic range. The frames are also oversampled, providing more detail than a single 12MP shot. HDR+ is good enough that the Android modding community has ported it from the Pixel-exclusive Google Camera app to other devices, where turning on HDR+ instantly improves camera output.
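Stripped of the alignment, ghost rejection, and tone-mapping details, the merge step boils down to averaging aligned frames to beat down noise. A toy Kotlin sketch (not Google's implementation) makes the idea concrete:

    // Toy illustration of the merge step: averaging N aligned frames cuts
    // per-pixel noise by roughly a factor of sqrt(N). Real HDR+ also aligns
    // image tiles, rejects ghosting from motion, and tone-maps the result.
    fun mergeBurst(frames: List<IntArray>): IntArray {
        require(frames.isNotEmpty()) { "Need at least one frame" }
        val pixelCount = frames.first().size
        require(frames.all { it.size == pixelCount }) { "Frames must match in size" }

        val sums = LongArray(pixelCount)
        for (frame in frames) {
            for (i in 0 until pixelCount) {
                sums[i] += frame[i].toLong()
            }
        }
        return IntArray(pixelCount) { i -> (sums[i] / frames.size).toInt() }
    }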

Before this release, HDR+ on the Pixel had a big catch: it was exclusive to the Google Camera app, so if you used any other camera app, the algorithm wasn't available and you ended up with inferior photos. While any Pixel app can call up the Google Camera app with a simple "shoot" intent, apps can't build custom features on top of Google's camera app. That's a problem for something like Snapchat's "Lens" camera effects, which require building a custom camera interface from scratch and, until now, giving up HDR+. It's doubtful that an extra SoC was needed to open the HDR+ algorithms to third parties, but now all Pixel photos will be on equal footing, as long as the app supports the Pixel Visual Core.
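For illustration, the "shoot" intent route looks roughly like the Kotlin sketch below (the request code is an arbitrary placeholder); anything more ambitious, such as live lens effects, means driving the camera hardware directly with a custom camera pipeline instead:

    import android.app.Activity
    import android.content.Intent
    import android.provider.MediaStore

    const val REQUEST_IMAGE_CAPTURE = 1  // arbitrary request code

    // Hand photo capture off to the device's default camera app via an intent.
    // On a Pixel 2 that means the shot comes from Google Camera, HDR+ included,
    // but the calling app gets no control over the capture UI or pipeline.
    fun Activity.launchCameraApp() {
        val intent = Intent(MediaStore.ACTION_IMAGE_CAPTURE)
        if (intent.resolveActivity(packageManager) != null) {
            startActivityForResult(intent, REQUEST_IMAGE_CAPTURE)
        }
    }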

Snapchat and Facebook are just the launch partners for this feature. The HDR+ algorithms are now open to any developer who wants to plug into Google's chip.
