Google explains how Live HDR+ works on the Pixel 4 and 4a


This week, Google officially unveiled the Pixel 4a, the company's new mid-range phone, after months of rumors and speculation about its arrival on the market. As with the brand's other phones, the camera is this model's strong point, positioned to be one of the standouts in this field, even though it will not reach Spain for now.

One of the most important functions on the Pixel 4a, as well as on the Pixel 4, is Live HDR+. Google has now explained this feature, which lets the user see an accurate preview of how the photo will look before it is taken. It is a complex function, but one that the company now describes in detail.


This is how Live HDR+ works on the Pixel 4a

According to Google's blog post, until not long ago the company could not compute HDR in real time. That prevented the result from being shown in the viewfinder, something that finally changed with the launch of the Pixel 4, and this Live HDR+ technology has now been carried over to the new Pixel 4a. Thanks to it, the viewfinder shows a preview that closely approximates the final photo.

HDR+ processing is fairly complex: the phone captures between 3 and 15 photos at different exposures. Google's algorithm applies a different tone curve to each image, reallocating the pixel values of each one, and then combines those values to produce a final photo with the highest possible dynamic range. This makes it possible to get the best photo the scene allows.
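To make the idea concrete, here is a minimal sketch of an exposure merge: a weighted combination of bracketed frames followed by a global tone curve. This is a toy illustration of the general technique, not Google's actual HDR+ pipeline, and all names are my own.

```python
import numpy as np

def merge_exposures(frames, exposures):
    """Toy HDR merge: combine differently exposed frames into one image.

    frames    : list of float arrays with values in [0, 1]
    exposures : relative exposure time of each frame
    """
    acc = np.zeros_like(frames[0])
    weight_sum = np.zeros_like(frames[0])
    for frame, t in zip(frames, exposures):
        # Trust mid-tone pixels most; near-black and near-white pixels
        # carry little usable information in that frame.
        w = 1.0 - np.abs(frame - 0.5) * 2.0
        acc += w * (frame / t)        # back-project to scene radiance
        weight_sum += w
    radiance = acc / np.maximum(weight_sum, 1e-6)
    # Simple global tone curve (gamma) to compress the range for display.
    return np.clip(radiance / radiance.max(), 0.0, 1.0) ** (1 / 2.2)
```

A real pipeline also aligns the frames and uses learned per-region tone curves rather than a single global gamma, but the weighting-and-merge step above is the core idea.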



Since HDR+ is a slow process, bringing it to the preview is tricky. Google presents Live HDR+ as the solution: it runs the same processing, but at a reduced scale. On the Pixel 4 and Pixel 4a, the computation described above is applied only to small tiles of the scene, so the phone can approximate the necessary processing in advance and show it in the viewfinder.
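The tile idea can be sketched as follows: instead of solving the tone mapping per pixel, compute a cheap per-tile adjustment on a coarse grid and spread it back over the full image. This is an illustrative approximation under my own assumptions, not Google's code.

```python
import numpy as np

def tile_preview(image, tile=8, target_mean=0.5):
    """Approximate preview processing on a coarse tile grid.

    image : 2-D float array in [0, 1], with sides divisible by `tile`
    """
    h, w = image.shape
    gh, gw = h // tile, w // tile
    gains = np.empty((gh, gw))
    for i in range(gh):
        for j in range(gw):
            block = image[i * tile:(i + 1) * tile, j * tile:(j + 1) * tile]
            # One scalar gain per tile, pushing the tile toward a target
            # brightness, stands in for a full per-tile tone curve.
            gains[i, j] = target_mean / max(block.mean(), 1e-3)
    # Upsample the coarse gain map back to full resolution (nearest
    # neighbour here; a real pipeline would interpolate smoothly).
    full = np.repeat(np.repeat(gains, tile, axis=0), tile, axis=1)
    return np.clip(image * full, 0.0, 1.0)
```

The point of the trick is cost: the expensive analysis runs on a handful of tiles rather than on millions of pixels, which is what makes a real-time preview feasible.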

This processing is handled by the GPU together with a neural network called HDRNet, which learns to predict, from a low-resolution input, the tone curves to apply in the high-resolution viewfinder. Google has gone further on the Pixel 4 and Pixel 4a by introducing dual-exposure controls: while the preview shows the HDR result in real time, two independent sliders let the user adjust the capture.
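Two independent controls might behave roughly like this sketch, where one slider scales overall exposure in stops and the other lifts or crushes shadows. The parameter names and the exact math here are hypothetical; Google has not published this code.

```python
import numpy as np

def dual_exposure(image, brightness=0.0, shadows=0.0):
    """Hypothetical dual-exposure adjustment on an image in [0, 1].

    brightness : overall exposure slider, in stops (roughly [-2, 2])
    shadows    : shadow slider; positive values lift dark tones
    """
    out = image * (2.0 ** brightness)      # scale overall exposure
    # A gamma adjustment changes dark tones proportionally much more
    # than bright ones, so it acts as an independent shadow control.
    gamma = 2.0 ** (-shadows)
    out = np.clip(out, 0.0, 1.0) ** gamma
    return np.clip(out, 0.0, 1.0)
```

With both sliders at zero the image passes through unchanged, which is what lets the two controls compose independently on top of the live preview.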

An important technology for Google that improves the camera on the Pixel 4 and Pixel 4a, and a good incentive to buy these models, especially for users who value the camera and image processing.
