Born nine years after the iPhone and eight years after the first Android phone, the Pixel arrived with a special mission. At a time when the smartphone market was slumping and hardware partners relied on Google's operating system, the Pixel phone was Google's own vision for the future of Android. Built on the same Snapdragon chip found in dozens of other flagships, the same sensor bought from Sony, and the same screen bought from Samsung or LG, the Pixel had to prove that Google's hardware truly had a place of its own in the smartphone world.
It didn't take Google long to prove it. With a dedicated AI processor, the first-generation Pixel combined up to 10 frames to create a single image. The very concept of a "moment" was broken: the resulting photo is not one real-life instant but a blend of 10 different frames. The Pixel began building its legend in computational photography.
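The core of merging a burst into one shot can be sketched in a few lines. This is a toy illustration under my own assumptions, not Google's pipeline (real HDR+ also aligns tiles and rejects motion between frames, which is omitted here): averaging N noisy frames cuts noise by roughly √N.

```python
import numpy as np

def merge_burst(frames):
    """Merge a burst of aligned frames into one image by averaging.

    `frames`: list of HxWx3 uint8 arrays, assumed already aligned
    (real HDR+ also aligns and rejects moving objects; omitted here).
    Averaging N frames reduces random sensor noise by ~sqrt(N).
    """
    stack = np.stack([f.astype(np.float32) for f in frames])
    merged = stack.mean(axis=0)
    return np.clip(merged, 0, 255).astype(np.uint8)

# Toy demo: 10 noisy captures of a flat gray scene
rng = np.random.default_rng(0)
truth = np.full((4, 4, 3), 128.0, dtype=np.float32)
frames = [
    np.clip(truth + rng.normal(0, 20, truth.shape), 0, 255).astype(np.uint8)
    for _ in range(10)
]
result = merge_burst(frames)  # much closer to 128 than any single frame
```

The averaged result sits far closer to the true scene value than any individual noisy frame, which is exactly why burst photography beats a single exposure in the dark.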
In 2017, with the second generation, Google used algorithms to create astonishing low-light photos. More remarkable still, with just one camera – one sensor and one lens – Google could produce more accurate bokeh (background-blur) portraits than smartphones with dual cameras.
The legend continued with the Pixel 3. Still with a single camera, and without a periscope lens like the P30 Pro or OPPO Reno, Google delivered a digital zoom feature on par with optical zoom.
Unlike the previous three generations, the Pixel 4 leaked in full before its launch. A store in Vietnam even had units for sale two months before Google's announcement in New York. All attention focused on the new dual camera, the first ever on a Pixel (and on the ugly thick-bezel design). But dual cameras have been on smartphones for five years. It is fair to say nobody was expecting a surprise from the new Pixel's camera.
Yet Google once again proved its camera prowess at a level beyond everyone else. On the Pixel 4, Google unveiled an audacious idea called "Live HDR+". If you have ever used an HDR feature, you know it is normally applied only AFTER the image is taken – in essence, HDR combines multiple frames to retain the best amount of detail. But Google's HDR works in real time: the effect is applied continuously inside the camera app. What you see on the screen is the HDR image you will capture.
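Conceptually, a real-time HDR preview means applying a cheap approximation of the final tone mapping to every viewfinder frame, instead of only to the captured shot. The sketch below is my simplification: a single global gamma curve stands in for the learned local tone curves that the real Live HDR+ computes.

```python
import numpy as np

def preview_tone_map(frame, gamma=0.5):
    """Apply a cheap global tone curve to one viewfinder frame.

    `frame` is an HxWx3 float array in [0, 1]. A gamma < 1 lifts the
    shadows the way an HDR merge would; the real Live HDR+ predicts
    local curves per region, which this toy version does not.
    """
    return np.power(np.clip(frame, 0.0, 1.0), gamma)

# Viewfinder loop sketch: tone-map every frame before it is displayed
frames = [np.full((2, 2, 3), v) for v in (0.04, 0.25, 0.64)]
preview = [preview_tone_map(f) for f in frames]
```

The point is the placement, not the curve: because the approximation is cheap enough to run on every frame, the preview and the final photo finally agree.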
No other smartphone company has pulled this off. Remember, HDR is a feature that traditionally can only be applied once multiple captured frames are combined. Through algorithms and a superior AI engine, Google makes it happen live, so users can enjoy the full benefit of HDR before pressing the shutter.
Equally impressive is the ability to adjust shadows and highlights independently. Until now, smartphone exposure controls have brightened or darkened the entire frame: brighten it and the highlights get blown out; darken it and the shadows turn pitch black. Google solves the problem itself: the camera automatically detects bright and dark regions and then offers two corresponding, independent sliders. The dilemma of brightening "burning" the image or darkening "crushing" it is gone: you can correct your photo without worrying about ruining the rest of the frame.
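The two-slider idea can be modeled as a per-pixel blend of two gains driven by a luminance mask. This is my own minimal sketch, not Google's actual algorithm (which builds far more sophisticated local masks):

```python
import numpy as np

def dual_exposure(img, shadow_gain=1.0, highlight_gain=1.0):
    """Toy version of independent shadow/highlight sliders.

    `img`: HxWx3 float array in [0, 1]. A soft luminance mask decides
    how "shadow" each pixel is; the two gains are blended per pixel,
    so lifting shadows leaves highlights untouched and vice versa.
    """
    luma = img.mean(axis=2, keepdims=True)   # crude per-pixel luminance
    shadow_weight = 1.0 - luma               # 1 in black areas, 0 in white
    gain = shadow_gain * shadow_weight + highlight_gain * (1.0 - shadow_weight)
    return np.clip(img * gain, 0.0, 1.0)

# Lift the shadows 1.5x while pulling the highlights down slightly
img = np.array([[[0.1, 0.1, 0.1], [0.9, 0.9, 0.9]]])
out = dual_exposure(img, shadow_gain=1.5, highlight_gain=0.9)
```

Run on this two-pixel frame, the dark pixel gets brighter while the bright pixel gets darker – the opposite of what a single global exposure slider could ever do.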
"This is an adjustment that has never been seen on any other camera," said Marc Levoy of Google Research. Indeed, this way of controlling exposure is not available on the iPhone, Galaxy, Mi MIX, or Mate/P series. It is a first-of-its-kind feature that will surely catch on. Have you ever photographed a dark subject only to blow out the bright background behind it? Have you ever tried to create a beautiful silhouette only to darken the entire frame?
Everything about the Pixel 4 sounds impossible. It can photograph the night sky by stacking 15 consecutive 16-second exposures into a roughly 4-minute AI-assembled capture. At exposure times like that, the flash is relegated to flashlight duty. White balance is also optimized by AI to eliminate color casts.
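The arithmetic above (15 frames × 16 s ≈ 4 minutes) and the AI white balance can both be sketched with naive stand-ins. In the toy below, plain frame averaging replaces Google's aligned merge, and a classic gray-world gain replaces the learned white-balance model; both substitutions are my assumptions for illustration only.

```python
import numpy as np

def stack_and_white_balance(frames):
    """Average many short exposures, then gray-world white balance.

    `frames`: list of HxWx3 float arrays in [0, 1], assumed aligned.
    Short exposures keep stars as points instead of trails; stacking
    recovers the light of one long exposure. Gray-world scales each
    channel so the average color becomes neutral, removing the cast
    (Google uses a learned model; this is the textbook stand-in).
    """
    stacked = np.mean(np.stack(frames), axis=0)
    channel_means = stacked.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / np.maximum(channel_means, 1e-6)
    return np.clip(stacked * gains, 0.0, 1.0)

# 15 dim "16-second" frames of a night sky with a reddish cast
rng = np.random.default_rng(1)
base = np.full((8, 8, 3), [0.06, 0.04, 0.04])   # red-tinted darkness
frames = [np.clip(base + rng.normal(0, 0.02, base.shape), 0, 1)
          for _ in range(15)]
out = stack_and_white_balance(frames)
```

After stacking and balancing, the red, green, and blue channel averages come out nearly equal: the color cast is gone without touching overall brightness.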
Now look at what Apple and Samsung are doing. Deep Fusion can indeed produce beautiful photos, and Night Mode on the Galaxy S10 genuinely improved low-light quality when it arrived as a software update in May. With the iPhone XR, Apple borrowed the Pixel 2's legendary trick of shooting background-blurred bokeh with a single camera. Apple and Samsung also have features Google lacks, such as portrait-lighting effects (iPhone) or shot-flaw detection (a dirty lens, someone blinking) on the Galaxy.
But the smartphones of Apple, Samsung, and every other big player lack the seemingly impossible features the Pixel 4 has. None of them has pioneered the improbable (but extremely useful) tricks found on Google's latest phones. The reason is simple: no other smartphone maker is an AI powerhouse like Google. This year, next year, it will be the same: no matter what anyone builds or shows off, any conversation about the best "AI camera phone" can only be about Google.