

Google launches new Pixels

At its online event last night, Google showed the Pixel 6 and 6 Pro in action, with several must-have Android features powered by the company’s first in-house smartphone processor.



Google finally launched the Pixel 6 last night at an online event. Over the past few months, accurate leaks and even a full disassembly made us well acquainted with the hardware long before launch. Where Google surprised us, however, was with the software, which is the main draw of this handset.

Tensor, Google’s new in-house system-on-a-chip (SoC), debuts with the Pixel 6 family of devices. Google claims performance improvements of a similar magnitude to Apple’s recent move from third-party chips to its in-house M1 processors. Tensor also enables a host of new AI-driven features that speed up processing and improve battery life.

[Image: Diagram of the Tensor chip against a black background, divided into sections labeled TPU, ISP, Security, Context Hub, CPU, GPU, and System Cache.]

The most notable addition is the context hub, which enables what Google calls “Ambient Assistance”: showing users what they want, when they need it. For example, when it’s time for a flight, the context hub sorts through emails to find a boarding pass and shows its QR code on the user’s lock screen.

[GIF: The At A Glance feature on a new Pixel 6 phone.]

The Pixel 6 devices also feature a Titan M2 security chip inside Tensor to protect sensitive user data. Independent security lab testing showed that Titan M2 can withstand attacks like electromagnetic analysis, voltage glitching and even laser fault injection. Yes, Google says it shot lasers at its chips and they still didn’t give in.

On the outside, the Pixel 6 devices feature a long camera bump on the back and an edge-to-edge display with a hole-punch camera on the front.

Both the Pixel 6 and Pixel 6 Pro have a new 1/1.3-inch sensor on the back. This primary sensor captures up to 150% more light than the Pixel 5’s primary camera, meaning users can get better photos and videos with more detail and improved low-light performance. Both phones also have new ultrawide lenses with larger sensors, which can be used to fit more in a shot or to assist AI photography. On the front, there is an upgraded ultrawide camera that records 4K video.

The Pixel 6 Pro has a telephoto lens with 4x optical zoom and up to 20x zoom with an improved version of Pixel’s Super Res Zoom. 

On the software side, two new camera features were introduced: Real Tone for better image equity for more skin tones, and Magic Eraser to remove photobombers. 

[Image: Nine portraits made on the Pixel 6, featuring a diverse range of people of colour.]

Google admitted its AI portrait models were heavily skewed towards lighter-skinned subjects, making darker-skinned people appear blue, grey or ashy. To fix this, it commissioned professional photographers to show its models what a good photo of a darker-skinned person looks like, which significantly improved Google’s camera results.

Other smartphone brands with camera models similar to Google’s tend to deliver unfair experiences for people of colour, like over-brightening or unnaturally desaturating skin tones, in the pursuit of whiteness. That is why Google is launching Real Tone in Google Photos later this year, to help third-party Android brands address the same issue.

The Magic Eraser feature can detect distractions in photos, like people in the background, power lines and power poles, and suggest what a user might want to remove. Then, they can choose whether to erase them all at once or tap to remove them one by one.

[GIF: Magic Eraser in Google Photos on a Pixel 6 removing distractions from the background of a photo of a child at a pumpkin patch. A person and various items are circled and then erased, leaving just the child.]

The onboard AI can also help with phone calls to businesses and call centres. Before a user even places a call to a business number, they will see the current and projected wait times for the rest of the week. That can help them decide whether they have time to call now, or plan when to call later to avoid long waits. Google says the Wait Times feature is inferred from call length data that is not linked to user identifiers.
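Google hasn’t said how these projections are actually computed. Conceptually, though, a feature like this amounts to bucketing anonymized call-length samples by day and hour and averaging each bucket. Here is a minimal sketch of that idea; the function name and sample data are hypothetical:

```python
from collections import defaultdict
from statistics import mean

def projected_wait_times(call_lengths):
    """Aggregate anonymized (weekday, hour, minutes) samples into an
    average wait per time slot. Purely illustrative: Google has not
    published how Wait Times is computed, only that it is inferred
    from call length data not linked to user identifiers."""
    buckets = defaultdict(list)
    for weekday, hour, minutes in call_lengths:
        buckets[(weekday, hour)].append(minutes)
    # One projected wait per (weekday, hour) slot, rounded to 0.1 min.
    return {slot: round(mean(values), 1) for slot, values in buckets.items()}

# Hypothetical samples: no user identifiers, only a time slot and a duration.
samples = [("Mon", 9, 12.0), ("Mon", 9, 8.0), ("Mon", 14, 3.0)]
print(projected_wait_times(samples))  # {('Mon', 9): 10.0, ('Mon', 14): 3.0}
```

Because the inputs carry no identifiers, the aggregation itself never sees who made each call, which matches Google’s description of the data.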

Once a user calls a business, Direct My Call helps them get to the right place with less hassle. Google Assistant transcribes the automated message and menu options in real time and displays them on screen to see and tap, so no one needs to remember all the options. Direct My Call is powered by Google’s Duplex technology, which uses advanced speech recognition and language understanding models to determine when the business wants a user to do something, like select a number (“Press 1 for hours and locations”), say a word (“Say ‘representative’ to speak with one of our agents”) or enter an account number.

When calling a business, see menu options on the screen for you to tap.
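The real feature relies on Duplex’s speech and language models, but the end result, turning a spoken menu into structured, tappable options, can be illustrated with a much cruder sketch based on pattern matching. Everything here (function name, option format, regex patterns) is hypothetical:

```python
import re

def parse_menu_options(transcript):
    """Extract tappable options from a transcribed phone menu.
    Illustrative only: Direct My Call uses Duplex speech recognition
    and language understanding, not regular expressions."""
    options = []
    # "Press 1 for hours and locations" -> a key the user can tap.
    for m in re.finditer(r"[Pp]ress (\d+) (?:for|to) ([^.]+)", transcript):
        options.append({"action": "press", "key": m.group(1),
                        "label": m.group(2).strip()})
    # "Say 'representative' to speak with one of our agents".
    for m in re.finditer(r"[Ss]ay '(\w+)' to ([^.]+)", transcript):
        options.append({"action": "say", "key": m.group(1),
                        "label": m.group(2).strip()})
    return options

transcript = ("Press 1 for hours and locations. Press 2 for billing. "
              "Say 'representative' to speak with one of our agents.")
for option in parse_menu_options(transcript):
    print(option)
```

Each extracted option carries enough structure (`action`, `key`, `label`) for a UI to render it as a button, which is roughly what the on-screen menu does.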

Google Assistant can also wait on hold for the user, and it knows the difference between a hold message and a human representative, alerting the user once they’re put through.

Press “Hold for me” and let Google Assistant wait on hold, then notify you when someone is ready to talk.

The Google Pixel 6 will be available in the US from 28 October, starting at $599 for the standard version, and $899 for the larger Pro version.