Rick Osterloh sat down on the couch, dropped his laptop onto it and leaned back, content. Had the laptop been a microphone, it would have been a mic drop. Google's hardware chief had just shown me a demo of its latest feature, computational processing for video, coming to the Pixel 6 and Pixel 6 Pro. It's a feature, he said, that is only possible because of Google's own mobile processor, which the company announced today.

He was obviously proud of the news and eager to share it. The chip is called Tensor, and it's the first SoC (system-on-chip) designed by Google. Osterloh said the company has been working in this space for about five years, though in a statement, CEO Sundar Pichai said Tensor was four years in development and draws on two decades of Google's computing experience.

Google is best known for its software expertise. It's a leader in computational photography, with its Night Sight mode for low-light shots, and it stunned the world with its conversational AI Duplex's ability to imitate human speech. As he talked, Osterloh gestured animatedly, a Diet Coke in his other hand.

Tensor, he said, enables experiences that other chips (the company used Qualcomm's Snapdragon processors for its previous phones) could not deliver, such as running multiple AI-intensive tasks simultaneously without the phone overheating, or applying computational processing to video as it's being captured.

Google is announcing Tensor today, ahead of the Pixel 6's launch this fall, and the timing reflects how important the company believes the chip is. It isn't yet sharing full details about the processor or specifics on its new flagships, but Osterloh said there's a lot more information to come and he wanted to provide context early. It's a big change, he said, and the company thinks it's important to get started early.

There's an added benefit to announcing early, Osterloh acknowledged: the information tends to come out anyway.
"It's like, stuff leaks nowadays," he said.

A new AI-infused chip design

Thanks to those leaks, we've heard plenty of rumors about Google's plans to create its own mobile processor, under the code name Whitechapel. Google won't comment on code names, but it's clear that work on Tensor has been underway for quite some time.

Osterloh explained that the chip's name is a nod to TensorFlow, Google's open-source platform for machine learning, which should give you an idea of how central AI is to this processor. Google hasn't revealed all the details about Tensor yet, but Osterloh said the SoC is an ARM chip built around a TPU, or Tensor Processing Unit. The mobile TPU is based on the larger versions Google uses in its data centers, and the company's AI researchers co-designed the chip.

Tensor isn't only designed to speed up machine-learning tasks on your phone. Osterloh said the team also redesigned the ISP, the image signal processor. There are now a few places in the ISP where machine learning can be inserted, he said, which is new. Google also rebuilt the memory architecture to make it easier to access RAM and manipulate data when processing images, and in a few instances it embedded its image-processing algorithms directly into hardware. All of this lets Google do things that weren't possible on standard SoCs, though he didn't share specifics about what Tensor now allows that other SoCs couldn't.

Osterloh admitted that people may view the company as unproven, given this is Google's first mobile chip, but he countered that people are well aware of Google's capabilities.

Naturally, you might wonder whether the company can compete in areas like power efficiency and heat management. Osterloh said Tensor is designed to be more efficient than other processors while staying within a temperature threshold, and that in this respect it works like other systems:
"We can use the most efficient subsystem for the task," he said.

Despite the ongoing worldwide chip shortage, Osterloh believes Google will be able to manage demand. Everyone is affected by the shortage, he said, without a doubt. But the upside is that Google made the chip and is responsible for it, so the company thinks it should be fine.

What can Tensor do?

So what can Tensor do better than other mobile processors? Google is saving the best bits for the Pixel 6's launch this fall, but it did point to two areas with significant improvements: voice recognition and image processing. At our meeting, Osterloh demonstrated some Tensor-enabled features on both the Pixel 6 and Pixel 6 Pro, which also gave us our first glimpse of the phones. The handsets have a new design with bright colors and a horizontal camera bump that spans the width of the rear, an intentional design element, Osterloh said. "We have been well-known for our photography and [so] we wanted this to be highlighted."

Google upgraded the cameras, but the promised improvements in photography don't come from the optical hardware alone; Tensor is responsible for some of them. The company ran into problems with previous chips when it tried to improve the photography experience on its phones. Those chips weren't made for machine learning or AI, he said, and weren't optimized for the direction Google is taking.

Where is Google heading? Osterloh and his colleagues see a future of ambient computing, a vision he has shared before. In that future, all the sensors and devices around you communicate with Google (sometimes via the Assistant or the internet), and Osterloh is certain the smartphone, the central point of it all, will remain the most important device for most people.

To get where it wants to go, Google needed a way to push past the limits of current processors.
When faced with these kinds of engineering and technical constraints in the past, Osterloh said, the company has always taken on the problem itself.

Photo and video processing upgrades

Tensor allows the Pixel 6 to capture images from two sensors simultaneously, with the main sensor recording at normal exposure while the wide-angle runs at a much faster shutter speed. The system also uses a variety of machine-learning models to analyze the scene, Osterloh explained, accounting for things like the presence of a face or whether the device is shaking. The upshot: you'll be less likely to get blurry shots when you're trying to capture hyperactive toddlers or puppies.

Tensor will also let Google perform computationally demanding tasks while you're shooting video. Osterloh said the company hasn't been able to apply much machine learning to video in the past because it would be too demanding for a phone processor. Tensor changes all that, he said: the team was able to run an HDRnet model on video, which dramatically improves quality in difficult situations, such as when the camera is pointed at the sun.

Osterloh demonstrated both of these on the Pixel 6, showing me before-and-after examples of blurry photos of active children and video comparisons of a campground at sunset. The difference was noticeable, but I can't show you the actual results, and these were demos Google had prepared. The features will only prove useful and impressive once they're tested in the real world.

Improvements in speech and voice

I did, however, get to see a more compelling preview for myself. Osterloh showed me voice dictation on the Pixel 6 via Gboard. The new phone will let you narrate your message through the microphone and use hotwords like "Clear" or "Send" to trigger actions, and the onscreen keyboard lets you edit typos while the mic is still listening to your dictation. All of this is possible via the Speech On Device API.
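The dual-capture idea described above, a normally exposed frame merged with a sharper faster-shutter frame, with machine-learning scene analysis deciding how much to trust each, can be sketched very roughly in a few lines. Everything here is an assumption for illustration: the name `merge_dual_capture`, the single `motion_score` input and the simple linear blend are not Google's actual pipeline, which would also involve alignment, denoising and much more.

```python
def merge_dual_capture(main_px, short_px, exposure_ratio, motion_score):
    """Blend pixels from a normal exposure with a faster-shutter frame.

    main_px, short_px: luminance values in [0, 1], assumed already aligned.
    exposure_ratio: how much longer the main exposure was (e.g. 4.0).
    motion_score: 0.0 (static scene) to 1.0 (heavy shake/motion); a real
        pipeline would derive this from ML scene analysis (faces, shake).
    """
    w = min(max(motion_score, 0.0), 1.0)
    merged = []
    for m, s in zip(main_px, short_px):
        # Brighten the short exposure so its tones match the main frame.
        gained = min(s * exposure_ratio, 1.0)
        # The more motion detected, the more we trust the sharp frame.
        merged.append((1.0 - w) * m + w * gained)
    return merged
```

With no detected motion the output is just the well-exposed main frame; with heavy motion it leans on the brightened, sharper short exposure.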
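The hotword mechanic, where a trailing "Send" or "Clear" only triggers an action when additional cues back it up, can be mimicked with a toy decision rule. The `handle_transcript` function and its `command_confidence` score are invented stand-ins; the signals Google's system actually weighs (tone, delivery) are not public.

```python
def handle_transcript(words, command_confidence, threshold=0.8):
    """Decide whether a trailing hotword is a command or dictated text.

    words: recognized words so far, e.g. ["on", "my", "way", "send"].
    command_confidence: score in [0, 1] standing in for the extra cues
        (tone, delivery, pauses) the real system reportedly considers.
    Returns (action, text): action is "send", "clear", or None.
    """
    hotwords = {"send", "clear"}
    if words and words[-1].lower() in hotwords and command_confidence >= threshold:
        # Treat the trailing hotword as a command, not part of the message.
        return words[-1].lower(), " ".join(words[:-1])
    # Low confidence: the hotword stays in the dictated text.
    return None, " ".join(words)
```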
What amazed me was that the system could distinguish between you dictating the word "send" and you telling it to send the message. Osterloh explained that the algorithm looks beyond the hotword, considering your tone and delivery before it triggers an action.

Osterloh also showed me Live Caption with Translate, along with Android 12's Material You design. Tensor powers Android's Live Caption feature, which provides subtitles for any audio on your device and can now also translate what's being said in real time. It all happens on-device, so it doesn't matter if the TED Talks or other international programs you're watching don't have subtitles.

A look at Material You

Osterloh described Material You, the UI update Google introduced at I/O this year, as the most significant UI change in Android since the beginning. Android 12 has been in public beta for some time, but the full Material You experience isn't there yet, so Osterloh demonstrated how it works. He changed the wallpaper on a Pixel 6 to a pinkish scene of water, and the system icons and highlights quickly updated to match. The app icons were repainted to match as well, but the demo taught me something new: if you prefer your icons in their original colors, you have the option to leave them alone.

Although we now have a good idea of what's in store for the fall, Google is keeping many details secret. It isn't saying whether Tensor was developed with help from other manufacturers, and details about the CPU and GPU cores, clock speeds and other components will be revealed later in the year. Still, with its new chip, Google has finally fulfilled a long-held dream.

Osterloh said the company kind of views this as "The Google Phone": what it set out to build many years ago, and now it's finally here.
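As a rough illustration of the wallpaper-driven theming in the Material You demo above: pick the wallpaper's dominant color, then spin out lighter and darker tonal variants for system accents. This is a simplified assumption throughout (the `dynamic_palette` function, the crude color quantization, the HLS-based tones are all invented here); Android's actual color extraction is considerably more sophisticated.

```python
import colorsys
from collections import Counter

def dynamic_palette(wallpaper_pixels, tones=(0.9, 0.7, 0.4)):
    """Derive theme colors from a wallpaper, Material You-style (toy version).

    wallpaper_pixels: list of (r, g, b) tuples with 0-255 channels.
    tones: lightness levels for the generated tonal variants.
    Returns (dominant_color, variants).
    """
    # Quantize so near-identical pixels vote for the same color bucket.
    quantized = [(r // 32, g // 32, b // 32) for r, g, b in wallpaper_pixels]
    qr, qg, qb = Counter(quantized).most_common(1)[0][0]
    r, g, b = qr * 32 + 16, qg * 32 + 16, qb * 32 + 16
    # Keep the dominant hue and saturation, vary only the lightness.
    h, _, s = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
    variants = []
    for light in tones:
        vr, vg, vb = colorsys.hls_to_rgb(h, light, s)
        variants.append((round(vr * 255), round(vg * 255), round(vb * 255)))
    return (r, g, b), variants
```

For a mostly pink wallpaper this yields a pink source color plus light, mid and dark pink accents, echoing how the demo phone's icons and highlights shifted when the wallpaper changed.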