Now we know the Pixel 6 is real, and that it will feature Google's own SoC, "Tensor," which powers the features that make a phone feel distinctly Pixel-ish. The first part is great on its own: the Pixel 6 finally looks like a flagship Pixel phone. The part about Google's chip is even more interesting.

The Pixel 6 could flop, or it could become one of the most popular Android phones around; we won't be able to find out much more until we can get one in hand. Google has high hopes for the Tensor chip, and it has a lot more at stake than a single phone. If Tensor can do everything a smartphone asks of it, it will be an achievement very few other companies can copy.

But let's set the Pixel 6 itself aside and talk about the chip. A chip called "Tensor" can be confusing if you aren't familiar with Google and everything else it does. A tensor is a muscle that tenses in order to stretch another part of the body; everyone has them in their ears and mouth. Tensor is also a company that makes skateboard parts, which is interesting. And in very geeky, Good Will Hunting-level mathematics, a tensor is an abstract object that expresses a multilinear relationship. Guess which definition Google loves the most?

Many tensors. Source: Google

That's because the geeky math definition is the one used to build artificial intelligence, and Google loves everything about AI. It loves AI so much that the first chip it ever designed was called the Tensor Processing Unit. So the Pixel 6's Tensor is not the first chip Google has built, and it isn't even the first Google chip to carry the Tensor name.

This isn't the first Google Tensor chip.

Most companies do machine learning and AI calculations with what are called neural networks. I won't pretend to understand much about the neural networks in my own body beyond the fact that they sit somewhere in the middle between an input and an output: for example, between your eyes spotting an object and your hand reaching out to grasp and move it.

Computer neural networks work the same way: they take one or more inputs, drive them through a series of hidden layers ("hidden" because they sit between the input and the output, so you only ever see what goes in and what comes out), and then shuffle the result out through some kind of output. When programmers feed a neural network photos of cats and chain-link fences, those layers in the middle do the "learning" and "remember" what the network has learned.
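To make that input-to-hidden-layers-to-output picture concrete, here is a minimal sketch of a tiny neural network written in Python with NumPy. It is purely illustrative and has nothing to do with how Google's Tensor chip or the TPU actually works; the two-feature "cat vs. chain-link fence" inputs, the labels, and the network size are all invented for the example.

```python
# A toy illustration of the "input -> hidden layer -> output" idea above.
# NOT how Google's Tensor chip or TPU is implemented; this is a minimal
# NumPy sketch, and the "furriness"/"straight-line-ness" features are made up.

import numpy as np

rng = np.random.default_rng(0)

# Fake training data: each row is [furriness, straight-line-ness] for an image.
# Label 1 = "cat", label 0 = "chain-link fence".
X = np.array([[0.9, 0.1],   # very furry, few straight lines -> cat
              [0.8, 0.2],
              [0.1, 0.9],   # not furry, lots of straight lines -> fence
              [0.2, 0.8]])
y = np.array([[1.0], [1.0], [0.0], [0.0]])

# The weights are what the network "learns" and "remembers".
W1 = rng.normal(size=(2, 4))   # input (2 features) -> hidden layer (4 units)
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))   # hidden layer -> output (1 probability)
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

learning_rate = 1.0
for step in range(5000):
    # Forward pass: input -> hidden layer -> output.
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Backward pass: nudge the weights to shrink the prediction error.
    error = output - y
    grad_out = error * output * (1 - output)
    grad_hidden = (grad_out @ W2.T) * hidden * (1 - hidden)

    W2 -= learning_rate * hidden.T @ grad_out
    b2 -= learning_rate * grad_out.sum(axis=0, keepdims=True)
    W1 -= learning_rate * X.T @ grad_hidden
    b1 -= learning_rate * grad_hidden.sum(axis=0, keepdims=True)

# A new "image": fairly furry, few straight lines -> should lean toward "cat".
test = np.array([[0.85, 0.15]])
print(sigmoid(sigmoid(test @ W1 + b1) @ W2 + b2))  # close to 1.0 means "cat"
```

Running this prints a number close to 1.0 for the furry test "image," which is the network's learned way of saying "cat." The point of the sketch is that the hidden layer's weights, W1 and W2, are where the remembering happens, which is exactly the kind of math an AI-focused chip is built to accelerate.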