Is Pixel Visual Core a nod to the future of smartphones?

I hope so. But we shall see.

At a super high level, Pixel Visual Core is not a totally new concept. Phones have shipped in the past with separate processors for separate tasks. What is interesting about Pixel Visual Core is that it is a processor dedicated to a perhaps broader application than dedicated processors have been used for in the past. And, one which genuinely makes a phone a desirable purchase for many: image processing.

Why is this important? Well, that is a loaded question. In the end, it may very well not be. As I've said, this isn't truly the first time people have done this. HoloLens has its own HPU, for example, and Microsoft has built AI processing chips. And I'm sure phone vendors have dabbled in the past with all sorts of additional processing units. Most notably, APUs. And hey, if you want to be technical... GPUs are also a separate computer with a dedicated purpose.

It could be important for Google though. Depending on how programmable this SoC is and how loosely or tightly coupled to the OS it is, it could mean that the quality of images produced on the new Pixel phones could continue to improve over time. Without hardware upgrades. Furthermore, when image processing is required, it can offload some work from the main SoC, freeing up more bandwidth for it to do its thing. In other words, the Snapdragon processor on the Pixel phones will appear faster than its counterparts when it is able to offload work to the new core.
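To make the offload idea concrete, here is a minimal sketch of the pattern, with the dedicated core modeled as its own single-worker executor. This is purely illustrative: the function names and the 1.2x brightness "enhancement" are my own stand-ins, not Google's actual API or algorithm.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical sketch: the dedicated core is modeled as a separate
# single-worker executor, so the "main SoC" thread stays free while
# image work happens elsewhere.
visual_core = ThreadPoolExecutor(max_workers=1)

def enhance(pixels):
    # Stand-in for the HDR+/denoise work the dedicated core would do.
    return [min(255, int(p * 1.2)) for p in pixels]

def capture_and_offload(pixels):
    # The main processor hands the frame off and immediately returns
    # to other work; the result is collected later via the future.
    return visual_core.submit(enhance, pixels)

future = capture_and_offload([100, 200, 240])
# The main thread is free to do other work here.
print(future.result())  # [120, 240, 255]
```

The point is the shape of the interaction, not the math: the main processor's only cost is the handoff, and everything expensive happens on hardware built for the job.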

And those are also, incidentally, the kinds of improvements I'm excited about. Back to the GPU note quickly. GPUs solved a specific problem with a dedicated chipset. This allowed systems, in general, to become drastically more powerful in tasks which required that chip. Now, over time, performance from a given piece of silicon will get maxed out and software will cease making tangible differences. But then new iterations will be released. Eventually, if this takes off, I expect it will follow a similar trajectory to GPUs.

I expect Google will be able to make use of this SoC to such an extent that by the end of this device's life it is still able to keep pace with, or even dominate, hardware lacking such a chip. Then it will slowly become common for all vendors to leverage such a chip. Maybe, like GPUs, it will eventually just become a dedicated part of the main SoC. But the fact is, people make a big deal about phone picture quality and speed. If this hardware can deliver the goods... it will in short order be a must-have on all phones.

If it takes off, what will happen next is that the hardware itself will become more specialized and adapted to the task it performs. Just as GPU processors and architectures are drastically different from CPUs, I expect that after a few iterations this will look nothing like the SoC we see today in terms of architecture and processing approach.

My hope is that over time we find more and more problems we can solve with such auxiliary SoCs. It isn't a simple field to grow, however. This is actually a bit of risk-taking on Google's part, mitigated really only by the fact that Google's own camera app will definitely take advantage of it and the fact that people place a high value on the quality of pictures a phone can produce.

If this tech matures, it could take a lot of pressure off of investing heavily in higher quality lenses and camera hardware. Given the onset of phones with polarizing camera bumps I think it is easy to see how being able to accomplish more with less hardware might give designers more to work with in a meaningful way. Perhaps the next generation of phones will see the camera bumps gone for good as a result.

Audio processing is another place I'd love to see something like this applied. I don't think it will happen, though. I think the average person is happy enough with whatever audio quality they get, and I don't believe processing audio at that quality is a major drain on modern CPUs. I'm no audiophile, but I can hear the differences between the same song played on multiple devices. A dedicated processor and API might go a long way toward encouraging the dev community to produce tools for delivering more consistent, higher-quality audio. Wouldn't it be nice to take a piece of garbage audio hardware, throw a dedicated processor and machine learning at it, combine that with some equalizer settings, and get something much higher quality coming out the other end?
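The kind of continuous DSP work such a chip would run is easy to sketch. Below is a toy stand-in, assuming nothing about any real phone's audio pipeline: a one-pole low-pass to tame harsh sample-to-sample noise, plus a flat gain stage standing in for one equalizer band. The function name and coefficients are my own invention.

```python
def smooth_and_boost(samples, alpha=0.5, gain=1.5):
    """Toy stand-in for work a dedicated audio DSP could run continuously:
    a one-pole low-pass filter followed by a flat gain stage (one crude
    'equalizer band'). Purely illustrative."""
    out, prev = [], 0.0
    for s in samples:
        prev = alpha * s + (1 - alpha) * prev  # smooth sample-to-sample jumps
        out.append(prev * gain)
    return out

print(smooth_and_boost([1.0, 1.0, 1.0]))  # [0.75, 1.125, 1.3125]
```

A real audio chip would run chains of far more sophisticated filters than this, per-sample, all day; the appeal is that it could do so without the main CPU ever waking up for it.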

Biometrics are another area: voice processing, fingerprint processing, facial imaging. Today, these are all processed on the main SoC using a general-purpose computing chip. Offloading this work to a lighter-weight, dedicated unit could improve battery life and accuracy. I'd not be surprised to find that this is actually what comes next. If not biometrics in general, then voice processing at the least.

With the push towards digital assistants, phone microphones are spending a lot more time active and processing voice. While I doubt this consumes huge amounts of processing time (and if it did, it could just happen on another core), being active and processing continually is still a battery drain. If this could be offloaded to an even lower-power specialized processing unit, you could improve the accuracy, or the battery life, or likely some balance of the two.
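The always-listening pattern boils down to a gate: a tiny low-power unit does a cheap check on every audio frame and wakes the expensive hardware only when something is worth full analysis. Here's a minimal sketch using per-frame energy as the cheap check; the threshold and function name are hypothetical, not any real chip's firmware.

```python
def always_on_gate(frames, threshold=0.1):
    """Hypothetical always-listening gate: a low-power unit computes
    per-frame energy and flags only the frames that should wake the
    main SoC for full speech recognition."""
    wake_for = []
    for i, frame in enumerate(frames):
        energy = sum(x * x for x in frame) / len(frame)  # cheap per-frame check
        if energy > threshold:
            wake_for.append(i)  # hand this frame up to the big core
    return wake_for

frames = [[0.01] * 4, [0.5] * 4, [0.0] * 4]  # quiet, loud, silent
print(always_on_gate(frames))  # [1]
```

Real wake-word detectors run small neural networks rather than a bare energy threshold, but the economics are the same: the expensive core sleeps through the quiet frames.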

The present reality is that smartphone capabilities are largely constrained by battery. Apple and BlackBerry convinced us to accept a drop from days of battery life to merely hoping the phone lasts from morning until you'd feel OK putting it back on the charger overnight. History has shown we're not really willing to accept much less. And battery tech is barely keeping pace with new advances which add to the drain. As a result, even as batteries increase in capacity, average battery life in devices isn't actually improving at all.

This reality sets the current pace of development and innovation. Which is to say, right now it is pretty stagnant. Each year sees very small incremental improvements. Eventually we will hit a wall. Ideas like Pixel Visual Core are little channels through those walls: technology which, in one way or another, allows us to do more with less and makes our phones more efficient. Without advances in efficiency, we don't have much runway for innovation.

So, I know this sounds like an incredibly gushy post about a piece of tech which isn't even active yet and may not live up to my expectations. But it isn't really about Pixel Visual Core. It is about what I see it representing and possibly enabling down the road. It won't change the world. But it could open the door for some very cool ideas.
