Google introduced the Tensor series of chips with the Pixel 6 series in 2021. It was meant to usher in a range of chips specially tailored by Google to keep up with its class-leading AI-based software features. Just three Tensor chips in, many of Google’s new AI features still need to be offloaded to the cloud for processing.
I think the problem is the conflicting goals that Google has with that chip. They want the chip to be able to run AI stuff locally with the Edge TPU ASIC that it includes, but at the same time Google also wants to make money by having Pixel devices offload AI tasks to the cloud. Google can’t reconcile these two goals.
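The tension is basically an on-device-first, cloud-fallback pattern. Here’s a rough Python sketch of the idea; the function names and the cloud_fn wrapper are made up for illustration, and a real Pixel app would go through the Android/LiteRT APIs rather than the plain TFLite interpreter used here:

```python
import numpy as np
import tensorflow as tf  # assumes a TF build that bundles the TFLite interpreter

def run_local(model_path: str, features: np.ndarray) -> np.ndarray:
    # Run a quantized TFLite model locally (the Edge TPU path, in spirit).
    interpreter = tf.lite.Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    interpreter.set_tensor(inp["index"], features.astype(inp["dtype"]))
    interpreter.invoke()
    return interpreter.get_tensor(out["index"])

def run_with_fallback(model_path, features, cloud_fn):
    # Prefer the on-device path; only hit the cloud when the local model
    # is missing or can't run on the accelerator.
    try:
        return run_local(model_path, features)
    except (ValueError, RuntimeError, OSError):
        return cloud_fn(features)  # hypothetical wrapper around a cloud endpoint
```

The conflict is really about which branch of that try/except Google wants most workloads to land in.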
I don’t think they’re opposing goals. Google does not make more money from a task running in its cloud than on its devices; if anything, that costs them more money.
I think it’s realistic to assume that Google is going to impose quotas on those “free” AI features that are running in the cloud right now and have people pay for more quota. It makes no economic sense for Google to keep offering that compute for free. Remember Google Colab? It started out completely free with V100 and A100 GPUs; now you have to pay just to keep using a simple T4 GPU without interruption.
Bold of you to assume they don’t do that with the locally processed data, too.
Google makes money from ads that they’re going to serve you no matter where they process your data.
Google is going to pull all that metadata from your device regardless of where it was processed.
Servers cost Google money to run. It costs them nothing to run something on your device. They clearly have a vested interest in running it on your device if they can.
There’s a solution: Charge the customer once for the hardware and then add a monthly fee to be able to use all of it. Sony and Microsoft have great success with that.
Do that and an unlock hack will swiftly follow.
that almost no one will use
Lol you wut?
Do you know how expensive conventional AI setups are? An unlocked AI chip on a phone would quickly replace Nvidia cards in the AI scene for small-scale researchers, especially those dealing with sensitive data for whom cloud access is not viable.
My laptop cost $1,500 and is just about viable for this kind of stuff. It took it three days non-stop to create a trading model for ~22 stocks, processing 10 years’ worth of data for each.
Now maybe it doesn’t mean much for the consumer, that’s true. It means a hell of a lot for small-time developers, though, including those developing the apps consumers use.
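For a sense of scale, the shape of that kind of per-stock workload looks something like the sketch below. Everything here is a stand-in (placeholder tickers, synthetic random-walk prices, a simple gradient-boosting model), not the actual setup described above:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

TICKERS = [f"STOCK_{i}" for i in range(22)]   # placeholder tickers
DAYS = 252 * 10                               # roughly 10 years of trading days

def make_features(prices: np.ndarray, window: int = 20):
    # Turn a price series into (lagged log-returns -> next-day return) pairs.
    returns = np.diff(np.log(prices))
    X = np.lib.stride_tricks.sliding_window_view(returns[:-1], window)
    y = returns[window:]
    return X, y

rng = np.random.default_rng(0)
models = {}
for ticker in TICKERS:
    # Synthetic prices stand in for real market data.
    prices = 100 * np.cumprod(1 + rng.normal(0, 0.01, DAYS))
    X, y = make_features(prices)
    models[ticker] = GradientBoostingRegressor(n_estimators=200).fit(X, y)
```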