Artificial intelligence models can be surprisingly stealable, provided you somehow manage to sniff out the model's electromagnetic signature.
Their method entails analyzing electromagnetic radiation while a TPU chip is actively running.
For example, ChatGPT is made of billions of parameters, which is kind of the secret.
When someone steals it, ChatGPT is theirs.
You know, they don't have to pay for it, and they could also sell it.
Theft is already a high-profile concern in the AI world.
This overwhelming pattern is sparking lawsuits and even tools to help artists fight back by poisoning art generators.
They also worked directly with Google to help the company determine the extent to which its chips were attackable.
But this particular technique of extracting an entire model's architecture and hyperparameters is significant.