Tech giants such as Google, Meta, Microsoft, OpenAI and Perplexity are pouring billions of dollars into data centres, with analysts predicting that total spending could swell to nearly $1 trillion by the end of the decade. However, Perplexity CEO Aravind Srinivas believes that the future of artificial intelligence lies in locally run, on-device AI models.
In a podcast with Prakhar Gupta on YouTube, Srinivas said that the “biggest threat to a data center is if the intelligence can be packed locally on a chip that’s running on the device and then there’s no need to inference all of it on like one centralized data center.” The Perplexity CEO added that the benefit of on-device AI models is that they will easily adapt to user preferences, since the AI model will be “living on your computer.”
AI chatbots such as ChatGPT, Gemini, Perplexity and others currently rely on massive servers housed in data centres to process user queries. These facilities not only consume enormous amounts of electricity but also require constant maintenance and depend heavily on water to keep machines cool.
However, Aravind Srinivas argues that such data centres, which are mostly funded by tech giants, could become much less relevant if AI processing shifts to the devices themselves. Running models locally, he says, would not only cut electricity and maintenance costs but also significantly improve user privacy by keeping data on the device.
While today’s AI models are too large and resource-intensive to run efficiently on smartphones or laptops, Srinivas believes advances in chip technology from companies like Apple and Qualcomm could make on-device AI viable in the future. Srinivas also addressed the issue of hallucinations in AI systems, acknowledging that current models still hallucinate “here and there,” but predicting that the problem could be solved completely within the next five years.
© IE Online Media Services Pvt Ltd