“There are very few companies that are not affected by AI,” said Vishal Sharma, Amazon’s vice president of artificial general intelligence, on Monday at Mobile World Congress in Barcelona. Asked whether European companies might change their generative AI strategies in light of geopolitical tensions with the US, he deflected; he also dismissed the idea that open-source models could reduce the need for compute.
Speaking on the conference stage, Sharma said Amazon is deploying AI across its own foundation models, Amazon Web Services (its cloud computing division), its warehouse robotics, Alexa consumer products, and more.
“We currently have something like three-quarters of a million robots, doing everything from running themselves around the warehouse. Alexa is probably the most widely deployed home AI product that exists… There’s no part of Amazon that’s untouched by generative AI.”
In December, AWS announced a new suite of four text-generation models as part of Amazon Nova, a family of multimodal generative AI models.
Sharma said these models are being tested against public benchmarks. “It’s clear that there’s a huge variety of use cases. There’s no one-size-fits-all. There are places where video generation is needed, and there are other places, like Alexa, where you ask it to do certain things and the response should be very quick and very predictable. You can’t hallucinate ‘unlock the back door.’”
However, he said small open-source models are unlikely to reduce the need for compute. “As you start implementing it in different scenarios, you need more and more intelligence,” he said.
Amazon has also launched Bedrock, a service within AWS aimed at companies and startups that want to mix and match various foundation models, including China’s DeepSeek. It lets users switch between models seamlessly, he said.
Amazon is also working with Anthropic, in which it has invested $8 billion, to build a huge AI compute cluster on its Trainium 2 chips. Meanwhile, Elon Musk’s xAI recently released its latest flagship AI model, Grok 3, trained on around 200,000 GPUs at a massive data center in Memphis.
Asked about this level of computing resources, Sharma said: “My personal opinion is that compute will be part of the conversation for a very long time.”

He didn’t think Amazon was under pressure from the recent blizzard of open-source models emerging from China. “I wouldn’t describe it that way,” he said. On the contrary, he suggested Amazon is comfortable deploying DeepSeek and other models on AWS. “We are a company that believes in choice… From a customer’s perspective, we embrace whatever trends and technologies are best,” Sharma said.
Did he think Amazon was caught napping when OpenAI introduced ChatGPT in late 2022?
“No, I don’t think I agree with that idea,” he said. “Amazon has been working on AI for about 25 years. If you look at something like Alexa, there are 20 different AI models running behind it. There were already billions of parameters there for language. We’ve been at this for quite some time.”
Given the recent controversy between Trump and Zelensky and the subsequent tensions in US relations with many European countries, did he think European companies might look elsewhere for their future generative AI resources?
Sharma acknowledged that the issue was “outside” his “zone of expertise” and that the outcome was “very difficult for me to predict,” but he hinted diplomatically that some companies might adjust their strategies. “What I would say is that technological innovation responds to incentives,” he said.