AWS CEO speaks on 2025 AI Priorities, NVIDIA & Anthropic Partnerships, Trainium 2, and US Governance

AWS CEO Matt Garman speaks on the various moving parts behind a successful 2025 for AI deployments


Published: February 20, 2025

Rory Greener

As the AI landscape evolves, new considerations emerge alongside it. AWS sits at the centre of many of those conversations, with the cloud provider touching various aspects of international companies’ daily operations through its deep ecosystem of products and customers. With AI entering the picture, AWS is keen to stay on top of emerging technology and build transformative tools for enterprise clients as it continues to develop its broader enterprise solutions ecosystem.

Speaking to Time on AWS’s goals for AI deployment in enterprise sectors, AWS CEO Matt Garman noted that the firm’s “first priority always is to maintain outstanding security and operational excellence.”

“We want to help customers get ready for that AI transformation that’s going to happen,” Garman remarked. To reach this goal, Garman highlighted various elements that are coming together to make AI transformation even more tangible in the future.

Part of that journey involves AWS developing its AI ecosystem throughout 2025 to help customers “get all of their applications in a place that they can take advantage of AI.”

“It’s a hugely important priority for us to help customers continue on that migration to the cloud because if their data is stuck on-premise and legacy data stores and other things, they won’t be able to take advantage of AI,” Garman explained.

This can include helping AWS clients modernize their data and analytics stacks in the cloud, as well as organizing that cloud data in a manner that allows AWS customers “to take advantage of AI”; “that is a big priority for us,” Garman added.

Garman added that other factors in a successful year of enterprise AI at AWS include helping clients “scale the AI capabilities, bring the cost down for customers, while [we] keep adding the value.”

Garman also explained:

For 2025, our goal is for customers to move AI workloads really into production that deliver great ROI for their businesses. And that crosses making sure all their data is in the right place, and make sure they have the right compute platforms. We think Trainium is going to be an important part of that.

Leveraging a History of AI Expertise

According to Garman, AWS is aiming to achieve its 2025 AI goals in part by “empowering our broad partner ecosystem to go fast and help customers evolve.”

The AWS CEO explained that the firm has “a long history of doing AI inside of AWS,” providing a strong footing going forward. “In fact, most of the most popular AI services that folks use, like SageMaker, for the last decade have all been built on AWS,” Garman noted.

Speaking further on how AWS is building on its existing client base and AI expertise to produce transformative AI solutions for enterprise customers, Garman added:

Because of who our customer base is, our strategy was always to build a robust, secure, performance featureful platform that people could really integrate into their actual businesses. And so we didn’t rush really quickly to throw a chatbot up on our website. We really wanted to help people build a platform that could deeply integrate into their data, that would protect their data. That’s their IP, and it’s super important for them, so [we] had security front of mind.

Garman also stated that, over the last year, AWS noticed enterprise AI adopters generally moving beyond the proof-of-concept stage, allowing them to push AI solutions towards production and deployment.

Garman explained how the broader AWS ecosystem then fits into this goal:

They realized that the platform is what they needed. They had to be able to leverage their data. They wanted to customize models. They wanted to use a bunch of different models. They wanted to have guardrails. They needed to integrate with their own enterprise data sources, a lot of which lived on AWS, and so their applications were [on] AWS.
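The interview does not name the specific AWS services behind these capabilities, but the short sketch below illustrates the kind of integration Garman describes: calling an Anthropic model on AWS with a customer-defined guardrail attached, here via Amazon Bedrock’s Converse API. The service choice, model ID, guardrail identifier, and region are assumptions for illustration only, not details from the interview.

# Minimal sketch (assumed services and IDs): invoking an Anthropic model with a
# guardrail through Amazon Bedrock's Converse API using boto3.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")  # assumed region

response = bedrock.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # example Bedrock model ID
    messages=[
        {
            "role": "user",
            "content": [{"text": "Summarise last quarter's customer support tickets."}],
        }
    ],
    # A guardrail the customer has defined in Bedrock screens both the prompt
    # and the model's response against the customer's own policies.
    guardrailConfig={
        "guardrailIdentifier": "example-guardrail-id",  # hypothetical guardrail
        "guardrailVersion": "1",
    },
)

print(response["output"]["message"]["content"][0]["text"])

Because the Converse API uses the same call shape across the foundation models Bedrock exposes, a customer can swap or mix models without rewriting their application code, which is one way “a bunch of different models” plus guardrails can sit on top of existing AWS data.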

On Anthropic, NVIDIA, and Trainium 2

Late last year, AWS rolled out its Trainium 2 AI chips, which aim to optimize AI training and inference while reducing the costs of both.

Alongside the Trainium pathway, AWS is also a core partner of Anthropic, a firm that, according to Garman, has “one of the strongest AI teams in the world.” Between the two, AWS has access to both an innovative chipset and a leading model provider to help drive its AI ecosystem forward.

Garman explained:

They have the leading model in the world right now. I think most people consider Sonnet to be the top model for reasoning and for coding and for a lot of other things as well. We get a lot of great feedback from customers on them. So we love that partnership, and we learn a lot from them too, as they build their models on top of Trainium, so there’s a nice flywheel benefit where we get to learn from them, building on top of us.

NVIDIA is another AWS AI partner that Garman champions as a key player for his firm’s customers, calling NVIDIA “an incredibly important partner of ours.”

“Today, the vast majority of AI workloads run on Nvidia technology, and we expect that to continue for a very long time,” Garman remarked.

Speaking further on how these partnerships help to boost AWS’s AI ecosystem, Garman said:

Both for AI companies who are looking to train these massive clusters, [for example] Anthropic is going to be training their next generation, industry-leading model on Trainium 2.

US Governance on AI

The early weeks of 2025 have already proved monumental in terms of AI governance.

Whether it’s the Paris AI Summit, the UK’s AI Playbook, or the billions pledged to Project Stargate under Trump, governments and industry alike are grappling with the red tape affecting the sector.

Speaking on the recent moves from the US, Garman noted that he is “optimistic that President Trump and his administration can help us loosen some of the restrictions on helping build data centers faster.”

Garman added that the US government can help firms like AWS “cut through some of that bureaucratic red tape and move faster.”

Garman noted this is of particular importance as firms like DeepSeek disrupt the industry and challenge US AI leaders. “I think that’ll be important, particularly as we want to maintain the AI lead for the U.S. ahead of China and others,” Garman concluded.
