Amazon CEO Says Really Good AI Models Take Billions of Dollars to Train

Amazon is introducing Bedrock, a cloud service that developers can use to add text-generating AI to their software, similar to OpenAI's ChatGPT chatbot.

The announcement signals that the largest cloud infrastructure provider does not intend to cede a fast-growing area of AI to competitors such as Google and Microsoft.

Through its new Bedrock generative AI service, Amazon Web Services (AWS) will offer access to its own first-party Titan language models, language models from Google-backed Anthropic and the startup AI21, and a text-to-image model from Stability AI. One Titan model can generate text for emails, blog posts, and other documents; the other is designed to help with search and personalization.
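Bedrock is reached through the standard AWS SDKs rather than a separate product console. As a rough sketch of what that might look like for a developer, the Python example below calls a Titan text-generation model via the boto3 library; the region, model identifier, and request fields are assumptions based on AWS's published Titan text format and may differ from what a given Bedrock account exposes.

```python
import json
import boto3

# Bedrock is invoked through a dedicated runtime client in the AWS SDK.
# The region and model ID below are illustrative assumptions.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Request body for a Titan text-generation model (assumed schema).
body = {
    "inputText": "Draft a short welcome email for new customers of an online bookstore.",
    "textGenerationConfig": {
        "maxTokenCount": 256,
        "temperature": 0.7,
    },
}

response = client.invoke_model(
    modelId="amazon.titan-text-express-v1",  # assumed Titan model identifier
    contentType="application/json",
    accept="application/json",
    body=json.dumps(body),
)

# Parse the JSON payload returned in the response body.
result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```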

Amazon CEO Andy Jassy said, “Most companies want to use these large language models but the really good ones take billions of dollars to train and many years, and most companies don’t want to go through that. So what they want to do is they want to work off of a foundational model that’s big and great already and then have the ability to customize it for their own purposes. And that’s what Bedrock is.”

Amazon's Bedrock initiative comes about a month after OpenAI released GPT-4. AWS's toughest competition comes from Microsoft, which has invested heavily in OpenAI and supplies it with computing power through its Azure cloud platform.

People using ChatGPT and Microsoft's Bing chatbot, both based on OpenAI language models, have at times encountered inaccurate information owing to a behavior called hallucination, where the output can appear convincing but actually has nothing to do with the training data. Bratin Saha, an AWS vice president, said Amazon is very concerned about accuracy and about ensuring that its Titan models produce high-quality responses.