AWS has announced plans to expand its generative AI offerings, making these services more widely available to organizations in the cloud.
Andy Jassy, the CEO of Amazon, has acknowledged the widespread interest in generative AI applications such as ChatGPT, and expressed his belief that most customer experiences and applications will be transformed by generative AI.
“We have been working on our own LLMs for a while now, believe it will transform and improve virtually every customer experience, and will continue to invest substantially in these models across all of our consumer, seller, brand, and creator experiences. Additionally, as we’ve done for years in AWS, we’re democratizing this technology so companies of all sizes can leverage Generative AI ... Let’s just say that LLMs and Generative AI are going to be a big deal for customers, our shareholders, and Amazon,” said Jassy.
AWS has announced the launch of a new cloud service - Amazon Bedrock - which allows developers to integrate generative AI capabilities into their software. The service, comparable in functionality to OpenAI's ChatGPT, is part of a new set of generative AI development tools now available on Amazon Web Services (AWS).
Amazon Bedrock is designed to simplify access to foundation models (FMs) by providing API access to FMs from companies such as AI21 Labs, Anthropic, and Stability AI, allowing developers to easily incorporate high-performing FMs into their software. Amazon will also soon provide access to two new Titan FMs - a generative large language model (LLM) for tasks like text generation and summarization, and an embeddings LLM that converts text inputs into numerical representations.
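To illustrate why an embeddings model's numerical representations are useful, here is a minimal Python sketch: texts with similar meaning map to nearby vectors, so cosine similarity can rank relatedness. The vectors below are made-up toy values standing in for model output, not actual Titan embeddings, which would have hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy stand-ins for embeddings of three texts; a real embeddings
# FM would return these vectors from the text inputs.
emb_cat = [0.9, 0.1, 0.3]      # "a small cat"
emb_kitten = [0.85, 0.15, 0.35]  # "a young kitten"
emb_invoice = [0.1, 0.9, 0.2]    # "Q3 invoice totals"

# Semantically related texts score higher than unrelated ones.
print(cosine_similarity(emb_cat, emb_kitten) >
      cosine_similarity(emb_cat, emb_invoice))  # True
```

This kind of comparison is the building block for semantic search and retrieval features that embeddings models are typically used for.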
“Bedrock is the easiest way for customers to build and scale generative AI-based applications using FMs, democratizing access for all builders. Bedrock will offer the ability to access a range of powerful FMs for text and images — including Amazon’s Titan FMs, which consist of two new LLMs we’re also announcing today — through a scalable, reliable, and secure AWS managed service,” said Swami Sivasubramanian, VP of Data and Machine Learning, AWS.
According to Sivasubramanian, Amazon Bedrock will give customers greater flexibility to find and customize FMs that meet their specific requirements using their own data, and then easily integrate those FMs into their applications using familiar AWS tools and integrations.
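As a rough sketch of what calling a hosted FM through an API could look like, the snippet below builds a JSON request for a text-generation model and shows a hypothetical invocation via boto3, the AWS SDK for Python. The client name, model identifier, and payload fields are assumptions for illustration only; the preview API may differ.

```python
import json

def build_text_request(prompt, max_tokens=256, temperature=0.7):
    # Shape of a hypothetical text-generation request body;
    # each FM provider defines its own payload schema.
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {
            "maxTokenCount": max_tokens,
            "temperature": temperature,
        },
    })

def invoke_fm(prompt, model_id="amazon.titan-text"):  # model_id is a placeholder
    # Requires AWS credentials and Bedrock access; not runnable in the preview
    # without an allowlisted account.
    import boto3
    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(
        modelId=model_id,
        contentType="application/json",
        accept="application/json",
        body=build_text_request(prompt),
    )
    return json.loads(response["body"].read())

# Building the request payload works without AWS access:
body = build_text_request("Summarize our Q3 report in three sentences.")
```

The key point is the managed-service model: developers send a prompt payload to an API endpoint rather than hosting and serving the FM themselves.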
Bedrock is currently available in a limited preview.
Machine Learning and LLM Investments
Besides Amazon Bedrock, Amazon has also announced new infrastructure for machine learning: Amazon EC2 Trn1n instances powered by AWS Trainium, which offer 1,600 Gbps of network bandwidth, and Amazon EC2 Inf2 instances powered by AWS Inferentia2, which are optimized for large-scale generative AI workloads. Both instance types are now generally available.
Moreover, in a letter to investors, Amazon CEO Andy Jassy stated that the company plans to invest in large language models (LLMs) and generative AI. He acknowledged that machine learning has been important for several decades, but that companies have only begun to adopt it broadly in the past five to ten years, driven primarily by the recent availability of greater computing capacity at lower prices.
Elsewhere, Zendesk and AWS have recently entered into a five-year strategic collaboration agreement to assist companies in providing customized customer experiences on a large scale.