Amazon's Bedrock generative AI service is now generally available
Amazon today announced the general availability of Bedrock, a service that offers a choice of generative AI models from Amazon itself and third-party partners through an API.
Introduced in early April, Bedrock lets AWS customers build apps on top of generative AI models and customize those models with their own proprietary data. Brands and developers can also use Bedrock to create AI "agents" that autonomously perform tasks such as booking travel, managing inventory and processing insurance claims.
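To make the API concrete, here is a minimal, illustrative sketch of what a request to a Bedrock-hosted model can look like from Python. The prompt format shown follows the convention used by Anthropic's Claude models; other Bedrock models expect different request bodies, and the model ID in the comment is an assumption, so treat this as a sketch rather than a reference implementation.

```python
import json

def build_claude_request(prompt: str, max_tokens: int = 300) -> str:
    """Build a JSON request body in the style expected by an
    Anthropic Claude model hosted on Bedrock (an assumed format --
    check the model's own documentation)."""
    return json.dumps({
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": max_tokens,
    })

# With AWS credentials configured, the actual call via the AWS SDK
# for Python (boto3) would look roughly like this:
#
#   import boto3
#   client = boto3.client("bedrock-runtime")
#   response = client.invoke_model(
#       modelId="anthropic.claude-v2",  # hypothetical model ID
#       body=build_claude_request("Summarize our Q3 inventory report."),
#   )
#   print(json.loads(response["body"].read())["completion"])
```

The request-building step is separated out here because each model family on Bedrock defines its own body schema; only the surrounding `invoke_model` plumbing stays the same.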
Amazon also said it plans to add Llama 2, Meta's open source large language model, to Bedrock in the coming weeks, joining the existing lineup of models from AI21 Labs, Anthropic, Cohere and Stability AI.
Amazon claims Bedrock will be the first "fully managed generative AI service" to offer Llama 2, specifically in both its 13-billion- and 70-billion-parameter versions. Parameters, learned from historical training data, essentially define a model's skill on a task, such as generating text. That said, Llama 2 has been available on other cloud generative AI platforms, including Google's Vertex AI, for quite a while now.
Bedrock has a lot in common with Vertex AI, which likewise offers a collection of fine-tunable first- and third-party models for customers building generative AI apps. But Swami Sivasubramanian, VP of data and AI at AWS, argues that Bedrock has an edge in its integrations with existing AWS services, such as AWS PrivateLink, which lets customers establish private connectivity between Bedrock and their virtual private cloud.
In fairness to Google, one could argue that's more a perceived advantage than a real one, depending on the customer and their cloud setup. Not that you'll catch Sivasubramanian admitting as much, of course.
"In the past year, the exponential growth of data availability, the accessibility of scalable computing resources, and the continuous evolution of machine learning techniques have ignited a significant surge in enthusiasm for generative AI. This surge has generated novel concepts with the potential to reshape entire industries and redefine conventional work practices," stated Sivasubramanian in a press release. "Today's announcement marks a noteworthy achievement, making generative AI accessible to businesses of all sizes, ranging from startups to large enterprises, and to every professional, encompassing developers and data analysts alike."
In related news, Amazon today launched Titan Embeddings, a first-party model that converts text into numerical representations known as embeddings, for use in search and personalization applications. Titan Embeddings supports around 25 languages and chunks of text, or whole documents, up to 8,192 tokens (roughly 6,000 words) long, putting it on par with OpenAI's latest embeddings model.
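The search use case mentioned above typically works by comparing embedding vectors: documents whose embeddings point in nearly the same direction as the query's embedding are treated as the most relevant. The toy sketch below uses made-up three-dimensional vectors to show the mechanics; a real system would obtain much higher-dimensional embeddings from a model such as Titan Embeddings.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors: 1.0 means the
    vectors point in exactly the same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Documents and a query, already "embedded" as short fake vectors
# (real embeddings would come from the model, not be hand-written).
doc_embeddings = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.2, 0.8, 0.1],
}
query_embedding = [0.85, 0.15, 0.05]

# Rank documents by similarity to the query; the highest score wins.
best = max(
    doc_embeddings,
    key=lambda d: cosine_similarity(query_embedding, doc_embeddings[d]),
)
print(best)  # → refund policy
```

Because similarity is computed on vectors rather than raw strings, the same mechanism also powers personalization: user profiles and items can be embedded into the same space and matched the same way.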
Bedrock got off to a rocky start. In May, Bloomberg reported that, six weeks after Amazon's somewhat cryptic press presentation, which featured only a single testimonial, many cloud customers still couldn't access the technology. But with today's announcements, and its recent major investment in AI startup Anthropic, Amazon is clearly signaling its intent to compete in the growing, lucrative generative AI market.


