Amazon is competing with Microsoft and Google in generative AI using its custom-designed AWS chips, Inferentia and Trainium. Despite a later entry, Amazon leverages its dominant cloud position and existing customer base. While Nvidia GPUs currently dominate model training, Amazon's tools such as SageMaker and CodeWhisperer support generative AI work. With CEO Andy Jassy overseeing large language model development, Amazon aims to strengthen its generative AI capabilities. Amid security concerns, its Bedrock service promises isolated environments. Amazon's strategy focuses on enabling value creation through generative AI within its ecosystem.
Amazon is engaged in a competitive race with Microsoft and Google in generative AI, using custom-designed AWS chips to gain an edge. Inside an unmarked office building in Austin, Texas, a small team of Amazon employees is developing two families of microchips: Trainium, for training generative AI models, and Inferentia, for accelerating inference on them. These proprietary chips give AWS customers an alternative to Nvidia GPUs, which have become progressively difficult and costly to acquire.
Amazon Web Services CEO Adam Selipsky told CNBC in a June interview that Amazon is uniquely positioned to meet the growing demand for chips suited to generative AI. Despite Amazon's strong position, other industry players have moved more swiftly and invested far more to seize opportunities in the booming generative AI sector. When OpenAI introduced ChatGPT in November, Microsoft drew attention by hosting the viral chatbot and investing a reported $13 billion in OpenAI, then quickly incorporated generative AI models into its products, integrating them into Bing by February. Google, for its part, launched its Bard chatbot and made a substantial $300 million investment in OpenAI competitor Anthropic.
However, Amazon entered the generative AI landscape later, introducing its suite of large language models named Titan and launching a service called Bedrock to aid developers in enhancing software through generative AI. Despite Amazon’s reputation for pioneering new markets rather than chasing them, some analysts suggest that Amazon finds itself in an unfamiliar position of playing catch-up.
While Amazon’s custom silicon, particularly Inferentia and Trainium, could eventually give it an edge in generative AI, the chips face formidable competition from established solutions. Amazon began building custom silicon in 2013 with its Nitro hardware, which has since become the highest-volume AWS chip, with more than 20 million units in active use.
Although Amazon’s Graviton, an Arm-based server chip introduced in 2018, has gained traction, Nvidia’s GPUs still reign supreme for training models. In recognition of Nvidia’s dominance, AWS launched new AI acceleration hardware powered by Nvidia H100s in July. Nevertheless, Amazon’s extensive cloud dominance and existing customer base could set it apart in the generative AI landscape, as many AWS customers may opt for Amazon due to familiarity with its ecosystem.
With AWS the leading cloud computing provider, boasting a 40% market share in 2022, Amazon can leverage that cloud dominance to drive generative AI adoption. Despite a recent trend of declining operating income, AWS remains a substantial contributor to Amazon’s overall profits. Amazon also offers a growing assortment of generative AI developer tools, including SageMaker, which provides algorithms and models, and CodeWhisperer, an AI coding companion. AWS has also unveiled AWS HealthScribe, an AI-powered service for drafting patient visit summaries, and a $100 million generative AI innovation center.
While Amazon is concentrating primarily on tools rather than a direct competitor to ChatGPT, leaked internal communications have revealed that CEO Andy Jassy is overseeing a central team dedicated to developing ambitious large language models. On the second-quarter earnings call, Jassy highlighted the significant role AI plays in driving AWS business, with more than 20 machine learning services in operation.
The rapid growth of AI has raised concerns over data security, with companies wary of confidential information being incorporated into publicly used large language models. In response, Amazon assures clients that its Bedrock service offers isolation in a virtual private cloud environment, with strong encryption and access controls.
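To make the isolation point concrete: Bedrock is called through the AWS SDK against an endpoint that can be confined to a customer's virtual private cloud, so prompts and responses travel only over encrypted, access-controlled channels. The sketch below is a hedged illustration, not Amazon's documented workflow: it only assembles an `InvokeModel`-style request body for a Titan text model using field names from AWS's published examples (`inputText`, `textGenerationConfig`), without making any network call.

```python
import json

def build_titan_request(prompt: str, max_tokens: int = 256) -> dict:
    """Assemble a request body for a Titan text model.

    The field names here follow AWS's published Bedrock examples;
    treat them as illustrative assumptions, not an authoritative API
    reference.
    """
    return {
        "inputText": prompt,
        "textGenerationConfig": {
            "maxTokenCount": max_tokens,
            "temperature": 0.2,
        },
    }

# In a real deployment, this JSON would be passed to the Bedrock
# runtime client (e.g. boto3's invoke_model) over a VPC endpoint,
# so the prompt never traverses the public internet.
payload = build_titan_request("Summarize our Q2 cloud spend.")
print(json.dumps(payload, indent=2))
```

Because the payload is assembled client-side and sent to a single tenant-isolated endpoint, confidential prompt data is never folded into a shared, publicly trained model, which is the assurance Amazon is making with Bedrock.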
Despite Amazon’s late entry into the generative AI field, its cloud leadership and expansive customer base position it well for potential expansion in the sector. The company’s strategy centers on capitalizing on its existing ecosystem, enabling AWS customers to explore value creation through generative AI applications.