Microsoft has a strategic partnership with OpenAI that gives it access to OpenAI's latest AI models, while OpenAI uses Azure as its preferred cloud platform. Now, Azure's biggest competitor, Amazon Web Services (AWS), has announced a similar long-term strategic partnership with Hugging Face. Through this partnership, AWS and Hugging Face aim to accelerate the availability of next-generation machine learning models by making them more accessible to the developer community with the highest performance at the lowest cost.
While Microsoft, Google, OpenAI, and others have their own machine learning models to process and generate text, audio, and images, Amazon lags behind on that front. To address this, Hugging Face will use AWS as a preferred cloud provider, so developers in Hugging Face's community can access AWS's tools (e.g., Amazon SageMaker, AWS Trainium, AWS Inferentia) to train, fine-tune, and deploy models on AWS. Hugging Face will also use Amazon SageMaker to build and train next-generation AI models based on its latest research.
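As a rough sketch of what that training workflow can look like with the SageMaker Python SDK (the script name, S3 path, instance type, and version pins below are illustrative assumptions, not details from the announcement):

```python
import sagemaker
from sagemaker.huggingface import HuggingFace

# Assumes an existing SageMaker execution role and a fine-tuning script
# (train.py) that uses the Transformers library; names are illustrative.
role = sagemaker.get_execution_role()

huggingface_estimator = HuggingFace(
    entry_point="train.py",          # hypothetical fine-tuning script
    source_dir="./scripts",          # hypothetical local directory
    instance_type="ml.p3.2xlarge",   # GPU instance; the choice is illustrative
    instance_count=1,
    role=role,
    transformers_version="4.26",     # version pins are assumptions; use a
    pytorch_version="1.13",          # combination supported by SageMaker
    py_version="py39",
    hyperparameters={"epochs": 1, "model_name_or_path": "distilbert-base-uncased"},
)

# Launch the managed training job on SageMaker.
huggingface_estimator.fit({"train": "s3://my-bucket/train"})  # hypothetical S3 path
```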
Through this partnership, Hugging Face customers can now fine-tune and deploy machine learning models available on Hugging Face in just a few clicks on Amazon SageMaker and Amazon EC2, taking advantage of purpose-built machine learning accelerators, including AWS Trainium and AWS Inferentia.
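For deployment, a minimal sketch using the SageMaker Python SDK might look like the following; the model ID, instance type, and version pins are assumptions for illustration rather than details from the announcement:

```python
import sagemaker
from sagemaker.huggingface import HuggingFaceModel

role = sagemaker.get_execution_role()

# Point the SageMaker Hugging Face inference container at a model on the Hub.
hub = {
    "HF_MODEL_ID": "distilbert-base-uncased-finetuned-sst-2-english",  # illustrative model
    "HF_TASK": "text-classification",
}

huggingface_model = HuggingFaceModel(
    env=hub,
    role=role,
    transformers_version="4.26",   # version pins are assumptions
    pytorch_version="1.13",
    py_version="py39",
)

# Create a real-time endpoint; Inferentia-backed instances are another option.
predictor = huggingface_model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.xlarge",  # CPU instance chosen for illustration
)

print(predictor.predict({"inputs": "AWS and Hugging Face announce a partnership."}))

# Delete the endpoint when finished to avoid ongoing charges.
predictor.delete_endpoint()
```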
“Generative AI has the potential to transform entire industries, but its cost and the required expertise puts the technology out of reach for all but a select few companies,” said Adam Selipsky, CEO of AWS. “Hugging Face and AWS are making it easier for customers to access popular machine learning models to create their own generative AI applications with the highest performance and lowest costs. This partnership demonstrates how generative AI companies and AWS can work together to put this innovative technology into the hands of more customers.”