AWS doubles down on generative AI training

Amazon Web Services Inc. is extending its reach further into the domain of artificial intelligence software development with the release today of several new tools for generative AI training and deployment on its cloud platform.

In a post on the AWS Machine Learning blog, the company detailed new offerings that include the ability to build and train foundation models, which are large-scale, pre-trained language models that create a foundation for targeted natural language processing tasks. Foundation models are typically trained on massive amounts of text data using deep learning techniques, which allows them to learn to understand the nuances of human language to the point that they can generate text that is almost indistinguishable from that written by humans.

The use of pre-trained foundation models can save developers significant amounts of time and resources that would otherwise be required to train a language model from scratch. OpenAI LLC’s Generative Pre-trained Transformer or GPT is an example of a foundation model that can be used for text generation, sentiment analysis and language translation.

LLM choices

Bedrock is a new service that makes foundation models from a variety of providers available via an application programming interface. They include the Jurassic-2 multilingual large language models from AI21 Labs Ltd., which generate text in Spanish, French, German, Portuguese, Italian and Dutch, and Anthropic PBC's Claude LLM, which performs a variety of conversational and text processing tasks based on responsible AI system training principles. Models from Stability AI Ltd. and Amazon's own LLMs can also be accessed through the API.
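For developers, calling one of these models could look something like the sketch below, which assumes boto3's "bedrock-runtime" client and an Anthropic Claude model identifier; the exact client name, model IDs and request fields are assumptions and may differ from what AWS ultimately ships.

# Hypothetical sketch of invoking a foundation model through Bedrock's API.
# Assumes the boto3 "bedrock-runtime" client and a Claude model identifier;
# actual names and request formats may differ.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

body = json.dumps({
    "prompt": "\n\nHuman: Summarize the benefits of foundation models in two sentences.\n\nAssistant:",
    "max_tokens_to_sample": 200,
})

response = bedrock.invoke_model(
    modelId="anthropic.claude-v2",   # assumed model identifier
    contentType="application/json",
    accept="application/json",
    body=body,
)

print(json.loads(response["body"].read())["completion"])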

Foundation models are pre-trained at internet scale, so they can be customized with relatively little additional training, wrote Swami Sivasubramanian, vice president of database, analytics and machine learning at AWS. He gave the example of a content marketing manager for a fashion retailer who can provide Bedrock with as few as 20 examples of well-performing taglines “from past campaigns, along with the associated product descriptions, and Bedrock will automatically start generating effective social media, display ad and web copy for the new handbags.”
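A rough sketch of how those few-shot examples might be packaged into a single prompt is shown below; the product descriptions and taglines here are invented placeholders rather than AWS sample data.

# Sketch of assembling a few-shot prompt from past tagline examples.
# The example data below is invented purely for illustration.
def build_tagline_prompt(examples, new_product_description):
    """Combine past (description, tagline) pairs with a new product description."""
    lines = ["Write a short, catchy ad tagline for each product."]
    for description, tagline in examples:
        lines.append(f"Product: {description}\nTagline: {tagline}")
    lines.append(f"Product: {new_product_description}\nTagline:")
    return "\n\n".join(lines)

past_campaigns = [
    ("Quilted leather crossbody bag with gold chain", "Small bag. Big entrance."),
    ("Canvas weekender tote with leather trim", "Pack light. Travel loud."),
]

prompt = build_tagline_prompt(past_campaigns, "Structured mini handbag in pebbled leather")
# The resulting prompt string would then be sent to a Bedrock model as in the earlier sketch.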

In conjunction with the Bedrock announcement, AWS is also rolling out two new large language models under the Titan banner. The first is a generative LLM for summarization, text generation, classification, open-ended question answering and information extraction. The second is an embeddings LLM that translates text inputs into numerical representations that capture the semantic meaning of the text, which is useful for producing contextual responses that go beyond word matching.
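One way to picture the embeddings model is the sketch below, which compares two sentences by the cosine similarity of their vectors; the "amazon.titan-embed-text-v1" model ID and the request and response field names are assumptions.

# Sketch of semantic comparison with a text-embeddings model.
# The Titan model ID and JSON field names below are assumptions.
import json
import math
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def embed(text):
    """Return an embedding vector for the given text."""
    response = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v1",  # assumed model identifier
        contentType="application/json",
        accept="application/json",
        body=json.dumps({"inputText": text}),
    )
    return json.loads(response["body"].read())["embedding"]

def cosine_similarity(a, b):
    """Higher values mean the two texts are closer in meaning, not just in wording."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

print(cosine_similarity(embed("How do I return a handbag?"), embed("What is your refund policy?")))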

Hardware boost

AWS is also beefing up the hardware it uses to deliver training and inference on its cloud. New, network-optimized EC2 Trn1n instances, powered by the company’s proprietary Trainium processors, now provide 1,600 gigabits per second of network bandwidth, or roughly a 20% performance boost. The company’s Inf2 instances, which use its Inferentia2 processors for inference on large-scale generative AI applications with models containing hundreds of billions of parameters, are also now generally available.
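For teams that manage the instances directly, requesting one of the new instance types is an ordinary EC2 launch, as in the sketch below; the AMI ID is a placeholder, and the instance type names should be checked against regional availability.

# Sketch of launching one of the new accelerator-backed EC2 instances with boto3.
# The AMI ID is a placeholder; verify instance type availability in your region.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-xxxxxxxxxxxxxxxxx",   # placeholder: a Deep Learning AMI ID for your region
    InstanceType="trn1n.32xlarge",     # or "inf2.48xlarge" for Inferentia2-based inference
    MinCount=1,
    MaxCount=1,
)

print(response["Instances"][0]["InstanceId"])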

Another availability announcement is CodeWhisperer, an AI coding companion that uses a foundation model to generate code suggestions in real time based on natural language comments and prior code in an integrated development environment. The tool works with Python, Java, JavaScript, TypeScript, C# and 10 other languages and can be accessed from a variety of IDEs.

“Developers can simply tell CodeWhisperer to do a task, such as ‘parse a CSV string of songs’ and ask it to return a structured list based on values such as artist, title and highest chart rank,” Sivasubramanian wrote. CodeWhisperer then generates “an entire function that parses the string and returns the list as specified.” He said developers who used the preview version completed tasks 57% faster and had a 27% higher success rate than those working without the tool.
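The kind of function such a prompt might yield could look like the sketch below; the column names and sorting behavior are assumptions for illustration, not actual CodeWhisperer output.

# Sketch of the kind of function CodeWhisperer might generate from the comment
# "parse a CSV string of songs"; field names here are assumptions.
import csv
import io

def parse_songs(csv_string):
    """Parse a CSV string of songs into a list of dicts keyed by artist, title and chart rank."""
    reader = csv.DictReader(io.StringIO(csv_string))
    songs = [
        {
            "artist": row["artist"],
            "title": row["title"],
            "highest_chart_rank": int(row["highest_chart_rank"]),
        }
        for row in reader
    ]
    # Return the list ordered by highest chart position.
    return sorted(songs, key=lambda song: song["highest_chart_rank"])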

Image: Unsplash
