IBM to Add Llama 2 to Its AI and Data Platform Watsonx

IBM has announced plans to host Meta's Llama 2-chat 70-billion-parameter model in the watsonx.ai studio, part of watsonx, IBM's enterprise-ready AI and data platform.

This move follows IBM's collaboration with Meta on open innovation for AI, which includes work with open source projects developed by Meta, such as the PyTorch machine learning framework and the Presto query engine used in watsonx.data.

This aligns with IBM's approach of providing a variety of AI models, including models from third-party sources as well as its own. Within watsonx.ai, AI developers can use pre-trained models from both IBM and the Hugging Face community. These models cover a range of NLP tasks such as question answering, content generation, text summarization, and classification and extraction of information.
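
For readers unfamiliar with these task types, the snippet below is a minimal sketch of one of them (summarization) using an open-source pre-trained model from the Hugging Face hub via the transformers library. The specific checkpoint and the choice to run it locally are assumptions made for this illustration and are not details from IBM's announcement.

```python
# Minimal illustration of one of the listed NLP tasks (summarization)
# using an open-source pre-trained model from the Hugging Face hub.
# The checkpoint ("facebook/bart-large-cnn") and local execution are
# assumptions for this example, not part of IBM's announcement.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

article = (
    "IBM has announced plans to host Meta's Llama 2-chat 70-billion-parameter "
    "model in the watsonx.ai studio, part of watsonx, IBM's enterprise-ready "
    "AI and data platform."
)

# max_length / min_length bound the length of the generated summary.
summary = summarizer(article, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```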

Currently, early access to Llama 2 in watsonx.ai is limited to a select group of clients and partners. IBM also plans to release its AI Tuning Studio and additional AI models in watsonx.ai.

The company is committed to keeping trust and security principles at the forefront with all its upcoming generative AI capabilities. For example, when users run the Llama 2 model through the prompt lab in watsonx.ai, they can toggle on the AI guardrails function to help automatically remove harmful language from the input prompt text as well as the output generated by the model.
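
The article describes this guardrails toggle only at the level of the prompt lab UI, not as a programmatic interface. Purely as an illustration of the general shape such a call might take, the sketch below posts a prompt to a hosted text-generation endpoint with a moderation flag in the payload. The endpoint URL, the payload fields (model_id, moderations, hap), and the response format are hypothetical placeholders, not IBM's documented API.

```python
# Hedged sketch: calling a hosted chat model with a guardrails-style toggle.
# The endpoint URL, payload fields ("model_id", "moderations", "hap"), and
# response shape are hypothetical placeholders for illustration only;
# consult the watsonx.ai documentation for the actual API.
import os
import requests

API_URL = "https://example-region.cloud.example.com/v1/text/generation"  # placeholder
API_TOKEN = os.environ["WATSONX_API_TOKEN"]  # assumed to be set by the caller

payload = {
    "model_id": "meta-llama/llama-2-70b-chat",  # hypothetical identifier
    "input": "Summarize our refund policy for a customer email.",
    "parameters": {"max_new_tokens": 200},
    # Hypothetical guardrails toggle: ask the service to filter harmful
    # language in both the input prompt and the generated output.
    "moderations": {"hap": {"input": True, "output": True}},
}

response = requests.post(
    API_URL,
    json=payload,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=60,
)
response.raise_for_status()
print(response.json())
```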

To help clients realize the full potential of the models for their use cases, IBM Consulting offers the expertise of 21,000 data, AI and automation consultants, in addition to its Center of Excellence for Generative AI, comprising more than 1,000 consultants with specialized generative AI expertise.