IBM Makes AI Easier for All Businesses

Let’s be honest: Building AI applications is not for the faint of heart.

Large companies may have enough bandwidth and budget to fail quickly, but small businesses cannot afford to fail even once.

The problem is that running AI properly requires labeled data, plenty of money, and time, all of which many businesses don’t have.

IBM has long championed AI under its IBM Watson brand, and it is now offering an alternative built on foundation models.

With the watsonx revamp, the company aims to address one of the biggest challenges facing AI projects: the lack of labeled data.

Making AI efforts less difficult

Speaking at a media briefing held alongside IBM Think Singapore, Sriram Raghavan, vice president of IBM Research AI, pointed out that getting labeled data is never easy.

“You either have it, or you have to spend money to collect it. For enterprises, it takes time to curate the data, which limits their set of use cases,” says Raghavan.

So IBM took the lessons from the earlier Watson and rolled out watsonx. Through its three main components, watsonx.ai, watsonx.data, and watsonx.governance, the company makes it easier for any business to develop the right use cases without labeling their data.

“What’s really interesting to us is that we can build and retrain without labeled data,” says Raghavan.

“[Foundation models] open up the possibility of dramatically scaling AI at no cost. So being able to commoditize that cost across the base model is really, really exciting,” Raghavan added.

Another reason companies are excited about IBM’s promise is workflow agility.

The world is starting to wake up to the impact of AI, thanks to OpenAI releasing ChatGPT. And it’s not just customers; regulators and governments are also starting to get involved.

Foundation models give companies a way to avoid going back to the drawing board and redoing everything from scratch when new guardrails or regulations are introduced. IBM enables many companies to build models for specific industry use cases based on the enterprise data they already have.

“What we look at when it comes to generative AI and foundation models are often workflows that start not with base data but with a base model, which might be provided by a provider or third-party open source, and is then tuned and adapted with enterprise data,” Raghavan said.

Many models to rule them all

IBM is fully committed to foundation models.

Chetan Krishnamurthy, chief marketing officer and vice president of marketing and communications at IBM APAC, explained how this can help.

The challenge is that many companies are already working on different models. So IBM’s answer is to rely on industry partnerships and alliances to build an ecosystem of models.

In August of this year, IBM announced that it would make Meta’s Llama 2-Chat 70-billion-parameter model available to select customers. It also hosts StarCoder, a large language model for code trained on more than 80 programming languages, Git commits, GitHub issues, and Jupyter notebooks.

IBM also continues to create and release its own foundation models. Called Granite, these come in multiple sizes and apply generative AI to language and code.

Currently, Granite models use data from the internet, academia, unstructured code datasets, law, and finance. “And bringing these into the enterprise not only enables new generative AI use cases, but also accelerates the adoption of traditional AI use cases,” Raghavan explained.

Resource-strapped companies can train these models on specialized datasets, ensuring that results are grounded in relevant industry knowledge. Additionally, the Granite models are trained to screen for hate and profanity (HAP) content, so companies don’t need to hire reviewers to vet large amounts of content.
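
IBM did not detail the mechanics of that screening at the briefing, but conceptually it means generated text is checked for HAP content before it reaches a user or reviewer. The sketch below is a minimal, purely illustrative stand-in under that assumption: it uses a hand-maintained keyword list rather than a trained classifier, and the class and method names (HapScreen, screen, BLOCKED_TERMS) are hypothetical, not part of any IBM API.

```java
import java.util.List;
import java.util.Locale;

/**
 * Minimal, illustrative stand-in for HAP (hate and profanity) screening.
 * IBM's actual Granite screening is model-based; this sketch only shows
 * the idea of gating generated text before it reaches a reviewer or user.
 */
public class HapScreen {

    // Hypothetical blocklist for illustration only; a production system
    // would rely on a trained classifier, not a keyword list.
    private static final List<String> BLOCKED_TERMS =
            List.of("slur-example-1", "slur-example-2", "profanity-example");

    /** Returns true if the text passes the (simplified) HAP check. */
    public static boolean screen(String generatedText) {
        String lower = generatedText.toLowerCase(Locale.ROOT);
        return BLOCKED_TERMS.stream().noneMatch(lower::contains);
    }

    public static void main(String[] args) {
        String candidate = "Quarterly revenue grew 12% on strong cloud demand.";
        if (screen(candidate)) {
            System.out.println("Passed HAP check: " + candidate);
        } else {
            System.out.println("Flagged for review.");
        }
    }
}
```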

The future lies in model partnerships

What makes IBM’s approach unique is that it works with all the important AI players and aims to make their work available to everyone. And when it works with large clients, it open-sources the results. Agnes Heftberger, general manager for IBM Australia, Southeast Asia, New Zealand, and Korea, highlighted the example of NASA.

For companies, this means the ability to move quickly on AI initiatives and quickly pivot or capture AI-driven opportunities. And they can now do it with the data they already have or have access to.

But IBM doesn’t stop there. During a media briefing, Krishnamurthy talked about how regional businesses can use IBM watsonx Code Assistant to quickly modernize applications and automate IT without paying huge consulting fees.

IBM watsonx Code Assistant provides pre-trained models tuned for specific programming languages. One variant targets IBM Z application modernization, allowing companies to convert their COBOL code to Java. Unlike previous modernization tools, it does not downplay the IBM Z value proposition or generate Java code that is difficult to maintain. And IBM backs it up with support from IBM Consulting, which helps companies tackle difficult modernization projects.
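
IBM did not show Code Assistant output at the briefing, so the snippet below is only a hypothetical illustration of the kind of maintainable Java such a translation aims for: a COBOL-style running total over transaction amounts, paraphrased in the comments, rewritten as plain Java. The class, method, and sample values (TransactionTotals, totalAmount, the amounts) are invented for the example and are not actual tool output.

```java
import java.math.BigDecimal;
import java.util.List;

/**
 * Hypothetical illustration of a COBOL-to-Java modernization target.
 * Original COBOL-style logic (paraphrased, not real customer code):
 *
 *   PERFORM VARYING TX-IDX FROM 1 BY 1 UNTIL TX-IDX > TX-COUNT
 *       ADD TX-AMOUNT (TX-IDX) TO TOTAL-AMOUNT
 *   END-PERFORM.
 */
public class TransactionTotals {

    /**
     * Sums transaction amounts, using BigDecimal as the usual stand-in
     * for COBOL packed-decimal arithmetic.
     */
    public static BigDecimal totalAmount(List<BigDecimal> transactionAmounts) {
        BigDecimal total = BigDecimal.ZERO;
        for (BigDecimal amount : transactionAmounts) {
            total = total.add(amount);
        }
        return total;
    }

    public static void main(String[] args) {
        List<BigDecimal> amounts = List.of(
                new BigDecimal("125.50"),
                new BigDecimal("89.99"),
                new BigDecimal("310.00"));
        System.out.println("Total: " + totalAmount(amounts)); // prints Total: 525.49
    }
}
```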

Another use case is Red Hat Ansible Lightspeed, a generative AI service for Ansible content born out of Project Wisdom, Red Hat’s collaboration with IBM to add AI capabilities to Ansible.

“Therefore, we want to offer our customers the best of proprietary, third-party, and open-source models and allow them to choose the best one,” Raghavan concluded.

It’s up to businesses to choose the next step in AI.

Winston Thomas is the editor-in-chief of CDOTrends and DigitalWorkforceTrends. He is a Singularity believer, blockchain enthusiast, and believes we already live in the Metaverse. You can contact him at [email protected].

Image credit: iStockphoto/Mykyta Dolmov