Lamini, a Palo Alto-based startup building a platform to help companies adopt generative AI technology, has raised $25 million from investors including Stanford University computer science professor Andrew Ng.
The company, co-founded several years ago by Sharon Zhou and Greg Diamos, has an interesting sales pitch.
Zhou and Diamos argue that many generative AI platforms are too generic and don’t have the solutions or infrastructure to meet the needs of enterprises. In contrast, Lamini was built from the ground up with enterprises in mind and is focused on delivering high accuracy and scalability for generative AI.
“The top priority of almost every CEO, CIO, and CTO is leveraging generative AI within their organization with the highest ROI,” Zhou, CEO of Lamini, told TechCrunch. “But while it’s easy for individual developers to get a working demo on their laptops, the path to production is littered with failures.”
Zhou notes that many companies have expressed frustration with the hurdles to meaningfully implementing generative AI across business functions.
According to an MIT Insights poll, although 75% of organizations have experimented with generative AI, only 9% have adopted it widely. The biggest hurdles range from a lack of IT infrastructure and capabilities to poor governance structures, insufficient skills and high implementation costs. Security is a major factor, too: in an Insight Enterprises survey, 38% of companies said security affects their ability to leverage generative AI technology.
So what’s Lamini’s answer?
Zhou says that “every part” of Lamini’s technology stack, from hardware to software, including the engines used to support model orchestration, fine-tuning, running and training, has been optimized for enterprise-scale generative AI workloads. Admittedly, “optimized” is a vague word, but Lamini is pioneering one step that Zhou calls “memory tuning,” a technique that trains a model on data so that it recalls parts of that data exactly.
Zhou argues that memory tuning can reduce hallucinations, the instances where a model makes up facts in response to a request.
“Memory tuning is a training paradigm that is as efficient as fine-tuning, but goes beyond it. It trains a model on proprietary data that includes key facts, numbers and figures so that the model has high precision,” Nina Wei, an AI designer at Lamini, told me via email, “and can memorize and recall the exact match of any key information instead of generalizing or hallucinating.”
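Lamini hasn’t published the technical details of memory tuning, but the concept as Wei describes it resembles fine-tuning a model on a set of key facts until the training loss on those facts approaches zero, at which point the model reproduces them verbatim rather than paraphrasing. As a rough, hypothetical sketch only (this is not Lamini’s actual implementation, and the model, facts, and hyperparameters below are all placeholders), such a loop might look like this:

```python
# Hypothetical sketch: supervised fine-tuning on key facts, driving the loss
# on those facts toward zero so the model recalls them verbatim.
# NOT Lamini's implementation; the model, data and hyperparameters are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; any causal language model would do
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Proprietary "facts" the model should recall exactly.
facts = [
    "Invoice #4921 was issued on 2024-03-18 for $12,500.",
    "The enterprise SLA guarantees 99.95% uptime.",
]

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()

# Unlike generic fine-tuning, which stops early to preserve generalization,
# keep iterating until the facts are effectively memorized (near-zero loss).
for epoch in range(50):
    total_loss = 0.0
    for fact in facts:
        batch = tokenizer(fact, return_tensors="pt")
        outputs = model(**batch, labels=batch["input_ids"])
        outputs.loss.backward()
        optimizer.step()
        optimizer.zero_grad()
        total_loss += outputs.loss.item()
    if total_loss / len(facts) < 0.01:  # arbitrary exact-recall threshold
        break
```

The hard part, presumably, is doing something like this at scale without degrading the model’s general abilities, which is where Lamini’s claimed stack-wide optimizations would come in.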
I’m not sure I buy it. “Memory tuning” appears to be more of a marketing term than an academic one; there are no research papers about it, at least none that I’ve been able to find. I’ll leave it to Lamini to show evidence that its “memory tuning” is superior to the other hallucination-reducing techniques being attempted today.
Fortunately for Lamini, memory tuning isn’t the only differentiator.
Zhou said the platform can operate in highly secure environments, including air-gapped environments. Lamini allows enterprises to run, fine-tune, and train models in a variety of configurations, from on-premises data centers to public and private clouds. It also scales workloads “elastically,” Zhou says, reaching 1,000 GPUs or more as the application or use case requires.
“Incentives are misaligned in the closed-source model market today,” Zhou said. “We aim to put control back into the hands of the many, not the few, starting with the enterprises that care most about control and have the most to lose from their proprietary data being owned by someone else.”
For what it’s worth, Lamini’s co-founders have impressive track records in the AI space. They’ve also brushed shoulders with Ng, which no doubt explains his investment.
Zhou previously served on the faculty at Stanford, where she led a group researching generative AI. Before earning her computer science PhD under Ng, she was a machine learning product manager at Google Cloud.
Diamos, for his part, co-founded MLCommons, the engineering consortium dedicated to creating standard benchmarks for AI models and hardware, as well as MLPerf, the MLCommons benchmarking suite. He also led AI research at Baidu, where he worked alongside Ng during the latter’s time as chief scientist there, and was a software architect on Nvidia’s CUDA team.
The co-founders’ industry connections appear to have given Lamini an edge on the fundraising front. In addition to Ng, Figma CEO Dylan Field, Dropbox CEO Drew Houston, OpenAI co-founder Andrej Karpathy and, oddly enough, Bernard Arnault, CEO of luxury goods giant LVMH, are all investors in Lamini.
AMD Ventures is also an investor (a bit ironic given Diamos’ Nvidia roots), as are First Round Capital and Amplify Partners. AMD got involved early, supplying Lamini with data center hardware, and today Lamini runs many of its models on AMD Instinct GPUs, bucking the industry trend.
Lamini claims that its model training and running performance is on par with that of equivalent Nvidia GPUs, depending on the workload. Since we’re not equipped to test that claim, we’ll leave it to third parties to evaluate.
To date, Lamini has raised $25 million across seed and Series A rounds (Amplify led the Series A). Zhou says the money will be used to triple the company’s 10-person team, expand its compute infrastructure and kick off development of “deeper technical optimizations.”
There are a number of enterprise-oriented generative AI vendors that could compete with aspects of Lamini’s platform, including big tech companies like Google, AWS and Microsoft (via its OpenAI partnership). Google, AWS and OpenAI in particular have been aggressively courting enterprises in recent months, introducing features such as streamlined fine-tuning and private fine-tuning on private data.
I asked Zhou about Lamini’s customers, revenue and overall go-to-market momentum. She wasn’t willing to reveal much at this somewhat early stage, but she did say that AMD (via the AMD Ventures tie-in), AngelList and NordicTrack are among Lamini’s early (paying) users, along with some nonprofits and government agencies.
“We’re growing fast,” she added. “Our biggest challenge is serving customers. We’ve only been handling inbound demand because we’ve been inundated with it. Given the interest in generative AI, we’re not representative of the overall tech slowdown. Unlike our peers in the AI industry, we have gross margins that look more like those of a regular tech company.”
“While there are plenty of AI infrastructure companies out there, Lamini is the first one I’ve seen that is taking the problems of the enterprise seriously,” said Mike Dauber, a general partner at Amplify. “They’ve built a solution that helps enterprises unlock the tremendous value of their private data while satisfying even the most stringent compliance and security requirements.”