Kai-Fu Lee’s Game-Changing LLM
In late March, Lee launched a company called 01.AI with the vision to develop a homegrown large language model for the Chinese market. The venture puts him in competition with other prominent Chinese tech leaders, including Sogou’s founder Wang Xiaochuan, who have been swiftly gathering talent and venture capital to establish China’s equivalents of OpenAI.
“I think necessity is the mother of innovation, and there’s clearly a huge necessity in China,” said Lee, 61, who leads 01.AI as CEO, explaining his motive for starting the company. “Unlike the rest of the world, China doesn’t have access to OpenAI and Google because those two companies did not make their products available in China, so I think many doing LLM are trying to do their part in creating a solution for a market that really needs this.”
01.AI’s growth is a fitting reflection of the rapid pace of the generative AI field. Seven months after its founding, the startup has released its first model, the open-source Yi-34B. Introducing an open LLM as its debut product is a way to “give back” to society, said Lee. For those who have found Meta’s LLaMA a “godsend,” “we’ve provided a compelling alternative,” he added.
At the time of writing, Yi-34B, a bilingual (English and Chinese) base model with 34 billion parameters that is significantly smaller than other open models like Falcon-180B and Meta’s Llama 2-70B, ranked first among pre-trained LLMs, according to a Hugging Face leaderboard.
“We still believe that larger models, when trained well on a large amount of high-quality data, will always outperform substantially smaller models of comparable quality and comparable technology, so I think [Yi-34B] outperforming much larger models is something that we don’t usually see,” said Lee. “We feel quite confident that as we release models that are 100 billion to 400 billion over the next year and a half, these models will be dramatically better than today’s model that we announced.”
The startup’s ability to commence model training quickly is no doubt an outcome of its smooth fundraising, which is critical to securing top-tier talent and AI processors. While declining to disclose how much 01.AI has raised, Lee said it’s valued at $1 billion after receiving financing from Sinovation Ventures, Alibaba Cloud, and other undisclosed investors.
01.AI has already grown to more than 100 employees, over half of whom are LLM experts from major multinational and Chinese tech firms. Its vice president of technology, for instance, was an early member of the team behind Google’s Bard, and its chief architect was a founding member of TensorFlow and worked alongside renowned researchers like Jeff Dean and Samy Bengio at Google Brain. The key figures behind Yi-34B are Wenhao Huang, a Microsoft Research Asia veteran, and Ethan Dai, who held senior AI positions at Huawei and Alibaba.
Having backed more than ten unicorns and incubated seven companies through Sinovation Ventures, Lee is arguably one of the best-connected investors and entrepreneurs in China.
“It’s been, you know, over 25 years since the founding of Microsoft Research Asia, and everything I’ve done has been about getting super great talent,” said Lee, who launched Microsoft Research Asia, the U.S. giant’s biggest research center abroad, before spearheading Google China. Over the years, Microsoft Research Asia has earned the reputation of being the “West Point” for nurturing China’s AI entrepreneurs.
“Now, of course, you want to pay people fairly, and you need to be competitive in pay, but I really think that it’s also about people believing they can make a difference and believing the company can succeed,” said Lee, appearing in a video call at 9:30 p.m. Beijing time. His staff seemed just as self-motivated. One of the startup’s infrastructure experts was working well into the wee hours that day, still messaging Lee at 2:15 a.m. to share his excitement about being part of the company’s AI mission.
It’s no secret that building LLMs is a costly undertaking. To sustain its cash-intensive operations, 01.AI has plans for monetization right from the start. While the company will continue to open source some of its models, its objective is to build a state-of-the-art proprietary model that serves as a foundation for a diverse range of commercial products.
“We can’t open source everything,” said Lee. “We were quite cognizant of the fact that these large language models require a lot of computing and, therefore, are very expensive. When we raise a lot of money, most of it will be spent on the GPU. Given that, we needed to first acquire as much GPU as we could, which we did.”
Like other LLM players in China, 01.AI proactively stockpiled GPUs in anticipation of U.S. sanctions, borrowing money to buy processors even before it landed funding. Over the past year, the Biden administration has tightened restrictions on China’s access to high-end AI chips, forcing Chinese firms to pay inflated prices for them. The foresight was rewarded: 01.AI now has a supply that will suffice for at least the next 12 to 18 months.
Aside from causing headaches for Chinese firms, U.S. sanctions have been a catalyst for innovation by pushing them to optimize their use of computing power. “With a very high-quality infrastructure team, for every 1,000 GPUs, we might be able to squeeze 2,000 GPUs’ worth of workloads out of them,” said Lee.
01.AI’s path to monetization hinges largely on its ability to find product-market fit for its expensive AI models. While top-notch LLM scientists are scarce, there’s no shortage of product talent in China.
“China’s not ahead of the U.S. in LLM, but there’s no doubt China can build better applications than American developers mostly because of the phenomenal mobile internet ecosystem that was built over the last 12 years or so,” argued Lee.
While the founder gave no details on the services in the pipeline, he hinted that the company is experimenting with concepts in the productivity and social spaces, and said he’d be “disappointed” if 01.AI didn’t release an app within this calendar year.
The startup’s ultimate goal, according to Lee, is to become an ecosystem where outside developers can build applications easily. “The duty is not just to push out good research models, but even more importantly to make application development easy so that there can be compelling applications,” he said. “At the end of the day, it is an ecosystem play.” Time will tell if Lee’s AI endeavor will pay off.