As AI companies race to improve the accuracy of large language models (LLMs) and apps built on top of them, a startup that has emerged as a key partner in that effort is announcing a significant round of funding. Turing, which works with armies of engineers to contribute code to AI projects — including assisting in the building of LLMs for OpenAI and others, as well as creating generative AI apps for enterprises — has secured a $111 million Series E round, doubling its valuation to $2.2 billion.
Turing currently generates around $300 million in ARR (annualized revenue run rate); when this latest round was priced, that figure stood at $167 million, which means the company is growing fast. It has also been profitable for around a year, according to CEO Jonathan Siddharth, who tells Tech Zone Daily the company plans to use the funding to expand its business to more customers and broaden its use cases.
Turing says it now works with some 4 million coders around the world. The actual number of Turing employees is much smaller, a figure in the hundreds, according to one of its investors.
Malaysia’s sovereign wealth fund Khazanah Nasional Berhad is leading the new round, with participation also from WestBridge Capital, Sozo Ventures, UpHonest Capital, AltaIR Capital, Amino Capital, Plug and Play, MVP Ventures, Fortius Ventures, Gaingels, and Mastodon Capital Management. Palo Alto-based Turing has raised $225 million to date.
Serving as a major partner to AI companies is not how Turing got its start, however. Originally, it was effectively an HR tech startup.
Specifically, its early product was a platform for vetting and hiring remote coders, a business that started to take off during the COVID-19 pandemic, as the world sought better tools to source and manage remote teams. That business was strong enough to catapult the company to “unicorn” status, impressing both customers and investors.
“When I first met them in 2018, my jaw dropped,” Sumir Chadha, a co-founder and managing partner at WestBridge, said in an interview, referring to how Siddharth seemingly upended the whole management consultancy and offshoring model. “You don’t need any of that. You don’t even need an HR staff. You can do it all with Turing and remote engineers.”
It was also strong enough to start catching a different kind of attention.
As a Semafor report from last year recounts, Siddharth was summoned to OpenAI for a meeting in 2022, which he thought would be about recruiting engineers for the startup. Instead, it turned out to be a proposition. OpenAI researchers had discovered that adding code to training datasets helped improve a model’s reasoning capabilities, and the company wanted Turing’s help generating that code.
Siddharth obliged, and that set off a whole new business focus for the startup, which he says now works with a number of foundational AI companies to provide similar services, as well as with companies that build apps on top of those LLMs.
“We could not have predicted the ChatGPT moment,” he said this week, and he couldn’t have predicted how the infrastructure he built for hiring coders would place Turing itself into the middle of the action. “I don’t think people knew how important software engineering tokens [would be] to teach an LLM to think and reason and code.”
But it hasn’t been a pivot. Siddharth was quick to correct me when I used the word, and he pointed out that Turing still generates substantial revenue from its older business sourcing coding talent, although he would not disclose how much.
“They are all growing,” he said of the different business lines. “We are stepping on the gas to scale up R&D, and scale up sales and marketing across all three businesses. We are in rapid expansion mode.”
However, the main focus for new business, it seems, will be to continue doubling down on its work as a coding provider for the building of future LLMs, a division Turing optimistically calls “Turing AGI Advancement,” and on its work on apps and services built on those LLMs, under the banner of “Turing Intelligence.”
Updated to include a more recent ARR figure.