Periodic Labs: How AI’s Fastest-Rising Startup Hit 1650% Growth
A startup that pairs artificial intelligence with robotic laboratories – not chatbots, not image generators, but actual physical experiments – has rocketed to the top of AI’s growth charts. Periodic Labs, founded by former leaders at OpenAI and Google DeepMind, reported 1650% growth, earning it the distinction of AI’s fastest-rising startup. The company’s thesis is deceptively simple: the internet’s roughly 10 trillion text tokens have been exhausted by frontier AI models, and the next frontier of intelligence requires generating entirely new data by running real experiments in autonomous labs.
What makes Periodic Labs remarkable isn’t just the growth figure. It’s the caliber of talent, the scale of funding, and the audacity of the mission – building AI scientists that can hypothesize, experiment, and learn from nature itself. With a record-breaking $300 million seed round and early talks at a $7 billion valuation, the company is betting that the future of AI lies not in consuming more web text, but in poking reality with robotic arms and learning from what happens.
The Founders Behind the Rocket Ship
Periodic Labs was co-founded by Ekin Dogus Cubuk and Liam Fedus, two researchers whose combined resumes read like a highlight reel of the last decade’s biggest AI breakthroughs.
Cubuk led the materials science and chemistry team at Google Brain and DeepMind, where he spearheaded GNoME – a model that discovered over 2 million new crystal structures in 2023, materials with potential applications in batteries, superconductors, and next-generation electronics. He also co-authored a landmark 2023 Nature paper documenting a fully automated robotic lab that synthesized 41 novel compounds in just 17 days.
Fedus served as Vice President of Post-Training Research at OpenAI, where he was part of the small team that created ChatGPT. He also led the development of OpenAI’s Operator agent (now called Agent) and the first trillion-parameter neural network. When Fedus tweeted his departure from OpenAI in March 2025, it set off what can only be described as a venture capital stampede – one investor reportedly sent a “love letter” to the not-yet-incorporated company.
The founding team of roughly 20 people includes creators of neural attention mechanisms, Microsoft’s MatterGen materials science AI, and contributors to reinforcement learning breakthroughs in math and code. In a cultural touch that nods to their mission, each team member picks an element from the periodic table for a custom desk plate.
A $300 Million Seed Round and the Path to $7 Billion
Periodic Labs emerged from stealth on September 30, 2025, with $300 million in seed funding – the largest seed round in AI history. Andreessen Horowitz led the round, with Felicis cutting the very first check. Other participants included DST Global, NVentures (NVIDIA’s venture capital arm), Accel, and a roster of high-profile angels: Jeff Bezos, Elad Gil, Eric Schmidt, and Jeff Dean.
The origin of that first check is a Silicon Valley story in itself. Peter Deng, a former OpenAI executive turned Felicis general partner, met Fedus for coffee in San Francisco’s Noe Valley neighborhood. The conversation turned into a pitch walk over the area’s famously steep hills. Deng, overdressed in a sweater on a day that turned hot, was scrambling to keep up when Fedus said something that “literally stopped me in my tracks”: “Everyone talks about doing science, but in order to do science, you actually have to do real science.” Deng committed to invest on the spot – only to discover the company hadn’t been incorporated yet and didn’t even have a name.
As of early 2026, Periodic Labs is reportedly in talks for hundreds of millions more at a $7 billion valuation, a trajectory that reflects both the 1650% growth metric and deep investor conviction in the company’s approach.
Why the Internet Isn’t Enough
The central argument driving Periodic Labs is that frontier AI models have hit a wall. The internet contains an estimated 10 trillion text tokens, and the best models have fully consumed them. Re-reading the textbook, as the founders put it, yields diminishing returns. To create genuinely new knowledge – not just recombine existing information – AI needs to interact with the physical world.
This is where autonomous labs become critical. Each experiment Periodic runs can produce gigabytes of high-quality, proprietary data that exists nowhere on the internet. Crucially, these labs also capture negative results – the failed experiments that make up over 90% of materials science research but are almost never published. In traditional science, failures are discarded. At Periodic Labs, they become training fuel.
“Every deviation in an experiment and every error feedback is an opportunity for the model to understand the physical world,” Cubuk has explained. “AI is not afraid of failure; it’s only afraid of having no data.”
How the Technology Actually Works
Periodic Labs operates on a closed-loop system that integrates three components into what might be called a “trinity” science stack:
- AI hypothesis generation: Starting with pre-trained language models fine-tuned on physics and chemistry, the system generates conjectures at the quantum mechanical scale about new materials and their properties.
- Autonomous robotic labs: Robots in powder synthesis labs mix chemical precursors, heat materials in high-temperature furnaces, and characterize the resulting compounds using instruments like spectrometers – all with minimal human intervention. Each run generates gigabytes of high-dimensional physical data.
- Reinforcement learning from nature: Results flow back into the AI, which refines its models through mid-training and reinforcement learning. Unlike traditional RL environments (video games, simulations), the reward signal comes from physical reality itself. When the AI predicts a material’s properties and synthesizes it, nature provides a definitive answer about whether the prediction was correct.
This loop – hypothesize, synthesize, measure, learn, repeat – runs continuously. The AI reads literature, runs quantum mechanical simulations, takes action in the lab, and gets feedback from nature. Over time, this builds what the company calls a “scientific experience database” of unprecedented depth.
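The loop described above can be sketched as a toy optimization program. Everything here is illustrative: the `run_experiment` function stands in for a robotic synthesis-and-measurement run, the "model" is just a best guess with a shrinking search radius rather than a fine-tuned LLM, and all names are hypothetical.

```python
import random

# Toy stand-in for nature: the measured property of a candidate material.
# In a real autonomous lab this would be a robotic synthesis + measurement run.
def run_experiment(candidate: float) -> float:
    """Hypothetical: the measured property peaks near an unknown optimum."""
    optimum = 0.7
    return -(candidate - optimum) ** 2  # higher is better

def closed_loop(iterations: int = 200, seed: int = 0) -> tuple[float, list]:
    """Hypothesize -> synthesize/measure -> learn, repeated.

    The 'model' here is a current best guess plus a shrinking search radius;
    a real system would use an LLM fine-tuned on physics plus RL updates.
    """
    rng = random.Random(seed)
    best_guess, best_score = 0.0, run_experiment(0.0)
    database = []  # the "scientific experience database": every result, failures included
    radius = 0.5
    for _ in range(iterations):
        # 1. Hypothesize: propose a candidate near the current best guess.
        candidate = best_guess + rng.uniform(-radius, radius)
        # 2. Synthesize and measure: nature supplies the ground-truth reward.
        score = run_experiment(candidate)
        # 3. Learn: record the outcome (negative results too) and update the model.
        database.append((candidate, score))
        if score > best_score:
            best_guess, best_score = candidate, score
        radius *= 0.99  # narrow the search as evidence accumulates
    return best_guess, database

best, db = closed_loop()
print(f"best candidate: {best:.3f}, experiments logged: {len(db)}")
```

Note that the database logs every run, not just the successes; in this analogy, the failed candidates are exactly the negative results that traditional publishing discards but that Periodic treats as training fuel.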
The company's initial scientific targets fall into three focus areas:

| Focus Area | Specific Goals | Potential Impact |
|---|---|---|
| Superconductors | Higher-temperature operation, potentially room-temperature | Next-gen transportation, efficient power grids, Moore’s Law revival, nuclear fusion |
| Semiconductors | Solving chip heat dissipation | Faster iteration cycles via AI agents interpreting experimental data |
| Advanced Materials | New magnets, heat shields | Defense, aerospace, advanced manufacturing breakthroughs |
Early Customers and Real-World Traction
Periodic Labs isn’t operating purely in research mode. The company has already secured customers in semiconductors, space, and defense – sectors representing trillions in annual R&D spend.
One concrete example: the company is working with a semiconductor manufacturer struggling with heat dissipation on advanced chips. Periodic is building custom AI agents that help engineers and researchers interpret experimental data and iterate faster on solutions. This “land and expand” strategy – solving critical problems with clear, measurable evaluations first, then scaling – reflects a pragmatic approach to commercialization in industries where AI trained only on text has barely made a dent.
The a16z investment thesis frames the opportunity starkly: the industries Periodic targets – advanced manufacturing, materials science, semiconductors, energy, and aerospace – represent roughly $15 trillion in global GDP. These are sectors where the bottleneck has been the iteration speed of human-led experimentation. Periodic’s autonomous labs aim to remove that constraint entirely.
The Talent Arms Race
Within weeks of its funding announcement, more than 20 elite researchers from Meta, OpenAI, and DeepMind joined Periodic Labs – many reportedly forfeiting millions in equity at their previous employers. The roster includes Alexandre Passos, a creator of OpenAI’s o1 and o3 models; Eric Toberer, a materials scientist with key superconductor discoveries; Dzmitry Bahdanau, a pioneer of neural attention mechanisms; and Matt Horton, a creator of two of Microsoft’s generative AI materials science tools.
To maintain cohesion across such diverse expertise, the team holds weekly graduate-level lectures where physicists teach ML researchers about quantum mechanics and AI specialists teach physicists about large language models. “We do feel like a tight coupling is extremely important,” Cubuk has said. The goal is for everyone to understand every part of what they’re building – a culture more reminiscent of a university research group than a typical startup.
How Periodic Compares to Other Approaches
| Approach | Data Source | Strengths | Limitations |
|---|---|---|---|
| Periodic Labs (Physical Labs + AI) | Proprietary experimental data, including negative results (the ~90% of experiments that fail); GBs per run | Deep physical grounding; strong data moat; measurable industrial applications | Capital-intensive; requires physical lab scaling |
| Traditional LLMs (Text-Trained) | Internet corpora (~10T tokens) | Cheap, scalable, fast iteration | No real physics understanding; no negative results; shallow for materials science |
| Simulation-Only AI | Virtual physics engines, game worlds | Low cost; multi-user scalability | Weaker moat; ground truth is the simulator, not reality |
Competitors in the AI-for-science space include FutureHouse, a San Francisco nonprofit building autonomous AI scientists, and Google DeepMind’s ongoing materials science work. The University of Toronto’s Acceleration Consortium and startups like Tetsuwan Scientific are also active. But none match Periodic’s combination of funding scale, talent density, and integrated physical lab infrastructure.
What Comes Next
Periodic Labs has also launched an Academic Grant Program to support pioneering research and is building a Scientific Advisory Board drawn from Stanford and Northwestern, including Nobel laureate Carolyn Bertozzi (Stanford Chemistry), Zhi-Xun Shen (Stanford Physics), and Mercouri Kanatzidis (Northwestern Chemistry).
The road ahead is not without risk. Scaling physical labs is expensive and slow compared to spinning up GPU clusters. The company’s moat depends on generating proprietary data at a pace that justifies its capital-intensive model. And the ultimate scientific targets – room-temperature superconductors, materials that could restart Moore’s Law – are among the hardest problems in physics.
But the 1650% growth figure, the $7 billion valuation talks, and the caliber of people betting their careers on this mission all point in one direction: Periodic Labs represents a genuine paradigm shift. Not AI that reads about science, but AI that does science – burning its fingers on real experiments and getting smarter with every failure. If the closed loop between intelligence and physical reality works at scale, the implications stretch far beyond materials science into energy, transportation, computing, and the fundamental pace of human progress.
Sources
- Periodic Labs – Official Website
- Felicis: Our Investment in Periodic Labs
- Andreessen Horowitz: Investing in Periodic Labs
- Observer: Bezos-Backed Startup Receives $300M Seed
- Datamation: Periodic Labs Powers Up for AI Advances
- TechCrunch: $300M VC Frenzy for Periodic Labs
- TechCrunch: $300M Seed to Automate Science
- Teaching AI Real World Physics: Periodic Labs
- 36Kr: Former OpenAI VP Launches AI for Science
- TechFundingNews: Periodic Labs Eyes $7B Valuation