Blog

NVIDIA GTC DC Takeaways for Biopharma: Sovereign AI, Physical AI, and the Data Work That Pays Off

October 31, 2025

America is building AI factories at industrial scale, like a modern-day version of 18th-century Lancashire.

That was the unmistakable takeaway from NVIDIA’s GTC DC: AI factories and “physical AI” are moving from concept to operating reality, with direct consequences for science, productivity, and national competitiveness. NVIDIA, as evidenced by its unprecedented new $5 trillion market cap, is becoming the dominant substrate of this latest industrial revolution.

What is an AI factory? In essence, it is a full-stack, repeatable blueprint that transforms data, models, and compute into production intelligence across industries. Eli Lilly's TuneLab supercomputer, showcased at NVIDIA GTC, now spans 1,000+ Blackwell Ultra GPUs to accelerate imaging, antibody generation, and gene therapy R&D. The Department of Energy's new Solstice system will field 100,000 Blackwell GPUs to supercharge scientific discovery at Argonne. And Nokia and NVIDIA used the show to outline an AI platform for 6G that turns networks into reasoning systems, an essential backbone for robotics, smart manufacturing, and data-center integration across the economy.

Compute is no longer the binding constraint for scientific AI; scientific data readiness is.

Overlapping with the factory story was a strong policy push for sovereign AI: trusted, controllable infrastructure operated under domestic rules of the road for security, privacy, and resilience. NVIDIA's VP/GM of Healthcare Kimberly Powell hosted a stirring conversation with U.S. Senator Todd Young of Indiana, who laid out his agenda for a secure, domestically controlled AI infrastructure for advanced manufacturing and scientific research, and explained why Washington is now the staging ground. With the right incentives and standards, the U.S. can maintain its resilience and competitiveness against geopolitical rivals; global leadership will be earned as much in factories and foundries as in model labs, with sovereignty a design requirement rather than a bolt-on.

For biopharma leaders at GTC, the message landed with particular force. Compute is no longer the binding constraint; scientific data readiness is. Tens of exabytes of R&D data sit in proprietary formats scattered across millions of silos, impeding AI workflows and starving models of consistent signals from discovery through manufacturing. Meanwhile, Eroom's Law continues to pressure returns, and one-off integration projects routinely stall under the weight of complexity, timelines, and scale. (Our CEO writes about these root causes of biopharma's productivity challenges on his Substack.)

The lesson from AI factories in other verticals applies directly: value accrues to those who industrialize the data layer, productize repeatable workflows, and field teams that fuse scientific and AI capabilities in day-to-day operations. This is the problem set TetraScience was built to solve.

The Tetra Scientific Data Foundry standardizes and governs instrument and assay data as AI-native assets: deconstructing proprietary outputs into atomic measurements and metadata mapped to shared schemas and ontologies that models can actually learn from at enterprise scale. On top of this foundation, our Scientific Use Case Factory turns those assets into packaged, repeatable workflows: a lead clone selection assistant that compresses timelines from 8 months to 2.5 months, a media optimization app that reduces wet-lab runs by 80%, and a universal chromatography dashboard that cuts out-of-spec events by 75%. Sciborg teams of hybrid scientist-engineers put these patterns to work in the field, translating domain questions into data pipelines, agents, and outcomes across discovery, development, and manufacturing.
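To make "AI-native assets" concrete, here is a minimal, illustrative sketch of what mapping a proprietary instrument export onto a shared schema of atomic measurements might look like. The schema, field names, and vendor record below are hypothetical stand-ins for illustration only, not TetraScience's actual data model or APIs.

```python
from dataclasses import dataclass, field
from typing import Any

# Hypothetical shared schema: one atomic measurement per record,
# with metadata harmonized to common vocabulary terms and units.
@dataclass
class AtomicMeasurement:
    sample_id: str
    analyte: str                 # e.g., an ontology term such as "protein titer"
    value: float
    unit: str                    # normalized unit, e.g., "g/L"
    instrument: str              # harmonized instrument identifier
    metadata: dict[str, Any] = field(default_factory=dict)

def harmonize(vendor_record: dict[str, Any]) -> AtomicMeasurement:
    """Map one record from a hypothetical vendor export onto the shared schema."""
    return AtomicMeasurement(
        sample_id=vendor_record["SampleName"],
        analyte="protein titer",                    # ontology mapping would happen here
        value=float(vendor_record["Result"]),
        unit="g/L",                                 # unit normalization would happen here
        instrument=vendor_record.get("Instrument", "unknown"),
        metadata={
            "method": vendor_record.get("Method"),
            "run_id": vendor_record.get("RunID"),
        },
    )

# A proprietary export row (hypothetical), flattened into the shared schema
raw = {"SampleName": "CL-0042", "Result": "3.18", "Method": "ProteinA-HPLC",
       "RunID": "R-7781", "Instrument": "hplc-07"}
print(harmonize(raw))
```

The point of the sketch is the shape of the output, not the parser: once every instrument's results land in one schema with consistent units and ontology terms, downstream models, agents, and dashboards can consume them without bespoke integration work.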

Seen through this lens, Lilly's TuneLab is compelling not only for GPU count but for its data posture: a platform to collaborate with biotechs under strict privacy that still allows model-driven R&D acceleration across modalities. The DOE's Solstice bet is similar in spirit: build the capability, then feed it with trustworthy, well-structured scientific data to change the tempo of discovery in fields that underpin energy security and economic growth. The sovereign thread connects both: durable advantage emerges when the nation can run these cycles on infrastructure controlled for security, provenance, and policy, and when regulated sectors can adopt with confidence.

The next phase is about sequencing. First, get the data foundation right with AI‑native standards, governance, and observability at source; second, productize high‑value scientific workflows; third, scale via agents and automation that keep humans in the loop while compounding learning across programs and plants. Do that, and AI factories become more than keynote metaphors. They become operating systems for scientific progress, and reliable engines for American productivity, resilience, and shared prosperity.

Talk to us to learn how we're partnering with NVIDIA to reimagine and replatform science for the era of AI.