SAP Put $1.16 Billion Into a Startup That Teaches AI to Read Your Spreadsheets. Here Is Why That Was the Right Call.

Every major foundation model announcement in 2025 and 2026 has focused on the same class of information: text. Text from the internet. Text from books. Text from code repositories. The entire transformer architecture that underpins GPT‑5.5, Claude Opus, and Gemini Ultra was designed to learn from sequences of text tokens. These models have learned to understand and generate human language with remarkable capability, and they have learned to write and debug code with increasing reliability.
What they have not learned to do well is reason over a table of numbers.
Not the numbers in a financial report, translated into prose and posted online. The actual numbers, organized in rows and columns, with relationships between fields, missing values, categorical encodings, and the kind of messy, real‑world structure that characterizes every ERP export, every CRM dataset, and every operational database that any enterprise runs. Approximately 90 percent of the data in enterprise environments is structured, tabular data of exactly this type. Text‑based foundation models address 10 percent of it.
Prior Labs, the Freiburg‑based AI startup, was built to address the other 90 percent.
On May 7, 2026, SAP announced it had invested $1.16 billion in Prior Labs, making it the largest private AI investment in European history and establishing Prior Labs as the continent's best‑funded AI lab. The investment gives SAP a majority stake in the company, with Prior Labs continuing to operate independently under its founders' leadership.
Prior Labs was co‑founded by Noah Hollmann and Samuel Müller, researchers whose academic work on tabular learning at the University of Freiburg produced some of the most influential papers on machine learning for structured data in recent years, including TabPFN, a transformer pretrained to make predictions on tabular datasets. Their research demonstrated that the transformer architecture that powers large language models can be adapted specifically for tabular data, with training approaches that let the resulting models reason over structured information far more effectively than models trained primarily on text.
Hollmann's description of what Prior Labs builds is technically precise and commercially important: foundation models that understand the language of business data, able to analyze, predict, and generate insights from structured information the way that text‑based models understand and generate language.
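To give a flavor of the idea behind this class of models, here is a deliberately tiny sketch. It is not Prior Labs' architecture: the real models are pretrained transformers, while this toy stand‑in uses distance‑weighted voting. What it does share with them is the interface: the "model" receives the labelled table and the query row in a single call and makes a prediction with no per‑dataset gradient training. The column meanings and values are invented for illustration.

```python
import math

def in_context_predict(train_rows, train_labels, query):
    """Toy stand-in for in-context tabular prediction: the predictor
    sees the whole labelled table plus the query at once and answers
    immediately, with no per-dataset training loop. (Real tabular
    foundation models do this with a pretrained transformer, not the
    distance-weighted voting used here.)"""
    scores = {}
    for row, label in zip(train_rows, train_labels):
        # Closer labelled rows contribute exponentially more weight.
        weight = math.exp(-math.dist(row, query))
        scores[label] = scores.get(label, 0.0) + weight
    return max(scores, key=scores.get)

# Invented toy table: columns are [monthly_revenue_k, churn_risk_score].
train_rows = [(120.0, 0.1), (95.0, 0.2), (20.0, 0.9), (35.0, 0.8)]
train_labels = ["retain", "retain", "churn", "churn"]

print(in_context_predict(train_rows, train_labels, (110.0, 0.15)))  # → retain
```

The point of the sketch is the calling convention, not the math: one forward pass over table-plus-query is what lets such a model be applied to a fresh dataset instantly, instead of being retrained for every table.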
The commercial case for this capability is not subtle. SAP's core product portfolio (S/4HANA, SuccessFactors, Ariba, and its analytics and data management suite) is built around structured enterprise data. When a CFO uses SAP to understand their company's cash position, or a supply chain manager uses SAP to forecast demand, the underlying data they are working with is rows and columns, not paragraphs. An AI that can reason over that structured data natively, without first converting it into a text representation that loses most of the relational information, has the potential to deliver insights and recommendations that no text‑based model can match.
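To make the serialization point concrete, here is a minimal sketch of what flattening a table into prose actually does. The table, column names, and values are invented for illustration:

```python
# Invented example rows; in practice these would come from an ERP export.
orders = [
    {"order_id": 1001, "customer": "ACME", "amount_eur": 2500.0, "region": "DE"},
    {"order_id": 1002, "customer": "Globex", "amount_eur": None, "region": "FR"},
]

def to_text(rows):
    """Serialize each row into a sentence, as a text-only pipeline would."""
    return " ".join(
        f"Order {r['order_id']} from {r['customer']} for "
        f"{r['amount_eur']} EUR in {r['region']}."
        for r in rows
    )

text = to_text(orders)
# The prose still "contains" the values, but the schema is gone: amounts
# become substrings, the missing value degrades to the literal word "None",
# and any aggregation now requires re-parsing the text.
print("None" in text)  # → True

# On the structured table, the same aggregation stays trivial and typed:
total = sum(r["amount_eur"] or 0.0 for r in orders)  # → 2500.0
```

A model operating on the structured rows keeps column types, keys, and missing‑value semantics for free; a model operating on the serialized text has to reconstruct all of that from word patterns, which is exactly the relational information loss described above.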
The $1.16 billion secures SAP's access to that capability at a moment when the enterprise AI market is moving rapidly toward foundation models that can handle multiple data modalities rather than defaulting to text‑based chat interfaces for all enterprise queries. Prior Labs' founders will use the capital to expand the research team, accelerate model development, and scale the engineering infrastructure required to deploy tabular AI models across SAP's global customer base.