
Ex-UBS knowledge graph architect Tony Seale pointed out the trend in a recent LinkedIn post:
“Ontology is having its moment. There was a time when we called it the ‘O word.’ Nothing killed a conversation with business – or IT – faster than mentioning ontology. The rule was simple: deliver value, keep the ontology part quiet.
“But things have changed.
“Ontology is shaping up to be the buzzword of 2026. A big part of that is Palantir’s extraordinary rise – their entire Foundry platform is built on ontological modeling. Microsoft is now moving ontology into Fabric. The race is on.”
Most executives don’t know what an ontology is. In this post, I’ll endeavor to explain the concept and why it’s critical to AI success.
To paint the big picture of why ontologies are important, I'll first need to explain the larger phenomenon they are part of: the Third Wave.
The Third Wave: AI Becomes Context Adaptive
Business and sharing go hand in hand. But sharing at scale in the digital world requires creating and marshalling an entire reusable realm of logically described, digitized generalities and specifics. In other words, machines need things explained to them in sufficient detail before they can take appropriate action.
Each business context needs its own dynamic, shareable version of these logically interconnected descriptions. And each context needs to be able to interact on demand with other contexts.
The term “world model” is now in vogue to refer to a comprehensive collection of these contexts.
Executives should be skeptical of vendors who claim to have world models but don't adhere to standards. Vendors often bypass standards in favor of their own proprietary “world models”, or ignore the need for shared standards entirely.
Many organizations implicitly back a shared standards approach to world models. Over decades, businesses and the semantics community at large have built standard, shared models to explain how people, places, things and ideas interact, as well as how these digitized concepts differ from or resemble one another. Standard, shared, open models from organizations such as the World Wide Web Consortium (W3C), the International Organization for Standardization (ISO) and the Object Management Group (OMG) are widely used, particularly by knowledge-intensive industries such as life sciences, manufacturing and financial services.
Back in 2017, John Launchbury of DARPA led the development of a video that explained the three waves of AI:
- Handcrafted knowledge (which is still relevant)
- Statistical learning (today’s machine learning, neural networks, generative AI, etc.)
- Contextual adaptation (a blending of waves I and II)
As Launchbury pointed out in the video, statistical learning can't explain how it arrives at the conclusion that a given image depicts a cat, for example. Contextual adaptation, by contrast, does enable such explanations, with the help of handcrafted knowledge.
Statistical learning can require tens or hundreds of thousands of examples to be able to identify the “catness” of images of cats. The addition of handcrafted knowledge makes it possible to teach machines to identify cats in images by using only a few examples.

Blending handcrafted knowledge with statistical learning creates synergies that raise the level of automation in explainable AI systems. I've used the term contextual computing in other pieces to describe such systems.
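To make that blending concrete, here is a purely illustrative sketch. It assumes Python's rdflib library and made-up example.org classes (nothing from Launchbury's video): a statistical classifier supplies a bare label, and a small handcrafted class hierarchy supplies the context needed to explain it.

```python
from rdflib import Graph, Namespace
from rdflib.namespace import RDFS

EX = Namespace("http://example.org/")
kb = Graph()
kb.bind("ex", EX)

# Handcrafted knowledge (Wave I): a tiny class hierarchy.
kb.add((EX.Cat, RDFS.subClassOf, EX.Mammal))
kb.add((EX.Mammal, RDFS.subClassOf, EX.Animal))

def explain(label):
    """Walk the hierarchy to say why a predicted label means what it means."""
    cls = EX[label]
    return [f"{label} is a kind of {parent.split('/')[-1]}"
            for parent in kb.transitive_objects(cls, RDFS.subClassOf)
            if parent != cls]

# Statistical learning (Wave II) would supply the label; it is hard-coded here.
print(explain("Cat"))  # ['Cat is a kind of Mammal', 'Cat is a kind of Animal']
```

The handcrafted side never has to see thousands of cat photos; it only has to know where “cat” sits in a web of other concepts, which is exactly what the statistical side cannot articulate on its own.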
Ontologies: Networked Knowledge Models Custom Built for Your Own Instance Data
Success in business hinges on sharing the right knowledge at the right time in the right place for the right purpose. A consistently structured network at the data layer, called a knowledge graph, can make this possible.
Knowledge graphs, as Graphwise points out, blend several data management paradigms to enable right knowledge / right time / right place / right purpose utility:
- Database (data explored via structured queries);
- Graph (analyzed as a network data structure);
- Knowledge base (formal semantics used to interpret the data and infer new facts).
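To see those three facets in one place, here is a minimal sketch, assuming Python's rdflib library and a hypothetical example.org vocabulary: the same handful of triples can be queried like a database, walked like a network, and interpreted with formal semantics.

```python
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, RDFS

EX = Namespace("http://example.org/")
g = Graph()
g.bind("ex", EX)

# Instance data: one supplier relationship.
g.add((EX.Acme, RDF.type, EX.Supplier))
g.add((EX.Acme, EX.supplies, EX.Globex))
g.add((EX.Acme, RDFS.label, Literal("Acme Corp")))

# 1. Database facet: a structured (SPARQL) query.
rows = g.query(
    "SELECT ?name WHERE { ?s a ex:Supplier ; rdfs:label ?name }",
    initNs={"ex": EX, "rdfs": RDFS},
)
print([str(row.name) for row in rows])     # ['Acme Corp']

# 2. Graph facet: walk the network of edges around a node.
print(list(g.predicate_objects(EX.Acme)))  # every relationship leaving ex:Acme

# 3. Knowledge-base facet: formal semantics support inference. The statement
#    below would let an RDFS reasoner (e.g., the owlrl package) conclude that
#    Acme is an Organization, even though no triple says so directly.
g.add((EX.Supplier, RDFS.subClassOf, EX.Organization))
```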

Ontologies, according to Graphwise, “can be seen as the data schema of the graph. They serve as a formal contract between the developers of the knowledge graph and its users regarding the meaning of the data in it. A user could be another human being or a software application that wants to interpret the data in a reliable and precise way. Ontologies ensure a shared understanding of the data and its meanings.”
Boutique knowledge graph consultancy Enterprise Knowledge notes that “Ontologies are semantic data models that define the types of things that exist in our domain and the properties that can be used to describe them. Ontologies are generalized data models, meaning that they only model general types of things that share certain properties, but don’t include information about specific individuals in our domain.” The following diagram, which depicts a book publishing ontology, provides an example of how ontologies express interlinked concepts in generalized or abstracted ways.
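For readers who prefer code to diagrams, here is a minimal sketch of what such a generalized book publishing model might look like, assuming Python's rdflib library and hypothetical example.org terms. Note that it defines only classes and properties, never individual books, which is exactly the generalization Enterprise Knowledge describes.

```python
from rdflib import Graph, Namespace
from rdflib.namespace import RDF, RDFS

EX = Namespace("http://example.org/publishing/")
onto = Graph()
onto.bind("ex", EX)

# Classes: the general types of things in the domain.
for cls in (EX.Book, EX.Author, EX.Publisher):
    onto.add((cls, RDF.type, RDFS.Class))

# Properties: how those types relate. Domains and ranges act as the
# "formal contract" about what the data means.
onto.add((EX.writtenBy, RDF.type, RDF.Property))
onto.add((EX.writtenBy, RDFS.domain, EX.Book))
onto.add((EX.writtenBy, RDFS.range, EX.Author))

onto.add((EX.publishedBy, RDF.type, RDF.Property))
onto.add((EX.publishedBy, RDFS.domain, EX.Book))
onto.add((EX.publishedBy, RDFS.range, EX.Publisher))

print(onto.serialize(format="turtle"))
```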

How Ontologies and Standards-Based Knowledge Graphs Pay Off
A standards-based, ontologized approach to data management pays the biggest dividends when systems scale sufficiently to enable substantial competitive advantage. When ontologies help organizations reach across supply chain boundaries, for example, the network effect can increase exponentially.
As you can imagine, achieving this kind of scale requires considerable management skill and consistent effort. When data must cross company boundaries and supply, demand, and partner chains, the challenge of right knowledge / right time / right place / right purpose becomes even more difficult. Agents (software designed to act on behalf of users in more or less autonomous ways) compound the challenge further.
How does an organization meet all of these requirements at once when a business opportunity arises? Bringing these factors together successfully requires, above all else, a precise description of adapted context, to use Launchbury’s phrasing. The following methods help create such context precisely (a brief code sketch after the list shows how they combine):
- Unique, global, shared identifiers (to avoid ambiguity)
- A means of sharing descriptions extensibly (to enable accessibility and reusability)
- Relationships that fit the context (to clarify how people, places, things and ideas interrelate within the context)
- Graph or network modeling enabling many kinds of logically connected abstractions (to match levels of specificity/granularity and ensure apples-to-apples comparisons)
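Here is that sketch, assuming Python's rdflib library and hypothetical organization identifiers: because two parties use the same global identifiers and a shared vocabulary (schema.org in this case), their graphs merge without any reconciliation or mapping step.

```python
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF

SCHEMA = Namespace("https://schema.org/")
ACME = Namespace("https://acme.example/id/")   # hypothetical global identifiers

ours = Graph()
ours.add((ACME.order42, RDF.type, SCHEMA.Order))
ours.add((ACME.order42, SCHEMA.seller, ACME.acmeCorp))

theirs = Graph()                               # a partner's graph, built separately
theirs.add((ACME.order42, SCHEMA.orderStatus, Literal("shipped")))

# Because both graphs name the order with the same global IRI and describe it
# with the same shared vocabulary, merging them is a simple set union.
combined = ours + theirs
print(len(combined))                           # 3 triples about one unambiguous thing
```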
By harnessing the power of these methods and the O word together, the Third Wave can build momentum and crest to deliver, finally, the benefits that AI has long promised.




