Having just returned from the EAGE Digitalization conference in Stavanger, Norway, we found that one theme stood out above all others: collaboration. The future of energy data will depend less on individual technologies and more on how effectively we bring together people, data, and tools.
Historically, EAGE events have leaned towards geoscience, but the digitalization series is different by design, drawing a broader mix of engineers, data scientists, digital leaders, and software providers alongside geoscientists. This event marked the first time engineers joined the technical committee, and it featured a dedicated session tailored to their perspectives.
What was particularly striking in Stavanger was the progress that becomes possible when these groups work together on the same data challenges.
Engineers, perhaps more than anyone, depend heavily on accurate, real-time data to feed simulation models and support the high-value decisions that shape exploration, production, and long-term resource management. Geoscience teams generate much of the data that underpins these models, but for it to be useful, it must be structured, consistent and trustworthy.
This is where data readiness becomes critical. Information from multiple sources (seismic surveys, well logs, historical exploration reports, and operational records) must be brought together into coherent datasets. Early engagement between geoscience and engineering teams is essential.
By working together at the outset, teams can align on what constitutes reliable, decision-ready data and ensure that datasets are structured to support real operational needs. When that alignment is in place, organizations are far less likely to face costly model recalibrations caused by poor or inconsistent source data.
Collaboration is reshaping the energy data ecosystem
This kind of collaboration becomes even more important when we look at National Data Repositories (NDRs). Across the world, NDRs are becoming essential infrastructure for managing subsurface information, enabling regulators, operators and researchers to access consistent datasets.
Norway provides a powerful example. The country has moved further and faster than many others in building a national resource. The key reason for its success is simple but significant – industry-wide collaboration. Bringing all parties together has not been easy, but once buy-in was secured, progress accelerated. For countries looking to unlock the value of their subsurface data, whether for exploration, carbon storage or energy transition initiatives, the Norwegian model demonstrates what coordinated action can achieve.
Partnerships over proprietary platforms
Another strong theme emerging from Stavanger was the growing importance of partnerships. The scale of today’s data challenges means that no single company can solve them alone.
Energy companies often operate hundreds of software applications across exploration, production operations, asset management, regulatory compliance, and environmental monitoring. Managing these systems and the data they generate has become increasingly complex.
Traditionally, organizations attempted to solve these challenges through proprietary systems developed in-house. Increasingly, however, they recognize that this approach is no longer sustainable. Instead, we are seeing a shift toward collaborative technology ecosystems.
Several companies discussed partnerships with major technology providers like Meta and Microsoft to create shared infrastructure for data access and management. In some cases, even long-standing competitors such as Schlumberger and Halliburton are collaborating to make a client’s entire application system accessible through a unified Microsoft Azure environment.
At the same time, companies are looking outward to customizable software platforms with low-code or no-code tools, allowing them to configure workflows without extensive development. These platforms provide built-in governance, security, and compliance while enabling faster deployment of new capabilities.
The result is a shift away from isolated software ecosystems toward integrated digital platforms, where applications, analytics, and datasets can interact more easily.
AI is driving the need for data readiness
Underlying these developments is the growing need for AI-ready data.
AI promises enormous value for the energy sector, from automated interpretation of subsurface data to predictive maintenance and operational optimization. But these systems depend on high-quality, structured datasets.
Legacy archives present a major obstacle. Much of the industry's historical knowledge sits in scanned documents, unstructured reports and inconsistent file formats. Simply applying AI tools to these materials rarely produces reliable results; the data must first be cleaned, standardized, and structured.
This is where another important theme from Stavanger comes in – humans must remain firmly in the loop. While AI plays an increasingly important role in data processing, human expertise is still essential for interpreting legacy information, resolving inconsistencies, assigning meaningful metadata and validating datasets.
At Ovation we see this challenge regularly. To give just one example, records such as well logs and technical reports often contain inconsistent naming conventions or incomplete descriptions. Decoding these files requires not just technical processing but also practical industry knowledge. Our sector experience helps us correctly categorize and annotate these datasets so they become searchable, usable, and trustworthy.
Even the most advanced AI models won’t function effectively if there are gaps in the data or the data itself can’t be trusted.
OSDU and the push towards interoperable data
These challenges also explain why initiatives such as the Open Subsurface Data Universe (OSDU) are attracting attention.
OSDU is a global initiative to create a standardized cloud-based data platform to allow subsurface data to be stored, accessed, and shared using common schemas and Application Programming Interfaces (APIs).
By enabling applications from different vendors to work with the same datasets, it aims to reduce integration costs, accelerate decision-making, and improve operational efficiency. However, implementing OSDU is not straightforward. Legacy systems, evolving technical standards, competing vendor interests, and complex data governance requirements all present challenges. Organizations must also rethink how their data environments are structured and managed.
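To illustrate what "common schemas and APIs" looks like in practice, the sketch below builds a request body in the style of the OSDU Search service. The endpoint path and record "kind" follow publicly documented OSDU conventions, but exact values vary by deployment and schema version, so treat them as assumptions rather than a definitive contract:

```python
import json

# Illustrative route for the OSDU Search service; actual hosts and
# versions are deployment-specific.
SEARCH_PATH = "/api/search/v2/query"

def build_well_query(well_name: str, limit: int = 10) -> dict:
    """Build a search payload for wellbore master data by facility name.

    The 'kind' string selects a schema family; the wildcard version keeps
    the query compatible across minor schema revisions.
    """
    return {
        "kind": "osdu:wks:master-data--Wellbore:*",
        "query": f'data.FacilityName:"{well_name}"',
        "limit": limit,
    }

payload = build_well_query("NO 15/9-19")
print(json.dumps(payload, indent=2))
```

The point is not the specific fields but the pattern: once every vendor's application reads and writes records against the same kinds and query API, integration becomes a configuration problem rather than a bespoke engineering project.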
Even if the industry never reaches a fully unified platform, the collaborative work surrounding OSDU is already pushing companies toward more interoperable data architectures.
Looking back to move forward
One final point reinforced at Stavanger is that the energy transition will depend heavily on historical data.
For projects such as carbon capture, utilization and storage (CCUS), companies often need to analyze subsurface data that may be 40 years old or more. The challenge is that this information is frequently scattered across disks, spreadsheets, legacy databases, and paper archives.
Before it can support modern energy initiatives, that data must first be located, digitized, structured, and made accessible. In many cases, organizations are only now beginning to ask a fundamental question: do we still have access to the data we need?
If the conversations in Stavanger are any indication, the industry increasingly understands that the answer lies in collaboration, data readiness, and shared technology platforms.
The companies that succeed will be those that treat their data not as a by-product of operations, but as a strategic asset.