Generative AI has opened new possibilities — from mainstream tools like ChatGPT and Grok to specialized assistants tapping into company data. These systems promise efficiency, but they also come with risks: even the most polished AI answer can be factually off the mark.
At the Dig X Subsurface 2025 conference in Oslo in December, Daniel Dura, Data Scientist at Equinor, will unpack the challenges and share how the company is working to make AI conversations more reliable.
A Retrieval-Augmented Generation (RAG) AI assistant, designed for subsurface tasks, pulls data through APIs to answer queries like "Find me any information about a Whipstock service for wellbore x" or "How many packers can we run in a lower completion and still reach TD?"
But without careful evaluation, the AI might misread a wellbore name or misinterpret terms like “TD” (Total Depth), leading to risky operational errors.
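To make the failure mode concrete, here is a minimal sketch of a RAG query flow. The retrieval function, document format, and example corpus are all hypothetical illustrations, not Equinor's actual system; a real assistant would use embedding-based retrieval rather than keyword overlap.

```python
# Hypothetical, minimal RAG flow: retrieve relevant documents, then build
# the prompt a language model would answer from. Illustrative only.
from dataclasses import dataclass

@dataclass
class Document:
    doc_id: str
    text: str

def retrieve(query: str, corpus: list[Document], top_k: int = 2) -> list[Document]:
    """Rank documents by naive keyword overlap with the query."""
    terms = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda d: len(terms & set(d.text.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query: str, docs: list[Document]) -> str:
    """Assemble the context window the model sees before answering."""
    context = "\n".join(f"[{d.doc_id}] {d.text}" for d in docs)
    return f"Answer using only the context below.\n{context}\n\nQuestion: {query}"

# Toy corpus with a made-up wellbore name.
corpus = [
    Document("rep-001", "Whipstock service performed on wellbore 15/9-F-12 in 2021."),
    Document("rep-002", "Lower completion run with three packers to reach TD."),
]
query = "Whipstock service for wellbore 15/9-F-12"
prompt = build_prompt(query, retrieve(query, corpus))
```

If retrieval surfaces the wrong report, or the model conflates two similarly named wellbores, the answer can still read fluently while being operationally wrong; that is exactly the gap evaluation has to catch.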
The presentation dives into real-world examples of these hiccups, showing how AI can produce plausible but incorrect responses.
By working closely with subsurface experts, the team reviews AI outputs, builds a reference dataset of authentic queries, and applies evaluation metrics such as groundedness to measure whether answers are supported by the right source documents.
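As a rough intuition for what a groundedness metric measures, the toy function below scores the fraction of answer terms that also appear in the retrieved source text. This is a deliberately simplistic stand-in; production metrics (for example, LLM-as-judge faithfulness scoring) are considerably more nuanced.

```python
# Toy groundedness score: fraction of substantive answer tokens that appear
# in the retrieved sources. Illustrative sketch, not a production metric.
def groundedness(answer: str, sources: list[str]) -> float:
    # Ignore very short tokens ("in", "the", ...) as a crude stop-word filter.
    answer_terms = [t for t in answer.lower().split() if len(t) > 3]
    if not answer_terms:
        return 0.0
    source_terms = set(" ".join(sources).lower().split())
    supported = sum(1 for t in answer_terms if t in source_terms)
    return supported / len(answer_terms)

source = ["Lower completion run with three packers to reach TD."]
print(groundedness("three packers", source))          # fully supported -> 1.0
print(groundedness("five packers were run", source))  # partially supported
```

Scoring answers against the reference dataset of authentic queries then gives the team a repeatable way to flag plausible-but-unsupported responses before they reach users.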
The talk will also point to broader industry initiatives, including the Norwegian Offshore Directorate’s collaboration with the FORCE forum, aimed at building shared datasets. Together, these efforts represent a step toward more reliable and trustworthy subsurface AI.
Join us at Scandic Fornebu, December 03-04, 2025, to learn more. The program can be found on the conference website.
![Towards a trusted assistant](https://geo365.no/wp-content/uploads/2025/11/1000_AI-assistant-1.jpg)