Unconventional Asset Management at the Speed of Science-Based AI: Accelerating Multi-Well and Multi-Bench Optimization

April 28, 2025

Applying traditional reservoir simulation technologies, and matching those calculations to actual field and well production, has always been a challenge in modern unconventional plays such as the Permian Basin. The duration of a single modeling exercise, and the amount of compute power needed to deliver it, preclude a dynamic use of modeling and simulation to guide decision-making at the pace of typical drilling programs involving many new wells each month. The feedback of recorded production volumes, often reflecting interactions between wells or between benches, needs to be used promptly to update predictions and better understand the underlying uncertainties.

Did you know that ExxonMobil teamed up with NobleAI, the leading developer of Science-Based AI (SBAI), to dramatically change that paradigm? The goal was to achieve orders-of-magnitude faster results with a beneficial increase in the diversity of models, all while remaining constrained by physics. To get the whole story, read the 2023 paper SPE-214818-MS by D. Gala et al., "AI-Powered, Lightning-Fast Production Modeling of Multi-Well and Multi-Bench Unconventional Development".

To whet your appetite, here is an overview of some key aspects covered in the SPE paper. Traditional AI training sets rely on actual borehole, formation stimulation and production data, which constrains their relevance to the dataset's geographical location and formation geology. The process used by NobleAI instead assembled representative pseudo-wells covering combinations of some 150 parameters, all constrained by physical and geological rules, so that the resulting training data can be used over a very wide geographical area. Efficiencies were built into various steps of the process to accelerate productivity and reduce the consumption of computational resources. The results compared favorably with the outcomes of traditional numerical simulations. The approach was also compared to a similar one using generic AI, which confirmed that, in the absence of scientific constraints, the results were populated with physically improbable models and, in some cases, hallucinations.
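To make the pseudo-well idea concrete, here is a minimal sketch of how a physics-constrained synthetic training set could be generated. The parameter names, ranges, and the single constraint shown are illustrative assumptions, not the actual design space or rules used in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical subset of the ~150 design parameters described in the paper.
PARAM_RANGES = {
    "lateral_length_ft":      (5_000, 15_000),
    "frac_half_length_ft":    (100, 600),
    "matrix_perm_nd":         (50, 1_000),     # nanodarcy
    "reservoir_pressure_psi": (3_000, 9_000),
    "well_spacing_ft":        (400, 1_500),
}

def sample_pseudo_well():
    """Draw one pseudo-well, resampling until simple physical rules hold."""
    while True:
        w = {k: rng.uniform(lo, hi) for k, (lo, hi) in PARAM_RANGES.items()}
        # Illustrative constraint: fracture half-length should not exceed
        # half the well spacing (avoids an unphysical fracture overlap).
        if 2 * w["frac_half_length_ft"] <= w["well_spacing_ft"]:
            return w

# Assemble a synthetic training table of pseudo-wells.
training_set = [sample_pseudo_well() for _ in range(10_000)]
print(training_set[0])
```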

The AI platform consists of three salient components, all of which played a critical role in delivering value. To address the challenges of working in a high-dimensional, uncertain setting, science-based machine learning models were deployed. Among the foundational techniques were physics-informed loss functions, used to penalize solutions that violated physical constraints, and state representation learning, used to learn reduced-order models from limited data. The physics included, among much else, the modeling of multiphase flow with nonlinear equations and complex well interference. The results were benchmarked against Consumer AI, demonstrating that SBAI not only reached better performance with minimal training, but also generalized better from the training set to the full test set.
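As an illustration of the physics-informed loss idea, the sketch below adds a penalty on the residual of a governing physical relation to an ordinary data-misfit term. The residual function and the weight `lam` are assumptions for illustration; the paper does not disclose the platform's actual formulation.

```python
import numpy as np

def physics_informed_loss(pred, observed, physics_residual, lam=0.1):
    """Data-misfit loss plus a penalty on violations of a physics residual.

    pred, observed   : arrays of predicted / recorded production rates
    physics_residual : array of residuals from a governing relation
                       (e.g. a material-balance or flow equation), which
                       should be ~0 for a physically consistent solution
    lam              : weight on the physics penalty (assumed value)
    """
    data_loss = np.mean((pred - observed) ** 2)
    physics_penalty = np.mean(physics_residual ** 2)
    return data_loss + lam * physics_penalty

# Toy usage with made-up numbers.
pred = np.array([100.0, 95.0, 90.0])
obs = np.array([102.0, 94.0, 91.0])
residual = np.array([0.5, -0.2, 0.1])   # e.g. mass-balance error per step
print(physics_informed_loss(pred, obs, residual))
```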

The second component of the AI platform is Explainable AI, with several interpretability tools such as Shapley Additive Explanations (SHAP), which reveal the importance and ranking of features, local values for each model, and the relative impact of each feature.
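For readers unfamiliar with SHAP, here is a minimal sketch using the open-source shap package on a toy surrogate. The paper does not specify which explainer or model type was used, so the random-forest model and feature names below are purely illustrative.

```python
import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor

# Toy surrogate: predict cumulative production from a few well parameters.
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 4))
feature_names = ["lateral_length", "frac_half_length", "perm", "spacing"]
y = 3 * X[:, 0] + 2 * X[:, 1] - X[:, 3] + 0.1 * rng.normal(size=500)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# TreeExplainer yields per-feature, per-sample attributions (local values);
# their mean absolute value gives a global importance ranking.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)
global_importance = np.abs(shap_values).mean(axis=0)
for name, imp in sorted(zip(feature_names, global_importance),
                        key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```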

And third, the scalable infrastructure component of the platform, built on MLOps and cloud infrastructure, is key to the usability of the final model. Capabilities relevant to this application (and others) include multi-modal data encoding and preprocessing (which reduces repetitive on-the-fly data conversions), a low-latency model training process (repeatable as often as needed without excessive resource utilization), and the logging and tracking of metrics (accelerating model evaluation and improvement loops).
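The paper does not name specific tooling, but the metric logging and tracking idea can be sketched in a framework-agnostic way, as below; the `RunTracker` class and its fields are hypothetical.

```python
import json
import time
from pathlib import Path

class RunTracker:
    """Minimal experiment tracker: logs params and per-step metrics to JSON
    so training runs can be compared without re-running them."""

    def __init__(self, run_name, out_dir="runs"):
        self.record = {"run": run_name, "started": time.time(),
                       "params": {}, "metrics": []}
        self.path = Path(out_dir) / f"{run_name}.json"
        self.path.parent.mkdir(parents=True, exist_ok=True)

    def log_params(self, **params):
        self.record["params"].update(params)

    def log_metric(self, name, value, step):
        self.record["metrics"].append({"name": name, "value": value, "step": step})

    def save(self):
        self.path.write_text(json.dumps(self.record, indent=2))

# Usage during a (hypothetical) proxy-model training loop.
tracker = RunTracker("proxy_v1")
tracker.log_params(learning_rate=1e-3, physics_weight=0.1)
for epoch in range(3):
    tracker.log_metric("val_loss", 1.0 / (epoch + 1), step=epoch)
tracker.save()
```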

The proxy model proved valuable in traditional workflows, including history matching, sensitivity studies, economics, and some optimization applications.
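As an example of how a fast proxy slots into such workflows, the sketch below runs a one-at-a-time sensitivity sweep. Here `proxy_predict` is a toy stand-in for a trained surrogate, and its coefficients and the parameter values are made up for illustration.

```python
import numpy as np

def proxy_predict(params):
    """Stand-in for a trained proxy model's prediction of cumulative oil (bbl).
    A real surrogate would replace this toy analytic expression."""
    return (50.0 * params["lateral_length_ft"]
            + 300.0 * params["frac_half_length_ft"]
            - 40.0 * params["well_spacing_ft"])

base = {"lateral_length_ft": 10_000, "frac_half_length_ft": 300,
        "well_spacing_ft": 900}

# One-at-a-time sensitivity: perturb each input +/-10% and record the
# change in the proxy's output; a fast proxy makes such sweeps near-instant.
for name, value in base.items():
    lo, hi = dict(base), dict(base)
    lo[name], hi[name] = 0.9 * value, 1.1 * value
    delta = proxy_predict(hi) - proxy_predict(lo)
    print(f"{name}: +/-10% changes predicted cumulative oil by {delta:,.0f} bbl")
```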

The case studies encompassed real unconventional assets. In all cases, traditional simulations were also run to provide a robust comparison spanning 40 years of production for oil, water and gas. The match proved decent and accurately captured real-world effects such as fracture interactions when wells are spaced too closely, something the Consumer AI failed to register.

The overall value of this approach lies in its speed and breadth of application. Ultra-precise methods that take months to deliver a result for a single set of model assumptions are of little use in the fast-evolving environment of modern unconventional plays. Being able to run a single, efficient modeling process on any given set of data, in orders of magnitude less time, opens the way for reservoir engineers and drillers to collaborate proactively on field management, with far greater insight into the impact of future events and their bearing on the economics and integrity of the assets.
