Deep learning, stand-out UX, multiplying innovation, orchestration layers that lower costs and save time, a shift to the cloud—our over-caffeinated Petro.ai developers are simplifying the technology interface to shoulder more of the O&G computational burden for over-tasked engineers and scientists.
Like any good starship, the Petro.ai Platform consists of advanced systems, intricate network corridors, and state-of-the-art fabrication. Built for everything from USS Enterprise-size* enterprises to Nebula-class cruiser* companies, the Petro.ai platform architecture employs an agile combination of APIs, data integration, Kubernetes, and cloud-based functionality.
Kubernetes, our helmsman*
“Petro.ai is an enterprise-class data science platform,” Dr. Derek Ruths, Chief Data Scientist of Petro.ai, explains. “It consists of many applications and services that have to work together. To make that happen, there must be some coordinating cloud layer. For us, Kubernetes makes that orchestration possible.
“We’re running a data science platform for oil and gas, which first and foremost has to be able to do really hard computational work. If all that hard work happens in one place, it can completely bog down or wipe out the service. The world-class way to fix that is to use a distributed architecture, which means we have multiple systems running at the same time in separate locations. When somebody fires off a well-forecasting job, it happens in a different place from the piece that’s serving the website or running a dashboard.
“So, when an oil and gas company is thinking about a platform, they need that platform to scale. To bring more users onto it. To run more workflows. To forecast more wells. To run more models. To launch more dashboards.
“We’ve built our system to scale natively. For Petro.ai, it is profoundly easy to give a client more resources when they need them. That means the architecture can scale, and the client can pay for a tier and resources that fit their needs. If you only have one or two users and you’re running a couple of jobs, you’ll pay for just that. If you need massive infrastructure, with hundreds of users running thousands of jobs a day, we can scale up to that. Either way, the client pays for only the resources they need. Easy to deploy, easy to maintain, easy to keep healthy at all times.
“We’ve built our development disciplines around making sure we’re hyper-focused on clients and able to respond quickly to the insights we receive from customers and our own internal users. We’re developing a track record for taking feedback and turning a concept into reality in a day or two.”
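As a rough picture of the kind of orchestration Ruths describes, a Kubernetes Deployment paired with a HorizontalPodAutoscaler lets a compute-heavy service grow and shrink with demand. The manifest below is purely illustrative: the names, image, and resource limits are invented for this sketch, not Petro.ai's actual configuration.

```yaml
# Hypothetical example only -- names, image, and limits are illustrative.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: forecast-worker
spec:
  replicas: 2
  selector:
    matchLabels:
      app: forecast-worker
  template:
    metadata:
      labels:
        app: forecast-worker
    spec:
      containers:
        - name: worker
          image: example/forecast-worker:latest
          resources:
            requests: { cpu: "1", memory: 2Gi }
            limits:   { cpu: "2", memory: 4Gi }
---
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: forecast-worker
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: forecast-worker
  minReplicas: 2
  maxReplicas: 20
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```

Under a setup like this, heavy forecasting work runs in its own pods, separate from whatever serves the website or dashboards, and the cluster adds workers automatically when CPU load climbs.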
Data, our tapestry*
Data looms large for any company trying to increase productivity in this time and space. IT management of a database or data stack is time-consuming and compartmentalizing. Forbes observes, “Those terms (data base and data stack) are becoming increasingly obsolete because firms are realizing that their business operates on what is now often referred to as a ‘data fabric’ i.e., an interwoven cloth made up of… data threads that come in many colors, strengths, thicknesses, textures and types.”
Petro.ai operates using a densely woven ‘data fabric.’ Charles Connell, VP of Product, describes this resilient new fabric: “In the app we use MongoDB as our database, which is built for big data. We use only a small portion of the capability of Mongo, a database employed by companies like Visa that process millions of transactions. We’re able to integrate O&G data types that are core operational data, unstructured data from unmanaged sources, and the data derived from our AI computations.
“We’re often asked, ‘How much data can you put into 3D Petro.ai Earth and visualize?’ We take a very similar approach to Google Earth and Google Maps in how we generate tiles of data. It’s very performant even with large numbers of wells loaded.
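The tiling approach Connell mentions is the idea behind standard web-map “slippy” tiles: the map is cut into a grid that doubles in resolution at each zoom level, so the viewer only loads the tiles currently in view. A minimal sketch of the standard tile-coordinate math (the well location is an invented example, and this is not Petro.ai's internal code):

```python
import math

def lonlat_to_tile(lon: float, lat: float, zoom: int) -> tuple[int, int]:
    """Map a longitude/latitude to standard web-map (slippy) tile x/y at a zoom level."""
    n = 2 ** zoom                       # tiles per axis at this zoom level
    x = int((lon + 180.0) / 360.0 * n)  # longitude maps linearly onto x
    lat_rad = math.radians(lat)         # latitude uses the Web Mercator projection
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# A hypothetical well near Midland, TX (approx. lon -102.08, lat 31.99) at zoom 10:
print(lonlat_to_tile(-102.08, 31.99, 10))
```

Because each tile is fetched independently, the client never has to load every well at once; doubling the zoom level quadruples the number of tiles but not the amount of data on screen.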
“Finally, we have unique environments for every client, so their data is never mixed with any other client’s data. We use enterprise-grade cloud architecture for a best-in-class cloud deployment.”
API, our communicator*
Connell adds, “The API (Application Programming Interface) that the end user accesses is the same API that we use internally to build apps. It’s not something we built and maintain specifically for end users. Our clients are using the same API that we do, so it’s a very robust way of accessing all of the data as well as the calculations inside of Petro.ai.
“Using the API, a client can execute calculations in Petro.ai. They could kick off a forecast scenario, decline a group of wells, and get those results without ever having to log in to Petro.ai; they could just use the API.
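A script kicking off a forecast job through such an API might look like the sketch below. The endpoint path, field names, and authentication header are all invented for illustration; the actual Petro.ai API routes and payload schema are not documented here, and the network call is left commented out since it needs a live server and a real token.

```python
import json
from urllib import request

# Hypothetical base URL -- not a real Petro.ai endpoint.
BASE_URL = "https://example.petro.ai/api"

def build_forecast_request(well_ids: list[str], model: str = "arps") -> dict:
    """Assemble a forecast-job request body for a group of wells (invented schema)."""
    return {"wells": well_ids, "model": model}

payload = build_forecast_request(["well-001", "well-002"])
req = request.Request(
    f"{BASE_URL}/forecasts",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer <api-token>",  # placeholder credential
    },
    method="POST",
)
# response = request.urlopen(req)  # left commented: requires a live server
```

The point of the quote stands regardless of the exact route names: because clients hit the same API the product itself is built on, a script like this can trigger the same forecasting calculation the web app would.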
“Data output is managed in two ways. The first is through the API. The data generated in Petro.ai is generally saved to the Mongo database. Mongo itself just provides a very amorphous place to store data; if you put data into Mongo without a specific data model, it’s very difficult to get that data back out.
“Some of the early failures that oil and gas companies had with big data were with databases like Mongo or Hadoop: it was easy to write data in and difficult to get data out. In Petro.ai we use Mongo as part of the architecture, but we’ve done a lot of work to build a robust data model that sits on top of Mongo, so you can write data and also get it back out through the API.
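The idea of a data model sitting on top of a schemaless store can be sketched in a few lines: documents are checked against a declared shape before they are written, so they can always be queried back out predictably. The collection shape and field names below are invented for illustration and are not Petro.ai's actual schema.

```python
# A toy "data model" layer over a schemaless document store.
# Field names and types are invented for illustration.
WELL_HEADER_SCHEMA = {
    "wellId": str,
    "lat": float,
    "lon": float,
    "spudDate": str,  # ISO 8601 date string
}

def validate(doc: dict, schema: dict) -> dict:
    """Reject documents that don't match the model before they reach storage."""
    unknown = set(doc) - set(schema)
    if unknown:
        raise ValueError(f"unknown fields: {sorted(unknown)}")
    for field, ftype in schema.items():
        if field not in doc:
            raise ValueError(f"missing field: {field}")
        if not isinstance(doc[field], ftype):
            raise TypeError(f"{field} must be {ftype.__name__}")
    return doc  # safe to insert -- and just as easy to query back out

doc = validate(
    {"wellId": "42-123-00001", "lat": 31.99, "lon": -102.08,
     "spudDate": "2019-06-01"},
    WELL_HEADER_SCHEMA,
)
```

A schemaless database will happily accept any document; a gate like this is what keeps “easy to write in” from turning into “impossible to get out.”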
“The other way to get data out is through the CSV export function. If you’re in the app, you can click a button and download what you’re working on as a CSV. From there you can analyze it in Excel or any other business intelligence tool.”
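Because the export is plain CSV, standard tooling consumes it directly; a few lines of Python's stdlib are enough to aggregate it. The column names in this sample are invented for illustration, not Petro.ai's actual export format.

```python
import csv
import io

# Stand-in for a downloaded export; column names are invented for illustration.
export = io.StringIO(
    "wellId,month,oil_bbl\n"
    "42-123-00001,2023-01,1500\n"
    "42-123-00001,2023-02,1340\n"
)

rows = list(csv.DictReader(export))                 # one dict per data row
total_oil = sum(int(r["oil_bbl"]) for r in rows)    # aggregate a column
print(total_oil)  # 2840
```

The same file opens unchanged in Excel, Spotfire, Power BI, or any other tool that reads CSV, which is the point of offering the export alongside the API.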
*Star Trek trivia
- Engage Development Space. Pike said engage. Kirk uses engage to order warp speed in "The Corbomite Maneuver."
- USS Enterprise is a Constitution-class Federation vessel that Kirk captained from 2286-93.
- Nebula Class was a Federation cruiser used by Starfleet in the 24th century.
- Helmsman. Starfleet position aboard 23rd century Federation starships. Also, the most proficient shuttle pilot.
- Tapestry. 15th episode of the 6th season of Star Trek: The Next Generation
- Communicator. The away mission walkie-talkie inspired Motorola in 1996 to name the first flip phone StarTAC.