
DSC Webinar Series: Data as the New Oil: Producing Value in the Oil and Gas Industry

Oil and gas exploration and production activities generate large amounts of data from sensors, logistics, business operations and more. Given the data volume, variety and velocity, gaining actionable and relevant insights from the data is challenging. Learn about these challenges and how to address them by leveraging big data technologies in this webinar.

During the webinar we will dive deep into approaches for predicting drilling equipment function and failure, a key step towards zero unplanned downtime. In the process of drilling wells, non-productive time due to drilling equipment failure can be expensive. We will highlight how the Pivotal Data Labs team uses big data technologies to build models for predicting drilling equipment function and failure. Models such as these can be used to build essential early warning systems to reduce costs and minimize unplanned downtime.
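The abstract doesn't describe Pivotal's actual models, but the early-warning idea can be illustrated with a minimal, hypothetical sketch: summarize windows of sensor readings into features, learn per-feature alarm thresholds from healthy runs, and flag any window that exceeds them. All sensor values, window sizes, and thresholds below are invented for illustration.

```python
from statistics import mean, stdev

def window_features(readings, size):
    """Slide a fixed-size, non-overlapping window over a sensor
    trace and summarize each window as (mean, max)."""
    return [(mean(readings[i:i + size]), max(readings[i:i + size]))
            for i in range(0, len(readings) - size + 1, size)]

def fit_thresholds(healthy_windows, k=3.0):
    """Learn an alarm threshold per feature from windows of
    known-healthy operation: feature mean + k standard deviations."""
    return [mean(feat) + k * stdev(feat)
            for feat in zip(*healthy_windows)]

def early_warning(window, thresholds):
    """Flag a window if any feature exceeds its learned threshold."""
    return any(f > t for f, t in zip(window, thresholds))

# Hypothetical vibration traces: a steady healthy run,
# then a run where the signal is degrading toward failure.
healthy   = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1]
degrading = [1.0, 1.2, 1.8, 2.5, 3.1, 3.9, 4.4, 5.0]

thresholds = fit_thresholds(window_features(healthy, 4))
alarms = [early_warning(w, thresholds)
          for w in window_features(degrading, 4)]
print(alarms)  # both degrading windows trip the alarm
```

A production system would of course use richer features, many sensors, and a trained classifier rather than fixed thresholds; this only shows the shape of the pipeline.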


Comment by Michael Clayton on January 22, 2015 at 7:55pm

Great presentation: relevant, and it surprised me enough that I plan to watch it again later.

We have a SLOWER version of this kind of effort in semiconductor wafer manufacturing, which is long, slow batch processing, but each machine has hundreds of sensors streaming data during the minutes to hours of processing the wafers. We mix and match real-time and online data collection, formatting for databases and local alarms, and capturing operator logs. We have used Spotfire in our industry, just as oil companies have used it for years, to avoid dealing with the IT folks for visualizations (engineer-managers own the tool). Also SAS JMP visualizations and modeling. Nowadays, R and Python help with data integration, data formatting, and statistical analysis at lower cost, IF you have the talent to use those tools in operations.

Machine learning ideas were first used in the 1980s for "health monitoring" from sensor streams for our process tools (hundreds of tools, hundreds of sensors per tool, but stop-and-go operation as wafers are loaded, processed, and unloaded; many hundreds of wafers per hour over hundreds of discrete steps, and many weeks total to get the 30 or 40 layers processed on each wafer).
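As a rough illustration of that kind of sensor-stream health monitoring (a generic sketch, not the commenter's actual system), an online detector can keep a running mean and variance per channel with Welford's algorithm and flag readings whose z-score is extreme. The chamber-pressure values and limits below are invented.

```python
import math

class SensorMonitor:
    """Online health monitor for one sensor channel: maintains a
    running mean/variance (Welford's algorithm) and flags readings
    whose z-score exceeds a limit after a warm-up period."""

    def __init__(self, z_limit=4.0, warmup=10):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0          # sum of squared deviations
        self.z_limit = z_limit
        self.warmup = warmup

    def update(self, x):
        """Return True if x looks anomalous, then fold it into
        the running statistics."""
        anomalous = False
        if self.n >= self.warmup:
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(x - self.mean) / std > self.z_limit:
                anomalous = True
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous

# Hypothetical chamber-pressure stream: stable, then a spike.
stream = [10.0, 10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1,
          9.9, 10.0, 10.1, 9.9, 18.0, 10.0]
monitor = SensorMonitor()
flags = [monitor.update(x) for x in stream]
print(flags.index(True))  # prints 12: the spike is flagged
```

In a fab one instance would run per sensor per tool, feeding local alarms and the database formatting step the comment describes.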

We did get help from chemical engineers like Dr. John MacGregor, who worked with continuous-processing industries like oil refining and papermaking in the 90s. Can't wait for the results webinar as drilling experience accrues... if they will share those results.


© 2019 Data Science Central