
Big Oil’s Favorite Toy: Supercomputers (2018) - ganeumann
https://www.wsj.com/articles/big-oils-new-favorite-toy-supercomputers-1523358000?mod=rsswn
======
technofiend
Shell Oil had a Cray-1 and an nCUBE when I worked there in 1989. Chevron had a
Cray-1 and a raft of IBM mainframes in 1990. Chevron's DC was set up for
visitors so every cluster had a sign describing the computing power for that
system. Total compute power was _very_ important because it meant they could
do more analysis on seismic data to refine their oil and gas lease bids.

Chevron had so much seismic data they required (from memory) six robot tape
libraries. They were ganged together so the picker robots could hand tapes
between cabinets in case all drives in a given cabinet were in use. It was
cool as hell to watch the camera mounted right above the picker flying around
this dark wall of tapes to go grab one.

One of Shell's Quality training videos was about The Guy Who Lost The Tape;
seems a seismic data tape was mislabeled and lost, causing Shell issues with
bidding on a lease. Their Deming-style quality training was all about
preventing that sort of thing happening again. I dare say their data was more
valuable in total than the hardware.

~~~
beerandt
If you knew the effort and cost that go into large-scale nearshore and
offshore seismic and non-seismic surveys, you wouldn't think twice about
putting the value of the data well above the value of the hardware.

Offshore deepwater (~2500ft+) non-seismic surveys might cost 6-7 figures per
day to operate, and might fill up a hard drive every 1-3 days.

Depending on how many drives fit on a tape, the raw data could get very
expensive, very quickly, even before it's been processed, analyzed, etc.

~~~
technofiend
Absolutely, and of course the opportunity cost from missing out on a lucrative
oil field could buy a room full of compute.

~~~
MR4D
It's even worse than that.

I used to work with a guy who told me that the reason BP bought Amoco, and not
the other way around, is that years before, the Amoco team misread the seismic
_map_ (not the chart), and bid on the wrong piece of land. BP got the other
piece, and the difference was big enough that within a decade, one bought the
other.

 _THAT_ is how much the data is worth.

------
yardie
After research labs, big oil has been the second-largest client for
supercomputers for decades. It's even mentioned in the 90s film Hackers[0] as
"The Gibson."

[0]
[https://en.wikipedia.org/wiki/Hackers_(film)](https://en.wikipedia.org/wiki/Hackers_\(film\))

------
CharlesColeman
Aren't they Big Oil's _old_ favorite toy? In the 90s, a local university near
me had a Cray X-MP that was a gift from some oil company after they'd
decommissioned it.

~~~
baroffoos
Exactly. I have been hearing about this stuff for ages. Oil companies
optimized raping the land decades ago.

~~~
Jordanpomeroy
If only there were something we could do to stop the world’s oil dependency

------
petschge
When going to training classes on optimal I/O strategies for new
supercomputers, there is usually a bunch of people working for oil companies.
The simulations I do for plasma physics are considered "easy" there because
they need little input, generate only a few tens of terabytes of output, and
are by and large compute-bound.

------
tyfon
I was working as a hired SGI/Sun admin via ABB for Statoil in 1997 in
Stavanger. The HQ was packed with Ultra 60s with Creator3D or Elite3D cards.
Some even had both. We had to keep a few hundred of these running and they
were used to explore seismic data visually.

Those machines were pretty powerful for those days, almost at supercomputer
level. One cost more than a year's salary for me back then.

But oil has always loved computers.

------
mikorym
What kind of signal analysis do the oil companies do these days (like wavelet
transforms)?

My (in my opinion failed) HonsBSc project was on signal analysis of GC-MS (Gas
Chromatography coupled to Mass Spectrometry) signals.

If I had to do it again in 2019, then machine learning would be a much more
pertinent focus. However, without the computing power of today and ubiquitous
presence of programming libraries for that purpose, there are actually other
ways of approaching such data (like wavelets).

Wavelet transforms were invented quite a while ago [1] but I think the seismic
data analysts were some of the first to really investigate the applications of
that field. The other application is for compression (and loading over an
internet connection) [2].

[1]
[https://en.wikipedia.org/wiki/Haar_wavelet](https://en.wikipedia.org/wiki/Haar_wavelet)

[2]
[https://en.wikipedia.org/wiki/Lifting_scheme](https://en.wikipedia.org/wiki/Lifting_scheme)
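For a concrete sense of what the Haar transform in [1] does, here is a
minimal, purely illustrative single-level sketch in Python (nothing
seismic-specific is assumed): it splits a signal into pairwise averages
(low-frequency approximation) and pairwise differences (high-frequency
detail), the basic split-predict-update idea behind the lifting scheme in [2].

```python
# Single-level Haar wavelet transform (illustrative sketch only).
# Splits an even-length signal into approximation (pairwise averages)
# and detail (pairwise half-differences); the inverse restores it exactly.

def haar_forward(signal):
    approx = [(a + b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    detail = [(a - b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    return approx, detail

def haar_inverse(approx, detail):
    signal = []
    for s, d in zip(approx, detail):
        signal.extend([s + d, s - d])
    return signal

x = [4.0, 6.0, 10.0, 12.0, 8.0, 8.0, 0.0, 2.0]
a, d = haar_forward(x)
assert haar_inverse(a, d) == x  # perfect reconstruction
```

Applying the same split recursively to the approximation coefficients gives
the multi-level transform; small detail coefficients are what compression
schemes discard.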

~~~
Jordanpomeroy
Given a sparse data set, recorded by methods with known constraints and
limitations, generate a physical model that could conceivably produce
measurements matching the given data.
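That inverse-problem framing can be sketched with a toy example (purely
illustrative, nothing seismic about it: the forward operator `G`, the model
`m`, and the noise level are all made up for the demo). Given sparse, noisy
measurements produced by a known linear forward operator, least squares
recovers the model that best reproduces the data:

```python
import numpy as np

# Toy linear inverse problem: d = G @ m + noise, recover m from sparse data.
# In real seismic inversion the forward operator is nonlinear and vastly
# larger; this sketch only shows the "model that matches the data" idea.
rng = np.random.default_rng(0)

m_true = np.array([2.0, -1.0, 0.5])          # hypothetical model parameters
G = rng.normal(size=(8, 3))                  # forward operator, 8 measurements
d = G @ m_true + 0.01 * rng.normal(size=8)   # sparse, noisy observed data

# Least-squares estimate: the model whose predicted measurements best fit d.
m_est, *_ = np.linalg.lstsq(G, d, rcond=None)
assert np.allclose(m_est, m_true, atol=0.05)
```

The "known constraints and limitations" of the recording methods show up here
as the shape and conditioning of `G` and the noise term; real codes add
regularization to pick among the many models consistent with sparse data.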

------
dreamcompiler
Schlumberger was a major force in AI in the 1980s when I worked there. So was
Texas Instruments, which started as an oil well instrumentation company.
Computing has always been important to Big Oil.

------
mogadsheu
Former energy VC with the Norwegian state energy co here. Two of the companies
we invested into were HPC related—one of them developed node controller
solutions for parallel processing, the other developed a platform to analyze
massive amounts of subsurface data more quickly.

When a single exploratory well can cost as much as some IPOs, in some cases
with a <10% expected chance of success, there's plenty of love for
supercomputing in this industry.

------
kjs3
Geotechnical users were _always_ the second or third biggest users of
supercomputers after national security users. Big oil always had the most
astonishing data centers.

------
jhallenworld
New big data jobs in oil exploration?

Yup: [https://www.oilandgasjobsearch.com/Oil-and-Gas-Jobs/Search/Big-Data](https://www.oilandgasjobsearch.com/Oil-and-Gas-Jobs/Search/Big-Data)

------
riskneutral
Nothing new. Supercomputing has been used in oil and gas exploration for
decades.

~~~
jgalt212
Indeed, but interestingly enough the book, _The Prize: The Epic Quest for Oil,
Money & Power_, has only 9 mentions of the word _computer_ and zero of
_supercomputer_.

The Prize: The Epic Quest for Oil, Money & Power

[https://www.amazon.com/Prize-Epic-Quest-Money-Power/dp/1439110123](https://www.amazon.com/Prize-Epic-Quest-Money-Power/dp/1439110123)

~~~
EricE
From the author's bio: "Daniel Yergin is the author of the bestseller The
Quest: Energy, Security, and the Remaking of the Modern World, which has been
hailed as “a fascinating saga” about the “quest for sustainable resources of
energy,” and “the book you must read to understand the future of our economy
and our way of life” ..."

So it's more a book to persuade (feelings) than inform (facts) - no wonder
they aren't mentioned.

Know what you're reading.

~~~
jgalt212
I know I read a book that won the Pulitzer.

[https://www.pulitzer.org/winners/daniel-yergin](https://www.pulitzer.org/winners/daniel-yergin)

and I know the passage you cite from his bio is not related to the book I
cited.

------
jonbaer
Probably quite a few combinatorial problems not to ignore,
[https://phys.org/news/2018-09-quantum-mechanics-oil-
industry...](https://phys.org/news/2018-09-quantum-mechanics-oil-industry-
recovery.html)

