If you were looking to modernize an industry, O&G is ripe for disruption. There are really only two major players, both with awful legacy software. We spent $300k last year to acquire a single seat on one piece of software. We spent another $200k on two seats of another piece of software for a year.
These applications are total garbage too.
The only way to "modernize" Oil and Gas that is compatible with a future for the planet is to shut it down.
The same process can be used with biochar, or potentially carbon from some future carbon capture method.
"Shut it down!" doesn't apply so well to the food supply right?
How do you think goods are transported? How do you think the material is processed?
I agree that humanity needs to find better alternatives, but we all have to be realistic.
The quality of the data is important. Potentially there is a lot of value here, especially for academics who might not have access to this kind of data. You may have multiple petabytes, but how much of that are you giving away?
I would also argue that 130TB is on the edge of what you can feasibly transfer and store without requiring some kind of complex setup. When you get into Petabytes you're really having to design a unique system just to store and access this data.
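To put rough numbers on that claim, here's a back-of-envelope calculation of transfer times at a couple of illustrative link speeds (the speeds and 70% sustained efficiency are assumptions, not anyone's actual setup):

```python
# Back-of-envelope: how long does it take to move 130 TB vs 1 PB
# over a dedicated link? Illustrative numbers only.

def transfer_days(size_tb: float, link_gbps: float, efficiency: float = 0.7) -> float:
    """Days to transfer `size_tb` terabytes over a `link_gbps` gigabit/s
    link at the given sustained efficiency (decimal TB)."""
    size_bits = size_tb * 1e12 * 8           # TB -> bits
    rate_bps = link_gbps * 1e9 * efficiency  # sustained bits/s
    return size_bits / rate_bps / 86400      # seconds -> days

print(f"130 TB over  1 Gbps: {transfer_days(130, 1):6.1f} days")
print(f"  1 PB over  1 Gbps: {transfer_days(1000, 1):6.1f} days")
print(f"  1 PB over 10 Gbps: {transfer_days(1000, 10):6.1f} days")
```

At these assumed rates, 130 TB is a couple of weeks of transfer on a 1 Gbps link, which is painful but feasible; a petabyte at the same rate is months, which is why at that scale you start designing bespoke storage and shipping drives instead.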
I think there are a lot of unknowns in terms of the capabilities and algorithms to go after in that market.
I’d thought of going back into it and developing some front end visualization software - but the amount of secrecy and magic sauce put me off.
What? There are many more than two major players, and every large contractor has written its own internal processing software. The best tools in their packages are reserved for internal use only, just as the majors did decades ago when they operated their own acquisition crews and had in-house processing staff.
Multi-nationals would do turkey-shoots to put new data in the hands of multiple contractors and let them have at it with their best processors and tools partly to see if processing shops had developed better tools but also to target talented processors for their own operations. It's pretty cut-throat out there.
If your company spent $300k on a single seat I would love to know what software they licensed. As an independent contractor for a couple of decades I have been able to license top software packages from top-line processing software companies for under $100k per seat. After you buy that seat you are only paying maintenance in successive years so your costs usually drop to around 20% of the cost of a new license. It covers patches and maintenance and entitles you to new versions on upgrade as long as you are current on your license. You can get a second-tier package for less than $60k + 20% annual maintenance. Some brand new packages I evaluated in the last 5 years debuted below $40k for a package that was full-featured and ready to go from field tapes to final deliverables. Your people must not do any evaluation at all.
The seismic software field is constantly changing.
>These applications are total garbage too.
Haha. I have seen a lot of this in my time in the industry. One thing that chaps most of our asses is that the larger software companies are tuned to the needs of those who hold the most software licenses so small shops are frequently ignored if they request new features, bug fixes, etc. in favor of the software provider adding some new whiz-bang feature for a large license holder.
A lot of the software packages available share the same roots. Several packages that I have personally evaluated are derived from a single public code base, with the only real difference being their GUIs. One may be more user-friendly; another sucks to deal with but has all the tools plus some custom gizmorithms; a third is almost a clone of the first but leaves out the modules most useful for land or marine data and doesn't allow VSP processing.
I know they are the same code base because I cornered the developers when I noticed errors common to all of them. I had documented a persistent bug (hey, now it's a feature!) in their software while also evaluating other providers' software, and in the process found two other packages that would produce the exact same output every time given identical input, even though the output was clearly wrong. In one of the packages, even the parameterization screen dialogs were nearly identical. Pretty unimaginative GUI coders for some of this stuff, but that is likely because the guys slapping the interface on this kludge don't actually understand the objectives behind what we are doing, which bits of information make or break the imaging effort, etc., because they are coders and not geophysicists.
But, overall, large contractors employ some of the brightest minds in geophysics, computer science, physics, mathematics, geology, etc. That is the main reason that we are able to squeeze old data to extract even more geology than we initially could when it was acquired. Algorithms have improved, hardware is up to the task of keeping everything straight as it gets hammered through the flows.
Yes, every shop has internal tools, but again, much of this is stuff cobbled together by non-software-developers. I remember at one point a major tool at one shop stopped working. It turned out they had hard-coded a network share as a temp folder, and the folder got removed.
Our $300k spend was on a completely loaded-out license from a company that rhymes with lumberge. I don't want to put the name since there was a pretty strict NDA involved.
Your big packages for interpretation are Petrel, Paradigm's Epos systems, and IHS Kingdom-SMT. They are all great packages but as you noted there are things about each one that the user will stumble into which end up making no sense and likely result from poorly coded features which should have been upgrades but ended up being kludges.
> rhymes with lumberge
Scumbagger? As a former employee I can tell you that the best day of my life outside of my marriage and the births of my kids was the day I opened my mail to find an offer letter from another company - which I promptly accepted. Scumbagger bought a great processing software provider a few years ago. Their software was full-featured and very user-friendly. It had some quirks and a lot of kludges but their software support was top-notch the best I have seen in the industry. After the buyout, the older support hands were laid off and the support was bureaucratized to the point where it was no longer worth it to report a bug or request support. A true shame. Once a company gets that large they become like that old saying about juggies on the field crews - if they can't fuck it up they shit on it.
I've also seen a lot of geo/petrophysicists and processors intentionally not want to help, because they know heavy automation endangers their jobs.
One could argue that existing software, even if legacy, works and gets the job done, so what is there to (significantly) gain from writing new (and improved) software?
Most positions I've seen do not look like they pay that well. You can make more as a processor in a shop on land and work a regular 8-5 job. On the boats you get a 12-hour tour for the duration and the only real perk that might make it worthwhile is the opportunity to visit foreign ports and dawdle during breaks.
Survey automation is a complex task, and the details depend on whether you are a marine crew or a land crew. Some things are easier in marine work due to fewer cultural constraints (buildings, highways, pipelines, etc.), but by the same token it is easier on land to locate and replace any sensor that fails without losing much data from that receiver location. For the best imaging you need to avoid introducing holes in your data coverage and correct anything that causes a data loss. Redundancy is a real thing out there.
The industry has morphed into one where many larger acquisition contractors have divested themselves of the ships needed to acquire the surveys, and they contract that now to custom acquisition crews. Everything went bare-bones and rawhides in the last downturn, and as we know, seismic exploration is one of the last things to recover after a bust.
Also, there is a shift in the industry from ownership of the survey equipment (sensors, recording systems, etc.) to rental of everything. Manufacturers build it all, rent it out for custom surveys, maintain it and service it all, train the equipment operators, etc. That cuts costs and makes acquisition a matter of retaining trained personnel for key positions and recruiting trainable people for the rest.
Unfortunately I have no way to verify the source.
Potentially increased extraction of fossil fuels is incompatible with the UK's climate obligations and not something that should be celebrated.
Much of the UK's reduction in grid carbon emissions since around 2010 is due to a coal -> renewables shift, rather than just coal -> gas. In fact, even gas-fired electricity production has begun to decline in the UK as more wind capacity comes online.
Total low-carbon (renewables + nuclear) production reached 56% share in 2018.
Around Europe we can see the countries that have low emissions are either heavy on nuclear or have good access to hydro, or both. The rest that are doing well are on gas. Those who shunned gas and nuclear and went all in with wind and solar are usually amongst the worst. We can expand renewables as much as we like but it's not going to be enough until we have a scalable way of storing the energy generated from it. If we want to reduce gas in order to cut emissions even further, it's going to have to be nuclear. However, if we're talking about the cost of nuclear, the cost of carbon capture and storage of gas generation also becomes an option.
Yes, there is significant day-to-day variance in renewables production. But renewables are a very significant and rapidly growing energy source in the UK, reaching 30% of total grid production in 2018, while both gas and coal are in decline.
In fact, just wind + solar has already exceeded the combined annual generation from the UK's entire nuclear fleet. And it's likely that in the next 1-2 years, wind alone will exceed nuclear production.
Daily variance is also declining over time as the wind turbine fleet becomes more geographically dispersed.
It's true that if the UK had never built gas power plants and still relied extensively on coal, then we'd be in a much worse situation. But if we had gone for gas alone, emissions would be far higher than with a gas+renewables mix. And energy security would be worse, leaving the UK vulnerable to fluctuating prices and potential gas shortages and supply interruptions as most natural gas is imported.
> "Those who shunned gas and nuclear and went all in with wind and solar are usually amongst the worst."
Well, we all know about Germany. The problem here is that not only are they relying on coal, but much of it is actually lignite (brown coal) - the dirtiest form of coal.
And much of the reason they are still so dependent on fossil fuels is not because of lack of renewables production, but because of transmission constraints between the north (where most of the wind production is) and the south (where the biggest demand centers are). This issue is being resolved over time.
You also seem to be ignoring countries like Denmark, Spain, and Portugal who have very successfully moved to wind and solar and now have very little dependence on coal.
They are much more heavily taxed than other companies.
>This means that the marginal tax rate on PRT paying fields is now 81% (fields not paying PRT pay a rate of 62%) https://en.wikipedia.org/wiki/Petroleum_Revenue_Tax
I guess the Empire finally colonised itself.
I just wanted to see some numbers or a nice PDF or two with a few seismic plots, and this delivered. Although I'm not so sure about the value of the word clouds, the Relinquishment reports are concise with pretty plots.
 gas! https://itportal.ogauthority.co.uk/web_files/gis/images/Word...
If you’re searching for logs on a producing well it might be useful.
But, I somewhat doubt the seismic data has much value if they’re giving it away for free.
It really has an infinite shelf life due to the processor's ability to periodically employ newer, faster algorithms on newer, faster hardware to produce products that, though they frequently show only marginal differences, are still marketable as new products or upgrades over old datasets.
Legacy data, due to the acquisition methods employed decades ago, cannot be replicated today. You will not get a permit to acquire airgun data today using the same energies they routinely used a few decades ago nor will you be able to use a broadband source like dynamite offshore. It was routine back in the 60's and 70's. The bandwidth of data today is different. New source types and improvements to old sources can help but the old data has tremendous value as a calibration. The penetration of energy for imaging the deepest events in the subsurface is so much better in old data due to the low frequency penetration characteristics of old sources (higher energy sources).
If you look around for free or publicly available seismic data there really isn't much and ten years ago there was almost none that was easy to find. Industry groups hoping to help newcomers learn by processing raw field data have always been beggars to the data holders. Licensing restrictions follow data everywhere and a lot of it comes with tight constraints on how it can be used and whether it can be published.
Most contractors hold tightly to their data because it doesn't matter how old it is, you can always squeeze it through another processing flow and output a brand new, improved product and offer that for sale to your existing and prospective clients. Old surveys get new names, they are merged with new data using match filters and cross-correlations and tied so that it is not possible to tell where the old data coverage ended and the new data began.
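For anyone curious what "matched" means here, below is a minimal sketch of the cross-correlation step on synthetic traces, using NumPy. This only recovers the time shift between two recordings of the same reflector sequence; a real match filter also corrects amplitude, phase, and bandwidth differences, so treat this as an illustration, not a processing flow:

```python
import numpy as np

def estimate_lag(old: np.ndarray, new: np.ndarray) -> int:
    """Return the sample shift that best aligns `old` with `new`,
    found as the peak of their full cross-correlation."""
    xc = np.correlate(new - new.mean(), old - old.mean(), mode="full")
    return int(np.argmax(xc)) - (len(old) - 1)

# Synthetic demo: the same random 'reflectivity' series, delayed by
# 7 samples and contaminated with a little noise in the new survey.
rng = np.random.default_rng(0)
old = rng.standard_normal(500)
new = np.roll(old, 7) + 0.1 * rng.standard_normal(500)

print(estimate_lag(old, new))  # recovers the 7-sample delay
```

Once the lag (and in practice a matching filter) is known, the old traces can be shifted and filtered onto the new survey's grid, which is why the seam between old and new coverage becomes invisible in the merged product.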
I started processing almost 30 years ago. Some of the data we processed then was already 20 years old. It served to help a client decide whether a new survey would offer any value to their exploration efforts by contributing a more detailed subsurface image. 2D was a great reconnaissance tool and still is today. By reprocessing some 2D data, a client can focus their 3D efforts on prospects where the potential for success is highest, thus cutting their costs. And if you think cost-cutting isn't a thing in the oil patch: seismic data processing is a loss leader for the big contractors.
I did some 4D seismic processing which involves acquisition of new data using the exact same parameters and processing flows as were used in the first survey. New data is then compared to older data so that operators can see the tell-tale changes in their reservoirs which indicate migration of fluids in the subsurface during production or fluid invasion during waterfloods or CO2 floods.
Old data never dies, nor does it lose its value. Like I mentioned above, new data gets matched to old, old data gets matched to new. Any time a survey is acquired for the first time in an unexplored area, that survey data becomes the ground-truth dataset. All future data will be compared to it for quality, bandwidth, signal-to-noise ratio, etc.
Geophysics, or seismic data processing, really is a "what do you want it to look like" operation. Once you know the acquisition geometry then you can determine everything you need to know to image the subsurface just by smashing it through enough algorithms to filter out all the geology-related attributes like formation velocities, amplitudes - especially anomalous amplitudes, formation thicknesses and their depths below the surface or the seafloor, fluid content, etc. It really is amazing what you can discover without once touching a rock today. It goes so far beyond imaging subsurface structures. I love this field of work.
I am pretty happy to see seismic data released for personal use. I will be digging through this to see what I can find.
They already have numerous surveys and lines throughout those areas.
My first job out of college I processed and archived several warehouses of data going back to some of the very first analog signals recorded, TI’s first digital tapes (named GSI at the time) and also digitized paper records from the 1920s. Great first job that exposed me to massive data and algorithms!
I worked on field crews back then, and one of our observers would fill out his paperwork the night before while he had a few beers and smoked all kinds of things. Then the next day he would just make quick notes if something ended up different. Too bad he wasn't always diligent about modifying his pre-written reports. The prevailing belief on the field crews was that someone would figure it all out in processing later, so if they didn't get it right during acquisition we could always fix it. Some of the best projects I worked on as a processor involved unraveling the chains of errors in documentation to improve imaging of old surveys.
That bit scared me. It is not always known where those massive pipes run through the land. I hope they left out any details that would pose a security threat or could be used in sabotage.
These pipelines are quite easily identified on-shore by surface markers in the UK. Also, like I did, folks would have noticed the fairly substantial earthworks that went on when they were being installed....and bided their time if they were to interfere with them. I remember the excavations in Scotland for these pipelines back in the late 70's and 80's. There's no big secret about them.
As to offshore, well your "terrorists" are going to need a fairly huge amount of resources to attack them...deep sea divers, vessels large enough to withstand North Sea storms etc.
In software, it is a really bad idea to hope that your code is so obscure that no attacker will find a security flaw.
IRL, on the other hand, unknown positions of assets are often critical (the uncertainty whether you know about all the adversary's nuclear silos being a huge part of the nuclear deterrence policy).
There are only so many resources you can throw at discovering underground pipelines.
A quick Google search will lead you down a huge rabbit hole. Here's a good start: https://www.youtube.com/watch?v=pL9q2lOZ1Fw There are huge flaws in all these sectors, even in cellular communication. Just watch some Black Hat and DEF CON conference talks.
But I assume that a pipe placed 5m underground on an unmarked path crossing a valley was done so for security as well as convenience. I will not apply to see that data, but if I were on a "Red Team", it would be 'useful' information to have.
I'm pretty sure the only reason they're opening this up now, is that it's now become so easy to get ground-penetrating radar satellites into space, that every state actor now knows where all everyone else's pipes are. They can't hide it, so they may as well give up trying.
Edit: only the ignition was accidental, the puncturing was intentional
Thanks to the badly designed Fixed-term Parliaments Act, the government is effectively stuck there: powerless, breaking records for scale of defeat, and utterly discredited. Thanks once again to Cameron, then, for letting the LibDems break something they didn't understand.
Popular measures have a chance, while the rest are ignored as the government fails to consider anything but Brexit, and can't even deliver that.
It would be unbelievable in House of Cards, or Yes, Minister.
Sadly the lesson is not to get rid of the politicians completely.
The message may not be to get rid of politicians, but perhaps that a major weakening of the party system is called for. Getting rid of career politicians might be a good move though.
Course I wouldn't expect any party with realistic chance of government to favour either those or PR. I can hope though... :)
I was reading that Shell UK is making a shift toward greener energy, which scares me, because big oil is NEVER green-friendly unless they're after something more sinister.
Unless it's a tiny PR stunt, R&D project or mandated by regulation, you should instead take it as a sign that for the first time green energy in a free market is competitive at scale in certain situations, even without accounting for externalities. That's fantastic news.
Solar and wind are neither clean nor cheap, but they do have a lot of political backing.
I know it's not a popular thing to say here, but it's nonetheless true.
Actually producing them is one thing: they require huge areas of land, and they are unreliable, meaning they still need oil, gas, and coal for when they don't work. Solar uses rare-earth metals, they can only be used for one thing, and they currently only produce around 1% of our needs. They are great as supporting sources of energy, but they aren't actually solutions to our fundamental energy needs. And they aren't as cheap as claimed, since production, installation, and decommissioning normally aren't factored into the cost when you see comparisons.
The greentech industry is every bit as bottom-line focused as the oil industry, and it is so with an inferior product mostly pushed through by political lobbying, not on market terms.
Still, hopefully I'll be pleasantly surprised a few times.