Oil group Total hopes new supercomputer will help it find oil faster and more cheaply

PARIS (Reuters) - Energy major Total said its new supercomputer - now ranked the most powerful in the oil and gas sector worldwide - will enable its geologists to find oil faster, more cheaply and with a better success rate.

The Pangea III computer, built by IBM, will help process complex seismic data in the search for hydrocarbons 10 times faster than before, Total said on Tuesday.

The company’s computing power has been increased to 31.7 petaflops, from 6.7 petaflops in 2016 and 2.3 petaflops in 2013, Total said, adding that this is the equivalent of around 170,000 laptops combined.

A petaflop is a measure of computing power equal to one quadrillion (10^15) floating-point operations per second.
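
As a sanity check on the laptop comparison, the implied per-laptop rate works out to just under 200 gigaflops, a plausible figure for a modern machine. A minimal Python sketch of the arithmetic (the per-laptop figure is derived here, not reported in the article):

```python
# Rough arithmetic behind the "170,000 laptops" comparison.
# The per-laptop rate is derived, not a figure from the article.
PANGEA_III_PFLOPS = 31.7      # reported capacity, in petaflops
LAPTOP_COUNT = 170_000        # Total's stated equivalence

flops_per_laptop = PANGEA_III_PFLOPS * 1e15 / LAPTOP_COUNT
print(f"Implied per-laptop rate: {flops_per_laptop / 1e9:.0f} gigaflops")
# -> about 186 gigaflops per laptop
```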

The computer ranks number one among supercomputers in the oil and gas sector, and number 11 globally, according to the TOP500 list (www.top500.org), which ranks supercomputers twice a year.

 

Total’s European peer Eni’s HPC4 supercomputer is ranked number 17 on the same global TOP500 list.

Oil and gas companies, along with other industrial groups, are increasingly relying on powerful computers to process complex data faster. This enables them to cut costs while boosting productivity and the success rate of projects.

Total did not say how much it had invested in the new supercomputer.

The company’s senior vice president for exploration, Kevin McLachlan, told Reuters that 80% of Pangea III’s time would be dedicated to seismic imaging.

“We can do things much faster,” he said. “We are developing advanced imaging algorithms to give us much better images of the sub-surface in these complex domains and Pangea III will let us do it 10 times faster than we could before.”

Total said the new algorithms can process huge amounts of data more accurately and at a higher resolution.

The machine would also help locate hydrocarbons below ground more reliably, which is useful in complex environments where Total is exploring for oil trapped under salt, such as in Brazil, the Gulf of Mexico, Angola and the Eastern Mediterranean.
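
For context on what seismic imaging involves: a recorded seismic trace is often approximated, to first order, as the subsurface reflectivity convolved with a source wavelet, and imaging algorithms attempt to undo that process. Below is a minimal NumPy sketch of this textbook forward model; it is purely illustrative and bears no relation to Total’s proprietary imaging codes.

```python
import numpy as np

def ricker_wavelet(f_peak, dt, length):
    """Ricker ('Mexican hat') wavelet, a standard synthetic seismic source."""
    t = np.arange(-length / 2, length / 2, dt)
    a = (np.pi * f_peak * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

dt = 0.002                     # 2 ms sample interval
reflectivity = np.zeros(500)   # 1 s of two-way travel time
reflectivity[[100, 220, 400]] = [0.8, -0.5, 0.3]   # three reflectors

# Recorded trace ~= reflectivity convolved with the wavelet, plus noise.
wavelet = ricker_wavelet(f_peak=25.0, dt=dt, length=0.128)
trace = np.convolve(reflectivity, wavelet, mode="same")
trace += np.random.default_rng(0).normal(0.0, 0.02, trace.size)

print(f"Synthetic trace: {trace.size} samples, peak amplitude {trace.max():.2f}")
```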

 

McLachlan said he expected the increased computing power to improve Total’s success rate in exploration, thanks to the better imaging, as well as in oil well appraisal, development and drilling.

“What used to take a week, now takes us a day to process,” he said, adding that tens of millions of dollars of savings would be made on the oil wells as a direct result of obtaining better images.

As with all computers, crap in = crap out; this one will simply do it 10 times faster!

6 hours ago, Douglas Buckland said:

As with all computers, crap in = crap out; this one will simply do it 10 times faster!

Someone has to deal with all the goodness that the data crap generates. Bad crap (raw data) goes in, good crap (usable data that can be monetized) comes out...

That makes no sense at all! Are you telling me that a computer can take erroneous data and somehow make it 'good data'?

3 minutes ago, Douglas Buckland said:

That makes no sense at all! Are you telling me that a computer can take erroneous data and somehow make it 'good data'?

It makes total sense if you know what you want... I did mention raw data, and the end result - after the high-speed supercomputers finish analysing all the raw data - can then be used and deployed by engineers, geologists and geoscientists to solve the puzzles.

Right... a computer, simply because of the speed at which it can perform calculations, can take "Bad (raw) data" in and magically transform it into useful information.

Ahhh, I see the problem now. You have confused 'raw' data with 'bad' data, which is what I described.

That said, I have seen geophysicists run the raw data, be unhappy with the results, then change the 'given' transit velocities to come up with what they wanted. There was no scientific reason to change the velocities. This defeats any value the computer may have given them.

I have seen engineers use 'canned' programs with absolutely no idea of the assumptions made in the program.

Computers are only as good as those using them.
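
The velocity point above is easy to quantify: in the simplest constant-velocity depth conversion, reflector depth is v × t / 2 for a two-way travel time t, so the imaged depth scales linearly with whatever velocity the geophysicist chooses. A minimal sketch with illustrative numbers, none taken from the thread:

```python
# Depth conversion sensitivity: depth = v * t / 2 for two-way travel time t.
# All figures below are illustrative assumptions, not from the discussion.
two_way_time_s = 2.0                             # picked travel time, seconds
for velocity_mps in (2800.0, 3000.0, 3200.0):    # candidate velocities, m/s
    depth_m = velocity_mps * two_way_time_s / 2.0
    print(f"v = {velocity_mps:.0f} m/s -> reflector depth {depth_m:.0f} m")
# A ~7% change in assumed velocity (2800 -> 3000 m/s) shifts the imaged
# reflector by 200 m - enough to put a well target badly off.
```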
