OPEC is no longer an Apex Predator

2 hours ago, Enthalpic said:

The markets work to maximize profits for the rich, not necessarily saving the world or benefiting mankind.

"Why bother?" because what we are doing is screwing everything up!  Even if you don't believe in anthropogenic climate change, you should at least accept that burning stuff causes pollution. If not, please operate a charcoal hibachi indoors.

As I said, let regulators play referee. 


Just now, BenFranklin'sSpectacles said:

As I said, let regulators play referee. 

Until a government is elected that fires reputable regulators and censors information for their corporate masters.


9 minutes ago, Enthalpic said:

Until a government is elected that fires reputable regulators and censors information for their corporate masters.

Q:  How could such a thing happen?
A:  The People let it happen. 

C'est la vie in an idiocracy.  All we can do is protect ourselves and those who will listen to us. 


11 hours ago, BenFranklin'sSpectacles said:

If oil shot to $200/bbl, electricity prices - and the renewables that depend on those prices - would decline.  It sounds counterintuitive, but that's how economies of scale work in electricity markets. 

First, we need to talk about the oil subsidies you mentioned, which are emotionally compelling and irrelevant.  Here's why:  the significant links between oil prices and electricity prices have been severed: 
1)  Electricity generation from oil was replaced by coal, nuclear, and natural gas after the first oil crisis.  Oil-fired generation is now irrelevant.  Why did this happen?  Because coal, nuclear, and natural gas are cheap & plentiful. 
2)  Natural gas prices recently decoupled from oil prices.
This implies that oil subsidies have little/nothing to do with electricity.  I.e. oil subsidies are not germane to this conversation. 

The remaining question is, "What effect - if any - would rising oil prices have on electricity prices?"  The only link is transportation, which would electrify faster.  Of note: that's not an immediate price spike like we see in oil; it's a minor effect that plays out over decades.  To answer our question in more detail, we must know something about electricity markets.  Coal, nuclear, and natural gas benefit from economies of scale.  I.e. using more of them substantially decreases their cost:
1)  R&D, overhead, etc are spread over larger fleets of assets
2)  Existing assets achieve higher average capacity factors
3)  Utilities construct larger, more advanced (read: cheaper) plants
4)  Greater potential profits drive innovation
5)  Larger fleets and interconnected grids decrease the relative cost of reserve generation
6)  Etc.

You get the picture.  We have decades/centuries of natural gas reserves, centuries of coal reserves, and millennia of nuclear reserves.  Thus, unlike oil, electricity has no foreseeable supply crunch to spike prices.  When electrical base-load increases, economies of scale dominate.  Base-load costs will decline. 

Nuclear would be the greatest beneficiary of high oil prices.  The existing nuclear fleet is a gaggle of primitive, expensive, relatively dangerous, one-of-a-kind plants.  Construction of nuclear halted before anything approximating standardization, economies of scale, or advanced technology could be implemented.  In a world with rising electricity demand, unprecedentedly large markets, smooth demand curves, and cheap storage, nuclear would hit its stride. 

To illustrate that point: where there's sufficient base-load, you can run 4-8 identical 1400+ MW reactors at a single site.  These reactors can burn advanced fuel that goes 2-5 years between refuelings, enabling 95+% capacity factors.  In that scenario, the current $0.02/kWh might drop to $0.01-$0.015/kWh - remarkably close to the maligned "too cheap to meter" prediction.  Throw in advanced reactor designs, and it gets even cheaper.  That's reliable, base-load generation without the inconvenience, additional grid investment, backup generation, and land that renewables require.  The bottom line: where there's sufficient base-load, renewables can't compete.  Without government intervention in both the nuclear and renewable markets, renewables would remain a niche application. 
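A back-of-envelope sketch of that claim (every dollar figure and capacity factor below is an illustrative assumption, not a sourced cost estimate) - spreading fixed costs over a standardized, high-capacity-factor fleet is what moves the per-kWh number:

```python
# Back-of-envelope: fixed costs spread over actual annual generation.
# All dollar figures and capacity factors are illustrative assumptions.

def cost_per_kwh(annual_fixed_cost, variable_cost_per_kwh, capacity_mw, capacity_factor):
    """Per-kWh cost when fixed costs are spread over actual annual output."""
    kwh_per_year = capacity_mw * 1000 * 8760 * capacity_factor
    return annual_fixed_cost / kwh_per_year + variable_cost_per_kwh

# One-of-a-kind legacy plant: high fixed costs, 85% capacity factor.
legacy = cost_per_kwh(180e6, 0.005, capacity_mw=1400, capacity_factor=0.85)

# Standardized multi-unit site with 2-5 year fuel cycles: 95% capacity
# factor, lower per-reactor fixed costs from shared overhead.
standardized = cost_per_kwh(100e6, 0.005, capacity_mw=1400, capacity_factor=0.95)

print(f"legacy: ${legacy:.3f}/kWh, standardized: ${standardized:.3f}/kWh")
```

With these assumed inputs, the legacy plant lands near $0.022/kWh and the standardized site near $0.014/kWh - the direction of the effect, not a cost projection.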

Will we see the requisite 24/7 base-load?  Yes.  The combination of cheap storage, electrified transportation, a world that never sleeps, and advancing technology will level the demand curve.  There are three major categories of load fluctuation; all three will be affected: 
1)  Seasonal.  E.g. the US Midwest uses electric air conditioners in the summer but natural gas/propane heating in the winter, causing a summer spike.  The spread of heat pumps, efficient buildings, etc. will dampen the seasonal demand curve. 
2)  Weekly.  A culture that rested on Sunday used less electricity on Sunday.  We now live in a world of automated factories running 24/7, cities that never sleep, non-standard work schedules, and improving efficiency.  Weekend demand will rise to match weekday demand. 
3)  Daily.  Afternoon cooling loads, evening cooking, morning showers, businesses turning the lights on, etc.  The lowest demand always occurs at night, with various fluctuations throughout the day.  With the aforementioned changes, daily fluctuations will be dampened. 

Now let's add grid storage, which is being specified to handle 1-4 hour peaks.  This is enough to affect daily peaks.  Natural gas peaking plants merely shave peaks.  Storage shaves peaks and shifts that demand to the trough.  I.e. storage raises the floor, creating more opportunity for base-load generation.  That benefits coal, nuclear, and natural gas far more than it benefits renewables. 
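A minimal sketch of that shave-and-fill effect - the 24-hour demand profile, the cap, and the storage size are all invented for illustration; the point is only that storage lowers the peak and raises the floor:

```python
# Storage shaves demand above a cap and recharges during the lowest-demand
# hours, raising the floor. Demand profile (GW, hourly) is invented.

demand = [30, 28, 27, 27, 28, 32, 38, 44, 46, 45, 44, 45,
          46, 47, 48, 50, 52, 54, 52, 48, 44, 40, 36, 32]
cap = 48  # storage discharges to clip demand above this level (GW)

shaved = [min(d, cap) for d in demand]
energy = sum(d - cap for d in demand if d > cap)  # GWh moved into the trough

# Recharge the storage evenly across the 8 lowest-demand hours.
trough = sorted(range(24), key=lambda h: shaved[h])[:8]
for h in trough:
    shaved[h] += energy / 8

print(f"peak {max(demand)} -> {max(shaved)} GW, floor {min(demand)} -> {min(shaved)} GW")
```

With these invented numbers the peak drops from 54 to 48 GW and the floor rises from 27 to 29 GW, while total energy served is unchanged - exactly the "raise the floor" effect that favors base-load generation.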

Now let's add electrified transportation, which has two components:
1)  Commercial vehicles, which might run 24/7.  Some commercial vehicle schedules could be optimized to charge during demand troughs throughout the day - e.g. autonomous taxis.  Others could plug into a depot at night, filling the nightly trough.  There's no need to build expensive, risky infrastructure to coordinate charging.  You simply follow a schedule.  That gives businesses the predictability their operations demand. 
2)  Consumer vehicles.  These will fill whatever nightly trough remains. 
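As a sketch of the fixed-schedule idea (fleet size, charger power, and per-truck energy needs are all assumed for illustration), a depot can stagger charging waves across known overnight trough hours without any networked coordination:

```python
# Fixed depot-charging schedule: stagger the fleet across overnight trough
# hours instead of charging everything at once. All numbers are assumptions.

fleet_size = 200       # trucks at the depot
kwh_per_truck = 300    # nightly energy need per truck
charger_kw = 150       # per-truck charger power
trough_hours = 8       # known low-demand hours (e.g. 22:00-06:00)

hours_per_truck = kwh_per_truck / charger_kw   # 2 h on the charger
waves = int(trough_hours // hours_per_truck)   # 4 staggered waves
trucks_per_wave = fleet_size // waves          # 50 trucks at a time

staggered_peak_mw = trucks_per_wave * charger_kw / 1000
all_at_once_mw = fleet_size * charger_kw / 1000
print(f"staggered peak: {staggered_peak_mw} MW vs all-at-once: {all_at_once_mw} MW")
```

Under these assumptions, staggering cuts the depot's peak draw from 30 MW to 7.5 MW with nothing more sophisticated than a timetable.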

When you add all of this up, you get a remarkably smooth demand curve - and all that without the expense, privacy invasion, and risk of networked devices.  The smoothed demand curve favors coal and nuclear.  I'll be surprised if renewables make headway beyond the most favorable niches. 

Well written post and a lot of truth in there. But I think you missed my point - had oil not been cheap, then renewables and others like nuclear would have been further up the tech development ladder. This would likely have meant renewables and nuclear both would have a larger share of electricity generation and would therefore achieve the economies of scale you talk about... 

Another example of indirect subsidies is the infrastructure that supports it - for example, the port in my city: some 30 years ago they undertook upgrades to be able to accept larger vessels (the work involved upgrading the port facilities and dredging the channel). The cost for this was spread over all the shipping activities, and the Owner of the port put up the financing. Now, the Owner is ultimately the state, so essentially the state decided to subsidise a coal-fired power plant. But that subsidy is not counted in any statistics. 

I am not an engineer, but I do work in the offshore oil and gas industry. And I know first hand that there are several indirect subsidies even today. 

I do not for a second believe that our societies can do without fossil fuels for at least my lifetime, but I believe that renewables will continue to grow and grow more than most people think... 

 


4 hours ago, Rasmus Jorgensen said:

Well written post and a lot of truth in there. But I think you missed my point - had oil not been cheap, then renewables and others like nuclear would have been further up the tech development ladder. This would likely have meant renewables and nuclear both would have a larger share of electricity generation and would therefore achieve the economies of scale you talk about...  

Another example of indirect subsidies is the infrastructure that supports it - for example, the port in my city: some 30 years ago they undertook upgrades to be able to accept larger vessels (the work involved upgrading the port facilities and dredging the channel). The cost for this was spread over all the shipping activities, and the Owner of the port put up the financing. Now, the Owner is ultimately the state, so essentially the state decided to subsidise a coal-fired power plant. But that subsidy is not counted in any statistics.  

I am not an engineer, but I do work in the offshore oil and gas industry. And I know first hand that there are several indirect subsidies even today. 

I do not for a second believe that our societies can do without fossil fuels for at least my lifetime, but I believe that renewables will continue to grow and grow more than most people think...  

I understand what you're saying, but I don't think that's how history played out. 
1)  Nuclear was booming - and then squelched by regulation - before the first oil crisis.  I.e. the cheapest oil never competed with nuclear. 
2)  No one had even thought about renewables until the first oil crisis, nearly 50 years of subsidies haven't made them competitive beyond niche markets, and there's no roadmap to making them competitive beyond niche markets.  If oil had been expensive from the beginning, the world would have used coal & nuclear. 

Let's revisit the first oil crisis and the idea of renewables.  Government & academia justified renewables by scaring the public.  They were only able to scare the public because of fuel shortages.  Fuel shortages only came to be because society became dependent on foreign oil.  Society only became dependent on foreign oil because oil was so cheap.  Without cheap oil, renewables may not exist at all!

How would history have played out if oil had always been expensive?  There would have been cars, but fewer of them.  The US would have kept its per capita consumption closer to Germany's, which would put us at <10 MMbpd today.  Suburbs would have developed differently, public transit would be extensive like it was in the early 1900s, trains would be used more extensively in lieu of trucks, etc.  In short, there wouldn't have been an oil crisis because there'd be remarkably little oil dependence.  Without an oil crisis, there'd be no way to convince The Public to waste billions on renewables.  They simply wouldn't exist. 

This is a great example of a Nash equilibrium in action.  The world settles into stable behaviors; it takes a large shock to knock it out of that stable equilibrium and set it on a course to a new one.  Politicians are in the business of teeing up those shocks - making a handsome profit on the outcomes, of course. 


On 5/12/2019 at 1:43 PM, BenFranklin'sSpectacles said:

Fossil fuels - coal in particular - have already proven they're a low-cost source of energy.  I'm not concerned about the cost of fuel because it's a known quantity.  The cost of commodities like concrete, steel, and aluminum have also been well established.  My point in mentioning the materials involved in renewables is to illustrate that their cost will hit a floor - just like every other well-developed technology.  We can't count on perpetual decreases. 

The question is, "Where will that floor occur?"  Building renewables out to 2-3X average load + transmission lines + grid upgrades + storage + backup generation is going to be expensive.  Attempts thus far have failed to deliver on their low-cost promises.  With that track record of failure, I'll need to see success before I believe it. 

As for your linked articles, those are wonderful exercises in mental gymnastics.  I've seen enough of academia's work to not believe a thing they say.  What do the professional engineers say?  What do the people in the field say?  What is actually being implemented?  Specifically, we need to talk to the people who:
1)  Design these systems for a living
2)  Stake their careers on being correct in the real world - not just on paper

If you can present an argument based on real systems, I'll take this seriously.  Otherwise, I'm not interested.  I don't have time for academia's corrupt, wishful thinking. 

Ben,

Where do professional engineers learn their craft?  

From engineering professors, or that was how it was done when I went to university.

You can ignore science if you like, generally engineers do not.

Coal-fired power plants are not being built in most advanced economies, and the only fuel source that is currently competitive with wind and solar is natural gas in low-cost producing nations (Russia, the US, and the Middle East).  Natural gas will deplete and become more expensive; the cost of solar power is likely to continue to fall.  Eventually it will reach some floor, though we don't know where that floor will be.

The transmission grid is largely built, the cost to connect wind and solar to the existing grid is included in cost estimates. HVDC transmission lines will be built where it makes economic sense to do so.

It will likely be difficult to eliminate fossil fuel emissions entirely, but the World can probably get to 70% of energy use from non-fossil fuel energy by 2050, and perhaps to 50% by 2035.  Combined wind and solar output will make much of existing power generation unprofitable as the cost continues to fall.

Note that projections of future solar PV cost by the EIA are very conservative and even they have solar PV LCOE less than new natural gas by 2040.

See also

https://data.nrel.gov/submissions/103

Utility-scale one-axis tracked PV (most new installations are of this type) system real costs (in constant 2017$) decreased at an average rate of 20% per year from 2010 to 2018.  Possibly this decrease in cost will be slower in the future.  The NREL SunShot 2020 goal for utility-scale solar (an LCOE of 6 cents per kWh) was reached three years early, in 2017; the 2030 goal is 3 cents per kWh.  The LCOE for utility-scale PV solar decreased from 27 cents per kWh in 2010 to 6 cents per kWh in 2017, an average annual decrease of about 19% per year.  Note that only a 5% decrease in cost per year on average from 2018 to 2030 would be needed to reach the SunShot 2030 goal of 3 cents per kWh in 2017$.  These LCOE estimates do not include the ITC tax credit; they are unsubsidized costs for PV solar at the utility scale.
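The compound-decline arithmetic in those figures checks out; a quick sketch using only the costs quoted (in cents per kWh), and assuming 13 years from the 2017 cost to the 2030 goal:

```python
# Average annual decline implied by a start -> end cost over `years` years.
def annual_decline(start, end, years):
    return 1 - (end / start) ** (1 / years)

# 27 c/kWh (2010) -> 6 c/kWh (2017): roughly the 19%/year stated.
drop_2010_2017 = annual_decline(27, 6, 7)

# 6 c/kWh (2017) -> the 3 c/kWh SunShot 2030 goal: roughly 5%/year.
drop_to_2030 = annual_decline(6, 3, 13)

print(f"{drop_2010_2017:.1%}, {drop_to_2030:.1%}")
```

These come out to about 19.3% and 5.2% per year, matching the ~19% and ~5% figures in the text.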


1 hour ago, D Coyne said:

Ben,

Where do professional engineers learn their craft?  

From engineering professors, or that was how it was done when I went to university.

You can ignore science if you like, generally engineers do not.

Engineers In Training are required to work under a Registered Professional Engineer for several years, thoroughly documenting their work to verify they've met certain requirements.  They're also required to master an extensive curriculum and pass exams far more grueling than anything a university offers.  Only then can they become a Registered Professional Engineer. 

Of course, only 1/3rd or so of engineers become Registered Professional Engineers.  Other professions consider new graduates basically worthless and spend several years training them.  It's only after they've invested years developing their skills that they're given real responsibility. 

Traditionally, academia was the contractor for initial training because academia was the lowest cost provider.  In recent years, academia has neglected their responsibilities to the point where the "education" they offer is next to worthless.  I estimated that 80% of my graduating class had no clue what they were doing.  Simultaneously, academia has raised prices to usurious levels.  Both industry and students have grown tired of this trend; steps are being taken to strip academia of its funding:

1)  Universities are now run by administrators instead of academics.  These administrators are beholden to their benefactors: the state, tuition-paying students, and those who fund research.  Thus, academia is being brought to heel. 
2)  Talented students are stacking their high school curriculum with college-level courses.  This strips universities of the first 1-2 years of tuition.  I.e. it strips them of the entry-level courses they've most neglected.
3)  Undergraduate Engineering degrees were shortened from 5 years/150 credits to 4 years/120 credits.  This effectively strips universities of the last year of tuition, shifting that money to alternative education and students' pockets. 
4)  Engineering Masters degrees are shortening from 36 credit hours to 30 credit hours because students no longer see the value and industry doesn't care about the course work.
5)  Hiring managers are ignoring GPA and courses taken.  Instead, they're demanding to see proof of competence: projects completed, teams joined, internships, etc. 
6)  Newer professions, such as Quality Engineering and various IT jobs, no longer require engineering degrees.  Instead, they require X years of experience and the appropriate industry certifications. 
7)  At least in the US, direct government funding of public education is decreasing.  Instead, government programs offer tuition assistance to students, who may choose any educational program they please.  I.e. academia must now compete on the open market. 
8)  Private schools are standing up engineering programs focused on teaching instead of research to meet industry's need for competent, emotionally-stable graduates.  The best students are paying to attend these schools, and the best employers are shifting recruiting to them.  This strips traditional academia of the one valuable product they offered: talented students that Someone Else had already trained. 
9)  Engineering programs are replacing some professors with dedicated instructors.  Many of these have industry experience in lieu of a PhD because everyone has discovered that PhDs without experience haven't a clue what they're doing. 

You get the point.  No one questioned academia's role provided they got the job done, but they're no longer getting the job done.  Once that psychological threshold was reached, the world set about creating alternatives.  Academia's power, influence, and pay will continue to decline - and good riddance. 


31 minutes ago, BenFranklin'sSpectacles said:

Engineers In Training are required to work under a Registered Professional Engineer for several years, thoroughly documenting their work to verify they've met certain requirements.  They're also required to master an extensive curriculum and pass exams far more grueling than anything a university offers.  Only then can they become a Registered Professional Engineer. 

Of course, only 1/3rd or so of engineers become Registered Professional Engineers.  Other professions consider new graduates basically worthless and spend several years training them.  It's only after they've invested years developing their skills that they're given real responsibility. 

Traditionally, academia was the contractor for initial training because academia was the lowest cost provider.  In recent years, academia has neglected their responsibilities to the point where the "education" they offer is next to worthless.  I estimated that 80% of my graduating class had no clue what they were doing.  Simultaneously, academia has raised prices to usurious levels.  Both industry and students have grown tired of this trend; steps are being taken to strip academia of its funding:

1)  Universities are now run by administrators instead of academics.  These administrators are beholden to their benefactors: the state, tuition-paying students, and those who fund research.  Thus, academia is being brought to heel. 
2)  Talented students are stacking their high school curriculum with college-level courses.  This strips universities of the first 1-2 years of tuition.  I.e. it strips them of the entry-level courses they've most neglected.
3)  Undergraduate Engineering degrees were shortened from 5 years/150 credits to 4 years/120 credits.  This effectively strips universities of the last year of tuition, shifting that money to alternative education and students' pockets. 
4)  Engineering Masters degrees are shortening from 36 credit hours to 30 credit hours because students no longer see the value and industry doesn't care about the course work.
5)  Hiring managers are ignoring GPA and courses taken.  Instead, they're demanding to see proof of competence: projects completed, teams joined, internships, etc. 
6)  Newer professions, such as Quality Engineering and various IT jobs, no longer require engineering degrees.  Instead, they require X years of experience and the appropriate industry certifications. 
7)  At least in the US, direct government funding of public education is decreasing.  Instead, government programs offer tuition assistance to students, who may choose any educational program they please.  I.e. academia must now compete on the open market. 
8)  Private schools are standing up engineering programs focused on teaching instead of research to meet industry's need for competent, emotionally-stable graduates.  The best students are paying to attend these schools, and the best employers are shifting recruiting to them.  This strips traditional academia of the one valuable product they offered: talented students that Someone Else had already trained. 
9)  Engineering programs are replacing some professors with dedicated instructors.  Many of these have industry experience in lieu of a PhD because everyone has discovered that PhDs without experience haven't a clue what they're doing. 

You get the point.  No one questioned academia's role provided they got the job done, but they're no longer getting the job done.  Once that psychological threshold was reached, the world set about creating alternatives.  Academia's power, influence, and pay will continue to decline - and good riddance. 

Here it's 4 years 150 credits (6 courses/term).

I did an internship between years 3 and 4 - it didn't teach me nearly as much as university did.  No employer is going to teach math. 

If everything requires experience, you will have no engineers: "I can't get a job because I have no experience; I can't get experience because I can't get a job."


10 minutes ago, Enthalpic said:

Here it's 4 years 150 credits (6 courses/term).

I did an internship between years 3 and 4 - it didn't teach me nearly as much as university did.  No employer is going to teach math. 

If everything requires experience, you will have no engineers: "I can't get a job because I have no experience; I can't get experience because I can't get a job." 

Every employer lists experience - and a pile of other things - so they have plausible deniability when they reject a candidate.  You can't cry discrimination if you didn't meet the minimum requirements. 

Employers don't need to teach math, but there's no universal law saying universities must do this.  The material is freely available online, and most students end up self-teaching anyway.  Why waste money on academia?  Why tolerate the welfare system we call teaching assistantships?  Why suffer the waste and abuse that comes with tenured positions?  

It would be far cheaper to have students study independently and spend $150 on a proctored exam - just as we do for AP courses & standardized tests.  The infrastructure already exists; we're just not using it.  Alternatively, we could have a handful of instructors record lectures and post them online for every student in the country to use.  We could reduce the number of universities in the US from >4,000 to 4 - a 99.9% reduction!

Students who want jobs are already required to do extra-curricular activities, join teams, and complete projects.  Meanwhile, there's effectively no feedback on homework and exams because the teaching assistants who grade them barely look at them.  Excellent grades are handed out with a wink and a nod.  Everyone knows the homework & exams are worthless; why maintain the charade that college courses add value? 

Basically, the public education system is an expensive travesty.  We'd get better results by simply telling students they're responsible for their own success and turning them loose.  Let's quit pretending there's something magical about universities and treat them like the businesses they are.  Let's create new business models and let the old go bankrupt. 


3 hours ago, BenFranklin'sSpectacles said:

 Alternatively, we could have a handful of instructors record lectures and post them online for every student in the country to use.

MIT has a bunch of course material online for free.  I've used it - it's great, but they aren't handing out the expensive pieces of paper. https://ocw.mit.edu/index.htm

I agree that for some courses you should just be able to challenge an exam, but coming from a chemistry background there was a lot of hands on lab work that is irreplaceable.

Plus, I really enjoyed my time in university so I support it.  🍺 🍺 🍺 🍺 🍺  ✌️ 👩


Former CEO of Sonatrach - Africa's largest company and an oil major - says oil should be around 70 dollars; this is the best price for consumers and producers. Any lower or higher and we can have major problems in the future.  

https://youtu.be/f7DF1Hg1CiI

 
