ronwagn

Is It the End of the Road for Computing Power?


https://www.wsj.com/articles/is-it-the-end-of-the-road-for-computing-power-11646389802?mod=tech_lead_pos13

 

HEARD ON THE STREET

Is It the End of the Road for Computing Power?

That phone in your pocket was predicted by Moore’s Law, but it is bumping against its physical and economic limits

 

There is a Goldilocks zone. Shrinking the whole chip...would actually cost more and doesn’t necessarily mean better performance and power.

— Dylan Patel

 

 


Making circuits that can withstand higher voltage is, in my opinion, the next step for conventional computing. That of course brings on a plethora of other necessary improvements, particularly in heat management.

  • Like 1


While it is true that Moore's law is expected to finally reach its physical limits in 2025 or so, and there has been a notable slowing between generations of computers, computer companies can still increase the power of their systems: stick in a few more processors; the chips just won't also shrink. For tower computers (I have one on my desk) this doesn't mean much, as there is plenty of space inside. Phones and laptops are carried around, of course, but if the chips can't shrink any further, maybe there are workarounds, such as better software that can push processing up into a cloud. Or maybe we can have brain implants and use our organic brains as backup processors, with the phones making all the real decisions. Engineers will work something out.
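A minimal sketch of the cloud-offload idea in Python, assuming a hypothetical compute service (the endpoint URL and payload shape below are made up for illustration):

```python
# Ship heavy work to a server and keep the device as a thin client.
# The endpoint and payload shape are hypothetical.
import requests

def offload_matrix_multiply(a, b):
    """Send two matrices to a (hypothetical) compute service and
    return the product computed server-side."""
    resp = requests.post(
        "https://compute.example.com/matmul",  # hypothetical endpoint
        json={"a": a, "b": b},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["result"]

# Usage: the client stays cheap; network latency, not local CPU,
# becomes the practical limit.
# product = offload_matrix_multiply([[1, 2], [3, 4]], [[5, 6], [7, 8]])
```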

  • Upvote 2


15 minutes ago, markslawson said:

that can push processing up into a cloud.

For sure, once data transfer speeds get high enough (like 5G), almost everything can be done by large servers.

  • Like 1
  • Upvote 1


4 hours ago, KeyboardWarrior said:

Making circuits that can withstand higher voltage is, in my opinion, the next step for conventional computing. That of course brings on a plethora of other necessary improvements, particularly in heat management.

The wavelength of the electron is a hard limit.

Optical or quantum computers are the next real jump.

  • Like 2


Funny topic. I believe we reached the point of no return long ago. High tech is finally just running out of marketing concepts... actually, in retrospect, boring mid tech is running out of marketing concepts.

Concepts: Create the need and they will come.

 

  • Upvote 1



Lots of people, like me, have old computers lying around and newer devices sitting in desks unused. I have two cheap cams and other devices that I never used. Yet I just bought a Chromebook as a backup to my 17-inch HP laptop. I never bothered to upgrade to 16 GB of RAM; I just used a tab suspender or rebooted to clear the cache. I would buy a gaming rig if I weren't so busy online. I used to love gaming when I was working. I went from a Texas Instruments to a 286, then a 486, and stopped at an i5 because there was no need short of gaming, which I no longer had time for once I retired. I used gaming as a stress reliever before then.

I never understood why people wanted tiny phones or small computers; I want big. My new Chromebook is 15.6 inches and runs the much less demanding Chrome OS. It is hard to find a 17-inch Chromebook. I think I paid about $1,500 for my 286, while my Chromebook cost me $379. Considering inflation, that $379 is equivalent to something like $150 in the dollars I paid for the 286. The difference in capability is, of course, many times greater.
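A quick sanity check of that inflation comparison, assuming the 286 was bought around 1990 and a rough 2.5x price multiplier from then to 2022 (the multiplier is an assumption, not an official CPI figure):

```python
# Rough inflation arithmetic for the comparison above.
# The 2.5x multiplier from ~1990 to 2022 is an assumption.
CPI_FACTOR = 2.5

chromebook_2022 = 379
print(chromebook_2022 / CPI_FACTOR)   # ~152, close to the ~$150 above

pc286_1990 = 1500
print(pc286_1990 * CPI_FACTOR)        # ~3750 in 2022 dollars
```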

2021 Flagship Acer Chromebook 15.6" FHD 1080p IPS Touchscreen Light Computer Laptop, Intel Celeron N4020, 4GB RAM, 64GB eMMC, HD Webcam, WiFi, 12+ Hours Battery, Chrome OS, w/Marxsol Cables

 


I wonder who spends more on high-priced computers, our government or gamers?

https://www.zdnet.com/article/most-expensive-gaming-computer/

Most expensive gaming computers: $20,000 is a small price to pay for victory

We've curated a list of the most expensive gaming desktops and laptops you can get your hands on if you're willing to cash out your 401(k) early.



On 3/7/2022 at 7:44 PM, KeyboardWarrior said:

Making circuits that can withstand higher voltage is, in my opinion, the next step for conventional computing. That of course brings on a plethora of other necessary improvements, particularly in heat management.

Bigger current!

https://en.wikipedia.org/wiki/Emitter-coupled_logic

  • Like 1


On 3/7/2022 at 4:10 PM, ronwagn said:

https://www.wsj.com/articles/is-it-the-end-of-the-road-for-computing-power-11646389802?mod=tech_lead_pos13

 

HEARD ON THE STREET

Is It the End of the Road for Computing Power?

That phone in your pocket was predicted by Moore’s Law, but it is bumping against its physical and economic limits

 

There is a Goldilocks zone. Shrinking the whole chip...would actually cost more and doesn’t necessarily mean better performance and power.

— Dylan Patel

 

 

The biggest gains to be made are from changing the programming paradigm to SIMD and MIMD.
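A minimal sketch of the two paradigms in Python: vectorized NumPy operations exercise the CPU's SIMD units (one instruction stream over many data elements), while multiprocessing runs independent instruction streams on separate cores (MIMD). Sizes and worker counts are illustrative:

```python
import numpy as np
from multiprocessing import Pool

# SIMD-style: one vectorized operation over many elements. NumPy
# dispatches this to compiled loops that use the CPU's vector units.
def simd_style(x):
    return np.sqrt(x * x + 1.0)

# MIMD-style: independent instruction streams on separate cores,
# each working on its own chunk of the data.
def chunk_work(chunk):
    return np.sqrt(chunk * chunk + 1.0)

def mimd_style(x, workers=4):
    chunks = np.array_split(x, workers)
    with Pool(workers) as pool:
        return np.concatenate(pool.map(chunk_work, chunks))

if __name__ == "__main__":
    data = np.arange(1_000_000, dtype=np.float64)
    assert np.allclose(simd_style(data), mimd_style(data))
```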

  • Like 1


On 3/8/2022 at 2:23 AM, ronwagn said:

Lots of people, like me, have old computers lying around and newer devices sitting in desks unused. I have two cheap cams and other devices that I never used. Yet I just bought a Chromebook as a backup to my 17-inch HP laptop. I never bothered to upgrade to 16 GB of RAM; I just used a tab suspender or rebooted to clear the cache. I would buy a gaming rig if I weren't so busy online. I used to love gaming when I was working. I went from a Texas Instruments to a 286, then a 486, and stopped at an i5 because there was no need short of gaming, which I no longer had time for once I retired. I used gaming as a stress reliever before then.

I never understood why people wanted tiny phones or small computers; I want big. My new Chromebook is 15.6 inches and runs the much less demanding Chrome OS. It is hard to find a 17-inch Chromebook. I think I paid about $1,500 for my 286, while my Chromebook cost me $379. Considering inflation, that $379 is equivalent to something like $150 in the dollars I paid for the 286. The difference in capability is, of course, many times greater.

2021 Flagship Acer Chromebook 15.6" FHD 1080p IPS Touchscreen Light Computer Laptop, Intel Celeron N4020, 4GB RAM, 64GB eMMC, HD Webcam, WiFi, 12+ Hours Battery, Chrome OS, w/Marxsol Cables

 


I wonder who spends more on high-priced computers, our government or gamers?

https://www.zdnet.com/article/most-expensive-gaming-computer/

Most expensive gaming computers: $20,000 is a small price to pay for victory

We've curated a list of the most expensive gaming desktops and laptops you can get your hands on if you're willing to cash out your 401(k) early.

Try the LG Gram 17. It is what I have.

The government has lots of rigs more expensive than $20K.

  • Like 1


Very impressive, Andrei! I will consider it for my next Windows laptop. Is it capable of gaming? Do you really need that much computer power? I have 23 tabs open on my Chromebook right now. I am still setting up all my stuff. It is actually quicker than my HP laptop, which is a few years old but was about double the price. The new oval USB-C ports help, and it also has a faster Wi-Fi connection.



1 hour ago, Andrei Moutchkine said:

Try the LG Gram 17. It is what I have.

The government has lots of rigs more expensive than $20K.

Yes, but not the numbers of computers. Do you remember when certain projects used to run as parasites off of individual computers? They needed more computing power for scientific purposes.

Now they just buy what they need with tax money from the same people they are spying on. I think it is worldwide to varying extents. I always assume that I am an open book.



28 minutes ago, Ron Wagner said:

Very impressive, Andrei! I will consider it for my next Windows laptop. Is it capable of gaming? Do you really need that much computer power? I have 23 tabs open on my Chromebook right now. I am still setting up all my stuff. It is actually quicker than my HP laptop, which is a few years old but was about double the price. The new oval USB-C ports help, and it also has a faster Wi-Fi connection.

No, it is not capable of gaming. There is no separate GPU, and the performance is significantly throttled by the thermals. Think of it as a larger MacBook Air. (Allegedly, the newer 2022? versions do have beefier cooling and don't do this anymore.) I've got a desktop/server for games.

I've got 300+ tabs, but I use Mozilla, which handles this better than Chrome. It also required a RAM upgrade to 24GB. It is believed to be the most user-serviceable ultrabook. I am a fan of LG's 16:10 panels. I also have a 32" Dell from 2013 that is still the best monitor I've ever seen.

  • Upvote 1


41 minutes ago, Ron Wagner said:

Yes, but not the numbers of computers. Do you remember when certain projects used to run as parasites off of individual computers? They needed more computing power for scientific purposes.

Now they just buy what they need with tax money from the same people they are spying on. I think it is worldwide to varying extents. I always assume that I am an open book.

You mean the things which have shelves upon shelves of commodity hardware and populate the Top500 list?

https://www.top500.org/lists/top500/

Those are only good for running supercomputing benchmarks and are a form of government pork barrel and a dick-swinging contest. All the practically usable supercomputers tend to fit into a single cabinet.

  • Haha 1


8 minutes ago, Andrei Moutchkine said:

No, it is not capable of gaming. There is no separate GPU, and the performance is significantly throttled by the thermals. Think of it as a larger MacBook Air. (Allegedly, the newer 2022? versions do have beefier cooling and don't do this anymore.) I've got a desktop/server for games.

I've got 300+ tabs, but I use Mozilla, which handles this better than Chrome. It also required a RAM upgrade to 24GB. It is believed to be the most user-serviceable ultrabook. I am a fan of LG's 16:10 panels. I also have a 32" Dell from 2013 that is still the best monitor I've ever seen.

Wow. I think my Chromebook can also run Linux, but I doubt I would ever need that power. I watch my 48-inch TV across the room over the top of my laptop.

Do you need the power for engineering work, or what?


7 minutes ago, Andrei Moutchkine said:

You mean the things which have shelves upon shelves of commodity hardware and populate the Top500 list?

https://www.top500.org/lists/top500/

Those are only good for running supercomputing benchmarks and are a form of government pork barrel and a dick-swinging contest. All the practically usable supercomputers tend to fit into a single cabinet.

All the computers in the world are only a drop in the bucket compared to the processor that the Holy Spirit has. 

  • Like 1
  • Upvote 1


3 minutes ago, KeyboardWarrior said:

You sure? Higher voltage is what allows the components to switch their states faster. I'll look into this more. 

I am sure. The effect of higher voltage is highly localized. The kinds of logic used most of the time usually benefit from both lower voltage and lower current. There are no EDA tools which support ECL.
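The first-order reason lower voltage wins in CMOS: dynamic power scales with the square of the supply voltage. A quick illustration with made-up but plausible numbers:

```python
# First-order CMOS dynamic power: P = a * C * V^2 * f, where a is the
# activity factor, C the switched capacitance, V the supply voltage,
# and f the clock. The numbers below are illustrative, not a real chip's.
def dynamic_power(a, c_farads, v_volts, f_hertz):
    return a * c_farads * v_volts**2 * f_hertz

base = dynamic_power(0.1, 1e-9, 1.0, 4e9)   # 1.0 V supply at 4 GHz
hot = dynamic_power(0.1, 1e-9, 1.2, 4e9)    # raise the supply by 20%
print(hot / base)                           # 1.44: +20% voltage, +44% power
```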

  • Upvote 1


1 minute ago, Andrei Moutchkine said:

I am sure. The effect of higher voltage is highly localized. The kinds of logic used most of the time usually benefit from both lower voltage and lower current. There are no EDA tools which support ECL.

Wouldn't you be limited by wire size? Voltage doesn't experience this same limitation. 


Just now, KeyboardWarrior said:

Wouldn't you be limited by wire size? Voltage doesn't experience this same limitation. 

ECL components are not simple switching transistors. They are differential amplifiers, or something like that, taking on several functions at once. Esoteric stuff that nobody since Seymour Cray has really used, so I can't really tell.

  • Upvote 1


On 3/7/2022 at 4:35 PM, TailingsPond said:

The wavelength of the electron is a hard limit.

Optical or quantum computers are the next real jump.

Not sure what to think about the space efficiency of optical computers, but I'm sure they'll improve. 


Computing "power" has not changed in over 15 years now.  Your computers run no faster today than 15 years ago, could claim 20 years I suppose, but I digress....  The only difference has been on the usage of using the graphics card as the computing powerhouse as no one expects efficiency from said graphics card and therefore they keep adding more transistors and using parallel architecture to increase "power" for the "high" end users(99% gamers)

The end user software experience has gone BACKWARDS in the last 10 years(I would argue over 20).

The only "advancement" if you call it advancement, is in cell phones, though even they hit a brick wall over 5 years ago. 

As George Friedman pointed out, the transistor as a tech, is dead as it is over 50 years old.  There is no more advancement.  He is mostly correct where the only difference is in small areas of efficiency.  There are avenues to shrink things, but who cares?  We already have multiple orders of magnitude more computing power than required.  The only reason many things are slower is because software designers are not engineers anymore.  They are assemblers of pre-existing blocks making these horrid conglomerations of putrid garbage requiring vast amounts of RAM etc to do the same thing everyone used to do on a tiny fraction of resources when said software was purposefully designed for the application.  Programs require multiple Gigabytes which used to require a couple megabytes or even kilobytes to do the exact same thing.  Oh, but the graphics are nicer.... but the end user experience is still no different.

  • Great Response! 1



6 hours ago, footeab@yahoo.com said:

Computing "power" has not changed in over 15 years now.  Your computers run no faster today than 15 years ago, could claim 20 years I suppose, but I digress....  The only difference has been on the usage of using the graphics card as the computing powerhouse as no one expects efficiency from said graphics card and therefore they keep adding more transistors and using parallel architecture to increase "power" for the "high" end users(99% gamers)

The end user software experience has gone BACKWARDS in the last 10 years(I would argue over 20).

The only "advancement" if you call it advancement, is in cell phones, though even they hit a brick wall over 5 years ago. 

As George Friedman pointed out, the transistor as a tech, is dead as it is over 50 years old.  There is no more advancement.  He is mostly correct where the only difference is in small areas of efficiency.  There are avenues to shrink things, but who cares?  We already have multiple orders of magnitude more computing power than required.  The only reason many things are slower is because software designers are not engineers anymore.  They are assemblers of pre-existing blocks making these horrid conglomerations of putrid garbage requiring vast amounts of RAM etc to do the same thing everyone used to do on a tiny fraction of resources when said software was purposefully designed for the application.  Programs require multiple Gigabytes which used to require a couple megabytes or even kilobytes to do the exact same thing.  Oh, but the graphics are nicer.... but the end user experience is still no different.

Computers literally run at higher frequencies now, have vastly higher transistor density, use multiple physical cores, and add hyper-threading, so I strongly disagree that computers "run no faster" than 15 years ago.

Heat management is also a lot better: powerful cell phones have no fans!

I fully agree that most code is waste. They literally put people on the moon with less computing power than a modern pocket calculator.

My first computer was a Commodore 64: 64 KB of memory, and it still ran pretty fun simple games.
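A minimal demonstration of the multi-core point, splitting the same CPU-bound work across processes (worker and job counts are illustrative; the speedup depends on your core count):

```python
import time
from multiprocessing import Pool

def burn(n):
    # CPU-bound busywork: sum of squares up to n.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    jobs = [5_000_000] * 8

    t0 = time.perf_counter()
    serial = [burn(n) for n in jobs]
    t1 = time.perf_counter()

    with Pool(4) as pool:          # 4 workers; adjust to your core count
        parallel = pool.map(burn, jobs)
    t2 = time.perf_counter()

    assert serial == parallel
    print(f"serial:  {t1 - t0:.2f}s")
    print(f"4 cores: {t2 - t1:.2f}s")   # roughly 4x faster on 4+ cores
```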

 

 



5 hours ago, TailingsPond said:

Computers literally run at higher frequencies now, have vastly higher transistor density, use multiple physical cores, and add hyper-threading, so I strongly disagree that computers "run no faster" than 15 years ago.

Heat management is also a lot better: powerful cell phones have no fans!

I fully agree that most code is waste. They literally put people on the moon with less computing power than a modern pocket calculator.

My first computer was a Commodore 64: 64 KB of memory, and it still ran pretty fun simple games.

 

 

Frequencies today: ~4 GHz. Frequencies 15 years ago: 4 GHz (heck, 4 GHz was hit nearly 20 years ago, on 90 nm). All they have done since then is add more transistors and more cores while shrinking the die. Multi-core threading was available decades ago, and the term hyper-threading was coined some 20 years ago. Not that it matters, as nothing (basic software) actually uses multiple cores other than giant networking apps, some video-processing software, and video games.

Yippee... it uses less power... Who cares, unless you are operating a data center? No one.

There is nothing on a computer today that increases efficiency over what was already in use 20 years ago. Yes, there was video conferencing 20 years ago. The only limit was the internet connection for most people.
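Amdahl's law puts a number on that point: if only a small fraction of a program parallelizes, extra cores barely help. A quick worked example (the fractions are illustrative):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the
# parallelizable fraction of the program and n is the core count.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# If only 20% of a basic desktop app parallelizes, 16 cores barely matter:
print(round(amdahl_speedup(0.20, 16), 2))   # ~1.23x
# A well-parallelized workload (95%) is a different story:
print(round(amdahl_speedup(0.95, 16), 2))   # ~9.14x
```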

  • Upvote 1


1 hour ago, footeab@yahoo.com said:

Frequencies today: ~4 GHz. Frequencies 15 years ago: 4 GHz (heck, 4 GHz was hit nearly 20 years ago, on 90 nm). All they have done since then is add more transistors and more cores while shrinking the die. Multi-core threading was available decades ago, and the term hyper-threading was coined some 20 years ago. Not that it matters, as nothing (basic software) actually uses multiple cores other than giant networking apps, some video-processing software, and video games.

Yippee... it uses less power... Who cares, unless you are operating a data center? No one.

There is nothing on a computer today that increases efficiency over what was already in use 20 years ago. Yes, there was video conferencing 20 years ago. The only limit was the internet connection for most people.

Kudos for a respectful, knowledgeable rebuttal.

