
AMD: Moore's Law's end is near



nsane.forums


Theoretical physicist Michio Kaku believes Moore's Law has roughly a decade left before ever-shrinking transistor sizes smack up against limitations imposed by the laws of thermodynamics and quantum physics. That day of reckoning for the computing industry may still be a few years away, but signs of the coming Moorepocalypse are already here. Just ask chip maker AMD.

The company's Chief Product Architect John Gustafson believes AMD's difficulties in transitioning from 28-nanometer chips to 20-nanometer silicon show we've reached the beginning of the end.

"You can see how Moore's Law is slowing down," Gustafson recently told the Inquirer. "We've been waiting for that transition from 28nm to 20nm to happen and it's taking longer than Moore's Law would have predicted...I'm saying you are seeing the beginning of the end of Moore's law." A processor’s nanometer measurement tells you the size of the smallest transistors on a given chip.

Doomsday predictions about the end of Moore's Law are nearly as old as the famous observation posited by Intel cofounder Gordon Moore in 1965. In his 2011 book Physics of the Future, for example, Kaku predicted the end of Moore’s Law could turn Silicon Valley into a “rust belt” if a replacement technology for silicon isn’t found.

Not just about tech

Moore's Law, Gustafson argues, wasn't just about the technological ability to put more transistors on a chip, but also the economic feasibility of doing so. "The original statement of Moore's Law is the number of transistors that is more economical to produce will double every two years," Gustafson said. "It has become warped into all these other forms but that is what he originally said."
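For a rough sense of the doubling arithmetic Gustafson is describing, the short Python sketch below projects an idealized transistor count that doubles every two years. It uses the 1971 Intel 4004 (about 2,300 transistors) only as a convenient, well-known starting point; the numbers it prints are an idealized curve, not actual chip data.

# Idealized Moore's Law projection: transistor counts doubling every two years.
# Starting point: the 1971 Intel 4004, with roughly 2,300 transistors.
# Purely illustrative; real products deviate from this curve.

START_YEAR = 1971
START_TRANSISTORS = 2300        # Intel 4004
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year):
    """Transistor count if doubling held perfectly every two years."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    return int(START_TRANSISTORS * 2 ** doublings)

for year in (1971, 1981, 1991, 2001, 2011, 2013):
    print(f"{year}: ~{projected_transistors(year):,} transistors")

Run as written, the projection lands in the billions of transistors by the early 2010s, which is the scale of growth the economics of the law have to keep sustaining.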

Gustafson's comments echo similar statements another AMD representative recently made to PCWorld. "Moore's Law was always about the cost of the transistors as much as performance increasing as you could afford more and more of them," Gary Silcott, the senior PR manager for AMD's APU and CPU products, said.

AMD's argument may also reveal a bit of corporate bias related to the company's recent struggles. While AMD's chips are currently stuck at 28nm, Intel is pushing ahead with smaller and smaller designs. Intel currently produces 22nm chips for its latest generation of Core processors, Ivy Bridge. The next generation, Haswell, will also use a 22nm process. Intel expects to produce 14nm Broadwell chips in 2014, and the company is aiming to produce 10nm chips by 2016.

But AMD is not alone in seeing the beginning of the end for Moore's famous observation. Computer processors in general are starting to lag behind the intent, if not the letter, of Moore's Law, as PCWorld Senior Writer Brad Chacos recently reported. Chips may be getting smaller, but processing power isn't making the same huge jumps we saw in previous decades. Instead, smaller chips are more about improving graphics and energy efficiency than about raw performance.

So if we are seeing the end of Moore’s Law, what’s next for computers? Kaku suggested a few interesting possibilities such as molecular transistors or, much further down the road, quantum computers. Until then, Intel, AMD, and other chip makers will continue to squeeze every last ounce of speed and power they can from silicon designs.

Michio Kaku: Tweaking Moore’s Law

  • Administrator

Somehow, I believe this will help consumers.

How? Right now, anything you buy gets overtaken by a new-gen processor within a year or 18 months. If Moore's Law slows down, it will mean more time for users to enjoy their current hardware. I think it will also push companies to innovate more.

However, that's the only positive side of this; the negative side is considerable.


While faster, more advanced chips are nice...

How much more power and speed do most users really need? I've had the same 4-core AMD chip for 3 years now and it is working just fine. There really isn't anything software-wise that bogs it down, and a faster chip just wouldn't do much for me anyway.


  • Administrator


Well, playing badly ported games, for one, needs more CPU power. :P I think better graphics cards need more processor power too.

But most importantly, I think advancement is needed on the mobile side. As much as I hate seeing new mobile versions every year, mobile hardware still has a long way to go to come anywhere near the speed of PCs.


From the users' point of view, there's absolutely nothing wrong with the speed at which technology is advancing; the problem is the speed at which older technology is being deprived of support.

From the manufacturers' point of view, it's getting increasingly expensive to support older technology at the rate at which newer advances keep piling up.


This article, in a way, predicts the beginning of the DNA computing era.


Well, playing badly ported games, for one, needs more CPU power. :P I think better graphics cards need more processor power too.

But most importantly, I think advancement is needed on the mobile side. As much as I hate seeing new mobile versions every year, mobile hardware still has a long way to go to come anywhere near the speed of PCs.

Isn't this just the reality that as computer chips got faster, RAM got larger, and storage space grew… programmers got sloppier?

Look back at the early days of computers. With 64 KB of RAM and 2 to 4 MHz processors, programmers could write playable games and working programs.

The first IBM PC I had, which admittedly was decades ago, had a 4.77 MHz processor, 640 KB of RAM, and a 10 MB hard drive.

Hardware has changed for the better, but in some ways software has gotten sloppier, as programmers no longer strive to squeeze all they can out of it. With the hardware available today, there is little reason not to be sloppy.

Mobile computing is following the same path.


  • Administrator


No idea, actually. However, an individual programmer working on a small piece of software wouldn't care much about the hardware. But big players like Adobe will want more powerful hardware so they can give their software more to work with. What's even sadder is that those big developers write their software on big, powerful machines; they probably have no idea how things run on common (non-Mac) computers.


Archived

This topic is now archived and is closed to further replies.
