• Welcome to the new Internet Infidels Discussion Board, formerly Talk Freethought.

Moore's Law Keeps Going, Defying Expectations

Axulus (Veteran Member; joined Jun 17, 2003; 4,201 messages; Hallandale, FL; right-leaning skeptic)
Personal computers, cellphones, self-driving cars—Gordon Moore predicted the invention of all these technologies half a century ago in a 1965 article for Electronics magazine. The enabling force behind those inventions would be computing power, and Moore laid out how he thought computing power would evolve over the coming decade. Last week the tech world celebrated his prediction because it has held true with uncanny accuracy for the past 50 years.

...

Moore anticipated the two-year doubling trend based on what he had seen happen in the early years of computer-chip manufacture. In his 1965 paper he plotted the number of transistors that fit on a chip since 1959 and saw a pattern of yearly doubling that he then extrapolated for the next 10 years. (He later revised the trend to a doubling about every two years.) “Moore was just making an observation,” says Peter Denning, a computer scientist at the Naval Postgraduate School in California. “He was the head of research at Fairchild Semiconductor and wanted to look down the road at how much computing power they’d have in a decade. And in 1975 his prediction came pretty darn close.”

But Moore never thought his prediction would last 50 years. “The original prediction was to look at 10 years, which I thought was a stretch,” he told Friedman last week. “This was going from about 60 elements on an integrated circuit to 60,000—a 1,000-fold extrapolation over 10 years. I thought that was pretty wild. The fact that something similar is going on for 50 years is truly amazing.”
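Moore's arithmetic is easy to sanity-check: ten yearly doublings multiply the count by 2^10 = 1,024, which is how roughly 60 elements becomes roughly 60,000. A quick sketch using only the figures from the quote above:

```python
# Sanity check of the extrapolation Moore describes:
# ~60 elements in 1965, doubling every year for 10 years.
start_elements = 60
doublings = 10
factor = 2 ** doublings              # the "1,000-fold" factor
projected = start_elements * factor  # elements expected by 1975

print(factor)     # 1024
print(projected)  # 61440, i.e. roughly the 60,000 Moore quotes
```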

Just why Moore’s law has endured so long is hard to say. His doubling prediction turned into an industry objective for competing companies. “It might be a self-fulfilling law,” Denning explains. But it is not clear why it is a constant doubling every couple of years, as opposed to a different rate or fluctuating spikes in progress. “Science has mysteries, and in some ways this is one of those mysteries,” Denning adds. Certainly, if the rate could have gone faster, someone would have done it, notes computer scientist Calvin Lin of the University of Texas at Austin.

Many technologists have forecast the demise of Moore’s doubling over the years, and Moore himself states that this exponential growth can’t last forever. Still, his law persists today, and hence the computational growth it predicts will continue to profoundly change our world. As he put it: “We’ve just seen the beginning of what computers are going to do for us.”

http://www.scientificamerican.com/article/moore-s-law-keeps-going-defying-expectations/#b05g31t20w15
 
Is this true? Small technology is speeding up, but isn't that more about adapting larger-scale desktop technology to miniature devices?
 
Is this true? Small technology is speeding up, but isn't that more about adapting larger-scale desktop technology to miniature devices?

Adaptations on top of adaptations are what exponential scales are all about. So why are you surprised?
I suppose I've always thought (perhaps mistakenly) that technological advances would keep up at ridiculous rates, like back in the 90s, when the computer you bought was out of date the moment it was purchased. These days, computer processing power is not advancing at anything like the rate it did in the 1990s.
 
Adaptations on top of adaptations are what exponential scales are all about. So why are you surprised?
I suppose I've always thought (perhaps mistakenly) that technological advances would keep up at ridiculous rates, like back in the 90s, when the computer you bought was out of date the moment it was purchased. These days, computer processing power is not advancing at anything like the rate it did in the 1990s.
Two reasons:

1) Desktop markets are saturated (if not declining) and, unlike in the 90s, are no longer the cutting edge. That has shifted to mobile and servers.

2) Nowadays a lot of effort goes into making processors consume less power, not just run faster.

And of course the third reason is that Moore's law is winding down due to physics. In the 90s, they could still scale down by keeping essentially the same structure, only smaller; now further scaling requires entirely different materials and methods. And it's getting harder and more expensive to keep up.
 
Adaptations on top of adaptations are what exponential scales are all about. So why are you surprised?
I suppose I've always thought (perhaps mistakenly) that technological advances would keep up at ridiculous rates, like back in the 90s, when the computer you bought was out of date the moment it was purchased. These days, computer processing power is not advancing at anything like the rate it did in the 1990s.

My head is spinning from this post.

The rate has not only increased annually since the 90s; it has accelerated! That's what Moore's Law is all about. The law has roughly held for 50 years. This is a fact; it is not up for debate.
 
I suppose I've always thought (perhaps mistakenly) that technological advances would keep up at ridiculous rates, like back in the 90s, when the computer you bought was out of date the moment it was purchased. These days, computer processing power is not advancing at anything like the rate it did in the 1990s.

My head is spinning from this post.

The rate has not only increased annually since the 90s; it has accelerated! That's what Moore's Law is all about. The law has roughly held for 50 years. This is a fact; it is not up for debate.

The original 1965 formulation was that the transistor density of integrated circuits doubles every year, but this was revised to every two years in 1975, which is rather odd for an immutable law that is 'a fact' and 'not up for debate'.

Typically the modern formulation of Moore's Law refers to cost vs speed as well as (or instead of) transistor density per board - or per chip.

In the consumer market, the period from the early eighties until about the mid-noughties was characterised by PCs which became about twice as fast every 18-24 months, while remaining at about the same price; since that time, there has been a transition to a halving of price every 18-24 months while remaining at about the same maximum speed. This transition has also been associated with multi-core technology - it is commonplace today to get eight CPU cores in a laptop, where a similarly priced machine a decade ago would have had a single core, running at a half to a quarter of the clock speed.
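Taking the rough figures in that paragraph at face value, the aggregate arithmetic looks like this (an idealised sketch that assumes perfectly parallel workloads, which real software rarely achieves):

```python
# Idealised throughput comparison using the rough figures above:
# one core a decade ago vs eight cores at half to a quarter of its clock.
old_throughput = 1 * 1.0   # cores * relative clock speed
new_low = 8 * 0.25         # eight cores at a quarter of the old clock
new_high = 8 * 0.5         # eight cores at half the old clock

print(old_throughput, new_low, new_high)  # 1.0 2.0 4.0
```

Even under these generous assumptions, the theoretical gain is only 2x to 4x over a decade - far below what uninterrupted clock-speed doubling would have delivered.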

None of the formulations of Moore's Law are absolutely accurate; and exactly what form the Law should take, and how long it will continue to hold true (or even whether it still does hold true) is very much up for debate.

The main driver of the conformance of hardware to Moore's Law is that Moore's Law has been used as a benchmark for R&D, making it more a self-fulfilling prophecy than a law of nature.

Chip makers will continue to strive to reach a standard that their marketers can spin as conforming with Moore's Law for as long as they can; but it would be a mistake to confuse sales hype with statements of fact that are not up for debate.
 
I suppose I've always thought (perhaps mistakenly) that technological advances would keep up at ridiculous rates, like back in the 90s, when the computer you bought was out of date the moment it was purchased. These days, computer processing power is not advancing at anything like the rate it did in the 1990s.

My head is spinning from this post.
Was it really that bad of a post? Am I incorrect in stating that personal computers are becoming obsolete nowhere near as fast as they did in the 90s?
 
My head is spinning from this post.
Was it really that bad of a post? Am I incorrect in stating that personal computers are becoming obsolete nowhere near as fast as they did in the 90s?

You're incorrect, yes; although the *perception* is valid enough. The reason for this isn't to do with any lack of technological progress itself though.

PCs used to become "obsolete" pretty quickly because graphical advances in gaming kept pushing faster and faster hardware. It didn't take long before your hardware could no longer reliably run the latest games. More recently, however, companies have been developing their games primarily with consoles in mind. Rather than build a game at the highest tech level possible (on the PC), they develop it at the level of the console for ease of development and porting. Rather than requiring a downgrade from the PC version to the console version, the game instead requires an upgrade to make the most of the cutting-edge PC hardware available, which generally isn't done, for a variety of reasons. This means that most games don't push the hardware as much as they theoretically could, and so PCs nowadays can go longer without substantial upgrades.
 
My head is spinning from this post.

The rate has not only increased annually since the 90s; it has accelerated! That's what Moore's Law is all about. The law has roughly held for 50 years. This is a fact; it is not up for debate.

The original 1965 formulation was that the transistor density of integrated circuits doubles every year, but this was revised to every two years in 1975, which is rather odd for an immutable law that is 'a fact' and 'not up for debate'.

I said "roughly". This is obviously not a perfect law; humanity could have been wiped out by a meteor 20 years ago. The fact that the law's predictions have roughly held for 50 years is not controversial; stop looking for an argument.

None of the formulations of Moore's Law are absolutely accurate; ...

If you spent more time trying to understand what the other person is trying to say and less time on your critique, you would have noticed the all-important qualifier "roughly".

and exactly what form the Law should take, and how long it will continue to hold true (or even whether it still does hold true) is very much up for debate.

I agree with this, but I don't see how it's relevant to what I said.

The main driver of the conformance of hardware to Moore's Law is that Moore's Law has been used as a benchmark for R&D, making it more a self-fulfilling prophecy than a law of nature.

But that is just your opinion. And I did not say anything about what drives the accuracy of Moore's law.

I really think it is important that we understand each other.
 
Was it really that bad of a post? Am I incorrect in stating that personal computers are becoming obsolete nowhere near as fast as they did in the 90s?

You're incorrect, yes; although the *perception* is valid enough.
That's all I ask :D
The reason for this isn't to do with any lack of technological progress itself though.

PCs used to become "obsolete" pretty quickly because graphical advances in gaming kept pushing faster and faster hardware. It didn't take long before your hardware could no longer reliably run the latest games. More recently, however, companies have been developing their games primarily with consoles in mind. Rather than build a game at the highest tech level possible (on the PC), they develop it at the level of the console for ease of development and porting. Rather than requiring a downgrade from the PC version to the console version, the game instead requires an upgrade to make the most of the cutting-edge PC hardware available, which generally isn't done, for a variety of reasons. This means that most games don't push the hardware as much as they theoretically could, and so PCs nowadays can go longer without substantial upgrades.
Seriously, gaming is the single driving force?
 
My head is spinning from this post.
Was it really that bad of a post? Am I incorrect in stating that personal computers are becoming obsolete nowhere near as fast as they did in the 90s?

You made a huge leap from your observation to your conclusion.

Furthermore, your conclusion goes against everything that has been historically recorded about the accuracy of Moore's law. It's not as though you said the gravitational constant doesn't seem to be exactly G because of an observation you made. It's more like claiming that the number of murders in some country has not risen because it seems that way to you, against records showing that murders have risen. In other words, it's not really up for debate, unless you are claiming a conspiracy, which I don't think you were.
 
Was it really that bad of a post? Am I incorrect in stating that personal computers are becoming obsolete nowhere near as fast as they did in the 90s?

It's just that you made a huge leap from your observation to your conclusion.

Furthermore, your conclusion goes against everything that has been historically recorded about the accuracy of Moore's law. It's not as though you said the gravitational constant doesn't seem to be exactly G because of an observation you made. It's more like claiming that the number of murders in some country has not risen because it seems that way to you, against records showing that murders have risen. In other words, it's not really up for debate, unless you are claiming a conspiracy, which I don't think you were.

Make sure you're talking about the same kind of rate.

Give a poor person $1000 and you can change their life. Give a rich person $10,000 and they might not even notice. Which "rate" is higher?
 
It's just that you made a huge leap from your observation to your conclusion.

Furthermore, your conclusion goes against everything that has been historically recorded about the accuracy of Moore's law. It's not as though you said the gravitational constant doesn't seem to be exactly G because of an observation you made. It's more like claiming that the number of murders in some country has not risen because it seems that way to you, against records showing that murders have risen. In other words, it's not really up for debate, unless you are claiming a conspiracy, which I don't think you were.

Make sure you're talking about the same kind of rate.

Give a poor person $1000 and you can change their life. Give a rich person $10,000 and they might not even notice. Which "rate" is higher?

I don't think I understand what you are getting at. I wouldn't have thought that either is describable as a rate. Wouldn't both be discontinuous increases?
 
Make sure you're talking about the same kind of rate.

Give a poor person $1000 and you can change their life. Give a rich person $10,000 and they might not even notice. Which "rate" is higher?

I don't think I understand what you are getting at. I wouldn't have thought that either is describable as a rate. Wouldn't both be discontinuous increases?

You were already talking about rates - I just tried to give you an example where an accelerating rate has different effects based on the baseline value.

So, moving from 1 to 2 in computing power might be a much bigger deal than going from 100 to 200. That way, computers that are more powerful might become obsolete more slowly than earlier, weaker computers, even if Moore's law still holds...
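That baseline point can be made concrete with arbitrary numbers: each doubling has the same ratio, but the step you actually feel depends on where you start. In this toy sketch, "needs" is a hypothetical fixed workload a user cares about:

```python
# Same doubling ratio, very different absolute jumps.
# Once the baseline is far past the fixed workload, another
# doubling changes little in practice.
needs = 5  # hypothetical fixed workload (arbitrary units)
for baseline in (1, 100):
    doubled = baseline * 2
    print(f"{baseline} -> {doubled}: ratio 2, absolute gain {baseline}, "
          f"already enough before doubling: {baseline >= needs}")
```

The 1-to-2 machine crosses thresholds that matter to the user; the 100-to-200 machine was already more than adequate, so its doubling goes unnoticed - even though Moore's law holds equally in both cases.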
 
The original 1965 formulation was that the transistor density of integrated circuits doubles every year, but this was revised to every two years in 1975, which is rather odd for an immutable law that is 'a fact' and 'not up for debate'.

I said "roughly". This is obviously not a perfect law; humanity could have been wiped out by a meteor 20 years ago. The fact that the law's predictions have roughly held for 50 years is not controversial; stop looking for an argument.

None of the formulations of Moore's Law are absolutely accurate; ...

If you spent more time trying to understand what the other person is trying to say and less time on your critique, you would have noticed the all-important qualifier "roughly".

and exactly what form the Law should take, and how long it will continue to hold true (or even whether it still does hold true) is very much up for debate.

I agree with this, but I don't see how it's relevant to what I said.
If you don't see how '[it] is very much up for debate' is a directly relevant response to the assertion 'This is a fact; it is not up for debate', then there is no hope for you.
The main driver of the conformance of hardware to Moore's Law is that Moore's Law has been used as a benchmark for R&D, making it more a self-fulfilling prophecy than a law of nature.

But that is just your opinion. And I did not say anything about what drives the accuracy of Moore's law.

I really think it is important that we understand each other.

You need to stop making demonstrably false statements then.

I understand what you wrote. It was wrong. You said it was not up for debate, so the fact that you are debating it here is a clear demonstration that it is wrong.

If, as appears to be your defence, you meant something other than what you wrote - perhaps something that is not wrong - then any inability to understand each other would seem to result from your inability to communicate clearly. That is not something I can fix; it is up to you to deal with it, either by not saying things you don't mean, or not saying anything at all.
 
I don't think I understand what you are getting at. I wouldn't have thought that either is describable as a rate. Wouldn't both be discontinuous increases?

You were already talking about rates - I just tried to give you an example where an accelerating rate has different effects based on the baseline value.

So, moving from 1 to 2 in computing power might be a much bigger deal than going from 100 to 200. That way, computers that are more powerful might become obsolete more slowly than earlier, weaker computers, even if Moore's law still holds...

Yes, that is what I was trying to say to Jimmy Higgins. His observation is not a strong reason for the conclusion.
 
I said "roughly". This is obviously not a perfect law; humanity could have been wiped out by a meteor 20 years ago. The fact that the law's predictions have roughly held for 50 years is not controversial; stop looking for an argument.

None of the formulations of Moore's Law are absolutely accurate; ...

If you spent more time trying to understand what the other person is trying to say and less time on your critique, you would have noticed the all-important qualifier "roughly".

and exactly what form the Law should take, and how long it will continue to hold true (or even whether it still does hold true) is very much up for debate.

I agree with this, but I don't see how it's relevant to what I said.
If you don't see how '[it] is very much up for debate' is a directly relevant response to the assertion 'This is a fact; it is not up for debate', then there is no hope for you.
The main driver of the conformance of hardware to Moore's Law is that Moore's Law has been used as a benchmark for R&D, making it more a self-fulfilling prophecy than a law of nature.

But that is just your opinion. And I did not say anything about what drives the accuracy of Moore's law.

I really think it is important that we understand each other.

You need to stop making demonstrably false statements then.

I understand what you wrote. It was wrong. You said it was not up for debate, so the fact that you are debating it here is a clear demonstration that it is wrong.

If, as appears to be your defence, you meant something other than what you wrote - perhaps something that is not wrong - then any inability to understand each other would seem to result from your inability to communicate clearly. That is not something I can fix; it is up to you to deal with it, either by not saying things you don't mean, or not saying anything at all.

You're off on some unnecessarily rigorous tangent. Come back to reality, and think about what I am saying.

I was simply pointing out that history has shown us that Moore's law has been roughly accurate. Jimmy's observations amount to a vanishingly small argument against the recorded history of processing speeds.

It would be like Dodge disclosing that they have built more red trucks than black trucks, and me saying that it doesn't seem that way from what I have noticed on the streets. In this example, the question about the ratio of black Dodge trucks to red Dodge trucks is not debatable, barring some kind of inside knowledge.
 
I said "roughly". This is obviously not a perfect law; humanity could have been wiped out by a meteor 20 years ago. The fact that the law's predictions have roughly held for 50 years is not controversial; stop looking for an argument.

None of the formulations of Moore's Law are absolutely accurate; ...

If you spent more time trying to understand what the other person is trying to say and less time on your critique, you would have noticed the all-important qualifier "roughly".

and exactly what form the Law should take, and how long it will continue to hold true (or even whether it still does hold true) is very much up for debate.

I agree with this, but I don't see how it's relevant to what I said.
If you don't see how '[it] is very much up for debate' is a directly relevant response to the assertion 'This is a fact; it is not up for debate', then there is no hope for you.
The main driver of the conformance of hardware to Moore's Law is that Moore's Law has been used as a benchmark for R&D, making it more a self-fulfilling prophecy than a law of nature.

But that is just your opinion. And I did not say anything about what drives the accuracy of Moore's law.

I really think it is important that we understand each other.

You need to stop making demonstrably false statements then.

I understand what you wrote. It was wrong. You said it was not up for debate, so the fact that you are debating it here is a clear demonstration that it is wrong.

If, as appears to be your defence, you meant something other than what you wrote - perhaps something that is not wrong - then any inability to understand each other would seem to result from your inability to communicate clearly. That is not something I can fix; it is up to you to deal with it, either by not saying things you don't mean, or not saying anything at all.

You're off on some unnecessarily rigorous tangent. Come back to reality, and think about what I am saying.

I was simply pointing out that history has shown us that Moore's law has been roughly accurate. Jimmy's observations amount to a vanishingly small argument against the recorded history of processing speeds.

It would be like Dodge disclosing that they have built more red trucks than black trucks, and me saying that it doesn't seem that way from what I have noticed on the streets. In this example, the question about the ratio of black Dodge trucks to red Dodge trucks is not debatable, barring some kind of inside knowledge.

You made a direct claim that it was not up for debate.

And yet, here you are...
 