

When do you think we'll get to 1m (or more) cores in a single CPU?
Never: 23% [ 5 ]
They already exist in Area 51: 9% [ 2 ]
Before 2020: 5% [ 1 ]
The 2020s: 5% [ 1 ]
The 2030s: 27% [ 6 ]
The 2040s: 0% [ 0 ]
After 2050: 32% [ 7 ]
Total votes: 22

NeoPlatonist
Deinonychus

Joined: 21 Nov 2006
Age: 38
Gender: Male
Posts: 356
Location: Indiana

30 Jan 2007, 12:22 am

Right now I don't see much reason to have more than 8 cores in a home computer. There is only so much parallelization and/or multitasking a person can do. Of course, predicting the future of computing has produced some statements that, in hindsight, are hilariously wrong. If you really wanted to, you could buy a 16-core server from Sun (8 dual-core Opteron procs): the Sun Fire X4600. Only $20,000....


_________________
~Michael


Jameson
Veteran

Joined: 3 Jan 2007
Gender: Male
Posts: 2,877
Location: Here and there.

30 Jan 2007, 10:00 am

NeoPlatonist wrote:
Right now I don't see much reason to have more than 8 cores on a home computer.


Home computers aren't the main market for multi-core systems; servers are. Servers need to be able to handle many connections at once, and that's a bit easier to accomplish with more CPUs.
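A minimal sketch of that point: a server farms connections out to a pool of worker threads, and with more cores/CPUs the OS can actually run those handlers in parallel. The names here (handle_connection, the counts) are purely illustrative, not a real server.

```python
# Toy illustration: a thread pool services many "connections" at once.
# On a multi-core machine the OS can schedule these workers in parallel.
from concurrent.futures import ThreadPoolExecutor

def handle_connection(conn_id):
    # Stand-in for real per-connection work (parsing, I/O, etc.)
    return f"handled connection {conn_id}"

with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(handle_connection, range(32)))

print(len(results))  # 32 connections serviced
```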


_________________
Air·is·water·with·holes·in·it. Think·honk·if·you're·a·telepath. Never·call·a·man·a·fool.·Borrow·from·him. A·tautology·is·a·thing·which·is·tautological. Hi!·I'm·a·.signature·virus!·Copy·me·into·your·~/.signature·to·help·me·spread!


Run
Hummingbird

Joined: 3 Sep 2005
Gender: Male
Posts: 22
Location: Amsterdam

07 Feb 2007, 11:31 am

You can't use "Moore's law" because there is no such law.

Originally, Moore observed that the "chip complexity for minimal cost"
had roughly doubled every year. His guess was that this would probably
continue for about the next 10 years (1965-1975), so that by 1975 the
number of components *at minimum cost* would reach roughly 65,000,
which he believed would still fit on a single wafer by then.

Moore's law is not about just the density of transistors that can be achieved,
but about the density of transistors at which the cost per transistor is the lowest.
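The arithmetic behind that extrapolation is simple compound doubling. A quick sketch (the starting count of ~64 components in 1965 is an assumption for illustration; Moore's 1965 graph did end near 2^16 = 65,536 for 1975):

```python
# Moore's 1965 extrapolation: component count (at minimum cost per
# component) doubling at a fixed interval.
def components(year, base_year=1965, base_count=64, doubling_years=1.0):
    """Projected component count after repeated doublings."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

print(components(1975))                       # 65536.0 with yearly doubling
print(components(1975, doubling_years=1.5))   # far lower if doubling takes 18 months
```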

This observation became a self-fulfilling prophecy: because the chip companies
believed they would need twice as many transistors per wafer the next year in
order to compete effectively on the market (that is, get the most chip for
their manufacturing money), they constantly funded research to reach that goal.

Despite their efforts, they couldn't keep that pace, and the "law" was
adjusted to a doubling every 18 months. It is amazing that this has continued
for so long - but eventually a physical limit will have to stop the *desire*
to follow Moore's "law".

In the meantime, it has become apparent that transistors that are
half the size allow one to make computers roughly twice as fast.
So people started talking about Moore's "law" as if computers would
be twice as FAST every 18 months for decades to come -- that has NOTHING
to do with Moore's original statement, but fine - it could be true, so why not.

There is little reason to believe it is a law, however: 1) it has to be
physically possible to begin with, and moreover 2) there has to be a
commercial driving force to try to achieve it.

Don't take that for granted. They stopped with moon landings too...
It will remain physically possible to build computers that are twice as fast
for a lot longer than it will remain commercially interesting to do so.

Especially when you realize that, in order to finance all of this, we needed
the boom of the PC market. That market, some people say, is stalling, if not
dying. There is hardly any growth anymore in the number of PCs sold.
(If we have to thank Microsoft for anything, it's that they have been a
driving force for people to buy new hardware, in this regard.)

Actually, my observation is that in the past years computers have NOT become
twice as fast even every 2 years - let alone every 1.5 years. We are running
up against the physical limit: a higher density of transistors is hardly
possible anymore - and simply increasing the number of transistors is no
longer attractive because their power consumption becomes an issue (which it
wasn't before).

Nowadays when companies buy a supercomputer, they look at performance per
watt, not per CPU. It doesn't take an IQ of 165 to see that simply doubling
the number of cores (while you can no longer shrink the transistors much)
is definitely not the way to go.
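The performance-per-watt argument can be made concrete with a toy model (the numbers below are made up for illustration, not real hardware data): if each core draws a fixed amount of power, doubling the core count doubles throughput AND power draw, so efficiency per watt does not improve at all.

```python
# Toy model: doubling cores at fixed per-core power leaves
# performance per watt unchanged.
def perf_per_watt(cores, flops_per_core, watts_per_core):
    return (cores * flops_per_core) / (cores * watts_per_core)

base = perf_per_watt(4, 10e9, 20)      # 4 cores
doubled = perf_per_watt(8, 10e9, 20)   # 8 cores, same per-core power
assert base == doubled  # more cores alone buy no efficiency
```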

I'm not saying that computers won't eventually get a million times faster, but
it won't be by putting 1,000,000 silicon-based cores on a single chip, that's
for sure.