please explain this to me...

Post by mnhenle » Wed, 17 Sep 2003 08:21:18


Is it true that video game consoles have had 64-bit processors for
ages, and even 128-bit processors since the PS2?

How different are these game console processors from regular PC
processors? Are they faster than 32-bit PC processors?

Why can't they make 128-bit PC processors?

Why are PCs lagging behind game consoles?
 
 
 

please explain this to me...

Post by Felger Car » Wed, 17 Sep 2003 10:10:31


Mike, if you walk into a software store, there are a gazillion programs
that run on x86 PCs, many of them highly useful. All you can get for a
game console is a few games.

Would you buy a car that could only be refueled with a special hose, and
that hose was only available in Fargo and Memphis? There is much to be
said for standardization.

 
 
 

please explain this to me...

Post by Keith R. W » Wed, 17 Sep 2003 10:54:30

In article <bqt9b.4688$UN4.2418@newsread3.news.pas.earthlink.net>, XXXX@XXXXX.COM says...


Hmm, sounds like HFC cars! ;-)

--
Keith
 
 
 

please explain this to me...

Post by Tony Hil » Wed, 17 Sep 2003 14:29:19

On 15 Sep 2003 16:21:18 -0700, XXXX@XXXXX.COM (Mike Henley) wrote:

<sigh> there should be a FAQ about this one somewhere.

No, not exactly. The XBox uses a 32-bit CPU (in terms of the size of
the general purpose registers and pointers). The PlayStation2's MIPS
core does have 64-bit (in places 128-bit-wide) registers, but its
pointers are still 32-bit. I don't know too much about the Gamecube,
but I believe that it is also 32-bit.

What IS true is that the graphics portion of game consoles has been
64-bit for some time now, and even 128-bit (possibly 256-bit in the
case of the XBox?). Similarly, graphics cards on PCs are 128- or
256-bit these days.


That depends. The XBox uses a chip that is essentially identical (on
the insides at least) to a regular PC processor: a 733MHz Celeron
(PIII core). The only difference between this chip and a PC Celeron
is that the XBox processor talks to the outside world at 133MHz,
while the PC Celeron only did so at 66 or 100MHz (though a very
similar PIII had a 133MHz bus speed, like the XBox).
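
To put rough numbers on what that bus speed difference means, here's a
quick back-of-the-envelope sketch in C. The 64-bit-wide front-side bus
is the standard PC figure of the era (as Brian notes later in this
thread); these are peak numbers only, real throughput is lower:

#include <stdio.h>

/* Peak front-side-bus bandwidth = bus width (bytes) x clock rate.
 * PC front-side buses of this era are 64 bits (8 bytes) wide and
 * move one transfer per clock, so MB/s = 8 x MHz. */
int main(void)
{
    const double bus_bytes = 8.0;                 /* 64-bit data bus */
    const double clocks_mhz[] = { 66.0, 100.0, 133.0 };

    for (int i = 0; i < 3; i++)
        printf("%3.0f MHz bus -> %4.0f MB/s peak\n",
               clocks_mhz[i], bus_bytes * clocks_mhz[i]);
    return 0;
}

That works out to 528, 800, and 1064 MB/s peak, so the XBox's 133MHz
bus has roughly double the memory bandwidth of a 66MHz PC Celeron.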

The Playstation2's main CPU, the "Emotion Engine", is built around a
custom MIPS core (the R5900); the older MIPS R3000 that powered the
original PlayStation lives on as the PS2's I/O processor. I don't
think any desktop PC chips ever used this processor family, but SGI
(back when it was still called Silicon Graphics) used MIPS chips in
their servers and graphics workstations about 10 years ago. Here's a
link to a description of one of those old systems with a bunch of
pictures:
http://obsolete.majix.org/computers/sgi.indigo/indigo3k.shtml

I don't know too much about the GameCube's processor, but I believe
it's a 32-bit PowerPC core, not entirely unlike the G3 processor of an
Apple Mac.


No, they are quite a bit slower, particularly now that the current
crop of game consoles has aged a bit while PCs keep moving ahead
every day. Even the XBox, which has the most computing power of
today's big three consoles, has a MUCH slower processor than all but
the cheapest PC processors being sold today.


They could, but it would be useless and SLOWER than a 32 or 64-bit
processor. When you go from a 32-bit to a 64-bit processor you lose
about 5-10% performance (if all else were equal, which it usually
isn't). Going to 128-bit would cost you another 10%+ in performance.
Larger pointers mean more data to transfer from memory to cache and
fewer pointers held in cache, both of which negatively affect
performance.
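
Here's a little C sketch of the pointer-size effect. The 256KB L2
cache size is just an assumed figure for illustration:

#include <stdio.h>

/* Wider pointers mean fewer pointers fit in cache, so pointer-heavy
 * data structures (lists, trees) get less benefit from the cache.
 * The 256KB cache size is an assumption, purely for illustration. */
int main(void)
{
    const unsigned cache_bytes = 256 * 1024;
    const unsigned ptr_bytes[] = { 4, 8, 16 };   /* 32-, 64-, 128-bit */

    for (int i = 0; i < 3; i++)
        printf("%2u-byte pointers: %6u fit in a 256KB cache\n",
               ptr_bytes[i], cache_bytes / ptr_bytes[i]);
    return 0;
}

So a hypothetical 128-bit chip holds only a quarter as many pointers
in the same cache as a 32-bit chip does.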

A major misconception about wider processors is that they can handle
twice as much data at a time. This is NOT true! A 64-bit chip can
NOT handle twice as much data at a time as a 32-bit chip; what it can
do is handle values twice as wide. A 32-bit chip can handle values of
up to about 4 billion, while a 64-bit chip can handle values of up to
about 10^19. It's extremely rare to need values bigger than 4
billion, and if you do, you're usually using floating point numbers
anyway, where you've pretty much always had the option of 64-bit or
even 80-bit values. It essentially never happens that you need values
bigger than 10^19, which is a REALLY friggin' huge number.
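
A quick C demonstration of those limits (standard C99, nothing
console-specific about it):

#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

/* Largest unsigned values: 32-bit tops out near 4.29e9, 64-bit near
 * 1.84e19.  Unsigned overflow wraps around, which the output shows. */
int main(void)
{
    uint32_t max32 = UINT32_MAX;
    uint64_t max64 = UINT64_MAX;

    printf("32-bit max: %" PRIu32 " (~4.29e9)\n",  max32);
    printf("64-bit max: %" PRIu64 " (~1.84e19)\n", max64);

    printf("32-bit max + 1 wraps to: %" PRIu32 "\n", max32 + 1u);
    printf("same sum in 64 bits:     %" PRIu64 "\n", (uint64_t)max32 + 1u);
    return 0;
}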

The reason we have gone from 8 to 16 to 32 and now 64-bit processors
is mainly that these chips can handle more memory. 16-bit chips could
only handle up to 64KB of memory without resorting to really ugly
hacks. A 32-bit chip can only effectively handle up to 4GB of memory,
though again those really ugly hacks have shown up in PC chips,
allowing them to access up to 64GB. 64-bit chips can handle up to
10^19 bytes of memory, which is a whole heck of a lot.
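
Those limits are just 2 raised to the address width, in bytes; a tiny
C sketch of the arithmetic:

#include <stdio.h>
#include <math.h>   /* compile with -lm on most systems */

/* Addressable memory = 2^(address width) bytes. */
int main(void)
{
    printf("16-bit: %.0f bytes (64KB)\n", pow(2, 16));
    printf("32-bit: %.0f bytes (4GB)\n",  pow(2, 32));
    printf("64-bit: %.3g bytes (16 exabytes)\n", pow(2, 64));
    return 0;
}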
 
 
 

please explain this to me...

Post by Brian Jone » Sat, 20 Sep 2003 04:25:19

On this day of our lord, 15 Sep 2003 16:21:18 -0700, XXXX@XXXXX.COM
(Mike Henley) quilled:


When they talk about 64-bit, 128-bit, etc. for consoles they are
talking about the graphics capability. My R9800 Pro is 256-bit, so it
way surpasses any console in that respect. And the data bus on PCs is
already 64 bits, and has been for years.
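
For rough perspective, here's a back-of-the-envelope C sketch of that
card's peak memory bandwidth; the ~340MHz DDR memory clock is my
recollection of the 9800 Pro's spec, so treat it as an assumption:

#include <stdio.h>

/* Peak video memory bandwidth = bus width (bytes) x transfer rate.
 * Assumed R9800 Pro figures, quoted from memory: 256-bit bus,
 * ~340MHz DDR memory, i.e. ~680 million transfers per second. */
int main(void)
{
    const double bus_bytes  = 256.0 / 8.0;  /* 32 bytes per transfer */
    const double mtransfers = 680.0;        /* DDR: 2 per 340MHz clock */

    printf("Peak bandwidth: ~%.1f GB/s\n", bus_bytes * mtransfers / 1000.0);
    return 0;
}

That's around 21.8 GB/s, some twenty times the front-side-bus figures
earlier in the thread, which is why the graphics side gets the wide
buses.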