On page 50, describing the GPU instruction set, you should note, right at the top, the instruction LOADP, which I quote:
"64-bit memory read. The source register contains a 32-bit byte address, which must be phrase aligned. The destination register will have the low long-word loaded into it, the high long-word is available in the high-half register. This applies to external memory only."
Not surprisingly, the companion instruction STOREP is described on page 56.
Now let's move on to the Blitter on page 64: "The tour de force of the Blitter is its ability to generate Gouraud shaded polygons, using Z-buffering, in sixteen bit pixel mode. A lot of the logic in the Blitter is devoted to its ability to create these pixels four at a time, and to write them at a rate limited only by the bus bandwidth"
which is also a 64-bit operation.
Finally, while one could argue that the Atari 800 conceptually had a GPU (if you consider four players and a missile to be such), did the CD32 have direct hardware-assisted support for Gouraud shading, texture mapping (and, if you were remotely clever, 64-bit texture mapping), and Z-buffering as the Atari Jaguar did? I don't think so, but I'm open to being corrected here. I do think that having such support backs the claim that the Atari Jaguar was the first console with a true GPU.
"64-bit memory read. The source register contains a 32-bit byte address, which must be phrase aligned. The destination register will have the low long-word loaded into it, the high long-word is available in the high-half register. This applies to external memory only."
Which is 32-bit addressing, not 64-bit. Exactly what was claimed in the comment you're responding to.
Sure, it's 32-bit addressing with 64-bit memory operations, but that said, I'm not aware of any video game console, even today, with 4 GB or more of memory, so it seems a bit pedantic to disqualify on that basis. Or are you saying there has yet to actually be such a console?
Or are you saying that if the chip squanders transistors so that it's capable of 64-bit addressing, but they're useless because there's less than a GB of memory on board, the addressing alone constitutes magic blue crystals of 64-bit legitimacy?
The horrendous handling of the Jaguar by Atari aside, and despite its unfortunate resemblance to a bedpan, it was a surprisingly powerful machine for the time, no harder to program than the ragingly successful PS2 (IMO, of course), and the best way to show it off was to make full use of its 66 MHz worth of instruction issue (2 × 26.6 MHz RISC processors plus a 13.3 MHz 68000) and its 64-bit memory operations. For example, the Jaguar version of Doom was the best of all the ports at the time: http://doom.wikia.com/wiki/Atari_Jaguar
All that said, I find reminiscing about video game consoles of the past to be a more positive experience than what this lawsuit has demonstrated about the state of patent law. Sure, Samsung copied Apple's look and feel. Why should this be illegal unless they literally slapped the word iPhone or iPad on their gadgets? Might as well have banned the Chevy Camaro for copying the Ford Mustang, IMO.
It is a 32-bit architecture by the way people measure 32-bit architectures. It is not capable of 64-bit addressing, either logically or physically. Having a wide bus to memory is not a qualifier for calling something 64-bit.
> "plus a 13.3 MHz 68000 and its 64-bit memory operations"
The 68000 does not have 64-bit memory operations. It has 32-bit addressing logically, but only 24-bit physically, which caused some problems for folks who did "clever" things on the Mac.
I'll go with the traditional definition of a GPU, which goes back to the Amiga for personal computers.