

Microsoft beats Intel, AMD to market with CPU/GPU combo chip - Halienja
http://arstechnica.com/gaming/news/2010/08/microsoft-beats-intel-amd-to-market-with-cpugpu-combo-chip.ars?utm_source=rss&utm_medium=rss&utm_campaign=rss

======
dman
Fun excerpt - "It would have been easier and more natural to just connect the
CPU and GPU with a high-bandwidth, low-latency internal connection, but that
would have made the new SoC faster in some respects than the older systems,
and that's not allowed."

~~~
anigbrowl
Well, figure that game publishers will have some of their animation/game
timing routines keyed to the specifications of the original model. If the new
one has more juice and programmers start taking advantage of it, Microsoft
would probably get the blame for those titles' failure to run well on the
older models.

I was actually surprised at how smoothly Sony managed to phase out the PS2
backward compatibility (to lower costs) from the PS3 after the first year or
two. I guess it helps that you can pick up the older console for something
ridiculous like $75 if there's some classic old title you can't give up.
Incidentally, word is that development has already begun on PS4 games, so you
know what to ask Santa for in 2012.

~~~
malkia
No they don't.

First of all, we all run different compilation modes - debug, release, ship,
etc. A lot even do LTCG (Link-Time Code Generation) - i.e. full global
optimization.

So there is no such thing as relying on exact timings.

If templates/inline members are heavily used, the difference between debug and
optimized code can be anywhere from 5x to 10x.

Normally straight C code is only 2x-3x slower, unless AltiVec is used (and
even then only in a few tight places - skinning, DSP processing, etc.)

So no one ties anything to CPU. It's unreliable. Don't forget that it's not
only your stuff that's running.

If you are on Dolby Digital, then there will be some kind of encoder running
alongside your game.

If you are making a multiplayer game, it could be that you are the one
running the server (obviously built so that no one else knows).

If you are running at a different resolution - same thing - different timings.

If you are doing splitscreen - yup, another change.

And if you want to do real 3D - one more....

And many more (HDD presence or not - e.g. caching on the HDD or not).

At the end of the day, you try to hit 30fps or 60fps, plus some internal tick
rate for the server (if it's a multiplayer game). If you hit those, you are
just fine.

No one relies on what you've just said.

That was done only on the really old machines, where you had to synchronize
with each horizontal scan line to get special effects (more colors, or more
stuff being drawn).

Almost nothing since 1996 works like that.

Even the PS1 had different speeds, and it was a TRC violation to rely on that,
because Sony were smart and knew they would have to emulate it in the future,
and exact, proper emulation is not always easy.

~~~
anigbrowl
You clearly deal with this regularly whereas I'm just observing from the
sidelines, so I'm happy to be corrected. I extrapolated from the single-task
game/home computers of the 80s, where a lot of those other considerations
simply didn't exist and hooking into hardware clocks etc. was considered
acceptable if it delivered something to the user. You can see the downsides of
this with emulators, where some games run unplayably fast unless you use a
command-line switch to throttle the emulator speed to the original ~30fps or
similar. Sinclair Spectrum games are particularly prone to this. Now I feel
old :-)

I didn't realize today's console manufacturers enforced such strict discipline
on programmers/publishers; I guess breaking the terms means losing access to
the development platform completely?

On a side note, since you're in the profession, why do consoles have such
small amounts of RAM? IIRC the PS3 has 256MB each of main and GPU RAM, which
is less than the (cheap) graphics card in my PC. I'm perplexed as to why one
would go with less than a gigabyte, given the low cost of memory.

~~~
malkia
I don't think breaking the TRC would mean taking development rights away from
you. All you have to do is fix the violations and resubmit (each submission
to the manufacturer costs some money).

Violations are usually user interface problems - not correctly handling a
game controller being unplugged, a disk I/O error, or a network
disconnection. Some requirements are strict (for example, you should have
something on the screen within N seconds of the disc being inserted), some
are suggestions for better integration with the system.

You also have to display how big your save games will be :) (not sure if this
is still a requirement everywhere, but it used to be for PS1/PS2).

As for RAM - well, I guess when the console was first produced, RAM costs
were really high. I'm not a hardware guy, but eventually the manufacturers
squeeze that expensive-to-produce console down into something very cheap, and
having less memory makes that easier. Just a guess - I don't have much
knowledge in this area.

But having only 256MB of RAM is not that bad - we often don't use malloc/free
or new/delete, or we have them working on specific memory blocks (for
example, to keep a recent cache of collisions), usually with same-size
allocations.

Not having much RAM means you have to be more inventive - you can find
savings in different ways - streaming, layering, or other techniques. id's
recent tech is just awesome in that respect.

Although I said we have different compile options, the HW stays the same :)
This is both good & bad - good because you can become a real expert in it,
bad because by the time you are, the next console arrives.

Usually that's why the first games suck, and the games released in the 2nd or
3rd year are just awesome, and the last ones released might be better than
the first ones released for the next generation - God of War for the PS2 was
one such example. There was nothing that good in the same genre on the PS3 at
the time.

Metal Gear Solid for the PS1 also shook up the graphics quality. It's still
an awesome-looking game even for the PS1. And this was on 2MB of main memory,
512KB of video memory (or was it a little less or more), and - I can't
remember now - 256KB or 512KB of audio memory.

I worked on the port of MGS to PC (mgspc.com), and was amazed at how good the
Japanese coders were, and at the way the music was composed - it was
MIDI/MOD-like but really, really well done (we cheated on the PC and released
it as .wav files - no mixing).

What I'm saying is that if you have a few small digital samples and some
notes/tracks, you can compose music pieces that would otherwise take
megabytes of memory (ADPCM compression was only 1:4 with low quality, unlike
1:10 or more for MP3 and others).

~~~
anigbrowl
Thanks a lot - I learned more from that than from 6 months of reading console
blogs!

