
John Carmack coded Quake on a 28-inch 16:9 1080p monitor in 1995 - ukdm
http://www.geek.com/articles/games/john-carmack-coded-quake-on-a-28-inch-169-1080p-monitor-in-1995-20110920/
======
ck2
Sony FW900, last, best CRT ever made (that was affordable).

23 inch 16:10 CRT, 1920x1200 - weighs nearly 100 pounds and draws 150 watts if
I remember correctly.

Originally like $2000, down to $300 at the end (refurbished).

Had a variable phosphor pitch, denser at the corners, and an internal CPU that
adjusted the corners to correct for the (Earth's?) magnetic field.

Only went to LCD when mine finally died, no one could repair it, and getting
another was out of the question because shipping prices had gone through the
roof.

There is a huge fan thread on them in [H]ardForum with lots of photos.

The colors on them are unbelievable.

~~~
watmough
Absolutely my favorite resolution, and it's what's currently set up on my HP
ZR24w, which I've also deliberately defocused a couple of notches to fuzz the
text up a bit.

In comparison to the $2,000 and $10,000 CRTs, this is a great monitor with an
IPS panel and a standard sRGB color gamut, at a price that's cheap for an IPS
panel: about $380 right now.

Yeah, I'm happy to be programming now, though 6 years ago was a great time for
'if you can haul it, you can have it' deals on CRTs.

~~~
telcodud
_Absolutely my favorite resolution, and it's what's currently set up on my HP
ZR24w, which I've also deliberately defocused a couple of notches to fuzz the
text up a bit._

What do you mean by "deliberate defocus?"

~~~
karavelov
My ZR24w has a sharpness control in the menu (image control section). Lower
levels make the image somewhat fuzzy. I guess that's what he is using.

~~~
Lagged2Death
Actually, the "sharpness" control on any type of monitor applies a deliberate
distortion that gives an _impression_ of a sharper image. The image isn't
actually sharpened at all; it works much like unsharp masking in photography.
Particularly on an LCD monitor with a digital video connection, a sharpness
setting of 0 is absolutely the way to go.
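
The effect is easy to see in a toy example. Here's a minimal 1-D sketch of
unsharp masking (purely illustrative; I'm not claiming any monitor's firmware
does exactly this). The overshoot and undershoot on either side of the edge
are the halo that reads as "sharpness":

    /* Unsharp masking on one row of grayscale pixels: output is the
     * original plus a scaled difference between the original and a
     * blurred copy. Illustrative sketch only. */
    #include <stdio.h>

    #define N 8

    static int clamp(int v) { return v < 0 ? 0 : (v > 255 ? 255 : v); }

    int main(void) {
        int src[N] = {30, 30, 30, 200, 200, 200, 30, 30}; /* an edge */
        double amount = 0.8;                 /* "sharpness" strength */

        for (int i = 0; i < N; i++) {
            int l = src[i > 0 ? i - 1 : i];
            int r = src[i < N - 1 ? i + 1 : i];
            double blurred = (l + src[i] + r) / 3.0; /* 3-tap box blur */
            int out = clamp((int)(src[i] + amount * (src[i] - blurred)));
            printf("%3d -> %3d\n", src[i], out);
        }
        return 0;
    }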

~~~
mseebach
I just played with the sharpness control on an HP Compaq consumer 24" display,
and the lower sharpness settings seem to actively blur the image. Why would
you possibly ever want that?

~~~
Lagged2Death
I believe that if you peer at the individual pixels with a strong magnifying
glass, you will see there is no blurring going on.

If you're using a recent version of Windows properly set up for an LCD
display, it does use sub-pixel rendering, a resolution-enhancement technique
(Microsoft calls it ClearType) that increases the resolution of text. Although
it can make text look slightly soft, most people agree that it's also easier
to read than text that hasn't been enhanced this way.

If your LCD is anything like mine, the fake "sharpness" it adds can undo the
benefits of the sub-pixel rendering. Maybe that is what you are seeing.
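
As a rough sketch of the sub-pixel idea (illustrative only; real ClearType
also filters across neighboring samples to limit color fringing): render glyph
coverage at 3x horizontal resolution, then map each triple of samples onto the
R, G, B stripes of one pixel.

    /* Sub-pixel rendering sketch: 3x-resolution coverage samples are
     * mapped to the R, G, B stripes of each output pixel. For black
     * text on white, each subpixel dims by its coverage. */
    #include <stdio.h>

    #define W 4  /* output pixels per row */

    int main(void) {
        /* hypothetical coverage samples at 3x resolution, 0..255 */
        unsigned char cov[W * 3] =
            {0, 0, 128, 255, 255, 255, 128, 0, 0, 0, 0, 0};

        for (int x = 0; x < W; x++) {
            int r = 255 - cov[x * 3 + 0];
            int g = 255 - cov[x * 3 + 1];
            int b = 255 - cov[x * 3 + 2];
            printf("pixel %d: rgb(%3d, %3d, %3d)\n", x, r, g, b);
        }
        return 0;
    }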

~~~
ghusbands
Many displays do indeed blur at the lower end of the sharpness scale, as
mseebach notes, and their 'no alteration' point is then normally about 10%
from the bottom. It's easily observed on edges that aren't text, even when
text is subpixel-rendered.

------
illumen
There's lots to learn from Mr Carmack, and other impressive programmers for
that matter.

He had/has an amazing talent for producing truly great work at great speed. Is
it just magic that he can do this? Or does he have techniques that help him?

It seems obvious, but great tools help a great craftsman. So can great
methods. He combined so many techniques from different disciplines.

What tools can a developer use today that propel them above what others are
doing? If you're just using a standard-issue MacBook at this point, your tools
are no better than what others are using.

His techniques for focusing on development tasks are also very useful. It
boils down to a bubble-sorted todo list, constantly refined and structured for
high throughput.

I also like his approach to C, and I learned a lot about being pragmatic and
keeping things simple. For example, his approach to file I/O: he would read in
a whole file at once rather than reading chunks at a time. This was at a time
when most file readers mixed file I/O in with their parsing code, making them
slower and more complicated.
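
In plain C, the whole-file pattern looks something like this (a sketch of the
idea, not id's actual code; error handling abbreviated):

    /* Slurp a file into one malloc'd buffer, then parse from memory.
     * Returns NULL on failure; caller frees the buffer. */
    #include <stdio.h>
    #include <stdlib.h>

    char *read_entire_file(const char *path, long *size_out) {
        FILE *f = fopen(path, "rb");
        if (!f) return NULL;

        fseek(f, 0, SEEK_END);
        long size = ftell(f);
        fseek(f, 0, SEEK_SET);

        char *buf = malloc(size + 1);
        if (buf && fread(buf, 1, size, f) == (size_t)size) {
            buf[size] = '\0';    /* handy for text parsers */
            if (size_out) *size_out = size;
        } else {
            free(buf);
            buf = NULL;
        }
        fclose(f);
        return buf;
    }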

His business techniques have always been amazing to watch too. Like how he
used blogging in the 90s to gain a wide audience, and releasing demos as
shareware. Finally, his move into spacecraft after his successful game
development career has been inspirational too.

~~~
dasil003
> _What tools can a developer use today that propel them above what others are
> doing? If you're just using a standard issue macbook at this point, your
> tools are not better than what others are using._

Depending on what you're doing, hardware is mostly a commodity at this point.
You need sufficient hardware speed to maintain flow, but that's about it.
Otherwise a big monitor (maybe several of them) and good keyboard / mouse /
ergonomics are all you need.

Workflows though, I think that's where the magic is. And a great workflow is
perfectly suited to the task at hand. The true master programmer optimizes the
workflow without descending into wasteful fiddling.

~~~
illumen
Yeah, customised workflows are great.

Resources can always help, though. An SSD vs. an HDD, 64GB of RAM compared to
1GB, 16 fast CPU cores vs. one slow core: all of these help with development.
Things like integration servers can help too. A testing rack of 30 mobile
devices is another useful thing. If you have 100 times the processing power of
a standard MacBook Pro, I can assure you a good programmer will be able to use
it to increase their productivity. Even something like a backup internet line
could save a day a year. Fast backup and restore tools could save another 1-5
days per year.

There's one script that helps a lot with improving productivity...

    sudo echo "127.0.0.1 news.ycombinator.com" >> /etc/hosts

~~~
captaintk
When run as a normal user, this will most likely fail: the shell performs the
redirect before sudo runs, so the append happens without root privileges. Try

    echo 127.0.0.1 news.ycombinator.com | sudo tee -a /etc/hosts

instead.

------
guelo
It's really sad how we seem to be stuck with the 1080p craze for monitors;
we've made negative progress in this area, and it's becoming really hard to
find higher resolutions. Which is weird, because computer marketing is
normally all about bigger numbers, to absurd levels, but the HDTV crap has
apparently trained everybody that 1080p is the ultimate in video. Luckily,
mobile seems to have dodged the HDTV bullet and is competing on DPI.

~~~
patrickyeon
I think the 1080p craze is driven by consumer demand. For the average consumer
(not a gamer, not a power user), the ultimate visual experience is an HD
movie. There is no need for better quality than HD (1080p), because you won't
find a source better than that, so the most sensible thing for a hardware
maker is to deliver 1080p at the cheapest price point possible, or to improve
in areas other than pure pixel count (colour accuracy, brightness, viewing
angle...).

It's hard to keep in mind, but the majority of buyers, and therefore the
majority of income, may not be like you.

~~~
typicalrunt
You're half right. But there's more to the story.

In the consumer's mind, there is no better quality than HD, because as far as
they know nothing better exists. Look at the way marketing speaks to the
average consumer: in marketing, perception is reality.

You may know what 1080p means, but most consumers have no clue what the '1080'
or the 'p' means, or how it relates to their viewing experience.

~~~
antihero
I think a lot of it is placebo, too. I bet people convince themselves they're
enjoying things more because they consciously know they're getting "higher
quality". It would be interesting to run a double-blind study to see how many
consumers could tell the difference between 720p and 1080p.

------
andrewf
I doubt this is Quake in 1995. Maybe WinQuake / QuakeWorld / GLQuake, or even
Quake2.

* It looks like Visual Studio on a post-Windows-3.x GUI, which means Windows 95 (unlikely) or Windows NT 4+ (1996 or later)

* Quake was developed on NeXTSTEP and DOS.

* John Carmack blogged (well, as close as you'd come to it in those days) that he was going to start looking at Win32 in the "near future" on Jul 1, 1996: [http://www.team5150.com/~andrew/carmack/johnc_plan_1996.html...](http://www.team5150.com/~andrew/carmack/johnc_plan_1996.html#d19960701)

~~~
unwind
It's Quake _2_; the image was in the other recent submission about the
Quake 2 source code analysis (<http://news.ycombinator.com/item?id=3018539>).

~~~
petenixey
That document shows an astounding rate of progress. It's when you read a
document like that that you realise just how impossible it is for a team of
mediocre developers to compete with a great team, or in this case a single
great developer.

------
cellularmitosis
Ugh. Dear intarweb, please stop trying to outsmart safari on iOS. It works
just fine as-is. We don't need your fancy-pants JavaScript-based paging
implementations.

~~~
thought_alarm
That's Onswipe, a company founded on the idea that mobile websites should
consume as much CPU, RAM, and network resources as possible, because your time
isn't valuable, your battery is overcharged, and you have unlimited bandwidth.

~~~
joshu
You forgot causing safari to crash.

------
pmjordan
_"I wonder what Carmack uses now? Whatever it is, he could probably have
several of them hooked up to a machine each running at 1920 x 1080 and still
come nowhere near close to drawing 180 watts."_

That's a little optimistic. The 27" and 30" TFTs that are becoming
increasingly commonplace consume upwards of 100W, at least at or near full
brightness, so you'd only need about two.

~~~
ukdm
I guess when you're talking about the 27" and 30" displays that's the case,
but 24" seems to be where the very low power use can be found. LG has a
display it claims consumes only 28 watts, so with an array of six of those
you'd still have some watts to spare for the InterView.

~~~
angrycoder
A 24" Dell UltraSharp uses 75W. I doubt he is using anything lower quality
than that.

~~~
morsch
That's true for the U2410. The new 24" Ultrasharp (U2412) uses an LED
backlight and has a typical power draw of 38W. Contrary to popular belief, LED
backlighting doesn't lead to a better picture. It is significantly more
efficient than CCFL backlighting though.

On a side note, and apart from the gains in efficiency, the new UltraSharp
doesn't seem like much of an upgrade, at least at first glance. What a
shame/thank god I don't have to buy a new one.

~~~
angrycoder
The 2412 isn't a newer version of the 2410; it's a budget version, and
significantly worse.

------
prawn
This would bring back some expensive memories for a few on HN I'm sure. I
remember, as a multimedia trainee in around 1996-97, buying a 21" NEC CRT for
$2,200 _second hand_. I couldn't give it away today, so it sits in the shed
along with a few other CRTs.

Seeing Frank Pritchard's CRT "sculpture" in Deus Ex: Human Revolution
certainly made me think...

------
spektom
Here's the answer to the question "what monitor does he use now?":

[http://twitter.com/#!/ID_AA_Carmack/status/11636594701461094...](http://twitter.com/#!/ID_AA_Carmack/status/116365947014610944)

------
biot
That reminds me of this: <http://xkcd.com/732/>

Speaking of which, what is the highest resolution monitor available today that
isn't outrageously expensive? Apple's 2560x1440 Thunderbolt/Cinema display is
nice. Any WQUXGA (3840x2400) monitors available like Toshiba's $18000 one[0]
but that don't come with a "medical imaging" price tag?

[0] [http://www.theinquirer.net/inquirer/news/1032529/toshiba-
lau...](http://www.theinquirer.net/inquirer/news/1032529/toshiba-
launches-22-inch-wquxga-monitor)

~~~
lusr
I picked up a pair of IBM T221s (22.2", 3840x2400 @ 48Hz) for around 900 USD
each. They work beautifully.

~~~
biot
Do you run both from one machine?

~~~
lusr
Currently no; waiting for custom LFH-60 <-> Dual-Link DVI cables.

A pair of these cables will allow one Radeon 6750 to power a T221 at full
3840x2400@48Hz resolution as two 1920x2400@48Hz displays in an Eyefinity
configuration if you hack some EDID values. I chose the Radeon 6750 because
it's the cheapest AMD card that has two dual-link DVI ports.

I chose AMD because Eyefinity supports framelock on a single card under
Windows 7, whereas it seems like nVidia Surround only supports framelock
across more than one card (which is inconvenient).

Framelock is necessary because Windows 7 doesn't have built-in support for
desktop spanning across the two 1920x2400 halves the way XP did. (I believe
this isn't an issue under Linux either.) Without framelock support you'd get
display tearing in the middle of the full 3840x2400 image.

It's a pity the nVidia route isn't an option: the AMD Catalyst drivers aren't
as good as nVidia's (mouse cursor corruption in 2011? WTF, AMD...), and it
seems the T221 was originally driven by nVidia cards, so you'd think the
engineering is there.

------
ohboy
Why is a 1080p monitor "amazing" for 1995? It was quite common for 21"
monitors to be 1600x1200, and 1920x1080 isn't a giant leap from that. I picked
up a cheap 21" CRT capable of 1600x1200 in the late 90s, and I'm no John
Carmack.

I think it's because that sounds amazing to average consumers, who were lucky
to have 1024x768 on a dot pitch better than 0.28, but why is this on geek.com?
Shouldn't most of their readers remember having large, heavy monitors in the
90s? Really makes me wonder what Matthew Humphries (author of that story) was
using in 1995.

CRTs weren't like LCDs: the image didn't push off the side if you set the
resolution too high. You could pretty much push them as far as they'd go,
until you couldn't read the text anymore or it became all vertical lines. Ah,
the good ole days...

------
chrissnell
I had a SGI 1600SW LCD and a Sony FW900 CRT on my desk in 1998. That was a
bitchin' desktop setup back then. The SGI required this special graphics card
from Number Nine, which ended up going bankrupt, thus ending driver
availability. Even by today's standards, the quality of the SGI display was
outstanding.

Still, one of the most expensive displays I've seen belonged to a dorm mate of
mine in 1993. He had a 20" CRT (ViewSonic maybe?) connected with four
component cables to a Matrox video card. I'm sure it would be laughable today,
but damn, in 1993 that thing was unbelievable.

------
awongh
Does anyone know what those glasses shown at the bottom are for? And that
strange-looking charging station thingy?

Also, the monitor, _by itself_ was $10k back in the day:
[http://www.thefreelibrary.com/New+Ultra-
Wide+Format+Monitor+...](http://www.thefreelibrary.com/New+Ultra-
Wide+Format+Monitor+for+Panoramic+Viewing.-a019381586)

~~~
jdabney
The glasses are LCD shutter glasses for 3D. They sync the shutter rate to the
video output using the box on top of the monitor. This is an old-school
version of what 3D TVs are doing now. We had a whole lab set up with active 3D
at work and it was truly amazing.
[http://en.wikipedia.org/wiki/Cave_Automatic_Virtual_Environm...](http://en.wikipedia.org/wiki/Cave_Automatic_Virtual_Environment)

~~~
dhughes
In the 1990s it seemed like every day there was some new Virtual Reality
device, or at least talk of one. People were waiting for their own holodeck,
but it never arrived and VR faded away.

------
sifi
Anyone know what machine he was using? I guess it was some SGI machine, but
the article didn't say the exact model.

~~~
mambodog
Not sure about the machine, but according to Romero[1], Quake (as with Doom)
was developed on NeXTSTEP 3.3, which they continued to use (later running on
Intel hardware) until 1996.

QuakeEd on NeXTSTEP: <http://rome.ro/uploaded_images/qe_dev-726646.gif>

[1] <http://rome.ro/2006/12/apple-next-merger-birthday.html>

------
ethank
I loved my Intergraph. I had a TDZ-2000, but for monitors I had a Sony
GDM-F500, which was amazing and weighed a ton.

For a bit I had an SGI 1600 LCD, but it required a special video card, which
died.

The Intergraph TDZ-2000 was a great computer though. I bought a floor model
after SIGGRAPH in 1999 or 2000. They were pricey:
[http://www.digitalvideoediting.com/Htm/Articles/intergraph_g...](http://www.digitalvideoediting.com/Htm/Articles/intergraph_gx1_reviewhtm.htm)

------
WalterBright
I have a hard time believing today how much code I wrote years ago on a 24*80
tty.

~~~
colomon
Heck, I spent my first four years programming on a 25*40 screen displayed on a
crappy old TV set... When I finally got 80 columns, it seemed like magic.

------
gallerytungsten
I had an Intergraph machine like that back in the day. Same giant case, except
in blue. I think that one was a dual Pentium II at 400MHz, running Windows NT
3.5 or so. Came with a fast drive array too, and a similar keyboard.

------
baddox
Was 1080p relatively unheard of in 1995? What about 16:9 displays in general?

~~~
patrickgzill
They were very expensive. A typical high-end CRT monitor might be 19-21", with
the ultra high end being 24". 1600x1200 @ 75Hz refresh was what you wanted,
but you usually had to use a lower color depth to achieve it.
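
A quick back-of-envelope shows why: a 1600x1200 framebuffer at 24-bit color
simply didn't fit in the VRAM of many cards of the era (the 4 MB comparison
point is my assumption, not from the thread).

    /* Framebuffer size = width * height * bytes per pixel.
     * At 1600x1200: 8 bpp ~ 1.8 MB, 16 bpp ~ 3.7 MB, 24 bpp ~ 5.5 MB,
     * so a 4 MB card had to drop below 24-bit color at this mode. */
    #include <stdio.h>

    int main(void) {
        long w = 1600, h = 1200;
        int depths[] = {8, 16, 24};

        for (int i = 0; i < 3; i++) {
            double mb = (double)w * h * depths[i] / 8.0 / (1024 * 1024);
            printf("%ldx%ld @ %2d bpp: %.2f MB\n", w, h, depths[i], mb);
        }
        return 0;
    }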

------
zandorg
I like the way it looks like his hands are a blur, for moving so fast.

~~~
schwap
I imagine that's a photographic issue given that his untouched mouse is also a
blur.

~~~
angrycoder
Nope, he is moving the mouse with his mind.

------
kenotic
I never cease to be amazed by Carmack.

------
swah
Also, Diet Coke.

------
zbuc
That was a $10,000 monitor back in the day. Seriously.

"The InterView 28hd96 Color Monitor is priced at $9,995 (U.S. List) and will
be available in May 1997."

Source: [http://www.allbusiness.com/government/government-bodies-
offi...](http://www.allbusiness.com/government/government-bodies-
offices/7292416-1.html#ixzz1Yb5jIIJj)

