Accurate CRT Simulation (gamasutra.com)
272 points by jsnell on Mar 28, 2016 | 77 comments



In the analogtv.c source code by Trevor Blackwell (part of XScreenSaver) is this comment [1]:

    A maxim of technology is that failures reveal underlying
    mechanism. A good way to learn how something works is to
    push it to failure. The way it fails will usually tell you
    a lot about how it works.
To mimic the image on a 1970s-era colour TV, XScreenSaver uses an accurate DSP simulation of all the analogue circuitry, including bandwidth limits, noise, and distortion. A gigahertz processor is just capable of keeping up with this.

[1] In XScreenSaver: a collection of free screen savers for X11 and MacOS by Jamie Zawinski [2].

Edit: if you want to read the source code---highly recommended!---retype the address below by hand or you'll become acquainted with jwz's countermeasures.

[2] https://www.jwz.org/xscreensaver/xscreensaver-5.34.tar.gz
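For a flavor of what that DSP simulation involves, here is a minimal sketch (emphatically not the analogtv.c code; the function name is invented) of two of the artefacts it models, bandwidth limiting via a one-pole low-pass filter plus additive noise:

    #include <stdlib.h>

    /* Toy sketch, NOT analogtv.c: a one-pole IIR low-pass smears
       sharp horizontal transitions (limited bandwidth), and "noise"
       adds uniform static. cutoff is in (0, 1]: lower = blurrier. */
    static void degrade_scanline(float *sig, int n, float cutoff, float noise)
    {
        float acc = sig[0];
        for (int i = 0; i < n; i++) {
            acc += cutoff * (sig[i] - acc);                /* low-pass  */
            float r = 2.0f * rand() / (float)RAND_MAX - 1.0f;
            sig[i] = acc + noise * r;                      /* add static */
        }
    }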


Ironic that JWZ takes offense to the link since the code in question is by Trevor, who is a cofounder of YC.

Mirror here: http://web.mit.edu/ghudson/dev/nokrb/third/xscreensaver/hack...


I think he's had that HN redirect for years. He is blocking hotlinking directly to files.

Try https://www.jwz.org/xscreensaver/


No, he blocks/redirects all traffic with HN as referrer. Kind of a jerk move, which sucks because his site is full of useful stuff.


jwz is a smart guy, but ultimately ... a bit inflexible and jerkish. I used to argue with him over using C++ in xscreensaver (I wrote a couple of savers: glplanet, pulsar, and glextrusion). I wanted to write a full scene graph implementation, a physics library, and a bytecode VM to implement screensavers without having to write a ton of low-level C code. Unfortunately, jwz's constraints include "Every screensaver must be a single C file" and "no C++", which, while possibly wise when he first expressed them, have become a canard.


There's nothing wrong with asserting "no C++" in systems software, especially on *NIX. The stability of the C++ ABI has generally been terrible for a long time, and by avoiding C++ in a component that's heavily used, a lot of maintenance headaches are avoided.

As one of the maintainers of the C++ ABI for an operating system, I assure you, it is best avoided for system components unless there's a compelling justification.


xscreensaver is not a "system" component. Nor is it systems software. It's a game engine that only rarely takes user input.


Xscreensaver is a security boundary, and has been better at avoiding lock screen bypass hacks than many other screensavers.


It's a system component in the sense that it's typically distributed as part of an operating system. Not in the sense that it's a "kernel" or something like that.

As someone who helps build an operating system distribution: we consider Xscreensaver to be a system component.

Also, for licensing purposes, the GPL, as an example, considers software distributed with the system to be a system component in some scenarios. For example, "GPLv3 has adjusted the definition of System Library to include software that may not come directly with the operating system, but that all users of the software can reasonably be expected to have."


Your definition is weird. Which distribution is this?

xscreensaver is X11, not GPL.

Why does a C++ ABI really matter in a system distribution? All the distros I've used are internally consistent in their ABI and update all their deps when the C++ ABI is updated.


> Your definition is weird.

Not from a distributor's perspective ;-)

> xscreensaver is X11, not GPL.

Correct, but since it is distributed as part of the system, or is a component that most users will have, linking your GPL'd code against Xscreensaver's API, for example, is OK, because it's considered a "system library".

Really, I was referring to it more for purposes of definition than anything.

> Why does a C++ ABI really matter in a system distribution?

Because ABIs create a point of synchronization: internally, for that distributor, they create a flag day where a given set of components must be rebuilt at the same time.

Ideally, for a distributor, you don't rebuild the entire world every time one component changes. You only rebuild what you have to, since every delivery causes new binaries to be produced, which then in turn increases the amount of downtime for consumers that are upgrading.

As such, changing ABIs (thankfully GCC's doesn't change as much as it used to) create unnecessary churn, and if you can easily avoid them, you usually do (as a distributor).


What does an ABI matter here? Nearly all platforms that xscreensaver runs on produce their own self-consistent ABI and then link the apps against that ABI. When the ABI changes, the apps change with it.


An ABI matters here because it means that the Xscreensaver component would have to stay in step with the compiler component, which means you've created an extra build time dependency.

It also means that if a user had their own Xscreensaver module, it too would be required to be using the same ABI.

In short, there's no particularly compelling advantage and a lot of disadvantages for a widely used and distributed component where you want to maximize its availability on platforms.

C++ is just a tool; it's not always the right one. The maintainer has made implementation choices to maximize compatibility and portability. You have to remember that Xscreensaver is designed to be used and executed on a wide variety of platforms, some of which are relatively old.


What would be the problem with releasing all that stuff with some other name?

(he seems to port xscreensaver to all the things he uses, so I can understand not wanting to accept a huge pile of someone else's code in a different language than the rest of the package)


I am not an expert on licenses, but since jwz chose the X11 license, it may be possible to take the screensavers and release them under another name. I don't know what that would achieve.

I was able to port many of my screensavers to other platforms easily; in most cases, the savers just have two functions, setup() and draw(), the first of which does any one-time initialization and the second of which does all per-frame rendering. One can write a simple wrapper for any platform that initializes the GL contexts and handles user input, as well as any locking/unlocking code.
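A hedged sketch of that two-entry-point shape, with a GLFW host standing in for the per-platform wrapper (the GLFW choice and all names here are my own illustration, not xscreensaver's actual driver):

    /* Hypothetical host for a saver exposing only setup()/draw().
       A real wrapper would also handle locking and user input. */
    #include <GLFW/glfw3.h>

    extern void setup(void);     /* one-time initialization */
    extern void draw(double t);  /* all per-frame rendering  */

    int main(void)
    {
        if (!glfwInit()) return 1;
        GLFWwindow *win = glfwCreateWindow(800, 600, "saver", NULL, NULL);
        if (!win) return 1;
        glfwMakeContextCurrent(win);  /* the GL context the saver draws into */
        setup();
        while (!glfwWindowShouldClose(win)) {
            draw(glfwGetTime());
            glfwSwapBuffers(win);
            glfwPollEvents();         /* a real host would exit on input here */
        }
        glfwTerminate();
        return 0;
    }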

However, I don't think that would achieve much. Most of the savers were written in a very terse, nasty form to satisfy the various constraints the authors worked under.

I think it makes more sense to make NuSaver: the design I described (a scene graph, a physics engine, and a bytecode VM), then rewrite the good savers in the VM. In this case, I would probably just adopt an existing C++ scene graph (Qt has one built in), a physics engine (I adapted Box2D to Qt, but you need a 3D engine), and a VM (I would use JavaScript).


> I think it makes more sense to make NuSaver

Yes, this is what I meant. How big a justification does he need to refuse code when such a thing is possible?


It's not worth it. Most people just screenblank, so savers are mostly for nostalgia these days.


Sure, I assumed as much.

But my point is that maybe he has reasons other than being a jerk. If it isn't worth it to you to do it otherwise, why does the discussion start with it being worth it to him?


I suggest learning the full history of the thing which is jwz to understand why jwz is a jerk. He's amazingly smart, but extremely impatient and outright hostile to nearly every software engineer he encounters.

He's often right (see CADT) but in a way that is jerkish, and that's really limited the adoption of jwz code.


Correct. Even / redirects to imgur if the referrer is HN.


That redirects to imgur.


Copy and paste the url.


Looks like JWZ didn't like your linking, and it now redirects to an offensive(?) picture on imgur.


I assume it's temporary. For posterity, the image is http://imgur.com/32R3qLv


No, he's been blocking all links from HN for a while.



To open jwz links, right click > open in incognito window.

He blocks links coming from HN. I believe he always has.


No, it started after people posted jwz links on HN a few times and he didn't like the comments he got from HN users.


From the article: "As the electron guns sweep across the screen, varying their intensity to adjust brightness, they tend to overshoot their desired value and bounce back a short time later. This creates alternating vertical bands of light and dark seen at the edges of high contrast changes in brightness."

That's caused by a different effect. NTSC has terrible color bandwidth. The YIQ signal is really an I (intensity) signal at full bandwidth, and YQ signals with severely limited bandwidth (because, during the transition from black and white to color, they had to be stuck in subcarriers of the main black-and-white signal). NTSC Color TV is really greyscale TV with color tweaks superimposed.

You can only go through the YQ color space about 10 times across the width of the screen. The way to simulate this is to convert to YIQ, apply a low-pass filter horizontally to the Y and Q components, then convert back. Note that this is a horizontal issue only; in the vertical direction, every line can be a different color without problems.
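A minimal sketch of that simulation in C, assuming a simple box filter for the horizontal low-pass and the standard FCC YIQ matrices (note, per the terminology sorted out in the replies below, that Y is the full-bandwidth luma and I/Q are the band-limited chroma):

    #include <stdlib.h>

    /* Toy NTSC chroma-bandwidth simulation for one scanline:
       RGB -> YIQ, box-blur the two chroma components horizontally,
       then YIQ -> RGB. Matrices are the standard FCC ones. */
    static void ntsc_chroma_blur(float (*rgb)[3], int w, int radius)
    {
        float (*yiq)[3] = malloc(w * sizeof *yiq);
        for (int i = 0; i < w; i++) {
            float r = rgb[i][0], g = rgb[i][1], b = rgb[i][2];
            yiq[i][0] = 0.299f*r + 0.587f*g + 0.114f*b;   /* Y: luma   */
            yiq[i][1] = 0.596f*r - 0.274f*g - 0.322f*b;   /* I: chroma */
            yiq[i][2] = 0.211f*r - 0.523f*g + 0.312f*b;   /* Q: chroma */
        }
        for (int i = 0; i < w; i++) {
            float si = 0.0f, sq = 0.0f;
            int n = 0;
            for (int j = i - radius; j <= i + radius; j++)
                if (j >= 0 && j < w) { si += yiq[j][1]; sq += yiq[j][2]; n++; }
            /* luma keeps full bandwidth; only chroma is smeared */
            float y = yiq[i][0], ci = si / n, cq = sq / n;
            rgb[i][0] = y + 0.956f*ci + 0.621f*cq;
            rgb[i][1] = y - 0.272f*ci - 0.647f*cq;
            rgb[i][2] = y - 1.106f*ci + 1.703f*cq;
        }
        free(yiq);
    }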


fyi(q), I doesn't stand for intensity, but for in-phase https://en.wikipedia.org/wiki/In-phase_and_quadrature_compon...


Right, NTSC has luminance (Y), I and Q define a vector in the color space, and have much less bandwidth than the luminance signal.


> NTSC Color is really greyscale TV

My old CRT only supported PAL, so that's why NTSC inputs were displayed without color.


Both PAL and color NTSC are derived from greyscale television standards and represent luminance in a similar way. PAL uses a slightly different encoding scheme for color information that improves on some of the deficiencies of NTSC, and it uses a different color carrier frequency. Since the carriers are far enough apart, your PAL TV doesn't even "see" the NTSC color signal, so you just get greyscale rather than incorrectly decoded color.


Your old CRT is probably PAL-M, a variant of PAL. Was it made in Brazil? https://en.wikipedia.org/wiki/PAL-M#Compatibility


I (Scandinavian) have a few old TVs and monitors that are able to synchronize to the NTSC clock but are unable to decode NTSC colors. It's not uncommon; even my 1084S does it. As far as I understand this, PAL is just a color encoding standard, and the synchronization characteristics typically associated with it are broadcast standards, e.g. PAL-M, PAL-B, etc. If you have a TV that decodes PAL at 50 Hz, there is a good chance that it will decode PAL at 60 Hz, even outside the countries that had these broadcast standards.

For NTSC, since it's really only the timing and color encoding that differ, it's not uncommon to get a greyscale picture from NTSC input.


Back in the day, the joke was that NTSC stood for "Never Twice the Same Color".


In the civilized world we used SCART or JP-21 to achieve pure RGB and avoid most of the artefacts the author is so nostalgic about.

http://retrorgb.com/rgbguide.html

so no, our consoles didn't look like http://i.imgur.com/Pq2Yra4.png; they did indeed look pixel perfect: http://retrorgb.com/images/Link%20RGB%20Compare.jpg


SCART came fairly late. In the 80s we were using analogue UHF modulators connected over coax to the aerial socket of a TV. Here is the ZX81 circuit board for example: http://www.worldofspectrum.org/ZX81BasicProgramming/chap25.h...


According to Wikipedia, SCART was compulsory on French TVs from 1980.

(I'm not old enough to remember any TV connector before about 1997. By then, in the UK, only really cheap or old stuff lacked SCART.)


The USA never used SCART. It went straight from the UHF antenna to composite. The NES and SNES shipped with RF cables; the earliest to ship with composite was the N64.


The SNES has RGB output; all you ever needed was a cheap cable.

I wonder why there were no RGB inputs on US TVs; it's the most natural 240p/480i signal you can think of, going almost straight from the video chip to the CRT electron guns.


Well, the important thing is that you got a chance to be smug about your 30-year-old TV.


It's a bit of a smug response to people being "smug" about the "what indie developers think pixel art looked like" image, here: http://i.imgur.com/Pq2Yra4.png.

Like, yeah, maybe it looked that way on your NTSC TV; my pixel-art games looked pretty good on my monitor.


"Never Twice the Same Colour"

"Something exceedingly Contrary to the American Method"

"Perfection at Last" or "Picture Always Lousy"

(Is there one for DVB vs ATSC?)


This whole article is "smug about someone's really BAD 30-year-old TV".


If you think that lovingly describing the details of an obsolete technology is "smug," then you are seriously on the wrong website.

(BTW, did you notice the part where the author specifically says he's talking about NTSC, and other standards work differently? Because, to be honest, getting all nationalistic over obsolete TV connection standards is kinda weird.)


Indeed! Here are potato quality images I took of RGB signal on a decent CRT a few days ago: http://imgur.com/a/UIMCI

Whenever I see those artifact-loaded attempts, I think it's more a combination of a very bad CRT, cable, and RF modulator, with most of those in-signal artifacts coming from the RF modulator.


My BBC computer (1981) uses a 6-pin DIN digital RGB connector +5 V/0 V, 1 V p-p composite colour or monochrome video (link S39).


My only computer with a so-so video connection was a C64 over S-Video. Ever since then, everything was RGB straight to the TV (or 1084S): Amiga, Sega Mega Drive, PlayStation, the first Xbox, all pixel perfect with zero artefacts.

It's like someone in the US decided to once again screw the whole consumer base and remove an essential feature in order to create an expensive niche. Now that I think about it, what did people in the US use professionally to send video? Were there "professional broadcast quality" labelled TVs with component inputs in the eighties?


Professional video used Betacam

https://en.wikipedia.org/wiki/Betacam

When I started editing, BetaSP was the broadcast format.


This article explains how to get that on the Commodore 64, by wiring the A/V cable output to a SCART plug.

The results are indeed brilliant.

https://ilesj.wordpress.com/2012/03/30/c64-av-cable-theory-a...


That's not it; the C64 doesn't have RGB output. It has Y/C (S-Video) on a 4-pin mini-DIN connector, similar to a PS/2 mouse/keyboard plug; think old VHS camcorders.

It's OK, and it removes most of the composite (CVBS) garbage like dot crawl, but it's still nowhere near RGB.


I've done it recently, but growing up every TV seemed to have SCART connectors and nothing used the RGB pins. All the consoles I owned were connected over RF or composite, much like everyone else I knew.


Some claimed that the low refresh rate of the screen caused it to flicker, but that wasn't true as long as the screen had a slow enough phosphor. With older CRT displays it was entirely possible to read the text from the screen three seconds after turning the display off or giving a clear-screen command.

You could also play with magnets on the screen; not all screens had a great degauss feature, so a little magnet play was fun.

It's also important to note that CRTs aren't all equal. A regular 70s color TV was very different from the high-end 21" Trinitron CRTs used before flat panels became popular. Here you can see one nice CRT: https://www.youtube.com/watch?v=IweoaOb53XQ

One of the reasons many computers didn't use an 80-character line length was that TVs didn't produce a sharp enough image, making 80-character lines impossible to read. Of course, it was also possible to modify displays to produce a sharper image. With the Spectravideo, you could use a separate card to get a sharper image and 80 columns: http://www.samdal.com/SVIDOCS/STM-I_SVI806.pdf


Funnily, my family still uses a Trinitron daily – it’s our youngest and largest TV. Around 30", amazing quality.


Jeeze, that thing must weigh as much as a VW Bug.


There were CRT-based 1080i 16:9 Trinitron HDTVs available (briefly) during the early days of HDTV. My parents had a 34-inch model. Heaviest television set I've ever had the displeasure of moving. Definitely a two-man lift.

Awesome picture quality, though.


Yeah, I had one of these. It was great for its day, but definitely far too heavy to be practical. Having that device carted out of my house (the mover said he could have carried it on his back, but they used two people and then nearly dropped it anyway) was one of my greatest days.


At first I thought the curvature was exaggerated, but then I looked at some pictures and it was actually even worse.

This reminds me of the first time I used one of those flat CRTs; I was so used to the convex curve that the flat one looked concave.


Well, anything Trinitron-based didn't have nearly as much curvature for technical reasons, so if you always used higher-end sets and monitors you might never have had one with this kind of curvature.


If you're on Mac (or iOS), I highly recommend the Cathode app[1]. It's a terminal in the style of old CRTs, and I always get a kick out of using it.

[1]: http://www.secretgeometry.com/apps/cathode/


Cathode is excellent, and it makes working in the terminal so much more enjoyable. There are so many other little touches as well; you can include the sound of an old hard drive whirring up on startup, for instance.

I've been trying to find something similar for my Windows 10 machines, but no luck. I wish they'd port it across, but failing that I'd love to hear recommendations for a Windows app like Cathode.


And its free text-editor cousin: https://itunes.apple.com/us/app/blinky/id550873221?mt=12

Very cool and pretty fun.


Fun, thanks. Noting that 'POKEY' in the 8-bit Atari emulation was actually the sound/analog-input chip; ANTIC was the display chip.


Very similar to Cathode, and free: https://github.com/Swordfish90/cool-retro-term


Thanks! Just tried it with the C64 font on a curved ultrawide monitor:

http://matracas.org/tmp/cool-retro-term-C64-font-Dell-U3415W...


Haha. This reminds me of "Mother" in the Alien movie. Supercomputer, yet retro.


That made my day :)


I love this topic and compiled a slew of related links last year: https://chadaustin.me/2015/11/crts-pixels-and-video-games/


This is a fun and interesting article!

There are a few minor errors in this section:

> Cathode ray tube displays work by repeatedly sweeping an electron gun back and forth across the extents of a fluorescent screen. The strength of the electron beam may vary during this sweep, and this in turn affects the brightness of the phosphors it illuminates. In early black-and-white models, only a single electron gun was necessary; with the introduction of color TVs, three electron guns were used, each tuned to a different phosphor, which would appear as the color red, green, or blue.

> The points of light created by this interaction are too diffuse to produce a sharp image, so a shadow mask is used to focus the electron beams before they reach the screen. A shadow mask consists of a metal plate with holes or apertures designed to filter out unwanted electrons. The shape of the shadow mask and the configuration of its holes vary by model and contribute greatly to the characteristics of the resulting image. The shadow mask also defines the dot pitch of the display, effectively limiting its highest possible resolution. As we will see later on, the dot pitch of the display does not necessarily correspond to the resolution of the displayed image.

The three electron guns in a color CRT aren't tuned to the different color phosphors, they are physically positioned so that the holes in the shadow mask or the gaps in the aperture grille block the beams from hitting the unwanted color phosphors and only hit the phosphor that each beam should light up.

In a shadow mask CRT, there is a triad of phosphor dots (or stripes in newer TVs) for each hole in the shadow mask. The electron beam from each gun goes through this hole at a different angle, so it only hits the correct phosphor and is blocked from the other two.

An aperture grille CRT uses a similar concept, but instead of mask with holes in it and triads of dots, it has a series of parallel vertical wires and vertical phosphor stripes serving the same function.

The mask or grille is not there because the electron beam is too diffuse to produce a sharp image. A monochrome CRT has no shadow mask or aperture grille, and with proper focusing it can be plenty sharp. The mask or grille is there simply to block the beams from hitting the wrong color phosphors.

More information is in the Wikipedia articles on shadow masks and aperture grilles:

https://en.wikipedia.org/wiki/Shadow_mask

https://en.wikipedia.org/wiki/Aperture_grille
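To make the stripe/triad selection concrete, here is a toy sketch (an aperture-grille-style stripe layout; real hardware does the selection with beam angles and mask geometry, not array indexing):

    /* Toy aperture-grille model: phosphor stripes repeat R,G,B across
       the scanline, and each gun's beam only ever reaches its own
       stripe, so at any given stripe exactly one channel emits light.
       Illustrative only -- the real selection is done by beam angle. */
    static void render_grille(const float (*beam)[3], float (*out)[3],
                              int w, int stripes_per_pixel)
    {
        for (int s = 0; s < w * stripes_per_pixel; s++) {
            int src = s / stripes_per_pixel;   /* source image pixel    */
            int c   = s % 3;                   /* phosphor: 0=R,1=G,2=B */
            out[s][0] = out[s][1] = out[s][2] = 0.0f;
            out[s][c] = beam[src][c];          /* only this gun lands   */
        }
    }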


One thing with wire-based aperture grilles: tapping the side of the monitor will make the wires vibrate, throwing off the colors. So, if you want to emulate a Trinitron display really, really well, you need an acceleration sensor and, for good measure, a microphone (to detect loud noises at the right frequency to make the wires vibrate).

I also have a question some reader might be able to answer: if you do the simple implementation, (at least) two thirds of your electrons will hit the aperture mask. To improve power efficiency, you would need finer control over the video signal. Did any television ever do that? Or did they already have to do it to prevent the electron beams for the different colors from interfering with each other?
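On the tap-vibration idea above: if you wanted to fake it, one plausible sketch is to offset the grille stripe phase by a decaying sinusoid after an impulse (the 30 Hz wobble frequency and decay constant here are invented for illustration, not measured from any Trinitron):

    #include <math.h>

    /* Hypothetical tap response: after an impulse at t = 0, return a
       horizontal phase offset for the grille stripes that rings at
       roughly 30 Hz (6.2831853f = 2*pi) and dies out in under a second. */
    static float grille_offset(float t_since_tap, float strength)
    {
        if (t_since_tap < 0.0f) return 0.0f;
        return strength * expf(-4.0f * t_since_tap)
                        * sinf(6.2831853f * 30.0f * t_since_tap);
    }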


> they are physically positioned so that the holes in the shadow mask or the gaps in the aperture grille block the beams from hitting the unwanted color phosphors and only hit the phosphor that each beam should light up.

Moreover, if you physically rotate the display, then the colour changes because that alignment is thrown off—by Earth’s magnetic field, IIRC.


Thanks for the explanation, that was bugging me.


Maybe it's just me, but I think the old CRTs looked horrible, and I find games and other applications which strive to mimic them as closely as possible extremely visually unpleasant.

Sure, there's the nostalgia factor, but 20 seconds later I'm over it. If you have a game or something that has a CRT emulation mode, let me turn it off.


I looked into early camera tubes recently; it's amazing that they work at all :-)

The "image orthicon" https://en.wikipedia.org/wiki/Video_camera_tube#Image_orthic... can only be described as a Rube-Goldberg device.


NTSC - Never The Same Color


I'm seeing a game development article on the front page almost every time I check HN now.

How many of you are making games?



