How much information can you convey with a 1 bit interface? (justindunham.net)
35 points by zw123456 1607 days ago | 20 comments



I'm not sure how this is a 1-bit interface; it seems more like '1 bit per minute'. If you can arbitrarily choose the unit of time over which each bit is realized, you can build almost anything. For example, even the highest-end '24-bit' ADC and DAC chips around are often 1-bit supersampling (sigma-delta) architectures. You can communicate through human speech with a 1-bit interface, for some definition of a 1-bit interface.

I'd also add that random combinations of flashing lights are horrible from a UX standpoint. Has anyone tried programming a universal remote that only communicates via flashing lights?


Answer: 1 bit + time can convey any and all data; this is better known as serial communication. You just have to know the encoding.
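
A minimal sketch of the idea in Python (the framing here, one start bit, 8 data bits LSB-first, one stop bit, is just the classic UART convention, not anything from the article):

    # Toy "1 bit + time" serializer: turn bytes into a timed sequence of
    # high/low levels on a single line, roughly what a UART does.
    BIT_TIME = 0.001  # seconds per bit; both ends must agree on this

    def serialize(data: bytes):
        """Yield (level, duration) pairs for a single-wire line."""
        for byte in data:
            yield (0, BIT_TIME)                    # start bit
            for i in range(8):                     # data bits, LSB first
                yield ((byte >> i) & 1, BIT_TIME)
            yield (1, BIT_TIME)                    # stop bit

    for level, duration in serialize(b"hi"):
        print(level, duration)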


1 bit + time == Morse code.

I'd say you can convey any data with just 1 bit: imagine Morse code, but with only dots. It would be tedious, though (if you wanted to do it by hand).
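
One way to read "Morse code but only with dots": let the count of dots between pauses carry the data. A toy sketch (my own made-up encoding, just to show that it works and why it would be tedious):

    # Toy "dots only" code: each byte is sent as (byte + 1) dots followed
    # by a pause. One symbol plus time is enough, just painfully slow.
    def encode(data: bytes):
        for byte in data:
            for _ in range(byte + 1):
                yield "dot"
            yield "pause"

    def decode(events):
        out, count = bytearray(), 0
        for ev in events:
            if ev == "dot":
                count += 1
            else:                # a pause ends the current byte
                out.append(count - 1)
                count = 0
        return bytes(out)

    assert decode(encode(b"ok")) == b"ok"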

But if you think of it as a series, then yes, time is part of the channel.


Morse code is a form of encoding for a low bandwidth serial communication format.

Written in dot-dash form, the length of each symbol represents time.


Not necessarily. 1 bit + time also gives you PWM.
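
For example (a toy sketch, not tied to any particular hardware): a value between 0 and 1 can ride on a single digital line as a duty cycle.

    # Toy PWM: encode a value in [0, 1] as the fraction of a fixed period
    # that a 1-bit line spends high; the receiver just averages.
    PERIOD_SLOTS = 100  # time slots per PWM period

    def pwm_period(value: float):
        high = round(value * PERIOD_SLOTS)
        return [1] * high + [0] * (PERIOD_SLOTS - high)

    def recover(period):
        return sum(period) / len(period)

    assert abs(recover(pwm_period(0.37)) - 0.37) < 0.01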


Simply another form of encoding.


This isn't all that abstract or theoretical a problem. There are lots of devices (printers without LCD screens, or routers, for example) which use one or two lights, possibly with different color options, to display a bunch of different possible errors. Often this simply means creating a "key" for the interface (i.e. check the manual to look up what two blinking red LEDs mean, or what one steady green and one blinking red means), which is admittedly not fun for the user, but it's also damn hard to make things intuitive with so little resolution (literally!) to convey your info.
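
Something like this hypothetical key (the patterns and errors below are made up purely for illustration):

    # Hypothetical blink-code key of the sort those manuals contain:
    # each (LED 1 state, LED 2 state) pair maps to one error.
    BLINK_CODES = {
        ("blinking red", "blinking red"): "paper jam",
        ("steady green", "blinking red"): "low toner",
        ("steady green", "off"):          "ready",
        ("blinking red", "off"):          "cover open",
    }

    def lookup(led1: str, led2: str) -> str:
        return BLINK_CODES.get((led1, led2), "unknown pattern; see manual")

    print(lookup("steady green", "blinking red"))  # low toner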


My favorite example of this is how the Linux kernel can actually give you Morse code through the keyboard lights (Num Lock, Scroll Lock, Caps Lock) during a kernel panic [0].

[0]: http://www.linuxtoday.com/developer/2002072200326NWKNDV


Cool. There is a scene in Neal Stephenson's Cryptonomicon where the protagonist programs his keyboard lights to play back messages to him in Morse code. I had no idea it had been implemented IRL.


Back in the early days of CHDK (custom firmware for Canon cameras), the original firmware of new cameras was dumped by writing a tiny loader that looked like a firmware update program to the camera. You would put that loader on a memory card, run it via the firmware upgrade menu option, and the loader would read the original firmware out of memory and blink an LED on the camera over time to send out all of its bytes. A computer would then capture that signal with a photodiode connected via a serial port or sound/mic input, resulting in a binary dump of the camera's original firmware on the remote computer, where it could be disassembled, reverse engineered, and (eventually) extended with all sorts of functions the camera didn't ship with.

http://chdk.wikia.com/wiki/Obtaining_a_firmware_dump
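
A sketch of the general idea (illustrative only; the actual CHDK blinker protocol and timings are documented on the wiki above): send each byte as 8 LED pulses where the pulse length encodes the bit, and decode on the PC side from the captured pulse durations.

    # Illustrative LED dump scheme (not the real CHDK protocol): a short
    # flash is a 0, a long flash is a 1, with a dark gap between pulses.
    SHORT, LONG, GAP = 0.005, 0.015, 0.005   # seconds, arbitrary choices

    def led_pulses(firmware: bytes):
        """Yield (led_on, duration) pairs for the whole dump."""
        for byte in firmware:
            for i in range(7, -1, -1):            # MSB first
                yield (True, LONG if (byte >> i) & 1 else SHORT)
                yield (False, GAP)

    def decode(pulses):
        """Receiver side: rebuild bytes from the on-pulse durations."""
        bits = [1 if dur > (SHORT + LONG) / 2 else 0
                for on, dur in pulses if on]
        return bytes(int("".join(map(str, bits[i:i + 8])), 2)
                     for i in range(0, len(bits) - 7, 8))

    assert decode(led_pulses(b"\xde\xad\xbe\xef")) == b"\xde\xad\xbe\xef"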


That's certainly fun in a book, but honestly I'd freak out and suspect a virus if my Linux machine went mayday and the keyboard went equally crazy.


If your computer starts acting visibly funny (as opposed to funny in the sense of weird processes or network activity), you can almost entirely rule out a virus; those like to stay hidden.


That's an interesting observation, actually, because it didn't use to be the case. Most viruses used to intentionally make a nuisance of themselves or display messages every now and again (e.g. you might get a message every ten boots or so; any more often and they would get eradicated before they could spread).

Viruses that are intended to stay hidden and undiscovered are a new thing, relatively speaking.


tleds (http://users.tkk.fi/jlohikos/tleds) reported network information by flashing the caps, num and scroll lock lights on your keyboard.


I knew this looked familiar. Previous discussion with an interesting thread about Morse Code: https://news.ycombinator.com/item?id=5891383


While the relevance is tenuous at best, when I saw this headline my mind went immediately to a different kind of 1-bit interface.

    WARNING: this is super tangential and has basically nothing to do
    with 1-bit UX design except perhaps in the sense that it gives
    some glimpse at the flexibility of information theory regarding
    time/space tradeoffs. Mostly, I wrote it up because I'm bored.
On my parents' first CD player, I remember the manufacturer proudly advertising via a little badge on the front panel a "1 Bit D-A Converter". At the time it didn't mean much to me; later, when I learned just enough about such things to be dangerous, it didn't seem to make any sense at all.

Still later, in a signal processing class in college, I finally learned why they were so proud of their 1-bit data converter. The converter in question was undoubtedly a delta-sigma DAC [1], and the reason to be proud of just a single bit is that such data converters are, by their nature, highly linear!

Let's look at the dual of the delta-sigma DAC, the delta-sigma ADC. Assume that I'm trying to somehow represent an analog signal (a continuously-valued function, continuous in time) with a 1-bit digital signal (a binary-valued function whose points fall at discrete time intervals). For any such representation, at a given point in time there will be some difference between the continuously-valued and the discrete-valued function. We'll call this error "quantization noise" (for reasons that are partially obvious already and which will be perhaps more obvious later).

If I choose the discrete time intervals to be close enough together (that is, if the sample rate is fast enough), I can try to counteract whatever error exists now by choosing the next binary output to include not only information about that point in time, but also about the error I've just introduced. High frequency information will get little or no benefit, but low-frequency information can be reproduced more faithfully in this fashion.

How do I do this? With a feedback loop. If I want to turn an analog signal into either a 1 or a 0, all I have to do is compare it against some threshold (say, halfway between the min and max value). But now, instead of comparing the input against this threshold, I compare the integral of the input plus the quantization error I've introduced with all my past comparisons. I do this by feeding every decision the comparator makes back to the input, and integrating the difference between those decisions and the present input.

If we make some convenient assumptions about the quantization noise (for a white Gaussian input signal, the quantization noise is also white and Gaussian, so analyzing its spectrum becomes pretty easy), we can show that the effect of this feedback loop is to push almost all the quantization noise to very high frequencies. We can later reconstruct the original signal by filtering out all the quantization noise (with a few low-pass filters). You might object that non-white input signals surely won't result in white quantization noise, and you're right, but it turns out that even substantially tonal input signals can be processed in this fashion with at most a tweak or two to the underlying structure.
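
A first-order version of that loop fits in a few lines. This is a toy sketch in plain Python, not production DSP (a real converter would use a proper reconstruction filter rather than a moving average):

    import math

    # Toy first-order delta-sigma modulator: integrate the difference
    # between the input and the fed-back 1-bit decision, and slice the
    # integral against zero to get the next bit.
    def delta_sigma(samples):
        integral, bits = 0.0, []
        for x in samples:                  # x assumed in [-1, 1]
            out = 1.0 if integral > 0 else -1.0
            bits.append(out)
            integral += x - out            # feed the decision back
        return bits

    def reconstruct(bits, window=64):
        """Crude low-pass filter: a moving average over the bitstream."""
        return [sum(bits[max(0, i - window):i + 1]) / min(i + 1, window + 1)
                for i in range(len(bits))]

    # Oversample a slow sine and check the filtered bitstream tracks it.
    n = 8192
    signal = [0.5 * math.sin(2 * math.pi * 4 * i / n) for i in range(n)]
    approx = reconstruct(delta_sigma(signal))
    err = max(abs(a - s) for a, s in zip(approx[256:], signal[256:]))
    print("max error after settling:", round(err, 3))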

OK, so what of the claim that this system is highly linear? Obviously in the large-signal sense a comparator is highly nonlinear, but in the small-signal sense a comparator is perfectly linear: it can produce only two outputs, and any two points perfectly describe a line. By contrast, let's say that instead of doing a one-bit data conversion, I'd chosen to do a two-bit conversion, i.e., there are four possible points in the constellation. In that case, I have to be absolutely sure that the difference between each pair of sequential codes is precisely the same. If not, the result will be harmonic distortion in the output signal.

So that CD manufacturer did have reason to be so proud of their 1-bit interface after all: with only two states, harmonic distortion is (to first order) eliminated, and so the linearity of the conversion from the digital codes stored on the CD to the analog output is quite good.

Note that there is a bit of work to even get to the point where you have the appropriate data for a 1-bit DAC, since a CD stores audio as 16-bit PCM data with a 44.1 kHz sample rate. In fact, you can convert this to a 1-bit data stream at a much higher bit rate through a process called interpolation, and one way of implementing such an interpolator is to build a fully digital delta-sigma loop!

One last point of interest: SACD skips over the PCM data entirely, storing a 1-bit delta-sigma modulated stream directly on the disc with a sample rate of 2.8224 MHz. At this sample rate, the audio bandwidth and resolution after filtering is substantially better than CD (in practical implementations, 105 dB dynamic range with 50 kHz audio bandwidth, versus 90 dB dynamic range and 20 kHz audio bandwidth for CDs). Of course, questions regarding the utility of this additional performance remain hotly debated.
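
For reference, that sample rate is exactly 64 times the CD rate:

    # SACD's 1-bit (DSD) stream runs at 64x the CD sample rate:
    assert 44_100 * 64 == 2_822_400   # 2.8224 MHz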

[1] http://en.wikipedia.org/wiki/Delta-sigma_modulation


I always liked one-button games (1-bit input), such as Canabalt [1] or Helicopter [2].

Another interesting UX exercise would be a 1-bit input, 1-bit output system.

[1]: http://www.adamatomic.com/canabalt/

[2]: http://www.helicoptergame.net/


Also "Tiny Wings" (for iOS) and there was one called "fishy fishy" or something like that for PC.



Knowing some information theory should tell us that the answer is precisely one bit. Is there a law like Betteridge's for articles where the headline answers its own question?




