
If A can die from sesame seeds and B can't and the A-to-B ratio is something like 1:1e+6, doesn't it make more sense for A to bake their own bread instead?

"Bake your own bread" - does not solve the situation when you want to travel, eat out etc.

Most bread types don't contain sesame seeds, so why let sesame seeds contaminate them and cause life-and-death situations?

Proposed legislation: allow large-scale production and sales of sesame seed bread only if you produce and sell more sesame-free bread.


The Wikipedia entry on text-based user interfaces uses IT (Impulse Tracker) as an example of a text-mode program featuring a pixel-accurate mouse pointer. The "citation needed" tag had always annoyed me, since I knew it was true but couldn't prove it. Now we can finally link to IT_MOUSE.ASM and be done with it.


That seems like an analytical conclusion based on a primary source, and thus a violation of the No Original Research policy if it's sourced directly to the primary source the conclusion is based on, and not to a published secondary source that actually provides the analysis of that primary source.


Just cite that HN comment above you then ;)


One of Wikipedia's sourcing paradoxes is that the literal horse's mouth is such a bad source as to be unusable.


No, primary sources are fine if they say the thing you are sourcing them for (high-quality secondary sources are usually preferred, but primary sources aren't "the worst" or "unusable"); they aren't okay for something that requires additional analysis/interpretation. (That's true of any source, but it's more of an issue with the use of primary sources.)

I mean this as a genuine comment and not a political non sequitur, but we'll see how it lands. If you think that's a bad idea, can you imagine what Donald J. Trump's wiki page would look like if you allowed 'primary' sources? Or, really, any political figure's. Though I think he makes a very good implicit case against allowing that.


"Original research"/primary sources should not matter for facts, i.e. those laid out as actual code.

(It didn't land well.)


I agree. It doesn't seem to make sense for some facts.

I was way out of my wheelhouse on Wikipedia one day and I saw some popular K-Pop group had announced their new album release date on their Instagram feed. I updated their Wikipedia entry with a citation to the group's own announcement and was very swiftly smashed by some overbearing editor who told me what a fuck-up my life was for even thinking this might be a suitable source for the information.


If the fact is in the source cited, primary sources are okay as direct sources. If it is an analytical interpretation of the source (such as most fact claims about the source or its contents, rather than fact claims made directly in the source), then that interpretation needs to be sourced from somewhere.

Somewhere that isn't the person updating the article, that is. The trick is to put up a random website that says the same thing, and then it's totally fine!

[citation needed]


Bisqwit has a video about the technique: https://youtu.be/7nlNQcKsj74


But is Impulse Tracker a true text-mode application? While not full of flashy graphics and icons, it uses lots of widgets and layout features that put it squarely in conventional GUI territory. And the repo contains the VESA code for rendering the equalizer UI.


It was a common technique in the DOS era to simulate GUIs with text, and not just with the characters in codepage 437, either: using the redefinable character sets of the EGA and VGA, graphical elements such as buttons, input fields, etc. could be rendered in character cells. By redefining the characters where the mouse pointer was supposed to be, a pixel-accurate mouse pointer could be created. Trackers weren't the only ones using this technique; later versions of Norton Utilities did as well, for example.
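
For the curious, here's a rough sketch of the glyph half of that trick (my own QBasic reconstruction, not code from IT_MOUSE.ASM; the OUT sequence is the standard VGA font-access dance, and SetGlyph is just a made-up name):

    DEFINT A-Z
    SUB SetGlyph (ch, pattern())      ' pattern(0 TO 15): one byte per scanline
        OUT &H3C4, 2: OUT &H3C5, 4    ' Sequencer Map Mask: write plane 2 (the font) only
        OUT &H3C4, 4: OUT &H3C5, 7    ' Memory Mode: flat addressing, no odd/even
        OUT &H3CE, 4: OUT &H3CF, 2    ' GC Read Map Select: plane 2
        OUT &H3CE, 5: OUT &H3CF, 0    ' GC Mode: plain read/write mode 0
        OUT &H3CE, 6: OUT &H3CF, 4    ' GC Misc: map video RAM at A000h
        DEF SEG = &HA000              ' each glyph occupies 32 bytes, 16 used
        FOR row = 0 TO 15
            POKE 32 * ch + row, pattern(row)
        NEXT
        OUT &H3C4, 2: OUT &H3C5, 3    ' restore planes 0+1 (characters + attributes)
        OUT &H3C4, 4: OUT &H3C5, 3    ' restore odd/even addressing
        OUT &H3CE, 4: OUT &H3CF, 0    ' restore read plane 0
        OUT &H3CE, 5: OUT &H3CF, &H10 ' restore odd/even read/write mode
        OUT &H3CE, 6: OUT &H3CF, &HE  ' restore text-mode mapping at B800h
    END SUB

The pointer then falls out of this: save the glyphs of the cells under the hot spot, overlay the arrow image onto them, upload the results to spare character codes, and write those codes to the screen.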


Apparently this was hidden behind the Alt+F12 key combo? I might have to fire up a copy as I don't remember ever seeing this bit. Edit: it doesn't seem to be enabled in 2.14. Oh well, got to hear "Blue Flame" again!

The rest of the application runs in 80x50 text mode, and it includes a character editor so you can customize the box-drawing characters if you're so inclined.


I think it was a rhetorical question. I wasn't a big fan of the SS series, but those damn kamikaze still echo in my mind. I've even had nightmares about them...


the way they start in the distance and blend into the rest of the sounds until... "...wait, what's that sound?"


You could then fix the inverted x axis by using the mouse on the underside of the table. Works fine with optical mice, but a ball mouse might require fiddling with gravity (or magnets, but that's no fun).


I think most ball mice have the ball sitting tightly enough.


I still prefer it the way it was when I started using computers with DOS 6.x and Windows 3.1: underline in the console, bar in GUIs, block in both with overwrite mode. Always blinking, of course. My problems with modern programs are that 1) overwrite mode is usually not implemented and 2) moving the cursor does not reset the blink timer. The first I can understand (although it's trivial to do), but the second makes the cursor invisible while you move it with the arrow keys held down. It's the little things...


That's like a programming language with 0.5-based array indexing.


It's the classic "off by half" index bug.


Doing the mental math is a lot harder, too. Any halfway decent computer nerd knows on some level that 2^10=1024, but 1/2^10=0.0009765625.

Doesn't quite lock into the memory banks so easily.



No fence post problems if you just put each post in the middle of each span!


David Crane, the creator of Pitfall, explains it all very nicely in a GDC Classic Game Postmortem [1]. I'm linking to the relevant timestamp, but the whole video is pure Atari 2600 goodness. The full Postmortems playlist is quite enjoyable too. I particularly liked the talks about the original Deus Ex, Myst, Loom, Adventure, Marble Madness, Ms. Pac-Man, Paperboy, Lemmings, and more. I find these videos very soothing and nostalgic. Got to rewatch them now...

[1]: https://youtu.be/tfAnxaWiSeE?list=PL2e4mYbwSTbbiX2uwspn0xiYb...


Although they look like the same glyph, the semicolon (U+003B) and the Greek question mark (U+037E) are different codepoints in Unicode. There's an old joke about replacing semicolons with Greek question marks in JavaScript and baffling your colleagues...


And that the semicolon was used as a PRINT statement terminator when you wanted to suppress the implicit newline at the end.


Right! But it was more than that: it was the separator between items you wanted to print without any blank in between, as opposed to the comma, which would jump to the next print zone (like a Tab).
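
For anyone who never had the pleasure, a quick illustration (QBasic-flavored, but most BASICs of the era behaved the same):

    PRINT "A"; "B"; "C"    ' semicolons: prints ABC with no gaps
    PRINT "A", "B", "C"    ' commas: each item jumps to the next 14-column print zone
    PRINT "no newline";    ' trailing semicolon suppresses the newline,
    PRINT " ...continued"  ' so this lands on the same line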


I "solved" that by using two assembling passes. The first had dummy jump/call addresses in order to determine the code offsets of labels. Then I would resolve the relative offsets, replace the target labels with offset deltas and reassemble.

I used that trick to speed up my QBasic programs with graphics, string and list handling, etc. It was one of the first rewarding times in my programming "career" and it actually made me feel proud of myself.
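
It looked something like this (a from-memory sketch, not my original code; the hypothetical routine here just fills the first ten text cells of the B800h screen with '*'):

    DEFINT A-Z
    code$ = ""
    FOR i = 1 TO 20: READ b: code$ = code$ + CHR$(b): NEXT

    DEF SEG = VARSEG(code$)       ' segment holding the string's data
    CALL ABSOLUTE(SADD(code$))    ' far-call the machine code (it ends in RETF)
    DEF SEG

    ' mov cx,10 / mov ax,0B800h / mov es,ax / mov bx,0
    DATA &HB9,&H0A,&H00, &HB8,&H00,&HB8, &H8E,&HC0, &HBB,&H00,&H00
    ' start: mov byte es:[bx],'*' / inc bx / inc bx (skip the attribute byte)
    DATA &H26,&HC6,&H07,&H2A, &H43, &H43
    ' loop start / retf: &HF8 (-8) is the hand-resolved relative offset;
    ' grow the loop body by one byte and it has to be recomputed
    DATA &HE2,&HF8, &HCB

Pass one was writing out the opcodes with placeholder offsets like that &HF8; pass two was counting the bytes back to the label and patching the real delta in.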


Yeah, I did the same thing. It just sucks when you've forgotten a prefix or a push/pop early in the code and you have to retarget all the subsequent addresses. It really makes you appreciate a real assembler.

