
The simplest might be for the drone company to act as an intermediary. They'd bill drone users for charging and have contracts with utilities. The drone company could do some authentication / DRM / etc. so that you'd basically have to jailbreak your drone to charge without paying.

Yes, I'm sure the markup would be large as a percentage, but for most customers the convenience would be worth it. Most of the customers are probably commercial and don't want to risk getting banned or sued.


The link you gave doesn't support your claim that saturated fat is good.

In fact, from the very same site, here's another article saying it's not: https://health.clevelandclinic.org/saturated-fats-finding-a-...

Saturated fat is OK in moderate amounts, but if you eat too much, it drives up your cholesterol because your body converts saturated fat into cholesterol[1][2].

The issue I have with this new food pyramid is the guidance ignores the danger of saturated fat. It lists "meats" and "full-fat dairy" among sources of "healthy fats", and that's just not true. In the picture that shows sources of protein/fat, 11 out of 13 of the items are animal-based fats. With a giant ribeye steak, cheese, butter, and whole milk specifically (not just milk), they're simply not giving an accurate picture of healthy fat sources.

I personally don't think seed oils are bad, but even if they were, it does not follow that saturated fat is good. The evidence shows otherwise, for one thing, plus it's not like seed oils and saturated fat are the only two kinds of fat. There are plenty of unsaturated fats which aren't seed oils.

---

[1] https://medlineplus.gov/ency/patientinstructions/000838.htm

[2] https://www.heart.org/en/healthy-living/healthy-eating/eat-s...


This also produces carbon nanotubes, which they claim can be used in construction.

Given that construction currently uses a huge amount of concrete, and given that concrete emits huge amounts of CO2[1], if this could partially replace concrete in construction, it might actually be clean. At least compared to what we're doing now.

I doubt foundations are going to be made out of carbon nanotubes, but they might be useful for the structure (columns, beams, etc.).

---

[1] "4-8% of total global CO2" according to https://en.wikipedia.org/wiki/Environmental_impact_of_concre...


I spent literally thousands of hours staring at those screens. You have it backwards. Interlacing was worse in terms of refresh, not better.

Interlacing is a trick that lets you sacrifice refresh rates to gain greater vertical resolution. The electron beam scans across the screen the same number of times per second either way. With interlacing, it alternates between even and odd rows.

With NTSC, the beam scans across the screen 60 times per second. With NTSC non-interlaced, every pixel will be refreshed 60 times per second. With NTSC interlaced, every pixel will be refreshed 30 times per second since it only gets hit every other time.

And of course the phosphors on the screen glow for a while after the electron beam hits them. It's the same phosphor, so in interlaced mode, because it's getting hit half as often, it will have more time to fade before it's hit again.
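
To put the arithmetic in concrete numbers, here's a rough Python sketch. The figures are nominal (real NTSC is ~59.94 fields per second and visible line counts vary), so treat it as an illustration of the tradeoff, not a spec:

    # Nominal numbers for illustration; real NTSC is ~59.94 sweeps/s.
    SWEEPS_PER_SECOND = 60   # the beam scans the screen 60 times/s either way
    LINES_PER_SWEEP = 240    # roughly what one vertical sweep covers

    def describe(interlaced: bool) -> str:
        if interlaced:
            # Alternate sweeps draw the even rows, then the odd rows: twice the
            # distinct lines, but any given line is repainted only every other sweep.
            distinct_lines = LINES_PER_SWEEP * 2
            refresh_per_line = SWEEPS_PER_SECOND / 2
        else:
            # Every sweep repaints the same set of lines.
            distinct_lines = LINES_PER_SWEEP
            refresh_per_line = SWEEPS_PER_SECOND
        mode = "interlaced" if interlaced else "non-interlaced"
        return (f"{mode}: {distinct_lines} distinct lines, "
                f"each refreshed {refresh_per_line:.0f} times/s")

    print(describe(False))  # non-interlaced: 240 distinct lines, each refreshed 60 times/s
    print(describe(True))   # interlaced: 480 distinct lines, each refreshed 30 times/s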


Have you ever seen high speed footage of a CRT in operation? The phosphors on most late-80s/90s TVs and color graphic computer displays decayed instantaneously. A pixel illuminated at the beginning of a scanline would be gone well before the beam reached the end of the scanline. You see a rectangular image, rather than a scanning dot, entirely due to persistence of vision.

Slow-decay phosphors were much more common on old "green/amber screen" terminals and monochrome computer displays like those built into the Commodore PET and certain makes of TRS-80. In fact there's a demo/cyberpunk short story that uses the decay of the PET display's phosphor to display images with shading the PET was nominally not capable of (due to being 1-bit monochrome character-cell pseudographics): https://m.youtube.com/watch?v=n87d7j0hfOE


Interesting. It's basically a compromise between flicker and motion blur, so I assumed they'd pick the phosphor decay time based on the refresh rate to get the best balance. So for example, if your display is 60 Hz, you'd want phosphors to glow for about 16 ms.

But looking at a table of phosphors ( https://en.wikipedia.org/wiki/Phosphor ), it looks like decay time and color are properties of individual phosphorescent materials, so if you want to build an RGB color CRT screen, that limits your choices a lot.

Also, TIL that one of the barriers to creating color TV was finding a red phosphor.


There are no pixels in a CRT. The guns go left to right, \r\n, left to right, essentially while True: for line in range(line_count).

The RGB stripes or dots are just stripes or dots; they're not tied to pixels. The three guns are physically offset from each other and paired with strategically designed mesh plates (the shadow mask), arranged so that the electrons from each gun end up hitting only the right stripes or dots, a bit like a moiré effect. Apparently fractions of an inch of offset were all it took.

The three guns, really more like fast-acting lightbulbs, received a brightness signal for each respective RGB channel. Incidentally, that means they could swing from zero brightness to max a couple of times over 60[Hz] * 640[px] * 480[px] or so.

Interlacing means the guns draw every other line, but not necessarily every other pixel, because CRTs have a finite beam spot size, for one thing.


No, you don't sacrifice refresh rate! The refresh rate is the same. 50 Hz interlaced and 50 Hz non-interlaced are both ~50 Hz, approx 270 visible scanlines, and the display is refreshed at ~50 Hz in both cases. The difference is that in the 50 Hz interlaced case, alternate frames are offset by 0.5 scanlines; the producing device arranges the timing to make this work, on the basis that it's producing even rows on one frame and odd rows on the other. The offset means the odd rows are displayed slightly lower than the even ones.

This is a valid assumption for 25 Hz double-height TV or film content. It's generally noisy and grainy, typically with no features that occupy less than 1/~270 of the picture vertically for long enough to be noticeable. Combined with persistence of vision, the whole thing just about hangs together.

This sucks for 50 Hz computer output. (For example, Acorn Electron or BBC Micro.) It's perfect every time, and largely the same every time, and so the interlace just introduces a repeated 25 Hz 0.5 scanline jitter. Best turned off, if the hardware can do that. (Even if it doesn't annoy you, you won't be more annoyed if it's eliminated.)

This also sucks for 25 Hz double-height computer output. (For example, Amiga 640x512 row mode.) It's perfect every time, and largely the same every time, and so if there are any features that occupy less than 1/~270 of the picture vertically, those fucking things will stick around repeatedly, and produce an annoying 25 Hz flicker, and it'll be extra annoying because the computer output is perfect and sharp. (And if there are no such features - then this is the 50 Hz case, and you're better off without the interlace.)

I decided to stick to the 50 Hz case, as I know the scanline counts - but my recollection is that going past 50 Hz still sucks. I had a PC years ago that would do 85 Hz interlaced. Still terrible.


You assume that non interlaced computer screens in the mid 90s were 60Hz. I wish they were. I was using Apple displays and those were definitely 30Hz.


Which Apple displays were you using that ran at 30Hz? Apple I, II, III, Macintosh series, all ran at 60Hz standard.

Even interlaced displays were still running at 60Hz, just with a half-line offset to fill in the gaps with image.


I think you are right. I had the LC III and Performa 630 specifically in mind. For some reason I remembered them as 30Hz, but everything I find googling it suggests they were 66Hz (both video card and screen refresh).

That being said, they were horrible on the eyes, and I think I only got comfortable when 100Hz+ CRT screens started becoming common. It's just that the threshold for comfort is higher than I remembered, which explains why I didn't feel any better in front of a CRT TV.


Could it be that you were on 60Hz AC at the time? That is near enough to produce what's called a "Schwebung" (a beat) when artificial lighting is used, especially with fluorescent lamps like the ones that were common in offices. They need to be "phasenkompensiert" (phase compensated/balanced), meaning they have to be on a different phase of the mains electricity than the computer screens are on. Otherwise even not-so-sensitive people notice it as interference or a sort of flickering. It happens less when you are on 50Hz AC and the screens run at 60Hz, but with fluorescents on the same phase it can still be noticeable.


I doubt that. Even in color.

1986 I got an AtariST with black and white screen. Glorious 640x400 pixels across 11 or 12 inch. At 72Hz. Crystal clear.

https://www.atari-wiki.com/index.php?title=SM124


I wonder how Waymos know that the traffic lights are out.

A human can combine a ton of context clues. Like, "Well, we just had a storm, and it was really windy, and the office buildings are all dark, and that Exxon sign is normally lit up but not right now, and everything seems oddly quiet. Evidently, a power outage is the reason I don't see the traffic light lit up. Also other drivers are going through the intersection one by one, as if they think the light is not working."

It's not enough to just analyze the camera data and see neither green nor yellow nor red. Other things can cause that, like a burned out bulb, a sensor hardware problem, a visual obstruction (bird on a utility cable), or one of those louvers that makes the traffic light visible only from certain specific angles.

Since the rules are different depending on whether the light is functioning or not, you really need to know the answer, but it seems hard to be confident. And you probably want to err on the side of the most common situation, which is that the lights are working.


I recently encountered a broken traffic light in my city. It was daylight, and I didn't notice any other lights being off that should have been on during the day.

My approach was to edge into the intersection slowly and judge whether the perpendicular traffic would slow down, while also trying to figure out what was going on, or whether they would just zip through as if they had a green light.

It required some attention and some judgement. It definitely wasn't the normal day to day driving where you don't quite think consciously what you're doing.

I understand that individual autonomous vehicles can't be expected to be given the responsibility to make such a call, and that the safest thing is to have them stop.

But I assumed there were still many human operators overseeing the fleet who could make the call that the traffic lights are all out.


> An error is an event that someone should act on. Not necessarily you.

Personally, I'd further qualify that. It should be logged as an error if the person who reads the logs would be responsible for fixing it.

Suppose you run a photo gallery web site. If a user uploads a corrupt JPEG, and the server detects that it's corrupt and rejects it, then someone needs to do something, but from the point of view of the person who runs the web site, the web site behaved correctly. It can't control whether people's JPEGs are corrupt. So this shouldn't be categorized as an error in the server logs.

But if you let users upload a batch of JPEG files (say a ZIP file full of them), you might produce a log file for the user to view. And in that log file, it's appropriate to categorize it as an error.
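
As a rough sketch of how I'd split those two destinations in code (the logger names and the JPEG check below are made up for illustration, not from any particular framework):

    import logging

    logging.basicConfig(level=logging.INFO)

    # Two hypothetical log destinations: one the site operator reads,
    # one rendered back to the user who uploaded the batch.
    server_log = logging.getLogger("gallery.server")
    user_report = logging.getLogger("gallery.upload_report")

    def looks_like_jpeg(data: bytes) -> bool:
        # Crude stand-in for real validation: check the JPEG SOI/EOI markers.
        return data[:2] == b"\xff\xd8" and data[-2:] == b"\xff\xd9"

    def handle_upload(filename: str, data: bytes) -> bool:
        if not looks_like_jpeg(data):
            # The site behaved correctly, so this is not an operator-facing error.
            server_log.info("rejected corrupt JPEG: %s", filename)
            # The uploader is the one who needs to act, so their report
            # gets the error-level entry.
            user_report.error("%s appears to be corrupt and was not added", filename)
            return False
        server_log.info("accepted %s (%d bytes)", filename, len(data))
        return True

    handle_upload("cat.jpg", b"\xff\xd8...\xff\xd9")  # accepted
    handle_upload("dog.jpg", b"not a jpeg")           # rejected; the error goes to the user's report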


That's the difference between an HTTP 4xx and a 5xx.

4xx is for client side errors, 5xx is for server side errors.

For your situation you'd respond with an HTTP 400 "Bad Request" and not an HTTP 500 "Internal Server Error" because the problem was with the request not with the server.
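
A minimal sketch of that split using Python's standard http.server (the two-byte "is this a JPEG" check is just a stand-in, not real validation):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class UploadHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            body = self.rfile.read(length)
            try:
                if body[:2] != b"\xff\xd8":
                    # The client sent something unusable: their problem -> 4xx.
                    self.send_error(400, "not a valid JPEG")
                    return
                # ... store the image here ...
                self.send_response(201)
                self.end_headers()
            except Exception:
                # Something broke on our side: our problem -> 5xx.
                self.send_error(500, "Internal Server Error")

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), UploadHandler).serve_forever()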


Counter argument. How do you know the user uploaded a corrupted image and it didn't get corrupted by your internet connection, server hardware, or a bug in your software stack?

You cannot accurately assign responsibility until you understand the problem.


This is just trolling. The JPEG is corrupt if the library that reads it says it is corrupt. You log it as a warning. If you upgrade the library or change your upstream reverse proxy and start getting 1000x the number of warnings, you can still recognize that and take action without personally inspecting each failed upload to be sure you haven't stumbled on the one edge case where the JPEG library is out of spec.


I will make up some numbers for the sake of illustration. Suppose it takes you half as long to develop code if you skip the part where you make sure it works. And suppose that when you do this, 75% of the time it does work well enough to achieve its goal.

So then, in a month you can either develop 10 features that definitely work or 20 features that each have a 75% chance of working (in expectation, 15 that work and 5 that don't). Which of these delivers more value to your business?

That depends on a lot of things, like the severity of the consequences for incorrect software, the increased chaos of not knowing what works and what doesn't, the value of the features on the list, and the morale hit from slowly driving your software engineers insane vs. allowing them to have a modicum of pride in their work.

Because it's so complex, and because you don't even have access to all the information, it's hard to actually say which approach delivers more value to the business. But I'm sure it goes one way some of the time and the other way other times.

I definitely prefer producing software that I know works, but I don't think it's an absurd idea that the other way delivers more business value in certain cases.
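
To make that concrete, here's a toy calculation along those lines. Every number in it is invented for illustration (just like the numbers above), especially the cost assigned to a feature that ships broken:

    def monthly_value(features: int, p_works: float,
                      value_per_working: float, cost_per_broken: float) -> float:
        # Expected value of a month's output under this toy model.
        expected_working = features * p_works
        expected_broken = features * (1 - p_works)
        return expected_working * value_per_working - expected_broken * cost_per_broken

    careful = monthly_value(10, 1.00, value_per_working=1.0, cost_per_broken=3.0)
    fast    = monthly_value(20, 0.75, value_per_working=1.0, cost_per_broken=3.0)
    print(careful, fast)  # 10.0 vs 0.0 with these numbers; drop cost_per_broken
                          # to 0.5 and the fast approach wins instead (10.0 vs 12.5)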


It did, but it was awkward.

Analog cable channels were spread across a wider range of frequencies than regular over-the-air broadcast TV channels. So the VCR's tuner had to be "cable ready".

Some cable channels, especially premium channels, were "scrambled", which meant you needed a cable box to tune them. So the VCR, by itself, could only record the basic channels that came with all cable packages. To record something from a movie channel (HBO, Showtime, etc.), you needed the cable box to tune it in and provide an unscrambled signal to your VCR.

And that meant the cable box needed to be set to the correct channel at the time the VCR woke up and started recording. The simple method was to leave it on the correct channel, but that was tedious and error prone. As I recall, there were also VCRs that could send a command to the cable box to turn it on (emulating the cable box remote) and set the channel, but you had to set that up.

Later, when digital cable came along, you needed the cable box involved for every recording because the channels were no longer coming over the wire in a format that the VCR could tune in.

So yeah, you could do it, but it was a pain.


Here's a video about how player pianos work:

https://www.youtube.com/watch?v=2GcmGyhc-IA

Basically, you have some pedals which generate a vacuum, and then everything is powered and controlled via vacuum. (The internet may not be a series of tubes, but a player piano literally is.)

Using vacuum to control things may seem very niche and exotic, but it was actually very common. Basically every car engine up through about the 1980s relied on vacuum for various control functions. Cars with a mechanical ignition system often used a vacuum advance to adjust the timing at higher RPMs, for example. Early cruise control systems used vacuum to adjust the throttle.

Anyway, all pianos have felt hammers which strike the string. When you're playing the piano manually, there's a mechanical linkage between the key you press and its hammer. In a player piano, there's another way to move the hammer: a vacuum controlled actuator. The piano roll has holes in it corresponding to notes. The holes allow air to pass through, and that causes the actuator to push the hammer into the string.

In that dance hall machine, which appears to be essentially a pipe organ, there are some similarities and some differences. A pipe organ works by blowing air through the pipes. There's a "wind chest" that stores pressurized air, and when you press a key on the keyboard, it opens a valve to let air into a particular pipe. In the old days, that linkage (between the key and the valve) was mechanical. These days it's electrical or electronic.

At the end of the video above, he even briefly mentions a band organ (which is similar to a dance hall machine) and how music rolls work for it, and it's a similar vacuum system to a player piano.

So I believe a dance hall machine with a music roll probably uses a combination of vacuum and positive pressure. The vacuum would allow reading the music roll (the paper with holes in it corresponding to notes), and that vacuum would actuate valves that allow positive pressure air into the pipes to make sound. In order to convert one of those to be controlled electronically, you could use a bunch of solenoid valves to either control the vacuum or directly control the air going into the pipes. I'm not sure which way they do it.


I wish Mini-TOSLINK[1] had been more successful. It allows you to put an optical and an electrical audio output on the same 3.5mm connector (i.e. the headphone port), which is helpful for saving space on crowded panels.

The trick is that your 3.5mm connector only needs to connect on the sides, so the end of the jack can be open for light to be transmitted.

This was seen pretty frequently on laptops for a while, but I think two things doomed it. One, most people just don't use optical. Two, there's nothing to advertise its existence. If you do have one of these ports, you probably don't even know you could plug an optical connector in there.

---

[1] https://en.wikipedia.org/wiki/TOSLINK#Mini-TOSLINK


I remember when all MacBooks had it. "What is this red light for?" used to be a common post on forums.

