Hacker News
Raspberry Pi 4 WiFi stops working at 2560 x 1440 screen resolution (enricozini.org)
299 points by pabs3 9 days ago | 177 comments





Answer from forum

    Had the same problem with Raspi 4B.
    Problem was dependent on screen resolution (!!).
    With 1920x1080, wlan0 became disconnected after it was
    ok at lower screen resolutions. Was in 2.4GHz band.
    After turning on 5GHz in the router and going into
    the network preferences (right click the network icon top
    right on screen) and SSID ... and checking "automatically
    configure options" the connection remains stable (so far :D ).
So radio interference it is?

Reformatted for mobile users:

Had the same problem with Raspi 4B. Problem was dependent on screen resolution (!!). With 1920x1080, wlan0 became disconnected after it was ok at lower screen resolutions. Was in 2.4GHz band. After turning on 5GHz in the router and going into the network preferences (right click the network icon top right on screen) and SSID ... and checking "automatically configure options" the connection remains stable (so far :D ).


This is a good reminder that every digital circuit is really a fancy analog circuit. Your program may be perfectly correct, but when it runs on faulty hardware, you are hosed no matter how many times you’ve formally proven it.

Or even perfectly functional hardware in the presence of EM noise.

Reminds me of the MacBook Air that I have (2018 model).

When I connect my USB 3 hub to it, I lose my WiFi :(

I Googled it when it happened a while back, and apparently other people have this problem with the MacBook Air too.

Some choose to shield the USB 3 cable with tinfoil. Personally I opted to connect a USB 3 Ethernet interface to the hub and use wired Ethernet when I use the hub.


In the very distant past (late 1980s, early 90s), Olivetti changed their PC keyboard design; it went from having the keyboard PCB assembly inside a metal clamshell to a bare board with a metallised plastic sheet on the back of the PCB (they 'cheaped out').

We got to know about the design change when a rather large and sprawling local leisure centre reported that often when someone used their walkie talkie near a PC, the screen filled with random characters and sometimes the dot matrix printers would 'go haywire' (CTRL-P = Print what's displayed on the screen).

In effect, the rf signal from the walkie talkies was 'mashing the keys'.

The immediate fix was to swap in some older keyboards, the longer fix was down to Olivetti using better shielding and some appropriately-placed capacitor decoupling on the power and signal lines.


> "In the very distant past (late 1980s, early 90s), Olivetti..."

Haven't heard that brand in a while! I now remember I still have a 486 DX4 100 MHz Olivetti laptop, which is kind of pink-red-brownish.


The first time I saw a room full of non-CRT computer monitors, I found out Merrill Lynch had moved to a new building to save money. What they discovered was that the CRTs would weird out every time the subway train went by. All the cash saved by moving was lost buying the stupidly expensive flat-panel monitors.

I wouldn't have considered trains to be a source of EM interference, but in retrospect it makes sense. One that's amazed me is that military jets and warships cause some nearby consumer electronics to go wonky; but as they literally run systems that are trying to jam radio waves, it makes a ton of sense that e.g. garage doors have trouble keeping up.

Is it possible the walkie-talkie was out of spec as well? A little signal booster added aftermarket?

Doesn't have to be. If I TX with 5 W near my new Latitude laptop, the cursor goes all over the place. I've heard of electricians' UHF radios tripping GFI breakers. I guess the power levels that are normal in ham/professional radios are much higher than most other equipment is designed to withstand.


It happens with the MacBook Pro 2018 as well, but only with adapters from Amazon; the official adapter from the Apple Store seems not to have that problem.

Wrapping the 3rd-party adapter in tinfoil actually showed much better speed-test results when connected than without the foil.


Brings back fond memories of an extremely laggy Logitech wireless mouse whenever an external display is plugged into the MiniDP port on the same side.

Fond memories of wondering why WiFi occasionally stopped working; eventually I correlated it with use of the microwave.

Cheap hubs or cables have been causing this since 2016

2016? It's been happening since before electronics existed to allow the personal computer to be invented!

Similar problem with my 2014 MBP: I lose wifi if I plug it into my UHD monitor with a DisplayPort cable, but not if I use HDMI. Probably has to do with cable quality (bought it on Amazon).

I have that issue with my 2016 MacBook Pro, but I'm pretty sure it's due to the crap-quality USB 3 NIC that I bought.

I had this problem until I threw out my $10 USB 3 hub and got a $20 one instead.

I have a Thunderbolt to 4-port USB 3 hub that, when connected to a 2019 MacBook Pro, interferes with the mouse pointer (it sticks, and the pointer goes large every few seconds). Could this be related?

That could be an electrical grounding problem. In my experience touchpads are very sensitive to the quality of the power supply.

Yeah, we have crap power where I work and have cut down a lot of weird, random issues by buying a decent UPS for each worker machine. Even the people with laptops.

5 GHz WiFi is a lot more stable. Mine used to disconnect with the microwave running. Also, your hub might have leaky cables; I had an issue like this with bad HDMI cables too.

Microwave ovens and wifi use the same frequency range for the same reason: 2.4 GHz is available as unlicensed spectrum. In theory someone could build a 5 GHz microwave oven, but it wouldn't be cost-effective for a consumer appliance.

I once heard something about that frequency working better with water molecules, but after looking it up I think it's a myth.


A 5 GHz microwave would be more efficient but you don't necessarily want more efficient for cooking - that would just heat the outside of the food fastest. Commercial microwaves apparently run somewhere in the 900 MHz band. There are some neat graphs here:

http://www1.lsbu.ac.uk/water/microwave_water.html


What kind of cables are you using? Either your cable or your hub isn't up to spec and properly shielded.

Why not simply buy a shielded USB cable?

I have this problem with the MacBook Pro.

Yes, although Eben Upton suggests trying a better grade of HDMI cable... at the higher resolution, a poorly shielded cable will radiate enough noise to interfere with Wi-Fi in the 2.4 GHz range.

Don't use a cheap cable, use one rated for 4k, and see if that helps.


Likely, but I've also had strange effects like this that have ultimately been traced back to power-domain issues on some of our devices.

I would really like to hear what the root cause of this is.

This makes me think of the bug that the QCA AR9331 SoC has. The AR9331 is extremely common in small travel routers, but it has a fun bug where one of its clock sources is shared between the 802.11 WiFi and the USB port. If the USB port is negotiated at USB 1.x speeds and the 802.11 radio is scanning, the USB will freak out and die. This generally requires the 802.11 radio to be in client mode rather than AP mode. You can read some details about this on the old OpenWrt forum, if it survived the great forum purge of 2018.


The cause seems to be a low-quality HDMI cable that doesn't provide enough shielding.

MacBook Airs (especially the 11-inch model, 2013-2015 versions) have a similar problem with (cheap) USB 3.x external drives.

WiFi shows as 'connected', but in reality only a fraction of packets gets through because of interference problems.

An online reference from Intel, hosted on usb.org: https://usb.org/sites/default/files/327216.pdf


Unfortunately not only with cheap external USB 3.x drives and not only with old MacBook Airs. Connecting an Anker USB-C dock/hub to my MacBook Pro 2016 and later 2018 would consistently cause interference with the Microsoft Sculpt Ergonomic and Apple Magic Trackpad (both on 2.4GHz). Then I got an Aukey hub (since my wife was happy with one) and the problems have vanished.

This is an issue with poorly made USB 3 cables, connectors, and devices. Intel has a white paper about it: https://www.intel.com/content/www/us/en/products/docs/io/uni...

> the noise from USB 3.0 data spectrum can be high (in the 2.4–2.5 GHz range). This noise can radiate from the USB 3.0 connector on a PC platform, the USB 3.0 connector on the peripheral device or the USB 3.0 cable. If the antenna of a wireless device operating in this band is placed close to any of the above USB 3.0 radiation channels, it can pick up the broadband noise. The broadband noise emitted from a USB 3.0 device can affect the SNR and limit the sensitivity of any wireless receiver whose antenna is physically located close to the USB 3.0 device. This may result in a drop in throughput on the wireless link.

The money quote:

> With the HDD connected, the noise floor in the 2.4 GHz band is raised by nearly 20 dB. This could impact wireless device sensitivity significantly.

Besides having properly shielded devices and cables (which manufacturers often don't bother doing), they also recommend that the plug in the laptop be fully shielded or enclosed in a metal chassis (which is fulfilled by having an entirely metal case).
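To get a feel for what a 20 dB noise-floor rise means in practice, here is a rough, idealized illustration using the Shannon capacity formula. The channel width (20 MHz) and starting SNR (30 dB) are hypothetical numbers for the sketch, not from the white paper:

```python
import math

# Idealized sketch: Shannon capacity of a wireless link before and
# after the noise floor rises by 20 dB (which cuts SNR by 20 dB).
def shannon_mbps(bandwidth_mhz, snr_db):
    snr = 10 ** (snr_db / 10)          # convert dB to linear ratio
    return bandwidth_mhz * math.log2(1 + snr)  # MHz * bits/Hz = Mbit/s

clean = shannon_mbps(20, 30)  # hypothetical 20 MHz channel, 30 dB SNR
noisy = shannon_mbps(20, 10)  # same link with the noise floor up 20 dB
print(f"{clean:.0f} Mbit/s -> {noisy:.0f} Mbit/s")
```

Real radios fall well short of the Shannon limit, but the relative hit from losing 20 dB of SNR is in the same spirit as the throughput drops Intel describes.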

I don't know of a cheap RF analyzer but I'd like to get one at some point. I'm curious how many common devices actually adhere to the FCC regulations and/or standards like USB 3, compared to how many are just cheaply made.


> I don't know of a cheap RF analyzer but I'd like to get one at some point.

An SDR might work, and they're quite versatile. Unfortunately, the cheap RTL-SDR ones don't reach 2.4 GHz without a downconverter.


> I don't know of a cheap RF analyzer but I'd like to get one at some point.

That's one of the features I love about my Aruba APs/controller. Not only do the APs dedicated as Air Monitors/Spectrum Monitors watch both the Wi-Fi level and the raw radio spectrum and shift frequencies as needed, but I can also get a high-quality live visualization of radio-spectrum interference. Definitely not cheap though!


Depending on your definition of affordable:

https://www.aimtti.com/product-category/spectrum-analyzers


Former communications semiconductor FAE here. We would troubleshoot issues like this all day, every day, for years, usually under NDA before the product was ever released to production. The solutions are routinely as weird as some of the "voodoo" hypotheses tossed around here; wait for it and you'll see. After a while it seems normal that all unverified combinations are broken, and the moments of delight are when an unverified configuration completely works.

TIL one more TLA.

FAE = Functional Accessibility Evaluator


Typically in the semiconductor business FAE = Field Application Engineer.

Aw, I was hoping the explanation would be in the thread. Is it high-frequency noise from an unshielded clock? Surpassing a current or temperature limit due to the stress of the high resolution and resulting high memory bandwidth? Time will tell!

> Is it high-frequency noise from an unshielded clock?

I bet that's it.

2560x1440 @ 60 Hz with CVT-RB timings has a pixel clock of 241.5 MHz. The TMDS bit rate is 10x the pixel clock, and 2415 MHz is right in the lower end of the 802.11 band.

If the Pi can be convinced to use CVT blanking, that'll raise the pixel clock to 312 MHz, which should be fine.
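The arithmetic in the comment above can be sanity-checked in a few lines. The pixel clocks are the standard CVT-RB and CVT values for 2560x1440@60; everything else is just multiplication:

```python
# Back-of-the-envelope check (not a measurement): does the TMDS bit
# rate of an HDMI mode land in the 2.4 GHz Wi-Fi band?

WIFI_BAND_MHZ = (2400.0, 2500.0)  # 2.4 GHz ISM band

def tmds_bit_rate_mhz(pixel_clock_mhz):
    # HDMI TMDS encodes each 8-bit symbol in 10 bits, so the per-lane
    # serial bit rate is 10x the pixel clock.
    return pixel_clock_mhz * 10

def in_wifi_band(freq_mhz):
    lo, hi = WIFI_BAND_MHZ
    return lo <= freq_mhz <= hi

# Standard pixel clocks for 2560x1440@60 with reduced vs. full blanking.
for name, pclk in [("CVT-RB", 241.5), ("CVT", 312.25)]:
    rate = tmds_bit_rate_mhz(pclk)
    print(f"{name}: {pclk} MHz pixel clock -> {rate} MHz TMDS, "
          f"in 2.4 GHz band: {in_wifi_band(rate)}")
```

With reduced blanking the TMDS rate is 2415 MHz (in the band); with full CVT blanking it moves to about 3122 MHz, well clear of it.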


This is the answer. That mode puts the second harmonic going out of the HDMI cable smack in the 2.4GHz WiFi band (the fundamental would be at 1/2 the bit rate).

Now the question is who screwed up. Is it a leaky cable? Is it bad PCB design? Is it a problem internal to the SoC? Is it a power delivery issue? Knowing the RPi foundation and Broadcom, I bet one of them screwed this up for all RPis and it isn't just a bad cable.


I'm trying to understand this so I don't know a lot of what's going on here, but why is the fundamental half the bit rate instead of the bit rate?

Imagine a signal that alternates every second (e.g. 10101010). This has a frequency of 1/2 Hz, not 1 Hz, because it repeats every 2 seconds.
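The same reasoning applied to the HDMI numbers from upthread, as a tiny numeric sketch (the 2415 Mb/s figure is the CVT-RB TMDS rate discussed above):

```python
# A repeating 1010... pattern at bit rate R completes one full cycle
# every two bits, so its fundamental is R/2 and its harmonics are
# integer multiples of that.
bit_rate_mhz = 2415.0               # TMDS rate for 2560x1440@60 CVT-RB
fundamental_mhz = bit_rate_mhz / 2  # 1207.5 MHz
harmonics_mhz = [fundamental_mhz * n for n in (1, 2, 3)]
print(harmonics_mhz)  # the 2nd harmonic, 2415.0 MHz, sits in the 2.4 GHz band
```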

I'm not 100% sure, but I'm assuming because of Double Data Rate transfers: https://en.wikipedia.org/wiki/Double_data_rate

I so badly want this to be true - it's like a perfectly designed bug!

And unlike HDMI 2.0, 1.4 won't have scrambling, so any termination issues will sing quite well.

> And unlike HDMI 2.0, 1.4 won't have scrambling, so any termination issues will sing quite well.

Not an electrical engineer: which measures are usually adopted to handle termination issues?


Moar power, isolation, and voltage regulators.

And grounding!

I thought something similar: the pixel clock or some harmonic of it feeding back into the WiFi chip via the power lines (image search says the chip is in a metal box, so it should not be over-the-air RF). Thanks for doing the math.

Forcing a different pixel clock is probably the easiest fix, and since the ports claim to do 4kp60, that should be possible (if the display supports it).
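If forcing ordinary (non-reduced) blanking is the route, the Pi firmware's `/boot/config.txt` can define a custom mode. A hedged, untested sketch using the documented `hdmi_cvt` option, whose last field selects reduced blanking (0 = off):

```
# /boot/config.txt -- force 2560x1440@60 with full CVT blanking,
# moving the pixel clock off ~241.5 MHz. Untested sketch; verify
# your display accepts the resulting timing first.
hdmi_group=2                     # DMT group
hdmi_mode=87                     # custom mode, defined by hdmi_cvt below
hdmi_cvt=2560 1440 60 3 0 0 0    # width height rate aspect(16:9) margins interlace rb=0
```

Whether the display accepts the higher pixel clock is a separate question; as noted below, some monitors are picky.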


Those aren't the numbers the calculator I found online came up with, but I had the same thought. I'm thinking that changing the refresh rate could solve the problem.

The calculator I was using was:

https://tomverbeure.github.io/video_timings_calculator

Changing the refresh rate would work too, but some monitors can be pretty picky about that.


That reminds me of this: https://bellard.org/dvbt/

Why do all the proposed avenues of future investigation, and all of the current comments on this thread, focus on voodoo instead of the far more likely explanation that the display driver is just stomping on the memory of the network interface? If there's software anywhere in a system, 99% of the time that's the problem.

This is not true when radios are involved. In my experience, wireless connectivity issues are rarely caused by software; the problem is much more often caused by interference.

The interference can be internal to the device, or can come from other wireless devices. In many cases the culprits are even devices that shouldn't emit RF at all: power supplies, switches, light bulbs...

Another common issue is poor antenna design (e.g. attenuation when you hold the device, or strong directionality in an antenna that should not be directional).

And, last but not least, physical obstacles. Most people understand that concrete walls with rebar will block signal, but a surprisingly large number of people try to use aluminum stands or cases for devices with wireless radios.

All those factors will cause connection issues, and they are really common because debugging them is so hard (who has a spectrum analyzer at home? How do you find out which one of dozens of electronic devices is emitting RF that it shouldn't?)


In addition, the linked forum thread includes a user describing how high resolutions break 2.4GHz networks for them, but 5GHz networks work fine. The display driver is stomping on memory responsible for 2.4GHz, but not 5GHz? I'm really not seeing that as the more likely problem here.

5 GHz WiFi has more bandwidth than 2.4 GHz, so it will typically involve larger IO buffers in the driver, which could easily be enough to expose a memory scribbler (I imagine there's a bunch of other features that are enabled/disabled by the frequency-band switch too). However, I think asdfasgasdgasdg's answer is the correct reason not to suspect a memory scribbler: a scribbler would cause the driver to crash/fail and the kernel would log a message.

Remember the Pi has an odd architecture and all the IO passes through the GPU. The GPU doesn't log human-readable messages anywhere. There's a good chance the GPU did log a crash or failure, but only Broadcom's engineers can see it.

Er, really, wow. I didn't know that. Can you point me at some more info for that? Surely the GPIOs don't go via the GPU.

It's a BCM2711, and the datasheet is NDA only - typical Broadcom!

The VideoCore (Broadcom's GPU) is the main processor on the thing, and the cluster of ARM cores that runs Linux is more of a coprocessor which can only see some of the RAM.


But 5 GHz doesn't fail, only 2.4 GHz does.

This is exactly abainbridge's point.

How do you mean?

> 5GHz WiFi has more bandwidth than 2.4GHz, so typically will involve larger IO buffers in the driver, which could easily be enough to expose a memory scribbler

He's saying 5 GHz will expose the scribbler, and the opposite is happening, only 2.4 GHz fails.


@StavrosK Thanks for wading in in my defence, but I had actually misunderstood the situation :-)

Although, if my theory that the IO buffers are different sizes is true, then that could perturb memory layout enough to expose/hide the bug in either direction.


Haha, I was actually the attacker in this instance :)

I do agree that the different IO buffers might hide the bug in one instance, but I think this is just plain old RF noise.


> Haha, I was actually the attacker in this instance :)

I need more sleep.


So the display driver is meant to be mutating memory also owned by the network controller, but not in a way that causes a crash, log messages, or a kernel panic? That doesn't seem so likely to me. I mean it's not impossible but it's rare to see memory corruption/interference cause a clean breakage like this. In my experience it usually causes things to become extremely funky for a short while, then a crash.

Every SoC I've dealt with containing a WiFi core has a dedicated coprocessor (RPU is a common name, depending on vendor) running its own firmware. So more likely, _that_ core would go funky, then crash. The kernel might have code to recover it, but I doubt it, and it certainly would complain the whole way, as you say.

In the Pi, the coprocessor is the GPU, and it is the first to initialize on boot and runs all the firmware-like stuff and handles all IO and does memory allocations/mappings.

It's disrespectful towards the art of Voodoo to call it "RF".

It's also disrespectful to the black art of RF to call it mere Voodoo :-)

Because what if it's not? My first thought is that the HDMI is radiating and interfering with the wifi antenna.

As an embedded engineer, it was a hard lesson for me to learn that not all issues are software issues and the hardware may need to be investigated. This is especially true where there is different behaviour between units. You can't just assume that your 99% estimation (plucked out of thin air) is correct and discredit other potential explanations.


> Because what if it's not?

Then, after you're done ruling out the most likely and easiest explanation to test, you can start exploring the remaining possibilities. Skipping to the more exotic explanations sounds more interesting, but it's a poor use of time if there's still low-hanging fruit out there.


Maybe, with high-frequency radios and improperly shielded cables and chips, the most likely scenario is RF interference?

Improper shielding is an assumption with no evidence as yet. I also mentioned that the ease of verifying the explanation should be a factor. Changing software is usually very easy.

It's so common that it's not an unlikely starting point. EMC is a major issue in high-frequency electronics design, and the Raspberry Pi has a history of having to redesign certain parts for not having enough shielding.

I can find [1] on the subject which is quite interesting.

[1]: https://www.element14.com/community/people/PeteL/blog/2012/0...



There doesn't seem to be much info on compliance out there for the Pi 4, which must have been significantly different w.r.t. HDMI.

Absolutely, and this was before the Pi had built-in WiFi. The norms you have to comply with are immediately a lot stricter, as your device falls into a different category (telecommunications devices).


Wrapping tinfoil around an HDMI plug/cable isn't particularly hard either :) Chips are harder, but at least you rule out the cable. HDMI cables are ridiculously finicky if you've ever tried to get anything more than the lowest-common-denominator 1080p going on them.

I don't agree that wrapping foil is a great way to 100% rule that out, as there is room for error. Using different cables/dongles would be better, and they already tried that.

> If there's software anywhere in a system, 99% of the time that's the problem.

Unless USB is involved, then it's something in the USB stack...


USB isn't up to spec on the Pi 4.

Only the power bit, and only one resistor...

Not just that one resistor, they also lack the circuitry to prevent feeding power to that port when powered through other means like PoE.

EMI is a headache I deal with daily, on far more sensitive receivers, so voodoo is likely. Though just moving the unit next to the AP (increasing RX signal strength) is an easy diagnosis.

There are several small-scale WiFi chips that share a clock source with USB. It would be unsurprising to find that the WiFi and video interfaces share the same clock, so drawing too much from either could directly affect the other.

These kinds of problems are common in embedded computers, like the Pi. Just as common as software.


Clocks will all be buffered due to physical distance between the GPU and WiFi IP core on the SoC so it's unlikely to be a clock loading issue.

Buffering isn't really the problem I was talking about, it was more the shielding of the clock.

For future reference this "Voodoo" is referred to technically as electrical engineering ;)

I don't know much about the Raspberry Pi, but it looks like they chose an ARM core variant without IOMMU, so this might actually be plausible, even though it's such a computer architecture anti-pattern to share system memory DMA across devices.

Can you list which ARM cores you know of that include an IOMMU? I’m personally unaware of any, as that is typically bundled as a separate IP package that must be integrated separately into the system, and is usually customized based on the number of supported masters that require virtualization.

E.g. the Xilinx ZynqMP includes the same Cortex-A53 complex the Raspberry Pi 3 has. They also included CCI-400 coherent interconnect switch to it, and also included the SMMU-500 IOMMU that partially interfaces with the A53 interconnect, but is effectively independently programmed and also controls access to DDR3/4 from the SATA, Displayport and PCIe controllers.

Per the original topic, have they released a full datasheet/reference manual for the Pi 4 SoC yet? I've yet to see one, other than a VERY high-level overview of its new pieces.


> have they released a full datasheet/reference manual for the Pi 4 SoC yet?

Ha. It's Broadcom... They're never going to release one.


Huh, so that's why the iPhone 6s's SecureROM memory regions weren't MMU-locked... IOMMU doesn't come in ARM by default! So you have to wire it up yourself (in your own IP blocks), and then hook it up in software everywhere you want it to work.

And all that costs extra developer time, and money.

Heh.

http://ramtin-amin.fr/#nvmedma


Best bet is probably the device tree.

What does "stomping on the memory of the network interface" mean?

Really terrible security-vulnerability issues, then? The DRI and networking kernel modules should absolutely not be able to interact with each other at all.

"kernel module" together with "should absolutely not be able to interact with each other" are an impossible requirement with Linux.

I think the other operating systems available for the Pi are roughly in the same boat (Windows & RiscOS). There was a nascent Minix port at some point, I wonder if it was abandoned.


Linux is (currently) a monolithic kernel and I'm not sure that can be accomplished without changing this.

The screen memory is taking up so much RAM that it's overlapping with regions of memory the network interface uses.

Resources are allocated via the kernel; it won't hand out overlapping address ranges.

Maybe the misbehaving driver is writing past the end of its requested space though, inadvertently? (I don't know if this is always called a "heap overflow" or if that's just Clang AddressSanitizer.)

Or something like https://mjg59.dreamwidth.org/11235.html is happening.

That resulted in a wide variety of different failures, from the kernel oopsing to various userspace components crashing. It would be very unusual to have unexpected DMA trigger such a specific failure.

(for avoidance of doubt, I wrote that blog post)


Out-of-bound memory write.

Why would that only interfere with the network driver, rather than tending to crash random userland or crash the kernel?

I don't know, let's see if anyone has an idea about it.

I was just explaining what the OP was asking for. I personally believe it's a EMI-related hardware issue.


I don't agree with how likely this is given the specificity of the bug, but should be super simple to test.

Try to reproduce with a different OS/kernel.


https://twitter.com/assortedhackery/status/12000566338980290...

An actual measurement showing that a Pi with HDMI at the affected resolutions radiates over the bottom end of the WiFi band.


Mostly because of a known history over the past couple years of USB, WiFi, and/or HDMI causing direct interference with each other. See lots of other comments upthread about similar RF issues people have had, stretching all the way back to 486 laptop keyboards :)

true

It certainly sounds more like a software issue than some arcane effect from RF interference or the like. Could be memory getting smashed, a bus getting saturated, an interrupt not getting serviced, or any similar thing.

Meh. I've done low-level embedded/mobile for a long time now. This actually sounds like a totally reasonable RF interference issue. 2.4 GHz is funky and has desense issues with lots of internal buses (I'm not a HW engineer, so not sure why that band specifically). Also, radios typically have to accept interference, which means the radio would "stop working" rather than causing the display to work weirdly (ironically, a much easier failure mode to diagnose/notice).

When the late-2016 MacBook Pro came out with only USB-C, I had to buy a USB dongle from Amazon (the included one didn't have enough ports). If I booted the MacBook into Windows with the dongle connected, the 2.4 GHz WiFi would stop working while the 5 GHz kept working.

Duly noted! I've been out of the embedded space for a long time (I think the last board I worked with was i386EX based) but I'm getting back into it now with an ESP32 so this might actually come in handy. Thanks! :)

Is it using main memory as the video controller's memory? It may be out of memory bandwidth.

That was my first thought as well, but scanning out that resolution should use less than 1 GB/s of memory bandwidth which is nowhere close to the DRAM speed. And usually in that situation you get horizontal speckles in the video output.

Some MacBooks have the same issue (interference). Making sure WiFi is working in the 5 GHz band is one of the workarounds.

I’d want to know if the Bluetooth stops as well since it’s in the same frequency range.

Maybe it doesn't have enough power and prioritizes the GPU over the WiFi antenna?

The wifi antenna connection is analog with its own dedicated wiring.

First thought as well. Didn't see a way it could determine that tho.

I know Pis soft-require more power than the supplies most people give them, and they do things like CPU throttling when undersupplied. I think I've also read that WiFi and/or Bluetooth may stop working when underpowered. So it may not be a balancing thing so much as graceful degradation.

Ah WiFi radio interference, everyone's favorite! It kills Thunderbolt sometimes:

https://www.akitio.com/faq/301-why-does-my-thunderbolt-3-dev...

> Why does my Thunderbolt 3 device not work with Dell's XPS laptops?

> Some users on the Dell forum have found that reducing the power output of the WiFi network adapter to 75% fixes the problem.


I have been using this resolution (console only, runlevel 3) with no problem with WiFi.

A Xenon flash used to freeze the Pi 2.

I don't know why it became such a piece of breaking news.

When a current flows through a p-n junction, photons are emitted (an LED is just a diode that happens to emit photons at visible-light wavelengths). And it works both ways: if you hit a p-n junction with photons, you produce a current. Not only in LEDs; any diode will do that. They're all potential photodiodes, it's just that some are more sensitive than others.

You can cause a lot of chips to reset if you shine a bright beam of light to its exposed die, a common way to test chips.

It's also one reason (in addition to cost) that most diodes are sealed in plastic packages, not glass. Fun experiment: buy some 1N4148 small-signal diodes in a glass package, connect one to a Darlington-pair transistor amplifier, and hit it with a flashlight; you'll see some funny things on the oscilloscope.


I think the newsworthy part of all that wasn't the physics involved, but more the design-choices on the Rpi 2.

Putting a light-sensitive chip (I think it was a wafer-scale package with no casing) on a board that's intended for use outside of an enclosure was a really big oversight.


> a really big oversight.

I agree.


> I don't know why it became such a piece of breaking news.

Popular buzzword (Raspberry Pi), surprising unexpected outcome (most consumer electronics people are familiar with don’t react to light), manufactured outrage/schadenfreude (look how they screwed this up!)


> Popular buzzword + Surprising unexpected outcome

Good analysis. In both cases, I see the popular press reporting them in personified language: "Xenon Death Flash, or Why the Raspberry Pi 2 is Camera Shy", and "Why the new iPhone is Allergic to Helium". If we replace "Raspberry Pi 2" with "semiconductor p-n junctions", and replace "new iPhone" with "MEMS oscillators", it probably won't be news anymore.

> manufactured outrage/schadenfreude

An interesting case as well. I see both incidents as undesirable side effects that are better prevented, but I don't think they are major design flaws.


Your version doesn't just remove any personification, it gets rid of the entire concept that these are exposed parts on consumer devices causing them to fail.

So sure, that version would get no notice.


One of my professors told a story of reporters crashing one of AT&T's new digital exchanges because their camera flashes erased some of the EPROMS used in the system.

Helium will also cause it to stop working

Helium will make the iPhone gyro stop working for a few days.

Almost forgot how cool MEMS are

The whole thing, actually, not just the gyro. The clock is a MEMS part and helium breaks it, apparently. The newer models have a version that fixed that particular problem; it was in the fine print in the manual somewhere.

Sounds like a turkey day mystery!

You were missing a router swap on your list.

While it might seem a bit overkill, try restarting your router's WiFi to see if it magically works. I had a war with a Pi Zero W not that long ago... turns out the 2.4 GHz band wireless would just die sometimes. Turned out to be an issue with the router.


Was it a bug in the router or was it a bug in the Pi that happened not to get triggered by a different router?

It was ultimately the router. Specifically, with NAT on the 2.4 GHz band; 5 GHz kept going just fine. After looking at the manufacturer's website, the solution was to either buy a new router or disable port mapping.

Monitors creating harmonic interference with wifi is a very common problem, and most computers don't give any warning of this. I don't see why it wouldn't be possible to say "the wifi won't connect, it may be because of your refresh rate".

Reminds me of an old adage about software:

The plane can't take off because the carpet is the wrong kind of orange.

ObligAnecdote: I once had a keyboard that wouldn't work when the monitor was outputting at 75 Hz. Had to be 60 Hz or else nothing. The joys of wireless keyboards.


Has anybody else got 4K video working on the Pi? Doesn't work for me on Raspbian at all.

I have over VNC

I have this issue with a Zero W. wlan0 would just disappear. I tried jessie, stretch, and now buster. It had been connected to an ultra-wide monitor (but not at 2560 width, obviously).

I can't get it to fail now, but it is not connected to a monitor anymore.


Any other SBCs that can drive a high res display @60hz just for normal computing purposes? No gaming/video encoding use cases.

Nvidia Jetson Nano. They're the GPU experts after all.

https://developer.nvidia.com/embedded/jetson-nano-developer-...

Don't let the AI-heavy marketing distract you. It's effectively an Ubuntu PC.



I have successfully used the Pi4 at 4K. It is a very slow computer for “normal computing purposes” though.

> It is a very slow computer for “normal computing purposes” though.

Can you elaborate on what you're doing that is slow? In my experience, CPU-wise it's plenty fast, matching a typical desktop from 2008. Although memory bandwidth could be better...


It's slow at 4k. I've got the 4GB variant and in 1080p it's good, I'd even say surprisingly so.

Ah, so you mean graphics performance. Yeah, just not enough memory bandwidth. A 32-bit DDR4 memory interface is a bit narrow for 4K use!

RockPro64, Odroid C2.

It works fine with the Ar1 Pi case, which extends the USB-C ports and puts them 2-3 cm away.

It reminds me of this old demo of electromagnetic interference: http://www.erikyyy.de/tempest/

Sounds like an EMC or clock error to me.

Please don't use code blocks to quote text, they are impossible to read on mobile.

> Had the same problem with Raspi 4B. Problem was dependent on screen resolution (!!). With 1920x1080, wlan0 became disconnected after it was ok at lower screen resolutions. Was in 2.4GHz band. After turning on 5GHz in the router and going into the network preferences (right click the network icon top right on screen) and SSID ... and checking "automatically configure options" the connection remains stable (so far :D ).


Ubuntu Bug 255161: Openoffice can’t print on Tuesdays

https://news.ycombinator.com/item?id=8171956


Please don’t post quotes as code - it makes it almost impossible to read on mobile.

i read this so often, why doesn't hn fix its css?

because hn is infallible, all hail hn

Honestly, I have no idea what the right way to format quotes is. And sorry for the inconvenience.

You could use "quotes" to format quotes. Just a suggestion.

> Honestly, I have no idea what is the right way to format quotes.

Most people on HN do this. It doesn't format the quote in any way, but it works.

¯\_(ツ)_/¯


It's fine. People can scroll on mobile to work around a poor design that fails to take into account styling norms such as indenting quotes.

Don’t worry about it too much. A large part of the problem is that HN comments don’t have a method of delineating quotes, so people default to this because it looks fine on desktop.

You did the right thing. No need to apologise. Don't listen to the whiners with Stockholm syndrome.

The RPi doesn't feel very minimalist.

I wish the RPi were a standalone microserver with its own flash memory (SD cards are known to fail) that you could plug into your home network to act as a personal server.

Maybe one day it will be possible to do some basic GSM data with an RPi.


If they didn't use SD cards, the storage would be more reliable, but users would spend a lot more time fixing bricked boards. By allowing removable storage (in a format that can be plugged into any other computer natively) they solved that problem, because I can just re-image the card and get going again.

The philosophy of the RPi is that they won't add features unless the bulk of the user base would use them. For example, they were hesitant to even build the WiFi in, because users who wanted it could always get a USB chipset, and building it in adds BOM cost.

Because with GSM you'd also need a data plan for it, I don't really predict they'd add that, especially since you can already get it in HAT format.


M.2-to-USB converters work just fine, and an M.2 drive would be far more reliable than μSD cards, even the high-endurance ones.

You're talking about a $40 tool just to flash the storage, vs a part (the micro SD adapter) that they give you with the SD card for free because it's so cheap.

Also, doing a quick survey of the rated mating cycle counts on M.2 vs SD card slots:

M.2: I found this one [1] which is $0.768 for only 60 cycles

SD: This one [2] is $0.6256 for 5,000 cycles

I'm not sure why you'd say M.2 is more reliable, considering users often cycle storage dozens if not hundreds of times.

[1] https://www.digikey.com/product-detail/en/jae-electronics/SM...

[2] https://www.digikey.com/product-detail/en/gct/MEM2051-00-195...
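For a rough sense of scale, the cost per rated mating cycle implied by the two figures quoted above (connector prices and cycle ratings taken straight from that comment, so the usual datasheet caveats apply):

```python
# Cost per rated mating cycle for the two connectors quoted above.
m2_price, m2_cycles = 0.768, 60     # M.2 socket: $0.768, 60 cycles
sd_price, sd_cycles = 0.6256, 5000  # microSD socket: $0.6256, 5000 cycles

m2_per_cycle = m2_price / m2_cycles    # ~$0.0128 per cycle
sd_per_cycle = sd_price / sd_cycles    # ~$0.000125 per cycle

print(f"M.2: ${m2_per_cycle:.4f}/cycle")
print(f"SD:  ${sd_per_cycle:.6f}/cycle")
print(f"M.2 is ~{m2_per_cycle / sd_per_cycle:.0f}x more per cycle")
```

That's roughly a hundred-fold difference per insertion, which is the point being made about users who re-seat storage frequently.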



