Ask HN: Does your microwave interfere with Bluetooth? Mine does
119 points by Jeff_Brown on Oct 5, 2023 | 177 comments
I can see the Faraday cage in my microwave. It's never cooked anything outside of it. But if I put my phone on one side of it and a Bluetooth speaker on the other, running it interrupts the connection to the speaker. Sound gets through but it's choppy.

Seems bad, right?




Bluetooth and WiFi both borrow their spectrum from microwave ovens. It's typical and expected that microwave ovens will cause some interference with other users of the 2.4GHz ISM band that are very nearby. Microwaves operate at very high power levels and are required to be shielded for human safety, but the permissible leakage power is relatively high compared to the output of typical WiFi and Bluetooth devices, and there's a simple reason why: from a legal perspective, Bluetooth is essentially pretending to be a microwave oven and making use of the permitted leakage power.

This is the cost of the historical regulatory situation: most of these unlicensed radio services use the ISM bands originally allocated for microwave heating. One of the advantages of newer WiFi standards, particularly WiFi 6E, is that they finally change this situation by using the U-NII bands allocated specifically for unlicensed short-range digital communications, rather than for microwave heating.

Mind that this is all in the context of US spectrum regulations, although other countries have largely harmonized their approach. I have a lengthier treatment of the topic here: https://computer.rip/2022-04-14-unlicensed-radio.html


And on the follow-up, "Why use 2.4(5) GHz for microwaves?"

Because it happens to be a convenient frequency that water absorbs readily, which is the easiest way of heating up what we'd want to heat with a microwave (read: food).

https://en.wikipedia.org/wiki/Electromagnetic_absorption_by_...

And once we'd polluted the frequency for any stable commercial use, why not use this weird carve-out for a little thing called WiFi?

Rabbit hole: Apparently the precise mechanics of EM heating of molecules are surprisingly complex, and offer a number of frequency options. But this would have been the 1950s(?), so I'd assume they empirically determined a balance of functionality + technical feasibility for water, called it a lunch, and went to have five martinis in a practical fashion.


> "But this would have been the 1950s(?), so I'd assume they empirically determined a balance of functionality + technical feasibility for water, called it a lunch, and went to have five martinis in a practical fashion."

Or maybe they just wanted to revive frozen hamsters more humanely: "I promise this story about microwaves is interesting." by Tom Scott, about scientist James Lovelock - https://www.youtube.com/watch?v=2tdiKTSdE9Y


Holy shit. That at 101 I should be so blessed as to recall technical specifications from experiments performed decades earlier.

Sadly, it looks like Dr. Lovelock passed away in 2022, a year after that was recorded.

https://en.m.wikipedia.org/wiki/James_Lovelock

HN obit: https://news.ycombinator.com/item?id=32250694


To build off your rabbit hole, I've read about using the way microwaves selectively heat molecules in cooking. In particular, they can heat oil molecules in a way that increases flavor extraction without causing thermal breakdown of the flavors like uniform heating would.


To further build off your rabbit hole, this is also why 2.4GHz long-range WiFi connections (like via a Yagi antenna) are problematic: foliage fucks with your range. It's also why central air conditioners and wet walls create WiFi shadows.


But the rabbit hole goes further. Higher frequencies are more susceptible to wall-in-the-middle problems, as you can easily see with 5GHz WiFi; lower frequencies might be better but require larger antennas and might suffer from other limitations. As with everything else in engineering, this is a compromise.


IIRC, the precise frequency was further narrowed to _not_ be absorbed as well by water, because then all the energy would hit the surface of your food and not the inside (although I'm still not sure if that is relevant at a wavelength that is almost as large as the food you'd want to heat).

And then, again IIRC, finally pinned down slightly to the side of that because of the parts that could be sourced to build the first ovens.


Also, to the best of my understanding, the only known risk to human safety from a microwave without proper shielding is injury due to the heating of human tissue. Not that that's a good reason to ignore any potential health concerns, but I used to have the misconception that it was much more dangerous, e.g. that it could cause cancer; of course, ionizing radiation is at a much, much higher frequency.


The heating part can still be very bad news for your eyes, being the most delicate exposed tissue.


The international allocation of 2.4GHz spectrum for microwaves happened at the lobbying of the USA, but is global - and exists in order for airplane galleys to be equipped with microwave heaters and be allowed to travel internationally.


it's kind of funny because water resonates in that band, international consensus or not?


https://en.wikipedia.org/wiki/List_of_common_misconceptions:

> Microwave ovens are not tuned to any specific resonance frequency for water molecules in the food, but rather produce a broad spectrum of frequencies, cooking food via dielectric heating of polar molecules, including water. Several absorption peaks for water lie within the microwave range, and while it is true that these peaks are caused by quantization of molecular energy levels corresponding to a single frequency, water absorbs radiation across the entire microwave spectrum.


I would contest Wikipedia's description here a bit because I think it confuses the issue of microwave emissions. A number of ISM bands were allocated for RF heating applications originally: 6.78MHz, 13.560MHz, 27.12MHz, 40.68MHz, 915MHz (commonly referred to as 900MHz), 2.45GHz (commonly referred to as 2.4GHz), 61.25GHz, 122.5GHz. These are just examples taken from my NTIA chart, which doesn't call all of them out.

The lower ones are largely historic, the very first RF heating experiments used HF and VHF low band which were easier to produce with the radio transmitter technology of the time, but not very efficient at all. The invention of the magnetron changed that, suddenly it was much easier to produce microwave radiation at high power levels, and so the inefficient HF/VHF RF heating devices have all but faded away (there are some specific technical applications that remain exceptions, as usual).

The low frequencies are inefficient and difficult to produce with compact electronics, the high frequencies aren't very attractive for food heating applications because of the limited skin depth. So every microwave oven you're likely to run into operates at 2.4GHz, which is pretty much the sweet spot for food heating among the allocated ISM bands. That band is defined as 2.4GHz through 2.5GHz. So I quibble with describing microwaves as "broad spectrum." Magnetrons do not produce very narrow output, one of the reasons they aren't often used for radio transmitters today, but microwave ovens are required to constrain their meaningful output to within 50MHz of 2.45GHz. 100MHz is a lot of bandwidth from a modern radio communications perspective, but isn't really that wide from a perspective of physical effects.

2.45GHz was chosen as an ISM band in part because it had good properties for heating, but it wasn't put exactly on a resonance frequency for water or anything like that. The exact details of the selection process are obscure but 2.45GHz was already being used for experimental microwave heating before the ISM band was allocated, and I would imagine came out of some combination of ease of magnetron construction and reasonably good heating properties. It is documented, for example, that the EHF bands were never popular for consumer microwave heating because of poor efficacy with food, although they do have industrial applications (especially in welding).

Given the history of the topic there's a decent chance that 2.45GHz came about because it was being used by experimental radar at the time, the main thing that magnetrons were being built for. Microwave heating was basically a byproduct of radar development in its early days.


thanks for this great comment...i learned stuff i hadn't known for sure.


It’s a myth, see the Wikipedia link in an above comment.


Are they actually microwave instead of convection?


Depends on the equipment of the plane - the original reasoning for making 2.4GHz the international dumpster spectrum was to allow installing microwaves


But we need to switch off WiFi and Bluetooth


That's because the equipment installed in the galley is going to be tested for operation with the specific aircraft.

Meanwhile phone radios (not WiFi, not Bluetooth) used to have interference issues with various aviation systems (with some 5G bands still causing issues).

Additionally, outside of interference issues from phone systems (and various other non-ISM band radios), the order helps prevent people from doing stupid things with electronics during takeoff and landing, where sudden deceleration can cause things to go awry.


The restriction on cellular phone use on a plane comes from the FCC not FAA.

As far as the FAA is concerned as long as you can demonstrate (through standard tests) that a class of aircraft is tolerant of electronic devices (which the manufacturers can do), it’s up to the airline if they want to allow it. They’ve even greatly simplified the process due to the explosion of different electronic devices over the last couple decades. This covers your cell phone in the same way it does an e-reader or anything else. [0] There are basically no substantiated instances of portable electronics interfering with aircraft. Certainly not the number you’d expect given the massive number of devices that transit on a massive number of planes every day at this point.

The restriction on usage comes from the FCC, applies specifically to using land mobile frequencies from aircraft, and is not out of concern for safety of the aircraft.

Signals getting up into VHF, UHF and above use “line of sight” propagation. If you can draw a straight line between two points without hitting anything that will attenuate the signal… it’ll pretty much go anywhere. Voyager 1 is transmitting 15 billion miles to us on 8GHz with a 23 watt radio. Your cell phone can put out around 3 watts. People called in to ham radio nets in Vancouver, BC from Seattle, WA on a 5W handheld radio just by going to the top of the Space Needle. People regularly make contact with the ISS or satellites on similar power. A large limiting factor to the distance is the height of the transmitter. Thanks to the curvature of the Earth, over enough distance, that straight line between the two points will begin to intersect the Earth, which works great to attenuate the signals to nothing. Getting higher up lets you "shout" further out.
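
For a rough sense of scale, the standard radio-horizon approximation d ≈ 4.12·√h (d in km, h in meters, including typical atmospheric refraction) shows why height dominates. A quick sketch; the example heights are my assumptions:

    import math

    def radio_horizon_km(height_m):
        # Standard 4/3-Earth refraction approximation: d ~= 4.12 * sqrt(h).
        return 4.12 * math.sqrt(height_m)

    for label, h in [("handheld at street level", 2),
                     ("top of the Space Needle", 180),
                     ("airliner at cruise", 10500)]:
        print("%-24s -> horizon ~%4.0f km" % (label, radio_horizon_km(h)))

That prints roughly 6 km, 55 km, and 420 km respectively, which is why a phone at cruise altitude suddenly "sees" so many towers.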

Your cell phone transmitting at 3 watts from that kind of altitude is potentially hitting dozens or hundreds of cell towers. Frequencies which are intended to be shared among inhabitants in a radius of maybe a dozen miles now need to be shared across hundreds of miles, reducing the bandwidth available for everyone. And the weak signal will cause interference substantially further.

The FCC limited transmissions while on an aircraft to prevent interference with ground communications, and continues to because there isn’t enough evidence that they won’t.

[0] https://www.faa.gov/documentLibrary/media/Advisory_Circular/...


This is incorrect. The Federal Aviation Regulations (FARs) expressly forbid operating in IFR conditions (or an IFR flight) with cell phones on.


It's a bit of both. Aviation cares not just about the specific plane, but also about interaction with other systems - which in this case, in the USA, are in the purview of the FCC.

That said, both reasons are involved, though mobile networks usually tend to optimize for signal in the horizontal plane in a way that reduces their receiving capability from outside of it.


I've read that the way to determine if your microwave is properly sealed is to put a phone inside of it, close the door, and then try calling it. I was unable to get the phone to ring the last time I tried this, ~12 years ago, though if there's some expected leakage, I suppose proximity to the cell tower is a factor


If anyone’s interested in seeing the leakage on a spectrum analyzer:

https://youtu.be/tToMYkO7Gkw


And now my new microwave actually has Wi-Fi built-in; we've come full circle.


That's probably a good thing, because it means they have to take extra care to make sure the shielding is good enough that microwave energy used for cooking won't interfere!


Except it disconnects from the Wi-Fi when running, and only re-connects afterwards to send the notification about completion! A total cop-out...


Could I make a legal bluetooth jammer that is actually a small microwave oven then?


Probably not legally.


There's about 1000 watts of RF inside the microwave, it has to be attenuated to less than a few microwatts on the outside to avoid congestion, as they use the same frequency band.

I'm amazed WiFi or Bluetooth ever works at all. 8) You can thank Hedy Lamarr for that.

Try cleaning the mating surfaces around the door thoroughly. If that doesn't work, consider replacing the microwave or relocating the speaker.


Wifi has an "interference robustness" option which is designed with microwave ovens in mind. The power output of a microwave is a sine wave, because it's powered from your AC line power, and doesn't emit energy during the zero crossings. "Interference robustness" times packet sends (and lengths) to be when the microwave is producing the least amount of power. Thus, you spend half of every 60th of a second irradiating your food, and the other half downloading videos. (Needless to say, having this on and your microwave going in the background reduces throughput. But, dropped packets are worse than small packets that always get through, and this aims to eliminate completely stomped-on packets.)

Wifi also has a listen-before-talk model. If it "hears" a microwave running, it thinks it's another station transmitting, and backs off. This feature of the protocol is why long-range networks worked so poorly in the past. If station A can't hear station B, but the AP can hear both of them, then A and B are going to step on each other and the AP won't be able to communicate with either station. This is why the "enterprise" way of deploying networks was to have a ton of access points running at low power; that works well with the listen-before-talk model since the AP likely can't talk to or hear stations that are too far away for the stations in its range to hear.

I don't know if interference robustness still exists in modern standards, as I haven't seen it in a control panel for decades, but it was definitely in 802.11b. I have never tested Bluetooth (or read the standard), but basically... the industry knows this is a problem, and handled it a long time ago. Bluetooth might ignore the problem because it plans on frequency-hopping (away from the microwave) anyway, but like all software, that can easily be bugged.
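
As a toy sketch of the timing idea (not the actual 802.11b implementation, which I've never seen documented), the scheduler amounts to something like this, in Python:

    import time

    MAINS_HZ = 60                  # assumed US mains frequency
    PERIOD = 1.0 / MAINS_HZ        # ~16.7 ms full mains cycle

    def magnetron_quiet(t):
        # Toy model: a half-wave-rectified magnetron emits during one half
        # of each mains cycle and is silent during the other. A real
        # implementation would have to learn the phase from the observed
        # interference rather than assume it, as this sketch does.
        return (t % PERIOD) >= PERIOD / 2

    def send_when_quiet(packet, radio_send):
        # Busy-wait for the modeled quiet half-cycle, then transmit.
        while not magnetron_quiet(time.monotonic()):
            time.sleep(0.0005)     # poll every 0.5 ms
        radio_send(packet)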


I have an inverter microwave. Before I went crazy with APs around my house, that microwave would win the war; no zero-crossings to sneak a packet by it.

Now my bluetooth only drops when I'm fairly close to the microwave as it runs. (EDIT: my WiFi, not my Bluetooth)


Are you saying that adding WiFi APs improved your Bluetooth connections when the microwave is running? Or is there such a thing as Bluetooth APs? Pardon my confusion.


Wow. I have no idea how I confused that.

Yeah, the WiFi is improved. The Bluetooth is the same as it ever was.

I pardon your confusion.


Cool, I just thought I was missing out on something. :-)


There is also a fun regulatory hack in bluetooth. As it is frequency hopping, FCC considers it spread spectrum, but only if it hops across a large enough set of channels. This matters because spread spectrum radios can run significantly more power.

Bluetooth's frequency hopping system avoids interference to it and from it by dropping channels with interference or other users from the hop set.

So with enough interference, even momentary interference, the hop set shrinks, and regulatory limits require dropping power from 100-200mW to 25mW. This usually cuts the connection.
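
A rough sketch of that bookkeeping (79 channels and the minimum adapted set of 20 come from the Bluetooth spec; the power thresholds below are just the parent comment's numbers, which I haven't verified):

    ALL_CHANNELS = set(range(79))  # BR/EDR: 79 x 1 MHz channels
    MIN_CHANNELS = 20              # spec minimum for an adapted hop set

    def update_hop_set(hop_set, bad_channels):
        # Drop channels that showed interference, but never below the floor.
        candidate = hop_set - bad_channels
        return candidate if len(candidate) >= MIN_CHANNELS else hop_set

    def allowed_power_mw(hop_set):
        # Parent's rule of thumb: a shrunken hop set forces lower power.
        return 100 if len(hop_set) >= 75 else 25

    hop = update_hop_set(ALL_CHANNELS, bad_channels=set(range(30, 79)))
    print(len(hop), allowed_power_mw(hop))   # 30 channels -> 25 mW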


The microwave RF emission is actually very narrowband, though the frequency drifts all the time. Bluetooth uses FHSS so it will eventually get through, unless the noise is so powerful that it saturates the receiver.

I'd post a screenshot from a HackRF-produced waterfall, but I don't have a microwave :). Some wifi controllers can measure energy in the spectrum and can be used to plot a simple waterfall.


>but I don't have a microwave :)

Living the dream!


> Try cleaning the mating surfaces around the door thoroughly.

(I'm obviously not an electrical engineer, given this is 10x stuff) If we're talking a 2.45 GHz microwave signal, that's a 12.2 cm wavelength.

But I thought for shielding you only needed to have gaps of <wavelength to null emissions.

Is there some fractional-wavelength propagation, or is my understanding of EM shielding off-base? How are microwaves noisy? E.g. https://physics.stackexchange.com/questions/269672/does-a-fa...


The gap around the door might only be millimeters wide, but it is 30cm or more long.

It is "both* dimensions that must be a long way below the wavelength to keep microwaves in.

Instead the door seal uses a technique called a quarter wave choke. It relies on reflecting back any microwaves wanting to escape, and by making incoming and reflected waves perfectly cancel, no power is transmitted.

As OP said, with even a small amount of soup dripped onto the seal, your microwaves will start escaping.
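
The arithmetic, assuming the nominal 2.45 GHz: the choke slot is sized so a wave entering it travels a half wavelength round trip and comes back inverted, cancelling the wave trying to escape.

    c = 3.0e8                      # speed of light, m/s
    f = 2.45e9                     # nominal magnetron frequency, Hz
    wavelength = c / f
    print(wavelength * 100)        # ~12.2 cm
    print(wavelength / 4 * 100)    # ~3.1 cm: quarter-wave choke depth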


> Is there some fractional-wavelength propagation?

"Propagation" is probably not the right word, but fractional wavelengths can "leak" some amount of field.

https://en.wikipedia.org/wiki/Evanescent_field


Notably, that field is powerful enough that microwave doors need to be about an inch thick. The rest of the inch is just empty space and a sheet of plastic to stop you putting your fingers into the evanescent field.

Unfortunately there were a few microwaves sold recently in Europe which forgot that design element, and if you pointed at food while cooking it, you would cook your fingers too.


You're right. The rule of thumb for EMC engineers is to bond joints at no further than wavelength/10 where the wavelength corresponds to the highest frequency that you wish to maintain good shielding effectiveness. Some MIL projects use wavelength/100.
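
For concrete numbers at 2.45 GHz (wavelength ≈ 12.2 cm, my example frequency): wavelength/10 means a bond roughly every 12 mm, and the wavelength/100 MIL figure tightens that to about 1.2 mm.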


If it's an old microwave, replacing the door seal would be cheaper and easier.


I wonder if building a faraday cage around it would do anything.


> You can thank Hedy Lamarr for that.

eh... maybe. Don't forget the microwave isn't CW, so there are plenty of transmission slots available on the off cycle.


That's inconvenient, but it's definitely expected.

From https://en.wikipedia.org/wiki/ISM_radio_band :

> The ISM radio bands are portions of the radio spectrum reserved internationally for industrial, scientific, and medical (ISM) purposes, excluding applications in telecommunications. Examples of applications for the use of radio frequency (RF) energy in these bands include radio-frequency process heating, microwave ovens, and medical diathermy machines. The powerful emissions of these devices can create electromagnetic interference and disrupt radio communication using the same frequency, so these devices are limited to certain bands of frequencies. In general, communications equipment operating in ISM bands must tolerate any interference generated by ISM applications, and users have no regulatory protection from ISM device operation in these bands.

> Despite the intent of the original allocations, in recent years the fastest-growing use of these bands has been for short-range, low-power wireless communications systems, since these bands are often approved for such devices, which can be used without a government license, as would otherwise be required for transmitters; ISM frequencies are often chosen for this purpose as they already must tolerate interference issues. Cordless phones, Bluetooth devices, near-field communication (NFC) devices, garage door openers, baby monitors, and wireless computer networks (Wi-Fi) may all use the ISM frequencies, although these low-power transmitters are not considered to be ISM devices.

So basically the microwave oven's Faraday cage needs to block enough for safety. There are regulations about the radio spectrum, but they allow it to emit some.


My data is anecdotal, but I've observed that Panasonic inverter ovens that I've used interfered in the 2.4GHz range, while models of other brands (e.g. GE) have not.

(This is unfortunate because Panasonic seems to be the only brand that can actually adjust power output, whereas the others simulate lower power levels by cycling on and off.)


Adjusting the power output is basically the definition of “inverter”. A few brands offer it; it seems like it was probably patented, as it was only available from 3-4 higher-end, well-known brands, at least in Australia.

Inverters themselves are potential noise sources, though, so they may be part of the issue; other implementations may not interfere.


You can thank terribly written intellectual property laws for that exclusivity. It’s not like inverters are some kind of new technology, and yet here we are.


I suspect the inverter design is now cheaper too.

The traditional design needs all power to go through a transformer. A 1 kilowatt 60Hz transformer necessarily uses a lot of copper and steel. The inverter design can use MOSFETs (theoretically cheap, but a reasonable IP cost) and far less copper and steel.


I know of at least two Panasonic inverter microwaves that failed within a 5 year period of ownership.

Mine emitted white smoke warming up some tea while I was in another room. I hope to God it wasn’t beryllium.

I still have a Panasonic OTR microwave, but it’s inverterless. It appears to be an improved design of a GE model from the same OEM.


Probably burnt paint from the inside of the waveguide.

Happens frequently when the mica sheet that covers the injection port gets moisture from steam (who'd have thought - steam in a microwave?!?)

Simple fix is to replace the mica (a few cents from AliExpress) and use steel wool to get rid of any carbon residue around the injection port.


It sounded like there was no magnetron load on the power supply, because the cooling fan was spinning a little faster and the interior was a little brighter. This was during the few seconds between the smoke show and me pulling the plug.

There was indeed a char mark on the mica sheet, but the beryllium terror at the time was enough for me to chuck it.


You would expect less load on the power supply if there was an arc causing a short in the waveguide.


We have a Panasonic Inverter microwave here, must be 20 years old. It works great and I have never noticed any interference with WiFi or Bluetooth, both of which also get a lot of use in the kitchen. Just another anecdata point!


The earlier inverter models sound like they’re workhorses. The problematic ones I knew of were from around 2012 - 2014.


Oh no. Is that something that happens? Mine has smelled a little like metallic smoke lately.


LG now sells inverter microwaves under the NeoChef brand, I believe. I saw one in a second hand store recently so they've existed for a while now. I haven't tested one to see if it interferes with anything, though.


I have one, and yes, much interference.


GE inverter microwave with interference 20-30ft away and it goes through a wall.


At an old apartment my Chromecast would become unusable whenever I'd use the microwave in the other room. It took me a bit to understand why every now and then it would just take a dump, turns out it was my roommate cooking dinner.

Meanwhile my current microwave I can cook and be on bluetooth headphones paired to my laptop across the house and there's no issue.

Some microwaves are better shielded than others. It might be leakage from the actual cook box, it might be leakage from all the extra circuitry.

Even though the frequency of most microwaves' primary element is going to be a little higher than what Bluetooth is supposed to run on, if there's enough energy leaking you'll still potentially drown out the signal. Filters, especially ones made to be kind of cheap, aren't perfect and can't always filter out everything.

And as mentioned you're trying to catch a few milliwatt signal right next to something that's trying to generate and contain a 1,000,000 milliwatt signal.


Bluetooth is one thing. Imagine spending 4 years trying to figure out the source of some mysterious signals registered by your giant radio telescope, only to find it was due to the microwave[1][2] used by the operators to heat hot pockets or whatever.

[1]: https://www.theguardian.com/science/2015/may/05/microwave-ov...

[2]: https://www.nature.com/articles/521129f


Now I'm imagining a collaboration among radio-astronomers to microwave their lunches at the same time around the globe. Who would set up the prank: grad students making a secret pact at a conference to prank their professors, or vice versa?


Can I interest you in some land in western Virginia?

https://en.wikipedia.org/wiki/United_States_National_Radio_Q...


"Honey! I'm detecting some some major CMB! You gotta see this! Can you pop my lunch outta the micro too?! Thanks!"


I had a colleague who "discovered" a pulsar that later turned out to be a radar at San Francisco International Airport. I occasionally still text them photos of the "pulsar" when I have a layover at SFO.


My neighbors have something crazy emanating from their house. None of my Bluetooth devices, from Bose to Plantronic to new AirPods Pro, survive signal connectivity when I walk past their house. The connection gets very disrupted and the devices have trouble reconnecting.

So I have to leave everything off when I go for a run and not connect headphones until I make it to the end of the street. It’s weird.


Crazy story that someone at the local radio astronomy observatory told me.

Context: Radio observatories need to minimize as much radio interference as possible. Typically, they are some distance away from population, and people are asked not to use phones within a couple of km of them. Inside the premises no unshielded electronics are allowed. If any are used, you can immediately see the effects on the data being collected by the telescope.

Anyway, these guys were getting some sort of strong interference signal at 4 pm every day. They could not figure out where it was coming from. They eventually decided that it was not coming from within, but from somewhere outside the observatory. They got some triangulation equipment out and over the course of several days, finally determined that the signal was coming from a house a couple of km away.

So they went over and knocked. Asked the owner what was going on. Turned out, the guy had an electric can opener, and every day at 4 pm he would open a can to feed his dogs. That was the interference signal they were getting all the way to the observatory!

Eventually, after some back and forth, they got the guy a new can opener they had vetted to not cause an interference signal.


Sounds like they have something pretty powerful swamping out that band.

Even though the ISM (2.4 GHz) band is unlicensed, there are still regulatory limits to the maximum emission levels. Devices that exceed those levels are illegal to use. The FCC can apply hefty fines in certain cases.

Fun story, I used to live in an apartment building and my car key fob wouldn't work when I parked close to the building. It would work on the other side of the parking lot though. Turns out someone was using a jammer because they hated the noise that cars make when they lock and unlock. They tried to blame a nearby military base, but I had some RF test gear and located the culprit. They turned it off pretty quick when I showed that they could be fined $10k per day.


Can you recommend gear for this? I also have a major interference issue on my street, and I'd like to identify the vendor by MAC address, or get some information about what's causing the spectrum issues.


Sometimes it’s things like baby monitors or low-cost cameras blasting analogue video.


A lot of current video baby monitors are actually digital video over DECT.


The same can also be very localized. On my way to work there is a 2x2m spot in the middle of open space that does that too.


Old CCTV cameras cause this. They use the whole 2.4 G band and transmit at high power, acting as WiFi and Bluetooth blockers for about 100m around.

If someone tries to use WiFi, it garbles the video feed too.


could be a continuously recording wireless camera that likes to hog spectrum


I live in deep suburban/exurbs and this feels quite likely, given the neighborhood. Everyone’s wired for cameras and Amazon Key, etc.


Microwaves typically run at 2.500GHz; most Bluetooth and WiFi is from 2.400 to 2.480GHz to keep some space between them (so if you want to minimise WiFi/microwave issues on 2.4GHz, use a low WiFi channel).

Bluetooth is adaptive and will hop frequencies to find quiet space in the range above, however microwaves are an intermittent source so when they go on the leakage will kill any bluetooth that's on a nearby frequency.

An easy way to see this is with a BBC micro:bit; you can measure the signal strength on channels 1 to 100 (2.4 to 2.5GHz in 1MHz steps) and so plot the local RF sources (WiFi, Bluetooth, microwave, etc.).
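
I don't remember the exact micro:bit API, so here's a device-agnostic sketch in Python: given some way to read RSSI on a 1 MHz channel (read_rssi below is a placeholder you'd wire to your radio), each sweep prints one row of an ASCII waterfall:

    def read_rssi(channel):
        # Placeholder: return signal strength in dBm for `channel`
        # (0..100 -> 2400..2500 MHz). Wire this to real hardware; the
        # constant just makes the sketch runnable as-is.
        return -95

    BUCKETS = " .:-=+*#%@"         # quiet ... loud

    def sweep_row(lo_dbm=-100, hi_dbm=-40):
        row = []
        for ch in range(101):      # 2400 + ch MHz
            frac = (read_rssi(ch) - lo_dbm) / (hi_dbm - lo_dbm)
            idx = max(0, min(len(BUCKETS) - 1, int(frac * len(BUCKETS))))
            row.append(BUCKETS[idx])
        return "".join(row)

    # while True: print(sweep_row())   # one line per sweep = a waterfall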


Where are you getting 2.5? 2450 +/- 50 is what I have seen.


From Nordic Semi (nRF52 chips behind most BLE/ANT+ implementations these days). Seems microwave ovens use the + side of the tolerance (2.45 + 0.05), so they suggest sticking to 2.40 to 2.48 (channels 1 to 80 in their parlance).


Interesting, I wonder if this might just be due to the dominance of one particular magnetron design / manufacturer. Thanks for getting back to me.


Around 15 years ago my wireless Magic Mouse's pointer movements would become so choppy as to become unusable not just if my microwave was in use, but also if a neighbor's was.

But upgrading to a new laptop+mouse fixed it, and I've never had a problem since.

Since they're on roughly the same frequency, interference makes sense. Microwave ovens are high-energy, Bluetooth is low-energy, so minor leakage can still have a big effect. But there's no health concerns or anything, precisely because it's still so low-energy. (You can't cook food with Bluetooth!)

But it does seem like some Bluetooth chips/stacks are better at hopping around frequencies to avoid it than others, or that particular devices just develop bugs.


I've never seen this with a microwave, but this exact thing (Bluetooth mouse choppiness) happens with interference from other Bluetooth devices in the vicinity, especially when pairing. Most microwaves do produce some BT interference, so I am not surprised that the symptoms would be similar.


Tangentially related: knowing they use the same frequency band, I actually used the work microwave oven as a faraday cage for testing our product which used bluetooth. (Close door, observe signal drop behavior). I was bemused to note that you could still connect to a device inside the oven, over short distances... Hopefully the attenuation was sufficient for safety when using it to cook!

[ For clarity - the oven wasn't on when I used it as a faraday cage]


There's a particular spot on a local highway that interrupts my wireless CarPlay connection if I spend too long in it (eg when traffic is slow). It's right next to an exit with a bunch of buildings (including a hospital) so I'm sure there is some massive emitter in one of those buildings. I'm still at a loss about what could cause that kind of interference inside a major city and still be legal.


Sounds like Berkeley or North Oakland. :) I have the same issue, I never considered Bluetooth interference.


WiFi and Bluetooth have basically zero legal protection. FCC Part 15 "this device must accept any interference received, including interference that may cause undesired operation."


MRI?


Anytime somebody runs the microwave in my office, my headphones start crackling. I'm sitting probably 20 feet from the microwave and my computer (source of the bluetooth signal) is right next to me.


Office microwaves (assuming it's not just a retail one that happens to be in an office) can be overpowered. Think the last one I was near was a 2.5kW unit.


Good lord, who uses that much power in a microwave? 600W are enough.

(Clearly, the idiots microwaving fish in them. In one office I know they put a sticker with a crossed-through fish symbol after one particularly pungent incident)


Yes, 600W are enough, but I once had a 1750W microwave and it was luxurious and ridiculously fast at warming. I would pop a bag of popcorn in like 30 seconds, or warm up a slice of pizza in 15 seconds. It's like the difference between a Camaro and a Geo Metro


> Good lord, who uses that much power in a microwave? 600W are enough.

And next thing you'll say is that 1000W is enough for a kettle? 110V 15A AC is enough for an outlet?

Here in Europe, we don't have the patience for slow-boil kettles and slow-boil microwaves :).

(FWIW, I always stick to the full 1000W available on my microwave, and sorely miss the 1250W that my dad brought from Sweden when I was a kid.)


I'm German. High-powered microwaves IME only make for highly unevenly warmed food.


Uhm. Have you seen https://www.miele.com/brand/de/revolutionary-excellence-3868... ? (scroll down a little bit, until the fish in the ice block appears)

Just about 10kEUR :-)

I think that's the solid state microwave emitters arranged in a phased array finally making their entry into the mass market.


I do. I've got a 1650 watt microwave. It's nice being able to reheat things dang quick. And it cooks a mean baked potato in no time!

Loads of recipes I run into assume at least 800W, usually 1,000W. I'd be pretty frustrated with only 600W. I can always run a high power one at less power, I can't run a low power one higher. I'll always take the more powerful microwave.


My headphones are fine, and the battery doesn't drop out either -- this 3.5mm cable is great.


I think we should adopt it in devices like phones; after all, who doesn't like cheap quality audio hardware that just works (TM), and for some reason doesn't degrade or die after a few years due to batteries?


I hate cabled headphones on a portable device

Physically tethering to something designed to be slipped into a pocket, put in a case, set down on the table while I walk around, etc. is stupid

Physically tethering to a stationary object (that you can't really use if you walk away form it) makes sense in some cases


I like headphones primarily on devices that go in my pocket because they move with me. When I use headphones at a non-portable device I'm likely to damage them by walking away with them on.

I use a BT headset at my PC for this reason; I can get up and pace &c. without worrying.

Quick tip if you are using wired headphones while doing chores: run the cable under your shirt; that should leave little-to-no exposed cable to snag on things.


>run the cable under your shirt; that should leave little-to-no exposed cable to snag on things

...except your shirt when you take your phone out of your pocket to change what's playing, answer a message, etc :|


I am also a get-up-and-pacer, and agree with your reasoning completely.


I pace, too... but not if I'm using a desktop - only when on the phone and (occasionally) with a laptop


What happens when you place your phone in the microwave (don't turn the oven on, obviously) and walk away with your speaker? I'm curious what kind of range you're getting.

For reference: I just tried this with iPhone 13 mini + WH-1000XM3 and the connection dropped after ~5 meters.


Microwaves are well known to interfere with 2.4 GHz communications. That’s one of Bluetooth’s channels.

https://en.m.wikipedia.org/wiki/2.4_GHz_radio_use


I used an SDR to check my microwave's emissions. They were pretty narrow and stable-ish. Not stable like an actual radio, but not all over the place wrecking all of the wifi channels.

I've not had it interfere with bluetooth or wifi. Bluetooth frequency hops, and moves away from channels with interference (dropped frames), doesn't it?

I have a couple of illustrations at https://blog.habets.se/2017/06/Microwave.html


> Bluetooth frequency hops, and moves away from channels with interference (dropped frames), doesn't it?

Hops yes, remembers to avoid certain channels, maybe. IIRC the hop sequence is controlled by the master device so that might add a layer of confusion if it isn't experiencing the issue.

> I used an SDR to check my microwave's emissions. They were pretty narrow and stable-ish.

Yes, the issue is it's manufacturing dependent, so a different batch of those same magnetrons the next week would have different properties. Hence the wide band.


Yes, my flatmate had a very old 20+ yrs old microwave (the type with a mechanical dial) and when he used it my WiFi would completely drop out until the cooking was done.

I bought him a new microwave because I was sick of dropping zoom calls. The modern LG microwave was much better and has virtually no effect on the WiFi.


You can demonstrate WiFi interference by putting a laptop next to your microwave and running a ping test to your WiFi router. I can put my laptop about 4 feet away from my microwave and the ping test hangs as soon as the microwave starts. The ping test resumes as soon as the microwave stops.
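
If you want timestamps instead of watching the terminal, a small wrapper around the system ping does it (the router address is an assumption; -c/-W are the Linux flags, adjust for your OS):

    import datetime
    import subprocess
    import time

    ROUTER = "192.168.1.1"         # assumed router address; change to yours

    while True:
        # One echo request, 1-second timeout (Linux ping flags).
        ok = subprocess.run(
            ["ping", "-c", "1", "-W", "1", ROUTER],
            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
        ).returncode == 0
        if not ok:
            print(datetime.datetime.now().isoformat(), "dropped")
        time.sleep(0.5)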


I haven’t had my afternoon coffee yet, and I read this as though the ping test stops when you put your laptop in the microwave and turn it on.


It should stop as soon as you close the doors, I think, or else RF interference is the least of your problems with the microwave.


Lol! It's tempting. Very tempting.


It did with anything 2.4 GHz (BT, WiFi, Logitech nRF, etc etc). I threw my microwave away. Not really missed it, it was a waste of space, but anyone who wants to do a deauth attack can also get you to disconnect from WiFi.

Right now we've got two airfryers, an oven (airfryers are basically mini ovens), and a mini pizza oven. The latter is pretty bad and hard to operate, but because our main oven is broken, it's as good as it gets. Not much edible comes out of a microwave. The tastes are almost always bland. I'd rather not eat. For my young kids I get to cook plain stuff; they don't enjoy anything complex but like the same stuff, like pasta, over and over again. We used au bain-marie in the past. It requires a little bit more planning but nothing dramatic.


I use my microwave constantly, but it's all for defrosting.

Once the food is no longer frozen but not yet piping hot, it then goes into the toaster oven or skillet or whatever to finish heating including crisping/browning.

It's great because it not only saves significant time, but loses less moisture. Heating from frozen in an oven dries things out too much, or you have to use up aluminum foil to wrap it, which is annoying and a waste.

Also obviously microwaves are great for soup.


We eat soup once a week, on Saturday usually. I open the package (like this [1]), put it in a pan. Warm it up and... have soup. With a fresh baguette, some hummus or aioli or whatever. No microwave required.

If its frozen soup (made in bulk it is very cheap) then it just has to be put out early enough. A microwave could help to defeat bad planning or tough time schedule.

[1] https://www.ah.nl/producten/product/wi920/ah-rijkgevulde-tom...


Well sure. Like I said, the main benefit of the microwave is to defrost faster. Nothing requires a microwave.

And soup doesn't benefit from browning or crispness so you can heat it up in the microwave the whole way.

There's nothing wrong with the pan, it just takes longer. And there isn't any taste/texture benefit over the microwave in the case of soup.


An interesting construction detail of the cheap modern microwave is that it only operates on one half of the mains electric waveform: microwaves use a single high-voltage diode which acts both as half-wave rectifier and voltage doubler. Thus the magnetron only operates for 10 ms in every 20 ms.

In theory 2.4 GHz communication protocols can easily time their transmissions to fit in the gaps left by the microwave. 50% bandwidth loss but no other effect.

This obviously isn't foolproof in practice, when 2.4 GHz was a thing I remember my WiFi dropping off whenever somebody was nuking some food. But perhaps this might have been a quirk of my Panasonic inverter microwave - which obviously is not the simple standard circuit.


The 2.4GHz spectrum is shared between Bluetooth and WiFi 802.11b/g. A few years ago, I was doing some work using an Ubertooth-One scanner (https://greatscottgadgets.com/ubertoothone/). It was showing the traffic on different channels.

My wife stuck a burrito to warm up in the microwave a room away (30-40 ft). This was with a brand-name model, so presumably properly shielded, etc.

Nope. The entire spectrum just went white with noise on all channels.

Once the microwave cycle ended, it still took a good 15-30 seconds before the airwaves calmed down and went back to normal traffic.


Microwaves aren't perfect Faraday cages and a little spectrum bleed does happen. Not enough to cook you, but definitely enough to make your WiFi call or Bluetooth device get iffy if the microwave is in the path between the sender and receiver.


There is this very popular night market where I live, and whenever I drive through a corner of this market, the Bluetooth in my car just stops. I always wondered what this was. There is just so much interference on a square meter.


Yes. I've wondered about this for a while. Because although the microwave is supposed to be fully shielded, when I take EMF readings on it with my TriField TF2 EMF meter, it's spitting out >100mW/m^2 when it's turned on. And I've seen this happen with just about every microwave I've tried this on. The only ones that haven't seem to be those expensive integrated under-counter ones where the tray slides out rather than opens like a door. Also my phone still works/receives calls when I put it in there, so it can't be as good of a faraday cage as it's supposed to be...


"Microwaves are supposed to be perfectly shielded" is one of those myths that just won't die. Microwaves are allowed to leak a lot, with exactly "how much" dependent on whatever agency writes the rules where you live of course =)

In the US, for instance, it's the Center for Devices and Radiological Health (CDRH), part of the FDA, that sets the rules for microwaves, with the performance standard set forth by CDRH allowing leakage (measured at five centimeters from the oven surface) of 1 mW/cm² at the time of manufacture, and a maximum level of 5 mW/cm² during the lifetime of the oven.[1]

A strong wifi router or bluetooth transmitter may be transmitting at one or two orders of magnitude greater than the microwave's allowed to leak, but if you're closer to the microwave than the wifi/bluetooth transmitter, or the microwave is simply between you and the wifi/blueooth transmitter, or especially if you have a low power transmitter, that microwave's going to wreak havoc.

[1] https://transition.fcc.gov/Bureaus/Engineering_Technology/Do...
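
To put rough numbers on that comparison, you can convert the CDRH density limits into an equivalent isotropic transmit power. This is a crude far-field extrapolation (5 cm is well within the near field at 2.45 GHz), but it's fine for order of magnitude:

    import math

    def eirp_mw(density_mw_per_cm2, dist_cm=5.0):
        # Treat the measured density as if it came from an isotropic point
        # source: EIRP = S * 4 * pi * d^2.
        return density_mw_per_cm2 * 4 * math.pi * dist_cm ** 2

    print(eirp_mw(1.0))   # new-oven limit  -> ~314 mW equivalent
    print(eirp_mw(5.0))   # lifetime limit  -> ~1570 mW equivalent

So a worst-case-legal oven can look like a transmitter in the same power class as your access point, which is why proximity usually decides who wins.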


Is the interference from leaking microwaves themselves or resonant leakage from the magnetron and/or support circuitry which is desired to be as cheap as possible?

Also you can’t really see the cage: that mesh you see in the window is indeed designed to block emission, but in a cheap one you can often see a gap between the mesh and the bezel, and of course the shell is a cheaply assembled rectangle without tight corner fittings so is probably leaking a small amount here, especially at the back, where they assume a wall will catch any leaks.


You can build a "lectenna" and see where your microwave is leaking energy out.

https://www.youtube.com/watch?v=V5SMF9p-4Q0


London Underground announcements in certain stations seem to break Bluetooth really frequently. I suspect that at some point, the message is being transmitted by radio and this must be at a frequency that interferes.

Also seen on platform WiFi messed up in a similar fashion - this causes the most grief when calling over WiFi kicks in, you make a call only to find it get wrecked the moment they start announcing. Luckily in some stations you actually get a better signal over mobile, so simply turning off WiFi resolves the problem.


Somewhat related, my bluetooth headphones get some static noise when I use my induction oven. On max power ~1900w it disrupts my headphones in a ~2m range haha


My microwave is less than 2 feet from my wifi router :D (but yeah, mostly use 5GHz channels for wifi)

I have no issue listening to podcasts in the kitchen while the microwave is running. I'm using Logitech H800 headphones (modified with wires going to my hearing aids).

If I put my phone in my GE microwave, I have sound breakup issues within 2-3 feet away from the microwave. Sounds like it's better "shielded" than some others mentioned here.


Nope, this is completely normal.

Many commenters are saying microwaves are pretty narrowband.

Maybe some are, but I've done spectrum analysis on a few college-dorm-level microwaves in our office with a Wi-Spy and all 3 of these microwaves spam all of the 3 usable 2.4ghz wi-fi channels when cooking.

We see similar fun in iMac labs, when they're all (attempting to) use Bluetooth Apple Magic Keyboards at once.


I have the same interference with the microwaves at work.

There's also a small plaza in what's considered the very center of my city where I get tons of interference (sound basically keeps cutting as if I was losing connection). There's a subway station underground, and some trolley cables suspended in the air, so maybe there's some sort of power converter underground.


The microwave became a second class citizen in our kitchen when we got the toaster oven/air fryer combo unit and then was relegated to the garage on top of the fridge. We still use it for popcorn and the occasional hot cocoa and if my coffee gets cold on the weekend I'll wander out there to warm it up. It is almost unneeded.

WiFi is better as a result.


Depends on both the microwave and the bluetooth device in question. My favorite pair of headphones are rather inexpensive and my microwave is a couple decades old and very high-wattage, and I get interference if I stand within a couple feet of it. I haven't had any other microwave or any other bluetooth device experience noticeable interference.


Related question:

I used to be able to use wifi when my microwave was running. Then one day with no apparent change it started interfering, so that I can no longer use wifi if the microwave is roughly between the two devices connecting over wifi.

Of other microwaves I’ve encountered, most haven’t interfered with wifi, but one or two have.

Is this a problem?


Does the sound quality improve if either the phone or speaker are inside the microwave? We must test all possible scenarios.


I understand this is a (good) attempt at humor but this might be an interesting idea to see if Bluetooth leaks from inside the microwave oven (or from the outside in). Of course, the microwave has to stay off, naturally :)


Everyone who sees possible interference from a microwave should ask the nearest service shop to check it, as this could be a manufacturer defect or just leakage from old insulation (rust under paint), and it could be dangerous to health.

PS: and yes, my friend once found a rust hole in an old microwave, so this is not a very rare thing.


I have wondered about danger to health. The microwave interference is obvious with the airpods.


It's not good. Maybe I'm just fortunate, but I have not seen any problems from old, mature manufacturers like Panasonic. To be sure, I mean a new microwave, or one in absolutely perfect condition.


I have a Waterpik whose motor absolutely obliterates 2.4GHz. I sometimes use it while wearing Bluetooth headphones, and music immediately stops. It took me a while to figure it out (I thought it was my head leaning down and putting a metal pegboard between me and my work machine, but nope).


I once had a microwave I called "The WiFi Killer". I work from home, and did then too, and every time one of my kids, or my wife, would use the microwave, I'd get knocked offline. They do work at 2.4Ghz, and this microwave was pretty old. Replaced it and never had another problem.


I had a microwave that interfered with my 2.4 GHz WiFi signal. If the microwave ran for more than 30 seconds, any devices connected to the 2.4 GHz SSIDs would "stop working". I assume it was due to too much noise caused by the microwave. Devices connected to the 5 GHz SSIDs worked fine.


My microwave sat in front of a phone jack (we had ADSL2 for internet). Every so often our internet would just stop.

Finally realized it was the microwave pumping interference into the jack that took out the internet.

Also worth noting: if there was a bad rain storm, the internet would drop in speed.


Yes, we had an LG microwave, took it back after 3 weeks because it knocked out both wifi and Bluetooth. The store didn't want to refund. Said it had to go to warranty repair. Repair shop said they cannot decrease RF emissions, store begrudgingly gave credit.


In the early days of Wifi, it was pretty common to have your connection drop out whenever someone used the microwave. Both systems use the same frequency band.

I assume modern Wifi has gotten better at chugging through the interference (and perhaps microwave-makers better at shielding).


My microwave oven definitely interferes with anything Bluetooth, like headsets etc. Although audio still somewhat works from a laptop upstairs ~30-50 ft away.

Anyways, this is normal. Microwave oven generates a lot of 2400-2500 MHz ISM band noise. You're fine.


My Bluetooth headphones usually experience connectivity issues if I have line of sight with my running microwave. Interestingly it only happens sometimes. Usually audio will stop playing or cut off until there's a wall between me and the microwave.


My bluetooth headphones occasionally drop the connection for a moment when I'm standing next to my microwave while it's warming up food. 2.4GHz after all. Bluetooth and USB 3 don't play along that well either, btw.


I once worked for a startup that was trying to stream ads over bluetooth using microwavable foods. It _worked_, but ultimately used too much bandwidth to be practical at the resolutions advertisers wanted.


Can you explain a bit more how this worked? The food item harvested energy from the microwave, and modulated the leaked microwaves/transmitted its own signal to hijack nearby bluetooth audio devices to play audio ads?


Basically, yes.


That sounds amazing. Are there more details like what sort of ads even?


I agree, technically fascinating, but also... incredibly scummy. As if adtech didn't have enough invasive tools in their arsenal


In a related question, does anyone know why my Bluetooth earbuds frequently experience brief disruptions when crossing the street at intersections? It happens far too reliably to be a coincidence.


Yes. My work microwave is just at the end of my Bluetooth range for my Airpods Pro.

If the microwave is off, then my signal is basically fine and won't cut out.

Once I turn the microwave on, all bets are off, and it cuts in and out.


Same. AirPods Pro are the first product I've ever used that experienced microwave interference.


My first microwave absolutely wrecked 2.4 ghz wifi, replaced it after a few years and the new one didn’t. Didn’t have many BT devices back then, but bet it would jam them as well.


Would probably have been enough to change the wifi channel.


My bluetooth often dies when I'm near tram tracks and their overhead electric lines in my city. Kind of annoying given that there are tram tracks everywhere.


Didn't Wi-Fi originally use 2.4 GHz because it was free to use (without needing a license), because of the noise produced by microwaves on this frequency?


Yes, 1946 was when the FCC opened up 2450 +/- 50 Mc (later MHz) for microwave and medical diathermy use. It was later, in the 80s, that they allowed intentional communication in the band, either as DSSS or FHSS. In time we had Wi-Fi (DSSS) and Bluetooth (FHSS).


I used to have to time my gaming around when roommates would be eating. Anything in the microwave was an instant dead connection


I noticed a quirk in my microwave recently: if you apply pressure to the door handle, it'll turn on. Kinda spooky.


If that isn't a design feature, it may be time for a new microwave, friend.


I think microwave ovens are allowed to emit up to 1W of power. As mikewarot said - it's amazing WiFi/BT works at all.


My neighbour's microwave nukes my bluetooth entirely. I have moved to wired everything and life is good finally!


> Ask HN: Does your microwave interfere with Bluetooth? Mine does

As far as I know, you can file a complaint with the FCC.


No, but my reclining chair's motor interferes with the digital signal my TV receives.


Yes, sound in my headphones is distorted when I go near a working microwave oven.


It destroys the wifi of whatever phone or tablet is next to it while running.


Nope - zero interruptions. Using some cheap IKEA microwave.


My waterpik kills bluetooth to my airpods.


Yep. I've seen Bluetooth be affected by microwaves. Also seen WiFi be affected by Bluetooth. It doesn't mean your microwave is unsafe.


Sounds like a microwave problem.


Relevant XKCD: https://xkcd.com/654/


Nope.

Had that happen once; swapped the microwave out and it hasn't happened again since.


Not great, not terrible.


microwaves emit RF, heavy concept right?





