"It’s unclear how many smaller base stations would be needed for 5G service. But it’s widely believed that there would need to be exponentially more because of the limited distance the signals can travel"
Let's unpack this:
1) There is, and has been, a lot of modelling to work out whether the frequency bands are commercially viable.
2) Where on the exponential curve is it? Double? Quadruple? Don't use terms one clearly doesn't understand.
3) "Widely believed" is not a good source. Fucking talk to the scientists who are doing the modelling. Do you honestly think the entire international mobile industry is just going to guess at how many base stations are needed?
4) "more because of the limited distance the signals can travel" Do you know what this means? Propagation is limited, which means that exposure is more than likely to be less. It's simple physics that propagation in the 4-6 GHz bands is nowhere near as good as at 700 MHz; see the rough path-loss sketch after this list. (https://gsacom.com/5g-spectrum-bands/)
5) Once again: if range is limited, so is exposure.
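A minimal sketch of point 4, assuming the standard free-space path loss formula and two illustrative carriers (700 MHz low-band versus an assumed 3.5 GHz mid-band); antenna gains, building losses and terrain are ignored, so treat the numbers as order-of-magnitude only:

    # Rough free-space path loss (FSPL) comparison for point 4 above.
    # FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44 (Friis, isotropic antennas)
    import math

    def fspl_db(distance_km, freq_mhz):
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

    for freq_mhz in (700, 3500):          # low-band vs an assumed mid-band carrier
        for distance_km in (0.1, 1.0):
            print(f"{freq_mhz} MHz over {distance_km} km: "
                  f"{fspl_db(distance_km, freq_mhz):.1f} dB path loss")

At any given distance the 3.5 GHz link sees roughly 14 dB more loss than 700 MHz, which is why higher-band cells are smaller, and also why the power actually reaching a given user isn't obviously higher.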
Once again, a lack of basic science combined with clickbait has undermined valid criticism and replaced it with bone-headed stupidity.
If you are worried about 5G, why the fuck do you use Wi-Fi? Wi-Fi runs at both 2.4 GHz and 5 GHz.
> If you are worried about 5G, why the fuck do you use Wi-Fi? Wi-Fi runs at both 2.4 GHz and 5 GHz.
Assuming one chooses to use Wi-Fi in one's home, perhaps the criticism is still valid, because such a choice is just that--a choice. By contrast, one is forced to expose oneself to radio waves legally propagating in public spaces.
I'm conflicted about this, because on the one hand, people should have a choice about things that happen to them. On the other hand, if you let them, you get things like this, or anti-vaccination movements. What can one do?
Nothing is more important than medical freedom. You're welcome to vaccinate--you have that right--but vaccines have been proven to be unsafe for many. The federal government has paid $3.6 billion to compensate, and that's after they've denied most claims (those without undeniable proof it was the fault of the vaccine).
If non-ionizing low temperature radiation can be harmful to humans, does that mean we can get health issues just from standing near another human body which emits EM radiation at a far higher frequency and energy density?
Not necessarily. One possible mechanism for non-ionizing radiation to be harmful is for the RF fields to induce currents in the DNA base stack (which is a conductor), jamming the base excision repair system, which is hypothesized to scan for mutations by detecting whether current is running through the DNA.
In this case, the RF wouldn't be mutagenic, but it would increase the likelihood of cancer.
Isn't there a pretty well-established (but small) link between temperature and cancer?
There's probably a reason why all the RF cancer studies go to the effort of specifically controlling it and specifying that the effects they are interested in are athermal.
There is a well-understood risk of oesophageal cancer in people who drink really, really hot coffee and tea, but no such relationship has been shown for any other cancer.
The body also reflects terahertz radiation from the surrounding environment, even directly into other people's eyes. This has the bizarre side-effect of causing people to be able to "see" each other.
Interesting thought. Near infrared light does interact with cells (photobiomodulation), usually in a beneficial way. Maybe the wave form is another factor, or some frequency ranges are beneficial while others are not?
"some frequency ranges are beneficial while others are not"
Certainly. Heat is nice (if not too much), but x-ray is usually bad. Gamma-rays a lot worse. And cellphone radiation is probably somewhere in between ... so like the article cites: we don't know yet.
So I don't like sleeping next to my mobile, nor having it next to my lower body for too long ...
X-rays and gamma rays, and everything down to UV, are ionizing, which is why they are harmful. Radio waves are way below visible light and infrared, as was stated.
Forget body heat: why does no one care that we walk around our homes as ~60 W terahertz radiators? Then we go outside and bathe in ~1,000 watts per square metre of sunlight, which includes ionizing UV.
Cell phones put out the equivalent of a low-power LED light. I'm not sure why everyone assumes they cause harm but seems to ignore light bulbs.
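For what it's worth, here is the back-of-the-envelope behind the "tens of watts of thermal radiation" figure, via the Stefan-Boltzmann law. The skin temperature, effective radiating area and room temperature below are assumptions, so it's an order-of-magnitude sketch only:

    # Net thermal (far-infrared) radiation from a person via the Stefan-Boltzmann law.
    # All parameter values below are rough assumptions.
    SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W / (m^2 K^4)
    EMISSIVITY = 0.98  # skin is nearly a black body in the far infrared
    AREA = 1.3         # m^2 effective radiating area (clothing shields part of ~1.8 m^2)
    T_SKIN = 301.0     # K, ~28 C effective surface temperature (assumed)
    T_ROOM = 293.0     # K, ~20 C surroundings (assumed)

    net_watts = EMISSIVITY * SIGMA * AREA * (T_SKIN**4 - T_ROOM**4)
    peak_um = 2898.0 / T_SKIN              # Wien's displacement law, micrometres
    peak_thz = 3e8 / (peak_um * 1e-6) / 1e12

    print(f"Net radiated power: ~{net_watts:.0f} W")
    print(f"Emission peak: ~{peak_um:.0f} um, i.e. ~{peak_thz:.0f} THz")

Under those assumptions you get roughly 60 W peaking around 10 um (~30 THz) -- strictly far infrared, but "walking terahertz radiator" is in the right neighbourhood.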
Cell phones emit RF signals that penetrate the body. Lightbulbs for the most part do not. I've always considered the notion of antennas important. Your circulatory system may act as antennas of all different lengths and that would put the received energy right into the blood where it can cause leukemia or other similar cancers. Just speculation of course, but the proposed mechanism applies to RF and not light bulbs.
Grandparent is right, though, since most of the blue and green light is reflected or absorbed in the top skin layers. Starting with red and infrared, deep tissue penetration is possible. The higher the frequency, the lower the ability to penetrate. That's why RF can pass through bodies while UV light can't.
It's not as simple as that; X-rays also penetrate, yet they are much higher in frequency than UV.
The difference, as far as health is concerned, is that X-rays cause DNA damage, while RF will simply increase temperature.
Increase the temperature enough and you cause tissue damage, of course, but we are talking about milliwatts at the transmitter, a point source whose intensity falls off with the inverse square of distance. That is similar to the amount of heating you would get from a low-power LED held near your skin -- actually less, since, as you pointed out, RF mainly passes right through us rather than being absorbed.
This stuff should continue to be studied, but so should the effects of much higher power light bulbs just in case there is some damaging mechanism yet to be discovered beyond heating your flesh.
When was the last time you saw a lump of meat arc and spark?
Actually, Google "microwave meat sparking" and you'll find plenty of videos --- but in almost all the cases, the meat is cut in a very specific shape (with sharp edges) that makes it concentrate the RF, and of course being a microwave oven the power density is very high. A human body is, to a rough approximation, a bag of water. It will heat, mostly evenly, when exposed to RF (some parts like the eyes do not have good cooling, hence are more dangerous to irradiate, but this is still a conversation about power levels many orders of magnitude higher than a phone would produce.)
Your link takes the award for missing the point. There is a resonant frequency of liquid water in the microwave range, which is why tabletop microwaves use that range. They don't emit a single frequency, no, but I don't know anyone who claims they do...
Different fields define the EM energy spectrum differently based on use. You can either argue that "microwave" is one end of the "RF spectrum", or that "microwave" and "radio" are two separate sub-categories of EM energy. My field teaches to the latter.
In the end it is an uninteresting point however because it is an argument over what word to use, not fundamental differences in nature. The point I was making earlier which was misinterpreted was that microwave interacts with matter in a fundamentally different way than lower frequency RF, which is critically important to the topic of TFA. Microwaves excite dipoles on the atomic and bio-molecular scale. Lower frequency RF requires some sort of conductive material (or A LOT of energy) to form an antenna, on the other hand, which is why we are transparent to "RF" but not "microwave" (and hence, why some fields treat them separately).
What is your field? What textbook says that "microwave" and "radio" are different? I've been citing a lot of sources here. Just this one from you would be appreciated.
About the blood-vessels-as-antennas theory, I just want to throw in that blood vessels absorb infrared light well (well duh, blood is quite dark), so at least at that frequency energy is very much absorbed, rather than passed through.
I'm no expert on the exact regulations, but I run fccid.io and have a pretty good idea of the process manufacturers go through.
Although forgery of results would be possible, almost all testing is done by independent labs. It would be possible for the manufacturer to submit a "compliant" sample but then program the rest of the units to radiate differently.
What I see most commonly are devices being sold without any FCC testing. All wireless devices sold in the USA should have certification, but many new startups may not know the regulations around the industry (e.g. a Kickstarter smart bike lock project might not take FCC certification into account and end up selling a few thousand devices before going belly up). Testing often costs thousands of dollars, so a little-known startup may just try to ride on the fact that their Aliexpress-supplied Bluetooth chip "is FCC certified" (it's not).
From what I've seen, there is little to no actual enforcement that goes on within this licensing space (but there is also a pretty high rate of compliance).
What are we talking about, basic CE cert EMI testing or in-depth RF radio certification? Most Kickstarter projects, or companies in general, don't do their own RF design; they buy fully certified modules.
The problem right now is that the tests are not rigorous enough, so they don't even need to do that. The regulation allows them to test the radiation level when the cellphone is 15mm from the body. If the cellphone is on the body without a gap, none of the flagship phones meet the safe level of radiation when they are broadcasting to a tower far away.
Also, the radiation test only models an adult-sized body, so children, who will have a proportionally larger region of their body and head affected, are missed by the test.
WiFi transmission power is almost always controlled by software, with no hardware limits. It’s very easy to override this and transmit above the legal limit, to the point where you could potentially damage the hardware itself.
FWIW, I feel that any big violation of radiation limits would be quickly found out as the product in question would make other devices in the vicinity misbehave.
> I would be more worried about issues similar to the diesel gate scandal where manufacturers start forging radiation tests results.
They already do. Part of the reason why government-funded tests are more likely to show that cell phones cause cancer is that there have been industry-funded tests that have placed the phones several inches away from the subject's body and put them in shielded cases.
No discussion of sample sizes or P values? I'm convinced; I will refuse to stand within inches of a 5G base station for 10 minutes on, 10 minutes off, for 10 hours a day.
As a non-biologist, the power levels the rats were exposed to here seem pretty high: 0, 1.5, 3, and 6 watts absorbed per kg of body mass. Also, if I'm reading this right, survival was very slightly lower in the control groups than in any of the RF-exposed groups.
6 watts absorbed per kg of body mass would be somewhere in the range of 500 watts for a human being, so, yes, very high -- that's around 5x the amount of energy normally produced as body heat. I would think that would be enough to produce severe stress just from the thermal effects.
Quick Google searches indicate about three watts. To a first approximation, you are not concave when talking on a cell phone, so figure at most 1.5W travelling in your direction. Humans are definitely not opaque to radio, but let's suppose I am. Personally, I weigh 90kg, so for me it's something like 15mW per kg at most, for the small handful of minutes per day I spend on the cell phone.
My level of concern about this could not be lower.
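A back-of-the-envelope version of the arithmetic in this sub-thread; the transmit power, absorbed fraction and body mass below are rough assumptions rather than measured values:

    # Worst-case whole-body SAR from a handset, compared with the exposures
    # used in the rat study (1.5-6 W/kg). All inputs are rough assumptions.
    RAT_STUDY_MAX_SAR = 6.0      # W/kg, highest exposure group
    HUMAN_MASS_KG = 90.0         # the commenter's own figure

    phone_tx_watts = 3.0         # generous peak handset output (assumed)
    fraction_toward_body = 0.5   # at most half of the radiated power heads your way
    fraction_absorbed = 1.0      # worst case: assume all of that is absorbed

    whole_body_sar = phone_tx_watts * fraction_toward_body * fraction_absorbed / HUMAN_MASS_KG
    equivalent_human_power = RAT_STUDY_MAX_SAR * HUMAN_MASS_KG

    print(f"Worst-case whole-body SAR from the phone: ~{whole_body_sar * 1000:.0f} mW/kg")
    print(f"Power needed to match the rat study's top dose: ~{equivalent_human_power:.0f} W")

Tens of mW/kg for a few minutes a day versus 1.5-6 W/kg for most of the animals' lives is the relevant gap; the replies below add caveats about power control and local concentration near the handset.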
Keep in mind that if they're doing CDMA, the power is dialed down to keep it uniform at the receiver (otherwise the code XOR doesn't work), so unless you're at the limits of reception it's probably not at 3 W.
100-500 mW/kg might still be appropriate, though, if you are dissipating a significant fraction of that 1-2 W in your body. I have to imagine that most meat-power-dissipation would be pretty well concentrated near the transmitter.
Will you have a choice if you live in a condo where they have placed several of these antennas on the roof, broadcasting at much higher wattages?
Hmmm. It depends on the power, frequencies and distance to tissue. I remember a coworker at a GPS manufacturer accidentally turned on a 440 MHz radio in high-power (25 W) mode with only a whip antenna and received a nasty radio burn when he pointlessly/accidentally put his hand around it. That radio burn looked similar to a sunburn, or maybe 20 seconds in the microwave. Increased risk of cancer, for sure.
A fraction of a watt every now and then might be okay, as long as it's not bursting at multiple watts or near the resonant frequencies of purine or water.
I thought microwaves weren't high enough in the spectrum to disrupt DNA, which is the main source of cancer risk from gamma/X-rays? Individual photons don't have the energy to break bonds, even for relatively high-power transmitters; at least that's what I've been told.
I'm not a comms engineer, but I've also been told the biggest risk of microwave exposure is thermal burns, especially to the eyes, which have limited cooling ability and limited pain receptors.
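The "individual photons don't have the energy to break bonds" point is easy to sanity-check with E = h*f; the bond and ionization energies below are standard textbook values, and the band list is just illustrative:

    # Photon energy at RF/microwave frequencies versus typical bond/ionization energies.
    PLANCK_EV_S = 4.1357e-15  # Planck constant in eV*s

    def photon_ev(freq_hz):
        return PLANCK_EV_S * freq_hz

    bands = [("900 MHz cellular", 9.0e8),
             ("2.45 GHz Wi-Fi / microwave oven", 2.45e9),
             ("60 GHz mmWave", 6.0e10),
             ("UV-C at ~250 nm", 1.2e15)]

    for label, freq_hz in bands:
        print(f"{label:32s} {photon_ev(freq_hz):.1e} eV per photon")

    print("Typical C-C covalent bond: ~3.6 eV; ionization energy of water: ~12.6 eV")

Even a 60 GHz photon is roughly four orders of magnitude short of a single covalent bond, so any damage at these frequencies is thermal (bulk heating) rather than photochemical.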
Interesting - I didn't know that. But the parent was pretty obviously making the association between forms of damage which definitely increase cancer risk (sunburn) and non-ionizing radiation - which does not. The type of burn he observed is thermal.
So even if thermal damage (or just damage in general) can increase cancer risk a little...it's not remotely the same thing as sunburn or gamma radiation, and as it pertains to cellphones completely irrelevant (because a cellphone is not a 40W antenna emitter).
Any time you damage any part of your body, that tissue has to repair itself. There is always a chance (albeit small) of things not repairing themselves correctly and the DNA getting corrupted. Every time the same tissue has to repair itself, the odds of DNA corruption occurring increase. This is why cancer tends to manifest in areas of the body which are chronically damaged and repaired (such as an alcoholic's liver, or the esophagus of someone who has chronic heartburn, or the lungs of someone who smokes).
As such, it stands to reason that any type of burn (causing tissue damage) could ultimately lead to cancer.
RF burns are thermal burns, but of the weirder kind. They don't heal as nicely as regular burns.
That said, getting an RF burn off an active radio element is like getting a regular burn because you've put your hand around an incandescent lightbulb. It's your fault, and it doesn't imply lightbulbs are a health danger, as in both cases the energy received diminishes by the square of your distance from the source.
People actually cook things in microwaves? As in, they go from a raw product to a cooked product? I've only ever used, and only ever seen, microwaves being used for warming up meals, never for actual cooking.
For some reason the concept doesn't seem popular in many parts of the world, to the point that people don't even realize you could use a microwave to cook things[0]. But it can be, and people do. I even stumbled upon a cookbook once, that was entirely dedicated to cooking in microwave. Very interesting and insightful read. Microwave is a different beast than ovens and stoves, but if you appreciate its nature, it turns out you can do quite a lot of cooking with it.
--
[0] - This is a funny thought-pattern failure that's common in people, and I don't know a better name for it than "acute lack of problem-solving abilities". I mean, people should realize that cooking happens through heat, and a microwave adds heat to a product, hence it can technically be used to cook...
Well, it's certainly doable. Though for meat in particular, people prefer to use stoves, grills, and ovens, because while a microwave can heat up meat to the point where it cooks, you really need a grill or a hot pan, for example, to induce the Maillard reaction on the surface of a steak (give it a good sear), while leaving the inside medium-rare. Try doing that in the microwave...
What buttons? Pretty much every microwave I've ever seen looks like this[0] - timer + power setting. Even our kitchen at work which has some really fancy microwaves only offers time + power, nothing else. The ones full of buttons on the front are from like....American movies, not something I ever see in real life :P
Consider multiple transmitters operating at the same time, in the same room. Even if no single transmitter is dumping high output into that part of the spectrum, isn't bandwidth saturation one of the reasons for poor reception and fewer bars?
Would that not add to the effective absorbed wattage? There's probably some constructive interference producing fortuitous signal peaks, no?
Apparently, at least using some deep laser-science magic, you can make photons' energy additive with respect to the ionization process. Anyone here who knows the science behind this and could shed some light on whether or not it has any impact on day-to-day interactions with RF radiation?
The best reply to this is that we both generate a lot of radiant energy and are exposed to a lot of radiant energy in the form of sunlight, orders of magnitude stronger than the transmitters in our phones. The intensities the mechanism demands are neither likely nor possible in everyday use.
There is no physical basis for the hypothesis that non-ionizing radiation at athermal levels has any effect on humans. So, either come up with some repeatable results, or find something else to spend your research funding on.
You should talk to the James Randi Educational Foundation about that ( http://web.randi.org/ ). Depending on how their rules are structured, the ability to detect Bluetooth emissions might qualify for the $1M prize that they offer for a repeatable, controlled demonstration of paranormal effects.
They reported some results of a not-yet-published study. So if you already know everything, I hope you are open to some questions about the basic structure, and maybe the grand unified theory, of the universe?
There have been plenty of published studies with equally-alarming results.
The one thing that all of the studies have had in common is that the effects they describe mysteriously disappear as soon as someone tries to reproduce them.
Someone elsewhere in this topic found a preprint; it's another study along the lines of sticking a rat in a microwave oven and going "OMG cell phones are evil, look at these dead rats!"
For what it's worth, the definition of 5G isn't really settled yet. To some people it means a high rate multi-antenna system in the existing radio bands (in the approximate frequency range 900MHz to 2GHz). To others it means a wide-band system at a centre frequency of approximately 60GHz.
The 60GHz band was specifically chosen for short range wide-band data transmissions because it is close to the absorption peak for oxygen. This means it doesn't propagate far in air, making it a poor choice for much other than short range communications, where it's an asset if neighbouring cells don't interfere with each other. Short range in this case means a cell in every room of a house, as typical propagation distance is about 10 metres.
Given that the 60GHz band is specifically absorbed by oxygen, and the body contains oxygen, it strikes me that it's worth specifically testing the biological effects of this band, separately to existing cell phone bands.
Note that the article under consideration here tests at 900MHz and 1900MHz.
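To put rough numbers on why 60 GHz stays room-sized, here's a sketch combining free-space path loss with an assumed ~15 dB/km of oxygen absorption near the 60 GHz peak; wall and furniture losses, which also matter a lot indoors, are not modelled:

    # Free-space path loss plus an assumed oxygen-absorption term at 60 GHz,
    # compared with 900 MHz over the same distances.
    import math

    O2_60GHZ_DB_PER_KM = 15.0   # assumed value near the 60 GHz O2 absorption peak

    def fspl_db(distance_m, freq_hz):
        # 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)
        return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

    for d_m in (10, 100, 1000):
        loss_60 = fspl_db(d_m, 60e9) + O2_60GHZ_DB_PER_KM * d_m / 1000.0
        loss_09 = fspl_db(d_m, 900e6)   # oxygen absorption is negligible down here
        print(f"{d_m:4d} m: ~{loss_60:.1f} dB at 60 GHz vs ~{loss_09:.1f} dB at 900 MHz")

Geometric spreading alone puts 60 GHz about 37 dB behind 900 MHz at any distance; add the oxygen term and wall losses and you end up with cells roughly the size of a room. The flip side is that the same attenuation also limits how much of that energy ever reaches anyone more than a few metres from the transmitter.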
To continuously connect with your friends, just cover your superior areas (mainly the head) in aluminium foil and remotely reflect some interest from a long distance by showing the finger ..
Neural networks will explode like popcorn in a microwave, and you could rent out your mental computation infrastructure to produce units such as bitcoins.
It is worth the risk, in my opinion. I never hold my phone next to my ear, but there will be more general ambient radiation with more 5G cell towers and cell use. Something unfortunate is that there is really no way to opt out of cell tower radiation exposure unless you live in a remote area.
My grandfather, one of the engineers that worked with the Voyager I and II teams, died of throat cancer. From the late 90s up until his death in 2007 he was warning people about the potential harmful effects of cell phone radiation. He would only use corded phones.
His throat cancer was thought to be caused, at least in part, by his increased exposure to radiation from the instruments he worked on while working for the US government. Most of his work was classified and he took almost everything about his work to his grave. My mother told me he'd disappear for weeks at a time and he'd never talk about where he went or what he did.
If anyone knows more about the specifics of what my grandfather, William Henry Fairing, Jr., was working on, that'd be so amazing to learn about!
Keep in mind that AM radio has historically operated at very high power; e.g. https://en.wikipedia.org/wiki/WLW was broadcasting at 500 kW(!) --- but to my knowledge, there have been no reports of increased health issues in the population around those areas, despite all that RF power causing lots of other very visible effects, like turning various metal structures into ad-hoc radios.
Disappointing lack of specific information about the test setup: it mentions a variety of configurations at varying energy levels, but omits whether it was a single male/female pair in each configuration, or something else.
I have no doubt our bodies are easily disrupted by EM in ways we know about, and probably many, many more we don't, but it is insufficient to base a belief on "science said male rats don't like 5G phones", which is basically all the article states.
This article is misleading. The paper it is based on studied the effects of 900 MHz CDMA-modulated RF on rats. 5G spectrum in the U.S. runs from roughly 3100 MHz to 4200 MHz.
RE: 5G, maybe it's different in US but following quote is from IEEE:
>As far as frequency, the 5G test network used a 15 GHz frequency band, which is higher and shorter range than current 3G/4G cellular frequencies that top out at around 2.6 GHz
The original article was suggesting this substantial difference is the likely source of the problem, which would make sense if the study used the same frequency range.
Higher frequencies tend to dislike solid obstacles and need higher gain (there is a specific absorption rate that phones must not exceed, to prevent literally cooking bits of you -- something you might not necessarily feel, but which could eventually result in cancerous tissue, just like any repetitive internal trauma)...
Such an argument against its use for cell phones might have weight; on the other hand, forcing this narrative onto the paper (if it was in fact using 900 MHz) was a mistake.
"Why, for instance, did only male rats show increased tumor rates, and not females?"
Hmm, was this pre-registered? A wee bit of p-hacking? I can't find the answer in the paper.
Hmm, though I did find this interesting nugget: "At the end of the 2-year study, survival was lower in the control group of 23 males than in all groups of male rats exposed to GSM-modulated RFR." Anyone want to venture a post hoc explanation?
The results are too weak and they're searching too hard for results. That's not to say the study itself is "wrong", just that they're motivated to find an issue to get published and get media coverage, not to say "oh, it's fine".
The entire article can be summed up with the quote:
> Final results from the peer-reviewed study won't be released until at least the end of 2017.
Cellphone cancer studies are not in short supply. What we need is peer-reviewed reproducible studies.
Thermoregulation reducing the metabolic rate thanks to the additional thermal energy input, leading to slightly longer lifespans? Pure unfounded speculation, of course.
It looks like the control group just didn't survive for quite as long as this particular strain of rats usually do. Probably just a statistical fluke with the seemingly small (to a non-biologist) sample size.
From 'Conclusions':
>The survival of the control group of male rats in the current study (28%) was relatively low compared to other recent NTP studies in Hsd:Sprague Dawley® SD® (Harlan) rats (average 47%, range 24-72%).
Yes, because cigarettes, radium, leaded petrol, asbestos, etc. Most things turn out to be pretty safe, but on the slim chance they're not, the downside risk is very high.
>radium
Not well-understood physics at first; then it was understood and quickly recognized as dangerous. Nothing was done because of nonexistent regulations and corporate money.
>leaded petrol
Recognized as dangerous to begin with ("December 1922, the US Surgeon General wrote to GM regarding growing concerns that environmental lead would become a serious menace to public health."), huge negative effects in workers noticed almost immediately. Nothing was done because of nonexistent regulations and corporate money. (https://www.damninteresting.com/the-ethyl-poisoned-earth/)
>asbestos
Recognized as dangerous by medical journals as far back as the 30s (About 40 years after it came into widespread use). Nothing was done because of nonexistent regulations and corporate money.
>cigarettes
Recognized as dangerous by medical journals as far back as the 40s. Nothing was done because of nonexistent regulations and corporate money.
>Non-ionizing radiation
Not recognized as dangerous despite every conceivable study being tried multiple times. No link to cancer. Technology has been widely used for a little under a century with no signs of population-level effects. No known mechanism of action (unlike all the others). Regardless, regulations limit human exposure and require standardized testing of all consumer products.
...and probably there were people back then saying "Oh more scare stories about Radium again?!?"
Where mobile phones are concerned, I remember my old Ericsson GA628 that used to heat up my ear leaving it red after 5 minutes of conversation.
What's worth bearing in mind is that we deployed mobile phones to the public at large before we had any significant data on their long-term effects on us. Likewise 5G: we'll only really know what it does to us long term once we've gathered a few years of data, but that's not going to stop us deploying it. So yes, it won't immediately kill us, but we're deluding ourselves to say we know for sure there's no long-term harm.
We have a good idea of what 5G does. It heats our bodies by a negligible amount. There are no known mechanisms for long-term adverse effects, and there's no experimental data supporting such claims.
Radium wasn't used for over a century, all over the entire planet, with most of the world's population exposed to it for their entire lives. It's literally the worst possible comparison to non-ionizing radiation.
I think if there were significant issues with non-ionizing radiation, we would have a large sample size by now since almost everyone is exposed to it their entire life.
Also, what is significant about your phone heating up your ear? I can make my ear red with heat by wearing a toque in a warm room; that does not have anything to do with whether it causes cancer or not.
My point wasn't about comparing different types of EM radiation - of course there's no comparison.
My point was simply about the process by which society blindly adopts a new technology without fully understanding the consequences and how that might have parallels today.
But it's not a new technology; we have been using radio for over 100 years, and just because it's 5 GHz doesn't completely change the physics involved.
Sure, if this were a drastically new technology, let's say teleportation, it would be a bad idea to blindly use it everywhere without some study of the physical effects. Radio, though, is very well understood.