“We have no reason to believe 5G is safe” (scientificamerican.com)
925 points by bookofjoe 22 days ago | 586 comments

Honestly, after a glance at the links, they don't inspire confidence in me either. There are papers about very small changes in very rare types of cancer, measured by proxy in a species where it's common; papers criticizing the critics by claiming association with interested parties (which is a perfectly fine thing to claim, but what is it doing in a scientific paper?); and the only wide review I could find concludes that the literature's conclusions are very weak and not sufficient to claim any danger.

As usual, it's not the scientists who are wackos; it's the press claiming things completely different from what they say. There are proposals for better test equipment that should be adopted, but I don't see any other call for change there.

> At the same time, everybody seems to accept that the chance of getting cancer in your lifetime has risen to about one in three for men and one in five for women.

The wildest claim anywhere in the linked papers is a ~10% increase in the rate of one of the rarest types of cancer, so this line of research won't give you the answers you are looking for.

The thing that concerns me about millimeter wave tech and 5G is that it seems like a solution looking for a problem. My city just had a wave of poles dropped in a few neighborhoods, and they seem to me to be an expensive boondoggle. IMO, we would be better served by the heavy hand of the FCC allowing fiber providers to run telephone poles and trenches without any accountability (as the wireless carriers can with 5G).

These radio bands have been in military use for a long time. I’m surprised that no health studies have been done or released to the public.

It's not a solution looking for a problem. The problem is that the existing infrastructure does not support network slicing. Without network slicing the telcos are severely limited in their pricing models. "5G" is just them trying to convince consumers that this is for their benefit.
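For the curious: "network slicing" roughly means running several virtual networks with different QoS guarantees on the same physical infrastructure, which is what enables per-slice pricing. A toy sketch (the slice names follow the standard eMBB/URLLC/mMTC categories, but all numbers and prices here are invented for illustration, not from any real telco):

```python
# Toy model of network slicing: each slice is a virtual network with its own
# QoS guarantees and its own (hypothetical) price. Nothing here is a real API.
from dataclasses import dataclass

@dataclass
class Slice:
    name: str
    max_latency_ms: float      # latency the slice guarantees
    min_bandwidth_mbps: float  # throughput floor
    price_per_gb: float        # made-up per-slice pricing

SLICES = [
    Slice("embb",  max_latency_ms=50.0,   min_bandwidth_mbps=100.0, price_per_gb=0.01),
    Slice("urllc", max_latency_ms=1.0,    min_bandwidth_mbps=10.0,  price_per_gb=0.50),
    Slice("mmtc",  max_latency_ms=1000.0, min_bandwidth_mbps=0.1,   price_per_gb=0.05),
]

def pick_slice(latency_needed_ms: float) -> Slice:
    """Cheapest slice that still meets the latency requirement."""
    ok = [s for s in SLICES if s.max_latency_ms <= latency_needed_ms]
    return min(ok, key=lambda s: s.price_per_gb)
```

The pricing angle is exactly this: a robot-control customer needing 5 ms latency lands on the expensive low-latency slice, while bulk video lands on the cheap one, even though both ride the same towers.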

Thanks! Just reading:


"The "one-size-fits-all" network paradigm employed in the past mobile networks (2G, 3G and 4G) is no longer suited to efficiently address a market model composed by very different applications like machine-type communication, ultra reliable low latency communication and enhanced mobile broadband content delivery.[2][11]"

"to address a market model" in the sentence seems to support your claim.

True! Right now telcos loathe being locked in as fixed-price dumb bit-pipes for the content providers, the Netflixes, Facebooks, and Amazons, who are making all the money.

Thus: 5G.

It's competition and regulation, not technology, that prevents telcos from charging as much as they'd like. How will network slicing change that?

How will 5g fix this for them?

>"The problem is that the existing infrastructure does not support network slicing"

What does "network slicing" mean here? I'm not familiar with this term.

I’m sure it solves the problem of making more money for the carriers and letting Verizon, et al, bust the CWA.

It (it being the millimeter wave 5G) doesn’t solve any problem that a customer has.

"These radio bands have been in military use for a long time. I’m surprised that no health studies have been done or released to the public."

I wouldn't be. To a good first-order approximation, the military doesn't give a hoot about long term health consequences and wouldn't release any such studies, if any were made, unless they indicated a requirement for the military usage to change.

>allowing fiber providers to run telephone poles and trenches without any accountability (as the wireless carriers can with 5G)

That would definitely invert the value calculations. I work in the industry and there's no question that the savings from 5G are regulatory in nature. Ironic that it's the less regulated tech that has the more questionable health effects.

This comment is utterly false. Just after a brief glance at the papers, I found a study showing a 2x increase in cancer with a sample size of over 1,000,000 humans.

Please give a link; I want to read it without wading through the ones that aren't that interesting.

It's a single paper by a single author (itself a proxy measure of low quality) about a single population (military), based on indirect (not individually measured) exposures. I don't think you can really conclude anything at all from that paper.

Having read it, the paper itself is much more balanced than partisans (in either direction) would like to make it sound. It ends with (emphasis mine):

"The main results obtained in the present study were a doubled incidence of all neoplasms with a threefold increase of cancers of the alimentary tract and a sixfold increase of malignancies of the haemopoietic system and lymphatic organs in 20- to 59-year-old career military servicemen exposed occupationally to pulse-modulated 150- to 3500- MHz RF/MW radiation. However, this does not prove a causal link between development of neoplastic diseases and direct interaction of EM fields, since retrospective analysis cannot provide convincing evidence for such links. Nevertheless, the high incidence of certain forms of neoplasms in personnel exposed to pulse-modulated RF/MW radiation clearly shows a need for urgent identification of causal factors present in the occupational environment."

Two other quotes from the paper showing author's awareness of what can and what can't be claimed as proven:

"The highest difference in morbidity rate between RF/MW-exposed and non-exposed personnel was found for malignancies of the haemopoietic system and lymphatic organs (Table 2) with the odds ratio exceeding 6 and the incidence of above 40 new cases per 100000 of exposed subjects annually.

Neoplasms of the haemopoietic system and lymphatic organs are among the malignancies that are to a considerable degree related to multiple environmental and occupational factors, including ionising radiation, organic solvents, some synthetic stains, resins, higher alcohols and numerous other substances [l]. Therefore, many industrial occupations, including e.g. aluminium production, petroleum refining, painting, mining, driving and car servicing, are considered to increase the risk of development of leukaemias and lymphomas. Electric and electronic industry workers have also considerable possibilities for exposure to potential leukaemiogenic factors and substances during their routine or additional duties. This may strongly influence and bias the morbidity rates of haemopoietic and lymphatic malignancies occurring in these populations and their relation to EM fields."

Also some technical details are provided:

"Although assessment of the individual exposure levels (‘dose’) was not possible, it is known from measurement of field power density at working posts that about 80% of the investigated personnel were exposed to RF/MW fields of 0.1-2 W/cm2 and 15% to mean power densities of 2-6 W/m2"

and earlier, showing that the cm2 above is a typo:

"Evaluation of the exposure intensities revealed that at 80-85% of posts, the fields (mostly pulse-modulated RF/MWs at 150-3500 MHz) do not exceed 2 W/m2 (0.2 mW/cm2), while the others have intensities 2-6 W/m2"
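The typo claim is easy to verify with a quick unit check: 1 m² = 10,000 cm², so 2 W/m² works out to exactly the 0.2 mW/cm² given in this second quote, while "0.1-2 W/cm2" in the first quote would be off by four orders of magnitude:

```python
# Unit sanity check: 1 m^2 = 10,000 cm^2, so dividing a power density in
# W/m^2 by 10,000 gives W/cm^2; multiplying by 1,000 converts W to mW.

def w_per_m2_to_mw_per_cm2(w_per_m2: float) -> float:
    w_per_cm2 = w_per_m2 / 10_000   # 1 m^2 = 10^4 cm^2
    return w_per_cm2 * 1_000        # W -> mW

# 2 W/m^2 -> 0.2 mW/cm^2, matching the paper's own parenthetical
print(w_per_m2_to_mw_per_cm2(2.0))
```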

It seems that those were people being exposed to radar beams during their work hours (or maybe even outside of them). As a first step, we should compare that amount of exposure to what city dwellers get from cell phones, to know how many orders of magnitude apart they are.

TL;DR: the correlation is high-flux EM (i.e. radar) exposure being linked with significantly higher rates of cancer. The cause may be something environmental that is also correlated with EM exposure, such as industrial solvents (like those typically used in the manufacture and maintenance of electrical systems like radar), which we already know to be carcinogens.

The post you replied to is already a summary and a much better one than yours.

See this review (which includes your study) for a counterpoint: https://ehp.niehs.nih.gov/doi/full/10.1289/ehp.7306

This is not at all a counterpoint. This review basically says "We need more data" and says the reason not to draw conclusions yet is that we don't even know how much exposure we experience in daily life.

They cite some issues with the study. One of them is that the confidence intervals don't seem to add up and the incidence numbers don't look like the general populace for the control group.

I skimmed the article (it's quite long) and found the criticisms weak.

> No data published; for Szmigielski (1996) it is implied that there were two to three brain tumors in the exposed group, in which case we imply that the 95% CI for brain tumor is incorrect.

> Several of these studies did not follow workers after they left the job of interest (Garland et al. 1990; Grayson 1996; Szmigielski 1996), with the potential for bias if individuals left employment because of health problems that later turned out to be due to cancer; this might especially be a problem for some types of brain tumor, which can be present for long periods before diagnosis.

The above two criticisms are only regarding the brain tumor incidence data, which is not the most significant finding in the paper.

> “Expected” rates in Szmigielski (1996) paper appear to be incorrect, according to the Royal Society of Canada (1999).

Link is broken, and why Canadian cancer incidence rates are assumed to be equal to Polish ones is not explained.

> Significant excesses were reported for several cancer sites not seen in other studies, and for cancer overall, suggesting possible bias.

Not sure why reporting what you found suggests that you're biased, but okay.

Plus correlation is not causation.

Correlation is not an excuse to completely ignore findings either. As the original study said: more data/analysis is needed.

Agreed. The "correlation != causation" mantra does not match the context here.

It's true that correlation studies are not as conclusive as laboratory experiments in general, but when we're talking about negative long-term health effects on humans, you can't establish causation without doing something unethical. But does that mean science becomes useless? I don't think so.

There are times when it's appropriate to infer causation from a correlation and act on that conclusion. For example, the famous lead toxicity studies are correlation studies. One such study, cited over 1000 times, [0] meets only 6 out of 9 criteria for inferring causation [1], and people are comfortable making that inference.

A good next step would be to go through the PowerWatch list of studies [2] and evaluate these studies based on these criteria (or a similar list).


[0] https://www.nejm.org/doi/full/10.1056/nejm199001113220203 [1] https://journals.sagepub.com/doi/pdf/10.1177/003591576505800... [2] https://drive.google.com/file/d/19CbWmdGTnnW1iZ9pxlxq1ssAdYl...

Never implied that. The study of radar operators in the military specifically states that the cancer could have been caused by anything, calling out industrial solvents often used when servicing radar equipment. But the “5G is cancer” people use the study to support their claim that millimetre waves cause cancer.

The radar operators surely weren’t washing the equipment every day with “industrial solvents”. Nowhere is it said that there’s any basis for the operators being consistently exposed to them. The study also doesn’t claim what you claim; it is written much more carefully.

Where did I claim that servicing equipment means washing the equipment with industrial solvents?

Why do you feel the need to “scare quote” industrial solvents which is a common term used to refer to toxic and carcinogenic chemicals used in industrial processes?

Agree - there shouldn't be any place for ad hominem attacks in a scientific paper. To me, this is a strong indicator of bias and/or journalistic incompetence.

Maybe it's initially counter-intuitive, but a rise in the rate of cancer should be expected as lifespans increase and deaths from other diseases decline. E.g. every male will get prostate cancer if he lives long enough.

I'm amazed at how many people here actually decline to click on the links in the article, which would guide one to a large list of scientific publications, with links to the original publications themselves.

Yet they are very ready to call "more than 240 scientists who have published peer-reviewed research on the biologic and health effects of nonionizing electromagnetic fields" "wackos" or "cranks with a PhD", to call their research "bullshit" or "impossible", to call the people "truthers", or to claim "Russian troll farms" are behind this story.

I don't think I've ever seen so many non-scientific HN comments on a science article.

At the same time, everybody seems to accept that the chance of getting cancer in your lifetime has risen to about one in three for men and one in five for women. And nobody knows why. However everybody who points to a possible answer is shot down without much investigation. Sad, really.

> At the same time, everybody seems to accept that the chance of getting cancer in your lifetime has risen to about one in three for men and one in five for women. And nobody knows why. However everybody who points to a possible answer is shot down without much investigation. Sad, really.

Plentiful nutrition which has led to 90% of the population of the USA being overweight, obese, or overfat is another potential culprit. Women suffering from anorexia developed fewer tumors. Similar experiments on lab animals in a controlled setting have the same result. https://www.ncbi.nlm.nih.gov/pubmed/11246846 I believe there are other changing risk factors in behavior as well, for instance women who have never had a child are more likely to develop breast cancer and fertility rates have dropped dramatically over the last 100 years.

Cell phones have only been widely deployed in the last two decades, and I don’t think those trends in cancer rates you’re referencing correlate very well to cellular deployment.

I’d still be curious to see more research happening in the field of millimeter waves, personally I don’t see this technology as very useful right now either compared to traditional cellular due to its lack of penetration.

It's anecdotal, but a friend of mine with cancer switched to a keto diet and he had one tumor shrink, to the point where his doctor said they could (and should) now operate on it.

The large increase in sugar and starch in our diet does contribute to fast growing cancer cells and there are studies that link being overweight/obesity with cancer.

We weren't this fat decades ago. Some may blame more office work, but the biggest factor is the amount of sugar/carbs in our diets. It has grown tremendously, and yet no one seems to take it seriously, discrediting things like Atkins/keto as "fad diets" when they are in fact closer to what American and Western European diets looked like for several decades.

I know a few people who have lost a lot of weight on Atkins/keto. I think they are successful diets because they provide bright-line rules for people to follow, but they are also extreme and potentially dangerous. None of them could maintain those diets long-term because they developed things like kidney pain. Caloric restriction, eating a good balance of fat/carbs/protein, laying off saturated fats, sugar, and booze, will fix 95% of people's dietary problems, but for people with weight/eating issues the middle ground is often just too difficult a path to tread.

> kidney pain

I've had trouble finding good studies on this. I don't think it's true. I've been on pretty hard keto for over two years at a time. I know other friends who have and have never heard of kidney trouble.

You occasionally have a day or two where you eat out with friends or have some fried chicken every month, and initially there is a period where you feel sick for a week as your body withdraws from sugar, but other than that I've never had kidney issues and my blood work has always come back fine.

With Atkins you do start reintroducing some carbs eventually, but you still keep them under a limit, and go back down if you start getting unhealthy.

Anecdotally: my father did have kidney issues when he went Keto, but that's because he had too high a proportion of his calories from protein. He switched to more leaves and fats and that seems to have cleared up.

I don't think you can say that they are dangerous any more than you can say that, for example, vegan diets are extreme and dangerous. They CAN be dangerous, but there are many who are successfully eating that way, and it is even a treatment for epileptic seizures for some.


My (very limited) understanding is that tumours need glucose to survive and grow, so presumably a keto diet or fasting could potentially stop a tumour from growing.

> Plentiful nutrition which has led to 90% of the population of the USA being overweight, obese, or overfat is another potential culprit.

Yeah I’m gonna need some data to believe that one. 90% obese / overweight? I’m calling bullshit.


Edit: I should have said adults in the parent post though, this statistic doesn’t apply to kids.

Looks like it's closer to 70% [0], for adults at least.

[0] https://www.cdc.gov/nchs/fastats/obesity-overweight.htm

Yes, it's more like 35-40%.

I'm going off this page: https://www.cdc.gov/nchs/data/databriefs/db288.pdf

Please note the table you showed the 73% is combination of "overweight" and "obese" while the row that shows obese is the same number as my citation.

I don't understand why people are downvoting me when I'm literally quoting the stats from the agency that defines this. If you're downvoting, can you at least reply to provide some technical signal so I can update my knowledge?

EDIT: I misread the original message and thought it was referring to just obesity, not obesity/overweight.

The original post included 'overweight'

The article says, "more than 500 studies, have found harmful biologic or health effects from exposure to RFR at intensities too low to cause significant heating." The link to the 500 articles is https://drive.google.com/file/d/19CbWmdGTnnW1iZ9pxlxq1ssAdYl....

I checked the "conclusions" of the first two: https://www.ncbi.nlm.nih.gov/pubmed/29996112 and https://www.ncbi.nlm.nih.gov/pubmed/29709736

Conclusion of 1) Despite the improved exposure assessment approach used in this study, no clear associations were identified. However, the results obtained for recent exposure to RF electric and magnetic fields are suggestive of a potential role in brain tumor promotion/progression and should be further investigated.

Conclusion of 2) Ever use of wireless phones was not significantly associated with risk of adult glioma, but there could be increased risk in long-term users.

They both read as, "we found no significant effect".

Careful, you’re misinterpreting the document. The two studies you’ve picked are listed as “neither evidence of an effect or a null finding”, they are not included in the “more than 500 studies” finding harmful effects. Those studies are prefixed by “P” in the document and, although I haven’t counted, a rough estimate based on the number of pages makes it plausible that there are > 500.

That said, given the scientific consensus from rigorous meta-analyses, I expect that these “positive” studies are mostly of low quality and/or limited sample size. And scepticism is generally warranted when advocates start listing large numbers of studies instead of referring to a few meta-studies. As it happens, the best available meta-studies come to the opposite conclusion (namely, that there’s probably no harm from mobile EMF), so this long list is essentially bogus.

Good point. Thanks for the patient explanation.

To partly redeem myself, I followed the links to the first two articles prefixed by "P". They were https://www.ncbi.nlm.nih.gov/pubmed/29725476 and https://www.ncbi.nlm.nih.gov/pubmed/29268055

The first didn't have a conclusion and the results were just the measured RF power in an apartment.

The conclusion of the second is, "A total of 900-MHz EMF applied in middle and late adolescence may cause changes in the morphology and biochemistry of the rat ovarium.", which makes no sense. "A total of 900-MHz EMF" is gibberish. The complete text is not freely available.

The codes used are:

>> Study Effect Codes:

>> P This study reported effects from the exposure or radiation category (effects can be either positive or negative and may be primary or secondary outcomes)

>> N This study reported no effects from the exposure or radiation category

>> - This study offered important insights or findings but is neither evidence of an effect or a null finding

I'll be honest, this ordering seems dubious to me. P (for positive?) can show either positive or negative effects. N (for negative) shows no effects. - (for neither?) shows neither evidence of an effect or a null finding. I would think that P needs to be subdivided better to show which papers show a positive or negative effect.

Can someone do a count on the number of each category?

At the same time, everybody seems to accept that the chance of getting cancer in your lifetime has risen to about one in three for men and one in five for women. And nobody knows why.

Because life expectancy has also risen; people who used to be dying of other things are now living long enough that cancer is more common.

Unfortunately, the hunch you just spent 30 seconds thinking about is incorrect. The increase in life expectancy doesn't account for the increased incidence of cancers. There are other factors at play which need to be investigated (some that we know: obesity, pollution, cigarettes).


"In other words, the individual incidence of cancer deaths has actually fallen."

You are looking at cancer deaths, which indeed have gone down from better treatments and better screening. This does not imply less incidence of cancer.

I think you just answered your own question. As detection improves, so will the rates at which cancer is detected.

AFAIK, better detection means earlier detection, and may not lead to a decrease in cancers that are never detected.

It leads to an increase in cancers detected before the person dies of non-cancer causes.

Situation: person has an almost undetectable cancer. They see a doctor, no cancer detected, later that week they are shot by police at a routine road stop.

We get in our time machine, go back a week and a bit, and supply the doctor with a better detection kit.

Situation 2: person has an almost undetectable cancer. They see a doctor who refers them to a specialist, cancer detected, another notch on the cancer tally board. Later that week they are shot by police at a routine road stop.

Nothing has changed except in the second case there’s another cancer detected. The person is still dead from non-cancer causes, just in one scenario they died as a haver-of-cancer and in the other they didn’t.

Better/earlier detection will necessarily lead to a decrease in cancers that are never detected (which I interpret as an increase in cancers detected before mortality from other causes), otherwise it’s not better/earlier detection.

Agreed, but there are plenty of factors with a better-explained causal mechanism than non-ionizing radiation. Like the three you listed.

My money is on diet and pesticides/preservatives/etc. being another big one. With this, too, there is little official evidence that, say, Roundup, causes cancer, but there is, imo, a stronger lobby against a positive outcome in those studies, and they don't necessarily control for interactions such as Roundup combined with the surfactant that it is typically mixed with that increases cell penetration.

I agree with your comment in principle but the specific example of Roundup is rather weak. Yes, glyphosate is usually studied in isolation rather than as a commercial formulation including surfactants etc., but we have decades’ worth of epidemiological studies showing that proper use of Roundup has, at most, a marginal effect on tumour incidence. Conversely, the most-discussed studies that purport to show Roundup’s carcinogenicity have well-known, glaring methodological flaws (including some that formed the basis of the IARC report). The case of Roundup is made more complicated by the fact that Monsanto/Bayer has been caught red-handed skewing the publication record without disclosure of conflict of interest, and lobbying scientific journals. But the same is true for the opposition: for instance, the now-retracted 2012 Séralini study also failed to disclose the authors’ conflict of interest. And beyond improper publication practices (which, yes, is serious), there’s no evidence that Monsanto/Bayer actually falsified information.

In sum, I’d rank the risk of Roundup being carcinogenic on roughly the same level as that of 5G: possible but unlikely, given the best available evidence.

Ok, fair enough. I have enough other reasons to be against roundup without needing to cling to believing it's carcinogenic. Thanks!

For the record, the other reasons are to do with the larger ecological impact of roundup-based practices, such as harm to soil fungi and bacteria, and collateral damage from runoff or wind. Plus, there were some studies finding it may cause harm to intestinal lining and such, even if it's not actually a carcinogen.

Agroindustrial farming practices have led to most of our produce in stores becoming more caloric but less nutritionally dense. Interesting idea that perhaps our food plants are becoming more "obese-yet-malnourished" and this change in food plants could be at the root of a number of health risks.

I would add sleep deprivation to your list.

We live in an increasingly sleepless society and lack of sleep is effectively a carcinogen.

I checked this for you, and what appears to be the best meta study I could find shows that the only correlation found was that long sleep duration increases the risk of one type of cancer:

The present meta-analysis suggested that neither short nor long sleep duration was significantly associated with risk of cancer, although long sleep duration increased risk of colorectal cancer.


"After just one night of only four or five hours’ sleep," Walker tells The Guardian, "your natural killer cells—the ones that attack the cancer cells that appear in your body every day—drop by 70%." Sleep deprivation has such serious outcomes that "the World Health Organisation has classed any form of night-time shift work as a probable carcinogen."

^Matthew Walker, presumably the 70% drop is from work at his Berkeley lab

Cigarette smoking has fallen a LOT in the US in the past few decades, though there does seem to be a resurgence lately with "vaping".

Pollution is lower too (again, in the US): cars used to pollute a LOT more. Smog used to be far, far worse in the LA area decades ago, so even with more people and more cars, pollution is lower, particularly localized pollution that affects people more. Of course, global warming pollution is certainly higher, but that isn't localized and shouldn't have any effect on you (it's just CO2).

Obesity is significantly worse, however.

That's not true, cancer rates have risen also in children and young adults.

A significant chunk of the rise in life expectancy is driven by a fall in infant mortality, is it not?

Well, if you could point me to a good scientific study that makes a link between higher life expectancy and cancer, I'd be inclined to believe it. Until then, I think the cancer incidence rise is shocking, and can't be explained by rise in life expectancy alone.


"In other words, the individual incidence of cancer deaths has actually fallen."

cancer incidence rising strongly correlates with our ability to detect as well as the push to look for it.

I also believe many of the cancers we successfully treat would be non-issues if left alone (all the young women who found lumps and became breast cancer survivors).

Before we go any further, are you a scientist or a doctor? You're arguing on a thread where there are a bunch of scientists who have fairly deep knowledge in this area (it's still an area that a lot of people who come from external areas, like physics, struggle with frequently).

Cancer rates are strongly driven by age- cancer incidence increases exponentially as people age (stochastically, of course).
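The age dependence can be illustrated with a toy Gompertz-style hazard (the parameters here are invented for illustration, not fitted to any cancer registry): holding age-specific rates perfectly constant, lifetime risk still climbs steeply as people live longer.

```python
import math

def lifetime_risk(life_expectancy_years: int, a: float = 1e-5, b: float = 0.085) -> float:
    """Cumulative probability of a cancer diagnosis by a given age,
    assuming an age-specific hazard of a*exp(b*age) (Gompertz-style).
    a and b are illustrative values, not real epidemiological estimates."""
    cumulative_hazard = sum(a * math.exp(b * age) for age in range(life_expectancy_years))
    return 1.0 - math.exp(-cumulative_hazard)

# Same age-specific rates; longer life -> much higher lifetime risk
for le in (60, 70, 85):
    print(le, round(lifetime_risk(le), 3))
```

The point is only qualitative: with an exponentially rising hazard, adding fifteen years of life expectancy multiplies lifetime risk severalfold, with no change in any underlying cause.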

you're gonna die, bucko! of A,B,..Z., sooner or later. Make it past Y,and Z's gonna get ya. Change my mind!

where does this increase in cancer probability statistic come from? anyone have a link to a paper? tried googling but found nothing reliable, sadly


"In other words, the individual incidence of cancer deaths has actually fallen."

Don’t double post. But also, cancer death can go down while cancer diagnosis can go up.

A big confounder in studies based on diagnosis numbers is that you can confuse better diagnosis with higher incidence, and earlier diagnosis with longer survival. No room for confusion with Death though, it's very cut and dried.

Suppose that today the average person tells their doctor about a symptom of Example Cancer (which is incurable) six months before it kills them. 1 million people per year in Standard Country die of Example Cancer, with an average of six months between diagnosis and death.

Now, let's imagine I invent a machine, it can scan seemingly healthy people and tell them if they've got Example Cancer on average 12 months earlier. Nothing changed in terms of whether people get Example Cancer, it's still incurable, but now we've improved time between Diagnosis and Death by 200% but even scarier the incidence of Example Cancer, the number of people who know they have it, has also increased by 200%. It's an epidemic!

That machine is pretty unrealistic. A more realistic machine also gives false positives for Example Cancer. Now the number of people living with Example Cancer has increased by 500% but good news, most of those people don't die of it, because they had what medics would call "Sub-clinical incidence" meaning, sure, you had the disease but it didn't actually affect your life so who cares?
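The Example Cancer scenario is easy to put in code. A toy simulation (all numbers invented) where screening only moves diagnosis earlier and never changes who dies of the cancer: measured incidence and post-diagnosis survival both jump, while cancer deaths stay exactly the same.

```python
import random

def simulate(lead_time_years: float, n: int = 100_000, seed: int = 0):
    """Toy lead-time-bias model. Everyone who gets Example Cancer dies of it
    at a fixed interval after onset (it's incurable); screening only moves
    the diagnosis earlier; other-cause death can pre-empt the diagnosis.
    All parameters are invented for illustration."""
    rng = random.Random(seed)
    diagnosed = deaths = survival_total = 0.0
    for _ in range(n):
        onset = rng.uniform(40, 80)          # cancer appears
        symptomatic = onset + 5.0            # would be noticed without screening
        cancer_death = onset + 5.5           # incurable: fixed time to death
        other_death = rng.uniform(40, 90)    # independent other-cause mortality
        dx = max(symptomatic - lead_time_years, onset)  # screening moves dx earlier
        if other_death < dx:
            continue                         # died undiagnosed of something else
        diagnosed += 1
        if cancer_death < other_death:
            deaths += 1
        survival_total += min(cancer_death, other_death) - dx
    return diagnosed, deaths, survival_total / diagnosed

no_screen = simulate(0.0)
screen = simulate(3.0)
# screen: more diagnoses and longer mean "survival after diagnosis",
# yet the cancer death count is identical to no_screen.
```

Same people, same deaths; the only thing screening changed is how many of them were counted, and for how long they were counted, as cancer patients.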

A big confounding factor in counting incidence based on cancer deaths would be improved treatment. Whether by medical advances or simple earlier diagnosis having better cure rates.

In fact, that's why we're interested in diagnosing it in the first place.

cheers, that seems properly sourced

Sugar and seed oils.

The article's writer is a known "truther" in the field. But I like his view; I believe we need opinions from his end of the spectrum. I've also read, a few times, comments such as this:

>"Academia: Where Crazy People Can't Get Fired - Dr. Moskowitz disgraces the University of California-Berkeley in precisely the same way Dr. Oz and Mark Bittman disgrace Columbia University: They are charlatans who wrap themselves in the prestige of academia to peddle foolishness to anxious parents."


To be honest, I'm somewhat surprised (in a good way!!) that Moskowitz got published in Scientific American at all.

Anyway, I have my fair share of worries about high-density mmwave deployments, mostly focused on other things: not their effects on us, but on microbial and bacterial life. That's not the focus of this article, so I won't derail, but at least for me Moskowitz isn't the zero-sum figure he may be to some in the field.

As a biochemist, I just want to chime in that I think part of the reason for an apparent rise in cancer rates is that we have eliminated many other causes of death already - we have to die of something, and cancer is not "curable" in the sense of other disorders.

After looking at the list of publications purported to be evidence, I have to agree with the other comments here casting doubt. Most, if not all, of these publications are in no-name journals with few citations. I found one paper where the author listed a gmail email address (are they unaffiliated with any institution?)

> I think part of the reason for an apparent rise in cancer rates is that we have eliminated many other causes of death already

Where did you read "an apparent rise in cancer rates"? In this article, or in one of the articles it references? Which one?

I am not a biochemist, but I would assume academics are referring to incidence rates, not causes of deaths... If you did actually witness such a confusion in the papers, it's important to point it out, but if you didn't, it would be equivalent to a physicist suspecting a colleague of confusing mu (the reduced mass of a binary system) with mu (a muon)... rather implausible, if you ask me...

Every academic discipline expects its practitioners to be at least proficient at disambiguating words from context, so when one refers to a "cancer rate" in the context of causation, it is understood to refer to the incidence rate, i.e. the transition probability per surviving individual per unit time. This is independent of deaths by other causes.

I was directly referencing the parent comment:

> At the same time, everybody seems to accept that the chance of getting cancer in your lifetime has risen to about one in three for men and one in five for women.

Which suggests that there is an apparent increase in cancer rates. So yes, I was referring to an incidence rate. Regardless, the semantics here don't change the meaning of my statement. We have solved (for lack of a better word) much of the lower-hanging fruit among human disorders. As such, we can't effectively control for incidence rates over time.

I'm not disagreeing with you, but I have worked with world-class scientists where the author only had a gmail address (until we gave them an appointment at Berkeley).

Of course, a gmail address in and of itself is not an indicator of quality, however on a single author paper it certainly makes me wary.

Just out of curiosity, what field was that in? I could certainly see some scientific disciplines having unaffiliated world-class scientists, but others it would be virtually impossible to do high quality research outside of a lab

bioinformatics/computational biology. Specifically, multiple sequence alignment and HMMs for protein recognition. The author was previously a physicist (PhD) who left physics due to the low number of jobs in the 70s-80s, founded a database company, sold it to Intel, and then visited Berkeley and saw a cool talk and volunteered.

There were already a few codes in the area and plenty of papers, and he was mathematically inclined, so it didn't take long for him to become an expert. Once he was an expert, he pointed out major problems in existing codes (both functional and performance).

This is an area where you're working with fairly straightforward data and math (linear strings from a chosen alphabet, probabilistic model is well-established). You don't need to understand the underlying biology in detail to contribute.

Thanks for the answer - I figured it would be some bioinformatics/mathematics/statistics leaning field - you can definitely do more as a lone wolf there than say, biology.

I think this take is a little unfair. Sure, calling scientists “wackos” without reviewing the publications is extreme, but we also know there is a massive body of scientific evidence looking at the link between RF waves and cancer that has pretty much found zilch at this point.

If you'd read some of the research linked in the article, you'd perhaps have found this one: https://www.ncbi.nlm.nih.gov/pubmed/25749340 which clearly shows a link.

Here's a longer list for you: https://www.sciencedirect.com/science/article/pii/S138357421... though a lot of those studies are about DNA damage and not cancer per se. Though DNA damage is known to increase cancer risk.

So no, not zilch, not at all.

> clearly shows a link

Clearly? The actual article (here's a full text: http://www.fraw.org.uk/data/esmog/lerchl_2015.pdf) doesn't seem to show such a clear link. First, there's no dose-response effect: mice with higher doses didn't present higher rates of cancer (sometimes lower). Also, Table 1 with the actual findings is just crazy. For example, the group with no radiation has the second-highest incidence of lung carcinoma. The rate of lymphoma is the same at 0 W/kg and 2 W/kg, but doubles at 0.4 W/kg.

I'm not going to say that the article is trash because it doesn't seem to be, but it is definitely not a clear link, there is a lot unanswered there. There is no mechanism proposed, there are a lot of carcinomas studied (high probability of finding something with a correlation) and there is no dose-response effect.

A finding from your link.

"the biological activity of any specific type of EMF is inversely proportional to its frequency and proportional to its intensity"

The best explanation for why more people get cancer is that fewer people are dying of other diseases. Also, more people are surviving cancer. Note: I'm a scientist, and one of the reasons I'm not clicking on links in the article is that I've already done my own looking into IARC, and I've concluded they are non-scientific.

Note: I'm a biophysicist who has studied cancer and RF at the graduate and postdoctoral levels. I can read the literature, and also make reasoned efforts at evaluating whether it provides any useful information that should affect the rollout of 5G from a health perspective. I am unable to find any reliable evidence indicating that this rollout will actually have "crisis" levels of health impact.

Now, on to the next step: I completely support high-quality research done by high-quality scientists on non-ionizing radiation. I would, like many other scientists, like to see convincing evidence about the nature of any damage that could be done by 5G. So far, nearly everything has been indirect in a way that does not inspire enough confidence to propose policy changes.

Doesn't help that the article is written in a kind of exaggerated, fear mongering way. E.g.: " 5G also employs new technologies (e.g., active antennas capable of beam-forming; phased arrays; massive inputs and outputs, known as MIMO) "

MIMO is multiple inputs, multiple outputs, not massive. If the author is bending terminology to enhance his case, that makes the case look weaker...

Yeah. If an average person thinks MIMO stands for Massive Inputs and Outputs that's not a bad mistake. If someone purporting to speak with authority on the subject of RF radiation thinks that, it's grounds to suspect them of being a blithering fool.

> more than 240 scientists who have published peer-reviewed research on the biologic and health effects of nonionizing electromagnetic fields

Those 240 scientists have not published peer-reviewed research on the biologic and health effects of nonionizing electromagnetic fields.

They are mostly experts in other fields.

Don't fight a knee-jerk reaction with one of your own.

> Cancer rates have risen to 1 in 5 in women and 1 in 3 in men, and nobody knows why

'Nobody knows why', seriously? Did we not establish that the likelihood of cancer increases with lifespan? When you only get to live to 50, of course you don't die of cancer or heart disease nearly as often. Your risk of cancer increases with each year lived past a certain age, and any increase in life expectancy will result in more people eventually dying of cancer.

To be fair, there’s also quite a few published studies showing a positive effect for various pseudoscientific medical practices. Those studies just tend to be very poorly designed, executed, controlled, and don’t represent the actual literature trend.

You really do have to spend significant time investigating the literature to actually evaluate claims like these.

The average lifetime risk of getting cancer in the US has actually fallen substantially due to a decrease in smoking.

Diagnosis has simultaneously increased, especially in cancers where the rationale for treatment is low (e.g. prostate cancer in most men).

(work in cancer research)

So I started checking the articles in the link, seeing the ones that reported positive effects:

- https://www.ncbi.nlm.nih.gov/pubmed/29725476 The article only measures exposure; no conclusion on effects other than a literature review.

- https://www.ncbi.nlm.nih.gov/pubmed/29268055 Can't get the full article, but I only see mentions of a 900MHz source (inside the 3G frequency band) and no mention of power. Also, it's a biochemical study on rats. Slim evidence.

- https://www.ncbi.nlm.nih.gov/pubmed/25747364 One author, talks about wildlife orientation.

- https://www.ncbi.nlm.nih.gov/pubmed/26017559 This study looks at the effect of intense radiation (3x the FCC limit for mobile phones, for 8 months), and it looks like it's actually beneficial for Alzheimer's disease. Funnily enough, the abstract links to a lot of studies showing either inconsistent or no association between RF and cancer.

- https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4427287/ A study from a department of Psychology and Psychiatry(?) that finds changes in EEG activity due to mobile phone use, but only when the phone is placed near the ear. Little mention of whether the RF radiation could interfere with the EEG measurement itself.

- https://www.ncbi.nlm.nih.gov/pubmed/25738972 Mentions the CERENAT study, which shows increased risk with really heavy mobile phone usage (as in calls). The only one with actual positive effects and looking like a serious study.

- https://www.ncbi.nlm.nih.gov/pubmed/25885019 Decreased nasal mucosa in rats. No mention of whether thermal effects were at play here.

- https://www.ncbi.nlm.nih.gov/pubmed/25918601 Finds decreased sperm quality, but also discusses other studies finding no effect, and also says "A point of limitation in this study is the inability to assess [...] whether sperm affections are time related or not".

- https://www.ncbi.nlm.nih.gov/pubmed/25531835 This one is about nicotine sulfate + RF exposure in frog embryos, and doesn't find effects of RF alone.

That's just off the first page and a half. Someone should probably do a more thorough review, but what I saw does not reassure me: most of the studies with reported positive effects are done by people outside the relevant fields, have no relation to the problem, or do not answer the actual important questions.

> And nobody knows why. However everybody who points to a possible answer is shot down without much investigation. Sad, really.

Mobile phones are not the only thing that has changed radically in recent years.

>At the same time, everybody seems to accept that the chance of getting cancer in your lifetime has risen to about one in three for men and one in five for women.

Annual invasive cancer incidence rates have been declining in the U.S. for over two decades.


Also, the lifetime risk of incidence is about 40% for both men and women:



People don’t click the link because they already know better.

We are bathed in radiation everyday.

So guess what: the people who say 5G is unsafe are also the same people who will say flying is unsafe, or using a microwave is unsafe, or eating a banana is unsafe, or being in your car is unsafe, or using your phone is unsafe. Because all these things emit radiation.

Do people care about that though? No. The effects of radiation are grossly exaggerated, and frankly any negligible effects we feel are just the price we pay for living in high tech times. I doubt anyone wants to go back to a tech free lifestyle just so they can live maybe a few more years only to die of something else anyway.

So yes, it’s quackery to say 5G is unsafe and the only reason an article like this would rise to the top is so people could come out and trash it. You want to see scientific comments then go to more interesting articles.

You conflated ionising and non-ionising radiation (flying/banana/chest x-ray vs. radio waves, wifi, light from lightbulbs).

Just so you know.

Your point still stands though, although I would say we're bathed in more intense radiation than 5G, and have been bathing in it for millennia; sunlight is radiation as well!

Better live underground, i guess.

Oh, oh, but what about skin cancer, you say? That's caused by sunlight, right, so radio waves could harm you, right? Yeah, but skin cancer is caused by ultraviolet, high-energy, high-frequency EM radiation (3-30 PHz). Petahertz, Coral! That's several orders of magnitude higher in frequency than even the most extreme 5G!

Don't even get me started on the power levels of sun vs a base station!

So yeah, the claims of health impact are bullshit.

Other points: while excessive radiation is bad, below a certain point it is not harmful, and in fact is beneficial. There's some good reading here: https://en.wikipedia.org/wiki/Linear_no-threshold_model Most scientists today would say that below a threshold, there is no negative effect to radiation (or rather, you can't predict the effect in a stochastic way as a function of dose) and also that within certain regimes, radiation is beneficial (mutations leading to evolution).

> Most scientists today would say that below a threshold, there is no negative effect to radiation

Every time I am enticed to look up more about hormesis, I see the same issue: an intrinsically linear effect of a factor is studied using a precise, linear generator of that factor, but a background component of the same factor, outside the generator's control, is ignored, and that omission gets misinterpreted as non-linearity.

A concrete example: suppose you have a light sensor in a "dark" enclosure. At large enough intensities the current through the photodiode is linear with the incident illumination, but if there is some light leaking in (or, alternatively, thermal radiation, hence temperature, hence dark current), then as the light generator is set to lower and lower levels, the sensor reading no longer linearly approaches 0, since the signal delves below the noise floor (it remains measurable, but requires more and more oversampling to push the noise floor down). This effect has nothing to do with how photons get converted to electron-hole pairs; it is a misinterpretation to call the response "non-linear" close to the noise floor, and an even bigger misinterpretation to call it "beneficial". Sure, even high levels of ionizing radiation can be beneficial to the offspring of a colony of bacteria, fungi or plants as a group, but they most certainly are harmful to the individual bacteria, fungi or plants.

In the case of the light sensor, the current through the reverse biased photodiode will still be ideally linear with the total incident illumination, just no longer linear with the illumination of the non-dominant light source.
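Here's a quick numerical sketch of that point (arbitrary units, made-up background level): a perfectly linear sensor plus an ignored constant background looks "non-linear" at low signal levels if you attribute the whole reading to the generator.

```python
# A strictly linear sensor with an ignored constant background.
BACKGROUND = 5.0  # leaked light / dark current, arbitrary units

def reading(generator_level):
    # The physics is strictly linear in TOTAL illumination.
    return 1.0 * generator_level + BACKGROUND

for level in (100.0, 10.0, 1.0, 0.1):
    apparent_gain = reading(level) / level  # what a naive fit sees
    print(level, apparent_gain)
# apparent_gain drifts from 1.05 at high levels to ~51 near zero:
# a spurious "non-linearity" created entirely by the background.
```

Attribute the background to the generator and you "discover" a non-linear low-dose regime that was never there.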

This effect is called hormesis - _some_ stress can be beneficial to health.

Humans kinda need sunlight to generate enough vitamin D. So if you live in a cave, that's not going to be good for you either. But sunlight causes cancer too, so what should you do?

I think it should be pretty obvious that we can't avoid radiation, and we evolved to handle a certain amount of it.

Live in a cave and consume copious amounts of fish (oil)! Die of something other than cancer.

Really, complete bullshit for 100% of the population, no questions asked? Given all the wrong information science has helped spread when it comes to diet (example: eating fat makes you fat), is it really fair to say a study that says we need more studies is automatically bullshit?

Really, complete bullshit, just like "vaccines cause autism" or "MSG causes Chinese restaurant syndrome". Do we need more studies to again disprove what has already been disproven? Bad bullshit science exists, and "we need more studies" is the lifeline that keeps the gullible populace feeding it.

Oh, and also: to make a scientific claim, you have to make a hypothesis. Not just the claim "5G causes cancer", but "5G causes cancer by this and this mechanism". Without a mechanism of action, the best you can do is a correlational study, which is just one step above a case study, which is basically an anecdote.

>At the same time, everybody seems to accept that the chance of getting cancer in your lifetime has risen to about one in three for men and one in five for women. And nobody knows why.

There is a clear and well-known pair of reasons for this. First, people are living longer, and the longer you live, the more prone to cancer you are. People who died of smallpox did not die of cancer.

Second, more controversially, atmospheric bomb testing in the 50's and 60's.

People live too long, and don't die of stuff like pneumonia or tuberculosis or syphilis or yellow fever much anymore.

Over a long enough time span, your probability of getting some form of cancer goes to 100%. Probably more like 200% or 300%, since there are so many types that they can cut out or beat back relatively successfully these days, at least until you get too old and decrepit.
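For what it's worth, here's the back-of-envelope version of that, assuming a constant (and entirely made-up) yearly hazard. Real hazards rise steeply with age, which only strengthens the point:

```python
import math

# With a constant yearly hazard h of a first cancer diagnosis, the
# chance of at least one diagnosis by age T is 1 - exp(-h*T).
# h = 0.006 is an illustrative number, not real epidemiology.

def lifetime_risk(h_per_year, years):
    return 1 - math.exp(-h_per_year * years)

print(f"{lifetime_risk(0.006, 80):.2f}")   # 0.38 by age 80
print(f"{lifetime_risk(0.006, 200):.2f}")  # 0.70 by a hypothetical age 200
```

Live long enough and the cumulative probability crawls toward 1 even with no change in the underlying hazard.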

I think the title of the link is phrased badly. "We have no reason to believe 5G is safe" makes me want to comment that "we have no reason to believe 5G is unsafe" without even reading it. Maybe the title should be more specific. For example: pointing out that millimeter waves are not present in <=4G.

To be fair, it doesn't seem unreasonable to start with the assumption that something is unsafe and work to prove that it is safe. Seems safer ;)

If this was some technology where not having it would be a major impediment to society, then maybe it would be a different story. But 5G just doesn't seem that important to me.

If we were to do this, it would cause large numbers of technologies to never reach the public, because people couldn't do the work to "prove" that something is safe. I would only recommend this policy for things where we could reasonably expect catastrophic outcomes.

Like I said, the reason it seems reasonable in this case is because 5G really just doesn't seem like it's super important. The telcos just need a new thing to market, and it seems totally reasonable for us to expect them to ensure it's safe, especially given the ubiquity of cell phones and the fact that most people will probably eventually be upgraded to 5G without even needing to opt in.

Frankly, that's SOP on Hacker News. Half the people are commenting on something they haven't read, and the other half are complaining that news from reputable sources isn't free and/or complain that they can't read it with NoScript in Lynx on Arch Linux.

> At the same time, everybody seems to accept that the chance of getting cancer in your lifetime has risen to about one in three for men and one in five for women. And nobody knows why.

Couldn't it simply be that we are curing/reducing the incidence of most other diseases? Cancer is something that typically doesn't occur until the later years of life. By reducing the number of people who die of other causes, you are increasing the potential population of people who end up getting cancer.

I promise I'm not being snarky but would higher cancer rates be linked to the fact that we rarely die from anything else nowadays, barring accidents?

I actually read the article, as I was unsure whether this is about safe-for-human-body or safe-for-network-security.

What about the cell-phone-radiation-causes-brain-cancer concern? Haven't heard about it for a while.

5G will work better in densely populated cities, and I don't think it is a good fit for most areas in the USA. Likely we will have 5G for very dense areas and 4G for the rest.

I will absolutely not live or work in close vicinity of any 5G towers.

> At the same time, everybody seems to accept that the chance of getting cancer in your lifetime has risen to about one in three for men and one in five for women. And nobody knows why.

I'm not sure exactly what time interval you're making that claim over, but isn't a large part of it people not dying from plague and flu and having good medical care to live into old age in the first place?

>>Little is known about the effects of exposure to 4G, a 10-year-old technology, because governments have been remiss in funding this research.

One thing's for sure: there _is_ paucity of research on this topic - judging by a brief search on scholar.google.com for "4G technology health effects humans".

Is it "paucity of research" or "a paucity of research"? Agreed though.

>At the same time, everybody seems to accept that the chance of getting cancer in your lifetime has risen to about one in three for men and one in five for women. And nobody knows why.

I mean, what else is there left to kill us?

Funny things start to happen on online discussion boards when billions of dollars are at stake. Comments deserve as much skepticism as the headline.

The dissonance runs high here on the subject of 5G, the only thing I can think of that comes close is vaccinations.

Once religion is out as a guide to navigate the world, there's really not much left except science to cling to.

Despite plenty of evidence that everyone and everything can be bought, despite plenty of respected researchers raising their voices against it, despite all the proof you could want of how these situations tend to play out long term.

Once you give up blind belief in science, there's nothing left to hold on to; and that's obviously a scary thought for many.

I find the HN community to be bi-modal with respect to technology: either they are very distrusting (luddites), or blindly in love with it (techno-utopians).

5G gets the techno-utopians in HN all excited about the possibilities of [INSERT THE POINT OF 5G] and they'll get mad at anything that threatens its introduction.

As a contrary data-point, I'm an increasingly luddite programmer who doesn't care at all about 5G, but I live in a town that is kind of a hotbed for conspiracy stuff and I tire of the latest scare always going round without a plausible causal mechanism.

You can find PhDs who believe just about anything (I saw one recently with a sign saying "sunspots cause global warming"), and PhDs publish papers. So you have to look at whether the work was peer-reviewed, in what journal, and what the findings and methods really were.

It's a lot of work to properly evaluate studies, but I think that meta-analysis is easily abused in this arena. The other big problem is the null publication bias. Have one hundred people roll dice, and if you're only interested in snake-eyes and only publish papers on where that happens, you'll get like 9 studies where they rolled snake-eyes and one responsible scientist who publishes a null result, and conclude that there's a 90% chance of snake-eyes on dice rolls...
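You can make the dice example deterministic by enumerating all 36 outcomes instead of rolling: treat each outcome as one lab's "study", and publish only the hits plus one honest null paper.

```python
from itertools import product

# Toy illustration of the file-drawer effect: enumerate all 36
# outcomes of two dice, pretend each is one lab's "study", and
# only the snake-eyes studies (plus one null result) get published.

studies = [a + b == 2 for a, b in product(range(1, 7), repeat=2)]

true_rate = sum(studies) / len(studies)          # 1/36, ~2.8%
published = [s for s in studies if s] + [False]  # hits + one null paper
literature_rate = sum(published) / len(published)

print(f"true rate: {true_rate:.1%}")              # true rate: 2.8%
print(f"literature says: {literature_rate:.1%}")  # literature says: 50.0%
```

Nothing about the dice changed; only the publication filter did, and the literature now overstates the effect by more than an order of magnitude.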

I work around RF sources professionally. I'm intensely interested in any biological effect of the radiation I work around. I have access to journals through work and my old university, and I read primary sources with great interest (and some knowledge, being an electrical engineer). I have yet to find any credible link between gigahertz RF and health problems (except the neck problems that come from staring at a phone).

I feel like the most damning lack of evidence is the lack of correlation between cell phone adoption and the purported ill effects. In the last 20 years pretty much the whole world started holding RF transmitters up to their head, from basically zero beforehand. If there was an effect, it would be epidemic.

Then there are people like me, who are in love with technology, but work with it and science so closely we have a very good understanding of when and how technology can have risks, and when it can have benefits. And when it's impossible to predict.

Who here actually gets excited about 5G? Everything I see about 5G is just full of business buzzwords. No one can explain why [a technical person outside of the mobile telecom industry] should care. Why the hell is there a new "G" if LTE was supposed to mean "Long Term Evolution" anyway?

I've coined the term "orchard picking" for this. If the subject is climate or vaccines, most people will yell "learn the facts" and "follow the science" and generally act as though dissenters deserve every ounce of contempt we can pour on them. On those topics I generally agree, except maybe for that last part. But somehow on other topics - e.g. most discussions of programmer productivity or effects of technology - it's all "meh, here's my anecdote instead" and other forms of pseudo-skepticism. There I tend to break from the crowd. I believe science is a mode of thought, not just another rhetorical club to pick up when it's convenient and set down again when it's not.

Perhaps because everyone here knows there's very little actual, proper science on programmer productivity, and even less of that is applicable in workplace environment.

There is no break from the pattern in this thread. People still call for following the science - they just consider this article's science to be the RF engineering equivalent of Wakefield study.

There is no "meh", just no link found between mobile phones and cancer.

Click links in the article? I feel accomplished with the fact I read the article....

I think the push to make these scientists seem like wackos is all about the money that would be lost on the new tech (and even existing tech).

Oh, and more important than the studies is who is doing them and who is funding them.


> The scientists who signed this appeal arguably constitute the majority of experts on the effects of nonionizing radiation. They have published more than 2,000 papers and letters on EMF in professional journals.

That’s a pretty strong group.

The pioneers behind reverse transcriptase were ridiculed by their peers for years, since genetic information was supposed to flow in only one direction. Francis Crick even called the scientist a wacko. But in the end, the 'wacko' was proven correct and won the Nobel Prize for it. His name is Howard Temin.

Learned all about him from Malcolm Gladwell's podcast this weekend. A really fascinating story of someone who wouldn't give in to peer pressure, because he was convinced:


We have already solved all the world's problems, unified quantum mechanics with classical physics, cured cancer and figured out how to factor primes. None of these works will ever see the light of day or be taken seriously because of the sources.

That may sound like hyperbole, but somehow I doubt it's far from the mark. Tons of modern advances came from novices or the unaccredited. Flight, relativity, Bayesian and Boolean logic: two hicks, a patent clerk, a preacher, and a self-taught math nerd.

Radiation dosimetry is a combination of:

* Frequency of radiation

* Power of exposure

* Duration of exposure

* Where in the human body absorption is occurring

While the effects of the latter three are pretty well understood for certain kinds of radiation (ionizing and non-ionizing) ranging from "acute radiation sickness due to gamma burst" to "listening to the radio your whole life doesn't have a link to cancer", there is truth that a specific band of millimeter 5G has been less studied than others.

However, science follows patterns, and interpolating the existing data into this sub-infrared region opens a kind of wiggle room similar to, but in fact opposite to, the low-dosimetry regime of ionizing radiation that has given the Linear No Threshold model a run for its money. Except in this case, skeptics are typically concerned about chemical effects due to subdermal heating (not really as compelling as ionizing radiation effects), or debating the "non-ionizing-ness" (which is less common because it's even less supported by evidence).

It comes down to a person's personal risk. In my opinion, the sun beats out all non-ionizing radiation concerns, particularly when it comes to heating of the skin and subdermal tissues. Wear a hat and sunscreen (against the sun).

Still worth researching and acknowledging the data gap, as the EU does in its metastudy of 40+ years and X00 scientific papers [0], but there's no reason to be alarmed based on the existing corpus of evidence.

[0] https://www.google.com/url?sa=t&source=web&rct=j&url=https:/...

Edit to add citation

> It comes down to a persons personal risk. [..] Wear a hat and sunscreen.

What's the "hat and sunscreen" protection against millimeter wave cellphone towers, that someone else installed on their private property near you?

I'm not yet convinced the risks exist. But conceptually if there was a danger, there's no "personal risk" argument. We're blanketing the whole area around a tower with millimeter wave/5G, a person cannot opt out.

The sun is a 1000-watts-per-square-meter exposure at your body; a cell tower might be 100 W if you hugged the antenna, and then the inverse-square law takes that down, at the distance of a strong signal, to about 0.00000001 watts. Your cell device is a milliwatt transmitter, equivalent to an LED light shining on you.

You don't need hat and sunscreen for street lights or flashlights nor would you need it for cellular power levels, orders of magnitude difference in exposure.
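If you want to check the inverse-square arithmetic yourself, here's a rough sketch. The numbers are illustrative (an isotropic radiator at ~100 W total versus ~1000 W/m^2 of sunlight); real tower antennas are directional, but the scaling is the point.

```python
import math

TOWER_POWER_W = 100.0   # illustrative total radiated power
SUN_W_PER_M2 = 1000.0   # rough solar irradiance at the surface

def power_density(total_watts, distance_m):
    # Isotropic approximation: power spread evenly over a sphere.
    return total_watts / (4 * math.pi * distance_m ** 2)

for d in (1, 10, 100):
    s = power_density(TOWER_POWER_W, d)
    print(f"{d:>4} m: {s:.6f} W/m^2 (sunlight is {SUN_W_PER_M2 / s:,.0f}x stronger)")
```

Every factor of 10 in distance costs the tower a factor of 100 in power density, while the sun just keeps delivering its kilowatt per square meter.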

The Sun is also at 5400K.

Therefore, 99.9999038793% of the energy emitted is at wavelengths shorter than 1E-4m (0.1mm).

Or, integrating over all wavelengths longer than 1E-4 m, the total energy from the Sun is 0.0009612 W (out of the 1000 W/m^2), or 961 uW.

You might complain that 961 uW is much more than 10 nW, but again, you have to consider that 961 uW is spread across the entire band of wavelengths above 1E-4 m. Within the narrow band a tower uses, the Sun is far weaker; otherwise it'd drown out the cell phone tower (or rather, we pump power into the antenna to overcome the "noise" from the Sun).

I'd reproduce the numbers, but unfortunately my HP48 underflows at such narrow bandwidths.
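For anyone whose calculator doesn't underflow: a quick numeric integration of the Planck distribution reproduces the figure above, assuming a 5400 K black body and the 0.1 mm cutoff.

```python
import math

# Fraction of total black-body power at wavelengths LONGER than lam0:
# (15/pi^4) * integral_0^x0 of x^3/(e^x - 1) dx,  x0 = h*c/(lam0*k*T).

h, c, k = 6.62607015e-34, 2.99792458e8, 1.380649e-23
T, lam0 = 5400.0, 1e-4  # kelvin, metres (0.1 mm)

x0 = h * c / (lam0 * k * T)

# Simple midpoint integration; expm1 keeps the small-x values accurate.
n = 10_000
dx = x0 / n
integral = sum(((i + 0.5) * dx) ** 3 / math.expm1((i + 0.5) * dx) * dx
               for i in range(n))

frac_long = integral * 15 / math.pi ** 4
print(f"{frac_long:.4e}")  # ~9.6e-07, i.e. ~0.96 mW of a 1000 W/m^2 beam
```

That matches the 0.0009612 W figure: only about a microwatt per watt of sunlight lives below 3 THz.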

And the "LED" at your face is, of course, much more powerful than the Sun since (as you pointed out) by the inverse square law, it has to pump out a lot of power to reach the tower.

Nor do I happen to think LEDs are harmless. Using LEDs to affect biological systems is a very rich area of study.

Do I think non-ionizing radiation is harmful? I doubt it. But comparing a 1000W/m^2 @ 5400K black body radiator to a cell phone tower is dishonest.

Exactly, the sun is mostly much higher-frequency (hundreds of terahertz, approaching ionizing) radiation.

The concern with 5G seems to revolve around the uses of higher frequency millimeter wave radiation compared to 3G/4G which has shown no repeatable damaging results at normal power levels.

If higher frequency = bad, which is actually true, comparing 5G to the sun is not dishonest IMO. I am trying to give some perspective to show how odd it seems to worry about extremely low-power cellular radiation while giving little thought to the extremely powerful nuclear radiator in the sky.

Dosage matters; cellular frequencies will cook you given enough power, and so will visible light. The power levels we are talking about do not generate enough heat to damage our tissue, so if they do harm, it would need to be through some other, unknown process. Should we keep looking for possible other processes? Yes, of course. However, I would be much more concerned about the much more powerful visible-range artificial radiators around us every day, like the monitor I am staring at right now, emitting 100 times the radiation of my cellphone right at my face all day long.

>Therefore, 99.9999038793% of the energy emitted is at wavelengths shorter than 1E-4m (0.1mm).

Shorter wavelengths have higher photon energy. It's the short stuff that you have to worry about.

Sure, ionizing vs non-ionizing but millimeter waves do do localized, penetrating, heating and the intensity of millimeter radiation from 5G is orders of magnitude stronger than that from the Sun.

Is it consequential? Probably not, but the OP's original comparison of 1000 W of Sun vs. 100W from a tower and nW from cell phone is disingenuous.

As a general heuristic, sure. But that is hardly an airtight thing.

This is a tangent, but it's crazy that we've built a communications network that relies on basically LEDs shining out of our pockets (with less interference from the sun, but still).

Except our pockets (unless you wear tinfoil pants) and many other materials are transparent at those wavelengths which makes it a lot less crazy.

You don't think making an antenna that can detect an LED shining in a flat miles away is crazy because pockets are transparent?

In darkness our eyes can easily detect a 1W LED from kilometers away.

But can they distinguish a specific 300 mW LED in a stadium with tens of thousands of other LEDs next to it?

If it were a different color, yes. If each of them were a different color, your eye would still be able to, as long as you had enough focus. So the analogy holds pretty well.

My understanding is that healthy human eyes are able to detect individual photons.

No. Amateur radio operators have done it, and it’s not difficult to imagine how one would.

That doesn't make it any less amazing.

Indeed, nothing is difficult to imagine how to do if the method is common knowledge.

Absorption rates vary by orders of magnitude depending on signal frequency.

A simple power comparison is not a great measure of effect.

The Sun transfers orders of magnitude more heat to your body than cell towers or your phone.

The problem with every argument about radiation from cell phones being dangerous is that for every proposed mechanism, the Sun is orders of magnitude more damaging.

The obvious worries with cell phones are repeated stress injuries, insomnia and disruption of personal relationships. There's just no plausible mechanism for the radiation to be damaging, though.

Out of curiosity, do you wear sunscreen to keep yourself cool?

Even at 100% absorption these are minuscule amounts of power compared to the huge doses of ionizing radiation we bathe in every day.

What ionizing radiation are you referring to? I'd hope you are not bathing in ionizing radiation everyday...

Sorry, I always forget UVB is not technically in the ionizing range, but "near-ionizing", with enough energy to cause non-thermal DNA damage.

It is, because if you assume the worst case of perfect 100% absorption, we're still talking about orders of magnitude too little power to matter.

The higher the frequency, the less the radiation penetrates (skin effect), so higher = "better".

You're falsely equating the sun's wide spectrum to 5G's very narrow set of bands. Most of the sun's spectrum at the surface is visible and infrared light, and a bit of UV.

Note that without significant shielding, the sun will make you very sick or kill you.

Mostly the protection is the inverse square law. Provided you're not holding the device next to your head the exposure is tiny.

Cell modems transmit at around 300 milliwatts; it would be like holding an LED light up to your head that flickers on and off when sending data.

Extremely tiny exposure to non-ionizing radiation; you probably get more from the monitor you're looking at right now.
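The inverse-square argument in this thread is easy to put numbers on. A rough sketch: the 300 mW handset figure is from the comment above, while the 1 cm handset distance and a 100 W isotropic tower at 100 m are assumptions for illustration (real antennas are directional, so these are order-of-magnitude estimates at best):

```python
# Rough inverse-square comparison. The 300 mW handset figure is from the
# comment above; the 1 cm handset distance and a 100 W isotropic tower
# at 100 m are illustrative assumptions (real antennas are directional).
import math

def power_density(watts, meters):
    """Free-space power density (W/m^2) of an isotropic radiator."""
    return watts / (4 * math.pi * meters**2)

sun = 1000.0                          # W/m^2, rough midday sunlight
phone = power_density(0.3, 0.01)      # handset held 1 cm from the head
tower = power_density(100.0, 100.0)   # tower at 100 m
```

Even under these crude assumptions, the handset at point-blank range dominates the distant tower by several orders of magnitude, and sunlight dominates both.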

5G requires far more towers than 4G or 3G before it, and if biological damage is possible accumulated risk when measured over an entire population can still be problematic.

Ultimately this isn't simply an issue of "personal risk." If there's no danger, there's no danger. But if there is and you live in a major city simply not owning a cellular device may only lower your exposure.

None of what you say is false, but without numbers it's impossible to compare to other sources of danger. And like ionising radiation, you need to compare to background levels. How does it compare to, say, the sun? Or EMC emissions from motor vehicles?

(Yes, they're not zero: https://www.ofcom.org.uk/__data/assets/pdf_file/0029/44876/m... )

No, it doesn't. It can work with more towers on 60 GHz, or with the same number on the same frequencies as 4G.

And either way: more towers, less power per tower.

Or in your pocket. Where do you carry your phone?

That would only apply to the phone transmitting, wouldn't it? The RX is broadcast from the nearest tower, and 5G uses beam-forming. If your device (or another device near you) is receiving from the tower then there's not much of a way to avoid getting exposure.

Not much way to avoid getting exposure to electro-magnetic waves?

You make it sound like EM waves aren’t being emitted or reflected by everything in the universe short of a black hole.

And cell phone EM exposure levels for someone who isn’t carrying a phone - hopefully this person knows better than to walk outside, or stand near a window!

Black holes still emit hawking radiation.

A hat and sunscreen do offer some protection against millimeter wave radiation: the hat will block some (smallish but non-zero) percentage of the radiation, as will the sunscreen, as long as you apply a thick layer.

As far as I know the exposure is a bit like a bell curve.

Right besides the tower you're in a blind spot, then it gets more intense as you move out of the blind spot and then it gets lower as you move further away.

5G requires more towers, closer to their end users. To quote the article:

> 5G will require cell antennas every 100 to 200 meters, exposing many people to millimeter wave radiation. 5G also employs new technologies (e.g., active antennas capable of beam-forming; phased arrays; massive inputs and outputs, known as MIMO) which pose unique challenges for measuring exposures.

Haha, it’s not massive inputs and outputs. It’s multiple input/output antenna arrays.

No, it won't require that; it will be able to work on the same frequencies as 4G does.

It can use more towers at higher frequencies in really really crowded areas or industrial setups, but those work at lower power outputs, and penetrate the body less (due to higher frequencies).

Aren't more cell antennas equivalent to much lower transmission power? The phone especially needs to use far less power, so the total amount of EM exposure for humans nearby is going to be significantly lower.

Now compare to AM/FM transmitters that cover a huge geographic area. Those can be scary. FCC allows up to 50000 W transmission power. It's required because the antennas are spaced so far apart.

Isn't the amount of radiation you receive from the towers tiny in comparison to the radiation of your phone?

My City has them or they are under permitting for "towers" (of 30 feet or so) on nearly every block. Some right next to residences, daycares, schools, offices, etc.

I have a wallet with RFID protection. Could we make cell phone cases and/or glass with shielding that blocks 5G signals, limiting exposure from the parts of the phone that don't need to be sending and receiving signals? Basically, have a naked antenna and shield everything else. I am not a mobile phone hardware wizard :)

There is such a thing as RF shielding that uses copper or other conductive metals in a grated screen to block out radio frequencies. While it's typically used to create telecommunication dead zones, I could easily see somebody capitalizing on this kind of health scare by putting copper shielding in clothing, say a hoodie.

I think when comparing this to the sun there are lots of things worth considering from many different angles.

1. To say this threat is nothing compared to the risks we face from the sun completely ignores the fact that we have millions of years of evolutionary exposure to sunlight, and because of that the immune system has become far more acclimated to dealing with its carcinogenic properties. Of course, our immune systems aren't perfect, but considering how much more exposure we all have to sunlight than to any other carcinogen, if we didn't have a strong evolutionary tolerance to it, melanoma would obviously be the #1 leading cause of cancer. But it's not.

Now, by contrast, what we don't have evolutionary genetic tolerance for is microwave radiation. While I'm not in any way an expert in biology or the physics of light, I do have enough insight to know that we should never underestimate the possible negative outcome of any potential problem when it's still just a mathematical theory we're playing with rather than a reality of nature we're all dying from.

I mean, we underestimate and miscalculate these kinds of things all the time, and it's kind of counterproductive that the answer to the question "When will we learn to stop doing this?" is always "Once we underestimate and miscalculate, discover our error, and become smarter because of it"...

2. The bigger issue this has when compared to the sun is that while all of us can agree there is such a thing as getting too much sun, there isn't such a thing as internet that is too fast. We essentially all want to be able to download terabytes at the speed of light, so that we can one day be on Mars and seamlessly stream Netflix from Earth while we're up there. And while of course there is a speed-of-light delay of, I believe, around 8 minutes, that doesn't mean people aren't going to complain about why they can't instantly stream from there.

This is what we need to understand about this issue. There is no limitation to how much radio wave radiation we want when we are thinking of that radiation in the language of the internet, as "Content".

3. When it comes to protecting ourselves from this, if it actually does turn out to be a substantial problem. All the solutions suck.

The radio wave EM shielding for example protects you from the potentially harmful waves. But it also creates like I said, a deadzone.

So putting it in the walls of your home, because you don't want you and your entire family constantly exposed to the cell tower that might be only a football field's length away from your house (as the one near my house is), also means you can't receive or make calls from your cell phone. This is a deal-breaker for most people, and as far as I'm aware RF shielding isn't like sunscreen, where it blocks out most of the harmful light while letting in some of what you want. It's an all-or-nothing solution.

Which is why I think putting the RF shield in clothing like a hoodie would be an interesting venture idea. The shielding protects your body, while your phone is still outside of the shielded area. (Although this obviously doesn't protect your face, and I don't think RF-shielded masks are going to find a market other than antifa.)

And another reason why the implications of this would really suck is that it means we're stuck with crappy wires and the companies who own the rights to pumping internet to and fro over them. I am mostly looking forward to 5G because I view it as the primary means by which we are going to get ourselves out of the era of shitty slow internet and greedy cable company overlords who wouldn't know how to disrupt something if a TV with rainbow bars on the screen hit them in the face.

> Now in hindsight, what we don't have evolutionary genetic tolerance for is microwave radiation.

Nor would such a thing be possible unless we could somehow drastically reduce our water content.

But speaking of miscalculation, mentioning Mars before starting to go on about tinfoil hoodies is, well, there's a lot of actually definitely dangerous radiation on Mars and a whole lot more on the way there.

One last note: higher speed or bandwidth has not scaled linearly (or really at all, outside early 3G?) with antenna power.

But if you do want to shield your house, most telcos offer seamless wifi calling these days. Then again wifi runs on the same freq as microwave ovens.

The thing is that there's really a ton of evidence that non-ionizing radiation is not significantly harmful to humans at levels that don't cook you.

If you want to counter that you can't just pile up small studies that might be hinting at possibility that there might be some other effect.

You need a smoking gun. Single study, but bulletproof and large, showing strong effect. Everything else will be dismissed as "maybe, possibly, but most likely not really".

This comment explains precisely why the dismissal from so many here. Correlational studies without a plausible causal mechanism are highly suspect.

As an illustration, people have used obviously bogus examples like, in the past 100 years, piracy has risen. So has global warming. Therefore, pirates cause global warming.

Unless a 5G study specifically addresses the mechanism, and how non-ionizing radiation can cause damage to DNA, or has a very large correlation established that does a very good job of controlling for other factors, these studies will be dismissed out of hand.

There are plenty of ways that non-ionizing radiation could result in increased cancer rates. For example, it can induce currents in the DNA (DNA is a molecular wire), which might jam the base excision repair system and prevent it from detecting DNA damage (which is a redox-driven process).

What's even scarier is that this sort of an effect will not be found in a standard Ames test and also is unlikely to be found in highly controlled lab settings, since it requires a second factor - a contaminating primary mutagen - to manifest its effect.

Maybe, possibly, but does this potential mechanism have any impact on our huge, complex, self-righting bodies? Every hour they successfully defend against an onslaught of scores of adversarial microorganisms and their chemical warfare targeted at us and each other; against environmental factors like background ionizing radiation and completely natural toxins that we breathe in and choose to ingest; and against our bodies' own complexity, which leads to many, many errors in functioning that are roughly corrected and worked around on the fly, keeping us alive not for hours, which would be a feat in itself, but for decades.

Aren't some forms of cancer also overreactions by the immune system? Some kinds of leukemia come from phagocytes, I just read in another thread this morning.

alright, I'm getting downvoted, so I'll put some important references here:

Generally, the research of Jackie Barton showing that DNA is a conductor, and speculating about the role of the 4Fe4S cluster in the BER mechanism, was emergent in the late 2000s, back when I was a biochemist and not a programmer, but they're now (in the last 2 years) publishing papers that are fleshing out that hypothesis:


Specific papers: https://www.annualreviews.org/doi/abs/10.1146/annurev-bioche...




This is quite interesting (I'm a biochemist and programmer as well), but I don't see any evidence that non-ionizing radiation could affect DNA repair pathways - that seems like a killer experiment that could (and should) be run to support or disprove this hypothesis. Induce some double strand breaks or DNA nicks, and look at how non-ionizing radiation affects repair rates

I agree that it's a killer experiment and someone should do it.

But BER doesn't sense double-strand breaks. It senses changes in the curvature of the DNA induced by point mutations, which are fixable by excising the base and replacing it, templated from the other strand. So you'll want to induce damage with something that causes thymidine dimerization or DNA alkylation.

Wait, so a -30 dBm signal (pretty strong) will induce something like 7 mV across a 5 cm antenna. Your cell nucleus is up to 10 microns wide. That's 1.4 microvolts. I'm having a really hard time believing that a 1.4 µV potential across any molecule at STP can have a measurable effect without a lot of energy going into amplifying the signal.
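For what it's worth, the arithmetic in that comment checks out under the usual assumptions: a 50-ohm antenna impedance, and the commenter's own linear length scaling from antenna to nucleus, which is a rough heuristic rather than rigorous physics:

```python
# Checking the comment's arithmetic: -30 dBm into an assumed 50-ohm
# antenna, then the commenter's linear scaling from a 5 cm antenna down
# to a 10 micron nucleus (a rough heuristic, not rigorous physics).
import math

def dbm_to_watts(dbm):
    """dBm is decibels referenced to 1 milliwatt."""
    return 10 ** (dbm / 10) / 1000

p = dbm_to_watts(-30)                     # 1 microwatt
v_antenna = math.sqrt(p * 50)             # V = sqrt(P * R), about 7 mV
v_nucleus = v_antenna * (10e-6 / 0.05)    # scale 5 cm down to 10 microns
```

That lands at roughly 1.4 µV across the nucleus, matching the comment's numbers.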

Energy doesn't scale with distance in that way, with quantum mechanics. A "mismatch in size" (roughly speaking) will result in a different quantum efficiency, not a lowered energy, and also keep in mind that DNA is a highly coiled molecule so the resonance frequency could be deceptive. The straightened length of DNA in a cell is on the length order of meters. (Though that doesn't mean that it will resonate with meter long waves, it will have a higher efficiency with wavelengths much longer than a micron)

> they can induce currents in the DNA

By what mechanism (and it's likely not limited to DNA then)?

EMF radiation can induce a current in any conductor. Try putting a piece of tin foil in a microwave.

It's far from limited to DNA, but it's not hard to imagine why people care more about DNA than other conductive molecules.

Any photon is basically "something which induces motion in a charged particle". That's what it means to be electromagnetic radiation. The intensity to which that occurs has to do with the electrical potential landscape around that charged particle and the wavelength of that photon. If these two values are close to each other (coupled) then the motion of the charged particle is more likely.

Absorption of microwave radiation by DNA (especially in the GHz range) has long been known as a phenomenon; of course, it does depend on which frequency band in the GHz range you're in, too.

Wouldn't all that depend on the specific signal transmitted? If you just look at random memory access, you'll find that it's not prone to corruption. That didn't stop rowhammer, for example.

> As an illustration, people have used obviously bogus examples like, in the past 100 years, piracy has risen. So has global warming. Therefore, pirates cause global warming.

Actually you got that wrong. This was related to real piracy (i.e. people with eye patches on ships robbing other ships) which went down, not software piracy. So the decline of pirates is related to global warming.

(The example came from Bobby Henderson, the founder of the flying spaghetti monster.)

There are actually a lot more of the traditional maritime pirates these days operating off the Somali coast than there were in 1919.

Agreed in principle, however from what I get :

(i) it seems this "non-ionizing radiation" has many parameters that can modify significantly the way it propagates in the environment (and in human bodies), so I'm not sure speaking of non-ionizing radiation in a general way is sufficient to address the problem

(ii) following this reasoning and your comment about "a single study, bulletproof and large": it does seem that it is in fact what is being asked by this group - that time be given for a meaningful and large study of the specific radiation from 5G tech

and therefore requiring that kind of large study before widespread implementation actually seems warranted.

I'd be interested in seeing signatures from people at institutions I recognize and trust. There are a lot of people with degrees FROM credible institutions, but very little in terms of currently being researchers in the field AT credible institutions.

It's not hard to find 250 wackos if you pull random scientists working in random fields at random institutions. Most have no better way to know safety of 5G than I do.

Now, there's obviously some frequency band where we get into health risks. 5G jumps us from single-digit GHz to double-digit -- I'd guess you'd have to go at least past visible light before you run into safety issues, at least barring extremely high levels of exposure. Intuitively, it seems to me like that ought to still be safe, but I'm no expert.

But an appeal to experts -- with no real experts behind it -- doesn't do it for me. Neither does an appeal to papers based on volume, without a clear description of what they found and how. Most science is junk science.

The scientists' appeal, including a list of signatories and their affiliations, isn't hard to find by following links.


Likewise for a list of relevant papers:


Is Harvard Medical School an institution you recognize and trust? Columbia? Monash? McGill? Can we dispense with the tired ad hominems and talk about science? A self-proclaimed guess from a self-proclaimed non-expert seems ill placed in a credentialist diatribe.

When the list of people also includes folks who claim to "customize clinical trials to support marketing claims"[0] it is hard to take even experts with solid credentials seriously.

0: http://energymedicineri.com/

Five out of five signers from my country are well-known EHS proponents, and at least three have financial ties to the illness. Having a Ph.D. degree from a well-known university does little to prevent later involvement in quackery.

"Is Harvard Medical School an institution you recognize and trust?"

After their primary university admissions scandal? Absolutely not, not that I ever trusted them in the first place (Memphis has a far better medical program.)

... I found them, and that's exactly what raised flags.

If a faculty member at HMS signed on, that'd be okay. If a random person with a degree from HMS does, that doesn't carry the same weight. I was pretty clear about "at" versus "from."

Science isn't a democracy.

Harvard Medical School is a good institution, but the people on the list are not experts in the subject. Autism researchers, etc.

The autism researcher from Harvard has studied the plausibility of a link between autism and radiofrequency radiation: https://www.ncbi.nlm.nih.gov/m/pubmed/24095003/.


Goalposts notwithstanding you have to admit the list is kind of hit or miss. Whenever their job isn't listed they're either retired or were never in a related field. For example you listed Harvard, Columbia, Monash, and McGill above. Half the people from those institutions are either unqualified, retired, or both.

Plus there's places like "ElectroSensitivity UK" which aren't scientific institutions.

PS - I fully admit that there are subject experts from trusted institutions included too. But you're being far too defensive for quite legitimate questions/criticisms.

> the list is kind of hit or miss.

It seemed that way to me as well, but the real point is that it's irrelevant how well or poorly we judge those people. What matters is the facts they or others present.

> you're being far too defensive

Defensive of what? I haven't even expressed a position yet, other than "logical fallacies are bad" and I'm pretty willing to stand by that one. Are you here to argue the converse? To the extent that people or motivations matter at all, why condemn me but not the one who created the digression?

> ...the real point is that it's irrelevant how well or poorly we judge those people. What matters is the facts they or others present.

This matters if your expertise is in the subject to which the facts belong; otherwise it means little, particularly in a subject which requires an incredible amount of knowledge across an incredible array of fields.

I can look at a bunch of absolute facts regarding 5G signals and make an assumption; however, because my expertise lies far outside the realm of 5G, an expert can easily come along afterwards and show me why these facts, while accurate, mean nothing to the subject at hand, and why a completely different set of facts is what one should be looking at.

Expertise absolutely matters. And this is coming from a person who is confident that a greater than zero amount of companies would happily pay people to add noise to a topic in order to poison us for a fraction of a percentage boost in their quarterly growth.

Expertise matters on complex subjects and attempting to pretend as if all ideas are equal no matter who they come from or who receives them is a recipe for disaster.

The collective We really needs to get back to a place where we freely and happily say "It's really outside my realm of expertise. You should track down an expert," and, far more often, "I don't know."

> as if all ideas

This isn't about ideas. It's about data. The quality of data is independent of where it came from, so it actually is equal in that sense.

> You should track down an expert.

Yes, we should refer people to experts more often. Why? Because they have data, not because they have titles or affiliations. Information is what makes them experts, and information can be shared.

Perhaps I should have been more clear, apologies if I assumed it would be inferred.

To use your term data, with complex topics it isn’t simply having data. One must know which data is relevant to consider for which topic at which point. And even more importantly, one must understand which missing sets of information need to be considered.

There is a reason our society has come to place such a high value on experience and expertise. And it isn't solely because someone had books of data stored on massive bookshelves. While those bookshelves are important, it is the expert's understanding of the nuances of which books to hunt through for answers.

Our new ability to store significantly more of these books and retrieve them more efficiently doesn’t remove our need for someone to use and contextualize the information contained in these books. New tech doesn’t change the fact that expertise matters.

Interestingly enough, we now face a sort of reverse of the problems we’ve faced for centuries: While we used to struggle to find enough information to feed to experts, now we face too much data and not enough experts to properly make use of it.

Again, apologies if my first post wasn’t clarifying enough.

Data doesn't speak for itself. It has to be properly contextualized and positioned for an intended audience and message. It's for that reason that you don't ever see scientific papers that are just tables of numbers alone.

If you're outside a particular area of expertise, the intellectually honest thing to do is to say "I'm not knowledgeable enough to assess these findings" and defer to someone who is. At that point, you're assessing credibility, and affiliation and track record absolutely matter.

> It has to be properly contextualized and positioned

Contextualization and positioning are themselves information that an expert can pass on.

I can think of very few science cranks I've ever seen who didn't have some kind of data. That means nothing. Data isn't a magical, pure, and perfect substance that emerges from the aether.

"I haven't even expressed a position yet, other than "logical fallacies are bad" and I'm pretty willing to stand by that one."

Your fallacy is the "Fallacy Fallacy." If you can actually stand by your words (nobody in my nearly 40 years of life has been able to) you'd explode from the sheer paradox.

> but very little in terms of currently being researchers in the field

Wait, did the goalposts actually move at all? The original post said:

> There are a lot of people with degrees FROM credible institutions, but very little in terms of currently being researchers in the field AT credible institutions.

You responded literally saying that the institutions are credible:

> Is Harvard Medical School an institution you recognize and trust? Columbia? Monash? McGill?

That wasn't the original objection; they even said in their objection that the institutions are credible. I don't understand how the goalposts moved.


Is criticizing an expert's qualifications an ad hominem? Particularly when they're using their expertise to argue for an idea as here.

Seems like a double standard. They can use their expertise/reputation to endorse an idea but using that same expertise/reputation to qualify that endorsement is an "ad hominem."

To me discussing their expertise is core to their position, since their whole position is: "I'm an expert, I sign a letter based on my expertise for a specific course of action." If they aren't an expert it cuts right to the core of their position.

Is it though? Is it ad hominem to identify and separate people who know something about a thing from those who don’t?

You raise a good point. When an appeal to authority has already made credentials etc. the issue, challenging that appeal becomes valid. The problem is that OP wasn't just an appeal to authority. It did not open that door. It did make arguments other than identity, and linked to fuller expositions of those arguments. Thus, addressing only identity and ignoring any arguments in either direction remains ad hominem.

It's not ad hominem to separate between experts and non-experts.

Ad hominem is a fair counter-argument to appeal to authority.

I would like to add that many academics who study wireless research, specifically MIMO beamforming, aren't particularly incentivized to find issues with 5G. I've spoken to a few who have dismissed health concerns while waving their hands about the actual science. To them this is worth anything up to 10 years of funding and research opportunities in what can be a very competitive research area.

They receive A LOT of funding from cellular companies in particular, any research that throws doubt on 5G could lead to their funding being pulled. Not only this, but most of the interesting 5G hardware is coming from the cellular companies on loan.

My point is: You have a perfect storm for a rushed technology with potential for health risks. Almost every Country on the planet is investing lots of money into 5G and the technology itself requires a significant number more cellular towers to be built in closer proximity to people.

If there is any genuine question about 5G's safety, I would rather stay on the side of caution. It's not as if people will keel over if they don't have 5G immediately. Not only that, the technology will be more mature and the price of infrastructure development will likely come down.

It is impossible to find a control group that hasn't been previously exposed to some form of 2/3/4G radiation.

This is often repeated, but I wonder if it's completely true. I'd completely understand if it was harder, and more expensive, to find such a control group, but it surprises me that it would be "impossible".

I can think of a few people I know who are much less exposed to 2/3/4G radiation, by living in the forest and not having a phone on them all the time. Isn't a control group that has been exposed to significantly less radiation still useful?

The Sentinelese, but they shoot visitors.

There are large areas not exposed to significant (read: detectable) levels of cellular radiation if you're willing to travel. Third-world countries would be a good place to start.

The article explicitly calls the 200 participants the world's leading experts in non-ionizing radiation. I haven't researched their credentials or examined their research. But if all existing research points in one direction, that constitutes empirical evidence. Of course methodological and statistical problems abound across all academia. The article makes existing research seem unequivocal and argues we need to conduct further research into the health effects of 4G as well as 5G.

Fortunately the article is wrong because the signers are almost all either: 1) long-retired and out of the game entirely, 2) not trained in the field or qualified at all, or 3) known quacks who have peddled this kind of thing for years, or some combination of the three, and the articles cited are all written by that same population (and often published in no-name journals that I could probably get published in) or don't say what the author claims they do (many of the articles explicitly state that they found no connection whatsoever).

So I guess the moral of the story is "don't believe everything you see in pop science magazines," although I would have hoped nobody did to begin with(?)

Not a true Scotsman among them, huh?

If you don't want to be called a quack, don't associate with quacks.

>Now, there's obviously some frequency band where we get into health risks. 5G jumps us from single-digit GHz to double-digit -- I'd guess you'd have to go at least past visible light before you run into safety issues, at least barring extremely high levels of exposure. Intuitively, it seems to me like that ought to still be safe, but I'm no expert.

I highly doubt that 26+ GHz will receive much attention. You have no range, you need line of sight otherwise it won't work. At least 3.4GHz (and the other LTE bands) don't have that issue.

Why is being below visible light a factor? Different wavelengths have different properties that don't necessarily map linearly with frequency, such as the ability to pass through your body.

E = h·ν; frequencies below UV have insufficient energy per photon to ionize anything.
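
To put rough numbers on E = h·ν (a quick sketch; the ~10 eV ionization threshold is an approximate figure, and the frequencies are round illustrative values):

```python
# Photon energy E = h * nu, in electronvolts, vs. a rough ~10 eV ionization threshold.
PLANCK_EV_S = 4.135667696e-15  # Planck constant in eV*s

def photon_energy_ev(freq_hz: float) -> float:
    """Energy carried by a single photon of the given frequency, in eV."""
    return PLANCK_EV_S * freq_hz

for label, freq_hz in [("4G (2.6 GHz)", 2.6e9),
                       ("5G mmWave (26 GHz)", 26e9),
                       ("visible light (~600 THz)", 6e14),
                       ("UV-C (~1.5 PHz)", 1.5e15)]:
    print(f"{label}: {photon_energy_ev(freq_hz):.2e} eV")
```

Even at 26 GHz a photon carries on the order of 10^-4 eV, about five orders of magnitude short of ionizing anything; only around UV does the per-photon energy reach the required eV scale.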

I clicked on a few links and Googled a few names. It's roughly what I'd expect: people with few publications now working in other fields, people with university degrees but no scientific publications, people who regularly speak at events organized by radiation critics, people who are invested in other fringe theories.

People get fooled by "large lists of peer-reviewed publications". Peer review is the lowest-level quality mark for a piece of science. It means that hopefully it's not complete bullshit. Sometimes it still is, because no one can stop you from calling your journal "peer-reviewed" with your own weak standards of peer review. And sometimes credible journals make huge mistakes (remember the Wakefield study...). Even with only well-performed, non-flawed studies, you'll always have some studies saying that something is there that actually isn't. That's simple statistics; you'll have outliers.
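
That "simple statistics" point is easy to demonstrate with a toy simulation (all numbers here are invented for illustration): test a true null effect many times at p < 0.05 and about 5% of studies will "detect" something anyway.

```python
import math
import random
import statistics

def simulate_null_studies(n_studies=10_000, n=50, seed=1):
    """Run many 'studies' comparing two groups drawn from the SAME
    distribution (true effect = 0) and count how many reach p < 0.05
    by chance, using a two-sided z-test with known sigma = 1."""
    rng = random.Random(seed)
    false_positives = 0
    for _ in range(n_studies):
        a = [rng.gauss(0, 1) for _ in range(n)]
        b = [rng.gauss(0, 1) for _ in range(n)]
        z = (statistics.mean(a) - statistics.mean(b)) / math.sqrt(2 / n)
        if abs(z) > 1.96:  # two-sided p < 0.05
            false_positives += 1
    return false_positives / n_studies

print(simulate_null_studies())  # close to 0.05, by construction
```

With thousands of published studies on a topic, a 5% false-positive rate alone guarantees a long list of "positive findings" even if nothing is there.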

"Here's a large number of studies saying X" is meaningless in a topic where a very large number of studies have been done. What you need is systematic reviews of the literature that not only count studies, but evaluate their quality and combine their results.

Also it's not true that "nobody knows why" cancer incidence has risen. It's a mixture of people getting older and diagnosis getting better. Not mysterious at all.

> People with few publications now working in other fields

I'm surprised this statement passed muster on HN where there must be some number of PhDs now working in other fields. Certainly the ones I know wouldn't take kindly to the idea that somehow, just because they left academia their research can be easily and completely dismissed.

There's nothing wrong with changing fields. But if you tell me "these are the most relevant scientists in the field" and most of them aren't actively working scientists - it looks fishy...

You might find it more informative to look at the last author, who is typically the expert and ( ideally ) is helping to drive the research.

Being such a person is totally fine (I’m one myself). What’s damning is the fact that this list is largely composed of just such people (plus some that are less qualified).

Yeah, this is just a ridiculous way to dismiss the papers. Most authors are "people with few publications now working in other fields". One professor trains 25+ PhDs in their career, and usually all but one or two of them will end up fitting that description.

It's unfortunately common in science for someone who's an expert in one field to think they can easily become an expert in another field and get it completely wrong. Linus Pauling is the best-known example of this.

Furthermore, even if cell phones cause cancer, the risk will in all likelihood be so small that it doesn't justify banning the technology. [1]

For example, eating barbecued meat causes cancer, but we don't ban it.

1: https://www.youtube.com/watch?v=wU5XkhUGzBs

Wouldn't the difference there be that the person consuming meat is doing so voluntarily, while even people sticking to older-tech phones are exposed to the newer radiation? I'm no luddite, but this would be my argument from a devil's-advocate position.

No, because you're only exposed to radiation if you actually use a phone. Google for "inverse square law". Your exposure is orders of magnitude lower from "second-hand cellphones" than from using it yourself; it's not like second-hand smoking.
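
A back-of-the-envelope sketch of that claim (the power levels and distances below are assumed round numbers, not measured values):

```python
import math

def power_density(power_w: float, distance_m: float) -> float:
    """Free-space power density of an isotropic source in W/m^2
    (the inverse square law: P / (4*pi*r^2))."""
    return power_w / (4 * math.pi * distance_m ** 2)

phone = power_density(0.2, 0.02)    # handset at your head: ~0.2 W, ~2 cm away
tower = power_density(200.0, 90.0)  # source with 30 dB more power, ~100 yards away

print(f"phone: {phone:.3g} W/m^2, tower: {tower:.3g} W/m^2, "
      f"ratio: {phone / tower:.0f}x")
```

Under these assumptions the handset dominates by roughly four orders of magnitude: distance wins over transmit power.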

Not if there's a cell tower every 100 yards putting out 30dB more power than your phone. The inverse square law also doesn't apply when you have phased arrays doing beamforming if you're directly in the path.

The inverse square law totally applies in the case of phased array beamforming, you just have a much higher gain to start with.

You get a lower exponent than the square with a beam. For a perfectly parallel beam the exponent is technically 0 (same intensity regardless of distance, like a laser pointer). For a focused beam, intensity will actually increase with distance up to the focal point. Some energy can be absorbed by the medium (air/moisture/rain), but that is not covered by the inverse square law.

Not with a planar or linear array. I'm skeptical, but prepared to learn something: what sort of array can generate a square beam?

Even lasers spread. This guy agrees with me: https://www.quora.com/Is-the-light-from-lasers-reduced-by-th...

The cross section of the beam can be any shape; the important thing is that the signal source is an area, not a point. The inverse square of distance from the source dominates power density when the source is effectively a point - this becomes the situation for a laser, or for anything at long enough distance - but at distances where it is effectively a beam, power density does not fall off at that rate. The geometry of the beam concentrates power density at its focal point, which can be far behind the source or in front of it. An ideal beam focused to infinity (parallel rays) would never dissipate with distance, since all of the power travels in the same direction. We don't need an ideal beam to break the inverse square rule: any focusing of rays breaks it, at scales where it's reasonable to treat the radiation as a beam rather than rays emanating from a point.

I hope that helps picture the situation. My original comment that a beam can have a different exponent was incorrect, except in some approximate sense. The inverse square law still applies, but measured from the beam's real or imaginary focal point (if it is a point). In the case of a transmitter beaming a signal at another, that focal point can be far beyond the receiver, so the 'rule' can be completely confounded.
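
That correction can be sketched with a toy model - inverse square measured from the beam's (real or virtual) focal point rather than from the emitter; the numbers below are illustrative only:

```python
def intensity(distance_m: float, focal_point_m: float, i0: float = 1.0) -> float:
    """Toy model: inverse-square falloff measured from the beam's focal
    point. focal_point_m < 0 means a virtual focal point behind the
    emitter (a gently diverging beam); 0 recovers the point-source law."""
    return i0 / (distance_m - focal_point_m) ** 2

# Ordinary point source: doubling the distance quarters the intensity.
print(intensity(10, 0) / intensity(20, 0))           # 4.0
# Beam with a virtual focal point 1 km behind the emitter: almost no drop.
print(intensity(10, -1000) / intensity(20, -1000))   # ~1.02
```

Same law in both cases; what changes is where you measure the distance from, which is why naive "tower is 100 yards away" arithmetic can mislead for beamformed signals.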

It depends. The inverse square law applies to spherical wavefronts. Due to diffraction, wave fronts always become spherical in the far field.

But in the near field, that's not the case.

Beamforming arrays have planar wavefronts close to the source, so the inverse square law does not apply. The wave fronts will become spherical again at a "large" distance from the emitter, where the meaning of "large" depends on wavelength and emitter size.

Focussed lasers also do not have spherical wave fronts in the near field. The distance at which the inverse square law starts to apply to lasers depends on beam width, coherence, and focus.
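
The conventional boundary between those regimes is the Fraunhofer distance, 2D²/λ. A quick sketch with assumed antenna sizes (illustrative round numbers):

```python
C = 299_792_458.0  # speed of light, m/s

def fraunhofer_distance(aperture_m: float, freq_hz: float) -> float:
    """Conventional far-field boundary 2*D^2/lambda for an antenna (or
    array) of largest dimension D; beyond it, wavefronts are effectively
    spherical and the inverse square law applies."""
    wavelength = C / freq_hz
    return 2 * aperture_m ** 2 / wavelength

# Assumed examples: a 10 cm phased array at 28 GHz vs. a 2 cm antenna at 2.6 GHz
print(f"{fraunhofer_distance(0.10, 28e9):.2f} m")        # a couple of metres
print(f"{fraunhofer_distance(0.02, 2.6e9) * 100:.2f} cm")  # under a centimetre
```

So for mmWave arrays the non-inverse-square near field extends only a few metres; past that, spherical spreading takes over.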

The inverse square law always applies - hence the "law" part.

Laws of Physics are in general not always applicable. Ohm's Law, for instance, is only applicable in a narrow range of conditions.

"Laws" tend to be less rigorous and more narrowly applicable than "theories," though the colloquial meaning of the terms is switched.

Yes, except in this case there isn't an exception.

Incorrect analysis can give the correct answer.

There are exceptions to many inverse square laws; they just aren't relevant to the problem at hand.

Actually the difference between law and theory is nothing whatsoever.

There is a substantial difference in my opinion. Laws rarely put forward a mechanism or seek a complete understanding of a relationship, while theories often do.

Ohm's law was a law long before the electron was discovered. It's an empirical linear relationship that just seemed to work. Since no mechanism is suggested, it's not possible to determine whether ohmic behavior is expected or not. In contrast, the Drude model's explanation of ohmic behavior could be considered a theory. It makes an attempt to understand the underlying physics, and from this it is possible to predict whether a material will follow Ohm's law or not.

Have you noticed that nothing is called a law anymore? It's a difference in word choice not meaning.

I think modern scientists are a lot less likely to call any empirical relationship a law. Theory is still pretty widely used, and not for things that could be called laws.

You can say the "the theory of quantum chromodynamics," but not "the law of quantum chromodynamics". 'Law' implies a small number of simple equations, while theory allows for a much larger scope of complexity or rigor. The meaning is absolutely different. Modern science is pretty complex compared to what went on 200 years ago, so it's not surprising law has fallen out of favor.

> it's not like second-hand smoking

Shouldn't smoke follow the inverse square law as well (in the absence of air currents)?

No. Smoke is heavily influenced by ventilation and the enclosure. There's a fair amount of modelling in the literature, for example on how much airflow you would need in confined spaces to reduce secondhand-smoke cancer risks to acceptable levels.

Not if you are in a sealed room with the smoker.

> Shouldn't smoke follow the inverse square law as well (in the absence of air currents)?

That's why smoking is allowed outdoors.

(Except when it's not.)

It's not really uncommon to see "no smoking on the patio" or "no smoking within 100 feet of this door" signs. And some states have legalized smoking cannabis, except outdoors.

Patios are usually very close to doors. Is the threshold really 100 feet in some places?

I've seen such limits imposed on private property, but as far as I know there aren't any laws (specifically for cigarette smoke) that extreme.

Don't get me wrong, I'm not a smoker and never have been. I like these laws. I'm just being a bit pedantic. There has been a general trend of restricting smoking and I think that trend has continued outdoors. It used to be that you could smoke anywhere, then businesses started creating indoor smoking sections. Then indoor smoking sections were banned and smokers moved to the outdoor seating. Then smoking outside near exterior doors was banned, in a way that effectively banned smoking at many restaurants entirely. Beyond this, in some places like NYC you have smoking bans in public parks and beaches as well, regardless of how far you are from any exterior door. (To be clear, I support these bans because cigarette smokers are notorious for their litter.)

And in the case of cannabis the restrictions are even more severe. In Washington you cannot smoke cannabis in view of the general public or in most buildings (except residential, although many apartment buildings have smoking bans too.) California is more permissive, but even they enforce a 1000(!) foot smoking ban around schools and youth centers.

But not everyone eats barbecued meat, or has it near their body 24/7 the way people who use cell phones do. I can opt out of eating barbecued meat far more easily than out of carrying a phone close to my body.

Fortunately/unfortunately some of us live in Kansas City and have no choice in the matter.

Regarding 5G or barbecued meat?

Meat barbecued with 5G

As I type this across from Jack Stack. Korean BBQ is better anyway.

> For example, eating barbecued meat causes cancer, but it we don't ban it.

Sure, but the absolute risk is very low, and it's probably zero if you add veggies to your meal. Also, causality was not established: observational studies on this topic suffer from the healthy-user bias in addition to a lot of confounding factors, and in general the evidence in nutrition is very low quality compared with medication research.

This isn't a boolean. Saying that "this or that causes cancer" is meaningless without giving the risk factor, which has to be statistically significant and in case of meat it isn't.

In other words if you want to compare the risks of 5G with anything, I think nutrition is a really poor choice due to the low standards for evidence we have and due to all the confounding factors and biases.

Not sure if assessing the safety of 5G is easier, but at the very least you can do double-blind studies. At the very least you can compare it with the placebo effect. Which in nutrition isn't possible.

Cigarettes and alcohol too. Radiowave effects, even if they do cause anything, are quite easy to mitigate in humans by wearing clothing with carefully positioned electrical conductor insertions.

What about the head? Is it time for tinfoil hats?

Working in a Faraday cage seems more appealing.

"Causes" implies that you will get cancer if you partake. Maybe you mean "increases the risk of". Agreed that your risk exposure might only be fractions of a percent and not worth abstaining or regulating.

I find your certainty regarding the reasons we have increasing incidents of cancer to seriously detract from your previous points.

Most people in the cancer field (I worked in a tangential one) believe that the rising incidence of cancer is due to people living longer (cancer rates go up a lot as you age) and not dying of other diseases first. In a sense, cancer is what you get after you've solved the other problems.

"Most people in the cancer field" seem to be pretty misinformed then. See [1] and [2]. At a minimum, health/diet habits have increased cancer risk in young people over the last 20 years, and probably other environmental factors/exposures as well.

[1] https://www.thelancet.com/journals/lanpub/article/PIIS2468-2...

[2] https://www.cancerresearchuk.org/health-professional/cancer-...

Sorry, I think you misunderstood. I believe that lifestyle effects do influence cancer rates, but I don't think they influence them as much as the effects I mentioned. Note that in the papers you cite, the increase in relative risk is actually fairly small.

Ok in that case I agree. I think it’s important to recognize that lifestyle and environmental factors are a contributor to cancer rates across age groups. Thanks for clarifying.

That reminds me of one of the cell phone radiation studies that I see get bandied about sometimes. People that are skeptical of cell phones like mentioning it because they found that rats exposed to massive amounts of 2G and 3G signals were more likely to die of cancer. However, what the study also found is that the rats exposed to the signals actually lived longer than the control group.

If cell phones give me cancer because they make me live longer, then maybe I should start carrying around a second one.


I agree - and you're probably right but I think we should hold ourselves to the same standards of evidence we expect of others.

Based on nothing scientific, I chalk up the increase in cancer to contaminants in our water, whether it's well or city water, and the introduction of petrochemicals/plastics.

"And nobody knows why."

Not only that but the article also leans on the somewhat infamous 'rat study', which produced hugely misleading findings, as explained in a good writeup by John Timmer here (https://arstechnica.com/science/2018/03/a-critical-analysis-...)

> "Here's a large number of studies saying X" is meaningless in a topic where a very large number of studies have been done. What you need is systematic reviews of the literature that not only count studies, but evaluate their quality and combine their results.

The sixth link in the article will get you what you want: https://www.sciencedirect.com/science/article/pii/S138357421...

"The present review - of results published by my group from 2006 until 2016"

Self-evaluation of quality is not quite a systematic review of the literature. Neither is research on Drosophila melanogaster quite in the same ballpark as research on Homo sapiens.

Thank you. The sheer volume of blatantly false comments on this thread is astounding to me.

I haven't read the article, and I'm not a party to the 5G debate, but what you're describing in your comment is a meta-analysis, which is a practice that has produced some of the most controversial findings in recent times. If anything is needed to clear up an inconclusive body of studies, it's better and more reliable studies and experiments, not meta-analyses.

Meta-analyses try to find the unknown common truth by reviewing multiple studies and their methods and weighting them according to their perceived (really, calculated) quality.

By combining the results of the best studies and giving us an overview of them, meta-analyses really are the best studies we have.

"In addition to providing an estimate of the unknown common truth, meta-analysis has the capacity to contrast results from different studies and identify patterns among study results, sources of disagreement among those results, or other interesting relationships that may come to light in the context of multiple studies." - https://en.wikipedia.org/wiki/Meta-analysis

Meta-analysis is good if you assume no publication bias. The problem is that most of the time only studies with successful results get published, and within those studies only the successful (or at least interesting) parts are reported. You can use meta-analysis only in areas that have pre-registered trials, and perhaps even that is too optimistic.
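
A toy simulation of that failure mode (all parameters invented): the true effect is exactly zero, but if only "significant" results get published, the published literature averages to a sizable effect.

```python
import math
import random
import statistics

def published_mean_effect(n_studies=2000, n=30, seed=7):
    """Simulate studies of a true-zero effect and 'publish' only those
    with |z| > 1.96; return the mean absolute published effect size."""
    rng = random.Random(seed)
    published = []
    for _ in range(n_studies):
        sample = [rng.gauss(0, 1) for _ in range(n)]
        effect = statistics.mean(sample)
        z = effect * math.sqrt(n)  # standard error of the mean is 1/sqrt(n)
        if abs(z) > 1.96:          # only 'significant' results see print
            published.append(abs(effect))
    return statistics.mean(published)

print(published_mean_effect())  # far from the true effect of 0
```

A meta-analysis that pools only what got published would confidently report this phantom effect, which is exactly why pre-registration matters.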

Isn't meta-analysis necessary and critical today?

With a potentially large # of studies on a specific subject, some of which may be contradicting, meta-analysis will be done.

Whether by accredited scientists, or institutions who do this on a professional basis with open and established criteria and consistency in approach (though there may still be flaws in the approach); and/or by individuals when confronted with large # of studies, when I feel bias would almost certainly be the primary driver.

While you absolutely cannot eliminate the latter, it feels it would be foolish to dismiss or eliminate the former.

As long as:

a) World is a bit tricky (phenomena difficult to define and isolate; limitations in equipment and measurements; limitations in humans)

b) We use statistical analysis as the primary differentiator of significance

...there will continue to be contradicting studies.

Not disagreeing it's always better to improve studies and experiments, but it feels like meta-analysis of multiple studies is not just a necessary evil, but a good and desirable next step.


Why not do the study? The phycologists have, and they realized their field of study was built on sand. No amount of meta-analysis will fix that.

>> Why not do the study?

Hmm; I don't think we fully understand each other.

Absolutely do the study; in fact, do several.

What then, though? A "study of studies", or any method of reconciling or making sense of these multiple studies, is the "meta analysis" in question. It's going to happen, in some way, somehow, by somebody. There is no way to reconcile multiple studies without doing some form of what we are calling meta-analysis. Even if it's just you or me googling it up and then deciding which of the studies to trust, that's a "meta-analysis".

My claim, therefore, is that it's far better to acknowledge this need and reality, and have dedicated teams of experienced professionals do it with an open, consistent method, than for each of us to mentally, subconsciously, and with bias pick, choose, and prioritize the studies.

[in the end, proper meta-analysis, and study replication, are the methods of how we discover inconsistency, contradiction, data issues, or other problems, whether in psychology or other domains. It's not a method of "fixing something", but a method of collecting, analyzing, and reporting on multiple studies and data points]

Phycology, the study of seaweed, is indeed all on top of sand! ;)

Doh; and here I completely misunderstood the underlying message... thx for setting me straight! :-)

haha, good catch. ill keep it there

It's... more complicated than that.

Yeah, a meta-analysis is only as good as its inputs. Unfortunately, many meta-analyses read like "we found 20 studies on the topic, but 19 of them are crap and the one good study didn't really ask the question we're asking, so we really don't know." And yes, there's considerable wiggle room in a meta-analysis. But I don't think you'll find any serious scientist doubting the usefulness of a meta-analysis per se.

I knew other grad students who'd say, "oh, it's a meta-analysis," and then toss it in the bin. But in recent years I've met people who worked on boards for cities and hospitals, where they dug through a lot of papers and wrote detailed meta-analyses of the larger body of work. They'd comment on things that seemed consistent and inconsistent, and tried to help others make well-informed decisions.

With a meta-analysis, you do have to look at the whole thing. Phrases like "when controlled for..." are tricky: you can't control for some things across 10 different studies. You need to read how they attempted to control for things and why they came to their conclusions.

They are still valuable and you shouldn't just chuck them in the bin.

That's criticizing a weak interpretation of my point and seems to conclude I'm rejecting the use of meta-analyses entirely, which is not the case. The point I make is in the context of the suggestion that a meta-analysis of an inconclusive body of studies is sufficient to yield substantially stronger conclusions.

Can you give some examples of the controversial findings of meta-analysis?

In 2010, a meta-analysis seeking to reevaluate the scientific evidence behind the prevailing view that dietary saturated fats are tied to adverse effects on cardiovascular health was published.[1]

The study's conclusion, "a meta-analysis of prospective epidemiologic [sic] studies showed that there is no significant evidence for concluding that dietary saturated fat is associated with an increased risk of CHD or CVD", unleashed a wave of news reports claiming dietary saturated fats were no longer considered harmful, further tilling the grounds for a movement urging the increase in consumption of saturated fats for added health benefits.

The 2010 meta-analysis has been criticized here: https://academic.oup.com/ajcn/article/91/3/497/4597072

Again, I'm not a party to this particular debate on dietary saturated fats, but I am critical of the suggestion that a meta-analysis of an inconclusive body of studies will yield an actionable conclusion.

[1] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2824152/

Recently, a meta-analysis called into question the connection between eating red and processed meat and heart failure.

So as to help me best evaluate the validity of the article: what is your background? Are you familiar with that field? That would help me assess the usefulness of your critique of the backgrounds of the people linked to in the article.

> Peer review is a lowest level quality mark for a piece of science.

Peer review used to be the epitome of proof; now it's been relegated to the bare minimum?

Are you saying that consensus is no longer considered valid proof? (not that I disagree, it's just that this is a rare argument in my experience)

Also the World Health Organization (WHO) is cited:


"Lyon, France, May 31, 2011 ‐‐ The WHO/International Agency for Research on Cancer (IARC) has classified radiofrequency electromagnetic fields as possibly carcinogenic to humans (Group 2B), based on an increased risk for glioma, a malignant type of brain cancer1 , associated with wireless phone use."

The "possibly carcinogenic" declaration is quite weak ("evidence far from conclusive").

Even if it was "carcinogenic to humans", there's a lot of things that are vital to modern society which fall into that category. It's then up to the regulators to guess an "acceptable" limit of exposure.

“One of the painful things about our time is that those who feel certainty are stupid, and those with any imagination and understanding are filled with doubt and indecision.” seems appropriate here...
