I wonder if a good way to illustrate or dispel the harmfulness of millimeter-wave radio would be to compare the wattage per square foot involved. How much energy is carried by these waves compared to 900 MHz or 2.4 GHz radio?
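One rough way to frame that comparison: the power density at a given distance depends on the radiated power and the distance, not on the carrier frequency (frequency mostly affects penetration depth and antenna aperture, and even mmWave photons are far below ionizing energies). Here's a minimal sketch; the EIRP figures and distances are assumptions for illustration, not measured values:

    import math

    def power_density_w_per_sqft(eirp_watts, distance_m):
        """Free-space power density S = EIRP / (4*pi*d^2), converted to W per sq ft."""
        s_w_per_m2 = eirp_watts / (4 * math.pi * distance_m ** 2)
        return s_w_per_m2 * 0.092903  # 1 sq ft = 0.092903 m^2

    # Assumed, illustrative numbers only:
    # - a 2.4 GHz Wi-Fi router at ~1 W EIRP, 5 m away
    # - a mmWave small cell at ~10 W EIRP, ~150 m (~500 ft) away
    print(power_density_w_per_sqft(1.0, 5))     # ~3e-4 W/sq ft
    print(power_density_w_per_sqft(10.0, 150))  # ~3e-6 W/sq ft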
Besides the 'boy who cried wolf' nature of claims that radio causes cancer, I'd also like to know the power consumption for the extreme density of towers required to relay gigabit+ connections across town. I think I heard the range between towers is something like 500 ft? These are, presumably, not low-power devices, given all the connections they have to process.
> “The exposures used in the studies cannot be compared directly to the exposure that humans experience when using a cell phone,” said John Bucher, Ph.D., NTP senior scientist. “In our studies, rats and mice received radio frequency radiation across their whole bodies. By contrast, people are mostly exposed in specific local tissues close to where they hold the phone. In addition, the exposure levels and durations in our studies were greater than what people experience.”
> The lowest exposure level used in the studies was equal to the maximum local tissue exposure currently allowed for cell phone users. This power level rarely occurs with typical cell phone use. The highest exposure level in the studies was four times higher than the maximum power level permitted.
> The RFR exposure was intermittent, 10 minutes on and 10 minutes off, totaling about nine hours each day. RFR levels ranged from 1.5-6 watts per kilogram in rats, and 2.5-10 watts per kilogram in mice.
Yeah, they really cooked those critters. Takeaway: if you work on 200W transmitters, don't sit on them all day every day.
Meanwhile, I'd like to keep using the 1W ~0% duty cycle transmitter in my phone, thankyouverymuch.
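For a rough sense of scale (keeping the quote's caveat that whole-body and local-tissue exposure aren't directly comparable): the animals absorbed up to 6 W per kg of body mass for about nine hours of on-time a day, while a phone at the regulatory limit deposits ~1.6 W/kg only in a small patch of tissue and only while transmitting at full power. A back-of-the-envelope sketch, where the 30 minutes of heavy use per day is an assumed figure:

    # Daily absorbed energy per kg of exposed tissue (very rough comparison).
    study_sar = 6.0       # W/kg, highest whole-body level used in the rat study
    study_hours_on = 9    # ~9 hours of on-time per day, per the quote
    rat_dose = study_sar * study_hours_on * 3600         # J/kg, whole body

    phone_sar = 1.6       # W/kg, FCC local-tissue limit (worst case, rarely reached)
    phone_hours = 0.5     # assumed: 30 min/day of heavy transmit near the head
    phone_dose = phone_sar * phone_hours * 3600          # J/kg, small local region only

    print(f"Study: ~{rat_dose:.0f} J/kg/day, whole body")    # ~194,400
    print(f"Phone: ~{phone_dose:.0f} J/kg/day, localized")   # ~2,880, roughly 1.5% of the above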
> The lowest exposure level used in the studies was equal to the maximum local tissue exposure currently allowed for cell phone users. This power level rarely occurs with typical cell phone use. The highest exposure level in the studies was four times higher than the maximum power level permitted.
Does this apply in trains or public buildings where multiple phones are present, possibly hundreds of them? 4x seems like a rather small margin for harm, considering that nearly every person is carrying a device, and everyone using them at the same time happens multiple times a day.
Also bear in mind that this obeys the inverse-square law: a phone being used by somebody a couple of meters away from you is going to give you massively less exposure than your own phone pressed directly against your ear.
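To put numbers on that: if we assume your own phone sits roughly 2 cm from your head and someone else's is 2 m away, the free-space power-density ratio is (200/2)^2 = 10,000x, before even counting the other phone's power control turning its transmit power down. A quick sketch (the distances are assumptions for illustration):

    # Relative exposure under the inverse-square law (free-space assumption).
    own_phone_cm = 2       # assumed: own phone held ~2 cm from the head
    other_phone_cm = 200   # assumed: someone else's phone ~2 m away

    ratio = (other_phone_cm / own_phone_cm) ** 2
    print(f"Own phone delivers ~{ratio:.0f}x the power density of the distant one")
    # -> ~10000x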
Not really; there has always been some kind of separation between the phones when they send data.
Most 2G systems used TDMA, so the phones inside a cell split the time between them, e.g. one phone could send almost all the time, but 10 phones would each send 1/10th of the time.
3G and later used WCDMA for the radio (as some US systems did even in 2G), where spreading the data over a wide frequency span lets the signals mix in the air while lowering the power needed to get each signal across. With WCDMA you can transmit at a power below the background noise floor and still get the signal through (see the sketch below).
Lowering the radio transmit power is a big goal, since sending radio signals drains the battery. You can see this effect if you travel in areas with bad reception: your phone battery drains significantly faster. The best way to lower radio exposure is to have lots of radio towers close to people. The alarmists scaring people about cell networks have made it harder to build enough towers, and through their fear of cell-radio transmission they actually increase the energy transmitted by the phones in their own area.
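The "below the noise floor" part comes from processing gain: the receiver despreads the wideband signal back down to the narrow data bandwidth, which concentrates the signal energy relative to the noise. A minimal sketch using the standard WCDMA chip rate (3.84 Mcps) and a 12.2 kbps voice bearer as the assumed example:

    import math

    chip_rate = 3.84e6   # WCDMA chip rate, chips/s
    bit_rate = 12.2e3    # assumed example: AMR 12.2 kbps voice bearer

    # Processing gain: SNR improvement obtained by despreading at the receiver.
    processing_gain_db = 10 * math.log10(chip_rate / bit_rate)
    print(f"Processing gain: {processing_gain_db:.1f} dB")  # ~25 dB

    # If the decoder needs, say, ~5 dB Eb/N0 after despreading (assumed figure),
    # the on-air signal can sit roughly 25 - 5 = 20 dB below the noise floor.
    required_ebno_db = 5.0
    print(f"Signal can be ~{processing_gain_db - required_ebno_db:.0f} dB below the noise")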
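And the "more towers, lower phone power" point follows from path loss: under a free-space assumption, the transmit power a phone needs for a given received signal level scales with the square of the distance to the tower, so halving the distance cuts the required power to roughly a quarter (real-world propagation is usually worse than free space, which only strengthens the effect). A rough sketch with assumed distances:

    # Required phone transmit power scales ~d^2 in free space (worse in practice).
    near_tower_m = 250    # assumed: dense network, tower 250 m away
    far_tower_m = 2000    # assumed: sparse network, tower 2 km away

    power_ratio = (far_tower_m / near_tower_m) ** 2
    print(f"Sparse network needs ~{power_ratio:.0f}x the phone transmit power")  # ~64x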