Observing my cellphone switch towers (fabiensanglard.net)
800 points by ingve on May 16, 2021 | 132 comments



At this one industrial customer I visit every week for a full day, my Verizon cell phone's date/time will suddenly change back by 19 years, six months, and some days (I don't have the exact value). This has been going on for almost a year. My pet theory is that law enforcement turned on a stingray nearby back in 2020, it came up with a default date of 1/1/2001, and it has been running ever since. Steel building with warehouse plus office space. No one in the building is running one of those cell phone boosters that plug into customer internet and act as a mini tower. My newish Pixel 4 and my previous Moto phone both demonstrate the problem. I don't have the fortitude to call Verizon and work through support to get to someone who would understand the problem. Has this happened to anyone else? Any other theories on what it is? (My workaround is to turn off updating date and time from the network.)


It is not uncommon for entire towers or basebands to have invalid time. Some of the reasons I've encountered personally:

- Badly configured basebands (for example, a missing summer/winter time change flag!)

- Broken PTP

- Stations that take time from an NTP server that does not exist or that they have no route to.

- I don't know the name for it, but getting time from the local NodeB is a feature - and sometimes features are not active or licensed[1].

- Stations set up with GPS instead of PTP/NTP (say, for 2G or some slower HSDPA modes) where the GPS module itself is broken.

- Software errors [basebands have multiple versions of software; the same goes for APGs, SCUs, RRUs and a dozen other black boxes. Even the cabinet itself has some software. It is a big pain in the ass to keep them all updated in a large network - the upgrade process can be disruptive and sometimes it is better not to upgrade (cases of some really wacky stations)]...

- Problems with software in terms of PTP implementation. For example, there are some discrepancies between Huawei and Ericsson software when it comes to PTP.

[1]. This is a big shocker for the Open Source Software community:

- EVERY SINGLE FEATURE in a base station is not only NOT free, but telecoms need to pay for them every single month [all manufacturers: Nokia, Ericsson, Huawei do this]. We are talking about 100-200 licenses for various features per site alone. If you don't buy a new license every 28 days, your base station will turn off and there's nothing you can do.

- The hardware provider can also, in theory, refuse to give you new licenses...

- This is the real hell of the SaaS world that NO ONE is talking about.


Yep, I complained and they fixed it (the mobile operator or the stingray... I don't know who caused the issue, or whether there even was a stingray or just a misconfigured base station).

The problem I had is that the time in my case was in the future, so when I called someone (person A) from within that cell, the call was logged as made sometime in 2025... This wouldn't bother me in general, but when you use the redial function in your headset (double tap on the button), the phone calls the last dialed number - which means max(timestamp), not the actual last call - so instead of ringing the last person dialed (e.g. person B), it would redial the person A I had called from the affected cell. After misdialing them by accident a couple of times, I had to manually remove that call from the call log to get redial to work again... until I entered that cell again and called someone from there.


A guy on a team that I worked on detected one of those devices and called the university police (we were a tenant in a university building), who laughed, and then proceeded to methodically and politely walk the hierarchy of agencies with jurisdiction (sheriff, city police, state police, FBI field office, FCC, etc). I believe he called a member of Congress as well.

Eventually somebody figured out that he wasn’t going away, and a couple of very pleasant guys popped in and dropped off their business cards. They told him it wouldn’t be a problem anymore and to give them a call if it became a problem again.

End of the day, if someone is creating interference, they need to stop. If you have the time and inclination, complain and they will.


> A guy on a team that I worked on detected one of those devices and called the university police (we were a tenant in a university building), who laughed, and then proceeded to methodically and politely walk the hierarchy of agencies with jurisdiction (sheriff, city police, state police, FBI field office, FCC, etc).

Since it took me a minute to parse: the guy on the team, not the university police, was the one who walked the hierarchy?


> ... a couple of very pleasant guys popped in and dropped off their business cards. They told him it wouldn’t be a problem anymore ...

> End of the day, if someone is creating interference, they need to stop.

I'm not sure which way to interpret this, your username might check out after all.


Could it possibly be because the time is being set from GPS, somewhere in the stack? The 19-and-a-half year offset is spot on.

https://en.wikipedia.org/wiki/GPS_week_number_rollover
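
For concreteness, the arithmetic behind that theory (a quick sketch, not from the article): the legacy GPS week counter is 10 bits, so it wraps every 1024 weeks.

    // Quick arithmetic check of the GPS week-number rollover theory above.
    fun main() {
        val weeks = 1 shl 10                 // 10-bit week field: 1024 weeks per rollover
        val days = weeks * 7                 // 7168 days
        val years = days / 365.25            // ~19.6 years
        println("$weeks weeks = $days days ≈ %.2f years".format(years))
        // ~19 years and 7-8 months, which roughly matches the "19 years, six
        // months and some days" offset described upthread.
    }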


If so, according to the link, the phone needs to be from 2013 or older, or an iPhone from 2012 or before. That would give GP an award for no unnecessary upgrades :-)


I think they mean at the cellular network level (or stingray,) not the phone itself.


It could be a cell tower that gets its time from GPS (incorrectly)


Yes, a similar thing used to happen to me just at this one particular bar in Chelsea. But it wasn't off by 19 years, it was off by just an hour, which is worse in a way. That made for some confusing drunken time warps until I figured out the pattern...


I was in a rental vacation house at the southeastern end of Lake Michigan three years ago. It was a pretty big house, and as you walked from one end to the other, the time zone would change. Really messed with me until I figured that out.


Antelope Canyon is a popular tourist attraction in Arizona.

During the summer Utah is on daylight savings time, Arizona does NOT observe daylight savings time but the Navajo Nation within Arizona DOES observe daylight savings time.

Antelope Canyon is on Navajo Nation land and requires a tour guide from a licensed tour guide company. The canyon and tour guides DO observe daylight savings time.

The canyon is only about 5 miles from the Utah border and many people are traveling from Utah. Because of all the competing cell phone towers you end up with people that show up at the wrong time.

I purposely brought an old school watch for my trip.


I recall that in Nevada (Pacific time), there's one city right near the Utah border that observes Mountain time. I don't think my electronics figured it out.

edit: it is West Wendover NV

https://en.wikipedia.org/wiki/West_Wendover,_Nevada


I've been there. It's a bunch of casinos literally a foot over the state border. It's less than 2 hours west of Salt Lake City. Wendover is a tiny town in Utah but West Wendover in Nevada is completely different just a foot over the border.

They probably want to stay on Mountain time because 99% of their visitors/customers are from Salt Lake City, which is on Mountain time, and their TV stations probably also come from Salt Lake City.


I once ran an A/B time-of-send test for our readers where I analyzed the results by local time. Timezones and DST made it a painful experience.


Something similar has also happened to me. In my case, the symptom was that the TOTP 2FA to systems at work started to fail. It took me some time to find that the cause was an offset of around one minute between my phone and the correct (NTP-synchronized) time as shown by my desktop. Setting the phone to not synchronize its date/time to the network (and correcting its time) fixed it, but it's annoying since without the network synchronization, the time slowly drifts. My theory is that there's one single tower that does not have its time correctly synchronized, and that annoyingly it's the "best" tower when I'm sitting at my desk.

Since then, I've not trusted that automatic network time adjustment, and I leave it disabled. Periodically, I check if it's drifted too much, and if it has, I quickly enable and then disable the network time adjustment (in a place far from the misconfigured tower).
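
For anyone curious why a ~60 second skew is enough to break TOTP: codes are derived from the Unix time divided into 30-second steps, and verifiers typically only accept the current step plus one step on either side. A minimal RFC 6238 sketch (HMAC-SHA1, 6 digits; the secret below is the RFC test value, not anything real):

    import java.nio.ByteBuffer
    import javax.crypto.Mac
    import javax.crypto.spec.SecretKeySpec

    // Minimal RFC 6238 TOTP sketch: HMAC-SHA1 over the 30 s time-step counter,
    // dynamic truncation, 6 decimal digits.
    fun totp(secret: ByteArray, unixSeconds: Long, stepSeconds: Long = 30): String {
        val counter = ByteBuffer.allocate(8).putLong(unixSeconds / stepSeconds).array()
        val mac = Mac.getInstance("HmacSHA1")
        mac.init(SecretKeySpec(secret, "HmacSHA1"))
        val hash = mac.doFinal(counter)
        val offset = hash.last().toInt() and 0x0f
        val code = ((hash[offset].toInt() and 0x7f) shl 24) or
                   ((hash[offset + 1].toInt() and 0xff) shl 16) or
                   ((hash[offset + 2].toInt() and 0xff) shl 8) or
                   (hash[offset + 3].toInt() and 0xff)
        return "%06d".format(code % 1_000_000)
    }

    fun main() {
        val secret = "12345678901234567890".toByteArray()   // RFC 6238 test secret
        val now = System.currentTimeMillis() / 1000
        println("correct clock:     ${totp(secret, now)}")
        println("phone 60 s behind: ${totp(secret, now - 60)}")  // usually a different code
    }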


You can disable automatic synchronization from the BTS and install an Android application for NTP time synchronization (there are surely plenty of them). Then it's a win-win.


It happened to me: one hour off, related to DST. I missed a meeting because of it. Since then I disable "automatic time from network" on my new phones and wear a Casio watch. I thought the tower was misconfigured; why do you suppose it's a stingray?


Time-sync is important for mobile operators (there are standards to keep the towers in sync with the correct time) - and a tower out of sync would be noticed.

But not if it is a stingray as that's not managed by the network.


There is a simpler explanation: I once worked at an (industrial + offices) site of a high-tech company, and they had their own private cell tower, with phones that only worked on the site; maybe they forgot to update the time. IIRC, small base stations have been deployed at hacker events (CCC) with similar internal-only features.


Happened to me too, but maybe worse: two minutes behind.

This was when SMS was still widely used, so it made for very messed-up chats, since received messages carried the correct timestamp.


On a related note, this same mechanism can be used as a fallback for GPS to determine device location.

I have worked on an IoT device with a built-in SIM in the personal alarm space. Often when a device is indoors no GPS fix can be found, but the GSM module can report all cell tower IDs in range with signal strengths. These can be used to triangulate the device location. Google has an API for this where you can pass cell tower IDs and signal strengths.

edit: and of course similar triangulation fallbacks/methods can be applied to WiFi APs or Bluetooth beacons, if you have a database of APs or beacons and their locations. This database can be externally sourced but also built automatically by sending cell tower IDs/APs/beacons along with GPS fixes. I assume this is one of the many things Google is doing with their fleet of Street View cars.
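
For illustration, a hedged sketch of what a request to Google's Geolocation API can look like (the API key, tower values, and exact field set below are placeholders; what your module reports will differ):

    import java.net.HttpURLConnection
    import java.net.URL

    // Sketch of a Geolocation API call: POST the observed cell towers and read
    // back an estimated {lat, lng, accuracy}. YOUR_API_KEY and the tower values
    // are placeholders, not real data.
    fun main() {
        val body = """
            {"cellTowers": [
              {"cellId": 42, "locationAreaCode": 415,
               "mobileCountryCode": 310, "mobileNetworkCode": 410,
               "signalStrength": -60}
            ]}
        """.trimIndent()

        val url = URL("https://www.googleapis.com/geolocation/v1/geolocate?key=YOUR_API_KEY")
        val conn = url.openConnection() as HttpURLConnection
        conn.requestMethod = "POST"
        conn.doOutput = true
        conn.setRequestProperty("Content-Type", "application/json")
        conn.outputStream.use { it.write(body.toByteArray()) }

        // On success the response is JSON like {"location":{"lat":..,"lng":..},"accuracy":..}
        println(conn.inputStream.bufferedReader().readText())
    }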


I believe iOS may not bother involving the GPS if the application requests (or is only granted in iOS 14) approximate location, and there are enough towers in-range to triangulate. Not having to enable the GPS subsystems and get a fix saves a fair amount of battery.


It's the same on Android - there's an "eco" mode of the location subsystem which will only use "passive" location indicators like WiFi and cell towers. In the "high accuracy" mode, GPS will be activated, but until it locks onto enough satellites, passive location is used as a fallback/best guess.


Passive isn't that. My understanding is that it's piggybacking on other apps' location requests if there are any.


I might be wrong, but I think it's used for both. *EDIT:* I've looked around and I must've been misremembering - the term "passive" is only used in the official docs when talking about passive location updates (there called the passive "provider", since to the app, it looks like one). The non-GPS category of providers (wifi, bt, cell) is just called "network".

~~Passive location requests are, as you say, when an app wants location updates, but doesn't care when and how often - when one makes an active request, all the passive listeners get called.

Passive location sources are methods of getting your location with the data already available - so WiFi if it's on and cellular if the modem is on.~~


Originally, that's what the distinction between the two location permissions on Android was about. ACCESS_COARSE_LOCATION was based on cell ids and wifi networks, and ACCESS_FINE_LOCATION used GPS. In modern Android versions I believe this is no longer true as it uses all the sources either way but intentionally degrades accuracy for apps that only have ACCESS_COARSE_LOCATION.


> I assume this is one of the many things Google is doing with their fleet of Streetview cars.

...and Google services-enabled Android smartphones.

Source: I have used a single access point in many locations across the country in a relatively short period of time. When there's no GPS fix, Google always thinks I'm where I just was. (And there are no Street View cars in some of these locations.)


If an access point moves more than a few times per year, it will be blacklisted from Google's location database as unreliable.


I wish they would do that a bit more preemptively. Maybe things have improved, but sometimes when I went to an event where the WiFi was temporarily installed, I was instantaneously transported hundreds of miles, presumably to the WiFi-providing company's previous job.

It should have been able to figure out that this was nonsense based on the unfeasible speed of travel, or from the fact that the area covered by the cell I was in did not contain the WiFi location, or probably from other WiFi networks in the area that hadn't moved.


This AP definitely moved a few times per year, so it must have skirted just below that threshold, then.


Or your WiFi MAC gets tagged by a Street View car at one location, you move, and Google still thinks you're at the old location. I had that; they even have a way to feed corrections back, IIRC, and I was able to get it updated without waiting for another drive-by. Note this was over ten years ago and WiFi density has increased.


It can still easily happen in 2021 simply by moving an existing WiFi AP to a new location where the WiFi space isn’t crowded.


This is also how the original iPhone + iPod Touch got your location without any GPS hardware, so you could see your rough current location on the map after one of the early updates.


I'm doing the same right now. My cheap SoC has no GPS module (which would also need too much power), but I can trivially get the TAC + cell ID from the AT+CEREG command, which I have to issue anyway to check the network connection (I have no roaming with NB-IoT). From that you can externally get the lat/long GPS values via the OpenCellID database. Their queries are protected, but you can also get the source data as CSV. I didn't know about the Google API, but it's trivial to build something like this yourself.

In my case I don't even need the GPS coords; the cell ID is enough, as my devices travel on fixed lines through various countries. For cars or buses you would need it though.
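
As a rough illustration, pulling the TAC and cell ID out of a typical AT+CEREG? reply might look like this (the reply string is a made-up example; exact formats vary per modem and per the configured reporting mode):

    // Sketch: parse a typical "+CEREG: <n>,<stat>,"<tac>","<ci>",<AcT>" reply,
    // where <tac> and <ci> are hex strings. Not taken from any specific modem manual.
    fun main() {
        val reply = "+CEREG: 2,1,\"4E20\",\"0012D687\",9"   // hypothetical modem output
        val regex = Regex("\\+CEREG: \\d+,(\\d+),\"([0-9A-Fa-f]+)\",\"([0-9A-Fa-f]+)\"")
        val match = regex.find(reply) ?: return
        val (stat, tacHex, ciHex) = match.destructured
        val registered = stat == "1" || stat == "5"         // 1 = home, 5 = roaming
        println("registered=$registered tac=${tacHex.toInt(16)} ci=${ciHex.toLong(16)}")
        // The (tac, ci) pair can then be looked up in OpenCellID to get a lat/long.
    }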


Isn't there a specific GSM command or whatever they're called to request location from the provider? I remember seeing it in a GSM module datasheet at some point, but I never tried it.


Modem specific. In my case, AT+CEREG? in modes higher than 1 will tell you the connection status, TAC, and CI.


> I have worked on an IoT device with builtin SIM in the personal alarm space

Are there any resources for that project? Commercial or free?


It was a commercial project for a client I cannot link. Nothing exotic though. Just a little smartwatch-type device with soc/mcu/gsm/gps, 1 button and a led display.

It only had 2G, which is already retired in some countries, and I don't know if they're doing a 3/4/5G refresh.

A fun space to work in though. A space where a small bit of tech can save lives and where tracking is actually helpful.

Edit: but stressful as well. As bugs in the code or platform can lead to people in distress not getting the help they need.


    ”There are no open source LTE stack to learn from”
That is not true: srsRAN is an open source LTE project that has both cell phone and tower versions. It runs on SDR boards like the LimeSDR.


The Magma project is also an OSS LTE (and 5G) project: https://connectivity.fb.com/magma/


This is nicely done, and takes me back to when I worked in mobile network planning.

Most people are blissfully unaware of how much radio chatter and constant adjustment is actually holding up their browsing session from second to second, with the UE negotiating the best possible terms (power, throughput, etc.) with adjacent cell towers before it decides (or is prompted) to hand-over to a new cell.

Optimizing cell hand-over and adjacency matrices is something that operators need to do quite frequently as traffic patterns and network topology change, and it is a computationally neat problem that can be scaled horizontally (I once optimized one such job to run in 4 hours instead of 24).


I learned that the cellular protocol stack wasn't as simple as it seemed when I was debugging my VoIP library under different network conditions. I would go into the *#*#INFO#*#* menu on an Android phone and force it to a particular network type. 2G/EDGE provided just barely enough throughput for me to fit my packets through on my lowest bitrate. But sometimes there were periods when the packets would just stop coming through. This would last for a perceptible amount of time, sometimes even more than a second, but after that, all the packets I sent during that time would come through in a burst. That's how I found out about the RLC protocol that's somewhere in the stack in 2G, 3G, and 4G, and that it has this "acknowledged mode", and that in this mode it does retransmissions if there are bit errors. This also made it very clear why TCP sometimes struggles so much when running over a 2G connection.

Anyway, I wish there was something like Wireshark to which I could connect a phone and see it talk to the cell towers in real time. This would really help understand how this all works.


Um, I'm pretty sure that there is not much of that handover stuff happening while browsing hacker news since that's likely in a low-velocity scenario, i.e., sitting on the toilet.


It can also happen due to other phones in the tower area moving and needing more or less power, which might result in "the system" deciding to move your phone over to another tower to reduce congestion. Even if you are just sitting on the toilet.


You would be surprised. If you're ever using your phone with a single available cell at low signal quality, just about anything will affect your signal... how you hold your phone, which room you're in, which way you're facing, etc.

The ability to switch to a different tower papers over all sorts of moment-to-moment connectivity issues.


Might be sitting on the toilet in a train


>Most people are blissfully unaware of how much radio chatter and constant adjustment is actually holding up their browsing session from second to second, with the UE negotiating the best possible terms (power, throughput, etc.) with adjacent cell towers before it decides (or is prompted) to hand-over to a new cell.

My experience in my college networking class, seeing how complex even a TCP/IP connection is over a reliable wired connection, was enough to make me run screaming from the field as a profession.

The fact that I can get 200 Mbit/s on my phone today still astounds me.

It's an interesting field, but even to me an intimidating one.


There is no decision at the terminal side, and there is no negotiation. The network collects measurement reports from the terminal(s) and decides centrally.


Something like this would make it pretty easy to spot one of those stingray devices, I bet. Cell towers tend to stay put, so mapping out an area would go quickly. Were a new 'tower' to show up in an odd spot -- there may be shenanigans at hand.


That type of exception analysis is exactly how University of Washington researchers do their stingray hunting. It doesn't take much time monitoring to figure out which towers smell funny:

https://seaglass.cs.washington.edu/




> "Were a new 'tower' show up in an odd spot -- there may be shenanigans at hand."

Or maybe it's simply https://en.wikipedia.org/wiki/Mobile_cell_sites


Those will only really show up around large events or during disasters. There's little reason to just put one out there unless there's an event calling for the extra demand or a quick patch of the network coverage.


The GPS location is reported by the tower, so nothing is stopping a MITM device from spoofing that too.


The location is taken from an online database based on cell ID. It's not clear where most of them get their data, but services like WiGLE and Mozilla use crowdsourcing of signal strengths, so they might be a bit off. Not as accurate as the ones in the article, I would think.


In cdma2000, it's an option. I didn't notice such a thing in LTE. (LTE does include some options to determine position by propagation delay between mobile and base station, but they all involve the mobile reporting time measurements to the network, and then the network has the info to compute location).


Edit: totally wrong. Thanks ectopod8!

I'm pretty sure GPS location is derived from signals from GPS satellites and ground stations. You can see this if you're somewhere with no cell signal but have GPS, or vice versa.


The GP is talking about the tower location, not your own location.


"There are no open source LTE stack to learn from" https://osmocom.org/projects


The most popular open source LTE stack is srsLTE (now called srsRAN). https://github.com/srsran/srsRAN


The srsLTE/srsRAN code is also fairly accessible and well written -- a year or so ago I used it as a reference for demodulating LTE downlink signals (in a python notebook).

Also worth calling out ShareTechnote which is basically the personal notebook of a (brilliantly prolific) telecom engineer (now at Apple): https://www.sharetechnote.com/. It's incredibly comprehensive and well linked -- a much easier reference than paging through hundreds of pages of PDFs.


Unfortunately this refers to srs.io for more info, but that domain is not found. So I am left wondering what SRS means.


You want https://www.srsran.com/, but to save you a click: SRS stands for Software Radio Systems.


That website is not the greatest for finding out about the state of the projects, as in, is anyone using this? Are there any notable deployments?

Can someone shed some light on this?


s/LTE stack/mmWave stack ?


There are barely any corporation-backed 5G mmWave stacks, and they're absurdly more complicated than 4G.


Can you shed more light on this? Do you mean in terms of beamforming? And other radio resource management complexities that arise from it?

I worked in 5G mmWave on the UE side, so I can imagine some of the scheduling complexity. However, a basic mmWave stack, like one that might be implemented by OAI or srsRAN, would likely scale the problem down to something more basic.


Yes, pretty much indeed beamforming and scheduling complexity.

I could see srsRAN making a lowband (<6 GHz) impl of 5G, but not the crazily complicated high freqs. At least without some crazy funding.


How do I do this myself now?

I always put my phone in airplane mode when I was on a train, because the modem consumes a lot more energy.

I always assumed, and still do, that it's due to switching towers and weaker connectivity, which means using more power.

Whenever people argue that public transport doesn't need WiFi, I argue against that, due to power usage and also for more accessibility of the internet in general.


I think trains are kind of a special case in telephony when it comes to handoffs. It’s a unique situation where you have hundreds and hundreds of UEs[0] having to detach and re-attach, which makes for a very busy MME[1]. Sometimes those phones are only briefly connected to a given eNB[2] too, which makes all that effort for naught.

[0] Phones

[1] The system which sits immediately behind cell towers and coordinates handoffs

[2] Tower


The canonical solution in train/subway/metro systems seems to be full-length antennas that span the tunnel corridor.

The cool solution would be base stations built into train carriages that then uplink through some kind of backhaul.

I think (but need to properly evaluate) that some Australian trains use something like this. EDIT: Nooope nope nope, horribly misread. https://goughlui.com/2014/03/23/random-post-railcorp-gsm-r-r...


> How do i do this myself now?

As the essay notes, it's through TelephonyManager on Android. On iOS it used to be possible through CoreTelephony private APIs, but I think Apple cracked down on these a few years back and either removed them entirely or locked them down behind entitlements, as they were getting abused by analytics frameworks.
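
A minimal sketch of that TelephonyManager route on Android (not the author's exact code; it assumes ACCESS_FINE_LOCATION is granted, and logCells() is a made-up helper name):

    import android.Manifest
    import android.content.Context
    import android.content.pm.PackageManager
    import android.telephony.CellInfoLte
    import android.telephony.TelephonyManager
    import androidx.core.content.ContextCompat

    // Dump the LTE cells currently visible to the modem via getAllCellInfo().
    // Logging this once per second is enough to watch the serving cell change.
    fun logCells(context: Context) {
        val granted = ContextCompat.checkSelfPermission(
            context, Manifest.permission.ACCESS_FINE_LOCATION
        ) == PackageManager.PERMISSION_GRANTED
        if (!granted) return

        val tm = context.getSystemService(Context.TELEPHONY_SERVICE) as TelephonyManager
        for (info in tm.allCellInfo.orEmpty()) {
            if (info is CellInfoLte) {
                val id = info.cellIdentity            // CellIdentityLte: ci, tac, pci, earfcn
                val signal = info.cellSignalStrength  // CellSignalStrengthLte: dbm
                println("registered=${info.isRegistered} ci=${id.ci} tac=${id.tac} " +
                        "pci=${id.pci} earfcn=${id.earfcn} dbm=${signal.dbm}")
            }
        }
    }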


There are plenty of free Android applications showing you your local cellular info, with CID, LAC, CI, etc.

For example the "Signal Strength" APK. It's just a matter of time until you find one with logging and export ability. You can always write it down with a pencil :)


Naw, this is HN, so it'd be more like: take a screenshot, upload it to some ML tool you cooked up in a weekend that OCRs the data, load it into a NoSQL database with the content presented in the latest fad JS framework, using Lambda functions or some other cloud-hosted CI/CD blue-green deployment to support both users.


Something like Network Cell Info / Net Monitor / G-NetTrack on Android. They actually show tower locations with signal strengths.

What I don't get is why the author didn't draw the signal strength of all towers that were visible. This info is also trivially available.

What public transport needs are microcells, and to hell with wifi.



Btw, this is used with stingrays to trick the phone into a full power boost so it drains the battery very quickly. Useful if you want to encourage the user to go home to charge.

I don't remember the details, but selecting which cell tower is best and deciding how strongly the cell phone needs to boost its signal are two different steps. Or at least they were 10 years ago.


It seems that intentionally draining the target's battery would be counterproductive to law enforcement's goal of tracking them.


But not toward disbanding a protest coordinated by cell phone.


Useful pro-privacy tip from the article:

> According to the LTE specs, cell-towers don't have to perform UE hand-overs like in GSM/UMTS. The phone starts camping on the next tower while remaining in RCC_IDLE mode without emitting data. Not only does this save battery, it also means operators don't really know where the phone is as long as it remains in the same LAC.

And another tip: I wanted to know what "very expensive" means for LTE-related literature, so I searched DuckDuckGo for the first book mentioned (sold on Amazon for $105) and the second result was an OCR'd PDF at huaweicup.ru


The iPhone entries in the table could give a misleading impression of the timing. UMTS 3G phones had long been commonplace by the time the iPhone 1 was announced.


It's also missing pre-HSPA UMTS (384kbps) as a separate entry


How much does the positioning of the phone in the car affect its antenna effectiveness and directionality? I'm astounded that you can "bury" the phone in a pit beside the gear shift with a pile of metal on top (the dashboard) and it still works.

The engineer in me needs to place antennas optimally! Even though the phone works buried in a pit, it still irks me and I want to see it in an optimal placement where it is minimally shielded. Why don't car manufacturers have a place for the phone in the roof or something so that it gets optimal antenna placement? Yes, it would be unseemly, but wouldn't we get much better reception?


The better place to get reception would be through the charging or audio port.

It's an easy way to extend the phone antenna to the whole vehicle.


> Why don't car manufacturers have a place for the phone in the roof or something so that it gets optimal antenna placement? Yes it would be unseemly but wouldn't we get much better reception?

Don’t you answer your own question? It would be unseemly. Phones still get signal and consumers aren’t demanding an option for optimal antenna placement. If they did install them, would consumers even notice battery savings when most people charge their phones in the car anyway? So for car manufacturers, it seems like all downsides.


It's a side note at most, but the history of technology at the end of the article is a little bit misleading as it lacks the dates for the technologies. It seems to list the most notable devices for each category. Some of them appeared several years after a technology became mainstream.

Examples:

The Nokia 3310 is from the end of 2000, but 2G mobile phones were already in everybody's hands by then, at least here in Italy. The boom started in 1997. I resisted until 1999, then my boss gave me one because "I don't want to call [another guy with a mobile] to talk with you" :-) It was the Nokia 8110 (Neo's phone in The Matrix).

3G/UMTS was launched in Europe in 2003, well before the first 3G-capable iPhone (2008). The marketing word in 2003 for UMTS was UMTS. The first device was the NEC 606 [1], from the 3 mobile operator (Italy, UK and a number of other countries). There were a number of popular 3G phones in the 5 years before the iPhone 3G.

By the way, the NEC 606 did video calls too. The screen was low resolution (though not by the standards of the time) and the price was outrageous (4 times the price of voice calls?). Not a very popular feature, and yet it was there.

[1] https://www.mobileindustryreview.com/2015/04/classic-handset...


This discussion reminded me of an iOS app that visualized cell towers in AR. I can't remember the name, though.


You probably mean Richard Vijgen's art project:

http://www.architectureofradio.com

The iOS application:

https://apps.apple.com/us/app/architecture-of-radio/id103516...


Wow, HN rocks! That's exactly the app I was thinking about.


I remember this also, sadly it's more of an art project.


It's quite hilarious seeing a bunch of skilled and intelligent hackers being excited regarding the process known as "handover" :) It is well known, and well documented in every single book on the subject...

Worth mentioning that there are 3 types of handovers:

- Intra-relations: a user can be switched between various cells on the same tower.

- Intra-relations, but with handover to a lower technology (say when 4G is not available [too much traffic], so you will be dropped to a 3G or even 2G cell)

- External-relations: relations between cells on different towers.

As for towers:

- Not all operators allow local roaming between various operators.

- Most towers have 3-5 sectors (pizza slices), but it's not that unusual to see omni antennas (a single sector covering 360 degrees).


The most interesting part of this article IMO is how the author built handover visualizations with unprivileged code and explained what was exposed in the API on Android vs iOS that was needed to do this. And they're quite good.


Handovers can exist at a higher level as well. 10 years ago there were calling apps that could hand over a WiFi VoIP call to the GSM network (not using VoIP but actual GSM) when the WiFi went bad, and later switch the same call back to WiFi.


It's quite hilarious seeing a supposedly "skilled and intelligent hacker" (since we're on Hacker News after all) being so excited to recite the article.


It's quite hilarious seeing the term "skilled and intelligent hacker" being used to describe HN users


This was a good read. I laughed out loud when I read the notes column of their cell technology matrix, which listed Mr Drummond and Gordon Gekko for 0G and 1G respectively.

I can also highly recommend the book "High Performance Browser Networking" by Ilya Grigorik, referenced by this post. Although it's almost 8 years old, most of it is still very relevant. It would be wonderful, though, if he would update it to reflect changes in TLS and 5G.

I also really liked the layout and fonts on this person's site. It has an almost 'zine aesthetic. Very easy on the eyes. Does anyone know what they might be using to produce this?


Those visualizations remind me of living in WV, where even the greatly exaggerated maps of coverage produced by the carriers clearly follow only the major highways and like 30% of the cities.

shudder


This is a very interesting presentation. I wonder if the author considered researching the inter-eNB and backhaul latency and how it affects UE latency.


> Along the way, five towers and nine cells (a.k.a eNB for Evolved NodeB)

What does this mean? What's the diff between a tower and a cell?


Towers are just locations of at least one set of sender and receiver. There can be more than one set of these at a given location, meaning that more than one cell can originate from a single tower.

The first graphic visualizes this. Reception of the same cell is indicated by the same color there, while towers are at the points from which these colored cones radiate. Several cones of different colors start at the same points, visualizing that there are towers providing more than one cell. Observe that reception can jump back and forth between the same cells, as can be seen with the brown and blue colored cells in the top left corner of the graphic.


He explains this two sentences later:

>- Several cellIDs map to the same eNB lat/long coordinates. That's because the antennas mounted on an eNB don't have 360° coverage. The angle and range of each antenna carves the space into pizza slice shaped cells.


It's explained in the article: one tower can have multiple antennas (the author found they're typically set at 120°) on a single mast. Each antenna is then a cell, although mounted on a single tower.


Ah multiple antennas. Makes sense. Thanks!


It gets even more fun when you throw in beamforming.

A tower is just a mast on which radios are mounted. A site is the location of the equipment. A node is the name for a set of radios and base systems. A node's service can be divided into sectors. A cell is usually a certain coverage area served; the frequency of a cell will differ a bit from its neighbors' to stop interference. And a beam is a specific focused radio signal that serves one UE.


> Whats the diff between tower and cell?

The tower is what it sounds like: one physical structure containing the communications equipment, also called a Base Transceiver Station (BTS) [1]. The cell is the individual communications panel on the tower, identified by a Cell ID [2]. There are typically 3 panels on a tower, each covering 1/3rd of the surrounding area. So the tower gives you the physical location, and the cell gives you the general direction.

[1] https://en.wikipedia.org/wiki/Base_transceiver_station

[2] https://en.wikipedia.org/wiki/Cell_ID


I will enthusiastically upvote anything written by Fabien Sanglard. He's such a great technical writer


It's always a pleasure to read Sanglard's posts. Best known for his 3dfx dissertation.


I like the map plotting in this article. You can see from the one-second rays exactly how much time was spent at a stop, and with a few calculations you can probably determine velocity.
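
For what it's worth, the "few calculations" are roughly: haversine distance between two timestamped fixes divided by the time between them. A small sketch with made-up coordinates:

    import kotlin.math.*

    // Great-circle distance between two lat/long points (haversine formula).
    fun haversineMeters(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
        val r = 6_371_000.0                    // mean Earth radius in meters
        val dLat = Math.toRadians(lat2 - lat1)
        val dLon = Math.toRadians(lon2 - lon1)
        val a = sin(dLat / 2).pow(2) +
                cos(Math.toRadians(lat1)) * cos(Math.toRadians(lat2)) * sin(dLon / 2).pow(2)
        return 2 * r * asin(sqrt(a))
    }

    fun main() {
        // Placeholder coordinates, not values from the article.
        val meters = haversineMeters(45.500, -73.600, 45.505, -73.590)
        val seconds = 60.0                     // fixes 60 samples apart at one per second
        println("avg speed ≈ %.1f km/h".format(meters / seconds * 3.6))
    }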


Would be great to have an app that shows a map with my position and the position of the current cell, and that could create maps like those in the article on the fly.

Does anybody know of such an app?



CellMapper[0] is not real-time, but it collects data to make sector maps and estimates the position of the current eNB. Users who contribute enough data can "pin" these eNBs in the correct location.

Despite not being real-time, it's the best / most accurate app I've found of its kind and has a good community behind it.

[0] https://www.cellmapper.net/map


A jolly cell tower adventure could ensue from indirectly reading the weather, and even forecasting it. The least difficult aspect is measuring humidity. ^_^ Good luck!


Isn't any cell tower essentially a way to measure humidity, as long as your receiver sensitivity and signal resolution are high enough?

Given known EMI patterns along the path of propagation of course.


Meteorologists are already using such data on a daily basis, and machine learning does wonders there too, but I'm sadly not familiar with the finer details.


>"Good developers know how things work.

Great developers know why things work.

- Steve Souders, High Performance Browser Networking Forewords"


It is worth noting that the mentioned books (along with many others about LTE) are available online, for free, if you know where to look.


So are the standards actually... but it's probably pretty difficult to get into if you don't work in the industry and have to learn all the terminology, approach, history/evolution of the standards, etc.


Seems like an obvious missed opportunity to actually draw the cells? I guess it would be a Voronoi diagram?


Nice - that would be really useful to have for large Wi-Fi systems in big commercial buildings too.


Well, WiFi meshes are surprisingly easy to put up; I think it's just a lack of desire to do so.


This is exactly how (big) WiFi systems work (mesh). If this is not the case in a big commercial building, it was done by choice or because of incompetence.


No it's not. A campus network has a PoE line to each access point. It's not a mesh, they don't route the traffic between the access points over the air.


Isn't that a mesh with backhaul? I thought the characteristic of a mesh was that the APs communicate among themselves to provide handoff. Data can move over the AP or through the backhaul.


I think that's roaming. Mesh wifi should IMHO be routing the traffic through the nodes (with possible backhaul).


Correct, enterprise wifi is usually just APs with a common SSID and security configuration and some tuning to try to force clients to roam sensibly. There’s no AP to AP comms, it all goes back to the switchport


There are solutions for AP-side roaming that are transparent to the client, where the whole network is visible as a single AP, like UniFi "Zero-Handoff Roaming". Though because it forces the whole network to use a single channel, it is deprecated in favor of 802.11r client-side fast roaming.

https://help.ui.com/hc/en-us/articles/115004662107-UniFi-Fas...


Is there an app which can show you this kind of map in real-time as you move?


Thanks. Was quite a pleasure to read.


This is so cool.



