This is interesting, for sure, but the steps here are the easy part. The challenge is doing this legally: srsRAN is going to use bands you almost certainly don't have a license for.
The only viable options for the hobbyist are a network using unlicensed bands (like 2.4 GHz / 5 GHz, or the 33 cm ISM band) or CBRS[0]. As I understand it, most UEs (handsets) that support one of those bands will not use it exclusively, only in tandem with another licensed band.
Unlicensed bands typically require that you implement some kind of backoff or collision-detection mechanism to ensure fairness between users of the band, and to guarantee you are not transmitting more than X% of the time. I don't think LTE is prepared to do that at all.
Plenty of other unlicensed-band users don't do this. Even Bluetooth and WiFi typically don't avoid each other on 2.4 GHz, with the exception of transmissions from the same device.
And of course all other unlicensed-band users do this; again, it is a regulatory requirement. And ultimately it's not just that you need to ensure others have access to the band, you also need to be robust in the face of persistent high interference, given the number of users in the unlicensed bands. Default LTE is not a good fit.
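To make the mechanism concrete, here's a minimal sketch of a listen-before-talk gate with a duty-cycle budget. The threshold, window, and 10% figure are invented for illustration, not taken from any actual regulation:

    import random, time

    DUTY_CYCLE_MAX = 0.10      # pretend rule: at most 10% airtime
    CLEAR_THRESHOLD = -75.0    # pretend clear-channel threshold, dBm
    WINDOW_S = 10.0            # accounting window for the airtime budget

    def channel_power_dbm():
        # Stand-in for an energy measurement from the radio.
        return random.uniform(-95, -60)

    airtime_used, window_start = 0.0, time.monotonic()

    def try_transmit(duration_s):
        """Transmit only if the channel is clear AND we have airtime budget left."""
        global airtime_used, window_start
        now = time.monotonic()
        if now - window_start > WINDOW_S:            # roll the accounting window
            airtime_used, window_start = 0.0, now
        if channel_power_dbm() > CLEAR_THRESHOLD:    # someone else is talking: back off
            return False
        if (airtime_used + duration_s) / WINDOW_S > DUTY_CYCLE_MAX:
            return False                             # would blow the duty-cycle budget
        airtime_used += duration_s                   # ...key the radio here
        return True

Stock LTE has no hook for the "is the channel clear?" step; the eNodeB schedules every subframe assuming it owns the spectrum, which is the core of the mismatch described above.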
ISM is the only allocation that doesn't do avoidance (where it has primary status, like the 27 MHz band).
Of course it can't blast excessively, but it's also not mandated to shield itself to death.
Sadly, srsRAN does not seem to implement TDD. Ham radio operators have a slice at 2.3 GHz which overlaps Band 40, and that band is supported by a good number of phones.
Also, Band 40 is the only widely supported TDD band within tuning range of the cheaper SDRs.
Even a LimeSDR Mini can speak that band, though you may want to fit an LTE-rate crystal to keep phase noise in spec: derived from the stock crystal, the LTE base clock is a fraction with a really big denominator, which makes the fractional PLLs on those SDRs problematic unless you can sense the fractional phase of the reference crystal.
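To make the clock arithmetic concrete: the LTE master clock rate is 30.72 MHz, and the LimeSDR Mini's stock reference is 40 MHz. The real PLL chain in the LMS7002M is more involved, so treat this toy calculation as the flavor of the problem rather than the exact ratio:

    from fractions import Fraction

    lte_clock = 30_720_000                     # LTE master clock rate, Hz

    for ref in (40_000_000, 30_720_000):       # stock crystal vs. LTE-rate crystal
        print(ref, "->", Fraction(lte_clock, ref))
    # 40000000 -> 96/125   (fractional-N mode: spurs and phase noise to manage)
    # 30720000 -> 1        (integer mode: the clean case)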
In theory you can use a ($540) bladeRF in place of the ($2,100) B200/210; it's got the same family of 2x2 MIMO TX/RX chip as the B210. I say in theory because I've been trying to get srsRAN to work with it, so far without success: it gets a response, but fails to establish the link. Has anyone managed to get srsRAN working with a bladeRF?
Others have mentioned Fabrice Bellard's LTE code, but (AFAIK) it's not freely available. Don't take this as a criticism. Bellard has done some great Free software, and it shouldn't be expected that he will release everything as Free software.
> Others have mentioned Fabrice Bellard's LTE code, but (AFAIK) it's not freely available. Don't take this as a criticism. Bellard has done some great Free software, and it shouldn't be expected that he will release everything as Free software.
Also, I think that if Fabrice released it as open source, the pros/cons balance would be different compared to his other projects (ffmpeg, qemu...) because
* very few people would be able to use and benefit from it, considering that (1) you need proper (expensive) hardware, (2) you cannot use licensed bands anyway, and (3) you mostly cannot use unlicensed bands either, without proper support from phones and without implementing the band-sharing mechanisms the regulations require (e.g. listen-before-talk); and LAA requires a licensed band as an anchor for signaling anyway...
* it would cut off a source of revenue (today, through Amarisoft)
Without knowing anything else about your setup my first guess would be synchronization issues. Do you have a GPSDO or some other means of providing a reference signal to your BladeRF?
Thanks for the idea, I'll give it a go. srsRAN includes carrier frequency offset estimation and compensation, but it can't hurt to assume there's a problem there.
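If you want to sanity-check the CFO path outside srsRAN, the classic trick is to estimate the offset from cyclic-prefix autocorrelation. A minimal numpy sketch (not srsRAN's actual code; sizes correspond to a 20 MHz LTE carrier, and it only resolves offsets below half the 15 kHz subcarrier spacing):

    import numpy as np

    def estimate_cfo(rx, fft_size=2048, cp_len=144, fs=30.72e6):
        """Fractional CFO from cyclic-prefix autocorrelation.

        The CP repeats the symbol tail, so the phase rotation between a
        sample and its copy fft_size samples later encodes the offset."""
        corr = np.sum(rx[fft_size:fft_size + cp_len] * np.conj(rx[:cp_len]))
        scs = fs / fft_size                        # subcarrier spacing (15 kHz here)
        return np.angle(corr) / (2 * np.pi) * scs  # offset in Hz

    # Self-test: a pure tone standing in for an OFDM symbol, offset by 3 kHz.
    n = np.arange(2048 + 144)
    print(estimate_cfo(np.exp(2j * np.pi * 3e3 * n / 30.72e6)))  # ~3000.0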
1) The LTE base clock has a giant denominator when derived from the LimeSDR's on-board crystal. A crystal at the LTE base rate costs single-digit dollars and fixes this; you can even set it up as a VCXO and discipline it as a fractional PLL off a more stable reference (a 10 MHz GPSDO is easiest, though a GPSDO directly at the LTE rate performs mildly better at a big effort expense). The stability matters because it keeps multiple eNodeBs' OFDM symbols (and, more importantly, TDD slots) aligned, so they can collaborate for broadcasts and get by with using the same spectrum on overlapping cells.
2) I'd worry much more about near-end crosstalk without a specially tuned diplexer that splits the TX/RX of FDD LTE channels. Try a TDD band your hardware supports instead.
It's like one of those woodworking shows on PBS...
"Today we are building a picture frame. You need a square, wood, wood glue, and a $15,000 biscuit cutter that you will only use twice in your lifetime."
It's identical in design to any number from Makita, DeWalt, or other nicer brands ($1k from Festool). The point was that it's not a $15k tool; it's a $200 tool from DeWalt. SMH. Yes, of course you could buy a bench-mount industrial one or something for lots of money.
The design is the same, but are the materials, manufacturing tolerances, and QA the same? That cheaper model is cheaper for a reason. While you're right that most people won't need the $15,000 industrial biscuit cutter, I'd hesitate to use the cheapest model on the shelf.
Having worked with many people who briefly used bands they were not authorized for and never ended up on the FCC shitlist, I would be okay with a very low power, temporary setup to play with or test something in the GAA band: it's kind of a free-for-all among those who have a license to that spectrum anyway, and being low power means you won't be interfering with much, if anything.
The FCC tends to go after transmitters that are actually disrupting communication, and if your signal drops off to the point where ambient noise on the band is louder outside the perimeter of your house, you're not going to get that knock on your door.
This all sounds very cool, but don’t you need licensed spectrum to operate your own LTE/NR network?
Is there an open/unlicensed band available for this sort of thing, or do hobbyists just “get away with it” by keeping their antennas small and power levels low?
Looks like you're charged about CAD 0.04 per MHz per head of population in the licensed area. Didn't dive too deep, but it might be viable for a small town to put something together if they're unhappy with the big providers.
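Back-of-envelope, assuming that rate is an annual fee (an assumption; I haven't checked the term):

    pop = 10_000        # hypothetical small town
    mhz = 20            # e.g. 2x10 MHz FDD
    rate = 0.04         # CAD per MHz per person, figure from above
    print(pop * mhz * rate)   # => 8000.0 CAD -- plausible for a town co-op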
> CAD 0.04 per MHz per head of population in the licensed area.
That sounds like a pretty reasonable way to sell spectrum...
I hope this is some time-limited lease and the spectrum eventually goes back to the central government.
I hope that police/fire/military uses have to go through the same process and pay for their spectrum out of their budgets too. Too many other countries just give massive chunks of spectrum to the fire service who will then use it for walkie talkies for 3 fire trucks, when the same amount of spectrum could be used to give broadband, phone and TV to millions of people.
In the United States, the FCC makes CBRS Tier 3 available to General Authorized Access (GAA) users. You can operate a private LTE network so long as your radios are authorized (via a SAS) and you follow the spectrum rules. From the FCC website:
> The GAA tier is licensed-by-rule to permit open, flexible access to the band for the widest possible group of potential users. GAA users can operate throughout the 3550-3700 MHz band. GAA users must not cause harmful interference to Incumbent Access users or Priority Access Licensees and must accept interference from these users. GAA users also have no expectation of interference protection from other GAA users. Technical rules for GAA users can be found in Subpart E of Part 96.
Sort of a tangent -- I always wondered what was meant when you see FCC disclaimers (i.e. the Part 15 disclaimers on anything electronic) that say the device "must accept interference."
That seems weird to specify-- interference seems like an environmental hazard to be expected, like saying a light bulb must "accept" a brownout.
Does it mean "it won't be damaged by the expected level of RFI" or is there some sort of "active rejection of interference" they're explicitly forbidding?
The requirement to "accept interference" is legal rather than technical. It means that if the device experiences problems with interference, you can't use the normally available remedy of filing an FCC complaint to get the source of the interference shut down.
At Chaos Communication Congress we usually get a little spectrum for the local network.
But it's not treated as being as critical as the local DECT network.
I for one wish there were software support for the LTE feature that lets phones receive a PHY-level broadcast (eMBMS), because we could offer very low latency streaming of all the audio tracks (usually a German and an English presenter track, one of them a live dub, and sometimes French or so as a live dub), so that people could sit in the talk and get their audio via headphones.
The DECT network used to provide that, but the conference modules (it was a conference call where you defaulted to being muted) had to go to make room for more base station modules (controllers for remote radio heads, roughly, since you wouldn't want to run RF antenna cable from one box to all the locations). There were also doubts about whether the PHY was actually broadcast/multicast rather than unicast; unicast would be a problem in the large lecture halls, because dividing them into a grid of femtocells is something DECT seems badly suited to (I don't know exactly why).
My idea of a solution would be a WiFi distributed-MIMO/synchronous base station that presents as one (or a handful of) BSSIDs per frequency (say, a particular 40 or 80 MHz channel in the 5 GHz region) but never makes clients aware that roaming is happening. Ideally it would do full MIMO between the transmitting clients and every base station antenna that gets a decent signal: clients already listen before they talk and stay quiet while they can hear someone else talking (even if they can't make out the words), so this could easily be several clients against many dozen base station antennas spread around, with each transmission picked apart even when any individual antenna hears a mix of them, just like normal WiFi MIMO.
Then in the other direction, use beamforming so that no client ever sees more overlapping signals than its receive antennas can pick apart (and can isolate the signal(s) intended for it from everything else): transmit simultaneously only to clients that are far enough apart that the base can make every other client's signal too weak to be a problem.
Because it can pre-compensate for the way transmissions from nearby antennas overlap at the clients, it can use overlapping antenna "cells" simultaneously (see the sketch below).
And PHY-level broadcast (also useful for multicast) can transmit on all antennas at once, simply by being sufficiently synchronous (the same mechanism OFDM uses to mitigate echo/reflection/multipath).
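The downlink pre-compensation step is essentially multi-user precoding. A toy numpy sketch using zero-forcing (one standard way to do it; the channel matrix here is random stand-in data, and a real system would have to measure it):

    import numpy as np

    rng = np.random.default_rng(0)
    n_tx, n_users = 8, 3                    # 8 distributed antennas, 3 clients

    # H[u, a]: complex channel gain from antenna a to user u (stand-in values).
    H = rng.normal(size=(n_users, n_tx)) + 1j * rng.normal(size=(n_users, n_tx))

    # Zero-forcing precoder: right pseudo-inverse of H, so that H @ W = I.
    W = H.conj().T @ np.linalg.inv(H @ H.conj().T)

    s = np.array([1 + 0j, -1 + 0j, 1j])     # one symbol per user
    x = W @ s                               # what the antennas actually radiate

    # Each client hears only its own symbol; the cross-talk cancels over the air.
    print(np.round(H @ x, 10))              # ~ [ 1.+0.j, -1.+0.j,  0.+1.j]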
I can’t wait to see what headphone manufacturers come up with in order to let a user subscribe to an LE Broadcast … maybe by jabbing two or three side-of-head hardware button combos, wait 30s, pair again, grr, now repeat x2 for the kids.
>maybe by jabbing two or three side-of-head hardware button combos, wait 30s, pair again, grr, now repeat x2 for the kids.
That would be frustrating. If only the headphones came with some sort of companion device with a touchscreen that could be used to navigate the pairing process.
Even just getting synced WiFi base stations (or SDR transmitters in the WiFi band injecting frames without a base station's cooperation) would allow synchronous broadcasting of FEC'd UDP streams that an app could easily decode.
That works because OFDM tolerates multipath even when the "paths" are literally multiple separate transmitters, as long as they are synced much tighter than the guard interval (which would probably be 400 ns here).
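A toy numpy demo of that claim: a second transmitter offset by a few samples (well inside the guard interval) is indistinguishable from ordinary multipath, so a one-tap equalizer recovers the data exactly. The numbers are 802.11-ish but made up:

    import numpy as np

    nfft, cp, d = 64, 16, 6                 # FFT size, guard interval, 2nd-tx lag
    rng = np.random.default_rng(1)
    X = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], nfft)  # QPSK on every bin

    x = np.fft.ifft(X)
    tx = np.concatenate([x[-cp:], x])       # OFDM symbol with cyclic prefix

    rx = tx.copy()
    rx[d:] += 0.5 * tx[:-d]                 # weaker second transmitter, d samples late

    Y = np.fft.fft(rx[cp:cp + nfft])        # receiver FFT window after the CP
    k = np.arange(nfft)
    H = 1 + 0.5 * np.exp(-2j * np.pi * k * d / nfft)  # looks like a 2-ray channel
    print(np.max(np.abs(Y / H - X)))        # ~1e-15: one tap per bin fixes it all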
It's not unlicensed; you have to apply for and get the usage rights from the FCC. If someone is already using it at that location, you can't use it. Plus, the FCC has the right to revoke the usage rights temporarily or permanently. If I'm not wrong, you have to pay a minimal fee. To implement this architecture, you have to connect your private 4G/5G network to a SAS service provider.
https://cbrs.wirelessinnovation.org/sas-administrators
https://www.celona.io/cbrs/cbrs-sas
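For a flavor of what "connect to a SAS" means in practice, here is a rough sketch based on the WInnForum SAS-CBSD JSON protocol (WINNF-TS-0016). The URL, credentials, and field values are placeholders; real deployments authenticate with mutual-TLS certificates issued by the SAS administrator:

    import requests

    SAS_URL = "https://sas.example.com/v1.2"        # hypothetical endpoint
    CERT = ("client.crt", "client.key")             # mTLS identity from the SAS admin

    def register_cbsd():
        req = {"registrationRequest": [{
            "fccId": "YOUR-FCC-ID",                 # from the radio's FCC grant
            "cbsdSerialNumber": "SN0001",
            "userId": "hobbyist-1",
        }]}
        r = requests.post(f"{SAS_URL}/registration", json=req, cert=CERT)
        return r.json()["registrationResponse"][0]["cbsdId"]

    def heartbeat(cbsd_id, grant_id):
        # The SAS can shorten or terminate a grant in any heartbeat response,
        # which is the mechanism behind the revocation mentioned above.
        req = {"heartbeatRequest": [{
            "cbsdId": cbsd_id, "grantId": grant_id, "operationState": "GRANTED",
        }]}
        return requests.post(f"{SAS_URL}/heartbeat", json=req, cert=CERT).json()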
I thought I was the only one silly enough to do something like this! I've got a Nova 227 tucked away in a corner room of my home facing inwards connected to Magma (was Open5GS).
Did you run into the bug where the default Open5GS bearer rate limit (1Gbps) somehow triggers a bug in the Nova 227 and causes it to run super slow (~7Mbps down/100kbps up)? I found that setting its rate limit to anything else seems to solve the problem and I get 100Mbps/10Mbps with TDD2/SSF7.
Holy hell you might have saved my lab deployment! I need to test this when I get home.
If you haven't, have you considered creating an issue on GitHub over this so they can track it?
BTW -- I've done some really, really cursed things. I actually had Open5GS running on one of those little $20 USB LTE modems, on top of Debian, as a joke. I think I made the world's smallest EPC...
Not going to lie, I was a bit inebriated when I did it so I don't remember the exact stuff involved; my hardest challenge was reflashing the stick's Debian install because I accidentally bricked it.
It involved the openstick hacking[1] to get Debian onto it; from there, since it was just a straightforward arm64 box, I had to compile the correct versions of Open5GS and MongoDB for it. Not technically complicated, just a hilarious "just because I can doesn't mean I should" project.
The stick itself backhauled onto my home WiFi network, so in essence any cellphone was talking to my eNodeB wired to the network, then over WiFi to the USB stick running Debian, then back over the same WiFi onto my LAN and out into the world.
Ahaha, yeah, it bugged the hell out of me too. Near as I can tell, there's nothing wrong with the Open5GS S1AP message, so filing a GitHub issue isn't the way forward. I did speak with Baicells about it, but their response was sorta... complex. tl;dr: I'm somehow the first one to find it, and they probably won't be fixing it.
If you set the rate limits to ~200 Mbps you should be able to achieve max throughput on your 227.
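For anyone hitting the same thing: the WebUI is the supported way to change it, but you can also poke the subscriber document directly. A sketch assuming the MongoDB schema my install used (value/unit pairs, with unit 2 meaning Mbps in my copy; double-check against your Open5GS version before running anything like this):

    from pymongo import MongoClient

    db = MongoClient("mongodb://localhost/open5gs").open5gs
    db.subscribers.update_one(
        {"imsi": "001010000000001"},               # placeholder IMSI
        {"$set": {"ambr": {
            "downlink": {"value": 200, "unit": 2}, # 200 Mbps, not the 1 Gbps default
            "uplink": {"value": 200, "unit": 2},
        }}},
    )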
I've similarly run Open5GS on a few cursed systems - shame the OpenStick doesn't support band 48 (though, mine won't detect a SIM card anyway...) - I've snagged a few old Verizon MiFi models that happen to have B48 support in the interim, or my iPhone 11 which makes a good UE.
> LTEENB allows to build a real 4G LTE/5G NR base station (called an eNodeB (4G) or gNodeB (5G)) using a standard PC and a low cost software radio frontend. All the physical layer and protocol layer processing is done in real time inside the PC, so no dedicated hardware is necessary. NB-IoT and Cat-M1 devices are also supported. The software is now developed and distributed by Amarisoft.
This is good software; I used their stack to provision software "onto the cell tower" using Kubernetes (spectrum licenses are required to use it with a phone): https://arxiv.org/pdf/1905.03883.pdf - maybe this helps someone understand the stack!
LimeSDR didn't make nearly the splash it should have; why? I recall hearing the first iterations of the hardware were skittish and noisy; did they never improve?
In rural areas some people need longer-range WiFi: IoT devices in different buildings, or data service for things around a farm.
If the costs were lower, I'd deploy something in my neighborhood. For whatever reason, my neighborhood has terrible LTE service. I don't know the exact reasons, but I suspect it's either geography (we're using another county's cell tower on a butte 5 miles away, with line-of-sight issues) or maybe metallic properties of the land here on our canyon rim (mostly lava flows).
As my spouse and I bike, take walks, or drive to work, I would love to have data service in that few-mile radius around my house. If my phone could have an eSIM with my provider and a SIM with my own private LTE, with calls failing over between the two data services, I would have a lot more flexibility.
I would have a lot of other use-cases with IoT and maybe quadrupeds as things like that become more generally available but data within a few mile radius would be the first thing.
I suppose I could get four 90-degree sector antennas with higher output power and blast my neighborhood with WiFi, but that has a lot of downsides.
That works fine if the use case tolerates high latency and doesn't need TCP. To your point, I would imagine almost all home use cases are fine in this regard.
So I'm curious... The name LTE means Long Term Evolution. What exactly are/were we evolving from and over what term? It just seems weird to have 3G, LTE, and then 5G. What happened to the LTE wording/name/meaning?
In reality, they are all derived from the same 3GPP standard [1], the generations (3G, LTE, 4G, 5G, ...) just being labels applied to particular releases of the underlying standard. In git terms, the underlying 3GPP standard is the "master" branch and the releases are tags. Happily it is all freely available.
To answer your question, LTE referred to the intermediate stages between 3G and 4G. The labels are a bit arbitrary and interpretation of exactly what the label refers to varies with what the carrier is trying to sell.
It was an upgrade to 3G UMTS: the long-term evolution of the third generation, as a stopgap while 4G development was delayed. The industry then gave up on 4G proper, rebadged the 3G/3.5G/3.9G LTE upgrade as LTE/4G LTE, and moved the leftover elements into 5G and 6G.
It's kind of repeating with the 5G NSA (Non-Standalone) / 5G SA (Standalone) / 5G mmWave branding. It seems the marketing side of the telecommunication industry always wants to rebadge backported technology as the mainstream next generation, while the engineering divisions want to move on to the generation ahead.
Do you have any good sources of reading for the eSIM standard with regard to this?
Projects like Ukama make it look easy to issue your own eSIMs but I don't understand whether it's something they provide as a service to their users through a central authority or if it's actually a proper roll-your-own system.
Besides unlicensed bands and CBRS, the other alternative is the Amateur Radio bands, which some hams have apparently experimented with: https://github.com/mmtorni/HamLTE.
[0] https://en.wikipedia.org/wiki/Citizens_Broadband_Radio_Servi...