You can start your own telco network with these projects (though of course doing so would be highly illegal without a spectrum license). First, you need a gNB or eNodeB, which is lingo for a base station. It is possible to buy an SDR and use software like https://github.com/srsLTE/srsLTE to make the SDR receive and broadcast LTE/5G connections. After this, you need blank SIM cards to program. For example, this post shows how: https://cyberloginit.com/2018/05/03/build-a-lte-network-with.... You might also consider writing eSIMs: https://github.com/bagyenda/njiwa.
After this, you should be able to use a commercial phone to connect to the network.
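The SIM-programming step mostly comes down to writing a handful of identifiers onto the card. As a rough sketch of what those identifiers look like (all values below are illustrative placeholders, not real credentials):

```python
# Sketch: composing the subscriber identity you would write to a blank SIM.
# All values here are illustrative placeholders, not real credentials.

def make_imsi(mcc: str, mnc: str, msin: str) -> str:
    """IMSI = MCC (3 digits) + MNC (2-3 digits) + MSIN, 15 digits total."""
    imsi = mcc + mnc + msin
    assert len(imsi) == 15 and imsi.isdigit(), "IMSI must be 15 digits"
    return imsi

# 001/01 is the 3GPP test PLMN, commonly used for private lab networks.
imsi = make_imsi("001", "01", "0123456789")
print(imsi)  # 001010123456789
```

Your core network config (HSS/UDM subscriber entry) then needs the same IMSI plus the Ki/OPc keys written to the card, or authentication will fail.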
In general, these networks function quite well even on commodity hardware. Some tests I worked on produced end-to-end latency of around 20ms on LTE, and throughput of around 7MB/s. This was using NextEPC on a 400 euro computer with a 50 euro router and a Nokia base station. Further, it was possible to make the system work in a Heroku-like manner by exposing PaaS endpoints and running Kubernetes in the access network: https://www.semanticscholar.org/paper/Open-source-RANs-in-Pr...
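That ~7 MB/s figure is plausible for a single LTE carrier. A back-of-envelope estimate, with assumed parameters (real throughput depends on MCS, MIMO rank, and scheduler overhead):

```python
# Back-of-envelope LTE downlink throughput estimate.
# All parameters are assumptions for illustration, not measured values.

bandwidth_hz = 20e6       # a common LTE carrier width (20 MHz)
spectral_eff = 4.0        # bit/s/Hz, assumed average for good conditions
overhead = 0.25           # control channels, reference signals, retransmits

throughput_bps = bandwidth_hz * spectral_eff * (1 - overhead)
print(f"{throughput_bps / 8e6:.1f} MB/s")  # 7.5 MB/s on these assumptions
```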
I think it's quite likely that sooner or later we will have 5G networks with in-network services running at the edge, providing us low latency. It's worth noting that with the latest LTE releases, 5G and LTE access latency is virtually the same. The real latency optimization would be, at least from my perspective, to allow software to be deployed _into_ these cellular networks to minimize network hops.
That said, I'm not sure a commercial handset would connect to anything in that range. You could probably get an SDR-based UE (UE is telecom-speak for "thing that connects to a cell network") to work.
There are unlicensed variants of LTE/5G, but worth noting that these often are designed to use the unlicensed spectrum as supplemental downlink (i.e. License Assisted Access (LAA) in LTE), rather than actually allowing for uplink and downlink in the unlicensed spectrum.
The reason for that is simple politics - mobile operators are powerful lobbyists, and part of a powerful standards group (GSMA), and don't really fancy the idea you should be able to run what a customer will perceive as a mobile network without significant investment in spectrum (a finite resource).
LTE-U and LAA won't let you run a network by yourself just on unlicensed spectrum. Multefire should, but is quite rare for the obvious political reasons outlined above.
Sorry, I can't find the part that mentions political reasons in the OP. I thought Multefire never took off simply due to licensing issues.
The idea of being able to deliver a full mobile service to mobile phones is "sellable" - operators like to think of themselves as being the only ones able to provide the service people expect to a mobile handset. Then they can bundle handsets with service provision.
If you are interested in this topic, it's worth looking at the fraught relationship between mobile operators and WiFi. Operators historically rejected WiFi from handsets early on - WiFi was a rival to their high-price, high-margin mobile data services. Until perhaps the mid-3G days, when the idea of using WiFi for offload, due to the limited spectrum for mobile data started to become a tempting idea for operators.
Even that's controversial - the "enterprise" market around WiFi wasn't hugely keen on that either, since they felt the mobile operators were just trying to snap up and freeload on the license-exempt spectrum for extra capacity, and use up (finite) WiFi spectrum capacity while providing an operator-badged service.
In my view, Multefire hasn't taken off due to the general high complexity of the tech stack people would need to understand to use it (fine for me with a telecoms engineering background, less fine if you are from the pure IT world and just want something quick - it's a lot more effort and complexity than setting up a couple of WiFi APs), and the lack of pressure from mobile operators to support it. Handset support for features comes from operator demand/desire. Multefire isn't something handset makers will add, unless operators demand it. Absent that, it risks alienating or upsetting them, by opening up the handsets to competition, and since operators are the main route to market for your handsets, market dictates the rules...
While LTE-U does have some advantages over WiFi for WISP use (e.g. better handling of a large number of clients per AP), the higher cost (around $9k for an LTE-U eNodeB compared to as low as $500 for a WISP-grade 5GHz WiFi AP) and the scarcity of vendors mean that it's not very commonly used. We'll see if that changes over time, but WiFi has a huge amount of inertia in that space, so it seems difficult for LTE-U to catch up, considering that power levels are limited to such an extent that it doesn't have a huge range advantage over WiFi with good antennas.
In fact, just looking at Baicells it seems that they offer fewer models for LTE-U than they used to, so they may be finding that sales aren't enough to keep up the product line.
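The economics are easy to sketch: at those price points, LTE-U only wins if it genuinely serves many more clients per AP. A toy comparison, with the per-AP client limits being my assumptions rather than vendor specs:

```python
# Rough cost-per-client comparison for a WISP.
# Prices are from the comment above; client limits are assumptions.

lte_u = {"price": 9000, "max_clients": 200}   # assumed LTE-U eNodeB capacity
wifi = {"price": 500, "max_clients": 40}      # assumed WISP-grade 5 GHz AP

for name, ap in (("LTE-U", lte_u), ("WiFi", wifi)):
    print(f"{name}: ${ap['price'] / ap['max_clients']:.2f} per client")
```

Even with a generous 5x client advantage, LTE-U comes out at roughly $45 per client against $12.50 for WiFi on these assumptions, which is consistent with WiFi's continued dominance in the WISP space.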
Apparently, there are overlaps between common LTE bands and both ham radio bands and unlicensed (... low-power) ones... the above page states that any device with the right bands will work (... and they even got them working presumably?)
I believe OpenAirInterface can handle TDD, although it was firmly "research grade" code last time I looked at it.
The positive from CBRS is that it should (or at least is intended to) spawn a new generation of lower cost small cell base stations, using this band, and speaking the CBRS "protocol" for spectrum access coordination. And that has potential to help reduce prices of radio equipment.
Handset compatibility is coming on this band quicker because some existing mobile operators have purchased PALs (priority access licenses) for CBRS spectrum, and intend to use this for some extra capacity.
P.S. just as a very minor technical correction, CBRS is defined for band 48, rather than 42, although with some overlap. B48 is 3550 to 3700 MHz, while B42 is 3.4 to 3.6 GHz as you said. Therefore when looking for devices, it's best to look for B48 (although at a push, if you're doing your own R&D, B42 will be fine for use in the lower 50 MHz section of the band).
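The band overlap described above is easy to check numerically:

```python
# CBRS band overlap check: B48 vs B42 frequency ranges in MHz,
# using the figures from the comment above.

b48 = (3550, 3700)   # CBRS band
b42 = (3400, 3600)

lo = max(b48[0], b42[0])
hi = min(b48[1], b42[1])
if lo < hi:
    print(f"Overlap: {lo}-{hi} MHz ({hi - lo} MHz)")  # Overlap: 3550-3600 MHz (50 MHz)
```

So a B42-only device can only ever use the lower 50 MHz of CBRS, which is why B48 support is what to look for.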
The problem is exactly what you described: there are currently no commercial UE or base stations. So yeah, I suppose one is supposed to use SDR for the 5 GHz band.
is this the same?
There is zero information about Nokia Kuha! Interesting whether this can hold more 24/7 broadband clients than WiMAX et al.
It seems that this is aimed exclusively at operators that already have the infrastructure to issue SIM cards et al, and communities that are barred from deploying their own infrastructure but willing to pay for it. A weird use case but clearly aimed at US and AU markets I'd guess.
You need a community that is willing to front the cost of laying fiber, and then, instead of distributing it or starting a small ISP/co-op, hand that new termination over to a telecom giant, who will plug this device into their existing billing network and provision the new subscribers with access at existing plan prices.
From memory, a 1-year local spectrum license should cost about £50.
Most of the learning curve comes from the lingo. Especially now that 5G introduces softwarization of many of the network functions, the terms used by the cellular and software industries clash and may be confusing at times. But you can definitely learn the high-level idea of how the cellular networks function in this way.
For physical-layer things, I imagine a degree in signal processing is required. And for the more advanced software interfaces, I'm afraid you can only read about them, as most of the devices on which the software is implemented are proprietary. The open-source software you can find is nowhere near as fully implemented as the software that runs commercial networks.
lol, I've found a lot of the terminology around digital communications confusing. Terms mean two or three different things depending on context, etc.
Noob question, sorry, but: Why not Wifi?
Network and smartphone monitor nearby cells. If you're just about to switch the cell because the quality becomes too bad, everything is already being prepared in the next cell.
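The "everything is already being prepared in the next cell" logic is roughly what 3GPP calls an A3 event: handover is triggered when a neighbour measures better than the serving cell by a hysteresis margin for a full time-to-trigger window. A sketch, with illustrative thresholds rather than a real operator config:

```python
# Sketch of an A3-style handover trigger. Thresholds are illustrative.
HYSTERESIS_DB = 3.0
TIME_TO_TRIGGER = 3   # number of consecutive measurement reports required

def should_handover(serving_rsrp: list, neighbour_rsrp: list) -> bool:
    """True if the neighbour beat serving + hysteresis in every one of
    the last TIME_TO_TRIGGER measurement reports (RSRP in dBm)."""
    better = [n > s + HYSTERESIS_DB for s, n in zip(serving_rsrp, neighbour_rsrp)]
    return len(better) >= TIME_TO_TRIGGER and all(better[-TIME_TO_TRIGGER:])

print(should_handover([-100, -102, -104], [-95, -95, -95]))  # True
print(should_handover([-100, -100, -100], [-99, -99, -99]))  # False
```

The hysteresis and time-to-trigger are there to stop the phone ping-ponging between two cells of similar strength.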
802.11ax basically moved some of the LTE tools (OFDMA) over to WiFi and completely failed, pushing some of the intended features back to 802.11be.
Spectrum licensing is more about ensuring exclusive use of spectrum - in a busy built up area (less common during current times hopefully), you'll still suffer from "spectrum suffocation" like with WiFi - if you ever experienced data stalling where your phone indicated a signal, but weren't getting any actual traffic through, you were on a cell that was hitting the limits of the air interface capacity.
The nice part of cellular networks is that you can add more base stations using other licensed spectrum the operator has, in order to get more capacity. So with 3.5 GHz 5G bands, and wider cell widths supported, these can take some of the load in busy places. Otherwise the alternative is to deploy more base stations with lower power on each, to better and more effectively reuse the spectrum.
Since the same operator runs all the equipment on their own spectrum, they can coordinate this much more effectively than ad-hoc multi operator WiFi in a congested area. The downside is the vastly increased complexity of a cellular solution (the sheer number of pages of standards), and the "entrenched player lock-in" available to legacy telecoms operators who already hold spectrum.
Fun fact though - if a phone is without signal on its home network, it will attempt to join any valid 3GPP network it sees, and the cell broadcast/emergency alert will be triggered on the handset (even without a valid SIM) - no authentication takes place.
Although in saying that, this is unlikely to change any time soon, as the idea of CB/PWS is to provide an emergency message that can be highly time sensitive in some scenarios (earthquake, tsunami, etc.) without delays due to authentication etc. Failing to show the message could be higher risk than showing a false message in a very localised area (based on what someone with an SDR can send.)
That enables edge computing by allowing traffic for the edge computing node to be routed to the local edge node directly, without it going down the backhaul.
This is very much a software change though in the implementation of the base station.
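Conceptually the routing decision is simple "local breakout": traffic destined for the edge-compute subnet is turned around at the base station site, and everything else goes down the backhaul to the core. A minimal sketch, with a made-up subnet:

```python
import ipaddress

# Sketch of "local breakout" routing at a base station site.
# The edge subnet is made up for illustration.
EDGE_SUBNET = ipaddress.ip_network("10.42.0.0/16")

def route(dst: str) -> str:
    """Route locally if the destination is an edge-compute node,
    otherwise send it over the backhaul towards the core network."""
    return "edge" if ipaddress.ip_address(dst) in EDGE_SUBNET else "backhaul"

print(route("10.42.3.7"))      # edge
print(route("93.184.216.34"))  # backhaul
```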
Backhaul is becoming less and less of a problem as more and more sites get fibre backhaul. Point-to-point links can easily do 10 Gbit/s now even if they don't have fibre (to another tower with fibre). That is almost certainly as much if not more than the total capacity of most cell towers, even with a stupendous amount of spectrum allocated.
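A rough sanity check of that claim, with assumed radio parameters:

```python
# Is 10 Gbit/s of backhaul enough for one tower? Rough check with
# assumed radio parameters (3 sectors, generous spectrum allocation).

sectors = 3
spectrum_hz = 200e6       # assumed aggregate spectrum per sector
spectral_eff = 5.0        # bit/s/Hz, assumed average

air_capacity_bps = sectors * spectrum_hz * spectral_eff
backhaul_bps = 10e9

print(f"air: {air_capacity_bps / 1e9:.0f} Gbit/s, "
      f"backhaul: {backhaul_bps / 1e9:.0f} Gbit/s")
print("backhaul sufficient" if backhaul_bps >= air_capacity_bps
      else "backhaul limited")
```

On these assumptions the whole tower tops out around 3 Gbit/s of air-interface capacity, so a single 10 Gbit/s link has headroom to spare.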
The problem with backhaul isn't the availability of fiber, it's the _concentration of bandwidth_.
A 5G basestation is called a gNB, and a handset is called a UE. This project doesn't involve the radio interface of 5G, so neither of those are included. There's another comment on the post outlining how you might go about standing up an actual network using this.
If you want to add commercial base stations to it, you'll find a few non-standardised things missing from all "open source" solutions - you'd need base station orchestration and control, to handle configuring and managing the base stations and getting them to behave in the ways you want.
For one or two base stations you might cope without it, if you can figure out how to configure the radios, and they happen to work. But generally, there's a load of integration to be done that isn't part of the standards, and that you'll need in addition to this. You'll also need to handle scaling up and clustering some of the components that need to scale up with traffic.
In a way I really hope they won't. How could anything good come out of that?
From an engineering perspective: the software stack is conceptually (if you squint enough) a bit like webkit/chromium, but on top of that you also need cutting edge radio/antenna development.