Average current draw at the moment is around 3 mA to keep an LTE-M module (using a QCOM chipset) connected to the network in DRX mode. Bandwidth-wise, LTE-M gives us between 500 bytes and 3.5 KB per second on the AT&T network - it's not terrible, and the byte cost from AT&T is also "low".
LTE-M network coverage is good - the same as AT&T LTE as far as we can tell (LTE-M is just an extension to the base station feature set and doesn't need new infrastructure).
Only when AT&T starts to support eDRX and PSM will a "month-long" battery be realizable for portable products like this.
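A back-of-envelope sketch of what that 3 mA average implies; the 2500 mAh cell capacity is my assumption (a typical AA alkaline), not a figure from the comment:

```python
# Rough battery-life estimate at the quoted DRX-mode average current.
AVG_CURRENT_MA = 3.0    # average draw keeping the LTE-M module connected
CAPACITY_MAH = 2500.0   # assumed: one AA-class cell

hours = CAPACITY_MAH / AVG_CURRENT_MA
days = hours / 24
print(f"{days:.0f} days")  # ~35 days -- roughly the "month-long" ceiling
```

Which is why eDRX/PSM support matters: the average current, not the peak, sets the lifetime.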
It's essentially a long life "tag" (10 years depending on duty cycle) that can read sensors and go direct to satellite. Funded by Boeing. I'm not involved in the company, but in the last 20 years have regularly crossed paths with one of the founders at Information Theory conferences. He's a very smart guy.
We also open sourced our module design (EE-02) that uses nRF52 as the MCU: https://github.com/ExploratoryEngineering/ee0x-hardware
You can buy the modules here, but I think stocks might be running low. (However, if anyone needs larger numbers of modules we can put you into contact with the factory we use for manufacturing them): https://shop.exploratory.engineering/
We work for a telco, and this was just a learning exercise to understand the opportunities and challenges with LPWA while waiting for NB-IoT chipsets and network rollout. We mostly set up the shop to have an orderly way for people to get the devices from us if they want to experiment (free stuff has a tendency to end up lining people's drawers, unused).
We built everything from (and including) the devices to the applications, and everything in between minus the LoRa gateway. The attitude being: "okay, we can read a bunch of datasheets and PPT decks, but we're not going to know anything unless we get hands-on". So we did.
The modules were just a way to get the radio and MCU bits into a reusable module, which we then used for about a dozen prototypes. We run Apache Mynewt on them (the EE-02 was actually used to develop LoRa support for Mynewt).
Since Nordic Semiconductor are in the same building the nRF52 was a pretty obvious choice (I think we might have been the first customer). They make lovely chips. And I'm not getting paid to say that :-).
We did think about building a gateway as well, but we felt it didn't add much to the exercise.
And now? Now we are doing the whole exercise again for NB-IoT, but this time it is to apply what we've learnt. That being said: we plan to open source stuff in the future. Both hardware and software.
The short version is that all the intelligence is in the Network Service. The gateways are mostly responsible for converting packets into RF and RF into packets. Gateways talk to the network service over some backhaul (Ethernet, WiFi, 4G, etc.).
(The Network Service is implemented in Go. The frontend is in a separate repository and written in Vue.js.)
Especially if you are in an urban/sky-constrained environment.
Transmit is on average 4.1 watts. I get that for satellites that's pretty low power, but that's still about 36 dBm (on the order of 30 dBm radiated, assuming losses). Compare that to the peak of 23 dBm for LTE-M or 14 dBm for LoRa.
Now the OP author doesn't seem to grasp that the reason Bluetooth LE can last so long and update so often is that it has limited range. Hiber at a minimum has to reach 600 km, more when the satellite is not directly overhead. LTE-M only has to reach a few km at most, and LoRa has tiny packets with a low duty cycle.
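A rough sketch of the link-budget gap being described here. The 900 MHz carrier is an assumed, illustrative frequency (not from the comment); the point is only the relative difference between a 600 km satellite hop and a ~2 km terrestrial cell:

```python
import math

def w_to_dbm(watts):
    # dBm = 10 * log10(P / 1 mW)
    return 10 * math.log10(watts * 1000)

def fspl_db(dist_km, freq_mhz):
    # Free-space path loss: 20*log10(d_km) + 20*log10(f_MHz) + 32.44
    return 20 * math.log10(dist_km) + 20 * math.log10(freq_mhz) + 32.44

print(f"4.1 W = {w_to_dbm(4.1):.1f} dBm")           # ~36.1 dBm
print(f"600 km FSPL: {fspl_db(600, 900):.0f} dB")   # ~147 dB
print(f"  2 km FSPL: {fspl_db(2, 900):.0f} dB")     # ~98 dB
```

That ~50 dB of extra path loss (a factor of ~100,000 in power) is why a satellite link can't play in the same power class as a short-range radio.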
Absolutely makes sense you're working with Kongsberg for earth station services, since their facilities are well positioned for uplink/downlink from polar orbit and highly inclined orbit LEO satellites.
I know it has been shown in testing to reach 200 Mb/s - call me when real-world usage data can show that kind of result.
In reality, the kind of progress that @barbegal mentions has done more to advance device capabilities, but for something that was initially sold as a 10x speed improvement over 3GPP, it hasn't really panned out.
The telecom folks make a bunch of noise about how you can replace a home connection with these kinds of technologies, but until prices come down and speeds go (way) up, the wireline people don't have much to worry about.
I've often found the most limiting factor (beside RF interference) is not LTE directly but the backhaul connections at cell sites. If you only have a 1 gig ethernet link, that limits your speeds when you get a handful of active users.
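A trivial sketch of that backhaul ceiling; the user counts are illustrative and assume the link is shared evenly:

```python
# A 1 Gbps backhaul link divided among simultaneously active users.
LINK_MBPS = 1000
for users in (5, 10, 20):
    print(f"{users} active users -> at most {LINK_MBPS // users} Mbps each")
```

So even if the radio layer could deliver 150+ Mbps per client, a handful of heavy users on one site saturates the wired side first.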
Tried it just now at my house on my iPhone SE: 29 Mbps down, 28 ms ping. One floor directly above my WiFi router I'm getting just 30 Mbps (7 ms ping, however) over WiFi, so the LTE performance isn't too shabby.
Random result in Stockholm, in a relatively built-up office district:
I've never found myself disappointed with LTE performance; even the RTT is "good enough" for 90% of my applications.
These numbers typically don't reflect a single client, and it would be foolish to present them as if they did. Real-world numbers for LTE are an order of magnitude better than anything previous. It's often better than a good WiFi connection; I just don't see the point in using WiFi anymore.
In Austria, you can get unlimited 21-25 Mbps LTE internet without a contract for about 20€ per month. It may not be the fastest, but it's enough for Netflix.
Granted, at scale it's not 150 megs a second, but that's partly down to resources.
On the train I can get between 40 and 80 megabits per second at peak. The average speed is around 22 megabits: https://opensignal.com/reports/2018/04/uk/state-of-the-mobil....
The issue with NB-IoT right now is chicken vs egg.
NB-IoT stuff currently is limited to expensive modules.
Why? Because the carriers aren't rolling this out yet and when it is it's expensive. So, nobody is going to make a chip for it since the volume isn't going to be there for a while (Intel, for example, just pulled out of making NB-IoT chips). So, the badly integrated systems are going to be battery hungry and expensive, which means that the carriers don't see any volume and consequently don't feel the need to work very hard rolling it out.
Once the carriers finally get NB-IoT stuff universally deployed (even if it's expensive) then the silicon vendors will go to work on integrating everything into a single chip.
Once that happens, someone will say: "Hi, T-Mobile, I have 10 million devices I would like to activate and connect. Would you like to do this or should I go talk to Verizon, AT&T, etc." and the prices will come down.
This will follow the same trajectory as cellular data vs cellular voice.
Background: Intel has serious production issues with their latest 10 nm process, meaning their x86 chips and LTE modems for Apple are filling most of their capacity to manufacture chips on the existing 14 nm process. I doubt Intel has LTE modem designs that work on older processes, and NB-IoT being unproven, low volume, and needing low-power chips means it could really use the best process available. Thus, with no slack capacity to build chips, why spend engineering time on something you can't build in quantity?
If 10nm were usable, Intel would likely have numerous side projects filling their older 14nm fabs. Sadly this won't be the case for Intel anytime soon.
That is very highly dependent upon the system.
Some systems are all about leakage--these spend most of their lifetime on a shelf and a couple of days actually active.
Some systems are all about sensors or actuators and the communication is in the noise.
Some systems are all about calling home, and those require low TX power consumption.
Seems promising, but I don't see the current consumption documented anywhere yet.
For example, the well-known nRF51: https://devzone.nordicsemi.com/f/nordic-q-a/1657/how-to-mini.... Current can be as low as 20 uA while connected, and 4 uA between advertisements.
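As a sketch of what those sleep currents buy you; the ~225 mAh CR2032 coin-cell capacity is my assumption, not a figure from the linked thread:

```python
# Back-of-envelope battery life at the nRF51 currents quoted above.
CAPACITY_MAH = 225.0   # assumed: one CR2032 coin cell

for label, ua in [("connected, ~20 uA", 20.0),
                  ("between adverts, ~4 uA", 4.0)]:
    years = CAPACITY_MAH / (ua / 1000.0) / 24 / 365
    print(f"{label}: ~{years:.1f} years")
```

That's the gap this tracker has to close: microamps on a coin cell versus milliamps for a cellular radio and GPS.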
LTE-M is a relatively high-bandwidth (well, a couple of kilobytes per second), full-duplex, high-duty-cycle transport layer. Not only that, it's relatively long range (well, kinda).
Bluetooth is short range (<30 m), and at higher bandwidth the range is shorter. Yes, it can be full duplex, but not at low power (I've not validated that claim, though). BLE beacons don't emit a location every 4 seconds, and they don't have a GPS on all the time eating 35 mA.
LoRa has a maximum duty cycle of something like 5% (i.e. it has to be not transmitting 95% of the time). Not only that, it's not designed for real-time two-way communications; its niche is a sensor reporting its state every n minutes.
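A sketch of what a duty-cycle cap means in practice. The comment says "something like 5%"; the EU868 band's common limit is 1%, which I use here, and the 1.5 s time-on-air is an assumed value for a long, slow (SF12-ish) LoRa packet:

```python
# Minimum spacing between packets under a duty-cycle cap.
DUTY_CYCLE = 0.01   # 1% cap (common EU868 sub-band limit)
AIRTIME_S = 1.5     # assumed time-on-air for one slow LoRa packet

min_interval = AIRTIME_S / DUTY_CYCLE  # required gap between transmissions
print(f"one packet every {min_interval:.0f} s at most")  # 150 s
```

Fine for a sensor reporting every few minutes; useless for anything resembling real-time two-way traffic.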
You basically say so yourself by explaining how much higher the bandwidth is.
The thesis of the author of this article still stands. It's not just GPS.
"This mobile tracker is seemingly designed to resemble a popular Bluetooth tracker and sports a LTE-M radio and GPS receiver. But as mentioned here and here, a leopard can’t change its spots."
They try to be as low-power as a Bluetooth tracker, but they just can't be. It will never be truly low power.
On two AA batteries, 10 years. It's plural.
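Rough arithmetic behind that claim, assuming two AA cells in series (~3 V) at a typical ~2500 mAh capacity:

```python
# Average current budget implied by "10 years on two AA batteries".
CAPACITY_MAH = 2500.0        # assumed: two AA cells in series, ~2500 mAh
HOURS_10Y = 10 * 365 * 24

avg_ua = CAPACITY_MAH / HOURS_10Y * 1000  # mA -> uA
print(f"~{avg_ua:.0f} uA average")  # ~29 uA
```

So the whole device - radio included - has to average tens of microamps, which is why the duty cycle dominates the lifetime figure.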
Maybe Amazon or some other big player interested in IoT should acquire LoRa; then, for low-power use cases, LTE-M would have absolutely no chance.