We Can Stop Pretending LTE-M Is a Low-Power WAN (medium.com)
88 points by peburns on Sept 11, 2018 | 65 comments



I'm just in the middle of launching an LTE-M based device that contains a primary power source (a 12V vehicle battery) with a supercap for backup power when the battery dies.

Average current draw at the moment is around 3mA to keep an LTE-M module (using a QCOM chipset) connected to the network in DRX mode. Bandwidth-wise, LTE-M gives us between 500 bytes and 3.5KB per second on the AT&T network - it's not terrible, and the byte cost from AT&T is also "low".

LTE-M network coverage is good - the same as AT&T LTE as far as we can tell (as LTE-M is just an extension to the base station feature set and doesn't need new infrastructure).

Only when AT&T starts to support eDRX and PSM will a month-long battery life be realizable for portable products like this.
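
For a sense of scale, here's a quick back-of-the-envelope in Python on that ~3mA figure (the battery capacities are illustrative assumptions, not from our product):

    # Battery life at a 3mA average draw, for a few assumed capacities.
    AVG_CURRENT_MA = 3.0
    for name, capacity_mah in [("CR2477 coin cell", 1000),
                               ("2x AA alkaline", 2600),
                               ("18650 Li-ion", 3400)]:
        hours = capacity_mah / AVG_CURRENT_MA
        print(f"{name}: {hours:.0f} h, ~{hours / 24:.1f} days")
    # Even an 18650 is flat in ~47 days at 3mA continuous, which is why
    # eDRX/PSM support is the gating factor for month-plus battery life.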


We at Hiber [0] are working on an actual low-power solution, which will give the device a way to transmit once a day to a satellite with a custom 144-byte user payload. The device location is already encoded, so you can focus on filling those 144 bytes with anything you'd like. You'll have global coverage and a central system to extract the data.

[0] https://hiber.global/


We would like to evaluate this. Currently looking at Sigfox, space.fleet and others. I sent a message via the Hiber contact page.


You might also be interested in http://myriota.com/

It's essentially a long life "tag" (10 years depending on duty cycle) that can read sensors and go direct to satellite. Funded by Boeing. I'm not involved in the company, but in the last 20 years have regularly crossed paths with one of the founders at Information Theory conferences. He's a very smart guy.


If you want to experiment with LoRaWAN, we open sourced our Network Service: https://github.com/exploratoryengineering/congress

We also open sourced our module design (EE-02) that uses nRF52 as the MCU: https://github.com/ExploratoryEngineering/ee0x-hardware

You can buy the modules here, but I think stocks might be running low. (However, if anyone needs larger numbers of modules we can put you into contact with the factory we use for manufacturing them): https://shop.exploratory.engineering/


Nice!! Cool that it's open hardware. What's your business model considering you don't want someone buying large numbers from you? :-)


You have no idea how often I am asked that :-)

We work for a telco, and this was just a learning exercise to understand the opportunities and challenges with LPWA while waiting for NB-IoT chipsets and network rollout. We mostly set up the shop to have an orderly way for people to get the devices from us if they want to experiment (free stuff has a tendency to end up lining people's drawers unused).

We built everything from (and including) the devices to the applications, and everything in between minus the LoRa gateway. The attitude being "okay, we can read a bunch of datasheets and PPT decks, but we're not going to know anything unless we get hands on". So we did.

The modules were just to get the radio and MCU bits into a reusable module which we then used for about a dozen or so prototypes. We run Apache MyNewt on them (the EE-02 was actually used to develop LoRa support for MyNewt).

Since Nordic Semiconductor are in the same building, the nRF52 was a pretty obvious choice (I think we might have been the first customer). They make lovely chips. And I'm not getting paid to say that :-).

We did think about building a gateway as well, but we felt it didn't add much to the exercise.

And now? Now we are doing the whole exercise again for NB-IoT, but this time it is to apply what we've learnt. That being said: we plan to open source stuff in the future. Both hardware and software.


Nice! How can we stay up to date?


We have a blog where the team occasionally posts updates.

https://blog.exploratory.engineering/


Can you recommend a good write-up on LoRaWAN architecture? (Curious about the need for a network service). Cheers!


The LoRaWAN spec is surprisingly good. Dense as a bucket of lead shot, but usable to write an implementation :-). https://lora-alliance.org/sites/default/files/2018-04/lorawa...

The short version is that all the intelligence is in the Network Service. The gateways are mostly responsible for converting packets into RF and RF into packets. Gateways talk to the network service through some backhaul (Ethernet, WiFi, 4G, etc.).

(The Network Service is implemented in Go. The frontend is in a separate repository and done in vue.js)
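
If it helps to make that concrete, here is a rough Python sketch of the uplink direction, loosely modeled on the Semtech UDP packet-forwarder style that many gateways speak (the server address, EUI, and rxpk values are made-up placeholders, and this is not Congress's actual ingest API):

    import json, os, socket, struct

    GATEWAY_EUI = bytes.fromhex("0102030405060708")   # hypothetical gateway ID
    rxpk = {"freq": 868.1, "datr": "SF7BW125",        # RF metadata
            "rssi": -92, "data": "gAQDAgEAAAA="}      # base64 LoRa PHY payload

    # version 2, random 2-byte token, 0x00 = PUSH_DATA, then EUI + JSON body
    packet = (struct.pack("!B2sB", 2, os.urandom(2), 0x00)
              + GATEWAY_EUI + json.dumps({"rxpk": [rxpk]}).encode())

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(packet, ("networkservice.example.com", 1700))

The gateway never interprets the payload; dedup, MAC-layer handling, and decryption all happen in the network service.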


With respect, Iridium is not low power. Even the small "IoT" modem https://www.iridium.com/products/iridium-core-9523/ requires 2.5 watts peak. Even transmitting once a day, you're lucky to get a month's worth of reports from anything less than a 10Wh battery.

Especially if you are in an urban/sky-constrained environment.


We are not running on Iridium, but our own satellites and frequencies.


Sorry, yes, I should have read this first:

https://support.hiber.global/hc/en-us/article_attachments/36...

Transmit is on average 4.1 watts. I get that for satellites that's pretty low power, but that's still on the order of 30dBm (assuming losses). Compare that to the peak of 23dBm for LTE-M or 14dBm for LoRa.

Now, the OP author doesn't seem to grasp that the reason Bluetooth LE can last so long/update so much is because it's got a limited range. Hiber at a minimum has to reach 600km, more when the satellite is not directly above. LTE-M only has to reach a few km at most, and LoRa has tiny packets with a low duty cycle.
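
The dBm comparison is easy to put in absolute terms; a tiny Python conversion (radiated power only, so actual battery draw is higher once PA efficiency is factored in):

    # P_mW = 10 ** (dBm / 10)
    for name, dbm in [("Hiber uplink", 30), ("LTE-M peak", 23), ("LoRa", 14)]:
        print(f"{name}: {dbm}dBm = {10 ** (dbm / 10):.0f} mW radiated")
    # 30dBm = 1000mW, 23dBm = 200mW, 14dBm = 25mW: every 10dB is 10x,
    # which is the price of closing a 600km+ link to a satellite.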


Claimed pole to pole coverage tells me this is based on an embedded Iridium SBD modem.


Nope, it is our home made solution, with our own frequencies and satellites.


What are the NORAD IDs for your satellites, and TLEs?

edit: https://space.skyrocket.de/doc_sdat/hiber-1.htm

edit: https://www.satellitetoday.com/ground-systems/2018/08/08/dut...

Absolutely makes sense you're working with Kongsberg for earth station services, since their facilities are well positioned for uplink/downlink from polar orbit and highly inclined orbit LEO satellites.


Hi, I had a look at your tech, quite interesting, but saw no open positions. Do you see any openings in the short-to-medium term?


That's correct, however you can send your CV to maarten@hiber.global, the CTO.


Let's be honest, LTE was always oversold, glossing over its technical limitations. The first reference to it that I can recall seeing (years ago) was actually describing LTE Advanced (~150 Mb/s), but conveniently left out that information.

I know that it had been shown in testing to reach 200 Mb/s - call me when real-world usage data can show that kind of result.

In reality, the kind of progress that @barbegal mentions has done more to advance device capabilities, but for something that was initially sold as a 10x speed improvement over 3G, it hasn't really panned out.

The telecom folks make a bunch of noise about how you can replace a home connection with these kinds of technologies, but until prices come down and speeds go (way) up, the wireline people don't have much to worry about.


I have never been unimpressed by LTE. It's reliably fast, even if it's not breaking records. I know that if I have a good-to-full LTE signal and my shoddy 1Gb Spectrum connection or my WiFi has issues, I can just disable my phone's WiFi and get a great, stable connection. Yeah, it's not going to replace my home internet, but it is incredibly better than 3G. I know if my phone's on 3G it's not even worth using for anything other than phone/SMS. Definitely not MMS.


LTE Advanced with carrier aggregation can certainly deliver on speeds. Here is a fresh speedtest taken just now, where I'm getting 185Mbps down. I'm inside a building but only one block from the cell site, and I'm getting 3x carrier aggregation on 2.5GHz (band 41) on Sprint. I'll admit these are pretty ideal circumstances for a speed test, but I just opened the app and ran it to show the speeds are possible.

I've often found the most limiting factor (besides RF interference) is not LTE directly but the backhaul connections at cell sites. If you only have a 1 gig Ethernet link, that limits your speeds once you get a handful of active users.

http://www.speedtest.net/my-result/a/4239773416


I’ve gotten 150 Mbps down using the Ookla Speedtest app on my iPhone 6s at the strip mall nearest to me. (Nowhere special, suburban Annapolis.)

Tried it just now at my house on my iPhone SE: 29 Mbps down, 28 ms ping. One floor right above my WiFi router, I'm getting just 30 Mbps (7 ms ping, however) over WiFi, so the LTE performance isn't too shabby.


That's fair, although I suspect the infrastructure is more developed in certain areas because the Naval Academy is there. The SE result is pretty typical from what I've seen. When I only had 25/5, I'd be really excited by such a thing, but 100/35 has ruined me.


My experience with LTE in the US (primarily west coast) has left a lot to be desired but that’s not always the case overseas.

Random result in Stockholm, in a relatively built-up office district:

http://www.speedtest.net/my-result/i/2834370481

I’ve never found myself disappointed with LTE performance; even the RTT is “good enough” for 90% of my applications.


LTE is awesome. The latency improvement alone is huge and cannot be overstated; it truly is a game changer.

Those headline numbers typically don't reflect a single client, and it would be foolish to present them as such. Real-world numbers for LTE are an order of magnitude better than anything previous. It's often better than a good WiFi connection; I just don't see the point in using WiFi anymore.


> The telecom folks make a bunch of noise about how you can replace a home connection with these kinds of technologies, but until prices come down and speeds go (way) up, the wireline people don't have much to worry about.

In Austria, you can get unlimited 21-25Mbps LTE internet, without a contract for about 20€ per month. It may not be the fastest, but it is enough for Netflix.


Depends where you are. In London there is an ISP that does replace your internet connection with LTE.

Granted, at scale it's not 150 megs a second, but that's partly down to resources.

On the train I can get between 40 and 80 megabits a second peak. The average speed is around 22 megabits: https://opensignal.com/reports/2018/04/uk/state-of-the-mobil...


Just sitting here behind my desk, running a speedtest, I get 210Mbit (!!) out of my iPhone 7 over the T-Mobile network. No optimisations at all, not on a high roof or taped to a mast. Doesn't get more real-world than that.


In central London on EE I can get 140Mbps sometimes and it’s usually above 80Mbps.


This is the kind of blather I expect from business types without hardware experience.

The issue with NB-IoT right now is chicken vs egg.

NB-IoT stuff currently is limited to expensive modules.

Why? Because the carriers aren't rolling this out yet, and where they are, it's expensive. So nobody is going to make a chip for it, since the volume isn't going to be there for a while (Intel, for example, just pulled out of making NB-IoT chips). So the badly integrated systems are going to be battery-hungry and expensive, which means the carriers don't see any volume and consequently don't feel the need to work very hard on rolling it out.

Once the carriers finally get NB-IoT stuff universally deployed (even if it's expensive) then the silicon vendors will go to work on integrating everything into a single chip.

Once that happens, someone will say: "Hi, T-Mobile, I have 10 million devices I would like to activate and connect. Would you like to do this or should I go talk to Verizon, AT&T, etc." and the prices will come down.

This will follow the same trajectory as cellular data vs cellular voice.


Intel pulling out is far from a canary in the coal mine. The rest of your post is fairly accurate though!

Background: Intel has serious production issues with their latest 10nm process, meaning their x86 chips and LTE modems for Apple are filling most of their capacity to manufacture chips on the existing 14nm process. I doubt Intel has LTE modem designs that work on older processes, and NB-IoT being unproven, low volume, and needing low-power chips means it could really use the best process available. Thus, with no slack capacity to build chips, why spend engineering time on something you can't build in quantity?

If 10nm were usable, Intel would likely have numerous side projects filling their older 14nm fabs. Sadly this won't be the case for Intel anytime soon.


Actually, ultra-low-power devices (i.e. months- or years-long battery life) are probably better suited to older technology nodes, say 40+ nm for bulk, or 22+ nm for SOI. Standby leakage power is going to be too high on the smaller nodes. Lasting months/years on a coin cell means sub-1uA standby combined with infrequent, short-lived wake cycles. There's no real way around this until you go to energy harvesting, and right now that's limited to very, very low power, single-function devices.


Excellent post -- a rarely understood truth. For low-power wireless systems, though, the bulk of the energy is still spent in TX and RX. But realistically, the quiescent power drain still needs to be in the sub-5uW range to deliver long life on a small battery. A lot of GNSS and LTE SoCs standby in the 50uW range, which is just too much. Most of the LTE devices I've surveyed (M1, NB-IoT) are expected to be attached to MCUs, but they internally run a small Linux platform. Obviously, that makes it hard to get down to 5uW.
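
To see why those standby floors matter, a quick Python sketch of coin-cell life (all figures are illustrative assumptions, not measurements of any particular SoC):

    CELL_MAH = 220.0                                  # CR2032-class coin cell
    WAKE_MA, WAKE_S, WAKES_PER_DAY = 10.0, 0.5, 24    # brief wake bursts

    for sleep_ua in (0.8, 17.0):     # sub-1uA vs ~50uW at 3V (~17uA)
        burst_ua = WAKE_MA * 1000 * WAKE_S * WAKES_PER_DAY / 86400
        avg_ua = sleep_ua + burst_ua
        years = CELL_MAH * 1000 / avg_ua / (24 * 365)
        print(f"sleep {sleep_ua}uA -> avg {avg_ua:.1f}uA, {years:.1f} years")
    # ~0.8uA sleep gives 10+ years (self-discharge becomes the limit);
    # a 50uW standby burns the same cell in well under 2 years.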


> For low power wireless systems, still, though, the bulk of the energy is spent in TX and RX.

That is very highly dependent upon the system.

Some systems are all about leakage--these spend most of their lifetime on a shelf and a couple of days actually active.

Some systems are all about sensors or actuators and the communication is in the noise.

Some systems are all about calling home, and those need low TX consumption.


I am excited to see this come out:

https://www.nordicsemi.com/eng/Products/Low-Power-Cellular-I...

Seems promising, but I don't see the current consumption documented anywhere yet.


What will prompt carriers to roll this out?


The thing is that LTE-M gives you real IP connectivity, in contrast to other true LPWAN technologies built on datagrams that are small enough to make any real cryptography a non-trivial challenge, and that can be transmitted only once per minute/hour/day. If there is any unserviced niche, it is duplex bursts of realtime traffic (i.e. what you usually get from satellite telematics, but for IoT it has to be a few orders of magnitude cheaper).


A few things. (1) If the payload is shorter than the key length (e.g. 128 bits), then crypto is actually really strong (and easy). For long payloads, you need to work a lot harder. (2) For a public key handshake, you can do it without a huge packet using one of the elliptic curve algos, but for any public key handshake you need a reasonably short duration of the handshake in order for it to be adequately secure. That's a bigger challenge than the data size. Low-power LTE Cat M1 or NB-IoT systems aren't any better at practically serving low-latency sessions than most LPWANs are. (3) "Secure elements" should be called "insecure elements" when the items they are installed in are remotely operated in public environments. (4) Carriers tend to use TCP/IP with LTE, which is more of a limitation than a benefit, because it prevents any manner of broadcasting. There are a lot of applications that can be low power if they are broadcast, but will never be low power (or easy) via TCP. (5) The direction LTE has gone, in my opinion, makes it undesirable for a lot of applications. It is nice for low-volume monitoring applications. So in a sense I think LTE as an LPWAN is really a great solution for niche use cases.
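
A minimal sketch of point (1) in Python, using the 'cryptography' package (key handling and authentication are deliberately out of scope; the payload is a made-up example):

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key = os.urandom(16)              # 128-bit key
    nonce = os.urandom(16)            # must never repeat for a given key
    payload = b"12-byte fix!"         # e.g. a compressed sensor reading

    ct = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor().update(payload)
    print(len(payload), len(ct))      # CTR is a stream mode: 12 -> 12 bytes

The encryption itself adds nothing; the pain on tiny datagrams is carrying a unique nonce (or deriving it from a frame counter, as LoRaWAN does) plus an authentication tag, which can easily dwarf a sub-20-byte payload.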


Your claim (1) looks a lot like hubris to me unless you're imagining a product that sends only a single payload during its lifetime.


I’ve done a lot of prototyping with LTE, GPRS, LTE-M, and NB-LTE chipsets (along with having previously worked for a Bluetooth tracker company), and I think this article draws a false conclusion. The terrible battery life of these example products is caused much more by the power-devouring GPS or a lack of adequate battery capacity than by the LTE-M chipset.


If you think so, you have to state the power use of the chipsets you've used.

For example, the well-known nRF51: https://devzone.nordicsemi.com/f/nordic-q-a/1657/how-to-mini... It can be as low as 20 uA; between advertisements, 4 uA.


It's down to duty cycle. That figure is the average consumption. I can make most chipsets consume 20uA; it's trivial, just don't take them out of hibernation.

LTE-M is a high-bandwidth (well, a couple of kilobytes per second), full-duplex, high-duty-cycle transport layer. Not only that, it's relatively long distance (well, kinda).

Bluetooth is short range, <30m, and at high bandwidth shorter range still. Yes, it can be full duplex, but not at low power (I've not validated that claim though). The BLE beacons don't emit a location every 4 seconds, and they also don't have a GPS on all the time eating 35mA.

LoRa has a maximum duty cycle of something like 5% (i.e. it has to be not transmitting 95% of the time). Not only that, it's not designed for realtime two-way communication; its niche is reporting the state of a sensor every n minutes.
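
The whole argument is one formula; a Python sketch with illustrative numbers:

    # Average current = sleep floor weighted against active bursts.
    def avg_current_ua(sleep_ua, active_ma, duty):
        return sleep_ua * (1 - duty) + active_ma * 1000 * duty

    print(avg_current_ua(2, 100, 0.00005))   # beacon-like duty: ~7 uA
    print(avg_current_ua(2, 100, 0.05))      # 5% duty: ~5mA average
    # Any chipset "does 20uA" in hibernation; the protocol decides the
    # duty cycle, and the duty cycle decides the battery.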


You won't be able to achieve the BLE figures with cellular technology (even without GPS).

You basically said so yourself by explaining how much higher the bandwidth is.

The thesis of the author of this article still stands. It's not just GPS.


I can, if I don't turn it on, which is where the 20uA figure comes from: keeping the device in hibernation.


Exactly. Duty cycle is something we had to be very aggressive about even on the BLE side.


Er, what relevance does BLE power consumption have to LTE-M, NB-IoT, etc.?


Mmm. That's the thesis of the article?

"This mobile tracker is seemingly designed to resemble a popular Bluetooth tracker and sports a LTE-M radio and GPS receiver. But as mentioned here and here, a leopard can’t change its spots."

They try to be as low power as a Bluetooth tracker, but they just can't. It will never be actually low power.


One (or a few) crappily implemented products doesn't prove a point to me. I'm bound by NDA on my prototypes, but I'll do a little digging for public info I can share.


BLE was included to demonstrate that "low power" for IoT devices is conventionally expressed in terms of years (of battery life). This is not saying BLE is an LPWAN, but in the minds of customers the expectation, as with LoRa, Sigfox, or active RFID, is that low power means years of battery life.


Show me a BLE device that has years (plural) of battery life.


https://www.blueupbeacons.com/index.php?page=maxi

On two AA batteries, 10 years. It's plural.


Ok, I partially stand corrected. But let's get a few facts straight. First, the default-config battery life is 4 years, not 10. Anybody can make a device that's not useful for the intended purpose last forever. That said, the default config is a very reasonable 400ms iBeacon with a Nordic chipset and a transmit power of -8dBm. The important thing to note is that it's using 2600mAh AA batteries. That's not your off-the-shelf AA, but a very high-capacity version (roughly 2.6x more energy each). Another important thing to note is that it doesn't do any sensor reading or two-way communication; it's just doing a blip of radio broadcast 2.5x a second. Still true BLE, yes, but also very much comparing apples and oranges to an LTE-M chipset and literally anything you'd do with it.
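
Working backwards from those specs in Python (assuming the two AAs are in series, so they add voltage, not capacity):

    CAPACITY_MAH, YEARS = 2600.0, 4.0
    avg_ua = CAPACITY_MAH * 1000 / (YEARS * 365 * 24)
    print(f"implied average draw: {avg_ua:.0f} uA")   # ~74 uA
    # ~74uA average for a -8dBm advertisement every 400ms is plausible for
    # an nRF5-class beacon; an LTE-M modem's idle floor sits far above it.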


No idea why this article takes the view it does on LTE based on one device... We do low-power data loggers on 3G and are now just starting to do LTE, and power usage is really good for LTE compared to 3G. We aren't rechargeable (though we do have options for that), and we get 3+ years depending on the scenario. We have thousands of devices in the field and have been doing this for quite a number of years. LTE is going to let us do more for the same lifespan, or the same for a longer lifespan.


Hi Keith, I am interested, who is "we"?


That's absolutely true; the real low-power option is technically LoRa, it's just that LoRa is too small compared to all these cellular giants.

Maybe Amazon or some other big player interested in IoT should acquire LoRa; then, for low-power use cases, LTE-M would have absolutely no chance.


I would love to see an actual low-power, long-range system: a cross between Bluetooth Low Energy and LTE. Allow devices to transmit at a higher power than Bluetooth and with a much shorter receiver on-time than LTE.


That's a really hard problem to solve, which is why you don't see lots of solutions. DASH7 has been around for a few years, and it addresses the technical aspects of this problem, although it still probably needs another year to mature to the point where it's easy to integrate. But if you have a high volume application, it's an option.


Isn't that exactly what Sigfox and LoRaWAN are?


Duplex? Scheduled or unscheduled transmissions? What about LoRaWAN?


On another note, a GPS tracker is probably the wrong application for this assessment, as I would believe that the GPS receiver has significantly larger power consumption than the LTE-M radio.


No, it's usually much less for GNSS. The secret to low-power GNSS is to find a way not to have the receiver on much of the time, and that's very possible. LTE has less flexibility in overhead energy, because there's overhead enforced by the network.
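
A rough Python sketch of the duty-cycled GNSS budget (the current draw, hot-start time, and fix rate are assumed figures for illustration):

    ACQ_MA, ACQ_S = 25.0, 5.0     # tracking burst per fix (hot start)
    SLEEP_UA = 10.0               # backup domain keeping ephemeris/RTC
    FIXES_PER_DAY = 24

    daily_mah = ACQ_MA * ACQ_S * FIXES_PER_DAY / 3600 + SLEEP_UA / 1000 * 24
    print(f"{daily_mah:.2f} mAh/day")   # ~1.07 mAh/day, years on a few AAs
    # The LTE modem can't duty-cycle as freely: paging and TAU timers set
    # by the network put a floor on how long it must stay reachable.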


I'd be amazed if anything with a GPS in it would last seven days, never mind any other radios that are in it.


How about an LPWAN endpoint with GPS that lasts 2+ years? Like this: http://bit.ly/2pBcYqp


If it isn't open source, you can't optimize power efficiency up and down the stack.



