Linux considers it a proprietary protocol, so a Zigbee driver cannot be part of the Linux kernel. Although the Zigbee spec allows non-commercial individuals to use it freely, a commercial organization must be a member of the Zigbee Alliance in order to use Zigbee, which conflicts with the GPL.
At least the article mentions "an open-source approach"; let's see if things get better.
mostly to prevent competition from patent-infringing clones manufactured in China.
Publish a set of tests a device must pass, broadly, to be certified compliant.
Allow manufacturers to test their devices, for a fee, and either receive a certificate or learn where they failed. Much like taking a language exam (say, TOEFL).
That's mental gymnastics. The fear is having to compete against cheaper products (possibly low quality, but possibly not). Even broken DRM is an excellent anti-competitive tool because no one can legally sell a competing product.
IoT devices don't support public key infrastructure, e.g. something like GnuPG or the SSL certificates used by web browsers.
It's a very practical solution to have a shared secret between the device and a central server, and, at registration time, have the central server make a key for both the hub and the device.
Without some kind of key exchange, people could set off your alarm, turn on your lights, open your door...
So no, there will not be a published protocol.
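The registration flow described above (a factory pre-shared secret plus a server-issued pairing key) can be sketched roughly like this. HKDF (RFC 5869) is one common way to derive such keys; all names and the message layout here are my illustrative assumptions, not any vendor's actual scheme:

```python
import hashlib
import hmac
import os

def derive_session_key(shared_secret: bytes, hub_id: bytes, device_id: bytes,
                       salt: bytes) -> bytes:
    """Derive a per-pairing key from the factory shared secret (HKDF-style)."""
    # Extract: mix the secret with a registration-time salt.
    prk = hmac.new(salt, shared_secret, hashlib.sha256).digest()
    # Expand: bind the derived key to this specific hub/device pair.
    info = b"session-key|" + hub_id + b"|" + device_id
    return hmac.new(prk, info + b"\x01", hashlib.sha256).digest()

# At registration time the central server would generate a salt, derive the
# key, and deliver it to both the hub and the device over their existing
# (factory-secret-protected) channels.
salt = os.urandom(32)
key = derive_session_key(b"factory-secret", b"hub-01", b"bulb-42", salt)
```

Both sides end up holding the same key without the device ever needing full PKI support, which is the practical appeal of this design.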
Probably most toxic companies on earth? Yeah, I'm sure it will be a wonder of openness.
I know it has other issues, but particularly since ZWave2, the security model on ZWave far surpasses bluetooth or zigbee
Zigbee etc. try to get around the whole thing by requiring that the pairing process be done from a distance of 5 cm or so.
Imho that is good enough but I don't know how far you could extend it with directional antennas etc.
That anyone can reclaim a device they hold in their hand is a feature.
One thing they mentioned is that apart from RCA jacks, all the surround sound standards are tied up and closed.
FireWire was originally an open standard, but then Apple came along and demanded a symbolic $1-per-device payment.
Despite FireWire being an indisputably better standard from a technical standpoint, that $1 completely soured the mood with OEMs, and they lost the market.
USB, on the other hand, was not really open, but Intel's central leadership guaranteed there was no disarray with association peers throwing out random capricious demands.
Most USB implementers were random Taiwanese OEMs who never formally joined the USB-IF, but Intel was smart enough to turn a blind eye to that.
What? FireWire was a project initiated by Apple in the mid-1980s, which was then further developed by the IEEE P1394 Working Group. Patents and pooling were part of it from the very beginning, with the primary drivers being Apple, Panasonic, Philips, and Sony (there were also a half-dozen odd smaller contributors). I don't remember there being any sort of "originally open standard" part to it whatsoever. Yes, Apple owned the trademarked name "FireWire" for IEEE 1394, while Sony and TI used "i.LINK" and "Lynx" respectively, and Apple initially tried charging extra for that, which was really stupid (but late-90s Apple was pretty dysfunctional). But even without that, wasn't there still the standard $0.25/unit manufacturer royalty?
If you've got some other sources I'd be happy for the trip down memory lane because it's been a really, really long time since that particular battle. I'm not disagreeing that Apple's charge definitely harmed momentum, just that it's not like they came out of nowhere. Though it's worth noting that Firewire was inherently more costly anyway since it required dedicated silicon rather than handing everything off to the CPU. At the time that also gave it vastly more reliable real performance and latency vs USB, and Firewire 400 would typically obliterate USB 2 despite the latter having a sticker speed of 480 Mbps. But it was more inherently costly IIRC.
While VP8 and VP9 arguably aren't Open Standards as such, and they are not patent-free (neither is AV1), they are royalty-free.
USB also had the advantage of Intel pushing it, resulting in almost every PC with an Intel CPU (large majority) having USB ports.
In the mid to late 1990s, USB was scarce. It didn't take off until after Apple replaced its ADB (Apple Desktop Bus) port with USB in the first iMacs. Once that happened, manufacturers flooded the market with consumer products (e.g. CDRW drives) and PC makers followed suit. Up to that point PC makers had standardized on the serial port.
I don't have contemporaneous links at the moment but the second paragraph of a relatively recent (2015) article summarizes:
> But Apple shocked the computing world when it swept its old connectors away with the original 1998 iMac. The bulbous computer adopted a then-struggling standard developed by Intel called USB (Universal Serial Bus). In a hint of what was to become the company’s ability to make or break certain technologies, USB would go on to live up to its “universal” descriptor and become the most prevalent connectivity standard in the world. The solid-and-hollow stacked rectangles of its “A” connector now appear in everything from alarm clocks to airplanes. 
I don't think it's fair to say PC makers "followed suit" in using USB.
The next couple of releases of Windows included totally redesigned USB support.
I didn't say that. I said, specifically, "PC makers followed suit".
Microsoft may have supported USB in their OS, but PC manufacturers lagged.
The poor support in Windows, as well as peripheral OEMs still using other ports slowed its adoption.
Apple’s smart home protocol famously does not support multiple users. Amazon is choosy about which features you can implement (turning alarms on but not off, locking doors but not unlocking them). Google loves using radio hardware no one else supports. Zigbee has delightful legacy security vulnerabilities and consortium drama.
I look forward to these groups putting aside their differences, coming together, and creating a new standard that combines the best of all these anti-patterns.
Allowing someone to holler into an open (or recently broken) window to open my front door sounds terrifying.
Also, just require a passphrase? To go full Star Trek, "[Alexa/Google/Siri], unlock the front door, authorisation singingboyo Alpha Pi Pi Zulu" sounds workable.
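A hub-side check for such a spoken authorization phrase could, hypothetically, look something like this. The normalization, hashing, and storage details below are my assumptions for illustration, not how any shipping assistant actually works:

```python
import hashlib
import hmac

def normalize(phrase: str) -> str:
    # Collapse case and whitespace so "Alpha  Pi" and "alpha pi" match.
    return " ".join(phrase.lower().split())

def phrase_digest(phrase: str, salt: bytes) -> bytes:
    # Store only a slow salted hash, never the passphrase itself.
    return hashlib.pbkdf2_hmac("sha256", normalize(phrase).encode(), salt, 100_000)

def verify_phrase(spoken: str, salt: bytes, stored: bytes) -> bool:
    # Constant-time comparison avoids leaking prefix matches.
    return hmac.compare_digest(phrase_digest(spoken, salt), stored)

# Enrollment: hash the user's chosen phrase once and keep the digest.
salt = b"per-user-salt-0000000000000000"
stored = phrase_digest("singingboyo Alpha Pi Pi Zulu", salt)
```

Of course this does nothing about the "holler through the window" replay problem on its own; that would need the phrase to change or be challenge-based.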
Can you clarify this? I set up my "home" and was able to share it with my wife, who sees all of the same controls in the Home app that I do. We don't have any family sharing, etc set up.
It does? I'm added to my dad's account and can access everything just fine along with my own stuff.
Why should I pay $40+ for a zigbee/z-wave/smart-things/wifi-enabled door sensor when I can get a $35 RF Sonoff bridge and get each sensor for around $4-$8 from china.
So you can save a lot of money by using that tactic with door sensors/window sensors/PIR Sensors/Alarm systems/Sirens/Switches etc. because those will add up massively. You can't use that for TV/Speakers/Cameras ofc but you can bring everything together onto home assistant.
As far as I'm concerned, only the community can decide which protocol will win and these big techs should just put their weight behind a protocol built with the community instead of closed-door-consensus.
> The industry working group will take an open-source approach for the development and implementation of a new, unified connectivity protocol and increase compatibility for consumers.
Curious whether I, as a hobbyist, will benefit from this? Or whether this will become an "it works perfectly, but only if all your devices connect to our certification servers" kind of thing, like Chromecast is becoming.
Their entire ecosystem together has fewer devices shipped than even some no-name OEMs, to say nothing of Xiaomi or Huawei or Tuya, who tower over them.
All kinds of "smart assistants" like Alexa end up in drawers very quickly after the initial novelty passes, and having one creep up on you in the middle of a conversation gets annoying. From the data I have, sales of those smart speakers are already starting to taper off.
In Russia, there is an idiom: "to divide the cake before it's baked." And those guys are doing exactly that: people don't even know what those "connected home" devices are or which ones sell well, yet they are already eager to make up standards for them.
That's the guy who flopped with Silk Labs.
I genuinely think the Amazon team (at least in that particular regard) want to do some good. But until they can teach their machines how to understand context, I just don't want my unfiltered conversations going around potentially to medical institutions or law enforcement.
It would be far more palatable for the devices to wait for a command cue ("Computer--") and respond with an activation bleep. After the bleep, the commands begin to be interpreted.
Instead we have a listener always awaiting commands. What could be a helpful and invisible servant is instead some kind of jerk who interjects with the most literal interpretations of normal conversations.
If I wake up in the morning feeling grumpy (every day) and say some crazy crap (totally possible) on my way to the can, will an Apple contractor be able to figure out what the hell I really wanted by reviewing those seconds of audio?
I have made death threats to wall hanging photographs in those 30 minutes before my medication kicks in. There is no checkbox for this in the privacy settings. I know with some of these smart things you can change the prompt, but this feels like not the best we can come up with.
This can be turned on in the Home app → Accessibility → "Play start sound" (as well as "Play end sound").
Wait, what? Please explain more.
It feels like you're saying Alexa heard you being ... passionate, and got concerned. But my understanding was that Alexa listens only after the trigger word. I'm really confused by what you've said and wish to know more context.
What I can say about the Google equivalent is that I find saying 'Hey Google' every time I want it to do something a bit of a mouthful, especially if you want to do several things in short succession.
And my other problem is that I apparently say 'OK cool' too often when I'm on the desk phone at work, as my Google account is full of recordings of bits of my work phone conversations where I've triggered it unwittingly.
Whoa, is it really necessary to call BlueTooth out like that?
“It's easier to tell when you're not using BLE. :)
The Tesla Model 3 uses traditional Bluetooth for phone calls and streaming but the Phone as a Key functionality is BLE. When you walk up to the car and try to open it and the car says FU then BLE isn't working.
When your Xiaomi Mi Band smartwatch hasn't buzzed all day but you pull your smartphone out of your pocket and have 8 missed calls, 100 messages, and 500 emails then BLE isn't working.
When you're at a Tech Conference and the Conference App uses BLE Beacons to help navigate you indoors and it can't determine your location then BLE isn't working.”
This seems like a fair compromise to me.
I've been using it on a Raspberry Pi to control various things and recently decided to shape it up and release it to the public.
You can access stuff remotely if you have an Apple device which is paired to the homekit stuff and connected to the internet (e.g. an Apple TV, we use an iPad which is always at home).
I've deliberately designed my system to work offline as while our connection's pretty reliable I don't see why I should need an internet connection to turn on a light! :)
I hope manufacturers see it the same way.
I don't mind devices having to be certified; I mostly want to buy hardware off the shelf anyway, for safety and convenience reasons, and build the controlling/automation part myself. So my biggest worries are not having a local API, data exposure, and having to invest in an ecosystem only to have the manufacturer brick it remotely, wasting my money.
So only the "device" part, sadly not the "controller" part. So you'll still need an iOS device to set up stuff in your house; I tried it yesterday with only my iMac, to no avail.
But still it's a step in the right direction.
Not a good idea with OEMs, and that's why I believe they raised the white flag now: no adoption.
They either jump on the smart home bandwagon now, or never.
Most OEMs chose an app + proprietary protocol approach.
Here's a discussion of the master key being leaked:
But I looked into it before that and never tried to look it up again.
So likely so, and with today's encryption it probably won't get hacked for hobbyists to use/learn/play with, but of course I guess the argument is that the Hue Bridge and other devices will have an API.
I’m still looking for other devices to mesh into a ZB3 network: I’d love to see an OpenTherm-capable controller able to use any connected thermometer and heating element valves to modulate heat flux production and distribution around my flat. Might be overthinking it though... it’s so well thermally insulated.
Also, using mobile.twitter.com (as linked) I see a single reply, if I delete `mobile.` I see no replies. Interesting.
I.e. I just want a thermostat that is a big rotating button and speaks MQTT. It does not exist. If you want it to look good you end up with a Nest thermostat. Home Assistant needs to talk to the Nest online API, not to the device itself. Really annoying and unnecessary. I wish I could just pay $50 more and get a Nest that does let me talk to it locally. Or whatever they're going to earn with my data; I'd probably pay it straight up.
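For what it's worth, the MQTT contract such a thermostat would need is tiny. A minimal local-only sketch, where the topic layout and the bang-bang control logic are my assumptions, not any real product's API:

```python
import json

# Hypothetical topic layout for a purely local MQTT thermostat:
#   home/thermostat/set    <- {"target": 21.5}
#   home/thermostat/state  -> {"target": 21.5, "current": 19.8, "heating": true}

class Thermostat:
    def __init__(self, target: float = 20.0, hysteresis: float = 0.5):
        self.target = target
        self.hysteresis = hysteresis
        self.heating = False

    def on_set(self, payload: str) -> None:
        # Handler for messages arriving on the hypothetical "set" topic.
        self.target = float(json.loads(payload)["target"])

    def update(self, current: float) -> bool:
        # Simple bang-bang control with hysteresis to avoid rapid cycling.
        if current < self.target - self.hysteresis:
            self.heating = True
        elif current > self.target + self.hysteresis:
            self.heating = False
        return self.heating
```

Nothing here needs a cloud round-trip, which is the point: the "rotating button" would just publish to the set topic.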
That does mean some systems are not available at all, so "luckily".
GDPR is a joke and easily bypassed because users are overwhelmingly careless and agree to anything without reading.
By making it online service dependent they don't have to care, the experience will be horribly degraded without signing in.
Yet I don't see any mention of making those being able to work completely offline/standalone.
We rely too much on cloud services that ultimately get turned off after an undetermined amount of time.
There is no way I am buying home automation equipment I cannot control myself, especially in a situation where the giants like Google could simply decide to terminate my account because I said or did something they didn't like and take down related systems with it.
I bought a few cheap POE Chinese cameras that I use with Zoneminder but they are all blocked from any internet access except talking to Zoneminder (local).
I definitely agree, although I'd expand that to "talking to any servers at all anywhere for anything I don't explicitly grant permission for". However for that very reason I prefer WiFi/IP devices, because it makes it very easy and straight forward to apply all the powerful network management tools we have for everything else. All devices can go on their own VLANs for example, with careful management and logging of how they communicate. The real shame is that there aren't better, more consumer friendly tools for managing that more visually/automatically.
Custom radios aren't any inherent defense there, already there have been demonstrations of getting right into Z-wave/Zigbee networks using customized SDRs. They have a purpose from an ultra low energy and meshing point of view, but you should be suspicious of what security practices for such things will actually be. WiFi/IP at least has the benefit of tons of open attention and development for security critical situations already.
As for Z-wave/Zigbee, I could be missing a potential security hole but personally I am less concerned with my Z*-devices being hacked and more concerned with IP-devices being hacked and being able to talk to other IP-based devices on my network.
For example, it would suck to have someone be able to hack my door or lights but it wouldn't be the end of the world AND it requires physical access/proximity. This is quite different from someone on the other side of the globe being able to hack a device, hack other devices on my network (non-IoT), and then do something malicious (ransomware, identity theft, etc).
Right, at one point it looked like something like UniFi could show the way there, but Ubiquiti unfortunately has turned into a development dumpster fire and really lost its way, and I don't know of anyone else attempting something similar. The principle remains though that it's another path forward, there are already powerful tools for network control and management, and there are accessible open standards there. Putting a better UX on that is worth considering alongside other solutions is all.
>As for Z-wave/Zigbee, I could be missing a potential security hole but personally I am less concerned with my Z-devices being hacked and more concerned with IP-devices being hacked and being able to talk to other IP-based devices on my network.
For example, it would suck to have someone be able to hack my door or lights but it wouldn't be the end of the world AND it requires physical access/proximity.
A lot depends on where you live. A few years ago for example there were a bunch of articles and demonstrations coming from research into and discovery of vulnerabilities in the ZigBee protocol itself. Because the whole point of it is meshing, if you're in an urban or even suburban environment with sufficient density, then a neighbor being hacked could then hack their neighbors etc in a chain reaction. And of course people had fun immediately putting SDRs on drones and doing a fresh new take on good 'ol war dialing, flying around owning anything they came across. Random example article:
Picked verge vs NYT since I don't think they're paywalled? Lots more though a quick DDG away covering the same thing at the time.
With meshing though, you do have to be somewhat careful about the concept of "proximity" and so on if there are protocol layer problems, which is less of a concern on WiFi for better and for worse. Your home might be locked down, but are you sure your neighbor or neighbor's neighbor and so on and so forth down the chain all have no entry point? I 100% grant it's more of a long term scalability consideration right now for many people, but hey, we're talking about a future protocol here!
Gonna be another exciting decade I guess :)
If you don't mind me asking what networking stack are you using?
Also thank you for the very well thought out and reasoned reply! I wasn't fully aware of some of those attack vectors.
Lastly I think I've been so anti-wifi IoT because of the inherent security issues with literally everything currently on the market. I see the wifi IoT as a bubble about to pop unless routers gain security features for IoT or some other major changes are made to how they work today.
I have a UniFi AP for WiFi, and a EdgeRouterX for route/switch. Of course the EdgeRouterX does not have the Fancy UniFi Management Portal but...
That was a lot less than $800; I think I have maybe $125 in the hardware.
* Cloud key
* 4-port POE switch
* Security gateway
* WiFi AP
I wanted to go all-in if I did it.
Having said all that, their PtP/PtMP links are still nice. Their APs are solid overall, and do have nice industrial design (though no word on WiFi 6, which for a new install I'd consider fairly important). The interface has degraded significantly over the last few versions, but it's still better and more unified than any other I know of. I mean, I'm still running it myself after all. But if you go that route know what you're getting into and look hard for open box and used stuff that'll be cheap. And I'd honestly suggest not bothering with the cloud key and just running the controller yourself, on an RPi or similar if you want something dedicated but cheap or else spin up a VM or container, or even just run native I guess if you've got a server you run otherwise. The CK is also ancient.
In summary: I adored UniFi, and the potential was(is?) fantastic, and their old vision was fantastic, and at one point they were a really solid venture all around. And I know of nothing else with the same vision either. Yet even so I'm expecting to have to dump it overall in the next few years, which sucks. But long bitter experience has taught me that glorious turnarounds are much more the exception than the rule :(.
I might go down the secondhand/used route if I do decide to do it. Right now I’ve got a single all-in-one router running LEDE and I like it but I’m not able to reach more than 60% of my fiber internet so I’ve been looking to upgrade. I decided that if I was going to throw a couple hundred at it I figured I might as well go all in.
It’s always sad to see a company throw away such a promising future. I saw their new AmpliFi “Alien” router and I’m half tempted to buy that and wait a few more years for a better option to present itself. Or even the UDM but it seems like a very odd offering to me... I guess I’ll keep looking, thank you again for the advice.
Was hoping to start with an extra AP or two for wifi coverage and then build out the rest in time.
The messaging would be agnostic of where the source is. It could be a device in the local network, or it could be a cloud-based service (assuming you open up your network).
HomeKit, for example, works locally, either over BLE or WiFi.
I wonder if it's official or just a workaround due to the fact that all the other logos have names and not just an icon like the standard Apple logo.
Here’s another example; same idea, different font:
> The goal of the first specification release will be Wi-Fi, up to and including 802.11ax (aka Wi-Fi 6), that is 802.11a/b/g/n/ac/ax; Thread over 802.15.4-2006 at 2.4 GHz; and IP implementations for Bluetooth Low Energy, versions 4.1, 4.2, and 5.0 for the network and physical wireless protocols.
> The Project intends to leverage development work and protocols from existing systems such as: Amazon’s Alexa Smart Home, Apple’s HomeKit, Google’s Weave, Zigbee Alliance’s Dotdot data models
Dotdot is basically ZCL over IP (in a way), but comes with a lot of legacy from ZCL.
Thread was my hope for a unified smart home network layer, but it didn't really get the adoption I'd hoped, and from a manufacturer's perspective, it did not include any application-layer messaging.
It looks like the goal is to standardize the application layer messaging (of which Dotdot was an attempt). Maybe call it Dotdot v2, but with better backing.
All my home automation / smart home integrations will have to be Z-Wave compatible or I will not use them. If my internet goes out, will all my things be useless?
Second, hopefully this doesn't come across as patronizing but based on your post it sounds like you aren't sure about this. IP is not "The Internet". You can have your own little private IP network which runs without the internet. If this is an open standard there should at least be the option to create a local only hub which only requires your local network is online.
That's not guaranteed considering who is backing this project, but it should be possible.
Seriously now, I see this as a good thing, if only because we're likely to get an interoperability stamp of approval of some kind.
Right now, and were it not for my using an Open Source solution for my Zigbee gateway, it would be impossible for me to hook up Hue, Xiaomi, IKEA and other devices to Homekit without a bunch of different gateways (because some Zigbee endpoints simply refuse to talk to anything other than their own peers).
I also hope that they manage to do this without going the Google way of having everything open up ports on your router (some of the newer Nest-branded stuff already knows how to talk to peers on a LAN, but the security model for Google/Alexa integrations is fundamentally broken for me - WeMo support excluded).
So far HomeKit can run _completely_ on-premises, with all devices interacting on the LAN, which is great (except for remote connections through the Home app to your Apple TV home hub, and Siri voice recognition to trigger scenes), and that is why I decided to stick with it.
And worse, in my "casual" case, because some sort of Philips/Apple agreement outright prevents it :/
E.g. add a "Hue compatible", but non-Philips bulb to your Hue hub. It will work in the Hue app and via Amazon Echo, but Homekit will refuse to see it.
A way to "trick" homekit to be able to use the bulb with it, is to create a Hue scene, and sync it to homekit, but that is terrible for per-bulb control. This stuff is apparently all Zigbee, but the usability situation is an absolute mess.
(Widely reported, for many years, not a bug).
On the other hand, my Philips Hue lights have all been absolutely flawless. They fire on time, every time.
I think implementation of the standards matters a lot more than the standard itself, in practice.
You can even visualise the network if you're using zigbee2mqtt with zigbee2mqtt/bridge/networkmap
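Once you have the map payload back, it's just JSON. A hedged sketch of pulling link-quality edges out of the "raw" network map format; the field names follow what zigbee2mqtt published when I last looked, and may differ between versions:

```python
import json

def parse_network_map(payload: str) -> list[tuple[str, str, int]]:
    """Extract (source, target, lqi) edges from a zigbee2mqtt 'raw' network map.

    Assumes a {"nodes": [...], "links": [...]} shape with ieeeAddr /
    friendlyName / lqi fields; adjust to your zigbee2mqtt version.
    """
    data = json.loads(payload)
    names = {n["ieeeAddr"]: n.get("friendlyName", n["ieeeAddr"])
             for n in data.get("nodes", [])}
    return [(names.get(link["sourceIeeeAddr"], "?"),
             names.get(link["targetIeeeAddr"], "?"),
             link.get("lqi", 0))
            for link in data.get("links", [])]
```

Feeding the edges into graphviz (or just eyeballing the LQI values) makes weak spots in the mesh obvious.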
Despite its walled-garden nature, I’m sad to see HomeKit go. It was the only home automation standard I’d trust in my home, though hopefully Apple’s participation in this new initiative means it will share HomeKit’s emphasis on security, cloud independence, and privacy.
I think I know what I'm talking about, because my smart home automation is based on KNX. I use Alexa voice commands or a KNX app on my mobile devices to control all my KNX-compatible devices. A server receives the commands and allows me to control the lights (on/off/dimming), to inquire about and set the temperature in each room, to control the inner blinds as well as the outer shutters (open/close), and to get water and electricity consumption data from many devices like the cooktop, the oven, the lights, the AC... I also get values from my weather station, like wind speed, wind direction, sun intensity, and much more. I have fire detection sensors, movement sensors, air pollution and water leak sensors which can trigger alarms. I can check on my phone whether I forgot to turn off the oven in the kitchen, or whether the main door is being or has been left open. Through Alexa I have also connected my Roomba as well as my TV and all the media devices connected to it (using the Logitech Harmony hub), but those two things are not KNX; everything else is.
Being able to control all this through Alexa is super fun. When I go to bed I just need to say "Alexa, good night" and Alexa tells my KNX shutters to move down to 100% and all my lights in every room to 0%. When I leave the house I say "Alexa, good bye" and Alexa checks whether my appliances are turned off, turns the lights off, and lowers the heating in all the rooms a bit. Also, as I'm super lazy, if I finish cooking and throw myself on the couch but forgot to turn off the kitchen lights, I just need to say "Alexa, turn off the kitchen lights and turn on Netflix".
What is also nice is that I can program (control and combine) everything myself. I currently use NodeRed (https://nodered.org/). So I can program routines, like "if the time is > this and the front door gets opened send me an email or SMS", if the wind speed is above a certain threshold open the shutters to avoid damage, ...
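The same kinds of rules are easy to express outside Node-RED too. A minimal sketch of the two examples above; the thresholds, function names, and actions are made up for illustration:

```python
from datetime import time

def door_alert_needed(door_opened: bool, now: time,
                      curfew: time = time(23, 0)) -> bool:
    """'If the time is past curfew and the front door opens, notify me.'"""
    return door_opened and now >= curfew

def shutter_action(wind_speed_kmh: float, threshold: float = 60.0) -> str:
    # Open (retract) the outer shutters in high wind to avoid damage.
    return "open" if wind_speed_kmh > threshold else "hold"
```

In a real setup these predicates would be wired to sensor events and to whatever notification/actuator channel you prefer, which is exactly what the Node-RED flow editor does visually.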
Looking at the differences between the KNX standard and the "Connected Home over IP" project.. From the latter's home page :
> By building upon Internet Protocol (IP), the project aims to enable communication across smart home devices, mobile apps, and cloud services and to define a specific set of IP-based networking technologies for device certification.
This seems like a higher level of abstraction than KNX (unless "smart home devices" in the above description includes the kind of individual sensors you mentioned) - and exclusively focused on using the Internet Protocol.
Reading the Wikipedia article on KNX, it does sound like it has all the elements needed for home automation, including what this new standard aims to achieve.
EDIT: Now reading about the ZigBee specs, I find there's a big overlap in protocols/functionality. As a complete newcomer, it's hard to disentangle the pros/cons of these standards.
The Alexa bridge is not perfect; sometimes it has half a second of delay, and twice a year the servers go down and it doesn't respond for a few hours, but besides that I'm very happy with it. By the way, I used this server from ProKNX to connect my Alexa(s) to the KNX server: https://proknx.com/en/news/2017/realknx-2-0-voice-control-al... (it is also compatible with Google Home as well as Apple HomeKit).
I hope the Google, Apple, ... alliance decided to create a new standard for good reasons, but I have doubts, as I can't find anyone who can explain to me what is so bad about the existing standard. Why don't Apple, Google, and the others just join the KNX foundation? The already open and royalty-free KNX standard could simply be built upon!
Most would not show me a price without me first creating an account and logging in, but I did find one German site that would show me a price, which after converting Euros to USD, came out to a little over $300. I also found a UK site that would show a price. That, after Pounds to USD conversion, was over $500.
I did a little searching to try to find out how lights are handled in KNX without bridging out to some completely different system like Hue. All I was able to find was lighting fixtures with KNX control built in. I didn't see bulbs with it built in.
Does this mean that if someone without any home automation decides they want a couple of smart lights, to do it purely with KNX they have to replace some light fixtures?
That's not how most people in the US add some smart lights. Most here do it by buying bulbs that screw into fixtures meant for regular bulbs, and implement the smart stuff in the bulb itself. There is no need to replace fixtures (which you may not be allowed to do if you are renting).
I want a super secure hub that everything connects to. The hub is the only thing that speaks to my router. The hub is super secure & doesn't let devices send data back to their manufacturer. If I buy cheap devices off a flea market like Amazon, I want to sleep safe & know that the hub is preventing that device from messing with any other devices or accessing the internet. The hub can send me notifications and I can send it requests. It would be cool if I could choose to have the main hub database & software based in the cloud or on my local network.
Not sure if this is already possible. If it is, I would love to hear more.
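Conceptually, the hub described above is a default-deny policy engine sitting between the devices and everything else. A toy sketch of the idea, where the device names and policy shape are entirely hypothetical:

```python
# Hypothetical per-device policy a hub could enforce as its firewall rules:
# which devices may reach the internet, and which local peers they may talk to.
POLICIES = {
    "flea-market-plug": {"internet": False, "lan_peers": set()},
    "camera-hall":      {"internet": False, "lan_peers": {"hub"}},
}

def allow(device: str, destination: str) -> bool:
    """Default-deny: a device may only reach destinations its policy grants."""
    policy = POLICIES.get(device)
    if policy is None:
        return False  # unknown devices get nothing
    if destination == "internet":
        return policy["internet"]
    return destination in policy["lan_peers"]
```

In practice this is roughly what VLANs plus per-device firewall rules give you today; the missing piece is a consumer-friendly UI for it.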
The problem with these protocols is although they are more or less open, device manufacturers need to pay to be certified.
I get if you live in a densely populated area that drive-by type attacks would be very concerning.
Not only that. One insecure system in the area could be infected and then controlled to attack other networks in the area. With a high density of IoT devices you could have malware spreading wirelessly, device-to-device.
Also with a wide IoT adoption with such devices being used by public/utilities companies (smart street lamps, leakage monitoring in pipes, smart electric grid etc) you don't even need a densely populated area to have that problem.
>I am more concerned about the bigger threat, global attacks.
I once read something along the lines of IT security needing its own Pearl Harbor event. I can neither quote it exactly nor attribute it to any source.
I see zero reason to rationalise making the buyer purchase yet another white box.
Think about why they can't sell many of these.
1. Their lightbulbs to not be co-opted into DDoS attacks
2. That their cameras are only used by them
3. That others can't control their devices
4. That their devices receive security updates
5. That their devices don't contain back-doors
6. That their devices can be used when isolated from the Internet
... I imagine.
Reading between the lines of how Apple's been handling the "smart home" business, they've been focusing on privacy and security relative to competitors, but it's been holding them back.
I think the market has kind of shown that privacy (e.g. devices that aren't streaming to / dependent on the cloud) and security have not been primary concerns of the people who buy smart home devices, but solving those problems better may be key to enlarging the "smart home" market to include normal people.
While all using the same transport layer they can continue to utilize different “brain” strategies.
Apple’s “brain” has always been in your home where your data belongs and should stay. Google started with the cloud but is moving towards the same model.
On the other hand Amazon seems to have no qualms slurping everything out of your home to their servers and no plans to change.
The more you know.
I used Z-wave for my home, but I'm not doing it "on the up and up;" my hub is homebrewed and using an unlicensed radio. It's a pain in the ass to maintain, but since I can't trust any hub company to survive past five years, it seemed the right call.
I try to only buy Z-wave devices and fallback to zigbee if I have to. It's been a very pleasant experience for me overall. I have 5 z-wave light switches, a handful of zigbee bulbs, zigbee door/temp sensors, and 2 z-wave locks. I plan on replacing all my light switches eventually (I've got like 5 left but all in low-traffic areas of my house).
Big tech have already tricked me into supporting them with their "open platform" bait-and-switch before (Android); that trick won't work on me twice.
If average consumers are told to simply "look for the CHIP logo", and the CHIP standard includes facilities for must-phone-home messages akin to streaming DRM, I'm afraid we'll just get pulled further into the corporate surveillance dystopia.
I appreciate the enthusiasm in the home automation community, and I agree there's a need for an offline solution, but Home Assistant is not it, and we'd be better off scrapping that codebase and starting something better.
Needing to be connected to the internet (and all the associated latency) is an anti-feature that most consumers aren't even aware of or think about, which is why I hope solutions without it aren't crowded out before the market for home automation truly takes off.
I can't remember hearing of auto-bricking installs on hass.io, which might be what you're referring to, but I doubt it.
Provisioning, configuration, and notification of things it can detect (such as motion) are totally in scope. I'd anticipate those being managed as part of this new protocol, with streaming handled simply by providing an endpoint address and maybe an authentication token to connect.
But the whole point of CHoIP is that PHY doesn't matter, so you can configure your camera using IP messages over BLE, and then the camera connects to services over WiFi to stream.
Currently, home automation typically uses a variety of custom protocols built on IEEE 802.15.4 (not 802.11*). It's a simple protocol that can be implemented by a cheap 8-bit microcontroller from 2 decades ago. Devices from one manufacturer may or may not communicate with those from another, and the network architecture is typically a hub with spokes - there's limited or no support for multi-router, multi-access point, or mesh networked setups.
This style of network honestly works pretty well for motion sensors, lights, outlets, temperature sensors, thermostats, etc. The master node might query a device every few seconds for a couple bytes of status, or might only send a short command when a user interacts with the device. Most smart home devices send a few bytes of data four times a day when you push "light on" and "light off" on the hub. This is great for operating for months or years on just a couple AA batteries. A camera sends a few bytes per pixel times (simplifying) 720 pixels down times 1280 pixels across times 30 frames per second times 86400 seconds per day, and can only run for an hour or two on a larger battery. That's almost 8 terabytes vs. 8 bytes.
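That back-of-the-envelope arithmetic can be checked directly (the 3 bytes per pixel and 2 bytes per sensor message are round numbers picked for illustration):

```python
# Daily data volume: a 720p camera vs. a battery-powered sensor.
# 3 bytes/pixel (raw, uncompressed) and 2 bytes/message are
# illustrative assumptions, not measurements.
BYTES_PER_PIXEL = 3
PIXELS_PER_FRAME = 720 * 1280
FPS = 30
SECONDS_PER_DAY = 86_400

camera_bytes = BYTES_PER_PIXEL * PIXELS_PER_FRAME * FPS * SECONDS_PER_DAY
sensor_bytes = 2 * 4  # a couple of bytes, four times a day

print(f"camera: {camera_bytes / 1e12:.1f} TB/day")  # ~7.2 TB/day
print(f"sensor: {sensor_bytes} B/day")              # 8 B/day
```

Real cameras compress, of course, but even at 100:1 compression the gap between the two workloads is around nine orders of magnitude.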
But to use 802.15.4 networks, you need a hub, which is a barrier to entry - instead of one $20 light, you need a $15 light and a $50 hub. You likely already have an 802.11 router that could be the hub if the devices were smart enough to talk to it. And (tinfoil hat on) I think these companies would rather have their servers be the hub, rather than a device in your home they can't monitor and profit off of.
What is the obvious implementation out there? They just have both. They have the cheap 8-bit microcontroller from 2 decades ago transmitting on the low-bandwidth home automation network, and when it is decided that the camera should stream, it wakes up the beefy Ambarella or Socionext SoC to send its video feed over classic WiFi to the vendor cloud, and whoever wants to receive it just gets a stream URL back.
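That split could be sketched roughly like this (the class and method names are invented for illustration; real firmware would be C on the MCU side):

```python
# Illustrative sketch of the two-radio design: a low-power MCU handles
# the 802.15.4 control channel and only powers the video SoC on demand.
# All names here (Camera, handle_command, the URL) are hypothetical.

class Camera:
    def __init__(self) -> None:
        self.soc_awake = False  # beefy video SoC stays asleep by default

    def wake_soc_and_stream(self) -> str:
        # In real hardware: assert a GPIO to power the SoC, wait for it
        # to boot, then have it connect over WiFi to the vendor cloud.
        self.soc_awake = True
        return "rtsp://cloud.example/stream/42"  # made-up stream URL

    def handle_command(self, cmd: str) -> str:
        """Runs on the 8-bit MCU listening on the 802.15.4 network."""
        if cmd == "start_stream":
            return self.wake_soc_and_stream()
        if cmd == "status":
            return "streaming" if self.soc_awake else "idle"
        return "unknown"

cam = Camera()
assert cam.handle_command("status") == "idle"
url = cam.handle_command("start_stream")  # hub hands this URL to the viewer
assert cam.handle_command("status") == "streaming"
```

The design choice is that the expensive radio and SoC never run unless someone actually asked for video, which is how the device keeps sensor-class idle power.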
"Teams that don't matter" today. This market is absolutely rife with opportunity for someone to steal the whole thing with better UX and an interop promise.
In contrast, the Bluetooth SIG only requires a fee to be paid when you actually ship, as a one-off rather than a yearly fee. The cost is roughly similar.
On the surface this "appears" very similar.
EDIT: Yes, I'm aware the software layer is different; I was implying the radios are the same.
Yes the software layer is different.
Edit: basically looking for a way to avoid more wireless transmitters in the house. Or are all outlets "receive only"?
I gather it was:
- might have required you to make a connection in your circuit breaker
- and was designed in the 1970s, when home electrical systems weren't very noisy (apparently all the switching power adapters we have today make a ton of noise and make it hard for the signal to get through).
I don't see why something newer like this couldn't be done -- we have powerline modems, right? Probably not as fast as wifi, but it does go where it is needed and requires physical access to hack.
Some devices are using Zigbee (a different wireless system), but I understand it was developed without much attention to security and isn't hard for third parties to hack into.
Yep, that's inherent.
> - might have required you to make a connection in your circuit breaker
That's one way to do it. Another way just requires a bypass and filter at your circuit breaker, which is much simpler and doesn't have to connect anywhere else.
> - and was designed in the 1970s, when home electrical systems weren't very noisy
This is where things got worse, and then they got better. Yes, electrical lines are very noisy, but the 200 Hz to 100 kHz band is only getting cleaner. Electric motors will interfere with it, so you may need a filter on your blender (though even those are much better now), but this is the prime band for signaling over electrical wiring.
Also, regarding chasd00's comment that it requires capacitor bypasses on transformers: that's only true in the higher bands, and only for devices you want connected to the network. So only the smart devices need to adapt.