Yeah, it really makes you think about what life would be like if intelligence could infuse anything, be it a ship or a datapad, even if his vision wasn't quite how I imagine it would turn out.
I've also seen it suggested that Harry Potter might be a more realistic look at what proliferated AI might be like.
That doesn't counter the argument that it's steering, rather than the gyroscopic effect of the wheels, that keeps the bike from falling over. You'd have to tie off the handlebars with a static line before rolling it in order to show that it's the gyroscopic effect keeping the bike upright.
IIRC if your brokerage reports everything to the IRS properly, you only need to fill out net short- and long-term capital gains on your Schedule D rather than specifying every single transaction on a bunch of Form 8949 copies.
Not sure if Double's underlying brokerage is reporting everything necessary for this to be the case though, as I believe some brokerages don't.
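To make the netting concrete, here's a rough Python sketch with a made-up transaction list (the numbers, dates, and the ">365 days" approximation of the one-year holding-period cutoff are all placeholders, not tax advice):

    # Hypothetical sketch: net short- vs long-term gains from closed positions,
    # i.e. the totals you'd carry onto Schedule D when basis is reported to the IRS.
    from datetime import date

    # (acquired, sold, proceeds, cost_basis) -- made-up numbers
    trades = [
        (date(2023, 1, 10), date(2023, 6, 1), 1200.00, 1000.00),
        (date(2022, 3, 15), date(2023, 8, 20), 5000.00, 5500.00),
    ]

    short_term = long_term = 0.0
    for acquired, sold, proceeds, basis in trades:
        gain = proceeds - basis
        # held more than one year => long-term (approximated here as >365 days)
        if (sold - acquired).days > 365:
            long_term += gain
        else:
            short_term += gain

    print(f"Net short-term: {short_term:+.2f}")  # -> +200.00
    print(f"Net long-term:  {long_term:+.2f}")   # -> -500.00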
If your threat model includes someone with a quantum computer intercepting all of your traffic and storing it to decrypt later, you probably don't want to share your keys over a non-PQC channel unless you can guarantee that they haven't started eavesdropping on your traffic yet.
While sntrup761x25519-sha512 is a quantum-resistant key exchange, sending a key over it doesn't count. It's not really a "pre-shared" key unless the sharing is done using organic, locally sourced sneakers. Unless FIPS, and then it's boots.
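For what it's worth, if you do want "harvest now, decrypt later" protection on the channel itself, recent OpenSSH (9.0 or later) can be pinned to that hybrid post-quantum KEX; a minimal client-config sketch (the host alias is a placeholder):

    # ~/.ssh/config -- pin the hybrid post-quantum key exchange
    Host keyserver.example.com
        KexAlgorithms sntrup761x25519-sha512@openssh.com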
They might have been thinking of the recently discovered hardware backdoor issue, CVE-2023-38606 (see also Operation Triangulation). There was surprisingly little reporting on it.
Do you have a citation for that? It is undergrad-level maths and I struggle to believe the technique is news to the AI people. The mathematicians would have known about it in theory for centuries.
1) Schmidhuber does NOT claim to have invented it. He even provides lots of really old references. You know it's old when he didn't invent it, at least in his own mind.
2) Even with his generous attributions, "the first application of backpropagation to neural networks" is from 1980.
3) "LeCun et al. (1989) applied backpropagation to Fukushima’s convolutional architecture (1979)".
In other words, the chain rule is really old, but figuring out how to use it to adjust weights in neural nets was surprisingly non-obvious. It was even less obvious that this was a good way of adjusting weights.
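For anyone who hasn't seen it spelled out, "chain rule applied to the weights" really is only a few lines; here's a minimal Python/NumPy sketch on made-up toy data (layer sizes, learning rate, and the target are all arbitrary, and this isn't taken from any of the historical papers):

    # Minimal backprop sketch: one hidden layer, squared loss, gradient descent.
    # The chain rule gives dLoss/dW for each weight matrix.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(64, 3))                  # 64 toy samples, 3 features
    y = (X.sum(axis=1, keepdims=True) > 0) * 1.0  # made-up target

    W1 = rng.normal(scale=0.5, size=(3, 8))
    W2 = rng.normal(scale=0.5, size=(8, 1))
    lr = 0.1

    for step in range(500):
        # forward pass
        h = np.tanh(X @ W1)          # hidden activations
        pred = h @ W2                # output
        err = pred - y               # derivative of 0.5*(pred - y)^2 w.r.t. pred

        # backward pass: chain rule, layer by layer
        dW2 = h.T @ err / len(X)
        dh = err @ W2.T                            # push the error back through W2
        dW1 = X.T @ (dh * (1 - h**2)) / len(X)     # tanh'(z) = 1 - tanh(z)^2

        # adjust the weights along the negative gradient
        W1 -= lr * dW1
        W2 -= lr * dW2

    print("final loss:", 0.5 * np.mean((np.tanh(X @ W1) @ W2 - y) ** 2))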
I've glanced through that material, and I still think it is all obvious. It just wouldn't have worked any earlier than when the results started coming out: if they'd tried these techniques in the 50s, it would have been computationally impossible; if they tried them in 2020, it would be computationally trivial.
These results would all have started to happen at about the time the cost of computation came within reach of researchers' budgets. The "theoretical" breakthroughs are of the form "we can implement this technique from the 60s and get good results". That is impressive, but it represents not so much a breakthrough in knowledge as incremental improvements in hardware crossing key thresholds. The breakthrough is noticing that the hardware can now make something work.
It seems to me that we had to wait until decent memory sizes and decent floating-point performance were a lot cheaper and therefore much more accessible => much easier to do experiments without having to justify them to higher-ups => somebody figured out 1) how to do backpropagation on neural nets and 2) that it was useful.
In other words, it wasn't obvious at all. It required experimentation.
It would have been practically useful from the 60s (for small neural nets and high-value problems) and 70s (not-so-small neural nets or lower-value problems) if somebody had figured out how to do it and that it was a useful thing to do.
We might note that it wouldn't have been useful if it had been discovered much earlier. There is a minimum amount of computation required to get good results out of neural nets, and we've only crossed it relatively recently. From some perspectives the technique could be argued to be the most computationally intensive approach to problem solving humans have employed to date.
I want my guests to be able to cast to my TV, add songs to the Spotify queue, etc. As far as I can tell, these sorts of features work via link-local broadcast/multicast discovery (mDNS and the like) and thus require the relevant devices to be on the same subnet.
Things like my printer and wifi-connected grill live on a much more restrictive VLAN (with some firewall rules to allow devices on the trusted network to still print to my printer's hard-coded IP address).
You can do it on some routers (e.g. OPNsense) that let you retransmit that traffic (e.g. with a UDP broadcast relay). The main downside is that you have to set it up for each type of traffic, open ports, troubleshoot a lot, waste many hours, etc.
I used to do this but it became too much of a hassle.
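For anyone curious what those relays actually do, the core fits in a few lines; here's a rough one-way Python sketch (the port, subnet prefixes, and loop-prevention check are placeholders, and real setups also have to handle mDNS multicast and relay replies back the other way):

    # Rough sketch of a one-way UDP broadcast relay: receive discovery datagrams
    # arriving from one VLAN and re-emit them to another VLAN's broadcast address.
    # Addresses and port below are made up; real tools (e.g. the OPNsense relay)
    # also handle multicast, return traffic, and proper loop prevention.
    import socket

    PORT = 1900                    # e.g. SSDP discovery (placeholder)
    GUEST_SUBNET = "192.168.30."   # subnet we relay *from* (placeholder)
    IOT_BCAST = "192.168.40.255"   # broadcast address we relay *to* (placeholder)

    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    rx.bind(("", PORT))            # receive broadcast datagrams on this port

    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    tx.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)

    while True:
        data, (src_ip, src_port) = rx.recvfrom(65535)
        if not src_ip.startswith(GUEST_SUBNET):
            continue               # crude loop prevention: only relay guest traffic
        tx.sendto(data, (IOT_BCAST, PORT))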