> It needs to be fast-updating for shared multi-user docs, like Google Docs/Sheets or Word/Excel 365.
In my experience, Google Docs has this, but realtime collaboration with Word is unusable. Which is interesting, because that means a huge number of existing Office 365 users have yet to experience it. I wonder if there's an opportunity there.
Imagine a fully statically linked version of Debian. What happens when there’s a security update in a commonly used library? Am I supposed to redownload a rebuild of basically the entire distro every time this happens, or else what?
Steel-manning the idea, perhaps they would ship object files (.o/.a) and the apt-get equivalent would link the system? I believe this arrangement was common in the days before dynamic linking. You don't have to redownload everything, but you do have to relink everything.
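A toy sketch of what that apt-get equivalent's relink step might look like (every path, format, and tool name here is invented, not a real dpkg mechanism):

    import json
    import subprocess

    def relink_dependents(updated_lib, recipe_db="/var/lib/relinkdb/recipes.json"):
        """Relink every installed binary whose recorded link recipe uses updated_lib."""
        with open(recipe_db) as f:
            recipes = json.load(f)  # {binary_path: {"objects": [...], "libs": [...]}}
        for binary, recipe in recipes.items():
            if updated_lib in recipe["libs"]:
                # Re-run the stored link step against the new static archive.
                cmd = (["cc", "-static", "-o", binary, *recipe["objects"]]
                       + [f"-l{lib}" for lib in recipe["libs"]])
                subprocess.run(cmd, check=True)  # link only -- nothing recompiles

    # After a security update to, say, the static libssl archive:
    # relink_dependents("ssl")

You'd pay only the link time per affected binary, not a full rebuild or redownload.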
> Steel-manning the idea, perhaps they would ship object files (.o/.a) and the apt-get equivalent would link the system? I believe this arrangement was common in the days before dynamic linking. You don't have to redownload everything, but you do have to relink everything.
This was indeed common for Unix. The only way to tune the system (or even change the timezone) was to edit the very few source files and run make, which compiled those files and then linked them into a new binary.
Linking-only is (or was) much faster than recompiling.
But if I have to relink everything, I need all the makefiles, linker scripts and source code structure. I might as well compile it outright. On the other hand, I might as well just link it whenever I run it, like, dynamically ;)
Only because of the enormous effort put in by Debian package maintainers and its infrastructure.
If you're an indie developer wanting your application to run on various Debian-based distros but the Debian maintainers won't package your application, that's when you'd see why it's called DLL hell, how horribly fragmented Linux packaging is, and why even Steam ships its whole runtime.
Everything inside Debian is fine. That's most of the ecosystem apart from the very new stuff that isn't mature enough yet. Usually the reason something notable stays out of Debian long term is that it has such bad dependency hygiene that it cannot easily be brought up to standard.
Then you update those dependencies. Not very difficult with a package manager. And most dependencies aren't used by a ton of programs in a single system anyway. It is not a big deal in practice.
> I genuinely cannot understand why anyone would buy a car or a bed or a fridge that requires a subscription.
There's a huge car finance market where people do exactly that. How much they pay a finance company monthly vs. how much they pay the manufacturer monthly makes little difference to them. It's all about the monthly fee and what they get in return for it.
> And all of this has to be done in a way that will hold up in court, therefore snail mail.
This needs to change. Snail mail is no longer reliable. Letters often get delayed by weeks or go missing altogether, but the law still treats a posted letter as received within a few days, and assumes justice is done on that basis. It's no longer true.
It’s your choice of course, but in the messaging world of gatekeepers and walled gardens, I think AGPL makes the most sense. It’s a key tool we’re going to need if we want to be successful at having a federated network.
Additionally, landlords don't benefit from the higher prices (i.e. market-rate rent) either, since that also pushes house prices up. A landlord entering the market has a higher capital cost that absorbs the higher rental return, such that the rental yield typically remains about the same (slightly higher than the cost of capital, covering their increased risk compared to a more stable investment).
Those who benefit are those who own housing at the time of market rate increases. That's just regular investment return, and the risk/reward can be directly compared to any other form of investment. Current owner occupiers and current landlords benefit at the time of every increase (even if their capital gains are not immediately realised). And then every household, whether owner occupier or tenant, has to pay in the form of higher capital expense. Landlords simply pass the higher rent through to pay for their higher capital expenses.
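As a toy illustration of that pass-through (made-up numbers, ignoring taxes, maintenance and leverage):

    # Gross yield stays flat when rents and prices rise together.
    price, annual_rent = 400_000, 20_000
    print(annual_rent / price)        # 0.05 -> 5% gross yield

    new_rent = annual_rent * 1.20     # market rents rise 20% -> 24,000
    new_price = new_rent / 0.05       # capital values get bid up -> 480,000
    print(new_rent / new_price)       # still 5%

    # A landlord buying in now pays the higher price, so the higher rent
    # mostly covers the higher capital cost; the windfall went to whoever
    # already owned the asset when the repricing happened.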
> Additionally, landlords don't benefit from the higher prices (i.e. market-rate rent) either, since that also pushes house prices up.
I like this thinking. If you buy a house on the cheap and rent spikes and your profit increases dramatically, you’re not benefiting because somebody will kick in your front door and force you to buy a crazy expensive house.
Besides what you said, it is also in landlords' interest to maintain (admittedly with as little cost and involvement as possible) the perception that the properties they're renting are reasonable options, quality-wise, for their prospective clients. That means they may go to some lengths to fend off troublesome tenants, and thus contribute to the overall quality of life in the neighborhood.
The problem is in remembering the voice commands. I could never do it. Word the command slightly “wrong” and it won’t work at all (at least not in my 2014 VW).
I’m optimistic that the latest progress in AI will fix this when the technology matures in cars. I reckon this is still a decade away though.
Honestly, other than that one single command ("Climate control defrost and floor") I never really use voice for anything else while actively driving. The temperature knob usually does what I want when driving, and I'll be stopped again soon enough if I want to fiddle with something else.
And that one voice command is easy enough to remember, and the resulting manually selected mode is easy enough to cancel with the Auto button (which is the entire middle of the temperature knob -- simple enough).
AI is too easy to get wrong.
For example: At home when my hands are full and I'm headed to/from the basement, I might bark out the command "Alexa! Basement lights!"
This command sometimes results in the lights turning on or off. But sometimes it results in entering a conversation about the basement lights, when all anyone really wants from such simple diction is for the lights to toggle state -- like interacting with a regular light switch just toggles state.
I simply want computers to follow instructions. I am particularly uninterested in ever having a conversation -- a negotiation -- with a computer in my car.
But I can see plenty of merit to adding some context-aware tolerance for ambiguity to the accepted commands. Different people sometimes (quite rightly) use different words to describe the end result they want.
That doesn't take an LLM to accomplish, I don't think. After all, a car has a limited number of functions. It should be mostly a matter of broadening the voice recognition dictionary and expanding the fixed logic to deal with that breadth.
I reckon that this should have happened 5 years ago. :)
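Sketched roughly (the command names and phrasings here are made up for illustration):

    # Widen the accepted phrasings; keep the logic fixed and deterministic.
    SYNONYMS = {
        "defrost_floor": ["climate control defrost and floor",
                          "defog and feet",
                          "clear the windshield"],
        "temp_up":       ["warmer", "turn up the heat", "i'm cold"],
        "temp_down":     ["cooler", "turn down the heat", "i'm hot"],
    }

    def match_command(utterance):
        text = utterance.lower()
        for command, phrasings in SYNONYMS.items():
            if any(p in text for p in phrasings):
                return command   # same words, same result, every time
        return None              # unrecognized -> do nothing, don't start a chat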
> That doesn't take an LLM to accomplish, I don't think. After all, a car has a limited number of functions. It should be mostly a matter of broadening the voice recognition dictionary and expanding the fixed logic to deal with that breadth.
I think the most effective way to get this accurate and effective is to give an LLM the user’s voice prompt and current context and ask to convert the user’s request into an API call. The user wouldn’t be chatting with the LLM directly.
The point is that it doesn’t require a static dictionary to already have your exact phrasing and will just work with plain English.
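Something like this shape, where llm_complete() is a stand-in for whatever model endpoint the car would use (nothing here is a real API):

    import json

    SYSTEM_PROMPT = """Convert the user's request into exactly one JSON action:
    {"action": "set_temperature" | "set_mode" | "none", "args": {...}}
    Use the provided context. Output only the JSON."""

    def llm_complete(system, user):
        """Stand-in for a hosted or on-board model call."""
        raise NotImplementedError

    def voice_to_api_call(utterance, context):
        raw = llm_complete(system=SYSTEM_PROMPT,
                           user=f"context: {json.dumps(context)}\nrequest: {utterance}")
        call = json.loads(raw)
        if call["action"] == "none":
            return None  # refuse to guess; never fall back to chatting
        return call      # e.g. {"action": "set_mode", "args": {"mode": "defrost_floor"}}

    # voice_to_api_call("it's freezing in here",
    #                   {"cabin_temp_c": 12, "hvac": "off"})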
That requires either an online connection to a datacenter somewhere or something that is (at present) a fairly substantial on-board computing system, and those are things I think are worth avoiding for tasks like adjusting the HVAC in a Honda.
Maybe some day.
Right now, when we do have substantial on-board computing systems, they're trying to drive the car -- not change the temperature. Adding an additional computational workload for LLM voice commands seems both foolhardy and expensive.
Meanwhile: Broader dictionaries and static flows with greater breadth for voice recognition? We can do that right now.
(We can even use LLMs to help generate the static flows, along with people to evaluate and test them. Once implemented, they become cheap to run.
This is in keeping with a fairly common theme here on HN: Don't use the bot to process the data. Instead, use the bot to write the data processor.)
There is a limit. The cost of electricity required is bounded by the value of the reward (block reward plus transaction fees). The value of the reward is bounded too, since the "import" of electricity into the Bitcoin economy is inflationary.
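Back-of-the-envelope version of that bound (the price and fee figures are made-up inputs; 3.125 BTC is the post-2024-halving subsidy):

    # Rational miners won't spend more on electricity than the reward is worth.
    blocks_per_year = 6 * 24 * 365       # ~52,560 at one block per ~10 minutes
    subsidy_btc = 3.125                  # current block subsidy
    fees_btc = 0.2                       # assumed average fees per block
    btc_price_usd = 60_000               # made-up price input

    bound = blocks_per_year * (subsidy_btc + fees_btc) * btc_price_usd
    print(f"yearly electricity spend bounded by ~${bound:,.0f}")
    # ~$10.5bn at these inputs -- and the subsidy halves every four years.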