I used to keep my old TI-82 (or was it -84?) from high school and a simpler sturdy solar-powered calculator near my desk, but I realized I always just used either my computer (IRB in the terminal usually) or Apple's calculator app on my phone and never ever touched my physical calculators. So they've now been put in storage.
I don't think the GP is calling contributor guideline restrictions a form of DRM.
I think the GP is focusing on:
> I guess we're giving up on the idea that you're free to do whatever you want with software you own? ... But I see this as no different from DRM and user hostile
If I clone an open source git repository, I should be free to point an LLM to review it in any way I choose. I can't contribute code back, but guess what, I don't want to. I want to understand the codebase, and make modifications for me to use locally myself. I don't have a dev team, I have a feature need for my own personal use.
The LLM enables that. Projects that deliberately sabotage the use of LLMs cease to provide software that meets the 'libre' definition of free software.
I think the other way to think of it is: you're still free to do whatever you want with the repo. The restriction is happening on the LLM's end, so ultimately it's the LLM's fault, so use an LLM without the restriction you want to avoid.
> The projects that deliberately sabotage the use of LLMs
They don’t though. They add a mild inconvenience for users of a specific restrictive AI provider which has bizarrely glitchy checks.
In a way, they are doing you a service: if you are this serious about libre software, you shouldn't be using a closed platform which employs dark patterns to begin with.
Who do you think feels the effect of fraud/theft at retail stores? The "rich" owners feel a little of it, sure, but they have a proven strategy for keeping their profits up by reducing costs: fire employees and make those who remain do more work for the same pay. So you think this is "not actually a bad thing" because you're screwing over <insert big company here> but really you're just screwing over the workers.
That's not true. If a company loses revenue, it has a lot of places to dump that loss. One is shrinking profit margins, another is raising prices, and another is lowering operating costs like labor, but also pulling lower-margin items off shelves and all other manner of cost cutting.
Let's oversimplify dramatically and say that every single lost dollar is paid through cutting the workforce. You're ignoring the fact that people benefit from the theft: those who need food and are able to steal it rather than going hungry. How do you know that feeding those people is worth less than employing the workers lost to their theft?
I'm not quite sure I follow your question. Are you asking how do I know that someone who loses their job needed their job to afford groceries? If so, I guess I felt it was a safe assumption that the people working at grocery stores are not financially independent.
No, that wasn't my question. Sorry if it was unclear. I'm trying to understand how you're thinking about this. The question is: "are there numbers x and y for which it's good that x number of people who would otherwise go hungry eat food, even though it costs y number of people their job?"
It's odd to see this comment, since I've always had the opposite experience (at least when comparing Windows and MacOS -- I haven't used desktop linux much in the past 20 years). On MacOS, when I click something, something happens, or at the very least starts to happen (and I get some visual indication). While in Windows I often click on something and get no indication that something happened or started happening, so I click again, and then suddenly perform the action twice. This most often happens when opening programs, but it happens in other places too sometimes.
I’ve found Mac OS to be snappier than any of the dozen or so Linux DEs I’ve tried. I use Fedora with XFCE and it’s ok in responsiveness, I’ve got PopOS on another machine. It’s good. But I’ve got MacOS on my other two machines and they just feel so much snappier. And the Macs are 6-7 years old. The other machines are newer (2/3yo).
In any case, have you tested on the same machine for the most apt comparison? Age may not be the best predictor of performance, when IO and memory may be more predictive of snappiness than the latest CPU.
Input devices and monitors can make a difference as well.
For Windows, my last experience on a personal install was Windows 10 and that was yeeeaaars ago, so... Grain of salt :)
It's not the default, but IIRC Windows could be configured to have zero animations, and I found it to be quite responsive as such.
I'm not talking about the speed of opening programs, but more of the speed of every-second interactions: Unfolding a folder (or other interactions within a program with keyboard or mouse), alt-tabbing across windows, moving between desktops, etc. At least on Windows, I saw far fewer IO-blocking animations than I have on MacOS.
You're right about the "something starts to happen": Apple hides delays behind sigmoidal animations throughout much of their OS. For those who aren't aware of the trick, the underlying delay just appears to be an animation that started on the interaction.
I think separate vaults being in different windows is probably intentional behavior to clearly show that they're separate vaults, not something that needs to be fixed.
If you want to sync only some of the things from a single vault and not other things, can't you just use different top-level folders? Have a "to-sync" folder and a "do-not-sync" folder, and only sync the to-sync one. I'm not sure if that's possible using Obsidian's paid sync, but it should be possible with other sync options.
It's been mentioned a few other places in the comments here, but I recommend PICO-8 for beginning game developers. It's a really well-designed fantasy console, so it feels like you're actually developing for specific hardware due to the limits. But you're programming in Lua with an easy-to-use API, and you can make the entire game using PICO-8, since it has a code editor, sprite editor, map editor, sound fx editor, and music editor. It's a really nice experience -- I think this is the primary reason people love it and you hear so much about it.
If you want a less-restrictive game dev system or can't afford the (well worth it!) $15 for PICO-8, there are many great free options, like LÖVE (also uses Lua), Godot (GDScript or C#), Phaser (JS), and so on.
I started rewatching For All Mankind a week or so before the Artemis II launch, so it's been pretty wild to watch an alt-history about people going to and settling on the Moon and Mars, and then to see real life people just starting to return to the Moon at the same time.