
I bought V2 a while ago too when it was offered extra cheap. The problem is it doesn't run on my rusty machine. I bought it to keep in reserve for when I upgrade my machine someday (who knows if my V1 stuff will still run then?). I learned about this weird activation-server stuff afterwards, so ultimately I had to ask for my money back. There was no way to "activate" the software and store the key/keyfile in a backup. This is in no way future proof in my view.

I want to use my software w/o depending on the availability of some random 3rd-party server. I guess it just got worse with this new app here. I'm not enthusiastic about it at all. This has nothing to do with the price point (I was happy to pay for all three of my V1 apps separately).


I simply add "0.0.0.0 youtube.com" to the (network-wide) hosts file on my router. Problem solved. I simply don't understand why the problem is just with "shorts"; most of the "longs" are not worth wasting time on either. :-)

YouTube has a massive amount of edutainment content that is absolutely fantastic. Content that could never make it onto mainstream television.

At this point I think I owe all my hobbies to YouTube.


And why do you need all these hobbies?

Because hobbies are fun.

Hedonism, got it.

Spoiling for some trolling, are you?

Having a hobby is not hedonism. There is more to life than just working for someone else.


Oh, there absolutely is. But most hobbies people seem to have are just pointless and waste time.

hobby /ˈhäbē/

    1. an activity done regularly in one's leisure time for pleasure.
If you enjoy the activity then it is neither pointless nor a waste of time since the primary purpose is enjoyment. If you don't enjoy it, it's not a hobby.

No doubt these posts could be your hobby; you do it for pleasure but it's otherwise just a pointless waste of time.


Why does this matter?

I'm curious. It matters to me.

Isn't D supported by the GNU compiler collection? I personally would prefer this type of tooling over what Rust and Go do (I can't even get their compilers to run on my old platform anymore, not to mention all these dependencies on remote resources typical Rust/Go projects seem to have, which seems to be enforced by the ecosystem?)

It is, however keeping LDC and GDC up to date is a volunteer effort without enough people, so they are always a bit behind dmd.

Still much better than gccgo, which is kind of useless for anything beyond Go 1.18; no one is updating it any longer, and it may as well join gcj.


LDC isn't regularly behind DMD lately. The issue has been more the release process with respect to DMD, and the people issues impacting that.

Which was my point, volunteer work without enough people.

Having written real code in D, I can say that the slight discrepancy between dmd, LDC, and gdc isn't a roadblock in practice.

Depends how creative you happen to be with some features.

For regular common code that is indeed not an issue.


It is supported. However, on Windows GDC does not work. LDC, being based on LLVM, needs Visual Studio, but I may be wrong, since there are clang/llvm distributions based on mingw64. Other than that, DMD works fine; a bit slower than the other compilers, but a charm to use.

LDC, like DMD, ships lld and MinGW import libraries, and has for a few years now.

They both work out of the box without MSVC installed.

It's only the ImportC feature that requires MSVC. The work required to get clang working in its place hasn't happened yet.


It is

But is that a problem of the language itself, or just a problem of available toolchains? E.g. if the GNU compiler collection came with BASIC support and you could just type something like "gbasic -O3 foobar.bas -o foobar" to get a properly optimised executable out of your BASIC source file, then some people might still use BASIC today, I guess?

I started with BASIC too. Also enjoyed BlitzBasic2 for a long time on the Amiga. That's where I learned programming… back then when programming was still fun.


Which BASIC? I remember Atari BASIC, which was a terrible language since the organization was line numbers. Visual Basic looked a lot nicer the few times I saw it, but I have spent a total of 10 minutes of my life looking at Visual Basic, so I have no clue how it is in the real world. I do know Visual Basic was used by a lot of untrained programmers who never learned why we have best practices, so there is a lot of bad BASIC out there; this isn't the fault of the language, just the users.


Realistically? No, not at all. The reason there are no toolchains for BASIC is because nobody uses BASIC (because it's not functional in our modern world), not the other way round.


Why do you think BASIC is not functional? Our modern world does not differ from the 1980 world at all. Variables are variables, subroutines are subroutines.

It fell out of fashion, along with Pascal, Perl, and Ruby, but that's just fashion.


I have even seen pretty darn impressive things done with Visual Basic back in the day. And those were not hobby things: I've seen it used in very important, mission-critical telecommunication equipment. The compiler was custom, not the one from Microsoft. After all, the language had pretty much everything other languages had.

How can a language be "inefficient"? You could say it lacked expressiveness. Maybe it was too verbose? But I would not place BASIC in the "verbose" category.


> Why do you think BASIC is not functional?

Because BASIC simply doesn't have first-class functions, and they would be quite hard to represent in a BASIC-like syntax while keeping the language idiomatic. Even the unreasonably clunky C pattern of having a pointer to a function taking void* as its first argument (to account for closure captures) gets you a whole lot closer to functional programming than even the fanciest BASICs.


Here, "functional" is being used to mean "ability to function", not "relating to the functional programming paradigm".


I hate that languages have become fads. The concepts have not changed, but there is a constant churn of languages.

I don't have to relearn natural language every 5-10 years, but for some reason I'm expected to when it comes to programming.


Except there are,

https://learn.microsoft.com/en-us/dotnet/visual-basic/

https://www.xojo.com/

https://www.mikroe.com/mikrobasic-pic

Just three that come to mind, of BASIC toolchains that people actually pay real money to use.


For example, there's also something called "ghost leeching" (a side channel entirely bypassing tracker reporting), which can lead to other peers reporting upload for which there's no corresponding download recorded on the tracker. This makes it look like those peers over-reported upload and cheated when they are in fact entirely innocent. There's no way for a private tracker to be really sure about stats. The most the moderators can do is check for repeated suspicious usage patterns across many torrents of a particular peer under scrutiny.


In Safari I get a "Too Many Requests" error; in Firefox ESR I get a straight "403 Forbidden". What the heck is going on with gnu.org?

Edit: with cURL it's OK… 200.

That makes no sense.


I definitely try to avoid any public statement of political nature online. You never know how the tide will turn at some point and who gets into power. And then you do not want to have a record of having said the wrong thing about the new guy(s) at the top in your past.


This could also chill the social pressure caused by knowing other's opinions. Less pressure for conformity, leading to more fringe positions. Maybe.


What is that? Like 32× >100MB of junk overhead per app? ~4GiB gone from the disk just to hold the same broken copy of some framework/library? It's quite the insanity, isn't it?

If there were one copy of that Electron (e.g. installed to /Library somewhere) which all apps would simply use, then you would only need to update one copy. Less disk space wasted, and all apps fixed in one go.

Back in the old days on the Commodore Amiga we would just do that… install some .library to SYS:Libs/ first if a program required it. It's not like this process was so complicated nobody could do it, right?


>It's not like this process was so complicated nobody could do it, right?

Don't underestimate the utility of write once run anywhere. Needing to ensure compatibility with a bunch of different browser engines is not simple.


Ironically Microsoft had exactly this in 1999 with Internet Explorer 5:

- https://en.wikipedia.org/wiki/HTML_Application

- https://www.geoffchappell.com/studies/windows/ie/mshtml/clas...


There is not one Electron. There are multiple, they release a new version every month or so.

Some apps, like VS Code, update very quickly to the latest one. Others more rarely. So now you need to keep multiple shared Electron versions, and track dependencies and who uses what version.

And it's quite likely that every one of your Electron-using apps will be on a different version, so now you are back to square one.


4GB seems quite small for all of those apps, to be honest.


No F shared libraries. Seriously.

Memory and storage are cheap enough nowadays not to have to deal with the insanity that shared libraries cause. I don't care if I use 30GB of memory to run a browser and a note-taking app.


Some of us do care. Devs should respect users' systems more than their own, instead of crapping all over them with Electron. I'd almost go as far as to say that it's evil: wasting resources, energy, people's time, and money.


I don't understand why it's all-or-nothing. We know how to version things pretty well these days; why is there no blended solution where libraries are shared but version-aware? I don't mind having two different versions of Electron on my laptop, but I don't want 30 copies of the same version.


You're basically describing Nix.

The big issue I see with Nix is that it's solving several related and very complex problems, and isn't doing so at a particularly easy level of abstraction. It's a PITA to package software that isn't using an already-supported build system. And mixing versions is messy: instead of just `[ package1="versionA", package2="versionB", … ]` with a lockfile converting versions to immutable IDs like commit hashes, you have to specify which commits of nixpkgs had each version and then do something like `nixpkgs-versionA = GIT_COMMIT_HASH_A; nixpkgs-versionB = GIT_COMMIT_HASH_B; [ nixpkgs-versionA.package1, nixpkgs-versionB.package2, … ]`. There are lots of other "warts" like that, of varying levels of annoyance.


Because in practice nobody has solved it, while everyone claims they have.

In practice, every piece of software needs a particular version of a library. Even a minor upgrade to that library might, and will, break it. In an idealized world this should not happen, but here we are: in a world where we set up whole containers just so we can run software.

So no. Shared libraries do not work in practice. At all. It should be straightforward, but they just do not work.


omg remember when we all had to install Java separately at the system level?


Multiple versions of it. And .NET and C++ runtimes, etc. And you could never uninstall any version of them, because you did not know what would break.


No shit. One major frigging selling point of Electron vs. the OS web view is that the developer controls the browser version and has a stable target to develop and test against, rather than many moving targets that shift after the app is shipped.

And you really think the entire ecosystem has never heard of this honking great idea called shared libraries from the good old days? Being smug about obvious things like this usually just betrays a shallow understanding.

Disclosure: I’ve criticized Electron aplenty. But these are complex tradeoffs and people dismissing them with obvious wins in their minds are clueless.

Disclosure 2: I was once a member of the maintainer team of a major package manager. Shared libraries: oh my, I can tell you horror stories all night long. If your experiences with them have been great, chances are a team behind the scenes has blocked and/or fixed faulty updates for you and shielded you from most of the issues.


I find explaining browser defaults to non web-devs really eye opening for them.


I read that announcement and I have zero idea what this "immich" app (?) is. How about adding a short introduction explaining what the thing is, for folks just discovering it via an HN link (without forcing them to click further or dig the information up somewhere else)?


That post complaining about lack of context took longer than typing "immich" into DuckDuckGo and reading the single sentence that pops up at the top describing exactly what Immich is: "Immich is a project that lets you back up, organize, and manage your photos and videos on your own server. It is under active development and available under the GNU AGPL v3 license."


Google Photos alternative -> https://immich.app


I agree that this text in its current style is very hard to read. It feels like the text was ballooned up to 3 or 4 times its original length with pointless "side content". Lots of distracting noise, basically. AI or not, this is not very good.

… and so I'll continue to stick with AVC, thanks! :-)

