
Are there any good rsync GUIs for Windows?


I used QtdSync for a while (in differential mode which uses hardlinks on NTFS): https://qtdtools.com/page.php?tool=0&sub=0&lang=en

At some point I forgot about the scheme and started synchronizing things manually.


Rsync is pretty closely tied to unixy concepts of permissions, paths, access times, etc. You can find some rsync based tools for Windows, but IMO, it's a square peg / round hole unless you're using it inside of WSL.


robocopy is the Windows equivalent of rsync


Kinda, sorta. Robocopy is a very good tool, but it cannot do incremental (delta) updates within a file. The already-mentioned Bvckup 2 can do deltas, but it's not an rsync client.


I use Bvckup2 for doing Windows backups; it's commercial but works really well for me.


Windows has rsync?


rsync is a library.


Are you sure the others use Squirrel? After its maintainer decided to crash out and abandon multiple libraries, including Squirrel, I was under the impression it was abandoned?


Squirrel has been abandoned for years and is still used anyway, often by people who don't realize it's abandoned. It's the default if you use the Electron toolchain, so that's why. It has serious design problems too, like the way it breaks Windows networks by bloating people's roaming home directories with dozens of independent copies of Chromium, which then get backed up, copied around on login, etc. Admins loathe it, as the whole point of this design is to bypass their ability to manage their own deployments, although they still get the blame when things break, of course.

There's a better way, which I am shamelessly self-promoting in this thread (as it's 100% on topic): my company makes a tool that can ship self-updating Electron apps. Beyond not being abandoned, it has a lot of really useful features, like being able to build and upload signed update packages (using the tech MS is pushing here) from Linux CI workers, without needing a Windows license.

https://hydraulic.dev/

It can also do forced updates on launch, which can be helpful for apps where the protocol between client and server changes regularly. And it plays well with corporate Windows deployments: people can install apps locally without needing administrator access, yet the app still goes into c:\Program Files.


That’s awesome, but keep in mind that the AppData thing is likely a feature and not a bug.

Think about it another way: if they install in AppData, they can likely bypass IT depts and other business bureaucracies and get a foothold somewhere in an organization. It’s absolutely malicious, both in terms of tech and business practices, but it works.


That might have been true a long time ago, but nowadays Windows makes it easy to black/whitelist executables in the home directory with tools like AppLocker. Meanwhile, the MSIX subsystem lets users install apps into c:\Program Files safely without needing admin access.

So, what Electron does might have been a neat growth hack once, but now it's as likely to hit roadblocks as not (at least on any modern Windows network with a switched-on IT department).


There is no real reason you can't have a monorepo, with one project being the Astro site and the other the Starlight docs.


True. It just limits the utility of it being built on top of Astro.


But why can the canvas still only be accessed in the main thread? Why introduce a whole other API as a workaround for this?

Also, as far as I have seen, people have been using WebAssembly for complex stuff and marshalling it to the main thread for the canvas.


The worklet API gives the browser the ability to spawn as many threads for the task as it wants, when it wants, without needing to communicate with and wait on the page's code in the main JavaScript thread each time.


The last time I read any updates on this, everyone on both sides of the legal process was trying to single out individual software engineers as scapegoats and rake them over the coals. Did something change?


I don't think software engineers independently looked at emissions data and unilaterally decided to "fix" the emissions shortcomings in software. I think they were told by others to do that. It's good that Germany is going after the people who decided that fraud was the answer.


> It's good that Germany is going after the people who decided that fraud was the answer

When the VW scandal broke, the US indicted seven senior executives. None of these seven were extradited to the US to stand trial [1].

The VW scandal was made public in 2015 [2] and involved cheating going back to 2009. Sentencing only two executives to jail, a decade after their wrongdoing made international news, does not send a strong message.

[1] https://www.justice.gov/archives/opa/pr/former-ceo-volkswage...

[2] https://www.bbc.com/news/business-34324772


Germany does not extradite its nationals to the US at all. It can sometimes extradite to other EU states, but not to the USA.

Sending your own citizens to a foreign country is generally a big deal and not something that is done.


When the VW scandal broke, the US indicted seven senior executives [1]. Germany did not cooperate. None of these seven were extradited to the US to stand trial.

One more mid-level engineer involved in the scandal made the mistake of taking a vacation to Florida. He was arrested at the airport while awaiting his flight home to Germany [2]. He was sentenced to 84 months in prison but was let out after serving half of that sentence [3].

[1] https://www.justice.gov/archives/opa/pr/former-ceo-volkswage...

[2] https://money.cnn.com/2017/03/17/news/companies/volkswagen-e...

[3] https://www.autonews.com/automakers/ex-vw-manager-schmidt-ge...


Germany does not extradite its nationals to the US at all. Not sure why you would expect this case to be so special that Germany would break its own laws.


You're stating this very confidently, but I don't think it's a blanket non-extradition policy. In the case of a potential death penalty, it's a clear no, but that's not the case here.

https://www.state.gov/wp-content/uploads/2019/02/10-201.9-Ge...


German constitution, translated into English: https://www.gesetze-im-internet.de/englisch_gg/englisch_gg.h...

Article 16 (2) No German may be extradited to a foreign country. The law may provide otherwise for extraditions to a member state of the European Union or to an international court, provided that the rule of law is observed.

The USA is not a member of the European Union, nor is it an international court.


I’m imagining a really, really, really rough ride for this. I don’t mean the compiler, tsc CLI, or VS support. I know the TypeScript team will have already competently done their side of the story.

I think that as usual, it’s going to be the hellscape ecosystem of Node that is going to be a complete pain here for devs just wanting to use TypeScript.

Somehow, the vast myriad of tools and their dependencies find a way to make writing TS/JS worse while trying to make it better.

You’d expect a simple package change to be straightforward, but after ten years of dealing with web tooling ranging from before the webpack era to now, I am constantly amazed by the new ways these tools manage to fuck everything up.

I imagine for Deno the change to the new compiler will be seamless and not even noticeable.


It basically depends on how deeply the other tooling has stuck its claws into TS internals, which won’t be accessible in the same way anymore.

I’ll bet the two most painless improvements will be:

- tsc just for type checking. (A lot of places run tsc just as a type checking step, and avoid type checking during the full build.)

- TS language server. By far the worst part of TS for large repos today — in my main monorepo, it consumes like 10GB of RAM and actions like “jump to definition” become so slow that it’s unusable.

Things that may be painful:

- Integration with 3rd-party build tools like webpack that use tsc under the hood. Also unsure how tools like Rspack will need to consume tsc, given that they already use a highly parallel Rust architecture.

- Integration with tools like Jest, which need to transpile TS code before running it.

- Integration with running Node directly, e.g. the swc hook or ts-node for bin scripts.


It's a shame that the author famous for shitting on Deno has caused them to even need to write this.


> An annotation can be thought of as a comment, recommendation or a like on any piece of content. Ann allows you to store your annotations, send annotations to your followers and receive annotations from people that you follow.

Sounds very similar to WebMentions, tbh.


Probably panicking and waiting to be told what to do by the security services that have been using this.


All of the information leaked in the headers is already readily available through lawful interception.


> Instead of memset() you've got ZeroMemory(), instead of memcpy() you've got CopyMemory().

What is or was the purpose of providing these instead of the existing Windows C std?


It's worth remembering that Windows 1.x and 2.x predate the C89 standard. This also explains why the WINAPI calling convention was inherited from Pascal instead of C. The C standard library was "just another competitor" at the time.


The WINAPI calling convention is a cross between C and Pascal - C-style order of arguments on the stack, but Pascal-style callee cleaning the stack before return.

The reason for its use in Windows is that it makes generated code slightly smaller and more efficient, at the cost of not supporting varargs easily (which you don't need for most functions anyway). Back when you had 640 KB of RAM, saving a few bytes here and there added up quickly.
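
For illustration, a minimal C sketch of the difference (the function names here are made up, and on 64-bit builds the compiler ignores these annotations since x64 has a single native convention):

    /* __cdecl: caller cleans the stack, so printf-style varargs work.
       __stdcall: callee cleans the stack with "ret N", which shaves a few
       bytes off every call site. On 32-bit x86, WINAPI expands to __stdcall. */
    #include <stdarg.h>

    int __cdecl   add_cdecl(int a, int b)   { return a + b; }
    int __stdcall add_stdcall(int a, int b) { return a + b; }

    /* Varargs need a caller-cleanup convention, because only the caller
       knows how many arguments it actually pushed. */
    int __cdecl sum(int count, ...)
    {
        va_list ap;
        int total = 0;
        va_start(ap, count);
        while (count-- > 0)
            total += va_arg(ap, int);
        va_end(ap);
        return total;
    }

    int main(void) { return add_cdecl(1, 2) + add_stdcall(3, 4) + sum(2, 5, 6); }

The size win comes from each call site no longer needing its own stack-adjustment instruction after the call.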


Those functions explicitly? I can't find any definitive explanation of why they exist.

It looks like nowadays ZeroMemory() and RtlZeroMemory() are just macros for memset().
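
Roughly what that looks like, paraphrased from memory rather than copied from the SDK headers (user-mode only; kernel-mode code gets real Rtl* functions instead of macros):

    #include <string.h>   /* in a real program, windows.h already provides these macros */

    /* Approximation of the SDK definitions: */
    #define RtlZeroMemory(Destination, Length)         memset((Destination), 0, (Length))
    #define RtlCopyMemory(Destination, Source, Length) memcpy((Destination), (Source), (Length))
    #define ZeroMemory  RtlZeroMemory
    #define CopyMemory  RtlCopyMemory

    int main(void)
    {
        char src[16] = "hello";
        char dst[16];

        ZeroMemory(dst, sizeof(dst));        /* expands to memset(dst, 0, sizeof(dst)) */
        CopyMemory(dst, src, sizeof(src));   /* expands to memcpy(dst, src, sizeof(src)) */
        return dst[0] == 'h' ? 0 : 1;
    }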

Here's an article on some of the RECT helper functions. Relevant for the 8088 CPU but probably not so much today: https://devblogs.microsoft.com/oldnewthing/20200224-00/?p=10...


Windows didn't standardize on C. It was mostly assembly and some Pascal in the beginning with C and C++ later.

Microsoft have always viewed C as just another language; it's not privileged in the way UNIX privileges C. By implication, the C standard library was provided by your compiler and shipped with your app as a dependency on Windows; it wasn't provided by the operating system.

These days that's been changing, partly because lots of installers dumped the MSVC runtime into c:\windows\system, so whether it was part of the OS or not became blurred, and partly because Microsoft got more willing to privilege languages at the OS level. Even so, the Windows group retains a commitment to language independence that other operating systems just don't have. WinRT comes with lots of metadata for binding it into other languages, for example.


> Windows didn't standardize on C. It was mostly assembly and some Pascal in the beginning with C and C++ later.

No, it was never Pascal. It was always C from the beginning. You may have been confused by them using the Pascal calling convention because it was generally faster on the 16-bit CPUs of the time.


Apple was the one going with Pascal for the OS; the Object Pascal lineage was originally started at Apple, in collaboration with Niklaus Wirth, who gave feedback on the design.


You could write code without using libc / the C runtime. You still can.


Unlike Unix, Windows historically didn't have a standard C runtime at all. Stuff like MSVCRT.DLL came later (and is itself implemented on top of the Win32 API, not directly on top of syscalls as is typical in Unix land).
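
To make that concrete, here's a hedged sketch of a console program that links only against kernel32 and never touches the CRT; the build flags are from memory and may need adjusting for your toolchain:

    /* Build, roughly, with MSVC:
         cl /GS- nocrt.c /link /NODEFAULTLIB /ENTRY:Entry /SUBSYSTEM:CONSOLE kernel32.lib
       /GS- disables the stack-cookie check, which would otherwise pull in CRT support. */
    #include <windows.h>

    static const char msg[] = "hello from Win32, no libc\r\n";

    void __stdcall Entry(void)
    {
        DWORD written;
        WriteFile(GetStdHandle(STD_OUTPUT_HANDLE), msg, sizeof(msg) - 1, &written, NULL);
        ExitProcess(0);   /* no CRT means no atexit/exit machinery, so leave explicitly */
    }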

