Hacker News | dabedee's comments

Congratulations on releasing your product! It looks super interesting. Some questions off the top of my head: - Where is the data stored geographically? (US, EU, or a specific country) - Do you plan on having multiple options for the web version (cloud vs self-hosted)?

I find that data sovereignty is very important to me (it might not be for everyone), which is why I ask about those points specifically ;)


Thanks! Data sovereignty is also top of mind for me, but I think we should approach it not through the physical location of the servers/databases, but through the ability to encrypt, and thus own, your data.

Currently it uses GCP with data centers in the US, but when E2EE is rolled out, I don't think it would matter where it's located? There might be legal complications so I'm also thinking of moving the physical databases to jurisdictions like Switzerland, if it comes to that.

Given the complexity of the backend, I think a self-hosted version might not be possible in the short term. However, single-tenant versions (like GitLab Dedicated) are definitely doable.


Unfortunately, it always matters. There's always someone who wants to know where the data is physically located and they will always want an answer for it.


If you spend time reading the article, you can agree or disagree with his choices, but he provides several reasons for why he chose to rewrite in Rust after the initial project was written in C:

"At that moment, I decided to switch to Rust (my first project in this language), which represented several advantages:

- Restart the project from the beginning, using lessons learned from previous mistakes

- Be a bit more innovative than just writing a Linux-like kernel in C. After all, just use Linux at that point

- Use the safety of the Rust language to leverage some difficulty of kernel programming. Using Rust’s typing system allows to shift some responsibility over memory safety from the programmer to the compiler."

Unfortunately, your comment doesn't add anything to the conversation and distracts from the interesting project being presented.


Also, there's the implicit "I'm doing this to learn and have fun with a personal project" aspect, which seems valid.


That's pure Rust evangelism. As devs we are responsible for what we send out into the world. A schism in the Linux kernel would be bad.


Linux started out as a hobby project that widened the schism in the Unix world. Yet things turned out fine.


Why? There are multiple kernel versions with different support. Android had, and might still have, their own kernel version, and I didn't notice anything bad. It's just another Unix implementation.


I have trouble not reading this as "forks are bad", which seems to be missing the point of libre software a bit. Am I missing some nuance lost in the brevity of two sentences?


Forks are not inherently good or bad but can seriously muck up an ecosystem if they are not well-justified. Mucking up the Linux ecosystem would be BAD.


Yes, they dismissed Gitlab in part because of the self-hosted argument: "I know I can download GitLab and set it up on my own server. However, I’m a software developer, not a sysadmin. I want to spend my time developing software, not putting out fires and paying AWS bills for the rest of time."

Yet they ended up choosing a self-hosted option, Gitea, because it was recommended by an acquaintance, and they set up a couple of Lightsail servers on AWS to run it.

It's totally fine for the author to share their preferences; they're just exhibiting the internal inconsistencies and irrational behavior we're probably all guilty of at one point or another.


In the last five years, my experience with Firefox has consistently been far better performance-wise than with Chrome. I just can't bring myself to justify using a Google product for something as important as browsing. I guess it's ideological, but I just want Firefox to succeed.


From the article: "The GPU can be seen by Windows, but Nvidia only publishes Arm drivers for Linux, not Windows. So in device manager you just see a Basic Display Adapter, and it can't really do anything."

Never thought I'd live in a world where drivers are published for Linux first. This is great.


I mean, Arm drivers for GPUs are used a ton in scientific computing and deep learning.

Nvidia sells ARM servers with 8-16 GPUs, to my knowledge.

So that makes sense.


Nvidia also wanted to buy ARM and has just announced their own ARM CPU, so yeah.


And they have been pushing Tegra for more than a decade.


Their embedded SoCs include GPUs as well.


The NVIDIA Jetson is Arm based, with (duh) their GPU. Funnily enough, at $DAYJOB I used an Altra Arm server to do builds of our codebase - it replaced having to build the code directly on a device sitting on a rack in our server room (slow! Especially on Xavier), and saved me from having to set up a cross compile.


This is great and very user-friendly. As a quick & dirty alternative for people on the command line, you can easily send files using netcat.

To quickly copy a file from one machine to another:

  # on target machine
  $ nc -l PORT > file

  # on source machine
  $ nc HOST PORT < file
To quickly copy an entire folder:

  # on target machine
  $ nc -l PORT | tar xf -

  # on source machine
  $ tar cf - FOLDER | nc HOST PORT


And on top of that you can add pv to the pipe for a progress bar.


As mentioned by two other people in the comments, this is spreading false propaganda against AGPL which only serves the interest of internet-based companies that want to benefit from FOSS code without having to share their own creations[1]: "Google wants to be able to incorporate FOSS software into their products and sell it to users without the obligation to release their derivative works."

[1] https://drewdevault.com/2020/07/27/Anti-AGPL-propaganda.html


See my reply to one of the pre-existing comments [0]: it's not Google that is the primary source of these ideas about AGPL, it's the companies who chose AGPL with the intention of choosing a license that is completely and totally viral, even over network calls. Companies like iText [1] want their customers to believe that AGPL means that there's no way to use the free version without open-sourcing your entire software stack.

Whether or not that's true, it's reasonable for Google to warn against using an untested license whose own vendors specifically argue that it is completely and totally viral. Interpreting that caution as a psy-op is a bit far-fetched.

[0] https://news.ycombinator.com/item?id=37904285

[1] https://itextpdf.com/how-buy/AGPLv3-license


The way I see it, it looks like iText is simply lying on their page about at least some of the restrictions of AGPL.


I'm inclined to agree. And I'm more inclined to believe that they and companies like them are responsible for false narratives surrounding AGPL than I am to believe that it's a long con by the cloud providers to get more free work for their cloud services. Vendors of AGPL software have a much more concrete and proximate motivation for exaggerating the effects of AGPL than the cloud providers do.


Google could do something good and test the license. They have the resources to fight it out.


The fact that they won't use AGPL software isn't propaganda lol. They literally won't use it. You may disagree about whether that's good or bad for the ecosystem, but it's a fact that most serious internet-based companies (and what tech co isn't these days?) won't touch it.


As the article points out though, while most "serious internet-based companies" won't touch AGPL, they will engage with dual-licensed software. They use AGPL software under proprietary license, after paying for it.


At the risk of sounding pedantic: legally speaking, there is no such thing as using "AGPL software under a proprietary license". In that case it is simply software under a proprietary license, and that is an important distinction.


As a business owner I consider AGPL the same as closed source. It's a thing I need to pay for, and I'm not about to contribute free labour to it, the same way I wouldn't contribute to any other commercial closed-source entity.


What definition of open source are you using exactly, that the AGPL doesn't fall under?


Probably something like "software that I can freely use without exposing myself to legal risk".

(This is just me describing reality. My private software is under AGPL to a significant extent, but I understand why my employer's legal dept does not like AGPL.)


In that case, if you end up using OSS without any need to contribute back, would you pay for that OSS software, since you did not contribute labor to it?


I do pay for some more permissive open source projects that I use, even ones I contribute back to, yes.

I completely understand organizations banning AGPL software. Having an employee mistakenly violate the license is just too great a risk. The majority of AGPL projects seem to offer the same product under a different commercial license, which is the only way I'd use an AGPL project, e.g. paying for a non-AGPL license to use it without the risk.


(Not OP.) If you can, I think you should. Sometimes the type of contribution you can make is not accepted. For example, some projects don’t accept donations.


I think that's completely fair, even for more permissively licensed software, I dislike contributing to commercial open source products unless I'm being paid for it.


There's a lot of people on this forum who work at SaaS startups that also use FOSS software. It ain't just Google.


There are plenty of alternatives, like the Business Source License, that are perfectly acceptable for the majority of use cases.


GitHub does that, and it always makes it very difficult to understand when something was actually done/committed.


GitHub was the first thing I thought of when reading this. My only use case for the commit date is to quickly check whether a certain commit merged before a branch was automatically cut, so I need the day of the week and time, not "11 days ago."

Fortunately, if you hover over the relative date, it expands to show the exact date and time, but it would be nice if this were swapped.
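For checking locally, git itself can print absolute timestamps; a small sketch (plain git log format placeholders, nothing GitHub-specific):

```shell
# show absolute committer and author dates (ISO 8601) for the latest commit
git log -1 --format='committed: %cI%nauthored:  %aI'

# or make all dates in the regular log output absolute
git log --date=iso
```

Note that committer date (%cI) and author date (%aI) can differ after rebases, and the committer date is usually the one that matters for "did this land before the branch cut".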


Yup, agreed. I came here looking for this one. Whenever I need a timestamp on GitHub, I need the exact time, and I always have to hover to get the info I need.


Seriously, of all the platforms that use this pattern, this one is the most user-hostile.


Awesome project! Where is the realtime/live data coming from? Is it approximated via timetables or is something else going on?


Not timetables, I think. If you watch the webcams, the trains usually pass on the video as they pass on the map.


Medium.com articles are now mostly behind paywalls, and I tried loading the URL via web.archive.org, but it keeps redirecting and is impossible to view. I wonder if this is a cleverly hidden way for Medium to defeat the archiving of such web pages.


It loads just fine for me, but only the article summary. I suspect that to archive the full piece you'd have to make a local copy while logged in and upload that, or provide some credentials to archive.org (not sure how; maybe their Archive-It.org product supports this?).


Same here.

Pivotal does something similar: saving the page no longer loads lanes or stories. In Chrome, you can set the network to offline in DevTools so the JavaScript doesn't try to erase the DOM.

It's harder with archive.org, since it makes requests on your behalf. In that case, archive.org could set its scraper to behave similarly, toggling the network off after the fetch and before archiving.


Same here.

