Hacker News | dev_slash_null's comments

How did you get started in R&D? I've always wanted to get into "exploratory" software development.


Learn to work on code that nobody else wants to touch.


Tried that and ended up stuck with legacy enterprise software.


Have you asked for a raise lately?


Instructions unclear, have career in infrastructure now.


I did a PhD in physics/ML, and since then my work has slowly become more and more software development.


You also have to consider what happens if a 60mph belt breaks/halts while people are on it.


> How exactly is education a privilege? At least in the US, everyone has access to public education.

The article is literally about life in North Korea...


Should we then conclude that freedom of speech is a privilege because people in North Korea don’t have it?


Yes, it is, if the context is global.

Oxford dictionary:

privilege - a special right or advantage that a particular person or group of people has

1st example there: Education should be a universal right and not a privilege.

Globally, it's a privilege. Worldwide, many girls cannot get an education, nor can many poor children.

Even clean water is a privilege on a global scale. Perhaps in the USA as well; see Flint.

Language evolves. It's the 21st century, and privilege isn't about feudal taxes anymore. "Queue" once meant "tail", and "gay" meant "happy".


There's a danger when playing with words like this.

Privilege is bad. It's something we, as a society, should strive to eliminate. When you say education is a privilege, what is the message there?

I understand that the OP, and you, are trying to convey a message of this type: I should be thankful that I get X, not everyone in the world gets X, and I will try to remind myself that X should not be taken for granted. Where X is education, but it can also be clean water, or maybe just the right to live, or the right to not go to a reeducation camp if you criticize the "dear leader".

I subscribe to that type of thinking.

I just don't think that using the word "privilege" is the best way to express this idea.


Except, in non-English-speaking countries, the vast majority of people do not change the Accept-Language header (through the various browser/OS mechanisms) and receive default results. But experiments show that they are more likely to interact with pages shown in the languages Google thinks they actually use.

If you go to myaccount.google.com/language, you can remove languages you don't actually speak, or turn off automatically adding languages altogether. It doesn't always work for everything, but it's a good signal.

I agree there should be a way to force this better, but there's a bunch of work being done on it.

You can add a hl=en parameter to the URL, and most Google apps will respect it.
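Tacking that parameter onto a URL programmatically is simple enough; a minimal sketch in Python using only the standard library (the `force_english` helper name is mine; `hl=en` is the parameter mentioned above):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def force_english(url: str) -> str:
    """Append (or overwrite) the hl=en query parameter on a URL."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))  # existing params, in order
    query["hl"] = "en"                    # overwrites any existing hl value
    return urlunparse(parts._replace(query=urlencode(query)))

print(force_english("https://www.google.com/search?q=test"))
# → https://www.google.com/search?q=test&hl=en
```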


I've heard this excuse before, and I don't buy it.

I live in a non-English speaking country.

The vast majority of people use their devices in the local language, and for a browser to see that the OS is set to, let's say, Thai, and then just proceed to send accept-language: en is absurd.

The other possibility is that these people who cannot understand English somehow manage to use a device set to English on a daily basis. Also absurd.


Unfortunately, that's just not true. Accept-Language is set to English most of the time, even for people in non-English-speaking countries. It's not that they cannot use English; it's that experiments show users are more likely to interact with content in their predicted language, even when Accept-Language is set to English.


> Accept-Language is set to English most of the time, even for people in non-English-speaking countries.

Unless you can point to some kind of evidence that shows Accept-Language is set to English regardless of the host OS' language/region setting, there's zero reason to believe this claim.

> It's not that they cannot use English

Right, the entirety of non-English-speaking, internet-using humans are able to read English; it's just that a bunch of them choose not to, but only when it comes to using a webpage? Is that what you're suggesting?

Edit:

I just did some very quick testing of the theory that browsers will send the "wrong" Accept-Language header.

* Safari on macOS with preferred OS language set to English (Australian) sends `en-AU,en;q=0.9`

* Safari on macOS with preferred OS language set to Thai (and second pref of English) sends `th-TH,th;q=0.9`

* Chrome on macOS with preferred OS language set to English (Australian) for some reason sends `en-GB,en-US;q=0.9,en;q=0.8`

* Chrome on macOS with preferred OS language set to Thai (and second pref of English) sends `th-TH,th;q=0.9`

* Edge on Windows 11 with preferred OS language set to Thai (and second pref of English) sends `th,en;q=0.9,en-GB;q=0.8,en-US;q=0.7`

* Chrome on Windows 11 with preferred OS language set to Thai (and second pref of English) sends `th-TH,th;q=0.9`

Firefox is the one that can get it wrong, sometimes: in the macOS VM, I had already downloaded Firefox before setting the OS language to Thai, and that download sets a browser-level preferred language which doesn't change with the OS language (the default is US English).

But when I downloaded Firefox using Edge in the Win11 VM, the language was already set to Thai, so I got a "Thai preferred" download link when I typed "Firefox" in the search bar, and thus the language string it sends is not surprising: `th,en-US;q=0.7,en;q=0.3`

So, again: without some evidence showing that people who have no understanding of English are using their computers, phones, or tablets in English, I'm calling bullshit on this repeated claim that "non-English users' browsers send requests for English-language content".
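For what it's worth, the q-weights in the headers above decide preference order mechanically; a minimal parser sketch in Python (hand-rolled for illustration, assuming RFC 9110-style quality values; a real server would use a proper header parser):

```python
def parse_accept_language(header: str):
    """Return (language, weight) pairs sorted by descending q-value.
    A missing ;q= defaults to 1.0."""
    entries = []
    for part in header.split(","):
        part = part.strip()
        if not part:
            continue
        if ";q=" in part:
            lang, q = part.split(";q=")
            entries.append((lang.strip(), float(q)))
        else:
            entries.append((part, 1.0))
    # sorted() is stable, so equal weights keep their header order
    return sorted(entries, key=lambda e: e[1], reverse=True)

# The Edge-on-Windows-11 header from above:
print(parse_accept_language("th,en;q=0.9,en-GB;q=0.8,en-US;q=0.7"))
# → [('th', 1.0), ('en', 0.9), ('en-GB', 0.8), ('en-US', 0.7)]
```

In every header listed above, the OS-preferred language comes out on top, which is the point.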


Let me clarify what I'm claiming here.

In many countries, users have their OS/browser settings set to English only when they actually also speak other languages (or, in certain countries, exclusively other languages). There's also a bunch of complexity around spoken versus written language comprehension, but I won't get into that here.

There is much more engagement when Google presents content in languages people actually know (or, rather, languages Google thinks they know), regardless of what their Accept-Language header is.

Unfortunately, I cannot present you with hard stats, as I no longer work there. I'm sorry that's unsatisfactory, and I'm not asking you to blindly trust the word of a stranger. Just wanted to offer up some insight into why the system behaves as it does sometimes.


I've seen quite a few people with my own eyes use devices with an English interface (or French) without knowing much English (or French). Some of them are illiterate. This is quite common in some parts of the world. These people find use in their English devices, because websites like YouTube provide content in their language even when the interface is not in their language.

Sometimes these people cannot switch the interface language of the operating system. For example, Windows Single Language edition forbids changing the language of the interface, and it is common in developing countries.


You can add a "hl=en" parameter to the URL, and most Google sites should respect it.


The “skeleton” of the site usually does, but not the content it pulls. Search results and suggestions are notorious offenders.


Just in case you haven't seen the postmortem of the Cloudflare outage which also was caused by a regex based DoS: https://blog.cloudflare.com/details-of-the-cloudflare-outage...
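The failure mode in that postmortem, catastrophic backtracking, is easy to reproduce; a minimal sketch in Python, using a textbook pattern rather than Cloudflare's actual rule:

```python
import re
import time

# Nested quantifiers over the same text are the classic ReDoS shape.
# On "a"*n + "b" the match must fail, and the engine explores an
# exponential number of ways to split the a's before giving up.
evil = re.compile(r"^(a+)+$")

for n in (14, 18, 22):
    text = "a" * n + "b"  # the trailing 'b' forces the failed match
    start = time.perf_counter()
    assert evil.match(text) is None
    print(f"n={n}: {time.perf_counter() - start:.4f}s")  # roughly doubles per extra 'a'

# The equivalent pattern without nesting fails in linear time:
safe = re.compile(r"^a+$")
assert safe.match("a" * 22 + "b") is None
```

Engines with linear-time guarantees (RE2, Rust's regex crate) reject or avoid this blow-up by construction, which is part of what Cloudflare moved to afterwards.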


That was a great read, but there was one thing I didn't understand: Why would the regex string have "." twice in a row? What does ".." find that "." doesn't find? Does that just mean "at least two characters"?


It means specifically 2 characters, and is equivalent to `.{2}`.

`..+` or `...*` are ways of writing "at least two characters".


A single `.` matches exactly one character. `..` matches exactly two characters (not more, not less).
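The difference is quick to check; a small Python demonstration (these basic quantifiers behave the same in most regex engines, including the one in the Cloudflare post):

```python
import re

# Which strings match which fully anchored pattern?
samples = ["", "a", "ab", "abc"]
for pattern in (r"^.$", r"^..$", r"^.{2}$", r"^..+$"):
    matched = [s for s in samples if re.match(pattern, s)]
    print(pattern, "->", matched)
# ^.$    -> ['a']            exactly one character
# ^..$   -> ['ab']           exactly two characters
# ^.{2}$ -> ['ab']           same thing, with a counted quantifier
# ^..+$  -> ['ab', 'abc']    two or more characters
```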


Except I've been able to do medium-complexity tasks in various Autodesk products (Inventor, 3ds Max) within minutes of opening the program for the first time, without looking at any documentation, while I struggle to do even simple tasks in Blender, even with many hours of going through tutorials. I try to pick it up again every few years, and inevitably give up several weeks in.


You still have to trust your assembler and linker in that case.


In tune with the parent's point about decompilation, the transformation from assembly to machine code is more or less completely reversible and often local, allowing the assembly to be tested in chunks that are less detectable. Linking is also "reversible", though attacks on the linker are actually much more common in practice than attacks on the compiler (LD_PRELOAD injection etc). So the verification he was concerned with becomes much easier when using assembly for bootstrapping.


Locking your door at night is extremely unlikely to have any life threatening downsides. I suppose you could concoct a scenario where you live with someone with an altered mental status and the locked door prevents them from escaping during a fire, but that's reaching.


A gun at rest, which is what this case involved, is also extremely unlikely to have any life-threatening downsides. What is your point?


Are you trying to claim that locking your door at night and carrying a firearm daily both carry the same risk of life-threatening downsides? Because I would very much disagree with that statement.


Yes, because a gun that sits in a holster with a safety on is just as likely to kill you as a door is. Guns don't shoot themselves.


As someone who knows only the very basics of CPU architecture, this was extremely informative, thank you. It's given me lots to read up on.


I stayed at a Holiday Inn Express last night.

(PS: don't forget Agner Fog's microarchitecture manual; you might be one of today's lucky 10,000! https://www.agner.org/optimize/microarchitecture.pdf )

Consider also scihub:9780849337581 for a general overview of architecture approaches in general.

It is funny that we keep reapproaching the "barrel processor" design that the CDC 6600 started so long ago. That "vector processor + peripheral processor" design is super interesting. Nerds are attracted to this design like moths to a flame: it has been repeated and echoed in Sun Niagara, AMD Bulldozer, Xeon Phi, and now Royal Cores/rentable units/Zen 4c/5c/etc. We can just make one thing run fast and have a bunch of workers servicing it that are simple, slow, and cheap, right?

https://cs.uwaterloo.ca/~mashti/cs850-f18/papers/cdc6600.pdf

https://archive.computerhistory.org/resources/text/CDC/cdc.6...

http://www.bitsavers.org/pdf/cdc/cyber/cyber_70/60045000_660...

http://ygdes.com/CDC/cdc6600.html

Very interesting source material; see how they talked about their own processors. A lot of older systems were exhaustively documented, and the info is available now.

--

https://en.wikipedia.org/wiki/UltraSPARC_T1

https://en.wikipedia.org/wiki/Xeon_Phi#Knights_Landing


Thanks - this will take me a while to digest :).

