Hacker News | psychoslave's comments

> They're going to be a decade behind if they keep it up.

42-year-old European here.

I've heard my whole adult life that Europe is ten years behind USA.

That doesn't feel that bad, though. Being on the bleeding edge comes with the thrill and prestige of the avant-garde, but it also means taking the downsides of navigating the unexplored unknown head on, with no one to help with a turnkey solution when trouble hits.

If it means a ten-year buffer before big social seismic troubles, that doesn't sound too bad, provided the buffer actually shelters us. That's not necessarily the case on every matter, though: global climate change is going to impact everyone, no matter the political isolation, and if a direct military aggression happens, it can hurt no matter how prepared the society is.


> It's only with hindsight that we think contagionism is obviously correct.

We, the mere median citizens on any specific topic outside our expertise, certainly don't. And this also acts as social pressure on which theory is going to be given the most credit.

That's not actually specific to science. Even theological arguments can be dumb as hell, or superbly refined by the smartest people able to thrive in the society of their time.

The correctness of a theory, and how well it matches the collected data, is only part of what drives its mass adoption, and not necessarily the most heavily weighted part. It's all interdependence, with feedback loops everywhere: the data collected, the tools used to collect and analyze them, and the metatheoretical frameworks used to evaluate competing models are nothing like absolute, objective givens.


Is AGI about replicating human intelligence, though? Human intelligence comes with its own defects, and whatever we fantasize about an intelligence free of this or that defect, we fantasize it from a perspective that is itself full of defects.

Collectively, at the global level, it seems we are simply unable to avoid escalating conflicts into things as horrible as genocides. Individuals with a remarkable ability to achieve technical feats can at the same time fall short of the most basic expectations of empathy, which can also be considered a form of intelligence.


Not within a PhD, but as a side project I work on a research project on Wikiversity about grammatical gender in French. It references a bunch of books and academic works, probably a hundred I'd guess. The most tedious work, though, is checking which nouns are used in only a single gender, or have some epicene or specific inflection in use in the wild, and giving a reference attesting to it when the fact isn't already so consensual that most general-public dictionaries document it. For that, the research refers to thousands of webpages.

I'm glad that most of the time I just need to drop in the DOI, ISBN, or page URL and MediaWiki will handle filling in the most relevant fields. It's not perfect: it currently generates the output with many different models (some lack an excerpt field), some required fields might be left blank, URLs to PDFs won't work, and so on. But all in all it makes taking note of a reference quick, without getting too much in my way. Creating a structured database out of it can certainly be done later.


There are two types of dichotomy: those which are self-referential and sound, those which are absurd, and those which really go too far.

What does STOA stand for here, please?

Likely a typo of SOTA, i.e. State Of The Art.

That’s unavoidable given the goal: "Unicode provides a unique number for every character, no matter what the platform, no matter what the program, no matter what the language."

https://www.unicode.org/standard/WhatIsUnicode.html


What does "every character" mean? Did it really need to include emojis, for example? Domino tiles? Alchemical symbols? A much smaller number of characters would have been sufficient for all but a tiny number of cases.

> What does "every character" mean? Did it really need to include emojis, for example?

You may be too young to remember, but there was a time when a lot of software had its own way of encoding emoji, if it supported them at all. This sucked for interoperability, especially when using common protocols like SMS.

Some of these implementations were essentially find/replace and would turn various strings of characters commonly occurring in code into emoji. Someone reading your mail containing code on their portable device or other weird client would see parts of that code replaced by emoji. Maybe you had to format your code a certain way, inserting spaces tactically, to avoid accidentally ending up with an emoji. I'm glad we put that behind us for the most part.

Living in a world where you can just copy-paste some text containing emoji (or not) from one application into another is honestly great. Same for all the other symbols that may be embedded in text.

If software has to come up with its own text-embeddable encodings to represent symbols (to allow copy-paste or sharing), things often end up less than optimal.
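To make the contrast concrete, here is a minimal Python sketch (my own illustration, not part of the original comment): because an emoji is a single standardized code point, every conforming program agrees on both its identity and its byte encoding, which is exactly what the old per-app find/replace schemes lacked.

    grin = "😀"  # U+1F600 GRINNING FACE, a single standardized code point

    print(f"U+{ord(grin):04X}")   # prints U+1F600, the same number on every platform
    print(grin.encode("utf-8"))   # prints b'\xf0\x9f\x98\x80', one standard byte sequence
    print("\U0001F600" == grin)   # prints True; escape and literal name the same character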


I take "every character" to mean "anything that was represented in a reasonably common pre-unicode code page or character encoding, as well as anything that might come up in OCR output of text documents".

Emoji obviously got in from Japanese character encodings, and imho the world is better off for that, though many of the extensions of the emoji set really don't seem to get what emoji are used for. Similarly, chess and shogi pieces, as well as symbols from Western playing cards, got in through previous encodings, and domino tiles were accepted for being conceptually similar. A bit questionable, imho.

On the other hand, the Azimuth sign seems to satisfy the "would appear in OCR scans" criterion, based on being published in font catalogues. Even if nobody has come forward with a book it appears in, I don't think type foundries made and advertised lead type characters for fun. It must have had some use in printed publications of some kind (probably scientific, judging from the surrounding context).


>Natural languages are ambiguous. That's the reason why we created programming languages.

Programming languages can be ambiguous too. The point of formal languages is rather that, by convention wherever they are used, they impose a stricter and narrower freedom of interpretation. If anything, they are a subset of the human expression space. Sometimes they are the best tool for the job. Sometimes a metaphor is more apt. Sometimes you need some humour. Sometimes you'd better stay in ambiguity to play the game at its finest.
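As a small illustration of that point (my own sketch, not from the original comment): even in a formal language, the same syntactically unambiguous line can denote quite different operations depending on runtime context. In Python, for instance:

    def combine(a, b):
        # A single, unambiguous parse, yet several possible meanings at runtime.
        return a + b

    print(combine(1, 2))      # prints 3, arithmetic addition
    print(combine("1", "2"))  # prints 12, string concatenation
    print(combine([1], [2]))  # prints [1, 2], list concatenation

The grammar fixes exactly how the line parses; what "+" means is settled only by convention and runtime context. That is a far narrower freedom of interpretation than natural language enjoys, but it is not zero.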


Programming languages are non-ambiguous, in the sense that there is no doubt what will be executed. It's deterministic. If the program crashes, you can't say "no but this line was a joke, you should have ignored it". Your code was wrong, period.

>This is the VHS-versus-Betamax dynamic, or TCP/IP versus the OSI model, or QWERTY versus every ergonomic alternative proposed since 1936.

QWERTY has many variants, and every geopolitical institution seems to have its own odious anti-ergonomic layout. So this case is somewhat different, to my mind. As a French native, I use Bépo.

