mswtk's comments

Technically, the actual statement in Galois theory is even more general. Roughly, it says that, for a given polynomial over a field, if there exists an algorithm that computes the roots of this polynomial using only addition, subtraction, multiplication, division and radicals, then a particular algebraic structure associated with this polynomial, called its Galois group, has to have a very regular structure (in technical terms, it has to be a solvable group).

So it's a bit stronger than the term "closed formula" implies. You can then show explicit examples of degree 5 polynomials which don't fulfill this condition, prove a quantitative statement that "almost all" degree 5 polynomials are like this, explain the difference between degree 4 and 5 in terms of group theory, etc.
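For reference, here is a compact form of the standard statement and a textbook quintic it rules out (this is well-known material, not something claimed above; Gal(f/K) denotes the Galois group of the splitting field of f over K):

    % Solvability by radicals, over a field K of characteristic 0:
    \[
      f \text{ is solvable by radicals over } K
      \;\Longleftrightarrow\;
      \operatorname{Gal}(f/K) \text{ is a solvable group.}
    \]
    % Textbook degree-5 example: x^5 - 4x + 2 is irreducible over Q
    % (Eisenstein at p = 2) and has exactly three real roots, which forces
    \[
      \operatorname{Gal}\bigl(x^5 - 4x + 2 \,\big/\, \mathbb{Q}\bigr) \cong S_5,
    \]
    % and S_5 is not solvable, so no radical formula computes its roots.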


Sure, you can do that. The parent's point is that if you want this mapping to obey the rules that an actual definition in (say) first-order logic must obey, you run into trouble. In order to talk about definability without running into paradoxes, you need to do it "outside" your actual theory. And then statements about cardinalities - for example "There are more real numbers than there are definitions." - don't mean exactly what you'd intuitively expect. See the result about ZFC having countable models (as seen from the "outside") despite being able to prove uncountable sets exist (as seen from the "inside").
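For reference, the standard result behind that last remark (textbook model theory, not something asserted in the comment):

    % Downward Lowenheim-Skolem, countable language: a consistent first-order
    % theory with an infinite model also has a countable model. Applied to ZFC:
    \[
      \mathrm{ZFC} \text{ consistent}
      \;\Longrightarrow\;
      \exists\, \mathfrak{M} \ \bigl(\mathfrak{M} \models \mathrm{ZFC}
      \ \wedge\ |\mathfrak{M}| = \aleph_0\bigr),
    \]
    % even though M itself satisfies "there exists an uncountable set".
    % Skolem's paradox: "uncountable inside M" need not mean "uncountable
    % outside M", because the witnessing bijections may be missing from M.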


This is just plainly false. While not quite as bad as the average Ubisoft game, Elden Ring's world design is very similar to that of the Elder Scrolls games, Skyrim in particular. Most of what the author brings up are aspects of the game not related to the open world. The open world itself is not nearly as trend-bucking as a lot of the other game design decisions From Software make, which is honestly a bit disappointing. In general, Elden Ring is their closest game to a typical AAA title; a lot of the quirks and contrarian elements from their earlier games are absent from it.


Yeah. I've finished Elden Ring, and as a long-time admirer of what From Software did to revitalize the action RPG genre, I felt the open world was a deeply unnecessary addition that brings nothing but time wasters.

The truly wonderful content of Elden Ring is in the areas that received the level of care all Souls games have: the legacy dungeons. Stormveil, Raya Lucaria, Leyndell, the sewers...

The basic open world fields present no challenge and nothing of value; no amount of enemy ambushes can stop you from just bugging out on the horse. The areas you "explore" as you discover the world consist of copy-pasted dungeon tilesets with just a few hand-crafted variations - I've seen multiple 1:1 copies of some rooms in places like caves, catacombs and so on. Bosses are reused to the point of exhaustion. There are more than 10 bosses that are essentially clones of Dark Souls 1's first boss, the Asylum Demon, just with one or two new moves and attacks that hit harder than the boss they clone.

The game cannot hide behind the excuse that "you can just skip this content, it's optional" either, because it was purposefully designed to force you to do open world chores if you want the "full experience". Upgrading weapons other than the ones that use somber stones is very painful if you do not explore the countless repetitive mining tunnels, filled with mostly the same enemies and always with a copy-pasted boss at the end. In previous Souls games the materials would be lying around in places you'd naturally traverse as you complete the game; here, the legacy dungeons do not offer much in the way of materials. You would also miss out on a lot of the game's lore if you skipped the copy-pasted content, because this is, after all, a Souls game, and Souls games have most of their writing in... item descriptions. That worked okay in a 40-hour game like Dark Souls 3, where you'd pick up items as you progressed through the game. It's infuriating when you're told you need to do 140 hours of copy-pasted content to experience the same amount of -actual- content other Souls games give you.

In many ways, the side content of this game feels like... Bloodborne's procedurally generated Chalice Dungeons, which were a completely optional side feature of the game that could be safely ignored and left aside. Except that this time, From removed much of the content you'd find on the normal game path and scattered it all around these new chalice dungeons. Cool weapons, unique talismans, the game lore... if you don't do this mind-numbing copy-pasted content, you're only getting half a game.

With all the accolades this game received from the gaming press, and the sales it achieved, earning it a large new audience that had never played Souls games, I just know we're never going to see a traditional From Software game again, and I'm sad. I don't want more open world drudgery. The world didn't need another developer to fall into this trap.

Elden Ring doesn't have the quest compass, the quest log and other "easy mode" features of a game like Skyrim. But it does have the copy-pasted, ultra-linear dungeons that you complete in 5 minutes. It does have the repeated dragon fights that are all the same, except one does a blue breath attack and another a fiery one. It does have the formulaic world structure - each part of the map must contain X number of objectives to do.

In Elden Ring, these are: each "square" of the map has 1 catacomb, 1 or more caves, tunnels, a "hero's grave" with a chariot that instantly kills whatever it runs over, 1 dragon to kill, 1 Erdtree Avatar or Putrid Tree Spirit, 1 church with an upgrade for your flask, 1 tower with a ridiculously simple """puzzle""" (though often a time-consuming one, like finding three hidden turtles to kill on the island to open it), 1 evergaol with a boss you fight in an open arena...

How, exactly, is this game a breakthrough going against the grain of open world design? It doesn't go against the grain; it follows the formula to a T. Lacking the UI of mainstream games doesn't mean the world isn't designed like a Bethesda or Ubisoft game.

The only parts of Elden Ring that are unique are the ones From had already done in every one of their previous games. The open world of ER, though, is nothing new, nothing groundbreaking, and its main purpose is to inflate the amount of playtime. I understand de gustibus and all that... but unanimous 10/10s in the gaming press? Is that all it took? If I were a game developer at Bethesda or Ubisoft, I would be very angry with the state of the gaming press.


Thanks, this comment really made me think. I noticed a lot of the same things you did, but my impression was much less negative.

Maybe it’s because of the presentation. In many open world games the UI makes it clear that there are “X towers to climb” or “Y camps to clear”… there’s a checklist of goals. Elden Ring just lets players stumble on things as they will. Even if the map design is similar to other open world games, I feel less pressure to “do all the things”; as a result my personal journey feels more organic.

I did recognize the template pieces used in caves/mines and it did turn me off a bit. But because I’m not guided to clear all of them checkbox style it was less offensive. In a way it’s smoke and mirrors: a lack of information makes the design more mysterious than it really is.

A sibling comment mentions that the length makes multiple playthroughs a pain, but I generally am one-and-done with Souls games.

In another thread I commented that the open world gives a casual player more options instead of getting forever stuck on a single challenge. I think that feeling of freedom combined with faster (but buggy) movement is responsible for the broader appeal and high scores. Oh and the art direction doesn’t hurt.


Yeah, that's more or less how I feel about it as well. I would even go further and say that the existence of the open world makes the Legacy Dungeons worse. It's quite unfortunate because in terms of visual design and architecture, they're some of From's best work; however, they tend to be full of the same enemies you've already fought multiple times in the open world, taking away from the feeling of venturing into the dangerous unknown that I personally find so compelling about these games.

It's also unfortunate that this looks like a game with a lot of build variety that would really lend itself to multiple playthroughs, but actually playing through the content again sounds like a pain. Going through the wiki and making a list of places I actually need to go is not my idea of a good time.


You don't have to, and I personally also find small-talk with strangers rather tedious, but in the specific cases brought up by the author, it sounded like he would've saved himself time by just briefly explaining his reasoning. Which also would've had the nice side-effect of treating his interlocutors as rational human beings worthy of a measure of respect.


Fundamentally, a lot of knowledge we possess as individuals is socially constructed, in the sense that we trust the process and the institutions that created it. Even if this knowledge can, in principle, be verified, it is usually impractical for an individual to do so.

Putting science aside, how can I be confident that the basic facts presented in, say, a NYT article, are correct? I can trust the reputation of the NYT as an institution, and I can also trust that any inaccuracies in the article will be called out by other publications. But if I feel everyone's in on a conspiracy to push a particular viewpoint, then I need to anchor my knowledge in a different institutional framework - even if that might just be a random Facebook group of strangers, or my weird uncle, or a niche radio station.

This, incidentally, is why the censorious push against "disinformation" is misguided in principle. It sweeps the underlying problems under the rug in the hope that they disappear spontaneously. In an open society, truth can only be established as a result of public discourse; anything else is just a representation of the perspective and interests of some authority. Whatever its source, the seemingly growing distrust in public institutions will not disappear simply because we make social media companies remove its most obvious symptoms.


I think the problem you describe - being unsure of the origins and level of distortion of one's information on its journey between "reality" and "perception" - is an old problem, and one that is rather quickly being solved by transparent technology.

It used to be that you had to trust the institution, because it was best suited and most capable of verifying and validating the data and the individuals. Trust problems were not publicized widely, and they also did not have any obvious solutions.

Today, we are technically capable of validating and verifying the entire chain of custody or traversal for any given piece of information, and the only thing missing is the infrastructure to do so. On the other hand, if an institution demonstrates untrustworthiness, it is difficult to conceal.

I think that, more and more, we will demand to see the entire chain of creation and origin for a piece of information, and either validate it ourselves, or delegate that validation to a party we personally trust. That validation can then itself be validated with reputation.

Imagine, for a minute, if an article came with a list of all the writers, contributors, scientists, interviewees, editors, etc., who contributed to the article. And not only that, imagine you can see the entire social graph between you and those people. That is what the future looks like, IMO.
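A minimal sketch of what that kind of verifiable chain of custody could look like. Every name, field and the HMAC "signature" stand-in here is hypothetical and only illustrates the idea; a real system would use public-key signatures and some identity infrastructure:

    # Hypothetical sketch: a hash-linked provenance chain for an article.
    # HMAC is used as a dependency-free stand-in for real signatures.
    import hashlib
    import hmac
    import json


    def _digest(record: dict) -> str:
        """Canonical hash of a provenance record."""
        return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()


    def append_entry(chain: list, contributor: str, role: str,
                     content_hash: str, key: bytes) -> list:
        """Add a contribution, linking it to the previous entry and 'signing' it."""
        prev = chain[-1]["entry_hash"] if chain else ""
        record = {"contributor": contributor, "role": role,
                  "content_hash": content_hash, "prev": prev}
        entry_hash = _digest(record)
        signature = hmac.new(key, entry_hash.encode(), hashlib.sha256).hexdigest()
        return chain + [{**record, "entry_hash": entry_hash, "signature": signature}]


    def verify_chain(chain: list, keys: dict) -> bool:
        """Check that every link and every 'signature' is intact."""
        prev = ""
        for entry in chain:
            record = {k: entry[k] for k in ("contributor", "role", "content_hash", "prev")}
            if entry["prev"] != prev or _digest(record) != entry["entry_hash"]:
                return False
            expected = hmac.new(keys[entry["contributor"]],
                                entry["entry_hash"].encode(), hashlib.sha256).hexdigest()
            if not hmac.compare_digest(expected, entry["signature"]):
                return False
            prev = entry["entry_hash"]
        return True


    if __name__ == "__main__":
        keys = {"interviewee": b"k1", "writer": b"k2", "editor": b"k3"}
        article_hash = hashlib.sha256(b"draft text").hexdigest()
        chain = []
        chain = append_entry(chain, "interviewee", "primary source", article_hash, keys["interviewee"])
        chain = append_entry(chain, "writer", "author", article_hash, keys["writer"])
        chain = append_entry(chain, "editor", "fact check", article_hash, keys["editor"])
        print("chain valid:", verify_chain(chain, keys))

Delegated validation would then amount to trusting whoever holds (or attests to) the keys, which is where the reputation and social-graph part comes in.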


> You think John Carmack was the only one trying to make 3d games when Doom came out? Thousands of programmers were trying to. Does that make him a 10x programmer in your opinion? Or more like a 1000x?

I think the idea of 10x programmers gets so much pushback because it's often bundled with this kind of toxic hero worship. There's a difference between acknowledging the impressive ability of outliers, and idolizing heroic one-man efforts as the pinnacle of what we should all aspire to as software developers.

Incidentally, Carmack was not the only programmer on early id Software games, and Doom was far from the first 3d first-person game made. Arguably, it wasn't even that ambitious compared to, say, Ultima Underworld, but the programmers working on that get much less immediate recognition.

As a matter of fact, these discussions often remind me of that famous IGN quote of Warren Spector. Except we really should know better on HN.


Warren Spector's quote, for reference:

>"There's a tendency among the press to attribute the creation of a game to a single person," says Warren Spector, creator of Thief and Deus Ex.

https://www.ign.com/articles/2001/11/12/deus-ex-2


There's also this insane idea that people just come out of the womb as 10x devs, as though they are literally a different breed and no amount of studying or effort could bridge the gap. It's like a bunch of people who still believe their dad is magical and the best, and haven't discovered that we're all made of mostly the same stuff.


That honestly sounds like a failure to communicate with the researcher first and foremost. If it's difficult to prioritize the fix internally due to organizational politics, that's one thing, but that shouldn't stop the bounty team from communicating the status to the researcher. In fact, that should be the simplest part of the whole process, as it's completely within the purview of the bug bounty team. If they handle that right and build some trust, they might be able to successfully ask the researcher for an extension on disclosure.

Case in point, Apple likely could have come out of this looking much better if they hadn't ignored and then actively lied to illusionofchaos. That really isn't a very high bar to clear.


My understanding of doing things "properly" as an engineer is picking the solution with the right tradeoffs for my use case. If the cost to the business of having some amount of scheduled downtime occasionally is significantly less than the engineering cost of maintaining several 9s worth of availability over major migrations, then I consider the former to be "done right".
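A back-of-envelope sketch of that tradeoff, with entirely hypothetical numbers (none of them come from the comment above):

    # Rough comparison: planned-downtime cost vs. the cost of engineering it away.
    # All figures below are hypothetical placeholders.

    HOURS_PER_YEAR = 365 * 24

    def allowed_downtime_hours(nines: int) -> float:
        """Yearly downtime budget for an availability of e.g. 99.9% (nines=3)."""
        availability = 1 - 10 ** (-nines)
        return HOURS_PER_YEAR * (1 - availability)

    revenue_lost_per_downtime_hour = 500          # dollars (hypothetical)
    planned_downtime_hours_per_year = 8           # e.g. two 4-hour migration windows
    engineering_cost_of_zero_downtime = 150_000   # extra engineer-time per year (hypothetical)

    downtime_cost = planned_downtime_hours_per_year * revenue_lost_per_downtime_hour
    print(f"3 nines allows ~{allowed_downtime_hours(3):.1f} h/year of downtime")
    print(f"4 nines allows ~{allowed_downtime_hours(4):.1f} h/year of downtime")
    print(f"cost of accepting planned downtime: ${downtime_cost}")
    print(f"cost of engineering it away:        ${engineering_cost_of_zero_downtime}")

With numbers like these, accepting the planned windows is the cheaper, and in that sense "proper", choice; with different numbers the conclusion flips.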


> I for one think it's great that people are willing to make things a little more accessible for more people. If people want to be part of our community of developers I think it's great that these organizations listen to people who might have a problem with certain terminology. Even though I don't have a problem with those terms, I think it's still worth evaluating if they're worth keeping if it makes it harder for someone to be part of our community.

Is it more accessible? As in, is this change driven by complaints from actual people who feel excluded by the terminology? As far as I'm aware, none of the projects making these changes even claims that; it's all speculation on behalf of hypothetical offended parties.

Not that it really makes it less annoying to have terminology used by people from all over the world be dictated by American cultural sensibilities, but it's easier to stomach if there's some material justification behind the change.


> Is it more accessible? As in, is this change driven by complaints from actual people who feel excluded by the terminology? As far as I'm aware, none of the projects making these changes even claims that; it's all speculation on behalf of hypothetical offended parties.

I'm an African American, and no, I'm not offended by Git's branch name. White progressives spend so much time on virtue signalling but hardly pay any attention to pressing Black problems like Black poverty and education.


Genuine question: do you feel like the changes are also condescending?


I don't know; I can't speak for the people whose access is limited. However, I am Dutch and can say that these cultural sensibilities extend far beyond just the "American" one.

There are plenty of people who struggle with this terminology in a realistic way. Even if you can't find anecdotal evidence of someone being offended by this, you can rationally come to the conclusion that it might be worth changing. And for it to be accessible, the push doesn't need to come 100% from the people who face problems with the terminology. If it did come 100% from those people, though, that would be a great form of cultural emancipation!


I mean, I don't think it's unreasonable to ask for at least a couple of actual examples of this change helping people feel better about participating in technology. In the absence of those, it all feels very performative and, dare I say, a cheap way to score good PR for the participating organizations. I don't think it really hurts anyone to an extent that it should be opposed, but neither does it really help, until proven otherwise.

> However, I am Dutch and can say that these cultural sensibilities extend far beyond just the "American" one.

Are they? Does the master/slave terminology also have very negative connotations in your culture? I thought it was almost exclusively an American thing due to their historical circumstances.


I would suggest looking up what the Dutch did in relation to racial equality throughout history, and how we were involved in a lot of the slave trade, sometimes even to the US.

I would say you have a point about it being performative, and I don't have anything to counter that. However, perhaps your energy could be spent looking for someone who is actually offended by this, to counter your own perspective?

Kinda Karl Popper style of disprove your own theory?


> I would suggest looking up what the Dutch did in relation to racial equality throughout history, and how we were involved in a lot of the slave trade, sometimes even to the US.

I am aware of the history, but that's not enough to give the words themselves emotional charge and significance. The reason this is so for Americans is that the consequences of slavery and racial segregation are keenly felt right now - it's not just an abstract wrong committed on people long ago and far away. As a point of comparison, I'm from Eastern Europe, and the word "slave" is derived from "Slav" - but this is effectively ancient history with little bearing on the present, and so the word doesn't carry any emotional charge or special meaning.

To put things differently, is there a segment of the Dutch populace for whom the words "master" and "slave" signify that kind of viscerally felt injustice, as they do for black people in the US? This isn't a gotcha question - I genuinely don't know, and these kinds of associations can be arbitrary and irrational. For Poles, "slavery" is abstract, but "forced labor" brings up some major traumas from around World War 2, for example.

> Kinda Karl Popper style of disprove your own theory?

I was hoping someone would do it for me in this thread. :) Might still happen, if not, I might have to do some digging.


> I am aware of the history, but that's not enough to give the words themselves emotional charge and significance.

I disagree. The Nazis and the Soviets committed a lot of crimes against humanity, and in certain countries the symbols of those regimes are banned; speaking positively about them is also banned, on the grounds that doing so dismisses all those crimes.

A word isn't required to be relevant NOW in order to be somewhat negative/avoided.

I think this also applies to the master/slave stuff: keeping those terms attempts to normalize them by dismissing their history. Also - if we set aside for a moment that we're used to the master branch in git - why is "master" the right choice of word for it? To me, "main" makes sense.

As for databases, original/replica also makes sense.


> dictated by American cultural sensibilities

The English-language Internet (and the tech field) has a large centre of gravity in the US, so those of us outside the US do tend to view a lot of the rending of garments over some topics as quite strange.


There is no way to separate dissent from disinformation a priori, other than by making assumptions about the other party's good faith. And so you can paint anything you disagree with as disinformation if you think poorly enough of the other side, as you've demonstrated in this post. And it's very easy to think poorly of political opponents in particular.

The end result will be no dissent whatsoever, just two groups calling each other liars at every step.


> There is no way to separate dissent from disinformation a priori

You might as well say "There is no way to separate pornography from innocent family photos, other than by making assumptions about the other party's good faith."

Yes, it's a judgment call. But it's a judgment call that's not hard to make; it's a judgment call that we need to make in order to maintain a civil society.


> You might as well say "There is no way to separate pornography from innocent family photos, other than by making assumptions about the other party's good faith."

I think sexual activity and nudity are a lot more relevant than one's perception of the creator's good faith in this case.

