> Whether something is a forgery is innate in the object and the methods used to produce it. It doesn't matter if nobody else ever sees the forged painting, or if it only hangs in a private home. It's a forgery because it's not authentic.
On a philosophical level I do not get the discussions about paintings. I love a painting for what it is, not for being the first or the only one. An artist who paints something that I can't distinguish from a Van Gogh is a very skillful artist, and the painting is very beautiful. My labeling it "authentic" or not should not affect its artistic value.
For a piece of code you might care about many things: correctness, maintainability, efficiency, etc. I don't care if someone wrote bad (or good) code by hand or used an LLM; it is still bad (or good) code. Someone, be it an LLM or a software developer, has to decide whether the code fits the requirements, and this will not go away.
> but also a specific geographic origin. There's a good reason for this.
Yes, but the "good reason" is more probably people's desire to keep monopolies and resist change. Same as with the paintings: if the cheese is 99% the same, I don't care whether it was made in a particular region or not. Of course the region is happy because it means more revenue for them, but I am not sure it is good overall.
> To stop the machines from lying, they have to cite their sources properly.
I would be curious how this can be applied to a human. Should we also cite all the courses and articles that we have read on a topic when we write code?
> I would be curious how this can be applied to a human. Should we also cite all the courses and articles that we have read on a topic when we write code?
Yeah, this is the kind of BS and counterproductiveness that irrational radicals try to push the crowd towards.
The idea that someone owns your observations of their work and can collect rent on them is absurd.
The value of a piece is definitely not completely tied to its physical attributes, but the story around it. The story is what creates its scarcity and generates the value.
It is similar for collectible items. If I had in my possession the original costume that Michael Jackson wore in Thriller, I am sure I could sell it for thousands of dollars. I could also buy a copy for less than a hundred.
Same with luxury brands. Their price is not necessarily linked to their quality, but to the status they bring and the story they tell (i.e. wearing this transforms me into somebody important).
It can seem quite silly, but I think we are all doing it to some extent. While you said that a good forgery shouldn't affect one's opinion of the object (and I agree with you), what about AI-generated content? If I made a novel painting in the style of Van Gogh, you might find it beautiful. What if I told you I just prompted it instead of painting it? What if I just printed it? There are levels of involvement that we are all willing to accept differently.
Regarding art, what do you feel about museums? Why would you go see an original instead of simply looking at a jpg?
Even if you aren't in the group, there is clearly a group of people who appreciate seeing the original, the thing that modified our collective artistic trajectory.
Forgeries and master studies have a long history in art. Every classically trained artist worth their salt has a handful of forgeries under their belt. Remaking work that you enjoy helps you appreciate it further, understand the choices the artist made, and get a better feel for how they wielded the medium. Though these forgeries are for learning and are not intended to be pieces in their own right.
> Regarding art, what do you feel about museums? Why would you go see an original instead of simply looking at a jpg.
I go to a museum to see a curated collection with explanations, in a place that prevents distractions (I can't open a new tab), and to go with people who might be interested in talking about what they see and feel. It is a social and personal experience as well, on top of the information gathering.
> there is clearly a group of people who appreciate seeing the original,
There are many people interested in many things; do you mean to say that "because some people think it is important, it must be important"? There have been many people with really weird and despicable ideas throughout history, and while I am neutral on this one, they definitely don't convince me just by their numbers.
> simply looking at a jpg.
Technically a jpg would not work because it is lossy compression. But a png at the correct resolution might do the trick for some things (paintings that you see from afar), though not for others. Museums have many objects that would be hard to put in an image (statues, clothes, bones, tables, etc.). You definitely can't put https://en.wikipedia.org/wiki/Comedian_(artwork) in a jpg, but the discussion surrounding it touches topics discussed here.
> An artist that paints something that I can't distinguish from a Van Gogh is a very skillful artist and the painting is very beautiful.
There are a lot of such artists who can do that after having seen Van Gogh's paintings. Only Van Gogh (as far as we know) painted those without having seen anything like them before; in other words, he had a new idea.
So, if we apply this to software, should we cite Dijkstra each time we use his graph algorithm?
Should we also say that being able to implement Dijkstra's algorithm is irrelevant because "you did not have the idea"?
It's great to credit people who have an idea first. I fail to see how using an idea is that "bad" or "not worthy"; ideas should be spread and used, not locked up by the first one who had them (except maybe for some small time period).
Even the mechanical skill of painting gets a lot harder without an example to look at. Most people can get pretty good at painting from example within a year or two but it’s a big leap to simply paint from memory, much less create something original.
> higher-profile journals in general have more retractions and research misconduct [1-2]
Given that reviews are not a mechanism to check for truth but for soundness, the higher the profile, the more misconduct I would expect. I mean, would one risk prison to steal $10 or to steal $1 million?
> lower research quality [3]
To quote exactly from your link: "the evidence is mixed about whether they are strongly correlated with indicators of research quality." I find saying "lower" a bit too strong given the original quote.
> in fact weaker statistical power and reliability [4]
That is for a specific field: "cognitive neuroscience and psychology papers published recently"!
> statistical reliability even in high prestige journals is still extremely poor overall [5]
> Also, making it through peer review is highly random and dependent on who you get as a reviewer [6], or is just basically a coin toss even when looking at reviewer groups:
It's a coin toss whether a paper gets accepted at all, and that's less than ideal, but what the system should do (at least) is reject obvious crap, not ensure that good work gets accepted. The danger is the False Positive (accepted even though it's crap) rather than the False Negative (rejected even though it might be something useful).
Overall note: the review system is not ideal and should be improved. But it's a hard, complex and delicate problem.
Oh, I agree this is all super complex and delicate. If I had more time, I'd love to write a more nuanced, many-thousands-of-words blog post going into which journals and fields actually have good peer review and can be more / less trusted.
I just wanted to make a strong rhetorical case by highlighting some things that might be surprising to people making more naive defenses of journals via peer-review-based arguments.
Would the parents comply, though? Many of the restrictions work because most adults agree they are OK. For example, with alcohol, children could drink as much as they want at home if adults permitted it.
If most adults were convinced there is an issue, there are probably enough lock-down modes even nowadays; I am not sure it is a "technical" problem.
I strongly believe that most actually would. All parents I've talked to have had issues with managing their children's online activity. They know there are harmful things they want to prevent them from accessing, but it is simply too hard to configure and set up the existing tools for it. (Besides, every single friend of theirs has no restrictions, so it all seems pointless.)
I can also see large support for uploading ID to various services when talking about kids, but when you reframe the question for adults, most seem to really dislike the idea immensely.
Sure, there will be children with access to unrestricted devices, just like we had kids with porn mags hidden in a forest somewhere back in the day, or that one sketchy guy buying alcohol, etc. But I think this is an acceptable level of risk for whatever harm people want to prevent.
Definitely makes it easier for parents. It also normalizes screen time limits for kids. When none of your kids' friends have screen time limits, it's harder to enforce. When at least there's a few of them, it's easier to get buy-in from your kids.
At that point it's on the parents. We can't stop parents from giving their kids alcohol or drugs either. (Not saying internet access is necessarily on the same level as that but you get the point.)
Consider that even with something as divisive as covid lockdowns and vaccines, the overwhelming majority of people complied with government instructions.
There is a minority of people currently refusing to vaccinate their children properly, and their fucking around is being found out with measles outbreaks in various countries.
Why would this be different? Why wouldn't it be a minority of parents permitting their children to drink, to smoke, to use unrestricted computing resources?
I personally am more afraid of what "someone" can convince other people to do than of them convincing me. Sadly, there are enough people who are easily manipulated that the "smarter" people are probably completely ignored.
If I were to place a bet, I would place it on mass propaganda targeting people below average: it might be simpler, easier, and more cost-effective. So a lot of this talk about "encryption" and "privacy" might in fact be great for those "actors": smart people worry about their precious technology and principles, while "they" talk to "the masses".
You remind me of the fact that humans do not, in fact, have sensors in the skin that specifically detect wetness.
I think, given the number of ideas floating around, it is occasionally good to revisit things that are "known", just in case some underlying assumption has changed, especially in economics, which is harder to get right as it deals a lot with what humans want and do.
I can't see how anyone can think "the exporters pay the tariff" makes any sense. TBH, we'll never know how many people thought it made sense, because it didn't matter.
In the end, money moves around. If, for example, the government just gave citizens the tariff money in equal shares (not that I suggest they would, but it is technically possible), it would be like taking from the citizens who consume more and giving to the citizens who consume less.
So, yes, it is correct in a practical immediate sense that "the exporters pay the tariff", but that excludes many relevant issues, like how prices evolve (which is paid by the consumers), what the government does with the money (it could share it or not), and what others decide to produce (to avoid tariffs). But definitely many people didn't think of all that...
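The redistribution thought experiment above can be sketched with toy numbers (all figures hypothetical, chosen only to show the direction of the money flow):

```python
# Toy model: two citizens import different amounts, a 10% tariff is
# collected on their imports (and passed into the prices they pay),
# and the government rebates the proceeds in equal shares.
TARIFF_RATE = 0.10

imports = {"heavy_consumer": 1000.0, "light_consumer": 200.0}

# Tariff paid on each citizen's consumption of imports.
tariff_paid = {name: amount * TARIFF_RATE for name, amount in imports.items()}
collected = sum(tariff_paid.values())  # 120.0 collected in total

# Equal-share rebate to every citizen.
rebate = collected / len(imports)  # 60.0 each

# Net effect: the heavier consumer ends up transferring money
# to the lighter consumer; the total nets out to zero.
net = {name: rebate - paid for name, paid in tariff_paid.items()}
print(net)  # {'heavy_consumer': -40.0, 'light_consumer': 40.0}
```

The point is only that the tariff plus rebate acts like a transfer between consumption levels; real price dynamics and production shifts are left out, as the comment notes.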
I think the brains of our stone-age ancestors were not built for relativity either. In the end, the normal sequence of generations (having children and then dying at some point) offers a "re-training" of brains. So, besides waiting/hoping for artificial intelligence, we should continue to make (and train) children. It has worked great so far.
Maybe that's due to Amdahl's law applied to software. Everybody imagines that task X, which is improved by 10x, is 100% of the total work, so you will get a 10x overall benefit, when in fact it might be something like 20% of the work, so your overall benefit is only about 1.22x.
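The estimate above is just Amdahl's law; a minimal sketch with the same toy numbers (20% of the work sped up 10x):

```python
def overall_speedup(p: float, s: float) -> float:
    """Amdahl's law: overall speedup when a fraction p of the
    total work is accelerated by a factor s; the remaining
    (1 - p) of the work is unchanged."""
    return 1.0 / ((1.0 - p) + p / s)

# Speeding up 20% of the work by 10x gives only ~1.22x overall.
print(round(overall_speedup(0.20, 10.0), 2))  # 1.22

# Only when the accelerated part is 100% of the work do you
# see the full 10x.
print(overall_speedup(1.0, 10.0))  # 10.0
```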
I have seen people who cannot focus, or are not confident enough about their ideas, unless they are with someone. Yes, for me meetings are annoying, but for those people they reduce anxiety (and, to be honest, sometimes they do have bad ideas that are better shut down).
On remote work, I do see an advantage in having people interact occasionally (I agree daily is probably too much) on work topics outside of meetings: a spontaneous "can you have a look at this" or "oh, what is that program you use". This will help the best performers much less (they know how to solve things, they actively look for new tools, etc.), but most companies have lots of different profiles.
Someone in this thread was also complaining that "management does not get engineering", which I feel is also made worse by working fully remotely: they will not grasp all the topics in a meeting, but if there are more informal talks and they hear the discussions, they might get (a bit) better.
> and programmers turn those instructions into code that does something somewhere, usually after finding ways to avoid bad or unfeasible ideas, while still complying with the instructions.
Some do, but not all. During work-from-home I was staying with a friend of a friend who was a programmer and had a meeting. I was amazed at the level of simple things he was discussing, like "please add an error check, now you made a form where you can insert wrong data" or "make sure the form is visible on a small screen, now it is not". And they talked ~2 hours about each, with the other person showing him exactly what was not working. I do not know the history (maybe he was retraining or something), but if this was what he was usually doing, it was very inefficient and quite simple.
Most of my work was more similar to what you describe (fighting vague instructions and pushing back on unfeasible ideas), but I wonder how much of "the industry" does this.
As for management, I have worked with all kinds. Employees bear a small part of the responsibility: to look for and select good organizations. Otherwise power-hungry idiots reach the top and start dictating, and nothing crumbles because everybody just stays.