Hacker News | qlm's comments

"The screeching of peeling tape is a familiar albeit annoying sound."

I don't find it annoying, I quite like it. The only time I've experienced "ASMR" was from somebody peeling tape next to me.


Agreed, it really depends on the tape for me. Painter's tape has quite a nice sound as it's peeled.

I really like the doppler effect (?) of the sound as the tape gets longer.

Not the Doppler effect: the tone gets lower, and the wavelength increases, as the length of the resonating body increases.
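As a rough illustration of that relationship (an idealized model with made-up numbers; real peeling tape is far messier): for a simple resonator fixed at both ends, the fundamental frequency is f = v / (2L), so the pitch drops as the free length L grows.

```typescript
// Idealized resonator: fundamental frequency f = v / (2 * L).
// `v` is an assumed wave speed; all values below are illustrative only.
function fundamentalHz(speedMs: number, lengthM: number): number {
  return speedMs / (2 * lengthM);
}

const v = 40; // made-up wave speed in the tape, m/s
for (const L of [0.05, 0.1, 0.2]) {
  // Longer free length -> lower fundamental -> lower perceived pitch.
  console.log(`L = ${L} m -> f = ${fundamentalHz(v, L)} Hz`);
}
```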

At that point why not just have an actual deterministic transpiler?


I feel that the devil is in the edge cases, and this gives you the freedom to say "OK, I want to try for a 1.0 match on everything, I can accept a 0.98 match, and for files that match less well it can write up notes that I approve manually". So for areas where the languages differ too much for specific patterns, such as maybe an event handling module, you can allow more leniency and tell it to use the target language's patterns, without having to be so precise as to define every single transformation as you would with a transpiler.

In short: because it's faster and more flexible.


Everything that has been "invented" was invented by humans and on some level depends on the laws of nature to function.

I recently bought whey protein powder that doesn't come from milk. It was synthesized by human-engineered microbes. Did this invention "arrive"?


No I'm fairly certain it was invented and that this style of breathless science fiction roleplay will be looked back on as an embarrassing relic of the era.


I didn't even read the article and know that the headline is 100% correct.

It's the result of stochastic hill climbing of a vast reservoir of talented people, industry, and science. Each pushing the frontiers year by year, building the infra, building the connective tissue.

We built the collection of requirements that enabled it through human curiosity, random capitalistic process, boredom, etc. It was gaming GPUs for goodness sake that enabled the scale up of the algorithms. You can't get more serendipitous than that. (Perhaps some of the post-WWII/cold war tech even better qualifies for random hill climbing luck. Microwave ovens, MRI machines, etc. etc.)

Machine learning is inevitable in a civilization that has evolved intelligence, industrialization, and computation.

We've passed all the hard steps to this point. Let's see what's next. Hopefully not the great filter.


How is that different from "Compact Discs weren't invented, they arrived"?


Point to the single inventor of AI. You're going to have trouble.

Maybe you give it to the authors of a few papers, but even then you'll struggle to capture even a fraction of the necessary preconditions.

The successes also rely on observing the failures and the alternative approaches. Do we throw out their credit as well?

The list would be longer than the human genome paper.


Yes and exactly the same thing could be said for the invention of compact discs. You're just describing "history".


CDs are designed to be exactly the way they are, and you don't get out of them anything more, or different, than what you put in.

Compute and transformers are a substratum, but the stuff that developed on it through training isn't made according to our design.


I don't have a problem with the headline but the article is kind of bad.

And the headline is vague enough that you could read many meanings into it.

My take would be: going back to Turing, he could see that AI was likely in the future, and the output of a Turing-complete system is essentially a mathematical function - we just needed the algorithms and hardware to crank through it, which he thought we might have in 50 years, but it's taken nearer 75.

The "intelligence did not get installed. It condensed" stuff reads like LLM slop.


Very cool, although I found the link to LLMs toward the end to be a little odd.


Definitely, it’s a “we know the only thing anyone cares about right now is AI, so, look this is kind of like AI, sort of, if you squint”


Haha, luckily not! It's a very speculative link, so we didn't want to talk about "AI" too much in the main post. But we originally got interested in this concept because we are interested in other forms of input to the brain (other than the classic reading, listening, watching, etc). The nose is interesting because it seems to have many independent basis vectors and very sharp discrimination ability, so it might be a sensor into which you can pack many inputs. LLMs are just a proof-by-example that ~1k input dimensions is enough to really encode semantic meaning.


I'd personally be very sceptical that the human brain could derive much meaning from smell beyond "smells bad don't eat" or "reminds me of something", but I guess I would have said the same about creating smells via ultrasound so what do I know.


Do you have a testable hypothesis about this or are you flailing in the dark?


Sounds like reactionary nonsense to me. It's just some names. It's not indicative of the debasement of society.


In isolation, yes. But other things have happened as well. People dress like slobs; interestingly, in my country, where GDP per capita has skyrocketed since 1989, standards of clothing seem to have gone down, especially for formal occasions. We have a major problem with physical fitness: Westerners of the 1970s were much thinner and moved more. People read fewer books and spend their days consuming brainrot on TikTok, Instagram and YouTube Shorts.

(Notice that the very word brainrot is a neologism?)

I don't think we should pooh-pooh such developments as irrelevant, and I am very unhappy that they have been subsumed to the universal polarization of the culture wars that consume everything while producing nothing of value.

The Moloch indeed.


> standards of clothing

Examined more closely, this appears to mean nothing more than "people spend less time wearing the clothes that a previously dominant culture considered to be high status markers".


I think you just re-formulated what I said in a more intellectual-sounding and dismissive way.

People will now turn out for a funeral in a tracksuit. Yes, previously dominant culture frowned upon such things. Yes, the culture has obviously changed.

Our main disagreement seems to be whether such change is good, bad, or irrelevant.

I could live with people dressing in a disgusting way, but I really dislike the death of book reading. That will make us all worse at thinking.


You were the one who insisted that "standards of clothing have gone down" (emphasis mine).

When it comes to culture, I believe that things change rather than go up or down. In general, I suspect there are two very long-term (i.e. many millennia-long) trends that occur in parallel, one of them generally improving the human condition and one of them degrading it. The world is literally going to hell in a handbasket at the same time as nearly everything is getting better.

Your concerns about book reading are, of course, the opposite of those of the Greek philosophers who imagined that it would make us all more stupid.


> When it comes to culture, I believe that things change rather than go up or down.

In the '80s movie Trancers, Jack Deth is a visitor from the future, and as he's slicking his hair back with water from a flower vase a woman from the present day asks something like, "People from the future put vase-water in their hair?!" and Jack Deth replies very seriously, "Dry hair is for squids."


Yes, I can live with it, but I think the standards have gone down. It also seems to me that you basically consider that change irrelevant. We can surely disagree on that.

As for the Greek philosophers, I feel you are being too dismissive saying that they imagined us being more stupid. First, it was mostly about Socrates and second, his position was a bit more nuanced than how you present it. He was concerned about education becoming impersonal, which definitely has some downsides (until today, we haven't discovered any educational mode more efficient than 1:1 tutoring, at least from the student's individual point of view; the economic dimension, of course, differs). Second, he believed that our memory capabilities would go down, which they probably did. We don't have much contact with purely oral cultures now, but the little we do, show that pre-literate people were indeed better at remembering their collective past, including their culture, in the sense of "actually having it in their own heads" instead of "hearing about it once in the class and then promptly forgetting what they heard".

How many people today can recite a thousand songs from memory? Not that long ago, people like that would exist and keep ancient songs alive.

Today I hear Ed Sheeran ten times a day (ugh), but I wouldn't be able to recollect the lyrics even if threatened with an execution.

That is certainly one way of being stupider than before. Yes, it is compensated by other improvements, no doubt about that.


> People will now turn out for a funeral in a tracksuit.

I bet if they had shown up in a sport coat you wouldn't have found it notable, despite the fact that sport coats were the tracksuits of their day: https://en.wikipedia.org/wiki/Sport_coat


No, because they lost that meaning in the meantime.

Yes, it is possible that tracksuits will become the go-to clothing for funerals and theatres as well as gyms.


One theory is that you, like me, have reached the age where many modern ways seem dumb and younger people aren't even aware of what has been lost.

Importantly: even if this is the case, it doesn't mean we're wrong!


I suggest you reflect on the value you are placing on aesthetics, and where this way of thinking ultimately leads.


You don't really know how much value I place on aesthetics (not that much, in fact, just more than zero, which is enough to make some judgments).

And "where this way of thinking ultimately leads"? Nowhere special.


“Standards of clothing” is not a set with a total order, and society has never had one way to dress. You’re unfairly projecting your values (of a certain style of dress) onto society as if it’s shared by everyone.


This is not maths, and nothing is shared by everyone in a human society.

I am actually an algebra major and I always felt that the need of some of my peers to stuff the entire outside world into mathematical definitions does not lead anywhere. Please don't mathematize societal concepts ("a set with a total order"), you will only mislead yourself and others. Maths isn't a good tool to understand people.

Let us talk about humans in a human language instead.


Perhaps a controversial view on this particular forum but I find the tendency of a certain type of person* to write about everything in this overly-technical way regardless of whether it is appropriate to the subject matter to be very tiresome ("executing cached heuristics", "constrained the search space").

*I associate it with the asinine contemporary "rationalist" movement (LessWrong et al.) but I'm not making any claims the author is associated with this.


What diction is "appropriate to the subject matter" is a negotiation between author and reader.

I think the author is ok with it being inappropriate for many; it's clearly written for those who enjoy math or CS.


I think it's a trick. It seems to me the article is just a series of ad-hoc assumptions and hypotheses without any support. The language aims to hide this, and makes you think about the language instead of the contents. Which is logically unsound: In a sharp peak, micro optimizations would give you a clearer signal where the optimum lies since the gradient is steeper.


> In a sharp peak, micro optimizations would give you a clearer signal where the optimum lies since the gradient is steeper.

I would refuse to even engage with the piece on this level, since it lends credibility to the idea that the creative process is even remotely related to or analogous to gradient descent.


I wouldn't jump to call it a trick, but I agree: the author sacrificed too much clarity in an attempt at efficiency.

The author set up an interesting analogy but failed to explore where it breaks down or how all the relationships work in the model.

My inference about the author's meaning was this: in a sharp peak, searching for useful moves is harder because you have fewer acceptable options as you approach the peak.


Fewer in absolute or relative terms? If you scale down your search space... This only makes some kind of sense if your step size is fixed. While I agree with another poster that reducing a creative process to gradient descent is not wise, the article also misses the point of what makes such a gradient descent hard -- it's not sharp peaks, it's the flat area around them -- and the presence of local minima.
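The flat-region point can be made concrete with a toy numerical sketch (my own illustration, not from the article being discussed): a sharply peaked function has an almost-zero gradient away from the peak, so a gradient-following search gets essentially no signal there.

```typescript
// Toy example: a sharp peak at x = 0 surrounded by a nearly flat plateau.
function f(x: number): number {
  return Math.exp(-50 * x * x);
}

// Central-difference estimate of the derivative.
function grad(x: number, h: number = 1e-6): number {
  return (f(x + h) - f(x - h)) / (2 * h);
}

// Far from the peak the gradient is effectively zero (no signal about
// where the peak is); close to the peak it is large (strong signal).
console.log(Math.abs(grad(2.0)));  // vanishingly small
console.log(Math.abs(grad(0.05))); // several orders of magnitude larger
```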


I see your point. I'd meant relatively fewer progressive options compared to an absolute and unchanging number of total options.

But that's not what the author's analogy would imply.

Still, I think you're saying the author is reducing the creative process to a kind of gradient descent, whereas my reading was that the author was abductively exploring an analogy.


True, but my point is that not only does the analogy not work, the author also doesn't understand the thing he makes the analogy with, or at least explores the thought so shoddily that it makes no sense.

It's somewhat like saying cars are faster than motorbikes because they have more wheels-- it's like with horses and humans, horses have four legs and because of that are faster than humans with two legs. It's wrong on both sides of the analogy.


I enjoy maths and CS and I could barely understand a word of it. It seems to me rather to have been written to give the impression of being inappropriate for many, as a stand-in for actually expressing anything with any intellectual weight.


A bit harsh, but I see what you mean. It is tempting to try and fit every description of the world into a rigorous technical straitjacket, perhaps because it feels like you have understood it better?

Maybe it is similar to how scientists get flak for writing in technical jargon instead of 'plain language'. Partly it is a necessity - to be unambiguous - but it is also partly a choice, a way to signal that you are doing Science, not just describing messing about with chemicals or whatever.


I have observed it too, it is heavily inspired by economics and mathematics.

Saying "it's better to complete something imperfect than spend forever polishing" - dull, trite, anyone knows that. Saying "effort is a utility curve function that must be clamped to achieve meta-optimisation" - now that sounds clever

If I were going to be uncharitable: I think there are corners of the internet where people take straightforward things, dress them up in technical language, and launder them as somehow academic and data-driven.

And you're right, it does show up in the worse parts of the EA / rationalist community.

(This writing style, at its worst, allows people to say things like "I don't want my tax spent on teaching poor kids to read" but without looking like complete psychopaths - "aggregate outcomes in standardised literacy programmes lag behind individualised tutorials")

That's not what the blog post here is doing, but it is definitely bad language use that does more to obscure ideas than illuminate them.


Yes, you articulated my issue in a much better way than I managed to!


It's a middle school essay that is trying to score points based on the number of metaphors used. Very unappealing and I wouldn't call it technical.

EDIT: For all the people saying the writing is inspired by math/cs, that's not at all true. That's not how technical writing is done. This guy is just a poser.


> I wouldn't call it technical

Fair. Perhaps I should have said it gives the illusion of being technical.


I'll be the first to admit I was unable to follow the article because of this.


I mean, I talk like this as well. It's not really intentional. My interests influence the language that I use.

Why is the rationalist movement asinine? I don't know much about it but it seems interesting.


Just reading the abstract, I have to agree with you.


To be fair, it's always an artistic choice whether you think it is appropriate here or not, but, yeah, this article is a really heavy offender. Reading the "Abstract claim" I caught myself thinking that this word salad hardly makes any sense, but I don't know, and I'm just going to let it go, because I am not yet convinced that it's worth my time to decipher it.

Also, "asinine contemporary "rationalist" movement" is pretty lightweight in this regard. Making an art out of writing as bad as possible has been a professional skill of any "academic philosopher" (both "continental" and "analytical" sides) for a century at the very least.


No, we need more of this. The opposite of this is Robin Williams destroying the poetry theory book in Dead Poets Society; the result was weak kids, and one of them committed suicide. More technical treatment of art is a good thing, but it's to be expected that Anglo-Saxon people have an allergy to it - they think it is somehow socialist or something, and they need art to be unrefined, etc.


I am not sure you watched the same movie I did.


Respectfully, I have no idea what you're talking about. Dead Poets Society is a story and the message of the story isn't that Robin Williams' character is bad.

Are you saying my perspective is anti-socialist? What is "refined" art?


Of course, in the movie they sell the idea that art is not subject to scientific or technical analysis, but if you do an independent analysis you realize those kids didn't become stronger or freer. Art, like the article explained, is related to effort and technique. But people in the US LOVE stuff like Jackson Pollock; they need art to not be a thing you put effort and mind into.


You can put art through all sorts of scientific and technical analysis. Being analyzed is how we teach the techniques to new artists. A mechanical reproduction of it from that analysis is not art, though, and sometimes you get to break the rules in favor of the expression.

Did you know, for example, that Shakespeare coined a great many words and phrases used in English to this day? Before Sam Clemens, people tended to speak in proper schoolhouse English no matter the setting or character. Poetry and prose are not just the ability to arrange words on a page. Novels and plays are not limited to the three-act or five-act story arc. Simile and metaphor are often encouraged, but overused ones are actually frowned upon.


You're confusing art with technical skill. You like art that demonstrates technical skill, that's fine. But art doesn't have to demonstrate technical skill to be artistic - indeed defining what 'art' is exactly is surprisingly difficult.


Can you give an example of an artwork you think is acceptable?


It isn't a generational thing. The choice of emoji is a generational thing, but people of all ages do it. AI most certainly does not use emoji in the same way a young person does (unless you encourage it to, but even then it comes across as cringeworthy). If anything it's closer to how a middle-aged person uses them.

I'd also say the use of text emoticons has all but died out in anything other than ironic usage, or in situations where it's difficult to use unicode emoji (e.g. games or this very site)

When text is very obviously generated by AI it communicates to the reader that there is nothing of value to be read. It always writes in the same vapid, overly enthusiastic, overly verbose way. It's grating and generally conveys very little information per word. It's a cliché at this point, but if nobody bothered to write it then why would I bother to read it?


There is a 0% chance that the vast majority of this site and the repo that was linked elsewhere was written by a human. I would have zero confidence in anything about this language, and frankly your former colleague should be embarrassed about putting this out.

Edit: I just noticed in another comment: "Perfect for : Trading systems, industrial control, Medical devices, aerospace applications". I'd go further than embarrassed, and say this person should be ashamed of themself and take this down.


Maybe I'm misunderstanding what you're saying but applications like this tend to be horrible to use. How do you handle somebody navigating in two tabs at once? What about the back button?


Also bookmarks, etc.? For example, if you have a view with complex filters, you may want to bookmark it.


I guess they use something like sessionStorage to hold tab-specific ids.

But something that can bite you with these solutions is that browsers allow you to duplicate tabs, so you also need some inter-tab mechanism (like the BroadcastChannel API, or localStorage with polling) to resolve duplicate ids.
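A sketch of that idea (all names here are illustrative, not from any particular framework): keep a per-tab id in sessionStorage, and re-mint it when another live tab has already claimed it - which is exactly what happens when the browser duplicates a tab and copies sessionStorage along with it. The collision check is pure logic; in a browser you would populate the set of live ids by announcing and listening on a BroadcastChannel (or by polling localStorage).

```typescript
// Mint a random tab id (crypto.randomUUID() would be the browser choice).
function freshId(): string {
  return "tab-" + Math.random().toString(36).slice(2, 10);
}

// Keep the stored id unless another live tab already claimed it
// (the tab-duplication case); otherwise mint a new one.
function resolveTabId(storedId: string | null, liveIds: Set<string>): string {
  if (storedId !== null && !liveIds.has(storedId)) {
    return storedId;
  }
  return freshId();
}

// Browser wiring, guarded so the sketch also runs outside a browser.
// In a real app you would announce the resulting id on a BroadcastChannel
// and collect other tabs' announcements into `liveIds`.
function ensureTabId(liveIds: Set<string>): string {
  const storage = (globalThis as any).sessionStorage; // undefined outside a browser
  const stored: string | null = storage ? storage.getItem("tabId") : null;
  const id = resolveTabId(stored, liveIds);
  if (storage) storage.setItem("tabId", id);
  return id;
}
```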


Agreed. Also, when you paste somebody a URL, they should see what you saw... if at all possible.

