lou1306's comments

> Why does an LLM have to be better than you to be useful to you?

If

((time to craft the prompt) + (time required to fix LLM output)) ~ (time to achieve the task on my own)

it's not hard to see that working on my own is a very attractive proposition. It drives down complexity, does not require me to acquire new skills (i.e., prompt engineering), does not require me to hand data over to a third party or to set up an expensive rig to run a model locally, etc.


Then they might indeed not be the right tool for what you're trying to do.

I'm just a little bit tired of sweeping generalizations like "LLMs are completely broken". You can easily use them as a tool in a process that then ends up being broken (because it's the wrong tool!), yet that doesn't disqualify them from all tool use.


I am by no means an expert in CPython/V8 internals, but arguably Python is way more dynamic (e.g., dunder methods), performs way more runtime checks (whereas JS will happily keep on trucking if, say, you add a string to an integer), and its design makes it very hard to infer which value inhabits a variable at a given point in time (which you would need in order to apply optimizations to the bytecode). This kind of flexibility cannot come for free.
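
For a toy illustration of what I mean (Python vs. the JS behaviour described above; the Meters class is just made up for the example):

    # Python checks types at runtime instead of silently coercing:
    try:
        "a" + 1
    except TypeError as e:
        print(e)  # can only concatenate str (not "int") to str

    # Dispatch goes through dunder methods, which can be rebound at any
    # moment, so what `+` means for a value is only known at the exact
    # point of execution -- which is what makes ahead-of-time bytecode
    # optimization so hard:
    class Meters:
        def __init__(self, v):
            self.v = v
        def __add__(self, other):
            return Meters(self.v + other)

    x = Meters(2)
    print((x + 1).v)  # 3, via Meters.__add__
    Meters.__add__ = lambda self, other: "gotcha"
    print(x + 1)      # "gotcha": same expression, new meaning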

Smalltalk and Lisp are arguably even more dynamic, and yet we have examples of faster runtimes for both.

He (probably) means that the geometry of a PDF "page" can be customized. You can even have different sizes within the same document. But most people using LaTeX or even basic plotting utilities which export to PDF know this.
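
For instance, with matplotlib (file name made up), two figures with different figsize values become two differently sized pages in the same PDF:

    import matplotlib
    matplotlib.use("Agg")  # render off-screen
    import matplotlib.pyplot as plt
    from matplotlib.backends.backend_pdf import PdfPages

    with PdfPages("pages.pdf") as pdf:
        fig1 = plt.figure(figsize=(4, 3))   # a 4x3 inch page
        plt.plot([0, 1], [0, 1])
        pdf.savefig(fig1)

        fig2 = plt.figure(figsize=(10, 2))  # a 10x2 inch page, same document
        plt.plot([0, 1], [1, 0])
        pdf.savefig(fig2)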


I'd reckon the median user is not someone who has been "publishing blog posts multiple times a day, seven days a week" at any point in their life, and that the most likely risk is mindless consumption rather than endless production. But just quit following random strangers on the Internet, keep social media as a way to maintain your IRL network of relations, and you'll usually be alright.

Sure, then we can discuss the inherent issues of the platforms (which are many), but first one has to exercise their own agency to the utmost degree.


Maybe OOP is a strong word, but it would make perfect sense to have an imperative core if you absolutely need the highest possible performance, or if the OOP/imperative core is pre-existing and you want to "regulate" access to it through the affordances of FP.
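
A minimal sketch of the second scenario (all names made up): a mutating core kept private, with a pure-looking function as the only way in:

    def _step_in_place(state: list[float], steps: int) -> None:
        # Imperative core: mutates `state` for speed, or because it
        # already existed and nobody wants to rewrite it.
        for _ in range(steps):
            for i in range(len(state)):
                state[i] = state[i] * 0.5 + 1.0

    def simulate(state: tuple[float, ...], steps: int) -> tuple[float, ...]:
        # FP-style facade: takes and returns immutable values, so callers
        # never observe the mutation happening inside.
        buf = list(state)
        _step_in_place(buf, steps)
        return tuple(buf)

    print(simulate((0.0, 4.0), steps=3))  # (1.75, 2.25)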


Not a physicist, but it seems to be part of astrophysics culture, and it's a good "base measure" for interstellar space (i.e., if something is < 1 pc away, it's likely within your own star system). A parsec is ~3.26 light-years, for context.


At the stellar density in our region of the galaxy, sure. The density is hundreds of times higher closer to the center.


Yeah, it is obviously informed by our local environment; the very definition of a parsec is tied to the structure of our solar system.
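
Back-of-the-envelope check of that definition (a parsec is the distance at which 1 AU subtends an angle of one arcsecond):

    import math

    AU = 1.495978707e11           # metres (mean Earth-Sun distance)
    LIGHT_YEAR = 9.4607304726e15  # metres

    one_arcsec = math.radians(1 / 3600)
    parsec = AU / math.tan(one_arcsec)

    print(parsec)               # ~3.086e+16 m
    print(parsec / LIGHT_YEAR)  # ~3.26 light-years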


Well, Standard ML was initially designed for a theorem prover, so you're definitely onto something here :)


Kudos for publishing in OOPSLA, but alas I'm skeptical. This seems to add visual clutter/mental overhead as one needs to parse a decision tree rather than having a "flat" sequence of patterns with the optional when-condition.

Also,

> all the branches in the corresponding pattern matching expression would need to destructure the pair, even when only one of its components is needed

Can't one just bind the unneeded component to _? Isn't that actually a good thing, as it clearly self-documents that we won't need that value on that branch?
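
E.g. (Python 3.10 match syntax here purely for illustration; the same idea applies to ML-style languages and presumably to whatever the paper targets):

    def describe(pair):
        match pair:
            case (0, _):            # second component unused: _ says so explicitly
                return "starts at zero"
            case (x, _) if x < 0:   # "when"-style guard, still ignoring the second component
                return "starts negative"
            case (x, y):
                return f"starts at {x}, ends at {y}"

    print(describe((0, 99)))   # starts at zero
    print(describe((-3, 7)))   # starts negative
    print(describe((2, 5)))    # starts at 2, ends at 5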


This is not really related to Mediaset, but rather to illegal streaming of football matches.


And Mediaset has never, ever broadcast a football match, right?

-_-'

The current government has passed other laws along those lines, which incidentally allow blocking any website.


Current football rights holders (Sky, DAZN) have little to do with Mediaset.


For VSCode users who want to try out LanguageTool, I cannot recommend the LTeX extension [1] highly enough. Setting up a self-hosted configuration is really easy and it integrates very neatly with the editor. It was originally built for LaTeX but also supports Markdown now.

[1]: https://github.com/valentjn/vscode-ltex


