
My main interaction tool with the system is the pointer. Reaching for the keyboard is something I do when I want to type, but when I am just consuming content on my computer, for example, I keep a single hand on the mouse or the trackpad. In that case, shortcuts are just plain annoying.

On KDE, something nice is that if you have a maximized window and a panel at the top of the screen, you can drag that panel to grab the window (or maybe it was a setting of Latte Dock or something). And since window titlebars nowadays can be cluttered with buttons, it is a predictable way to grab those windows using only the mouse.


The paragraph at the beginning reminded me of the five-step story structure I was taught at school, and I just noticed that it is only featured on the French Wikipedia page [0]. In my experience it worked quite well for classical linear stories, and highlighting it in a text back at school also scored a lot of marks during exams, so by now I am somewhat trained at recognizing it.

[0]: https://fr.wikipedia.org/wiki/Sch%C3%A9ma_narratif (https://fr.wikipedia.org/wiki/Sch%C3%A9ma_quinaire is also describing the same thing)


You are spot on. The simpler version of this is the three-step story structure: setup, conflict, resolution, which is what most pitches use.

But as stories get more complex, with multiple storylines weaving together and different genres mixing in, some structures fit certain stories better than others.

I have figured out 15 structures so far. Next I want to take the WGA's 101 greatest screenplays of all time, a list that goes back as far as Casablanca, and see how some of these structures have evolved and are still evolving over time.

For example, over the past 2-3 years, leaving an open ending (as in Project Hail Mary, which ends in a new universe) shows up in 12% of films, compared to less than 1% before that. That kind of insight is interesting.

Thanks for sharing that link.


The music analogy is very apt. Let me take another stab with a similar inspiration: the parallel I'd draw is with harmony. For instance, in LOTR, the dominant is when Frodo and Gollum struggle at the cliff of Mount Doom and lose control of the Ring, which spins in an upward arc. Maximum tension. Fast forward: Frodo returns to the Shire, and the music is at home again. The tension resolves (tonic). So setup, conflict, resolution would be pre-dominant, dominant, and tonic. Subplots are secondary dominants.

You are spot on. That peak is the climax at the mountain.

https://postimg.cc/nM9cTkpt


Amazing that this is measured.

I would like someone to explain Go to me. Really; I will use strong words, but that's really how I feel.

The syntax departs a lot from C's, and I can't see any reason for it. To me, it looks unstructured, with the lack of semicolons for example. It ignores memory safety, and it feels like it ignored all of the type-system research since C: no discriminated unions, and structs and types in general are heavy to write. It encourages bad patterns, errors out on mundane things like an unused variable, and forces you to handle errors with a lot of code while not catching much more than C in terms of bug-prone practices. The package/module system is a nightmare for contributing to open source projects: modifying a dependency to find a bug is very hard, and even swapping a dependency (or a version) is annoying.
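To make the error-handling point concrete, here is a minimal sketch (readCount is a made-up helper, not from any real codebase) of what a trivial read-and-parse function ends up looking like:

    package main

    import (
        "fmt"
        "os"
        "strconv"
        "strings"
    )

    // readCount reads a file and parses its contents as an integer.
    // Every fallible call needs its own explicit three-line check;
    // there is no sum type to pattern-match on, only (value, error).
    func readCount(path string) (int, error) {
        data, err := os.ReadFile(path)
        if err != nil {
            return 0, fmt.Errorf("reading %s: %w", path, err)
        }
        n, err := strconv.Atoi(strings.TrimSpace(string(data)))
        if err != nil {
            return 0, fmt.Errorf("parsing %s: %w", path, err)
        }
        return n, nil
    }

    func main() {
        n, err := readCount("count.txt")
        if err != nil {
            fmt.Println("error:", err)
            return
        }
        fmt.Println(n)
    }

Two checks for two calls, and the actual logic is three lines out of twenty.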

And what do you get from all of this compared to C? A garbage collector, tuples, and goroutines. There is no metaprogramming (aside from generics, and that was a whole story), and interop with C is limited. To me, it looks like the language does not focus on the algorithms but on the code implementation, which imo is what leads us into poor programming and missed critical logic flaws, because the logic gets buried. I may have forgotten other gripes I had while working with Go, but honestly, if I wanted all of that, I would pick D; at least it interops well with C and has metaprogramming (and it came earlier, which somewhat excuses the lack of certain things).

But really, I am open to someone explaining to me how they enjoy Go, because I feel like I should be wrong when I see most people (some of whom I know are clever) praise Go.

Edit: I added modal expressions to make it clear that it is my opinion.


C, for better or worse, is like a high-level assembly language. You can do anything, which pretty much means you are going to have many security and correctness issues: double frees, use-after-frees, off-by-one errors, buffer overflows, etc. Hence the numerous CVEs.

Go has less flexibility, no pointer arithmetic, a healthy package system, and a smaller domain, mostly consuming or providing network services. My favorite feature is channels. For me they make leveraging the performance of multi-core CPUs straightforward, and they are dramatically nicer than the C approaches I've tried, like pthreads and mutexes.

I wouldn't rate Go as secure as Rust, but it has a pleasingly developer-friendly approach. It seems way more secure than C.

Making a pipeline where each stage is 1 to N threads is pleasingly easy, reliable, and performant.
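For example, here is a minimal sketch of one such stage (the stage name and workload are made up for illustration): N goroutines drain an input channel, and the output channel is closed once they are all done.

    package main

    import (
        "fmt"
        "sync"
    )

    // square is one pipeline stage fanned out over `workers` goroutines.
    // The stage owns its output channel and closes it when all workers finish.
    func square(in <-chan int, workers int) <-chan int {
        out := make(chan int)
        var wg sync.WaitGroup
        wg.Add(workers)
        for i := 0; i < workers; i++ {
            go func() {
                defer wg.Done()
                for v := range in {
                    out <- v * v
                }
            }()
        }
        go func() {
            wg.Wait()
            close(out)
        }()
        return out
    }

    func main() {
        in := make(chan int)
        go func() {
            for i := 1; i <= 10; i++ {
                in <- i
            }
            close(in)
        }()
        for v := range square(in, 4) { // one stage, four workers
            fmt.Println(v)
        }
    }

Stages compose by passing the returned channel into the next stage; no mutexes anywhere.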


I like it because I have more control over the size and layout of my memory structures than in many GC languages, and the goroutine/channel data-flow design model lets me use all the cores pretty evenly without having to worry about mutexes or their subtleties. It is pretty easy to get into the 100k-requests-per-second performance regime without special tweaking. I tend to write either long-lived servers, where the performance per container directly affects the cost, or analytics sorts of code, where I want my laptop to use 1/2 or 3/4 of its cores and get a faster answer from scraping 10M whatevers.
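To illustrate the layout-control point, here is a toy sketch (the Record and Point types are invented for illustration): struct fields are embedded by value, so a slice of structs is one contiguous block of memory rather than a graph of boxed objects.

    package main

    import (
        "fmt"
        "unsafe"
    )

    // Point is embedded by value inside Record, so there is no
    // pointer indirection and no per-field heap allocation.
    type Point struct{ X, Y float32 }

    type Record struct {
        ID  uint32
        Pos Point
    }

    func main() {
        recs := make([]Record, 1_000_000) // a single contiguous allocation
        fmt.Println(unsafe.Sizeof(recs[0])) // 12 bytes, no object headers
    }

That density matters when you are scanning millions of records on a few cores.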

Sorry, I have a question that is a little off-topic: what's the value of generating an image of a laptop on a desk? It's not particularly relevant, when you could have integrated a screenshot of your setup (like the one you put on a few of your repos) or something more unique. And even if you do want to show that, it's easy to find similar images with the same vibe, so I guess it's for some fun I missed in the process?

I like the image. It was simple.

But doesn't that make it bad? It doesn't say anything new, unlike the software in question, which is personalized, so the image isn't even symbolically reflecting the topic. It's a sheer waste of pixels, and of the time spent looking at it or scrolling past the cognitive junk food.

Come on, bro. Why you gotta be like that?

I mean, I'm a gwern fan, but...can't you just let people enjoy things?


I didn't enjoy it. It flunks my criteria for good AI images, because there is no there there: https://gwern.net/blog/2025/good-ai-samples It wasted my thoughts as I stared at it, trying to learn about the author and 'X uses this' and thinking about how Moleskine notebooks relate to personal computing etc., only to realize I had been lied to and my time wasted, as he took advantage of my good faith - the good faith I extend to a fellow hacker, that their images will be meaningful and worth reading.

Just as their words are presumed to be meaningful and worth reading... but slightly less so every day, I fear.


> I didn't enjoy it.

Then it wasn't for you. No big whoop.

> It wasted my thoughts as I stared at it,

I think we have arrived at the kernel of the problem. :-)

The rest of this comment is left as an exercise for the student.


> Then it wasn't for you. No big whoop.

Ah yes. I remember this in Penny Arcade: https://www.penny-arcade.com/comic/2004/03/24/the-adventures...


Consider the cost of generating the image.

> can't you just let people enjoy things?

Dumping slop into the public commons deserves criticism.


I don't know either what they meant, but for comparison, NumWorks calculators are clocked at 216 MHz (100 MHz for the older models, and 550 MHz for some of the latest ones, but not all of them), so it doesn't look that much out of the ordinary. Maybe a little underpowered, going by my experience with the first NumWorks, but eh, it's a calculator, and unlike the first NumWorks they don't try to do CAS.

> Europe has had many decades since then to innovate in technology and they have still not done so. They are almost completely dependent on American and Asian tech. And that is not changing anytime soon.

You are stating that as if this had been the state of things for a century. The dependence on American and Asian tech has been a gradual process, one that accelerated in the 1990s and 2000s. Before that time, every European country had its own tech industry able to compete with the tech giants (Nokia, Siemens, Grundig, Alcatel, Thomson, Olivetti, Philips, Ericsson, Amstrad, and that's only citing a few of the ones that marked history, and only in the field of consumer electronics; a lot of them competed back in the day but ended up fading away, and others were everywhere in the tech industry without ever really being exposed to consumers).


There is also Reynard if you're motivated (it's Gecko-based, but not ready for prime time yet, and to get good performance you'll have to resort to a workaround to get JIT enabled, as it does not rely on Apple's BrowserEngineKit; one of the goals of the project is to give out-of-date iOS devices access to a modern browser).


Hey, love that thing, and I am considering selling my current laptop to get one, but first I wanted to know: does the laptop feature pen/stylus support? I guess not, as it is not using a full glass cover like the 12, and otherwise it would probably have been advertised. And can we expect an upgrade path towards that in the future (by replacing only the panel, or maybe the whole top part)?


I think it is an acceptable quirk for a permission system that has been retrofitted on top of an ecosystem which was not designed with that threat model in mind.

But sure, if I were assigned to make an all-purpose desktop operating system from scratch today, I would likely do this differently, along with a bunch of other things I think (and the app would have to be implemented differently too).


marcan once said this was not possible on M1 Macs. It was possible before, as coolbooter demonstrated, but it seems that the hardware now cannot be completely reinitialized without being power cycled (he said this on Mastodon in 2024; he has since deleted his account, so I cannot give you the exact quote). But you can do wizardry to load macOS' userspace on top of iOS' kernel [0] with a jailbreak.

[0]: https://x.com/khanhduytran0/status/1954724636727587237


You can't reinitialize the hardware, but if whatever you are trying to load is compatible with the state things were left in, then it should work. In a sense, you could consider kexec to be like booting on a kind of weird machine where your interface for talking to the hardware is whatever state macOS left the devices initialized to.

