Hacker News | fsloth's comments

Thanks! Looks very interesting and useful.

I guess the main problem was that nobody had a budget for those plans. "Technically speaking we can do it but we can't afford it" unfortunately still means "we can't do it".

If only Project Orion had not been canceled due to international treaties banning nuclear testing, we'd be on the moons of Jupiter by now (sigh).


> The aliasing is quite extreme

One interesting thing about becoming a graphics expert and discussing graphics with non-graphics programmers is that many don't care about aliasing.

Aliasing in sound is usually painfully intolerable. It does not seem to be like that in graphics.

Based on this, I don't think aliasing is a critical technical fault; it should be considered a specific aesthetic IMHO. As long as people can perceive the shape of the geometry being displayed, it's "fit for purpose".

If one is rendering in constrained environments I’m fairly sure they’ve made peace with the platform limitations.

(I’m not arguing for aliasing, just that in many practical cases it does not actually matter)


> it should be considered a specific aesthetic IMHO

Case in point: in a 3D platform/adventure game I made back in 2020 for an MS-DOS game jam[0], the rasterizer has intentional "flaws", like the lack of subpixel and subtexel accuracy, no z-buffering and no perspective correction (and, perhaps unintuitively, the last two actually made the whole renderer more complex), because I had a specific "glitchy" look in mind. Adding those would be trivial (and would probably speed up the rendering a bit on 90s hardware), but things like the wavy lines (as seen in the first shot) and "trembling" vertices are intentional.

Similarly, when I ported the game to use 3D hardware APIs, I went for full bilinear filtering because I wanted that 90s "just ported our SW 3D game to use 3dfx" style that a lot of 90s 3D games had (and FWIW the game also runs on an actual Voodoo 1[1] :-P). Though I know some people dislike it, so I added an option to disable it.

[0] https://bad-sector.itch.io/post-apocalyptic-petra

[1] https://i.imgur.com/JssBdox.jpg


I forgot about that game, I loved it.

source for the rest:

https://codeberg.org/badsector/PetraEngine


I can imagine that having no z-buffer requires finding a way to efficiently sort the triangles, but how can affine mapping be harder than perspective-correct?


(I assume by "harder" you meant "slower")

Right now I tessellate the world quads dynamically based on the distance from the camera, which not only has an overhead of its own but also adds extra geometric overhead (more transformations, more sorting), so perspective-correct texture mapping would let me get rid of that tessellation.

Also, 3D meshes (which are static, with no dynamic tessellation) need enough tessellation to avoid getting too distorted when the camera is close, even if the extra geometry isn't really necessary (e.g. IIRC the top sides of the desks are 9 quads/18 triangles when with perspective-correct mapping they'd be just 1 quad/2 triangles).


> Aliasing in sound is usually painfully intolerable. It does not seem to be like that in graphics.

It's tolerable in graphics in many cases, but becomes painfully obvious when the spatial frequency of some model approaches the pixel grid's frequency, and you get very distracting Moiré patterns.

edit:

But I guess in 3D rendering you deal with this differently. You probably don't want to spend resources painting model details that are half a pixel in size, so they get culled entirely instead of causing any Moiré problems.


This. There are tasks that might take you up to an hour to implement yourself, but that you can validate with high enough confidence in a few seconds to minutes.

Of course not all tasks are like that.


Sure. But the added value of SWE is not "spitting out code". Let's see if I need to calibrate my optimism once I take the new model for a spin.


Agreed, SWE as a profession is not going anywhere unless we reach AGI, and that would mean all the rules change anyway.

Actually, now is a really good time to get into SWE. The craft contains lots of pointless cruft that LLMs cut through like a knife through hot butter.

I'm actually enjoying my job now more than ever, since I don't need to pretend to like the abysmal tools the industry forces on us (like git), and can focus mostly on value-adding tasks. The amount of tiresome shoveling has decreased considerably.


I'd agree with this take. Everyone is so pessimistic about LLMs, but I've really enjoyed this new era.

A lot of the tasks that used to take considerable time are so much faster and less tedious now. It still puts a smile on my face to tell an LLM to write me scripts that do X, Y, and Z, or to hand it code and ask for unit tests.

And I feel like I'm more likely to reach for work that I might otherwise shrink from, outside my usual comfort zone, because asking questions of an LLM is just so much better than doing trivial beginner tutorials or digging through 15 vaguely related Stack Overflow questions (I wonder if SO has seen any significant dip in traffic over the last year).

Most people I've seen disappointed with these tools are doing way more advanced work than I appear to be doing in my day-to-day work. They fail me too here and there, but more often than not I'm able to get at least something helpful or useful out of them.


Exactly this. The menial tasks become less of a burden and you can just power through them with LLM generated scripts.

If someone expects the LLM to be the senior contributor in novel algorithm development, they will be disappointed for sure. But there is so, so much work suited to idiot-savant junior trainees with infinite patience.


I'm not sure what type of code you develop but it sure ain't the entirety of the real world.

In my corner of the real world we are quite concerned about the difference between stack and heap and profile our applications to pinpoint any opportunities for optimization in e.g. heap allocations.

Like, most of the world runs on puny chips, and even when the chips are not puny, the workloads still eat up battery and contend with all of the other crap running on the end user's device.


Sorry if I struck a nerve. I maintain that this is a domain-specific skillset and that only specific types of coders really need that knowledge.


Apple Pay on the watch is the killer app IMO. You can pay for everything with it. In civilized cities like London you can use it to pay for public transit. While it may sound gimmicky, this is actually an improvement (if only slight) to daily chores.


I live in London. You have to wear it on the right-hand side for TfL gates and contort yourself to use them. Also, it's not quite as reliable as the iPhone.

It was fine for ordering beer.


It sort of can. For example, you can load Spotify playlists onto it, podcasts via a podcast app, etc. The interface to achieve this is quite cumbersome, though.


Those things are not mp3s.


This drove me insane for a while, trying to figure out how to get long (1h+) mp3s downloaded to the watch for offline playback. The solution is to use an app called iCatcher!

Works reliably for me


> Templates are not that bad as a user

My take would be they are bad for both users and authors.

In 2024, the expected compiler output for an error in a statically typed language is a report that pinpoints, as specifically as possible, where in the source the problem lies, not 40 lines of illegible template error messages.

There are some cases where templates are the best design option, but they should be used only as a last resort, when it's obvious they're the best way.


In 2024, I expect people implementing templates to make use of concepts and, failing that, at the very least static_assert alongside enable_if.

Naturally, reality often doesn't match expectations.

