Hacker News | alexjplant's comments

I saw the band Failure live two years ago a few blocks from where I live. The lights went down shortly after we arrived and a video started playing on the projector wall at the back of the stage. It was a series of interview clips with musicians talking about the band we came to see: how transcendentally amazing Failure was, how incredible the sonic textures on "Fantastic Planet" were, how Ken Andrews is an unappreciated genius, etc. I thought that it would be interrupted after maybe 90 seconds by a loud guitar to kick the show off, but nope - we got to hear Hayley Williams and that Zonie vintner guy who sometimes sings for Tool gush about the band that we were waiting to see for _30 minutes straight_.

At one point I checked my receipt to make sure that we didn't accidentally get tickets to some sort of virtual experience or pre-release screening instead of a concert. The video eventually ended, the band came on, and they gave a great performance. I left feeling more confused than anything; the rest of the crowd's reaction ran the gamut from impassioned to dismissive.

If the art you're putting on display already has a cult following I don't see the need to drive the point home via these weird metatextual commentaries. I'm a weirdo that likes watching movies with crew commentary but I like to do that in my living room, not in a theater.


More practical methods were fairly common during that period since they were actually cheaper and quicker than real computer graphics. The wireframe sequences in "Escape from New York", for instance, were miniatures with fluorescent paint applied to the edges.

Even Tron had quite a bit of rotoscoping with a "computer look", especially the scenes with human actors. Not shots like the lightcycle scene, though; those were actual CGI.

Fun fact: That was one of James Cameron's first gigs.

Not sure about Tron but here are a few details [1] about the Foonly F1 used on that film and how it was later used for Flight of the Navigator:

> They had pushed for Triple-I to build the DFP, the first (that I know of) high-resolution digital film printer for motion pictures. This was the next generation PFR, using an 8" CRT which had fast-decaying phosphors so that it could be used for scanning in film (using photomultiplier tubes built into a special camera) as well as printing. The imagery was amazing

> Since the Foonly only had enough disk storage to hold the frame being computed and the frame being printed, the numbers worked out like this: 30 seconds of film at 24 frames per second works out to 720 images each computed and printed at 6000 x 4000 pixels.

[1] http://dave.zfxinc.net/f1.html
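
For a rough sense of why disk space was the limiting factor, here's the arithmetic from that quote as a quick TypeScript back-of-envelope; the 24-bit colour depth is my assumption, not something the linked page states:

    // Back-of-envelope numbers for the quote above. Bit depth is assumed,
    // not taken from the source.
    const seconds = 30;
    const fps = 24;
    const frames = seconds * fps;            // 720 frames, as quoted
    const width = 6000;
    const height = 4000;
    const bytesPerPixel = 3;                 // assumed 24-bit RGB
    const bytesPerFrame = width * height * bytesPerPixel;

    console.log(`${frames} frames`);
    console.log(`${(bytesPerFrame / 1024 ** 2).toFixed(1)} MiB per uncompressed frame`);

At roughly 69 MiB per uncompressed frame, holding just "the frame being computed and the frame being printed" on disk sounds about right for the era.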


It reminds me of the rote methods of essay writing that they teach in US schools in order to pass standardized tests. It usually comprises an obvious "thesis statement" followed by supporting evidence and a final, contrived recapitulation of the previous few paragraphs. Thankfully this essay starts off well enough by avoiding the stilted "here is my point followed by a slew of observations and three filler sentences" convention, but yeah, that last bit seems a little rough.

In the interest of fairness I'm not sure that I could do much better since I've not written long-form in years.


This method predates standardized testing by years.

I know several people in the US who use the verb and noun forms of "wank" synonymously with "whine" (perhaps because they both have the same starting consonant sound) without realizing how vulgar it is. I've gently pointed out to them what it means in British English and none seem to care.

jQuery II: The Quickening!

I recently built a toy Golang SSR project using Zepto, a jQuery-like library, and felt like I was 17 again (EDIT: it's unmaintained - don't use it). Also of note is that "Highlander II" takes place in the year 2024 [1]. It's a sign! Everything old is new again! `$` is immortal!

[1] https://en.wikipedia.org/wiki/Highlander_II:_The_Quickening
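
For anyone who never used either library: Zepto deliberately mirrors jQuery's core API, so a sketch like this (made-up selectors, `$` declared loosely rather than imported) reads the same against both:

    // Hedged sketch of the shared jQuery/Zepto-style API. The selectors and
    // "panel" markup are hypothetical; with jQuery you'd import $, with Zepto
    // it's a global, so it's declared loosely here.
    declare const $: any;

    $(() => {
      // On DOM ready, wire up a click handler that toggles a class.
      $("#panel-header").on("click", () => {
        $("#panel-body").toggleClass("hidden");
      });
    });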


Zepto is unmaintained, which isn’t good for security issues or bugs, unfortunately.

jQuery proper is maintained, has been actively working toward a 4.0 release, and still gets security and bug fixes.


Thanks for bringing this up. Edited my OP - there's no indication of that on their front page, and the library seemed fairly ergonomic. My bad on that one.

I also created Umbrella JS almost 10 years ago; it's a tiny jQuery replacement, but with more array-like arguments in its functions:

https://umbrellajs.com/

https://www.bennadel.com/blog/4184-replacing-jquery-110kb-wi...
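
To illustrate the "array-like arguments" point, here's a rough sketch built around Umbrella's documented `u()` entry point; treat the exact method signatures as my reading of the docs rather than gospel:

    // Rough sketch of Umbrella-style usage; method signatures are my
    // assumption, so double-check against umbrellajs.com before relying on them.
    declare const u: any;

    // .each() iterates like Array.prototype.forEach, passing (node, index).
    u("li.item").each((node: HTMLElement, i: number) => {
      node.dataset.index = String(i);
    });

    // Several methods take multiple values at once, e.g. a list of class names.
    u("li.item").addClass("highlight", "rounded");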


IIRC, I used Zepto with Phonegap around 2012. What a blast from the past.

"There can be only one"


Several years ago static site generators were all the hotness. Around then I switched to Hugo [1] from WordPress and it's been a good experience. I do all editing locally with the CLI, then chuck it to Git to be built and hosted by Netlify.

[1] https://gohugo.io/


> You do all editing locally with the CLI then chuck it to Git to be built and hosted by Netlify.

Or CloudFlare Pages, or GitHub Pages, or Firebase Hosting, or Fastly, or Vercel, or S3+CloudFront, or your own nginx instance.


Edited to reflect that it's my workflow specifically.

>The idea that accepting cash is cheap is actually a myth. While some business owners might think the 3 percent fee for processing credit cards is a burden, research from IHL Group shows that cash handling costs many retailers between 4.7 and 15.3 percent.

How cute. The value proposition of accepting cash isn't to save on transaction fees, it's to save on Uncle Sam's fees. Cash doesn't have to go into a register and get Z'd at the end of the day (or have a Quicken invoice to go with it, depending upon the type of business). Although I neither condone, participate in, nor tolerate tax fraud, it's absolutely a thing, and their failure to mention it doesn't inspire confidence in their conclusions.


That's only a thing if you don't issue receipts or use a modern POS system. In particular it would happen at flea markets or in old shops that keep only a very forgiving paper trail.

But otherwise your POS won't let you get away with that, and your customers will usually be grossed out if you're handing them goods for cash without any receipt. Food businesses could be the last bastion where it would somewhat work on a regular basis, IMHO.


20 years ago it did. People (kids, at least) made a real distinction between an "iPod" and an "MP3 player", even though one is obviously a subset of the other, because Apple is a luxury name brand. It was basically the blue-bubble controversy of its time, and especially frustrating for socially awkward, self-important 14-year-olds who had superior products like the Creative Zen Vision:M and had to endure ridicule from their friend group in sixth-period lunch because their music player didn't have a click wheel :-).


> Nvidia has graphics chips, but it doesn't have the CPUs. Yes, Nvidia can make ARM CPUs, but they haven't been putting out amazing custom cores.

Ignorant question - do they have to? The last time I was up on gaming hardware it seemed as though most workloads were GPU-bound and that having a higher-end GPU mattered more than having a blazing-fast CPU. GPUs have also grown much more flexible rendering pipelines as game engines have gotten more sophisticated and, presumably, parallelized. Would it not make sense for Nvidia to crank out a cost-optimized design comprising their last-gen GPU architecture and 12 ARM cores on an affordable process node?

The reason I ask is that I've been reading a lot about 90s console architectures recently. My understanding is that back then the CPU and specialized co-processors had to do a lot of heavy lifting on geometry calculations before telling the display hardware what to draw. In contrast, I think most contemporary GPU designs take care of all of the vertex calculations themselves and therefore free the CPU up a lot in this regard. If you have an entity-based game engine and are able to split that object graph into well-defined clusters, you can probably parallelize the simulation and scale horizontally decently well. Given these trends I'd think a bunch of cheaper cores could work about as well as higher-end ones for less money.
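
To make that clustering idea a bit more concrete, here's a minimal sketch (every name in it is hypothetical, nothing from a real engine; TypeScript on Node's worker_threads, assumed to be compiled to CommonJS so the self-spawning `__filename` trick works) of splitting an entity list into independent clusters and stepping each one on its own thread:

    // Hypothetical sketch: partition entities into independent clusters and
    // step each cluster on its own worker thread.
    import { Worker, isMainThread, parentPort, workerData } from "node:worker_threads";

    interface Entity {
      id: number;
      x: number;
      vx: number;
    }

    // Advance every entity in one cluster by a single simulation tick.
    function stepCluster(cluster: Entity[], dt: number): Entity[] {
      return cluster.map((e) => ({ ...e, x: e.x + e.vx * dt }));
    }

    if (isMainThread) {
      // Made-up world: 10,000 entities split into 4 clusters (say, one per core).
      const entities: Entity[] = Array.from({ length: 10_000 }, (_, id) => ({
        id,
        x: 0,
        vx: Math.random(),
      }));
      const clusterCount = 4;
      const clusters: Entity[][] = Array.from({ length: clusterCount }, (_, i) =>
        entities.filter((e) => e.id % clusterCount === i)
      );

      // Each worker simulates its cluster independently; the main thread just
      // gathers results. This is the "split the object graph and scale out" idea.
      Promise.all(
        clusters.map(
          (cluster) =>
            new Promise<Entity[]>((resolve, reject) => {
              const w = new Worker(__filename, { workerData: { cluster, dt: 1 / 60 } });
              w.once("message", resolve);
              w.once("error", reject);
            })
        )
      ).then((results) => {
        const stepped = results.reduce((sum, c) => sum + c.length, 0);
        console.log(`stepped ${stepped} entities across ${clusterCount} workers`);
      });
    } else {
      // Worker side: receive a cluster, step it, and hand the result back.
      const { cluster, dt } = workerData as { cluster: Entity[]; dt: number };
      parentPort?.postMessage(stepCluster(cluster, dt));
    }

The only interesting part is the partitioning; whether a real engine could actually cut its object graph into clusters that clean is exactly the open question.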


I think a PS6 needs to play PS5 games, or Sony will have a hard time selling them until the PS6 catalog is big; and they'll have a hard time getting 3rd-party developers if console sales are slow. I don't think you're going to play existing PS5 games on an ARM CPU unless it's an "amazing" core. Apple does a pretty good job of running x86 code on its CPUs, but they added special modes to make it work, and I don't know how timing-sensitive PS5 games are: when there's only a handful of hardware variants, you can easily end up with tricky timing requirements.


I mean, the PS4 didn't play PS3 games and that didn't hurt it any. Backwards compatibility is nice but it isn't the only factor.


The first year of the PS4 was pretty dry because of the lack of BC; it really helped that the competition was the Xbox One, which was less appealing for a lot of reasons.


At this point people have loved the PS5 and Xbox Series for having full backwards compatibility. The Xbox goes even further through software. People liked the Wii's backwards compatibility, and the Wii U's too (for those who had one).

And Nintendo’s long chain of BC from the GB to the 3DS (though eventually dropping GB/GBC) was legendary.

The Switch was such a leap over the 3DS and Wii U that Nintendo got away with it. It's had such a long life that having no BC would have been a huge blow if the Switch 2 didn't have it.

I think all three intend to try to keep it going forward at this point.


Which is also the reason why many games on PS5 and Xbox Series are kind of lame, as studios want to keep PS4 and XBone gamers in the sales loop, and why the PS5 Pro is more of a scam aimed at hardcore fans who will buy anything a console vendor puts out.


One data point: there was no chip shortage at the PS4 launch, but I still waited more than a year to get one because there was little to play on it.

Whereas with the PS5 I got one as soon as I could (that still took more than a year after launch, but for chip-shortage reasons) because I knew I could simply swap it in for the PS4 under the TV and carry on.


We're not in 2012 anymore. Modern players don't just want a clean break to play new AAA games every month: they also want access to a large indie marketplace, they want the games they play every day, and they want better performance from the games they already have.


The PS5 had Zen 2, which was fairly new at the time. If the PS6 targets 120 fps they'll want a CPU with double the per-thread performance of Zen 2. You could definitely achieve this with ARM, but I'm not sure how new an ARM core you would need.
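
The back-of-envelope behind "double the performance": going from 60 to 120 fps halves the CPU's per-frame budget, e.g.:

    // Per-frame CPU budget at a given frame rate, in milliseconds.
    const frameBudgetMs = (fps: number) => 1000 / fps;

    console.log(frameBudgetMs(60).toFixed(1));   // ~16.7 ms per frame at 60 fps
    console.log(frameBudgetMs(120).toFixed(1));  // ~8.3 ms per frame at 120 fps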


Is there a need to ever target 120 fps? Only the best-of-best eyes will even notice a slight difference from 60.


Yes.

You say that, but you can absolutely notice. Motion is smoother, the picture is clearer (higher temporal resolution), and input latency is half what it is at 60.

Does every game need it? Absolutely not. But high-speed action games and driving games can definitely benefit. Maybe others. There’s a reason the PC world has been going nuts with frame rates for years.

We have 120 fps on consoles today in a few games. They either have to cut back significantly (detail, dropping to 1080p, etc.) or are simpler to begin with (Ori, Prince of Persia). But it's a great experience.


My eyes are not best-of-best, but the difference between 60 and 120 Hz in something first-person is dramatic and obvious. It depends on the content, but there are many such games for consoles. Your claim that it's "slight" is one that only gets repeated by people who haven't seen the difference.


Honestly, I can't even tell the difference between 30 and 60. Maybe I'm not playing the right games or something but I never notice framerate at all unless it's less than 10-20 or so.


I would guess it's partly that the games you play don't have a lot of fast motion, and maybe partly that you're not really looking for it.


I don't think my TV can display 120 fps and I'm not buying a new one. But they promise 4K 60 (with upscaling) on the PS5 Pro, so they have to have something beyond that for PS6.


They have 120 today; it's just not used much.

Even if people stick to 4K 60, which I suspect they will, the additional power means higher detail, more enemies on screen, and better ray tracing.

I think of the difference between PS3 games that could run at 1080p and PS4 games at 1080p. Or PS4 Pro and PS5 at 4K or even 1440p.

