Hacker News | dartdartdart's comments

no displayport 2.1


Ouch. Why is Nvidia always so far behind on this stuff? "Gamers don't care" I guess, but come on.


What’s the use case for a 4K 144 Hz HDR10 monitor besides gaming?


There is one exception to the foreigner comment though. Blonde / white foreigners usually have an advantage, even over native Japanese people since the Eurocentric beauty standard is much in play there. You can see evidence of this in their modeling industry where high paying modeling gigs go to those who are Caucasian or half Caucasian.

Source: Interview with a model in Japan (https://youtu.be/VEubSC2CidA?t=157)


That’s true in certain industries like modeling, but not others, even if you’re white and native-born Japanese. Often, these types struggle to ever be fully accepted into society. They’re seen as outsiders and, at best, as “cool” in an exotic way.


4K 240Hz? I don't think HDMI 2.1 even supports that. How can that be? DisplayPort 2.1 barely supports it.


Compression. Apparently 10k@100Hz is possible. https://en.wikipedia.org/wiki/Display_Stream_Compression

(Depending on your definition of compression, my monitor shows more colors than there are particles in the Universe at 1 billion frames per second. It's just that there's a little bit of quality loss from the compression.)
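As a rough sanity check (a sketch with assumed numbers: 10-bit RGB at 30 bits per pixel, counting active pixels only and ignoring blanking overhead), 4K 240 Hz overshoots even HDMI 2.1's 48 Gbit/s link until DSC is applied:

```python
# Back-of-the-envelope bandwidth for 4K @ 240 Hz, 10-bit RGB (30 bpp).
# Active pixels only; real links also carry blanking and protocol
# overhead, so the true requirement is somewhat higher.
width, height, fps, bpp = 3840, 2160, 240, 30

raw_gbps = width * height * fps * bpp / 1e9
print(f"raw: {raw_gbps:.1f} Gbit/s")               # ~59.7 Gbit/s, over HDMI 2.1's 48
print(f"with 3:1 DSC: {raw_gbps / 3:.1f} Gbit/s")  # ~19.9 Gbit/s, fits easily
```

DSC is specified for ratios up to about 3:1 while staying "visually lossless", which is how these modes squeeze through links that can't carry the raw stream.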


I also like the N-1 strategy for things that are released often. For things released every 3+ years, though, I prefer going with the most recent release.


Still DisplayPort 1.4

Intel has already moved to 2.0, and AMD is rumored to be adding DisplayPort 2.0 support as well

Would have liked to see this for 40 series, but I guess I will wait to build my first PC


Intel has moved to DisplayPort 2.0, but their current product specifications, e.g. for NUC12SNK, say: "DisplayPort 2.0 (1.4 certified)".

But I assume that they will get a certification in the future.

For Thunderbolt, where Intel itself did the certifications, many companies sold motherboards or computers for months or years with non-certified Thunderbolt ports that worked fine while waiting for certification.


Seems like we will need this sooner rather than later too with 500hz screens on the horizon.

Strange since the new frame interpolation in DLSS 3 might bring us much closer to smooth motion on sample-and-hold displays.


return it!


Regarding the Process–architecture–optimization model from Intel, what does each of those upgrades mean to the user?

I believe Process has the most to do with energy efficiency due to the die shrinkage, which is why the M1 was so energy efficient when it launched.

M2 seems like an architecture/optimization change; it seems like they are just able to cram more stuff in, which is why it's faster without improving battery life.

For those familiar with Intel, what does the consumer mainly gain out of optimization product launches?


> I believe Process has the most to deal with energy efficiency due to the shrinkage, which is why M1 was so energy efficient when it was launched.

Eh, that's part of it, but a lot of it has to do with the M1 having very very high IPC (much higher than comparable x86-64 parts), meaning they could run the chip at a much lower max clock speed (3.2 GHz versus boosting to 5 GHz for most competitive x86-64 CPUs) for similar overall performance.

This makes a huge difference because power consumption increases exponentially with clock speed.

edit: thinking about how it relates to Intel's process–architecture–optimization, it feels a bit tricky to compare. Apple's cycle seems to be: architecture (the A-series chip for iPhones), then an optimization with a new die that reuses the basic cores in a different arrangement targeting performance in a small thermal envelope (the M1, for the iPad Pro and small MacBooks), then another optimization with yet another new die reusing the basic cores in yet another arrangement targeting performance in a wider thermal envelope (M1 Pro/Max/Ultra). All of that happens before you get to the next M-series increment, which begins with an A-series increment.

So the M2 is less the optimization of the M1 than it is the re-use of the cores in the new A-series chip preceding it, which was an architectural change, plus optimizations and other SoC differences.


> This makes a huge difference because power consumption increases exponentially with clock speed.

It scales linearly with clock speed.



These graphs look like they depict f only scaling?


Who said anything about f?

I said "power consumption increases exponentially with clock speed" not "power consumption increases exponentially with f". That any modern processor has to crank up V to crank up its clock speed is a given.

Hence, power consumption is exponential with clock speed (which is achieved by cranking up both V, the polynomial term in P, and f).

(Which, fine, if you want to be pedantic, is polynomial growth, but that changes nothing about the point: you have to burn a shitload more power at 5 GHz than at 3.2 GHz, because your consumption isn't scaling anywhere close to linearly.)


Clock speed is just f.


P = C · V^2 · f, yes, but V and f are correlated. You can't increase f without also increasing V, hence: exponential scaling.
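A quick sketch of that scaling (with made-up numbers, under the simplifying assumption that V must rise roughly linearly with f under DVFS, so dynamic power P = C·V²·f grows like f³):

```python
# Dynamic power model: P = C * V^2 * f.
# Hypothetical DVFS assumption: voltage scales linearly with frequency.
def dynamic_power(freq_ghz, base_freq=3.2, base_v=1.0, c=1.0):
    v = base_v * (freq_ghz / base_freq)  # assumed linear V-f curve
    return c * v**2 * freq_ghz

ratio = dynamic_power(5.0) / dynamic_power(3.2)
print(f"5.0 GHz vs 3.2 GHz: {ratio:.2f}x the power")  # ~3.81x
```

Real V-f curves aren't exactly linear, but the cubic-ish growth is why clocking down to 3.2 GHz with higher IPC saves so much power.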


That's still a polynomial, not an exponential


That's polynomial scaling, not exponential.


Waiting is not bad, most child abuse cases come from people who shouldn't have had kids in the first place. The worst case if you wait is that you'll have to adopt kids, which is arguably way more humane than introducing another hungry soul into the world.


It's not that we have too many mouths to feed, it's that we have too many selfish people in positions of leadership and authority who exploit the masses to line their own pockets.

Having 50% fewer people would reduce the overall hunger numbers, but not the ratio.


> Having 50% fewer people would reduce the overall hunger numbers, but not the ratio.

This argument is flawed because we're consuming significantly more resources than our planet can provide sustainably.

Increasing the quantity of consumers will always be detrimental for as long as resources are finite.


> we're consuming significantly more resources than our planet can provide sustainably.

That is not true of everyone. Billions live on/with very little, sustainably.


There very significantly less trees in Europe (also wildlife in many places) 100 years ago. Same applies for industrial pollution (not CO2 emissions).


Yes, pollution and climate change affect everyone and I am saying that is entirely unfair because many people did not choose these things, do not contribute, and do not benefit. The loss of trees in Europe is due to global effects, not just pollution in/by Europe.


Sorry, there was a typo in my comment (very -> were). What I wanted to say is that the situation in Europe is actually better in most places now than it was in the past.


With respect to food poverty, the increase in logistics/technology/marketing productivity may well offset some (perhaps all) of the resource pressure that a growing population creates.

Generally speaking, logistics have huge economies of scale (see Amazon), so an increase in market size will reduce the logistics cost per item of food. Especially for cold, perishable foodstuffs like dairy, the cold-chain costs may well dominate the cost of the cows.


But we’re nowhere near capacity in terms of food production. Not even close. Few countries are seriously even trying to maximize food production.

And don’t forget minimizing waste - which is another major part of the picture.


IIRC the ratio of people at risk of famine has been going down steadily while the population increased, until Covid (I assume it should get back on track eventually, though)


Right, best would be if rich people adopt poor kids


or simply stopped stealing the relief aid meant for their village/city/state/country


Which is a non-existent phenomenon. You don't help people by not giving birth, or by doing nothing in general; you help them by actually donating/adopting etc.


Adoption is not always a viable option.


But having children is? How come?


Why is it great for remote work? The land is already developed, and the undeveloped areas get the sweltering sun of SoCal without the cooling ocean breeze to balance it out. The "good climate" only extends a mile or so inland from shore.


Currently SF is in a more severe drought than LA

Src: https://droughtmonitor.unl.edu/CurrentMap/StateDroughtMonito...

