Drew_'s comments

> I spent way, way too long working out the totals from the various methods of getting a new phone

Did you add this time you spent into those totals? I think if you did, your math would come out differently. Personally, if I even feel the need to do any math like this, the answer is already "no, I can't afford this".


How can a couple of minutes spent comparing different methods be charged against the total? Or rather, wouldn't it be charged evenly across the options being researched, including the proposed "default" of buying a phone outright and then finding a plan that will actually provide suitable service while saving money?

To your credit, just stick with the subsidized phone deal and then don't upgrade when it's paid off. At that point, your phone is technically unlocked.


"I spent way, way too long" sounds like a lot more than "a couple minutes" to me.

My point is that there is no research cost for buying outright. You just buy the phone and then you pick whatever service and plan you want. If I have to spend significant amounts of time penny-pinching on "deals" and contracts, that probably means I'm trying to buy something I can't afford. Moreover, this tendency definitely leads to _more_ spending in the long run, not less.


My question is: why is that the presumed default, as opposed to getting a free or heavily discounted phone from Verizon or AT&T? I guess geography matters, but very few people I know in the US buy their phone outright, so it's not the default.


Yup, it's the same thing with ProMotion missing from the display even though it's identical to last year's Pro model.


They were probably subwoofers. The direction doesn’t make a difference for those.


Apple kind of does something like this with iCloud; however, their per-user "databases" are only virtual:

https://news.ycombinator.com/item?id=39028672


Publications need to charge a la carte instead of force-feeding subscriptions.


People say this a lot about Google, but I'm pretty sure this is true for every big company.


I think the point was that it used to be different at Google (I saw it firsthand). People ensuring services functioned correctly after launch were valued and rewarded appropriately.

Over time, the value given to maintenance and smooth operations decreased in reviews, especially for 'non-core' services, which inevitably led to engineers doing the rational thing: prioritizing launching features, getting credit, and moving on before they got saddled with pesky things like maintenance.

This also had the unintended consequence of politicizing work assignments quite a bit, with the more savvy political operators getting the most opportunities to launch featu... er, 'deliver impact'.


Interesting. Any hypotheses as to why this shift happened?


Stonks only go up /s


> i7-7700k from 2017 coupled with a modern GPU will comfortably output 60fps+ at 1440p

I can say firsthand that this is not true for any modern MP game.

In general, I hate these "it does X FPS at Y resolution" claims. They're all so reductive and usually provably false with at least a few examples.


It so happens that Tom's Hardware recently tested this exact question (how much performance you lose pairing a modern card with an older CPU, compared to the same card with a modern CPU). Full results at [0], but the short answer is that a $900 RTX 4080 GPU with a 2017 CPU will generally do 60+ fps at 1440p in most games, but as low as 55 in a few.

0: https://www.tomshardware.com/pc-components/gpus/cpu-vs-gpu-u...


Not super surprising for single-player games, since they're usually much easier on the CPU than multiplayer. I was not getting a minimum of 60 FPS in Warzone, for example.


A big chunk of gamers just play games like Valorant, CS, FIFA, and CoD, which usually run much better.


I would be shocked if a 7700K with a modern GPU does not get 60fps at 1440p in Rainbow Six, Rocket League, LoL, Dota, CS, Fortnite, etc.


I game on an archaic i5-4460 (4 cores, max 3.2 GHz) paired with an RTX 2070 Super and 16 GB of RAM, and I can run literally everything new in the 50-100 fps range. Coupled with a 34" 1440p VRR display, there's not much more to desire for single-player. I've been running this setup for maybe 6 years, with only the graphics card and corresponding PSU changed.

E.g. Cyberpunk 2077 with everything maxed apart from the RTX stuff is rarely below 80 fps, and with DLSS on it looks good. Baldur's Gate 3 is slightly lower but still smooth. God of War, the same. Usually the first thing I install is some HD texture pack, and so far there's never been any performance hit to speak of. Literally, why upgrade?

Consoles and their weak performance have effectively been throttling PC games for the last few years (and the decade before). There's not much reason to invest in a beefy, expensive, power-hungry, noisy setup for most gaming just to have a few extra shiny surface reflections.


These are all old games, so I wouldn't be surprised either.


> not true for any modern MP game

Pretty big claim

> at least a few examples

Reasonable claim

"Comfortably" doesn't mean everything will be at 60 fps; it means that most things will be, so someone with a 7700K will not feel pressure to change their CPU (the entire point of the thread).


You'd be amazed what runs at 60 fps in 4K if you simply turn down the settings.


On the contrary, I can say firsthand that it is true (a 3700X, which is two years newer, but on benchmarks it's a toss-up between the two). What modern GPU are you using?


The 3700X's 8 cores are a generational leap above the 7700K's quad core for modern games. They aren't comparable at all.

I ran a GTX 1080 and then an RTX 3080. The performance was not very good in modern games designed for current-gen consoles, like pretty much any BR game, for example. Some games got high FPS at times, but with low minimum FPS.


I generally rule out all Intel CPUs between the 4770 and 11th Gen.

The exception is when I need dirt cheap + lower-ish power consumption (which isn't gaming).


Your preference does not change whether or not it can achieve >60 fps at 1440p in modern multiplayer games when paired with a modern GPU.


And now 13th and 14th Gen too, before it's absolutely certain the issues are fixed... So that leaves 12th?


12700k here, absolute unit of a CPU.


> This suggests the iterated optimum strategy is to just keep chasing new points trying to land that golden moment, though.

Yup, exactly. We call that venture capital.


I've been thinking about this for a few years. It blows my mind that publications are not selling articles a la carte. Surely getting _some_ revenue from readers that don't want to subscribe is better than none at all? It feels like a great way to generate leads that might convert into subscriptions as well.

This is not some genius new idea, so surely publications have considered it and rejected it, but for what reason?


Here's what I can think of for why not:

- Hard to predict/forecast recurring revenue (though it's additional revenue)

- Incentivizes click baiting for readers

- Requires setting up a "non-standard" billing system


The stopper (or at least one of them) is probably credit card transaction fees. There is usually a minimum fee of around 15¢, which makes transactions under a dollar much less feasible.
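
As a rough back-of-envelope (using the 15¢ minimum just mentioned; real fee schedules vary by processor, so this is only an illustration), the fixed part of the fee swamps sub-dollar prices:

    # Rough sketch: a fixed 15-cent card fee as a share of the article price.
    # The 15-cent figure is the minimum mentioned above; actual fee schedules vary by processor.
    MIN_FEE = 0.15

    for price in (0.25, 0.50, 1.00, 5.00):
        overhead = MIN_FEE / price * 100
        print(f"${price:.2f} article -> fee eats {overhead:.0f}% of the price")

At 25¢ per article the fee alone is 60% of the price; at $5 it drops to 3%, which is why the batching and pre-pay ideas suggested below change the picture.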


Good point, though I'm personally willing to pay more for articles. For example, a big feature article from a reputable publication is easily worth $10+ to me. Smaller reports would be less, but still at least $1.

Obviously that's very expensive compared to a subscription, but that's kind of the point.


Accumulate and bill once per month. Or pre-pay.


Agreed - pre-pay. I'd pay $5-10, easy, and let it run out slowly over time.
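
A minimal sketch of the pre-pay idea (the class and numbers here are hypothetical, not any publication's actual system): the card is charged once for the top-up, so the fixed processor fee is paid once, and individual articles just debit a stored balance.

    # Minimal prepaid-balance sketch (hypothetical names and amounts, for illustration only).
    # One card charge funds the balance; per-article debits never touch the payment processor.
    class PrepaidBalance:
        def __init__(self, top_up: float, card_fee: float = 0.15):
            self.balance = top_up       # reader's stored credit
            self.fees_paid = card_fee   # single fixed fee for the one card transaction

        def read_article(self, price: float) -> bool:
            if self.balance < price:
                return False            # prompt the reader to top up again
            self.balance -= price
            return True

    wallet = PrepaidBalance(10.00)
    for _ in range(8):
        wallet.read_article(1.00)
    print(f"Remaining: ${wallet.balance:.2f}, card fees so far: ${wallet.fees_paid:.2f}")

Eight $1 articles incur one 15¢ fee instead of eight, so the processor overhead drops from 15% per article to under 2% overall.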


The papers should offer their own card.


The fee comes from the payment processor. There's no way to avoid it while providing credit card services without starting your own payment processing company (at which point you're then eating those costs directly).

FedNow should help in the longer term as non-credit card payment methods show up that use it, but it's still not going to get to the realm of 10¢ micropayments (the service has a flat $0.045 per transaction fee, and companies still have to then consider their own service cost overheads on top of that).


> past that fps is sort of meaningless as the response time is slower than the refresh rate and you get artifacts.

Not quite accurate since OLED panels do just fine driving >240 Hz refresh rates. There are already 360 Hz OLED panels on the market.


What I meant is that fps by itself, without other information, is meaningless for judging panels. If someone says they got a 500 Hz panel, it is not necessarily better than a 240 Hz one (LCD vs. OLED, for example). But if you are comparing panels from the same manufacturer, same tech, etc., then of course it is relevant. And obviously, if you have a panel, it would be stupid to run it at anything less than max fps, if your system can keep up.
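
To put rough numbers on the response-time point from the quoted comment (the GtG figures below are ballpark typical values for illustration, not measurements of any particular panel): once the frame time drops below the panel's pixel response time, pixels can't finish transitioning before the next frame arrives, which is where the artifacts come from and why the Hz number alone says little.

    # Rough sketch: frame time at a given refresh rate vs. typical pixel response times.
    # GtG values are ballpark figures for illustration, not specs of any specific panel.
    PANELS = {
        "fast LCD (~3 ms GtG)": 3.0,
        "OLED (~0.1 ms GtG)": 0.1,
    }

    for hz in (240, 360, 500):
        frame_time_ms = 1000 / hz
        print(f"{hz} Hz -> {frame_time_ms:.2f} ms per frame")
        for name, gtg in PANELS.items():
            verdict = "keeps up" if gtg <= frame_time_ms else "can't finish transitions (smearing)"
            print(f"  {name}: {verdict}")

By this rough measure, a 240 Hz OLED can genuinely outclass a 500 Hz LCD, which is the "not necessarily better" point above.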

