
I find this line of thinking fascinating considering how many things we do without a second thought (forced to drive for basic errands, etc.) that are orders of magnitude more dangerous.

Anyway, my point is that the people most at risk of poisoning themselves are those unfamiliar with the process. I'm pretty sure a ton of people were doing this anyway for non-commercial purposes without realizing an unenforced federal law even existed.


The table shows this better. All have just a single reported layoff "event" except Amazon, which has three. Even a decade ago, Amazon had many reports on Glassdoor and elsewhere of horrible middle management and employee churn.

All these layoffs seem to track better with longer-term decline than with AI progress. One would otherwise expect the layoffs to reflect the multiple and much hyped "step change" improvements over the past few years. Instead the chart shows a sudden plateau starting a month ago. Probably when this last made the rounds somewhere else (maybe reddit? too lazy to search).

There's also a huge hole in media reporting regarding smaller businesses. That's where you'd expect AI to have the biggest impact. Instead we hear crickets.


> One would otherwise expect the layoffs to reflect the multiple and much hyped "step change" improvements over the past few years. Instead the chart shows a sudden plateau starting a month ago.

Can you clarify the theory here? So if there is a "step change", you expect companies to do layoffs all at once? How does this account for, e.g., diffusion lag, or companies deciding whether it's better to chase growth vs. capital efficiency?


Yes, I would expect a lag, but we've been hearing hype about huge AI productivity improvements with every model update for years now.

Despite this, the layoffs are steady over that time with no such spikes. If the goal was ever to cut costs, payroll is never spared, since it's too big to ignore. Chasing growth is unlikely when lending and investment are tight. Why invest in other tech companies when you can invest in AI?


> the big players optimize for the casual user

This is the OG enshittification.

Software quality is declining because people don't have the same problems anymore. They've become detached from their true desires and have learned to cope within their walled-garden ecosystems. If their iPhone doesn't do it, they just pretend it's not possible.


I think you're far better off looking after your longer-term diet to prevent the inflammation in the first place. The antioxidants in plant foods are your phenols, carotenoids, and vitamins, while in meat they are the amino acids making up complete proteins. The mechanisms at play there are way better understood.

I personally try to include ingredients like garlic, cinnamon, and ginger where possible, guide my snacks more towards nuts and cheeses, and avoid too much saturated fat while still getting most of my protein for the day from real meat. I take my salads and stir-fries very seriously, but it seems to be a lost art at times.

I try not to overthink these basics, but I'm willing to bet many people have mediocre to poor diets from this perspective despite knowing better because they lose track and things get boring.

I feel like in this day and age we should be in the middle of a scientific and culinary renaissance full of exciting recipes that incorporate these ingredients in new ways. Instead I see a lot of traditional or ethnic-inspired cuisine lacking creativity. Not that what we have is bad, just boring.

All this to ask: does anyone have solid cookbook recommendations?


Why not just loosely wrap the antenna or entire box in foil or move it to the basement/garage/roof?

If you're going for realism, bad wifi is a radio signal problem.
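
(If you'd rather fake it purely in software: a minimal sketch using Linux tc/netem, assuming a Linux box in the path and a hypothetical interface name wlan0. Note this only emulates loss and latency, not the retries, rate adaptation, and interference that make real bad wifi behave the way it does.)

  # emulate a flaky link: 15% packet loss, 300ms +/- 100ms jitter
  sudo tc qdisc add dev wlan0 root netem delay 300ms 100ms distribution normal loss 15%

  # put the link back to normal
  sudo tc qdisc del dev wlan0 root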


Oh I'll write that down! Looks like it can at least do something bad to the signal.

Not necessarily; it could also be on-band or off-band interference, or bugs in the AP, or too many clients on the network.

I don't think there's anything inherently autistic about that. We just finally have these technologies sufficiently mature that materials and design are no longer strictly dictated by their function.

These objects are becoming more like clothing and less like unyielding industrial machines. It's to the point that I'd be genuinely disgusted to handle any used laptop regardless of how "clean" it is.


> We just finally have these technologies sufficiently mature that materials and design are no longer strictly dictated by their function.

It's not a new thing; cars started getting fins in 1948.

https://en.wikipedia.org/wiki/Car_tailfin


> materials and design are no longer strictly dictated by their function.

Ok… but I don't like to injure my wrists…


Nope. Virtual windows are rectangular because the screen is also rectangular and small enough that its edges fall within our field of view.

They don't have to be any particular shape or size. The property of being virtual overrides everything else when free of these self-imposed constraints.

Even if you lose the GUI and go back to text, the ideal terminal is a plane of infinite columns of arbitrary cell size that dynamically fills your field of view.

I'd further argue that the only reason VR/AR isn't more widely adopted is the lack of a per-application choice between orthographic and perspective projection (and uncomfortable headsets). In VR/AR, you don't want a window manager or even windows at all. What you want is a field manager (as in FOV "fields" of varying opacity that can be composited by the user). Shape and size are just arbitrary properties of a region blended into the environment.

For the sake of ergonomics, you'd more often prefer to project an interface onto a surface if you had the choice. When you don't, you probably want the projection to be orthographic, but for the edges to be fuzzy if not invisible. You'd generally want to be able to layer these interfaces as well instead of having opaque rectangles always in your way.
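
(For concreteness, the orthographic vs. perspective distinction is just a different 4x4 projection matrix. A minimal NumPy sketch, assuming a symmetric viewing volume; the function names are illustrative, not any particular API:)

  import numpy as np

  def ortho(w, h, near, far):
      # parallel projection: distance does not change apparent size
      return np.array([
          [2.0 / w, 0.0,     0.0,                 0.0],
          [0.0,     2.0 / h, 0.0,                 0.0],
          [0.0,     0.0,    -2.0 / (far - near), -(far + near) / (far - near)],
          [0.0,     0.0,     0.0,                 1.0],
      ])

  def perspective(fovy, aspect, near, far):
      # foreshortened projection: farther points shrink toward the center
      t = 1.0 / np.tan(fovy / 2.0)
      return np.array([
          [t / aspect, 0.0, 0.0,                          0.0],
          [0.0,        t,   0.0,                          0.0],
          [0.0,        0.0, (far + near) / (near - far),  2.0 * far * near / (near - far)],
          [0.0,        0.0, -1.0,                         0.0],
      ])

A "field manager" as described would pick one of these per surface (orthographic for an interface pinned to a wall, perspective for things floating in the room) instead of forcing a single global camera model.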


I don't think GP was advocating for actually square windows. Rather that the corners should be right angles.

This makes perfect sense considering that most LCD displays, and practically all computer displays, don't have rounded corners. This trend of rounding displays and GUI elements is purely an aesthetic choice. I also find this obnoxious since the only thing it does is rob me of a few pixels which are often useful.

But considering Apple users have accepted living without a large block of pixels dead center at the top of the screen, which they've been sold as a "feature", the rounded corners are likely even less of an issue.

I'm not sure that an infinite plane of pixels makes sense even in XR. I want to see a clear edge where digital content begins and ends, and a rectangle is the simplest and most optimal shape for that. So I would rather have physical-display-like floating rectangles than floating text in arbitrary locations or rounded-off corners for the sake of aesthetics. I'm not opposed to a very slight rounding of edges on certain elements, but the trend Apple is pushing is supremely ridiculous.


Yeah, I don't think we disagree. I just think your preference for windows, tiles, etc. (anything rectangular and opaque) is rooted in an idealistic pixel efficiency (or an irrational fear of deception?) that is just as unergonomic and frustrating to everyone else.

I'm saying that there is room for your arbitrary preference for opaque rectangles if we all abandon the notion of a "screen". We are well past the point where we can do this economically. It only persists because of consumer acceptance. Traditional screens are less efficient in every tangible way. They are less power efficient for their apparent brightness and require more material to construct.

Even the notion of clear boundaries and pixel size is an illusion. Traditional screens only make the pixels so big because they require sufficient brightness and power to be seen at that distance, not because we cannot manufacture smaller pixels more cheaply. We could have much better results for everyone, and the only remaining cost/problem is finding a way to comfortably wear the display.


This is untrue. What is being diminished is the value of humans doing repetitive or uncreative tasks.

Many have built their careers on that kind of work in the past, and yes, they are threatened, but that kind of work is inherently uncollaborative and more vocational.


The vast majority of people on this planet work repetitive, uncreative jobs.

Oh? And what extensive knowledge and experience makes YOU qualified to determine what "the vast majority of people on this planet" are doing for work and if those tasks are creative or uncreative?

Not sure what you're insinuating. What do you think is the statistically average job on this planet? It's still going to be cultivating a smallholder farm in developing countries, or working in logistics, manufacturing, or the broader service sector in developed countries.

All of these average jobs are structurally repetitive. Yes, humans do constantly inject creativity, but it's a means to an end, to getting the job done.

You apparently mistook my descriptive comment for a value judgment, but it isn't.


There is no such job done by humans today that is 100% uncreative, but people will continue to insist there is.

The devaluing may come from AI pressure, but the harm is coming from humans foolishly not seeing the value in what's left behind. Most people have not and will not lose their jobs.


Most consumers are primarily on mobile devices.

Windows persists in the workplace where the cost to replace it is significantly higher than keeping it, and keeping it doesn't cost much to begin with. Part of that cost would be training, yes.

The other part is finding compliant equivalents for the rest of the software they use. If the MFA, VPN, chat, email, etc. are all already vetted and designed to be compatible, there's no way they'd want to switch. Many policies regarding proprietary information disclosure are also built off this ecosystem and the certifications Microsoft's cloud already has.


> A vast majority of US states and territories also automatically register men for selective service when driver's licenses are issued.

I think a lot of people either (understandably) forgot or didn't pay attention to the details when they got their driver's license, or we have a bot problem spreading toxic propaganda.

The majority of men are already registered. I agree this is not controversial.


Usually things are controversial in context.

It's bizarre when people act like the preceding day, week, month, etc. has no bearing on the weight of activities. Same with who makes the decisions and their biases.

If you want to present the history of why now, go ahead, but acting like a gollygeewhiz LLM about it is bizarre and incurious.

