thunder-blue-3's comments

Mexico has so many greater problems to discuss than a few people learning the ancient tongue. Two days ago a beauty influencer was shot dead on a live stream, and female (and male) mayors have been gunned down regularly. I couldn't care less about what they're speaking over there; I hope they secure basic human rights and give their citizens dignity first.

What do you expect the average Mexican to do about that? The Cartels have substantially more power than the state.

I think it's great that they're reclaiming some power by relearning their ancient languages that were nearly destroyed by their colonizers


> The Cartels have substantially more power than the state.

This is a common misconception. The state can absolutely dominate any cartel in Mexico, they just choose not to for political reasons.

> relearning their ancient languages that were nearly destroyed by their colonizers

Nahuatl is actually a colonizer language. The Aztecs brutally subjugated other native peoples, so brutally in fact that those groups were extremely eager to ally with the Spanish to overthrow the Aztec empire.


Mexicans could start with liberalizing their gun laws since all the bad guys already have them. Zapatistas and other local resistance groups aren't afraid to fight them when they have weapons, and some of the communities that actually have gotten their hands on guns have managed to make it more trouble than it's worth for the cartels.


Well, the cartels have more power as you get away from the Valley of Mexico. Much of the power distribution in Mexico is related to geography afaik. When terrain is difficult to cross, enforcing a monopoly on power is difficult. For another example of this problem, see Afghanistan.

What kind of power are you speaking of? "Cultural power" or something? Does it mean much in practice in this context? I fail to see what reclaiming it would achieve in the fight against the cartels.

Human beings are highly irrational. Increases in cultural power often give them a sense of greater empowerment that leads them to seek increases in political power. That's why dictatorships seek to suppress and control cultural practices that could lead to empowerment, such as martial arts, religion, meditation, language, art, gender nonconformance, etc.

I don't disagree, and I can see how that would play out, but does it have any immediate effect? Isn't it more of a long-term thing?

Human rights and dignity are not something to be given (by whom?) but to be fought for. And the most important weapon in that fight is building a community. Discovering your identity and making a connection to the community you live in is a big part of that, and learning your ancestral language is a way to make that connection.

> Mexico has so many greater problems to discuss than a few people learning the ancient tongue. 2 days ago a beauty influencer was shot dead on a live stream

Yucatan ain't Jalisco. That's like saying Alaska shouldn't support indigenous Alaskan languages because there is racial animus or police brutality in Mississippi.

Mexico is a federal state like the US; that's why it's the Estados Unidos Mexicanos, the United Mexican States.


People without a past have no future. Connecting to your ancient traditions is a form of empowerment. Look at what happened to Jews with Hebrew. It helped rebuild a united identity and contributed to Palestine's decolonization effort. I hope the people of the Americas will do the same and free themselves of the Spanish colonizers.

Most genetic studies show the average Mexican has roughly equal parts Native American and European ancestry, with about 5% African ancestry. 99% of Mexicans speak Spanish and 94% speak only Spanish.

I’d love to know what Spanish decolonization in such a place looks like.

There is no objectively correct demography, language, or culture for a given location. You have to pick a point in time to go back to, and there is no way to do that that isn't arbitrary.


Mexicans are indeed a new people drawn from both native and European stock and a fusion of those cultures. There is a notion of Mexicans (or even Latinos) as la raza cósmica, which is deeply connected with Our Lady of Guadalupe, regarded as "the first mestiza". This mestizo identity is core to Mexican identity. It isn't colonial even if colonialism served as a vector and a catalyst for it.

The idea of "going back" to some kind of pre-Spanish Mexico is nonsensical, and it would entail the very negation of Mexican identity and the invention of a fictional identity. Such "decolonization" movements are ahistorical. And frankly, I doubt most Mexicans would want a "return", whatever that even means.

Of course, this is different from learning Náhuatl. And it's worth noting that the Jesuits worked to preserve the native languages of the New World. You see this with Náhuatl. You see this in Paraguay where the Jesuits immediately began codifying and preserving Guarani in their missions, and where it is still widely spoken today.


I couldn't have written a better response than this, absolutely fantastic.

I was not suggesting "going back" to some sort of medieval past. Aboriginal languages and cultures do exist and they are oppressed. They are not fictional. Oddly, your arguments sound like Putin's points on Ukraine and "fictional" Ukrainian culture. 40 years ago, they all spoke Russian, and the moment they tried to unite around an indigenous culture more deeply connected to them, they got attacked by a colonial power.

40 years ago, about 2/3 of Ukrainians were native speakers of Ukrainian and primarily spoke Ukrainian at home (close to the same share as today).

A little over 1% of Mexicans speak Náhuatl (the most common indigenous language).

There is no comparison here.

If by decolonize you just mean stop oppressing minority cultures and languages then that sounds great. But decolonization is the wrong word for that.


I wonder when the Perimeter Institute will begin to get more name recognition. Some of the top PhD graduates from US R1 universities (and now assistant professors) have gone through there and have done phenomenally in their careers.


Being small and relatively unknown has advantages. Sean Carroll touched on this:

>But what I'm trying to get across is there are a bunch of structural reasons why physics departments tend to be conservative, and the conservative in the sense that they're gonna hire people who are working in the areas that are sort of the sure things rather than the gambles, and the same thing goes for funding agencies and prize committees and so forth, academia in general, not just physics departments, there's a lot of structural reasons why things are conservative, and I do think that's a problem, you even see it in institutions like the Perimeter Institute, which is one of the world's greatest physics institutes right now, but when it started out, it was much quirkier, Lee Smolin was there, and Fotini Markopoulou and a bunch of people, and they were doing loop quantum gravity and weird approaches to the foundations of quantum mechanics.

>4:10:24.8 SC: And as it grew and became more respectable, they turned into one of the world's great physics institutions, as I said. But they also became much more just mainstream and ordinary. It's a part of the life cycle of a Physics Department or Institute. You have a plucky band of rebels and they kind of equilibrate and they become more normal and traditional, and you can't blame them, can't blame that particular institute, 'cause they're just trying to be a good Physics Institute, and their little part that they play turns out overall, to make it harder and harder for small idiosyncratic research programs to flourish, there are people who have tenure or senior people and they can work on their own quirky little ideas.

https://www.preposterousuniverse.com/podcast/2023/07/31/245-...


Reputation and the dynamics of social media are tricky. Within the scientific community, it is pretty well known, as far as my experience goes.

When it comes to the general public, it requires some work that will get a lot of attention. See, for instance, DeepSeek. Some of my "normie" friends are even aware of the company despite not being into ML.

Maybe something around the foundational work on the formulation of quantum mechanics or quantum information theory.


wow, TIL. Can't believe I've done nearly 40 circles around our sun and never knew about WSL/2.


To be fair, WSL/2 didn't exist for most of those circles.


Speaking from personal experience, many director-level and above positions at Intel, especially in growth-related areas, are filled through nepotism and professional connections. I've never seen a headline about Intel's decline and thought, 'Wow, how could that happen?'


I had a business partner I agreed with on a lot of things, but not about Intel. My assumption was that any small software package from Intel, such as a graph processing toolkit, was trash. He thought they could do no wrong.

Intel really is good at certain kinds of software, like compilers or MKL, but my belief is that organizations like that have a belief in their "number oneness" that gets in the way of doing anything that is outside what they're good at. Maybe it is the people, processes, organization, values, etc. that get in the way. Or maybe not having the flexibility to know that what works for task A is not good for task B.


I always saw Intel as a HW company making terribly bad SW. Anywhere I saw Intel SW, I would run away. Lately I used a big open-source library from them, which is standard in the embedded space. It works great, but if you look at the code you will be puking for a week.


In my experience Intel's WiFi and Bluetooth drivers on Linux are, by far, the best. They're reliably available on the latest kernel and they actually work. After having used other brands on Linux, I have no intention of getting non-intel WiFi or Bluetooth any time soon. The one time that I found a bug, emailing them about it got me in direct contact with the developers of the driver.

I had a different, non-Intel WiFi card before where the driver literally permanently fried all occupied PCIe slots -- they never worked again, and the problem happened right after installing the driver. I don't know how a driver could cause that, but it looks like it did.


Yes, their open source drivers had a painful birth, but they are good once they're sanded and sharpened by the community.

However, they somehow managed to bork the e1000e driver in a way that certain older cards sometimes fail to initialize and require a reboot. I have been bitten by the bug, and the problem was later fixed by reverting the problematic patch in Debian.

I don't know the current state of the driver since I passed the system on. Besides a couple of bad patches in their VGA drivers, their cards are reliable and work well.

From my experience, their open source driver quality does not depend on the process, but on specific people and their knowledge and love for what they do.

I don't like the aggressive Intel which undercuts everyone with shady tactics, but I don't want them to wither and die, either. It seems like their process, frequency and performance "tricks" are biting them now.


Interesting. Does Bluez fall under that umbrella?

I have found BlueZ to be by far the hardest stack to use for Bluetooth Low Energy peripherals. I have used iOS's stack, suffered the evolution of the Android stack, used the ACI (ST's layer), and finally done just straight Python to the HCI on a Pi. BlueZ is hands down my least favorite.


that's only because their hardware is extremely simple.

So the drivers have little to screw up, but they still manage to! For example, the PCI cards are all broken, when it's literally the same hardware as the USB ones.


The team working on their Realsense depth cameras was doing great work on the SDK, in my opinion.

Frequent releases, GitHub repo with good enough user interaction, examples, bug fixing and feedback.


    > such as a graph processing toolkit
This is oddly specific. Can you share the exact Intel software toolkit?

    > "number oneness"
Why does this not affect NVidia, Amazon, Apple, or TSMC?


The affliction he’s imputing is born of absolute dominance over decades. Apple has never had the same level of dominance, and NVidia has only had it for two or three years.

It could possibly come to haunt NVidia or TSMC in decades to come.


A friend who developed a game engine from scratch and is familiar with the inner workings and behavior of the NVIDIA driver calls it an absolute circus of a driver.

Also, their latest consumer card launches are less than stellar, and the tricks they use to pump up performance numbers are borderline fraud.

As Gamers Nexus puts it "Fake prices for fake frames".


My response is somewhat tangential: When I look at GPUs strictly from the perspective of gaming performance, the last few generations have been so underwhelming. I am not a gamer, but games basically look life-like at this point. What kind of improvements are gamers expecting going forward? Seriously, a mid-level GPU has life-like raytracing at 4K/60HZ. What else do you need for gaming? (Please don't read this as looking down upon gaming; I am only questioning what else gamers need from their GPUs.)

To me, the situation is similar with monitors. After we got the pixel density of 4K at 27 inches with a 60Hz refresh rate (enough pixels, enough inches, enough refresh rate), how can it get any better for normies? Ok, maybe we can add HDR, but monitors are mostly finished, similar to mobile phones. Ah, one last one: I guess we can upgrade to OLED when the prices are not so scandalous. Still, the corporate normies, who account for the lion's share of people sitting in front of 1990s-style desktop PCs with a monitor, are fine with 4K at 27 inches with a 60Hz refresh rate forever.


I can't answer the first part, since I'm not playing any modern games, but continually revisit RTS games like the C&C and StarCraft series.

However, I can talk about monitors. Yes, a 27" 4K@60 monitor is really, really good, but panel quality (lighting, uniformity and color correctness) goes a long way. After using Dell and HP "business" monitors for so long, most "normal monitors for normies" look bad to me: uncomfortable, with harsh light and bad uniformity.

So, monitor quality is not "finished" yet. I don't like OLEDs on big screens, because I tend to use what I buy for a very long time, and I don't want my screen to age non-uniformly, especially if I'm looking at it every day and for long periods of time.


Is OLED burn-in still a thing? If yes, then you are probably right: normies will not upgrade to OLED until that issue is fixed, or a new technology replaces it.


See, the funny thing is, even with all of this stuff about Intel that I hear about (and agree with as reported), I also committed a cardinal sin just recently.

I'm old, i.e. "never buy ATI" is something that I've stuck to since the very early Nvidia days. I.e. I switched from Matrox and Voodoo to Nvidia while commiserating with and witnessing friends' and colleagues' ATI woes for years.

The high-end gaming days are long gone; there was even a stretch of laptops where 3D graphics was of no concern whatsoever. I happened to have Intel chips and integrated graphics. I could even start up some gaming I missed out on over the years, or replay old favourites just fine, as even a business laptop's Intel integrated graphics chip was fine for it.

And then I bought an AMD-based laptop with integrated Radeon graphics, because of all that negative stuff you hear about Intel, and because AMD itself is fine, sometimes even better, so I thought it was fair to give it a try.

Oh my, was that a mistake. AMD Radeon graphics is still the old ATI in full-blown problem glory. I guess it's going to be another 25 years until I might make that mistake again.


It's a bummer you've had poor experiences with ATI and later AMD, especially on a new system. I have an AMD laptop with a Ryzen 7 7840U, which includes a Radeon 780M for integrated graphics, and it's been rock solid. I tested many old and new titles on it, albeit at medium-ish settings.

What kind of problems did you see on your laptop?


I built a PC with a top-of-the-line AMD CPU, and it's great. AMD APUs are great in dedicated gaming devices like the Xbox One, PS4 and PS5, and the Steam Deck.

On the other hand, I still think of Intel integrated GPUs as "that thing that screws up your web browser chrome if you have a laptop with dedicated graphics".


Not tharkun__:

AMD basically stopped supporting (including updating drivers for) GPUs before RDNA (in particular GCN), while such GPUs were still part of AMD's Zen 3 APU offerings.


Well, back when, literally 25 years ago, when it was all ATI, there were constant driver issues with ATI. I think it's a pretty well-known thing. At least it was back then.

I did think that, given ATI was bought out by AMD and AMD itself is fine, it should be OK. AMD always was. I've had systems with AMD CPUs and Nvidia GPUs back when it was an actual desktop tower gaming system I was building/upgrading myself. Heck, my basement server is still an AMD CPU system with zero issues whatsoever. Of course, it's got zero graphics duties.

On the laptop side, for a time I'd buy something with discrete Nvidia cards when I was still gaming more actively. But then life happened, so graphics was no longer important and I do keep my systems for a long time / buy non-latest gen. So by chance I've been with Intel for a long time and gaming came up again, casually. The Intel HD graphics were of course totally inadequate for any "real" current gaming. But I found that replaying some old favs and even "newer" games I had missed out on (new as in, playing a 2013 game for the very first time in 2023 type thing) was totally fine on an Intel iGPU.

So when I got to newer titles and the Intel HD graphics no longer cut it, but I'm still not a "gamer" again, I looked at a more recent system and thought I'd be totally fine trying an AMD system. Exactly like another poster said: "post-2015 should be fine, right?! And then there's all this recent bad news about Intel, this is the time to switch!"

Still iGPU. I'm not going to shell out thousands of dollars here.

And then I get the system and I get into Windows and ... everything just looks way too bright, washed out, hard to look at. I doctored around, installed the latest AMD Adrenalin driver, played around with brightness, contrast, HDR, color balance, tried to disable the Vari-Brightness I read was supposed to be the culprit, etc. It does get worse once you get into a game. Like you're in Windows and it's bearable. Then you start a game and you might Alt-Tab back to do something and everything is just awfully, weirdly bright, and it doesn't go away when you shut down the game either.

I stuck with it and kept doctoring for over 6 months now.

I've had enough. I bought a new laptop, two generations behind, with an Intel Iris Xe, for the same amount of money as the ATI system. I open Windows and ... everything is entirely, totally, 150% fine, no need to adjust anything. It's comfortable, colors are fine, brightness and contrast are fine. And the performance is entirely adequate, about the same as with the AMD system. Again, still iGPU, and that's fine and expected. It's the quality I'm concerned with, not the performance I'm paying for. I expect to be able to get proper quality software and hardware even if I pay for less performance than gamer-kid me back then was willing to.


> And then I get the system and I get into Windows and ... everything just looks way too bright, washed out, hard to look at.

I've seen OEMs do that to an Intel+NVIDIA laptop, too. Whatever you imagine AMD's software incompetence to be, PC OEMs are worse.


It's Lenovo. FWIW, one thing I really didn't like much either was that I found out that AMD really tries to hide what actual GPU is in there.

Everything just reports it as "with Radeon graphics", including benchmarking software, so it's almost impossible to find anything about it online.

The only thing I found helped was GPU-Z. Maybe it's just one of the known bad ones and everything else is fine and "I bought the one lemon from a prime steak company" but that doesn't change that my first experience with the lemon company turned prime steak company is ... another lemon ;)

It's a Lucienne C2 apparently. And again, performance-wise, exactly as I expected. Graphics quality and AMD software? Unfortunately exactly what I expected from ATI :(

And I'm not alone when I look online and what you find online is not just all Lenovo. So I do doubt it's that. All and I mean all my laptops I'm talking about here were Lenovos. Including when they were called IBM ThinkPads and just built by Lenovo ;)


Laptops have really gone to hell in the past few years. IMO the only sane laptop choices remaining are Framework and Apple. Every other vendor is a mess, especially when it comes to properly sleeping when closing the lid.


I bought an AMD Ryzen Thinkpad late last year, and I had the same issue with bright/saturated colours. I fixed it by running X-Rite Color Assistant which was bundled with the laptop, and setting the profile to sRGB. I then turned up the brightness a little.

I think this is a consequence of the laptop having HDR colour, and the vendor wanting to make it obvious. It's the blinding blue LED of the current day.


Yeah, I read HDR might be the issue. I didn't know about X-Rite, and it did not come with the laptop, but I did play with disabling / trying to adjust HDR, making sure sRGB was set, etc. It did not help. I also ran all the calibrations I could find for gamma, brightness and contrast many, many times to try and find something that was better.

What I settled on for quite some time was manually adjusted color balance and contrast and turning the brightness down. That made it bearable but especially right next to another system, it's just "off" and still washed out.

If this was HDR and one can't get rid of it, then yeah agreed, it's just bad. I'm actually surprised you'd turn the brightness up. That was one of the worst things to do, to have the brightness too high. Felt like it was burning my eyes.


So long story short...

You don't like current AMD systems because one of them had an HDR screen? Nothing to do with CPU/GPU/APU?


If the diagnosis is that AMD GPUs can't do HDR properly then yes. There was not a single setting anywhere in Windows itself nor the Adrenalin driver software that allowed me to configure the screen to a comfortable setting. Even when specifically trying to disable anything HDR related.

My work Macbook on the other hand has zero issues with HDR and its display.

To be fair, you can still blame the OEM of course but as a user I have no way to distinguish that, especially in my specific situation.


I think I found X-Rite by just searching for color with the start menu.

Before I used that tool, I tried a few of the built-in colour profiles under the display settings, and that didn't help.

I had to turn the brightness up because when the display is in sRGB it gets dimmer. Everything is much more dim and muted, like a conventional laptop screen. But if I change it back to say, one of the DICOM profiles, then yeah, torch mode. (And if I turn the brightness down in that mode, bright colours are fine but dim colours are too dim and everything is still too saturated).


Did you time travel from 2015 or something? Haven't heard of anyone having AMD issues in a very long time...


I’ve been consistently impressed with AMD for a while now. They’re constantly undervalued for no reason other than CUDA from what I can tell.


AMD is appropriately valued IMO, Intel is undervalued and Nvidia is wildly overvalued. We're hitting a wall with LLMs; Nvidia was at one point valued higher than Apple, which is insane.

Also CUDA doesn't matter that much, Nvidia was powered by intense AGI FOMO but I think that frenzy is more or less done.


What?!

Nvidia is valuable precisely because of the software, which is also why AMD is not so valuable. CUDA matters a lot (though that might become less true soon). And Nvidia's CUDA/software forward thinking most certainly predated AGI FOMO, and it is the CAUSE of them doing so well with this "AI boom".

It's also not wildly overvalued, purely on a forward PE basis.*

I do wonder about the LLM focus, specifically whether we're designing hardware too much for LLM at the cost of other ML/scientific computing workflows, especially the focus on low precision ops.

But.. 1) I don't know how a company like Nvidia could feasibly not focus on designing for LLM in the midst of this craziness and not be sued by shareholders for negligence or something 2) they're able to roll out new architectures with great improvements, especially in memory, on a 2 year cycle! I obviously don't know the counterfactual, but I think without the LLM craze, the hypothetical generation of GPU/compute chips would be behind where they are now.

I think it's possible AMD is undervalued. I've been hoping forever they'd somehow catch up on software. They do very well in the server business, and if Intel continues fucking up as much as they have been, AMD will own CPUs/servers. I also think what DeepSeek has done may convince people it's worth programming closer to the hardware, somewhat weakening Nvidia's software moat.

*Of course, it's possible I'm not discounting enough for the geopolitical risk.


> It's also not wildly overvalued, purely on a forward PE basis.*

Once you start approaching a critical mass of sales, it's very difficult to keep growing it. Nvidia is being valued as though they'll reach a trillion dollars worth of sales per year. So nearly 10x growth.

You need to make a lot of assumptions to explain how they'll reach that, versus a ton of risk.

Risk #1: the arbitrage principle, aka wherever there's profit to be made, other players will move in. AMD has AI chips that are doing quite well, Amazon and Google both have their own AI chips, Apple has their own AI chips... IMO it's far more likely that we'll see commodification of AI chips than that the whole industry will do nothing and pay Nvidia's markup. Especially since TSMC is the one making the chips, not Nvidia.

Risk #2: AI is hitting a wall. VCs claim it isn't so, but it's pretty obvious that it is. We went from "AGI in 2025" to AI companies essentially adding traditional AI elements to LLMs to make them useful. LLMs will never reach AGI; we need another technological breakthrough. Companies won't be willing to keep buying every generation of Nvidia chip for ever-diminishing returns.

Risk #3: Geopolitical, as you mentioned. Tariffs, China, etc...

Risk #4: CUDA isn't a moat. It was when no one else had the incentive to create an alternative, and it gave everyone on Nvidia a head start. But now everything runs on AMD too. Google and Amazon have obviously figured out something for their own accelerators.

The only way Nvidia reaches enough revenue to justify their market cap is if Jensen Huang's wild futuristic predictions become reality AND the Googles, Amazons, Apples, AMDs, Qualcomms, Mediateks and every other chip company all fail to catch up.

What I see right now is AI hitting a wall and the commodification of chip production.


Not really. I don't want to just re-paste everything, but basically this: https://news.ycombinator.com/item?id=43688088 where I also sort of address your 2015 mention here.


Ah, Windows OEM nonsense...

I've used Linux exclusively for 15 years so probably why my experience is so positive. Both Intel and AMD are pretty much flawless on Linux, drivers for both are in the kernel nowadays, AMD just wins slightly with their iGPUs.


Yet my AMD APU was never properly supported for hardware video decoding, and could only do up to OpenGL 3.3, while the Windows 10 driver could go up to OpenGL 4.1.


Weird. Was it pre-Zen?

I had a Ryzen 2700u that was fully supported, latest OpenGL and Vulkan from day 1, hardware decoding, etc... but on Linux.


Meanwhile PC gamers have no trouble using their AMD GPUs to play Windows games on Linux.


That's actually something I have not tried at all again yet.

Back in the day, w/ AMD CPU and Nvidia GPU, I was gaming on Linux a lot. ATI was basically unusable on Linux while Nvidia (not with the nouveau driver of course), if you looked past the whole kernel driver controversy with GPL hardliners, was excellent quality and performance. It just worked and it performed.

I was playing World of Warcraft back in the mid 2000s via Wine on Linux and the experience was actually better than in Windows. And other titles like say Counter Strike 1.5, 1.6 and Q3 of course.

I have not tried that in a long time. I did hear exactly what you're saying here. Then again I heard the same about AMD buying ATI and things being OK now. My other reply(ies) elaborate on what exactly the experience has been if you're interested.


Can’t say what your experience with your particular box will be, but the steam deck is absolutely fantastic.


I wish I had an AMD card. Instead our work laptops are X1 Extremes with discrete Nvidia cards, and they are absolutely infuriating. The external outputs are all routed through the Nvidia card, so one frequently ends up with the fan blowing at full blast when plugged into a monitor. Moreover, when unplugging, the laptop often fails to shut down the discrete graphics card, so suddenly the battery is empty (because the discrete card uses twice the power). The Intel card, on the other hand, seems to prevent S3 sleep when on battery, i.e. the laptop starts sleeping and immediately wakes up again (I chased it down to the Intel driver but couldn't get further).

And I'm not even talking about the hassle of the nvidia drivers on Linux (which admittedly has become quite a bit better).

All that just for some negligible graphics power that I'm never using on the laptop.


That’s not specific to Intel though. That’s how Directors and above are recruited in any big company.

For example, Uber hired a VP from Amazon. And the first thing he did was to hire most of his immediate reports at Amazon to Director/Senior Director positions at Uber.

At that level of management work gets done mostly through connections, favors and networking.


I tell people that if they get a new boss who is at Director or above, assume that you are re-interviewing for your job for the first 6 months with the new boss.


Major companies like that become infected with large hierarchies of scum-sucking middle management that eat revenue with bonuses.

Of course they are obsessed with shrinking labor costs and resisting all downsizing until it reaches comical levels.

Take a company like health insurance that can't show a large dividend because it would be a public relations disaster. Filled to the gills with vice presidents to suck up extra earnings. Or medical devices.

Software is also very difficult for these hierarchies of overpaid management, because you need to pay labor well to get good software, and the only raison d'etre of these guys is wage suppression.

Leadership is hard for these managers because the primary thing rewarded is middle management machiavellianism, turf wars, and domain building, and any visionary leadership or inspiration is quashed.

It almost fascinates me that large company organizations basically are like Soviet-style communism, even though there are opportunities for internal competition, like data centers and hosting and IT groups. They always need to be centralized for "efficiency".

Meanwhile, they have like 20 data centers, and if you had each of them compete for the company's internal business, they'd all run more efficiently.


  > It almost fascinates me that large company organizations basically are like Soviet style communism, Even though there are opportunities for internal competition. 
probably because continuous competition is inefficient within an organization and can cause division/animosity between teams?


"within an organization and can cause division/animosity between teams"

Are you aware of what goes on in middle management? This is the normal state of affairs between managers.

If what you are saying is true, then .......

Why is there competition in the open marketplace? You have just validated my suggestion that internally companies operate like communists.


  > Why is there competition in the open marketplace? You have just validated my suggestion that internally companies operate like communists.
I am not an expert, but I think the theory of competition leading to better outcomes in a marketplace is the availability of alternatives if one company goes bad (in addition to price competition, etc.).

Inside a company you are working for the same goal "against" the outside, so it's probably more an artifact of how our economy is oriented.

I'd guess if our economy were oriented around cooperation instead of "competition" (while keeping alternatives around), that dichotomy might go away...

Just some random thoughts from an internet person.


A sufficiently large corporation, and to emphasize that we are in the age of monopoly and cartel with very little for these companies to fear from antitrust litigation, will have numerous opportunities for internal competition.

Back when I worked at a certain health insurance conglomerate, it was maddeningly hard to get servers and support, with ridiculous internal charge rates, "three months for a server", etc. etc.

This was a company that easily had 10-20 datacenters, and likely support groups from a dozen acquisitions. Yes, internal competition would have greatly improved things. The proof has likely played out everywhere: AWS came for these internal groups' lunches with far superior service and often lower rates. Those were the days when AWS was the hero, unlike today.

I too think humanity needs a mechanism to harness/encourage/reward altruism. The institution that used to provide this ... roughly ... was religion and local churches. I SAID ROUGHLY! I am no theist.

Capitalism rewards sociopathy, and successive generations of social engineering under maximalized capitalism (such as we see the march toward these days) will beat down altruism.

Or at least I used to think it would. Now it is apparent that maximalized consumer capitalism leads to collapsing birth rates and social withdrawal, which will cause humanity to collapse. It's a race between that and how fast we can wreck the environment.


The only time I use search engines now is when I’m screen sharing and feel obligated not to show my five different ChatGPT tabs. I glance over the links and feign interest, "Oh, that’s great..."


It's been about 15 years since I've worked with RoR, but my favorite aspect of Ruby was and will always be the library names. Shout out to factory_girl, which I found out this morning was unfortunately renamed to factory_bot.


chronic and cocaine


Thanks for the downvotes! I named two ruby gems (made by the founders of GitHub btw) because they were always funny to me. Please downvote me some more!


I've been following this news for the past couple of weeks-- in essence your statement is what they are hypothesizing, and that the "something at the center that unites the two poles" might be that we are within a black hole. https://en.wikipedia.org/wiki/Black_hole_cosmology for the curious.


It was my understanding that if two black holes collide, they just form a bigger black hole, but we know there's a black hole in our universe, which then would mean that there's a black hole inside of a black hole that did not merge with the parent black hole, right? Is that something that is considered possible?


I'm under the impression that we really have no clue what's going on inside of black holes, so the most we can really say with confidence is that when two black holes collide they appear from the outside to now be a single black hole.


It’s a reasonable assumption. If two solar masses collide, their masses tend to combine[^]. Just “look” at planets that smash into each other. Ergo, a more massive black hole.

[^]: Ignoring ejections. But black holes also don’t “eject” mass. Or maybe they do? Hawking Radiation is weird.


Actually, they can shoot out relativistic jets at the poles. https://www.nustar.caltech.edu/page/relativistic-jets


But there, the black hole is ejecting other mass near it, not its own.


Mm, I didn't correctly interpret their comment. You're absolutely correct.


Yeah. I could've worded it better. By "ejections" I meant how, when two planets/moon sized masses collide, rocks shoot out into space. But because black holes have so much gravitational pull, everything theoretically just falls in.


When two black holes collide, gravitational radiation shoots out into space. The origin of the radiation is in the dynamical spacetime outside each black hole's horizon, however. This is what the gravitational wave detectors operated by LIGO, Virgo, KAGRA, and others look for.

Similarly, the dynamical spacetime around a black hole not near any other black hole can couple with quantum fields -- even fields in a no-particle "vacuum" state as measured by an observer, for example one in orbit around the black hole -- with the result that Hawking radiation is produced.

Both gravitational radiation and Hawking radiation carry away energy (in the sense of ability to do work, per the "sticky bead" argument) from the environment immediately around a black hole. This in turn means that the horizon radius will be less than it could be.

So as a Hawking-radiating isolated black hole will tend to shrink (if it's not fed by hotter cosmic microwave background radiation, for example), the mass of a post-merger binary black hole will be less than the sum of the unmerged binary.

Just because things can't cross from the inside of a black hole horizon to the outside doesn't mean the horizon is always the same -- the horizon can grow and shrink dynamically when interacting with other self-gravitating bodies, with matter like dust or starlight, or with "the quantum vacuum".
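To put a rough number on the "fed by hotter cosmic microwave background" point (a back-of-the-envelope aside I'm adding, not something from the comment above): the Hawking temperature of a Schwarzschild black hole scales inversely with its mass,

    T_H = \frac{\hbar c^3}{8 \pi G M k_B} \approx 6 \times 10^{-8}\,\mathrm{K} \cdot \frac{M_\odot}{M}

so any stellar-mass or larger black hole today is far colder than the ~2.7 K CMB and absorbs more energy than it radiates; net Hawking shrinkage only takes over once the background has cooled below T_H.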


The inner black hole did not come from the outside; it formed inside, and if I had to guess, it is stuck on the inside together with all the other matter, unable to interact with the outside of the outer black hole.


Just thinking about this infinite recursion gives me the mental equivalent of a stack overflow.


I don't think it is infinite - each universe can only have that mass/energy that fell into the outer black hole in the parent universe. At some level you'll inevitably have black holes with universes that do not have enough mass to form another inner black hole.


Unless, although there's no reason to currently believe this, the energy requirements for physics are relative within each black hole, sort of (but not strictly) like how the speed of light is relative for all observers. And we can get a little crazier, and imagine a meta universe that is sort of like a Klein bottle in that it doesn't just recurse all the way down but somehow folds back into itself. Again, no current reason to believe anything like this but it's a mind-boggling to visualize.


How much mass is required to form a black hole in a new universe with perhaps different physical constants? It could be that 'ability to make black holes' is a prerequisite for successful universes in the way that good genes are a prerequisite for successful organisms. The universes that fail to spawn black holes are 'dead ends', so any life is statistically likely to find itself in a black-hole-spawning universe.

Maybe there is an 'incentive' for universes to form with physical constants tuned to produce black holes with the available energy in that universe.


This circle of ideas seems to be known as: https://en.wikipedia.org/wiki/Cosmological_natural_selection


The trick is that bigger black holes are less dense. Supermassive black holes can have the density of water. If the universe is gravitationally closed, it would have the density of... well, just look up at night. (Actually much less than that; you see more stars because you're inside a galaxy.)

The density makes the scale recursion less mysterious.
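A quick sketch of why (my own back-of-the-envelope, using the average density inside the Schwarzschild radius rather than any claim about the interior): the horizon radius grows linearly with mass, so the mean density falls as the square of the mass,

    r_s = \frac{2GM}{c^2}, \qquad \bar\rho = \frac{M}{\tfrac{4}{3}\pi r_s^3} = \frac{3 c^6}{32 \pi G^3 M^2} \propto \frac{1}{M^2}

Plugging in numbers, a black hole of roughly 1.4 x 10^8 solar masses already has a mean density of about 1000 kg/m^3, i.e. water.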


That’s interesting! When you are referring to density, are you referring to average density within the event horizon? Isn’t most (effectively all) matter concentrated in the singularity? Would love to hear you elaborate on this thought further.


We can't really talk about what's inside a black hole. From outside, it has a volume and a mass, and that's all there is to know.

We can say that any particle inside the horizon is inevitably headed to the center. (That's why we can't say any more: no other information can escape.) That does lead to a problem in that all of the mass would be concentrated at a single point at the center, whose density is division-by-zero.

But I wouldn't put too much weight on that. We already know from quantum mechanics that there isn't really any such thing as a "point". The math is still a problem, but the solution almost certainly lies in that direction.


Maybe generating a stack overflow was the true depiction of God!


“The universe is an orb and that orb is rotating causing all the other stuff to spiral.” This was a long held theory of mine because I could not understand why a galaxy would spiral.

I think there is a Men in Black scene where an alien is spinning the universe globe like a toy they are playing with.


> This was a long held theory of mine because I could not understand why a galaxy would spiral.

I think in general it would be unusual if they didn’t rotate. Any large non-uniform mass of gas or rocks when colliding will induce some rotation. What is odd though is that for galaxies we see more of them spinning one way than another.


Ok but what is making the universe spin? This kind of theory is turtles all the way down.


Is this getting into questions like "Where did the singularity come from?" and "What came before the singularity?". We don't have a way to answer these kinds of questions.


My point is that it's not much helpful to say, "galaxies spin a certain way because the universe spins", because it shifts the problem without actually answering the "why". "Turtles all the way down" is a saying about such infinite regress. https://en.wikipedia.org/wiki/Turtles_all_the_way_down

And yes, I'm familiar with Dawkins' famous retort when someone asked how magnets repel things.


> And yes, I'm familiar with Dawkins' famous retort when someone asked how magnets repel things.

I'm not. I was unable to substantiate that anyone named Dawkins, Richard or otherwise, made or is popularly associated with a comment about magnets. What was the retort?


I deeply apologize, it was late and I mixed up my Richards. :)

https://youtube.com/watch?v=36GT2zI8lVA


Doesn’t it have to spiral? Think of the gravity well, anything not orbiting is just falling. The only things not racing towards the black hole at the center of the galaxy are the ones that are orbiting.


It can be directly sucked into the center. A spiral implies a lateral movement plus a centripetal force


Right, I guess what I am saying is if it didn't spiral, it wouldn't be a galaxy for "long." It would just be a super massive black hole.


The trick to having a galaxy is for mass to fall towards the black hole and miss it all the time.


From what I remember of undergrad physics, this isn't actually possible. According to GR, within an event horizon, space-like paths become "time-like", which effectively means the singularity is unavoidably "in the future". No matter how big a black hole is, you can't just drift around inside it, as literally all paths lead downward (hence even light not escaping).

If you were inside a black hole you wouldn't be able to see light from "deeper" because it wouldn't be able to travel towards you.

This is not what we see within the universe, so I don't think we can be inside a black hole
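For anyone who wants the formula behind that (a sketch from the standard Schwarzschild solution, not something spelled out above): outside the horizon the metric is

    ds^2 = -\left(1 - \frac{r_s}{r}\right) c^2 dt^2 + \left(1 - \frac{r_s}{r}\right)^{-1} dr^2 + r^2 d\Omega^2, \qquad r_s = \frac{2GM}{c^2}

and for r < r_s the factor (1 - r_s/r) flips sign, so the dt^2 and dr^2 terms trade roles: r becomes the time-like coordinate, and decreasing r is the future. That's the precise sense in which the singularity is "ahead of you in time" rather than somewhere you could steer away from.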


All paths _eventually_ lead downward. Is there any limit to how long? Can't we just be near the outside of the black hole and not see the doom yet?


There are no stable orbits inside the event horizon, and my understanding is that even things like atomic vibration can't move further from the singularity, so I'm guessing timespans would be limited!
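If I remember correctly (worth double-checking; this is a number I'm adding, not from the comment above), the longest proper time any infalling observer can experience between crossing a Schwarzschild horizon and hitting the singularity is

    \tau_{max} = \frac{\pi G M}{c^3} \approx 1.5 \times 10^{-5}\,\mathrm{s} \cdot \frac{M}{M_\odot}

so microseconds for a stellar-mass black hole, and only a few hours even for a billion-solar-mass one.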


Interesting, thanks. That does sound like you could tell from inside whether you were there or not.

I'll have to read up on that; I always had the vague sense that, on a roughly finite timescale, there existed a region of space where you couldn't really tell whether you're inside of a big enough black hole or not.

Which sounds like I'm probably just wrong.


It's the usual rabbit hole if you search for it, but there are some useful comments here: https://astronomy.stackexchange.com/questions/58610/would-it...

The way I see it is every surface inside an event horizon is another slightly "stronger" event horizon


Every time I see something pertaining to a pomodoro timer, I'm reminded of interviewing with a YC founder in downtown San Mateo in the early 2010s (he was working on websockets + slide deck/Prezi-like tech iirc), and half our interview consisted of him hyping up this technique. The company went under within a year, and I could never respect this technique afterwards.


The technique isn't what caused his company to fail. A big part of it was likely his obsession with the technique. He also probably wasted massive amounts of time setting up his other tools instead of using them.


honestly feeling this - I (or my friend who now has the device) don't really follow this religiously either. I think it's great when you have a lot of work to be done and feel a little overwhelmed. Getting down into this rhythm can help you chip away at it.


understandable, it's a very useful way to trick yourself into getting shit done but it's also very possible to turn "making the perfect work tracker-timer-app" into a giant rabbit hole instead of just getting a cheap, goofy-looking kitchen timer and keeping some notes on paper.


I too wonder about anomalous ionizations first thing in the mornings


it happens to a lot of us - I'm guessing we all had chilli for dinner last night!


I was once offered an engineering manager position at Iridium (which I discussed here: https://news.ycombinator.com/item?id=41748519) -- that entire company is a race to reduce the bottom line. They offered me (an engineering manager of 5 engineers) a lower salary than I was offered as a new grad. Also, their talent pipeline is quite stale; most of the engineers on my prospective team had been at the org for 10-20 years. For such an interesting aspect of technology, it's a shame they can't attract more talent; low earth orbit satellite networks are such an untapped market...


Iridium and other satellite companies also went bankrupt and their satellites were going to be de-orbited until the US Government bailed them out in the 2000s. They couldn’t get enough customers to support enough launches.

Terrestrial networks in the meantime have only gotten better and improved coverage. Not that many customers, relatively, need satellite comms.

Now SpaceX is eating their lunch.

I don’t think the market for satellite comms has ever been big enough for a pure-satellite company to get enough money to do something cool. SpaceX can afford the R&D because they are a little more diversified.


> They couldn’t get enough customers to support enough launches.

No surprise: at the prices Iridium and others commanded back then, the only use cases were SAR, a few military/secret-service-style use cases, and execs who deem themselves so important that they need to be reachable anywhere on the globe 24/7, even if they are just taking a flight over the Atlantic or on a cruise ship, and Iridium can't reasonably be used for much more than that.

> Now SpaceX is eating their lunch.

Partially due to physics. Latency on Starlink is reportedly low enough to run online games or telephony and the bandwidth high enough to allow for video streaming in the outback, which makes the potential market size muuuuch bigger so the price point can be lowered enough to be competitive with landline DSL of all things.

The problem is, SpaceX isn't something that the US government can rely on forever. For now, its leader is in good standing with the 47th, but that may change overnight (it has happened with either of these characters before and both have quite the large egos that will collide rather sooner than later). And what to do then?


> Now SpaceX is eating their lunch.

Fact check time! Iridium stock jumped 15% today, because their 4Q earnings vastly beat expectations. They earned $0.31 per share versus expectations of $0.16. Their revenue grew 9% year over year to $213 million.


Iridium, that is a name I've not heard in a long time.

IMHO, the worst places to be are organizations that were supposed to change the world, but didn't, and don't quite get it.

Your experience totally tracks with that.


They set up global satellite communications over 20 years ago. They did change the world.


This seems like it should be totally expected. Iridium's engineering efforts are largely in the past, they're purely in the revenue extraction mode at this point. Your job description is basically just "maintain obsolete legacy system just enough to make money."


Starlink ate Iridium's lunch. Any benefits Iridium was supposed to provide are currently achieved by Starlink.


Maybe specialty hardware? Are there handsets yet which can connect to starlink?


iPhone, most notably.


Much more than iPhone. From Tmobile's FAQ [0]:

Apple iPhone 14 and later (including Plus, Pro & Pro Max), Google Pixel 9 (including Pro, Pro Fold, & Pro XL), Motorola 2024 and later (including razr, razr+, edge and g series), Samsung Galaxy A14, A15, A16, A35, A53, A54, Samsung Galaxy S21 and later (including Plus, Ultra and Fan Edition), Samsung Galaxy X Cover6 Pro, Samsung Galaxy Z Flip3 and later, Samsung Galaxy Z Fold3 and later and REVVL 7 (including Pro)

[0] https://www.t-mobile.com/coverage/satellite-phone-service?ic...


Starlink Direct to Cell is not available yet


I have it on my Pixel 9 Pro XL right now and have had it since the end of January. It's worked well so far for me out in the country where Tmo typically has had dead spots, if a little slow.

https://i.imgur.com/wrl5KLf.png


That is just texting though, not voice or data.


Let's not move goalposts. It's still undeniably both "Starlink" and "Direct To Cell," so I would say that Starlink Direct To Cell is indeed available.


Moving the goalposts? The point I was refuting was "Any benefits Iridium was supposed to provide are currently achieved by Starlink", but Iridium offers services today which StarLink does not.


Is it 5G only or does LTE also work?


It's only for texting so it doesn't really matter. That said Iridium is so slow it's mostly only useful for texting type situations as well. Even the voice is so heavily compressed and laggy as to be mildly unpleasant to use.


Sadly, I expect them to be at the stage of no relevance: just enough that, as another commenter said, they can make some money, but the satellites have no business value.


Their value is the niche of being able to work at the poles, unlike any other constellation, despite being dialup speed.


But how can you translate that to dollars today?

