Hacker News new | past | comments | ask | show | jobs | submit | ManBeardPc's comments login

Bazel also allows you to pull dependencies from different sources via Bzlmod.

1. Define a dependency, e.g. bazel_dep(name = "protobuf", version = "3.19.0")
2. Define the repositories where to look for it

A repository is just a file structure like /modules/$MODULE/$VERSION plus some info about the module. It can be on an HTTP server or just on your local file system.
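A rough sketch of a registry's on-disk layout, following the index-registry format described in the Bzlmod docs (the module name/version are just the example above; there is also a bazel_registry.json at the registry root):

```
/modules/protobuf/
├── metadata.json        # list of known versions of the module
└── 3.19.0/
    ├── MODULE.bazel     # the module's own dependency declarations
    └── source.json      # where to fetch the actual sources from
```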

https://bazel.build/external/overview#bzlmod

I hope more tools adopt something similar. Or maybe a single package manager for everything, so we can finally build cross-language software without having to debug why tool A can't find stuff from B.


Bzlmod requires a registry of packages in a git repo, so it’s quite centralized.


From the documentation: https://bazel.build/external/registry

> Bzlmod discovers dependencies by requesting their information from Bazel registries: databases of Bazel modules. Currently, Bzlmod only supports index registries — local directories or static HTTP servers following a specific format.

It just uses the Bazel Central Registry (BCR) by default. You can specify your own via the --registry flag, and then it uses those instead. It is possible to specify multiple registries at the same time, so you can mix the official registry, a company-internal one, and one on your local filesystem.
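For example (the file:// path is made up; bcr.bazel.build is the default central registry):

```
# Look in a local registry first, then fall back to the BCR.
bazel build //... \
  --registry=file:///home/me/my-registry \
  --registry=https://bcr.bazel.build
```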


> There can be no new oil and gas infrastructure if the planet is to avoid careering past 1.5C (2.7F)

The 1.5C goal is already practically impossible. We have not managed to slow down the increase in atmospheric CO2 at all, not even a dent in the grand scheme of things. It is so unrealistic that I wonder how anyone can seriously expect that goal to be met. How should it work? We continue like always and just shut everything down at the last moment?

PDF, Page 21, Figure 18: The Mauna Loa CO2 record: https://www.thegwpf.org/content/uploads/2024/03/Humlum-State...


There is already a variant where they try to get someone to say "yes" and then use a recording of it as "proof" that you agreed to some contract.


I actually don’t answer unknown callers with “hello” or any words at all. I simply say “mmmhhmm” or make a dumb sound; if it is automated, that will trigger the automatic message. Someone asked why, and I said voice cloning software. They said “wtf, you have nothing to steal.” It just feels risky, idk why.



OMG that explains so much. I kept getting these calls where they would ask "Am I speaking with the head of the household?"...crap


Good thing for Google call screening ...


Phone providers have been doing this one in Italy for over a decade.


Depends on what you want to scale. Memory, CPU and disk scale well. High concurrency can become a problem due to the one-process-per-connection architecture. Keeping latency low between your app and DB server, and understanding how and where locks are used, helps. So far I have used it, and seen it used successfully at pretty big companies, in a single-instance (+ standby) setup.


> High concurrency can become a problem due to the 1 process per connection architecture.

If I understand correctly what you mean, then this is no longer a problem. You will simply need to use a connection pool, such as Odyssey or PgBouncer. Even SPQR has its own pool of connections for each shard.
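For reference, a minimal PgBouncer setup sketch (host/db names are placeholders and the values are illustrative; pool_mode, max_client_conn and default_pool_size are real settings):

```
[databases]
mydb = host=127.0.0.1 port=5432 dbname=mydb

[pgbouncer]
listen_port = 6432
pool_mode = transaction   ; return server connections after each transaction
max_client_conn = 1000    ; clients that may connect to the pooler
default_pool_size = 20    ; actual Postgres backends per database/user pair
```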


PgBouncer just pools the connections, but each connection still needs its own process in PostgreSQL itself. Each query blocks the whole process. That limits the number of queries that can run in parallel/concurrently to the number of connections. Long-running queries can easily clog up everything. No tool can fix this; you need to be aware of it and consider it in your design.


It's not really about the processes. Even if each query ran on a thread within a process, or some form of greenthreading were in use, there are I/O constraints and locking to consider.


What I’m talking about is the blocking of limited resources. Processes/connections are expensive, and therefore you want/have to limit the maximum number of them. Each query/transaction requires its own process and blocks it for everyone else until it is done.

I/O or compute constraints are another issue; if your CPU or disk is already saturated, you probably get no additional benefit.

But if you wait for something (locks, I/O), the connection/process can't do other things. High latency between app and database, and long-running transactions, can also use up your available processes, even if they don't consume a lot of CPU or I/O or fight for the same locks.

Lock contention is its own problem, but makes the blocking of processes/connections worse.
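A toy illustration of that blocking, in plain Python standing in for Postgres: the thread pool size plays the role of max_connections, and a sleep plays the role of a transaction that mostly waits on locks/I/O.

```python
import concurrent.futures
import time

MAX_CONNECTIONS = 4  # stand-in for Postgres max_connections

pool = concurrent.futures.ThreadPoolExecutor(max_workers=MAX_CONNECTIONS)

def long_transaction():
    # Mostly waiting (locks, I/O, a slow client) -- CPU stays nearly idle.
    time.sleep(0.5)
    return "slow"

def quick_query():
    return "fast"

start = time.monotonic()
# Long-running transactions occupy every "connection"...
slow = [pool.submit(long_transaction) for _ in range(MAX_CONNECTIONS)]
# ...so this cheap query has to wait for a slot to free up.
quick = pool.submit(quick_query)
result = quick.result()
elapsed = time.monotonic() - start
# elapsed is ~0.5s even though the quick query itself costs almost nothing
# and the machine was idle the whole time.
```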


What I meant is, even if processes had no cost, you will probably bottleneck on the I/O. So it doesn't matter a lot how many Postgres connections you have.


Sure, it’s not the most common problem to experience, just something that’s useful to know once you have more and more clients. It did happen to me: all connections used up by long-running transactions (for multiple reasons), and then nothing else was able to run, CPU and I/O nearly idle at <5%, no significant lock contention.


Ah, long-running transactions wreak havoc with connection pools.


Have you tried Typst? It's like a modern version of LaTeX and lets you generate nice-looking documents quickly. It can be called from the command line and makes it easy to create templates and import resources like images, fonts and data (CSV, JSON, TOML, YAML, raw files, ...). Of course it is its own language instead of HTML/CSS, but so far I have found it quite pleasant to use.
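For a flavor of it, a tiny sketch (results.csv is a made-up file name; csv() and table() are from the Typst standard library):

```
= Monthly report

#let rows = csv("results.csv")
#table(
  columns: 3,
  ..rows.flatten(),
)
```

Compiled from the command line with something like `typst compile report.typ`.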


In Bavaria I've seen beer vending machines inside of factories with heavy machinery and forklifts everywhere.


Beer is to Bavarians what guns are to Texans. Don't dare to even think of taking it from them.


The amount of beer consumed in Bavaria is on a decreasing trend. Breweries are worried and some are looking into exports. Times are changing.


In my experience, indefinite is pretty much the default here in Germany. I have only heard about limited leases for students who move somewhere just for university.


A fixed term lease in Germany is only legal in some narrow circumstances and can’t be renewed multiple times in a row. I think after a few years the landlord has to offer you an unlimited lease.


Are German leases indexed to inflation at least?


Haven't met anyone with such a contract. It's usually a fixed value. The owner may choose to raise it every now and then, but laws limit it to a certain amount per year, and there needs to be a reason, for example modernization, or other buildings in the area becoming more expensive.

You can of course make a contract that allows for automatic raises, but that is also limited (at the earliest 12 months after the start of the lease, and only by a fixed/absolute value, not a percentage), and you can no longer raise the price for the other reasons.

There are probably other models, but I don't know much about them.


So far, most of the use cases of AI I see in the wild have been some stupid chat bot, replacing specific job groups, or magically solving a problem "with AI".

All of these attempts fail miserably. It's Blockchain and NFTs all over again. The promises are so far removed from current capabilities that it is not even funny.

Not saying that AI is useless, there are many impressive achievements made possible by it, but I do think it’s currently overhyped and filled with empty promises.


Exactly. Many of the current applications of AI are just marginal improvements, and very few are disruptive/groundbreaking.

Of course no one will tell you that because they are afraid to, in case they get cancelled or lynched.


It is much more akin to the dotcom bubble. NFTs are entirely useless, the dotcom bubble was just too early (many dumb ideas then are successful now).


Depends on the use case. I would argue promising capabilities that are not there yet and can’t be developed by the supplier themselves (for example they just fine tune an existing model) in a reasonable timeframe are practically just as useless.

Other promises like predicting the future or magically solving security are pretty comparable to NFTs.


My experience with Kubernetes has been mostly bad. I always see an explosion of complexity and there is something that needs fixing all the time. The knowledge required comes on top of the existing stack.

Maybe I'm biased and just have the wrong kind of projects, but so far everything I encountered could be built with a simple tech stack on virtual or native hardware. A reverse proxy/webserver, some frontend library/framework, a backend, a database, maybe some queue/log/caching solutions, on any server Linux distribution. Maintenance is minimal, it's dirt cheap, there's no vendor lock-in, and it's easy to teach. Is everyone building the next Amazon/Netflix/Google and needs to scale to infinity? I feel there is such a huge amount of software and companies that will never require or benefit from Kubernetes.


Company CTOs, in my experience, are easily sold on the idea of infinite scalability. In practice not many companies reach that point, but many that go down this road end up building on top of dozens of layers of compute/networking abstractions that only a few experts on the team, if any, can manage competently.

I think the cost of self-managed Linux VMs and monoliths is smaller than the cloud vendors made it seem.

Containers are nice when you have to deal with a language like Python and its packaging ecosystem, but when Go/Rust/.NET/etc. binaries are placed in containers as well... I think we've kind of lost sight of what we're trying to solve in real life.


Monoliths are so much easier for smaller teams. No additional tooling needed, no service discovery; instead of network calls you have function calls, you can share resources, etc. Much less overhead as well, so you may not even need to scale. The number of requests a single Go/Rust server can handle on a dedicated machine is insanely high with modern hardware.


It is really difficult to recycle plastic because there are just so many different materials that we call plastic. Some can be reshaped with heat, others need chemicals. Packaging often contains multiple different plastics, and it is REALLY hard to separate them. The molecules also break down over time/use, degrading the quality of the material.

Another nasty property is that some chemicals used to make plastic soft or to change other properties (additives) often don't chemically bind to it, and they leak out over time.

"Recycled" could probably be removed from the title. Plastic and its additives are very likely bad for you period. "A path towards safer reuse of plastics" is missing a more useful point: reduce single use plastic and replace if other materials make more sense for the use-case. Sure, there a many points we could improved regarding to the usage cycle but some problems are just inherent to the current set of plastics.

I try to avoid plastic as much as possible, especially for food or beverages (even more important if hot).

It's sad that such a useful and versatile group of materials also has so many downsides.


Even when not hot: the recent spectroscopy studies on plastic particles in bottled water should be enough to convince people to avoid them if possible.

