
What's wrong with Ubuntu?

> TikTok pushes more content favoring the Chinese Government (CCP).

Does that violate EU law? (Serious question, I really don't know)


No, and as far as I know it doesn't violate any US law either. The case against TikTok is not based on law, but on suspicion.

Let's say you had a lossless image format that's 20% smaller (on average for pictures people send over networks) than PNG. Let's say it takes 10% more computing power than PNG. Do you stand to make money? What would it be used for?

I can't imagine people will start storing their family pictures in a new format they've never heard of, which isn't supported by any software they use, for "just" 20% better compression. Do they even want lossless compression in the first place (if you don't ask them directly and call it that)?


That's the portability bit the presenter mentions, and it is a very important concern in practice. But how about recompression? Many PNG files are suboptimally compressed, partly because PNG is an old format and partly because a lot of software has been too dumb to produce a well-optimized PNG. In that case we may benefit from transparent recompression, done either by using a better library like libdeflate or by internally using a separate format that can be quickly converted to and from PNG. In fact, Dropbox did exactly that for JPEG files [1]. When I said "so much better", I was thinking about such opportunities that benefit end users.

[1] https://github.com/dropbox/lepton
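
A minimal sketch of the simplest form of this, just re-encoding the same pixels with stronger zlib settings (this assumes Pillow; the file names are placeholders, and a real transparent recompressor like Lepton must also reproduce the original byte stream, not just the pixels):

    from PIL import Image

    # Decode and re-encode the identical pixel data with Pillow's best
    # zlib settings; the decoded pixels stay bit-for-bit identical,
    # only the container usually shrinks.
    im = Image.open("family_photo.png")
    im.save("family_photo_smaller.png", optimize=True, compress_level=9)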


Dropbox apparently abandoned the project. Do you know what their takeaways were from trying to improve JPEG storage?

For example, was it worth it in the end? Did they announce anything? Did they switch to another method or give up on the idea, or do we not know?


Dropbox apparently still uses Lepton or some successor internally, but the open source version was abandoned because maintaining it publicly posed a larger burden than keeping it internal.

Has Dropbox said this anywhere? Or are you assuming it based on something like "this kind of project is very easy to maintain internally so there's no reason why they would have stopped using it"?

At least at the time of abandonment, Dropbox did say (emphasis mine):

> While we did ensure that the reported vulnerabilities don’t affect our internal use of Lepton, we unfortunately don’t have the capacity to properly fix these and future issues in this public repo.

As far as I know this is indeed the last public mention of its use, but given that Lepton was already deployed and dropping it would substantially increase their storage and traffic, it is reasonable to assume that its use continues in some form to this day.


That's pretty strange. If they're using it in a meaningful way, then they're applying it to arbitrary files, so they'd generally need fixes for almost all the bugs. So what resources would be lacking so badly that they gave up on releasing fixes?

There won't be "real world applications" for many years to come.

If I had to bet on what (impactful) application might come first, I'd guess simulation of chemical/physical properties used for drug development and materials science.


"Both organizations will integrate the 256-qubit superconducting quantum computer into its platform for hybrid quantum computing lineup and offer it to companies and research institutions"

But they offer it for rent. Who would be a buyer for the quantum part of the hybrid?

"Research institutions" but for what kind of research?

Or is this rather wishful thinking/PR "we bring quantum computing to the market (just nobody uses it)"?


> "Research institutions" but for what kind of research?

Quantum computing research. I'd guess a big chunk of revenue will come from universities and research institutes. Some companies might also pay for it, e.g. quantum computing startups in need of anything they can show before they have hardware, or startups that aren't even planning to build their own hardware.

There are people working on finding useful problems these devices might help with, on how best to make use of them, and on how to build "infrastructure" for them; it's useful for them to have something to play with. Also, many organizations want to be (seen as) at the forefront of quantum computing: to know the current capabilities, strengths, and weaknesses of the various platforms, to train and educate people about quantum computing and quantum technology in general, etc.


Is there some minimum number of qubits at which some minimum viable quantum-supreme task can theoretically be achieved?

What would be required to factor a 1024 bit integer key?


You might as well ask what would be required to factor an 8 bit integer key. Because decades after the factorization of 21, we're still waiting for a quantum computer able to factor any product of two 4-bit primes with the general Shor algorithm.

Ok, so ... what would be required for the 8 bit key? Do we have reputable numbers? And are the qubits in the article equivalent to other qubits or are they lacking in some way?

You need a few dozen logical qubits, i.e. qubits that have negligible errors and do not lose coherence. The problem is that a single logical qubit takes hundreds of physical qubits and advanced error-correction techniques to construct.

The more I hear about quantum computing, the more it sounds like make-believe grift. I first heard about it in a 2600 magazine in 1997 or so, and the claims and "way forward" then and now are roughly equivalent.

Read as: I've heard for nearly 30 years that quantum is just around the corner, and we need post quantum cryptography.

Or, as Reverend Sharpton said: "All hell's gunna break loose; and you're gunna need a Bitcoin!"


To be fair the story has been consistent. The hardware is lacking and the predictions are testable given advances in it.

When you compare it to the historical development of classical computers it's proceeding at a decent rate. Imagine if we'd needed hundreds of thousands of transistors before being able to demonstrate actually useful work by a classical computer. They likely never would have been developed in the first place.

Cryptography-wise, I'd expect dire warnings about any theoretical attack that's reasonably plausible. Better to react immediately than sit around waiting for it to materialize. It took over 15 years after the warnings for SHA-1 to be broken in practice, and I don't necessarily expect that SHA-2 ever will be, but we've moved on to SHA-3 nonetheless.


To compare: ENIAC had 18,000 tubes and 1,200 relays, and could perform 5,000 additions or 3 square roots per second when it was decommissioned in 1956.

That was about 80 years ago, and it was built for the military. Plotting that out: the first actual PC, the IBM PC 5150, came 25 years after ENIAC was decommissioned, with 29,000 transistors in its 8088. Twelve years later the 586 had 3.1 million transistors and the P4 had 42 million; ten years later (2003) the P4 Extreme Edition had 169 million (though a year earlier the P4 had only 65 million). Haswell, ten years after that, had 1.4 billion transistors, and in 2023 the AMD Ryzen 7800X3D had 6.5 billion.

Here's a graph I threw together to see what the trendline was: https://i.imgur.com/4ofV7Xr.png
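
If anyone wants to reproduce the trendline: exponential growth is a straight line in log space, so a minimal sketch looks like this (numpy assumed; years are approximate and the counts are just the figures quoted above):

    import numpy as np

    # Transistor counts quoted above: IBM PC 5150 (8088), 586, P4,
    # P4 Extreme Edition, Haswell, Ryzen 7800X3D.
    years  = np.array([1981, 1993, 2000, 2003, 2013, 2023])
    counts = np.array([29e3, 3.1e6, 42e6, 169e6, 1.4e9, 6.5e9])

    # Fit log10(count) ~= a * year + b, i.e. count ~ 10**(a*year + b).
    a, b = np.polyfit(years, np.log10(counts), 1)
    print(f"growth ~ {10**a:.2f}x per year, "
          f"doubling every {np.log10(2) / a:.1f} years")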


Serious question: has such a calculation ever successfully predicted a technology trend? How much should we believe it and to what accuracy?

> Is there some minimum number of qubits at which some minimum viable quantum-supreme task can theoretically be achieved?

That is a very broad range of possibilities, so allow me to narrow it to cryptography. I am by no means an expert on this, but I spent the weekend reading about the quantum motivations for changing the cryptographic algorithms society uses, and as far as I can tell, nobody knows the hard lower bound for breaking the hardness assumptions in classical cryptography. The best guess is that it is many orders of magnitude beyond what current machines can do.

We are so far from machines capable of this that it is unclear whether one will be built in our lifetimes, despite optimism/fear/hope (depending on who you are) to the contrary.

> What would be required to factor a 1024 bit integer key?

I assume you mean a 1024-bit RSA key (if you mean a 1024-bit ECC key, I want to know what curve). You can crack 1024-bit RSA with 2051 logical qubits according to:

https://arxiv.org/abs/quant-ph/0205095

In order for it to actually work, it is believed that you will need between 1,000 and 10,000 physical qubits for every logical qubit, so it could take up to around 20 million physical qubits.

Coincidentally, the following paper claims that cracking a 2048-bit RSA key can be done in 8 hours with 20 million physical qubits:

https://arxiv.org/abs/1905.09749

That sounds like it should not make sense given the previous upper estimate of 20 million physical qubits for a 1024-bit RSA key. As far as I can tell, there are different ways of implementing Shor's algorithm, and some use more qubits while others use fewer. The biggest factor in the number of physical qubits is the error correction: if you can do better error correction, you can use fewer physical qubits.
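
For concreteness, here's a rough back-of-the-envelope version of that arithmetic (the 2n+3 logical-qubit count is the circuit from the first paper above; the 1,000 to 10,000 physical-per-logical overhead is just the assumed range stated in this comment, not a precise figure):

    # Rough Shor's-algorithm resource estimate for an n-bit RSA modulus,
    # using the 2n+3 logical-qubit circuit (arXiv:quant-ph/0205095) and
    # an assumed error-correction overhead of 1,000-10,000 physical
    # qubits per logical qubit.
    def shor_qubit_estimate(n_bits, overhead=(1_000, 10_000)):
        logical = 2 * n_bits + 3
        return logical, logical * overhead[0], logical * overhead[1]

    for n in (1024, 2048):
        logical, low, high = shor_qubit_estimate(n)
        print(f"RSA-{n}: {logical} logical qubits, "
              f"roughly {low:,} to {high:,} physical qubits")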


Factoring requires much better noise performance, so for that particular application you don't even have to consider the number of qubits yet. A fundamental breakthrough is required.

There might be applications other than factoring that can be addressed with the noisy qubits we can actually create.


What's the purpose of quantum computing other than solving discrete logarithms or factorisation? It doesn't seem very promising at the moment.

It is said to be useful for optimization problems like protein folding, although current machines have so few qubits that I am not sure how useful it really could be. A microcontroller with only 1024 bits of RAM, for example, has very limited usefulness, so I would not expect a quantum computer with a similar number of qubits to be very useful either.

Pretty much everything you read about quantum computing in the media (especially when aimed at a non-expert audience) is "easy-to-mediatize" information only.

People building these things are trying to oversell their achievements while carefully avoiding making them easy to check, reproduce, or objectively compare to others. They are hard to evaluate objectively even for people who work in the field but haven't worked on the exact technology platform being reported on. Metrics are tailored to marketing goals; for example, IBM made up a performance metric called "quantum volume", only to basically stop using it when it seemed to no longer favour them.

That being said, it's also undeniable that quantum computing is making significant progress, error correction being a major milestone. What this ends up being actually used for, if anything, remains to be seen (I'm rather sure we'll find something).


> They just want the Shah back.

Is that a reference to something? Haven't heard that phrase before.


I'm guessing it's a reference to the last shah of Iran: https://en.wikipedia.org/wiki/Mohammad_Reza_Pahlavi

I'd say, in this case "chore" means "boring, nothing to see here".

It's interesting, because "chore" to me has strong connotations of "tedious, unpleasant".

Right. It derives from the idea that programmers are supposed to find "solving interesting problems" pleasant. On the other hand, boring, repetitive tasks are called "chores".

I don't find it appropriate or useful to place such a sentiment in a commit message, much less as a standard tag.

It's a nerdy colloquialism, i.e. it's not that serious.

That’s part of the reason why I’d object to it in a commit message, in a professional setting.

Some organizations strongly encourage marking every commit with one of a list of categories such as "feature/fix/chore/...". The tags are then bound to lose all meaning (literal or figurative) very quickly.
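
For illustration, such tags usually end up looking something like this (hypothetical messages in the common "type: summary" style):

    feat: add CSV export to the reports page
    fix: handle empty config files without crashing
    chore: bump CI image and dev dependencies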

Unless there was some "conspiracy" to violate the license (my original comment was an attempt at playfully hinting at that possibility, though I don't find it very likely), I'm sure the person who wrote that commit message thought about it for less than three seconds.


A major point is communicating your intentions to people who care about them and who will respect how you wish your project to be treated.

Any sources you would recommend?

Surely, finding a counterexample would be huge news, a noteworthy advance in mathematics, and thus a great and widely praised achievement.

It'd also be an end to the project and would make the conjecture far less interesting.

IMO it would make the conjecture far more interesting, as it would be a surprise to most people who have thought about the problem.

Many natural questions would arise, starting with “Is this the only counterexample?”


Possibly, but it would join other false conjectures such as Euler's sum of powers conjecture, posed in 1769 with no counterexample found until 1966. Only three primitive counterexamples have been found so far.

(I got that from https://math.stackexchange.com/questions/514/conjectures-tha... which features some other false conjectures that may be of interest to you)
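
For the curious, the first of those counterexamples (Lander and Parkin, 1966) is small enough to check directly; a quick sketch:

    # Lander & Parkin's 1966 counterexample to Euler's sum of powers
    # conjecture: four fifth powers that sum to a fifth power.
    lhs = 27**5 + 84**5 + 110**5 + 133**5
    rhs = 144**5
    print(lhs, rhs, lhs == rhs)  # 61917364224 61917364224 True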


The implications aren't even comparable. All empirical evidence strongly supports the Goldbach conjecture; a counterexample would mean an entire field of mathematics would have to be rewritten.
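
As a sketch of what "empirical evidence" means here (real verification projects have reportedly pushed this kind of exhaustive check to around 4 * 10^18; the toy version below only goes to 100):

    # Toy empirical check: every even number from 4 to 100 should be
    # expressible as a sum of two primes.
    def is_prime(n):
        if n < 2:
            return False
        return all(n % d for d in range(2, int(n**0.5) + 1))

    def goldbach_pair(n):
        for p in range(2, n // 2 + 1):
            if is_prime(p) and is_prime(n - p):
                return p, n - p
        return None  # returning None here would be a counterexample

    for n in range(4, 101, 2):
        assert goldbach_pair(n) is not None, f"counterexample: {n}"
    print("No counterexample up to 100.")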
