GPUs Mine Astronomical Datasets for Golden Insight Nuggets (nextplatform.com)
85 points by rbanffy 9 months ago | 32 comments

At my university's astronomy department we've developed GPU software and used it to discover many new objects in the Kuiper Belt. It's robust enough that it can also be used to detect airplanes.


That's really neat! What do you use for training data?

GPUs can do things besides machine learning. The software uses a maximum likelihood estimator, so there's no training data.
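To make the "no training data" point concrete, here's a hedged sketch (not the actual software; every name and number below is made up for illustration) of maximum-likelihood point-source detection: under i.i.d. Gaussian noise, maximizing the likelihood over source position reduces to matched filtering, i.e. cross-correlating the image with the PSF template. Assumes NumPy and SciPy are available.

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(0)

# Hypothetical setup: a point source with a known Gaussian PSF buried in
# i.i.d. Gaussian noise. The maximum-likelihood position is the peak of
# the image's cross-correlation with the PSF -- no training data needed.
size, true_pos, sigma = 64, (40, 22), 1.5
yy, xx = np.mgrid[0:size, 0:size]

def psf(cy, cx):
    return np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))

image = psf(*true_pos) + rng.normal(0.0, 0.1, (size, size))

# Cross-correlation via convolution with the flipped template; the
# log-likelihood differs from this score only by position-independent terms.
template = psf(size // 2, size // 2)
score = fftconvolve(image, template[::-1, ::-1], mode="same")
est = np.unravel_index(np.argmax(score), score.shape)
print(tuple(map(int, est)))  # recovers a position at (or next to) true_pos
```

The same shift-and-stack idea is what makes this so GPU-friendly: the likelihood scan is embarrassingly parallel over candidate positions (and, for moving objects, candidate velocities).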

This makes me curious – remember back in the aughts when SETI@Home and Folding@Home were popular? Were those adapted for GPUs, and did they see a huge acceleration in calculations?

Yes. The vast majority of the folding@home compute comes from GPUs.


If distributed computing had a fraction of time on the millions of GPUs mining cryptocurrencies, I can't imagine how close we'd be to curing disease and finding ET.

You don't just cure disease by doing protein folding simulations. That's just a very tiny part of drug development.

I'd rather get a tiny bit closer to new cures than do more pointless hashing for blockchains.

Many things are pointless if you evaluate them by your own standards. Let the market decide what to focus on. Otherwise, you can always rely on government funding for less-market-focused initiatives like fundamental research.

> pointless hashing for blockchains

Surely there's a point to the person performing the hashing since the action takes a significant amount of time, resources, and money.

I've thought of setting up a pool for deep learning. I think there could be an interest in using GPUs for something other than coin mining.

Time for a cryptocurrency for F@H?

Gridcoin is the biggest I'm aware of. Most people who mine it (that I've spoken to) think of it as a way of subsidizing their hardware and electricity, not as a get-rich-quick scheme. Since it lets you choose what project to work on, you can choose one that you approve of and that is well-suited to your hardware.

I was considering buying some old servers to mine it. I'd have pretty much broken even back when it was $0.10/grc, but it would have taken me ages to make back my investment.

The problem with replacing the general-purpose proof of work in cryptocurrencies (finding SHA256 hashes below a target, in the case of Bitcoin) with something more useful is that the attributes of a good PoW computation are hard to find in real-world distributed problems.

In particular you want a problem with the following attributes:

- must derive somehow from the block data you're trying to mine, otherwise you could reuse your work for a different block and make double-spend attacks trivial. It's very important that once a new block is mined everybody else must start from scratch on the next one, otherwise you could "premine" an arbitrary number of blocks and later append them at an arbitrary position in the block "tree", potentially rewriting history.

- difficulty should be easily adjustable to account for the current "hashrate", otherwise your block rate will go whack as the amount of computing power available changes. It also means that you should be able to estimate, ahead of time, the difficulty of a problem and the average amount of processing power required to solve it.

- easy to validate: the nodes of the network should be able to check that the proof of work is valid using a tiny fraction of the computing power necessary to actually produce the proof (finding hash collisions is hard, verifying them is comparatively trivial).

- doesn't require access to a centralized resource. If you need to connect to some central repository to fetch the work set then not everybody is on equal footing. You have a single point of failure and some miners could have privileged access to the work data.

It's very difficult to find real world problems that have all these attributes.
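For reference, the attributes above are exactly what hash-based PoW delivers for free. A toy sketch in the Bitcoin style (simplified for illustration: the real protocol double-hashes an 80-byte block header, and the field layout below is invented):

```python
import hashlib

def mine(block_data: bytes, difficulty_bits: int) -> int:
    """Find a nonce so that sha256(block_data || nonce) falls below a target."""
    # Tunable target = attribute 2: each extra bit doubles the expected work
    # (~2**difficulty_bits hashes), so difficulty tracks the hashrate.
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        h = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(h, "big") < target:
            return nonce
        nonce += 1

def verify(block_data: bytes, nonce: int, difficulty_bits: int) -> bool:
    # Attribute 3: one hash to validate, versus ~2**difficulty_bits to mine.
    h = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(h, "big") < (1 << (256 - difficulty_bits))

# Attribute 1: the work is bound to the block data, so changing the block
# invalidates the nonce and mining must restart from scratch.
# Attribute 4: no central resource -- any node can mine or verify alone.
block = b"prev_hash|merkle_root|timestamp"  # stand-in fields, not a real header
nonce = mine(block, 16)
print(verify(block, nonce, 16))  # True
```

A "useful" PoW has to reproduce all four properties at once, which is why candidates like protein folding (centralized work units, unpredictable difficulty, expensive verification) keep failing the test.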

It wasn't a particularly serious suggestion, but thanks for the explanation.

Oh I took it very seriously because it's something I've given quite a lot of thought. I'm not a huge believer in cryptocurrencies, but I would be a lot more optimistic about them if they weren't wasting so much energy. Harnessing all that processing power to do something useful would be amazing. Unfortunately, so far the most useful PoW people have managed to implement are things like "compute very large prime numbers", which I suppose is mildly more useful than hashing SHA256 but not by a very large margin.

quick search reveals these: https://curecoin.net/ https://foldingcoin.net/

pretty cool idea! if you're going to do difficult maths for proof of work it might as well be useful I suppose?

I was on the Curecoin team as of last year. It's a neat project, but they weren't able to manage a working relationship with Stanford.

There is a unique opportunity here to use digital currencies to fund scientific research through the use of a reward mechanism and distributed ledgers.

Our project is hoping to take a similar idea to scale (research/project-based "work") by utilizing open datasets posted on decentralized technology. We hope to build relationships with institutions, non-profits, and the public sector to track the economic and social value of campaigns similar to the Folding@Home and SETI@Home projects, but with the capacity to onboard projects as they appear.


I suggest sending the leaders of aforementioned institutions/non-profits/public sector establishments a healthy dosage of LSD if you hope to persuade them to work with you.

Crypto is the pinnacle of human dogshit, but we'd probably be no closer to curing disease or finding ET

And achieving world peace and developing nuclear fusion power. Knock off the hyperbole.

Molecular dynamics has been using GPUs for much longer than Machine Learning. Yes, the accelerations are enormous.

Linear Algebra libraries have been using GPUs for longer than molecular dynamics. What's your point?

That doesn't jibe with my memory - the first codes running on GPUs at the supercomputer centers that had privileged early access were running MD, without any GPU-accelerated LA libraries.

My recollection (which is admittedly fuzzy) is in line with yours. MD was the first application of GPU-accelerated computing that I recall (partly cause NVidia seemed to push that). BLAS, LAPACK, etc got GPU-enabled later.

But the accelerations aren't as enormous as they are on Anton's ASICs. :)

Can we please never use the phrase "data nuggets" again

'How are your Golden Insight Nuggets today?'

'I've been collecting all these great Golden Insight Nuggets.'

God it's horrible.

Why not just: "GPUs Mine Astronomical Datasets for Golden Insights"

Honestly wasn't that egregious. Definitely not at the bottom of the barrel. No references to Elon Musk or Ja Rule's predictions for the AI singularity.

I second that.

Also, leverage.

Why do I have a data lake if I can't leverage that for insight nuggets?
