
Black Hole Tech? - superfx
http://blog.stephenwolfram.com/2016/02/black-hole-tech/
======
olympus
Normally I skip articles by Wolfram because all he does is spout about how
good Mathematica is. But filter out the free advertising for Mathematica and
you have a pretty decent article that starts out discussing gravitational
waves and continues on to interesting coverage of some neat results about
gravity. I'm not sure we'll be creating a stable lattice of bodies all
orbiting each other any time soon, but it was fun to read about. If you are
moderately talented at math/physics you can probably understand most of this
article. No PhD required, and the animations are fairly informative.

~~~
brudgers
My father was a scientist by profession. I remember how much he talked about
SMP when his branch of NRL got it. Later he learned a bit of Mathematica.
Anecdote is not data, but when Wolfram claims that scientists use Wolfram
Research's products to investigate relativity, I tend to believe him, not just
because it conforms to my experience but because he is in a position to know.

I have found that pretty much all of what Wolfram writes is closer to this
article than the popular internet opinion. I suspect that that's because his
tools scratch his intellectual curiosity...they exist so that he can start
with the sort of math that starts this article and go from there...or write a
book like _A New Kind of Science_.

~~~
chm
Some universities pay for a Mathematica site license. I have used the software
since version 6. It's really good for prototyping (my whole MSc was done in
Mathematica), but the _biggest_ problem for me is that code re-use is a total
mess.

It's easier to teach someone to write some Python and use a Git GUI than it is
to teach a new student (who is not a programmer) how to properly use
Mathematica with version control. Notebooks are, by design, made to be
interactive and promote an incremental, playful discovery process. That is
intuitive and students pick it up very quickly.

This means that code is passed around as (in my experience) bloated, poorly
written, poorly documented, and inefficient (no FP, all imperative) notebooks
containing multiple "orphan" sections which were just quick hacks to see if
see if something worked. Of course you _can_ learn to write packages and good
documentation which integrates flawlessly in the documentation center, but
that takes time and effort most scientists would not be willing to give.

~~~
davesque
I agree. I actually love Mathematica and the Wolfram Language but its main
problem is the lack of any convenient ways to package it, track it in git,
edit it in vim, etc. I spent some time investigating how to do that but it all
just seemed really opaque and complicated. That's one of the risks of using
proprietary software, I suppose.

~~~
chm
You can do that with _packages_ but not (at least not easily and conveniently)
with _notebooks_. So basically your functions can reside in a package, but
your computation has to reside inside notebooks.

It is possible to write notebooks directly in a text editor and then run them.
That just gave me an idea for a weekend project...

~~~
davesque
I'm sure it's possible, but is it straightforward how you're supposed to do
this properly? And is there any way to access the Wolfram kernel from the
command-line (like a python executable)?

Update: I guess I owe the Wolfram Language another look.

~~~
chm
Yes, you can use the kernel from the command line:

[https://reference.wolfram.com/language/tutorial/UsingATextBa...](https://reference.wolfram.com/language/tutorial/UsingATextBasedInterface.html)

[https://reference.wolfram.com/language/tutorial/WolframLangu...](https://reference.wolfram.com/language/tutorial/WolframLanguageScripts.html)
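For instance, a plain-text Wolfram Language script (saved here under a hypothetical name, `primes.wl`) can be run non-interactively with `wolframscript -file primes.wl`, or with `math -script primes.wl` on older installs:

```mathematica
#!/usr/bin/env wolframscript
(* A plain-text script: no notebook format involved, so it diffs
   cleanly in git and edits fine in vim. *)
primes = Select[Range[100], PrimeQ];
Print[Length[primes]]  (* prints 25 *)
```

Since it's an ordinary text file, the usual tooling (git, diff, code review) just works.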

~~~
davesque
Ahh, well alright then :). Thx for the info.

------
wyager
Mathematica is an insidious pox on the research community. Research needs to
be open and verifiable, not obfuscated and closed. If the system you are using
to perform calculations is just a giant black box, it's often extremely
difficult to understand what is actually going on under the hood, which makes
it very difficult to check for mistakes.

Here's a good reddit thread about some really terrible arithmetic errors in
mathematica:
[https://m.reddit.com/r/math/comments/2kjyrc/known_error_in_m...](https://m.reddit.com/r/math/comments/2kjyrc/known_error_in_mathematica_has_not_been_fixed_in/)

As a computer scientist doing physics research, I don't understand how my
colleagues put up with the terrible trifecta of mathematica, matlab, and
labview. These are some of the lowest-quality and most frustrating pieces of
software I have ever had to put up with, yet they are ubiquitous in many
research communities. There are vastly superior solutions that are free, open-
source, and, as far as I can tell, much easier to use.

Every day I see students and researchers struggling to circumvent the
idiosyncratic and senseless designs of these programs. I think it might just
be a vicious cycle of professors only knowing shitty software, so the students
only use shitty software, don't learn good software, become professors, and
the cycle repeats.

~~~
gaur
> These are some of the lowest-quality and most frustrating pieces of software
> I have ever had to put up with

I'm a strong advocate for migrating research code to FOSS wherever possible,
but even I don't agree with this statement. Mathematica and Matlab are pretty
good at what they do (labview is a different story).

What "vastly superior" FOSS solutions do you have in mind? sympy? sage? Do you
think these aren't going to have errors in them?

~~~
wyager
>What "vastly superior" FOSS solutions do you have in mind?

Depends on the problem domain. A good example is 1D numerical perturbation
theory; my colleagues struggle to beat mathematica over the head until it
gives them roughly what they want. On the other hand, it's much easier using
jupyter, numpy, and matplotlib.
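As a concrete sketch of the numpy route (not wyager's actual code; the grid size and the quartic perturbation are illustrative choices): diagonalize a finite-difference 1D Hamiltonian and compare the first-order perturbative energy with the directly computed one.

```python
import numpy as np

# 1D harmonic oscillator (hbar = m = omega = 1) on a finite-difference
# grid, perturbed by a small quartic term lam * x^4.
N, L = 500, 16.0
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]

# Kinetic energy from the standard three-point Laplacian stencil.
off = np.full(N - 1, 1.0)
T = (-0.5 / dx**2) * (np.diag(off, -1) - 2.0 * np.eye(N) + np.diag(off, 1))

V0 = 0.5 * x**2          # unperturbed potential
lam = 0.01
V1 = lam * x**4          # perturbation

# Unperturbed ground state and energy (E0 ~ 1/2).
E0, psi = np.linalg.eigh(T + np.diag(V0))
ground = psi[:, 0]       # eigh returns normalized eigenvector columns

# First-order correction <psi0|V1|psi0>; analytically 3*lam/4 here.
E_pert = E0[0] + ground @ (V1 * ground)

# "Exact" ground energy of the perturbed Hamiltonian, for comparison.
E_exact = np.linalg.eigh(T + np.diag(V0 + V1))[0][0]

print(E_pert, E_exact)   # both close to 0.5 + 3*lam/4 = 0.5075
```

A couple of dozen lines in a `.py` file, trivially versioned and reviewed, which is rather the point of the thread.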

>Do you think these aren't going to have errors in them?

Of course they do, but they won't go unfixed for years on end. You can also
figure out what's wrong without having to reverse engineer a big binary blob.

------
dave_sullivan
Anyone who finds this interesting would also probably find this interesting:
[http://accelerating.org/articles/transcensionhypothesis.html](http://accelerating.org/articles/transcensionhypothesis.html)

The author thinks "black hole technology" ends up being very important in the
development of advanced civilizations and offers a solution to the Fermi
paradox (essentially, that black holes and other dense objects make better
Dyson spheres, used by post-brain-upload civilizations).

------
arcanus
I'm not a big fan of Stephen's cellular automata approaches with respect to
fundamental research, as I consider them overly reductionist and largely
unable to be validated.

Regardless, a very interesting article. Stephen is a pretty crazy guy
(especially in person) but he is undoubtedly smart and thinking deeply about
big problems.

------
jpt1
> But as of a little more than a week ago I’m finally convinced that black
> holes exist, just as General Relativity suggests.

Didn't we know black holes 100% exist 20 years ago because of the orbits of
the stars around the centre of the galaxy, and before that because of quasars?

~~~
ajkjk
Well, we had good reason to believe that there were things with approximately
their mass in approximately their area that emitted approximately 0 light to
us. Which, yeah, isn't quite the same thing, because you could imagine that if
GR was slightly wrong, you could have something slightly different at the
center than a True Singularity.

------
peter303
Interesting summary of some aspects of Black Holes.

I recently read a centennial volume on history of General Relativity of which
Black Holes are extreme cases. Except for some initial GR solutions and
astronomical confirmations in its first decade, GR became a backwater branch
of physics for the next 50 years until Thorne and Hawking came along. I was in
college in 1973 when the term Black Hole came into public use, though the book
mentions some claims of earlier usage. Just giving some exotic phenomenon a
snappy name can focus attention. I recently re-watched the 1967 Star Trek
episode about time travel back to a UFO incident. It uses the black hole
concept, except it was called a dark star then. Gravitational singularity was
a competing name, but not as snappy. (To be precise, it is possible to have an
event horizon without a singularity, so they are not exactly the same thing.)

------
gus_massa
It's a long article with a lot of information, and at some parts it gets
boring. But continue (or skip :)) to the last part, which is really
interesting!

------
cf
It's really odd for him to keep harping on Mathematica when the LIGO
computations were clearly done in a Python ecosystem as shown in:

[http://journals.aps.org/prl/pdf/10.1103/PhysRevLett.116.0611...](http://journals.aps.org/prl/pdf/10.1103/PhysRevLett.116.061102)

[https://dcc.ligo.org/public/0122/P1500217/014/LIGO-P1500217_...](https://dcc.ligo.org/public/0122/P1500217/014/LIGO-P1500217_GW150914_Rates.pdf)

[https://software.intel.com/en-us/blogs/2016/02/14/python-bri...](https://software.intel.com/en-us/blogs/2016/02/14/python-brings-us-the-ligo-gravity-wave-sound)

------
Jun8
I found the "gravitational crystal" concept and the following part
interesting:

"But what about the three-body problem? The pictures above suggest a very
different story. And indeed my guess is that the evolution of a three-body
system can correspond to an arbitrarily sophisticated computation—and that
with suitable initial conditions it should in fact be able, for example, to
emulate any Turing machine, and thus act as a universal computer."

Building a complicated physical computer to simulate the real world, similar
to the one in _The Hitchhiker's Guide to the Galaxy_, is an interesting
concept. But wouldn't designing that computer require even greater
computational power?

------
phn
Does anyone have any more insight/interesting reads about the idea that the
electron might be a black hole?

Somehow the idea that electricity may arise from something being "sucked out"
of our universe is appealing.

I'm just a layman thinking of a sci-fiesque scenario, of course :)

~~~
505
Not exactly about the electron, but if you want good fiction about
singularities and the like, find books by Greg Egan. _Incandescence_ is about
a civilisation living near a black hole. When you read it, keep paper and
pencil handy. Egan's website is at
[http://gregegan.customer.netspace.net.au/](http://gregegan.customer.netspace.net.au/).

------
davesque
"When I was 15 or so, I remember asking a distinguished physicist whether
electrons could actually be black holes."

Honestly, what's the point of that comment? Just as much information could
have been conveyed sans ego by saying "I remember one time when I asked a
distinguished physicist whether electrons could actually be black holes."

