1060-hour image of the Large Magellanic Cloud captured by amateur astronomers (astrospace-page.blogspot.com)
752 points by dmitrybrant on April 15, 2019 | hide | past | favorite | 114 comments

Hi Everyone! I'm glad to see so much interest in this image in this amazing community. I'm the CTO of Observatorio El Sauce, the observatory where this telescope is located. It's a fully robotic observatory that provides a service called "telescope hosting", which basically means that people ship us their telescopes and observe remotely from wherever they live. Why would they do that instead of observing from their backyards? Because that part of Chile has the best sky quality for astronomy in the world: in the number of clear nights a year (around 320), in light pollution (class 1 on the Bortle scale), and in something called "seeing" (average below 1"), which is a measure of the smallest detail the sky will allow you to image (the smaller the better). So from our observatory our clients get the best possible images their telescopes can deliver.

Also, a good friend of mine developed this digital scope so you can zoom into the picture easily without going back to the '90s internet experience: https://scope.avocco.com/case/20/eWKcUiIXpuQU9V0z

I'd be happy to answer your questions :) Enjoy!

Cool service! Can you say anything on how many telescopes you host, and ballpark price, and any history of how the observatory got started (land, permits etc)?

Hi! Thanks :) We already host 30 telescopes for different purposes, mainly science and astrophotography. Our standard plans, which include maintenance and most of the support you'd need, cost around USD 7,500/year. The project started in 2012; we spent a couple of years finding the land, doing all the necessary paperwork, getting permits, getting internet, etc. Our first telescope arrived at the beginning of 2015. Our goal is to be a professional alternative to the big scientific observatories, but aimed at small and mid-sized telescopes.

Nice little setup. I liked how they set up the 'small' telescopes (pardon my ignorance). I didn't see any pricelist though.

[1]: http://www.cielaustral.com/construction4.htm [2]: http://www.cielaustral.com/cons4/cons13.jpg

(Not related to the project, just checked their website)

The robotic telescope mount they use (Paramount MX+) is made by a small company in Golden, CO called Software Bisque [1].

These mounts are amazing pieces of engineering. I want one just to fondle the finely machined and anodized aluminum.

[1] http://www.bisque.com/sc/pages/ParamountLandingPage.aspx

Full image http://www.cielaustral.com/galerie/photo95.htm?fbclid=IwAR3G.... (Linked in article) Really incredible.

Magnificent! Congrats to Ciel Austral. We live in a New Golden Age in astrophysics and astronomy. With M87 black hole, TESS exoplanet catalog, Charon flyby, LIGO-Virgo, etc. ;)

“In an eternally inflating universe, anything that can happen will happen; in fact, it will happen an infinite number of times. Thus, the question of what is possible becomes trivial—anything is possible […] The fraction of universes with any particular property is therefore equal to infinity divided by infinity—a meaningless ratio.” -Alan Guth


Sometimes I envy what appear to be more interesting parts of the universe.

That image evokes all those feelings.

Then I remember interesting probably = bad, and we may not be living to appreciate it all.

If you actually lived there it would probably just seem like a lot of uninteresting empty space. It only looks interesting because what you're looking at is far beyond huge, all squashed together into one picture.

“Space is big. You just won't believe how vastly, hugely, mind-bogglingly big it is. I mean, you may think it's a long way down the road to the chemist's, but that's just peanuts to space.”

Peanuts are a gross underestimation. More like a quark or a Planck length.

Well, I think you could live a whole life on this planet without seeing the same place twice. If you think you have seen it all, you probably didn't look close enough.

I mean, you can see every place on this planet within a few seconds if you look at some satellite photo, and yet you would not have seen your house and the living beings around it. Similarly, there are many more details which we do not perceive in our everyday life and which you can explore without having to invent a couple of scientific miracles ;-)

It is not that I don't value the stars. It is just that I think we shouldn't dream of arriving at some wonderful planet one day when we already are on a wonderful planet. We just don't appreciate it like that.

Oh, I think we should dream of all of it. One does not diminish the other.

We do live in an amazing place.

If you're thinking of the LMC as an "interesting place", you're committing a pretty huge error of scale.

The really mind bending part of that is on a galactic scale the LMC is "close". :)

I'll bite. Why is LMC less interesting than other stuff? What's more interesting in your opinion?

I think OP means that the LMC isn't a place per se (like a planet, or maybe even a solar system, is a "place"). It's a whole bunch of places bound together by gravity.

Even so, if you happened to live there you'd get a view of all those nebulae, and the Milky Way in the distance.

You might need eyes the size of a satellite dish to see them all though.

We can see all that from here though.

A solar system is a whole bunch of places bound together by gravity, too. What's the fundamental distinction here? That the center of mass isn't approximately one massive object (e.g. 99% of the solar system's mass is the Sun)?

That's right.

In a very shallow, simplistic sense, we see all these other amazing phenomena out there. Our home seems like a little rural backwater, crossroads and a store kind of thing, by comparison.


For anyone with a shitty computer like me, zooming into that is currently crashing my browser ¯\_(ツ)_/¯

I'm sure someone will be able to put this into a Google Maps style viewer pretty quickly to err, not crash your browser as quickly.

It's an 80 MB JPEG. It worked on my PC with Chrome, but I have 8 GB of RAM. Try IrfanView; I hear that's good.

I have random freezes. Firefox can't handle 200 MP, I guess

Works just fine for me. Loads in 2 seconds, zooms instantly. Maybe something wrong with your computer or Internet?

My first guess: not enough RAM.

No problem with Firefox (66.0.3) here. Zooming into the image makes my FF use about an additional 1G of ram.

Works fine on my 4-year-old desktop, so I checked it on my 2010 laptop. The web process on the laptop is using 1 GB of resident RAM, but it's not 2004 any more.

I opened the full image (14400x14200), which took a good minute to load, and spent some time just looking at every single dot in that picture, of which there are a lot.

Memories of Netscape, on dial up, waiting for the picture to load, but now the detail and resolution is so incredible. It's a great way to start a Monday!

I opened it in Chrome 73 & Preview on MacOS to see memory consumption on Activity Monitor.

When the full image was loaded up in Chrome, Chrome Helper showed up with ~1.30GB of memory consumption; it sometimes went up to 5.60 GB on repeated tests and quickly reverted to ~ 1.30 GB on average.

Preview consumed ~550 MB at the beginning, and further zooming in/out consumed ~250 MB on average.

Chrome struggled a bit after clicking on Zoom. Preview gave a low-resolution image when Zoom was clicked and then rendered the original resolution. Preview's user experience was comparatively better.

Intel Power Gadget showed CPU spikes during these actions, but only nominal GPU spikes; I don't think Metal is being used in Preview.

Edit: RAM: 32 GB, Core i5 7th gen

took about 22 seconds for me, but seemed to stall for about 5 seconds in the middle of loading :-P

Took 6 seconds with Internet Download Manager. By the time I fired up ACDSee, the file was there (running win8.1 on SSD)

Do you have an HDD? Perhaps that's memory-to-disk caching. I have ZERO clue, just guessing.

"Indeed, astrophotographers used a couple of special filters which transmit narrow parts (lines) of the visible spectrum: the Hydrogen Alpha line at 656 nm, the Sulfur line at 672 nm and the Oxygen III spectral line at 500 nm"

Can anyone comment on how much the images produced by these filters differ from what the human eye would see if it were somehow able to look at these objects? Are they also taking in information from the non-visible spectrum and coloring it, or is this all just a focusing of light that real humans would have been able to perceive?

I know they mentioned using different filters to achieve the two different images but was

There are several parts to the answer to your question.

First of all, emission nebulae are not very bright, so no telescope can give a view as bright as a long exposure does. If you can see a nebula through a telescope at all, it will be very faint. That triggers another effect in your eye: the cells for color reception are not very sensitive. As with general night vision, you will usually see nebulae only with your light-sensitive receptors, which don't perceive color. So a nebula will appear in a grey-greenish color.

High-quality pictures of nebulae are taken at very specific wavelengths, the common emission lines you listed. Even at high brightness, they wouldn't directly convert into a good color picture: 500 nm is turquoise, while 656 and 672 nm are both very deep red. A color image converting these wavelengths directly into RGB values would not be very impressive; it would look more like the bottom image on the page. So usually a color mapping is applied to generate impressive images which also show a lot of the detail. With 3 different "colors" in the source image, you can apply an arbitrary transformation to generate an RGB image. For example, most images from the Hubble telescope use a common mapping which is consequently called the Hubble palette. As shown on the page, you can create very different-looking images from the same data set by choosing the color mapping.
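To make the mapping concrete, here is a minimal sketch (not the authors' actual pipeline) of turning three narrowband channels into an RGB image with NumPy; the frames here are random placeholders standing in for real stacked exposures:

```python
import numpy as np

# Placeholder narrowband frames (SII 672 nm, H-alpha 656 nm, OIII 500 nm),
# each normalized to [0, 1]; in practice these come from the stacked exposures.
h, w = 64, 64
rng = np.random.default_rng(0)
sii, ha, oiii = (rng.random((h, w)) for _ in range(3))

# The classic "Hubble palette" (SHO) assigns SII -> R, H-alpha -> G, OIII -> B,
# even though SII and H-alpha are both deep red to the eye.
sho_rgb = np.stack([sii, ha, oiii], axis=-1)

# A more "natural" mapping puts both red lines into the red channel
# and lets the turquoise OIII line feed green and blue.
natural_rgb = np.stack([(sii + ha) / 2, oiii, oiii], axis=-1)
```

Both outputs have shape (height, width, 3); the only difference between "palettes" is which source line feeds which display channel.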

Under excellent conditions the visual experience can be quite similar to pictures. Emission nebulae only emit light in very narrow bands, so a filter will suppress the stars and give the nebula more contrast.

This link might help your question:


Short answer: the color they mapped to green is actually closer to red, and the color they mapped to blue is closer to green, so the real view would have less cyan (blue and green) and more magenta (blue and red). It would probably look a little more purple-ish.

I love that this was done by five amateur astronomers. Here is a description of the group and their set up: http://www.cielaustral.com

It blows my mind to zoom in on just one part of the picture and see how many stars there are. And then multiply that by the entire picture. And then multiply that by all the galaxies in the universe. My mind just isn't built to comprehend numbers that large.

And some of those "stars" are actually galaxies themselves! Truly mind blowing.

Actually, most of those dots are galaxies. The dots that make up the large structures are stars in the LMC which is a collection of star clusters, but everything else that is not part of some larger structure must be a galaxy.

Yeah, if the universe doesn't make one humble, then probably nothing will. After the recent imaging of M87 and its central black hole, I ended up on the page which lists the biggest black holes known to us [1]; it's pretty humbling to read the details on some (e.g., quasars outshining their entire host galaxy, so that we see only the quasar). The universe is surprisingly diverse.

[1] https://en.wikipedia.org/wiki/List_of_most_massive_black_hol...

The glowing up-arrow this site displays makes reading on mobile really annoying. Why distract from the content with a UI pulsar?

Maybe the designer believes the feature of scrolling to the banner image at the top is absolutely essential and should be done frequently! That's why they made it red and blink, it's very important!

I came here to say this. That one feature makes their site unreadable for me on mobile.

While spending about 30 minutes zooming in and marvelling at it, i came across https://imgur.com/fdb6JZH. Is this an exploding star?

This is DEM L316. It might seem like one object, but these are the remnants of two different supernovas of different types: the smaller is Type Ia, the bigger Type II.

[0] http://chandra.harvard.edu/photo/2005/d316/

Ah! Thanks for the link.

This was made from 3936 photos taken between July 2017 and January 2019, each up to 20 minutes long, looking at several parts of the spectrum. Amazing!

Here is a viewer for the images


Does anybody know how expensive the setup described is?

- Remotely-controlled observatory at the El Sauce Observatory in Chile

- A 160-mm APO-refractor telescope and a Moravian CCD

- Presumably hefty image processing requirements

- etc.

I get that these guys are amateurs in the sense that they are not being paid for this, but presumably this costs some serious money? Or are the components they use within reach of a well-to-do hobbyist these days (all relative, I know)?

All in USD: scope 13k, camera 7k, mount 9k.

That's just the big-ticket stuff. They'll have a guidescope, colour filters, a laptop (I assume), all sorts of paraphernalia supporting the effort.

Given that it's remote-controlled and in an observatory in Chile, I suspect that adds another order of magnitude to the cost. But I'm unsure specifically how much, or whether they're renting scope time.

You can buy much cheaper equipment and still do admirably; this setup is really quite extreme for a hobbyist.

Here is a writeup with pictures of a remote controlled observatory located in southern France (ROSA-REMOTE): http://lievenpersoons.com/astrophotography/observatory.html

Sounds like it took several trips and weeks to get it right.

Can someone give a quick explanation of the objects in the picture? Are all the nebulae in the LMC, or in the foreground? Is the LMC the reddish haze in the background?

The nebulae are regions where stars are forming in the LMC. In fact, the brightest and largest, on the middle left, is the Tarantula Nebula, with the young forming star cluster 30 Doradus. 30 Dor is notable for being the biggest, baddest star-forming region in the Milky Way or its satellites: it hosts a few hundred stars more massive than sixty solar masses in its core, as well as the candidates for the highest-mass stars ever observed, at around 150 solar masses and above. If placed 100 times closer, where the Orion Nebula is, its illumination would cast visible shadows, taking up a quarter of the night sky with an average surface brightness comparable to that of Venus.

How long will 150-solar-mass stars last?

As far as my knowledge goes, it's unknown; we have nothing to compare such a star against (and the mass might go as high as 380 solar masses, we don't know).

It'll likely spend a few million years burning hydrogen before going to helium and heavier elements for a few thousand years.

A black hole is almost inevitable.

It's not quite your question, but here's a cool site that put its location in space in context for me: http://www.atlasoftheuniverse.com/sattelit.html

The LMC is effectively a galaxy that we captured and started tearing stars from. The white disk is the core.

Are the cloudy puffs remnants of exploded stars?

Some of them, yes.


Your mass, and mine, are parties to the gravitational conspiracy, yes. One can even trivially prove that the Freemasons and the Roman Catholic Church are in on it. "We" is an appropriate pronoun in this instance, just as it would be for a nation of which you are a citizen.

The LMC is 160k light years away; 100 years ago 'we' didn't exist. The constituent matter that forms us was there, but not 'we'. Perhaps the twinkle in the eye, but no 'we' yet.

Since information and gravity propagate at the speed of light, the reach of 'we' is limited by our age, so to about 100 light years for all but a very few people.

Also, by that count all matter in the universe interacts with all other matter in the universe (OK, out to the limit of the CMB).

So I am hedging my bets by saying it's either wrong, or so obvious it's redundant.

In this case, the "we" was a stand-in pronoun for the Milky Way galaxy, of which we are part. It would have been linguistically tortuous to have phrased it in any other way; "we" was the appropriate pronoun, no matter how much it may embarrass you to be implicated in the action. Natural languages are not instances or implementations of propositional calculus.

I'd love to print this on 15-foot-wide wallpaper for the kids' bedroom.

Well, that's what I'm doing tomorrow, but it will be for my living room. It's a 135x242 cm piece of wall, which is not as square a ratio as the image, so I had to crop it from its upper-left corner down. Inkscape exported it to a 666 MB PNG (ugh), I don't know why. So let's see...

That's really cool, if you get it to work, do a 'show HN' for sure.

How large are the sections that you're printing? And how are you printing them?

It's one section only, which is 15945x28583 px. I'll have it printed at a shop near home which specializes in large prints for cars and trucks. Their prints are like peel-off stickers.

It wouldn't be cheap, but I definitely could do that: 17"-wide strips of photo paper, aligned very carefully.

I'm pretty sure you can get custom wallpaper printed.

There are companies that have canvas printers. Look for those that print on work cloth for other companies.

What kinds of image processing are used for this? Do the images need to be aligned, or can telescopes be pointed precisely enough? Are the images combined using mean/median, or something more sophisticated than that? What settings are the original photos captured with?

There are several software options on image processing for astrophotography. Do a search for 'astrophotography image stacking', and you'll get a list of software, tutorials, videos, etc. A couple of the popular ones are Deep Sky Stacker[0] or PixInsight[1] or even Photoshop. They offer different options/capabilities.

The main thing about the capture settings is to use RAW. Other settings (ISO, exposure time, etc.) depend on the camera being used. However, the goal is to capture as much light as possible within each frame.

The software does image alignment (rotate/scale/etc.) as part of the stacking. You can stack images taken of the same object from different physical locations: spend a weekend in the desert shooting an object, then spend another weekend the next month at the top of a mountain shooting the same object, and all of the images can be stacked.

The precision of the telescope's alignment is important, but less so than it used to be, for a couple of reasons. With gear available today, you can take "portable" telescopes into the field, do a decent polar alignment, and then let the guide scope and software correct for any imprecision in the main scope's alignment, and even for tracking issues caused by manufacturing imperfections in the mount's worm gear. A guide scope is a second, smaller telescope (with a wider field of view) attached to the main scope, with its own camera. That camera is connected to a computer running the guiding software, which tracks a designated star. The guiding software talks to the telescope's motors and can speed them up or slow them down to keep the guide star within a 1/4-pixel deviation.

Also, with digital cameras, images with shorter exposure times are taken and then stacked in software. There are multiple benefits to doing this. Consider exposing a single frame for 60 minutes, versus twelve 5-minute exposures, or thirty 2-minute exposures. If anything bad happens during a short exposure (a plane or a satellite crosses your view, someone shines a laser pointer through your frame, a bug lands on your primary, etc.), it's not that big of a deal to capture it again. Also, digital camera sensors tend to get noisy during longer exposures due to heat buildup around the sensor (a problem film cameras do not suffer).
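As an illustration of the stacking idea (a minimal sketch, not any of the packages above), a mean stack with simple per-pixel sigma clipping rejects an artifact like a satellite trail that appears in only one frame. This assumes the frames are already aligned:

```python
import numpy as np

def sigma_clip_stack(frames, sigma=3.0):
    """Average aligned frames, rejecting per-pixel outliers.

    frames: array-like of shape (n_frames, height, width).
    A pixel value more than `sigma` standard deviations from that
    pixel's mean across frames (e.g. a satellite trail present in
    only one frame) is excluded from the average.
    """
    stack = np.asarray(frames, dtype=float)
    mean = stack.mean(axis=0)
    std = stack.std(axis=0)
    # Keep values close to the per-pixel mean; the epsilon keeps
    # pixels where every frame agrees (std == 0).
    keep = np.abs(stack - mean) <= sigma * np.maximum(std, 1e-12)
    return np.nanmean(np.where(keep, stack, np.nan), axis=0)

# Twelve identical frames, except one frame has a bright "trail" pixel.
frames = np.ones((12, 8, 8))
frames[0, 4, 4] = 100.0
result = sigma_clip_stack(frames)
# The outlier is rejected, so the stacked pixel comes back as 1.0.
```

Real stacking software does much more (registration, weighting, calibration frames), but the outlier-rejection step is essentially this.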

[0] http://deepskystacker.free.fr/english/index.html [1] http://www.pixinsight.com/

> Also, digital camera sensors tend to get noisy with longer exposures due to heat build up around the sensor (a problem film cameras do not suffer).

Maybe worth pointing out that film has its own issues with long exposures, though. If I remember right, film's response to light isn't strictly linear with exposure time so you get less and less useful additional exposure as you expose longer.

Yes, with film there is the Schwarzschild effect, which causes the sensitivity and the color response to vary with the exposure time.

While digital camera sensors usually pick up noise during long exposures, this is less of an issue for astronomical cameras, because they fight this noise by cooling the CCD chip. Usually the chip is cooled via a Peltier element to temperatures below -20 °C, where thermal noise is very low.

The aforementioned effect is termed "reciprocity failure".

According to the official website [1] (which is linked to from TFA), the images were processed using PixInsight, a program popular among amateur astronomers. This page: [2] explains how PixInsight performs image alignment; it turns out to be a pretty complex (and interesting) process. The same page also explains the process of merging images.

[1] http://www.cielaustral.com/galerie/photo95.htm?fbclid=IwAR3G...

[2] http://pixinsight.com/doc/tools/StarAlignment/StarAlignment....

They use an image stacking process that realigns a set of images to form the single image. See for instance https://rogergroom.com/astronomy-deep-sky-stacking-software/

It is not possible to ever see a scene like this, even if one were sitting in deep space, is it? These sorts of images are the result of long exposures; a human would only see blackness and stars, and maybe some faint puffs of light here and there.

You wouldn't ever see the colors. They are far too dim without magnification. If you were standing in the cloud you would probably see it a little, like we see our own galaxy as a blurry cloud, but only on the darkest nights.

There's some astrophotography that fills in bands of wavelengths we can't see with colours, to give us the perception of being able to see gas clouds etc. I'm not sure if that was done here, but it's probably worth mentioning that not all space images you see are realistic in terms of human-visible wavelengths.

Then we should also mention that human color vision changes depending on light levels, with us being more sensitive to some colors than others. So when you over-expose an image you aren't just making it brighter but changing the ratios of perceived colors. (A big deal for eye witness reports of crime at night.) At very low levels our vision becomes essentially black and white.

Lol, nostalgia. Watching that image load slowly from top to bottom is so 1994 for me.

Are you using zmodem? Believe it can resume a broken transmission. ;-)

With these hi-res images I'm always curious which of the stars actually belong to the galaxy and which ones are "noise", i.e., stars that are from our own galaxy "blurring" the view.

I think filtering out "local" stars should be very doable given ML/CV progress.

There is probably some easier way of doing this but if your favourite tool is a hammer, everything starts looking like nails I'm sure :-)

With images taken 6 months apart, spanning Earth's orbit of the Sun, you might be able to detect some parallax motion of the nearest stars.
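For a sense of scale (a back-of-the-envelope sketch, not a detection pipeline): annual parallax in arcseconds is roughly 1 divided by the distance in parsecs, so nearby foreground stars shift measurably over 6 months while anything in the LMC effectively does not:

```python
def parallax_arcsec(distance_pc: float) -> float:
    """Annual parallax in arcseconds for a star at the given distance
    in parsecs (small-angle relation: p[arcsec] = 1 / d[pc])."""
    return 1.0 / distance_pc

# Proxima Centauri, ~1.3 pc away: shifts by ~0.77 arcsec over a year.
nearby = parallax_arcsec(1.3)

# The LMC itself, at roughly 50,000 pc: about 0.00002 arcsec,
# far below anything this image could resolve.
lmc = parallax_arcsec(50_000)
```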

Proper motion is usually the greater of the two effects that cause stars to appear in different places, so it's not quite that simple.

I would love to see a version of this with the "noise" of the stars cleaned up and the brightness/saturation of the clouds increased.

The actual title is: 1,060-hour image of the Large Magellanic Cloud (LMC) captured by Amateur Astronomers

It should probably be changed to a shortened form of that.

Are there areas of the world with no light pollution where you can see anything remotely like this with the naked eye? Any parts of the Milky Way?

You won’t see anything like that with the naked eye, period. We can’t build up a composite of all photons our eye gathers over 1060 hours! There are places where you can get minimal light pollution and see amazing things though.


That site will help.

No, the best you can do unaided is just a general view of the Milky Way.

But if you're willing to accept some optical aid, like a reflector and eyepiece, a large amateur "light bucket" Dobsonian telescope can unveil deep-space objects to the naked eye.

I don't think it's possible to get anything like these photos, though; the sensor collects light over a very long duration to present as a single image. The only way to get more light into your naked eye in real time is with more aperture, and obviously there are practical limits there.

I've seen the Magellanic Clouds with my naked eye. I worked at an observatory in rural Argentina (located there because of the low light pollution). One night, I went out to one of the telescopes for emergency maintenance. When we got back out into the dark and hadn't yet turned on the car's headlights, the Milky Way stretched like a band across the sky, and you could see both Magellanic Clouds as small but macroscopic objects, indeed looking like clouds.

This was among the most breathtaking things I've ever seen (the other being a particularly vivid showing of northern lights in Alaska). The southern hemisphere's sky is infinitely more exciting than the northern one.

The fractalness (is that a word?) of the universe never ceases to amaze.


Fractal nature of?


Sounds like Mortal Kombat finishing move.

Or something Andy Samberg would say on Brooklyn Nine-Nine.

What is the very bright object with a bluish hue in the top left?

It says "Bean" there.

Also: Does anyone here know what star that is? Is it one of the brightest stars in the night sky, or is it just super bright relative to everything else in this picture?

Most likely a star between us and the LMC.

It certainly has a Van Gogh feel to it. Really awesome :)

That link redirects to a spam site for me. Anyone else?
