Dragonfly: An optical telescope built from an array of off-the-shelf Canon lenses (utoronto.ca)
225 points by fanf2 | 78 comments



This kind of setup, with a large number of small, cheap detectors, works well for observing diffuse objects with low surface brightness. Generally speaking, telescopes improve with size because larger telescopes can resolve smaller objects, so you can concentrate the light from your source into a smaller patch and increase the signal-to-noise ratio with respect to the background. But once you have resolved the object (which doesn't require a very large diameter for a diffuse object) you no longer get any benefit from a larger telescope except for the greater light-collecting power. So there's no benefit to a single large mirror vs. a large number of smaller detectors. Since it's a lot easier and cheaper to buy a bunch of off-the-shelf components than to build a large mirror from scratch, that's what they did here.
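A back-of-envelope way to see this (a toy sketch with made-up photon rates, not Dragonfly's numbers): once the source is resolved and you're sky-background limited, SNR depends only on total collecting area times exposure time, so eight small lenses match one big mirror of the same total area.

    import math

    def snr(area_cm2, t_s, source_rate=0.05, sky_rate=5.0):
        # source_rate / sky_rate: photons per cm^2 per second landing in one
        # resolution element (illustrative values only)
        signal = source_rate * area_cm2 * t_s
        noise = math.sqrt(sky_rate * area_cm2 * t_s)  # sky-dominated noise
        return signal / noise

    eight_lenses = snr(8 * math.pi * 7.15**2, 3600)  # eight 143 mm apertures, 1 h
    one_mirror = snr(math.pi * 20.0**2, 3600)        # one 40 cm mirror, 1 h
    print(eight_lenses, one_mirror)                  # nearly identical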

A friend of mine in grad school worked on a project that is similar in spirit, called ASAS-SN. It also used off-the-shelf cameras, but distributed them around the world so that they could detect supernovae and other transients. Because everything was off the shelf, they could build out their network on a shoestring budget. I think they're the first to discover the vast majority of bright supernovae these days.


- "diffuse objects with low surface brightness"

Those are the kinds of things amateur photographers with small telescopes (and lots of patience) sometimes discover:

https://old.reddit.com/r/space/comments/13uco46/i_discovered... ("I discovered this planetary nebula using a $500 camera lens, now it carries my name")

https://www.astrobin.com/i9yy6f/ (18 hours!)


I was struggling to see the planetary nebula inside that blue circle before I realized that faint patch is the planetary nebula. Cool!


Somewhere there's a large, bright nebula in the shape of a red arrow no astronomer's ever noticed.



I still don’t understand how objects in space of that angular diameter are still being discovered. I would have to imagine lots of people have seen it, but just never chose to document or catalog it?


This is an extremely long (18 hour) exposure in specialized narrowband spectral filters that have no usefulness for anything other than these particular targets.


Oh nice!!! I looked at a couple of different sky surveys, and it was nowhere to be found in their data. That's so cool!


So that seems like it would be a great use for the Seestar S50, a $500 smart telescope that is controlled with your phone and can rotate and track objects in the sky. A bunch of people bought one, so they're now distributed throughout the planet.


It's interesting that technically an individual camera's sensor is itself an array of smaller sensors that each capture an individual pixel. So you have something like a tree of arrays.

Maybe we can keep stacking them. Build an array of arrays of cameras/telescopes

What would be the limit?


The sky is the limit. Or the target at least.


Can one use millions of smaller detectors if one finds a way to point them in one direction and synchronize them to take pictures at the same exact moment?

I mean, can millions of phone cameras make one giant virtual telescope?


Broadly, yes.

The key term you are looking for is "exposure stacking". See for example https://markus-enzweiler.de/software/starstax/ and https://www.cloudynights.com/topic/719318-stacking-data-from...
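For a flavor of what stacking software does under the hood, here's a minimal sketch (assuming the frames are already registered/aligned; the file names are hypothetical):

    import numpy as np

    # Load 100 aligned exposures of the same patch of sky (hypothetical files)
    frames = np.stack([np.load(f"frame_{i:03d}.npy") for i in range(100)])

    # A median stack rejects outliers like satellite trails and cosmic rays;
    # a mean stack maximizes SNR on clean data
    stacked = np.median(frames, axis=0)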


Took me a while to understand what you meant. A phone camera already is millions of smaller detectors... But I think you mean coordinating millions of people to all take photos of the same direction in the sky and then combining all the photos? I'm sure it can be done with an app and a way to build that crowd of users! But the field of view will still be huge because they're not telescopes/telephoto lenses.


Yeah, modern phones use multiple cameras to produce a single image. Would it be possible to produce a higher resolution photo using millions of images taken from millions of locations?

I have no clue what I am talking about, but would love to hear somebody knowledgeable speculate on this.


For laughs I once combined frames from some really old footage. I upscaled the frames so that each pixel became a square of same-color pixels, then stacked them and shifted them to line up properly. The resolution goes up and more detail is revealed. Not sure what the limit of that approach is, but even if there is only one frame that has them, you can remove its grain and correct bends (wobbles? distortions? wrinkles? waves? what is the word?)
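Something like this shift-and-add trick, as a rough sketch (numpy only; I'm assuming the per-frame shifts are already known, e.g. from cross-correlation):

    import numpy as np

    def upscale(frame, k=4):
        # Each pixel becomes a k x k block of the same value
        return np.kron(frame, np.ones((k, k)))

    def shift_and_add(frames, shifts, k=4):
        acc = np.zeros_like(upscale(frames[0], k))
        for frame, (dy, dx) in zip(frames, shifts):
            # Shift by sub-original-pixel amounts in upscaled coordinates
            acc += np.roll(upscale(frame, k), (dy, dx), axis=(0, 1))
        return acc / len(frames)  # averaging also suppresses grain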


> correct bends (wobbles? distortions? wrinkles? waves? what is the word?)

Since you're talking about video footage, I would guess it's rolling shutter distortion you saw. This can result in wobbles, skew, or aliasing artifacts.


Now, maybe. Soon - not really, as AI features in camera apps will erase weak signals and/or replace them with some creative interpretation of a generative model. COTS cameras are increasingly becoming useless for doing science.


Are the AI changes also made on RAW format images?


Unfortunately their website is not built from a large, redundant array of off-the-shelf server parts.

Here's a mirror,

https://web.archive.org/web/20240507234024/https://www.dunla...


This article talks about the upgrades they recently (2022) made to the telescope array and has a few example images.

https://www.dunlap.utoronto.ca/new-dragonfly/

The publication resulting from the work, "Giant Shell of Ionized Gas Discovered near M82", can be found:

https://iopscience.iop.org/article/10.3847/1538-4357/ac50b6/...

The companion publication from Yale, "Nascent Tidal Dwarf Galaxy Forming", can be found:

https://iopscience.iop.org/article/10.3847/2041-8213/ac3ca6/...

Edit: And, per the post below, the full list of related publications is at:

https://www.dragonflytelescope.org/publications.html


>Here's a mirror,

Ah!

So there's a large mirror involved?

(sorry)


No, just the camera lenses and sensors for each lens.

I am wondering if they just "stack" the images or if there is something more involved.


Whoosh



Lovely project. When comparing images from this and other scopes, remember that almost all astrophotography nowadays has moved on from exposing a negative through a single light path and developing it into a photo: images are now constructs from grids of sensors with differing response across the EM spectrum (visible and non-visible), subject to intense signal processing and false-colour projections.

About the only point you can discuss as a comparator is the angular resolution and even that means asking how much has been interpolated.

I love modern astro images, but I'm unsure we should even call them "photography", because they're "painted" by the astronomer or their appointed colour artist.

"this one has more IR boost, but I added a hint of green for dramatic effect"

Also, on the angular resolution thing, we're often looking at what is projected into our mind as a 2D structure. The Horsehead Nebula is a 3D state of matter. It wouldn't look like JFK's head from our angle, but might well from another point of view, or like a melon, or like nothing at all.

Most "constellations" are not physically interconnected in any sense, and are not a group the way a galaxy or star+planets are, and need not lie the same distance away from us. They're the view of these objects on an imaginary surface painted by the astrologers.

https://en.wikipedia.org/wiki/Flammarion_engraving


Can someone smarter than me explain why astronomers can't stick something like this on the back of an existing geostationary platform (like what is used for the XM radio satellites) and get amazing data out of it? Surely sticking something like this array 100 km into space would yield better results without the overhead of a 20-year mission plan like James Webb or Hubble.


Because it wouldn't give better results. The big advantage of putting a telescope in space is that you don't have to deal with the movement of the air distorting the image. That doesn't matter when taking pictures of diffuse objects.

The disadvantage is that it is in space: you have to spend 10x or 100x as much making something that can work there, and you can't maintain it. I bet it would be much better to spend that money building dozens of these around the world, or iterating on the design.

The other advantage is that the atmosphere is opaque at some wavelengths. The infrared wavelengths that JWST observes are absorbed. It also helps to be able to cool the detectors down to lower temperatures. One reason we aren't seeing a direct replacement for Hubble is that the big ground telescopes with adaptive optics are just as good.


The lenses are made of materials that will not withstand the conditions in space (high temperature gradients, an oxidizing environment, radiation).

Once in space they cannot tweak the array.

Launch weight and stresses would damage this array.


I've always wondered that myself. If the Russians could build and launch Sputnik in 1957 and get it around the Earth a few times, why aren't we seeing a huge number of backyard dad+son duos building rockets to launch their own telescope arrays? It's amazing that it's a 60-year-old feat that is still only in the hands of governments and massive corps.


The problem is the lack of a "backyard" ICBM program to piggyback off of... the R-7 that the Sputnik launcher was based on "was designed with excess thrust since they were unsure how heavy the hydrogen bomb payload would be" (Wikipedia).


1) You need more than just getting to space; you need to be going really fast when you get there (roughly 8 km/s sideways for low orbit). So, lots of propellant and a big rocket are needed, and it's really expensive.

2) Because you're setting fire to a big tube of propellant that then goes crazy fast, you need all sorts of permits and safety reviews to do it

3) Space is hard, so your rocket will almost certainly blow up / fail a couple of times.

All of this means big budgets and state-level patience and persistence are needed.


But there actually are so many startups and garage enthusiasts at various stages of readiness to put payloads into orbit:

https://www.youtube.com/watch?v=SH3lR2GLgT0


Rocket science ain't easy. Just because you can build a great telescope does not mean you can build a rocket. Also, I could only imagine the NIMBY revolt when you file your permit plans to build launchpad-38A in your backyard. I hope you don't spend too much time wondering before coming to obvious answers


You might look up the sagas of Rocket Lab and other small launch providers if you're interested in what it takes to put ~200lb into LEO today. It's way beyond dad and son stuff, still incredibly hard, but not so much anymore that you need to be Boeing/Lockheed/SpaceX/etc if not a national agency. This is a recent achievement.


Just because it was possible for the soviets back then, doesn't mean it's trivial today.

Rocket fuel is also not exactly easy to come by.


Projects like this have been done; see for example https://www.jpl.nasa.gov/missions/arcsecond-space-telescope-...


I am no expert, but the things I would worry about:

- cost to get into geostationary orbit might dominate the value/saving of the cheap instrument, so it might be smart to spend more on that
- managing and controlling it might be very challenging
- getting the data down from it might create difficulties and costs that kill the value
- heating and cooling in space might kill it
- radiation in space might kill the hardware
- acceleration during launch might kill the hardware
- the payload needs to be stable during launch or there will be an accident
- scientific value might be lower than other missions for similar spend


Not a professional, so take this with a grain of salt, but my guess would be weight first and foremost. From what I understand, geostationary orbit isn't cheap to get to, and each added kilogram increases cost significantly. These lenses, while not incredibly heavy, are heavy enough to add a fair amount of cost.


I also doubt these lenses will hold up in a very cold or very hot near-vacuum.


I don't think there's that big of an advantage for space-based astronomy here, for visible-wavelength light with large pixel scales, and relatively bright (total luminosity) objects. Because it's done in narrowband filters, it's particularly good at erasing sky noise.

/not an astrophotographer


There are... nine main limitations on telescope imagery that I can think of. In no particular order:

First is weather. We can't see through clouds. Most new astronomy is about sources too faint to have been analyzed a hundred years ago, and even clouds that are barely visible to the human eye will drown those out.

Second is various engineering difficulties resulting from differential temperatures in the air in close proximity to the telescope dome, defects in the mirror surface, and limitations of the optical design (you're projecting a spherical sky onto a flat surface).

Third is 'atmospheric seeing': high-order distortions caused by thermal patterns in the air which change significantly on a tens-of-milliseconds timescale, ultimately leading to a Gaussian blur of the light in long exposures. The lower your altitude, the more disturbed the airmass, and the more humid it is, the worse this gets.

Fourth is sky glow: light pollution from nearby upward-facing lightbulbs, from the full moon, and from the sun at twilight and in the daytime.

Fifth is the diffraction limit. A perfectly engineered, spherical-cow-world telescope with a perfect sensor has fundamental optical limits to the resolution it can observe, and optical resolution in arc seconds scales with wavelength / aperture.
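As a worked example (my numbers, using the Rayleigh criterion and Dragonfly's 143 mm lens aperture):

    import math

    lam = 550e-9  # green light, metres
    D = 0.143     # aperture diameter, metres
    theta = 1.22 * lam / D             # Rayleigh limit, radians
    print(math.degrees(theta) * 3600)  # ~0.97 arcseconds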

Sixth is bright-source confusion and the limitations of your background field. It's very difficult with CCD & CMOS sensors (and even with spherical-cow sensors, the optics present limitations) to image a faint thing next to a bright thing. This is why we have fewer galaxies mapped on the other side of the Milky Way, and why it can be very difficult to pick up, say, a nebula right next to a bright nearby star.

Seventh is light-gathering ability, thermal noise, and readout noise. If you're trying to capture a photon every second, it's going to be very difficult if your CCD is absorbing thousands of photons per second thermally from the surrounding blackbody radiation and the readout circuitry.

Eighth is differential focus. To make matters more complicated, optical resolution is not 'fixed', because focus is not identical in different parts of the image. Typically telescopes are optimized for nominal focus at the center of their field, but get a few arc-minutes off of the center and optical resolution goes down. Get a few degrees off and it can go down to un-usability. There are characteristic aberrations that crop up, and every optical design that aims for wide fields is a compromise between these aberrations.

Ninth is atmospheric windows. The atmosphere absorbs hard UV, portions of infrared, and portions of radio. To get a full spectrograph of a source, to detect the exotic portions of the EM spectrum that we don't deal with frequently, you can't do it through the atmosphere.

Generally speaking, it's relatively easy on Earth for professional observatories to reach a point where atmospheric seeing limits your observations more than diffraction or readout noise or field distortions or sky glow or ambient light. It's not easy to defeat bright-source confusion with a larger and larger telescope. Many astronomers have had to content themselves with knowing little about the sky right next to bright sources like nearby stars. The telescope in the article tries to probe this known unknown with numerous small low-res cameras.

Space observatories give us somewhat (10x?) better surveys because of no sky glow, daytime observations, no weather, etc. They remove the atmospheric-window limitation and simplify some engineering issues (while complicating others).

Part of the big remaining purpose of space observatories, the thing it's very difficult to do on the ground (we've tried!), is to defeat the atmospheric seeing limit and allow us to use very large telescopes which are relatively simply designed. Light-gathering ability from a source scales with aperture^2, and light-concentrating ability scales with aperture^2, so ideally sensitivity to point sources should scale with aperture^4. It rarely does on the ground, because we have to put up with atmospheric seeing. The technologies we've used on the ground to fight atmospheric seeing are extremely limiting, expensive, complex, the subject of an inane number of PhD theses, and only suitable for very small fields.
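Spelled out as a sketch (my formulation of the scaling argument above, for a point source against sky background; angles in radians):

    def point_source_sensitivity(D, seeing_fwhm=None, lam=550e-9):
        collecting = D**2                   # photons gathered ~ aperture^2
        if seeing_fwhm is None:
            psf_area = (1.22 * lam / D)**2  # diffraction-limited blur shrinks as D grows
        else:
            psf_area = seeing_fwhm**2       # atmosphere fixes the blur size
        return collecting / psf_area        # signal concentration vs. background

    # In space, doubling D buys 2**4 = 16x; under seeing, only 2**2 = 4x.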

This goal of survey astronomy is at cross purposes to the telescopes in the article, which aim to get diffuse, low-resolution impressions of the light near bright objects, defeating problem number six; they can do this with relatively short exposures over hundreds of sensors, so that none of the electron wells in the sensors ever saturate from being full of too much light and spill over into their neighboring electron wells.


Question: why have, say, 100 different lenses simultaneously take images, rather than just a single one taking 100 sequential images and then using something like RegiStax to combine them?


I think you always want to take many pictures (significantly more than 100; the article mentions a 10h integration time, which I think means taking photos for 10 hours in a row), so if you take them 100x slower it will take much longer, and there will be more drift in that time to account for.
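The arithmetic is simple but stark (my numbers, assuming the 10 h figure is total integration time):

    # 10 h of total integration, split across lenses
    print(10 * 60 / 100)  # 100 lenses in parallel: ~6 min of wall-clock time
    print(10 * 60)        # 1 lens in sequence: 600 min, with far more drift to correct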


That write-up could have used significantly more detail. Is this competitive with a traditional telescope? 1/1000 the cost? Is it just the benefit of not having to share instrument time? Being able to incrementally expand the array?

Some kind of technical measurement for me to better appreciate the work.


Forget this telescope. I'm more intrigued by the location: New Mexico Skies, a campground for telescopes. I'd rather visit such a place than most any museum.

https://nmskies.com/


This is "when I grow up" type of wish to have my gear at this type of location for my remote sessions. Otherwise, I'd just really like for my retirement plot to be remote like this.


> with unprecedented nano-fabricated coatings with sub-wavelength structure on optical glasses.

Also known as an anti-reflection coating. Definitely not unprecedented.

Cool project, though.


That’s exactly why they are using commercial lenses! Canon’s modern AR coatings are so good that the performance for low-surface brightness imaging beats larger-aperture telescopes which can gather more light but scatter it around the image. See the third page of https://arxiv.org/abs/1401.5473


From the various links in the comments, the telescope uses special filters in front of the lenses to capture the faint glow in specific wavelengths emitted by the circumgalactic medium (a new word for me!) aka "gas" around galaxies.


Can someone provide/link-to a principle of operation? I looked at references (https://www.dragonflytelescope.org/publications.html) but they point to publications that look to be behind pay walls.

In particular, I don't see how N co-aligned cameras is any different than N images taken in sequence with one camera (averaging over noise), other than a reduction in time required to take N images with one camera.


> they point to publications that look to be behind pay walls.

If you click the “ADS” link you will find most will have links to a free preprint on arXiv.

> other than a reduction in time required to take N images with one camera.

Life is short!


Anyone know what sensor they are using behind all that nice Canon glass?


- "Each of the eight lenses in the array is connected to a Santa Barbara Imaging Group (SBIG) STF-8300M CCD camera. These cameras have Kodak KAF-8300 CCD detectors, which have a 3326 × 2504 pixel format."

https://arxiv.org/abs/1401.5473 ("Ultra Low Surface Brightness Imaging with the Dragonfly Telephoto Array")


Thank you!


How does this relate to interferometry? I do not think they are keeping waveforms or timing or anything (except taking the pictures at about the same time). Are they just stacking images?


Essentially. It's better than conventional stacking because the images are taken through different lenses, so any lens flares/distortions/reflections etc. in each camera should get averaged away by the others.


Not much to add to what others have mentioned, but this looks like a good (low-cost, wide-FOV) photon bucket with low complexity (important for reducing stray/scattered light).


Cool... a more up-to-date-ish version of the Wide Angle Search for Planets (WASP [1] and SuperWASP [2])

[1] https://warwick.ac.uk/fac/sci/physics/research/astro/researc...
[2] https://www.superwasp.org/about/


Different science goals, though.


Something similar is done for drones:

"The 1.8 gigapixel sensor is made up of a matrix of 368 Aptina MT9P031 5-megapixel smartphone CCDs."

https://newatlas.com/argus-is-darpa-gigapixel-camers/26078/


How can multi-element lenses have less scattering than a large mirror telescope? Do they mean smaller diffraction patterns instead?


See the third page of https://arxiv.org/abs/1401.5473


Though the characterisation of nano-structured coatings isn't quite right

> However, the latest generation of Canon lenses features the first commercialized availability of nano-fabricated coatings with sub-wavelength structure on optical glasses.

Nikon was actually first to market with this, as far as I can tell, with the AF-S 300mm f/2.8 VR in 2004.


Is it possible to download a full-quality raw image captured with it somewhere?


RAIT (Redundant Array of Inexpensive Telephoto lenses)?


Not that inexpensive - that's a $12000 lens with a nearly 6" aperture. The telescope itself is a fairly specialized instrument. It's more like a redundant array of very fast (for a telescope) lenses. From the paper: "Our primary goal is to test predictions that at very low surface brightness levels galaxies display a wealth of structures that are not seen at higher surface brightness levels." (But yes, the lenses are going to be cheaper from Canon than the tubes would be from a specialized telescope maker).


The thing to appreciate is that in academic science, a $12000 lens is inexpensive; the scale is different from consumer and prosumer pricing. A typical project like this would get at least a million dollars in funding (see https://www.artsci.utoronto.ca/news/astronomer-roberto-abrah... which shows they got $2M to buy lenses; 120 lenses × $12K ≈ $1.4M).

I work with microscopes that cost $1M and just sit in a lab. That's not atypical for an academic or industrial microscope.

One of the biggest issues in modern science is that to make many discoveries you need to pay very high prices for the latest and greatest hardware. I've been exploring how to make lower-cost telescopes and microscopes (and I definitely love this project) that are "good enough" to open up new areas of research/discovery for people with budgets in the $1K-10K range. But it's hard! So far I have mostly been relearning what people already knew in the 1800s and early 1900s and what is easily obtained as off-the-shelf tech today.


The point is it's not that inexpensive for the optics. I've worked in an interferometry lab where we used high-volume photographic lenses where we could, and yeah, they were probably an order of magnitude less expensive than something made for the purpose, and real exotic optics were more than two orders of magnitude more expensive.

But talking apples to apples, a $12000 professional camera lens is closer to something like a $12000 research microscope than you think in terms of the build-or-buy decision. There's also a whole industry of telescopes for amateurs that are made in low volumes to much higher optical standards than photographic lenses that are mostly not that expensive. A top tier 6" refractor might be $15K and blow the camera lens away as a general purpose astronomical instrument, but it would not be nearly as fast, which is important for this application, and if you placed an order for 120 of them, you might get them in five years. Maybe. I'd guess that made-for-the-purpose tubes would come in within a factor of two or three of the off-the-shelf lens if you could find a supplier. They might even be cheaper. The project risk would be larger, and that might be determining also.


> I work with microscopes that cost $1M and just sit in a lab. That's not atypical for an academic or industrial microscope.

Ooo, what for!? I always love hearing what people do with optical equipment I can't afford. :)

I'm sure I'd be interested in your pet project as well!


The scopes I work near are typically for applying cutting edge imaging techniques to cellular or organoid growth. Like this: https://www.zeiss.com/microscopy/en/resources/insights-hub/l...


Wow, thanks! That first image is really something else.

I wonder if Dr. Philipp Seidel has any relation to Philipp Ludwig von Seidel.


There are plenty of much cheaper telescopes in that 6" aperture range. My 152mm was only $1k. My polar mount was $1500. Even adding a tracking scope/camera is still a fraction of the price of a single lens. Tacking on a similar SBIG dedicated astro camera and a cheap laptop to run the guiding and imaging would still come in under that price tag.

So, is there an advantage of having all of the lenses on the same tracking platform to justify the expense of the single mount? If you place individual 6" scopes in an area where humans could comfortably move between them all pointed at the same object or even slightly different areas to get the wider image, would that not be the same/similar result? Essentially, building the VLA but with commodity off-the-shelf visible scopes.


It's more of a "that they could do it at all" sort of feat. They're coming in somewhat cheaper (probably) and at lower project risk by using off the shelf professional camera lenses than by having tubes made for the purpose.

Your 6" scope is slower, probably much slower, than the telephoto lens they used. There really aren't any amateur telescope tubes I know of that you could directly compare to the 143mm aperture f2.8 Canon lens. The right comparison would be to a 6" apo, which would run $8K-$16K and still be slower.


You are of course correct regarding f-value. My specific scope maths out to an unusual f/4.8 at 731mm focal length. However, I'm not trying to take 1/100th-second exposures. I'm doing 30s exposures, so f/2.8 vs f/4.8 isn't that bad of a trade-off.
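For reference (simple arithmetic, not from the thread): light gathered per unit time scales with 1/f^2, so

    print((4.8 / 2.8) ** 2)  # f/2.8 collects ~2.9x what f/4.8 does per exposure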

Even if this isn't doing the same "science", it would be an interesting thing to play with for science. Instead of stacking images from the same camera, just stack one image from each camera in the array. Or capture an entire mosaic in one "snap", which is essentially what WASP is doing (mentioned in a post from yesterday).


That's almost certainly an achromat unless you got a screaming deal I want to know about. It's not in the same league optically as the Canon lens, which is more like an apochromat, even though you'd probably be disappointed using the Canon lens as a visual telescope - it will be tough getting and keeping sharp focus on stars with the Canon. They had to use a feedback control system on the focuser in the early paper. They also use a lot of image processing.

Also they're not so much using the speed of the lens for shorter exposure times but for field of view and for high sensitivity.

In general photographic lenses make mediocre telescopes and telescopes make mediocre photographic lenses - try using one of your tubes for some terrestrial photography to see. So it's pretty amazing that the Canon lens performs so well to begin with, that they're able to use it like a fast apochromat, and even more so that they're able to build it out to be roughly equivalent to a really large apochromat. With eight lenses, the early paper claims the instrument is equivalent to a 40cm f/1 refractor. How would you build such a thing? Well, this is how.
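The equivalence checks out on the back of an envelope (my arithmetic, treating the eight 400mm f/2.8 lenses as one combined aperture):

    import math

    area = 8 * math.pi * (0.143 / 2) ** 2    # eight 143 mm apertures, m^2
    D_equiv = 2 * math.sqrt(area / math.pi)  # ~0.40 m combined aperture
    print(D_equiv, 0.400 / D_equiv)          # focal length stays 400 mm -> ~f/1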


From what I understand, reconstructing images from virtual telescopes requires extremely accurate timing so all the wavefronts are in phase: https://en.wikipedia.org/wiki/Aperture_synthesis


Yes, but that’s not what they’re trying to do.



