• outlay and then the learning curve is astonishing hats off to the Boss for taking it on

    👀

    Well, the outlay is less than, or equivalent to, a very nice bike. It's not out of the reach of people: a bike-shaped object is the entry telescope, and a bespoke titanium thing with Dura-Ace is about what the top of the astrophotography market costs. If cycling is accessible, so too is stargazing.

    I may be simplifying the learning curve here, but it's just a camera (or eyepiece) behind a lens... this is photography.

    It's also more like photography from 100 years ago: slow photography, where you don't really know how the outcome will look, so you have to experiment more.

    In fact, I'd argue it's easier than photography as the universe supplies an infinite number of already composed incredible views, and they're there every night.

    If one can understand focal length from photography (i.e. a 20mm prime lens vs a 35mm prime vs a 55mm prime) then one can understand that a telescope is just a prime lens at a very different focal length (the one I'm buying is 450mm). The only things that differentiate a camera lens from a telescope are that a telescope is optimised for focusing at infinity and is designed to be mounted directly onto a tripod mount for very long exposures.
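
    To make that concrete, a minimal sketch of the focal length to field of view relationship (the 36mm sensor width is my assumption, a full-frame camera):

        import math

        def fov_degrees(focal_length_mm, sensor_width_mm=36):
            # Angle of view across the frame for a lens focused at infinity.
            return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

        for f in (20, 35, 55, 450):
            print(f"{f}mm: {fov_degrees(f):.1f} degrees")
        # 20mm: ~84 degrees, 35mm: ~54, 55mm: ~36, 450mm: ~4.6

    Same lens physics, just a much narrower slice of sky at 450mm.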

    Even filters are standard in film and photography. Who hasn't noticed the teal and orange look in the movie industry? It's everywhere https://theabyssgazes.blogspot.com/2010/03/teal-and-orange-hollywood-please-stop.html and filters are used all the time in photography to balance the exposure, handle reflections, etc.

    Filter use in astrophotography is mostly like polarising filter use in photography: to stop certain types of light. What astrophotographers are trying to stop is all the local light from human-made things; you really just want the photo to include the light from the deep sky object, and not the light from Earth or local things.

    Filters exist which allow just a narrow band around a single wavelength to get through... which is far better than trying to exclude lots of different wavelengths.

    This is the "Hubble" palette, also known as "SHO Hubble Palette", and it's made up of 3 distinct filters each allowing a tiny wavelength of light to get through:

    Where the wavelengths are:

    What you need to know is that SHO stand for the first letters of the gases these filters target:

    • S-II 672nm is Sulfur-II
    • Hα 656nm is Hydrogen-Alpha
    • O-III 496nm is Oxygen-III

    Different gases glow in different colours because their light is emitted at different wavelengths, so each of these filters effectively isolates the light of a single gas.

    But pictures need a few different colours to make them fully visible to the human eye, and the SHO palette is an artificial palette where the different gases are mapped to red, green and blue (see the sketch after this list):

    • S-II is Red
    • Hα is Green
    • O-III is Blue
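
    As a minimal sketch of that mapping, assuming three already-stacked narrowband frames saved as hypothetical files:

        import numpy as np

        # Three stacked narrowband frames of the same patch of sky,
        # as 2D brightness arrays (the .npy filenames are hypothetical).
        sii  = np.load("sii.npy")    # S-II, 672nm
        ha   = np.load("ha.npy")     # H-alpha, 656nm
        oiii = np.load("oiii.npy")   # O-III, 496nm

        def normalise(channel):
            # Scale to 0..1 so the three channels are comparable.
            lo, hi = channel.min(), channel.max()
            return (channel - lo) / (hi - lo + 1e-9)

        # The SHO mapping: S-II -> red, H-alpha -> green, O-III -> blue.
        rgb = np.dstack([normalise(sii), normalise(ha), normalise(oiii)])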

    As nearly everyone uses the same colours, you can look at the Tadpoles Nebula (IC-410) and see that there's a lot of Oxygen (blue) in this region:

    The images above are narrowband, meaning a single filter allowed only that wavelength through; you take many photos of the same spot in the sky using different filters, and then colourise and layer them together... this is not a new thing, it's how design and colour printing worked for 100 years: gels and films producing a colour magazine by layering different transparent films over each other.

    I'm actually not doing that. I've bought a full colour camera, also known as a "one shot colour" camera, as a single photo captures all the colours at once... but that photo also captures the light pollution, and the advantage of using filters is that they exclude it.

    Light pollution in most of the world is from street lights and such, i.e. street lights interacting with the atmosphere. So what you really want is a filter that lets through SHO but doesn't let through the various types of light pollution (sodium street lights, for example, glow at around 589nm, which sits neatly between O-III and Hα).

    So I've gone for a broadband filter, broadband as it lets through multiple wavelengths.

    This is what I'm going to be using:

    The white line shows that S-II, Hα and O-III all get through, as well as other nebula emission colours, but the most common light pollution does not.
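
    As a toy illustration of what that chart is saying (the passband numbers here are invented for the example; a real filter publishes its transmission curve):

        # Emission lines we want to keep (nm) vs common light pollution (nm).
        targets   = {"S-II": 672, "H-alpha": 656, "O-III": 496}
        pollution = {"sodium streetlight": 589, "mercury vapour": 546}

        # Hypothetical passbands: one window around O-III, one around
        # H-alpha and S-II, with the street light lines falling between.
        passbands = [(486, 506), (646, 682)]

        def passes(wavelength_nm):
            return any(lo <= wavelength_nm <= hi for lo, hi in passbands)

        for name, wl in {**targets, **pollution}.items():
            print(f"{name} ({wl}nm): {'passes' if passes(wl) else 'blocked'}")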

    The only really unique thing about astrophotography, beyond the "needs dark and clear skies", is stacking.

    The whole thing is about light, and very little light reaches us from these deep sky objects; a tiny telescope on a rock hurtling through space isn't going to have a lot of those photons hitting it. To stand any chance of getting a good picture you need a long exposure, and tens of hours is not uncommon... but a camera in a fixed location pointing at the sky will show star trails after barely 5-8 seconds, as we're on a rock spinning quickly through space.
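
    A common rule of thumb for this is the "500 rule" (folklore rather than precise physics, and it assumes a full-frame sensor): divide 500 by the focal length to get a rough maximum untracked exposure in seconds. At 450mm it's barely a second:

        # "500 rule" of thumb: max untracked exposure before stars trail,
        # on a full-frame sensor, is roughly 500 / focal length (mm) seconds.
        def max_untracked_exposure_s(focal_length_mm):
            return 500 / focal_length_mm

        print(max_untracked_exposure_s(50))   # ~10s on a normal 50mm lens
        print(max_untracked_exposure_s(450))  # ~1.1s on a 450mm telescope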

    To do long exposures, then, you need a special mount to hold the lens (telescope). It is special because it aligns to the Earth's axis, meaning it is aligned to the poles, and it includes a motor and some technology to control that motor, so that it gradually moves the telescope at the same speed the Earth spins, giving the camera a fixed view of the sky despite sitting on a rock that is spinning in space.
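
    The tracking rate itself is easy to derive, a sketch: one full turn per sidereal day (one rotation of the Earth relative to the stars, about 23h 56m), which works out to the roughly 15 arcseconds per second the motor has to counter:

        # One revolution per sidereal day (~23h 56m 4s = 86164.1s),
        # not per 24h clock day, as the stars are the reference frame.
        SIDEREAL_DAY_S = 86_164.1

        arcsec_per_s = 360 * 3600 / SIDEREAL_DAY_S
        print(f"{arcsec_per_s:.2f} arcsec/s")  # ~15.04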

    By this point you've just got a camera, with a big lens, on a mount that allows you to take long exposures of the sky (5 minute exposures!)... and yet you'll still get noise in the photo, a lack of detail, some atmospheric effects, and trails from a plane or a Starlink satellite.

    Now software can help... just take 300 photos! If you take 300 pics of the same spot in the sky, say 300 x 5 minute exposures, then what you have is 25 hours of exposure. That's more than one night of doing this; it could be 10 nights of only 2 and a bit hours per night. 300 pics, though... software can do "stacking", which is to merge multiple pictures, keeping all the things that are similar across the pics and reducing in importance the things which only appear in one picture.
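
    A minimal sketch of what stacking software is doing under the hood (assuming the frames are already aligned; real tools also handle alignment and calibration):

        import numpy as np

        def sigma_clip_stack(frames, sigma=3.0):
            # Average aligned exposures, rejecting pixels that disagree.
            # A value far from the per-pixel median (a plane, a satellite
            # trail) only appears in one or two frames, so it gets masked
            # out; the deep sky object, present in every frame, stays.
            cube = np.stack(frames)              # (n_frames, height, width)
            median = np.median(cube, axis=0)
            spread = np.std(cube, axis=0)
            outliers = np.abs(cube - median) > sigma * spread
            clipped = np.ma.masked_array(cube, mask=outliers)
            return clipped.mean(axis=0).filled(median)

        # e.g. 300 x 5-minute subs, already aligned, as 2D arrays:
        # result = sigma_clip_stack(subs)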

    The effect of stacking is that with 300 x 5 minute exposures you have 25 hours of a deep sky object, perhaps 2 seconds of a Starlink satellite whooshing by or a plane overhead, and only a minute or two of some specific atmospheric condition (haze, high altitude wispy clouds, etc)... the act of stacking enhances the deep sky object and removes the ephemeral conditions and random noise.

    And really that's it as far as I can tell.

    Just a camera, a lens, a tripod with a clever mount, a filter, and some software to merge images together.

    Sources: I wrote this post myself, but googled for images to explain things, and most come from a single site, so it's worth citing:
