Sunday, April 29, 2018

#133 - Friday, April 27, 2018 - At the Mercy of the Atmosphere


After wrapping up a Minecraft session with a friend (both of us were yawning and saying it was time for sleep), I looked outside and saw it wasn’t as cloudy as Clear Sky had predicted. Suddenly I was quite awake, and I decided to set up my gear for imaging Jupiter and the moon!  (I was building a giant floating observatory in Minecraft...)
Maybe I'll share another screenshot or two when it's all done...I'm going to put a telescope in it, and the interdiction torches have been replaced with red lamp squares.
The choice of redwood planks was to match the ring of redwoods this sits beside.
I used Plotz for the plan - it can also do spheres!

Of course, as soon as I was set up and ready to go, clouds rolled in…I attempted to image in between the cloud banks, but it just wasn’t going to happen with Jupiter, since I needed all four color channels, and it rotates rapidly.  Also, my frame rate wasn’t very high: only about 19 fps with a small ROI (region of interest) on Jupiter, and only 1 or 2 fps full-frame on the Moon.  I couldn’t choose too small an ROI for Jupiter – I had set up the camera on an eyepiece projector with a 25mm eyepiece inside, which gave the camera a pretty magnified view.
I'll just come out and say it, it looks kind of ridiculous.

The seeing didn’t really support planetary imaging anyway, I’d say it was about 2/5.  But since I was set up, I decided to image the moon anyway, since I only need luminance for that and not also RGB (the moon is quite gray), and maybe I could get enough clear frames to get a decent result with the magic of RegiStax.  It didn’t pan out, unfortunately…but I did get to try out eyepiece projection with the ZWO, and on nights with good seeing, there is definitely potential!
Probably the most badass picture of a telescope I've ever taken.


Even in Minecraft, I accidentally started the observatory build in the level where the clouds float through, so I had to make the floor two blocks thick so that I didn't have clouds dusting the floor constantly.


Friday, April 27, 2018

#132 - Thursday, April 26, 2018 - More library telescope outreach!



Another library telescope program!  This one had a smaller crowd, but this library serves a smaller community.  There were five patrons, plus one librarian.  They seemed to get a lot out of it, and hopefully word will spread for when I do another round of these in the fall.

I had just returned from a conference earlier that day – I got home at 6 PM, ate dinner, and turned around and drove the half hour to the library!  The first leg of my flight was delayed due to a maintenance problem, so I nearly missed our connection in Charlotte.  I ran to the gate, and just managed to make it.  Then that flight was delayed as well due to another maintenance issue.  So I almost didn’t make it!

We started out inside, and I showed them the parts of the telescope and how it works.  Then I loaded up Stellarium on my laptop connected to a projector and showed them how to find a few things in the night sky.  Then we went outside, but unfortunately there was quite a bit of cloud cover.  Luckily, it wasn’t very thick, so we were still able to look at two things: the moon and Venus.  They got to practice finding things in the finderscope, moving the scope around, and focusing.  The red-dot finders take some practice; you have to figure out exactly what angle to hold your head at in order to see through them.

The waxing gibbous moon looked pretty cool through the cloud cover, and with the zoom eyepiece on that scope zoomed into 8mm, you could see that Venus had some spherical shape to it as opposed to just being a really bright star.  It was a good time as always! 



#131 - Friday, April 20, 2018 – Astronomy Night with the Brownies


In addition to the outreach I do with the local astronomy club, I am also a Girl Scout troop leader, and sometimes I do astronomy nights with local troops.  A local Brownie troop (2nd and 3rd graders) invited me to their campout at one of the nearby Girl Scout camps.  It wasn’t in a particularly dark location, so I had to choose my targets carefully; in addition, elementary schoolers are definitely not interested in looking at "dim fuzzies".  Fortunately, the moon was a nice waxing crescent, and there were a few other interesting things to look at.

I drove about an hour to the camp with my 8-inch Celestron Schmidt-Cassegrain and its NexStar alt-az mount loaded into the back of my car, and arrived about an hour before sunset.  I also brought out my new eyepiece, a 40mm focal length, 2-inch monstrosity of a Plossl made by Meade.  One of my fellow club members gave it to me.  The FOV is so large that in conjunction with my 0.63x focal reducer, I could see the secondary mirror as a darkened area in the center when I was focusing it on a nearby tree!  One of the leaders had also checked out one of the library telescopes I’ve been doing outreach events with lately, so we set that up on a picnic table as well.

When I arrived at the lodge they were staying in, I was dismayed by the height of the trees.  But they pointed me to a field that was right next door, and it was perfect!  So I got my gear set up, and then went back to the lodge to advise the girls to dress warmly.

The waxing crescent moon was already bright, as was Venus, which was high in the western sky.  I aimed at Venus first, since it wouldn’t be around for long before it sank behind the trees.  I swapped the 2-inch monstrosity back out for my 1.25” 17mm Plossl so that they could have a bit more of a magnified view.  I didn’t want to zoom in too far, though, since I hadn’t aligned the scope yet (it was too soon after sunset for stars), and aligning on a planet has given me mixed results on that NexStar mount as far as how long it will stay on the object.  Venus mostly looks like a fat, bright star, and the girls were not terribly impressed.  As they cycled through, I talked about some Venus facts, like its thick atmosphere, hot surface, and runaway greenhouse effect, which had them worried about what greenhouse gases might eventually do to the Earth.  Then I turned the scope to the moon, and first just put my 1.25” 25mm eyepiece in so they could see the whole thing; then one of the leaders asked how much I could magnify it.  With a focal length of 2000mm on the scope, and my shortest eyepiece being 6mm, I can do 333x, so that’s what I did!  The atmosphere was pretty decent, so we got a nice view of some of the craters near the terminator.
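That magnification figure is just one division: the telescope's focal length over the eyepiece's.  A quick sketch, if you want to check your own eyepiece collection:

```python
# Magnification = telescope focal length / eyepiece focal length (same units).
def magnification(scope_fl_mm: float, eyepiece_fl_mm: float) -> float:
    return scope_fl_mm / eyepiece_fl_mm

# The 2000mm SCT with a 6mm eyepiece:
print(round(magnification(2000, 6)))  # 333
```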

I took a break from my scope for a bit to show the girls how to use the library telescope so that they could try it out while I was aligning my scope, now that we had plenty of stars.  I also showed them a few of the more easily-visible constellations, like Leo and the Big Dipper, with my purple laser pointer (I have terrible luck getting the green ones to work – it seems like the frequency-doubling crystals in those things are always falling out of place).  The purple is less visible, but not bad.

I aligned on a couple of stars, and slewed to M67, an open cluster in Cancer, up and to the west of the star Procyon, about halfway between there and the constellation Leo.  It’s fairly bright, and looked nice in my SCT.  At that point, most of the girls were getting too cold, so a few of the parents took them back to the lodge, and a couple stayed behind to keep trying the library scope.  The adults were interested in seeing a galaxy, so I slewed to M51.  It was just barely discernible against the moonlight and the light pollution (an orange zone on DarkSiteFinder's light-pollution map), but after looking at it for a minute or so, it would start to come out.  I also showed them a picture so they knew what they were looking for.  You could see the two cores of M51 and its partner NGC 5195, as well as some nebulosity around M51’s core.  I explained that it is 23 million lightyears away, meaning that the light was emitted 23 million years ago and was just now being absorbed by their retinas.  It’s always amazing to realize that.

Unfortunately we were not up late enough to see Jupiter, so I haven’t had a chance to see it yet this year.  But soon, I’ll pull out the 8-inch into my front yard and do some imaging of it with my ZWO, hopefully when the Great Red Spot is visible!

All in all, a fun evening!  I was also prepped to teach them a few camp songs related to space – one about satellites, “Baby Moon,” and “Star Trekkin’,” but ended up not having time.  Darn!


#130 - Thursday, April 19, 2018 – Eye of the Needle (and Utility of the Refrigerator)


The forecasts were looking very promising for Thursday night, so I loaded up my camera gear and went out to the observatory to image!  It’s been about a month since I was last out there.  It’s “Galaxy Season,” the time of year when the winter Milky Way and all the nebulae around Orion are setting early, but when the summertime Milky Way isn’t rising until late at night.  This means we are seeing above the plane of the galaxy in the first half of the night, a great time to see the plethora of galaxies that are visible in amateur scopes, including dozens in Coma Berenices.  The Needle Galaxy is a rather bright and rather large member of that group, making it a worthy target for the astronomy club’s 5-inch refractor.  I’ve only imaged it once before – at last year’s Texas Star Party, through my 11-inch Schmidt-Cassegrain, after seeing it in a fellow club member’s Dobsonian scope and being surprised at how big and bright it was for not being a Messier object.  How did Charles Messier miss this gem??
Finally, a good selfie with this scope.

Now that spring is finally shoving winter back into its deep dark hole, night is coming later – sunset wasn’t until about 8:15 PM.  Since it was a weeknight, I could only be out until about midnight or so, but I decided to image with the ZWO ASI1600MM Pro anyway.  I decided to go ahead and try to get a full set of LRGB data, since I most likely won’t be able to get out to image again until after the Texas Star Party.  So I planned on taking an hour’s worth of luminance data and an hour’s worth of color data, which is 20 minutes on each color channel.  I stuck with my standby exposure time of 5 minutes, although next time I think I will try doing many shorter exposures, which I’ve read online is really much more in this camera’s wheelhouse.

I also wanted to try using AstroTortilla, an application that does plate solving and will slew your scope to center the object you want.  I had set it up earlier that day; during installation, you have to set the range of your fields-of-view, largest and smallest, to cover all of your gear setups.  My smallest FOV is 21 arcmin x 16 arcmin, which is my ZWO attached to my 11-inch SCT without a focal reducer (what I might do for planetary nebulae and small galaxies), and my largest is my Nikon D5300 attached to my Borg, which is about 2.6x1.8 degrees (the largest FOV on a mount that AstroTortilla could control, anyway – so this excludes my Vixen Polarie, of course).  During setup, you select the index files that not only cover this range, but on the narrow end include 20% of your smallest FOV, which for me is the 2.8-4 arcmin index file.  The smaller the FOV, the larger the index file for plate solving (the index files are the reference images of the night sky that your camera’s images are compared to), so I had to download several gigabytes of data.  But it finished just before I left the house!
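If you're setting up your own FOV range, the small-angle approximation gets you there: FOV is roughly sensor size divided by focal length.  A sketch, assuming the ASI1600's published sensor dimensions (about 17.7 x 13.4 mm) and an 11-inch SCT's native ~2800 mm focal length (neither number is from this post):

```python
import math

def fov_arcmin(sensor_mm: float, focal_length_mm: float) -> float:
    """Small-angle field of view: sensor size / focal length, in arcminutes."""
    return math.degrees(sensor_mm / focal_length_mm) * 60

# ASI1600 (~17.7 x 13.4 mm sensor) on an 11-inch SCT (~2800 mm, no reducer):
print(f"{fov_arcmin(17.7, 2800):.1f} x {fov_arcmin(13.4, 2800):.1f} arcmin")
```

This lands right around the 21 x 16 arcmin quoted above, which is a good sanity check before downloading gigabytes of index files.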

After getting all attached and powered up, a process that went smoothly, it was still light out, so I slewed to the moon.  I had to do this manually – the Gemini computer-control app contains a lot of catalogs of both bright and dim objects, but no catalog of solar system objects!  I opened up SharpCap and got all the settings how I wanted them, and I had a pretty decent frame rate going of about 15 fps, but once I hit the Capture button, it fell to like 0.5 fps and it was dropping more frames than it was saving.  I eventually found the culprit: my tablet’s hard drive was nearly full!  I had forgotten to move some darks I had been taking recently over to my desktop, and the 16-bit nearly-full-frame images I was saving off were quickly filling up the leftover space.  I went through and deleted as much extraneous stuff as I could (forgetting all about the 21 GB of movies I had on it for airline trips).  I deleted previous Windows installations and some other things, and managed to create just enough space to record a few captures of the moon, and later Jupiter, but using only 8-bit images instead of 16-bit.  I got much better frame rates after that – nearly 50 fps when I used a smaller region-of-interest for Jupiter at the end of the night.  I was worried I wouldn’t have enough space for the rest of the imaging I was planning to do that night, but I was taking long exposures of the Needle Galaxy, so I didn’t fill up as much memory.  Needless to say, I went onto Amazon and bought a 64GB micro SD card that very night to put into my tablet!
Anyway, I got some nice images of the moon!

Date: 19 April 2018
Object: Moon
Camera: ZWO ASI 1600MM Pro
Telescope: Vixen 140mm neo-achromatic refractor
Accessories: Astronomik L filter, Type 2c, 1.25"
Mount: Losmandy Gemini II
Short exposure (top): 156 frames
Long exposure (bottom): 436 frames
Exposure: 10ms (top)
   300 ms (bottom)
ISO/Gain: 0 (top)
 300 (bottom)
Stacking program: RegiStax 6


Once it was dark, I slewed to Arcturus for focusing, and the diffraction spikes from my Bahtinov focusing mask already looked pretty close since I had just focused on the moon, so I slewed over to the Needle Galaxy.  AstroTortilla did appear to start working, although I had to close out of SharpCap so that it could talk to the camera, but it wasn’t able to plate-solve the image.  With how little time I had that evening, I decided I’d try that again on a different night, and since the Needle Galaxy is bright, I was able to see it readily in Sequence Generator Pro with 3-second frames when I turned on auto-stretching the histogram.  I got it centered, and started my luminance frames.

Those went well, although they looked out-of-focus in the raw frames.  Raw FITS frames, which are what come off of astro cameras, look grainy and terrible, and sometimes also out of focus.  But I had just focused and hadn’t moved the scope very far, so I felt pretty certain they were actually in focus.  Guess I’d have to find out!
Single 5-minute luminance frame on the Needle Galaxy.  It's not as blown out as it looks - FITS files have adjustable luminance curves.

After an hour of 5-minute luminance frames, I rotated my filter wheel to red for 20 minutes, then green, then blue.  All went smoothly, and guiding looked great – I was getting only about 1.5 arcsecs of error, which is pretty decent. 

After midnight, when I was done capturing the Needle Galaxy, Jupiter was up, and I couldn’t resist!  However, it is very tiny inside that 140mm aperture, 800mm focal length refractor, so I was pretty sure it was going to come out poorly.  I took the data anyway, and upon further review, yeah it was pretty crappy.  I’ll just have to set up my 8-inch in my front yard soon!  Luckily, you can image planets out to Saturn easily under even a full moon because of how short of exposures you need, so I can do that next week if we have some clear nights that aren’t freezing.

I didn’t get a chance to process the Needle Galaxy image for a few days, but I stayed up late the following Saturday night and rush-processed them before heading off to a work conference so I could write the blog post :D  I didn’t think it was going to turn out that well anyway, since I only got four frames per color channel, which gives poor signal-to-noise ratio for the color channels.  I also didn’t have a set of bias frames for the temperature I ran the camera at (-25C) yet, but I decided to stack without them anyway.  I ran into a snag early on though – a weird issue with the new version of DeepSkyStacker that just came out, 4.1.0.  I was really excited about it because they finally had a 64-bit version, which means it should be able to use a lot more RAM.  I stacked the luminance, red, green, and blue frames, saved them out as 32-bit TIFFs, and imported them into Photoshop.  I zoomed in to see how well the detail on the luminance channel came out and saw that the image was full of holes!  Missing pixels everywhere.  So I went back to DSS and turned off hot pixel removal, just in case (I hadn’t had troubles with that before), but got the same result. 
                

I thought next that maybe it had to do with the lack of bias frames, and since I can control the temperature of my camera and am therefore not at the mercy of the exact outdoor temperature, I decided to throw the camera out on the back porch like I always do to take darks and just grab a set of bias frames real quick.  Bias frames are taken at the same gain and temperature as your light frames, but with the shortest available exposure time, so they’re quick to acquire.  However, it was kind of warm outside, and my dryer was running, blowing hot air right onto my dark-capture spot!  So the only alternative was to put my camera in the fridge!

The door still closed easily over the cables, so I let it cool, grabbed the frames, and was back in business!

So I re-stacked just the luminance frames with the biases, aaaaaand still had holes in my picture.  That left two possibilities: either my newly-acquired flat frames from a few weeks ago were bad, or something was wrong with the new version of DeepSkyStacker.  To check, I launched the older version of DSS and processed the images – and they came out fine, no holes!  Now I need to get onto the DSS forum and report the problem, and see if anyone else is having it, probably over on Cloudy Nights.

I stacked all the channels in the old version of DSS, stretched them in Photoshop, and saw that my flats didn’t flatten very well, so I had to make synthetic flats.  (I’ll write a blog post on that soon, I promise!)  I combined the LRGB all together, adjusted levels and curves to get the color balance and contrast I wanted balanced against the high noise that’s a result of the low number of frames, ran the blue halo removal routine in Noel Carboni’s astronomy tools for Photoshop toolkit (which works well on my DSLR images, but less well on my astro camera images, it seems), and here it is!
Date: 19 April 2018
Object: NGC 4565 Needle Galaxy
Camera: ZWO ASI 1600MM Pro
Telescope: Vixen 140mm neo-achromatic refractor
Accessories: Astronomik LRGB Type 2c filters
Mount: Losmandy Gemini II
Guide scope: Celestron 102mm
Guide camera: QHY5
Subframes: L: 12x300s (1h), 1x1
   R: 4x300s (20m), 1x1
   G: 4x300s (20m), 1x1
   B: 4x300s (20m), 1x1
   2h total
Gain/ISO: 139 (unity)
Stacking program: DeepSkyStacker
Stacking method (lights): Auto-Adaptive Weighted Average
Darks: 20
Biases: 30
Flats: 30
Temperature: -25C (chip), 45F (ambient)


Kind of noisy, and there are some weird dark halos around the stars that kept me from blackening the background as much as I would have liked, but not bad coming from my light-polluted area on an achromatic refractor!  I’ll most likely image this again at the Texas Star Party with this camera on my 11-inch SCT to try and get better detail.  This time I’ll try many shorter exposures, though, to see how that works, since this is a brighter target that could probably support that method.


Thursday, April 19, 2018

#129 - Wednesday, April 11, 2018 - Library Telescope Program

Last year, my astronomy club furnished two small telescopes to one of the local libraries.  More and more astronomy clubs and libraries are teaming up to bring astronomy to the public this way, and it's fantastic!  Library patrons can check out a telescope to bring home for a week and do some astronomy from their backyard!  Several other libraries in the area thought they were awesome, so they bought their own.  One of the club members modified the scopes to make them more user-friendly and less likely to lose parts - the eyepiece gets glued in, strings attach the objective cover and eyepiece cover to the mount, a plastic plate is put over the end so the primary mirror doesn't get messed with, the button-cell battery in the red dot finder is replaced with two much more easily-accessible AAA batteries, etc.  Instructions and star charts are also included and are clipped onto the mount and attached with string.

The telescopes our local libraries have, and that I think many other libraries are using as well, are Orion StarBlast 4.5s, which are 4.5-inch Newtonian telescopes on tabletop alt-az mounts.

The first program I did with these was actually last August, just before the solar eclipse, as part of a "Millennials" series one of the libraries was putting on.  (Some decidedly non-millennials joined as well!)  I talked about the upcoming eclipse, how to watch it, and then also about the library telescopes, including a demo and a look at various objects.  The same library invited me back again on March 22nd, and I showed a group of about 15 people on a chilly but clear night at a local park how to use the telescope, and then we looked at a few things, such as the Pleiades cluster and the Mizar-Alcor double star.  I was actually rather impressed with what we were able to see with this scope!  And it was very simple to use.  I was glad for this, since as an imager, I have to shamefully admit I only have a rough idea of where the things I image are.  (The Rosette Nebula is somewhere off Orion's right shoulder...M81 and M82 are in the north somewhere...)  But I was able to find things pretty readily (I took some notes on where to find things using my SkySafari app, and since that scope has a fairly wide field-of-view, I just sort of hunted till I found it...we're mostly looking at bright things anyway!)

I liked this program enough that I got in touch with the four other libraries in the area who had purchased their own, and I have done this at two of them so far - two to go!  One of them we had to hold indoors, however, due to cloud cover and rain, so I'm not counting that toward my log.  (And I forgot to log the first one!)

For the indoors one on April 10th, I showed patrons (we had a group of nearly 30, including kids, which was amazing for that small community and the fact that it was cloudy!) how to use the scopes, and had them look through the eyepiece just at the wall to get a feel for it.  Then I pulled up Stellarium on my laptop and showed them how to find things like the Double Cluster, Orion Nebula, Polaris, and Mizar, and pointed them to apps and websites where they could find star charts and interactive sky maps.  

On April 11th, I did the program with a group of about 8 at another local park.  It was chilly, but not too bad.  We did have some fairly significant cloud cover roll in, however.  Fortunately, Sirius poked its face out through a thin section of clouds, so the patrons got to practice using the scope on an object that was very easy to find.  They seemed to get the hang of it, and I think they feel much more comfortable with the scopes now (several had checked them out previously, but had trouble figuring out how to use them).  There were also two young boys (maybe 5th grade) who asked some really smart questions on astronomy and astrophysics, like what is dark matter and what does it do, what will happen to the sun when it dies, things that I was more than happy to dive into and go way over time!  

I've got another program next week, and then one in July sometime!  And I'll probably repeat some of these with the libraries again in the fall.  I love doing outreach!


Friday, April 6, 2018

Combating Noise in Astro Images

When it comes to measuring anything, whether that be recording sounds, taking images, or scientific measurements in a laboratory, noise is an inescapable part of that measurement.  While nature adheres quite closely to the laws of physics, nature is also messy and chock full of uncertainty.

It is impossible to measure something with infinite accuracy - the exact point will always be a little bit spread out.  In addition, some level of background noise is always there.  In daytime photography, your signal will usually vastly overwhelm your noise, so the noise is not noticeable - think about the background sound of a ticking watch while in a crowded cafeteria.  But in astrophotography, our signal is very weak, oftentimes just barely above the noise, especially if you are shooting from a light-polluted location or with a camera operating at ambient temperature.

I have often had people show me some attempts they've made at astrophotography, and how disappointed they have felt when their single images are noisy and dark.  The key is, you can't stop at a single image - you've got to get statistics to work for you rather than against you.  Imagine drawing conclusions about how wind affects gas mileage when you have only driven one mile!  You have to increase your sample size in order to increase what's known as the signal-to-noise ratio, or SNR for short - how far above the noise your signal is, or how much more prominent the deep sky object (DSO) is over the background light and noise of your image.  When it comes to a good astrophoto, it's all about SNR.

Sources of Noise

There are several sources of noise:
Dark current - The basic function of a camera sensor is to convert photons into electrons.  The electrons generated over the length of a single exposure are held in each pixel, and then the accumulated charge is converted to a digital value of intensity, which is stored in the image file.  Cameras are not only sensitive to visible light, however - ambient heat and heat generated in the battery and circuitry can also create electrons in the pixels, generating a false signal.  This is known as dark current.  If you put the lenscap on your camera, you can record that dark current.  It is random, however, since who can say whether a given pixel will be the one to record a bit of that heat?  In general, the intensity of the dark current will increase with the exposure time (so doubling the exposure time will double the intensity of the recorded dark current), and it will also double for about every 11 degrees Fahrenheit (6 degrees Celsius) of temperature increase.  Some pixels will also be more sensitive to this noise than others.  Below is an example dark frame, a 5-minute exposure on my Nikon D5300 at ISO-3200 and 72 degrees F.  The top image is the raw frame, but it can be hard to see (especially when it's converted to jpeg as it is here), so on the bottom is a brightened version.

 An example dark frame from my DSLR (top), and a brightened version of the same frame (bottom).
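The two scalings above (linear in exposure time, doubling every ~6 C) make a handy rule-of-thumb function.  A minimal sketch - the 6 C doubling constant is an approximation that varies from sensor to sensor:

```python
def dark_scale(exposure_s, temp_c, ref_exposure_s, ref_temp_c, doubling_c=6.0):
    """Dark signal relative to a reference exposure/temperature:
    linear in exposure time, doubling every ~6 C (rule of thumb)."""
    return (exposure_s / ref_exposure_s) * 2 ** ((temp_c - ref_temp_c) / doubling_c)

# Twice the exposure at the same temperature -> 2x the dark signal:
print(dark_scale(600, 20, 300, 20))  # 2.0
# Same exposure, 12 C warmer -> 4x the dark signal:
print(dark_scale(300, 32, 300, 20))  # 4.0
```

This is also why cooled astro cameras (and fridges!) matter: dropping the sensor 24 C cuts the dark current by roughly a factor of 16.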

Read noise -  Wherever there is electronic circuitry, there is electronic noise.  Some of it shows up as a random pattern, since noise is an inherently random process.  Some of it will show up as a fixed pattern, since variations in the manufacturing process will cause some pixels to run "hotter" than others, or have some low level of baseline charge (known as offset or bias).  Some of it may show up as horizontal or vertical lines in your image, especially if you are using a CCD camera, since those are read off row-by-row instead of pixel-by-pixel.

Quantization error - The charge that is held in each pixel is an analog signal.  In order to create a digital image, however, it must be digitized and turned into a number - it must be measured.  This is done using an analog-to-digital converter (ADC).  Different cameras record images at different bit depths.  My new ZWO ASI1600MM Pro records images at 12 bits.  This means that 0 is black (no signal was collected), and 2^12 - 1, or 4,095, is white (the pixel was saturated, meaning it held as much charge as it could store).  This means that the images saved out from the camera can have 4,096 shades of intensity.  Now, my camera has a well depth of 20,000 electrons, meaning it can hold 20,000 electrons before it saturates and can't hold any more.  This means that every level of intensity is a difference of about 5 electrons.  If we pretend the camera has perfect quantum efficiency (how well it converts photons to electrons), this means you cannot differentiate between parts of an image that differ by fewer than 5 photons, since they will appear the same brightness.  This loses you that subtle detail in knots and Bok globules in a nebula, for instance.  Now, you could decrease the gain (or ISO) on your camera until the pixel could only hold 4,096 electrons, but then you lose dynamic range, since the separation between the brightest and darkest parts of your image is much smaller.  So it's a tradeoff.
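The "about 5 electrons per step" figure is just well depth divided by the number of ADC levels.  A quick sketch of that arithmetic:

```python
def electrons_per_adu(full_well_e: float, bit_depth: int) -> float:
    """Charge represented by one step of the analog-to-digital converter."""
    return full_well_e / (2 ** bit_depth)

# A 20,000 e- full well digitized by a 12-bit ADC (4,096 levels):
print(electrons_per_adu(20_000, 12))  # 4.8828125 e- per step
```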

Shot noise - Shot noise has less to do with the camera and more to do with the uncertainty that exists at the most fundamental levels of physics.  Due to how dim and distant the things we image as astrophotographers are, the photon flux, or rate at which photons arrive at your camera, is quite low.  Imagine you are standing in a rainstorm; the raindrops are coming frequently enough that you can't tell the difference in how many raindrops are hitting you per second.  But if it's just beginning to rain, you have no idea when the next raindrop is going to arrive - a half second later, five seconds later, etc.  The same goes for photons - at these low rates, you don't know when they're going to arrive.  In one frame, 5 photons might hit your camera; in the next, it could be 12, or 3.  It may surprise you to know that the shot noise actually increases with intensity - but only as its square root.  If you have 10 photons get absorbed by your camera in one frame, and then 100 in another, your signal has increased 10 times, but your noise has only increased by a factor of sqrt(10), or about 3.2.  So even though the noise is higher, the signal is much higher still.

Quantum and Transmission Efficiency - Quantum efficiency is essentially how good the sensor is at converting photons to electrons.  There is no guarantee that a photon striking the detector will get converted to an electron.  If it isn't, then it is lost.  The fewer signal photons you collect, the harder it is to distinguish your signal from the noise.  Transmission efficiency is how much light makes it through all of the optics and filters between your camera and the sky.  Generally, telescope optics are high quality and have high transmission.  Color filters, however, can lessen your signal.  This is especially true in DSLRs, where the Bayer matrix (the array of red, green, and blue filters laid over the top of the sensor so that you can image all three colors at once) can have relatively low transmission.  I borrowed the chart below from one of Craig Stark's presentations on astrophotography (found here, and it is a great resource!).
The top graph compares a monochrome CCD camera, the QSI 540, to its color version, the 540c.  The bottom chart compares the monochrome QSI 540 to a Canon 40D/50D DSLR.  As far as the filters themselves go, independent of the camera sensor (the charts above include the camera's responsiveness to those wavelengths), they can have nearly 100% transmission, such as with the Astronomik LRGB Type 2c filters I have (and love).
Again, fewer photons means it's harder to distinguish the signal from the noise.

Light pollution - Light pollution isn't quite the same as the other sources of noise, but it can certainly result in a lot of problems.  Light from a nearby town reflects off particles in the atmosphere, and the camera will capture that as well.  Think about trying to see a dim cell phone screen out in full daylight versus in a dark room - it's a lot harder to get much contrast in the galaxy you're trying to image when the background light is just as bright!  In order to get more signal, people will usually turn up the gain or ISO on their cameras, which increases the camera's sensitivity not only to light, but also to heat and read noise.  In addition, remember shot noise?  Light pollution is also a source of shot noise, except it doesn't contribute to your signal (the object you're imaging), so it's only adding additional noise.
Take a look at the two images below - the top one was taken in a suburban/rural transition sky (5 or yellow on the Bortle Scale), and the bottom in the dark skies of west Texas.  Note that the top image used a light pollution filter - the useful thing to look at here is the contrast between the object and the background.
5-minute frame, ISO-1600, from a light-polluted location

 6-minute frame, ISO-1600, from a much darker location

Okay, so the bottom frame has a one-minute-longer exposure time, but you get the picture.

How can we ever hear the beautiful music over all of this noise??

It sounds like a daunting task!  Don't fret, however - we have the power of digital processing at our fingertips.  Stacking, calibration, and post-processing are extraordinarily powerful tools that let us turn noisy messes into beautiful recreations of the universe's many wonders.

I've got oodles of examples of subframes that look noisy and terrible, and processed frames that look way more awesome.  As far as distinguishing the target from light pollution, though, this one takes the cake.

I have wondered for some time now, is it possible for me to quantify how much better the images get with stacking and processing?  The answer is yes, of course, if you don't mind a little math!

Stacking

The whole point of stacking (see this post for a tutorial on how to stack astro images) is to increase the certainty that the light in a given pixel is "real" and not noise or light pollution.  If you have one picture, and you look at a given pixel, you may not be able to tell.  But if you have 20 pictures, and in 19 of them the pixel is just about the same color and brightness, you can be pretty sure that that is the real value of the pixel.  Stacking is a statistical process that increases that certainty, and the bottom line is you get an increase in the SNR.  In general, your SNR increases by the square root of the number of frames you stack, but you can see much larger gains by applying calibration frames and doing some post-processing, as you will see evidence for in a moment.
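The square-root law is easy to see in a quick simulation.  This is just an illustrative sketch with made-up numbers (a constant "true" pixel value of 50 plus Gaussian noise) - real stacking software also aligns frames and rejects outliers:

```python
# Simulate the square-root law: average N noisy "frames" of a constant
# signal and watch the SNR grow by roughly sqrt(N).
import numpy as np

rng = np.random.default_rng(42)
signal = 50.0        # true pixel value (arbitrary, for illustration)
noise_sigma = 10.0   # per-frame noise (standard deviation)

def stacked_snr(n_frames, n_pixels=100_000):
    # Each frame = signal + Gaussian noise; the stack is the per-pixel mean.
    frames = signal + rng.normal(0, noise_sigma, size=(n_frames, n_pixels))
    stack = frames.mean(axis=0)
    return stack.mean() / stack.std()

for n in (1, 10, 44, 88):
    print(f"{n:3d} frames: SNR ~ {stacked_snr(n):6.1f} "
          f"(sqrt-law predicts {signal / noise_sigma * np.sqrt(n):6.1f})")
```

A single simulated frame comes out around SNR = 5 (50/10), and each stack lands close to the sqrt(N) prediction.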

Calibration

There are two kinds of calibration images that will help you reduce noise - darks and biases.  (Flats reduce vignetting, or the darkening of the corners of your image, so I'm not including those here.)  Darks record your dark current, and biases record your read noise and fixed pattern noise.  It is important to note that your dark frames will also contain read/fixed pattern noise, but apps like DeepSkyStacker handle the subtraction so that the biases don't get subtracted twice.  For more on how to capture calibration frames, see this post.  Now, remember how I said noise is random - you can't take one dark and one bias frame and just subtract them.  The distribution of noise moves around like static on an empty analog broadcast TV channel.  Here again we get some help from statistics.  You take several dark and bias frames, and then DeepSkyStacker or your other favorite stacking app will average them and subtract a "master" from your stacked image (where the noise has also been averaged).  In Gaussian statistics, which describe most noise sources in nature (the classic bell-shaped curve), the average value approaches the truth.  If you have a noisy camera (like a DSLR), you're going to want more dark frames.  I find that DeepSkyStacker struggles with more than about 60 (and that's if you've undergone the process of expanding its RAM-usage capability from 2 GB to 4 GB - see this website for how to do it (it does require Microsoft Visual Studio, but the Community Edition (the free one) will do it)).  (Wow, nested parentheses!)  I usually use 20, and I wouldn't go fewer than 10.  (Again, the square root law applies here - the mainstay of Gaussian statistics.)
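Here's a toy sketch of what the master-dark subtraction accomplishes, using synthetic data (the fixed patterns and noise levels are made up for illustration - real tools like DeepSkyStacker also align frames and handle the bias bookkeeping for you):

```python
# Toy dark-frame calibration with synthetic data.
import numpy as np

rng = np.random.default_rng(0)
shape = (64, 64)
bias_pattern = rng.uniform(90, 110, shape)   # fixed pattern in every frame
dark_current = rng.uniform(0, 20, shape)     # thermal signal, per exposure
sky = 40.0                                   # the "real" signal we want

def shoot(signal):
    # One exposure: fixed patterns + signal + fresh random noise.
    return bias_pattern + dark_current + signal + rng.normal(0, 3, shape)

# Master dark: average many dark frames (shutter closed, so signal = 0)
# so the random noise averages out, leaving the repeatable pattern.
master_dark = np.mean([shoot(0.0) for _ in range(20)], axis=0)

light = shoot(sky)
calibrated = light - master_dark  # the dark already contains the bias,
                                  # so the bias is only subtracted once

print(np.mean(calibrated))  # close to 40, the sky signal
```

Note how the fixed patterns vanish almost completely, while the random noise only shrinks - that's why you still stack many light frames on top of calibration.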

Other considerations

Having a cooled camera sensor makes a world of difference, as I have already begun to see in the first images from my ZWO ASI1600MM Pro.  Your dark current diminishes dramatically, which is a huge source of noise.  The combination of a cooled sensor, longer subframes, more subframes, a lower-read-noise chip, and much higher quantum efficiency and transmission efficiency combined to drastically decrease the noise between these two images of the Orion Nebula.  
Nikon D5300, 12x60s frames at ISO-1600, taken on an Orion ST-80 achromatic refractor, ambient temperature = 36 F (2 C)

ZWO ASI1600MM using Astronomik LRGB filters, total 113x60s frames, gain=unity (139), taken on a 140mm Vixen neo-achromat refractor, sensor temperature = -30C (-22F)

Light pollution

Darker skies will give you greater contrast, making it much easier to pick up dim details (see the Whirlpool Galaxy images above) and enabling you to distinguish very dim signal from the noise and background light.  They will also decrease the extra shot noise added by light pollution.

All right, show me the numbers!

For my experiment, I chose a dataset where I actually had enough subframes to measure the difference in SNR in stacking greater numbers of subframes - usually I am rather impatient and don't gather more than about 25 subframes.  I picked M8-M20 #2, an image of both the Lagoon and Trifid Nebulae (M8 and M20) captured on the back side of Casper Mountain on August 17th, 2017 while I was there for the solar eclipse.  It was taken with my Nikon D5300 attached to my Borg 76mm apochromatic refractor, using a Hotech SCA field flattener.  The telescope was attached to my Celestron NexStar SE mount (chosen for this trip so I wouldn't have to polar align it before dawn the morning of the eclipse, since this mount is an alt-az mount).  The subframes are 30 seconds long (the NexStar has some serious periodic tracking error) and ISO-1600.  The temperature was between 53-55F over the course of the acquisition of that dataset.  All stacks were done in DeepSkyStacker, and with the exception of the image I stacked without calibration frames to measure the difference in SNR, I used 20 darks, 20 biases, and no flats (I didn't have any for that scope yet, it was fairly new to me).  I stacked 10, 44, and 88 frames, and then stacked 88 without the calibration (dark and bias) frames (all using the auto-adaptive weighted average stacking option, my preference as of late).  I saved out the raw 16-bit TIFFs with changes embedded, not applied, and didn't do any adjustments in DSS.  In Photoshop, I stretched the histograms of the stacked images; for the 88-frame stack with calibration files, I used my post-processed, completed image.  I didn't make any additional adjustments to the 10- or 44-frame stacks, or the 88-frame stack without calibration.

The method of calculating SNR is quite simple: take a sample of the image over a flat area (on your DSO, since that's what you care about the most), so either in a nebulous region that isn't changing brightness much and doesn't include any stars, or between spirals of a galaxy, or something like that.  Grab the mean and standard deviation of that area.  SNR = mean / standard deviation - that's it!  It's a unitless value, although if you are into radio or other kinds of signal transmission and love your decibels, you can convert SNR to decibels using dB = 20*log10(SNR).  (log10 = log base 10, if that wasn't clear).  
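In code, the whole measurement is two one-liners.  This assumes you've cut your flat sample region out of the image as a NumPy array (loaded with PIL, imageio, or whatever you like):

```python
# SNR of a flat sample region: mean over standard deviation,
# plus the decibel conversion for the radio folks.
import numpy as np

def snr(patch):
    # patch: array of pixel values from a flat region of the DSO
    return np.mean(patch) / np.std(patch)

def snr_db(patch):
    # Same value in decibels (amplitude convention): 20 * log10(SNR)
    return 20 * np.log10(snr(patch))
```

Plugging in a region with mean 20.15 and standard deviation 5.07, for example, gives an SNR of about 3.97, or about 12 dB.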

Now, you need all of the images you're comparing to be positioned the same way so that you are sure to grab the exact same area of each image, since the SNR will change depending on where you sample it.  I did this by opening each of the images I compared in Photoshop, cropping them to the same size, and then copying them all as layers over one of the images.

The Layers pane in Photoshop (if you don't already see it in the lower right corner: Window -> Layers)

The order these are in does not matter.  I turned off viewing all of the layers except for the base layer (my finished image) and one image at a time by clicking the eyeball box to the left of the layer thumbnail, clicked on the image I wanted to align (in the screencap above, the "single frame" layer), and turned the opacity down to about 50-60% (the "Opacity" selector box).  Then I hit Ctrl + A for Select All, and clicked on the Move Tool on the left panel.  Then I used the arrow keys to nudge the image until it was in line with the base image by looking at the stars.  Get it as close as you can, knowing that there is probably some sub-pixel shift and you won't be able to get it exactly.
"Single frame" not aligned with "Complete"
 The two images are now aligned (or are at least close)

Repeat for any other versions of the image you want to compare - different numbers of stacked frames, calibrated vs not calibrated, using different stacking methods, using a different number of darks and biases, etc.  Hit Ctrl + D when done to deselect.

Next, zoom in on your DSO, and try to find a flat area without stars or much change in brightness.  The larger your sample, the better, but if it's too large you'll get true variation in the DSO, which will skew your measurement; I used a 25x25 pixel box.  This is 625 pixels total; the square root of that is 25, which is 4% of 625, so I'll have an inherent 4% error in my SNR measurements (less than 10% is safe, by rule of thumb).  You can set the size of the selection box by clicking the Rectangular Marquee Tool (the dashed-box-shaped icon on the left panel), changing the "Style" to "Fixed Size," and setting your width and height.  Click where you want to place the selection box.  I use the brightest image of my comparison set so I know what I'm looking at when choosing the area - in this case, the completed image (again by clicking the eyeballs to toggle viewing the layers).


If you go to Window -> Info, you can have a panel open that shows your coordinates (in inches by default, although you can click the options button (the three lines in the corner of the panel) and change it to pixels, centimeters, what have you).  I'd record these wherever you are writing down your measurements for future reference, and I use the upper left corner of the selected area.  

Finally, in the Histogram panel, click the options button (three lines) and select Expanded View and Show Statistics.  This will show you the statistics you need.


In Source, select "Selected Layer" so that you're only measuring the image layer you have selected.  In Channel, select Luminosity.  This is so you aren't taking color variances into account.  The human eye notices differences in luminance far more than chrominance anyway.
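If you're curious what the Luminosity channel is doing under the hood, it's (approximately) a weighted sum of the RGB channels.  The exact weights Photoshop uses aren't documented here, so the Rec. 601 luma coefficients below are an assumption - the point is just that measuring luminance collapses color variation into a single brightness plane:

```python
# Approximate luminance from RGB using Rec. 601 luma weights
# (an assumption - a stand-in for Photoshop's Luminosity channel).
import numpy as np

def luminance(rgb):
    # rgb: array with shape (..., 3); returns a single luminance plane.
    return rgb @ np.array([0.299, 0.587, 0.114])
```

The weights sum to 1, so a pure white pixel (1, 1, 1) maps to a luminance of 1, and green counts far more than blue - matching how the eye weighs brightness.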
Now we are ready to roll!

First, here is a raw, single frame (converted to jpeg of course).

It is quite dark.  If you zoom in, you will see the noise.
Single raw frame.  You will also see how good the tracking is on my NexStar mount. (sarcasm)

I recorded the mean of that square to be 20.15, and the standard deviation to be 5.07.  Dividing those two, we get a SNR of 3.97.  This means that your signal doesn't rise very far above your noise, a fact that is easy to see in the image above.

Next, I selected the layer that is my stack of 10 frames, and recorded a mean of 43.56 and standard deviation of 2.72, which yields SNR = 16.01.  Just by stacking 10 frames and doing dark and bias subtraction, we have quadrupled our signal to noise ratio!

Stack of 10 frames, dark and bias subtracted.

Now, statistics says a stack of 10 will only get us a SNR increase of sqrt(10) = 3.16, and 3.97 x 3.16 = 12.5.  But again, dark and bias subtraction help us out.

Next, I measured a stack of 44 frames (half of the maximum), and recorded a mean of 35.99 and standard deviation of 1.58.  This means a SNR of 22.78.  This is a 1.4x increase.  You can see our diminishing returns happening already, but it's still a solid increase.

Stack of 44x30s frames, dark and bias subtracted.

Finally, I measured my stack of 88 that I post-processed - stretched the histogram, adjusted the light curves, adjusted the color balance (not important for noise), clipped the left end of the histogram (important for background and dim noise reduction, although you also lose real signal that's lost in the mix), and ran a denoising/blurring algorithm.  I recorded a mean of 99.07 and standard deviation of 2.45, which gives us a SNR of 40.44.  Fantastic!

By stacking, calibrating, and post-processing, we have increased the signal-to-noise ratio by 10 times!  And it shows - compare the single raw frame to the final product.  

As with all good research, let's summarize the results in a nice Excel table.
"Increase from single sub" here is the factor of the increase of the SNR - SNR of the frame / SNR of the single frame

I am going to look into some image quality metric tools as well, but I wanted to do this as a warm-up.  What a fun exercise!

Bottom line: Signal-to-noise ratio matters a lot!  Cameras are noisy, but we can beat that down with the power of statistics and get some really nice-looking space images, even though the odds can seem to be against us.  

Whew!  That was a long post.