After borrowing my minion Miqaela's Celestron Advanced VX mount for the Texas Star Party, I discovered that it works really well with scopes that have a short focal length, like my 3-inch Borg refractor. And if I can get it well polar-aligned and guiding, it could very well handle my 8-inch SCT. As a present to myself for winning the Astronomical League award and having my corona composite image selected as AstroBin's Image of the Day (Sept 4, 2017), I hopped onto Astromart to see if anyone happened to be selling an AVX. Well, as luck would have it, a gentleman in Massachusetts was selling one he'd only used a couple of times for $600 - $300 off the new price! So heck yeah, I jumped on that. It arrived Friday (and was very well packaged, I might add), and while it was of course cloudy Friday night (we had our club meeting that night anyway), it cleared up nicely for Saturday night - which must be the first time in the history of astronomy that getting new gear didn't cause a week of cloudy nights.
After a couple of hiccups (like plugging the DEC cable in the wrong direction - there is no labeling on the cable for which way the electrons need to flow!), I got it up and running. Woohoo!
Now, normally I wouldn't test out two new things at the same time - that's just bad science. But since I'd used the AVX before with my Borg refractor, I decided to just go for it and also use the moonlit night to test out a CCD camera I'm borrowing from club member Phil - an SBIG ST-8300M. It's a monochrome camera with a CCD chip.
[ Brief aside - CCD vs CMOS ]
So what is the difference between CCD and CMOS anyway?
First, a couple definitions.
CCD stands for charge-coupled device. CCD chips have been around a long time - the technology was invented in 1969 at Bell Labs. Basically, light strikes the photoactive layer (a layer of silicon) and is converted to electrons, and a capacitor for each pixel accumulates charge based on the amount of light that falls onto that pixel. Once the exposure is complete, each capacitor dumps its charge into its neighbor, on down the line until it reaches the last capacitor in the row, where the charge is dumped into a voltage amplifier. This is done sequentially until all of the pixels have been read off and digitized, and then the computer displays that information as a picture.
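That "bucket brigade" readout can be sketched in a few lines of Python. This is purely illustrative - a real CCD shifts whole rows in parallel into a serial register before clocking them out - but it captures the idea of charge packets marching one step at a time toward a single output amplifier:

```python
def ccd_readout_row(row):
    """Toy model of serial CCD readout for one row of charge packets.

    Each clock cycle, the last capacitor in the row dumps its charge
    into the output amplifier, and every remaining packet shifts one
    place toward the output. Returns the values in readout order.
    """
    row = list(row)
    readout = []
    while row:
        readout.append(row.pop())  # last capacitor dumps into the amplifier
        # (the pop() also models every other packet shifting one place over)
    return readout

# The pixel nearest the amplifier comes out first:
# ccd_readout_row([10, 20, 30]) -> [30, 20, 10]
```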
CMOS stands for complementary metal-oxide semiconductor, and that is the type of sensor you will find in consumer cameras and cell phones. It functions similarly to a CCD chip, except that each pixel has its own amplifier.
They each have their advantages. CMOS chips are cheaper to manufacture and consume less power, but they typically have higher noise and are less sensitive. CCD chips are more expensive, but have higher sensitivity and lower noise profiles. However, CCD chips are more prone to blooming effects when pixels are saturated (the "potential well" fills up with electrons and can't accept any more, so the charge bleeds over into neighboring pixels).
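To make the blooming idea concrete, here's a toy one-dimensional model. The full-well value and the even split of excess charge between the two neighbors are made-up assumptions for illustration, not real chip physics:

```python
# Toy 1-D model of CCD blooming: each pixel's well holds at most
# FULL_WELL electrons; any excess spills equally into the two
# neighboring pixels (charge spilling past the array edge is lost).
FULL_WELL = 100  # assumed well depth, in arbitrary units

def apply_blooming(pixels):
    """Redistribute charge until no pixel exceeds its well capacity."""
    pixels = list(pixels)
    changed = True
    while changed:
        changed = False
        for i, q in enumerate(pixels):
            if q > FULL_WELL:
                excess = q - FULL_WELL
                pixels[i] = FULL_WELL
                if i > 0:
                    pixels[i - 1] += excess / 2
                if i < len(pixels) - 1:
                    pixels[i + 1] += excess / 2
                changed = True
    return pixels

# A saturated star bleeds into its neighbors:
# apply_blooming([0, 300, 0]) -> [100, 100, 100]
```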
For daytime imaging, noise isn't usually an issue - bright sunlight vastly dwarfs the noise of the chip, and you have no shortage of signal, so the lower light sensitivity of CMOS isn't an issue. But for astrophotography, it becomes a big issue, since you have very low signal coming from those dim fuzzies.
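Some rough shot-noise arithmetic shows why. All the electron counts below are made-up illustrative numbers, but the shape of the result holds: signal grows faster than its own noise, so a bright daytime pixel swamps a fixed read noise while a faint nebula pixel does not:

```python
import math

def snr(signal_electrons, read_noise):
    """Signal-to-noise ratio with shot noise (sqrt of signal) plus a
    fixed read noise, added in quadrature. Illustrative model only."""
    shot_noise = math.sqrt(signal_electrons)
    return signal_electrons / math.sqrt(shot_noise**2 + read_noise**2)

READ_NOISE = 10.0  # electrons RMS - an assumed, typical-ish value

daylight = snr(50_000, READ_NOISE)  # bright daytime pixel: SNR ~ 223
nebula = snr(100, READ_NOISE)       # faint deep-sky pixel: SNR ~ 7
```

With the faint target, the read noise is as large as the shot noise itself, which is why sensor noise matters so much more for astrophotography.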
Most astrophotography cameras you will find out there are CCDs. However, with the huge demand from cell phone consumers for high-quality cameras, especially in low-light conditions, CMOS chips are quickly closing the gap on quality, sensitivity, and noise. In fact, the astrophotography camera I'm thinking about purchasing in the not-too-distant future is the ZWO ASI1600MM, which is actually a cooled CMOS chip instead of a CCD, and it has less noise than many CCD chips (at least until you get to super-expensive cameras).
As far as astrophotography cameras go, another big advantage of CCD cameras over DSLRs is the fact that they're usually cooled, or at least the ones meant for deep-sky imaging are. This greatly reduces the noise. For example, the SBIG I'm borrowing has both a fan and a TE (thermoelectric) cooler that can drop the chip up to 35 degrees C below ambient temperature. So last night, since it was 50F outside (10C), I set it to -20C. The CMOS-chipped ZWO also has this two-stage cooling.
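A common rule of thumb is that sensor dark current roughly halves for every ~6C of cooling. The exact doubling interval is chip-dependent, so treat the numbers below as a sketch, not a spec:

```python
# Rule-of-thumb dark current scaling: halves for each ~6 C of cooling.
# The 6 C doubling interval is an assumption; real chips vary.
DOUBLING_DELTA_C = 6.0

def dark_current_factor(delta_c):
    """Fraction of dark current remaining after cooling by delta_c degrees C."""
    return 0.5 ** (delta_c / DOUBLING_DELTA_C)

# Cooling from +10 C ambient down to -20 C is a 30 C drop:
factor = dark_current_factor(30.0)  # 0.03125, i.e. ~97% less dark current
```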
All right, so how'd the pictures come out?
I tested out the SBIG at my house about a week or two ago to make sure I could get it to talk to the computer. It's an older model, and SBIG no longer makes the software used to control it - CCDOps - available. They're instead pushing their expensive TheSky software on people. Fortunately, Phil still had the disks, so I was able to get CCDOps installed, and Sequence Generator Pro will run the camera as well. CCDOps is helpful, though, for its focusing routine and other features. It can grab single frames, but can't run a sequence. (I'm still running the Lite version of Sequence Generator Pro, which still lets you run sequences, view your images, set the sensor cooling temperature, and do lots of other stuff; you just can't access some of its other features, like PHD integration. I'll buy the full version later, when I actually buy a camera.)
Since I couldn't get guiding to work, I decided to take a few test frames anyway to see what the sensitivity was like. I aimed for M16, the Eagle Nebula, which, being mostly hydrogen-alpha emission, would look nice in monochrome - I didn't want to mess with color filters on the first night out. This is a 60-second image taken using Sequence Generator Pro.
I was surprised by how dark it was! Just to check, I took one in CCDOps using its Grab tool, and I could actually see it:
There it is! I figured that CCDOps must just be stretching the histogram a lot so you could see it, and that the data really was there in the dark image - it was just hard to see. By the time I got to this point, though, M16 was sinking into the muck, and I still wasn't sure whether it would be bright enough for my testing, so I slewed instead to M27, the Dumbbell Nebula, a much brighter target. After using single frames grabbed in CCDOps to get it mostly centered, I took a series of 15 three-minute exposures with Sequence Generator Pro, hoping that I wouldn't lose too many frames to periodic tracking error since guiding wasn't working.
So here's a single 3-minute frame:
Yikes! Hot pixel city! I took dark frames and bias frames that were also pretty crazy.
Now, these frames are stretched pretty far so you can see them, and I don't have a good reference for what setting I'd put the stretching at to compare it with my DSLR. (By the way, I'm using a piece of freeware called AvisFV to view the FITS files that Sequence Generator Pro saves out; you can just use the scroll wheel to change the amount of stretching on the image - basically, make it brighter or dimmer.)
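A screen stretch like that is basically just a linear remap of a chosen black point and white point onto the display range, with everything outside clipped. Here's a minimal numpy version (the sample pixel values are made up):

```python
import numpy as np

def linear_stretch(image, black, white):
    """Map pixel values so black -> 0 and white -> 255, clipping outside.

    A minimal version of a FITS viewer's linear screen stretch; it only
    changes the display mapping, not the underlying data.
    """
    scaled = (image.astype(np.float64) - black) / (white - black)
    return (np.clip(scaled, 0.0, 1.0) * 255).astype(np.uint8)

# Scrolling "brighter" in a viewer is like pulling the white point down:
frame = np.array([[0, 1000, 4000, 65535]], dtype=np.uint16)
bright = linear_stretch(frame, black=0, white=4000)  # faint detail now visible
```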
Time for some magic
It's not customary for a scientist to call a mathematical or scientific process magic, but even though I have a decent understanding of how stacking works, let's face it, it's still magic.
I gave DeepSkyStacker the light frames, the darks, and the biases (a review of what these are can be found here) - I ended up not having time to take flats before it got dark - and out came a virtually noiseless, rather nice image of M27!
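The non-magic core of what the stacker is doing can be sketched in a few lines of numpy: subtract the dark signal from each light frame, then average the calibrated frames, and the random noise falls off roughly as the square root of the number of frames. (Real stacking also aligns the frames, handles the bias separately, and rejects outliers like hot pixels and satellite trails - all omitted here, and all the numbers below are synthetic stand-ins.)

```python
import numpy as np

rng = np.random.default_rng(0)
n_frames, shape = 6, (100, 100)

true_sky = rng.uniform(0, 50, shape)    # stand-in for the real scene
master_dark = rng.uniform(0, 5, shape)  # fixed-pattern dark signal

# Simulated light frames: scene + dark pattern + fresh random noise each time
lights = [true_sky + master_dark + rng.normal(0, 10, shape)
          for _ in range(n_frames)]

# Calibrate (remove the repeatable dark pattern), then stack (average out
# the random noise, which shrinks by ~sqrt(n_frames)):
calibrated = [light - master_dark for light in lights]
stacked = np.mean(calibrated, axis=0)
```

The fixed-pattern noise subtracts out exactly, which is why the hot-pixel city in a single frame can vanish completely, while the random noise just gets quieter with every extra frame.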
Date: 9 September 2017
Location: John Bryan State Park Observatory
Object: M27 Dumbbell Nebula
Camera: SBIG ST-8300 (Phil's)
Telescope: Borg 76ED
Accessories: Astronomik CLS filter, Hotech SCA field flattener
Mount: Celestron Advanced VX
Guide scope: N/A
Guide camera: N/A
Subframes: 6x180s (18m)
Temperature: -25C (chip), 50F (ambient)
Now, as you can see, my field flattener is not quite doing its job (according to the directions, I may need a spacer or something for using it with a CCD camera as opposed to a DSLR, where the chip sits pretty far back from the front of the body) - but hey, lookie, it's my first CCD astro image, and it came out pretty nice! So my next stop is going to be using it with RGB filters to get some color in there!