I did take a break earlier in the afternoon, though, to run up to the vendor tent and see if they had a solution for attaching my Orion 50mm guide scope to my Borg, since the one with the attachment I needed was already on the Takahashi. While they didn't have an adapter from the T-style connection to the smaller Vixen-esque type, one of the vendors, Camera Concepts and Telescopes, did have a set of rings for a finderscope or guidescope of that size with the correct dovetail, so I had a solution! I also bought a cheap solar cover for the C8 so I wouldn't have to move it under the canopy tent during the day.
After a dinner of the meat trifecta of chicken, brisket, and sausage, I went down to the observing field to set up. Fellow club member Derek, from whom I was borrowing a second Celestron AVX in place of my misbehaving Celestron CGE Pro, wanted his AVX back, so I took it partially apart and brought it back over to him. Then I went and found one of the TSP staff, Jeff, who had graciously offered to let me borrow his AVX. Score! So I got that set up as the sun went down.
Jeff's Celestron AVX with my Borg 76ED, ZWO ASI1600MM Pro camera, and Starlight Xpress filter wheel on top. The other scope is my Celestron AVX with my uncle's Takahashi FSQ-106N on top, and his SBIG STF-8300M.
Jeff had a wooden tripod with an AVX mounting plate on it instead of the tripod that comes with the AVX. This wooden tripod had spiked feet, with a step on each leg you could press down with your foot to drive it into the dirt and prevent settling. It also had height markings on the legs to make it easier to level, on flat ground at least.
Once the other AVX was built and ready to go, I set up my Nikon D3100 in front of my gear to do a timelapse of them moving around as the sky rotated behind them.
I also showed my dad a quick tutorial on how to use my Celestron NexStar SE mount, and he had a fun time exploring the hand controller's catalog.
Once enough stars popped out, I started alignment on Jeff's AVX, but the declination axis kept slipping and then wouldn't slew. After a few tries, I figured out that as the dec axis moved around, it was loosening the dec clutch when it hit the dec axis motor casing. The clutch knob just needed to be reset back a few notches, which was an easy process -- I just pulled out the Phillips-head screw that was holding the knob in, rotated it back a few notches, and re-attached it. Done! Now it wouldn't hit the motor casing.
Once that was done, alignment and polar alignment went smoothly. Then I needed to pick a target, so I went back to my original target list for first-half-of-the-night targets and selected galaxy M106. I calibrated guiding, slewed to M106, and started acquiring 5-minute subframes. The guide graph looked terrible, but with the short 500mm focal length of the Borg, none of that showed up in the images.
I realized earlier, when I was going through my data, that the stretching of the stars in the unguided data wasn't entirely from bad tracking, since the direction of the apparent stretching did not match the direction of drift when I played the images sequentially using the Blink process in PixInsight. I realized instead that my Hotech field flattener must not have been sitting square to the image plane. This is a problem I have frequently on the Borg setup, and it was made worse by the heavy filter wheel I added. Basically, the 2-inch nosepiece on the field flattener is supposed to be self-centering, so it has three metal rings separated by rubber rings. It is just a hair too wide for the 2-inch adapter on the Borg, so I can't insert it all the way. It gets stuck and is really hard to pull out. So it does tend to hang a bit.
The stars were slightly ill-formed, but that's not as noticeable when you're zoomed out, and I was pleased with the amount of detail I could make out in the galaxy, as well as how far out I could see its feathery edges even in the raw frame. This was going to be awesome!
Over on the Takahashi, after going through all the frames I'd gathered so far and deleting a few earlier in the day, I decided to take a few more on M101 to make up for the ones I'd deleted before moving over to my next target for that night, M81 and M82. I was determined to capture the hydrogen alpha jets coming out of M82! With a set of narrowband filters on the SBIG and dark skies, I figured I could do it. So after capturing the couple on M101, I used Cartes du Ciel to center the AVX where I wanted it, and started taking RGB frames. I took those first so that in case we didn't get another clear night, I would have enough to work with, since I could make a synthetic luminance and still have a processed image.
While those were going, I was talking to my Uncle Chris, who was planning on leaving Thursday, about how I could keep borrowing the Takahashi and the SBIG for the rest of the week and then get them back to him. He suggested that he could go ahead and sell them, as he'd been wanting to, and I would just ship them to whoever bought them. I don't remember exactly how it came up, but he offered me a super awesome "family discount" price on the Takahashi. I was already thinking about buying a better refractor than the Borg at some point, although I wasn't planning on getting a Tak for a long time. But I just couldn't pass up on this deal -- I would never get a price like that again! So I bought it! :D:D:D It's a Takahashi FSQ-106N, in particular. He had recently bought a newer version of it that I guess has somewhat better contrast and a few other improvements, but this one was plenty good. It puts me a little behind on potentially buying a new mount, but #worthit...
My very own Takahashi FSQ-106N!!
While that was going, an awesome meteor came through in the south -- I missed it, but I saw the flash of it on my parents' faces, whom I was facing at the moment; it was that bright! They said it broke into two pieces toward the end. Guess whose DSLR was pointed in that direction? :D
Nikon D3100, ISO-1600, 15s, f/3.5, 18mm
It must have been toward the end of the frame because I didn't catch the breaking apart (darn!), but it's still a nice shot!
The night was flying by, and before I knew it, the Milky Way was rising. So I set up my Sky-Watcher Star Adventurer again with my Nikon D5300. This time, I tried the 50mm f/1.4 lens I was borrowing from club member Paul, and I was planning on imaging the whole Veil Nebula, a supernova remnant up near Cygnus. The 50mm lens showed a lot of chromatic aberration and weirdness though, so I switched to the 85mm f/1.8 lens, which is a beast! So I let that one roll.
Nikon D5300, 85mm, f/2, 180s, ISO-1600
You can just barely see it in the center, a little closer to the bottom...
After that, I went back over to the mounts to change targets. I moved the Borg over to the Western Veil Nebula, which I have an ok DSLR image of, but not an astro camera image of yet. It was just barely visible in the subframes, so hopefully it will come out more in stacking.
Western Veil Nebula, ZWO ASI1600MM Pro, luminance channel, 300s, Borg 76ED
I did get to look at this one in a neighbor's big Dob telescope using an OIII filter, and it looked phenomenal. I could see so much detail on it, and tons of other nebulosity associated with it in the area! So cool. It took a bit to get that sequence started because I kept forgetting to turn autoguiding back on after adjusting the nebula's location in the frame.
Over at the Tak, I decided to go ahead and take some narrowband images while I had the filters. I had already taken some hydrogen-alpha images of the M16 Eagle Nebula, and I decided to go ahead and take the whole suite, including sulfur-II and oxygen-III. I re-calibrated PHD2 autoguiding because the graph was starting to look a bit bad and the stars were starting to look a little stretched at 5 minutes; even though I had the mount staked down, it had almost certainly been jostled a bit between the wind and taking the scope cover off and on. The calibration didn't go well though, and it looked like it was possibly because of backlash. Sometimes PHD does weird stuff though, so I rebooted both the mount and PHD. It still seemed to be clearing backlash during calibration even though the star was back in the crosshairs, but guiding worked fine after that, and I was able to take 10-minute subframes again.
It was very chilly out, around the lower 40s, and I had all of my layers on! I needed to get up and move around in order to stay warm. My minion Miqaela went to bed around 3 AM, but I didn't want her to have to stop imaging the Snake Nebula, so I offered to close up her scope for her and take dark frames later in the night. Meanwhile, I went over to my telescope-neighbors to see what they were up to. One guy had an 18-inch Obsession -- the scope I mentioned earlier seeing the Western Veil Nebula through -- and I also looked at M101, M51, and galaxy NGC 5907. M101 is such an odd-looking spiral. M51 showed a lot of detail. NGC 5907 is a large edge-on galaxy that I added to my imaging list.
Earlier in the week, my family kept going to bed before the Milky Way got high enough out of the lower-elevation parts of the atmosphere and the airglow to see well, so they promised they would get up early instead to come check it out. Sure enough, at 4:30 AM, there they were! I was so pleased. The Milky Way really did look great. They got a chance to see the Dumbbell Nebula ("still just a fuzzy blob!") and Saturn and Jupiter in one of the smaller Dobs in the same group. That was a real treat. My view of the Dumbbell with my younger eyes showed the curled edges of the football shape, the apple core was easy to make out, and the central star was obvious. I swore I could even make out the X-shape in the middle, but maybe I was just seeing what I wanted to see.
My family went back to bed at 5:30 AM, and I went and covered Miqaela's scope for dark frames. Then I went and checked on how my exposures were going. Now, Uncle Chris wasn't exactly sure of the order of filters in the filter wheel, so this may not actually be OIII, but we'll see when I process the image. Whatever filter it was, my jaw hit the floor!
M16 Eagle Nebula, 10-minute subframe with OIII filter (?), Takahashi FSQ-106N, SBIG STF-8300M
Close-up of the Pillars of Creation area
Woweeeeeeeee! Now I can see why it's called the Eagle Nebula! Simply astonishing. I was extremely excited. I can't wait to process this!
I started packing up while the last filter, potentially SII, was going, and I got to bed at 6:40 AM on that happy note. I had a total of 12 H-alpha frames (including from the previous night), 5 OIII frames, and 5 SII frames. Another great night!
Here's the video of my scopes moving around. I wish I had lit the scopes, and used a higher ISO again, but it's still kind of cool.
[ Update May 24, 2019 ]
M101
Finished processing M101!
Date: 29, 30 April 2019; 1 May 2019
Location: Texas Star Party, Fort Davis, TX
Object: M101 Pinwheel Galaxy
Attempt: 5
Camera: SBIG STF-8300M (Uncle Chris')
Telescope: Takahashi FSQ-106N
Accessories: SBIG filter wheel, Baader 36mm LRGB filters
Mount: Celestron AVX
Guide scope: Orion 50mm mini-guider
Guide camera: QHY5
Subframes: L: 10x300s
R: 7x300s
G: 12x300s
B: 11x300s
Total: 3h20m
Gain/ISO: N/A
Stacking program: PixInsight 1.8.6
Post-Processing program: PixInsight 1.8.6
Darks: -10C: 20
-20C: 20
Biases: -10C: 20
-20C: 20
Flats: 0
Temperature: 29 Apr: -10C (chip)
30 Apr: L,R: -10C (chip)
30 Apr: G,B: -20C (chip)
1 May: -20C (chip)
This one turned out to be a little tricky to process -- each channel was relatively noisy because of the low number of frames, and the red channel ended up out-of-focus. There was also a fair amount of light pollution from the oil fields to the north that I had to battle. But it was still dark enough to reveal a lot of the dimmer parts of the galaxy, like the disturbed arm. Cool!
Here's my PixInsight process:
- Integrated -10C biases (-20C biases done previously)
- Created superbias for -10C (-20C done previously)
- Calibrated both darks with superbiases
- Integrated both sets of darks
- Calibrated lights with master dark and superbias
- Used SubframeSelector to select the best frames
- Registered with StarAlignment, used highest-scoring L frame as reference
- Stacked each channel; L,G,B Winsorized sigma clipping, R sigma clipping
- Cropped with DynamicCrop
- DynamicBackgroundExtraction on each channel
- Combined RGB channels
- Denoised with MultiscaleLinearTransform, with luminance mask
- Color corrected with PhotometricColorCalibration
- Killed background again with DBE on RGB image
- Denoised RGB again with MultiscaleLinearTransform
- Denoised L with MultiscaleLinearTransform, with stretched mask
- Applied Deconvolution with range mask and star mask; used DynamicPSF to create sample point-spread function; used 40 iterations
- Stretched L and RGB
- Applied L to RGB
- Stars have red halos from out-of-focus red channel
- Used ColorMask utility script to select red stars
- Dilated with MorphologicalTransform (see "reducing magenta stars" in Hubble palette Light Vortex tutorial)
- Didn't like that mask, so used star mask made previously, and dilated stars same way (MorphologicalTransform, MorphologicalSelection, 0.90 amount)
- Did another DynamicBackgroundExtraction
- Adjusted curves with CurvesTransformation
- HDRMultiscaleTransform to increase contrast in core
I lost some detail on the smaller galaxies, but like how M101 came out!
You can find this one on a variety of Zazzle products.
M16 Eagle Nebula
In addition to M101, I processed the narrowband frames on the Eagle Nebula, and combined them using the Hubble Palette, which is SHO -- red = sulfur-II, green = hydrogen alpha, and blue = oxygen-III. As it turned out, the filter order wasn't Ha, OIII, and SII, but instead OIII, Ha, and SII. So the single subframe above is a Ha frame.
Date: 30 April 2019
Location: Texas Star Party, Fort Davis, TX
Object: M16 Eagle Nebula
Attempt: 6
Camera: SBIG STF-8300M (Uncle Chris')
Telescope: Takahashi FSQ-106N
Accessories: SBIG filter wheel, Baader 8nm Ha, OIII, SII filters (36mm)
Mount: Celestron AVX
Guide scope: Orion 50mm mini-guider
Guide camera: QHY5
Subframes: Ha: 5x600s
OIII: 12x600s
SII: 4x600s
Total: 3h30m
Gain/ISO: N/A
Stacking program: PixInsight 1.8.6
Post-Processing program: PixInsight 1.8.6
Darks: 20
Biases: 20
Flats: 0
Temperature: -20C (chip)
At first, there was a ton of green because H-alpha light just overpowers all the other narrowband wavelengths. But after doing some reading online, I figured out that people usually apply the SCNR process in PixInsight to their narrowband images to reduce the green and get the blue-and-orange Hubble-like images we are used to seeing. So I did that here, using these directions.
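As I understand it, SCNR's "average neutral" protection essentially pulls green down wherever it exceeds the average of red and blue. Here's a minimal numpy sketch of that idea -- a toy stand-in for intuition, not PixInsight's actual implementation:

import numpy as np

def scnr_green(rgb, amount=1.0):
    # Rough sketch of SCNR-style green reduction ("average neutral" protection):
    # wherever green exceeds the mean of red and blue, pull it down toward that mean.
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    neutral = (r + b) / 2.0
    out = rgb.copy()
    out[..., 1] = np.where(g > neutral, g + amount * (neutral - g), g)
    return out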
Here's the whole process:
- Integrated biases and created superbias
- Calibrated darks with superbias
- Integrated cal'd darks to make master dark
- Calibrated lights with master dark and superbias
- Used SubframeSelector to find highest-weight subframe and save out weights
- Registered lights with StarAlignment
- Seeing a lot of cosmic rays, so adjusting sigma high and low
- Using low of 3.0, and high of 2.0 for OIII
- Using low of 2.75, high of 1.75 for Ha and SII
- Cropped each channel
- Applied DynamicBackgroundExtraction to each channel
- Denoised each with MultiscaleLinearTransform, with stretched mask
- Applied LinearFit to each, with Ha as reference
- Stretched with MaskedStretch
- Applied LinearFit again
- Adjusted curves of each channel with CurvesTransformation
- Applied SCNR to remove green
- Went back and used CurvesTransformation first for hue adjustment (see Light Vortex), and then SCNR, to preserve detail
- Used ColorMask utility to make mask of magenta stars, then dilated it with MorphologicalTransform, then applied it to the image and reduced the saturation in magenta
- Did some curves and histogram adjustments
- Created range_mask - star_mask mask with PixelMath
- Applied HDRMultiScaleTransform, with mask (median unchecked, to lightness and lightness mask checked)
- Sharpened with MultiscaleLinearTransform, with range-star mask
- Applied DarkStructureEnhance
I am going to keep messing around with the colors when I've processed the other data to see if there are other pleasing combinations. There is a super-handy website that will give you a preview of what your images will look like if you upload a small JPG version of each channel. The sample image happened to be M16, so I didn't even have to upload mine to see.
Now, a quick note on narrowband images. They're false-color, meaning I arbitrarily assign a color to each channel. In reality, Ha is red, but so is SII, and OIII is a blue-green. Assigning false colors allows you to differentiate the elements in the image. It provides a unique way to see the universe!
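If you're curious what the palette assignment boils down to outside of PixInsight, here's a minimal Python sketch of the SHO mapping described above. The filenames are placeholders, and in practice the real combination happens on the registered, stretched stacks inside PixInsight:

import numpy as np
from astropy.io import fits

# Hypothetical filenames for the three stacked mono channels
sii  = fits.getdata("M16_SII.fit").astype(np.float64)
ha   = fits.getdata("M16_Ha.fit").astype(np.float64)
oiii = fits.getdata("M16_OIII.fit").astype(np.float64)

# Hubble palette (SHO): red = SII, green = Ha, blue = OIII
rgb = np.stack([sii, ha, oiii], axis=-1)
rgb = (rgb - rgb.min()) / (rgb.max() - rgb.min())  # normalize to [0, 1] for display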
M106
Just today, I finished processing M106. I had started shortly after the Texas Star Party while I was at a conference, but didn't get to finish. This one gave me some real trouble at the start because my stars were so bad from the field flattener that registration didn't want to work. I had to seriously massage the registration parameters in order to align all of the frames! Nonetheless, as long as you don't zoom in and judge the star shape too harshly, it came out pretty cool!
Date: 1 May 2019
Location: Texas Star Party, Fort Davis, TX
Object: M106
Attempt: 1
Camera: ZWO ASI1600MM Pro
Telescope: Borg 76ED
Accessories: Starlight Xpress filter wheel, Astronomik Type 2c 2-inch LRGB filters,
Hotech field flattener
Mount: Celestron AVX (Borrowed)
Guide scope: Orion 50mm mini-guider
Guide camera: QHY5L-II
Subframes: L: 13x300s
R: 10x180s
G: 12x180s
B: 8x180s
Total: 2h35m
Gain/ISO: 139
Stacking program: PixInsight 1.8.6
Post-Processing program: PixInsight 1.8.6
Darks: 300s: 20
180s: 20
Biases: 0
Flats: 0
Temperature: -20C
After seeing the single subframe up above, you might wonder how I got so much detail in the core -- and the answer is a sweet algorithm called HDRMultiscaleTransform.
Before HDRMultiscaleTransform
After
It's a magical algorithm that can pull out the dimmer parts of the data that are still there but buried in the histogram. I applied a range mask to protect the non-galaxy and star parts of the image, and adjusted the number of layers until I found a setting I liked.
The number of layers will vary depending on the image, but the lower the number, the stronger the contrast enhancement, essentially.
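I don't know the internals of HDRMultiscaleTransform, but you can get a feel for the idea -- compress the large-scale brightness so the small-scale detail in the core isn't buried -- with a crude toy version like this. This is definitely not the real algorithm; the smoothing scale here plays a role only loosely analogous to the layers setting:

import numpy as np
from scipy.ndimage import gaussian_filter

def toy_hdr(img, sigma=64.0, strength=0.7):
    # Crude imitation of multiscale HDR compression on a [0,1] mono image:
    # divide out the large-scale brightness so small-scale core detail survives.
    large = gaussian_filter(img, sigma)      # large-scale structure only
    detail = img / (large + 1e-6)            # small-scale detail, roughly flattened
    compressed = detail * large**0.3         # re-apply large scales, but compressed
    out = (1.0 - strength) * img + strength * compressed
    return out / out.max()                   # normalize back to [0, 1]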
Here's the whole process:
- Integrated bias frames with ImageIntegration
- Created superbias
- Calibrated dark frames (both exposure times) with superbias
- Integrated dark frames
- Calibrated lights with master darks and superbiases
- SubframeSelector on L frames to set weights and select highest-weight frame for reference
- Registered to highest-scoring L frame
- Checked: all the RGB frames (180s) came out extra-dark; may be the bias frame issue I came across previously
- Frames look fine; created master dark not calibrated with bias
- Calibrated RGB frames without bias; look fine
- L frames look fine though, and were also cal'd with the same bias...
- Green frames wouldn't register; look fine
- Tried changing settings as per Light Vortex, but still couldn't register them
- Finally increased peak response to 1.0, and that worked
- Integrated lights: L,R,G: Winsorized sigma clipping; B: averaged sigma clipping
- Didn't do SubframeSelector on RGB, so used Noise Evaluation for weights instead
- Needed to change sigma low to 3 and high to 2 to reduce a satellite trail
- That wasn't quite enough, so changed sigma low to 2.75 and high to 1.25
- Applied LinearFit to each color channel, L as reference
- Combined RGB channels using ChannelCombination
- Cropped
- Applied DynamicBackgroundExtraction to L and RGB channels
- Denoised RGB and L with MultiscaleLinearTransform, with stretched luminance mask
- Color-calibrated with PhotometricColorCalibration, but undid because it dropped the histogram off the left side; will re-do after combo with L
- Tried just background neutralization, but that's what was killing the histogram in PCC
- Kept the BackgroundNeutralization and re-did PCC without its own background neutralization
- But that made it all green
- Undid both BackgroundNeutralization and PhotometricColorCal, and actually set appropriate settings for background neutralization in PCC
- Applied Deconvolution on L channel, with sample point spread function from DynamicPSF and range-star mask
- Stretched L and RGB channels
- Applied L to RGB
- Adjusted curves and saturation with CurvesTransformation
- Another round of denoising with MultiscaleLinearTransform
- HDRMultiscaleTransform, 7 layers
Cool cool stuff! I especially love finding other, more distant galaxies in the images, especially for targets in galaxy-rich areas like M106's home constellation, Canes Venatici (the "hunting dogs"). M106 lies about 24 million light-years away. One really cool thing about M106 is that it hosts a water vapor "megamaser" -- galactic-scale maser emission, like a laser but at microwave wavelengths instead of optical light. It has helped determine M106's distance, and thus calibrate one of the rungs in the cosmic distance ladder, which is how we step out to measuring more and more distant galaxies as other distance-determining features (like Cepheid variable stars) become too dim.
[ Update May 28, 2019 ]
Finished processing the M81 & M82 data, and it came out great!!
Date: RGB: 1 May 2019
L,Ha: 3 May 2019
Location: Texas Star Party, Fort Davis, TX
Object: M81 & M82
Attempt: 6
Camera: SBIG STF-8300M (Uncle Chris')
Telescope: Takahashi FSQ-106N
Accessories: SBIG filter wheel, Astronomik 36mm LRGB filters
Mount: Celestron AVX
Guide scope: Orion 50mm mini-guider
Guide camera: QHY5
Subframes: L: 6x300s
R: 10x300s
G: 11x300s
B: 12x300s
Ha: 2x600s
Total: 3h35m
Gain/ISO: N/A
Stacking program: PixInsight 1.8.6
Post-Processing program: PixInsight 1.8.6
Darks: 20
Biases: 20
Flats: 0
Temperature: -20C
This dataset proved interesting, and I learned a few new techniques! First of all, most of the green frames ended up out of focus before I caught it later in the night, which isn't too big of a deal except that my stars ended up with green halos. Second, since we weren't sure of the order of the filters in the SBIG camera's filter wheel, and I couldn't tell by looking at the raw frames in AvisFV, I only had two hydrogen-alpha frames. Even just those two helped though! Third, I figured out that I could use a range mask to select just the galaxies, and then increase their saturation without increasing the saturation of the noisy background or the green-halo'd stars. This let me boost the saturation enough to see the color, which originally wasn't very intense at all. I'll outline these procedures below.
I ran into several other interesting problems. First was stacking the hydrogen-alpha frames. PixInsight won't let you stack fewer than three, since only two is not statistically significant enough to really stack. You can either average them with PixelMath, or you can duplicate them in ImageIntegration so that it looks like you actually have four frames, which basically does the same thing (1/2 = 2/4). You have to disable pixel rejection though since yeah, you don't have enough statistics to do that with only two frames. Second, my RGB combination image would not color calibrate, and I couldn't figure out why! I tried PhotometricColorCalibration first, one of my favorite algorithms in all of PixInsight, but the result came out completely green, whether or not I enabled background neutralization. So then I tried just doing BackgroundNeutralization, but that yielded similar results, even when I moved the preview window around. Finally, I tried the regular ColorCalibration algorithm, and used M81 as the white reference, and it worked beautifully! Colors were perfect.
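Back on the two-frame stack: the average itself is trivial. In PixelMath the expression is just (Ha_1 + Ha_2)/2, or, equivalently, outside PixInsight (the filenames here are made up):

import numpy as np
from astropy.io import fits

# Average two registered Ha frames; equivalent to PixelMath (Ha_1 + Ha_2)/2
ha1 = fits.getdata("M82_Ha_1.fit").astype(np.float64)
ha2 = fits.getdata("M82_Ha_2.fit").astype(np.float64)
fits.writeto("M82_Ha_avg.fit", (ha1 + ha2) / 2.0, overwrite=True)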
Dealing with Star Halos
Your color frames don't need to be very sharp at all -- in fact, you can significantly blur the color channels, and it won't affect the look of your final image whatsoever. (I explain that here). However, even though the luminance channel was focused, the defocused green stars showed up in the luminance + color combination anyway. (PixInsight must do some averaging or something that Photoshop doesn't do).
There's a solution for it though -- well, two actually.
The first one is using MorphologicalTransformation to erode the stars in the defocused channel before combination with the other color channels. However, this can lead to dark circles around your stars or other artifacts.
The second one, which is the one I used -- and which I learned while processing the Hubble-palette M16 image to reduce pink stars -- is to select objects of a given color and desaturate them using ColorSaturation. However, I ended up doing this a slightly different way: I used the star mask I had created earlier for the Deconvolution process (they're very easy to make with the StarMask process, see this tutorial), dilated it using MorphologicalSelection (see the pink-star solution that does the same thing in this tutorial), and then desaturated the green channel, which is also described in that tutorial. I had to repeat the desaturation a few times, but it did help.
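Here's a rough numpy sketch of that mask-and-desaturate idea, as a stand-in for the StarMask, MorphologicalSelection, and desaturation steps I actually did in PixInsight. The array names are placeholders, and star_mask is assumed to be scaled 0-1:

import numpy as np
from scipy.ndimage import grey_dilation

def tame_green_halos(rgb, star_mask, grow=7, amount=0.5):
    # Dilate the star mask so it covers the halos, then reduce the green excess
    # (green above the mean of red and blue) only under the dilated mask.
    mask = grey_dilation(star_mask, size=(grow, grow))
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    excess = np.clip(g - (r + b) / 2.0, 0, None)
    out = rgb.copy()
    out[..., 1] = g - mask * amount * excess
    return out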
Enhancing LRGB with Ha
Both M81 and M82 have some excellent star-forming regions that glow strongly in the deep red hydrogen-alpha wavelength. While my red filter can pick this up, it's not as strong a signal as isolating that exact wavelength with a narrowband Ha filter. The other benefit of narrowband is the sharpness and high signal-to-noise ratio: since hardly any light pollution or airglow gets through that narrow waveband, the shot noise from the sky is reduced and the signal stands higher above the background. So it's beneficial to use your Ha data to enhance both the red channel and the luminance channel.
I followed this tutorial for enhancing the red and this one (on the same page) for enhancing the luminance, but I'll outline the highlights of that procedure here.
Basically, you apply the PixelMath process twice, doing two different things. The first is extracting the Ha signal of just the target -- so the jets of M82 and the stellar nurseries of M81, in this case. That is done with the following settings:
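The expression itself (it also appears in the full step list at the end) was:
RGB/K = ((Ha_DBE * R_bandwidth) - (R_DBE * HA_bandwidth)) / (R_bandwidth - HA_bandwidth)
with the symbols set to R_bandwidth=70 and HA_bandwidth=7.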
Where Ha_DBE is my stacked and otherwise processed (DynamicBackgroundExtraction, etc.) Ha image, R_DBE is the same but for the red channel, R_bandwidth is the bandwidth of my red filter (generally closer to 100, but you want to lower the number until hardly any stars, galaxy continuum, etc. remain when you apply a screen stretch), and HA_bandwidth is the bandwidth of my narrowband filter (mine was actually 8, but I couldn't remember, and 7 is close enough).
This produces a rather weird-looking image:
Any of the straight-ish lines still left are cosmic rays, since I only had two frames and thus didn't have enough statistics to do pixel rejection.
Then I used PixelMath again to combine this with the red channel.
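This time the expression (again, it's in the step list below) was:
R/K: $T + ((Ha_starless - Med(Ha_starless)) * BoostFactor)
with G and B left as $T and the symbol BoostFactor = 1.0.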
The $T for the green and blue means keep them the same.
I have to admit that I'm having a hard time conceptualizing what this image math is doing, but when I did it, the red channel did have a little more signal in it, and the difference, while subtle, definitely showed up in the end product.
RGB image before adding Ha
RGB image after adding Ha
For adding the Ha to the luminance channel, the math was quite similar, and is shown in the tutorial mentioned above. The signal did increase a bit.
Targeted Saturation Boost
After combining my RGB frames, even after adding the Ha, my color channel was pretty lame.
Since the color wasn't very strong here, it wasn't very strong after I combined it with the luminance channel either. But towards the end of my processing, I messed around with the saturation curve in CurvesTransformation, and the color popped right out when I raised the lows and the midtones! But so did all of the noisy color in the background of the image, and the green halos re-appeared.
So I used the range mask - star mask I had created earlier for the Deconvolution process (generated by making a star mask and a range mask using these instructions and then using PixelMath to subtract the two), which leaves the galaxies unprotected while protecting the background and most of the area around the stars (I could probably tweak the star mask more to increase that star coverage).
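(In PixelMath, that mask really is just the expression range_mask - star_mask, with both mask images open so their identifiers can be referenced in the expression.)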
Red areas are protected from whatever processing you apply.
This let me adjust the saturation on pretty much just the galaxies.
Epic!
Then, finally, I applied HDRMultiscaleTransform to bring out that awesome detail in the cores of both galaxies, especially M81.
Lastly, I ran the denoising settings for MultiscaleLinearTransform again, this time without a protective mask over the galaxies, to intentionally blur the image and reduce the graininess a bit, which I find more aesthetically pleasing.
Here are all the steps I followed.
- Calibrated lights with master dark and superbias using ImageCalibration
- Used CosmeticCorrection to remove hot pixels
- Used SubframeSelector to get rid of a few of the worst frames, and find highest score
- Registered frames with StarAlignment, with highest-scoring L frame as reference
- Stacked each channel with ImageIntegration
- Cropped with DynamicCrop
- Applied DynamicBackgroundExtraction to each channel
- Applied LinearFit to each channel, with L as reference
- Combined Ha + red
- In PixelMath, did
- RGB/K=((Ha_DBE * R_bandwidth) - (R_DBE * HA_bandwidth)) / (R_bandwidth - HA_bandwidth),
- Symbols = R_bandwidth=70, HA_bandwidth=7
- Combined regular RGB channels before next step
- Applied PhotometricColorCalibration to RGB combo
- This failed both with and without background neutralization - all green!
- BackgroundNeutralization had a similar result
- ColorCalibration worked
- Combined with RGB in PixelMath,
- R/K: $T + ((Ha_starless - Med(Ha_starless)) * BoostFactor)
- G and B: $T
- Symbols: BoostFactor = 1.0
- Applied to RGB combo
- Difference is slight, but noticeable
- Combined Ha + L
- In PixelMath, did:
- RGB/K = ((Ha_DBE * L_bandwidth) - (L_DBE * HA_bandwidth)) / (L_bandwidth - HA_bandwidth)
- Symbols = L_bandwidth=100, HA_bandwidth=7
- Then combined with L in PixelMath:
- RGB/K = $T + ((Ha_starless_L - Med(Ha_starless_L)) * BoostFactor)
- Symbols = BoostFactor=1.0
- Created sampled point spread function from L image
- Denoised with MultiscaleLinearTransform on L and RGB, with luminance masks
- Applied Deconvolution to L, with range_mask-star_mask and the sampled PSF
- Combined L & RGB channels with LRGBCombination
- Grew star mask a couple times with MorphologicalTransform
- Used new star mask to reduce green saturation on stars (to make up for the defocused green)
- Applied HDRMultiscaleTransform
- Adjusted curves and saturation with CurvesTransformation
- Used a range mask to boost the low and midtones in saturation on the galaxies - wow!
- Another round of denoising with MultiscaleLinearTransform
So I learned a few new techniques, and got an epic image out the back -- woo hoo! Onward and upward. And holy sweet goodness did I get a lot of data that night.
Next post: #187 - Friday, May 3, 2019 - Baggin' Targets: Texas Star Party Night #6