Using Astropy/Python/Jupyter

Whilst trying to process my limited data on M101, I noticed the PixInsight AnnotateImage script had an option to output the solved objects to a text file.

What I decided I wanted to know was the distance to each of the identified objects, so it was time to break out a Python Jupyter notebook and connect to SIMBAD via the very useful Astropy module.

First we need to iterate over the contents of the solved objects text file and push them into a Python list data object (objects).
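A minimal sketch of that step is below; the file name and the one-name-per-line format are my assumptions about how AnnotateImage writes its output (a small sample file is created first purely for illustration):

```python
# For illustration, create a small sample of the solved-objects text file
# (the real file is produced by PixInsight's AnnotateImage script).
with open("solved_objects.txt", "w") as f:
    f.write("M101\nNGC5461\nPGC2489445\n")

# Iterate over the file and push each object name into a Python list.
objects = []
with open("solved_objects.txt") as f:
    for line in f:
        name = line.strip()
        if name:                 # skip blank lines
            objects.append(name)

print(objects)
```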

Now we have a list of object names, we can send it to the SIMBAD data service via the query_objects function to retrieve information on each object in the list. However, for the purpose of this exercise I’m only really interested in the Z value (redshift) and maybe the radial velocity; using the Hubble constant I could calculate the light year distance. Instead of calculating it myself I use the Planck13 class to return the value of the lookback_time function. This returns a value in Gyr, where a lookback time of 1 Gyr corresponds to a light-travel distance of 1,000,000,000 light years.

I really need to read up more on the Astropy API so I can deal with the errors shown above and sort the results inline by the highest Z value. Regardless of the errors, most of the objects were returned, and based on the Z value the furthest object returned was PGC2489445 at some 3.04 billion light years.

M101 Supernova (27th-28th May)

DSW invited me over to the IMT3 observatory for an imaging weekend to image M101 and the recently discovered supernova SN 2023ixf. DSW was using the 12″ RDK and I decided to put the FSQ85 on the Pegasus NYX-101 to test out the setup ready for our Tenerife trip to Mount Teide.

I ran the QHY268C in high gain mode, gain 56, offset 30 and -20℃. I finally managed to cure the noise banding I was experiencing on the QHY268C by using a fully shielded, high quality, short USB3 cable run directly to a Pegasus UPBv2 that sits on top of the scope.

Although it’s mid summer and the Moon was bright and approaching full phase, the sky conditions on the first night appeared to be okay at first sight. Before processing I decided to check the data quality via the Blink module in PixInsight; it was obvious that there were a lot of unusable subs due to high cloud, and using them would have ruined the quality of any resulting stacked image.

Running the stack of raw images through the PixInsight SubframeSelector to analyse the PSF SNR versus noise, it clearly shows that only 7 frames (35 minutes) from night one didn’t drop below 0.08, whilst all the data from night two should go straight in the bin along with most of night one 🙁

Pixinsight SubFrameSelector

Given I don’t have enough data to do the end result justice, and due to my poor PixInsight skills, I decided to invert the images – I really should subscribe to Adam Block Studios (shout out!)

Inverted image of M101, supernova 2023ixf and surrounding area

Running the AnnotateImage script labels the various galaxies in the image, which I enjoy looking up to see their type, magnitude and distance.

Annotated FoV for area around M101, supernova is not labelled

Zooming in on M101 to see the supernova better, it is located to the right of NGC5461 and indicated by the two arrows.

M101 with SN2023ixf indicated by the arrows

Light Curve

The AAVSO have a light curve plotted from measurements submitted by amateurs; just enter “SN 2023ixf” and submit here. It was still around 11th magnitude on 18th June, but there does appear to be a slight decline in the brightness curve.

Viewing report & Filter Comparison 8th March 10pm 2023

I wanted to compare 2 filters this evening as I have recently purchased the new Antlia Triband RGB Ultra filter. The original filter I had for the one shot colour was the ZWO Duo-Band filter.

M42/M43 Filter Comparison

As can be seen above, the difference without a filter is quite dramatic (top – no filter, middle – Antlia Triband RGB Ultra, bottom – ZWO Duo-Band). More broadband light is captured, the red nebula is less apparent and the background sky is much brighter.

Running Man Nebula Filter Comparison

The above image of the Running Man Nebula, NGC 1977, demonstrates that without a filter a reflection nebula comes through best (left – no filter, middle – Antlia Triband RGB Ultra, right – ZWO Duo-Band). The ZWO filter gives a greener image than the Antlia, which itself reduces the reflection nebula but does start to pick up some of the red emission nebula within the Running Man.

Inverted background sky and stars

The inverted background above gives a sense of the reduction in star luminance that is allowed through without a filter.

This image shows the background with the readout details from each of the pixels across the colour channels. Here you get a sense that the green cast seen with the ZWO filter is less about extra green coming through and more about the lack of blue being allowed through. Without a filter the background sky is swamped with all colour channels.

With no filter the full effect can be seen above, much brighter background, nebula less colourful and less detailed.

With the Antlia filter above, the final single image I personally find is much more pleasing.

Finally with the ZWO filter you can see quite clearly the green effect.

So in summary I would say the ZWO filter is better than no filter except when imaging reflection nebulae; however, the best filter is the Antlia when paired with my one shot colour ZWO ASI2600MC camera.

Below is a random drawing of a scientist with a Tak laser beam.

PixInsight – Noise & BlurXTerminator plugins


Two plugins I have recently been using are the NoiseXTerminator and BlurXTerminator written by Russell Croman and available from the RC-Astro website.

The new BlurXTerminator plugin is priced at $99.95, although you will get a $10 discount if you already own other RC Astro products and provide the licence key at purchase time. Before purchasing, it is suggested that you first check on the web site that your hardware and OS meet the requirements, and download the trial version to test.

The data used was 60 × 300 second frames (5 hours) at -20℃ of the Iris Nebula (NGC7023), captured at the IMT3 dark site using a NEQ6 mount, Takahashi FSQ85, Tak flattener, QHY OAG and QHY268C CMOS camera.


Recommended Usage

Taken directly from the web site :

  • NoiseXTerminator can be used at any point in your processing flow. The PixInsight version can handle both linear and nonlinear (stretched) images.
  • Using NoiseXTerminator on images that have already been heavily processed, particularly with other noise reduction/sharpening software, can produce less than optimal results.
  • If processing a linear (unstretched) image in PixInsight:
    • Make sure PixInsight is configured to use 24-bit STF lookup tables. Otherwise you might see what looks like posterization in your image, when it is really just limitations of the lower-precision default lookup tables.
    • In PixInsight, you can create a preview containing a representative sample of your image, including bright and dark regions, important detail, etc. Select this preview and run NoiseXTerminator on it to allow rapid adjustment of the parameters.

Before & After Comparison

It’s clear that the noise reduction plugin has done a great job, although it would have been better had I collected more data to increase the SNR in the first place!


This plugin was released in December 2022, and I thought I would give it a try as I’m rubbish at all the deconvolution/sharpening attempts and tend to make my images poorer rather than better!

The web page states the following – BlurXTerminator can additionally correct for other aberrations present in an image in limited amounts. Among those currently comprehended for most instruments are:

  • Guiding errors
  • Astigmatism
  • Primary and secondary coma
  • Chromatic aberration (color fringing)
  • Varying star diameter (FWHM) and halos in each color channel

Before & After Comparison

The central part of image before BlurXterminator
The central part of the image after BlurXterminator

Again we can see that the RC Astro BlurXterminator has done a good job at sharpening the detail in the dust cloud.


Investing in these two plugins should be considered money well spent, especially when they can save you time in the processing pipeline. The minimally processed image (DBE, SCNR) of NGC7023 (Iris Nebula), where hot pixels and other artifacts still exist, is shown to demonstrate the power of NoiseXTerminator and BlurXTerminator:

NGC7023 (Iris Nebula) Minimal Process

Compressing FITS files

In these days of high capacity consumer storage things look great, but modern CMOS cameras output fairly large FITS files, and coupled with short exposure subs, one evening can result in a large amount of data to store and process.

So various solutions are available: block compression and byte compression. Taking one FITS image from an OSC CMOS camera, we can compare the different compression methods.

Windows Compression

Using the built-in compression, accessed via a file’s advanced options, the file is reduced from 49.8MB down to 43.5MB of disk usage.

Gzip compression

Using the Linux gzip compression utility (v1.9) we perform a byte-level compression in userspace.

# cksum
4076709869 52223040
# gzip -9
# ls -lashi
524306 33M -rw-r--r--. 1 root root 33M Jul 15 14:34
# du -sh
# cksum
1554386539 34046021
# gunzip
# cksum
4076709869 52223040

Gzipping the 50MB FITS file results in 33MB of disk usage. Using fpack (v1.7.0) gives similar results and a supported file format (fit.fz). The fit.fz file is supported by the processing tool PixInsight.

# fpack -g -q0 -v

Block Compression

Here we utilise the block-level compression feature of IBM Spectrum Scale Developer Edition.

# ls -lashi
524305 50M -rw-r--r--. 1 root root 50M Jul 15 14:34
# du -sh
# cksum
4076709869 52223040
# /usr/lpp/mmfs/bin/mmchattr --compression z
# ls -lashi
524305 36M -rw-r--r--. 1 root root 50M Jul 15 14:34
# cksum
4076709869 52223040
# /usr/lpp/mmfs/bin/mmchattr --compression no
# du -sh

The compression is transparent to the application and user: files are seen as the original file, although modified files will need to be recompressed. As this works at the file system block level, the checksum is the same whether the file is compressed or not.

PixInsight XISF Compression

PixInsight offers the XISF format as an alternative to the FITS format. Existing FITS files can be converted to XISF using the BatchFormatConversion script. To utilise compression you need to supply the appropriate output hints.

Pixinsight Batch Format Conversion

The process console output shows the resulting size of the converted FITs image being reduced to 49% of its original size.

Pixinsight Process Console


PixInsight XISF compression gives the best compression, but the issue is that XISF is not currently an accepted standard, and other utilities (astropy) do not have the ability to read it. PixInsight can also read fit.fz files.

NINA can save camera files as FITS or XISF; the XISF method offers LZ4, LZ4HC and Zlib compression, but there is no gzip/fpack option for FITS.

For now I will continue to compress FITS to fit.gz or fit.fz to give me the benefits of space savings whilst still allowing me to use the data across Windows and Linux utilities.

PixInsight – Load Default Project/Process Icons

So whilst Dave was processing our M45 QHY268C data, he mentioned how it is frustrating that he has to reload his process icons for his workflow every single time.

After finishing the communications/process diagram for IMT I decided to have a quick look if it was possible. Watching PixInsight startup I noticed access to a few files – banner and startup.scp. For me these were located in the C:\Program Files\PixInsight\etc\startup directory.

Looking through the documentation it seemed possible to add statements to the file which was possible once I had modified it as Administrator.

Method 1 – Load Process Icons

This will load just the process icons into the current workspace on startup. Add the line below to the bottom of C:\Program Files\PixInsight\etc\startup\startup.scp :

open "C:\Users\gingergeek\Pixinsight\Pixinsight DSW Process Icons V10.1.6.xpsm"

Save the file and restart pixinsight.

Method 2 – Load An Empty Project With Process Icons

Another method (preferred) is to create a new project (Empty-process-icons.pxiproject), load in the process icons. Save the project and then change the properties to make it read-only so you can’t accidentally overwrite it later on.

Add the line below to the bottom of C:\Program Files\PixInsight\etc\startup\startup.scp :

open "C:\Users\gingergeek\Pixinsight\Empty-process-icons.pxiproject"

Save the file and restart pixinsight.

I also modified the banner file (in the same directory as startup.scp) so it would show the IMT3b designation. I generated the ASCII art from one of the many online sites; if I can remember which one I will link it here.

\x1b[1;38;2;255;000;000m██╗███╗ ███╗████████╗██████╗ ██╗\x1b[39;21m
\x1b[1;38;2;230;000;000m██║████╗ ████║╚══██╔══╝╚════██╗██║\x1b[39;21m
\x1b[1;38;2;204;000;102m██║██╔████╔██║ ██║ █████╔╝██████╗\x1b[39;21m
\x1b[1;38;2;179;000;153m██║██║╚██╔╝██║ ██║ ╚═══██╗██╔══██╗\x1b[39;21m
\x1b[1;38;2;153;000;204m██║██║ ╚═╝ ██║ ██║ ██████╔╝██████╔╝\x1b[39;21m
\x1b[1;38;2;128;000;255m╚═╝╚═╝ ╚═╝ ╚═╝ ╚═════╝ ╚═════╝\x1b[39;21m
Our new PixInsight default project with process icons


The downside to both these methods is that if PixInsight is upgraded/reinstalled then you will lose the settings – not a disaster to be honest as they are easy to put back into place.

Processed Image (M45)

So I visited Dave one evening, as we had not met for a while, and whilst I was updating software Dave processed the QHY268C data we took of M45.

M45 – PixInsight Processed Image

So I’m disappointed that, although the image is a good first start, I forgot to change the gain setting, which in SGPro is in the event settings and not in the top level sequence display 🙁

Dave ran the image through the annotate function of PixInsight. The galaxy PGC13696 near the bottom of the image is actually 232 million light years away.

M45 – PixInsight Annotated Image

Viewing Report 25th July 2020 – IMT3

21:47 – 00:23


GingerGeek and I were out imaging tonight. The sky unexpectedly cleared and we thought given the impending move of IMT3 to another site that we would try to gather some more data on M57, specifically LRGB and some additional Ha on the 12″.

We ran autofocus on luminance, which gave 60,671 at 20.64℃ and HFR 6.5. We then started to image, and after a few frames the temperature started to drop. In the main this is because we opened the dome at the last minute rather than a minimum of 2 hours before use, so the dome was warm, and as it cools down the optical train shifts.

M57 quick frame and focus

We refocused to 62,712 on luminance at 19.10℃ with an HFR of 5.15. We then ran the image acquisition, and below is a screenshot of the guiding, which looked like it was going to cause an issue but was ok. If it had, it would have been the local fog rather than anything on the mount. At midnight we performed a meridian flip nice and early, which allowed us to leave the observatory running and go to bed. Notice the graph below: the yellow line drops as we performed the meridian flip, due to the dome now shielding against next door’s light!

Guiding ok, notice the drop in the graph

The neighbour’s light continues to be a pain, as can be seen here; I really cannot wait to move the observatory to its new dark site.


Finally managed to capture LRGB and Ha; below, the RGB and Ha frames can be seen in PI. Note the central star is not visible in the Ha frame.

RGB and Ha raw frames

Finally I stacked 1 of each colour without calibration to see what it would look like.

RGB quick single stack no calibration

So we left the observatory imaging, I had a quick peek outside around 12:30am and there was water running off the dome and the outside windows of the orangery! The Observatory ran until the dome shut at 3:58am (it woke me up) when the light levels started to rise.

Image Processing Notes for CMOS Using Flat Darks

I thought I ought to document this so that I remember this is now the new normal for making a master flat for my CMOS camera, the ZWO ASI1600MM. The problem I found again, after not processing images for some time, was that the normal way of processing without flat darks produces a master flat with embossed (raised) doughnuts across the image.

BatchPreprocessing -> Darks tab -> Optimization Threshold -> move from 3 to 10 -> this removes the dark entirely and also removes the amp glow, but introduces loads of noise, so clearly not right at all. So I contacted my friend Dave Boddington, who is a bit of an expert on this topic, and he gave me some good advice that has of course worked.

So first let’s detail what I am calibrating. On the 20th April 2020 I took a set of Ha frames of M94; these were 300s exposures with a gain of 193 and, I believe, an offset of 21. However, we had some driver changes over the previous week, so the offset is no longer stored in the FITS header; it was when we were using the ZWO native driver. The temperature of the cooler was set to -26℃. I have 8 of these frames.

M94 300s light

I also have a set of 10 darks at the same settings. However, when using the Statistics tool, Dave noticed the mean of the dark was 800 and the mean of the Ha frame was 353. This is in 16 bit notation. The camera however is a 12 bit camera, and this means the mean for the dark is 50 and the mean for the Ha is 22, so a difference of 28 in 12 bit and 447 in 16 bit. I will come back to this later.

Mean of Ha 300s light
Mean of Dark 300s
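The 16 bit versus 12 bit arithmetic above can be sketched as a quick check: the ASI1600MM’s 12 bit values are left-shifted into a 16 bit range, so dividing by 16 recovers the native ADU scale.

```python
# Sanity-checking the means quoted above: the ASI1600MM is a 12 bit
# camera whose ADU values are left-shifted into a 16 bit range,
# so dividing by 2**4 = 16 recovers the native 12 bit scale.
mean_dark_16 = 800   # mean of the dark frame, 16 bit notation
mean_ha_16 = 353     # mean of the Ha light frame, 16 bit notation

mean_dark_12 = mean_dark_16 / 16   # 50 ADU in 12 bit
mean_ha_12 = mean_ha_16 / 16       # ~22 ADU in 12 bit

print(mean_dark_16 - mean_ha_16)          # 447 (16 bit difference)
print(round(mean_dark_12 - mean_ha_12))   # 28 (12 bit difference)
```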

First I created a Master Dark for the Ha frames using the normal ImageIntegration settings. I did not calibrate darks with Bias as you do not need bias with a CMOS cooled camera. Next I created a Master Flat Dark for the Flat frames using the same ImageIntegration settings.

Single 300s Dark with hot pixels and amp glow

Then I found the Ha images did not need to have the flats applied, so I skipped that step for the narrowband images. Next I calibrated the Ha lights with ImageCalibration. Because of the discrepancy above, which looks like it was induced by having the offset for the darks set to 12 and the offset for the lights set to 21, I added 600, as suggested by Dave Boddington, to the Output Pedestal in the Output Files section of ImageCalibration. I made sure Evaluate Noise was ticked and that both Calibrate and Optimize were unticked in the Master Dark section. Master Bias was unticked, and so was Master Flat for the narrowband images as mentioned.

Calibrating Ha lights with Master Dark

This created a clean set of calibrated Ha lights that did not require flats to be applied.

Calibrated 300s Ha light with Master Dark

Next I had some issues star aligning the frames. The error I received was ‘Unable to find an initial set of putative star pair matches’, due to the frames being very sparsely filled with stars and the background being quite light compared to the stars. A quick look on the PI forum showed that increasing the Noise Reduction in the Star Detection section from 0 to 4 sorted the issue, with all but 1 frame being aligned. I was then down to 7 x 300s Ha lights; the final frame was very light due to cloud.

7 x 300s Ha Calibrated with Darks, Aligned and stacked

I then integrated these 7 frames together. I had a challenge trying to get the hot pixels in a few areas to disappear using CosmeticCorrection and pixel rejection during stacking, so I will remove these by hand afterwards, before combining into the larger set.

hot pixels not removed

So in essence what I have learnt is that I need to have really clean filters and camera glass – all the doughnuts are on those surfaces and not anywhere else. The flats must be between 22k and 26k ADU for the CMOS cameras, although this has some tolerance either way. I need to set the camera to the same Gain, Offset and temperature as the lights, and I need the right flats for the right lights!
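That 22k-26k flat target lends itself to a quick automated check before tearing down a flats session. A minimal sketch, assuming the flat has been loaded as a numpy array (the helper name is my own, not a PixInsight feature):

```python
import numpy as np

# A tiny helper of my own (not a PixInsight feature) to check that a
# flat frame's mean ADU sits in the 22k-26k target range mentioned above.
def flat_in_range(pixels, low=22_000, high=26_000):
    return bool(low <= float(np.mean(pixels)) <= high)

# Simulated well-exposed flat with a mean around 24k ADU.
rng = np.random.default_rng(0)
flat = rng.normal(24_000, 500, size=(100, 100))
print(flat_in_range(flat))  # True
```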

Viewing Report 27th March 2020 – IMT3 12″

Viewing time period – 17:18 – 02:07

Cooling down the telescope ready for tonight’s viewing

IMT3 Cooling down

M94 and NGC 3395/3396 are the 2 targets for tonight: some luminance on M94 and RGB on NGC 3395/3396 if I get enough time. I always try to open the dome early to give at least 2-4 hours of cooling before use.

View from the bridge

When I was about to start with autofocus I tried to recenter back on the target, but the mount did not respond, and it transpired that the mount thought it was out of balance. I went to the dome and the mount was beeping, confirming it was out of balance. So I turned the mount off, manually moved the scopes back to the park position, and then turned the mount back on and all was well.

@19:57 I performed the autofocus for the night on luminance, which came in at a position of 75282 on the focuser.

1st AutoFocus run

@20:10 I started an imaging run of 24 x NGC 3395/3396 with Luminance filter. Once done I planned on grabbing RGB frames before moving on to M94.

NGC 3395/3396 Luminance

@22:32 I started on the RGB frames for NGC 3395/3396 after refocusing on the Red filter.

Single Blue frame for NGC 3395/3396

@1:40 I slewed to M94 and changed the filter to luminance. I performed a refocus and shifted from 77895 to 75884 on the red filter by accident. So we (I had Bob on Zoom by this point) refocused on the luminance and the new focus position was 74884, so the difference is 1000 from luminance to red. I also changed the step size for the focuser temperature compensation from 531 to 431 to see if the HFR is more stable.

A new autofocus on Luminance

I noticed tonight that PHD2 lost the Use Direct Guide check mark twice and thus complained about pulse guiding not being supported. I had to stop guiding, disconnect the mount in PHD2, go into the settings, check Use Direct Guide, reconnect the mount and start guiding again. Something to look into.

Quick frame and focus 20s of M94 Luminance

@02:07 I went to bed and left the scope gathering another 2 hours of Luminance data on M94.

Addendum …….

The following day I took the ZWO ASI1600MM CMOS camera off the back of the 12″ and cleaned the sensor window. What I found was that the dark dust doughnuts disappeared, and the rest of the doughnuts were actually on the filters.

Before cleaning Flat from Luminance on ASI1600MM
Flat from Red after cleaning sensor window
Flat from Green after cleaning sensor window
Flat from Blue after cleaning sensor window

Things to still resolve……..

  1. Check out why WSX is losing connection and shutting the dome
  2. Fix Slew Here and Centre Here in SGPro that does not work
  3. Clean filters for the 12″ to get rid of doughnuts
  4. Clean sensor for QHY168C

Image processing notes for travel setup

So I managed to go out and quickly bag a few images of M13 to test the travel scope on the night of the 1st to the 2nd of September. It was relatively cool and clear. The main aim was to see whether I could take images where the stars were not overexposed whilst capturing the fainter stars at the same time. I also wanted to make sure I could process an image too.

So all in I took 10 x 5 minute exposures, but unfortunately I had not read the Skywatcher manual and had not locked the focus tube. This meant that the first 3 frames were out of focus, so I tightened the locking latch and then took the other 7.

On processing the image I noted the black (white) band to the top and right of the image where I had not switched off the Overscan setting. I could not get PixInsight to recognise it properly, so I simply pre-processed the image and then cropped it out before processing.

Overscan area present

I managed to get Photometric Colour Calibration working which helped get the colour just right. I then processed in my usual way using the following workflow.

Photometric Colour Calibration Results
Photometric Colour Calibration Settings
  1. Calibrate with Flats and Darks only no Bias as it is a CMOS camera
  2. Integrate the frames
  3. Align
  4. Perform Cosmetic Correction
  5. Debayer
  6. Crop
  7. ABE
  8. Background Neutralisation
  9. Platesolve
  10. Photometric Colour Calibration
  11. Histogram Stretch
  12. TGVDenoise
  13. ACDNR
  14. Curves
  15. Dark Structure Enhance
  16. Exponential Transformation
  17. 2nd set of Curves
  18. SCNR for green

The final image was ok for the short amount of data I obtained, and it proved my capture settings and workflow worked.