Tuesday, October 22, 2013

pyAstroStack: Calibration works, affine transformations by ImageMagick

Actually, all of that worked already before the previous post. I had to make a couple of changes to the calibration functions because the images are now monochrome instead of RGB, but otherwise everything was in order.

I had a problem with the master dark coming out white, which turned out to be because the program was reusing old temporary dark#.tiff files. The master bias was subtracted again on every run, causing uint16 values to wrap around below zero. I changed the program to use float32 during processing and to write 16-bit integer output only when everything is ready. Temporary files are also now manually removed before every new test run.
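The wrap-around is easy to demonstrate with NumPy; the frame values below are made up, but the behaviour is exactly the unsigned-integer underflow described above:

```python
import numpy as np

# Hypothetical 16-bit frames; real data would come from the TIFF loader.
dark = np.array([[1200, 1150], [1180, 1210]], dtype=np.uint16)
bias = np.array([[1100, 1190], [1160, 1205]], dtype=np.uint16)

# Subtracting directly in uint16 wraps around instead of going negative:
wrapped = dark - bias  # 1150 - 1190 becomes 65496, a bright pixel

# Working in float32 keeps negative intermediates, then clip on output:
calibrated = dark.astype(np.float32) - bias.astype(np.float32)
result = np.clip(calibrated, 0, 65535).astype(np.uint16)
```

The clipped pixel ends up at 0 instead of near the top of the 16-bit range, which is why the master dark stops looking white.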

I noticed that ImageMagick can do affine transformations based on matching point pairs. Just what I need! It's a lot faster than Scikit-Image, which I won't be needing anymore, so that's one less dependency to worry about. I could probably do a lot more with ImageMagick as well, so I have to look into it some more. This project isn't about coding everything myself; it's about getting astronomical image stacking done with open source software. Hence, ImageMagick is fine.
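For reference, ImageMagick's `-distort Affine` takes a list of `x,y X,Y` control-point pairs and fits the transform from them. Here's a sketch of how the call might be built from Python; the function name, file names and point coordinates are all made up for illustration:

```python
import subprocess

def align_with_imagemagick(src, dst, pairs, run=False):
    """Build (and optionally run) an ImageMagick affine-distort command.

    pairs is a list of ((x, y), (X, Y)) control-point matches:
    a source coordinate and where it should land in the output.
    """
    points = "  ".join(f"{x},{y} {X},{Y}" for (x, y), (X, Y) in pairs)
    cmd = ["convert", src, "-distort", "Affine", points, dst]
    if run:
        # Requires ImageMagick to be installed and src to exist.
        subprocess.run(cmd, check=True)
    return cmd

# Three matched star positions are enough to fix an affine transform.
cmd = align_with_imagemagick(
    "light1.tiff", "light1_reg.tiff",
    [((10.0, 20.0), (12.5, 18.0)),
     ((400.0, 35.0), (402.0, 33.5)),
     ((220.0, 310.0), (221.0, 309.0))])
```

With more than three pairs ImageMagick does a least-squares fit, which is handy when the star matches are slightly noisy.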

Next I think I should make some kind of a project file that holds information about the temporary files: which ones can be removed and which can be reused.

So here's the first result of full calibration process.

That's of course not what my code outputs. Postprocessing has been done with Darktable. A bigger version is again on Flickr: http://www.flickr.com/photos/96700120@N06/10427590704/

In addition to the project file, I'll start working on interpolating the calibrated raw images into RGB.

By the way, the whole process now takes 7 min 30 s. That's for 30 bias, 10 dark, 7 flat and 30 light frames. Quite good, I think.

Monday, October 21, 2013

Unexpected problems with image alignment

Continuing the development of pyAstroStack (I have to come up with a better name for it...), I ran into problems where I didn't quite expect them.

I'm also starting to find out that although I thought I knew what happens in image registration and stacking, I'm actually missing quite a lot. I knew about the Bayer filter in DSLRs, but it seems I misunderstood how it's used. I thought a 12 Mpix sensor had 12M red pixels, 12M blue pixels and 24M green pixels, when there actually are 3M red, 3M blue and 6M green. I also didn't understand that converting a raw photo into FITS or TIFF with DCRaw's (or Rawtran's) default settings doesn't give me the real raw data. So the first version of my program used interpolated colour data from the beginning. I started to fix this.

I had difficulties understanding Rawtran's switches, but I knew what I had to do with DCRaw to get the raw Bayer data out of the files. Rawtran gave me FITS, but DCRaw gives PPM or TIFF. I wanted my program to output TIFF, so I decided to change AstroPy.Fits to something that uses TIFF. I found Pillow. It can also import images into numpy arrays, so I should be able to make the transition easily...

I made a lot of changes before running proper tests. I tried to include dark, bias and flat calibrations at the same time, and when I finally ran the tests, all the resulting images were mostly black. I removed the calibration functions from my test program, but to no effect. I stretched the images to the extreme and found this:

It seems like registration fails. I really couldn't understand this, since I hadn't touched anything related to registration; all the changes were in image loading and stacking. The weirdest thing to me was why the alignment failed only for the Y coordinate while X was fine. It took me a while to figure this out. I reverted to an older, working version of the code and started making the changes to it one by one, running tests after each change. Pillow and TIFF were the cause. I still couldn't understand why, until I ran everything using FITS but wrote the output as TIFF. The result was a perfectly aligned image, upside down! Either TIFF handles coordinates in a different order than FITS, or the Pillow library builds the numpy array with Y flipped, but now that I knew the cause, it was simple to fix.

Star coordinates were fetched with SExtractor using FITS even when everything else used TIFF, so the Y coordinate was always reversed. I simply changed the Y that SExtractor gave into Ymax - Y and everything worked.
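The fix above amounts to mirroring one axis, since FITS puts its origin at the bottom of the image and the TIFF arrays have theirs at the top. A minimal sketch with made-up detections (the height and coordinates are hypothetical):

```python
# FITS (as read by SExtractor) counts Y from the bottom of the frame,
# the TIFF-backed numpy arrays count Y from the top, so the detections
# have to be mirrored around the image height before registration.
def fix_y(y, ymax):
    return ymax - y

ymax = 480.0  # hypothetical image height in pixels

stars_fits = [(102.0, 40.0), (310.5, 455.2)]  # made-up (x, y) detections
stars_tiff = [(x, fix_y(y, ymax)) for x, y in stars_fits]
```

X stays untouched, which matches the observation that only the Y alignment was failing.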

My plan was to have calibration done by now, but this coordinate mix-up took more time than it should have. Reminds me how much of an amateur I still am...

What next?

Maybe now I can work on the calibration. From what I've understood, the procedure is
  • masterbias = stack(bias)
  • masterdark = stack(dark - masterbias)
  • masterflat = stack(flat - masterdark - masterbias)
  • stack((light - masterdark - masterbias)/masterflat)
Feels like the master flat should be normalized. It doesn't make sense to me to divide by values as big as or bigger than the ones found in the light frames. Dividing by a flat normalized to [0, 1] feels like a better idea.
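The steps above can be sketched in a few lines of NumPy. This is only an illustration of the listed procedure, not the program's actual code; here the master flat is normalized by its mean (a common choice that keeps pixel values in the lights roughly unchanged, while dividing by the maximum would put the flat in [0, 1] as suggested):

```python
import numpy as np

def stack(frames):
    # Plain average stack; a real stacker might use median or sigma clipping.
    return np.mean(frames, axis=0)

def calibrate(lights, darks, biases, flats):
    """Sketch of the calibration procedure listed above.

    All frames are float32 2-D arrays of the same shape.
    """
    masterbias = stack(biases)
    masterdark = stack([d - masterbias for d in darks])
    masterflat = stack([f - masterdark - masterbias for f in flats])
    masterflat /= masterflat.mean()  # normalization (assumed, see text)
    return [(l - masterdark - masterbias) / masterflat for l in lights]
```

With the mean normalization the flat only corrects relative pixel sensitivity instead of scaling the whole image down.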

Also, colouring the images would be nice. As I said, the first images were made from interpolated raws, and now I'm using properly de-Bayered data (I think). After calibration I should interpolate the monochrome images into colour images with the correct Bayer mask. If I'm right about how it's done, it doesn't sound like a very fast operation in Python. Perhaps PyCUDA here? An introduction to PyCUDA I read said it's at its best when computing on numpy arrays.
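As a first step toward that, the mosaic can be split into its colour planes. A minimal sketch assuming an RGGB pattern (the actual mask depends on the camera, and a real demosaic would interpolate the missing pixels instead of leaving them at zero):

```python
import numpy as np

def split_rggb(mono):
    """Split a raw mosaic into sparse R, G, B planes, assuming RGGB.

    Each output has the mosaic's values at its own colour's positions
    and zeros elsewhere; interpolation would fill in the gaps.
    """
    r = np.zeros_like(mono)
    g = np.zeros_like(mono)
    b = np.zeros_like(mono)
    r[0::2, 0::2] = mono[0::2, 0::2]  # red on even rows, even cols
    g[0::2, 1::2] = mono[0::2, 1::2]  # green on even rows, odd cols
    g[1::2, 0::2] = mono[1::2, 0::2]  # green on odd rows, even cols
    b[1::2, 1::2] = mono[1::2, 1::2]  # blue on odd rows, odd cols
    return r, g, b
```

The slicing is pure NumPy, so this part at least is fast; it's the interpolation of the missing two-thirds of each channel that could benefit from PyCUDA.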

Sunday, October 13, 2013

New project: pyAstroStack

It has been bothering me that there is no free stacking software for astrophotographers on Linux. I've heard PixInsight is awesome, and I have no doubt it is, but it costs money. I wonder why no one has ever made a free (as in freedom) alternative. Maybe because there are decent free (as in free beer) programs such as DSS, Regim or IRIS (which I compared here).

I decided to try and code one myself. I basically understand most of the mathematics involved, and I've studied programming a bit alongside physics and mathematics, so I thought I might have the skills... Still, there have been problems where I least expected them. For example, making an affine transform of a data matrix was surprisingly difficult.

So now I announce:


An open source stacking software for astronomical images

For now the program is extremely limited. It works from the command line and is configured by editing the source code. It also stacks only by average value, doesn't calibrate images with dark, flat and bias frames, saves the result only as three FITS files (one for each colour channel)... But it works for my test data! That's when I thought I'd make this public.

My test data was the best astrophoto I've taken. Not much, as you can see, but nevertheless it is my best. Here's the first successful result of my own code.

Andromeda, stacked with pyAstroStack and postprocessed with ImageMagick and Darktable
Stacking was done in pyAstroStack, and open source software was used for the postprocessing as well. First I tried Iris for setting the colour balance on the FITS and saving it as TIFF, and it did a much better job than I could in Darktable. But my goal was to have everything done with open source software, so Iris wasn't used on this image.

And here's the same stacked with Iris http://www.flickr.com/photos/96700120@N06/10002768985/in/set-72157634344389164

The code can be seen on Bitbucket. It's licensed under GPLv3. I hope this'll go somewhere and that I have the time and resources to make it easier to use and install. If you've read this far, I assume you are somewhat interested in the project. Awesome. If you have any ideas on how to make the registration faster or reduce the number of required Python libraries, I'm all ears.

Feel free to add enhancement ideas or proposals to the issue tracker on Bitbucket.