I got four 120-second ISO 800 RAW frames. A better option would have been 60 seconds at ISO 1600 to minimize tracking errors; stacking should eliminate the extra noise anyway. Since the pictures were taken under a suburban sky, they are overexposed and have an ugly orange background color. When we open them in Darktable, the default white balance preset is camera white balance, and it looks like this:
Now we want to eliminate that ugly orange background, so we switch from camera white balance to spot white balance.
If the images had been saved as JPEG, camera white balance would already have been applied and the colors shifted to allow better compression, and we would not be able to do much processing. Before exporting the images to TIFF (to export, hit Ctrl+e), we tweak the exposure as in the picture:
I export them as 16-bit-integer-per-channel TIFF; if you do not know how to manage the export settings, that is explained in previous blog entries.
Now, to do the stacking, I open a terminal and execute the magic formula:
align_image_stack -a tif *.tiff
gmic tif0000.tif tif0001.tif -div 256 -gimp_blend 3,1,0 -mul 256 -c 0,65536 -type ushort -output one.tiff
gmic tif0002.tif tif0003.tif -div 256 -gimp_blend 3,1,0 -mul 256 -c 0,65536 -type ushort -output two.tiff
gmic one.tiff two.tiff -div 256 -gimp_blend 3,1,0 -mul 256 -c 0,65536 -type ushort -output tutorial.tiff
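The -div 256 / -mul 256 pair exists because the blend filter expects 8-bit-range values; the underlying operation is just a clipped average of two 16-bit frames. Here is a minimal numpy sketch of the same math, on synthetic arrays rather than the TIFF files (reading TIFFs would need an extra library):

```python
import numpy as np

def average_pair(a, b):
    """Average two 16-bit frames without overflow, mimicking the
    -div 256 / blend-average / -mul 256 / -c gmic pipeline."""
    acc = a.astype(np.float64) + b.astype(np.float64)
    return np.clip(acc / 2.0, 0, 65535).astype(np.uint16)

# two synthetic 16-bit "frames" standing in for tif0000.tif / tif0001.tif
a = np.full((4, 4), 40000, dtype=np.uint16)
b = np.full((4, 4), 20000, dtype=np.uint16)
print(average_pair(a, b)[0, 0])  # 30000
```

Applied pairwise, and then to the pair results, this reproduces the three commands above.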
If you compare this with previous blog entries about G’MIC stacking, you will see that the new version of G’MIC is not completely backward compatible. Some people would perhaps prefer to use DSS instead, and that is also OK. Finally, we open the image in GIMP, bump up the contrast and decompose the image to LAB. We duplicate the A and B components, set each copy's mode to Overlay and merge them down. Do not flatten the layers; there should be separate L, A and B layers. The L layer can be slightly stretched or left as it is. When we compose the LAB layers back into an RGB image, we get nicely saturated colors. Now, some people would proceed to play with curves, but I will just add a background gradient. I am not really using it as intended: I switch from Divide to Overlay and reduce the opacity to 50%. That background gradient is part of the astronomy plugin for GIMP by Georg Hennig; a 2.8-compatible version is available from git://gitorious.org/gimp-plugins-ambulance/gimp-plugin-astronomy.git. Building the plugin is trivial. Here is the result:
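The LAB trick works because overlay-blending a channel with itself pushes values away from mid-grey, and in LAB the A and B channels encode color, so the push reads as extra saturation. A small sketch using the standard overlay formula (channel values normalized to 0..1, with 0.5 as neutral; GIMP's internal math may differ in detail):

```python
import numpy as np

def self_overlay(c):
    # standard overlay blend of a channel with itself, values in 0..1:
    # dark values get darker, bright values brighter, 0.5 stays put
    return np.where(c < 0.5, 2 * c * c, 1 - 2 * (1 - c) * (1 - c))

a_channel = np.array([0.3, 0.5, 0.7])   # a toy LAB a* channel
print(self_overlay(a_channel))          # ≈ [0.18, 0.5, 0.82] -- pushed away from neutral
```

Values move away from 0.5 in both directions, which is exactly the saturation boost seen after recomposing.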
The complete damage control was done in Darktable in three straightforward steps, and that is why I call it easy. If the black level were lower and the exposure had been reduced by only -0.5EV, for example, we could further increase the contrast and get more of the Flame nebula. Though that would mean more fiddling in GIMP and may not be as simple as it sounds.
Sunday, March 24, 2013
Wednesday, February 13, 2013
Talking Exposure Timer
Talking Exposure Timer is a simple talking countdown timer for Android. As the title says, its purpose is to help with taking pictures where a long exposure is required and one can’t look at a watch, for example in astrophotography. That is also the main reason why the user interface is black and red. So, as intended, the primary users should be people doing astrophotography with manual tracking or barn door tracking.
The user interface is simple: frequency-of-announcements and duration spinners at the top, Start and Cancel buttons in the middle, and an info text at the bottom. If the text-to-speech engine fails to initialize, the Start button is disabled and an appropriate message is displayed. If the engine initializes, it uses the phone's default language. Cancel is not immediate but is applied on the first delayed post, during the next announcement of the remaining time.
To use it, select how frequent the announcements should be and how long the exposure should be; after a delay of 10 seconds, common for DSLR cameras in remote shutter BULB mode, it starts talking and counting down. While the countdown lasts, the application holds a wake lock: the screen will go dim but will not go to “sleep”. It is compiled for and tested on Android 2.2 on an LG-P500 device.
A signed APK is available for download from https://docs.google.com/file/d/0B0cIChfVrJ7WbkhISE5ZdGdSdmM/edit?usp=sharing; download it, copy it to the SD card and open it in a file manager to install.
The Eclipse project is at https://docs.google.com/file/d/0B0cIChfVrJ7WaS1oRl9lTUJUNmc/edit?usp=sharing; download it, import it into Eclipse and build.
Saturday, February 9, 2013
Why use RAW?
That question comes up quite frequently, in different forms, on the Google+ Open Source Photography community: https://plus.google.com/u/0/communities/110647644928874455108.
For me, the argument that RAW encodes data at 14 bits per channel, versus 8 bits per channel for JPEG, is more than sufficient. But for a non-technical person that is not very convincing. If we add the fact that the camera's internal processor does a great job in 90% of situations when it creates the JPEG, and that you need quite a high level of processing skill to achieve the same starting from RAW, the rock-solid RAW argument doesn't look so good. People advocating RAW start to sound like Groucho Marx: “Who are you going to believe, me or your lying eyes?”
So, here is a simple example of why RAW is good. A camera's handling of great differences in luminosity is nowhere near the human eye's. With automatic exposure on, either the dark parts are too dark or the bright parts are too bright. The offending photo is stored as RAW, and I open it in Darktable. It should open by default in darkroom mode. After doing sharpening and lens correction in the correction group (the one with the broken-circle symbol), I switch to the basic group. Here is what we've got:
After activating the overexposed plugin, all overexposed parts become red, like here:
To remedy that, I switch on the exposure plugin and push the slider to an unreasonably low -2.61EV. Now everything that is underexposed is blue.
After adjusting the exposure to a reasonable -1.42EV we still have some underexposed areas, but they are in shade and we can safely ignore them.
Now we can export it and do further processing in GIMP, or we can even try exporting different exposure levels and later do exposure blending.
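Exposure values work on a log2 scale: a correction of E EV multiplies the linear sensor data by 2^E, which is why -1.42EV darkens the frame to roughly 37% of its original brightness. A one-liner to sanity-check that (plain Python arithmetic, not Darktable's actual pipeline):

```python
def ev_gain(ev):
    # an exposure correction of `ev` stops scales linear values by 2**ev
    return 2.0 ** ev

print(round(ev_gain(-1.42), 3))   # ≈ 0.374
print(round(ev_gain(-2.61), 3))   # ≈ 0.164
```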
Friday, February 1, 2013
Bits and pieces
This is something I should have said in previous posts but forgot. If you are aligning a stack using Hugin tools: align_image_stack is an HDR tool, and it can't handle a significant number of images in a stack, or significant movement between images, well. The workaround here is the venerable divide-and-conquer strategy. Divide the images into groups of a few and stack them group by group. Then, at the end, stack those intermediate results. If your exposures are uneven, stack the shorter ones first and later add them to the longer ones. Specifying the stacking order instead of giving an asterisk may help.
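The divide-and-conquer order can be sketched as follows; the file names are made up, and the "+" strings just stand for the intermediate stacked TIFFs you would write out at each stage:

```python
def pair_up(frames):
    """Group a frame list into pairs for stage-by-stage stacking;
    an odd leftover is carried into the next round as-is."""
    return [frames[i:i + 2] for i in range(0, len(frames), 2)]

def stack_schedule(frames):
    """Return the rounds of pairwise stacking until one image remains."""
    rounds = []
    while len(frames) > 1:
        pairs = pair_up(frames)
        rounds.append(pairs)
        frames = ["+".join(p) for p in pairs]  # stand-in for the stacked output
    return rounds

for r in stack_schedule(["f0", "f1", "f2", "f3", "f4"]):
    print(r)
```

With five frames this gives three rounds, the odd frame "f4" only being merged at the end, which is also where you would fold shorter exposures into longer ones.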
After every contrast-stretching operation, keep the noise level under control. The best tool around is wavelet denoise, which is part of gimp-plugin-registry. Don't overdo the denoising or you will lose detail and sharpness. A reasonable level is up to 1.25 with a residual of 0.10.
If you are into photography, then adding the following repositories may be interesting:
sudo add-apt-repository ppa:otto-kesselgulasch/gimp
sudo add-apt-repository ppa:pmjdebruijn/darktable-unstable
sudo add-apt-repository ppa:philip5/extra
sudo add-apt-repository ppa:hugin/hugin-builds
The PPA names usually contain the name of the package, except for Philip Johnsson's PPA. That one is suggested because it contains Luminance HDR, but it also contains about everything else, though the others have newer versions. When you want to install Luminance HDR, specify luminance-hdr and not the common qtpfsgui; with qtpfsgui you will get the old version from the official repo.
For Entangle, a program for tethered camera control, you download the deb from:
http://mirrors.dotsrc.org/getdeb/ubuntu/pool/apps/e/entangle/
which is a GetDeb mirror. Pick 32- or 64-bit depending on what system you are running. There are no exotic dependencies for Entangle.
In order to increase capacity and avoid loss of data, many picture processing tools internally use 32-bit float per channel. That is perfectly all right, though I do not know of a CCD which digitizes the picture into anything better than 16-bit integer per channel. The problem is that you can't easily view those 32-bit-float-per-channel TIFF images; there are not many viewers around for them. The popular image stacking program Deep Sky Stacker produces Autosave.tif in that format, and you can't see what it looks like, which is very irritating. G'MIC is a perfectly capable viewer for many image formats, and it can handle 32-bit float per channel TIFF. Once you install it, open a terminal (command line) and cd to the folder with Autosave.tif. Execute:
gmic Autosave.tif
and a GUI with Autosave.tif will show up; moving the cursor over the image, you will be able to see the values of the current pixel. Now, if you want to convert those TIFF images for less capable viewers, you can also achieve that using G'MIC. For Deep Sky Stacker, the channel values will be between 0 and 1. So if we want to convert that into 16-bit integer, we simply multiply the channel value by 2 to the power of 16, minus 1, which is 65535. Naturally, we are talking about unsigned integers. So the magic formula to convert it to 16-bit TIFF is:
gmic Autosave.tif -mul 65535 -c 0,65535 -type ushort -output auto16.tiff
Now we can use almost any viewer to see what it looks like.
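The -mul 65535 in the command has a simple justification: Deep Sky Stacker's float TIFFs keep channels in 0..1, and 2^16 - 1 = 65535 is the top of the unsigned 16-bit range. The same conversion in numpy, on a toy array instead of Autosave.tif:

```python
import numpy as np

# channel values in 0..1, as Deep Sky Stacker writes them
f = np.array([0.0, 0.25, 1.0], dtype=np.float32)

# scale to the unsigned 16-bit range and clip, like -mul 65535 -c -type ushort
u16 = np.clip(f * 65535, 0, 65535).astype(np.uint16)
print(u16)  # 0, 16383, 65535 (astype truncates the fractional part)
```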
Friday, January 11, 2013
Orion’s belt and sword, processing with open source tools
Capturing photos
To do processing, one needs to take a few photos first. The minimal hardware is a DSLR with a 50mm lens, a remote shutter release for the DSLR, and any kind of cheap telescope with an equatorial mount. It is not necessary to buy a 50mm lens; the usual 18-55mm kit lens will do, if your DSLR has live view so that you can set focus. Manually setting focus on a kit lens without live view is quite difficult. But it would be nice to have a 50mm F1.8 lens, which is much faster and sharper than a kit lens. A longer focal length is even better, 100mm or even up to 200mm; beyond that, tracking becomes a big problem.

Why a remote shutter release? The camera supports up to 30 seconds of exposure and after that BULB, where you would have to press and hold the shutter button, which causes a lot of shaking and ruins the photo. With a remote shutter release we can have a long exposure without shaking and strain. The telescope is used so that we can piggyback the camera on it and achieve a long exposure without trailing stars. As for attaching the camera to the scope, there are piggyback brackets and piggyback camera mounts, or you can make your own using a hose clamp. If your telescope has an RA motor drive, do a polar alignment and switch it on; a shorter lens focal length is more tolerant of tracking errors. If there is no RA motor drive, you will have to do manual tracking. A telescope with an alt-azimuth mount won't do; it must be an equatorial mount. There are specialized devices like the Vixen Polarie or AstroTrac which can be used instead, but they are more expensive than a small 5" Newtonian with an RA motor drive.

Why shouldn't one just take a few hundred shots from a tripod and later stack them in Deep Sky Stacker? Because 5 seconds would be the longest acceptable exposure for a 50mm lens, and to get to the equivalent of 5 minutes you would need 3600 photos. The way to go is increasing the exposure time. If you are not sure where Orion is, or where M42 is inside Orion, install Stellarium (http://www.stellarium.org/); it is open source and works on all major operating systems.
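The 3600-photo figure follows from a read-noise argument (my assumption about the reasoning, but the arithmetic matches): each frame adds its own read noise, so the SNR of a stack grows only with the square root of the frame count, and matching a single 300-second exposure with 5-second frames takes (300/5)² frames:

```python
target_s, frame_s = 300.0, 5.0   # one 5-minute exposure vs 5-second frames

# Read-noise-limited stacking of N frames, each with read noise sigma:
#   signal ~ N * frame_s,  noise ~ sqrt(N) * sigma,  so SNR ~ sqrt(N) * frame_s
# Matching the SNR of a single target_s exposure:
#   sqrt(N) * frame_s = target_s  =>  N = (target_s / frame_s) ** 2
n_frames = (target_s / frame_s) ** 2
print(int(n_frames))  # 3600
```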
Processing
So I went out under my light-polluted suburban sky and managed to get a few decent photos between the clouds. I had done the same the night before, and now I have 13 frames at 55mm F5.6 ISO 800, with exposures from one to two minutes. When I decided to crop them and stack them together, align_image_stack from Hugin did a poor job. Whether that was the too-narrow cropping or something else, I do not know; for that reason the new strategy was to stack them by night and later stack the final results. So, I imported the RAWs into Darktable, applied chromatic aberration and lens corrections, and exported them to 16-bit TIFF. I should have done hot pixel removal, but I forgot and did it later in G'MIC, like this:

gmic IMG_0451.tiff -remove_hotpixels 3,10 -c 0,65536 -type ushort -output IMG_0451hp.tiff
Now I aligned the stack:
align_image_stack -a tif *.tiff
and averaged them two by two, saving the output:
gmic tif0000.tif tif0001.tif -div 256 -gimp_compose_average 1,0 -mul 256 -c 0,65536 -type ushort -output step1.tiff
Later I combined those intermediate results, two by two, until the final result. To stack those two images I needed to rotate one and crop both of them:
gmic m4291.tiff -rotate -66 -crop 1400,2500,3500,4700 -c 0,65536 -type ushort -output m4291c.tiff
gmic m4281.tiff -crop 1450,767,3550,2967 -c 0,65536 -type ushort -output m4281c.tiff
To find out how much to rotate and crop, I used GIMP. Then fine alignment of those two with align_image_stack:
align_image_stack -a tif *.tiff
and the final blending for 16-bit output:
gmic tif0000.tif tif0001.tif -div 256 -gimp_compose_average 1,0 -mul 256 -c 0,65536 -type ushort -output m4289.tiff
and also an 8-bit one for GIMP:
gmic tif0000.tif tif0001.tif -div 256 -gimp_compose_average 1,0 -output m4289.jpg
Light pollution and clouds contributed to final result.
It is too bright, with too much orange in it, but M42 is visible and the Flame nebula is just barely visible. If we look at the histogram, this is where we are and where we want to be:
From the GIMP menu we select Colors->Levels. In Adjust Color Levels we set Channel to Red and move the upper (input) slider from the left towards the middle.
We do the same for green, watching both the histogram (set to RGB) and the picture itself. After this the image is still too bright and the Flame nebula is invisible. To remedy that I raise the contrast: Colors->Brightness-Contrast, with contrast set to 40. Then comes the LAB-decompose color boost described in one of the previous tutorials, and some more contrast stretching via Colors->Auto->White Balance followed by Edit->Fade (we want the histogram stretched, but not that much), again checking the image and the histogram to find out how much.
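Moving a channel's input sliders in the Levels dialog is a linear remap: everything below the new black point clips to zero, which is what kills the orange sky glow in the red and green channels. A hypothetical numpy rendition (the slider positions here are made-up values, not the ones used on this image):

```python
import numpy as np

def levels(channel, black, white):
    """Map [black, white] to [0, 1] with clipping -- what moving the
    input sliders in GIMP's Levels dialog does to one channel."""
    c = (channel.astype(np.float64) - black) / (white - black)
    return np.clip(c, 0.0, 1.0)

red = np.array([30, 120, 200, 255], dtype=np.uint8)
out = levels(red, 60, 255)   # raise the red black point to 60
print(out)                   # sky glow at 30 clips to 0, highlights keep 1.0
```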
Here is the final result:
Conclusion: I need a lens and a light pollution filter ;-)
Labels:
Astrophotography,
G'MIC,
GIMP,
image stacking,
open source,
Photography
Friday, November 23, 2012
More about stacking
In the last tutorial we stacked a few frames using GIMP after manually aligning them. Since we were setting the layers to Screen mode, the result was quite bright at the end. What we could do to make it more natural is to make the additional layers transparent: opacity of the bottom layer 100%, the next 50%, the next 25%, and so on.
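A side note on opacity series: with layers in Normal mode, a true average needs the nth layer from the bottom at opacity 1/n (100%, 50%, 33%, 25%, ...); a halving series weights the upper frames unevenly. This is an aside about Normal-mode averaging, not a statement about the Screen-mode stack above. A quick check on scalar "layers":

```python
def stack_normal(layers, opacities):
    """Composite values in GIMP 'normal' mode, bottom to top:
    out = opacity * layer + (1 - opacity) * out."""
    out = layers[0]
    for val, op in zip(layers[1:], opacities[1:]):
        out = op * val + (1 - op) * out
    return out

layers = [1.0, 2.0, 3.0, 4.0]
# opacity 1/n for the nth layer from the bottom yields the true mean
print(stack_normal(layers, [1, 1/2, 1/3, 1/4]))  # 2.5, the mean of the four
```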
Automated stacking
Here, Windows users have a really wide choice of free or paid programs, for example Deep Sky Stacker and RegiStax. If there are a dozen or more frames to stack, it pays to start KVM or VirtualBox and struggle with Windows for a few minutes in order to use Deep Sky Stacker. There is no real equivalent on Linux which will do the same task (register, align and stack frames) using a GUI. For a smaller number of frames we can use ale, the Anti-Lamenessing Engine written by David Hilvert. Typically it is not compiled with ImageMagick support enabled, so we have to prepare the images to be processed and convert them into PPM format. We could install ImageMagick and use mogrify and convert from the terminal, but to make things easier we will use a GUI.
If we are using a DSLR camera, or one of those new compact cameras, we may be able to store images in RAW format, which allows us to do a significant amount of processing on the image. There are a few really good programs for RAW processing, like UFRaw, RawTherapee and Darktable. We are going to use the last one, Darktable. After installation, we right-click on the RAW file and select Open With Darktable from the context menu. The user interface is unconventional, so here is a quick explanation. We want our RAW to be converted to PPM, and we want to remove hot pixels.
Proper removal of hot pixels would mean taking “dark frames” at the end of the session: place the cap on the lens and take pictures using the same ISO value and exposure time as the data frames. After that we can add the dark frame to the data frame as a layer in GIMP and subtract it to remove the hot pixels.
When Darktable shows up we will see, in the right pane, a tab which says “more plugins”. Clicking on it opens it, and we select the hot pixels plugin. Clicking again closes it, and under the correct tab there is now the hot pixels plugin, switched off. We switch it on and it removes the hot pixels.
We are happy with all the default processing so far. Now we want to export the image to PPM. For that we hit the “l” key, which brings us to lighttable mode. On the right pane we locate “export selected” and make it look like this:
Export is done via the export button or the keyboard shortcut Ctrl+e. To go back to darkroom mode we hit “d”. We repeat the process on all frames. If we added some exposure, an EV or maybe two, it is likely that we have amplified noise. Also, if we do not want a full-size picture, we may want to resize it; ale will finish processing much faster. In GIMP we do Filters -> Enhance -> Wavelet Denoise, and here is how it looks after and before denoising:
If we want to resize, that is Image -> Scale Image, and then we export the image back to the original PPM.
Now we can open a terminal and cd to the folder with the PPMs. This is what I did, and what the output in the terminal was:
ale --md 64 --dchain fine:box:1 *.ppm stacked.ppm
Output file will be 'stacked.ppm'.
Original Frame: 'img_0001_01.ppm'.
Supplemental Frames:
'img_0001_02.ppm'***** okay (88.869874% match).
'img_0001_03.ppm'***** okay (90.897255% match).
'img_0001.ppm'***** okay (91.984785% match).
Iterating Irani-Peleg.
Average match: 90.583972%
Here is the explanation: --md sets the element minimum dimension x (100 is the default); --dchain fine:box:1 approximates drizzling; and the two remaining parameters are the input and the output. To get more info, execute ale --hA.
The result was rather cold and dark. In order to bring in some warmth and dynamics, I opened stacked.ppm in GIMP and did Colors -> Components -> Decompose, choosing LAB from the drop-down. Then, for each layer, I made only the one I was currently working on visible and the others invisible, by clicking on the eye in the Layers docking window. After duplicating the layer and setting the copy to Overlay mode, I merged the visible layers, accepting the default option “expand as necessary”. That was repeated for the L, A and B layers.
After that, Colors -> Components -> Compose, again selecting LAB. At the end, Colors -> Auto -> White Balance and Edit -> Fade Levels with the default Replace mode and opacity 33. Here is the result:
Those are the same four frames from the last tutorial; if you do not have your own to process, download them, convert them to PPM, and you can try ale stacking and post-processing on them.
Labels:
ale,
Astronomy,
Astrophotography,
Darktable,
GIMP,
image stacking,
Linux,
Photography
Wednesday, November 21, 2012
Even More Astrophotography
Everything described in the previous tutorial should work on Windows as well. Fiji uses Java and works everywhere as advertised, and a GIMP installer for Windows certainly exists. How to install the plugin registry on Windows I really do not know; Google is your friend.
We can go on downloading and processing FITS files from many places. Besides the mentioned LCOGT, we can use Hubble data from http://hla.stsci.edu/hlaview.html, or data from Misti Mountain Observatory, http://www.mistisoftware.com/astronomy/index_fits.htm, to name a few.
All that is nice, but the real fun begins when we capture our own data using our own camera.

Taking pictures without tracking
Any kind of camera, with any kind of lens capable of delivering sharp pictures, will do. There is another inexpensive piece of equipment which is a must: a tripod. I am using an old Sony A290 DSLR with the SAL75300 telephoto lens.
Now we go out, place the tripod and put the camera on it. Select manual mode, adjust the ISO to 800, maybe 1600, and set the exposure as long as possible, but not too long, to avoid star trailing. A shorter focal length allows longer exposures. How long can the exposure be? That depends on many things: your position on the globe, the declination of the target and so on. With a 75mm focal length on a DSLR, which corresponds to 112.5mm on a 35mm SLR, I am happy with 4 to 5 seconds of exposure in Johannesburg, South Africa. If a 50mm lens were available, I would go for a 6 to 8 second exposure. So, select the exposure, go into drive mode, select a three- or five-shot burst, aim and fire.
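Those figures line up with the common "500 rule" heuristic (not something from this post, just a widely used approximation): divide 500 by the 35mm-equivalent focal length to get the longest untrailed exposure in seconds.

```python
def max_exposure_s(focal_mm, crop=1.5, rule=500.0):
    """'500 rule' heuristic: longest exposure in seconds before stars
    visibly trail; crop=1.5 for APS-C bodies like the Sony A290."""
    return rule / (focal_mm * crop)

print(round(max_exposure_s(75), 1))   # 4.4 -- matches the 4-5 s above
print(round(max_exposure_s(50), 1))   # 6.7 -- close to the 6-8 s estimate
```

Declination and pixel pitch shift the real limit, so treat it as a starting point and check the corners of a test frame.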
If you are going to use stacking software like Deep Sky Stacker, you can take RAW and JPEG pictures simultaneously. Deep Sky Stacker will not work on Linux, and you will have to run Windows inside a virtual machine to use it.
When you have a few nice snapshots of, for example, the Milky Way, you can go back to the computer and stack them using GIMP. We will describe stacking in the last part of this article.

Manual tracking
As soon as you stack a few snapshots, appetite starts growing. Going for a hundred snapshots is not the way forward; the way forward is to increase the exposure time. If you do not have $$$$ to spend on a real telescope with a computerized equatorial mount, you need to look for a cost-effective solution: for example, build a barn door tracker, or you may just have a cheap telescope with an equatorial mount which is only good for taking Moon snapshots. Cheap telescopes come with poor-quality equatorial mounts which are very shaky. The worst thing you could attempt is spending money to stabilize a cheap mount. Attach a weight to its tripod, wrap a few rounds of rope around the legs to tighten it, and that's it. It doesn't look nice, but it does the job.
Placing the camera on the telescope can be done using a proper piggyback mount, or you can make one. I am using ordinary wire ties tightened between the camera and a quick-release head. Don't go too sloppy; you may destroy the camera that way.
Place an eyepiece with higher magnification in, locate a bright star close to the edge of the viewing field, and you are ready to go. My camera goes up to 30 seconds and after that BULB; for BULB I need a remote, so 30 seconds is what I am aiming for. How long can the exposure be? It depends on the equatorial mount setup; with a quick alignment you should be able to pull off one minute. Tolerance for your tracking errors is again a function of focal length; 300mm is likely just a waste of time, with maybe one good frame out of a dozen.
I uploaded four resized frames of the area around M8, if you want to practice before you take your own snapshots. Here are the links:
https://docs.google.com/open?id=0B0cIChfVrJ7WbkJtZjg3LWlGbXM
https://docs.google.com/open?id=0B0cIChfVrJ7WSWF1LUZ5UnBVRTg
https://docs.google.com/open?id=0B0cIChfVrJ7WMzdnRXdmZmJOSDQ
https://docs.google.com/open?id=0B0cIChfVrJ7WNFplbk5kb2ozYUU
We open the first frame and reduce noise, if required, as in the previous tutorial.
If you are using my frames, which are resized, there is no need for noise reduction or hot pixel removal. Hot pixel removal would remove quite a few stars in a resized image.
Since those are longer-exposure captures, we will have hot pixels. To eliminate them we open Filters -> G’MIC -> Enhancement -> Hot pixels filtering and apply it with default values. Now we open the next frame as a layer, select it in Layers, and in Filters we choose Repeat “G’MIC”. Then we set the mode from Normal to Screen, zoom to 100% or more, and align the layers. We repeat the same for the remaining frames. At the end we Merge Visible Layers from the layers context menu (right-click one), accepting the default “expand as necessary” option. If the picture is too bright, which will be the case with the supplied pictures, we do contrast stretching. As we add frames we increase the signal level, and the histogram changes like this:
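Screen mode is why the stack brightens as frames are added: for values normalized to 0..1 it is defined as 1 - (1-a)(1-b), so faint stars present in both frames reinforce each other while the result can never clip past 1. A tiny sketch:

```python
def screen(a, b):
    """'Screen' blend of two layer values in 0..1: always at least as
    bright as either input, but bounded by 1."""
    return 1 - (1 - a) * (1 - b)

print(screen(0.2, 0.2))   # ≈ 0.36 -- two faint frames reinforce
print(screen(0.9, 0.9))   # ≈ 0.99 -- bright areas saturate gently
```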
So, Colors -> Auto -> White Balance and after that Edit -> Fade Levels where we set mode to Multiply. This is what final result should look like:
That would be simple manual stacking of JPEG frames with satisfactory final result. We could also align color levels on this picture but that was not goal of this tutorial.
Beside the already mentioned LCOGT, we can download and process FITS files from many other places: Hubble data from http://hla.stsci.edu/hlaview.html, or data from the Misti Mountain Observatory at http://www.mistisoftware.com/astronomy/index_fits.htm, to name a few.
All that is nice, but the real fun begins when we capture data with our own camera.
Taking pictures without tracking
Any kind of camera with any kind of lens capable of delivering sharp pictures will do. There is another inexpensive piece of equipment which is a must: a tripod. I am using an old Sony A290 DSLR with the SAL75300 telephoto lens.
Now we go out, place the tripod and put the camera on it. Select manual mode, set ISO to 800, maybe 1600, and make the exposure as long as possible but not so long that stars start trailing. A shorter focal length allows longer exposures. How long can the exposure be? That depends on many things: your position on the globe, the declination of the target and so on. With 75mm focal length on a DSLR, which corresponds to 112.5mm on a 35mm SLR, I am happy with 4 to 5 seconds of exposure in Johannesburg, South Africa. If a 50mm lens is available I would go for 6 to 8 seconds. So, select the exposure, go into drive mode, select a three or five shot burst, aim and fire.
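A quick way to estimate that "not too long" exposure is the so-called 500 rule, a common rule of thumb that is not part of this post; the numbers it gives happen to line up with the 4 to 5 seconds quoted above for the Sony A290 (an APS-C camera, crop factor about 1.5):

```python
# "500 rule" estimate for the longest untracked exposure before star
# trailing becomes obvious. A rule of thumb only; treat the result as
# a starting point, not a guarantee.

def max_exposure_seconds(focal_length_mm, crop_factor=1.0, rule=500):
    """Approximate longest fixed-tripod exposure in seconds."""
    equivalent_focal = focal_length_mm * crop_factor
    return rule / equivalent_focal

# 75 mm on an APS-C body (~112.5 mm equivalent):
print(round(max_exposure_seconds(75, crop_factor=1.5), 1))  # ~4.4 s
```

A 50 mm lens on the same body gives roughly 6.7 seconds, consistent with the 6 to 8 second range above.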
If you are going to use stacking software like Deep Sky Stacker, you can take RAW and JPEG pictures simultaneously. Deep Sky Stacker does not work on Linux, so you would have to run Windows inside a virtual machine to use it.
When you have a few nice snapshots of, for example, the Milky Way, you can go back to the computer and stack them using GIMP. Stacking is described in the last part of this article.
Manual tracking
As soon as you stack a few snapshots your appetite will start growing. Going for a hundred snapshots is not the way forward; the way forward is to increase exposure time. If you do not have $$$$ to spend on a real telescope with a computerised equatorial mount, you need a cost-effective solution: build a barn door tracker, or you may already have a cheap telescope with an equatorial mount which is only good for taking Moon snapshots. Cheap telescopes come with poor quality equatorial mounts which are very shaky. The worst thing you can do is spend money trying to stabilise a cheap mount. Attach a weight to its tripod, wrap a few rounds of rope around the legs to tighten it, and that's it. It doesn't look nice, but it does the job.
Placing the camera on the telescope can be done with a proper piggyback mount, or you can make one. I am using ordinary wire ties tightened between the camera and a quick release head. Don't be too sloppy, you may destroy the camera that way.
Place an eyepiece with higher magnification in the telescope, locate a bright star close to the edge of the viewing field, and you are ready to go. My camera goes up to 30 seconds and after that BULB; for BULB I need a remote, so 30 seconds is what I am aiming for. How long can the exposure be? It depends on the equatorial mount setup; with a quick alignment you should be able to pull off one minute. Tolerance for tracking errors is again a function of focal length; 300mm is likely just a waste of time, with maybe one good frame out of a dozen.
Stacking frames in GIMP
I uploaded four re-sized frames of the area around M 8 if you want to practice before taking your own snapshots. Here are the links:
https://docs.google.com/open?id=0B0cIChfVrJ7WbkJtZjg3LWlGbXM
https://docs.google.com/open?id=0B0cIChfVrJ7WSWF1LUZ5UnBVRTg
https://docs.google.com/open?id=0B0cIChfVrJ7WMzdnRXdmZmJOSDQ
https://docs.google.com/open?id=0B0cIChfVrJ7WNFplbk5kb2ozYUU
We open the first frame and, if required, reduce noise as in the previous tutorial.
If you are using my frames, which are re-sized, there is no need for noise reduction or hot pixel removal; on a re-sized image, hot pixel removal would remove quite a few stars.
Since these are longer exposures we will have hot pixels. To eliminate them we open Filters -> G'MIC -> Enhancement -> Hot pixels filtering and apply it with default values. Then we open the next frame as a layer, select it in the Layers dialog, and in Filters choose Repeat "G'MIC". We set the layer mode from Normal to Screen, zoom to 100% or more, and align the layers. We repeat the same for the remaining frames. At the end we Merge Visible Layers from the layers context menu (right click a layer), accepting the default "expand as necessary" option. If the picture is too bright, which will be the case with the supplied pictures, we do contrast stretching. As we add frames we increase the signal level, and the histogram changes like this:
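The Screen layer mode used for stacking brightens wherever frames agree on a star while leaving the dark background dark. A minimal numerical sketch with hypothetical 8-bit frames, using the formula commonly documented for Screen mode, screen(a, b) = 1 - (1 - a)(1 - b) on values normalized to [0, 1]:

```python
import numpy as np

def screen(a, b):
    """Screen blend of two 8-bit frames: result is always >= both inputs."""
    af = a.astype(np.float64) / 255.0
    bf = b.astype(np.float64) / 255.0
    out = 1.0 - (1.0 - af) * (1.0 - bf)
    return np.clip(out * 255.0, 0, 255).astype(np.uint8)

# Two tiny hypothetical frames: faint sky, a star, black, mid-grey.
frame1 = np.array([[10, 200], [0, 128]], dtype=np.uint8)
frame2 = np.array([[12, 180], [0, 128]], dtype=np.uint8)

stacked = screen(frame1, frame2)
print(stacked)
```

Note that pure black stays black, which is why layer alignment errors show up as doubled stars rather than a brighter background.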
So, Colors -> Auto -> White Balance and after that Edit -> Fade Levels, where we set the mode to Multiply. This is what the final result should look like:
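Roughly speaking, fading with Multiply takes a per-pixel product of the stretched result and the image it is faded against, pulling the brightened background back down. A sketch with hypothetical values (normalized 8-bit arithmetic):

```python
import numpy as np

def multiply(a, b):
    """Multiply blend of two 8-bit images: result is always <= both inputs."""
    af = a.astype(np.float64) / 255.0
    bf = b.astype(np.float64) / 255.0
    return (af * bf * 255.0).astype(np.uint8)

# Hypothetical pixels: a bright stretched star next to washed-out sky.
stretched = np.array([[240, 60]], dtype=np.uint8)
original = np.array([[200, 60]], dtype=np.uint8)

faded = multiply(stretched, original)
print(faded)
```

Bright pixels backed by bright originals survive, while the washed-out sky (60 in both) drops sharply, which is exactly the effect wanted here.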
That is simple manual stacking of JPEG frames with a satisfactory final result. We could also align the color levels on this picture, but that was not the goal of this tutorial.
Sunday, November 18, 2012
GIMP, Fiji and astrophotography on Linux
Ever wanted to know how to process those wonderful Hubble-like pictures on Linux? It is not difficult; I will show you how. We will do RGB processing, and LRGB is very similar.
On the Web we encounter plenty of tutorials where the author uses some proprietary tool for contrast stretching and then Photoshop for the final processing. Usually neither of those is available on Linux.
The replacement for Photoshop is a no-brainer: naturally, it is the popular GIMP. The tool for contrast stretching is trickier; for nonlinear stretching I am using Fiji, which is almost the same as ImageJ. I guess there is no distribution which doesn't offer GIMP, so just follow the usual install path for your distro. It is a good idea to install gimp-plugin-registry as well. Fiji you will have to download from http://fiji.sc/wiki/index.php/Downloads and untar it. It comes with or without a Java runtime, so pick the variant you need depending on whether you already have Java.
Now that the required programs are installed, we need data. Typically the data consists of three or more grayscale images in FITS format; FITS is an abbreviation for Flexible Image Transport System. A good source of FITS data is the Las Cumbres Observatory Global Telescope Network, and here is their website: http://lcogt.net/ They have two two-meter reflectors and a few smaller telescopes, and the observation data is freely available under a Creative Commons license. If you get data from a 2m telescope you will end up with three 2008x2008 pixel images, about eight megabytes each. So go there and pick some galaxy from the Observations section; I will go for NGC6946, it looks nice. After downloading the Blue, Green and Red FITS files we can start.
And we immediately see why we need contrast stretching: barely a few stars are visible. Now from the menu we do Process -> Enhance Contrast, tick "Equalize histogram" and hit the OK button. The result looks like this:
There is much more to see, but also a huge amount of noise. We repeat the same story for the remaining files and save them as TIFF. If we want, we can go to Image -> Color -> Merge Channels and create a composite to see approximately how the result will look.
It is nice, but there is too much noise; time for GIMP.
We open all three TIFFs in GIMP and do Filters -> Enhance -> Wavelet Denoise with the settings as on the picture. If you don't have gimp-plugin-registry installed there will be no Wavelet Denoise; in that case just apply Despeckle a few times.
We do the same on the remaining two pictures by pressing Ctrl+F, which repeats the last filter. The next step is Image -> Mode -> RGB followed by Colors -> Colorify, where we apply each channel's actual color.
Now we copy the green frame and paste it as a layer over the red one, rename the Pasted Layer to something meaningful and change the layer mode to Screen; we do the same with the blue one.
If the alignment is OK we can merge the layers.
Now we can play with curves and levels, or do a decomposition to enhance the colors and so on; get imaginative here. Here is how it looks without any additional processing.
If the frames are properly aligned we could place them as layers in a single image and do Colors -> Components -> Compose, which is simpler than doing Mode and Colorify.
If we have LRGB, we process the RGB as above. The L frame we stretch and denoise, and at the end we use it as the value layer with the RGB as the color layer.