Wednesday, November 28, 2012

Image upload via RESTful web service

During one project I had to write an Android client for a RESTful web service where, among other things, the goal was image upload. The server side was handled by Zaries http://zaries.wordpress.com/ so it was Lisp on Hunchentoot. I didn't want to start with two unknowns, implementing the problematic multipart/form-data on Android and writing a Lisp web service from scratch, so I went searching for a simple Java example of an upload RESTful web service. I expected something for NetBeans, but the closest I managed to find was the Jersey hello world example http://www.mkyong.com/webservices/jax-rs/jersey-hello-world-example/ done using Maven. So here I am, writing a user friendly tutorial for user friendly NetBeans, aimed at the millions of prospective Android developers who are not quite comfortable with Java EE development yet. The Android client will be a separate article.
This was done on LinuxMint Maya, the IDE is NetBeans 7.2 and the deployment target is Tomcat 7.0.27.
We start from the New Project dialog where, from the Java Web group, we select Web Application. We accept the defaults in all steps except Server and Settings, where we want to target Apache Tomcat. If we downloaded the bundle, GlassFish is the default target.




Now we want to add a RESTful web service. We right-click on the project icon in the left pane and select New -> RESTful Web Services from Patterns ...
Again we accept the default values with the exception of the package name, where we type in za.org.droid. Before we start coding we add two libraries, Jersey 1.8 and JAX-WS 2.2.6. Inside the Projects pane we right-click on the Libraries folder and select Add Library ...


Now we can delete the useless GET and PUT Hello World methods generated by the IDE and copy and paste this:

// Jersey 1.x multipart support: @FormDataParam comes from the jersey-multipart jar
// (com.sun.jersey.multipart), FormDataContentDisposition from jersey-core.
@POST
@Path("/images")
@Consumes(MediaType.MULTIPART_FORM_DATA)
public Response imageUpload(@FormDataParam("image") InputStream hereIsImage,
        @FormDataParam("image") FormDataContentDisposition hereIsName) {
    // Uploaded files end up in $HOME/tmp/, which has to exist beforehand.
    String path = System.getenv("HOME") + "/tmp/";
    if (hereIsName == null || hereIsName.getSize() == 0) {
        return Response.status(500).entity("image parameter is missing").build();
    }
    String name = hereIsName.getFileName();
    path += name;

    OutputStream out = null;
    try {
        out = new FileOutputStream(new File(path));
        byte[] bytes = new byte[1024];
        int read;
        // Copy the request body to disk in 1 KB chunks.
        while ((read = hereIsImage.read(bytes)) != -1) {
            out.write(bytes, 0, read);
        }
        out.flush();
    } catch (IOException e) {
        return Response.status(500).entity(name + " was not uploaded\n" + e.getMessage()).build();
    } finally {
        if (out != null) {
            try {
                out.close();
            } catch (IOException ignored) {
            }
        }
    }
    return Response.status(200).entity(name + " was uploaded").build();
}


We should create a tmp folder inside our $HOME folder where the images will be saved. We look for the image form parameter; its content disposition tells us what the image is called, and the input stream carries the raw image data. We return a response informing the client how successful the upload attempt was.
Since we accepted the default names for the RESTful web service it will have the "generic" path assigned (that is the class-level @Path on the generated resource class); that is important because we use that path to call it.
All that is left is the configuration. In the WEB-INF folder we create a web.xml file and paste the following in:
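The original post showed the descriptor as a screenshot; here is a minimal sketch of what it could look like for Jersey 1.x, assuming the standard com.sun.jersey.spi.container.servlet.ServletContainer mapped to /xyz/* so it matches the end-point used below:

<?xml version="1.0" encoding="UTF-8"?>
<web-app version="3.0" xmlns="http://java.sun.com/xml/ns/javaee"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_3_0.xsd">
    <servlet>
        <servlet-name>ServletAdaptor</servlet-name>
        <servlet-class>com.sun.jersey.spi.container.servlet.ServletContainer</servlet-class>
        <!-- tell Jersey where to look for resource classes -->
        <init-param>
            <param-name>com.sun.jersey.config.property.packages</param-name>
            <param-value>za.org.droid</param-value>
        </init-param>
        <load-on-startup>1</load-on-startup>
    </servlet>
    <servlet-mapping>
        <servlet-name>ServletAdaptor</servlet-name>
        <url-pattern>/xyz/*</url-pattern>
    </servlet-mapping>
</web-app>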




We can save the xml and deploy the web application. The end-point to call will be http://localhost:8080/WebApplication3/xyz/generic/images; the application name WebApplication3 may be different, so please change it accordingly. To test the upload one can use HttpClient; httpcomponents-client-4.2.1 contains a working example. I will add a blog post about the Android client in a day or two.
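For the impatient, here is a minimal test client sketch, assuming httpclient and httpmime 4.2.x are on the classpath and a test image sits at /tmp/test.jpg (both of those are my assumptions, adjust as needed):

import java.io.File;
import org.apache.http.HttpResponse;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.entity.mime.MultipartEntity;
import org.apache.http.entity.mime.content.FileBody;
import org.apache.http.impl.client.DefaultHttpClient;
import org.apache.http.util.EntityUtils;

public class UploadTest {
    public static void main(String[] args) throws Exception {
        HttpClient client = new DefaultHttpClient();
        try {
            HttpPost post = new HttpPost("http://localhost:8080/WebApplication3/xyz/generic/images");
            MultipartEntity entity = new MultipartEntity();
            // "image" must match the @FormDataParam name on the server side
            entity.addPart("image", new FileBody(new File("/tmp/test.jpg"), "image/jpeg"));
            post.setEntity(entity);
            HttpResponse response = client.execute(post);
            System.out.println(EntityUtils.toString(response.getEntity()));
        } finally {
            client.getConnectionManager().shutdown();
        }
    }
}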

Sunday, November 25, 2012

Final part of Android tutorial

Here we are going to take a look at ContentProvider, subclassing SimpleCursorAdapter, ListView and the assembly of all that into an almost usable application.

ContentProvider


The most irritating concept of the whole Android platform. It is a kind of grand unified interface for accessing all data publicly available on the system. As we might expect from an Internet search oriented company, there is a Uri which starts with "content://", then we have an "authority" and a "base path", like this:

private static final String AUTHORITY = "za.org.droidika.tutorial.SearchResultProvider";
private static final String TWEETS_BASE_PATH = "tweets";
public static final Uri CONTENT_URI = Uri.parse("content://" + AUTHORITY
        + "/" + TWEETS_BASE_PATH);


That is not all; we also differentiate between operations on a single item and operations on the whole collection:

public static final int TWEETS = 100;
public static final int TWEET_ID = 110;
public static final String CONTENT_ITEM_TYPE = ContentResolver.CURSOR_ITEM_BASE_TYPE
        + "vnd.org.droidika.tutorial/tweets";
public static final String CONTENT_TYPE = ContentResolver.CURSOR_DIR_BASE_TYPE
        + "vnd.org.droidika.tutorial/tweets";


Further on in the source code (available from https://github.com/FBulovic/grumpyoldprogrammer) we combine our identifiers and load them into a UriMatcher, and we add the DB column mappings to a HashMap. That, with the help of SQLiteOpenHelper, is enough configuration and we can finally implement the CRUD methods. I will explain bulk insert and you can work out what is going on in the others on your own.
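Just to make that concrete, here is a hedged sketch of the UriMatcher setup described above; the exact code lives in the linked repository:

private static final UriMatcher sURIMatcher = new UriMatcher(UriMatcher.NO_MATCH);
static {
    // collection Uri: content://<authority>/tweets
    sURIMatcher.addURI(AUTHORITY, TWEETS_BASE_PATH, TWEETS);
    // single item Uri: content://<authority>/tweets/<row id>
    sURIMatcher.addURI(AUTHORITY, TWEETS_BASE_PATH + "/#", TWEET_ID);
}

And here is the bulk insert implementation: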

public int bulkInsert(Uri uri, ContentValues[] values) {
    final SQLiteDatabase db = dbInst.getWritableDatabase();
    final int match = sURIMatcher.match(uri);
    switch(match){
    case TWEETS:
        int numInserted= 0;
        db.beginTransaction();
        try {
            SQLiteStatement insert =
                db.compileStatement("insert into " + DbHelper.TABLE_NAME
                        + "(" + DbHelper.USER + "," + DbHelper.DATE
                        + "," + DbHelper.TEXT + ")"
                        +" values " + "(?,?,?)");
            for (ContentValues value : values){
                insert.bindString(1, value.getAsString(DbHelper.USER));
                insert.bindString(2, value.getAsString(DbHelper.DATE));
                insert.bindString(3, value.getAsString(DbHelper.TEXT));
                insert.execute();
            }
            db.setTransactionSuccessful();
            numInserted = values.length;
        } finally {
            db.endTransaction();
        }
        getContext().getContentResolver().notifyChange(uri, null);
        return numInserted;
    default:
        throw new UnsupportedOperationException("unsupported uri: " + uri);
    }
}


We obtain a writable instance of the DB and ask the UriMatcher whether we have the right Uri; we do not want to attempt inserting wrong data or do a bulk insert against a single-row Uri. Next we compile the insert statement and open a transaction. Inside the for loop we bind the parameters and execute the inserts. If everything went well we mark the transaction successful and end it inside finally. At the end we ask the ContentResolver to notify subscribers about the new situation.
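For context, here is a hedged usage sketch of how a caller, for example the background Twitter search, could push a batch of rows through the resolver; the Tweet model class and its getters are hypothetical, used only for illustration:

ContentValues[] rows = new ContentValues[tweets.size()];
for (int i = 0; i < tweets.size(); i++) {
    Tweet tweet = tweets.get(i);              // hypothetical model class
    ContentValues row = new ContentValues();
    row.put(DbHelper.USER, tweet.getUser());  // hypothetical getters
    row.put(DbHelper.DATE, tweet.getDate());
    row.put(DbHelper.TEXT, tweet.getText());
    rows[i] = row;
}
getContentResolver().bulkInsert(SearchResultProvider.CONTENT_URI, rows);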
We do not pay any attention to resource management, we do not manage cursors, nothing. Suddenly SQLite is friendly and cooperative. That is probably the main reason, besides the ability to share data across application boundaries, why we should use providers. Naturally there is more, but it happens not in code but inside AndroidManifest.xml.
Within the application element we place this:

    android:name=".SearchResultProvider"
    android:authorities="za.org.droidika.tutorial.SearchResultProvider" />


Now ContentResolver knows how to find our provider.


Subclassing SimpleCursorAdapter



Since the whole idea behind the application was to implement something like autocompletion, where we type and the ListView content changes while we are typing, we need a custom data adapter. There is only one interesting method to implement:

public Cursor runQuery(CharSequence constraint) {
    // Empty constraint: return everything; otherwise run a LIKE query.
    if (constraint == null || constraint.length() == 0) {
        c = contentResolver.query(SearchResultProvider.CONTENT_URI, null, null, null, null);
    } else {
        // Pass the pattern as a selection argument instead of concatenating it into the SQL.
        c = contentResolver.query(SearchResultProvider.CONTENT_URI, null,
                DbHelper.TEXT + " like ?",
                new String[] { "%" + constraint.toString() + "%" }, null);
    }
    if (c != null) {
        c.moveToFirst();
    }
    return c;
}


If we have a search string we do a like query and return the cursor, and if we do not have a search string we return everything. Again we do not manage the cursor, we just leave everything to Android and it behaves really friendly.
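To tie it together, here is a hedged sketch of the class declaration implied by the above: the adapter extends SimpleCursorAdapter and implements FilterQueryProvider itself, so it can later be registered as its own filter query provider (the constructor matches the call in buildList below):

public class SearchableCursorAdapter extends SimpleCursorAdapter implements FilterQueryProvider {

    private final ContentResolver contentResolver;
    private Cursor c;

    public SearchableCursorAdapter(Context context, int layout, Cursor cursor,
            String[] from, int[] to) {
        super(context, layout, cursor, from, to);
        contentResolver = context.getContentResolver();
    }

    // runQuery(CharSequence) from above goes here
}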

Assembling application


The user interface is influenced by the SearchableDictionary sample from the Android SDK. We type our search string into the EditText at the top of the screen, and the data adapter and provider load the results into the ListView. In order to retrieve data we start a "cron job" which does a Twitter search every three minutes and stores the data into the DB. MainActivity contains an example of how to create and use a menu and how to check whether the network is available, and the only nontrivial method there is this one:

private void buildList() {
    String[] columns = new String[] { DbHelper.DATE, DbHelper.TEXT,
            DbHelper.USER };
    int[] to = new int[] { R.id.textView2, R.id.textView4, R.id.textView6 };
    Cursor cursor = createCursor();
    final SearchableCursorAdapter dataAdapter = new SearchableCursorAdapter(this, R.layout.list_entry,
            cursor, columns, to);
    ListView listView = (ListView) findViewById(R.id.missingList);
    listView.setAdapter(dataAdapter);
    EditText textFilter = (EditText) findViewById(R.id.myFilter);
    textFilter.addTextChangedListener(new TextWatcher() {

        public void afterTextChanged(Editable s) {
        }

        public void beforeTextChanged(CharSequence s, int start, int count,
                int after) {
        }

        public void onTextChanged(CharSequence s, int start, int before,
                int count) {
            if (dataAdapter != null) {
                    dataAdapter.getFilter().filter(s.toString());
            }
        }
    });
    dataAdapter.setFilterQueryProvider(dataAdapter);
}


We set up the ListView, create the data adapter, assign it and use a TextWatcher to run queries against the content provider. Not very complicated.
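One thing this post does not show is how the periodic Twitter search is wired up. A common way to schedule such a job on Android, and this is an assumption of mine rather than necessarily what the repository does, is AlarmManager driving a service (TweetFetchService is a hypothetical name):

AlarmManager alarmManager = (AlarmManager) getSystemService(Context.ALARM_SERVICE);
PendingIntent pending = PendingIntent.getService(this, 0,
        new Intent(this, TweetFetchService.class), 0);
// fire roughly every three minutes, starting now
alarmManager.setRepeating(AlarmManager.ELAPSED_REALTIME,
        SystemClock.elapsedRealtime(), 3 * 60 * 1000, pending);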
Again, visit the repository https://github.com/FBulovic/grumpyoldprogrammer, retrieve the code and you have a quite comprehensive example, written in such a way that it is easy to understand. If you are going to use it in production, configure HttpClient properly; how that is done is described here http://grumpyoldprogrammer.blogspot.com/2012/10/is-it-safe.html

Saturday, November 24, 2012

Install Oracle JDK in LinuxMint Maya or Ubuntu 12.04 LTS

This primarily describes the setup required for Android development on 64 bit LinuxMint Maya, which is very much the same as Ubuntu 12.04 but with a usable window manager. For those two popular distros we have OpenJDK in the repository and we can easily install it using apt-get from the terminal or the GUI Software Manager. But for Android development only the Oracle JDK is supported and the Android SDK is 32 bit, which implies:

sudo apt-get install ia32-libs

Otherwise we will get a confusing error message that, for example, adb was not found when we attempt to run it.
The current version of the Oracle JDK can be downloaded from http://www.oracle.com/technetwork/java/javase/downloads/index.html
For example, we select jdk-7u7-linux-x64.tar.gz, accept the license and download it using Firefox. When the download finishes we typically check its checksum, and that is done from the terminal, so cd to Downloads and run:

$ md5sum jdk-7u7-linux-x64.tar.gz
15f4b80901111f002894c33a3d78124c  jdk-7u7-linux-x64.tar.gz


Here I do a Google search on the md5 sum to be sure that I downloaded the right archive. Then we unpack the archive by simply right-clicking on it and selecting Extract Here. That creates the directory jdk1.7.0_07 in Downloads. The JDK should live in /usr/lib/jvm unless we want to specify the execution path every time; for example, this is how it looks on my box:

/usr/lib/jvm $ ls -l
total 20
lrwxrwxrwx 1 root root   24 Oct 31 15:39 default-java -> java-1.6.0-openjdk-amd64
lrwxrwxrwx 1 root root   24 Oct 31 15:39 java-1.6.0-openjdk -> java-1.6.0-openjdk-amd64
lrwxrwxrwx 1 root root   20 Oct 31 15:39 java-1.6.0-openjdk-amd64 -> java-6-openjdk-amd64
lrwxrwxrwx 1 root root   24 Oct 31 15:39 java-6-openjdk -> java-1.6.0-openjdk-amd64
drwxr-xr-x 7 root root 4096 Nov  4 00:00 java-6-openjdk-amd64
drwxr-xr-x 3 root root 4096 May  3  2012 java-6-openjdk-common
lrwxrwxrwx 1 root root   24 Nov  1 11:26 java-6-oracle -> /usr/lib/jvm/jdk1.6.0_37
drwxr-xr-x 5 root root 4096 May  3  2012 java-7-openjdk-amd64
lrwxrwxrwx 1 root root   24 Oct 31 16:52 java-7-oracle -> /usr/lib/jvm/jdk1.7.0_07
drwxr-xr-x 8 root root 4096 Nov  1 11:22 jdk1.6.0_37
drwxr-xr-x 8 root root 4096 Aug 29 03:12 jdk1.7.0_07


In order to move jdk1.7.0_07 out of Downloads we can use

sudo mv jdk1.7.0_07 /usr/lib/jvm/

We run that from a terminal in Downloads, or we can start caja or the GNOME file manager as root and do it from the GUI. If we are in the GUI we recursively change the ownership to root using Properties, and if we are doing it from the terminal:

sudo chown -R root:root /usr/lib/jvm/jdk1.7.0_07

Now we need a symlink, which we will use later to switch between different versions of Java:

sudo ln -s /usr/lib/jvm/jdk1.7.0_07 /usr/lib/jvm/java-7-oracle

Now we can register the runtime and compiler with update-alternatives:

sudo update-alternatives --install /usr/bin/java java /usr/lib/jvm/java-7-oracle/jre/bin/java 2
sudo update-alternatives --install /usr/bin/javac javac /usr/lib/jvm/java-7-oracle/bin/javac 1


That allows us to select the runtime and compiler respectively using the following two commands:

sudo update-alternatives --config java
sudo update-alternatives --config javac


We simply type the number of the desired version and hit Enter. To check what we are actually running we can execute:

javac -version
java -version

Friday, November 23, 2012

More about stacking

In the last tutorial we stacked a few frames using GIMP after manually aligning them. Since we were setting the layers to Screen mode, the end result was quite bright. What we can do to make it look more natural is to make the additional layers progressively transparent: opacity of the bottom layer 100%, the next 50%, the next 25% and so on.

Automated stacking


Here Windows users have a really wide choice of free or paid programs, for example Deep Sky Stacker and RegiStax. If there are a dozen or more frames to stack it pays to start KVM or VBox and struggle with Windows for a few minutes in order to use Deep Sky Stacker. There is no real equivalent which will do the same task on Linux, that is register, align and stack frames using a GUI. For a smaller number of frames we can use ale - the Anti-Lamenessing Engine written by David Hilvert. Typically it is not compiled with ImageMagick support enabled, so we have to prepare the images to be processed and convert them into ppm format. We could install ImageMagick and use mogrify and convert to convert the photos from the terminal, but in order to make things easier we will use a GUI.
If we are using a DSLR camera or some of those new compact cameras we may be able to store images in RAW format. That allows us to do a significant amount of processing on such an image. There are a few really good programs for RAW processing, like UFRaw, RawTherapee or Darktable. We are going to use the last one, Darktable. After installation we right-click on a RAW file and select Open With Darktable from the context menu. The user interface is unconventional, so here is a quick explanation. We want our RAW to be converted to ppm and we want to remove hot pixels.
Proper removal of hot pixels would be taking "dark frames" at the end of the session. That is, place the cap on the lens and take pictures using the same ISO value and exposure time as the data frames. After that we can add the dark frame to the data frame as a layer in GIMP and subtract it to remove hot pixels.
When Darktable shows up we will see in the right pane a tab which says more plugins. Clicking on it opens it and we select the hot pixels plugin. Clicking on it again closes it, and under the correct tab is the hot pixels plugin, which is "switched off". We "switch it on" and it removes the hot pixels.
We are happy with all the default processing so far. Now we want to export the image to ppm. For that we hit the "l" key, which brings us to lighttable mode. In the right pane we locate export selected and make it look like this:

Export can be done via the export button or the keyboard shortcut Ctrl+e. To go back to darkroom mode we hit "d". We repeat the process on all frames. If we added some exposure, an EV or maybe two, it is likely that we have generated noise. Also, if we do not want a full size picture we may want to resize it; ale will finish processing much faster. In GIMP we do Filters -> Enhance -> Wavelet Denoise, and here is how it looks after and before denoising:

If we want to resize, that is Image -> Scale Image, and then we export the image back to the original ppm.
Now we can open a terminal and cd to the folder with the ppms. This is what I did and what the output in the terminal was:


ale --md 64 --dchain fine:box:1 *.ppm stacked.ppm
Output file will be 'stacked.ppm'.
Original Frame: 'img_0001_01.ppm'. 
Supplemental Frames: 
'img_0001_02.ppm'***** okay (88.869874% match). 
'img_0001_03.ppm'***** okay (90.897255% match). 
'img_0001.ppm'***** okay (91.984785% match). 
Iterating Irani-Peleg. 
Average match: 90.583972% 


Here is the explanation: --md sets the element minimum dimension (100 is the default), --dchain fine:box:1 approximates drizzling, and the two remaining parameters are the input files and the output file. To get more info execute ale --hA.
The result was rather cold and dark. In order to bring in some warmth and dynamics I opened stacked.ppm in GIMP and did Colors -> Components -> Decompose, where LAB is the desired option in the drop-down. Then for each layer I made only the one I was currently working on visible and the others invisible, clicking on the eye in the layers docking window. After duplicating the layer and setting the copy to Overlay mode, I merged the visible layers accepting the default option expand as necessary. That was repeated for the L, A and B layers.


After that, Colors -> Components -> Compose, again selecting LAB. At the end, Colors -> Auto -> White Balance and Edit -> Fade Levels with the default Replace mode and Opacity 33. Here is the result:

Those are the same four frames from the last tutorial; if you do not have your own to process, download them, convert them to ppm and you can try ale stacking and post-processing on them.

Wednesday, November 21, 2012

Even More Astrophotography

Everything described in the previous tutorial should work on Windows as well. Fiji uses Java and works everywhere as advertised, and a GIMP installer for Windows certainly exists. How to install the plugin registry on Windows I really do not know; Google is your friend.

We can download and process FITS files from many places besides the already mentioned LCOGT; we can use Hubble data from http://hla.stsci.edu/hlaview.html or maybe data from Misti Mountain Observatory http://www.mistisoftware.com/astronomy/index_fits.htm, to name a few.

All that is nice, but the real fun begins when we capture our own data using our own camera.

Taking pictures without tracking


Any kind of camera with any kind of lens capable of delivering sharp pictures will do. There is another inexpensive piece of equipment which is a must - a tripod. I am using an old Sony A290 DSLR with a SAL75300 telephoto lens.
Now we go out, place the tripod and put the camera on it. Select manual mode, adjust ISO to 800, maybe 1600, and make the exposure as long as possible but not too long, to avoid star trailing. A shorter focal length will allow longer exposures. How long can the exposure be? That depends on many things: your position on the globe, the declination of the target and so on. With a 75mm focal length on a DSLR, which corresponds to 112.5mm on a 35mm SLR, I am happy with 4 to 5 seconds of exposure in Johannesburg, South Africa. If a 50mm lens is available I would go for 6 to 8 seconds of exposure. So, select the exposure, go into drive mode, select a three or five shot burst, aim and fire.
If you are going to use some stacking software like Deep Sky Stacker you can take RAW and JPEG pictures simultaneously. Deep Sky Stacker will not work on Linux and you will have to run Windows inside a virtual machine to use it.
When you have a few nice snapshots of, for example, the Milky Way you can go back to the computer and stack them using GIMP. We will describe stacking in the last part of the article.

Manual tracking


As soon as you stack a few snapshots your appetite will start growing. Going for a hundred snapshots is not the way forward. The way forward is to increase the exposure time. If you do not have $$$$ to spend on a real telescope with a computerised equatorial mount, you need to look for a cost-effective solution. For example, build a barn door tracker, or you may just have a cheap telescope with an equatorial mount which is only good for taking Moon snapshots. Cheap telescopes come with poor quality equatorial mounts which are very shaky. The worst thing you can attempt is spending money to stabilise a cheap mount. Attach a weight to its tripod, a few rounds of rope around the legs to tighten it and that's it. It doesn't look nice but it does the job.

Placing the camera on the telescope can be done using a proper piggyback mount, or you can make one. I am using ordinary wire ties tightened between the camera and a quick release head. Don't go too sloppy, you may destroy the camera that way.
Place an eyepiece with higher magnification in, locate a bright star close to the edge of the viewing field and you are ready to go. My camera goes up to 30 seconds and after that BULB; for BULB I need a remote, so 30 seconds is what I am aiming for. How long can the exposure be? It depends on the equatorial mount setup; with a quick alignment you should be able to pull off one minute. Tolerance for your tracking errors is again a function of focal length; 300mm is likely just a waste of time, with maybe one good frame out of a dozen.

Stacking frames in GIMP


I uploaded four re-sized frames of the area around M 8 if you want to practice before you take your own snapshots. Here are the links:

https://docs.google.com/open?id=0B0cIChfVrJ7WbkJtZjg3LWlGbXM
https://docs.google.com/open?id=0B0cIChfVrJ7WSWF1LUZ5UnBVRTg
https://docs.google.com/open?id=0B0cIChfVrJ7WMzdnRXdmZmJOSDQ
https://docs.google.com/open?id=0B0cIChfVrJ7WNFplbk5kb2ozYUU

We open the first frame and possibly reduce noise, if required, as in the previous tutorial.
If you are using my frames, which are re-sized, there is no need for noise reduction or hot pixel removal. Hot pixel removal would remove quite a few stars on a re-sized image.
Since these are longer exposure captures we will have hot pixels. To eliminate them we open Filters -> G'MIC -> Enhancement -> Hot pixels filtering and apply it with the default values. Now we open the next frame as a layer, select it in layers, and in Filters we do Repeat "G'MIC". Then we set the mode from Normal to Screen, zoom to 100% or more and align the layers. We repeat the same for the remaining frames. At the end we do Merge Visible Layers from the layers context menu (right-click a layer), accepting the default expand as necessary option. If the picture is too bright, which would be the case with the supplied pictures, we do contrast stretching. As we add frames we increase the signal level and the histogram changes like this:


So, Colors -> Auto -> White Balance and after that Edit -> Fade Levels, where we set the mode to Multiply. This is what the final result should look like:

That would be simple manual stacking of JPEG frames with a satisfactory final result. We could also align the color levels on this picture, but that was not the goal of this tutorial.

Sunday, November 18, 2012

GIMP, Fiji and astrophotography on Linux


Ever wanted to know how to process those wonderful Hubble-like pictures on Linux? It is not difficult, I will show you how. We will do RGB processing; LRGB is very similar.
On the Web we will encounter plenty of tutorials where the author uses some proprietary tool to do contrast stretching and after that Photoshop to do the final processing. Usually neither of those is available on Linux. The replacement for Photoshop is a no-brainer, it is naturally the popular GIMP. The tool for contrast stretching is trickier. For nonlinear stretching I am using Fiji, which is almost the same as ImageJ. I guess there is no distribution which doesn't offer GIMP, so just follow the usual install path for your distro. It is a good idea to install gimp-plugin-registry as well. Fiji you will have to download from http://fiji.sc/wiki/index.php/Downloads and untar it. It comes with or without a Java runtime, so we pick what we need depending on whether we already have Java or not.
Now that we have installed the required programs we need data. Typically the data consists of three or more grayscale images in FITS format. FITS is an abbreviation for Flexible Image Transport System. A good source of FITS data is the Las Cumbres Observatory Global Telescope Network and here is their website http://lcogt.net/
They have two two-meter reflectors and a few smaller telescopes, and the observation data is freely available under a Creative Commons license. If you get data from a 2m telescope you will end up with three 2008x2008 pixel images, about eight megabytes each. So go there and pick some galaxy from the Observations section; I will go for NGC6946, it looks nice. After downloading the Blue, Green and Red FITS files we can start.

We open one of the FITS files in Fiji and immediately see why we need contrast stretching: there are barely a few stars visible. Now from the menu we do Process -> Enhance Contrast, tick Equalize histogram and hit the OK button. The result looks like this:

There is much more to see but also a huge amount of noise. We repeat the same story for the remaining files and save them as Tiff. If we want, we can go to Image -> Color -> Merge Channels and create a composite to see approximately how it will look.

It is nice, but there is too much noise; time for GIMP.
We open all three tifs in GIMP and do Filters -> Enhance -> Wavelet Denoise with settings like on the picture. If you don't have gimp-plugin-registry installed there will be no Wavelet Denoise, so just do Despeckle a few times.

We do the same on the remaining two pictures using Ctrl+F to repeat the last filter. The next step is Image -> Mode -> RGB followed by Colors -> Colorify, where we apply the actual filter color to each image.
Now we copy the green one and paste it as a layer over the red, rename Pasted Layer to something and change the layer mode to Screen; we do the same with the blue one.

If the alignment is OK we can merge the layers.
Now we can play with curves, levels or do decomposition to enhance colors and so on; get imaginative here. Here is how it looks without any additional processing.

If the frames are properly aligned we could also place them as layers into a single image and do Colors -> Components -> Compose, which is simpler than doing Mode and Colorify.
If we have LRGB, then we process RGB as above. The L frame we stretch and denoise, and at the end we use it as the value layer and the RGB as the color layer.