One Summer for AstroPy

A GSoC 2013 blog for AstroPy

PSF Photometry GSoC Finish

This is the last official week before the “soft” deadline. As I’ve already documented the largest part of my code, I’m still working on new functionality. During the last days I worked on extending the PSF photometry to actual PRFs, which allows fitting of fluxes with subpixel accuracy. Furthermore, I have finished a GaussianPSF class. Again, as a first test I’ve used some Spitzer data; the resulting histogram, compared to the Spitzer catalog, looks as follows:

The result is already pretty close to the catalog reference. I estimated the PRF from ~10,000 sources and then fitted it again to a selection of 500 sources. Currently I’m working on an IPython Notebook to demonstrate how to use the photometry functions.
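The core of flux fitting with a fixed PSF position boils down to a simple least-squares problem. Here is a minimal sketch of the idea (the function names are illustrative, not the photutils API):

```python
import numpy as np

def gaussian_psf(shape, x0, y0, sigma):
    """Evaluate a normalized 2D Gaussian PSF on a pixel grid."""
    y, x = np.indices(shape)
    g = np.exp(-((x - x0)**2 + (y - y0)**2) / (2 * sigma**2))
    return g / g.sum()

def fit_flux(data, psf):
    """Linear least-squares estimate of the source flux for a fixed PSF.

    For data = flux * psf + noise, the optimal flux is
    sum(data * psf) / sum(psf**2).
    """
    return np.sum(data * psf) / np.sum(psf**2)

# Simulate a star with known flux and recover it
psf = gaussian_psf((15, 15), 7.0, 7.0, sigma=2.0)
data = 1000.0 * psf
print(fit_flux(data, psf))
```

In the real routines the position is fitted as well, and a PRF on a subsampled grid replaces the analytic PSF, but the flux estimate per source has this shape.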

Creating and Fitting PSFs

During the last week I implemented a function to create PSFs from images. Given a set of source positions and a cutout size, the PSFs are extracted at these positions and combined into a single discrete PSF model. I’ve made a short test on Spitzer data. Here’s what the result looks like:

As a test I will have to fit this model to the data again and compare the results to the actual Spitzer catalog. I’ve already worked on the framework for PSF fitting. Discrete PSF models can already be fitted, but this still has to be integrated into a “psf_photometry” function. I plan to have this ready on Tuesday.
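The extraction-and-combination step described above can be sketched in a few lines (a simplified illustration with integer positions; the function name is hypothetical, not the photutils API):

```python
import numpy as np

def build_discrete_psf(image, positions, size):
    """Extract fixed-size cutouts around source positions and
    average them into a single discrete PSF model (illustrative)."""
    half = size // 2
    cutouts = []
    for x, y in positions:
        cut = image[y - half:y + half + 1, x - half:x + half + 1]
        if cut.shape == (size, size) and cut.sum() > 0:
            # normalize each source to unit flux before combining
            cutouts.append(cut / cut.sum())
    return np.mean(cutouts, axis=0)

# Toy image with two identical point sources
image = np.zeros((20, 20))
image[5, 5] = 100.0
image[12, 12] = 100.0
psf = build_discrete_psf(image, [(5, 5), (12, 12)], size=5)
print(psf.shape, psf.sum())
```

Averaging many normalized cutouts suppresses the noise of the individual sources, which is why estimating the PRF from thousands of sources works well.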

On Friday I made a few final changes to the convolution kernel module. It is now possible to convolve two kernels by using the multiplication operator, mirroring the star operator syntax commonly used to denote convolution.
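The idea behind the operator syntax can be sketched with a minimal kernel class (this is a toy re-implementation of the behavior described above, not the actual astropy class):

```python
import numpy as np

class Kernel1D:
    """Minimal sketch of a 1D kernel where '*' means convolution."""
    def __init__(self, array):
        self.array = np.asarray(array, dtype=float)

    def __mul__(self, other):
        # Convolving two kernels yields a single, wider combined kernel
        return Kernel1D(np.convolve(self.array, other.array, mode='full'))

box = Kernel1D(np.ones(3) / 3)   # boxcar smoothing kernel
combined = box * box             # triangle-shaped kernel of width 5
print(combined.array)
```

Applying the combined kernel once is then equivalent to applying the two original kernels in sequence, which can save a full convolution pass over the data.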

Kernel Docs and PSF Photometry

During the last week I’ve worked further on the documentation for the convolution kernels; it is nearly finished. I’ve advertised my work on the astropy-dev mailing list and asked for feedback.

In parallel I’ve worked on the photometry module and started to implement the first PSF fitting routines and a first draft of the PSF class. This is what I will work on during the next few weeks…

Slowly but Steadily

During the last week I nearly finished the work on the convolution kernel module. I’ve written documentation with some example applications and colorful images. It still needs some review, but all in all I’m quite satisfied with the result. Here is an example of a colorful image:

This week I will start to implement the actual photometry functions. For this I will have to set up a new Point Spread Function (PSF) class, which can be used for PSF-fitting photometry. My mentor and I discussed that it could be useful to also work with Point Response Functions (PRFs), a slightly different concept with a few advantages, because it works on a subsampled grid.

I hope that I can build upon my previous work, because a PSF is conceptually similar to a convolution kernel (mathematically it is the same), except for the fitting process to the data and a few other differences in application.

My progress is a little slower than I expected, but it is steady. So all in all I’m quite satisfied. And that was my personal conclusion for the midterm evaluation!

Working on Convolution Module

This week I’ve worked on the convolution module. The kernel class is finished. I implemented kernel arithmetic, so you can, e.g., define a difference-of-Gaussians filter or sum two kernels.
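A difference-of-Gaussians filter is a good example of why kernel arithmetic is convenient: it is literally the subtraction of two kernel arrays. A small sketch with plain NumPy (not the kernel class itself):

```python
import numpy as np

def gaussian_kernel(sigma, size):
    """Normalized 1D Gaussian kernel sampled at pixel centers."""
    x = np.arange(size) - size // 2
    g = np.exp(-x**2 / (2 * sigma**2))
    return g / g.sum()

# Difference-of-Gaussians: subtract a wide Gaussian from a narrow one
# to get a band-pass filter, often used for blob detection.
dog = gaussian_kernel(1.0, 17) - gaussian_kernel(3.0, 17)
print(dog.sum())
```

Since both input kernels are normalized to unit sum, the resulting kernel sums to zero, i.e. it removes the local mean of the data it is applied to.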

Furthermore, I’ve written a general discretize function for astropy.models. Currently I’m still having some trouble with the ‘oversample’ mode of this function, which sometimes doesn’t seem to give correct results. This function will also be important for the PSF class and the photutils module.
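To illustrate why an ‘oversample’ mode is needed at all, here is a simplified sketch of the two discretization strategies (the function names are illustrative, not the astropy API):

```python
import numpy as np

def discretize_center(f, x):
    """Sample the model only at the pixel centers."""
    return f(x)

def discretize_oversample(f, x, factor=10):
    """Average 'factor' subsamples across each pixel, approximating
    the integral of the model over the pixel area."""
    offsets = (np.arange(factor) + 0.5) / factor - 0.5
    return np.mean([f(x + o) for o in offsets], axis=0)

gauss = lambda x: np.exp(-x**2 / 2)
x = np.arange(-5, 6)

# For sharply peaked models the center value overestimates the
# pixel-averaged flux at the peak, so oversampling matters.
print(discretize_center(gauss, x)[5], discretize_oversample(gauss, x)[5])
```

For a slowly varying model both modes agree; the difference only becomes important when the model changes significantly within one pixel, which is exactly the PSF case.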

Next week I plan to begin to work on the PSF class and photometry functions.

Concluding Work on Models

The previous week I still worked on astropy.modeling: I added a few more models and modified the testing so that it also works with Polynomial models. I set up an IPython notebook which helps to implement new mathematical models. It lets you compute the model derivatives and test values using SymPy. It also outputs a LaTeX string for the model docstring.

I added LaTeX representations and links for almost every model. A minor change was to make the fitter work with 2D derivatives and make the NonLinearFitter work with Polynomials.
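The core of the notebook workflow can be sketched in a few lines of SymPy (a simplified illustration; the symbol names are my own choice here):

```python
import sympy as sp

# Define the model symbolically...
x, amplitude, mean, stddev = sp.symbols('x amplitude mean stddev')
gauss = amplitude * sp.exp(-(x - mean)**2 / (2 * stddev**2))

# ...compute the analytical derivative w.r.t. each parameter
# (these go into the model's deriv() method)...
d_amplitude = sp.diff(gauss, amplitude)
d_mean = sp.simplify(sp.diff(gauss, mean))

# ...and emit the LaTeX string for the docstring.
print(sp.latex(gauss))
```

Having SymPy do the differentiation avoids sign errors in hand-derived formulas, and the generated LaTeX keeps the docstrings consistent with the code.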

This week I’m going to work on the convolution stuff again.

Back to Work

I spent the last week in St. Petersburg at ICVS 2013, a conference on computer vision systems. I’m quite happy that our paper “Is Crowdsourcing for Optical Ground Truth Generation Feasible?” was awarded as one of the two best contributions.

This week I will continue to work on the following points of the convolution module:

1. Extend the convolution function so that it works with separable and non-weighted filters
2. Extend the analysis and comparison of the performance of the convolution operation
3. Restructure the testing for the kernels
4. Write examples for the use of the filter kernels

Currently I’m a little bit behind my work plan, but I hope to catch up this week.

Not Writing Blog… But Code!

As written in the title, I totally forgot about reporting my progress during the last two weeks. So I’d like to make up for this now.

During the first week I’ve worked on simplifying the interface for implementing new one dimensional models. The pull request for this was finally merged a few days ago.

At the same time I’ve started to work on a new convolution kernel class, which is almost finished. I have some background in image processing, where convolution is a very important task, so it was quite fun for me to work on this. I’ve opened a pull request for the code yesterday and I’m waiting for some feedback and comments. The actual convolution function still has to be adapted to the new kernel class.

Last week I wrote the new interface for two-dimensional models and implemented a few models which are useful for the convolution kernel class, e.g. Box, Gaussian, MexicanHat, and Airy models. Furthermore, I’ve slightly restructured the tests for the models; they are now better adapted to a growing number of models. I opened a pull request yesterday.

Currently I’m a little behind my work plan, but I’ve tackled the proposed issues more generally than planned. Instead of immediately coding the new models, I reworked the current interface; instead of just coding the new convolution kernels, I set up a new kernel class.

Tomorrow I will travel to St. Petersburg in Russia to attend a conference on image processing. Are there any Russian GSoC participants from St. Petersburg who feel like showing me the city?

I will be back on Thursday, 18 July, and will continue working on the convolution function.

First Week Is Over

The first week of the coding phase is over. At the beginning I had to deal with a few minor problems: when I ran the code analysis of PyDev, it always got stuck at a certain point and Eclipse froze. After updating Eclipse it worked again, and the problem hasn’t appeared since.

Currently I’m working on the astropy.modeling package. My original plan was to implement new models, but I’ve decided to simplify the interface first, as implementing new models required overriding many different methods and I wanted to avoid doing that again and again. This way I expect to save even more time in the end. Therefore I spent one or two days more than I expected reading and understanding the existing code.

The new interface is now ready and I’ve already defined a few new models. I will now continue working on further models this week.

Preparation Phase

Currently I’m preparing for the coding phase which is going to begin next week. For the coding I will use Eclipse with PyDev.

On Monday I had the first meeting with my mentor and we decided to change the work plan a little. I will start by working on astropy.models and implement a few more specific models. This was originally planned for the second month, but it seemed to be the better way to get started.

I’ve started to read the source code of astropy.models to get an overview of what is implemented where, how, and why. The base classes are defined in the core module. The base for the specific models will be the ParametricModel class. To set up a new model I have to define the model parameters in the constructor and handle errors during initialisation. The actual model formulas are coded in the eval() method. The deriv() method contains the formulas for the analytical derivatives of the models; this method is called when the models are fitted.
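Stripped of the framework machinery, the pattern looks roughly like this (a standalone sketch of the constructor/eval()/deriv() structure described above, not the actual astropy class):

```python
import numpy as np

class Gaussian1D:
    """Minimal sketch of the ParametricModel pattern: parameters in the
    constructor, the formula in eval(), analytical derivatives in deriv()."""
    def __init__(self, amplitude, mean, stddev):
        # handle errors during initialisation
        if stddev <= 0:
            raise ValueError("stddev must be positive")
        self.amplitude, self.mean, self.stddev = amplitude, mean, stddev

    def eval(self, x):
        return self.amplitude * np.exp(-(x - self.mean)**2 / (2 * self.stddev**2))

    def deriv(self, x):
        # Partial derivatives w.r.t. amplitude, mean, and stddev,
        # used by the fitter instead of numerical differentiation.
        g = self.eval(x)
        d_amplitude = g / self.amplitude
        d_mean = g * (x - self.mean) / self.stddev**2
        d_stddev = g * (x - self.mean)**2 / self.stddev**3
        return np.array([d_amplitude, d_mean, d_stddev])

model = Gaussian1D(amplitude=2.0, mean=0.0, stddev=1.0)
print(model.eval(np.array([0.0])))
```

Providing analytical derivatives this way is what lets the non-linear fitters converge quickly and reliably.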

I’m not yet sure how to test the models properly. My first idea is to check whether they integrate correctly over a certain interval and compare with the analytical value.
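For a Gaussian this testing idea is straightforward, since the analytical integral is known via the error function. A minimal sketch:

```python
import math
import numpy as np

def gauss(x):
    """Unit-amplitude Gaussian with stddev = 1."""
    return np.exp(-x**2 / 2)

# Numerical integral over [-3, 3] with the trapezoidal rule
x = np.linspace(-3, 3, 2001)
y = gauss(x)
numeric = np.sum((y[:-1] + y[1:]) / 2 * np.diff(x))

# Analytical value: integral of exp(-x^2/2) from -a to a
# equals sqrt(2*pi) * erf(a / sqrt(2))
analytic = math.sqrt(2 * math.pi) * math.erf(3 / math.sqrt(2))
print(numeric, analytic)
```

The same check works for any model with a known closed-form integral, and catches both normalization bugs and sign errors in the formula.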