Tuesday, November 28, 2006

Mad Props

Sorry I haven't posted in a while; things have been busy around here. I successfully recovered from the Thanksgiving meal, but it took a day or two. The core has been flooding in at 50-70m/day, which is awesome considering the slow start we got, but it also means a lot of work for me and everyone else on the night shift. We obviously can't keep pace with the bit, so the gap between what has been logged and what has been drilled continues to widen.

I have to give mad props to my fellow nightwalkers. Larry, Ellen, Gavin, Thom, and Franco have all been pushing themselves to the limit to get as much core described as possible each night--sometimes pushing out 40m or more. The other all-stars are the curators. Matt, Davide, and Kelly work flat out for 12 hours or more splitting and imaging the core so that it is ready for the sedimentologists to describe. The current record is 60m of core split in one 24-hour period, and over 40m of that was done during the night by Matt and Davide. Vanessa and Matteo also deserve recognition. Matteo has been entering all the clasts that Franco draws, a job that would drive me bonkers. Vanessa has been helping the sedimentologists with smear slides, and they both have been helping Donata with the spectrophotometer, which, judging by their excitement when they walk up to the RAC tent, is not a very fun job.

As for me, I've been trying to automate as much of my work as I can; otherwise, I'd quickly fall hopelessly behind. I've written a bunch of scripts and helper programs to minimize the amount of manual work involved in the Corelyzer server administration. I still have to manually crop the split core images, but after that is done the script can take care of everything. Once the split core images were under control, I turned my attention to the whole core images. These need to be converted from BMP to JPEG, renamed to a sane file naming scheme, and loaded into Corelyzer. The most time-consuming and error-prone part of the whole process was converting the files and then naming them based on data in a spreadsheet. So in an act of inspiration (or perhaps disgust), I wrote up a little program to read the spreadsheet and do the conversion/renaming step. Now I can start it and just walk away. As they say, "necessity is the mother of invention".
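
In case anyone is curious, here's a rough sketch of the idea behind that little program. It isn't the actual code--the directory layout, the CSV format, and the column order are all made up for illustration--but the gist is the same: read a mapping of original BMP names to sane new names (the spreadsheet exported as CSV), convert each BMP to JPEG with Java's ImageIO, and write it out under the new name.

import java.awt.image.BufferedImage;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import javax.imageio.ImageIO;

/**
 * Illustrative sketch of a batch convert/rename step.
 * Assumed usage: java ConvertAndRename <bmpDir> <jpegDir> <mapping.csv>
 * Assumed CSV format: originalName.bmp,newName
 */
public class ConvertAndRename {
    public static void main(String[] args) throws Exception {
        File inputDir = new File(args[0]);   // directory of whole core BMPs
        File outputDir = new File(args[1]);  // destination for renamed JPEGs
        File mapping = new File(args[2]);    // CSV exported from the spreadsheet

        outputDir.mkdirs();

        BufferedReader reader = new BufferedReader(new FileReader(mapping));
        String line;
        while ((line = reader.readLine()) != null) {
            String[] fields = line.split(",");
            if (fields.length < 2) continue;

            File bmp = new File(inputDir, fields[0].trim());
            BufferedImage image = bmp.exists() ? ImageIO.read(bmp) : null;
            if (image == null) {
                System.err.println("Skipping: " + bmp);
                continue;
            }

            // write the JPEG under the sane name from the spreadsheet
            File jpeg = new File(outputDir, fields[1].trim() + ".jpg");
            ImageIO.write(image, "jpg", jpeg);
            System.out.println(bmp.getName() + " -> " + jpeg.getName());
        }
        reader.close();
    }
}

Kick it off, walk away, and come back to a directory of properly named JPEGs ready to load into Corelyzer.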

Now that a lot of things are automated, I've been able to focus a bit more on PSICAT, and I made some major advances over the last two days. We'd been experiencing some performance problems that hadn't shown up before, mainly because we'd never had 400m+ of continuous core described in PSICAT. So I went out and got a trial license for the YourKit Java Profiler to see if I could improve things at all. Within the first 10 minutes I had found and optimized a loop that PSICAT was spending most of its time in. The fix was relatively trivial once I took a closer look at the loop and the problem it was trying to address, and it made a noticeable speed improvement when opening up a large project. The other thing I fixed with the help of the profiler was a heap space issue: I was running out of heap space when exporting images, and bumping up the heap size didn't help. The profiler was instrumental in tracking down where the memory leak was. It took me a bit longer to fix that problem, but now you can export all 400m of core as images without running into any heap space issues. Now I can turn my attention to a few of the new features that need to get developed for the on-ice report.
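
For the curious, here's the general shape that this kind of image-export fix usually takes. This is purely illustrative and not the actual PSICAT code--the Section interface and method names are placeholders--but the principle is the one that keeps the heap flat: render and write one image at a time and let go of it, rather than keeping every rendered image alive until the end of the export.

import java.awt.image.BufferedImage;
import java.io.File;
import java.util.List;
import javax.imageio.ImageIO;

/**
 * Illustrative sketch only -- not the actual PSICAT export code.
 * Writes each section image to disk as soon as it is rendered so the
 * heap never has to hold 400m+ of core imagery at once.
 */
public class SectionImageExporter {

    /** Hypothetical stand-in for however the tool models a described interval. */
    public interface Section {
        String getName();
        BufferedImage render();
    }

    public void export(List<Section> sections, File outputDir) throws Exception {
        outputDir.mkdirs();
        for (Section section : sections) {
            BufferedImage image = section.render();           // render just this section
            File out = new File(outputDir, section.getName() + ".jpg");
            ImageIO.write(image, "jpg", out);                 // write it to disk right away
            image.flush();                                    // release the pixel data
            // nothing holds a reference to 'image' past this point,
            // so the garbage collector can reclaim it on the next pass
        }
    }
}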

2 comments:

Arun Rao said...

Mad props to you!

Julian said...

Awesome! Thanks for all the work running the server!