
RESTish WMS GetFeatureInfo API

The basic issue when using OGC web services is that there's no simple way to ask for the value of a layer at a specific latitude and longitude. WMS and WMTS require that you specify bounding boxes (BBOX) or tiles, the service version, pixel coordinates, and a whole bunch of other stuff. WFS requires that the layer be a vector layer, and I have mostly raster layers. What happens if you just want a raster cell value for one point? I suppose you could query that pixel from a WCS, but you'd have to use knowledge of the size of the original data pixels to make a BBOX.

Example for lon 10.75, lat 13.25: BBOX=10,13,11,14&WIDTH=100&HEIGHT=100&X=75&Y=75

A lot of the required parameters really don't change for a given use case. In this project, I'm always querying in Lat Lon WGS84 and I'm not necessarily loading a map that corresponds. So I came up with a way to factor out all the repetitive stuff and simplify the components of the request without having to abandon the WMS server I already had serving the data.

With that knowledge and Apache Rewrite you can simplify this quite a bit.

<IfModule rewrite_module>
    RewriteEngine  on
    RewriteRule "^/api/maps/([^/]*)/layers/([^/]*)$" "/maps?MAP=$1&QUERY_LAYERS=$2&LAYERS=$2&SERVICE=WMS&VERSION=1.1.1&REQUEST=GetFeatureInfo&STYLES=default&SRS=EPSG:4326&FEATURE_COUNT=1&INFO_FORMAT=text/html" [PT,QSA]
</IfModule>

The end result is now you can write a more sensible url in REST style (almost)
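For instance, with a hypothetical map file and layer name, a client request would look like:

```
/api/maps/mymap/layers/temperature?BBOX=10,13,11,14&WIDTH=100&HEIGHT=100&X=75&Y=75
```

The rewrite rule fills in the repetitive WMS parameters, and the QSA flag appends the BBOX, WIDTH/HEIGHT, and X/Y that the client supplies.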




The trick here is to make a 1x1 degree box by rounding your coordinates down to the nearest integer. Then use the remainder to get the fractional distance into the 100x100 pixels in the request. You can also use 1000x1000, which WMS servers will allow, to get one extra decimal place of accuracy.

The big Gotchas

  • Pixel space 0,0 is upper left corner (positive numbers go down)
  • WMS 1.3 for EPSG:4326 swaps the Lat Lon ordering in the BBOX
  • Negative coordinates require a little extra handling.
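The trick and the gotchas above can be sketched in Python. This is a minimal sketch (function name is mine); note that `math.floor` rounds toward negative infinity, which handles the negative-coordinate quadrant shift without separate sign branches:

```python
import math

def wms_point_query(lon, lat, size=100):
    """Map a lon/lat to a 1x1 degree BBOX plus pixel X/Y for GetFeatureInfo.

    math.floor rounds toward negative infinity, so it also handles
    negative coordinates. Pixel (0, 0) is the upper-left corner, so
    Y counts down from the top edge (maxy) of the box.
    """
    minx, miny = math.floor(lon), math.floor(lat)
    x = round((lon - minx) * size)        # fraction east of the left edge
    y = round((miny + 1 - lat) * size)    # fraction south of the top edge
    return (minx, miny, minx + 1, miny + 1), x, y
```

For example, (10.75, 13.25) gives BBOX 10,13,11,14 with X=75, Y=75, matching the example near the top of this post.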

See the diagram below.

[WMS RESTish API diagram] Note: for WMS 1.3 it's CRS not SRS, and I & J instead of X & Y.

And now an example of how to build the request URL in JavaScript. If your WMS service is set to return JSON, GeoJSON, or HTML, that should be all you need.

function getInfoUrl(lonlat) {

    // retrieve the name of the actively selected layer you want to query
    var layername = $("#variableSelect :selected").val();

    var lon = lonlat[0];
    var lat = lonlat[1];

    // handle quadrant shift: floor works for negative coordinates too
    var minx = Math.floor(lon);
    var maxx = minx + 1;
    var miny = Math.floor(lat);
    var maxy = miny + 1;

    var bbox = "BBOX=" + minx + "," + miny + "," + maxx + "," + maxy;
    var wh = "WIDTH=100&HEIGHT=100";

    // pixel within bbox at the given width and height;
    // pixel (0,0) is the upper-left corner, so y counts down from maxy
    var x, y;
    if (lon < 0) {
        x = Math.round(100 + ((lon % 1) * 100));
    } else {
        x = Math.round((lon % 1) * 100);
    }
    if (lat < 0) {
        y = Math.round(Math.abs((lat % 1) * 100));
    } else {
        y = Math.round(100 - ((lat % 1) * 100));
    }

    var baseurl = "";  // base URL of the rewritten API endpoint
    return baseurl + layername + "?" + bbox + "&" + wh + "&X=" + x + "&Y=" + y;
}


It would be awesome to eliminate the BBOX part and the pixel space coordinates so all you need is Lat & Long. But so far it looks like one needs to write a small web application to do that.

Lens Correction for GoPro Hero2

Fisheye lenses are awesome; such a wide view captures great scenes. However, if you want to stitch the photos together, the distortion at the edges causes some real trouble.

So you want to fix it. Great, Digikam has a lens correction tool built in based on Lensfun. Hmm, wait, there's no record for GoPros. OK, take test shots with lots of straight lines (buildings work great) and upload them to the awesome maintainer of Lensfun. Wait a couple of days, get back a lens model, sweet.

In Digikam, try it on 1 photo: awesome, it works. Well, at least on Digikam 3.5 (in Digikam 2.x it creates distortion art). But then you go to do it in batch on all your photos and wham, nothing, big fat nothing. Turns out it's a known bug.

Tried GIMP, with GIMP-Lensfun. Got it installed and working, but it doesn't correct the distortion. Also, batch processing in GIMP is somewhat of a PITA; there are a couple of addons, but they don't always work or allow using addon filters.

gopro distortion before and after

I was going to leave it at that and wait for a fix sometime down the road, but then had some down time. So a few hours later, using Python, Lensfun (lensfunpy) and OpenCV (python-opencv), I got it to work, at least on my laptop running Ubuntu 14.04. Trying it on my Ubuntu 12.04 desktop resulted in similar distortion art as Digikam. So I tried updating OpenCV, no change. Turns out there's a bug in lensfun fixed between 2.5 and 2.8, so I had to backport 2.8 to Ubuntu 12.04 (instructions). If you want my backport, it's in my PPA.

Want to see my code for doing it in batch? You're in luck, head over to GitHub:

Save battery by slowing data

The title sums up this post pretty well. This applies if you've got Android, more specifically a variant of Android that lets you really tweak your settings, like CyanogenMod.

Do you live in a place with 4G, or did you just travel somewhere that has 4G? If so, and you want your phone to actually last the whole day or two, toggle the speed to 2G (it will actually toggle to "not 4G", which is often 3G). I find that if I don't, 4G drains my battery very quickly, to where I worry about running out before getting home or to a hotel while travelling.

The trade-off is great if you're just checking email and getting directions. When you want to stream video etc, just toggle the speed back on.

TODO: I'll get picture of my quick-bar power settings so people can see what it looks like.

Tip 2: Want to use Wi-Fi on a plane? Toggle to Airplane mode, then toggle Wi-Fi back on. PS: Airplane mode also breaks many in-program ads.

Upgrade notes Ubuntu 12.04 to 14.04

Just a few things I encountered in my upgrade on my Zotac Zbox going from Ubuntu Precise (12.04) to Ubuntu Trusty (14.04)

  • Couldn't get it to use an ISO as the upgrade material since there's no alternate CD anymore, so I did an online upgrade, which worked fine.


  • Atheros driver is way better: I went from 1 Mbps to 4 Mbps with nothing else changed in my network, and the latter speed is what I always got from other computers.
  • Streaming video full screen no longer requires gpu acceleration to be disabled.

Bugs (related):

  • nouveau driver hiccups on sound every few seconds when streaming videos
  • Nvidia Ion graphics/sound always transmits sound on HDMI even if you switch to analog. In my case this created a weird problem, where I couldn't use analog audio to bypass the previous bug above. See fix below...


  • Installed the Nvidia drivers, which had major issues in 12.04 (screen blank or not lined up with monitor/tv) - works great now
  • Forgot that Amazon Prime streaming requires HAL for Flash DRM; get it from this  ppa
  • Chromium and Chrome no longer work with Adobe flash from the repos, you need pepperflash
    sudo apt-get install pepperflashplugin-nonfree
    sudo update-pepperflashplugin-nonfree --install
  • If you're using Apache, pay attention to the 2.2 to 2.4 upgrade; the syntax of Allow directives and the conf file names changed, and they matter.

R tip: G-test, G-statistic, G^2 likelihood ratio, or whatever else you might want to call it.

When analyzing categorical data, sometimes Chi-Square just isn't the right distribution for testing goodness-of-fit or testing Independence. So many people recommend a G test instead.

Being a user of  R, obviously I'd like to also run this test along with my other tests. A little searching of the web, and answers are littered with "R doesn't have g-test built in, here's code to do it yourself..." Which is half true: unlike chisq.test, base R does not appear to have a G-test. I'd rather leave coding of standard statistics to people who really know the ins and outs of the formulas and have a good way to verify the answers.

So, a few hours later, I find  Deducer has  likelihood.test. So we're all good, right?

Well, then when I got significant results I started looking for post-hoc tests. In doing so, it turns out that the following also do G-tests as part of their Measures of Association tests (typically used as post-hoc tests):

So there, base R doesn't have it, but at least 3 packages do, so people don't need to keep re-writing it.
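For reference, the statistic itself is not much code. Here is a minimal sketch in Python of the goodness-of-fit form (function name is mine; as said above, prefer a vetted implementation for real analysis):

```python
import math

def g_statistic(observed, expected):
    """Goodness-of-fit G statistic: G = 2 * sum(O * ln(O / E)).

    Compare against a chi-square distribution with the same degrees of
    freedom as the equivalent chi-square test. Cells with O == 0
    contribute 0 (the limit of O * ln(O) as O goes to 0).
    """
    return 2 * sum(o * math.log(o / e)
                   for o, e in zip(observed, expected) if o > 0)
```

For observed counts [10, 20, 30] against a uniform expectation [20, 20, 20], G comes out near 10.46, which you would then compare to a chi-square with 2 degrees of freedom.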

FYI is awesome if you haven't seen it yet.

Scan over wifi from multi-functions in Linux

Ended up needing to configure a few multi-function machines to print and scan via wifi with Linux. Here are the details of what you need to know. Specifically, I did a Brother HL-2280DW and an Epson WF-3540 on Ubuntu 12.04.

In general set a static IP address, either on the printer or with your home router using DHCP reservations based on MAC address.

Figuring out the device URI was the trickiest part as Ubuntu never seems to guess that quite right. The drivers for printing tend to be found automatically. If that fails both vendors have them available on their website.



Brother HL-2280DW

Add Printer, from network, give it the ip of the machine, then pick the lpd option.

Device URI: lpd://


Go to the brother  support site and get the following files for installation.

  • Scanner driver
    • brscan4-0.4.2-1.amd64.deb
  • Scanner Setting File
    • brother-udev-rule-type1-1.0.0-1.all.deb

Now also make sure you have sane installed.

Run the following to register your multi-function

brsaneconfig4 -a name=Brother model=HL-2280DW ip=

Should now work with sane based programs.



Epson WF-3540

Add Printer, from network, give it the ip of the machine, then pick the lpd option.

Device URI lpd://


Search the  epson download site for drivers. I needed:

  • WF-3540 Series Scanner Driver Linux core package&data package
    • iscan-data_1.28.0-2_all.deb
    • iscan_2.29.3-1~usb0.1.ltdl7_amd64.deb
  • WF-3540 Series Scanner Driver Linux network plugin package
    • iscan-network-nt_1.1.1-1_amd64.deb

Install them in that order. Now also make sure you have sane installed. Then edit /etc/sane.d/epkowa.conf (this is the part no one on the web seems to describe). Can't find the file? You might need to install libsane-extras.

In the net section, add a line with your multi-function's IP address.
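For example (192.168.1.50 is a hypothetical address; use the static IP you gave your printer), the added line in /etc/sane.d/epkowa.conf looks like:

```
# network section of /etc/sane.d/epkowa.conf
net 192.168.1.50
```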


Save that and now when you open iscan or sane it should find your scanner.

Batch convert Natural Earth SQLite to Spatialite

Happened to be making some maps today, and realized 1:110m would be better than 1:10m for small world maps in R (much faster too). I had the whole  Natural Earth dataset downloaded in SQLite format. SQLite is great, but I can't run spatial queries on it unless it's in Spatialite format (they store the geometries differently).

 GDAL/OGR to the rescue:

ogr2ogr -f SQLite natearth_vector_spatialite.sqlite natural_earth_vector.sqlite -skip-failures -nlt PROMOTE_TO_MULTI -dsco SPATIALITE=YES

Turns out Spatialite, and I suspect PostGIS, don't like it when you mix Multi and non-Multi geometries if a column is declared Multi. Thankfully Even Rouault solved this in GDAL 1.10 with -nlt PROMOTE_TO_MULTI

A few hours later, 400+MB of great base material for cartography...

Oh wait, try to dissolve countries into UN subregions: what are all those weird partial lines in the middle of what should be solid polygons? Slivers of course, places where the topology of the borders is not snapped.


  1. Processing in QGIS: the GRASS tool v.dissolve, with a tolerance set under the advanced options


  2. Buffer the polygons 1st before smushing (Thanks Brian)
    CREATE TABLE subregionsT AS
    SELECT subregion,CastToMultiPolygon(GUnion(Buffer(Geometry,0.00001))) as geometry
    FROM ne_110m_admin_0_countries
    GROUP BY subregion;

Solution 1 is probably cleaner, as I don't have to then clip the continents to match the coastline again, but solution 2 let me keep it all in the same db where the data started, with fewer steps.

Wine, Skype, Google Earth etc.: ia32 on 12.04.1

It seems that some PPAs have newer versions of apps than stock 12.04. This can cause nightmares when you go to install stuff that needs the ia32 multiarch packages, because the i386 version has to be the same as the amd64 version.

After a couple of days of trying to resolve the packages by hand and force versions I came across this post that uses apt-pinning in the preferences to downgrade everything to stock.

Once running stock packages, Wine, skype etc should install....

Transcript:

Re: ia32-libs error [Can't install on amd64]

I had a similar problem with broken dependencies when trying to install wine and acroread, just after upgrading to 12.04 from 11.04 (passing over 11.10). It seems that some ppa's I had in 11.04 installed newer versions of applications in the system. After upgrading, the remains of these apps seemed to make a mess of the dependencies.

The solution that seems to work (until now) was found on a german ubuntu board (posts from user Lasall):

First a downgrade is required and done with the following: create the 'preferences' file:

sudo vi /etc/apt/preferences

and insert the following lines:

Package: *       
Pin: release a=precise*
Pin-Priority: 2012

Pin-Priority must be greater than 1000.

Then you may downgrade the programs with:

sudo apt-get dist-upgrade

Then you may install packages that complained about dependencies, like

sudo apt-get install ia32-libs-multiarch

Finally, you should remove the file you just created:

sudo rm /etc/apt/preferences

because otherwise no new updates would be found.

Hope this helps you too!

GeoMeetup Slides

I've posted slides from my talk at the  GeoMeetup in San Francisco.

The talk was on Python Plugin development for QGIS, get the slides over on  Scribd

OSGeo Live Partial Download Estimates

I had a chance this last week to do a little bit of analysis on the download logs for the  OSGeo-Live project. The basics: downloads have increased quite a bit from version 4.5 to 5.0 and the full 4.4 GB iso file is the most popular but that doesn't mean there aren't quite a few people downloading the other variants.

There is some uncertainty in the actual numbers, as I haven't had a chance to filter out bots, incomplete downloads, etc. Also, for those wondering, I do plan to follow up with a map of downloads by country/region soon, but the early estimate is that people from 100 different countries have downloaded.

These graphs represent data for all of 2011 from 2 of 5 servers, the 2 in California.

[Graphs: OSGeo 2011 downloads by type; OSGeo 2011 downloads by version]

Anyone know what the difference is between viewed, entry and exit on awstats?

GPT Booting with Ubuntu

So if you buy a 3 TB drive (or anything bigger than 2 TB) and want to use it as the primary drive for your machine, you will need to use a GPT partitioning scheme instead of the classic MBR.

Here's a couple of tricks/tips which should help:

  • You need to be using an OS that has GRUB2
  • When partitioning, the 1st partition should be a 1 MB section with the bios_grub flag (recent versions of the Ubuntu installer, at least 11.04, have this option; on 10.04 I had to set it with a Live disc and parted)
  • When you get to the install GRUB question, if you happen to be installing to something other than /dev/sda say no, and then it will ask you which drive or partition to install to.

Free and Open Source Tools for GPS Data Management and Analysis

Here's a copy of the poster I did for the  AAG 2011 meeting. It's part of my master's thesis on geoinformatic techniques for dealing with GPS telemetry data using an Open Source stack.


  • Python
  • Spatialite (SQLite)
  • QGIS
  • R

See the attached PDF, which was created in LaTeX using the Beamer and Beamerposter packages.

Router gone bad, Open Source It!

I'm not sure why, but home routers seem to have a finite lifetime before they start misbehaving in strange ways. Last week mine started acting up in a way I've never seen before, one in which power cycling seems to have no effect.

What was it doing? It decided I wasn't allowed to view websites or make any other type of connection to one very specific subnet, which is where my servers happen to live. The rest of the Internet worked as usual.

After a couple of days of trying to figure out where the problem was, I narrowed it down to my home router by plugging my laptop into the Internet service directly, which worked.

Part 2: Now that I knew it was the router and had some traceroutes handy, my best guess was that my computer was sending the request but the data was never coming back from the server; the browser gave messages like "the server took too long to return the request". Notice how it didn't say it couldn't find the server. Traceroutes, no matter how many hops (should have been 16), kept going with * * *, which made me think an endless loop was somewhere.

Fingers started to point to the built in  SPI Firewall.

So I tried turning off SPI, NAT filtering...upgrade the router firmware, reset the settings...nothing. (Maybe I need to try the mythical 30-30-30 method to flush the nvram).

Plan B Giving up on the router I went to my spare router. Hooked it all up, got connected, turned on the firewall and wham no Internet at all and reverting the settings didn't fix it.

Good news is I had intentionally bought 2 routers that shipped or were capable of running Linux based open source firmware. (Netgear WGR614Gv8, Asus WL520gu) So began the night of researching how to flash an open source firmware onto a router.

Solution: After reading many pages, and some 20-100 step processes I found a nifty 3 step that worked great the first time.  Flashing an Asus wl520gu in 3 steps with Tomato (I actually used  Tomato-usb)

Note: this method will probably work with  DD-WRT or  OpenWrt too but I didn't try it.

How big was that database?

Database servers are great, but there's a lot of magic in there sometimes and it can be hard to figure just how much storage is being taken up by what database and which tables.

A nice little hint on how to check the size of the whole or parts of your database server (Postgres):

Or for the lazy

SELECT pg_database.datname,
       pg_size_pretty(pg_database_size(pg_database.datname)) AS size
  FROM pg_database;

Public Laboratory, citizen science getting serious.

Just wanted to share a project I recently became aware of after making a trek over to  WhereCamp2011

They've got some great ideas for home brewing some nice science equipment for remote sensing, check it out at

Here's a link to my  flickr stream with photos of some of their airborne camera platforms.

Un-doing the partition mess from a dual boot

More and more, when I make a dual boot system, it turns out that 6 months to a year down the line the Windows partition just isn't needed anymore. But now you've got 10GB+ of disk just sitting out at the front of the drive.

Over the holiday I tackled a shuffling of partitions, and here are the important tips I picked up.

  1. Copy your important data to another drive (an external usb is great)
  2. Using Ubuntu disk tools like gparted, blank the space where you want to move stuff to.
  3. Using the  Clonezilla live disc (and either partimage or  partclone [the new variant that handles ext4]) clone your / partition over to the new space.
  4. Relabel the UUID of this new partition, otherwise it will be identical to the UUID of the original and the bootloader will quasi load both
      tune2fs /dev/hdaX -U numbergeneratedbyuuidgen

  5. Edit your grub config to boot the new drive. If you reboot into Ubuntu, running update-grub will find it.

  6. Once you're sure you can boot the relocated /, you can add the empty space onto your /home (I always recommend separate / and /home partitions)
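For the relabeling step, the replacement UUID can come from `uuidgen` or, equivalently, Python's standard library:

```python
import uuid

# Generate a random (version 4) UUID to pass to `tune2fs -U`
new_uuid = str(uuid.uuid4())
print(new_uuid)
```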

Things I also recommend:

  1. Converting ext3 to ext4
  2. Creating a Private directory for storing encrypted stuff.

Open Source Mapping Workflow

This quarter some students and professors got together to reinvent/recreate/re-instigate  Cartography at UC Davis. While this isn't my first Cartography course it's been a bit more realistic in terms of applying the ideas to making maps.

Below is an example of mine, showing the possibilities of an Open Source cartographic workflow. I used  Spatialite to crunch the data,  QGIS to prep and  Inkscape to Polish.

It's a semi-fictitious map showing major air routes that cross the Arctic Circle using data from and a background map from

I'll link to the full pdf later. Creative Commons license in the footer applies.

A Quail is born

So in what might seem to others a random turn of events, I've embarked on a Japanese Quail breeding program at home. Lo and behold, as we discussed what to do since the eggs were overdue, I opened the incubator to find someone staring at me quite cutely.


More reliable VMware Console?

So VMWare server is an interesting product for virtualization. It does some things really well (Like letting you open a desktop OS without installing remote desktop tools) and seems to just fail at others (like a web management tool that you can't get into 1/2 the time).

Tonight's frustration: lack of support for Firefox 3.6. But there's a bit of a workaround. If you go into about:config, find security.enable_ssl2, and set it to true, the Web Access site actually seems to work reliably (so far).

However the console to any VM will always timeout. To work around this:

  1. make sure you've installed the console plugin
  2. go to your firefox settings directory
  3. find your way into your profile/extensions/VMWare.../plugins
  4. way down here you'll find a vmware-vmrc
    1. to be safe, enable execution permission on this and all the other vmware scripts in this folder, in the bin (vmware-vmrc) and lib folders in this directory
  5. now you can directly call, set up a shortcut for, or start vmware-vmrc
    vmware-vmrc -h [<hostname>:<port>] [-u <username> -p <password>] [-M <moid> | <datastore path>]
    vmware-vmrc.exe -h <hostname>:<port> [-u <username> -p <password>] -M <moid> | <datastore path>
  6. if you leave off the command parameters it will just ask you in the GUI

The port number is really important; no idea what moid is yet. And voila, it seems to work. It also seems to be more reliable than the web interface (note: there is a tool in the web interface to create a shortcut that does the above, and big surprise, it doesn't work in Firefox 3.6, hence the hack around).

 Where I found the answer

Website Shuffle

Some of you may have arrived here looking for my photos. That site is temporarily down while I shift some things around, upgrade some servers, and come up with a better long term plan of what I want to do.

As it was, I hadn't added any new photos for several years, and that seemed quite silly. Primarily it was a technical issue; who knew moving 100s of photos onto a decent web server where visitors can browse efficiently would be so confusing.

Anyways, be patient, let me know if you have questions. tech at wildintellect dot com