Posts by author wildintellect

RESTish WMSGetFeatureInfo API

The basic issue when using OGC web services is that there's no simple way to ask for the value of a layer at a specific latitude and longitude. WMS and WMTS require that you specify bounding boxes (BBOX) or tiles, service version, pixel coordinates, and a whole bunch of other stuff. WFS requires that the layer be a vector layer, and I have mostly raster layers. What if you just want a raster cell value for one point? I suppose you could query that pixel from a WCS, but you'd have to know the size of the original data pixels to make a BBOX.

Example for 10.75, 13.25: BBOX=10,13,11,14&WIDTH=100&HEIGHT=100&X=75&Y=75

A lot of the required parameters really don't change for a given use case. In this project, I'm always querying in Lat Lon WGS84 and I'm not necessarily loading a map that corresponds. So I came up with a way to shorten out all the repetitive stuff and simplify the components for the request without having to abandon using the WMS server I already had serving the data.

With that knowledge and Apache's mod_rewrite you can simplify this quite a bit.

<IfModule rewrite_module>
    RewriteEngine  on
    RewriteRule "^/api/maps/([^/]*)/layers/([^/]*)$" "/maps?MAP=$1&QUERY_LAYERS=$2&LAYERS=$2&SERVICE=WMS&VERSION=1.1.1&REQUEST=GetFeatureInfo&STYLES=default&SRS=EPSG:4326&FEATURE_COUNT=1&INFO_FORMAT=text/html" [PT,QSA]
</IfModule>
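
To see what the rule does without Apache, here's the same mapping sketched in Python; the pattern and target string mirror the RewriteRule above (the `rewrite` helper name is mine, not part of Apache):

```python
import re

# Mirrors the RewriteRule: /api/maps/<mapfile>/layers/<layer> -> full GetFeatureInfo query
PATTERN = re.compile(r"^/api/maps/([^/]*)/layers/([^/]*)$")

def rewrite(path):
    """Return the expanded WMS query path, or None if the path doesn't match."""
    m = PATTERN.match(path)
    if not m:
        return None
    mapname, layer = m.group(1), m.group(2)
    return ("/maps?MAP=%s&QUERY_LAYERS=%s&LAYERS=%s"
            "&SERVICE=WMS&VERSION=1.1.1&REQUEST=GetFeatureInfo"
            "&STYLES=default&SRS=EPSG:4326&FEATURE_COUNT=1"
            "&INFO_FORMAT=text/html" % (mapname, layer, layer))
```

In Apache the QSA flag then appends whatever query string the client supplied (BBOX, WIDTH, HEIGHT, X, Y) onto the rewritten URL.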

The end result is that you can now write a more sensible URL in REST style (almost), e.g. /api/maps/yourmapfile/layers/yourlayer, with the BBOX, size, and pixel parameters appended as a query string.




The trick here is to make a 1x1 degree box by rounding your coordinates down to the nearest integer, then use the remainder to get the fractional distance across the 100x100 pixels in the request. You can also use 1000x1000, which WMS servers will allow, to gain one extra decimal place of accuracy.

The big Gotchas

  • Pixel space 0,0 is upper left corner (positive numbers go down)
  • WMS 1.3 for EPSG:4326 swaps the Lat Lon ordering in the BBOX
  • Negative coordinates require a little extra handling.
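
The arithmetic above, including the gotchas, can be sketched in Python (`wms_point_query` is a hypothetical helper name; computing the fraction from the floored corner handles negative coordinates without special cases, unlike the modulo approach in the JavaScript example later in the post):

```python
import math

def wms_point_query(lon, lat, size=100):
    """Build the BBOX and pixel X/Y for a WMS 1.1.1 GetFeatureInfo at one point.

    Uses a 1x1 degree box; pixel (0, 0) is the upper-left corner, so y counts
    down from the top edge. Use size=1000 for one extra decimal of accuracy.
    """
    minx, miny = math.floor(lon), math.floor(lat)
    bbox = (minx, miny, minx + 1, miny + 1)
    # fractional distance into the box, scaled to the pixel grid
    x = round((lon - minx) * size)
    # y is measured down from the top of the box (maxy)
    y = round((miny + 1 - lat) * size)
    return bbox, x, y
```

For the example point (10.75, 13.25) this yields BBOX=(10, 13, 11, 14) and pixel (75, 75), matching the URL above.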

See diagram below.

[Diagram: WMS RESTish API] Note: for WMS 1.3 it's CRS not SRS, and I & J instead of X & Y.

And now an example of how to build the request URL in JavaScript. If your WMS service is set to return JSON, GeoJSON, or HTML, that should be all you need.

function getInfoUrl(lonlat) {

    //retrieve the name of the actively selected layer you want to query
    var layername = $("#variableSelect :selected").val();

    var lon = lonlat[0];
    var lat = lonlat[1];

    //1x1 degree box around the point (floor handles the quadrant shift)
    var minx = Math.floor(lon);
    var maxx = minx + 1;
    var miny = Math.floor(lat);
    var maxy = miny + 1;

    var bbox = "BBOX=" + minx + "," + miny + "," + maxx + "," + maxy;
    var wh = "WIDTH=100&HEIGHT=100";

    //Pixel within bbox at the given width and height;
    //pixel (0,0) is the upper-left corner, so y counts down from the top
    var x;
    if (lon < 0) {
        x = Math.round(100 + ((lon % 1) * 100));
    } else {
        x = Math.round((lon % 1) * 100);
    }
    var y;
    if (lat < 0) {
        y = Math.round(Math.abs((lat % 1) * 100));
    } else {
        y = Math.round(100 - ((lat % 1) * 100));
    }

    var baseurl = "";
    return baseurl + layername + "?" + bbox + "&" + wh + "&X=" + x + "&Y=" + y;
}


It would be awesome to eliminate the BBOX part and the pixel space coordinates so all you need is Lat & Long. But so far it looks like one needs to write a small web application to do that.

Lens Correction for GoPro Hero2

Fisheye lenses are awesome, such a wide view captures great scenes. However if you want to stitch the photos together, that distortion on the edges causes some real trouble.

So you want to fix it. Great, Digikam has a lens correction tool built in based on Lensfun. Hmm, wait, there's no record for GoPros. OK, take test shots with lots of straight lines (buildings work great) and upload them to the awesome maintainer of Lensfun. Wait a couple of days, get back a lens model, sweet.

In Digikam, try it on one photo: awesome, it works, well at least on Digikam 3.5 (in Digikam 2.x it creates distortion art). But then you go to do it in batch on all your photos and wham, nothing, big fat nothing. Turns out it's a known bug.

Tried GIMP with GIMP-Lensfun. Got it installed and working, but it doesn't correct the distortion. Also, batch processing in GIMP is somewhat of a PITA; there are a couple of addons, but they don't always work or allow using addon filters.

[Image: GoPro distortion, before and after]

I was going to leave it at that and wait for a fix sometime down the road, but then I had some down time. So a few hours later, using Python, Lensfun (lensfunpy), and OpenCV (python-opencv), I got it to work, at least on my laptop running Ubuntu 14.04. Trying it on my Ubuntu 12.04 desktop resulted in similar distortion art as Digikam. So I tried updating OpenCV: no change. Turns out there's a bug in Lensfun fixed between 2.5 and 2.8, so I had to backport 2.8 to Ubuntu 12.04 (instructions). If you want my backport, it's in my PPA.

Want to see my code for doing it in batch? You're in luck, head over to GitHub:

Save battery by slowing data

The title sums up this post pretty well. This applies if you've got Android, more specifically a variant of Android that lets you really tweak your settings, like Cyanogenmod.

Do you live in a place with 4G, or did you just travel somewhere that has 4G? If so, and you want your phone to actually last the whole day or two, toggle the speed to 2G (it will actually toggle to "not 4G", which is often 3G). I find that if I don't, 4G drains my battery very quickly, to where I worry about running out before getting home or to a hotel while travelling.

The trade-off is great if you're just checking email and getting directions. When you want to stream video etc, just toggle the speed back on.

TODO: I'll get picture of my quick-bar power settings so people can see what it looks like.

Tip 2: Want to use wifi on a plane? Toggle to Airplane mode, then toggle wifi back on. PS: Airplane mode also breaks many in-program ads.

Upgrade notes Ubuntu 12.04 to 14.04

Just a few things I encountered in my upgrade on my Zotac Zbox, going from Ubuntu Precise (12.04) to Ubuntu Trusty (14.04).

  • Couldn't get it to use an ISO as the upgrade material since there's no alternate CD anymore, so I did an online upgrade, which worked fine.


  • The Atheros driver is way better: I went from 1 Mbps to 4 Mbps with nothing else changed in my network, and the latter speed is what I always got from other computers.
  • Streaming video full screen no longer requires gpu acceleration to be disabled.

Bugs (related):

  • nouveau driver hiccups on sound every few seconds when streaming videos
  • Nvidia Ion graphics/sound always transmits sound over HDMI even if you switch to analog. In my case this caused a weird problem where I couldn't use analog audio to bypass the previous bug above. See fix below...


  • Installed the Nvidia drivers, which had major issues in 12.04 (screen blank or not lined up with monitor/TV); works great now
  • Forgot that Amazon Prime streaming requires hal for Flash DRM; get it from this ppa
  • Chromium and Chrome no longer work with Adobe flash from the repos, you need pepperflash
    sudo apt-get install pepperflashplugin-nonfree
    sudo update-pepperflashplugin-nonfree --install
  • If you're using Apache, pay attention to the 2.2 to 2.4 upgrade: the syntax of access control (Allow/Deny) and the conf file naming changed, and both are important.
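
As a minimal illustration of the access-control change (directory path is just an example; note that site files under /etc/apache2/sites-available also need a .conf suffix now):

```apache
# Apache 2.2
<Directory /var/www/example>
    Order allow,deny
    Allow from all
</Directory>

# Apache 2.4 equivalent
<Directory /var/www/example>
    Require all granted
</Directory>
```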

R tip: G-Test G-Statistic G^2 likelihood ratio, or whatever else you might want to call it.

When analyzing categorical data, sometimes Chi-Square just isn't the right distribution for testing goodness-of-fit or independence, so many people recommend a G-test instead.

Being a user of R, obviously I'd like to run this test along with my other tests. A little searching of the web, and answers are littered with "R doesn't have g-test built in, here's code to do it yourself...". Which is half true: unlike chisq.test, base R does not appear to have a g-test. I'd rather leave the coding of standard statistics to people who really know the ins and outs of the formulas and have a good way to verify the answers.
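
For reference, the statistic itself is short; here's a plain-Python sketch of the goodness-of-fit version (exactly the "code it yourself" approach I'd rather avoid for real analyses, so treat it as illustration only):

```python
import math

def g_statistic(observed, expected):
    """G = 2 * sum(O * ln(O / E)).

    Compare against a chi-square distribution with the same degrees of
    freedom as the equivalent chi-square test; zero-count cells contribute 0.
    """
    return 2 * sum(o * math.log(o / e)
                   for o, e in zip(observed, expected) if o > 0)
```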

So, a few hours later, I find that Deducer has likelihood.test. So we're all good, right?

Well, then when I got significant results I started looking for post-hoc tests. In doing so, it turns out that the following also do G-tests as part of their measures of association (typically used as post-hoc tests):

So there, base R doesn't have it, but at least 3 packages do so people don't need to keep re-writing it.

FYI is awesome if you haven't seen it yet.

Scan over wifi from multi-functions in Linux

Ended up needing to configure a few multi-function machines to print and scan via wifi with Linux. Here are the details of what you need to know. Specifically, I did a Brother HL-2280DW and an Epson WF-3540 on Ubuntu 12.04.

In general set a static IP address, either on the printer or with your home router using DHCP reservations based on MAC address.

Figuring out the device URI was the trickiest part as Ubuntu never seems to guess that quite right. The drivers for printing tend to be found automatically. If that fails both vendors have them available on their website.



Add Printer, from network, give it the ip of the machine, then pick the lpd option.

Device URI: lpd://


Go to the brother  support site and get the following files for installation.

  • Scanner driver
    • brscan4-0.4.2-1.amd64.deb
  • Scanner Setting File
    • brother-udev-rule-type1-1.0.0-1.all.deb

Now also make sure you have sane installed.

Run the following to register your multi-function

brsaneconfig4 -a name=Brother model=HL-2280DW ip=

Should now work with sane based programs.



Add Printer, from network, give it the ip of the machine, then pick the lpd option.

Device URI lpd://


Search the  epson download site for drivers. I needed:

  • WF-3540 Series Scanner Driver Linux core package&data package
    • iscan-data_1.28.0-2_all.deb
    • iscan_2.29.3-1~usb0.1.ltdl7_amd64.deb
  • WF-3540 Series Scanner Driver Linux network plugin package
    • iscan-network-nt_1.1.1-1_amd64.deb

Install them in that order. Now also make sure you have sane installed. Then edit /etc/sane.d/epkowa.conf (this is the part no one on the web seems to describe). Can't find the file? You might need to install libsane-extras.

In the net section, add a line with your multi-function's IP address.
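
For example (the address here is illustrative; use whatever static IP you gave the machine):

```
# /etc/sane.d/epkowa.conf
net 192.168.1.50
```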


Save that and now when you open iscan or sane it should find your scanner.

Batch convert Natural Earth SQLite to Spatialite

Happened to be making some maps today and realized 1:110m would be better than 1:10m for small world maps in R (much faster too). I had the whole Natural Earth dataset downloaded in SQLite format. SQLite is great, but I can't run spatial queries on it unless it's in Spatialite format (they store the geometries differently).

 GDAL/OGR to the rescue:

ogr2ogr -f SQLite natearth_vector_spatialite.sqlite natural_earth_vector.sqlite -skip-failures -nlt PROMOTE_TO_MULTI -dsco SPATIALITE=YES

Turns out Spatialite, and I suspect PostGIS, don't like it when you mix Multi and non-Multi geometries if a column is declared Multi. Thankfully Even Rouault solved this in GDAL 1.10 with -nlt PROMOTE_TO_MULTI.

A few hours later, 400+MB of great base material for cartography...

Oh wait, try to dissolve countries into UN subregions, and what are all those weird partial lines in the middle of what should be solid polygons? Slivers, of course: places where the topology of the borders is not snapped.


  1. Process it in QGIS with the GRASS tool v.dissolve; under the advanced options, set a tolerance.


  2. Buffer the polygons 1st before smushing (thanks Brian):
    CREATE TABLE subregionsT AS
    SELECT subregion,CastToMultiPolygon(GUnion(Buffer(Geometry,0.00001))) as geometry
    FROM ne_110m_admin_0_countries
    GROUP BY subregion;

Solution 1 is probably cleaner, as I don't have to clip the continents to match the coastline again, but solution 2 let me keep it all in the same db where the data started, with fewer steps.

Wine,Skype, Google Earth etc.. ia32 on 12.04.1

It seems that some PPAs have newer versions of apps than stock 12.04. This can cause nightmares when you go to install stuff that needs the ia32 multiarch libraries, because the i386 version has to be the same as the amd64 version.

After a couple of days of trying to resolve the packages by hand and force versions I came across this post that uses apt-pinning in the preferences to downgrade everything to stock.

Once running stock packages, Wine, Skype, etc. should install....

Transcript:

Re: ia32-libs error [Can't install on amd64]

I had a similar problem with broken dependencies when trying to install wine and acroread, just after upgrading to 12.04 from 11.04 (passing over 11.10). It seems that some ppa's I had in 11.04 installed newer versions of applications in the system. After upgrading, the remains of these apps seemed to do some mess in the dependencies.

The solution that seems to work (until now) was found on a German Ubuntu board (posts from user Lasall):

First a downgrade is required and done with the following: create the 'preferences' file: Code:

sudo vi /etc/apt/preferences

and insert the following lines: Code:

Package: *       
Pin: release a=precise*
Pin-Priority: 2012

Pin-Priority must be greater than 1000.

Then you may downgrade the programs with: Code:

sudo apt-get dist-upgrade

Then you may install packages that complained about dependencies, like Code:

sudo apt-get install ia32-libs-multiarch

Finally, you should remove the file you just created: Code:

rm /etc/apt/preferences

because otherwise no new updates would be found.

Hope this helps you too!

GeoMeetup Slides

I've posted slides from my talk at the  GeoMeetup in San Francisco.

The talk was on Python Plugin development for QGIS, get the slides over on  Scribd

OSGeo Live Partial Download Estimates

I had a chance this last week to do a little bit of analysis on the download logs for the  OSGeo-Live project. The basics: downloads have increased quite a bit from version 4.5 to 5.0 and the full 4.4 GB iso file is the most popular but that doesn't mean there aren't quite a few people downloading the other variants.

There is some uncertainty in the actual numbers, as I haven't had a chance to filter out bots, incomplete downloads, etc. Also, for those wondering, I do plan to follow up with a map of downloads by country/region soon, but an early estimate is that people from 100 different countries have downloaded.

These graphs represent data for all of 2011 from 2 of 5 servers, the 2 in California.

[Graphs: OSGeo 2011 downloads by type, and by version]

Anyone know what the difference is between viewed, entry, and exit in awstats?

GPT Booting with Ubuntu

So if you buy a 3 TB drive (or anything bigger than 2 TB) and want to use it as the primary drive for your machine, you will need to use GPT partitioning instead of the classic MBR.

Here's a couple of tricks/tips which should help:

  • You need to be using an OS that has GRUB2
  • When partitioning, the 1st partition should be a 1 MB section with the bios_grub flag (recent versions of the Ubuntu installer, at least 11.04, have this option; on 10.04 I had to set it with a Live disc and parted)
  • When you get to the install GRUB question, if you happen to be installing to something other than /dev/sda, say no, and then it will ask you which drive or partition to install to.

Free and Open Source Tools for GPS Data Management and Analysis

Here's a copy of the poster I did for  AAG 2011 meeting. It's part of my master's thesis on Geoinformatic techniques for dealing with GPS telemetry data using an Open Source stack.


  • Python
  • Spatialite (SQLite)
  • QGIS
  • R

See the attached pdf which was created in latex using Beamer and the Beamerposter packages.

Router gone bad, Open Source It!

I'm not sure why, but home routers seem to have a finite lifetime before they start misbehaving in strange ways. Last week mine started acting up in a way I've never seen before, one in which power cycling seems to have no effect.

What was it doing? It decided I wasn't allowed to view websites or any other type of connection to one very specific subnet - which is where my servers happen to live. The rest of Internet worked as usual.

After a couple of days of trying to figure out where the problem was, I narrowed it down to my home router by plugging my laptop into the Internet service directly, which worked.

Part 2: Now that I knew it was the router and had some traceroutes handy, my best guess was that my computer was sending the request but the data was never coming back from the server; the browser gave messages like "server took too long to return request". Notice how it didn't say it couldn't find the server. Traceroutes, no matter how many hops (should have been 16), kept going with * * *, which made me think an endless loop was somewhere.

Fingers started to point to the built in  SPI Firewall.

So I tried turning off SPI, NAT filtering...upgrade the router firmware, reset the settings...nothing. (Maybe I need to try the mythical 30-30-30 method to flush the nvram).

Plan B: Giving up on the router, I went to my spare router. Hooked it all up, got connected, turned on the firewall, and wham, no Internet at all, and reverting the settings didn't fix it.

Good news is I had intentionally bought 2 routers that shipped or were capable of running Linux based open source firmware. (Netgear WGR614Gv8, Asus WL520gu) So began the night of researching how to flash an open source firmware onto a router.

Solution: After reading many pages, and some 20-100 step processes I found a nifty 3 step that worked great the first time.  Flashing an Asus wl520gu in 3 steps with Tomato (I actually used  Tomato-usb)

Note: this method will probably work with  DD-WRT or  OpenWrt too but I didn't try it.

How big was that database?

Database servers are great, but there's a lot of magic in there sometimes and it can be hard to figure just how much storage is being taken up by what database and which tables.

A nice little hint on how to check the size of the whole or parts of your database server (Postgres):

Or for the lazy

SELECT pg_database.datname,
       pg_size_pretty(pg_database_size(pg_database.datname)) AS size
  FROM pg_database;
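
A per-table variant of the same idea (standard catalog functions; run it inside the database you're curious about, largest tables first):

```sql
SELECT relname,
       pg_size_pretty(pg_total_relation_size(oid)) AS size
  FROM pg_class
 WHERE relkind = 'r'
 ORDER BY pg_total_relation_size(oid) DESC;
```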

Public Laboratory, citizen science getting serious.

Just wanted to share a project I recently became aware of after making a trek over to  WhereCamp2011

They've got some great ideas for home brewing some nice science equipment for remote sensing, check it out at

Here's a link to my  flickr stream with photos of some of their airborne camera platforms.

Un-doing the partition mess from a dual boot

More and more, when I make a dual boot system, it turns out that 6 months to a year down the line the Windows partition just isn't needed anymore. But now you've got 10 GB+ of disk just sitting at the front of the drive.

Over the holiday I tackled a shuffling of partitions, and here are the important tips I picked up.

  1. Copy your important data to another drive (an external USB drive is great).
  2. Using the Ubuntu disk tools like gparted, blank the space where you want to move stuff to.
  3. Using the  Clonezilla live disc (and either partimage or  partclone [the new variant that handles ext4]), clone your / partition over to the new space.
  4. Relabel the UUID of this new partition; otherwise it will be identical to the UUID of the original and the bootloader will quasi-load both:
      tune2fs -U $(uuidgen) /dev/hdaX
  5. Edit your grub config to boot the new partition. If you reboot into Ubuntu, running update-grub will find it.
  6. Once you're sure you can boot the relocated /, you can add the empty space onto your /home (I always recommend separate / and /home partitions).

Things I also recommend:

  1. Converting ext3 to ext4
  2. Creating a Private directory for storing encrypted stuff.

Open Source Mapping Workflow

This quarter some students and professors got together to reinvent/recreate/re-instigate  Cartography at UC Davis. While this isn't my first cartography course, it's been a bit more realistic in terms of applying the ideas to making maps.

Below is an example of mine, showing the possibilities of an Open Source cartographic workflow. I used  Spatialite to crunch the data,  QGIS to prep and  Inkscape to Polish.

It's a semi-fictitious map showing major air routes that cross the Arctic Circle, using data from and a background map from

I'll link to the full pdf later. Creative Commons license in the footer applies.

A Quail is born

So, in what might seem to others as a random turn of events, I've embarked on a Japanese Quail breeding program at home. Lo and behold, as we discussed what to do since the eggs were overdue, I opened the incubator to find someone staring at me quite cutely.


More reliable VMware Console?

So VMware Server is an interesting product for virtualization. It does some things really well (like letting you open a desktop OS without installing remote desktop tools) and seems to just fail at others (like a web management tool that you can't get into half the time).

Tonight's frustration: lack of support for Firefox 3.6. But there's a bit of a workaround. If you go into about:config, find security.enable_ssl2, and set it to true, the Web Access site actually seems to work reliably (so far).

However the console to any VM will always timeout. To work around this:

  1. make sure you've installed the console plugin
  2. go to your Firefox settings directory
  3. find your way into your profile/extensions/VMWare.../plugins
  4. way down here you'll find a vmware-vmrc
    1. to be safe, enable execute permission on this and all the other vmware scripts in this folder, in the bin (vmware-vmrc) and lib folders in this directory
  5. now you can directly call, set up a shortcut for, or start vmware-vmrc
    vmware-vmrc -h [<hostname>:<port>] [-u <username> -p <password>] [-M <moid> | <datastore path>]
    vmware-vmrc.exe -h <hostname>:<port> [-u <username> -p <password>] -M <moid> | <datastore path>
  6. if you leave off command parameters, it will just ask you in the GUI

The port number is really important; no idea what moid is yet. And voilà, it seems to work. It also seems to be more reliable than the web interface (note there is a tool in the web interface to create a shortcut that does the above, and big surprise, it doesn't work in Firefox 3.6, hence the hack around).

 Where I found the answer

Website Shuffle

Some of you may have arrived here looking for my photos. That site is temporarily down while I shift some things around, upgrade some servers, and come up with a better long term plan of what I want to do.

As it was, I hadn't added any new photos for several years and that seemed quite silly, primarily because it was a technical issue; who knew moving 100s of photos onto a decent web server where visitors can browse efficiently would be so confusing.

Anyways, be patient, let me know if you have questions. tech at wildintellect dot com

NACIS 2009 Opening up

So at the AAG conference last year, we ran an  OSGeo booth. Some representatives from the North American Cartographic Information Society ( NACIS) approached and invited us to their conference. (It wasn't the first time; I had been asked previously after one of my talks on FOSS.)

Now the important part: the California Chapter gave a 50 minute, 4-app demo at the NACIS "Practical Cartography Day" to an audience of 150.  Details Take-home message: cartographers want good SVG output.

Notes from the rest of the conference: "Open" was actually mentioned a lot. Here's a rough breakdown of the frequency of relevant topics (in presentations):

  • Postgis ++
  • OpenLayers (not by name, but showed up in slides and on demo sites) +++
  • mapnik ++
  • GDAL +
  • Modestmaps
  • php +++
  • OpenSource +++++ (even ESRI)
  • Python +++
  • OpenStreetmap ++++
  • Flash/Flex +++++++
  • OGC +
  • Inkscape +
  • GIMP +
  • WMS +

(Maybe I'll post a plot when I get chance)

Next post: Some new public domain datasets people are going to want to get their hands on...

Desktop Open Source goes mainstream, old school style

[Photo: Gary Sherman's book in the UC Davis library]

Congratulations to Gary Sherman, whose recent book has successfully made it to the shelves of academia. Well, that might be due in part to our librarian taking advice on what open source GIS books are missing that should be on the shelf. Lucky for everyone else, since the publisher didn't classify it as a textbook, it's also affordable if you want your own copy, paper or ebook.

Desktop GIS: Mapping the Planet with Open Source Tools. Pragmatic Bookshelf, 360 pages, ISBN 1934356069.

Wondering what other books you've missed? See the  OSGeo Library

Inkscape to Scribus to PDF document production: How to make a flyer

It comes up quite often that I need a flyer for this or that. Just a few pages, sometimes quarter, third, or half sheets for putting up around campus for people to see. Once you do a few, though, it often happens that you just need the same thing again later with a few minor variations. Sure, you could just do it all in one application, but when not doing full pages you have to keep messing with duplicating your information 2-4 times on the same page in a way that lines up well with being cut.

This is where a layout application comes in handy; more specifically, I use  Scribus. The idea here is to make one image and then replicate it multiple times across a page, all at once, evenly. Well, that and make a high resolution, ready-to-print PDF.

So start by making your image/item. In this case I don't have a ton of text and it's kinda free-float style (not paragraphs), so I used  Inkscape; well, that and it's the format the flyer was originally given to me in. Had there been more text, I would have started with  OpenOffice, done the graphics in Inkscape or Gimp, and done 100% of the layout in Scribus.

After writing the text, changing and scaling fonts, putting in the image, and adjusting transparencies and background colors, it's now time to export the image. From Inkscape particularly, exporting to bitmap (png) gives you the chance to specify your dpi and ensure it will show up correctly when you insert it into other documents. For printing I usually use 300 dpi, and in this case, to cut out dealing with margins, I only exported the drawing, not the page.

In Scribus:

  1. Now I set a guide to split the page in half.
  2. Turn on guide snapping and grid snapping.
  3. Draw an image box, snapping it to the guides.
  4. Get Picture, grab the png export.
  5. Duplicate (copy) and snap a second one onto the bottom half.
  6. PDF export, no compression.

And voilà, the next Linux User's Group of Davis Installfest flyer is done.

See Attached:

Network analysis using GRASS

I ended up wanting to analyze commute paths on several networks, but instructions on how to properly prepare a network file with new points snapped to it as nodes were a little less than clear. I'm not 100% sure this is right, but it is pieced together from the command history  GRASS stored with each layer in my mapset.

#bring the layer in
v.in.ogr -o dsn="/scratch/congelton/davis_ped_net/ped_net_sep28.shp" output="pednets28" min_area=0.0001 snap=-1

#find the nearest line to a point and create a line that connects them
v.distance -p from="[email protected]" to="pednetsep28" from_type="point" to_type="point,line,area" from_layer=1 to_layer=1 output="ppl2pednet" dmax=-1 upload="dist" column="dist"

#add categories to the distance lines (I think this is required, otherwise it won't work later; if the cat column is already populated you can skip this)
v.category input="ppl2pednet" output="ppl2pednetcat" type="point,line,boundary,centroid,area" option="add" cat=1 layer=1 step=1

#patch the distance lines to the original points, so you have the nodes for the routing step
v.patch input="ppl2pednetcat,pednets28" output="pplpednet"

# patch the distance lines to the network
v.patch input="pplpednet,davissubset" output="pplonpednet"

#I ran a clean before I did the actual command to make sure I dropped things that wouldn't work, outliers
v.clean input="pplonpednet" output="pplonpednetclean3" type="line,point" tool="snap,break" thresh=3,3

#run the network shortest path using the original points as start and end points, in batch from a csv; each point's id is its cat
v.net.path input="pplonpednetclean3" output="dcommute3" type="line,boundary" alayer=1 nlayer=1 file="pplonpednetclean.csv" dmax=1000

#example of the csv
#autonumber,Start node cat, end node cat
1       1       3000
2       5       3000
3       6       3000
4       7       3000
5       8       3000
6       9       3000
7       10      3000
8       14      3000
9       15      3000
10      25      3000
11      26      3000
12      27      3000
#yes all my people traveled to the same end point

Things to watch out for:

  • A network file should have both lines and points with the same layer number (i.e. 1_points, 1_lines)
  • A network file with no cat column in the points component

Reshape R - long to wide conversion

--May be incorrect, working on a fix; will post when done-- Keeping data in long format just makes sense, but for some reason statistics often requires your data in wide format. The good news is that it's much easier to go from long to wide than the other way around, although the tool I'm about to describe can go both ways.

Using  R and pulling a dataframe in from an SQLite database, the following command will take the dataframe and, for every Species listed, create a new column. Then all the records are grouped by their Plot, and the resulting Percent Cover for a given species in a plot is now a value in one of the columns instead of being its own row.

Plant (the data.frame)

Plot Species PrCover
A Poppy 5
A Redwood 20
B Oak 50
B Poppy 10

WidePlant <- reshape(Plant, v.names = "PrCover", idvar = "Plot", timevar = "Species", direction = "wide")

WidePlant (the results)

Plot PrCover.Poppy PrCover.Redwood PrCover.Oak
A 5 20 NA
B 10 NA 50

The documentation is kinda hard to read, so here's my attempt at plain English:

  • v.names = the values you want to show up under your new columns
  • idvar = the id that you want to group your data records by
  • timevar = the values that you want to make up the new columns; however many distinct values are in this column determines the number of new columns
  • direction = wide, the destination or resulting format we want
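
The same long-to-wide pivot can be sketched in plain Python (a toy illustration of what reshape is doing, not R code; `long_to_wide` and the sample data mirror the Plant example above):

```python
def long_to_wide(rows, idvar, timevar, vname):
    """Pivot long-format records: one dict per id, one key per timevar value.

    Missing id/timevar combinations simply have no key (reshape's NA).
    """
    wide = {}
    for row in rows:
        wide.setdefault(row[idvar], {})[f"{vname}.{row[timevar]}"] = row[vname]
    return wide

plant = [
    {"Plot": "A", "Species": "Poppy", "PrCover": 5},
    {"Plot": "A", "Species": "Redwood", "PrCover": 20},
    {"Plot": "B", "Species": "Oak", "PrCover": 50},
    {"Plot": "B", "Species": "Poppy", "PrCover": 10},
]
```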

Installing Sqlite 3.6.x on Ubuntu Intrepid

I needed SQLite 3.6 or newer for an application I'm working on, specifically for Rtree spatial indexes (in order to build and use  Spatialite), but Ubuntu Intrepid has 3.5.9.

After weighing my options and doing a little research I noticed that the Jaunty packages barely have any dependencies and they are already met by Intrepid.

So I downloaded:

Steps to follow:

  1. Uninstall libsqlite3-dev 3.5.9
  2. Install libsqlite3 3.6.10
  3. Install libsqlite3-dev 3.6.10
  4. Install sqlite3 3.6.10

To test with Python (happens to be what I'm developing with):

from pysqlite2 import dbapi2 as sqlite3
print sqlite3.sqlite_version

Grass Syntax Hints

Short Story

 GRASS GIS command line syntax can be a little tricky, and none of the graphical interfaces seem to make it easy, because there's always some option you need that isn't in the GUI.

Importing a shapefile:

v.in.ogr dsn=/path/to/folder/ layer=nameofshp output=giveitaname

Notes: don't put .shp on the layer name. If it complains about not being the right projection but you know it is, add a -o (no, that's not a zero).

Long Story

I was testing out  QGIS 1.0 and the GRASS toolbox was having issues, without giving me a useful error message to work from.

So I compiled the latest GRASS release (6.4 RC2) and tried the new wxPython interface, which also failed.

Lucky for me, the good ol' command line worked once I gave it all the info it wanted in the proper syntax.

SQLite and ODBC for Data Entry

The one downside of all the good database systems is the lack of an easy tool for entering data, especially coding in data off of handwritten field forms.

I recently revisited the idea of using OpenOffice Base or Access as a front end to better databases. In this particular case, due to the number of issues and my familiarity, I got Access working; I plan to go back and also get OOo working the next chance I get to take over my friend's Windows box (the data entry is for her anyway).

Tools: the sqlite-odbc driver (I tested it on Windows and Linux) and an ODBC client: Access, Excel, OpenOffice Base, or Calc.


  1. All your tables must have a primary key declared.
    1. If you don't have one, it's real quick to fix using the Firefox SQLite Manager (however, you have to make new tables), something like this:
      CREATE TABLE NewData (pk INTEGER PRIMARY KEY AUTOINCREMENT, Afield, someotherfield);
      INSERT INTO NewData (Afield,someotherfield) SELECT * FROM Data;
      DROP TABLE Data;
  2. Don't declare field types as TEXT; Access will import them as blobs if you do, and this makes linking the tables difficult. Just drop the type.
  3. The Relationship tool in Access and OpenOffice is useless with linked tables in this case. OO tells you so, Access doesn't.
    1. To get around this I created nested forms, each based on one table. When inserting a nested table into another, I built the relationship into the form so that for every record in the parent, child records would be matched automatically.
More details to come soon...


Ever wonder what's in the box, without shutting down and crawling under...

A few helpful commands I'm collecting to figure out what's inside the machine from the Command Line Interface (CLI):

$ sudo dmidecode | more
#A readout from your motherboard, give you name, model, which ram slot's have chips etc.
$ cat /proc/cpuinfo
#What processor do you have?

Setting up Trac, getting past the errors.

So I thought the install would go smoothly, but a few hiccups always creep in.

Basically I followed these  instructions and another page for  authentication

With some diligent work I worked through the following (make sure to look at your Apache logs; mine's at /var/log/apache2/error.log):

  1. mod_python is the way to go in terms of setup, speed, etc.
  2. Get an IOError: zipimport? Go check the permissions on the mentioned file and make sure it's at least 644, for example:
     sudo chmod 644 setuptools-0.6c9-py2.5.egg
  3. Python Option and SetEnv PYTHON_EGG_CACHE flat out didn't work for me though, as  reported and worked around, although a fix to the code eludes me.
  4. Import Error on compat: take a look at this  ticket

Welcome to Wildintellect's Blog

Hello and welcome to my blog and code hosting site. If you're looking for my photography that's currently down for a major rewrite.

This site is more oriented towards the geek details to keep the world running, and most importantly to serve as my memory since I clearly can't remember things for more than a couple of hours.

Alex aka Wildintellect

  • Posted: 2008-12-18 23:40 (Updated: 2011-06-16 18:42)
  • Author: wildintellect

Just started, will it work

Well, I've got it installed. Not 100% configured yet, but we'll see if this manages to work as a decent blog platform.

For those curious, I'm using the  FullBlogPlugin with a  Trac site. Not exactly meant for a blog, but it will let me post code snippets and downloadable applications I write, including Python eggs.