Minutely Mapnik Pre2010

From OpenStreetMap Wiki

WARNING: This page is for historical reference only. The current procedure for updating is located at: Minutely_Mapnik.


User:Crschmidt got this working Jan 2009. Since then User:Matt set it up to show the NoName map layer at [1]. Note that this site is no longer being updated, as the server it was running on couldn't take the load.

Setting up Minutely Mapnik

It's done using a combination of osm2pgsql and osmosis to keep the database up to date.

Tools needed:

I installed postgresql-8.3 and postgresql-8.3-contrib from the Debian packages. Osmosis and osm2pgsql were taken from the nightly builds and SVN, respectively.

First, set up everything described on the Mapnik, Mapnik/PostGIS and Osm2pgsql pages.

Then load the intarray contrib module into the gis database:

 psql -f /usr/share/postgresql/8.3/contrib/_int.sql gis

I then ran osm2pgsql with the following options:

 time ./osm2pgsql -P 5433 --slim -s -C 3000 planet-date.bz

This loads the initial planet. Then, I loaded the daily and hourly diffs manually to catch up: the same command, with '-a' stuck in there for append.
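Catching up by hand might look something like this (the diff filename and download URL are illustrative; the port and cache size should match whatever you used for the initial import):

```shell
# Hypothetical example: fetch one daily diff and apply it in append mode.
# Filename/URL are assumptions -- use the diffs covering your planet's date.
wget http://planet.openstreetmap.org/daily/20090910-20090911.osc.gz
./osm2pgsql -P 5433 --slim -s -C 3000 -a 20090910-20090911.osc.gz
```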

Then, I set up an osmosis --rci working directory:

 /home/crschmidt/bin/osmosis --read-change-interval-init workingDirectory=/home/crschmidt/osm/osmosis/rci initialDate=2009-09-09_00:00:00

This sets up the directory with a configuration.txt file, which I modified to be:

 baseUrl=http://planet.openstreetmap.org/minute
 intervalLength=60
 changeFileBeginFormat=yyyyMMddHHmm
 changeFileEndFormat=yyyyMMddHHmm
 maxDownloadCount = 20

Finally, I have a 'keepup.sh' that runs on a cronjob:

#!/bin/bash
export DIR=/home/crschmidt
# Bail out if a previous run is still in progress
if [ -f $DIR/osm/keepup.lock ]; then
  if [ "$(ps -p `cat $DIR/osm/keepup.lock` | wc -l)" -gt 1 ]; then
    # process is still running
    echo "Locked"
    exit 1
  else
    # process not running, but lock file not deleted?
    rm $DIR/osm/keepup.lock
  fi
fi
echo $$ >$DIR/osm/keepup.lock
# Fetch any pending change files and pipe them straight into osm2pgsql.
# Note: the backslashes must be the last character on the line -- a
# trailing space after them breaks the line continuation.
$DIR/bin/osmosis -q --read-change-interval workingDirectory=$DIR/osm/osmosis/rci --write-xml-change "-" | \
$DIR/bin/osm2pgsql -S $DIR/osm/osm2pgsql/default.style -P 5433 --slim \
   -s -C 3000 -a - 2>> $DIR/osm/osm2pgsql.log
rm $DIR/osm/keepup.lock
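Scheduling the script with cron might look like this (the path and interval are assumptions; pick an interval that comfortably exceeds the typical run time, since the lock file only guards against overlap):

```shell
# Illustrative crontab entry: run the keep-up script once a minute.
* * * * * /home/crschmidt/osm/keepup.sh >/dev/null 2>&1
```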

The minutely diffs take about 5s apiece to process (based on 'time'). Hourly diffs took about 45-70 minutes to load. Almost all of this is disk-bound (and I have a machine with beefy disks), so your performance may vary.

Potential Gotchas

When creating your database, make sure you specify the encoding as UTF-8, otherwise you will run into character encoding problems. (This is currently the case for the Up-to-Date DB: I'll probably wait until the new planet is out, and update it after it comes out.)
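Creating the database with an explicit encoding might look like the following sketch (the database name follows the examples above; the template and owner are assumptions to adapt to your setup):

```shell
# Create the database with UTF-8 encoding up front, so later imports
# don't produce mangled characters. Owner/template are placeholders.
createdb -E UTF8 -T template0 -O youruser gis
# Then load PostGIS and the intarray module as described above.
```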

Mapnik as run on tile.openstreetmap.org

This is an excerpt from a mailing list post by Jon Burgess about the setup of the main OSM tile server (as of October 2009):

I use a single DB on the tile server but perform the full import into a
new set of tables. The replication diffs are applied into the main
tables roughly once per minute in parallel with any rendering. 

Performing concurrent rendering & importing definitely does slow things
down. I have seen some evidence that running both slows things down to
less than half the speed of running each individually but I have not
tried to quantify it.

During the full planet import I will import the data into a new set of
tables with the renderer accessing the old data. 

A breakdown of the steps I do when performing the bulk update is:
- Download new planet dump
- Stop the diff import
- Drop the tables used by the diffs: nodes, ways, rels.
- Import the new data with a new table prefix (osm2pgsql -p ...)
- Wait many hours for import to complete
- Stop rendering
- Drop the old point, line, polygon & roads tables
- Rename new tables with the planet_osm_ prefix
- Restart rendering daemon
- Restart diff import but with tile invalidation disabled
- Wait for diffs to catch up (may take a day or two)
- Turn on tile invalidation
- Touch the planet_import_complete flag to force all tiles to re-render

During those steps I also run the coastcheck program to generate an
updated set of coastline shapefiles.
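The "drop old tables and rename" steps above could be sketched as follows. This is not the actual tile-server script; the "new_" prefix assumes the bulk import was run with "osm2pgsql -p new", and the rendering tables follow osm2pgsql's default planet_osm_ naming:

```shell
# Hypothetical swap of freshly imported tables into place, done in one
# transaction so the renderer never sees a half-renamed state.
psql gis <<'EOF'
BEGIN;
DROP TABLE planet_osm_point, planet_osm_line, planet_osm_polygon, planet_osm_roads;
ALTER TABLE new_point   RENAME TO planet_osm_point;
ALTER TABLE new_line    RENAME TO planet_osm_line;
ALTER TABLE new_polygon RENAME TO planet_osm_polygon;
ALTER TABLE new_roads   RENAME TO planet_osm_roads;
COMMIT;
EOF
```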