From OpenStreetMap Wiki


Rather than downloading the entire planet.osm file every time it is updated, is it possible to do some sort of differential update, and only download the parts that have changed? I would guess this would bring a weekly update into the realm of 100 MB, rather than 1+ GB. Are these files provided by OSM, or is there some way of doing it using tools provided in Linux? Myfanwy 02:57, 12 November 2007 (UTC)

Hi Myfanwy. There is actually a daily diff download, introduced recently. I've added some description on the page here. -- Harry Wood 10:39, 12 November 2007 (UTC)
Fantastic, thanks for that. Now my house mates won't hate me for killing our internet every week. Are there any programs around that will do a one-click/one-command update of OSM data? A repository that I could add to sources.list for all the OSM tools (including things like this, if it exists) would be really useful Myfanwy 00:29, 13 November 2007 (UTC)
Yes. Information on how to use the diff files to reconstruct a full planet file is still missing from this wiki page currently. I would guess that you can use Osmosis (a set of Java tools). Based on this readme I found out that the diff is created by Osmosis in the first place. -- Harry Wood 09:34, 14 November 2007 (UTC)
As an update to this old discussion: the Planet.osm/diffs page now has the explanation of how to use the diffs. -- Harry Wood 15:38, 24 May 2011 (BST)
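Following up on the Osmosis pointer above: as far as I know, a daily diff can be applied to an existing dump with a single Osmosis pipeline, roughly like this (all filenames here are illustrative, not the actual download names):

```shell
# Apply a daily change file to an existing planet dump with Osmosis,
# writing a new merged dump. Filenames are illustrative placeholders.
osmosis --read-xml-change file=20110523-20110524.osc.gz \
        --read-xml file=planet-old.osm.bz2 \
        --apply-change \
        --write-xml file=planet-new.osm.bz2
```

The `--apply-change` task merges the change stream into the base planet stream, so repeating this once per daily diff keeps a local dump current.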


"The weekly dump starts at 01:00 each Wednesday night" - local time (which?), UTC, etc. ??

Yes, please, could somebody answer this? Vid the Kid 20:03, 20 January 2009 (UTC)
I think this phrase is ambiguous anyway. Is this 01:00 Wednesday morning or 01:00 Thursday morning (in England this would often be considered Wednesday night despite being past midnight)? Could someone who knows which is correct clarify please? Daveemtb 09:57, 10 June 2009 (UTC)
I've tweaked the text slightly. The dump starts at 01:11 in the early hours of Wednesday, UK local time. It takes about 12 hours to complete. Feel free to tweak the text. -- Firefishy 13:48, 11 June 2009 (UTC)
Firefishy, is it really local time (adjusted for DST), or is it UTC? Not that one hour will really make that much of a difference, but if we're going to make our maps as accurate as possible, our times should be as accurate as possible, too. -- Tahongawaka 00:02, 28 October 2010 (BST)
The planet.osm file has a timestamp in its header. Best to use that. Run time is 1:11am British Summer Time. --Firefishy 16:05, 29 October 2010 (BST)
<?xml version="1.0" encoding="UTF-8"?>
<osm version="0.6" generator="OpenStreetMap planet.c" timestamp="2010-10-27T00:11:03Z">
  <bound box="-90,-180,90,180" origin="" />
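The header timestamp mentioned above can be read without decompressing the whole multi-gigabyte dump; a small sketch (the filename is illustrative):

```shell
# Print the generation timestamp from a planet dump's XML header.
# head stops after two lines, so bzcat never decompresses the full file.
bzcat planet-latest.osm.bz2 | head -n 2 \
  | grep -o 'timestamp="[^"]*"' | cut -d'"' -f2
```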


As of November 2020, official torrents have been implemented at

Historical information:

A new BitTorrent experiment started in January 2020 with PBF planet dumps at Cquest (talk) 06:42, 29 January 2020 (UTC) You're welcome to test and report your download results below!

  • Cédric: 28 min download at 32 MB/s, with fiber link (Free, in France) (2020-01-28)
  • Thibaultmol: planet-200120 download at roughly 44 MB/s, gigabit down (fiber to endpoint, coax to house) (Telenet Klik Business, Belgium) (2020-01-29)
    • Thibaultmol: history-200120, same download speed (2020-01-30) (only 3 seeds available at the time, plus HTTP sources)
  • Jacques Lavignotte: 2020-02-10 planet200203 available through Transmission-daemon running on a Raspberry Pi 4 + external USB disk. Fiber 400 Mb/s (Orange, French operator). Subject to interruptions.
  • Wchatx: 2020-10-02 planet200921.osm.pbf retrieved in 25 minutes from 4-6 peers and 4 web seeds over a Google Fiber gigabit connection

We need an IPv6 tunnel to download planet.osm via torrent? Seriously....? --Regulator 09:25, 24 May 2011 (BST)

Hmmm, no, not at all. The torrent file specifies several public trackers, only one of which is IPv6 (the others are IPv4). It also specifies a webseed, so if your client supports this extension it should work even if all trackers and seeds/peers are dead. So it should work for you over IPv4, yes. I haven't heard of such problems yet (and have had several success stories). How exactly does the problem manifest, and what BitTorrent client are you using? Are you able to download other torrents normally? (I'm currently downloading from an IPv4-only host from 5 peers using the Transmission client on Debian GNU/Linux) --mnalis 13:23, 24 May 2011 (BST)

Currently an experiment, I might automate this, maybe talking to the other mirror people as well. --Skinkie 20:31, 4 April 2009 (UTC) The origin is powered by Oxilion.

Hi, (hopefully I am not completely blind and just did not see it, but) is there any reason not to distribute the planet file via BitTorrent? I would expect that this would greatly reduce the load on the OSM servers, especially if users were prompted to use it. -- User:Kinglui 08:21, 22 August 2008

Gurkensalat is not hosting planets (anymore?) User:Milovanderlinden 14:09, 11 June 2009
Well, this section Planet.osm#Bittorrent mentions some torrents. Maybe you are completely blind! :-)
Mind you, it looks like there hasn't been a latest planet for a while. Maybe User:Skinkie knows more
-- Harry Wood 15:06, 24 June 2009 (UTC)
I am happy to provide the code or service to do so, but I am not willing to do it until the crap about seeding is resolved. The basic idea would be that the seed is started at and then picked up by the mirror. That is the smartest way of getting the data to the mirrors, and the more mirrors join, the faster the internal traffic will become anyway. --Skinkie 14:45, 3 July 2009 (UTC)
Have there been any movements in that direction? I do not see why it should be complicated on the OSM side; it should be as simple as "apt-get install mktorrent rtorrent", adding a startup script and a config file for rtorrent to look in the directory where torrents are saved, and modifying the script that creates the planet to also call mktorrent to produce a .torrent file. A one-hour job at most, easily 15 minutes if you're familiar with the programs. We could use public trackers, or provide our own. And it would actually lessen the network load on OSM servers --mnalis 20:47, 24 October 2010 (BST)
Each week's export would have to be named differently, which wouldn't really be a problem if it's coded YYYYWW to keep the American date format and European date format from confusing anyone. Would all the weekly archives still be seeded or removed from the OSM torrent server? -- Tahongawaka 00:05, 28 October 2010 (BST)
I think we should simply name the torrents as the planet*.osm.bz2 files are currently named at . Also, all files present there should also be seeded by OSM. When planet files get removed, so should the corresponding torrent files (rtorrent, for example, automatically handles such removals and stops seeding). As I said, technically that is a rather simple procedure to configure and set up; it is just a question of good will. --mnalis 13:42, 29 October 2010 (BST)
Fine, but who should be the tracker? --Firefishy 14:33, 29 October 2010 (BST)
Yes, in addition to there is and we should also add (maybe even with highest preference?) in order to promote IPv6 usage for the folks that have it (or the initiative for those who don't yet). I'd recommend those three, as there should always be at least two trackers (to handle possible outage), and one of the three is ipv6-only. --mnalis 19:06, 8 November 2010 (UTC)
For example, to create a test torrent file you just run this (one would replace changesets-101103.osm.bz2 with a real planet file of course, and put it in the script just after planet-xxxxxx.osm.bz2 is generated; this is just an example for testing):
 mktorrent -a,udp:// \
 -a udp:// -a udp:// \
 -w changesets-101103.osm.bz2
If you had a big swarm with at least a few smarter clients, the web seed (-w option) would even allow the swarm to run if your BitTorrent seeding client is not working! However, for normal use you'd of course want to run a BitTorrent seeding client; for example, run rtorrent with a ~/.rtorrent.rc config file like this:
schedule = watch_directory,10,10,load_start=/home/mnalis/public_html/osm-planet/*.torrent
schedule = tied_directory,10,10,start_tied=
schedule = untied_directory,10,10,close_untied=
use_udp_trackers = yes
dht = auto
dht_port = 6881
peer_exchange = yes
And that is all; it would all work automatically. rtorrent (running under screen, for example) would automatically find new torrent files and seed them to interested clients, and when the cleanup script removes planet-xxxxxx.osm.bz2 (and planet-xxxxxx.osm.bz2.torrent!), it would automatically stop the seeding. All in all, 15 minutes of work at most. --mnalis 21:26, 8 November 2010 (UTC)
I've now set up as a proof of concept (on non-OSMF servers, unfortunately, so expect it to be lagging somewhat), for those willing to try it. --mnalis 16:43, 12 December 2010 (UTC)
Just a note that the torrents seem to work fine. I'm currently seeding the newest planet over IPv4 and IPv6. Ideally the torrents should indeed be created on OSM's servers to reduce delays and unnecessary file transfers. Avij 21:20, 16 December 2010 (UTC)
Seeding from the same servers that host is difficult; our host blocks most ports. I'll see what I can do -- Firefishy 22:48, 29 December 2010 (UTC)
Thanks Firefishy! rtorrent has a port_range (and dht_port) directive if you need to limit it to just a fixed port or two in order to allow it through the firewall. And even if it turns out it is not feasible to run a torrent client on the machine, the thing would work even without OSM running a torrent client, as long as there was at least one webseed-aware torrent client in the swarm (which would then automatically use standard HTTP web access to retrieve the first copy and share it with the rest of the swarm: see the -w options to mktorrent in my proof-of-concept [script]) --mnalis 01:23, 30 December 2010 (UTC)
Hey Firefishy, have you perhaps had any luck adding BitTorrent support to Anything I could do to help? --mnalis 13:41, 12 March 2011 (UTC)
Just an update that the [1] torrent service will be shutting down soon. It was a (working) proof of concept, intended so it could be implemented directly on servers (which is the only place where it makes sense, and is easiest and cheapest to do), but it was not to be... Thanks to everyone who participated, it was a fun ride! --mnalis (talk) 22:30, 14 September 2014 (UTC)


Hi, I am searching for maps (ideally a country map) of Taiwan. I see that there was a link ([2]) which provided Taiwan maps. It is unfortunately down. Is there by any chance someone who has those maps? I would be very grateful :-) I use a Garmin eTrex Vista HCx, so it would be even better if it were already converted to a Garmin map. As I am a real newbie, I would also be happy about a short tutorial on how I can make such a country map myself. Maybe I am blind, but so far I couldn't find that. Thanks a lot in advance. --AngMoKio 14:13, 24 June 2009 (UTC)

Try CloudMade downloads: More information: OSM Map On Garmin -- Harry Wood 15:01, 24 June 2009 (UTC)
Thanks a lot. Will try that out. --AngMoKio 07:59, 26 June 2009 (UTC)
Resolved: Mateusz Konieczny (talk) 05:55, 8 March 2023 (UTC)

How to download the latest Planet data?

I am unable to download the latest planet files from "". "planet-080528.osm.bz2 29-May-2008 00:10 4.0G" is the latest file which will download. I thought only the older files' links were the ones left broken? Can someone tell me how to download the latest planet file so I can then use the daily diffs to keep my data current? User:Maw269 15:39, 11 May 2010

Internet Explorer and some other browsers cannot download files over 4 GB. Firefox works; you will also likely require a filesystem that supports files over 4 GB (e.g. NTFS on Windows).
-- Firefishy 22:44, 11 May 2010 (UTC)
Excellent, that did the trick....Many thanks for the quick reply! -- User:Maw269
Resolved: Mateusz Konieczny (talk) 05:53, 8 March 2023 (UTC)

Symlink for daily latest

Hi! There seems to be a symlink for the latest planet (the whole planet file), but there is no latest symlink for the daily updates at It would make automatic downloads easier if we had a symlink in this directory too. Could somebody please create a dynamic symlink? Thank you very much! --Marqqs 11:35, 9 December 2010 (UTC)

Not sure of the pros and cons of setting that up. Seems like it might be a reasonable request.
I used this ruby script to do daily diff downloads, but this is all a bit shaky. These days the recommendation is to use Osmosis and set up 'replication'. This can still be done on a daily basis if you prefer, but it uses the minutely and hourly diff downloads, which are of a different format. Osmosis decides which files to download automatically, and in general it's all a bit more reliable. My understanding is that daily diffs are only available as a legacy thing
-- Harry Wood 15:45, 24 May 2011 (BST)
Meanwhile I solved the problem, using a strange-looking line in my download script:
OSCFILE=$(date -u -d yesterday +%Y%m%d)"-"$(date -u +%Y%m%d)".osc.gz"
But having a static download link for daily change files would be really nice. FYI: I don't use Osmosis; I use the very lightweight tool osmchange to merge the daily change files into an existing .osm file. Here is the description of the toolchain. --Marqqs 19:44, 24 May 2011 (BST)
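The date trick above can be wrapped into a small download sketch. Since the daily-diff directory URL is not given above, BASE_URL below is a hypothetical placeholder; only the filename construction is taken from the script line quoted earlier:

```shell
#!/bin/sh
# Build yesterday's daily change filename from UTC dates and fetch it.
# BASE_URL is a hypothetical placeholder for the daily .osc.gz directory.
BASE_URL="https://example.org/planet/daily"
OSCFILE="$(date -u -d yesterday +%Y%m%d)-$(date -u +%Y%m%d).osc.gz"
echo "Fetching $BASE_URL/$OSCFILE"
wget -q "$BASE_URL/$OSCFILE"
```

Note that `date -d yesterday` is GNU date syntax, so this assumes a Linux toolchain, as in the original script.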

Raw SQL dump

Is there a raw sql dump available besides Planet.osm?--Kozuch 15:31, 22 December 2011 (UTC)

As far as I know: no Mateusz Konieczny (talk) 05:53, 8 March 2023 (UTC)

Geofabrik changes, Sept 2012

Geofabrik's main download site now appears to be holding ODbL data. The old site continues at with CC-BY-SA data from mid-September. I have no idea if this is intended to be a permanent change. Perhaps a German speaker could find out? --tms13 08:41, 27 September 2012 (BST)

Yes, some months ago Frederik told us that the download path will permanently change for ODbL data to --Stephan75 17:05, 27 September 2012 (BST)
Resolved: Mateusz Konieczny (talk) 05:55, 8 March 2023 (UTC)

Is there a reason why is not listed in the available sources?

Adavidson (talk) 01:41, 19 October 2016 (UTC)

Probably because no one has added it so far Mateusz Konieczny (talk) 05:51, 8 March 2023 (UTC)

Resolved: is nowadays listed Mateusz Konieczny (talk) 05:58, 8 March 2023 (UTC)

state.txt for planet extract

It would be nice to have a state.txt file for the latest planet.osm.pbf file. After every re-import of the planet file, I always have to hunt for a suitable state.txt for minutely updates. What about adding a matching state.txt to the download section? --Derstefan (talk) 19:39, 25 July 2017 (UTC)

Where to best mention OSMaxx on this wiki page?

Does anyone have a clue what would be a good section or position for OSMaxx here on this wiki page? --Stephan75 (talk) 11:42, 7 October 2017 (UTC)

Where is the repository for ?

Where does the code for reside? I expected it at but failed to track it down. seems not to mention it

Mateusz Konieczny (talk) 05:51, 8 March 2023 (UTC) and - thanks to simonpoole Mateusz Konieczny (talk) 07:38, 8 March 2023 (UTC)

EC2 metadata

User:Mmd added a note stating "EC2 metadata lookup might slow things down, see for discussion". That issue is for the C++ SDK, not the AWS CLI. I don't see how "long client configuration creation times" would impact the CLI. Is there any evidence that this is a real issue? I've been able to max out my network with the CLI Pnorman (talk) 00:02, 11 October 2023 (UTC)

I experienced the issue first hand when using the AWS CLI (not the C++ SDK), and was looking for solutions. Adding the environment variable as mentioned in issue did in fact fix the issue for me. Maybe that's specific to my environment. mmd (talk) 13:36, 27 October 2023 (UTC)
The EC2 metadata service is used by the AWS CLI; it allows the CLI to assume an assigned EC2 host role when running on an EC2 instance. The EC2 metadata service is supported by all of AWS's SDKs. A common reason the EC2 metadata service is queried is that there are missing or invalid credentials, or the AWS region is not known. -- Firefishy (talk) 15:42, 27 October 2023 (UTC)
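For reference, the environment-variable workaround mmd mentions is, as far as I know, AWS_EC2_METADATA_DISABLED. A hedged sketch of using it with an anonymous planet download; the osm-pds bucket and key names are assumptions, so verify the current layout before relying on them:

```shell
# Skip the EC2 instance-metadata lookup so the AWS CLI does not stall
# waiting for it when running outside EC2.
export AWS_EC2_METADATA_DISABLED=true
# Anonymous (unsigned) download from the osm-pds open-data bucket.
# Bucket/key names here are assumptions; check the bucket listing first.
aws s3 cp --no-sign-request s3://osm-pds/planet/planet-latest.osm.pbf .
```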