Talk:Free The Postcode

From OpenStreetMap Wiki

UK Coverage?

It'd be nice to know the coverage of the UK data - is it concentrated in some areas, or does it cover most of the country? To know that, we'd want a list of UK postal towns/areas, their one- or two-letter codes, and how many first-part numbers each has.

We could then produce something showing: "OX - Oxford - 1-44 - 1,2,4,7,20,26,29 covered"

Does anyone know if we can get a free database showing this, or should we get people to contribute one? Gagravarr

There's this. No idea as to copyright status. --Richard 08:55, 25 May 2006 (UTC)
The guy behind jibble had to pull his, as it wasn't copyright clean. He is, however, interested in using free the postcode for the new version! I was actually thinking that we could use this sort of process to find out when we'd re-created data of a similar degree of coverage -- Gagravarr 09:44, 25 May 2006 (UTC)
I'm guessing that the positional information on his file was a (copyrighted) derived work, but what about the OX7,OX8 etc. bit, which I think is what you're after? I'm not 100% sure that such a list would even be copyrightable....
But then, maybe it might be easiest just to get it from a few hours' Google searching. --Richard 10:19, 25 May 2006 (UTC)
Royal Mail (or whoever the data came from on Jibble) might still claim database rights over the collection of OX7, OX8 etc, even if it didn't come with the lat+long. They might not be right, but we're probably going to have to play it safe. --Gagravarr 12:14, 25 May 2006 (UTC)
The guy who runs suggests that the Royal Mail claim they have ownership of the postcode info regardless of who has collected it. As in "you can't even make a list of area codes like "OX7", let alone link it to coordinates; this is what the Royal Mail enforcement people tell me, anyhow". There is a nice list of postcode letter parts, which we ought to be able to use for starters. Does anyone know about the licence compatibility with Wikipedia? --Gagravarr 19:17, 25 May 2006 (UTC)
There is an even better Wikipedia page, since it seems to list all of them, along with the name, and in some cases also the list of first numbers. I think this would make a good basis, even if we do have to put that one page under the GFDL --Gagravarr 19:43, 25 May 2006 (UTC)
It seems to claim that it is public domain - but I wonder how it got made in the first place. 80n 19:51, 25 May 2006 (UTC)
I can't see where that one talks about the licence. It does seem very detailed (including apparently unused codes), so I do have to wonder where all of it came from. --Gagravarr 10:05, 26 May 2006 (UTC)
The guy behind the postcode info doesn't know if it's based on Royal Mail data or not (I've emailed to ask). I suspect we need to play it safe then, and not use it? --Gagravarr 13:52, 26 May 2006 (UTC)

End Result

What's the required end result? A cloud of latitude/longitude coordinates for a given postcode? A postcode is an area, not a point, so how does this fit together? When I look up a point, I want to test whether it's inside the area with a given postcode; I don't want to look at all the points in the region and decide which is closest. In a small town, 3 kilometres away could be a different town, while in a city it could still be the same city... Cimm 7 Nov 2006

Dave Stubbs' postcode map shows what happens when you derive postcode areas as bisecting lines between node data. We get a pretty good approximation of a map, but even so, it's not completely accurate. The more data, the better it gets, especially if you add nodes near the edge of an area.
Mind you, that's looking at areas with different postcode prefixes. Your "end result" question relates to the small area covered by one particular postcode. Should we ultimately aim to gather a cloud of different GPS readings for different buildings which share the same postcode? I don't know. Probably no harm in adding that level of detail. Or maybe a node at either end of the street, with two different nodes close together, signifying where neighbouring buildings have different codes.
Pragmatically speaking, we'll never reach that level of accuracy everywhere anyway, so uses for that level of detail will be limited.
-- Harry Wood 17:47, 28 August 2007 (BST)
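For what it's worth, the bisecting-lines construction described above is just a nearest-neighbour (Voronoi) partition of the known nodes. A minimal sketch of that lookup in Python, using made-up example nodes rather than real Free The Postcode data:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

def nearest_postcode(lat, lon, nodes):
    """Classify a point by its nearest known postcode node,
    i.e. the implicit Voronoi cell it falls in."""
    return min(nodes, key=lambda n: haversine_km(lat, lon, n[0], n[1]))[2]

# Made-up sample nodes: (lat, lon, postcode prefix)
nodes = [
    (51.752, -1.258, "OX1"),
    (51.750, -1.235, "OX4"),
    (51.786, -1.300, "OX2"),
]
print(nearest_postcode(51.753, -1.256, nodes))  # → OX1
```

More nodes simply means smaller, more accurate cells, which is why readings near area boundaries help the most.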


There is a bit of a debate going on on the main page, so I thought I'd bring it in here.

If you find your house on a map and then calculate the lat/lon from that, you are making a derived work, just as if you went on Google Maps and traced all the roads (that's just many points instead of one). Sad as it is, that data is owned and licensed by the OS, and even though our tax pennies paid for it, we have to pay again, and again, and again. This is why was set up with out-of-copyright maps, but they aren't all that accurate. -- Tshannon 11:31, 20 November 2006

Yes there is a lot of 'discussion' on the main Free The Postcode 'article'. Can we please clearly state some established facts in the 'Legal' section there? I'll kick this off. In the meantime I'm moving all that discussion to here (following) -- Harry Wood 11:01, 18 April 2007 (BST)

Is it possible to use positions found with Google Earth? For example, if a person finds their house using Google Earth and pins it, that provides a lat/lon which could be entered.

as long as they don't look up the post code in a database it should be ok.
really? Are you a lawyer?
Surely it does look it up in a database. Besides which, you'd have to check if using the data this way was compatible with the Google Earth licence. Either way, I'd want proper legal advice before we started doing this

There are also algorithms for converting standard OS grid references to lat/lon. If you use an OS map to look up your home and enter that, what's the position (the legal position, that is...)?
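The mechanical half of that conversion - grid letters plus digits to full easting/northing in metres - is simple and uncontentious. A sketch assuming the standard National Grid letter layout (5x5 squares with the letter I omitted); the gridref_to_en helper name is mine, not from any tool discussed here:

```python
def gridref_to_en(ref):
    """Convert an OS National Grid reference like 'SU 372 155' to
    full (easting, northing) in metres. Grid letters index the
    500 km and 100 km squares; the letter I is skipped."""
    ref = ref.replace(" ", "").upper()
    l1 = ord(ref[0]) - ord("A")
    l2 = ord(ref[1]) - ord("A")
    if l1 > 7:  # letters after I shift down by one
        l1 -= 1
    if l2 > 7:
        l2 -= 1
    e100 = ((l1 - 2) % 5) * 5 + (l2 % 5)     # 100 km square easting index
    n100 = (19 - (l1 // 5) * 5) - (l2 // 5)  # 100 km square northing index
    digits = ref[2:]
    half = len(digits) // 2
    east = int(digits[:half].ljust(5, "0"))   # pad to metre resolution
    north = int(digits[half:].ljust(5, "0"))
    return e100 * 100_000 + east, n100 * 100_000 + north

print(gridref_to_en("SU 372 155"))  # → (437200, 115500)
```

Getting from OSGB36 easting/northing to WGS84 lat/lon then needs a transverse Mercator inverse plus a Helmert datum shift, which in practice a library such as pyproj (EPSG:27700 to EPSG:4326) would handle.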

I would guess that finding a location and deriving lat/lon from OpenStreetMap is OK - if so, perhaps the web interface should have a drop-down giving the option of this method of data collection vs. GPS...

At present it's unlikely that getting a position from OSM is legally safe. In brief, OSM data is licensed as Attribution-ShareAlike (CC BY-SA 2.0), so any derived work also has to be ShareAlike - but FreeThePostcode isn't; it's public domain. This is a possible argument for rewriting the OSM licence to explicitly permit such derivations. --Richard 12:01, 30 Aug 2006 (BST)

Maybe the form should be updated to have some options about how the data was collected - if any of the data is then proven to be shaky it can be removed from the database later without putting a question mark over all of the data held.

Deriving from OSM maps

Is there anything to stop people from looking up Postcodes based on OSM and inputting the Lat and Long into freethepostcode? --Nick 17:13, 15 Mar 2006 (UTC)

Nope, since you're looking at free data. Use streets, not Landsat though, as it can be way out. Oh, and the viewer would need to tell you the lat/lon of the cursor somehow. The applet can do this trivially. User:Steve
JOSM seems pretty good about showing you the lat/long of where your mouse is --Gagravarr 10:05, 26 May 2006 (UTC)
Pragmatically speaking, I don't think anyone in the OSM community has any objection to such derivations being released as public domain. However my understanding, as Richard said above, is that strictly they should be share-alike (so can't be public domain).
Seems like a bit of an untidy outstanding licensing issue, but maybe we could just ignore it -- Harry Wood 11:21, 18 April 2007 (BST)
For this reason I think OSM's license is a terrible choice, I really wish that it was also a public domain source. The idea of forced attribution on something that will be added to for hundreds of years to come is very untidy and badly thought out. Bitplane 07:15, 22 January 2009 (UTC)

Is Data Mining the Web Legal?

Most postcodes are probably on the web somewhere. I googled all the postcodes of my relatives and friends and all of them were there somewhere for some reason. Even someone I know who lives in a tiny rural Welsh town. For example mine is on the web because a neighbour made a planning application to the council which was published.

It might be possible to produce a script to automatically query the web (I don't think Google allows this anymore, but other search engines do, e.g. MSN). You write a program to detect postcodes found on web pages and parse out the associated address. You then automatically look up the address on OSM to get a lat and lon.
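The detection step could be a regular expression over the general UK postcode shape. The pattern below is a simplified approximation for illustration, not the full Royal Mail validation rules:

```python
import re

# Simplified UK postcode pattern: outward code (1-2 area letters,
# district digit, optional extra character), then inward code
# (digit + two letters), with or without a separating space.
POSTCODE_RE = re.compile(r"\b([A-Z]{1,2}[0-9][0-9A-Z]?)\s*([0-9][A-Z]{2})\b")

def find_postcodes(text):
    """Return normalised 'OUTWARD INWARD' postcodes found in a blob of text."""
    return [f"{out} {inw}" for out, inw in POSTCODE_RE.findall(text.upper())]

sample = "Planning application at 12 High St, Oxford OX1 4AR and 3 Mill Lane, sn1 2aa."
print(find_postcodes(sample))  # → ['OX1 4AR', 'SN1 2AA']
```

The surrounding text of each match would then be the candidate address to parse and look up.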

This process, if done automatically, will result in many errors, because it will be difficult to parse addresses taken from web pages perfectly. However, you can reduce the error by re-querying the web for each postcode that has been found, finding other addresses in the same postcode, then clustering the lat/lons and taking an average.
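The clustering step could be as simple as anchoring on the median and averaging only the nearby points; a rough sketch, where the 2 km threshold and the sample coordinates are illustrative assumptions:

```python
from math import cos, radians, sqrt
from statistics import median

def robust_centre(points, max_km=2.0):
    """Average a cloud of (lat, lon) guesses for one postcode,
    dropping points more than max_km from the median centre.
    Uses a flat-earth distance approximation, fine at postcode scale."""
    lat0 = median(p[0] for p in points)
    lon0 = median(p[1] for p in points)
    def dist_km(p):
        # ~111 km per degree of latitude; scale longitude by cos(lat)
        dlat = (p[0] - lat0) * 111.0
        dlon = (p[1] - lon0) * 111.0 * cos(radians(lat0))
        return sqrt(dlat * dlat + dlon * dlon)
    kept = [p for p in points if dist_km(p) <= max_km] or points
    return (sum(p[0] for p in kept) / len(kept),
            sum(p[1] for p in kept) / len(kept))

guesses = [(51.750, -1.250), (51.752, -1.252), (51.90, -1.0)]  # last is a parse error
print(robust_centre(guesses))
```

Anchoring on the median rather than the mean matters here: a single badly parsed address can drag a mean centroid far enough that every good point looks like an outlier.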

An appropriate set of search queries likely to find addresses might be a list of the first four characters of the postcode, of which I think there are only about 1000-ish, maybe? If it's too copyright-dodgy to do this, you could guess the first four characters, as there is a small enough number of combinations for a computer to run through, i.e. A-Z, A-Z, 1-99 is 66,924 combinations?

Probably technically possible but would this produce a legally free list of post codes?

--Kesfan 18:20, 12 January 2008 (UTC)

I've been looking into this and made a proof of concept, but data-mining Google isn't allowed under their T&Cs: they have canned their SOAP API, banned known crawlers (e.g. Python's urllib) and will actively ban your IP address if you make too many search requests. However, they do provide an AJAX library for Google searching, so a human-guided, semi-automatic search may be possible using a web front-end. The main problem I can see from my limited PoC is that the best sources for postcodes are business directories, which may or may not be derived from Codepoint, so these would have to be filtered out.

Here's my list of ideas so far-
  • An AJAX enabled website connected to a database that lets the user select a level 13 or higher area on OSM, downloads the data for this area, finds "highway=" tags with "name=" tags but without "postcode=" tags (server side). It lets the user manually set the town and county fields, then lets them choose one street at a time and searches using Google's AJAX API for "Street Name, Town Name, County Name" and validates and strips out postcodes after the bold close-tag. Before displaying the results to the user it checks the 2nd level domain from the search result against the database, so that the same domain isn't used twice (taking one post code from each website can't be seen as a database right infringement), then after user validation it can submit the changed way back to OSM. We can also add "source:postcode=website" to the ways in case of future legal dispute.
  • Another idea is to contact hand-built business directories that do not use the codepoint database and get permission to either crawl their sites for the data, or get them to send us their lists of addresses. We automagically amend the OSM database using their data and credit them with "source:postcode=their site" or something similar. Local directories are our best bet here.
  • A mixture of this idea and the first- contact business directories and ask if they'd like their website to be part of a custom Google search engine used for OSM researchers, and use this to search for the postcodes of street names. This way we can search only sites that have explicitly given permission, and have semi-automatic but legal way of adding the data to OSM. I'll try contacting a few directories and see if I get anywhere. Watch this space.
I have set up a custom Google search engine, currently I've only got permission from one site though. You can try it here. If you'd like to contribute contact me by my member page.
  • We could start at the Open Directory Project and crawl websites for addresses, avoiding the need for Google. This wouldn't yield so many results and we'd have to check ODP's terms and conditions.

Yahoo has free web services that can be used for mining (however they're limited to 5000 queries per day per IP).

Another problem is how to solve the "closest town" problem for streets. Should streets have an is_in tag to make this easier? If so, this project could also add the is_in tags too.

Bitplane 07:11, 22 January 2009 (UTC)

Microsoft Live search allows queries via a web service. There's no need to actually use Google, you're just looking for a long list of web pages in no particular order so any search engine would do I think:

Microsoft Live search API documentation

I hadn't thought about the business directory problem you suggest. How about narrowing it to local council websites, with the first few characters of a postcode known to be in the town (taken from Free The Postcode), e.g. SN1?

You pick up a lot of postcodes with addresses this way, as the councils have a lot of interest in the local area and publish a lot of information. Planning applications are a good source of info.

--Kesfan 09:13, 26 May 2009 (UTC)

Hi, just wanted to say that I've actually done this fairly successfully, and it's being used by several websites. I don't charge for it, but I do ask that people acknowledge the service. I offer a free API. I don't plan to charge, because it would then compete with paid alternatives, and it isn't 100% accurate, but it's surprisingly good. [1] More than happy to receive any communication from people about this mini project of mine. Thanks --Wikichris 00:16, 30 March 2011 (BST)

Your service sounds like it is free-as-in-beer rather than free-as-in-speech, and it doesn't address the issue of whether the data you have accessed might infringe the copyright of others. Also, the information on this page is seriously out of date. You should read the page on Codepoint Open. We have all live postcodes for Great Britain geolocated with an acceptable licence. The task now is to associate these with buildings. So far we have around 80,000 buildings (roughly analogous to delivery points), and about 0.1% of all postcodes properly entered in OSM.

For more wiki pages covering OSM legalities see Category:Legal

DE Coverage?

I found this by accident; it seems to be a database of German postcodes with tagged areas. -- Dekarl 12:13, 2 February 2007 (UTC)

Data -> PLZ. Free geometries of German PLZ (postcode) areas. Many thanks to the donor(s)! We will see whether we can integrate this data into OpenGeoDB at some point. (from OpenGeoDB)

See for the latest release... --Traut 16:59, 14 January 2008 (UTC)

IE Postcodes

I'm editing this section to more correctly reflect the non-postcode nature of what we currently use in Ireland.

Country Prefix

Free_The_Postcode#HR names the HR prefix for the postal code. Is this still valid? Such prefixes were recommended once in Germany, while nowadays the postal service asks NOT to use them, but to use the country's name instead. --traut 13:34, 4 February 2008 (UTC)

NO Coverage

In Norway the postcode is four digits followed by the office name (nnnn NAME) written in upper case. The number is a running number starting from 0001 (the Royal Palace, I think). The first digit roughly indicates a large geographic area and the second a narrower area, while the last two are only running numbers. The first digits do not increase sequentially as you go along the coast, but have a slight tendency to jump, especially close to centres with high population density. --Skippern 00:32, 8 September 2008 (UTC)

BR Codes

BR postcodes are aa-nn.nnn-nnn or nn.nnn-nnn-aa, where aa is the state and the numbers are in some way broken down to street-level postcodes, though I don't know the system. --Skippern 00:48, 8 September 2008 (UTC)


Any chance they could be exported to and vice versa? These two projects have similar interests; it could be useful to integrate them, no?

postal_code tag

We need to link to Key:postal_code and explain how this OSM tag relates to Free The Postcode. -- Harry Wood 23:16, 10 May 2009 (UTC)

The End

I read on that the data was released, and it is 100% at .

Should this page be marked with the equivalent of Wikimedia's {{historical}}?

Or is it actively being used for other countries? --Jayvdb (talk) 04:51, 2 September 2016 (UTC)