Talk:Wikidata


Tools

Is there a tool to add Wikidata tags to existing Wikipedia tags? --LA2 (talk) 10:37, 6 August 2014 (UTC)

This should be quite easy to do. If an object has the tag wikipedia:{lang}={article}, it suffices to read the URL http://www.wikidata.org/wiki/Special:ItemByTitle/{lang}wiki/{article} and get the corresponding Wikidata code. However, I think it is premature to do any bulk addition of wikidata tags. I personally would oppose doing that, precisely because the wikipedia -> wikidata translation is such a trivial thing to do (it can be done on the fly by any application). Augusto S (talk) 20:53, 25 August 2014 (UTC)
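The on-the-fly lookup described above can be sketched in Python; the function name and the tag-parsing convention (wikipedia={lang}:{article}) are illustrative, not an existing tool:

```python
from urllib.parse import quote

def item_by_title_url(wikipedia_tag):
    """Build the Special:ItemByTitle URL for a wikipedia={lang}:{article} tag.

    Wikidata's Special:ItemByTitle page redirects to the item linked to
    the given wiki page, if any.
    """
    lang, _, article = wikipedia_tag.partition(":")
    # Wiki page titles use underscores instead of spaces in URLs.
    title = quote(article.replace(" ", "_"))
    return f"https://www.wikidata.org/wiki/Special:ItemByTitle/{lang}wiki/{title}"

print(item_by_title_url("en:Douglas Adams"))
# → https://www.wikidata.org/wiki/Special:ItemByTitle/enwiki/Douglas_Adams
```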

Progress

How far have we come?

| Date       | Wikipedia keys | Wikidata keys | Wikidata keys as % of all objects | Wikidata keys as % of Wikipedia tags | Wikidata keys as % of Wikidata items with locations |
|------------|----------------|---------------|-----------------------------------|--------------------------------------|-----------------------------------------------------|
| 2014-08-06 | 355 026        | 18 676        | 0.00                              | 5.26                                 |                                                     |
| 2020-08-09 | 1 279 225      | 1 740 639     | 0.00                              | 136                                  | 21.6                                                |

--LA2 (talk) 10:37, 6 August 2014 (UTC)

I've added a new column: 'Number of wikidata articles with location tag'. This value hints at roughly how many Wikidata items that have a location also have a corresponding node/way/area in OSM. A value of 100% means that every Wikidata item with a location is also mapped to an OSM object; less than 100% suggests that there are nodes/ways/areas in OSM that still need a corresponding wikidata tag. It is computed as a percentage by dividing the number of wikidata keys by the number of Wikidata items with location statements. Coolmule0 (talk) 15:41, 9 August 2020 (UTC)
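The column above is simple arithmetic; here is a minimal sketch of the calculation. The 8,000,000 item count is a made-up illustration, not the real 2020 figure:

```python
def wikidata_key_coverage(osm_wikidata_keys, wikidata_items_with_location):
    """Percentage of location-bearing Wikidata items matched by an OSM wikidata key."""
    return 100 * osm_wikidata_keys / wikidata_items_with_location

# 1,740,639 OSM objects with a wikidata key (from the table above);
# 8,000,000 is a hypothetical count of Wikidata items with a coordinate (P625).
print(round(wikidata_key_coverage(1_740_639, 8_000_000), 1))  # → 21.8
```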

OSM's IDs

Why are OSM's IDs not stable? --Pastakhov (talk) 17:48, 16 September 2014 (UTC)

One case: a feature may be mapped as a node at first, but it is later changed to an area for the sake of more geographical precision, and consequently its ID changes.
Another case: a new user has trouble changing the current geometry of an area, so he simply deletes it and recreates it.
Another case: a vandal silently deletes an object. Later a good mapper sees the feature is missing, but is not aware the previous object was deleted, and creates a new one instead.
There are probably other specific cases, but the main takeaway is that in OSM there is no guarantee a feature will always be represented by the same object. --Jgpacker (talk) 18:12, 16 September 2014 (UTC)
Have any idea how to get around this problem? --Pastakhov (talk) 03:13, 17 September 2014 (UTC)
I see only one solution to preserve the integrity of the data: use triggers, and use a key like wikidataid for linking from Wikidata to OSM. A trigger should not allow deleting objects that have a wikidataid key which is used in Wikidata (the vandal case), unless the transaction contains another object that receives this wikidataid key (the other cases). Is this theoretically possible? --Pastakhov (talk) 03:52, 17 September 2014 (UTC)
My impression is that the OSM community is too conservative to do something like this. So far, they have avoided putting restrictions on the user as much as possible. I believe it could be possible, but the benefits to OSM of doing this don't seem to be worth it (I might be wrong). --Jgpacker (talk) 12:28, 17 September 2014 (UTC)
Well, there is another way: make a selection from the edit history and find wikidataid keys that were deleted or changed. Then check whether those keys exist in the current OSM database and in Wikidata. The filtered wikidataid keys are given to the community for recovery. Alternatively, instead of a wikidataid key we could use the existing wikidata key, if it is possible to link back from Wikidata to OSM via the wikidata key. What do you think about this? --Pastakhov (talk) 13:59, 17 September 2014 (UTC)
I think it's a good idea to have an updated list of changed/removed wikidata ids for review. I'm not sure how to do this, but I think it's possible. Yes, as said on the page, we can link both from OSM to Wikidata, and from Wikidata to OSM, and I believe some people already do this. --Jgpacker (talk) 14:14, 17 September 2014 (UTC)
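The review list discussed above could be produced by diffing two snapshots of the object-to-Wikidata mapping. A minimal sketch with made-up data (the snapshot format is hypothetical):

```python
def changed_or_removed(old, new):
    """Compare two {osm_object: wikidata_qid} snapshots and report objects
    whose wikidata value was removed or changed between them."""
    report = {}
    for osm_obj, qid in old.items():
        if osm_obj not in new:
            report[osm_obj] = (qid, None)          # object or tag deleted
        elif new[osm_obj] != qid:
            report[osm_obj] = (qid, new[osm_obj])  # tag changed
    return report

# Hypothetical snapshots taken a week apart:
old = {"way/100": "Q42", "node/7": "Q64", "relation/3": "Q90"}
new = {"way/100": "Q42", "relation/3": "Q1"}
print(changed_or_removed(old, new))
# → {'node/7': ('Q64', None), 'relation/3': ('Q90', 'Q1')}
```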
Can someone explain to me what problem we are trying to solve here? Wikidata has stable IDs, so we can create the connection by linking from OSM to Wikidata. What else do we need? --Tordanik 14:18, 17 September 2014 (UTC)
I'm trying to understand the meaning of this idea, and I cannot understand anything except "Testing Wikibase to a scale of 60x its current size". I cannot find any benefit from it, or see how it can work with unstable identifiers. I thought there was a problem the idea is trying to solve, but it seems there is not. Thank you for your time. --Pastakhov (talk) 04:53, 18 September 2014 (UTC)
That grant proposal has nothing to do with Wikidata tagging in OSM, or with the links between OSM and Wikidata. They are trying to convert the OSM database into the Wikidata format and import it into their own instance of the Wikidata software (Wikibase). They then want to allow people to edit that OSM data using the editing tools from the Wikidata community. In the end, the changes should be sent back to the main OSM database, as with other editor software. During all this, they only use OSM identifiers and content; no Wikidata content is involved.
If you ask me, though, the project still does not solve a real problem, as specialized editing software for OSM is already available. At best it might be interesting for a niche audience (e.g. Wikidata users who can start editing OSM without a learning curve). --Tordanik 13:10, 18 September 2014 (UTC)
Thanks, the grant's goal is more understandable now. Maybe I'm wrong, but in this case it looks like using a jackhammer to crack a nut: storing and handling such an enormous data volume only for the user interface... Really? Likely I should ask this in the grant proposal... --Pastakhov (talk) 15:07, 18 September 2014 (UTC)

Wikidata queries for administrative objects to compare with OSM hierarchies?

For quite some time I have been trying to dig into the Wikidata model to understand how Wikidata works and how we can use it. In doing so, I found some similarities to the OSM data structures when dealing with administrative boundary relations in OSM data.

For example, in OSM we can easily query for all sub-districts inside an upper district via the Wizard mode of overpass-turbo.eu ... try entering in the wizard: boundary=administrative and type:relation in "Landkreis Cuxhaven"

How can I do such a query in Wikidata?

Well, https://wdq.wmflabs.org is an external tool to build such queries, and I managed to find out the following (by entering the English expressions in its auto-complete search feature): two conditions are needed: instance of [P31]: municipality of Germany [Q262166] AND located in the administrative territorial entity [P131]: Landkreis Cuxhaven [Q5897]

Finally, we need the string CLAIM[31:262166] AND CLAIM[131:5897] there. If the link to the Autolist web service is updated correctly, you can click it to start a query there.
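For reference, the WDQ CLAIM syntax above maps directly onto a SPARQL query for the Wikidata Query Service, which has since replaced WDQ. This sketch only builds the query string (the helper function is illustrative):

```python
def claims_to_sparql(claims):
    """Translate (property, item) claim pairs — the equivalent of WDQ's
    CLAIM[31:262166] AND CLAIM[131:5897] — into a SPARQL query string."""
    triples = "; ".join(f"wdt:P{p} wd:Q{q}" for p, q in claims)
    return f"SELECT ?item WHERE {{ ?item {triples} . }}"

# Municipalities of Germany (Q262166) located in Landkreis Cuxhaven (Q5897):
query = claims_to_sparql([(31, 262166), (131, 5897)])
print(query)
# → SELECT ?item WHERE { ?item wdt:P31 wd:Q262166; wdt:P131 wd:Q5897 . }
```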

So the general question: is this the right way to find out which administrative objects and hierarchies are "tagged" in Wikidata, and to verify whether they are correct? --Stephan75 (talk) 09:36, 3 January 2015 (UTC)

How to link from a wikidata item to a way

Wikidata can already link to a point, using coordinates. Wikidata would also really like to link items such as countries, towns, motorways and rivers to a 'way' which defines the borders (including ways for historic/obsolete borders) of these linear and spatial objects. There has been talk of having a datatype for ways in Wikidata, but the general feeling is that OSM is a much more sensible place to create, edit and maintain such geographical objects than Wikidata trying to duplicate your efforts somehow. The problem is that we keep being told that OSM doesn't have any stable ID for such ways.

Can anyone think of a workaround to make this work, based on existing OSM practice or based on something that OSM could be persuaded to adopt? Filceolaire (talk) 00:25, 3 August 2015 (UTC)

Ok, let me first describe the challenges we have to tackle when connecting OSM to Wikidata:
  • As you said, OSM IDs were not designed with stability in mind. They are not directly visible to the user when editing data, and regular editing operations sometimes affect IDs. In some cases (e.g. representing a feature with a closed way that was previously represented as a single point), it is even impossible to keep the ID intact.
  • Not all Wikidata items map 1:1 to an OSM feature. For example, roads are sometimes split into multiple segments, and only the most important ones (e.g. motorways) have relations collecting the segments.
  • There is sometimes no clean semantic separation of entities in OSM - e.g. the attributes for a restaurant and for the building it occupies might be on the same way. This shorthand is generally only accepted if it doesn't cause problems, though, so you could simply fix any problematic instances of this practice when trying to link with Wikidata.
Now for the possible solutions. The easiest approach would probably be to link from OSM to Wikidata instead, as the Wikidata IDs seem to be more stable. The wikidata key can be used for this. Of course that would be harder to integrate into query tools and other Wikidata infrastructure, but there is something to be said for not duplicating the work. Adding these tags is already pretty much accepted in OSM, so you wouldn't have to do any persuasion on our side.
Alternatively, you could just accept the flaws of OSM IDs and use them anyway, as long as you are prepared for link breakage and the other issues above. It would be wise to at least set up some automated testing of the ids in that case, though.
Some other ideas have surfaced during discussions within the OSM community in the past. Some have suggested storing queries (e.g. with the Overpass API) instead of, or in addition to, IDs. Others have suggested adding permanent, unique IDs as attributes to OSM elements, and making it mappers' obligation to preserve these across editing operations. There hasn't been any real conclusion so far, though. --Tordanik 11:46, 4 August 2015 (UTC)
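The "automated testing of the ids" mentioned above could start from something as simple as checking, via the OSM API, that each stored element still exists. This sketch only constructs the request URLs; the element notation is illustrative:

```python
BASE = "https://api.openstreetmap.org/api/0.6"

def existence_check_url(element):
    """URL whose HTTP status reveals whether a stored OSM element survives:
    200 = exists, 410 = deleted, 404 = never existed."""
    kind, _, elem_id = element.partition("/")
    assert kind in ("node", "way", "relation"), f"unknown element type: {kind}"
    return f"{BASE}/{kind}/{elem_id}"

print(existence_check_url("relation/1681511"))
# → https://api.openstreetmap.org/api/0.6/relation/1681511
```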
Thanks Tordanik. I think Wikidata would mainly be interested in linking to relations. For something like a Relation:boundary, Wikidata could want to link one Wikidata item to multiple different boundaries (i.e. the current boundary and various historic boundaries). OSM could presumably have "start time =" and "end time =" tags in addition to the wikidata key, so we can distinguish between the various boundaries for the same Wikidata item; however, those sorts of time properties would seem to be the sort of thing Wikidata would do better.
If we want the info box for a wikipedia article to show the boundaries then wikidata needs to be able to tell wikipedia where to find the boundary relation. Having a key on the OSM relation won't help much here. We need a link in the other direction.
Sounds like the best bet is to just link Wikidata items to OSM relations and accept that these have to be updated regularly.
Is there any other forum on OSM where I can ask about this? Filceolaire (talk) 21:29, 6 August 2015 (UTC)
Considering that there has been a solution for a few years to display boundaries (and many other items) on Wikipedia articles based on keys of OSM ways and relations ("WIWOSM"), I'm surprised that you don't consider that option.
As for using only relations, keep in mind that relations are generally used on a "only use if necessary" basis in OSM. Most linkable objects won't have a relation, but only, say, a way. You may be able to link to country boundaries with relations, but not to e.g. footprints of historic buildings or roads in cities. Historic information is likewise a bit contentious, with many considering it outside of the scope of OSM and suggesting the use of a specialized DB (using the OSM data model) such as Open Historical Map.
If you want to discuss this with the broader community, it's probably best to join the talk mailing list. --Tordanik 08:57, 7 August 2015 (UTC)

September 2015 discussion

Yesterday, a discussion on using Wikidata tags in OSM took place, as part of Wikidata's regular "office hours", on IRC.

The discussion log starts at https://tools.wmflabs.org/meetbot/wikimedia-office/2015/wikimedia-office.2015-09-23-17.01.log.html#l-109

-- Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 12:31, 24 September 2015 (UTC)

WikiData API and wrappers for various programming languages

I guess I am not the only one who has problems finding this out. I am looking for a not-too-complicated API, or wrappers, to query information from Wikidata, such as the list of name:*=* values from the Wikidata entry based on the Q-ID. The best thing I have found so far is reading the Wikidata page of the entry and parsing the entire page. --Skippern (talk) 15:55, 25 November 2016 (UTC)

Wikidata has a REST API that can simply return JSON data instead of plain HTML. (Its pages use an internal wiki syntax that is not directly editable: the wiki editor is completely disabled in the main namespace storing "Qnnn" data items and "Pnnn" property items, and is replaced by the data editor.)
E.g. w:d:Special:EntityData/Q42.json (this retrieves the full dataset for Q42, in JSON format, without any properties filter)
E.g. w:d:Special:EntityData/Q42.php (same request, but the returned JSON is reformatted into a "serialized" PHP array)
E.g. w:d:Special:EntityData/Q42 (same request, but the returned JSON is reformatted as HTML with the usual editable Wikidata UI)
The REST API on Wikidata is still the MediaWiki API (i.e. the standard "action" API of MediaWiki), where you can set the JSON output format with the "format=json" parameter. In fact, wiki pages for "Qnnn" items and "Pnnn" properties are just standard internal wiki redirects to Wikidata's "Special:EntityData" page, which itself creates a request to this API.
See examples on https://www.wikidata.org/w/api.php?action=help&modules=wbgetentities
Only the "json" and "php" formats are supported for now by this MediaWiki API ("html" returns the usual editable Wikidata UI; "xml" and other documented formats are still not supported by "action=wb*" API requests), so:
E.g. w:d:Special:EntityData/Q42.xml (same request, but it still fails, as "xml" is not a supported format). Some wiki developers have suggested adding a "py" format for Python, or other formats for Ruby, or some other standardized wrapping formats. For now everyone seems to live with the "json" format (even if they need a JSON parser in their preferred programming language, something most languages already have).
The "php" format was added because it is far better performing than using a JSON parser in PHP. If an additional format is added, it will probably be "xml" first.
Some simple raw text formats would also be useful ("csv"?), but the queries would need to be more selective (using filters) to be flattened with a less complex tabular 2D structure.
Wikidata also comes with a separate "Wikibase library" (directly installable on other wikis). However, it is based on MediaWiki "modules", which require Lua support and some internal authorizations to allow requests to another site/database through Lua. Basically, this Lua library is meant to allow integration within MediaWiki templates and pages without needing any client-side JavaScript (unlike the "Taglist" template recently introduced on this OSM wiki): clients still view standard HTML pages, they can edit them in wiki syntax or with the Visual Editor, and clients make no direct connections to the external Wikidata server. The local wiki makes this connection itself via the library; the Lua module parses the results and presents them in standard wiki syntax, which can then be embedded in wiki templates and pages, and is finally formatted to HTML by the MediaWiki server. But this library is still not usable on the OSM wiki, as it currently has no support for Lua modules. If you are thinking about developing a JavaScript extension (similar to Taglist), you should rather use the REST API to perform JSON data requests directly to the Wikidata server; you don't need the Wikibase library for that.
There's also an external "WDQ" tool server (https://wdq.wmflabs.org/api_documentation.html) which can process the data and perform queries with a specific request syntax, with some more advanced capabilities such as "joins" for recursive traversal through linked Qnnn items or Pnnn properties (this tool still internally uses the MediaWiki API). The WDQ API is also a REST API.
More information in w:d:Wikidata:Data access if you want more specific filters — Verdy_p (talk) 16:58, 25 November 2016 (UTC)
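Extracting the translated labels from the Special:EntityData JSON mentioned above might look like this. The response excerpt is hardcoded (and truncated) rather than fetched, to keep the sketch self-contained and offline:

```python
import json

# Truncated excerpt of https://www.wikidata.org/wiki/Special:EntityData/Q42.json
sample = json.loads("""
{"entities": {"Q42": {"labels": {
    "en": {"language": "en", "value": "Douglas Adams"},
    "fr": {"language": "fr", "value": "Douglas Adams"}
}}}}
""")

def labels(entity_data, qid):
    """Return {language: label} for one item from an EntityData response."""
    return {lang: d["value"]
            for lang, d in entity_data["entities"][qid]["labels"].items()}

print(labels(sample, "Q42"))
# → {'en': 'Douglas Adams', 'fr': 'Douglas Adams'}
```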
Thanks for the answer, the Special:EntityData returning JSON should allow me to do the queries I was looking for. --Skippern (talk) 17:14, 25 November 2016 (UTC)
If you just want the translated "labels" and not all the other properties, use the MediaWiki API directly (instead of the special page, which has no filters at all):
https://www.wikidata.org/w/api.php?action=wbgetentities&format=json&ids=Q42&props=labels
You can experiment with this query in the API sandbox of Wikidata (the filter used is "props=labels"; you can set a list of property names separated by vertical bars):
https://www.wikidata.org/wiki/Special:ApiSandbox#action=wbgetentities&format=json&ids=Q42&props=labels
For example, you can query the label in a single language (parameter "languages=fr"), with an additional parameter to use fallbacks if there's no label in that language (parameter "languagefallback=1"):
https://www.wikidata.org/w/api.php?action=wbgetentities&format=json&ids=Q42&props=labels&languages=fr&languagefallback=1
See how much the volume of data returned is reduced with these "props" and "languages" filters!
Another very useful API parameter is "utf8=1", which avoids having many characters escaped as sequences of hexadecimal UTF-16 code units ("\uNNNN"), i.e. all non-ASCII characters and some ASCII characters (the delimiting quotes of JSON string literals are still escaped). It also generally reduces the volume of data returned, notably when you query properties in non-Latin languages such as Russian, Arabic or Chinese.
https://www.wikidata.org/w/api.php?action=wbgetentities&format=json&ids=Q42&props=labels&languages=ru&languagefallback=1
https://www.wikidata.org/w/api.php?action=wbgetentities&format=json&utf8=1&ids=Q42&props=labels&languages=ru&languagefallback=1
An additional API parameter you may need in your application (for concurrent requests, possibly from multiple users) lets you define your own query ID, which will be present in the returned response. That way you can handle responses asynchronously and map them to the state of your initial concurrent queries, instead of using blocking threads (this will increase the performance of your application if it is hosted on a server with many users).
But be aware of the Wikidata API usage policy: your server application should use a reasonable cache to avoid "spamming" the Wikidata server with repeated requests for the same data. If your queries send data massively to Wikidata, you'll need to authenticate with a Wikimedia user account authorized to run on Wikidata as a "bot" user (and you'll also need to specify your bot authorization token in the documented parameter for some very privileged actions, according to the general Wikidata policy about bots and the security requirements for some very restricted privileges).
The API sandbox on Wikidata provides a very friendly UI to help you build your queries, with help for many options. Just click the proposed options or fill in their values; additional panels may appear on the right that you can select to set further parameters. Then execute your query: in the results tab you'll see the generated URL along with the returned data shown just below. Copy-paste this URL as an example you can reuse in your app.
Final note: this API may be used with either the GET method (with URL-encoded query strings) or the POST method (with parameters attached as web form data in the request body). For security or privacy reasons, some queries require you to use POST, notably those requiring user authentication, edits that need an edit token, or CORS requests requiring another token; the documentation tells you when POST is required, because POST requests are normally not cacheable. But for most read-only data requests not containing private user data in the query itself or its response (which should then be cacheable and reusable independently of users), you should use the GET method (all examples above use GET and are fully cacheable).
Verdy_p (talk) 18:20, 25 November 2016 (UTC)
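Putting the filters above together, a filtered wbgetentities request can be assembled with the standard library. A minimal sketch (the helper name is illustrative; the parameters are the ones shown in the example URLs above):

```python
from urllib.parse import urlencode

def wbgetentities_url(qids, props="labels", languages=None):
    """Build a filtered wbgetentities request like the examples above."""
    params = {
        "action": "wbgetentities",
        "format": "json",
        "utf8": 1,                      # keep non-ASCII labels unescaped
        "ids": "|".join(qids),          # up to 50 IDs per request
        "props": props,
    }
    if languages:
        params["languages"] = "|".join(languages)
        params["languagefallback"] = 1  # fall back when a label is missing
    return "https://www.wikidata.org/w/api.php?" + urlencode(params)

url = wbgetentities_url(["Q42"], languages=["fr"])
print(url)
```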

Wikidata id -> display object on map

Is there an easy way, if I have e.g. Q13378328 (Zagorje ob Savi Municipality), to construct a URL using the Wikidata ID as an argument that shows it on an OSM map?

- Salgo60 (talk) 07:44, 18 November 2017 (UTC)

Just replace the three instances of the Wikidata ID in this wikicode, which correctly encapsulates the Overpass QL request:
http://overpass-turbo.eu/map.html?Q={{URLENCODE:
[out:json][timeout:25];
(
  node["wikidata"="Q13378328"];
  way["wikidata"="Q13378328"];
  relation["wikidata"="Q13378328"];
);
out body;
>;
out skel qt;
|QUERY}}
which generates the following URL: http://overpass-turbo.eu/map.html?Q=%5Bout%3Ajson%5D%5Btimeout%3A25%5D%3B%0A%28%0A++node%5B%22wikidata%22%3D%22Q13378328%22%5D%3B%0A++way%5B%22wikidata%22%3D%22Q13378328%22%5D%3B%0A++relation%5B%22wikidata%22%3D%22Q13378328%22%5D%3B%0A%29%3B%0Aout+body%3B%0A%3E%3B%0Aout+skel+qt%3B
Note that this URL can be easily obtained from the Overpass Turbo link you have provided. Just click on "Export" then "Map" then "interactive Map". —seav (talk) 08:29, 18 November 2017 (UTC)
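The same URL construction can be done outside of wiki templates. A sketch using Python's urllib; note that the nwr shorthand (available in newer Overpass versions) is an assumption here, standing in for the three separate node/way/relation statements in the wikicode above:

```python
from urllib.parse import quote_plus

def turbo_map_url(qid):
    """Build an overpass-turbo map URL showing every OSM object
    tagged with the given Wikidata Q-ID."""
    query = (
        '[out:json][timeout:25];'
        f'nwr["wikidata"="{qid}"];'  # nwr = node + way + relation shorthand
        'out body;>;out skel qt;'
    )
    return "http://overpass-turbo.eu/map.html?Q=" + quote_plus(query)

print(turbo_map_url("Q13378328"))
```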
Before exporting as map, you could reduce it to the bare minimum: http://overpass-turbo.eu/s/t42 Mmd (talk) 08:39, 18 November 2017 (UTC)
Note that I used URLENCODE to more easily convert the request in a readable way: this also allows building a template easily. — Verdy_p (talk) 14:35, 18 November 2017 (UTC)
Newbie question isn't just relation enough for a border link - Salgo60 (talk) 20:42, 18 November 2017 (UTC)
If you look for places known only by their Wikidata Q-ID, you don't know whether the place has a defined boundary relation; most places don't, so it's not stupid to look for any kind of OSM primitive that may reference this Q-ID, and the previous request is correct. For mass usage, shortening the URL will not work: the long URL with the full request has to be submitted first, and then an action taken to record it on the Overpass server to get an unpredictable short ID. The Overpass server will not appreciate being asked to create massive numbers of queries; the short link is only useful when you need to share the query over social media or in emails, and is created individually after already using the first method above (and there's no guarantee that the short URL will remain persistent for long; the short ID is a temporary facility) — Verdy_p (talk) 21:21, 18 November 2017 (UTC)
The short ID is provided by overpass turbo, not by the Overpass API server! It is backed up and stable for a long time, not a "temporary facility" as you write. However, for the sake of running a query based on a Wikidata ID, it really doesn't make sense to use the shortening service, as it is not flexible enough for an arbitrary Wikidata ID. Just create the proper URL to the map service via urlencode, including the Wikidata ID as a parameter. Mmd (talk) 22:13, 18 November 2017 (UTC)
In this scenario I am trying to convince a genealogy community, www.wikitree.com, that they should create a wiki category structure on WikiTree with material from Wikidata, and maybe have links to maps in OSM, so we will know that all the objects are administrative borders - Salgo60 (talk) 08:55, 19 November 2017 (UTC)
Another option: retrieving the coordinates with Wikidata's SPARQL endpoint. Cdlt, VIGNERON (talk) 08:30, 18 November 2017 (UTC)
Thanks, the Wikidata and SPARQL part is no problem (link). I am starting to learn OSM, and like seeing the OSM maps with relations like 1681511 - Salgo60 (talk) 15:03, 18 November 2017 (UTC)

Streets with multiple ways

We have streets (and other linear objects, like streams and rivers) made up of multiple ways, but which relate to a single Wikidata item.

For example, Trump Street in London is (rightly) split into three parts:

but one Wikidata item:

There are several possible ways we could represent this:

  1. Add the Wikidata item to each way (currently not recommended)
  2. Add all three ways to a relation, and add the Wikidata ID to that (ditto)
  3. Create an item for each way in Wikidata, as "part of" the existing item, then use their IDs in OSM (contrary to Wikidata's notability policy, very kludgey, and not sustainable when ways get split in OSM)
  4. something else I haven't thought of.

Does anyone have any suggestions for a workable solution? Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 16:27, 2 February 2018 (UTC)

I use "Add the Wikidata item to each way". Not ideal but alternatives are worse. Mateusz Konieczny (talk) 10:23, 3 February 2018 (UTC)

Confusion with capitals, homonymous city-states, etc.

A city can be an instance of the "capital" concept; the Wikidata ID must be the city, not the capital instance... Examples of errors:

City of Rome (IT): the city is Q220, the capital is Q3940419. The correct approach is to use the tag wikidata=Q220, but in OSM today we see Q3940419.

About homonymous city-states:

City of São Paulo (BR): the city is Q174, the state is Q175. The correct tag on the city polygon is wikidata=Q174, but the Wikidata city item was pointing to the state.

So, I suggest better controls and explanations. --Krauss (talk) 00:00, 7 July 2018 (UTC)

Monitoring the reciprocal use

See https://github.com/OSMBrasil/semantic-bridge

WdOsm-semanticBridge.jpeg

--Krauss (talk) 13:43, 7 July 2018 (UTC)

List of conflicts and known issues with Wikidata

The current article text is rather "boosterish" in favor of Wikidata. However, I would like to mention several issues that I am aware of. Please add to this list if you can think of others:

  • Wikidata IDs are not human-readable, unlike other values in OpenStreetMap
  • Wikidata tags cannot easily be verified as correct or incorrect by mappers
  • There is only one wikidata tag for a set of Wikipedia articles, but the articles in each language may be rather different
  • Often one Wikidata item could apply to several different OpenStreetMap database objects. For example, a (boundary=administrative) municipality, place=neighborhood, place=city and place=island might have the same name and Wikidata item, but will be mapped differently in OpenStreetMap.
  • Some Wikidata items represent features which are always mapped as a series of several objects in OpenStreetMap (e.g. long streets, streams)
  • Wikidata item labels conflict with OpenStreetMap wiki data item labels - both start with "Q"
  • Wikidata can be confused with the "wikibase" / "data items" / "wiki data items" now used here at wiki.openstreetmap.org

Please add any other issues to this list. --Jeisenbe (talk) 03:14, 16 January 2020 (UTC)

If the article seems to lean too heavily toward advocacy, I think you should feel free to balance out the article and improve it in any other way you see fit. These are interesting and constructive points that I'd like to expand upon and in some cases rebut, but I think it would be more efficient to hash it out in the article itself. We could also use this opportunity to point to alternatives that are often used in OSM, whether Wikipedia or some other external reference like GNIS feature IDs. Although I see the value in Wikidata-OSM integration, it doesn't hold a monopoly on external references and we can make that clear in the article. – Minh Nguyễn 💬 09:23, 16 January 2020 (UTC)
I've edited the article for English usage and added some of the issues/conflicts to the page...so shoot me, or edit it too ;) Personally, I had some misgivings about getting tightly coupled to Wikidata or ceding control over geographical information, but I feel that it provides a good solution for 'edge' data, i.e. stuff that could be added to OSM but is debatable, e.g. the current Tagging List discussion over adding Stanford Institute Volcano IDs Jnicho02 (talk) 10:24, 31 January 2020 (UTC)
Readability and verifiability are already partly solved as iD and JOSM (with plugins) support fetching and displaying Wikidata labels. Some of the other items are indeed unresolved issues, such as the lack of clear distinctions between islands, administrative entities, etc., although almost all of the problems also exist with Wikipedia links as well. I do feel that some of the concerns are a bit overstated, though, in particular the potential collision with data items on the OSM wiki. After all, Wikipedia article titles "conflict" with OSM wiki article titles as well, but I don't think many mappers are confused by this fact. --Tordanik 18:01, 17 January 2020 (UTC)

Drawbacks of Wikidata part 2 - database decomposition

The original usage scenario of the wikidata tag was to connect a museum, city or school (e.g., a site relation) to its unambiguous Wikidata/Wikipedia page, which can contain lots of further data about such a POI, or to discover relevant Wikivoyage and other content that accompanies it.

However, we must recognize when we are overdoing it. A member proposed (as a form of premature optimization) to reduce the size of the OpenStreetMap PostgreSQL database by shoveling parts of the micromapping data from OSM to Wikidata as a form of template. They would start to suffix various other keys whose related content was migrated away from OSM, such as model:wikidata=*. Imagine that, at the very extreme, we could actually remove all tags of a fire hydrant and instead create a single bare node with a single tag: model:wikidata=*. Theoretically, every other tag becomes redundant from that point on, as following the wikidata link reveals who the manufacturer is, that this is a fire hydrant, that it is of the pillar kind, how many couplings it has, what colour it is, etc.

(I would caution against going to the extreme without consulting with the community and proper planning before execution. -Bkil (talk) 20:40, 8 March 2024 (UTC))

Feel free to incorporate this section into the article or extend it yourself inline with further concerns. Just add your name here - edited by:

  1. -Bkil (talk) 20:40, 8 March 2024 (UTC)

WIP unstructured editor comments go here

AMA -Bkil (talk) 20:40, 8 March 2024 (UTC)

Offline

One major advertised feature of OSM is that it is fully functional offline. We already store a downloaded OSM extract of the country. Should we then start downloading the whole Wikidata database to the phone as well?

On the ground rule

Certain model properties generalized to a Wikidata entry may not be verifiable. We measured the physical properties of certain items during a ground survey, and their dimensions were sometimes multiple centimetres off in each dimension, or they had a different colour compared to the ones in the product brochure downloaded from the manufacturer. The OSM on-the-ground rule commands that we must believe our eyes, not theories filled in from the couch.

Piecewise access API

Many simple OSM-based web apps use temporary, cheap bulk-download APIs on demand, per visitor. An example is the CORS-aware Overpass API, with which many slippy maps aimed at specialized OSM contributors fetch certain types of POI that the given target audience would like to map or resurvey. Such an endpoint returns a whole batch of POI within a given bbox. However, as more and more properties are migrated to Wikidata, such data users will somehow also need to incrementally fetch all Wikidata items within the bbox in parallel, or fetch the list of IDs that the previous result links to (increasing the round-trip count). An alternative would be to place an increased burden on Overpass to internally minute-replicate Wikidata, join all tables in real time, and then return pseudo-OSM nodes in its API response. This would need further research to implement and fine-tune.
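The incremental-fetch step described above could at least batch its Wikidata requests. A sketch that collects the Q-IDs referenced by an Overpass-style result and builds one wbgetentities request per batch; the input excerpt is made up, and the 50-ID batch size matches the API's usual per-request limit:

```python
from urllib.parse import urlencode

def wikidata_batch_urls(overpass_elements, batch=50):
    """Collect the wikidata Q-IDs referenced by an Overpass result and
    yield one batched wbgetentities request URL per `batch` IDs."""
    qids = sorted({e["tags"]["wikidata"]
                   for e in overpass_elements
                   if "wikidata" in e.get("tags", {})})
    for i in range(0, len(qids), batch):
        chunk = qids[i:i + batch]
        yield ("https://www.wikidata.org/w/api.php?" +
               urlencode({"action": "wbgetentities", "format": "json",
                          "ids": "|".join(chunk)}))

# Made-up excerpt of an Overpass JSON "elements" array:
elements = [
    {"type": "node", "id": 1, "tags": {"wikidata": "Q42"}},
    {"type": "way", "id": 2, "tags": {"amenity": "pub"}},
    {"type": "relation", "id": 3, "tags": {"wikidata": "Q64"}},
]
for url in wikidata_batch_urls(elements):
    print(url)
```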

Barrier to existing data use developers

3D renderers already utilize physical properties to depict each object. Even if certain types of objects are manufactured according to the same template, should we expect each and every data user to be updated to download Wikidata and join its respective tables with OSM, reconstructing the previous schema, before proceeding to render? This increases the burden on existing data users in general to update their potentially complicated pipelines.
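The join step that every such pipeline would have to add could look roughly like this. A minimal sketch under stated assumptions: the tag names and the element are hypothetical, and the Wikidata side is reduced to a pre-extracted dict (a real pipeline would pull the value out of the item's claims, e.g. P2048 for height).

```python
def reconstruct_tags(osm_tags, wikidata_claims):
    """Merge Wikidata-derived values back into a flat OSM-style tag dict.

    Survey-derived OSM tags win; Wikidata only fills the gaps, so the
    on-the-ground rule is respected.
    """
    merged = dict(wikidata_claims)  # template/model-level defaults
    merged.update(osm_tags)         # on-the-ground values take precedence
    return merged

# Hypothetical element whose height lives only on the linked Wikidata item:
osm = {"man_made": "tower", "wikidata": "Q1234567"}
wd = {"height": "25"}  # assumed to be extracted from the item's P2048 claim
tags = reconstruct_tags(osm, wd)
```

The point of the sketch is that even this trivial merge forces every renderer to grow a Wikidata fetch-and-extract stage that it does not need today.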

Barrier to new data use developers

This raises the barrier to entry for implementors of new data users. OSM has always been known to be volunteer-friendly and easy to tinker with.

Barrier to new map contributors

The country-level granular and complicated ad-hoc rules of OSM already pose a steep learning curve. Complexity would increase even further if we mandated new rules on how to use Wikidata, and then yet more rules on how to factor data between OSM and Wikidata.

Presets

We already have a well-known solution for micromapping: editor presets. During OSM contribution, the user interface for editing some subsets of properties on Wikidata and others on OSM, and for linking, unlinking, merging and unmerging entries, is missing; accomplishing this manually can be laborious and error-prone.

Missing or mismatching tag semantics in Wikidata

Certain established and in-use OSM keys currently have no corresponding properties on Wikidata, or their schema and value ranges differ between OSM and Wikidata. We might introduce yet another partially overlapping set of Wikidata properties to copy OSM tags verbatim there, but that might result in backlash from the Wikidata community. What are the rules for establishing new Wikidata properties in general, and for partially overlapping meanings in particular? As an alternative, should we submit a new OSM proposal to import their exact schema into OSM, adjust all OSM data users, and only then factor this data out to Wikidata?
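The coverage gap above can be made concrete with a tiny mapping table. This is purely illustrative: the two listed properties are real (P2048 = height, P856 = official website), but the mapping is deliberately incomplete, and the unmapped key in the example is a hypothetical stand-in for any OSM key lacking a Wikidata counterpart.

```python
# Partial, illustrative OSM-key -> Wikidata-property mapping.
OSM_TO_WIKIDATA = {
    "height": "P2048",   # height
    "website": "P856",   # official website
}

def unmapped_keys(tags):
    """Return the OSM keys that have no Wikidata property to migrate to."""
    return sorted(k for k in tags if k not in OSM_TO_WIKIDATA)

# "roof:shape" stands in for any key without an established Wikidata property.
gaps = unmapped_keys({"height": "25",
                      "website": "http://example.org",
                      "roof:shape": "dome"})
```

Any key that ends up in `gaps` would require either a new Wikidata property proposal or a schema change on the OSM side, which is the dilemma the paragraph above describes.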

Multiple accounts

A contributor would now have to register on, regularly log in to, and accept the terms of service of both OSM and Wikidata. Registration is optional on Wikidata right now, but the terms of service still apply, and there is a privacy issue if you do not register.

Quality assurance

We are already struggling with quality assurance on OSM, with room for improvement remaining in both our processes and our tools. Wikidata is known to be struggling with data quality, moderation and verification issues. Registration on Wikidata is optional, so just as on Wikipedia, slight abuse can go undetected for long periods of time, especially since checking the validity of a data change is not as trivial as checking high-quality free-form text on Wikipedia.

...

Templates

What templates do we have to link from this wiki to Wikidata's wiki, to Wikidata itself and to Reasonator?

I can see on this page (the OpenStreetMap wiki page for Wikidata) a nice link and graphic being done manually, but surely a template would give a more consistent (and more easily improvable) result Back ache (talk) 11:16, 25 February 2022 (UTC)

@Back ache Template:KeyDescription and Template:ValueDescription. Lectrician1 (talk) 11:10, 16 March 2022 (UTC)

This page imho needs a clean-up

Hi, I wanted to add a reference to this plug-in on this page, but I find it quite confusing in its current state. Much of the information is duplicated: Sophox, for example, is listed both in "2.2.1 Validation tools" and again in "5 Querying Wikidata and OSM" and "10 See also". "OSM ↔ Wikidata" is duplicated in "2.2 Tools" and "10 See also". "Property P402" is listed in 3 and then again in 3.1, "OSM-Wikidata Map" in 2.2.1 and 10, etc.