Questions tagged [wikidata]

Wikidata is a free, collaborative, multilingual database, collecting structured data to provide support for Wikipedia, Wikimedia Commons, and the other wikis of the Wikimedia movement. All of its data can be retrieved through the Wikidata Query Service (the Wikidata SPARQL endpoint) or the Wikidata API (the Wikidata MediaWiki API extension). If you are completely new to both SPARQL and the MediaWiki API, please consider asking your question on opendata.stackexchange.com.
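
As a concrete illustration of the Query Service route, here is a minimal Python sketch (not part of the tag description itself; it assumes the third-party requests package, and both the query, which merely lists a few humans via wd:Q5, and the User-Agent string are only examples):

    import requests

    # Ask the public Wikidata SPARQL endpoint for a handful of items and their labels.
    ENDPOINT = "https://query.wikidata.org/sparql"
    QUERY = """
    SELECT ?item ?itemLabel WHERE {
      ?item wdt:P31 wd:Q5 .   # instance of: human (example class)
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
    }
    LIMIT 5
    """

    response = requests.get(
        ENDPOINT,
        params={"query": QUERY, "format": "json"},
        headers={"User-Agent": "wikidata-tag-example/0.1"},   # hypothetical UA string
    )
    response.raise_for_status()
    for row in response.json()["results"]["bindings"]:
        print(row["item"]["value"], row["itemLabel"]["value"])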

Wikidata is a free, collaboratively edited knowledge base operated by the Wikimedia Foundation. It is intended to provide a common source of certain data types (for example, birth dates) which can be used by Wikimedia projects such as Wikipedia. This is similar to the way Wikimedia Commons provides storage for media files and access to those files for all Wikimedia projects.

Concepts:

Wikidata is a document-oriented database focused on items. Each item represents a topic (or an administrative page used to maintain Wikipedia) and is identified by a unique number prefixed with the letter Q; for example, the item for the topic politics is Q7163. This allows the basic information needed to identify an item's topic to be translated without favouring any one language.

Information is added to items by creating statements. Statements take the form of key-value pairs, with each statement consisting of a property (the key) and a value linked to the property.

(Image: a Wikidata statement)
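
To make the item/statement model concrete, here is a minimal sketch (Python, assuming the requests package) that fetches the politics item Q7163 mentioned above through the MediaWiki API and prints a few of its language-keyed labels and property/value claims:

    import requests

    API = "https://www.wikidata.org/w/api.php"
    params = {
        "action": "wbgetentities",
        "ids": "Q7163",                 # the "politics" example item
        "props": "labels|claims",
        "format": "json",
    }
    entity = requests.get(API, params=params).json()["entities"]["Q7163"]

    # Labels are keyed by language code, so no single language is favoured.
    print(entity["labels"].get("en", {}).get("value"))
    print(entity["labels"].get("de", {}).get("value"))

    # Claims are keyed by property ID; each claim is one property/value statement.
    for prop, claims in list(entity["claims"].items())[:3]:
        print(prop, claims[0]["mainsnak"].get("datavalue", {}).get("value"))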


887 questions
0
votes
1 answer

How to get all events in Wikidata

I am using the Wikidata Toolkit and I want to get a list of all events. I wrote an EntityDocumentProcessor in which I want to filter the events out of my dump. I know that the event item has the ID Q1190554 and that I somehow have to check…
Safari
  • 3,302
  • 9
  • 45
  • 64
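
For the question above, a hedged alternative to scanning the dump with the Java Wikidata Toolkit is to ask the Query Service for items whose instance-of class is "event" (Q1190554) or one of its subclasses. This is not the Toolkit-based answer the question asks for, and the complete result set is too large for the public endpoint, so the sketch (Python, assuming requests) only fetches a sample:

    import requests

    QUERY = """
    SELECT ?event ?eventLabel WHERE {
      ?event wdt:P31/wdt:P279* wd:Q1190554 .   # instance of (a subclass of) event
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
    }
    LIMIT 100
    """
    rows = requests.get("https://query.wikidata.org/sparql",
                        params={"query": QUERY, "format": "json"}).json()
    for row in rows["results"]["bindings"]:
        print(row["event"]["value"], row["eventLabel"]["value"])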
0
votes
2 answers

Formatting a date retrieved from Wikidata

So let's say I have an Infobox template in Wikipedia that retrieves a company foundation date from Wikidata. It contains the following code: |label7 = Year founded |text7 = {{wikidata|p571|{{{founded|}}}}} My problem is that what's retrieved is…
Sergey Snegirev
  • 1,016
  • 1
  • 7
  • 23
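
The question above is about wikitext templates, so the following is only background: a Python sketch (assuming requests) of what the raw P571 ("inception") value returned by Wikidata looks like and how a year can be cut out of it. Q95 is used as an example item that is assumed to carry a P571 statement:

    import requests

    API = "https://www.wikidata.org/w/api.php"
    data = requests.get(API, params={
        "action": "wbgetentities",
        "ids": "Q95",
        "props": "claims",
        "format": "json",
    }).json()

    claim = data["entities"]["Q95"]["claims"]["P571"][0]
    time_value = claim["mainsnak"]["datavalue"]["value"]["time"]   # e.g. "+1998-09-04T00:00:00Z"
    year = time_value.lstrip("+-")[:4]
    print(year)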
0
votes
1 answer

DBpedia server status

As suggested, I'm wondering why the Wikidata SPARQL endpoint at http://wikidata.dbpedia.org/sparql does not work. General info: http://wiki.dbpedia.org/news/dbpedia-based-rdf-dumps-wikidata Is it temporary, as the HTTP error code suggests, or is it simply not…
vedar
  • 483
  • 8
  • 15
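
One way to check the situation described above is to probe the endpoint with a trivial ASK query and look at the HTTP status; the sketch (Python, assuming requests) also tries query.wikidata.org, which is a separate, officially maintained Wikidata endpoint:

    import requests

    for endpoint in ("http://wikidata.dbpedia.org/sparql",    # endpoint from the question
                     "https://query.wikidata.org/sparql"):    # official Wikidata endpoint
        try:
            r = requests.get(endpoint, params={"query": "ASK {}"}, timeout=10)
            print(endpoint, r.status_code)
        except requests.RequestException as exc:
            print(endpoint, "unreachable:", exc)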
0
votes
1 answer

Accessing Wikidata data that's similar to the actual wiki page

Wikidata URL: https://www.wikidata.org/w/api.php?action=wbgetentities&ids=Q254138&format=json&props=claims The part I'm currently looking at is genre (P136). It only contains one genre, and that ID links to "heavy metal music"…
rebnat
  • 121
  • 9
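
For the question above, a hedged sketch (Python, assuming requests) that reads every P136 ("genre") value on Q254138 and resolves each value item to its English label; the claim layout (mainsnak / datavalue / numeric-id) is the standard wbgetentities JSON:

    import requests

    API = "https://www.wikidata.org/w/api.php"

    entity = requests.get(API, params={
        "action": "wbgetentities", "ids": "Q254138",
        "props": "claims", "format": "json",
    }).json()["entities"]["Q254138"]

    # Collect the Q-IDs of all genre (P136) values on the item.
    genre_ids = [
        "Q%d" % c["mainsnak"]["datavalue"]["value"]["numeric-id"]
        for c in entity["claims"].get("P136", [])
        if "datavalue" in c["mainsnak"]
    ]

    if genre_ids:
        # Second call: fetch the English labels of those value items.
        labels = requests.get(API, params={
            "action": "wbgetentities", "ids": "|".join(genre_ids),
            "props": "labels", "languages": "en", "format": "json",
        }).json()["entities"]
        for qid in genre_ids:
            print(qid, labels[qid]["labels"].get("en", {}).get("value"))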
0
votes
1 answer

Wikidata: list every physical object

I'm trying to get the names of all physical things (tangible concepts) Wikidata knows about (objects, places, countries, etc.), or in other words everything non-abstract. There are examples close to what I need, but only with a depth of one: all…
Vincent Cloutier
  • 331
  • 2
  • 11
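
For the question above, the usual SPARQL shape for "instance of any subclass of X" is wdt:P31/wdt:P279*; this sketch assumes Q223557 is the "physical object" class and only samples the result, since the complete set is far too large for the public endpoint:

    import requests

    QUERY = """
    SELECT ?thing ?thingLabel WHERE {
      ?thing wdt:P31/wdt:P279* wd:Q223557 .   # instance of (a subclass of) physical object
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
    }
    LIMIT 100
    """
    rows = requests.get("https://query.wikidata.org/sparql",
                        params={"query": QUERY, "format": "json"}).json()
    for row in rows["results"]["bindings"]:
        print(row["thing"]["value"], row["thingLabel"]["value"])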
0
votes
2 answers

How to get a list of USA states from Wikidata via the API?

I tried to find the states of the USA with the Wikidata API, but there are no results. For example: http://wdq.wmflabs.org/api?q=claim[150:30] (P150 = contains administrative territorial entity, Q30 = United States of America). What am I doing wrong?
Enstl
  • 15
  • 3
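
The wdq.wmflabs.org service used in the question above has since been retired; on the current Query Service the same claim[150:30] idea becomes a one-triple SPARQL query. A minimal sketch (Python, assuming requests):

    import requests

    QUERY = """
    SELECT ?state ?stateLabel WHERE {
      wd:Q30 wdt:P150 ?state .   # USA "contains administrative territorial entity" ?state
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
    }
    """
    rows = requests.get("https://query.wikidata.org/sparql",
                        params={"query": QUERY, "format": "json"}).json()
    for row in rows["results"]["bindings"]:
        print(row["stateLabel"]["value"])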
0
votes
1 answer

Regex in SPARQL: why doesn't [[:alnum:]] return matches?

I want to find the Wikidata resource corresponding to a specific website. This is my query: SELECT DISTINCT ?resource ?instanceOfLabel ?website WHERE { ?resource wdt:P856 ?website. FILTER (REGEX(str(?website),…
CptNemo
  • 6,455
  • 16
  • 58
  • 107
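
On the regex question above: SPARQL's REGEX follows the XPath/XML Schema regex flavour (and Blazegraph, which backs the Wikidata endpoint, ultimately uses Java-style regexes), so a POSIX class like [[:alnum:]] is not recognised as "alphanumeric" and is instead read as a plain set of literal characters. Below is a hedged sketch with an explicit character class; the concrete website pattern is invented, since the original filter is truncated:

    import requests

    QUERY = r"""
    SELECT DISTINCT ?resource ?website WHERE {
      ?resource wdt:P856 ?website .
      # [0-9A-Za-z] (or Unicode categories) instead of the unsupported [[:alnum:]]
      FILTER ( REGEX(STR(?website), "^https?://(www\\.)?[0-9A-Za-z-]+\\.example\\.org") )
    }
    LIMIT 10
    """
    rows = requests.get("https://query.wikidata.org/sparql",
                        params={"query": QUERY, "format": "json"}).json()
    print(rows["results"]["bindings"])

In practice a REGEX filter over every P856 value can time out on the public endpoint, so narrowing the pattern or matching the exact URL is advisable.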
0
votes
1 answer

How to get Wikipedia article length through the API?

If I have an article URL, how can I get the article metadata, especially length, categories and so on? I'm developing a Java application.
fattah.safa
  • 926
  • 2
  • 14
  • 36
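
For the question above, the MediaWiki action API reports a page's length in bytes via prop=info and its categories via prop=categories. The same request works from Java's HTTP client; the sketch below is Python (assuming requests) only for brevity, and the title is just an example:

    import requests

    API = "https://en.wikipedia.org/w/api.php"
    data = requests.get(API, params={
        "action": "query",
        "titles": "Kevin Bacon",        # example title
        "prop": "info|categories",
        "cllimit": "max",
        "format": "json",
    }).json()

    page = next(iter(data["query"]["pages"].values()))
    print(page["length"])                                    # article length in bytes
    print([c["title"] for c in page.get("categories", [])])  # category titles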
0
votes
2 answers

Get a Wikidata resource in a different language

I want to get, for a given Wikidata resource, its corresponding resource in other languages. For example, for the resource http://wikidata.dbpedia.org/page/Q178794 I want to get ar: ساعة يد, az: Qol saatı, bg: Ръчен часовник, bn: হাতঘড়ি, ca: Rellotge de…
Nad
  • 35
  • 2
  • 10
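
For the question above, wikidata.org itself returns labels for every language in a single call; the sketch (Python, assuming requests) assumes the DBpedia page Q178794 carries the same Q-ID as the item on wikidata.org:

    import requests

    data = requests.get("https://www.wikidata.org/w/api.php", params={
        "action": "wbgetentities",
        "ids": "Q178794",
        "props": "labels",
        "format": "json",
    }).json()

    # One label per language code, e.g. "ar", "az", "bg", "bn", "ca", ...
    for lang, label in sorted(data["entities"]["Q178794"]["labels"].items()):
        print(lang, label["value"])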
0
votes
0 answers

Elasticsearch query on nested data structure

I have indexed a Wikidata dump into Elasticsearch. My objects have a nested structure, with some subfields duplicated within the same field. Here is the structure of just one object: "hits": { "total": 10397696, "max_score": 1, …
Lupanoide
  • 3,132
  • 20
  • 36
0
votes
2 answers

OpenLayers LonLat transformation

I am trying to combine the OSM OpenLayers example with the results I got from query.wikidata.org, but it seems that I am doing the wrong transformation. What would be the right transformation of long and lat?
Spoom
  • 195
  • 3
  • 12
0
votes
1 answer

Incremental updates documentation is not clear enough

I have a database that I need to keep up to date with Wikidata changes, and while I was looking for ways to do it, I found these three: RSS, API calls, Socket.IO. I would like to know if there are other ways and which one is the best or recommended…
Luiz E.
  • 6,769
  • 10
  • 58
  • 98
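
For the question above, one further pull-style option is the MediaWiki recent-changes list on wikidata.org, polled with a stored continuation token. A minimal sketch (Python, assuming requests):

    import requests

    API = "https://www.wikidata.org/w/api.php"
    data = requests.get(API, params={
        "action": "query",
        "list": "recentchanges",
        "rcprop": "title|ids|timestamp",
        "rclimit": 50,
        "format": "json",
    }).json()

    for change in data["query"]["recentchanges"]:
        print(change["timestamp"], change["title"], change["revid"])

    # When present, data["continue"]["rccontinue"] can be stored and sent back with
    # the next request to resume exactly where this batch stopped.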
0
votes
2 answers

Efficient way to compare time values over a huge dataset in R

I am using R to carry out an analysis of Wikidata dumps. I have previously extracted the variables I need from the XML dumps and created my own dataset in smaller CSV files. Here is how my files look: Q939818;35199259;2013-05-04T20:28:48Z;KLBot2;/*…
Aliossandro
  • 209
  • 3
  • 12
0
votes
1 answer

Why does this Wikidata query return no results?

Shouldn't this Wikidata query return something? https://www.wikidata.org/w/api.php?action=query&titles=Kevin_Bacon What am I missing?
fandang
  • 605
  • 1
  • 6
  • 14
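
For the question above: page titles on wikidata.org are Q-IDs, so action=query&titles=Kevin_Bacon matches nothing. Resolving an English Wikipedia title to its item goes through wbgetentities with sites/titles instead; a minimal sketch (Python, assuming requests):

    import requests

    data = requests.get("https://www.wikidata.org/w/api.php", params={
        "action": "wbgetentities",
        "sites": "enwiki",              # look the title up on English Wikipedia
        "titles": "Kevin Bacon",
        "props": "labels|descriptions",
        "languages": "en",
        "format": "json",
    }).json()

    for qid, entity in data["entities"].items():
        print(qid, entity.get("labels", {}).get("en", {}).get("value"))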
0
votes
0 answers

Unable to parse Wikidata URL

I'm trying to analyze data from the Wikidata API, but I keep getting timed out when running the var_dump function. Please see my code below:

Oroku
  • 443
  • 1
  • 3
  • 15