Kommentare

  1. […] of a statement. Denny Vrandečić published 2 blog posts about the ideas behind Wikidata: “Wikidata Quality and Quantity” and “A categorical imperative?“. In addition, he shared a few thoughts on the […]

  2. Amrapali
    19. September 2013 at 11:49

    I'm currently working on a survey paper on data quality assessment methodologies, dimensions, metrics, and tools, particularly for LOD: http://www.semantic-web-journal.net/content/quality-assessment-methodologies-linked-open-data (although it is currently under review). I think this could be a potential answer to your statement: "How to make quality measurable in Wikidata, which metrics correlate with quality – it has simply not yet been investigated sufficiently. I expect that science will provide some answers in the coming months and years." Of course, in this case it is not specific to Wikidata, but it can definitely be applied to it!

  3. Thieol
    8. September 2013 at 02:12

    In my opinion, Wikidata quality will increase naturally with the number of statements. Many statements are linked together by obvious rules: "A term cannot have a date of birth," and so forth. Bots will then be able to detect errors by applying logical rules.
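    The kind of rule-based check this comment describes could be sketched roughly as follows. This is a minimal illustration with a made-up data model (plain subject–property–value triples and one hypothetical rule), not the real Wikidata API or its actual constraint system:

    ```python
    from collections import defaultdict

    # Hypothetical rule: a subject that is an "instance of" a "term"
    # must not carry a "date of birth" statement.
    INCOMPATIBLE = {
        ("instance of", "term"): {"date of birth"},
    }

    def find_violations(statements):
        """Return (subject, offending property) pairs that break a rule."""
        by_subject = defaultdict(list)
        for subj, prop, value in statements:
            by_subject[subj].append((prop, value))
        violations = []
        for subj, props in by_subject.items():
            for trigger, forbidden in INCOMPATIBLE.items():
                if trigger in props:  # rule is triggered for this subject
                    for prop, _ in props:
                        if prop in forbidden:
                            violations.append((subj, prop))
        return violations

    # Made-up statements for illustration only
    statements = [
        ("democracy", "instance of", "term"),
        ("democracy", "date of birth", "1776"),  # logically impossible
        ("Ada Lovelace", "instance of", "human"),
        ("Ada Lovelace", "date of birth", "1815"),
    ]
    print(find_violations(statements))  # [('democracy', 'date of birth')]
    ```

    As the comment suggests, the more statements exist, the more such cross-checks become possible, since each new statement can be tested against the rules triggered by the ones already there.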

  4. Torsten
    5. September 2013 at 10:59

    Press "Random article" 100 times and count the incidents of Wikidata use. (All interwiki language links together count only as one incident.)

  5. Boris Schneider
    4. September 2013 at 19:20

    I hope the project will have many contributors and that it will be successful. :) Thanks for the great article about Wikidata.
