The 2020 Darwin Core Million Challenge

Bob Mesibov is probably well known to people here. He's an advocate of rigorous checks on data quality, and he has made something of a career of pointing out the ways in which biodiversity datasets fail fairly basic tests.

Instead of just being a curmudgeon - probably why we get along :wink: - he's decided to put his money where his mouth is by hosting his own challenge: send him a million records from a dataset you've uploaded to GBIF, and if he doesn't find any errors, he'll pay AUD 150. For details, see The 2020 Darwin Core Million.
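To give a flavour of the "fairly basic tests" involved, here's a minimal sketch (not Bob's actual tooling) that scans a tab-separated Darwin Core occurrence file for out-of-range coordinates and malformed dates. The file name `occurrence.txt` and the crude `eventDate` pattern are my own assumptions; real `eventDate` values may legitimately include times and ranges, which this pattern would flag.

```python
import csv
import re

# Crude pattern: YYYY, YYYY-MM, or YYYY-MM-DD only.
# Valid DwC eventDate values can also be ranges or include times,
# so in practice this check is deliberately over-strict.
ISO_DATE = re.compile(r"\d{4}(-\d{2}(-\d{2})?)?")

def bad_number(value, lo, hi):
    """True if a non-empty value is non-numeric or outside [lo, hi]."""
    try:
        return not lo <= float(value) <= hi
    except ValueError:
        return True

with open("occurrence.txt", newline="", encoding="utf-8") as f:
    # Tab-separated Darwin Core text file; field names are standard DwC terms.
    for n, rec in enumerate(csv.DictReader(f, delimiter="\t"), start=2):
        lat = rec.get("decimalLatitude", "")
        lon = rec.get("decimalLongitude", "")
        if lat and bad_number(lat, -90, 90):
            print(f"line {n}: bad decimalLatitude: {lat!r}")
        if lon and bad_number(lon, -180, 180):
            print(f"line {n}: bad decimalLongitude: {lon!r}")
        date = rec.get("eventDate", "")
        if date and not ISO_DATE.fullmatch(date):
            print(f"line {n}: suspect eventDate: {date!r}")
```

Checks like these catch only the most mechanical problems; the challenge also covers things like inconsistent values across fields, which need more bespoke logic.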


Data quality is a very wide-ranging theme, covering issues like standardized methods for measurements and observations, inclusion of metadata with the data itself, and efficient, effective long-term data management (who's responsible?). While the (GBIF) technology is great as a tool, the major problem is organizing the data, which has nothing to do with technology and everything to do with logical thinking. Perhaps it's time to start talking about a GBIF Data Policy covering the issues mentioned above, because after all, the key is the reliability and trustworthiness of the data communicated through GBIF.

