API importjson spreadsheet, quota limit reached

Hello
I would like to introduce myself: I am an agricultural technician. As part of my consultancy work with farmers I have set up monitoring of parasitoid chalcidoid Hymenoptera, which are often useful against plant phytophages, with 1-2 Malaise traps per farm, thanks to the great identification skills of an entomologist friend with 30 years of experience with this superfamily.
He is not very comfortable with computers and relies on paper books. So I created a Google spreadsheet and, with the importjson function, set up columns where he can immediately check the valid, up-to-date name and the link to the GBIF page for each species he identifies.
But there is a limit of 100 requests per day. The problem is that importjson seems to refresh every time the sheet is opened, so the 100 requests are used up quickly. Has anyone tried other methods?
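For reference, each of those columns is essentially one call to the GBIF species match API. A minimal Apps Script sketch of the same lookup (purely illustrative; the function name and the two-cell return layout are not from the actual sheet) could look like this:

/**
 * Illustrative sketch only: look up a name against the GBIF species match API
 * and return the matched scientific name plus a link to its GBIF species page.
 * The function name and the two-cell return layout are assumptions.
 */
function gbifMatch(name) {
  var url = 'https://api.gbif.org/v1/species/match?verbose=true&rank=species&strict=false&name='
      + encodeURIComponent(name);
  var response = UrlFetchApp.fetch(url);          // one URL Fetch call per lookup
  var result = JSON.parse(response.getContentText());
  if (!result.usageKey) return ['no match', ''];  // nothing usable came back
  // scientificName is the matched name; the GBIF species page is keyed by usageKey
  return [result.scientificName, 'https://www.gbif.org/species/' + result.usageKey];
}

Used as a custom function (e.g. =gbifMatch($F3)), it fills two adjacent cells with the name and the link, but every call still goes through UrlFetchApp, so it runs into the same kind of quota discussed below.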
Thank you

Hi Enrico,

Glad to hear you’re making use of our API!

The limit is probably imposed by Google on the UrlFetchApp function that importjson calls behind the scenes. I use importjson myself regularly, and I know that it refreshes automatically every time you open a sheet, but you might be able to work around it. If you're willing to share your sheet with me (dnoesgaard@gbif.org), I'm happy to take a quick look.

Best,
Daniel
GBIF Secretariat

Hello and thanks Daniel @dnoesgaard
I have shared the sheet now.
Two notes on the formulas:
> IMPORTJSON(CONCATENATE("https://api.gbif.org/v1/species/match?verbose=true&rank=species&strict=false&name=";$F3);"/scientificName$";$F$1:$G$1)
$F3 is the CONCATENATE('genera';" ";'species') in a hidden column, and $F$1:$G$1 is two simple cells: "hiddenHeaders" and "TRUE".

Now I understand: it is the importjson add-on that has a daily quota limit, not the GBIF API or Google Sheets. For a higher limit I would have to pay $19/month!

Hi Enrico,

Thanks for sharing the sheet. I see that you've included instructions to copy and paste values only so the results don't keep reloading, which makes sense. I've had to do something similar in my own sheets.
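If you want to automate that copy-paste step, a small Apps Script along these lines can overwrite the formulas with their current values (the range 'C2:D' and the function name are just an example, not taken from the shared sheet):

/**
 * Example only: replace ImportJSON formulas in a range with their current
 * values so they are not re-fetched every time the sheet is opened.
 * The range 'C2:D' is an assumption; point it at wherever the formulas live.
 */
function freezeImportJsonResults() {
  var sheet = SpreadsheetApp.getActiveSheet();
  var range = sheet.getRange('C2:D');
  // copyTo with contentsOnly pastes the values over the formulas in place
  range.copyTo(range, { contentsOnly: true });
}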

The importJSON function I use isn't actually an add-on but a self-contained script that you simply add to the Script Editor. It has no limit of its own on the number of queries.

The code is available here: https://github.com/bradjasper/ImportJSON/blob/master/ImportJSON.gs

It might work slightly differently, but at least it’s completely free to use.

Good luck!

Hi Enrico,
I have been using the same code Daniel was referring to. It is very easy to use with formulas like:
=importJSON(SUBSTITUTE("https://api.gbif.org/v1/organization/{value}", "{value}", J3), "/title", "noTruncate,noHeaders")
You might, however, still run into a daily quota limit from Google Sheets (at least I did). What you can do is copy-paste the values into another sheet once the formulas have received the data, and work with those.

How many requests Google Sheets allows per day depends on your (or your institution's) Google subscription level. The main issue with this is that when formulas using the importJSON function fail with errors because of timeouts, they will not refresh any more. To fix this, you can use:
/**
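The code block above is cut off; a rough sketch of one possible approach (clearing and re-writing the formulas so Sheets re-evaluates them) could look like this, with the range 'B2:B' and the function name as placeholders:

/**
 * Rough sketch only: clear and re-write the formulas in a range so that
 * Sheets re-evaluates them, which clears stuck timeout errors.
 * The range 'B2:B' and the function name are assumptions; adapt as needed.
 */
function refreshImportJsonFormulas() {
  var sheet = SpreadsheetApp.getActiveSheet();
  var range = sheet.getRange('B2:B');
  var formulas = range.getFormulas();   // remember the existing formulas
  range.clearContent();                 // clear the cached error results
  SpreadsheetApp.flush();               // make sure the clear is applied first
  range.setFormulas(formulas);          // re-set the formulas to force a refresh
}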

Yes, the script runs. But my Drive subscription level has the same quota limit! Not a problem: I will ask the entomologist to fill in only the genus and species columns, and then I will take care of the rest.
I had also tried IMPORTXML with http://resolver.globalnames.org/ but it is too slow.
Finally, it would be interesting to have an API that connects to the Universal Chalcidoidea Database and automatically downloads the host names, perhaps even from within the R bipartite package, so as to build the host-parasitoid graphs for each monitoring campaign!
But maybe I am asking too much!

The quota for URL Fetch calls should be at least 20,000 according to this:

https://developers.google.com/apps-script/guides/services/quotas
