Export data to Firebase?


I would like to automatically export data (the results of a crawler, in JSON format) to my Firebase database. I have no idea how to do it.

Anybody knows?



Hi @ing.pavel.kriz,
here are some options you can try:

  • We have an integration with Zapier (in beta), and Zapier also integrates with Firebase. You can use it to store crawler results in Firebase.
  • Firebase has a REST API you can use from a page function, so you can POST the page function’s result there using jQuery.ajax().
  • We are working on the next version of Apifier, which focuses on tasks like this one. Let me know at jakub@apifier.com and we’ll have a look at your use case in advance.


Thank you for your reply.

I suppose there are (or will be) many developers trying to save data from Apifier to Firebase. That is why I am asking for a short example of how to do it (as you mentioned):

Firebase has a REST API you can use from a page function, so you can POST the page function’s result there using jQuery.ajax().

So, a short example would be very useful for us.


I’m not familiar with Firebase, but you can use their REST API in the same way we use the Dropbox REST API in this topic.
Also, please have a look at Zapier: you can set up an integration between Apifier and Firebase there in five minutes.
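To make the REST approach concrete, here is a minimal sketch of POSTing a page function’s result to Firebase with jQuery.ajax(). The database URL and the `results` path are placeholders — substitute your own project’s values, and note that your database’s security rules must permit the write:

```javascript
// Placeholder database URL -- replace with your own Firebase project's URL.
var DB_URL = 'https://your-project.firebaseio.com';

// Firebase's REST API maps any path in the database to '<path>.json'.
function resultUrl(path) {
    return DB_URL + '/' + path + '.json';
}

// Inside a pageFunction (jQuery is available on the crawled page):
function postResult(result, done) {
    $.ajax({
        type: 'POST',                      // POST appends under an auto-generated key
        url: resultUrl('results'),
        contentType: 'application/json',
        data: JSON.stringify(result),
        success: function () { done(result); },
        error: function (xhr) { done({ error: xhr.status }); }
    });
}
```

A POST creates a new child under an auto-generated key; use PUT with an explicit key instead if you want to control where the record lands.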


I have tried the REST API (posting JSON to Firebase) many times; there seems to be a problem with Firebase authentication, and I have no idea how to solve it.

Anyway, thank you for your help. I will look into the Zapier integration.


We would also like to know how to write directly to Firebase from Apifier.


I went through the Firebase documentation and found a solution. I created a short example and published it as a community crawler: https://www.apifier.com/community/crawlers/drobnikj/DreoL-api-generic

It uses the generic database secret to authenticate. The secret was a little tricky to find, but it is under Project settings > Service accounts > Database secrets.
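For anyone who hit the same authentication error: with a database secret, the REST API accepts the secret as an `auth` query parameter on the request URL. A minimal sketch (the database URL and secret below are placeholders for your own values):

```javascript
// Placeholders -- use your own database URL and the secret found under
// Project settings > Service accounts > Database secrets.
var DB_URL = 'https://your-project.firebaseio.com';
var DB_SECRET = 'YOUR_DATABASE_SECRET';

// Append the secret as the 'auth' query parameter so the write is authenticated.
function authUrl(path) {
    return DB_URL + '/' + path + '.json?auth=' + DB_SECRET;
}

// In a pageFunction, POST the result through the authenticated URL:
function postAuthenticated(result) {
    $.ajax({
        type: 'POST',
        url: authUrl('results'),
        contentType: 'application/json',
        data: JSON.stringify(result)
    });
}
```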


Thank you for your help and investigation. It seems that using the database “secret” for Firebase auth is deprecated, and we don’t know how long it will be supported.

Thank you!


I published a new version of the Firebase integration example. It uses authentication via the Firebase API instead of the deprecated database secrets. The community crawler with the example is here: https://www.apifier.com/community/crawlers/drobnikj/eWpJs-api-generic

Note: this can slow down crawling, because the crawler has to inject the Firebase API into every pageFunction.
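As a rough illustration of the SDK-based approach (not the exact code of the published crawler), the injected Firebase SDK is initialized and signed in inside the pageFunction before pushing the record. All config values and the credentials below are placeholders, and the sign-in method may differ from the one the crawler actually uses:

```javascript
// Placeholder config -- copy the real values from your Firebase console.
var firebaseConfig = {
    apiKey: 'YOUR_API_KEY',
    authDomain: 'your-project.firebaseapp.com',
    databaseURL: 'https://your-project.firebaseio.com'
};

// Shape of the record pushed for each page; scrapedAt helps with later cleanup.
function makeRecord(title, url) {
    return { title: title, url: url, scrapedAt: Date.now() };
}

// Sketch of a pageFunction, assuming the Firebase SDK was injected into the page.
function pageFunction(context) {
    var record = makeRecord($('h1').text(), window.location.href);
    firebase.initializeApp(firebaseConfig);
    firebase.auth()
        .signInWithEmailAndPassword('crawler@example.com', 'password') // hypothetical account
        .then(function () {
            return firebase.database().ref('results').push(record);
        })
        .then(function () { context.done(record); });
}
```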


Thank you so much! You are the man.


Thank you very much for the example.

For my specific application I have to scrape a website and insert the data into Firebase. The website I will be scraping updates its information irregularly (it might be every day, every two days, etc.).

My idea is to use scheduling in Apifier and scrape the website every hour. Since I don’t know exactly when the information on the website is updated, there is a high risk of scraping the same information multiple times. Of course, I don’t want to duplicate any particular piece of information in my Firebase database.

Is there any way to check whether information from the website was already inserted into Firebase during a previous scrape, in order to avoid adding it multiple times?

Hopefully the above makes sense to you!

Many thanks again!


Yeah, it is possible.

You need to find or create a unique key, and before you insert a new record into the Firebase DB, look the key up. If the unique key already exists in your DB, don’t insert the record again.

Here is a quick example: https://www.apifier.com/community/crawlers/drobnikj/bgeS7-api-bbc-com

You can also fetch results from the Firebase DB by other attributes, but that depends on your use case.
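The unique-key idea can be sketched like this over the REST API (the database URL and secret are placeholders, and the published crawler may do it differently). Deriving the key from the item’s URL makes the write idempotent:

```javascript
// Placeholders -- your own database URL and secret.
var DB_URL = 'https://your-project.firebaseio.com';
var DB_SECRET = 'YOUR_DATABASE_SECRET';

// Firebase keys may not contain . # $ [ ] or /, so encode the URL first.
function uniqueKey(url) {
    return url.replace(/[.#$\[\]\/]/g, '_');
}

// Look the key up first; only write when nothing is stored under it yet.
function saveIfNew(item, done) {
    var itemUrl = DB_URL + '/articles/' + uniqueKey(item.url) + '.json?auth=' + DB_SECRET;
    $.ajax({ type: 'GET', url: itemUrl, success: function (existing) {
        if (existing !== null) return done(null);          // duplicate -- skip it
        $.ajax({
            type: 'PUT',                                   // PUT to a fixed key is idempotent
            url: itemUrl,
            contentType: 'application/json',
            data: JSON.stringify(item),
            success: function () { done(item); }
        });
    }});
}
```

Since a PUT to a fixed key simply overwrites the same record, even skipping the GET would avoid duplicates; the lookup just saves the redundant write.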



I suppose the Firebase database is yours, so you know the structure of the records (items) stored in Firebase.
You can fetch the last record from Firebase and check whether the corresponding fields (in Firebase and in the crawler) are the same. If not, you can insert/update the new data.
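A sketch of fetching just the last stored record via the REST API’s query parameters (`orderBy` and `limitToLast` are standard Firebase Realtime Database REST query parameters; the database URL and secret below are placeholders):

```javascript
// Placeholders -- your own database URL and secret.
var DB_URL = 'https://your-project.firebaseio.com';
var DB_SECRET = 'YOUR_DATABASE_SECRET';

// Ask Firebase for only the last record under 'path', ordered by key.
// Note: the REST API requires the quotes around $key.
function lastRecordUrl(path) {
    return DB_URL + '/' + path + '.json?orderBy="$key"&limitToLast=1&auth=' + DB_SECRET;
}

// GET the last record, compare it with the freshly crawled item, and POST
// only when they differ. In practice, compare only the fields that matter
// (timestamps and the like will always differ).
function insertIfChanged(item, done) {
    $.ajax({ type: 'GET', url: lastRecordUrl('results'), success: function (last) {
        var prev = last && last[Object.keys(last)[0]];   // unwrap { key: record }
        if (prev && JSON.stringify(prev) === JSON.stringify(item)) return done(null);
        $.ajax({
            type: 'POST',
            url: DB_URL + '/results.json?auth=' + DB_SECRET,
            contentType: 'application/json',
            data: JSON.stringify(item),
            success: function () { done(item); }
        });
    }});
}
```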


Good news! :slight_smile:

There is a new feature: Firebase Cloud Functions.


I will experiment with the refreshed Amazon crawler.

What pagination options would you recommend in the crawler, so that the same number of items is captured per keyword?


Hello @canaryyellowe09,
This is a nice feature; we can add it for sure.
If you have any more details about it, please add them here to the issue.
When the feature is added, I will let you know here.