Apify and Crawlee Official Forum

Updated 2 months ago

Saving scraped data from dynamic URLs using Crawlee in an Express Server?

Hello all.
I've been trying to build an app that triggers a scraping job whenever its API endpoint is hit.
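Roughly, the endpoint looks something like this. This is a simplified sketch rather than my exact code; the route path, `startUrl` body field and `jobId` are placeholders:

```ts
// Simplified sketch of the Express endpoint that kicks off a scraping job.
// Route path, startUrl and jobId are placeholders, not the exact MRE code.
import express from 'express';
import { randomUUID } from 'node:crypto';
import { CheerioCrawler } from 'crawlee';
import { router } from './routes.js'; // the Crawlee router with the two handlers

const app = express();
app.use(express.json());

app.post('/scrape', async (req, res) => {
  const jobId = randomUUID(); // identifies this particular scraping job

  const crawler = new CheerioCrawler({ requestHandler: router });

  // Start the crawl in the background and respond right away with the job id.
  crawler
    .run([{ url: req.body.startUrl, label: 'LIST', userData: { jobId } }])
    .catch((err) => console.error(`Job ${jobId} failed`, err));

  res.json({ jobId });
});

app.listen(3000);
```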

The initial endpoint hits a Crawlee router that has two handlers: one for scraping the URL-list pages and the other for scraping the details from each detail page. (The URL-list handler also enqueues the next URL-list page back to itself, btw.)
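In sketch form, the two handlers look roughly like this. The selectors, labels and the per-job store name are placeholders I'm using to describe the idea, not the real code from the screenshots:

```ts
// Sketch of the two router handlers (selectors and labels are placeholders).
import { createCheerioRouter, KeyValueStore } from 'crawlee';

export const router = createCheerioRouter();

// Handler 1: the URL-list pages.
router.addHandler('LIST', async ({ request, enqueueLinks }) => {
  const { jobId } = request.userData;

  // Enqueue every detail page found on this list page to the DETAIL handler.
  await enqueueLinks({
    selector: 'a.detail-link',
    label: 'DETAIL',
    userData: { jobId },
  });

  // Enqueue the next URL-list page back to this same handler.
  await enqueueLinks({
    selector: 'a.next-page',
    label: 'LIST',
    userData: { jobId },
  });
});

// Handler 2: the detail pages.
router.addHandler('DETAIL', async ({ request, $ }) => {
  const { jobId } = request.userData;

  // I'm sketching a per-job store name here, since that's how I'm thinking
  // about grouping each job's data.
  const store = await KeyValueStore.open(`job-${jobId}`);

  // KV-store keys only allow a limited character set, so sanitize the URL key.
  const key = request.uniqueKey.replace(/[^a-zA-Z0-9-]/g, '-');
  await store.setValue(key, {
    url: request.url,
    title: $('h1').text().trim(),
  });
});
```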

I'm saving the data from each of these scrapes into a KeyValueStore, but I want a way to take all the data in the KV store that belongs to a particular job and save it into a database.
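Roughly what I'm hoping to end up with once a job's crawl finishes is something like the sketch below. `saveToDatabase` is just a stand-in for my DB layer, and I'm not sure iterating the store with `forEachKey` is the right approach:

```ts
// Sketch of what I want to happen after a job finishes.
// saveToDatabase is a placeholder for my actual database layer.
import { KeyValueStore } from 'crawlee';

async function saveToDatabase(jobId: string, record: unknown): Promise<void> {
  console.log(`would save for job ${jobId}:`, record); // placeholder
}

export async function persistJobResults(jobId: string): Promise<void> {
  const store = await KeyValueStore.open(`job-${jobId}`);

  // Walk every key the DETAIL handler wrote for this job and push it to the DB.
  await store.forEachKey(async (key) => {
    const record = await store.getValue(key);
    await saveToDatabase(jobId, record);
  });
}
```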
The attached screenshots are the MRE snippets from my code.
Attachments
code-mre-1.png
code-mre-2.png
code-mre-3.png
3 comments
In case the 3rd screenshot is not very clear, here's the split of both handlers:
Attachments
code-mre-3-2.png
code-mre-3-1.png
I want to be able to hit the endpoint concurrently for multiple job requests.
Each job gets routed to handler 1, which extracts, let's say, 10 detail URLs and routes each one to the detail handler.
Each detail handler run then saves the detail-page results into a KV store.
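The way I've been thinking about keeping concurrent jobs separate is one named request queue (and one named KV store) per job, then persisting once that job's run resolves. Again just a sketch building on the snippets above, not something I've confirmed is the right pattern:

```ts
// Sketch of how I imagine one job running end to end when several run at once.
// router and persistJobResults come from the earlier snippets.
import { CheerioCrawler, RequestQueue } from 'crawlee';
import { randomUUID } from 'node:crypto';
import { router } from './routes.js';
import { persistJobResults } from './persist.js';

async function runJob(startUrl: string): Promise<string> {
  const jobId = randomUUID();

  // A dedicated queue per job, so two concurrent jobs never share requests.
  const requestQueue = await RequestQueue.open(`job-${jobId}`);
  const crawler = new CheerioCrawler({ requestQueue, requestHandler: router });

  await crawler.run([{ url: startUrl, label: 'LIST', userData: { jobId } }]);

  // Once this job's crawl is done, move its KV-store data into the database.
  await persistJobResults(jobId);
  return jobId;
}
```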