Apify and Crawlee Official Forum

Updated 2 months ago

All requests from the queue have been processed, the crawler will shut down.

I'm working on a news web crawler and setting purgeOnStart=false so that I don't scrape duplicate news. However, sometimes I get the message "All requests from the queue have been processed, the crawler will shut down." and the crawler doesn't run. Any suggestions to fix this issue?
6 comments
The message means that all requests in your requestQueue are already handled, so there is no point in processing them again.
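For context, here is a minimal sketch (TypeScript, assuming a CheerioCrawler and the default local request queue) of how this situation arises: with purging disabled, the queue persists between runs, so a second run finds every request already handled and shuts down immediately.

```ts
import { CheerioCrawler, Configuration } from 'crawlee';

// Equivalent to the purgeOnStart=false from the question; the env var
// CRAWLEE_PURGE_ON_START=0 or crawlee.json would work as well.
Configuration.getGlobalConfig().set('purgeOnStart', false);

const crawler = new CheerioCrawler({
    async requestHandler({ request, log }) {
        log.info(`Processing ${request.url}`);
    },
});

// First run: processes the URL and marks it as handled in the persisted
// queue. Second run: the same request is already handled, so the crawler
// prints the shutdown message and exits without doing any work.
await crawler.run(['https://example.com/news']);
```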
Yes, I know that. However, how do I add more URLs to the requestQueue? The domains I'm collecting data from have more news.
You can add the same URL, but then you need to specify a different uniqueKey for the request. By default, uniqueKey is the same as the URL.
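For illustration, a minimal sketch (TypeScript; the URL and the date-based key are assumptions, any value that differs per run works) of re-enqueueing the same URL with a fresh uniqueKey:

```ts
import { CheerioCrawler } from 'crawlee';

const crawler = new CheerioCrawler({
    async requestHandler({ request, log }) {
        log.info(`Processing ${request.url}`);
    },
});

// A uniqueKey that changes per run (here: the current date) makes the queue
// treat the same URL as a new request instead of an already-handled one.
const today = new Date().toISOString().slice(0, 10);
await crawler.addRequests([
    { url: 'https://example.com/news', uniqueKey: `https://example.com/news#${today}` },
]);

await crawler.run();
```

Article URLs you enqueue without a custom uniqueKey keep the default URL-based key, so already-scraped news is still deduplicated.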
Alright, I'll try that.