
Apify and Crawlee Official Forum

Stopping the crawler when it's done scraping

Good day everyone. How can I make the crawler stop once it's done scraping/requesting a certain URL?
I want to set up my Crawlee project so it keeps running when it has no URLs to request, waiting on a queue of URLs (Redis).
2 comments
Perhaps you could throw Crawlee's CriticalError, which will cause the crawler to shut down.
How can I do that? Throw a CriticalError even when no error has occurred?
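Yes, a CriticalError can be thrown deliberately from your request handler whenever a condition of your choosing is met; in Crawlee it is imported from the `crawlee` package, and unlike an ordinary error (which only fails the current request), it aborts the whole run. Below is a minimal, library-free sketch of that semantics, with a hypothetical `STOP_URL` sentinel and a simplified crawl loop standing in for Crawlee's internals:

```javascript
// Simplified sketch of the CriticalError pattern: an ordinary error
// only fails the current request, while a CriticalError stops the
// whole crawl. In real Crawlee: import { CriticalError } from 'crawlee';
class CriticalError extends Error {}

// Hypothetical sentinel URL; replace with your own stop condition.
const STOP_URL = 'https://example.com/done';

function crawl(queue, handler) {
    const scraped = [];
    for (const url of queue) {
        try {
            handler(url); // your requestHandler logic
            scraped.push(url);
        } catch (err) {
            if (err instanceof CriticalError) break; // shut down the crawler
            // an ordinary error would only skip this one request
        }
    }
    return scraped;
}

const visited = crawl(
    ['https://example.com/a', 'https://example.com/done', 'https://example.com/b'],
    (url) => {
        // Nothing has actually failed here; we throw on purpose
        // because we decided this URL means "we're finished".
        if (url === STOP_URL) throw new CriticalError('Target URL reached.');
    },
);
// visited contains only the URLs processed before the stop condition fired
```

In a real Crawlee project the same idea is `if (request.url === STOP_URL) throw new CriticalError('done');` inside your `requestHandler`.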