Apify and Crawlee Official Forum
Stopping the crawler when it is done scraping
Apify Platform Forum
Banul
last year
Good day everyone. How can I make the crawler stop once it is done scraping/requesting a certain URL?
I want to set up my Crawlee project to keep running when it has no URLs to request, waiting on a queue of URLs (Redis).
LeMoussel
last year
Perhaps you can throw Crawlee's CriticalError, which will cause the crawler to shut down.
Banul
last year
How can I do that? Throw a CriticalError even when it hasn't encountered an error?
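For anyone landing here later: you can throw CriticalError from inside your requestHandler whenever a condition of your own choosing is met, even if no real error occurred. A minimal sketch, assuming CriticalError is exported from the crawlee package and using a hypothetical STOP_URL sentinel:

```typescript
import { CheerioCrawler, CriticalError } from 'crawlee';

// Hypothetical sentinel URL: stop once this page has been processed.
const STOP_URL = 'https://example.com/last-page';

const crawler = new CheerioCrawler({
    async requestHandler({ request, log }) {
        log.info(`Scraped ${request.url}`);
        // ... extract your data here ...

        // Throwing CriticalError aborts the whole crawler run,
        // even though nothing actually failed.
        if (request.url === STOP_URL) {
            throw new CriticalError('Target URL scraped - stopping crawler.');
        }
    },
});

await crawler.run([STOP_URL]);
```

For the "wait on a Redis queue" part of the original question, Crawlee also has a keepAlive crawler option that keeps the crawler running when the request queue is empty; in that setup, calling crawler.teardown() is the graceful way to stop it instead of throwing.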