Apify and Crawlee Official Forum

Updated 3 months ago

Handle browser failure

I have a Puppeteer scraper that performs a lot of actions on a page, and at some point the browser fails.
It's a page with infinite scroll where I have to click a button and scroll down. After 70-80 interactions the browser crashes, and the request is retried as usual.
The main idea is that with those actions I'm collecting
URLs that I want to navigate to.
I want to somehow handle the browser crashing so I can continue from those URLs when the crash happens.
2 comments
Hi, is there any error related to the crashing?
Otherwise you may just accumulate the URLs in some kind of global context and enqueue them in the errorHandler.
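A minimal sketch of that pattern in plain Node.js: URLs are collected in a module-level set (which lives in the Node process, so it survives a browser crash), and a recovery step enqueues whatever was gathered before the failure. The function names `scrapePage` and `flushCollected` are illustrative; in a real Crawlee `PuppeteerCrawler` the collection would happen inside `requestHandler` and the flush inside the crawler's `errorHandler` option, which runs before a failed request is retried.

```javascript
// URLs collected so far, kept outside the browser so a crash doesn't lose them.
const collectedUrls = new Set();

// Stand-in for the interaction loop (click button, scroll, read URLs).
// Any step may throw, simulating the browser crashing mid-run.
function scrapePage(interactions) {
  for (const step of interactions) {
    if (step.crash) throw new Error('Browser crashed');
    collectedUrls.add(step.url);
  }
}

// Stand-in for errorHandler logic: enqueue everything gathered before the
// crash, then clear the set so a retry starts fresh.
function flushCollected(enqueue) {
  for (const url of collectedUrls) enqueue(url);
  collectedUrls.clear();
}

const queued = [];
try {
  scrapePage([
    { url: 'https://example.com/a' },
    { url: 'https://example.com/b' },
    { crash: true }, // browser dies on the third interaction
  ]);
} catch (err) {
  flushCollected((url) => queued.push(url));
}
console.log(queued); // the two URLs collected before the crash
```

With Crawlee you would call `crawler.addRequests(...)` (or `enqueueLinks`) in place of the `enqueue` callback, so the retried request can skip the interactions that already succeeded.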