The community member has a Puppeteer scraper that performs many actions on a page, but the browser crashes after 70-80 interactions. The page uses infinite scroll, so the scraper has to click a button and scroll down repeatedly. The main goal is to collect URLs that the community member then wants to navigate to, and they would like to handle the browser crash so the scraper can resume from the URLs collected so far.
In the comments, another community member asks whether any errors accompany the crash. Another suggests accumulating the URLs in a global context and enqueuing them in the error handler; a sketch of that approach follows the original question below.
I have a Puppeteer scraper that performs lots of actions on a page, and at some point the browser fails. It's a page with infinite scroll, and I have to click a button and scroll down. After 70-80 interactions the browser crashes, and the request gets retried as usual. The main idea is that with those actions I'm collecting URLs that I want to navigate to. I want to somehow handle the browser crash so I can start from those URLs when it happens.
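Since the post mentions requests "getting retried as usual", the scraper is presumably built on Crawlee's PuppeteerCrawler. Below is a minimal sketch of the suggestion from the comments, under that assumption: keep the URLs in an accumulator defined outside the handlers, and enqueue whatever survived in `failedRequestHandler` once the retries are exhausted. The selectors, the `DETAIL` label, and the `collectedUrls` name are hypothetical placeholders, not details from the thread.

```javascript
import { PuppeteerCrawler } from 'crawlee';

// Accumulator that outlives any single browser session, so URLs
// gathered before a crash are not lost with the failed request.
const collectedUrls = new Set();

const crawler = new PuppeteerCrawler({
    async requestHandler({ page, request }) {
        if (request.label === 'DETAIL') {
            // Scrape the individual detail page here (omitted in this sketch).
            return;
        }
        // Repeat the click-and-scroll interaction; after each pass, harvest
        // whatever links the infinite scroll has revealed so far.
        for (let i = 0; i < 100; i++) {
            await page.click('#load-more'); // placeholder selector
            await page.evaluate(() => window.scrollTo(0, document.body.scrollHeight));
            await new Promise((resolve) => setTimeout(resolve, 500)); // let content load
            const urls = await page.$$eval('a.item-link', (links) =>
                links.map((a) => a.href),
            );
            urls.forEach((url) => collectedUrls.add(url));
        }
    },
    // Runs once retries are exhausted (e.g. the browser keeps crashing):
    // enqueue everything collected so far instead of losing it.
    async failedRequestHandler({ request }) {
        console.warn(`Giving up on ${request.url}, enqueuing ${collectedUrls.size} URLs`);
        await crawler.addRequests(
            [...collectedUrls].map((url) => ({ url, label: 'DETAIL' })),
        );
    },
});

await crawler.run(['https://example.com/infinite-list']);
```

Using a `Set` also deduplicates URLs across retries of the same listing request, so each retry can safely re-harvest links it has already seen.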