Apify Discord Mirror

Updated 5 months ago

TargetClosedError: Target page, context or browser has been closed (I've tried a lot)

At a glance

The community member is experiencing an issue where their web crawler randomly stops working. They have tried various approaches, such as changing memory, concurrency, and requests per minute, but the problem persists. They are unsure whether they are missing an await anywhere, and also how to access the context with the current crawlee library.

The community members in the comments suggest upgrading to the latest version of crawlee (3.9.3 or 3.10) and adding more logs to determine when the page is closing, potentially within the while (!isEndReached) loop. There is no explicitly marked answer, but the community members provide guidance on how to potentially resolve the issue.

Hello,

I've tried a lot to resolve this issue, from changing memory and concurrency to requests per minute - I can't seem to understand why this randomly happens. I can't tell if I'm missing an await anywhere. More importantly, I'm not sure how to access the context with the current crawlee lib, i.e.: https://docs.apify.com/academy/node-js/how_to_fix_target-closed
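A common cause of this error is an un-awaited async call inside a paging loop: the request handler returns early, the browser context is torn down, and any still-pending page call then fails with TargetClosedError. A minimal, dependency-free sketch of the instrumentation the commenters suggest - logging each iteration so the failure point is visible - is below. `drainPages` and `fetchNextBatch` are hypothetical stand-ins for the crawler's own logic, not crawlee APIs.

```javascript
// Hedged sketch: instrument a "while (!isEndReached)" paging loop so a
// TargetClosedError reveals exactly which iteration the page closed on.
async function drainPages(page, fetchNextBatch) {
  let isEndReached = false;
  let iteration = 0;
  const seen = [];
  while (!isEndReached) {
    iteration++;
    try {
      // Await every page call: a missing await here lets the handler
      // return early, and the browser context gets torn down mid-loop.
      const batch = await fetchNextBatch(page);
      if (batch.length === 0) {
        isEndReached = true;
      } else {
        seen.push(...batch);
      }
      console.log(`iteration ${iteration}: got ${batch.length} items`);
    } catch (err) {
      // Log the iteration before rethrowing so the failure point is visible.
      console.error(`page closed during iteration ${iteration}:`, err.message);
      throw err;
    }
  }
  return seen;
}
```

In a real crawlee handler the same pattern applies: keep the iteration counter and try/catch around the loop body, and make sure every `page.*` call is awaited before the handler returns.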

Any guidance would be most helpful. We're already looking to offload our crawlers to the Apify platform (already testing), so we want to get this working in our own environment first.

Currently on "crawlee": "3.9.2"

Code for reference attached.
4 comments
Hi! Please upgrade to crawlee 3.9.3, then add more logs to find out when exactly the page is closed - e.g. it might be inside the "while (!isEndReached)" loop.
ok thanks, I'll try this
I guess it's better to upgrade to crawlee's latest (3.10) version.
AFAIK it's some bug related to old versions. Always update to the latest if you can.