Apify and Crawlee Official Forum

uberpea5000
Joined January 2, 2025
Is there a way to ensure that successive requests are made using the same session (with the same cookies, etc.) in the Python API? I am scraping a very fussy site that seems to have strict session-continuity requirements, so I need to ensure that for a main page A, all requests to subpages linked from it (A-1, A-2, A-3, etc., as well as A-1-1, A-1-2, etc.) are made within the same session as the original request.
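In case it helps frame the question, the direction I'm currently leaning is to cap the session pool at a single session so every request reuses the same cookies. A rough sketch of that idea is below. I'm assuming the crawler accepts a session_pool argument and that SessionPool takes max_pool_size; the import paths are from the current docs (older versions expose crawlee.beautifulsoup_crawler instead), so please correct me if any of those names are wrong:

```python
import asyncio

from crawlee.crawlers import BeautifulSoupCrawler, BeautifulSoupCrawlingContext
from crawlee.sessions import SessionPool


async def main() -> None:
    # A pool capped at one session means every request reuses the same
    # session (and therefore the same cookies). Assumption: the crawler
    # accepts `session_pool` and SessionPool takes `max_pool_size`.
    crawler = BeautifulSoupCrawler(session_pool=SessionPool(max_pool_size=1))

    @crawler.router.default_handler
    async def handler(context: BeautifulSoupCrawlingContext) -> None:
        session_id = context.session.id if context.session else None
        context.log.info(f'Fetched {context.request.url} with session {session_id}')
        # Subpages A-1, A-2, ... (and A-1-1, ...) are enqueued here and,
        # with a single-session pool, share page A's cookies.
        await context.enqueue_links()

    await crawler.run(['https://example.com/A'])


if __name__ == '__main__':
    asyncio.run(main())
```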

Thanks as always.
17 comments
Hello, the Playwright docs (https://playwright.dev/python/docs/library#incompatible-with-selectoreventloop-of-asyncio-on-windows) say that Playwright is incompatible with asyncio's SelectorEventLoop on Windows, which Crawlee seems to require. Can you confirm whether it is possible to use a PlaywrightCrawlingContext in a Windows environment? I'm running into an asyncio NotImplementedError when trying to run the crawler, which suggests to me that there might be an issue. Thanks for the help.
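For reference, this is essentially the minimal repro I'm running, plus the Proactor-policy workaround I'm considering. The import path is from the current docs, and the workaround is my own assumption (Playwright needs subprocess support, which the SelectorEventLoop lacks), not something I've found in Crawlee's docs:

```python
import asyncio
import sys

from crawlee.crawlers import PlaywrightCrawler, PlaywrightCrawlingContext


async def main() -> None:
    crawler = PlaywrightCrawler()

    @crawler.router.default_handler
    async def handler(context: PlaywrightCrawlingContext) -> None:
        context.log.info(f'Title: {await context.page.title()}')

    await crawler.run(['https://example.com'])


if __name__ == '__main__':
    # Playwright launches the browser via asyncio subprocesses, which the
    # Windows SelectorEventLoop does not implement (hence NotImplementedError).
    # Forcing the Proactor policy is a common workaround; whether it conflicts
    # with anything Crawlee itself needs is exactly my question.
    if sys.platform == 'win32':
        asyncio.set_event_loop_policy(asyncio.WindowsProactorEventLoopPolicy())
    asyncio.run(main())
```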
1 comment
Hi there. Whenever I try to use residential proxies ('HTTP://groups-RESIDENTIAL:/...'), I run into this error:

httpx.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1129)

The 'auto' group seems to work fine. Can anyone tell me what I'm doing wrong here?
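For context, this is roughly how I'm wiring the proxy in. The URL below is the standard Apify Proxy format with the password as a placeholder (my real code passes the full URL), and the import paths are from the current docs, so treat the exact names as assumptions:

```python
import asyncio

from crawlee.crawlers import HttpCrawler, HttpCrawlingContext
from crawlee.proxy_configuration import ProxyConfiguration


async def main() -> None:
    # Standard Apify Proxy URL shape; <password> is a redacted placeholder.
    proxy_configuration = ProxyConfiguration(
        proxy_urls=['http://groups-RESIDENTIAL:<password>@proxy.apify.com:8000'],
    )
    crawler = HttpCrawler(proxy_configuration=proxy_configuration)

    @crawler.router.default_handler
    async def handler(context: HttpCrawlingContext) -> None:
        context.log.info(f'Fetched {context.request.url}')

    await crawler.run(['https://example.com'])


if __name__ == '__main__':
    asyncio.run(main())
```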

Thanks!
7 comments