After following the Crawlee scraping tutorial, I cannot figure out how to add specific cookies (key-value pairs) to a request, e.g. sid=1234.
There seem to be Session and SessionPool classes, but how do I reach these objects?
Also, max_pool_size of the SessionPool defaults to 1000; should one then iterate over the sessions in the pool and set the session ID on each session's cookies dict?
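For reference, this is roughly what I had in mind for the session-pool route. It is untested, and both the get_session() call and treating session.cookies as a plain dict are assumptions on my part:

from crawlee.sessions import SessionPool

# Untested sketch: pre-seed a session with the sid cookie.
# Assumes get_session() hands out a Session whose .cookies is a plain dict.
async def seed_session(session_pool: SessionPool) -> None:
    session = await session_pool.get_session()
    session.cookies['sid'] = '1234'

Even if something like this works, I don't know how to hand the seeded pool back to the crawler, or whether every session in the pool needs the same treatment.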
Imagine the snippet below from the tutorial: the default handler processes the incoming request and wants to enqueue requests to the category pages. Let's say these category pages require the sid cookie to be set; how can this be achieved?
Any help is very much appreciated, as I could not find any examples via Google, ChatGPT, or Perplexity.
@router.default_handler
async def default_handler(context: PlaywrightCrawlingContext) -> None:
    # This is a fallback route which will handle the start URL.
    context.log.info(f'default_handler is processing {context.request.url}')
    await context.page.wait_for_selector('.collection-block-item')
    await context.enqueue_links(
        selector='.collection-block-item',
        label='CATEGORY',
    )
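The closest I have gotten is injecting the cookie at the Playwright level inside the handler, using Playwright's BrowserContext.add_cookies() rather than Crawlee's session machinery. The domain below is a placeholder, and I am not sure the enqueued category requests actually reuse the same browser context:

@router.default_handler
async def default_handler(context: PlaywrightCrawlingContext) -> None:
    context.log.info(f'default_handler is processing {context.request.url}')

    # Workaround attempt: set the cookie on the Playwright browser context
    # before enqueueing, hoping the category pages inherit it.
    # 'example.com' stands in for the real target domain.
    await context.page.context.add_cookies([
        {'name': 'sid', 'value': '1234', 'domain': 'example.com', 'path': '/'},
    ])

    await context.page.wait_for_selector('.collection-block-item')
    await context.enqueue_links(
        selector='.collection-block-item',
        label='CATEGORY',
    )

Even if this workaround holds up, it bypasses the session pool entirely, so I would still like to know the intended way to do this in Crawlee.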