Apify and Crawlee Official Forum


Crawlee with multiple Crawlers?

At a glance

The community member is asking whether the Python Crawlee library allows multiple crawlers to be run using a single router. They mention that a colleague suggested this approach: since plain curl-style HTTP requests are much faster than Playwright, they could use plain HTTP for some requests and load a browser only where it is actually needed, potentially speeding up some processes significantly.

In the comments, another community member indicates that they found the answer to this question in a discussion on the Crawlee GitHub repository.

Useful resources
Does the Python Crawlee library allow for multiple crawlers to be run using one router?
Python
router = Router[BeautifulSoupCrawlingContext]()

Just asking because a colleague asked me whether it would be possible: curl requests are a lot faster than Playwright, so if we could use curl for half the requests and only load the browser for the portion where it's needed, it could significantly speed up some processes.
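As a rough sketch of that split (not the resolution from the GitHub discussion mentioned above), the snippet below uses a BeautifulSoupCrawler for the fast, plain-HTTP path and starts a PlaywrightCrawler only for pages that appear to need a browser. Each crawler gets its own Router here, because each router in the snippet is typed to one crawling context. The "needs a browser" heuristic, the example URL, and the import paths (a Crawlee version that exposes the crawlers under crawlee.crawlers) are assumptions for illustration only.

Python
import asyncio

from crawlee.crawlers import (
    BeautifulSoupCrawler,
    BeautifulSoupCrawlingContext,
    PlaywrightCrawler,
    PlaywrightCrawlingContext,
)
from crawlee.router import Router

# One router per crawler, each typed to that crawler's context.
http_router = Router[BeautifulSoupCrawlingContext]()
browser_router = Router[PlaywrightCrawlingContext]()

# URLs the fast crawler defers to the browser-based crawler.
browser_urls: list[str] = []


@http_router.default_handler
async def handle_http(context: BeautifulSoupCrawlingContext) -> None:
    # Fast path: plain HTTP fetch parsed with BeautifulSoup.
    title = context.soup.title.string if context.soup.title else None
    if title:
        await context.push_data({'url': context.request.url, 'title': title})
    else:
        # Illustrative heuristic: a missing title suggests a JS-rendered page,
        # so hand it off to the Playwright crawler instead.
        browser_urls.append(context.request.url)


@browser_router.default_handler
async def handle_browser(context: PlaywrightCrawlingContext) -> None:
    # Slow path: full browser rendering, used only for the deferred URLs.
    await context.push_data({
        'url': context.request.url,
        'title': await context.page.title(),
    })


async def main() -> None:
    # Crawl everything with the cheap HTTP crawler first.
    http_crawler = BeautifulSoupCrawler(request_handler=http_router)
    await http_crawler.run(['https://crawlee.dev'])

    # Only spin up Playwright if anything actually needs a browser.
    if browser_urls:
        browser_crawler = PlaywrightCrawler(request_handler=browser_router)
        await browser_crawler.run(browser_urls)


if __name__ == '__main__':
    asyncio.run(main())

Running the HTTP crawler first and starting the browser crawler only for the deferred URLs keeps Playwright's startup and rendering cost limited to the pages that actually require it.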