Apify and Crawlee Official Forum

Updated 4 months ago

Running nested crawlers

How can I run a crawler inside a running crawler? I have a CheerioCrawler running and want to run a new crawler, for example a JSDOMCrawler, for each page the CheerioCrawler visits. I know that I can run them in parallel, but what I want is to run them nested.
2 comments
What problem are you trying to solve here, or what is your exact use case?
Why do you need separate crawlers based on the type of page?
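For reference, a minimal sketch of one way to nest crawlers (not an official Crawlee pattern): construct a JSDOMCrawler inside the CheerioCrawler's requestHandler and await its run. The named queue `inner-${request.id}` and the start URL are placeholders for illustration; giving the inner crawler its own RequestQueue is an assumption made here to keep it from colliding with the outer crawler's default queue.

```typescript
// Sketch only: nesting a JSDOMCrawler inside a CheerioCrawler's handler.
import { CheerioCrawler, JSDOMCrawler, RequestQueue } from 'crawlee';

const outerCrawler = new CheerioCrawler({
    async requestHandler({ request, $, log }) {
        log.info(`Cheerio visited ${request.url}`);

        // Assumption: a separate named queue per outer request avoids
        // clashing with the outer crawler's default request queue.
        const innerQueue = await RequestQueue.open(`inner-${request.id}`);

        const innerCrawler = new JSDOMCrawler({
            requestQueue: innerQueue,
            async requestHandler({ request: innerRequest, window }) {
                log.info(`JSDOM visited ${innerRequest.url}: ${window.document.title}`);
            },
        });

        // Awaiting here makes the inner crawl finish before the outer
        // handler returns, i.e. the crawlers run nested, not in parallel.
        await innerCrawler.run([request.url]);
    },
});

await outerCrawler.run(['https://crawlee.dev']);
```

Note that awaiting the inner crawler keeps the outer request handler busy for the whole inner crawl, so you may need to raise requestHandlerTimeoutSecs on the CheerioCrawler to avoid timeouts.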