Apify Discord Mirror

Running nested crawlers

At a glance

The community member asks how to run one crawler inside another already running crawler: they have a CheerioCrawler and want to run a new crawler, such as a JSDOMCrawler, for each page the CheerioCrawler visits. The community members in the comments ask for more context on the problem being solved and the use case for needing separate crawlers based on page type.

How can I run a crawler inside a running crawler? I have a CheerioCrawler running and want to run a new crawler, for example a JSDOMCrawler, for each page the CheerioCrawler visits. I know that I can run them in parallel, but what I want is to run them nested.
2 comments
What problem are you trying to solve here? Or, what use case exactly?
Why do you need separate crawlers based on the type of page?
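
For reference, one way to nest crawlers in Crawlee is to construct and run the inner crawler from inside the outer crawler's requestHandler. The sketch below is a minimal illustration only, assuming Crawlee's CheerioCrawler and JSDOMCrawler; the nested queue naming (`nested-N`) and the example URL are placeholders and not part of the original thread.

```ts
import { CheerioCrawler, JSDOMCrawler, RequestQueue } from 'crawlee';

let nestedRunCounter = 0;

const outerCrawler = new CheerioCrawler({
    async requestHandler({ request, enqueueLinks, log }) {
        log.info(`Outer crawler visited ${request.url}`);

        // Give the nested crawler its own request queue so it does not
        // share (and drain) the outer crawler's default queue.
        // The queue name scheme is an assumption for this sketch.
        const requestQueue = await RequestQueue.open(`nested-${nestedRunCounter++}`);

        const innerCrawler = new JSDOMCrawler({
            requestQueue,
            async requestHandler({ window }) {
                // Process the JSDOM-parsed version of the same page here.
                log.info(`Inner crawler saw title: ${window.document.title}`);
            },
        });

        // Run the nested crawler to completion before continuing the outer crawl.
        await innerCrawler.run([request.url]);

        await enqueueLinks();
    },
});

await outerCrawler.run(['https://example.com']);
```

Each nested run blocks the outer requestHandler until it finishes, so long inner crawls may need a higher requestHandlerTimeoutSecs on the outer crawler. Running a second crawler afterwards over the collected URLs is often simpler, which is likely why the commenters ask about the underlying use case.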