I have a crawler set up (x) that collects links from a page. I want to run another crawler (y) inside x's crawl to collect more data and return it together with the rest of x's data. Is it possible to do this?
(I know you can add requests to the queue from a crawl, but I want to keep the data together.)
Example data structure:
const data = [{
  // from "x" crawl
  title: 'a page',
  links: [
    {
      // from "y" crawl
      title: 'a page',
      link: 'link'
    }
  ]
}]
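
Roughly what I have in mind, as a sketch (fetchPage, extractTitle, extractLinks, crawlX and crawlY below are placeholder names I made up, not calls from any particular crawler library):

async function fetchPage(url) {
  const res = await fetch(url); // global fetch, Node 18+
  return res.text();
}

// Placeholder parsers standing in for whatever the real request handlers do
function extractTitle(html) {
  const m = html.match(/<title>(.*?)<\/title>/i);
  return m ? m[1] : '';
}

function extractLinks(html) {
  return [...html.matchAll(/href="(https?:\/\/[^"]+)"/g)].map((m) => m[1]);
}

// "y" crawl: visits a link found by "x" and returns its data
async function crawlY(link) {
  const html = await fetchPage(link);
  return { title: extractTitle(html), link };
}

// "x" crawl: for each page, runs the "y" crawl on every link it finds
// and keeps the results nested under the same record
async function crawlX(url) {
  const html = await fetchPage(url);
  const links = extractLinks(html);
  return {
    title: extractTitle(html),
    links: await Promise.all(links.map(crawlY)),
  };
}

crawlX('https://example.com').then((record) => {
  const data = [record]; // same shape as the structure above
  console.log(JSON.stringify(data, null, 2));
});

This hand-rolled version keeps the data together, but I'd rather do it through the crawler itself if there's a supported way.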