Apify Discord Mirror

Nisthar
Offline, last seen 5 months ago
Joined August 30, 2024
So I have this code:

JavaScript
import { BasicCrawler } from 'crawlee';
import { CookieJar } from 'tough-cookie';
import { destr } from 'destr';

const cookieJar = new CookieJar();

export const basicCrawler = new BasicCrawler({
    async requestHandler({ sendRequest, request, log }) {
        try {
            const res = await sendRequest({
                url: request.url,
                method: 'GET',
                cookieJar,
            });
            const json = destr(res.body);
            const urls = json.map((v) => v.url);
            await playCrawler.run(urls);
        } catch (error) {
            log.error(error);
        }
    },
});

//code for playwright crawler here

I start the crawler by calling `basicCrawler.run(['url']);`.
The problem is that it seems to call basicCrawler again for the URLs I pass to playCrawler. How is that possible?
13 comments
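A likely cause: when neither crawler is given its own request queue, both open Crawlee's default RequestQueue, so the URLs enqueued by `playCrawler.run(urls)` can also be consumed by basicCrawler. The sketch below is a hypothetical in-memory model of that sharing, not Crawlee's API; it only illustrates why one crawler ends up handling the other's requests.

```javascript
// Hypothetical model: two "crawlers" feeding the SAME queue.
// This is NOT Crawlee code; it mimics what happens when both
// crawlers fall back to the shared default RequestQueue.
class SharedQueue {
    constructor() { this.requests = []; }
    add(url, owner) { this.requests.push({ url, owner }); }
    next() { return this.requests.shift(); }
}

const queue = new SharedQueue();

// basicCrawler enqueues its start URL...
queue.add('https://example.com/api', 'basicCrawler');
// ...and the URLs meant for playCrawler land in the same queue.
queue.add('https://example.com/page1', 'playCrawler');

// basicCrawler drains the queue and handles BOTH requests.
const handledByBasic = [];
for (let req = queue.next(); req; req = queue.next()) {
    handledByBasic.push(req.url);
}
console.log(handledByBasic.length); // → 2
```

In actual Crawlee, the usual fix is to give each crawler its own named queue, e.g. `const queue = await RequestQueue.open('playwright-queue');` passed as the `requestQueue` option when constructing the Playwright crawler, so the two crawlers no longer share storage.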
I don't really understand datasets. I want to store an array of objects in a single JSON file,
so I can connect that JSON file to table APIs or convert it to CSV.
4 comments
The scraper finishes the crawl after only a few requests every time. I have 99 URLs added to the queue.

This is my config:
9 comments
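The config itself isn't shown above, but a common cause of a crawl stopping after a handful of requests is a low `maxRequestsPerCrawl`. A hedged sketch of the options worth checking, assuming a Crawlee crawler (the handler body is illustrative):

```javascript
const crawler = new PlaywrightCrawler({
    // If this is set below 99, the crawler stops early by design.
    maxRequestsPerCrawl: 1000,
    // Requests that keep failing are retried and then dropped;
    // check the logs for retry/failure messages too.
    maxRequestRetries: 3,
    async requestHandler({ request, log }) {
        log.info(`Processing ${request.url}`);
    },
});
```

Duplicate URLs are also silently deduplicated by the request queue, so 99 added URLs can legitimately produce fewer processed requests.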