Apify and Crawlee Official Forum

How do I add multiple crawlers to the same desktop program? Thanks

Hello everyone, I am using Crawlee and Electron to develop a desktop program, and I have run into two problems:

  1. I want to start multiple crawler tasks at the same time in my software, and each crawler task should have independent parameter settings.
I don't see anything about this in the documentation. How should I implement this requirement?

  2. I want to add pause and restart features to the crawler tasks, but I haven't seen a related function in the documentation.
Can someone give me some tips? I would be very grateful.
Hi Hamza, thanks for your help. I am using Crawlee but not Apify in my project, so I don't know whether the methods you mentioned work in my case.

https://docs.apify.com/api/v2#tag/Actor-tasksRun-collection/operation/actorTask_runs_post
https://docs.apify.com/api/client/js/reference/class/RunClient#abort
https://docs.apify.com/api/client/js/reference/class/RunClient#resurrect
Hey,

My bad for not understanding the question.

1- You can do it by creating named routes: https://crawlee.dev/docs/introduction/refactoring#routing, and passing the parameter settings to each route label using the userData object: https://crawlee.dev/api/core/class/Request#userData (see the sketch below).
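A minimal sketch of that first option, using one crawler with labeled routes (the URLs, labels, and userData fields below are made-up examples, not from the thread):

```ts
import { CheerioCrawler, createCheerioRouter } from 'crawlee';

const router = createCheerioRouter();

// Each "task" becomes a labeled route; its settings travel with the request in userData.
router.addHandler('TASK_A', async ({ request, log }) => {
    const { maxPrice } = request.userData;
    log.info(`Task A crawling ${request.url} with maxPrice=${maxPrice}`);
});

router.addHandler('TASK_B', async ({ request, log }) => {
    const { keyword } = request.userData;
    log.info(`Task B crawling ${request.url} for keyword "${keyword}"`);
});

const crawler = new CheerioCrawler({ requestHandler: router });

// One crawler, several logical tasks, each with its own parameters.
await crawler.run([
    { url: 'https://example.com/a', label: 'TASK_A', userData: { maxPrice: 100 } },
    { url: 'https://example.com/b', label: 'TASK_B', userData: { keyword: 'electron' } },
]);
```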

The second option is that you technically can create multiple crawlers, but the persistence will probably get messed up, since it is indexed by the order in which each crawler started (e.g. SDK_CRAWLER_STATISTICS_0), and you would need to either not use a queue or use named queues (see the sketch below).
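A sketch of that second option with named queues, so the two crawlers do not share request storage (queue names and URLs are illustrative):

```ts
import { CheerioCrawler, RequestQueue } from 'crawlee';

// Named queues keep each crawler's requests separate.
const queueA = await RequestQueue.open('task-a');
const queueB = await RequestQueue.open('task-b');

await queueA.addRequest({ url: 'https://example.com/a' });
await queueB.addRequest({ url: 'https://example.com/b' });

const crawlerA = new CheerioCrawler({
    requestQueue: queueA,
    requestHandler: async ({ request, log }) => log.info(`A: ${request.url}`),
});

const crawlerB = new CheerioCrawler({
    requestQueue: queueB,
    requestHandler: async ({ request, log }) => log.info(`B: ${request.url}`),
});

// Run both at once; note the caveat above about persisted statistics
// being indexed by start order (SDK_CRAWLER_STATISTICS_0, _1, ...).
await Promise.all([crawlerA.run(), crawlerB.run()]);
```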

2- You can use the pause and resume methods as mentioned above.
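Since the question is about plain Crawlee rather than the Apify platform, one possible in-process approach (an assumption, not something confirmed in this thread) is pausing and resuming the crawler's AutoscaledPool once the run has started:

```ts
import { CheerioCrawler } from 'crawlee';

const crawler = new CheerioCrawler({
    requestHandler: async ({ request, log }) => log.info(`Crawling ${request.url}`),
});

// Start the run without awaiting it, so it can be controlled from elsewhere
// (e.g. an IPC handler in the Electron main process).
const runPromise = crawler.run(['https://example.com']);

// Some time later: stop picking up new requests.
// autoscaledPool is only set while the crawler is running, hence the optional chaining.
await crawler.autoscaledPool?.pause();

// ...and later still, continue processing where it left off.
crawler.autoscaledPool?.resume();

await runPromise;
```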