I have a Playwright Actor that has 10 URLs added to its request queue before I kick it off with `.run()`. But the Actor doesn't finish all 10 URLs: it processes between 4 and 7 of them, and then the log for the run just shows the statistics message repeated every second.
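For context, my setup looks roughly like this (a simplified sketch; the real `requestHandler` logic and the `urls` array are omitted here):

```ts
import { Actor } from 'apify';
import { PlaywrightCrawler } from 'crawlee';

await Actor.init();

const crawler = new PlaywrightCrawler({
    requestHandler: async ({ request, page, log }) => {
        log.info(`Scraping ${request.url}`);
        // ... extract data from the page and push it to the dataset ...
    },
});

// All 10 URLs are enqueued before the crawler starts.
await crawler.addRequests(urls);
await crawler.run();

await Actor.exit();
```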
Note that this happens in my local runs of this Actor as well. The total number of URLs scraped varies from run to run: a minimum of 1 and a maximum of 7 out of the 10.
This is the message it shows on repeat, both locally and on the Apify platform:
2024-05-22T22:34:24.274Z INFO Statistics: PlaywrightCrawler request statistics: {"requestAvgFailedDurationMillis":null,"requestAvgFinishedDurationMillis":35781,"requestsFinishedPerMinute":2,"requestsFailedPerMinute":0,"requestTotalDurationMillis":143124,"requestsTotal":4,"crawlerRuntimeMillis":120866,"retryHistogram":[4]}
2024-05-22T22:34:24.301Z INFO PlaywrightCrawler:AutoscaledPool: state {"currentConcurrency":6,"desiredConcurrency":11,"systemStatus":{"isSystemIdle":true,"memInfo":{"isOverloaded":false,"limitRatio":0.2,"actualRatio":0},"eventLoopInfo":{"isOverloaded":false,"limitRatio":0.6,"actualRatio":0},"cpuInfo":{"isOverloaded":false,"limitRatio":0.4,"actualRatio":0},"clientInfo":{"isOverloaded":false,"limitRatio":0.3,"actualRatio":0}}}
Why would it stop pulling from the request queue? There are no errors in the Actor's log prior to this.
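In case it helps with diagnosis, this is how I could inspect the state of the default request queue after the run stalls (a minimal sketch; I haven't actually confirmed these counts yet):

```ts
import { RequestQueue } from 'crawlee';

// Open the default request queue the crawler is using.
const queue = await RequestQueue.open();

console.log('handled so far:', await queue.handledCount()); // requests marked as handled
console.log('isEmpty:', await queue.isEmpty());             // no pending requests left?
console.log('isFinished:', await queue.isFinished());       // pending AND in-progress both done?
```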