How to retry only failed requests after the crawler has finished?
At a glance
The community member finished a crawl of roughly 1.7 million requests, of which around 100,000 failed, and asked whether there is a way to retry just the failed requests. The comments note that this is not currently supported out of the box. One community member recommends persisting the failed requests to a dataset or key-value store so they can be re-enqueued later. Another points out that because the crawler consists of multiple route handlers, retrying them would effectively require writing a new scraper. A third suggests using Crawlee's `failedRequestHandler` option to handle all failed requests in one place.
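Crawlee's `failedRequestHandler` fires only after a request has exhausted all of its retries, so it can be combined with the dataset idea from the comments: persist each permanently failed request during the first run, then feed that dataset back into a second run. A minimal sketch, assuming a `CheerioCrawler` setup; the dataset name, start URLs, and field names are placeholders, not from the original thread:

```typescript
import { CheerioCrawler, Dataset } from 'crawlee';

const startUrls = ['https://example.com']; // placeholder start URLs

const crawler = new CheerioCrawler({
    requestHandler: async ({ request, $ }) => {
        // ... normal route handling for the ~1.7M requests ...
    },
    // Called once a request has used up all of its retries.
    // Record just enough to reconstruct the request later.
    failedRequestHandler: async ({ request }) => {
        const failed = await Dataset.open('failed-requests');
        await failed.pushData({ url: request.url, label: request.label });
    },
});

await crawler.run(startUrls);

// Second run: re-enqueue only the requests recorded as failed.
// Keeping the label lets the existing route handlers pick them up again.
const failed = await Dataset.open('failed-requests');
const { items } = await failed.getData();
await crawler.run(
    items.map((item) => ({
        url: item.url as string,
        label: item.label as string | undefined,
    })),
);
```

Because the `label` is stored alongside the URL, the retried requests are routed to the same handlers as in the original run, avoiding the need to write a new scraper.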