Apify and Crawlee Official Forum

Diego A.
Dev team, hope you're doing well! I'm running the "Website Content Crawler" Actor by Apify and I've run into an issue with the automation I'm building. Here's what's happening:

When a lead enters my Airtable database, I want the Website Content Crawler to scrape their website and produce LLM-ready data.
A manual trigger in Airtable kicks off the run on that website.
There's a 120-second timeout whether the scrape runs synchronously or not.
Because of this, websites above a certain page count can't be scraped in one run. For example, the site I'm testing with has 200+ pages.
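For context, here's roughly what I think that first step boils down to at the API level. This is only a sketch: the token, actor ID path form, start URL, and input shape are placeholders and assumptions on my side, not my actual setup.

```python
import os
import requests

APIFY_TOKEN = os.environ["APIFY_TOKEN"]      # placeholder: my Apify API token
ACTOR_ID = "apify~website-content-crawler"   # Website Content Crawler, API path form

# Start the run asynchronously; this call returns immediately, so the crawl
# itself isn't cut off by the 120-second synchronous timeout.
resp = requests.post(
    f"https://api.apify.com/v2/acts/{ACTOR_ID}/runs",
    params={"token": APIFY_TOKEN},
    # Simplified input; the real start URL would come from the Airtable lead.
    json={"startUrls": [{"url": "https://example.com"}]},
    timeout=30,
)
resp.raise_for_status()
run = resp.json()["data"]

# These are the two IDs I'd write back to Airtable for the second workflow.
run_id = run["id"]
dataset_id = run["defaultDatasetId"]
print(run_id, dataset_id)
```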

Instead of one workflow within Make.com, I'll have to make two workflows:

1) The Airtable button starts the first workflow, which runs the "Run an Actor" Apify module and stores the dataset ID and run ID back in Airtable.
2) A second workflow, on a manual trigger, retrieves the results via "Get Dataset Items" using the run ID or the dataset ID (I'm not clear which).
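In API terms, I think the second workflow only needs one of those IDs: if just the run ID is stored, the run record can be used to look up its default dataset. Another sketch, with the token and run ID as placeholders:

```python
import os
import requests

APIFY_TOKEN = os.environ["APIFY_TOKEN"]   # placeholder: my Apify API token
RUN_ID = "<run id stored in Airtable>"    # placeholder

# The run record points at its default dataset, so the run ID alone is enough.
run_resp = requests.get(
    f"https://api.apify.com/v2/actor-runs/{RUN_ID}",
    params={"token": APIFY_TOKEN},
    timeout=30,
)
run_resp.raise_for_status()
dataset_id = run_resp.json()["data"]["defaultDatasetId"]

# Fetch the crawled pages; as far as I understand, this is what the
# "Get Dataset Items" module returns.
items_resp = requests.get(
    f"https://api.apify.com/v2/datasets/{dataset_id}/items",
    params={"token": APIFY_TOKEN, "format": "json", "clean": "true"},
    timeout=60,
)
items_resp.raise_for_status()
items = items_resp.json()
print(f"{len(items)} pages scraped")
```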

Question: Is there a way to trigger the "Get Dataset Items" Apify module within Make when the run completes? I'm not sure whether that's a native feature. I also took a screenshot of the API for that module - wondering whether that could do it, and how?
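From reading the Apify API docs, my best guess is that a webhook on ACTOR.RUN.SUCCEEDED pointed at a Make custom webhook is the missing piece: Apify would call Make when the run finishes, and that scenario could then run "Get Dataset Items". A rough sketch of what I'd try (the Make webhook URL and actor ID are placeholders, and I haven't verified this end to end):

```python
import os
import requests

APIFY_TOKEN = os.environ["APIFY_TOKEN"]            # placeholder: my Apify API token
ACTOR_ID = "<Website Content Crawler actor ID>"    # placeholder
MAKE_WEBHOOK_URL = "https://hook.make.com/<id>"    # placeholder: Make custom-webhook URL

# Register a webhook so Apify notifies Make whenever a run of this Actor
# finishes successfully; the payload includes the run ID and default
# dataset ID, which the second scenario can pass to "Get Dataset Items".
resp = requests.post(
    "https://api.apify.com/v2/webhooks",
    params={"token": APIFY_TOKEN},
    json={
        "eventTypes": ["ACTOR.RUN.SUCCEEDED"],
        "condition": {"actorId": ACTOR_ID},
        "requestUrl": MAKE_WEBHOOK_URL,
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["data"]["id"])  # ID of the newly created webhook
```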

Thanks for your attention!