This Actor demonstrates the Apify metamorph operation by processing multiple URLs with configurable delays, then transforming into another Actor to complete the work in a single continuous run.
- Accepts multiple URLs in `requestListSources` format (compatible with Crawlee)
- Processes URLs sequentially with a configurable delay between each
- Stores processing metadata in the key-value store to demonstrate data persistence
- Metamorphs into `dz_omar/universal-downloader` with all URLs as a batch job
The metamorph operation allows one Actor to transform into another Actor mid-execution while preserving all storage data (Dataset, Key-Value Store, Request Queue). This happens seamlessly in a single run with one billing cycle.
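In the JavaScript SDK this is a single call. A minimal sketch, assuming the target Actor accepts a `urls` array (the actual input shape of `dz_omar/universal-downloader` may differ):

```javascript
import { Actor } from 'apify';

await Actor.init();

// All default storages (Dataset, Key-Value Store, Request Queue) written so far
// are carried over to the target Actor; the run ID and billing stay the same.
await Actor.metamorph('dz_omar/universal-downloader', {
    urls: ['https://httpbin.org/bytes/1024'], // assumed input shape of the target Actor
});
```

After the call, execution continues as the target Actor within the same run.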
Add delays between URL processing to avoid overwhelming target servers or to respect rate limits.
All data stored before the metamorph (processing logs, timestamps, metadata) remains accessible after the transformation.
The Actor accepts input in this format:
```json
{
    "input": [
        { "url": "https://httpbin.org/bytes/1024" },
        { "url": "https://httpbin.org/bytes/2048" },
        { "url": "https://httpbin.org/bytes/4096" }
    ],
    "delaySeconds": 3
}
```
- `input` (Array, required): List of URLs in `requestListSources` format
  - Compatible with Crawlee's RequestList sources
  - Supports both `{ "url": "..." }` objects and plain strings
- `delaySeconds` (Integer, optional): Fixed delay between processing each URL
  - Default: 2 seconds
  - Range: 0-60 seconds
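One way to normalize that input inside the Actor; this is a sketch, not the Actor's verbatim code, and the variable names are illustrative:

```javascript
import { Actor } from 'apify';

await Actor.init();

const { input: sources = [], delaySeconds = 2 } = await Actor.getInput() ?? {};

// Accept both plain strings and { url: "..." } objects, as requestListSources allows.
const requests = sources.map((source) =>
    typeof source === 'string' ? { url: source } : source,
);

// Clamp the delay to the documented 0-60 second range.
const delay = Math.min(Math.max(delaySeconds, 0), 60);
```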
- Phase 1 - Your Actor:
  - Processes each URL with delays
  - Stores metadata for each step
  - Logs progress and timing
- Phase 2 - Metamorph:
  - Transforms into `universal-downloader`
  - Passes all URLs as batch input
  - Continues in the same run ID
- Phase 3 - Target Actor:
  - Downloads all files
  - Stores results in dataset
  - Completes the workflow
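Continuing the normalization sketch above (reusing its `requests` and `delay` variables), Phase 1 plus the hand-off to Phase 2 might look roughly like this; the key names match the outputs listed below, while the downloader's input shape is again an assumption:

```javascript
for (let i = 0; i < requests.length; i++) {
    const { url } = requests[i];
    console.log(`Processing URL ${i + 1}/${requests.length}: ${url}`);

    // Persist per-URL metadata; it survives the metamorph.
    await Actor.setValue(`PROCESSING_${i}`, {
        url,
        index: i,
        processedAt: new Date().toISOString(),
    });

    const isLast = i === requests.length - 1;
    if (!isLast) {
        console.log(`Waiting ${delay} seconds before next URL...`);
        await new Promise((resolve) => setTimeout(resolve, delay * 1000));
    } else {
        console.log('This is the last URL - METAMORPHING into universal-downloader...');
        // Phase 2: hand the whole batch to the downloader in the same run.
        await Actor.metamorph('dz_omar/universal-downloader', {
            urls: requests.map((r) => r.url), // assumed input shape of the target Actor
        });
    }
}
```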
Console Logs:

```
Processing 3 URLs with 3 second delay between requests
Processing URL 1/3: https://httpbin.org/bytes/1024
Waiting 3 seconds before next URL...
Processing URL 2/3: https://httpbin.org/bytes/2048
Waiting 3 seconds before next URL...
Processing URL 3/3: https://httpbin.org/bytes/4096
This is the last URL - METAMORPHING into universal-downloader...
[Then universal-downloader logs continue...]
```
Key-Value Store:
- `BATCH_INFO`: Overall batch information
- `PROCESSING_0`, `PROCESSING_1`, etc.: Individual URL processing logs
Dataset:
- Download results from universal-downloader
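Once the run finishes, these records can be inspected programmatically with the JavaScript API client; a sketch with a placeholder run ID, using the key names listed above:

```javascript
import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: process.env.APIFY_TOKEN });

// Storage IDs come from the finished run's details; '<RUN_ID>' is a placeholder.
const { defaultKeyValueStoreId, defaultDatasetId } = await client.run('<RUN_ID>').get();

const batchInfo = await client.keyValueStore(defaultKeyValueStoreId).getRecord('BATCH_INFO');
console.log(batchInfo?.value);

const { items } = await client.dataset(defaultDatasetId).listItems();
console.log(`Downloader pushed ${items.length} results`);
```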
✅ Single Run Billing: Pay for one continuous run instead of multiple Actor calls
✅ Data Persistence: All storage survives the transformation
✅ Seamless Integration: No manual coordination between Actors
✅ Simplified Workflows: Chain complex operations easily
✅ Cost Effective: Reduce overhead of multiple Actor starts
- Batch File Processing: Download multiple files with rate limiting
- Workflow Orchestration: Chain specialized Actors together
- Complex Scraping: Analyze targets, then use appropriate scrapers
- Data Pipeline: Transform data through multiple processing stages
- Storage Persistence: All default storages (Dataset, Key-Value Store, Request Queue) are preserved
- Input Handling: The metamorphed Actor receives input via the `INPUT-METAMORPH-1` key (handled automatically by `Actor.getInput()`)
- Billing: Single run ID for the entire workflow
- Limitations: Runtime limits apply to the total metamorph chain
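On the target side no special handling is needed; a short sketch of how the forwarded input is read:

```javascript
// Inside the metamorphed (target) Actor: Actor.getInput() transparently
// resolves the INPUT-METAMORPH-1 record created by the metamorph.
import { Actor } from 'apify';

await Actor.init();
const input = await Actor.getInput(); // the object passed to Actor.metamorph()
console.log('Received batch input:', input);
await Actor.exit();
```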
- Crawlee RequestList: Compatible input format
- Apify SDK: JavaScript SDK for Actor development
- Actor Development: Complete development guide
The Actor includes error handling for:
- Invalid URL formats in input
- Missing or malformed input data
- Metamorph operation failures
- Delay parameter validation
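A sketch of what such checks can look like (not the Actor's exact validation code):

```javascript
import { Actor } from 'apify';

await Actor.init();

const { input: sources, delaySeconds = 2 } = await Actor.getInput() ?? {};

// Missing or malformed input data
if (!Array.isArray(sources) || sources.length === 0) {
    throw new Error('Input must contain a non-empty "input" array of URLs.');
}

// Invalid URL formats in input
for (const source of sources) {
    const url = typeof source === 'string' ? source : source?.url;
    try {
        new URL(url);
    } catch {
        throw new Error(`Invalid URL in input: ${url}`);
    }
}

// Delay parameter validation
if (typeof delaySeconds !== 'number' || delaySeconds < 0 || delaySeconds > 60) {
    throw new Error('delaySeconds must be a number between 0 and 60.');
}
```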
- Push to Apify platform: `apify push`
- Configure input in Console or via API
- Run and monitor the metamorph workflow
- Check both phases in the same run logs
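To start a run via the API instead of the Console, the JavaScript API client can be used; the Actor ID below is a placeholder for wherever you deployed this Actor:

```javascript
import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: process.env.APIFY_TOKEN });

// '<your-username>/<your-actor-name>' is a placeholder, not a published Actor ID.
const run = await client.actor('<your-username>/<your-actor-name>').call({
    input: [
        { url: 'https://httpbin.org/bytes/1024' },
        { url: 'https://httpbin.org/bytes/2048' },
    ],
    delaySeconds: 3,
});

// Both phases of the metamorph workflow appear in the same run's log.
const log = await client.log(run.id).get();
console.log(log);
```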
Note: This is an educational demonstration of the metamorph operation. In production, consider additional error handling, input validation, and monitoring based on your specific needs.
- 📧 Email: [email protected]
- 🐙 GitHub: DZ-ABDLHAKIM
- 🐦 Twitter: @DZ_45Omar
- 🔧 Apify: dz_omar