Description
Overview
This automation workflow synchronizes starred articles from a Tiny Tiny RSS (TTRSS) instance to a Wallabag reading list. The pipeline regularly polls for newly starred items, enabling efficient content curation for users managing RSS feeds and read-later archives. The process initiates via a manual trigger or a scheduled Cron node that activates every 10 minutes.
Key Benefits
- Automates transfer of starred RSS articles to Wallabag without manual intervention.
- Uses a no-code integration pipeline combining TTRSS and Wallabag APIs via OAuth2 authentication.
- Maintains state with static data to avoid duplicate entries and redundant processing.
- Offers both manual and scheduled (Cron) triggers for flexible on-demand or periodic execution.
Product Overview
This workflow begins with either a manual trigger or a scheduled Cron node set to run every 10 minutes. It authenticates to the Tiny Tiny RSS API using user credentials and retrieves a session ID, which is used to request the list of starred articles identified by feed ID -1. Concurrently, it authenticates with Wallabag via OAuth2, acquiring an access token for authorized API calls.

The starred articles are merged with the Wallabag token, then processed in a Function node that compares article IDs against the last saved ID stored in global static workflow data. Only new articles are selected for insertion; if none exist, the workflow routes to a no-operation node and skips further processing.

When new articles are detected, the workflow sends POST requests to Wallabag’s `/api/entries.json` endpoint, submitting URLs with bearer-token authorization. Error handling defaults to platform behavior; no custom retries or backoff are configured. The workflow processes data transiently, with no local persistence beyond the static `lastStarRssId` state.
Features and Outcomes
Core Automation
This automation workflow processes RSS feed data from TTRSS, filtering deterministically on the last processed article ID inside a Function node before branching in an IF node. The Function node evaluates new items in a single-pass loop, preparing them for downstream delivery.
- Single-pass evaluation of starred articles with stateful ID comparison.
- Deterministic branching via conditional node to handle presence or absence of new articles.
- Efficient filtering to prevent duplicate article transfers.
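The single-pass, stateful filter described above can be sketched as a plain function. This is a hedged illustration: in the actual workflow the logic lives in an n8n Function node, and `lastStarRssId` comes from global static workflow data; the function name and newest-first ordering are assumptions.

```javascript
// Sketch of the Function node's filter. Assumes the TTRSS API returns
// starred articles newest-first, so iteration can stop at the first
// article whose ID matches the last one already processed.
function filterNewArticles(articles, lastStarRssId) {
  const fresh = [];
  for (const article of articles) {
    if (article.id === lastStarRssId) break; // everything after is already synced
    fresh.push(article);
  }
  return fresh;
}
```

On the first run, when `lastStarRssId` is unset, every starred article passes through; afterwards only articles newer than the stored ID are forwarded.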
Integrations and Intake
The orchestration pipeline integrates directly with the TTRSS API for feed retrieval and Wallabag API for content saving, using HTTP POST requests with credential-based authentication. The TTRSS login requires user and password fields, while Wallabag authentication uses OAuth2 client credentials.
- TTRSS API accessed with user credentials over HTTP POST for session authentication.
- Wallabag OAuth2 token endpoint authenticates via client ID and secret.
- Feed data intake includes JSON arrays of starred articles with IDs, URLs, and tags.
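As a hedged sketch, the request bodies for the three authentication and intake calls might look like the following. Field names follow the TTRSS JSON API and the OAuth2 password grant that Wallabag uses; hosts, credentials, and function names are illustrative placeholders, not part of the workflow itself.

```javascript
// TTRSS login: POSTed as JSON to the instance's /api/ endpoint.
function ttrssLoginBody(user, password) {
  return { op: "login", user, password };
}

// Starred headlines: feed_id -1 is TTRSS's built-in "Starred articles" feed.
function ttrssStarredBody(sessionId) {
  return { op: "getHeadlines", sid: sessionId, feed_id: -1 };
}

// Wallabag token request: OAuth2 password grant with client credentials.
function wallabagTokenBody(clientId, clientSecret, username, password) {
  return {
    grant_type: "password",
    client_id: clientId,
    client_secret: clientSecret,
    username,
    password,
  };
}
```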
Outputs and Consumption
The workflow outputs new starred articles as POST requests to Wallabag’s entries API. Each request carries the article URL in a JSON body, authorized with a bearer token. If no new articles exist, no output is generated beyond the no-operation path.
- Wallabag entry creation via JSON body containing article URLs.
- Authorization header uses bearer token from OAuth2 authentication.
- Output is conditional and skipped when no new starred articles are found.
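A hedged sketch of one delivery request follows. The `/api/entries.json` endpoint and bearer-token header come from the workflow description; the host value and helper name are placeholders for illustration.

```javascript
// Build the HTTP request the workflow sends for each new article.
function wallabagEntryRequest(host, accessToken, articleUrl) {
  return {
    method: "POST",
    url: `${host}/api/entries.json`,
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ url: articleUrl }),
  };
}
```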
Workflow — End-to-End Execution
Step 1: Trigger
The workflow starts either manually through a manual trigger node when executed on demand or automatically via a Cron node configured to trigger every 10 minutes. This scheduling enables regular synchronization intervals without manual input.
Step 2: Processing
After triggering, the workflow authenticates to TTRSS using HTTP POST with user credentials, receiving a session ID. It then requests starred article headlines with this session ID. The function node performs basic presence checks and iterates over the fetched article list, comparing each article’s ID against the stored last processed ID to identify new items.
Step 3: Analysis
The function node implements deterministic logic to filter only new starred articles by breaking the iteration once the last saved ID is encountered. The IF node evaluates whether the filtered list contains valid article IDs (not “NaN”). This conditional branching ensures only new articles proceed to the next step.
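The IF node's guard can be illustrated as below. This is a sketch only: the document states the check is that the article ID is not “NaN”, and the exact expression inside the node may differ.

```javascript
// Proceed to delivery only if the filtered item carries a usable
// numeric article ID (the string "NaN" marks an empty result).
function hasNewArticle(item) {
  return item != null && String(item.id) !== "NaN" && !Number.isNaN(Number(item.id));
}
```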
Step 4: Delivery
If new articles are present, the workflow sends HTTP POST requests with URLs to Wallabag’s API endpoint. Each request includes an Authorization header with the OAuth2 bearer token. If no new articles exist, the workflow terminates via a no-operation node, preventing unnecessary API calls.
Use Cases
Scenario 1
A user manually collects starred RSS articles and wants to archive them automatically in Wallabag. This workflow automates that transfer, ensuring articles are synced with minimal delay, resulting in an up-to-date read-later list without manual duplication.
Scenario 2
Organizations using TTRSS for internal news tracking need streamlined archiving of relevant starred content. The workflow runs on a schedule, systematically exporting new starred articles to Wallabag, maintaining consistent record-keeping and access across teams.
Scenario 3
RSS feed curators require an automated process to avoid missing newly starred articles. This orchestration pipeline detects additions by comparing article IDs, then forwards them to Wallabag, providing a deterministic outcome of synchronized content after each execution cycle.
How to use
To deploy this automation workflow, import it into your n8n environment and configure the HTTP Request nodes with valid TTRSS and Wallabag host URLs and credentials. Set the client ID, client secret, username, and password for OAuth2 in the Wallabag authentication node. The Cron node controls periodic execution, and the manual trigger remains available for on-demand runs. Results appear as newly added entries in Wallabag, reflecting starred articles from TTRSS. The last processed article ID, tracked in static workflow data, keeps runs idempotent and prevents duplicates.
Comparison — Manual Process vs. Automation Workflow
| Attribute | Manual/Alternative | This Workflow |
|---|---|---|
| Steps required | Multiple manual login, article selection, and copy-paste operations. | One automated sequence integrating authentication, data retrieval, and API submission. |
| Consistency | Prone to human error and missed articles during manual transfer. | Deterministic filtering ensures no duplicates and complete synchronization. |
| Scalability | Limited by manual effort and time constraints. | Scheduled automation scales with feed size and frequency without additional labor. |
| Maintenance | Requires ongoing user attention and manual updates. | Low maintenance with credential updates only; relies on API stability. |
Technical Specifications
| Environment | n8n automation platform with HTTP request capabilities |
|---|---|
| Tools / APIs | Tiny Tiny RSS API, Wallabag API with OAuth2 authentication |
| Execution Model | Event-driven with manual and scheduled (Cron) triggers |
| Input Formats | JSON responses from TTRSS API containing article metadata |
| Output Formats | HTTP POST JSON payloads with URLs to Wallabag API |
| Data Handling | Transient processing with global static data for state tracking |
| Known Constraints | Relies on availability of TTRSS and Wallabag APIs and valid credentials |
| Credentials | TTRSS user/password credentials (API login), OAuth2 client and user credentials for Wallabag |
Implementation Requirements
- Valid user credentials for Tiny Tiny RSS API access
- OAuth2 client ID, client secret, username, and password for Wallabag authentication
- Network access to TTRSS and Wallabag hosts with API endpoints reachable
Configuration & Validation
- Confirm TTRSS login credentials are correct and session ID is successfully retrieved.
- Verify Wallabag OAuth2 authentication returns a valid access token for API calls.
- Run the workflow manually to ensure starred articles are fetched and new URLs are posted to Wallabag.
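The second check above can be mirrored with a small validation helper, sketched here. The `access_token` field name follows standard OAuth2 responses; the helper itself is illustrative and not part of the workflow.

```javascript
// Confirm the Wallabag token response is usable before proceeding.
function extractAccessToken(tokenResponse) {
  if (!tokenResponse || typeof tokenResponse.access_token !== "string") {
    throw new Error("Wallabag OAuth2 did not return an access token");
  }
  return tokenResponse.access_token;
}
```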
Data Provenance
- Trigger nodes: Manual trigger and Cron node for periodic execution.
- Authentication nodes: “Auth TTRss” for TTRSS login; “Auth Wallabag” for OAuth2 token retrieval.
- Processing nodes: “Function” node manages article filtering and state with static workflow data.
FAQ
How is the automation workflow triggered?
The workflow supports both manual execution via a manual trigger node and automated runs every 10 minutes through a Cron trigger node.
Which tools or models does the orchestration pipeline use?
The pipeline integrates the Tiny Tiny RSS API for feed retrieval and the Wallabag API for saving articles, using HTTP request nodes and OAuth2 authentication.
What does the response look like for client consumption?
The workflow sends POST requests to Wallabag with JSON bodies containing article URLs authorized by bearer tokens; no additional response transformation is applied.
Is any data persisted by the workflow?
Only the last processed article ID is stored in the workflow’s global static data to prevent duplicate processing; no external data persistence is implemented.
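The persistence pattern can be sketched as follows. In the real workflow the state object comes from n8n's `getWorkflowStaticData('global')` helper inside the Function node; here a plain object stands in, and newest-first ordering of the article list is an assumption.

```javascript
// Record the newest processed article ID so the next run can skip it
// and everything older. staticData stands in for n8n's global static data.
function updateLastStarRssId(staticData, articles) {
  if (articles.length > 0) {
    staticData.lastStarRssId = articles[0].id; // newest article's ID
  }
  return staticData;
}
```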
How are errors handled in this integration flow?
Error handling relies on n8n platform defaults; no explicit retry, backoff, or idempotency mechanisms are configured within the workflow.
Conclusion
This automation workflow provides a dependable method for synchronizing starred articles from Tiny Tiny RSS to Wallabag, ensuring consistent content archiving with minimal manual effort. By leveraging OAuth2 authentication and stateful filtering, it systematically avoids duplicate entries and supports both scheduled and manual operation. A key constraint is its reliance on the availability and responsiveness of external TTRSS and Wallabag APIs, which directly affect workflow execution. Overall, it offers a precise, no-code integration pipeline suitable for users seeking automated feed curation and read-later list management.