Description
Overview
This cleanup automation workflow systematically flags outdated package records for deletion in two MySQL databases based on transport-received status. The orchestration pipeline targets package data older than one month with the status filter “TRANSPORTE-RECEBIDO”, offering a deterministic approach to data lifecycle management through scheduled and webhook triggers.
Key Benefits
- Automates flagging of stale package records for removal across multiple database environments daily.
- Implements a no-code integration with Telegram for real-time cleanup status notifications.
- Supports both scheduled and event-driven execution via Cron triggers and webhooks.
- Ensures consistent data hygiene by marking outdated entries for deletion without manual intervention.
Product Overview
This cleanup automation workflow initiates either through a scheduled Cron trigger set to 8:00 AM daily or via an HTTP webhook endpoint configured at the path “/limparPacotes”. Upon activation, it executes identical SQL queries on two distinct MySQL database connections labeled “PPM” and “OBJ”. Each query identifies package records where the “modulo” field equals “pacoteProduto”, the “situacao” field is “TRANSPORTE-RECEBIDO”, and the “data” timestamp is older than one month. These records are updated by setting the “modulo” field to “DELETE”, effectively flagging them for removal without immediate physical deletion. Following each database update, the workflow dispatches Telegram messages to a predefined chat to confirm the successful execution of each cleanup operation. Error handling defaults to the n8n platform’s built-in retry mechanisms, with no additional custom logic applied. Credentials for MySQL and Telegram nodes use secure stored connections, maintaining operational security. This automation pipeline thus provides a reliable and repeatable method for maintaining data relevancy and notifying stakeholders without manual database management.
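The flag-rather-than-delete pattern described above can be sketched with an in-memory SQLite table standing in for MySQL (a minimal illustration only: the table and column names follow the description, while the sample token values are invented):

```python
import sqlite3
from datetime import datetime, timedelta

# Hypothetical stand-in for the "i_objeto" table, using SQLite instead of
# MySQL purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE i_objeto (token TEXT, modulo TEXT, situacao TEXT, data TEXT)")

old = (datetime.now() - timedelta(days=60)).isoformat(" ")
new = datetime.now().isoformat(" ")
conn.executemany(
    "INSERT INTO i_objeto VALUES (?, ?, ?, ?)",
    [
        ("tok-1", "pacoteProduto", "TRANSPORTE-RECEBIDO", old),  # stale -> flagged
        ("tok-2", "pacoteProduto", "TRANSPORTE-RECEBIDO", new),  # recent -> kept
        ("tok-3", "pacoteProduto", "EM-TRANSITO", old),          # wrong status -> kept
    ],
)

# Equivalent of the workflow's UPDATE: flag, rather than physically delete,
# rows matching the module, status, and one-month age criteria.
conn.execute(
    """
    UPDATE i_objeto
       SET modulo = 'DELETE'
     WHERE modulo = 'pacoteProduto'
       AND situacao = 'TRANSPORTE-RECEBIDO'
       AND data < datetime('now', '-1 month')
    """
)

flagged = [r[0] for r in conn.execute("SELECT token FROM i_objeto WHERE modulo = 'DELETE'")]
print(flagged)
```

Only the stale record with the received-in-transport status ends up flagged; the recent record and the record with a different status are untouched, mirroring the workflow's targeted update.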
Features and Outcomes
Core Automation
The cleanup automation workflow accepts triggers from scheduled or HTTP event sources to initiate a database update process. It applies a SQL Common Table Expression (CTE) to select and flag records meeting precise age and status criteria in two databases, ensuring targeted data sanitization.
- Dual-database update executed in parallel for synchronized data hygiene.
- Single-pass evaluation of package records older than one month with status filtering.
- Deterministic flagging of records via a standardized SQL update query.
Integrations and Intake
This orchestration pipeline integrates with MySQL databases using credential-based authentication for secure query execution. It ingests triggers from a Cron scheduler and a webhook listener, requiring no payload beyond the invocation event itself.
- MySQL nodes connected to “PPM” and “OBJ” databases execute cleanup queries.
- Cron node triggers daily execution at 8:00 AM without payload requirements.
- Webhook node exposes an HTTP endpoint for on-demand workflow invocation.
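The webhook endpoint can be invoked with a plain HTTP request and no body. A minimal sketch using only the standard library (the base URL is a placeholder for your own n8n instance; the “/webhook/” prefix assumes n8n’s default production webhook routing):

```python
from urllib.request import Request

# Hypothetical n8n base URL; replace with your instance's address.
N8N_BASE = "https://n8n.example.com"

# The workflow's configured path is "limparPacotes"; the "/webhook/" prefix
# is n8n's default for production webhook URLs.
req = Request(f"{N8N_BASE}/webhook/limparPacotes", method="POST")

print(req.full_url, req.get_method())
# To actually fire the cleanup:
# from urllib.request import urlopen
# urlopen(req)
```

No payload is required, matching the workflow's trigger contract; the request itself is the invocation event.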
Outputs and Consumption
Post-cleanup, the workflow outputs status messages via Telegram to a fixed chat ID, providing real-time confirmation of operations. The workflow operates asynchronously, with messages sent after each database update completes.
- Telegram messages confirm cleanup completion separately for each database.
- Outputs contain fixed text notifications indicating database names and operation status.
- Asynchronous delivery model ensures non-blocking notification dispatch.
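Outside the workflow, an equivalent confirmation message can be assembled against the Telegram Bot API’s sendMessage method. The chat ID below comes from the workflow’s Telegram nodes; the bot token and message text are illustrative placeholders, not the workflow’s actual values:

```python
from urllib.parse import urlencode

# Hypothetical bot token; the chat ID matches the workflow's Telegram nodes.
BOT_TOKEN = "123456:ABC-EXAMPLE"
CHAT_ID = "-657820242"

def build_notification(db_label):
    """Build the URL and urlencoded body for a cleanup confirmation message."""
    url = f"https://api.telegram.org/bot{BOT_TOKEN}/sendMessage"
    body = urlencode({
        "chat_id": CHAT_ID,
        # Placeholder text; the workflow sends its own fixed message.
        "text": f"Old packages flagged for deletion in database {db_label}.",
    }).encode()
    return url, body

url, body = build_notification("PPM")
print(url)
# To send for real: urllib.request.urlopen(url, data=body)
```

One such message is dispatched per database (“PPM” and “OBJ”), which is why the workflow contains two separate Telegram nodes.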
Workflow — End-to-End Execution
Step 1: Trigger
The workflow initiates via two primary triggers: a Cron node scheduled daily at 8:00 AM and a webhook node listening on the path “/limparPacotes”. The manual trigger node is present but disabled, thus inactive in live operation.
Step 2: Processing
On trigger activation, the workflow executes two SQL queries on separate MySQL connections. These queries use a CTE to select tokens from the “i_objeto” table where package records are older than one month and marked as “TRANSPORTE-RECEBIDO”. The queries update the “modulo” field to “DELETE”. No additional schema validation or payload transformation occurs.
Step 3: Analysis
The SQL operation implements a deterministic filtering heuristic: only records exceeding the one-month age threshold and with the specified status are affected. This approach ensures precise targeting without unintended data modification.
Step 4: Delivery
Following query execution, the workflow sends Telegram notifications to a predetermined chat ID, with fixed text messages confirming cleanup for each database. These messages provide deterministic confirmation to stakeholders asynchronously.
Use Cases
Scenario 1
Data administrators need to remove outdated package records marked as received in transport to maintain database performance. This workflow automates the identification and flagging of such records older than one month, returning notification confirmations to reduce manual oversight.
Scenario 2
Operations teams require timely alerts after routine database maintenance to verify completion. The automation pipeline sends Telegram messages upon each cleanup action, ensuring deterministic status reporting without manual checking.
Scenario 3
Developers aim to integrate cleanup tasks into existing systems with minimal coding. This no-code integration leverages scheduled triggers and webhooks to invoke cleanup across multiple databases, returning operational feedback through messaging.
How to use
To deploy this cleanup automation workflow within n8n, import the workflow JSON and configure MySQL and Telegram credentials according to your environment. Enable the Cron node to schedule daily execution at 8:00 AM or invoke the webhook endpoint to run on demand. The workflow requires no payload input for triggers. Upon execution, expect Telegram messages confirming cleanup status for each database. Monitor workflow executions through n8n’s interface to ensure operational continuity.
Comparison — Manual Process vs. Automation Workflow
| Attribute | Manual/Alternative | This Workflow |
|---|---|---|
| Steps required | Multiple manual database queries and manual notifications | Single automated execution triggered by schedule or webhook |
| Consistency | Prone to human error and inconsistent scheduling | Deterministic SQL filtering with repeatable daily execution |
| Scalability | Limited by manual effort and resource availability | Scales across multiple databases with parallel execution |
| Maintenance | High ongoing effort to monitor and trigger cleanups | Minimal, relying on n8n’s platform and credential management |
Technical Specifications
| Environment | n8n Workflow Automation Platform |
|---|---|
| Tools / APIs | MySQL, Telegram Bot API |
| Execution Model | Event-driven with scheduled and webhook triggers |
| Input Formats | Trigger event (Cron schedule or HTTP webhook) |
| Output Formats | Telegram text messages |
| Data Handling | Updates the “modulo” flag in place; no workflow state or payload is persisted |
| Known Constraints | Relies on external MySQL database availability and Telegram API uptime |
| Credentials | MySQL connection credentials, Telegram API token |
Implementation Requirements
- Configured MySQL credentials with appropriate permissions for update queries on target databases.
- Telegram Bot API credentials linked to a bot authorized to send messages to the specified chat ID.
- Network access allowing n8n instance to communicate with MySQL servers and Telegram API endpoints.
Configuration & Validation
- Verify MySQL credentials and connectivity by testing query execution on both “PPM” and “OBJ” databases.
- Confirm Telegram bot credentials with a test message sent to the target chat ID.
- Trigger the workflow manually via webhook or enable the Cron node and monitor for expected Telegram notifications.
Data Provenance
- Trigger nodes: Cron (scheduled 8:00 AM daily), Webhook (path “/limparPacotes”).
- Database nodes: MySQL nodes “limpaPacoteCliente0” (PPM), “limparPacoteCliente1” (OBJ) executing SQL update queries.
- Notification nodes: Telegram and Telegram1 nodes sending status messages to chat ID “-657820242”.
FAQ
How is the cleanup automation workflow triggered?
The workflow can be triggered automatically via a Cron node at 8:00 AM daily or manually through an HTTP webhook endpoint configured at “/limparPacotes”. A manual trigger node is also configured but currently disabled.
Which tools or models does the orchestration pipeline use?
The pipeline integrates MySQL nodes executing SQL queries for data cleanup and Telegram nodes for sending status notifications. It does not use external AI models or heuristic engines beyond SQL predicates.
What does the response look like for client consumption?
Clients receive Telegram messages with fixed text confirming that package records have been marked for deletion in each database. These messages include database-specific identifiers to confirm execution.
Is any data persisted by the workflow?
No data beyond the update to the “modulo” field in the database is persisted by the workflow. Telegram messages are sent transiently without storing workflow state or input payloads.
How are errors handled in this integration flow?
Error handling relies on n8n’s default retry and error management. There is no custom retry or backoff logic implemented within this workflow.
Conclusion
This cleanup automation workflow provides a precise method for marking outdated package records as deleted in two MySQL databases, triggered either on a fixed daily schedule or via webhook calls. It delivers deterministic cleanup with immediate Telegram notifications confirming execution status. While the workflow depends on external database and Telegram service availability, it reduces manual maintenance and supports consistent data hygiene efforts. Its no-code integration approach facilitates straightforward deployment and operational monitoring within n8n’s platform.