

Overview

This cleanup automation workflow systematically flags outdated package records for deletion in two MySQL databases based on transport-received status. The orchestration pipeline targets package data older than one month matching the status filter “TRANSPORTE-RECEBIDO”, offering a deterministic approach to data lifecycle management through scheduled and webhook triggers.

Key Benefits

  • Automates removal of stale package records across multiple database environments daily.
  • Implements a no-code integration with Telegram for real-time cleanup status notifications.
  • Supports both scheduled and event-driven execution via Cron triggers and webhooks.
  • Ensures consistent data hygiene by marking outdated entries for deletion without manual intervention.

Product Overview

This cleanup automation workflow initiates either through a scheduled Cron trigger set to 8:00 AM daily or via an HTTP webhook endpoint configured at the path “/limparPacotes”. Upon activation, it executes identical SQL queries on two distinct MySQL database connections labeled “PPM” and “OBJ”. Each query identifies package records where the “modulo” field equals “pacoteProduto”, the “situacao” field is “TRANSPORTE-RECEBIDO”, and the “data” timestamp is older than one month. These records are updated by setting the “modulo” field to “DELETE”, effectively flagging them for removal without immediate physical deletion.

Following each database update, the workflow dispatches Telegram messages to a predefined chat to confirm the successful execution of each cleanup operation. Error handling defaults to the n8n platform’s built-in retry mechanisms, with no additional custom logic applied. Credentials for the MySQL and Telegram nodes use secure stored connections, maintaining operational security.

This automation pipeline thus provides a reliable and repeatable method for maintaining data relevancy and notifying stakeholders without manual database management.

Features and Outcomes

Core Automation

The cleanup automation workflow accepts triggers from scheduled or HTTP event sources to initiate a database update process. It applies a SQL Common Table Expression (CTE) to select and flag records meeting precise age and status criteria in two databases, ensuring targeted data sanitation.

  • Dual-database update executed in parallel for synchronized data hygiene.
  • Single-pass evaluation of package records older than one month with status filtering.
  • Deterministic flagging of records via a standardized SQL update query.

Integrations and Intake

This orchestration pipeline integrates with MySQL databases using credential-based authentication for secure query execution. It ingests triggers from a Cron scheduler and a webhook listener, requiring no payload beyond the invocation event itself.

  • MySQL nodes connected to “PPM” and “OBJ” databases execute cleanup queries.
  • Cron node triggers daily execution at 8 AM without payload requirements.
  • Webhook node exposes an HTTP endpoint for on-demand workflow invocation.

Outputs and Consumption

Post-cleanup, the workflow outputs status messages via Telegram to a fixed chat ID, providing real-time confirmation of operations. The workflow operates asynchronously, with messages sent after each database update completes.

  • Telegram messages confirm cleanup completion separately for each database.
  • Outputs contain fixed text notifications indicating database names and operation status.
  • Asynchronous delivery model ensures non-blocking notification dispatch.

Workflow — End-to-End Execution

Step 1: Trigger

The workflow initiates via two primary triggers: a Cron node scheduled daily at 8:00 AM and a webhook node listening on the path “/limparPacotes”. The manual trigger node is present but disabled, thus inactive in live operation.
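For on-demand runs, the webhook trigger can be invoked with a plain HTTP request. As a minimal sketch (the base URL below is a hypothetical placeholder; only the “/limparPacotes” path comes from the workflow description), this builds the request without sending it:

```python
from urllib.parse import urljoin
from urllib.request import Request

# Hypothetical base URL of your n8n instance -- replace with your own.
N8N_BASE_URL = "https://n8n.example.com/webhook/"

# The workflow's webhook listens on the "/limparPacotes" path; the full
# URL depends on how your n8n instance exposes webhooks.
url = urljoin(N8N_BASE_URL, "limparPacotes")

# Build (but do not send) a POST request; the trigger needs no payload.
req = Request(url, method="POST")
print(req.full_url)  # https://n8n.example.com/webhook/limparPacotes
```

Sending this request (for example with `urllib.request.urlopen(req)`) would start the same cleanup pass that the daily Cron trigger runs.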

Step 2: Processing

On trigger activation, the workflow executes two SQL queries on separate MySQL connections. These queries use a CTE to select records from the “i_objeto” table where packages are older than one month and marked as “TRANSPORTE-RECEBIDO”. The queries update the “modulo” field to “DELETE”. No additional schema validation or payload transformation occurs.
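As a sketch, the flagging logic can be simulated with an in-memory SQLite database. The production workflow runs against MySQL and, per the description, uses a CTE; this simplified version uses a direct UPDATE. The table and column names (“i_objeto”, “modulo”, “situacao”, “data”) are taken from this description, and the one-month interval is approximated as 30 days:

```python
import sqlite3
from datetime import datetime, timedelta

# In-memory stand-in for the MySQL "i_objeto" table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE i_objeto (modulo TEXT, situacao TEXT, data TEXT)")

now = datetime.now()
rows = [
    # Old, received record -> should be flagged.
    ("pacoteProduto", "TRANSPORTE-RECEBIDO", (now - timedelta(days=45)).isoformat()),
    # Recent record -> untouched.
    ("pacoteProduto", "TRANSPORTE-RECEBIDO", (now - timedelta(days=5)).isoformat()),
    # Old record with a different status -> untouched.
    ("pacoteProduto", "EM-TRANSITO", (now - timedelta(days=45)).isoformat()),
]
conn.executemany("INSERT INTO i_objeto VALUES (?, ?, ?)", rows)

# Roughly MySQL's NOW() - INTERVAL 1 MONTH.
cutoff = (now - timedelta(days=30)).isoformat()
conn.execute(
    """UPDATE i_objeto SET modulo = 'DELETE'
       WHERE modulo = 'pacoteProduto'
         AND situacao = 'TRANSPORTE-RECEBIDO'
         AND data < ?""",
    (cutoff,),
)

flagged = conn.execute(
    "SELECT COUNT(*) FROM i_objeto WHERE modulo = 'DELETE'"
).fetchone()[0]
print(flagged)  # 1
```

Only the record that is both old enough and in the received status is flagged, mirroring the soft-delete behavior described above.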

Step 3: Analysis

The SQL operation implements a deterministic filter: only records exceeding the one-month age threshold and carrying the specified status are affected. This ensures precise targeting without unintended data modification.

Step 4: Delivery

Following query execution, the workflow sends Telegram notifications to a predetermined chat ID, with fixed text messages confirming cleanup for each database. These messages provide deterministic confirmation to stakeholders asynchronously.
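The Telegram nodes ultimately call the Bot API’s standard `sendMessage` method with a chat ID and text. As a sketch (the bot token below is a placeholder; the chat ID and message text are illustrative), this builds the request URL and payload without sending anything:

```python
from urllib.parse import urlencode

# Placeholder token -- a real bot token comes from Telegram's BotFather.
BOT_TOKEN = "123456:ABC-placeholder"
CHAT_ID = "-657820242"  # chat ID used by the workflow's Telegram nodes

# sendMessage is the standard Bot API method for text notifications.
url = f"https://api.telegram.org/bot{BOT_TOKEN}/sendMessage"
payload = urlencode({"chat_id": CHAT_ID, "text": "Cleanup completed for PPM"})
print(url)
print(payload)
```

POSTing this payload to the URL (e.g. via `urllib.request`) delivers the fixed-text confirmation; the workflow sends one such message per database.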

Use Cases

Scenario 1

Data administrators need to remove outdated package records marked as received in transport to maintain database performance. This workflow automates the identification and flagging of such records older than one month, returning notification confirmations to reduce manual oversight.

Scenario 2

Operations teams require timely alerts after routine database maintenance to verify completion. The automation pipeline sends Telegram messages upon each cleanup action, ensuring deterministic status reporting without manual checking.

Scenario 3

Developers aim to integrate cleanup tasks into existing systems with minimal coding. This no-code integration leverages scheduled triggers and webhooks to invoke cleanup across multiple databases, returning structured operational feedback through messaging.

How to use

To deploy this cleanup automation workflow within n8n, import the workflow JSON and configure MySQL and Telegram credentials according to your environment. Enable the Cron node to schedule daily execution at 8 AM or invoke the webhook endpoint to run on demand. The workflow requires no payload input for triggers. Upon execution, expect Telegram messages confirming cleanup status for each database. Monitor workflow executions through n8n’s interface to ensure operational continuity.

Comparison — Manual Process vs. Automation Workflow

Attribute       | Manual/Alternative                                        | This Workflow
Steps required  | Multiple manual database queries and manual notifications | Single automated execution triggered by schedule or webhook
Consistency     | Prone to human error and inconsistent scheduling          | Deterministic SQL filtering with repeatable daily execution
Scalability     | Limited by manual effort and resource availability        | Scales across multiple databases with parallel execution
Maintenance     | High ongoing effort to monitor and trigger cleanups       | Minimal, relying on n8n’s platform and credential management

Technical Specifications

Environment: n8n Workflow Automation Platform
Tools / APIs: MySQL, Telegram Bot API
Execution Model: Event-driven with scheduled and webhook triggers
Input Formats: Trigger event (Cron schedule or HTTP webhook)
Output Formats: Telegram text messages
Data Handling: Transient database updates; no data persistence beyond the update
Known Constraints: Relies on external MySQL database availability and Telegram API uptime
Credentials: MySQL connection credentials, Telegram API token

Implementation Requirements

  • Configured MySQL credentials with appropriate permissions for update queries on target databases.
  • Telegram Bot API credentials linked to a bot authorized to send messages to the specified chat ID.
  • Network access allowing n8n instance to communicate with MySQL servers and Telegram API endpoints.

Configuration & Validation

  1. Verify MySQL credentials and connectivity by testing query execution on both “PPM” and “OBJ” databases.
  2. Confirm Telegram bot credentials with a test message sent to the target chat ID.
  3. Trigger the workflow manually via webhook or enable the Cron node and monitor for expected Telegram notifications.

Data Provenance

  • Trigger nodes: Cron (scheduled 8:00 AM daily), Webhook (path “/limparPacotes”).
  • Database nodes: MySQL nodes “limpaPacoteCliente0” (PPM), “limparPacoteCliente1” (OBJ) executing SQL update queries.
  • Notification nodes: Telegram and Telegram1 nodes sending status messages to chat ID “-657820242”.

FAQ

How is the cleanup automation workflow triggered?

The workflow can be triggered automatically via a Cron node at 8:00 AM daily or manually through an HTTP webhook endpoint configured at “/limparPacotes”. A manual trigger node is also configured but currently disabled.

Which tools or models does the orchestration pipeline use?

The pipeline integrates MySQL nodes executing SQL queries for data cleanup and Telegram nodes for sending status notifications. It does not use external AI models or heuristic engines beyond SQL predicates.

What does the response look like for client consumption?

Clients receive Telegram messages with fixed text confirming that package records have been marked for deletion in each database. These messages include database-specific identifiers to confirm execution.

Is any data persisted by the workflow?

No data beyond the update to the “modulo” field in the database is persisted by the workflow. Telegram messages are sent transiently without storing workflow state or input payloads.

How are errors handled in this integration flow?

Error handling relies on n8n’s default retry and error management. There is no custom retry or backoff logic implemented within this workflow.

Conclusion

This cleanup automation workflow provides a precise method for marking outdated package records as deleted in two MySQL databases, triggered either on a fixed daily schedule or via webhook calls. It delivers deterministic cleanup with immediate Telegram notifications confirming execution status. While the workflow depends on external database and Telegram service availability, it reduces manual maintenance and supports consistent data hygiene efforts. Its no-code integration approach facilitates straightforward deployment and operational monitoring within n8n’s platform.


Vendor Information

  • Store Name: clepti
  • Vendor: clepti

About the seller/store

Clepti is an automation specialist focused on dependable AI workflows and agentic systems that ship and stay online. I design end-to-end automations—intake, decision logic, approvals, execution, and audit trails—using robust building blocks: Python, REST/GraphQL APIs, event queues, vector search, and production-grade LLMs. My work centers on measurable outcomes: fewer manual touches, faster cycle times, lower error rates, and clear ROI.

Typical projects include lead qualification and routing, document parsing and enrichment, multi-step data pipelines, customer support deflection with tool-using agents, and reporting that actually reconciles with source systems. I prioritize security (least privilege, logging, PII handling), testability (unit + sandbox runs), and maintainability (versioned prompts, clear configs, readable code). No inflated promises—just stable automation that replaces repetitive work.

If you need an AI agent or workflow that integrates with your stack (CRMs, ticketing, spreadsheets, databases, or custom APIs) and runs every day without babysitting, I can help. Brief me on the problem, constraints, and success metrics; I’ll propose a straightforward plan and build something reliable.

30-Day Money-Back Guarantee

Easy refunds within 30 days of purchase – if you aren’t happy with the automation/workflow, you get your money back with no questions asked.

Cleanup Automation Workflow for Package Records Using MySQL and Telegram

Automate deletion of outdated package records with this cleanup automation workflow using MySQL and Telegram for real-time status updates, triggered by schedule or webhook.

49.99 $
