Transcripts Migration - Ameer

This workflow migrates transcript analysis data from Airtable into a database. It retrieves all records from the "Transcripts Metadata" table and inserts them, in batches of 30, into the training_sessions table via an HTTP API.

Purpose

No business context provided yet — add a context.md to enrich this documentation.

How It Works

  1. Manual Trigger: The workflow starts when manually executed
  2. Data Retrieval: Fetches all records from the Airtable "Transcripts Metadata" table
  3. Batch Processing: Splits the retrieved records into batches of 30 items for efficient processing
  4. Database Insertion: For each batch, inserts the transcript data into the training_sessions database table via API
  5. Loop Continuation: Continues processing until all batches are complete
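The steps above can be sketched in Python (illustrative only; the actual workflow runs inside n8n). The endpoint URL and the 120-second timeout come from the sections below; the exact payload shape sent per record is an assumption.

```python
import json
import urllib.request
from typing import Iterator

# Endpoint from the "Database API" credentials section below.
DB_URL = ("https://dataview.educateapps.work/api/data/databases/"
          "chatbot/tables/training_sessions")
BATCH_SIZE = 30  # matches the Split in Batches node

def chunk(items: list, size: int = BATCH_SIZE) -> Iterator[list]:
    """Yield successive fixed-size batches, mirroring Split in Batches."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def migrate(records: list, token: str) -> None:
    """POST each record, batch by batch, to the training_sessions API."""
    for batch in chunk(records):
        for record in batch:
            # Payload shape is an assumption; the workflow maps Airtable
            # fields to snake_case columns (see Data Flow below).
            req = urllib.request.Request(
                DB_URL,
                data=json.dumps(record).encode("utf-8"),
                headers={"Authorization": f"Bearer {token}",
                         "Content-Type": "application/json"},
                method="POST",
            )
            urllib.request.urlopen(req, timeout=120)  # 120 s node timeout
```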

Workflow Diagram

```mermaid
graph TD
    A[Manual Trigger] --> B[Get All Records from Airtable]
    B --> C[Loop Over Items - Batch Processing]
    C --> D[Insert into DB]
    D --> C
    C --> E[Complete]
```

Trigger

Manual Trigger: The workflow must be manually executed by clicking the "Execute workflow" button. It does not run automatically on a schedule or in response to external events.

Nodes Used

| Node Type | Node Name | Purpose |
| --- | --- | --- |
| Manual Trigger | When clicking 'Execute workflow' | Initiates the workflow when manually triggered |
| Airtable | Get All Records | Retrieves all records from the Transcripts Metadata table |
| Split in Batches | Loop Over Items | Processes records in batches of 30 to manage load |
| HTTP Request | Insert into DB | Sends POST requests to insert data into the database |

External Services & Credentials Required

Airtable

  • Credential: EXP Training Bot (Airtable Token API)
  • Base: app7ljEXNqhMlhsNS (Transcripts Analysis)
  • Table: tblZv77GimrYguEKg (Transcripts Metadata)
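For reference, the "Get All Records" step corresponds to paging through the Airtable REST API with the token above. This is a sketch assuming the standard `records`/`offset` response shape; the page fetcher is injected into `get_all_records` so the paging logic can be exercised without network access.

```python
import json
import urllib.parse
import urllib.request

BASE_ID = "app7ljEXNqhMlhsNS"   # Transcripts Analysis
TABLE_ID = "tblZv77GimrYguEKg"  # Transcripts Metadata
API_URL = f"https://api.airtable.com/v0/{BASE_ID}/{TABLE_ID}"

def fetch_page(token, offset=None):
    """Fetch one page of records from the Airtable REST API."""
    url = API_URL + (f"?offset={urllib.parse.quote(offset)}" if offset else "")
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def get_all_records(fetch, token):
    """Follow the `offset` cursor until Airtable stops returning one."""
    records, offset = [], None
    while True:
        page = fetch(token, offset)
        records.extend(page.get("records", []))
        offset = page.get("offset")
        if not offset:
            return records
```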

Database API

  • Credential: training_sessions_bearer (HTTP Bearer Auth)
  • Endpoint: https://dataview.educateapps.work/api/data/databases/chatbot/tables/training_sessions

Environment Variables

No environment variables are used in this workflow. All configuration is handled through node parameters and credentials.

Data Flow

Input

  • Airtable records from the "Transcripts Metadata" table containing:
    • Trainer name
    • Transcript Link
    • Summary (2-3 sentences)
    • Business type
    • Milestone action
    • Adjacent stage (S0-S13 mapping)
    • Obstacles
    • Actions
    • Quality score (0-10)
    • Quality remarks

Output

  • Database records inserted into the training_sessions table with mapped fields:
    • trainer_name
    • transcript_link
    • summary
    • business_type
    • milestone_action
    • adjacent_stage
    • obstacles
    • actions
    • quality_score (converted to number)
    • quality_remarks
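The mapping with empty-string fallbacks and the quality-score conversion might look like the sketch below. The snake_case column names are taken from the Output list above; the exact Airtable field names are assumptions based on the Input list.

```python
def map_record(fields: dict) -> dict:
    """Map Airtable fields to training_sessions columns, with fallbacks."""
    def text(name):
        # Empty-string fallback for missing data, as in the workflow.
        return fields.get(name) or ""

    raw_score = fields.get("Quality score")
    return {
        "trainer_name": text("Trainer name"),
        "transcript_link": text("Transcript Link"),
        "summary": text("Summary"),
        "business_type": text("Business type"),
        "milestone_action": text("Milestone action"),
        "adjacent_stage": text("Adjacent stage"),
        "obstacles": text("Obstacles"),
        "actions": text("Actions"),
        # quality_score is converted to a number; 0 when absent (assumption).
        "quality_score": float(raw_score) if raw_score is not None else 0,
        "quality_remarks": text("Quality remarks"),
    }
```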

Error Handling

The workflow includes basic error handling:

  • HTTP Request timeout set to 120 seconds (2 minutes) to handle large data insertions
  • Batch processing limits load on the target database
  • Field mapping includes fallback values (empty strings) for missing data

No explicit error paths or retry mechanisms are implemented.
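If retries were ever needed, a thin wrapper along these lines could be placed around the insert call (illustrative only; not part of the current workflow):

```python
import time

def with_retries(fn, attempts=3, backoff=2.0):
    """Call fn, retrying on any exception with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the original error
            time.sleep(backoff * 2 ** attempt)
```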

Known Limitations

  • The workflow is currently inactive and must be manually triggered
  • No validation of data quality before insertion
  • No duplicate checking - running multiple times may create duplicate records
  • Limited error reporting if database insertions fail
  • Fixed batch size of 30 may not be optimal for all data volumes
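Given the lack of duplicate checking, a simple pre-check on a candidate key could flag re-runs before insertion. This sketch assumes transcript_link uniquely identifies a session, which is an assumption, not something the workflow enforces:

```python
from collections import Counter

def find_duplicates(rows, key="transcript_link"):
    """Return the values of `key` that appear more than once."""
    counts = Counter(row.get(key) for row in rows)
    return sorted(v for v, n in counts.items() if n > 1 and v)
```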

Related Workflows

No related workflows identified from the provided context.

Setup Instructions

  1. Import the Workflow

    • Import the JSON into your n8n instance
    • The workflow will be inactive by default
  2. Configure Airtable Credentials

    • Create an Airtable Token API credential named "EXP Training Bot"
    • Ensure access to base app7ljEXNqhMlhsNS and table tblZv77GimrYguEKg
  3. Configure Database Credentials

    • Create an HTTP Bearer Auth credential named "training_sessions_bearer"
    • Set the bearer token for accessing the dataview.educateapps.work API
  4. Verify Connections

    • Test the Airtable connection by running the "Get All Records" node
    • Test the database connection by running a single "Insert into DB" operation
  5. Execute the Migration

    • Activate the workflow if needed
    • Click "Execute workflow" to start the migration process
    • Monitor the execution to ensure all batches complete successfully
  6. Post-Migration Verification

    • Check the target database to confirm all records were inserted
    • Verify data integrity by spot-checking a few records
    • Deactivate the workflow after successful migration to prevent accidental re-runs