MySQL Connector

Connect your MySQL database directly to AI pipelines and data flows with automatic synchronization.

Setup Instructions

1. Navigate to Data Integrations

Go to the Data Integrations tab in your flow.

2. Select MySQL Integration

Click Select an Integration, type MySQL in the search, and click Connect.

3. Configure Connection Settings

Fill in the following database connection details:

  • Connector Name: Give your connector a descriptive name
  • Host: The hostname or IP address of your MySQL server
    • For local setup: localhost or 127.0.0.1
    • For remote databases: The full hostname (e.g., db.example.com or your-instance.region.rds.amazonaws.com)
  • Port: The MySQL port (default is 3306)
  • Database Name: The name of the database you want to connect to
  • Username: Your MySQL username
  • Password: Your MySQL password
  • Folder (Optional): Select a destination folder in the file manager
    • If not specified, data will be stored in the root directory
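The settings above can be sketched as a simple key/value mapping with basic sanity checks. This is an illustration only; the field names (`connector_name`, `folder`, etc.) are assumptions, not the connector's actual schema.

```python
DEFAULT_PORT = 3306  # standard MySQL port


def build_connection_settings(name, host, database, username, password,
                              port=DEFAULT_PORT, folder=None):
    """Collect and sanity-check the settings from step 3."""
    if not (1 <= port <= 65535):
        raise ValueError(f"invalid port: {port}")
    for field, value in [("connector name", name), ("host", host),
                         ("database", database), ("username", username)]:
        if not value:
            raise ValueError(f"{field} must not be empty")
    return {
        "connector_name": name,
        "host": host,          # e.g. "localhost" or "db.example.com"
        "port": port,
        "database": database,
        "username": username,
        "password": password,
        "folder": folder,      # None -> data is stored in the root directory
    }


settings = build_connection_settings(
    name="orders-db", host="127.0.0.1",
    database="orders", username="reader", password="secret",
)
```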

4. Configure SSL Settings (Optional)

For production databases:

  • Enable SSL to ensure secure data transmission
  • Ensure your MySQL server is configured to accept SSL connections

For local development:

  • SSL can be disabled when connecting to localhost
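The two SSL modes above can be illustrated with Python's standard `ssl` module. How a given MySQL client library consumes such a context varies; this is a sketch of the verification policy, not the connector's internals.

```python
import ssl
from typing import Optional


def make_ssl_context(production: bool) -> Optional[ssl.SSLContext]:
    """Return a TLS context for production, or None for local development."""
    if production:
        # Verify the server certificate and hostname (recommended for
        # production databases).
        return ssl.create_default_context()
    # Local development against localhost: SSL disabled.
    return None
```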

5. Create the Connection

After filling in all connection details, click Create Connection.

The system will:

  • Test the database connection
  • Verify credentials and permissions
  • Begin the initial data synchronization
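If the connection test fails, it can help to first confirm that the host and port are reachable at all. This stdlib-only sketch checks TCP reachability; it does not verify MySQL credentials or permissions, which the connector's own test covers.

```python
import socket


def is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port can be opened."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers DNS failure, refused connection, and timeout.
        return False
```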

6. Monitor Sync Status

  1. Navigate to Data Synchronization to see the import progress
  2. The connector will download all tables from your MySQL database
  3. Each table will be imported as a separate file

7. Access Your Data

  1. Once the sync is complete, go to File Manager
  2. Navigate to the folder you specified (or root directory)
  3. You'll see a folder named after your database (e.g., mysql_data)
  4. Inside, all tables from your database will be available as individual files
  5. Click on any file to preview the table data
  6. The data is now ready to use in your AI pipelines and flows

What Gets Imported:

  • All tables in the specified database
  • Complete table data with all rows and columns
  • Table structure is preserved in a standardized format
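The exact on-disk format is not specified above; assuming, for illustration, that each table lands as a CSV file with a header row, previewing a file (as in step 7) could look like this:

```python
import csv
import io


def preview_table(readable, max_rows=5):
    """Return (header, first rows) of a CSV-formatted table file."""
    reader = csv.reader(readable)
    header = next(reader)
    rows = [row for _, row in zip(range(max_rows), reader)]
    return header, rows


# Usage with an in-memory sample standing in for an imported table file:
sample = io.StringIO("id,name\n1,alice\n2,bob\n")
header, rows = preview_table(sample)
# header -> ["id", "name"]; rows -> [["1", "alice"], ["2", "bob"]]
```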

Best Practices:

  • Use SSL for production databases to ensure data security
  • Use read-only database users when possible to prevent accidental modifications
  • For large databases, consider creating a dedicated read replica to avoid impacting production performance
  • Regularly monitor sync jobs to ensure data stays up-to-date
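A read-only account, as recommended above, is created on the MySQL side with `CREATE USER` and `GRANT SELECT`. The helper below only assembles those statements as strings; the user, password, and database names are placeholders, and in real use a DBA would run the statements directly rather than interpolate untrusted input into SQL.

```python
def readonly_user_sql(user, password, database, host="%"):
    """Return the statements a DBA would run to create a read-only user.

    For illustration only: values are interpolated directly, so this
    must not be fed untrusted input.
    """
    return [
        f"CREATE USER '{user}'@'{host}' IDENTIFIED BY '{password}';",
        f"GRANT SELECT ON `{database}`.* TO '{user}'@'{host}';",
    ]


for stmt in readonly_user_sql("connector_reader", "s3cret", "orders"):
    print(stmt)
```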
