Database Migration Guide¶
This guide helps you migrate between different database systems in go-starter-kit projects.
Prerequisites¶
**Important:** Database migration is not automated. You will need to:
- Export data from the source database
- Regenerate the project with the target database
- Import data into the target database
- Test thoroughly
Migration Paths¶
PostgreSQL ↔ MySQL¶
Difficulty: Easy (both SQL, GORM-compatible)
Steps:

1. Export data from the source database
2. Regenerate the project with the target database
3. Convert SQL syntax if necessary
    - Most differences are handled by GORM
    - Common changes:
        - SERIAL → AUTO_INCREMENT (handled by GORM)
        - BOOLEAN → TINYINT(1) (handled by GORM)
        - TEXT vs LONGTEXT (generally compatible)
For MySQL → PostgreSQL:

```bash
# Convert MySQL dump to PostgreSQL format
# Option 1: Use pgloader (recommended)
pgloader mysql://user:pass@localhost/monapp postgresql://postgres:pass@localhost/monapp

# Option 2: Manual conversion
# - Replace AUTO_INCREMENT with SERIAL
# - Replace `backticks` with "double quotes"
# - Adjust data types
```
For PostgreSQL → MySQL:

```bash
# Export as SQL, then convert
pg_dump -U postgres monapp -f dump.sql
# Edit dump.sql: replace SERIAL with AUTO_INCREMENT, etc.
mysql -u root -p monapp < dump_converted.sql
```
4. Import data
5. Test migrations and queries
Common issues:
- SERIAL vs AUTO_INCREMENT (handled by GORM)
- Some data types differ (TEXT vs LONGTEXT)
- Function syntax may differ (DATE_ADD vs INTERVAL)
- Identifiers: MySQL uses backticks, PostgreSQL uses "double quotes"
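The manual dump conversion described above can be sketched as a small script. This is only an illustrative sketch: the replacement rules are simplified and will not cover every dump, so always review the output before importing.

```python
# Illustrative sketch: PostgreSQL -> MySQL dump conversion.
# The rules are intentionally simplified; review the result by hand.
import re

def pg_to_mysql(sql: str) -> str:
    # SERIAL / BIGSERIAL columns become AUTO_INCREMENT in MySQL.
    sql = re.sub(r"\bBIGSERIAL\b", "BIGINT AUTO_INCREMENT", sql, flags=re.IGNORECASE)
    sql = re.sub(r"\bSERIAL\b", "INT AUTO_INCREMENT", sql, flags=re.IGNORECASE)
    # PostgreSQL quotes identifiers with double quotes; MySQL uses backticks.
    sql = re.sub(r'"([A-Za-z_][A-Za-z0-9_]*)"', r"`\1`", sql)
    # BOOLEAN maps to TINYINT(1) in MySQL.
    sql = re.sub(r"\bBOOLEAN\b", "TINYINT(1)", sql, flags=re.IGNORECASE)
    return sql

print(pg_to_mysql('CREATE TABLE "users" (id SERIAL PRIMARY KEY, active BOOLEAN);'))
```

For anything beyond trivial schemas, prefer pgloader (or a manual review) over regex rewriting.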
Recommended tools:

- pgloader: automated MySQL → PostgreSQL migration
- mysql2postgres: schema conversion script
- GORM AutoMigrate: recreates the schema automatically (custom schema changes outside your GORM models are lost)
PostgreSQL/MySQL → SQLite (Downgrade)¶
Difficulty: Medium (feature reduction)
When to do this:

- Moving from production to local development
- Creating a portable demo version
- Simplifying infrastructure for small projects
Steps:

1. Export data as SQL or CSV
2. Regenerate the project
3. Import a limited dataset
4. Remove advanced SQL features
    - No stored procedures
    - Limited transaction support
    - No concurrent writes
Limitations:

- No concurrent writes (database-level locking)
- Limited data types
- No stored procedures
- No user/permission management
- Not suitable for large-scale production
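If you stay on SQLite despite the locking limitation, enabling WAL mode and a busy timeout softens (but does not remove) the single-writer restriction. A minimal sketch, assuming a file-based database standing in for the real one:

```python
# Sketch: reducing "database is locked" errors when staying on SQLite.
# WAL mode lets readers proceed alongside a single writer, and a busy
# timeout makes writers wait instead of failing immediately.
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "monapp.db")  # stand-in for the real file
conn = sqlite3.connect(path, timeout=5.0)  # wait up to 5s on a locked database
mode = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]
conn.close()
print(mode)  # "wal" once write-ahead logging is enabled
```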
Recommended use cases:

- Local development environments
- Portable demos
- Testing and CI/CD pipelines
- Small-scale applications (<100 users)
SQLite → PostgreSQL/MySQL (Upgrade)¶
Difficulty: Easy (adding features)
When to do this:

- Moving from prototype to production
- Need for concurrent write access
- Need for advanced SQL features
Steps:

1. Export SQLite data
2. Regenerate the project
3. Convert the schema if necessary
    - GORM migrations should handle most differences
4. Review and adjust any custom SQL
5. Import data
6. Test thoroughly
    - Verify data integrity
    - Test all API endpoints
    - Run the full test suite
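Before exporting, SQLite can check its own consistency with built-in pragmas. A minimal sketch, shown here against a throwaway in-memory database standing in for monapp.db:

```python
# Sketch: asking SQLite to verify its own consistency before export.
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for monapp.db
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY);
    CREATE TABLE posts (id INTEGER PRIMARY KEY,
                        user_id INTEGER REFERENCES users(id));
    INSERT INTO users VALUES (1);
    INSERT INTO posts VALUES (1, 1);
""")
status = conn.execute("PRAGMA integrity_check").fetchone()[0]
orphans = conn.execute("PRAGMA foreign_key_check").fetchall()
print(status, len(orphans))  # "ok 0" means the file is consistent
```

Orphaned rows reported by `foreign_key_check` will fail to import into a target database that enforces foreign keys, so fix them before migrating.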
Data Export/Import Examples¶
Export from PostgreSQL¶
```bash
# Full database dump (compressed custom format)
pg_dump -U postgres -h localhost -d monapp -F c -b -v -f monapp_backup.dump

# SQL format (human-readable)
pg_dump -U postgres -h localhost -d monapp -f monapp_backup.sql

# Specific tables only
pg_dump -U postgres -h localhost -d monapp -t users -t posts -f partial_backup.sql

# CSV export for a specific table
psql -U postgres -d monapp -c "COPY users TO '/tmp/users.csv' WITH CSV HEADER;"
```
Export from MySQL¶
```bash
# Full database dump
mysqldump -u root -p monapp > monapp_backup.sql

# Specific tables only
mysqldump -u root -p monapp users posts > partial_backup.sql

# CSV export
mysql -u root -p monapp -e "SELECT * FROM users INTO OUTFILE '/tmp/users.csv'
  FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n';"
```
Export from SQLite¶
```bash
# Full database dump
sqlite3 monapp.db .dump > monapp_backup.sql

# Specific table
sqlite3 monapp.db "SELECT * FROM users;" > users.txt

# CSV export
sqlite3 monapp.db -header -csv "SELECT * FROM users;" > users.csv
```
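Where the sqlite3 CLI is not installed, the same CSV export can be done from the Python standard library. An illustrative sketch (the `users` table and its columns are placeholders):

```python
# Sketch: CSV export of a SQLite table using only the Python standard
# library, for environments without the sqlite3 CLI.
import csv
import os
import sqlite3
import tempfile

def export_table_csv(conn, table, path):
    cur = conn.execute(f"SELECT * FROM {table}")  # table name must be trusted
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cur.description])  # header row
        writer.writerows(cur)  # data rows

# Demo with a throwaway database standing in for monapp.db:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [(1, "a@example.com"), (2, "b@example.com")])
path = os.path.join(tempfile.mkdtemp(), "users.csv")
export_table_csv(conn, "users", path)
print(open(path).read())
```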
Import into PostgreSQL¶
```bash
# From a custom-format PostgreSQL dump
pg_restore -U postgres -h localhost -d monapp monapp_backup.dump

# From a SQL file
psql -U postgres -d monapp < monapp_backup.sql

# From CSV
psql -U postgres -d monapp -c "\COPY users FROM '/tmp/users.csv' WITH CSV HEADER;"
```
Import into MySQL¶
```bash
# From a SQL file
mysql -u root -p monapp < monapp_backup.sql

# From CSV
mysql -u root -p monapp -e "LOAD DATA LOCAL INFILE '/tmp/users.csv'
  INTO TABLE users
  FIELDS TERMINATED BY ','
  ENCLOSED BY '\"'
  LINES TERMINATED BY '\n'
  IGNORE 1 ROWS;"
```
Import into SQLite¶
```bash
# From a SQL file
sqlite3 monapp.db < monapp_backup.sql

# From CSV
sqlite3 monapp.db <<EOF
.mode csv
.import users.csv users
EOF
```
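The `.import` command above can also be reproduced from the Python standard library, which is handy in scripted migrations. An illustrative sketch with placeholder data:

```python
# Sketch: importing CSV rows into SQLite from the Python standard library,
# mirroring the sqlite3 CLI's .import command. Data below is illustrative.
import csv
import io
import sqlite3

def import_csv(conn, table, lines):
    rows = list(csv.reader(lines))
    header, data = rows[0], rows[1:]  # first row is the header
    cols = ", ".join(header)
    marks = ", ".join("?" for _ in header)
    conn.executemany(f"INSERT INTO {table} ({cols}) VALUES ({marks})", data)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, email TEXT)")
import_csv(conn, "users",
           io.StringIO("id,email\n1,a@example.com\n2,b@example.com\n"))
n = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
print(n)  # 2
```

In a real run, replace the `io.StringIO` stand-in with `open("users.csv", newline="")`.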
Post-Migration Testing¶
Checklist:

- [ ] All migrations run successfully
- [ ] Data integrity verified (counts, relationships)
- [ ] All API endpoints work correctly
- [ ] Authentication/authorization works
- [ ] All tests pass
- [ ] Performance is acceptable
- [ ] No data loss detected
Validation commands:

```bash
# Run all tests
go test ./...

# Boot the application against the new database
go run ./cmd/main.go

# Test all API endpoints manually or with automated tests

# Verify record counts: compare counts between source and target databases
```
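Comparing counts between source and target can be done with a small per-table count diff. A sketch, demonstrated with two SQLite handles standing in for the real source and target connections:

```python
# Sketch: per-table row-count comparison between source and target.
# In a real migration the two connections would point at different
# systems (e.g. MySQL source, PostgreSQL target).
import sqlite3

TABLES = ["users"]  # list every migrated table here

def table_counts(conn, tables):
    return {t: conn.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
            for t in tables}

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE users (id INTEGER)")
    db.executemany("INSERT INTO users VALUES (?)", [(1,), (2,), (3,)])

src, tgt = table_counts(source, TABLES), table_counts(target, TABLES)
print(src == tgt)  # True when every table's count matches
```

Matching counts are necessary but not sufficient; also spot-check relationships and a sample of rows.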
Rollback Plan¶
Always have a rollback plan before migrating:
1. Back up the original database
2. Test the migration in staging first
    - Never test migrations directly in production
3. Use a staging environment identical to production
4. Schedule a maintenance window
    - Plan the migration during low-traffic periods
5. Communicate the maintenance to users
6. Document rollback steps
Migration Checklist¶
Before migration:

- [ ] Complete backup of the source database
- [ ] Backup verified (restore test)
- [ ] Staging environment ready
- [ ] Migration plan documented
- [ ] Rollback plan prepared
- [ ] Maintenance scheduled
- [ ] Team notified

During migration:

- [ ] Stop the application (prevent data changes)
- [ ] Export data from the source
- [ ] Regenerate the project with the new database
- [ ] Import data into the target
- [ ] Run GORM migrations
- [ ] Verify data integrity

After migration:

- [ ] All tests pass
- [ ] API endpoints verified
- [ ] Performance acceptable
- [ ] Monitoring in place
- [ ] Old database backed up
- [ ] Documentation updated
Common Migration Scenarios¶
Scenario 1: Prototype to Production¶
Path: SQLite → PostgreSQL
Use case: You built an MVP with SQLite, now moving to production.
Steps:

1. Export SQLite data
2. Regenerate with PostgreSQL
3. Set up PostgreSQL in Docker/cloud
4. Import data
5. Test thoroughly
Scenario 2: Shared Hosting to Cloud¶
Path: MySQL → PostgreSQL
Use case: Moving from shared hosting to cloud infrastructure.
Steps:

1. Export MySQL data
2. Regenerate with PostgreSQL
3. Set up managed PostgreSQL (AWS RDS, GCP Cloud SQL)
4. Import data
5. Update connection strings
Scenario 3: Simplify Development¶
Path: PostgreSQL → SQLite
Use case: Faster local development without Docker.
Steps:

1. Export a minimal test dataset
2. Regenerate with SQLite
3. Import test data
4. Keep production on PostgreSQL
Need help?¶
- Check the Database Guide for detailed information
- Open an issue on GitHub
- See examples in the /examples directory (if available)
Last updated: 2026-02-09