Copilot for Database Development: SQL, Migrations, and Schema Design
Leverage GitHub Copilot and Microsoft Copilot to streamline database development, write better SQL queries, design robust schemas, and manage database migrations effectively.
As a database developer or DBA, you can significantly boost your productivity and code quality by leveraging AI-powered coding assistants. This comprehensive guide shows you how to use GitHub Copilot and Microsoft Copilot for various database development tasks.
🎯 What You'll Learn
- Write optimized SQL queries with Copilot assistance
- Design robust database schemas using AI guidance
- Generate and manage Entity Framework migrations
- Optimize database performance with AI suggestions
- Create comprehensive database documentation
- Implement proper indexing strategies
💡 Getting Started with Copilot for Database Development
Important Note: All code examples shown in this guide are samples of what Copilot MAY generate. Actual suggestions will vary based on your specific database schema, project context, existing code, and the version of Copilot you're using. The key is understanding the workflow and interaction patterns.
Where You'll Get Copilot Assistance
Primary Applications:
- Visual Studio Code with GitHub Copilot extension (SQL files, .cs files)
- Visual Studio 2022 with GitHub Copilot extension (.cs files, Entity Framework)
- SQL Server Management Studio (SSMS) with GitHub Copilot extension (SQL queries)
- Azure Data Studio with GitHub Copilot extension (SQL development)
- JetBrains DataGrip with GitHub Copilot plugin (database development)
1. Writing Complex SQL Queries
Application: SQL Server Management Studio, Azure Data Studio, or VS Code with .sql files
Your Action: Type a descriptive comment, press Enter, and accept Copilot's inline suggestion with Tab
What You Type:
-- Find top 10 customers by total order value in the last 6 months
-- Include customer name, email, total orders, and total value
-- Order by total value descending
Copilot's Response: After typing the comments above, Copilot will suggest a complete query. You'll see a gray suggestion that you can accept with Tab. Here's an example of what the generated code might look like:
SELECT
c.customer_id,
c.first_name + ' ' + c.last_name AS customer_name,
c.email,
COUNT(o.order_id) AS total_orders,
SUM(oi.quantity * oi.unit_price) AS total_value
FROM customers c
INNER JOIN orders o ON c.customer_id = o.customer_id
INNER JOIN order_items oi ON o.order_id = oi.order_id
WHERE o.order_date >= DATEADD(MONTH, -6, GETDATE())
GROUP BY c.customer_id, c.first_name, c.last_name, c.email
ORDER BY total_value DESC
OFFSET 0 ROWS FETCH NEXT 10 ROWS ONLY;
Note: Actual Copilot suggestions may vary based on your specific database schema and context.
Pro Tip: If the first suggestion isn't perfect, press Ctrl+Enter (Windows) or Cmd+Enter (Mac) to see alternative suggestions in the Copilot panel.
2. Interactive Query Building with Copilot Chat
Application: VS Code with GitHub Copilot Chat extension
Your Action: Open Copilot Chat (Ctrl+Shift+I) and ask specific questions
What You Type:
@workspace Create a SQL query to analyze table sizes and performance metrics for our e-commerce database
Copilot Chat Response: Provides explanation + code. Here's an example of what you might receive:
I'll help you create a comprehensive query to analyze table sizes and performance metrics. This query will show you table sizes, row counts, and space usage:
```sql
-- Analyze table size and row counts for database optimization
-- Include schema, table name, row count, and size information
SELECT
SCHEMA_NAME(t.schema_id) AS schema_name,
t.name AS table_name,
p.rows AS row_count,
CAST(ROUND(((SUM(a.total_pages) * 8) / 1024.00), 2) AS NUMERIC(36, 2)) AS table_size_mb,
CAST(ROUND(((SUM(a.used_pages) * 8) / 1024.00), 2) AS NUMERIC(36, 2)) AS used_space_mb,
CAST(ROUND(((SUM(a.total_pages) - SUM(a.used_pages)) * 8) / 1024.00, 2) AS NUMERIC(36, 2)) AS unused_space_mb
FROM sys.tables t
INNER JOIN sys.indexes i ON t.object_id = i.object_id
INNER JOIN sys.partitions p ON i.object_id = p.object_id AND i.index_id = p.index_id
INNER JOIN sys.allocation_units a ON p.partition_id = a.container_id
WHERE t.is_ms_shipped = 0 AND i.object_id > 255
GROUP BY t.schema_id, t.name, p.rows
ORDER BY table_size_mb DESC;
```
Note: Copilot's actual response and generated SQL may differ based on your workspace context and database schema.
Follow-up Questions You Can Ask:
- "Explain what this query does line by line"
- "How can I optimize this for better performance?"
- "Create a version that filters for tables larger than 100MB"
3. Performance Optimization with AI Assistance
Application: Any SQL editor with GitHub Copilot
Your Action: Type a comment describing your performance goal
What You Type:
-- Find queries with high CPU usage and long execution times
-- Include execution count, average duration, and CPU time
Copilot's Auto-completion: As you press Enter after the comment, Copilot may suggest something like:
SELECT TOP 20
qs.sql_handle,
qs.execution_count,
qs.total_worker_time / qs.execution_count AS avg_cpu_time,
qs.total_elapsed_time / qs.execution_count AS avg_duration,
qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
SUBSTRING(st.text, (qs.statement_start_offset/2)+1,
((CASE qs.statement_end_offset
WHEN -1 THEN DATALENGTH(st.text)
ELSE qs.statement_end_offset
END - qs.statement_start_offset)/2) + 1) AS statement_text
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
ORDER BY avg_cpu_time DESC;
Note: The exact query structure and columns may vary depending on your SQL Server version and Copilot's analysis of your context.
What Happens Next: You can press Tab to accept, or continue typing to refine the suggestion.
🏗️ Schema Design with AI Assistance
4. Entity Framework Model Generation in Visual Studio/VS Code
Application: Visual Studio 2022 or VS Code with C# extension
Your Action: Create a new .cs file and start with a comment
What You Type:
// Customer entity with proper relationships and constraints
// Include navigation properties for orders and addresses
// Add data annotations for validation and database mapping
Copilot's Response: As you type public class Customer, Copilot may suggest a complete class structure like this:
[Table("Customers")]
public class Customer
{
[Key]
[DatabaseGenerated(DatabaseGeneratedOption.Identity)]
public int CustomerId { get; set; }
[Required]
[MaxLength(50)]
[Index("IX_Customer_Email", IsUnique = true)]
public string Email { get; set; }
[Required]
[MaxLength(100)]
public string FirstName { get; set; }
[Required]
[MaxLength(100)]
public string LastName { get; set; }
[Phone]
[MaxLength(20)]
public string? PhoneNumber { get; set; }
[DataType(DataType.DateTime)]
public DateTime CreatedAt { get; set; } = DateTime.UtcNow;
[DataType(DataType.DateTime)]
public DateTime? UpdatedAt { get; set; }
[Required]
public bool IsActive { get; set; } = true;
// Navigation properties
public virtual ICollection<Order> Orders { get; set; } = new List<Order>();
public virtual ICollection<CustomerAddress> Addresses { get; set; } = new List<CustomerAddress>();
}
Note: The specific attributes, data types, and navigation properties will vary based on your project's Entity Framework version and existing code context.
Interactive Refinement: You can continue typing or use Copilot Chat to ask:
- "Add audit fields to this entity"
- "Create a soft delete pattern for this model"
- "Add validation attributes for business rules"
5. Database Migration Generation
Application: Visual Studio Package Manager Console or VS Code Terminal
Step 1: Use Entity Framework CLI
What You Type:
dotnet ef migrations add AddCustomerLoyaltyProgram
Step 2: Open the generated migration file in VS Code or Visual Studio
What You See: Empty Up() and Down() methods
Step 3: Add descriptive comments and let Copilot complete
What You Type:
// Migration to add customer loyalty program tables
// Include points system, tiers, and transaction history
public partial class AddCustomerLoyaltyProgram : Migration
{
protected override void Up(MigrationBuilder migrationBuilder)
{
// Create loyalty tiers table
Copilot's Auto-completion: As you type the comment and start the method, Copilot may suggest a complete migration like this:
migrationBuilder.CreateTable(
name: "LoyaltyTiers",
columns: table => new
{
TierId = table.Column<int>(type: "int", nullable: false)
.Annotation("SqlServer:Identity", "1, 1"),
TierName = table.Column<string>(type: "nvarchar(50)", maxLength: 50, nullable: false),
MinimumPoints = table.Column<int>(type: "int", nullable: false),
DiscountPercentage = table.Column<decimal>(type: "decimal(5,2)", nullable: false),
IsActive = table.Column<bool>(type: "bit", nullable: false, defaultValue: true),
CreatedAt = table.Column<DateTime>(type: "datetime2", nullable: false, defaultValueSql: "GETUTCDATE()")
},
constraints: table =>
{
table.PrimaryKey("PK_LoyaltyTiers", x => x.TierId);
});
// Create customer loyalty points table
migrationBuilder.CreateTable(
name: "CustomerLoyaltyPoints",
columns: table => new
{
CustomerLoyaltyId = table.Column<int>(type: "int", nullable: false)
.Annotation("SqlServer:Identity", "1, 1"),
CustomerId = table.Column<int>(type: "int", nullable: false),
CurrentPoints = table.Column<int>(type: "int", nullable: false, defaultValue: 0),
LifetimePoints = table.Column<int>(type: "int", nullable: false, defaultValue: 0),
TierId = table.Column<int>(type: "int", nullable: false),
LastUpdated = table.Column<DateTime>(type: "datetime2", nullable: false, defaultValueSql: "GETUTCDATE()")
},
constraints: table =>
{
table.PrimaryKey("PK_CustomerLoyaltyPoints", x => x.CustomerLoyaltyId);
table.ForeignKey("FK_CustomerLoyaltyPoints_Customers", x => x.CustomerId, "Customers", "CustomerId", onDelete: ReferentialAction.Cascade);
table.ForeignKey("FK_CustomerLoyaltyPoints_LoyaltyTiers", x => x.TierId, "LoyaltyTiers", "TierId", onDelete: ReferentialAction.Restrict);
});
// Create indexes for performance
migrationBuilder.CreateIndex(
name: "IX_CustomerLoyaltyPoints_CustomerId",
table: "CustomerLoyaltyPoints",
column: "CustomerId",
unique: true);
migrationBuilder.CreateIndex(
name: "IX_CustomerLoyaltyPoints_TierId",
table: "CustomerLoyaltyPoints",
column: "TierId");
}
protected override void Down(MigrationBuilder migrationBuilder)
{
migrationBuilder.DropTable("CustomerLoyaltyPoints");
migrationBuilder.DropTable("LoyaltyTiers");
}
}
Note: The generated migration code will vary based on your existing database schema, Entity Framework version, and the specific requirements you describe in your comments.
Migration Best Practice: After Copilot generates the migration, review it and use Copilot Chat to ask:
- "Is this migration safe for production?"
- "Generate a rollback verification script"
- "What indexes should I add for optimal performance?"
📊 Database Performance and Optimization
6. Index Strategy with Copilot
Use AI to design comprehensive indexing strategies:
-- Create optimized indexes for e-commerce order system
-- Cover common query patterns and improve performance
-- Composite index for order queries by customer and date range
CREATE NONCLUSTERED INDEX IX_Orders_Customer_Date_Status
ON Orders (customer_id, order_date, status)
INCLUDE (total_amount, shipping_address_id)
WITH (FILLFACTOR = 90, PAD_INDEX = ON);
-- Index for product search and filtering
CREATE NONCLUSTERED INDEX IX_Products_Category_Price_Active
ON Products (category_id, price, is_active)
INCLUDE (product_name, description, stock_quantity)
WITH (FILLFACTOR = 95);
-- Index for order item analysis
CREATE NONCLUSTERED INDEX IX_OrderItems_Product_Date
ON OrderItems (product_id, order_date)
INCLUDE (quantity, unit_price, discount_amount)
WHERE order_date >= '2024-01-01';
7. Stored Procedure Generation
Let Copilot help create optimized stored procedures:
-- Stored procedure for comprehensive customer order analysis
-- Include parameters for date range, customer filters, and pagination
CREATE PROCEDURE sp_GetCustomerOrderAnalysis
@StartDate DATE = NULL,
@EndDate DATE = NULL,
@CustomerId INT = NULL,
@MinOrderValue DECIMAL(10,2) = NULL,
@PageNumber INT = 1,
@PageSize INT = 50
AS
BEGIN
SET NOCOUNT ON;
-- Set default date range if not provided
IF @StartDate IS NULL SET @StartDate = DATEADD(YEAR, -1, GETDATE());
IF @EndDate IS NULL SET @EndDate = GETDATE();
-- Calculate offset for pagination
DECLARE @Offset INT = (@PageNumber - 1) * @PageSize;
WITH CustomerOrderStats AS (
SELECT
c.customer_id,
c.first_name + ' ' + c.last_name AS customer_name,
c.email,
COUNT(o.order_id) AS total_orders,
SUM(o.total_amount) AS total_spent,
AVG(o.total_amount) AS avg_order_value,
MIN(o.order_date) AS first_order_date,
MAX(o.order_date) AS last_order_date,
DATEDIFF(DAY, MIN(o.order_date), MAX(o.order_date)) AS customer_lifetime_days
FROM customers c
INNER JOIN orders o ON c.customer_id = o.customer_id
WHERE o.order_date BETWEEN @StartDate AND @EndDate
AND (@CustomerId IS NULL OR c.customer_id = @CustomerId)
AND o.status NOT IN ('cancelled', 'refunded')
GROUP BY c.customer_id, c.first_name, c.last_name, c.email
HAVING (@MinOrderValue IS NULL OR SUM(o.total_amount) >= @MinOrderValue)
)
SELECT *,
CASE
WHEN total_orders >= 10 AND total_spent >= 1000 THEN 'VIP'
WHEN total_orders >= 5 OR total_spent >= 500 THEN 'Regular'
ELSE 'New'
END AS customer_segment
FROM CustomerOrderStats
ORDER BY total_spent DESC
OFFSET @Offset ROWS FETCH NEXT @PageSize ROWS ONLY;
-- Return total count for pagination
SELECT COUNT(*) AS TotalRecords
FROM customers c
INNER JOIN orders o ON c.customer_id = o.customer_id
WHERE o.order_date BETWEEN @StartDate AND @EndDate
AND (@CustomerId IS NULL OR c.customer_id = @CustomerId)
AND o.status NOT IN ('cancelled', 'refunded');
END;
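After creating the procedure, it helps to exercise it with representative parameters before relying on it; the values below are purely illustrative:
```sql
-- Example call: 2024 orders, customers who spent at least $100, first page of 25 rows
EXEC sp_GetCustomerOrderAnalysis
    @StartDate = '2024-01-01',
    @EndDate = '2024-12-31',
    @MinOrderValue = 100.00,
    @PageNumber = 1,
    @PageSize = 25;
```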
🔧 Database Configuration and Maintenance
8. Entity Framework Configuration
Use Copilot to create comprehensive DbContext configurations:
// Comprehensive DbContext configuration with performance optimizations
// Include connection resiliency, query optimization, and logging
public class ECommerceDbContext : DbContext
{
public ECommerceDbContext(DbContextOptions<ECommerceDbContext> options) : base(options) { }
public DbSet<Customer> Customers { get; set; }
public DbSet<Order> Orders { get; set; }
public DbSet<Product> Products { get; set; }
public DbSet<OrderItem> OrderItems { get; set; }
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
base.OnModelCreating(modelBuilder);
// Configure Customer entity
modelBuilder.Entity<Customer>(entity =>
{
entity.HasKey(e => e.CustomerId);
entity.Property(e => e.Email).IsRequired().HasMaxLength(255);
entity.HasIndex(e => e.Email).IsUnique().HasDatabaseName("IX_Customer_Email");
entity.Property(e => e.CreatedAt).HasDefaultValueSql("GETUTCDATE()");
// Configure one-to-many relationship with Orders
entity.HasMany(c => c.Orders)
.WithOne(o => o.Customer)
.HasForeignKey(o => o.CustomerId)
.OnDelete(DeleteBehavior.Restrict);
});
// Configure Order entity with optimizations
modelBuilder.Entity<Order>(entity =>
{
entity.HasKey(e => e.OrderId);
entity.Property(e => e.TotalAmount).HasPrecision(10, 2);
entity.Property(e => e.OrderDate).HasDefaultValueSql("GETUTCDATE()");
// Create composite index for common queries
entity.HasIndex(e => new { e.CustomerId, e.OrderDate, e.Status })
.HasDatabaseName("IX_Order_Customer_Date_Status");
// Configure relationship with OrderItems
entity.HasMany(o => o.OrderItems)
.WithOne(oi => oi.Order)
.HasForeignKey(oi => oi.OrderId)
.OnDelete(DeleteBehavior.Cascade);
});
// Configure Product entity with search optimization
modelBuilder.Entity<Product>(entity =>
{
entity.HasKey(e => e.ProductId);
entity.Property(e => e.Name).IsRequired().HasMaxLength(200);
entity.Property(e => e.Price).HasPrecision(10, 2);
// Full-text search index for product search
entity.HasIndex(e => new { e.Name, e.Description })
.HasDatabaseName("IX_Product_Search");
});
// Seed initial data
SeedInitialData(modelBuilder);
}
protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
if (!optionsBuilder.IsConfigured)
{
// Enable sensitive data logging in development
optionsBuilder.EnableSensitiveDataLogging()
.EnableDetailedErrors();
// Configure connection resiliency
optionsBuilder.UseSqlServer(connectionString =>
{
connectionString.EnableRetryOnFailure(
maxRetryCount: 3,
maxRetryDelay: TimeSpan.FromSeconds(30),
errorNumbersToAdd: null);
});
}
}
private void SeedInitialData(ModelBuilder modelBuilder)
{
// Seed default loyalty tiers
modelBuilder.Entity<LoyaltyTier>().HasData(
new LoyaltyTier { TierId = 1, TierName = "Bronze", MinimumPoints = 0, DiscountPercentage = 0 },
new LoyaltyTier { TierId = 2, TierName = "Silver", MinimumPoints = 100, DiscountPercentage = 5 },
new LoyaltyTier { TierId = 3, TierName = "Gold", MinimumPoints = 500, DiscountPercentage = 10 },
new LoyaltyTier { TierId = 4, TierName = "Platinum", MinimumPoints = 1000, DiscountPercentage = 15 }
);
}
}
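Because OnConfiguring only runs when the context isn't already configured, most ASP.NET Core projects register the context at startup instead. Here is a minimal sketch; the "DefaultConnection" key is an assumed connection string name in appsettings.json:
```csharp
using Microsoft.EntityFrameworkCore;

// Program.cs (minimal sketch): register the DbContext with retry-on-failure enabled.
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddDbContext<ECommerceDbContext>(options =>
    options.UseSqlServer(
        builder.Configuration.GetConnectionString("DefaultConnection"),
        sqlOptions => sqlOptions.EnableRetryOnFailure(
            maxRetryCount: 3,
            maxRetryDelay: TimeSpan.FromSeconds(30),
            errorNumbersToAdd: null)));

var app = builder.Build();
app.Run();
```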
🚀 Advanced Database Operations
9. Database Maintenance Scripts
Generate comprehensive maintenance scripts with Copilot:
-- Comprehensive database maintenance script
-- Include index maintenance, statistics updates, and cleanup operations
DECLARE @DatabaseName NVARCHAR(128) = DB_NAME();
DECLARE @sql NVARCHAR(MAX);
DECLARE @StartTime DATETIME2 = GETUTCDATE();
PRINT 'Starting database maintenance for: ' + @DatabaseName;
PRINT 'Start time: ' + CONVERT(VARCHAR, @StartTime, 120);
-- 1. Update statistics for all tables
PRINT 'Updating statistics...';
EXEC sp_updatestats;
-- 2. Reorganize or rebuild indexes based on fragmentation
PRINT 'Analyzing and maintaining indexes...';
DECLARE index_cursor CURSOR FOR
SELECT
OBJECT_SCHEMA_NAME(ps.object_id) AS schema_name,
OBJECT_NAME(ps.object_id) AS table_name,
i.name AS index_name,
ps.avg_fragmentation_in_percent,
ps.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') ps
INNER JOIN sys.indexes i ON ps.object_id = i.object_id AND ps.index_id = i.index_id
WHERE ps.avg_fragmentation_in_percent > 5
AND ps.page_count > 1000
AND i.name IS NOT NULL;
DECLARE @schema_name NVARCHAR(128), @table_name NVARCHAR(128), @index_name NVARCHAR(128);
DECLARE @fragmentation FLOAT, @page_count BIGINT;
OPEN index_cursor;
FETCH NEXT FROM index_cursor INTO @schema_name, @table_name, @index_name, @fragmentation, @page_count;
WHILE @@FETCH_STATUS = 0
BEGIN
IF @fragmentation > 30
BEGIN
-- Rebuild index if fragmentation > 30%
SET @sql = 'ALTER INDEX [' + @index_name + '] ON [' + @schema_name + '].[' + @table_name + '] REBUILD WITH (ONLINE = ON, MAXDOP = 1)';
PRINT 'Rebuilding index: ' + @index_name + ' on ' + @schema_name + '.' + @table_name + ' (Fragmentation: ' + CAST(@fragmentation AS VARCHAR) + '%)';
END
ELSE
BEGIN
-- Reorganize index if fragmentation between 5-30%
SET @sql = 'ALTER INDEX [' + @index_name + '] ON [' + @schema_name + '].[' + @table_name + '] REORGANIZE';
PRINT 'Reorganizing index: ' + @index_name + ' on ' + @schema_name + '.' + @table_name + ' (Fragmentation: ' + CAST(@fragmentation AS VARCHAR) + '%)';
END
EXEC sp_executesql @sql;
FETCH NEXT FROM index_cursor INTO @schema_name, @table_name, @index_name, @fragmentation, @page_count;
END;
CLOSE index_cursor;
DEALLOCATE index_cursor;
-- 3. Clean up old audit logs (example)
PRINT 'Cleaning up old audit logs...';
DELETE FROM audit_logs
WHERE created_date < DATEADD(MONTH, -6, GETUTCDATE());
PRINT 'Database maintenance completed at: ' + CONVERT(VARCHAR, GETUTCDATE(), 120);
PRINT 'Total execution time: ' + CAST(DATEDIFF(SECOND, @StartTime, GETUTCDATE()) AS VARCHAR) + ' seconds';
🎯 Pro Tips for Database Development with Copilot
1. Master Comment-Driven Development
Workflow: Write detailed comments → Let Copilot suggest code → Refine with additional comments
-- Step 1: High-level description
-- Find customers who haven't ordered in 6 months
-- Step 2: Add specific requirements
-- Include their last order date and total lifetime value
-- Step 3: Add technical details
-- Use LEFT JOIN to include customers with no orders
-- Order by last order date ascending (oldest first)
Result: Copilot generates increasingly accurate and complete queries as you add detail.
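With all three levels of comments in place, Copilot might complete the query roughly as follows; the table and column names are assumptions carried over from the earlier examples:
```sql
-- Find customers who haven't ordered in 6 months
-- Include their last order date and total lifetime value
-- Use LEFT JOIN to include customers with no orders
-- Order by last order date ascending (oldest first)
SELECT
    c.customer_id,
    c.first_name + ' ' + c.last_name AS customer_name,
    c.email,
    MAX(o.order_date) AS last_order_date,
    ISNULL(SUM(o.total_amount), 0) AS lifetime_value
FROM customers c
LEFT JOIN orders o ON c.customer_id = o.customer_id
GROUP BY c.customer_id, c.first_name, c.last_name, c.email
HAVING MAX(o.order_date) IS NULL
    OR MAX(o.order_date) < DATEADD(MONTH, -6, GETDATE())
ORDER BY last_order_date ASC;
```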
2. Use Copilot Chat for Complex Problem Solving
Application: VS Code Copilot Chat panel (Ctrl+Shift+I)
Example Conversation:
You: "I need help optimizing a slow query that joins 5 tables"
Copilot: "I'd be happy to help optimize your query. Please share the query and I'll analyze it for performance improvements."
You: [Paste your slow query]
Copilot: "I see several optimization opportunities:
1. Missing indexes on join columns
2. SELECT * instead of specific columns
3. No WHERE clause filtering
Here's an optimized version..."
3. Leverage Copilot for Code Reviews
Workflow: Paste your SQL or C# code into Copilot Chat and ask:
- "Review this Entity Framework configuration for best practices"
- "Are there any security issues with this stored procedure?"
- "How can I improve the performance of this query?"
4. Generate Documentation Automatically
What You Type in Copilot Chat:
Generate comprehensive documentation for this database schema including:
- Table relationships
- Index strategy
- Performance considerations
- Security notes
Result: Copilot creates detailed documentation in markdown format.
5. Interactive Learning and Explanation
Ask Copilot Chat:
- "Explain the execution plan for this query"
- "What's the difference between INNER JOIN and LEFT JOIN in this context?"
- "Why is this index not being used?"
Follow-up Questions:
- "Show me an example of when to use this technique"
- "What are the trade-offs of this approach?"
🔍 Common Use Cases
- Schema Evolution: Planning and implementing database schema changes
- Query Optimization: Improving slow-running queries with better indexing and structure
- Data Migration: Moving data between systems or updating schema
- Performance Monitoring: Creating queries to monitor database health and performance
- Backup and Recovery: Automating database maintenance and backup procedures
📚 Resources and Next Steps
- Explore Entity Framework Core documentation for advanced features
- Learn about database design patterns and normalization
- Study query execution plans and performance tuning
- Practice with different database systems (SQL Server, PostgreSQL, MySQL)
- Join database development communities for best practices
🏁 Conclusion
Copilot can significantly enhance your database development workflow by helping you write better SQL, design robust schemas, and maintain optimal performance. Remember to always review and test AI-generated code, especially for production environments.
Start small with simple queries and gradually work your way up to more complex database operations. The key is to provide clear context and validate the results to ensure they meet your specific requirements.
Happy coding with Copilot! 🚀