Md Mominul Islam | Software and Data Engineering | SQL Server, .NET, Power BI, Azure Blog

while(!(succeed=try()));


Sunday, August 24, 2025

Master SQL Server Module 4: Intermediate T-SQL & Query Development - A Comprehensive Guide

 


Table of Contents

  1. Views: Simplifying and Securing Data Access

    • What Are Views?

    • Creating and Managing Views

    • Real-Life Examples

    • Pros and Cons

    • Best Practices for Views

    • Security Considerations

    • Performance Optimization

    • Alternatives to Views

  2. Stored Procedures: Reusable and Secure Code Blocks

    • Understanding Stored Procedures

    • Creating and Executing Stored Procedures

    • Real-Life Examples

    • Pros and Cons

    • Best Practices

    • Security Best Practices

    • Performance Optimization

    • Error Handling in Stored Procedures

    • Alternatives to Stored Procedures

  3. Functions: Enhancing Query Flexibility

    • Types of User-Defined Functions

    • Creating Scalar, Inline, and Multi-Statement Table-Valued Functions

    • Real-Life Examples

    • Pros and Cons

    • Best Practices

    • Security Considerations

    • Performance Optimization

    • Alternatives to Functions

  4. Transactions and Error Handling: Ensuring Data Integrity

    • Understanding Transactions

    • Implementing TRY/CATCH Blocks

    • Real-Life Examples

    • Pros and Cons

    • Best Practices for Transactions

    • Error Handling Strategies

    • Security Considerations

    • Performance Optimization

    • Alternatives to TRY/CATCH

  5. Triggers: Automating Database Actions

    • AFTER Triggers vs. INSTEAD OF Triggers

    • Creating and Managing Triggers

    • Real-Life Examples

    • Pros and Cons

    • Best Practices

    • Security Considerations

    • Performance Optimization

    • Alternatives to Triggers

  6. Window Functions: Advanced Analytical Queries

    • Understanding Window Functions (ROW_NUMBER, RANK, LEAD, LAG)

    • Real-Life Examples

    • Pros and Cons

    • Best Practices

    • Security Considerations

    • Performance Optimization

    • Alternatives to Window Functions

  7. Temp Tables vs. Table Variables: Choosing the Right Temporary Storage

    • Differences Between Temp Tables and Table Variables

    • Real-Life Examples

    • Pros and Cons

    • Best Practices

    • Security Considerations

    • Performance Optimization

    • Alternatives to Temp Tables and Table Variables

  8. Dynamic SQL: Flexible Query Execution

    • What Is Dynamic SQL?

    • Creating and Executing Dynamic SQL

    • Real-Life Examples

    • Pros and Cons

    • Best Practices

    • Security Best Practices (Preventing SQL Injection)

    • Performance Optimization

    • Alternatives to Dynamic SQL

  9. SQL Server 2025: JSON Enhancements and Advanced Features

    • JSON Enhancements in SQL Server 2025

    • Other Advanced Features in SQL Server 2025

    • Real-Life Examples

    • Pros and Cons

    • Best Practices

    • Security Considerations

    • Performance Optimization

    • Alternatives to JSON in SQL Server

  10. Conclusion: Mastering Intermediate T-SQL



1. Views: Simplifying and Securing Data Access

What Are Views?

A view in SQL Server is a virtual table based on the result set of a SELECT query. It doesn’t store data physically but provides a simplified or customized view of data from one or more tables. Views enhance security, simplify complex queries, and improve maintainability.

Creating and Managing Views

Let’s create a view to display employee details from an HR database, joining the Employees and Departments tables.

CREATE VIEW vw_EmployeeDetails
AS
SELECT 
    e.EmployeeID,
    e.FirstName + ' ' + e.LastName AS FullName,
    e.HireDate,
    d.DepartmentName
FROM Employees e
INNER JOIN Departments d ON e.DepartmentID = d.DepartmentID
WHERE e.IsActive = 1;
GO

-- Query the view
SELECT * FROM vw_EmployeeDetails WHERE DepartmentName = 'IT';

To modify a view, use ALTER VIEW. To drop it, use DROP VIEW.
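For instance, to redefine the view or remove it entirely (the added Email column below is illustrative and assumes such a column exists on Employees):

```sql
-- Redefine the view to expose an extra (hypothetical) column
ALTER VIEW vw_EmployeeDetails
AS
SELECT 
    e.EmployeeID,
    e.FirstName + ' ' + e.LastName AS FullName,
    e.HireDate,
    e.Email,  -- assumed column, for illustration only
    d.DepartmentName
FROM Employees e
INNER JOIN Departments d ON e.DepartmentID = d.DepartmentID
WHERE e.IsActive = 1;
GO

-- Remove the view when it is no longer needed
DROP VIEW vw_EmployeeDetails;
```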

Real-Life Example: Sales Dashboard View

Imagine a retail company needs a dashboard to display sales data for active products. A view can simplify this by joining the Products, Sales, and Categories tables.

CREATE VIEW vw_SalesDashboard
AS
SELECT 
    p.ProductID,
    p.ProductName,
    c.CategoryName,
    SUM(s.Quantity) AS TotalSold,
    SUM(s.Quantity * s.UnitPrice) AS TotalRevenue
FROM Products p
INNER JOIN Sales s ON p.ProductID = s.ProductID
INNER JOIN Categories c ON p.CategoryID = c.CategoryID
WHERE p.Discontinued = 0
GROUP BY p.ProductID, p.ProductName, c.CategoryName;
GO

-- Query the view
SELECT * FROM vw_SalesDashboard WHERE CategoryName = 'Electronics';

This view provides a clean, aggregated dataset for reporting, hiding the complexity of joins and aggregations.

Pros and Cons of Views

Pros:

  • Simplification: Abstracts complex queries, making them easier to consume.

  • Security: Restricts access to specific columns or rows, enhancing data protection.

  • Reusability: Centralizes logic for reuse across applications.

  • Consistency: Ensures consistent data presentation across queries.

Cons:

  • Performance Overhead: Views may degrade performance for complex queries if not indexed.

  • Limited Flexibility: Cannot accept parameters, limiting dynamic behavior.

  • Maintenance: Changes to underlying tables require view updates.

Best Practices for Views

  • Use Descriptive Names: Prefix views with vw_ for clarity (e.g., vw_SalesDashboard).

  • Limit Complexity: Avoid overly complex views to maintain performance.

  • Document Views: Include comments explaining the view’s purpose.

  • Use SCHEMABINDING: Binds the view to the schema of its base tables, so those tables cannot be altered or dropped in ways that would break the view.

CREATE VIEW vw_SecureEmployeeDetails
WITH SCHEMABINDING
AS
SELECT 
    EmployeeID,
    FirstName,
    LastName
FROM dbo.Employees
WHERE IsActive = 1;
GO
  • Avoid Nested Views: Nesting views can lead to performance issues and maintenance challenges.

Security Considerations

  • Grant permissions explicitly to control access:

GRANT SELECT ON vw_EmployeeDetails TO SalesTeamRole;
  • Use WITH CHECK OPTION so that rows inserted or updated through the view must still satisfy the view’s WHERE clause.

  • Avoid exposing sensitive columns (e.g., Salary) in views accessible to non-privileged users.
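A minimal sketch of WITH CHECK OPTION, reusing the Employees table from earlier:

```sql
CREATE VIEW vw_ActiveEmployees
AS
SELECT EmployeeID, FirstName, LastName, IsActive
FROM Employees
WHERE IsActive = 1
WITH CHECK OPTION;
GO

-- Fails: setting IsActive = 0 would make the row invisible to the view,
-- which WITH CHECK OPTION forbids
UPDATE vw_ActiveEmployees SET IsActive = 0 WHERE EmployeeID = 1;
```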

Performance Optimization

  • Indexed Views: Use materialized views (indexed views) for frequently accessed data.

CREATE VIEW vw_SalesSummary
WITH SCHEMABINDING
AS
SELECT 
    p.ProductID,
    COUNT_BIG(*) AS SaleCount
FROM dbo.Sales s
INNER JOIN dbo.Products p ON s.ProductID = p.ProductID
GROUP BY p.ProductID;
GO

CREATE UNIQUE CLUSTERED INDEX IDX_vw_SalesSummary ON vw_SalesSummary(ProductID);
  • Ensure underlying tables have proper indexes to support view queries.

Alternatives to Views

  • Common Table Expressions (CTEs): Temporary result sets for single queries.

  • Table-Valued Functions: Parameterized views for dynamic filtering.

  • Stored Procedures: For complex logic requiring parameters.
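For comparison, the same employee listing written as a CTE, which exists only for the duration of a single query:

```sql
WITH EmployeeDetails AS
(
    SELECT 
        e.EmployeeID,
        e.FirstName + ' ' + e.LastName AS FullName,
        d.DepartmentName
    FROM Employees e
    INNER JOIN Departments d ON e.DepartmentID = d.DepartmentID
    WHERE e.IsActive = 1
)
SELECT * FROM EmployeeDetails WHERE DepartmentName = 'IT';
```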


2. Stored Procedures: Reusable and Secure Code Blocks

Understanding Stored Procedures

A stored procedure is a precompiled set of T-SQL statements stored in the database, allowing reusable, secure, and optimized execution of complex logic.

Creating and Executing Stored Procedures

Here’s a stored procedure to insert a new employee and log the action.

CREATE PROCEDURE sp_InsertEmployee
    @FirstName NVARCHAR(50),
    @LastName NVARCHAR(50),
    @DepartmentID INT,
    @HireDate DATE,
    @EmployeeID INT OUTPUT
AS
BEGIN
    SET NOCOUNT ON;
    
    INSERT INTO Employees (FirstName, LastName, DepartmentID, HireDate, IsActive)
    VALUES (@FirstName, @LastName, @DepartmentID, @HireDate, 1);
    
    SET @EmployeeID = SCOPE_IDENTITY();
    
    INSERT INTO AuditLog (Action, EmployeeID, ActionDate)
    VALUES ('Insert Employee', @EmployeeID, GETDATE());
END;
GO

-- Execute the procedure
DECLARE @NewEmployeeID INT;
EXEC sp_InsertEmployee 
    @FirstName = 'John', 
    @LastName = 'Doe', 
    @DepartmentID = 1, 
    @HireDate = '2025-08-24',
    @EmployeeID = @NewEmployeeID OUTPUT;
SELECT @NewEmployeeID AS NewEmployeeID;

Real-Life Example: Order Processing

A stored procedure for processing customer orders, updating inventory, and logging transactions.

CREATE PROCEDURE sp_ProcessOrder
    @CustomerID INT,
    @ProductID INT,
    @Quantity INT,
    @OrderID INT OUTPUT
AS
BEGIN
    SET NOCOUNT ON;
    
    BEGIN TRY
        BEGIN TRANSACTION;
        
        -- Insert order
        INSERT INTO Orders (CustomerID, OrderDate)
        VALUES (@CustomerID, GETDATE());
        SET @OrderID = SCOPE_IDENTITY();
        
        -- Insert order detail
        INSERT INTO OrderDetails (OrderID, ProductID, Quantity)
        VALUES (@OrderID, @ProductID, @Quantity);
        
        -- Update inventory
        UPDATE Products
        SET StockQuantity = StockQuantity - @Quantity
        WHERE ProductID = @ProductID;
        
        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;
        INSERT INTO ErrorLog (ErrorMessage, ErrorDate)
        VALUES (ERROR_MESSAGE(), GETDATE());
        THROW;
    END CATCH;
END;
GO

Pros and Cons of Stored Procedures

Pros:

  • Reusability: Centralizes logic for multiple applications.

  • Performance: Precompiled execution plans improve performance.

  • Security: Encapsulates logic, reducing direct table access.

  • Maintainability: Easy to update without changing application code.

Cons:

  • Complexity: Can become unwieldy for complex logic.

  • Debugging: Harder to debug than inline queries.

  • Versioning: Managing changes across environments can be challenging.

Best Practices

  • Use SET NOCOUNT ON: Reduces network overhead by suppressing row count messages.

  • Parameterize Inputs: Prevent SQL injection and improve plan reuse.

  • Include Error Handling: Use TRY/CATCH blocks (covered later).

  • Comment Code: Document purpose and parameters.

  • Avoid Business Logic: Keep complex business logic in application code.

  • Avoid the sp_ Prefix: SQL Server first searches the master database for procedures whose names start with sp_, adding a small lookup cost; a prefix such as usp_ avoids this.

Security Best Practices

  • Grant EXECUTE permissions instead of table-level access:

GRANT EXECUTE ON sp_ProcessOrder TO OrderProcessingRole;
  • Use EXECUTE AS to control execution context:

CREATE PROCEDURE sp_SecureProcedure
WITH EXECUTE AS OWNER
AS
BEGIN
    -- Secure logic here
END;
GO

Performance Optimization

  • Minimize cursor usage; prefer set-based operations.

  • Use proper indexing on tables accessed by the procedure.

  • Cache execution plans by avoiding dynamic SQL unless necessary.

Error Handling in Stored Procedures

Integrate TRY/CATCH blocks (detailed in Section 4) to handle errors gracefully.

Alternatives to Stored Procedures

  • Functions: For reusable logic returning specific values or tables.

  • Inline Queries: For simple operations in application code.

  • ETL Tools: For complex data transformations outside SQL Server.


3. Functions: Enhancing Query Flexibility

Types of User-Defined Functions

SQL Server supports three types of user-defined functions (UDFs):

  1. Scalar-Valued Functions: Return a single value.

  2. Inline Table-Valued Functions: Return a table result set using a single SELECT.

  3. Multi-Statement Table-Valued Functions: Return a table built through multiple statements.

Creating Scalar, Inline, and Multi-Statement Table-Valued Functions

Scalar Function Example: Calculate employee tenure.

CREATE FUNCTION fn_CalculateTenure (@HireDate DATE)
RETURNS INT
AS
BEGIN
    RETURN DATEDIFF(YEAR, @HireDate, GETDATE());
END;
GO

-- Use the function
SELECT 
    FirstName,
    dbo.fn_CalculateTenure(HireDate) AS YearsOfService
FROM Employees;

Inline Table-Valued Function Example: Get employees by department.

CREATE FUNCTION fn_GetEmployeesByDepartment (@DepartmentID INT)
RETURNS TABLE
AS
RETURN
(
    SELECT 
        EmployeeID,
        FirstName,
        LastName
    FROM Employees
    WHERE DepartmentID = @DepartmentID
);
GO

-- Query the function
SELECT * FROM dbo.fn_GetEmployeesByDepartment(1);

Multi-Statement Table-Valued Function Example: Summarize sales by category.

CREATE FUNCTION fn_SalesByCategory (@Year INT)
RETURNS @SalesSummary TABLE
(
    CategoryID INT,
    CategoryName NVARCHAR(50),
    TotalSales DECIMAL(10,2)
)
AS
BEGIN
    INSERT INTO @SalesSummary
    SELECT 
        c.CategoryID,
        c.CategoryName,
        SUM(s.Quantity * s.UnitPrice) AS TotalSales
    FROM Sales s
    INNER JOIN Products p ON s.ProductID = p.ProductID
    INNER JOIN Categories c ON p.CategoryID = c.CategoryID
    WHERE YEAR(s.SaleDate) = @Year
    GROUP BY c.CategoryID, c.CategoryName;
    RETURN;
END;
GO

-- Query the function
SELECT * FROM dbo.fn_SalesByCategory(2025);

Real-Life Example: Inventory Valuation

A function to calculate the total value of inventory for a given category.

CREATE FUNCTION fn_InventoryValue (@CategoryID INT)
RETURNS DECIMAL(10,2)
AS
BEGIN
    DECLARE @TotalValue DECIMAL(10,2);
    SELECT @TotalValue = SUM(StockQuantity * UnitPrice)
    FROM Products
    WHERE CategoryID = @CategoryID;
    RETURN ISNULL(@TotalValue, 0);
END;
GO

-- Use the function
SELECT 
    c.CategoryName,
    dbo.fn_InventoryValue(c.CategoryID) AS InventoryValue
FROM Categories c;

Pros and Cons of Functions

Pros:

  • Reusability: Functions can be used across queries and procedures.

  • Modularity: Encapsulates logic for specific tasks.

  • Flexibility: Table-valued functions support complex result sets.

Cons:

  • Performance: Scalar functions can be slow when used in large datasets.

  • Limited Error Handling: Functions cannot use TRY/CATCH.

  • Complexity: Multi-statement functions can be harder to maintain.

Best Practices

  • Prefer inline table-valued functions over multi-statement for better performance.

  • Avoid scalar functions in WHERE clauses or SELECT lists for large datasets.

  • Use descriptive names (e.g., fn_GetEmployeesByDepartment).

  • Document function purpose and parameters.

Security Considerations

  • Grant EXECUTE permissions explicitly:

GRANT EXECUTE ON fn_InventoryValue TO InventoryRole;
  • Avoid exposing sensitive data in function outputs.

Performance Optimization

  • Use inline table-valued functions for set-based operations.

  • Avoid recursive scalar functions.

  • Ensure underlying tables have proper indexes.

Alternatives to Functions

  • Stored Procedures: For complex logic with side effects.

  • Views: For static, reusable result sets.

  • CTEs: For temporary result sets in a single query.


4. Transactions and Error Handling: Ensuring Data Integrity

Understanding Transactions

A transaction is a sequence of operations treated as a single unit, ensuring ACID properties (Atomicity, Consistency, Isolation, Durability). Transactions are critical for maintaining data integrity during complex operations.

Implementing TRY/CATCH Blocks

Here’s an example of a transaction with error handling.

BEGIN TRY
    BEGIN TRANSACTION;
    
    -- Update inventory
    UPDATE Products
    SET StockQuantity = StockQuantity - 10
    WHERE ProductID = 1;
    
    -- Insert order
    INSERT INTO Orders (CustomerID, OrderDate)
    VALUES (100, GETDATE());
    
    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;
    INSERT INTO ErrorLog (ErrorMessage, ErrorDate)
    VALUES (ERROR_MESSAGE(), GETDATE());
    THROW;
END CATCH;

Real-Life Example: Bank Transfer

A transaction to transfer funds between accounts, ensuring data consistency.

CREATE PROCEDURE sp_TransferFunds
    @FromAccountID INT,
    @ToAccountID INT,
    @Amount DECIMAL(10,2)
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        BEGIN TRANSACTION;
        
        -- Deduct from source account
        UPDATE Accounts
        SET Balance = Balance - @Amount
        WHERE AccountID = @FromAccountID
        AND Balance >= @Amount;
        
        IF @@ROWCOUNT = 0
            THROW 50001, 'Insufficient funds in source account.', 1;
        
        -- Add to destination account
        UPDATE Accounts
        SET Balance = Balance + @Amount
        WHERE AccountID = @ToAccountID;
        
        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;
        INSERT INTO ErrorLog (ErrorMessage, ErrorDate)
        VALUES (ERROR_MESSAGE(), GETDATE());
        THROW;
    END CATCH;
END;
GO

Pros and Cons of Transactions

Pros:

  • Data Integrity: Ensures all operations succeed or fail together.

  • Consistency: Maintains database state across failures.

  • Error Handling: TRY/CATCH allows graceful error management.

Cons:

  • Performance Overhead: Locks held during transactions can reduce concurrency.

  • Complexity: Nested transactions require careful management.

  • Deadlocks: Improper transaction design can lead to deadlocks.

Best Practices for Transactions

  • Keep Transactions Short: Minimize lock duration to improve concurrency.

  • Use Appropriate Isolation Levels: Default to READ COMMITTED unless specific needs require otherwise.

  • Avoid Nested Transactions: Use savepoints for partial rollbacks.

BEGIN TRANSACTION;
SAVE TRANSACTION SavePoint1;
-- Operations
IF @@ERROR <> 0
    ROLLBACK TRANSACTION SavePoint1;
COMMIT TRANSACTION;
  • Log Errors: Store error details for auditing.

Error Handling Strategies

  • Use THROW to re-raise errors after logging.

  • Include error details like ERROR_NUMBER(), ERROR_LINE(), and ERROR_MESSAGE().

  • Implement retry logic for transient errors (e.g., deadlocks).
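A minimal retry sketch for deadlock victims (error 1205); the retry count and one-second delay are illustrative choices, not requirements:

```sql
DECLARE @Retry INT = 0;
WHILE @Retry < 3
BEGIN
    BEGIN TRY
        BEGIN TRANSACTION;
        -- ... the work that may deadlock goes here ...
        COMMIT TRANSACTION;
        BREAK;  -- success: leave the retry loop
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;
        IF ERROR_NUMBER() = 1205 AND @Retry < 2  -- deadlock victim: retry
        BEGIN
            SET @Retry += 1;
            WAITFOR DELAY '00:00:01';  -- brief back-off before retrying
        END
        ELSE
        BEGIN
            THROW;  -- non-transient error, or retries exhausted: re-raise
        END
    END CATCH;
END;
```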

Security Considerations

  • Restrict transaction execution to authorized roles.

  • Avoid exposing sensitive error details to end users.

Performance Optimization

  • Use minimal locking with SET TRANSACTION ISOLATION LEVEL.

  • Index tables involved in transactions to reduce lock contention.

  • Avoid long-running transactions in high-concurrency environments.

Alternatives to TRY/CATCH

  • Application-Level Error Handling: Handle errors in application code.

  • Constraints: Use table constraints to enforce rules.

  • Triggers: For automatic validation (covered next).


5. Triggers: Automating Database Actions

AFTER Triggers vs. INSTEAD OF Triggers

  • AFTER Triggers: Execute after an INSERT, UPDATE, or DELETE operation.

  • INSTEAD OF Triggers: Replace the triggering action with custom logic.

Creating and Managing Triggers

AFTER Trigger Example: Log employee updates.

CREATE TRIGGER tr_EmployeeUpdate
ON Employees
AFTER UPDATE
AS
BEGIN
    INSERT INTO AuditLog (Action, EmployeeID, ActionDate)
    SELECT 'Employee Updated', EmployeeID, GETDATE()
    FROM inserted;
END;
GO

INSTEAD OF Trigger Example: Prevent deletion of active employees.

CREATE TRIGGER tr_PreventEmployeeDelete
ON Employees
INSTEAD OF DELETE
AS
BEGIN
    IF EXISTS (SELECT 1 FROM deleted WHERE IsActive = 1)
        THROW 50002, 'Cannot delete active employees.', 1;
    ELSE
        DELETE FROM Employees
        WHERE EmployeeID IN (SELECT EmployeeID FROM deleted);
END;
GO

Real-Life Example: Inventory Audit

A trigger to log inventory changes when stock is updated.

CREATE TRIGGER tr_InventoryAudit
ON Products
AFTER UPDATE
AS
BEGIN
    INSERT INTO InventoryAudit (ProductID, OldStock, NewStock, ChangeDate)
    SELECT 
        i.ProductID,
        d.StockQuantity AS OldStock,
        i.StockQuantity AS NewStock,
        GETDATE()
    FROM inserted i
    INNER JOIN deleted d ON i.ProductID = d.ProductID
    WHERE i.StockQuantity <> d.StockQuantity;
END;
GO

Pros and Cons of Triggers

Pros:

  • Automation: Executes logic automatically on database events.

  • Data Integrity: Enforces rules at the database level.

  • Auditing: Simplifies tracking of changes.

Cons:

  • Performance: Can slow down DML operations due to additional logic.

  • Debugging: Hard to trace and debug.

  • Hidden Logic: Can make database behavior less transparent.

Best Practices

  • Keep triggers simple and focused on specific tasks.

  • Avoid recursive triggers unless necessary (disable them with ALTER DATABASE [YourDatabase] SET RECURSIVE_TRIGGERS OFF).

  • Use inserted and deleted tables effectively.

  • Document trigger logic clearly.

Security Considerations

  • Restrict trigger execution to specific roles.

  • Avoid triggers that expose sensitive data in audit logs.

Performance Optimization

  • Minimize trigger logic to reduce overhead.

  • Ensure indexes on tables involved in triggers.

  • Avoid triggers for high-frequency operations in large datasets.

Alternatives to Triggers

  • Constraints: For simple validation (e.g., CHECK, FOREIGN KEY).

  • Stored Procedures: For explicit logic invocation.

  • Application Logic: Move automation to application code.


6. Window Functions: Advanced Analytical Queries

Understanding Window Functions

Window functions perform calculations across a set of rows (a "window") without collapsing the result set. Common functions include:

  • ROW_NUMBER(): Assigns a unique number to each row.

  • RANK(): Assigns a rank, with ties receiving the same rank.

  • LEAD(): Accesses data from the next row.

  • LAG(): Accesses data from the previous row.
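The difference between ROW_NUMBER and RANK shows up when rows tie. Using the same Sales and Products tables as the examples below:

```sql
SELECT 
    p.ProductName,
    SUM(s.Quantity) AS TotalSold,
    ROW_NUMBER() OVER (ORDER BY SUM(s.Quantity) DESC) AS RowNum,  -- always unique: ties are broken arbitrarily
    RANK()       OVER (ORDER BY SUM(s.Quantity) DESC) AS Rnk      -- ties share a rank; the next rank skips ahead
FROM Sales s
INNER JOIN Products p ON s.ProductID = p.ProductID
GROUP BY p.ProductName;
```

If two products tie for first, both get Rnk = 1 and the next product gets Rnk = 3, while RowNum still runs 1, 2, 3.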

Real-Life Example: Sales Ranking

Rank products by sales within each category.

SELECT 
    p.ProductName,
    c.CategoryName,
    SUM(s.Quantity) AS TotalSold,
    ROW_NUMBER() OVER (PARTITION BY c.CategoryID ORDER BY SUM(s.Quantity) DESC) AS SalesRank
FROM Sales s
INNER JOIN Products p ON s.ProductID = p.ProductID
INNER JOIN Categories c ON p.CategoryID = c.CategoryID
GROUP BY p.ProductName, c.CategoryName, c.CategoryID;

LEAD/LAG Example: Compare sales month-over-month.

SELECT 
    YEAR(s.SaleDate) AS SaleYear,
    MONTH(s.SaleDate) AS SaleMonth,
    SUM(s.Quantity * s.UnitPrice) AS TotalSales,
    LAG(SUM(s.Quantity * s.UnitPrice)) OVER (ORDER BY YEAR(s.SaleDate), MONTH(s.SaleDate)) AS PreviousMonthSales
FROM Sales s
GROUP BY YEAR(s.SaleDate), MONTH(s.SaleDate);

Pros and Cons of Window Functions

Pros:

  • Analytical Power: Enables complex calculations without subqueries.

  • Readability: Simplifies queries compared to self-joins.

  • Flexibility: Supports partitioning and ordering.

Cons:

  • Performance: Can be resource-intensive for large datasets.

  • Complexity: Steep learning curve for beginners.

  • Limited Scope: Not suitable for all analytical tasks.

Best Practices

  • Use PARTITION BY to limit window scope.

  • Optimize ORDER BY clauses with indexes.

  • Avoid overusing window functions in high-frequency queries.

Security Considerations

  • Ensure users have access only to necessary data partitions.

  • Avoid exposing sensitive data in window function results.

Performance Optimization

  • Create indexes on columns used in PARTITION BY and ORDER BY.

  • Limit the window size with ROWS BETWEEN clauses.
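For example, a three-month moving total restricted with ROWS BETWEEN, built on the same Sales table:

```sql
SELECT 
    YEAR(SaleDate) AS SaleYear,
    MONTH(SaleDate) AS SaleMonth,
    SUM(SUM(Quantity * UnitPrice)) OVER (
        ORDER BY YEAR(SaleDate), MONTH(SaleDate)
        ROWS BETWEEN 2 PRECEDING AND CURRENT ROW  -- current month plus the two before it
    ) AS ThreeMonthSales
FROM Sales
GROUP BY YEAR(SaleDate), MONTH(SaleDate);
```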

Alternatives to Window Functions

  • Self-Joins: For simple next/previous row access.

  • Subqueries: For one-off calculations.

  • CTEs: For breaking down complex logic.


7. Temp Tables vs. Table Variables: Choosing the Right Temporary Storage

Differences Between Temp Tables and Table Variables

Feature       | Temp Tables               | Table Variables
--------------|---------------------------|------------------------------------------------
Scope         | Session or global         | Batch, procedure, or function
Storage       | TempDB                    | TempDB (not purely in-memory, despite the myth)
Indexes       | All index types           | PRIMARY KEY/UNIQUE; inline indexes (SQL Server 2014+)
Statistics    | Maintained by SQL Server  | Not maintained
Transactions  | Fully transactional       | Not rolled back with the transaction

Real-Life Example: Temp Table for Reporting

A temp table to store intermediate sales data.

CREATE TABLE #TempSales
(
    ProductID INT,
    TotalSold INT
);

INSERT INTO #TempSales
SELECT 
    ProductID,
    SUM(Quantity) AS TotalSold
FROM Sales
GROUP BY ProductID;

SELECT 
    p.ProductName,
    t.TotalSold
FROM #TempSales t
INNER JOIN Products p ON t.ProductID = p.ProductID;

Table Variable Example: Small dataset processing.

DECLARE @SalesSummary TABLE
(
    ProductID INT,
    TotalSales DECIMAL(10,2)
);

INSERT INTO @SalesSummary
SELECT 
    ProductID,
    SUM(Quantity * UnitPrice)
FROM Sales
GROUP BY ProductID;

SELECT * FROM @SalesSummary;

Pros and Cons

Temp Tables:

  • Pros: Support indexes, statistics, and large datasets; reusable across sessions (global temp tables).

  • Cons: Higher overhead in TempDB; cleanup required.

Table Variables:

  • Pros: Lightweight; auto-cleaned; minimal logging.

  • Cons: Limited indexing; no statistics; poor for large datasets.

Best Practices

  • Use temp tables for large datasets or complex operations.

  • Use table variables for small, short-lived datasets.

  • Drop temp tables explicitly to free TempDB resources:

DROP TABLE #TempSales;

Security Considerations

  • Restrict TempDB access to prevent unauthorized data exposure.

  • Avoid storing sensitive data in temp tables or table variables.

Performance Optimization

  • Index temp tables for large datasets:

CREATE TABLE #TempSales
(
    ProductID INT PRIMARY KEY,
    TotalSold INT
);
CREATE INDEX IX_TotalSold ON #TempSales(TotalSold);
  • Minimize TempDB contention by limiting temp table usage.

Alternatives to Temp Tables and Table Variables

  • CTEs: For temporary result sets in a single query.

  • Physical Tables: For persistent storage.

  • Memory-Optimized Tables: For high-performance scenarios.


8. Dynamic SQL: Flexible Query Execution

What Is Dynamic SQL?

Dynamic SQL allows you to construct and execute SQL statements at runtime, enabling flexible query logic.

Creating and Executing Dynamic SQL

Example: Dynamic filtering of sales data.

CREATE PROCEDURE sp_DynamicSalesReport
    @CategoryID INT = NULL,
    @Year INT = NULL
AS
BEGIN
    DECLARE @SQL NVARCHAR(MAX);
    SET @SQL = N'SELECT 
                    p.ProductName,
                    SUM(s.Quantity) AS TotalSold
                 FROM Sales s
                 INNER JOIN Products p ON s.ProductID = p.ProductID
                 WHERE 1=1';
    
    IF @CategoryID IS NOT NULL
        SET @SQL += N' AND p.CategoryID = @CategoryID';
    
    IF @Year IS NOT NULL
        SET @SQL += N' AND YEAR(s.SaleDate) = @Year';
    
    SET @SQL += N' GROUP BY p.ProductName';
    
    EXEC sp_executesql @SQL, 
        N'@CategoryID INT, @Year INT', 
        @CategoryID, @Year;
END;
GO

Real-Life Example: Dynamic Pivot

A dynamic pivot to show sales by month.

DECLARE @Columns NVARCHAR(MAX), @SQL NVARCHAR(MAX);

SELECT @Columns = STRING_AGG(QUOTENAME(SaleMonth), ',') WITHIN GROUP (ORDER BY SaleMonth)
FROM (SELECT DISTINCT MONTH(SaleDate) AS SaleMonth FROM Sales WHERE YEAR(SaleDate) = 2025) AS Months;

SET @SQL = N'
SELECT *
FROM
(
    SELECT 
        p.ProductName,
        MONTH(s.SaleDate) AS SaleMonth,
        s.Quantity
    FROM Sales s
    INNER JOIN Products p ON s.ProductID = p.ProductID
    WHERE YEAR(s.SaleDate) = 2025
) AS SourceTable
PIVOT
(
    SUM(Quantity)
    FOR SaleMonth IN (' + @Columns + ')
) AS PivotTable;';

EXEC sp_executesql @SQL;

Pros and Cons of Dynamic SQL

Pros:

  • Flexibility: Supports dynamic query construction.

  • Reusability: Reduces repetitive code for similar queries.

  • Dynamic Filtering: Adapts to varying conditions.

Cons:

  • Security Risks: Vulnerable to SQL injection if not parameterized.

  • Performance: Execution plans may not be cached effectively.

  • Complexity: Harder to debug and maintain.

Best Practices

  • Always use sp_executesql with parameterized queries.

  • Validate input parameters to prevent injection.

  • Comment dynamic SQL logic for clarity.

Security Best Practices

  • Parameterize Queries:

EXEC sp_executesql @SQL, N'@Param INT', @Param;
  • Avoid concatenating user inputs directly into SQL strings.

  • Restrict dynamic SQL execution to authorized roles.

Performance Optimization

  • Cache execution plans with sp_executesql.

  • Minimize dynamic SQL for frequently executed queries.

  • Use proper indexing for tables involved.

Alternatives to Dynamic SQL

  • Stored Procedures with Optional Parameters: For fixed logic.

  • CASE Statements: For conditional logic.

  • Application Logic: Move dynamic query construction to application code.
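The optional-parameter pattern often removes the need for dynamic SQL entirely. A sketch of the same filter as sp_DynamicSalesReport, written statically:

```sql
CREATE PROCEDURE sp_StaticSalesReport
    @CategoryID INT = NULL,
    @Year INT = NULL
AS
BEGIN
    SET NOCOUNT ON;
    SELECT 
        p.ProductName,
        SUM(s.Quantity) AS TotalSold
    FROM Sales s
    INNER JOIN Products p ON s.ProductID = p.ProductID
    WHERE (@CategoryID IS NULL OR p.CategoryID = @CategoryID)
      AND (@Year IS NULL OR YEAR(s.SaleDate) = @Year)
    GROUP BY p.ProductName
    OPTION (RECOMPILE);  -- lets the optimizer resolve the NULL checks per call
END;
GO
```

The trade-off: OPTION (RECOMPILE) buys an accurate plan for each parameter combination at the cost of recompiling on every execution.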


9. SQL Server 2025: JSON Enhancements and Advanced Features

JSON Enhancements in SQL Server 2025

SQL Server 2025 introduces advanced JSON capabilities, building on previous versions. Key enhancements include:

  • Native JSON Data Type: Stores JSON efficiently with indexing support.

  • Enhanced JSON Functions: Improved JSON_VALUE, JSON_QUERY, and JSON_MODIFY.

  • JSON Schema Validation: Ensures JSON data adheres to a defined schema.

  • JSON Path Expressions: Advanced querying with dot notation and wildcards.

Example: Store and query JSON data.

CREATE TABLE CustomerProfiles
(
    CustomerID INT PRIMARY KEY,
    Profile JSON
);

INSERT INTO CustomerProfiles (CustomerID, Profile)
VALUES (1, '{"Name": "John Doe", "Preferences": {"Theme": "Dark", "Notifications": true}}');

-- Query JSON
SELECT 
    CustomerID,
    JSON_VALUE(Profile, '$.Name') AS CustomerName,
    JSON_VALUE(Profile, '$.Preferences.Theme') AS Theme
FROM CustomerProfiles;
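JSON_MODIFY updates a value in place rather than replacing the whole document; a sketch against the CustomerProfiles table above:

```sql
-- Change the stored theme preference from 'Dark' to 'Light'
UPDATE CustomerProfiles
SET Profile = JSON_MODIFY(Profile, '$.Preferences.Theme', 'Light')
WHERE CustomerID = 1;
```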

Other Advanced Features in SQL Server 2025

  • AI-Driven Query Optimization: Automatically tunes queries using machine learning.

  • TLS 1.3 Encryption: Enhanced security for data in transit.

  • Fabric Mirroring: Real-time data replication to Azure Synapse Analytics.

  • Autoreseed for Mirroring: Prevents transaction log overflow.

Real-Life Example: JSON for User Preferences

Store user preferences as JSON and validate schema.

-- Define JSON schema
CREATE TABLE UserSettings
(
    UserID INT PRIMARY KEY,
    Settings JSON
    CONSTRAINT CHK_ValidSettings CHECK (
        JSON_VALIDATE(Settings, 
            '{"type": "object", 
              "properties": {"theme": {"type": "string"}, "notifications": {"type": "boolean"}},
              "required": ["theme", "notifications"]}')
    )
);

-- Insert valid JSON
INSERT INTO UserSettings (UserID, Settings)
VALUES (1, '{"theme": "Light", "notifications": false}');

-- Query JSON
SELECT 
    UserID,
    JSON_VALUE(Settings, '$.theme') AS Theme
FROM UserSettings;

Pros and Cons of JSON Enhancements

Pros:

  • Flexibility: Stores semi-structured data without schema changes.

  • Query Power: Advanced JSON functions simplify data extraction.

  • Performance: Native JSON type and indexing improve query speed.

Cons:

  • Complexity: JSON queries can be harder to read and maintain.

  • Storage Overhead: JSON data may consume more space than normalized tables.

  • Learning Curve: Requires familiarity with JSON syntax and path expressions.

Best Practices

  • Use JSON schema validation to enforce data integrity.

  • Index frequently queried JSON paths. CREATE INDEX cannot reference JSON_VALUE directly, so expose the value as a computed column and index that instead (shown under Performance Optimization below).
  • Normalize data when JSON complexity grows excessively.

Security Considerations

  • Validate JSON inputs to prevent injection attacks.

  • Restrict access to JSON columns with sensitive data.

Performance Optimization

  • Use computed columns for frequently accessed JSON values:

ALTER TABLE UserSettings
ADD Theme AS JSON_VALUE(Settings, '$.theme');
  • Create indexes on computed columns.

Alternatives to JSON

  • Normalized Tables: For structured data with fixed schemas.

  • XML: For hierarchical data with complex validation.

  • NoSQL Databases: For fully unstructured data (e.g., MongoDB).


10. Conclusion: Mastering Intermediate T-SQL

Module 4 has equipped you with the tools to build robust, secure, and high-performing database solutions using Views, Stored Procedures, Functions, Transactions, Triggers, Window Functions, Temp Tables, Table Variables, Dynamic SQL, and SQL Server 2025 JSON Enhancements. By combining real-life examples, best practices, and optimization techniques, you’re ready to tackle complex T-SQL challenges.
