
Conversation

Contributor

Copilot AI commented Dec 3, 2025

  • Create CompressionLevel enum in Config/ObjectModel (optimal, fastest, none)
  • Create CompressionOptions class in Config/ObjectModel with Level property
  • Create CompressionOptionsConverterFactory for JSON serialization with validation
  • Update RuntimeOptions to include Compression property
  • Update RuntimeConfigLoader to register compression converter
  • Update JSON schema (dab.draft.schema.json) to include compression configuration
  • Update Startup.cs to add response compression middleware with console output
  • Add CLI option in ConfigureOptions for --runtime.compression.level
  • Update ConfigGenerator.cs to handle compression configuration
  • Add unit tests for compression configuration (3 tests for fastest, optimal, none)
  • Add validation for invalid compression level values with clear error messages
  • Run code review and security scan (passed with no issues)
  • Address PR feedback: add proper handling for unknown properties in JSON converter
  • Address PR feedback: fix documentation comment punctuation
  • Address PR feedback: update error logging message to be consistent
  • Add integration tests (7 tests) validating compression reduces payload size and doesn't break functionality


Copilot AI and others added 2 commits December 3, 2025 21:33
Co-authored-by: JerryNixon <1749983+JerryNixon@users.noreply.github.com>
Co-authored-by: JerryNixon <1749983+JerryNixon@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Add support for response compression in Data API" to "Add HTTP Response Compression Support" Dec 3, 2025
Copilot AI requested a review from JerryNixon December 3, 2025 21:45
JerryNixon marked this pull request as ready for review December 4, 2025 03:05
Copilot AI review requested due to automatic review settings December 4, 2025 03:05
JerryNixon enabled auto-merge (squash) December 4, 2025 03:05
JerryNixon moved this from Todo to Review In Progress in Data API builder Dec 4, 2025
JerryNixon added this to the Dec 2025 milestone Dec 4, 2025
Contributor

Copilot AI left a comment


Pull request overview

This PR adds HTTP response compression support to Data API Builder, enabling Gzip and Brotli compression for REST, GraphQL, and MCP endpoints to reduce payload sizes and improve performance over slow networks.
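For reference, the configuration surface is small. A hedged sketch of a config with compression enabled (property names inferred from the --runtime.compression.level CLI option; the authoritative shape is in schemas/dab.draft.schema.json):

{
  "runtime": {
    "compression": {
      "level": "optimal"
    }
  }
}

The same setting can be applied via dab configure --runtime.compression.level optimal (or fastest / none).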

Key Changes:

  • Added configurable compression middleware with three levels: optimal (default), fastest, and none
  • Implemented JSON schema, CLI commands, and configuration model for compression settings
  • Added unit tests for CLI configuration updates

Reviewed changes

Copilot reviewed 10 out of 10 changed files in this pull request and generated 3 comments.

Summary per file:

  • src/Service/Startup.cs: Adds the ConfigureResponseCompression method to register Gzip and Brotli providers, and calls UseResponseCompression middleware when compression is enabled
  • src/Config/ObjectModel/CompressionOptions.cs: Defines the CompressionOptions record with a Level property and a UserProvidedLevel flag for JSON serialization control
  • src/Config/ObjectModel/CompressionLevel.cs: Defines the CompressionLevel enum with Optimal, Fastest, and None values
  • src/Config/Converters/CompressionOptionsConverterFactory.cs: Implements a custom JSON converter for reading and writing compression configuration with validation
  • src/Config/RuntimeConfigLoader.cs: Registers the CompressionOptionsConverterFactory in the serialization options
  • src/Config/ObjectModel/RuntimeOptions.cs: Adds a Compression property to runtime configuration options
  • src/Cli/ConfigGenerator.cs: Implements CLI configuration update logic for compression level settings
  • src/Cli/Commands/ConfigureOptions.cs: Adds the RuntimeCompressionLevel command-line option for the dab configure command
  • src/Cli.Tests/ConfigureOptionsTests.cs: Adds test cases for all three compression levels (Optimal, Fastest, None)
  • schemas/dab.draft.schema.json: Adds the JSON schema definition for the compression configuration section
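For context, response compression in ASP.NET Core is wired in two places: service registration and the middleware pipeline. A minimal sketch of the pattern the Startup.cs change follows (illustrative only; the PR's actual ConfigureResponseCompression method appears later in this review):

using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.ResponseCompression;
using Microsoft.Extensions.DependencyInjection;

public class CompressionSetupSketch
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddResponseCompression(options =>
        {
            options.EnableForHttps = true;
            options.Providers.Add<GzipCompressionProvider>();
            options.Providers.Add<BrotliCompressionProvider>();
        });
    }

    public void Configure(IApplicationBuilder app)
    {
        // Must be registered early so later middleware writes through it.
        app.UseResponseCompression();
    }
}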



Copilot AI Dec 4, 2025


The converter doesn't properly handle unknown properties with complex values (objects or arrays). After reading a property name on line 58, if the property is not "level", the code continues the loop without skipping the property value. This works for simple values but could cause parsing issues if an unknown property contains an object or array.

Add an else branch to properly skip unknown properties:

if (string.Equals(propertyName, "level", StringComparison.OrdinalIgnoreCase))
{
    string? levelStr = reader.GetString();
    if (levelStr is not null)
    {
        if (Enum.TryParse<CompressionLevel>(levelStr, ignoreCase: true, out CompressionLevel parsedLevel))
        {
            level = parsedLevel;
            userProvidedLevel = true;
        }
        else
        {
            throw new JsonException($"Invalid compression level: '{levelStr}'. Valid values are: optimal, fastest, none.");
        }
    }
}
else
{
    // Skip unknown properties
    reader.Skip();
}
Suggested change
}
else
{
    // Skip unknown properties and their values (including objects/arrays)
    reader.Skip();
}

Contributor Author


Added the else block to properly skip unknown properties including objects and arrays using reader.Skip(). Fixed in commit c53261c.
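For orientation, the fragment above typically sits inside a converter's Read loop. A hedged sketch of that shape (illustrative only; the constructor and defaults of the PR's actual CompressionOptionsConverterFactory are assumptions here):

public override CompressionOptions Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
{
    CompressionLevel level = CompressionOptions.DEFAULT_LEVEL;
    bool userProvidedLevel = false;

    while (reader.Read() && reader.TokenType != JsonTokenType.EndObject)
    {
        if (reader.TokenType != JsonTokenType.PropertyName)
        {
            continue;
        }

        string? propertyName = reader.GetString();
        reader.Read(); // advance to the property's value token

        if (string.Equals(propertyName, "level", StringComparison.OrdinalIgnoreCase))
        {
            // ... parse and validate the level as shown above ...
        }
        else
        {
            // Skip, positioned on the value token, steps over scalars,
            // objects, and arrays alike.
            reader.Skip();
        }
    }

    return new CompressionOptions(level, userProvidedLevel);
}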

/// <summary>
/// Tests that running "dab configure --runtime.compression.level {value}" on a config with various values results
/// in runtime config update. Takes in updated value for compression.level and
/// validates whether the runtime config reflects those updated values

Copilot AI Dec 4, 2025


Missing closing period in the documentation comment. The summary should end with a period for consistency with other test documentation in the file.

Suggested change
/// validates whether the runtime config reflects those updated values
/// validates whether the runtime config reflects those updated values.

Contributor Author


Added the missing period to the documentation comment. Fixed in commit c53261c.

Comment on lines +470 to +505
private void ConfigureResponseCompression(IServiceCollection services, RuntimeConfig? runtimeConfig)
{
    CompressionLevel compressionLevel = runtimeConfig?.Runtime?.Compression?.Level ?? CompressionOptions.DEFAULT_LEVEL;

    // Only configure compression if level is not None
    if (compressionLevel == CompressionLevel.None)
    {
        return;
    }

    System.IO.Compression.CompressionLevel systemCompressionLevel = compressionLevel switch
    {
        CompressionLevel.Fastest => System.IO.Compression.CompressionLevel.Fastest,
        CompressionLevel.Optimal => System.IO.Compression.CompressionLevel.Optimal,
        _ => System.IO.Compression.CompressionLevel.Optimal
    };

    services.AddResponseCompression(options =>
    {
        options.EnableForHttps = true;
        options.Providers.Add<Microsoft.AspNetCore.ResponseCompression.GzipCompressionProvider>();
        options.Providers.Add<Microsoft.AspNetCore.ResponseCompression.BrotliCompressionProvider>();
    });

    services.Configure<Microsoft.AspNetCore.ResponseCompression.GzipCompressionProviderOptions>(options =>
    {
        options.Level = systemCompressionLevel;
    });

    services.Configure<Microsoft.AspNetCore.ResponseCompression.BrotliCompressionProviderOptions>(options =>
    {
        options.Level = systemCompressionLevel;
    });

    _logger.LogInformation("Response compression enabled with level '{compressionLevel}' for REST, GraphQL, and MCP endpoints.", compressionLevel);
}

Copilot AI Dec 4, 2025


The response compression middleware lacks integration test coverage. While unit tests verify the CLI configuration, there are no tests validating that:

  1. The compression middleware is actually registered when configured
  2. Responses are compressed when clients send Accept-Encoding headers
  3. Compression is disabled when the level is set to None
  4. Different compression levels (Optimal vs Fastest) behave as expected

Consider adding integration tests similar to CorrelationIdTests.cs that verify the middleware behavior with actual HTTP requests and responses.

Contributor Author


Added 7 integration tests in CompressionIntegrationTests.cs that verify:

  1. Gzip compression works when Accept-Encoding header is present
  2. Brotli compression works
  3. Compression reduces payload size significantly (10%+ reduction)
  4. Compression is disabled when level is None
  5. No compression without Accept-Encoding header
  6. Fastest compression level works correctly
  7. Compressed content can be decompressed and is valid JSON

All tests pass. Fixed in commit c53261c.
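A minimal shape for one of those tests, as a hedged sketch (the endpoint path and base address are placeholders, not the actual CompressionIntegrationTests.cs contents):

using System;
using System.IO;
using System.IO.Compression;
using System.Linq;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class CompressionSketchTests
{
    [TestMethod]
    public async Task GzipResponse_DecompressesToValidJson()
    {
        using HttpClient client = new() { BaseAddress = new Uri("http://localhost:5000") };
        using HttpRequestMessage request = new(HttpMethod.Get, "/api/Book");
        request.Headers.AcceptEncoding.ParseAdd("gzip");

        using HttpResponseMessage response = await client.SendAsync(request);

        // The server should honor Accept-Encoding and mark the body as gzip.
        Assert.AreEqual("gzip", response.Content.Headers.ContentEncoding.FirstOrDefault());

        // Decompressing must yield valid JSON, showing compression didn't break the payload.
        using Stream body = await response.Content.ReadAsStreamAsync();
        using GZipStream gzip = new(body, CompressionMode.Decompress);
        using JsonDocument doc = await JsonDocument.ParseAsync(gzip);
        Assert.IsTrue(doc.RootElement.ValueKind is JsonValueKind.Object or JsonValueKind.Array);
    }
}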

aaronburtle self-assigned this Dec 15, 2025
@anushakolan
Contributor

Can we add an integration test to verify that this compression actually reduces payload sizes and does not break the existing code?

Contributor

@anushakolan anushakolan left a comment


LGTM, but waiting on the integration test to be added.

}
catch (Exception ex)
{
    _logger.LogError("Failed to update RuntimeConfig.Compression with exception message: {exceptionMessage}.", ex.Message);
Collaborator


Suggested change
    _logger.LogError("Failed to update RuntimeConfig.Compression with exception message: {exceptionMessage}.", ex.Message);
    _logger.LogError("Failed to configure RuntimeConfig.Compression with exception message: {exceptionMessage}.", ex.Message);

Contributor Author


Updated error message to use "configure" instead of "update" for consistency. Fixed in commit c53261c.

@anushakolan
Contributor

@copilot Please address this: can we add an integration test to verify that this compression actually reduces payload sizes and does not break the existing code?

…ing, and add compression integration tests

Co-authored-by: anushakolan <45540936+anushakolan@users.noreply.github.com>
Contributor Author

Copilot AI commented Jan 16, 2026

@copilot Please address this: can we add an integration test to verify that this compression actually reduces payload sizes and does not break the existing code?

Added 7 integration tests that verify compression reduces payload sizes and doesn't break existing functionality. The tests validate:

  • Compression works with both Gzip and Brotli
  • Payload size reduction is measured and verified (compression achieves 10%+ reduction)
  • Compression can be disabled (level=None)
  • Content can be decompressed correctly
  • Different compression levels work as expected

All tests pass successfully. The integration test that measures payload reduction shows actual compression ratios in the console output. Fixed in commit c53261c.

