Merged

26 commits
ca0f195
ci(release): update release configuration
disafronov Dec 8, 2025
38e7522
chore(release): add semantic release configuration
disafronov Dec 8, 2025
4fe0e1b
chore(release): update release configuration format
disafronov Dec 8, 2025
3eb842a
chore(release): format adjustments in release configuration
disafronov Dec 8, 2025
6ae8f86
refactor(schema): rename SchemaLeaf to _SchemaLeaf for internal use
disafronov Dec 8, 2025
b9e9594
refactor(schema): rename CompiledSchema to _CompiledSchema for intern…
disafronov Dec 8, 2025
cbc430a
refactor(schema): rename SchemaProblem to _SchemaProblem for internal…
disafronov Dec 8, 2025
c5a1a9b
refactor(schema): rename DataProblem to _DataProblem for internal use
disafronov Dec 8, 2025
4887cd7
refactor(schema): rename get_builtin_logrecord_attributes to _get_bui…
disafronov Dec 8, 2025
6c3efea
refactor(schema): rename SCHEMA_FILE_NAME to _SCHEMA_FILE_NAME for in…
disafronov Dec 8, 2025
64c568b
test(schema_applier): add comprehensive tests for schema_applier func…
disafronov Dec 8, 2025
2b103af
test(errors): add comprehensive tests for _DataProblem and _SchemaPro…
disafronov Dec 8, 2025
81ba6ed
test(schema_applier): add tests for _create_validation_error_json fun…
disafronov Dec 8, 2025
899f8eb
test(schema_logger): add tests for _log_schema_problems_and_exit func…
disafronov Dec 8, 2025
cf783df
test: rename tests for public API
disafronov Dec 8, 2025
168fc48
test: move tests for private API to subdirectory
disafronov Dec 8, 2025
36eb478
test: rename tests for public API
disafronov Dec 8, 2025
386923a
test: reorganize tests and add edge case coverage
disafronov Dec 8, 2025
65a560c
refactor(tests): move _write_schema function to helpers module
disafronov Dec 8, 2025
fed6756
test(schema_loader): add test for cached result during exception hand…
disafronov Dec 8, 2025
150e29b
fix(schema_loader): improve caching logic for missing schema file paths
disafronov Dec 8, 2025
7a866b2
chore(coverage): modify coverage options and enhance configuration
disafronov Dec 8, 2025
333f2b1
chore(release): 0.1.4-rc.1
semantic-release-bot Dec 8, 2025
f75e707
chore(deps): bump actions/checkout from 4 to 6
dependabot[bot] Dec 8, 2025
ffa6041
chore(deps): bump actions/github-script from 7 to 8
dependabot[bot] Dec 8, 2025
3761491
docs(schema_loader): enhance thread-safety in schema compilation check
disafronov Dec 8, 2025
4 changes: 2 additions & 2 deletions .github/workflows/auto-pr-description.yml
Original file line number Diff line number Diff line change
@@ -18,7 +18,7 @@ jobs:
if: github.event.pull_request.head.ref == 'main'

steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6

- name: Generate PR description
id: gen
@@ -28,7 +28,7 @@ jobs:
api_token: ${{ secrets.GITHUB_TOKEN }}

- name: Force overwrite PR title and body
uses: actions/github-script@v7
uses: actions/github-script@v8
env:
PR_BODY: ${{ steps.gen.outputs.pull_request_description }}
with:
33 changes: 19 additions & 14 deletions .releaserc.json → .releaserc.cjs
@@ -1,4 +1,4 @@
{
module.exports = {
"branches": [
"release",
{
@@ -10,26 +10,31 @@
["@semantic-release/commit-analyzer", {
"preset": "conventionalcommits",
"releaseRules": [
{ "type": "feat", "release": "minor" },
{ "type": "fix", "release": "patch" },
{ "type": "perf", "release": "patch" },
{ "type": "revert", "release": "patch" },
{ "type": "refactor", "release": "patch" },
{ "type": "docs", "release": false },
{ "type": "style", "release": false },
{ "type": "test", "release": false },
{ "type": "build", "release": false },
{ "type": "ci", "release": false },
{ "type": "chore", "release": false }
{ "type": "feat", "release": "minor" },
{ "type": "fix", "release": "patch" },
{ "type": "perf", "release": "patch" },
{ "type": "revert", "release": "patch" },
{ "type": "refactor", "release": "patch" },
{ "type": "docs", "release": false },
{ "type": "style", "release": false },
{ "type": "test", "release": false },
{ "type": "build", "release": false },
{ "type": "ci", "release": false },
{ "type": "chore", "release": false }
]
}],
["@semantic-release/release-notes-generator", { "preset": "conventionalcommits" }],
["@semantic-release/exec", {
"prepareCmd": "node -e \"const fs=require('fs'),toml=require('@iarna/toml');let version='${nextRelease.version}';version=version.replace(/-rc\\\\./g,'rc');const pyprojectFile='pyproject.toml';const pyprojectData=toml.parse(fs.readFileSync(pyprojectFile,'utf8'));const packageName=pyprojectData.project.name;pyprojectData.project.version=version;fs.writeFileSync(pyprojectFile,toml.stringify(pyprojectData));const uvLockFile='uv.lock';const uvLockData=toml.parse(fs.readFileSync(uvLockFile,'utf8'));const packageIndex=uvLockData.package.findIndex(p=>p.name===packageName);if(packageIndex!==-1){uvLockData.package[packageIndex].version=version;fs.writeFileSync(uvLockFile,toml.stringify(uvLockData));}\""
}],
["@semantic-release/changelog", {}],
["@semantic-release/git", {
"assets": ["pyproject.toml", "uv.lock"],
"message": "chore(release): ${nextRelease.version}\n\n${nextRelease.notes}\n\nSigned-off-by: Release Bot <noreply@github.com>"
"assets": ["CHANGELOG.md", "pyproject.toml", "uv.lock"],
"message": "chore(release): ${nextRelease.version}\n\n${nextRelease.notes}\n\nSigned-off-by: " +
(process.env.GIT_AUTHOR_NAME || process.env.GIT_COMMITTER_NAME || "Release Bot") +
" <" +
(process.env.GIT_AUTHOR_EMAIL || process.env.GIT_COMMITTER_EMAIL || "noreply@github.com") +
">"
}],
["@semantic-release/github", {}]
]
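The `prepareCmd` one-liner above rewrites the semantic-release version into PEP 440 form before writing it into `pyproject.toml` and `uv.lock`; its key step is replacing `-rc.` with `rc`, which is why the diff below shows `0.1.4rc1` rather than `0.1.4-rc.1`. A minimal Python sketch of that normalization (the function name is illustrative, not from the repo):

```python
def normalize_prerelease(version: str) -> str:
    """Convert a semantic-release pre-release tag (e.g. "0.1.4-rc.1")
    into the PEP 440 form written to pyproject.toml (e.g. "0.1.4rc1")."""
    # str.replace substitutes every occurrence, matching the /g regex
    # used in the JS one-liner
    return version.replace("-rc.", "rc")

print(normalize_prerelease("0.1.4-rc.1"))  # → 0.1.4rc1
print(normalize_prerelease("1.0.0"))       # → 1.0.0 (unchanged)
```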
5 changes: 5 additions & 0 deletions CHANGELOG.md
@@ -0,0 +1,5 @@
## [0.1.4-rc.1](https://github.com/disafronov/python-logging-objects-with-schema/compare/v0.1.3...v0.1.4-rc.1) (2025-12-08)

### Bug Fixes

* **schema_loader:** improve caching logic for missing schema file paths ([150e29b](https://github.com/disafronov/python-logging-objects-with-schema/commit/150e29b6a7e58a1e249eecb6677a82c77ad57eec))
2 changes: 1 addition & 1 deletion Makefile
@@ -1,6 +1,6 @@
# Variables
PYTEST_CMD = uv run python -m pytest -v
COVERAGE_OPTS = --cov=. --cov-report=term-missing --cov-report=html
COVERAGE_OPTS = --cov --cov-report=term-missing --cov-report=html

# Phony targets
.PHONY: all clean format help install lint test test-coverage
22 changes: 21 additions & 1 deletion pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "uv_build"

[project]
name = "logging-objects-with-schema"
version = "0.1.3"
version = "0.1.4rc1"
description = "Proxy logging wrapper that validates extra fields against a JSON schema."
readme = "README.md"
requires-python = ">=3.10"
@@ -95,3 +95,23 @@ strict_equality = true
[tool.bandit]
skips = [ "B101", "B601" ]
exclude_dirs = [ ".venv", "__pycache__", ".git", "htmlcov" ]

[tool.coverage.run]
source = [ "src" ]
branch = true

[tool.coverage.report]
exclude_lines = [
"def __repr__",
"if self\\.debug",
"raise AssertionError",
"raise NotImplementedError",
"if 0:",
"if __name__ == .__main__.:",
"@(abc\\.)?abstractmethod"
]
show_missing = true
skip_covered = false

[tool.coverage.html]
directory = "htmlcov"
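The `exclude_lines` entries added above are regular expressions that coverage.py searches for within each source line; the unescaped `.` in `if __name__ == .__main__.:` deliberately matches either quote style. A quick illustrative check of how these patterns behave (not part of the repo):

```python
import re

# a few of the patterns from [tool.coverage.report] exclude_lines
patterns = [
    "raise NotImplementedError",
    "if __name__ == .__main__.:",
    "@(abc\\.)?abstractmethod",
]

def is_excluded(line: str) -> bool:
    """coverage.py excludes a line when any pattern is found within it."""
    return any(re.search(p, line) for p in patterns)

print(is_excluded('if __name__ == "__main__":'))  # True ('.' matches the quote)
print(is_excluded("@abc.abstractmethod"))         # True (optional 'abc.' prefix)
print(is_excluded("return value"))                # False
```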
14 changes: 11 additions & 3 deletions src/logging_objects_with_schema/errors.py
@@ -6,9 +6,13 @@


@dataclass
class SchemaProblem:
class _SchemaProblem:
"""Describes a single problem encountered while loading the schema.

This class is part of the internal implementation and is not considered
a public API. Its signature and behaviour may change between releases
without preserving backward compatibility.

This class is used to report schema validation errors during schema
compilation. Schema problems are fatal: if any are detected during
logger initialization, the application is terminated after logging
@@ -36,12 +40,16 @@ class SchemaProblem:


@dataclass
class DataProblem:
class _DataProblem:
"""Describes a single problem encountered while validating log data.

This class is part of the internal implementation and is not considered
a public API. Its signature and behaviour may change between releases
without preserving backward compatibility.

This class is used to report validation errors when applying the compiled
schema to user-provided ``extra`` fields during logging. Unlike
:class:`SchemaProblem`, data problems are not fatal: they are collected
:class:`_SchemaProblem`, data problems are not fatal: they are collected
and logged as ERROR messages *after* the main log record has been emitted,
ensuring 100% compatibility with standard logger behavior.

48 changes: 24 additions & 24 deletions src/logging_objects_with_schema/schema_applier.py
@@ -11,8 +11,8 @@
from collections.abc import Mapping, MutableMapping
from typing import Any

from .errors import DataProblem
from .schema_loader import CompiledSchema, SchemaLeaf
from .errors import _DataProblem
from .schema_loader import _CompiledSchema, _SchemaLeaf


def _create_validation_error_json(field: str, error: str, value: Any) -> str:
@@ -45,7 +45,7 @@ def _validate_list_value(
value: list,
source: str,
item_expected_type: type | None,
) -> DataProblem | None:
) -> _DataProblem | None:
"""Validate that a list value matches the expected item type.

Validates that all elements in the list have the exact type declared by
@@ -58,11 +58,11 @@
for list-typed leaves.

Returns:
DataProblem if validation fails, None if validation succeeds.
_DataProblem if validation fails, None if validation succeeds.
"""
if item_expected_type is None:
error_msg = "is a list but has no item type configured"
return DataProblem(_create_validation_error_json(source, error_msg, value))
return _DataProblem(_create_validation_error_json(source, error_msg, value))

if len(value) == 0:
# Empty lists are always valid
@@ -83,7 +83,7 @@
f"expected all elements to be of type "
f"{item_expected_type.__name__}"
)
return DataProblem(_create_validation_error_json(source, error_msg, value))
return _DataProblem(_create_validation_error_json(source, error_msg, value))

return None

@@ -124,11 +124,11 @@


def _validate_and_apply_leaf(
leaf: SchemaLeaf,
leaf: _SchemaLeaf,
value: Any,
source: str,
extra: MutableMapping[str, Any],
problems: list[DataProblem],
problems: list[_DataProblem],
) -> None:
"""Validate a value against a schema leaf and apply it if valid.

@@ -152,7 +152,7 @@
f"expected {leaf.expected_type.__name__}"
)
problems.append(
DataProblem(_create_validation_error_json(source, error_msg, value))
_DataProblem(_create_validation_error_json(source, error_msg, value))
)
return

@@ -203,27 +203,27 @@ def _strip_empty(node: Any) -> Any:


def _apply_schema_internal(
compiled: CompiledSchema,
compiled: _CompiledSchema,
extra_values: Mapping[str, Any],
) -> tuple[dict[str, Any], list[DataProblem]]:
) -> tuple[dict[str, Any], list[_DataProblem]]:
"""Internal function to build structured ``extra`` from compiled schema.

The function applies a :class:`CompiledSchema` to user-provided ``extra``
The function applies a :class:`_CompiledSchema` to user-provided ``extra``
values and returns a tuple ``(structured_extra, problems)`` where:

- ``structured_extra`` is a nested dictionary that follows the schema
structure and contains only fields that passed validation;
- ``problems`` is a list of :class:`DataProblem` describing all data
- ``problems`` is a list of :class:`_DataProblem` describing all data
issues observed during processing.

Behaviour summary:

- If the compiled schema is effectively empty (no valid leaves),
all fields from ``extra_values`` are treated as redundant: the returned
payload is empty, and a :class:`DataProblem` is created for each field.
payload is empty, and a :class:`_DataProblem` is created for each field.
- For each ``source`` mentioned in the schema when there are valid leaves:
- if the source is missing from ``extra_values``, it is silently skipped;
- if the corresponding value is ``None``, a ``DataProblem`` is recorded
- if the corresponding value is ``None``, a ``_DataProblem`` is recorded
and the value is not written to the payload.
- Type checks are strict: the runtime type must exactly match the declared
Python type (``type(value) is leaf.expected_type``). This prevents
Expand All @@ -233,17 +233,17 @@ def _apply_schema_internal(
- all elements must have the exact type declared by the leaf
``item_expected_type`` (for example, list[str], list[int]);
- non-primitive elements and elements of a different primitive type are
rejected with a ``DataProblem`` and the list value is not written.
rejected with a ``_DataProblem`` and the list value is not written.
- Redundant fields from ``extra_values`` (not referenced by any leaf
``source``) are always reported as problems: each such field generates
a :class:`DataProblem` indicating that it is not defined in the schema.
a :class:`_DataProblem` indicating that it is not defined in the schema.
- A single ``source`` may be used by multiple leaves. The value is
validated independently for each leaf and written only to locations
where the type matches; mismatched locations produce ``DataProblem``
where the type matches; mismatched locations produce ``_DataProblem``
entries, but do not affect successful locations.

The function itself does not raise exceptions; it only accumulates
:class:`DataProblem` instances for the caller to handle.
:class:`_DataProblem` instances for the caller to handle.

Performance considerations:
Time complexity is O(n + m) where n is the number of leaves in the
@@ -269,18 +269,18 @@
change between releases without preserving backward compatibility.

Returns:
Tuple of (structured_extra, list[DataProblem]).
Tuple of (structured_extra, list[_DataProblem]).
"""
extra: dict[str, Any] = {}
problems: list[DataProblem] = []
problems: list[_DataProblem] = []

# Group leaves by source field name. This is necessary because a single source
# can be referenced by multiple leaves (allowing the same value to appear in
# different locations in the output structure). Grouping allows us to process
# all leaves for a given source together, which is more efficient and allows
# us to validate the value once per source (e.g., checking for None) rather
# than once per leaf.
source_to_leaves: dict[str, list[SchemaLeaf]] = defaultdict(list)
source_to_leaves: dict[str, list[_SchemaLeaf]] = defaultdict(list)
for leaf in compiled.leaves:
source_to_leaves[leaf.source].append(leaf)

@@ -304,7 +304,7 @@
if value is None:
error_msg = "is None"
problems.append(
DataProblem(_create_validation_error_json(source, error_msg, None))
_DataProblem(_create_validation_error_json(source, error_msg, None))
)
continue

@@ -329,7 +329,7 @@
for key in redundant_keys:
error_msg = "is not defined in schema"
problems.append(
DataProblem(
_DataProblem(
_create_validation_error_json(key, error_msg, extra_values[key])
)
)
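The behaviour summary in the `_apply_schema_internal` docstring — grouping leaves by source, silently skipping missing sources, flagging `None` values, and reporting redundant keys — can be sketched as a much-simplified mirror. Here each leaf is a `(source, target_key, expected_type)` tuple and problems are plain strings; the real `_SchemaLeaf` and `_DataProblem` types differ:

```python
from collections import defaultdict
from typing import Any

def apply_schema(leaves: list[tuple[str, str, type]],
                 extra_values: dict[str, Any]) -> tuple[dict, list[str]]:
    """Simplified mirror of _apply_schema_internal."""
    extra: dict[str, Any] = {}
    problems: list[str] = []

    # group leaves by source so each value is checked once per source
    source_to_leaves: dict[str, list[tuple[str, type]]] = defaultdict(list)
    for source, target, expected in leaves:
        source_to_leaves[source].append((target, expected))

    for source, targets in source_to_leaves.items():
        if source not in extra_values:
            continue  # missing sources are silently skipped
        value = extra_values[source]
        if value is None:
            problems.append(f"{source} is None")
            continue
        for target, expected in targets:
            if type(value) is expected:  # strict type match
                extra[target] = value
            else:
                problems.append(f"{source} has type {type(value).__name__}")

    # fields not referenced by any leaf are redundant
    for key in extra_values:
        if key not in source_to_leaves:
            problems.append(f"{key} is not defined in schema")

    return extra, problems

payload, problems = apply_schema(
    [("user", "ctx.user", str), ("count", "ctx.count", int)],
    {"user": "alice", "count": None, "stray": 1},
)
print(payload)   # {'ctx.user': 'alice'}
print(problems)  # ['count is None', 'stray is not defined in schema']
```

Note that, as in the docstring, the function never raises: all issues are accumulated in `problems` for the caller to handle.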