
Conversation

@ScSteffen (Contributor) commented on Feb 10, 2026

PR: Fix QoI/output correctness and probing geometry in SNSolverHPC

Summary

This branch fixes multiple correctness bugs in src/solvers/snsolver_hpc.cpp affecting MPI reductions, RK2 consistency, probing geometry, restart-time continuity, and lattice perimeter tagging.

Why

master had several issues that could skew reported QoIs or produce inconsistent behavior across run modes:

  • MPI guards used BUILD_MPI in places where the project defines IMPORT_MPI.
  • MAX_OUTFLOW was not reduced globally across ranks.
  • RMS_FLUX under MPI used rank-local partial fluxes, making values rank-count dependent.
  • In RK2, _scalarFlux could be out of sync with the final Heun-averaged _sol used for postprocessing.
  • VAR_ABSORPTION_GREEN screen output used the wrong field.
  • Green block probing rectangles were hardcoded around the origin and ignored the configured green-center offsets.
  • Probing-line generation could hit invalid point-count cases (a zero denominator in linspace2D and layout assumptions that require an even probe count).
  • Restarted runs reset SIM_TIME to zero instead of continuing from the loaded iteration.
  • Lattice perimeter checks had malformed abs(bool) logic.

What changed

File changed: src/solvers/snsolver_hpc.cpp

  1. MPI guard consistency (sketch below)
  • Replaced #ifdef BUILD_MPI with #ifdef IMPORT_MPI in this file so MPI barriers/reductions compile and run under the project's actual MPI macro.
  2. Global max outflow across ranks (sketch below)
  • Added MPI_Allreduce(..., MPI_MAX) for _curMaxOrdinateOutflow in IterPostprocessing().
  3. Correct RMS flux computation under MPI (sketch below)
  • FVMUpdate() snapshots the previous global scalar flux.
  • RMS is now computed from the updated global scalar flux minus the previous global scalar flux.
  4. RK2 scalar-flux consistency (sketch below)
  • After Heun averaging of _sol, _scalarFlux is recomputed (and allreduced under MPI) before postprocessing.
  • This ensures QoIs are computed from the final RK2 state.
  5. Correct screen field for green absorption variance
  • VAR_ABSORPTION_GREEN now outputs _varAbsorptionHohlraumGreen instead of _absorptionValsBlocksGreen[0].
  6. Green probing geometry alignment (sketch below)
  • Shifted the green block probing rectangles by _centerGreen (cx, cy) so block QoIs track the configured center offsets.
  7. Probe-count safety guards (sketch below)
  • Added assert( num_points > 1 ) in linspace2D.
  • Added asserts in SetProbingCellsLineGreen() for an even total line probe count and minimum per-side counts.
  • Added #include <cassert>.
  8. Restart simulation-time continuity (sketch below)
  • Initialize _curSimTime from _idx_start_iter * _dT in Solve() so resumed runs continue the timeline correctly.
  9. Lattice perimeter logic fix (sketch below)
  • Corrected the malformed perimeter comparisons and tightened the branch structure (else if) to avoid incorrect overlap/classification.
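
Illustrative sketches

The snippets below are simplified sketches of the changes above, not verbatim diff hunks. Member and function names quoted in this description (_curMaxOrdinateOutflow, _scalarFlux, _sol, _centerGreen, _curSimTime, _idx_start_iter, _dT, IterPostprocessing(), FVMUpdate(), Solve(), linspace2D, SetProbingCellsLineGreen()) are taken from the PR text; all other names, bounds, and temporaries are illustrative assumptions.

Items 1-2, MPI guard and global max outflow: a minimal sketch of the reduction in IterPostprocessing(), assuming _curMaxOrdinateOutflow is a rank-local double.

```cpp
#ifdef IMPORT_MPI    // previously guarded by BUILD_MPI, which this build never defines
    double localMaxOutflow = _curMaxOrdinateOutflow;
    MPI_Allreduce( &localMaxOutflow, &_curMaxOrdinateOutflow, 1, MPI_DOUBLE, MPI_MAX, MPI_COMM_WORLD );
#endif
```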
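
Item 3, RMS_FLUX under MPI: a sketch of the rank-count-independent RMS computation, assuming the scalar flux is assembled globally before and after the update; the temporaries prevGlobalScalarFlux, sumSq, and rmsFlux are illustrative.

```cpp
std::vector<double> prevGlobalScalarFlux = _scalarFlux;    // snapshot of the previous global flux
// ... FVM update and global reduction of the new scalar flux into _scalarFlux ...
double sumSq = 0.0;
for( size_t j = 0; j < _scalarFlux.size(); ++j ) {
    double diff = _scalarFlux[j] - prevGlobalScalarFlux[j];
    sumSq += diff * diff;
}
double rmsFlux = std::sqrt( sumSq / double( _scalarFlux.size() ) );    // requires <cmath>
```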
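
Item 4, RK2 scalar-flux consistency: a sketch of the post-averaging recomputation; _solRK0 and ComputeScalarFlux() are illustrative names for the stage-0 copy and the flux rebuild, and the reduction operator depends on how cells are partitioned across ranks.

```cpp
// Heun (RK2) average of the stage-0 state and the stage-1 update
for( size_t idx = 0; idx < _sol.size(); ++idx ) {
    _sol[idx] = 0.5 * ( _solRK0[idx] + _sol[idx] );
}

ComputeScalarFlux();    // rebuild _scalarFlux from the averaged _sol
#ifdef IMPORT_MPI
    // assemble the global scalar flux so postprocessing sees the final RK2 state
    MPI_Allreduce( MPI_IN_PLACE, _scalarFlux.data(), int( _scalarFlux.size() ), MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD );
#endif
IterPostprocessing();    // QoIs now derive from the Heun-averaged solution
```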
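
Item 6, green block probing alignment: a sketch of the rectangle membership test translated by the configured green center; the bounds xLower/xUpper/yLower/yUpper and the cell-center coordinates (x, y) are illustrative.

```cpp
double cx = _centerGreen[0];    // configured x-offset of the green block center
double cy = _centerGreen[1];    // configured y-offset of the green block center

// a cell belongs to the probing rectangle if it lies inside the rectangle
// translated by (cx, cy), instead of one hardcoded around the origin
bool insideGreenBlock = ( x > xLower + cx ) && ( x < xUpper + cx ) &&
                        ( y > yLower + cy ) && ( y < yUpper + cy );
```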
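
Item 7, probe-count safety guards: a self-contained sketch of the linspace2D guard; the real signature and return type in snsolver_hpc.cpp may differ.

```cpp
#include <array>
#include <cassert>
#include <vector>

std::vector<std::array<double, 2>> linspace2D( const std::array<double, 2>& start, const std::array<double, 2>& end, unsigned num_points ) {
    assert( num_points > 1 );    // the step below divides by ( num_points - 1 )
    std::vector<std::array<double, 2>> points( num_points );
    double dx = ( end[0] - start[0] ) / double( num_points - 1 );
    double dy = ( end[1] - start[1] ) / double( num_points - 1 );
    for( unsigned i = 0; i < num_points; ++i ) {
        points[i] = { start[0] + i * dx, start[1] + i * dy };
    }
    return points;
}
```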
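
Item 8, restart-time continuity: a one-line sketch at the start of Solve(), assuming _idx_start_iter holds the iteration index loaded from the restart file.

```cpp
_curSimTime = _idx_start_iter * _dT;    // resume the physical timeline instead of restarting at t = 0
```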
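
Item 9, lattice perimeter tagging: a sketch of the corrected comparisons; the coordinates and the tolerance eps are illustrative. Applying abs() to a boolean comparison collapses it to 0 or 1 and cannot express "within eps of the boundary", which is what the fixed code tests.

```cpp
// requires <cmath> for std::abs on doubles
bool onLeftEdge   = std::abs( x - xLeft )   < eps && y >= yBottom && y <= yTop;
bool onRightEdge  = std::abs( x - xRight )  < eps && y >= yBottom && y <= yTop;
bool onBottomEdge = std::abs( y - yBottom ) < eps && x >= xLeft   && x <= xRight;
bool onTopEdge    = std::abs( y - yTop )    < eps && x >= xLeft   && x <= xRight;

if( onLeftEdge )        { /* tag cell as left-perimeter outflow face */ }
else if( onRightEdge )  { /* tag cell as right-perimeter outflow face */ }
else if( onBottomEdge ) { /* tag cell as bottom-perimeter outflow face */ }
else if( onTopEdge )    { /* tag cell as top-perimeter outflow face */ }
```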

Behavioral impact

  • QoIs are now consistent across MPI rank counts for MAX_OUTFLOW, RMS_FLUX, and MOMENTS reductions.
  • RK2 postprocessing uses fluxes consistent with final averaged state.
  • Green block/line probing follows configured center offsets and enforces safe sampling assumptions.
  • Restarted runs report continuous simulation time.
  • Lattice perimeter-derived outflow metrics better match the intended geometry.

Risk / compatibility

  • No public API changes.
  • Changes are confined to solver internals and output/postprocessing behavior.
  • The new assertions are compiled out under NDEBUG, so release builds will not terminate on these guard violations unless the asserts are converted to runtime errors.

Validation

  • Manual code-path validation via targeted review.
  • No automated test run included in this branch.

@ScSteffen ScSteffen added the bug Something isn't working label Feb 10, 2026
@ScSteffen ScSteffen self-assigned this Feb 10, 2026
@ScSteffen ScSteffen changed the title fix lattice perimeter definitions Bugfixes in HPC SN Solver Feb 10, 2026