Produce FundingInfo::Contribution variants in ChannelMonitor#4498
wpaulino wants to merge 2 commits into lightningdevkit:main
Conversation
wpaulino
commented
Mar 19, 2026
Exposing the amounts for each output isn't very helpful because they may vary across multiple splice candidates due to RBF. This commit changes `FundingInfo::Contribution` and several of the helpers used to derive it to be based on output scripts instead.
Similar to the `ChannelManager`, we expose the contributed inputs and outputs of a splice via `FundingInfo::Contribution` at the `ChannelMonitor` level such that we don't lose the context when the channel closes while a splice is still pending.
```rust
let (mut discarded_inputs, mut discarded_outputs) = (new_hash_set(), new_hash_set());
for funding in discarded_funding {
	if funding.contributed_inputs.is_none() && funding.contributed_outputs.is_none() {
		self.pending_events.push(Event::DiscardFunding {
			channel_id: self.channel_id,
			funding_info: FundingInfo::OutPoint { outpoint: funding.funding_outpoint() },
		});
	} else {
		// Filter out any inputs/outputs that were already included in the locked funding
		// transaction.
		if let Some(mut maybe_discarded_inputs) = funding.contributed_inputs {
			maybe_discarded_inputs.retain(|input| {
				let is_input_in_locked_funding = self
					.funding
					.contributed_inputs
					.as_ref()
					.map(|inputs| inputs.contains(input))
					// The recently locked funding did not contain any contributed inputs.
					.unwrap_or(false);
				!is_input_in_locked_funding
			});
			discarded_inputs.extend(maybe_discarded_inputs);
		}
		if let Some(mut maybe_discarded_outputs) = funding.contributed_outputs {
			maybe_discarded_outputs.retain(|output| {
				let is_output_in_locked_funding = self
					.funding
					.contributed_outputs
					.as_ref()
					.map(|outputs| outputs.contains(output))
					// The recently locked funding did not contain any contributed outputs.
					.unwrap_or(false);
				!is_output_in_locked_funding
			});
			discarded_outputs.extend(maybe_discarded_outputs);
		}
	}
}
if !discarded_inputs.is_empty() || !discarded_outputs.is_empty() {
	self.pending_events.push(Event::DiscardFunding {
		channel_id: self.channel_id,
		funding_info: FundingInfo::Contribution {
			inputs: discarded_inputs.into_iter().collect(),
			outputs: discarded_outputs.into_iter().collect(),
```
The `discarded_inputs` and `discarded_outputs` `HashSet`s use `RandomState`, so when collected into `Vec`s at lines 4146-4147, the element ordering is non-deterministic across runs. This means `FundingInfo::Contribution { inputs, outputs }` can have different orderings each time.

Currently this is likely safe because the tests only exercise single-element cases, but any test with multiple contributed inputs or outputs and an `assert_eq!` on the `Contribution` variant would be flaky. Consider either:

- Sorting the `Vec`s before constructing the event, or
- Documenting that the ordering is unspecified and tests should use order-independent assertions (e.g., convert to a set before comparing).
```rust
contributed_inputs: Some(contributed_inputs.to_vec()),
contributed_outputs: Some(contributed_outputs.to_vec()),
```
These lines unconditionally wrap in `Some(...)`, even when the slices are empty. This creates a semantic difference from `None`:

- `None` → `queue_discard_funding_event` emits `FundingInfo::OutPoint` (fallback for unknown contributions)
- `Some(vec![])` → goes to the `else` branch, accumulates nothing, and no event is emitted

This is correct for the non-contributor case (nothing to discard). However, if a `RenegotiatedFunding` monitor update from a prior LDK version (without fields 7/9) is replayed after upgrade, `optional_vec` deserialization yields empty `Vec`s, which become `Some(vec![])` here. The resulting funding would silently produce no `DiscardFunding` event when discarded, instead of the `OutPoint` fallback.

If splicing hasn't shipped in a stable release yet, this is a non-issue in practice. But for defensive correctness, consider normalizing:
```diff
-contributed_inputs: Some(contributed_inputs.to_vec()),
-contributed_outputs: Some(contributed_outputs.to_vec()),
+contributed_inputs: if contributed_inputs.is_empty() { None } else { Some(contributed_inputs.to_vec()) },
+contributed_outputs: if contributed_outputs.is_empty() { None } else { Some(contributed_outputs.to_vec()) },
```
This would cause non-contributors to get a (harmless) `OutPoint` event, so it's a tradeoff. At minimum, a comment explaining the `Some(vec![])` vs `None` distinction in `queue_discard_funding_event` would help future readers.
```rust
) {
	let (mut discarded_inputs, mut discarded_outputs) = (new_hash_set(), new_hash_set());
	for funding in discarded_funding {
		if funding.contributed_inputs.is_none() && funding.contributed_outputs.is_none() {
```
Nit: consider adding a comment clarifying the `None` vs `Some(vec![])` semantics, since they produce different behavior:

- Both `None` → fallback `FundingInfo::OutPoint` event (contribution data unavailable)
- Either `Some(...)` → `Contribution` path (tracked but possibly empty after filtering → no event if nothing remains)
This distinction is intentional but non-obvious to a reader.
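A self-contained sketch of the branching described above (simplified; `event_kind` and the `u32` stand-ins are hypothetical, not LDK API):

```rust
// Simplified model of the None vs Some(vec![]) distinction described in the
// review comments. `u32` stands in for the real input/output types.
fn event_kind(inputs: Option<Vec<u32>>, outputs: Option<Vec<u32>>) -> &'static str {
	if inputs.is_none() && outputs.is_none() {
		// Contribution data unavailable: fall back to an OutPoint-style event.
		return "OutPoint fallback";
	}
	// Tracked contributions: count what remains after (elided) filtering.
	let remaining = inputs.map_or(0, |v| v.len()) + outputs.map_or(0, |v| v.len());
	if remaining == 0 { "no event" } else { "Contribution" }
}

fn main() {
	assert_eq!(event_kind(None, None), "OutPoint fallback");
	assert_eq!(event_kind(Some(vec![]), Some(vec![])), "no event");
	assert_eq!(event_kind(Some(vec![1]), None), "Contribution");
}
```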
Codecov Report

❌ Patch coverage is

Additional details and impacted files:

```
@@            Coverage Diff             @@
##             main    #4498      +/-   ##
==========================================
+ Coverage   86.19%   86.20%   +0.01%
==========================================
  Files         161      161
  Lines      107459   107547      +88
  Branches   107459   107547      +88
==========================================
+ Hits        92621    92712      +91
+ Misses      12219    12216       -3
  Partials     2619     2619
```