Finally, LibConvert -> applyStalkPenalty and LibConvert -> calculateConvertCapacityPenalty use this value to determine the overall convert capacity and the penalty fee:
The `readCappedReserves` function works in the following manner:
- If the time elapsed since the last update is greater than 0, it caps the current reserves and stores the result.
- If the elapsed time is 0, it returns the previously stored capped reserves unchanged.
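The gating described above can be sketched as follows. This is a minimal illustration, not Basin's actual implementation; the class name, the ±5% cap, and the reserve layout are all assumptions for demonstration purposes.

```python
# Sketch (NOT Basin's real code) of timestamp-gated capped reserves:
# reserves are only re-capped when time has elapsed since the last
# update, so every read within the same block sees identical values.
class CappedReservesSketch:
    def __init__(self, reserves, last_update_ts):
        self.reserves = list(reserves)          # true (manipulable) reserves
        self.capped_reserves = list(reserves)   # last stored capped reading
        self.last_update_ts = last_update_ts

    def _cap(self, current):
        # Illustrative +/-5% cap per update window; the real pump
        # uses its own parameters.
        capped = []
        for prev, cur in zip(self.capped_reserves, current):
            lo, hi = prev * 95 // 100, prev * 105 // 100
            capped.append(min(max(cur, lo), hi))
        return capped

    def read_capped_reserves(self, now_ts):
        delta_t = now_ts - self.last_update_ts
        if delta_t == 0:
            # Same timestamp: return stored capped reserves unchanged,
            # so a same-block price move cannot shift the reading.
            return list(self.capped_reserves)
        # Time has passed: cap the current reserves and store the result.
        self.capped_reserves = self._cap(self.reserves)
        self.last_update_ts = now_ts
        return list(self.capped_reserves)
```

Note that the manipulation resistance comes entirely from `block.timestamp` advancing between reads, which is the property Beanstalk implicitly relies on below.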
As shown above, Basin uses block.timestamp to return the capped reserves and prevent price manipulation within the same block. Beanstalk, however, works with block.number. Due to this mismatch, Beanstalk is exposed to the following issues when deployed on Arbitrum:
In general, block.number on Arbitrum is not a reliable source of timing information, and the time between blocks differs from Ethereum. Each transaction on Arbitrum is typically placed in its own L2 block, blocks are not produced at a constant rate, and multiple consecutive L2 blocks can report the same block.number.
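A toy model makes this concrete. The fixed 5:1 ratio below is an illustrative assumption, not Arbitrum's exact algorithm; the point is only that many consecutive L2 blocks (each carrying roughly one transaction) collapse onto one reported block.number while block.timestamp would keep advancing.

```python
# Toy model of Arbitrum block numbering: block.number reports an
# approximation derived from L1, so several L2 blocks share one value.
# The 5-blocks-per-number ratio here is illustrative, not the real mapping.
def arbitrum_block_number(l2_block, l2_blocks_per_number=5, start=1000):
    return start + l2_block // l2_blocks_per_number

# Ten consecutive L2 blocks (ten separate transactions) span only
# two distinct block.number values.
numbers = [arbitrum_block_number(b) for b in range(10)]
```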
How to reproduce
Users perform conversions within the same block on Arbitrum, collectively consuming the entire overallConvertCapacity for that block.
Because of Arbitrum's block production and transaction batching, subsequent L2 blocks report the same block.number, allowing more transactions to land under a block.number whose capacity is already spent.
User Transaction Penalized: A user attempts a pipeline conversion in one of the repeated blocks. Despite not exceeding the intended capacity themselves, their transaction is unfairly penalized, and their grown stalk is burned because the capacity was already exhausted by previous transactions.
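The reproduction steps above can be condensed into a small simulation. The names (`used_capacity`, `OVERALL_CAPACITY`, `pipeline_convert`) are illustrative, not Beanstalk's actual storage layout or function signatures; the simulation only shows the accounting consequence of keying capacity on block.number.

```python
# Simulation of convert capacity tracked per block.number: on Arbitrum,
# every L2 block that shares one block.number also shares one budget.
used_capacity = {}        # block.number -> BDV already converted
OVERALL_CAPACITY = 1_000  # capped overall deltaB for the "block"

def pipeline_convert(block_number, bdv):
    """Returns the BDV portion that gets penalized (grown stalk burned)."""
    used = used_capacity.get(block_number, 0)
    remaining = max(OVERALL_CAPACITY - used, 0)
    penalized_bdv = max(bdv - remaining, 0)  # overflow beyond capacity
    used_capacity[block_number] = used + bdv
    return penalized_bdv

# Two distinct L2 blocks both report block.number == 1000:
first_penalty = pipeline_convert(1000, 1_000)   # earlier txs drain capacity
later_penalty = pipeline_convert(1000, 200)     # later tx, same block.number
```

Here the later user is fully penalized even though, measured against real elapsed time (or distinct L2 blocks), they never exceeded the intended per-block capacity.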
Impact Details
- Loss of funds (grown stalk).
- Users are unfairly penalized when using convert, even when they are pushing BEAN toward its peg.
- The peg will be compromised as users avoid pipeline convert so their grown stalk is not burned.
I talked to @Brean from Beanstalk, and he recommended that I submit the report even though the assets in scope have not yet been updated for Arbitrum. From @Brean: "assume that the assets in scope include the new beanstalk contract and we will work with immunefi."
Due to the urgency of the issue and the nature of the PoC (it is Arbitrum infrastructure-related), I skipped the PoC but provided enough information in the report to make the issue clear.
Thank you for this report. We agree that this is a valid issue; it was resolved in [EBIP-19](https://bean.money/ebip-19). Upon review, we determined that although we discovered and began working on a fix for this issue before this Immunefi report was submitted (by a matter of minutes), the BIC would like to issue a good-faith reward for the report.
The BIC has determined that the severity of this report is "Medium" with an impact of "Contract fails to deliver promised returns, but doesn't lose value." As outlined in the program, for Medium severity reports, the BIC determines a reward between 1k and 10k Beans based on:
The exploitability of the bug;
The impact it causes; and
The likelihood of the vulnerability presenting itself.
Based on these criteria, the BIC has determined that 5,000 Beans be rewarded for this report.
function executePipelineConvert(
address inputToken,
address outputToken,
uint256 fromAmount,
uint256 fromBdv,
uint256 initialGrownStalk,
AdvancedFarmCall[] calldata advancedFarmCalls
) external returns (uint256 toAmount, uint256 newGrownStalk, uint256 newBdv) {
...
// Store the capped overall deltaB, this limits the overall convert power for the block
@> pipeData.overallConvertCapacity = LibConvert.abs(LibDeltaB.overallCappedDeltaB());
...
pipeData.stalkPenaltyBdv = prepareStalkPenaltyCalculation(
inputToken,
outputToken,
pipeData.deltaB,
@> pipeData.overallConvertCapacity,
fromBdv,
pipeData.initialLpSupply
);
...
}
function prepareStalkPenaltyCalculation(
address inputToken,
address outputToken,
LibConvert.DeltaBStorage memory dbs,
uint256 overallConvertCapacity,
uint256 fromBdv,
uint256[] memory initialLpSupply
) public returns (uint256) {
...
return
LibConvert.applyStalkPenalty(
dbs,
fromBdv,
@> overallConvertCapacity,
inputToken,
outputToken
);
}