
Report #35860

Report Date: October 10, 2024
Status: Confirmed
Payout: 5,000 Beans

Penalty Fees for Pipeline Convert Applied Incorrectly

Report Info

Report ID: #35860

Report type: Smart Contract

Has PoC?: Yes

Target: https://etherscan.io/address/0xC1E088fC1323b20BCBee9bd1B9fC9546db5624C5

Impacts

  • Permanent freezing of funds
  • Permanent freezing of unclaimed yield

Description

LibConvert uses block.number to prevent flash loan attacks that could cause price manipulation.

I asked the devs why they decided to use block.number instead of block.timestamp. Here's the reply from @pizzaman1337:

Using block number seemed cleaner than timestamp to me, since the idea was to limit converts on a per-block basis.

The problem with this approach is the following:

  1. Beanstalk determines the "conversion capacity" based on the capped reserves.

LibPipelineConvert reads the overall cappedDeltaB and passes it as overallConvertCapacity into prepareStalkPenaltyCalculation, which in turn passes it to LibConvert.applyStalkPenalty. See the code snippet below: https://github.com/BeanstalkFarms/Beanstalk/blob/fbd73fe05587d5e28ccd37d07b4fee04e968595a/protocol/contracts/libraries/Convert/LibPipelineConvert.sol#L45-L71

Finally, LibConvert.applyStalkPenalty and LibConvert.calculateConvertCapacityPenalty use this value to determine the overall convert capacity and the penalty fee.

readCappedReserves works as follows:

  • If the time delta since the last update is greater than 0, it caps and updates the reserves.
  • If the time delta equals 0, it returns the last stored capped reserves unchanged.

https://github.com/BeanstalkFarms/Basin/blob/830ccd8775d0e53fea0fcdacbcc770459c5575d0/src/pumps/MultiFlowPump.sol#L206-L209

function readCappedReserves(
        address well,
        bytes calldata data
    ) external view returns (uint256[] memory cappedReserves) {
       ...
@>        uint256 deltaTimestamp = _getDeltaTimestamp(lastTimestamp);
@>        if (deltaTimestamp == 0) {
@>            return cappedReserves;
        }
        ...
    }
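The timestamp gating above can be sketched with a minimal Python illustration. This is not Basin's actual code; the function and parameter names are invented, and the capping rule is simplified to a fixed per-reserve bound:

```python
# Hedged sketch of readCappedReserves' behavior: updates are gated on
# elapsed *time*, so a second call with the same timestamp returns the
# stored snapshot and ignores same-block price manipulation.
# All names here are illustrative, not Basin's actual API.

def read_capped_reserves(last_timestamp, last_capped_reserves,
                         now, current_reserves, max_change):
    delta_timestamp = now - last_timestamp
    if delta_timestamp == 0:
        # No time elapsed: return the last capped reserves unchanged.
        return list(last_capped_reserves)
    # Otherwise cap each reserve's movement toward the current value.
    return [
        min(max(cur, prev - max_change), prev + max_change)
        for prev, cur in zip(last_capped_reserves, current_reserves)
    ]

# Same timestamp: a flash-loan-skewed reserve snapshot is ignored.
assert read_capped_reserves(100, [50, 50], 100, [999, 1], 10) == [50, 50]
# One second later: each reserve moves at most +-10 toward the new value.
assert read_capped_reserves(100, [50, 50], 101, [999, 1], 10) == [60, 40]
```

The key point for this report is that the gate is a timestamp delta, not a block-number delta.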

Vulnerability Details

As shown above, Basin works with block.timestamp to return the capped reserves and avoid price manipulation in the same block. But Beanstalk is working with block.number. Due to this approach, Beanstalk is now exposed to the following issues when deployed on Arbitrum:

  • Arbitrum batches transactions into L1 blocks, so multiple Arbitrum blocks can share the same L1 block number. This means the value of block.number can repeat across several Arbitrum blocks, as shown in this article: https://docs.arbitrum.io/build-decentralized-apps/arbitrum-vs-ethereum/block-numbers-and-time

In general, block.number on Arbitrum is not a reliable source of timing information, and block times differ from Ethereum's: each transaction on Arbitrum is placed in a separate block, and blocks are not produced at a constant rate.

How to reproduce

  1. Users perform conversions within the same block on Arbitrum, collectively consuming the entire overallConvertCapacity for that block.
  2. Arbitrum repeats the block multiple times due to transactions batching, allowing more transactions within the same block.number.
  3. A user attempts a pipeline convert in one of the repeated blocks. Despite not exceeding the intended capacity themselves, their transaction is unfairly penalized and their grown stalk is burned, because the capacity was already exhausted by previous transactions.
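The steps above can be simulated with a short Python sketch. The names and numbers (convert, OVERALL_CAPACITY, block numbers) are illustrative, not Beanstalk's actual storage layout; the point is only that capacity accounting keyed by a repeating block.number punishes converts in later Arbitrum blocks:

```python
from collections import defaultdict

OVERALL_CAPACITY = 1_000          # intended per-block convert capacity
used_by_block = defaultdict(int)  # keyed by block.number, as in applyStalkPenalty

def convert(block_number, amount):
    """Return the portion of `amount` that gets penalized."""
    remaining = max(0, OVERALL_CAPACITY - used_by_block[block_number])
    unpenalized = min(amount, remaining)
    used_by_block[block_number] += unpenalized
    return amount - unpenalized

# Step 1: users exhaust the capacity in an Arbitrum block whose
# block.number (the L1-derived number) is 2000.
assert convert(2000, 1_000) == 0
# Steps 2-3: the next Arbitrum block still reports block.number == 2000,
# so an honest convert there is fully penalized despite being in a new block.
assert convert(2000, 100) == 100
```

If the mapping were keyed by a value unique per Arbitrum block, the second convert would see a fresh capacity bucket and incur no penalty.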

Impact Details

  • Loss of funds (grown stalk).
  • Users are penalized unfairly when using convert (even when they are contributing toward BEAN's peg).
  • The peg will be compromised, as users will avoid the pipeline convert so their grown stalk is not burned.

Recommendation

  1. Replace block.number with block.timestamp.
  2. (Alternative) If Beanstalk wants to keep using a block number without issues, ArbSys.arbBlockNumber() can be used: https://docs.arbitrum.io/build-decentralized-apps/precompiles/reference#arbsys
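The difference between the two keys can be shown with a hedged Python sketch (values are illustrative; ArbSys.arbBlockNumber() is modeled as a plain parameter rather than a precompile call):

```python
# On Arbitrum, block.number returns an L1-derived number that repeats
# across L2 blocks, while ArbSys.arbBlockNumber() is the L2 block number
# and increments per Arbitrum block. Keying the convertCapacity mapping
# by the latter keeps the capacity genuinely per-block.

def capacity_key(l1_block_number, arb_block_number, use_arb_number):
    return arb_block_number if use_arb_number else l1_block_number

# Two distinct Arbitrum blocks that share one L1 block number:
# with block.number, both map to the same capacity bucket...
assert capacity_key(2000, 7001, False) == capacity_key(2000, 7002, False)
# ...with arbBlockNumber(), they map to distinct buckets.
assert capacity_key(2000, 7001, True) != capacity_key(2000, 7002, True)
```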

Additional info:

  1. I talked to @Brean from Beanstalk, and he recommended that I submit the report even though the assets in scope are not yet updated for Arbitrum. From @Brean: "assume that the assets in scope include the new beanstalk contract and we will work with immunefi."
  2. Due to the urgency of the issue and the nature of the PoC (it is Arbitrum-infrastructure-related), I skipped the PoC but provided enough information in the report to make the issue clear.

References

OZ audit report - https://blog.openzeppelin.com/spherex-audit#currentblockoriginhash-invalidation-issues-can-lead-to-false-positives

C4 Audit - https://solodit.xyz/issues/m-04-anchortime-will-not-work-properly-on-optimism-due-to-use-of-blocknumber-code4rena-frankencoin-frankencoin-git

BIR-21: L1 block.number

BIC Response

Thank you for this report. We agree that this is a valid issue; it was resolved in [EBIP-19](https://bean.money/ebip-19). Upon review, we determined that although we discovered and began working on a fix for this issue before this Immunefi report was submitted (by a matter of minutes), the BIC would like to issue a good faith reward for the report.

The BIC has determined that the severity of this report is "Medium" with an impact of "Contract fails to deliver promised returns, but doesn't lose value." As outlined in the program, for Medium severity reports, the BIC determines a reward between 1k and 10k Beans based on:

  • The exploitability of the bug;
  • The impact it causes; and
  • The likelihood of the vulnerability presenting itself.

Based on these criteria, the BIC has determined that 5,000 Beans be rewarded for this report.

    function executePipelineConvert(
        address inputToken,
        address outputToken,
        uint256 fromAmount,
        uint256 fromBdv,
        uint256 initialGrownStalk,
        AdvancedFarmCall[] calldata advancedFarmCalls
    ) external returns (uint256 toAmount, uint256 newGrownStalk, uint256 newBdv) {
...
        // Store the capped overall deltaB, this limits the overall convert power for the block
@>        pipeData.overallConvertCapacity = LibConvert.abs(LibDeltaB.overallCappedDeltaB());
...

        pipeData.stalkPenaltyBdv = prepareStalkPenaltyCalculation(
            inputToken,
            outputToken,
            pipeData.deltaB,
@>            pipeData.overallConvertCapacity,
            fromBdv,
            pipeData.initialLpSupply
        );
...
    }

function prepareStalkPenaltyCalculation(
        address inputToken,
        address outputToken,
        LibConvert.DeltaBStorage memory dbs,
        uint256 overallConvertCapacity,
        uint256 fromBdv,
        uint256[] memory initialLpSupply
    ) public returns (uint256) {
...
        return
            LibConvert.applyStalkPenalty(
                dbs,
                fromBdv,
@>                overallConvertCapacity,
                inputToken,
                outputToken
            );
    }

    function applyStalkPenalty(
        DeltaBStorage memory dbs,
        uint256 bdvConverted,
        uint256 overallConvertCapacity,
        address inputToken,
        address outputToken
    ) internal returns (uint256 stalkPenaltyBdv) {
        AppStorage storage s = LibAppStorage.diamondStorage();
        uint256 overallConvertCapacityUsed;
        uint256 inputTokenAmountUsed;
        uint256 outputTokenAmountUsed;


        (
            stalkPenaltyBdv,
            overallConvertCapacityUsed,
            inputTokenAmountUsed,
            outputTokenAmountUsed
        ) = calculateStalkPenalty(
            dbs,
            bdvConverted,
@>            overallConvertCapacity,
            inputToken,
            outputToken
        );


        // Update penalties in storage.
@>        ConvertCapacity storage convertCap = s.sys.convertCapacity[block.number];
@>        convertCap.overallConvertCapacityUsed = convertCap.overallConvertCapacityUsed.add(
            overallConvertCapacityUsed
        );
@>        convertCap.wellConvertCapacityUsed[inputToken] = convertCap
            .wellConvertCapacityUsed[inputToken]
            .add(inputTokenAmountUsed);
@>        convertCap.wellConvertCapacityUsed[outputToken] = convertCap
            .wellConvertCapacityUsed[outputToken]
            .add(outputTokenAmountUsed);
    }

    function calculateConvertCapacityPenalty(
        uint256 overallCappedDeltaB,
        uint256 overallAmountInDirectionOfPeg,
        address inputToken,
        uint256 inputTokenAmountInDirectionOfPeg,
        address outputToken,
        uint256 outputTokenAmountInDirectionOfPeg
    ) internal view returns (uint256 cumulativePenalty, PenaltyData memory pdCapacity) {
        AppStorage storage s = LibAppStorage.diamondStorage();

@>        ConvertCapacity storage convertCap = s.sys.convertCapacity[block.number];
...
        // update overall remaining convert capacity
@>        pdCapacity.overall = convertCap.overallConvertCapacityUsed.add(
            overallAmountInDirectionOfPeg
        );

    }