I was playing around with function selectors, particularly how they are computed (I am using Foundry for the test). I have the following example contract:

contract Token2 {
    uint a = 1; 

    function foo2(uint b) external {
        for (uint i = 0; i < 10; ++i)
            a += b;
    }
}

and in a test:

function testToken2() public {
    console.logBytes4(bytes4(keccak256(bytes("function foo2(uint b) external")))); // [1] 0x2e88fa8a
    console.logBytes(abi.encodeWithSignature("foo2(uint)")); // [2] 0x2e88fa8a
    console.logBytes4(Token2.foo2.selector); // [3] 0x7d5f3b35
}

[1] and [2] return the same thing, 0x2e88fa8a, but [3] returns 0x7d5f3b35. (I am following https://solidity-by-example.org/function-selector/ to calculate the selector manually.)

My question is: why can't I get the same selector as [3]?

user3102158

1 Answer

Two things:

  1. For the ABI, uint is not a canonical type: in Solidity it is just an alias that the compiler expands to uint256, and selectors are always computed from the canonical type name. So the right signature string is "foo2(uint256)".
  2. The selector encodes only the function name and the canonical parameter types; parameter names, visibility, and return types are not part of it. So the first one should also be simply "foo2(uint256)" rather than the full declaration (see the illustration right after this list).
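
For instance, applying the same stripping rules to a hypothetical, more involved declaration (the function below is made up purely to illustrate which parts are dropped):

    // declaration:         function bar(address to, uint amount) external returns (bool)
    // canonical signature: "bar(address,uint256)"   <- name + canonical parameter types only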

TL;DR: this is how you get all three of them to match:

    console.logBytes4(bytes4(keccak256(bytes("foo2(uint256)")))); // [1] 0x7d5f3b35
    console.logBytes(abi.encodeWithSignature("foo2(uint256)")); // [2] 0x7d5f3b35
    console.logBytes4(Token2.foo2.selector); // [3] 0x7d5f3b35
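
To see why the exact string matters, here is a minimal Foundry sketch (a hypothetical test, assuming forge-std's Test and the Token2 contract from the question are in scope). abi.encodeWithSignature hashes the string verbatim; it does not expand uint to uint256, so calldata built from the alias string carries a selector that Token2's dispatcher does not recognize:

    import "forge-std/Test.sol";

    contract SelectorDispatchTest is Test {
        Token2 token;

        function setUp() public {
            token = new Token2();
        }

        function testAliasSelectorDoesNotDispatch() public {
            // "foo2(uint)" is hashed as written: the resulting selector matches no
            // function on Token2, and Token2 has no fallback, so the call fails.
            (bool okAlias, ) = address(token).call(abi.encodeWithSignature("foo2(uint)", uint256(5)));
            assertFalse(okAlias);

            // "foo2(uint256)" is the canonical signature: selector 0x7d5f3b35,
            // which matches Token2.foo2.selector, so the call succeeds.
            (bool okCanonical, ) = address(token).call(abi.encodeWithSignature("foo2(uint256)", uint256(5)));
            assertTrue(okCanonical);
        }
    }

This is also why [2] in the question only lines up once the string is changed to "foo2(uint256)".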
Jacobo Lansac