Align Bloom Filter Hashing with Mainnet Compatible Clients #295
Conversation
Hi @AusIV, thanks for the PR! I would generally support this. Any comments from other team members?
Can you rebase this (if you haven't done this before you should make a local backup of the repository)? You also have some linting errors which prevent the tests above from passing.
I've made the requested changes.
Cool, thanks
lib/bloom.js
Outdated
* For example, if a topic is an address, it will trim off the first 12 bytes
* before hashing the value to add it to the bloom filter.
*/
function trimBuffer (buff) {
If you name this dropLeadingZeroes then less explanation is needed.
Hi @AusIV, I would really like to get this merged. Could you maybe do a last rebase and rename to dropLeadingZeroes?
Hi @AusIV, thanks for taking this up again, cool! 😄 I don't know if you hesitate on the rebase part because you haven't done much rebasing before. Sorry for that, but we have relatively strict rules that every PR needs to follow. If you want to avoid rebasing I would suggest the following:
I know this is a bit annoying but I think in the end that's the easiest way to go. I have also already gone through this and can say that it is actually less work than it sounds! 😄
Hi @AusIV, I would really like to get this merged. If you don't find any time to update this atm, would it be ok if I take your changes and submit a new PR on this?
lib/bloom.js
Outdated
@@ -41,7 +54,7 @@ Bloom.prototype.add = function (e) {
 * @param {Buffer} element
 */
Bloom.prototype.check = function (e) {
-  e = utils.sha3(e)
+  e = utils.sha3(trimBuffer(e))
I think these should also be renamed to use dropLeadingZeroes.
Sorry, I've been busy. I'll knock this out this morning.
👍 |
When clients like Go-ethereum and Parity create bloom filters, they trim leading zeroes off the topic before applying keccak to hash the value. As far as I can tell, this is inconsistent with the yellow paper (see: ethereum/yellowpaper#690), but for ethereumjs-vm generated bloom filters to be compatible with tools that consume mainnet bloom filters, it should follow the same pattern. See https://github.com/ethereum/go-ethereum/blob/master/core/types/bloom9.go#L60 where go-ethereum casts topics as big.Int then back to bytes, effectively stripping away leading zeroes before hashing.
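The trimming behavior described above can be sketched in plain Node.js. This is an illustrative sketch only: the name `dropLeadingZeroes` follows the rename suggested in review, and the actual keccak hashing step (`utils.sha3` in this repo) is omitted here. Go-ethereum achieves the same effect by round-tripping the topic through `big.Int`, which discards leading zero bytes.

```javascript
// Sketch of the leading-zero trimming applied to a topic before hashing.
// Equivalent in effect to go-ethereum's big.Int round-trip in bloom9.go.
function dropLeadingZeroes (buff) {
  let i = 0
  // advance past any leading zero bytes
  while (i < buff.length && buff[i] === 0) {
    i++
  }
  return buff.slice(i)
}

// Example: a 32-byte topic holding a 20-byte address is left-padded with
// 12 zero bytes; trimming yields just the 20-byte address.
const topic = Buffer.concat([Buffer.alloc(12), Buffer.alloc(20, 0xab)])
console.log(dropLeadingZeroes(topic).length) // 20
```

Note the edge case: an all-zero buffer trims down to an empty buffer, which matches the `big.Int` behavior (zero serializes to zero bytes).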
It looks like the tests failed when trying to report coverage via Coveralls. I'm not sure how to resolve this.
Hmm, not sure, we had some recent related changes. Will have a look tomorrow.
Ah, since the tests are passing I will just merge; the problem is definitely not introduced by your code.
@AusIV thanks for updating this!
This fixes the last set of blockchain tests:

* RPC_API_Test
* randomStatetest224BC
* randomStatetest234BC
* randomStatetest529BC
* ExtraData32
* log1_correct
* timeDiff0
* timeDiff12
* timeDiff13
* timeDiff14

These were mostly caused by missing block validation and the intentional disparity between the bloom filter implementation and the yellow paper introduced in #295.