
Add batch submission policy that uses Blob transactions #8769

Merged

Conversation

Collaborator

@roberto-bayardo roberto-bayardo commented Dec 24, 2023

Description

Once Dencun / 4844 is activated on the L1, we wish to allow batches to be posted using Blob transactions. This PR adds a batcher configuration option to trigger submitting batches using blobs instead of calldata.
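The configuration option described above can be sketched as a minimal Go type with a validation check. The identifiers below are illustrative, modeled on the PR summary rather than copied from the op-batcher source:

```go
package main

import "fmt"

// DataAvailabilityType selects how the batcher posts batch data to L1.
// Hypothetical names, not the exact op-batcher identifiers.
type DataAvailabilityType string

const (
	CalldataType DataAvailabilityType = "calldata" // pre-Dencun behavior
	BlobsType    DataAvailabilityType = "blobs"    // EIP-4844 blob transactions
)

// Check validates the configured policy, mirroring the kind of
// validation check the PR adds to the batcher config struct.
func (t DataAvailabilityType) Check() error {
	switch t {
	case CalldataType, BlobsType:
		return nil
	default:
		return fmt.Errorf("unknown data availability type: %q", t)
	}
}

func main() {
	fmt.Println(CalldataType.Check())              // <nil>
	fmt.Println(DataAvailabilityType("x").Check()) // unknown data availability type: "x"
}
```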

Tests

This is primarily a configuration change and will ultimately be tested via e2e tests in a subsequent PR.

Contributor

coderabbitai bot commented Dec 24, 2023

Walkthrough

The op-batcher codebase has been updated to introduce new configuration options for batch submission and data availability. These updates include new fields and validation checks in the configuration structs, adjustments to the batch submission logic to handle different types of data, and the addition of flags to control these new policies. The changes aim to enhance the flexibility and robustness of batch processing by allowing for different submission and data availability strategies.

Changes

| File Path | Changes |
| --- | --- |
| op-batcher/batcher/config.go | Added BatchSubmissionPolicy and DataAvailabilityType fields with validation checks; initialized these fields with flag values. |
| op-batcher/batcher/config_test.go, op-batcher/batcher/driver.go | Introduced new declarations for BatchSubmissionPolicy and DataAvailabilityType in the validBatcherConfig function; updated the sendTransaction method to handle blob data when Config.UseBlobs is true. |
| op-batcher/batcher/service.go | Added UseBlobs field to BatcherConfig and set its value in initChannelConfig; new eth package import. |
| op-batcher/batcher/channel_manager.go | Added max_frame_size field to the channelManager configuration. |
| op-batcher/flags/flags.go | Introduced CalldataPolicy and BlobsPolicy constants; added BatchSubmissionPolicy flag to the optional flags list. |
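The driver.go change in the table above can be illustrated with a simplified sketch of the calldata-vs-blobs branch. TxCandidate and the size constant here are stand-ins for the real op-service types, not the actual implementation:

```go
package main

import (
	"errors"
	"fmt"
)

// TxCandidate is a stand-in for the transaction-manager candidate type.
type TxCandidate struct {
	TxData []byte   // calldata payload
	Blobs  [][]byte // EIP-4844 blob payloads
}

// A blob holds 4096 field elements of 32 bytes; the usable payload after
// encoding is somewhat smaller, so treat this as an upper bound only.
const maxBlobSize = 4096 * 32

var errFrameTooLarge = errors.New("frame exceeds blob capacity")

// newTxCandidate mirrors the shape of the sendTransaction change:
// when useBlobs is set, frame data goes into a blob instead of calldata.
func newTxCandidate(useBlobs bool, frame []byte) (*TxCandidate, error) {
	if !useBlobs {
		return &TxCandidate{TxData: frame}, nil
	}
	if len(frame) > maxBlobSize {
		return nil, fmt.Errorf("could not create blob tx candidate: %w", errFrameTooLarge)
	}
	return &TxCandidate{Blobs: [][]byte{frame}}, nil
}

func main() {
	c, err := newTxCandidate(true, make([]byte, 100))
	fmt.Println(len(c.Blobs), err) // 1 <nil>
}
```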


@roberto-bayardo roberto-bayardo marked this pull request as draft December 24, 2023 22:25
@roberto-bayardo roberto-bayardo changed the title 4844/batcher Add batch submission policy that uses Blob transactions Dec 24, 2023
@roberto-bayardo roberto-bayardo force-pushed the 4844/batcher branch 2 times, most recently from c7f1748 to 7386ec8 Compare December 24, 2023 22:42
@sebastianst sebastianst self-requested a review January 6, 2024 15:02
@roberto-bayardo roberto-bayardo force-pushed the 4844/batcher branch 2 times, most recently from e0792f1 to ba8c192 Compare January 6, 2024 19:39
@roberto-bayardo roberto-bayardo marked this pull request as ready for review January 6, 2024 19:45
@roberto-bayardo roberto-bayardo force-pushed the 4844/batcher branch 6 times, most recently from 16649ae to 62a676b Compare January 8, 2024 20:10
@roberto-bayardo roberto-bayardo force-pushed the 4844/batcher branch 3 times, most recently from b5a03d8 to 53c6ddb Compare January 8, 2024 20:22
Member

@sebastianst sebastianst left a comment


Are we adding an e2e test in a different PR that tests that the batcher creates correct blob batch txs?

@roberto-bayardo
Collaborator Author

Are we adding an e2e test in a different PR that tests that the batcher creates correct blob batch txs?

Yes, we already have an e2e test that does exactly this, but it predates all the L1 cost function related complexity, so it will have to be reworked and resubmitted after this gets merged.

@roberto-bayardo roberto-bayardo force-pushed the 4844/batcher branch 4 times, most recently from 3beb719 to 8ff4974 Compare January 8, 2024 23:02
@ajsutton ajsutton removed their request for review January 8, 2024 23:18
@trianglesphere trianglesphere added this pull request to the merge queue Jan 9, 2024
Merged via the queue into ethereum-optimism:develop with commit 4250b6e Jan 9, 2024
4 checks passed
Review comment on the following diff hunk:

// likely result in the chain spending more in gas fees than it is tuned for, so best
// to just fail. We do not expect this error to trigger unless there is a serious bug
// or configuration issue.
return fmt.Errorf("could not create blob tx candidate: %w", err)
Contributor

@roberto-bayardo was just reading the code and realized that if this ever happens, the frames are forever gone, as they aren't put back in the channel.

are we missing a l.state.TxFailed(txdata.ID()) here?

Collaborator Author

Hmm, this is one of those errors that should probably just panic or log.Fatal since there's no good way to recover. It's only triggered if you get a frame that is larger than the blob, so if you put the data back, it will just get re-triggered.
