
Fix: Handle dynamic batch sizes in TensorRT ReID model #1817

Conversation

usamaimdadsian
Contributor

Previously, during inference with TensorRT models, an input batch larger than the batch size used for TensorRT conversion caused an error in the subsequent steps. This fix splits the input batch into sub-batches no larger than that size and runs inference on each sub-batch. After inference, the ReID features from the sub-batches are stacked along the batch dimension and returned.
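For reference, a minimal sketch of the sub-batching idea described above (the function and parameter names here are hypothetical, not the ones used in the PR):

```python
import numpy as np

def infer_in_sub_batches(model, inputs, max_batch_size):
    """Run a fixed-batch-size TensorRT ReID model on an arbitrarily large batch.

    model: callable taking an array of shape (<= max_batch_size, C, H, W) and
           returning features of shape (n, feat_dim).
    inputs: array of shape (N, C, H, W), where N may exceed max_batch_size.
    """
    features = []
    # Split the oversized batch into chunks the engine can accept.
    for start in range(0, len(inputs), max_batch_size):
        sub_batch = inputs[start:start + max_batch_size]
        features.append(model(sub_batch))
    # Join the per-sub-batch features along the batch dimension so the result
    # matches what a single large-batch call would have produced.
    return np.concatenate(features, axis=0)
```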

@mikel-brostrom mikel-brostrom merged commit 0ef36ef into mikel-brostrom:master Feb 13, 2025
12 checks passed
@mikel-brostrom
Owner

Great! Thanks for contributing @usamaimdadsian 🚀

@usamaimdadsian usamaimdadsian deleted the fix-tensorrt-batch-handling branch February 14, 2025 01:07