Update TwoWayTransformer Docs. (#16161)

Co-authored-by: Glenn Jocher <glenn.jocher@ultralytics.com>
Co-authored-by: UltralyticsAssistant <web@ultralytics.com>
Co-authored-by: Ultralytics Assistant <135830346+UltralyticsAssistant@users.noreply.github.com>
Jason Guo 2024-09-12 00:37:40 +08:00 committed by GitHub
parent 9850172707
commit 463ca1a804


@@ -61,7 +61,6 @@ class TwoWayTransformer(nn.Module):
Attributes:
depth (int): Number of layers in the transformer.
embedding_dim (int): Channel dimension for input embeddings.
- embedding_dim (int): Channel dimension for input embeddings.
num_heads (int): Number of heads for multihead attention.
mlp_dim (int): Internal channel dimension for the MLP block.
layers (nn.ModuleList): List of TwoWayAttentionBlock layers.
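The attributes documented above can be illustrated with a minimal sketch. This is not the Ultralytics implementation (which subclasses `torch.nn.Module` and builds real `TwoWayAttentionBlock` layers); the class name `TwoWayTransformerSketch` and the placeholder layer strings are hypothetical, used only to show how the documented attributes relate to the constructor arguments.

```python
class TwoWayTransformerSketch:
    """Hypothetical sketch mirroring the documented TwoWayTransformer attributes."""

    def __init__(self, depth: int, embedding_dim: int, num_heads: int, mlp_dim: int):
        self.depth = depth                  # number of layers in the transformer
        self.embedding_dim = embedding_dim  # channel dimension for input embeddings
        self.num_heads = num_heads          # number of heads for multihead attention
        self.mlp_dim = mlp_dim              # internal channel dimension for the MLP block
        # Placeholder for the documented nn.ModuleList of TwoWayAttentionBlock layers:
        self.layers = [f"TwoWayAttentionBlock({i})" for i in range(depth)]


# Example with SAM-style hyperparameters (illustrative values only):
t = TwoWayTransformerSketch(depth=2, embedding_dim=256, num_heads=8, mlp_dim=2048)
```

Note how `layers` has exactly `depth` entries, which is why the docstring lists both `depth` and `layers` even though one determines the other's length.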