Update TwoWayTransformer Docs. (#16161)
Co-authored-by: Glenn Jocher <glenn.jocher@ultralytics.com>
Co-authored-by: UltralyticsAssistant <web@ultralytics.com>
Co-authored-by: Ultralytics Assistant <135830346+UltralyticsAssistant@users.noreply.github.com>
parent 9850172707
commit 463ca1a804
1 changed file with 0 additions and 1 deletion
@@ -61,7 +61,6 @@ class TwoWayTransformer(nn.Module):
     Attributes:
         depth (int): Number of layers in the transformer.
         embedding_dim (int): Channel dimension for input embeddings.
-        embedding_dim (int): Channel dimension for input embeddings.
         num_heads (int): Number of heads for multihead attention.
         mlp_dim (int): Internal channel dimension for the MLP block.
         layers (nn.ModuleList): List of TwoWayAttentionBlock layers.
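For context, the attributes documented in the docstring above mirror the module's constructor arguments. Below is a minimal sketch of instantiating and running `TwoWayTransformer`; the import path, default values (depth=2, embedding_dim=256, num_heads=8, mlp_dim=2048, as used for SAM), and the forward signature are assumptions based on the SAM codebase, not shown in this diff.

```python
# Minimal sketch, assuming the Ultralytics SAM import path and the
# forward(image_embedding, image_pe, point_embedding) signature.
import torch

from ultralytics.models.sam.modules.transformer import TwoWayTransformer

transformer = TwoWayTransformer(
    depth=2,            # depth (int): number of TwoWayAttentionBlock layers
    embedding_dim=256,  # embedding_dim (int): channel dimension for input embeddings
    num_heads=8,        # num_heads (int): heads for multihead attention
    mlp_dim=2048,       # mlp_dim (int): internal channel dimension for the MLP block
)

# Dummy inputs: image embedding (B, C, H, W), a positional encoding of the
# same shape, and sparse point embeddings (B, N_points, C).
image_embedding = torch.randn(1, 256, 64, 64)
image_pe = torch.randn(1, 256, 64, 64)
point_embedding = torch.randn(1, 5, 256)

queries, keys = transformer(image_embedding, image_pe, point_embedding)
print(queries.shape, keys.shape)  # expected: (1, 5, 256) and (1, 4096, 256)
```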