Update prediction Results docs (#4139)
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
This commit is contained in:
parent 8870084645
commit 11d0488bf1
107 changed files with 1451 additions and 1317 deletions
@@ -3,17 +3,20 @@ description: Learn about TwoWayTransformer and Attention modules in Ultralytics.
 keywords: Ultralytics, TwoWayTransformer, Attention, AI models, transformers
 ---
 
-## TwoWayTransformer
+# Reference for `ultralytics/models/sam/modules/transformer.py`
+
+!!! note
+
+    Full source code for this file is available at [https://github.com/ultralytics/ultralytics/blob/main/ultralytics/models/sam/modules/transformer.py](https://github.com/ultralytics/ultralytics/blob/main/ultralytics/models/sam/modules/transformer.py).
+
 ---
-### ::: ultralytics.models.sam.modules.transformer.TwoWayTransformer
+## ::: ultralytics.models.sam.modules.transformer.TwoWayTransformer
 <br><br>
 
-## TwoWayAttentionBlock
 ---
-### ::: ultralytics.models.sam.modules.transformer.TwoWayAttentionBlock
+## ::: ultralytics.models.sam.modules.transformer.TwoWayAttentionBlock
 <br><br>
 
-## Attention
 ---
-### ::: ultralytics.models.sam.modules.transformer.Attention
+## ::: ultralytics.models.sam.modules.transformer.Attention
 <br><br>