Update Docs README (#8919)
parent e8de3fa693 · commit 5c1277113b
3 changed files with 100 additions and 64 deletions
@@ -83,7 +83,7 @@ You can use these files to run inference with the OpenVINO Inference Engine.
## Using OpenVINO Export in Deployment
- Once you have the OpenVINO files, you can use the OpenVINO Runtime to run the model. The Runtime provides a unified API for inference across all supported Intel hardware. It also provides advanced capabilities like load balancing across Intel hardware and asynchronous execution. For more information on running inference, refer to the [Inference with OpenVINO Runtime Guide](https://docs.openvino.ai/nightly/openvino_docs_OV_UG_OV_Runtime_User_Guide.html).
+ Once you have the OpenVINO files, you can use the OpenVINO Runtime to run the model. The Runtime provides a unified API for inference across all supported Intel hardware. It also provides advanced capabilities like load balancing across Intel hardware and asynchronous execution. For more information on running inference, refer to the [Inference with OpenVINO Runtime Guide](https://docs.openvino.ai/2024/openvino-workflow/running-inference.html).
Remember, you'll need the XML and BIN files, as well as any application-specific settings like input size and the scale factor for normalization, to correctly set up and use the model with the Runtime.
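
To make the deployment step concrete, here is a minimal sketch using the OpenVINO Runtime Python API. The model path `model.xml`, the `CPU` device target, and the 1x3x640x640 input shape are illustrative assumptions, not values fixed by the export; substitute your own exported files and model input.

```python
# Minimal sketch: run an exported OpenVINO model with the Runtime Python API.
# Assumes openvino >= 2023.1; "model.xml" (with "model.bin" alongside it) and
# the 1x3x640x640 input shape are placeholders for your actual export.
import numpy as np
import openvino as ov

core = ov.Core()
# read_model picks up model.bin automatically when it sits next to model.xml
model = core.read_model("model.xml")
compiled = core.compile_model(model, "CPU")  # or "GPU", "AUTO", ...

# Dummy input matching the model's expected layout (here, NCHW)
dummy = np.random.rand(1, 3, 640, 640).astype(np.float32)
result = compiled(dummy)  # returns a dict keyed by output tensors
print(next(iter(result.values())).shape)
```

The same compiled model can be reused for repeated inferences; only the device string needs to change to target other supported Intel hardware.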