- OpenVINO: Open Visual Inference and Neural Network Optimization toolkit
- Tools & Capabilities for Developers across domains
- OpenVINO is supported only on Intel devices
- Choose the model for the purpose, e.g. models that detect objects across the frame
- Hardware for Performance
- Compute Efficiency / Memory Hierarchy / APIs
- CPU, integrated GPU, and FPGA solutions
Pipeline flow
- Decode the compressed image/video stream
- Preprocessing - scale down to the DL model's input size, ROI computation, frame re-ordering
- Post-processing - write bounding boxes on top of the original frame
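The decode / preprocess / infer / post-process flow above can be sketched in plain Python. This is a minimal stand-in, not OpenVINO API code: real pipelines would use OpenCV for decode/resize and the Inference Engine for the model call, so the function bodies here are placeholders that only mimic the data shapes.

```python
import numpy as np

def decode(raw):
    """Stand-in for JPEG/H.264 decode; returns an HxWx3 frame."""
    return raw  # real code would use cv2.imdecode or a video demuxer

def preprocess(frame, size=(64, 64)):
    """Scale down to the DL model's input size by index sampling
    (real code would use cv2.resize) and reorder HWC -> CHW."""
    h, w, _ = frame.shape
    ys = np.linspace(0, h - 1, size[0]).astype(int)
    xs = np.linspace(0, w - 1, size[1]).astype(int)
    small = frame[np.ix_(ys, xs)]
    return small.transpose(2, 0, 1)  # CHW layout, as DL models expect

def infer(blob):
    """Stub for the inference call; returns one fake detection
    as (x0, y0, x1, y1) in relative coordinates."""
    return [(0.25, 0.25, 0.75, 0.75)]

def postprocess(frame, detections):
    """Write bounding boxes on top of the original frame."""
    h, w, _ = frame.shape
    for x0, y0, x1, y1 in detections:
        a, b = int(y0 * h), int(y1 * h)
        c, d = int(x0 * w), int(x1 * w)
        frame[a:b, [c, d]] = 255   # vertical edges
        frame[[a, b], c:d] = 255   # horizontal edges
    return frame

frame = decode(np.zeros((480, 640, 3), dtype=np.uint8))
blob = preprocess(frame)
out = postprocess(frame, infer(blob))
print(blob.shape)  # (3, 64, 64)
```

Each stage hands the next one exactly the layout it expects, which is why frame re-ordering and ROI computation sit in preprocessing rather than inside the model call.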
- Training is an offline activity; OpenVINO handles inference
- The Model Optimizer converts trained models for CPU, GPU, and FPGA targets
- Intel performance libraries are also included with it
- Out-of-the-box models in OpenVINO
- Compile for Target
- mo.py (Model Optimizer) with FP16 precision generates the .xml and .bin IR files
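A typical mo.py invocation looks like the sketch below. The model filename and output directory are placeholders; the flags shown are from the classic Model Optimizer shipped with the toolkit.

```shell
# Convert a frozen model to IR at FP16 precision (paths are illustrative)
python mo.py --input_model frozen_model.pb \
             --data_type FP16 \
             --output_dir ir/
# produces ir/frozen_model.xml (topology) and ir/frozen_model.bin (weights)
```

FP16 halves the weight storage of the .bin file and is the precision expected by GPU and Myriad targets.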
- Movidius Neural Compute Stick
- -d CPU, -d GPU, -d MYRIAD
- Use of the HETERO plugin - split work across GPU and CPU
- -d HETERO:GPU,CPU
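The -d flag is the only thing that changes when retargeting hardware, which is the point of the plugin architecture. The demo name and file paths below are placeholders; the device strings are the ones listed above.

```shell
# Same binary, same IR files; only the device plugin changes
./object_detection_demo -m model_fp16.xml -i input.mp4 -d CPU
./object_detection_demo -m model_fp16.xml -i input.mp4 -d GPU
./object_detection_demo -m model_fp16.xml -i input.mp4 -d MYRIAD          # Neural Compute Stick
./object_detection_demo -m model_fp16.xml -i input.mp4 -d HETERO:GPU,CPU  # GPU first, CPU fallback
```

With HETERO, layers unsupported on the first device in the priority list fall back to the next one, so GPU,CPU runs what it can on the GPU and the rest on the CPU.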
https://github.com/intel-iot-devkit/store-traffic-monitor
https://github.com/intel-iot-devkit/smart-video-workshop
Happy Mastering DL!!!