Webinar: Intelligence at scale through AI model efficiency

398 views
Jun 14, 2021
1:01:13

Artificial Intelligence (AI), specifically deep learning, is revolutionizing industries, products, and core capabilities by delivering dramatically enhanced experiences. However, today's deep neural networks use too much memory, compute, and energy. To make AI truly ubiquitous, it needs to run on the end device within tight power and thermal budgets. Advancements in multiple areas are necessary to improve AI model efficiency, including quantization, compression, compilation, and neural architecture search (NAS). In this webinar, we'll discuss:

- Qualcomm AI Research's latest model efficiency research
- Our new NAS research to optimize neural networks more easily for on-device efficiency
- How the AI community can take advantage of this research through our open-source projects, such as the AI Model Efficiency Toolkit (AIMET) and the AIMET Model Zoo
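To give a flavor of one technique mentioned above, the sketch below shows symmetric per-tensor int8 quantization, the basic operation that quantization toolkits such as AIMET simulate to estimate on-device accuracy. This is an illustrative NumPy example only, not the AIMET API; the function names are our own.

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Map float weights to int8; return (quantized weights, scale)."""
    scale = np.max(np.abs(w)) / 127.0               # symmetric range [-127, 127]
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights from the int8 representation."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64)).astype(np.float32)    # stand-in for a weight tensor

q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

# Rounding error is bounded by half the quantization step (scale / 2),
# which is why int8 inference can retain accuracy while cutting memory 4x.
print(q.dtype, float(np.abs(w - w_hat).max()) <= scale / 2 + 1e-6)
```

In practice, toolkits go well beyond this: per-channel scales, asymmetric ranges, and quantization-aware fine-tuning all help close the remaining accuracy gap.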
