Introduction to TensorFlow Lite
TensorFlow Lite is TensorFlow's lightweight solution for mobile and embedded devices. It enables on-device machine learning inference with low latency and a small binary size, and it supports hardware acceleration through the Android Neural Networks API. TensorFlow Lite achieves low latency through several techniques, such as kernels optimized for mobile apps, pre-fused activations, and quantized kernels that allow smaller and faster (fixed-point math) models. Most of the TensorFlow Lite documentation is on GitHub for the time being. This will be added to Artificial Intelligence Resources Subject Tracer™. This will be added to Business Intelligence Resources Subject Tracer™. This will be added to Entrepreneurial Resources Subject Tracer™. This will be added to the tools section of Research Resources Subject Tracer™.
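The quantized kernels mentioned above rely on an affine quantization scheme, in which a real value r is approximated as scale * (q - zero_point) for an 8-bit integer q. The following is a minimal pure-Python sketch of that idea; the function names, the example range [-1.0, 1.0], and the chosen scale and zero point are illustrative assumptions, not TensorFlow Lite's API.

```python
# Sketch of affine (asymmetric) quantization, as used conceptually by
# fixed-point quantized kernels: r is approximated by scale * (q - zero_point),
# where q is an unsigned 8-bit integer.
# Function names and parameter choices here are hypothetical, for illustration.

def quantize(r, scale, zero_point, qmin=0, qmax=255):
    """Map a real value to an unsigned 8-bit integer, clamped to [qmin, qmax]."""
    q = round(r / scale) + zero_point
    return max(qmin, min(qmax, q))

def dequantize(q, scale, zero_point):
    """Recover an approximate real value from its quantized representation."""
    return scale * (q - zero_point)

# Example: represent real values in [-1.0, 1.0] with uint8.
scale = 2.0 / 255       # step size between adjacent quantized values
zero_point = 128        # integer that maps back to the real value 0.0

q = quantize(0.5, scale, zero_point)
r = dequantize(q, scale, zero_point)
# r is within half a quantization step of the original 0.5
```

The round-trip error is bounded by half the scale, which is why 8-bit models can stay close to full-precision accuracy while using a quarter of the memory of float32 weights.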