TensorFlow Lite on Android for beginners


Android development is not limited to cute little apps that split the bill in restaurants (that seems to be everyone’s “brilliant app idea”, or is it just me?). Android is a powerful platform backed by one of the largest and most influential companies in the world. A company that is at the forefront of machine learning and considers itself “AI-first”.

By learning TensorFlow Lite for Android, developers can implement advanced machine learning into their creations. This greatly expands the capabilities of an app and introduces myriad new potential use cases. It also teaches invaluable skills, the demand for which will only increase in the years to come.

See also: Is your job safe? Jobs that AI will destroy in the next 10 to 20 years

This is the perfect introduction to machine learning. So let’s start!


What is TensorFlow?

Let’s start with the basics: what is TensorFlow Lite? To answer that, let’s first look at TensorFlow itself. TensorFlow is an open source, “end-to-end” (i.e. all-in-one) machine learning platform from the Google Brain team. In other words, it is a software library that enables machine learning.

A machine learning task is any problem that requires pattern recognition powered by algorithms and large amounts of data. This is AI, but not in the HAL from 2001: A Space Odyssey sense.

See also: Artificial Intelligence vs. Machine Learning: What’s the Difference?


An example of a machine learning application is computer vision. This enables computers to recognize objects in a photo or a live camera feed. To do this, the program must first be “trained” by being shown thousands of images of that object. The program never truly understands the object, but it learns to look for particular data patterns (changes in contrast, particular angles or curves) that are likely to match the object. Over time, the program becomes more and more accurate at recognizing that object.


As an Android developer, computer vision opens up many possibilities: whether you want to use facial recognition as a security feature, build an AR app that can highlight elements in the environment, or create the next “Reface” app. And that’s before we look at the myriad other uses for machine learning models: speech recognition, OCR, enemy AI, and much more.

Building and implementing these types of models from scratch would be an extremely difficult task for a single developer, which is why it is so useful to have access to pre-built libraries.

See also: What is Google Cloud

TensorFlow can run on a wide variety of CPUs and GPUs, but works particularly well with Google’s own Tensor Processing Units (TPUs). Developers can also leverage the power of Google Cloud Platform by outsourcing machine learning to Google’s servers.

What is TensorFlow Lite?

TensorFlow Lite brings TensorFlow to mobile devices (meaning it runs on the mobile device itself). Announced in 2017, the TFLite software stack was designed specifically for mobile development. TensorFlow Lite “Micro”, meanwhile, is a version aimed at microcontrollers, which recently merged with Arm’s uTensor.

Now, some developers may be wondering what the difference between ML Kit and TensorFlow Lite is. While there is definitely some overlap, TensorFlow Lite is lower-level and more open. More importantly, TensorFlow Lite runs on the device itself, whereas ML Kit requires Firebase registration and an active internet connection. Despite Google’s confusing nomenclature, note that ML Kit still uses TensorFlow “under the hood”. Firebase is also just another type of Google Cloud Platform project.

See also: Build a facial recognition app using machine learning and the Firebase ML Kit

TensorFlow Lite is available for Android and iOS via a C++ API, along with a Java wrapper for Android developers. On supported devices, the library can also take advantage of the Android Neural Networks API (NNAPI) for hardware acceleration.
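As a rough illustration of the Java wrapper, here is a minimal sketch of creating an interpreter that is allowed to fall back on NNAPI acceleration. It assumes you already have the model loaded into a MappedByteBuffer; the class and method names are placeholders for illustration, not part of any official sample.

import java.nio.MappedByteBuffer;
import org.tensorflow.lite.Interpreter;

public class NnapiInterpreterFactory {
    // Creates an interpreter that may use the Android Neural Networks API
    // for hardware acceleration on devices that support it.
    // The model buffer is assumed to have been loaded from assets already.
    public static Interpreter create(MappedByteBuffer modelBuffer) {
        Interpreter.Options options = new Interpreter.Options();
        options.setUseNNAPI(true);
        return new Interpreter(modelBuffer, options);
    }
}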

Which one should you use for your projects? It depends a lot on your goal. If you don’t mind relying on an outside cloud service, ML Kit can make your life a little easier. If you want your code to run natively, or if you need a little more customization and flexibility, go for TensorFlow Lite.


Using TensorFlow Lite

Developers rely on “models” to solve a machine learning problem. An ML model is a file containing a statistical model that has been trained to recognize certain patterns. Training essentially means feeding the model samples of data so that it can improve its success rate by refining the patterns it uses.

See also: ML Kit Image Labeling: Determine the content of an image through machine learning

So a computer vision model could start with some basic assumptions about what an object looks like. As it is shown more and more images, it becomes more and more precise, while also expanding the scope of what it is looking for.

Training TFLite models

You will come across “pre-trained models” that have already had all of this data fed in to refine their algorithms. This type of model is therefore “ready to use”. It can automatically perform a task, for example identifying emotions from facial expressions or moving a robotic arm around the room.

In TensorFlow Lite, these files are called “TensorFlow Lite Model Files” and have a “.tflite” or “.lite” extension. Label files contain the labels the model was trained with (for example, “happy” or “sad” for facial recognition models).
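As a hedged example, if you also pull in the optional TensorFlow Lite Support Library, a small helper like the one below could read a bundled label file from your assets. The “labels.txt” file name is an assumed placeholder; use whatever label file ships with your model.

import android.content.Context;
import java.io.IOException;
import java.util.List;
import org.tensorflow.lite.support.common.FileUtil;

public class LabelLoader {
    // Reads the label list bundled in the app's assets folder.
    // "labels.txt" is an assumed file name for this sketch.
    public static List<String> loadLabels(Context context) throws IOException {
        return FileUtil.loadLabels(context, "labels.txt");
    }
}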


You may also come across a few other file types that are used in the training process. GraphDef files (.pb or .pbtxt) describe your graph and can be read by other processes. The .pbtxt version is also human-readable. You can create these with TensorFlow too.

The checkpoint file records the learning process by listing serialized variables, so you can see how their values change over time. The frozen GraphDef then converts those values into constants, reading them from the graph at specified checkpoints. The TFLite model is then built from the frozen graph using TOCO (the TensorFlow Optimizing Converter tool). This gives us a nice “pre-trained” file that we can then implement in our apps.


Training and importing models is not covered in this post, but you can find a great tutorial here.

The good news is that the TensorFlow Lite Task Library contains many powerful and simple libraries based on pre-trained models. These can handle all kinds of common tasks, such as answering questions, recognizing faces, and much more. This means that beginners don’t have to worry about checkpoint files or training!

Using TFLite files

There are many ways to get pre-trained TensorFlow Lite model files for your app. I recommend starting with the official TensorFlow website.

For example, follow this link and you can download a starter model that provides basic image classification. The page also includes some details on how to use it via the TensorFlow Lite Task Library. Alternatively, you can use the TensorFlow Lite Support Library if you want to build your own inference pipeline (i.e. run the model against new inputs).
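To give a flavor of the Task Library route, here is a minimal, hedged sketch of classifying a bitmap with a bundled image classification model. It assumes you have added the org.tensorflow:tensorflow-lite-task-vision dependency and placed a model named “mobilenet_v1.tflite” in your assets; both names are placeholders rather than anything taken from the page above.

import android.content.Context;
import android.graphics.Bitmap;
import java.io.IOException;
import java.util.List;
import org.tensorflow.lite.support.image.TensorImage;
import org.tensorflow.lite.task.vision.classifier.Classifications;
import org.tensorflow.lite.task.vision.classifier.ImageClassifier;

public class StarterClassifier {
    // Classifies a bitmap using a TFLite image classification model from assets.
    // "mobilenet_v1.tflite" is an assumed file name for illustration only.
    public static List<Classifications> classify(Context context, Bitmap bitmap)
            throws IOException {
        ImageClassifier classifier =
                ImageClassifier.createFromFile(context, "mobilenet_v1.tflite");
        return classifier.classify(TensorImage.fromBitmap(bitmap));
    }
}

Each returned Classifications object contains categories with labels and scores that you can display, filter, or sort.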

Once you’ve downloaded the file, put it in your assets directory. You then need to indicate that the file should not be compressed. To do this, add the following to your module-level build.gradle file:

android {
    // Other settings

    // Specify tflite file should not be compressed for the app apk
    aaptOptions {
        noCompress "tflite"
    }

}

Set up your Android Studio project

To use TensorFlow Lite in your app, you need to add the following dependency to your build.gradle file:

implementation 'org.tensorflow:tensorflow-lite:+'

Next, you need to import the Interpreter. This is the code that will actually load the model and run it.

You then create an instance of the Interpreter in your Java file and use it to analyze the data you need. For example, you can feed in images and it will return results.

The results are provided in the form of output probabilities. A model can never say with certainty what an object is, so an image might come back as 0.75 dog and 0.25 cat. Your code then has to interpret these probabilities, for example by acting on the label with the highest score or ignoring results below a confidence threshold.
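Putting that together, here is a rough, hedged sketch of loading a model with the Java interpreter and reading its output probabilities. The “model.tflite” and “labels.txt” asset names, the single [1, numLabels] float output shape, and the use of the Support Library’s FileUtil helpers (discussed below) are all assumptions for illustration; your model’s documentation will specify the exact input and output shapes it expects.

import android.content.Context;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.MappedByteBuffer;
import java.util.List;
import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.support.common.FileUtil;

public class SimpleClassifier {
    private final Interpreter interpreter;
    private final List<String> labels;

    // "model.tflite" and "labels.txt" are assumed asset names for this sketch.
    public SimpleClassifier(Context context) throws IOException {
        MappedByteBuffer model = FileUtil.loadMappedFile(context, "model.tflite");
        interpreter = new Interpreter(model);
        labels = FileUtil.loadLabels(context, "labels.txt");
    }

    // Runs inference on an already-preprocessed input buffer and returns the
    // label with the highest probability. Assumes a model with a single
    // [1, numLabels] float output.
    public String classify(ByteBuffer inputBuffer) {
        float[][] probabilities = new float[1][labels.size()];
        interpreter.run(inputBuffer, probabilities);

        int best = 0;
        for (int i = 1; i < probabilities[0].length; i++) {
            if (probabilities[0][i] > probabilities[0][best]) {
                best = i;
            }
        }
        return labels.get(best);
    }
}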

Alternatively, you can import the TensorFlow Lite Support Library and use it to convert the image into the tensor format the model expects.
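A hedged sketch of that preprocessing step might look like the following, resizing a bitmap to an assumed 224x224 model input using the Support Library’s image utilities. The class name and input size are assumptions for illustration.

import android.graphics.Bitmap;
import org.tensorflow.lite.DataType;
import org.tensorflow.lite.support.image.ImageProcessor;
import org.tensorflow.lite.support.image.TensorImage;
import org.tensorflow.lite.support.image.ops.ResizeOp;

public class ImagePreprocessor {
    // Resizes a bitmap to the (assumed) 224x224 input size and wraps it as a
    // TensorImage whose buffer can be passed to the interpreter.
    public static TensorImage preprocess(Bitmap bitmap) {
        ImageProcessor processor = new ImageProcessor.Builder()
                .add(new ResizeOp(224, 224, ResizeOp.ResizeMethod.BILINEAR))
                .build();

        TensorImage image = new TensorImage(DataType.UINT8);
        image.load(bitmap);
        return processor.process(image);
    }
}

You could then call getBuffer() on the returned TensorImage and pass that to an interpreter, as in the earlier sketch.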

These pre-trained models can recognize thousands of image classes. However, there are many different model “architectures”, which change how the model defines the “layers” of its learning cycle, as well as the steps used to convert raw data into training data.

Popular model architectures include MobileNet and Inception. Your job is to choose the one best suited to the task: MobileNet, for example, was designed to favor small, fast models over deep, complex ones. Complex models are more accurate, but at the cost of size and speed.


Learn more

While this is a complex topic for beginners, I hope this post has given you an idea of the basics so that you can better understand future tutorials. The best way to learn a new skill is to choose a project and then learn the steps necessary to complete it.


For a deeper understanding, we strongly recommend Machine Learning with TensorFlow. This course contains 19 lessons that show you how to implement common commercial solutions. Android Authority readers currently get a 91% discount, which brings the price down from $124 to $10.
