Object Tracker | Study Pack of HUSKYLENS for micro:bit 05


Lesson 5: Object Tracker with an AI Machine Vision Sensor

Introduction

With the rise of competitive shooting games, quick judgement and response have become crucial. This process requires continuous vigilance and focus, as determining the enemy's position involves "listening, observing, and judging". Now, imagine if there was a machine that could do all of this for you: recognizing targets through deep learning, automatically tracking and feeding back the coordinates when the target appears within the field of view, and providing an alarm when the target disappears. This would alleviate our worries about being suddenly attacked in the game. In this tutorial, we will use the AI machine vision sensor, HUSKYLENS, in conjunction with a micro:bit to build an AI vision camera for object tracking.

 

[Figure]

This project primarily utilizes the object tracking function of the AI machine vision sensor HUSKYLENS. It recognizes and tracks objects (e.g., a person) through deep learning, and feeds back the position of the object in real time. When the tracked object disappears, there is also a corresponding vibration alert, enabling you to respond promptly.

Learning Objectives

1. Learn the working principle and application field of object tracking.

2. Learn to use the object tracking function of AI machine vision sensor HUSKYLENS.

3. Use the AI machine vision sensor HUSKYLENS to make an object tracker.

Preparation

[Figure]

Learning Content

Principle and Application Field of Object Tracking

To track a moving object without relying on manual operation, we need visual object tracking. This technology is already widely used in daily life, for example in video surveillance and drone follow-shooting. In this project, we make use of the object tracking function of the AI machine vision sensor HUSKYLENS.

1. What is object tracking

Object tracking is one of the essential functions of AI visual recognition and a form of object recognition. It plays a vital role in computer vision and refers to making continuous inferences about the state of a target throughout a video sequence. Put simply, it means recognizing and following the objects that move within the camera's field of view.

[Figure]

2. Working principle of object tracking

The camera collects image information and sends it to a computer. After analysis and processing, the computer works out the relative position of the moving object and, at the same time, rotates the camera to keep tracking it in real time. An object tracking system therefore involves four steps: object recognition, object tracking, motion prediction, and camera control.

[Figure]
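To make these four steps easier to picture, here is a minimal conceptual sketch in MakeCode TypeScript (the language behind the MakeCode editor used later in this lesson). The four step functions are empty stand-ins written for illustration only, not calls from any real extension.

    // Conceptual sketch of the four-step tracking loop; the step functions are
    // illustrative stand-ins, not calls from any real library.
    function recognizeObject(): void {
        // 1. Object recognition: detect and mark the target's appearance.
    }
    function trackObject(): number {
        // 2. Object tracking: locate the target and return its x coordinate.
        return 160
    }
    function predictMotion(x: number): number {
        // 3. Motion prediction: estimate where the target will be in the next frame.
        return x
    }
    function controlCamera(x: number): void {
        // 4. Camera control: rotate the camera toward the predicted position.
    }

    basic.forever(function () {
        recognizeObject()
        let x = trackObject()
        controlCamera(predictMotion(x))
    })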

Object Recognition

Object recognition uses image processing algorithms to obtain accurate appearance information about the object, typically against a static background, and then recognizes and marks the object's shape, as shown in the figure.

[Figure]

Object Tracking

Object tracking follows the object through the subsequent image sequence using the appearance features obtained in the previous step, and keeps learning during tracking so that the tracking becomes more and more accurate.

[Figure]

Object Motion Prediction

Motion prediction uses an algorithm to estimate where the moving object will appear in the next frame, which narrows the search and improves efficiency. As the picture shows, the bird's subsequent path and movements can be predicted from its motion trend over the first few seconds.

[Figure]
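As a minimal worked example of one common prediction scheme (a constant-velocity assumption, not something specific to HUSKYLENS), the next position can be estimated from the last two observed positions:

    // Constant-velocity prediction sketch: if the object moved from (150, 120)
    // to (160, 118) between two frames, assume it keeps the same speed.
    let prevX = 150
    let prevY = 120
    let currX = 160
    let currY = 118
    let predictedX = currX + (currX - prevX)   // 170
    let predictedY = currY + (currY - prevY)   // 116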

Camera Control

Camera control means moving the camera to follow the object's direction of motion while it keeps collecting image information. This usually requires a pan-tilt mount (gimbal) or some other movement mechanism.

[Figure]
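As a rough sketch of this step on a micro:bit (assuming, for illustration, a pan servo wired to P2 and a variable objectX holding the tracked object's x coordinate, which the tasks below show how to read), the camera can be nudged so the object stays near the horizontal center of the 320-pixel-wide screen:

    // Proportional pan-control sketch; the servo on P2 and the objectX value
    // are assumptions made for illustration.
    let objectX = 200
    let panAngle = 90
    basic.forever(function () {
        // Error: how far the object sits from the screen's horizontal center (x = 160).
        let error = objectX - 160
        // Nudge the servo a little in the direction that re-centers the object.
        panAngle = Math.constrain(panAngle - error / 20, 0, 180)
        pins.servoWritePin(AnalogPin.P2, panAngle)
        basic.pause(50)
    })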

3. Application field of object tracking

Smart Video Monitoring

Typical applications include motion recognition (e.g., identifying people by their gait, automatic object detection), automated surveillance (watching for suspicious behavior), and traffic monitoring (collecting real-time traffic data to direct traffic).

[Figure]

Human-computer Interaction

Traditional human-computer interaction relies on a keyboard and mouse. When a computer needs to recognize and understand posture, movement, and gestures, tracking technology becomes the key.

[Figure]

Robot Visual Navigation

For smart robots, tracking technology can be used to compute the motion trajectory of the object being filmed and support visual navigation.

[Figure]

VR

In virtual environments, 3D interaction and the motion simulation of virtual characters benefit directly from research on video-based human motion analysis, giving participants richer forms of interaction. Human tracking and motion analysis are the key technologies behind this.

[Figure]

Demonstration of AI machine vision sensor HUSKYLENS Object Tracking

Object tracking is a built-in algorithm of the sensor: it learns the features of an object, tracks the object's position on the screen, and feeds the position's coordinate values back to the main controller. The main controller then uses the returned position to drive the vibration module and provide the tracking notification.

1. Object learning

Unlike color recognition and face recognition, object tracking learns and recognizes an object (or person) as a whole. Color recognition only deals with color, and face recognition only deals with part of the human body, whereas object tracking learns the overall characteristics of the object to be tracked.

Point the AI machine vision sensor HUSKYLENS at the target object and adjust the distance until the object fits inside the yellow frame at the center of the screen. If the whole object cannot fit in the frame, framing its most distinctive features is also fine. Then long-press the "learning button" to learn the object from various angles and distances. During learning, a frame labeled "Learning: ID1" is displayed on the screen.

[Figure]

When the AI machine vision sensor HUSKYLENS can track the object at different angles and distances, release the "learning button" to finish learning.

Note: If there is no yellow frame in the center of the screen, the AI machine vision sensor HUSKYLENS has already learned an object. Make it forget the learned object, then learn again.

2. Disable learning function

After learning is completed, long-press the "function button" to enter the parameter setting interface, i.e., the submenu of the object tracking function.

Dial the "function button" left or right to select "learning on", then short-press the "function button". Dial the "function button" to the left to turn the "learning on" switch off, i.e., the progress bar turns white and sits at the left side. Finally, save and return.

[Figure]

AI Vision Project Practice

The project is divided into two tasks. First, learn to use the object tracking function of AI machine vision sensor HUSKYLENS and read the coordinate data of the object. Then add a vibration module on this basis to achieve the final tracking notification function.

Task 1: Function of object tracking and the coordinate values

Obtain the coordinates of the object on the screen from the AI machine vision sensor HUSKYLENS, then use the coordinate values to judge the object's position relative to the screen.

Task 2: Vibration feedback notification

When the AI machine vision sensor HUSKYLENS is tracking an object, if the target is lost, the vibration module is used to give a vibration feedback notification.

Task 1: Function of Object Tracking and Coordinate Values

Hardware connection
[Figure]
AI Vision Program Design

Function instruction

Before starting, let the AI machine vision sensor HUSKYLENS learn the target to be tracked: Mind+ in camouflage.

[Figure]

Coordinate analysis

The screen resolution of the AI machine vision sensor HUSKYLENS is 320×240. As shown in the figure, the center coordinates of the object obtained through the program also fall within this range. For example, if the returned coordinates are (160, 120), the tracked object is at the center of the screen.

[Figure]

Flowchart Analysis

[Figure]

The MakeCode Sample Program

[Figure]
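The sample program itself is provided as MakeCode blocks in the figure above. As a rough text-mode equivalent, the sketch below outlines the same logic in MakeCode TypeScript. The readTrackedX()/readTrackedY() helpers are hypothetical placeholders for the DFRobot HUSKYLENS extension blocks that request a result and read the tracked object's center coordinates; use the actual blocks from the extension you have installed. For simplicity, the sketch prints to the serial console rather than writing to the HUSKYLENS screen.

    // Task 1 logic sketch. readTrackedX()/readTrackedY() are placeholders for the
    // HUSKYLENS extension blocks; they return -1 here so the sketch compiles.
    function readTrackedX(): number {
        return -1  // placeholder: x coordinate reported by HUSKYLENS, or -1 if lost
    }
    function readTrackedY(): number {
        return -1  // placeholder: y coordinate reported by HUSKYLENS, or -1 if lost
    }

    basic.forever(function () {
        let x = readTrackedX()
        let y = readTrackedY()
        if (x == -1 && y == -1) {
            // -1, -1 means the learned object is out of view.
            serial.writeLine("target lost")
        } else {
            // The screen is 320x240, so (160, 120) is the center.
            serial.writeLine("x=" + x + " y=" + y)
        }
        basic.pause(200)
    })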

AI Vision Project Operating Effect

Run the program. The AI machine vision sensor HUSKYLENS will track the learned object and display its coordinates on the HUSKYLENS screen.

Note: If -1 and -1 are displayed on the screen, the tracked object is lost.

[Figure]

Task 2: Vibration Feedback Notification

Hardware connection

[Figure]

AI Vision Program Design

Function instruction:

When the AI machine vision sensor HUSKYLENS is tracking an object, X = -1 and Y = -1 displayed on the screen means the tracked object has been lost. In that case the vibration module turns on to give a vibration notification; otherwise it stays off.

Flowchart Analysis

[Figure]

The MakeCode Sample Program

[Figure]
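Again, the sample program is provided as MakeCode blocks in the figure above. The sketch below outlines the same logic in MakeCode TypeScript under the same assumptions as in Task 1 (hypothetical readTrackedX()/readTrackedY() placeholders for the HUSKYLENS extension blocks), with the vibration module assumed to be wired to pin P1.

    // Task 2 logic sketch: vibrate while the tracked object is lost.
    // readTrackedX()/readTrackedY() are placeholders for the HUSKYLENS extension
    // blocks; the vibration module pin (P1) is an assumption.
    function readTrackedX(): number {
        return -1  // placeholder
    }
    function readTrackedY(): number {
        return -1  // placeholder
    }

    basic.forever(function () {
        if (readTrackedX() == -1 && readTrackedY() == -1) {
            pins.digitalWritePin(DigitalPin.P1, 1)   // target lost: vibration on
        } else {
            pins.digitalWritePin(DigitalPin.P1, 0)   // target visible: vibration off
        }
        basic.pause(100)
    })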

Note: We can use a high/low digital level to turn the vibration module on/off, or use a PWM signal to control its vibration amplitude.
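For example, assuming the vibration module is on pin P1, the two control styles look like this:

    // On/off control with a digital level:
    pins.digitalWritePin(DigitalPin.P1, 1)    // vibration on
    pins.digitalWritePin(DigitalPin.P1, 0)    // vibration off

    // Amplitude control with a PWM duty cycle (0..1023):
    pins.analogWritePin(AnalogPin.P1, 700)    // stronger vibration
    pins.analogWritePin(AnalogPin.P1, 300)    // weaker vibration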

AI Vision Project Operating Effect

When the tracked object is lost (X=-1, Y=-1), the vibration module turns on to send the vibration notification; when the tracked object appears on the screen, the X and Y coordinates of the tracked object are displayed, and the vibration module turns off.

[Figure]

Project Review

In this lesson, we mainly learned to use the object tracking function of the AI machine vision sensor HUSKYLENS to obtain the exact coordinates (X, Y) of the tracked object, and to use the vibration module to give a notification when the target is lost. In real life, the object tracking function can be applied to many more scenarios, such as tracking suspects or alerting you when a child leaves the visible range.

Project Development

Extend the project to realize functions such as showing the location of the target person and turning on the vibration notification when the person appears. Can we also make the vibration amplitude change with distance? For instance, the closer the target, the larger the amplitude; the farther the target, the smaller the amplitude.

Note: The amplitude needs to be controlled by a PWM signal.
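As a hint only (a sketch under assumptions, not a reference solution): HUSKYLENS also reports the width of the tracked object's frame, which grows as the target gets closer, so that width can serve as a rough distance proxy and be mapped to a PWM duty cycle. readTrackedWidth() below is a hypothetical placeholder for the corresponding extension block, and the vibration module is again assumed on P1.

    // Hint sketch: map the tracked frame width (rough distance proxy) to vibration amplitude.
    function readTrackedWidth(): number {
        return 0  // placeholder: frame width reported by HUSKYLENS, 0 when no target
    }

    basic.forever(function () {
        let w = readTrackedWidth()
        if (w > 0) {
            // Wider frame = closer target = stronger vibration.
            pins.analogWritePin(AnalogPin.P1, Math.map(w, 0, 320, 0, 1023))
        } else {
            pins.analogWritePin(AnalogPin.P1, 0)
        }
        basic.pause(100)
    })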

Click Buy Study Pack of HUSKYLENS for micro:bit

Click Buy Huskylens - An Easy-to-use AI Camera | Vision Sensor

Click Buy BBC micro:bit V2

Click Buy Study Pack with HUSKYLENS and micro:bit V2

Engage in learning the next AI vision project: Self-service Cashier with an AI Machine Vision Sensor

Engage in learning the previous AI vision project: Vending Machine for Stray Cats & Dogs with an AI Machine Vision Sensor

License: All Rights Reserved