Nowadays, shooting games are very popular. In these games, you need to judge the enemy's location and counterattack, which requires a high degree of concentration and vigilance. The enemy's position is determined by listening, watching, and observing.
Can we make a machine that learns a target in depth? When the target appears in the visible range, it automatically tracks the target and feeds back its coordinates; when the target disappears, it gives a notification. In this way, we don't have to worry about being attacked.
So what does HUSKYLENS have to do with object tracking? This project mainly uses the object tracking function of HUSKYLENS: after learning an object (a person), it recognizes and tracks that object and feeds back its position to you in real time. When the tracked object is lost, there is corresponding audio feedback.
Learning Objectives
1. Learn the working principle and application fields of object tracking.
2. Learn to use the object tracking function of HUSKYLENS.
3. Use HUSKYLENS to make an object tracker.
Preparation

Learning Content
Principle and Application Fields of Object Tracking
When we want to track a moving object, visual object tracking is needed in addition to manual operation. This technology is already widely used in daily life, for example in video surveillance and UAV follow shooting. In this project, we make use of the object tracking function of HUSKYLENS.
1. What is object tracking
As one of the vital functions of AI visual recognition, object tracking is a type of object recognition. It is of great importance in computer vision and refers to the process of continuously inferring the state of a target in a video sequence; put simply, it means recognizing and tracking objects moving within the camera's visual range.
2. Working principle of object tracking
The camera collects the image information and sends it to the computer. After analysis and processing, the computer works out the relative position of the moving object.
Meanwhile, it rotates the camera to carry out real-time tracking. An object tracking system involves four steps: object recognition, object tracking, motion prediction, and camera control.
Object Recognition
Object recognition means obtaining accurate appearance information about the object through image processing algorithms against a static background, then recognizing and marking the object's shape, as shown in the figure.
Object Tracking
Object tracking refers to following the object through the subsequent image sequence using algorithms, based on the appearance characteristics obtained in the previous step, and carrying out further learning during tracking so that the tracking becomes more and more accurate.
Object Motion Prediction
Motion prediction uses an algorithm to predict where the moving object will appear in the next frame, which helps optimize the tracking and improve efficiency. As the picture shows, the bird's subsequent path and action can be predicted from its movement trend in the first few seconds.
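HUSKYLENS's own prediction method is not documented here, but the idea can be illustrated with a minimal sketch. Assuming a simple constant-velocity model, the next position is extrapolated from the last two observed positions (all names and numbers below are made up for illustration):

```cpp
#include <iostream>

// A point on the camera image, in pixels.
struct Point { double x; double y; };

// Constant-velocity prediction: assume the object repeats the displacement
// it showed between the two most recent frames.
Point predictNext(const Point& prev, const Point& curr) {
    return { curr.x + (curr.x - prev.x), curr.y + (curr.y - prev.y) };
}

int main() {
    Point frame1{100, 120};   // position two frames ago
    Point frame2{110, 118};   // position in the latest frame
    Point guess = predictNext(frame1, frame2);
    std::cout << "Predicted next position: ("
              << guess.x << ", " << guess.y << ")\n";   // prints (120, 116)
    return 0;
}
```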
Camera Control
Camera control means moving the camera in the direction the object is moving while the camera keeps collecting image information. It usually requires coordination with a pan-tilt (gimbal) platform or another motion mechanism.
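As a hedged sketch of this control step (not HUSKYLENS's actual implementation), the snippet below uses simple proportional control: the further the object's X coordinate drifts from the image center (160 on a 320-pixel-wide frame), the more the pan angle is nudged toward it. The gain value is an arbitrary choice for this example:

```cpp
#include <iostream>

// Proportional pan control for a 320-pixel-wide image.
// 'kp' is an illustrative gain; a real system would tune it experimentally.
double updatePanAngle(double panAngleDeg, double objectX) {
    const double centerX = 160.0;  // image center for a 320x240 frame
    const double kp = 0.05;        // degrees of correction per pixel of error
    double error = objectX - centerX;
    return panAngleDeg + kp * error;
}

int main() {
    double pan = 90.0;        // pan mechanism starts at its middle position
    double objectX = 220.0;   // object seen to the right of center
    pan = updatePanAngle(pan, objectX);
    std::cout << "New pan angle: " << pan << " deg\n";  // prints 93 deg
    return 0;
}
```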
3. Application fields of object tracking
Smart Video Monitoring
This includes motion-based recognition (such as gait-based human identification and automatic object detection), automated surveillance (monitoring for suspicious behavior), and traffic monitoring (collecting real-time traffic data to direct traffic).
Human-computer Interaction
Traditional human-computer interaction is carried out with a keyboard and mouse. For a computer to recognize and understand posture, movement, and gestures, tracking technology is the key.
Robot Visual Navigation
For smart robots, tracking technology can be used to compute the motion trail of the object being filmed.
VR
3D interaction and virtual-character motion simulation in virtual environments benefit directly from research on video-based human motion analysis and provide richer forms of interaction for participants; human body tracking and analysis are the key technologies involved.
Demonstration of HUSKYLENS Object Tracking
The object tracking function of HUSKYLENS uses its built-in algorithm to learn the characteristics of an object, enabling it to track the object's position and feed back the coordinate values of that position to the main controller. By reading the object's position values, the main controller can trigger a buzzer, realizing a tracking notification.
1. Object learning
Different from color recognition and face recognition, object tracking can learn and recognize an object (or person) as a whole. Color recognition only looks at color, and face recognition only looks at one part of the human body, while object tracking learns the overall characteristics of the object in order to track it.
Point HUSKYLENS at the target object and adjust the distance until the object is contained in the yellow frame at the center of the screen. If the object cannot be completely contained in the frame, framing its distinctive features is also fine. Then long press the "learning button" to learn the object at various angles and distances. During the learning process, a frame with the words "Learning: ID1" is displayed on the screen.
When HUSKYLENS can track the object at different angles and distances, you can release the "learning button" to end the learning.
Note: If there is no yellow frame in the center of the screen, it means that HUSKYLENS has learned an object before. Please let it forget the learned one and learn again.
2. Disable learning function
After the learning is completed, long press the "function button" to enter the parameter-setting submenu of the object tracking function.
Dial the "function button" to the left or right to select "learning on", then short press the "function button". Dial the "function button" to the left to turn the "learning on" switch off, so that the progress bar turns white and sits at the left side. Finally, save and return.
Project Practice
The project is divided into two tasks. First, learn to use the object tracking function of HUSKYLENS and read the coordinate data of the object. Then add a buzzer on this basis to achieve the final tracking notification function.
Task 1: Function of object tracking and the coordinate values
Obtain the coordinates of the object on the screen through the HUSKYLENS sensor, and then judge the relative position of the object on the screen from those coordinate values.
Task 2: Audio feedback notification
When the HUSKYLENS sensor is tracking an object and the target is lost, it uses a buzzer to provide an audio feedback prompt.
Task 1: Function of Object Tracking and Coordinate Values
Hardware connection
Program Design
Function instruction
Before starting, let HUSKYLENS learn the target to be tracked: Mind+ in camouflage.

Coordinate analysis
The screen resolution of HUSKYLENS is 320×240. As shown in the figure, the center coordinates of the object obtained through the program also fall within this range. For example, if the obtained coordinate value is (160, 120), the tracked object is at the center of the screen.
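As a small illustrative helper (not part of the Mind+ sample program that follows), the X coordinate can be classified into left, center, or right; the ±40 pixel band counted as "center" is an arbitrary choice for this example:

```cpp
#include <iostream>
#include <string>

// Classify where a tracked object sits horizontally on the 320x240 screen.
std::string horizontalPosition(int xCenter) {
    if (xCenter < 0)   return "lost";    // -1 means the tracked object is lost
    if (xCenter < 120) return "left";
    if (xCenter > 200) return "right";
    return "center";
}

int main() {
    std::cout << horizontalPosition(160) << "\n";  // center
    std::cout << horizontalPosition(40)  << "\n";  // left
    std::cout << horizontalPosition(-1)  << "\n";  // lost
    return 0;
}
```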
Flowchart Analysis
Sample program
Operating Effect
Run the program; HUSKYLENS will track the learned object and display its coordinates on the HUSKYLENS screen.
Note: If -1 and -1 are displayed on the screen, the tracked object is lost.
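The sample program above is built from Mind+ graphical blocks. For readers working in text code, here is a rough Arduino-style equivalent; it assumes the official HUSKYLENS Arduino library (HUSKYLENS.h) connected over I2C, and names such as ALGORITHM_OBJECT_TRACKING, request(), available(), and read() are taken from that library, so check them against the library version you install:

```cpp
#include <Wire.h>
#include "HUSKYLENS.h"

HUSKYLENS huskylens;

void setup() {
    Serial.begin(115200);
    Wire.begin();
    // Retry until the sensor answers on the I2C bus.
    while (!huskylens.begin(Wire)) {
        Serial.println("HUSKYLENS not found, check the wiring...");
        delay(500);
    }
    // Switch the sensor to the object tracking algorithm.
    huskylens.writeAlgorithm(ALGORITHM_OBJECT_TRACKING);
}

void loop() {
    // Ask the sensor for the latest tracking result.
    if (huskylens.request() && huskylens.available()) {
        HUSKYLENSResult result = huskylens.read();
        Serial.print("X = ");
        Serial.print(result.xCenter);    // center X of the tracked object (0-320)
        Serial.print("  Y = ");
        Serial.println(result.yCenter);  // center Y of the tracked object (0-240)
    } else {
        // No result: treat the object as lost, matching the -1/-1 note above.
        Serial.println("X = -1  Y = -1");
    }
    delay(100);
}
```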

Task 2: Audio Feedback Notification
Hardware connection
Program Design
Function instruction:
When HUSKYLENS is tracking an object, if the displayed coordinates on the screen are X=-1, Y=-1, it indicates that the tracked object has been lost, and the buzzer will start to provide an audio prompt; otherwise, the audio prompt will be turned off.
Flowchart Analysis
Sample program
Operating Effect
When the tracked object is lost (X=-1, Y=-1), the buzzer starts the audio prompt. When the tracked object appears on the screen, the screen displays the X and Y coordinate values of the tracked object and the buzzer turns off the audio prompt.
When the target is lost, start the audio prompt.
Display the target's X, Y coordinates and turn off the audio prompt.
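Again, the sample program itself uses Mind+ blocks; the hedged Arduino-style sketch below shows the same logic, assuming the HUSKYLENS library from Task 1 and an active buzzer on digital pin 8 (the pin is an arbitrary choice for this example): coordinates present means buzzer off, object lost means buzzer on.

```cpp
#include <Wire.h>
#include "HUSKYLENS.h"

HUSKYLENS huskylens;
const int BUZZER_PIN = 8;   // assumed wiring: active buzzer on digital pin 8

void setup() {
    Serial.begin(115200);
    Wire.begin();
    pinMode(BUZZER_PIN, OUTPUT);
    while (!huskylens.begin(Wire)) {
        Serial.println("HUSKYLENS not found, check the wiring...");
        delay(500);
    }
    huskylens.writeAlgorithm(ALGORITHM_OBJECT_TRACKING);
}

void loop() {
    bool tracked = false;
    if (huskylens.request() && huskylens.available()) {
        HUSKYLENSResult result = huskylens.read();
        Serial.print("X = ");
        Serial.print(result.xCenter);
        Serial.print("  Y = ");
        Serial.println(result.yCenter);
        tracked = true;
    }
    // Target lost (the screen would read -1, -1): sound the buzzer.
    digitalWrite(BUZZER_PIN, tracked ? LOW : HIGH);
    delay(100);
}
```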
Project Review
In this lesson, we mainly learned to use the object tracking function of HUSKYLENS to find the specific coordinate values (X, Y) of the tracked object and to use a buzzer as a prompt when the target is lost. In real life, the object tracking function can be applied to many more scenarios, such as tracking suspects or sending a reminder when a child leaves the visible range.
Project Development
Extend this project to achieve the following functions: displaying the location of the target person and activating an audio reminder when the person appears. Additionally, can we make adjustments so that the volume of the sound varies with the distance? For example, the closer the target, the louder the sound; the farther the target, the softer the sound.
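One possible starting point, sketched under assumptions rather than given as a finished solution: the HUSKYLENS library also reports the width of the tracked object's frame, which grows as the target gets closer, so width can stand in for distance. On a simple buzzer, pitch is easier to vary than true loudness, so the sketch below maps width to the tone frequency instead; the pin number and mapping range are illustrative only.

```cpp
#include <Wire.h>
#include "HUSKYLENS.h"

HUSKYLENS huskylens;
const int BUZZER_PIN = 8;   // assumed: passive buzzer on a tone-capable pin

void setup() {
    Wire.begin();
    pinMode(BUZZER_PIN, OUTPUT);
    while (!huskylens.begin(Wire)) {
        delay(500);   // keep retrying until the sensor responds
    }
    huskylens.writeAlgorithm(ALGORITHM_OBJECT_TRACKING);
}

void loop() {
    if (huskylens.request() && huskylens.available()) {
        HUSKYLENSResult result = huskylens.read();
        // result.width is the tracked frame's width in pixels: a wider frame
        // means a closer target. Map it to a pitch that rises as the target
        // approaches (illustrative mapping).
        int width = constrain(result.width, 20, 320);
        int pitch = map(width, 20, 320, 200, 2000);   // Hz
        tone(BUZZER_PIN, pitch);
    } else {
        noTone(BUZZER_PIN);   // target lost: silence
    }
    delay(100);
}
```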
