Vending Machine for Stray Cats & Dogs | Huskylens Playground with micro:bit EP05
We may often see abandoned cats or dogs wandering around the city and searching for food in trash bins. As animal lovers, we feel sad that these little guys are forced to suffer for whatever reason. All animals deserve to be loved and nurtured. Can we do anything for them?
Wouldn’t it be great if there were a feeding machine that could tell whether the animal in front of it is a cat or a dog and dispense the corresponding food, like a vending machine for the stray animal world? Let’s use HuskyLens to make one!
Function Introduction:
The structure of this project is built mainly from waste materials such as delivery cardboard boxes and plastic bottles. We will then use the object recognition function of HuskyLens to distinguish cats from dogs through machine learning, and a micro:bit will process the result and control a servo to open the corresponding valve and deliver the right food to cats and dogs.
Materials:
Micro:bit https://www.dfrobot.com/product-2125.html
IO Extender for micro:bit V2.0 https://www.dfrobot.com/product-1867.html
HUSKYLENS https://www.dfrobot.com/product-1922.html
180 Micro servo x 1 https://www.dfrobot.com/product-255.html
Knowledge Field:
Image recognition technology is an important field of artificial intelligence. It refers to the ability of a computer-powered camera to identify and detect objects or features in a digital image or video. In this project we will use the image recognition of HuskyLens to distinguish and recognize cats and dogs.
1. What is Image Recognition?
Image recognition, a practical application of deep learning algorithms, refers to the processing, analysis, and understanding of images by a computer so as to recognize targets and objects of different patterns. At present it is mainly divided into face recognition and product recognition. The former is mainly used in security checks, identity verification, and mobile payment, while the latter is usually found in commodity circulation, especially in unmanned shops, intelligent retail counters, and other unmanned sales scenarios.
Operating Principles:
Four steps for traditional image recognition:
Image Capture: capture the image with the camera in preparation for later recognition.
Image Pre-processing: analyze and process the image through a series of algorithms.
Feature Extraction: based on the information processed in the previous step, extract the key information, such as color, outline, etc.
Image Recognition: compare the extracted information with a sample library. The image recognition of the HuskyLens sensor includes a built-in sample library, which can be enriched through learning.
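To make these four steps concrete, here is a minimal Python sketch of a traditional recognition pipeline. It uses OpenCV and a simple brightness-histogram match purely for illustration; it is an assumption-laden teaching example, not how HuskyLens works internally.

```python
# A simplified illustration of the four classic steps using OpenCV.
# This is NOT the HuskyLens internal algorithm, only a teaching sketch.
import cv2

def recognize(image_path, sample_library):
    # 1. Image capture: here we simply load a saved photo from disk.
    image = cv2.imread(image_path)

    # 2. Pre-processing: convert to grayscale and reduce noise.
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    smooth = cv2.GaussianBlur(gray, (5, 5), 0)

    # 3. Feature extraction: use a brightness histogram as a very simple "feature".
    hist = cv2.calcHist([smooth], [0], None, [32], [0, 256])
    hist = cv2.normalize(hist, hist).flatten()

    # 4. Recognition: compare against every sample in the library and
    #    return the label of the closest match.
    best_label, best_score = None, -1.0
    for label, sample_hist in sample_library.items():
        score = cv2.compareHist(hist, sample_hist, cv2.HISTCMP_CORREL)
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```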
Similarities and differences between image recognition and other recognition:
We have already learned about several camera-based recognition functions, such as face recognition and color recognition. What are the differences between them?
We can infer that face recognition, as a branch of image recognition, is specifically used for distinguishing human faces. Imagine this scenario: when a group of people passes the camera, the name of each person can be “called out” if their information has been input in advance, while general image recognition can only give the result “human, human, human...”, because it recognizes object categories but cannot distinguish individuals.
Image recognition may also seem similar to object tracking. Both are recognition functions, but technically, object tracking can only learn and track one object, while image recognition can recognize multiple objects. This is because object tracking learns an object from different angles so that it can be tracked accurately, whereas image recognition learns only one view, so recognition may fail from another angle.
Color recognition and QR code recognition are easy to distinguish from image recognition, since each of them is oriented toward a specific function.
Application Scenarios:
1. Biomedicine:
Image recognition is widely used in modern medicine because it is explicit, non-invasive, safe, and convenient, especially in clinical diagnosis and pathological research. For example, during the COVID-19 pandemic, AI was deployed to quickly review patients’ CT scans.
2. Remote Sensing Image Recognition
Aerial remote sensing and satellite remote sensing images are usually processed with image recognition to extract useful information. This technology is mainly used for terrain and geological exploration, forest, water, marine, agricultural and other resource surveys, disaster prediction, environmental pollution monitoring, meteorological satellite cloud image processing, and ground military target recognition.
2. Object Recognition in HuskyLens
This function is able to recognize what an object is and track it. 20 objects are supported: airplane, bicycle, bird, boat, bottle, bus, car, cat, chair, cow, dining table, dog, horse, motorbike, person, potted plant, sheep, sofa, train, and television. The default setting is to frame and recognize a single object. In this chapter, we are going to frame and recognize multiple objects as an example.
(1) Operating Setting
Dial the “function button” to the left until “Object Recognition” is displayed at the top of the screen. Long press the “function button” to enter the parameter setting interface of the object recognition sub-menu.
Dial the “function button” to the left or right to select “Learn Multiple”. Then dial to the right to turn on the “Learn Multiple” switch, that is, the progress bar turns blue and the square icon on the progress bar moves to the right. After that, short press the “function button” to confirm this parameter.
Dial the “function button” to the left to select “Save & Return”, then short press the “function button”. The screen will display “Do you want to save the parameters?” with “Yes” selected by default. Short press the “function button” again; the data will be saved and HuskyLens will automatically return to object recognition mode.
(2) Object Detection
When detecting objects, HuskyLens will automatically recognize them, and each object will be displayed on the screen inside a white frame with its name. At present, only the 20 built-in objects can be recognized; other objects cannot be recognized for now.
(3) Marking Object
Point the “+” symbol at the object, then short press the “learning button”. While the button is pressed, the color of the frame changes from white to blue, and the name of the object and its ID number appear on the screen. A notice will be shown: “Click again to continue! Click other buttons to finish”. If you want to learn other objects, short press the “learning button” again before the countdown ends. If not, short press the “function button” before the countdown ends, or simply press no button and let the countdown end.
The ID number corresponds to the order in which objects are marked. For example, the IDs will be displayed as “ID1”, “ID2”, “ID3” in order, and different objects are matched with frames of different colors.
(4) Object Recognition
When the learned objects are encountered, they will be selected by colored frames, and their names and ID numbers will be displayed. The size of a frame changes with the size of the object, tracking it automatically. Objects of the same kind share frames of the same color, name, and ID. Simultaneous recognition of multiple types of objects is also supported, such as recognizing cats and dogs at the same time.
This function can be used as a simple filter to find out what you need from a bunch of objects.
*Tip: This function cannot distinguish between different objects of the same category. For example, it can only recognize that something is a cat, but not which cat it is. This differs from face recognition, which can distinguish between different faces.
Project Practice
We will complete the task in two steps. First, we will learn to use the image recognition function of HuskyLens and output the recognized result. Then, we will dispense the corresponding food according to which animal is approaching.
Task 1: Distinguish cats and dogs
In this step, we need to enable the HuskyLens camera to recognize and distinguish cats and dogs and give feedback so that we can release the corresponding food in the next step.
Task 2: Add the function of sending food
In this step, you need to add the function of sending food based on the previous step, and make the corresponding structure.
Task 1: Distinguish cats and dogs
1. Hardware Connection
The HuskyLens sensor uses the I2C interface. Pay attention to the wire sequence, and do not connect it incorrectly or in reverse.
2. Program Design
Here we need to let the HuskyLens sensor learn images of cats and dogs, and then output the recognition result for each.
Step 1. Learning and Recognition
Before designing the program, we need to let the HuskyLens sensor learn the images of cats and dogs. (Note that you need to enable the “Learn Multiple” function first.)
Step 2. Mind+ Software Setting
Open Mind+ (version 1.62 or above), switch to “Offline”, click “Extension”, click “micro:bit” under “Board”, then click “HuskyLens AI Camera” under “Sensor”.
Step 3. Command Learning
Here are the main commands used.
① Initialize only once, between the start of the main program and the loop. You can select I2C or soft-serial, and there is no need to change the I2C address. Please note that the “Output Protocol” of your HuskyLens sensor should be set consistently with the program; otherwise, data cannot be read.
② You can switch to other algorithms freely, but please note that only one algorithm can run at a time, and switching algorithms takes some time.
③ The main controller requests HuskyLens to store data into the “Result” once (the data is kept in a memory variable on the main board, and each request refreshes it once); the data can then be obtained from the “Result”. The latest data is obtained from the “Result” only when this block is called.
④ Check from the requested “Result” whether there is any frame or arrow on the screen, including learned (ID > 0) and unlearned ones; it returns 1 if there is one or more.
⑤ Check from the requested “Result” whether ID x has been learned.
⑥ Check from the requested “Result” whether ID x is on the screen. “Frame” refers to algorithms that draw frames on the screen, and “arrow” refers to algorithms that draw arrows. Select “arrow” only for the line-tracking algorithm; for all others, choose “frame”.
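To show how these six commands fit together, here is a rough MicroPython-style sketch. The huskylens_* functions are hypothetical placeholder names standing in for the corresponding Mind+ blocks (defined here as empty stubs), not an official library API.

```python
# Placeholder stubs standing in for the six Mind+ blocks above; in a real
# project each one would talk to the HuskyLens over I2C. Hypothetical names.
def huskylens_init_i2c(): pass                          # block 1: initialize once
def huskylens_switch_algorithm(name): pass              # block 2: one algorithm at a time
def huskylens_request(): pass                           # block 3: refresh the "Result"
def huskylens_frame_on_screen(): return False           # block 4: any frame/arrow at all?
def huskylens_is_learned(id_num): return False          # block 5: has ID x been learned?
def huskylens_id_on_screen(id_num, kind): return False  # block 6: is ID x visible?

huskylens_init_i2c()                                    # run once at start-up
huskylens_switch_algorithm("OBJECT_RECOGNITION")        # pick the algorithm

while True:
    huskylens_request()                                 # refresh data before reading
    if huskylens_frame_on_screen() and huskylens_is_learned(1):
        if huskylens_id_on_screen(1, "frame"):          # object recognition draws frames
            pass                                        # react to the learned object here
```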
Step 4. Flowchart Analysis
3. The Sample Program
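The actual Mind+ sample program is given as a block-based screenshot in the original material. As a rough text equivalent, below is a hedged MicroPython-style sketch that reuses the hypothetical huskylens_* helpers from the previous step and assumes the cat was learned first (ID1) and the dog second (ID2).

```python
from microbit import display, Image, sleep

# Assumes the huskylens_* placeholder helpers from the previous sketch are
# available, and that ID1 = cat and ID2 = dog (learning-order assumption).
huskylens_init_i2c()
huskylens_switch_algorithm("OBJECT_RECOGNITION")

while True:
    huskylens_request()                          # refresh the "Result"
    if huskylens_id_on_screen(1, "frame"):       # a cat is on screen
        display.show(Image.MUSIC_QUAVER)         # note pattern for a cat
    elif huskylens_id_on_screen(2, "frame"):     # a dog is on screen
        display.show(Image.HEART)                # heart pattern for a dog
    else:
        display.clear()
    sleep(500)
```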
4. Operating Effect
When the HuskyLens sensor recognizes a cat, a note pattern is displayed on the micro:bit; when it recognizes a dog, a heart pattern is displayed.
Task 2: Add the Function of Dispensing Food
1. Structure Building
First of all, we chose a square box as the main body of the vending machine. The structure is simple and firm, and the lid is easy to open and close for internal maintenance, which makes it quite suitable for this project.
Take two plastic bottle caps and open a hole in each one as shown in the figure. Fix them over the holes on the top of the cardboard box (cut out with scissors first), installing them symmetrically.
Find two plastic sheets and fold them in half to create a hinge effect. Cut out a slot at one end for easier and more stable binding later.
Fix the folded plastic sheets with glue on the inside of the box. Make sure they are symmetrical, and that the opening is large enough for food to come out when the hinge is opened.
This is a very critical step. First, fix the servo with glue. Make two holes on the back of the paper box: one to leave space for the servo arm’s rotation and the other for the servo wire connection. Remember to calibrate the 90-degree position of the servo. Then tie the left end of the servo arm with a rope and wrap it around the plastic sheet on the left. Pass the rope along the outside of the box, wrap it around the plastic sheet on the right, and finally tie it to the right end of the servo arm, as shown below.
Build the cat food and dog food tracks inside the box with cardboard.
Open holes in the corresponding positions of the box lid so that the food can come out smoothly.
Fix the expansion board, micro:bit, and HuskyLens in order on the top. Drive two screws into the front of the box to adjust the tightness of the rope, and therefore the tightness of the two valves. Since cat food and dog food may vary in size, these screws can also be used to indirectly adjust the exit opening.
This completes the structure. After the electronic part is debugged, we will add the plastic bottles for storing the food.
2. Hardware Connection
Here we connect the HuskyLens camera to the I2C interface of the expansion board through a 4-pin cable as the input device of the entire system, and then connect the servo to the P0 port of the expansion board, paying attention to the correct wire order.
3. Sample Program
Then we connect the micro:bit board to the computer and open Mind+ to start programming. First, test driving the servo from the micro:bit board, confirm that the valves open and close properly, and adjust the tightness of the rope appropriately, as in the sketch below.
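For reference, here is a hedged servo test in micro:bit MicroPython using the standard pins API, with the servo signal assumed to be on P0. The 0/90/180-degree positions and the 1–2 ms pulse range are only starting assumptions to adjust against your own servo and rope tension.

```python
from microbit import pin0, sleep

def set_servo_angle(pin, angle):
    # Standard 50 Hz servo signal: roughly 1 ms pulse = 0 deg, 2 ms pulse = 180 deg.
    pin.set_analog_period(20)                    # 20 ms period = 50 Hz
    pulse_ms = 1.0 + angle / 180.0               # map 0-180 deg to a 1-2 ms pulse
    pin.write_analog(int(1023 * pulse_ms / 20))  # duty value out of 1023

# Sweep between three positions to check how far each valve opens.
while True:
    set_servo_angle(pin0, 90)    # centre: both valves closed
    sleep(2000)
    set_servo_angle(pin0, 0)     # one side: open the first valve
    sleep(2000)
    set_servo_angle(pin0, 180)   # other side: open the second valve
    sleep(2000)
```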
The final main program is much simpler. On start-up, initialize the micro:bit LED matrix, the servo position, and the HuskyLens camera. Recognition is performed at a fixed interval; once a cat or a dog is recognized, the servo rotates to open the corresponding valve and the matching food is released.
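The Mind+ program itself is shown as blocks in the original. As an approximate text version, the sketch below combines the hypothetical huskylens_* helpers and the set_servo_angle() routine from the earlier sketches; the IDs (ID1 = cat, ID2 = dog), the valve angles, and the delays are all assumptions to be tuned on the real machine.

```python
from microbit import display, Image, pin0, sleep

# Assumes huskylens_* placeholders and set_servo_angle() from the sketches above.
huskylens_init_i2c()
huskylens_switch_algorithm("OBJECT_RECOGNITION")
set_servo_angle(pin0, 90)                         # start with both valves closed

while True:
    huskylens_request()
    if huskylens_id_on_screen(1, "frame"):        # a cat is approaching
        display.show("1")
        set_servo_angle(pin0, 0)                  # open the cat food valve
        sleep(1000)                               # give the food time to drop
        set_servo_angle(pin0, 90)                 # close the valve again
        display.show(Image.MUSIC_QUAVER)
    elif huskylens_id_on_screen(2, "frame"):      # a dog is approaching
        display.show("2")
        set_servo_angle(pin0, 180)                # open the dog food valve
        sleep(1000)
        set_servo_angle(pin0, 90)
        display.show(Image.HEART)
    sleep(500)
```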
4. Operating Effect
When a cat is recognized approaching, the micro:bit matrix displays 1, the cat food is delivered, and a musical note is displayed afterwards. When a dog is recognized approaching, the matrix displays 2, the dog food is delivered, and a heart is displayed when done.
Project Summary:
Project Review:
Using materials that can be found everywhere in our daily lives, combined with the HuskyLens camera, we completed a vending machine for strays step by step. It is not only good news for strays, but also makes it convenient for kind-hearted people to add food on their own and contribute their love to help them.
Knowledge Review:
1. Understand the working principle of object recognition;
2. Learn the procedure for teaching HuskyLens to recognize objects;
3. Use HuskyLens as an input device combined with the micro:bit board and other hardware.
Project Development:
After completing the stray feeding machine, if you want to help strays even more, can you add a statistics function that records the number of cats and dogs coming to the feeding machine to eat each day, so that the food can be rationed more sensibly? A rough counting sketch follows below.
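As one possible starting point for this extension, here is a hedged sketch that reuses the hypothetical huskylens_* helpers from Task 1. It only counts a new visit when the animal was absent on the previous check, so one long meal is not counted many times; everything here is an assumption to adapt.

```python
# Sketch of a daily visit counter (hypothetical huskylens_* helpers assumed).
cat_visits, dog_visits = 0, 0
cat_present, dog_present = False, False

while True:
    huskylens_request()
    cat_now = huskylens_id_on_screen(1, "frame")
    dog_now = huskylens_id_on_screen(2, "frame")
    if cat_now and not cat_present:
        cat_visits += 1          # a new cat visit has just started
    if dog_now and not dog_present:
        dog_visits += 1          # a new dog visit has just started
    cat_present, dog_present = cat_now, dog_now
```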
Next Class: Following the Right Track | Huskylens Playground with micro:bit EP06