Lesson 3: Set Pet Interaction Mode

Have you ever thought about making your electronic pet not only display expressions but also interact with you? In this project, the light sensor on the UNIHIKER K10 will simulate the pet’s “head,” while the left and right light sensors on the Maqueen Plus will simulate the pet’s “feet.” This setup allows you to interact with your pet through simple movements, letting it sense your “presence” and respond accordingly. Let’s explore together how to make your electronic pet more lively and fun!

Task Objective

Using the light sensors, set up interaction modes for the electronic pet so it responds differently to the user's actions. When the light sensor on the UNIHIKER K10 is touched, the pet's expression changes to happy and it spins in place; when the left light sensor on the Maqueen Plus is touched, the pet's expression changes to naughty and it backs away in avoidance.

Knowledge Points

1. Understand the role of sensors in robot interaction

2. Understand the role of the "input-processing-output" flow in interaction design

3. Learn how to obtain values from light sensors and perform threshold judgment

Materials List

Hardware List:

1 Maqueen Plus V3
1 UNIHIKER K10

Software Used: Mind+ Programming Software (Version V1.8.1 RC1.0 or above) ×1

Download Link: https://mindplus.cc/

Practical Exercise

This project is completed step-by-step through two tasks to design and implement the interactive mode of the electronic pet. Task 1 focuses on acquiring sensor data and initializing the pet’s expression, while Task 2 improves the pet’s interactive logic and dynamic feedback based on sensor input.

Task 1: Light Data Collection and Expression Initialization

By reading data from the light sensors, understand the ambient lighting conditions and initialize the pet’s expression.

Task 2: Design Interactive Logic

Based on the changes in data collected from the light sensors, design and implement the dynamic reaction logic of the electronic pet. This allows the pet to display different emotions or actions according to environmental changes, enhancing interactivity and liveliness. The specific implementation is as follows:

· When the light sensor on the UNIHIKER K10 is touched, the pet spins in place and switches its expression to “happy.”

· When the left light sensor on the Maqueen Plus is touched, the pet moves backward to avoid, simulating a shy or evasive reaction, and switches its expression to “naughty.”

Task 1: Light Data Collection and Expression Initialization

1. Hardware Connection

Use a USB 3.0 to Type-C data cable to connect the assembled robot car to the computer.

Note: The Type-C end should be connected to the UNIHIKER K10.

2. Software Preparation

Open Mind+ and complete the software setup as shown in the following illustrations.

3. Programming

(1) Read Light Sensor Values

In this task, we need to view the light sensor values via serial output. Therefore, the "serial write string Wrap" command is used to display the sensor readings.

To obtain the ambient light intensity on the UNIHIKER K10, use the "read light" command in the "UNIHIKER K10 Command Area" and embed it within the "serial write string Wrap" command to output the result to the serial port.

To obtain the light sensor data from the Maqueen Plus car, go to the "Expansion Board" command area and select the "read light intensity left" command. Similarly, embed it within the "serial write string Wrap" command to display the result.

To get the data from the right light sensor, simply change "left" to "right" in the "read light intensity left" command.
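The three readings above can be sketched in plain Python. The helper names below (`read_k10_light`, `read_maqueen_light`) are hypothetical stand-ins for the Mind+ blocks, and the values they return are stubs for illustration only:

```python
# Sketch of the sensor-reading loop. read_k10_light() and
# read_maqueen_light() are hypothetical stand-ins for the Mind+
# "read light" / "read light intensity left|right" blocks; the
# values they return here are stubs, not real measurements.
def read_k10_light():
    return 120          # stubbed K10 ambient-light reading

def read_maqueen_light(side):
    stub = {"left": 95, "right": 102}   # stubbed car readings
    return stub[side]

def report_readings():
    """Format one labeled line per sensor, as the serial blocks would."""
    return [
        f"K10 light: {read_k10_light()}",
        f"Maqueen left: {read_maqueen_light('left')}",
        f"Maqueen right: {read_maqueen_light('right')}",
    ]

for line in report_readings():
    print(line)
```

Labeling each value with its sensor name, as shown here, also makes the serial output much easier to read later on.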

(2) Initialize Expression Display

When the pet is in a non-interactive state, we set its initial expression to "bored." In Lesson 2, all expression image files were already downloaded. In this lesson, we will use the "7-Bored" expression image directly.

Do you remember how to display the expression on the UNIHIKER K10 screen? Use the “cache local image” command and display the image at the position (X: 0, Y: 0). Click the “Settings” icon in the middle of the command, select “Open” from the dropdown menu, and then choose the expression image from your local folder (where the image is saved). When setting the image properties, set the width to 240 and the height to 320 to ensure the image is fully displayed on the screen.

Next, use the “show cached content” command to refresh the cached image onto the screen. Since the expression is displayed dynamically, the “wait 0.1 seconds” command should be used to control the display time of each frame.

The remaining three images can be set one by one using the same method, loaded and displayed in sequence to form a complete animated "bored" expression. The full program is shown below:
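As a rough Python analogue of that block sequence (not the actual Mind+ program), the cache-then-show loop might look like this. The `cache_image`/`show_cache` helpers and the frame file names are assumptions for illustration:

```python
import time

# Sketch of the four-frame "bored" animation. cache_image() and
# show_cache() only mimic the spirit of the Mind+ "cache local image"
# and "show cached content" blocks; the frame file names are assumed.
shown = []                      # records which frames were displayed

def cache_image(path, x=0, y=0, width=240, height=320):
    return {"path": path, "pos": (x, y), "size": (width, height)}

def show_cache(frame):
    shown.append(frame["path"])

frames = ["7-Bored-%d.png" % i for i in range(1, 5)]  # assumed names
for name in frames:
    show_cache(cache_image(name))
    time.sleep(0.1)             # hold each frame for 0.1 s

print(shown)
```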

4. Program Execution

Before running the program, please ensure that the UNIHIKER K10 is properly connected to the computer via a USB cable. Once the connection is confirmed, click the “Run” button. After the program finishes uploading, the "bored" expression will be dynamically displayed on the UNIHIKER K10 screen.

To view the detected light sensor values, simply open the serial port window in the terminal display area of Mind+, and you can see the sensor output data in real time. Try covering one of the light sensors with your hand and observe whether the values change!

Note: To view data via the serial port, the UNIHIKER K10 must be connected to the computer using a USB cable.

5. Give It a Try

Since the light sensor values update very quickly in the serial port, it's difficult to directly determine which value corresponds to which sensor. So, is there a way to help us quickly distinguish the light intensity detected by each sensor? Try to think creatively — is there a clearer display method or labeling strategy that could help solve this problem?

Task 2: Design Interactive Logic

1. Programming

Building on Task 1, continue improving the program by adding interactive logic. The entire task logic can be divided into Head Interaction, Left Foot Interaction, and Right Foot Interaction.

(1) Head interaction

When the user covers the light sensor on the UNIHIKER K10 (simulating a "head-pat" action), the pet responds with a "happy" reaction: it first spins in place, then switches to a "happy" expression, and displays this state on the screen.

Create a new function called "Head interaction", and complete the following operations:

· Motion Control

Use the "set motor direction rotate speed" block from the "Expansion Board" section to make the left and right motors rotate in opposite directions at the same speed (100), enabling the pet to spin in place.

Turn Left: Set the left motor direction to backward and the right motor direction to forward, and keep this state for 3 seconds.

Turn Right: Set the left motor direction to forward and the right motor direction to backward, and keep this state for 3 seconds.

Stop: Use the "set motor stop" instruction to stop all motors.

· Expression Switch

Use the "cache local image" instruction to load the local image file "happy.png", then use the "show cached content" instruction to display or update the content on the screen. Set the expression switch speed to 0.1 seconds.

Use the "repeat 3" command to place the "show cached content" statement inside, so that the expression is displayed 3 times in a loop.

· Trigger the “Head interaction” Function

Add conditional logic in the main program to use the "read light" command to get the current value of the light sensor.

Use the "if... then" instruction to call the "Head interaction" function when the detected value is less than 50 (i.e., the light is dimmed because the hand covers the sensor).
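The whole head-interaction flow can be sketched in Python. The helper names and recorded action strings below are illustrative, not the real Mind+ block API; the threshold of 50 follows the text above:

```python
# Sketch of the "Head interaction" trigger logic. Motor and display
# calls are recorded as strings; all helper names are hypothetical,
# not the real Mind+ block API.
THRESHOLD = 50                  # "covered" when the reading drops below this
actions = []

def head_interaction():
    actions.append("left back / right forward, 3 s")   # turn left
    actions.append("left forward / right back, 3 s")   # turn right
    actions.append("motors stop")
    for _ in range(3):          # repeat the expression 3 times
        actions.append("show happy.png")

def check_head(light_value):
    # a hand over the K10 sensor dims the reading below the threshold
    if light_value < THRESHOLD:
        head_interaction()

check_head(30)    # covered: triggers the interaction
check_head(120)   # uncovered: nothing happens
print(len(actions))
```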

(2) Left foot interaction

When the user touches the “left foot” area on the Maqueen Plus V3 mainboard (touching the Light-L photosensor), the pet first moves backward to the left, then backward to the right to simulate an evasive movement, and then switches to a “naughty” expression.

Create a new function called “Left foot interaction.”

· Motion Control

Use the “set motor direction rotate speed” block from the “Expansion Board” section to set the left motor to move backward at a speed of 200 for 0.2 seconds, then use the “set motor stop” block to stop all motors.

Then use the “set motor direction rotate speed” block to set the right motor to move backward at a speed of 200 for 0.2 seconds, and use the “set motor stop” block to stop all motors again.

Use the “repeat 3” block to place the pet’s backward evasion program inside it, so that the pet performs the backward evasion three times.

· Expression Switch

We've practiced the facial expression switching function many times already. This time, let's choose the "naughty" expression to display!

Note: Make sure the expression switches three times in a row again!

· Trigger the "Left Foot Interaction" Function

Add conditional logic in the main program. Use the "read light intensity left" command from the Expansion Board to obtain the value from the car's left light sensor.

Use the "if...then" command. When the detected value is less than 50 (indicating the light has dimmed, such as when a hand covers the Light-L sensor), call the "Left foot interaction" function.
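As with the head interaction, the left-foot flow can be sketched with stubbed helpers. The names and recorded strings are illustrative only:

```python
# Sketch of the "Left foot interaction" logic; helper names and the
# recorded strings are illustrative, not the real Mind+ block API.
THRESHOLD = 50
moves = []

def left_foot_interaction():
    for _ in range(3):                       # repeat the evasion 3 times
        moves.append("left motor back 200, 0.2 s")
        moves.append("stop")
        moves.append("right motor back 200, 0.2 s")
        moves.append("stop")
    for _ in range(3):                       # expression shown 3 times
        moves.append("show naughty.png")

def check_left_foot(left_light):
    if left_light < THRESHOLD:               # Light-L covered by a hand
        left_foot_interaction()

check_left_foot(20)   # covered: triggers the evasion
print(len(moves))
```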

The complete program is as follows:

2. Program Execution

Before running the program, please ensure that the UNIHIKER K10 is properly connected to the computer using a USB cable. Once the connection is confirmed, click the “Run” button and wait for the program to upload.

After the program starts, the screen on the UNIHIKER K10 will dynamically display the “bored” expression.

When you cover the light sensor on top of the UNIHIKER K10 (simulating a head pat), the car will turn left in place for 3 seconds, then right for 3 seconds, and switch to a “happy” expression.

When you cover the Light-L light sensor on the left side of the car (simulating touching its left foot), the car will perform evasive movements to the rear-left and rear-right three times in a row and switch to a “naughty” expression.

3. Give It a Try

In Task 2, we have already completed the programming for "Head Interaction" and "Left Foot Interaction." Next, try using the same method to complete the "Right Foot Interaction" function!

Tip: During the right foot interaction, the car should first move backward to the right and display an appropriate expression!

Knowledge Corner

1. Understanding the Role of Sensors in Robot Interaction

What is a sensor? A sensor is like the "sense organ" of a robot, allowing it to perceive the outside world. Sensors can convert information such as light, distance, and temperature into signals that are sent to the main control board for processing. Here are a few basic sensors as examples.

How do sensors help robots achieve interaction? For a robot to interact, it must first “perceive” human actions or changes in the environment. Sensors serve as the bridge—the “information channel”—for this process.

· Acquire external information: Allow the robot to “sense” whether you are interacting with it.

· Trigger behavioral responses: When specific changes are detected, the robot performs corresponding actions (such as turning or changing expressions).

· Enable human-robot interaction: The robot “responds” to your actions based on sensor data, enhancing both fun and intelligence.

2. The Role of the "Input–Process–Output" Flow in Interaction Design

In the interaction between robots and humans, any complete behavior process typically involves three key stages: input, process, and output.

The “Input–Process–Output” flow is the core logic of interaction design. Without proper input evaluation, the robot cannot respond accurately. You can think of every interaction design as such a flow: just figure out which sensor to use to perceive what, under what conditions to consider the interaction “triggered,” and how you want the robot to “respond.” With this clear thinking, you can design fun and engaging robot interactions!
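The Input–Process–Output flow described above can be condensed into one small function: sensor values come in, a threshold judgment processes them, and a behavior stands in for the output. The threshold of 50 follows the tasks above; everything else is illustrative:

```python
# Minimal sketch of the input-process-output flow for this pet.
# Input: two light readings. Process: threshold judgment.
# Output: a behavior string standing in for motion + expression.
def decide(k10_light, left_light, threshold=50):
    if k10_light < threshold:        # head covered
        return "spin in place + happy"
    if left_light < threshold:       # left foot covered
        return "back away + naughty"
    return "idle + bored"            # no interaction detected

print(decide(30, 120))   # head pat
print(decide(120, 30))   # left foot touch
print(decide(120, 120))  # nothing happening
```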

Challenge Yourself

In previous interaction designs, we made the electronic pet respond with rich actions like rotating, moving backward, and switching expressions by touching different light sensors. Now, let’s add sound so the pet can express emotions through different buzzer tones, creating a more vivid interactive experience! You can try:

· Adding a cheerful beep sound to the “Head Interaction”, so the pet makes a happy “beep beep” while rotating;

· Adding a naughty sound effect to the “Left Foot Interaction”, such as alternating high and low tones to increase fun;

· Designing your own “sound expression library”, so each interaction is accompanied by a unique emotional tone.

Give it a try! Can your electronic pet convey emotions and reactions through a combination of actions, expressions, and sounds?

Attachments: program-EN.zip (760 KB) · resource.zip (106 KB)

License: All Rights Reserved