Ever since my first Arduino kit, I have been tinkering with robots for several years, but only recently did I take on a complete project. Two skills opened up a new world along the way: Python and Linux. Behind both of them stands a powerful open source community. Once you have mastered these two tools, you will feel that the entire internet is your arsenal.
During onboarding at the company last week, one line won my heart: we are software engineers, not programmers. Our job is not to write programs, but to use tools sensibly to solve problems. At Google, if you feel you have to build a feature from scratch, it usually means you have not found the right tool yet. This is even more true of the open source community.
This is a remote-controlled car: an infrared remote or a wireless keyboard controls the car's movement and the camera's angle. TensorFlow watches the picture captured by the camera in real time, and a speech synthesizer reads out the objects it recognizes. All the code is on my GitHub.
The idea is not originally mine; it comes from a blog post Lukas Biewald wrote last September. The core part, TensorFlow recognizing the camera image and speaking the result aloud, is open-sourced by Pete Warden, an artificial intelligence engineer at our company. Unlike the original post, I added an Arduino as the mechanical controller, and along the way learned how an Arduino and a Raspberry Pi talk to each other (serial communication). Many useful skills and tools came up during the build; I have organized them here, and comments are welcome!
The entire project is done from the command line, with no graphical interface. If you are not familiar with Linux, this may take some effort. But since you have already started playing with robots, why not learn Linux while you are at it? I learned Linux from "Bird's Linux Private Kitchen" (鸟哥的Linux私房菜), later tried building Linux from source code, and finally overcame the resistance to the command line that years of Windows had built up. Believe me, once you get over this hurdle, a whole new world opens up. Besides, working from the command line is cooler and more geeky, isn't it? In addition to Linux, you also need some C++ and Python to complete this project.
Also, this article mainly covers the electronics, not the mechanics or the looks. As you can see, the car is ugly enough to break my aesthetic bottom line, but I don't have the energy to fix that. I hope to build something in the future that combines aesthetics with function, perhaps together with designer friends!
First you need a Raspberry Pi with the custom Linux system installed and the wireless network configured. You also need the official camera module, set up on the Raspberry Pi. You can connect the Raspberry Pi to a monitor over HDMI, but the more convenient approach is to log in remotely over SSH, so that during debugging you do not have to repeatedly pull the Raspberry Pi off the car, hook it up to a screen, and then mount it back; you can modify the car's brain in real time. Even my Arduino programs are written, uploaded, and talked to through the Raspberry Pi, which avoids plugging the Arduino into a separate computer and keeps everything smoother.
The Raspberry Pi's Linux system also supports a graphical desktop; you can use RealVNC (for Windows) or TightVNC (for Mac) to log in to the desktop remotely. (Not needed for this project.)
This is the core part of the project, yet the easiest to get working, because everything is laid out clearly there; just follow the tutorial step by step and run the code.
Note: this uses a pre-trained model, that is, a set of parameters TensorFlow provides in advance, trained on the ImageNet photo library. In other words, the objects the car can recognize are limited to the labels contained in that library, and there is no "learning" process happening here.
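To make "recognition with a pre-trained model" concrete, below is a minimal Python sketch in the style of the TensorFlow 1.x image-recognition examples. The file names, the tensor names, and the assumption that the label file's line order matches the output indices are placeholders for illustration; the actual project uses Pete Warden's code mentioned above.

# Minimal sketch (assumed file and tensor names): classify one camera frame
# with a frozen, pre-trained ImageNet model, TensorFlow 1.x style.
import numpy as np
import tensorflow as tf

GRAPH_FILE = "inception_graph.pb"     # assumed path to the frozen model
LABELS_FILE = "imagenet_labels.txt"   # assumed label list, one label per output index

with tf.gfile.GFile(GRAPH_FILE, "rb") as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())
tf.import_graph_def(graph_def, name="")

labels = [line.strip() for line in open(LABELS_FILE)]

with tf.Session() as sess:
    image_data = open("frame.jpg", "rb").read()           # raw JPEG bytes from the camera
    softmax = sess.graph.get_tensor_by_name("softmax:0")  # output tensor name is model-specific
    preds = np.squeeze(sess.run(softmax, {"DecodeJpeg/contents:0": image_data}))
    print("I see a", labels[int(np.argmax(preds))])       # this string goes to the speech engine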
There are plenty of robot chassis out there; pick one you like. A standard kit includes a base plate, two motor-and-wheel sets, a caster wheel, and a battery case. This project does not need four-wheel drive, and the motor controller used later may only support two motors. I used the first DIY kit Zhang Yuxie sent me: a wooden board poked full of holes, plus 3D-printed wheels and connectors. This was probably RoboTerra's earliest kit, straight out of Silicon Valley.
RoboTerra's "ORIGIN" kit is now quite polished, with mature teaching resources online. The servo and metal connectors used in this project come from the second kit I was given, the ORIGIN kit. Personally, though, the rough wooden kit feels closer to my heart; it fits the spirit of "prototype with simple materials".
Power: the Raspberry Pi needs a 5V, 2A supply; a power bank with a large enough output current will do. The Raspberry Pi in turn powers the Arduino through the USB data cable. The motors, however, I run from an external supply (the battery case). You will find that even without the external supply, the power bank alone can still drive the motors (just very slowly). Still, the good habit is to power the mechanical parts separately and let the power bank supply only the logic circuits.
Next, controlling the car. There are two options here; the first does not require an Arduino. I use the second.
3.1 Raspberry Pi as the mechanical controller
I think the essence of a microcontroller is not its small size but its rich GPIO (General Purpose Input/Output), which is the window through which a program talks to the outside world. All the electronic components, probes, soldering, and breadboards you see are ultimately about dealing with GPIO. You need basic circuit knowledge, and you need to know how the pins are arranged on the board. The Raspberry Pi has a very easy-to-use Python GPIO library, gpiozero; how to use it is clear at a glance.
Usually four GPIO ports control the two motors, one pair setting the polarity across each motor, and the car drives forward / backward / turns by running each motor forward or in reverse. The standard circuit for pushing current in both directions is the H-bridge. You can buy a basic H-bridge module.
Because I don't have an H-bridge on hand, I did not implement this plan.
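For readers who do have an H-bridge module, here is a minimal sketch of what plan 3.1 could look like with gpiozero. I have not wired this up myself, and the GPIO pin numbers are placeholders that depend on how you connect the H-bridge inputs.

# Minimal sketch (placeholder pin numbers): drive two motors through an H-bridge
# with gpiozero. Each motor takes two GPIO pins, one per direction.
from time import sleep
from gpiozero import Robot

car = Robot(left=(4, 14), right=(17, 18))   # (forward_pin, backward_pin) per motor

car.forward()       # both motors forward
sleep(1)
car.left()          # left motor backward, right motor forward: turn in place
sleep(0.5)
car.backward(0.5)   # reverse at half speed (PWM)
sleep(1)
car.stop()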
3.2 Arduino as the mechanical controller
I don't have an H-bridge, but I do have a motor stacking shield for Arduino, which is essentially H-bridges mounted on top of the Arduino. So I simply let the Arduino handle the mechanics (motors + servo), acting as the body, while the Raspberry Pi only does image recognition, acting as the brain.
The Arduino does not run Linux, so you cannot log in and write programs on it directly; code has to be compiled and uploaded after you write it. I connect the Raspberry Pi and the Arduino with a USB data cable and upload programs from the Raspberry Pi after writing them there. I found a very handy command-line IDE: PlatformIO (it also has a great graphical editor). The installation on Linux is based on Python 2.7. Some initialization is needed; if you use an Arduino UNO board like I do, enter the following command:
pio init -b uno
The Arduino's C++ source code is here. After entering that folder, run the following command to compile and upload:
pio run --target upload
Later I found that PlatformIO did not seem to support C++11 for the Arduino boards. If you need that, Inotool can be considered.
There are again two options for steering the car: a wireless keyboard or an infrared remote control. I implemented both.
4.1 Wireless keyboard
If you chose option 3.1 in the previous step, the keyboard-control module can be embedded directly in the motor-control code (I did not implement this). If you chose 3.2, you need to translate key presses into mechanical control signals (in text form) on the Raspberry Pi and drive the Arduino over the serial port.
The Python code is here, using a library I wrote myself to detect key presses. That library maps a single key press to forward / backward / turn / stop and similar actions; what I really wanted was "hold the key to drive forward / backward / turn, release to stop", but I never found a ready-made library for that (update: pygame is said to support it).
Later I tried to build one with a background thread and a system delay, but the result was not ideal: the error introduced by system latency and program run time never quite matched up, so I gave up. The current code uses the single-press action / stop scheme. If you know of a good library, please recommend it!
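For anyone who wants to try the "hold to drive, release to stop" behavior, below is a minimal sketch built on pygame's KEYDOWN/KEYUP events plus pyserial. The one-character command protocol ("F", "B", "L", "R", "S"), the serial device path, and the baud rate are assumptions for illustration, not the exact protocol my code uses. Note also that pygame needs a display (for example over VNC), so a bare SSH session would call for something like curses or evdev instead.

# Minimal sketch (assumed command protocol, device path, and baud rate):
# hold an arrow key to drive, release it to stop.
import pygame
import serial

ser = serial.Serial("/dev/ttyACM0", 9600)   # assumed serial port to the Arduino

KEY_TO_CMD = {
    pygame.K_UP: b"F",      # forward
    pygame.K_DOWN: b"B",    # backward
    pygame.K_LEFT: b"L",    # turn left
    pygame.K_RIGHT: b"R",   # turn right
}

pygame.init()
pygame.display.set_mode((200, 200))         # pygame needs a window to receive key events

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        elif event.type == pygame.KEYDOWN and event.key in KEY_TO_CMD:
            ser.write(KEY_TO_CMD[event.key])   # start moving while the key is held
        elif event.type == pygame.KEYUP and event.key in KEY_TO_CMD:
            ser.write(b"S")                    # stop as soon as the key is released

ser.close()
pygame.quit()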
One thing to note: you need to disable the serial console login before using the serial port for communication (you are logging in remotely anyway); this guide explains it clearly.
4.2 Infrared remote control
A long press on an infrared remote sends a separate repeat value (REPEAT), which let me easily implement "hold to drive, release to stop". Moreover, the infrared handling is written directly in the Arduino's C++ code and does not need serial communication with the Raspberry Pi, which better fits the design principle of the Arduino as the mechanical controller.
PlatformIO does not ship with an infrared library; I use this one. Adding a third-party library in PlatformIO is extremely simple: no downloading or installing, just add the GitHub link directly, see my configuration file.
One more point: every infrared remote is different. A TV, stereo, or air-conditioner remote will all work; you just need to match the buttons to their codes before use. The pile of key codes in my code only applies to my remote. You can use this code to read out your own key codes. Note: infrared remotes come in several protocols; mine uses the most common NEC protocol. If all you get is garbage codes, try the other protocols supported by the library.
And yes, if you use an infrared remote, you also need to install an IR receiver on the car. I put mine on the Arduino, on digital pin 8.
If you go with option 3.1, you can also hook the IR receiver directly to the Raspberry Pi's GPIO.
That is enough to drive the car around. I also installed a servo on the car to control the up-and-down tilt of the camera. Its operation is very intuitive; you can figure it out from the code. I don't have an ultrasonic sensor, which could help detect obstacles and stop the car before it hits a wall.
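In my build the servo hangs off the Arduino, so the tilt code lives in the C++ source. If you went the 3.1 route and wired the servo to the Raspberry Pi's GPIO instead, a minimal gpiozero sketch could look like the following; the pin number and angle range are assumptions.

# Minimal sketch (assumed pin and angle range): tilt the camera with a servo
# connected to the Raspberry Pi's GPIO, using gpiozero.
from time import sleep
from gpiozero import AngularServo

tilt = AngularServo(18, min_angle=-90, max_angle=90)   # assumed GPIO 18

for angle in (-45, 0, 45):
    tilt.angle = angle   # point the camera down, level, then up
    sleep(1)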
If you want to watch the camera's live feed remotely, VNC is not up to the task; you can consider this solution instead. In that case, however, TensorFlow can no longer use the camera at the same time. There should be a way to share it, but I have not explored that.
That's about it. My code does not have many comments, so bear with it; if you have questions, feel free to leave a message and ask me.
Bonus: there is also a simple time-lapse photography program. I set it up in crontab to take a photo every minute, then stitch the day's photos into a video. I plan to bring it to the company next week, find a good spot, leave it there for a few days, and capture 24 hours of New York scenery.
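Below is a minimal sketch of such a script using the picamera library; the output directory and the crontab line are assumptions, and the actual script in my repository may differ. The day's photos can then be stitched into a video with a tool such as ffmpeg.

# Minimal sketch (assumed output directory): take one time-stamped photo per run.
# An assumed crontab entry to run it once a minute:
#   * * * * * python /home/pi/timelapse.py
from datetime import datetime
from picamera import PiCamera

OUTPUT_DIR = "/home/pi/timelapse"   # assumed directory; create it beforehand

camera = PiCamera()
camera.resolution = (1280, 720)
filename = datetime.now().strftime("%Y%m%d-%H%M%S.jpg")
camera.capture("{}/{}".format(OUTPUT_DIR, filename))
camera.close()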