"The project is to build a portable" speed trap ", using machine learning to identify the vehicle, Doppler radar sensor to measure the speed, and a cellular module to report data to the cloud.
The purpose of the device is not to identify individual vehicles or capture license plate numbers, but to collect data that can be used to plan urban traffic mitigation measures.
Camera:
To identify vehicles, I'm using the Pi Camera Module v2, thanks to its image quality, its compatibility with the RPi camera port, and its easy-to-use API.
Radar sensor:
I chose the OPS243-A Doppler radar sensor from OmniPreSense to measure the speed of a detected vehicle. It provides a serial interface and an API for accessing speed data, with readings of up to 150 miles/hour.
Display:
I'm using a simple seven-segment display to report inference results and the collected speed data.
Cellular data (and power):
When making a project portable, there are two main considerations:
How do you power it?
How do you get data off of it?
To solve the power problem, I used a 30,000 mAh USB-C battery pack.
To solve the data problem, I chose the Blues Wireless Notecard. The Notecard (and its companion Notecarrier-Pi HAT) provides not only cellular data access, but also integration with the Blues Wireless cloud service, Notehub. The data I collect can be securely relayed to my cloud application for reporting purposes.
Assembly:
Most of the components simply plug in:
The OPS243 module connects to the RPi over USB
The Pi Camera Module connects to the RPi's dedicated camera interface
The Notecarrier-Pi HAT connects via the provided 40-pin header
The only exception is the 7-segment display, which I wired to the pass-through headers on the Notecarrier-Pi HAT:
The display's power pin connects to a 3.3V supply pin, its ground pin to ground, and its SDA and SCL pins to GPIO 2 and 3 on the RPi.
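With the wiring in place, a quick sanity check is to scan the I2C bus from the RPi (this assumes the common i2c-tools package is installed); HT16K33-based displays typically show up at address 0x70:
sudo i2cdetect -y 1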
Machine learning with Edge Impulse:
Step 1 of creating an image-based ML model is ensuring that an appropriate selection of images is used to train the model.
Since I'm trying to identify vehicles traveling on a road, I started by downloading about 100 ordinary photos of cars, trucks, vans, and bikes on the road. I then spent an hour taking some "real world" photos that mimic what my Raspberry Pi would actually be processing.
The value of Edge Impulse:
Edge Impulse lets developers of all types take advantage of machine learning on low-power edge computing devices, without having to become ML experts.
For my purposes, I needed Edge Impulse to identify one or more objects in a given frame (note: I simplified the vehicle classes to "car" and "bike"). This is called "object detection" and is a very common ML use case.
To get started with Edge Impulse, I created a free account and was guided through creating my first project in the Edge Impulse Studio.
I created a new project, selected a project type of images, and chose to classify multiple objects (since any one frame may contain one or more vehicles):
Next, in the data acquisition tab, I uploaded all of my vehicle images. By default, Edge Impulse automatically splits these into training images and test images (allocating enough images to test your model is very important).
Finally, I navigated to the labeling queue and assigned a label to each object in each image I had uploaded. (See the police car image above.)
The finished model reached an accuracy of 87%.
Using the Edge Impulse model:
With the Edge Impulse ML model built, it was time to deploy the model to the Raspberry Pi.
Note: Edge Impulse is explicit that only the Raspberry Pi 4 is supported.
First, I installed Edge Impulse for Linux on the RPi with the following commands:
curl -sL https://deb.nodesource.com/setup_12.x | sudo bash -
sudo apt install -y gcc g++ make build-essential nodejs sox gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-base gstreamer1.0-plugins-base-apps
npm config set user root && sudo npm install edge-impulse-linux -g --unsafe-perm
To verify the installation, I ran the edge-impulse-linux command to log in to Edge Impulse and register my RPi as a device.
Next, I needed to install the Edge Impulse Linux Python SDK:
sudo apt-get install libatlas-base-dev libportaudio0 libportaudio2 libportaudiocpp0 portaudio19-dev
pip3 install edge_impulse_linux -i https://pypi.python.org/simple
With the Edge Impulse dependencies installed, I used this command to download my model file:
edge-impulse-linux-runner --download model.eim
The Python code is divided into five tasks:
Identifying a vehicle using the Edge Impulse ML model;
Measuring the speed of the identified vehicle;
Displaying the speed on the seven-segment display;
Sending the event to the cloud over cellular;
Creating a cloud-based dashboard for reporting.
1. Edge Impulse and Python
With the Edge Impulse Python SDK installed, using the ML model couldn't be easier. By borrowing from the example Python code provided by the Edge Impulse team, I was able to piece together a solution that consumes video frames and outputs inference results.
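Below is a minimal sketch of that loop, adapted from the Edge Impulse Linux SDK image examples; the model path, camera index, and print formatting here are assumptions for illustration, not the project's exact code:

import cv2
from edge_impulse_linux.image import ImageImpulseRunner

# "model.eim" is the model file downloaded via edge-impulse-linux-runner
with ImageImpulseRunner("model.eim") as runner:
    runner.init()
    camera = cv2.VideoCapture(0)  # PiCamera exposed as /dev/video0

    while True:
        ok, frame = camera.read()
        if not ok:
            break
        # the SDK expects RGB frames; OpenCV captures BGR
        img = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        features, cropped = runner.get_features_from_image(img)
        res = runner.classify(features)
        boxes = res["result"]["bounding_boxes"]
        timing = res["timing"]["dsp"] + res["timing"]["classification"]
        print("Found %d bounding boxes (%d ms.)" % (len(boxes), timing))
        for bb in boxes:
            print("    %s (%.2f)" % (bb["label"], bb["value"]))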
During testing, it was fascinating to watch the process run, because each inference completes in under 400 ms!
Found 0 bounding boxes (383 ms.)
Found 0 bounding boxes (385 ms.)
Found 1 bounding boxes (384 ms.)
When a match is found, the Edge Impulse API returns a variety of data elements, but the only ones I care about are the label and value, which tell me the type of vehicle (i.e. car vs bike) and the confidence % of the match.
Found 1 bounding boxes (377 ms.)
What is it? A car, with 64% confidence.
2. Checking the speed
At this point, my program can be reasonably sure it's looking at a car, so now it's time to check the speed reported by the OPS243 Doppler radar module. I reworked some example Python code from the OmniPreSense team to collect the speed readings the OPS243 module reports:
def ops_get_speed():
    """Capture a speed reading from the OPS module."""
    # ser is a serial connection to the OPS243, opened at startup
    while True:
        speed_available = False
        ops_rx_bytes = ser.readline()
        ops_rx_bytes_length = len(ops_rx_bytes)
        if ops_rx_bytes_length != 0:
            ops_rx_str = str(ops_rx_bytes)
            # lines containing '{' are JSON config output, not readings
            if ops_rx_str.find('{') == -1:
                try:
                    ops_rx_float = float(ops_rx_bytes)
                    speed_available = True
                except ValueError:
                    print("Unable to convert string to a number: " + ops_rx_str)
                    speed_available = False
        if speed_available:
            speed_rnd = round(ops_rx_float)
            return float(speed_rnd)
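The function above assumes an open pySerial connection named ser. A minimal sketch of that setup follows; the device path and baud rate here are assumptions, so check how the module enumerates on your system (and the OmniPreSense docs):

import serial

# OPS243 connected over USB; adjust port/baud to match your setup
ser = serial.Serial(port="/dev/ttyACM0", baudrate=9600, timeout=1)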
This code reports only the latest reading, which should correlate almost perfectly with the vehicle the ML model just identified.
To determine whether a vehicle is speeding, I wanted to be able to programmatically set the speed limit of the street I live on. For this, I used Notehub's environment variables feature, which syncs data from the server down to my Notecard.
This lets me adjust the speed limit using only a web browser and Notehub.io:
These variables can be set per device, per project, or per fleet.
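On the Raspberry Pi side, the variable can be read back with a raw Notecard request via note-python. A sketch, assuming the variable is named speed_limit and that nCard is the Notecard object initialized as shown later (the name and the 30 mph fallback are illustrative):

# fetch the "speed_limit" environment variable synced from Notehub
rsp = nCard.Transaction({"req": "env.get", "name": "speed_limit"})
speed_limit = float(rsp.get("text", "30"))  # fall back to 30 mph if unset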
3. Displaying the speed
By using a simple 7-segment display, I can at least verify that my program is running, since it shows the speed whenever a car passes by.
Using the HT16K33 library offered by Adafruit, it takes only a few lines of code to initialize the display:
# requires board, busio, and adafruit_ht16k33.segments
disp_i2c = busio.I2C(board.SCL, board.SDA)
display = segments.Seg7x4(disp_i2c)
Then, printing a value on the display simply means clearing it and filling in the new digits:
display.fill(0)
display.print(speed)
4. Sending events to the cloud over cellular
The accumulated data elements are packaged up for secure delivery to my cloud application over the cellular network.
The Blues Wireless Notecard makes this easier than ever. As a prepaid cellular module with global service and a simple "JSON-in, JSON-out" API, the Notecard makes cellular accessible to every developer.
For example, here is the Python used to send a speeding "event" from the Notecard to the Blues Wireless Notehub.io service:
rsp = note.add(nCard,
               file="speed.qo",
               body={
                   "timestamp": timestamp,
                   "confidence": confidence,
                   "lat": lat,
                   "lng": lon,
                   "speed": current_speed,
                   "speed_limit": speed_limit,
                   "is_speeding": is_speeding
               })
Notehub.io allows you to securely relay data to the cloud application of your choice. Whether you're invested in AWS, Azure, Google Cloud, or an IoT platform, Notehub.io provides integration options.
To initialize the Notecard, I sent a few commands to associate it with the project I created on Notehub.io:
import notecard
from notecard import hub, note
from periphery import I2C
import keys  # local module holding the Notehub ProductUID

# Initialize the Blues Wireless Notecard (blues.io)
productUID = keys.PRODUCT_UID
port = I2C("/dev/i2c-1")
nCard = notecard.OpenI2C(port, 0, 0)

# Associate the Notecard with a project on Notehub.io
rsp = hub.set(nCard,
              product=productUID,
              mode="periodic",
              outbound=10,
              inbound=10)
Then, whenever I want to send an event (called a "Note" in the Blues Wireless world) to Notehub.io, I use the fluent API provided by the note-python library to issue the note.add command shown above.
5. Cloud-based reporting
With data from my Raspberry Pi sent to Notehub.io over cellular, the next step is to securely sync that data with my cloud app.
For this project, I chose Ubidots because it's a great platform for visualizing IoT data in a point-and-click manner.
Following the Blues Wireless Ubidots routing tutorial, I configured a Notehub.io route to send data to Ubidots.
Note: Blues Wireless also provides routing tutorials for many other cloud and IoT platforms.
In Ubidots, I used their simple UI to add a number of widgets to a single dashboard report.
Testing:
About 50% of cars were speeding (my definition of "speeding" is >= 5 mph over the limit).
The maximum recorded speed came in a 30 mph zone. Yikes.
The average car speed was 35.45 mph.