
Robust Actuating Case:


This project aims to increase the safety of large industrial garbage trucks when reversing, specifically to avoid striking people and objects. The system must be highly reliable and robust given the harsh environments the vehicle operates in. To achieve this, a multifaceted sensing system is mounted on the back of the truck, consisting of 10 LiDAR sensors, 3 depth cameras, 1 visual (infrared) camera, a light sensor, an IMU, and a GPS sensor, all controlled by 3 onboard microcontrollers and 4 onboard mini compute units. All of this needed to be housed inside a protective unit that is entirely weatherproof and shields the sensitive sensors from the elements when the truck is not backing up.


The solution is what you see above: an all-metal, actuating, compact case with over 100 parts. Over the course of 15 major revisions, including changing electronics, modifying sensor placement, optimizing manufacturability, and much more, the model I had designed was finally manufactured in metal (images shown below). One of the largest challenges with this case was ensuring that it could be economically mass-manufactured, so I designed 6 parts that could be sheet-metal cut/bent and then welded together to yield the majority of the case structure. This proved extremely effective, cutting manufacturing costs by 56% and halving the lead time compared to the initial design, which was forced to use CNC milling. The case itself was also very reliable and incredibly strong. Additionally, my gasket and geometry solutions proved effective: during a water spray test the case kept the electronics inside dry.

Inner Case

(3/6 Metal Parts)


Human Machine Interface (HMI):


The HMI is mounted on the dashboard of the truck and acts as the primary medium of communication to the driver regarding information from the main unit mounted at the back of the truck. The HMI consists of a custom 2-layer PCB with two microcontrollers: one handles wireless communication with the main unit, and the other controls the 4 push buttons, a speaker, and a 7” display. From mechanically modelling the plastic casing, to designing the PCB, to writing the firmware for the two boards, I was involved in all aspects of its development. The control flow diagram below visually demonstrates the communication between the microcontrollers.

Com Protocol.png
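As a rough illustration of the kind of routing the control flow diagram describes, here is a minimal Python stand-in for the UI microcontroller's dispatch logic. The real firmware is C++, and the message names and payloads here are hypothetical:

```python
# Python stand-in for the UI microcontroller's message routing (the real
# firmware is C++; message names and payloads here are hypothetical).

def handle_message(msg_type, payload):
    """Route a message from the comms microcontroller to an output device."""
    if msg_type == "OBSTACLE_ALERT":
        # payload: distance to obstacle in centimetres
        return ("speaker", f"beep for obstacle at {payload} cm")
    if msg_type == "STATUS":
        return ("display", f"system status: {payload}")
    if msg_type == "BUTTON_ACK":
        return ("display", f"button {payload} acknowledged")
    return ("display", f"unknown message: {msg_type}")

# Example: an obstacle alert is routed to the speaker.
device, action = handle_message("OBSTACLE_ALERT", 150)
```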


After lots of experimenting with different wireless technologies and protocols, I settled on a custom long-range (LoRa) communication protocol I developed, implementing it along with custom-designed checksum bytes to ensure data transmission reliability across 6 microcontrollers.
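The framing idea can be sketched as follows. This is a minimal illustration of length-plus-checksum framing, not the actual byte layout or checksum used in the project:

```python
# Minimal sketch of length-plus-checksum packet framing (illustrative only;
# the project's actual byte layout and checksum differ).

def checksum(data: bytes) -> int:
    """Simple additive checksum truncated to one byte."""
    return sum(data) & 0xFF

def frame(payload: bytes) -> bytes:
    """Prepend a length byte and append a checksum byte."""
    body = bytes([len(payload)]) + payload
    return body + bytes([checksum(body)])

def unframe(packet: bytes):
    """Validate and strip framing; return the payload, or None on corruption."""
    body, check = packet[:-1], packet[-1]
    if checksum(body) != check or body[0] != len(body) - 1:
        return None
    return body[1:]

pkt = frame(b"\x01\x64")  # e.g. message type 1, value 100
```

On receive, any packet whose checksum or length byte fails to validate is simply discarded, which is what makes the link tolerant of corrupted transmissions.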


Additionally, I designed the PCB for the two microcontrollers within the HMI casing and modelled the case around it, as seen below.

HMI Case Assembly.PNG

LiDAR Data Visualizer

Out of the 3 central systems in the unit (LiDAR, TOF, Visual), I focused primarily on developing algorithms, firmware, and software for the LiDAR array.

Our LiDAR subsystem is a unique implementation of a 2D LiDAR. It uses 10 individual 1D LiDAR sensors arranged in an arc to gain a 2D understanding of the world. This was done to achieve a higher degree of robustness, as there are no moving parts. One disadvantage is decreased resolution. This problem became especially pertinent when I began working on a false-positive issue the team was having with dust. Since these trucks operate in very dirty, dusty environments, this was a crucial flaw. The reduced point cloud density meant I could not use standard techniques as I did in my LiDAR filtering project. Instead, I collected lots of data in clean as well as dusty environments in an attempt to discern patterns that could aid in filtering out dust.
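The arc arrangement can be illustrated with a short sketch that maps each fixed sensor's range reading to a 2D point. The even spacing and 90° field of view here are assumptions for illustration, not the real geometry:

```python
import math

# Sketch of the 1D-to-2D mapping: 10 fixed range readings along an arc
# become 2D points. The 90-degree field of view and even angular spacing
# are illustrative assumptions, not the unit's real geometry.

NUM_SENSORS = 10
FOV_DEG = 90.0  # assumed total arc span

def to_points(distances):
    """Map each sensor's range reading to an (x, y) point behind the truck."""
    step = FOV_DEG / (NUM_SENSORS - 1)
    points = []
    for i, d in enumerate(distances):
        # Angles spread evenly across the arc, centred on straight back (+y).
        angle = math.radians(-FOV_DEG / 2 + i * step)
        points.append((d * math.sin(angle), d * math.cos(angle)))
    return points

pts = to_points([2.0] * NUM_SENSORS)  # every sensor reports 2 m
```

Because the sensors are fixed, the angle of each point is known at design time; only the range varies, which is what keeps the arrangement free of moving parts.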

Distance (1).png

This is one of several experiments I ran gathering data on distance and light intensity (flux) under varying conditions. I used only one LiDAR, treating it as a representative sample of the population, as this was numerically the easiest way to look at the data. Based on that data, I, along with my supervisors, developed 5 algorithms that work in parallel to ignore false positives from dust whilst still picking up a person or other obstacle. I then implemented these algorithms by writing firmware (in C++) for the LiDAR PCB. The results were promising, but to go further, seeing real-time data from all of the LiDARs was critical. Thus I began working in Python to create a LiDAR visualizer. I started with a serial interceptor that would interpret the data being sent over the wire from the microcontroller and draw it out, as shown below:

As you can see, each LiDAR sensor's reported distance is shown correctly for that LiDAR, and the thickness of the line represents the intensity of the light reflected back.
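The parsing step of such a serial interceptor might look like the following sketch. The real wire format isn't reproduced on this page, so a simple comma-separated line per reading is assumed:

```python
# Sketch of the parsing step of a serial interceptor. The real wire format
# isn't reproduced here, so a simple comma-separated line per reading is
# assumed: "<sensor_id>,<distance_mm>,<flux>".

def parse_reading(line):
    """Parse one serial line into (sensor_id, distance_mm, flux)."""
    sensor_id, distance, flux = line.strip().split(",")
    return int(sensor_id), int(distance), int(flux)

def line_thickness(flux, max_flux=1000, max_px=8):
    """Map reflected intensity to a drawing thickness, as the visualizer
    does when it thickens each sensor's line by returned flux."""
    return max(1, round(max_px * min(flux, max_flux) / max_flux))

reading = parse_reading("3,1250,870\n")
```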

 

I then created a wireless MQTT-based version so that the unit could be entirely sealed up for testing and the laptop did not have to be tethered to it. Additionally, I created functionality to record the visualization in a custom file format and later play it back in real time!
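The record-and-playback idea can be sketched as follows. The actual custom file format isn't documented here, so this sketch assumes one JSON object per timestamped frame:

```python
import json
import tempfile

# Sketch of the record/playback idea. The project's actual custom file
# format isn't documented here, so one JSON object per timestamped frame
# is assumed.

def record(frames, path):
    """Save (timestamp, readings) frames, one JSON object per line."""
    with open(path, "w") as f:
        for t, readings in frames:
            f.write(json.dumps({"t": t, "readings": readings}) + "\n")

def playback(path, sleep):
    """Yield recorded frames, sleeping between them to recreate the
    original pacing (pass time.sleep for real-time playback)."""
    last_t = None
    with open(path) as f:
        for line in f:
            frame = json.loads(line)
            if last_t is not None:
                sleep(frame["t"] - last_t)
            last_t = frame["t"]
            yield frame["readings"]

# Round-trip demo: record two frames, then replay them instantly.
path = tempfile.mkstemp(suffix=".rec")[1]
record([(0.0, [1000] * 10), (0.1, [990] * 10)], path)
replayed = list(playback(path, sleep=lambda s: None))
```

Storing the timestamp with each frame is what makes real-time playback possible: the player sleeps for the recorded gap between frames instead of replaying them as fast as it can read.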

 

This software allowed many people on our team to easily help tune the parameters and constants in my filters. Here is a video of us using it in action early in the tuning process:


Person Detection Testing

Dust Filtering Testing
