HI, I'M MAHIR!

About Me

I'm a second-year Mechatronics Engineering student at the University of Waterloo. Motivated by curiosity, I started experimenting with electronics and robotics at a young age, and from there grew a strong interest in all things embedded software and machine learning. My name translates to 'skillful', a word I try to embody every day by always striving to learn and grow. I'm currently on co-op as an Embedded Software Developer at Christie Digital Systems. Outside of work I enjoy squash, chess, and speedcubing. I love working on new projects, and you can find out all about them below!


Projects

Gesture Controlled Robotic Arm

What started as an ordinary 6 DOF robotic arm turned into my dive into computer vision and the amazing world of image recognition with OpenCV. I used Google's MediaPipe library to place 21 landmarks onto my hand, each with its own coordinates relative to the frame. These were stored in an array and used to detect when specific finger or palm positions changed.
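The landmark check above can be sketched in a few lines. This is an illustrative example, not the project's actual code: it assumes MediaPipe's standard hand-landmark indices (fingertips at 8, 12, 16, 20 and their PIP joints at 6, 10, 14, 18) and the fact that image y-coordinates grow downward, so a raised fingertip sits above its PIP joint.

```python
# Hypothetical sketch: classify which fingers are raised from the 21 hand
# landmarks. Indices follow MediaPipe's hand model; image y grows downward,
# so a raised fingertip has a smaller y than its PIP joint.

FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky tips
FINGER_PIPS = [6, 10, 14, 18]   # corresponding PIP joints

def fingers_up(landmarks):
    """landmarks: list of 21 (x, y) tuples normalised to the frame."""
    return [landmarks[tip][1] < landmarks[pip][1]
            for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)]

# Example: a synthetic hand with only the index finger raised.
hand = [(0.5, 0.5)] * 21
hand[8] = (0.5, 0.2)   # index tip above its PIP joint
hand[6] = (0.5, 0.4)
print(fingers_up(hand))  # → [True, False, False, False]
```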

I mapped these changes to commands tied to various hand gestures. The PySerial library was then used to send the commands from the Python program to the Arduino over serial, through an HC-06 Bluetooth module. After the hand was calibrated by being held up for 30 frames, the C++ program on the Arduino routinely polled the serial port for queued commands and executed them accordingly.
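The command path can be sketched as a tiny byte protocol. The gesture names and command bytes here are assumptions for illustration; on the real arm the Python side would push each byte through `serial.Serial.write()` and the Arduino would pop them off its serial buffer.

```python
# Illustrative sketch of the gesture-to-command protocol (names and byte
# values are assumptions, not the project's real mapping).

GESTURE_COMMANDS = {
    "open_palm":   b"S",  # stop all servos
    "fist":        b"G",  # close the gripper
    "point_left":  b"L",  # rotate base left
    "point_right": b"R",  # rotate base right
}

def encode(gesture):
    """One byte per command, as written to the serial port."""
    return GESTURE_COMMANDS[gesture]

def poll(buffer):
    """Mimic the Arduino loop: consume one queued command per poll."""
    return buffer.pop(0) if buffer else None

queue = [encode("fist"), encode("point_left")]
print(poll(queue), poll(queue), poll(queue))  # → b'G' b'L' None
```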

The six servo motors were wired through a breadboard to two external power supplies that could handle their stall current. The grounds of the motors, the supplies, and the Arduino were soldered together to share a common reference. All of the wiring was then neatly hidden inside an acrylic base, cut out using a bandsaw and drilled together.

Eye Controlled Trolley

This was the project our team of four developed at Hack the North 2023. I programmed an ESP32 to drive four DC motors through L298N H-bridges, enabling speed control from a phone or laptop over WiFi. Data from an MPU6050 gyroscope was processed to track the trolley's heading relative to its desired direction, allowing turns to be performed autonomously after a move command was received. All of the components were then wired and soldered together before being attached to the underside of a 3D-printed chassis.
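The heading logic can be sketched like this (the real version ran in C++ on the ESP32; the tolerance and the left-is-positive sign convention are assumptions): wrap the error between current and desired yaw into [-180, 180) and turn until it is small.

```python
# Hedged sketch of the autonomous-turning logic: compute the signed heading
# error from the yaw integrated off the MPU6050, then pick a turn direction.

def heading_error(current_deg, target_deg):
    """Signed shortest-path error, wrapped to the range [-180, 180)."""
    return (target_deg - current_deg + 180.0) % 360.0 - 180.0

def turn_command(current_deg, target_deg, tolerance=3.0):
    err = heading_error(current_deg, target_deg)
    if abs(err) <= tolerance:
        return "forward"          # facing the right way: drive on
    return "left" if err > 0 else "right"   # sign convention assumed

# Crossing the 0°/360° boundary still gives the short way round:
print(turn_command(350.0, 10.0))  # → left (20° counter-clockwise)
```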

We also interfaced with AdHawk eye-tracking glasses through their Python API to detect blinks and monitor gaze coordinates with 80% accuracy, letting us calculate the distance between points around the user. A double blink locked in a point, and two double blinks in a row signalled that the user wanted the trolley to traverse the distance between them. Lastly, sockets connected the Python program to the ESP32 so commands could be sent reliably.
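The blink scheme above is essentially a small state machine. This sketch is illustrative (the 0.4 s pairing window is an assumption): blinks arrive as timestamps, two within the window count as a double blink, and the second double blink fires the "go" command.

```python
# Sketch of the double-blink detector: pair blinks that land close together,
# lock a point per pair, and trigger traversal on the second pair.

DOUBLE_BLINK_WINDOW = 0.4   # seconds between blinks of one double-blink (assumed)

class BlinkDetector:
    def __init__(self):
        self.last_blink = None
        self.points_locked = 0

    def on_blink(self, t):
        """Returns 'lock', 'go', or None for each blink timestamp."""
        if self.last_blink is not None and t - self.last_blink <= DOUBLE_BLINK_WINDOW:
            self.last_blink = None          # consume the pair
            self.points_locked += 1
            if self.points_locked == 2:     # second double blink: traverse
                self.points_locked = 0
                return "go"
            return "lock"
        self.last_blink = t
        return None

d = BlinkDetector()
events = [d.on_blink(t) for t in (0.0, 0.2, 1.0, 1.3)]
print(events)  # → [None, 'lock', None, 'go']
```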

Brick Scanner

Tasked with building a measuring system to accurately display the dimensions of LEGO bricks, we set out to create a computer vision based solution with sub-millimetre accuracy. The final architecture consisted of an STM32 Nucleo F401RE controlling the peripherals while communicating with a Raspberry Pi 4 running the CV program. I developed the firmware drivers the STM32 used to operate two NEMA 17 stepper motors and an LCD display.

The stepper motor driver was built around OOP concepts so that multiple motor objects could be declared and controlled without duplicated code. An onboard timer set the step rate, and the microstepping pins adjusted the resolution to suit our fine-tuned application. The communication line to the Raspberry Pi was set up over UART, with an ISR to handle incoming data. The STM32 HAL was used throughout to abstract the low-level peripheral access.
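The OOP structure of the driver can be sketched in Python (the real driver is C++ on the STM32, and the class and field names here are made up for illustration): each axis is an independent object, and the timer period per microstep falls out of the target speed and the microstepping divisor.

```python
# Pure-Python sketch of the stepper driver's OOP pattern: multiple motor
# objects, each with its own microstepping resolution and step timing.

MICROSTEP_DIVISORS = {1, 2, 4, 8, 16}   # settable via the driver's MS pins
FULL_STEPS_PER_REV = 200                # a NEMA 17's 1.8° step angle

class Stepper:
    def __init__(self, name, microstep=16):
        assert microstep in MICROSTEP_DIVISORS
        self.name = name
        self.divisor = microstep
        self.position = 0               # in microsteps

    def steps_per_rev(self):
        return FULL_STEPS_PER_REV * self.divisor

    def step_interval_us(self, rpm):
        """Timer period per microstep for a target speed, as the onboard
        timer would be programmed."""
        steps_per_sec = self.steps_per_rev() * rpm / 60.0
        return 1_000_000.0 / steps_per_sec

    def move(self, microsteps):
        self.position += microsteps

# Two independent motor objects, one per axis, with no duplicated code:
x_axis = Stepper("x", microstep=16)
z_axis = Stepper("z", microstep=8)
print(x_axis.step_interval_us(60))  # 60 RPM at 1/16 microstepping → 312.5 µs
```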

Guitar Playing Robot

This was the final project for our first-year digital computation course. The robot plays a string on a guitar by reading notes from a coloured strip. The strip itself is generated by a C++ program that converts a MIDI file into colour-coded squares. It is then fed into a gear mechanism that rolls it through, passing it under various sensors for input detection.
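The note-to-square conversion can be sketched like this (the real converter was in C++, and the open-string note, palette, and playable range here are all assumptions): each MIDI note on one string maps to a fret, and each fret to a colour.

```python
# Illustrative sketch of the MIDI-to-colour-strip mapping for one string.

OPEN_STRING_NOTE = 64          # MIDI 64 = E4, assumed open note of the string
FRET_COLOURS = ["red", "green", "blue", "yellow", "purple"]  # frets 1-5

def note_to_colour(midi_note):
    if midi_note is None:
        return "white"                     # rest: blank square
    fret = midi_note - OPEN_STRING_NOTE    # frets 0-5 on this string
    if not 0 <= fret <= len(FRET_COLOURS):
        raise ValueError("note not playable on this string")
    return "black" if fret == 0 else FRET_COLOURS[fret - 1]

print([note_to_colour(n) for n in (64, 65, 69, None)])
# → ['black', 'red', 'purple', 'white']
```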

I wrote functions in C to take input from colour and ultrasonic sensors to accurately track task completion. Output from the onboard motor encoders was processed to play songs spanning up to five frets. The fret-playing mechanism was a cam shaft that rotated and pressed down on the string in different spots depending on its angle of rotation. I wrote and optimised nine functions to work in sync so that there was no noticeable delay or loss of clarity between notes.
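The cam mechanism boils down to a fret-to-angle lookup. The even 60° spacing here is a made-up placeholder, not the robot's real geometry, but it shows the shape of the mapping the encoder output has to hit.

```python
# Hedged sketch: the cam shaft presses the string at different spots
# depending on its rotation; assume frets sit at evenly spaced angles.

CAM_STEP_DEG = 60.0   # hypothetical angular spacing between fret positions

def fret_angle(fret):
    """Rotation angle (degrees) that presses the given fret; 0 = open string."""
    if not 0 <= fret <= 5:
        raise ValueError("mechanism supports frets 0-5")
    return fret * CAM_STEP_DEG

print([fret_angle(f) for f in range(6)])
# → [0.0, 60.0, 120.0, 180.0, 240.0, 300.0]
```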

Movie Reviews Discord Bot

I implemented text vectorisation to perform sentiment analysis of movie reviews with 98% accuracy. I designed, trained, and tested the neural network for NLP with TensorFlow, using a dataset of 50K IMDb reviews. The data was pre-processed with Pandas and prepared by vectorising it and creating an input pipeline with separate training, validation, and testing sets. I graphed the model's loss and accuracy over time with Matplotlib to detect any discrepancies.
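The vectorisation step can be shown with a pure-Python sketch (the project itself used TensorFlow's layers; the token ids and the fixed length of 6 here are just for illustration): build a vocabulary from the training text, then turn each review into a fixed-length sequence of token ids, with 0 for padding and 1 for out-of-vocabulary words.

```python
# Pure-Python sketch of the text-vectorisation idea used on the IMDb reviews.

def build_vocab(texts):
    vocab = {"": 0, "[UNK]": 1}          # padding and out-of-vocabulary ids
    for text in texts:
        for token in text.lower().split():
            vocab.setdefault(token, len(vocab))
    return vocab

def vectorise(text, vocab, length=6):
    ids = [vocab.get(tok, 1) for tok in text.lower().split()]
    ids = ids[:length]
    return ids + [0] * (length - len(ids))   # pad to a fixed length

vocab = build_vocab(["a great movie", "a terrible movie"])
print(vectorise("a truly great movie", vocab))  # → [2, 1, 3, 4, 0, 0]
```

The unseen word "truly" maps to the `[UNK]` id, and the short review is right-padded with zeros so every input to the network has the same shape.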

I created a Python web scraper using BeautifulSoup4 to search a query on Google and extract text from the top ten results. I then developed a bot with the Python Discord API that takes in a movie name from the user and runs the web scraper on it. The text of the returned articles is passed into the model to predict whether the reviews are positive or negative, and the verdict is given back to the user in the chat interface.
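The extraction step can be sketched with only the standard library as a stand-in for BeautifulSoup (the page markup here is made up): pull the paragraph text out of a fetched review page so it can be fed to the model.

```python
# Minimal stdlib stand-in for the BeautifulSoup step: collect the text of
# every <p> element and ignore everything else on the page.

from html.parser import HTMLParser

class ParagraphExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_p = False
        self.paragraphs = []

    def handle_starttag(self, tag, attrs):
        if tag == "p":
            self.in_p = True
            self.paragraphs.append("")

    def handle_endtag(self, tag):
        if tag == "p":
            self.in_p = False

    def handle_data(self, data):
        if self.in_p:
            self.paragraphs[-1] += data

page = "<html><body><p>A stunning film.</p><div>nav</div><p>Loved it!</p></body></html>"
extractor = ParagraphExtractor()
extractor.feed(page)
print(extractor.paragraphs)  # → ['A stunning film.', 'Loved it!']
```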

Wheel Tracking Algorithm

This was the result of the two Toyota Innovation Challenges held at the university. The first was to program a solution for taking images of cars on a conveyor belt at the right moment, for quality control. The aim was to detect the wheels with OpenCV computer vision instead of the existing standard of mechanical triggers, improving accuracy.

We developed a working program in under 12 hours using OpenCV building blocks such as the Hough transform and contour detection functions, collaborating in a Jupyter Notebook as a group of four. Both wheels are detected, and the front one is bounded with a red square. When the front wheel passes a certain point (marked by the vertical red line), an image is taken and saved. The code was tested on a scaled-down model of the factory system.
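The trigger logic can be sketched separately from the detection itself (the per-frame wheel position would come from OpenCV's circle or contour detection; the pixel threshold here is an assumption): capture the frame on which the front wheel's centre first crosses the vertical trigger line.

```python
# Sketch of the capture trigger: given the detected centre x of the front
# wheel on each frame, fire on the first frame past the trigger line.

TRIGGER_X = 400   # hypothetical x position of the red line, in pixels

def capture_frame(front_wheel_xs, trigger_x=TRIGGER_X):
    """front_wheel_xs: front wheel centre x per frame, left to right."""
    for frame, x in enumerate(front_wheel_xs):
        if x >= trigger_x:
            return frame      # take and save the image on this frame
    return None               # wheel never reached the line

xs = [120, 210, 305, 398, 412, 430]   # wheel moving along the belt
print(capture_frame(xs))  # → 4
```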

The second challenge was to detect the holes on vehicle bodies that TMMC covers with stickers to prevent water leaks and wind noise. This is usually done with a robotic arm equipped with a camera, and the goal was for our program to detect whether a sticker had been properly applied. I was the only one able to get the depth camera program running inside WSL Ubuntu, and our program worked as intended, winning a Co-op's Choice Award.

Experience

  • April 2024 - Present

    Vehicle Platform Director

    WATonomous

  • January 2024 - April 2024

    Embedded Software Developer

    Christie Digital Systems

  • May 2023 - August 2023

    Firmware Developer

    onsemi

  • June 2023 - Present

    Firmware Team Lead

    Waterloop

  • September 2022 - April 2023

    Embedded Flight Software Member

    Waterloo Aerial Robotics Group