My Projects
Intelligent Drone Control Using Speech Recognition and Edge Computing
My senior design project focuses on developing an intelligent drone system that understands and responds to voice commands using speech recognition and edge computing. The goal is a drone that executes actions such as takeoff, hover, and land from verbal input alone, demonstrating real-time AI integration and autonomous control.
My role in this project extends beyond technical integration. I serve as both a systems engineer and project manager, overseeing the Jetson Nano–Pixhawk integration, developing the speech recognition interface with Vosk, and establishing MAVSDK communication with the drone. I also take responsibility for the administrative and organizational aspects of the project, managing budgeting, scheduling, report submissions, and presentation materials. By handling these logistics, I allow my teammates to focus fully on the drone’s physical assembly and testing.
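The core pipeline is simple in concept: Vosk transcribes microphone audio on the Jetson Nano, and recognized keywords are mapped to MAVSDK flight actions sent to the Pixhawk. The sketch below illustrates that idea only; the model directory, serial address, and command vocabulary are placeholders rather than the project's final configuration.

    import asyncio, json, queue
    import sounddevice as sd
    from vosk import Model, KaldiRecognizer
    from mavsdk import System

    audio_q = queue.Queue()

    def mic_callback(indata, frames, time, status):
        # Push raw 16-bit PCM microphone audio into a thread-safe queue.
        audio_q.put(bytes(indata))

    async def run():
        # Connect to the flight controller; the serial address is a placeholder.
        drone = System()
        await drone.connect(system_address="serial:///dev/ttyTHS1:57600")

        # Offline Vosk model (directory "model") with a 16 kHz recognizer.
        recognizer = KaldiRecognizer(Model("model"), 16000)

        with sd.RawInputStream(samplerate=16000, blocksize=8000,
                               dtype="int16", channels=1, callback=mic_callback):
            loop = asyncio.get_running_loop()
            while True:
                data = await loop.run_in_executor(None, audio_q.get)
                if not recognizer.AcceptWaveform(data):
                    continue
                text = json.loads(recognizer.Result()).get("text", "")

                # Map recognized keywords to MAVSDK flight actions.
                if "take off" in text or "takeoff" in text:
                    await drone.action.arm()
                    await drone.action.takeoff()
                elif "hover" in text:
                    await drone.action.hold()   # Hold (loiter) in place
                elif "land" in text:
                    await drone.action.land()

    asyncio.run(run())

Running Vosk locally on the Jetson Nano keeps recognition latency low and removes any dependency on a network connection, which is the main reason the processing happens at the edge.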
Throughout the project, I have coordinated team meetings, prepared technical documentation and Overleaf reports, and ensured our progress aligned with academic deadlines and hardware milestones. This experience has strengthened my abilities in project management, systems integration, and team coordination, while deepening my understanding of how AI and embedded systems can work together to create innovative engineering solutions.
By the project’s completion in December 2025, our goal is to present a fully functional prototype capable of following real-time voice commands through on-board edge processing, demonstrating how intelligent control systems can transform the future of autonomous robotics.
Skills: Project Management; Systems Integration; Speech Recognition; Embedded Systems; MAVSDK; Jetson Nano; Leadership; Budgeting; Presentation Design; Collaboration
Members: Rayhan Zaman, Cole Viking and Dr. Mohsen Saffari
Senior Design Project | Purdue University Northwest
August 2025 – December 2025
AI-Based Cyberattack Detection for Autonomous UGV Systems
This project focused on developing an AI-driven intrusion detection system (IDS) for the Clearpath Jackal Unmanned Ground Vehicle (UGV) to recognize and respond to cyberattacks in real time. The goal was to strengthen the vehicle’s cybersecurity and ensure safe, reliable operation in autonomous environments where traditional defense systems may fail.
My role centered on designing and testing the machine-learning-based detection framework used to identify anomalies caused by attacks such as Man-in-the-Middle (MitM) and command injection. I trained and validated models on sensor data collected from the UGV’s LiDAR, odometry, and depth camera, applying supervised and unsupervised learning techniques to distinguish normal operation from compromised behavior.
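As a rough illustration of the unsupervised side of that work, an isolation-forest-style detector can be fit on features extracted from clean driving runs and then used to flag samples that deviate from the learned baseline. The file names and feature columns below are hypothetical, not the project's actual dataset.

    import pandas as pd
    from sklearn.ensemble import IsolationForest
    from sklearn.preprocessing import StandardScaler

    # Hypothetical feature tables: rows are time windows, columns are statistics
    # such as odometry velocity, LiDAR range variance, and command-message rate.
    normal = pd.read_csv("normal_runs.csv")   # recorded clean operation
    live = pd.read_csv("test_run.csv")        # run that may contain an attack

    scaler = StandardScaler().fit(normal)
    detector = IsolationForest(contamination=0.01, random_state=0)
    detector.fit(scaler.transform(normal))

    # predict() returns -1 for windows the forest treats as anomalous
    # (candidate MitM or command-injection activity) and 1 for normal ones.
    flags = detector.predict(scaler.transform(live))
    print(f"{(flags == -1).sum()} of {len(flags)} windows flagged as anomalous")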
Beyond the AI development, I was also responsible for simulation and system validation, ensuring the UGV responded appropriately when threats were detected. I led the documentation and report preparation, coordinated testing environments, and worked closely with my teammates to interpret results and refine detection accuracy.
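A minimal sketch of that response path, assuming the detector publishes a Bool alert on a hypothetical /ids/attack_detected topic and the Jackal accepts Twist velocity commands on /cmd_vel (both are assumptions about the wiring, not the project's exact interfaces):

    import rclpy
    from rclpy.node import Node
    from std_msgs.msg import Bool
    from geometry_msgs.msg import Twist

    class SafeStop(Node):
        def __init__(self):
            super().__init__("ids_safe_stop")
            # The Jackal base listens for Twist velocity commands.
            self.cmd_pub = self.create_publisher(Twist, "/cmd_vel", 10)
            self.create_subscription(Bool, "/ids/attack_detected", self.on_alert, 10)

        def on_alert(self, msg: Bool):
            if msg.data:
                # Publish zero velocity so the UGV halts while the alert is handled.
                self.get_logger().warn("Intrusion detected - commanding stop")
                self.cmd_pub.publish(Twist())

    def main():
        rclpy.init()
        rclpy.spin(SafeStop())
        rclpy.shutdown()

    if __name__ == "__main__":
        main()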
This project deepened my technical experience in cyber-physical systems, ROS 2 Humble, and anomaly detection while improving my ability to translate complex data into actionable engineering insights. The successful outcome demonstrated how machine learning can enhance robotic security by detecting cyber threats before they cause system failures.
Skills: Machine Learning; Cybersecurity; ROS 2 Humble; Python; Data Analysis; Anomaly Detection; System Testing; Technical Documentation; Team Collaboration
Members: Christopher Fieramosca, Luke Lope, Pernando Moreno, Graduate Assistant - Lokesh Gujja and Dr. Khair Al Shamaileh
Wireless Communication Project | Purdue University Northwest
January 2025 – May 2025