I Build Systems That See, Think,
and Drive Themselves
I design and engineer autonomous systems across robotics, ADAS, and AI — built for real-world deployment, not simulations.
- ▸ Bosch Future Mobility Challenge Finalist — Top 22 Global
- ▸ ADAS Systems — Lane Detection · Drowsiness Detection · Navigation
- ▸ Robotics + Embedded — STM32, Raspberry Pi, ROS2
- ▸ AI Systems — YOLO, CRNN, TensorFlow, XGBoost
What I Build
Autonomous Systems
→ Robots with navigation, control, and decision-making
Full-stack autonomy: perception → planning → execution. Hardware-grounded.
Computer Vision Systems
→ Real-time perception using YOLO, CRNN
Object detection, classification, and tracking pipelines deployed at the edge.
Embedded Intelligence
→ STM32, Raspberry Pi, sensor integration
AI inference on constrained hardware. Sensor fusion, PWM, motor control.
AI Systems
→ Deep learning pipelines, classification, detection
From dataset to deployment — TensorFlow, YOLO, RAG, XGBoost.
Systems Delivered in Real Environments
Retrieval-Augmented Generation pipeline for structured knowledge access from enterprise data.
Enables efficient querying and improves AI response relevance across internal knowledge bases.
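The retrieval step at the heart of a pipeline like this can be sketched with the standard library alone. This is an illustrative toy, not the production system: a bag-of-words cosine similarity stands in for a real embedding model, and the function names and sample documents are hypothetical.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding' standing in for a real vector model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by similarity to the query; the top-k feed the LLM prompt."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

docs = [
    "lane detection pipeline for the ADAS stack",
    "quarterly sales report for internal review",
    "ROS2 navigation notes for the differential drive robot",
]
print(retrieve("ADAS lane detection", docs, k=1))
```

In a real deployment the embedding and ranking are handled by a vector store; the structure (embed query, score corpus, take top-k, augment the prompt) is the same.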
Full ADAS stack: lane detection, parking assistance, overtaking logic — on a miniature autonomous vehicle.
Selected among top 22 global teams in Bosch's international autonomous driving challenge.
ROS2 communication system with differential drive navigation and Gazebo simulation using URDF.
Enables real-time robotic control and a simulation-to-hardware deployment pipeline.
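Differential drive navigation ultimately reduces a body velocity command (the kind carried by a ROS2 `Twist` message) to two wheel speeds. A minimal sketch of that kinematics, with hypothetical wheel radius and track width values:

```python
def wheel_speeds(v: float, omega: float,
                 wheel_radius: float, track_width: float) -> tuple[float, float]:
    """Unicycle-to-differential-drive kinematics.

    v: forward body velocity (m/s), omega: yaw rate (rad/s).
    Returns (left, right) wheel angular speeds in rad/s.
    """
    v_left = v - omega * track_width / 2.0   # inner wheel slows in a turn
    v_right = v + omega * track_width / 2.0  # outer wheel speeds up
    return v_left / wheel_radius, v_right / wheel_radius

# Straight line: both wheels spin at the same rate.
print(wheel_speeds(0.5, 0.0, wheel_radius=0.05, track_width=0.3))
# Turn in place: wheels spin at equal and opposite rates.
print(wheel_speeds(0.0, 1.0, wheel_radius=0.05, track_width=0.3))
```

This is the same mapping a ros2_control differential-drive controller applies internally before the commands reach the motor driver.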
Computer vision classification model + Grafana sensor dashboard + Vue3 push notification system.
Automated plant monitoring with live data visualization.
Engineering Stack
- YOLOv8 / v11
- TensorFlow
- CRNN
- Scikit-learn
- XGBoost
- RAG Pipelines
- ROS2
- Gazebo
- URDF
- OpenCV
- Grafana
- Vue 3
- STM32
- Raspberry Pi 5
- Pixhawk
- Ultrasonic Sensors
- L298N Motor Driver
- Python
- C
- JavaScript
- TypeScript
- URDF / XML
Projects Built
ZUNO
Real-time navigation + obstacle avoidance
Driver fatigue detection → safety alert pipeline
Data-driven agriculture with multilingual support
300K+ images — 1,000 species, scalable plant monitoring
25K+ samples, 50 tags — genre and mood detection
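Driver fatigue pipelines like the one above commonly gate their safety alerts on the eye aspect ratio (EAR). A minimal stdlib sketch, assuming six (x, y) eye landmarks from a face-landmark model; the sample coordinates and the 0.25 threshold are hypothetical tuning values, not the project's actual parameters:

```python
import math

def dist(p, q):
    """Euclidean distance between two (x, y) landmarks."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks p1..p6 in the common ordering
    (p1/p4 horizontal corners, p2/p6 and p3/p5 vertical pairs).
    EAR drops toward 0 as the eyelid closes."""
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = 2.0 * dist(eye[0], eye[3])
    return vertical / horizontal

EAR_THRESHOLD = 0.25  # hypothetical: below this for N frames -> fatigue alert

open_eye = [(0, 0), (1, 2), (2, 2), (3, 0), (2, -2), (1, -2)]
closed_eye = [(0, 0), (1, 0.2), (2, 0.2), (3, 0), (2, -0.2), (1, -0.2)]
print(eye_aspect_ratio(open_eye) > EAR_THRESHOLD)    # open eye clears the threshold
print(eye_aspect_ratio(closed_eye) < EAR_THRESHOLD)  # closed eye falls below it
```

In practice the ratio is smoothed over consecutive frames so a blink does not trigger the alert pipeline.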
What I'm Building Next
OPTINX
An AI and automation company building intelligent systems for real-world environments — across mobility, agriculture, and robotics.
Awards
Certifications
- ▸ Introduction to Self-Driving Cars (Oct 2025)
- ▸ State Estimation & Localization for Self-Driving Cars
- ▸ Advanced Driver Assistance Systems (ADAS)
- ▸ Fundamentals of Accelerated Computing — Python
- ▸ Fundamentals of Accelerated Computing — CUDA C/C++
- ▸ Deep Learning with PyTorch
- ▸ Robo-AI: Industrial Training on Robotics & AI
I build systems, not demos.
Real-world constraints drive design.
Scalability over shortcuts.
Execution over ideas.