My journey through various robotics companies
AI-Powered Decision Making: Architected AI decision-making modules that fuse multi-modal perception data from cameras, radar, and occupant monitoring systems, enabling safety-critical features and context-aware in-cabin experiences for next-generation vehicles.
Vision-Language Intelligence: Engineered an in-vehicle video intelligence pipeline that combines Vision-Language Models (VLMs), aesthetic quality scoring, and unsupervised clustering to automatically detect and highlight significant trip events from multi-camera footage (a sketch of this kind of stage follows below).
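A minimal sketch of how such a highlight-selection stage could look, assuming CLIP image embeddings (via Hugging Face transformers) and k-means clustering stand in for the production VLM and clustering choices; the frame paths and cluster count are placeholders, not the deployed configuration:

```python
import numpy as np
import torch
from PIL import Image
from sklearn.cluster import KMeans
from transformers import CLIPModel, CLIPProcessor

# Hypothetical frames sampled from the multi-camera recordings.
frame_paths = ["frame_000.jpg", "frame_001.jpg", "frame_002.jpg"]

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Embed every sampled frame into CLIP's joint vision-language space.
images = [Image.open(p) for p in frame_paths]
inputs = processor(images=images, return_tensors="pt")
with torch.no_grad():
    embeddings = model.get_image_features(**inputs).numpy()
embeddings /= np.linalg.norm(embeddings, axis=1, keepdims=True)

# Unsupervised clustering groups semantically similar moments; the frame
# closest to each cluster centre is kept as a candidate trip highlight.
k = min(2, len(frame_paths))
km = KMeans(n_clusters=k, n_init=10).fit(embeddings)
for c in range(k):
    members = np.where(km.labels_ == c)[0]
    dists = np.linalg.norm(embeddings[members] - km.cluster_centers_[c], axis=1)
    print("cluster", c, "representative frame:", frame_paths[members[np.argmin(dists)]])
```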
Semantic Scene Understanding: Deployed VLMs, including CLIP and BLIP variants, for real-time semantic scene understanding, enabling object left-behind alerts and adaptive 360° situational awareness that enhance vehicle security (a zero-shot example is sketched below).
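For the object-left-behind use case, a hedged illustration of zero-shot classification with CLIP; the prompt list, image file name, and alert logic are illustrative assumptions rather than the production setup:

```python
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Illustrative prompts; a production system would tune these per cabin camera.
prompts = ["an empty car seat",
           "a phone left on a car seat",
           "a bag left on a car seat",
           "a child seat on a car seat"]

image = Image.open("rear_seat_after_trip.jpg")   # hypothetical cabin snapshot
inputs = processor(text=prompts, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    probs = model(**inputs).logits_per_image.softmax(dim=-1)[0]

best = int(probs.argmax())
if best != 0:   # anything other than "empty seat" raises an alert in this toy logic
    print(f"object-left-behind alert: {prompts[best]} (p={probs[best]:.2f})")
```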
Autonomous Service Robotics: Designed the system architecture for a modular, ROS2-based robotic vehicle platform supporting autonomous valet parking, EV charging, and automated inspection, with flexible interfaces and reusable components for other automotive service applications.
Industrial-Scale AI Logistics: Designed and optimized the system architecture for AI-powered logistics robots deployed in FedEx and DHL warehouses, enabling high-throughput palletizing, depalletizing, and autonomous box transport at industrial scale.
Performance Optimization: Deployed a containerized robotics pipeline integrating YOLOv7-based perception, motion planning, and ROS2 control modules; system optimizations including GPU inference tuning and architectural refinements improved cycle time and throughput by 40%.
Advanced Manipulation & Planning: Improved grasping and motion planning logic and implemented contingency Model Predictive Control (MPC) for AGVs, enabling safe real-time route re-planning under uncertainty and eliminating placement errors through stacking heuristics (see the sketch below).
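A simplified sketch of the contingency-MPC idea using cvxpy and a 1-D double-integrator AGV model; the dynamics, horizon, input limits, and hazard position are toy assumptions chosen to show the shared-first-input constraint, not the deployed formulation:

```python
import cvxpy as cp
import numpy as np

# 1-D double-integrator AGV: state = [position, velocity], input = acceleration.
dt, N = 0.1, 20
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])

x0 = np.array([0.0, 1.5])      # current position / speed (assumed)
x_ref = np.array([5.0, 1.5])   # nominal goal state (assumed)
obstacle_pos = 3.0             # hypothetical stop line if the hazard materialises

# Two branches: a nominal plan and a contingency (emergency-stop) plan.
xs = {b: cp.Variable((2, N + 1)) for b in ("nom", "con")}
us = {b: cp.Variable((1, N)) for b in ("nom", "con")}

constraints, cost = [], 0
for b in ("nom", "con"):
    constraints += [xs[b][:, 0] == x0]
    for k in range(N):
        constraints += [xs[b][:, k + 1] == A @ xs[b][:, k] + B @ us[b][:, k],
                        cp.abs(us[b][:, k]) <= 2.0]

# Key contingency-MPC constraint: both branches share the first control input,
# so the action actually executed stays safe whichever future unfolds.
constraints += [us["nom"][:, 0] == us["con"][:, 0]]

# Nominal branch tracks the reference; contingency branch must stop before the hazard.
cost += sum(cp.sum_squares(xs["nom"][:, k] - x_ref) for k in range(1, N + 1))
constraints += [xs["con"][0, N] <= obstacle_pos, xs["con"][1, N] == 0.0]

prob = cp.Problem(cp.Minimize(cost), constraints)
prob.solve()
print("first (shared) acceleration command:", us["nom"][:, 0].value)
```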
Simulation & DevOps: Built Gazebo-based simulation environments and CI/CD deployment pipelines with real-time ROS2 visualization tools, accelerating safe field deployment and ensuring software reliability through automated testing.
High-Speed Autonomous Racing: Developed the complete motion planning and control stack for the university's autonomous F1/10 racing team, including a real-time 40 Hz Model Predictive Contouring Controller (MPCC) that generates and tracks dynamically feasible high-speed racing trajectories while avoiding obstacles and opponents (the stage cost is sketched below).
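The core of an MPCC formulation is the contouring/lag error decomposition around a progress-parameterized reference path. Below is a small illustration of those stage-cost terms; the weights, reference-path callables, and progress reward (with the time step absorbed into gamma) are assumed placeholders, not the actual racing stack:

```python
import numpy as np

def mpcc_stage_cost(x, y, theta, ref_x, ref_y, ref_heading,
                    q_contour=5.0, q_lag=50.0, gamma=1.0, v_theta=0.0):
    """Contouring / lag error terms of a Model Predictive Contouring Controller.

    (x, y)   : predicted vehicle position at this stage
    theta    : current estimate of progress along the reference path
    ref_x, ref_y, ref_heading : callables giving the reference point and its
               tangent angle as a function of progress theta
    v_theta  : virtual progress speed, rewarded so the car keeps advancing
    """
    dx = x - ref_x(theta)
    dy = y - ref_y(theta)
    phi = ref_heading(theta)

    # Contouring error: lateral deviation from the reference line.
    e_contour = np.sin(phi) * dx - np.cos(phi) * dy
    # Lag error: longitudinal mismatch between true and estimated progress.
    e_lag = -np.cos(phi) * dx - np.sin(phi) * dy

    return q_contour * e_contour**2 + q_lag * e_lag**2 - gamma * v_theta

# Example: straight reference along the x-axis, small lateral offset.
cost = mpcc_stage_cost(x=1.0, y=0.2, theta=1.0,
                       ref_x=lambda th: th, ref_y=lambda th: 0.0,
                       ref_heading=lambda th: 0.0, v_theta=2.0)
print(cost)
```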
Hierarchical Planning Architecture: Created a real-time hierarchical planner that combines state-lattice planning with Nonlinear Model Predictive Control for tracking, generating plans that enabled competitive performance in autonomous racing.
Automated Track Intelligence: Implemented automated racetrack mapping and data-extraction tools with built-in pre-processing, reducing race preparation time by 50% and enabling rapid adaptation to new racing environments.
Evolutionary Optimization: Adapted and deployed the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) for raceline optimization, integrating evolutionary search with the real-time control stack (a toy sketch follows below).
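A toy sketch of CMA-ES raceline optimization using the cma package: lateral offsets from a circular centerline are optimized against a curvature-based lap-time proxy. The track geometry, bounds, and objective are illustrative assumptions rather than the actual race setup:

```python
import numpy as np
import cma

# Assumed toy track: a circular centerline with N waypoints; decision variables
# are lateral offsets (within the track half-width) at each waypoint.
N, track_half_width = 40, 1.5
s = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
center = np.stack([20.0 * np.cos(s), 20.0 * np.sin(s)], axis=1)
normals = np.stack([np.cos(s), np.sin(s)], axis=1)   # outward direction at each waypoint

def curvature_proxy(offsets):
    """Lap-time proxy: sum of squared discrete curvature of the resulting raceline."""
    pts = center + np.asarray(offsets)[:, None] * normals
    d1 = np.gradient(pts, axis=0)
    d2 = np.gradient(d1, axis=0)
    kappa = np.abs(d1[:, 0] * d2[:, 1] - d1[:, 1] * d2[:, 0]) / \
            np.clip(np.linalg.norm(d1, axis=1) ** 3, 1e-6, None)
    return float(np.sum(kappa ** 2))

es = cma.CMAEvolutionStrategy(np.zeros(N), 0.5,
                              {"bounds": [-track_half_width, track_half_width]})
while not es.stop():
    candidates = es.ask()                       # sample candidate offset vectors
    es.tell(candidates, [curvature_proxy(c) for c in candidates])
es.result_pretty()
```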
Pioneering Robotics Innovation: Co-founded the company behind Nepal's first waiter and banking service robots and led their design, demonstrating the potential for homegrown robotics development in emerging markets.
Cross-Functional Leadership: Co-led a multidisciplinary team of four engineers, driving development of the motion planning and control systems and collaborating with the mechanical and electrical hardware teams to deliver working prototypes within a nine-month timeline.
Multi-Robot Coordination: Deployed a multi-robot global path planner enabling collision-free navigation for waiter robots in dynamic restaurant environments shared with other robots and people (one possible approach is sketched below).
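One common way to realize such a planner is prioritized space-time A*, sketched below on a grid map. This is an assumed approach for illustration (it ignores swap conflicts and post-goal occupancy), not necessarily the deployed algorithm:

```python
import heapq

def plan_prioritized(grid, starts, goals, horizon=200):
    """Prioritized planning: robots are planned one at a time, and each later
    robot treats the earlier robots' timed paths as moving obstacles."""
    reserved = set()                      # (x, y, t) cells already claimed
    paths = []
    for start, goal in zip(starts, goals):
        path = astar_spacetime(grid, start, goal, reserved, horizon)
        if path is None:
            raise RuntimeError(f"no collision-free path found from {start}")
        reserved.update((x, y, t) for t, (x, y) in enumerate(path))
        paths.append(path)
    return paths

def astar_spacetime(grid, start, goal, reserved, horizon):
    """A* over (cell, time); a robot may wait in place or move to a free cell."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])   # Manhattan heuristic
    open_set = [(h(start), 0, start, [start])]
    seen = set()
    while open_set:
        _, t, (x, y), path = heapq.heappop(open_set)
        if (x, y) == goal:
            return path
        if t >= horizon or ((x, y), t) in seen:
            continue
        seen.add(((x, y), t))
        for dx, dy in ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)):   # wait or move
            nx, ny = x + dx, y + dy
            if 0 <= nx < rows and 0 <= ny < cols and grid[nx][ny] == 0 \
                    and (nx, ny, t + 1) not in reserved:
                heapq.heappush(open_set, (t + 1 + h((nx, ny)), t + 1,
                                          (nx, ny), path + [(nx, ny)]))
    return None

# Tiny usage example on a 3x3 map with one blocked cell.
grid = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
print(plan_prioritized(grid, starts=[(0, 0), (2, 0)], goals=[(2, 2), (0, 2)]))
```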
Advanced State Estimation: Evaluated the Unscented Kalman Filter (UKF) against the Extended Kalman Filter (EKF) for indoor pedestrian tracking with LiDAR and radar, adopting the UKF for its higher accuracy in complex indoor environments (see the sketch below).
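A minimal UKF tracking sketch with filterpy, assuming a constant-velocity pedestrian model and fused LiDAR/radar position detections; the update rate, noise values, and measurement model are placeholders, not the tuned production filter:

```python
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

dt = 0.1   # assumed sensor update period

def fx(x, dt):
    """Constant-velocity motion model: state = [px, vx, py, vy]."""
    F = np.array([[1, dt, 0, 0],
                  [0, 1,  0, 0],
                  [0, 0,  1, dt],
                  [0, 0,  0, 1]], dtype=float)
    return F @ x

def hx(x):
    """Measurement model: a fused LiDAR/radar detection reported as (px, py)."""
    return np.array([x[0], x[2]])

points = MerweScaledSigmaPoints(n=4, alpha=0.1, beta=2.0, kappa=-1.0)
ukf = UnscentedKalmanFilter(dim_x=4, dim_z=2, dt=dt, fx=fx, hx=hx, points=points)
ukf.x = np.zeros(4)
ukf.R = np.diag([0.05, 0.05])    # measurement noise (placeholder values)
ukf.Q = np.eye(4) * 0.01         # process noise (placeholder values)

for z in [np.array([0.11, 0.05]), np.array([0.24, 0.09])]:   # toy detections
    ukf.predict()
    ukf.update(z)
    print("estimated position:", ukf.x[[0, 2]])
```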