Robotics software engineer building real-time localization, perception, and guidance pipelines for autonomous aerial and ground platforms. Currently at Dronetools in Sevilla, Spain.
Robotics Software Engineering
I design and ship autonomy software for robots operating in GPS-denied and dynamic environments. My focus is SLAM, state estimation, navigation, and perception pipelines that must run reliably in real time.
At Dronetools, I work on autonomous drone navigation and operator systems that connect low-level control, sensor fusion, and mission workflows. I also competed in the TEKNOFEST Robotaxi competition for three consecutive years with Team Mekatek, improving the system each year and helping the team reach 3rd place and win the Best Team Spirit Award. I am open to relocation for robotics roles in the Netherlands, Switzerland, and Germany.
Four flagship, award-winning systems that demonstrate end-to-end robotics capability: perception, real-time mapping, planning, control, and deployable software architecture.
End-to-end autonomous drone navigation stack for GPS-denied flight in caves, mines, tunnels, and industrial interiors. Covers every layer from sensor fusion and GPU mapping to 6-level autonomy management, Smart RTL, MAVLink flight control, a custom binary comms protocol, and a cross-platform ground station, all designed, architected, and implemented solo.
Sensor fusion → SLAM → planning → safety → FC execution in a single deployable stack
Up to 50M points streamed live on constrained hardware over a custom binary protocol
Designed to keep running through comms loss, degraded sensors, and GPS denial
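The comms-loss behaviour above hinges on the autonomy manager degrading gracefully rather than aborting: losing the link triggers Smart RTL, and degraded localization drops the vehicle to a safer mode. A minimal illustrative sketch of that state machine follows; the level names, thresholds, and `AutonomyManager` API are hypothetical, not the production code.

```python
from enum import IntEnum

class AutonomyLevel(IntEnum):
    """Hypothetical 6-level autonomy ladder (names are illustrative)."""
    MANUAL = 0
    STABILIZED = 1
    POSITION_HOLD = 2
    WAYPOINT = 3
    FULL_AUTO = 4
    SMART_RTL = 5

class AutonomyManager:
    """Degrades gracefully: comms loss or sensor degradation
    triggers a fallback mode instead of an abort."""
    def __init__(self):
        self.level = AutonomyLevel.FULL_AUTO

    def update(self, comms_ok: bool, slam_ok: bool) -> AutonomyLevel:
        if not comms_ok:
            # Comms loss: return home along the recorded path (Smart RTL).
            self.level = AutonomyLevel.SMART_RTL
        elif not slam_ok and self.level != AutonomyLevel.SMART_RTL:
            # Degraded localization: drop to a safer hold mode.
            self.level = min(self.level, AutonomyLevel.POSITION_HOLD)
        return self.level

mgr = AutonomyManager()
print(mgr.update(comms_ok=True, slam_ok=True).name)   # FULL_AUTO
print(mgr.update(comms_ok=False, slam_ok=True).name)  # SMART_RTL
```

Once Smart RTL is entered it is deliberately sticky here: a restored link does not silently resume the mission, which is one reasonable design choice for unattended recovery.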
Professional-grade UAV ground control station (GCS) for real-time gimbal, payload, telemetry, video, and mapping workflows across surveillance, inspection, and emergency operations.
Windows WPF desktop app on .NET 6 (C#)
MVVM + modular solution structure (40+ projects)
Gimbal, mapping, video, cloud telemetry, and mission tooling
Real-time, GPU-accelerated video stabilization and ultra-low-latency RTSP streaming library with seamless passthrough or processing modes, optimized for NVIDIA Jetson edge deployments.
~10-20 ms passthrough, ~50-100 ms processing mode
gstd + gst-interpipe with instant mode switching
Stabilized FPV/perception streams for drones and robots
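The instant mode switching works because gst-interpipe decouples producer and consumer pipelines: repointing an `interpipesrc` element's `listen-to` property selects a different producer without tearing anything down. A sketch of the idea, assuming hypothetical pipeline and element names (`sink`, `src`, `camera_out`, `stabilized_out`); `listen-to` and the gstd `element_set` verb are real, the rest is illustrative:

```python
# Illustrative pipeline descriptions. interpipesink/interpipesrc come from
# gst-interpipe; videoflip stands in for the real stabilization elements.
PIPELINES = {
    "camera":     "v4l2src ! interpipesink name=camera_out",
    "stabilizer": "interpipesrc listen-to=camera_out ! videoflip ! "
                  "interpipesink name=stabilized_out",
    "sink":       "interpipesrc name=src listen-to=camera_out ! autovideosink",
}

def switch_cmd(mode: str) -> str:
    """Build the gstd 'element_set' command that repoints the sink
    pipeline's interpipesrc, switching modes while all pipelines
    keep running. Names here are hypothetical."""
    producer = {"passthrough": "camera_out", "processing": "stabilized_out"}[mode]
    return f"element_set sink src listen-to {producer}"

print(switch_cmd("passthrough"))  # element_set sink src listen-to camera_out
print(switch_cmd("processing"))   # element_set sink src listen-to stabilized_out
```

Because the consumer pipeline never renegotiates, the switch costs roughly one frame, which is what makes the ~10-20 ms passthrough mode viable alongside the heavier processing path.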
Autonomous vehicle competition project in which our team took 3rd place and won the Best Team Spirit Award. I contributed to real-time perception and sensor-fusion decision making under track constraints.
3rd Place + Best Team Spirit Award
Drivable-area detection + YOLO-based label detection
Multi-sensor fusion for driving decisions
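To illustrate the kind of fused decision logic this involves (the actual Robotaxi stack is not public), here is a toy rule combining a camera-derived drivable-area ratio, a YOLO label flag, and a range reading; every name and threshold is made up for the sketch:

```python
def drive_decision(drivable_ratio: float, stop_label_seen: bool,
                   min_obstacle_dist_m: float) -> str:
    """Toy fusion rule: camera cues (drivable-area ratio, YOLO stop
    label) and range data vote on a driving action. Thresholds are
    illustrative only, not tuned competition values."""
    if stop_label_seen or min_obstacle_dist_m < 2.0:
        return "brake"       # hard constraint wins regardless of free space
    if drivable_ratio < 0.3:
        return "slow"        # little free space ahead: reduce speed
    return "cruise"

print(drive_decision(0.8, False, 10.0))  # cruise
print(drive_decision(0.8, True, 10.0))   # brake
```

The ordering encodes a priority scheme: safety-critical detections override the free-space estimate, which in turn gates cruising speed.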
Additional robotics work and experiments are available on GitHub.
Open to SLAM, robotics perception, and autonomous navigation opportunities across Europe.
I am actively exploring robotics software roles focused on autonomy, localization, and navigation in the Netherlands, Switzerland, and Germany. Open to relocation and remote collaboration in European time zones.