Competitions

MathWorks Minidrone Competition

Learn how to build an autonomous Minidrone line follower and pick up industry-relevant skills such as Model-Based Design, all while having fun with drones! This competition is hosted by MathWorks, makers of MATLAB and Simulink, at ERF 2026.

Get an introduction to Model-Based Design while challenging your peers. You will design a line-follower algorithm in Simulink and learn how to model, simulate, and fly a Minidrone.

  • Competition audience: open to students from degree-granting institutions/universities worldwide
  • Team strength: 2-4 members, including the team captain (students from different universities may form a team)

Robotics Automation: Drone Assembly Hackathon

Background & Motivation

As demand for unmanned aerial systems (UAS) surges across both civil and military sectors—including logistics, inspection, security, agriculture, disaster response, and defense—efficient and scalable drone production has become a strategic necessity.

Current assembly processes often rely heavily on manual labor, which limits throughput, increases production cost, and introduces variability in quality. To reach the required scale while ensuring precision, safety, and flexibility, the robotics community must develop new methods for autonomous, adaptable assembly of drone platforms.

Robotic manipulation systems—particularly dual-arm and mobile manipulator configurations—offer a powerful pathway toward automated, modular drone assembly. However, challenges remain in part recognition, fine manipulation, coordinated bimanual actions, and adaptability to component variability.

This hackathon invites researchers to address these challenges head-on.

Hackathon Challenge Statement

Teams are tasked with developing a system in which two robotic manipulators mounted on mobile platforms collaboratively assemble a drone from provided components.

The system must be capable of:

  1. Identifying and classifying drone components using an onboard camera or vision system.
  2. Planning and executing coordinated manipulation actions with two robotic arms.
  3. Performing a complete or partial assembly sequence of a provided drone design.

The emphasis is on autonomy, precision, coordination, and adaptability.

Objectives

By the end of the hackathon, participants should aim to:

  • Demonstrate autonomous part recognition and environmental understanding.
  • Implement robust bimanual or multi-agent coordination between robotic manipulators.
  • Ensure accurate alignment, insertion, fastening, or docking of drone components.
  • Validate approaches that can support scalable, modular drone manufacturing.

Suggested Technical Approach (Flexible)

Participants may choose their own architecture, but we encourage exploration of:

a. Perception & Component Identification

  • 2D/3D vision systems for part classification
  • Marker-based or markerless detection
  • Pose estimation for grasp planning
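As a starting point for pose estimation, the sketch below fits a grasp frame to a segmented part: the centroid gives a grasp position and PCA on the point cloud gives the part's long axis (the gripper would approach perpendicular to it). This is a minimal illustration, not a full 6-DoF pose estimator; the function name and the assumption that a segmented (N, 3) point set is already available are ours.

```python
import numpy as np

def grasp_pose_from_points(points):
    """Estimate a simple grasp frame for a part from its 3D points.

    points: (N, 3) array, e.g. a part segmented out of a depth image.
    Returns (centroid, axis): a grasp position and the part's longest
    axis, found as the first principal component of the cloud.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # SVD of the centred cloud: the first right-singular vector is the
    # direction of greatest extent, i.e. the part's long axis.
    _, _, vt = np.linalg.svd(pts - centroid)
    axis = vt[0]
    if axis[0] < 0:            # fix the sign for a deterministic result
        axis = -axis
    return centroid, axis
```

For a rod-like part lying along x, this returns the rod's midpoint and a unit vector close to (1, 0, 0); a real system would add gripper-width and collision checks before committing to the grasp.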

b. Manipulation & Coordination

  • Bimanual manipulation strategies
  • Synchronization protocols between two robotic arms
  • Shared-task planning across mobile bases
  • Handling tolerances and fine alignment during assembly
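One simple synchronization protocol between two arms is lock-step phasing: neither arm starts phase k+1 until both have finished phase k (e.g. one arm must present the frame before the other inserts a motor). A minimal sketch using a thread barrier, with hypothetical phase names and the real arm commands stubbed out:

```python
import threading

class ArmSynchronizer:
    """Lock-step coordination of two arms via a rendezvous barrier."""

    def __init__(self, n_arms=2):
        self.barrier = threading.Barrier(n_arms)
        self.lock = threading.Lock()
        self.log = []                      # (phase, arm) in completion order

    def run_arm(self, name, phases):
        for phase in phases:
            # ... command the real arm here; we only record the action ...
            with self.lock:
                self.log.append((phase, name))
            self.barrier.wait()            # wait until the other arm is done too

sync = ArmSynchronizer()
phases = ["pick", "align", "fasten"]
threads = [threading.Thread(target=sync.run_arm, args=(arm, phases))
           for arm in ("left", "right")]
for t in threads: t.start()
for t in threads: t.join()
```

The barrier guarantees both "pick" entries precede both "align" entries, and so on, regardless of thread scheduling; richer coordination (force-coupled bimanual moves) needs more than phase barriers, but this covers sequential hand-offs.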

c. Motion Planning & Control

  • Collision-free trajectory generation in dynamic/shared workspaces
  • Precision control for gripping, positioning, and fastening
  • Dynamic adjustment to part misplacement or orientation changes
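A minimal version of collision-aware planning is to sample a straight-line Cartesian segment against known obstacles and, if it is blocked, detour through a raised via-point; real systems would fall back to a full planner (RRT, OMPL) instead. The function names, sphere-obstacle model, and the fixed lift height are illustrative choices:

```python
import math

def segment_clear(a, b, obstacles, step=0.01):
    """Sample the segment a->b every `step` metres and test each sample
    against spherical obstacles given as (centre, radius) pairs."""
    n = max(1, int(math.dist(a, b) / step))
    for i in range(n + 1):
        t = i / n
        p = tuple(a[k] + t * (b[k] - a[k]) for k in range(3))
        if any(math.dist(p, centre) < r for centre, r in obstacles):
            return False
    return True

def plan(a, b, obstacles, lift=0.3):
    """Direct segment if clear, else a detour over the midpoint."""
    if segment_clear(a, b, obstacles):
        return [a, b]
    mid = tuple((a[k] + b[k]) / 2 for k in range(3))
    via = (mid[0], mid[1], mid[2] + lift)
    if segment_clear(a, via, obstacles) and segment_clear(via, b, obstacles):
        return [a, via, b]
    return None   # hand over to a full motion planner in practice
```

For example, a move from (0, 0, 0) to (1, 0, 0) blocked by a 0.1 m sphere at (0.5, 0, 0) is rerouted over the via-point (0.5, 0, 0.3).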

d. Mobile Manipulator Integration

  • Base repositioning for optimal reach
  • Coordination between navigation and manipulation
  • Workspace management for multi-robot systems
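Base repositioning for reach can start from a simple geometric rule: if the target lies beyond the arm's reach radius, drive the base along the line towards it and stop slightly inside reach so the arm operates mid-workspace. A 2D sketch under those assumptions (reach and margin values are placeholders, and a real system would also check navigation costs and the other robot's workspace):

```python
import math

def base_goal_for_target(base_xy, target_xy, reach=0.8, margin=0.1):
    """Return a base position from which the arm can reach target_xy.

    If the target is already within (reach - margin) of the base, no
    base motion is needed; otherwise move along the base->target line
    until the target sits (reach - margin) away.
    """
    dx = target_xy[0] - base_xy[0]
    dy = target_xy[1] - base_xy[1]
    d = math.hypot(dx, dy)
    if d <= reach - margin:
        return base_xy                      # already in reach
    scale = (d - (reach - margin)) / d      # fraction of the line to travel
    return (base_xy[0] + dx * scale, base_xy[1] + dy * scale)
```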

Expected Deliverables

Teams will present:

  • A working prototype
  • A demonstration showing component recognition, grasping, coordinated manipulation, and assembly

Judging Criteria

Entries will be evaluated on:

  1. Autonomy – minimal manual intervention throughout the assembly process
  2. Precision – accuracy in component placement and overall assembly quality
  3. Coordination – effective synchronization between the two manipulators
  4. Robustness – ability to adapt to variation in part placement, lighting, or orientation
  5. Innovation – creativity in design, architecture, or approach
  6. Scalability – relevance and feasibility for real manufacturing systems

Fire-Response Robotics Hackathon

Background & Motivation

Rapid urbanization and the accelerating transition to new building materials and energy systems—such as solar panel installations—have led to an increased frequency and complexity of indoor fire incidents. In these environments, fires often occur in enclosed, partially obstructed, or structurally complex rooms where visibility is low and access for first responders is limited or dangerous.

Timely and precise intervention is critical. The sooner a fire is detected, localized, and contained, the more lives can be saved and the less property damage occurs. However, relying solely on human intervention in unpredictable or hazardous indoor environments places rescue teams at unacceptable risk.

To address this global safety challenge, we invite the international research and innovation community to push the boundaries of autonomous systems for emergency response.

Hackathon Challenge Statement

Participants are tasked with developing an autonomous drone system capable of:

  1. Exploring a previously unknown, enclosed indoor room without prior maps.
  2. Detecting and localizing a fire source under constraints such as smoke, obstacles, and low visibility.
  3. Implementing a safe fire-extinguishing action using onboard tools or mechanisms appropriate to the scale of the demonstration scenario.
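The map-free exploration step is commonly approached with frontier-based exploration: the drone builds an occupancy grid online and repeatedly flies towards "frontier" cells, i.e. free cells that border unknown space. A minimal 2D sketch (the cell encoding and function name are our assumptions):

```python
FREE, UNKNOWN, OCCUPIED = 0, -1, 1

def frontiers(grid):
    """Return the frontier cells of an occupancy grid.

    grid: 2D list of FREE / UNKNOWN / OCCUPIED cells built online from
    onboard sensing. A frontier is a free cell with a 4-connected
    unknown neighbour; flying to the nearest one and re-mapping,
    until none remain, explores the room without a prior map.
    """
    rows, cols = len(grid), len(grid[0])
    out = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != FREE:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == UNKNOWN:
                    out.append((r, c))
                    break
    return out
```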

The solution should operate fully autonomously, making real-time decisions in dynamic, uncertain conditions.

Objectives

By the end of this hackathon, teams should aim to:

  • Demonstrate advanced autonomy in confined indoor environments.
  • Integrate sensing, perception, navigation, and actuation for emergency-response scenarios.

Suggested Technical Approach (Flexible)

Participants are free to choose their own methods; a drone equipped with an RGB camera, a thermal camera, and a laser will be provided.
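With the provided thermal camera, a first pass at fire localization can be as simple as thresholding a thermal frame and taking the centroid of the hot pixels; smoke that defeats the RGB camera barely affects this. A minimal sketch, where the threshold value and function name are illustrative:

```python
def locate_fire(thermal, threshold=80.0):
    """Localize a fire in one thermal frame.

    thermal: 2D list of per-pixel temperatures (deg C) from the
    drone's thermal camera. Returns the (row, col) centroid of all
    pixels at or above `threshold`, or None if nothing is hot.
    """
    hot = [(r, c)
           for r, row in enumerate(thermal)
           for c, t in enumerate(row)
           if t >= threshold]
    if not hot:
        return None
    n = len(hot)
    return (sum(r for r, _ in hot) / n, sum(c for _, c in hot) / n)
```

Projecting that image-plane centroid into the room frame (using the drone's pose and the laser range data) then gives a goal for the extinguishing action; robustness work would add temporal filtering over several frames.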

Expected Deliverables

Each team will present:

  • A working prototype or simulation of the autonomous drone system
  • A demonstration of autonomous exploration, fire detection, and extinguishing during ERF 2026
  • A technical report or presentation explaining the approach, design decisions, and limitations

Judging Criteria

Solutions will be evaluated on:

  1. Autonomy – minimal human intervention
  2. Effectiveness – accuracy in locating and mitigating fire
  3. Robustness – ability to handle obstacles, reduced visibility, and uncertainty
  4. Speed – how quickly the fire is extinguished
  5. Safety – risk mitigation in hardware and algorithmic design