Brain Drone Racing
In Spring 2016, I began working in UF’s Human Experience Research Lab (HXRL). I joined the team building the World’s First Brain Drone Race, which was heavily funded by Intel. We used the Emotiv Insight as our main Brain-Computer Interface (BCI) and DJI Phantoms as our drones. BCIs provide EEG signals that can be used to compute a user’s focus and level of engagement. Using these signals, we calibrated the Emotiv to detect when a person was trying to move a block forward with their thoughts: the calibration first recorded the user’s neutral mental state, then had the user imagine pushing a block forward, and set a threshold between the two. Crossing that threshold was translated into a forward command for the drone. My contribution was developing the simulation used in the qualification round and writing some of the code connecting the drone and the BCI.
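The calibrate-then-threshold idea can be sketched roughly like this. This is a simplified illustration, not our actual race code: the function names are hypothetical, and the real Emotiv SDK and drone API calls are replaced with plain values and strings. Only the thresholding logic mirrors the approach described above.

```python
def calibrate(neutral_samples, push_samples):
    """Set the 'push' threshold halfway between the mean neutral reading
    and the mean reading while the user imagines pushing a block forward.
    (Illustrative rule; the real calibration was done by the Emotiv software.)"""
    neutral = sum(neutral_samples) / len(neutral_samples)
    push = sum(push_samples) / len(push_samples)
    return (neutral + push) / 2

def to_drone_command(reading, threshold):
    """Translate a focus reading into a drone command: move forward when
    the reading crosses the calibrated threshold, otherwise hover."""
    return "FORWARD" if reading > threshold else "HOVER"

# Arbitrary illustrative calibration readings
threshold = calibrate([0.2, 0.25, 0.3], [0.7, 0.8, 0.75])
print(to_drone_command(0.9, threshold))  # FORWARD
print(to_drone_command(0.3, threshold))  # HOVER
```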
The race was featured by 600 different news outlets around the world. Here are a couple of them: