Project: Driven by Optimism | Hyundai
How we customised a car to drive using Brain Computer Interface (BCI)
In April 2016, my friend Matt (founder of TheAlphabetCollective) bumped into an old work acquaintance on the train. While catching up, the acquaintance mentioned an advertising agency that was looking for a team to customise a car to be controlled by brain activity. This was for a Hyundai campaign promoting positivity. Matt asked me for help on the project. I am usually very confident (somewhat delusional) about my tech skills and instantly agreed to work on it.
This project made us nervous, as the safety of the passengers was of the utmost importance. The nervousness was amplified by a hard deadline of 6 weeks, after which the video shoot had to happen. Tight deadlines are common on experiential projects: the creative teams usually take a lot of time refining and negotiating, which leaves very little time for the actual execution (or maybe they just like to see us suffer).
We quickly sprang into action and started planning the development.
Quest to find a BCI device
Our first challenge was finding a BCI device that we could easily program. Neither of us had ever worked with a BCI device before, and our understanding was that they were quite expensive to purchase.
EMOTIV to the rescue
We googled for affordable BCI devices and soon found EMOTIV. The pricing was reasonable, around 300 USD for the EMOTIV Insight. EMOTIV was fairly new at the time, having just come out of Kickstarter. I quickly read through the SDK documentation and decided to go with EMOTIV.
We ordered 5 EMOTIV Insight devices. 2 of them were delivered to me in Amsterdam, while the other 3 went to TheAlphabetCollective in London.
Mock it till you make it
As there was going to be some lead time on the delivery of the devices, we started working on the code and hardware assembly by simulating the BCI device. In other words, we added more buttons and sliders to the UI :)
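A minimal sketch of what such a stand-in could look like. The `MockBCI` class and its `read_focus` method are hypothetical names for illustration; the real project read metrics from the EMOTIV SDK, while here a random walk simply produces plausible values so the rest of the pipeline can be developed without hardware.

```python
import random


class MockBCI:
    """Stand-in for the EMOTIV stream: emits a fake 'focus' metric in [0, 1].

    Hypothetical class for illustration only -- the real project read this
    metric from the EMOTIV SDK. A random walk keeps the values smooth-ish,
    which is closer to a real metric than pure noise.
    """

    def __init__(self, seed=None):
        self._rng = random.Random(seed)
        self._focus = 0.5

    def read_focus(self):
        # Random-walk the metric, clamped to [0, 1].
        self._focus += self._rng.uniform(-0.05, 0.05)
        self._focus = max(0.0, min(1.0, self._focus))
        return self._focus


if __name__ == "__main__":
    bci = MockBCI(seed=42)
    for _ in range(5):
        print(f"focus = {bci.read_focus():.2f}")
```

With a stub like this, the GUI and the actuator commands can be exercised end to end long before the first device arrives in the post.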
Arduino and Actuators
Matt took care of the hardware. He procured actuators that were more powerful than needed and could conceivably destroy the car; I guess he didn’t listen to the advice from Spiderman’s uncle about power and responsibility. He connected the actuators to an Arduino controller, which drove them based on the commands it received on the serial port.

Software for BCI
My part was to write the software that talked to the BCI device and the Arduino controller. We had a few Windows laptops lying around from previous jobs and decided to target Windows. I chose Python for faster development, plus the EMOTIV SDK support for Python was excellent. We also wanted a GUI control panel for manual safety overrides and for visualising the state of the system (e.g., actuator position, BCI metrics). Getting the UI framework right seemed a bit tricky at first, but I settled on wxPython. In retrospect, I would not choose Python for such a realtime application because of the Python GIL; I ended up using queues to get some level of concurrency and keep the UI responsive.
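The queue pattern can be sketched roughly like this: a background thread does the slow, blocking device I/O and pushes readings into a `queue.Queue`, while the UI thread drains the queue without blocking. The `reader` and `drain` functions are illustrative names, and a plain loop stands in for the wxPython event loop; in a real wx app the drain would run in a timer callback.

```python
import queue
import threading
import time


def reader(q, samples):
    # Background thread: in the real app this loop polled the EMOTIV SDK;
    # here a fixed list of fake focus values stands in for the device.
    for focus in samples:
        q.put(focus)
        time.sleep(0.01)
    q.put(None)  # sentinel: stream finished


def drain(q):
    """Non-blocking drain, as a GUI timer callback would do each tick.

    Returns (items, done): items read this tick, and whether the
    sentinel was seen.
    """
    out = []
    while True:
        try:
            item = q.get_nowait()
        except queue.Empty:
            return out, False
        if item is None:
            return out, True
        out.append(item)


if __name__ == "__main__":
    q = queue.Queue()
    t = threading.Thread(target=reader, args=(q, [0.2, 0.5, 0.9]), daemon=True)
    t.start()
    received, done = [], False
    while not done:
        batch, done = drain(q)
        received.extend(batch)
        time.sleep(0.02)  # stand-in for the GUI timer interval
    print(received)  # [0.2, 0.5, 0.9]
```

Because the UI thread never blocks on the device, the control panel stays responsive even though the GIL prevents true parallelism; the threads spend most of their time in I/O waits anyway.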
Safety concerns
Safety was of high importance, as we couldn’t risk the car going out of control if the software misbehaved. We created 4 levels of failsafes.
Brake/Release buttons on the GUI on the Windows laptop that would release the accelerator and press the brake.
A bigger reset button on the Arduino.
A switch that sat between the Arduino and the actuators.
Lastly, the rig with the actuators could be released manually with one hand.
If bugs could kill
Once the simulated code was completed, I flew in a few days before the event to test the integration with the real actuators. While testing, we realised that every few minutes the actuators went crazy and extended their full length of 12 cm. Had we been testing with a real car, it would have accelerated to maximum, which could have threatened the lives of the passengers. Luckily, we figured out the issue quickly: we were sending commands faster than the serial bitrate allowed the actuators to accept them, which corrupted the data on the serial port. We fixed it by making the Python code wait for an ACK from the Arduino before sending the next accelerate/brake value.
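The handshake amounts to pacing the sender by the receiver. A rough sketch, assuming a pyserial-like object with `read()`/`write()`; `FakeSerial`, `send_value`, the newline-terminated command format, and the single `b"A"` ACK byte are all assumptions made up for this example, not the project's actual wire protocol.

```python
class FakeSerial:
    """Pretend Arduino for testing: ACKs every command it receives."""

    def __init__(self):
        self.received = []
        self._acks_pending = 0

    def write(self, data):
        self.received.append(data)
        self._acks_pending += 1
        return len(data)

    def read(self, n=1):
        if self._acks_pending:
            self._acks_pending -= 1
            return b"A"
        return b""  # empty read, like a pyserial timeout with no ACK


def send_value(ser, value, retries=3):
    """Send one accelerate/brake value and block until the Arduino ACKs.

    Without a handshake like this, the sender can outrun the serial
    bitrate and the actuator commands get corrupted in transit; waiting
    for the ACK byte paces the sender to the receiver.
    """
    for _ in range(retries):
        ser.write(f"{value}\n".encode())
        if ser.read(1) == b"A":
            return True
    return False  # no ACK after retries -- caller should fail safe


if __name__ == "__main__":
    ser = FakeSerial()
    assert send_value(ser, 42)
    print(ser.received)  # [b'42\n']
```

On the Arduino side, the matching sketch would parse a command, apply it to the actuators, and only then write the ACK byte back, so at most one command is ever in flight.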
Shooting day
Shooting took place at an airfield. We configured the 2 cars the day before the shoot and drove them around with the power of our minds. Even though we knew how the whole system worked, it was still a very strange feeling in the stomach when the car moved on its own in response to our focus.
We had a separate rig on a table to train the talent (the advertising term for the main characters) on the system before they sat in the car. The rest of the shoot went well and we had a lot of fun.