Human Machine Interface
The concept is inspired by the idea of driving an autonomous vehicle like a horse: the famous 'H-metaphor' introduced by F.O. Flemisch et al. The idea is that two intelligent entities, the driver and the vehicle, share control over the drive. The rider can give the horse more control by loosening the reins, or less by tightening them. Haptic input and output create a dialogue between the two entities: the rider shows intentions, and the horse makes the final call. This dynamic flow of control between two intelligent entities inspired me to design Stewart: an interface that sits between drivers and their autonomous car.
UNIQUE PROPERTIES / PROJECT DESCRIPTION:
Between you and your autonomous car sits Stewart, a haptic interface that mediates between the two of you. Stewart's objective is to foster a healthy relationship between man and machine through an intuitive and expressive form of interaction. Stewart provides you with constant updates about the car's behavior and intentions; if you disagree with the car's next course of action, you can manipulate Stewart to change it. Stewart II aims to make the driving experience personal again by enabling a haptic dialogue between car and driver, offering drivers high-level control over their autonomous vehicle. The prototype is fully interactive and is being used for design research and testing in a driving simulator study. The final objective is to design a haptic language that enables intuitive communication between man and machine for a satisfying driving experience.
OPERATION / FLOW / INTERACTION:
When driving an automated automobile, the eyes are the main sensory organ for maintaining situational awareness. Instead of adding screens and potentially overloading this sensory channel, the decision was made to free the eyes and use haptic and tactile feedback to inform the driver about the vehicle's status. Additionally, a joystick-like design allows drivers to choose whether or not they want to receive feedback from the car, making the interface non-obtrusive. The handle uses rotation to indicate the vehicle's headway; forward and backward motions correspond with acceleration and deceleration of the car, and sideways motions indicate lane changes. User intentions are expressed through spatial gestures in 3D space within a virtual fixture that feels like a concave bowl. A forward motion will accelerate the vehicle, whilst a firm pull will slow it down. Flicking the handle to the left or right will trigger a lane change when possible. By varying haptic stiffness, the vehicle can express a level of urgency in the given situation. For example, if the driver tries to accelerate while another car is close ahead, Stewart's handle will feel rigid and eventually push back to maintain a safe headway to the other car.
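The gesture mapping and variable stiffness described above can be sketched in code. All names, thresholds, and gains below are illustrative assumptions, not the actual Stewart II firmware (which was built with Arduino and Processing):

```cpp
#include <cassert>
#include <cmath>

// Illustrative gesture classifier: maps handle displacement (meters from
// the center of the bowl-shaped virtual fixture) to a high-level driving
// intent. The dead zone and axis convention are hypothetical.
enum class Intent { None, Accelerate, Decelerate, LaneLeft, LaneRight };

Intent classifyGesture(double dx, double dy) {
    const double kDeadZone = 0.01;  // ignore small motions inside the bowl
    if (std::abs(dx) < kDeadZone && std::abs(dy) < kDeadZone)
        return Intent::None;
    // Dominant axis wins: forward/back controls speed, sideways flicks
    // trigger lane changes.
    if (std::abs(dy) >= std::abs(dx))
        return (dy > 0) ? Intent::Accelerate : Intent::Decelerate;
    return (dx < 0) ? Intent::LaneLeft : Intent::LaneRight;
}

// Variable stiffness expresses urgency: the shorter the headway to the
// lead vehicle, the stiffer the handle resists a forward push.
double handleStiffness(double headwayMeters) {
    const double kSafeHeadway = 40.0;  // m, hypothetical safe distance
    const double kBase = 50.0;         // N/m, relaxed stiffness
    const double kMax = 400.0;         // N/m, rigid push-back
    if (headwayMeters >= kSafeHeadway) return kBase;
    double urgency = 1.0 - headwayMeters / kSafeHeadway;  // 0..1
    return kBase + urgency * (kMax - kBase);
}
```

In this sketch the stiffness ramps linearly with urgency; the actual prototype's force profile, and whether push-back is triggered continuously or at a threshold, would be tuned empirically.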
PROJECT DURATION AND LOCATION:
The project has two phases: prototyping and design, and integration and research. The prototyping and design phase took place at the department of Industrial Design at Eindhoven University of Technology, The Netherlands. This phase started in January 2016 and was finalized in June of that same year. The second phase followed immediately in the form of a four-month internship, during which the prototype was integrated and tested in a driving simulator at the Nissan-Renault Research Center in Silicon Valley, CA, USA. The research study was made possible with help from the people at Nissan and Renault, and support from the Center for Design Research at Stanford University. Special thanks go out to Nikhil Gowda for making the second phase possible.
FITS BEST INTO CATEGORY:
Interface and Interaction Design
PRODUCTION / REALIZATION TECHNOLOGY:
3D Printing, laser cutting, Processing, Arduino.
SPECIFICATIONS / TECHNICAL PROPERTIES:
The prototype is based on an octahedral Gough-Stewart platform. The platform has six degrees of freedom (6-DoF): three linear (lateral, longitudinal, and vertical) and three rotational (pitch, roll, and yaw). The platform was not designed for performance but to be visually and haptically expressive. The top of the platform carries an ergonomic, asymmetrical handle that conveys directionality and orientation and has a clear affordance for grasping. The platform can sense external forces in 2-DoF (lateral and longitudinal); these forces are used to register user intent.
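The inverse kinematics of a Gough-Stewart platform reduce to computing six leg lengths from the desired pose of the top plate. The sketch below is simplified to translation plus yaw and uses illustrative anchor geometry, not the prototype's real dimensions:

```cpp
#include <array>
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

// Inverse kinematics sketch for a Gough-Stewart platform: given the top
// plate's translation t and yaw angle (pitch/roll omitted for brevity),
// compute the length each leg must take. base/plate hold the six anchor
// points of the legs in their respective local frames.
std::array<double, 6> legLengths(const Vec3& t, double yaw,
                                 const std::array<Vec3, 6>& base,
                                 const std::array<Vec3, 6>& plate) {
    std::array<double, 6> len{};
    double c = std::cos(yaw), s = std::sin(yaw);
    for (int i = 0; i < 6; ++i) {
        // Rotate the plate anchor about the vertical axis, then translate.
        double px = c * plate[i].x - s * plate[i].y + t.x;
        double py = s * plate[i].x + c * plate[i].y + t.y;
        double pz = plate[i].z + t.z;
        // Leg length is the distance from base anchor to moved plate anchor.
        double dx = px - base[i].x, dy = py - base[i].y, dz = pz - base[i].z;
        len[i] = std::sqrt(dx * dx + dy * dy + dz * dz);
    }
    return len;
}
```

Running this mapping each control cycle lets the actuators drive the plate through the expressive motions described above; a full implementation would add the pitch and roll rotations.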
The HMI is about 25 cm tall and was mounted over the original electronic gear shifter inside the test vehicle for the user study, as shown in image 4. The platform can animate its motion per input values retrieved from the driving simulator. Meanwhile, user intent can be registered and sent back to the driving simulator and the autonomous driving control software.
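The coupling between simulator and platform can be pictured as a small mapping from vehicle state to a target handle pose, following the encoding described earlier: rotation for headway, fore/aft motion for acceleration, lateral motion for lane changes. All gains and ranges below are hypothetical, not the values used in the study:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Target pose for the handle, expressed in a simplified 3-DoF subset.
struct Pose { double surge, sway, yaw; };  // m, m, rad

// Illustrative mapping from simulator state to handle pose. In the real
// setup the simulator streams state to the platform and user intent is
// sent back; here only the outbound mapping is sketched.
Pose poseFromSimState(double accel,       // m/s^2, positive = speeding up
                      double laneOffset,  // -1..1, lane-change progress
                      double headway) {   // m to lead vehicle
    Pose p{};
    p.surge = std::clamp(accel * 0.005, -0.03, 0.03);   // lean with the car
    p.sway  = std::clamp(laneOffset * 0.02, -0.02, 0.02);
    // Shorter headway -> larger handle rotation, saturating at ~45 degrees.
    double urgency = std::clamp(1.0 - headway / 50.0, 0.0, 1.0);
    p.yaw = urgency * 0.785398;  // pi/4 rad
    return p;
}
```

The clamps keep the handle inside its mechanical workspace regardless of what the simulator reports, which matters when scripted scenarios (tailgaters, a drunk driver) produce extreme state values.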
interaction, interface, haptic, driving, hmi
Stewart II functions as a research tool for an exploratory autonomous driving study on indirect control through a life-like haptic interface. The study was conducted at Renault's Innovation Center in Silicon Valley, CA, USA. Participants experienced a simulation of fully automated highway driving with only a haptic joystick at their disposal. The joystick let participants express intentions through gestures that influenced the vehicle's driving style. The main functionality of the HMI was to provide haptic feedback about the vehicle's actions whilst allowing the driver to influence its speed and change lanes. These functionalities were tested through different traffic scenarios such as traffic jams, tailgaters, and a drunk driver. The goal of the study was to identify and organize the effects of fully autonomous driving combined with indirect control of the vehicle's driving style, and their impact on the overall user experience. The findings can be used for future research and the development of future design prototypes.
Besides testing whether an HMI like Stewart II works in an automated driving context and has a positive effect on the overall driving experience, the process of designing and integrating the prototype had plenty of challenges to overcome. The prototype is advanced and uses parallel robotics to display different haptic properties whilst sensing user intent. This technical approach offers a lot of freedom for designing different interactions in the long run. However, such complexity often comes with technicalities. Through an iterative design process, the prototype became suitable for testing. The main goal of Stewart II is to push the idea out of the concept phase and into automotive research. A particular challenge was the integration of the prototype into a driving simulator: the designed interactions had to really work for Stewart II to be more than a concept, and the autonomous driving software was limited in what it could do. Designing suitable interactions and driving scenarios for the user study required resourcefulness and technical skill to make the interactions work seamlessly.
TEAM MEMBERS (3) :
Designer: Felix Ros, Coach: Jacques Terken and Coach: Pierre Levy
Image #1: Creator Felix Ros, Stewart II - side view, 2016.
Image #2: Creator Felix Ros, Stewart II - side view with hand, 2016.
Image #3: Creator Felix Ros, Stewart II - top view with hand, 2016.
Image #4: Creator Felix Ros, Stewart II - close-up of actuators, 2016.
Image #5: Creator Felix Ros, Stewart II - in driving simulator, 2016.