Exploring Human-Robot Collaboration in Architectural Design Processes

Category: Human-Robot Collaboration, Interaction Design

Date: Fall/Winter, 2015/2016

Publication: Exploring Human-Robot Collaboration in Architectural Design Processes (Master’s Thesis)

Project Members: Christian Ø. Laursen


This project was my Master’s Thesis and concluded my five and a half years at Aarhus University. As with my other robot project on robot-supported food experiences, it revolved around the use of robots for tasks in complex human environments.

Robots have a long history in assembly-line manufacturing in the automotive industry. In recent decades, robotics has been moving into more complex human environments, such as architectural design, working alongside humans.
However, collaboration between humans and robots in architectural design has not been fully explored. One aspect of this largely unexplored field of research concerns how architects are required to interact with the robot: current practice has a steep learning curve for architects with no prior knowledge of programming.

In addition, the workflow is tedious and time-consuming, which conflicts with an architect’s quick, creative design process. The thesis investigated how the architect’s workflow could be enhanced by new interaction types for the robot, while lowering the high entry barrier to using robots.

We took relatively unconventional interaction technologies for human-robot interaction in architectural design and applied them to form exploration in granular materials. Through a natural user interface (a Leap Motion controller) and a tangible user interface (square blocks tracked by a camera), we explored how the interaction could be made more tangible and rely less on offline programming. With the Leap Motion controller, the architect could directly manipulate the robotic agent through gestures and sketch out shapes and forms in the granular material. The tangible user interface worked similarly: the architect treated each physical block as a digital waypoint, sketching out a path for the robot to follow.
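To illustrate the waypoint idea, here is a minimal, hypothetical sketch (not the thesis implementation) of how tracked block positions might be turned into an ordered robot path. It assumes the camera tracker yields 2D (x, y) block positions and uses a simple nearest-neighbor heuristic to decide the visiting order; the function and coordinate names are illustrative only.

```python
import math

def order_waypoints(blocks, start=(0.0, 0.0)):
    """Order tracked block positions into a robot path using a simple
    nearest-neighbor heuristic, starting from the robot's current position.

    `blocks` is a list of (x, y) positions, e.g. from camera tracking of
    the physical blocks; returns the blocks in visiting order.
    """
    remaining = list(blocks)
    path = []
    current = start
    while remaining:
        # Pick the closest unvisited block as the next waypoint.
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nxt)
        path.append(nxt)
        current = nxt
    return path

# Example: four blocks placed on the table, ordered from the origin.
blocks = [(3.0, 0.0), (1.0, 0.0), (1.0, 2.0), (3.0, 2.0)]
print(order_waypoints(blocks))
```

A real system would add further steps (smoothing the path, mapping table coordinates into the robot's frame), but the core mapping from physical blocks to digital waypoints is this simple reordering.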