Photo of Ken Nakagaki

Artificial intelligence & robotics

Ken Nakagaki

Merging physical materials with computational and robotic technologies to create a seamless tangible interaction experience.

Year Honored
2021

Organization
MIT Media Lab

Region
Asia Pacific

Hails From
Asia Pacific

User Interfaces (UIs) serve as an important relay between humans and the digital world behind electronic devices. Over the last few decades, Tangible User Interfaces (TUIs) have emerged as a novel type of UI that connects the physical and digital worlds more effectively.

Ken Nakagaki, who recently earned his Ph.D. in Media Arts and Sciences at the MIT Media Lab, has devoted himself since 2014 to researching the future of TUIs, especially those with dynamic actuation and transformation capabilities, known as Actuated Tangible User Interfaces (A-TUIs). He holds two master’s degrees, one from the MIT Media Lab (2016) and another from Keio University (2014).

A-TUIs aim to physically convey digital information and dynamically adapt to bodily interactions through actuation, such as shape change and movement. Nakagaki specifically focuses on merging physical materials with computational and robotic technologies to create a seamless tangible interaction experience. His research consists of three primary agendas: Hardware Form, Perceptual Design, and Passive Material Activation.

Engineering hardware devices plays a fundamental role in realizing and examining the idea of A-TUIs. Nakagaki’s previous projects, such as LineFORM and ChainFORM, demonstrated new hardware devices that explore interactivity with the form factor of ‘line-based materials’ like strings, ropes, and wires. He designed and prototyped shape-changing A-TUIs that utilize the inherent affordances of those materials, resulting in flexible, modular, snake-shaped electronic devices and displays.

Building on these devices with computational actuation, he invented COMP*PASS and Petanko Roller. The first is a compass-based drawing interface that lets users draw a variety of computational shapes; the second is a rolling-pin-based haptic interface that allows users to interact with digital 3D shapes. Both devices can be operated intuitively with the fingers and hands.

Of course, hardware alone cannot facilitate the interaction between the physical and digital worlds. Another critical part is integrating human factors and perceptual design techniques into hardware devices.

“I have implemented interactive systems to examine novel cognitive experience through touch and actuation to enrich data physicalization as well as entertainment or narrative design,” says Nakagaki.

In previous work such as Materiable, he defined a new interaction framework that utilized pseudo-haptics, an illusory haptic perception technique, to represent material properties as physical materials change shape or deform under pressure. Tangible interactions initiated by humans can be reflected on a digital display in real time. Such research has advanced A-TUIs’ capability to render shapes and versatile materiality for rich data physicalization.

Nakagaki’s third research direction is Passive Material Activation, which is also part of his recently completed Ph.D. dissertation. He defined this concept as the ‘Mechanical Shell’, in which robots and agents can ‘dock’ with other objects to change, transform, and augment their functionality.

In particular, he investigates how passive objects in our surrounding physical environment can be activated to start responding to interactions. These objects are embedded with modular mechanical parts that are interchangeable and flexible; combined with A-TUIs, they can perform a variety of tasks.

“I want to bring my ideas and visions to real-working ‘experienceable’ prototypes. By doing so, I reveal novel interaction opportunities in the physical environment for computers to physically express and dynamically respond to human interaction. I believe such cutting-edge research paradigm can truly advance the way we interact with computers and materials around us,” says Nakagaki.

In 2022, Nakagaki will join the University of Chicago as an assistant professor, launching his own lab, the Actuated Experience Lab, to further pursue his vision.