OFFICE DEVICE
TALK 2/3
Tools will usher in new ideas that do not exist within the human brain: MITSUO ISO
January 21, 2015
This is a conversation with animator and screenwriter Mitsuo Iso about the way people will work in the future, conducted by Ricoh researcher Kenichiro Saisho (Part 2 of 3). The discussion of future jobs and office environments begun in the previous installment develops further here, expanding into new ways of interaction between humans and machines, and between humans and data.
The dawn of new ways of interaction between humans and machines, and between humans and data
-
Iso:
I think that tools such as smartphones and PCs are currently becoming just as important as people's physical bodies when those people are doing jobs or tasks. Even in animation, for instance, it has gotten to the point where nothing gets done without using software on computers.
-
Saisho:
You're really talking about the same thing as the "new ways of interaction between humans and machines, and between humans and data" that I am researching and developing. Right now, we're working on the interrelationship between head-up displays (HUDs) and the human body, as well as on structures that act directly on human perception.
-
Iso:
In the animation world, when we imagine how the animation will look, we often piece together various ideas that are keyed to the software. I think that people in other professions are the same. It's now a matter of course that the tools themselves usher in new ideas that do not exist within the human brain.
-
Saisho:
You mean that tools will change behavior? With the HUDs that we are developing, augmented reality (AR) changes the actions that people take, just as the computer pets do in your anime Dennō Coil. We conduct our research while anticipating the changes to the "story" that will happen after we have released our displays. Visually, a HUD makes the data and information previously displayed on the automobile's dashboard and elsewhere appear to float in mid-air.
-
Iso:
I think that visualization like that is the externalization of information inside human brains. You might say that it's taking out pieces of intelligence and turning them into objects.
-
Saisho:
Yes, it is truly an externalization of the driver's brain. For next-generation HUDs in particular, it's absolutely essential to have both technology for the human to watch the external world and technology for the machine to watch the human. The HUD determines information such as how fast the vehicle is traveling in kilometers per hour, how far ahead the target object is, and where the driver is looking right now, and then visualizes it. The HUD then enhances the human's perception by seamlessly aligning those various data with the human's field of vision.
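[Editorial note: the pipeline Saisho describes, in which the machine watches both the outside world and the driver and fuses the two, might be sketched roughly as follows. All names, values, and the simple clamping rule are hypothetical, invented for this illustration; they are not Ricoh's actual design.]

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    speed_kmh: float          # vehicle speed in km/h (machine watching the world)
    target_distance_m: float  # distance to the target object, in metres

@dataclass
class DriverState:
    gaze_x_deg: float  # horizontal gaze angle, 0 = straight ahead (machine watching the human)
    gaze_y_deg: float  # vertical gaze angle

def overlay_position(driver: DriverState, max_offset_deg: float = 5.0):
    """Place the HUD marker near the driver's current gaze, clamped so it
    never drifts more than max_offset_deg away from straight ahead."""
    clamp = lambda v: max(-max_offset_deg, min(max_offset_deg, v))
    return clamp(driver.gaze_x_deg), clamp(driver.gaze_y_deg)

def hud_frame(vehicle: VehicleState, driver: DriverState) -> dict:
    """One 'frame' of HUD content: sensed data about the world, plus where to
    draw the marker so it aligns with the driver's field of vision."""
    return {
        "speed_kmh": vehicle.speed_kmh,
        "target_distance_m": vehicle.target_distance_m,
        "marker_deg": overlay_position(driver),
    }

# Example: the driver glances 8 degrees to the right; the marker follows the
# gaze but is clamped to the 5-degree display region.
frame = hud_frame(VehicleState(speed_kmh=60.0, target_distance_m=35.0),
                  DriverState(gaze_x_deg=8.0, gaze_y_deg=-2.0))
print(frame["marker_deg"])  # (5.0, -2.0)
```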
Toward an era in which machine-derived solutions change human behavior and results
-
Iso:
That's pretty much the epitome of an advanced interface, isn't it? What would happen if not just the data from the individual car and driver but also the data from surrounding cars were all linked together? People outside might immediately know, "Hey! That driver's dozed off." (Laughs)
-
Saisho:
As self-driving cars become more common in the future, I think it will end up like that. The theme of our research going forward is how to visualize those data and communicate them to the human.
-
Iso:
It rather adds to the fantasy when you think about what might happen if those data were transmitted as something other than text.
-
Saisho:
That's right. For instance, the car navigation systems we have now deliver messages by text and by voice, but it's not uncommon for a driver to misinterpret the directions and mistakenly turn at the intersection right in front of them. I often make that mistake myself. (Laughs) To prevent mistakes like that, we're making progress on research into making directions immediately intelligible to the driver's brain, rather than asking the driver to decode text and voice commands. One approach is visualization, or non-verbal communication: minimizing the amount of text and expressing the information graphically.
-
Iso:
There are two recent news stories that come to mind when I hear about the relationship between people and machines. One is the story of a shogi program defeating a professional shogi player for the first time. The other is the story of the German national team winning the World Cup using strategies built with data-analysis systems from the IT industry. In both stories, the machine prevailed because it gathered information for itself and then interpreted those data, again for itself, arriving at a solution that a human could not have thought up.
-
Saisho:
Machines are good at finding things like the logical solution that lies beyond a human's limited processing power, aren't they?
-
Iso:
If the humans had incorporated the conclusions obtained through that kind of machine learning, they might have managed to pull it off somehow. (Laughs) I suppose humanity has only recently first encountered this phenomenon. Perhaps we've finally met an intelligence that isn't human. What I fantasize about is whether these intelligences will start appearing in the workplace.