Mind-Expanding Machines

Week of Aug. 30, 2003; Vol. 164, No. 9, p. 136
Artificial intelligence meets good old-fashioned human thought
Bruce Bower

When Kenneth M. Ford considers the future of artificial intelligence, he doesn’t envision legions of cunning robots running the world. Nor does he have high hopes for other much-touted AI prospects, among them machines with the mental moxie to ponder their own existence and tiny computer-linked devices implanted in people’s bodies. When Ford thinks of the future of artificial intelligence, two words come to his mind: cognitive prostheses.

http://www.sciencenews.org/articles/20030830/bob9.asp

It’s not a term that trips off the tongue. However, the concept behind the words inspires the work of the more than 50 scientists affiliated with the Institute for Human and Machine Cognition (IHMC) that Ford directs at the University of West Florida in Pensacola. In short, a cognitive prosthesis is a computational tool that amplifies or extends a person’s thought and perception, much as eyeglasses are prostheses that improve vision. The difference, says Ford, is that a cognitive prosthesis magnifies strengths in human intellect rather than corrects presumed deficiencies in it. Cognitive prostheses, therefore, are more like binoculars than eyeglasses.

Current IHMC projects include an airplane-cockpit display that shows critical information in a visually intuitive format rather than on standard gauges; software that enables people to construct maps of what’s known about various topics, for use in teaching, business, and Web site design; and a computer system that identifies people’s daily behavior patterns as they go about their jobs and simulates ways to organize those practices more effectively.

Such efforts, part of a wider discipline called human-centered computing, attempt to mold computer systems to accommodate how humans behave rather than build computers to which people have to adapt. Human-centered projects bear little relationship to the traditional goal of artificial intelligence: to create machines that think as people do.

As a nontraditional AI scientist, Ford dismisses the influential Turing Test as a guiding principle for AI research. Named for mathematician Alan M. Turing, the 53-year-old test declares that machine intelligence will be achieved only when a computer behaves or interacts so much like a person that it’s impossible to tell the difference.

Not only does this test rely on a judge’s subjective impressions of what it means to be intelligent, but it fails to account for weaker, different, or even stronger forms of intelligence than those deemed human, Ford asserts.

Just as it proved too difficult for early flight enthusiasts to discover the principles of aerodynamics by trying to build aircraft modeled on bird wings, Ford argues, it may be too hard to unravel the computational principles of intelligence by trying to build computers modeled on the processes of human thought.

That’s a controversial stand in the artificial intelligence community. Although stung by criticism of their failure to create the insightful computers envisioned by the field’s founders nearly 50 years ago, investigators have seen their computational advances adapted to a variety of uses. These range from Internet search engines and video games to cinematic special effects and decision-making systems in medicine and the military. And, skeptics such as Ford notwithstanding, many researchers now have their sights set on building robots that pass the Turing Test with flying colors.

“I’m skeptical of people who are skeptical” of AI research, says Rodney Brooks, who directs the Massachusetts Institute of Technology’s artificial intelligence laboratory. He heads a “hard-core AI” venture aimed at creating intelligent, socially adept robots with emotionally expressive faces. Brooks also participates in a human-centered project focused on building voice-controlled, handheld computers connected to larger systems. The goal is for people to effectively tell the portable devices to retrieve information, set up business meetings, and conduct myriad other activities (SN: 5/3/03, p. 279: http://www.sciencenews.org/20030503/bob8.asp).

Cognitive prostheses represent a more active, mind-expanding approach to human-centered computing than Brooks’ project does, Ford argues. “This line of work will help us formulate what we really want from computers and what roles we wish to retain for ourselves,” he says.

Flight vision

In the land of OZ, which lies entirely within a cockpit mock-up at IHMC, aircraft pilots simulate flight with unaccustomed ease because they see their surroundings in a new light.

IN PLANE VIEW. A typical cockpit setup, left, contrasts with that of the OZ system, right. OZ combines data from dials and gauges into a visual depiction of the aircraft and external conditions. Here, the aircraft approaches a runway, represented by three green dots. (Image: Still)

IHMC’s David L. Still has directed work on the OZ cockpit-display system over the past decade. The movie-inspired name comes from early tests in which researchers stood behind a large screen to run demonstrations for visitors, much as the cinematic Wizard of Oz controlled a fearsome display from behind a curtain.

For its part, Still’s creation taps into the wizardry of the human visual system. In a single image spread across a standard computer screen, OZ shows all the information needed to control an aircraft. An OZ display taps into both a person’s central and peripheral vision. The pilot’s eyes need not move from one gauge to another, says Still.

A former U.S. Navy optometrist who flies private planes, Still participated in research a decade ago that demonstrated people’s capacity to detect far more detail in peripheral vision than had been assumed.

“OZ decreases the time it takes for a pilot to understand what the aircraft is doing from several seconds to a fraction of a second,” Still says. That’s a world of difference to pilots of combat aircraft and to any pilot dealing with a complex or emergency situation.

The system computes key information about the state of the aircraft for immediate visual inspection. The data on the six or more gauges in traditional cockpits are translated by OZ software into a single image with two main elements. On a dark background, a pilot sees a “star field,” lines of bright dots that by their spacing provide pitch, roll, altitude, and drift information. A schematic diagram of an airplane’s wings and nose appears within the star field and conveys updates on how to handle the craft, such as providing flight path options and specifying the amount of engine power needed for each option. Other colored dots and lines deliver additional data used in controlling the aircraft.

In standard training, pilots learn less-intuitive rules of thumb for estimating the proper relationship of airspeed, lift, drag, and attitude from separate gauges and dials. With the OZ system, a pilot need only keep certain lines aligned on the display to maintain the correct relationship.

Because OZ spreads simple lines and shapes across the visual field, pilots could still read the display even if their vision suddenly blurred or if bright lights from, say, exploding antiaircraft flak temporarily dulled their central vision or left blind spots.

Experienced pilots quickly take a shine to OZ, Still says. In his most recent study, 27 military flight instructors who received several hours of training on OZ reported that they liked the system better than standard cockpit displays and found it easier to use. In desktop flight simulations, the pilots maintained superior control over altitude, heading, and airspeed using OZ versus traditional gauges.

OZ provides “a great example” of a human-centered display organized around what the user needs to know, remarks Mica Endsley, an industrial engineer and president of SA Technologies in Marietta, Ga. The company’s primary service is to help clients in aviation and other industries improve how they use computer systems.

If all goes well, OZ will undergo further testing with veteran pilots as well as with individuals receiving their initial flight training. The system will then be installed in an aircraft for test flights.