
NIST Technical Note 1260

Visual Perception Processing in a
Hierarchical Control System: Level I




NOTE: As of 23 August 1988, the National Bureau of
Standards (NBS) became the National Institute of
Standards and Technology (NIST) when President
Reagan signed into law the Omnibus Trade and
Competitiveness Act.





U.S. Department of Commerce
Robert A. Mosbacher, Secretary

National Institute of Standards and Technology
Raymond G. Kammer, Acting Director

U.S. Government Printing Office Washington: 1989

National Institute of Standards and Technology Technical Note 1260
Natl. Inst. Stand. Technol. Tech. Note 1260, 49 pages (June 1989)
CODEN: NTNOEF

For sale by the Superintendent
of Documents
U.S. Government Printing Office
Washington, DC 20402




Contents

2. General System Architecture
   2.1. Task Decomposition
      2.1.1. Job Assignment Module
      2.1.2. Planner Module
      2.1.3. Execution Module
   2.2. Sensory Processing
      2.2.1. Comparator Module
      2.2.2. Temporal Integrators
      2.2.3. Spatial Integrator
      2.2.4. Detection Module
   2.3. World Modeling
      2.3.1. World Modeling to Task Decomposition Interfaces
      2.3.2. World Modeling to Sensory Processing Interfaces
3. Level 1 Interfaces and Operation
   3.1. Level 1 Task Decomposition Module
      3.1.1. Level 1 Job Assignment Module
      3.1.2. Level 1 Planner Module
      3.1.3. Level 1 Execution Module
   3.2. Level 1 Sensory Processing Module
   3.3. Level 1 World Model
4. A Vision System Application

1. Introduction

The telerobot control system architecture discussed in (ALBUS87) describes a hierarchical framework that has been used to control complex robot systems. It decomposes plans both spatially and temporally to meet system objectives, monitors the environment with system sensors, and maintains the status of system variables in order to control system resources.

The control system is composed of three parallel systems that cooperate to perform telerobot control (fig. 1). The task decomposition system breaks down objectives into simpler subtasks to control physical devices. The world model supplies information and analyzes data using support modules. It also maintains an internal model of the state of the environment in the global data system. The sensory processing system monitors and analyzes sensory information from multiple sources in order to recognize objects, detect events and filter and integrate information. The world model uses this information to maintain the system's best estimate of the past, current, and possible future states of the world.
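As a rough illustration, the cooperation among the three systems might be sketched as follows. All class and method names here are hypothetical placeholders; the actual design is defined by the module interfaces described in this document, not by this code.

```python
# Hypothetical sketch of the three cooperating systems described above.
# Names and behaviors are illustrative stand-ins, not the NIST design.

class TaskDecomposition:
    """Breaks an objective into simpler subtasks."""
    def decompose(self, objective):
        # Trivial stand-in: split a compound objective into steps.
        return objective.split(" then ")

class SensoryProcessing:
    """Filters and integrates raw sensor readings."""
    def process(self, readings):
        # Simple averaging as a placeholder for filtering/integration.
        return sum(readings) / len(readings)

class WorldModel:
    """Maintains the system's best estimate of the world state."""
    def __init__(self):
        self.state = {}
    def update(self, key, value):
        self.state[key] = value
    def estimate(self, key):
        return self.state.get(key)

# The three systems cooperate through the world model:
td = TaskDecomposition()
sp = SensoryProcessing()
wm = WorldModel()

wm.update("range", sp.process([2.0, 2.2, 1.8]))
subtasks = td.decompose("approach object then grasp object")
```

The key point of the sketch is the division of labor: task decomposition never reads sensors directly, and sensory processing never plans; both exchange state through the world model.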

Each device or sensor of the telerobot has a support process in each of the three columns of the control system, as shown in figure 2. For example, the task decomposition functions associated with planning the actions for processing camera data reside in the task decomposition hierarchy; the world modeling functions for supporting those plans reside in the world model hierarchy; and the image processing techniques required for executing those plans reside in the sensory processing hierarchy. The modules can be logically configured according to their function in the system, as shown in figure 3. The system pictured consists of two main branches: the left branch contains the perception processes, and the right branch contains the manipulation processes. The perception branch of the tree supports processes that provide sensory feedback to the manipulator system, such as cameras, range sensors, tactile array sensors, and acoustic devices. The manipulator branch of the tree supports processes that are responsible for planning and executing manipulator trajectories.

In most cases the two branches decompose tasks independently, communicating via the global data system.
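The two-branch logical tree of figure 3 can be sketched as a small data structure. The device names below are examples taken from the text; the exact layout of the tree is an assumption for illustration.

```python
# A minimal sketch of the logical configuration in figure 3, with the
# two main branches as a dictionary. Device lists are illustrative.

control_tree = {
    "perception": ["camera", "range sensor", "tactile array", "acoustic device"],
    "manipulation": ["trajectory planner", "trajectory executor"],
}

def branch_of(device):
    """Find which branch supports a given device process."""
    for branch, devices in control_tree.items():
        if device in devices:
            return branch
    return None
```

A query such as `branch_of("camera")` resolves to the perception branch; processes in different branches would coordinate only through the global data system, mirroring the text above.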

The world modeling support modules communicate asynchronously with the task decomposition and sensory processing systems. Data flows bidirectionally between adjacent levels within any given hierarchy. The interfaces to the sensory processing system allow it to operate in a combination of bottom-up (data-driven) and top-down (model-driven) modes. Bottom-up processing involves the extraction of knowledge from sensory data, while top-down processing is used to correlate predicted information from the world model with extracted information from the environment. The interfaces between the sensory processing system and the world model allow updated information to be sent to the world model, and predicted information or sensory processing parameters to be sent to the sensory processing system.
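The interplay of the two modes can be sketched as a bottom-up extraction step followed by a top-down comparison against a world-model prediction. The function names, tolerance value, and residual logic here are assumptions for illustration, not the document's comparator specification.

```python
# Hedged sketch of combined bottom-up / top-down processing.
# Names, the tolerance, and the residual test are illustrative only.

def bottom_up_extract(raw):
    """Extract a measurement from raw sensory data (here, a simple mean)."""
    return sum(raw) / len(raw)

def top_down_compare(observed, predicted, tolerance=0.5):
    """Correlate a world-model prediction with the extracted observation.

    Returns the residual and whether it confirms the prediction."""
    residual = observed - predicted
    return residual, abs(residual) <= tolerance

observed = bottom_up_extract([4.9, 5.1, 5.0])
residual, confirmed = top_down_compare(observed, predicted=5.2)
# A confirmed prediction would update the world model; a large residual
# could instead trigger re-detection or a correction to the model.
```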

This document describes the interfaces and functionality of Level 1 of the perception branch for a camera that is part of a telerobotic control system. This level corresponds to the one highlighted in figure 3. Processing is performed on individual pixels. Level 1 gathers raw information (readings) from each camera, filters the information, and, when applicable, enhances it. It then extracts edge points, surface patches, and information relevant to the op
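The pixel-level operations described above (filter the readings, then extract edge points) might be sketched as follows for a single image row. The 3-pixel averaging window and the intensity threshold are assumed values for illustration; Level 1's actual operators are specified later in the document.

```python
# Illustrative sketch of Level 1 pixel processing on one image row:
# smooth the raw readings, and mark edge points where the intensity
# step between neighboring pixels exceeds a threshold (assumed values).

def smooth_row(row):
    """Average each pixel with its horizontal neighbors (ends clamped)."""
    n = len(row)
    out = []
    for i in range(n):
        window = row[max(0, i - 1):min(n, i + 2)]
        out.append(sum(window) / len(window))
    return out

def edge_points(row, threshold=50):
    """Return indices where the intensity step exceeds the threshold."""
    return [i for i in range(1, len(row)) if abs(row[i] - row[i - 1]) > threshold]

raw = [10, 12, 11, 200, 205, 198]   # a bright region starting mid-row
edges = edge_points(raw)            # the single large step is at index 3
```

In practice filtering would precede extraction (`edge_points(smooth_row(raw))`); smoothing suppresses single-pixel noise at the cost of slightly blurring the true edge.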
