u = (x Tz - f Tx)/Z + α xy/f - β(f + x²/f) + γ y
v = (y Tz - f Ty)/Z + α(f + y²/f) - β xy/f - γ x

where

V = (x, y)

is the position of a point in the image and (u, v) is its image velocity, f is the focal length of the camera, Z is the depth of the corresponding scene point,

T = (Tx, Ty, Tz)

is the translation velocity of the camera, and α, β, γ represent the rotational velocity of the camera about the X, Y, and Z axes respectively [HONG89].
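Assuming the standard rigid-motion form of these equations (the exact sign conventions in [HONG89] may differ), the image velocity induced by camera motion can be sketched as:

```python
# Sketch of the rigid-motion image-velocity equations (standard form;
# sign conventions may differ from [HONG89]).
def image_velocity(x, y, Z, f, T, omega):
    """Image-plane velocity (u, v) of the point projecting to (x, y).

    x, y  : image coordinates of the point
    Z     : depth of the corresponding scene point
    f     : camera focal length
    T     : (Tx, Ty, Tz) camera translation velocity
    omega : (alpha, beta, gamma) rotation rates about X, Y, Z
    """
    Tx, Ty, Tz = T
    a, b, g = omega
    # Translational component: scaled by inverse depth 1/Z.
    u_t = (x * Tz - f * Tx) / Z
    v_t = (y * Tz - f * Ty) / Z
    # Rotational component: independent of depth.
    u_r = a * x * y / f - b * (f + x * x / f) + g * y
    v_r = a * (f + y * y / f) - b * x * y / f - g * x
    return u_t + u_r, v_t + v_r
```

Note that the translational terms carry all of the depth information; a pure rotation produces image motion that reveals nothing about scene structure.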

The Horn and Schunck optical flow algorithm [HORN81] uses the ratio of the spatial and temporal image derivatives described above, computed over two frames of an image sequence, to measure pixel velocity normal to the gradient direction. Analysis of their method by Kearney et al. [KEAR87] indicates that large errors occur where the image is highly textured or where motion boundaries exist due to depth discontinuities. In an effort to overcome these shortcomings, methods have been developed which use a large number of frames sampled closely together in time [DUNC88, WAXM88]. Errors are reduced by extracting the optical flow from the second derivative of the temporally Gaussian-smoothed image [MARR81].
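The measurement underlying this approach is the gradient constraint Ix·u + Iy·v + It = 0, which by itself determines only the flow component along the gradient, v_n = -It / |∇I|. A minimal sketch using simple finite differences over two frames (not the iterative smoothing scheme of [HORN81]):

```python
import numpy as np

# Normal-flow sketch: the flow component along the image gradient,
# v_n = -It / sqrt(Ix^2 + Iy^2), estimated from two frames with
# plain finite differences (a simplification of [HORN81]).
def normal_flow(frame0, frame1, eps=1e-6):
    f0 = frame0.astype(float)
    Ix, Iy = np.gradient(f0, axis=(1, 0))   # spatial derivatives
    It = frame1.astype(float) - f0          # temporal derivative
    mag = np.sqrt(Ix**2 + Iy**2)
    return -It / (mag + eps)                # flow normal to the gradient
```

On a linear intensity ramp translating one pixel per frame, this recovers a normal flow of one pixel per frame; where the gradient magnitude is near zero the measurement is undefined, which is one source of the errors noted above.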

B8.4. Evaluation

Errors are often made when pixels are segmented on the basis of local information, as in the boundary and region methods described previously. One way to improve the reliability of labelling is to adjust the measurements made at a pixel based on the measurements of adjacent pixels. This method, called relaxation, detects and corrects local inconsistencies in the pixel labels [DAVIS80].

There are two types of relaxation methods: discrete and fuzzy. A discrete method compares a pixel's label with the labels of adjacent pixels and may change it on the basis of that comparison. A fuzzy method associates a likelihood value with each possible label and uses these likelihoods to select the most appropriate label.
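A minimal sketch of the discrete case, assuming a 1-D row of labels and a simple neighbor-agreement rule (illustrative only; actual discrete relaxation schemes use richer comparisons):

```python
# Toy discrete relaxation step for a 1-D row of labels: a pixel's label
# is flipped when it disagrees with two agreeing neighbors.
def relax_once(labels):
    out = list(labels)
    for i in range(1, len(labels) - 1):
        if labels[i - 1] == labels[i + 1] != labels[i]:
            out[i] = labels[i - 1]   # correct the local inconsistency
    return out
```

Repeating such a pass removes isolated inconsistent labels while leaving coherent runs of labels untouched.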

A relaxation process is specified by two things: a neighborhood model and an interaction model. The neighborhood model specifies which pairs of pixels contribute to the relaxation process. The choice of which pixels communicate depends on the goal of segmentation. A directional neighborhood model may be specified for edge detection, while the positional information may not be important in a region extraction method. The interaction model determines the criteria for changing a pixel's label. Interaction models need to represent the relationships between labels and the mechanism by which labels are modified. The interactions can be represented by relational knowledge or by logical statements.
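The two models can be made explicit in code. In the illustrative sketch below, a 4-connected grid serves as the neighborhood model and a label-compatibility matrix serves as the interaction model; the update rule is in the general style of fuzzy relaxation labelling, not the specific formulation of [DAVIS80]:

```python
import numpy as np

# Fuzzy relaxation sketch. Neighborhood model: 4-connected grid
# (np.roll wraps at the border, a toroidal simplification).
# Interaction model: compatibility matrix R, where R[k, l] is the
# support label l at a neighbor lends to label k at a pixel.
def relax_fuzzy(P, R, steps=1):
    """P: (H, W, L) per-pixel label likelihoods; R: (L, L) compatibilities."""
    P = P.copy()
    for _ in range(steps):
        # Average the neighbors' likelihoods (neighborhood model).
        nbr = (np.roll(P, 1, 0) + np.roll(P, -1, 0)
               + np.roll(P, 1, 1) + np.roll(P, -1, 1)) / 4.0
        support = nbr @ R.T                  # interaction model
        P = P * (1.0 + support)              # reinforce compatible labels
        P /= P.sum(axis=2, keepdims=True)    # renormalize likelihoods
    return P
```

With an identity compatibility matrix, an ambiguous pixel surrounded by confidently labelled neighbors is pulled toward the neighbors' label, which is exactly the inconsistency-correcting behavior described above.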


S

sharpening 32, 33
smoothing 29, 30, 31
spatial integrator 8, 9, 18
status 7, 11, 14, 15
support processes 9

T

task decomposition 1, 5, 10, 20
telerobot 1
template matching 37, 38, 39
temporal integrator 8, 9, 18
termination time 14
texture 38, 40, 41
threshold 29, 40
thresholding 27
timestamp 14
timing requirements 7, 10, 11

W

window 9, 10, 18
world model 1, 9, 10, 20

U.S.GOVERNMENT PRINTING OFFICE 1989-242-200/00021

NBS-114A (REV. 2-80)

U.S. DEPT. OF COMM.
BIBLIOGRAPHIC DATA SHEET (See instructions)

1. PUBLICATION OR REPORT NO.: NIST/TN-1260

2. Performing Organ. Report No.:

3. Publication Date: June 1989

4. TITLE AND SUBTITLE: Visual Perception Processing in a Hierarchical Control System: Level I

5. AUTHOR(S): Karen Chaconas, Marilyn Nashman

6. PERFORMING ORGANIZATION (If joint or other than NBS, see instructions):
NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY
(formerly NATIONAL BUREAU OF STANDARDS)
U.S. DEPARTMENT OF COMMERCE
GAITHERSBURG, MD 20899

7. Contract/Grant No.:

8. Type of Report & Period Covered: Final

9. SPONSORING ORGANIZATION NAME AND COMPLETE ADDRESS (Street, City, State, ZIP): Same as Item #6

10. SUPPLEMENTARY NOTES: Document describes a computer program; SF-185, FIPS Software Summary, is attached.

11. ABSTRACT (A 200-word or less factual summary of most significant information. If document includes a significant bibliography or literature survey, mention it here):
This document describes the interfaces and functionality of the first level of the visual perception branch of a realtime hierarchical manipulator control system. It includes a description of the scope of the processing performed and the outputs generated. It defines the interfaces and the information exchanged between the modules at this level, as well as interfaces to a camera, a human operator, and to higher levels of the system. The reader should be familiar with ICG Document #001, NASA/NBS Standard Reference Model for Telerobot Control System Architecture (NASREM).

12. KEY WORDS (Six to twelve entries; alphabetical order; capitalize only proper names; and separate key words by semicolons):
camera processing; execution module; hierarchical control system; image processing; job assignment module; perception; planner module; real-time; sensory processing; task decomposition; temporal-spatial processing; world model

13. AVAILABILITY:
XX Unlimited
For Official Distribution, Do Not Release to NTIS
Order From Superintendent of Documents, U.S. Government Printing Office, Washington, D.C. 20402
Order From National Technical Information Service (NTIS), Springfield, VA 22161

14. NO. OF PRINTED PAGES: 49

15. Price:

USCOMM-DC 6043-P80
