
tion to organizational and management needs are as likely to lead to failure as lack of attention to technical details. This paper presents a methodological framework which is intended to assist those who are establishing a CPM capability. Techniques are suggested to enable the individual to assess the needs of his or her own organization, and to design a performance management system to meet those needs.

Naval laboratories' quality of service standards, J. S. Dodds, SP500-52, pp. 79-86 (Oct. 1979).

Key words: availability standards; batch processing; calibration programs; computer standards; interactive processing; quality of service; response time; turnaround time.

This paper summarizes the objectives, history and status of user quality of service standards that are being developed by the Naval Laboratories for their general purpose computer centers. The problem includes social, economic and technical challenges. Standards must be relevant to the users, non-competitive with other evaluation processes, economically applied and consistent among the different computer architectures installed at each Laboratory. Each standard currently being used will be presented along with the rationale for the standard.

Computer system migration planning through benchmark performance evaluation, A. Mukherjee, A. K. Jain, and B. A. Ketchledge, SP500-52, pp. 89-104 (Oct. 1979).

Key words: benchmark performance data; empirical models; IBM 168-3; IBM 3033; migration guidelines; migration planning.

This paper presents the development of a program which provides guidelines for migration from an IBM 168-3 to an IBM 3033. This program, based on the Bell System 3033 benchmark performance data, consists of analytical and empirical models. The benchmark consisted of several real and synthetic job streams, which were run on the 168-3 and the 3033 under an MVS Operating System. The migration guidelines are in terms of (i) key 3033 system performance measures, (ii) gross configuration tuning information, and (iii) execution times for batch job steps. Furthermore, a component of this program can be used as a capacity planning aid for an existing 3033 system.

An optimal sample size allocation scheme for benchmark design, S. K. Tripathi, K. D. Gordon, and A. K. Agrawala, SP500-52, pp. 105-111 (Oct. 1979).

Key words: benchmarking; clustering; performance evaluation; stratified sampling.

A major problem in benchmark design is the selection of the jobs to compose the benchmark. In this paper, stratified sampling is applied to the problem. The strata in the job population are identified by clustering the jobs on the basis of job features. The clusters are then viewed as strata. Within a stratum, jobs are selected by simple random sampling. The question of how many jobs to select from each stratum is addressed in this paper. An extension of Neyman's result on the optimal allocation of sample points among strata to the multidimensional case is used. The technique is then applied to a real workload, and the results are examined.
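
To make the allocation step concrete, the following sketch (not taken from the paper; cluster sizes, standard deviations, and names are illustrative) applies the classic one-dimensional form of Neyman's optimal allocation to job clusters treated as strata; the paper's contribution is the extension of this result to several job features at once.

    # Minimal sketch of Neyman's optimal allocation across strata (job clusters).
    # Assumptions: cluster h contains N_h jobs with within-cluster standard
    # deviation s_h for the resource measure of interest; all values are invented.

    def neyman_allocation(cluster_sizes, cluster_stddevs, total_sample_size):
        """Number of benchmark jobs to draw from each cluster.

        Each n_h is proportional to N_h * s_h (the one-dimensional Neyman rule).
        """
        weights = [n * s for n, s in zip(cluster_sizes, cluster_stddevs)]
        total = sum(weights)
        return [round(total_sample_size * w / total) for w in weights]

    # Example: three clusters of 500, 120, and 30 jobs with differing variability.
    print(neyman_allocation([500, 120, 30], [2.0, 10.0, 25.0], 20))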

Computer workload forecasting, J. E. McNeece, SP500-52, pp. 113-120 (Oct. 1979).

Key words: computer workload; data requirements; documentation cycle; forecast; projections; sensitivity analysis.

Experience has shown that the successful initiation of computer workload forecasting is directly dependent upon the quality of the data furnished by users to the requesting organization. Inaccurate data almost inevitably causes delays in the documentation cycle. The purpose of this paper is to serve as a guide for performing the analysis required to forecast workload requirements. Application of the methodology suggested herein should significantly reduce the risk of inaccurate or misleading projections.

A simulation model of JES output processing, H. P. Artis, SP500-52, pp. 123-127 (Oct. 1979).

Key words: data collection; JES output; job entry subsystem; model design; model selection; remote work stations; simulation model.

The design and implementation of a discrete simulation model of the IBM Job Entry Subsystem (JES2 and JES3) output processing is presented. This model was developed for sizing printers at remote stations and at the central site.

Design for performance, M. J. Kirrene and M. G. Spiegel, SP500-52, pp. 129-140 (Oct. 1979).

Key words: audit; capacity planning; financial applications; long-range planning; management control; measurement; modeling; on-line system design; performance evaluation; performance management; prototyping; remote terminal emulation; system testing.

Expensive on-line systems are difficult to justify to top management—unless your competitors or contemporaries are using them to advantage.

In 1971, AVCO Financial Services (AFS), a subsidiary of the AVCO Corporation, set out to construct an on-line system for its consumer credit operation. Funding was approved in 1972 and the design project formally began in January 1973. The first pilot branch office was converted in November 1975, and the conversion of the last branch occurred in May 1977. This is a history of how major strategic performance decisions were successfully made for AVCO's on-line system. AVCO's process of strategic performance decision-making, "design for performance," encompasses present ideas about capacity planning and performance evaluation.

AFS employed a variation of the conventional approach to the system design methodology: a process of prototyping the system and its interface with the organization in every design phase to focus on the performance issues.

Quantitative methods in computer performance evaluation, A. K. Jain, SP500-52, pp. 143-145 (Oct. 1979).

The application of clustering techniques to computer performance modeling, T. C. Hartrum and J. W. Thompson, SP500-52, pp. 147-161 (Oct. 1979).

Key words: cluster analysis; computer modeling; computer performance; empirical models; modeling; performance modeling; workload characterization.

The performance of a given computer system is to a large extent dependent upon its workload. A fairly standard approach to workload modeling is to define the workload by the amount of computer resources each job consumes. If each resource is considered to be one element in an ordered set, then a job's workload requirement can be represented as an n-dimensional vector x = (x1, x2, ..., xn) where, for example, x1 = CPU time, x2 = central memory used, and so forth. By applying vector distance measurements the workload can be partitioned into clusters of "similar" jobs based on their nearness in n-space. This approach has been applied to describe workloads more accurately than by the aggregate resource usage of all jobs taken together. This paper investigates the possibility of extending the clustering technique to modeling and predicting computer performance. If one hypothesizes that performance is a function of resource consumption, then one should be able to determine a predictable range of performance for each cluster. This paper presents the results of the application of this technique to the workload characterization of a Cyber 74 computer and the subsequent investigation of the relationship between a job's turnaround time and its workload cluster.
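
As an illustration of the clustering step (a sketch under simple assumptions, not the authors' code), jobs can be represented by their resource vectors and grouped by Euclidean distance in n-space with a basic k-means pass; per-cluster turnaround statistics can then be tabulated for each resulting group.

    # Illustrative clustering of jobs by resource-consumption vectors
    # x = (x1, ..., xn), e.g. (CPU time, central memory), using Euclidean
    # distance; a plain k-means loop with invented example data.
    import math
    import random

    def kmeans(jobs, k, iterations=20):
        """jobs: list of n-dimensional resource tuples; returns a cluster label per job."""
        centroids = [list(c) for c in random.sample(jobs, k)]
        labels = [0] * len(jobs)
        for _ in range(iterations):
            # Assign each job to the nearest centroid in n-space.
            labels = [min(range(k), key=lambda c: math.dist(job, centroids[c]))
                      for job in jobs]
            # Recompute each centroid as the mean of its member jobs.
            for c in range(k):
                members = [job for job, lab in zip(jobs, labels) if lab == c]
                if members:
                    centroids[c] = [sum(dim) / len(members) for dim in zip(*members)]
        return labels

    # Example: (CPU seconds, memory kilowords) for a handful of jobs.
    jobs = [(1.2, 40), (0.9, 35), (30.0, 200), (28.5, 190), (5.0, 90)]
    print(kmeans(jobs, k=2))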

Performance comparison measures for computer systems, I. Dzelzgalvis, SP500-52, pp. 163-176 (Oct. 1979).

Key words: comparison measures; evaluation process; performance measures; performance ratings; response time; system design tradeoffs; thruput measures.

The need for performance ratings of computer systems is as fundamental as the measures of horsepower, calories, watts, etc. Yet computer system performance measures of sufficient quality to quantify and reasonably evaluate the differences between systems of differing architectures, design, and operating systems are hard to find. That does not mean that comparisons are not made. In fact, comparisons are made quite frequently; however, the quality and value of the results are at best questionable. It will be shown that the classical performance measures of cycle time and MIPS never were accurate measures and that, with today's added complexity and clever system design tradeoffs, these measures can be downright distortions and/or inversions of the facts. The measures of thruput and response time/turnaround time are the real candidates for computer system performance measurement; however, their proper evaluation can be a tedious and costly undertaking. In addition, due to the complexity of the evaluation process, these measures are prone to errors and biases which can quickly destroy their quality as proper measures to be used in a comparison. This paper focuses on the evaluation process of the response time and thruput measures, identifies the potential pitfalls, suggests some useful approaches to their proper evaluation, and identifies key problems yet to be resolved.

Event driven capacity planning, S. W. Cox, SP500-52, pp. 179-192 (Oct. 1979).

Key words: capacity planning; hardware monitors; modeling; performance evaluation; performance prediction; simulation; validation; workload characterization; workload management.

Accurate performance prediction for capacity planning has historically been hindered by inadequate workload descriptions. Present techniques often rely on accounting information without considering its sufficiency as a basis for performance prediction. Further, the performance reports generated for capacity planning often force the analyst to take sizable intuitive steps in reaching his conclusions. As a result, planning decisions must be based on predictions of unknown accuracy and sensitivity to error.

A performance prediction system for capacity planning is under development. The important events of system/workload interaction are recorded and used to drive an efficient hardware/software simulation model. After each workload has been traced, the model can predict performance under a wide variety of loads, mixes, and configurations. For a small but diverse domain of demand paging environments, accurate response time predictions have been achieved.
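
The flavor of trace-driven prediction can be sketched as follows (an illustrative toy, not the authors' model): recorded events, here reduced to arrival times and service demands, are replayed through a simple server model whose speed or configuration is then varied.

    # Toy trace-driven simulation: replay recorded (arrival_time, service_seconds)
    # events through a single server to predict response times under a
    # hypothetical configuration change (a uniform "speedup" factor here).

    def replay_trace(events, speedup=1.0):
        """Return the predicted response time of each traced event."""
        server_free_at = 0.0
        responses = []
        for arrival, service in sorted(events):
            start = max(arrival, server_free_at)      # wait if the server is busy
            finish = start + service / speedup        # scaled service demand
            server_free_at = finish
            responses.append(finish - arrival)
        return responses

    trace = [(0.0, 2.0), (1.0, 1.5), (1.2, 0.5), (4.0, 3.0)]
    print(replay_trace(trace, speedup=1.5))   # predicted responses on a faster server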

A FORTRAN synthetic program for benchmarking, P. M. Fleming and A. C. Rucks, SP500-52, pp. 193-199 (Oct 1979).

Key words: benchmarking; performance evaluation; synthetic program; workload mapping.

Benchmarking is a generally accepted and essential element in the competitive procurement of computer systems. A benchmark workload consists of a set of application programs, synthetic programs, or a combination of these designed to be representative of expected system workload. A benchmark workload developed from application programs is unacceptable because it (1) is potentially biased in favor of the incumbent vendor, (2) may not be truly representative, and (3) may contain data that is subject to privacy and security restrictions. A benchmark workload constructed from synthetic programs does not suffer from these limitations. The basis for employing synthetic benchmarks is well established; however, previous synthetic programs have failed to provide a means to test system capacity through the execution of a reasonable variety of programming functions. The FORTRAN Synthetic overcomes this disadvantage of previous synthetics by providing a set of programming functions which test a wide range of system capabilities. The Synthetic is modular in structure to provide representativeness and control of instruction mixes; parameter driven to provide control of processing time; and provides a means of controlling memory usage. The structure of the FORTRAN Synthetic and the process of workload mapping are presented.
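
A rough analogue of such a parameter-driven synthetic module (written here in Python purely for illustration; the program described in the paper is in FORTRAN, and all parameter names are invented) might expose one knob for processing time and another for memory usage.

    # Illustrative parameter-driven synthetic kernel: compute_reps scales the
    # processing time, memory_words scales the working set that is touched.

    def synthetic_kernel(compute_reps, memory_words):
        """One tunable synthetic job step: arithmetic passes over a working set."""
        working_set = [float(i) for i in range(memory_words)]   # memory demand
        accumulator = 0.0
        for _ in range(compute_reps):                           # CPU demand
            for value in working_set:
                accumulator += value * 1.000001
        return accumulator

    # A benchmark stream would call such modules with parameter mixes chosen
    # to map the installation's measured workload onto the synthetic jobs.
    synthetic_kernel(compute_reps=100, memory_words=10_000)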

The NBS Network Measurement Instrument, M. D. Abrams and D. C. Neiman, SP500-52, pp. 201-211 (Oct. 1979).

Key words: computer; computer communications; computer performance measurement; data measurement; measurement; performance measurement.

The NBS Network Measurement Instrument (NMI) represents the third-generation implementation of an approach to the measurement of interactive computer networks, teleprocessing systems, and network services which focuses on the service delivered to users rather than on the internal operating efficiency of the system. The information obtained aids users in the quantitative evaluation of such systems and services. The performance measures and measurement conditions are described. The applicability of the stimulus-acknowledgement-response model to interactive asynchronous, character-synchronous (e.g., bisync), and bit-synchronous (e.g., SDLC and ADCCP) communication is discussed. The NMI is presented in terms of its functions, statistical capabilities, architecture, and communications protocol state transitions.
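
As a rough illustration of service-oriented measurement (the event names and record layout below are assumptions for the sketch, not the NMI's actual formats), each interaction reduces to timestamping the stimulus, acknowledgement, and response and reporting their differences.

    # Sketch of stimulus-acknowledgement-response timing; field names are
    # illustrative assumptions, not the NMI's actual record layout.
    from dataclasses import dataclass

    @dataclass
    class Interaction:
        stimulus_time: float         # user finishes entering the request
        acknowledgement_time: float  # first character of output returned
        response_time: float         # last character of output returned

    def service_measures(event):
        """Return (time to acknowledgement, time to full response) in seconds."""
        return (event.acknowledgement_time - event.stimulus_time,
                event.response_time - event.stimulus_time)

    print(service_measures(Interaction(0.00, 0.35, 2.10)))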

Performance analysis of a saturated system-A case study, N. Lennon and W. P. Bond, Jr., SP500-52, pp. 215-218 (Oct. 1979).

Key words: inventory management; on-line; performance; response time; saturated system; transaction processor.

This paper describes some recent experiences that the authors have had in attempting to gather decision-making data from an on-line system which was totally saturated, and for which minimal performance measurement tools were provided. The case study presented illustrates the limited use of data resulting from post facto measurement techniques. An attempt has been made to illustrate the type and quality of results which can be expected under these circumstances.

Teleprocessing transaction thruput performance, B. Irwin, SP500-52, pp. 219-226 (Oct. 1979).

Key words: input lockout; mathematical modeling; queuing models; race conditions; TCAM data flow; teleprocessing.

Financial industries are dependent on Teleprocessing systems to execute their daily transactions. Unacceptable Teleprocessing performance can, and does, impede the expected flow of line of business transactions, thus degrading the anticipated level of work units for that line of business. This paper is an analysis of TCAM's tuning parameter: BUFFER DELAY VALUE. The analysis develops a mathematical characterization of TCAM's BUFFER DELAY VALUE which describes the race conditions inherent in TCAM logic. Such race conditions potentially degrade Teleprocessing transaction throughput. The analysis resulted in a TCAM modification to search a table of BUFFER DELAY VALUES, where each value corresponds to a number of stations currently active on the line. The instantaneous optimal DELAY VALUE generated by this table acts to optimize line utilization and prevents station input transaction line lockouts.
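
The table-lookup idea described above can be sketched as follows (the thresholds and delay values are invented for illustration and do not come from the paper): the modification replaces a single fixed BUFFER DELAY VALUE with a value chosen by the number of stations currently active on the line.

    # Hypothetical delay-value table keyed by the number of active stations;
    # all numbers are illustrative, not taken from the TCAM modification.
    DELAY_TABLE = {1: 10, 2: 7, 3: 5, 4: 3}

    def current_delay_value(active_stations, floor_delay=2):
        """Pick the delay for the current station count; beyond the table, use the floor."""
        return DELAY_TABLE.get(active_stations, floor_delay)

    print(current_delay_value(3))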

Methodology for performance evaluation and capacity planning, A. O. Allen, SP500-52, p. 227 (Oct. 1979).

Key words: analytic queueing theory models; capacity planning; performance evaluation.

In this tutorial we summarize a methodology taught at the Los Angeles IBM Systems Science Institute in a class called Performance Evaluation and Capacity Planning.

Benchmarking with remote terminal emulation, T. F. Wyrick and R. E. Youstra, SP500-52, pp. 229-230 (Oct. 1979).

Planning and implementing remote teleprocessing services: Management perspectives of the TSP, R. L. DeMichiell and G. L. Underwood, SP500-52, pp. 231-232 (Oct. 1979).

Key words: competitive negotiated procurement; computer management; remote teleprocessing; statement of work; teleprocessing services program; timesharing.

This tutorial addresses a competitive negotiated procurement of remote timeshare services under the Teleprocessing Services Program (TSP). The guidelines provided by the Department of Transportation and the General Services Administration governing the U.S. Coast Guard have resulted in the implementation of timeshare services for the U.S. Coast Guard Academy. A final comprehensive report culminated a rather extensive effort.

It is the intention here to discuss many aspects of the procurement from the perspective of the personnel involved in the various phases of the process. The relevant issues will be examined and some reference literature on the general subject will complement the two main thrusts of the presentation: (1) to identify and clarify the procedures and options, and (2) to provide practical, locally-derived guidelines for future implementation of the program.

Although the guidelines are general in nature, they were derived from a detailed analysis of the events which recently were experienced by the authors. The completion of the negotiation for a five-year systems life procurement under TSP resulted in a savings in excess of half a million dollars.

Selection and evaluation of instructional time-sharing services-(A tutorial outline), R. T. Close and R. A. Kambeitz, SP500-52, pp. 233-234 (Oct. 1979).

Key words: benchmarking; cost analysis; evaluation; procurement; technical analysis; time-sharing; TSP.

The U.S. Coast Guard Academy has completed a multiyear procurement of instructional time-sharing services using the General Services Administration Teleprocessing Services Program (TSP). This fully competitive procurement included extensive technical and cost evaluations and adhered to Department of Transportation and Coast Guard directives for data processing contracts. The technical evaluation included an on-site operational capability demonstration with benchmarking. The final ranking of vendors used a point scoring system which included both the technical and cost analyses.

Tutorial on benchmark construction, H. Letmanyi, SP500-52, pp. 235-240 (Oct. 1979).

Key words: benchmark construction process; benchmark validation; competitive evaluation; tutorial; vendor systems; workload requirements.

This tutorial will provide participants with a detailed overview of the benchmark construction process: the steps involved and the tools that can be employed to construct a benchmark. This tutorial is recommended for those who have an interest in constructing benchmarks for use in the competitive evaluation of vendor systems.

A brief review of the ADP system evaluation and selection process within the Federal Government will first be given to identify how the benchmark construction process fits into the total selection process. Next, the tutorial will discuss in a step-by-step fashion those tools and techniques which can be used to analyze existing workloads, project future workloads, and represent workload requirements via benchmarks. The importance of having definite objectives and goals prior to constructing a benchmark, as well as the need for benchmark validation and documentation will also be discussed. An outline of the topics to be covered in the tutorial follows.

Using accounting log data in performance reporting, J. P. Bouhana, SP500-52, pp. 241-243 (Oct. 1979).

Key words: accounting logs; performance evaluation; workload characterization.

This tutorial summary outlines several topics pertinent to using a computer system's accounting log file as a basis for performance reporting. Some major topics are a log's organization and contents, the types of reports and displays which can be generated, and the problems encountered in using log data. It is concluded that although accounting logs are, understandably, oriented principally toward accounting, they can be effectively used for performance reporting and for operations reporting as well.

SP500-53. Computer science & technology: Technology assessment: ADP installation performance measurement and reporting, C. B. Wilson, Nat. Bur. Stand. (U.S.), Spec. Publ. 500-53, 37 pages (Sept. 1979) SN003-003-02123-7.

Key words: computer performance evaluation (CPE); computer performance management (CPM); installation management; installation performance management; performance measurement and reporting; resource management; standard performance measures.

This report compares the current status of ADP installation performance measurement and reporting in the Federal ADP community with the best practices as found in the Federal and private sectors and described in the literature. The comparison reveals that more effort could be expended by Federal sites in the area of computer performance management. The principal obstacles to more and better performance programs are perceived to be the lack of needed measures on many systems and the magnitude of the effort involved in accessing and analyzing the measures which are available. The report discusses several underlying causes for these obstacles and makes three recommendations which could partially relieve the situation: (1) development of standard performance measures, (2) development of a Government-wide data base for normative performance ranges, and (3) development of statistical computer performance evaluation techniques.

SP500-54. Computer science & technology: A key notarization system for computer networks, M. E. Smid, Nat. Bur. Stand. (U.S.), Spec. Publ. 500-54, 35 pages (Oct. 1979) SN003-003-02130-0.

Key words: cryptography; digital signatures; encryption; identifiers; key management; key notarization.

A cryptographic key notarization system is proposed for computer networks to protect personal (nonshared) files, to communicate securely both on and off-line with local and remote users, to protect against key substitution, to authenticate system users, to authenticate data, and to provide a digital signature capability using a nonpublic key encryption algorithm. The system is implemented by the addition of key notarization facilities which give users the capability of exercising a set of commands for key management as well as for data encryption functions. Key notarization facilities perform notarization which, upon encryption, seals a key or password with the identities of the transmitter and intended receiver.

SP500-55. Computer science & technology: Selection of data entry equipment, S. A. Recicar, Nat. Bur. Stand. (U.S.), Spec. Publ. 500-55, 77 pages (Nov. 1979) SN003-003-02133-4.

Key words: application; character set; computer interface; cost; data entry; edit; operator speed; record size; transaction volume; transfer speed; validate; verify.

This publication provides information to be used by Federal organizations in the selection of data entry equipment. It serves as a supplement to the Federal Information Processing Standards Publication (FIPS PUB) 67, "Guideline for Selection of Data Entry Equipment." The objective is to make available information that could lead to the selection of more efficient and economical data entry systems. This report provides information about economic and general operational considerations, steps to be followed in acquisition and training, and other factors pertinent to data entry equipment selection. Equipment profiles for the different data entry methods are also provided.

SP519. Trace organic analysis: A new frontier in analytical chemistry. Proceedings of the 9th Materials Research Symposium held at the National Bureau of Standards, Gaithersburg, MD, Apr. 10-13, 1978, H. S. Hertz and S. N. Chesler, Eds., Nat. Bur. Stand. (U.S.), Spec. Publ. 519, 788 pages (Apr. 1979) SN003-003-02054-1.

Key words: drug analysis; food toxicants; hormones; neurotransmitters; nutrients; organic pollutants; trace organic analysis.

Researchers in diverse areas must currently perform critical analyses on minute quantities of organic compounds in various matrices. It was the aim of this Symposium to bring together these scientists to discuss their common problems and to explore current and impending technology for organic analyses. Emphasis was placed on the total analysis, from collecting the sample through interpreting the results, rather than upon the measurement only.

The Proceedings consist of a series of invited papers by experts as well as particularly appropriate contributed papers. Topics covered in the Proceedings are as follows: Sampling and Sample Handling for Trace Organic Analysis, State-of-the-Art Analytical Systems, Analytical Techniques on the Horizon, Analysis of Nutrients, Analysis of Organic Pollutants and Their Metabolites in the Ecosystem, Analysis of Drugs in Body Fluid, Analysis of Food Toxicants, and Analysis of Hormones and Neurotransmitters. These proceedings include the following papers (indented):

Statistical sampling and environmental trace organic analysis, H. H. Ku, SP519, pp. 1-6 (Apr. 1979).

Key words: environmental measurements; modeling; sampling schemes; statistical sampling.

In the field of analytical chemistry, statistical sampling traditionally did not play a prominent role. Samples were usually drawn, or composited, from fairly homogeneous material and the characteristic of interest determined. The results were averaged and claimed to represent the value of the property desired. This procedure has been used in manufacturing and industrial processing for some time, and has been shown to be satisfactory for the purpose. For example, in determining percent carbon in steel, only one preliminary grab sample from a melt is taken and analyzed to give the carbon content representing the whole 160 tons.

Once the chemists ventured from manufactured goods with controlled composition to natural products, the variability of the properties among samples began to pose a problem. In determining the sucrose content of a shipload of raw sugar, samples were taken systematically every 300 tons while unloading. The average of the 90 or so samples in a shipload is considered to be "the sucrose content" of the whole shipload by definition. Where the buyer and seller can agree on a specified procedure, the purpose is served.

In environmental measurements, or in analysis of low level contaminants, time and space added dimensions to the statistical sampling problem. Coupled with difficulties in extraction and measurements in trace organic analysis, the problem is indeed formidable. Any knowledge as to sources of variability and pattern of variability, however, would be helpful in dealing with the sampling problems.

It is suggested that the design of a proper statistical sampling scheme depends almost entirely on the purpose for which the results are going to be used. Hence, without an explicit and defined purpose for an undertaking, the design of the sampling scheme cannot be formulated for efficient data collection and for the correct interpretation of results.

Sample preparation for environmental trace organic analysis, F. C. McElroy, T. D. Searl, and R. A. Brown, SP519, pp. 7-18 (Apr. 1979).

Key words: gas chromatography; gas chromatography/mass spectrometry; preconcentration techniques; sample preparation; ultraviolet spectrometry.

This paper covers the handling of markedly different sample types, namely gas (ambient air, stack gas), water, and solid wastes. Emphasis will focus upon the sample preparation for the measurement of EPA's priority pollutants exclusive of pesticides and some specialty byproducts. For gas samples, toxic compounds are present in the gaseous and particulate phases. Volatile components are concentrated by passing the gas through solvent, carbon, or porous polymer resins. They are quantitatively dissolved and introduced into the instrument. After removal of particulates by filtration or cyclonic action, the nonvolatile toxics are measured by GC/MS, GC with and without specific detectors, or HPLC.

In the analysis of water, volatile compounds of low solubility are removed by nitrogen sparging, trapped and analyzed as mentioned above. Nonvolatiles such as polynuclear aromatic hydrocarbons, PCB's, or phenols are adsorbed on porous polymer resins or solvent extracted. The resulting organic concentrate is analyzed in the same manner as particulates.

Solid wastes are simpler in that the toxics are already present in a concentrated form. Volatiles are determined by simply heating the sample in the instrument. Nonvolatile organics are dissolved by solvent in Soxhlet or ultrasonic extractors. Following extensive cleanup, the pollutants are determined.

It is convenient and desirable to employ internal standards for quantitation purposes. These may consist of 1) a representative organic known to be absent in the sample to be analyzed, or 2) a compound specially labeled with a halogen, deuterium, or ¹³C atom.

Analysis of water for chlorinated hydrocarbon pesticides and PCB's by membrane filters, D. A. Kurtz, SP519, pp. 19-32 (Apr. 1979).

Key words: cellulose triacetate filters; gas chromatography; pesticide residues; sampling.

Pesticide residues in water have been analyzed through the use of cellulose triacetate membrane filters. Using this chemisorption separation method, DDT analogs, mirex, aldrin, and PCB mixtures have been separated from water by adsorption onto these filters. Recovery of these compounds has been achieved through elution with diethyl ether. Filters were cleaned before use with diethyl ether.

Adsorption of p,p'-DDT, p,p'-DDE, and Aroclor mixtures 1242 and 1254 was found to be 98-99% complete. Maximum loading studied at this time has been 385 ng/cm² for DDT analogs and 1650 ng/cm² for Aroclor-1254.

Recovery of these compounds from membrane filters ranged from 69 to 113%. p,p'-DDT analogs (p,p'-DDT, p,p'-TDE, and p,p'-DDE) at charge levels of 60, 300, and 1500 ng were recovered at 68 to 89%. Mirex at 300 and 7500 ng charge was recovered at 97-113%. Aroclor mixture 1242 at charge levels of 400, 2000, and 10,000 ng and Aroclor mixture 1254 at similar levels were recovered at 89-112% and 83-84%, respectively.

The recovery of DDT analogs and mirex was found to be similar from water at pH 2 and pH 7, but at pH 12 it dropped to almost zero.

The importance of this new chemisorption separation method lies in its convenience of operation. Separations can be done in the field and the filters stored easily. A high degree of concentration can be achieved.

Application of liquid and gas chromatographic techniques to a study of the persistence of petroleum in marine sediments, R. G. Riley and R. M. Bean, SP519, pp. 33-40 (Apr. 1979).

Key words: gas chromatography-mass spectrometry; glass capillary chromatography; high pressure liquid chromatography; hydrocarbon persistence; infrared spectroscopy; petroleum hydrocarbons; sediment analysis.

A technique employing high-pressure liquid chromatography (HPLC) and glass capillary chromatography has been used to monitor the long-term persistence of petroleum in marine sediments. Petroleum in sediments was studied using Prudhoe Bay crude (PBC) as the host oil, as part of a combined chemistry/biology study of the impact of oil on marine ecosystems.

Total monoaromatic-diaromatic hydrocarbon concentrations in extracts of oil polluted sediments were analyzed by HPLC on series coupled columns containing Durapak oxypropionitrile on Porasil C. Chromatographic separation of the extracts produced a saturated fraction and a monoaromatic-diaromatic fraction which was approximately 80% resolved as determined by the separation of model compounds. The quantitative analysis of the summation of the monoaromatic-diaromatic components was performed with a calibration curve prepared from a PBC aromatics fraction isolated by an ASTM method. Infrared analysis of carbon tetrachloride extracts of the same sediments provided total oil concentrations as complementary information to the HPLC analysis. More detailed compositional information on individual saturate and aromatic hydrocarbons in oil contaminated sediments was provided by glass capillary chromatography.

The above methods were applied to a study of the persistence of PBC in two types of sediments that were exposed to conditions which simulated possible environmental pollution situations. In a laboratory experiment, a fine-textured sediment containing PBC oil was monitored periodically over a period exceeding 1 year after deposition in a continuous-flow seawater bioassay apparatus. Similarly, the fate of PBC in a field experiment involving a coarse beach sediment located in an intertidal zone was studied. The persistence of PBC in both sediment systems is discussed in terms of total oil concentration, concentrations of hydrocarbons and hydrocarbon classes, and some of the physical, chemical, and biological parameters which affect the degree of persistence.

The use of a fluorescence detector in high performance liquid chromatographic routine analysis of polycyclic aromatic hydrocarbons in environmental pollution and occupational health studies, B. S. Das and G. H. Thomas, SP519, pp. 41-56 (Apr. 1979).

Key words: environmental pollution; fluorescence; high performance liquid chromatography; occupational health; picogram; polycyclic aromatic hydrocarbons; selectivity; sensitivity; specificity; trace analysis.

A routine method for trace analysis of nine major polycyclic aromatic hydrocarbons (PAH) by high performance liquid chromatography (HPLC) with fluorescence detection is described. The fluorimetric detection involves a deuterium light source and excitation wavelengths below 300 nm. Careful selection of excitation and emission wavelengths gives a high degree of selectivity and specificity in fluorescence detection and permits complete analysis of individual PAH in a multi-component mixture. The extremely high sensitivity of fluorescence detection has reduced the minimum detectable concentration of the PAH to close to subpicogram levels, e.g., benzo(a)anthracene 0.19 pg, benzo(k)fluoranthene 0.11 pg, benzo(a)pyrene 0.34 pg. The sensitivity is found to be strongly influenced by the amount of water present in the PAH solution to be analyzed. The HPLC-fluorescence system allows the use of dilute solutions, thus eliminating the usual clean-up procedures associated with trace analysis. The application of the method to the analysis of PAH in environmental, process, and occupational health samples is discussed.

Study of organometal speciation in water samples using liquid chromatography with electrochemical detection, W. A. MacCrehan, R. A. Durst, and J. M. Bellama, SP519, pp. 57-63 (Apr. 1979).

Key words: electrochemical detection; liquid chromatography; methylmercury; organometals; water preconcentration.
