Trends in Sample Preparation and Automation -
What the Experts are Saying

By Ronald E. Majors
LC-GC's "Sample Preparation Perspectives" column editor

A special report, based on this paper, was published in LC-GC (13[9], 742-749 [1995]).


Sample Preparation (SP) is still one of the more time-consuming, error-prone aspects of analytical chemistry (LC-GC 9[1], 16-20 [1991]). Many of the SP techniques in current use have been around for decades with little or no improvement over the years, and new SP technologies have been slow to gain acceptance despite proven advantages over some of the older approaches. Despite many strong driving forces such as increased sample loads, a shrinking skilled labor force, worker safety and reduced exposure to chemical hazards, the need for enhanced productivity, demands for better quality data under growing regulatory constraints, and the greater need for information management, automation of SP and integration of information management into the analytical process have been accepted with reluctance. Nevertheless, new SP techniques are on the horizon and are being introduced with vigor.

This article is drawn from a recent survey of leading experts in SP and laboratory automation. We asked the 14 experts for their opinions on some of these issues and for predictions of what we might see happening in the next several years. An international cross section of noted SP technologists was chosen: experts from the U.S. and Europe; from equipment manufacturers, academia, government, and industry; and from the pharmaceutical, chemical, agrichemical, food, and environmental segments (see box on this page). A questionnaire with a series of open-ended questions was sent to each of them, to which they were asked to respond. The results of their responses are presented here.

1. SP has long been the "bottleneck" in the analytical process, yet only in the past 5 to 10 years has its automation been actively addressed, while automated chromatographic and spectroscopic instrumentation began to make marked advances a decade before. Why do you think this is so?

There were a variety of responses, but in general the experts agreed that SP has always been viewed as a manual technology, often regarded as "low tech," and usually assigned to the least trained staff. "The addition of instrumentation to a traditionally noninstrumental area required chemists to convert to new techniques, and chemists are somewhat reluctant to quickly accept new technologies," noted Brian Holden of Hewlett-Packard. Another factor that impeded its progress was that SP was not very attractive from an academic viewpoint and not really recognized as a field of analytical chemistry; thus many universities shied away from the study of SP. In industry, the chemist's role was thought to be more "high tech": to support the use of the analytical instruments as well as analyte detection. Chemists spent most of their time evaluating and implementing new analytical technologies and generating and interpreting data. In addition, analytical techniques are more generic, while there are many ways to do SP, many of which are sample-specific. There were not many new concepts for the automation of SP, and early attempts tended to mimic manual operations. These approaches were not successful and were expensive, since many of the manual procedures were not amenable to automation. "Also, scientists wanted flexibility but didn't know how to get it," claimed Frank Zenie of Zymark.

Steve Hamilton of Amgen noted "that there was early laboratory automation of SP with the Technicon Autoanalyzer (or flow-segmented technology), but it was mainly a flow injection analysis system which never really crossed over into chromatography." Many of the major instrument companies did not identify SP automation as important, and it was mainly smaller companies who became niche players and provided the needed instrumentation. For this reason, most of the early instruments were stand-alone SP modules and were not integrated with the analytical instrument as part of the analytical process. Even today, there are still only a few automated instruments for SP produced by the major instrument companies. "Most users do not have the time or the resources to manufacture systems," notes Jeffrey Hurst of Hershey Foods, "and must rely upon commercially available and supportable instruments."

Several of the experts felt that the automation of SP was really a progression; once the other analytical techniques were automated, SP became the "bottleneck." Tony Beugelsdijk of Los Alamos and Hamilton felt that the automation of instrumentation was "a logical extension of the use of microprocessor-based controllers and microcomputers as part of the other automated instrumentation." They and others felt that robotics (flexible automation) was a revolutionary step in the automation process but needed a different infrastructure - scientists, engineers, and software experts working together - to support its implementation. "Flexible automation is truly difficult and demands a great deal of technology," noted Frank Zenie of Zymark, who has had 15 years of experience in the automation business. He added, "laboratory robotics and our laboratory unit operation (LUO) concept provided the first insight into flexible lab automation." Beugelsdijk also felt that another factor in the low acceptance of automation was that "management did not have the strategic vision of the information-generating role of the laboratory and its competitive value." Once chemists felt the pressure to improve productivity, as with high-throughput drug candidate screening in the pharmaceutical industry, along with the increasing cost of labor and a decrease in the skilled work force, automation of SP became more important.

2. Laboratory workers have used SP techniques such as liquid-liquid extraction and Soxhlet extraction for many years. Why don't more of them convert to some of the newer replacement technologies such as supercritical fluid, microwave, or solid-phase extraction, which seem to be faster, safer, and at least as efficient?

As with most new technologies, the innovators first evaluate the technique. Most of the experts felt that until a technology is proven and its competitive advantages demonstrated, it may take years for widespread acceptance. Mary Ellen McNally of Du Pont thought that 10 years or so is normal for a new technology to become "mainstream." The time it takes to validate new methods to prove equivalency also limits the acceptance of new technology. The regulatory agencies, especially in the United States, are very slow to accept new technologies. Beugelsdijk feels that this slow acceptance is due to the fact that most regulated methods are "prescription rather than performance-based" and would like to see this change. Another factor is the inertia of most workers. Workers are reluctant to change (i.e., fear of the unknown) and feel that it takes a great deal of time to gain the experience and know-how, "especially when many of the newer technologies are based on different chemical and physical principles," added Beugelsdijk. Many of the new SP technologies do not fit into the laboratory work flow. SP often takes place in another laboratory away from the analytical laboratory. When SP is coupled to or closely aligned with the analysis, it must move closer to the instrument, which may interrupt the "normal" way samples are handled.

Another factor is the lack of training on the newer technologies. In the absence of "cookbooks" or practical guidelines to their application, training courses in the universities, through the instrument companies, and elsewhere are required. Many times these courses just are not available, states Karl-Siegfried Boos of the University of Munich.

"The matrix dependence of the newer methods have to be studied and guidelines established before they become more common" says Gary Kramer of the National Institute of Standards and Technology. He elaborated that "traditional methods have had decades to d evelop their 'universality' so workers feel comfortable with these approaches."

The experts believe that another factor in the acceptance of the newer technologies is the financial payback. With the exception of solid-phase extraction (SPE), states Kramer, most of the newer techniques require expensive instrumentation. For example, a typical automated supercritical fluid extractor may cost in the neighborhood of $50,000. For many laboratories with restricted capital budgets, this can be a real issue. Often the newer technology is replacing a less capital-intensive but more labor-intensive method. For example, eight or ten Soxhlet extractors can be set up in a hood for an investment of a thousand dollars or so. Manufacturers of instrumentation should be able to provide real justification of payback based on actual laboratory operation.
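
To make the payback argument concrete, the short sketch below works out a break-even time for replacing a hood full of Soxhlet extractors with an automated extractor. The instrument prices echo the figures quoted above, but the labor savings, labor rate, and sample loads are hypothetical placeholders chosen only to illustrate the arithmetic a manufacturer's justification worksheet might contain.

    # Minimal payback sketch: automated extractor vs. manual Soxhlet setup.
    # Instrument prices follow the figures quoted in the text; the labor
    # savings, loaded labor rate, and sample loads are hypothetical.

    def payback_months(extra_capital, hours_saved_per_sample,
                       loaded_labor_rate, samples_per_month):
        """Months needed for labor savings to recover the added capital cost."""
        monthly_savings = hours_saved_per_sample * loaded_labor_rate * samples_per_month
        return extra_capital / monthly_savings

    extra_capital = 50_000 - 1_000    # automated extractor vs. Soxhlet glassware
    hours_saved = 1.5                 # hypothetical analyst hours saved per sample
    labor_rate = 60.0                 # hypothetical fully loaded cost, $/hour
    for load in (50, 200, 500):       # samples per month
        months = payback_months(extra_capital, hours_saved, labor_rate, load)
        print(f"{load:>3} samples/month -> payback in {months:.1f} months")

Varying the sample load in such a model shows why a high-throughput contract laboratory and a laboratory running a handful of samples a week reach very different conclusions from the same instrument price.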

The feeling of several of the experts was that implementation of many of the new SP technologies was not "on the mark." To aid these efforts, Lars-Erik Edholm of Astra Draco feels that "the instrument companies should collaborate with and work more closely with their customers, truly understanding their needs." In SP techniques, "the instrument companies spent more time trying to 'legally outmaneuver' each other in attempting to get exclusivity with their equipment written into regulatory methods rather than getting acceptance of the new technologies," says Skip Kingston of Duquesne University. He cites the example of microwave digestion methods for environmental analysis, which were published in 1990 but only promulgated with EPA Update II on January 13, 1995. A similar case can be made for supercritical fluid extraction for organics in soil samples. Kingston feels that the application of newer methods may be restricted by the instrument manufacturers and their inability to cooperate with one another. The experts identified lack of universal data communication standards, incompatible hardware interfaces from one manufacturer to another, and consumables which are instrument-specific as examples of this lack of cooperation among manufacturers.

3. The recent introduction of SPE disks has created a rivalry between this format and the traditional cartridge design. Which approach will win out in the end and why? Or will they coexist, each having its place in SP?

A majority of the experts felt that the cartridges and disks will coexist, each seeking its optimum area of application. The disks would be favored for cases where trace organics in a large volume of sample, such as polynuclear aromatic hydrocarbons in water, require concentration and where high flow rates could be used to cut sampling time. Some felt that the presence of fines in the sample might shift the advantage to the disks, which offer a higher cross-sectional area (and therefore less clogging) compared to the cartridges. Experts agreed that there would probably be less channeling in disks since sorbent particles are held firmly in place in the disk matrix. The use of vacuum to pull liquid through the bed would be an advantage for the disks due to their lower flow resistance. For example, Udo Brinkman of the Free University of Amsterdam felt that "the lower pressure requirements would make the disks more suited for field studies." Once the disks were returned from field sampling, individual sections could easily be segmented (cut) and handed out for different final analyses in the laboratory. The general feeling was that the cartridges would be favored when one wanted to quickly incorporate a new adsorbent or bonded phase, when a higher sample load was needed, or when the analyst needed a different bed mass with exactly the same surface chemistry. Several of the experts felt that, due to the advantages mentioned above, the disks might achieve greater acceptance if a sufficient number of chemistries become available, if they have similar capacity and selectivity, and if the disks become less expensive (currently not the case).

Mike Burke of the University of Arizona cautioned that "one should not think of the two formats as different technologies because they both involve passing a solution through an adsorbent for the purpose of isolation and purification as well as trace enrichment . . . they differ only in the manner in which one addresses the question of optimum bed length versus bed diameter." Zenie felt that the disk vs. cartridge controversy was a result of "strong promotion of the disks rather than a real difference in performance." He further states that "automated cartridge techniques can provide excellent results and substantially reduce costs compared to manual disks." As far as the automation aspects, other experts had varying opinions on which approach is easier to automate. Some felt that the disks were harder to integrate into automated systems, while others felt that the more standardized format of the disks will make automation easier. Kingston felt that since SPE is one of the easier SP methodologies to automate, several formats will be developed, and in a few years SPE will be a standard technique in most laboratories.

4. Solid phase extraction has been around for over 15 years yet only a handful of instruments has been developed to automate this technique. Why?

Several of the experts agreed that manual (off-line) SPE has been viewed as a simple, cheap SP technique, similar to "flash chromatography," requiring only glassware and a vacuum station. Thus, many chemists were not willing to spend a lot of money for a sophisticated instrument to automate this technique, especially when the benefits and throughput were not high enough to overcome the cost. Early automated instruments did not meet users' expectations. Cited were robotics, where SPE adaptations were poor; the AASP (Varian SP Products, Harbor City, California - formerly Analytichem), which automated only part of the process; and the Millilab (Waters Chromatography, Milford, Massachusetts), which was slow and used nonstandard cartridges. In fact, Hurst and Hamilton felt that "the lack of standardization of cartridges and instrument-specific proprietary formats were major reasons why automated SPE has lagged behind." Popular manual consumables such as the packed, medical-grade syringe barrels do not fit into many of the automated instruments. On the other hand, Edholm felt that current instruments meet his needs in the pharmaceutical laboratory, and Holden agreed that, given the size of the market, the number of automated SPE instruments was probably adequate. In fact, Hurst mentioned in his response that "many SPE users may have only a handful of samples a day to analyze and cannot really justify the automated equipment."

Others cited the lack of packing and column-to-column reproducibility that plagues both HPLC and SPE as a determining factor in limiting its automation. For automated overnight runs, a rugged, reproducible method is essential. Nevertheless, Linda Brunner of Ciba Geigy feels that "SPE is a 'perfect technique' to automate, that many of these early problems have been solved, and the technology has been accepted by many practitioners in the pharmaceutical laboratory." Burke agrees that SPE is actually "higher on the automation curve" than many of the older SP techniques such as liquid-liquid extraction, Soxhlet extraction, evaporation, etc. Some experts felt that regulatory agencies such as the EPA have also been slow in accepting SPE methods; thus, large volumes of samples such as might be generated in environmental testing labs were not being prepared using SPE techniques. On the other hand, pharmaceutical laboratories used large numbers of SPE cartridges, especially during animal studies and clinical trials, and many U.S. Pharmacopoeia methods specify SPE sample cleanup.

Kingston felt that the "lack of integration of SPE automated instruments with the analytical instrument has limited its acceptance." Automated SPE instruments have the image of "stand-alone instruments." Boos and Brinkman both feel that column switching, a form of automated SPE, is a more reliable and less expensive alternative. Direct injection methods, where SP is minimized or eliminated may be on the horizon for drug analysis laboratories like those of Edholm. He cited HPLC-immunoassay and HPLC-MS-MS a s alternative solutions.

5. Automated column switching is often used in Europe as an on-line sample cleanup-analysis technique yet in the United States it is practically nonexistent. Why?

The reasons cited by the experts were mixed, with several not having an opinion. Some of them felt that one of the main reasons that column switching has been more popular in Europe is that the Europeans are more "open minded" and "advanced" about new technology, while U.S. chemists are more conservative. "European scientists have always worked under the guidelines of 'doing more with less,'" states Linda Brunner, while Steve Hamilton felt that culturally the Europeans take a "lower tech," more practical approach to problem solving. Brinkman added that Europeans have always had a more independent attitude toward tackling difficult tasks. The level of training in column switching is higher in European universities, with several research groups having done pioneering research in the '70s and '80s. Edholm considered that "much of the European know-how came out of these efforts and many of the students went on to industry feeling comfortable with this approach" to SP-analysis. Boos felt that the lack of training in the U.S., few practical guidelines for setting up column switching devices, and the lack of commercially available units which were fully set up, proven, and robust all contributed to a lower level of application. The WAVS (Waters Automated Valving System), a commercial column switching apparatus, failed to get a large following in the '80s because "use of column switching required a thorough understanding of chemistry and chromatography and its failure was due to a lack of understanding of the technology by many bench chemists," cited Jeff Hurst.

Brinkman felt that the "regulated U.S. environment does not stimulate alternative approaches." Burke mentioned in his response that America is more of a "disposable society" and find it easier to rationalize using something just once. McNally added that " there may actually be more column switching practiced in the U.S. than realized but for proprietary reasons companies do not publish their results" since much of it is carried out in the pharmaceutical industry.

6. Automated SP can be accomplished using a batch or serial approach. In the batch mode, multiple samples are prepared and the prepared samples are then transferred to the analytical instrument. In the serial mode, samples are prepared one at a time and the SP device is connected on-line (integrated) with the analytical instrument. Under what conditions would you link SP automation to your analytical instrument as compared to automated sample prep in batches?

The general consensus among the experts was that when the SP time is equivalent to or less than the analysis time, a serial method is preferable. Fritz Erni of Sandoz does not see a clear separation of batch and serial modes in SP; in most cases in his labs, the two modes are mixed. In his laboratories, "usually time-consuming process steps are nested and processed batchwise or semibatchwise." For samples where the assays are time-dependent or when there is a stability question (labile samples or matrices) for the prepared analyte, serial methods are more useful. Examples cited where serial techniques were required were OPA-amino acid derivatives for HPLC analysis, where some of the derivatives have very short half-lives, and the determination of moisture, where batching of assays could expose the sample to moisture in the environment, thereby affecting results. When the sampling cycle time is very long, serial SP makes the most sense. Brinkman cited the example of environmental early-warning analysis of Rhine river water, where samples are taken every 4-6 hours. At the other extreme, when SP time is very fast, as with microwave digestion, then "taking the sample directly from the microwave vessel on-line makes a great deal of sense, although currently there is not a microwave instrument that is integrated with the analytical instrument to perform that task," states Kingston.
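
The consensus can be made visible with a toy timing model: when prep takes no longer than the analysis, on-line (serial) preparation hides the prep step behind the running analysis, whereas batch preparation simply adds the two stages. The Python sketch below is illustrative only; the times and the simple pipeline assumptions are hypothetical, not a description of any particular instrument.

    # Toy timing model contrasting batch and overlapped serial (on-line) prep.
    # All times are hypothetical minutes.

    def batch_elapsed(n, prep, analysis):
        """Prep all n samples first, then analyze them one after another."""
        return n * prep + n * analysis

    def serial_elapsed(n, prep, analysis):
        """Prep sample i+1 while sample i is being analyzed (simple pipeline)."""
        return prep + (n - 1) * max(prep, analysis) + analysis

    def analyzer_idle(n, prep, analysis):
        """In serial mode, time the analyzer waits when prep is the slower step."""
        return (n - 1) * max(0, prep - analysis)

    n = 20
    for prep, analysis in [(5, 20), (20, 20), (40, 20)]:
        print(f"prep={prep:>2} min: batch={batch_elapsed(n, prep, analysis)} min, "
              f"serial={serial_elapsed(n, prep, analysis)} min, "
              f"analyzer idle (serial)={analyzer_idle(n, prep, analysis)} min")

When prep is slower than the analysis, the analyzer sits idle between runs in the serial mode, which is one reason the experts reserve on-line operation for cases where prep keeps pace with the analysis or where sample stability demands it.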

There were other opinions on when serial SP might be preferred over batch methods. Brunner felt that "if limited sample is available, then one may feel more confident in preparing one sample at a time"; in a similar vein, McNally favored serial preparation when a difficult sample "must be watched." Burke felt that in automated SP methods development the serial approach would be useful, but he and Zenie concluded that off-line SP makes sense when the instrument is a very expensive one, such as GC-MS, where one would not want to "tie up" the instrument during routine use. Here Burke feels that "serial or on-line systems will require the development of low cost analyzers." Beugelsdijk perceived that the serial mode has its greatest potential in feedback-controlled systems wherein real-time (or pseudo real-time) control of a process is possible. As control systems for the laboratory are developed, more serial systems will emerge. Boos favors serial SP because in his hands it improves the accuracy, precision, and sensitivity of his methods. Edholm prefers on-line (serial) systems where specific chemistries can be designed into the process or where SPE can be coupled to HPLC for a crude SP before a final workup is performed. Hamilton feels that "linked" systems are "more easily accomplished in labs with routine loads, less so in labs that need high flexibility or mixing and matching SP and instruments."

Several experts held the belief that the batch environment is often an accommodation of the human in the laboratory rather than the preferred way to prepare samples. Often, these methods are not science-based but are developed under the constraint of an 8-hour workday, cites Beugelsdijk. He provided examples such as "let stand overnight to settle" or "allow to digest for one hour." Automated systems are not so constrained and should not be penalized with human limitations. Beugelsdijk further adds that "a rethinking of the method that recognizes the true chemical boundaries should be done before settling on the batch format as the de facto best approach." The human element also dictates the batch mode "because of our limited ability to keep track of too many simultaneous events," says Zenie. Other factors such as the lab layout, instrument availability, or the work flow might also favor the batching of samples.

Another situation where the batch mode might make sense is when a wide variety of SP techniques feed a wide range of analytical instruments. Kramer felt that the type of SP procedure dictates whether a batch or serial approach is required. For example, he mentioned that "centrifugation and microwave-assisted digestion tend to require batch processes. But new technologies such as axial centrifuges and the new flowthrough microwave digestion system show that even traditionally batch steps can be made serial." Since batching is a buffering operation, Kramer would like to see buffering capability provided by system instruments so that in an automated system time-dependent activities or processes could be built into the analytical method. Batch systems are especially attractive when dealing with large batch sizes, such as the 96- and 384-well microtiter plates.

7. Laboratory robotics had a very rapid growth in the early to mid-'80s, but growth has fallen off in the '90s. Can you cite three reasons why this falloff has occurred?

In general, there was a consensus among the experts that robotics is more complex than other analytical techniques in the laboratory. "Laboratory robots were originally marketed as do-it-yourself technology," mentioned Beugelsdijk. Some of the first practitioners were chemists who were not skilled in the engineering and software aspects, so highly trained, dedicated personnel were needed. In many applications, the overall lead times to implementation became excessive, and chemists were often sidelined for months awaiting the tools they needed to do their job. Furthermore, Beugelsdijk added, "the resulting systems were not robust and required constant monitoring and maintenance."

Laboratory management didn't understand the resultant benefits of automation and wanted a ready-to-use automated instrument just like those they were purchasing for GC or HPLC. They wanted "turnkey" systems and got dismayed at the long lead times required to get their systems up and running. They did not understand the infrastructure required to implement successful robotics applications. Few companies had automation groups, and most did not invest in establishing them. In the past, a few have tried outside services (third parties), but often these external companies don't fully understand the application. Once the outside company completes the project, the system must be supported by in-house personnel who are not fully trained. Nevertheless, out of necessity many companies are turning to third parties when the need for automation arises, since they cannot get the job done by internal personnel. Zenie questioned whether "management really understood how to justify robotic automation other than by labor savings." He cited time-to-market for new products, improved product quality, and productivity gains as examples of strategic benefits often overlooked in the decision-making process.

The early robotic units tended to mimic human manipulations, which made interfacing them cumbersome, and many applications resulted in a haphazard implementation. In general, Edholm felt that the robots were "not really optimized for task-oriented work." Often, the robots were used to perform the sample preparation directly, which made them appear to be slow. Hamilton noted that the trend has been toward using the robots as sample-transport devices that move samples or surrogates among a series of dedicated workstations optimized to do specific tasks such as SPE or filtration.

Another observation by the experts was that robotics was oversold by the instrument companies and by the robotic champions. When the robots didn't fulfill their promises or user expectations, they lost their "luster," and many were put on the shelf for another day. Although they were sold as flexible automation devices, at least three experts felt that the robots actually offered limited flexibility and even limited application. Kramer noted that many of the early robotics systems were sold into environments where they could not be adequately supported and left a bad taste in the user's mouth. Another factor for slow acceptance, cited Beugelsdijk, was the presentation of robotics early in the game "doing whimsical or trivial tasks that caused robots not to be seriously regarded."

Probably the single biggest reason why robotics sales have fallen off, cited by a majority of the experts including the manufacturers, was the high cost, or perhaps better stated, the lack of "cost effectiveness." A typical successful robotic application often ended up costing hundreds of thousands of dollars and therefore could only be justified for tremendously large sample loads. Many installed systems had to be reworked and greatly modified because of a lack of communication between the user and the vendor during the definition stages, and this drove final costs up. Fortunately, one recent success story is the robot-based drug candidate screening system; for time-to-market, labor-saving, and tedium reasons, these systems are making a comeback in this very specific pharmaceutical application.

Kingston and others felt that "the proprietary nature of the robotic communications and software limited their acceptance in many laboratories." Lack of standardization among the robotic suppliers meant that it was difficult to connect the robots to the analytical instrument and other peripherals in a straightforward manner.

Holden felt that the new generation of robots is easier to use, has simpler interfacing requirements, and is more reliable than those of the '80s. Hurst felt that the modern robots are now part of a larger body of laboratory automation.

8. What is your perception on how well instrumentation vendors have responded to meet the analyst's needs in terms of automated sample preparation? Where could they do better?

Although stated in different ways, the general feeling of the experts was that many of the automation systems have not been built to end-user requirements. Hurst felt that "some vendors took a rather arrogant attitude and essentially told the end user that we know what's best for you and we will make it." However, Kramer stated that "the general attitude seems to be that they would provide such equipment, if there were a market for that equipment." He added that "vendors continue to develop instruments for manual operation and, with a few notable exceptions, have not been leaders in bringing new sample prep automation into laboratories." Kingston believes that "smaller companies have done a better job" while the larger companies "stick to spectroscopy and chromatography which made them large." He added that "the larger companies could innovate in the area of sample preparation equipment development and take risks like they once did."

A majority of experts felt that "listening to the customers" and, as Linda Brunner put it, "really understanding the bottom line" would be an area for vast improvement. Brinkman felt that vendors "are often not familiar with the real world applications of the sample preparation techniques and therefore need to understand how the equipment will be used as well as the equipment from other companies so that they could be used together in a system." All felt that an "upfront," closer collaboration and open and free communication among vendors, consumable makers, and users would be a welcome change in the instrument development process. "Good automation requires customer involvement, knowledge and commitment," stated Frank Zenie. He further encourages the instrument companies to "follow their products into the lab and make sure they do the job." Brunner felt "that most vendors are presently looking to the end users' needs and requirements when designing their instrumentation, but that was not always the case." Beugelsdijk also sensed that finally "instrument companies have realized that customers generally wanted solutions (turnkey systems), not tools." However, adds Hamilton, "it's a very application focused area - thus requiring a lot of time and effort from a vendor to really understand the needs. But customers are not always willing to pay for the true cost of such effort as reflected in the eventual instrument price." Zenie feels that the companies "worked hard to do a good job but tried to do too much" and made solutions "too expensive and hard to use by attempting to drive out all manual intervention." "Well conceived automation blends the use of people and instrumentation"; in some cases, he even recommends "discouraging customers from trying to do too much especially when just starting automated methods," he adds. Holden also feels that "making products less expensive and easier to use" would benefit sample prep automation.

A number of experts agreed that one area where instrument developers have fallen down is system integration, by not providing seamless coupling of their equipment to the equipment of others. Just as was once the case in the computer industry, there has been no standardization among the manufacturers, with each having its own proprietary hardware and software. An "open architecture" design is recommended. Another predominant theme among the experts was that most sample preparation instruments are "afterthoughts" with no real "integration with the analysis." Kingston extended his thoughts on the role of the major instrument companies by stating "their risk avoidance and reluctance to get into new markets to support analysis has created a niche for small companies but left some customers with no integrated analysis answers." Kramer agreed: "there is very little easy-to-integrate automated equipment available; only a few companies are building automated systems since currently there's a small market, which discourages further investment by the other companies. It's going to take some vision and convictions to break out of this circular argument." He went on to suggest a number of system design ideas which would allow sample prep instruments to be thought of as "system building blocks and not as stand-alone devices."
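
One way to picture Kramer's "building block" idea is a small, vendor-neutral software interface that any prep module could implement so that an integrator can chain modules and hand a structured record on to the analytical instrument. The sketch below is purely illustrative; the PrepModule interface, the SPEStation class, and the method parameters are hypothetical and do not correspond to any real vendor's API or to a published standard.

    # Hypothetical "building block" interface for prep modules (an SPE station,
    # a microwave digester, a diluter). Illustrative only, not a real standard.

    from abc import ABC, abstractmethod
    from typing import Any, Dict

    class PrepModule(ABC):
        """Minimal contract a sample prep 'building block' might expose."""

        @abstractmethod
        def status(self) -> str:
            """Report readiness ('idle', 'busy', 'error') in a shared vocabulary."""

        @abstractmethod
        def run(self, sample_id: str, method: Dict[str, Any]) -> Dict[str, Any]:
            """Execute one prep step and return a structured record that the
            next module or the analytical instrument can consume."""

    class SPEStation(PrepModule):
        """Hypothetical SPE workstation wrapped in the common interface."""

        def status(self) -> str:
            return "idle"

        def run(self, sample_id, method):
            # A real driver would command the hardware here.
            return {"sample_id": sample_id, "step": "SPE", "status": "ok",
                    "sorbent": method.get("sorbent", "C18")}

    def prepare(sample_id, chain, method):
        """Pass one sample through a chain of building blocks in order."""
        record = {"sample_id": sample_id}
        for module in chain:
            record.update(module.run(sample_id, method))
        return record

    print(prepare("S-001", [SPEStation()], {"sorbent": "C18"}))

The design choice is simply that every module speaks the same small vocabulary, so integration becomes configuration rather than a custom engineering project for each pairing of prep device and instrument.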

9. The trend in analysis seems to be moving from the analytical laboratory closer to the sample source such as the factory production floor (at-line or on-line) or to field testing, as in the case of environmental analysis. What impact will this trend have on the preparation of samples?

A number of the experts felt that this trend will have a dramatic effect on sample preparation and analysis in general. Holden notes that "currently many at-line procedures are identical to those done in traditional laboratories. Until new analytical techniques are incorporated, the appropriate SP will still be required." In these techniques there will have to be a simplification of procedures to get answers and, in fact, there may be little or no sample preparation. Brinkman felt that new procedures will cause the analytical chemist to "make the answers really match the questions as they ought to have been formulated." He cited an example: "if preliminary screening is all you want, do not overdo the separation, but keep the analyte detectability intact." In the bioanalysis of drugs, the increasing use of "on-line monitoring through the use of biosensors or dialysis probes" is another trend foreseen by Edholm. Brunner feels that at-source analysis "will further promote automation, no matter how primitive and/or simple, since rapid turnaround (instant answers) is often a driving force." She and others foresee that more reliable, foolproof, easy-to-use, less-flexible, analyte-specific instruments will need to be developed to meet these factory and field sampling and analysis needs. Says Kingston, "this is a healthy trend with additional development opportunities for research and improvement of both methods and instrumentation." Zenie feels that "integrated information systems will be crucial to this evolution. Information must be secure and centrally available to decentralize testing."

Jeffrey Hurst is one of several experts who believes that there should be "no sample preparation and thus no requirement for solvents and less chance of error." However, being realistic, at least in the short term, he further adds, "there will be a trend towards more 'environmentally friendly' solvents and the preparation of smaller samples since there are concerns not only about the initial cost of the solvents but also about their disposal." Kramer also feels that "since SP is one of the reasons that keeps the analysis in the lab, this trend will continue to push the development of analytical techniques that require little or no SP." McNally feels that "we are not there yet and we have a long way to go with the variety of sample matrices that are seen."

Since the lab is a bottleneck, Beugelsdijk also feels that "more field sampling and factory testing is inevitable," but "more specific, sensitive information and/or quantitation is often unavailable with field techniques and these will continue to be sent to the laboratory where the entire analysis process can be more carefully controlled, especially when regulated methods must be carried out in a certified laboratory." Overall, he and others believe that field screening will reduce the quantity of samples entering the laboratory. On the other hand, Burke feels that "the sampling procedures will be integrated to include the SP as well as the sample collection and the number of samples collected will increase as the SP processes simplify the analytical determination step." One specific growth area he sees in environmental field sampling is the increased use of SPE for immobilizing analytes on solid surfaces for transport back to the laboratory, since 1) chemical stability is enhanced when the analyte is sorbed and 2) large quantities of analyte (e.g., from a liter of water) can be more easily handled or shipped in a "concentrated trapped form."

10. In addition to SP, another bottleneck in high-throughput, routine testing laboratories is information management. Automated SP and laboratory data automation go hand in hand in meeting sample and information flow requirements. Why haven't analysts taken advantage of this tool?

Most experts agreed that information management systems are currently not well integrated into the laboratory and that this is an area requiring greater attention. It became a bottleneck once "automated SP and higher throughput analysis became available," mentioned Hamilton. He, Zenie, and Fritz Erni agreed that there are "no single vendor solutions" where information management is integrated with the analysis, even from companies who offer both instrumentation and data management systems. Skip Kingston and Jeff Hurst both added that it is not just information management but "the transformation from data into information to final results." Kingston further states that "the information should be fed back to the instrument to optimize and improve the operation, an area largely ignored by instrument manufacturers." Often it is left to the analyst to be the integration step, but technology exists whereby the process could be "combined to create automated feedback analytical systems." Several of the experts agreed with Beugelsdijk, who stated "automation of the entire analytical process from sample preparation to data interpretation is hampered not only by the difficulty of the task but also by the complexity and incompatibility of the tools currently available and the lack of industry wide standards for integration of these systems." "Incompatibility of different systems" still plagues the laboratory of Erni. Zenie states that many instrument manufacturers, including his own, "are now committed to be compatible with computer industry standards such as Microsoft's DOS, Windows, etc." but feels that the integration of SP and information systems still remains yet another customized project.

Part of the problem "is the familiarization and comfort level of chemists in dealing with proper data management," stated Brunner, who works in a pharmaceutical company. She adds, "it is a full-time job in itself if properly implemented, maintained, validated, documented, etc." Edholm agrees that "there has been a tremendous growth in both software and hardware development and workers have hesitated to choose information management technologies for this reason, especially when they have to collaborate with information management and computer science departments." Hamilton adds that "the group that manages information management is not always well integrated with those who look at prep and analysis." Holden agreed but feels that "efforts are underway in many companies where the analytical group and information technology (IT) groups are working together to tackle these larger projects." Burke further comments, "GLP remains a poorly understood process and unfortunately it remains today in the minds of most lab staff as 'regulatory' and therefore something that is being imposed on them rather than approaching this as a means of providing better information (or science). GLP requires that we truly accept the idea that 'paper' or information flow is the actual product of the lab and that the benefits of operating with true automation of information flow and lab exercises are essential." Kramer agrees: "analysts see their job as prepping samples and getting good numbers, while the job of figuring out what to do with those numbers belongs to someone else." McNally reflects this feeling in her response when she mentioned that "the craft is analytical chemistry" and that "a major investment in time to become a computer 'jock'" is not in her domain of interest. Obviously more work is needed to make information management user friendly so that chemists are comfortable with its use in the laboratory.

11. Sample identification is very important for proper tracking of samples through the analytical process. Sample i.d. is especially pertinent to regulated laboratories. Bar code labeling equipment has been around a long time and has been readily adopted by hospital and clinical laboratories for sample tracking and for maintaining sample integrity. Why isn't it used more extensively as an aid in sample tracking in many analytical laboratories?

Most experts agreed that the need for sample tracking should be pervasive throughout the analytical process, yet acknowledged that in most cases such tracking is not used. The experts felt that there are several reasons, in addition to the chain-of-custody argument, that bar coding is used more in clinical situations than in the analytical laboratory. First of all, most analytical instruments have no provision for reading a bar code, while clinical instruments do. Kingston notes that "most analytical instruments are being produced to be autonomous islands of data and signal processing and not integrated systems ready to talk to the rest of the world." In Steve Hamilton's experience, the "lack of integration within the laboratory organizations and the IT group isn't conducive to the addition of bar code devices in the analytical process." Furthermore, he adds, "it is difficult to integrate bar coding into mature organizations where complex patterns of sample i.d., tracking, and data management have evolved over the years. A retrofit to automated i.d. can take a lot of work and time." In Burke's opinion, "the number of laboratories which require bar coding is rather small such that most analytical companies have not found this market worth pursuing." "In analytical laboratories, the container doesn't usually follow the entire process through the laboratory," since methods are more complicated, multiple sample preparation steps (e.g., extractions, derivatization, etc.) usually take place, and several other sample containers are used during analysis, mentioned Tony Beugelsdijk. He adds, "this is in contrast to the clinical lab where analysis is performed on the original sample (e.g. blood or urine) or an aliquot thereof." Along those lines, Brian Holden and others noted that the samples encountered in an analytical laboratory can be quite varied, while in the clinical lab there are only a few types of sample. Kramer said, "compare the variety of matrices encountered in an environmental lab with the clinical laboratory, which sees mostly red and yellow samples over and over again."
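
Beugelsdijk's point about containers can be illustrated with a small data structure: if each derived container (extract, derivative, autosampler vial) carries a bar code built from its parent's, the lineage back to the original sample can still be reported even though the original container never reaches the instrument. The sketch below is hypothetical; the bar code scheme and sample names are invented for illustration.

    # Hypothetical sample-lineage record: derived containers keep a link to
    # their parent so the chain of custody can be reconstructed.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Container:
        barcode: str
        contents: str
        parent: Optional["Container"] = None
        children: List["Container"] = field(default_factory=list)

        def derive(self, suffix: str, contents: str) -> "Container":
            """Create a child container (aliquot, extract, derivative) whose
            bar code is traceable back to the original sample."""
            child = Container(barcode=f"{self.barcode}-{suffix}",
                              contents=contents, parent=self)
            self.children.append(child)
            return child

        def lineage(self) -> List[str]:
            """Walk back to the original container for chain-of-custody reporting."""
            node, chain = self, []
            while node is not None:
                chain.append(node.barcode)
                node = node.parent
            return list(reversed(chain))

    soil = Container("ENV-0042", "soil sample")
    extract = soil.derive("EXT1", "hexane extract")
    vial = extract.derive("V01", "2 mL autosampler vial")
    print(vial.lineage())   # ['ENV-0042', 'ENV-0042-EXT1', 'ENV-0042-EXT1-V01']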

Quite a number of experts noted that, with the exception of the 2 mL autosampler vial, there are no standard sample containers, again in contrast to clinical labs where containers are standardized. Another reason is that the sample volume in a typical high-production clinical facility is much higher than in a typical analytical laboratory.

In the future all of this may change. Brinkman mentioned that in Europe the need for sample integrity in the environmental laboratory has led his group to look into some form of bar coding to track samples. Hurst also notes that he is starting to see bar codes more widely used. He claims that "a new bar code standard is being promulgated and this trend will have an impact on the laboratory." Zenie predicts that "automated bar coding will be part of automated sample preparation which is still a small segment of all lab operations. The vision is clear but the implementation is a big job."

12. Many laboratories are now required to provide elaborate justification for the purchase of new equipment. For automated SP instruments, what factors should be considered for such a justification? Do you use financial models (e.g. break-even time, return on investment - R.O.I., R.O.A., etc.) or more strategic benefits (e.g. capacity to meet greater sample loads, better product quality, time-to-market in development, better utilization of scarce resources, etc.) in making such justifications?

More and more companies are looking at the strategic value of automation. This was the feeling of several of the experts. The value of timely, high quality information, productivity enhancement, and head count avoidance or better utilization of scarce resources were the strategic factors most often mentioned. Hurst and Brunner agreed that "justification for the purchase of laboratory automation equipment is predominantly dependent on the specific application and is really a combination of both financial and strategic benefits." Brunner added that her number one justification for automation is "reliability." A number of the experts added that one of the biggest justifications for the automation of sample preparation is the improvement in the quality of the work environment. Not only will automation expose the worker to fewer health hazards (e.g. biohazards, toxic solvents, etc.), but it will improve the quality of the daily work by allowing more time for creative thinking, a point specifically mentioned by Boos. Beugelsdijk asks the question "do we want to be a player in this market?", which encompasses a long-term strategic automation policy that is pursued as a long-term investment. Edholm and Erni felt that in the pharmaceutical industry the key strategic element was "faster drug development" or, in other words, getting to market before the competition. McNally agreed, stating "the best justification is influence on the bottom line, not in terms of analyst time but in terms of getting our products to the market."

Holden mentioned that his company's customers "are justifying purchases based on financial models and Hewlett-Packard provides worksheets to assist them in developing this information." Kramer believes that either model can be used and can work, but that "over the years, he has seen such planning tools abused - cost justifications that covered only acquisition costs, labor cost estimates that did not include company overhead, assumptions that the automation will run 24 h/day, etc. It's too bad this happens, because in attempting to make the automation look good, these faulty models project scenarios that real automation cannot support."
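
The abuse Kramer describes is easy to show with a few lines of arithmetic. The sketch below contrasts a naive justification (acquisition cost only, round-the-clock use assumed) with a more honest one that subtracts ongoing support costs and applies a realistic utilization factor; every number is a hypothetical placeholder rather than data from any laboratory.

    # Naive vs. honest payback model for an automated prep system.
    # All figures are hypothetical placeholders for illustration only.

    def payback_years(capital, annual_support, annual_labor_savings, utilization):
        """Years for realized savings (net of support costs) to repay capital."""
        net_savings = annual_labor_savings * utilization - annual_support
        return capital / net_savings

    labor_savings_at_full_use = 90_000   # value of analyst time displaced per year

    naive = payback_years(capital=100_000, annual_support=0,
                          annual_labor_savings=labor_savings_at_full_use,
                          utilization=1.0)          # 24 h/day, no support cost
    honest = payback_years(capital=100_000, annual_support=25_000,
                           annual_labor_savings=labor_savings_at_full_use,
                           utilization=0.5)         # realistic duty cycle and support
    print(f"naive payback:  {naive:.1f} years")     # about 1.1 years
    print(f"honest payback: {honest:.1f} years")    # about 5.0 years

Under these placeholder assumptions the payback stretches from about one year to about five, which is exactly the kind of gap that leaves real automation unable to deliver what the faulty model promised.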

Frank Zenie had a more provocative feeling about such justifications. "Senior management have not accepted that productivity and automation are part of their job. They leave the technical evaluation to the scientist and the financial justification to the accountants. For example, in most routine testing labs, the management issue is building sufficient analytical capacity to process a doubling or tripling work load without adding people or space. The issue is capacity, not R.O.I. In today's world, if the labs can't get the work done, they outsource it at premium prices." He adds that, in his opinion, "lab automation has not reached strategic importance in most labs as yet."

13. Many laboratories are required to work under GLP (Good Laboratory Practices), cGMP, and other regulatory constraints. What do you see as requirements for SP products and methodology to fit into one of these laboratories?

The experts agreed that there is increasing concern about the validation, documentation, and control of user-programmed automated devices, including those which do SP. "The methods and methodology have to be proven and appropriately 'validated' (ISO 9000 certification is a plus), proper documentation has to exist, and reproducibility is probably the most important factor" in her laboratory, states Linda Brunner. Burke felt that "in general, expectations of customers have grown and they will not accept sloppy product or results in the '90s . . . quality is more than a buzz word, it must be provided."

Edholm and Erni both feel that such quality requirements will put more burden on the vendor, such that they "should not only give specifications, they should also test and verify them when a product is delivered, provide SOPs for all instruments including instrument qualification plans, instructions for preventive maintenance, validation plans, and training." Other future requirements will include "sample i.d. and tracking from collection to analysis, logging samples, password protection and other security measures," cited Brian Holden. Rather than the analyst doing all of this record keeping, Beugelsdijk, Kramer, and McNally all felt that the lab automation products should assist in generating the required documentation. Beugelsdijk summed it up when he stated "sample preparation instruments and systems must be able to account for their behavior and generate a defensible audit trail. Control points and decisions must be automatically logged in unalterable formats so that feedback-based systems can be developed." Kramer added that "automated equipment will provide a much more thorough audit trail than would ever be kept by an analyst." Zenie agrees: "vendors will have to make validation part of their solutions offering to meet this need. Ultimately, automated techniques with built-in documentation will become the standard for compliance."
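
One way a prep system could produce the "unalterable" record Beugelsdijk calls for is an append-only log in which each entry is chained to the previous one by a cryptographic hash, so that any later edit is detectable. The sketch below illustrates that general idea only; it is not a description of any existing instrument's logging or of a specific regulatory requirement.

    # Illustrative hash-chained audit trail for prep steps and decisions.
    # Each entry records the hash of the previous entry, so tampering breaks
    # the chain. Event names and details are hypothetical.

    import hashlib, json, time

    class AuditTrail:
        def __init__(self):
            self.entries = []
            self._last_hash = "0" * 64          # seed for the hash chain

        def log(self, event, **details):
            entry = {"time": time.time(), "event": event, "details": details,
                     "prev_hash": self._last_hash}
            digest = hashlib.sha256(
                json.dumps(entry, sort_keys=True).encode()).hexdigest()
            entry["hash"] = digest
            self.entries.append(entry)
            self._last_hash = digest

        def verify(self):
            """Recompute the chain; returns False if any entry was altered."""
            prev = "0" * 64
            for e in self.entries:
                body = {k: v for k, v in e.items() if k != "hash"}
                expected = hashlib.sha256(
                    json.dumps(body, sort_keys=True).encode()).hexdigest()
                if e["prev_hash"] != prev or e["hash"] != expected:
                    return False
                prev = e["hash"]
            return True

    trail = AuditTrail()
    trail.log("SPE elution", sample_id="S-001", solvent="methanol", volume_mL=2.0)
    trail.log("decision", rule="dilute 1:10", reason="response above calibration")
    print(trail.verify())   # True; editing any stored entry would break the chain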

14. Do you see any new technologies for SP on the horizon? Where do you see SP (both manual and automated approaches) going in the next 5 to 10 years?

A wide variety of new technologies were cited by the experts. Although not necessarily confined to SP, several of the experts felt that multivendor interconnectivity and standardization may not be far off, especially with the "push" of organizations and programs like the AIA, CAA, and CAALS behind the movement. Such activity will allow true integration of sample preparation, analysis, and other hyphenated techniques, as well as self-optimizing instruments with feedback control of instrument parameters and many of the other automation features cited elsewhere in this article.

In the instrumentation area, Erni does see further development in "sophisticated injection devices with on-line SP possibilities." He also foresees "more sophisticated robot systems with optimized nesting where simultaneous processing of totally different samples is possible." Other technologies which the experts felt would see future growth were supercritical fluid extraction, microwave-assisted and other thermal extraction methods, and "a repackaging of some of the traditional techniques such as 96-well microtiter plates for better automation" (Zenie).

Also foreseen was the further development of on-line and coupled sample prep-analysis techniques such as isotachophoresis, free-flow electrophoresis, flow injection analysis with bio- and chemical sensor detection, ultrafiltration and microdialysis (with more and different membranes being developed - Brunner) coupled to trace enrichment, and flow immunoassay, where "samples will be directly introduced into the analytical system" (Edholm, McNally, Zenie).

In the field of drug development, Edholm professes "a need for measurements of proteins with short half-lives where devices are needed that would allow for sampling and inhibition simultaneously since the concentration is needed at the time of sampling without further metabolism occurring."

In the chemistry area, the experts mentioned more tailor-made SPE materials such as antibody-coated and on-column derivatization packings, molecular recognition (imprinting) phases, more use of restricted-access media and other columns which can accept dirtier samples (e.g., blood and urine), especially for direct coupling to a mass spectrometer, and further development of solid-phase microextraction (SPME) technology.

Some of the experts thought that in the future some of the newer hyphenated techniques such as LC-MS-MS may lessen the need for exhaustive SP. Zenie felt that "these more selective instruments will simplify SP." Brinkman felt that there will still be some SP required "to protect the expensive equipment, but a short SPE-like precolumn can do wonders!" He added that SPE-GC-AED and SPE-LC-FTIR are other hyphenated techniques which may solve complex problems and that SFC may be resurfacing as a viable alternative to other chromatographic procedures. Brunner, Kramer, and Hurst all sensed that "CE-MS is another combination technique which will find its niche," but Edholm feels that "front end sample preparation techniques will be needed to overcome current problems such as incompatibility between the electrophoresis buffer and the sample." New spectroscopic "developments in NIR and MALDI-TOF" (Erni, Kramer, Zenie) and "FT-IR and FT-Raman spectroscopy" (Hurst) may also lessen the need for exhaustive sample preparation. In the inorganic domain, Kingston mentioned that "truly integrated microwave digestion-chromatography-ICP-MS with computer feedback and control would be a major step forward."

Several experts felt that miniaturization of sample preparation methodology will need to accompany expected miniaturization of the analytical techniques. A further trend towards smaller sample amounts and volumes will lead to a greater reduction in organic solvent consumption, but still allow detection-identification below 1 part per billion. Kramer also foresees "more sample prep being done in flow systems using disposable components within the flow system." For example, he cited "flow systems with automatically exchanged cartridges which will contain derivatization reagents, enhancing extractions, or enabling detection, which will offer binding sites for reactions, and which will provide for separations and retention of unwanted analytes, matrix materials, etc. In addition, the raw samples would be placed in similar cartridges for automated processing."

SUMMARY AND LAST COMMENTS

To summarize, a final question asked where the experts thought sample preparation was heading in the next 5 to 10 years. Although many of these improvements were cited elsewhere in this article, there was general agreement among the experts that SP is heading in six main directions:

1. Greater integration of SP and analysis, with information feedback, automated record keeping, and greater use of hyphenated techniques (e.g. SPE-GC-MS)

2. Smaller samples and miniaturization

3. More standardization, automation and modularization

4. Techniques which use less organic solvent

5. On-line measurement and at-location measurement

6. Less sample preparation due to improvements in selective spectroscopic measurements.

Expanding on the last point, several experts believed that direct analysis of untreated samples will lessen the need for exhaustive SP. Specifically, some of the newer spectroscopic techniques such as MALDI-TOF and LC-MS-MS and the use of specific bio- and chemical sensors and immunoassays will not eliminate all SP, but much simpler devices will be used (e.g. a short precolumn or SPE cleanup). Brinkman sees the development of SP procedures that are better adapted to the needs of the sample. For example, a different approach may be needed for small-volume biomedical samples than for large-volume environmental samples. Kingston thought that "once fully integrated instruments which can communicate with one another are available, then methods will be encapsulated and distributed by organizations such as the U.S. E.P.A. and ASTM on disk (or over the Internet - editorial addition). The instruments will validate their performance automatically. Information rather than data will be provided by the instruments, and instrument performance and analysis objectives will be evaluated by integrated software packages assisting the analyst." Kramer added to Beugelsdijk's earlier comment that he would also like to see "more performance-based methods and less prescriptive methods of today." He adds, "performance-based methods put the burden of measurement validation back with the analysts where it should be." In terms of instrumentation development, Kramer would like to see fewer "anthropomorphic approaches to SP where we'll stop trying to make machines do things the way people do them." He and Zenie both see disposables developed for automation.

In general, the experts feel that the future for further developments in SP, especially its automation and integration with the rest of the analytical laboratory, looks very bright.

LIST OF EXPERTS FOR SAMPLE PREPARATION SURVEY

Dr. Tony Beugelsdijk, Team Leader, Robotics Tech. Development in Laboratory Automation, Los Alamos National Laboratory, Los Alamos, New Mexico.

Prof. Dr. K.-S. Boos, Professor, Institute of Clinical Chemistry, University Hospital, Munich, Germany.

Prof. Dr. U.A. Th. Brinkman, Professor, Analytical Chemistry, Free University, Amsterdam, The Netherlands.

Ms. Linda A. Brunner, Group Leader, Bioanalytics and Pharmacokinetics, Ciba-Geigy Corp., Summit, New Jersey.

Dr. M.F. Burke, Professor, University of Arizona, Dept. of Chemistry, Tucson, Arizona.

Dr. Lars-Erik Edholm, Director of Bioanalytical Chemistry, Astra Draco AB, Lund, Sweden.

Dr. Fritz Erni, Technical R&D Manager, Analytical Development, Sandoz Pharma, Basel, Switzerland.

Dr. Steve Hamilton, Research Automation Manager, Amgen Inc., Thousand Oaks, California.

Mr. Brian Holden, R&D Manager, Sample Preparation-Sample Introduction, Hewlett-Packard, Wilmington, Delaware.

Dr. W. Jeffrey Hurst, Senior Staff Scientist, Hershey Foods Technical Center, Hershey, Pennsylvania.

Dr. H.M. 'Skip' Kingston, Professor, Duquesne University, Pittsburgh, Pennsylvania.

Dr. Gary W. Kramer, Automation Consortium Project Manager, National Institute of Standards and Technology, Gaithersburg, Maryland.

Dr. Mary Ellen P. McNally, Research Associate, Analytical Technology Group, Agricultural Products, Du Pont, Wilmington, Delaware.

Mr. Frank H. Zenie, President, Zymark Corp., Hopkinton, Massachusetts.

