NEWS FOCUS
Process Control No Longer Separate from Simulation, Design

Changing conceptions of chemical process design resulting from computerization are beginning to blur disciplinary lines

Joseph Haggin, C&EN Chicago
Various forces are at work on chemical process control, conceptually and in practice. Spurred by computerization of the overall process design activity, process control is moving toward a deeper integration with simulation and design functions, with implications extending beyond the immediate activity. In a three-part series over the next two months, C&EN will take a look at the current state of process control, beginning with this article examining the context in which it is evolving.
Process control is intimate part of overall design process (diagram: research and development; process design; control and instrument, electrical, machines, vessels, and piping design; civil and structural design; site and equipment design. Note: Many essential functions are omitted—for example, hazard and operability studies, monitoring, and costing.)

To the uninitiated, the craft of process control appears to be cut and dried: Attach one or more controllers to appropriate points of a plant and connect them to corresponding control elements. If it ever was that simple, it certainly isn't now.

Chemical process control has experienced several "revolutions" since World War II. Manual control gave way to pneumatics, and pneumatics gave way to electronics. Analog computers were displaced by digital machines. Digital control is becoming distributed rather than centralized. And distributed digital control is being integrated with systems for more general managerial control.

But superimposed on these developments are some less heralded changes in the way that chemical processes are conceived and designed. The advent of process simulation with digital computers and the gradual development during the past 10 years of a design approach called chemical process synthesis have left little doubt that chemical process control is no longer a discipline distinct from design. Today it has become impossible to properly conceive of chemical process control outside the design context.

Control practitioners of various sorts tend to regard all these developments merely as inevitable, if occasionally inconvenient and uncomfortable. Theorists tend to endow the changes with deeper significance. Both viewpoints have merit.

Computerization has made it possible to control and optimize chemical processes to an extent never before attainable. Through integration of process control with company management systems, the possibilities afforded by modern process control reach far beyond the plant gate to become an integral part of a
company's tactical and strategic business and financial objectives.

Computerization also has required degrees and levels of involvement by process design and control engineers that have taxed the ability of the professions to provide. An abrupt change of perspective has been involved, as well as some major, and mostly little noted, professional reorganization.

Most of the changes taking place in approaches to chemical process technology involve the proliferation of the small digital computer. It isn't an unmixed blessing. Most of the hardware and software that launched the computer era were conceived and designed for the business and financial communities. They are only now being adapted in any consistent way for use by science and engineering. The problems of adaptation are usually highly localized and require much local expertise for solutions.

Publicity suggests that there is software for everything and computerized solutions for problems that have yet to occur. But for many engineering disciplines, the fact is less impressive. It takes time, effort, and a lot of education to produce high enough numbers of scientists and engineers with the expertise to tame revolutions. It also requires appropriate hardware and software.

Many of these considerations underlie the thinking of one engineer who has expressed concern over the state of chemical process design and control. Vern W. Weekman of Mobil Research & Development Co. cites the perennial problem of coming to grips with the process to be controlled and measuring the right variables. Reliable on-line measurements are often difficult to make, and the presence of a compositional disturbance, for example, is often not known soon enough to avoid trouble.

Weekman thinks the biggest single obstacle in devising a control system for a process is understanding the process itself. Chemical processes are complex, generally highly nonlinear, and invariably affected by multilevel interactions among component properties. They are usually quite large, and may operate either continuously or in batches.

Ideally, control systems are designed as part of a process, not appended later. Small changes in design of a process can produce profound effects in the process dynamics. However, it isn't obvious a priori how to determine the best control configuration. Hence, a considerable amount of dynamic simulation is desirable, if available. But there is still a marked tendency, invariably for reasons of practicality, to design processes and control systems around some steady state. A classic engineering decision is usually involved over the optimum amount of sloppiness to have in a process model. The decision involves a compromise between precision and practical utility.

Weekman's industrial point of view complements in many ways the more theoretical one of chemical engineering professor Alan S. Foss of the University of California, Berkeley. A few years ago, Foss produced a major critique of process control theory and practice, concluding that chemical engineers have been infatuated with translating control methodology from other industries to chemical processes. He claimed that was the wrong approach because chemical processes present unique features that make such adaptation usually difficult and sometimes impossible.

Foss says that in recent years the situation has improved considerably, and the celebrated gap between control theory and practice has narrowed. However, many theorists and practitioners believe that Foss' critique remains quite accurate and current.

Like Weekman, Foss suggests that many of the problems of chemical process control derive from imperfect understanding of the process itself. In a superficial way, chemical process control can be described as the regulation of a complex chemical and physical system beset by unpredictable disturbances. Regulation usually implies a predetermined state or set of states that serve as a reference. Devising a control system requires an obvious knowledge of the process, with the implication that the control system designer is, in fact, a process designer. Not everyone agrees with this assessment, but it does appear to be gaining adherents.

It seems axiomatic that the more that is known about a process, the better the controls that can be developed for it. The dynamic complexity of most chemical processes usually defies precise definition, even in this age of ultracomputation. However, complete definition is seldom required, even though it may be desirable. Practical process descriptions invariably employ "lumping" of parameters, variables, components, and the like, and require measurement of only some of the variables.

Regulation requires loop, controller variations

(Chart: deviation of a control variable from its original value vs. time, with no control, proportional control, proportional + integral control, and proportional + integral + derivative control; the offset left by proportional-only control is indicated.)

Process regulation usually is exerted through control loops, which are pneumatic or electronic circuits containing the control elements. The most frequent kind of loop is the univariable feedback loop. A sensing element registers the value of an exit stream property and transmits the value to a comparator. Here, the recorded value is compared with a preset value. The difference, if any, is the error, which is passed to the controller that actuates the appropriate control element—valve, heater, cooler, or the like. In feedback control, any disturbance already has passed through the process before the error is generated. Feedback control, consequently, is a corrective control aimed at restoring some predetermined state of the process.

Another kind of control is feedforward control, in which the sensor is placed in an entering stream. A disturbance is sensed before it enters the process. If the corrective action is fast enough, upset may be avoided, or at least minimized. The choice between these and other types of control depends on system stability and other concerns.

(Diagram: feedback control loop, showing set point, controller, control signal, final control element, process input and output, process system, property sensor, primary signal, and signal transmitter.)

Depending on the design of the controller, control may be of several kinds. A proportional controller generates an output signal proportional to the error measured. Generally, the greater the error, the larger the response from the control system. With proportional control only, the system equilibrates at some new steady-state value of the measured variable. The difference between the original and new value is termed "offset."

Another mode of control is integral control, in which the response is proportional to the integral of the error. Proportional and integral control frequently are applied in concert (PI control) to take advantage of the ability of integral control to reduce offset. In addition, the response is improved.

A third mode of control is derivative (rate) control, in which the response is in proportion to the derivative of the error. Proportional and derivative control sometimes are combined (PD control). Derivative action tends to anticipate changes in the error.

All three modes of control often are combined (PID control). This adds faster damping of oscillations to the advantages of PI control. A great variety of controller architecture has been devised. In addition to the unitary kinds of control, all combinations are also employed. And frequently, controllers are stacked—in effect, using one controller to actuate another for smoother action.

Many "conventional" pneumatic and/or electronic analog mechanisms are still used in various processes. However, digital algorithms for control modes are rapidly replacing analog prescriptions and mechanical settings. The main advantage is that digital microprocessors can handle many loops at once on a time-sharing basis. In modern processes, hundreds of loops are typically used.
Determining which variables to measure can be somewhat difficult, and experience isn't always the best guide. At present, however, the engineering insight gained from experience is still an integral necessity for process control design.

There still is no generally acceptable theory for industrial chemical process control. There are theoretical bits and pieces from other disciplines that may apply to simple subsystems. And many control problems of practical interest are of the simple, single-loop variety that are treatable by linear analysis.

This simple picture may be starting to change. Although there are numerous control theories and theorists, only recently has the realm of chemical processing developed its own. A decade ago, coincident with the proliferation of digital computers, a substantial change in perspective for chemical engineering began. This evolution is gathering momentum and is now generally known as chemical process synthesis (CPS). The effects of this development probably will be important, if not crucial, to chemical process control in the future.

Among the leaders in development of CPS are chemical engineering professors Dale F. Rudd of the University of Wisconsin, Madison; Arthur W. Westerberg of Carnegie-Mellon University; and Rodolphe L. Motard of Washington University, St. Louis. Westerberg defines CPS as the act of determining the optimal type and kind of processing units—reactors, heat exchangers, and the like—and the optimal connections between them. CPS is still highly academic in flavor, and it is virtually restricted to research problems. But there have been some notable industrial adaptations of the first results, and the promise for more is bright.

Five areas of interest have been defined for process synthesis:
• Determining an optimal reaction path for a chemical synthesis.
• Determining an optimal arrangement of heat exchangers in a network.
• Determining the optimal separation system for a given purpose.
• Developing a complete flowsheet for a process.
• Designing an optimal control system for a process.

Optimal determinations implicitly contain the notion of alternate models and the means for finding the optimum. The most active areas of research in CPS are the first two. Separation system design has been an area drawing increasing interest. However, the last two areas, which relate most directly to process control, haven't developed so rapidly as researchers expected.

The thrust of CPS development has been toward total process design in all its aspects, a monumental undertaking. Two general approaches have been advocated. One assumes no initial process structure. The other assumes several alternatives, possibly even arbitrary alternatives, from which to optimize. This latter is usually referred to as the "integrated" approach and seems to be favored by a number of Japanese researchers.

Some of the earliest work in CPS at the University of Wisconsin made use of a computer program called AIDES (Adaptive Initial Design Synthesizer), which was essentially heuristic in nature. A substantially different approach was the BALTAZAR program developed by Motard and his associates at Washington University. Both assume no initial structure in a processing system, and both have been tested on real systems with some success. Carnegie-Mellon's Westerberg believes that the AIDES and BALTAZAR programs are more immediately useful than the integrated approach, but substantial trials of commercial interest are still needed.

There have been numerous complaints about the lack of a systematic approach for the synthesis of a whole-plant control structure, something that CPS purports to do. What theories do exist assume that potential control variables can be predetermined, which is roughly equivalent to specifying a control structure in the first place. This chicken-and-egg situation is not unfamiliar to the commercial designer, who must routinely contend with such matters.

Chemical processes may be viewed as operational units connected by process streams. They are kinds of chemical networks with dynamic properties—which is to say that properties vary with time. Networks have input and output variables. Input variables may be controlled or uncontrolled. In the process control lexicon, uncontrolled variables are disturbances that may or may not be random. A control strategy operates through the controlled variables, sometimes also called manipulated variables, with the objective of minimizing the effects of the disturbances.
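The network view just described can be made concrete with a small directed graph. The sketch below is a hypothetical fragment, with invented unit and variable names, rather than anything drawn from the CPS programs discussed in this article; it simply asks which manipulated inputs can reach, and therefore influence, which measured outputs.

```python
# A toy digraph of process variables: edges point from a variable to the
# variables it directly affects. Reachability then tells which manipulated
# inputs can influence which measured outputs. The flowsheet is hypothetical.

from collections import deque

edges = {
    "feed_rate":        ["reactor_temp", "reactor_conc"],
    "coolant_flow":     ["reactor_temp"],
    "reactor_temp":     ["reactor_conc"],
    "reactor_conc":     ["column_feed_comp"],
    "reflux_ratio":     ["overhead_purity"],
    "column_feed_comp": ["overhead_purity", "bottoms_purity"],
}

def reachable(start):
    """Return every variable that can be influenced from `start`."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

manipulated = ["feed_rate", "coolant_flow", "reflux_ratio"]
measured = ["reactor_temp", "overhead_purity", "bottoms_purity"]

for u in manipulated:
    hits = reachable(u) & set(measured)
    print(f"{u:>14} can influence: {sorted(hits)}")
```

Reachability checks of this kind are one ingredient of the digraph-based procedures for screening control structures mentioned later in this article.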
Output variables may or may not be measured. However, in the conventional situation at least one output variable from a unit or network must be monitored for control purposes.

A major problem is to define the objectives of the control system. The objectives may be either technical or economic. The purely technical considerations are paramount in importance. Plant and personnel safety is an obvious one. Another is achieving requisite product quality. There are usually environmental constraints and some operational requirements as well. Thereafter, most of the control objectives are economic and are concerned with optimizing plant operation within the required technical constraints. A CPS rule of thumb is that technical control requirements determine regulation of the plant, whereas economic constraints determine the optimization. Both are inextricably connected.

In the CPS context, a number of formal procedures have been suggested for developing a control system. One uses digraphs to represent relationships among system variables. Another uses multilevel iterations. Alternative control structures are first specified for each unit in the process. Conflicts in these structures are then minimized at another level. The iterations are carried on until a compromise is reached.

Development of CPS in theory and in practice has suffered from the slowness with which a cadre of engineers and scientists having the necessary mathematical and computational skills has accumulated. The situation is gradually improving, but it appears that industry has taken the lead over academia, largely because of the immediate need and because the computing equipment is available. This movement also has produced a close liaison between chemical companies and makers of control equipment, not to mention computer makers.

Considering the apparent movement toward integration of process and control design functions, it is remarkable that process control equipment manufacturers don't become more involved in process design than they do. The situation has been assessed by two engineers with Honeywell's process management systems division. Evan Whitmer and David Wick suggest that although most of the proprietary process modeling does not directly involve Honeywell, the company is intimately involved in control system simulation and in operator training. Even though the arrival of distributed control systems has thrown design and control personnel together, the two functions are still often distinct.

Whitmer and Wick also note that there is a tendency, at least inside Honeywell, to retitle process control "process management technology." This reflects the recent changes in control technology associated with the microprocessors that permitted practical distributed control. Honeywell claims the first microprocessor-based digital system for industrial process control, introduced in 1975 as its TDC2000 system. Among the competition are Leeds & Northrup's MAX-1, Fisher's PRoVOX, Bailey Controls' NETWORK-90, and Taylor Instruments'
MOD 30 systems. Thousands of such systems have been installed around the world. The microcomputer also has launched a further vertical integration of control functions. Whitmer and Wick cite the present integration of plant control with appropriate management information systems and the beginning of an effort to further integrate local control into plantwide control, company control, regional control, and beyond.

Availability of distributed control also suggests some intriguing possibilities about the character of future industry. Since distributed control can be applied to multiple locations, it may be advantageous to distribute the manufacturing base geographically. Whitmer and Wick note that before the Arab oil embargo of 1973, the general tendency in plant design was toward bigger, single plants, usually justified by the economy of scale afforded by big plants. However, there also were some implicit assumptions that weren't always justifiable. One was that transportation costs were low enough to permit transport of raw materials to and products from a central plant. Another was that plant reliability was good enough to accommodate occasional periods of downtime through reliance on inventory to supply markets.
FLOWTRAN is Monsanto's approach to simulation (diagram: pure component and mixture data banks, PROPTY component data processor, VLE phase equilibrium processor, engineering databases and database management system, user libraries, FLOWTRAN input/output and preprocessor, FORTRAN compiler, simulation model, and output)
The oil embargo invalidated both assumptions. Transportation costs have become very high, and reliability of big plants is suspect. In addition, the high cost of money and the extremely large investments in the huge plants have dimmed their luster considerably. There were also some rather significant market contractions due to high prices. It became difficult to justify operating a big plant at a fraction of its capacity for a prolonged period of time. The corresponding technical problems of controlling a large plant operating so far from design specifications are monumental. With all these contributing influences, the appeal of smaller, modular plants has increased abruptly.

The arrival of distributed control systems while all this was happening may have been serendipitous but nonetheless timely. The possibility now presents itself for use of smaller modular plants that can be optimally located to take advantage of raw materials and market locations. The means to control the plant and to manage its operation simultaneously are in view, and the incentives are growing.

Whitmer and Wick expect no major problems with available computing power for a long time. Available and planned computers of all kinds are more than adequate, and present controllers are fulfilling requirements well. There is, however, a growing need for better sensing instruments.

These sentiments are echoed by E. Victor Luoma, director of analytical science at Dow Chemical's corporate R&D department. He is content with the present state of process control technology but suggests that some major changes in the sensing end of the control loop are about to take place. In particular, he expects a proliferation of some of the new rapid-response gas chromatographs for on-line process control. At present, Luoma notes, temperature, pressure, and flow rate are still the most frequently measured properties for control purposes. However, composition is gaining rapidly, and other properties, such as conductivity, viscosity, and turbidity, may be more widely used if rapid-sensing and rapid-response detectors can be developed.

Honeywell's satisfaction with the present state of process modeling and simulation isn't shared generally by makers of control equipment and systems. In the past few years, there have been a number of acquisitions of control equipment makers by chemical or engineering and construction companies. Notable examples are the acquisition of Fisher Controls by Monsanto, Taylor Instruments by Combustion Engineering, and Bailey Controls by Babcock & Wilcox. Chemical and engineering companies may achieve an instant capability in control equipment through acquisition. However, the necessary capabilities in modeling and simulation have been either developed in-house or contracted for from a new group of companies that deal in simulation and optimization services.

Monsanto was one of the early developers of an in-house system. In 1966, the company began using a simulation system it developed called FLOWTRAN. The system automates steady-state chemical process design, and is used in a broad range of design activities, from conceptualizing new plants to making process improvement studies on existing plants. Stanley I. Proctor, director of engineering technology in Monsanto's corporate engineering department, says that for a while there was an attempt to market the system outside the company, but this move was shelved in favor of direct licensing in 1973. Direct licensing avoided problems arising with more than one version of FLOWTRAN. By licensing, each user became a de facto member of a FLOWTRAN consortium, and a formal users' group meets annually to advise members of improvements in the system.

Since 1974, Monsanto has made FLOWTRAN available to chemical engineering students in the U.S. and Canada through the aid of an association of chemical engineering educators and industry professionals known as CACHE (Computer Aids for Chemical Engineering Education). Use of the FLOWTRAN system is restricted to educational activities, and no faculty member may use the system for consulting.

In 1979, an improved interactive version of FLOWTRAN was introduced, and the new system has been used widely in Monsanto's corporate engineering department. Proctor says that every capital project that involves a continuous fluid-based process has a FLOWTRAN simulation carried out in both preliminary and final design phases. New engineers hired by Monsanto are given an intense grounding in use of FLOWTRAN.

Besides FLOWTRAN's intrinsic merits, Proctor
attributes much of the system's success to its continued support by management and the way that Monsanto manages its technology. In every stage of the development of FLOWTRAN, the key people have been process engineers with considerable knowledge of computers and computing. Computer experts aren't necessarily capable of adapting the machines to process design and control.

FLOWTRAN hasn't been without skeptics, but Proctor believes that their influence has been minimal. As with all such systems, initial development costs and the cost of maintaining FLOWTRAN have had to be justified. Although management support is necessary, there must also be eventual economic justification through demonstrated profits. Proctor says that in the case of FLOWTRAN, justification hasn't been so difficult as originally feared. In addition to better designs with demonstrated capital savings, FLOWTRAN has provided additional savings of professional time with the quick communication that computerized simulation affords.

Another company that has made a considerable investment in its own abilities to simulate processes and perform optimization studies is Shell Oil. At last fall's meeting of the American Institute of Chemical Engineers in Washington, D.C., Shell's Charles R. Cutler, process control manager, manufacturing and technical department, listed the advantages of computer simulation but admitted that simulation isn't used so often as he would like because of the lack of skill in its use.

Complex mathematics are central to process modeling, control

Dynamic chemical processing systems and their associated controls are characterized by great numbers of differential equations. The most frequently encountered are constant coefficient, linear, partial, and ordinary differential equations. These may be quite complex, even for simple situations. Rigorous analytical solutions are sometimes possible, but it has become common to use operational techniques that have evolved from the pioneering work of English physicist Oliver Heaviside in the late 1800s. The modern form of this operational calculus is concentrated in the Laplace transform and its manipulation. Other transforms are also useful in process design and control, notably the z-transform and the Fourier transform.

The basic idea behind the Laplace transform is to "transform" a relatively difficult differential equation into a simpler mathematical form, solve the equation, and then, with an inverse transform, recover the solution in the original mathematical domain. The procedure is remotely analogous to the use of logarithms, in which many-factor multiplications and divisions can be reduced to simple addition and subtraction. The solution is recovered in the original domain with the antilogarithm. Solution of a linear differential equation with the Laplace transform thus often becomes a matter of solving an algebraic equation.

The Laplace transform of some function F(t) is obtained for all positive values of the variable t by multiplying F(t) by e^(-st) and integrating from zero to infinity:

L[F(t)] = ∫₀^∞ e^(-st) F(t) dt = f(s)

The transform is some function f(s) in a new s-domain. Since the s-domain is generally easier to operate within, the solution is usually simpler. By inversion the original function can be recovered:

F(t) = L⁻¹[f(s)]

Large tables of transforms have been generated for most of the functions usually encountered. When unusual functions are encountered, the transforms can be found from the definition.

Of particular interest to designers of control systems is the extension of the methods of the Laplace transform to include complex numbers—that is, when s = x + iy. F(t) still represents a real function of a positive real variable, but f(s) can assume complex values.

There are endless variations in the use of operational methods, and considerable mathematical facility is required of process modelers and designers of control systems. Accomplished practitioners have become adept at designing control systems and modeling processes selectively in either time or Laplace domains. The use of transforms of all kinds has been greatly enhanced with the arrival of the digital computer, which permits rapid trials with approximation methods for integral evaluations.
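As a hypothetical worked example, not one taken from the article, consider a first-order process with time constant τ and gain K responding to a step disturbance of size A. The transform turns the differential equation into algebra in s, and a standard table entry recovers the time response:

```latex
\begin{aligned}
\tau \frac{dy}{dt} + y &= K A, \qquad y(0) = 0 \\
\tau s\,Y(s) + Y(s) &= \frac{K A}{s}
\quad\Longrightarrow\quad
Y(s) = \frac{K A}{s(\tau s + 1)} = K A\left(\frac{1}{s} - \frac{1}{s + 1/\tau}\right) \\
y(t) &= \mathcal{L}^{-1}\!\left[Y(s)\right] = K A\left(1 - e^{-t/\tau}\right)
\end{aligned}
```

The partial-fraction step is exactly the kind of table lookup that the large transform tables mentioned in the box make routine.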
However, he says, competitive pressures will force companies to upgrade their engineering skills because they can't ignore the need for advanced computer controls. One of the biggest challenges facing chemical engineers today is cost reduction, and one of the best ways to achieve it is through better process control.

To Shell, Cutler says, process control is computerized optimization in real time. That means that many management functions have been integrated into the control scheme of things. Furthermore, multivariable controls have become the norm, and there appears to be little intimidation by the complexities usually encountered with multivariable systems.

Further echoing the sentiments of Weekman, Foss, and others, Cutler emphasizes that chemical process control is essentially different from that for other kinds of systems, primarily because of the lack of precise enough data on the dynamic chemical systems. The systems, he points out, are usually very large and involve many property interactions at several levels with ever-present nonlinearities. For these reasons, as well as a necessity to employ multivariable control routinely, chemical process control cannot subsist on theory borrowed from other disciplines.

In support of Shell's development of advanced control systems, Cutler says, the company has patented a Dynamic Matrix method of modeling for predictive process control in multivariable systems. Associated with it is a multivariable control algorithm based on actual response data from a plant.

Simulation is an aid in designing plants and process control

Chemical process modeling is an established means of design, construction, control, and operation. It is essential for dynamic processes, which are typical of most chemical reaction systems. The description of dynamic behavior is somewhat dependent on the mathematical resources applied and the time, talent, computer power, and money available. The basic idea is to generate a quantitative conceptual equivalent of the physical process. If the model is good enough, complete design can be achieved and tested without actually building anything. This is seldom the case, but the economic importance of good modeling is obvious, particularly since it permits optimization of alternate designs before actual construction of a plant.

Models can be built at varying levels of abstraction. The least abstract is the full-size operating plant. For some simple, small plants, this may be the best "model" to build. At a high level of abstraction, a set of equations and numerical procedures constitute a purely mathematical model. Most models fall between the extremes. Imperfect mathematics and inadequate physical and chemical data typically require a succession of scaleups, each intermediate stage being a kind of mid-course correction between the initial idea and the final design. Ideally, the control requirements for the plant are developed along with the process design.

Mathematical descriptions used by modelers are fundamental physical and chemical laws with time derivatives included. Their use is more or less rigorous depending on circumstances. Rigor is usually desirable but is frequently compromised for practical reasons. "Engineering judgment" inevitably requires some sacrifice of mathematical and scientific rigor for the sake of progress. One of the problems that often appears is the intractable equation. Obviously, inclusion of an equation that cannot be solved is futile. This is not to say that some of the more "operational" means of equation solving are ignored. The general list of equations usually encountered includes those for continuity, energy conservation, motion, transport, state, chemical and phase equilibrium, and chemical kinetics. There are also numerous empirical and semiempirical relationships used to correlate and estimate properties when factual data are not available.

Eventually it is necessary to verify the model. If the designer is duplicating a well-used design, the confidence level may be high enough to proceed directly to the commercial plant. If the design is new and untried, there comes the moment of truth when design is committed to construction. This usually means a lot of intermediate piloting and fine-tuning. All along the way, the simulations have been similarly refined to reflect the inevitable discrepancies between real and ideal systems.
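As a minimal sketch of the kind of lumped, time-derivative balance the box describes, the fragment below models a single well-mixed tank responding to a step change in inlet concentration. The system, the numbers, and the names are invented for illustration and are not taken from any simulator mentioned in this article.

```python
# Lumped-parameter dynamic model of a single well-mixed tank.
# Component balance:  V * dC/dt = q * (C_in - C)
# All numbers are invented for illustration.

V = 10.0      # tank volume, m3
q = 2.0       # volumetric flow through the tank, m3/min
C = 0.0       # outlet (= tank) concentration, kmol/m3
C_in = 1.0    # inlet concentration after a step disturbance, kmol/m3
dt = 0.1      # integration step, min

for step in range(301):                  # simulate 30 minutes
    t = step * dt
    if step % 50 == 0:
        print(f"t = {t:5.1f} min   C = {C:.3f} kmol/m3")
    dCdt = q * (C_in - C) / V            # time derivative from the balance
    C += dt * dCdt                       # explicit Euler step
```

The residence time V/q plays the role of the time constant: the outlet concentration approaches the new inlet value as 1 - e^(-qt/V), the same first-order step response derived in the transform example above.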
Companies that don't have in-house capabilities for modeling or optimization can avail themselves of a great variety of services from numerous companies in the simulation and optimization services business. The kinds of assistance range from cassettes to contract simulations for whole plants. Two of the major companies in this relatively new business are ChemShare and Simulation Sciences.

Simulation Sciences claims to be the world leader in process simulation programs and support services, primarily for the petroleum refining and gas processing, petrochemical, chemical, and synthetic fuels industries. According to Clifford L. Kirk, director of marketing for Simulation Sciences, the company has been in the simulation business since 1966 and introduced its first proprietary simulator, PROCESS, two years later. PROCESS is an all-purpose computer simulation program that performs rigorous mass and energy balance calculations. The user builds a flowsheet of the process, unit by unit, with a wide variety of constraints being specified along the way. The final design, or any part of it, may be optimized as needed. Another of the company's programs, HEXTRAN, deals specifically with heat recovery in new or retrofit designs. Simulation Sciences also markets several programs developed by Chevron Research for refinery planning and engineering calculations.

Part of the PROCESS package, and typical of such programs, is an extensive library of component properties and pure component data, along with computer
routines for estimating properties of single components and mixtures and for predicting phase equilibria. Thermodynamic functions are also included. However, Kirk emphasizes that PROCESS and the other programs offered by the company do not include kinetics information per se. These areas of modeling still remain somewhat intractable.

A study recently completed by Imperial Chemical Industries in the U.K. indicates that a typical process simulator contains about 150,000 lines of FORTRAN code and about 1000 subroutines that include extensive libraries of individual unit operations models and physical property models. As impressive as this may seem, it also suggests extensive chaos that shows little chance of improving soon.

The ICI study notes that many subjects in process technology have yet to be embraced by computers and programs in a meaningful way. Furthermore, for those subjects that have been computerized in any way, there appears to be almost too much enthusiasm and energy generated. The overall picture is of large masses of problem-solving resources being poured into narrow problem areas.

Most of the available simulators do little more than make steady-state mass and energy balances of the process being simulated. In the more common sequential modular simulators this is usually a trial-and-error procedure, and failure to converge is a common problem. Even so, says the ICI study, the converged trial-and-error balance remains one of the more important parts of process study.

Reinforcing Westerberg's observation on the same subject, the ICI study notes that the most popular subject for simulation and optimization remains heat exchanger networks. Next are distillation column networks. Few other areas have shown much real progress. Of particular difficulty are several classes of chemical reactors, multiphase equilibrium, dynamic process simulation, and batch processing.

Computer-aided design has become established, albeit with difficulty, at several levels of complexity. A study by a group of engineers at Fluor Corp. indicates that drafting with a computer-aided design system yields a 13% saving in designers' time, with another 5% being saved on making isometric piping drawings alone. The total 18% savings in professional effort is attributed to methodology alone and has nothing to do with mathematical simulation as such.

One of the more frequent uses for a large-plant simulator is in making optimization studies, and publicity has suggested that whole-plant optimization has become routine. But however desirable such use may be, it isn't clear that it is either routine or even occasionally encountered. David E. Haskins, an internal consultant with Honeywell's process management division, for instance, doubts that optimization of entire plants may ever come to pass. For one reason, it isn't always clear what is meant by optimization, he says. Haskins considers it an approach to perfection, in the sense that profit is maximized by choosing the "best" set of process variables around which to control a plant. That isn't always easy to do.

The efficacy of optimization often depends on the size and nature of the model being used. Single-variable models are the simplest and most frequently employed, but they are seldom realistically indicative of actual plant operations. There are very few examples of whole-plant optimization on which to draw. On-line whole-plant optimization is all but out of the question until better simulation can be achieved. The list of constraints acting against it is long.
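The trial-and-error character of the sequential modular balance calculations described above can be illustrated with a toy recycle loop. The sketch below is hypothetical, not an excerpt from PROCESS or any other simulator named here: the recycle stream is "torn," its flow is guessed, and the unit modules are evaluated in sequence until successive substitution closes the balance.

```python
# Toy sequential modular flowsheet: mixer -> reactor -> separator,
# with part of the separator outlet recycled to the mixer.
# The recycle ("tear") stream flow is guessed and updated by successive
# substitution until it converges. All numbers are invented.

FEED = 100.0        # fresh feed, kmol/h
CONVERSION = 0.6    # fraction of reactant converted in the reactor
SPLIT = 0.8         # fraction of unconverted reactant recycled

def flowsheet_pass(recycle_guess):
    """One pass through the unit modules; returns the recomputed recycle flow."""
    mixer_out = FEED + recycle_guess               # mixer
    unreacted = mixer_out * (1.0 - CONVERSION)     # reactor
    recycle = unreacted * SPLIT                    # separator splits the unreacted part
    return recycle

recycle = 0.0                                      # initial tear-stream guess
for iteration in range(1, 101):
    new_recycle = flowsheet_pass(recycle)
    if abs(new_recycle - recycle) < 1e-6:          # convergence test on the tear stream
        print(f"converged after {iteration} iterations: recycle = {new_recycle:.3f} kmol/h")
        break
    recycle = new_recycle
else:
    print("failed to converge")                    # the common failure the ICI study notes
```

With a less forgiving flowsheet the same loop can fail to close, which is the convergence failure the ICI study cites as a common problem.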
Haskins notes that in most cases the optimum operating conditions for a plant probably have never been realized, and there is no firm idea of what constitutes the optimum. What optimizing is done depends on use of a less-than-perfect model by estimating with some often esoteric computer operations. Again, echoing Weekman and Foss, Haskins believes that too few people have a good understanding of either the process itself or the computerized techniques for optimization. Good optimization requires a team approach that makes bigger demands on available manpower than a lot of managers realize.

A great misconception that often surfaces is that there are models available for just about everything and that once installed they require little care. In fact, says Haskins, good models are hard to find and even harder to maintain. This is not to say that a lot of models are not offered. They are. But even when a good model does appear, it still must be verified, and that may be more difficult than devising the model in the first place.

The computer resources required for modeling, optimization, control, and related activities are probably adequate for the present. Computer memories are generally big enough for most jobs, but no computer has an infinite memory and none have infinitely fast computational speed. Optimizing off-line usually is limited by the necessity to allocate memory. On-line optimization is limited by computational speed.

Haskins foresees in the near future a slow increase in on-line optimization equipment and systems, and the simplest plants actually may be optimized on-line. However, most optimizing still will be done off-line in connection with debottlenecking projects. On-line optimization of any real complexity is far in the future. The biggest problem will be obtaining enough well-trained manpower.

Optimization of subsystems, however, has been in development for a long time. At Du Pont, for example, computerized simulation is employed for the operation and maintenance of processing plants. In the early 1970s, Du Pont's engineers began developing a general-purpose model that could be used by relatively untrained personnel and dubbed it SAGE (System Availability Generalized Evaluator). In the first stage of a SAGE analysis, the operating units are characterized. In the second stage, all connections between units are added to yield a simulated plant. Most SAGE analyses apparently have been done on operational plants after the fact, but there is no reason to believe that they might not be used on new designs as well.

If the computer has increased both the scope and facility of process and control designers, it also has presented them with new problems. Those mostly relate to the complexity of their professional environment. Washington University's Motard, who is chairman of the chemical engineering department there and a practitioner of computerized engineering, observes that despite the proliferation of computerized means for design and control, there isn't a corresponding increase in productivity. One reason is that the computerized systems have brought a related need for better organization of the professional tools of the trade.

Engineering database management system has multitier architecture (diagram: user programs and an interactive mode, a top-level interpreter, views, nonprocedural and procedural query, data insertion, retrieval, deletion, and modification, schema manipulation, and low-level utilities for data structures, double-precision arithmetic, images, assertions, axioms, and crash recovery)
Leeds & Northrup terminal is typical of modern control

Motard believes that a new approach is needed to make better use of available computing power, a refrain that becomes more familiar each day. The common experience, he explains, is that process and project engineering have evolved into a proliferation of standalone engineering computer systems. There has been little or no attention to management, communication, and information control between systems except by way of human couriers. This environment typically involves one user and one execution, employs limited system records with several computing languages, and operates on more than one computer configuration. That means that process and control engineers typically spend most of their time manipulating information—as much as 70% of their time, by some estimates.

Motard says that successful use of computerized systems of all kinds, from computer-aided design techniques to the most abstract calculations, depends a lot on how well the computer system manages the many databases that are involved. Almost every database of interest to a process or control engineer probably has been computerized independently and requires a good bit of "translation" to be used. Database management systems may not be new, but Motard stresses that they are at present inadequate for engineering purposes on several counts.

There have been attempts to develop a database management system for engineering. ICI, for instance, began developing its Process Engineering Data Base in the late 1970s.
In 1983, ICI and ChemShare agreed to market that database jointly as part of a DESIGN/2000 process simulation package. Motard says that the database achieves some of the desired objectives but falls short of the real need.

In specifying the characteristics of an acceptable database management system for engineering, Motard notes that most of the organizational requirements are related to system performance during the transition to a new era. The technical issues are more formidable. They include problems in communicating graphics and the crucial matter of a universal organization of process data. No two data banks appear to organize their data in the same way, and there is a corresponding proliferation of hardware and software to compensate, often with mediocre results.

Commercial database management systems are described by Motard as being of the network type, with no provision for the dynamic redefinition or organization required by engineering users. Commercial data are either of the character format or of the single-entry variety. In contrast, engineering databases usually involve floating point operation, integers, double precision, complex numbers, vectors, matrices, and graphic data. There are highly intricate mathematical operations with engineering data that simply cannot be handled with present commercial systems. Databases in the typical commercial system are disjointed, making it impossible for a single program to access data from more than one database during a single execution without reprocessing to create temporary files and postprocessing to restore data in the multiple databases.

Having surveyed the database problems, Motard and his associates have begun developing a prototype engineering database and a process engineering data model. It is being written in the Pascal language, which is adaptable to a great variety of large and small computers. The first version of the system supports a variety of data types, and additional types are easily defined. The system has two interfaces—an interactive terminal and an application program. Future improvements in the prototype will implement assertions, axioms, and views. Assertions are built-in requirements that a datum satisfy a formula wherever it is inserted, modified, or deleted. An axiom is similar, except that the constraint is relational rather than arithmetic. A view is an ad hoc virtual relation (table) constructed as needed from other views and actual relations. Views can be read only.

Motard stresses that the prototype is still in the first stages of development. But dialogue with companies in the process industry has established a set of near-term objectives.

Efforts such as those of Motard's group, combined with others in the chemical process design-simulation-control community, are clearly leading to a deeper integration of the three functions. But there are broader implications as well. The design-simulation-control frontier is not immune to broader economic influences. It is part of a much wider reconstitution of industry. A high-tech future isn't restricted to silicon chips. It includes some very exotic chemistry and some equally impressive engineering to commercialize it.