Batch processes have been around for millennia, probably since the beginning of human civilization. Cooking, bread making, tanning, and wine making are some of the batch processes that humans have relied upon for survival and pleasure. The term "batch process" is often used generically to refer to both batch and fed-batch operations. In the former, all ingredients used in the operation are charged to the processing vessel at the beginning of the operation, and no material is added or withdrawn during the batch run. In the latter, material can be added during the run. For brevity, the term batch is used in this text to refer to both batch and fed-batch operations when there is no need to distinguish between them. The term fed-batch is used to denote the addition of material during some portions of an otherwise batch operation.
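The distinction can be made concrete with a minimal sketch. The snippet below integrates a simple volume balance dV/dt = F(t) for a vessel; all numbers (initial charge, feed rate, run length) are illustrative assumptions, not values from any real process. In the batch case the feed is zero throughout the run; in the fed-batch case material is added continuously.

```python
# Minimal sketch contrasting batch and fed-batch operation.
# All numerical values are illustrative, not from a real process.

def simulate_volume(v0, feed_rate, t_end, dt=0.1):
    """Integrate dV/dt = F(t) with explicit Euler steps.

    v0        : initial charge volume (L)
    feed_rate : function t -> feed flow (L/h); zero for a pure batch run
    t_end     : run duration (h)
    """
    v = v0
    n_steps = int(round(t_end / dt))
    for i in range(n_steps):
        v += feed_rate(i * dt) * dt
    return v

# Batch: all material charged up front, nothing added during the run.
batch_final = simulate_volume(100.0, lambda t: 0.0, t_end=10.0)

# Fed-batch: substrate fed at 2 L/h throughout the run.
fed_batch_final = simulate_volume(100.0, lambda t: 2.0, t_end=10.0)

print(batch_final)      # 100.0 L, unchanged
print(fed_batch_final)  # about 120.0 L after 10 h of feeding
```

The same skeleton extends naturally to substrate and biomass balances once reaction kinetics are attached, which is the route taken by the fermentation models discussed later in the book.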
Batch processes received renewed attention in the second half of the twentieth century. Specialty chemicals, materials for microelectronics, and pharmaceuticals are usually manufactured using batch processes. One reason for this revival is the advantage of batch operation when fundamental knowledge is limited and detailed process models are not available. Batch processes are easier to set up and operate with limited knowledge than continuous processes, and their performance can be improved by iterative learning from earlier batch runs. A second reason is the increasing pressure to start commercial production of novel materials as soon as patents have been issued, to recover research and development costs before competing products affect prices. Another reason is the ability to use the same facilities for many products with little or no hardware modification. Many pharmaceutical products are produced in limited quantities, and the plant manufactures a specific product for a short period of time before switching to another product. Batch operation is usually more efficient than continuous operation for frequent product changes and small production volumes.
Although batch processes are simple to set up and operate, modeling, monitoring, and controlling them is quite challenging. Consider a simple operation like cooking spaghetti. The basic steps involved are simple: heat some water, immerse the spaghetti strands in boiling water, drain the water after the spaghetti is cooked, add oil and sauce, and serve. But the actual process to make good spaghetti is more complex and requires many well-timed decisions. What should be the temperature of the water when the spaghetti strands are added, how long should the spaghetti be cooked in water, and how much oil and what other seasoning and ingredients should be added to the spaghetti sauce? This is a process with several phases (operations in the same vessel for a specific activity such as cooking or fermentation) and stages (operations in different vessels for different activities such as raw material preparation and product separation). The landmarks denoting the end of one phase and the beginning of the next should be monitored for proper, timely actions. For example, spaghetti should not be added to water that is not hot enough; otherwise the strands will stick to each other. A good landmark is the boiling of the water, which can be detected easily, as opposed to the water temperature reaching 200 °F. The latter would work equally well for the cooking operation but would be more difficult to detect, monitor (a thermometer would be needed), and regulate. The duration of keeping the spaghetti in hot water will change because of many factors. These include the relative amounts of water and spaghetti (the initial charge of ingredients), the tenderness of the cooked spaghetti (a quality variable that varies with personal taste and weight watching; it is said that absorption of the carbohydrates by the body increases as the spaghetti gets more tender), the type of spaghetti flour (whole wheat or bleached flour), and the amount of heat provided (one can turn the heat off and keep the strands in hot water longer).
Consequently, while developing an optimal reference trajectory for this example process, one may have to take into consideration variations in batch run duration and other factors that influence the degree of cooking. Developing a detailed model of this simple process based on first principles may be even more challenging; a simple empirical model based on data may be accurate enough for most needs. Most industrial batch processes have more process and quality variables and more stringent operational and financial constraints. Consequently, the development of reference trajectories, the detection of landmarks marking phase changes, quality assessment, and the monitoring of process and product safety are much more challenging.
This book focuses on batch process modeling, monitoring, fault diagnosis, and control. The discovery of a new drug, such as a new antibiotic, or of a new manufacturing method that revolutionizes yield and productivity is critical for commercial success. Biology, chemistry, bioinformatics, and biochemical engineering provide the foundations for these advances. But large-scale commercial production with consistent product quality, stringent process and product safety requirements, and tight production schedules necessitates a different set of skills built upon systems science, statistics, and control theory. The focus then shifts to finding optimal reference trajectories and operating conditions, and to manufacturing the product profitably in spite of variations in raw materials and ambient conditions, malfunctions in equipment, and variations in operator judgement and experience. Techniques in model development, signal processing, data reconciliation, process monitoring, fault detection and diagnosis, quality control, and process control need to be integrated and implemented. The book provides a unified source to introduce various techniques in these areas, illustrate many of them, and discuss their advantages and limitations.
The book presents both fundamental and data-based empirical modeling methods, several monitoring techniques ranging from simple univariate statistical process control to advanced multivariate monitoring techniques, many fault diagnosis techniques, and a variety of simple to advanced process control approaches. Techniques that address critical issues such as landmark detection and data length adjustment, as well as advanced paradigms that merge monitoring and diagnosis activities through a supervisory knowledge-based system, are discussed. The methods presented can be used in all batch processes by paying attention to the special characteristics of a specific process. The focus of the book is on batch fermentation and pharmaceutical processes. Penicillin fermentation is used as a case study in many chapters throughout the book. Various paradigms are introduced for each subject to provide a balanced view; some are based on the prior research of the authors, while others have been proposed by other researchers. Appropriate examples and case studies are presented to illustrate some of the methods discussed. A dynamic simulator for batch penicillin fermentation and batch process monitoring software are provided on the Web. Readers are invited to visit the Web site of one of the authors at www.chee.iit.edu/~cinar/batchbook.html for the penicillin fermentation simulator and software tools for supervision of batch process operations.
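To give a flavor of the simplest monitoring tool mentioned above, the sketch below implements a univariate Shewhart-style check: control limits are computed from in-control reference data, and new observations falling outside them are flagged. The temperature readings and the three-sigma limits are hypothetical assumptions for illustration only, not data from a real fermentation.

```python
# Minimal univariate Shewhart-style monitoring sketch.
# All measurements are hypothetical, not from a real process.
import statistics

def shewhart_limits(reference_data, k=3.0):
    """Center line and +/- k-sigma control limits from in-control data."""
    mu = statistics.mean(reference_data)
    sigma = statistics.stdev(reference_data)
    return mu - k * sigma, mu, mu + k * sigma

def out_of_control(observations, lcl, ucl):
    """Return indices of observations outside the control limits."""
    return [i for i, v in enumerate(observations) if v < lcl or v > ucl]

# Hypothetical in-control temperature readings (deg C) from earlier batches.
reference = [25.1, 24.9, 25.0, 25.2, 24.8, 25.1, 24.9, 25.0]
lcl, center, ucl = shewhart_limits(reference)

# New batch: the last reading drifts well outside the limits.
new_batch = [25.0, 25.1, 24.9, 26.5]
print(out_of_control(new_batch, lcl, ucl))  # [3]
```

Univariate charts like this ignore correlations among process variables; the multivariate techniques developed later in the book address exactly that limitation.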
This chapter continues with a discussion of batch process operations in Section 1.1. Section 1.2 provides introductory remarks about the main focus areas of the book: modeling, monitoring, control, and diagnosis. Section 1.3 introduces the penicillin fermentation process that is used in many case studies in various chapters. The last section (1.4) of the chapter provides an outline of the book and road maps for readers.