contains a so-called 'transfer function' which subjects the value of the input to a mathematical transformation. The output from each transfer function within a layer is summed and passed on to the next layer. The back-propagation method compares the network output with the desired (target) output. The error is assumed to be a consequence of errors in the weightings assigned to each processing element. By repeated presentation of further input data, these weightings are adjusted until the error is reduced to an acceptable value.
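The weight-adjustment cycle described above can be sketched in a few lines. The network size (one input, three hidden elements, one output), sigmoid transfer function, learning rate and training curve below are invented for illustration and are not taken from any of the cited studies.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))   # the 'transfer function'

# Toy training data: one normalised input, one normalised target output
xs = [i / 7 for i in range(8)]
ys = [0.5 * (1 - math.cos(math.pi * x)) for x in xs]

w1 = [random.gauss(0, 0.5) for _ in range(3)]  # input -> hidden weights
b1 = [0.0] * 3
w2 = [random.gauss(0, 0.5) for _ in range(3)]  # hidden -> output weights
b2 = 0.0
lr = 1.0

for _ in range(10000):
    for x, y in zip(xs, ys):
        # Forward pass: weighted sums passed through the transfer functions
        h = [sigmoid(w1[j] * x + b1[j]) for j in range(3)]
        out = sigmoid(sum(w2[j] * h[j] for j in range(3)) + b2)
        # Error between network output and target output, propagated
        # backwards to adjust the weighting of each processing element
        d_out = (out - y) * out * (1 - out)
        for j in range(3):
            d_h = d_out * w2[j] * h[j] * (1 - h[j])  # uses pre-update w2
            w2[j] -= lr * d_out * h[j]
            w1[j] -= lr * d_h * x
            b1[j] -= lr * d_h
        b2 -= lr * d_out

# Mean squared error after training
mse = sum((sigmoid(sum(w2[j] * sigmoid(w1[j] * x + b1[j]) for j in range(3))
                   + b2) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
```

Repeated presentation of the training pairs drives the error down until it settles at an acceptably small value.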
Syu and Tsao (1993) applied back-propagation neural network modelling to batch growth of the bacterium Klebsiella oxytoca. The model used initial glucose concentration in a two-element input layer, a three-element hidden layer and an eight-element output layer which gave the biomass concentration over the first 8 hours. Using seven sets of data for learning, the network was shown to be capable of both simulation and prediction when tested against a further seven sets of data. In a later communication (Syu et al., 1994), application to the prediction of brewery fermentation was described. This also used a three-layer back-propagation neural network, in this case in a 3-3-7 configuration: wort free amino-nitrogen, dissolved oxygen concentration and initial viable yeast count were used as input, and ethanol concentration at seven 24 h intervals between one and seven days as output. The network was trained using 18 sets of fermentation data. New sets of initial fermentation conditions, different from those used for training, were presented to the network and actual and predicted ethanol concentrations were compared. At production scale (1500 hl), a good correlation between actual and predicted ethanol concentrations was obtained up to 96 h, after which significant deviation occurred. The discrepancy was ascribed to the fact that temperature was not included as an input to the network.
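As an illustration of the 3-3-7 configuration, the sketch below wires three normalised initial conditions (free amino-nitrogen, dissolved oxygen, viable yeast count) through three hidden elements to seven outputs, one per daily ethanol reading. The weights here are random placeholders; a trained network would carry weights fitted to the 18 fermentation data sets.

```python
import math
import random

random.seed(1)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Placeholder weight matrices for the 3-3-7 architecture
W1 = [[random.gauss(0, 1) for _ in range(3)] for _ in range(3)]  # input -> hidden
W2 = [[random.gauss(0, 1) for _ in range(7)] for _ in range(3)]  # hidden -> output

def predict(conditions):
    """conditions: three normalised values (FAN, dissolved O2, yeast count).
    Returns seven normalised ethanol concentrations, days 1 to 7."""
    hidden = [sigmoid(sum(c * w for c, w in zip(conditions, col)))
              for col in zip(*W1)]
    return [sigmoid(sum(h * w for h, w in zip(hidden, col)))
            for col in zip(*W2)]

profile = predict([0.6, 0.8, 0.5])
```

The fixed output grid is why the published network could only report ethanol at the seven preset 24 h intervals rather than at arbitrary times.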
The requirement to consider temperature has been addressed by Gvazdaitis et al. (1994), who reported that in large fermentation vessels temperature fluctuations limited the efficacy of predictive models. To improve attemperation it was necessary to have an accurate model of the exothermy due to fermentation and precise knowledge of the cooling behaviour of the vessel. The progress of fermentation was predicted using a neural net with nine input elements, a single hidden layer of ten elements, and the derivative of present gravity and the diacetyl concentration as outputs. Vessel cooling behaviour was modelled using a separate difference equation, which encompassed cooling during both primary fermentation and the diacetyl stand. The combination of the two models produced a dynamic temperature controller which minimised coolant costs with no loss of fermentation control.
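A vessel-temperature difference equation of the kind described might take the following form, balancing fermentation exothermy against jacket cooling and ambient losses. The coefficients, time step and simple on/off control rule are assumptions chosen for illustration, not values from Gvazdaitis et al.

```python
def step_temperature(T, q_ferm, coolant_on, dt=0.25,
                     k_cool=0.8, k_loss=0.05, T_ambient=15.0):
    """Advance vessel temperature T (deg C) by one step of dt hours.

    q_ferm     -- exothermic heating from fermentation (deg C per hour)
    coolant_on -- 1 while the cooling jacket is active, else 0
    (k_cool, k_loss, T_ambient are hypothetical vessel parameters)
    """
    dT = q_ferm - k_cool * coolant_on - k_loss * (T - T_ambient)
    return T + dt * dT

# Simple on/off attemperation holding a 12 deg C setpoint
T, setpoint = 12.0, 12.0
for step in range(400):                  # 100 h in 15-minute steps
    q = 0.6 if step < 240 else 0.1       # exothermy tails off late in fermentation
    T = step_temperature(T, q, coolant_on=1 if T > setpoint else 0)
```

In the published scheme the exothermy term would come from the neural net's prediction of fermentation progress rather than the fixed schedule used here, which is what makes the combined controller dynamic.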
Software-based fermentation control systems with a different philosophy from that of neural networks are those based on mechanistic models. Several mathematical models of the biochemical reactions which underpin brewery fermentation have been proposed (Engasser et al., 1981; Bezenger & Navarro, 1988; Gee & Ramirez, 1988; Jarzebski et al., 1989; Pascal et al., 1995). Using relationships given in these models, bilinear simulations of brewery fermentation have been developed (Johnson & Burnham, 1996; Johnson et al., 1996a, b). These mathematically based systems allow prediction of the effects of all major fermentation variables on the times needed to achieve attenuation gravity and diacetyl specification. The development is still in its infancy, but it is anticipated that the simulations will be used to develop software control models for application in automatic on-line regimes.
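In the spirit of these mechanistic models, a minimal Monod-type kinetic sketch is given below: sugar drives biomass growth, and ethanol is formed in proportion to the sugar consumed. The rate constants and yield coefficients are illustrative assumptions and do not come from the cited papers, which use considerably richer reaction schemes.

```python
def simulate(S0=100.0, X0=1.0, hours=120, dt=0.1,
             mu_max=0.15, Ks=20.0, Yxs=0.1, Yps=0.45):
    """Euler-step a simple Monod growth model.

    Returns (sugar, biomass, ethanol) in g/l after `hours`.
    All parameters are hypothetical illustration values.
    """
    S, X, P = S0, X0, 0.0
    for _ in range(int(hours / dt)):
        mu = mu_max * S / (Ks + S)       # Monod specific growth rate (per hour)
        growth = mu * X * dt             # biomass formed this step
        consumed = min(growth / Yxs, S)  # sugar used cannot exceed what is left
        X += consumed * Yxs
        S -= consumed
        P += Yps * consumed              # ethanol yield on consumed sugar
    return S, X, P

S, X, P = simulate()
```

A model of this kind, unlike a trained network, lets each variable (pitching rate, original gravity, temperature-dependent rate constants) be perturbed explicitly to predict its effect on the time to reach attenuation gravity.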