The use of data analytics is on the verge of becoming an essential factor in metals production. But what precisely is data analytics, or “analytics”? According to the Oxford Dictionary, analytics is the “systematic computational analysis of data or statistics” and has a wide range of applications. We can already see data analytics working its magic in image and speech recognition on a day-to-day basis. Search engines such as Google are becoming increasingly capable of identifying the content of pictures uploaded to webpages—an area expected to develop further in the next few years. Today’s smart devices, such as phones and voice assistants, rely on artificial intelligence and big data, both related to but not identical with data analytics, to identify commands and enable dictation functionality.
When applied to the world’s steel plants, data analytics will facilitate the next step in overall performance optimization—making the production process more intelligent, more adaptive, and more efficient. This increase in machine-applied intelligence will lead to even greater consistency in product quality and help speed up product development, especially that of higher-quality steels. New applications will support operators in finding the root cause of any production issue they may be investigating and enable smaller production runs with minimal losses in productivity. Data analytics will also make it easier to adjust a plant’s production process to changes in the raw-material mix or to an extended product mix. As the technology matures, additional application areas will become apparent.
Data analytics: “The systematic computational analysis of data or statistics.”
Artificial intelligence: “The theory and development of computer systems able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.”
Black box: “Any complex piece of equipment, typically a unit in an electronic system, with contents that are mysterious to the user.” (In this context; “black box” could also refer to a flight recorder.)
Big data: “Vast data sets that may be analyzed computationally to reveal patterns, trends, and associations, especially relating to human behavior and interactions.”
Deep learning: “A type of machine learning based on artificial neural networks in which multiple layers of processing are used to extract progressively higher-level features from data.”
Pattern recognition: “The process by which a human or animal brain or a computer detects and identifies ordered structures in data, visual images, or other sensory stimuli.”
Data is the Fuel
Of course, there is one crucial prerequisite to implementing data analytics-based solutions: data. The availability of large amounts of data is the fuel that powers the analytics engine. For example, Primetals Technologies has developed a “heat cloning” solution for electric arc furnaces, but historical production data going back at least half a year is required for this technology to realize its full potential. These solutions function similarly to artificial intelligence applications that must be extensively “trained” on a vast number of real-world scenarios before they can effectively build simulations. Data analytics can only find meaningful correlations if its algorithms are let loose on a decently sized data lake. Once these preconditions are met, however, data analytics technologies can unveil the hidden secrets of metals production, including correlations no one even dreamed existed, making the world of metals more predictable and transparent than ever before.
This raises the question: what is the most powerful data analytics algorithm? In 1997, the American mathematicians David Wolpert and William Macready showed that no single algorithm consistently outperforms all others; in their view, circumstances determine which approach is best suited to deliver optimum results. They called this idea the “No Free Lunch” theorem and demonstrated its validity in various contexts. Their conclusion states, “If an algorithm performs well on a certain class of problems, then it necessarily pays for that with degraded performance on the set of all remaining problems.”
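The “No Free Lunch” idea can be made concrete with a toy experiment. In the sketch below, two deliberately simple forecasting rules are compared on two synthetic data sets: a persistence rule (predict the previous value) and a running-mean rule. Each rule wins on one data set and loses on the other. All data and both rules are invented for illustration; this is not the construction Wolpert and Macready used in their proof.

```python
# Toy illustration of the "No Free Lunch" idea: neither of two simple
# forecasting rules wins on every data set. All data is synthetic.

def persistence_forecast(series):
    # Predict each value as equal to the previous one.
    return [series[i - 1] for i in range(1, len(series))]

def mean_forecast(series):
    # Predict each value as the running mean of all earlier values.
    return [sum(series[:i]) / i for i in range(1, len(series))]

def mse(pred, actual):
    # Mean squared error between predictions and observations.
    return sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(pred)

# Data set A: a steadily drifting process -- persistence should do well.
drifting = [float(i) for i in range(20)]
# Data set B: noise around a fixed level -- the running mean should do well.
noisy = [5.0 + (-1.0) ** i for i in range(20)]

for name, series in [("drifting", drifting), ("noisy", noisy)]:
    actual = series[1:]
    print(name,
          "persistence:", round(mse(persistence_forecast(series), actual), 3),
          "mean:", round(mse(mean_forecast(series), actual), 3))
```

On the drifting series the persistence rule has the lower error; on the noisy series the running mean does. Whichever rule you declare “best” pays for it on the other data set.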
While choosing the correct algorithm for the task at hand is a challenge in and of itself, the use of highly complex models can present additional problems. Their intricate inner workings are hard to explain without visualizations of their architecture and a detailed explanation of their features. Even with these aids in place, data analytics algorithms appear to many people as “black boxes”—they seem to perform a certain kind of magic that remains largely opaque. This black-box effect leads to common misunderstandings of what data analytics can and cannot do, which in turn creates unrealistic expectations. Often, programmers have to remind their colleagues and managers that there is not much they can do without sufficient amounts of high-quality data. No matter how sophisticated an algorithm might be, it cannot turn a messy and incomplete dataset into a treasure chest, like an alchemist turning lead into gold.
It is also important to acknowledge that data-based models reach their full potential in steel production only when combined with first-principle physical models. The latter have existed for several decades, have been refined considerably over that time, and now complement algorithms grounded in data analytics and machine learning. R&D specialists at Primetals Technologies consider this combination of physical and analytics-based models the most fruitful approach as they continue to fine-tune the two in new developments. By building more well-rounded applications in this way, these data magicians take their solutions to a place beyond mere data analytics.
ANALYTICS CASE STUDIES
Sinter-Permeability Control
Dr. Petra Krahwinkler
How do you maximize sinter-plant productivity using a data-driven permeability control system? Sinter-permeability control can be implemented in any sinter plant and uses data analytics to ensure optimal sinter-plant performance. The control system relies on samples taken from the sinter feed before the top hopper and analyzes the raw material’s permeability, density, and moisture values. It determines the correct amount of moisture to add at the raw-material mixer to maximize sinter permeability and reach full production capacity, allowing operators to avoid excessive raw-mix moisture levels. Dr. Krahwinkler’s Sinter-Permeability Control System goes beyond data analytics by applying a reliable statistical model grounded in a profound understanding of the metallurgical process, and the automation layer integrates it easily into a closed-loop control system.
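To give a flavor of the underlying idea, the sketch below picks the moisture addition that maximizes a smoothed permeability curve estimated from recent samples: permeability typically peaks at an intermediate moisture level, so the code smooths the observations and grid-searches for the peak. The sample values, smoothing method, and operating limits are all invented for illustration; the actual system applies its own statistical model of the metallurgical process.

```python
# Hypothetical sketch: choose the moisture addition that maximizes
# permeability, based on recent (moisture %, permeability index) samples.
# All numbers and the smoothing approach are invented for illustration.
import math

def smoothed_permeability(samples, moisture, bandwidth=0.5):
    # Gaussian-weighted average of observed permeability near `moisture`.
    wsum = psum = 0.0
    for m, p in samples:
        w = math.exp(-((m - moisture) / bandwidth) ** 2)
        wsum += w
        psum += w * p
    return psum / wsum

def best_moisture(samples, lo=4.0, hi=8.0, step=0.1):
    # Grid-search the operating range for the smoothed-permeability peak.
    grid = [lo + i * step for i in range(int((hi - lo) / step) + 1)]
    return max(grid, key=lambda m: smoothed_permeability(samples, m))

# Synthetic samples: permeability rises, peaks, then falls as the mix
# becomes too wet.
samples = [(4.5, 70), (5.0, 82), (5.5, 90), (6.0, 94), (6.5, 92), (7.0, 83)]
print(round(best_moisture(samples), 1))  # peak near 6 % moisture
```

The clamped grid search mirrors the control goal described above: add enough water to reach the permeability peak, but never push the mix past the safe moisture range.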
Grain-Size Monitor
Anna Mayrhofer
Using acoustic monitoring and data analytics, the Grain-Size Monitor offers a new way to detect an excessive proportion of fines in raw materials: it listens to the different materials being transferred from conveyor belts. The system can monitor raw input materials in iron- and steelmaking, such as sinter and pellets. By replicating the functions of the human ear, the monitor mimics experienced operators but provides more information, and more readily than even the most experienced operator could. The Grain-Size Monitor can improve the quality of end products and add stability to the production process. It also compensates for blind spots in Level 2 material-handling systems, provides continuous basic information on raw materials, and directs operators to take samples. By applying enhanced analysis to the acoustic data, Anna Mayrhofer’s monitor accounts for irregularities, provides a more detailed understanding of grain-size distribution, and can shed light on additional quality parameters. The system also works well with other automation systems and can be integrated into tailor-made solutions for unique applications.
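The acoustic principle can be sketched very simply: fine material striking a transfer point tends to produce a different spectral balance than coarse lumps. The toy example below uses the zero-crossing rate of the signal as a crude stand-in for proper spectral analysis, with synthetic sine waves in place of real chute audio. The signals, the sampling rate, and the decision threshold are all invented; the actual monitor applies far more sophisticated analysis.

```python
# Hypothetical sketch of acoustic grain-size detection: a high
# zero-crossing rate is used as a crude proxy for the high-frequency
# "hiss" of fine material. Signals and threshold are synthetic.
import math

def zero_crossing_rate(signal):
    # Fraction of adjacent sample pairs whose signs differ.
    crossings = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0)
    return crossings / (len(signal) - 1)

def classify(signal, threshold=0.25):
    # Flag the material stream if the signal is dominated by high
    # frequencies (an assumed, illustrative decision rule).
    return "excess fines" if zero_crossing_rate(signal) > threshold else "normal"

rate = 8000  # samples per second (assumed)
t = [i / rate for i in range(rate)]
coarse = [math.sin(2 * math.pi * 300 * x) for x in t]   # low-pitched impacts
fines  = [math.sin(2 * math.pi * 2500 * x) for x in t]  # high-pitched hiss

print(classify(coarse), classify(fines))
```

A real implementation would work on spectral features of microphone data rather than raw zero crossings, but the decision structure (continuous acoustic feature, threshold, operator alert) is the same.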
Blast Furnace Hot-Metal Temperature Forecasting
Dr. Christian Tauber
Hot-metal temperature forecasting ensures stable blast-furnace operation, increased productivity, and consistent quality in the hot metal produced. The input data consists of charging data, pressures, temperatures, top-gas analyses, hot-metal and slag analyses, and results from metallurgical models; the system cleans and clusters this data to forecast hot-metal temperatures. The prediction results are integrated into the process-optimization system, creating a closed-loop scenario for controlling the blast furnace. Dr. Christian Tauber’s system goes beyond data analytics by combining an intimate knowledge of the production process, used to determine the relevant input data, with suitable predictive models. The forecast can be used with a closed-loop expert system for standard blast-furnace operations. This innovation is currently in operation at three different locations and is continuously being improved.
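The forecasting step itself can be illustrated with a minimal regression sketch: fit a model to historical process features and use it to predict the next hot-metal temperature. Here a single invented feature (hot-blast temperature) and a hand-rolled ordinary-least-squares fit stand in for the real system’s cleaned and clustered multi-signal pipeline; all numbers are synthetic.

```python
# Minimal sketch of the forecasting idea: ordinary least squares on one
# synthetic feature. The real system uses many cleaned and clustered
# signals plus metallurgical-model outputs.

def fit_linear(xs, ys):
    # Ordinary least squares for y = a*x + b.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Synthetic history: hot-blast temperature (degC) vs. hot-metal
# temperature (degC), invented for illustration.
blast_temp = [1150, 1160, 1170, 1180, 1190, 1200]
metal_temp = [1480, 1484, 1489, 1493, 1498, 1502]

a, b = fit_linear(blast_temp, metal_temp)
forecast = a * 1195 + b  # predict for a planned blast temperature of 1195 degC
print(round(forecast, 1))
```

In the closed-loop scenario described above, such a forecast would feed the process-optimization system, which then adjusts the furnace to hold the hot-metal temperature on target.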
EAF Heat Cloning
Manuel Sattler
This solution uses a reference heat to determine how operators can reliably recreate the target heat of an electric arc furnace (EAF). The system effectively produces a model for cloning the desired conditions, using historical data from prior operation as Level 1 and Level 2 inputs. Heat cloning helps steel producers adjust to new objectives, such as maximizing productivity, optimizing costs, or reducing iron loss. By preserving the expertise of seasoned employees, heat cloning also optimizes operation with different input materials, including pig iron and hot-briquetted iron. It adds a layer of transparency to furnace operation and is applicable in virtually any EAF. Manuel Sattler’s heat-cloning solution goes beyond mere data analytics by combining algorithms and visualizations with a deep understanding of the melting process. EAF Heat Cloning builds on historical data and is approaching real-world application for enhanced productivity. Sattler and his colleagues are currently testing the system and expect it to become standard in electric steelmaking.
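One way to picture the “cloning” idea is as a stage-by-stage comparison of the running heat against a stored reference profile, flagging stages where the current heat drifts too far from the reference. The profiles, units, and tolerance below are invented for illustration; the actual solution draws on full Level 1/2 histories and deep process knowledge rather than a single cumulative-energy curve.

```python
# Hypothetical sketch of heat cloning as profile tracking: compare the
# current heat's cumulative energy input per stage against a reference
# heat and flag large deviations. All values are invented.

def compare_to_reference(reference_kwh_per_t, current_kwh_per_t, tol=0.05):
    """Return (stage, fractional deviation, flagged) for each melting
    stage, flagging deviations beyond the tolerance."""
    report = []
    for stage, (ref, cur) in enumerate(zip(reference_kwh_per_t,
                                           current_kwh_per_t), start=1):
        dev = (cur - ref) / ref
        report.append((stage, round(dev, 3), abs(dev) > tol))
    return report

reference = [120, 240, 350, 420]   # cumulative kWh/t at each melting stage
current   = [118, 254, 380, 430]   # the heat currently being produced

for stage, dev, flagged in compare_to_reference(reference, current):
    print(f"stage {stage}: deviation {dev:+.1%}" +
          ("  <- adjust toward reference" if flagged else ""))
```

The flagged stages are where an operator (or a closed-loop system) would intervene to steer the heat back toward the reference conditions.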