For nearly half a century, the idea of AI seeping into our lives languished at the back of most of our minds. But fast forward to 2020: the Annual Manufacturing Report 2018 found that 92% of senior manufacturing officials agreed that a smart factory, with integrated AI systems running the cogs, is what would push the industry forward into the future. And thanks to the democratization of AI, driven by falling costs and a growing intrigue among the masses at its startling innovations, this future is slowly seeping into the present.
As we step into the fourth industrial revolution, we are witnessing a boom in the popularity of AI-based solutions that simplify the manufacturing process. This modern revolution rests on four key pillars:
- Predictive Maintenance: Predicts potential machinery breakdowns ahead of time, giving operators time to manage the issue and thereby reducing downtime.
- Digital Twin: A modern solution to testing out designs, from bridges to products, the digital twin allows developers to design and test products in a virtual world.
- Generative Design: An AI algorithm that takes in criteria from designers, runs through all design possibilities, and shortlists the most efficient ones.
- Price Forecasts: A machine learning algorithm that weighs factors from the weather to who wins the Super Bowl to predict the best possible price for a product.
Downtime - the period during which equipment is dysfunctional, and therefore unusable - costs the average company nearly 5,600 USD every 60 seconds. The conventional method companies relied on to cut downtime costs was human intervention. As described in "AI in production: A game changer for manufacturers with heavy assets" by Eleftherios Charalambous, Robert Feldmann, Gérard Richter, and Christoph Schmitz, a set of operators would look down from a control room at a myriad of screens presenting a plethora of sensor data that had to be conscientiously screened for inconsistencies. Using their intuition and the data provided, they were required to run complex simulations and troubleshoot issues against the clock. While trusted by manufacturers for over two centuries, this mode of troubleshooting presents two key issues, both with disastrous outcomes for the company at hand.
- Firstly, the strain on the limits of human ability leads operators to deliberately focus on "urgent" but unnecessary tasks and to take shortcuts to ease their burden. This reduces efficiency despite the mammoth costs that human operators levy.
- Secondly, the efficiency of an operator is directly proportional to his or her experience. Replacing retired operators with years, if not decades, of experience is therefore a troublesome task that can leave troubleshooting weakened for a span of time.
This is where predictive maintenance can help. By installing an ML system, fine-tuned to the operational requirements of the situation at hand and fed constant sensory input from the machinery, both issues that arise with conventional operation are sidestepped.
To better visualize this, let's look at two chemical companies, both of which produce an antacid at a specific pH level.
- Company A uses operators to manage the pH level of solutions.
- Company B applies an ML system whose sensors observe the pH of the solution and which receives regular operational status updates from the machinery.
In the instance of an imbalance in pH levels, Company A will inevitably react to the issue late: a wide range of things could have gone wrong, from human error to the breakdown of certain machinery, and working through each and every permutation by hand will lead to a loss in productivity and a decline in profitability. Company B, by contrast, can use the active updates from its machinery to spot a potential breakdown before it happens, effectively preventing downtime altogether, and earning more profit than Company A as a result.
This is because an ML system is more or less a one-time cost, with maintenance costs at a fraction of those of traditional means. And thanks to its long lifespan, its performance will only keep improving, outpacing both its past self and human operators, as more data makes it better adapted to its environment.
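To make the contrast concrete, here is a minimal sketch of the kind of early-warning check Company B's system might run on its pH sensor stream. Everything here - the function name, the window size, the three-sigma threshold - is an illustrative assumption, not a description of any real product.

```python
from collections import deque
from statistics import mean, stdev

def detect_anomaly(readings, window=10, threshold=3.0):
    """Flag readings that deviate sharply from the recent rolling window.

    Returns the indices of readings more than `threshold` standard
    deviations away from the rolling mean -- a crude stand-in for the
    early-warning signal a predictive-maintenance system would raise.
    """
    recent = deque(maxlen=window)
    alerts = []
    for i, value in enumerate(readings):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                alerts.append(i)
        recent.append(value)
    return alerts

# A stable pH stream around 7.0 with one sudden excursion:
ph_stream = [7.0, 7.01, 6.99, 7.02, 7.0, 6.98, 7.01, 7.0, 6.99, 7.02, 8.5, 7.01]
print(detect_anomaly(ph_stream))  # the excursion at index 10 is flagged
```

The point is not the statistics but the timing: the alert fires the moment the reading drifts, long before an operator scanning a wall of screens would notice.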
While the assembly of a product is a sizable portion of the manufacturing process, a segment that cannot be overlooked in pursuit of an optimum product is its design. Despite its contribution to the overall yield, design saw no real development during the first two industrial revolutions, which focused mainly on tweaking the assembly line to perfection.
While Industry 3.0 brought the advantages of digital design with the help of CAD, the digital twin of Industry 4.0 is turning design from a superset of the digital world into a subset. The digital twin is best explained by Kaja Polachowska and Michał Trojnarski in "10 use cases of AI in manufacturing", where they define it as "the virtual representation of a factory, product, or service". With sophisticated sensors mapping different types of data about a product, engineers and designers alike can test, model, and understand its inner workings from the comfort of safety. The technology is also cost-efficient, since it requires a fraction of what building a real-life replica, and putting it through trial and error, would cost.
The same article also discusses how NASA, one of the first organizations to firmly root itself in the digital-twin method of design, uses the process to create, test, and build preliminary versions of its equipment before manufacturing a real-life replica, which then ties back into the virtual model through sensory data to further illuminate the equipment's inner workings.
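The pattern behind that workflow - sensor data flowing from a physical asset into a virtual model, which can then be probed safely - can be sketched in a few lines. The class, its field names, and the simple pass/fail load check below are all hypothetical simplifications, not NASA's actual tooling.

```python
class DigitalTwin:
    """A toy virtual replica that mirrors sensor readings from its
    physical counterpart and lets engineers run what-if tests safely."""

    def __init__(self, max_load_kg):
        self.max_load_kg = max_load_kg
        self.telemetry = []  # history of readings from the real asset

    def sync(self, load_kg, temperature_c):
        """Fold a new sensor reading from the physical asset into the model."""
        self.telemetry.append({"load_kg": load_kg, "temperature_c": temperature_c})

    def simulate_load(self, load_kg):
        """Probe a load virtually instead of risking the real equipment."""
        return "OK" if load_kg <= self.max_load_kg else "FAILURE"

twin = DigitalTwin(max_load_kg=500)
twin.sync(load_kg=320, temperature_c=41.5)  # live data ties the model to reality
print(twin.simulate_load(650))              # risky test, run in software only
```

The cost argument follows directly: the "FAILURE" above destroys nothing, whereas the same experiment on a physical replica would consume real material and real time.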
Generative design is the epitome of AI in its most basic definition. The user - the designer or engineer - inputs a set of conditions: dimensions, weight, material types, available production methods, budget limitations, and time constraints. Taking these into account, an AI algorithm runs through countless combinations to hone in on an optimum solution, one that best hits all the required bases, thereby saving designers countless hours of tweaking and turning the cogs manually.
Beyond its design output, the process has the added benefit of being free from human bias. This objectivity is best summed up by Phillip Kushmaro in his article on 5 ways industrial AI is revolutionizing manufacturing, where he describes it as: "No assumptions are taken at face value and everything is tested according to actual performance against a wide range of manufacturing scenarios and conditions."
The article 10 use cases of AI in manufacturing also adds that generative design is far more efficient than humans at assembling ideas and unique models, but is nonetheless only an assistant to human creativity, something present-day AI lacks.
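The constraints-in, shortlist-out loop described above can be illustrated with a deliberately crude sketch. Real generative-design tools explore vastly richer geometry; the parameter names, the weight constraint, and the simplified stiffness proxy below are assumptions made purely for illustration.

```python
import itertools

def generative_design(lengths, widths, thicknesses, max_weight, density=1.0):
    """Enumerate every combination of design parameters, discard those
    violating the designer's constraint, and rank the survivors by a
    stand-in objective (stiffness per unit of weight)."""
    candidates = []
    for l, w, t in itertools.product(lengths, widths, thicknesses):
        weight = l * w * t * density
        if weight > max_weight:       # constraint supplied by the designer
            continue
        stiffness = w * t ** 3        # simplified beam-stiffness proxy
        candidates.append({"dims": (l, w, t), "score": stiffness / weight})
    # Return designs ranked best-first, as a generative tool would present them
    return sorted(candidates, key=lambda c: c["score"], reverse=True)

designs = generative_design(
    lengths=[100, 200], widths=[10, 20], thicknesses=[2, 4], max_weight=8000
)
print(designs[0]["dims"])  # the top-ranked design within the weight budget
```

Even this toy version shows the division of labor the article describes: the algorithm grinds through every permutation objectively, while the human still chooses the criteria and judges the shortlist.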
Fluctuating markets and competitive pricing, among other factors, have a pronounced effect on the profitability of goods, so knowing how to price products can boost sales and help companies rise above the competition. Traditionally, market analysts carried out this job, but, much like the job of an operator, it strains the limits of human ability: it is nigh impossible for humans to predict the uncertainty of today, when everything from politics to macroeconomics to a micrometer-sized virus can have an immense impact on the global market.
With prices rising and falling daily, a more sophisticated way of analyzing trends in the world around us to predict the future is a must-have for getting ahead of competitors. This is where price forecasts come into play. In a nutshell, price forecasting is an umbrella term for a range of AI applications that soak up socioeconomic, political, meteorological, locational, and much more data from data banks to predict the price of a commodity when deciding whether to buy or sell.
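At its core, a price forecast is a model fitted to historical signals. Here is a minimal sketch, assuming a single made-up "demand index" feature and plain least-squares fitting in place of the many-signal models real systems use; the names and numbers are hypothetical.

```python
def fit_price_model(features, prices):
    """Ordinary least squares on one explanatory feature -- a toy
    stand-in for the multi-signal models real price forecasters run."""
    n = len(features)
    mean_x = sum(features) / n
    mean_y = sum(prices) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(features, prices))
    var = sum((x - mean_x) ** 2 for x in features)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return lambda x: intercept + slope * x

# Hypothetical history: a demand index against the price the market bore
demand_index = [1.0, 2.0, 3.0, 4.0, 5.0]
observed_price = [10.0, 12.0, 14.0, 16.0, 18.0]
forecast = fit_price_model(demand_index, observed_price)
print(forecast(6.0))  # extrapolated price at a demand index of 6.0 -> 20.0
```

A production system would feed in hundreds of such features - weather, sentiment, competitor prices - but the principle is the same: learn the relationship from the past, then query it about tomorrow.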
Tech companies like Amazon and Google have already implemented such tactics to take a landslide lead over competitors and rule their respective markets. And as Industry 4.0 begins to form the foundation of manufacturing corporations across the world, the day when every company has incorporated some mode of price forecasting just to stay in the game doesn't seem very far away.
What Does This Mean?
While the annexation of the manufacturing industry by AI solutions has been slow, it seems inevitable. As the technology matures and falling costs democratize access to it, many jobs will certainly be lost, a concern that has many shying away from the idea of Industry 4.0. But one must remember that the jobs lost will only be replaced by higher-paying alternatives. The idea of products being delivered to us faster and better is an exciting prospect to me. What about you?