In a previous post I discussed how companies add unnecessary complexity to manufacturing information management and asset management efforts. These efforts should be simple, with well-defined objectives. To be sure, you may be able to solve the problem with your existing technology stack. But if you are ready to proceed, here is what the journey might look like.
Simplicity
A great place to start an information management journey is with a simple process historian. It captures data from your process variables, such as PLC tags, and lets you perform simple trend analyses in real time. Maintenance troubleshooting is a common application of the historian's client tools: if there is a process upset, the process conditions at the time are readily revealed, which helps you diagnose the problem and take corrective action.
Because of this simplicity, process historians are common in many plants, and you may already have one. But a historian's applications are limited to time-series data (the three pieces of data it captures are time, value and quality). If you already have one in place, the next step is to implement a tool that captures process events, sometimes known as an event database. This enables you to record process events and analyze the data specific to them.
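To make the time, value and quality model concrete before moving on, here is a minimal sketch in Python. The class, tag name and sample values are illustrative only, not the API of any particular historian.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Illustrative sketch: the three pieces of data a historian stores per sample.
@dataclass
class HistorianSample:
    timestamp: datetime  # when the value was read from the PLC tag
    value: float         # the process variable's value at that moment
    quality: str         # e.g. "Good" or "Bad", as reported by the data source

# Hypothetical samples for an imaginary tag "TANK1_TEMP"
tank1_temp = [
    HistorianSample(datetime(2024, 1, 15, 8, 0, tzinfo=timezone.utc), 72.4, "Good"),
    HistorianSample(datetime(2024, 1, 15, 8, 1, tzinfo=timezone.utc), 72.9, "Good"),
    HistorianSample(datetime(2024, 1, 15, 8, 2, tzinfo=timezone.utc), 0.0, "Bad"),
]

# A simple real-time trend is just these samples plotted (or averaged) over time.
good_values = [s.value for s in tank1_temp if s.quality == "Good"]
print(f"Average over the window: {sum(good_values) / len(good_values):.1f}")
```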
Downtime is the most common application for an event database. When a downtime event occurs, it is captured by the event database along with a reason code. Not only are you alerted to the event, you can also track the most common causes. Adding quality information to a production event allows you to calculate Overall Equipment Effectiveness (OEE). Operations can then use this data to better understand the main causes of downtime and develop solutions to resolve them.
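As a rough illustration of how those event records roll up into OEE, here is a short sketch using the standard Availability x Performance x Quality definition. The shift numbers are invented; a real event database would supply them from its downtime and quality records.

```python
# A rough OEE sketch using the common definition:
# OEE = Availability x Performance x Quality. All numbers below are invented.

def oee(planned_min, downtime_min, ideal_cycle_min, total_count, good_count):
    """Return (availability, performance, quality, oee) as fractions."""
    run_time = planned_min - downtime_min
    availability = run_time / planned_min
    performance = (ideal_cycle_min * total_count) / run_time
    quality = good_count / total_count
    return availability, performance, quality, availability * performance * quality

# Example: an 8-hour shift (480 min) with 45 min of downtime,
# an ideal cycle time of 0.8 min/unit, 500 units made, 480 of them good.
a, p, q, overall = oee(480, 45, 0.8, 500, 480)
print(f"Availability {a:.0%}, Performance {p:.0%}, Quality {q:.0%}, OEE {overall:.0%}")
```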
Less Simplicity
Another common information management application for an event database is product tracking and traceability, also known as genealogy. When a production batch starts, all of the raw materials used in the batch are recorded as part of the batch record. As the process continues, the finished goods are also recorded in the batch record, so you can quickly determine the flow of material for any given batch. This is also extremely helpful for creating batch reports. Moreover, because the start and stop times of the batch are known, all of the process data can be queried specific to the batch.
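A batch record of this kind can be thought of as a small data structure linking raw-material lots, finished-good lots and the batch time window. The sketch below is hypothetical (the field and lot names are invented), but it shows how a forward trace and a batch-scoped process-data query might look.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical batch record: field names and lot numbers are illustrative.
@dataclass
class BatchRecord:
    batch_id: str
    start: datetime
    end: datetime
    raw_material_lots: list[str] = field(default_factory=list)
    finished_good_lots: list[str] = field(default_factory=list)

def trace_forward(batches: list[BatchRecord], raw_lot: str) -> list[str]:
    """Return every finished-good lot produced from a given raw-material lot."""
    return [
        fg
        for b in batches
        if raw_lot in b.raw_material_lots
        for fg in b.finished_good_lots
    ]

batches = [
    BatchRecord("B-1001", datetime(2024, 1, 15, 6), datetime(2024, 1, 15, 14),
                ["RM-55A", "RM-72C"], ["FG-9001"]),
    BatchRecord("B-1002", datetime(2024, 1, 15, 14), datetime(2024, 1, 15, 22),
                ["RM-55A"], ["FG-9002"]),
]

print(trace_forward(batches, "RM-55A"))  # -> ['FG-9001', 'FG-9002']
# Because start/end are stored, process data can be pulled for the batch window,
# e.g. a (hypothetical) historian.query("TANK1_TEMP", b.start, b.end).
```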
A third manufacturing information application is Statistical Process Control (SPC), which extends the weight-control data already captured in your historian. While quality data can be visualized and reported, SPC uses statistical methods to control your process. The system should be well understood to ensure the proper process adjustments are made; using the methods incorrectly or misapplying the tools can lead to disastrous results.
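For instance, a basic individuals control chart flags points that fall outside roughly three standard deviations of the mean. The sketch below applies that simple 3-sigma rule to made-up fill-weight data; a real SPC program would use proper subgrouping, rational sampling and a more careful sigma estimate.

```python
import statistics

# Made-up fill weights (grams) pulled from quality checks.
weights = [500.2, 499.8, 500.5, 499.9, 500.1, 500.3, 500.0, 499.6]

mean = statistics.mean(weights)
sigma = statistics.stdev(weights)
lcl, ucl = mean - 3 * sigma, mean + 3 * sigma  # simple 3-sigma control limits

out_of_control = [w for w in weights if not lcl <= w <= ucl]
print(f"Mean={mean:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f}")
print(f"Out-of-control points: {out_of_control or 'none'}")
```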
Complexity
Once the simpler systems are operational and people are using them (and understand them), you can introduce complexity into your manufacturing information journey. The aforementioned systems generate a significant amount of data that can be used for better process control and improved production, but they are generally isolated from each other. A final step can be what is known as manufacturing intelligence, whereby you add context to your data. Typical applications are production per shift, quality per line, or quality and production per shift per line. The list is endless.
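To show what "adding context" can look like in practice, here is a hypothetical sketch that groups production events by line and shift. The event fields and numbers are invented for illustration.

```python
from collections import defaultdict

# Invented production events; in practice these would come from the event database.
events = [
    {"line": "Line 1", "shift": "Day",   "good": 940, "scrap": 12},
    {"line": "Line 1", "shift": "Night", "good": 880, "scrap": 25},
    {"line": "Line 2", "shift": "Day",   "good": 910, "scrap": 8},
]

# Roll the events up by their contextual attributes (line, shift).
totals = defaultdict(lambda: {"good": 0, "scrap": 0})
for e in events:
    key = (e["line"], e["shift"])
    totals[key]["good"] += e["good"]
    totals[key]["scrap"] += e["scrap"]

for (line, shift), t in sorted(totals.items()):
    quality = t["good"] / (t["good"] + t["scrap"])
    print(f"{line} / {shift}: production={t['good']}, quality={quality:.1%}")
```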
I cannot emphasize enough that your manufacturing information journey should always start with a problem statement. The problem you are trying to solve, or the outcome you want to achieve, should be well understood. You should then determine whether you can address it with your existing technology stack. If it is time to move forward, start with simple solutions before getting complex. Your stakeholders will thank you. And it will make Occam smile.