By Elian Zimmermann
06 January 2022
CUSTOMER SUCCESS

Making Sense Of All Your Data With Flow


PRODUCTS: FLOW

INDUSTRY: FOOD & BEVERAGE

END USER: INTEG

 

1. Introduction

Different industrial sectors use Flow to collect, calculate and visualise data to help optimise their daily operations. Learn how a leading system integrator empowered everyone from high-tech engineers to plant-floor newcomers with actionable information, turning everyday technical personnel into reporting and data-analysis mavericks.

2. Transcript

00:05
Good afternoon, ladies and gentlemen, or good morning. I think it’s just before twelve, we’re running a bit early, and to everybody online, I want to welcome you as well. First of all, my name is Jaco Kip and I’m from the company Intech. As Jaco said, I want to thank Element8 and Flow for the opportunity to be part of this exciting event. On this first slide, I just want to say that Intech is a system integrator company that offers a wide variety of industrial automation solutions throughout Africa. Today I’ll be talking about how we helped our clients in different sectors of industry to identify the information gap and use Flow to visualise their data and make more sense of it in their plants.


01:01

So my agenda today will cover, first of all, the business case: the reasoning for providing these solutions. Then, to identify the information gap, we’ll show you some of the data challenges we came across in today’s industries. After that, the solutions we provided and the software we used to address these requirements, and then the results, the benefits and feedback we got back from our clients. I’ll close with a conclusion on where we used Flow to actually help us with a solution. We all know that manufacturing today is under a tremendous amount of pressure from management and from customers to produce quality goods and services at a low cost in as little time as possible.


01:56

Today’s market leaders believe that return on investment is multidimensional and is not determined by cost alone, but also by the ease of access to information. Manufacturers therefore need to collect more data today than they did before, and convert it into actionable information so that this dimensionality can become part of their return on investment calculations. So before we can make sense of all this data, the business case needs to be clearly defined. A couple of key indicators to determine would be: will it help to lower operational cost, will it help to increase productivity, will it improve compliance, and will it provide more accurate and real-time data?


02:43

In each case, no matter what structured and unstructured data has been brought together, the aim is the same: to help the organisation make more informed decisions and to produce better quality, more safely, in a shorter amount of time. In an age where we collectively generate more data in a day than we used to in a century, the challenge becomes how we identify the relevant data. In many cases the data stores are simple sources like Excel, but in other cases they are industrial tag historians, SQL databases, or online data providers. The data on its own is not always useful for monitoring; it needs to be processed into useful information. Some big data experts would reason that, with this magnitude of data, we’d require statistical processing or advanced data analytics.


03:43

But these powerful tools are not always necessary. What is needed instead is a software tool that is easy to configure and has industry-standard visualisation capabilities that everybody can see and understand. Each of the cases I’ll be talking about required an information system that could be rolled out rapidly and at a low cost to give more insight into their production facilities. With the multitude of data sources these production facilities have these days, the goal in every one of these cases or sectors, although they are different in nature, boils down to the same reasoning: they need quicker access to more accurate information to make more effective and informed decisions, ultimately enabling them to reach their goals and exceed them.


04:42

Before we continue to identify these information gaps, let’s quickly have a look at some big data statistics that I could find. Some of the stats: nearly 90% of all data that we have today has been created in the last two years alone, which is quite insane. Companies that harness big data’s full potential could increase their operational margins by up to about 60%. Every day we create about 2.5 quintillion bytes of data. Now, with all this data, let’s have a look at how we can solve the problem and make a bit more sense of it. I divided this up into three cases where we looked at the challenges, the solution and the results that we got out of them. For case one, we looked at transport.


05:42

The request was to provide data related to the loading and offloading of raw material on this particular site. Technicians would use historian clients to put together some sort of production performance information dashboard. These client tools were available to all the managers and technical operators on site as well, but they normally didn’t have the time or the know-how to use them. So technicians would take this data, export the trends to PDF or Excel, and then send it off to their management. It would normally take time for the data to get from the plant floor all the way up to management, and this led to mistakes in the data that was passed through.


06:41

They also wanted the ability to create their own dashboards that reflected the real-time data of the plant statuses. For case two, we looked at general manufacturing. Some of the requests we received included target versus actual production projected into the future, machine cycle times and OEE performance data, to name a few. Some manufacturers still capture data on paper and hand it over; I think we’re all familiar with factories where the data is captured on paper. This would then be entered into an Excel spreadsheet, which would only be available for managers to look at at the end of the day. So that’s quite a time delay you get here.
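The OEE data requested here follows the standard Overall Equipment Effectiveness calculation (availability × performance × quality). As a minimal sketch with hypothetical shift values, and not part of the Flow product itself, the calculation works out like this:

```csharp
// Minimal sketch of a standard OEE calculation. All input values are hypothetical.
using System;

class OeeExample
{
    static void Main()
    {
        // Hypothetical shift data
        double plannedMinutes  = 480;   // planned production time
        double downtimeMinutes = 60;    // unplanned stops
        double idealCycleSec   = 30;    // ideal cycle time per unit
        double totalUnits      = 760;   // units produced
        double goodUnits       = 730;   // units passing quality

        double runMinutes   = plannedMinutes - downtimeMinutes;
        double availability = runMinutes / plannedMinutes;                       // 0.875
        double performance  = (idealCycleSec * totalUnits / 60.0) / runMinutes;  // ~0.905
        double quality      = goodUnits / totalUnits;                            // ~0.961

        double oee = availability * performance * quality;
        Console.WriteLine($"OEE = {oee:P1}");   // roughly 76%
    }
}
```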


07:26

Some plants also updated the performance during the day on dashboards like these, which the personnel could look at to see if the machines were running on spec or if they were ahead of target. This also took time to calculate and could introduce errors into the calculations. With this performance whiteboard, operators could still easily see if they were on target, but not if the machines were running on spec, so there were mistakes and time delays here as well. As with the two previous cases, the personnel also wanted the ability to generate and create their own reports, which is a tedious process, especially if you have requests coming in daily or weekly to make changes. For case three, we looked at food and beverage.


08:17

The request was to provide quick CIP, production and transfer reports linked with OEE data after CIP cycles. Quality normally uses manual testing and custom reports generated by the control system to verify their CIPs. These reports normally do not clearly indicate the steps and the failures, which produces an information gap. Machine performance is usually displayed on the OEE machines locally and not on the main SCADAs, because there’s no connection to them. The information gap is a delay in time and manual data-capturing errors, for example pressing the wrong button. There’s also a need to add context to the specific production batches. With all three of these cases, we could establish that there was a delay in time before the data was transferred.


09:19

The accuracy of the data was uncertain due to it being manually processed or manually captured. This led to a delay in acting on any of the information received. Outsourcing the development of new custom reports to the IT department or external service providers also got lengthy and costly. So the solution we looked at was Flow software, with its capability to turn massive amounts of data into quality information. The Flow information platform was easily adopted, from its unique capability to templatise KPIs for the entire enterprise to the ease of connecting to any data source found on the plant. We knew that Flow was the right tool for the implementation to cover these information gaps. For the solution for case one, we wanted to allow the production managers to understand the performance data at a glance when looking at the production screens.


10:20

So the data captured for the transfer of products needed to be combined with OEE KPI data, with some context added. On this particular site, we connected Flow to the historian and to the Excel spreadsheets that they filled in, and we visualised that data. Additional dashboards were also created for production personnel and technicians so that they could quickly see the status of the current and previous shift performance. Those were the old forms that they filled in, from which we could capture data as well. The Flow messaging engine extended the reach of the information by pushing configured messages, reports and dashboards to external notification services on a schedule.
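To make that combination more concrete, here is a small hypothetical C# sketch of the kind of roll-up being described: transfer totals from the historian joined with a shift’s OEE KPI into one dashboard-style summary. The names and values are invented and this is not Flow’s connector API.

```csharp
// Hypothetical sketch only: joining historised transfer totals with an OEE KPI
// and shift context into one summary line, as described above. Not Flow's API.
using System;
using System.Collections.Generic;
using System.Linq;

record Transfer(DateTime Timestamp, string Product, double Tonnes);   // from the historian
record ShiftKpi(string Shift, double Oee);                            // calculated KPI

class ShiftDashboardRow
{
    static void Main()
    {
        var transfers = new List<Transfer>
        {
            new(new DateTime(2022, 1, 6, 6, 30, 0), "Raw material", 120.5),
            new(new DateTime(2022, 1, 6, 9, 15, 0), "Raw material", 98.2),
        };
        var kpi = new ShiftKpi("Day shift", 0.76);   // e.g. from the OEE calculation

        double total = transfers.Sum(t => t.Tonnes);
        Console.WriteLine($"{kpi.Shift}: transferred {total:F1} t, OEE {kpi.Oee:P0}");
    }
}
```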


11:11

For the solution for case two in general manufacturing, also looking at different sites in this sector, we normally had to identify additional process parameters that needed to be made available from the PLC or the controller and then sent over to the historian for historisation. The data could then be visualised and made available for management, production and technical staff throughout the site. It was also known that after maintenance the cycle times of the machines were often slower, so the maintenance affected the machine cycle times. The plant engineers therefore also wanted to see the cycle times of the machines on the dashboard, and we made that information available to them as quickly as possible. The solution for case three is in food and beverage, looking at a specific site where the control system was redone; luckily, a historian was part of that solution.
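As a hedged sketch of the cycle-time check the engineers wanted, the snippet below compares average machine cycle times before and after a maintenance event and flags a slowdown. The values, names and threshold are hypothetical, not the site’s actual dashboard logic.

```csharp
// Hypothetical sketch: compare average cycle time before and after maintenance
// and flag a slowdown, mirroring the check described above. Values are invented.
using System;
using System.Collections.Generic;
using System.Linq;

record CycleSample(DateTime Timestamp, double Seconds);

class CycleTimeCheck
{
    static void Main()
    {
        var maintenanceEnd = new DateTime(2022, 1, 6, 12, 0, 0);
        var samples = new List<CycleSample>
        {
            new(maintenanceEnd.AddHours(-3), 30.1),
            new(maintenanceEnd.AddHours(-2), 30.4),
            new(maintenanceEnd.AddHours(1), 33.8),
            new(maintenanceEnd.AddHours(2), 34.1),
        };

        double before = samples.Where(s => s.Timestamp < maintenanceEnd).Average(s => s.Seconds);
        double after  = samples.Where(s => s.Timestamp >= maintenanceEnd).Average(s => s.Seconds);

        if (after > before * 1.05)   // more than 5% slower after maintenance
            Console.WriteLine($"Cycle time degraded: {before:F1}s -> {after:F1}s");
    }
}
```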


12:16

The data needed to be contextualised and visualised for the product transfer and CIP reports. Here we used Flow events. Flow events are defined by triggers and attributes: the triggers are used to start and end the events, and the attributes are then used to add context and additional information to the data for the event periods. By mapping the product tag to a description, a simple transactional summary report of the product used and the quantity transferred could then be generated. We also created event-driven CIP overview reports. For this we had to use the custom expressions in Flow, which give us the ability to use C# for more complex expressions, so a summary and a detailed CIP report could be generated from this.
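As a rough illustration of the event model being described, the C# below models an event period bounded by start and end triggers, attaches attributes for context, and rolls CIP steps up into the kind of summary mentioned here. It is a hypothetical sketch, not Flow’s actual event API or custom-expression signature; all names and values are invented.

```csharp
// Hypothetical sketch of the trigger/attribute event idea described above.
// Not Flow's API; names and structures are invented for illustration.
using System;
using System.Collections.Generic;
using System.Linq;

record CipStep(string Name, TimeSpan Duration, bool Failed);

class CipEvent
{
    public DateTime Start { get; init; }                              // set by the start trigger
    public DateTime End { get; init; }                                // set by the end trigger
    public Dictionary<string, string> Attributes { get; } = new();    // added context
    public List<CipStep> Steps { get; } = new();

    public string Summary()
    {
        int failures = Steps.Count(s => s.Failed);
        return $"CIP on {Attributes.GetValueOrDefault("Line", "?")} " +
               $"({Attributes.GetValueOrDefault("Product", "?")}): " +
               $"{Steps.Count} steps, {failures} failed, " +
               $"duration {(End - Start).TotalMinutes:F0} min";
    }
}

class Demo
{
    static void Main()
    {
        var evt = new CipEvent
        {
            Start = new DateTime(2022, 1, 6, 14, 0, 0),
            End   = new DateTime(2022, 1, 6, 15, 10, 0),
        };
        evt.Attributes["Line"] = "Filler 2";          // context via attributes
        evt.Attributes["Product"] = "Apple Juice";    // mapped from the product tag
        evt.Steps.Add(new CipStep("Pre-rinse", TimeSpan.FromMinutes(12), false));
        evt.Steps.Add(new CipStep("Caustic wash", TimeSpan.FromMinutes(35), false));
        evt.Steps.Add(new CipStep("Final rinse", TimeSpan.FromMinutes(18), true));

        Console.WriteLine(evt.Summary());
    }
}
```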


13:08

They also wanted the templates functionality of Flow, which gave them easier, drag-and-drop creation of reports for new lines as they were added. Now, coming to the results, the benefit realised for case one was that technicians could monitor the production transfer totals and quickly identify a slowdown in production. This enabled them to be more informed, effective and responsive. Data could now be backfilled to leverage the existing system and to perform historical analysis and calculations on all data, verified against the manual data that had been entered. On the left-hand side is where we can see the actual versus target and how they are performing, and they could see a slowdown in production on that, with this side giving them a bit more OEE KPI and shift data.
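To illustrate the verification step mentioned here, the short sketch below compares a calculated total from backfilled historian data against the manually entered figure and reports the variance. It is a hypothetical illustration with invented values, not the site’s actual reconciliation logic.

```csharp
// Hypothetical sketch: reconcile a backfilled, calculated total against the
// manually entered figure and report the variance, as described above.
using System;

class BackfillCheck
{
    static void Main()
    {
        double calculatedTotal = 218.7;   // from backfilled historian data
        double manualTotal     = 215.0;   // from the old paper/Excel forms

        double variance = calculatedTotal - manualTotal;
        double percent  = variance / manualTotal * 100;

        Console.WriteLine($"Calculated {calculatedTotal:F1} t vs manual {manualTotal:F1} t " +
                          $"({percent:+0.0;-0.0}% variance)");
    }
}
```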


14:04

Operational staff also received this data first thing in the morning on a schedule. The result for case two was that it started to motivate the plant operators to perform better. Technicians and engineers can now quickly monitor the machines’ productivity from their workstations or their office. Taking a machine down for maintenance sometimes affected the cycle times, and this could now be easily and immediately detected because Flow is accessible from anywhere and from any web browser. With its help, the production managers could monitor the production statuses from the comfort of their home. For the result for case three in food and beverage, defining and deploying the events gave us the ability to provide more transparency on the CIP reports. We could easily generate a summary and a detailed report; the detailed report included the steps, failures and durations of each CIP event.


15:05

We associated attributes that provided additional context for each event period. Production managers can compare the Flow information to the operator feedback, and this also assisted QC in verifying their processes and speeding them up. The immediate access to real-time data was also a big advantage. This slide just displays the CIP reports, where we had a summary report and more detailed steps with those custom expressions that we created, so the production managers could see the exact steps the CIP took and whether there were any failures. We found that with most clients, big or small, the key people who need information from the data sources normally get stuck in the IT pipeline or need to scramble through a lot of spaghetti spreadsheets. This can be costly and is not very efficient.


16:08

Flow makes it possible to quickly and easily contextualise and visualise information to assist in improving daily operations. Some of the benefits that we realised with Flow: it shortens the time for decision-making, it adheres to IT governance, it extends the reach of your quality information, and it gives you self-service capability, which is a huge advantage for the client as well. It transforms everyday personnel into information management experts. There’s a single reliable source, there’s Flow support on your doorstep, and there’s a single repository of data, which creates an ideal source for integration into your ERP system. As with any reporting tool, we want to make it easier for people to get and understand their data, empowering them to get the information they require more frequently to make better decisions.


17:11

Flow is one addition to our arsenal of software that helps our team of technical experts deliver this value to our customers. Thank you very much for your attention.
