By Clarise Rautenbach
26 November 2024

Keynote: The Digital Transformation Journey


Introduction

Although many manufacturers want to get a Digital Transformation project going, they hesitate to invest significant time and effort into a project that may not deliver the desired results. Travis shares what defines Digital Transformation, the fundamental tenets of a foundation, the flow from data to information, and the importance of connecting OT to IT for Digital Transformation.

SPEAKERS:

Travis Cox
Chief Technology Evangelist
Inductive Automation

Transcript

00:10
Travis
All right, well, thank you very much. So I want to talk about something that's really near and dear to my heart, and that is the digital transformation journey. You hear a lot of people talking about Digital Transformation and Industry 4.0, but I've been in the trenches for a long time, so I'll talk about being able to do this in a practical way, in a way that makes sense, in a way that gives you value, in a way that's understandable, so that you and your organization can move forward. Because there's a lot of hype out there, a lot of misnomers, a lot of complexity.

00:48
Travis
If I were in your position, going and searching Google, trying to find solutions, it would be very difficult to get through all the noise that's out there. So I'm going to give you my take on what the digital transformation journey looks like. And there's no product placement here whatsoever. Really, it's not a technology problem today; it's more of a people, process, and culture problem in terms of how we approach digital transformation. So let's get started. The theme of today has really been around digital infrastructure, but of course, at the fundamental level it's all about data, right? Data is the most important part. Data is king. Digital transformation is really all about data.

01:36
Travis
It's about being able to take data, turn it into information, and get data-driven decisions and insights off of that data. So there are two really important things we need to think about with data. One is simply access to that data: we need broad access to it in order to do something meaningful with it, and we need to get that data to the right stakeholders, the right people who need that information. The other is greater collaboration between teams, so they can work together to figure out not only how to acquire and transfer the data, but how to actually utilize that data to solve business objectives. So right now there's a lot of disruption around Industry 4.0 and all the amazing possibilities that are there, but fundamentally, we're talking about data.

02:27
Travis
And of course, data is increasing at an exponential rate. A lot of people in industry, the smart manufacturing community among them, have said that the unprecedented increase in data is changing everything, that this is the digitization and datafication of industry. That's true, right? There's a lot more data out there. There are a lot of air-gapped or siloed systems, and there's a lot of stranded data in the field, and it's all about being able to get access to that. Plus, there are new sensors and new equipment coming in, so again, it's just increasing. I can remember, going back 10 years, when a system with 100,000 tags seemed massive. Today I see customers dealing with millions of data points on a daily basis, and that's just from one facility.

03:21
Travis
Think about multiple facilities, where at the end of the day they have to acquire that data, process it, store it, and get it to a place where they can actually take insight off of it. So there's a lot of data going on, but we all know there are a lot of challenges to accelerating our journey. I mentioned stranded data. There's a lot of stranded data out there. In fact, a lot of surveys suggest that about 90% of data is stranded in the field. That may or may not ring true for you, but I've actually seen it in practice: we're just not getting the data in, for lots of reasons — technology reasons, licensing reasons, political reasons, whatever it might be. There's a lot of difficulty in acquiring that data, as well as, of course, in storing it.

04:05
Travis
There's complexity in moving and mapping that data. For a lot of people, there's a lack of open standards on the plant floor in the systems they're using. And of course, that also leads to data that's not standardised, contextualised, or modelled properly, and that leads to a lot of challenges overall. When we look at this, it can seem expensive to acquire, store, and analyse that data, and if we approach it the wrong way, it can be expensive, especially if our first instinct is to go straight to the cloud and do everything there. We could be missing out on a lot of the benefits or ROI we can get just by fundamentally fixing the plant floor.

04:45
Travis
So I put this up there as kind of a funny picture, but as we work with more companies, I've seen manufacturers that have been working with us for a long time, with very sophisticated OT systems, get a new CIO who comes in, and that CIO is focused on all the AI and analytics; they're focused on what they can do with that data and what the high-level strategy for that data is. And they think it's so easy: we're just going to draw a single line down to our OT data. Well, that looks really simple, right? But the reality, of course, is that it is not that simple.

05:35
Travis
The world that we live in is a very complex world, and there's a lot of domain knowledge in that world, knowledge that's known and owned by operations folks. There are a lot of different devices out there, especially brownfield devices, lots of different protocols, lots of different systems that we're dealing with. And overall, in order to take advantage of all these amazing tools, we have to fundamentally fix and look at the plant floor first. So the reality is that the journey can be difficult. It is a journey, and we can't replace everything. There was that rip-and-replace word earlier; it's not necessarily about doing that. It's about being able to fundamentally solve business challenges, but by doing it in a new way, a better way.

06:32
Travis
So I think it's worth being clear about what digital transformation is and what it isn't. It's definitely not about using technology for technology's sake. Just because it exists doesn't mean I have to take advantage of it, right? As soon as ChatGPT came out and a lot of people were using it, I know some people were automatically trying to figure out how they could use it for their organization, bringing it into the OT realm, but they had zero reason for doing that other than because it was cool. That's fine, I understand that. I understand wanting to use new technology, but we can't use it just for the technology's sake. Secondly, and this is most important to me, we can't bolt new technologies on top of old technologies. If we do that, we are fundamentally not changing.

07:27
Travis
We're not modernising the entire infrastructure, we're just creating layers of complexity. This is what leads to cybersecurity issues. This is what leads to brittle systems that are hard to maintain and hard to move forward. It might look good on the surface, but what if we need to fundamentally change things? It's very difficult. We have to look at approaching things in a much better way. So to me, it's all about a comprehensive shift in the way that we look at work and the way that we do business. It's about identifying areas of improvement, and we all know there are lots of areas we can improve, right? We have those business cases. Then we make those improvements in an intentional, methodical, and measurable way by leveraging modern technologies, building the right architecture, the right foundation. We get the win, and we get the ROI.

08:20
Travis
We get the win, and then we continue to do that as we advance in our journey. And this is not about replacing everything we have. It's about being able to do it in parallel, doing it the right way so that we can migrate to this new way of doing things, because we know we can't just replace everything tomorrow. I'd love to, but it's very difficult, very challenging. So to me, this is what digital transformation is and what it isn't. I'll give you a personal story, because I think it relates pretty well to the overall picture. I have two boys. They're 11 and 8. And a couple of years ago, they wanted a playhouse. I'm an engineer. I can figure this out, right? So I go buy the wood.

09:00
Travis
I come back, and I have no idea how to build anything, by the way — no idea about modern construction techniques or anything. I get the wood. I have some various tools that were given to me by my dad and by my grandpa. I said, I'm going to do this. And I had some blueprints that I found online. Trust me, you can see where this ended, right? Not well. They did not get a playhouse custom-made by me, because I approached it in a way where I didn't look at what the modern construction techniques were, and I didn't have the right tools. What I did do is spend a lot of money going to home improvement stores trying to get the right ones. And that was very difficult for me.

09:42
Travis
So these challenges can be very hard or almost impossible if we approach them the wrong way. And not to mention that tool that I had: I had a staple gun as one of the tools given to me by my grandfather. That staple gun, while it would work, I didn't have any more staples for it. So I went down to the store to see if I could get more. Guess what? Those staples didn't exist. That format didn't exist anymore; it wasn't a standard that staple gun used. So even if we have tools, if they're proprietary or not based on standards, it can be very challenging over time to leverage them or use them properly. So to me, it's all about choosing the right tool set.

10:27
Travis
It's about approaching problems in a methodical and measurable way and leveraging standards, whether they're open standards, industry standards, whatever they are. But most importantly, it's all about being able to work together to solve business goals. Had I approached this a better way, I would have gotten guidance from somebody who has built a playhouse before, rather than thinking I could do it all myself, and worked as a team, worked together to achieve my ultimate outcome, my business objective. So I just want to give you that sense, because I think our first instinct when we want to solve these challenges is to just build it ourselves, leverage some new things, put it in, and make it work.

11:15
Travis
And we might get some success with that, but it could be very difficult to scale, and then to do the next one, and to really solve more challenges long term. So in terms of working together, I think this is an area that's really important, specifically with these two teams. We've already heard somebody say: have a good relationship with IT. We have to be able to bring teams together. Digital transformation really leads to OT/IT convergence in a much bigger way, where we have to bring our OT specialists and IT specialists together as one, working together to solve those business objectives. Fundamentally, this convergence is happening. It's starting to happen more and more.

12:01
Travis
And again, if we approach this the wrong way, we get a lot of resistance, we get a lot of conflict. We know that each of these teams has its own skill sets. From an operations standpoint, OT is so important, right? That is how the business makes money, it's how we make products. There are those skill sets: being able to understand the machinery and keep those machines running. Obviously all the PLCs, the knowledge around how the processes work — that's all there and very important. But what we've done for so long with OT is say: we need you to also be experts on the infrastructure and how these tools are installed and managed and all of that. And that's really challenging for them. That's not their area of expertise. That's IT's area of expertise, right?

12:52
Travis
And IT, you know, they're responsible for cybersecurity, responsible for getting the business systems to work, making sure data can flow the way we need it to. But ultimately, if they don't understand the tools on the OT side, they don't touch them. And if we don't have these two working in concert, ultimately we have challenges, and we've seen those challenges play out every day. But ultimately these skills benefit each other. They fit together perfectly. They meld perfectly, if we can get the teams to think that way, and to think that we are both working towards the same company objectives. At the end of the day, we all know there are a lot of benefits to this convergence.

13:39
Travis
I've seen it when I've worked with companies who have said: I'm going to take the OT manager and the IT manager and put them together into a new unit, and they're going to run things together going forward. That has changed the outcomes quite fast: faster development, reduced costs and lower risks, getting the right infrastructures in place, faster integrations, more information, all sorts of great benefits from this convergence. So we can see that. But how do we get to the convergence? What types of convergence are we dealing with here? I think there are three types, personally: there's culture, there's technology, and there's data. But when we look at this, all digital transformation really focuses on data, right? And so we focus on convergence around data.

14:33
Travis
How can we move data from the OT world to the IT or business side? And we neglect the other types of convergence that are required in order to make data happen. So in my opinion, in order to achieve full convergence, you have to be able to fix culture, and you've got to leverage the right technology and put the right infrastructure in place. Then we can approach the data that's required for digital transformation. I want to start with culture, because I feel this is the area that's most important here. I am up here talking about digital transformation, but I'm talking about culture, because if we don't fix culture, it's really difficult for us to transform. And of course, we've seen the culture shifts that have happened over time in other areas. Let's take cybersecurity, for example.

15:26
Travis
I'm sure a lot of you, in your organizations, have done trainings around how you are also a part of cybersecurity for your organization, right? The IT teams can only do so much. Everybody plays a role in the overall securing of the systems. Because for me as a person, if I access my email and I'm not paying attention and I just go and download and install things, I can put my company at risk. So everybody plays a role in that journey. And the same thing is true here with digital transformation. Everybody plays a role in getting digital transformation to happen and getting this convergence to happen. So it's not just one team trying to solve this. It's not OT by themselves, and it's not IT by themselves.

16:19
Travis
Ultimately, it's the teams together. But what needs to happen, of course, is that the executive teams, the leadership teams, set the stage. They need to sponsor these kinds of initiatives and communicate down to the organization that in order to get where we want to go, we've got to get everybody working together and on the same page about what we're trying to do. But that brings us to: what are we trying to do? We can't achieve digital transformation if we don't have reasons for why we're trying to do it. So to me, identifying business goals, identifying objectives — what problem are we trying to solve? — that is number one. If everybody understands the problem we're trying to solve, then maybe they'll be a part of the solution. How can we help achieve that outcome?

17:13
Travis
It's really interesting from a psychology standpoint: if we can get communication from the top down so people really understand this and feel like they're connected and part of the journey, you get much better outcomes at the end of the day. That's why I think with digital transformation there's a lot to do when it comes to communication and culture, and that's not something just OT teams can do. This is a hard thing to do. I'm not suggesting it's something you could do tomorrow, but we have to start changing the narrative and start getting organizations to think all the way through how we can accomplish this. And of course, it's all about not taking on the entire world — breaking these things down into chunks that we can actually accomplish, seeing results, and moving on.

18:02
Travis
So culture, to me, is a very important part of the journey. That leads us to technology. There's a lot of technology out there; I say it's a very exciting time to be an engineer. There's amazing technology in the OT world as well as the IT world, and a lot of great things we can take advantage of. And there's a huge shift right now, which is a movement away from proprietary, siloed systems. We know those are not giving us the flexibility that we need. That shift is leading towards an open, transparent, interoperable mindset. And for a lot of the new technologies out there today, being based on open standards is a fundamental part of their DNA.

18:50
Travis
So it's that integration and that interoperability, being able to leverage what's out there, that's what we need. But that requires the systems we use in OT to leverage those kinds of technologies, or, if we're solving a new challenge, making sure we don't get ourselves back into that old way of thinking when we bring in new systems. Systems are also scaling at an enormous rate, and I mentioned this earlier today, but there is so much that can help us not only deploy these things but manage them and scale them out across the organization. There are a lot of technologies here; some of these logos you may not recognize, and some you may.

19:41
Travis
Being able to leverage Kubernetes or containers for deploying systems, there are a lot of advantages and benefits we get from that technology. Being able to utilize identity providers — we've heard that earlier today a couple of times. If all of an organization's tools can use the same centralized authentication system that does multi-factor authentication, maybe it's facial recognition, maybe it's a YubiKey or some other form, then all these things get a lot easier for us to access, to work with, and to secure in our organization. So there are all sorts of amazing tools. And this technology is more around, if you will, the infrastructure. But there are also technologies we want to leverage in the cloud and much more.

20:33
Travis
Again, there's so much out there, but we need to approach it in a way that allows us to leverage it everywhere we want to use it. Okay, so that brings us to data convergence. That is the most important area; it's how we're going to achieve the outcomes we want. And there's a lot of data out there. Of course we know it's critical to our business, but how you approach that data is fundamentally important. And that approach is this: at the end of the day, it is your data, they are your tools, it is your infrastructure; it's how you want to use it and work with it across your organization. It is not a vendor's data, and they're not a vendor's tools.

21:22
Travis
So you don't let vendors dictate how you build your systems, or where you store your data, or what that data looks like. It's something you control. And you have to leverage tools that fundamentally share that philosophy, that are transparent and that allow you to fit them into your infrastructure. You define that infrastructure. I'm very much a proponent of that because I've seen it with IoT in particular. When the big IoT boom happened, a lot of new systems and tools came into play. For example: let's do motor vibration monitoring to predict when a motor will fail.

22:01
Travis
And you could buy that tool from a vendor; they'd hook it up to the motor, publish it to the cloud, and provide a service to you. But you didn't own that data. Maybe you could export it, but you couldn't fit it into your other systems, you couldn't migrate to other systems, because they were just selling you this tool. You might have gotten value from it, but it didn't allow you to use that data more broadly. So to me, this is something you need to own. You need to decide what your approach is, how you want your infrastructure to work, what tools and standards you want to use, and make sure things align with that. It also means, though, that if we're going to approach convergence, we have to think about where we approach it from.

22:44
Travis
And this, to me, is the last big thing I want to mention on this: what is our mindset, and which teams lead that convergence? I fundamentally believe that we have to approach convergence by thinking about OT first. That is where all of our data resides. That is what makes the business money. But there's a lot of domain knowledge required there, and if we approach it from the top down, that knowledge may not be a part of the journey. If we approach it from the bottom — and I'm not suggesting IT is not involved — that is where the data is. We need to define that data there, provide the context at that level, and allow that data to move through our organization. So that OT-first mindset, approaching it from the bottom up, is really important.

23:37
Travis
So I'll give you a story about the top-down approach. I've seen a lot of companies where they have their SCADA systems, their historians, their control systems in place, and the IT team wants to do a particular project with some data scientists, and they say: look, we've got that single line, right? We've got that line down, we can go out to our systems and get that information. So what they do is, independently, go and build some scripts, some tools, to go down and capture that data from the plant floor. Who knows where they get it from? It might be a historian, it might be directly from the system, there might be an API, there might be some other way of accessing it.

24:16
Travis
And then they write code, they map it to something else, they build their ML models. And this, unfortunately, is what I see in the cloud a lot: all these different services in the cloud — they're writing lambdas, moving data over to this system, getting it into a data lake, passing it over to that — there's a lot they're doing with it. And ultimately they do solve the challenge; they get something to be successful. But then how does this scale to every other system you have, to every factory? What if every factory is different in terms of where that data resides? So if we approach it this way, it can be a very big challenge overall.

24:56
Travis
But if we approach it from the bottom up and we build the right foundation across the entirety of our organization, it actually makes achieving these things at the top level a lot more approachable. So to me, with this mindset, if a solution we're trying to put in meets the requirements we need for OT — these mission-critical systems that allow our users and operators to do their job, or even better, make their operations better — and of course it also works for IT, and they understand it, they know how to provide the infrastructure, how to secure it, how to manage it; if it works for both sides, then you don't get resistance, you don't get that conflict.

25:42
Travis
I know it's a simple thing to say, but it's true. And it happens because the teams are then communicating from the same base; they're understanding each other and working together on the same objectives the company has for what it wants to do with that data. Okay, so with that being said, a lot of the journey is about that mindset and that culture change. If we can approach that, then it helps us actually look at our digital journey in a meaningful way. And what I've seen a lot of companies do is come together and say: look, this is what our OT journey means to us. This is what we're going to do. This is how I think we're going to solve some of these challenges, and we're going to take them bite by bite.

26:28
Travis
So if we look at the OT journey, the data journey, it's very much first about data collection. We have to be able to get the data from our machines, our equipment that's out there, and that's brownfield data and greenfield data — legacy PLCs that might be out there, as well as new sensors we're putting in place. We want to be able to collect data from all of those, and then we want to provide a data model to add context to that data, and do that at the edge. You'll notice that first part of it is all at the edge.

27:00
Travis
That single-source-of-truth idea is really important, because you don't want a picture where people don't know what the data is and have to map it or manipulate it multiple times. We want to define as much as we can closer to the source of that data, have the proper data models, so that we can then get it into — and I know we're saying UNS here — an infrastructure that makes sense for you, so that you can first and foremost fundamentally change how OT architectures work. And if we do that, we can approach analytics frameworks, we can approach knowledge graphs, which I'll talk a little more about, and which are really what LLMs work off of.

27:40
Travis
If you really want to do generative AI and use those technologies, we can, but we have to get our data modeled properly for that to happen. So this is the journey. I'm going to start here with data modeling. We've talked about it a few times today, but ultimately it's about being able to take the data from our different assets and define a data model for it. So what does, for example, a CNC machine look like to us as an organization? What does our extruder machine look like to our organization, whatever it might be? We want to organize these models in a way that makes sense, models that we're going to act on as we go forward. It could be energy — that's the example I use here.

28:24
Travis
I have a lot of companies where the problem they're trying to solve is energy reduction. Well, in order to do energy reduction, we have to understand where our energy is coming from and which machines are consuming most of it. So if we approach it by building a data model for what energy consumption looks like for a particular machine, then no matter where the data comes from — it could be different PLCs, different kinds of systems — if we can get that data into that common data model framework, then as we bring it up, we can compare and look at the data across our entire organization. We can now easily compare machines and see what's happening against that particular data model.

29:10
Travis
So I think it's really important to be able to do that. But at the same time, we want to provide additional context. What is the asset ID of that machine? Where is it located? For all the process data that's there, what is the engineering unit? Is it degrees Fahrenheit or degrees Celsius? What's the expected range? What's the target range that we want? All this metadata is important, and it typically fits in this data model, because if we don't have it when we're trying to build a higher-level system, people have to ask those questions. But if I built it all correctly at the edge and we get it up, the data scientists wouldn't have to ask those questions.
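As a rough illustration of the kind of contextualised model being described here, a minimal Python sketch might look like the following; the class names, field names, and example values are hypothetical, not anything prescribed in the talk:

```python
# A minimal sketch of a contextualised machine data model: process values
# carry engineering units and expected/target ranges, and the asset carries
# its ID and location, so consumers never have to ask those questions later.
from dataclasses import dataclass, field

@dataclass
class TagValue:
    name: str               # e.g. "active_power"
    value: float
    engineering_unit: str   # e.g. "kW", "degC"
    expected_range: tuple   # (low, high) sanity limits
    target_range: tuple     # (low, high) desired operating band

@dataclass
class MachineEnergyModel:
    asset_id: str           # e.g. "PRESS-07"
    site: str
    area: str
    tags: list = field(default_factory=list)

# The same model can be filled from any PLC or sensor, which is what makes
# cross-machine comparison possible at higher levels.
press_07 = MachineEnergyModel(
    asset_id="PRESS-07",
    site="Cape Town Plant",
    area="Stamping",
    tags=[TagValue("active_power", 182.4, "kW", (0, 400), (100, 250))],
)
```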

29:46
Travis
They would already know, and then they could do their job by providing the right ML algorithm that's going to help us see anomalies or see what's happening with our energy use. And I've seen some organizations that, just by focusing on that one problem statement, have been able to see things they wouldn't have seen before. In the US, energy costs are rising a lot. We've had some companies who, just by capturing this, can see what their peak demands are. And because some of them will start multiple compressors at the same time, or multiple presses at the same time, they hit a peak draw, and they're getting charged on that peak demand.

30:31
Travis
So what they do instead, because they can see their data in a common model, is stage the compressors as they start them up, and that has saved them a drastic amount of money on energy. But we can't do that unless we have that common framework to work off of. So the data model itself — it's important, as I say, to standardize that for your organization. Now, I'm not trying to dictate what your model is. You get to decide what your machines look like to you. But I will say there are some movements in the industry, some organizations, that are trying to make that job a little easier for you, collaborating with a lot of other people to figure out:

31:15
Travis
what should our energy model be, what should our CNC machine model be, or whatever — especially if we're trying to share data with the supply chain. Because if we have standardized data models there, we can leverage them; and if somebody builds a really cool AI or ML model off a standard, then if we just conform to that standard, we can utilize that cool new AI model. So data model standardization is really important. There is a group called CESMII in the United States — it's the smart manufacturing institute — and they're working off the OPC UA Part 5 specification, which is about defining these data models. You can go to the OPC Foundation and actually see their library of machine models.

32:03
Travis
I'm not suggesting you have to use it, but it's there, and some companies have leveraged those models and just fit their data into them. The point, though, is to do what makes the most sense for you: first and foremost use a model, figure out what it means, and standardize that for your organization. Then, once we have that data with context, we need to get it somewhere, and that is where open standards become really important. We all know that with legacy PLCs and their polling protocols like Modbus and all that, we have to poll these devices, and those aren't open standards.

32:42
Travis
I mean, Modbus has been around for a very long time, but without something in the middle it's very difficult for us to get access to that data. So open standards, to me, are the key to an actual digital infrastructure or UNS, and there are a lot of them out there. We talked about OPC UA, MQTT, and Sparkplug today. I'm going to show an example around MQTT, but there's Kafka, there's GraphQL, there are REST APIs, there are lots of standards we can leverage. And as long as we're putting in tools or building our infrastructure against these open standards, we'll be flexible, we'll be agile, and we can use best-in-class tools. So that brings us to the UNS.

33:26
Travis
With open standards, if I'm going to get data from my machines, I want to get it into this unified namespace. What does that really mean? Why are we saying it really is digital infrastructure? This is a really simple concept; it's nothing complicated, and I don't know why there's so much complexity around the term today. Just think of it as a centralized place where my data can go. If I have all this machine data on my plant floor — setting aside any particular tools — if I just get that data, model it right, and publish it to a centralized place that is based on standards, the ways of getting data in and out of it could be any kind of open standard.

34:10
Travis
A UNS is protocol agnostic. It's meant to just be: I put this data in this repository, and then all these consumers can work with that data and do something with it. That's really all it is. It's about data democratization, about getting data into a framework, an infrastructure, that you can use. It very much is a concept, not a product. I made a joke: you can't buy a UNS, but you can buy components of a UNS. You can build a UNS the way that you want to build it, but you have to define that. You can't go and ask a vendor to provide you a UNS. You define, in your infrastructure, what would make the most sense for you.
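To make that "centralized, standards-based place" idea concrete, here is one hedged sketch of what a UNS might look like if laid out as an MQTT-style topic hierarchy; the enterprise/site/area/line naming and payload keys below are common conventions chosen for the example, not a requirement:

```python
# Hypothetical illustration of a UNS laid out as a topic hierarchy.
# The hierarchy (enterprise/site/area/line/asset/metric) and payload keys
# are assumptions for the sketch, not a prescribed standard.
import json

def uns_topic(enterprise, site, area, line, asset, metric):
    """Build a namespace path that every publisher and consumer agrees on."""
    return f"{enterprise}/{site}/{area}/{line}/{asset}/{metric}"

topic = uns_topic("Acme", "CapeTown", "Stamping", "Line2", "PRESS-07", "energy")

payload = json.dumps({
    "value": 182.4,
    "engineering_unit": "kW",
    "asset_id": "PRESS-07",
    "timestamp": "2024-11-26T09:15:00Z",
})

# Any open standard (MQTT, Kafka, REST, GraphQL) could carry this;
# the point is that the namespace and model are defined once, by you.
print(topic, payload)
```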

34:56
Travis
What tools will we use to help us get all of our data into a common framework? Once you define what that looks like for you, you align the different products and tools against it. And when you have integrators and vendors come in, you say: hey, here's how I want you to fit into our infrastructure, how can you do that for us? Rather than having them dictate what it is for you. So there are tons of benefits to a UNS, lots of cool things you get. I'll talk more about these benefits with a small example, because these are all just PowerPoint benefits; there are practical benefits we get by being able to approach our architecture in a different way.

35:43
Travis
The last thing I want to mention on this digital journey is the end picture of getting to a knowledge graph. As you saw today with Flow's presentation, there's getting process data into a common framework, then potentially building analytics and KPIs and providing those back into that framework. But ultimately, later, we bring that data up into a knowledge graph, which not only defines what the data is but defines the relationships in that data. Relationships are really key; they're crucial for us to gain insight about our entire process. We can model particular machines, but how do those machines work together? How does this relate to environmental information, or information brought in from an MES or an ERP system, like work orders?

36:31
Travis
We have to define these relationships, and there are a lot of great tools that help us do this, but we can't even think about approaching this if we don't first sort out our asset and machine data. That's why I think it's right to think about the UNS first, and that's something we do on premise, because once we have that framework, then we can start looking at these. And there are tools from all the big cloud providers; AWS and Azure do a lot around knowledge graphs. Digital twins work off of knowledge graphs as well — true digital twins, where you can actually simulate your process because you've identified not only what the data looks like but how the process actually functions. So there's a lot here, and this is really crucial for supply chains.
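As a loose illustration of what "defining relationships on top of the asset data" can mean in practice, here is a small sketch using Python's networkx graph library; the node names and relationship labels are invented for the example:

```python
# Toy knowledge graph: machines, their line, and a work order from an ERP.
# Node and edge names are hypothetical; the point is that relationships
# (feeds, part_of, produces) become first-class data, not just tag values.
import networkx as nx

g = nx.DiGraph()

# Assets modelled at the edge / in the UNS
g.add_node("PRESS-07", type="machine", site="CapeTown")
g.add_node("OVEN-02", type="machine", site="CapeTown")
g.add_node("Line2", type="production_line")

# Business context from MES / ERP
g.add_node("WO-10045", type="work_order", product="Bracket-A")

# Relationships are what the knowledge graph adds
g.add_edge("PRESS-07", "Line2", relation="part_of")
g.add_edge("OVEN-02", "Line2", relation="part_of")
g.add_edge("PRESS-07", "OVEN-02", relation="feeds")
g.add_edge("Line2", "WO-10045", relation="produces")

# Example query: which machines are involved in work order WO-10045?
machines = [n for n in g.nodes
            if g.nodes[n].get("type") == "machine"
            and nx.has_path(g, n, "WO-10045")]
print(machines)  # ['PRESS-07', 'OVEN-02']
```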

37:14
Travis
Supply chains are getting more complicated, and in order to define these relationships, especially for how things relate to other products, we have to look at these kinds of tools. Now, again, this is further off; I don't want you to be worried about these kinds of things. You can get there, you can achieve it. But if we can get to this area, building our own LLM becomes really possible, and really cool. There's one use case I think would be really awesome within the OT layer, a practical use for an LLM: creating a chatbot for your operators, for the people running your machines. Imagine feeding the LLM all your process data, modeled properly, so it has all that machine data.

37:57
Travis
You feed it all of the machine documentation, you feed it a history of the alarms that have happened within that facility and what their corrective actions were. You feed it all this data and it continuously learns, so that if there is an issue, you can just ask the chatbot: how is my process working? It can give you really good answers. Instead of having to go searching, you just ask a simple English question: what's wrong with my process? Why did that alarm happen? How can I fix the issue related to that alarm? Those kinds of things are really cool with LLMs.
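A very rough sketch of the retrieval side of such an operator chatbot — gathering alarm history and machine documentation for one asset and assembling a prompt before any model is called — might look like this; the helper names and data shapes are hypothetical, and no specific LLM API is assumed:

```python
# Hypothetical context-assembly step for an operator chatbot.
# Alarm history and machine docs would come from the historian / UNS and a
# document store; here they are inlined, and the actual LLM call is left out.
ALARM_HISTORY = [
    {"asset": "PRESS-07", "alarm": "High motor temperature",
     "time": "2024-11-20T14:02:00Z",
     "corrective_action": "Cleaned cooling fan filter"},
]

MACHINE_DOCS = {
    "PRESS-07": "Hydraulic press. Motor temperature must stay below 85 degC.",
}

def build_prompt(asset: str, question: str) -> str:
    """Bundle docs + alarm history + the operator's question into one prompt."""
    history = "\n".join(
        f"- {a['time']}: {a['alarm']} (fixed by: {a['corrective_action']})"
        for a in ALARM_HISTORY if a["asset"] == asset
    )
    return (
        f"Machine documentation:\n{MACHINE_DOCS.get(asset, 'n/a')}\n\n"
        f"Recent alarms:\n{history or 'none'}\n\n"
        f"Operator question: {question}\n"
    )

print(build_prompt("PRESS-07", "Why did the motor temperature alarm happen?"))
```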

38:43
Travis
But if we neglect all the rest of the foundation, I think trying to accomplish that wastes a lot of resources and time, and I don't think it's something we should approach that way. Okay, so the last thing I'll end with is a simple example of how we can achieve a digital infrastructure, a UNS, within the OT world for all of our process data. We have to start with something, right? And there's a lot of process data out there. As I said earlier, the mindset around how we've been building SCADA systems has stayed the same — and I'm not suggesting you completely change all your stuff tomorrow to this idea. But how we've been building SCADA is: I go and figure out the data I need, I bring it into the system, and I work with it.

39:32
Travis
That's very much this picture. These systems that we have — SCADA systems, MES systems, analytics systems, whatever they might be — are coupled directly with the devices or the sensors that hold that critical data. And this has worked; it's worked well for a very long time. But from an infrastructure standpoint, what happens in this picture if this PLC here — imagine it's a PLC-5 from Allen-Bradley — fails and I can't find another one? They don't sell them anymore. I don't know if you have eBay here or not, but a lot of people in the US are trying to find every one of these devices on eBay so they have inventory on the shelf, because if it dies, they need to be able to put a new one in place. But let's say it dies and they can't find one.

40:23
Travis
It's not available, so we have to bring a new PLC in. That means they have to program the PLC; we're not going to get away from that part. But then what do I have to do with all the applications that are using that data? If I rebuild that PLC, I then probably have to go and touch every one of those systems, change them, and make the modifications necessary for that PLC. That's where a lot of these issues come into play. Or what happens if we introduce a new device? Then for all these applications I have to make connections and bring that data into a lot of them, and it just becomes a management challenge over time.

41:07
Travis
Compare that with a different picture. Again, this is an example; I'm not trying to tell you how to do things, I'm just giving you an example. This is a way that a lot of people have gotten practical solutions in place that give them a lot of value. If you look at this picture, leveraging MQTT as a standard, we're taking our devices and publishing them into a repository. It is just for real-time data, but we're publishing it into a common framework that all these applications can then subscribe to. What's cool about this picture is that the applications know nothing about the end device. They don't know where it's coming from, they don't know which PLC, any of that.

41:52
Travis
They don't have to know where it's coming from or what type of PLC it is; they just get the data coming into that broker. Likewise, the PLCs know nothing about which applications are leveraging the data. And we're not going to have a million connections to the PLC; we're simply going to bring that data in so we can leverage it in a meaningful way. So there are a couple of big benefits we get with this kind of approach. We're decoupling our applications from our devices, and that decoupled nature is what's really important. That's the fundamental architecture change: I'm building SCADA differently. I'm not coupling things together, I'm decoupling them. So when I have a brownfield device, I need to get that brownfield device publishing into that infrastructure.

42:44
Travis
And if I buy a new one, it might very well already support this technology and I can just plug it right in. The way to balance the old world and the new world is by having them both in that common framework, and we need architectures to look more like this. With the decoupling, if I need to change out this PLC, I can do that. I can upgrade the PLC, put the new one in, and publish the data the same way, to the broker, to the UNS — publish the same structure — and my applications don't change whatsoever, because I built what I want that namespace to be. Why it's called a Unified Namespace is that you're defining what your data naming conventions are, what the hierarchy is, and what your data models for your different assets are.

43:30
Travis
You define that up front, and it doesn't matter what device you have, you fit it into that common framework; if you do, none of these things change. There are no more mappings at higher levels for how that data works. So to me, that's one of the big benefits. The other one is that if I want to add a new application to take advantage of that data, I can add it in, connect to the broker, get that data, play with it, experiment, and potentially get some value. If it fails, no big deal: I didn't affect SCADA, I didn't affect the devices, and we're not putting additional load on anything. And I can guarantee that happens in a read-only way as well.
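To make the decoupling concrete, here is a hedged sketch of a new consumer joining the broker read-only: it subscribes to the namespace and never needs to know which PLC produced the data. It assumes the paho-mqtt client library and a local broker, and the topic names follow the hypothetical hierarchy used earlier:

```python
# Hypothetical read-only consumer: subscribes to the UNS and never talks
# to a PLC directly. Broker address and topic filter are assumptions.
import json
import paho.mqtt.client as mqtt

BROKER = "broker.example.local"        # your MQTT broker / UNS entry point
TOPIC_FILTER = "Acme/CapeTown/#"       # everything under one site

def on_message(client, userdata, msg):
    # The consumer sees only topic + modelled payload, not the device.
    data = json.loads(msg.payload)
    print(f"{msg.topic}: {data['value']} {data.get('engineering_unit', '')}")

client = mqtt.Client()                 # paho-mqtt 1.x-style constructor
client.on_message = on_message
client.connect(BROKER, 1883)
client.subscribe(TOPIC_FILTER, qos=1)  # read-only: we only ever subscribe
client.loop_forever()
```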

44:08
Travis
I can control how the data flow actually works. So those are the benefits of this decoupled architecture. All right, so let's see how we can approach it. We have to, of course, have tools that leverage modern infrastructure. Now, a lot of people just have a SCADA system, and if that system supports MQTT, then we can most certainly just publish that data to the broker, and now these consumers can use that data. But there's a problem with this picture. What do you think the problem is? I'm making SCADA — what, here? Middleware. I'm making SCADA middleware. Is SCADA designed to be middleware? No. Have I really changed the architecture much? (Whoops, sorry, don't look at that one.)

45:05
Travis
I still have SCADA getting the data the way it usually does; I'm just bolting something on to get it into a broker. And don't worry, you can get value out of that, but I'm not fundamentally changing the architecture. It's not decoupled: SCADA is still coupled to the devices. Getting data to the business that way is great, but we need to look at an architecture that is more like this. This is where, again, in the brownfield world, we can't change all those devices out overnight, but what we can do is introduce a little layer in front of them: an edge node. We always say edge is basically protocol conversion — something that takes the data, that polls that PLC right at the source as fast as it possibly can, contextualizes the data, and publishes it into a broker.
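Here is a hedged, minimal sketch of that edge-node pattern — poll a register from a PLC over Modbus, wrap it in the contextualised model, and publish it to the broker. It assumes the pymodbus (3.x) and paho-mqtt libraries, and the register address, scaling, and topic names are placeholders, not values from the talk:

```python
# Hypothetical edge node: protocol conversion from Modbus polling to
# MQTT publishing. Addresses, scaling, and topic names are placeholders.
import json
import time
import paho.mqtt.client as mqtt
from pymodbus.client import ModbusTcpClient   # pymodbus 3.x import path

PLC_HOST = "192.168.1.50"
BROKER = "broker.example.local"
TOPIC = "Acme/CapeTown/Stamping/Line2/PRESS-07/energy"

plc = ModbusTcpClient(PLC_HOST)
plc.connect()

bus = mqtt.Client()                           # paho-mqtt 1.x-style constructor
bus.connect(BROKER, 1883)
bus.loop_start()

while True:
    rr = plc.read_holding_registers(0, count=1)  # poll right at the source
    kw = rr.registers[0] / 10.0                  # assumed scaling to kW
    payload = json.dumps({
        "value": kw,
        "engineering_unit": "kW",
        "asset_id": "PRESS-07",
    })
    # Publish the contextualised value into the UNS; consumers subscribe to
    # the topic and never need to know this came from Modbus.
    bus.publish(TOPIC, payload, qos=1, retain=True)
    time.sleep(1)
```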

45:51
Travis
Now there are lots of tools that can do that — there are tons of them on the market today, and in fact there's even hardware that does it. But the idea is that you get an edge node whose sole responsibility is publishing into the broker. Now, I recognize that it's new hardware, it means there's a cost, and you have to get that stuff in place. But if you do it, and you do it in small fragments, overall your architecture gets to the point where all the data is here and every consumer can leverage it — including SCADA. Now you have a more modern, more robust SCADA system. And with MQTT, of course, it does support command and control, and it supports quality within the data.

46:31
Travis
All the stuff we require to make sure we know the state, we know what's happening, and we can trust our system — we get that with this picture. But we also get a brand new way of looking at data. So again, maybe you like this, maybe you don't, and I'm not suggesting you go and change your entire infrastructure to this tomorrow. But you can see the benefits of being able to do it. I can introduce new tools, I can change things out, I can introduce a new smart sensor, and by doing that, all that data is going to be brought in where all these tools can leverage it and work with it.

47:05
Travis
And imagine, of course, that on that picture, on the left-hand side, the consumers are the products you saw here today; they can all fit into that very easily. Then ultimately everybody wins, because we can fully take advantage of the data. So I'm going to end with this, which is a simple recap of what I think the digital transformation journey is really about. First and foremost, it is a mindset change, an architecture change, and especially a culture change, so we can approach it in the right way.

47:37
Travis
Meaning that we don't just use tools for tools' sake, we don't bolt on new technology; we figure out what our infrastructure is and how it needs to work so that we can adapt for tomorrow, and how we get our teams aligned so they can all contribute towards that one common goal. I think digital infrastructure, or the UNS, is critical for that; digital transformation has to happen at some point, and we can't keep building systems the same way we've been building them forever. Building data models, standardizing on data, figuring out what that means for you, getting to a place where we leverage open standards, with interoperability and best-in-class solutions — that's how you're going to get there. But start small. Find a use case or objective you're trying to solve. Solve that, but do it methodically, in the correct way.

48:29
Travis
Don't just bolt stuff on. Figure out what you want that infrastructure to look like, solve that challenge, get the ROI, and rinse and repeat. So hopefully this gives you a few tools that can help you in your digital transformation journey. Thank you.
