By Tebello Masedi
02 November 2020

How-To: How Flow Connects To Different Web Api Sources



Hi everybody. It’s twelve, on the hour. I see there are still a few people joining as I speak, so I’m going to give it another few seconds here, just waiting for everybody to join, and then we’ll kick off the tech session. Right, I think we’re going to get started this afternoon. First of all, I want to say thank you so much to everybody that joined another of our Tech Thursday sessions from Flow Software. For those of you that don’t know me, my name is Lenny, I’m with Flow Software, and I’ll be covering a bit of a technical session here with you today. So, what I’ve got planned for today’s Tech Thursday session, the second one that we’ve launched.

If you haven’t watched our first session, around the aggregation methods available in Flow, please be sure to visit our website, where that session is available. In it I go through the different aggregation methods and how Flow takes all of this data that we’ve got on our factory floors, and really anywhere in our production environment, and turns it into actionable information. What we’re going to do today, though, is a little more technical. We’re going to see how Flow can connect to web API sources to bring that data into our normal realm of manufacturing data, and then really start to leverage quite cool KPIs and calculations from data that potentially sits in IoT platforms, data that sits in cloud platforms in general, data that’s available to us by making an API call.

So I’m going to go through that today. But before we get technical, I just want to plot the typical flow of data that we see when we do a normal project in the manufacturing space. The typical flow of data is what we’ve seen in the industry for numerous years: we’ve got some kind of plant that we automate, we add some instrumentation to that plant, and we control the plant with our PLCs and DCS systems. We add supervisory control with our SCADA and HMI solutions on top of that. And then on top of this layer, which generates this massive amount of data for us, is where we start putting in our MOM type of applications. Now, MOM stands for manufacturing operations management.

And this is typically where all the data that’s been generated by our devices, by our laboratory systems, by our MES systems is stored: in some type of historian, a weighbridge solution, a LIMS, et cetera, and typically MES transactional data as well. We then use this data to generate what we call information, and that’s where the reporting layer fits in. This is literally where we take all of this data and turn it into information, because at the end of the day, the people in the enterprise, the people that work in the organization, need access to all of the information that’s available to make decisions in their normal operations. But it could also be systems, right?

It could be ERP systems, it could be business systems. And this kind of flow of data is very familiar to us, right? For those familiar with the automation pyramid, it looks like the very old, familiar hierarchy of information: literally going from input and output signals on our plant floor all the way up to our ERP or enterprise level. And we’re very comfortable with the way that data acquisition has always happened in these layers. You always go from your sensor to your PLC, then from your PLC to your SCADA level, and then from your SCADA to your MES. It was a very structured way of doing the integration between these different components. But this is being thrown on its head a little bit.

And the reason I say that is obviously the evolution of IIoT that we’ve been hearing so much about in the past two or three years, as well as this concept of digitization. What we see happening is that cloud is becoming a real thing; it’s really exploding into our environment. Things are being moved around in the normal hierarchy here. We’ve got data points that can now get delivered or stored directly in the cloud. And not just the data: we’re seeing entire applications being moved into cloud-hosted environments. MES applications, ERP applications, they are getting moved into the cloud as well. And so there’s this need for any point in the hierarchy to be able to communicate with these different cloud platforms.

And one thing that excites me quite a lot is the concept of node-to-node communication: that pieces of equipment can now really start to talk to one another, really leverage this concept of the Internet of Things, and get the data a machine needs to make an intelligent decision about what to do next on the production floor. So the pyramid is evolving; it’s flattening to a point where there’s node-to-node communication and everything can talk to everything, and everything should be able to get data from whatever other system. Now, you might look at me and say, yes, Lenny, this looks a little like spaghetti integration. It’s point-to-point integration.

It’s something that we’ve been taught in the industry is definitely a no, and we want to be a little more clever in the way that we do this integration. And you’re 100% correct. We definitely don’t want solutions that are point to point. It must be standard communication protocols that we utilize, standard ways of getting the data across.

And in an ideal world, what we would look at is a broker-centric or message-bus kind of architecture, right, where devices can subscribe to this message bus and publish their data, and that data is then available for other systems or software platforms to take, correlate, and contextualize; and those software platforms can in turn publish the data and make it available, via publish and subscribe, to the applications that are now starting to live in the cloud as well. So there’s definitely a method in this madness: with this kind of publish-and-subscribe architecture, pieces of equipment can get the data wherever they want and whenever they want. But what are we seeing in reality?

We’re seeing that currently, if we look at this space, there’s definitely no shortage of devices that enable this IoT kind of environment, right? There are a lot of devices that can push data, that are IIoT-ready, or whatever the case may be. And there’s definitely no shortage of protocols and networks: we’ve got Sigfox, we’ve got LoRa, we’ve got 3G, we’ve got normal WAN connections. And there’s definitely a lot of integration that can be done. There are APIs; vendors have SDKs and toolkits that you can use to create and integrate; there are brokers that get installed, gateways that allow specific communication, controllers that can push directly from the devices into cloud solutions. And there’s definitely no shortage of cloud providers.

Now, the problem with this is that some of these providers and some of these devices have a somewhat proprietary kind of integration that you need to work through. What we are seeing, though, is definite adoption in this space of what we call open standards in the industry. Think of MQTT as just one example: it’s one of the standards that’s getting adopted quite well in this space. So we do see that it’s getting much easier to have this kind of broker-centric architecture, especially when we talk about cloud. We’re almost there, but we’re not entirely ready. And the need from the user is exactly the same as it always was.

It doesn’t matter if you look at the normal integration via the pyramid, or at this new broker-centric architecture: users still want to know what the production figures for the day were, and they want to add context to that. They want to know which area of the plant it was for, what it was for each shift, what material was produced. They also want a utility perspective: what’s their monthly water consumption, what’s the energy usage. They want to know the average temperature while they were running a particular batch, and which operators on the plant floor are exceeding their targets and doing very well. And very importantly: what is my efficiency, what are my KPI metrics?

When we look at OEE from our machines: am I actually producing what I want? Now, typically, what we’ve seen in the past was kind of easy, right? All of this information came from the databases that we had on our plant floor, from our historians, from our MES systems, et cetera. But as the data starts moving to cloud-hosted providers, as the applications themselves shift into these cloud-hosted environments, we need to extend our little realm here. We need to be able to connect into this information layer too: the data that sits in these cloud-hosted environments. I know I’m showing arrows only pointing into the MES layer, but potentially you also need to be able to send data back out to these applications, to enable that node-to-node communication between these systems as well.

And this is where Flow fits in, I would say, pretty perfectly. It’s really geared to turn this mass of data into information, and it’s also geared to become the software platform that enables the sharing of information not only between enterprises and people, but also with pieces of equipment, devices, and other applications that might require the information as well. Now, you’ve probably seen the Flow information architecture a lot. I just want to very quickly run through it, because I want to highlight a few of the things that I’m going to cover in the demo just now. And as you know, when you install Flow, you create this information platform.

And the first thing that we do with Flow is connect to all of these different data sources, disparate data sources or silos of information that we have in our production facilities, and, very importantly, manual data that we still need to capture. So you can already see there, we need to be able to connect to historians as per normal. But we also included these cloud and IoT bubbles in our collection architecture, to be able to use the data that sits in these cloud platforms in our calculations and in our KPIs, to really mix and match the information that’s available. Obviously, we can then utilize that in our visualization platforms and collaborate with our mobile devices.

And as we explained, the software platform must also be able to push the data out, to publish its data back to other systems, and Flow can definitely do that with the integration capability it’s got. What we’re going to focus on today, though, is how I actually get data that sits in these cloud platforms, IoT platforms, or pretty much any web-API-driven kind of solution. How do I get that data inside of Flow, and how do I utilize it? To do that, we’re going to move on to the demo side of the session. I’m going to show you all the different APIs that are available and what I’m going to do to demo this for you.

So, the APIs that I’m going to utilize in the demo today. First of all, I’m going to get weather data into Flow. I think this is probably one of the things that people think is the easiest to do, and it definitely is. Now, weather data can be extremely valuable, especially if you look at rainfall and humidity. You might need to utilize that: if you have storage locations out in the field, silos where the actual raw product turns wet, you want to know what the precipitation was. You can definitely use that in your calculations, as well as temperatures, et cetera. So I’m going to get weather data into Flow by using a weather API.

I’m also going to get some employee data, probably not too relevant, but there’s a method in my madness with this one: just to show you how powerful the queries can be and how we can filter the data that we get back from our web APIs. I’m also going to get exchange rate data. I’m sitting here in South Africa, in Johannesburg, so I’m going to get the conversion between South African rand and dollar, and I’m going to use that exact spot rate to do some KPI calculations. And then I also have a little web service database that serves utilities data: I’ve got a meter that pushes my power consumption data into the cloud.

And I want to get that data back, because I want to use it in conjunction with my production data to do intensity calculations, as an example. I’ll show you how to do that as well. And then, just for fun, I thought it would be cool to pull Bitcoin data into Flow and do some cool aggregation methods with it. At this point I want to make a disclaimer: Flow Software is at no point in time a financial institution. Ts and Cs apply; we cannot be held accountable for any financial decisions you make after this webinar. And then, obviously, I’m going to use all of that in conjunction with some historian data that sits in a pretty much stock-standard historian. Cool. So those are the web APIs that I’m going to use in the session today.

Just to give you a little feedback on what I’m going to do before I go gung ho and call a whole bunch of APIs inside of my system. Cool. I’m going to move across here to my VM. Even before I go into Flow, the configuration tool in Flow, to start connecting to APIs and getting data, there are two other pieces of software that I would strongly recommend everybody gets before they even start thinking of doing web API integration into Flow. The first one is Postman. Now, Postman is something that I probably use on a weekly basis. It’s a great little tool to simulate calls and to see if you’ve got the correct API.

A lot of APIs need a token, and you pass that token before you can actually get the data, so Postman helps you check that your keys work. Postman is definitely up there when it comes to testing web APIs and calls, to see what the data you’re going to get back looks like. So I would strongly advise getting Postman as a solution. The other thing that I use a lot of the time: the data that comes back from these web APIs is structured in a JSON packet format, and there’s a little utility that I use. It’s an online utility, it’s called

And this is really great for me to be able to drill into the data, to see what the query should be that I configure in Flow to get specific data out of the API calls, out of the data that comes back from our web API. Right, let’s do the first one. Let’s do weather data. So I’ve created an account with, and they’ve got very good API documentation here, where I can get the current weather data for your city. You’ll notice that they’ve got a current weather API; if I click on the API documentation, they show exactly what that API call is. Pretty much all you need to tell it is what city you’re in, as well as an API key. So you need to register, you need to sign in with them.

They will then supply you with an API key, and you need that API key to make your call. Now, I’ve done that already, so I’m just going to go here to a little cheat sheet of all my URLs. Let’s do the weather one first. So, there’s the weather URL that I would use. Obviously I am using Johannesburg as the city, because that’s currently where I am, and I’ve got my API key there as well. Now, even before I go into Flow, I’m going to do it in Postman. So in Postman here, I’m going to make a new collection. I’m just going to call it TS, for tech session. I’m going to create a collection here, and in here I’m going to create a new request. All right, so I’m going to add a request.

This will be my weather request, and it’s going to be a GET command. All I need to do is paste in my URL here. Cool. Postman is quite cool: it will show you what parameters you’re passing. You’ll see that I’m passing Johannesburg as my city; I’m also using metric units so I can get degrees C back; and then there’s my API key as well. I’ll hit the send button here and it will go and actually do a request. And there we go. At the bottom here, I can see the actual response from this API call, and I can see that the current temperature in Johannesburg, on a very nice, hot summer evening, is about 26 degrees Celsius. So obviously I would like to know in Flow what this temperature is.

So that’s the KPI, or the metric, that I want: the current temperature in Johannesburg. Now, this is where the second of my little testing tools comes in. What I do is select all of this data in Postman, copy it, then go back to that JSON path evaluator and pretty much paste in the data that I got back from Postman. Now, the end goal is to get to the temperature field, because that’s the field that I would like to record inside of Flow. JSONPath gives you a way to browse to the different locations in this JSON object. You’ll notice that there’s an array of weather data here, and a whole bunch of base objects that I can go and look at.

I can look at the cloud data, I can look at the wind speed data, but I would like to get to this main data that sits in here, in the temperature field. They do have a whole bunch of expressions that you can use: you can recurse down the object, if there are arrays you can use the child operator, you can reference the current element that you’re in, and you can just browse into the data here until you get to the field that you want. So I’m going to expand that a little. Currently, you see, if I do a dollar sign, I’m at the root of the object here.

So if I start going down into the structure, into main, I get just the main portion of the object that I got back. And if I only want temperature back, I just do another dot and go to temperature, and that will return only the actual temperature field inside of this massive JSON packet that I got back from the web API call. That’s quite cool; this is exactly the value that I would like to get inside of Flow. So these are really cool tools to make sure that your paths are correct, that you get to the right point, before you even try to configure this inside of Flow. So let’s go into Flow now. I think we are ready.
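That dollar-dot navigation is easy to mimic in a few lines of Python, which can help make the mechanics concrete: `$` is the root of the parsed object, and each `.name` step descends one level. The payload below is a trimmed, made-up imitation of a weather response, not real data.

```python
# A toy dotted-path resolver illustrating "$.main" and "$.main.temp".
# The sample payload is invented to mimic the shape of a weather response.
import json

sample = json.loads("""
{
  "weather": [{"main": "Clear", "description": "clear sky"}],
  "main": {"temp": 26.0, "humidity": 40, "pressure": 1018},
  "wind": {"speed": 3.1}
}
""")

def resolve(path: str, obj):
    """Walk a dotted JSON path such as '$.main.temp' down a parsed object."""
    parts = path.split(".")
    assert parts[0] == "$", "path must start at the root '$'"
    for part in parts[1:]:
        obj = obj[part]          # descend one level per ".name" step
    return obj

print(resolve("$.main", sample))       # the whole "main" sub-object
print(resolve("$.main.temp", sample))  # just the temperature value
```

Real JSONPath implementations add array indexing, wildcards, and filters on top of this, but the root-then-descend idea is the same.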

We know exactly how to get to that temperature field. So I’m going to open up the Flow configuration editor here. As per normal, I’ve got a config already; it’s quite a blank model, because I want to do a lot of these things from scratch. The integration section in Flow is where I go and set up connections to all the different data sources that I want to pull into Flow. If I right-click here to add a new data source, we have all the typical time-series historian data sources that we can get data from, like the Canary historian, the Ignition historian from Inductive Automation, et cetera.

You’ll notice right at the bottom here we’ve got an option to add cloud integration, and we’ve got a generic web service call that I can utilize to get the data from these different API calls. So I’m going to use this web service call, and it’s actually not too difficult to set up. You give it a name, so I’m just going to call it tech session weather, and then it asks you for a host or a URL that you need to supply. Now, if we go back to Postman here, I’m going to copy this whole URL, and I’m going to paste that into my host connection here.

But the host is just the first part of the API call, so I’m going to delete all the rest here. The host that I’m calling is pretty much just API. If I needed to supply a username and password, I could do that; luckily this web API doesn’t need it. If I needed to enable basic authentication, I could do that as well; luckily we don’t need that either. I do have a key: I need an API key to actually make the call. Now, we can supply the key right here inside of the configuration, and if we go back to Postman, the key is this end part here that I can go and copy.

Now, we do password-protect the key; that’s why we add it in here for the endpoint, so nobody else can see what your particular key is. If I had to add a header to my GET request, I could do that as well. And then I also have the capability to cache the data. This doesn’t just cache the HTTP URL, it actually caches all the data that comes back from the call, given that the URL stays the same; if the URL changes, then obviously we’ll get new data back. In this case it’s live data as I get it in, so I don’t need to cache it. I’m going to take the caching off for now and make it false. And pretty much that’s it.

That’s all I need to do to get that set up and going. So I’m going to save this connection, and then I’ve got a new tech session weather API connection up and running. Now, unfortunately we can’t browse the namespace, because we just don’t know what the actual parameters are that you’re going to pass, or how many cities you’re going to call it for. So we give you the capability to create your own namespace. I’m going to create a namespace here: it’s going to be temperature, it’s going to be an analog value, and it’s going to be for Johannesburg. I’m just going to go and add that as well. All right, so now I can go and edit my endpoint.

So obviously I need to supply it with the endpoint of the API that you’re going to call. The endpoint is the part after the initial URL, right? So I’m going to copy that out, put that back into Flow, and replace my key with the placeholder that I defined in the connection. So this will just become my key; at no point is your key actually visible when you do the configuration. Now I need to supply the data path. As we saw in our little evaluator, the data path is going to be $.main.

So I’m going to change this to $.main, and then the actual value that I want back is just temp, the temperature, so I’m going to change that to temp. All right, and that is it. That is all I need to do; I’m just going to zoom in a little bit here so you can see that exact same data path and value in Flow. If you watched my previous session about the aggregation methods: currently I’m not using an aggregation method, because this API call doesn’t give a timestamp. If you do have web API calls that give you back a timestamp as well, then you will be able to change the aggregation method to any one of these aggregation methods

that are standard with Flow. But for that we need an API with some timestamps; I’m going to do that a little later when I look at my utilities data. Perfect. So this is all that I have to do, all that I have to set up. And let’s see if this works. I’m going to minimize this and add a new folder here; I’m just going to call it tech session, and I’m going to create a temperature metric for Johannesburg. All right, let’s see if I can get the value from Joburg. I’ve already done this on my weather API, so I’ve got my Johannesburg temperature there. I’m going to drag it across underneath my Johannesburg temperature metric, and I’m going to deploy this out.

And the Flow engine will now, at the back end, go and query the API and get the data back, and I should see a value of around 26 or 27 degrees when it’s done. Obviously, if you were able to pass in a start and an end date parameter, it could have actually backfilled the data and got the historical data from the weather service as well. That, unfortunately, is a paid-for service, and for the purposes of my demo I just registered with the free version, so I’m only getting the live temperature value back. So if I refresh there: there we go, 26 degrees currently here in Johannesburg. Cool. So I’ve got weather data inside of Flow by using a web API. Cool.

The next thing we’re going to look at is employee data. The reason I want to show employee data is just to show you how powerful the JSON path combinations are and how deep you can actually filter with them. There are numerous dummy REST APIs available on the Internet where you can pretty much just make an API call; in this case, this one generates a whole bunch of employee data that you can play with to learn all these different ways to get at the data and all the different paths. So I’m going to do exactly that. I’m going to copy this dummy data that I got from this web API, and again paste it into my evaluator here to look at the data.

All right, so if we look at this data: we’ve got a success code from the API, just to tell us that the API call was successful, and then we’ve got a data array of our values. All I need to do here is go into the data array. There we go. And if I want to look at the employee salaries, I can do a dot and look at the employee salary field, and this is where my spelling is really testing me, so I’m going to cheat a bit: copy that out, paste it in, and there we go. So I’m in the array, and I’m looking at all of the different employee salaries; very confidential data, I know.

And then I can also look at the employee ages. Now, this doesn’t really help me much, because what am I going to do with an array of ages? I can’t just add them together. But potentially I want to look at a particular employee’s age. So let’s pick on Doris. Sorry, Doris: I want to know what Doris’s age is. Again, if you expand the expressions here and go to this JSON path expression editor, there are really good examples of how you can filter, look for specific conditions in the array, and then filter out data based on that. So that’s exactly what I’m going to do: I’m going to filter on the employee name to get the actual employee age out of this JSON blob that I’ve got here. So let’s do that again.

Let’s copy the data and paste it in. Let’s look at that. There we go: there are the employee ages. But now I want to filter the age based on the employee name. Now, I’ve done this before, so I’m just going to use my little cheat sheet: copy that out, paste that in here as well. What I’m saying now is: bring me back the age where my employee’s name is Doris. And all I want to know is, what’s Doris’s age? Obviously, Doris’s age is 23. So if I copy out the age component at the end, I should be able to see what Doris’s age is, and Doris’s age is 23. Perfect. So I can use these expressions to really drill into these data points and get the data out.
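The filter being built here is, in JSONPath terms, roughly `$.data[?(@.employee_name == 'Doris')].employee_age`. Spelled out in plain Python, the same select-where logic looks like this; the employee records and field names below are invented to mimic the dummy API's shape.

```python
# Plain-Python equivalent of a JSONPath filter expression:
# select employee_age from the data array where employee_name == "Doris".
# All records and field names here are made up for illustration.
data = {
    "status": "success",
    "data": [
        {"id": 1, "employee_name": "Tiger", "employee_age": 61, "employee_salary": 320800},
        {"id": 2, "employee_name": "Doris", "employee_age": 23, "employee_salary": 98430},
        {"id": 3, "employee_name": "Ashton", "employee_age": 45, "employee_salary": 86000},
    ],
}

def filter_field(rows, where_field, equals, select_field):
    """Return select_field from every row where where_field == equals."""
    return [row[select_field] for row in rows if row[where_field] == equals]

ages = filter_field(data["data"], "employee_name", "Doris", "employee_age")
print(ages)  # [23]
```

Note the result is still a list: a JSONPath filter can match more than one element, which is why the demo copies out the single age at the end.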

Again, I’ve done that already, so all I’m going to do is show you the configuration. In this case, it was very simple: I didn’t need a key or a token to be passed to this API call, so the configuration of this API is pretty simple; it’s literally just calling the dummy data, no authentication required. And then in the configuration of the age, all I had to do was add in my filter condition, exactly like we did, and then look at the actual age of the employee. So if I do that, let’s create another metric here: this is employee age.

Drag it across, and if I deploy this out, the Flow engine in the back will go and query the API, and it will record an age of 23, because I’m filtering down to just Doris. Cool. So let’s do that, and the Flow engine will process that in the back end. When it’s done processing and getting the data, you’ll notice that it recorded a value of 23 for Doris. Cool. There we go: Doris is 23 years of age. I don’t really know why you would want that in your dashboard, but it shows you the power of filtering from an API call perspective. Right, let’s look at a more typical kind of production information example. I also have a historian in this environment, a process historian.

And in this process historian, I have a packaging line, and I’ve got two fillers. What I’ve got is a bottle count of all the rejects that get discarded off my filling line. So I potentially want to see, from a cost perspective, what all of these rejects cost me from a production perspective. I want a KPI that’s going to tell me how much money I lose on all of these bottles that I need to reject. So I’m going to create a new metric here: this is going to be my rejects cost, and I’m going to go and add my rejects from my historian, drag it across, and these are my rejected bottles. Cool. Now, the data that sits behind this is a normal totalizer value that sits in my historian.

So obviously I need to be able to do a counter retrieval of that data. Again, please have a look at the aggregation webinar I did last week, where I explain all the different retrieval methods inside of Flow. In essence, if I want to see the total rejects for the hour, I configure what we call a counter retrieval; that will query the data, perform a counter retrieval on it, and then tell me the actual number of rejects. The Flow engine will again backfill that, query the historical data, and there you can see all the rejects that we’ve got. Perfect. Now, I know that every bottle I reject costs about one rand fifty to manufacture.
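The idea behind a counter retrieval can be sketched in a few lines: given totalizer samples taken at hour boundaries, the total per hour is just the difference between consecutive readings. The sample values below are invented, and this simplified sketch ignores counter resets and rollovers, which a real retrieval (including Flow's) has to handle.

```python
# A simplified model of a counter retrieval on a monotonically increasing
# totalizer. Sample values are invented; rollover handling is omitted.
samples = [1200, 1265, 1338, 1340, 1412]  # totalizer reading at each hour boundary

def counter_retrieval(values):
    """Hourly totals as deltas between consecutive totalizer readings."""
    return [b - a for a, b in zip(values, values[1:])]

print(counter_retrieval(samples))  # [65, 73, 2, 72]
```

Five hourly readings yield four hourly totals, one per interval between readings.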

So it costs me R1.50 to actually make a bottle, and I would like to multiply that out to get the total in rands and cents that I’m literally throwing down the drain. What I’m going to do here is create a measure: this is going to be my cost of manufacturing, and in this case my cost is going to be in rands, so I’m going to change my unit of measure to rands. I know that it costs R1.50 for a bottle, so the value is going to be 1.5. That’s going to be my cost, and I’m going to backfill that out. Now, the equation to see the total monetary value associated with my rejects is pretty simple.

All I have to do is multiply my cost by the number of rejected bottles that I’ve got. So I’m going to do that and create a calculation here: this is going to be my reject cost calculation. Literally all I have to do is say: take the number of rejected bottles and multiply that by what it costs to actually manufacture a bottle. So let’s do that and add the cost in there as well. It’s a very simple calculation, and the unit is now rands, so this is the total rands that I’m wasting. Deploy that out, and we can see the total rands that we are wasting on our rejects. So not too bad: about 600 rands per hour.
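Spelled out as arithmetic, the calculation is just the hourly reject count times the R1.50 unit cost, with the hourly figures then summed up to a daily number. The reject counts below are invented stand-ins for the values retrieved from the historian.

```python
# The reject-cost calculation made explicit. Reject counts are illustrative.
COST_PER_BOTTLE = 1.50  # rand per rejected bottle (from the measure above)

hourly_rejects = [410, 385, 402]  # bottles rejected per hour (made up)

hourly_cost = [round(n * COST_PER_BOTTLE, 2) for n in hourly_rejects]
daily_cost = sum(hourly_cost)  # aggregating the hourly figures up to a day

print(hourly_cost)  # [615.0, 577.5, 603.0]
print(daily_cost)   # 1795.5
```

The same multiply-then-aggregate pattern is what Flow performs when the hourly calculation is rolled up to a daily cost.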

If I want to see what that looks like over the day, I can obviously aggregate that up, so I can get a daily cost of all my rejects. Now, we've got some international guests on the webinar as well, so I would like to see this as a dollar value, not in rands. And this is the third API that I can use. So I've got an API here that I use called Currencylayer. Again, I had to go and create an API key, but it's quite cool. You can ask it for a whole bunch of live currencies at this point in time. So if I run that, you'll notice I can get the current US dollar against the rand, against the Australian dollar, against the Canadian dollar.

And then, yes, you saw right: US dollar to South African rand is a whopping 16 rand per dollar. Cool. Again, I can copy this out and put it into my little evaluator here. Let's clear out all the data in here, paste that in, and go. Now look at that. So obviously I want to go to quotes, and inside of quotes I would like to see the USDZAR value. So I'm going to go to my branch, go to quotes, and inside of quotes ask for USDZAR. And there we go. That is exactly the path I need to configure inside of Flow, and I've already done that.
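The path walking shown here, root, then quotes, then USDZAR, can be illustrated in plain Python against a sample payload. The payload below only mimics the shape of the Currencylayer live response, and the rates are made-up values:

```python
import json

# Sample payload shaped like Currencylayer's /live response (values illustrative).
payload = json.loads("""
{
  "success": true,
  "source": "USD",
  "quotes": {
    "USDZAR": 16.38,
    "USDAUD": 1.40,
    "USDCAD": 1.31
  }
}
""")

# The JSON path configured in Flow is $.quotes.USDZAR:
# start at the root ($), step into "quotes", then read "USDZAR".
usdzar = payload["quotes"]["USDZAR"]
print(usdzar)  # 16.38
```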

But just to show you guys how to get to that particular point as well: again, I use the key component to specify my API key when I set up the connection. Very simple, it's the host URL, and then I added my key to the endpoint component. Again, I'm not caching it, because I'm using the live data as it comes in and not asking for historical data, but caching is definitely something you could enable if you wanted historical exchange rate data as well. And then obviously I had to configure the exchange rate, and in this case I need to look at my endpoint.

I'm adding in the key that I've configured, and I'm also asking for these three currencies. Then, as we saw, I'm looking at my root path; the actual value sits in the quotes object, and inside the quotes object there is the USDZAR component. Cool. So that's all I have to configure. Let's drag that across and deploy it out, and that should give me a value for the current exchange rate. Let's just wait for the Flow engine to query the web API and get the values out; it should be somewhere around 16 rand to the dollar. All right, so to see what my actual cost in breakages is in dollars, obviously I can use the rand cost that I've got there.

Let's just wait for this call to finish off, and we should see the value. There we go: 16.38 is the current spot exchange rate. So let's do that. Let's look at the hourly rejects and do a calculation here. This is going to be my reject cost, but based in dollars, so I can just do that conversion and change the unit of measure. All right, open that up, and in the retrieval section here drag in the current spot rate. Obviously, to go from rands to dollars, I take my rand cost and divide it by my exchange rate, and change the unit of measure to dollars. And then I can obviously aggregate this up to the day.
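The conversion is a single division, since a USDZAR quote is rands per dollar. A minimal sketch, using the 600-rand hourly reject cost and the 16.38 spot rate from the demo:

```python
def zar_to_usd(amount_zar, usdzar_rate):
    # USDZAR quotes rands per dollar, so divide to go from rands to dollars.
    return amount_zar / usdzar_rate

# Hourly reject cost of 600 rands at a spot rate of 16.38 rands per dollar.
print(round(zar_to_usd(600.0, 16.38), 2))  # 36.63
```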

So if I want to know my daily cost in dollars, it will take all the hourly information, add it together, and show my daily cost from a dollar perspective. So let's see. There we go: it's about $670 per day that I am throwing in the dustbin due to the rejects I'm running in my factory. Quite cool. I hope you guys can see how we can mix and match all the data we've got from these devices and utilize it in really useful KPIs and calculations. All right. Up until now I've only been looking at live data, the actual values as they come in, without applying any aggregation methods.

The reason for that is that none of the API calls I've done up until now had time series data coupled to them; it was live values for exchange rates, live values for weather, et cetera. But I do have a data source with some raw utilities data that I can also pull, and I'm quickly going to show you guys how that data looks as well. So I've got some raw data here that's been pulled: it's got a timestamp, and coupled with that timestamp is a value. So this is typical time series data that's been stored in a cloud-hosted historian environment, and I'm going to utilize it to get some time series data from a web API call.

Again, I'm going to utilize my little cheat tool here to see how I need to access the data. Paste that in here and let's start right from scratch. All right, so in this case it's telling me the date range that the data was pulled for. I'm not too fussed about these initial period start and period end components. What I do want are these values. So there are some value components in here, each with a timestamp associated with it and then obviously the value itself. So I need to get to this value array that sits in my object here, and I can literally go and say: give me the measure values. All right, so there we go.

Here are all the measure values that sit inside that array. Now, if I want to look at the timestamps, I just go into the array and ask for t. So there are all the timestamps inside that measure values object array, and if I change the t to a v, it's going to give me all the values. So I've got two paths now: a path showing me the values and a path showing me the timestamp associated with each and every one of those values. So in this case it's very simple. If I look at the power data I'm getting back from this web API call, all I have to do is populate the timestamp path as well.
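Pulling the two parallel paths out of such a payload can be sketched in Python. The payload below is only shaped like the cloud historian response described here; the field names (measureValues, t, v) and the readings are assumptions for illustration:

```python
import json

# Sample payload shaped like the cloud-hosted historian response
# (field names and values are illustrative assumptions).
payload = json.loads("""
{
  "periodStart": "2020-11-02T00:00:00Z",
  "periodEnd": "2020-11-02T02:00:00Z",
  "measureValues": [
    {"t": "2020-11-02T00:00:00Z", "v": 10450},
    {"t": "2020-11-02T01:00:00Z", "v": 10512},
    {"t": "2020-11-02T02:00:00Z", "v": 10577}
  ]
}
""")

# Two paths over the same array: measureValues[*].t for timestamps,
# measureValues[*].v for values.
timestamps = [row["t"] for row in payload["measureValues"]]
values = [row["v"] for row in payload["measureValues"]]
series = list(zip(timestamps, values))  # time-series pairs Flow can aggregate
print(series[0])
```

Because each value now carries a timestamp, the full set of time-based aggregations (counter retrievals, averages per hour, and so on) can be applied, which is exactly what the next step does.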

Up until now I've not used this timestamp path, because none of the API calls gave it to me. But in this case I would love to use it, and now I can obviously utilize the whole set of Flow aggregation methods on top of this data, because it does return a timestamp with each value. In this case it's a counter that sits in the cloud, so I'm again going to use the counter retrieval, and I'm going to pull that out as well. So let me create a new measure here; this is going to be my power data.

And again, as per normal, all I have to do is drag it across, and Flow will query that API, look at the timestamps and the values, and in this case actually apply an aggregation method on top of that data. So if I open this up, there we go: there are the total kilowatt-hours that the machine is using every hour. Now I can use this utilities data in conjunction with actual process historian data to do some cool intensity calculations. If I want to know my energy usage per bottle, I can do that very simply, because back in my process historian I do have the data for bottles. So I'm going to create a new metric here.

This is going to be intensity, and all I have to do is drag my bottle count across. Again, bottle count is a counter or a totalizer, so I'm going to change this to a counter retrieval and deploy it out. Now it's very simple to get an intensity calculation, or kilowatt-hours per bottle. All I have to do is take my total bottles and drag them to the calculator. This is going to be my intensity calculation, and all it has to answer is: how many kilowatt-hours do I use to produce a bottle?

So I'm going to drag in my power usage here as well, and in this case I'm going to say: take my power and divide it by the total bottles that I've got. And there is a very simple intensity calculation that tells me kilowatt-hours per bottle, or per piece, or whatever you want to call it. Cool. Deploy it, and the Flow engine will create this calculation, and you can see what your intensity is, how many kilowatt-hours you use when producing a piece. Very low, which is good. So I hope you guys can see how I can mix and match these different data sources; it doesn't matter where the data is coming from.
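The intensity calculation is one division, with a guard for periods where nothing was produced. The 62 kWh and 12,400 bottles below are illustrative figures, not values from the demo:

```python
def energy_intensity(kwh_used, bottles_produced):
    """kWh per bottle for a period; None when nothing was produced,
    so a zero-production hour doesn't divide by zero."""
    if bottles_produced == 0:
        return None
    return kwh_used / bottles_produced

# Illustrative hour: 62 kWh consumed while producing 12,400 bottles.
print(energy_intensity(62.0, 12400))  # 0.005 kWh per bottle
```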

The Flow software platform creates one single, rich repository of all this data coming from all these different sources that I've got. Okay, so these are the typical kinds of things that we see: exchange rates, a lot of weather data getting pulled in, and a lot of data coming from IoT sensors that push data into the cloud, which people need to get back to the plant floor to do these calculations. There's a lot of this type of thing that we encounter, and we've had quite a lot of tech support calls on exactly how to do this in the past few months, so I thought this was a very good time to show it as well. Now for something completely different.

I know it's not related to manufacturing data, but I thought it would be very cool to show as well, and that is the last API call I'm going to do today: actually getting some bitcoin data into Flow. So raise your hands if you've got bitcoin. If you don't have bitcoin, I think after this you might go and buy some, because what I'm going to show you is pretty cool. All right, so we've got an exchange here in South Africa called the Luno exchange, and it pretty much allows you to see the current bitcoin price.

So I've got my URL here, and I'm going to paste it in, and that is going to tell me the current ask price and bid price for bitcoin, obviously in rand. But do not fear: because we've got the exchange rate in Flow, we can convert this back to dollars as well. So let's see how I can get to the ask price. Again, I'm going to copy this out and put it into my little JSON cheat tool here. And you'll notice that the ask price is part of the root JSON packet. To get to the root, it's the dollar sign. So there I'm in the root component of this JSON packet, and literally to get the ask price, all I have to do is say: give me the ask price. So there we go.

This is the current ask price for bitcoin. It's quite high; I don't know what happened with bitcoin in the past few months, but it really skyrocketed. Cool. So all I have to do in Flow is, again, configure it to look at this specific data point, and I can get the current asking price. So I've done that: I've got an API call here for the Luno exchange, and nothing is fancy or different from all the other components we've done today. There's the base URL, and in this case you don't need a token to get the current values, so there's no endpoint key or token here. Pretty simple. And then I've created the current asking price from the ticker, and that is pretty plain and simple.
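Because the ask price sits directly in the root object, the JSON path is just the dollar sign plus the field name. The payload below is a sample in the shape of a Luno ticker response; the pair name, field names, and prices are illustrative and worth verifying against Luno's own API documentation:

```python
import json

# Sample ticker payload shaped like a Luno exchange response
# (pair, fields, and values are illustrative assumptions).
ticker = json.loads("""
{
  "pair": "XBTZAR",
  "timestamp": 1604318400000,
  "bid": "238500.00",
  "ask": "239000.00"
}
""")

# JSON path $.ask: the ask price lives directly in the root object.
# Prices arrive as strings here, so convert before doing arithmetic.
ask_zar = float(ticker["ask"])
print(ask_zar)  # current bitcoin ask price in rands
```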

Look at the dollar sign, the root node, and then the current ask price: the current bitcoin-rand ask price. Cool. So if I do that and pull this in, let's create a new metric here for bitcoin, drag this guy in, and deploy it. Cool. All right, so I'm going to query the Luno exchange, get the actual current spot rate for bitcoin back, and let this run for a few hours. And then I've got some great historical data of the actual bitcoin price. Now, a very simple method of looking at market movements is moving averages. I don't know why we don't apply the same concept to manufacturing data; it feels to me like something that can be utilized there as well.

It's literally looking at moving averages, not to predict, but to see how the market is moving. Now, if I did moving averages on my broken bottles or my actual total production, I could potentially see when production is about to go higher or lower. But as a warning: hindsight is always 20/20 vision, so it's not necessarily going to work out that way. Cool. So I've got the asking price for my bitcoin. Now, because Flow stores all of this historical data inside its databases, I can very quickly and simply do a moving average calculation in Flow.

So if I take this ask price and drag it onto the calculator, I can tell Flow: you know what, in the retrieval section for this guy, don't look at its current value; go and look at a previous range of values. In this case, I want to look at the past 18 hours' worth of bitcoin values, take those 18 values, and do a very plain, simple average of them. So, in essence, what I've done here is create a moving average taking 18 hours into account: an 18-hour moving average of the value. Check the calculation and deploy it out. Now, I did this a little earlier to get some historical data before the session.
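The retrieval described here, averaging a trailing window of stored values, is the classic simple moving average. A minimal sketch with made-up prices (the window is 3 points here just to keep the example short; the demo uses 18):

```python
def moving_average(values, window):
    """Simple moving average; None until a full window of history exists."""
    out = []
    for i in range(len(values)):
        if i + 1 < window:
            out.append(None)  # not enough trailing values yet
        else:
            out.append(sum(values[i + 1 - window:i + 1]) / window)
    return out

prices = [100, 102, 101, 105, 107, 106]  # illustrative ask prices
print(moving_average(prices, 3))
```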

I've done it for both hourly values as well as every five minutes, and I've generated these moving averages on the bitcoin price, so I can see the current 18-point, 4-point, and 9-point moving averages. Cool. Now, let's plot this on a chart. I'm going to go to my chart here inside of Flow, and let's look at just the actual ask price from Luno; sorry, I'm just going to take off all the other things. Cool. So there we go: there's the market movement of bitcoin over the past 4 hours. I've added in the labels so I can see the actual values. So these are the actual raw values, every five minutes. Let's add just a very simple 4-point moving average. Cool.

So there's the movement of bitcoin with the 4-point moving average. Let's add an 18-point, so a slightly slower moving average on the movement of bitcoin for the hour. Now, the theory is, and again, this is where hindsight always looks very promising, and I am definitely not a financial advisor, so please don't hold me to this one, but the theory is that when the fast moving average crosses above the slower moving average, that's a signal to buy some bitcoin. Right. And if we go across to the top here, obviously there's a big movement down, and the fast moving average crosses below the slower moving average, which then signals a sell for bitcoin. So if you used that kind of theory, you would have made some bucks doing a trade here on the market.
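The crossover rule described above can be sketched as a comparison of the two series point by point: a "buy" where the fast average moves from at-or-below to above the slow one, a "sell" on the opposite crossing. The series below are made-up numbers purely to exercise the logic (and, as the speaker says, none of this is financial advice):

```python
def crossover_signals(fast, slow):
    """'buy' where the fast MA crosses above the slow MA,
    'sell' where it crosses below, else None. Lists must align."""
    signals = [None]  # no previous point to compare at index 0
    for i in range(1, len(fast)):
        if None in (fast[i - 1], slow[i - 1], fast[i], slow[i]):
            signals.append(None)  # window not yet full
        elif fast[i - 1] <= slow[i - 1] and fast[i] > slow[i]:
            signals.append("buy")
        elif fast[i - 1] >= slow[i - 1] and fast[i] < slow[i]:
            signals.append("sell")
        else:
            signals.append(None)
    return signals

fast = [10, 11, 13, 12, 10]  # illustrative fast moving average
slow = [11, 11, 11, 11, 11]  # illustrative slow moving average
print(crossover_signals(fast, slow))  # [None, None, 'buy', None, 'sell']
```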

Now, we can also add the 9-point moving average in as well, just to give us a little more clarity on the data. And then another cool thing we can add with Flow is a little bit of regression analysis, right? So we can see from a polynomial perspective what's happened: we can see, oh, it was increasing, and then we started into this slightly more downward curve. Now, if we look at the polynomial regression line, it's going negative here. So if you held bitcoin at this point, it would probably be a good time to start selling, because we are moving into a bit more of a downward trend. Again, hindsight; I'm not a broker, I'm not an investment banker at all.
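Flow fits the polynomial regression on the chart for you; to show the underlying idea, here is the simplest degree-1 case, a least-squares slope over a trailing window, where a negative slope corresponds to the "going negative" trend line mentioned above. The prices are made-up figures:

```python
def linear_trend(values):
    """Least-squares slope of a series against its index:
    positive means a rising trend, negative a falling one."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Illustrative prices climbing and then rolling over:
# the slope over the most recent four samples is negative.
prices = [100, 104, 109, 113, 112, 109, 105]
print(linear_trend(prices[-4:]))  # -2.7
```

A full polynomial fit (degree 2 or higher) works the same way in principle, just with more coefficients to solve for.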

But I thought it's really cool that we can actually start using this kind of data in Flow and start experimenting with it. And as I said, I would love to do these types of moving average and polynomial regression analyses on actual production data, to see if we can track the movement from that perspective. Cool. I think I've pretty much spoken the entire hour; I didn't think I was ever going to do that. If there are any questions, please pop them into the questions section, and we can probably address them in the next two minutes. Otherwise, please be on the lookout for the other tech sessions that we're going to have.

The next tech session is going to be a little more on the visualization side, to show how I did all of this visualization, a little bit of run rate and projections inside of Flow, and how we do that within our graphical component. Today was a bit more focused on just getting data into one single platform by using all of these different web APIs that we've got. If there are any questions, I can probably address them now, or else we will get back to you guys later. I know it is late evening for some of us and early morning for a lot of others. I hope you guys have a fantastic Thursday, and I hope to see you again on the next tech session.

Thanks everybody.
