By Tebello Masedi
11 May 2023

How-To: Get Legacy Historical Data Into Your Canary Historian



I’m going to kick it off here this afternoon. Again, thank you so much for joining us. My name is Lenny, here from Element8. If it’s the first time that you’re joining one of our Tech Session Fridays, welcome. We aim to do one of these sessions every second week. We cover all three of the products that we have in our product offering, including Ignition, the Flow information platform, as well as the Canary historian. Now, if you’re on our mailing list, you will get an invite. Today’s session is a little bit of a technical one, and it’s going to be focused on the Canary historian. You might have an Excel file with some data that’s been populated. You might even have a SQL database.

You might have a legacy historian system that you want to potentially migrate into your new Canary historian. And it’s literally about how you get all of this historical data that’s potentially sitting in other systems into your Canary Labs historian. Now, we are going to go a little bit technical today, because I am going to show you guys how to use the Canary sender API to do this. But I think it’s a very valuable session for understanding how you can get pretty much any data that’s lying around into your Canary historian. Just a little bit of a recap for you guys on the Canary sender service and exactly how that works. Canary has a very simple three-step program for how to get data into the historian.

Obviously, the first point is you need to be able to collect all of your data and store that data into your Canary system. Then, assigning context to that data — very important, and Canary does that in a great way with their calculations and their views to add additional context to the data. And then obviously, great trending tools from an HTML5 capability perspective; they call it Axiom, which is their trending and dashboarding tool. Really use that in an effective way to maximize your operations and to really make sense of the data that you are looking for. Obviously, we’re going to focus a little bit on step one, and that’s to actually get data into the Canary historian. Now, there are numerous ways that you can do that.

Now, out of the box, the Canary historian comes with what they call collectors, to actually collect data from our plant information systems and store that into Canary’s time-series historian. And obviously, they have a whole bunch of standards that they support. They do have a collector to support the MQTT Sparkplug B implementation of MQTT — great for those remote assets, those remote pump stations that you want to get into your enterprise solution. As an example, they do also have the capability to connect to OPC UA, and still the legacy OPC DA component, to get that data from an OPC server into the Canary historian. They do have specific drivers for specific SCADA systems if you need to get data from those SCADAs, but obviously we will always advise to try and get as close to the source as possible and utilize the OPC drivers instead.

And then we can get data out of a SQL database, and we can get data out of a CSV file. And then the one that I’m going to showcase today is actually using the Web and .NET APIs to get some data into your Canary historian. Why would we want to use the sender service APIs? Well, it comes with all of the great stuff that the sender service has, most obviously store-and-forward capability. So if you do have a loss of connection, the sender service will cater for that and keep that store-and-forward capability as well. Just a little recap on that. So obviously the Canary collector will connect either via OPC UA or OPC DA or MQTT, depending on the protocol that you want to use, and that will then talk to the sender service.

So in the example that we’re going to do today, we are going to be the collector, right? We’re going to use the web API and we are going to connect directly to that sender service. Now, the sender service talks on the historian side to the receiver service, and the receiver service is then the component that will actually store the data inside of the Canary historian. This allows for great architectural freedom. And obviously, as you guys can imagine, it will also cater for loss of communication with local store-and-forward capability. So if you do have a connection breakdown from your collector to your receiver service, the data will be cached locally.

And obviously, when that connection comes up again, it will automatically push the data through. It also has the capability to notify the administrator of your Canary system that such an event has just occurred. And as I said, if the connection gets back up, it will automatically backfill the data, push it back to the receiver service, and the receiver service will push it into the Canary historian. With this architecture, it’s really cool: you can put down as many of these sender services and collectors as you want, so you can really build out a very good redundant system. You can log to two Canary historians at the same time from one sender service, and do that vice versa from a redundancy perspective.

So the sender service, and the architectural freedom that it gives you, can really cater for a very robust architecture, so you never lose data inside of the implementation that you have. Cool. So just a little bit of a recap on exactly how it works, because when I do the demo, I’ll show you guys, when I connect to the sender service, how that gets transferred to the receiver and eventually ends up in one of my data sets inside of my Canary historian, right? So the sender service web API: it is a web API, and it’s got numerous endpoints that you can go and enable or disable in your Canary historian configuration.

If you want to utilize a SOAP connection, either via a username and password or anonymously, you’ll notice that you can go and enable all of these different endpoints, and obviously the specific ports that you need to communicate on to talk to them. Obviously, it’s very important from an IT perspective to have these ports open and enabled on your firewalls if you want to utilize these endpoints to push data into your Canary historian. I’m going to use the very simple HTTP anonymous call today and just show you guys how easy it is, in fact, to actually get data into your Canary system. All right, so those are the web endpoints; I’ll show you guys when we go a little bit into the demo where to find them, how to set them up, and how to enable them. There’s also very good documentation.

I know sometimes we are engineers and the manual is something that we read after the fact. In this case, I would strongly advise you to first read the manual to get an idea and a concept of all the different web calls and endpoints that are available. And I’ll show you guys as well where this document is located. It’s part of the installation: very good documentation about all the different methods and calls that you can make against these endpoints from a sender service API perspective. All right, so what calls are available? Obviously, we need to authenticate ourselves, right? So we need to create a session with our user credentials that we can pass, and we need to go and authenticate against the sender service.

So we’re going to get a token, and this token is pretty much required for any of the subsequent calls that you want to make against that sender service. So this is a must: you must get your token. And if you want to free up licenses, you can also go and revoke user tokens as well, although from the newer versions of Canary there’s not really a restriction anymore on the web API licenses. So in this case we will just utilize the get user token component, right? Then you can also get a little bit of info from your historian. So there are calls to get the current version of the historian and all the compatible versions — useful if you do have a bit of a mix-and-match architecture and need to know what older versions of the sender service you can utilize with it.

And then you can also get all the different data sets that are available inside of your historian for you to be able to push data into from this sender service as well. And then obviously the one that we all want and would like to start playing with: the storage methods, which are how you actually get data into the historian. Now, very importantly, for you to actually send data to the Canary historian, you need to have a session token. The session token is what will be used to see if you are still alive, if you’re still active, if you’re still pushing data into the historian. But again, you need that user token right at the front to be able to get a session token. If you do have a session token, then you’re pretty much ready to roll.

You can then go and call the store data endpoint, and that will store data inside of your Canary historian. So the workflow is pretty simple. In essence, all that you have to do is get this user token. The user token call will bring back a little JSON blob. If you did something wrong, obviously there are status codes and error checking to see what you’ve done wrong from a call perspective, but otherwise you will be presented with your user token. The user token you will then use to start a session on the sender service, and that will create a little session token for you. Again, if there was anything wrong, there will be some error checking and error codes that you can see from that perspective.
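The token workflow just described can be sketched end to end in a few lines. This is a minimal illustrative sketch in Python (the demo later uses C#); the `/api/v1/` path prefix and JSON field names like `userToken` and `sessionToken` are assumptions here — verify them against the web API help documentation that ships with the sender service.

```python
import json
import urllib.request

def api_url(host, port, method):
    # Anonymous HTTP endpoint; the /api/v1/ prefix is an assumption --
    # confirm the exact path in the bundled web API help documentation.
    return f"http://{host}:{port}/api/v1/{method}"

def post_json(url, body):
    # Every sender service call is a POST with a JSON body.
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

def get_user_token(host, port):
    # Step 1: get the user token (empty body on the anonymous endpoint).
    return post_json(api_url(host, port, "getUserToken"), {})["userToken"]

def get_session_token(host, port, user_token):
    # Step 2: get the session token, which requires the user token from step 1.
    body = {"userToken": user_token}
    return post_json(api_url(host, port, "getSessionToken"), body)["sessionToken"]
```

With both tokens in hand, the store call described next takes the same shape: one more POST whose body carries both tokens plus the tag data.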

And then you will use the session token, in conjunction with the user token, to start storing data into the historian. And if all is good and well, the historian will reply and say, hey, that was a good call, everything is in the historian and everything is hunky-dory. Cool. This is the point where we’re now going to go over to a little bit of a live demo, and I’m actually going to show you guys how to go and create these tokens, how to set it all up, and how to actually store data in the historian, right? So I’m going to move over here to my virtual machine. I’ve got Canary installed here. So if I open up the administration panel for my Canary Admin, there’s the admin panel. You’ll notice that I’ve got everything installed on this one box.

So I’ve got the sender, the receiver, and the historian located on my one central server. If I go in and look at the sender service itself, I can see currently who is active, who already has active sessions with the sender service. And you’ll notice that currently I’m storing data from an OPC UA source. So this is obviously with the default collectors that come with the Canary historian. Now, if I open up the configuration part at the bottom here, you’ll notice that here I can go and see all of these different endpoints that I can now utilize when I want to start making calls against the sender service API. Notice also the port. So currently what’s enabled here is the anonymous web API that’s on port 55253.

And if I did want to use credentials or pass my username and password, that is done on a different port via an HTTPS secure connection. I’m going to just do the plain, simple anonymous call for today. So I’m going to utilize the 55253 port and use that to execute my web APIs. All right, so now that you know what the port is, how do you know what calls you can make? All right, and that’s why I said sometimes it is better to start with the documentation before you go headfirst into this thing. And that’s what they’ve done really well with the Canary historian here. So I’m going to go to my install directory that I’ve got.

So I’m going to just browse my machine here, I’m going to go to my Program Files, I’m going to go to Canary, I’m going to go to the sender service in here, and I’m going to go to the web API, and there is my help documentation that I can go and open. All right, so this is going to open up my browser here, and it’s very detailed documentation on exactly all the different calls that I can make. So obviously the first call that we need to make, as we explained, is the get user token call.

So if I expand the section here, it will show you exactly what protocol to use, depending on whether you’re using an anonymous request or whether you need to use it with your Windows account or need to pass your credentials. Obviously, for me, I’m just going to use the one that’s linked to the anonymous port, so I just have to go and specify HTTP. Now, to get a user token, it’s literally a POST that you need to make, and you need to just specify the machine name where your historian is installed, then the port — the port depends on whether you are using HTTPS or HTTP, depending on whether you want to use anonymous or user credentials — and then the rest is the endpoint for the API. And that is how you actually get this user token.

And remember, as we’ve discussed, the user token is required for any of the other subsequent calls that I want to make against my Canary historian. Now, a little tool that I use quite a lot, and some of you guys might be familiar with it: it’s called Postman. If you’re not familiar with it, I would strongly advise you guys to have a look at it, especially if you’re going to play with web APIs. If you want to investigate first how the calls are going to work before you actually go and implement them in code, I would strongly recommend that you guys get Postman. It’s a great little tool where I can actually go and execute calls against a web API. All right, so I’ve got a few collections already here, so I’m going to make a new one.

I’m going to create a collection here of all my different API calls. I’m just going to call it my Friday changemaker sessions, and I’m going to create this little collection here. Right, on this I’m going to create a new call. So currently I don’t have any requests configured, but I’m going to go and create a request. And this is all about getting my user token, right? So I’m going to call it user token and I’m going to save this request here. Cool. There we go. So there’s my request. All right, so at the top, depending on what you need to do, you can specify if it’s a GET request or a POST request, et cetera. And if we look back at the documentation, you’ll notice that pretty much all of the calls are POST requests.

Okay, so the first thing that I’m going to do is change this to a POST, and then the only thing I really have to do is enter the request URL. If I go back to the documentation to get a user token, I’m going to drop this down, and all I’m going to do is copy out this example that I’ve got here. Copy that out, go back to Postman, paste it in. If I can copy today, that would be great. Sorry, I’m just going to do that again. Copy and paste. There we go. Cool. So there is my URL. Now, obviously, there are a few things that I need to change here, so I’m going to need to change the machine name.

I’ve got everything installed on my local machine, so I’m just going to call it localhost for now. Obviously it’s not HTTPS, it’s just a normal HTTP call. And then I also need to change the port. Now, I’m leaving the default port, 55253, if I remember correctly, and I just want to go and double-check that again. So if I go back to my Canary Admin: anonymous, 55253. Sorry, for the guys that don’t know me, I’m a little bit dyslexic, so a lot of fives and twos will definitely screw my brain around, so I just need to double-check that is definitely the case. Cool. So there, I’m ready, I’m actually ready to give this a bash and execute this web request against my Canary historian.

So I’m going to send this request, and I should see at the bottom what the actual response is. And it seems like it was an okay response. And there we go: there is my user token. So no errors were received back from the historian. I’m ready to rumble. I’ve got access to that port, the firewalls are open, and I can now start to get any other information from that historian as well. So let’s go back a little bit to the documentation here. Now that I have my user token, I can use that user token with any one of these calls that I’ve got at the bottom here. So potentially I would like to see all the different data sets that are inside of that historian.

So if I expand this guy here, you’ll notice that again, they specify what the actual URL is. But the difference in this one here is that I need to specify a body. So I need to specify the actual user token that I’ve just received back from my user token call, as well as the machine where that was generated. So let’s do that, let’s create that in Postman as well. I’m going to add another request here. This time it’s again going to be a POST request. I’m going to copy out the get datasets endpoint from here and paste that back into Postman. Obviously it’s not authenticated, so I’m going to remove the s. My machine name is localhost, and again, it’s exactly the same port number, which is port 55253. Cool.

So there’s the port, there’s the connection. The only difference between the user token request that I’ve just done previously and the actual request to get all the data sets, you’ll notice in the documentation, is that I need to specify a body in my request as well. So I’m going to copy this example body that they’ve got here and paste it back into Postman. You’ll notice that you’ve got a body section, and in this body section you can say what type of data it is. Obviously, for me it’s going to be raw JSON, so I’m going to change it to a JSON packet and I’m just going to paste in the sample example that they’ve got here. So obviously my historian is my localhost, so I’m going to change that to my localhost machine name.

And you’ll notice that I also need to supply it with my user token. Cool. So I have to go back to the original request to get my user token. I’m going to hit that request again. Cool. You’ll notice that the user token updated. I’m going to copy that, go back into my get data sets, paste the user token in there, and I’m going to go and hit the request. And there we go. Pretty good. So again, the historian replied, nothing wrong with that request, all is good. And here are all the different data sets that I’ve got inside that historian that I can now actually use in my subsequent configuration to start storing data in. So that’s quite cool. Definitely, I can see that my user tokens are working. I’m getting data back from my historian.
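For reference, the body pasted into Postman above boils down to a tiny JSON object. A hedged sketch of building it in Python — the field names `userToken` and `historian` mirror what the documentation example describes here, but they may differ between Canary versions, so check the sample body in your bundled help documentation:

```python
def build_get_datasets_body(user_token, historian):
    # Hypothetical field names -- verify against the sample body in
    # the web API help documentation for your Canary version.
    return {"userToken": user_token, "historian": historian}

# The historian machine is localhost in this demo; the token comes
# from the earlier getUserToken call.
body = build_get_datasets_body("<token from getUserToken>", "localhost")
```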

So potentially I can now use this to actually start pushing some data into my historian as well. So let’s have a look at the data connection. I’m going to go back to the documentation, and I’m going to go and look at the storage methods that are available for me on the sender API. Now, obviously the one that I’m interested in is the store data call. You’ll notice that it’s again a POST method on the API. And in this case I need to supply it with both: not just the user token, but also my session token. So you need both of them.

And then literally all that you do for the body of the data that you want to push is create an array of all the different tags that you want to push into your historian. So you’ll notice there’s tag number one, there’s tag number two, the timestamp of the value, the actual value itself, and then the quality component — is this good quality or is this bad quality? And I can use all three of those to push the data into my historian. Right. So I’ve done this a little bit earlier, so you’ll notice that I do have a Canary request already built up.
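The store data body described above — per tag, an array of timestamp/value/quality entries alongside both tokens — can be sketched like this. The helper, the `tvqs` field name, the tag path, and the use of 192 as the OPC-style "good" quality code are illustrative assumptions to confirm against the help documentation:

```python
GOOD_QUALITY = 192  # OPC-style "good"; confirm the code Canary expects

def build_store_body(user_token, session_token, tvqs):
    """tvqs maps a full tag path to a list of (timestamp, value, quality) triples."""
    return {
        "userToken": user_token,
        "sessionToken": session_token,
        "tvqs": {tag: [list(t) for t in triples] for tag, triples in tvqs.items()},
    }

# Hypothetical tag path and timestamp, purely for illustration.
body = build_store_body(
    "user-token", "session-token",
    {"Demo.Tag1": [("2023-05-13T00:00:00", 1.0, GOOD_QUALITY)]},
)
```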

I just wanted to show you guys how easy it is with Postman to just copy and paste your bodies, copy and paste your requests, so that you don’t even really have to start writing code to test and see if your data is working. So it’s a great little tool, great for fault-finding, and great to use in this scenario. I’ve created one already; I’m just going to show you guys that. So there is a send data request that I’ve done, and there is the body with all the little tags. And I’m going to actually push it into a data set called weather. And then here are all the values that I’m going to push into it. So it works exactly the same as all the other requests.

And obviously I have to supply it with my user token and my session token, and off I go. And if it’s done and it’s happy, then the historian will reply with a good result. Okay, so hopefully that gave you guys a little bit of insight into the literally three API calls that you need to make to actually start getting legacy data inside of your historian. And the source can be anything: it can be a database, it can be a weather API, et cetera. So here’s my scenario. I’ve got a little database, so I’m going to open up SQL here, and I’ve got a little database here with some data that I’ve been storing in a SQL database. Obviously, SQL is transaction-based data; you need a little bit of DB skills to maintain this.

You need to make sure that the indexing on your tables is set up correctly. You need to make sure that your purging is done. So obviously, me as an automation engineer, I don’t have the skills to maintain a DB; that’s part of the IT world, it’s part of database administration. So I would really love to move all of this data that’s sitting in my SQL database into Canary. Canary is a NoSQL time-series historian. It’s purpose-built to maintain and store the large volumes of process data that get generated in our manufacturing environments. It is much better suited for high-volume process data. I don’t have to have very special DB skills to make sure that indexing works and that retrieval speeds are optimal; that is catered for automatically.

They also do a lot of great compression, so I don’t have to worry about size or disk space as much as I would have had to with a transactional type of database like you see here. So obviously, using a more transactional type of database like SQL in this example does increase my data usage. All right, so here’s all my data: I’ve got some tank tags here and obviously a timestamp of when that happened. This is a very typical type of wide query that you get from historians. If you query a historian and ask for a lot of tags, it will typically come back in this wide query format, where the tags are columns and the rows are the individual values with the timestamps associated with them. Right? So my goal is to move all of this data into Canary.
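Before any of those rows can go into a store data body, the wide result (one timestamp column plus one column per tag) has to be pivoted into per-tag series. A minimal sketch of that transformation — the column names follow this demo’s table and are illustrative, and 192 as the "good" quality code is an assumption:

```python
GOOD = 192  # OPC-style "good" quality code (assumption -- verify)

def wide_rows_to_tvqs(rows, tag_columns, ts_column="timestamp"):
    """rows: list of dicts from the SQL query.
    Returns {tag: [[timestamp, value, quality], ...]} ready for a store body."""
    tvqs = {tag: [] for tag in tag_columns}
    for row in rows:
        for tag in tag_columns:
            if row[tag] is not None:  # skip gaps in the wide table
                tvqs[tag].append([row[ts_column], row[tag], GOOD])
    return tvqs

rows = [{"timestamp": "2023-05-13T00:00:00", "ramp0": 1.0, "ramp1": None}]
series = wide_rows_to_tvqs(rows, ["ramp0", "ramp1"])
```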

All right, so let’s do a few things here. I’m going to go into Canary, and I’m just going to show you guys that currently I don’t have any sessions running on my sender service, and that inside of my historian currently, if I open up my historian data blocks, nothing is really writing except this UA and the DA connection. They are online, they’ve got some writers up, but nothing is writing into my weather data, and nothing is writing into my other data sets either. Okay, now don’t be alarmed. I’m going to open up some code here.

It’s based in C#, but it’s pretty much the calls that I’ve just shown you guys in Postman to go and get the user tokens and the session tokens; I’ve literally just taken those calls and written a little application that’s going to assist me in getting all of this data automatically out of this database and into Canary. So I’m going to open up Visual Studio here, and you’ll notice that I’ve created myself a little bit of a project. And in my project here, I’ve created subsets for all of the different tokens or web APIs that I want to utilize. So if I want to go and get a user token, I create a little function here that’s just going to do exactly what Postman did.

It’s going to take the historian node, it’s going to add the port number to it, it’s going to execute it, and it’s going to get a web response back from that POST, and I’m going to get my user token back from the JSON response that I got from Canary. And now I can use that user token in all my subsequent calls. So you’ll notice there’s the user token call. Exactly the same goes for the session token call, in this case passing in the user token as data — because remember, that must now be part of the data there — and using that to get the session token. And when I do have those two, I can then actually go in and send some data. All right, so the logic around this is very simple. I’m specifying my port number, which you could obviously go and change.

I’m putting in my historian node, localhost; I’m currently running on my local box. The data set where I want to go and store my data is called 20 burgers. So that’s that data set there. It’s Friday; I hope some of you guys are actually eating burgers while you’re watching me doing this, so fittingly, I’ve called it 20 burgers for this little bit of an exercise. Cool. And then off we go. We’re going to supply this little program with a bunch of tags that we would like to migrate. So that will literally be just a tag list of all the different tags that are available in here. All right, cool. So what I’m going to do is I’m actually going to go into the configuration for this, and I’m going to go and delete 20 burgers here.

So I’m going to remove it completely and delete all the historical data that’s already associated with it. So you’ll notice that currently there is no data set that I’m going to use to write the data into. One of the properties that you can set is to actually go and create this data set for you if it doesn’t exist when you want to go and write your data into your tags. Cool. Let’s see how this works. All right, so I’m going to start this program, and I’ll just put it side by side with the actual sender service when it’s up and running, and then we can see what’s going to happen here. All right, so it’s going to start. All right. And it’s going to ask you exactly that.

Which tags do you want to migrate? So if I look back in my SQL database, just for now, let’s do just ramp zero. I just want to go and select ramp zero and write that into my historian. Right. Let’s get the admin tool next to it here. So I’m just going to minimize it so we can see what actually is happening. I’m sorry about this; I just want to move stuff around a little bit. So the first thing that I would like to see is, in my sender service, that I actually get a session, that the session was created correctly, and that I actually am active, sending data in here. Right. So let’s type it in: I want to go and store ramp zero into my Canary. I’m going to hit enter.

Cool. So there we go. I created a user token and a session token. So there’s my user token, there’s my session token. The historian says all is good. And you’ll notice that I actually have a session token available here — you can see that I’ve created an active session for my little application. It says that it stored one tag into my historian. So if I’m lucky and I go back to my historian — ha, you’ll notice that it automatically created the 20 burgers data set for me. And you’ll notice that it’s got one writer, which is me, my little session, and that it’s done writing all the data into that data set. So if I go in here, this is the file for today. Open that up. There we go. There’s ramp zero. And if I’m lucky and I select that, there we go.

There are all the timestamps, all the values, and all the quality components that I’ve just read from that database and inserted into this as well. Cool. So I’m done with column one. I’ve written all of these values, with their relevant timestamps, which are on the 13th, into my little Canary historian using the web API. Now, the web API will time out if you’re not actively pushing data, so you’ll notice mine has already closed down. There’s a setting that you can set to say how long your session must stay active before it is automatically killed. So currently you’ll notice that my session is dead. So obviously, when I now call it again, I need to recreate that session to start pushing the data in again. So let’s do that.
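That timeout behaviour suggests a small pattern for migration tools: try the store call, and if the historian reports the session is gone, fetch a fresh session token and retry once. A minimal sketch with injected callables — the function names and the "Good" status string are assumptions to match against the real status codes in the help documentation:

```python
def store_with_renewal(store_fn, renew_fn, session, batch):
    """store_fn(session, batch) -> status string; renew_fn() -> new session token.
    Retries the store once with a fresh session if the first attempt fails."""
    status = store_fn(session, batch)
    if status != "Good":          # session likely timed out
        session = renew_fn()      # recreate the session, as in the demo
        status = store_fn(session, batch)
    return session, status

# Example with stubs simulating an expired session "s1":
calls = []
def fake_store(session, batch):
    calls.append(session)
    return "Good" if session == "s2" else "Bad"

session, status = store_with_renewal(fake_store, lambda: "s2", "s1", [])
```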

Let’s push all the rest of them. So I’m now going to push ramp one to ramp six. I’m going to just start up my little application here again, and this time what I’m going to do is give it a string of tags to do, not just one. So I’m going to go and do ramp one, I’m going to go and do ramp two, and the rest of them, et cetera: ramp three, ramp four, five and six. And hopefully, if I didn’t spell anything horribly wrong, it will go and migrate all of the rest of the columns for me. So let’s get the sender service up here again. Just going to minimize it a bit this time.

I’m going to go into the historian, into my actual data set, and let’s see how this updates as soon as I push the enter button here. All right, let’s go. Did I press enter? I did not. Cool. There we go. The historian said: good. There, it’s writing my tags into the historian. It’s doing all of those updates, and it’s pretty much done. So let’s go and look in the file again. There’s the rest of my tags: there’s ramp one, there’s ramp two, ramp three, four, five and six. So in literally milliseconds, I’ve transferred all of the legacy data that was in this little database and moved it across into my Canary historian by using the web API. So that’s great. Cool. Now, this is not the only thing that you can use it for.

You can think about all of the other applications that you might want to use to get this data in. I do have a little bit of another example here where I actually use another API. In this case, it is a weather API that I can go and use to also get data into my Canary historian. So let me just uncomment a few things here, and let’s see if we can get some weather data inside of my historian. So I’m going to ask the weather API for some data, and I just need to uncomment all of this code here quickly. All right, so I’m going to uncomment that, and hopefully we’re going to get some weather data into my Canary historian. I’m going to launch it again — save it, run it again. Let’s open up the sender service here, minimize it a bit.

We can leave it like that. Okay, I’m going to do ramp one again; doesn’t really matter, but what I’m actually looking for is my — ha, there we go — my temperature data from my weather API, so I can kill this, actually. So what I’ve done, literally, is I got a whole bunch of South African cities through that weather API. You’ll see Benoni, Brits, Carletonville, Cullinan, Delmas. Sorry, for the guys listening in from Cape Town, this is all Joburg-based data. So nothing about the mountain — is it clear or not, is there a wind blowing, can you actually see the mountain today? I don’t know. But the point is, there we go, let’s look at Joburg. It’s not that humid, there’s the pressure, and it is a very nice, warm 30 degrees Celsius in Joburg today.
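The same store mechanism carries any time-series source, as this cities example shows. A sketch of mapping one weather reading into per-tag triples ready for a store body — the response shape, the tag naming scheme, and the 192 quality code are all invented for illustration, not taken from a real weather API:

```python
GOOD = 192  # OPC-style "good" quality code (assumption)

def weather_to_tvqs(city, reading, ts):
    """reading: e.g. {"temperature": 30.0, "humidity": 21, "pressure": 1018}.
    Returns one single-sample series per measured field, keyed by a
    hypothetical Weather.<city>.<field> tag path."""
    return {
        f"Weather.{city}.{field}": [[ts, value, GOOD]]
        for field, value in reading.items()
    }

series = weather_to_tvqs("Joburg", {"temperature": 30.0}, "2023-05-11T12:00:00Z")
```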

All right, so I hope that kind of gave you guys an idea of all the different ways that you can really leverage different data sources, utilize the web API, and push that data inside of your Canary historian. It’s actually very simple to do utilizing the sender service API, and you can really go to town on the different possibilities and different types of time-series data that you can now push into your Canary historian. And obviously, you can now utilize the trending tools to visualize this data. So if I go and open up my Axiom client — I just want to make sure I know what the port is. So if I open up my browser here, go to localhost port 83. This will obviously open up Axiom, and I can now go and create a new application or a new chart here.

So let’s quickly do that. Add a trend. I’m going to browse for the data that I’ve just pushed in. So let’s go to 20 burgers. Let’s go and look at my ramp values. So there we go. Let’s add them to my trend and close. And let’s just go back. There we go. There’s the data that I’ve pushed in. Obviously not a lot of data points, but there’s all the data that I’ve pushed into my Canary historian from that legacy SQL database. And I can actually now go and use the great trending capability of the Canary historian to visualize the data that was sitting inside of my database. Cool. So I hope that gave you guys a little bit of insight, a little bit of technical knowledge, not just on the Canary sender API, but on web calls in general.

How easy it actually is to make them. Remember, please: Postman is your friend. If you want to start playing with these things, it’s a great little tool that you can use to get that data in. So I encourage you guys: download it, play with it, push all of the data in, really start playing, and get innovative. I think my message for today is really to start to become innovative with the type of data that you can get into the Canary historian. It’s not difficult to do. I think we’re stuck in a rut where it’s always just OPC kind of data via an OPC connection. But I want to challenge you guys: use the APIs, get some innovative stuff going, and start pushing the data into your Canary historian. That is all that I’ve got for today.

Thank you guys so much for joining me again for one of my Friday sessions. I hope this was very insightful for you. And please, if you guys have anything technical that you want me to cover, not just on the Canary historian but Ignition and Flow as well, please give me a shout. You can email us at information@element8.co.za. Get me those topics, get me those things that you would like to see; let me get them onto the roster and do a Tech Session Friday on them. With this, I want to say thanks again, and hopefully I see you in the future on another session. Thanks.