Hello, everyone. Thank you so much for joining us for today's webinar, Boost Your Business with Advanced Workflows, where we're going to explain how incorporating automation into your file transfer processes can increase productivity and help you get the most out of your MFT solution. I'm here with my co-host, Rick Elliott. Rick, are you there?
I am here. Good morning.
Excellent. Good morning. Before we kick things off, I'm going to remind everyone that we have the event scheduled for about an hour. We are recording the event and will be sending out a follow-up email with a link to the recording. Feel free to ask questions throughout the presentation. We have team members online who will be answering those, and we'll try to answer a few verbally along the way as well, and at the end. Finally, we also have a survey that will pop up at the close of today's session. If you would answer that, it'll give us great feedback on which parts of the presentation were most helpful for you, and you can include any additional questions you have and we'll be sure to get back to you.
All right. Let's dive into our agenda for today.
Oops, there we go. I'll learn how to drive this thing.
No problem. All right. So, today's agenda, we're going to go over file transfers in organizations today, some of the more common ways people are transferring files today. Then we're going to dive into automation and Advanced Workflows in your file transfer processes. Rick's going to go through a live demo for us. And then we'll close with some Q and A.
So, let me introduce you to Rick. All right. Rick Elliott is a lead solutions consultant and 12-year veteran in the GoAnywhere MFT area. He was brought onboard in 2012 to help us build out the consulting and professional services group, as well as to travel worldwide providing training and on-site technical support for customers and trading partners. He was a prior customer who came to HelpSystems from Discount Tire Company, so he is well versed in our MFT products. Rick is joining us remotely from his home in Phoenix, Arizona and, Rick, thank you so much for being here. I'm going to turn things over to you and you can take it away.
Absolutely. Good morning everyone, and I hope everybody's doing well and staying safe. Hopefully they're getting used to working from home, which I think a lot of people are getting used to right now. But I've been doing that for about eight years. It's not really that bad, as long as you don't have kids around. I think it makes it easier.
So anyway, we're going to have fun. We're going to talk about Advanced Workflows. We're going to get into discussing that, how automation can really help you in your MFT needs, in your file transfers. And discuss what you can and can't do or how to get into those situations that you need to manage within your system.
How do you manage file transfers?
So first, let's answer a question. How do you actually manage your file transfers? What do you do? Do you have legacy commands that somebody at the company created many, many years ago, but who doesn't work at the company anymore? No one wants to touch them because they're afraid that something may break. They just know that they work, or they've been working for years, and all of a sudden it's like, well, wait a minute. Maybe we do need to make a few changes, but nobody knows how. Nobody knows why whatever they created works. It's kind of like a mystery or a black box.
Automation with Advanced Workflows
Do you have home-grown transfers? In other words, somebody who happened to know batch or bash scripting went in and created a batch file that would connect to JP Morgan Chase and download a file, or upload a file. So, that way you have something, but it's in a batch file, and somebody has to remember to run it. Or maybe it's on Suzie's laptop in accounting and she has to remember to run it every day to go transfer ADP files. But when Suzie's out, who's taking care of it? Do we know that it's actually being executed? Maybe you're using a DOS scheduler, or maybe a Linux cron job, so that way you've gotten a little bit more advanced with the batch files and you can schedule them to run at a particular time. But all you know is that it's scheduled to run, and that's all you know.
Maybe you got even a little further down the line and you started using Control M or S4, or maybe a Robot job. These are all tools that you can use to make sure something gets executed in a timely fashion, and different programs you can use to make sure that you're actually transferring data to and from your systems.
These are all well and good, but take the next step. How do you know that what you did actually worked? Is it when the customer calls and says, "Hey, I didn't get the file." That's when you know it didn't work because you've just been executing this job and it works all the time, and then all of a sudden it doesn't. The only way you know it doesn't is because the customer calls and says, "Hey, I didn't get my file last night."
The worst one is the second one listed here. All of a sudden, the boss steps in your office or gives you a call. "Hey, wait a minute. I'm hearing the customer didn't get the files last night." Or we didn't get files last night from the bank. That's an important part, but it's not fun when the boss makes that phone call to tell you.
Maybe you get a page at 2:30 in the morning. Maybe you have a support group or a help desk that alerts you and lets you know there's something going on, or there are other jobs waiting to be processed and something didn't complete. Now you have to spend time signing back in to figure out what happened, what's going on. How do I get my transfer to work?
There's all kinds of things that can come into play here, right? So, taking a statement from Susan Powter, if any of you are old enough to remember her. "Stop the insanity!" Figure out what you could do to make it better. Build your automations. Matter of fact, you don't even need a programmer to do that. If you know how to sign into a computer, and basically get in. And you know what you need to do, chances are you can go ahead and just drag and drop these into a project outline in the order that it needs to happen, fill in a few prompts, go ahead and start processing. Automate. Schedule it. Make sure that you're ready to go.
Matter of fact, we'll even get you to the point where you will get notifications to let you know when something goes right, but will also let you know when something goes wrong. It's your call. What do you need to know? So, at this point you can be proactive. That way if something does fail the night before, you're going to know about it ahead of time and have a solution before the boss comes in saying, "Hey, what happened?" It always seems to be a pretty good thing when your boss comes in saying, "Hey, I noticed we had problems last night," and you can answer, "Yeah, we did. This is what happened. This is what I did to fix it. It's already been corrected and files are in place." For some reason, the boss really likes to hear that when they know something went wrong.
What about your environment? Cleaning up the environment. Yeah, that's part of file movement. Everyone always wants to keep a copy of something, but how long do you keep those copies? I mean, it's not like the IRS where you have to keep it seven years. Maybe you only need it 30 days. 21 days. Six months. That's fine, but you can always automate the cleanup of your environment. There are multiple things you can do here, and GoAnywhere, as a full managed file transfer solution, can help you manage your environment in multiple ways. There's a lot you can really take advantage of here.
So, let's look at a Workflow. Really, there's three major parts to the Workflow. One is you have to know what it is you want to do. So, if you've done batch scripting or bash scripting, you know that there're commands. Go get a directory of a folder. FTP the file to a target. Copy the file from one folder to another. Deleting, moving. These are all just commands that you do on a daily basis, just from a command prompt.
But what we're giving you here is a chance to kind of pull that into the 21st century. I can see things happening in front of me in a built interface. On the left side, you'll see what we call a component library. In that library are 150-plus commands that you can execute: zipping, unzipping, encrypting, decrypting, FTP, AS2, AS3, and AS4 communication, data translation, MQ Series, integrating with local commands, making calls to an IBM i. There's a whole plethora of commands here that you can execute.
And that doesn't even touch the cloud connectors. Cloud-based computing has become a big, big deal in our lifetime. Now we've actually opened that door to you, so you can make connections to Salesforce.com, Jira, ServiceNow, and all of these different cloud-based products that you really need to integrate with on a daily basis.
Taking that information, you can put it into what we call a project outline. This is where you tell me what we're going to do in the order we're going to do it in. So, literally you just grab a task out of the component library, pull it into the project outline and literally fill in the information. Here's the file I want you to get. Here's that data that I want you to read from a database. Here's what I want you to do with that data, where I want you to copy the data, how I want you to encrypt the data, what key do I want you to use, what resource do I want you to connect to. You're just literally picking and choosing from a drop down box. Or browsing to go select. Or if you get a little more advanced with it, providing parameters to pass into the Workflow in automating. Making it part of your system. Letting it meld into your environment where you can take advantage of exactly what the product can do for you.
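To make that idea concrete, here's a minimal Python sketch, not GoAnywhere code, of what a project outline boils down to: an ordered list of tasks, each feeding its output to the next. The task names and stub functions here are purely hypothetical stand-ins for the drag-and-drop components.

```python
# Conceptual sketch of a project outline: run tasks in order,
# feeding each task the previous task's output.
def run_project(tasks):
    data = None
    log = []
    for name, func in tasks:
        data = func(data)
        log.append(name)
    return data, log

def sql_query(_):
    # Stub for the "SQL" component: pretend we selected two employee rows.
    return [("1001", "Alice"), ("1002", "Bob")]

def write_csv(rows):
    # Stub for the "Write CSV" component: join rows into CSV text.
    return "\n".join(",".join(r) for r in rows)

data, log = run_project([("SQL", sql_query), ("Write CSV", write_csv)])
print(log)   # ['SQL', 'Write CSV']
```

The order of the list is the order of execution, which is exactly what filling in the project outline gives you.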
So, let's take a look at a few of these things. GoAnywhere has about 15 different ways that you can translate data. I can pull information out of a database, create a CSV, and SFTP it in three tasks. I can read EDI, EDIFACT, JSON files, XML files, flat files, CSV files. I can create and read from an Excel spreadsheet. Data comes in all kinds of forms and fashions. Do we provide all of them? No. There are some formats that we don't do yet. But as the demand grows for those particular areas, we address that. If we have enough customers pushing those issues, we bring it on board, start discussing it, and figure out how we can incorporate it into the product if it's that big of a demand.
But in this case, what we're looking at is the fact of I have a database that has employee information, and I need to pull that out. Maybe I need to send it to ADP for new employees so that they can get paid. So, whatever the information is I can literally go connect to that database, select that information out of that database in the order that I want it, and then write it out to a JSON file, a CSV file, XML file. Whatever the type of file is that I need to put that into.
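As a rough illustration of that database-to-file step, here's how the same extract-and-write sequence might look in plain Python, using an in-memory SQLite table as a stand-in for a real employee database (the table and column names are made up for the example):

```python
import csv
import io
import sqlite3

# In-memory stand-in for the employee database (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (emp_id TEXT, name TEXT, dept TEXT)")
conn.executemany("INSERT INTO employees VALUES (?, ?, ?)",
                 [("1001", "Alice", "HR"), ("1002", "Bob", "IT")])

# "SQL" step: select the columns in the order the target expects.
rows = conn.execute(
    "SELECT emp_id, name, dept FROM employees ORDER BY emp_id").fetchall()

# "Write CSV" step: header line plus one line per row.
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["emp_id", "name", "dept"])
writer.writerows(rows)
csv_text = out.getvalue()
print(csv_text)
```

Writing JSON or XML instead is just a different "write" step at the end; the select stays the same.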
Once I get that file, a big part of doing business today is encryption. Data needs to be encrypted at rest and during transfer. Normally you're going to send things across SFTP or HTTPS or FTPS, which means the data is encrypted during movement. But that doesn't mean the data itself, the file at rest, is encrypted.
So, normally you would use PGP encryption. PGP encryption within GoAnywhere is based on the OpenPGP standard. That means that you create what's called a PGP key pair: you have a private key, and you have a public key. You send your public key to your trading partner, and they send you their public key. Then when you need to send them data, you use their public key to encrypt it, so that when they receive it, they can use their private key to unlock that data. Same coming back to you: they encrypt with your public key and send it back, and only you can decrypt it, because only you have the pass phrase to that private key. That allows me to send an encrypted file. Now I can email it. Yes, I can FTP it or SFTP it. AS2, AS3, HTTPS, whatever the communication is. And then decrypt: just take the PGP file, decrypt it, and pull it right back into the raw data that was sent.
Now, what about scheduling? When do I need this particular job to run? GoAnywhere does provide you with ways to schedule these automations. So, when I define a Workflow, I'm actually defining the whole piece. The project is just part of it. That's just telling me what to do. I have to have something that triggers that project to execute. I can schedule it. Like you're seeing here. I can say I want you to run something weekly. I want it to run on Monday, Tuesday, Wednesday, Thursday and Friday, and I want it to run at 12 noon. You're picking and choosing daily when you want this particular Workflow to execute. This is where it's going to begin.
One thing GoAnywhere does allow you to do is create a holiday calendar around that. When do you not want me to run? If today is a weekend, there's days that I don't work, holidays. Maybe your company has particular days or weeks that they don't want to process things. You can build those into your holiday calendar. And if today happens to be one of those days, what do you want me to do? Do you want me to skip it? Run it on the day before? Maybe run it on the day after? How do you want me to address that? Again, these are things that come into play and things that need to happen. You're just identifying things that need to happen and work around them.
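The holiday handling Rick describes, skip, run the day before, or run the day after, amounts to a small date adjustment. Here's a sketch in Python, not GoAnywhere's implementation, with a hypothetical one-entry holiday calendar:

```python
from datetime import date, timedelta

HOLIDAYS = {date(2024, 12, 25)}  # hypothetical holiday calendar

def adjust_for_holiday(run_date, policy="skip"):
    """If run_date falls on a holiday, apply the chosen policy:
    'skip' (don't run), 'before' (day before), or 'after' (day after)."""
    if run_date not in HOLIDAYS:
        return run_date
    if policy == "before":
        return run_date - timedelta(days=1)
    if policy == "after":
        return run_date + timedelta(days=1)
    return None  # skip this run entirely

print(adjust_for_holiday(date(2024, 12, 25), "after"))  # 2024-12-26
```

In the product you pick the policy per schedule; the point is that it's one decision applied automatically every time the calendar matches.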
We have triggers as well. The trigger is utilized when a web user signs on to your system to upload and download data. Every time they sign on, there are events that happen while they are signed on to your system. That event could be them uploading a file successfully. That event could be them logging on or logging off. That event could be pretty much anything we can trigger on that event. Maybe they disabled their account. We can trigger on that particular event to alert your support group and tell them, "Hey, this company or this customer has disabled their account." And then you can be proactive in getting them corrected.
The main one we're looking at here would be like an upload successful. They're sending you a CSV file. Once you receive it, you want to basically open it up, read it, insert that data into a database table, call a stored procedure to go process your data or some kind of a process of that data. And then maybe send an alert back to let them know that you've received the data and it's already been processed. All of that triggered by the fact that the trigger executed upon the successful upload of that data. So, in this case, the trigger can automate that project Workflow. Just like a scheduled job except it's not scheduled. It's only triggered based on the event that a customer signed on and uploaded a file successfully.
What about monitoring? Monitoring is a big part of today's business world. I'm going to go look in a local folder, and when a file shows up in that folder, I'm going to pick it up. I'm going to encrypt it, and I'm going to SFTP that to JP Morgan Chase. Maybe I need to monitor a folder at JP Morgan Chase so that I can pick up my daily report, my credit card report. I have a certain window, normally that banks give you to come pick up that report after you've submitted your payments or your credit card entries. So, you look from six to eight PM, because that's your window. And when the file shows up, you call a Workflow that connects to that JP Morgan Chase bank, downloads the file, places the data where it needs to go, and waits to start looking tomorrow again.
So now, I can look to see if there's new files in that folder. If there's just a file in that folder, meaning it just exists, or if somebody deleted something from that folder. These are all events that I can actually monitor on. And again, once I identify something, I can trigger that Workflow. Make something happen. So that way it does allow me to basically go through and do something with my data.
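Conceptually, a folder monitor is just a repeated scan that compares what it sees now against what it has already seen. A toy Python version, scanning a temporary directory instead of a real bank folder (file names are made up):

```python
import os
import tempfile

def new_files(folder, seen):
    """Return files in folder not seen before, and remember them.
    A real monitor would run this on an interval inside a time window."""
    current = set(os.listdir(folder))
    fresh = sorted(current - seen)
    seen |= current
    return fresh

folder = tempfile.mkdtemp()
seen = set()
open(os.path.join(folder, "report.csv"), "w").close()
print(new_files(folder, seen))  # ['report.csv']
print(new_files(folder, seen))  # [] -- nothing new on the second pass
```

The "new file shows up" event is what fires the Workflow; "file exists" and "file deleted" are just different comparisons against the same snapshot.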
Advanced Workflows Automation
So, that being said, let's kind of flip over and look at the product and what really happens here. The GoAnywhere dashboard is where life usually begins within GoAnywhere. But for automation, there are usually two things that you're going to need to make sure you have before you get into it. The first is going to be your PGP key. In other words, you need to make sure that you do have a key available. In this case, I'm using a demo key. It's installed in my key management system. It is a key pair, so that means I have both the private and the public key. It gives me a public key to encrypt with, but it also gives me a private key to decrypt with.
The second thing I want to basically have or make sure that I have is going to be an SSH server that I'm going to connect to. So, in this case, I'm going to have my information defined however I'm going to communicate with that particular server. I can even go through and test communicating with that particular server just to validate that I can communicate. Once I know I have the pieces that I'm looking for, then I can actually get into my Workflow. This allows me to go through and physically set up the automation.
As I mentioned earlier, if you have your specifications, you know what you need to do and the order you need to perform them in. Then all it is is taking those particular pieces and pulling them into place. So, maybe I need to go read a database. I just grab it, and drag and drop it into the project outline. Maybe I need to write a CSV, which I'm going to be doing. Just drag and drop it as I need them.
So, in this case, you're going to pull any of these tasks into your project outline in the order you want things to happen. Per the example we listed earlier, I'm going to go connect to a database. Yes, this is just a drop down box. I've already pre-defined those particular items in my resources that we saw just a minute ago. So now, I can just go select the resource that I want to connect to. Then all I have to do is provide some kind of an SQL statement, or maybe execute a stored procedure or some command to insert, delete, update, select. Basic SQL communication.
In this case, I'm going to pull it into a variable called MyData. This is what's called a row set. The row set allows me to move data between different environments or different genres of data types. Now I can use MyData to go through and convert that into my CSV. So in this case, I've selected the Write CSV task, I'm taking the data that I've pulled out of the spreadsheet or out of the database, and I want to create a file name using that data.
As an example here, you'll notice that I'm using a variable to create the file name. So, what I've done here is I'm actually creating a unique file name on the fly to put the data into. I want it with today's date, maybe a unique job identifier. The word HR data. I can build a file name out of whatever components I need to build that and then use that to create this particular file. Because I don't know the actual name of the file, I'm creating a pointer to it, and then I'm going to use that pointer to say I want you to go encrypt that data. So, whatever it is, I just created the CSV file, I'm going to go in and create that. And then I'm going to turn right around and take that file that I just created and I'm going to go use that PGP key to encrypt it.
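Building that file name from a date, a job identifier, and a label is the same trick in any language. A small Python sketch, where the format, the job ID, and the label are all made up for illustration (GoAnywhere does this with variables inside the task):

```python
from datetime import date

def build_filename(label, job_id, run_date, ext="csv"):
    """Compose a unique file name from a date, a job identifier,
    and a label, mirroring the variable-based name in the example."""
    return f"{run_date:%Y%m%d}_{job_id}_{label}.{ext}"

print(build_filename("HRdata", "J1042", date(2024, 7, 1)))
# 20240701_J1042_HRdata.csv
```

Because the name is computed, every downstream step refers to the variable (the "pointer") rather than a hard-coded file name.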
So yes, I can just select that key from a drop down box, or as you're seeing here, I can also pass that value in from an external source. So, if we go back to the beginning where we said yes, you could use Robot, S4, Control M, PowerShell scripts, Java. If you're making a call to GoAnywhere, which you can do from the outside world, then you could just pass me the parameter of what to do. That makes administration a lot simpler, but also lets your MFT product meld a little more into your environment.
So no, I don't have to go create an individual Workflow for every transfer that I need to do. If I have 25 or 30 that are doing the same thing, and the only thing different is the customer I'm talking to, their PGP key and the folder on their side that I need to place the data in, here's a project that will do all that for you for all 25 customers. All I have to do is schedule it and pass the information.
And yes, we can turn right around on that connection and make sure that the file gets transferred. This way we can look for errors during communication, during encryption, during copying the file, SFTPing the file, grabbing data from a database. If there's a problem, we need to know about it. So, if we have a problem, meaning I found that there was a problem during the communication, tell me what to do. In this case, I'm just going to send an email to let somebody know: I had an error, and here's the problem that I had. Matter of fact, I'm even going to attach the job log to that email. Before you even sign on, or go look, or even know that you need to, I'm going to give you all the details you need to know about the problem that I'm having.
You can be as granular with that information as you want to be, or you can be as generic with that information as you want to be. Whatever fits your business rules and your business needs. If I'm successful and I don't have an issue, fine, go make a copy of what I did and put it into my archives. I can even date and time stamp that, and then send out a notification that lets somebody know that I've actually communicated this information and I was successful.
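Those two branches, alert on failure and archive a date-stamped copy on success, can be sketched in a few lines of Python. This is a conceptual stand-in, not GoAnywhere's implementation; `send_alert` here just collects messages instead of sending email:

```python
import os
import shutil
import tempfile
from datetime import datetime

alerts = []

def send_alert(subject, body):
    # Stand-in for the email task; a real job would attach the job log.
    alerts.append((subject, body))

def run_transfer(task, archive_dir, src):
    """Run a task; on error, alert somebody; on success, archive a
    date-and-time-stamped copy of the file that was sent."""
    try:
        task()
    except Exception as exc:
        send_alert("Transfer failed", str(exc))
        return False
    stamp = datetime.now().strftime("%Y%m%d%H%M%S")
    shutil.copy(src, os.path.join(archive_dir,
                                  stamp + "_" + os.path.basename(src)))
    return True

work = tempfile.mkdtemp()
src = os.path.join(work, "hr.csv")
with open(src, "w") as f:
    f.write("emp_id,name\n")

print(run_transfer(lambda: None, work, src))   # True: archived a copy
print(run_transfer(lambda: 1 / 0, work, src))  # False: alert was sent
print(len(alerts))                             # 1
```

How granular the alert is, and whether success also notifies someone, is exactly the business-rules decision described above.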
So, now when you actually execute this particular project and you go make that communication, you grab the data, you create a CSV file, you encrypt that CSV file, and you SFTP it. Now you are at a point where you can see the logging of everything that you've done. So, here I'm basically giving you a snapshot of the parameters that were passed into the project. I'm telling you when we get down to and we grab the data, here's where we actually went and selected the information from that table. Here's where we went in and was creating the CSV file and the name of the CSV file that we created. Here's where we were encrypting that same file. So, now you can see that it actually encrypted successfully. And then, where I actually made an SFTP to communicate.
In this case, I have the SFTP task in debug mode. I want to know everything about that communication. Where am I going? What folder am I putting it into? The file size, the location. I want to know that it was successful. How did I communicate? What kind of encryption was being used? All of that is being given to me just during the communication. And yes, I know that it actually got there successfully.
And then, last but not least, I'm going to go send out an email that lets me know that everything came back or was transferred successfully so that now I can literally go through and actually open up that email to see exactly that the data was transferred. In this case, I happen to attach the PGP as well as the CSV file that was created on the fly with alerts to let you know, hey, these particular files were created. You can then review them at your leisure. That makes it available to you.
Hopefully, this has been a little bit enlightening for you in showing you what GoAnywhere can do for you and how it actually works. This gives you the capability of doing some automation and moving into the automation world to really take advantage of that. So, with that being said, that's kind of the bigger picture of automation.
Donnie, were there any questions? Does anybody have questions about what we're doing or...?
Hey, Rick. I have one question that just came through for you. Let me see here. It says, "If a stored process is used, can you put the SQL messages into an alert for investigation purposes?"
Yes. If I have the information, I can do whatever I need to with that information. In other words, if I get an alert or I get an email, yes, I can read that from a database table. I can even read it from an email. So yes, I can go check an email box, pull information out of an email, and actually alert on that information. There are multiple ways to alert.
Okay. Excellent. If there are any additional questions, please feel free to put those into the questions pane on the right hand side of your screen and we can go ahead and answer those. Looks like we have one more that just came through. Is it possible to interface with file creation and have high-res PDFs zipped and placed into a Workfront integration?
Within GoAnywhere currently, we do not create high res PDFs. You can execute that connection to a third party product from within GoAnywhere if you have a product that will actually create those PDFs. But as far as data is concerned, and movement of that data, GoAnywhere doesn't play favorites. If you have data and you want to move, we'll move it. We don't care what the data type is. But currently, we do not create PDFs on the fly like we do the other formats.
Very good. We'll give it a moment here for any additional questions that you guys have to come through. In the meantime, if there is anything you want to reach out to us and contact GoAnywhere or HelpSystems for, we have that information shared with you right up there on the screen, in addition to links to our trial and demo. Feel free to reach out to us there. Also, just a quick reminder that you will get an email with the recording of this webinar. In addition, that survey will pop up at the close of today's webinar. Please feel free to include any additional questions that came up for you and give us any feedback that you have on today's event.
Rick, it looks like that is all the questions that we have for today. Thank you everyone...
Sorry Angela, I'll interrupt here. Sorry. Donnie Laughlin from the GoAnywhere team here as well. There was one question that was a piggyback off of the original question on the stored procedure, Rick. It really pertains to the SQL responses, not just the alert, in the question I think he was asking.
Oh, the response from the secure...
Yeah. Unfortunately, right now it just depends on the stored procedure and the way it communicates with the JDBC driver. So, normally there are ways for you to pass parameters into a stored procedure, but getting the parameters back out of the stored procedure you do have to kind of take a little bit of a trip to get that back. Meaning writing the data into a temporary table and then pulling that data back out of that table. Unfortunately, the inbound, outbound with stored procedures is not as simplistic as it should be. It just hasn't been grown into that particular functionality yet.
So, sometimes it does work. Sometimes it doesn't, but I find that the best way to do that is in your stored procedure, have it write out the information to a temporary table, and then just select the parameters you need to continue processing from that temporary table. That's the easiest and most secure way of doing that.
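The temp-table workaround can be illustrated with SQLite. SQLite has no stored procedures, so a Python function stands in for one here; the point is simply that results flow out through a table rather than through output parameters:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TEMP TABLE proc_output (param TEXT, value TEXT)")

def fake_stored_procedure(conn, batch_id):
    # The "procedure" writes its results into the temp table
    # instead of returning output parameters.
    conn.execute("INSERT INTO proc_output VALUES ('status', 'OK')")
    conn.execute("INSERT INTO proc_output VALUES ('rows_processed', ?)",
                 (str(batch_id * 10),))

fake_stored_procedure(conn, 5)
# Pull the "output parameters" back out with a plain SELECT.
results = dict(conn.execute("SELECT param, value FROM proc_output"))
print(results)  # {'status': 'OK', 'rows_processed': '50'}
```

The follow-up SELECT is an ordinary query, so it works regardless of how well the JDBC driver handles output parameters.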
Great. Thanks Rick. And again, we will review the list of questions that have come through and will follow up with folks on those open questions. Just to go through and just make sure we cover those that are applicable here. There was a question on moving files interfacing with S3 resources, Rick. The answer is yes. I know that. I don't know if you want to talk to that just briefly.
Yeah, GoAnywhere does provide access to Amazon S3, as well as Azure Blob as resources. So, just as we had set up a resource to connect to the S3 or the database, you would connect... I mean, I'm sorry, to the SSH or the database, then you could actually create a resource for Amazon S3 as well as Azure Blob. So, that means that yes, you can make a direct connection. You can upload and place the file directly onto an S3 bucket, or I can just copy files to and from the S3 bucket just like it was a local folder.
Okay. We had another great question come in about monitoring and alerting. The question was: can you monitor a file and send a notification if it hasn't been picked up or processed by a specific time? Again, I will answer that in the simplest form, and the answer is yes. And then I'll ask Rick to elaborate.
Yes, you can do that. The key here is that the product itself has what's called an SLA. So, there is an option under GoAnywhere to get into your service level agreements. So in this case, what you're looking for is the execution of a particular feature by a certain date or a time. So yes, I can go in and create a service level agreement that I'm looking for within a monitor being executed, a trigger or a project being executed based on a certain value. So, absolutely we can do that.
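At its core, an SLA check like that is a comparison between the last successful run and a deadline. A minimal Python sketch, with a hypothetical deadline and timestamps (this is the concept, not the product's SLA feature):

```python
from datetime import datetime

def sla_breached(last_run, deadline):
    """True if the job has not completed successfully by the deadline.
    last_run is the most recent successful completion, or None."""
    return last_run is None or last_run > deadline

deadline = datetime(2024, 7, 1, 20, 0)  # file must be processed by 8 PM
print(sla_breached(datetime(2024, 7, 1, 19, 30), deadline))  # False: on time
print(sla_breached(None, deadline))  # True: never ran -- send the alert
```

When the check comes back True at the deadline, that's when the notification fires.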
Okay, great. Thanks Rick.
I have one quick follow-up confirmation here to the high-res PDF question. Assuming that the high-res PDF is already created, just confirming that you can zip the file and place it into a Workfront database. Is that correct?
You can zip it. I mean, that's not an issue. Zipping is zipping. So, if we can zip it, it's all about the compression rate of what you can actually zip. Inserting it into a database... I'm assuming you may be talking about Hadoop or something like that, maybe. I'm not sure. But there are ways to insert it into Hadoop, or there are ways to basically attach it to some type of a blob if you need to into a database area. Hopefully that's what they're talking about is like Hadoop or something along that line.
Well, Donnie, unless you see any additional questions here on the list that are pertinent to ask, we will go ahead and make sure that any questions that weren't asked during the live portion today, that you get a follow up there with additional questions. And then in addition, we'll send that recording out your way. And I believe that is a wrap for today. Thank you so much for your time, Rick. We appreciate it here. This was a great overview. And thank you for all of our attendees for your time today. And we'll be in touch with you soon.
Thanks guys. Have a great day.