Automating Your File Transfers

Thank you for viewing this on-demand webinar. If you have questions following the webinar, please contact us. You can also download the presentation slides here.


About the Webinar

Are you manually transferring numerous files every day in your workplace? If so, those file transfers may be taking up far too much of your time – time you can get back with an automated, secure solution!

Although free tools like Open PGP Studio are a good option, GoAnywhere MFT helps you centralize and securely automate file transfers, saving crucial time – plus so much more!

This webinar will focus on the many different aspects of automation that GoAnywhere MFT can achieve for you and your organization, including topics like:

  • Automating the PGP encryption/decryption process
  • Monitoring the file systems
  • Data translation and database queries
  • Error notification and eliminating human error

Transcript

Angela: Hello everyone. Thank you for joining today's webinar on automating your file transfers. During the webinar we will go through how companies are centralizing and securely automating file transfers to save crucial time in their organizations. We hope you find the presentation helpful. I'm here with my co-host, Dan Freeman. Dan, are you there?

Dan: I am.

Angela: Great. Before we kick things off I'll remind you that the event is scheduled for an hour and we are recording it, and we'll send a link out afterwards so you have it. Feel free to ask any questions throughout the presentation in the lower right-hand questions pane. We will have team members online that can answer them, and we'll also answer some verbally throughout the event and at the end as well. Finally, a survey will display at the close of the presentation, and if you fill that out it will give us great feedback on what parts of the presentation were most helpful. You can also reiterate any questions that weren't answered on today's call and someone will get back to you.

All right, here's our agenda for today. First, we'll set the context for the many ways organizations are transferring files today and the common issues associated with those methods. Then we'll dive into the benefits of secure managed file transfer solutions, and finally we'll cover a live demo and end with some Q&A. All right, let me introduce you to our presenter. Dan Freeman has spent the last 10 years of his career in various security roles, ranging from systems engineer to security officer. He currently serves as senior solutions consultant at HelpSystems for the GoAnywhere product line. Thanks for being here Dan, and you can take it away.

Dan: Awesome, yeah thank you, and thanks everybody for taking a little bit of time out of your day to join us for this brief conversation that we're going to go over today. Hopefully everyone had a good holiday season, hopefully you're excited to be back. I know myself it was a good break, got to take the family out to some fun restaurants and I know usually when you go out to restaurants you look at the ratings, maybe four or five star, well I'm not that high end but speaking of which, did you guys see or hear about the new restaurant on the moon? I was like, oh this is really cool. I feel like splurging, taking my kids up there, I looked at the ratings, it was great food, no atmosphere. Unfortunate, I know. Maybe they'll get it right the second time around.

Okay, that was a really bad dad joke to start it off, so let's get going here. We do have quite a few slides, so I'll try and be as high level and brief as possible. I think the demo contents and going through the product and showing you some of the ways that we can do that automation to help save time and money will come through with the demo portion, but let's go through a few slides first.

Okay, so we talked about, or Angela mentioned, the context of this. Just to talk about data in general: if you've listened around, it's generally predicted that data will double in maybe the next two years or so, whatever that number is. As we know, the further along we get in technology, the more integral data becomes to everything we do. So a lot of what we're going to deal with when we talk about collaboration applications, email attachments, automated scripts – these are just ways that organizations are exchanging files through different channels, whether it's to businesses, trading partners, what have you.

Most of these options, like your collaboration tools – it's really hard to keep tabs on certain cloud services like a Dropbox or a Google Drive or Box. You might not have really good controls on what leaves the network, as far as sensitive information, trade secrets, whatever. So it's really important to have that collaboration but at the same time have good control and auditing around those types of things. Same goes for applications. You might have applications, maybe homegrown apps, or APIs that you might be making available, whether SOAP or REST, that might not be secure. Maybe the folks building these things don't really have that security background, or don't even really think about it. Sometimes security is put to the wayside because it does take a little more time and effort and can be a little less convenient, so you've got to pay attention to those types of things.

Email attachments, or email in general – it seems to be the de facto communication standard, but at the same time it can be very cumbersome and, from a security perspective, a nightmare. Large files usually can't be sent, or if you do open up the Exchange send connector settings to a larger file size, that can bog the server down or even lock it up. And auditing of any kind of mail attachment is going to be a real pain in the rear end, if not impossible, especially when you start dealing with sensitive information. Now you can have that kind of data wherever that email is sent, with multiple mailboxes, recipients, all that stuff.

Then automated scripts – I think these are really, really difficult to maintain. Sometimes even things like retry logic are really convoluted, if available at all. We'll talk about automated scripts from a development standpoint: not really user friendly, not everybody can work with those scripting technologies, so we'll talk about how a managed file transfer system can deal with those types of things as well.

Different file exchange types – I think this is pretty straightforward. I think most of you have seen this: for the most part, either server-to-server or that business-to-business type communication. I think most of us think of those as unattended jobs or scripts that are run – again, no human intervention. We have common channels in there, but it can be pretty much any channel. A lot of times we'll talk about an EDI, or electronic data interchange, type of transfer going back and forth, having those set standards so you can have that business-to-business or machine-to-machine communication, getting rid of the human intervention from a speed, efficiency, and accuracy standpoint.

On the other side we still have that person-to-person or person-to-server side – more the ad hoc file transfers. This is really where email is more prevalent: things like team collaboration, whether it's email or even FTP, which is still pretty common. With web cloud services, a lot of times it's going to be those web client portals that people can log into and drag and drop things onto the interface, but those are other methods we need to pay attention to when we're doing those types of file exchanges.

We have external exchanges. You see in that top right portion there, FTP, email, HTTP, highlighted in red. Those are going to be your traditional not-so-secure and unreliable protocols to deal with. So when we're dealing with these in transit we definitely want to provide that encryption. Instead of HTTP let's lean towards HTTPS and leverage some TLS certificates. Instead of FTP, let's do SFTP – or FTPS, which isn't really on there – and leverage SSH keys or certificates to encrypt that traffic as well.
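
To make that concrete, here is a minimal sketch of an encrypted-in-transit transfer using Python's paramiko library, outside of any MFT product. The hostname, username, key path, and file paths are hypothetical placeholders:

    import paramiko

    # Load our SSH private key (the server holds the matching public key).
    key = paramiko.RSAKey.from_private_key_file("/keys/transfer_id_rsa")

    client = paramiko.SSHClient()
    client.load_system_host_keys()
    # Reject unknown hosts rather than silently trusting them.
    client.set_missing_host_key_policy(paramiko.RejectPolicy())
    client.connect("sftp.partner.example.com", username="acme", pkey=key)

    # Everything below travels over the encrypted SSH channel.
    sftp = client.open_sftp()
    sftp.put("/outbound/orders.csv", "/inbound/orders.csv")
    sftp.close()
    client.close()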

Then also that encryption at rest: we want to have, as best we can, that true end-to-end encryption – so not only encryption in transit but also encryption at rest, so we have those protections in case people do get to the final destination to pull those files. It's not always being cracked or hacked or listened in on during transfer; a lot of times it's the data at rest that's vulnerable on the inside of the network. Then EDI, I briefly mentioned that, but a lot of times these can be based on homegrown solutions or really, really expensive tools. These are things we need to pay attention to. EDI is the backbone of things like purchase orders – think Walmart inventory, shipping info, invoices, all those types of business transactions. That's going to be a really heavy workload of automated file exchanges.

Then, not the flip side so much, but internal exchanges like your application integration and data workflows – this is the stuff we're going to talk about. Translation-type technology: taking a CSV, reading it, and then importing it into a database. All those types of things we can do from an automated standpoint, as well as the internal file movement among different servers, platforms, and applications. A lot of times we have a really loose association between these things, or maybe even completely siloed environments. How do you manage these things? This is what we're going to get into – having that centralized MFT-type solution is really going to help in these areas as well.
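
For a feel of that CSV-to-database translation, here's a minimal plain-Python sketch. The file name, column names, and SQLite database are hypothetical assumptions for illustration:

    import csv
    import sqlite3

    conn = sqlite3.connect("warehouse.db")
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id TEXT, item TEXT, qty INTEGER)")

    # Read the CSV and translate each row into a database record.
    with open("orders.csv", newline="") as f:
        rows = [(r["id"], r["item"], int(r["qty"])) for r in csv.DictReader(f)]

    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()
    conn.close()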

Internal exchanges from remote locations – this gets on a little tangent, but you might have remote locations coming back to a central hub. You might be opening up unnecessary ports. These might be high-latency lines where the transfer is really, really time consuming. You could be using traditional FTP or email, some of those non-secure protocols, to transmit this data. Again, it gets away from that centralized management, so there's a lot of decentralization in the stuff we're talking about here as well.

That bottom point, the accelerated transfer of large files between remote sites – that's something a lot of folks look at, some sort of file acceleration product or solution, and if they're not, this is where it really comes into play from a time-consumption standpoint. As we all know, time is money, so it's nice to have some sort of solution there for accelerating these large files over high-latency lines.

This slide here, just real quick – we don't need to go into the details of it. It gives a little breakdown of some of the security trends, some of the different areas at a hierarchical level of what is usually causing the problems or breaches, or causing money to leave organizations, because of these types of endpoint attacks. The biggest thing I want to pull from this slide is, one, the dollar figures are obviously huge. The predicted damage from threats and data loss of any kind is increasing every year, year over year.

But the other thing to pull out of this, on those main trends, that first bullet point, is that human error is still the main problem – whether it's accidental loss, bad configurations, lack of knowledge, or even insider malicious use. Human error seems to be one of the biggest reasons, if not the biggest, for a lot of these breaches or data losses worldwide. Getting that human element out is the core reason, or one of the reasons, for automating things – from an efficiency standpoint, from an accuracy standpoint, things like that. Getting an automated solution, whatever that may be – MFT is what we're talking about – is critical to get rid of a lot of these issues and, bottom line, a lot of lost dollars.

Okay, so some of the common issues – these are going to echo some of the things we just talked about, obviously, but we'll look at them as problems. I think we have five listed on here. Development and maintenance: now I think we can all agree developers are pretty sharp folks. In fact, from a maintenance standpoint they can sometimes be very, very creative in what they do, which can lead to a few problems. You're very dependent on those folks now to do the work. They might not be readily available, or maybe you can't afford them – they can be very expensive.

Some of that dependency can create big delays, and obviously expensive ones at that. Having to rely on development staff for things like these scripts can be expensive for the most part, and you're really depending upon those folks to do those types of fixes. The scripts can be a little convoluted, for lack of a better term. Even when we do get good scripts or programs to do what we want, maintaining them is going to prove to be quite a struggle.

Maintenance of these is going to be really important – you can't just build something and let it go forever. I'm sure there are some exceptions to that rule, but outdated or misconfigured tools are going to be prevalent throughout your organization. Getting back to centralization, which I'll talk about in a second: if you have scripts all over the place, it's going to be really hard from a maintenance standpoint when something changes. Say your scripts commonly call an SFTP server that you're sending files to – if that IP address, username, password, anything changes, hunting down everywhere that resource or server is leveraged in a script or scripts can be a real headache. Again, getting away from that centralization theme.

Then I think the lack of knowledge on how secure protocols work is going to be another one. Whether it's lack of knowledge or lack of resources and time, or the wherewithal to think about integrating security within applications – that seems to be something that, although it should be at the forefront, a lot of times isn't, because it can be time consuming and expensive to implement. Sometimes those factors lead to non-secure solutions or misconfigured, outdated tools. So all of those things – whether you're dependent upon a certain resource, whether those solutions are convoluted and difficult to maintain, whether it's just expensive to do so, or maybe we didn't have the wherewithal to think about security while developing them – all of those can definitely be an issue from a development and maintenance standpoint.

Leading into decentralization: I think everyone can agree that a centralized hub for all your answers – to life, even – would be great, obviously, but that's probably a pipe dream. Having processes all over the place without any type of centralized administration makes a difficult job basically impossible. If we don't have centralized administration for these things, not only can the tools themselves be difficult to maintain, but once you have them all over the place, it becomes, again, almost impossible.

So we need some sort of centralization. This decentralized administration concept is going to cause headaches from the standpoint of managing things like users across multiple applications and systems. I know a lot of people probably have an FTP server, and all the user accounts for that FTP server are actually created on the machine itself as local system accounts. Then they have another integration system, an ERP system or something else, with completely separate accounts. From a user perspective, there's no integration with Active Directory or LDAP or SAML or whatever they have as a central repository for an authentication source. Not having that can also just be an entire headache, absolutely a headache.

Then that last bullet point, the encryption keys. We get to the point where we're actually doing due diligence to provide that security, but now, again, we've got encryption keys all over the place. We don't have a central PKI for the certificates, the SSH keys, the PGP keys. We usually tend to have those things specific to the applications that are using them, not managed in a centralized way. So that's going to be another issue: not only from a user perspective, but now we've got our secure keys all over the place and no centralized place for handling them – something else to pay attention to.

Limited automation capabilities – I think this is pretty straightforward. This is going to be from a reliability standpoint: things like auto-resume, and the retry mechanisms that often aren't built into scripts. Then there are your users manually operating on sensitive data – whether it's the actual final product, or they have to send something out, encrypt it using a PGP key, and they select the wrong one and send it to a person who doesn't have the private key to decrypt it. Whatever the case may be, it's that human manual interaction, plus that decentralization of secure keys, user accounts, things like that.

That final encryption and decryption could still be a manual process – we don't know. That whole picture there, with the sender and receiver, encrypting and signing and then the decryption and verification process – all of those things we definitely want to automate as best we can. We don't want to rely on any kind of human interaction to know what key I'm using to encrypt a file; or, if I'm receiving something, what key I'm using to decrypt, and what I'm using to verify the actual digital signature, so I can verify who actually sent me the file. You want those things to most definitely be an automated process, depending upon who sends you a file, or depending upon what folder we're monitoring to kick a file out to a certain partner. These things we definitely want to automate as best we can.
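
As a rough illustration of that sender/receiver flow, here's a minimal sketch using the python-gnupg wrapper: encrypt and sign on the way out, decrypt and verify on the way in. Key IDs, passphrases, and paths are hypothetical, and in reality the sender and receiver would each use their own keyring:

    import gnupg

    gpg = gnupg.GPG(gnupghome="/home/mft/.gnupg")

    # Sender: encrypt with the partner's public key, sign with our private key.
    with open("report.csv", "rb") as f:
        enc = gpg.encrypt_file(f, recipients=["partner@example.com"],
                               sign="us@example.com", passphrase="our-key-pass",
                               output="report.csv.pgp")
    assert enc.ok, enc.status

    # Receiver: decrypt with their private key and verify the signature.
    with open("report.csv.pgp", "rb") as f:
        dec = gpg.decrypt_file(f, passphrase="their-key-pass", output="report.csv")
    assert dec.ok and dec.valid, dec.status  # .valid means the signature checked out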

Audit notifications – I think this is pretty straightforward, pretty obvious. If you've ever been under an audit, you definitely know that the auditor wants to know that you know what's going on with your systems at any given time: who's touching them, what files went out, what are they doing with those files, what time are they doing it, what changes are they making? They're definitely going to want you to know what's going on, especially with sensitive information, whether it's PII, PHI, FTI, anything from a GDPR standpoint – not classified so much, but anything that falls in any of those sectors of compliance. They're definitely going to want to know that you know what's going on, so having that centralized auditing, again, gets away from having multiple applications doing all kinds of different things.

Otherwise they're all going to have their own audit trails for every different application, every different process, every different protocol that's moving files in and out. Having that centralized – I think it gets back to that common theme of centralization – means having, one, the actual audit files to begin with, because a lot of times these tools don't even have really good auditing. So one, just have them in the first place, and two, have them in a centralized location covering all the different mechanisms for how files come in and out of your network. Very important there.

Then from a security standpoint, I think these again are pretty straightforward. Encryption – I'm not going to go through all of that, but there are two things to remember: let's encrypt the traffic, whatever protocol we're using to send something, so encrypt in transit; and let's be diligent about encrypting at rest. Those are the two things to make sure you pay attention to. There are some other things in there, but from an encryption standpoint, those are the two. Access management – again, this is something else your auditors are going to want to know: that you have access controls on your network, that you are controlling the gatekeepers, per se. You want to do some segregation of duties, implement least privilege as best you can, and have multi-factor authentication. I'm sure that's something everybody has been hearing, whether you're in compliance or not – especially in compliance. 2FA, or MFA, whatever term is used, you're going to be seeing that a lot. Some form of multi-factor authentication has definitely been at the forefront the last few years.

Architecture – this gets a little outside of automating file transfers per se, but from a security perspective we want to make sure that for things coming into our network, we are protecting the back-end private network. I'm not going to go into details on this piece as far as the gateway is concerned, but this is where we want that front end to control things, so we have that protection and those mechanisms, and we're not opening up ports all the way back into our private network for people to come in and pick up or drop off files. From an architecture standpoint, we could definitely talk about the gateway at some point here.

Again on architecture, a little fuller on that: getting away from that traditional FTP model, if you will, where most times we would have an FTP server in the DMZ, with files staged in the DMZ and potentially credentials there too. We want to get away from that – we want it to be just a pass-through, streaming through that DMZ server in the middle, with all the files and authentication on the private network on the back end. Things like that we want to pay attention to, maybe changing the methodology of the architecture and how we set these things up.

Then availability – if you ever talk about CIA, that's your confidentiality, integrity, and availability, the tenets of security. Availability, meaning from a high-availability standpoint, is about having the information available to authorized users – having it available at all times and keeping it to the people that need access to it. That kind of dovetails off the architecture conversation as well, as far as having data available to your users – authorized users, I should say. Then integrity: has anything been altered? That's the I of CIA, and that's where we need to make sure nothing has been altered in transit on the way to the users.

Common needs – I think we've gone over this: protect the data, manage and control, regulation, compliance, all those types of things we covered really quick. And then, why would we do MFT? Hopefully this is a rhetorical question: secure file exchange management; centralized administration – I know we harped on that a lot; full traceability and control – again, when the auditor comes, I want to know I have at least the mechanisms, the logs, the alerts to let them know that I know what's going on, with that full auditing, tracing, and access control. Then your automation: get that human element out as best you can, make things as efficient and accurate as we possibly can.

Here again, I think the demo kind of covers this. A modern MFT solution is what we're talking about – obviously you're on a GoAnywhere webinar, so we're going to talk about GoAnywhere as your MFT solution. There are MFT solutions all over the place, but I think this box at the top right, ease of use, is something I really preach – I shouldn't say preach, but when folks get on demos and ask what separates us from competitors, there are quite a few things from a technical standpoint, and this is one I really point out to a lot of our users. I think it resonates, given the number of folks that actually renew their licenses and everything like that.

Where we are in the marketplace now – we didn't get there with a really convoluted product that depends on professional services to help people and get more money out of the deal. I think the product is very, very intuitive and, for lack of a better term, easy to use. It really is. So I highly encourage, if you haven't: download the product, you get a free trial, all that good stuff. Anyway, it's just something I really believe our product is really, really good at.

Cross-platform as well: whether you're on Linux, Windows, IBM i, Novell, whatever, you can throw this on any of those platforms you want, and with the extensibility you're going to see it's very, very flexible. It's modular, and most everything is a perpetual license, which makes it really easy from a licensing and module standpoint – we don't shove everything down your throat; you pick and choose what you want to have and pay for just that, so it's pretty cool. I guess my only advice is to get your hands dirty and try to get a demo out of the deal, or an eval license, and put it on your network.

I guess I'll put my sales hat on for a second. If you've ever looked at the Gartner quadrant, this is something similar: the 2019 Info-Tech MFT data quadrant. MFT is kind of a niche market – I think it dropped out of Gartner probably six or seven years ago now, so this is something that Info-Tech came out with. Obviously we're in the top right, and if you're familiar with the Gartner quadrants, the top right is always the one desired, the leader quadrant, so it's awesome to see that we're up there above some of the competitors out there. It's neat to see our peers out in the industry recognizing that we have a pretty solid solution. Anyway, there's the sales pitch for you.

All right, let's jump out of here and go through a couple of scenarios. Let me escape out of here. First things first: what you're seeing right now is that I'm actually logged into the administrative console. It is web based, so you don't have to pull down or install a client to do any management of the actual application. You'll see here that I'm logged in. I'm just going to use my local machine for today's conversation to go through maybe three or four different scenarios that demonstrate ways we can do some automation to get rid of that human intervention – or human interaction, whatever you want to call it – so we can have efficient, very accurate processes and get away from the human error that can lead to data leaks, loss, stuff like that.

First thing to note: we'll talk about what we call projects in just a second. At a very high level, that's how we do our – we'll say scripting, for lack of a better term. Before we jump in there, there are going to be three mechanisms that I'm going to cover today. Everything I'm going to cover is a pretty small subset of what we can do, but in the time allotted I'm going to try to go through some pretty common scenarios that we see. I'm going to try to keep it fairly simplistic, maybe give you a little touch or an idea of things that can also be done, but hopefully the projects will at least give you an idea and get you interested in what else this thing can do, because it is very, very powerful and very, very flexible.

Okay, with that being said: we're going to cover the scheduler first, which is pretty straightforward – you can schedule certain things to happen. We'll go through that. We'll go through a monitor, where we're basically going to monitor the file system for something – a file of some sort, maybe a certain type of file, certain actions: whether it's created, modified, or whether it just exists – to then call something to do another action. Then we'll look at something called triggers, where we're actually looking for users that log into our system, depending upon what protocol we're offering – SFTP server, FTP, FTPS, HTTPS web client, whatever the case may be. We're going to look for what we call web users. Web users are the folks that we create to log in and do SFTP puts, whatever. We're going to look at certain actions, specifically an upload, to then do some decision logic on what we want to do because of that action.

Okay, with that said, I'm going to jump straight into one of the projects first – I've kind of worked my way backwards, which could potentially be a little confusing at first, but we'll work our way back and see how this works. First thing we're going to do is look at the project that I'm going to put on a scheduler, to see what it's doing. This project is called "schedule copy to S3 bucket, same names". Basically, I'm going to look at a certain directory, take those files – or file, whatever it is – and copy them to an S3 bucket. Pretty straightforward.

This will give us a chance to look at what we're seeing here, so let's go over this project designer window, just so you understand at least a little of what we're looking at. It has four different sections or windows. The first one over here on the far left is our component library. This is where all the action items are. In our case we're going to create a file list – basically look at a certain directory – take those files, copy them, and send them to an S3 bucket. Those types of things are, in this case, file system tasks: we're going to create a file list and we're going to copy those files.

But the point is, this is where all the action items are going to be, whether it's an SFTP put action, maybe a database SQL query, maybe some data translation. Say we have a CSV file and we want to enter it into a SQL database – we can do a read CSV task. And not only can you do action-item tasks like this, like file system or data translation tasks, but also control logic. Maybe we want to do some if-else conditionals to see what's going on, maybe we need to loop through the file list that we're creating – lots of different items.

The point being, the component library is where all your action items are going to be. Once you decide on the action you want, you can either double-click it over here or drag it over here; once it turns green you can drop it into this project outline. That's what this is: the project outline is a graphical depiction of every single task you pull from the component library to build out some business function. We'll go over this in a little more detail in a bit. This window over here we'll call our attribute window: once we define the task we're pulling over, we've got a few things to define. In this case, for a create file list: where am I building this file list? It looks like I'm building it from this directory here. Then finally, over here you're seeing a variables window.

There are going to be a few different variables – four different types are possible within every project. System variables: we'll go over a couple of these; you're going to see them in every single project. Folder variables: these are defined outside of the project at a folder level, similar to a Windows folder, so for any project defined underneath that folder – in this case, "projects" – the variables you define show up here automatically. That's where this one's coming from. Then some output variables – and again, we'll go over where those come from.

Okay, so let's go through this project really quick. We'll walk through it and then actually kick it off via the scheduler and see what happens. This one here – the create and delete workspace tasks are basically just creating a temporary directory for information to go into during the project, which we can clean up after we're done. That's all that's really doing. The meat of what we're doing in this case is creating a file list. So what are we doing? We went out, browsed our network, and said, okay, we're going to look at this "schedule, S3 bound" directory, which is right here. So under schedule, we're going to look at this directory. It looks like we've got three files: a txt, a doc, and a csv.

The file list variable is something we're defining; we put this file list in there and it auto-creates over here for you. Then there's the number of files found, should you want to use that – great, we can do that; it auto-creates over here too. Then all we're doing is a copy task that takes that file list, which is just a source files variable, and copies it to this destination directory, which is an S3 bucket. Now, very quickly – what you didn't see, and I'll bounce out of here before we do the actual scheduled project – is that when I hit the ellipsis and go through the places I can put this, in this case the output directory, you see these resource links, and one of them is an Amazon S3 bucket where this "Linoma encrypted" shows up. All these resources need to be defined previous to getting here, so let's exit out of here.

We kind of get the idea of what's going on there. That's going to be something from a resource perspective. I'm not going to spend much time on this, but just so you know, you can define different resources – think of resources as us being the client connecting out to, in this case, an Amazon S3 bucket. Maybe you're connecting out to a database server because you need to do a SQL query to create a file before you send it out to a partner; or an ICAP server because you want to scan things from an AV perspective; or an SSH server – this is where you define your SFTP servers. Whatever the case may be, you define them first here. And by the way, with every single resource, once you put in your information there's always that sanity button, the test button, to make sure everything you put in there actually works. So we're going to test, and now we've verified that yes, I can connect out to my S3 bucket.
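
For comparison, here's roughly what that resource definition and sanity test look like if you script against S3 directly with boto3. The bucket name and file paths are hypothetical placeholders, and credentials are assumed to come from the environment or an IAM role:

    import boto3
    from botocore.exceptions import ClientError

    s3 = boto3.client("s3")  # credentials picked up from env vars / IAM role

    # The "test button" equivalent: fail fast if we can't reach the bucket.
    try:
        s3.head_bucket(Bucket="linoma-encrypted")
    except ClientError as e:
        raise SystemExit(f"Cannot reach bucket: {e}")

    # The copy-to-destination step.
    s3.upload_file("/schedule/s3-bound/copyme.txt", "linoma-encrypted", "copyme.txt")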

Okay, so we define those resources first – anywhere we want to connect to. In our case we're using an S3 bucket as our final destination, so we define that bucket and it becomes our output, or destination, directory. Okay, let's go to the scheduler and work our way back. Let's look at what's going on in the scheduler. The scheduler here is going to call that project, the one we called "schedule copy to S3 bucket, same names".

A couple of things here: the admin user and password – that's just a GoAnywhere administrative user that has access, or project executor rights, to this specific project. But probably more importantly: how often do you want this to run? You can do a multitude of different options; I'm just doing weekly on this, and I'm going to use the run now option anyway – I'm obviously not going to wait for it. But you can put this on whatever schedule you want; it's pretty straightforward in that aspect. Then the retry option: from a scheduler perspective, you can say, hey, by the way, if this project fails, I want to repeat it every 20 minutes for a two-hour period. So if it runs the second time and fails again, we wait 20 minutes and do it again. It's a repeat schedule within a schedule – you basically get some retry logic within the scheduler itself.
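
That retry behavior is easy to picture as code. Here's a minimal sketch of "repeat every 20 minutes for a two-hour window" wrapped around some job function; run_transfer is a hypothetical stand-in for the scheduled project:

    import time

    def run_with_retry(job, interval_min=20, window_hr=2):
        deadline = time.monotonic() + window_hr * 3600
        while True:
            try:
                job()                      # e.g. the copy-to-S3 work
                return True                # success: stop retrying
            except Exception as err:
                print(f"Job failed: {err}")
                if time.monotonic() + interval_min * 60 > deadline:
                    return False           # retry window exhausted, give up
                time.sleep(interval_min * 60)

    # run_with_retry(run_transfer)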

Okay, let's go ahead and cancel that, just so we can see this. First off, let's go to our S3 bucket and make sure there's nothing in there, because we're putting those three files – that txt, that csv, and that doc file – here in this S3 bucket. Let's go back to our scheduler and hit run now. This should kick off that "schedule copy to S3 bucket, same names" project, so it should build up that file list, which as you remember is right here – where's that schedule S3? – so this should be the create file list task, and then we're going to take this file list and copy the contents, which are these three files, to my S3 bucket. Hopefully we'll see that here; otherwise we're going to be doing some live troubleshooting.

Okay, cool. It showed up here, and it says it copied at 10:35, which is just now. So that's one really quick, simple project showing how you can put things on a schedule to run on an automated basis. Very, very simplistic. Now, one thing we always do as well, as far as seeing what actually happened: you can always go to your completed jobs and look at the actual job log here. We'll see that this project got kicked off by a scheduler. It looks like, as we talked about, the create workspace task just created a temporary directory – we're actually not even really using it in this project at all; it's just good practice to have it there. The create file list task was the first one, and it created the file list containing those three files we looked at. Then the copy task just took those three files and moved them up to our S3 bucket.

That's all it did. It looks like we also had a send mail task, so let's look at our email. Yeah, here we go. The send mail task used that number of files variable, remember? So that got filled in with three: three files were copied to the S3 bucket. Pretty straightforward, pretty simplistic. We copied, and we sent an email saying, yep, everything was good. Way to go.
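
The notification step maps to a few lines of smtplib if you were scripting it yourself. The mail server, addresses, and credentials below are hypothetical placeholders:

    import smtplib
    from email.message import EmailMessage

    def notify(file_count):
        msg = EmailMessage()
        msg["Subject"] = f"{file_count} files were copied to the S3 bucket"
        msg["From"] = "mft@example.com"
        msg["To"] = "ops@example.com"
        msg.set_content("Scheduled transfer completed successfully.")
        with smtplib.SMTP("mail.example.com", 587) as smtp:
            smtp.starttls()                    # encrypt the session
            smtp.login("mft@example.com", "app-password")
            smtp.send_message(msg)

    notify(3)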

Okay, so let's take that same project and do kind of the same thing – but maybe you want to take those files, or file, because we don't know what it's going to be. When you do a create file list, you really don't know how many files are going to be there: it could be one, it could be five, it could be 100. In the case we just did, we took them all and copied them to a location. But maybe you want to take that file list – we're looking at the exact same one – and, whether it's one or five or three in our case, you want to take the file names and maybe append or prepend the current timestamp. There are a lot of different things we can do; in this case, that's all I'm going to do.

We can't just say, okay, cool, let's take the copy task, copy those files to that directory, and just add the current timestamp variable on there – this file list is now a complex variable, because again, we don't know if it's one file or 100. So we can use a for-each loop, which is in our loop section, to loop through that file list. That's the file list variable we created, so you can just drag it right over here, and on each iteration – again, we don't know how many – there is now a new variable called current file. So now our copy task isn't just taking all of them and throwing them up to that directory; now we're doing a little manipulation. We're taking the current file in that loop and saying, okay, let's throw it up to that same directory, but now let's call it all this garbled junk here. Where's that garbled junk coming from?

Well, let's hit our little show variables button – this is what we call our expression wizard. There are tons of different functions in here, but for the sake of time, what I'm doing is just taking the file name, appending the current timestamp, and then putting back the extension. We're using a concatenation function, and when you highlight any of these functions, you'll see down here that it gives you a nice example of exactly what it's looking for – so CONCAT(text, ...), and you can have as many text fields as you want.

In mine I'm just doing a CONCAT function. I'm taking the current file variable – the current file that's going through the loop – and using an attribute called name without extension, which does pretty much what it says. So whatever the file was called, say copyme.txt, I'm taking the copyme section, then an underscore, then the current timestamp, which is this one right here – oops, sorry, this one right here – then a dot, and then the current file's extension attribute, which is just what you think it is: the txt, in this case.

When this goes through – in our case, three times – it's going to rename those files, basically putting the current timestamp in the middle of each file name. That's just one example; there are tons of things you can do. Maybe you have files with a common format, something like a trigger word, an underscore, then another part of the file name, and you just want everything after that trigger word and underscore. You could use a position-of function to find where that underscore is, and then a substring function to pull everything out from there. Anyway, I don't want to get too in-depth there. Let's go ahead – I think we can just run this one here.
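
Before it runs, here's that rename logic in script form: the for-each loop with the CONCAT of name-without-extension, underscore, timestamp, and extension. The directory path is a hypothetical placeholder:

    from datetime import datetime
    from pathlib import Path

    stamp = datetime.now().strftime("%Y%m%d%H%M%S")

    # For each file: name-without-extension + "_" + timestamp + extension.
    for f in Path("/schedule/s3-bound").iterdir():
        if f.is_file():
            f.rename(f.with_name(f"{f.stem}_{stamp}{f.suffix}"))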

This should take those same files and rename them to basically the same thing with the current timestamp in the middle, hopefully. They're going to the same place, so when we refresh, we should see those same file names just with the timestamp in there. Okay, so we've got those in there: the name, the underscore, the current timestamp, and then whatever the extension was. A couple of things you can do there. All right, let's move to another option from an automation standpoint. Think of it from the standpoint of having folders or directories on your network, and maybe an application, an ERP system, kicks out a file to a certain folder. Depending on what folder that is, you're going to pick that file or files up and then call a project to do something.

The file monitors are going to be just that – monitoring the file system. In our case, again, I'm going to keep it pretty straightforward: we're going to have a monitor out there called partner A, so we'll just assume there's a folder on your network called partner A. Whether it's an application dumping a file into that folder or a human being, it doesn't really matter to us; we're just going to monitor that folder. Again, there are a lot of different things you can do. I'm just going to do "file exists" – I'll tell you my logic behind that – but you could do explicitly created, modified, or deleted, and you can filter by pattern.

The schedule, again, is up to you. In this case I'm doing it basically all day, every 15 seconds, Monday through Friday – totally up to you how you want to do that. But probably the main concept of monitors is that we monitor a certain location, which we already determined, and once we get a hit on what we decided here, it builds a file list. That's all the monitors are doing – building a file list, very similar to the create file list task in that scheduled project – and it puts that file or those files in a variable called files. Again, it could be one, could be 100, we don't know. Then it calls this project: "monitor PGP, then SFTP". So in this case, we're going to put files in the partner A folder, and those files have to be PGP encrypted with partner A's public key and SFTP'd to partner A's SFTP server. That's basically what we're doing in this case.
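
Conceptually, a monitor is just a polling loop. Here's a bare-bones sketch of "check the partner A folder every 15 seconds and hand any new files to a project". The watch path is hypothetical, and encrypt_and_send is a stub standing in for the called project:

    import time
    from pathlib import Path

    WATCH = Path("/goanywhere/monitor/partner-a")

    def encrypt_and_send(paths):
        # Hypothetical stand-in for the called project (PGP encrypt + SFTP put).
        print("Would hand off:", [p.name for p in paths])

    seen = set()
    while True:
        current = {p for p in WATCH.iterdir() if p.is_file()}
        new_files = current - seen        # the "file exists" hit list
        if new_files:
            encrypt_and_send(sorted(new_files))
            seen |= new_files
        time.sleep(15)                    # the monitor's 15-second schedule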

Let's go to that "monitor PGP, then SFTP" project. It should be pretty straightforward. The first task we've got here is a PGP encrypt task, which is just under here. What are we doing? We're going to be encrypting this file or files – this is the variable that's getting pushed in from the monitor, the file list. Since it's partner A's folder, I'm using partner A's public key. Now, what I did before this is import their public key into my key management system ahead of time – and by the way, you can manage PGP keys, SSH keys, and SSL certificates within your key management system. What you didn't see me do is import their public key ahead of time, and then it's just a drop-down choice.

So I'm encrypting first: I'm taking the plain-text files and encrypting them, putting them in this output variable called PGP files, which automatically gets created over here. I'm temporarily putting them in this system job workspace, because I only need them temporarily. Then I'm going to take those PGP files and send them to partner A via that SFTP put. The source isn't what I got from the monitor anymore – it's what I got from the monitor plus the PGP step – so I drag that PGP files variable over here as my source, and then I send it here.

I don't know if I have access to that SFTP server, so I've just added a copy statement so I can throw the files to that same resource we do have access to and take a peek. Then I'm going to send an email if it's confirmed. And by the way, I didn't go over error logic on the last project, but if there's an error anywhere in here, you can switch the focus to a module, which I created using the module task. So if anything happens, I switch focus to the errors module and, again, send an email basically saying: hey, by the way, this project failed.

Okay, let's look at that monitor again. Did I change anything? I don't know if I did – let's just hit don't save. Let's go ahead and activate it. This was looking under GoAnywhere, monitor, partner A, so we should see this partner A copyme text – both of these. They should get sent up here eventually. It was every 15 seconds, so it might take just a second here. We can always check our completed jobs and see if it got kicked off yet – doesn't look like it did. The dead silence – oops, not triggers – is always fun in webinars. All right, let's look again. Okay, it looks like it got kicked off and was successful, so let's go back up to our S3 bucket, refresh, and we should see those text files appear. Oh, they're in partner A, my apologies – there they are.

Okay, so there are the two files. They got uploaded here about a minute ago, or just now. Those are your two files uploaded via the monitor: it just watched that folder, and all we cared about was whether a file existed. It called that project, and all the project was doing was SFTP-ing – which is what we're not seeing; we just did the copy task so I could actually see them get moved. But the point is, the application or the user doesn't need to know what key, doesn't need to know the SFTP server, doesn't need to know any of that. All they know is: hey, I'm dumping into this folder going to partner A, boom. There it goes.

All right, let's go to our last one, the trigger. I'm going to deactivate this real quick. Triggers, again, are actions based off of web user activity. Web users are the folks you create to log into your system via SFTP, HTTPS, whatever the case may be. This trigger here is going to be called partner A. My conditions are: one, a file gets uploaded successfully – that's the actual event the trigger is named for; and two, I'm looking specifically for a username that equals partner A, so basically a web user that is partner A. If that happens – if partner A uploads a file successfully – I'm going to call this project, "call trigger me".
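
Where a monitor polls, a trigger is event-driven. Outside the product, you could approximate that with the watchdog library, reacting the moment an upload lands. The path is hypothetical, and handle_upload is a stub standing in for the decrypt-and-move project:

    from watchdog.events import FileSystemEventHandler
    from watchdog.observers import Observer

    def handle_upload(path):
        # Hypothetical stand-in: decrypt, move to decrypted files, delete original.
        print("Upload trigger fired for", path)

    class UploadTrigger(FileSystemEventHandler):
        def on_created(self, event):
            # Fires as soon as partner A's upload appears.
            if not event.is_directory:
                handle_upload(event.src_path)

    observer = Observer()
    observer.schedule(UploadTrigger(), "/uploads/partner-a", recursive=False)
    observer.start()
    try:
        while observer.is_alive():
            observer.join(1)
    finally:
        observer.stop()
        observer.join()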

Now, a few things: you can start adding different variables. Similar to the expression wizard, you can highlight and click the button here and start adding all these different variables if you want to leverage them for whatever reason. I'm mainly going to leverage the actual physical file that's getting uploaded, so we'll see that. There are a few things here I'm printing out in the project so you can see where these variables are coming from.

Let's go to that "trigger me" project quickly. This project is, I think, very straightforward. Since it's from partner A, I know that when I get something uploaded from partner A, I'm going to call this project to PGP decrypt. How am I decrypting it? Well, with my private key, obviously. Now, I could also be verifying a signature with partner A's public key should I want to; in this case I'm not. I'm just taking that file and putting it in a decrypted files location. So let's go to GoAnywhere, and let's use this encrypted file here – this copyme.txt PGP. This is the file we're going to assume partner A is logging into our system and uploading.

If I try to open this, it should be garbled junk – which it is, so we've got this garbled junk file here. Now let's log in as partner A: partner A is going to log into our web client in this case; it could be SFTP, it could be FTPS, whatever – we're just going to use the web client in this example. They're going to upload that file right there, the one I can't read. So they're using HTTPS as the transfer medium, and they've also PGP encrypted the file itself. It just landed on my network, and I notice: oh hey, by the way, this came from partner A. So what am I going to do?

I'm going to call this project, I'm going to decrypt it – oops, decrypt, sorry – I'm going to move it to this decrypted files location, and then I'm actually going to delete the original. So I should now see, in my decrypted files folder, that exact same copyme.txt, but I should be able to read this one, hopefully – and there we go. It got automatically decrypted and put in a directory of my choice.

In the interest of time – I just noticed the time right now – those are three different methods of kicking off a project: scheduling things, monitoring certain file systems, or actually looking and waiting for certain web user activity in the form of triggers. You can automate things depending upon, in this case, the web user that uploaded the file, or depending upon the folder location. We could have done some filtering on file types, or filtering on whether a file contains a certain string value.

There's a ton of different things we could do that we obviously didn't have a chance to cover, but that's why I wanted to show you the expression wizard and all the different functions. It gives you an idea of the different or common ways that projects and processes are called and, probably most importantly, an idea of what a project actually is and how it's created. Just looking at this interface, hopefully it's not terribly intimidating. I personally have never had programming experience, and I don't think people need programming experience at all. Don't get me wrong, I'm sure it helps from a logical perspective – looping and job control probably make more sense to programmers – but this is pretty straightforward. For the most common tasks, these things can be very straightforward.

Having said that, the capabilities of projects can make things as complicated, flexible, and granular as you want. With that being said, I think it's an amazing tool. The automation is, to me, almost endless. Some of the use cases our customers come up with are interesting and exciting and actually drive us to do enhancements. But it's really cool. The point is you can do the automation and take the onus out of your users' hands as much as possible, make things as efficient as possible and, probably most importantly, as secure as possible – and hopefully, bottom line, save you money in the long run. With that being said, I'm going to jump back here. I don't know if there's another slide or not, but I might pass it back to Angela real quick.

Angela: Thanks, Dan. Let me see here. We do have some questions that have come in, and we'll get to those in one moment, but before we jump into those I wanted to thank everyone for joining us today. If any questions don't get answered today, you can see all of our contact information up here on the screen, and if what you've seen today has sparked your interest in a trial or a demo, you can access those at the links you have there. With that, we do have a handful of questions here. I've gone in and answered a couple already, but a couple of others: is there a way to send to S3-compatible endpoints, or is it strictly Amazon S3?

Dan: As far as S3-compatible endpoints, there are a couple of cloud connectors. What I showed here, as far as the resources are concerned, are native ways to connect to S3 buckets and blob storage. But there's also something called cloud connectors – one of them is a data lake storage connector. I wish I knew specifically what you were looking for, but if there's a way we can connect to it via REST – if the APIs are opened up to REST or SOAP, that's another resource here as well – then anything REST- or SOAP-enabled, you should be able to connect up to no problem. But natively, it's the S3 buckets, and then from the cloud connector standpoint – I don't think I have them installed on my local machine, unfortunately, but there's a bunch of different ones. Let me just do this real quick for you.

Browse the marketplace, quick: you've got CloudTrail, CloudWatch, EC2 – I know those aren't the stores – SNS. Data lake storage – oh yeah, sorry, data lake storage is Azure, that's not AWS. There are a lot of different ways from a cloud connector standpoint, whether it's an actual native resource or, again, if it's opened up via REST or SOAP APIs, then absolutely, you can connect that way as well.

Angela: Excellent. A couple of other ones here. Can you add formatting to the current timestamp, such as year month day or hour minute second?

Dan: Yeah, awesome question, and yes you can. I kept mine very, very simplistic – I just did the two parentheses. So if you look – I'll just open this up real quick and go to that copy task – if you look at the actual function wizard and highlight the current timestamp, you'll see that if you just do the two empty parentheses like I did, you get that whole default format. Within the documentation, in this case it would be yyyy-MM-dd. The documentation was something I did not cover – it's this little help guide here, and I don't want to dive too much into it, but it's awesome, very comprehensive. Let's just type in timestamp here and hopefully I can get to what you're looking at. This shows you the different format characters you can leverage and what they actually mean. So yes, you can do whatever format you want; you just need to make sure you're using the right M's, y's, whatever, and this page shows you which is which.
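
For anyone following along in plain Python rather than the product's expression wizard, the equivalent formatting uses strftime codes instead of the yyyy-MM-dd style tokens:

    from datetime import datetime

    now = datetime.now()
    print(now.strftime("%Y-%m-%d"))    # e.g. 2020-01-15 (year-month-day)
    print(now.strftime("%H:%M:%S"))    # e.g. 10:35:07 (hour:minute:second)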

Angela: Excellent. It looks like we have time for a couple more here. Can you explain the print function inside the loop?

Dan: Oh yeah, good call – I didn't even talk about that, my apologies. For the most part, there are two reasons why you'd ever do a print statement. Where did I have that? Here we go. A lot of times, what people do with the print statement is use it to see what a variable currently stands at – it's mostly used for troubleshooting purposes, I would think. In this case, in my for-each loop, I was just printing out what the CONCAT function – with the current file's name without extension and the current date – actually looked like. We can look at the job log from when that ran, which I apologize, I didn't show you. Was it the monitor? The timestamp one – it was this one.

We should see that print statement – yeah, here it is. Each time it went through those three files, it printed copy me with the current timestamp, then copy me two, copy me three. So yeah, the print statement will do that, and it's really, really helpful from a troubleshooting standpoint – it just shows you what that value is. You can obviously put in string values, your own stuff, if you want. I think that's the main use case. The second one, very quickly: print statements can be used as a way to write out flat files. You wouldn't know this because I didn't tell you, but here in your data translation tasks there's a read flat file task. Every other format has both a read and a write; for flat files there is no write task, but you can just do this right here with the print statement and direct it to an output file. You can basically do a write flat file right there. Good question.

Angela: Excellent. Can you explain – and I hope I'm going to say this one right – can you explain under which circumstances to use the for-loop inside an FTP project?

Dan: Oh, that's a solid question, and one I don't know if I know the best answer to. The for-each loop – and I know that's not what you're asking – is probably 98% of the usage in my experience with the product. The while and do-while I see randomly. The for loop – that's a really good question, and you caught me off guard there. Again, this is showing my non-programming experience, but there are probably good use cases; I honestly can't think of one off the top of my head. I'd encourage you to do what I'm doing here – well, you probably don't have the product down yet, but this is a good time to look. Starting here, it says executing a task a specific number of times, which sounds good, and iterating over the columns in a row set variable. This is another good, shiny moment, not only for me, who's been in the product for a while, but for anybody just getting in: the help guide is awesome. In this case it shows you examples. So, good question – hopefully that answered it halfway decently.

Angela: Looks like it, perfect. All right, and then one more here: are the HIPAA EDI transaction types available, and are they included within the product?

Dan: Okay, EDI is fairly new, so let me go here and show you what's out there. So the transaction sets for – oh, I've got to think here. That's another solid question. I will definitely disclose that I'm not a HIPAA EDI-specific expert by any means. We have had folks that will use – oh my goodness, I wish whoever asked that question would tell me what transaction set it was, because I can't think of it off the top of my head. I'm more familiar with the shipping and purchase order sets: 850s, 810s, 856s, 997s. I cannot think of what it is – maybe 853? Is that right?

Angela: They just posted it in the question.

Dan: 836?

Angela: 834, 837.

Dan: 834, I was so close. 834. Yeah, so they are out there. One thing to note, though, for whoever asked that question: we should have the transaction set, but the version can potentially be non-existent for us. We've had a couple of customers with versions really specific to certain organizations that aren't, I guess, standard per se. If it's something we don't have, let us know – we'll work with our partner to go out and purchase it, or do what we can to get it out there. So, good question.

Angela: Excellent. I think that about wraps it up. There are a couple of others – I'll go and send some emails to you with the information you've asked for – and any additional questions you have following the event, please feel free to reach out to us directly. The slides will be sent to you momentarily, and thank you all for joining us. Really appreciate your time.

Dan: Yeah, thanks guys.

Ready to See GoAnywhere in Action?

Schedule a live demo. Choose from our 15-, 30-, or 60-minute options to pick the level of detail that works best for you!

Schedule My Demo