Boost Your Business with Advanced Workflows

Thank you for viewing this on-demand webinar. If you have questions following the webinar, please contact us. You can also download the presentation slides here.


About the Webinar

Don't let time or money slip away! This year, resolve to save your business both money and time with the Advanced Workflows boost GoAnywhere MFT has available.

This webinar focuses on an array of ways to use Advanced Workflows and how they can help to boost your business for 2020 and beyond. Some of the uses highlighted are:

  • Creating projects with no programming skills
  • Translating data into different formats
  • Controlling data security with triggers
  • Automating end-to-end file transfers to eliminate human error

If you're transferring sensitive files daily but want new ways to get the most out of GoAnywhere, this is the webinar for you. See what Advanced Workflows can do for you and your business — boosting your progress all while saving you time and money.

Transcript

Brooke: Hello, everyone. I hope your day is going well. Thanks for carving out some time to spend with us as we dive into advanced workflows. If you're not already using advanced workflows as part of your GoAnywhere Platform, we're excited to share how it can make your file transfer needs so much easier. And we hope you find the presentation helpful. I'm here with today's presenter, Dan Freeman. Dan, you there? Oops. Speak up again.

Dan: Yep, I am. Can you hear me?

Brooke: Yep, we can hear you. Thanks, Dan. Great.

Dan: You bet.

Brooke: Before we kick things off, I will remind everyone real quick that the event is scheduled for an hour. If you need to leave or drop off at any point, no worries. We are recording and we'll send the link afterwards so you'll have it. Feel free to send over any questions through the chat window throughout the presentation, and we have some team members online who can try to answer them. And then, if there's time at the end, we'll answer a few verbally as well.

And then we'll also have a survey sent out at the end of the presentation, and if you fill that out, it'll give us some good feedback on what was helpful today and what other questions you might have. So if we go to the next slide, we'll run through our agenda quick. It's pretty simple today, so talking about how to boost your automation with Advanced Workflows as part of GoAnywhere. Dan is going to fire up a live demo and show you all of this in action. And then, as I said, we'll take some questions at the end.

So I'll introduce you to our presenter. You've already heard him really briefly. You'll recognize Dan if you've been on any of our webinars in the past. He works pretty closely with customers and potential customers in his senior solutions consultant role, and he's spent over 10 years in various security roles. In short, when it comes to our GoAnywhere solution and how to leverage it to the max, he knows what he's talking about. So, Dan, thanks for taking the time to do this. I'll turn it over to you now.

Dan: Awesome. I appreciate it. Can you still hear me, to make sure we're still good there?

Brooke: Yes, yes. You sound great.

Dan: Okay, cool. Awesome. Thanks, guys. As always, thanks for taking the time out to sit with us in this conversation today. I think most of the folks that are going to be on today have GoAnywhere already; whether or not you're using Workflows is a moot point for this. But we'll show you a couple of different things that we can do as far as leveraging a little bit more of the GoAnywhere product, and hopefully have you guys expand and explore other areas where it can do more of that file movement and manipulation from an automated standpoint.

Speaking of exploring unknown territories, it makes me think of Star Trek, of course, being a Trekkie. And I thought it was interesting to find out that Spock, he's a very interesting guy in himself, but he has three ears if you didn't know that. It's crazy: the left ear, the right ear, and the final frontier. Yeah, I know. It's crazy. I thought Spock was interesting enough, but finding out that he has three ears was quite revolutionary.

Okay. Alright, let's jump into what we came here for, not dad jokes. So we're going to go through, I'll probably keep the slides pretty quick. And then we'll jump into the live demo to give you some ideas of, I think, some basics on how Advanced Workflows work to go through, I think, a lot of the common examples that we see to get you at least a little taste of how we can leverage the Advanced Workflows module within GoAnywhere.

A couple of things to note, I think, on this slide. One of my favorites: not having a programming background. I came from an infrastructure, sysadmin, security side of things before starting here, so this slide was neat to see, and I can definitely attest to this. Creating these projects, what we call projects, within the Advanced Workflows module of GoAnywhere does not require programming skills, which is great. If you've ever had to deal with scripting, doing some sort of business processes by using C#, or Windows batch files, or Python or whatever, that can be, I think, a little difficult, a lot of Googling, if nothing else, to get things to work.

This is one of the areas of GoAnywhere that I always tout or talk about to our potential customers as one of the things that separates us from our competitors: the intuitiveness of how you can actually build these, what we call projects, or the business functions to do whatever it is that you're trying to do from an automation standpoint. Now, having said that, with programming skills not being a requirement, don't get me wrong, some of the logic that goes behind these things, like the if statements that you see on the screen there or for-each loops, things like that, will come a little more easily if you're a programmer. But the way that this is built out, and we'll go through some examples, is task-oriented and GUI-based like you see here, which makes it to where folks like myself, and potentially folks like you guys on the phone, can definitely do these things. It's nothing terribly intimidating.

We can do things like translate data into different formats, which is very, very common and very, very useful. So as the bullet point there talks about, there are 19 different tasks for data translation, and we'll see those when we jump into the product. But a lot of it, for the most part, is going to be reading and writing of different formats: very commonly, reading maybe an Excel or a CSV file, taking all of the results, and maybe inserting into or updating a backend database. Or, flip side, doing a SQL query that outputs information.

We need to maybe take this employee listing table that you see a picture of there, and we need to query it to pull out all the people with a certain ID, or everybody in Denton, Nebraska, whatever, or maybe the entire set of information, to then write to a CSV or Excel file, or whatever format we want, to then package that up and SFTP it off to whoever we need to send that to. So lots of different things from that standpoint.

And then I do have in there that read/write X12 and EDIFACT, which could, by all means, be its own webinar, so we won't terribly get into the weeds on that. But you can definitely do a lot of the EDI translation, whether you're doing reads to put into a backend ERP, or maybe you're pulling things out of your backend ERP to do a write of some sort of EDI document. Again, that's a little heavy on the details, so that could definitely be its own little animal or webinar for that matter. But the point is, we can definitely do a lot of that reading, and writing, and data translation.

The automate file encryption/decryption, I think this is a pretty straightforward process using Open PGP or the PGP standard to encrypt files, whether we're receiving them and we're decrypting them, or whether we're actually taking someone's public key and encrypting files and sending them out to folks. This is, I think, a very, very common, very popular use case, and it takes a lot of the headache as well as security risk out of the equation on a lot of manual processes that a lot of people do, I think, today, whether they're relying on their users to know where this file is going and which public PGP key to use to actually encrypt the file before they send it. Which, obviously, any time you involve the human in a lot of these steps, you're going to run the risk of having files go to the wrong person, encrypting with the wrong public key so the person can then decrypt it, all those types of things.
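To make that flow concrete, here is a rough Python sketch of the same encrypt-on-the-way-out, decrypt-on-the-way-in logic using the python-gnupg library. GoAnywhere does this with built-in PGP tasks rather than code, so the keyring path, file names, recipients, and passphrase below are purely illustrative.

```python
# Illustrative sketch of an automated Open PGP flow (python-gnupg).
# All paths, recipients, and passphrases are hypothetical.
import gnupg

gpg = gnupg.GPG(gnupghome="/opt/mft/keys")  # assumed keyring location

# Outbound: encrypt with the partner's public key before sending.
with open("payroll.csv", "rb") as f:
    enc = gpg.encrypt_file(f, recipients=["partner@example.com"],
                           output="payroll.csv.pgp")
assert enc.ok, enc.status  # fail loudly instead of sending plaintext

# Inbound: decrypt with our private key when a partner uploads a file.
with open("inbound/orders.csv.pgp", "rb") as f:
    dec = gpg.decrypt_file(f, passphrase="changeit",
                           output="inbound/orders.csv")
assert dec.ok, dec.status
```

Either direction, the point is the same one Dan makes: once key selection lives in the automation instead of in a person's head, the wrong-key and wrong-recipient mistakes largely disappear.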

So it's very common for us to have maybe a folder monitor to look at certain folders to pick up files. And depending upon where we pick those files up, we're going to use the PGP key that's associated to that partner to then push them out the door, and vice versa. Maybe you're requiring all of the folks that are sending you files to PGP encrypt those files before they send them to you, which means, at some point, obviously, you're going to send out your public key to everybody.

And when you get these files, whether it's via SFTP or HTTPS, or however you're getting them, you can call a project to go ahead and use your private key to decrypt these files automatically and then shove them to a certain folder depending upon, obviously, who uploaded that file. Point being, you can get all the human interaction out of the file encryption/decryption process when leveraging the very popular Open PGP standard. We'll show a quick example of that too when we jump in there.

Everything that we're going to do as far as looking at the Workflows and building out these projects, we're going to look at different ways that we can call those or invoke those projects, or procedures, or workflow pieces. One of them is going to be a scheduler, so a common scheduler. Within GoAnywhere, there is a built-in scheduler where you can do those types of things, whether you're automating things like reports.

Maybe you have a CIO or CEO, or your boss, manager, whoever, and they want to know certain things about maybe file activity, user activity, whatever, we can do those types of querying on the backend database to generate the information you need, and then maybe package it into a PDF file and send it to an email address of your CIO or whoever you want to send that to. We can put them on a scheduled basis, daily, maybe like in this picture over here, weekly. We're going to send it Mondays and Fridays every week, things like that so we can have these things in an automated basis.
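As a loose illustration of that kind of scheduled report, here is a short Python sketch that queries a database, packages the results as a CSV, and emails them. The database, table, and addresses are invented; in GoAnywhere this would be a project of query, write, and email tasks sitting on the built-in scheduler, not hand-written code.

```python
# Hypothetical scheduled report: query activity, attach it, email it.
import csv, io, smtplib, sqlite3
from email.message import EmailMessage

conn = sqlite3.connect("goanywhere_stats.db")          # assumed database
rows = conn.execute("SELECT username, files_sent FROM activity").fetchall()

buf = io.StringIO()
csv.writer(buf).writerows([("username", "files_sent"), *rows])

msg = EmailMessage()
msg["Subject"] = "Weekly file activity"
msg["From"], msg["To"] = "mft@example.com", "cio@example.com"
msg.set_content("This week's file activity report is attached.")
msg.add_attachment(buf.getvalue().encode(), maintype="text",
                   subtype="csv", filename="activity.csv")

with smtplib.SMTP("mail.example.com") as smtp:         # assumed mail relay
    smtp.send_message(msg)
```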

Again, we talked about generating reports, delivering reports in an email, text notification, whatever you have access to as far as those protocols are concerned. Again, when we jump in, we'll see some of the different options that you can do within the built-in scheduler beyond just your typical okay, we're going to send this weekly. We can do a couple of different conditional type things, the repeat options that you see on there. Obviously, that picture says never, but there are some options that we'll point at through that schedule as well.

Triggers, we're talking about automation as well as data security, and we'll touch on both when we look at triggers. Triggers are going to come from your web users. So if you guys already have GoAnywhere and you're not using Advanced Workflows, you're most likely using the service side of things, the server side. If you're using SFTP, HTTPS, any of those listeners, then you know what web users are. These are those folks that you're creating to log into GoAnywhere and leverage whatever service it is that you are offering.

Well, with those web users, whenever they're doing any type of action, whether their account gets disabled, we have a download failed, we've got an upload successful, which is pretty common, that's probably the one we're going to demonstrate the most. But triggers are going to be based off of web user activity, period, and only web user activity. Again, we'll look at some of the different activity to trigger things off of. We'll probably focus more on the upload successful trigger to show how, when files do land or certain actions of web users do happen, we can definitely do some sort of action.

One of the things from a data security standpoint with the triggers, we'll specifically look at the before secure mail send trigger to leverage an ICAP resource to do some DLP or some content scanning of the files that are being sent out, or potentially trying to be sent out. As well, on the upload successful, we'll have a couple of examples of folks logging in and trying to upload files. We'll use the HTTPS protocol for, I think, visual sake; it's a lot easier to see certain actions that way, whether we're redacting information or whether we're flat out just blocking it because of certain criteria. But those are other ways, from a data security standpoint, that we can leverage triggers and certain web user activities. So we'll cover a lot of those things when we jump into, again, the live demo portion.

This one, the monitors. So we covered schedulers, and we just looked at triggers as different ways that we can call projects. Monitors are another one. Monitors, as the name suggests, are where we're going to be monitoring the file system for the most part. As you see in this screenshot there in the middle, the monitor location can be different things: network shares, anything that you're going to give the GoAnywhere application access to as far as your network shares going into your own network, maybe an Amazon S3 bucket or Azure Blob storage, which are considered local network shares, or any FTP resources that you have defined, FTP, FTPS, or SFTP.

The point being, we're going to be looking at a certain folder, and we're going to be looking for certain event types or activity. So as you see there, file exists, but it could be created or modified, explicitly created or explicitly modified, deleted, a lot of different things as far as the activity or event type that we're going to look for, and then what we're actually looking for. In this example in the screenshot, we just have an asterisk meaning anything, so we're grabbing anything in that outbound share, any file that exists. So again, we'll cover more of this.

The main point of monitors, though, is any time we get a hit on anything that fits our criteria. So if we're polling a certain folder every hour, and the first hour there's nothing, and the second hour there are five files, we're going to grab those files. Whether it's one, whether it's five, whether it's 100, we don't care. We're going to package it up into a file list, and then we're going to call a project and pass that file list as a parameter into the project. So again, another way that we can kick off a project just because of certain activity, from a monitor standpoint.
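Conceptually, a folder monitor boils down to something like the following Python sketch: poll on an interval, batch whatever matches into one file list, and hand that list to the project in a single call. The paths and interval here are made up for illustration.

```python
# Bare-bones picture of what a folder monitor does conceptually.
import time
from pathlib import Path

MONITOR_DIR = Path("/shares/outbound")   # hypothetical monitored folder

def run_project(file_list):
    # Stand-in for invoking a project with the list as its parameter.
    print(f"processing {len(file_list)} file(s): {[f.name for f in file_list]}")

while True:
    batch = [p for p in MONITOR_DIR.iterdir() if p.is_file()]
    if batch:                  # one file or a hundred: one call, one list
        run_project(batch)
    time.sleep(60 * 60)        # poll hourly; Dan later demos 15 seconds
```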

Okay, that's enough of the slides. Let's jump out of there, and let's jump into the product. So first things first: when I'm going through the live demo portion, I do have a couple of instances. One is a local machine, and one is actually our portal demo box. I'll try my best to tell you when I'm switching, just so you guys can follow along. But for the most part, through this demo, I'm going to build off of a simple project and try to add certain pieces as we see fit to explain a little bit more of some of the common features that go with Workflows. I'll try to get as detailed as I can, and I'll try to cover the most common examples to at least give you an idea of how I think some people use it, or maybe start some ideas on how you guys can use Advanced Workflows in your environment.

With that being said, there are a couple of things that I do want to cover before we actually jump into the projects themselves, most importantly our resources. Resources are going to be the different servers and services that we are putting connection information into so that we can actually leverage them, maybe an S3 bucket, or Blob storage, or a database, things like that.

In our example, we are going to leverage an Amazon S3 bucket. This is going to be probably a destination location for a lot of the files that we're going to move. Maybe it's an SFTP location. Maybe it's just the HTTPS virtual folder. We'll go through those things as we see fit. But this is where we're just going to put in some connection information. No matter what the resource is, after you put in the information that you have, you always have a test button as your sanity check to make sure that what you did put in actually works. Then let's hope this actually works. So this is good: we have resource test successful. We can actually connect up to this Linoma encrypted labeled S3 bucket, which we'll flip back and forth to. That's going to be here. I'll make sure we've got this up to date.

We'll leverage this location, and we'll see where these come into play when we actually use them in our project. Other common ones we'll go through are database servers. Those are going to be very, very popular, whether we're querying a database to pull information out to write to a file, or vice versa; it doesn't matter which way we do that, we have to create the resource first. So we'll create the database connection. In my case, I'm connecting to an EDI850 database on my local MySQL instance. Again, looking at the test button. Obviously, working is a good thing.

That's where we're going to flip back and forth here, and I'll keep you in the loop when I'm doing that. We're going to leverage this database table, so we'll go to that EDI850. We'll come back to that, but we had to make that connection first, and we just called it EDI850. We are also going to leverage an ICAP server. This one does work. I am actually working remote, and this one actually has live policies on it.

Not to get too much in the weeds, this is one instance where I am going to jump back and forth because I have to be on this box for this to work. In any case, same thing, you're going to put in connection information. You're going to test it, and if we have connectivity, then great. Now we can leverage that ICAP server. I'm going to switch back to my local instance here.

Network shares are just going to be places where you define shares throughout your environment. It's very, very common, probably one of the most commonly used resource types out there. Same thing: how you're connecting to it, whether you're going to use a certain account, and so on. We just need to do the test and make sure that it actually works before we can leverage it in the project. And that one looks good.

And last but not least, we'll look at our SSH servers. We're going to do some SFTP. I think that's a very, very popular protocol, and it's very popular to do SFTP pushes, or puts and gets, things like that. So we'll definitely be leveraging an SFTP server.

Before I jump into the projects, one thing real quick. I want to leverage all those variables, which may not make sense to you yet. But just so you know what I'm doing here, this SFTP server is now in my AWS environment. Let's make sure it works, first and foremost. That's a good thing. And we're good there. So what I'm going to do here is actually copy this one just to give it a different name. So I'm going to copy this, and we're going to call this ... Let's call it webinar, if I can spell webinar, SFTP. Okay, it's going to the exact same place. I'm just giving it a different name because I want to leverage this here in just a second. Now we've got an SFTP resource called Webinar SFTP, so keep that in the back of your brain.

Alright, let's go to the project section within Workflows. The first thing you're going to notice, on the left-hand side you have that Project Explorer window. And just like anything else that you've done, whether it's in Windows or on IBM i with the IFS, you're going to create this folder structure yourself, from an organizational standpoint as well as from a security standpoint. No different here. You're going to create folders and organize that stuff however you want. We're going to go down to the webinar folder, which makes sense.

I can do things like permissions if I want to, allow certain admins to do certain things within that folder and the projects actually inside of that folder. Or, as we're going to see here, we can also define variables, which it looks like I already did here. We'll X that out here. You can add variables at the folder level. So again, we'll call this webinar SFTP server as the variable name. For the actual value, this is where we're going to leverage that Webinar SFTP resource. It needs to be exactly what I called that resource.

So, in this case, now I can leverage this SFTP server as a variable that's going to be available in any project that I create underneath this folder. There can be other things; in this case, it looks like there's one defined up here at the root level, something called help desk group, and it's actually pointing to my individual email address. It would probably make more sense if it was pointing to a help desk distribution list, but the point is, you can define variables out here at the folder level. And usually, it's going to be things, or resources, or distribution lists in this case, that are used very commonly, or that you expect you're probably going to be using a lot. And you want them defined as variables, so these variables will show up in your project automatically, and I can just drag and drop. I'll show you how that works as well. Just before we do the create project, you'll know where those folder variables are coming from.

Okay, so let's do a create project. And again, we're going to start simple and build on that, and hopefully touch a few areas that will spark some interest. But let's just start with an SFTP put; that's what we're going to do as far as our project is concerned. So we have a name. We'll hit save. This will get us into our project designer window. And very quickly, what we're looking at here is we've got four different, we'll call them sections or windows. The first one, the component library, is where all of your action items are going to be. Ours is going to be pretty simple, an SFTP put command.

But we're also going to explore things like maybe some create file lists, some copy, delete. We're going to maybe look at some data translations, maybe doing a SQL query. We'll also look at maybe handling things from a job control standpoint, or looping through some lists. The point being, the component library is where all of these action items are going to be, and you take those and drag them over into your project outline. The project outline is just that: it's a graphical depiction of everything that we're pulling over to build out whatever function we're trying to build, kind of like scripting, just in the graphical sense.

This window over here is going to be the attribute or definition window. So when we do pull certain tasks from the component library and/or project outline, we need to define a few things about them; in this case, for the SFTP server task, which one am I connecting up to? Things like that. And then this window all the way over here, our variables window, is always going to be there available to us. The system variables are going to be there every single time, as well as any folder variables, which show up depending upon where I created this project. So before we got here, we created that webinar SFTP server variable, so this could be something we just drag right into here, because maybe this is something we're going to use a lot. So we could just drag it in as the parameter for the SFTP server itself.

Maybe the help desk group is going to fill in the To field for a send email notification; we'll get there in a second. The point being, that's where all of your variables are going to be held. There are four different types of variables that can be involved in every project. You see two here: system variables and folder variables. There are also going to be output variables and project, or user-defined, variables, which we'll go through as we build this project.

Alright, one thing also to note, and again, we're going to keep things pretty simple and just walk through a few things. The first thing we're going to decide is, let's go ahead and select the SFTP server we're going to connect up to. And let's just say, for example, in this put file, we're going to go through different examples of the sources as well as the destination. But to start, just look at all three of these. The source file could be a variable. It could be a single, explicit file: maybe you go out here, and you choose an actual file and select that, and that's your source.

I don't think that one is used very often, explicitly choosing a single file to do an SFTP put, but it's one option. Source file as a variable can be a parameter passed into the project at run time. We'll explore that in a little bit, specifically with monitors. But first, let's do the source file set.

Let's go to this one here, our monitor directory. We'll go to this one a few times today. So we're going to say okay, my base directory, or how I'm going to build off this file set, is going to be pointing at this monitor directory. I'm going to include just files. You can do directories or both; I'm just going to do files. And by the way, there are all kinds of defaults, so you don't have to define these. To find out what the defaults are, you can just hover over here, left click, and it'll show you. One part is going to describe what this actually means or what it's looking for, and then if there is a default, it's going to tell you what it is. I'm not going to do anything with the recursion, sort, or sort order settings.

If we wanted to do filters, that's great, but I'm just going to keep it simple. I'm going to do the include, and I'm just going to do the base. So this file set is going to look in this monitor directory, and it's going to build a file list of all the files that fit this criteria, which is basically grabbing everything in this folder. The destination, since we are grabbing a file set and not a single file, we're going to have to choose an actual destination directory. So I'm going to hit the ellipsis. What I'm going to browse to is the folders that I have access to because of the SFTP server I defined. And again, I'm going to use that encrypted S3 bucket as my destination, so we can swap back and forth to see how this stuff works.

And then things like transfer options, preserving the timestamp if you want to; I'm not, for this. Then it's simple: file name prefix and suffix. So maybe you want to do a prefix, like prefixtest. You want to put that at the beginning of all of these files that go across. Great. We'll look at some more options, but let's keep it simple. This is all we want to do, an SFTP put. We're going to take these files and go ahead and push them out to that encrypted S3 bucket. So let's hit execute.

Now, think of this as something that you could put on a schedule, so we could kick this project off once an hour if that's what you wanted to do. But let's go ahead and just look at what happened there. Let's refresh this, and we see that this took off, and it grabbed those two files, because remember, we're monitoring this monitor folder, which had just these two files, the filetransfersarecool and webinarsarecool files. Those got pushed up here. We did define a prefix, and we called it prefixtest, so that's why those are on there. Prefix test, we see those. So let's go ahead and delete those out of there. So that's just one, I guess, simple way to do a simple SFTP put: monitoring a certain folder, grabbing the files that we want, and doing that SFTP put in this case. I'm just going to get rid of these here.
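For readers who think in code, here is roughly the same put-a-file-set operation sketched with Python's paramiko library. The host, credentials, and directories are placeholders, not the demo environment; in GoAnywhere all of this is the single SFTP Put task.

```python
# Hypothetical equivalent of the SFTP Put task with a file name prefix.
from pathlib import Path
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("sftp.example.com", username="webinar", password="...")
sftp = ssh.open_sftp()

for path in Path("/shares/monitor").iterdir():   # the "file set"
    if path.is_file():
        # the prefix lands at the front of the destination file name
        sftp.put(str(path), f"/upload/prefixtest_{path.name}")

sftp.close()
ssh.close()
```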

Let's say we looked at that idea, and we went ahead and put those files. But let's say with the monitor, we don't want to do the same things over and over again. So maybe after we do a successful SFTP put, we want to copy those original files that we just sent. And so instead of just doing the SFTP put, we can have an output variable here. We can take all the files that we just did that SFTP put on, and we could define things like your destination files variable, which is going to be everything we did that SFTP put on, as they're landing in the destination location.

So we define destination files as a variable. You could do number of files; I'm going to skip that for now. You could do processed source files, so the files as they were originally grabbed. We'll call it source files. The point being, we'll see that those are now going to be two output variables. So there's another type of variable that we can leverage; in this case, output variables. So this time, we can see okay, if we're going to do this SFTP put, we're grabbing those two files in our case. We're going to push them up to the S3 bucket. And when they land there, they're going to be a little bit changed: we're putting that prefixtest in the name of them. So the destination files variable is going to have those two files with that name. The source files variable is going to be the original sources.

So in my case, when I do this, I want to copy the original, so I want to take those source files, and that's going to be my source files variable. And I want to push those to, in our case, I'm just going to put them in the archive directory. And then maybe after that's done we want to delete the original files, so I don't need those again. We take those source files, and we're just going to delete those.

So in this example, we should see pretty much the same activity. But we're going to see a copy of the original files that are going to land in this archive directory. These should be gone if they SFTP put correctly, and they copy successfully. They should be deleted out of the original location, and our S3 bucket should have those prefixed files here. So let's go ahead and shoot that one off. Hopefully, we did things right. In the end, we should see our two files up here with the file name prefixes on there. Okay? And then if we go back to our original location, the monitor, we should see the original files that are gone because we deleted them. And we should see a copy of the original ones, not the prefix ones for the destination file. We should see those copied there.

So again, it's pretty common. Whether or not it's done interactively like I'm doing, probably not; it's usually going to be put on a schedule, things like that. So the next time this runs, you don't want to grab and process the same files again. That's the idea with that here. So let's go ahead and throw those back here. And let's get rid of these guys here. Let's delete those.

So we saw how we can do some renaming on the fly, but the way that we did it isn't really common, just throwing some value in that prefix. I would say what I see a lot of the time is people want to put in the current timestamp when you actually grab those files and shove them up there. One thing I see people do is they say, "Okay, cool. I'll just use the file name suffix and put the current timestamp variable here." But the problem with that is it literally appends it to the very end of the file name. So if your file is webinarsarecool.txt, it'll be webinarsarecool.txt and then the current timestamp.
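That pitfall is easy to see in a couple of lines of Python; the timestamp pattern here is assumed to match the one Dan builds later in the demo.

```python
# The naive suffix: the timestamp lands after the extension.
from datetime import datetime

name = "webinarsarecool.txt"
stamp = datetime.now().strftime("%Y%m%d_%H_%M_%S")   # assumed pattern
print(name + "_" + stamp)
# -> webinarsarecool.txt_20200117_10_36_24   (the extension is now buried)
```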

So let's get rid of those file name prefixes. Maybe you do want to go through and rename every single file that we're grabbing and throwing up to our S3 bucket. The way to do that is, instead of the SFTP put source being defined here in the actual file set under the put command, maybe we want to create that file list beforehand. So we can do a create file list, and we'll call this variable file list. Yeah, we'll skip the number of files for that. So now the file list, or file set, is going to be pointed at the exact same location; we're going to point at that same monitor location. And again, filters, I'm not going to define them because I just want to grab everything.

And instead of the SFTP put going through that and renaming those the way that we did, we can now do a for-each loop, because now we've got a file list. It could be one file, but it could be 100; we don't know. So the way to go through that is we can take these for-each loops, and we can iterate over every file that is in there. Again, we don't know how many. All we know is we're going to take the file list variable that was created here, and that's what we're going to iterate over. We can drag that over here. And then the current item variable, so the current file that's being evaluated, it's up to you what you want to call this. We'll just call it curFile.

So now, essentially, we can take that SFTP task, and let's just throw it right in that loop. So now, for every file, we're going to do an SFTP log in, and then we're going to do a put. But now our source has changed. We don't want this file set anymore because we've already defined it out here. So we want our source file to be an actual variable being passed in here, and it's going to be the current iteration of whatever that file list is. I can take that current file, because I know it is going to be a single file, and I can land it in this source file location. It is the current iteration of that file list, so it's going to be a single file. And instead of destination directory, I'm going to ...

Let's cut that. And what I'm going to do is just put this right up here. And then for the name of this current file, so the first one is going to be ... Oh, what did we have here? Oh, filetransfersarecooltoo.txt. I want it to be filetransfersarecooltoo, then the actual current timestamp, and then .txt. So we can leverage this here, which is going to get us into our expression wizard, or our functions. One of the ones that I'll use, again, trying to keep it simple here, is our concatenation function. So I'm just going to concatenate a bunch of different fields, we'll put it that way, or text values. Some are going to be variables, some are going to be literal strings. So we'll see how that looks.

Let's go ahead and hit the concat function. One thing that's cool about the expression wizard that you see here: when you do highlight the individual functions, you're going to see different descriptions, and usually an example of how it's used. So it's very, very helpful in that sense. So let's just hit done here to throw this out there. One thing also to note is the dollar sign and curly brace notation is how you've got to invoke these variables.

So let's do concat, and let's do the current file, so the curFile variable that's going to be passed in. I want to take that name, and I'll show where I'm getting this name, but I want it without the extension. Where this is coming from: the curFile is obviously the variable that's getting currently evaluated, and this colon and this nameWithoutExtension is actually an attribute. Now, to see where that's coming from, let's quickly look at the little help guide here, and let's go to file attributes, if I can find it. You can see in our help guide how we can leverage certain attributes of the files themselves. I'm specifically going to use this one here, nameWithoutExtension, because that's going to give me the file name without the extension, as the name suggests.

That's going to be my first part of the concatenation. And then I'm going to comma, and I'm actually going to put maybe just an underscore to separate that and the current timestamp. And then we can go back in ... I think that might be a double quote, which I don't want. There we go. And now I can go back in and leverage my function expression wizard again because I'm going to grab the current timestamp and throw that in there. And it got thrown out here, but that's okay.

And for the pattern, again, this is something where you can use the help guide to show you what formats are acceptable. But I'm going to do the year, month, day, underscore, hour, underscore, minute, underscore, second. Again, this is in the documentation, so you can go look, and you can see what all of those capital and lowercase letters mean. So we'll have underscore and current timestamp. And then we'll have a literal dot, and then we're going to use the curFile and then the extension attribute. Close that off, and hopefully, this works the first time, which would be maybe amazing.
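The expression Dan is assembling, nameWithoutExtension, underscore, timestamp, dot, extension, maps neatly onto pathlib in Python. A small sketch, with the file name and format pattern assumed from the demo:

```python
# Equivalent of the concat expression, sketched with pathlib.
from datetime import datetime
from pathlib import Path

cur_file = Path("filetransfersarecooltoo.txt")
stamp = datetime.now().strftime("%Y%m%d_%H_%M_%S")

# .stem is the name without extension; .suffix already includes the dot
new_name = f"{cur_file.stem}_{stamp}{cur_file.suffix}"
print(new_name)   # e.g. filetransfersarecooltoo_20200117_10_36_24.txt
```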

So what we're going to do here, in the for-each loop, is go through each file, do an SFTP put, and put it in that S3 bucket with this new name. Basically, we're going to include the current timestamp. And let's go ahead and fire that off. We might have to do some live troubleshooting. Who knows? I'll be surprised if the syntax is actually ... Okay, that's surprising. Okay.

So let's go out to the S3 management bucket here, and we should have those files with the current timestamp appended in there somewhere. We've got filetransfersarecooltoo, underscore, year, month, day, underscore, then it looks like 10:36, 24 seconds, which is probably right. So there we go. So that's pretty common, appending the current timestamp to the file name, and that's one way that we can do that. We can iterate using a for-each loop over that file set that we threw in there.

Okay, let's go back here, and I think we'll probably have to come back here really quick. File transfer to these archive ... Oh, it looks like I only did the one. And cut that, put this back up here. Okay, so let's look at different ways as far as the source of these files can be.

So we looked at a way that you can define a file set pointed at a certain folder, and we looked at a way that you can actually create the file list first and then iterate over that list to do whatever it is that we want to do, in our case, rename each individual file. But maybe our source is coming from a monitor, so we talked about that as well. So let's go ahead and exit out of here, and let's create a folder monitor.

So let's go to a folder monitor. That project that we just did, no matter how we designed it, we could always put it on a scheduler to kick it off, so pretty straightforward there. For the monitor name, let's do SFTP put monitor; we'll leave it at that for now. We're going to monitor that same location. We're going to look for just file exists. Anything, that's fine with me. We're going to kick it off every 15 seconds, so we don't have to sit here and wait. The point is, when we do get a hit on whatever we're doing, I'm going to call that same project that we were just looking at. And let's put in the user.

And the main point is, like we mentioned before, these folder monitors are going to build a file list and pass that file list as a parameter called files by default, and we'll keep the default. Advanced is going to make sure you're not grabbing files that are currently being accessed, and you can do some built-in notifications. For our example, I'm not going to worry about those for now. Before I activate it though, let's go back to the project, because we're going to have to change the source of what we're doing the SFTP puts from.

So now this create file list, we don't need it anymore, because we're actually getting the file list from the monitor now instead of building it inside the project. We can go ahead and disable it. So we come straight into the for-each loop, and the for-each loop is now going to iterate over not the file list variable, but the actual files variable that's getting passed in from that monitor we defined. And this is going to loop over and do the same stuff per file. That's fine; we can leave all that the same.

But now for the copy task, I'm not copying the source files variable, because that was the output variable from when we were doing that create file list that was here. The source is going to be the actual files variable that's getting passed in from the monitor itself. And then the delete task, and this is especially very, very critical when you're doing monitors, especially with file exists: you definitely want to make sure that you're cleaning up after yourself in that monitored location, so that you're not processing the same files again when it runs 15 seconds later. So we need to make sure that we clean up after ourselves there.
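That clean-up discipline looks something like this in miniature; the upload helper and paths below are invented stand-ins, not real API calls.

```python
# Process-then-clean-up pattern for a polling monitor (illustrative only).
import shutil
from pathlib import Path

def sftp_put(path: Path) -> None:
    print(f"uploading {path.name}")        # stand-in for the real put task

def process(files, archive=Path("/shares/archive")):
    for f in files:
        sftp_put(f)                        # 1. push the file out
        shutil.copy2(f, archive / f.name)  # 2. archive a copy of the original
        f.unlink()                         # 3. delete it from the monitored
                                           #    folder so the next poll skips it
```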

So again, this should basically do the exact same thing, just in a different manner. Let's go ahead and save it. Let's go back to that monitor, and let's look at the SFTP put monitor. And let's go ahead and activate it. Be sure you're looking in monitor, any file, calling the right project, and the files variable. Okay, let's save that. It's going to take a quick snapshot, do some file comparisons, the whole thing, about every 15 seconds. While that's going, we'll go up here, and we should start seeing files come in, or it's going to make me re-authenticate, one of the two. Always good stuff in live webinars. Alright, let's go ahead and re-authenticate.

While that's going on, let's go back over here. And it looks like the delete task we set ... There they are. So there's our files that came in at 10:40. So yeah, those just came in. Same thing: renamed them, looped over them. Now, you'll notice if we go back to the original monitored location, they're gone, because we definitely don't want that monitor to pick them up again. And it put a copy in the archive directory. So if I go back here, we should see this monitor. Oh, I've got to refresh here. It says it got kicked off three times, but it's only run a couple of projects. That's because we moved those files and deleted them after we processed them. So now this monitor, even though it's checking every 15 seconds, is going to keep running, but nothing is happening until I put some files back in there. That's the concept of those monitors. Let's go ahead and deactivate that, and then let's put these back in here to keep on moving.

Okay, in the interest of time, let's go ahead and skip over to ... We can do other things also within that project. I'm going to skip ahead because I've just noticed the time, and I want to make sure we cover a couple of things. But different tasks: maybe you did want to PGP encrypt the files first. Great, you just build that in there. The input file is a variable; do the same thing. If this was coming from the monitor, it would be files. And what would you be outputting? Well, we'd be outputting PGP files. And that's the only thing that's really going to be different. Now, the for-each loop, what am I iterating over? I'm not iterating over the original file list anymore, I'm iterating over the new file list that just got PGP encrypted. So that's about the only change.

Things like that. Or maybe you were doing a SQL query first to actually get the files there. So maybe we're connecting up to that backend EDI database, and we're doing something like a select statement from the customers table. The output is going to be customers. And then we're actually going to take all that data, and we're going to do a write, maybe a write Excel, right after that. And what's the input to that? Well, we're going to take that customers row set that we just got from the SQL query, and that'll be our customers Excel file, maybe. And that could be PGP encrypted, and then SFTPed out the door.
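As a loose Python analogue of that query-then-write chain, here is a sketch using sqlite3 and openpyxl; the database file and table are invented stand-ins for the EDI850 database in the demo.

```python
# Hypothetical SQL-query-to-Excel chain (sqlite3 + openpyxl).
import sqlite3
from openpyxl import Workbook

conn = sqlite3.connect("edi850.db")                # assumed database file
cursor = conn.execute("SELECT * FROM customers")

wb = Workbook()
ws = wb.active
ws.append([col[0] for col in cursor.description])  # header row from columns
for row in cursor:
    ws.append(list(row))

wb.save("customers.xlsx")  # next steps: PGP encrypt, SFTP out the door
```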

So, lots of different things from a commonality standpoint. I just wanted to give you an idea of how you can use the scheduler or a monitor to kick these processes off, and of the things that you can add as far as the projects are concerned to do different types of automation and different tasks, some of the common things.

Well, let's switch gears a little bit, and let's go to the triggers. So let's go to the triggers themselves. We won't save here. A couple of triggers that we're going to look at are the upload successful, which is very, very common, and the before secure mail send. We're going to look at these triggers more from the security side of things: basically scanning files before they get uploaded and dropped into locations, or before you're actually able to send via the secure mail method, so we can do some scanning, at least some automation, on that piece. As well, when we do the upload successful, we'll also see uses that are not just about scanning from a security standpoint. When files do land, what do you want to do with those files? Kick off processes, maybe just an email notification to somebody so that they do something. Lots of different options there.

For this, I am going to switch over to my other instance, so I'll just make a quick switch here. And we are going to simulate a couple of things. Again, I'm going to use the HTTPS web client. You don't have to; it could be SFTP, FTPS, whatever the case may be. I'm just going to use this because I think it's a little bit more pleasing to the eye from a visual standpoint.

For the first example, we'll do a trigger that is going to fire off of a certain user that uploads a file successfully via the HTTPS protocol, like we see here. So let's go ahead and log in as redact; that's the user I want to use. This user has a certain rule tied to it so that if I upload files, there's going to be a trigger. So let's go to the backend trigger. Let's switch to the admin page again, and let's go to the triggers themselves, specifically the upload successful trigger. And I believe we have one here for redact. We'll look at these.

The point of the triggers is, if it'll open up, you're going to select a certain event. In my case, it's going to be a file gets uploaded successfully, and we'll look at HTTPS or SFTP. Then the conditions: okay, cool, if the username is redact, then I want to, in this case, call a project, which is the ICAP project redact trigger, and so forth. We'll look at those items here in just a little bit. But just so you know what this trigger is doing: it's looking for a file that has uploaded successfully via the HTTPS protocol, and when it's user redact, then I'm going to do this: I'm going to call this project.

Let's just see what happens here when we log in. Let's grab some test files that are specifically for redaction. A couple of things to note too: the ICAP, the Clearswift product that we use for the DLP and content filtering, one thing I think is really cool is it can read image content with OCR, your optical character recognition. So this is an actual jpeg file, and as you can see, there's a social security number in here. This is one that we can upload here. I'm going to tell it just where to go. It should upload. I'm going to hit refresh here. We'll take that exact same file, and let's download it. Hopefully, when we open it, we have our redaction. So cool, it actually did what I wanted it to do.

Now, what happened on the backend? Because that was neat, and you saw how it worked. What it did was take that upload successful event, match all of the trigger conditions that I had, and call that project. So now if we go to the projects tab, and I'll save that. If we go to our projects tab, I'm sorry, not projects, but completed jobs. It's still there. We can see that ICAP project, redact demo, that we had here. If we open up the actual job log, you can see it was submitted from the trigger. That was kind of quick. But the point being, it sees that file come in from the certain user that we had, the redact user. It's going to make the call. Here's our ICAP resource that we defined, so it's going to take that file and send it to that ICAP resource. It's going to do its thing, and it's going to give us back a code. And in my project, I'm looking for certain types of codes.

It looks like my ICAP status code of 200 was met, which is no bueno, or not good; 204 is good, 200 is not good. So we see that it got met. We're printing out certain things, and we're going to see that it actually got moved. And then we're doing a denied trigger event, so when these come in, we're actually going to block those. And, in this case, we are doing a redaction. If we look at, I apologize for jumping around on you, the actual ICAP gateway, you can see that came in at 10:47. And it's being allowed, which is confusing; you would think it would be a deny. But it's actually allowing this rule, the rule being redact, because all it's doing is redacting the secure information and letting the file still pass through. So that's what's going on there.

Other scenarios: maybe you want to actually block a file, say a potential virus file. You could do the same type of thing. So if we do block, let's log in with that user account. They have the same kind of scenario; we don't have to look at the backend, I think you guys get the point. But we can take, in this case, a test virus file, cryptolocker.txt, which just has some text in it that's recognized as a virus signature by antivirus software. The point is, I can grab that and drag it over here, and it'll upload. It looks like it's there, but hit refresh, because what this is doing is streaming it through and sending it right to that ICAP device, giving back our codes. Hopefully ... okay, it refreshed, it's gone.

So in this case, it got back a certain status code that said, "Hey, by the way, this is a known virus." Let's do my trigger event and delete that file, so we don't have those files being uploaded to our system. Last but not least, and I'm flying through these, I apologize, I'm just seeing the time here, I want to look at the before secure mail send trigger. The one that I'm going to do is, I don't know if it's PHI or just PII credit card, I can't remember, but this is going to be a good use case for that DLP, or data loss prevention, so it can prevent folks from actually sending out stuff that shouldn't be sent out. I can't talk and do this at the same time. Try that again. Oh, okay.

Another example: maybe for multifactor authentication, you can leverage RADIUS. In my case, I've got a soft token here, so surprise, surprise, we're going to use this. So in this case, let's go ahead and copy that code. Again, this is set up at the web user level; you can leverage multifactor authentication, and I happen to be, for this certain user, leveraging our RSA server. In any case, now we get to what we came for, so let's go ahead and do a secure mail. Let's compose one, and let's just send it out to my Gmail account, because I'll have to do that twice. And whatever here, it doesn't matter.

Let's go back to our test files, and let's do another image file just to show off the OCR capability. So this should get scanned before it actually encrypts the file, which is obviously what a secure mail does, because if we encrypt it, we can't inspect it, or our ICAP server can't inspect it. So it's going to send this content there before the encryption happens. Hopefully, our ICAP server will see this as not good stuff, no bueno, sensitive information, please don't send this. So let's go ahead and grab that and upload it as our attachment. And let's go ahead and try to send this via secure mail. And let's go back to the admin console and see where we're at here.

Secure mail blocked and notified, 10:51; that's the one that I wanted. We'll see. We should get an email notification as well. Yeah, at 10:52, okay, there we go. Your email message was blocked as it contained sensitive information. So that's good. So this one here, if we actually go to the job log, that actually happened. This is where you're going to see a lot of the same stuff. Now, the ICAP response, I apologize, I left that in the job log, so let's just scan quickly over that. Let's get back to where we want to be. Okay, so here is going to be our response body: the ICAP server returns code 200, which is not good. That can sometimes be confusing, because usually a 200 HTTP return code is a good thing. Well, in ICAP, 200 is not good, 204s are good.

In any case, in the project here, we're just going through some if conditionals saying if it's 204, then we say, "Okay, go ahead. Send the mail. Everything is good to go. Nothing to see here." But it was not that, because we got a code of 200, so in this case, we are doing the, "Okay, we're blocking that per policy." We're also going to send a message to, in this case, the user that sent it, which is me. That's why I got the email. And then we're denying the trigger event, so that actual email, specifically the email attachment as well, did not get sent. The email itself doesn't get sent. It doesn't strip off the attachment and send the email itself; it denies the actual trigger event, which in our case is sending the secure mail. So it actually denies that even getting sent out.
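The status code logic the project applies can be summed up in a tiny function. In ICAP (RFC 3507), 204 means no modifications are needed, i.e. the content is clean, while 200 carries a modified or blocked response; the function below is an illustrative reduction of that branching, not GoAnywhere's actual code.

```python
# Minimal picture of the trigger project's ICAP branching (illustrative).
def handle_icap_status(code: int) -> str:
    if code == 204:
        return "allow"   # no modifications needed: content is clean, send it
    if code == 200:
        return "deny"    # policy hit: notify the sender, deny the trigger event
    return "error"       # anything else: treat as a scan failure
```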

So those are other ways that you can leverage triggers. I know this got more specific to the security side of things, looking at blocking, redacting, and stuff like that as far as actions. But hopefully, you see the point that for any web user activity, whether they are uploading files successfully or whatever else, you can call a project and pass all kinds of information about that actual event, whether it's the file itself, the username, the IP address they came from, things like that. You can pass all of those things as parameters into a project to act on. Again, I know we've just scratched the surface of what projects can do, very, very limited, but I think you get the idea of how those things, whether it's schedules, monitors, or triggers, can be used to really power your automation on whatever kind of file manipulation or movement that you need to do.

With that being said, let me kick back here and go back to the presentation. And I'll shoot it back over to Brooke very quickly if I can pull this back up. There we go.

Brooke: Wonderful. Thanks, Dan.

Dan: Sure.

Brooke: So wrapping things up here, we'll take a few questions at the end since we have a few minutes. But before we do that, I'll just go over a few quick housekeeping items. Thanks again for everyone who's joined. If you do have more questions about Advanced Workflows, our email address is on the screen, so feel free to shoot us a note there. Or feel free to contact whatever sales rep you might be working with. We also have an easy way to request a quote online, so that URL is on the screen too.

And then I just want to give a quick plug for our customer community, GoAnywhere Insiders. I have looked at the attendees today, and I think a bunch of you are already in there, which is exciting, but we have over 700 customers in this new community. It's been live for about a year now, and it's just a good way to educate yourself, lots of best practices on the software, ways to share your feedback, and also earn rewards too. So you can join by following that link on the screen and then using the join code GAINSIDERS, and let us know if you have any questions about that.

So we did get a few questions. If you guys need to go and don't want to stick around, again, thanks for joining, and we hope your day is great. But then for the rest of you that are sticking around, feel free to enter a question while we're still here through the questions pane in the control panel. And, Dan, we did have a couple come through, so let's try to get through as many as we can with the few minutes we have left.

Dan: Okay.

Brooke: I'll just read these to you, and then you let me know if you can answer. So first one, can you describe the process of what to do if the project fails when a monitor fires?

Dan: Yeah, so good point. That was one of the things that I didn't quite have time to jump into, the actual error control and handling logic within the projects themselves. But yes, you can do that. By default, what I was doing was just assuming success. You can define an On Error at a module level or at an individual task level. If you don't define it, by default, it just aborts. So if your copy task doesn't work, or whatever task that you're trying to do does not work, it's just going to abort right to the job log and be done.

But you want to have some sort of logic in that project so if something does fail, maybe we want to send the focus to another module that's going to maybe just simply send an email to somebody saying, "Hey, by the way, this didn't work." Or maybe you want to do some retry logic in the case of something failing in the interim. But the point is, inherently, if tasks are not successful, then they just bomb out and fail there. You never run into that issue of maybe deleting something that really didn't process in the first place.

But specifically with monitors, since they do call projects and run them automatically, if the project doesn't succeed, then those files will still be there, because the delete or clean up task didn't happen. So they'll just get caught up the next time it runs, in our case, 15 seconds later. So there are different ways that you can catch that logic, but the point being, you build that into the project itself, which I know I didn't get a chance to dive into, but that logic can be handled at the project level.
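One way to picture that on-error handling is a small retry wrapper that falls back to a notification when all attempts fail. This is an illustrative sketch of the pattern, with invented helper names, not the product's On Error mechanism itself.

```python
# Retry-then-notify error handling pattern (illustrative).
import time

def notify_helpdesk(message: str) -> None:
    print("email to help desk:", message)   # stand-in for a send email task

def run_with_retries(task, attempts=3, delay=30):
    for attempt in range(1, attempts + 1):
        try:
            return task()
        except Exception as exc:
            if attempt == attempts:
                notify_helpdesk(f"task failed after {attempts} tries: {exc}")
                raise                 # surface the failure to the job log
            time.sleep(delay)         # wait out a transient failure, retry
```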

Brooke: Good, good. Another question, can you just reiterate and clear up for everyone if this is a cloud software as a service offering or an on-premise install?

Dan: It can be both right now. It's an application that you own, so whether you put the application on your on-prem environment, whether you want to throw it out in AWS, Azure, Google Cloud Services, that's totally up to you. And Brooke, I don't know, did you ask if it was like software as a service, or just in cloud services?

Brooke: He asked if it's a cloud software as a service offering.

Dan: Okay. Well, as far as where you physically put it, everything I just said is true. As far as the software as a service, I know that's something we are diving into. I would probably have you talk to a sales rep to get a more definitive answer on that, or if Brian wants to chime in. But I know that's something we are getting into. I don't think we're fully there yet as far as a software as a service is concerned.

Brian: Yeah, we're pretty close on that. In another week, we're going to start rolling out a beta version of it to a few select customers. And maybe in March, we'll start making it more available to everybody. Coming soon.

Dan: Cool.

Brooke: Exciting stuff. Good, good. What ICAP server did you use in the demo was another one we had come through.

Dan: We use Clearswift. HelpSystems acquired Clearswift a couple of months ago I want to say, somewhere around there. So the one that I used was Clearswift ICAP. But as long as it's ICAP compatible, it should be good.

Brooke: Good, good. And then one more, I think. Someone just wanted to clarify if everything you demoed today, you get it all within Advanced Workflows, and just how that process works to purchase and add all of this functionality on.

Dan: Yeah, so everything from a functionality standpoint that I showed, resources, everything from the workflow standpoint, schedulers, monitors, triggers, everything comes with Advanced Workflows. Yeah, I'm trying to think. I don't think there's anything that I showed that you would not have access to. Obviously, you'd have to have access to your own resources that you connect up to, like your S3 bucket or ICAP server in that case. But as far as the functionality, everything that we showed, yeah, it's totally available with the Advanced Workflows module.

Brooke: Good, good. So we're at the top of the hour. If you had a question come through and haven't had it answered, we'll stay on for a few minutes and try to get to that. Otherwise, we'll definitely follow up with you after the presentation. And thank you all for joining. Thank you, Dan, for walking us through everything. And we hope you guys all have a really great rest of your day.

Dan: Thanks, guys.


Ready to See GoAnywhere's Advanced Workflows in Action?

Schedule a live demo. Choose from our 15-, 30-, or 60-minute options to pick the level of detail that works best for you!

SCHEDULE MY DEMO