Andrew: Okay, good morning, everyone. We're gonna go ahead and get started, as it's right at 10:00 a.m. Central time for us. Thank you for joining us for the first webinar in our Get the Most Out of GoAnywhere series. Today we will focus on the Advanced Workflows module, and at the end of the presentation I will provide details on upcoming events later this year. A few housekeeping notes before we get started. We are recording this presentation, and we will send it out to everyone who registered later this week, so you can review and replay it as needed. We should take about 30 minutes, give or take, depending on the number of questions we get at the end, because we will have time for some question and answer. Any that we cannot get to, we will make sure someone from the GoAnywhere team follows up with you. Let's take a quick look at the agenda before I introduce Dan, our speaker. Today we're gonna touch on what Advanced Workflows are and the top reasons to use them. We'll look at a typical use case, and then jump into the software to see how we can solve some of the issues our users are seeing. Then we'll leave a little bit of time for any questions you may have. Our speaker today is our Senior Solutions Consultant, Dan Freeman, who has a lot of experience working in the security field for large healthcare providers, and here at Linoma Software and HelpSystems. Dan, are you there?
Dan Freeman: I sure am.
Andrew: Great. I will pass over the rights to you, and you can jump right into the content.
Dan Freeman: All right. Appreciate it, Andrew. Thank you, and thanks to all of you that took the time to join us for this quick session on advanced workflows. Speaking of which, what are Advanced Workflows? What are they exactly? Well, at a high level, these are all the components that automate the movement and manipulation of data. Advanced Workflows allow you to prepare data for sending off to customers, trading partners, or just a final destination on your network or another system. They can also allow you to automatically process, manipulate, and move data when you're on the receiving end. Now, we're talking anything from compressing, decompressing, encrypting, and decrypting, to data file format translations and conversions, to simple file system tasks. Advanced Workflows specify what is to be done with your data to meet your business needs.
Now, automation can be achieved through the built-in enterprise scheduler to kick off projects at specific times, or perhaps by using your existing scheduler to call projects. You can also monitor the file system to check for files created, modified, or deleted, or just whether they exist. These successful conditions will create a list of files that can be passed into a project for further data manipulation or movement. Triggers, based on user actions, can kick off processes, email alerts, file manipulation, or even projects to further automate business processes and data movement.
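(Editor's aside: to make the file monitor idea concrete, here's a minimal standalone Python sketch of the snapshot-and-compare pattern a monitor relies on: take a snapshot of the directory, take another later, and classify files as created, modified, or deleted. This is an illustration only, not GoAnywhere's actual implementation, and all file names are made up.)

```python
import os
import tempfile
import time
from pathlib import Path

def snapshot(directory):
    """Record each file's modification time so changes can be detected later."""
    return {p.name: p.stat().st_mtime for p in Path(directory).iterdir() if p.is_file()}

def detect_changes(before, after):
    """Compare two snapshots and classify files as created, modified, or deleted."""
    created = sorted(set(after) - set(before))
    deleted = sorted(set(before) - set(after))
    modified = sorted(n for n in before.keys() & after.keys() if before[n] != after[n])
    return {"created": created, "modified": modified, "deleted": deleted}

# Simulate one polling interval on a watched folder
watch = Path(tempfile.mkdtemp())
(watch / "a.txt").write_text("one")
(watch / "b.txt").write_text("two")
before = snapshot(watch)

(watch / "c.txt").write_text("three")               # created
os.utime(watch / "a.txt", (time.time() + 5,) * 2)   # modified (bumped mtime)
(watch / "b.txt").unlink()                          # deleted
changes = detect_changes(before, snapshot(watch))
```

In GoAnywhere the successful condition would then hand the resulting file list into a project; here `changes` is just a dictionary you could act on.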
Advanced Workflows can also be called by our free APIs, such as GoAnywhere Command (GACMD). Now basically, Advanced Workflows solve problems: problems of decentralization, security lapses, limited auditing and alerting, lack of auto-resume technology, and probably most of all, manual processes that are inherently prone to human error. All of which, in some way or another, cost us money. Automation via Advanced Workflows can eliminate these problems and yield results where they are needed the most: the bottom line.
For situations where we wanna initiate, or perhaps prepare, data to be moved, let's take a look at this current slide to illustrate how this can be done. We can take the example of a file that already exists, let's say an Excel file. We wanna convert that to a CSV, and we also wanna PGP encrypt that file. GoAnywhere does have a full key management system, and PGP is one of the encryption standards built into the product. So with those PGP keys we can encrypt the file, and then we can go ahead and SFTP it out to our trading partner, all of this automated within a simple project.
Now sometimes, files aren't actually files yet. Maybe we need to grab information out of a database first. This is where we can make use of resources and other servers and services available to us, either on our network or even off premises, like partner FTP servers or even Amazon S3 buckets. The point is, we can extract data, like with a select statement against a database for instance, write it out to various file formats, maybe Excel, CSV, flat file, or fixed width, perform any other necessary tasks, and then send it on its way.
Or maybe, on the flip side, we receive files, either through one of our server listener protocols, or by initiating a GET command ourselves. Here we can walk through an example of an Advanced Workflow in action on the receiving side. Let's say we receive a file from partner A via the SFTP protocol, and we have designed a trigger defined on an upload successful condition for user partner A. When this happens, it will call a project to read in the Excel file, then take that row set, or the results of that read, and enter the information into a customer database.
Or maybe you could be monitoring a certain mailbox for incoming emails with a particular subject, a particular attachment name, or a message body containing certain text. In any case, the Advanced Workflows can parse out that attachment, or any other part of the email in question, for further processing or simple notification.
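(Editor's aside: the mailbox-monitoring idea can be sketched with Python's standard `email` library: filter on the subject line, then pull out any attachment parts. This is a hedged illustration of the pattern, not GoAnywhere's email task; the message contents and subject filter are made up.)

```python
from email import message_from_bytes
from email.message import EmailMessage

def extract_attachments(raw_bytes, subject_filter="Daily Report"):
    """Parse a raw email; if the subject matches, return (filename, payload) pairs."""
    msg = message_from_bytes(raw_bytes)
    if subject_filter not in (msg["Subject"] or ""):
        return []
    found = []
    for part in msg.walk():
        filename = part.get_filename()
        if filename:  # parts carrying a filename are attachments
            found.append((filename, part.get_payload(decode=True)))
    return found

# Build a sample message to stand in for one pulled from the monitored mailbox
sample = EmailMessage()
sample["Subject"] = "Daily Report - June"
sample["From"] = "partner@example.com"
sample["To"] = "inbox@example.com"
sample.set_content("Report attached.")
sample.add_attachment(b"id,amount\n1,100\n", maintype="application",
                      subtype="octet-stream", filename="report.csv")

attachments = extract_attachments(sample.as_bytes())
```

A real monitor would fetch `raw_bytes` over IMAP or POP3 and hand each extracted attachment on for further processing.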
Now at the core of Advanced Workflows are the projects. Here in this slide is a look at how projects are made, or more appropriately, designed. The project designer window is built from four main working areas: the component library, the project outline, the work panel or workspace, and variables. Now the project outline is a visual representation of your workflow. It defines the business processes to perform, with tasks, decisions, loops, and many other integrated and innovative functions, all executed in the order in which they appear. You can drag and drop to move each task item, or right-click to choose from a multitude of sub-functions.
Each project contains one or more modules, and while the number of tasks within a project is virtually limitless, you can build complex workflows using smaller, more manageable modules, each a group of tasks that performs a specific action. Every project is saved as an XML file, so savvy users have the ability to modify projects directly in the XML file if they choose.
The component library has over 100 individual tasks that are organized by function. Just expand the folder, choose your task, and either double-click or drag the action over to your project outline window. The component library is smart enough to place the task only where it was designed to go. Now the work panel, or workspace window, is where you customize or define the attributes for all the selected tasks in your project outline. This is where you can define fields like variables, file locations, and various other advanced features.
And speaking of variables, there can be user-defined, project-defined, or common system variables for use. Variable usage can come from reading file formats and placing the results in a row set variable, from numerous output variables, or maybe from parameters that are passed into a project at run time. Variables allow reuse and great flexibility as placeholders for dynamic values.
Once a project is defined, you can validate to make sure it compiles correctly. Interactively run the project and view the job log for details. There's also a debug option for step by step troubleshooting, should the project not yield the desired results, or maybe even error out completely. The combination of the component library tasks, drag and drop features, the project outline intuitiveness, and step by step debugging options makes project design pretty easy. No more dependence upon developers to write scripts, or come up with complex programs for automation. The GoAnywhere Project Designer Window was made for anyone to develop workflow automation to meet business goals.
Now each project, when run, is considered a job. Each job has a unique ID associated with it. This serves as the job number as well as the ID for the completed job's audit log. Now GoAnywhere has an enterprise job queue manager that can be used for prioritization and resource allocation, as well as for determining single- or multi-threaded job queues. You also have the capability to monitor your active, queued, and completed jobs, and perform further actions if needed. You have the ability to hold, cancel, or even override priority on jobs, depending on their state. Job queues can be defined explicitly within a project, from the scheduler, as well as from the GA Command API. These job queues can help meet service level agreements by providing priority as well as run-time resources to appropriate jobs.
Now each time a job is run, GoAnywhere completes a job audit. The detail of the job log depends on what the log level is set to. Within the control section of your project is where you can find your log level. The default sits in the middle, but it can be as quiet as silent, or as detailed as debug. Now these can be set at the project level, the module level, as well as the individual task level. This can be particularly useful when trying to decipher why a specific task or module keeps failing. You can set the individual task to debug while keeping the project level at info or verbose, to avoid having overinflated log files and to make it easier to search for what is pertinent.
The basic and advanced searching of completed jobs can be very useful when tracking down projects for auditing purposes, or just to recall results. The advanced search is very granular, and can narrow down exactly what you want, helping you find a job you only have minor details about. Now let's look at some of the advantages of GoAnywhere's workflows.
The component library. We have over 100 unique tasks to design automation with. These Advanced Workflows can handle pretty much any process you can come up with. Also very, very popular with our customers is the automated PGP encryption and decryption process. We have a lot of customers that have to deal with regulations, or with multiple partners that need files encrypted or decrypted, and the full-featured PGP manager built into the product makes it easy to leverage within projects to automate that encryption and decryption of secure PGP files.
The reading and writing capabilities. This is a huge benefit: being able to parse out information from a multitude of file formats and do further processing with those results. Auditing is always a key element within GoAnywhere. Depending on the regulations you have to comply with, auditing and accountability are very important aspects of security, but it's also the informative data that helps track as well as troubleshoot jobs. It's also nice on the client side to have transfer initiation, to be able to kick off projects and automate your file transfer and manipulation needs.
Resources. You can definitely save on time and personnel resources by getting rid of these manual processes. We'll also talk about resources in a little bit; those leverage connectivity to other servers and services that we can make available to these project automations. The intuitive workflow project outlines eliminate the need for programming backgrounds. Even guys like me can do some workflow automation. The user interface is definitely friendly, as we'll see in just a second. And I think most of all, let's eliminate the manual processes and the potential for human error.
And with that, let's go ahead and jump into the screen; I'm gonna switch over here. I'm hoping you guys can see my screen. Andrew, let me know if you can. It should be the GoAnywhere MFT admin interface.
Andrew: Yep, we can see it clearly.
Dan Freeman: Okay, great. All right, so let's take a look. One of the first things that we need to talk about, when talking about workflow automation and projects, is resources. Now resources are gonna be GoAnywhere's way to dip into those other servers and services, to leverage things we can automate within the actual project. There are tons of common resources you can see within here. Due to time constraints we'll take a peek at a couple.
Database server seems to be a very, very popular one. As you see, we can add a database server just by providing the driver. We do load all the common drivers. The URLs are sometimes maybe not the most intuitive thing in the world, but we do have a wizard here: you click on that, select your actual driver, put in the host name, and then go ahead and hit the generate button, and it'll generate that URL for you. I'll go ahead and cancel that here. Put in a username and password that's valid for this account, and with every single resource that we do have, we have a test button. It's kind of our sanity check. It does a couple things: one, it checks for network connectivity to that resource, and two, it checks to see whether the username/password, if applicable, is valid. Once we have that resource testing successful, obviously that's a good thing. Now we have that resource available to us in our projects going forward.
One of the other ones, again, there's tons of different resources that we can leverage for our projects, but one we do see quite often is our SSH servers, in particular, our SFTP servers. Here again, we'll put in some information: host, the port it's listening on, username, and password. One thing in particular that I do want to point out is the connection tab. We'll see error handling at the project module as well as individual task level, but we also have auto-retry and auto-retry attempts at the resource level. So for our SFTP, FTP, and FTPS type servers, we can have those connection retries, so that if maybe a router reboots or a server goes offline for a little bit, we're gonna go ahead and retry that connection and, if successful, resume where we left off instead of having to retransmit the entire file. Again, we'll do the test from here. If successful then we're good to go, and now those will be available to us within the actual project. Let me get rid of that window there. There we go.
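(Editor's aside: the auto-retry idea is a standard pattern, and here's a minimal Python sketch of it. The resume-where-you-left-off behavior is GoAnywhere's own; this sketch only shows the retry loop itself, with a made-up `flaky_connect` standing in for an SFTP server that's briefly unreachable.)

```python
import time

def connect_with_retry(connect, attempts=3, delay=0.0):
    """Try connect() up to `attempts` times, waiting `delay` seconds between tries.
    Returns the connection on success; re-raises the last error if all attempts fail."""
    last_error = None
    for attempt in range(1, attempts + 1):
        try:
            return connect()
        except ConnectionError as exc:
            last_error = exc
            if attempt < attempts:
                time.sleep(delay)
    raise last_error

# Simulate a server that is unreachable on the first two tries (e.g. a router reboot)
calls = {"n": 0}
def flaky_connect():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("host unreachable")
    return "connected"

result = connect_with_retry(flaky_connect, attempts=3)
```

With `attempts=3`, the first two failures are swallowed and the third try succeeds; one fewer attempt and the error would propagate to the caller.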
Okay, so let's get over to our projects here. When we first go to workflows and projects, you're gonna see kind of a Windows Explorer window here. You can either set permissions at the top folder level, or do individual folder permissions to keep unauthorized users out of your projects. Creating a project is as easy as clicking on Create Project. Even if you have no idea what to do here, we'll just give it a project name. It's nice that we do have about 40 different templates that you can choose from; over the years we've picked out some of the common themes or common tasks that a lot of our customers like to do. You can select one that fits your needs, go ahead and open that up, and then navigate through the actual tasks and change the workspace, or the attribute windows, to match up to your needs. So this will give you a nice template, or at least a good starting point, to create a project from.
So, let's go ahead and go back here, and let's open up a certain project. We're gonna touch upon a couple projects that we talked about earlier in the presentation, and walk through them here. This one here goes through that first slide that we looked at, from an initiating standpoint, an outgoing connection. So we're going to take a file, which is not a file yet. We're going to dip into a database, pull out some information via a select statement, and write that to an Excel file. We're gonna take that Excel file, PGP encrypt it, and then SFTP it out the door to our trading partner. All the while, if everything's successful, let's go ahead and send out an email confirmation. If there's any problem within this project, let's send out an email as well, letting folks know about the actual problem.
Let's look at a couple things real quick within the project designer window, as we talked about. One, you'll notice the component library; this is where all our hundred or so different tasks are available to us. We can either double-click or drag and drop them right on into the project. A couple other things here: the validate button's gonna make sure that the project actually compiles successfully. The execute button, which we'll use interactively here in just a second; I'll show you how that works. And debug: if you happen to run into some problems, we can go through debug mode, and this one goes step by step, as you notice with the task that's up right now. We'll go ahead and hit next, you'll see the job log filling in as the tasks go through, and it'll tell you either information or any error messages that pop up for your debugging purposes.
And then also, as I mentioned, each project you're creating in the project outline here, this nice graphical interface, also has a back-end XML that's being built. For those savvy users, you can actually edit projects within the XML, but I think a lot of our folks like the GUI that's set up here.
Okay, so let's take a look at this project and go through it. One of the first things we see is that we're creating a couple variables. Again, these are gonna be user-created variables where we can give them initial values, which in this case we do, but this also gives us the leverage to have programs pass in parameters at run time for this particular project. Again, this particular project is grabbing information out of a database, writing it to Excel, encrypting, and then SFTPing it out the door.
One of the first concepts we'll see is that we've got a create job workspace and an associated delete job workspace. These tasks are made for some of those complex jobs that are doing a lot of manipulation or file movement, where you need a temporary workspace or temporary location to do all these steps. The create and delete job workspace are your tasks for that, so we'll do those two tasks here. The first thing we're gonna do is our SQL task. That would be dropping the database folder down and pulling that SQL task right on over. Then there's the database server that we select in this dropdown box. All the databases in this dropdown are gonna be provided by the resources that we talked about when we first logged in. Everything that you define in resources is gonna show up in this dropdown box.
From here we're gonna add a sub-task, a query, which we'll move here, and this is gonna do a select statement against that actual database: from the GA Demo database's employee table, where wages are greater than or equal to the variable that we set initially, called minimum salary. So again, we put in an initial value of 25,000. Now we wouldn't have to; we could make it so a value has to be passed in, and we'll show how we can manipulate this variable at run time.
One of the key concepts in this step, this query, is that everything that comes out of this select statement is going into an output variable called data. It's a very key concept that usually the output from one task becomes the input to another task. You'll notice also that output variables automatically get created over in your variables section. We'll see that we can also implement conditionals. This is here just for demo purposes. It's basically saying: if records found, which is a variable that gets defined, equals true, let's go ahead and do these next four tasks. So if we do find anything from this select statement, and this data variable actually has some information, this will be true and we'll go ahead and do our next steps.
So the first step: let's create an Excel file from that data we pulled out of that SQL query. The input row set variable, basically the input of this task, is going to be the output from the SQL query. Now again, our nomenclature or syntax for variables is dollar sign, open curly brace, variable name, and then close curly brace. But if you happen to forget, again, these automatically get created; we can drag and drop them right on over. Here we're just gonna set an output file of employees.xls. If the file exists, we're gonna overwrite. There are certain defaults: you'll notice if you hover over the actual verbiage here on the left, you can click on it and it'll tell you what the default option is. In this case, it's rename. We chose to overwrite.
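(Editor's aside: that dollar-sign, curly-brace syntax happens to match Python's `string.Template` placeholders, so here's a minimal sketch of how a `${variable}` gets resolved against a set of project variables. The variable names and values are made up for illustration; this is not GoAnywhere's resolver.)

```python
from string import Template

# Hypothetical variables defined for the project (user-defined initial values)
project_vars = {
    "minimumSalary": "25000",
    "outputFile": "employees.xls",
}

def resolve(field_value, variables):
    """Expand ${name} placeholders in a task attribute, GoAnywhere-style."""
    return Template(field_value).substitute(variables)

query = resolve("SELECT * FROM employee WHERE wages >= ${minimumSalary}", project_vars)
```

The same mechanism is why overriding a variable at run time changes every task attribute that references it: each `${name}` is a placeholder, not a baked-in value.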
On this tab, we can dress it up a little bit. Let's give it a sheet name of Employee, and let's go ahead and include the column headings. These are not necessary to fill in, but they make it a little more easily readable. And then again, the output variable: from this task, we're gonna put it into an Excel file output variable, which leads, as you probably guessed, into the next task.
The input file: again, if we forget the syntax, it gets automatically created and we can drag it right on over. The PGP key ring resource, again, is something you'd define over in our resources, the actual PGP key ring. As I mentioned, we do have an OpenPGP key manager within the product, so it's just a matter of importing and exporting your public keys, and importing your customers' public keys for encryption. And the output file, again, we're going to put into a variable, PGP file. And as you probably guessed, we're then going to select an SFTP server, again from the resources list, and do a put command to put that file out there.
If everything's successful, let's go ahead and email out a confirmation. If not, one of the things we can look at, as we talked about, is an on-error handler at the module level, which is what we're doing here. So if anything goes wrong in here, we'll call the problems module, which basically sends an email. You can also set on-error modules at the individual task level. Now this one's calling SFTP failed, which doesn't actually exist. But if we wanted to, we could create a module just by dragging that over, call it SFTP failed, and create any tasks specific to the SFTP resource. Probably an email just saying, "Hey, by the way, the connection to the SFTP server failed." Whatever you want it to do.
So let's go ahead and use the interactive execute here and see what happens. Again, as we mentioned, each individual job that gets run gets an individual or unique ID. This can be viewed right here, from the view job log, if we're doing an interactive execute, but if it's run on a schedule or in batch, you can always go to completed jobs and look at it there. So if we look here real quick, we'll go through a lot of these steps, but the first one's that query. You'll notice it pulled in the initial 25,000 for the minimum salary variable. That pulled out 974 records, added the public key to encrypt the file, uploaded it via SFTP, and then we close SFTP and deliver the actual mail. Which, by the way, means we should have gotten an email saying the message was delivered successfully. Let's go ahead and take a peek here real quick. So, 10:23, yep, we've got that email, delivered successfully. And then we close the task.
One thing to note, now, about those variables: if we actually pass variables in from another program or from GACMD, we can pass in different values for them. Let's go ahead and use execute advanced. That gives us the chance to change those variables. Let's put in a value of 90,000 instead, and go ahead and execute that. We'll view the job log. What we'll see now is that the select statement is going where wages are greater than 90,000, and instead of 900 and some odd records, now we only have 172 that fit that criteria. So this is just an example where you see, sometimes there is no file to send to your partner. Maybe you need to dip into a database, pull that data out, and also set up variables so you can be dynamic in how that information gets pulled out.
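(Editor's aside: the pattern being demoed, a query whose threshold comes from a variable that can be overridden at run time, looks like this as a standalone Python sketch using an in-memory SQLite database. The table, names, and salaries are made up; this stands in for the GA Demo database resource, not GoAnywhere itself.)

```python
import sqlite3

def employees_above(conn, minimum_salary):
    """Run the demo select with a runtime-supplied minimum salary, mirroring
    how a project variable's initial value can be overridden at execution time."""
    cur = conn.execute(
        "SELECT name, wages FROM employee WHERE wages >= ?", (minimum_salary,)
    )
    return cur.fetchall()

# In-memory stand-in for the database resource
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (name TEXT, wages REAL)")
conn.executemany(
    "INSERT INTO employee VALUES (?, ?)",
    [("Ann", 30000), ("Bob", 95000), ("Cal", 24000)],
)

default_run = employees_above(conn, 25000)   # initial variable value
override_run = employees_above(conn, 90000)  # value passed in at run time
```

Same query, different parameter, different row set: that's the whole point of exposing the threshold as a variable instead of hard-coding it.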
Let's look at the flip side here: maybe something that we received. We talked about triggers very briefly. Triggers are basically ways that any kind of user action can kick off a certain project, an email, or a move, delete, rename, or other file activity. For this particular example, we'll look at an SFTP or HTTPS upload. So for this one, I have a trigger set to monitor on upload successful. For any file that is uploaded successfully to my system via the HTTPS or SFTP protocol, and has the event username equal to Dfreeman, I wanna move the file, the actual path, to the work stuff archive and trigger folder. So let's go to the archive and trigger folder, and we don't have anything in there now.
Let me go to the client side. Let me log in real quick to the client side, so this will be the HTTPS protocol. So this will be partner A logging into your system, or just SFTPing something up to your system. We'll just take this next steps file and drag and drop it right on in there. We see that it was successfully uploaded. So there's our successful upload trigger. It was under the Dfreeman user account, so we should see that next steps text file in my trigger folder, which is where it's at. This is just a simple example; again, that trigger, as we look at it here, is just doing a move file. You can definitely do a lot of different things from a trigger. Most commonly, or I guess most specific and detailed, would be to call a project similar to what we just looked at, to go ahead and kick something off for further manipulation. This was just a simple example to show you how triggers work.
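(Editor's aside: the trigger logic demoed here, "on upload successful, if the event username matches, move the file", can be sketched in a few lines of Python. The username, folder names, and event shape are all hypothetical; GoAnywhere evaluates its trigger conditions internally.)

```python
import shutil
import tempfile
from pathlib import Path

def on_upload_successful(event, trigger_folder):
    """Evaluate the trigger condition and, on a match, perform the move-file action."""
    if event["username"] != "dfreeman":
        return None  # condition not met; leave the file where it is
    destination = Path(trigger_folder) / Path(event["path"]).name
    shutil.move(event["path"], destination)
    return destination

# Simulate partner A's upload landing in an inbox folder
inbox = Path(tempfile.mkdtemp())
archive = Path(tempfile.mkdtemp())
uploaded = inbox / "next_steps.txt"
uploaded.write_text("hello")

moved_to = on_upload_successful({"username": "dfreeman", "path": str(uploaded)}, archive)
```

A richer action would call out to a whole project at this point instead of a single file move, which is exactly the escalation the demo describes.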
And one of the last examples I'll show you here: we do have a lot of customers who talk about file retention and archives, as well as those retention policies. Sometimes, you know, it can be quite a tedious task if you have multiple file locations: manually copying the files to the destination and archive folders, deleting out that original staging file directory, and verifying you can proceed with further processing on the destination file. These can get very complex and labor intensive, especially for multiple directories. Now these manual processes are inherently subject to that human error, which can cause non-compliance with retention policies, or in some cases, leave too much information that can be discoverable in case of an audit.
So one of the projects we'll look at here, this one's just kind of a utility, almost. This could actually run stand-alone as a scheduled project on a specific directory, or, as we see in this case, we have a couple variables that we want to pass in or leverage, to be very flexible in our actual retention policy: the number of days old the files can be, and also the directory that it's checking. So some of the main concepts here, again, are gonna be these two variables that we're passing in at run time. We have a retain days variable, which we're giving an initial value of 60. But this retain days value is used by our timestamp module here, just right up here. In our timestamp format, we're looking at the date and subtracting the actual retain days variable. So whatever we pass into this project sets how many days old a file must be, as far as the actual retention policy goes.
The clean directory variable just defines the actual directory that we're targeting, to look for those files that are older than the retain days retention policy. That goes to the create file list task, and here for the base directory, again, we're leveraging that variable, defined either at run time or within another project that calls this project, to go ahead and enforce that retention policy and clean up those directories.
So this one's pretty quick and simple. Looking at the actual setting of the timestamp, it has a current format of year, month, day, just the ISO standard format there. This here's just gonna do a couple print statements and comments, so when we look at the actual log file, it'll pull these out and make things a lot clearer as the variables get defined and the project actually runs. But here, we're gonna leverage an actual loop, and we're gonna go through the remove files variable, the file list produced from the actual directory of files to remove, and then do a delete statement to delete the actual files that fit that criteria.
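(Editor's aside: the retention utility being described, "build a file list for a directory, recursively, and delete everything older than the retain days cutoff", can be sketched like this in Python. The directory layout and file names are made up; this is an illustration of the logic, not GoAnywhere's create-file-list and delete tasks.)

```python
import os
import tempfile
import time
from pathlib import Path

def clean_directory(clean_dir, retain_days, now=None):
    """Recursively delete files older than retain_days; return the removed paths."""
    now = now if now is not None else time.time()
    cutoff = now - retain_days * 86400  # 86400 seconds per day
    removed = []
    for path in Path(clean_dir).rglob("*"):  # recursive, like the create-file-list task
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
            removed.append(path)
    return removed

# Simulate an archive directory with one stale file and one fresh file
archive = Path(tempfile.mkdtemp())
(archive / "old.csv").write_text("stale")
(archive / "new.csv").write_text("fresh")
ninety_days_ago = time.time() - 90 * 86400
os.utime(archive / "old.csv", (ninety_days_ago, ninety_days_ago))

removed = clean_directory(archive, retain_days=60)
```

Passing a different `retain_days` or `clean_dir` at call time mirrors how the project's two run-time variables make one utility serve many directories and policies.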
So let me take a look; here we'll just look at the actual archive directory that we have. This'll be kind of a simulation of archives. One thing within the create file list, as you may or may not have noticed, that I did not point out, is that it does have a recursive function. So not only are you looking at the base directory of the variable that you're passing in, but also any sub-folders within it. So for this project, let's go ahead and go to projects, and run this with execute advanced. We'll pass in a couple of the variables: we'll do the 60 days, and we'll just do localhost and the archive directory. Go ahead and execute this here.
We'll view the job log, and it kind of goes through and lists out the clean directory that we passed in, the directory that we're looking to archive or auto clean up. It gives you the cutoff date for when a file can have been created, so July 8th is the cutoff date: removing files older than 60 days. We'll go through the whole list here, and at the very end, it gives you a list of the three files that were older than 60 days, the ones that were actually removed, and goes ahead and does your auto clean up or archiving of these documents.
Now this is just a really quick, simple way of showing how you can leverage certain tasks within Advanced Workflows to make things automated. To make it so you're not handling your archive or retention policies through manual, human processes. This can make sure that you run these tasks on an automated basis, to keep you in compliance and keep your retention policies in order, and the auditing of everything that gets run is kept within the audit logs of every job.
Now these are just a few, a couple examples. It just scratches the surface of what can be done with Advanced Workflows. I really hope that these few examples I showed today give you an idea of how GoAnywhere Advanced Workflows can work for you, and probably more importantly, save you time and money. If you guys do have questions, please contact your representative, or sign up and use the MFT community forum, which is actually a great resource out there. It's a great place to share ideas and learn a lot more about GoAnywhere, so I highly encourage going out and using that community forum. We do have a couple of personnel on staff who check this community forum and work with the community to answer questions, along with other staff here at Linoma (HelpSystems).
So again, I hope these helped, and thank you for your time.
Andrew: Great, thanks Dan, that was a great overview. I'm happy you walked through it. We did have a few questions; since we're out of time, we will have someone from our team reach out to answer the specific questions. I did wanna quickly mention, we are doing a series of events here, and this is just the first of five coming later this year. Here are the details, but you can access those online using the URL up above, and we will send that out as well in the follow-up email. And I will make sure to get all these questions over to the appropriate person and get those answered for you.
So thank you for taking the time to join us today, and we look forward to seeing you on one of the next ones. Have a great day.