Get the Most Out of GoAnywhere: Azure & Amazon - Cloud Server Installation

Thank you for registering for this on-demand webinar. If you have questions following the webinar, please contact us. You can also download the presentation slides here.

 

Transcript

Andrew: All right, we'll go ahead and get started. Good morning everyone, thank you for joining us this morning for part five of our Get the Most Out of GoAnywhere webinar series. Today's topic is deploying GoAnywhere in the cloud. Before we get started, a few housekeeping notes: we are recording this event, and I'll send out an email with the slides and the recording either later today or later this week. The event should last roughly 30 minutes so we can get you back to your day, and we are doing questions and answers throughout, as well as at the end if we have time.

 

So if there are questions as Dan is presenting, feel free to put those in the Q&A window in the lower right-hand corner and we'll try to get to them as quickly as possible. Lastly, at the very end we will have a quick survey as always; we'll use that to get feedback and understand any other topics you might be interested in. So feel free to let us know those in the notes, as well as how you felt the webinar went.

And then if there's anything else you'd like us to follow up on, especially around cloud deployment, please indicate that and we will have a sales manager reach out to you. Before I hand over the reins to Dan, let's take a quick look at the agenda. Today we're going to talk about cloud deployment advantages, specifically around GoAnywhere, and GoAnywhere cloud features. Dan will talk a little bit about AWS and Azure and give a quick live demonstration focusing on the AWS Marketplace at the end. And as we said, if there's time remaining, we will handle questions and answers too.

So by this point you should all be familiar with Dan. He's been great doing a number of these for us over the last few months. He is a Senior Solutions Consultant here at HelpSystems, and many of you are familiar with him because you spend time with him on the phone when you need help. So with that, I'll turn it over to Dan and he can jump right in. Dan, can you hear me?

Dan Freeman: I sure can, can you hear me?

Andrew: Yep, go ahead.

Dan: All right, sounds good, appreciate it. Well, thanks Andrew, and thanks to all who have taken the time again to listen to, as Andrew mentioned, the fifth webinar in our Get the Most Out of GoAnywhere series. Today, as mentioned, we will be taking a look at some of the benefits of cloud service providers and how GoAnywhere can leverage both infrastructure as a service and some native storage connections via REST and backend API calls.

But first, a very important question to set the stage and get a good visual for our content today, what kind of clothes do clouds wear? Thunder-wear. Yep, I can hear the groans through the muted mics but I feel like that just got us in the right frame of mind now. So let's go ahead and dive in and take a look at some of this cloud provider activity here.

Cloud Deployment Advantages

All right, On Cloud 9. So what's all the fuss about moving to the cloud? Although it may not be the perfect or best solution for every business, there are a lot of reasons and advantages for doing so. First, flexibility: being able to meet ever-changing demands and business needs without wasting FTE resources on the analysis needed to decide what may be required.

The cloud provides the means to turn bandwidth or resources up or down in just a few clicks. And with a little thought put into the configuration of those resources, you can have your applications or infrastructure auto-scale based on thresholds that you define. This flexibility of scale not only meets demand but can avoid costly over-procurement or, potentially even worse, not having enough bandwidth to support potential customers and money coming in the door.

Security is also a concern. This is definitely an interesting topic: lately it has been a positive reason why businesses are moving to the cloud, but it can also be a reason for hesitancy. I'll talk a little more about security in CSPs in a couple of slides from a compliance and regulation standpoint, but one of the biggest problems security officers had in the past was sensitive information being lost because of a stolen or lost laptop that an employee physically owned.

It's not only the cost of the hardware itself but, more importantly, the sensitive information that was stored on the device. Incident reporting in such cases is very costly even if due diligence is proven, and if not, the fines can really break the bank. We can also use the cloud for disaster recovery. Disaster recovery is always a good intention of most every business out there, and for those with deep pockets, having geographically dispersed hot data centers with replicated data and resources can be an option.

But pushing data up to the cloud can instead be a very affordable option for everyone else. Most CSPs have different tiers of data storage depending on service level agreements and how fast information would need to be recovered. Not only is it much cheaper than traditional offsite locations, but your data can be replicated so as to avoid any major geographical disasters.

And not to mention, who likes changing out tape media and shipping it off to Iron Mountain every week, then keeping a log of which rotation is supposed to go out that Friday? I don't know about you, but I personally hated doing that.

Another advantage that goes along with [inaudible 00:05:01] is document control. Just having your data inherently replicated across geographic locations, depending on how you configure it, can make your data available to all employees no matter where they're located, all without huge performance issues. Cloud storage can keep one version of the truth while maintaining versioning for each document.

And how many of us, especially the Exchange or even Lotus Notes admins out there, loathe the idea of email as a document management system? In fact, I believe back in 2010 Microsoft announced they were going to drop support for public folders by Exchange 2016. Apparently everyone went crazy about that idea, and Microsoft has since rescinded that statement because public folders were being used for doc management by so many people.

Point is, email has kind of been the de facto way of doing document management, and not a good one I might add. Cloud document management can facilitate actual, efficient collaboration and better, truer visibility into our business-critical information.

Cloud Computing Trends

Now, this slide is mainly about infrastructure as a service and how we can leverage cloud service providers for their compute resources, especially when it comes to installing GoAnywhere leveraging IaaS, or infrastructure as a service, and the associated storage options. As you can see in this graph, or maybe not so clearly, it might be a little tough to read.

But the top line is your software as a service, the middle, which we're going to concentrate on today, is infrastructure as a service, and the bottom is platform as a service. So as you can see, although infrastructure as a service is still second in terms of billions spent, the trend of leveraging infrastructure in the cloud is gaining popularity fastest, mainly for all the reasons just discussed.

Concerns with Using the Cloud

So we've just painted a near-perfect picture of these cloud service providers, so why wouldn't we move our infrastructure to the cloud? Well, surprisingly enough, security does still seem to be one of the big reasons why most folks are leery of moving all their prized information up to the cloud.

Over the past five years or so we've seen breaches of large cloud-based providers, from Adobe to Microsoft Windows Azure and even Amazon. But these breaches were not because of inherent security inadequacies, but rather configuration issues or social engineering tactics, most notably phishing scams used to compromise credentials and gain a foothold for further scanning.

I know a lot of folks believe that if they control their own devices and equipment, they can protect their information better than someone providing a convenience-type service, feeling that the provider doesn't have any vested interest in that information. But the bottom line is that's almost like thinking your money is more secure under the mattress than in the bank, which I suppose could be debatable.

But seriously, now that security is such a hot topic, to remain competitive all the CSPs, or cloud service providers, must put limitless resources into ensuring the safety and security of their customers' data. Not only do their reputations drive the effort, but many regulations are constantly coming out with new compliance standards in the wake of all the high-profile breaches of late.

There's definitely no shortage of motivation for the cloud service providers to make sure they have the latest cipher suites, algorithms, and key exchange methods to provide customers the ability to keep their data secure. The potential issue comes from insecure configurations of resources within those cloud networks, but this is no different than the risk you run on-premises.

At least at the cloud service provider level, the infrastructure is protected by compliance regulations and mandates that are checked regularly and rigorously. I'm not sure most organizations can honestly say the same thing about their home shops. So let's look at some of the GoAnywhere cloud features. All right, so how do we fit into all this?

GoAnywhere Cloud Features

Now, we're leveraging, as I mentioned, the security, flexibility, cost effectiveness, disaster recovery, efficiency, and all the benefits of their infrastructure as a service models. For instance, we can install GoAnywhere within the Amazon Web Services arena by spinning up EC2 instances using multiple flavors of Windows or Linux operating systems.

Or, we've also developed a preloaded instance, or AMI, using Windows Server 2016 that you can find by searching the AWS Marketplace, and with just a few clicks you can have GoAnywhere up and running, ready for testing and deployment, which we'll demonstrate here in just a couple of minutes. In the same way, you can run GoAnywhere within Microsoft Azure by choosing a Windows or Linux resource and installing MFT in their cloud space.

With those instances running on cloud resources, GoAnywhere can automate file transfers and data manipulation with ease, strictly in the cloud, between potential hybrid solutions, or just with trading partners and customers alike.

That could be anything from offloading archive data for cheaper storage, freeing up storage and resources on your network, backups for potential DR, or efficient collaboration and visibility for your entire organization no matter where they're located, without the effects of slow and expensive bandwidth limitations.

Furthermore, most regulations require data encryption in transit as well as at rest. Moving files to and from the cloud can be done via the HTTPS secure protocol, as well as server-side encryption leveraging a cloud-managed KMS solution. Currently, GoAnywhere has the capability to natively connect up to AWS S3 buckets for cheap and secure storage.

And in our current beta release 5.6.0, we have added the ability to leverage Blob storage within Microsoft Azure. As always, all transactions within GoAnywhere, whether on-prem or in cloud instances, are audited: service and protocol use, file activity, and web and admin user activity. These logs can be viewed as segregated log files via the GoAnywhere GUI, but there's also the option to send them to a central syslog server should you desire.

GoAnywhere Cloud Infrastructure - AWS

Now, here's a quick diagram depicting a common high availability cluster using GoAnywhere Gateway as both a forward and reverse proxy, as well as the load balancer for the active-active HA cluster of GoAnywhere instances. We chose AWS as our demo for this webinar as it was the first CSP we integrated with, and it's a bit more mature with our AMI quick deployment options as well as the S3 bucket integration.

Now, this diagram here, let me make it a bit bigger. This diagram illustrates a common infrastructure taken from our system architecture diagram. We have high availability both at the gateway level, let me get my little pen tool here, at the gateway level here, maybe a smaller tool than that.

At the gateway level, we've got two gateways here, and then we've got two internal MFT instances. Now, within the AWS environment, the first thing we're going to do is set up a VPC, or virtual private cloud. This is going to be your first protection level within the environment.

This is where you're going to set up your subnetting and things like that, firewall rules from the outside world in. In a typical setup, if we do have a cluster of two gateway instances, we will leverage the Amazon Elastic Load Balancer feature to basically put these two IP instances behind a virtual IP, so that Amazon can handle the load balancing into the actual gateway machines.

Now, these gateways are going to be installed on Linux instances, pretty small instances, and we'll talk about that a little when we get into the actual Marketplace. But usually for a gateway, one or two CPUs and two to four gig of RAM would be plenty for those Linux instances. A couple of things to note from an infrastructure standpoint: for the gateways, we're going to have a public subnet set up here, and this is going to simulate a traditional DMZ on your network. Also within the public subnet area, we have these black and orange squares, which represent security groups within the VPC setup.

These are kind of your host firewalls, if you will; they determine what ACLs we're going to allow for inbound as well as egress traffic into these particular instances. And then on the backend, we're going to have another subnet, a private subnet, for your actual MFT instances to sit on, your traditional private network.

So these will be segregated between the gateways and the actual MFT instances. Now, the actual communications are all the same as in a traditional GoAnywhere setup. Just a couple of things of importance to note: you'll see the dotted orange lines here; these are actually different availability zones. If you're not familiar with AWS, availability zones are geographically separated physical data centers.

So within AWS we have regions, and within those regions we usually have three or four availability zones. In our particular example, I think we did the Northwest region. So these GoAnywhere MFT instances are not only in an active-active cluster talking to a common database and a common file share.

That way, if one instance goes offline, the other one will take up the load. But also, because AWS has low-latency lines between these geographically dispersed physical data centers, we can actually put these instances in separate data centers. So in case the Oregon data center completely went offline, we would still have the one sitting in the other availability zone over here that can pick up the slack, even if an actual physical data center went down within AWS.

All right, let me get out of annotation mode here and let's take a peek at some actual examples within the product. Let me return here and share my screen. Okay, from here, can you guys see my screen okay? Andy?

Andrew: Yep, we're good.

Demonstration – AWS Marketplace

Dan: Okay, all right. From here, we're going to go and dive into the AWS Marketplace. First, you're going to need an actual AWS account, and it's nice because AWS will let you sign up for a one-year free account to do a little tinkering around and playing with the systems. So the first thing I'm going to do is log into my AWS account, and when we first log in, we'll get to our console page.

The first thing we want to look at shows how easily and quickly we can set up an actual instance of GoAnywhere, leveraging not only AWS infrastructure as a service but also a preloaded version of GoAnywhere. If we go down to the AWS Marketplace, we can hit Learn more.

We can do some searching; obviously, if we know we're looking for GoAnywhere MFT, we can search on that and it should pull it up. Or if we want to do some keyword searching, we can search for secure managed file transfer, and we'll see it's a couple of results down, but here it is. Once we do get to the actual AMI in the Marketplace, we can go ahead and click on it, and we're going to see a few things here, maybe a little bit of description of what GoAnywhere does as a product.

We'll see the latest version that we have out there, deployed from that AMI instance, and the operating system. You'll notice right now the only choice is Windows Server; we are working on a Linux-based AMI so we'll have both Windows and Linux. And then there are a couple of other highlights within the actual product listing here.

The other thing we're going to notice is we have to select the region. Again, I've selected the Oregon region. There are going to be a couple of things on the backend when you first set up your Amazon account, but we won't dive into those; this is just to show you how quickly and easily we can set up an instance.

Then you want to look at and pay attention to the actual EC2 instance type. Again, if you're not familiar with AWS, this is basically just going to be a build, and for the most part it's about what types of resources you're going to allocate to this machine, whether CPU or memory.

For our example, we did a t2.medium, so if we pan down and look at the t2.medium, you're going to see that it's about six cents per hour to run it constantly. Once we get all that decided, one thing to note as well: this is a bring-your-own-license model, so you will have to purchase the GoAnywhere license separately.

But once we've got that, we'll hit continue, and this gets us to our summary page where we can look at what we've chosen. We've got the latest 5.5.5 version as far as the AMI goes, and we're choosing the Oregon region. Looking at the instance type, which we can change here if we want, we're looking at four gig of memory and two virtual cores for CPU; that looks good.

VPC settings, again, are the things we talked about in that diagram; those will be things you've already configured at this point. And by default, with the AMI we have chosen, the security groups, remember, being kind of like that host firewall, determine what ports or what kind of traffic, either ingress or egress, you're allowing.

By default, we're going to allow 3389, which is RDP or Remote Desktop Protocol, so that you can actually log into that Windows Server remotely, as well as ports 8000-8001. If you're not familiar, these are going to be the default ports for the admin portal used to configure GoAnywhere. So we'll allow those ports by default.
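If you're scripting this outside the Marketplace one-click flow, the same ingress rules can be expressed with the AWS SDK. This is a minimal sketch, assuming the boto3 library; the port numbers (3389 for RDP, 8000-8001 for the GoAnywhere admin portal) come straight from the defaults above, while the security group ID and CIDR range are placeholders you'd replace and, ideally, tighten to your own admin network rather than leaving wide open.

```python
def goanywhere_ingress_rules(admin_cidr):
    """Build the IpPermissions structure for the default GoAnywhere AMI ports:
    RDP (3389) plus the admin portal (8000-8001)."""
    return [
        {"IpProtocol": "tcp", "FromPort": 3389, "ToPort": 3389,
         "IpRanges": [{"CidrIp": admin_cidr}]},
        {"IpProtocol": "tcp", "FromPort": 8000, "ToPort": 8001,
         "IpRanges": [{"CidrIp": admin_cidr}]},
    ]

def open_admin_ports(security_group_id, admin_cidr):
    # boto3 is imported locally so the rule-building above can be read and
    # tested without the AWS SDK installed or credentials configured
    import boto3
    ec2 = boto3.client("ec2")
    ec2.authorize_security_group_ingress(
        GroupId=security_group_id,
        IpPermissions=goanywhere_ingress_rules(admin_cidr),
    )

# Usage (placeholder values):
#   open_admin_ports("sg-0123456789abcdef0", "203.0.113.0/24")
```

Restricting the CIDR to a known admin range, rather than 0.0.0.0/0, is the "with a little thought into the configuration" point from earlier: the breaches mentioned above were mostly configuration issues, not platform flaws.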

The key pair, again, is something that's configured when you first set up your Amazon account, so we'll just use our testing GoAnywhere key pair. From there, we can also look at our cost estimator with the bring-your-own-license model, and running this constantly, all day every day, it'll come out to about $46.36 a month.
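As a sanity check on that estimate, the arithmetic is just the hourly rate times an always-on month. A quick sketch; the ~$0.0644/hour figure is back-calculated from the quoted monthly total and will vary by region, OS, and instance type, so treat it as an assumption and check the AWS pricing page for your own setup:

```python
def monthly_cost(hourly_rate, hours_per_day=24, days=30):
    """Infrastructure-only estimate for an always-on EC2 instance;
    the GoAnywhere license itself is bring-your-own and billed separately."""
    return round(hourly_rate * hours_per_day * days, 2)

# Roughly six cents per hour for the t2.medium in this demo:
print(monthly_cost(0.0644))  # about 46.37
```

Turning the instance off outside business hours cuts the bill proportionally, which is exactly the pay-for-what-you-use flexibility discussed at the start of the webinar.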

So from here we can launch with one click, and you'll notice it says, "Okay. Hey, thanks for launching." So away it goes. Now, if we jump over to our EC2 console, it does take roughly five minutes for this to actually get up and running before you can leverage the AMI instance. So for the sake of time, I had one done beforehand.

But once this has been up [test here 00:20:06] for about five minutes or so, you'll have a public DNS, so we can copy that to the clipboard. And if I open up another window, let's just throw that in here. Remember we have port 8000 open, so we'll need to throw in :8000. So here, with just a couple of clicks and a couple of configuration options, and again about five minutes for the instance to come up and the services to start.

We now have a running instance of GoAnywhere for you to log into and start configuring and tinkering around with. Now, the default administrator password is actually going to be the instance ID. Instance IDs are unique to each individual instance you have up and running, but again, you can copy it to the clipboard, put it right in here, and go ahead and log in.

So again, within just a minute or two after setting up your Amazon Web Services account, you can leverage our AMI for Windows Server 2016 and get up and running. This is a nice way for you to get going without having to ask permission from your IT team or use up any existing resources on-premises.

All right. So let's take a look at a couple of use cases for leveraging some of the data storage and connection options of AWS within GoAnywhere. The first thing we'll look at real quick is the actual resources; I'm going to look at two different ones. The first is Azure Blob Storage.

If you don't remember, within GoAnywhere, resources are our way of reaching out to other servers and services, whether on-prem or, in this case, out in the cloud. So here we did add Azure Blob Storage. Now, this is in the beta release 5.6.0, but once you set up your Blob storage within Azure, you'll put in the account name, the actual key that you get with that account, and the container, or the actual Blob storage container name.

Then there are a couple of connection options: whether you want the root directory to be that actual container, or, if you have subfolders and want one of those to be the root directory, you can signify that with a forward slash and the folder name. And then timeout settings, if you want a timeout setting for the connection, and your contacts.

Once you fill in your information, we can do a test: resource successful. That's a good sign. Now we can save and exit that resource and leverage it as a storage option. I'm going to jump over to the S3 buckets because I have access to my AWS account.

On the Microsoft Azure side, you do get a month free, but currently I don't have access to my Azure account. So I'm going to go ahead and leverage the Amazon S3 buckets for a quick run-through of a couple of projects. Here, we're going to look at the S3 bucket resource.

Again, pretty similar to the Blob storage, we're going to give it a name, and we have an access key ID, secret access key, and bucket name. So what I want to do is go back to my console, launch the S3 services, and look at the actual Linoma Encrypted bucket I have created. The only thing I'm going to get from the actual S3 area is the name of the bucket. The way the S3 bucket connection works is with the access key IDs and secret access keys.

These are tied to actual user accounts that you set up within AWS, which then get put into a policy group that has permission to those S3 buckets. So if I go back here and switch over to the IAM section, I can look at my users and my Linoma encrypt user. This is the one that has the Amazon S3 full access policy attached. If I look at its security credentials, this is where I'm pulling the access key ID to copy in here.

The secret access key is only given to you one time, usually in a CSV file. If you lose that key, it is non-recoverable; you would have to create a completely new access key and make the old one inactive. But once you do put that information in, for this one here we're specifically choosing server-side encryption, which is definitely an option within AWS.

So they can do AES 256-bit encryption in the actual buckets. Same kind of connection options here: you can do your timeout settings, the root directory, a proxy server if you use one to connect, and then obviously the all-important contacts tab to let people know who to contact if this resource is broken.
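Outside of GoAnywhere, those same three pieces of information, access key ID, secret access key, and bucket name, are all an SDK needs, and the server-side encryption choice maps to a single parameter on upload. A hedged sketch with the AWS boto3 library; the function and parameter names here are illustrative, and any bucket or key values you pass in are placeholders for your own:

```python
def upload_encrypted(access_key_id, secret_access_key, bucket, key, data):
    """Upload an object with AES-256 server-side encryption, mirroring the
    server-side encryption option on the GoAnywhere S3 resource."""
    # boto3 imported locally so the sketch reads without the SDK installed
    import boto3
    s3 = boto3.client(
        "s3",
        aws_access_key_id=access_key_id,
        aws_secret_access_key=secret_access_key,
    )
    s3.put_object(
        Bucket=bucket,
        Key=key,
        Body=data,
        ServerSideEncryption="AES256",  # AES 256-bit at rest, as in the demo
    )
```

The same IAM caveat from the demo applies: the credentials used here should belong to a user whose policy grants only the S3 access it needs.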

Once we do our test, hopefully we're good to go, and there it is, a successful test. So now we've got an actual S3 resource within GoAnywhere that we can connect to. So how can we leverage these things? A quick example of how we can leverage an S3 bucket gets back to one of the scenarios we talked about: maybe you want to offload some files, maybe you want things for disaster recovery, or, in the case of the project I have chosen here, we're simply going to say, "Hey, you know what? Anything older than 30 days, I want to go ahead and automatically upload it to an S3 bucket."

Within this project, it's as simple as defining a timestamp task, which is under Miscellaneous. Here, we're just defining anything older than 30 days and giving it a variable, archive 30 days old. We then leverage a copy task, and this is literally just taking a base directory; this is on my local laptop, and the folder is called archive 30 days old, a very witty name there.

We've got four files in there: a newly created "should still be here" file I created yesterday, and a couple of older ones, older than 30 days. So when we run this project, we should copy the files that fit the criteria, because the file set is filtered on a date setting, leveraging that timestamp variable we defined earlier of negative 30 days.

For the destination directory, if you hit the ellipsis, we now have access to Azure Blob as well as Amazon S3 buckets. We can select the S3 bucket; this list is populated by the resources we defined earlier, and I'll go ahead and point it at the archive directory. Then there are a couple of other variables: the source process files, which is what we'll use to delete the original files, the number of bytes, the number of files, and the destination files variable here.

In the delete task, I'm just doing a little cleanup; I want to get rid of the files locally once they get moved up there. And then I send an email confirmation leveraging a couple of those variables to let me know what actually happened. This would most likely be run on a nightly schedule, but for demo purposes, we'll go ahead and run it interactively.
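The copy-then-delete flow above is easy to picture in plain code. Here is a local sketch of the same logic, where an ordinary folder stands in for the S3 archive destination; in the real project, the copy task writes to the S3 resource, and GoAnywhere supplies the variables and the email step:

```python
import shutil
import time
from pathlib import Path

def archive_old_files(src, dest, max_age_days=30):
    """Copy files older than max_age_days from src to dest, then delete the
    originals -- the same copy task + delete task pattern as the project."""
    cutoff = time.time() - max_age_days * 86400
    dest.mkdir(parents=True, exist_ok=True)
    moved = []
    for f in src.iterdir():
        if f.is_file() and f.stat().st_mtime < cutoff:
            shutil.copy2(f, dest / f.name)  # the copy task
            f.unlink()                      # the delete task (local cleanup)
            moved.append(f.name)
    return sorted(moved)
```

Copying before deleting, rather than moving in one step, matters here: if the upload fails partway, the original file is still on disk, which is the same ordering the GoAnywhere project uses.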

If we look at our actual S3 bucket, we should see two files get populated in the archive folder. Back to here; it's still going, still thinking. Okay, it looks like our job ran successfully, so if we go back to our 30 days old folder, we see those two files are now gone. They did get copied, they did get deleted, and we should see them, hopefully, here. And there they are, pushed up to our AWS S3 bucket.

So that's just one example: archiving files of a certain age to free up local resources and, again, make them available in the cloud to anybody at any time. For the next one, we have a Dfreeman folder under my Linoma Encrypted bucket, and this one doesn't have any files in it right now. One of the other things we can do is, if we look at a web user, and we'll call this one buckets, we can have certain directories that push files directly up to S3.

For this instance, we're just making the home directory actually point to the S3 resource, Linoma Encrypted Dfreeman, putting a bit of a disk quota on there, five gig, and giving the appropriate permissions I want this user to have. For demo purposes, I'm doing this via the web client, so I'm just allowing the HTTPS web client and secure folders, but you could do SFTP, FTPS, or whatever other protocol you want to use. For visual purposes, we'll go to the web client and log in as the buckets user.

I can't do two things at once, so let me type this in first. So here, again, we're just going to take a file; we notice again that the Dfreeman folder has nothing in it. But when we drag a file right in here with our drag-and-drop feature, it's called Demo Notes, and just let go, the file uploads into the web client under secure folders. So when I log into the web client, it's under my secure folders, available right here.

But if we look at the backend and refresh, our Demo Notes file showed up in our Linoma Encrypted Dfreeman S3 bucket. So again, those are just a couple of things we can do to leverage that S3 storage natively within GoAnywhere for secure file movement. And one more little thing: if we look at the properties of the Linoma Encrypted bucket, one of the settings we can set is encryption at rest within the S3 bucket; you can see our default encryption is set to AES-256 while data sits in that S3 bucket.
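That bucket-level default can also be inspected or set programmatically rather than through the console. A hedged boto3 sketch; the configuration dictionary below is the shape the S3 API expects, while the bucket name you pass in is a placeholder for your own:

```python
def default_aes256_config():
    """Server-side encryption configuration matching the demo bucket's
    'default encryption: AES-256' property."""
    return {
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
        ]
    }

def enable_default_encryption(bucket):
    # Local import keeps the config helper testable without the SDK installed
    import boto3
    boto3.client("s3").put_bucket_encryption(
        Bucket=bucket,
        ServerSideEncryptionConfiguration=default_aes256_config(),
    )
```

With the bucket default in place, objects uploaded without an explicit encryption header still land encrypted at rest, a useful backstop for clients that forget to ask for it.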

So there are different ways and different things, and I know we're running out of time here, that we can automate from a trigger standpoint: if files land in a certain folder, kick them up to S3. Or maybe you're doing file monitoring, watching a certain folder for certain types of files to land there, then moving those up to your S3 or Blob storage resources.

Upcoming Cloud Integrations

Let me jump back into the PowerPoint here and just show you a quick slide. So, some of the things coming soon. I know we showed you the beta version here, but we are going to have native Blob storage with Microsoft Azure. It is in the current beta release, and, I don't want to step on any toes, but I believe the beta release 5.6.1 will probably be available early next week.

Also new with this, the KMS, or Key Management System, is now going to be database-driven. We will still support file-based keys, whether PGP, SSL, or SSH. And it will be nice looking at some of the agent tasks; I know we talked about agents a few months back in one of these courses, and they were limited because those were the first couple of iterations. Now we're able to pass those keys to do automated PGP file-level encryption and decryption, as well as some FTPS, SFTP, and email-type tasks at the agent level.

And then again, as we mentioned, the Linux AMI for the AWS Marketplace is coming, and we're also looking at native Cloud Connectors like Google Drive, Dropbox, OneDrive, et cetera. With that, I want to wrap up; [inaudible 00:31:55] went a little long. I'll go ahead and pass it back to you, Andrew.

Andrew: Thanks. Can you hear me okay?

Dan: Yep.

Q&A

Andrew: Okay, we did have one quick question, and I'm not sure about the full details, but it's: how is this different from a regular server install on a Windows machine? I don't know if it's a quick response or if you need to get into more details, but-

Dan: No, it's pretty quick. You're literally leveraging the infrastructure as a service, the actual instance up in the cloud. There is really no difference in the install from an on-premises Windows 2016 server to an AWS or Azure 2016 server. Adding the AMI instances so you can do that quick click-through to have it auto-loaded, those are the things we're leaning toward, and obviously having that native connection to their Blob and block-level-type storage is key. But no, not really a whole lot different, just the location.

Andrew: Okay. Perfect. Thank you.

Dan: Yep.

Andrew: Are there any plans for Azure webinars? We've got a couple of questions specifically about Azure. I think that is something we probably will look into and have discussed for 2018, which is kind of a good segue here. So we are wrapping up; this is the end of the series for this year. The four webinars on your screen now are all on demand.

This one will be on our website as well, so you can always jump on and check out any you missed. There are also... we did cover some modules in the past: agents, secure forms, workflows, things like that. When the survey pops up at the end here, if there's more information you want on any of those modules, feel free to make a note in there and we'll make sure to have a representative contact you to answer any additional questions you may have.

And again, a couple of questions are coming through on pricing and things like that. That's a great question; if you can throw that in the survey, we'll have someone follow up, and we'll also have notes here to follow up on. Here's one, maybe for you, Dan: in the case we move MFT to the cloud, what resources exist to keep existing internal database connections?

Dan: Oh, I'm sorry. Can you say it again?

Andrew: Sorry, it's just a quick question: if I'm moving MFT to the cloud, what resources exist to keep existing internal database connections?

Dan: Internal database. It depends. You can obviously leverage database connections up in the cloud, whether it's Aurora or any of the RDS instances with AWS. Or, if you have that connection, like a Direct Connect with AWS, you can still have connections to any resources on-prem; you could definitely do a hybrid-type environment. I'm not sure if I understood the question correctly, but you can definitely do hybrid-type setups with this.

Andrew: Okay. This is kind of an extension of that: can cloud exchanges integrate into Advanced Workflows? Like, is it more push?

Dan: Yeah, that's what we were talking about with the native storage there. As long as it has connectivity, depending on how that connectivity is set up, as we were just showing with the S3 buckets and Blob storage, you can leverage those as network resources to connect up to. Or, if your instances are up in the cloud, then you can move things between instances in the cloud. It just depends on where they're located.

Andrew: Okay, great. That looks like it for right now, so we'll wrap up with that. If there are additional questions, we will follow up directly, and feel free to put them in the survey at the end. If there are any additional topics you want us to cover, we are planning for 2018 right now. Dan's been great at digging deep into GoAnywhere these past three or four months, so we'd love to continue. Let us know in the comments what you'd like to hear from us. Thanks again, Dan, for taking the time to teach us all this stuff.

Dan: Yep, you bet.

Andrew: All right, everybody have a great day. Thanks for joining us.

Learn More about GoAnywhere MFT in Azure and Amazon

Schedule a live demo. Choose from our 15-, 30-, or 60-minute options to pick the level of detail that works best for you! Plus, check out the Azure and Amazon GoAnywhere pages.

Schedule My Demo