
Posts Categorized Under "ENCRYPTION"

No Such Thing as a Free File Transfer, Part 2: Cost-effective Security

With new corporate data breaches in the news seemingly every day, it’s no surprise that security is a top concern for IT professionals. However, file transfers are an area where many companies are still vulnerable. Most file transfers still use FTP, a protocol that comes with inherent risks. It’s especially worrisome that, as TechRepublic points out, FTP is actually becoming more popular again. Other common file transfer solutions, like file sharing apps, come with their own security concerns.

This is the second in a series of articles about the ROI of managed file transfer (MFT); the first covered time savings. There’s no doubt that data breaches are costly. The 2016 Ponemon Cost of Data Breach Study puts the current average cost at $4 million, or $158 per record breached. So it’s a no-brainer that a solution to secure your file transfers would bring you a great return on investment.

And yet, when you try to get internal approval for products to help with security, proving the ROI can be difficult. A good security tool is by nature preventative. If you haven’t suffered a breach (or you have and don’t know about it yet), you probably don’t have a way to precisely calculate cost-savings.

Still, your data certainly has value, and you know you have to keep it secure. So how do you know you are protecting your file transfers with the solution that gives you the most bang for your buck? By making sure the software you choose addresses all of the top file transfer security concerns within one solution—no additional purchases or custom scripting required.

A Variety of Secure Protocols

FTP has been proven vulnerable to hacking. For example, 7,000 FTP sites, including an FTP server run by The New York Times, had their credentials circulated in underground forums in 2014. In some cases, hackers used the credentials to upload malicious files.

It’s essential for modern enterprises to turn to more modern and secure file transfer methods, such as:

  • AS2: AS2 generates an "envelope" for the data, allowing it to be sent using digital certificates and encryption.
  • SFTP and FTPS: These secure FTP protocols reduce risk during data exchange by using a secure channel between computer systems.
  • HTTPS: The secure version of HTTP, HTTPS encrypts communications between browser and website.

Which of these methods your company implements may depend on several factors, like your industry compliance requirements or what your trading partners use. Your requirements may also change over time. That’s why the best investment is a versatile managed file transfer solution that can handle any of these protocols and more.   
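To make this concrete, here is a minimal sketch of an FTPS upload using Python's standard-library `ftplib`. The host, credentials, and file names are placeholders, and a real MFT deployment would wrap this in retry and audit logic.

```python
import ftplib

# Conventional default control ports for the secure protocols above.
DEFAULT_PORTS = {"ftps": 21, "sftp": 22, "https": 443}

def upload_ftps(host: str, user: str, password: str,
                local_path: str, remote_name: str,
                port: int = DEFAULT_PORTS["ftps"]) -> None:
    """Upload one file over explicit FTPS (FTP secured with TLS)."""
    ftps = ftplib.FTP_TLS()
    ftps.connect(host, port)
    ftps.login(user, password)   # control channel is upgraded to TLS
    ftps.prot_p()                # switch the data channel to TLS as well
    with open(local_path, "rb") as f:
        ftps.storbinary(f"STOR {remote_name}", f)
    ftps.quit()
```

SFTP would follow the same pattern but runs over SSH (port 22), so it requires an SSH-capable library rather than `ftplib`.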

Protection against People

When you imagine the security threat to your company, you might conjure up images of hackers working tirelessly to access your systems and use your data for nefarious purposes. The truth is, one of your biggest threats is probably in the office down the hall.

A 2015 study found that internal actors were responsible for 43% of data loss. Half of this is intentional—disgruntled or opportunistic employees, contractors, or suppliers performing deliberate acts of data theft. But half of it is accidental. People like to cut corners, and probably most employees in your company aren’t as concerned about security as you are.

Any file transfer solution with a good ROI has to address the threat coming from within the business. You want role-based security options that limit each user to the servers and the managed file transfer functions they absolutely need to use. Detailed audit logs mean you always know who is doing what with the solution.
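As a sketch of the idea, the following minimal role-based check records every access attempt in an audit log. The roles, servers, and actions are hypothetical examples, not the vocabulary of any particular MFT product.

```python
import time

# Hypothetical role table: each role maps to the servers and MFT
# functions its members are allowed to use.
ROLE_PERMISSIONS = {
    "operator": {"servers": {"sftp-prod"}, "actions": {"upload", "download"}},
    "auditor":  {"servers": {"sftp-prod"}, "actions": {"view_logs"}},
}

AUDIT_LOG = []

def authorize(role: str, server: str, action: str) -> bool:
    """Allow the action only if the role grants it, and record the attempt."""
    perms = ROLE_PERMISSIONS.get(role, {"servers": set(), "actions": set()})
    allowed = server in perms["servers"] and action in perms["actions"]
    AUDIT_LOG.append({"ts": time.time(), "role": role,
                      "server": server, "action": action, "allowed": allowed})
    return allowed
```

Because every attempt is logged, allowed or not, the audit trail answers "who did what, where, and when" even for denied requests.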

Ensure Compliance

In many industries, inadequate security practices don’t just put your own corporate data at risk, they can endanger highly sensitive information like credit card numbers and health records. For this reason, a number of regulations exist to protect personal data. A few of the most common are PCI DSS, Sarbanes-Oxley, and HIPAA, but your industry may have others.

A 2011 study found that while the cost of compliance averaged more than $3.5 million, the estimated cost of failing to comply was $9.4 million, showing that a solution that helps you comply with regulations has a clear ROI. In the case of file transfers, your MFT platform should have a number of encryption methods available to protect sensitive data, including SSL, SSH, AES, and OpenPGP encryption. Audit trails should also be in place to track file transfer activity so you can easily determine what files are being sent, when they are sent, and who the sender and receiver are.

Modernization and Scalability

Once you’ve gone to the effort of choosing a file transfer solution that will protect your company, convincing management of its necessity, and implementing the software, the last thing you want is to change it two years down the road because your company is bigger, has more compliance requirements, or has new trading partners.

A managed file transfer platform from an established, reliable software provider will make sure you stay updated with the features necessary to combat current security threats. Furthermore, if your volume of file transfers increases, you won’t need to invest in a new tool to handle the workload.

Bonus: Increased Productivity

If your managed file transfer solution can prevent a data breach, that alone makes it worth the investment. But what if it could increase productivity and reduce errors at the same time? The automation capabilities of managed file transfer software allow you to make a high volume of file transfers without the need for tedious manual work. Streamlining this process and eliminating the risk of human error both add to your organization’s bottom line.

Read more about safeguarding company data and limiting risk, or get started with a free trial of managed file transfer.


No Such Thing as a Free File Transfer, Part 1: How MFT Saves Time

Every business engages in some kind of information exchange, whether it’s a small retailer attaching an invoice to an email or a hospital sending hundreds of patient records between departments. Some methods of exchanging files, like a basic FTP server or a file sharing app, seem like an inexpensive way to deal with your transfers. In the long run, however, the shortfalls of these tools will likely cost your company significantly more than the investment in a sophisticated managed file transfer (MFT) solution.

A study by the Aberdeen Group found that every file sent “for free” actually has an 80% chance of costing your organization money. In a new series of articles, we’ll break down the reasons why MFT gives your company a better ROI than any other file transfer solution. The first reason we’ll discuss is the time you’ll save with managed file transfer.

We’ve all heard that time is money, and if you’ve ever been the unlucky person manually transferring files by FTP, it’s no stretch of the imagination to think that automated file transfer software would save a bit of time on each exchange. But you probably haven’t even thought of all the ways a rudimentary file transfer tool can waste costly hours. Here are a few:

  1. Dealing with Exceptions

As with any process, your file transfers aren’t always going to go smoothly. While even a basic tool will work most of the time, you’ll inevitably run into the occasional problem which will require you to divert members of your staff away from more important projects to help get the files moving. Aberdeen’s analysis found that those who don’t use MFT have more than twice as many of these errors and exceptions as MFT users. With a single-function file transfer tool, the operator is solely responsible for checking if the transfer succeeded and trying it again if it failed. A good managed file transfer solution has ways of dealing with issues that arise—for example, the software could automatically reconnect and resume the file transfer after a problem occurs with the network.

Moreover, the MFT solution will provide visibility into the status of automated file transfers and let you know if something goes wrong. This allows you to attack the problem immediately and get back to your more strategic initiatives as soon as possible. A basic tool or script may cause you to waste hours just trying to determine what happened to your files.
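The reconnect-and-retry behavior described above can be sketched in a few lines. Here `send` stands in for whatever performs the actual transfer, and a production tool would also resume at the last confirmed byte rather than restart from scratch.

```python
import time

def transfer_with_retry(send, max_attempts: int = 3, delay_seconds: float = 0.0):
    """Call send() until it succeeds, retrying transient failures.

    send is any zero-argument callable that raises OSError on failure.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return send()
        except OSError:
            if attempt == max_attempts:
                raise                    # give up and surface the error
            time.sleep(delay_seconds)    # back off before reconnecting
```

The key point is that the operator no longer has to watch the transfer: failures are handled in the loop, and only a persistent problem ever surfaces.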

  2. Upgrades and Modifications

A common way to move files is with custom scripts. This seems like an easy option at first. Your company has talented programmers, and it’s not too hard to create a homegrown FTP script that gets the job done. The first few times you need a modification or a new feature, that’s not difficult either. But pretty soon your company is transferring thousands of files every day, your homegrown solution is severely lacking in the error-handling, security, and logging capabilities it needs, and updating your mess of sprawling scripts will cost you dearly in expensive programmer hours. Or maybe the original creator of the scripts has left the company and those hours will be spent just trying to figure out how it all works.

Managed file transfer has the features you require as your business needs grow more complex. You can trust that it will continue to be updated when necessary and upgrades won’t require the same technical expertise as creating a homegrown tool does. 

  3. Compliance Requirements and Auditing

Storing and tracking detailed audit information is crucial for staying compliant with PCI DSS, HIPAA, state privacy laws, and other regulations. A managed file transfer solution will store detailed audit records for all file transfer and administrator activity and provide that data in an easily accessible format to authorized users. If you are legally obligated to collect this information, there’s no better time-saver than implementing file transfer software that stores the data automatically.

Furthermore, compliance requirements can always change or new regulations can be put in place. While you may already have a process for complying with current regulations, MFT provides the flexibility to respond to new security requirements without creating too much additional time-consuming work.

  4. Avoiding Downtime

Just one minute of unplanned system downtime costs a company an average of $5,600. Talk about expensive hours! Make sure your file transfers keep running even if a server goes down by implementing MFT software that integrates clustering. This means you have a group of linked servers running concurrently, with each installation of your MFT tool sharing the same set of configurations and trading partner accounts. The servers in the cluster are in constant communication with each other, so if one fails, the remaining systems in the cluster will continue to service the trading partners. With the fast pace of modern business, you can’t afford to let your transactions wait while you take the time to get your systems functioning again.
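The failover behavior of a cluster can be illustrated with a simple sketch: try each node in order and fall through to the next on a connection failure. The node names and the `send` callable are placeholders, not part of any real product's API.

```python
def send_with_failover(servers, send):
    """Try each clustered server in turn until one accepts the transfer.

    servers is an ordered list of host names; send(host) raises
    ConnectionError if that node is down.
    """
    last_error = None
    for host in servers:
        try:
            return send(host)
        except ConnectionError as exc:
            last_error = exc             # this node is down; try the next
    raise RuntimeError("all cluster nodes failed") from last_error
```

A real cluster also replicates configuration and trading partner accounts across nodes, so the surviving server can service partners transparently.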

Every minute that your business isn’t paying employees to fight fires, write custom scripts, or compile audit reports is a minute that can be put towards the work that helps the bottom line.

Interested in learning more about the ROI of Managed File Transfer? Read the next installment in our series: No Such Thing as a Free File Transfer, Part 2: Cost-effective Security.

 

Learn more about the risks of inadequate FTP implementations or get started with a free trial of managed file transfer today. 


Top 10 Healthcare Data Breaches in 2010

Most data breaches are caused by simple acts of carelessness.

Last March the Ponemon Institute released its findings for the 2010 Annual Study: U.S. Cost of a Data Breach. The study, based on the actual data breach experiences of 51 U.S. companies from 15 different industry sectors, revealed that data breaches grew more costly for the fifth year in a row, jumping from $204 per compromised record in 2009 to $214 in 2010.

The increase in cost, however, pales in comparison to the reputational cost of companies that have been victimized, particularly in the healthcare sector.

HITECH builds Wall of Shame

Consider that the U.S. Department of Health and Human Services has begun posting the data breaches affecting 500 or more individuals, as required by section 13402(e)(4) of the HITECH Act. The New York Times has labeled this site "The Wall of Shame". Why? Because if patients have no faith in electronic record-keeping, the future of healthcare record automation will be jeopardized: lawsuits and government regulation will bury any cost savings.

The Back Stories of Healthcare Data Breaches

What are the stories behind the most severe healthcare sector data breaches reported in 2010? Here are the ten most expensive stories, in ascending order of cost, documented in the Privacy Rights Clearinghouse database. While they're sobering reminders of the problem of keeping data secure, they're also instructive: none of these breaches were malicious hacks; instead, they were the results of theft, poor record-keeping policies, and simple human error.

(Note that the estimate of liability uses the $214 per record cost identified by the Ponemon Institute in its annual report. We have purposely not published the names of the reporting institutions.)
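The liability figures below are simple multiplication, which a short snippet makes explicit:

```python
PONEMON_COST_PER_RECORD = 214  # 2010 U.S. average, per the study above

def estimated_liability(records_exposed: int) -> int:
    """Rough breach liability: records exposed times the per-record cost."""
    return records_exposed * PONEMON_COST_PER_RECORD
```

For example, the 25,000-record physician office theft below works out to 25,000 × $214 = $5,350,000.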

10th Most Expensive: Physician Computer Theft Exposes 25,000

On June 29th of 2010 a thief stole four computers from a physician specialist's office in Fort Worth, Texas.  This theft resulted in an estimated 25,000 patient records being exposed.  The patient records contained addresses, Social Security numbers and dates of birth. Estimated liability: $5,350,000.

9th: Medical Center Theft Exposes 39,000

On the weekend of May 22nd, 2010 two computers were stolen from a medical center in the Bronx. Names, medical record numbers, Social Security numbers, dates of birth, insurers, and hospital admission dates of patients were known to be on the computers.  Total records compromised: 39,000. Estimated liability: $8,346,000.

8th: Optometrist's Computer Theft Exposes 40,000

A computer stolen from an Optometry office in Santa Clara, California on Friday April 2nd, 2010 contained patient names, addresses, phone numbers, email addresses, birth dates, family member names, medical insurance information, medical records, and in some cases, Social Security numbers. Though the files were password protected, they were not encrypted.  A total of 40,000 records were lost, with an estimated liability of $8,560,000.

7th: Medical Records Found at Dump Expose 44,600

Medical records were found at a public dump in Georgetown, Massachusetts on August 13th, 2010. The records contained names, addresses, diagnoses, Social Security numbers, and insurance information. A medical billing company that had worked for multiple hospitals was responsible for depositing the records at the dump. The exposure required the hospitals to notify patients, an effort that continues to this day. The total number of records known to have been exposed is 44,600, but the search continues. Estimated liability: $9,544,400.

6th: Consultant Laptop Stolen Exposing 76,000

On March 20th, 2010, in Chicago, Illinois, a contractor working for a large dental chain had his laptop stolen. The computer held a database containing the personal information of approximately 76,000 clients, including first names, last names, and Social Security numbers. Estimated liability: $16,264,000.

5th: Lost CDs Expose 130,495

On June 30th, 2010 a medical center in the Bronx reported that multiple CDs containing patient personal information, sent to it by its billing associate, had been lost in transit. The information for 130,495 patients included dates of birth, driver's license numbers, descriptions of medical procedures, addresses, and Social Security numbers. Estimated liability: $27,925,930.

4th: Portable Hard Drive Theft Exposes 180,111

In Westmont, Illinois, a medical management resources company reported on May 10, 2010 that a portable hard drive had been stolen after a break-in.  The company believes the hard drive contained personally identifiable information about patients including name, address, phone, date of birth, and Social Security number. The company acknowledged that this hard drive had no encryption.  As a result, 180,111 records were exposed, creating an estimated liability of $38,543,754.

3rd: Leased Digital Copier Leaks 409,262

On April 10th, 2010 a New York managed care service in the Bronx reported that it was notifying 409,262 current and former customers, employees, providers, applicants for jobs, plan members, and applicants for coverage that their personal data might have been accidentally leaked through a leased digital copier. The exposure resulted because the hard drive of the leased digital copier had not been erased when returned to the warehouse. Estimated liability: $87,582,068.

2nd: Training Center Hard Drive Theft Exposes 1,023,209

The theft of 57 hard drives from a medical insurance company's Tennessee training facility in October of 2010 put at risk the private information of an estimated 1,023,209 customers in at least 32 states. The hard drives contained audio and video files as well as customers' personal and diagnostic information, dates of birth, Social Security numbers, names, and insurance ID numbers. That data was encoded but not encrypted. Estimated liability to date: $218,966,726.

Most Expensive of 2010: Two Laptops Stolen Exposes 860,000

A Gainesville, Florida health insurance company reported in November of 2010 that two stolen laptops contained protected information initially estimated at 860,000 records, a figure since revised to 1.2 million people. This is an ongoing story, as new estimates are calculated. To date, the estimated liability is $256,800,000.

Preventing Exposure: Data Encryption

These cases document that the majority of the data breaches which occurred in 2010 were not the result of hacking activities, or even unauthorized access by personnel. The greatest data losses were simply the result of the theft of portable devices and misplaced media. Had the contents of the files been encrypted, the risks and liabilities of these data losses could have been significantly reduced.

Time and time again, industry experts point to data encryption as the key method by which organizations can prevent inadvertent exposure of sensitive data.

Of course, no healthcare organization wants to be listed on the U.S. Department of Health and Human Services' Wall of Shame. And the costs, in dollars and in reputation, can be extraordinary.

Isn't it about time your management got serious about data encryption?


Encrypting Files with OpenPGP

When our users send a file over the Internet, there are really just a few things that seem important to them at the time:

a) Is the file complete?

b) Is it being sent to the right place?

c) Will it arrive intact?

d) And, if the data is sensitive, will the intended recipient (and only that recipient) be able to use it?

That's where encryption comes in: By scrambling the data using one or more encryption algorithms, the sender of the file can feel confident that the data has been secured.

But what about the file's recipient? Will she/he be able to decode the scrambled file?

Encryption, Decryption, and PGP

For years, PGP has been one of the most widely used technologies for encrypting and decrypting files. PGP stands for "Pretty Good Privacy" and it was developed in the early 1990s by Philip Zimmermann. Today it is considered one of the safest cryptographic technologies for signing, encrypting, and decrypting texts, e-mails, files, directories, and even whole disk partitions.

How PGP Works

PGP encryption employs a serial combination of hashing, data compression, symmetric-key cryptography, and, finally, public-key cryptography. Each step uses one of several supported algorithms. The resulting public key is bound to a user name and/or an e-mail address. Current versions of PGP employ both the original "Web of Trust" authentication method and the X.509 specification of a hierarchical "Certificate Authority" method to ensure that only the right people can decode the encrypted files.
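The ordering of those steps can be illustrated with a deliberately simplified sketch: compress first, then encrypt under a session key. The "cipher" here is a toy SHA-256 keystream for illustration only, and the signing and public-key wrapping of the session key that real PGP performs are omitted.

```python
import hashlib
import zlib

def _keystream(key: bytes, length: int) -> bytes:
    """Toy keystream from SHA-256 in counter mode (NOT real PGP crypto)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def toy_encrypt(message: bytes, session_key: bytes) -> bytes:
    """Mimic PGP's order of operations: compress, then symmetric-encrypt.

    Real PGP would also sign the message and encrypt session_key with
    the recipient's public key; both steps are omitted here.
    """
    compressed = zlib.compress(message)
    ks = _keystream(session_key, len(compressed))
    return bytes(a ^ b for a, b in zip(compressed, ks))

def toy_decrypt(ciphertext: bytes, session_key: bytes) -> bytes:
    """Reverse the pipeline: symmetric-decrypt, then decompress."""
    ks = _keystream(session_key, len(ciphertext))
    return zlib.decompress(bytes(a ^ b for a, b in zip(ciphertext, ks)))
```

Compressing before encrypting matters: encrypted output looks random and will not compress, so PGP always compresses the plaintext first.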

Why are these details important for you to know?

Growing Pains for PGP

PGP has gone through some significant growing pains, including a widely publicized criminal investigation by the U.S. government. (Don't worry! The federal investigation was closed in 1996 after Zimmermann published the source code.)

One result of PGP's growing pains has been fragmentation: earlier versions of the technology sometimes cannot decode files produced by the more recent versions deployed within various software applications. This versioning problem was exacerbated as ownership of the PGP technology was handed off from one company to another over the last 20 years.

And yet, because PGP is such a powerful tool for ensuring privacy in data transmission, its use continues to spread far more quickly than other commercially owned encryption technologies.

Fragmentation and the Future of PGP

So how is the industry managing the issue of PGP fragmentation? The answer is the OpenPGP Alliance.

In January 2001, Zimmermann started the OpenPGP Alliance, establishing a Working Group of developers that are seeking the qualification of OpenPGP as an Internet Engineering Task Force (IETF) Internet Standard.

Why is this important to you? By establishing OpenPGP as an Internet Standard, fragmentation of the PGP technology can be charted and - to a large degree - controlled.

This means that an encrypted file destined for your system will use a documented, standardized encryption technology that can be appropriately decrypted. The standardization helps ensure privacy, interoperability between different computing systems, and a clear path for securely interchanging data.

The OpenPGP Standard and Linoma Software

OpenPGP has now reached the second stage in the IETF's four-step standards process, and is currently seeking draft standard status. (The standards document for OpenPGP is RFC4880.)

Linoma Software uses OpenPGP in its GoAnywhere Director Managed File Transfer solution. Just as importantly, Linoma Software is an active member of the OpenPGP Alliance, contributing to the processes that will ensure that OpenPGP becomes a documented IETF Internet Standard. This will ensure that your investment in Linoma's GoAnywhere managed file transfer software remains current, relevant, and productive.

For more information about OpenPGP and the OpenPGP Alliance, go to http://www.openpgp.org. To better understand how OpenPGP can help your company secure its data transfers, check out Linoma Software's GoAnywhere Director managed file transfer (MFT) solution.


Who is Protecting Your Health Care Records?

Patient Privacy in Jeopardy

How important is a patient's privacy? If your organization is a health care facility, the instinctive answer that comes to mind is "Very important!" After all, a patient's privacy is the foundation upon which the doctor/patient relationship is built. Right?

But the real answer, when it comes to patient data, may surprise you. According to a study released by the Ponemon Institute, "patient data is being unknowingly exposed until the patients themselves detect the breach."

The independent study, entitled "Benchmark Study on Patient Privacy and Data Security" and published in November 2010, examined the privacy and data protection policies of 65 health care organizations subject to the Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009. HITECH requires health care providers to provide stronger safeguards for patient data and to notify patients when their information has been breached.

Patient Data Protection Not a Priority?

According to the study, seventy percent of hospitals say that protecting patient data is not a top priority. Most at risk are billing information and medical records, which are not being protected. More significantly, there is little or no oversight of the data itself, as patients are often the first to detect breaches and end up notifying the health care facility themselves.

The study reports that most health care organizations do not have the staff or the technology to adequately protect their patients' information. The majority (67 percent) say that they have fewer than two staff members dedicated to data protection management.

And perhaps because of this lack of resources, sixty percent of organizations in the study had more than two data breaches in the past two years, at a cost of almost $2 million per organization. The estimated cost per year to our health care system is over $6 billion.

This raises the question: Why?

HITECH Rules Fail to Ensure Protection

HITECH encourages health care organizations to move to Electronic Health Records (EHR) systems to help better secure patient data. And, indeed, the majority of the organizations in the study (89 percent) said they have either fully implemented or planned soon to fully implement EHR. Yet the HITECH regulations to date do not seem to have diminished security breaches at all, and the Ponemon Institute's study provides a sobering evaluation:

Despite the intent of these rules (HITECH), the majority (71 percent) of respondents do not believe these new federal regulations have significantly changed the management practices of patient records.

Unintentional Actions - The Primary Cause of Breaches

According to the report, the primary causes of data loss or theft were unintentional employee action (52 percent), lost or stolen computing device (41 percent) and third-party mistakes (34 percent).

Indeed, it would seem that, with the use of EHR systems, technologies should be deployed to help prevent these unintentional breaches. And while 85 percent believe they do comply with the loose legal privacy requirements of HIPAA, only 10 percent are confident that they are able to protect patient information when it is used by outsourcers and cloud computing providers. More significantly, only 23 percent of respondents believed they were capable of curtailing physical access to data storage devices and servers.

The study lists 20 commonly used technology methodologies encouraged by HITECH and deployed by these institutions, including firewalls, intrusion prevention systems, monitoring systems, and encryption. The confidence these institutions feel in these technologies is also listed. Firewalls are the top choice for both data breach prevention and compliance with HIPAA. Also popular for accomplishing both are access governance systems and privileged user management. Respondents favor anti-virus and anti-malware for data breach prevention, and for compliance with HIPAA they favor encryption for data at rest.

The Value of Encryption

The study points to the value of encryption technologies, for both compliance purposes and for the prevention of unintended disclosure, and this value is perceived as particularly high by those who participated in the study: 72 percent see encryption as a necessary technology for compliance, even though only 60 percent are currently deploying it for data breach prevention. This identified need for encryption falls just behind the use of firewalls (78 percent) and the requirements of access governance (73 percent).

Encryption for data at rest is one of the key technologies that HITECH specifically identifies: an encrypted file cannot be accidentally examined without the appropriate credentials. In addition, some encryption packages, such as Linoma's Crypto Complete, monitor and record when and by whom data has been examined. These safeguards permit IT security to audit the use of data to ensure that, should an intrusion occur, the scope and seriousness of the breach can be assessed quickly and confidently.

So how important is a patient's privacy? We believe it's vitally important. And this report from the Ponemon Institute should make good reading to help your organization come to terms with the growing epidemic of security breaches.

Read how Bristol Hospital utilizes GoAnywhere Director to secure sensitive data.