The Top 5 Mistakes Made When Apps and Data Move to the Cloud

“We don’t have the resources to invest in that new server farm, so let’s move it to the cloud.” Have you ever heard that from your C-suite?

New applications or significant upgrades to existing applications often require additional computing power. Since these expansions can strain existing in-house data centers in terms of power, floor space, and cooling, CIOs may want to shift new operations to outside cloud providers. For organizations that lack prior cloud experience, this can be a high-risk endeavor.

An important first step in moving assets to the cloud is to know the common mistakes that can lead to a data disaster or a breach of sensitive information.

1. Treating the Cloud as an Extension of the Data Center

Information technology professionals have historically taken the “M&M” approach to security when designing network and application architectures: a hard outer shell with a soft, gooey core. Defending a perimeter is easier because the number of connections and ports is finite, so internal security controls have not traditionally needed to be as robust; firewall rules also limit what can reach the inside, so less attention is paid there.

When data and applications are moved into a cloud, IT staff should not attempt to transplant the internal data center architecture; it must be redesigned around the unique risks the cloud brings. History has exposed a few common mistakes in credential management, addressing, and application management.

A robust credentialing architecture must balance the complexity of a secure system against the inconvenience end users will tolerate. When applications and data are hosted in the same protected network environment as the end users, the lower level of risk allows developers to rely on a user ID and password for general users. Administrator accounts should have additional controls, such as longer passwords or even multi-factor authentication (MFA). For access to cloud applications, a user ID and password alone is probably not appropriate for sensitive data, so additional controls such as device credentialing and/or MFA should be the minimum. Administrator accounts should use MFA plus additional measures, including VPN access, geolocation-aware edge devices, frequent password changes, and, of course, increased logging and alerting.
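The tiered requirements above can be sketched as a simple policy check. This is a minimal illustration, not a production access-control system: the role names, fields, and rules are assumptions chosen to mirror the paragraph (passwords alone inside the perimeter, device credentials and MFA for cloud users, MFA plus VPN for administrators).

```python
# Illustrative sketch of tiered authentication policy for cloud access.
# Field and role names are hypothetical, chosen to match the text above.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    role: str               # "user" or "admin" (assumed roles)
    has_mfa: bool           # completed multi-factor authentication
    device_registered: bool # request came from a credentialed device
    from_vpn: bool          # request arrived over the VPN

def required_controls(req: AccessRequest) -> list[str]:
    """Return the controls still missing before cloud access should be granted."""
    missing = []
    if not req.has_mfa:
        missing.append("multi-factor authentication")
    if not req.device_registered:
        missing.append("device credential")
    if req.role == "admin" and not req.from_vpn:
        # Administrators need MFA *plus* additional measures such as VPN access.
        missing.append("VPN access")
    return missing
```

A real deployment would also cover the measures that do not fit a per-request check, such as password rotation, geolocation awareness, and centralized logging and alerting.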

There have been several breaches of sensitive data because application developers used web shortcuts. A more secure architecture requires a separate login to a web application before data can be accessed; it is never proper to create direct external links to sensitive data. In previously reported breaches, data was exposed because the suffix of the URL contained a sequentially assigned record number. The breaches came to light when a third-party user was sent a direct URL link to their personal data and, by changing a single character, was able to access another individual’s records. There are a few mitigations, including requiring a complete login into a secure environment followed by specific credentials to access the sensitive data. While less secure on its own, the use of very long, randomly generated unique keys will thwart most accidental disclosures. Finally, any public or private website can limit automated search engine indexing by placing a ‘robots.txt’ file at the root of the site. While this will stop legitimate search engines, it will not stop malicious web crawlers and spammers who are looking for email addresses.
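The random-key mitigation described above is straightforward with Python's standard library. This is a sketch of the idea, not the method used by any breached site: the URL pattern shown is hypothetical, but `secrets.token_urlsafe` is a real stdlib function designed for exactly this kind of unguessable token.

```python
# Sketch: replace guessable sequential record numbers in URLs with long,
# randomly generated keys, so changing one character of a link cannot
# reveal a neighboring record.
import secrets

def new_record_key() -> str:
    # 32 random bytes encode to 43 URL-safe base64 characters,
    # which is infeasible to enumerate, unlike /records/1001, /records/1002, ...
    return secrets.token_urlsafe(32)

# Hypothetical URL construction for illustration only.
print(f"/records/{new_record_key()}")
```

Note that such keys only reduce the risk of accidental disclosure; they are not a substitute for requiring an authenticated session before the record is served.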
