The way organizations exchange data is constantly evolving, driven both by high-profile data breaches and by ever-advancing technology. Businesses have many options when selecting products to use and services to offer their clients. However, if those products are not configured and placed correctly, the dreaded hack becomes far more likely.
Let's look at how placement configurations have evolved over the years. Traditionally, hardware and software for exchanging data between the public internet and a private network were arranged in a straight line with very few layers: a public DMZ at the outermost layer; application servers and storage components behind it; and, if the customer ran a mainframe, that environment at the innermost layer. Larger organizations would then connect VPNs (Virtual Private Networks) directly to the mainframe platform. All of this is a very flat and potentially high-risk configuration.
As time advanced, larger enterprise organizations adopted multiple network segments and firewalls in an effort to separate application servers into unique environments. This was a step in the right direction, and it created a perception of high security. However, during this period most organizations were still storing sensitive data in the DMZ – some encrypted, some not – and that alone carried significant risk. In addition, many data transmissions used open FTP with data-level encryption: the payload was protected, but the credentials crossed the wire in the clear and were routinely compromised. What can happen when credentials are compromised is obvious enough that we won't detail it here.
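The credential problem above comes down to protocol choice. As a minimal sketch using Python's standard `ftplib`, the helper below (a hypothetical function, not from any product mentioned here) shows the difference: `FTP_TLS` negotiates TLS before login so the USER/PASS exchange is encrypted, while plain `FTP` sends credentials in cleartext.

```python
from ftplib import FTP, FTP_TLS

def make_ftp_client(secure: bool = True) -> FTP:
    """Build an FTP client.

    FTP_TLS performs an explicit TLS handshake (AUTH TLS) before
    login, so credentials are encrypted in transit. Plain FTP sends
    USER/PASS in the clear -- exactly how credentials get stolen.
    """
    if not secure:
        return FTP()    # legacy: credentials cross the wire unencrypted
    return FTP_TLS()    # control channel wrapped in TLS

# Typical usage (hypothetical host and credentials):
#   ftps = make_ftp_client()
#   ftps.connect("transfer.example.com", 21)
#   ftps.login("user", "secret")   # sent over TLS
#   ftps.prot_p()                  # encrypt the data channel as well
```

In practice, SFTP (SSH-based) is another common replacement; the point is simply that credentials and data should never travel over an open protocol.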
In the modern era of high security and heightened sensitivity to risk, environment layouts, configurations, and attributes should look much different. I'll outline some very important items of interest below.
- No data should be stored in the public DMZ (this includes server logs)
- Do not perform virus scanning in the public DMZ
- Open protocols should never be used when transferring data
- No application servers should be hosted in the public DMZ
- Utilize network security solutions in the DMZ that do not expose the inbound ports or IPs of LAN applications
- Adopt a network solution that pulls session data into the LAN rather than the traditional push method
- Utilize many network segments & firewalls
- Separate virus scanning/scrubbing environments from application servers
- Quarantine any data deemed at-risk, away from all critical components and data
- Deploy a DLP (Data Loss Prevention) solution in yet another layer, separate from critical data
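The pull-into-the-LAN, quarantine, and separation items above can be sketched in a few lines of Python, with local directories standing in for the zones. This is an illustration of the control flow only – the directory names and the `is_clean` scanner hook are hypothetical stand-ins, not a real DMZ product:

```python
import shutil
from pathlib import Path

def pull_clean_files(staging, lan_inbox, quarantine, is_clean):
    """Pull files from a DMZ staging area into the LAN.

    The LAN side initiates the transfer (a pull), so no inbound port
    into the LAN is ever opened from the DMZ. Files that fail the
    scan are diverted to a quarantine area, kept apart from all
    application data. Returns the names of the files pulled in.
    """
    pulled = []
    for f in sorted(Path(staging).iterdir()):
        if not f.is_file():
            continue
        clean = is_clean(f)                 # scanner runs outside the LAN
        dest = Path(lan_inbox) if clean else Path(quarantine)
        shutil.move(str(f), str(dest / f.name))
        if clean:
            pulled.append(f.name)
    return pulled
```

Note the design choice: the scanner verdict is taken while the file still sits in staging, so nothing deemed at-risk ever touches the LAN inbox, and the staging area is left empty after every pull.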
These are some basic ideas around multiple layers of security; there are many more you can apply. The primary takeaway is that application servers, storage components, and scanning mechanisms should be segregated as much as possible. That separation makes "cross-pollination" of data more difficult and, if a hacker does gain unauthorized access, limits their ability to move from one platform or zone to another.
Deploy applications and infrastructure with as many layers as possible to limit your exposure!