Friday, August 27, 2010

Cost Effective DLP Strategies for Small Businesses

According to Osterman Research's "Messaging Policy Management Trends Report" (2007-2010), "current policies designed to protect organizations against the leakage of sensitive information are not considered effective by the majority of organizations." In a world where 31% of data leak incidents come from viruses and 28% from insider abuse (Deloitte Global Security Survey), finding a Data Loss Prevention (DLP) strategy that is actually effective is paramount to keeping a business running.

For large businesses, a dedicated IT department is often a viable way to contain potential data leaks, but this form of DLP can be both expensive and cumbersome. To be effective, a DLP solution needs to identify high-risk data leak points, capture data accurately, and protect all data transmitted by a company, not just data deemed "sensitive". For a small business, this is an especially difficult challenge, as small businesses often do not possess the resources to hire full-time tech staff.

One option for small businesses (SMBs) dealing with data loss is to add hardware or a software program intended to capture outbound data, but this carries a risk of failure or incompatibility with certain systems. Another, often more effective, DLP method is to employ a separate, non-site-based security option.

There are now a number of DLP solutions on the market that allow off-site monitoring and reconstruction of all data flowing into and out of a company - products that can scale with data use and provide greater functionality for SMBs.

Monday, August 23, 2010

Decrease the Risks of Cloud Computing: Develop a Data Capture Strategy


Cloud computing - also known as "grid computing" - involves the use of a third party's systems or servers to store, access, and manipulate data. The New York Times recently used Amazon's EC2 cloud service to convert its archive of old issues into a searchable PDF database, something that would have taken years to accomplish on its own systems. The cloud did it in about a day.

However, while the benefits of the cloud can be concrete, security issues abound. On July 15, 2009, a hacker broke into an e-mail account belonging to a member of Twitter's staff. Once inside the account, he was able to view a significant amount of password data stored on a cloud computing system - in this case, Google Apps.

Any time data is stored or compiled off-site by a third party, a company partially loses control of its data, and a number of specific concerns arise. First, who is accessing your data, and why? Who are the administrators of the cloud, and what are their qualifications? How secure is the data itself? Second, where exactly is your data going? Is it stored locally? Internationally?

One way for a company to proactively protect its interests is with a data capture strategy. By knowing what data is accessed and collecting it at the local level - using a capture method that does not depend on added hardware or software on each machine - a company can both track its data use and plug potential leaks before they occur. A rough sketch of the idea follows.
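As an illustration only - the post names no specific tool - here is a minimal sketch of local egress logging using the scapy packet-capture library. The subnet prefix and log file are hypothetical placeholders, and sniffing requires administrator privileges.

    # A minimal sketch of egress capture, assuming the scapy library
    # (pip install scapy) and root privileges for sniffing.
    from scapy.all import sniff, IP

    LOCAL_NET = "10.0.0."   # hypothetical internal subnet prefix

    def log_outbound(pkt):
        """Record the destination and size of packets leaving the local network."""
        if IP in pkt and pkt[IP].src.startswith(LOCAL_NET) \
                and not pkt[IP].dst.startswith(LOCAL_NET):
            with open("egress.log", "a") as log:
                log.write(f"{pkt[IP].src} -> {pkt[IP].dst} ({len(pkt)} bytes)\n")

    # Capture outbound TCP traffic; store=0 avoids buffering packets in memory.
    sniff(filter="tcp", prn=log_outbound, store=0)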

Thursday, August 19, 2010

Larry E. Daniel Tapped as Expert

Larry E. Daniel, CEO of Guardian Digital Forensics, has been tapped as an expert digital forensics consultant by A.I.T. Inc. in a lawsuit against Dell Computer.
"Executives for A.I.T., which hosts websites and manages web domains, declined comment. According to its lawsuit, A.I.T. leased the Dell computers in 2003 and 2004 and says they were prone to overheating, crashing and losing data. The company says Dell stopped honoring its warranty, leaving A.I.T. and its customers stranded.
A.I.T. is now seeking to force Dell to release internal memos that A.I.T. thinks will show that senior executives are hiding information about their role in downplaying the computer malfunctions. A.I.T. says Dell computers were subject to nearly a 100 percent failure rate, but Dell says the failures affected about 22 percent.
In the latest round of filings, A.I.T. says Dell continues to engage in a coordinated campaign of deception. As evidence it cites a 2005 memo from Lynn Tyson, then Dell's vice president for investor relations, in which Tyson says the computer glitch poses no risk of safety or data loss.
"A.I.T. suspected the document in question might have been altered in some way and even went so far to retain a digital forensic consultant to examine the document," A.I.T.'s court filing says."


NetSentry is a division of Guardian Digital Forensics.



Wednesday, August 11, 2010

Does Your Compliance Solution Include DLP Best Practices?


According to a recent study conducted by the Ponemon Institute, a typical data breach costs a company $128 per compromised document, averaging out to almost $5 million in total per incident. In short, big numbers for what appear to be small incidents.
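Taking the study's figures at face value, the arithmetic implies that a "typical" incident involves tens of thousands of documents:

    # Back-of-the-envelope math using the Ponemon figures quoted above.
    cost_per_document = 128          # dollars
    cost_per_incident = 5_000_000    # dollars, approximate average
    print(f"{cost_per_incident / cost_per_document:,.0f} documents per incident")  # ~39,062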

Often, this type of data loss does not begin as something malicious - it may simply be the result of user error or system difficulties. Consider the case of Clarkson University. In 2008, a glitch on the public drive of the University's internal file server gave everyone on campus access to the personal information and social security numbers of all students in the database. The data leak was caught and rectified almost immediately, thanks to an honest student who found the files and reported them.

Incidents like these, in combination with the high costs of aggregate data loss, speak to the need for ensuring that a data loss prevention solution follows what are known as industry best practices. These include defining DLP needs and setting a focus for a DLP program, as well as making any DLP solution both unobtrusive and comprehensive.

In order to achieve compliance with these best practice guidelines, many companies have turned to network security systems that not only operate without the need for added hardware or software, but can actually reconstruct data that has been transmitted, rather than simply selecting random files for further analysis. This recording and recompiling of data allows businesses to better address the volume and nature of their data loss; one form such analysis might take is sketched below.
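The post names no particular product, so the following is only an illustrative sketch of the kind of pattern scan such a system might run over reconstructed payloads; the patterns and the sample input are hypothetical.

    # Illustrative only: scanning reconstructed payloads for sensitive patterns.
    import re

    PATTERNS = {
        "ssn":         re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
        "credit_card": re.compile(r"\b(?:\d{4}[- ]?){3}\d{4}\b"),
    }

    def scan_payload(payload):
        """Return the names of any sensitive patterns found in a payload."""
        return [name for name, rx in PATTERNS.items() if rx.search(payload)]

    sample = "Employee record: 123-45-6789, card 4111-1111-1111-1111"
    print(scan_payload(sample))   # ['ssn', 'credit_card']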

Wednesday, August 4, 2010

Social Media's Impact on Network Security

A 2009 FaceTime survey showed that 62% of IT professionals believe social networking sites are being accessed on the networks they maintain. In reality, however, 100% of the businesses surveyed had evidence of social networking site use.

While companies have a vested interest in how much time employees spend on social networking sites, only 24% are "extremely concerned" that such use will damage efficiency. In fact, the time spent on these applications is often balanced out by the productivity gains made possible by the interconnectivity of a Web 2.0 world.

Although many IT departments have begun monitoring and recording the usage of these sites on their networks, only 49% - down 2% from 2008 - could produce documentation of that usage if asked to do so by management, and over 60% of IT professionals still view incoming malware as their biggest network threat.

In many cases, companies do not consider outbound data traffic a potential problem. Should a user choose to post a sensitive file, conversation, or other piece of company information on a social networking website, it can never be fully removed, and it can be almost instantly copied to other sites. This creates the potential for a serious network breach simply because of the way social networking sites are viewed and operated. As a result, companies are finding they need to create social networking usage rules and standards and to examine methods for tracking and retaining data sent out over the network; one starting point is sketched below.
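One low-cost starting point - purely an illustration, since the survey describes no specific method - is to mine the web-proxy logs most networks already keep. The log format ("user domain" per line) and the domain list here are assumptions.

    # A rough sketch: tallying social-network requests per user from a proxy log.
    from collections import Counter

    SOCIAL_DOMAINS = {"facebook.com", "twitter.com", "myspace.com"}

    def tally_social_use(log_path):
        counts = Counter()
        with open(log_path) as log:
            for line in log:
                user, domain = line.split()[:2]
                if any(domain.endswith(d) for d in SOCIAL_DOMAINS):
                    counts[user] += 1
        return counts

    # tally_social_use("proxy.log").most_common(10) would list the ten
    # heaviest social-network users on the network.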

The Cost of Data Leaks

Many companies view the issue of data leaks as a minor annoyance - one that is either over-reported due to media sensationalism or simply an alarmist concern by IT departments trying to pad their budgets.

In reality, the cost of data leaks can damage both a company's profitability and its reputation. For example, consider the situation the Obama Administration found itself in after documents dating to 2008 surfaced showing a link between Pakistan's intelligence agency and Taliban insurgents. The source of the leak was a website known for publishing classified military information - apparently with little difficulty, considering nearly 90,000 documents were recently posted - forcing the government to deal with the issue head-on.

Or take the example of DuPont, where an employee stole and leaked over $400 million worth of data. Over the course of four months, the thief accessed over 15 times as many documents as other employees, yet this was never questioned, or even detected, until long after he left the company.
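The DuPont case suggests an obvious control: compare each user's document-access volume against that of their peers. The sketch below is illustrative only; the 15x threshold mirrors the figure above, and access_counts is a hypothetical mapping of users to documents accessed in a given period.

    # Flag users whose document-access counts far exceed their peers'.
    from statistics import median

    def flag_outliers(access_counts, factor=15.0):
        """Return users whose access volume exceeds factor x the median."""
        typical = median(access_counts.values())
        return [user for user, count in access_counts.items()
                if count > factor * typical]

    access_counts = {"alice": 120, "bob": 95, "carol": 110, "mallory": 2300}
    print(flag_outliers(access_counts))   # ['mallory']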

In order to properly address these kinds of issues, a company must act preemptively to monitor and track data movement. This can be done in several ways, and new methods have been developed that allow the tracking, storage, and reconstruction of data accessed and transmitted on PCs without the need for added hardware or software. Used properly, such options can help companies catch data leaks before they start and protect both their finances and their reputations.