Wednesday, November 10, 2010

E-Mail Archiving for Compliance

E-mail archiving is a regulatory requirement in many industries. Preservation is a mandatory safeguard that creates a record of all employee communication. Several e-mail archiving regulations are summarized below.

Financial Industry Regulatory Authority and SEC
FINRA is the largest independent regulator of securities firms doing business in the US. The regulations for securities firms are strict and require the preservation of electronic business records for three to six years, depending on the nature of the communication. All messages must be stored in their original form on tamper-proof, non-modifiable, non-erasable media. The data must also be stored in multiple locations with timestamps and IDs for indexing and search. More about SEC and FINRA compliance requirements can be found on the respective agencies' websites.
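The tamper-evidence requirement is often met by storing a cryptographic digest of each message alongside a timestamp and ID, so that any later modification can be detected. A minimal sketch of the idea (the function and field names here are hypothetical, not part of any specific archiving product):

```python
import hashlib
from datetime import datetime, timezone

def archive_record(message_bytes, message_id):
    """Build a tamper-evident archive entry: the SHA-256 digest of the
    original message is stored with a UTC timestamp and an ID for
    indexing, search, and later integrity checks."""
    return {
        "id": message_id,
        "archived_at": datetime.now(timezone.utc).isoformat(),
        "sha256": hashlib.sha256(message_bytes).hexdigest(),
    }

def verify_record(message_bytes, record):
    """Re-hash the stored message and compare against the archived
    digest; a mismatch means the message changed after archiving."""
    return hashlib.sha256(message_bytes).hexdigest() == record["sha256"]
```

In practice the digests themselves must live on write-once (WORM) media so the index cannot be rewritten along with the messages.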

Health Insurance Portability and Accountability Act
HIPAA is a regulation for protecting health information. It contains numerous requirements, but the main goals are to secure data at all levels and to maintain a complete record of all communications for up to six years. Storing this data preserves a record in case of an audit or patient complaint.

E-mail archiving for compliance is a necessary function. If your organization is interested in learning more about a compliant e-mail archiving solution, NetSentry Live, please contact us directly at 1-888-50NETSENTRY.


Tuesday, October 19, 2010

How Does Sensitive Data Leave Your Network?

This post references the Verizon 2010 Data Breach Investigations Report

Malware

Malware factored into 38% of 2009 cases and 94% of all data lost. Malware is loaded onto a network by various methods that constantly evolve to keep pace with network security. Its main goal is to collect sensitive information without being discovered. Once collected, the data is used in many different ways; often it is sent out of the network back to the attacker. In such cases, a company with network monitoring software active could distinguish between valid and invalid outbound traffic.

Employee Misconduct

Employees play a major role in data leakage. Whether the motive is financial gain, directly harming the company, or both, it is prudent to monitor employee network usage. Even simple internet surfing by idle employees can lead to malware being installed on the corporate network. Records of websites visited, with exact reconstruction, are the first step in damage control if a breach occurs.

The Verizon report states, "We advocate paying attention to what goes out of your network and what changes take place within your systems." It adds that any periodic, odd-sized, trending, or otherwise suspicious outbound activity is grounds for investigation. This can be accomplished with many tools, but a simple traffic analyzer often will not capture the actual files that are sent out of the network.
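The report's advice to flag odd-sized outbound activity can be approximated with a simple statistical baseline. A sketch, assuming per-host outbound byte counts are already being collected by some capture tool (the function name and threshold are illustrative, not any product's actual logic):

```python
from statistics import mean, stdev

def flag_outliers(byte_counts, threshold=3.0):
    """Return outbound transfer sizes that deviate from the host's
    baseline by more than `threshold` standard deviations."""
    if len(byte_counts) < 2:
        return []  # not enough history to establish a baseline
    mu, sigma = mean(byte_counts), stdev(byte_counts)
    if sigma == 0:
        return []  # perfectly uniform traffic, nothing stands out
    return [b for b in byte_counts if abs(b - mu) > threshold * sigma]
```

A host that normally sends small requests but suddenly pushes out a multi-megabyte transfer would be flagged for investigation, which is exactly the "odd-sized outbound activity" pattern the report describes.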

NetSentry Live undetectably monitors network Internet traffic and captures, reconstructs, and stores original content in a searchable database. With its real-time alerts, NetSentry can identify who is behind suspicious or malicious Internet activity on your network and when it happens. Never before has a network monitoring and forensics tool been so powerful and easy to use. Adding NetSentry Live to your DLP strategy gives your business a best-in-class tool that produces complete evidence when a leak happens.

Guardian Digital Forensics Hires Don Gilman as Vice President


Raleigh, NC – Guardian Digital Forensics, a Raleigh-based digital forensics consulting company, is pleased to announce the addition of a new member to its NetSentry management team: Vice President of Market Development, Don Gilman. NetSentry is proud to have Gilman on board and is excited to draw on his expertise and experience to expand our market reach.

Gilman brings with him over twenty years of management and leadership experience coupled with a consulting background. He enjoys the art of building relationships and tackling difficult challenges, which is why he has a reputation for developing innovative solutions to business problems.

He has recently served as Ambassador for the North Carolina Technology Association and as VP of Strategy and Director of Service Delivery for the North Carolina Chapter of the Project Management Institute. He also holds the following certifications: Certified Project Manager in Construction, Certified Project Manager in Information Technology, Certified Information Systems Auditor from ISACA, and Certified LEAN Six Sigma Green Belt.

Friday, October 8, 2010

Internal Threats in Data Breaches

This post references the Verizon 2010 Data Breach Investigations Report.

The three origins of data breaches are external, internal, and partner agents. These origins are not exclusive; a single breach can involve one or multiple agents. This post will focus on the internal threat and include recommendations to help protect against internal attacks. In 2009, the role of insider agents in data breaches doubled compared to 2008 figures. This increase is partly due to an influx of insider cases from the U.S. Secret Service, but it also reflects the constant insider threat in cases analyzed by Verizon.

Internal agents act with different motives and possess varying levels of skill. What 90% of internal agents share is that they act deliberately. Only 4% of insiders who contribute to a data breach act unintentionally; the other 6% are insiders whose inappropriate behavior leads to a breach. The report states that employees who commit data theft have more than likely been cited for network misuse in the past. Should corporations more closely monitor employees with network misuse infractions? It is a preventative measure far less costly than mitigating a data breach.

Regular employees account for 51% of internal data breaches, with personnel who handle cash on a regular basis noted as the main culprits. Finance and accounting staff account for 12% of internal breaches, as they have access to corporate accounts and financial data. Systems and network administrators also accounted for 12% of breaches, executives for 7%, helpdesk staff 4%, software developers 3%, auditors 1%, and unknown 9%. Access to confidential data, especially without necessity, can lead to breaches. Unnecessarily high user IT privileges contribute to many of them and should be more closely monitored.

Below are some tips, drawn from the report, that will help secure internal networks.

  1. Use network surveillance software with packet-reconstruction and forensic analysis capabilities for quick user identification, evidence collection, and prosecution requirements in case of a breach.
  2. Develop a corporate-wide "Breach Plan" covering personnel, time frames, and budget considerations.
  3. Distribute an internet usage policy with clearly defined rules for all possible infractions.
  4. Review user IT privileges quarterly to prevent unnecessary access, a possible motivator for data theft.
  5. Monitor all top-level users with access to valuable corporate information.
  6. Monitor all employees with a certain level of network misuse infractions.
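The surveillance-and-alerting approach in tip 1 can be sketched as a simple rule match over reconstructed outbound content. All names and watch lists below are hypothetical illustrations, not any product's actual rule syntax:

```python
# Hypothetical watch lists for outbound-traffic alerting.
ALERT_KEYWORDS = {"confidential", "ssn", "account number"}
ALERT_EXTENSIONS = {".xls", ".db", ".pst"}

def check_outbound(user, filename, text):
    """Return (user, reason) alert tuples for one reconstructed
    outbound transfer: keyword hits in the content plus any
    watched file extension on the attachment name."""
    reasons = []
    lower = text.lower()
    for kw in sorted(ALERT_KEYWORDS):
        if kw in lower:
            reasons.append(f"keyword: {kw}")
    if any(filename.lower().endswith(ext) for ext in ALERT_EXTENSIONS):
        reasons.append(f"extension: {filename}")
    return [(user, r) for r in reasons]
```

Because each alert carries the user who triggered it, this kind of rule feeds directly into tips 5 and 6: the same match logic can be scoped to privileged users or to employees with prior infractions.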

Monday, October 4, 2010

External Threats in Data Breaches

This post references the Verizon 2010 Data Breach Investigations Report.

From 2004-2009, over 87% of the approximately 919 million compromised data records were attributable to external threats. I agree with the authors of the report that this is one of the most powerful statistics in the paper. The harm caused by external threats is clearly the most costly to organizations. The more valuable an organization's information, the stronger its network security has to be, plain and simple. In 2009, external threats accounted for 70% of breaches and 98% of records; internal threats accounted for 48% of breaches and 3% of records.


The types of external threats and their share of breaches are organized crime (24%), unaffiliated person(s) (21%), external systems (3%), activist groups (2%), former employees (2%), other organizations (1%), competitors (1%), customers (1%), and unknown (45%). Organized crime is the largest identified threat agent. This is not unusual, as organized criminals, located all over the world, have the resources to infiltrate networks and extract valuable data. Geographically, 21% of external breaches originate from Eastern Europe, including Russia; North America accounts for 19%, East Asia for 18%, and unknown origins for 31%. Interestingly, in 2009 Verizon cases, East Asia rose to the top spot for external breach origination, and a majority of the unknown originations are suspected of coming from East Asia.

The unknowns in both external threat type and origination result from breach victims not pursuing who or where their attack came from, either as a purely financial decision or because the attacker cannot be identified. Most breach cases handled by the US Secret Service have a determined suspect and origination because they proceed to prosecution.


What does it all mean? Cyber-crime is not new. The number of external agents with the knowledge to access sensitive corporate data on "secure" networks is not shrinking. The constant battle to develop secure applications as old security is compromised will not end. Corporations need to be vigilant and consistently monitor network security and procedures to stay ahead of external threats.

Wednesday, September 29, 2010

Does Organizational Size Determine Data Breach Frequency?

The table discussed in this post is found on page 11 of the Verizon 2010 Data Breach Investigations Report.

According to the table, the highest percentage of data breaches occurs in firms of 1,001 to 10,000 employees. The report makes a key point: it is possible, and logical, that organizational size has little effect on the likelihood of suffering a data breach. The probability of a breach correlates more with the value of a firm's information than with its number of employees. Surprisingly, firms with over 100,000 employees have the lowest percentage of data breaches. I infer that these massive corporations put an incredible amount of resources into data loss prevention, maintenance, and corporate technology policies, which drives their low breach rates. Similarly, a smaller, 1-10 employee firm with little valuable data has neither the need nor the resources for an expensive data loss prevention strategy. The conclusion, as the report states, is that the size of the organization has a minor impact on the rate of data breaches. The real driver of data leakage is the type of information stored within the network. If your organization has valuable information, it is prudent to develop a secure Data Loss Prevention strategy.

Monday, September 27, 2010

Verizon 2010 Data Breach Investigations Report Discussion

 Verizon 2010 Data Breach Report PDF


The 2010 Data Breach Investigations Report is an analysis of data from actual breach cases worked on by Verizon and the US Secret Service. The results of the report should be required reading for all IT professionals as well as anyone interested in the current state of corporate data breaches, IT security and cyber-crime.

The 2010 dataset includes 141 breach cases worked in 2009 by Verizon and the US Secret Service, covering over 143 million compromised data records. The sheer volume provides a solid set for analysis.
Over the next few weeks, in related posts, we will discuss some of the report's main points.

Demographics of Data Breaches

Of the 141 confirmed cases, the top three industries by breach incidents are Financial Services, Hospitality, and Retail. Not surprisingly, 94% of all compromised records were attributed to Financial Services; the report concludes that the financial services industry holds both the highest-value information and the largest volume of it. The Hospitality and Retail industries are increasingly targeted because of their point-of-sale systems and consumers' reliance on payment cards. The number of breaches in these industries will only increase as more electronic data is transacted and stored, so it is up to them to adopt more stringent policies for protecting their customers' data.

Geographically, the U.S. has the highest reported incidences of data breaches. It is not surprising given the vast amount of international and domestic financial transactions taking place daily in the U.S. The authors of the report also highlight a key reason for the reported U.S. cases:
“The reason we hear more about data breaches in the U.S. stems from mandatory disclosure laws. Outside the U.S., breach disclosure differs significantly. Some countries are still silent on the matter, others encourage it but don’t require it, and some even discourage disclosure. (9)”
The report notes that in the past two years the international caseload has increased consistently in Asian-Pacific and Western-European countries.

U.S. businesses are responsible for large amounts of confidential information. As the report points out, much data breach risk can be removed by establishing usage policies that are pro-active and constantly maintained. This requires constant support by each person in the organization, as a chain is only as strong as its weakest link.

All information sourced from  Verizon 2010 Data Breach Report PDF

Thursday, September 16, 2010

Website Redesign Completed with Test Drive Function!

We are excited to announce the completion of our website redesign. We added some major features in order to better showcase the power of NetSentry Live. Our favorite addition is the ability to “Test Drive” NetSentry Live without an install, directly from our home page. It runs in a browser, and you have full access to all functions of the retail software. This is an incredible opportunity to discover NetSentry Live without committing to a trial.

The potential of the program is limitless, but at first it might seem overwhelming. We recommend you start by downloading a few chat, email or web traffic logs. NetSentry Live is a workhorse and fully reconstructs all packets into original content, as the user viewed it. Whether it is a confidential document, internet video, illicit image or FTP transfer, everything is visible to NetSentry Live.

After downloading logs, move on to the alerts and reports tab, where the real results can be seen. Alerts are invaluable spies on your network, constantly watching for keywords, file names, extensions or specific user activity. For a company worried about network activity on any level, the combination of alerts and reports creates a powerful virtual IT department. But don’t just take our word for it. Visit http://www.netsentry.us/trial/test-drive.html and follow the directions to explore this productivity enhancing, cost cutting tool.

We already know how powerful NetSentry Live is and we want to share its power with you. After test driving the product we invite you to watch the video with our Chief Forensics Examiner, Larry Daniel, and finally download the full licensed trial version of our software. The download can be completed directly from our home page.

Friday, August 27, 2010

Cost Effective DLP Strategies for Small Businesses

According to Osterman Research in their "Messaging Policy Management Trends Report" from 2007-2010, "current policies designed to protect organizations against the leakage of sensitive information are not considered effective by the majority of organizations.” In a world where 31% of data leak incidents are from viruses, and 28% come from insider abuse (Deloitte Global Security Survey), finding a Data Loss Prevention (DLP) strategy that is actually effective is paramount in keeping a business running.

For large businesses, a dedicated IT department is often a viable option to contain potential data leaks, but this form of DLP can be both expensive and cumbersome. To be effective, a DLP strategy needs to identify high-risk data leak points, capture data accurately, and protect all data transmitted by a company, not just that deemed "sensitive". For a small business, this is an especially difficult challenge, as it often does not possess the resources to hire full-time tech staff.

One option for small businesses (SMBs) to deal with data loss is the addition of hardware or a software program intended to capture data that is sent out, but there is a risk of failure or incompatibility with certain systems. Another, often more effective, DLP method is to employ a separate, non-site-based security option.

There are now a number of DLP solutions on the market that allow off-site monitoring and reconstruction of all data out of and into a company; products that can scale with data use and provide greater functionality for SMBs.

Monday, August 23, 2010

Decrease the Risks of Cloud Computing: Develop A Data Capture Strategy


Cloud computing - also known as "grid computing" - involves the use of a third party's systems or servers to store, access, and manipulate data. The New York Times recently used Google Apps to create a searchable PDF database of old issues, something that would have taken years to accomplish using its own systems. Google Apps did it in a day.

However, while the benefits of the cloud can be concrete, security issues abound. On July 15th of 2009, a hacker broke into an e-mail account belonging to a member of Twitter’s staff. Once inside the account, he was able to view a significant amount of password data which was stored on a cloud computing system, in this case Google Apps.

Any time data is stored or compiled off-site by a third party, a company partially loses control of its data, and a number of specific concerns arise. First, who is accessing your data and why? Who are the administrators of the cloud, and what are their qualifications? How secure is the data itself? Second, just where is your data going? Is it stored locally? Internationally?

One way for a company to proactively protect its interests is with a data capture strategy. By knowing what data is accessed and collecting it locally using a separate capture program that does not depend on added hardware or software, a company can both track its data use and plug potential leaks before they occur.

Thursday, August 19, 2010

Larry E Daniel Tapped As Expert

Larry E. Daniel, CEO of Guardian Digital Forensics, has been tapped as an expert digital forensics consultant by AIT Inc. for a lawsuit against Dell Computer.
"Executives for A.I.T., which hosts websites and manages web domains, declined comment. According to its lawsuit, A.I.T. leased the Dell computers in 2003 and 2004 and says they were prone to overheating, crashing and losing data. The company says Dell stopped honoring its warranty, leaving A.I.T. and its customers stranded.
A.I.T. is now seeking to force Dell to release internal memos that A.I.T. thinks will show that senior executives are hiding information about their role in downplaying the computer malfunctions. A.I.T. says Dell computers were subject to nearly a 100 percent failure rate, but Dell says the failures affected about 22 percent.
In the latest round of filings, A.I.T. says Dell continues to engage in a coordinated campaign of deception. As evidence it cites a 2005 memo from Lynn Tyson, then Dell's vice president for investor relations, in which Tyson says the computer glitch poses no risk of safety or data loss.
"A.I.T. suspected the document in question might have been altered in some way and even went so far to retain a digital forensic consultant to examine the document," A.I.T.'s court filing says."


NetSentry is a division of Guardian Digital Forensics.



Wednesday, August 11, 2010

Does Your Compliance Solution Include DLP Best Practices?


According to a recent study conducted by the Ponemon Institute, a typical data breach costs a company $128 per document, averaging out to almost $5 million per incident. In short, big numbers for what appear to be small incidents.
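Taken together, those two figures imply that the average incident in the study involved on the order of 39,000 documents. The arithmetic:

```python
# Figures from the Ponemon study cited above (USD).
cost_per_document = 128
total_cost_per_incident = 5_000_000  # approximate average

# Implied number of documents per average incident.
implied_documents = total_cost_per_incident / cost_per_document
print(implied_documents)  # roughly 39,000 documents
```

That scale is why even a single "small" incident rarely stays small on the balance sheet.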

Often, this type of data loss does not begin as malicious; it may simply result from user error or system difficulties. Consider the case of Clarkson University. In 2008, a glitch on the public drive of the University's internal file server allowed everyone on campus access to the personal information and Social Security numbers of all students in the database. The data leak was caught and rectified almost immediately thanks to an honest student who gained access and reported it.

Incidents like these, in combination with the high costs of aggregate data loss, speak to the need for ensuring that a data loss prevention solution follows what are known as industry best practices. These include defining DLP needs and setting a focus for a DLP program, as well as making any DLP solution both unobtrusive and comprehensive.

In order to achieve compliance with these best practice guidelines, many companies have turned to network security systems that not only operate without the need for added hardware or software, but that can also actually reconstruct data that has been transmitted, rather than simply select random files for further analysis. This recording and recompiling of data allows businesses to better address the volume and nature of their data loss.

Wednesday, August 4, 2010

Social Media's Impact on Network Security

A 2009 FaceTime survey shows that 62% of IT professionals believe social networking sites are being accessed on the networks they maintain. In reality, however, 100% of businesses surveyed had evidence of social networking site use.

While companies have a vested interest in how much time employees spend on social networking sites, only 24% are "extremely concerned" that it will damage efficiency. In fact, the amount of time taken to access these applications is often balanced out by the potential productivity allowed by the interconnectivity of a Web 2.0 world.

Although many IT departments have begun monitoring and recording the usage of these sites on their network, only 49% - down 2% from 2008 - could produce documents if asked to do so by management, and over 60% of IT professionals still view incoming Malware as their biggest network threat.

In many cases, companies do not consider outbound data traffic a potential problem. Should a user choose to post a sensitive file, conversation, or other piece of company information on a social networking website, it can never be fully removed, and can be almost instantly copied to other sites. This creates the potential for a serious network breach simply based on the way social networking sites are viewed and operated. As a result, companies are finding themselves needing to create social networking usage rules and standards, and examining methods for tracking and retaining data sent out over a network.

The Cost of Data Leaks

Many companies view the issue of data leaks as a minor annoyance - one that is either over-reported due to media sensationalism or simply an alarmist concern by IT departments trying to pad their budgets.

In reality, the cost of data leaks can be damaging to both a company's profitability and its reputation. For example, consider the situation the Obama Administration found itself in after 2008 documents showing a link between Pakistan's intelligence agency and Taliban insurgents were leaked. The source of the leak was a website known for publishing classified military information - apparently with little difficulty, considering nearly 90,000 documents were recently posted - forcing the government to deal with the issue head-on.

Or take the example of DuPont, where an employee stole and leaked over $400 million worth of data. Over the course of four months, the thief accessed over 15 times as many documents as other employees, but this was never questioned or even detected until long after he left the company.

In order to properly address these kinds of issues, a company must act pre-emptively to both monitor and track data movement. This can be done in several ways, and new methods have now been developed which allow the tracking, storage and reconstruction of accessed and transmitted data on PCs without the need for added hardware or software. Used properly, options such as these can aid companies in catching data leaks before they start as well as in protecting both their finances and their reputation.