Thursday, September 16, 2010

Website Redesign Completed with Test Drive Function!

We are excited to announce the completion of our website redesign. We added several major features to better showcase the power of NetSentry Live. Our favorite addition is the ability to “Test Drive” NetSentry Live directly from our home page, with no installation required. It runs in a browser and gives you full access to every function of the retail software. This is an incredible opportunity to discover NetSentry Live without committing to a trial.

The potential of the program is limitless, but at first it might seem overwhelming. We recommend you start by downloading a few chat, email or web traffic logs. NetSentry Live is a workhorse: it fully reconstructs captured packets into the original content, exactly as the user viewed it. Whether it is a confidential document, an internet video, an illicit image or an FTP transfer, everything is visible to NetSentry Live.

After downloading logs, move on to the alerts and reports tab, where the real results can be seen. Alerts are invaluable spies on your network, constantly watching for keywords, file names, extensions or specific user activity. For a company worried about network activity on any level, the combination of alerts and reports creates a powerful virtual IT department. But don’t just take our word for it. Visit http://www.netsentry.us/trial/test-drive.html and follow the directions to explore this productivity-enhancing, cost-cutting tool.
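
To make the idea of keyword alerts more concrete, here is a minimal, hypothetical Python sketch of the general technique - watching unencrypted traffic for terms of interest - written with the scapy library. It is not NetSentry Live code; the watch list, port and required privileges are assumptions for illustration only.

    # Generic illustration of keyword alerting on unencrypted web traffic.
    # Not NetSentry Live code; the watch list below is a placeholder.
    from scapy.all import sniff, IP, TCP, Raw

    KEYWORDS = [b"confidential", b"payroll"]  # hypothetical terms to watch for

    def check_packet(pkt):
        # Only inspect TCP packets that actually carry a payload.
        if pkt.haslayer(IP) and pkt.haslayer(TCP) and pkt.haslayer(Raw):
            payload = bytes(pkt[Raw].load).lower()
            for word in KEYWORDS:
                if word in payload:
                    print("ALERT: %r seen in traffic from %s" % (word, pkt[IP].src))

    # Watch plain-text web traffic; requires root/administrator privileges.
    sniff(filter="tcp port 80", prn=check_packet, store=False)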

We already know how powerful NetSentry Live is, and we want to share that power with you. After test-driving the product, we invite you to watch the video featuring our Chief Forensics Examiner, Larry E. Daniel, and then download the fully licensed trial version of our software. The download can be completed directly from our home page.

Friday, August 27, 2010

Cost-Effective DLP Strategies for Small Businesses

According to Osterman Research's "Messaging Policy Management Trends Report" covering 2007-2010, "current policies designed to protect organizations against the leakage of sensitive information are not considered effective by the majority of organizations." In a world where 31% of data leak incidents stem from viruses and 28% come from insider abuse (Deloitte Global Security Survey), finding a Data Loss Prevention (DLP) strategy that is actually effective is paramount to keeping a business running.

For large businesses, a dedicated IT department is often a viable way to contain potential data leaks, but this form of DLP can be both expensive and cumbersome. For DLP to be effective, it needs to identify high-risk data leak points, capture data accurately, and protect all data transmitted by a company, not just the data deemed "sensitive". For a small business, this is an especially difficult challenge, as small businesses often do not possess the resources to hire full-time tech staff.

One option for small businesses (SMBs) dealing with data loss is to add hardware or a software program intended to capture outgoing data, but this approach carries a risk of failure or incompatibility with certain systems. Another, often more effective, DLP method is to employ a separate, non-site-based security option.

There are now a number of DLP solutions on the market that allow off-site monitoring and reconstruction of all data flowing into and out of a company - products that can scale with data use and provide greater functionality for SMBs.

Monday, August 23, 2010

Decrease the Risks of Cloud Computing: Develop a Data Capture Strategy


Cloud computing - also known as "grid computing" - involves the use of a third party's systems or servers to store, access, and manipulate data. The New York Times recently used Google Apps to create a searchable PDF database of old issues, something that would have taken years to accomplish using its own systems. Google Apps did it in a day.

However, while the benefits of the cloud can be concrete, security issues abound. On July 15, 2009, a hacker broke into an e-mail account belonging to a member of Twitter’s staff. Once inside the account, he was able to view a significant amount of password data that was stored on a cloud computing system - in this case, Google Apps.

Any time data is stored or compiled off-site by a third party, a company's control of its data is partially lost, and a number of specific concerns arise. First is the question of simply who is accessing your data and why - who are the administrators of the cloud, and what are their qualifications? How secure is the data itself? Second, just where is your data going? Is it stored locally? Internationally?

One way for a company to proactively protect its interests is with a data capture strategy. By knowing what data is accessed and collecting it at the local level - using a separate capture solution that does not depend on added hardware or client-side software - a company can both track its data use and plug potential leaks before they occur.
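
As a rough illustration of the "capture locally, review later" idea - not a description of any particular product - the following Python sketch uses the scapy library to record traffic leaving a private address range into a pcap file for later review. The interface name, address range and capture window are assumptions.

    # Hypothetical sketch: record traffic leaving the local network for later review.
    # Interface, address range and duration are placeholders, not product settings.
    from scapy.all import sniff, wrpcap

    # BPF filter: packets sourced inside the private range but destined outside it.
    OUTBOUND = "src net 192.168.0.0/16 and not dst net 192.168.0.0/16"

    packets = sniff(iface="eth0", filter=OUTBOUND, timeout=300)  # capture for 5 minutes
    wrpcap("outbound-sample.pcap", packets)                      # archive for later review
    print("captured %d outbound packets" % len(packets))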

Thursday, August 19, 2010

Larry E. Daniel Tapped as Expert

Larry E. Daniel, CEO of Guardian Digital Forensics, has been tapped as an expert digital forensics consultant by AIT Inc. for its lawsuit against Dell Computer.
"Executives for A.I.T., which hosts websites and manages web domains, declined comment. According to its lawsuit, A.I.T. leased the Dell computers in 2003 and 2004 and says they were prone to overheating, crashing and losing data. The company says Dell stopped honoring its warranty, leaving A.I.T. and its customers stranded.
A.I.T. is now seeking to force Dell to release internal memos that A.I.T. thinks will show that senior executives are hiding information about their role in downplaying the computer malfunctions. A.I.T. says Dell computers were subject to nearly a 100 percent failure rate, but Dell says the failures affected about 22 percent.
In the latest round of filings, A.I.T. says Dell continues to engage in a coordinated campaign of deception. As evidence it cites a 2005 memo from Lynn Tyson, then Dell's vice president for investor relations, in which Tyson says the computer glitch poses no risk of safety or data loss.
"A.I.T. suspected the document in question might have been altered in some way and even went so far to retain a digital forensic consultant to examine the document," A.I.T.'s court filing says."


NetSentry is a division of Guardian Digital Forensics.



Wednesday, August 11, 2010

Does Your Compliance Solution Include DLP Best Practices?


According to a recent study conducted by the Ponemon Institute, a typical data breach costs a company $128 per compromised document, which averages out to almost $5 million in total per incident - implying roughly 39,000 exposed documents in a typical breach. In short, big numbers for what appear to be small incidents.

Often, this type of data loss does not begin as malicious - it may simply be the result of user error or system difficulties. Consider the case of Clarkson University. In 2008, a glitch on the public drive of the University's internal file server gave everyone on campus access to the personal information and Social Security numbers of all students in the database. The data leak was caught and rectified almost immediately, thanks to an honest student who discovered the exposure and reported it.

Incidents like these, in combination with the high costs of aggregate data loss, speak to the need for ensuring that a data loss prevention solution follows what are known as industry best practices. These include defining DLP needs and setting a focus for a DLP program, as well as making any DLP solution both unobtrusive and comprehensive.

To achieve compliance with these best practice guidelines, many companies have turned to network security systems that not only operate without added hardware or software, but can also reconstruct data that has been transmitted rather than simply selecting random files for further analysis. This recording and recompiling of data allows businesses to better address the volume and nature of their data loss.
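
As a loose illustration of the difference between sampling files and reconstructing transmitted data - again a generic sketch, not any vendor's engine - the following Python snippet reads a capture file with scapy and stitches each TCP session's payload back together. The capture file name is an assumption, and the concatenation is naive (a real system would reorder segments by TCP sequence number).

    # Hypothetical sketch: rebuild the payload of each TCP session in a capture
    # instead of sampling individual files. The file name is a placeholder.
    from scapy.all import rdpcap, Raw

    packets = rdpcap("outbound-sample.pcap")  # pcap gathered by earlier monitoring

    # Group packets into sessions, then join their payloads back together.
    # Naive concatenation: a production tool would reorder by TCP sequence numbers.
    for session_id, session_packets in packets.sessions().items():
        data = b"".join(bytes(p[Raw].load) for p in session_packets if p.haslayer(Raw))
        if data:
            print(session_id, len(data), "bytes reassembled")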

Wednesday, August 4, 2010

Social Media's Impact on Network Security

A 2009 FaceTime survey shows that 62% of IT professionals believe social networking sites are being accessed on the networks they maintain. In reality, however, 100% of the businesses surveyed had evidence of social networking site use.

While companies have a vested interest in how much time employees spend on social networking sites, only 24% are "extremely concerned" that it will damage efficiency. In fact, the time spent accessing these applications is often balanced out by the potential productivity afforded by the interconnectivity of a Web 2.0 world.

Although many IT departments have begun monitoring and recording the usage of these sites on their networks, only 49% - down 2% from 2008 - could produce documentation if asked to do so by management, and over 60% of IT professionals still view incoming malware as their biggest network threat.

In many cases, companies do not consider outbound data traffic a potential problem. Should a user choose to post a sensitive file, conversation, or other piece of company information on a social networking website, it can never be fully removed and can be copied to other sites almost instantly. This creates the potential for a serious network breach simply because of the way social networking sites are viewed and operated. As a result, companies are finding that they need to create social networking usage rules and standards and to examine methods for tracking and retaining data sent out over the network.
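
For readers curious what such tracking can look like in practice, here is one more hypothetical Python sketch using scapy: it flags unencrypted web requests whose Host header matches a watch list of social networking domains, so the outbound content can be retained for review. The domain list and port are illustrative assumptions, and encrypted (HTTPS) traffic would not be visible to this simple approach.

    # Hypothetical sketch: flag plain-text web requests to social networking sites.
    # The domain list is an example only; HTTPS traffic is not inspected here.
    from scapy.all import sniff, IP, Raw

    SOCIAL_DOMAINS = [b"facebook.com", b"twitter.com", b"myspace.com"]

    def flag_social(pkt):
        if pkt.haslayer(IP) and pkt.haslayer(Raw):
            # Look for the HTTP Host header in the raw payload.
            for line in bytes(pkt[Raw].load).split(b"\r\n"):
                if line.lower().startswith(b"host:"):
                    host = line.split(b":", 1)[1].strip()
                    if any(host.endswith(d) for d in SOCIAL_DOMAINS):
                        print("social networking request from %s to %s"
                              % (pkt[IP].src, host.decode("ascii", "replace")))

    sniff(filter="tcp port 80", prn=flag_social, store=False)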