Thursday, July 30, 2009

Network Solutions starts healing process after data breach
Lauren Bell July 27, 2009


Network Solutions, a provider of Web-related services for small and medium businesses, has started reaching out to customers about a data breach that was discovered in early June. Credit card information on 573,928 individual consumers may have been compromised in the breach, which Network Solutions publicly reported at the end of the day on July 24.

Fewer than half of the company's 10,000-plus e-commerce services customers were affected in the breach, which occurred when hackers implanted malicious code on the system used to deliver e-commerce tools to clients. Over a three-month period — from March 12 to June 8 — the code diverted transaction and personal information from 4,343 merchant Web sites to a rogue server.

Susan Wade, director of PR for Network Solutions, said that the unauthorized code was discovered on June 8 during routine procedures, and Network Solutions immediately called in a team of data breach forensics experts to analyze the leak and track it. The experts did not crack the code until July 13. When the team discovered that credit card information was at risk, Network Solutions reported the incident to federal law enforcement, which is currently investigating the situation. So far, none of the at-risk cards has been misused.

Network Solutions informed clients of the breach through e-mail and postal mail last week and has offered to help its clients notify affected individual cardholders. In a preemptive PR effort on Friday, the company also reached out to select bloggers and reporters, started monitoring Twitter and responding to blog posts, and launched a new Web site and blog about the breach at CareandProtect.com. The site offers FAQs and invites clients and consumers to weigh in on the breach.

“We were proactive in getting the news out,” Wade said. “We're having an open dialogue with customers, so anyone can go to the site and see what the dialogue is.”

Network Solutions is also offering affected cardholders 12 months of free fraud monitoring service from TransUnion. Wade says the company has put additional security measures in place to protect against future breaches.

“The main message we want to get out is that we're there for our customers, and we are very sorry about this,” Wade said. “Unfortunately, something like this could happen to any online business, so we're just letting our customers know that we're there for them, we will help them as much as we can, and we take this issue very seriously.”

Amichai Shulman, CTO of database security company Imperva, lauded Network Solutions for bringing in a forensics team right away, but noted that the breach illustrated larger database security problems faced by many companies.

“This incident points out the basic problem of cloud computing,” he said. “With many more companies hosting their data on the Internet, the databases and the servers they are hosted on become phenomenally attractive. The lesson: once you've penetrated the cloud, you've got an easy path to the important, underlying data.”

He added that announcing the breach closer to its time of discovery would have seemed more credible.

“I don't think they did worse than others in such cases, but I think that the industry standard is behind what customers expect,” he said.

Thursday, July 16, 2009

Integrating Privileged Accounts with Existing Security Infrastructure
Published by SearchSecurity.com

While privileged accounts are required by the platform, a lack of accountability exists for the administrators who use them. Join Burton Group senior analyst Mark Diodati as he discusses the do's and don'ts of managing privileged accounts and how vendors are offering solutions for those who have root access.

View this videocast to discover:
  • The risk of leaving privileged accounts unprotected
  • Best practices that security professionals should employ
  • The differences between programmatic access and interactive access and how to decide which to choose
  • Integration of privileged accounts with other systems and technologies: Windows, SIMs, SSO, provisioning, and more

VIEW VIDEOCAST

How to use Excel for security log data analysis
sponsored by Tom Chmielarski

Microsoft Excel, already installed on most corporate desktops, is commonly underappreciated by IT security practitioners. Data analysis is a common security task and Excel can often be the quickest option to analyze firewall logs, antivirus data, proxy logs, OS logs and a file listing from a compromised server. Data is everywhere and is often more useful than we expect, if we know how to look at it.
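
The kind of quick frequency analysis the article describes doing in Excel can also be sketched in a few lines of Python; the log format and field positions below are assumptions for illustration only.

```python
from collections import Counter

def top_talkers(log_lines, n=5):
    """Count source IPs in space-delimited firewall log lines.

    Assumes the source IP is the third field, e.g.:
    'Jul27 09:01 10.0.0.5 -> 192.168.1.1 DENY tcp/445'
    """
    counts = Counter(line.split()[2] for line in log_lines if line.strip())
    return counts.most_common(n)

sample = [
    "Jul27 09:01 10.0.0.5 -> 192.168.1.1 DENY tcp/445",
    "Jul27 09:02 10.0.0.5 -> 192.168.1.2 DENY tcp/445",
    "Jul27 09:03 10.0.0.9 -> 192.168.1.1 DENY tcp/139",
]
print(top_talkers(sample))  # [('10.0.0.5', 2), ('10.0.0.9', 1)]
```

In Excel the equivalent would be a pivot table or COUNTIF over the source-IP column; the point is the analysis, not the tool.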

To read further, please click HERE.

Thursday, July 9, 2009

Fact or fiction: Reining in Privileged Access
sponsored by Guardium

Learn about some of the common misperceptions around privileged access management and how organizations can implement sound access controls around root access so confidential data does not leak out of an organization.

Speaker
Mark Diodati
Senior Analyst, Burton Group
Mark Diodati, CPA, CISA, CISM, has more than 19 years of experience in the development and deployment of information security technologies. He is a senior analyst for identity management and information security at Midvale, Utah-based research firm Burton Group.

To listen to the podcast, please click HERE.

Integrating Privileged Accounts with Existing Security Infrastructure
sponsored by Guardium, Inc.

In this videocast, Burton Group Senior Analyst Mark Diodati discusses the risk of leaving privileged accounts unprotected and best practices that security professionals should employ. He also talks about the differences between programmatic access and interactive access and how to decide which to choose, as well as integration of privileged accounts with other systems. Finally he discusses best practices for implementing a privileged account management product.

To view the videocast, please click HERE.

Friday, July 3, 2009

Tech Insight: Database Security -- The First Three Steps

Protecting sensitive data means locating and enumerating the information in your databases -- and finding the right method to secure it

By John Sawyer
DarkReading

A Special Analysis for Dark Reading
First of two articles

One of a security professional's biggest challenges is to keep an organization's most sensitive data out of harm's way. When it comes to the huge volumes of information stored in databases, however, that's no simple task.

Protecting sensitive information means finding and securing it in any location, from corporate headquarters to branch locations to mobile devices. Such data isn't always easy to locate -- it may be stored in a variety of formats, from the small Excel files on a CFO's laptop to enormous databases that contain critical inventories or customer information.

Frequently, databases hold the "crown jewels" of the organization -- the largest and most mission-critical data. This means a database breach can have serious consequences, whether it comes from an employee with authorized access or from a hacker who comes in via vulnerabilities in poorly written Web applications that are linked to the database.

Complying with regulations, like PCI DSS or SOX, has helped many organizations become more aware of their most sensitive data repositories, but it is easy to lose track of what network resources exist when these repositories are spread across multiple office locations. To prevent this sort of oversight, we should look at database security and compliance as a three-stage process: locating your databases, enumerating the data, and securing the critical database servers.

The first stage -- locating the databases themselves -- can be achieved through a couple of different methods. The easiest, but often less fruitful, method is to consult the documentation. If you're lucky, there will be an extensive, searchable repository containing the information you're looking for. Otherwise, you'll be digging through a lot of docs. This is where sysadmins and developers can help fill in the gaps.

When documentation fails, the best method for locating databases is scanning the network with Nmap to find hosts that are running database services and actively listening for connections. For even better coverage, use Nessus with administrative credentials to audit your hosts for installed and running applications -- this will help you find the database servers that are running but not listening on the network.
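
Nmap and Nessus are the right tools for this job; purely to illustrate what a default-port sweep checks, here is a minimal Python sketch. The port-to-product mapping is limited to a few common defaults and is an assumption, not a substitute for a real scanner.

```python
import socket

# A few common default database ports (assumption: your servers listen
# on defaults; real scanners like Nmap check far more, and can
# fingerprint services running on nonstandard ports).
DB_PORTS = {1433: "MS SQL Server", 1521: "Oracle", 3306: "MySQL", 5432: "PostgreSQL"}

def probe_db_ports(host, ports=DB_PORTS, timeout=1.0):
    """Return {port: service} for database ports accepting TCP connections."""
    found = {}
    for port, name in ports.items():
        try:
            with socket.create_connection((host, port), timeout=timeout):
                found[port] = name
        except OSError:
            pass  # closed, filtered, or unreachable
    return found
```

Note that a connect sweep like this misses servers that are running but not listening on the network, which is exactly why the article recommends credentialed Nessus audits for better coverage.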

The second stage in securing your database environment is to enumerate the data contained in the databases you found in the first stage. Not all database servers will need the same level of protection. A test database containing bogus data for use by developers, for example, will obviously not need the same level of defense as a production database server containing customer information and front-ended by an Internet-exposed Web application.

Documentation, developers, and database administrators (DBAs) should provide insight into the database's contents -- but they aren't always as accessible or helpful as they could be. To get the full picture of what's in your databases, you may need to look into data discovery products, like Identity Finder, or discovery features included in data leakage prevention (DLP) and database activity monitoring (DAM) tools.
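
At its core, what these discovery tools automate is pattern-matching exported rows against sensitive-data signatures. The patterns below are deliberately simplistic illustrations; commercial products add checksums (e.g. Luhn validation), context, and proximity rules to cut false positives.

```python
import re

# Illustrative patterns only: not production-grade detection.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify_rows(rows):
    """Return (row_index, label) for each row matching a sensitive pattern."""
    hits = []
    for i, row in enumerate(rows):
        for label, pattern in PATTERNS.items():
            if pattern.search(row):
                hits.append((i, label))
    return hits
```

Run against a sample export, this flags which tables deserve production-grade controls and which are harmless test data.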

The discovery process will be straightforward -- as long as the tool you're using properly understands how to access the databases in your organization. Whether you're evaluating a new product or already have a DLP/DAM solution, be sure the tool works with all of the database technologies you discovered in the first stage.

The third stage is to secure the database servers themselves and ensure they comply with corporate configuration policies. Manually checking database server settings is a monotonous, tedious task best suited for automation. Free and commercial tools are available that make the process easier, so it can be done enterprisewide with little effort.

The most important part of the third stage is to ensure you have a well-defined database security configuration policy; hopefully, this was created and refined well before you started this process. The policy should be based on best practices, while meeting the needs and required security level of your environment.
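
The core idea the benchmark tools automate is small: compare live settings against the policy and report every deviation. The setting names below are hypothetical; real benchmarks such as CIS or the DISA STIGs define hundreds of platform-specific checks.

```python
# Hypothetical policy entries for illustration; real database security
# benchmarks (CIS, DISA STIGs) define hundreds of checks per platform.
POLICY = {
    "remote_root_login": "disabled",
    "ssl_enforced": "on",
    "audit_logging": "on",
}

def audit_config(live_settings):
    """Return (setting, expected, actual) for each policy violation."""
    return [
        (key, expected, live_settings.get(key, "<missing>"))
        for key, expected in POLICY.items()
        if live_settings.get(key) != expected
    ]
```

A report like this, generated on a schedule across every server found in stage one, is what turns a written policy into something enforceable.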

Next, choose an auditing tool that suits your database environment. The CIS Security Benchmark tool and Nessus vulnerability scanner come with customizable configuration files that can be edited to match your security policies. You can also get configuration files from groups like DISA, which can serve as a basis for your auditing.

Though the CIS tools are free, Nessus is a good upgrade to consider because it can scan for vulnerabilities in both the database server and the underlying operating system. Also, remember that the two tools don't support the same set of database server types, so be sure to confirm the one you're using works with all, or at least most, of the software that holds your critical data.

For truly comprehensive database security, you must also consider secure network design, DLP and DAM technologies, secure application development, and proper backup and disaster recovery. However, if you execute these first three stages properly, then you'll be well on your way to securing your most sensitive database information, and you can add additional security capabilities later.