Tuesday May 19, 2015

Securing the Big Data Life Cycle: A New MIT Technology Review and Oracle Paper

The big data phenomenon is a direct consequence of the digitization and “datafication” of nearly every activity in personal, public, and commercial life. Consider, for instance, the growing impact of mobile phones. The global smartphone audience grew from 1 billion users in 2012 to 2 billion today, and is likely to double again, to 4 billion, by 2020, according to Benedict Evans, a partner with the venture capital firm Andreessen Horowitz. 

“Companies of all sizes and in virtually every industry are struggling to manage the exploding amounts of data,” says Neil Mendelson, vice president for big data and advanced analytics at Oracle. “But as both business and IT executives know all too well, managing big data involves far more than just dealing with storage and retrieval challenges—it requires addressing a variety of privacy and security issues as well.”

With big data comes bigger responsibility. A new joint Oracle and MIT Technology Review paper drills into these big data privacy and security issues.

Get the paper, Securing the Big Data Life Cycle, and learn more here.

Monday May 11, 2015

Using Earthquakes to Predict Cybercrime

Known for big surf and occasional big earthquakes, Santa Cruz, California, has also been in the news for big data. In fact, its police force has used predictive analytics to capture would-be thieves. Two women were taken into custody after they were discovered peering into cars in a downtown parking garage; after further questioning, one was found to have outstanding warrants while the other was carrying illegal drugs.

The unique thing here is that the police officers were directed to the parking structure by a computer program that had predicted car burglaries were especially likely there that day. The program, developed by PredPol, is based on models used for predicting aftershocks from earthquakes, a common occurrence here in California. Its algorithms generate projections about which areas and windows of time are at highest risk for future crimes.
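The aftershock analogy maps onto what the academic literature calls a self-exciting point process: each event temporarily raises the expected rate of follow-on events nearby, just as an earthquake raises the odds of aftershocks. The sketch below is a minimal illustration of that idea in Python, not PredPol's actual (proprietary) model; the rate parameters and event times are made up.

    import math

    def intensity(t, event_times, mu=0.2, alpha=0.5, beta=1.0):
        """Conditional intensity of a self-exciting process: a constant
        background rate mu plus an exponentially decaying 'aftershock'
        contribution from every earlier event."""
        boost = sum(alpha * math.exp(-beta * (t - ti))
                    for ti in event_times if ti < t)
        return mu + boost

    # Burglary times (in days) observed in one area of the city.
    events = [0.0, 0.4, 0.5, 3.0]

    print(intensity(1.0, events))  # elevated: recent events raise the rate
    print(intensity(5.0, events))  # decayed back toward the background rate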

The Innovative Hacker 

Organizations struggle to mitigate threats due to the continuing evolution of hackers and their methods of attack. Since Robert T. Morris first introduced the infant internet to his Morris worm in 1988, organizations have been fighting tweakers, script kiddies, espionage, and organized crime. The problem is that every time a solution is devised, a new hack is created. It's a never-ending cycle, and unfortunately, the turnaround time for hackers is getting shorter and shorter. They are innovating and sharing their innovations with others, who in turn take advantage and increase the number of effective attacks.

According to the 2015 Verizon Data Breach Investigations Report, which examined over 80,000 incidents, hackers have become more inventive, thinking up new tactics to evade defenses. "I hate to admit defeat," says Jay Jacobs, co-author of the report, "but there does seem to be an advantage to the attackers right now." (Source: Financial Times, access for a fee).

Learning from the Past 

By analyzing and detecting patterns in years of past crime data, the Santa Cruz police department was able to identify hot spots of potential crime. In fact, on the day the two women were arrested, the program had identified the approximately one-square-block area where the parking garage is situated as one of the highest-risk locations for car burglaries.

According to the RAND Corporation's "Predictive Policing" study, there is strong evidence to support the theory that crime is statistically predictable. That's because criminals tend to operate in their comfort zone: they commit the types of crimes they've committed successfully in the past, generally at similar times, in similar locations, and using similar methods.

There is a connection between this kind of physical crime and the cybercrime organizations face today. To explain that connection, consider the RAND Corporation's observation that prediction-led policing "is a comprehensive business process, of which predictive policing is a part," not just a matter of making predictions. That process is summarized here to show how past information is analyzed to prevent further criminal activity.

First, the police force collects and analyzes previous crime, incident, and offender data to produce predictions, which uncover hot spots. Next, data from multiple, disparate sources in the community is combined, often using big data environments that can quickly process terabytes of data. This tells police where hot spots of potential crime are likely to break out based on time of day, weather, recent criminal activity, and more, and it informs how they respond to a potential incident. Criminals then react to the changed environment: either they are removed, or those still operating in the area change their practices or move elsewhere. Either way, the environment has been altered, the initial data is out of date, and new data must be collected for analysis. A minimal sketch of the hot-spot step appears below.
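To make the hot-spot step concrete, here is a deliberately tiny sketch: bin historical incidents into grid cells and time windows, then rank the busiest buckets. The data is fabricated, and a real deployment would fold in weather, recency weighting, and many other sources.

    from collections import Counter

    # (grid x, grid y, hour of day) for each historical incident.
    incidents = [
        (12, 4, 14), (12, 4, 15), (12, 4, 14), (3, 9, 22),
        (12, 4, 16), (7, 7, 2), (3, 9, 23), (12, 4, 15),
    ]

    # Count incidents per (grid cell, 4-hour window) bucket.
    counts = Counter((x, y, hour // 4) for x, y, hour in incidents)

    # The highest-count buckets become the day's candidate patrol targets.
    for (x, y, window), n in counts.most_common(3):
        print(f"cell ({x},{y}), hours {window * 4:02d}-{window * 4 + 3:02d}: {n} incidents")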

The Importance of Acquiring Good, Clean Data 

This entire process hinges on the collection of data and on the quality of that data for making predictions.

Organizations today have the data necessary to make these types of predictions. In fact, our systems churn out this data all the time through system server logs, database audits, event logs, and more. If crime is statistically predictable, and we have all the evidence right there in front of us, then we need to collect and analyze it.

Of course, the future of predictive analytics and machine learning reaches well beyond analyzing audit and log data and monitoring our databases; however, these two practices are critical first steps toward a comprehensive cybersecurity program.

The recent 2015 Verizon Data Breach Investigations Report highlights that once you have the data you need, analysis is performed using inferred or computed elements of that data. To mitigate data breaches, the report suggests looking for anomalies in the following:
  • Volume or amount of content transfer, such as e-mail attachments or uploads
  • Resource access patterns, such as logins or data repository touches
  • Time-based activity patterns, such as daily and weekly habits
  • Indications of job contribution, such as the amount of source code checked in by developers
  • Time spent in activities indicative of job satisfaction or discontent
Although this data is all around us, the tough part is collecting all of it effectively, efficiently, and securely, and then making sense of it to predict and prescribe future actions and prevent the next data breach. A sketch of one such anomaly check follows.
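The report names the signals but not an algorithm, so purely as an illustration, here is one of the simplest possible checks: flag any day whose activity deviates sharply from an account's own historical baseline. The login counts are fabricated.

    import statistics

    def flag_anomalies(daily_counts, threshold=2.5):
        """Flag days whose count deviates more than `threshold` standard
        deviations from the account's own historical mean."""
        mean = statistics.mean(daily_counts)
        stdev = statistics.stdev(daily_counts)
        return [(day, count) for day, count in enumerate(daily_counts)
                if stdev and abs(count - mean) / stdev > threshold]

    # Logins per day for one account, pulled from audit logs.
    logins = [21, 19, 23, 20, 22, 18, 24, 21, 97, 20]

    print(flag_anomalies(logins))  # [(8, 97)] -- the 97-login day stands out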

Wednesday Mar 25, 2015

86% of Data Breaches Miss Detection, How Do You Beat The Odds?

Information security is simply not detecting the bad guys

That's according to the Verizon Data Breach Investigations Report. In fact, antivirus, intrusion detection systems, and log review combined pick up less than 1% of data breach incidents. Very few companies do proactive monitoring, and those that do are mostly troubleshooting problems they already know about. The result is that 86% of data breach incidents were ultimately detected by someone other than the victimized organization, an embarrassing statistic.

Only 35% of organizations audit to determine whether privileged users are tampering with systems, and for nearly 70% of organizations it would take more than one day to detect and correct unauthorized database access or changes. With the average compromise taking less than a day, most organizations could lose millions of dollars before even noticing.

Join Oracle and learn how to put in place effective activity monitoring including:

  • Privileged user auditing for misuse and error
  • Suspicious activity alerting
  • Security and compliance reporting 

Monday Mar 16, 2015

Three Big Data Threat Vectors

The Biggest Breaches are Yet to Come

Where a few years ago we saw 1 million to 10 million records breached in a single incident, today we are in the age of mega-breaches, where breaches of 100 million to 200 million records are not uncommon.

According to the Independent Oracle Users Group Enterprise Data Security Survey, 34% of respondents say that a data breach at their organization is "inevitable" or "somewhat likely" in 2015.

Combine this with the fact that the 2014 Verizon Data Breach Investigations Report tallied more than 63,000 security incidents—including 1,367 confirmed data breaches. That's a lot of data breaches.

As business and IT executives are learning by experience, big data brings big security headaches. Hadoop, the open-source software framework for storing and processing big data in a distributed fashion, was built with very little security in mind; simply put, it was developed to address massive data storage and faster processing, not security. Yet it is now being integrated with existing IT infrastructure, which can expose existing database data through the less secure Hadoop layer.

With enormous amounts of less secure big data, integrated with existing database information, I fear the biggest data breaches are yet to be announced. When organizations are not focusing on security for their big data environments, they jeopardize their company, employees, and customers.

Top Three Big Data Threats

For big data environments, and Hadoop in particular, today's top threats include:
  • Unauthorized access. Built around the notion of "data democratization," meaning all data is accessible by all users of the cluster, Hadoop cannot stand up to rigorous compliance standards such as HIPAA and PCI DSS because it lacks access controls on data. Missing password controls, basic file system permissions, and auditing leave sensitive data in the cluster exposed (a minimal hardening sketch follows this list).
  • Data provenance. In traditional Hadoop it has been difficult to determine where a particular data set originated and which sources it was derived from. At a minimum, garbage-in, garbage-out issues arise; worse, analytics that drive business decisions could be based on suspect or compromised data. Users need to know the source of data in order to trust its validity, which is critical for relevant predictive activities.
  • DIY Hadoop. A build-your-own cluster presents inherent risks, especially in shops with few experienced engineers who can build and maintain a Hadoop cluster. As a cluster grows from a small project to an advanced enterprise deployment, every aspect of growth (patching, tuning, verifying versions across Hadoop modules, OS libraries, utilities, user management, and so on) becomes more difficult. Security holes, operational security, and stability may be ignored until a major disaster, such as a data breach, occurs.
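To make the unauthorized-access point concrete, here is a minimal hardening sketch. It assumes a cluster where the hdfs command-line tool is on the path and HDFS ACLs are enabled; the path and user are hypothetical, and a real deployment would layer Kerberos authentication and auditing on top.

    import subprocess

    SENSITIVE_PATH = "/data/pii"  # hypothetical HDFS directory

    def run(cmd):
        """Run an hdfs CLI command, raising if it fails."""
        subprocess.run(cmd, check=True)

    # Remove world access: only the owning user and group can read the data.
    run(["hdfs", "dfs", "-chmod", "-R", "750", SENSITIVE_PATH])

    # Grant one analyst read-only access via an ACL entry instead of
    # widening the base permissions for everyone.
    run(["hdfs", "dfs", "-setfacl", "-R", "-m", "user:analyst1:r-x", SENSITIVE_PATH])

    # Verify the resulting permissions and ACLs.
    run(["hdfs", "dfs", "-getfacl", "-R", SENSITIVE_PATH])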
Big data security is an important topic that I plan to write more about. I am currently working with MIT on a new paper to help provide some more answers to the challenges raised here. Stay tuned.

Monday Mar 09, 2015

Security and Governance Will Increase Big Data Innovation in 2015

"Let me begin with my vision of the FTC and its role in light of the emergence of big data. I grew up in a beach town in Southern California. To me, the FTC is like the lifeguard on a beach. Like a vigilant lifeguard, the FTC’s job is not to spoil anyone’s fun but to make sure that no one gets hurt. With big data, the FTC’s job is to get out of the way of innovation while making sure that consumer privacy is respected."

- Edith Ramirez, Chairwoman, Federal Trade Commission

Ms. Ramirez highlights the FTC's role in protecting consumers from what she calls "indiscriminate data collection" of personal information. Her main concern is that organizations can use this information in ways that ultimately compromise individual privacy. In many documented instances, data previously considered anonymous has been correlated with other publicly available information to re-identify individuals.

Recovering Sensitive Data from "Anonymous" Information 

Her concerns are not unfounded. The widely cited paper Robust De-anonymization of Large Sparse Datasets illustrates the sensitivity of supposedly anonymous information: its authors re-identified users in the publicly released, "anonymous" dataset of 500,000 Netflix subscribers by cross-referencing it with the Internet Movie Database, revealing sensitive attributes such as subscribers' apparent political and religious preferences. In a more recent instance of big data security concerns, the public release of a New York taxi cab dataset was completely de-anonymized, unveiling cab drivers' annual incomes and, possibly more alarming, the weekly travel habits of their passengers.
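The Netflix study matched noisy movie ratings, which is subtler than what follows, but the underlying principle of a linkage attack can be shown with a toy example: quasi-identifiers left in an "anonymized" dataset line up with a public dataset that has names attached. All records below are fabricated.

    # Quasi-identifiers (ZIP code, birth year, gender) survive "anonymization".
    anonymized = [
        {"zip": "95060", "birth_year": 1971, "gender": "F", "diagnosis": "..."},
        {"zip": "95062", "birth_year": 1985, "gender": "M", "diagnosis": "..."},
    ]

    # A public dataset (voter rolls, social profiles) carries the same fields.
    public = [
        {"name": "Jane Roe", "zip": "95060", "birth_year": 1971, "gender": "F"},
        {"name": "John Doe", "zip": "95062", "birth_year": 1985, "gender": "M"},
    ]

    def key(record):
        return (record["zip"], record["birth_year"], record["gender"])

    names = {key(r): r["name"] for r in public}

    # Joining on the quasi-identifiers re-identifies the "anonymous" records.
    for record in anonymized:
        match = names.get(key(record))
        if match:
            print(f"{match} -> {record['diagnosis']}")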

Many large firms have found their big data projects shut down by compliance officers concerned about legal or regulatory violations. Chairwoman Ramirez highlights specific cases where the FTC has cracked down on firms it believes violated customer privacy rights, including actions against Google, Facebook, and Twitter. She feels that big data opens up additional security challenges that must be addressed.

"Companies are putting data together in new ways, comingling data sets that have never been comingled before," says Jeff Pollock, Oracle vice president for product management. "That’s precisely the value of big data environments. But these changes are also leading to interesting new security and compliance concerns."

The possible security and privacy pitfalls of big data center on three fundamental areas:

  • Ubiquitous and indiscriminate collection from a wide range of devices 
  • Unexpected uses of collected data, especially without customer consent 
  • Unintended data breach risks with larger consequences

Organizations will find big data experimentation easier to initiate when the data involved is locked down. They need to be able to address regulatory and privacy concerns by demonstrating compliance. This means extending modern security practices such as data masking and redaction (sketched below) to the full big data environment, in addition to the must-haves of access, authorization, and auditing.
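Oracle's masking and redaction products operate inside the database itself; purely to illustrate the concept, here is what pattern-based redaction looks like in application code, with made-up patterns and data.

    import re

    # Two common sensitive-data shapes; real products use far more robust
    # discovery and apply policies at the database layer.
    SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
    CARD = re.compile(r"\b(?:\d{4}[ -]?){3}(\d{4})\b")

    def redact(text):
        """Blank out SSNs entirely; keep only the last four card digits."""
        text = SSN.sub("XXX-XX-XXXX", text)
        return CARD.sub(r"**** **** **** \1", text)

    print(redact("SSN 123-45-6789, card 4111 1111 1111 1111"))
    # -> SSN XXX-XX-XXXX, card **** **** **** 1111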

Securing the big data lifecycle requires:

  • Authentication and authorization of users, applications and databases 
  • Privileged user access and administration 
  • Encryption of data at rest and in motion (a sketch follows this list) 
  • Data redaction and masking for non-production environments 
  • Separation of roles and responsibilities 
  • Implementing least privilege 
  • Transport security 
  • API security 
  • Monitoring, auditing, alerting and compliance reporting
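To ground one item on that list, here is a minimal sketch of encrypting data at rest using the Python cryptography package's Fernet recipe. It is a stand-in for illustration, not Oracle's Transparent Data Encryption; in practice the key would live in a key management service or wallet, never alongside the data it protects.

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # in production: fetched from a KMS or wallet
    fernet = Fernet(key)

    record = b"ssn=123-45-6789;name=Jane Roe"

    token = fernet.encrypt(record)    # what gets written to disk
    original = fernet.decrypt(token)  # recoverable only with the key

    assert original == record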

With Oracle, organizations can achieve all the benefits that big data has to offer while applying a comprehensive data security approach that ensures the right people, internal and external, get access to the appropriate data at the right time and place, through the right channel. The Oracle big data solution safeguards against malicious attacks and protects organizational information assets by securing data in motion and at rest. It enables organizations to separate roles and responsibilities and protect sensitive data without compromising privileged user access, such as that of database administrators. Furthermore, it provides monitoring, auditing, and compliance reporting across big data systems as well as traditional data management systems.

Learn more about Oracle Security Solutions.

This article has been re-purposed from the Oracle Big Data blog.  

Wednesday Mar 04, 2015

Securing Information in the New Digital Economy

We are in the midst of a data breach epidemic, fueled by a lucrative information black market. The perimeter security most IT organizations rely on has become largely ineffective. Nearly 70% of security resources are focused on perimeter controls, but most exploited vulnerabilities are internal. 

Effective modern security requires an inside-out approach with a focus on data and internal controls.

A New Hacker Economy

Today, a layered economy of specialized, organized hackers has created a black market estimated to be more lucrative than the illegal drug trade. (Lillian Ablon 2014) Hacking-for-hire has made the black market accessible to non-experts, expanding its reach exponentially.  As businesses grow their online footprints, criminals find new ways of attacking their vulnerabilities.

Thinking Inside-Out

Internal systems are the new perimeter – the new front line in the battle for data security. Security should be built into the customer and employee experiences.

  • Manage privileged user access and think beyond the password: another layer of authentication (sketched after this list) can vastly increase security.
  • Make it more costly and difficult for attackers by protecting the most valuable information first. 
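As a hedged sketch of "another layer of authentication," here is a time-based one-time password (TOTP) second factor using the pyotp library; in a real login flow the six-digit code would come from the user's authenticator app rather than being generated in place.

    import pyotp

    # Enrollment: generate a per-user secret and share it with the user's
    # authenticator app (typically via a QR code).
    secret = pyotp.random_base32()
    totp = pyotp.TOTP(secret)

    # Login: after the password check, require the current code.
    submitted = totp.now()  # stand-in for what the user would type
    if totp.verify(submitted):
        print("second factor accepted")
    else:
        print("access denied")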

Rebalancing Information Security

Disrupt the information supply chain and cut off the cash flow to the black market. Taking an inside-out approach to security could bring an end to the arms race and rebalance the economics in defenders' favor.

To learn more about Securing Information in the New Digital Economy, read the joint Oracle and Verizon Report.

Thursday Feb 19, 2015

Top Two Cloud Security Concerns: Data Breaches and Data Loss

Apply a Data-centric Security Strategy in the Cloud

Don't miss the webcast Applying a Data-centric Security Strategy in the Cloud.

Most organizations are worried about putting sensitive data into the cloud. In fact, industry reports indicate that data breaches and data loss are their top two concerns. Rather than applying a one-size-fits-all approach to data security, organizations would be better prepared if they implemented security controls based on the type of data and its use. In this session, you will learn how to apply the appropriate levels of security controls based on data sensitivity, and then map them to your cloud environment.

Watch now.  

Tuesday Feb 03, 2015

All Data is Not Equal, Map Security Controls to the Value of Data

As you look at data, you will quickly realize that not all data is equal. What do I mean by that? Quite simply, some data does not require the same security controls as other data.

When explaining this to customers, we use a metals analogy to simplify the provisioning of controls: Bronze represents the least sensitive data, up through Platinum, the highest-value and most sensitive data within an organization.

Thinking in this manner makes it possible to refine many configurations into a few pre-configured, pre-approved reference architectures. Applying this methodology is especially important in the cloud: it comes down to consistently applying security controls based on the data itself.

Oracle’s preventive, detective, and administrative pillars can be applied to the various data categorizations. At this point in the conversation, customers begin to understand more pragmatically how this framework can be used to align security controls with the value, or sensitivity, of the data.

Security practitioners can then work with lines of business to assign the appropriate level of controls, both systematically and consistently across the organization.  

For example, at the Bronze level, controls such as patching, secure configuration scanning, and basic auditing would be appropriate. Data deemed more sensitive, such as personally identifiable information or personal health information, requires additional security controls around the application data, for example, blocking default access by those designated as database administrators.

Finally, data at the highest sensitivity level, Platinum, should be protected by blocking database changes during production time frames, preventing SQL injection attacks, and providing centralized, enterprise-wide reporting and alerting for compliance and audit requirements. A minimal sketch of this tiered mapping follows.
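The sketch below expresses the metals model as data. The post names only the Bronze and Platinum tiers explicitly, so the Silver and Gold control lists here are illustrative assumptions.

    # Each tier inherits every control below it and adds stricter ones.
    TIERS = {
        "bronze": ["patching", "secure configuration scanning", "basic auditing"],
        "silver": ["encryption at rest", "network encryption"],       # assumed
        "gold": ["blocking default DBA access to application data",  # assumed
                 "data masking for non-production copies"],
        "platinum": ["blocking changes during production windows",
                     "SQL injection prevention",
                     "centralized compliance reporting and alerting"],
    }

    ORDER = ["bronze", "silver", "gold", "platinum"]

    def required_controls(tier):
        """All controls required at `tier`, cumulative from Bronze upward."""
        return [c for t in ORDER[: ORDER.index(tier) + 1] for c in TIERS[t]]

    print(required_controls("gold"))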

To learn more about Oracle Security Solutions, download the ebook "Securing Oracle Database 12c: A Technical Primer" by Oracle security experts.

Wednesday Jan 28, 2015

Oracle Cloud Forum - Mapping Security Controls to the Value of Data

Learn how to prioritize your security control deployments by watching Oracle's Cloud Platform Online Forum session, "Applying a Data-Centric Security Strategy in the Cloud."

Most organizations are worried about putting sensitive data into the cloud. In fact, industry reports indicate data breaches and data loss are their top two concerns.

Case in point, my previous blog article discusses how more than a third (34%) of organizations believe that a data breach is "somewhat likely" to "inevitable" in 2015.

Rather than applying a one-size-fits-all approach to data security, organizations would be better prepared if they implemented security controls based on the type of data and its use.


In this session, you will learn how to apply the appropriate levels of security controls based on data sensitivity, and then map them to your cloud environment. 

Register to watch the forum here.  

Tuesday Jan 13, 2015

34% of Organizations Say Data Breach “Somewhat likely” to “Inevitable” in 2015

According to the latest Independent Oracle Users Group (IOUG) Enterprise Data Security Survey, more than one third (34%) of organizations say that a data breach is anywhere from "somewhat likely" to "inevitable" in the next 12 months, up from 20% in 2008. Are organizations coming to the realization that data breaches will happen? 

2014 IOUG Data Security Survey Likelihood of a Data Breach

Each year, the IOUG surveys a wide range of database security and IT professionals to examine the current state of enterprise data security. The 2014 report summarizes findings from 353 data managers and professionals in order to help educate organizations about data security.

The perceived likelihood of a data breach has grown over the years since the survey first asked this question, and the finding is consistent with other surveys of this ilk: the Ponemon 2014 Cost of a Data Breach Study puts the probability at as much as 30%.

According to another Ponemon study "Data Breach: The Cloud Multiplier Effect," those surveyed estimate that every one percent increase in the use of cloud services will result in a 3 percent higher probability of a data breach.

Looking back, respondents to the IOUG survey say that they often have no idea whether a breach has occurred or, worse, is occurring:

"We cannot be certain there has been no silent breach. There is no evidence we have detected a breach or corruption. But picturing yourself as highly unlikely to be breached we feel is like wearing a ‘kick-me’ sign on your backside."

2014 IOUG Data Security Survey Known Data Breaches

To learn more, download the 2014 IOUG Data Security Survey Report here.

Wednesday Nov 12, 2014

Oracle Security Webcast Series for UK Customers

Over the next four Thursdays, from November 20th through December 11th, our UK team will be addressing security in a series of four live webcasts:

Preventive Controls to Avoid Next Data Breach, Nov 20, 2014. 11:00 AM - 11:45 AM (GMT)

Learn how preventive controls can increase your defense arsenal against the evolving threats to databases. Data breaches not only expose your customers' and employees' private data, but also diminish your reputation and impact the bottom line. Oracle Security specialists will demonstrate the latest database security capabilities which enable you to adopt a defense-in-depth strategy to mitigate risks and protect the data at source – the database.

Detective Controls for Compliance & Auditing, Nov 27, 2014, 11:00 AM - 11:45 AM (GMT)

Learn how you can enforce the “trust but verify” principle by consolidating audit and event sources from the Oracle and non-Oracle components of your infrastructure, offering integrated, real-time security analytics. Find out how Oracle detective controls can offer a first line of defense against SQL injection attacks, as well as a simplified compliance reporting platform, for audit data analysis, within a centralized, secure warehouse.

Identity Governance for Extended Enterprise, Dec 4, 2014, 11:00 AM - 11:45 AM (GMT)

As organizations deploy an ever-increasing number of cloud, mobile, and enterprise applications, identifying and managing user access can be a challenge, especially when departmental application deployments are outside the view of corporate IT. Join us for this live webcast to learn how Oracle’s Identity governance solution reduces risks and costs while providing fast access to new services through an intuitive user self-service solution.

Strategies for Mobile Application Security, Dec 11, 2014, 11:00 AM - 11:45 AM (GMT)

Enterprise mobility and the Internet of Things are both new IT endpoints that require melding device and user identities for security reasons. Join us for this live webcast to learn how identity management platform benefits are enabling customers to move deployments to the next level of sophistication, as the mobile security market consolidates.

Monday Nov 10, 2014

Encrypting, Redacting and Masking at Epsilon

Epsilon Uses Oracle Advanced Security and Data Masking and Subsetting

"With Transparent Data Encryption, the key rotation process is really much simpler for us…attesting to the audit team is much easier."

Hear Keith Wilcox discuss how Epsilon addresses its customers' sensitive application data requirements in production and development databases using Oracle Advanced Security and Oracle Data Masking and Subsetting.
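As a rough sketch of the key rotation Keith describes, here is the Oracle Database 12c ADMINISTER KEY MANAGEMENT statement issued through the python-oracledb driver. It assumes an account granted key management privileges and an already-open keystore; every connection detail below is a placeholder.

    import oracledb  # python-oracledb driver

    conn = oracledb.connect(user="sec_admin", password="...",
                            dsn="dbhost/pdb1")  # placeholders

    with conn.cursor() as cur:
        # Rotate the TDE master encryption key. WITH BACKUP snapshots the
        # keystore first; applications keep running throughout.
        cur.execute('ADMINISTER KEY MANAGEMENT SET KEY '
                    'IDENTIFIED BY "keystore_password" WITH BACKUP')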

Challenges

  • Varying requirements across retail, financial, and more
  • Difficulty demonstrating compliance with custom solution
  • Sensitive data showing within a customer's application 
  • Data encryption key rotation 

Why Epsilon Chose Oracle

  • Flexible solution to meet multiple customer requirements
  • Attesting to audit team is more credible using Oracle
  • Provides standard “secure package” for future deployments
  • A lot of great Oracle information available on the internet

Notable Quote:

“We started using data redaction with the one particular client, for PII data, but we really look forward to rolling that out to other [customers], such as our financial clients. We’ll be adding it to our standard ‘secure package’ that we use across the enterprise.”

Friday Oct 17, 2014

Why Infinity Insurance Chose Oracle Advanced Security and Database Vault

I had an opportunity to sit down with Cathy Robinson, Database Administrator at Infinity Property and Casualty Corporation, while at Oracle OpenWorld 2014. Infinity Insurance is a public insurance company that focuses on high-risk business, mostly auto insurance, and provides products through a network of approximately 12,500 independent agencies and brokers. Cathy told me how they use Oracle Advanced Security for encryption and Oracle Database Vault for privileged user controls on the database.

Cathy has an interesting background with the Department of Defense and joined Infinity with a great understanding of what is required to lock down data and secure an IT environment. As I interviewed Cathy, I learned that the main overall issues they face include:

  • Protecting sensitive personally identifiable information (e.g., payment card and Social Security numbers)
  • Educating employees on the importance of securing this data
  • Securing older applications where changing software code is prohibitive

They have been able to implement Oracle Advanced Security to address these security requirements without having to make any application changes. Additionally, there has been "no performance degradation whatsoever." To further a defense-in-depth database security strategy, Infinity is also implementing Oracle Database Vault for separation of duties and least privilege.

When I asked why they chose Oracle, Cathy responded with the following:

  • One vendor instead of multiple point solution vendors
  • Deep integration with Oracle Databases
  • Oracle security expertise, which included a database security assessment
Click here to listen to the interview.

Tuesday Oct 14, 2014

ISACA Webcast: Data-Centric Audit and Protection, Reducing Risk and Improving the Security Posture

A security strategy must begin with protecting the databases that hold the majority of sensitive and regulated data. Unfortunately, many organizations do not have such a plan in place and fail to protect their sensitive customer and organizational data. Join Oracle security expert Roxana Bradescu as she outlines a data-centric audit and protection strategy to help reduce organizational risk and improve your security posture. During this webcast you will learn:

  • What to audit and how to audit
  • Secure data infrastructure practices
  • How to prevent disclosures and leaks
  • And much more. 

Friday Sep 12, 2014

New KuppingerCole Report on Audit Vault and Database Firewall

KuppingerCole analyst Rob Newby recently (August 2014) put together an executive review of the award-winning Oracle Audit Vault and Database Firewall that you can pick up here for a fee. The paper (4 pages on AVDF, 7 total) describes the solution and how it works from both the Audit Vault and Database Firewall perspectives. It also covers reporting and alerting, as well as integration with other Oracle products, and closes with strengths and challenges.

Happy weekend reading.
