Wednesday Jul 29, 2015

Security Inside Out Newsletter, July Edition is Out

The July edition of the Security Inside Out newsletter is now available. Sign up here for the Security Inside Out newsletter, where we highlight key Oracle Security news and provide information on the latest webcasts, events, training, and more.

This month in the news:

Inoculating the Cloud

Another day, another data breach. From the recent cyber attack on the Internal Revenue Service to news of a security bug called VENOM, it seems as if frequent cybersecurity incidents represent the new normal. What new methods can your security group deploy to augment traditional perimeter defenses? The key is to focus on your most valuable asset—data—and build a security strategy that protects data at its source. 

Now Available! Oracle Identity Management 11g Release 2 PS3

Read about the new business-friendly user interface that simplifies the tasks associated with provisioning and managing today’s robust, identity-driven environments. Also learn about the expansion of mobile device management capabilities and a consolidated policy management framework that enables simplified provisioning of devices, applications, and access.

Securing Data Where It Matters Most

Putting defense in depth database protection in place is the first step to a security inside out data strategy. Even if an organization’s perimeter is breached, organizations can reduce risks by placing security controls around sensitive data, detecting and preventing SQL injection attacks, monitoring database activity, encrypting data at rest and in transit, redacting sensitive application data, and masking nonproduction databases. Read insights from Oracle Vice President of Security and Identity Solutions, Europe, the Middle East, and Africa, Alan Hartwell.

Wednesday Jun 17, 2015

Database Administrators – the Undercover Security Superheroes

Over the past five years, while enterprise IT departments were focusing on the rise of cloud, mobile, and social technologies, a lucrative black market emerged around the acquisition and sale of information. Today, this includes personal data, intellectual property, financial details and almost any form of information with economic value. 

It suffices to say that when it comes to data security, businesses now find themselves under assault like never before, and are in dire need of leadership to help overcome this systemic problem. Step forward the database administrator; the person with the knowledge and power to help secure sensitive data on behalf of the organization and its employees.

Like most free markets, the information black market sets the value of its focal commodity – in this case data – and allows buyers and sellers to connect via a complex underground network. Just as the world is producing more data than at any other point in history, these organized groups are finding new ways of stealing and monetizing this information.

For their part, senior executives are only too painfully aware of what’s at stake for their businesses, but often don’t know how to approach the problem. In an era where information is arguably the most valuable asset a company has, they will look to database professionals to help the business take a stand and prepare itself to best protect this crucial asset.

However, the knowledge gap these individuals will be addressing is large. Two-fifths of businesses admit they are not fully aware of where all the sensitive data in their organizations is kept, according to respondents to a recent Independent Oracle Users Group survey. Those taking proactive measures to lock down data and render it useless to outsiders are still in the minority, and relatively few have any safeguards in place to counter the accidental or intentional staff abuse that could lead to a breach. These safeguards should also extend to DBAs themselves, as ultimately everyone in the organization is in a position to cause a data breach, whether inadvertently or intentionally.

40 Percent Unaware of Where Sensitive Data Resides

That said, together with security professionals, database administrators do have a fighting chance to combat assaults on their organization’s data. Their background gives them a unique understanding of what the risks are to the organization, where to find them and how they can ultimately be addressed or, in the best case, pre-empted.

As the stewards of highly sensitive intellectual property and personal information, database administrators will need to step up and lead the battle against the villains of the black market. “With great power comes great responsibility,” as the saying often attributed to Voltaire goes, a credo that holds as true for comic book superheroes as it does for the security champions of the enterprise.

If database administrators can bring security concerns front-of-mind for employees across the business, and help drive protective measures at every level of the organization’s IT, they will be well placed to take a stand and fend off the security challenges of the coming years.

Check out the Security Super Hero Infographic here.

Thursday Jun 04, 2015

Inoculate the Cloud: Moving to the Cloud FOR Security

Forbes BrandVoice features a new article, Inoculating the Cloud, on how organizations will be moving to the cloud in order to be more secure.

No matter what survey you look at regarding the challenges of moving to the cloud, you'll usually see "security" as one of the top concerns, if not the top concern. It makes sense that organizations worry about putting their sensitive customer and company data in the cloud because of data breach risks and compliance concerns. "Who can protect my data better than I can?" they ask.

However, I would much rather trust my money to a bank than put it under my mattress; the bank is better positioned to protect it. I believe the same rationale applies to securing sensitive data. I would argue that a cloud vendor like Oracle can protect sensitive data better than most corporations can. Those corporations should be focused on their core business, not on maintaining and securing IT infrastructure.

The Forbes BrandVoice article highlights this logic:

A recent study from Harvard Business Review Analytic Services (sponsored by Oracle) found that 62% of survey respondents thought security issues were by far the biggest barriers to expanded cloud adoption at their companies. Nearly half pointed out that data is more difficult to secure in the cloud.

But those very same concerns will soon make security a selling point for the cloud. Established cloud vendors have the internal expertise and resources to install and maintain multilayer security—a level of expertise that many companies cannot hope to duplicate in house.

“This is one factor steering many CIOs toward established vendors for cloud services—they have the resources to invest in state-of-the-art security—both physical and logical,” according to the HBR-AS study.

Then, too, big service providers can automate and simplify many security measures such as implementing security patches, access management, and regulatory compliance.

Learn more by reading the article here

Tuesday Jun 02, 2015

MIT Technology Review: Diversity of Big Data Sources Creates Big Security Challenges

According to Oracle’s Neil Mendelson, many companies today make a key mistake in setting up their big data environments.

“In an effort to gain insights and drive business growth, companies can too often overlook or underestimate the challenge of securing information in a new and unfamiliar environment,” says Mendelson, vice president for big data and advanced analytics at Oracle. That lack of attention to big data security requirements can, of course, leave the organization open to attacks from any number of unknown sources. 

Other evolving circumstances also contribute to a wide range of security-related risks, hurdles, and potential pitfalls associated with big data. As the Cloud Security Alliance, an industry group, notes: “Large-scale cloud infrastructures, diversity of data sources and formats, the streaming nature of data acquisition, and high-volume inter-cloud migration all create unique security vulnerabilities.”

Learn more here about factors that complicate big data implementations, and what is required for organizations to secure the big data life cycle. 

Tuesday May 26, 2015

Oracle Database 12c Real Application Security Administration Application - Now Available on OTN

The release of Oracle Database 12c and the new Real Application Security (RAS) technology further demonstrated Oracle's decades-long commitment to delivering cutting-edge security technology to our customers. The release of RAS fundamentally changed the technology available to application developers and data security architects.

“The release of RAS with Oracle Database 12c was the most important database security enhancement for application developers since the release of Oracle's groundbreaking row-level security solution, Virtual Private Database, in 1998,” said Paul Needham, Senior Director for Oracle Database Security Product Management.

Over the past two decades, nearly every application developed has had its own unique security model. Application users, roles, and privileges are mostly stored in custom application tables that require very specific domain knowledge to maintain. This complexity has made it difficult and costly to keep pace with ever-changing privacy and compliance regulations and to protect against hackers.

Integrated with Oracle Fusion Middleware and Oracle Application Express 5.0, Real Application Security enables developers to build the world’s most secure applications by centralizing security policies within the database.  Benefits of Oracle Database 12c Real Application Security include:

  • End-user session propagation to the database
  • Data security based on application roles and privileges
  • Simplified security administration

Today, the database security development team is pleased to announce the release of Real Application Security Administration Application (RASADM).   RASADM is the new Oracle APEX 5.0-based tool for managing Oracle Database 12c Real Application Security.   It complements the comprehensive RAS PL/SQL API available today and is designed for both developers and application security policy administrators.   RASADM is designed to accelerate adoption of the powerful Oracle Database 12c RAS technology.  

"The release of Real Application Security with Oracle Database 12c demonstrates Oracle's continuous innovation in the database security arena.  RASADM was one of the first requests from those building on RAS with Oracle Database 12c and we are pleased to be able to deliver this to our customers,” says Vipin Samar, Vice President, Oracle Database Security.

Security Inside Out Newsletter, May Edition

Get the latest Security Inside Out newsletter and hear about securing the big data life cycle, data security training, and more.

Also, subscribe to get the bimonthly news in your inbox.

Tuesday May 19, 2015

Securing the Big Data Life Cycle: A New MIT Technology Review and Oracle Paper

The big data phenomenon is a direct consequence of the digitization and “datafication” of nearly every activity in personal, public, and commercial life. Consider, for instance, the growing impact of mobile phones. The global smartphone audience grew from 1 billion users in 2012 to 2 billion today, and is likely to double again, to 4 billion, by 2020, according to Benedict Evans, a partner with the venture capital firm Andreessen Horowitz. 

“Companies of all sizes and in virtually every industry are struggling to manage the exploding amounts of data,” says Neil Mendelson, vice president for big data and advanced analytics at Oracle. “But as both business and IT executives know all too well, managing big data involves far more than just dealing with storage and retrieval challenges—it requires addressing a variety of privacy and security issues as well.”

With big data comes bigger responsibility. A new joint Oracle and MIT Technology Review paper drills into these big data privacy and security issues.

Get the paper, Securing the Big Data Life Cycle, and learn more here.

Monday May 11, 2015

Using Earthquakes to Predict Cybercrime

Known for big surf and occasional big earthquakes, Santa Cruz, California, has also been in the news regarding big data. In fact, the police force has used predictive analytics to capture would-be thieves. Two women were taken into custody after they were discovered peering into cars in a downtown parking garage. After further questioning, one was found to have outstanding warrants while the other was carrying illegal drugs.

The unique thing here is that the police officers were directed to the parking structure by a computer program that had predicted car burglaries were especially likely there that day. This computer program, developed by PredPol, is based on models used for predicting aftershocks from earthquakes, a common occurrence in California. Its algorithms generate projections about which areas and windows of time are at the highest risk for future crimes.

The Innovative Hacker 

Organizations struggle to mitigate threats due to the continuing evolution of hackers and their methods of attack. Since Robert Tappan Morris first introduced the infant internet to his Morris worm in 1988, organizations have been fighting tweakers, script kiddies, espionage, and organized crime. The problem is that every time a solution is devised, a new hack is created. It's a never-ending cycle, and unfortunately, the turnaround time for hackers is getting shorter and shorter. They are innovating and sharing their innovations with others, who in turn take advantage and increase the number of effective attacks.

According to the 2015 Verizon Data Breach Investigations Report, which examined over 80,000 incidents, hackers have become more inventive, thinking up new tactics to evade defenses. “I hate to admit defeat,” says Jay Jacobs, co-author of the report, “but there does seem to be an advantage to the attackers right now.” (Source: Financial Times, access for a fee.)

Learning from the Past 

By analyzing and detecting patterns in years of past crime data, the Santa Cruz police department was able to determine hot spots of potential crime. In fact, on the day the two women were arrested, the program had identified the approximately one-square-block area where the parking garage is situated as one of the highest-risk locations for car burglaries.

According to the RAND Corporation's “Predictive Policing” study, there is strong evidence to support the theory that crime is statistically predictable. That's because criminals tend to operate in their comfort zone: they commit the types of crimes they have committed successfully in the past, generally at similar times and locations and using similar methods.

There is a connection between physical crime and the cybercrime organizations face today. To explain this connection further, the RAND Corporation found that prediction-led policing is not just about making predictions; it is “a comprehensive business process, of which predictive policing is a part.” That process is summarized here to explain the steps taken to analyze past information in order to prevent further criminal activity.

First, the police force collected and analyzed previous crime, incident, and offender data in order to produce predictions. These predictions uncovered hotspots. Next, data from multiple, disparate sources in the community is combined, often using big data environments to quickly process terabytes of data. This data helps tell police where hotspots of potential crime will break out based on time of day, weather, recent criminal activity, and more. The predictions then inform how police respond to a potential incident. Criminals will react to the changed environment: either they will be removed, or those still operating in the area may change their practices or move elsewhere. Regardless of the response, the environment has been altered, the initial data will be out of date, and new data will need to be collected for analysis.
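The first step of that cycle can be sketched as a toy hotspot model: count past incidents per location and time window, then flag the highest-risk combinations. This is only an illustration with made-up data, not PredPol's actual algorithm, which is based on epidemic-type aftershock models:

```python
from collections import Counter

def predict_hotspots(incidents, top_k=2):
    """Rank (location, hour-of-day) pairs by historical incident count.

    incidents: list of (location, hour) tuples from past crime reports.
    Returns the top_k highest-risk (location, hour) pairs.
    """
    counts = Counter(incidents)
    return [cell for cell, _ in counts.most_common(top_k)]

# Toy historical data: car burglaries by (city block, hour of day)
history = [
    ("garage-block", 14), ("garage-block", 14), ("garage-block", 15),
    ("riverside", 2), ("garage-block", 14), ("main-st", 9),
]

print(predict_hotspots(history, top_k=1))  # the garage block around 2 p.m. ranks highest
```

A real system would also weight recent incidents more heavily and decay old data, which is exactly the "new data will need to be collected" step above.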

The Importance of Acquiring Good, Clean Data 

This entire process hinges on the collection of data and the importance of that data to make predictions. 

Organizations today have the data necessary to make these types of predictions. In fact, our systems are churning out this data all the time through system server logs, database audits, event logs and more.  If crime is statistically predictable, and we have all evidence right there in front of us, then we need to collect and analyze it.

Of course, the future of predictive analytics and machine learning is much more than analyzing audit and log data and monitoring our databases. However, these two critical practices are important first steps toward a comprehensive cybersecurity program.

The 2015 Verizon Data Breach Investigations Report highlights that once you have the data you need, analysis is performed using inferred or computed elements of the data. To mitigate data breaches, it suggests looking for anomalies within the following:
  • Volume or amount of content transfer, such as e-mail attachments or uploads
  • Resource access patterns, such as logins or data repository touches
  • Time-based activity patterns, such as daily and weekly habits
  • Indications of job contribution, such as the amount of source code checked in by developers
  • Time spent in activities indicative of job satisfaction or discontent
Although this data is all around us, the tough part is collecting all of it effectively, efficiently, and securely, and making sense of it to predict and prescribe future actions and prevent the next data breach.
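As a minimal sketch of the "time-based activity patterns" item above, one could flag days whose activity volume deviates sharply from the historical baseline. The account data and threshold below are invented for illustration; production systems use far richer baselines than a simple z-score:

```python
import statistics

def flag_anomalies(daily_counts, threshold=3.0):
    """Return indices of days whose count lies more than `threshold`
    standard deviations from the mean, a crude anomaly baseline."""
    mean = statistics.mean(daily_counts)
    stdev = statistics.pstdev(daily_counts)
    if stdev == 0:
        return []  # perfectly uniform activity: nothing to flag
    return [i for i, c in enumerate(daily_counts)
            if abs(c - mean) / stdev > threshold]

# Logins per day for one account; the last day shows a suspicious spike
logins = [12, 10, 11, 13, 12, 11, 95]
print(flag_anomalies(logins, threshold=2.0))
```

The same function applies to the other indicators in the list, such as upload volumes or repository touches per day.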

Wednesday Mar 25, 2015

86% of Data Breaches Miss Detection, How Do You Beat The Odds?

Information security is simply not detecting the bad guys

That's according to the Verizon Data Breach Investigations Report. In fact, antivirus, intrusion detection systems, and log review all pick up less than 1% of data breach incidents. Very few companies do proactive monitoring, and those that do are simply troubleshooting problems they already know about. The result is that 86% of data breach incidents were ultimately detected by someone other than the victimized organization: an embarrassing statistic.

Only 35% of organizations audit to determine whether privileged users are tampering with systems. And for nearly 70% of organizations, it would take more than a day to detect and correct unauthorized database access or changes. With the average data breach compromise taking less than a day, the majority of organizations could lose millions of dollars before even noticing.

Join Oracle and learn how to put in place effective activity monitoring including:

  • Privileged user auditing for misuse and error
  • Suspicious activity alerting
  • Security and compliance reporting 

Monday Mar 16, 2015

Three Big Data Threat Vectors

The Biggest Breaches are Yet to Come

Where a few years ago we saw 1 million to 10 million records breached in a single incident, today we are in the age of mega-breaches, where incidents exposing 100 million to 200 million records are not uncommon.

According to the Independent Oracle Users Group Enterprise Data Security Survey, 34% of respondents say that a data breach at their organization is "inevitable" or "somewhat likely" in 2015.

Combine this with the fact that the 2014 Verizon Data Breach Investigations Report tallied more than 63,000 security incidents—including 1,367 confirmed data breaches. That's a lot of data breaches.

As business and IT executives are learning by experience, big data brings big security headaches. Hadoop, an open-source software framework for storing and processing big data in a distributed fashion, was built with very little security in mind, yet it is now being integrated with existing IT infrastructure. This can expose existing database data to the less secure Hadoop infrastructure. Simply put, Hadoop was developed to address massive data storage and faster processing, not security.

With enormous amounts of less secure big data, integrated with existing database information, I fear the biggest data breaches are yet to be announced. When organizations are not focusing on security for their big data environments, they jeopardize their company, employees, and customers.

Top Three Big Data Threats

For big data environments, and Hadoop in particular, today's top threats include:
  • Unauthorized access. Built with the notion of “data democratization”—meaning all data was accessible by all users of the cluster—Hadoop cannot stand up to rigorous compliance standards, such as HIPAA and PCI DSS, due to its lack of access controls on data. Weak password controls, only basic file system permissions, and the absence of auditing expose the Hadoop cluster to sensitive data disclosure.
  • Data provenance. In traditional Hadoop, it has been difficult to determine where a particular data set originated and what data sources it was derived from. At a minimum, the potential for garbage-in-garbage-out issues arises; worse, analytics that drive business decisions could be drawn from suspect or compromised data. Users need to know the source of the data in order to trust its validity, which is critical for relevant predictive activities.
  • DIY Hadoop. A build-your-own cluster presents inherent risks, especially in shops with few experienced engineers who can build and maintain a Hadoop cluster. As a cluster grows from a small project to an advanced enterprise deployment, every aspect of growth—patching, tuning, verifying versions across Hadoop modules, OS libraries, utilities, user management, and so on—becomes more difficult. Security holes and operational stability may be ignored until a major disaster occurs, such as a data breach.
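To make the unauthorized-access point concrete, a first-pass audit might scan for world-readable files holding sensitive data. The sketch below simulates that check over an invented directory listing; a real HDFS audit would walk the actual file system (for example via `hdfs dfs -ls` or the WebHDFS API) rather than a hard-coded list:

```python
def world_readable(files):
    """Return paths whose Unix-style mode grants read access to 'other'.

    files: list of (path, mode) pairs, mode as an octal int like 0o644.
    HDFS uses the same POSIX-style permission bits for files.
    """
    return [path for path, mode in files if mode & 0o004]

# Simulated listing of a Hadoop data directory (paths and modes invented)
listing = [
    ("/data/customers.csv", 0o644),   # world-readable: flagged
    ("/data/payroll.csv",   0o640),   # group-only: not flagged
    ("/tmp/scratch.log",    0o666),   # world-readable: flagged
]

print(world_readable(listing))
```

Checks like this only cover the first bullet; provenance and operational hygiene need separate tooling.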
Big data security is an important topic that I plan to write more about. I am currently working with MIT on a new paper to help provide some more answers to the challenges raised here. Stay tuned.

Monday Mar 09, 2015

Security and Governance Will Increase Big Data Innovation in 2015

"Let me begin with my vision of the FTC and its role in light of the emergence of big data. I grew up in a beach town in Southern California. To me, the FTC is like the lifeguard on a beach. Like a vigilant lifeguard, the FTC’s job is not to spoil anyone’s fun but to make sure that no one gets hurt. With big data, the FTC’s job is to get out of the way of innovation while making sure that consumer privacy is respected."

- Edith Ramirez, Chairwoman, Federal Trade Commission

Ms. Ramirez highlights the FTC's role in protecting consumers from what she refers to as the "indiscriminate data collection" of personal information. Her main concern is that organizations can use this information in ways that ultimately compromise individual privacy. There are many instances of data previously considered anonymous being correlated with other publicly available information to re-identify individuals.

Finding Out Truthful Data from "Anonymous" Information 

Her concerns are not unfounded. The widely cited paper Robust De-anonymization of Large Sparse Datasets illustrates the sensitivity of supposedly anonymous information: the authors re-identified individual subscribers in a publicly released, "anonymous" dataset of 500,000 Netflix subscribers by cross-referencing it with the Internet Movie Database, revealing such sensitive details as subscribers' apparent political and religious preferences. In a more recent instance of big data security concerns, a publicly released New York taxi cab dataset was completely de-anonymized, unveiling cab drivers' annual incomes and, possibly more alarming, the weekly travel habits of their passengers.
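The cross-referencing technique behind both studies is a linkage attack: join a "de-identified" dataset to a public one on shared quasi-identifiers. A toy version, with invented records, looks like this:

```python
def link_records(anonymous, public, keys):
    """Re-identify rows in `anonymous` by matching quasi-identifier
    fields (`keys`) against a `public` dataset that includes names."""
    matches = []
    for a in anonymous:
        for p in public:
            if all(a[k] == p[k] for k in keys):
                matches.append((p["name"], a))
    return matches

# "Anonymized" ratings: no names, but dates and ratings remain
anon = [{"movie": "M1", "date": "2005-07-01", "rating": 5}]
# A public review site carries the same quasi-identifiers plus a name
public = [{"name": "alice", "movie": "M1", "date": "2005-07-01", "rating": 5}]

print(link_records(anon, public, keys=["movie", "date", "rating"]))
```

The lesson is that removing names is not anonymization; any combination of fields rare enough to be unique can serve as the join key.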

Many large firms have found their big data projects shut down by compliance officers concerned about legal or regulatory violations. Chairwoman Ramirez highlights specific cases where the FTC has cracked down on firms it feels have violated customer privacy rights, including cases against Google, Facebook, and Twitter. She feels that big data opens up additional security challenges that must be addressed.

"Companies are putting data together in new ways, comingling data sets that have never been comingled before," says Jeff Pollock, Oracle vice president for product management. "That’s precisely the value of big data environments. But these changes are also leading to interesting new security and compliance concerns."

The possible security and privacy pitfalls of big data center around three fundamental areas:

  • Ubiquitous and indiscriminate collection from a wide range of devices 
  • Unexpected uses of collected data, especially without customer consent 
  • Unintended data breach risks with larger consequences

Organizations will find big data experimentation easier to initiate when the data involved is locked down. They need to be able to address regulatory and privacy concerns by demonstrating compliance. This means extending modern security practices like data masking and redaction to the full big data environment, in addition to the must-haves of access, authorization and auditing.

Securing the big data lifecycle requires:

  • Authentication and authorization of users, applications, and databases
  • Privileged user access and administration
  • Encryption of data at rest and in motion
  • Data redaction and masking for nonproduction environments
  • Separation of roles and responsibilities
  • Implementation of least privilege
  • Transport security
  • API security
  • Monitoring, auditing, alerting, and compliance reporting

With Oracle, organizations can achieve all the benefits big data has to offer while applying a comprehensive data security approach that ensures the right people, internal and external, get access to the appropriate data at the right time and place, through the right channel. The Oracle Big Data solution safeguards against malicious attacks and protects organizational information assets by securing data in motion and at rest. It enables organizations to separate roles and responsibilities and to protect sensitive data without compromising privileged user access, such as that of database administrators. Furthermore, it provides monitoring, auditing, and compliance reporting across big data systems as well as traditional data management systems.

Learn more about Oracle Security Solutions.

This article has been re-purposed from the Oracle Big Data blog.  

Wednesday Mar 04, 2015

Securing Information in the New Digital Economy

We are in the midst of a data breach epidemic, fueled by a lucrative information black market. The perimeter security most IT organizations rely on has become largely ineffective. Nearly 70% of security resources are focused on perimeter controls, but most exploited vulnerabilities are internal. 

Effective modern security requires an inside-out approach with a focus on data and internal controls.

A New Hacker Economy

Today, a layered economy of specialized, organized hackers has created a black market estimated to be more lucrative than the illegal drug trade. (Lillian Ablon 2014) Hacking-for-hire has made the black market accessible to non-experts, expanding its reach exponentially.  As businesses grow their online footprints, criminals find new ways of attacking their vulnerabilities.

Thinking Inside-Out

Internal systems are the new perimeter – the new front line in the battle for data security. Security should be built into the customer and employee experiences.

  • Manage privileged user access and think beyond the password: another layer of authentication can vastly increase security.
  • Make it more costly and difficult for attackers by protecting the most valuable information first. 

Rebalancing Information Security

Disrupt the information supply chain and cut off the cash flow to the black market. Taking a security inside-out approach could bring an end to the arms race, giving economic recovery a chance.

To learn more about Securing Information in the New Digital Economy, read the joint Oracle and Verizon Report.

Thursday Feb 19, 2015

Top Two Cloud Security Concerns: Data Breaches and Data Loss

Apply a Data-centric Security Strategy in the Cloud

Don't miss the webcast Applying a Data-centric Security Strategy in the Cloud.

Most organizations are worried about putting sensitive data into the cloud. In fact, industry reports indicate data breaches and data loss are their top two concerns. Rather than apply a one-size-fits-all approach to data security, organizations would be better prepared if they implemented security controls based on the type of data and its use. In this session, you will learn how to apply the appropriate levels of security controls based on data sensitivity, and then map them to your cloud environment.

Watch now.  

Tuesday Feb 03, 2015

All Data is Not Equal, Map Security Controls to the Value of Data

As you look at data, you will quickly realize that not all data is equal. What do I mean by that? Quite simply, some data does not require the same security controls as other data.

When explaining this to customers, we use a metals analogy to simplify the provisioning of controls: Bronze represents the least sensitive data, ranging up through Platinum, the highest-value and most sensitive data within an organization.

Thinking in this manner provides the ability to refine many configurations into a few pre-configured, pre-approved, reference architectures. Applying this methodology is especially important when it comes to the cloud. It comes down to consistency in applying security controls, based on the data itself.

Oracle’s preventive, detective, and administrative pillars can be applied to the various data categorizations. At this point in the conversation, customers begin to understand more pragmatically how this framework can be used to align security controls with the value, or sensitivity, of the data.

Security practitioners can then work with lines of business to assign the appropriate level of controls, both systematically and consistently across the organization.  

So, for example, at the Bronze level, items such as the application of patches, secure configuration scanning, and the most basic auditing would be appropriate. Data deemed more sensitive, such as personally identifiable information or personal health information, requires additional security controls around the application data. This would include, for example, blocking default access by those designated as database administrators.

Finally, data at the highest sensitivity level, Platinum, should have controls such as blocking database changes during production time frames, preventing SQL injection attacks, and centralized enterprise-wide reporting and alerting for compliance and audit requirements.
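The tiering above can be expressed as a simple lookup from classification level to a cumulative set of controls, where each tier inherits everything below it. The Silver and Gold tiers and the exact control lists here are illustrative assumptions filling in the metals analogy, not an Oracle reference architecture:

```python
# Controls per tier; names below are an illustrative summary only
TIER_CONTROLS = {
    "bronze":   ["patching", "secure configuration scanning", "basic auditing"],
    "silver":   ["encryption at rest"],
    "gold":     ["block default DBA access to application data"],
    "platinum": ["block production-time database changes",
                 "SQL injection prevention",
                 "centralized compliance reporting and alerting"],
}

ORDER = ["bronze", "silver", "gold", "platinum"]

def controls_for(tier):
    """Controls are cumulative: each tier inherits all lower tiers."""
    idx = ORDER.index(tier)
    controls = []
    for t in ORDER[: idx + 1]:
        controls.extend(TIER_CONTROLS[t])
    return controls

print(controls_for("gold"))
```

Encoding the policy this way is what makes the "few pre-configured, pre-approved reference architectures" idea systematic: lines of business pick a tier, and the control set follows consistently.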

To learn more about Oracle Security Solutions, download the ebook "Securing Oracle Database 12c: A Technical Primer" by Oracle security experts.

Wednesday Jan 28, 2015

Oracle Cloud Forum - Mapping Security Controls to the Value of Data

Learn how to prioritize your security control deployments by watching Oracle's Cloud Platform Online Forum session, "Applying a Data-Centric Security Strategy in the Cloud."

Most organizations are worried about putting sensitive data into the cloud. In fact, industry reports indicate data breaches and data loss are their top two concerns.

Case in point: my previous blog article discusses how more than a third (34%) of organizations believe a data breach is "somewhat likely" or even "inevitable" in 2015.

Rather than apply a one-size-fits-all approach to data security, organizations would be better prepared if they implemented security controls based on the type of data and its use.

In this session, you will learn how to apply the appropriate levels of security controls based on data sensitivity, and then map them to your cloud environment. 

Register to watch the forum here.  

