Welcome to All Things Data Integration: Announcements, Insights, Best Practices, Tips & Tricks, and Trend Related...

Oracle GoldenGate for Big Data is Generally Available Now!

Thomas Vengal
Director, Product Management

The much-awaited Oracle GoldenGate for Big Data 12.2 is released
today and is available for download on OTN.

Let me give you a quick recap of Oracle GoldenGate for Big
Data. Oracle GoldenGate for Big Data streams transactional data into big data
systems in real time, raising the quality and timeliness of business insights. Oracle
GoldenGate for Big Data also provides a flexible and extensible solution
that supports all major big data systems.

Oracle GoldenGate for Big Data

  • Same trusted Oracle GoldenGate architecture used by thousands of customers
  • Data delivery to Big Data targets, including NoSQL databases
  • Support for Polyglot, Lambda, and Kappa architectures for streaming data


  • Lower impact on source databases compared to batch processing such as Sqoop or ETL processes
  • Simple 1:1 data architecture for populating “raw data” zones
  • Real-time delivery for streaming analytics/apps
  • Reliable, proven at scale with high performance

Architecture – GoldenGate for Big Data 12.2 versus 12.1

New Features in Oracle GoldenGate for Big Data 12.2

New Java-Based Replicat Process

The advantages of using the Java-based Replicat process are the following:

    1. Improved performance with Java-based adapters
    2. Declarative design and configurable mapping
    3. Transaction grouping based on operation count and message size
    4. Improved checkpointing functionality

       E.g.: CHECKPOINTSECS 1 (default 10 seconds)
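As an illustrative sketch only (the process name, adapter properties file, and schema below are placeholder assumptions, and paths are installation-specific), a Replicat parameter file combining the grouping and checkpointing options above might look like this:

```
REPLICAT rbd
-- Hand trail data to the Java adapter via the adapter properties file
TARGETDB LIBFILE libggjava.so SET property=dirprm/bd.props
-- Group up to 1000 source operations into a single target transaction
GROUPTRANSOPS 1000
-- Checkpoint every second instead of the 10-second default
CHECKPOINTSECS 1
MAP myschema.*, TARGET myschema.*;
```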

Dynamic Data

You no longer need to define SOURCEDEFS. DDL changes are
automatically replicated to the target. For example, if a new column named
“mycolumn” is added on the source database, it is automatically replicated
to the target without stopping and reconfiguring Oracle GoldenGate.


Pluggable Formatters

Oracle GoldenGate for Big Data can write to any Big Data
target in various data formats, such as delimited text, XML, JSON, Avro,
or a custom format. This can save users cost and time for staging data in
ETL operations.

Example: gg.handler.{name}.format=
Supported values are “delimitedtext”, “xml”, “json”, “avro_row”, and “avro_op”.
For a custom format, set the property to the formatter class (for example,
com.yourcompany.YourFormatter) and add the class to the extended class path
in the configuration file.

Security Enhancement

Native Kerberos support is available. Example of configuration:
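As a sketch (the handler name, principal, and keytab path below are placeholder assumptions), Kerberos authentication for the HDFS handler is enabled with properties along these lines:

```
gg.handler.hdfs.authType=kerberos
gg.handler.hdfs.kerberosPrincipal=ogguser@EXAMPLE.COM
gg.handler.hdfs.kerberosKeytabFile=/etc/security/keytabs/ogguser.keytab
```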
Oracle GoldenGate for Big Data provides mapping functionality
from source table to target table and from source field to target field for
HDFS/Hive, HBase, Flume, and Kafka. The metadata is also validated against
Hive or an Avro schema to ensure data correctness.

COLMAP (USEDEFAULTS, "cust_code2"=cust_code,"city2"=city);

Kafka as target

Oracle GoldenGate for Big Data can write logical change
records to a Kafka topic. Operations such as insert, update, delete, and
primary key update can be handled. It supports native Kafka compression
codecs such as GZIP and Snappy.

Example of defining Kafka Handler
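As a sketch (the handler name, topic name, and producer configuration file below are placeholder assumptions), a Kafka handler can be defined with properties along these lines:

```
gg.handlerlist=kafkahandler
gg.handler.kafkahandler.type=kafka
gg.handler.kafkahandler.KafkaProducerConfigFile=custom_kafka_producer.properties
gg.handler.kafkahandler.TopicName=oggtopic
gg.handler.kafkahandler.format=avro_op
gg.handler.kafkahandler.mode=tx
```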




Other Enhancements

  • Partition data by Hive table and/or column. Partitioning
    into a new file based on designated column values
    • gg.handler.{name}.partitionByTable=true | false
    • gg.handler.{name}.partitioner.{fully qualified table name}={colname}
    • gg.handler.{name}.partitioner.{fully qualified table name}={colname1},{colname2}
    • Example: gg.handler.<yourhandlername>.partitioner.dbo.TCUSTORD=region
  • Configurable file rolling property for HDFS
    (file size, duration, inactivity timer, metadata change)
  • Configurable file output encoding into HDFS
  • Automatically create an HBase table if it does not exist
  • Ability to treat primary key updates as a delete
    followed by an insert in HBase and Flume
  • HBase row key generation
  • New timestamping functionality with microsecond
    precision in ISO-8601 format
  • Availability on additional OS platforms: Windows and Solaris
  • Certification for newer versions: Apache HDFS
    2.7.x, Cloudera 5.4.x, Hortonworks 2.3, and Kafka

For more details about the new product features, refer to the Oracle GoldenGate for Big
Data Release Notes and User Documentation.

For more information, see the Oracle GoldenGate for Big Data product page.

Feel free to reach out to me for your queries by posting in this blog or tweeting @thomasvengal

Happy Holidays!

Join the discussion

Comments (2)
  • guest Monday, January 4, 2016

    Does Goldengate work with Postgres as a source?

  • Thomas Monday, January 4, 2016

    Postgres is not supported as a source at this point of time. Postgres is supported as a target only. Please get in touch with your Oracle representative with your detailed use-case and versions.
