If you’ve spent time with Autonomous Database (ADB-S) on Oracle Cloud, you have likely taken advantage of the DBMS_CLOUD package, which gives users a simple and secure syntax to load data from cloud object stores and helps manage data files between the object store and the database file system. Today, we’re extending DBMS_CLOUD’s capabilities, enabling users to load data from files in file system directories directly into their database, providing another convenient and efficient method for importing data.

Previously, loading data into ADB-S or creating external tables over data in directories required the additional step of sending your data files to your object store of choice, or using the Data Pump tool. While we recommend keeping most of your data in cloud object storage, this new feature in DBMS_CLOUD lets you quickly load data residing in your database file system or in a mounted external network file system.
To use this new feature, simply load data or create an external table using DBMS_CLOUD while pointing your file URIs to the directory of your choice, as in the examples below. The DBMS_CLOUD package takes care of all the underlying complexity, such as file access, data parsing, and loading into the database. Note: if you would like a complete walkthrough of how to load data using DBMS_CLOUD in ADB-S, please visit this comprehensive LiveLab workshop.
-- Example 1: If users want to copy/load a single CSV file from a directory "MY_DIR"
BEGIN
  DBMS_CLOUD.copy_data(
    table_name    => 'DWCS_IT1',
    file_uri_list => 'MY_DIR:test.csv',
    format        => JSON_OBJECT('type' VALUE 'csv'));
END;
/
-- Example 2: If users want to copy/load all files in a directory "MY_DIR" use the wildcard character:
BEGIN
  DBMS_CLOUD.copy_data(
    table_name    => 'DWCS_IT1',
    file_uri_list => 'MY_DIR:*',
    format        => JSON_OBJECT('type' VALUE 'csv'));
END;
/
-- Example 3: If users want to copy files starting with "test" in directory "MY_DIR", they can also use the wildcard character:
BEGIN
  DBMS_CLOUD.copy_data(
    table_name    => 'DWCS_IT1',
    file_uri_list => 'MY_DIR:test*',
    format        => JSON_OBJECT('type' VALUE 'csv'));
END;
/
-- Example 4: If users want to copy files from different directories, separate the file URIs with commas:
BEGIN
  DBMS_CLOUD.copy_data(
    table_name    => 'DWCS_IT1',
    file_uri_list => 'MY_DIR:test1.csv,MY_DIR2:test2.csv',
    format        => JSON_OBJECT('type' VALUE 'csv'));
END;
/
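The same directory:file syntax also works when creating an external table over directory files with DBMS_CLOUD.CREATE_EXTERNAL_TABLE. Here is an illustrative sketch; the table name and columns are hypothetical, and your column_list must match your file's structure:

-- Example 5 (illustrative): create an external table over CSV files in "MY_DIR"
BEGIN
  DBMS_CLOUD.create_external_table(
    table_name    => 'SALES_EXT',           -- hypothetical external table name
    file_uri_list => 'MY_DIR:sales*.csv',   -- same directory:file syntax as copy_data
    column_list   => 'PROD_ID NUMBER, AMOUNT NUMBER, SALE_DATE DATE',
    format        => JSON_OBJECT('type' VALUE 'csv', 'skipheaders' VALUE '1'));
END;
/

Once created, the external table can be queried with ordinary SELECT statements while the data stays in the directory files.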
This method of directory file access can be used to copy data or create external tables over all the file formats ADB-S supports with object store access, including:
- CSV, Text
- JSON
- Data Pump
- Parquet
- Avro
- ORC
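For the non-CSV formats above, typically only the format type changes. For instance, a sketch of loading a Parquet file from a directory into an existing table (the table and file names here are illustrative):

-- Illustrative: load a Parquet file from "MY_DIR" into an existing table
BEGIN
  DBMS_CLOUD.copy_data(
    table_name    => 'DWCS_IT2',            -- hypothetical target table
    file_uri_list => 'MY_DIR:test.parquet',
    format        => JSON_OBJECT('type' VALUE 'parquet'));
END;
/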
Directory files are not yet supported with external partitioned and hybrid partitioned tables; that support will arrive soon. For additional detail on directory file support, refer here to the ADB-S documentation.
This new feature in DBMS_CLOUD allows users to load data from directory files directly into their databases and is another step forward in simplifying and streamlining the data loading process. With its simplicity and efficiency, it is sure to be a valuable tool for deployments of all sizes.
Like what I write? Follow me on Twitter!
