Many customers have chosen to migrate their systems to Oracle Cloud Infrastructure. If you have Oracle APEX applications on-premises and are wondering how to move those applications, with their related database objects and data, to Autonomous Database, this two-part blog post is for you.
Oracle APEX is a low-code, web-based, data-centric development platform whose engine lives inside the Oracle Database. When transferring your APEX applications, you need to consider moving two parts: the APEX application itself, which contains the application's metadata objects, and the underlying database schema and objects, along with the data those objects contain, that the APEX applications use.
A developer might manually export the APEX application and use Supporting Objects scripts, such as Installation Scripts and Data Packages, to migrate the APEX application with its related objects and data. But what if you want to migrate a whole schema, large datasets, or objects containing BLOB data? In that case, your best option is to use Data Pump export, and you can then import your objects using one of the following options:
- Database Actions, DBMS_CLOUD and DATAPUMP
- SQLcl and Object Storage
- impdp and Cloud Shell
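As a starting point on-premises, a Data Pump export similar to the following produces the dump file used throughout this post. The connect string, password, and file names below are placeholders; adjust them to your environment:

```shell
# Export the SHLAYEL schema from the on-premises database
expdp shlayel/your_password@onprem_pdb schemas=SHLAYEL \
  directory=DATA_PUMP_DIR dumpfile=export.dmp logfile=export_exp.log
```

The resulting export.dmp file is what you will upload to Object Storage in the next step.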
Credentials and Object Storage
You first need to upload your DMP file to Object Storage. To import from Object Storage, you then need to create a database credential based on an Auth Token.
For more details, the following blog post shows how to create such credentials.
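As a minimal sketch, creating such a credential comes down to a single DBMS_CLOUD call. The credential name matches the one used in the examples below; the username and Auth Token values are placeholders:

```sql
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'MYCRED',
    username        => 'your.cloud.user@example.com',  -- your OCI username
    password        => 'auth-token-value'              -- Auth Token, not your console password
  );
END;
/
```

You can upload the DMP file itself through the Oracle Cloud console or with the OCI CLI.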
Database Actions, DBMS_CLOUD and DATAPUMP
The Autonomous Database comes pre-equipped with the Database Actions tool. Under Database Actions, you can click SQL to run your SQL and PL/SQL code.
This method is easy and lets you import your dump file into Oracle Cloud – APEX Service, where you don't have SQL*Net access to your database. All you need is to log in to your Oracle Cloud account and open the SQL utility under Database Actions.
The disadvantage is that you can't script the import of the exported dump file from your on-premises server.
The following PL/SQL code snippet, based on Sinan's great post, can be run in the SQL utility of Database Actions.
DECLARE
  -- Replace with your values
  exported_schema VARCHAR2(64)  := 'SHLAYEL';
  import_schema   VARCHAR2(64)  := 'SHLAYEL';  -- in case you want to remap the schema
  data_pump_dir   VARCHAR2(64)  := 'DATA_PUMP_DIR';
  dump_file_name  VARCHAR2(256) := 'https://objectstorage.me-dubai-1.oraclecloud.com/n/spartaprogrammanagement/b/export/o/export.dmp';
  credential_name VARCHAR2(64)  := 'MYCRED';
  parallel        NUMBER        := 4;
  job_handle      NUMBER;
  job_name        VARCHAR2(64);
  job_status      VARCHAR2(128);
  output_message  VARCHAR2(1024);
BEGIN
  job_name   := dbms_scheduler.generate_job_name('import_');
  job_handle := dbms_datapump.open(operation => 'IMPORT',
                                   job_mode  => 'SCHEMA',
                                   job_name  => job_name);
  -- For URI dump files, the credential name is passed in the directory parameter
  dbms_datapump.add_file(handle    => job_handle,
                         filename  => dump_file_name,
                         directory => credential_name,
                         filetype  => dbms_datapump.ku$_file_type_uridump_file);
  -- Write the import log to DATA_PUMP_DIR
  dbms_datapump.add_file(handle    => job_handle,
                         filename  => import_schema || '_import.log',
                         directory => data_pump_dir,
                         filetype  => dbms_datapump.ku$_file_type_log_file);
  dbms_datapump.metadata_remap(job_handle, 'REMAP_SCHEMA', exported_schema, import_schema);
  dbms_datapump.metadata_filter(handle => job_handle,
                                name   => 'SCHEMA_EXPR',
                                value  => 'IN(''' || exported_schema || ''')');
  dbms_datapump.set_parallel(handle => job_handle, degree => parallel);
  dbms_datapump.start_job(handle => job_handle, skip_current => 0, abort_step => 0);
  dbms_datapump.wait_for_job(handle => job_handle, job_state => job_status);
  output_message := 'Data Pump Import Execution: ''' || job_status || '''';
  dbms_output.put_line(output_message);
END;
/
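After the job completes, you can confirm the import log was written to DATA_PUMP_DIR without leaving Database Actions. A quick check using DBMS_CLOUD (the log file name assumes the naming used in the snippet above):

```sql
-- List files in DATA_PUMP_DIR to confirm the import log exists
SELECT object_name, bytes
  FROM DBMS_CLOUD.LIST_FILES('DATA_PUMP_DIR');
```

If you want to keep the log, DBMS_CLOUD.PUT_OBJECT can copy it from DATA_PUMP_DIR to your Object Storage bucket using the same credential.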
impdp and Cloud Shell
If you are a fan of the impdp command, you can use it to import your dump file directly from within Cloud Shell. This option requires the fewest clicks: click the Cloud Shell icon once and issue a command similar to this:
impdp admin/XXXXXXXXX@pdb1_high credential=mycred directory=data_pump_dir remap_tablespace=APEX_1422010366028782:DATA \
  dumpfile='https://console.us-phoenix-1.oraclecloud.com/object-storage/buckets/spartaprogrammanagement/project-move/projects.dmp' \
  parallel=1 partition_options=merge transform=segment_attributes:n transform=dwcs_cvt_iots:y transform=constraint_use_default_index:y \
  exclude=cluster,db_link
Note the remap_tablespace and credential parameters. We will discuss those in more detail in the next blog post.
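Cloud Shell also comes with the OCI CLI pre-installed, so if your dump file is not yet in Object Storage, you can upload it from the same session before running impdp. The bucket and file names below are placeholders:

```shell
# Upload the dump file to an Object Storage bucket from Cloud Shell
oci os object put --bucket-name export --file export.dmp --name export.dmp
```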
SQLcl and Object Storage
Oracle SQL Developer Command Line (SQLcl) is a free command-line interface for Oracle Database. It allows you to execute SQL and PL/SQL interactively or in batch. SQLcl provides in-line editing, statement completion, and command recall for a feature-rich experience, while still supporting your previously written SQL*Plus scripts.
The advantage of using SQLcl is the ability to script the commands and automate the export/import process right from your on-premises network.
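As a sketch of that automation, SQLcl can run a script non-interactively, so the whole import can be driven from a shell script on the on-premises side. The connect string and script name are placeholders, and import.sql could contain, for example, a DBMS_DATAPUMP block like the one shown earlier:

```shell
# Run an import script against the Autonomous Database non-interactively with SQLcl
sql admin/XXXXXXXXX@pdb1_high @import.sql
```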
The following blog post will show a detailed, practical example of using SQLcl and Object Storage.
Happy APEXing!