So by now you know that our #1 document of 2013 was the Master Generic Data Fix script, otherwise known as the MGD. If you are already familiar with our MGD then great, but if you aren't sure what it is or you are unclear on when and how you use this fantastic script, then read on...
What is the MGD?
Doc ID 1360390.1 - R12: Master GDF Diagnostic (MGD) to Validate Data Related to Invoices, Payments, Accounting, Suppliers and EBTax [VIDEO] was created because customers had reported a significant amount of data corruption and we needed a way to quickly identify all affected transactions so that we could:
a) fix the corrupt data
b) fix the code to prevent future corruption
Doc ID 1360390.1 also contains a link to watch the webcast video that fully explains the usage of this script and includes a demo of how it is used. Just click on the video link as shown in the following screenshot:
This script may be safely run at any time. It can be used in the following ways:
a) reactively - when you are experiencing errors while performing a particular invoice-, payment-, or supplier-related action
b) proactively - to scan all transactions in a date range or an operating unit and head off potential issues.
The script will produce an HTML output of all known problems relating to the transactions you specify along with information on notes and/or data-fix patches which may already be available to resolve the data issues.
No data is created, updated, or deleted by this script and it is fully supported by Oracle Development and Support.
You should always run this script prior to logging a Service Request when you suspect there is data corruption or if you simply want to proactively verify the data integrity of your transactions.
Want to give it a try?
1) Query Doc ID 1360390.1 and click the Download Now button to save ap_gdf_detect_pkg.zip. Unzip it and you will see it contains three files:
a) ap_gdf_detect_pkg.sql: the package definition file
b) single_trx.sql: submits the validation procedure for a single transaction
c) date_range.sql: submits the validation procedure for a range of dates within specified operating units
2) Run ap_gdf_detect_pkg.sql as the APPS user to create the package definition.
3) Run the appropriate SQL file to submit the detection procedure:
a) To validate a single invoice, payment, or supplier, run single_trx.sql, which will prompt for an invoice ID, check ID, and supplier ID. You must supply at least one of these values, but you can supply any combination of two, or even all three.
b) To validate all invoices, payments, or suppliers in a date range for specified operating units, run date_range.sql. This script will prompt you for your date range, one or more operating units, and the type of transactions to be validated. Dates for the date range must fall in open periods. To indicate more than one operating unit, separate the values with commas. If the operating unit is left blank, the process will run for all operating units (though this is generally not recommended for performance reasons). The transaction type should be INVOICE, PAYMENT, or SUPPLIER, or ALL to validate all three types.
4) The detection procedure will create an output file of the format AP-GDF-<timestamp>.html and will indicate the location of this file and of the log file. Review the output and follow the highlighted instructions to resolve your corruption issues.
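The steps above can be sketched as a SQL*Plus session. Note that the prompt labels and example values below are illustrative assumptions, not the exact prompts of the downloaded scripts; the scripts themselves will tell you what to enter.

```sql
-- Step 2: connect as the APPS user and create the detection package
-- (connect string and password are placeholders for your instance)
sqlplus apps@<your_instance>
SQL> @ap_gdf_detect_pkg.sql

-- Step 3a: validate a single transaction; supply at least one ID
-- and leave the others blank (IDs below are made up for illustration)
SQL> @single_trx.sql
Enter invoice id: 123456
Enter check id:
Enter supplier id:

-- Step 3b: or validate a date range for one or more operating units;
-- dates must fall in open periods, operating units are comma-separated
SQL> @date_range.sql
Enter start date: 01-DEC-2013
Enter end date:   31-DEC-2013
Enter operating unit(s): 204,205
Enter transaction type (INVOICE/PAYMENT/SUPPLIER/ALL): ALL

-- Step 4: review the AP-GDF-<timestamp>.html output file it reports
```

Either way, the procedure only reads your data, so you can run it as often as you like.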
What if you aren't technical?
Great news: the script can also be installed by your DBA as a concurrent program, as detailed in Doc ID 1361255.1. This simplifies monthly usage, since functional users can run it themselves and alert the DBA if a problem is found. The output requires no SQL knowledge: it is a nicely formatted HTML report that highlights any issues detected and points the user to the solution document needed to resolve each one.
Simple! What are you waiting for?
Still got questions?
Please let us know by updating the Master GDF Diagnostic Thread in the Payables Community.
HAPPY NEW YEAR!!