Friday 12 September 2014

Accounts Receivables - Terms Description

Accounts Receivables

1.      Invoice
An invoice in AR is used to bill the customer for goods sold and services rendered.
2.      Debit Memo
A debit memo in AR is conceptually similar to an invoice. A debit memo can be used to bill the customer for miscellaneous charges that were not included in the original invoice for the product/service. Debit memo information is stored in ra_customer_trx_all; the type of transaction a particular row represents can be determined from the class column of ar_payment_schedules_all by joining the two tables on customer_trx_id.
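
As a quick illustration, here is a sketch of that join, listing debit memos for a customer. The class value 'DM' is the commonly documented code for debit memos; verify the codes and column names on your instance.

SELECT ct.trx_number,
       ct.trx_date,
       ps.class,                        -- 'DM' = debit memo, 'INV', 'CM', 'DEP', etc.
       ps.amount_due_original,
       ps.amount_due_remaining
  FROM ra_customer_trx_all      ct,
       ar_payment_schedules_all ps
 WHERE ps.customer_trx_id    = ct.customer_trx_id
   AND ps.class              = 'DM'
   AND ct.bill_to_customer_id = :customer_id;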

A debit memo is not created the way a credit memo is. Unlike a credit memo, a debit memo is usually not linked to any other invoice; the whole idea behind creating a debit memo is to increase the customer's outstanding balance.

Why create a debit memo instead of an invoice then?
      A debit memo says, in effect, "we under-billed you on our earlier invoice, and with reference to that transaction we are now billing you for the remaining amount." You can enter the source invoice number in the Reference field.
3.      Credit Memo
Credit memos carry negative amounts and are typically used to credit the customer's account. Credit memo information is also stored in ra_customer_trx_all.

The idea behind creating a credit memo is to reduce the customer's invoice balance. A credit memo can be created in two ways:
1.      Manually creating a credit memo like any other AR manual invoice.
                                                i.      Invoice Class: Credit Memo
                                                ii.      Invoice Type: OM Credit Memo, Credit Memo
2.      Querying an existing AR Invoice and calling the Credit function from Actions menu
Query the AR Invoice for which you want to create a credit memo
                                                i.      Click on “Actions menu”
                                                ii.      Specify the “Reason for Credit Memo”
                                                iii.      Enter the Line % for Credit memo (Eg: 25% entered)
                                                iv.      Enter the Tax % for Credit memo (Eg: 10% entered)
                                                v.      Click the Save button
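
From the back end, a credit memo created against an invoice can usually be tied back to it. Below is a sketch of that lookup; it assumes ra_customer_trx_all.previous_customer_trx_id points to the credited transaction, which is worth verifying on your data.

SELECT cm.trx_number   credit_memo_number,
       cm.trx_date,
       inv.trx_number  credited_invoice_number
  FROM ra_customer_trx_all cm,
       ra_customer_trx_all inv
 WHERE cm.previous_customer_trx_id = inv.customer_trx_id   -- link from credit memo to original invoice
   AND inv.trx_number = :invoice_number;
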
4.      Deposit/Advance Creation Process and Apply to Invoices
Deposits are entered in the AR system to capture the customer advances and apply them to the AR invoices.

APPLY DEPOSITS TO AR INVOICES
Query or Enter the AR Invoice for which you want to apply the deposit
a.       Click on Actions menu
b.      Click Apply Deposit, Select the Deposit , Click Ok
c.       Deposit is applied to the Invoice.
5.      LOCKBOX

AutoLockbox automatically creates receipts in Receivables using electronic information that your bank provides. Receivables lets you specify the payment method for each lockbox you define. Payment methods provide the default accounting information for receipts you create through AutoLockbox. Receivables displays active lockboxes as list of values choices in the Submit Lockbox Processing window. You can disable a lockbox by unchecking the Active box and then saving your work.
Remittance        - Payment
6.      ADJUSTMENTS

An adjustment, as the name suggests, is used to adjust part of the amount on an invoice. For example, if an insignificant amount is outstanding on a particular invoice, it can be adjusted off in order to close the invoice. Adjustment information is stored in the table ar_adjustments_all, which can be linked to ra_customer_trx_all by the column customer_trx_id.
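
A sketch of a query listing adjustments against a transaction using that link (the columns shown are the commonly used ones; confirm them on your release):

SELECT ct.trx_number,
       adj.adjustment_number,
       adj.amount,
       adj.status,
       adj.apply_date
  FROM ar_adjustments_all  adj,
       ra_customer_trx_all ct
 WHERE adj.customer_trx_id = ct.customer_trx_id
   AND ct.trx_number       = :invoice_number;
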
7.      AGING BUCKETS

Aging buckets are time periods you can use to review and report on your open receivables. For example, the 4-Bucket Aging bucket that Receivables provides consists of four periods: -999 to 0 days past due, 1 to 30 days past due, 31-61 days past due, and 61-91 days past due. When you create your collections reports or view your customer accounts, you can specify an aging bucket and 'as of date', and Receivables will group the transactions and their amounts into the appropriate days-past-due period. You can define an unlimited number of aging buckets and lines (time periods) within an aging bucket. However, all Receivables aging reports include a maximum of the first seven time periods for an aging bucket. If you want to report on additional time periods, you must create custom aging reports. You can also customize the aging buckets that Receivables provides.
8.      DUNNING LETTERS

These are letters that are used to chase up receivables from customers. Receivables lets you create dunning letters using two different methods. You can use one of the ten dunning letters that Receivables provides, or create your own custom dunning letters in the Dunning Letters window. The ten dunning letters that Receivables provides include text files that you can customize to suit your dunning needs. Each of these letters (entitled USER1 - 10) includes two flat files: one for the body of your dunning letter text and the other for footer information. These files reside in the Receivables $AR_TOP/reports directory. For example, the dunning letter USER1 has a body file of ardl1b.txt and a footer file of ardl1f.txt. The 'b' and the 'f' identify these files as body and footer files, respectively.
9.      MEMO LINES

Standard memo lines are lines that you assign to a transaction when the item is not an inventory item (for example, 'Consulting Services'). You can assign memo lines to debit memos, on-account credits, debit memo reversals, chargebacks, commitments, and invoices. Receivables displays your standard memo lines as list of values choices during credit memo entry in the Credit Transactions window and during invoice entry in the Lines window. When you create chargebacks and debit memo reversals, you can either use the standard line that Receivables provides or enter your own. You can create an unlimited number of standard memo lines.
AR Technical Process
Start by creating customers in Accounts Receivables. Customers in AR (Accounts Receivables) can be of two types, namely:
Person
Organization

As the names suggest, the customer type is Person when the customer is an individual, and Organization when the customer being created is an organization. From the back-end point of view, a customer can be distinguished as a person or an organization by the column party_type of the table hz_parties, which is linked to the table hz_cust_accounts by the column party_id.

The main attributes of a customer are:

Customer Name (stored as party_name in the table hz_parties; first name, middle name and last name are also present for a customer of type Person)
Customer Number (Stored as account_number in the table hz_cust_accounts)
Address: Multiple addresses can be defined for a customer, for various purposes, e.g. billing or shipping the goods. We can also specify whether a particular address is active at any point in time, and the primary flag indicates whether an address is the primary one. Within an operating unit there can be only one active primary bill-to address for a customer. All of this information can be derived from the tables hz_cust_acct_sites_all and hz_cust_site_uses_all, where cust_account_id from hz_cust_accounts acts as a foreign key.
Main Tables
hz_parties -> hz_cust_accounts -> hz_cust_acct_sites_all -> hz_cust_site_uses_all
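
A sketch of the standard join across these four tables, useful for pulling a customer's active site usages (the join columns below are the usual foreign keys; verify them against your schema):

SELECT hp.party_name,
       hca.account_number,
       hcsu.site_use_code,             -- e.g. BILL_TO, SHIP_TO
       hcsu.primary_flag,
       hcas.org_id
  FROM hz_parties             hp,
       hz_cust_accounts       hca,
       hz_cust_acct_sites_all hcas,
       hz_cust_site_uses_all  hcsu
 WHERE hca.party_id           = hp.party_id
   AND hcas.cust_account_id   = hca.cust_account_id
   AND hcsu.cust_acct_site_id = hcas.cust_acct_site_id
   AND hcsu.status            = 'A'     -- active site uses only
   AND hp.party_name          = :customer_name;
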
A.    Creating Invoice
Transaction lines, tax, Freight and Discount
Now that we have a customer in AR, we need to invoice him/her for the goods/services rendered. This brings us to the next step in AR, i.e. invoicing the customer.

The main attributes of creating an invoice in AR are:
Number (can be entered manually or can be generated automatically based on the source of the invoice).
Date
Currency
Source
Class (Invoice)
Type (Transaction type name e.g. INV_TRX_TYPE1)
Ship to Customer (Ship to address of the customer).
Bill to Customer (Customer to be invoiced)
Payment Terms (The system generates Due Date based on this).
Each invoice can have multiple lines describing the Item, Quantity, Price of the product/services for which the customer is being invoiced. The system calculates line Amount based on the Quantity and Price. The sum of line amounts of all the lines in an invoice makes the Invoice Amount.

After entering the lines for an invoice and saving it, the invoice needs to be completed before any payment can be applied to it. Pressing the Complete button at the lower-left corner of the Transactions screen does this. Any further changes to the invoice can only be made by incompleting the invoice again.

The invoice information is stored in the table ra_customer_trx_all, where trx_number corresponds to the invoice number entered from the front end. Once an invoice is completed, the record also appears in the ar_payment_schedules_all table, where customer_trx_id from ra_customer_trx_all acts as a foreign key.
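A sketch of how a completed invoice and its open balance can be queried from these two tables (column names are the commonly used ones; confirm on your instance):

SELECT ct.trx_number,
       ct.trx_date,
       ps.amount_due_original,
       ps.amount_due_remaining,
       ps.status,                      -- OP = open, CL = closed
       ps.due_date
  FROM ra_customer_trx_all      ct,
       ar_payment_schedules_all ps
 WHERE ps.customer_trx_id = ct.customer_trx_id
   AND ct.trx_number      = :invoice_number;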

B.     CREATING MANUAL RECEIPTS

Now that we have invoiced our customer, we expect to receive payment against the invoice. This brings us to the next step in AR, where we create receipts for the payments received from the customer in the form of a cheque, demand draft, etc.

The main attributes of receipt in AR are: -
Receipt Number
Currency
Net Receipt Amount
Receipt Date
GL Date
Payment Method
Transaction Number (to which the payment has to be applied)
C.    Receipt application
After entering this information we proceed to apply the receipt to the invoice. Here we can change the amount to be applied to the invoice and the apply date on which the receipt should be applied. We also control which transaction the receipt is applied to; a receipt can be kept unapplied or on-account if so desired. After entering the information in the Applications screen, save the application.

Receipts information is stored in table ar_cash_receipts_all, where receipt_number corresponds to the receipt number entered from the front end. Each receipt record is also stored in ar_payment_schedules_all where cash_receipt_id from ar_cash_receipts_all acts as a foreign key.
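A sketch of how receipts and their payment-schedule rows can be pulled together (standard columns assumed; verify on your release):

SELECT cr.receipt_number,
       cr.receipt_date,
       cr.amount,
       cr.status,                      -- e.g. APP, UNAPP, UNID, REV
       ps.amount_applied,
       ps.amount_due_remaining
  FROM ar_cash_receipts_all     cr,
       ar_payment_schedules_all ps
 WHERE ps.cash_receipt_id = cr.cash_receipt_id
   AND cr.receipt_number  = :receipt_number;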

A receipt created in AR can have any one of the following statuses: -

Unidentified - When the receipt is created without linking it to customer or invoice. Unidentified receipts could be created when the source of the incoming payment is unclear.
Unapplied – When the payment is linked to a particular customer but has not been applied to any of the transactions.
Onaccount - When the amount of the receipt is not applied to any of the transactions and is kept on-account of the customer.

A receipt can also have some amount as applied, some as unapplied and some as on-account or a combination of any two of these.
This completes one basic cycle of AR i.e.

Creating a customer
Invoicing the customer
Applying payment to the invoice and closing it.

Note: While this illustrates the brief functionality of AR, appropriate accounting entries need to be passed by setting the accounts for the customer.

The application of a receipt to an invoice can take various other forms for e.g.

To an invoice of amount $100, a receipt of amount $50 is applied. In this case the invoice remains open with an outstanding amount of  $50 while the receipt amount gets exhausted.
A receipt of amount $200 is applied to an invoice of amount $100. In this case the remaining receipt amount i.e. $100 can be kept unapplied or on-account.
10.  RECEIPT REVERSAL

If you apply a receipt against an invoice whose revenue was automatically deferred upon import, and you later reverse that receipt, then the impact of the receipt reversal differs depending on the original reason for the revenue deferral:
If revenue on an invoice was deferred due to unmet header level collectibility requirements, then Receivables initiates revenue recognition whenever you apply a receipt to the invoice. If you
reverse a previously applied receipt, then Receivables automatically unearns the previously earned revenue. In some cases, you might apply a receipt against an invoice line, but Receivables cannot recognize revenue for that line due to unmet line level collectibility requirements. Therefore, Receivables leaves the receipt amount as unearned revenue, but flags the amount as pending revenue recognition at a later date. If you later reverse the receipt, then Receivables reflects the receipt reversal by simply removing that pending flag from the receipt amount.
If revenue on an invoice was deferred due to unmet line level collectibility requirements only, then the reversal of a receipt does not impact the amount and timing of revenue recognition.
11.  AUTO INVOICE OPEN INTERFACE

The process for creating invoices in AR manually has been demonstrated in the earlier sections. However, considering the volume of transactions involved, it is not always feasible to create all invoices manually. To overcome this, AR provides a standard program called the 'AutoInvoice Master Program'. Whenever items are shipped from the Order Management module, the Workflow Background Process is run to populate the interface tables provided by AR, and the AutoInvoice program can then be run to automatically generate invoices in AR. Optionally, data from legacy systems (if applicable) can also be populated into the interface tables by building customized interfaces. The following interface tables need to be populated before running AutoInvoice:

RA_INTERFACE_LINES_ALL
RA_INTERFACE_DISTRIBUTIONS_ALL (Optional depending on the specific project requirements)
RA_INTERFACE_SALESCREDITS_ALL (Optional depending on the specific project requirements)
The AutoInvoice program can also be used to create debit memos and credit memos, apart from invoices. Applying a credit memo to an invoice can also be done through AutoInvoice.

AutoInvoice is the process used for importing transactions from feeder modules like Projects and Order Management, and also from legacy applications/systems when the Receivables module is being implemented for the first time.
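
As a rough illustration only, a minimal insert into RA_INTERFACE_LINES_ALL might look like the sketch below. The batch source name 'LEGACY', the transaction type 'Invoice', the term '30 NET' and the bind variables are all assumptions; the columns that are actually mandatory depend on your transaction batch source, grouping rules and AutoAccounting setup, so validate every column against your own configuration.

INSERT INTO ra_interface_lines_all
       (batch_source_name, line_type, description,
        currency_code, conversion_type, conversion_rate,
        amount, quantity, unit_selling_price,
        trx_date, gl_date,
        cust_trx_type_name, term_name,
        orig_system_bill_customer_id, orig_system_bill_address_id,
        interface_line_context, interface_line_attribute1,
        org_id)
VALUES ('LEGACY', 'LINE', 'Imported legacy invoice line',
        'USD', 'User', 1,
        100, 1, 100,
        SYSDATE, SYSDATE,
        'Invoice', '30 NET',
        :bill_to_customer_id,           -- derive from your customer setup
        :bill_to_address_id,            -- derive from your customer setup
        'LEGACY INVOICES', :legacy_invoice_number,
        :org_id);
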
Auto-Invoice
Navigation: Receivable >> Interface >> Auto-Invoice
Chargeback Creation Process
A chargeback is a postponement of the payment date. It can be created only in the Receipts workbench. In this process the old invoice is closed and a new transaction is created for the postponed date.
In other words, the remaining balance of the existing debit item is adjusted to zero, and a new debit item is created to bill the customer for the unpaid balance of the original invoice.
After applying a receipt to an invoice, if you want to re-bill the remaining balance while keeping the receipt application in place, click the Chargeback button.

Adjustment Creation Process
Adjustments are created from receipts when some additional amount needs to be applied to the transaction, such as bank charges, bad debts, etc. Click the Adjustment button on the receipt Applications window.

Credit Memo Usage

In Order Management, goods are sometimes returned to the supplier because of damage. To refund the material amount, the supplier creates a credit memo for the customer, crediting the customer's balance.

Auto Invoice

Invoices are created from other sources such as Order Management, Service Contracts, Projects Billing, etc. This is done using the AutoInvoice master program.

Tables in AR

1.      The RA_CUSTOMER_TRX_ALL table stores invoice, debit memo, commitment, bills receivable, and credit memo header information. Each row in this table includes general invoice information such as customer, transaction type, and printing instructions. One row exists for each invoice, debit memo, commitment, bill receivable, and credit memo that you create in Oracle Receivables. Invoices, debit memos, credit memos, bills receivable, and commitments are distinguished by their associated transaction types, which are stored in the RA_CUST_TRX_TYPES_ALL table.


2. The RA_CUSTOMER_TRX_LINES_ALL table stores information about invoice, debit memo, credit memo, bills receivable, and commitment lines. For example, an invoice can have one line for Product A and another line for Product B. Each line requires one row in this table.


3. The RA_CUST_TRX_LINE_GL_DIST_ALL table stores the accounting records for revenue, unearned revenue, and unbilled receivables for each invoice or credit memo line. Oracle Receivables creates one row for each accounting distribution, and at least one accounting distribution must exist for each invoice or credit memo line. Each row in this table includes the General Ledger account and the amount of the accounting entry. The AMOUNT column is required even though it is defined as null-allowed. Receivables uses this information to post the proper amounts to General Ledger.


4. The RA_CUST_TRX_LINE_SALESREPS_ALL table stores sales credit assignments for invoice lines. If Receivables bases your invoice distributions on sales credits, a mapping exists between the sales credit assignments in this table and the distributions in the RA_CUST_TRX_LINE_GL_DIST_ALL table.

5. The RA_CUST_TRX_TYPES_ALL table stores information about each transaction type that is used for invoices, commitments, bills receivable, and credit memos. Each row includes Auto Accounting information as well as standard defaults for the invoices that result.

6. AR_PAYMENT_SCHEDULES_ALL holds the payment schedules for the transactions
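
A sketch tying the main AR transaction tables together for a single invoice (the joins shown are the standard foreign keys; verify against your version, and remember the transaction type is also striped by org_id):

SELECT ct.trx_number,
       ctt.name                      trx_type,
       ctl.line_number,
       ctl.extended_amount,
       gld.code_combination_id,
       gld.amount                    distribution_amount
  FROM ra_customer_trx_all          ct,
       ra_cust_trx_types_all        ctt,
       ra_customer_trx_lines_all    ctl,
       ra_cust_trx_line_gl_dist_all gld
 WHERE ct.cust_trx_type_id        = ctt.cust_trx_type_id
   AND ctl.customer_trx_id        = ct.customer_trx_id
   AND gld.customer_trx_line_id (+) = ctl.customer_trx_line_id
   AND ct.trx_number              = :invoice_number;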

Subledger Accounting In Detail

Subledger Accounting


About SLA:
1. SLA is an intermediate step between the subledger products and Oracle General Ledger
2. Journal entries are created in Subledger Accounting and then transferred to Oracle General Ledger


Tables that are involved in this process are
XLA_TRANSACTION_ENTITIES
XLA_EVENTS
XLA_AE_HEADERS
XLA_AE_LINES
GL_IMPORT_REFERENCES
GL_JE_LINES
GL_JE_HEADERS
GL_BALANCES
NOTE: For every transaction, an entity_id is created in the xla_transaction_entities table. This column is used to link the transaction entities to the xla_ae_headers table (via xla_events). By default an entity initially has one event; if we then create an adjustment or some other activity against the same transaction (i.e. the same entity), that is treated as another event. So one entity can have multiple events.

Links Between these tables
GL_JE_LINES (JE_HEADER_ID, JE_LINE_NUM)    
GL_IMPORT_REFERENCES (JE_HEADER_ID, JE_LINE_NUM)
GL_IMPORT_REFERENCES (GL_SL_LINK_TABLE, GL_SL_LINK_ID)
XLA_AE_LINES (GL_SL_LINK_TABLE, GL_SL_LINK_ID)
XLA_AE_LINES (APPLICATION_ID, AE_HEADER_ID)
XLA_AE_HEADERS (APPLICATION_ID, AE_HEADER_ID)
XLA_AE_HEADERS (APPLICATION_ID, EVENT_ID)
XLA_EVENTS (APPLICATION_ID, EVENT_ID)
XLA_EVENTS (APPLICATION_ID, ENTITY_ID)
XLA_TRANSACTION_ENTITIES (APPLICATION_ID, ENTITY_ID)
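
A sketch of the full join chain as a single query, walking from a GL journal line down to its SLA header and source event/entity (this follows the links listed above; outside an applications session, xla_transaction_entities is often accessed as xla.xla_transaction_entities):

SELECT gjh.je_header_id,
       gjl.je_line_num,
       xal.accounted_dr,
       xal.accounted_cr,
       xah.event_id,
       xte.entity_code,
       xte.source_id_int_1            -- transaction id in the source subledger
  FROM gl_je_headers            gjh,
       gl_je_lines              gjl,
       gl_import_references     gir,
       xla_ae_lines             xal,
       xla_ae_headers           xah,
       xla_events               xe,
       xla_transaction_entities xte
 WHERE gjl.je_header_id     = gjh.je_header_id
   AND gir.je_header_id     = gjl.je_header_id
   AND gir.je_line_num      = gjl.je_line_num
   AND xal.gl_sl_link_table = gir.gl_sl_link_table
   AND xal.gl_sl_link_id    = gir.gl_sl_link_id
   AND xah.application_id   = xal.application_id
   AND xah.ae_header_id     = xal.ae_header_id
   AND xe.application_id    = xah.application_id
   AND xe.event_id          = xah.event_id
   AND xte.application_id   = xe.application_id
   AND xte.entity_id        = xe.entity_id
   AND gjh.je_header_id     = :je_header_id;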

Draft : Draft will create journal entries, which are not final, which means they are not ready to be transferred to GL.
You can create accounting on this transaction again and again, which will delete the old journal entries and create new ones. You can’t transfer these journal entries to GL.
xla_events.process_status_code = D
xla_events.event_status_code = U
xla_ae_headers.accounting_entry_status_code = D
Final : Final will create journal entries, which can be transferred to GL. Once it is finally accounted you can’t run create accounting on the particular transaction (specifically on that event).
Run Transfer Journal Entries to GL program
xla_events.process_status_code = P
xla_events.event_status_code = P
xla_ae_headers.accounting_entry_status_code = F
Final Post: Final Post will create journal entries in final mode, transfer them to GL and post them.
xla_ae_headers.accounting_entry_status_code = F
xla_ae_headers.transfer_status_code = Y
xla_events.process_status_code = P
xla_events.event_status_code = P
============================================================

In R12 we can't find any reference columns in the GL tables, whereas in 11i we had invoice numbers and supplier/customer details stored in the GL tables as reference columns.

So in R12 a new module, XLA, was introduced; the XLA tables don't store any direct references either.

First we have to check the transactions in xla_transaction_entities, where source_id_int_1 stores the ID of the source transaction. Based on the transaction we then map to the appropriate module.

See the examples below.

AP Invoices 

where xla_transaction_entities.source_id_int_1 = ap_invoices_all.invoice_id 

and entity_code = 'AP_INVOICES' 

AP Payments 

where xla_transaction_entities.source_id_int_1 = ap_checks_All.check_id 

and xte.entity_code = 'AP_PAYMENTS' 

AR Invoice

where xla_transaction_entities.source_id_int_1 = ra_customer_trx_all.customer_trx_id

and xte.entity_code = 'TRANSACTIONS'

AR Receipt

where xla_transaction_entities.source_id_int_1 = ar_cash_receipts_all.CASH_RECEIPT_ID

This is the way we have to map all modules based on entity_code. 
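
Putting the mapping together, here is a sketch of a lookup from an AR invoice to its subledger journal, using entity_code 'TRANSACTIONS' as shown above (for AR receipts the entity code is typically 'RECEIPTS', but confirm the codes for your modules from xla_transaction_entities itself):

SELECT ct.trx_number,
       xah.accounting_entry_status_code,   -- D = draft, F = final
       xal.accounting_class_code,
       xal.accounted_dr,
       xal.accounted_cr
  FROM ra_customer_trx_all      ct,
       xla_transaction_entities xte,
       xla_events               xe,
       xla_ae_headers           xah,
       xla_ae_lines             xal
 WHERE xte.application_id  = 222              -- 222 = Receivables
   AND xte.entity_code     = 'TRANSACTIONS'
   AND xte.source_id_int_1 = ct.customer_trx_id
   AND xe.application_id   = xte.application_id
   AND xe.entity_id        = xte.entity_id
   AND xah.application_id  = xe.application_id
   AND xah.event_id        = xe.event_id
   AND xal.application_id  = xah.application_id
   AND xal.ae_header_id    = xah.ae_header_id
   AND ct.trx_number       = :invoice_number;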

==============================================================


Technical details of Drilldown from GL to Sub Modules

February 13, 2014
This article explains the technical references for the journal import and drilldown functionalities, which will help you understand the SLA architecture and troubleshoot support issues.

Drilldown from General Ledger takes us to the SLA Journal details.

The technical flow for this is as follows:

Drilldown is performed from Journal lines in General Ledger. This data is stored in the table GL_JE_LINES. The link between SLA data and data in GL is through the table GL_IMPORT_REFERENCES.

Note that data is populated in GL_IMPORT_REFERENCES only if the 'Import References' option is selected in the journal source definition. The data can be mapped from GL_JE_LINES to the GL_IMPORT_REFERENCES table using the columns je_header_id and je_line_num.

The SLA Journal line data is stored in the table XLA_AE_LINES and header details are stored in XLA_AE_HEADERS.
The data in GL_IMPORT_REFERENCES can be mapped to XLA_AE_LINES using the columns gl_sl_link_id and gl_sl_link_table.

GL to Submodules Drilldown

Important points to note:

Data can be transferred from SLA to GL in either Summary or Detail mode. This option is defined for the Event Class, which is attached to the Journal Line Type. Journal Line Type is then attached to the Journal Line Definition, which is finally attached to the Subledger Accounting Method. Please note that if the transfer is done in Summary mode, then the Reference columns will not be populated in either GL_IMPORT_REFERENCES or GL_JE_LINES.
Data from SLA, which comes to GL_INTERFACE, can then be imported in either Summary or Detail Mode. This option is set while defining the Ledger in the Accounting Setup Manager.
Please note that irrespective of the Mode of Transfer (Detail or Summary), Drilldown will work from GL to all the Subledgers that are using the SLA Engine. The subledger reference information required for drilldown is taken from the SLA tables XLA_AE_HEADERS and XLA_AE_LINES, and not from GL_JE_LINES or GL_IMPORT_REFERENCES.
Due to patch# 7512923:R12.XLA.A, the reference columns REFERENCE5 to REFERENCE10 are populated with entity_id, event_id, ae_header_id, ae_line_num, accounted_cr and accounted_dr. You need to apply this patch if you want to see Line References on Account Inquiry -> Journal Detail.
To identify the mode of Transfer:

Column gl_transfer_mode_code in XLA_AE_LINES indicates the transfer mode from SLA to GL. For this column, value ‘S’ indicates that the transfer is done in Summary mode while ‘D’ indicates that the transfer is in Detail mode.
For the journal import mode, the value for each application is stored in the table XLA_LEDGER_OPTIONS. This table stores the setup of the various applications for the ledgers defined. The column indicating the journal import mode is transfer_to_gl_mode_code. The values are:
P : Summarized by Period
A : Summarized by Accounting Date

D : No Summarization i.e Detail mode

Tuesday 9 September 2014

SQL Loader

SQL LOADER

SQL*Loader is an Oracle utility used to load data into a table from a datafile that contains the records to be loaded. SQL*Loader takes a data file, as well as a control file, and inserts the data into the table. When a control file is executed, it can create three (3) files: a log file, a bad (reject) file, and a discard file.

The log file tells you the state of the tables and indexes and the number of logical records already read from the input datafile. This information can be used to resume the load where it left off.
The bad (reject) file contains the records that were rejected because of formatting errors or because they caused Oracle errors.
The discard file contains the records that did not meet any of the loading criteria, for example records that failed every WHEN clause specified in the control file. These records differ from rejected records.

Structure of the data file:
The data file can be in fixed record format or variable record format.

Fixed Record Format would look like the below. In this case you give a specific position where the Control file can expect a data field:

7369 SMITH      CLERK        7902  12/17/1980         800                
7499 ALLEN      SALESMAN  7698  2/20/1981           1600  
7521 WARD      SALESMAN  7698  2/22/1981           1250  
7566 JONES      MANAGER   7839  4/2/1981             2975            
7654 MARTIN    SALESMAN  7698  9/28/1981           1250  
7698 BLAKE      MANAGER   7839  5/1/1981             2850            
7782 CLARK      MANAGER   7839  6/9/1981             2450            
7788 SCOTT      ANALYST    7566  12/9/1982           3000            
7839 KING        PRESIDENT          11/17/1981         5000            
7844 TURNER    SALESMAN  7698  9/8/1981            1500          
7876 ADAMS     CLERK         7788  1/12/1983          1100            
7900 JAMES      CLERK         7698  12/3/1981          950            
7902 FORD        ANALYST     7566  12/3/1981          3000          
7934 MILLER     CLERK         7782  1/23/1982          1300          

Variable record format would look like the below, where the data fields are separated by a delimiter.
Note: The Delimiter can be anything you like. In this case it is "|"

1196700|9|0|692.64
1378901|2|3900|488.62
1418700|2|2320|467.92
1418702|14|8740|4056.36
1499100|1|0|3.68
1632800|3|0|1866.66
1632900|1|70|12.64
1637600|50|0|755.5

Structure of a Control file:

Sample CTL file for loading a Variable record data file:

OPTIONS (SKIP = 1)   --The first row in the data file is skipped without loading
LOAD DATA
INFILE '$FILE'             -- Specify the data file path and name
APPEND                       -- type of loading (INSERT, APPEND, REPLACE, TRUNCATE)
INTO TABLE "APPS"."BUDGET"   -- the table to be loaded into
FIELDS TERMINATED BY '|'           -- Specify the delimiter if variable format datafile
 OPTIONALLY ENCLOSED BY '"'   --the values of the data fields may be enclosed in "
TRAILING NULLCOLS     -- columns that are not present in the record treated as null
  (ITEM_NUMBER    "TRIM(:ITEM_NUMBER)", -- Can use all SQL functions on columns
  QTY                 DECIMAL EXTERNAL,
  REVENUE             DECIMAL EXTERNAL,
  EXT_COST            DECIMAL EXTERNAL TERMINATED BY WHITESPACE "(TRIM(:EXT_COST))"  ,
  MONTH           "to_char(LAST_DAY(ADD_MONTHS(SYSDATE,-1)),'DD-MON-YY')" ,
DIVISION_CODE    CONSTANT "AUD"   -- Can specify a constant value instead of getting the value from the datafile
   )

The OPTIONS clause precedes the LOAD DATA statement. The OPTIONS parameter allows you to specify runtime arguments in the control file, rather than on the command line. The following arguments can be specified using the OPTIONS parameter.
SKIP = n -- Number of logical records to skip (Default 0)
LOAD = n -- Number of logical records to load (Default all)
ERRORS = n -- Number of errors to allow (Default 50)
ROWS = n   -- Number of rows in conventional path bind array or between direct path data saves (Default: Conventional Path 64, Direct path all)
BINDSIZE = n -- Size of conventional path bind array in bytes (System-dependent default)
SILENT = {FEEDBACK | ERRORS | DISCARDS | ALL} -- Suppress messages during run
                (header, feedback, errors, discards, partitions, all)
DIRECT = {TRUE | FALSE} --Use direct path (Default FALSE)
PARALLEL = {TRUE | FALSE} -- Perform parallel load (Default FALSE)

LOAD DATA statement is required at the beginning of the control file.

INFILE: INFILE keyword is used to specify location of the datafile or datafiles.
INFILE * specifies that the data is found in the control file and not in an external file. INFILE '$FILE', can be used to send the filepath and filename as a parameter when registered as a concurrent program.
INFILE   '/home/vision/kap/import2.csv' specifies the filepath and the filename.

Example where datafile is an external file:
LOAD DATA
INFILE   '/home/vision/kap/import2.csv'
INTO TABLE kap_emp
FIELDS TERMINATED BY ","
( emp_num, emp_name, department_num, department_name )

Example where datafile is in the Control file:
LOAD DATA
INFILE *
INTO TABLE kap_emp
FIELDS TERMINATED BY ","            
( emp_num, emp_name, department_num, department_name )
BEGINDATA
7369,SMITH,7902,Accounting
7499,ALLEN,7698,Sales
7521,WARD,7698,Accounting
7566,JONES,7839,Sales
7654,MARTIN,7698,Accounting

Example where file name and path is sent as a parameter when registered as a concurrent program
LOAD DATA
INFILE '$FILE'
INTO TABLE kap_emp
FIELDS TERMINATED BY ","            
( emp_num, emp_name, department_num, department_name )


TYPE OF LOADING:
INSERT   -- If the table you are loading is empty, INSERT can be used.
APPEND  -- If data already exists in the table, SQL*Loader appends the new rows to it. If data doesn't already exist, the new rows are simply loaded.
REPLACE -- All rows in the table are deleted and the new data is loaded
TRUNCATE -- SQL*Loader uses the SQL TRUNCATE command.

INTO TABLE is required to identify the table to be loaded into. In the above example INTO TABLE "APPS"."BUDGET", APPS refers to the Schema and BUDGET is the Table name.
FIELDS TERMINATED BY specifies how the data fields are terminated in the datafile.(If the file is Comma delimited or Pipe delimited etc)

OPTIONALLY ENCLOSED BY '"' specifies that data fields may also be enclosed by quotation marks.
TRAILING NULLCOLS clause tells SQL*Loader to treat any relatively positioned columns that are not present in the record as null columns.
Loading a fixed format data file:
LOAD DATA
INFILE 'sample.dat'
INTO TABLE emp
(      empno         POSITION(01:04)   INTEGER EXTERNAL,
       ename          POSITION(06:15)   CHAR,
       job            POSITION(17:25)   CHAR,
       mgr            POSITION(27:30)   INTEGER EXTERNAL,
       sal            POSITION(32:39)   DECIMAL EXTERNAL,
       comm           POSITION(41:48)   DECIMAL EXTERNAL,
       deptno         POSITION(50:51)   INTEGER EXTERNAL)
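
For completeness, here is a sketch of a target table that would match the fixed-format control file above; the column names and datatypes are assumptions based on the classic EMP sample data shown earlier.

CREATE TABLE emp
( empno   NUMBER(4),
  ename   VARCHAR2(10),
  job     VARCHAR2(9),
  mgr     NUMBER(4),
  sal     NUMBER(8,2),
  comm    NUMBER(8,2),
  deptno  NUMBER(2)
);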

Steps to Run the SQL* LOADER from UNIX:
At the prompt, invoke SQL*Loader as follows:
      sqlldr USERID=scott/tiger CONTROL=<control file name> LOG=<log file name>

SQL*Loader loads the tables, creates the log file, and returns you to the system prompt. You can check the log file to see the results of running the case study.

Register as concurrent Program:

Place the Control file in $CUSTOM_TOP/bin.
Define the Executable. Give the Execution Method as SQL*LOADER.
Define the Program. Add the Parameter for FILENAME.

 Skip columns:
You can skip columns using the 'FILLER' option.

LOAD DATA
--
--
--
TRAILING NULLCOLS
(
name FILLER,
empno,
sal
)

Here the column NAME will be skipped; a FILLER field is read from the datafile but not loaded into the table.

OAF Interview Questions


What is an EO?
a. Map to a database table or other data source
b. Each entity object instance represents a single row
c. Contains attributes representing database columns
d. Fundamental BC4J object through which all inserts/updates/deletes interact with the database
e. Central point for business logic and validation related to a table
f. Encapsulates attribute-level and entity-level validation logic
g. Can contain custom business methods

2. What is a VO?
a. Represent a query result
b. Are used for joining, filtering, projecting, and sorting your business data
c. Can be based on any number of entity objects
d. Can also be constructed from a SQL statement


3. What are the methods in controller?
processRequest and processFormRequest

4. What is a Controller?
A controller is a Java file that can be associated with a complete OAF page or with a specific region.
There are several tasks you will do routinely in your code:
• Handle button presses and other events
• Automatic queries
• Dynamic WHERE clauses
• Commits
• JSP forwards
The logic for accomplishing all these tasks is written in the controller.

5. When is the processRequest method called?
PR method is called when the page is getting rendered onto the screen


6. When is processFormRequest method called?
PFR method is called when we perform some action on the screen like click of submit button or click on lov


7. What is extension?
Extension is when you take an already existing component, e.g. an OAF page or a region, and add more functionality to it without disturbing the original functionality.


8. What is personalization?
Oracle Applications Framework has an OA Personalization Framework associated with it, so you can personalize any OAF page in an Oracle E-Business Suite application without changing the underlying code of that page. OA Framework makes it very easy to personalize the appearance of a page, and even the data displayed on it.

9. What are levels of personalization?
1. Function Level
2. Localization Level
3. Site Level
4. Organization Level
5. Responsibility Level
6. Admin-Seeded User Level
7. Portlet Level
8. User Level
1) What is BC4J?

Business Components for Java is JDeveloper's programming framework for building multitier database applications from reusable business components. These applications typically consist of:

• A client-side user interface written in Java and/or HTML.
• One or more business logic tier components that provide business logic and views of business objects.
• Tables on the database server that store the underlying data.

2.What are all the components of BC4J?
Following are the components of BC4J:

• Entity Object - An EO encapsulates the business logic and rules. EOs are used for inserting, updating and deleting data from the database table. An EO is also used for validating the records across the application.

• View Object - View object encapsulates the database query. It is used for selecting data. It provides iteration over a query result set. VO’s are primarily based on EO’s. It can be used on multiple EO’s if the UI is for update.

• Application Module - Application Modules serve as containers for related BC4J components. The pages are related by participating in the same task. It also defines the logical data model and business methods needed.

2) What is an EO?
An EO encapsulates the business logic and rules. EOs are used for inserting, updating and deleting data, and for validation across the application. We can also link EOs together to create an Association object.

3) What is an VO?
A view object encapsulates a database query. It is used for selecting data and provides iteration over a query result set. VOs are primarily based on EOs, and a VO can be based on multiple EOs if the UI is used for update. It provides a single point of contact for getting and setting entity object values. VOs can be linked together to form view links.

4) What is an AO?
An association object is created where we link EOs. For example, take a search page where we link the same EO to itself to form an association between manager and employee. Every employee should have an associated manager, but if the employee is the President then there is no manager associated. This is a good example for understanding an AO.

5) What is an VL?
A view link is an active link between view objects. A view link can be created by providing the source and destination views and the source and destination attributes. There are two modes of view link operation that can be performed: document and master/detail.

6). What is UIX?
UIX is an extensible, J2EE-based framework for building web applications. It is based on the Model-View-Controller (MVC) design pattern, which provides the foundation for building scalable enterprise web applications.

7). Where the VO is located in the MVC architecture?
VO is located in the View Layer in MVC which is responsible for presenting the data to the user.

9) Which package should include EO and AO.
The EO and AO will be present in the schema.server package.

10) What is the difference between inline lov and external lov.
Inline lov is a lov which is used only for that particular page for which it was created and cannot be used by any other page.

An external LOV is a common LOV that can be used by any page. It is used by giving the full path of the LOV in the "External LOV" property of the item.

1) what is a Javabean?
JavaBeans is an object-oriented programming interface that lets you build reusable applications or program building blocks, called components, that can be deployed in a network on any major operating system platform.

2) What is query Bean?
QueryBean is used to execute and return the results of a query on behalf of the QueryPortlet application.

3) what is the difference between autocustomization criteria and result based search?
A results-based search generates the search items automatically based on the columns of the results table.
In an autocustomization search we specify which fields should be displayed as the search criteria.

4) what is MDS?
MDS is the Metadata Service. When a web page is broken into small units like buttons, fields, etc., they are stored in a database. These are not stored as binary files but as data in tables (the JDR tables). MDS provides the service to store and return page definitions, and it assembles those component/field definitions in a meaningful manner to build a page.

5) What is XML?
XML is a markup language for documents containing structured information.
Structured information contains both content (words, pictures, etc.) and some indication of what role that content plays (for example, content in a section heading has a different meaning from content in a footnote, which means something different than content in a figure caption or content in a database table, etc.).

6) What is the difference between customization and extension?
Customization is under direct user control. The user explicitly selects between certain options. Using customization a user can:
    Alter the functionality of an application
    Alter the existing UI
    Alter existing business logic

Extension is about extending the functionality of an application beyond what can be done through personalization. Using extension we can:

    Add new functional flows
    Extend or override existing business logic
    Create New application/module
    Create New page
    Create New attribute
    Extend/Override defaults & validations

7) What is Personalization?
Personalization enables you to declaratively tailor the UI look-and-feel, layout or visibility of page content to suit a business need or a user preference. Using Personalization we can:

    • Tailor the order in which table columns are displayed.
    • Tailor a query result.
    • Tailor the color scheme of the UI.
    • Folder Forms
    • Do Forms Personalization
   
8) Can you extend every possible Application Module?
Answer: No. The root AM cannot be extended.

9) What is rootAM?
The application module that is associated with the top-level page region (the pageLayout region) is the root application module.

10) Why can’t Root AM be extended?

The root AM is loaded first, and only after that are the MDS substitutions parsed.
Hence the root AM gets loaded even before the substitution definitions from the MDS layer are worked out.

Obviously, the root AM can't substitute itself, hence it can't be extended.

Monday 8 September 2014

Tech Diff between 11i and R12




MO_GLOBAL-DIVE INTO R12 MULTI ORG DESIGN

I hope you have already read the article Basics of Multi Org in R12. A few questions come to mind when we think about Multi Org in R12. The best way to analyse those questions is by opening the package MO_GLOBAL. Don't worry if you are not yet on R12; package MO_GLOBAL is installed in 11.5.10 too. Let's get digging.

How is CLIENT_INFO being replaced in R12? 
Let’s take an example.
In pre Release 12, you would have had following methodology for PO_HEADERS_ALL
a. A table is created in PO Schema, named PO_HEADERS_ALL
b. A synonym named PO_HEADERS_ALL is created in APPS schema, referring to PO.PO_HEADERS_ALL
c. Create a view PO_HEADERS in APPS schema, as "select * from po_headers_all where org_id=client_info"

But now in R12, following will happen
a. A table is created in PO Schema, named PO_HEADERS_ALL
b. A synonym named PO_HEADERS_ALL is created in APPS schema, referring to PO.PO_HEADERS_ALL
c. Another synonym named PO_HEADERS is created in APPS, referring to PO_HEADERS_ALL
d. A Row Level security is applied to PO_HEADERS, using package function MO_GLOBAL.ORG_SECURITY.
This can be double-checked by running SQL 
select * from all_policies where object_name='PO_HEADERS'
e. The effect of this policy is that, whenever you access PO_HEADERS, Oracle RLS will dynamically append a WHERE clause similar to the one below
SELECT * FROM PO_HEADERS 
WHERE EXISTS (SELECT 1 FROM mo_glob_org_access_tmp oa WHERE oa.organization_id = org_id) 
Also see **** below, later.

Does this mean, if I create a new custom table, I will have to apply RLS [Row Level Security] against Custom table too?

Yes indeed, if it contains data partitioned by ORG_ID. All you need to do in that case is to assign the package function MO_GLOBAL.ORG_SECURITY to that table/synonym/view.
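
A sketch of how such a policy could be registered with DBMS_RLS against a hypothetical custom synonym XX_PO_HEADERS. The object and policy names below are made up, and EBS also provides seeded utilities for registering MOAC policies, so treat this purely as an illustration of the idea:

BEGIN
  dbms_rls.add_policy(
    object_schema   => 'APPS',
    object_name     => 'XX_PO_HEADERS',          -- hypothetical custom synonym in APPS
    policy_name     => 'XX_PO_HEADERS_ORG_SEC',  -- hypothetical policy name
    function_schema => 'APPS',
    policy_function => 'MO_GLOBAL.ORG_SECURITY', -- same predicate function used by the seeded MOAC policies
    statement_types => 'SELECT, INSERT, UPDATE, DELETE',
    update_check    => TRUE);
END;
/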

Will the Multi Org Row Level security be applied against the table or the synonym or the view?

In theory, RLS can be applied against any of the above objects. However in practice, you will apply RLS against Objects in APPS Schema. This means, you will most probably apply RLS on Synonyms. Basically, the Multi Org Views are now replaced by RLS Secured Synonyms. Hence no code change is required where the pre-R12 Multi-Org secured view was being accessed. The responsibility of securing data as per ORG_ID now lies with RLS [also known as VPD - Virtual Private Database].

I have made changes to my Multi Org Security Profile, by attaching a new Org Hierarchy. Do I need to run any process?

Just like we do in HRMS, it is advised that any changes to Security Profiles must be followed by running "Security List Maintenance"

What is MO_GLOBAL.INIT?
Purpose of mo_global.init: it checks whether the new Multi Org Security Profile is set, to decide whether the new security profile method will be used.
If the new MO security profile is set, then mo_global.init inserts one record, for each organization in the org hierarchy, into the table mo_glob_org_access_tmp.

When and from where is mo_global.init called?
This package procedure is called as soon as you log in or switch responsibility, just as FND_GLOBAL.INITIALIZE is called. It is safe to assume that Oracle will invoke MO_GLOBAL.INIT after FND_GLOBAL.INITIALIZE.

Is mo_glob_org_access_tmp table a global temporary table?
Yes, it is. Hence after Multi Org is initialised for your session, your session will have X number of records in table mo_glob_org_access_tmp. X is the number of organizations assigned to MO Security profile [view org hierarchy or org list in security profile]

What is the purpose of MO_GLOBAL.ORG_SECURITY?
The purpose of Row Level Security is to hide certain data [based on some conditions]. RLS does so by appending a where clause to the secured object.
1. MO_GLOBAL.ORG_SECURITY is a function that returns a predicate for the WHERE CLAUSE
2. The where clause will be appended to Table/Synonym/View for which Multi Org Row Level security is enabled

What is the purpose of MO_GLOBAL.SET_POLICY_CONTEXT?
This procedure has two parameters
p_access_mode
Pass a value "S" in case you want your current session to work against Single ORG_ID
Pass a value of "M" in case you want your current session to work against multiple ORG_ID's
p_org_id
Only applicable if p_access_mode is passed value of "S"

In SQL*Plus, I wish to set my session to work against a specific org [one single org]. How do I do that in R12?
SQL>> exec MO_GLOBAL.SET_POLICY_CONTEXT ('S', 101);
In the above case, ORG_ID 101 will be assigned as current org for your session.
Internally, the following call will be executed by Oracle when you set your context to a single org: dbms_session.set_context('multi_org2', 'current_org_id', 101);

**** If the current database session is initialised for Single Org[as in above step], then Where clause appended to object by Row-Level-Security will be
WHERE org_id = sys_context('multi_org2','current_org_id')
Why will I as an Apps techie ever use MO_GLOBAL.SET_POLICY_CONTEXT?
Let's say you wish to call an API to create invoices in ORG_ID 101. In case the API does not have a parameter for org_id, you can do the below:
a. exec MO_GLOBAL.SET_POLICY_CONTEXT ('S', 101)
b. Call the Invoice API, which will internally read the ORG_ID from MO Current Context.

From SQL*Plus, I wish to simulate login to a specific responsibility. How do I do this?
a. Call FND_GLOBAL.INITIALIZE - This will set your responsibility id, user_id etc
b. call MO_GLOBAL.INIT - This will read the MO profile option values for your responsibility/user, and will initialize the Multi Org Access.
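
A sketch of what that might look like from SQL*Plus. The user_id and responsibility ids are placeholders, and the 'AR' short name is an assumption; MO_GLOBAL.INIT takes the application short name of the product you are simulating.

BEGIN
  -- Step a: set the applications context (placeholder ids shown)
  fnd_global.apps_initialize(user_id      => 1234,
                             resp_id      => 20678,
                             resp_appl_id => 222);   -- 222 = Receivables
  -- Step b: initialise Multi Org access for the product
  mo_global.init('AR');
  -- Optionally force a single operating unit for the session
  mo_global.set_policy_context('S', 101);
END;
/
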
What happens to dbms_client_info.set_client_info(101)?
This becomes functionally redundant; use the mo_global package instead. This package already exists in 11.5.10 instances, and if you open it you will find it using Row Level Security. Technically dbms_client_info.set_client_info will still work, but it will produce unexpected results if you have also enabled the Multi Org Security Profile feature.

How does this affect my customizations?
Statement 1: If you have hard-coded the client-info command, then obviously that will no longer work [with disclaimers, but I think that will be the case].
Statement 2: Also, if you have been using fnd_profile.org_id, that again will not work.
Both statements above are false if you decide not to implement the Multi Org Access Control feature in Release 12.

SUPPLIERS IN TCA - A DIVE INTO VENDOR TABLES IN R12

Prior to R12, creation of a vendor/supplier record in e-Business suite largely meant insertion of records in PO_VENDORS.
However, from R12 onwards, records are inserted into at least half a dozen tables when a single Supplier record is created.
This is largely due to the fact that Suppliers have been moved into the TCA DataModel.

In this article, I would like to show you the set of tables that are affected when a supplier record gets created in Release 12.
I will also touch upon supplier sites and changes to the taxation-related tables.

End User Step 1
To begin with, we need to create a supplier. Let's name it Go4Gold.
Simply enter the name of the supplier in the Organization Name field and click Apply. This will create the supplier.


End User Step 2:
You can double check the created Supplier, which has Supplier Number 20186.
This supplier number comes from a table named AP_SUPPLIERS. 


  The registry id that you see is the Party_number field from hz_parties [TCA Party Table]

Now, let's have a look at the list of tables impacted by creating the above supplier record.
I am not saying that inserting into the tables listed below is the way to create suppliers in the R12 TCA model [use the APIs for that].
This article is purely for your understanding of the new data model for suppliers in R12 TCA.
Of course this will also be helpful when developing reports in R12.

Table HZ_PARTIES
SELECT * FROM hz_parties WHERE party_name= 'Go4Gold' ;
This happens to be the master table now instead of PO_VENDORS.
You will notice that the PARTY_NUMBER below is the Registry id in the R12 supplier screen.
Also, this party_id = 301934 will be referenced in the remainder set of tables.

Table HZ_PARTY_USG_ASSIGNMENTS 
SELECT party_id ,party_usg_assignment_id,party_usage_code FROM hz_party_usg_assignments 
WHERE party_id = 301934;

This table stores the Party Usages, for example, in this case it captures the fact that the given party_id is of type SUPPLIER.
Table HZ_ORGANIZATION_PROFILES
SELECT * FROM hz_organization_profiles WHERE party_id = 301934
This table captures additional Supplier information, for example, credit scoring details of Supplier or the Number of Employees working in Supplier Organization.
Table IBY_EXTERNAL_PAYEES_ALL
SELECT * FROM iby_external_payees_all WHERE payee_party_id = 301934
 This table captures Payment related details of the Supplier.
For example: -
    1. How should the supplier's remittance advice be sent?
    2. What is the default Payment method Code for this supplier?
    3. Who bears the bank charges when lets say SWIFT payment is made?
This information can be setup at either the Supplier level or at Supplier Site level.

Table AP_SUPPLIERS
SELECT vendor_id, vendor_name, segment1, enabled_flag FROM ap_suppliers WHERE party_id = 301934
Alongside HZ_PARTIES, this is the other master table that replaces the PO_VENDORS table of 11i. Instead of expanding the design of HZ_PARTIES, Oracle decided to hold the supplier-specific attributes in AP_SUPPLIERS [fair enough!].
 Table POS_SUPPLIER_MAPPINGS
SELECT * FROM pos_supplier_mappings WHERE party_id = 301934
This table holds the mapping between the AP_SUPPLIERS.VENDOR_ID and HZ_PARTIES.PARTY_ID.
This is useful in cases where two vendor records effectively belong to the same HZ_PARTIES record.
 Table ZX_PARTY_TAX_PROFILE
SELECT party_type_code, party_tax_profile_id FROM zx_party_tax_profile WHERE party_id = 301934
The taxation-related details like tax codes and tax accounts have been moved from AP into ZX.
ZX is the name of a new application, "E-Business Tax".
Effectively this application is the tax repository/taxation engine for e-Business Suite starting from R12, which also means that our good old AP_TAX_CODES_ALL is no longer the master table for taxes.
Now we have a new tax rate table, i.e. ZX_RATES_B. ZX_ACCOUNTS is another table that has been introduced to capture the accounting setup related to tax codes.

Database View PO_VENDORS
select vendor_name, segment1, party_number from po_vendors WHERE party_id = 301934
PO_VENDORS is a view in R12, that joins AP_SUPPLIERS & HZ_PARTIES.
Similarly, PO_VENDOR_SITES and PO_VENDOR_SITES_ALL are also views based upon AP_SUPPLIER_SITES_ALL.
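
To close, here is a sketch of a supplier/site listing across the new tables, of the kind one might use in an R12 report (the join columns are the standard ones; verify against your instance):

SELECT aps.vendor_name,
       aps.segment1            supplier_number,
       hp.party_number         registry_id,
       apss.vendor_site_code,
       apss.org_id             operating_unit_id
  FROM ap_suppliers          aps,
       hz_parties            hp,
       ap_supplier_sites_all apss
 WHERE hp.party_id     = aps.party_id
   AND apss.vendor_id  = aps.vendor_id
   AND aps.vendor_name = :supplier_name;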

Query to Find the List of Scheduled Concurrent Programs



select r.request_id,
 p.user_concurrent_program_name || nvl2(r.description,' ('||r.description||')',null) Conc_prog,
 s.user_name REQUESTOR,
 r.argument_text arguments,
 r.requested_start_date next_run,
 r.last_update_date LAST_RUN,
 r.hold_flag on_hold,
 r.increment_dates,
 decode(c.class_type,
 'P', 'Periodic',
 'S', 'On Specific Days',
 'X', 'Advanced',
 c.class_type) schedule_type,
 case
 when c.class_type = 'P' then
 'Repeat every ' ||
 substr(c.class_info, 1, instr(c.class_info, ':') - 1) ||
 decode(substr(c.class_info, instr(c.class_info, ':', 1, 1) + 1, 1),
 'N', ' minutes',
 'M', ' months',
 'H', ' hours',
 'D', ' days') ||
 decode(substr(c.class_info, instr(c.class_info, ':', 1, 2) + 1, 1),
 'S', ' from the start of the prior run',
 'C', ' from the completion of the prior run')
 when c.class_type = 'S' then
 nvl2(dates.dates, 'Dates: ' || dates.dates || '. ', null) ||
 decode(substr(c.class_info, 32, 1), '1', 'Last day of month ') ||
 decode(sign(to_number(substr(c.class_info, 33))),
 '1', 'Days of week: ' ||
 decode(substr(c.class_info, 33, 1), '1', 'Su ') ||
 decode(substr(c.class_info, 34, 1), '1', 'Mo ') ||
 decode(substr(c.class_info, 35, 1), '1', 'Tu ') ||
 decode(substr(c.class_info, 36, 1), '1', 'We ') ||
 decode(substr(c.class_info, 37, 1), '1', 'Th ') ||
 decode(substr(c.class_info, 38, 1), '1', 'Fr ') ||
 decode(substr(c.class_info, 39, 1), '1', 'Sa '))
 end as schedule,
 c.date1 start_date,
 c.date2 end_date,
 c.class_info
 from fnd_concurrent_requests r,
 fnd_conc_release_classes c,
 fnd_concurrent_programs_tl p,
 fnd_user s,
 (with date_schedules as (
 select release_class_id,
 rank() over(partition by release_class_id order by s) a, s
 from (select c.class_info, l,
 c.release_class_id,
 decode(substr(c.class_info, l, 1), '1', to_char(l)) s
 from (select level l from dual connect by level <= 31),
 fnd_conc_release_classes c
 where c.class_type = 'S'
 and instr(substr(c.class_info, 1, 31), '1') > 0)
 where s is not null)
 SELECT release_class_id, substr(max(SYS_CONNECT_BY_PATH(s, ' ')), 2) dates
 FROM date_schedules
 START WITH a = 1
 CONNECT BY nocycle PRIOR a = a - 1
 group by release_class_id) dates
 where r.phase_code = 'P'
 and c.application_id = r.release_class_app_id
 and c.release_class_id = r.release_class_id
 and nvl(c.date2, sysdate + 1) > sysdate
 and c.class_type is not null
 and p.concurrent_program_id = r.concurrent_program_id
 --and p.concurrent_program_id = 45543
 and p.language = 'US'
 and dates.release_class_id(+) = r.release_class_id
 and r.requested_by = s.user_id
 order by conc_prog, on_hold, next_run;