GL_INTERFACE table dropped by mistake

While performing a data migration activity, instead of deleting the rows in the GL_INTERFACE table I dropped the table itself by mistake. Please help me recreate the table and grant the right permissions to the other users.
Waiting for a quick reply.
Thanks
Kiran

Ouch :-) Hopefully this is not a production instance. If you have another test/dev instance at the exact same patch level, you can export this table from that instance and import it into this instance, then truncate the table to delete the rows. Even after recreating the table, there may be lingering issues, since you have effectively lost any referential integrity. The only "correct" solution would be to restore the entire database from backups.
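For instance, with the classic export/import utilities it might look roughly like this (a sketch only; the GL password and dump file name are placeholders, and Data Pump would work just as well on this release):
-- On the healthy test/dev instance, export just this one table from the GL schema:
--   exp gl/<password> tables=GL_INTERFACE file=gl_interface.dmp
-- On the instance where it was dropped, import it back:
--   imp gl/<password> fromuser=gl touser=gl file=gl_interface.dmp
-- Then clear the copied rows and re-point the APPS synonym:
TRUNCATE TABLE gl.gl_interface;
CREATE OR REPLACE SYNONYM apps.gl_interface FOR gl.gl_interface;
The grants to the other product schemas (AP, AR, PO, and so on) would need to be re-applied as well, as in the DDL posted further below.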
HTH
Srini Chavali

Similar Messages

  • Controlling posting sequence during GL Import (GL_INTERFACE table)

    I am putting multiple journal records with the same accounting date into the GL_INTERFACE table, and I have enabled AutoPost.
    My requirement is that, during Journal Import and the subsequent auto-posting, certain journals should be posted before the others. Is there any column in GL_INTERFACE other than the accounting date that can be used to sequence the posting of journals in GL?
    This requirement arises because some of the journals add funds to an account and other journals use the funds from that account. When I do a journal import, GL should first post the journals which add funds and then post the other journals which use those funds; otherwise I may get a funds check failure.
    Please let me know of any solutions.

    Hi,
    Thanks for the solution. My company has assigned me a similar kind of automation process, so if you could give me more input on this that would be great. You can send me further mails at the following id: [email protected], or a reply in the same forum is much appreciated. Waiting for your reply.
    Thanks,
    Nagaraj

  • GL_Interface Table

    Hi All,
    I have a question about GL_Interface Table.
    Our company has been split, and at the time of the split one of the users ran a journal import, and that request was killed.
    As a result I have 4 records (a journal) in GL_INTERFACE which have already been posted to GL but still reside in the table.
    What should I do? Just delete them? Oracle does not recommend it :(

    If you are absolutely sure they have been imported into GL for the right company, etc., you don't need them in the interface table anymore.
    You can delete them.
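    If you do delete them, a cautious sketch (the source name and group id are placeholders; inspect the rows first so you only remove the four in question):
    -- Check what is there first
    SELECT user_je_source_name, group_id, status, COUNT(*)
      FROM gl_interface
     GROUP BY user_je_source_name, group_id, status;
    -- Then remove only the rows you have verified were already posted
    DELETE FROM gl_interface
     WHERE user_je_source_name = '<your source>'
       AND group_id = <your group id>;
    COMMIT;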

  • Prevent auto data deletion of records from gl_interface table after success

    Hello All,
    I am currently developing a journal conversion.
    I have successfully imported some sample journal data.
    But the problem here is that I am not able to find the successfully uploaded journal data in the GL_INTERFACE table; it gets deleted after a successful import.
    How can I prevent this deletion? I want that data to remain available in the GL_INTERFACE table.
    Does any setup need to be done for this?
    Oracle ebs version : 12.1.1
    Thanks & Regards
    aboothahir

    Any suggestions on the above query?
    Thanks & regards
    Aboo

  • Oracle GL_INTERFACE table - Data Populate

    We would like to populate accounting data in the Oracle GL_INTERFACE table (Oracle EBS R12.1.3), with the data to be imported received from other systems such as SAP, PeopleSoft, or home-grown applications.
    Since SAP/PeopleSoft/home-grown applications do not have web or other services to push the accounting data to the Oracle EBS GL_INTERFACE table, we are looking at the possible options from the Oracle EBS perspective to receive the data from the other systems.
    1. Is there any standard Oracle API available to meet this requirement, or do we need to develop a PL/SQL program?
    We are also open to alternative solutions...
    2. Installing BPEL in a SOA environment and using the "mq" services. If we want to use BPEL/SOA it might need a separate license, as the SOA gateway is a standalone component and not part of the Oracle EBS R12 suite.
    3. Is there any other solution to achieve this?
    Thanks in advance for sparing the time on this. Thank you.
    Rgds

    1. Is there any standard Oracle API available to meet this requirement, or do we need to develop a PL/SQL program?
    Please see these docs/links:
    R12 Populating The GL_INTERFACE Table [ID 578930.1]
    R12 Populating GL_INTERFACE - Assigning Values to Optional Columns (i.e Reference Fields) [ID 578959.1]
    IREP
    http://irep.oracle.com/index.html
    eTRM
    http://etrm.oracle.com/
    2. Installing BPEL in a SOA environment and using the "mq" services -- You need a separate license for the WebLogic/SOA installation; contact your Oracle Sales representative for details.
    3. Is there any other solution to achieve this? -- Please review the docs/links referenced above.
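    For option 1, a custom load is usually just a PL/SQL routine that reads your staging data and inserts the mandatory columns into GL_INTERFACE, leaving validation to Journal Import. A minimal sketch, where xx_gl_staging and its columns are hypothetical names for whatever holds the data received from SAP/PeopleSoft:
    BEGIN
      FOR r IN (SELECT * FROM xx_gl_staging WHERE processed_flag = 'N') LOOP
        INSERT INTO gl_interface
          (status, set_of_books_id, accounting_date, currency_code, date_created,
           created_by, actual_flag, user_je_category_name, user_je_source_name,
           segment1, segment2, segment3, entered_dr, entered_cr, group_id, reference1)
        VALUES
          ('NEW', r.ledger_id, r.gl_date, r.currency_code, SYSDATE,
           r.created_by, 'A', r.je_category, r.je_source,
           r.segment1, r.segment2, r.segment3, r.entered_dr, r.entered_cr,
           r.group_id, r.reference1);
      END LOOP;
      COMMIT;
    END;
    /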
    Thanks,
    Hussein

  • GL_INTERFACE table accidentally dropped

    Hi all,
    Please, this is urgent: I dropped the GL_INTERFACE table accidentally. Oracle 10g, flashback wasn't enabled, so the recycle bin doesn't hold it. What can I do now? Is it still possible to recover the table, or can the GL_INTERFACE table be generated again? Is there any other alternative?
    thank you
    larry

    This DDL is for
    RDBMS : 10.2.0.2.0
    Oracle Applications : 11.5.10.2
    CREATE TABLE GL_INTERFACE (
    STATUS VARCHAR2(50 BYTE) NOT NULL,
    SET_OF_BOOKS_ID NUMBER(15) NOT NULL,
    ACCOUNTING_DATE DATE NOT NULL,
    CURRENCY_CODE VARCHAR2(15 BYTE) NOT NULL,
    DATE_CREATED DATE NOT NULL,
    CREATED_BY NUMBER(15) NOT NULL,
    ACTUAL_FLAG VARCHAR2(1 BYTE) NOT NULL,
    USER_JE_CATEGORY_NAME VARCHAR2(25 BYTE) NOT NULL,
    USER_JE_SOURCE_NAME VARCHAR2(25 BYTE) NOT NULL,
    CURRENCY_CONVERSION_DATE DATE,
    ENCUMBRANCE_TYPE_ID NUMBER,
    BUDGET_VERSION_ID NUMBER,
    USER_CURRENCY_CONVERSION_TYPE VARCHAR2(30 BYTE),
    CURRENCY_CONVERSION_RATE NUMBER,
    SEGMENT1 VARCHAR2(25 BYTE),
    SEGMENT2 VARCHAR2(25 BYTE),
    SEGMENT3 VARCHAR2(25 BYTE),
    SEGMENT4 VARCHAR2(25 BYTE),
    SEGMENT5 VARCHAR2(25 BYTE),
    SEGMENT6 VARCHAR2(25 BYTE),
    SEGMENT7 VARCHAR2(25 BYTE),
    SEGMENT8 VARCHAR2(25 BYTE),
    SEGMENT9 VARCHAR2(25 BYTE),
    SEGMENT10 VARCHAR2(25 BYTE),
    SEGMENT11 VARCHAR2(25 BYTE),
    SEGMENT12 VARCHAR2(25 BYTE),
    SEGMENT13 VARCHAR2(25 BYTE),
    SEGMENT14 VARCHAR2(25 BYTE),
    SEGMENT15 VARCHAR2(25 BYTE),
    SEGMENT16 VARCHAR2(25 BYTE),
    SEGMENT17 VARCHAR2(25 BYTE),
    SEGMENT18 VARCHAR2(25 BYTE),
    SEGMENT19 VARCHAR2(25 BYTE),
    SEGMENT20 VARCHAR2(25 BYTE),
    SEGMENT21 VARCHAR2(25 BYTE),
    SEGMENT22 VARCHAR2(25 BYTE),
    SEGMENT23 VARCHAR2(25 BYTE),
    SEGMENT24 VARCHAR2(25 BYTE),
    SEGMENT25 VARCHAR2(25 BYTE),
    SEGMENT26 VARCHAR2(25 BYTE),
    SEGMENT27 VARCHAR2(25 BYTE),
    SEGMENT28 VARCHAR2(25 BYTE),
    SEGMENT29 VARCHAR2(25 BYTE),
    SEGMENT30 VARCHAR2(25 BYTE),
    ENTERED_DR NUMBER,
    ENTERED_CR NUMBER,
    ACCOUNTED_DR NUMBER,
    ACCOUNTED_CR NUMBER,
    TRANSACTION_DATE DATE,
    REFERENCE1 VARCHAR2(100 BYTE),
    REFERENCE2 VARCHAR2(240 BYTE),
    REFERENCE3 VARCHAR2(100 BYTE),
    REFERENCE4 VARCHAR2(100 BYTE),
    REFERENCE5 VARCHAR2(240 BYTE),
    REFERENCE6 VARCHAR2(100 BYTE),
    REFERENCE7 VARCHAR2(100 BYTE),
    REFERENCE8 VARCHAR2(100 BYTE),
    REFERENCE9 VARCHAR2(100 BYTE),
    REFERENCE10 VARCHAR2(240 BYTE),
    REFERENCE11 VARCHAR2(100 BYTE),
    REFERENCE12 VARCHAR2(100 BYTE),
    REFERENCE13 VARCHAR2(100 BYTE),
    REFERENCE14 VARCHAR2(100 BYTE),
    REFERENCE15 VARCHAR2(100 BYTE),
    REFERENCE16 VARCHAR2(100 BYTE),
    REFERENCE17 VARCHAR2(100 BYTE),
    REFERENCE18 VARCHAR2(100 BYTE),
    REFERENCE19 VARCHAR2(100 BYTE),
    REFERENCE20 VARCHAR2(100 BYTE),
    REFERENCE21 VARCHAR2(240 BYTE),
    REFERENCE22 VARCHAR2(240 BYTE),
    REFERENCE23 VARCHAR2(240 BYTE),
    REFERENCE24 VARCHAR2(240 BYTE),
    REFERENCE25 VARCHAR2(240 BYTE),
    REFERENCE26 VARCHAR2(240 BYTE),
    REFERENCE27 VARCHAR2(240 BYTE),
    REFERENCE28 VARCHAR2(240 BYTE),
    REFERENCE29 VARCHAR2(240 BYTE),
    REFERENCE30 VARCHAR2(240 BYTE),
    JE_BATCH_ID NUMBER(15),
    PERIOD_NAME VARCHAR2(15 BYTE),
    JE_HEADER_ID NUMBER(15),
    JE_LINE_NUM NUMBER(15),
    CHART_OF_ACCOUNTS_ID NUMBER(15),
    FUNCTIONAL_CURRENCY_CODE VARCHAR2(15 BYTE),
    CODE_COMBINATION_ID NUMBER(15),
    DATE_CREATED_IN_GL DATE,
    WARNING_CODE VARCHAR2(4 BYTE),
    STATUS_DESCRIPTION VARCHAR2(240 BYTE),
    STAT_AMOUNT NUMBER,
    GROUP_ID NUMBER(15),
    REQUEST_ID NUMBER(15),
    SUBLEDGER_DOC_SEQUENCE_ID NUMBER,
    SUBLEDGER_DOC_SEQUENCE_VALUE NUMBER,
    ATTRIBUTE1 VARCHAR2(150 BYTE),
    ATTRIBUTE2 VARCHAR2(150 BYTE),
    ATTRIBUTE3 VARCHAR2(150 BYTE),
    ATTRIBUTE4 VARCHAR2(150 BYTE),
    ATTRIBUTE5 VARCHAR2(150 BYTE),
    ATTRIBUTE6 VARCHAR2(150 BYTE),
    ATTRIBUTE7 VARCHAR2(150 BYTE),
    ATTRIBUTE8 VARCHAR2(150 BYTE),
    ATTRIBUTE9 VARCHAR2(150 BYTE),
    ATTRIBUTE10 VARCHAR2(150 BYTE),
    ATTRIBUTE11 VARCHAR2(150 BYTE),
    ATTRIBUTE12 VARCHAR2(150 BYTE),
    ATTRIBUTE13 VARCHAR2(150 BYTE),
    ATTRIBUTE14 VARCHAR2(150 BYTE),
    ATTRIBUTE15 VARCHAR2(150 BYTE),
    ATTRIBUTE16 VARCHAR2(150 BYTE),
    ATTRIBUTE17 VARCHAR2(150 BYTE),
    ATTRIBUTE18 VARCHAR2(150 BYTE),
    ATTRIBUTE19 VARCHAR2(150 BYTE),
    ATTRIBUTE20 VARCHAR2(150 BYTE),
    CONTEXT VARCHAR2(150 BYTE),
    CONTEXT2 VARCHAR2(150 BYTE),
    INVOICE_DATE DATE,
    TAX_CODE VARCHAR2(15 BYTE),
    INVOICE_IDENTIFIER VARCHAR2(20 BYTE),
    INVOICE_AMOUNT NUMBER,
    CONTEXT3 VARCHAR2(150 BYTE),
    USSGL_TRANSACTION_CODE VARCHAR2(30 BYTE),
    DESCR_FLEX_ERROR_MESSAGE VARCHAR2(240 BYTE),
    JGZZ_RECON_REF VARCHAR2(240 BYTE),
    AVERAGE_JOURNAL_FLAG VARCHAR2(1 BYTE),
    ORIGINATING_BAL_SEG_VALUE VARCHAR2(25 BYTE),
    GL_SL_LINK_ID NUMBER,
    GL_SL_LINK_TABLE VARCHAR2(30 BYTE),
    REFERENCE_DATE DATE
    )
    TABLESPACE APPS_TS_TX_INTERFACE
    PCTUSED 0
    PCTFREE 10
    INITRANS 10
    MAXTRANS 255
    STORAGE (
    INITIAL 128K
    NEXT 128K
    MINEXTENTS 1
    MAXEXTENTS 2147483645
    PCTINCREASE 0
    BUFFER_POOL DEFAULT
    )
    LOGGING
    NOCOMPRESS
    NOCACHE
    NOPARALLEL
    MONITORING;
    CREATE INDEX GL_INTERFACE_N3 ON GL_INTERFACE
    (SUBLEDGER_DOC_SEQUENCE_VALUE, SUBLEDGER_DOC_SEQUENCE_ID)
    LOGGING
    TABLESPACE APPS_TS_TX_INTERFACE
    PCTFREE 10
    INITRANS 11
    MAXTRANS 255
    STORAGE (
    INITIAL 128K
    NEXT 128K
    MINEXTENTS 1
    MAXEXTENTS 2147483645
    PCTINCREASE 0
    BUFFER_POOL DEFAULT
    )
    NOPARALLEL;
    CREATE INDEX GL_INTERFACE_N4 ON GL_INTERFACE
    (REFERENCE26, REFERENCE22, REFERENCE23)
    LOGGING
    TABLESPACE APPS_TS_TX_INTERFACE
    PCTFREE 10
    INITRANS 11
    MAXTRANS 255
    STORAGE (
    INITIAL 128K
    NEXT 128K
    MINEXTENTS 1
    MAXEXTENTS 2147483645
    PCTINCREASE 0
    BUFFER_POOL DEFAULT
    )
    NOPARALLEL;
    CREATE INDEX GL_INTERFACE_N2 ON GL_INTERFACE
    (REQUEST_ID, JE_HEADER_ID, STATUS, CODE_COMBINATION_ID)
    LOGGING
    TABLESPACE APPS_TS_TX_INTERFACE
    PCTFREE 0
    INITRANS 11
    MAXTRANS 255
    STORAGE (
    INITIAL 128K
    NEXT 128K
    MINEXTENTS 1
    MAXEXTENTS 2147483645
    PCTINCREASE 0
    BUFFER_POOL DEFAULT
    )
    NOPARALLEL;
    CREATE INDEX GL_INTERFACE_N1 ON GL_INTERFACE
    (USER_JE_SOURCE_NAME, SET_OF_BOOKS_ID, GROUP_ID)
    LOGGING
    TABLESPACE APPS_TS_TX_INTERFACE
    PCTFREE 0
    INITRANS 11
    MAXTRANS 255
    STORAGE (
    INITIAL 128K
    NEXT 128K
    MINEXTENTS 1
    MAXEXTENTS 2147483645
    PCTINCREASE 0
    BUFFER_POOL DEFAULT
    )
    NOPARALLEL;
    CREATE SYNONYM APPS.GL_INTERFACE FOR GL_INTERFACE;
    GRANT DELETE, INSERT, SELECT, UPDATE ON GL_INTERFACE TO AP WITH GRANT OPTION;
    GRANT ALTER, DELETE, INDEX, INSERT, REFERENCES, SELECT, UPDATE, ON COMMIT REFRESH, QUERY REWRITE, DEBUG, FLASHBACK ON GL_INTERFACE TO APPS WITH GRANT OPTION;
    GRANT DELETE, INSERT, SELECT, UPDATE ON GL_INTERFACE TO AR WITH GRANT OPTION;
    GRANT DELETE, INSERT, SELECT, UPDATE ON GL_INTERFACE TO BOM WITH GRANT OPTION;
    GRANT DELETE, INSERT, SELECT, UPDATE ON GL_INTERFACE TO CE WITH GRANT OPTION;
    GRANT DELETE, INSERT, SELECT, UPDATE ON GL_INTERFACE TO CN WITH GRANT OPTION;
    GRANT DELETE, INSERT, SELECT, UPDATE ON GL_INTERFACE TO CRP WITH GRANT OPTION;
    GRANT DELETE, INSERT, SELECT, UPDATE ON GL_INTERFACE TO HR WITH GRANT OPTION;
    GRANT DELETE, INSERT, SELECT, UPDATE ON GL_INTERFACE TO HR_SECURE_USER;
    GRANT DELETE, INSERT, SELECT, UPDATE ON GL_INTERFACE TO INV WITH GRANT OPTION;
    GRANT DELETE, INSERT, SELECT, UPDATE ON GL_INTERFACE TO JG WITH GRANT OPTION;
    GRANT DELETE, INSERT, SELECT, UPDATE ON GL_INTERFACE TO MFG WITH GRANT OPTION;
    GRANT DELETE, INSERT, SELECT, UPDATE ON GL_INTERFACE TO MRP WITH GRANT OPTION;
    GRANT DELETE, INSERT, SELECT, UPDATE ON GL_INTERFACE TO OE WITH GRANT OPTION;
    GRANT DELETE, INSERT, SELECT, UPDATE ON GL_INTERFACE TO OSM WITH GRANT OPTION;
    GRANT DELETE, INSERT, SELECT, UPDATE ON GL_INTERFACE TO PA WITH GRANT OPTION;
    GRANT DELETE, INSERT, SELECT, UPDATE ON GL_INTERFACE TO PO WITH GRANT OPTION;
    GRANT DELETE, INSERT, SELECT, UPDATE ON GL_INTERFACE TO QA WITH GRANT OPTION;

  • Payables data stuck in gl_interface table

    Created invoices in Payables.
    Validated them.
    Ran the process for Create Accounting & Transfer to GL.
    It resulted in a warning, with this message in the log:
    SHRD0008: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ Warning ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    R_CLPR0003: Unprocessed records exist in GL_INTERFACE for period JUN-10
    Any help will be highly appreciated.
    Regards,
    Umair

    Common issues for this are:
    Check if there is any security rule in place which is restricting the subledger transactions from being imported to GL.
    Check whether any cross-validation rule (CVR) might have been put in place after the invoice was accounted and before it was transferred to GL.
    Check if any code combination (CCID) became invalid.
    There is a related article at
    http://oracally.blogspot.com/2010/02/ap-period-is-not-closing-due-to.html
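    A quick way to see what is actually sitting in GL_INTERFACE for that period is a query along these lines (standard columns; the date range simply brackets the JUN-10 period from the warning above):
    SELECT user_je_source_name, group_id, status, status_description, COUNT(*)
      FROM gl_interface
     WHERE accounting_date BETWEEN TO_DATE('01-JUN-2010','DD-MON-YYYY')
                               AND TO_DATE('30-JUN-2010','DD-MON-YYYY')
     GROUP BY user_je_source_name, group_id, status, status_description;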
    cheers,

  • Need to restore a table which was deleted by mistake; my archive log is on

    Hi Gurus,
    I took a full backup through RMAN. My archive log mode is ON.
    After taking the backup, one of the tables was deleted by mistake. How can I restore it?
    Thanks

    Hello,
    The solution depends on your Oracle Release and if you enabled RECYCLE BIN and/or FLASHBACK LOG.
    If the Table was dropped and you are in Oracle *9.2* or earlier release, you need to Restore/Recover the Database (from the RMAN BACKUP) to a time before the Table was dropped, somewhere else (another server,...). Then from this Database you may export the Table and import it to the original Database.
    If the rows of the Table were just Deleted (so the Table still exists), you may (in 9.2) use FLASHBACK QUERY to get back the deleted rows.
    If you are in *10.1* or later release, then if the Table was dropped you may use FLASHBACK DROP to get it back (see the previous post from Kamran). If the rows were just deleted (so the Table still exists) you may use FLASHBACK QUERY or more efficiently FLASHBACK TABLE:
    http://www.oracle-developer.net/display.php?id=313
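    For reference, the commands look roughly like this (the table name and the timestamp are placeholders):
    -- 10g and later: undrop a table from the recycle bin
    FLASHBACK TABLE my_table TO BEFORE DROP;
    -- 10g and later: rewind an existing table (row movement must be enabled first)
    ALTER TABLE my_table ENABLE ROW MOVEMENT;
    FLASHBACK TABLE my_table TO TIMESTAMP SYSTIMESTAMP - INTERVAL '1' HOUR;
    -- 9.2 and later: flashback query to read rows as they were before the delete
    SELECT * FROM my_table AS OF TIMESTAMP SYSTIMESTAMP - INTERVAL '1' HOUR;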
    About FLASHBACK QUERY, it depends on the undo retention time (it should be large enough), and the table shouldn't have columns of specific datatypes like LONG or LOB:
    http://www.orafaq.com/node/50
    Hope this helps.
    Best regards,
    Jean-Valentin

  • Beginner trying to import data from GL_interface to GL

    Hello, I'm a beginner with Oracle GL and I'm not able to do an import from GL_INTERFACE. I have put my data into GL_INTERFACE, but I think that something is wrong with it, because when I try to import the data
    with Journals -> Import -> Run, Oracle answers that GL_INTERFACE is empty!
    I think that maybe my insert is not correct and that's why Oracle doesn't want to use it... Can someone help me?
    ----> I have inserted the data this way:
    insert into gl_interface (status,set_of_books_id,accounting_date, currency_code,date_created,
    created_by, actual_flag, user_je_category_name, user_je_source_name, segment1,segment2,segment3,
    entered_dr, entered_cr,transaction_date,reference1 )
    values ('NEW', '1609', sysdate, 'FRF', sysdate,1008009, 'A', 'xx jab payroll', 'xx jab payroll', '01','002','4004',111.11,0,
    sysdate,'xx_finance_jab_demo');
    insert into gl_interface (status,set_of_books_id,accounting_date, currency_code,date_created,
    created_by, actual_flag, user_je_category_name, user_je_source_name, segment1,segment2,segment3,
    entered_dr, entered_cr,transaction_date,reference1 )
    values ('NEW', '1609', sysdate, 'FRF', sysdate,1008009, 'A', 'xx jab payroll', 'xx jab payroll', '01','002','1005',0,111.11,
    sysdate,'xx_finance_jab_demo');
    ------------> Oracle sent me this message:
    General Ledger: Version : 11.5.0 - Development
    Copyright (c) 1979, 1999, Oracle Corporation. All rights reserved.
    GLLEZL module: Journal Import
    Current system time is 14-MAR-2007 15:39:25
    Running in Debug Mode
    gllsob() 14-MAR-2007 15:39:25sob_id = 124
    sob_name = Vision France
    coa_id = 50569
    num_segments = 6
    delim = '.'
    segments =
    SEGMENT1
    SEGMENT2
    SEGMENT3
    SEGMENT4
    SEGMENT5
    SEGMENT6
    index segment is SEGMENT2
    balancing segment is SEGMENT1
    currency = EUR
    sus_flag = Y
    ic_flag = Y
    latest_opened_encumbrance_year = 2006
    pd_type = Month
    << gllsob() 14-MAR-2007 15:39:25
    gllsys() 14-MAR-2007 15:39:25fnd_user_id = 1008009
    fnd_user_name = JAB-DEVELOPPEUR
    fnd_login_id = 2675718
    con_request_id = 2918896
    sus_on = 0
    from_date =
    to_date =
    create_summary = 0
    archive = 0
    num_rec = 1000
    num_flex = 2500
    run_id = 55578
    << gllsys() 14-MAR-2007 15:39:25
    SHRD0108: Retrieved 51 records from fnd_currencies
    gllcsa() 14-MAR-2007 15:39:25<< gllcsa() 14-MAR-2007 15:39:25
    gllcnt() 14-MAR-2007 15:39:25SHRD0118: Updated 1 record(s) in table: gl_interface_control
    source name = xx jab payroll
    group id = -1
    LEZL0001: Found 1 sources to process.
    glluch() 14-MAR-2007 15:39:25<< glluch() 14-MAR-2007 15:39:25
    gl_import_hook_pkg.pre_module_hook() 14-MAR-2007 15:39:26<< gl_import_hook_pkg.pre_module_hook() 14-MAR-2007 15:39:26
    glusbe() 14-MAR-2007 15:39:26<< glusbe() 14-MAR-2007 15:39:26
    << gllcnt() 14-MAR-2007 15:39:26
    gllpst() 14-MAR-2007 15:39:26SHRD0108: Retrieved 110 records from gl_period_statuses
    << gllpst() 14-MAR-2007 15:39:27
    glldat() 14-MAR-2007 15:39:27Successfully built decode fragment for period_name and period_year
    gllbud() 14-MAR-2007 15:39:27SHRD0108: Retrieved 10 records from the budget tables
    << gllbud() 14-MAR-2007 15:39:27
    gllenc() 14-MAR-2007 15:39:27SHRD0108: Retrieved 15 records from gl_encumbrance_types
    << gllenc() 14-MAR-2007 15:39:27
    glldlc() 14-MAR-2007 15:39:27<< glldlc() 14-MAR-2007 15:39:27
    gllcvr() 14-MAR-2007 15:39:27SHRD0108: Retrieved 6 records from gl_daily_conversion_types
    << gllcvr() 14-MAR-2007 15:39:27
    gllfss() 14-MAR-2007 15:39:27LEZL0005: Successfully finished building dynamic SQL statement.
    << gllfss() 14-MAR-2007 15:39:27
    gllcje() 14-MAR-2007 15:39:27main_stmt:
    select int.rowid
    decode(int.SEGMENT1|| int.SEGMENT2|| int.SEGMENT3|| int.SEGMENT4|| int.SEGMENT5|| int.SEGMENT6
    , '', replace(ccid_cc.SEGMENT2,'.','
    ') || '.' || replace(ccid_cc.SEGMENT1,'.','
    ') || '.' || replace(ccid_cc.SEGMENT3,'.','
    ') || '.' || replace(ccid_cc.SEGMENT4,'.','
    ') || '.' || replace(ccid_cc.SEGMENT5,'.','
    ') || '.' || replace(ccid_cc.SEGMENT6,'.','
    , replace(int.SEGMENT2,'.','
    ') || '.' || replace(int.SEGMENT1,'.','
    ') || '.' || replace(int.SEGMENT3,'.','
    ') || '.' || replace(int.SEGMENT4,'.','
    ') || '.' || replace(int.SEGMENT5,'.','
    ') || '.' || replace(int.SEGMENT6,'.','
    ') ) flexfield , nvl(flex_cc.code_combination_id,
    nvl(int.code_combination_id, -4))
    , decode(int.SEGMENT1|| int.SEGMENT2|| int.SEGMENT3|| int.SEGMENT4|| int.SEGMENT5|| int.SEGMENT6
    , '', decode(ccid_cc.code_combination_id,
    null, decode(int.code_combination_id, null, -4, -5),
    decode(sign(nvl(ccid_cc.start_date_active, int.accounting_date-1)
    - int.accounting_date),
    1, -1,
    decode(sign(nvl(ccid_cc.end_date_active, int.accounting_date +1)
    - int.accounting_date),
    -1, -1, 0)) +
    decode(ccid_cc.enabled_flag,
    'N', -10, 0) +
    decode(ccid_cc.summary_flag, 'Y', -100,
    decode(int.actual_flag,
    'B', decode(ccid_cc.detail_budgeting_allowed_flag,
    'N', -100, 0),
    decode(ccid_cc.detail_posting_allowed_flag,
    'N', -100, 0)))),
    decode(flex_cc.code_combination_id,
    null, -4,
    decode(sign(nvl(flex_cc.start_date_active, int.accounting_date-1)
    - int.accounting_date),
    1, -1,
    decode(sign(nvl(flex_cc.end_date_active, int.accounting_date +1)
    - int.accounting_date),
    -1, -1, 0)) +
    decode(flex_cc.enabled_flag,
    'N', -10, 0) +
    decode(flex_cc.summary_flag, 'Y', -100,
    decode(int.actual_flag,
    'B', decode(flex_cc.detail_budgeting_allowed_flag,
    'N', -100, 0),
    decode(flex_cc.detail_posting_allowed_flag,
    'N', -100, 0)))))
    , int.user_je_category_name
    , int.user_je_category_name
    , 'UNKNOWN' period_name
    , decode(actual_flag, 'B'
         , decode(period_name, NULL, '-1' ,period_name), nvl(period_name, '0')) period_name2
    , currency_code
    , decode(actual_flag
         , 'A', actual_flag
         , 'B', decode(budget_version_id
         , 1210, actual_flag
         , 1211, actual_flag
         , 1212, actual_flag
         , 1331, actual_flag
         , 1657, actual_flag
         , 1658, actual_flag
         , NULL, '1', '6')
         , 'E', decode(encumbrance_type_id
         , 1000, actual_flag
         , 1001, actual_flag
         , 1022, actual_flag
         , 1023, actual_flag
         , 1024, actual_flag
         , 1048, actual_flag
         , 1049, actual_flag
         , 1050, actual_flag
         , 1025, actual_flag
         , 999, actual_flag
         , 1045, actual_flag
         , 1046, actual_flag
         , 1047, actual_flag
         , 1068, actual_flag
         , 1088, actual_flag
         , NULL, '3', '4'), '5') actual_flag
    , '0' exception_rate
    , decode(currency_code
         , 'EUR', 1
         , 'STAT', 1
         , decode(actual_flag, 'E', -8, 'B', 1
         , decode(user_currency_conversion_type
         , 'User', decode(currency_conversion_rate, NULL, -1, currency_conversion_rate)
         ,'Corporate',decode(currency_conversion_date,NULL,-2,-6)
         ,'Spot',decode(currency_conversion_date,NULL,-2,-6)
         ,'Reporting',decode(currency_conversion_date,NULL,-2,-6)
         ,'HRUK',decode(currency_conversion_date,NULL,-2,-6)
         ,'DALY',decode(currency_conversion_date,NULL,-2,-6)
         ,'HLI',decode(currency_conversion_date,NULL,-2,-6)
         , NULL, decode(currency_conversion_rate,NULL,
         decode(decode(nvl(to_char(entered_dr),'X'),'X',1,2),decode(nvl(to_char(accounted_dr),'X'),'X',1,2),
         decode(decode(nvl(to_char(entered_cr),'X'),'X',1,2),decode(nvl(to_char(accounted_cr),'X'),'X',1,2),-20,-3),-3),-9),-9))) currency_conversion_rate
    , to_number(to_char(nvl(int.currency_conversion_date, int.accounting_date), 'J'))
    , decode(int.actual_flag
         , 'A', decode(int.currency_code
              , 'EUR', 'User'
         , 'STAT', 'User'
              , nvl(int.user_currency_conversion_type, 'User'))
         , 'B', 'User', 'E', 'User'
         , nvl(int.user_currency_conversion_type, 'User')) user_currency_conversion_type
    , ltrim(rtrim(substrb(rtrim(substrb(int.reference1, 1, 50)) || ' ' || int.user_je_source_name || ' 2918896: ' || int.actual_flag || ' ' || int.group_id, 1, 100)))
    , rtrim(substrb(nvl(rtrim(int.reference2), 'Journal Import ' || int.user_je_source_name || ' 2918896:'), 1, 240))
    , ltrim(rtrim(substrb(rtrim(rtrim(substrb(int.reference4, 1, 25)) || ' ' || int.user_je_category_name || ' ' || int.currency_code || decode(int.actual_flag, 'E', ' ' || int.encumbrance_type_id, 'B', ' ' || int.budget_version_id, '') || ' ' || int.user_currency_conversion_type || ' ' || decode(int.user_currency_conversion_type, NULL, '', 'User', to_char(int.currency_conversion_rate), to_char(int.currency_conversion_date))) || ' ' || substrb(int.reference8, 1, 15) || int.originating_bal_seg_value, 1, 100)))
    , rtrim(nvl(rtrim(int.reference5), 'Journal Import 2918896:'))
    , rtrim(substrb(nvl(rtrim(int.reference6), 'Journal Import Created'), 1, 80))
    , rtrim(decode(upper(substrb(nvl(rtrim(int.reference7), 'N'), 1, 1)),'Y','Y', 'N'))
    , decode(upper(substrb(int.reference7, 1, 1)), 'Y', decode(rtrim(reference8), NULL, '-1', rtrim(substrb(reference8, 1, 15))), NULL)
    , rtrim(upper(substrb(int.reference9, 1, 1)))
    , rtrim(nvl(rtrim(int.reference10), nvl(to_char(int.subledger_doc_sequence_value), 'Journal Import Created')))
    , int.entered_dr
    , int.entered_cr
    , to_number(to_char(int.accounting_date,'J'))
    , to_char(int.accounting_date, 'YYYY/MM/DD')
    , int.user_je_source_name
    , nvl(int.encumbrance_type_id, -1)
    , nvl(int.budget_version_id, -1)
    , NULL
    , int.stat_amount
    , decode(int.actual_flag
    , 'E', decode(int.currency_code, 'STAT', '1', '0'), '0')
    , decode(int.actual_flag
    , 'A', decode(int.budget_version_id
    , NULL, decode(int.encumbrance_type_id, NULL, '0', '1')
    , decode(int.encumbrance_type_id, NULL, '2', '3'))
    , 'B', decode(int.encumbrance_type_id
    , NULL, '0', '4')
    , 'E', decode(int.budget_version_id
    , NULL, '0', '5'), '0')
    , int.accounted_dr
    , int.accounted_cr
    , nvl(int.group_id, -1)
    , nvl(int.average_journal_flag, 'N')
    , int.originating_bal_seg_value
    from GL_INTERFACE int,
    gl_code_combinations flex_cc,
    gl_code_combinations ccid_cc
    where int.set_of_books_id = 124
    and int.status != 'PROCESSED'
    and (int.user_je_source_name,nvl(int.group_id,-1)) in (('xx jab payroll', -1))
    and flex_cc.SEGMENT1(+) = int.SEGMENT1
    and flex_cc.SEGMENT2(+) = int.SEGMENT2
    and flex_cc.SEGMENT3(+) = int.SEGMENT3
    and flex_cc.SEGMENT4(+) = int.SEGMENT4
    and flex_cc.SEGMENT5(+) = int.SEGMENT5
    and flex_cc.SEGMENT6(+) = int.SEGMENT6
    and flex_cc.chart_of_accounts_id(+) = 50569
    and flex_cc.template_id(+) is NULL
    and ccid_cc.code_combination_id(+) = int.code_combination_id
    and ccid_cc.chart_of_accounts_id(+) = 50569
    and ccid_cc.template_id(+) is NULL
    order by decode(int.SEGMENT1|| int.SEGMENT2|| int.SEGMENT3|| int.SEGMENT4|| int.SEGMENT5|| int.SEGMENT6
    , rpad(ccid_cc.SEGMENT2,30) || '.' || rpad(ccid_cc.SEGMENT1,30) || '.' || rpad(ccid_cc.SEGMENT3,30) || '.' || rpad(ccid_cc.SEGMENT4,30) || '.' || rpad(ccid_cc.SEGMENT5,30) || '.' || rpad(ccid_cc.SEGMENT6,30)
    , rpad(int.SEGMENT2,30) || '.' || rpad(int.SEGMENT1,30) || '.' || rpad(int.SEGMENT3,30) || '.' || rpad(int.SEGMENT4,30) || '.' || rpad(int.SEGMENT5,30) || '.' || rpad(int.SEGMENT6,30)
    ) , int.entered_dr, int.accounted_dr, int.entered_cr, int.accounted_cr, int.accounting_date
    control->len_mainsql = 16402
    length of main_stmt = 7428
    upd_stmt.arr:
    update GL_INTERFACE
    set status = :status
    , status_description = :description
    , je_batch_id = :batch_id
    , je_header_id = :header_id
    , je_line_num = :line_num
    , code_combination_id = decode(:ccid, '-1', code_combination_id, :ccid)
    , accounted_dr = :acc_dr
    , accounted_cr = :acc_cr
    , descr_flex_error_message = :descr_description
    , request_id = to_number(:req_id)
    where rowid = :row_id
    upd_stmt.len: 394
    ins_stmt.arr:
    insert into gl_je_lines
    ( je_header_id, je_line_num, last_update_date, creation_date, last_updated_by, created_by , set_of_books_id, code_combination_id ,period_name, effective_date , status , entered_dr , entered_cr , accounted_dr , accounted_cr , reference_1 , reference_2
    , reference_3 , reference_4 , reference_5 , reference_6 , reference_7 , reference_8 , reference_9 , reference_10 , description
    , stat_amount , attribute1 , attribute2 , attribute3 , attribute4 , attribute5 , attribute6 ,attribute7 , attribute8
    , attribute9 , attribute10 , attribute11 , attribute12 , attribute13 , attribute14, attribute15, attribute16, attribute17
    , attribute18 , attribute19 , attribute20 , context , context2 , context3 , invoice_amount , invoice_date , invoice_identifier
    , tax_code , no1 , ussgl_transaction_code , gl_sl_link_id , gl_sl_link_table , subledger_doc_sequence_id , subledger_doc_sequence_value
    , jgzz_recon_ref , ignore_rate_flag)
    SELECT
    :je_header_id , :je_line_num , sysdate , sysdate , 1008009 , 1008009 , 124 , :ccid , :period_name
    , decode(substr(:account_date, 1, 1), '-', trunc(sysdate), to_date(:account_date, 'YYYY/MM/DD'))
    , 'U' , :entered_dr , :entered_cr , :accounted_dr , :accounted_cr
    , reference21, reference22, reference23, reference24, reference25, reference26, reference27, reference28, reference29
    , reference30, :description, :stat_amt, '' , '', '', '', '', '', '', '', '' , '', '', '', '', '', '', '', '', '', '', ''
    , '', '', '', '', '', '', '', '', '', gl_sl_link_id
    , gl_sl_link_table
    , subledger_doc_sequence_id
    , subledger_doc_sequence_value
    , jgzz_recon_ref
    , null
    FROM GL_INTERFACE
    where rowid = :row_id
    ins_stmt.len: 1818
    glluch() 14-MAR-2007 15:39:27<< glluch() 14-MAR-2007 15:39:27
    LEZL0008: Found no interface records to process.
    LEZL0009: Check SET_OF_BOOKS_ID, GROUP_ID, and USER_JE_SOURCE_NAME of interface records.
    If no GROUP_ID is specified, then only data with no GROUP_ID will be retrieved. Note that most data
    from the Oracle subledgers has a GROUP_ID, and will not be retrieved if no GROUP_ID is specified.
    SHRD0119: Deleted 1 record(s) from gl_interface_control.
    Start of log messages from FND_FILE
    End of log messages from FND_FILE
    Executing request completion options...
    Finished executing request completion options.
    No data was found in the GL_INTERFACE table.
    Concurrent request completed
    Current system time is 14-MAR-2007 15:39:27
    ---------------------------------------------------------------------------

    As the error message says, you need to specify a group ID.
    As per the documentation:
    GROUP_ID: Enter a unique group number to distinguish import data within a
    source. You can run Journal Import in parallel for the same source if you specify a
    unique group number for each request.
    For example, if you load data for both Payables and Receivables, you need to use different group IDs to separate the Payables and Receivables data.
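    For illustration, the inserts above could carry an explicit group number (100 here is arbitrary, the other values are copied unchanged from the original insert, and the same number must then be given as the Group ID when submitting Journal Import):
    insert into gl_interface (status,set_of_books_id,accounting_date, currency_code,date_created,
    created_by, actual_flag, user_je_category_name, user_je_source_name, segment1,segment2,segment3,
    entered_dr, entered_cr,transaction_date,reference1, group_id )
    values ('NEW', '1609', sysdate, 'FRF', sysdate,1008009, 'A', 'xx jab payroll', 'xx jab payroll', '01','002','4004',111.11,0,
    sysdate,'xx_finance_jab_demo', 100);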
    HTH

  • Deleting data from a very large log table (custom table in our namespace)

    Hello,
    I have been tasked with clearing a log table in our landscape so that it only includes the most recent entries. Is it possible to do this, given that the table already has 230 000 000 entries and will need to keep around 600 000 recent entries?
    Should I do this via ABAP, and if so, how? Thanks,
    Samir

    Hi,
    so you are going to keep 0.3% of your data?
    Whether you do it in ABAP or on the database is your decision.
    In my opinion, doing things directly on the database should be reserved for
    exceptional cases only, e.g. for one-time actions or actions that have to
    be done very rarely and with different parameters/options. Regular
    and similar tasks should be done in ABAP, I think.
    In any case I would not delete the majority of the records, but rather copy
    the records to keep into an empty table with the same structure, delete the
    old table as a whole (check clients!), and "copy" the new table back in ABAP,
    or rename the new table to the old table after dropping the old table on the database.
    If you have only one client, you can copy the data you need into a new
    table and truncate the old table (fast deletion for all clients). If you have
    data to keep for other clients as well, check how much data it is per client
    in comparison to the total number of lines (if it is only a small fraction, prefer copying
    it too).
    On the database you can use CTAS (create table as select), drop table, and
    rename table. Those commands should be very efficient but work client-independently.
    If you have to consider clients, then SELECT, INSERT, DELETE, or TRUNCATE (depending on whether you have
    copied all data considering clients) are
    your friends.
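    On the database side that boils down to something like this (Oracle syntax for illustration; zlog, zlog_keep, and created_on are made-up names, and this ignores the client handling mentioned above):
    -- Copy only the rows to keep, then swap the tables
    CREATE TABLE zlog_keep AS
      SELECT * FROM zlog WHERE created_on >= SYSDATE - 30;
    DROP TABLE zlog;
    RENAME zlog_keep TO zlog;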
    Kind regards,
    Hermann

  • Persisting two beans in the same table, duplicate table name in different beans

    Hi,
    I need to persist two beans belonging to different classes in the same
    table, but the following error happens:
    [jdo-enhance] javax.jdo.JDOFatalUserException: Invalid jdbc-table-name:
    Duplicate table name
    Is there some way of persisting both objects in the same table?
    thanks
    jasabino

    No. It would be very confusing to do so. Can they not share an
    inheritance hierarchy? You can use flat mapping to do so.

  • SRKIM: R12: GL_INTERFACE IN R12

    PURPOSE
    To explain how the GL_INTERFACE table is used in R12.
    SOLUTION
    In R12, unlike in R11i, running a GL transfer from a subledger module does not necessarily send the data to the GL_INTERFACE table.
    When the profile option 'SLA: Disable Journal Import = No' is set, running the GL Transfer program from a subledger such as AP bypasses the GL_INTERFACE table: a temporary table (created as xla_glt_<group id>) is built, the data is extracted into it, and Journal Import is then run automatically.
    In this case, if there is a problem with the transferred data and an error such as EC12 occurs, no data remains in GL_INTERFACE; the data is rolled back to the subledger accounting tables instead.
    Therefore, if you want the data to always go through the GL_INTERFACE table, set 'SLA: Disable Journal Import = Yes' before using it.
    REFERENCE
    Note 459383.1 - GL_Interface in R12
    Note 551386.1 - FSAH After Journal Import Ends In Errors: No Data In

    ER information for Apps is available at https://etrm.oracle.com - you will need a valid MOS login. Some information is in these MOS Docs
    1012626.102 - HOW DO GL TO AR TO GL REFERENCE FIELDS, GL_IMPORT_REFERENCES MAP?
    578959.1 - R12 Populating GL_INTERFACE - Assigning Values to Optional Columns (i.e Reference Fields)
    186230.1 - R11i Payables Transfer To GL Reference Field Mappings
    HTH
    Srini

  • Controlling Journal Import from GL_INTERFACE

    Hi All,
    We are moving data into GL_INTERFACE through a concurrent program. All the NOT NULL fields are populated, and when Journal Import is run, the data is imported properly.
    The requirement:
    Now we need a control such that we will have a parameter, Yes_No, and if we select No, the data will be inserted into GL_INTERFACE, but some flag/field will be set so that when Journal Import runs, the records won't be selected for import; they should just sit in GL_INTERFACE.
    I have checked the GL_INTERFACE table and I am not finding any field which, when set to a value, won't be picked up by Journal Import. Has anyone worked on this or has the required info?
    Thanks
    Shanks

    Hi Shanks,
    If your requirement is for the records not to be selected by Journal Import, then try inserting the records with a value in the Status column other than NEW, i.e. one which is not recognized by the Journal Import program. You can get the values or codes which Journal Import does recognize from the listing of an already-run Journal Import.
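    For example (the 'HOLD' value is just a placeholder for any status Journal Import does not recognize, and the bind variable is illustrative):
    -- Rows loaded with an unrecognized status simply stay in GL_INTERFACE.
    -- When the batch should finally be imported, flip the status back and run Journal Import:
    UPDATE gl_interface
       SET status = 'NEW'
     WHERE group_id = :p_group_id
       AND status  = 'HOLD';
    COMMIT;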
    Regards,
    Kumaran

  • How to drop all tables in a particular schema?

    Hi,
    I am new to Oracle.
    I want to drop all tables in one particular schema.
    Please tell me the solution.
    PratHamesh

    If a few of your tables have referential integrity constraints and you try to drop a master table without dropping the child table first, Oracle will produce an error.
    A better option would be to drop the entire schema and then create a new schema with the same name.
    In SQL*Plus:
    set long 20000
    select dbms_metadata.get_ddl('USER','USERNAME') from dual;
    --then save the above output to create the user later.
    drop user username cascade;
    Use the above saved output to create the user again.
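    If you do want to keep the schema and drop only its tables, a simple loop also works (run it as the schema owner; CASCADE CONSTRAINTS avoids the referential integrity errors mentioned above):
    BEGIN
      FOR t IN (SELECT table_name FROM user_tables) LOOP
        EXECUTE IMMEDIATE 'DROP TABLE "' || t.table_name || '" CASCADE CONSTRAINTS';
      END LOOP;
    END;
    /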
    Jaffar

  • What is the basic use of Interface Tables

    Hi there,
    We are trying to connect and send data to the Oracle EBS system.
    We are facing different scenarios and I don't know which approach to use in which scenario.
    So if I add a vendor, for instance, should I use the interface tables or call a stored procedure?
    When using interface tables, would I get feedback? Success/failure, and if there is a failure, would I get a detailed exception (name too long, id mandatory, etc.)?
    The same question for stored procedures.
    A simple answer would be very helpful, and a link to any documentation would be great.
    Thanks

    Hi,
    Using interface tables is normally accompanied by a concurrent request, which does data validation, generation of a report, and the data import as well.
    So, if you are using the correct data structures, using interface tables and running the matching concurrent request might save you a lot of time.
    Most concurrent requests which process open interface tables generate fairly detailed reports on the records that were processed or that failed
    validation. In addition, you normally have some kind of status flag in open interface tables indicating the processing state, and sometimes error codes
    as well (see the gl_interface table or the ap_invoices_interface tables).
    When using API packages, you may have more flexibility (sometimes more features, e.g. with the TCA APIs), but you have to do some extra
    coding for validation, populating default values, and so on.
    API packages do always return a success/error state, which can be extracted in detail by using the fnd_msg_pub package - but you have to code it
    yourself.
    I would say open interface tables are a bit easier to handle, whereas some of the API packages need more experience (e.g. with the TCA APIs
    there are some steps to do in the right order to get a customer created).
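    As an illustration of that status-flag pattern, after a Journal Import run the rows that failed validation can be inspected straight from the interface table (standard GL_INTERFACE columns; the bind variable is a placeholder for your batch):
    SELECT status, status_description, descr_flex_error_message, COUNT(*)
      FROM gl_interface
     WHERE group_id = :p_group_id
     GROUP BY status, status_description, descr_flex_error_message;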
    Regards
