Populating the PO_REQUISITIONS_INTERFACE_ALL table

Hi all
I am new to the technical side of Oracle.
Which product would you advise as the best way to populate the PO_REQUISITIONS_INTERFACE_ALL table, so that we can import requisitions from an external system into Oracle?
Thanks

>>> Which product would you advise as the best way to populate the PO_REQUISITIONS_INTERFACE_ALL table, so that we can import requisitions from an external system into Oracle?
In my experience, we transfer the data from the external system into a temporary (staging) table in the new system and write a custom interface script that populates the interface table (PO_REQUISITIONS_INTERFACE_ALL) from that staging table. You need to discuss with your functional counterpart which columns have to be populated in the interface table; they are documented in the PO user guide.
AFAIK, there is no third-party software available for this.
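A minimal sketch of such a custom interface script, assuming a hypothetical staging table XX_PO_REQ_STG and only a handful of interface columns (the columns that are mandatory for your release are listed in the PO user guide / Open Interfaces manual):

-- Sketch only: XX_PO_REQ_STG and its columns are hypothetical, and the
-- PO_REQUISITIONS_INTERFACE_ALL column list here is abbreviated; check the
-- PO user guide for the columns that are required in your release.
INSERT INTO po_requisitions_interface_all
       (interface_source_code,
        org_id,
        destination_type_code,
        authorization_status,
        preparer_id,
        charge_account_id,
        item_id,
        quantity,
        need_by_date,
        creation_date,
        created_by)
SELECT 'EXTERNAL_SYSTEM',
       stg.org_id,
       'EXPENSE',
       'APPROVED',
       stg.preparer_id,
       stg.charge_account_id,
       stg.item_id,
       stg.quantity,
       stg.need_by_date,
       SYSDATE,
       fnd_global.user_id
FROM   xx_po_req_stg stg
WHERE  stg.process_flag = 'NEW';

COMMIT;

-- After the load, run the Requisition Import concurrent program to turn the
-- interface rows into requisitions, and check PO_INTERFACE_ERRORS for any
-- rows it rejects.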

Similar Messages

  • Getting an error while populating the XREF table

    Hi all,
    I have created a package for populating the XREF table (I have imported the required KMs into this project), in which the steps are as follows:
    1. Get the source column name from the AIAserviceConfigProperties file.
    2. Then the interface extracts the job_id into the XREF table.
    I am getting the source column name, but I am not able to populate the XREF table. I am getting the following error:
    com.sunopsis.core.SnpsInexistantObjectException: There is no connection for this logical schema/context pair:ESB_XREF / GLOBAL
    Please throw some light on this.
    Regards,
    Sourav

    hi all,
    Do I have to create a new context because of importing KM_LKM SQL to SQL (ESB XREF) into my project?
    Regards,
    Sourav

  • Populating the condition tables

    hi ,
    Can anyone explain clearly how to populate the condition tables programmatically?
    It's really urgent.
    regards,
    srikanth tulasi.

    Hi Prabhu,
    Thanks for your quick reply. In fact, I am developing a BAPI function module in which I send pricing scales to the external interface. For that, I have to access the condition tables to find out the pricing scales. So please help me find the procedure to access the condition table (e.g. A001) whose name comes as a value in my internal table.
    Thanks,
    John.

  • Select statement not populating the internal table

    Hi,
    I have a requirement where I have to upload a file from the C: drive; the fields in this file are VBELN, description, and date of creation.
    I am able to get this file into an internal table. After this I need to cross-check the VBELN against VBRK-VBELN and, if present, update a Z table. How do I do the cross-check part?
    if not tw_zvatcn[] is initial,
      select * from vbrk
          into table tw_vbrk
          for all entries in tw_zvatcn
           where vbeln = tw_zvztcn-vbeln
                and vkorg = p_vkorg.
      if sy-subrc = 0.
       modify ztzb from lw_zvatcn.
      endif.
    endif.
    Internal table tw_vbrk comes back empty, which is not correct, because I can see the data in database table VBRK.

    Is p_vkorg a parameter or a select-option?
    If it is a parameter and is blank, you will not get any data in the table.
    In that case, add a condition for that field as well:
    if not tw_zvatcn[] is initial.
      if p_vkorg is not initial.
        select * from vbrk
          into table tw_vbrk
          for all entries in tw_zvatcn
          where vbeln = tw_zvatcn-vbeln
            and vkorg = p_vkorg.
        if sy-subrc = 0.
          modify ztzb from lw_zvatcn.
        endif.
      else.
        select * from vbrk
          into table tw_vbrk
          for all entries in tw_zvatcn
          where vbeln = tw_zvatcn-vbeln.
        if sy-subrc = 0.
          modify ztzb from lw_zvatcn.
        endif.
      endif.
    endif.

  • Error: while populating the target.

    While populating the target table, I am getting the following error:
    ODI-1228: Task Int_HDM_IND_PRTY_ETHN (Export) fails on the target ORACLE connection ORACLE_ETLDEV_INOVA_3.
    Caused By: java.sql.SQLException: ORA-01471: cannot create a synonym with same name as object
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:457)
         at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:405)
         at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:889)
         at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:476)
         at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:204)
         at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:540)
         at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:217)
         at oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:1079)
         at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1466)
         at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3752)
         at oracle.jdbc.driver.OraclePreparedStatement.execute(OraclePreparedStatement.java:3937)
         at oracle.jdbc.driver.OraclePreparedStatementWrapper.execute(OraclePreparedStatementWrapper.java:1535)
         at oracle.odi.runtime.agent.execution.sql.SQLCommand.execute(SQLCommand.java:163)
         at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:102)
         at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:1)
         at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:537)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
         at java.lang.Thread.run(Thread.java:662)

    Hi,
    Bottom line: an object with that name already exists in your back-end DB.
    Log in to your back-end schema and query:
    select * from all_objects where object_name like '<object name which ODI generates>%'
    Drop it manually and rerun the interface (see the cleanup sketch after this reply).
    Cause of the problem: initially you probably ran the same interface with a different KM and set DELETE_TEMPORARY_OBJECTS to False, or ODI failed without dropping the temporary table, so ODI kept the C$_<object name> object in the back end (possibly as a table).
    Now, with the DBLINK, you are trying to create a synonym with the same name as the earlier object, which eventually fails.
    Makes sense?
    Thanks,
    Guru
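    A hedged cleanup sketch, assuming the leftover work object reported by ODI is C$_CUSTOMER (a hypothetical name; substitute the object name from your session log):

    -- Find the leftover ODI work object (the name C$_CUSTOMER is hypothetical).
    SELECT object_name, object_type
    FROM   all_objects
    WHERE  object_name LIKE 'C$_CUSTOMER%';

    -- Drop whichever object type actually exists, then rerun the interface.
    DROP TABLE C$_CUSTOMER;
    -- DROP SYNONYM C$_CUSTOMER;   -- use this form if it was left as a synonym

    -- To stop this recurring, set DELETE_TEMPORARY_OBJECTS back to True in the
    -- KM options so ODI drops its C$_ objects after each run.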

  • How to populate the main table and the lookup's at the same time

    Hi ,
    What I have with me are the XML files which contain the data from the material master, and an Excel sheet which describes the mapping. It basically tells me which field of the main table maps to which field of which segment in the IDoc, and also the name of the table and the field in the R/3 system.
    I wish to use this info to populate the data in the material repository.
    - How can I populate the data in the lookup tables at the same time as I am populating the main table? I only have the XMLs that correspond to the main table. I don't have separate data for the lookup tables.
    - Can I use the standard maps available for import in the business content of the material repository in MDM?
    - If the answer to the second question is no, then I think I can create the maps and save them for future use.
    Regards
    Deepak Singh

    Hi, Deepak
    >>> How can I populate the data in the lookup tables at the same time as I am populating the main table? I only have the XMLs that correspond to the main table. I don't have separate data for the lookup tables.
    I don't think you can populate both the main table and all fields of the lookup tables at the same time, i.e. using the same map. You can consider 2 options to upload all the information you have:
    1) If your XML file contains data you would like to upload to lookup tables, you can upload it to the MDM lookup tables with several maps, using the same XML and choosing the different sections of that XML that correspond to the different MDM lookup tables.
    2) You can also upload the main table simultaneously with lookup table entries (using the same map), but in this case new lookup table entries will only contain the display field values that you mapped. To do this, use the 'Add' value mapping functionality for the fields that you mapped to lookup tables.
    >>> Can I use the standard maps available for import in the business content of the material repository in MDM?
    1) If you have the material master repository delivered by SAP and you use XML files whose structure corresponds to the SAP pre-delivered XSD schemas, then you can use these maps without changes.
    2) If your repository is based on the SAP pre-delivered one but you have changed it, you should adjust these maps to account for the differences between the repository structure and the XML file structure.
    3) If you created your repository from scratch, you should consider making your own import maps.
    Regards,
    Vadim Kalabin

  • Populating a hash table

    Hi,
    I have a Hashtable which contains a number of objects of my custom class.
    I need the properties of these objects to be stored in a JTable, one row per object.
    Could anyone offer me any help?
    Anything will be very much appreciated.

    The title of your post is "populating a hash table" but the first sentence suggests you have already populated the hash table and you want to get data out of it.
    The elements() method gives you an Enumeration that allows you to extract the values from the Hashtable.
    There's a tutorial here about how to use JTables: http://java.sun.com/docs/books/tutorial/uiswing/components/table.html

  • Moving  data in all the internal tables to the final table  t_data

    Hi all,
    How do I move the data in all the internal tables into the final table t_data?
    *selecting fields from bkpf table
      SELECT     bukrs
                 belnr
                 gjahr
                 bldat
                 xblnr
                 usnam
         FROM    bkpf
         INTO TABLE t_bkpf
        WHERE  bukrs  EQ po_bukrs AND
               belnr IN  so_belnr  AND
               budat IN  so_budat  AND
               blart IN  so_blart.
      IF t_bkpf[] IS INITIAL.
        MESSAGE a999(zfi_ap_gl) WITH text-011.
        STOP.
      ELSE.
    *selecting fields from  bseg table.
        SELECT  bukrs
                belnr
                gjahr
                koart
                shkzg
                dmbtr
                zuonr
                sgtxt
                kostl
                hkont
                lifnr
                prctr
                FROM bseg
                INTO  TABLE  t_bseg
                FOR ALL ENTRIES IN t_bkpf
              WHERE bukrs EQ  t_bkpf-bukrs AND
                    belnr EQ t_bkpf-belnr AND
                    gjahr EQ t_bkpf-gjahr AND
                    lifnr IN so_lifnr.
      ENDIF.
      IF t_bseg[] IS INITIAL.
        MESSAGE a999(zfi_ap_gl) WITH text-011.
        STOP.
      ELSE.
    *selecting the companies address from adrc table
        SELECT  SINGLE addrnumber street str_suppl2 city1
                       region post_code1
                       FROM adrc
                       INTO wa_adrc
                       WHERE addrnumber EQ w_adrnr.
    *selecting adrnr from the lfa1 table
        SELECT lifnr adrnr name1 ort01 regio pstlz
                     FROM lfa1
                     INTO TABLE t_adrnr
                     FOR ALL ENTRIES IN t_bseg
                     WHERE  lifnr EQ t_bseg-lifnr.
        IF NOT t_adrnr[] IS INITIAL.
    *populating the t_vaddress table.
          SELECT  addrnumber
                  street
                  str_suppl2
                  FROM adrc
                  INTO TABLE t_vaddress
                  FOR ALL ENTRIES IN t_adrnr
                  WHERE addrnumber  EQ t_adrnr-adrnr.
    *populating the t_vendor table with the vendor address
          SELECT lifnr
                 adrnp_2
                 namev
                 name1
                 INTO TABLE t_vendor
                 FROM knvk
                 FOR ALL ENTRIES IN t_adrnr
                 WHERE  lifnr EQ t_adrnr-lifnr AND
                        adrnp_2 EQ t_adrnr-adrnr.
        ENDIF.
      ENDIF.

    Loop at the internal table which has the largest number of records, then inside that loop use READ TABLE ... for the other internal tables, and append the combined result to the final internal table.
    Example:
    LOOP AT IT_VBRP INTO WA_VBRP.
        WA_FINAL-WERKS = WA_VBRP-WERKS_I.
        WA_FINAL-KUNAG = WA_VBRP-KUNAG.
        WA_FINAL-AEDAT = WA_VBRP-AEDAT.
        READ TABLE IT_KONV INTO WA_KONV WITH KEY KNUMV = WA_VBRP-KNUMV
                                                 KPOSN = WA_VBRP-POSNR_I.
        IF SY-SUBRC EQ 0.
          WA_FINAL-KSCHL = WA_KONV-KSCHL.
          CLEAR WA_KONV.
        ENDIF.
        READ TABLE IT_KNA1 INTO WA_KNA1  WITH KEY KUNNR = WA_VBRP-KUNAG.
        IF SY-SUBRC EQ 0.
          WA_FINAL-NAME1 = WA_KNA1-NAME1.
          CLEAR WA_KNA1.
        ENDIF.
        APPEND WA_FINAL TO IT_FINAL.
        CLEAR: WA_FINAL,WA_KONV,wa_kna1.
      ENDLOOP.

  • How do we populate the SIGN_POST table for routing server

    Hi,
    How do we populate the SIGN_POST table for the routing server? We have the street data network and have populated the NODE, EDGE and PARTITION tables from the street data. Is there any procedure available for populating the SIGN_POST table from street data? Are there any predefined methods available?
    Thanks and Regards
    Aravindan

    This data has to come from the data provider.
    There are no procedures to populate this information if this data is not available for your data set.
    For example, NAVSTREETS from NAVTEQ has this data as part of their street network data.
    siva

  • Users details not getting populated in the portal tables..

    Hi,
    We have implemented single sign-on (SSO) and we do not create users in the portal, as the user/password come from OID.
    Because of this, the portal tables are not getting populated. I need all the users in OID to be populated in the portal tables. How can I do this? I desperately need it.

    You can use the function wwsec_api.activate_portal_user to create the shadow records in the portal repository.
    Ref. : http://www.oracle.com/technology/products/ias/portal/html/plsqldoc/pldoc1014/wwsec_api.html
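    A hedged sketch of such a call, run as the portal schema owner; the p_username parameter name and the user JSMITH below are assumptions, so verify the exact signature in the wwsec_api reference linked above:

    -- Sketch only: parameter name and user are assumptions; check the
    -- wwsec_api documentation for the real signature before using.
    BEGIN
      wwsec_api.activate_portal_user(p_username => 'JSMITH');
      COMMIT;
    END;
    /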

  • Functional Area field not getting populated in the AUFK table

    Hi Friends,
    The table AUFK is the table for Internal Order master data.
    The issue which we are facing is:
    In a particular Order type, the field 'Functional Area' is suppressed in the field status group for the order master data.
    But the cost centers which are maintained in the order master data have the functional area field populated with a value.
    In the table AUFK, the functional area field is populated for some internal orders and not for others. In all cases it should pick the functional area from the cost center which is maintained in the internal order.
    Please guide us on the possible reason(s) why the field is populated in some cases and not in others.
    Thanks a lot in advance for your help.
    Regards,
    Shilpi

    Hi,
    Maintain the functional area in the order master data to ensure that the functional area is always available in AUFK.
    I can't tell you why it sometimes appears and sometimes doesn't, but FUNC_AREA is not picked from the responsible/requesting cost center (from master data).
    If, in the customizing of internal order types, a "model order" with a maintained functional area is assigned, this functional area is taken when you create a new order for this order type.
    Best regards, Christian

  • Populating the table with values in a jsp

    Need help...
    I have a JSP with 2 text boxes, lastname and firstname, and a submit button.
    When I click on submit, I should get a table on the same JSP below the submit button, populated with the values entered in the text boxes.
    Can you please help me out with the functionality of populating the table?

    Add an onclick function to the button. In that onclick you have to populate a div with fname and lname.
    In that new div use bean:write so that you can get what you entered in the text boxes above.

  • Populating the test data in table of IDES ECC 5.0 in Oracle

    Hi Guys,
    I have installed IDES ECC 5.0 successfully without any errors, but I don't see the data in tables like PA0001 etc. Can somebody give me the steps for populating the tables with test data? I was able to sign on using DDIC in client 000.
    Thanks,

    You are using the wrong client; log in to client 800. Check transaction SCC4 to see which clients are available to log into.

  • How to add business logic before populating the read only tables.

    Hi All
    Could you please suggest something for the following requirement:
    I want to populate read-only tables from the data control palette (based on a VO) into the jsff, but before populating the table I want to add conditions that determine which data should appear in the table.
    For example:
    There is a VO fetching the tasks completed and pending for the user. Only one VO is there to fetch the tasks, but different tables are used to show the pending and completed tasks. I have created a bind variable for the task status.
    Do I need to add business logic in the backing bean for the jsff which has the setters and getters for the tables?
    Regards,
    Kanika

    Hi,
    Why don't you specify a ViewCriteria on the View Object you use, with a bind variable to filter the table data? You can apply the ViewCriteria in the AM data model so that it only shows, e.g., completed tasks; or you can assign a view criteria that uses bind variables, in which case the table is filtered dynamically by the value of the bind variable; or you can use Java code to apply the view criteria dynamically.
    Frank

  • Ship to record not populated in the CDHDR table at the time of creation

    HI
    I have created a ship-to party in R/3, but I could not find the customer number entry with change object "I" in the table CDHDR, although ideally all master data creation entries should be written to the CDHDR table with "I".
    Please help me out here.
    Helpful answers will be rewarded.
    Regards,
    Ram.

    In that case, try maintaining only the dates in that table and execute. Once the data is populated, ensure that the transaction code is XD01 and not XD02. Moreover, I am not sure the system will show the creation time there. Maybe you can try KNA1, where you can also see the creation date, but not the time.
    As an alternative, execute XD02 for that ship-to party and, from the top menu bar, click Environment => Account changes => All fields. Then double-click on "Entries" so that the system displays the creation of, and changes made to, that customer with dates. Double-click on the first record and the system will show when that particular activity took place and at what time.
    thanks
    G. Lakshmipathi
