OSI / OTI table data mismatch, OTI being subset of OSI

Hi,
We have a custom application where we fetch the count and data for all provisioning records assigned to the logged-in user.
To fetch details of open provisioning tasks, the Oracle recommendation is to use the OTI table.
When we studied the current system, we had the following findings regarding the behavior of the OSI and OTI tables:
1. When a request is submitted to OIM, an approval process is triggered depending upon the approval policy.
2. This workflow is responsible for creating approval tasks.
3. When the approval tasks are approved, the request in OIM is closed and a provisioning task for the system gets created.
4. The OIM database tables used to fetch these provisioning tasks are OSI/OTI.
5. According to the Oracle documentation, the OTI table is a subset of the OSI table.
6. When we checked both tables, we found that the entries in them are not exactly the same.
7. For a particular request, the OSI table had OSI_ASSIGN_TYPE as 'Group' while the OTI table had OSI_ASSIGN_TYPE as 'User'.
8. As the OTI table has a null osi_assigned_to_ugp_key value, this table cannot be used.
It looks like the OTI table does not hold correct data. The OSI table has around 9.6 million records, so when we query the OSI table we get correct data, but it takes 6 minutes.
We are now looking into why there is this mismatch and how it can be solved.
Regards,
Deepika

Hi Kevin,
Thanks for these details.
We will try this query in our environment and see the result.
But regarding the OSI/OTI data mismatch: I understand that OTI contains tasks that are still open, e.g. Rejected and Pending. These tasks are also present in OSI, but when we check the same record in both tables the assign type differs (as described earlier, OSI shows 'Group' while OTI shows 'User').
Is that intended functionality, or is something wrong here?
Because of this we cannot get correct data directly from the OTI table and have to use OSI (which is hampering performance).
This is the query that we are using:
select distinct oti.orc_key, a.usr_login,
       (select min(inner.oti_create) from oti inner where inner.orc_key = oti.orc_key) as oti_create
  from oti
       inner join oiu on oti.orc_key = oiu.orc_key
       inner join usr a on oiu.usr_key = a.usr_key
       inner join osi on oti.sch_key = osi.sch_key
 where sch_status in ('P', 'R')
   and obj_key in (select obj_key from obj
                    where obj_name in ('DB App', 'K2 App', 'LDAP App', 'SPML App', 'Manual App'))
   and (osi.osi_assign_type = 'Group'
        and osi.osi_assigned_to_ugp_key in
            (select ugp_key from usg where usr_key = " + Long.toString(Oim.getCurrentUser().getKey()) + "))
UNION ALL
select distinct oti.orc_key, a.usr_login,
       (select min(inner.oti_create) from oti inner where inner.orc_key = oti.orc_key) as oti_create
  from oti
       inner join oiu on oti.orc_key = oiu.orc_key
       inner join usr a on oiu.usr_key = a.usr_key
       inner join osi on oti.sch_key = osi.sch_key
 where sch_status in ('P', 'R')
   and obj_key in (select obj_key from obj
                    where obj_name in ('DB App', 'K2 App', 'LDAP App', 'SPML App', 'Manual App'))
   and (osi.osi_assign_type in ('User', 'Default task assignment')
        and osi.osi_assigned_to_usr_key = " + Long.toString(Oim.getCurrentUser().getKey()) + ")
order by oti_create
Regards,
Deepika
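As an aside on the query above: the logged-in user's key is concatenated directly into the SQL string in the Java code. A minimal sketch (not the application's actual code) of binding it as a JDBC parameter instead, which avoids SQL injection and lets the database reuse the parsed statement; the table and column names are taken from the posted query, while the `Connection` and the `Oim` API from the original application are assumptions shown only in comments:

```java
public class OpenTasksQuery {

    // Same filter logic as the second branch of the posted UNION,
    // with a ? placeholder where the user key was concatenated.
    static String buildQuery() {
        return "select distinct oti.orc_key, a.usr_login, oti.oti_create"
             + " from oti"
             + " inner join oiu on oti.orc_key = oiu.orc_key"
             + " inner join usr a on oiu.usr_key = a.usr_key"
             + " inner join osi on oti.sch_key = osi.sch_key"
             + " where sch_status in ('P', 'R')"
             + " and osi.osi_assign_type in ('User', 'Default task assignment')"
             + " and osi.osi_assigned_to_usr_key = ?";
    }

    public static void main(String[] args) {
        String sql = buildQuery();
        // With a live connection, the key would be bound like this:
        // try (PreparedStatement ps = conn.prepareStatement(sql)) {
        //     ps.setLong(1, Oim.getCurrentUser().getKey());
        //     try (ResultSet rs = ps.executeQuery()) { /* ... */ }
        // }
        System.out.println(sql);
    }
}
```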

Similar Messages

  • How to prevent Pivot Table Data Source from being changed automatically?

I have an Excel file that contains a Pivot Table that references a table within the workbook (Data Source = "MyTable"). When I save the file and copy it, the copy contains a reference to the original file: instead of displaying "MyTable", it shows "OldFilename!MyTable", where OldFilename is the name of the original file, not the current file.
How do I prevent this from occurring? It only happens on my current machine; if I perform these steps on a different machine, the problem does not occur and the Data Source property stays the same as it should.
What setting(s) do I change to fix this problem?
    Thank you in advance for your help!
    Michael

    Hi Michael,
How about the issue now, is it solved? For the 'data model' feature, please refer to:
http://blogs.office.com/2012/08/23/introduction-to-the-data-model-and-relationships-in-excel-2013/
Let us know if you have any additional questions or concerns.
    Best regards,
    Wind

  • Data Mismatch while selecting from External Table

Hi, I am not able to create an external table. I am trying to create a test table; I am able to create it, but when I select the data it shows a data mismatch. I tried it with test data, but it returned an error. I want to load from an Excel file saved as test.csv.
    CREATE TABLE Per_ext
    CITY VARCHAR2(30),
    STATE VARCHAR2(20),
    ZIP VARCHAR2(10),
    COUNTRY VARCHAR2(30)
    ORGANIZATION EXTERNAL
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY dataload
    ACCESS PARAMETERS
    MISSING FIELD VALUES ARE NULL
    LOCATION ('test.csv')
    REJECT LIMIT UNLIMITED;
test.csv file contents:
city        state       zip      country
Bombay      MH          34324    india
london      London      1321     UK
Pune        MH          3224     india
Banglore    Karnataka   11313    india
    rgds
    soumya

    Hi Justin
I am getting the following error when I try it from Toad:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-00554: error encountered while parsing access parameters
    KUP-01005: syntax error: found "comma": expecting one of: "badfile, byteordermark, characterset, column, data, delimited, discardfile, exit, fields, fixed, load, logfile, language, nodiscardfile, nobadfile, nologfile, date_cache, processing, readsize, string, skip, territory, variable"
    KUP-01007: at line 1 column 29
    ORA-06512: at "SYS.ORACLE_LOADER", line 19
    rgds
    soumya
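Judging from KUP-01005 (a comma found where a clause keyword was expected at line 1, column 29), the ACCESS PARAMETERS block is likely missing its surrounding parentheses and the record/field clauses, and the posted DDL is also missing the parentheses around the column list and the external clause. A hedged sketch of the corrected DDL, assuming a comma-delimited test.csv whose first line is a header row and that the `dataload` directory object exists and is readable:

```sql
CREATE TABLE per_ext (
  city    VARCHAR2(30),
  state   VARCHAR2(20),
  zip     VARCHAR2(10),
  country VARCHAR2(30)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY dataload
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    SKIP 1
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('test.csv')
)
REJECT LIMIT UNLIMITED;
```

If test.csv has no header line, drop the SKIP 1 clause.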

  • Data mismatch between TABLE and Std. Extract str. (AFPO & 2LIS_04_P_MATNR)

    Hi,
I am getting a data mismatch between an ECC standard table and an ECC standard extract structure. Please help me.
I am using the 2LIS_04_P_MATNR data source (standard extract structure).
    <b>I have following data in ECC Table (AFPO - Order Item).</b>
    AUFNR (Order Number) --> 4000460
    PGMNG (Total planned quantity)  --> 0
    PSMNG (ORDER QTY) --> 2000
    <b>Extract Structure is showing following data.</b>
    AUFNR (Order Number) --> 4000460
    PGMNG (Total planned quantity)  --> 2000 (It should be zero)
    PSMNG (ORDER QTY) --> 2000
I have checked with many other order numbers. <b>The extract structure in RSA3 shows correct values only if PGMNG is non-zero. If PGMNG is zero, it automatically takes the PSMNG value.</b>
Is this common? I want to display the correct PGMNG values from the AFPO table. Why this mismatch between table data and extract data? I have done all the required steps for refreshing the data (deleting setup tables, filling setup tables, etc.).
Please suggest. Points will be assigned to every useful answer.
    Thanks in advance.

    Hi Avneet,
I have a similar problem where AFPO shows the value 1000 in the PSMNG field and RSA3 shows 1200 for extractor 2LIS_04_P_MATNR.
I have found that, as standard, the extractor extracts the data at the time of release of the process order. If the process order is changed after the release, the extractor will not pick up the change.
Ex: AFPO value 1000 --- BI value 1000
Now released in ECC
AFPO value 1000 --- BI value 1000
In ECC the value is changed to 1500 after the release; BI will not pick up the change.
The extractor pulls the data from the MCAFPOV table. You can check the values there; they match the RSA3 values.
I was just going through the forums for the solution; after I found it, I am updating the thread.
Hope it helps you.
Thanks
    Srikanth

  • Data mismatch in AFPO table & 2LIS_04_P_MATNR str. in RSA3

    Hi,
I am getting a data mismatch between an ECC standard table and an ECC standard extract structure. Please help me.
I am using the 2LIS_04_P_MATNR data source (standard extract structure).
    I have following data in ECC Table (AFPO - Order Item).
    AUFNR (Order Number) --> 4000460
    PGMNG (Total planned quantity) --> 0
    PSMNG (ORDER QTY) --> 2000
    Extract Structure is showing following data.
    AUFNR (Order Number) --> 4000460
    PGMNG (Total planned quantity) --> 2000 (It should be zero)
    PSMNG (ORDER QTY) --> 2000
I have checked with many other order numbers. The extract structure in RSA3 shows correct values only if PGMNG is non-zero. If PGMNG is zero, it automatically takes the PSMNG value.
Is this common? I want to display the correct PGMNG values from the AFPO table. Why this mismatch between table data and extract data? I have done all the required steps for refreshing the data (deleting setup tables, filling setup tables, etc.).
Please suggest. Points will be assigned to every useful answer. Can anyone debug this in RSA3 and let me know your findings?
    Thanks in advance.

I have tried everything, but I did not understand the logic. Can anyone please explain the logic behind it?
    thanks,

  • How to identify the data mismatch between inventory cube and tables?

    Hi experts,
I have a scenario where I need to identify the data mismatch between 0IC_C03 and the underlying tables. What are the steps to follow to avoid the data mismatch?

    Hi
You can use the data reconciliation method to check the consistency of data between R/3 and BW. Please check the links below:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a0931642-1805-2e10-01ad-a4fbec8461da?QuickLink=index&overridelayout=true
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/d08ce5cd-3db7-2c10-ddaf-b13353ad3489
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/7a5ee147-0501-0010-0a9d-f7abcba36b14?QuickLink=index&overridelayout=true
    Thanx & Regards,
    RaviChandra

  • IDOC failed due to date mismatch

    Hi Experts,
I am facing a problem with an IDoc.
An IDoc is sent from one SAP system to another.
In the target system it failed, the reason being a date mismatch.
When I checked through WE02, it failed because TODATE and FROMDATE were wrong, i.e. they got swapped.
So is there any way to process this errored-out IDoc?
Is it possible to change the dates and then process it manually?
Please let me know, as it is high priority.
Note: both SAP systems are Production.
Thanks in advance

    Hi Hemant,
Please find below the steps to edit an IDoc segment after you find the error using WE02.
Example code can be found at
http://www.sapgenie.com/sapedi/idoc_abap.htm
    STEP 1 - Open document to edit
CALL FUNCTION 'EDI_DOCUMENT_OPEN_FOR_EDIT'
  EXPORTING
    document_number = t_docnum
  IMPORTING
    idoc_control    = itab_edidc
  TABLES
    idoc_data       = itab_edidd
  EXCEPTIONS
    document_foreign_lock         = 1
    document_not_exist            = 2
    document_not_open             = 3
    status_is_unable_for_changing = 4
    OTHERS                        = 5.
STEP 2 - Loop at itab_edidd and change data
LOOP AT itab_edidd WHERE segnam = 'E1EDKA1'.
  e1edka1 = itab_edidd-sdata.
  IF e1edka1-parvw = 'LF'.
    e1edka1-partn = t_eikto.
    itab_edidd-sdata = e1edka1.
    MODIFY itab_edidd.
    EXIT.
  ENDIF.
ENDLOOP.
STEP 3 - Change data segments
CALL FUNCTION 'EDI_CHANGE_DATA_SEGMENTS'
  TABLES
    idoc_changed_data_range = itab_edidd
  EXCEPTIONS
    idoc_not_open         = 1
    data_record_not_exist = 2
    OTHERS                = 3.
STEP 3a - Change control record
CALL FUNCTION 'EDI_CHANGE_CONTROL_RECORD'
  EXPORTING
    idoc_changed_control = itab_edidc
  EXCEPTIONS
    idoc_not_open                = 1
    direction_change_not_allowed = 2
    OTHERS                       = 3.
STEP 4 - Close IDoc and update the IDoc status
CLEAR t_itab_edids40.
t_itab_edids40-docnum = t_docnum.
t_itab_edids40-status = '51'.
t_itab_edids40-repid  = sy-repid.
t_itab_edids40-tabnam = 'EDI_DS'.
t_itab_edids40-mandt  = sy-mandt.
t_itab_edids40-stamqu = 'SAP'.
t_itab_edids40-stamid = 'B1'.
t_itab_edids40-stamno = '999'.
t_itab_edids40-stapa1 = 'Sold to changed to '.
t_itab_edids40-stapa2 = t_new_kunnr.
t_itab_edids40-logdat = sy-datum.
t_itab_edids40-logtim = sy-uzeit.
APPEND t_itab_edids40.
CALL FUNCTION 'EDI_DOCUMENT_CLOSE_EDIT'
  EXPORTING
    document_number  = t_docnum
    do_commit        = 'X'
    do_update        = 'X'
    write_all_status = 'X'
  TABLES
    status_records = t_itab_edids40
  EXCEPTIONS
    idoc_not_open = 1
    db_error      = 2
    OTHERS        = 3.
Another alternative is to use WE19 with reference to the error IDoc number.
After you change the incorrect segment(s), you can post to the application.
As a result, a new IDoc will be created.
You can use FM EDI_DOCUMENT_DELETE to delete the incorrect/error IDoc number.
Also, you can use WE02 to change the incorrect segment:
double-click the incorrect segment to get the detail info, then go to Data Record (menu) -> Change -> make the necessary changes -> Save.
Then you can use the program RBDINPUT to reprocess the error IDoc.
    Hope this will help.
    Regards,
    Ferry Lianto

  • Data mismatch error

Hi team,
I have a report where the data is mismatched with the R/3 data. My questions are:
1. How do we do reconciliation with R/3 data?
2. What is aggregated level?
3. How do we run the statistical setup in LO?
Can you explain step by step how to solve data mismatch errors, e.g. a missing company code?
Thanks in advance
    v.muralikrishna

    Hi
1. How do we do reconciliation with R/3 data?
For MM, for example, we use the MB5B and MMBE transactions and also some tables (this is just one example); the related functional people will need to work with you to reconcile the data, as a BW consultant alone can't do the reconciliation.
2. What is aggregated level?
E.g., the records
06.2010 -- 10
06.2010 -- 20
at aggregated level become
06.2010 -- 30
3. How do we run the statistical setup in LO?
For LO, follow the steps below. These steps are for the SD module; for your DataSource, change the transaction used to fill the setup tables and replace the SD DataSource with your DataSource.
1. First install the DataSource in RSA5, see it in RSA6 and activate it in LBWE.
Before doing steps 2 to 5, lock the ECC system, i.e. no transactions should happen.
2. Then delete the queues in LBWQ as below:
     MCEX11 --> For 2LIS_11_*
     MCEX12 --> For 2LIS_12_*
     MCEX13 --> For 2LIS_13_*
   Be careful while doing all these deletions in production servers.
3. Then delete any entries in RSA7, e.g.:
     2LIS_11_*
     2LIS_12_*
     2LIS_13_*
   At the time of setup table filling, no entry should exist in LBWQ in ECC for the queues above.
4. Then delete the setup tables using the LBWG transaction, selecting the application numbers, i.e. 11, 12 and 13.
5. Then load the setup tables using OLI7BW, OLI8BW and OLI9BW.
   Give Name of run = XYZ, Termination Date = tomorrow's date, and execute it in the background,
   i.e. Program --> Execute in Background.
     2LIS_11_*  Use the OLI7BW transaction to fill the setup tables
     2LIS_12_*  Use the OLI8BW transaction to fill the setup tables
     2LIS_13_*  Use the OLI9BW transaction to fill the setup tables
6. Check the job status in SM37; once it finishes, go to RSA3, execute it and check.
7. Then replicate the DataSource in BW.
8. Install the InfoCube/DSO from Business Content, or create the InfoCube/DSO, and then map the ECC DataSource fields to the BW InfoObjects in transfer rules (in BW 3.5) or in transformations (in BI 7.0).
9. Map the InfoObjects in the InfoSource to the InfoObjects in the InfoCube/DSO in update rules (in BW 3.5) or in transformations (in BI 7.0).
10. Create an InfoPackage and load Init or Full.
    Thanks
    Reddy

  • Delta load from ODS to cube failed - Data mismatch

    Hi all
    We have a scenario where the data flow is like
R/3 table -> DataSource -> PSA -> InfoSource -> ODS -> Cube.
The cube has an additional field called "monthly version", and since it is a history cube
it is supposed to hold snapshots of all the data in the current cube for each month.
We are facing the problem that the data for the current month
is there in the history ODS but not in the cube. In the ODS -> Manage -> Requests
tab I can see only one red request, and that with 0 records.
However, in the Cube -> Manage -> Reconstruction tab I can see two red requests
with the current month's date. Could these red requests be the reason for the
data mismatch between the ODS and the cube?
Please guide me on how I can solve this problem.
    thanks all
    annie

    Hi
    Thanks for the reply.
The load to the cube is a daily delta.
The load to the ODS is a daily full load.
Can you help me sort out this issue? I have to work directly in the production environment, so it has to be safe and foolproof.
    Thanks
    annie

  • Excel issues with importing CSV or HTML table data from URL - Sharepoint? Office365?

    Greetings,
We have a client who is having issues importing CSV or HTML table data as one would do using Excel's Web Query import from a reporting application. As the error message provided by Excel is unhelpful, I'm reaching out to anyone who can help us begin to troubleshoot problems affecting what is normally standard Excel functionality. I'd attach the error screenshot, but I can't because my account is not verified. Needless to say, it says "Microsoft Excel cannot access the file https://www.avantalytics.com/reporting_handler?func=wquery&format=csv&logid=XXXX&key=MD5",
where XXXX is a number and MD5 is an MD5 code. The symptoms stated in the error message are:
- The file name or path does not exist
- The file is being used by another program
- The workbook you are trying to save has the same name as a currently open workbook
None of these symptoms are the case, naturally. The user encountered this with Excel 2010; she was then upgraded to Excel 2013 and is still experiencing the same issue. The output of this URL in a browser (IE, Chrome, Firefox) is CSV data for the affected user, so it is not a network connectivity issue. In our testing environment, using both Excel 2010 and 2013, this file is imported successfully, so we cannot replicate the problem. The main difference I can determine between our test environment and the end user's is that they have a SharePoint installation and appear to have Office 365 as well.
So my question might more appropriately be for SharePoint or Office 365 folks, but I can't be sure they're the culprit. Given this, does anyone have any knowledge of issues which might cause this with SharePoint or Office 365 integrated with Excel, and/or have suggestions for getting more information from Excel or Windows other than this error message? I've added the domain name as a trusted publisher in IE as I thought that might be the issue, but that hasn't solved anything. As you can see, it's already HTTPS and there is no authentication or login; the MD5 key is the authentication. The certificate for the application endpoint is valid and registered via the GoDaddy CA.
I'm at a loss and would love some suggestions on things to check/try.
Thanks -Ross

    Hi Ross,
>> In our testing environment using both Excel 2010 and 2013 this file is imported successfully, so we cannot replicate.
I suspect it is caused by a difference in the web server security settings.
KB: Error message when you use Web query to a secure Web page (HTTPS://) in Excel: "Unable to open"
Hope it will help.
By the way, this forum is mainly for discussing questions about Office development (VSTO, VBA, Apps for Office, etc.). For Office product feature-specific questions, you could consider posting them on the
Office IT Pro forum or the Microsoft Office Community.
    Regards,
    Jeffrey

  • How can we export table data to a CSV file??

    Hi,
I have the following requirement. Initially the business agreed upon exporting the table data to an Excel file, but now they would like to export the table data to a CSV file, which is not supported by the af:exportCollectionActionListener component: when I opened the exported CSV file, I could see the exported data surrounded by HTML tags. Hence the issue.
Does someone have a solution for this, i.e. how can we export the table data to CSV format? It should work similarly to exporting the data to an Excel sheet.
For your reference, here is the code which I have used to export the table data:
<f:facet name="menus">
  <af:menu text="Menu" id="m1">
    <af:commandMenuItem text="Print" id="cmi1">
      <af:exportCollectionActionListener exportedId="t1"
                                         title="CommunicationDistributionList"
                                         filename="CommunicationDistributionList"
                                         type="excelHTML"/>
      <!-- I tried removing the value of the type attribute; with no value it did not work at all. -->
    </af:commandMenuItem>
  </af:menu>
</f:facet>
    Thanks & Regards,
    Kiran Konjeti

    Hi Alex,
I have already visited that post, and it works only in 10g, not in 11g.
I found the solution. The solution is:
    Use the following code in jsff
    ==================
<af:commandButton text="Export Data" id="ctb1">
  <af:fileDownloadActionListener contentType="text/csv; charset=utf-8"
                                 filename="test.csv"
                                 method="#{pageFlowScope.pageFlowScopeDemoAppMB.test}"/>
</af:commandButton>
OR
<af:commandButton text="Export Data" id="ctb1">
  <af:fileDownloadActionListener contentType="application/vnd.ms-excel; charset=utf-8"
                                 filename="test.csv"
                                 method="#{pageFlowScope.pageFlowScopeDemoAppMB.test}"/>
</af:commandButton>
    And place this code in ManagedBean
    ======================
public void test(FacesContext facesContext, OutputStream outputStream) throws IOException {
    DCBindingContainer dcBindings = (DCBindingContainer) BindingContext.getCurrent().getCurrentBindingsEntry();
    DCIteratorBinding itrBinding = (DCIteratorBinding) dcBindings.get("fetchDataIterator");
    Row[] tableRows = itrBinding.getAllRowsInRange();
    // prepare the column headers
    PrintWriter out = new PrintWriter(outputStream);
    out.print(" ID");
    out.print(",");
    out.print("Name");
    out.print(",");
    out.print("Designation");
    out.print(",");
    out.print("Salary");
    out.println();
    // prepare the column data
    for (Row row : tableRows) {
        DCDataRow dataRow = (DCDataRow) row;
        DataLoaderDTO dto = (DataLoaderDTO) dataRow.getDataProvider();
        out.print(dto.getId());
        out.print(",");
        out.print(dto.getName());
        out.print(",");
        out.print(dto.getDesgntn());
        out.print(",");
        out.print(dto.getSalary());
        out.println();
    }
    out.flush();
    out.close();
}
And apply the following browser settings (optional), only in case the file is being blocked by IE:
    ==================================================================
    http://ais-ss.usc.edu/helpdoc/main/browser/bris004b.html
    This resolves implementation of exporting table data to CSV file in 11g.
    Thanks & Regards,
    Kiran Konjeti

  • How to Populate Internal table data to Table Control in a Report Program

    Dear All,
How can I populate internal table data into a table control in a report program? It is a pure report program without any module pool coding involved, which is just used to display data. Till now the data has been displayed in a report; now the user wants the data displayed in a table control. Could someone tell me how to go about this?
    Thanks in Advance,
    Joseph Reddy

    If you want to use a table control, you will need to create a screen.
    In your report....
    start-of-selection.
    perform get_data.  " Get all your data here
    call screen 100. " Now present to the user.
Double-click on the "100" in your call screen statement. This will forward-navigate you to the screen. If you have not created it yet, it will ask whether you want to create it; say yes. Go into Screen Painter or the layout of the screen and use the Table Control Wizard to help you along the process; it will write the code for you. Since it is an output-only table control, it will be really easy, without a lot of code.
A better way to present the data to the user would be an ALV grid. If you want to go that way, it is a lot easier. Here is a sample using the ALV function module; you don't even have to create a screen.
    report zrich_0004
           no standard page heading.
    type-pools slis.
    data: fieldcat type slis_t_fieldcat_alv.
    data: begin of imara occurs 0,
          matnr type mara-matnr,
          maktx type makt-maktx,
          end of imara.
    * Selection Screen
    selection-screen begin of block b1 with frame title text-001 .
    select-options: s_matnr for imara-matnr .
    selection-screen end of block b1.
    start-of-selection.
      perform get_data.
      perform write_report.
    *  Get_Data
    form get_data.
      select  mara~matnr makt~maktx
                into corresponding fields of table imara
                  from mara
                   inner join makt
                     on mara~matnr = makt~matnr
                        where mara~matnr in s_matnr
                          and makt~spras = sy-langu.
    endform.
    *  WRITE_REPORT
    form write_report.
      perform build_field_catalog.
    * CALL ABAP LIST VIEWER (ALV)
      call function 'REUSE_ALV_GRID_DISPLAY'
           exporting
                it_fieldcat = fieldcat
           tables
                t_outtab    = imara.
    endform.
    * BUILD_FIELD_CATALOG
    form build_field_catalog.
      data: fc_tmp type slis_t_fieldcat_alv with header line.
      clear: fieldcat. refresh: fieldcat.
      clear: fc_tmp.
      fc_tmp-reptext_ddic    = 'Material Number'.
      fc_tmp-fieldname  = 'MATNR'.
      fc_tmp-tabname   = 'IMARA'.
      fc_tmp-outputlen  = '18'.
      fc_tmp-col_pos    = 2.
      append fc_tmp to fieldcat.
      clear: fc_tmp.
      fc_tmp-reptext_ddic    = 'Material'.
      fc_tmp-fieldname  = 'MAKTX'.
      fc_tmp-tabname   = 'IMARA'.
      fc_tmp-outputlen  = '40'.
      fc_tmp-col_pos    = 3.
      append fc_tmp to fieldcat.
    endform.
    Regards,
    Rich Heilman

  • How to get the plsql table data into output cursor

    Hi,
    Could anybody please help me.
    Below is an example of the scenario..
    CREATE OR REPLACE PACKAGE chck IS
    PROCEDURE getdata(dept_no IN VARCHAR2,oc_result_cursor OUT sys_REFCURSOR);
    TYPE get_rec is record (ename varchar2(20),
    eno number(12));
    TYPE t_recs IS TABLE OF get_rec INDEX BY BINARY_INTEGER;
    emp_tab t_recs;
    END chck;
    CREATE OR REPLACE PACKAGE BODY chck AS
    PROCEDURE getdata(dept_no IN VARCHAR2,oc_result_cursor OUT sys_REFCURSOR)
    is
    BEGIN
    select ename, eno
    bulk collect into emp_tab
    from emp;
    open oc_result_cursor for select * from table(emp_tab); -- I believe something is wrong here ....
    END;
    END chck;
    the above package is giving me an error:
    LINE/COL ERROR
    10/29 PL/SQL: SQL Statement ignored
    10/43 PL/SQL: ORA-22905: cannot access rows from a non-nested table
    item
    let me know what needs to be changed
    Thanks
    Manju

manjukn wrote:
once i get the data into a plsql table, how to get this plsql table data into the cursor?
There is no such thing as a PL/SQL table; it is an array.
It is nothing at all like a table. It cannot be indexed, partitioned, clustered, etc. It does not exist in the SQL engine as an object that can be referenced. It resides in expensive PGA memory and needs to be copied (lock, stock and barrel) to the SQL engine as a bind variable.
It is an extremely primitive structure and should never be confused with a table.
Its use in SQL statements is also an exception to the rule. Sound and valid technical reasons are needed to justify pushing a PL/SQL array to the SQL engine to run a SELECT against it.
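In the posted package the array can simply be skipped: a ref cursor can be opened directly on the query, which avoids ORA-22905 entirely (TABLE() works only on collection types created at SQL level with CREATE TYPE, not on package-declared associative arrays). A hedged sketch of the package body; note the WHERE clause on dept_no is an assumption, since the posted code never actually uses that parameter:

```sql
CREATE OR REPLACE PACKAGE BODY chck AS
  PROCEDURE getdata(dept_no          IN  VARCHAR2,
                    oc_result_cursor OUT SYS_REFCURSOR) IS
  BEGIN
    -- No PL/SQL array is needed: open the ref cursor directly on the query.
    OPEN oc_result_cursor FOR
      SELECT ename, eno
        FROM emp
       WHERE deptno = dept_no;  -- assumed filter; the posted code ignores dept_no
  END getdata;
END chck;
/
```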

  • Data mismatch with ECC FBL5N report and debtor ageing report(BI side)

    Hi,
I am facing a data mismatch problem between the FBL5N report (the transaction for the customer line item report) on the ECC side and the debtor ageing report on the BI side.
The problems are:
1) There is a mismatch in some customer amounts between the ECC and BI sides.
2) There are customer numbers with amounts in the BI debtor ageing report which are not in the ECC FBL5N report.
For the second problem, I checked the tables BSID and BSAD on the ECC side; these customer numbers are not available there either.
One more strange thing: with the same selection in the reports on both ECC and BI, the mismatched data and the extra customers in the BI report change every day, i.e. we get a new set of mismatched data and extra customer numbers on the BI side each day.
If anyone has worked on this type of issue, kindly help.
    Thanks in advance

    Hi,
On the one hand, it may be the delta mechanism of the FI_*_4 extractors with the timestamp issue, meaning your comparison between BI and ECC is never up to date.
FI Extraction
On the other hand, it may be a delta problem between the data targets in your BI system, in case you load the FI data from a DSO to a cube and report on the cube. I have this problem at the moment and will watch this thread for more suggestions.

  • Inserting to MS-Access -Data mismatch

I am trying to insert records into an MS-Access table. However, I got a data mismatch exception, and on investigating I realized there was a date field that I was populating with a string. I therefore converted it into a date and ran the query again; however, now I am neither getting an exception nor is the table getting updated.
    The following is the code snippet where i get the data
    List<org.cuashi.wof.ws.nwis.ValueSingleVariable> valueList = result.getTimeSeries().getValues().getValue();
    try {
        for (org.cuashi.wof.ws.nwis.ValueSingleVariable value : valueList) {
            System.out.format("%20s%10.4f", value.getDateTime().toString(), value.getValue());
            System.out.println();
            System.out.println("obtaining time series data");
            String dateTime = value.getDateTime().toString().replace('T', ' ');
            // to convert the string into a date
            SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
            java.util.Date dt = sdf.parse(dateTime);
            sdf.applyPattern("yyyy-MM-dd HH:mm:ss");
            java.util.Date dt2 = sdf.parse(dateTime);
            updateTable(siteCode, variableCode, dt2, timvalue, varCode, qualifierCode, qualityControlLevel, conn, rs);
        }
    } catch (Exception e) {
    }

    public void updateTable(String siteCode, String variableCode, java.util.Date dt2, double tmvalue, String varCode, String qualifierCode, int qualityControlLevel, Connection con, ResultSet rs) {
        try {
            System.out.println("inside update");
            // con.setAutoCommit(false);
            PreparedStatement pstmt = con.prepareStatement("INSERT INTO DataValues(ValueID,DataValue,ValueAccuracy,LocalDateTime,UTCOffset,DateTimeUTC,SiteID,VariableID,OffsetValue,OffsetTypeID,CensorCode,QualifierID,MethodID,SourceID,SampleID,DerivedFromID,QualityControlLevelID) VALUES(?,?,?,?,?,?.?,?,?,?,?,?,?,?,?,?,?)");
            pstmt.setString(1, "1");
            pstmt.setDouble(2, tmvalue);
            pstmt.setInt(3, 0);
            pstmt.setDate(4, (java.sql.Date) dt2);
            pstmt.setInt(5, 0);
            pstmt.setString(6, "0");
            pstmt.setString(7, siteCode);
            pstmt.setString(8, varCode);
            pstmt.setInt(9, 0);
            pstmt.setInt(10, 0);
            pstmt.setInt(11, 0);
            pstmt.setString(12, qualifierCode);
            pstmt.setInt(13, 0);
            pstmt.setInt(14, 1);
            pstmt.setInt(15, 0);
            pstmt.setInt(16, 0);
            pstmt.setInt(17, qualityControlLevel);
            System.out.println("Statement prepared");
            pstmt.execute();
            // commit the transaction
            con.commit();
            pstmt.close();
        } catch (SQLException e) {
            System.out.println("The Exception is " + e);
        }
    }
    I found out that after field 4 the control does not go to the remaining fields at all.
    Please let me know what I am missing.

    System.out.format("%20s%10.4f", value.getDateTime().toString(), value.getValue());
    System.out.println();
    System.out.println("obtaining time series data");
    String dateTime = value.getDateTime().toString().replace('T', ' ');
    // to convert the string into a date
    SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

    I'd recommend that you setLenient(false) on your sdf. This pattern is what you've got? Sure?

    java.util.Date dt = sdf.parse(dateTime);
    sdf.applyPattern("yyyy-MM-dd HH:mm:ss");
    java.util.Date dt2 = sdf.parse(dateTime);
    updateTable(siteCode, variableCode, dt2, timvalue, varCode, qualifierCode, qualityControlLevel, conn, rs);
    } catch (Exception e) {

    Empty catch block? Not a smart idea. Print the stack trace.

    [...]
    pstmt.setDate(4, (java.sql.Date) dt2);

    I'd recommend this:
        pstmt.setDate(4, new java.sql.Date(dt2.getTime()));

    [...]
    pstmt.execute();
    // commit the transaction
    con.commit();
    pstmt.close();

    You should be closing your statement and connection in a finally block, in individual try/catch blocks. Set autoCommit back to true.

    } catch (SQLException e) {

    So you don't roll back if there's an exception? Bad idea.

    System.out.println("The Exception is " + e);

    I found out that after field 4 the control does not go to the remaining fields at all.
    Please let me know what I am missing.

    Lots of stuff. See above.
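    Putting the date-handling suggestions from this reply together, here is a minimal stand-alone sketch (no database involved; the class name, helper method, and sample timestamp are made up for illustration). The key point is that SimpleDateFormat.parse() returns a plain java.util.Date, which cannot be cast to java.sql.Date; it must be wrapped via its millisecond value.

    ```java
    import java.text.ParseException;
    import java.text.SimpleDateFormat;

    public class DateConversionSketch {

        // Parse the service's "yyyy-MM-ddTHH:mm:ss" string and produce a
        // java.sql.Date suitable for PreparedStatement.setDate(). The cast in the
        // original code throws ClassCastException because parse() returns a plain
        // java.util.Date; wrapping the millisecond value is the safe conversion.
        static java.sql.Date toSqlDate(String raw) throws ParseException {
            String dateTime = raw.replace('T', ' ');
            SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
            sdf.setLenient(false); // fail fast on malformed input instead of silently guessing
            java.util.Date parsed = sdf.parse(dateTime);
            return new java.sql.Date(parsed.getTime());
        }

        public static void main(String[] args) throws ParseException {
            java.sql.Date d = toSqlDate("2007-10-01T14:30:00"); // sample value for illustration
            // In updateTable() this would then be: pstmt.setDate(4, d);
            System.out.println(d); // prints 2007-10-01
        }
    }
    ```

    The JDBC cleanup advice still applies inside updateTable() itself: close the statement and connection in a finally block, roll back on SQLException, and restore autoCommit afterwards.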
