Error while fetching data into collection type.

Hi all,
I'm facing a problem while fetching data into a table type.
open c1;
open c2;
loop
  fetch c1 into partition_name_1;
  fetch c2 into partition_name_2;
  exit when c1%notfound or c2%notfound;
  open c1_refcursor for 'select a.col1, b.col2 from table1 partition (' || partition_name_1 || ') a, table2 partition (' || partition_name_2 || ') b where a.col2 = b.col2';
  loop
    fetch c1_refcursor bulk collect into v1, v2   -- <-- this is the line raising "ORA-01858: a non-numeric character was found where a numeric was expected"
      limit 100000;
    exit when v1.count = 0;
    forall i in 1 .. v1.count
      -- (DML statement omitted in the original post)
  end loop;
  close c1_refcursor;
end loop;
close c1;
close c2;
I also checked the data types of the collection variables; they are the same as the columns selected in the ref cursor.
Please help me out.

Ok, I see that, but I don't think you can use associative arrays in this manner, because you cannot select data directly into an associative array.
As far as I'm aware, they must be assigned an index, and the error suggests to me that an index is missing.
You can select directly into records, and maybe that is where you need to go.
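For example, here is a minimal sketch of that record-based approach, assuming the same table1/table2 columns as above (the type and variable names are illustrative only):

declare
  type row_rec is record (
    col1 table1.col1%type,
    col2 table2.col2%type
  );
  type row_tab is table of row_rec;   -- one collection of records instead of two scalar collections
  v_rows       row_tab;
  c1_refcursor sys_refcursor;
begin
  open c1_refcursor for
    'select a.col1, b.col2 from table1 a, table2 b where a.col2 = b.col2';
  loop
    fetch c1_refcursor bulk collect into v_rows limit 100000;
    exit when v_rows.count = 0;
    -- process v_rows here, e.g. with forall i in 1 .. v_rows.count
  end loop;
  close c1_refcursor;
end;
/

Anchoring each record field to the source column with %type also makes type mismatches like the ORA-01858 above easier to spot.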

Similar Messages

  • Fatal error while fetching data from BI

    Hi,
    I am getting the following error while fetching data from BI using a SELECT statement. I have written the code this way:
    SELECT  [Measures].[D2GFTNHIOMI7KWV99SD7GPLTU] ON COLUMNS, NON EMPTY { [DEM_STATE].MEMBERS} ON ROWS FROM DEM_CUBE/TEST_F_8
    Error description when I click on Test:
    Fatal Error
    com.lighthammer.webservice.SoapException: The XML for Analysis provider encountered an error

    Thanks for answering, but when I tried writing the statement in transaction MDXTEST and clicked on Check, I got the following error:
    Error occurred when starting the parser: timeout during allocate / CPIC-CALL: 'ThSAPCMRCV'
    Message no. BRAINOLAPAPI011
    Diagnosis
    Failed to start the MDX parser.
    System Response
    timeout during allocate / CPIC-CALL: 'ThSAPCMRCV'
    Procedure
    Check the Sys Log in Transaction SM21 and test the TCP-IP connection MDX_PARSER in Transaction SM59.
    So I went into SM59 to check the connection.
    Can you tell me what configuration I need to do to make SELECT statements work?

  • Error while loading data into CLOB data type.

    Hi,
    I have created an interface to load data from an Oracle table into another Oracle table. In the target table we have an attribute with the CLOB data type. While loading data into the CLOB field, ODI gave the error below. I use ODI 10.1.3.6.0.
    java.lang.NumberFormatException: For input string: "4294967295"
         at java.lang.NumberFormatException.forInputString(Unknown Source)
         at java.lang.Integer.parseInt(Unknown Source)
         at java.lang.Integer.parseInt(Unknown Source)
    Let me know if anyone has come across and resolved this kind of issue.
    Thanks much,
    Nishit Gajjar

    Mr. Gajjar,
    You didn't mention which KMs you are using.
    Have a read of
    Re: Facing issues while using BLOB
    and
    Load BLOB column in Oracle to Image column in MS SQL Server
    Try again.
    And can you please mark the correct/helpful answers too.

  • Error while loading data into an external table from flat files

    HI ,
    We have a data load in our project which feeds Oracle external tables with data from flat files (.bcp files) on Unix.
    While loading the data, we are encountering the following error:
    Error occurred (Error Code : -29913 and Error Message : ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04063: un) while loading data into table_ext
    Please let us know what needs to be done to solve this problem.
    Thanks,
    Kartheek

    Kartheek,
    I used Google (mine still works)... please check these links:
    http://oraclequirks.blogspot.com/2008/07/ora-29400-data-cartridge-error.html
    http://jonathanlewis.wordpress.com/2011/02/15/ora-29913/
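    For what it is worth, ORA-29913 on ODCIEXTTABLEOPEN usually means the access driver could not open or read the source file, and the truncated KUP-04063 text continues in the external table's log file. A minimal sketch of where that log file is declared (the directory, file, and column names here are illustrative, not from the original post):

    create or replace directory ext_dir as '/data/bcp_files';

    create table table_ext (
      col1 varchar2(50),
      col2 number
    )
    organization external (
      type oracle_loader
      default directory ext_dir
      access parameters (
        records delimited by newline
        badfile ext_dir:'table_ext.bad'   -- rejected rows land here
        logfile ext_dir:'table_ext.log'   -- the full KUP-040xx messages land here
        fields terminated by '|'
      )
      location ('input.bcp')
    )
    reject limit unlimited;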
    HTH,
    Thierry

  • Error while Inserting data into flow table

    Hi All,
    I am very new to ODI, and I am facing a lot of problems in my first interface. So I have many questions here; please forgive me if they irritate you.
    ========================
    I am developing a simple project to load data from an input source (csv) file into a staging table.
    My plan is to achieve this in 3 interfaces:
    1. Interface-1 : Load the data from an input source (csv) file into a staging table (say Stg_1)
    2. Interface-2 : Read the data from the staging table (stg_1) apply the business rules to it and copy the processed records into another staging table (say stg_2)
    3. Interface-3 : Copy the data from staging table (stg_2) into the target table (say Target) in the target database.
    Question-1 : Is this approach correct?
    ========================
    I don't have any key columns in the staging table (stg_1). When I tried to execute Flow Control on this, I got an error:
    Flow Control not possible if no Key is declared in your Target Datastore
    Following one of the responses in this forum (the response was "FLOW control requires a KEY in the target table"), I introduced a column called "Record_ID", made it a primary key column in my staging table (stg_1), and my problem was resolved.
    Question-2 : Is a key column compulsory in the target table? I am working in BO Data Integrator, where there is no such compulsion... I am a little confused.
    ========================
    Next, I defined one project-level sequence. I mapped the newly introduced key column Record_Id (primary key) to the project-level sequence. Now I got another error: "CKM not selected".
    For this, I inserted the "Insert Check (CKM)" knowledge module into my project. With this, the above "CKM not selected" problem was resolved.
    Question-3 : When is this CKM knowledge module required?
    ========================
    After this, the flow/interface is failing while loading data into the intermediate ODI-created flow table (I$):
    1 - Loading - SS_0 - Drop work table
    2 - Loading - SS_0 - Create work table
    3 - Loading - SS_0 - Load data
    5 - Integration - FTE Actual data to Staging table - Drop flow table
    6 - Integration - FTE Actual data to Staging table - Create flow table I$
    7 - Integration - FTE Actual data to Staging table - Delete target table
    8 - Integration - FTE Actual data to Staging table - Insert flow into I$ table
    The error is at Step 8 above. When I opened the "Execution" tab for this step, I found the message "Missing parameter Project_1.FTE_Actual_Data_seq_NEXTVAL RECORD_ID".
    Question-4 : What is this error, and why does it occur? Did I make a mistake while creating the sequence?

    Everyone is new and starts somewhere, and the community is here to help you.
    1.) What is the idea of moving data from stg_1 and then to stg_2? Do you really need it for any purpose other than moving data from the source file to the target DB?
    Otherwise, it is simpler to move data from SourceFile -> Target Table.
    2.) Does your target table have a key?
    3.) A CKM (Check KM) is required when you want to do constraint validation (checking) on your data. You can define constraints (business rules) on the target table, and Flow Control will check the data flowing from the source file to the target table using the CKM. All the records that do not satisfy the constraints will be added to E$ (the error table) and will not be added to the target table.
    4.) Try to avoid ODI sequences. They are slow and aren't scalable. Try to use a database sequence wherever possible, and use the DB sequence in the target mapping as
    <%=odiRef.getObjectName( "L" , "MY_DB_Sequence_Row" , "D" )%>.nextval
    where MY_DB_Sequence_Row is the Oracle sequence in the target schema.
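    For example (a sketch, reusing the name from the snippet above), the database sequence itself is created once in the target schema with plain SQL:

    create sequence MY_DB_Sequence_Row start with 1 increment by 1 cache 100;

    At execution time the mapping expression expands to something like TARGET_SCHEMA.MY_DB_Sequence_Row.nextval inside the generated INSERT, so each row gets its RECORD_ID from the database rather than from an ODI-managed counter.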
    HTH

  • Hyperion IR : Getting out of memory error while fetching data for a whole year through the web client (Workspace)

    Hi,
    While fetching data through the IR web client from Workspace for a year (all 12 months), I am getting the error "Out of Memory. Advice: Close other applications or windows and try again".
    If I try the same through IR Studio, it does not give any output and shows me the same reporting front page.
    If I select periods up to 8 months, it gives the required data in both the IR web client and IR Studio.
    Could you please suggest how we can resolve this issue?
    Thanks,
    D.N.Rana

    Issue cause:
    Sometimes this is due to excessive data, which brings the size of the BQY file up to around one gigabyte uncompressed (processing may take twice that in RAM, plus the memory space for the plugin, and the typical memory limit on a 32-bit system is 2 gigabytes).
    Solution:
    To avoid excessive BQY size exceeding memory availability:
    Ensure that your computer has at least 2 GB of free RAM before running IR Studio.
    Put a limit on the number of rows that can be pulled down: right-click the Request label of the Query section and put a value in "Return First xxx Rows" (and tick the check box).
    Do not pull down more than 750 MB of data (remember it may be duplicated while processing).
    Place limits or aggregations in the Query section (as opposed to the Results section) to limit the data entering the BQY.

  • Error while inserting data into a table.

    Hi All,
      I created a table. While inserting data into the table I am getting an error. It says "Create data Processing Function Module". Can anyone help me with this?
    Thanx in advance
    anirudh

    Hi Anirudh,
      It seems there is already an entry in the table with the same primary key.
    An INSERT statement will give a short dump if you try to insert data with the same key.
    Why don't you use the MODIFY statement to achieve the same?
    Reward points if this helps.
    Manish

  • Getting error while loading data into ASO cube from a flat file.

    Hi All,
    I am getting the error "Essbase error 1270040: Data load buffer [1] does not exist" while loading data into an ASO cube.
    Does anyone have a solution?
    Regards,
    VM

    Are you using ODI to load the data, or MaxL? If you are using an ODI interface, are you also using a load rule? And which versions of Essbase and ODI are you using?
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Error while loading data into Essbase using ODI

    Hi ,
    I'm getting the following error while loading measures into Essbase using ODI. I used the same log and error file and file path for all my dimensions, and this worked well, but I'm not sure why it is not working for measures. Need help.
    File "<string>", line 79, in ?
    com.hyperion.odi.common.ODIHAppException: c:/temp/Log1.log (No such file or directory)
    Thanks
    Venu

    Are you definitely running it against an agent where that path exists?
    Have you tried using a different location and filename? Have you restarted the agent to make sure there is not a lock on the file?
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Error while importing data into Oracle 11gR2 with ArcSDE 9.3.1

    I am getting an error while importing the data into Oracle 11gR2. We are using ArcSDE 9.3.1.
    It seems to be a problem with spatial index creation.
    Kindly help.
    IMP-00017: following statement failed with ORACLE error 29855:
    "CREATE INDEX "A3032_IX1" ON "DGN_POLYLINE_2D" ("SHAPE" ) INDEXTYPE IS "MDS"
    "YS"."SPATIAL_INDEX""
    IMP-00003: ORACLE error 29855 encountered
    ORA-29855: error occurred in the execution of ODCIINDEXCREATE routine
    ORA-13249: internal error in Spatial index: [mdidxrbd]
    ORA-13249: Error in Spatial index: index build failed
    ORA-13249: Error in spatial index: [mdrcrtxfergm]
    ORA-13249: Error in spatial index: [mdpridxtxfergm]
    ORA-13200: internal error [ROWID:AAAT5pAA9AACIy5AAQ] in spatial indexing.
    ORA-13206: internal error [] while creating the spatial index
    ORA-13033: Invalid data in the SDO_ELEM_INFO_ARRAY in SDO_GEOMETRY object
    ORA-06512: at "MDSYS

    Guys,
    I am also getting the same error, and my issue is that I cannot even work out which indexes the error is for. There is no index name before the error.
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/DOMAIN_INDEX/INDEX
    ORA-39083: Object type INDEX failed to create with error:
    ORA-29855: error occurred in the execution of ODCIINDEXCREATE routine
    ORA-13249: internal error in Spatial index: [mdidxrbd]
    ORA-13249: Error in Spatial index: index build failed
    ORA-13249: Error in spatial index: [mdrcrtxfergm]
    ORA-13249: Error in spatial index: [mdpridxtxfer]
    ORA-29400: data cartridge error
    ORA-12801: error signaled in parallel query server P000
    ORA-13249: Error in spatial index: [mdpridxtxfergm]
    ORA-13200: internal error [ROWID:AA
    ORA-39083: Object type INDEX failed to create with error:
    ORA-29855: error occurred in the execution of ODCIINDEXCREATE routine
    ORA-13249: internal error in Spatial index: [mdidxrbd]
    ORA-13249: Error in Spatial index: index build failed
    ORA-13249: Error in spatial index: [mdrcrtxfergm]
    ORA-13249: Error in spatial index: [mdpridxtxfer]
    ORA-29400: data cartridge error
    ORA-12801: error signaled in parallel query server P002
    ORA-13249: Error in spatial index: [mdpridxtxfergm]
    ORA-13200: internal error [ROWID:AA
    ORA-39083: Object type INDEX failed to create with error:
    ORA-29855: error occurred in the execution of ODCIINDEXCREATE routine
    ORA-13249: internal error in Spatial index: [mdidxrbd]
    ORA-13249: Error in Spatial index: index build failed
    ORA-13249: Error in spatial index: [mdrcrtxfergm]
    ORA-13249: Error in spatial index: [mdpridxtxfer]
    ORA-29400: data cartridge error
    stack cnt....
    How can I find which indexes it is failing for?
    Thank you,
    Myra
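    One way to narrow this down (a sketch, not from the original thread): after the import, list the spatial domain indexes and their build status from the dictionary; anything that failed to create will be missing from the list or flagged in the DOMIDX columns:

    select owner, index_name, table_name, status, domidx_status, domidx_opstatus
    from dba_indexes
    where ityp_owner = 'MDSYS'
      and ityp_name = 'SPATIAL_INDEX'
    order by owner, index_name;

    Comparing that list against the geometry columns registered in USER_SDO_GEOM_METADATA should show which tables were left without a working spatial index.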

  • Error while loading data into BW (BW as Target) using Data Services

    Hello,
    I'm trying to extract data from SQL Server 2012 and load it into BW 7.3 using Data Services. Data Services shows that the job finished successfully. But when I go into BW, I see the below/attached error:
    Error while accessing repository Violation of PRIMARY KEY constraint 'PK__AL_BW_RE_
    Please let me know what this means and how to fix it. I'm not sure if I gave sufficient information; let me know if you need anything else.
    Thanks
    Pradeep

    Hi Pradeep,
    Regarding your query, please refer to the SCN threads below for the same issue:
    SCN Thread:
    FIM10 to BW 73- Violation of PRIMARY KEY -table AL_BW_REQUEST
    Error in loading data from BOFC to BW using FIM 10.0
    Thanks,
    Daya

  • Error while loading data into 0pur_c03 using 2lis_02_scl

    Hi,
    While trying to load data into 0pur_c03 using DataSource 2lis_02_scl from R/3, the error message "The material 100 does not exist or is not active" is displayed. I have loaded master data for 0Material and performed a change run as well, but the problem is still not resolved. Please suggest how to resolve this issue.
    Regards,
    Hari Prasad B.

    I have the same problem in my system, and I found that the material XXXXX had movements (sales documents and more), but at a certain moment the material was deleted in R/3, so it is not loaded into the 0MATERIAL InfoObject because it is marked for deletion.
    If you have this error, you must activate the load request manually; your InfoCube can then hold the data without any problem.
    Regards

  • Error while loading data into SAP BW using BO Data Services

    Hello All,
    I have a job to load data from SQL Server to SAP BW. I have followed the steps from the SAP wiki to do this.
    1. I have created an RFC connection between the two servers (SAP and the BODS job server).
    When I schedule and start the job immediately from SAP BW, I get this error and it aborts the RFC connection:
    "Error while executing the following command:  -sJO return code:238"
    Error message during processing in BI
    Diagnosis
    An error occurred in BI while processing the data. The error is documented in an error message.
    System Response
    A caller 01, 02 or equal to or greater than 20 contains an error message.
    Further analysis:
    The error message(s) was (were) sent by:
    Scheduler
    Any help would be appreciated......
    Thanks
    Praveen...

    Hi Praveen,
    I want to know which version of BODS you are using for your development of ETL jobs.
    If it's BODS 12.2.2.2, then you will get this type of problem frequently, as in BODS 12.2.2.2 only two connections can be created while having an RFC between BW and BODS.
    So I suggest that if you are using BODS 12.2.2.2, you upgrade it to BODS 12.2.0.0 with Service Pack 3 and Fix Pack 3.
    As in BODS 12.2.3.3 we have the option of ten connections in parallel at a time, which helps in resolving this issue.
    Please let me know your BODS version and, if you have upgraded your BODS to SP3 with FP3, whether your problem is resolved or not.
    All the best..!!
    Thanks ,
    Shandilya Saurav

  • Error while exporting data into Excel Sheet

    Hi All,
    I have created a VO which is based on a query (not based on an EO). The query is as follows:
    select f.user_name ,
    f.description ,
    a.currency_code,
    a.amount_to ,
    a.amount_from
    from seacds.ar_approval_user_limits_nv a , seacds.fnd_user_nv f
    where f.user_id = a.user_id
    and a.document_type = 'CM'
    order by 2;
    Based on this VO I have created a search page which searches and returns data from the table, and finally the standard Export button exports the data into an Excel sheet.
    I am searching the data based on the above 5 attributes. Without entering anything in the messageTextInput fields, if I click the Go button it returns all the data into the table region. After this, if I click the Export button, the data is exported into the Excel sheet. That works fine.
    But if I search the data by entering a value in any of the messageTextInput fields, it returns data; however, if I then click the Export button, it throws the following error:
    Exception Details.
    oracle.apps.fnd.framework.OAException: oracle.jbo.SQLStmtException: JBO-27122: SQL error during statement preparation. Statement: SELECT * FROM (select f.user_name ,
    f.description ,
    a.currency_code,
    a.amount_to ,
    a.amount_from
    from seacds.ar_approval_user_limits_nv a , seacds.fnd_user_nv f
    where f.user_id = a.user_id
    and a.document_type = 'CM'
    order by 2) QRSLT WHERE (( UPPER(CURRENCY_CODE) like UPPER(:1) AND (CURRENCY_CODE like :2 OR CURRENCY_CODE like :3 OR CURRENCY_CODE like :4 OR CURRENCY_CODE like :5))) ORDER BY DESCRIPTION asc
         at oracle.apps.fnd.framework.OAException.wrapperException(Unknown Source)
         at oracle.apps.fnd.framework.webui.OAPageErrorHandler.prepareException(Unknown Source)
         at oracle.apps.fnd.framework.webui.OAPageErrorHandler.processErrors(Unknown Source)
         at oracle.apps.fnd.framework.webui.OAPageBean.processFormRequest(Unknown Source)
         at oracle.apps.fnd.framework.webui.OAPageBean.preparePage(Unknown Source)
         at oracle.apps.fnd.framework.webui.OAPageBean.preparePage(Unknown Source)
         at oracle.apps.fnd.framework.webui.OAPageBean.preparePage(Unknown Source)
         at OA.jspService(_OA.java:71)
         at com.orionserver.http.OrionHttpJspPage.service(OrionHttpJspPage.java:59)
         at oracle.jsp.runtimev2.JspPageTable.service(JspPageTable.java:462)
         at oracle.jsp.runtimev2.JspServlet.internalService(JspServlet.java:594)
         at oracle.jsp.runtimev2.JspServlet.service(JspServlet.java:518)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:856)
         at com.evermind.server.http.ServletRequestDispatcher.invoke(ServletRequestDispatcher.java:713)
         at com.evermind.server.http.ServletRequestDispatcher.forwardInternal(ServletRequestDispatcher.java:370)
         at com.evermind.server.http.HttpRequestHandler.doProcessRequest(HttpRequestHandler.java:871)
         at com.evermind.server.http.HttpRequestHandler.processRequest(HttpRequestHandler.java:453)
         at com.evermind.server.http.HttpRequestHandler.serveOneRequest(HttpRequestHandler.java:221)
         at com.evermind.server.http.HttpRequestHandler.run(HttpRequestHandler.java:122)
         at com.evermind.server.http.HttpRequestHandler.run(HttpRequestHandler.java:111)
         at oracle.oc4j.network.ServerSocketReadHandler$SafeRunnable.run(ServerSocketReadHandler.java:260)
         at com.evermind.util.ReleasableResourcePooledExecutor$MyWorker.run(ReleasableResourcePooledExecutor.java:303)
         at java.lang.Thread.run(Thread.java:595)
    ## Detail 0 ##
    java.sql.SQLException: Invalid column type
         at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:138)
         at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:175)
         at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:240)
         at oracle.jdbc.driver.OraclePreparedStatement.setObjectCritical(OraclePreparedStatement.java:7895)
         at oracle.jdbc.driver.OraclePreparedStatement.setObjectInternal(OraclePreparedStatement.java:7572)
         at oracle.jdbc.driver.OraclePreparedStatement.setObjectInternal(OraclePreparedStatement.java:8183)
         at oracle.jdbc.driver.OraclePreparedStatement.setObjectAtName(OraclePreparedStatement.java:8206)
    Kindly give me any idea of how to clear this error.
    Thanks and Regards,
    Myvizhi

    Hi Myvizhi,
    Did you try running, from the back end, the query that was generated in the error trace (see the sketch below), to see if it returns the desired output?
    Also, I would like to know whether you are using the Oracle standard search mode, i.e. results-based search / autocustomization search.
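    For instance, the generated statement can be replayed from SQL*Plus with test bind values (a sketch; the 'USD' values are placeholders, not from the original post):

    var b1 varchar2(15)
    var b2 varchar2(15)
    var b3 varchar2(15)
    var b4 varchar2(15)
    var b5 varchar2(15)
    exec :b1 := 'USD'; :b2 := 'USD'; :b3 := 'USD'; :b4 := 'USD'; :b5 := 'USD'

    SELECT * FROM (select f.user_name, f.description, a.currency_code, a.amount_to, a.amount_from
                   from seacds.ar_approval_user_limits_nv a, seacds.fnd_user_nv f
                   where f.user_id = a.user_id
                   and a.document_type = 'CM'
                   order by 2) QRSLT
    WHERE ((UPPER(CURRENCY_CODE) like UPPER(:b1)
            AND (CURRENCY_CODE like :b2 OR CURRENCY_CODE like :b3
                 OR CURRENCY_CODE like :b4 OR CURRENCY_CODE like :b5)))
    ORDER BY DESCRIPTION asc;

    If this runs cleanly with all five binds set, the "Invalid column type" points at how the page is setting the bind parameters rather than at the SQL itself.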
    --Keerthi

  • Error while downloading data into excel file.

    Hi,
    I have a requirement to download data available in an xstring to an Excel file.
    I have an RFC which has an export parameter 'file_data' of type xstring. When I call the RFC from a Web Dynpro ABAP application, it gives the data output in xstring format.
    I am opening the Excel file using that xstring data as below:
    cl_wd_runtime_services=>attach_file_to_response(
      i_filename      = 'file.xls'
      i_content       = ls_data_source-data_source   " xstring data from the RFC
      i_mime_type     = 'x-excel/application'
      i_in_new_window = abap_true ).
    But the Excel file does not come out in the correct format; it gives an error while downloading the Excel file.
    The error is:
    "The file you are trying to open, 'file[1].xls', is in a different format than specified by the file extension. Verify that the file is not corrupted and is from a trusted source before opening the file. Do you want to open the file now?"
    If I click the Yes button, the file opens, but not in proper Excel format. The data comes out tab-delimited in the Excel file.
    Please suggest a solution for this problem.
    Thanks,
    Venkat.

    Hi Thomas,
    Following is the logic implemented in the RFC, which returns the XSTRING as an export parameter.
    STEP 1: Create a dynamic internal table.
    Create the field catalog for the table using LVC_FIELDCATALOG_MERGE, then:
    CALL METHOD cl_alv_table_create=>create_dynamic_table
      EXPORTING
        it_fieldcatalog = it_fieldcat
      IMPORTING
        ep_table        = dyn_table.
    Create the Excel sheet by sending the field catalog and the data table.
    STEP 2: Convert the text table to an xstring:
    CALL FUNCTION 'SCMS_TEXT_TO_XSTRING'
      IMPORTING
        buffer   = l_content
      TABLES
        text_tab = lt_data_tab
      EXCEPTIONS
        failed   = 1
        OTHERS   = 2.
    STEP 3: Pass the data to NetWeaver:
    PERFORM pass_data_to_nw USING is_import
                            CHANGING es_attachment_metadata.
