Error while loading from DSO to DSO in BI

Hi All,
While loading data from the base DSO to the target DSO using a DTP, we are getting the following error. Could anyone please provide a solution?
"The argument 'B' cannot be interpreted as a number"
The load from R/3 to the base DSO is successful, but the same data fails in the target DSO.
Helpful answers will be rewarded with points.
Many thanks,
Khaja

Hi Siva,
Thanks for your reply. The mapping in the transformation is OK, and we are using the same InfoObjects that we used in the base DSO. The load to the base DSO is successful; it only fails in the target DSO.
How can we find out which InfoObject the error refers to? We have 80 InfoObjects, and even the log does not give the InfoObject name.
Thanks
Khaja
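
One way to narrow this down is to scan the suspect character fields in the start routine of the DSO-to-DSO transformation, before the numeric conversion happens. The fragment below is only a sketch: /BIC/ZQTY_CHAR and the message class ZBW are hypothetical names, and it assumes the standard MONITOR table of a BW 7.x start routine is available. Replace the field name(s) with the CHAR fields of your base DSO that feed numeric targets.

    * Hedged sketch for the start routine: /BIC/ZQTY_CHAR is a hypothetical
    * character field of the base DSO that feeds a numeric target field.
    * Values such as 'B' cannot be converted and trigger the DTP error.
        DATA: ls_src LIKE LINE OF SOURCE_PACKAGE,
              ls_mon TYPE rstmonitor.

        LOOP AT SOURCE_PACKAGE INTO ls_src.
    *     CN is true if the field contains a character outside the list below
          IF ls_src-/bic/zqty_char CN ' 0123456789.,-'.
    *       Write the field name, value and row to the DTP monitor so the
    *       offending InfoObject can be identified among the 80 fields.
            ls_mon-msgid = 'ZBW'.                 "hypothetical message class
            ls_mon-msgty = 'W'.
            ls_mon-msgno = '001'.
            ls_mon-msgv1 = '/BIC/ZQTY_CHAR'.
            ls_mon-msgv2 = ls_src-/bic/zqty_char.
            ls_mon-msgv3 = sy-tabix.
            APPEND ls_mon TO MONITOR.
          ENDIF.
        ENDLOOP.

Once the monitor shows which field carries the stray 'B', you can decide whether to clean the value in the base DSO or handle it in the rule itself.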

Similar Messages

  • Dump error ASSIGN_TYPE_CONFLICT while loading from DSO to InfoCube

    Hi all,
    I had a requirement to modify the transformation rules, and I wrote some routine code for a particular InfoObject.
    I deleted the data from the DSO and the cube, then started loading to the DSO. That load was successful, and the DSO now contains the modified data for that InfoObject.
    But I am facing a dump error while loading the data from the DSO to the cube through the DTP.
    The error I see in ST22 is "ASSIGN_TYPE_CONFLICT".
    Please help,
    Thanks in Advance,
    sravan

    HI,
    When I started the load for the first time I got the same error, so I activated the DTP and loaded again,
    but I faced the same problem.
    The modified data is already in the DSO and I have validated it; that part is fine.
    So do I need to delete the DTP, or create another DTP, to load the data from the DSO to the cube?
    I have also checked all the transformation rules; they are fine, and the DSO and InfoCube structures are OK.
    Please  suggest,
    Thanks LAX,

  • Error while extracting from DSO

    Hello,
    I got an error message.
    Scenario:
    I tried loading data from one DSO to another DSO through a transformation,
    but got the following error message.
    Error:
    Error while extracting from source 0FIGL_O02 (type DataStore)
    Message no. RSBK242
    Inconsistent input parameter (parameter: <unknown>, value <unknown>)
    Message no. RS_EXCEPTION101
    System:
    SAP_BW:700
    Support Package: SAPKW70016
    Regards,
    DongSun Choi

    I got the same result even after trying several times.
    The load from the DSO worked fine after switching from the transformation to an update rule,
    so I think the problem is with the transformation.
    Thanks a lot...
    dongsun, choi

  • Error while loading from PSA to Cube. RSM2-704 & RSAR -119

    Dear Gurus,
    I have to extract about 1.13 million records from the setup tables to the MM cube 0IC_C03. While doing so, the following two errors occurred while loading from the PSA to the cube:
    Error ID - RSM2 & error no. 704: Data records in the PSA were marked for data package 39 . Here there were 2 errors. The system wrote information or warnings for the remaining data records.
    Error ID- RSAR & No. 119: The update delivered the error code 4 .
    (Data records for package 39 selected in PSA - 2 error(s). Record 5129 :No SID found for value 'EMN ' of characteristic 0BASE_UOM . Record 5132 :No SID found for value 'EMN ' of characteristic 0BASE_UOM )
    I tried to change the defective records in the PSA by deleting the erroneous field value EMN and reloading, but the load failed.
    Now, my questions are:
    (i) How can I resolve the issue?
    (ii) How do we rectify the erroneous records? Should we only delete the erroneous field value, or delete the entire record from the PSA?
    (iii) How do we delete a record from the PSA?
    Thanks & regards,
    Sheeja.

    Hi,
    Data records for package 39 selected in PSA - 2 error(s). Record 5129 :No SID found for value 'EMN ' of characteristic 0BASE_UOM . Record 5132 :No SID found for value 'EMN ' of characteristic 0BASE_UOM
    The issue is with records 5129 and 5132.
    In the PSA, filter on erroneous records so that only the error records are displayed, edit them as required, and try reloading into the cube.
    Deleting a single record is not possible.
    Let us know if you still have any issues.
    Reg
    Pra
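
    Editing the PSA fixes the current request, but if the source keeps sending external unit codes such as 'EMN', a more durable option is to convert them to the internal SAP unit key in the rule for 0BASE_UOM. The fragment below is only a sketch of that idea in a transfer-rule style (3.x) routine: it uses the standard unit conversion exit, and COMM_STRUCTURE-base_uom is an assumed source field name to adapt to your DataSource.

    * Hedged sketch for a routine on 0BASE_UOM: convert the external/commercial
    * unit code to the internal key so the SID lookup no longer fails.
      DATA lv_uom TYPE meins.

      CALL FUNCTION 'CONVERSION_EXIT_CUNIT_INPUT'
        EXPORTING
          input          = COMM_STRUCTURE-base_uom   "assumed source field
          language       = sy-langu
        IMPORTING
          output         = lv_uom
        EXCEPTIONS
          unit_not_found = 1
          OTHERS         = 2.

      IF sy-subrc = 0.
        RESULT = lv_uom.
      ELSE.
    *   Unknown unit: flag only this record instead of failing the whole package.
        RETURNCODE = 4.
      ENDIF.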

  • Transformation Rule: Error while loading from PSA to ODS using DTP

    Hi Experts,
    I am trying to load data from PSA to ODS using DTP. For about 101 records I get the following error:
    "Runtime error while executing rule -> see long text     RSTRAN     301"
    On further looking at the long text:
    Diagnosis
        An error occurred while executing a transformation rule:
        The exact error message is:
        Overflow converting from ''
        The error was triggered at the following point in the program:
        GP4808B5A4QZRB6KTPVU57SZ98Z 3542
    System Response
        Processing the data record has been terminated.
    Procedure
          The following additional information is included in the higher-level
         node of the monitor:
         o   Transformation ID
         o   Data record number of the source record
         o   Number and name of the rule which produced the error
    Procedure for System Administration
    When looking at the detail:
    Error Location: Object Type    TRFN
    Error Location: Object Name    06BOK6W69BGQJR41BXXPE8EMPP00G6HF
    Error Location: Operation Type DIRECT
    Error Location: Operation Name
    Error Location: Operation ID   00177 0000
    Error Severity                 100
    Original Record: Segment       0001
    Original Record: Number        2
    Can anyone please help in deducing where exactly in the transformation rule this error occurs?
    Thanks & Regards,
    Raj

    Jerome,
    The same issue.
    Here are some fields whose lengths differ when mapped in the transformation rules:
    ODS field                   | Data source field
    PROD_CATEG   CHAR 32        | Category_GUID   RAW 16
    CRM_QTYEXP   INT4           | EXPONENT        INT2
    CRM_EXCRAT   FLTP 16        | EXCHG_RATE      DEC 9
    CRM_GWEIGH   QUAN 17,3      | Gross_Weight    QUAN 15
    NWEIGH       QUAN 17,3      | Net_Weight      QUAN 15
    CRMLREQDAT   DATS 8         | REQ_DLV_DATE    DEC 15
    The differences are that some DATS fields are mapped to decimal fields, the CHAR 32 field is mapped to RAW 16, or CALWEEK/CALMONTH is mapped to CALDAY.
    In most cases the ODS field is larger than the corresponding source field.
    Thanks
    Raj
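
    A rule that converts between such incompatible lengths and types can dump with "Overflow converting from ''" when a value (or an empty value) does not fit the target. One way to localise and survive this is to do the move in a field routine and catch the conversion exceptions. The fragment below is only an illustration of that pattern: SOURCE_FIELDS-exchg_rate and the message class ZBW are assumptions, and it relies on the standard MONITOR table of a BW 7.x rule routine.

    *   Hedged sketch of a BW 7.x field routine for one of the suspect mappings
    *   (e.g. EXCHG_RATE -> CRM_EXCRAT). Adapt the field name to your transformation.
        DATA ls_monitor TYPE rstmonitor.

        TRY.
            RESULT = SOURCE_FIELDS-exchg_rate.
          CATCH cx_sy_conversion_overflow cx_sy_conversion_no_number.
    *       Keep the load running: log a monitor entry and pass an initial value.
            ls_monitor-msgid = 'ZBW'.            "hypothetical message class
            ls_monitor-msgty = 'W'.
            ls_monitor-msgno = '002'.
            ls_monitor-msgv1 = 'EXCHG_RATE conversion failed'.
            ls_monitor-msgv2 = SOURCE_FIELDS-exchg_rate.
            APPEND ls_monitor TO MONITOR.
            CLEAR RESULT.
        ENDTRY.

    If the monitor entries point at the RAW 16 or DATS mappings instead, the cleaner fix is usually to correct the rule itself rather than to catch the exception.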

  • Error while loading from DTP

    Hi,
    We are facing an issue while loading data from R/3 to BW using a DTP.
    We are getting the following error (for both full and delta loads):
    Data package 1:Errors during Processing
    *     Extraction datasource*
    *     Filter out new records with same key.*
    *     RSDS material*
    *     Update to datastore object*
    *          Unpack data package*
    *          Exception in substep:write data package*
    *          Processing terminated.*
    *     Set technical status to red*
    *     Set overall status to red*
    *     Set overall status to green*
    *     Further processing started*
    *     Set status to 'Processed further'
    The expansion of the error "Exception in substep: write data package" is:
    Record 0, segment 0001 is not in the cross-record table
    Error while writing error stack
    Record 0, segment 0001 is not in the cross-record table
    Error in substep update to DataStore Object
    We are using both start routine and end routine in the transformation.
    Any help on this is highly appreciable.
    End routine:
        IF NOT RESULT_PACKAGE IS INITIAL.
          SELECT salesorg comp_code FROM /bi0/psalesorg
          INTO TABLE it_salesorg
          FOR ALL ENTRIES IN RESULT_PACKAGE
          WHERE salesorg = RESULT_PACKAGE-SALESORG
          AND objvers = 'A'.
          SELECT /bic/ZMATRTUGP FROM /bic/pZMATRTUGP
          INTO TABLE it_mat
          FOR ALL ENTRIES IN RESULT_PACKAGE
          WHERE /bic/ZMATRTUGP = RESULT_PACKAGE-/bic/ZMATRTUGP.
          it_data[] = RESULT_PACKAGE[].
        ENDIF.
    *Get company code from the sales org for all materials
        LOOP AT it_data INTO st_data.
          w_tabix = sy-tabix.
          READ TABLE it_salesorg INTO st_salesorg
          WITH TABLE KEY salesorg = st_data-salesorg.
          IF sy-subrc = 0.
            st_data-comp_code = st_salesorg-compcode.
            MODIFY it_data FROM st_data INDEX w_tabix.
          ENDIF.
        ENDLOOP.
        REFRESH RESULT_PACKAGE.
        RESULT_PACKAGE[] = it_data[].
    *Finally check for each material if something has changed since the last
    *loading : if yes compare and update, otherwise no upload
        IF NOT it_mat[] IS INITIAL.
          SELECT * FROM /bic/azalomrtu00
          INTO TABLE it_check
          FOR ALL ENTRIES IN it_mat
          WHERE /bic/ZMATRTUGP = it_mat-zmaterial.
        ENDIF.
        DELETE it_check WHERE salesorg IS INITIAL.
        CLEAR st_result_package.
        LOOP AT RESULT_PACKAGE INTO st_RESULT_PACKAGE.
          READ TABLE it_check INTO st_check
          WITH TABLE KEY
          /bic/ZMATRTUGP = st_result_package-/bic/ZMATRTUGP
          CUST_GRP4 = st_result_package-CUST_GRP4
          SALESORG = st_result_package-SALESORG
          VALIDTO = st_result_package-VALIDTO.
          IF sy-subrc = 0.
    *If one of the characteristic is different, let that entry in the
    *result_package otherwise no use updating it
            IF st_check-/BIC/ZCSBRTUGP = st_result_package-/BIC/ZCSBRTUGP
            AND st_check-/BIC/ZCONDTYPE = st_result_package-/BIC/ZCONDTYPE
            AND st_check-comp_code = st_result_package-comp_code
            AND st_check-/BIC/ZCNT_RTU = st_result_package-/bic/ZCNT_RTU
            AND st_check-VALIDFROM = st_result_package-VALIDFROM.
              DELETE RESULT_PACKAGE.
            ELSE.
    *entry is new or changed: leave it in the result package so it is updated.
            ENDIF.
            DELETE it_check WHERE
                  /bic/ZMATRTUGP = st_result_package-/bic/ZMATRTUGP
          AND CUST_GRP4 = st_result_package-CUST_GRP4
          AND SALESORG = st_result_package-SALESORG
          AND VALIDTO = st_result_package-VALIDTO.
          ENDIF.
        ENDLOOP.
    *if some entries are in the ODS but not in the datapackage, they have to
    *be deleted in the ODS since they don't exist anymore
        LOOP AT it_check INTO st_check.
          CLEAR st_result_package.
          st_result_package-/bic/ZMATRTUGP = st_check-/bic/ZMATRTUGP.
          st_result_package-CUST_GRP4 = st_check-CUST_GRP4.
          st_result_package-SALESORG = st_check-SALESORG.
          st_result_package-VALIDTO = st_check-VALIDTO.
          st_result_package-recordmode = 'D'.
          APPEND st_result_package TO RESULT_PACKAGE.
        ENDLOOP.
    Thanks in advance.
    Thanks,
    Meghana

    Hi Meghana
    Have you got a short dump in transaction ST22?
    If yes, what is it?
    In your routine, maybe you should also keep an index counter in your loop:
    DATA: w_idx LIKE sy-tabix.
    When you loop at RESULT_PACKAGE, do: w_idx = sy-tabix.
    And then: DELETE result_package INDEX w_idx.
    Hope it helps you
    Mickael
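
    Put together, Mickael's suggestion would look roughly like the fragment below inside the end routine. It is only a sketch: it reuses the variable names from the routine above and abbreviates the field comparison to two fields to keep it short, so it is not a drop-in replacement.

        DATA w_idx TYPE sy-tabix.

        LOOP AT RESULT_PACKAGE INTO st_result_package.
          w_idx = sy-tabix.            "remember the row before READ changes sy-tabix
          READ TABLE it_check INTO st_check
               WITH KEY /bic/zmatrtugp = st_result_package-/bic/zmatrtugp
                        cust_grp4      = st_result_package-cust_grp4
                        salesorg       = st_result_package-salesorg
                        validto        = st_result_package-validto.
          IF sy-subrc = 0
             AND st_check-/bic/zcsbrtugp = st_result_package-/bic/zcsbrtugp.
    *       Record unchanged (use the full comparison from the original routine):
    *       delete it from the result package by explicit index.
            DELETE RESULT_PACKAGE INDEX w_idx.
          ENDIF.
        ENDLOOP.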

  • Error while loading from ODS to CUBE

    Hi guys,
    I am loading data from the source system to an ODS and from the ODS to a cube. The data came successfully from the source system to the ODS, but the load from the ODS to the cube shows an error at data packet 13. In the cube I loaded the data using "Update ODS data to data target"; it offered two options, FULL UPDATE and INIT UPDATE, and I selected FULL UPDATE. In the Processing tab only one option is enabled, "Only data targets", and I loaded with that. Now that the load has failed, where can I correct the data? There is no PSA option in monitoring.
    Otherwise, how can I change the option in the Processing tab of the InfoPackage to include the PSA? I know that when we load data from one target to another, the only option available in the Processing tab is "Only data targets". How can I change that option, and how can I correct the error?
    Thanks
    Rajesh

    Hi,
    I solved my own question as follows:
    go to the monitor of the cube load and select the option "Read everything manually"; it then shows another screen for correcting the records. Correct all the records and load the data again.
    Thanks
    Rajesh

  • Error while loading from Flat file

    Hi,
    I'm trying to load master data from a flat file. After loading I get the following message, even though I have used the PSA as the staging area. I do see that the data is present in the master data (text) table without any errors, though. Kindly help.
    Error when updating Idocs in Business Information Warehouse
    Diagnosis
    Errors have been reported in Business Information Warehouse during IDoc update:
    EDI: Partner profile not available
    System response
    There are IDocs with incorrect status.
    Procedure
    Check the IDocs in Business Information Warehouse . You can get here using the BW Monitor.
    Removing errors:
    How you remove the errors depends on the error message you receive.

    Check your flat file format and field separator. Is it a .txt or a .csv file?
    Go to "Source systems" in "Modeling", choose your source system for flat files, and perform a check.
    Maybe the problem is with your source system.
    This might also be due to the logical names of the systems not being maintained. Also, has anyone done a flat file upload on this system before, and what is the version?

  • Error while loading from R/3 to BW, i.e. 0MATERIAL error

    My name is Leela and I am a BW consultant. I have a problem loading the SD billing document DataSource 2LIS_13_VAHDR (billing document header data): I get a 0MATERIAL error because some characteristic values have been entered in the material field.
    The values are ERSTE HILFE and COACHING; these two strings appear in that field. I copied them and entered them in transaction RSKC, but the system does not accept them. Please go through this and give me a solution for this error. Thank you. My mail id is [email protected]

    Leela
    Welcome to SDN
    Your 0MATERIAL has incorrect data in the R/3 system. You now have two options:
    1. Ask the R/3 team responsible for this material to correct it in R/3.
    2. In BW, correct it in the PSA and load it to the cube/ODS.
    Hope this helps
    Thanks
    Sat
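
    If maintaining RSKC is not possible or such texts keep arriving, a third option is to clean the value in a routine on 0MATERIAL before it reaches the master data. The fragment below is only a sketch: the permitted-character string must mirror your RSKC settings, and COMM_STRUCTURE-material is an assumed source field name.

    * Hedged sketch: replace characters that are not in the BW permitted set
    * with '#', and force upper case, so values like 'Erste Hilfe' no longer
    * abort the load. Adjust c_allowed to match what is maintained in RSKC.
      CONSTANTS c_allowed TYPE string
        VALUE ' !"%&''()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ'.

      DATA: lv_mat TYPE c LENGTH 18,
            lv_pos TYPE i.

      lv_mat = COMM_STRUCTURE-material.          "assumed source field
      TRANSLATE lv_mat TO UPPER CASE.

      DO 18 TIMES.
        lv_pos = sy-index - 1.
        IF lv_mat+lv_pos(1) NA c_allowed.        "character not in permitted set
          lv_mat+lv_pos(1) = '#'.
        ENDIF.
      ENDDO.

      RESULT = lv_mat.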

  • Data flow task fails while loading from database to Excel

    Hello,
    I am getting an error while loading from an OLE DB source to Excel; the error is shown below.
    Error: 0xC0202009 at DFT - Company EX, OLE DB Destination [198]: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.
    Error: 0xC0209029 at DFT - Company EX, OLE DB Destination [198]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "input "OLE DB Destination Input" (211)" failed because error code 0xC020907B occurred, and the error row
    disposition on "input "OLE DB Destination Input" (211)" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the
    failure.
    Error: 0xC0047022 at DFT - Company EX: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "OLE DB Destination" (198) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput
    method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.
    Error: 0xC02020C4 at DFT - Company EX, OLE DB Source 1 [1]: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
    Error: 0xC0047021 at DFT - Company EX: SSIS Error Code DTS_E_THREADFAILED.  Thread "WorkThread0" has exited with error code 0xC0209029.  There may be error messages posted before this with more information on why the thread has exited.
    Error: 0xC0047038 at DFT - Company EX: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on component "OLE DB Source 1" (1) returned error code 0xC02020C4.  The component returned a failure code when the pipeline engine
    called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.
    Error: 0xC0047021 at DFT - Company EX: SSIS Error Code DTS_E_THREADFAILED.  Thread "SourceThread0" has exited with error code 0xC0047038.  There may be error messages posted before this with more information on why the thread has exited.
    Any help would be appreciated ASAP.
    Thanks,
    Vinay s

    You can use this code to import from SQL Server to Excel . . .
    Sub ADOExcelSQLServer()
    ' Carl SQL Server Connection
    ' FOR THIS CODE TO WORK
    ' In VBE you need to go Tools References and check Microsoft Active X Data Objects 2.x library
    Dim Cn As ADODB.Connection
    Dim Server_Name As String
    Dim Database_Name As String
    Dim User_ID As String
    Dim Password As String
    Dim SQLStr As String
    Dim rs As ADODB.Recordset
    Set rs = New ADODB.Recordset
    Server_Name = "EXCEL-PC\EXCELDEVELOPER" ' Enter your server name here
    Database_Name = "AdventureWorksLT2012" ' Enter your database name here
    User_ID = "" ' enter your user ID here
    Password = "" ' Enter your password here
    SQLStr = "SELECT * FROM [SalesLT].[Customer]" ' Enter your SQL here
    Set Cn = New ADODB.Connection
    Cn.Open "Driver={SQL Server};Server=" & Server_Name & ";Database=" & Database_Name & _
    ";Uid=" & User_ID & ";Pwd=" & Password & ";"
    rs.Open SQLStr, Cn, adOpenStatic
    ' Dump to spreadsheet
    With Worksheets("sheet1").Range("a1:z500") ' Enter your sheet name and range here
    .ClearContents
    .CopyFromRecordset rs
    End With
    ' Tidy up
    rs.Close
    Set rs = Nothing
    Cn.Close
    Set Cn = Nothing
    End Sub
    Also, check this out . . .
    Sub ADOExcelSQLServer()
    Dim Cn As ADODB.Connection
    Dim Server_Name As String
    Dim Database_Name As String
    Dim User_ID As String
    Dim Password As String
    Dim SQLStr As String
    Dim rs As ADODB.Recordset
    Set rs = New ADODB.Recordset
    Server_Name = "LAPTOP\SQL_EXPRESS" ' Enter your server name here
    Database_Name = "Northwind" ' Enter your database name here
    User_ID = "" ' enter your user ID here
    Password = "" ' Enter your password here
    SQLStr = "SELECT * FROM Orders" ' Enter your SQL here
    Set Cn = New ADODB.Connection
    Cn.Open "Driver={SQL Server};Server=" & Server_Name & ";Database=" & Database_Name & _
    ";Uid=" & User_ID & ";Pwd=" & Password & ";"
    rs.Open SQLStr, Cn, adOpenStatic
    With Worksheets("Sheet1").Range("A2:Z500")
    .ClearContents
    .CopyFromRecordset rs
    End With
    rs.Close
    Set rs = Nothing
    Cn.Close
    Set Cn = Nothing
    End Sub
    Finally, if you want to incorporate a Where clause . . .
    Sub ImportFromSQLServer()
    Dim Cn As ADODB.Connection
    Dim Server_Name As String
    Dim Database_Name As String
    Dim User_ID As String
    Dim Password As String
    Dim SQLStr As String
    Dim RS As ADODB.Recordset
    Set RS = New ADODB.Recordset
    Server_Name = "Excel-PC\SQLEXPRESS"
    Database_Name = "Northwind"
    'User_ID = "******"
    'Password = "****"
    SQLStr = "select * from dbo.TBL where EMPID = '2'" 'and PostingDate = '2006-06-08'"
    Set Cn = New ADODB.Connection
    Cn.Open "Driver={SQL Server};Server=" & Server_Name & ";Database=" & Database_Name & ";"
    '& ";Uid=" & User_ID & ";Pwd=" & Password & ";"
    RS.Open SQLStr, Cn, adOpenStatic
    With Worksheets("Sheet1").Range("A1")
    .ClearContents
    .CopyFromRecordset RS
    End With
    RS.Close
    Set RS = Nothing
    Cn.Close
    Set Cn = Nothing
    End Sub
    Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.

  • Error While loading DSO

    Dear All,
    I get an error while loading a DSO in BI 7: "Activation SID for DataStore object changed from 0 to 436921".

    Prasad,
    Please do not raise duplicate posts ....
    ERROR WHILE LOADING DSO
    As a result, answers become spread across posts.
    The least you can do is mention the duplicate posts and state which one you want to keep and which you want removed.
    As for your error: it is not really an error. When a to-be-activated request is on the screen and the screen is refreshed, the status changes from green (not yet activated) to yellow and an SID is generated for the request. Because the status changed underneath the program that displays the requests, you see this reported as an error.
    Since you want a way to avoid seeing this error:
    one option is to start the activation, come back to RSA1, and then go to the manage screen; you should not see the error there.
    Alternatively, go to RSA1 and return to the manage screen after a few minutes.

  • Delta records are not loading from DSO to info cube

    My query is about delta loading from a DSO to an InfoCube (a filter is used in the selection).
    Delta records are not loading from the DSO to the InfoCube. I have tried all the options available in the DTP, but no luck.
    I selected "Change log" and "Get one request only" and ran the DTP, but 0 records were updated in the InfoCube.
    I selected "Change log" and "Get all new data request by request", but again 0 records were updated.
    I selected "Change log" and "Only get delta once"; in that case all the delta records were loaded to the InfoCube as they were in the DSO, but the load gave the error message "Lock table overflow".
    When I run a full load using the same filter, the data loads from the DSO to the InfoCube.
    Can anyone please help me get the delta records from the DSO to the InfoCube?
    Thanks,
    Shamma

    Data loads in a full load with the same filter, so I don't think the filter is the issue.
    When I follow the sequence below, I get the lock table overflow error:
    1. Full load from the active table, with or without archive.
    2. Then, with the same settings, if I run an init, the final status remains yellow, and when I change the status to green manually it gives the lock table overflow error.
    When I change the DTP settings for an init run:
    1. Select "Change log" and "Get one request only" and run the init; it completes successfully with green status.
    2. But when I run the same DTP for delta records, it does not load any data.
    Please help me to resolve this issue.

  • Data load from DSO to cube fails

    Hi Gurus,
    The data loads failed last night, and when I dig into the process chains I find that the cube which should be loaded from the DSO is not getting loaded.
    The DSO has been loaded without errors from ECC.
    The error message says: "The unit/currency 'source currency 0CURRENCY' with the value 'space' is assigned to the key figure 'source key figure ZARAMT'".
    I looked at the PSA; it has about 50,000 records,
    all data packages have a green light, and all amounts have 0CURRENCY assigned.
    I went into the DTP and looked at the error stack; it had nothing in it. I then changed the error handling option from 'No update, no reporting' to 'Valid records update, no reporting (request red)' and executed again, and the error stack then showed 101 records.
    The ZARAMT field has a blank 0CURRENCY for all of these records.
    I tried to assign USD to them; the changes were saved, and I tried to execute again, but then a message said the request ID had to be repaired or deleted before execution. I tried to repair it, but it says it cannot be repaired, so I deleted it and executed. The load fails again, the error stack still shows 101 records, and when I look at the records the changes I made no longer exist.
    If I delete the request ID before changing the records and then try to save the changes, they don't get saved.
    What should I do to resolve the issue?
    thanks
    Prasad

    Hi Prasad....
    The error stack is request-specific. Once you delete the request from the target, the data in the error stack is also deleted.
    In this case, what you are supposed to do is:
    1) Change the error handling option to 'Valid records update, no reporting (request red)' (as you have already done) and execute the DTP; all the erroneous records will accumulate in the error stack.
    2) Then correct the erroneous records in the error stack.
    3) Then, in the Update tab of the DTP, you will find the option "Error DTP". If it has not been created yet, you will see "Create Error DTP"; click there and execute the error DTP. The error DTP fetches the records from the error stack and creates a new request in the target.
    4) Then manually change the status of the original request to green.
    But did you check why the value of this field is blank? If these records come again as part of a delta or full load, your load will fail again. Check the source system and fix it there for a permanent solution.
    Regards,
    Debjani......
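
    Until the source is fixed, some teams also guard the rule for the currency of ZARAMT so that a blank unit does not stop the whole request. The fragment below is only a sketch under assumptions: SOURCE_FIELDS-currency is an assumed field name, and defaulting to USD must be agreed with the business (a lookup of the company code currency would usually be safer).

    *   Hedged sketch of a BW 7.x field routine for the 0CURRENCY assigned to ZARAMT.
        IF SOURCE_FIELDS-currency IS INITIAL.
          RESULT = 'USD'.                        "assumed fallback currency
        ELSE.
          RESULT = SOURCE_FIELDS-currency.
        ENDIF.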

  • Partial data loading from DSO to cube

    Dear All,
    I am loading data from a DSO to a cube through a DTP. The problem is that some records are added to the cube through data
    packet 1 (around 50 crore records), while the records from data packet 2 are not getting added.
    It is a full load from the DSO to the cube.
    I have tried deleting the request and executing the DTP again, but the same number of records is added to the cube through data packet 1 and after that no further records are added; the request remains in yellow status.
    Please suggest.

    Nidhuk,
    The data load transfers package by package. It sounds like the load got stuck in the second package or something similar. I suggest you check the package size and try increasing it to see whether anything changes. 50 records per package is rather low; your load should not be spread out over too many packages.
    Regards. Jen
    Edited by: Jen Yakimoto on Oct 8, 2010 1:45 AM

  • Error while loading an image from the database

    Purpose
    Error while loading an image from the database:
    ORA-06502: PL/SQL : numeric or value error:Character string buffer too small
    ORA-06512: at "Sys.HTP" ,line 1480
    Requirement:
    I am developing web pages using PSP (PL/SQL Server Pages). I have a requirement to show an image on a web page, and I got the following procedures from the Oracle website.
    I have created the following table to store the images:
    create table DEMO (
      ID INTEGER not null,
      THEBLOB BLOB
    );
    I have also uploaded the images. Now I am trying to get an image from the table using the following procedure, but I am getting an error at line 25 (htp.prn( utl_raw.cast_to_varchar2( l_raw ) );), where it throws the following error messages:
    ORA-06502: PL/SQL : numeric or value error:Character string buffer too small
    ORA-06512: at "Sys.HTP" ,line 1480
    Procedure that I used to get the image:
    create or replace package body image_get
    as
      procedure gif( p_id in demo.id%type )
      is
        l_lob blob;
        l_amt number default 30;
        l_off number default 1;
        l_raw raw(4096);
      begin
        select theBlob into l_lob
          from demo
         where id = p_id;
        -- make sure to change this for your type!
        owa_util.mime_header( 'image/gif' );
        begin
          loop
            dbms_lob.read( l_lob, l_amt, l_off, l_raw );
            -- it is vital to use htp.PRN to avoid
            -- spurious line feeds getting added to your
            -- document
            htp.prn( utl_raw.cast_to_varchar2( l_raw ) );
            l_off := l_off + l_amt;
            l_amt := 4096;
          end loop;
        exception
          when no_data_found then
            NULL;
        end;
      end;
    end;
    What do I have to do to correct this problem? I downloaded this demo procedure and table from Oracle; somewhere I made a mistake. Any help?
    Thanks,
    Nats

    Hi Satish,
    I have set the raw size to 3600 but it still gives the same error. When I debug the procedure, it throws the error stack in
    the SYS.HTP.PRN procedure, at the following lines of code:
    if (rows_in < pack_after) then
    while ((len - loc) >= HTBUF_LEN)
    loop
    rows_in := rows_in + 1;
    htbuf(rows_in) := substr(cbuf, loc + 1, HTBUF_LEN);
    loc := loc + HTBUF_LEN;
    end loop;
    if (loc < len)
    then
    rows_in := rows_in + 1;
    htbuf(rows_in) := substr(cbuf, loc + 1);
    end if;
    return;
    end if;
    It's a system procedure, so I don't know how to proceed; I am really stuck on this. Is there any other method to take a picture from the database and display it in a web page? Any idea or suggestion?
    Thanks for your help!
