DBIF_RSQL_SQL_ERROR - while loading data from PSA to InfoCube!

Hi Friends,
I know this topic has been posted before and I have gone through almost all the threads related to it, but could not find a solution for this issue. Here are the details:
I am in Production and loading data to one of the cubes through the PSA. The data arrives fine in the PSA (around 1 million records), but when I update the data to the InfoCube through a DTP I get the following error:
Dump: ABAP/4 processor: DBIF_RSQL_SQL_ERROR
When I click on the error it says the following:
DBIF_RSQL_SQL_ERROR
CX_SY_OPEN_SQL_DB
An exception occurred that is explained in detail below.
The exception, which is assigned to class 'CX_SY_OPEN_SQL_DB'
According to previous threads on this issue, it is a tablespace issue in the database. But I am sure it is not a tablespace issue, as my other loads are successful with far more records. I also checked in DB02 and see no problem with the tablespace.
I tried selective loading through the DTP and the data load was successful, but this is not the way we want to load the data; we don't want to use any selections in the InfoPackage.
I looked in ST22 and see the following dumps:
CONNE_IMPORT_WRONG_COMP_TYPE    CX_SY_IMPORT_MISMATCH_ERROR
OBJECTS_OBJREF_NOT_ASSIGNED
CX_SY_REF_IS_INITIAL
DBIF_RSQL_SQL_ERROR
CX_SY_OPEN_SQL_DB
DBIF_RSQL_SQL_ERROR
CX_SY_OPEN_SQL_DB
I also tried deleting the InfoCube indexes before loading from the PSA, but that did not work for me.
I would highly appreciate any help from your side and will award points.
Thanks,
Manmit

Thanks Srinivas. I cannot reduce the package size because I am loading data from the PSA to the InfoCube through a DTP. This DTP is created on a DataSource which is a generated export DataSource, i.e. 8XYZABC, which does not allow me to change the package size; it is decided at runtime. I can see the following under the Extraction tab of the DTP:
"The package size corresponds to the package size in the source.
It is determined dynamically at runtime."
Any idea how to solve this problem?
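For anyone decoding the dump class: CX_SY_OPEN_SQL_DB is the catchable exception behind DBIF_RSQL_SQL_ERROR, raised when the database rejects an Open SQL statement (for example when temporary or rollback space runs out during a mass insert, which would also explain why smaller, selective loads go through). A minimal sketch of where it surfaces; the report name is hypothetical and the SELECT merely stands in for the failing statement:

  REPORT zsketch_open_sql_db.
  DATA: lt_t000 TYPE STANDARD TABLE OF t000,
        lx_db   TYPE REF TO cx_sy_open_sql_db.
  TRY.
      " Any Open SQL statement the database rejects raises this class;
      " in the DTP case it is the INSERT into the cube's fact table.
      SELECT * FROM t000 INTO TABLE lt_t000.
    CATCH cx_sy_open_sql_db INTO lx_db.
      WRITE: / lx_db->get_text( ).  " underlying database error text
  ENDTRY.

The generated DTP program does not catch the exception, which is why it appears as a short dump; the database error text shown in ST22 (and in SM21) usually names the actual resource that was exhausted.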

Similar Messages

  • Data not loading from PSA to infocube

    Hi,
    I'm facing a problem while extracting data from the PSA to an InfoCube (on the APO side). We are using 3 InfoCubes and extracting data in such a way that each InfoCube has only one unit of measure, so with the help of an ABAPer we have written a start routine in the transformation so that, for example, only products with unit of measure KM come into that cube. But no data is coming into the cube and it shows the following errors:
    From the ABAP side:
    An error message is generated at the start routine method. In the method the ABAP code "DELETE SOURCE_PACKAGE WHERE VRKME NE 'KM'." was added. The following errors are displayed:
    1.) Call of the method START_ROUTINE of the class LCL_TRANSFORM failed; wrong type for parameter SOURCE_PACKAGE
    2.) Exception CX_RS_STEP_FAILED logged
    Please help me out with this.
    Thank you
    Regards,
    Raj

    Could you please explain in more detail?
    We are not passing anything to DATA_PACKAGE, nor using it. In the method call we pass <_yt_SC_1> to the parameter SOURCE_PACKAGE, and inside the method we have written the delete code as "DELETE SOURCE_PACKAGE WHERE VRKME NE 'KM'.".
    Do we need to change the coding to "DELETE DATA_PACKAGE WHERE VRKME NE 'KM'."?
    Regards,
    Raj
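    For reference, a note on the two parameter names: in a BW 7.x transformation the start routine's changing parameter is SOURCE_PACKAGE (typed by the generated class), while DATA_PACKAGE is the name used in 3.x transfer/update rules, so simply renaming it would not compile here. A self-contained sketch of the delete logic, with a hypothetical two-field structure standing in for the generated source type:

      TYPES: BEGIN OF ty_rec,
               matnr TYPE c LENGTH 18,  " stand-in key field
               vrkme TYPE c LENGTH 3,   " sales unit, as in the source
             END OF ty_rec.
      DATA: lt_source TYPE STANDARD TABLE OF ty_rec,
            ls_rec    TYPE ty_rec.
      ls_rec-matnr = 'M1'. ls_rec-vrkme = 'KM'. APPEND ls_rec TO lt_source.
      ls_rec-matnr = 'M2'. ls_rec-vrkme = 'KG'. APPEND ls_rec TO lt_source.
      " Same statement as in the start routine: keep only KM records.
      DELETE lt_source WHERE vrkme <> 'KM'.  " only M1 remains

    The "wrong type for parameter SOURCE_PACKAGE" message usually points to the generated transformation program being out of sync with the routine, so reactivating the transformation and the DTP is worth trying before touching the code.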

  • Unable to load the data from PSA to INFOCUBE

    Hi BI Experts, good afternoon.
    I am loading 3 years of data (full load) from R/3 to an InfoCube.
    I loaded the data month by month, so I created 36 InfoPackages.
    Everything was fine, but I got an error in Jan 2005 and Mar 2005. It is the same error in both months: Caller 01 and Caller 02 errors (meaning there are invalid characteristics in the PSA data).
    So I deleted both the PSA and data target requests and loaded the data again, but only to the PSA.
    The data arrived in the PSA without failure.
    Then I tried to load the data from the PSA to the InfoCube manually, but it is not happening.
    One message came up:
         SID 60,758 is smaller than the compress SID of cube ZIC_C03; no request booking.
    Please give me a solution to this problem.
    Thanks & Regards,
    Anjali

    Hi Teja,
       Thanks for the good response.
    How can I check whether it is already compressed or not?
    Please give me a reply.
      Thanks
              Anjali
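    One way to check, besides the Collapse tab in the cube's Manage screen (which lists the requests that have been compressed): if the cube's E fact table contains rows, compression has run at least once. A hedged sketch, assuming the standard /BIC/E naming for the custom cube ZIC_C03:

      DATA lv_rows TYPE i.
      " The E fact table holds compressed data; the F fact table holds
      " uncompressed requests. A non-empty E table means the cube has
      " been compressed at least once.
      SELECT COUNT(*) FROM /bic/ezic_c03 INTO lv_rows.
      IF lv_rows > 0.
        WRITE: / 'ZIC_C03 compressed:', lv_rows, 'rows in E table'.
      ENDIF.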

  • Transformation Rule: Error while loading from PSA to ODS using DTP

    Hi Experts,
    I am trying to load data from PSA to ODS using DTP. For about 101 records I get the following error:
    "Runtime error while executing rule -> see long text     RSTRAN     301"
    On further looking at the long text:
    Diagnosis
        An error occurred while executing a transformation rule:
        The exact error message is:
        Overflow converting from ''
        The error was triggered at the following point in the program:
        GP4808B5A4QZRB6KTPVU57SZ98Z 3542
    System Response
        Processing the data record has been terminated.
    Procedure
          The following additional information is included in the higher-level
         node of the monitor:
         o   Transformation ID
         o   Data record number of the source record
         o   Number and name of the rule which produced the error
    Procedure for System Administration
    When looking at the detail:
    Error Location: Object Type    TRFN
    Error Location: Object Name    06BOK6W69BGQJR41BXXPE8EMPP00G6HF
    Error Location: Operation Type DIRECT
    Error Location: Operation Name
    Error Location: Operation ID   00177 0000
    Error Severity                 100
    Original Record: Segment       0001
    Original Record: Number        2
    Please, can anyone help in tracing this error to the exact spot in the transformation rule?
    Thanks & Regards,
    Raj

    Jerome,
    The same issue.
    Here are the fields that differ in length when mapped in the transformation rules:
    ODS field        Type        | DataSource field   Type
    PROD_CATEG       CHAR 32     | Category_GUID      RAW 16
    CRM_QTYEXP       INT4        | EXPONENT           INT2
    CRM_EXCRAT       FLTP 16     | EXCHG_RATE         DEC 9
    CRM_GWEIGH       QUAN 17,3   | Gross_Weight       QUAN 15
    NWEIGH           QUAN 17,3   | Net_Weight         QUAN 15
    CRMLREQDAT       DATS 8      | REQ_DLV_DATE       DEC 15
    The difference is that either a DATS field is mapped to a decimal, the CHAR 32 field is mapped to RAW 16, or CALWEEK/CALMONTH is mapped to CALDAY.
    In almost all cases the ODS field size is greater than that of the input source field.
    Thanks
    Raj
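    On the CHAR 32 / RAW 16 pair specifically: a RAW 16 GUID fits a CHAR 32 target because ABAP's X-to-C conversion renders each byte as two hex characters. A minimal sketch (variable names hypothetical):

      DATA: lv_guid_raw  TYPE x LENGTH 16,  " like Category_GUID, RAW 16
            lv_guid_char TYPE c LENGTH 32.  " like PROD_CATEG, CHAR 32
      lv_guid_raw  = '000102030405060708090A0B0C0D0E0F'.
      lv_guid_char = lv_guid_raw.  " X -> C yields the 32-digit hex string
      WRITE: / lv_guid_char.

    The mappings to inspect for the overflow are therefore the ones where the source can carry more digits than the target accepts, such as the DEC 15 date mapped to DATS 8.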

  • Error in rules while updating from PSA to ODS

    Hi Experts,
    I am trying to load data from PSA to ODS using DTP. For about 101 records I get the following error:
    "Runtime error while executing rule -> see long text     RSTRAN     301"
    On further looking at the long text:
    Diagnosis
        An error occurred while executing a transformation rule:
        The exact error message is:
        Overflow converting from ''
        The error was triggered at the following point in the program:
        GP4808B5A4QZRB6KTPVU57SZ98Z 3542
    System Response
        Processing the data record has been terminated.
    Procedure
          The following additional information is included in the higher-level
         node of the monitor:
         o   Transformation ID
         o   Data record number of the source record
         o   Number and name of the rule which produced the error
    Procedure for System Administration
    When looking at the detail:
    Error Location: Object Type    TRFN
    Error Location: Object Name    06BOK6W69BGQJR41BXXPE8EMPP00G6HF
    Error Location: Operation Type DIRECT
    Error Location: Operation Name
    Error Location: Operation ID   00177 0000
    Error Severity                 100
    Original Record: Segment       0001
    Original Record: Number        2
    Please, can anyone help in tracing this error to the exact spot in the transformation rule?
    Thanks & Regards,
    Raj

    Hi Kazmi:
    Runtime Errors         CONNE_IMPORT_WRONG_COMP_TYPE
    Exception              CX_SY_IMPORT_MISMATCH_ERROR
    Short text
        Error when attempting to IMPORT object "HIST2".
    What happened?
        Error in the ABAP Application Program
        The current ABAP program "RSORAT4M" had to be terminated because it has
        come across a statement that unfortunately cannot be executed.
    Error analysis
        An exception occurred that is explained in detail below.
        The exception, which is assigned to class 'CX_SY_IMPORT_MISMATCH_ERROR', was
         not caught in
        procedure "AKT_DAY_HIST2" "(FORM)", nor was it propagated by a RAISING clause.
        Since the caller of the procedure could not have anticipated that the
        exception would occur, the current program is terminated.
        The reason for the exception is:
        When importing the object "HIST2", the component no. 8 in the
        dataset has a different type from the corresponding component
        of the target object in the program "RSORAT4M".
        The data type is "I" in the dataset, but "P" in the program.
    How to correct the error
        Try to find out why the type of the object should be different.
        There are various possible options:
        1. The type of the imported field has changed in the Data Dictionary.
           Make sure that the type of the imported field matches the type
           of the field in the Data Dictionary.
           If the data cannot be restored from another source, the data must be
           read with the 'old' structure, converted and exported again with the
           new structure, so that future IMPORTs will always function with the
           new structure.
        2. A new program version is active, which no longer fits the dataset.
           Try to solve the error generating the program "RSORAT4M" again. This
           works as follows: Select transaction SE38 in the SAP system. Enter
           the program name "RSORAT4M". Then activate the function 'Generate'.
    If the error occurs in a non-modified SAP program, you may be able to
    find an interim solution in an SAP Note.
    If you have access to SAP Notes, carry out a search with the following
    keywords:
    "CONNE_IMPORT_WRONG_COMP_TYPE" "CX_SY_IMPORT_MISMATCH_ERROR"
    "RSORAT4M" or "RSORAT4M"
    "AKT_DAY_HIST2"
    If you cannot solve the problem yourself and want to send an error
    notification to SAP, include the following information:
    1. The description of the current problem (short dump)
        To save the description, choose "System->List->Save->Local File
    (Unconverted)".
    2. Corresponding system log
        Display the system log by calling transaction SM21.
        Restrict the time interval to 10 minutes before and five minutes
    after the short dump. Then choose "System->List->Save->Local File
    (Unconverted)".
    3. If the problem occurs in a program of your own or a modified SAP
    program: The source code of the program
        In the editor, choose "Utilities->More
    Utilities->Upload/Download->Download".
    4. Details about the conditions under which the error occurred or which
    actions and input led to the error.
    The exception must either be prevented, caught within procedure
    "AKT_DAY_HIST2" "(FORM)", or its possible occurrence must be declared in the
    RAISING clause of the procedure.
    Hope this helps.
    Thanks to you and all
    Raj

  • Error while loading from PSA to Cube. RSM2-704 & RSAR -119

    Dear Gurus,
    I have to extract about 1.13 million records from the setup tables to the MM cube 0IC_C03. While doing so, the following two errors occurred while loading from the PSA to the cube:
    Error ID - RSM2 & error no. 704: Data records in the PSA were marked for data package 39 . Here there were 2 errors. The system wrote information or warnings for the remaining data records.
    Error ID- RSAR & No. 119: The update delivered the error code 4 .
    (Data records for package 39 selected in PSA - 2 error(s). Record 5129 :No SID found for value 'EMN ' of characteristic 0BASE_UOM . Record 5132 :No SID found for value 'EMN ' of characteristic 0BASE_UOM )
    I tried to change the defective records in the PSA by deleting the erroneous field value EMN and then tried to load, but it failed.
    Now, my questions are:
    (i) How can I resolve the issue?
    (ii) How do we rectify the erroneous records? Should we only delete the erroneous field value, or delete the entire record from the PSA?
    (iii) How do we delete a record from the PSA?
    Thanks & regards,
    Sheeja.

    Hi,
    Data records for package 39 selected in PSA - 2 error(s). Record 5129 :No SID found for value 'EMN ' of characteristic 0BASE_UOM . Record 5132 :No SID found for value 'EMN ' of characteristic 0BASE_UOM
    The issue is with records no. 5129 and 5132.
    In the PSA, check the erroneous records (the display can be filtered to show only the error records), edit them as required, and try reloading into the cube.
    Deleting a single record is not possible.
    Let us know if you still have any issues.
    Reg
    Pra
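    A quick way to confirm the root cause: "No SID found for value 'EMN' of characteristic 0BASE_UOM" usually means the unit is missing from the unit master data rather than the record being broken in any other way. A hedged check against the standard units table T006 (internal format; external codes live in T006A):

      DATA lv_unit TYPE t006-msehi.
      SELECT SINGLE msehi FROM t006 INTO lv_unit WHERE msehi = 'EMN'.
      IF sy-subrc <> 0.
        " Unit unknown in BW: transfer the global settings (units) from
        " the source system, or fix the unit in the source record,
        " before reloading the request from the PSA.
        WRITE: / 'Unit EMN is not maintained in T006'.
      ENDIF.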

  • Data load taking more time from PSA to InfoCube through DTP

    Hi All,
    When I load data to the InfoCube from the PSA through a DTP, it is taking much more time: it has already been running for 4 days. Irrespective of the data, previous loads completed in 2-3 hours. This is the first time I am facing this issue.
    Note: there is a start routine written in the transformation.
    Please let me know how to identify whether the code is written in the global declaration section or locally, and if so, what the procedure is to correct it.
    Thanks,
    Jack

    Hi Jack,
    To improve the performance of the data load, you can do the below:
    1. Compress Old data
    2. Delete and Rebuild Indexes
    3. Read with binary search
    4. Do not use a LOOP within a LOOP.
    5. Check sy-subrc after READ TABLE, etc.
    Hope this helps you.
    -Vikram
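    To illustrate points 3-5 in one place, a hedged sketch of the usual lookup pattern inside a start routine: sort the lookup table once, use READ TABLE ... BINARY SEARCH inside a single loop instead of nesting a second LOOP, and test sy-subrc before using the result (all names hypothetical):

      TYPES: BEGIN OF ty_rec,
               matnr TYPE c LENGTH 18,
               matkl TYPE c LENGTH 9,
             END OF ty_rec.
      DATA: source_package TYPE STANDARD TABLE OF ty_rec,  " stand-in
            lt_lookup      TYPE STANDARD TABLE OF ty_rec,
            ls_lookup      TYPE ty_rec.
      FIELD-SYMBOLS <fs_src> TYPE ty_rec.
      " ... fill lt_lookup ONCE here, outside the loop ...
      SORT lt_lookup BY matnr.             " prerequisite for binary search
      LOOP AT source_package ASSIGNING <fs_src>.
        READ TABLE lt_lookup INTO ls_lookup
             WITH KEY matnr = <fs_src>-matnr BINARY SEARCH.
        IF sy-subrc = 0.                   " point 5: always check sy-subrc
          <fs_src>-matkl = ls_lookup-matkl.
        ENDIF.
      ENDLOOP.

    A SELECT inside the loop, or a nested LOOP over a large lookup table, is the usual reason a 2-3 hour load suddenly takes days once data volumes grow.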

  • Dump error ASSIGN_TYPE_CONFLICT while loading from DSO to InfoCube

    Hi all,
    I had a requirement to modify the transformation rules, and I wrote some routine code for a particular InfoObject.
    I then deleted the data from the DSO and the cube and started loading to the DSO. That was successful, and the DSO now holds the modified data for that InfoObject.
    But I am facing a dump error while loading the data from the DSO to the cube through the DTP.
    The error I see in ST22 is "ASSIGN_TYPE_CONFLICT".
    Please help.
    Thanks in advance,
    sravan

    Hi,
    When I started the load for the first time I got the same error, so I activated the DTP and loaded it again, and then faced the same problem.
    The modified data is already in the DSO and I have validated it; that is OK.
    So do I need to delete the DTP, or create another DTP, to load the data from the DSO to the cube?
    I have also checked all the transformation rules and they are fine; the DSO structure and InfoCube structure are OK.
    Please suggest.
    Thanks, LAX

  • Pushing a request from PSA..but says.."Request is already updated in cube"

    Guys,
    I am pushing a request from the PSA, but it says "Request is already updated in cube". I deleted it from the PSA before scheduling, and I could see the request in the yellow/not-yet-completed list.
    I cannot find the request in the cube, but while pushing from the PSA it says "request already updated".
    Please advise. Thanks!

    Ganesh,
    Go to InfoCube Manage --> Reconstruction tab --> select the request --> Reconstruct, and try again.
    Hope it Helps
    Srini

  • Regarding the data load from the PSA to InfoCube

    Hi Experts,
    I have a doubt: when I load data into the InfoCube, the data does not update, even though the data is present in the PSA (about 4 million records). When I run the DTP from the PSA to the InfoCube, the load just shows a yellow signal. In the PSA there are 20 packets in total, each containing about 50,000 records. Can you suggest the steps I have to take to load the data into an InfoCube from the PSA through a DTP? I am new to this field.
    Bye.

    Hi Vinay,
    Check this link...this solves all ur worries..
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    Also,
    Performance Tips for Data Transfer Processes  
    Request processing, i.e. the execution of a data transfer process (DTP), can take place with various degrees of parallelization in the extraction and processing (transformation and update) steps. The system selects the most appropriate and efficient processing for the DTP in accordance with the settings in the DTP maintenance transaction, and creates a DTP processing mode.
    To further optimize the performance of request processing, there are a number of further measures that you can take:
    ●      By taking the appropriate measures, you can obtain a processing mode with a higher degree of parallelization.
    ●      A variety of measures can help to improve performance, in particular the settings in the DTP maintenance transaction. Some of these measures are source and data type specific.
    The following sections describe the various measures that can be taken.
    Higher Parallelization in the Request Processing Steps
    With a (standard) DTP, you can modify an existing system-defined processing mode by changing the settings for error handling and semantic grouping. The measures below optimize the performance of an existing DTP processing mode:
    1. Serial extraction and processing of the source packages (P3) -> serial extraction, immediate parallel processing (P2): select the grouping fields.
    2. Serial extraction and processing of the source packages (P3) -> parallel extraction and processing (P1): only possible with the persistent staging area (PSA) as the source; deactivate error handling.
    3. Serial extraction, immediate parallel processing (P2) -> parallel extraction and processing (P1): only possible with the PSA as the source; deactivate error handling and remove the grouping fields selection.
    Further Performance-Optimizing Measures
    Setting the number of parallel processes for a DTP during request processing.
    To optimize the performance of data transfer processes with parallel processing, you can set the number of permitted background processes for process type Set Data Transfer Process globally in BI Background Management.
    To further optimize performance for a given data transfer process, you can override the global setting:
    In the DTP maintenance transaction, choose Goto -> Batch Manager Settings. Under Number of Processes, specify how many background processes should be used to process the DTP. Once you have made this setting, remember to save.
    Setting the Size of Data Packets
    In the standard setting in the data transfer process, the size of a data packet is set to 50,000 data records, on the assumption that a data record has a width of 1,000 bytes. To improve performance, you can increase the size of the data packet for smaller data records.
    Enter this value under Packet Size on the Extraction tab in the DTP maintenance transaction.
    Avoid too large DTP requests with a large number of source requests: Retrieve the data one request at a time
    A DTP request can be very large, since it bundles together all transfer-relevant requests from the source. To improve performance, you can stipulate that a DTP request always reads just one request at a time from the source.
    To make this setting, select Get All New Data in Source by Request on the Extraction tab in the DTP maintenance transaction. Once processing is completed, the DTP request checks for further new requests in the source. If it finds any, it automatically creates an additional DTP request.
    With DataSources as the source: Avoid too small data packets when using the DTP filter
    If you extract from a DataSource without error handling, and a large amount of data is excluded by the filter, this can cause the data packets loaded by the process to be very small. To improve performance, you can modify this behaviour by activating error handling and defining a grouping key.
    Select an error handling option on the Updating tab in the DTP maintenance function. Then define a suitable grouping key on the Extraction tab under Semantic Groups. This ensures that all data records belonging to a grouping key in a packet are extracted and processed.
    With DataStore objects as the source: Do not extract data before the first delta or during full extraction from the table of active data
    The change log grows in proportion to the table of active data, since it stores before- and after-images. To optimize performance during extraction in Fill mode or with the first delta from the DataStore object, you can read the data from the table of active data instead of from the change log.
    To make this setting, select Active Table (with Archive) or Active Table (without Archive) on the Extraction tab under Extraction from... or Delta Extraction from... in the DTP maintenance transaction.
    With InfoCubes as the source: Use extraction from aggregates
    With InfoCube extraction, the data is read in the standard setting from the fact table (F table) and the table of compressed data (E table). To improve performance here, you can use aggregates for the extraction.
    Select data transfer process Use Aggregates on the Extraction tab in the DTP maintenance transaction. The system then compares the outgoing quantity from the transformation with the aggregates. If all InfoObjects from the outgoing quantity are used in aggregates, the data is read from the aggregates during extraction instead of from the InfoCube tables.
    Note for using InfoProviders as the source
    If not all key fields of the source InfoProvider have target fields assigned to them in the transformation, the key figures of the source are aggregated over the unselected key fields during extraction. You can prevent this automatic aggregation by implementing a start routine or an intermediate InfoSource. Note, though, that this affects the performance of the data transfer process.
    Hope this helps u..
    VVenkat..

  • Error while updating data from PSA to ODS

    Hi Sap Gurus,
    I am facing an error while updating data from the PSA to an ODS in BI 7.0.
    The exact error message is:
    The argument 'TBD' cannot be interpreted as a number
    The error was triggered at the following point in the program:
    GP44QSI5RV9ZA5X0NX0YMTP1FRJ 5212
    Please suggest how to proceed on this issue.
    Points will be awarded.

    Hi,
    Try to simulate the process; that can give you the exact error location.
    It seems that while updating, a few records may not be in the format of the field into which they are updated.
    Regards
    Rahul Bindroo
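    The message corresponds to the catchable exception CX_SY_CONVERSION_NO_NUMBER: somewhere in the generated rule, a character value ('TBD') is moved into a numeric field. If the source really can contain such placeholders, a defensive rule routine is one option; a hedged sketch with hypothetical names:

      DATA: lv_raw    TYPE c LENGTH 10 VALUE 'TBD',
            lv_amount TYPE p LENGTH 8 DECIMALS 2.
      TRY.
          lv_amount = lv_raw.            " raises the exception for 'TBD'
        CATCH cx_sy_conversion_no_number.
          CLEAR lv_amount.               " decide: default, skip, or raise
      ENDTRY.

    Simulating the DTP, as suggested above, shows which rule and which record trip the conversion.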

  • Error while loading data from PSA to DSO using DTP

    Hi,
    I have a unique alphanumeric identifier of type CHAR, length 32. When I load the data from the PSA to the DSO using a DTP I get the following error message:
    "An error occurred while executing a transformation rule:
    The exact error message is
    Overflow converting from ' '
    The error was triggered at the following point in the program:
    GP4JJHUI6HD7NYAK6MVCDY4A01V 425
    System response
    Processing the data record has been terminated"
    Any idea how I can resolve this?
    Thanks

    Hi,
    First check whether there are any special characters. If not, check the DataSource: under the Fields tab, check the format of the particular field (internal/external) and choose the internal format. If there is a routine, check it once.
    Also use semantic groups in the DTP.
    Try it.
    Thank you,
    lokeeshM
    Edited by: lmedaSAP_BI on Oct 20, 2010 6:44 AM
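    If the monitor details are not enough, the generated program name and line from the message (GP4JJHUI6HD7NYAK6MVCDY4A01V, line 425) can be opened in SE38 to see which target field is being filled at that point. The error itself corresponds to the catchable CX_SY_CONVERSION_OVERFLOW; a hedged sketch of the failure mode with hypothetical field sizes:

      DATA: lv_big   TYPE p LENGTH 8 DECIMALS 0 VALUE '99999999',
            lv_small TYPE int2.          " a narrow numeric target
      TRY.
          lv_small = lv_big.             " value does not fit -> overflow
        CATCH cx_sy_conversion_overflow.
          WRITE: / 'Source value exceeds the target field length'.
      ENDTRY.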

  • Error while loading data from PSA to DSO

    Hi,
    How do I identify the erroneous records in the DSO?
    While loading the data from the PSA to the DSO through process chains we face an error that says:
    "Value '#' (hex. '0023') of characteristic 0BBP_DELREF contains invalid characters"
    "Error when assigning SID: Action VAL_SID_CONVERT InfoObject 0BBP_DELREF"
    There are no error records in the PSA, but it seems some invalid characters exist.
    Could you please help us find the error records in the DSO and correct them.

    Hi,
    These are errors BRAIN290 & RSDRO302.
    The problem here most likely is that BW doesn't recognise a character you are trying to load. Generally the character is not "#",
    as BW displays all symbols it does not recognise as #. You should decode from the hex string what the actual value is. Note that hex values < 20 are not allowed in BW.
    Please review Note 173241 and the document mentioned within.
    This shows what characters are not allowed in BW characteristic values.
    You should check if the character is allowed, and then you can solve the problem in one of the following ways:
    1) add this character to the "permitted character" list in RSKC as described in the note.
    2) correct the value in the source system.
    3) correct the value in the PSA (to do this, you will need to delete the request from the ODS object and then you can change the disallowed character via the PSA maintenance).
    4) follow Note 1075403 so that the characters HEX00 to HEX1F are not checked (this affects only characteristics that do not allow "lower case").
    5) if you cannot use any of the above options, then you will need to create a routine in your transfer rules for the affected infoobject, and change the value to a character which is permitted in BW.
    These are the usual ways to solve this issue.
    Rgds,
    Colum
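    To make the real character behind the '#' visible, or to scrub it in a routine instead of permitting it in RSKC, an ABAP regex over the value works; the POSIX class [:print:] excludes control characters such as HEX00-HEX1F. A hedged sketch (the tab merely simulates a disallowed character):

      DATA lv_value TYPE string.
      CONCATENATE 'ABC' cl_abap_char_utilities=>horizontal_tab 'DEF'
             INTO lv_value.
      " Replace every non-printable character with a space before the
      " value reaches the SID check.
      REPLACE ALL OCCURRENCES OF REGEX '[^[:print:]]'
              IN lv_value WITH ' '.
      WRITE: / lv_value.  " ABC DEF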

  • Error while loading the data from PSA to Data Target

    Hi to all,
    I'm facing an error while loading the data to the data target.
    Error :  Record 1 :Value 'Kuldeep Puri Milan Joshi ' (hex. '004B0075006C0064006500650070002000500075007200690
    Details:
    Requests (messages): Everything OK
    Extraction (messages): Everything OK
    Transfer (IDocs and TRFC): Errors occurred
          Request IDoc : Application document posted
          Info IDoc 2 : Application document posted
          Info IDoc 1 : Application document posted
          Info IDoc 4 : Application document posted
          Info IDoc 3 : Application document posted
          Data Package 1 : arrived in BW ; Processing : Data records for package 1 selected in PSA - 1 error(s)
    Processing (data packet): Errors occurred
          Update PSA ( 2462  Records posted ) : No errors
          Transfer Rules ( 2462  -> 2462  Records ) : No errors
          Update rules ( 2462  -> 2462  Records ) : No errors
          Update ( 0 new / 0 changed ) : Errors occurred
          Processing end : Errors occurred
    I'm totally new to this issue. Please help me solve this error.
    Regards,
    Saran

    Hi,
    I think you are facing an invalid character issue.
    This issue can be resolved by correcting the error records in the PSA and updating them into the target. The first step is to identify whether all the records are in the PSA. You can find this out by checking the Details tab in RSMO, the job log, the PSA (sorting records based on status), etc. Once confirmed, force the request to red and delete the particular request from the target cube. Then go to the PSA and edit the incorrect records (correcting or blanking out the invalid entries in the particular field/InfoObject for each incorrect record) and save. Once all the incorrect records are edited, go to RSA1 > PSA, find the particular request, and update it to the target manually (right-click on the PSA request > Start update immediately).
    I will add the step by step procedure to edit PSA data and update into target (request based).
    In your case the error message says: Error: Record 1: Value 'Kuldeep Puri Milan Joshi '. You just need to convert this to capital letters in the PSA and reload.
    Edit the field to KULDEEP PURI MILAN JOSHI in the PSA and push it to the target.
    Identifying incorrect records.
    The system won't show all the incorrect records the first time. You need to search the PSA table manually to find all of them.
    1. First see RSMO > Details > expand the update rules/processing tabs and you will find some of the error records.
    2. Then go to the PSA and filter using the status of the records. Filter all the red requests. This may still not show all the incorrect records.
    3. Then go to the PSA and filter the incorrect records based on the particular field.
    4. If this also doesn't work, go to the PSA and sort (not filter) the records based on the particular field with incorrect values and it will show all the records. Note down the record numbers and then edit them one by one.
    If you want to confirm, find the PSA table and search manually.
    Also run the report RS_ERRORLOG_EXAMPLE. With this report you can display all the incorrect records of the load, and you can also find whether the error occurred in the PSA or in the transfer rules.
    Steps to resolve this
    1. Force the request to red in RSMO > Status tab.
    2. Delete the request from the target.
    3. In RSMO, at the top right you can see the PSA maintenance button > click it and go to the PSA.
    4. Edit the record.
    5. Save the PSA data.
    6. Go to RSA1 > search by request name > right-click > update the request from the PSA to the target.
    Refer how to Modify PSA Data
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/40890eda-1b99-2a10-2d8b-a18b9108fc38
    This should solve your problem for now.
    As a long term you can apply some user exit in source system side or change your update rules to ensure that this field is getting blanked out before getting loaded in cube or add that particular char to permitted character list in BW.
    RSKC --> type ALL_CAPITAL --> F8 (Execute)
    OR
    Go to SE38 and execute the program RSKC_ALLOWED_CHAR_MAINTAIN and give ALL_CAPITAL or the char you want to add.
    Check the table RSALLOWEDCHAR. It should contain ALL_CAPITAL or the char you have entered.
    Refer
    /people/sap.user72/blog/2006/07/23/invalid-characters-in-sap-bw-3x-myths-and-reality-part-2
    /people/sap.user72/blog/2006/07/08/invalid-characters-in-sap-bw-3x-myths-and-reality-part-1
    /people/aaron.wang3/blog/2007/09/03/steps-of-including-one-special-characters-into-permitted-ones-in-bi
    http://help.sap.com/saphelp_nw04/helpdata/en/64/e90da7a60f11d2a97100a0c9449261/frameset.htm
    For adding Other characters
    OSS note #173241 – “Allowed characters in the BW System”
    Thanks,
    JituK
    Edited by: Jitu Krishna on Mar 22, 2008 1:52 PM
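    For the lowercase-letters case specifically, the long-term fix in a transfer or update rule is a one-line translate, so values pass the ALL_CAPITAL check without repeated PSA editing. A minimal sketch (field name hypothetical):

      DATA lv_name TYPE c LENGTH 60 VALUE 'Kuldeep Puri Milan Joshi'.
      " InfoObjects without the 'Lowercase letters' flag accept only
      " capitals; translate in the rule before the value is checked.
      TRANSLATE lv_name TO UPPER CASE.
      WRITE: / lv_name.  " KULDEEP PURI MILAN JOSHI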

  • Error while transferring Data from PSA to DSO

    Hi all,
    We have a requirement to load data into the DSO. We are extracting it from R/3.
    The problem is that while loading into the PSA it shows 1056 records, but when we try to load it into the DSO it shows 1428 transferred records and 1073 added records. Please help ASAP!

    Hi,
    The difference in records can be due to changes happening in the transformation.
    Check your transformation from the DataSource to your DSO to see whether it contains any start, end, or update routines, or anything similar.
    Also check whether there is a one-to-one mapping from the DataSource to the DSO, or whether some fields in the DataSource are not mapped to the DSO.
    Regards,
    Dhanya
