IDoc issue while loading data into BI

Hello Gurus,
Initially I had a source system connection problem. After it was fixed by Basis, I followed the process below.
I am loading the data using a generic extractor, specifying a selection, as a full load. When I load the data from the R/3 PP application, I see the following result in the BW monitor screen:
1. The job completed in the R/3 system and 1 million records were fetched by the extractor.
2. The records were not posted to the BW side because of a tRFC issue.
3. I took the IDoc number and processed it in transaction BD87, but it did not process successfully; it gives the following error: "Error when saving BOM. Please see the application log."
4. When I checked the application log in transaction SLG1, using the date and time of that particular process, it was in yellow status.
Kindly let me know how I can resolve this issue. I have already tried repeating the InfoPackage job, but I am still facing the same issue. I have also checked the connection and it is OK.
Regards

Hello Veerendra,
Thanks for your quick response. Yes, I am able to process it manually. After processing, it ended with status 51 (application document not posted); a table-level sketch of how such statuses can be checked follows below.
Could you please help me out with this?
Regards
Edited by: KK on Nov 4, 2009 2:19 AM
Edited by: KK on Nov 4, 2009 2:28 AM
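A minimal ABAP sketch of how IDoc statuses such as 51 (application document not posted) and 64 (waiting to be passed to the application) can be inspected at table level; EDIDC is the IDoc control-record table, and the one-day window is only an illustrative selection. Reprocessing itself is still done via BD87 (or the standard reports RBDMANI2 / RBDAPP01).

REPORT z_list_stuck_idocs.
" Hedged sketch: list IDocs from the last day that are still in error
" (status 51) or waiting to be processed (status 64).
DATA: lt_idocs TYPE STANDARD TABLE OF edidc,
      ls_idoc  TYPE edidc,
      lv_from  TYPE sy-datum.

lv_from = sy-datum - 1.

SELECT * FROM edidc INTO TABLE lt_idocs
  WHERE status IN ('51', '64')
    AND credat >= lv_from.

LOOP AT lt_idocs INTO ls_idoc.
  WRITE: / ls_idoc-docnum, ls_idoc-status, ls_idoc-mestyp.
ENDLOOP.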

Similar Messages

  • Error While loading the data into PSA

    Hi Experts,
    I have already loaded the data into my cube, but it doesn't have values for some fields. So I modified the data in the flat file again and tried to load it into the PSA. But while starting the InfoPackage, I got an error saying:
    "Check Load from InfoSource    
    Created       YOKYY  on   20.02.2008   12:44:33 
    Check Load from InfoSource , Packet IP_DS_C11
    Please execute the mail for additional information.
    Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Extractor .
    Refer to the error message.
    Procedure
    How you remove the error depends on the error message.
    Note
    If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
    Please help me with this.
    With Regards,
    Yokesh.

    Hi,
    After editing the file, did you save and close it?
    This error can occur if your file was open at the time of the request.
    Also, did you check the file path settings?
    If everything is correct, try saving the InfoPackage once and loading again.
    Thanks,
    JituK

  • Error while loading the data into the cube using DTP.

    No SID found for the value 'UL' of characteristic 0BASE_UOM.
    Please give me an idea of how to solve this error.
    Thanks & Regards
    Syam Prasad Dasari

    https://forums.sdn.sap.com/click.jspa?searchID=23985990&messageID=6733764
    https://forums.sdn.sap.com/click.jspa?searchID=23985990&messageID=5062570

  • While loading master data into PSA, the error is "Error from PSA"

    Hi All,
    While loading master data into the PSA, the error is "Error from PSA".
    Could anybody please help in this regard?
    Thanks,
    Sridhar.

    Hi Sridhar,
    It may be that the IDocs were not processed completely.
    This is an IDoc problem: either wait until the timeout and process the IDoc from the detail monitor screen, or go to BD87 and process the IDocs with status = YELLOW (be careful when processing IDocs from BD87; choose only the relevant IDocs).
    Cheers
    Raj

  • While loading transaction data into a cube, what tables are generated

    Hi,
    While loading transaction data into a cube, which tables are normally generated?

    Hi,
    Normally the data is loaded into the F fact table (/BIC/F<cube name>).
    When you compress the requests, the data is moved into the E fact table (/BIC/E<cube name>); see the small sketch at the end of this reply.
    Regards,
    Siva.
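    A minimal ABAP sketch, assuming a hypothetical cube named ZSALES, that compares the row counts of the two fact tables; the table names simply follow the /BIC/F<cube> and /BIC/E<cube> convention described above:
    " Hedged sketch: compare uncompressed (F) and compressed (E) fact table
    " row counts; ZSALES is an illustrative cube name.
    DATA: lv_tab    TYPE tabname,
          lv_f_rows TYPE i,
          lv_e_rows TYPE i.
    lv_tab = '/BIC/FZSALES'.
    SELECT COUNT( * ) INTO lv_f_rows FROM (lv_tab).
    lv_tab = '/BIC/EZSALES'.
    SELECT COUNT( * ) INTO lv_e_rows FROM (lv_tab).
    WRITE: / 'F table (uncompressed requests):', lv_f_rows,
           / 'E table (compressed data):', lv_e_rows.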

  • Error while loading the data from PSA to Data Target

    Hi to all,
    I'm facing an error while loading the data to the data target.
    Error :  Record 1 :Value 'Kuldeep Puri Milan Joshi ' (hex. '004B0075006C0064006500650070002000500075007200690
    Details:
    Requests (messages): Everything OK
    Extraction (messages): Everything OK
    Transfer (IDocs and TRFC): Errors occurred
          Request IDoc : Application document posted
          Info IDoc 2 : Application document posted
          Info IDoc 1 : Application document posted
          Info IDoc 4 : Application document posted
          Info IDoc 3 : Application document posted
          Data Package 1 : arrived in BW ; Processing : Data records for package 1 selected in PSA - 1 er
    Processing (data packet): Errors occurred
          Update PSA ( 2462  Records posted ) : No errors
          Transfer Rules ( 2462  -> 2462  Records ) : No errors
          Update rules ( 2462  -> 2462  Records ) : No errors
          Update ( 0 new / 0 changed ) : Errors occurred
          Processing end : Errors occurred
    I'm totally new to this issue. Please help me solve this error.
    Regards,
    Saran

    Hi,
    I think you are facing an invalid character issue.
    This issue can be resolved by correcting the error records in the PSA and updating them into the target. The first step is to confirm that all the records are in the PSA; you can check this from the Details tab in RSMO, the job log, or the PSA (sorting the records by status). Once that is confirmed, force the request to red and delete that particular request from the target cube. Then go to the PSA and edit the incorrect records (correcting or blanking out the invalid entries in the particular field/InfoObject of each incorrect record) and save. Once all the incorrect records are edited, go to RSA1 > PSA, find the particular request and update it to the target manually (right-click on the PSA request > Start update immediately).
    I will add the step-by-step procedure to edit PSA data and update it into the target (request based) below.
    In your case the error message says Error: Record 1 : Value 'Kuldeep Puri Milan Joshi '. You just need to convert this to capital letters in the PSA and reload.
    Edit the field to KULDEEP PURI MILAN JOSHI in the PSA and push it to the target.
    Identifying the incorrect records:
    The system won't show all the incorrect records the first time; you need to search the PSA table manually to find all of them.
    1. First check RSMO > Details > expand the update rules / processing tabs and you will find some of the error records.
    2. Then go to the PSA and filter on the status of the records. Filter all the red records. This may still not show all the incorrect records.
    3. Then go to the PSA and filter the incorrect records based on the particular field.
    4. If this also doesn't work, go to the PSA and sort (not filter) the records based on the particular field with incorrect values and it will show all of them. Note down the record numbers and then edit them one by one.
    If you want to confirm, find the PSA table and search it manually.
    Also run the report RS_ERRORLOG_EXAMPLE. With this report you can display all the incorrect records and also find out whether the error occurred in the PSA or in the transfer rules.
    Steps to resolve this:
    1. Force the request to red in RSMO > Status tab.
    2. Delete the request from the target.
    3. Go to RSMO > at the top right you can see the PSA maintenance button > click it to go to the PSA.
    4. Edit the record(s).
    5. Save the PSA data.
    6. Go to RSA1 > PSA > search by request name > right-click > update the request from the PSA to the target.
    Refer how to Modify PSA Data
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/40890eda-1b99-2a10-2d8b-a18b9108fc38
    This should solve your problem for now.
    As a long-term fix you can apply a user exit on the source system side, change your update rules so that this field is cleaned up before it is loaded into the cube (see the routine sketch at the end of this reply), or add that particular character to the permitted character list in BW:
    RSKC --> type ALL_CAPITAL --> F8 (Execute)
    OR
    Go to SE38 and execute the program RSKC_ALLOWED_CHAR_MAINTAIN and give ALL_CAPITAL or the char you want to add.
    Check the table RSALLOWEDCHAR. It should contain ALL_CAPITAL or the char you have entered.
    Refer
    /people/sap.user72/blog/2006/07/23/invalid-characters-in-sap-bw-3x-myths-and-reality-part-2
    /people/sap.user72/blog/2006/07/08/invalid-characters-in-sap-bw-3x-myths-and-reality-part-1
    /people/aaron.wang3/blog/2007/09/03/steps-of-including-one-special-characters-into-permitted-ones-in-bi
    http://help.sap.com/saphelp_nw04/helpdata/en/64/e90da7a60f11d2a97100a0c9449261/frameset.htm
    For adding Other characters
    OSS note #173241 – “Allowed characters in the BW System”
    Thanks,
    JituK
    Edited by: Jitu Krishna on Mar 22, 2008 1:52 PM
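    A minimal sketch of the kind of routine cleanup mentioned above, written in the style of a 3.x transfer/update routine; TRAN_STRUCTURE-NAME and RESULT are illustrative parameter/field names (not the poster's actual objects) and the permitted list mirrors the BW default characters:
    " Hedged sketch: convert the value to upper case (ALL_CAPITAL approach)
    " and blank out anything still outside the permitted character list.
    CONSTANTS lc_allowed(60) TYPE c
      VALUE ' !"%&''()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ'.
    DATA lv_value(60) TYPE c.
    lv_value = tran_structure-name.
    TRANSLATE lv_value TO UPPER CASE.
    WHILE lv_value CN lc_allowed.      "sy-fdpos = offset of first bad character
      lv_value+sy-fdpos(1) = space.    "blank it out
    ENDWHILE.
    result = lv_value.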

  • Caller-70 Error while loading master data into infoobject

    Hi,
    I am getting the following error while loading master data into an InfoObject (0tb-account). I am loading this master data in the production environment for the first time. There are about 300,000 records. All of them have been loaded up to the PSA. The InfoPackage setting was PSA and then into the data target.
    Short dump in the Warehouse
    Diagnosis
    The data update was not finished. A short dump has probably been logged in BI. This provides information about the error.
    System Response
    "Caller 70" is missing.
    ST22 dump analysis is as below:
    Termination occurred in the ABAP program "GP476CZYBEF2WX53UZ8TXFG6XOS" - in                  
    "VALUE_TO_SID_CONVERT_DB".                                                                  
    The main program was "RSMO1_RSM2 ".
    Please help as soon as you can; this is a production problem.
    Regards
    Rakesh

    Hi Rakesh,
    It may be that the IDocs were not processed completely.
    This is an IDoc problem: either wait until the timeout and process the IDoc from the detail monitor screen, or go to BD87 and process the IDocs with status = YELLOW (be careful when processing IDocs from BD87; choose only the relevant IDocs).
    Cheers
    Raj

  • Unable to load the data into HFM

    Hello,
    We created a new HFM app and configured it with FDM, generated the output file through FDM, and loaded that file through HFM directly 5-6 times; there was no issue up to this point.
    Then I loaded the file through FDM 4 times successfully, even for different months. But after those 4 loads I started getting an error. The error log is below.
    Please help us at the earliest.
    ** Begin fdmFM11XG6A Runtime Error Log Entry [2013-10-30-13:44:26] **
    Error:
    Code............-2147217873
    Description.....System.Runtime.InteropServices.COMException (0x80040E2F): Exception from HRESULT: 0x80040E2F
       at HSVCDATALOADLib.HsvcDataLoadClass.Load(String bstrClientFilename, String bstrClientLogFileName)
       at fdmFM11XG6A.clsFMAdapter.fDBLoad(String strLoadFile, String strErrFile, String& strDelimiter, Int16& intMethod, Boolean& blnAccumFile, Boolean& blnHasShare, Int16& intMode)
    Procedure.......clsHPDataManipulation.fDBLoad
    Component.......E:\Opt\Shared\Apps\Hyperion\Install\Oracle\Middleware\EPMSystem11R1\products\FinancialDataQuality\SharedComponents\FM11X-G6-A_1016\AdapterComponents\fdmFM11XG6A\fdmFM11XG6A.dll
    Version.........1116
    Identification:
    User............fdmadmin
    Computer Name...EMSHALGADHYFD02
    FINANCIAL MANAGEMENT Connection:
    App Name........
    Cluster Name....
    Domain............
    Connect Status.... Connection Open
    Thanks,
    Raam

    We are working with the DB team, but they have confirmed that there is no issue with the TB. The process we have followed:
    As a standard process, while loading the data from FDM or manually to HFM, we don't write any SQL query. Using the web interface, the data is loaded into the HFM application. This data can be viewed with different reporting tools (Smart View (Excel), HFR reports, etc.).
    There are no official documents on the Oracle website that talk about the INSERT SQL queries used to insert data into HFM tables. Hyperion does not provide much detail on the internal tables used, nor much insight into the internal structure of the HFM system.
    As per Hyperion blogs/forums on the internet, HFM stores the base-level data in so-called DCE tables (for example EMHFMFinal_DCE_1_2013, where EMHFMFinal is the application name, 1 identifies the scenario and 2013 the year). Each row in a DCE table contains data for all periods of a given combination of dimensions (also called an intersection).
    We are trying to load the same data file with the Replace option (it should delete the existing data before loading the data file).

  • Error loading the data into ODS - Message no. BRAIN060?

    Hi,
    I am getting the following error while loading data from a flat file. The data loaded successfully from the flat file to the PSA, but I got the following error while updating the data from the PSA to the data target:
    Value '010384 javablue' (hex. '30003100300033003800340020006A0061007600610062006C') of characteristic 0PO_NUMBER contains invalid characters
    Message no. BRAIN060
    Diagnosis
    The following standard characters are valid in characteristic values as default:
    !"%&''()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ
    Characteristic values are not allowed if they only consist of the character "#" or begin with "!". If the characteristic is compounded, this also applies to each partial key.
    You are trying to load the invalid characteristic value 1. (hexidecimal representation 30003100300033003800340020006A0061007600610062006C).
    I am trying to load the value '010384 javablue' into the 0PO_NUMBER InfoObject for the ODS, within one row along with some other data.
    Any idea or any input to resolve this issue?
    Thanks in advance for any input.
    Steve

    Thanks Soumya. I have maintained the upper-case letters, but I am loading mixed upper and lower case into the PO number and it is not working. What is the solution to this? If I set the lowercase property on the PO number InfoObject, it won't accept upper case; if I uncheck the lowercase flag, it won't accept lower-case letters. I cannot add both the upper- and lower-case letters in RSKC, because it accepts up to 72 characters and I already have more than 60 characters (special characters, numbers and the 26 upper-case letters).
    I have already tried a transfer routine, but that can only convert to lower or to upper case, which doesn't work for us. We need both upper and lower case for the PO number, and R/3 accepts it. Why doesn't BW accept both?
    Any idea what can be done?
    Thanks in advance for your help.
    Steve

  • Records were deleted in the R/3 system - getting an error while loading the data

    Hi All,
    We are extracting data from 2LIS_12_VCITM, 2LIS_11_VASCL, 2LIS_11_V_ITM, 2LIS_13_VDITM and 2LIS_11_VAITM into cube RSD_C03. But one record (a sales order) was deleted in the R/3 system, so we get the error message "Caller 09 contains an error message" while loading the data from 2LIS_12_VCITM.
    We have replicated the DataSource and activated the transfer rules, but it was no use; we are still getting the same error.
    When we execute the DataSource 2LIS_12_VCITM in RSA3 with update mode D (transfer of the deltas since the last request), we get the error message "Errors occurred during the extraction",
    and we get zero records in RSA3 for 2LIS_12_VCITM with update mode F (transfer of all requested data).
    We are also getting the short dump TSV_TNEW_PAGE_ALLOC_FAILED on the BW side.
    Could anyone kindly suggest how to rectify this error?
    Thanks in Advance,
    Shaliny. M

    Hi Lilly,
    I activated the source system myself and replicated my DataSource.
    Then I activated the update rules for the ODS and the InfoCube.
    Then I tried to load the data into the InfoCube from the ODS.
    Again I am getting the same problem.
    Why is that?
    Please tell me.
    Rizwan

  • How to write a procedure to load the data into a table using xml file as input to the procedure?

    Hi,
    I am new to XML.
    Can anyone please help me write a procedure that loads data into a table using XML as an input parameter? The XML file I receive as input is shown below.
    <?xml version="1.0"?>
    <DiseaseCodes>
    <Entity><dcode>0</dcode><ddesc>(I87)Other disorders of veins - postphlebitic syndrome</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity>
    <Entity><dcode>0</dcode><ddesc>(J04)Acute laryngitis and tracheitis</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity>
    <Entity><dcode>0</dcode><ddesc>(J17*)Pneumonia in other diseases - whooping cough</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity>
    </DiseaseCodes>.
    Regards,
    vikram.

    Here is your XML parsed in 11g:
    select *
      from xmltable('//Entity' passing xmltype
    '<?xml version="1.0"?>
    <DiseaseCodes>
    <Entity><dcode>0</dcode><ddesc>(I87)Other disorders of veins - postphlebitic syndrome</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity>
    <Entity><dcode>0</dcode><ddesc>(J04)Acute laryngitis and tracheitis</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity>
    <Entity><dcode>0</dcode><ddesc>(J17*)Pneumonia in other diseases - whooping cough</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity>
    </DiseaseCodes>
    ') columns
      "dcode" varchar2(4000) path '/Entity/dcode',
      "ddesc" varchar2(4000) path '/Entity/ddesc',
      "reauthflag" varchar2(4000) path '/Entity/reauthflag'
    dcode                                                                            ddesc                                                                            reauthflag
    0                                                                                (I87)Other disorders of veins - postphlebitic syndrome                           0
    0                                                                                (J04)Acute laryngitis and tracheitis                                             0
    0                                                                                (J17*)Pneumonia in other diseases - whooping cough                               0
    SQL>
    Using this parser you can create the procedure as follows (this assumes the target table MyXmlTable already exists with three VARCHAR2 columns matching the projection):
    SQL> create or replace procedure myXMLParse(x clob) as
      2  begin
      3    insert into MyXmlTable
      4      select *
      5        from xmltable('//Entity' passing xmltype(x) columns "dcode"
      6                      varchar2(4000) path '/Entity/dcode',
      7                      "ddesc" varchar2(4000) path '/Entity/ddesc',
      8                      "reauthflag" varchar2(4000) path '/Entity/reauthflag');
      9    commit;
    10  end;
    11 
    12  /
    Procedure created
    SQL>
    SQL>
    SQL> exec myXMLParse('<?xml version="1.0"?><DiseaseCodes><Entity><dcode>0</dcode><ddesc>(I87)Other disorders of veins - postphlebitic syndrome</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity><Entity><dcode>0</dcode><ddesc>(J04)Acute laryngitis and tracheitis</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity><Entity><dcode>0</dcode><ddesc>(J17*)Pneumonia in other diseases - whooping cough</ddesc><claimid>34543></claimid><reauthflag>0</reauthflag></Entity></DiseaseCodes>');
    PL/SQL procedure successfully completed
    SQL> select * from MYXMLTABLE;
    dcode                                                                            ddesc                                                                            reauthflag
    0                                                                                (I87)Other disorders of veins - postphlebitic syndrome                           0
    0                                                                                (J04)Acute laryngitis and tracheitis                                             0
    0                                                                                (J17*)Pneumonia in other diseases - whooping cough                               0
    SQL>
    SQL>
    Ramin Hashimzade

  • Error while writing the data into the file. Can you please help with this?

    I am getting the following error while writing the data into the file.
    <bindingFault xmlns="http://schemas.oracle.com/bpel/extension">
    <part name="code">
    <code>null</code>
    </part>
    <part name="summary">
    <summary>file:/C:/oracle/OraBPELPM_1/integration/orabpel/domains/default/tmp/
    .bpel_MainDispatchProcess_1.0.jar/IntermediateOutputFile.wsdl
    [ Write_ptt::Write(Root-Element) ] - WSIF JCA Execute of operation
    'Write' failed due to: Error in opening
    file for writing. Cannot open file:
    C:\oracle\OraBPELPM_1\integration\jdev\jdev\mywork\
    BPEL_Import_with_Dynamic_Transformation\WORKDIRS\SampleImportProcess1\input for writing. ;
    nested exception is: ORABPEL-11058 Error in opening file for writing.
    Cannot open file: C:\oracle\OraBPELPM_1\integration\jdev\jdev\mywork\
    BPEL_Import_with_Dynamic_Transformation
    \WORKDIRS\SampleImportProcess1\input for writing. Please ensure 1.
    Specified output Dir has write permission 2.
    Output filename has not exceeded the max chararters allowed by the
    OS and 3. Local File System has enough space
    .</summary>
    </part>
    <part name="detail">
    <detail>null</detail>
    </part>
    </bindingFault>

    Hi there,
    Have you verified the suggestions in the error message?
    Cannot open file: C:\oracle\OraBPELPM_1\integration\jdev\jdev\mywork\BPEL_Import_with_Dynamic_Transformation\WORKDIRS\SampleImportProcess1\input for writing.
    Please ensure
    1. The specified output directory has write permission,
    2. the output filename has not exceeded the maximum characters allowed by the OS, and
    3. the local file system has enough space.
    I am also curious why you are writing to a directory with the name "..\SampleImportProcess1\input" ?

  • Error while activating the data into DSO

    Hi
    My base DSO is used to load 4 other data targets.
    In the process chain, after the base DSO gets activated, there are 4 DTPs running to load the data from the base DSO to another DSO and 3 cubes.
    When loading to the other DSO, we encountered an error:
    Object is currently locked by BI Remote
    Lock not set for : Activating data in DSO
    Activation of M records terminated.
    1. My question is: when loading the data from the base DSO to other objects, how does the lock mechanism work?
    I know that we cannot load data into the base DSO while the base DSO is sending data to a target.
    2. What difference does it make when loading from the DSO to another DSO and to a cube in parallel?
    Thanks
    Annie

    Hi Annie,
    1. My question is: when loading the data from the base DSO to other objects, how does the lock mechanism work?
    I know that we cannot load data into the base DSO while the base DSO is sending data to a target.
    Do you mean to say that the loading into the 2nd-level DSO was successful, but the activation failed?
    Have you checked in SM12 whether that 2nd-level DSO is somehow locked or not?
    Are any further targets being loaded from this 2nd-level DSO?
    Look, suppose you are loading DSO A, and in the meantime a load starts from DSO A to some other target (it may be a DSO or a cube). Then the activation in DSO A will fail: since the last request in DSO A is not yet activated, that request will not be considered in the subsequent load, and since the load is already in progress the system will not allow any new request to be activated.
    It can also be that DSO A is being loaded from some other sources as well; as long as such a load is still in progress on this target, it will not allow the activation.
    So check this (see the lock-check sketch at the end of this reply) and start the activation again.
    2. What difference does it make when loading from the DSO to another DSO and to a cube in parallel?
    The main difference is that there is no activation concept for a cube, so a cube can be loaded from several sources in parallel.
    A DSO can also be loaded in parallel, but activation should only start once all the loads have completed successfully.
    Regards,
    Debjani....
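    A minimal sketch of how the SM12 check could be done programmatically with the standard function module ENQUEUE_READ, which reads the central lock table. The lock argument /BIC/AZSALES00 (active table of a hypothetical DSO ZSALES) is only an assumption for illustration; SM12 remains the authoritative check:
    " Hedged sketch: look for lock entries before starting the activation.
    DATA lt_enq TYPE STANDARD TABLE OF seqg3.
    CALL FUNCTION 'ENQUEUE_READ'
      EXPORTING
        gclient = sy-mandt
        gname   = '/BIC/AZSALES00'   " table whose lock entries we want (assumed)
        guname  = space              " all users, not just the current one
      TABLES
        enq     = lt_enq.
    IF NOT lt_enq IS INITIAL.
      WRITE: / 'Lock entries found - check SM12 before activating.'.
    ENDIF.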

  • Error while loading the data from text file

    Hi,
    I got an error " Data Value Encountered before all Dimensions selected" while loading the data from the text file.
    Can any one please suggest me the solution.

    Possible Solutions (a sample record layout follows this list):
    Make sure that the data source is valid.
    Is a member from each dimension specified correctly in the data source or rules file?
    Is the numeric data field at the end of the record? If not, move the numeric data field in the data source or move the numeric data field in the rules file.
    Are all members that might contain numbers (such as "100") enclosed in quotation marks in the data source?
    If you are using a header, is the header set up correctly? Remember that you can add missing dimension names to the header.
    Does the data source contain extra spaces or tabs?
    Has the updated outline been saved?
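    For illustration only, a hypothetical free-form record for an outline with dimensions Product, Market, Measures and Year could look like the line below: the member names are invented, the numeric data field comes last, and the member "100" is quoted so it is not treated as a data value.
    "100" "East" "Sales" "Jan" 1500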

  • While loading the data in transaction RSA1 we are getting dump in SCM 5.1

    Hi,
    I am getting a runtime error while loading the data in transaction RSA1.
    The runtime error is GETWA_NOT_ASSIGNED, associated with program /SAPAPO/CL_PDEM_WORKDAYS.
    Please help me out with this.

    Hi Rahul,
    Check whether SAP Note 482494 is applicable to you.
