Transaction Data Load from Infoprovider Problem?

Hi dear colleagues,
I have a problem loading transaction data from an InfoProvider. I created a transformation file and prepared the mapping. My transformation file looks like this:
http://img822.imageshack.us/i/capture1cw.jpg/
Then I run the package. The result looks like this:
http://img291.imageshack.us/i/capture2m.jpg/
And the package status looks like this:
http://img836.imageshack.us/i/53445642.jpg/
As you can see in the third picture, the submit count is 18525, so roughly 18k records should have arrived in my BW cube. Now let's look at the BW side:
http://img513.imageshack.us/img513/5760/capture4o.jpg
As you can see, my BW cube is empty:
http://img137.imageshack.us/img137/6451/capture3dg.jpg
and there is no request in my BW cube.
I have tried to explain my problem as clearly as I can.
What should I do in this situation?
Edited by: Breathe_ on Jan 5, 2011 10:34 AM

Thanks for your answer, nilanjan. But when I check the rejected records and rejected data, I can't find any hint there. I also want to ask one more question:
the selection row in my transformation file looks like this:
SELECTION=0SALESORG,1000;0SALESORG,2000;0SALESORG,3000
Can I write a multiple-value selection for 0SALESORG in a single entry (e.g. 0SALESORG,2000,3000...)? Is there any syntax for that?
Take it easy...
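For reference, the only selection pattern actually confirmed in this thread (and in the SAP Help page linked further down) repeats the dimension once per value, as in the sketch below; its placement in the *OPTIONS section is an assumption here, and a comma-separated shorthand such as 0SALESORG,2000,3000 is not shown anywhere in the thread, so verify it against the SELECTION documentation before relying on it.
*OPTIONS
SELECTION=0SALESORG,1000;0SALESORG,2000;0SALESORG,3000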

Similar Messages

  • Transaction Data Load from BW info provider

    Hi gurus,
    I am doing a transaction data load from the BW data feed to the BPC cube. When I validated the transformation file, the task completed successfully with some skipped records, as expected based on the conversion files.
    ValidateRecords = YES
    [List of conversion file]
    Conversion file: DataManager\ConversionFiles\EXAMPLES\TIMECONV.XLS!CONVERSION
    Conversion file: DataManager\ConversionFiles\EXAMPLES\VERSIONCONV.XLS!CONVERSION
    Conversion file: DataManager\ConversionFiles\EXAMPLES\ACCOUNTCONV.XLS!CONVERSION
    Conversion file: DataManager\ConversionFiles\EXAMPLES\ENTITY.XLS!CONVERSION
    Record count: 25
    Accept count: 13
    Reject count: 0
    Skip count: 12
    This task has successfully completed
    But when I run the package, the load fails.
    /CPMB/MODIFY completed in 0 seconds
    /CPMB/INFOPROVIDER_CONVERT completed in 0 seconds
    /CPMB/CLEAR completed in 0 seconds
    [Selection]
    InforProvide=ZPCAOB01
    TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\EXAMPLES\ABF_TRANS_LOAD.xls
    CLEARDATA= Yes
    RUNLOGIC= No
    CHECKLCK= No
    [Messages]
    Task name CONVERT:
    No 1 Round:
    Error occurs when loading transaction data from other cube
    Application: TEST Package status: ERROR
    This is a fresh system and we are doing the data load for the first time. We are using BPC NW 7.5 with SP4.
    Is there something we are missing that has to be performed before starting the load for the first time?
    My transformation file is as below:
    *MAPPING
    Account=0ACCOUNT
    Currency=0CURRENCY
    DataSrc=*NEWCOL(INPUT)
    Entity=ZPCCCPLN
    ICP=*NEWCOL(ICP_NONE)
    Scenario=0VERSION
    Time=0FISCPER
    SIGNEDDATA=0AMOUNT
    *CONVERSION
    TIME=EXAMPLES\TIMECONV.XLS
    SCENARIO=EXAMPLES\VERSIONCONV.XLS
    ACCOUNT=EXAMPLES\ACCOUNTCONV.XLS
    ENTITY=EXAMPLES\entity.xls
    Thanks a lot in advance.
    Regards
    Sharavan

    Hi Gersh,
    Thanks for the quick response.
    I checked in SLG1 and I have the below error in the log:
    Class: CL_UJD_TRANSFORM_PROXY:CONSTRUCTOR Log:DATATYPE 3
    Class: CL_UJD_TRANSFORM_PROXY:CONSTRUCTOR Log:CURRENT_ROUND 1
    Error occurs when loading transaction data from other cube
    Message no. UJD_EXCEPTION137
    We are on BPC NW 7.5, SP4. Please advise.
    Regards
    Sharavan.

  • Selection in data load from infoprovider

    Hi Guys,
    In BPC NW 7.5 we have to load data from an InfoProvider while allowing users to select data (for example on the TIME dimension) through a BPC prompt. We found two solutions that partially solve our problem; in both cases users have to modify the selections manually.
    SOLUTION 1:
    To load data we want to use the process chain /CPMB/INFOPROVIDER, but we know that it is not possible to insert selections in the /CPMB/INFOPROVIDER prompt (as described in this post: Re: Package LOAD INFOPROVIDER, Select input ENTITY).
    To select data we can use an intermediate InfoCube as a BW workaround (as described in this post: Re: BPC 7.5: Delta Load when loading from BI InfoProvider) so that the source contains only the selected data. This could be done with a selection in the DTP between the source InfoCube and the intermediate InfoCube. This solution is not dynamic; in this case users have to modify the DTP selection manually.
    How can we allow users to insert this selection in the DTP through a BPC prompt?
    SOLUTION 2:
    To select data we can insert a selection into the transformation file, such as
    SELECTION = <Dimension1_techname>,<Dimension1_value>.
    This is not dynamic either; in this case as well, users have to modify the file selection manually.
    Do you know how to drive these selections from a BPC prompt and avoid the manual changes?
    Do you know other solutions?
    Thank you for your support.

    Hi D-Mark,
    This is definitely a place where it would be nice to see some additional functionality added to BPC. Variable replacement in the transformation file based on the data manager prompt would probably be the best thing to have in the software.
    In any case, getting back to your question, manually modifying the transformation file selection is the most common practice on BPC projects. The blog linked by Naresh is a fairly elegant way to do this, though it doesn't completely get around the fact that it's easy to forget to do and easy to get confused about what is going on in the transformation file.
    A third option that no one has mentioned is to do a SELECTION statement in the transformation file based on navigational attributes in the source InfoProvider. This approach can make the selection statement dynamic based on the contents of BW InfoObjects. Still not very user-friendly, but if you can put an automatic process in place to update the BW navigational attributes this might meet your need without having to set up an extra BW staging InfoProvider.
    The SELECTION syntax is documented here, though it doesn't mention that you can select on navigational attributes: [http://help.sap.com/saphelp_bpc75_nw/helpdata/en/5d/9a3fba600e4de29e2d165644d67bd1/frameset.htm]
    With navigational attributes (the profit center attribute of cost center, for example) it would be something like:
    SELECTION=0COST_CENTER___0PROFIT_CENTER,PC01
    Ethan
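    Combining the two forms shown in this thread, a multi-condition filter would presumably chain semicolon-separated dimension/value pairs, roughly like the sketch below; the 0VERSION pair is purely illustrative and not taken from the original question, so treat this as a pattern to verify against the SELECTION documentation, not a tested statement.
    SELECTION=0COST_CENTER___0PROFIT_CENTER,PC01;0VERSION,001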

  • Data load from InfoProvider: status is green, but no data is loaded.

    Hi,
    I have created a new application by copying the planning application (which was itself a copy from the ApShell application set). I modified the dimensions, after which I wanted to load data from a BW InfoProvider using the Data Manager.
    The first thing I noticed in the Data Manager was that the packages were not the same as in the original planning application, even though I had chosen to copy all objects to the new application. For example, the package 'Load data from infoprovider' that was contained in the Data Management folder of the planning application is no longer visible in my new application. Instead, there is a package 'Loadinfoprovider' located in the System Administrative folder. Although these packages are supposed to be exactly the same, the package 'Loadinfoprovider' does not seem to work properly. When you run it, it says that it has submitted records to your application, but when you run a report on it, or look at the BPC InfoCube in BW, no data is visible.
    Does anyone know what may cause this problem?
    Edited by: Wouter Vanhoutte on Dec 1, 2009 3:46 PM

    Hi Wouter,
    Did you check your transformation file?
    I had the same problem once; my mistake was that I didn't add a line for my key figure amount (AMOUNT=ZAMOUNT). So the status was green but no data was transferred, because there was no key figure to transform.
    I hope this helps you.
    Mathieu
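    To make that concrete, here is a minimal *MAPPING sketch; the dimension-to-InfoObject pairs are illustrative (borrowed from the transformation file quoted earlier on this page), and the last line is the one that is easy to forget: without a key figure mapping (SIGNEDDATA here, or the measure dimension name in your model) the package finishes green but writes nothing.
    *MAPPING
    ACCOUNT=0ACCOUNT
    TIME=0FISCPER
    SIGNEDDATA=0AMOUNT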

  • Regarding master data and transactional data loading from flat file

    Hi friends,
    Please tell me how to load master data and transactional data from a flat file.
    Thanks in advance ,
    Regards,
    ramnaresh.

    Hi,
    Please use the 'search forum' functionality and search the BI forum with, say, 'flat file loading'. You will get plenty of links to previous threads.
    BR/
    Mathew.

  • Error while doing Transaction data loading from ECC to BW

    HI,
    I faced an error in some data records, so I corrected them manually in PSA maintenance. Now I am trying to schedule the data load: I go to the DataSource, right-click on Manage, select the request that I updated with the corrected records, and then right-click again --> Update with Scheduler. On the next screen, Scheduler (PSA Subsequent Update), I do not get any data target; the data target field is disabled, so I cannot schedule the load.
    Any solution for this error.
    Thanks
    Nilesh Pathak

    Hi,
    I guess you have not deleted the request from the InfoCube 'Manage' screen. If the request has already been updated to your InfoCube/ODS, the data target will not be shown in the PSA scheduler.
    You can try to reload the data using 'Only PSA' in the InfoPackage, then correct the errors and load the data using the Update option from the Manage screen.
    Thanks

  • Enhancing package 'load transactional data for BW infoprovider UI'

    Dear,
    For project reasons, we would like to enhance the package 'Load transactional data for BW InfoProvider UI'. To be more precise, we want to add some dimensions to the data transfer method 'Replace & clear data values'. Currently the system clears the data that matches each entity/category/time/datasource combination. With only those dimensions we cannot be precise enough, so in our case the system clears too much data. We want to be able to extend this with additional dimensions.
    Is there any way this can be adapted?
    thx.

    Hi Wouter,
    for a more precise delete, you should first execute a Clear and afterwards run the import using the Merge option.
    Kind regards
         Roberto

  • Automated data load from APO to SEM transactional cube

    Hi ,
    We have BW-SEM system integrated with APO system.
    I can see automated data loads from APO to the SEM transactional cube.
    The InfoPackage name is "Request loaded using the APO interface without monitor log".
    I don't see any InfoPackage by this name in either system (APO or SEM).
    I am not sure how it is configured.
    I would appreciate any input on how this happens.
    Thanks in advance

    Hi,
    As I mentioned, the starting point will be transaction BPS0. There will be two planning areas created (if I am correct), one for the SEM cube and the other for the APO cube. The best way to find them is to go to transaction SE16, enter table UPC_BW_AREA and key in the cube names in the cube field. This will give you the planning area names. Now look for a multi-planning area which includes these two areas (this is available in table UPC_AREAM).
    Then go to BPS0 and find out which planning function is being used to post the data.
    thanks
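    The same lookup can also be scripted; the rough ABAP sketch below is only illustrative, and the UPC_BW_AREA field name and the cube name are assumptions to verify in SE11 before use.
      " Hypothetical sketch: list the BPS planning areas built on a given cube.
      " The field name INFOCUBE and the cube name ZSEMCUBE are placeholders.
      DATA: lt_areas TYPE STANDARD TABLE OF upc_bw_area,
            lv_count TYPE i.

      SELECT * FROM upc_bw_area INTO TABLE lt_areas
        WHERE infocube = 'ZSEMCUBE'.

      DESCRIBE TABLE lt_areas LINES lv_count.
      WRITE: / 'Planning areas found:', lv_count.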

  • Load from Infoprovider with selection package error

    Hello,
    Requirement: Data transfer from one application to another application in BPC
    Version: BPC 7.5 NW SP08
    Source application: MGMT
    Target Application: LEGAL
    Transformation File: Yes
    Conversion File: Yes, for the Location dimension
    Source records: 103 (The data is for only one Location)
    When I run the Load from InfoProvider with Selection package (/CPMB/LOAD_INFOPROV_UI), 99 records are loaded and 4 records are rejected with an error.
    Error Message: 121 ,41513000,,N,AVER,Y,2011,,1030,1030,,,0.0000000
    Line 121 :Dimension:LOCATION member: convert failed in line 121; rejected
    Analysis: the Location conversion is in place, and the other 99 records succeed with the same conversion rule. I have checked the transformation and conversion files. When I validate the transformation file with data, I see the same error.
    Note: I use the same package to load data from the BW cube into the MGMT application, and in that case the load works without errors. When I load the same data from MGMT to LEGAL, this error comes up.
    I would appreciate any clue about this error.
    Thanks,
    Sri
    Edited by: sribpc on Oct 10, 2011 8:02 PM

    We were able to solve the problem.
    In the Account dimension, for one of the properties we had maintained a value containing a comma. That created bad data and made the system throw the error.
    After changing the Account dimension attribute, the package works fine.
    Thanks for the support.
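    For future loads, a quick consistency check can catch this kind of bad master data before the package runs. The sketch below is hypothetical: the structure ty_attr and the way lt_attrs is filled are placeholders, since the actual BPC dimension master-data tables are generated per environment.
      " Hypothetical sketch: flag property values that contain a comma,
      " i.e. the field delimiter, before running the Data Manager package.
      TYPES: BEGIN OF ty_attr,
               member TYPE c LENGTH 32,   " dimension member ID
               attr   TYPE c LENGTH 32,   " property name
               value  TYPE c LENGTH 60,   " property value to check
             END OF ty_attr.

      DATA: lt_attrs TYPE STANDARD TABLE OF ty_attr,
            ls_attr  TYPE ty_attr.

      " Fill lt_attrs from the dimension's master data (environment specific).
      LOOP AT lt_attrs INTO ls_attr.
        IF ls_attr-value CA ','.   " the value contains the delimiter
          WRITE: / 'Check member', ls_attr-member,
                   'property',     ls_attr-attr,
                   'value',        ls_attr-value.
        ENDIF.
      ENDLOOP.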

  • Transactional data loads PIR, IM stock, Open PO's documentation

    I have to document the process of the transactional data loads.
    The transactional data loads cover purchase info records, IM stock, and open POs.
    The transactional data is live in both SAP and the legacy system, so the two systems have to stay in sync at all stages of this process.
    How do I maintain that? That is the question.
    Please send me any details regarding this.
    thank you
    sridhar

    Check these three things:
    /n/sapapo/CCR
    /n/sapapo/CQ
    Check what type of stock is active in IM, and what type of stock was posted after you created the GR.
    If you still have any problems, let us know.
    My

  • Error in data loading from 3rd-party source system with DBConnect

    Hi,
    We have just finished an upgrade of SAP BW 3.10 to SAP NW 7.0 EHP1.
    After the upgrade, we are facing a problem with data loads from a third party Oracle source system using DBConnect.
    The connection is working OK and we can see the tables in the source system. But we cannot load the data.
    The error in the monitor is as follows:
    'Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System Response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Extractor .
    Refer to the error message.'
    But, unfortunately, the error message has no further information.
    If we look at the job log in sm37, the job finished with the following log -                                                                               
    27.10.2009 12:14:19 Job started                                                                                00           516          S 
    27.10.2009 12:14:19 Step 001 started (program RSBATCH1, variant &0000000000119, user ID RXSAHA)                    00           550          S 
    27.10.2009 12:14:23 Start InfoPackage ZPAK_4FMNJ2ZHNNXC6HT3A2TYAAFXG                                              RSM1          797          S 
    27.10.2009 12:14:24 Element NOAUTHORITYCHECK is not available in the container                                     OL           356          S 
    27.10.2009 12:14:24 InfoPackage ZPAK_4FMNJ2ZHNNXC6HT3A2TYAAFXG created request REQU_4FMXSQ6TLSK5CYLXPBOGKF31G     RSM1          796          S 
    27.10.2009 12:14:24 Job finished                                                                                00           517          S 
    In a BW 3.10 system, there is no  message related to element NOAUTHORITYCHECK. So, I am wondering if this is something new in NW 7.0.
    Thanks in advance,
    Rajib

    Errors like this usually come down to one of the following:
    1. The RFC connection failed.
    2. A problem in the source system.
    3. The Oracle consultants are loading data at the same time; check with them and ask them to stop.
    4. IDoc processing issues.
    5. Memory issues.
    6. The DataSource was changed; in that case reactivate it and then run the load again.
    Also check the RFC connection in SM59. If it is OK, check SAP Note 692195 for the authorization issue.
    Santosh

  • How to find the data loaded from r/3 to bw

    Hi,
    How can I verify that the data loaded from R/3 to BW is correct? I am not able to find which field in the query is connected to which field in R/3, i.e. where the data is coming from in R/3. Is there any way to find out which field and table the data comes from? Please help.
    Thanks in advance to you all.

    Hi Veda ... the mapping between R/3 fields and BW InfoObjects takes place in the transfer rules. Other transformations could take place in the update rules.
    So you could proceed this way: look at the InfoProvider data model and see whether the query performs any calculations (even with virtual key figures/characteristics). Then go back to the update rules and search for other calculations/transformations. Finally there are the transfer rules and possibly DataSource/extraction enhancements.
    As you can easily see, there are many places you have to look at ... it's quite complex work, but very useful.
    Once you have identified all the mappings/transformations, check whether the BW data matches R/3 (taking the calculations into account).
    Good job
    GFV

  • Help Required regarding: Validation on Data Loading from Flat File

    Hi Experts,
    I need your help with the following issue.
    I need to validate the transactional data load from a flat file into the GL cube:
    1) A transactional data record should be loaded into the cube only if a master data record exists for the 0GL_ACCOUNT InfoObject.
    2) If the master data record does not exist, the record should be skipped from the load, and after the load the system should issue a message saying how many records were skipped (if there are any skipped records).
    I would really appreciate your help and suggestions on solving this issue.
    Regds
    Hari

    Hi, write a start routine in the transfer rules like this:
      " Keep only records whose G/L account exists as active master data.
      DATA: l_s_datapak_line TYPE transfer_structure,
            l_s_errorlog     TYPE rssm_s_errorlog_int,
            l_s_glaccount    TYPE /bi0/pglaccount,
            new_datapak      TYPE tab_transtru.

      REFRESH new_datapak.
      LOOP AT datapak INTO l_s_datapak_line.
        " Replace the two placeholders below with the field names of your
        " transfer structure / DataSource for CHRT_ACCTS and GL_ACCOUNT.
        SELECT SINGLE * FROM /bi0/pglaccount INTO l_s_glaccount
          WHERE chrt_accts = l_s_datapak_line-<CHRT_ACCTS field>
            AND gl_account = l_s_datapak_line-<GL_ACCOUNT field>
            AND objvers    = 'A'.
        IF sy-subrc EQ 0.
          APPEND l_s_datapak_line TO new_datapak.   " master data exists: keep it
        ENDIF.
      ENDLOOP.
      datapak = new_datapak.
      IF datapak[] IS INITIAL.
        " ABORT <> 0 means skip the whole data package.
        ABORT = 4.
      ELSE.
        ABORT = 0.
      ENDIF.
    I have already made some modifications, but you can slightly change it to suit your needs.
    regards
    Emil

  • Master Data/transactional Data Loading Sequence

    I am having trouble understanding the need to load master data prior to transactional data. If you load transactional data and there is no supporting master data, when you subsequently load the master data, are the SIDs established at that time, or will they not sync up?
    I feel in order to do a complete reload of new master data, I need to delete the data from the cubes, reload master data, then reload transactional data.  However, I can't explain why I think this.
    Thanks,  Keith

    A different approach is required for different data target scenarios. Below are just two scenarios out of many possibilities.
    Scenario A:
    The data target is a DataStore object with the indicator 'SIDs Generation upon Activation' set in the DSO maintenance,
    and a DTP is used for data loading.
    The following applies depending on the indicator 'No Update without Master Data' in DTP:
    - If the indicator is set, the system terminates activation if master data is missing and produces an error message.
    - If the indicator is not set, the system generates any missing SID values during activation.
    Scenario B:
    The data target has a characteristic that is determined in the transformation rules/update rules by reading master data attributes.
    If the attribute is not available during the data load to the data target, the system writes an initial value to the characteristic.
    When you reload the master data with its attributes later, you need to delete the previous transaction data load and reload it, so that the transformation can re-determine the attribute values written to the characteristics in the data target.
    I hope this helps you understand.

  • Number of parallel process definition during data load from R/3 to BI

    Dear Friends,
    We are using BI 7.00. We have a requirement to increase the number of parallel processes during the data load from R/3 to BI. I want to change this for one particular DataSource and test it. Could the experts provide helpful answers to the following questions?
    1) When a load is taking place, or has taken place, where can we see how many parallel processes that particular load used?
    2) Where should I change the setting for the number of parallel processes for the data load (from R/3 to BI), and not within BI?
    3) How does the system work, and what will be the net result of increasing or decreasing the number of parallel processes?
    Hoping for the experts' help.
    Regards,
    M.M

    Dear Des Gallagher,
    Thank you very much for the useful information provided. The following was my observation.
    From the posts in this forum, I understood that the setting for a specific DataSource can be made at the InfoPackage and DTP level. I did this and found that there was no change in the load, i.e. the system by default uses only one parallel process even though I maintained 6.
    Can you kindly explain the points below?
    1) Even though the value is maintained at the InfoPackage level, does the system consider it or not? If not, from which transaction does the system derive the single parallel process?
    We actually wanted to increase the package size, but we failed because I could not understand which values have to be maintained. Can you explain this in detail?
    Can you clarify my doubts and provide a solution?
    Regards,
    M.M
