Master data loading - Dimension Property value incorrect

Hi,
I have a dimension in BPC which has an amount attribute.
When I am trying to load the master data from BW, the number is truncated and displayed with a * if the number has more than 6 digits.
If this were a problem with the delimiter (comma), then it should not bring in the values even with one separator (e.g. 6,000), but strangely it happens only for 6 digits or more.
e.g. 19,605,500.00 -> *05500.00
This is master data (an estimate field) and is not used for calculations/planning.
I tried the following, but nothing worked.
1) Removing commas using a script in the conversion file (a variant is sketched after this list)
    js:%external%.toString().replace(",","")
2) Removing the delimiter in the transformation file and making comma as decimal point.
DELIMITER =
AMOUNTDECIMALPOINT = ,
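For reference, a minimal variant of the script from attempt 1 (a sketch only, not verified in this thread): JScript's replace with a plain string removes only the first comma, so a global regular expression would be needed to strip them all. The %external% reference follows the original attempt.
    js:%external%.toString().replace(/,/g,"")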
Let me know if you have any ideas to resolve this.
Thanks,
Vasu

Hi Vasu,
The field in your BW is 17,2, which I am assuming is a decimal value 17 digits long with 2 digits after the decimal point. Your property of 40 characters should have plenty of room to fit any value, so something else is wrong. Can you please validate the transformation and conversion files with this data?
I highly recommend validating the transformation and conversion files with the new external data. In my experience, most of the time when there is a problem mapping the external data, the transformation and/or conversion files do not validate successfully, and they generate useful diagnostics.
Regards,
Leila Lappin

Similar Messages

  • Process Dimension after BW to BPC Master Data Load

    Hi All,
    I'm looking for a way to process a dimension after a master data load from BW to BPC (following the corresponding how-to guide in BPC 7.5 NW). The new members are visible from BPC Administrator -> Maintain Dimension Members, but not visible in BPC for Excel after "Refresh Dimension Member".
    New members are visible only if I process the dimension from BPC Administrator, but I want this task to be automatic.
    Any ideas?
    Thanks in advance,
    Enrique

    Thanks for your reply,
    I'm using the standard process chain IMPORT_IOBJ_MASTER in BPC 7.5 NW SP03. I will open an SAP note.
    As a quick solution, I found in SAP Help (it is 7.0 but I suppose it's the same behaviour in 7.5) that there is another process chain (ADMINTASK_MAKEDIM) that can be used to process a dimension: http://help.sap.com/saphelp_bpc70sp02/helpdata/en/36/339907938943ad95d8e6ba37b0d3cd/frameset.htm. It has two parameters ("Dimension Name" and "Input File"). The help says that if you leave the "Input File" field blank, the dimension is processed, but it is a mandatory field in the package.
    I have tried to modify the package parameters, but an error with the MEM_XML_PATH parameter always appears in the process chain.
    Does anybody know how this package/process chain works?
    Thanks again,
    Enrique

  • Master Data load does not extract Hierarchy nodes in BPC Dimension ACCOUNT

    Hi Experts,
    I am performing master data load through standard DM package with Filter selection as:
    1. Chart of Accounts
    2. Hierarchy selection has 4 hierarchy names
    3. Selected Import Text nodes
    4. Selected Set Filters by Attribute OR Hierarchies
    I ran this DM package for a set of data and selections a week ago and it worked fine.
    However, when I run it now, it gives issues:
    it extracts any new GL maintained in the BI system, but it does not extract any hierarchy nodes at all! (I have tested this by deleting the hierarchy nodes and trying to run the master data load.)
    I am running the DM package in Update mode and have the selection as External.
    Any suggestions for checks / has anyone encountered this issue earlier?
    Regards,
    Shweta Salpe

    Hi guys,
    Thanks.
    I found that the issue was with the transformation file where I was maintaining the RATETYPE.
    When I removed the mapping of RATETYPE this works fine (it pulls the nodes of the hierarchies).
    However, now I do not have RATETYPE populated in the system.
    My rate type mapping is:
    RATETYPE=*IF(ID(1:1)=*STR(C) then *STR(TOSKIP);ID(1:1)=*STR(H) then *STR(TOSKIP);ID)
    and in the conversion file I have TOSKIP     *skip
    I have to skip the rate types for the hierarchy nodes, and my hierarchy nodes start with C and H.
    So now that I have removed the mapping for RATETYPE, can anyone suggest a correct way to achieve this? (Note: the above mapping formula was skipping all of the hierarchy nodes starting with C and H.)
    Regards,
    Shweta Salpe

  • BPC:: Master data load from BI Process chain

    Hi,
    we are trying to automate the master data load from BI.
    Now we are using a package with:
    PROMPT(INFILES,,"Import file:",)
    PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
    PROMPT(DIMENSIONNAME,%DIMNAME%,"Dimension name:",,,%DIMS%)
    PROMPT(RADIOBUTTON,%WRITEMODE%,"Write Mode",2,{"Overwrite","Update"},{"1","2"})
    INFO(%TEMPNO1%,%INCREASENO%)
    INFO(%TEMPNO2%,%INCREASENO%)
    TASK(/CPMB/MASTER_CONVERT,OUTPUTNO,%TEMPNO1%)
    TASK(/CPMB/MASTER_CONVERT,FORMULA_FILE_NO,%TEMPNO2%)
    TASK(/CPMB/MASTER_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
    TASK(/CPMB/MASTER_CONVERT,SUSER,%USER%)
    TASK(/CPMB/MASTER_CONVERT,SAPPSET,%APPSET%)
    TASK(/CPMB/MASTER_CONVERT,SAPP,%APP%)
    TASK(/CPMB/MASTER_CONVERT,FILE,%FILE%)
    TASK(/CPMB/MASTER_CONVERT,DIMNAME,%DIMNAME%)
    TASK(/CPMB/MASTER_LOAD,INPUTNO,%TEMPNO1%)
    TASK(/CPMB/MASTER_LOAD,FORMULA_FILE_NO,%TEMPNO2%)
    TASK(/CPMB/MASTER_LOAD,DIMNAME,%DIMNAME%)
    TASK(/CPMB/MASTER_LOAD,WRITEMODE,%WRITEMODE%)
    But we need to include these tasks in a BI process chain.
    How can we add the INFO statement into a process chain?
    And how can we declare the variables?
    Regards,
    EZ.

    Hi,
    I have followed your recommendation, but when I try to use the process /CPMB/MASTER_CONVERT with the parameter TRANSFORMATIONFILEPATH and the path of the transformation file as its value, I have a new problem. The value only allows 60 characters, and my path is longer:
    \ROOT\WEBFOLDERS\APPXX\PLANNING\DATAMANAGER\TRANSFORMATIONFILES\trans.xls
    How can we pass this path?
    Regards,
    EZ.

  • Problem with Master Data Load

    Dear Experts,
    If somebody can help me with the following case, please give me a solution. I'm working on a BI 7.0 project where I needed to delete master data for an InfoObject material. The way I did this was through tcode "S14". After that, I tried to load the master data again, but the process broke and only half of the data was loaded.
    This it is the error:
    Second attempt to write record 'YY99993' to /BIC/PYYYY00006 failed
    Message no. RSDMD218
    Diagnosis
    During the master data update, the master data tables are read to determine which records of the data package that was passed have to be inserted, updated, or modified. Some records are inserted in the master data table by a concurrently running request between reading the tables at the start of package processing and the actual record insertion at the end of package processing.
    The master data update tries to overwrite the records inserted by the concurrently running process, but the database record modification returns an unexpected error.
    Procedure
    •     Check if the values of the master data record with the key specified in this message are updated correctly.
    •     Run the RSRV master data test "Time Overlaps of Load Requests" and enter the current request to analyze which requests are running concurrently and may have affected the master data update process.
    •     Re-schedule the master data load process to avoid such situations in future.
    •     Read SAP Note 668466 to get more information about master data update scheduling.
    On the other hand, the SID table of the master data is empty.
    Thanks for your help!
    Luis

    Dear Daya,
    Thanks for your help, but I had already applied your suggestion. I sent this to OSS with the following details:
    We are on BI 7.0 (system ID DXX)
    While loading master data for InfoObject XXXX00001 (the main characteristic in our system - like material) we are facing the following error:
    Yellow warning "Second attempt to write record '2.347.263' to /BIC/XXXX00001 was successful"
    We are loading the Master data from data source ZD_BW_XXXXXXX (from APO system) through the DTP ZD_BW_XXXXX / XXX130 -> XXXX00001
    The Master Data tables (S, P, X) are not updated properly.
    The following repairing actions have been taken so far:
    1.     Delete all related transactional and master data, checking all relations (tcode SLG1 -> RSDMD, MD_DEL)
    2.     Follow instructions from OSS 632931 (tcode RSRV)
    3.     Run report RSDMD_CHECKPRG_ALL from tcode SE38 (using both check and repair options).
    After deleting all data, the previous tests were ok, but once we load new master data, the same problem appears again, and the report RSDMD_CHECKPRG_ALL gives the following error.
    "Characteristic XXXX00001: error found during this test."
    The RSRV check for "Compare sizes of P and X and/or Q and Y tables for characteristic XXXX00001" is shown below:
    Characteristic XXXX00001: Tables /BIC/PXXXX00001, /BIC/XXXXX00001 are not consistent, 351.196 derivation.
    It seems that our problem is described in OSS 1143433 (SP13), even though we are already on SP16.
    Could somebody please help us, and let us know how to solve the problem?
    Thanks for all,
    Luis

  • R/3 Master Data Load Error

    Hi Experts
    Our master data load from R/3 to BW failed today.
    This is the error message on the Status tab of the failed request:
    Diagnosis
    No IDocs could be sent to the SAP BW using RFC.
    System response
    There are IDocs in the source system ALE outbox that did not arrive in the ALE inbox of the SAP BW.
    Further analysis:
    Check the TRFC log.
    You can get to this log using the wizard or the menu path "Environment -> Transact. RFC -> In source system".
    Removing errors:
    If the TRFC is incorrect, check whether the source system is completely connected to the SAP BW. Check especially the authorizations of the background user in the source system.
    I navigated through Environment -> Transact. RFC -> In source system.
    In R/3 I got the following error:
    timeout during allocate / CPIC-CALL: 'ThS
    #Timeout during connect
    How shall I proceed now?
    Thanks

    Hi,
    Check the RFC connection between R3 and BW. Use transaction SM59 for the same or take help from the BASIS team.
    If that is correct, check whether the EDI partner profiles are active and functioning properly. You can take BASIS help here.
    Cheers,
    Kedar

  • Blank Row in table during Master Data Load

    I am having some success with my master data loads, but when I maintain the master data I have noticed that every table has had a blank row inserted.
    Does anybody know why and what I should do with the row (i.e. delete it)?

    This blank row is created by default and you cannot delete it. Even if you delete it, a new row with blank values will be appended. This is required for technical reasons when reading the table within ABAP programs.
    This is applicable only to SAP tables and may not be required for custom-developed ones unless you want to use them in screen programs.
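    Raj's technical reason is not spelled out here, but one common scenario (a rough illustration with hypothetical object names, not taken from this thread) is that attribute lookups for an initial, "not assigned" characteristic value still find a row in the master data P table instead of failing:
    " Hypothetical P table and attribute column, for illustration only.
    DATA lv_attr TYPE string.
    SELECT SINGLE /bic/zattr
      FROM /bic/pzmaterial
      INTO lv_attr
      WHERE /bic/zmaterial = ''    " the initial value resolves to the blank row
        AND objvers        = 'A'.  " active version
    " Without the blank row, SY-SUBRC would be 4 here for unassigned values.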
    Regards,
    Raj

  • Error in Master data load because of ###

    Hello,
    I am getting an error in a master data load.
    The problem is that ####### symbols are encountered in a few records.
    We have defined a formula to replace the ### with S in the transfer structure.
    But it still does not replace the #### symbols and gives an error.
    However, when I insert the ####### symbols manually in the PSA and try to load, they are replaced per the formula definition and the data loads.
    Suggestions required urgently…
    Thanks in advance
    Ram

    Hello Srikanth,
    Check the code below.
    I have used it in the update rule to replace the #### symbol with A.
    But when I check in debug mode it does not recognize this symbol, and SY-SUBRC is always 4.
    GV_STRING = COMM_STRUCTURE-/BIC/XXX.
    FIND '#' IN GV_STRING.
    IF SY-SUBRC = 0.
      REPLACE ALL OCCURRENCES OF '#' IN GV_STRING WITH 'A'.
    ENDIF.
    * Result value of the routine
    RESULT = GV_STRING.
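    One possible explanation (an assumption on my part, not confirmed in the thread) is that the '#' shown in the PSA is actually a non-printable character such as a tab or line feed, so searching for the literal '#' never matches. A minimal sketch of that idea, using the same hypothetical field name:
    DATA lv_string TYPE string.
    lv_string = COMM_STRUCTURE-/BIC/XXX.   " hypothetical source field
    " Replace the usual culprits behind '#' instead of the literal character.
    REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>horizontal_tab IN lv_string WITH 'A'.
    REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>cr_lf          IN lv_string WITH 'A'.
    REPLACE ALL OCCURRENCES OF cl_abap_char_utilities=>newline        IN lv_string WITH 'A'.
    RESULT = lv_string.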
    Thanks
    Ram

  • 7.5 Hierarchy Master Data Load

    Hi experts,
    I am trying to load material master data with a hierarchy.
    When I was loading my members I applied an attribute filter.
    It loaded OK and loaded the hierarchy nodes ok.
    Then as I went to load the hierarchy it gave me an error saying that "Hierarchy nodes include dimension members that do not exist"
    When I checked through the log I could see that the "missing" members were all the ones that I had filtered out with the attribute filter.
    But it seems the hierarchy load needs them?
    Is there a way to apply the same filter in the hierarchy load? Or any other way around this?
    Or can an attribute filter only be used if you are not loading hierarchies?
    Thanks

    Hi Joergen,
    Thanks so much for your quick reply.
    My problem is not that my nodes don't load. They do actually come through fine in my first master data load.
    But then when I load the hierarchy, BPC complains that the nodes I now want to create as parents have members that don't exist in BPC. These are base members, not nodes.
    For example:
    BW:
    Parent A:
       Child A
       Child B
       Child C
    BPC
    I only load Child A and Child B because Child C does not meet the attribute filter
    When I then load Parent A I get an error that Child C is missing?
    Regardless, I tried your method and it didn't work for me:
    - First I loaded my members only with the attribute filter - no problem.
    - But then I put in a rubbish filter like you said and ticked to load the hierarchy nodes. When I choose the AND filter, nothing is loaded at all; when I choose the OR filter, it loads everything (including the members that I excluded in the first load).
    Did I understand that wrong?
    Thanks
    Sabine

  • Automate Master data loads in BPC7(MS)

    I need to update dimension members in SAP BPC7 (MS) from an external source.
    The source system for my current project is BW, but if you have done any automated master data loads from any other system or even from a "temp" table please post your solution or tips.
    Thanks!
    Mez

    Hi Mezert,
    Would like to chime in with some of the things I've learned about this process over 3-4 years I've spent with BPC/Outlooksoft:
    -- Automating maintenance of the dimension/master data might be the biggest single weakness of the BPC solution on the MS platform.  It's one of the big reasons that the Netweaver platform basically replaces DTS/SSIS and has more direct linkage to BI/BW cubes and master data (InfoCubes).
    -- It is also a very common issue and in fact, you will have to deal with what the old-timers in BI call "slowly-changing dimensions" at almost every client eventually.  For instance, you have a list of products/materials that every so often someone adds an item to.  ...Well, before you can enter anything for it in BPC, you have to add the dimension member, of course.
    -- So far, this challenge has been met by bringing in specialist DTS/SSIS experts in most cases who can employ the special "Custom Tasks" and build one of these packages to run in an automated way to populate the tables inside SQL Server and then execute the Admin MakeDim task to process the cube/application.
    -- Usually the development/testing/deployment of these solutions is a non-trivial undertaking.  It can be done, but there are peculiar challenges.
    -- One example of a challenge is what happens if you need a dimension with > 65,000 members?  Well, it can be created and populated by a DTS package, however, when you check the package and the custom task from the old Outlooksoft stuff out, you see that it CLEARS the entire dimension table before re-filling it.  ...Yep, it wouldn't allow an "Append" function to run. 
    -- A second example of a peculiar challenge is that unless you update the Excel Dimension Sheet, you will have to do all the updates in SQL Server Mgmt. Studio and really won't be able to touch the BPC Admin again for that dimension.
    Apologies for such a long post, but this is a fairly significant issue and I have to do a lot of research to piece together some of these things.  ...If anyone else has something to add as a best-practice, please post.

  • Bi admin process chains-  content master data load has failed

    Hi Experts,
    I worked on the BI Admin Cockpit set-up and all the things are going fine,
    but the Content master data load (Attribute) has failed and I got errors like:
    1) Data records for package 1 selected in PSA - 2 errors
    2) Error 4 in the update
    Can anyone suggest how I can solve the issue?
    Regards,
    Mrudula

    Hi
    Yes, I got 2 errors in the PSA.
    When I click the 1st error I get:
    Data record 10023 & with the key
    'EBTCLNT100ACGRAZ_BWRPRT_USR_FBW_Ptest0001 &' is invalid in value
    'EBTCLNT100ACGRAZ_BWRPRT_USR_FBW_attribute/characteristic 0TCTBWOBJECT &.
    And it gave the procedure as:
    If this message appears during a data load, maintain the attribute in the PSA maintenance screens. If this message appears in the master data maintenance screens, leave the transaction and call it again. This allows you to maintain your master data.
    When I click the 2nd error,
    it shows data record 10024 and the remaining explanation is the same as the 1st one.
    I am not getting how to proceed with a solution.
    Please help.
    Thanks and regards
    Mrudula

  • Master Data Loading for Prices and Conditions in CRM - "/SAPCND/GCM"

    Hi,
    Could anyone give me some inputs on Master Data Loading for Prices and Conditions in CRM.
    T. Code is:  /SAPCND/GCM
    I need to load data on a file (extracted from 4.6) for service contracts.
    I tried LSMW: for this transaction, recording does not work.
    I am trying to load through IDocs (LSMW), but that too is not really working.
    Do we require some custom development for this, or is some SAP standard functionality available?
    Can anyone provide some valuable inputs on this?
    Would appreciate your responses.

    Hi Tiest,
    Thanks for responding.
    You are right, our client is upgrading from 4.6 to ECC.
    So as per the client's requirements, we are maintaining all the configs for Services in CRM.
    The Services data which was in 4.6 is being pulled out to flat files which need to be loaded into CRM, so middleware would not be able to do this.
    What I am looking for is some standard upload program.
    LSMW recording does not work.
    With the IDoc "CRMXIF_COND_REC_SLIM_SAVE_M", I am able to load a single record, but I am not able to find out how to make this work for multiple entries.
    In the standard approach for loading master data through IDocs, we map the values to the standard fields which are available in that IDoc.
    But in this particular IDoc, there is a common field for which I need to define the field name and a field value.
    Till now, I am only able to define just one field name and a field value.
    I want this to work for multiple entries.
    Hope you get my point.
    Thanks

  • BPC NW 10.0 - Master data load

    Hey guys, have you tried the automatic master data load in BPC 10.0?
    Would it be possible to perform a delta load for master data?
    When you do a delta load, will the existing dimensions and properties be overwritten, or will new members and properties just be added as another column/row?
    Please provide your thoughts.
    Thanks a million

    By automatic do you mean scheduled? If yes, you can schedule the 'Import Master Data Attrib and Text InfoObj' data manager package. The second prompt will ask whether the process should 'Overwrite' or 'Update' (you can also modify the advanced script if you want to always update). Depending on which properties you are populating with your transformation file, those will be updated with what's in the source InfoObject. Assume, for example, the Mapping section of the transformation file is blank: if you have ID '123' with description 'Test' in your dimension and ID '123' with description 'Test - changed' in BW, then when the update runs the description will be changed from 'Test' to 'Test - changed'. All other IDs will be added to the dimension. Hope this makes sense.
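    For reference, a minimal transformation file along the lines this reply assumes (a rough sketch; the exact options depend on your source and should be checked against the file generated by BPC) would keep the Mapping section empty so attributes come through as-is:
    *OPTIONS
    FORMAT = DELIMITED
    *MAPPING
    *CONVERSION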

  • Master Data Load for New Attribute

    Hi Users,
    We had to implement a separate load flow for a new field coming from R/3. This field was to be added to an existing master data object.
    I added a new display attribute to the existing 0GL_ACCOUNT master data object.
    This new attribute, along with some other existing fields, is getting data from another master data object with an InfoSource in between, because two transformations cannot be created for the same source and target.
    When I load the data I don't see data being populated for this new field. I did ACR, checked the keys, etc.
    The source object has data, but after executing the DTP no data comes to this attribute. No routines or anything.
    Please suggest
    Regards
    Zabi

    Hi,
    The situation is:
    Field x from the source maps to
    1. field y (which was an existing field) and also to
    2. field z, which is the new attribute.
    Field y has to get updated for company code 10, for example,
    and field z for company code 30.
    Now, if I use the same flow and map field x to both y and z, then overwriting happens; if 10 does not have a value for x and 20 does, that's not good.
    So if I use a separate flow with an InfoSource, I will map only x to z. After the loads, this means that for code 10, if no value went to y in the first DTP, and code 30 has a value for x, then only z will be updated and y remains empty...

  • How do we improve master data load performance

    Hi Experts,
    Could you please tell me how we identify master data load performance problems and what can be done to improve master data load performance.
    Thanks in Advance.
    Nitya

    Hi,
    - Alpha conversion is defined at InfoObject level for objects with data type CHAR.
    A characteristic in SAP NetWeaver BI can use a conversion routine such as the routine called ALPHA. A conversion routine converts data that a user enters (in the so-called external format) to an internal format before it is stored in the database.
    The most important conversion routine - due to its common use - is the ALPHA routine, which converts purely numeric user input like '4711' into '004711' (assuming that the characteristic value is 6 characters long). If a value is not purely numeric, like '4711A', it is left unchanged (see the sketch after this list).
    We have found that in customers' systems there are quite often characteristics using a conversion routine like ALPHA that have values in the database which are not in internal format; e.g. one might find '4711' instead of '004711' in the database. It can even happen that there is also a value '04711', or ' 4711' (leading space).
    This possibly results in data inconsistencies, also for query selection; i.e. if you select '4711', this is converted into '004711', so '04711' won't be selected.
    - The check for referential integrity occurs for transaction data and master data if they are flexibly updated. You determine the valid InfoObject values.
    - SID generation is a must when loading transaction data with respect to master data, to call master data at BEx level.
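    As a small illustration of the ALPHA behaviour described above (a sketch only; the 6-character length just mirrors the example in this reply), the standard conversion exit can be called directly in ABAP:
    DATA lv_value TYPE c LENGTH 6 VALUE '4711'.
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
      EXPORTING
        input  = lv_value
      IMPORTING
        output = lv_value.
    " lv_value is now '004711'; a non-numeric value such as '4711A' is left unchanged.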
    Regards,
    rvc
