BW to BPC Actuals Data Load

Hello Experts,
Here is the scenario I need your views on.
1) I am trying to load Actuals data from BW to BPC, and there is one InfoObject whose display attribute's value needs to be loaded to a dimension.
Currently I am trying a static conversion file in which I have given the InfoObject value and its attribute value in the external and internal columns respectively.
MAPPING
Dimension = BW Info Object.
Can anybody tell me if there is a better workaround for this situation?
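For reference, a static conversion file for this kind of lookup typically lists the BW InfoObject value in the EXTERNAL column and the attribute value in the INTERNAL column, one row per member (the member IDs below are invented for illustration):

```
EXTERNAL      INTERNAL
CC1000        REGION_NA
CC2000        REGION_EU
*             *
```

The drawback of the static approach is that the file must be regenerated whenever the attribute values change in BW.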
2) The second issue is that when I try to validate the transformation file against the BW InfoProvider, it gives an error saying: "Validation with data file failed".
I have tried giving selections for record values where no field has a blank value in the cube. I am not sure what the reason is.
Regards,
Kavita

Hi,
Transformation mapping based on a display attribute may not work. As per the note below, navigational attributes are not supported in lower versions, but you can give it a try if you are on BPC 10.
                         1455947 - Transformation mapping to a Navigational Attribute
Thanks,
Raju

Similar Messages

  • BPC - Consolidation - Data Loading - Will BS/PL accounts data ONLY be loaded from ECC?

    Dear All,
    In BPC, when we load data from ECC for Consolidation, my understanding is that we load only BS and PL accounts' data for/by the entity.
    Apart from BS and PL data, will there be any data that will have to be loaded into BPC for consolidation?
    The following three financial statements -
    -Statement of Cash Flow
    -Statement of Changes in Equity
    -Statement of Comprehensive Income
    are actually derived/calculated from the loaded BS and PL data. This is my understanding; please correct me if I am wrong.
    Thank you!
    Regards,
    Peri

    Hi Peri,
    Balance sheet, PL and those three financial statements are derived from BS/PL accounts; however, there should also be "flow" information. Otherwise you won't end up with a correct consolidated cash flow or equity movement (or you can choose to enter flow detail manually).
    Second, while getting BS & PL accounts, you will also need trading partner detail; otherwise you won't be able to do the eliminations (or you can choose to enter trading partner detail manually for intercompany accounts).
    Third, you should also consider other disclosures (depending on which standard you are implementing: IFRS, US GAAP, local GAAP, and so on).
    Hope this gives an idea.
    Mehmet.

  • Process Dimension after BW to BPC Master Data Load

    Hi All,
    I'm looking for a way to process a dimension after a master data load from BW to BPC (following the corresponding How-To guide for BPC 7.5 NW). The new members are visible from BPC Administrator -> Maintain Dimension Members, but not visible in BPC for Excel after "Refresh Dimension Members".
    New members are visible only if I process the dimension from BPC Administrator, but I want this task to be automatic.
    Any ideas?
    Thanks in advance,
    Enrique

    Thanks for your reply,
    I'm using the standard process chain IMPORT_IOBJ_MASTER in BPC 7.5 NW SP03. I will open an SAP note.
    As a quick solution, I found in the SAP Help (it is for 7.0, but I suppose the behaviour is the same in 7.5) that there is another process chain (ADMINTASK_MAKEDIM) that can be used to process a dimension: http://help.sap.com/saphelp_bpc70sp02/helpdata/en/36/339907938943ad95d8e6ba37b0d3cd/frameset.htm. It has two parameters ("Dimension Name" and "Input File"). The help says that if you leave the "Input File" field blank, the dimension is processed, but it is a mandatory field in the package.
    I have tried to modify the package parameters, but an error always appears with the MEM_XML_PATH parameter in the process chain.
    Does anybody know how this package/process chain works?
    Thanks again,
    Enrique

  • BPC and Data Loads

    Hi all,
    I'm sure I've read an article showing how to automate transactional data loads from BW into BPC. Does anyone know where I can find the document?
    I would also like to know what the function is called; I was sure it was load_infprovider.

    All of the BPC related How-to Guides are here:
    https://wiki.sdn.sap.com/wiki/display/BPX/Enterprise+Performance+Management+%28EPM%29+How-to+Guides
    You can read about the standard Data Manager package to [import transaction data from an InfoProvider|http://help.sap.com/saphelp_bpc70sp02/helpdata/en/af/36a94e9af0436c959b16baabb1a248/content.htm] in the SAP help documentation at
    http://help.sap.com
    SAP Business User > EPM Solutions > Business Planning and Consolidation
    [Jeffrey Holdeman|https://www.sdn.sap.com/irj/sdn/wiki?path=/display/profile/jeffrey+holdeman]
    SAP BusinessObjects
    Enterprise Performance Management
    Regional Implementation Group

  • BPC:: Master data load from BI Process chain

    Hi,
    we are trying to automate the master data load from BI.
    Now we are using a package with:
    PROMPT(INFILES,,"Import file:",)
    PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
    PROMPT(DIMENSIONNAME,%DIMNAME%,"Dimension name:",,,%DIMS%)
    PROMPT(RADIOBUTTON,%WRITEMODE%,"Write Mode",2,{"Overwrite","Update"},{"1","2"})
    INFO(%TEMPNO1%,%INCREASENO%)
    INFO(%TEMPNO2%,%INCREASENO%)
    TASK(/CPMB/MASTER_CONVERT,OUTPUTNO,%TEMPNO1%)
    TASK(/CPMB/MASTER_CONVERT,FORMULA_FILE_NO,%TEMPNO2%)
    TASK(/CPMB/MASTER_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
    TASK(/CPMB/MASTER_CONVERT,SUSER,%USER%)
    TASK(/CPMB/MASTER_CONVERT,SAPPSET,%APPSET%)
    TASK(/CPMB/MASTER_CONVERT,SAPP,%APP%)
    TASK(/CPMB/MASTER_CONVERT,FILE,%FILE%)
    TASK(/CPMB/MASTER_CONVERT,DIMNAME,%DIMNAME%)
    TASK(/CPMB/MASTER_LOAD,INPUTNO,%TEMPNO1%)
    TASK(/CPMB/MASTER_LOAD,FORMULA_FILE_NO,%TEMPNO2%)
    TASK(/CPMB/MASTER_LOAD,DIMNAME,%DIMNAME%)
    TASK(/CPMB/MASTER_LOAD,WRITEMODE,%WRITEMODE%)
    But we need to include these tasks into a BI process chain.
    How can we add the INFO statement into a process chain?
    And how can we declare the variables?
    Regards,
    EZ.

    Hi,
    I have followed your recommendation, but when I try to use the process /CPMB/MASTER_CONVERT with the parameter TRANSFORMATIONFILEPATH and the path of the transformation file as its value, I have a new problem. The value can only hold 60 characters, and my path is longer:
    \ROOT\WEBFOLDERS\APPXX\PLANNING\DATAMANAGER\TRANSFORMATIONFILES\trans.xls
    How can we pass this path?
    Regards,
    EZ.

  • BPC YTD data load

    I am filtering the data extraction to Jan 2010 only in the application, and in the BI cube only Jan 2010 data exists.
    But the BPC YTD balance report shows the Jan 2010 actual amounts, plus a Feb 2010 amount that is the inverted amount of Jan 2010.
    When I check the BI/BW cube for BPC, there is no amount for Feb 2010.
    How can I ensure this Feb 2010 amount is not shown in the BPC report when no such data was loaded to BPC? Is there some flag or configuration that has to be changed?
    Please share your thoughts...

    It was a reporting issue, which has been resolved.

  • Parallel Data Loading in Direct Mode

    Product: ORACLE SERVER
    Date written: 1999-08-10
    Parallel data loading in direct mode
    =======================================
    SQL*Loader supports parallel data loading into the same table in direct mode. Loading data in direct mode from several sessions at once improves the load speed for large volumes of data. The effect is even greater when the data files are placed on physically separate disks.
    1. Restrictions
    - Loading is only possible into tables that have no indexes.
    - Only APPEND mode is possible (replace, truncate and insert modes are not supported).
    - The Parallel Query option must be installed.
    2. Usage
    Create a control file for each data file to be loaded, then run the loads:
    $sqlldr scott/tiger control=load1.ctl direct=true parallel=true&
    $sqlldr scott/tiger control=load2.ctl direct=true parallel=true&
    $sqlldr scott/tiger control=load3.ctl direct=true parallel=true
    3. Constraints
    - If the ENABLE parameter is used, constraints are enabled automatically after all the data loading has finished. However, the enable step can fail, so you must always check the constraint status afterwards.
    - If a primary key or unique key constraint is present, re-enabling it automatically after the load can take a long time because the index has to be created. For performance it is therefore preferable to load only the data in parallel direct mode and then create the indexes separately in parallel.
    4. Storage allocation and caveats
    When data is loaded in direct mode, the following steps are performed:
    - A temporary segment is created based on the STORAGE clause of the target table.
    - After the last load has finished, the empty (unused) part of the last allocated extent is trimmed.
    - The header information of the extents belonging to the temporary segment is changed, and the high-water mark (HWM) is adjusted, so that the extents become part of the target table.
    This extent allocation method causes the following problems:
    - A parallel data load does not use the INITIAL extent allocated when the table was created.
    - The normal extent allocation rules are not followed: each process allocates an extent of the size defined in NEXT and starts loading, and when a new extent is needed it is allocated based on the PCTINCREASE value, calculated independently per process.
    - Severe fragmentation can occur.
    To reduce fragmentation and allocate storage efficiently:
    - Create the table with a small INITIAL of about 2-5 blocks.
    - In version 7.2 and later, specify the storage parameters in the OPTIONS clause, preferably giving INITIAL and NEXT the same size:
    OPTIONS (STORAGE=(MINEXTENTS n
    MAXEXTENTS n
    INITIAL n K
    NEXT n K
    PCTINCREASE n))
    - If the OPTIONS clause is written in the control file, it must come after the INTO TABLE clause.
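    For illustration, a minimal control file for one of the parallel sessions above might look like this (table, column and file names are invented for the example):

    ```
    -- load1.ctl: one slice of the parallel direct-path load
    -- (parallel direct loads require APPEND mode)
    LOAD DATA
    INFILE 'emp1.dat'
    APPEND
    INTO TABLE emp
    FIELDS TERMINATED BY ','
    (empno, ename, sal)
    ```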


  • Parallel data loading

    Hi,
    I am in need of some help. I am currently designing a new extractor for transactional data that needs to be able to handle a high volume (> 1 million) of records.
    We have a function module that already fetches the records in the desired structure. So I could use this FM as the extractor.
    However, this FM is not the most performant. For this reason we have a pre-fetch FM that, based on the selection criteria, pre-fetches the data and puts it in a buffer. The first FM I mentioned then reads from the buffer instead of the DB.
    So, I would need to call this pre-fetch FM once during initialization, and for the record fetching I would use the other FM, right?
    Now, I saw that I can set up the BW system so that it will load data in parallel.
    Imagine I create an InfoPackage in which I define as selection options that students 1 - 100.000 need to be loaded. I start the data loading. What selection criteria are passed to the extractor? 1 - 100.000, right?
    If 3 parallel threads are started, the initialization is done by only 1 request, right?
    The problem I am then facing is that the buffering might take X minutes and the buffer is bypassed, as the needed records are not in it yet.
    I am not sure how to do this properly. Can anyone advise?
    Thanks.
    Jeroen

    First, thanks for the hints. In the meanwhile I found some other documentation regarding my issue.
    As far as I understand, if I want to load in parallel, I have to create multiple InfoPackages and split up the records in the selection criteria, e.g.:
    - InfoPackage 1, Students 1 - 10.000
    - InfoPackage 2, Students 10.001 - 20.000
    ...and so on.
    Following that I need to create a Process Chain that starts loading all packages at the same point in time.
    Now...when the extractor is called, there are two parts that it runs through:
    - Initialization of the extractor
    - Fetching of records
    ( via flag i_initflag in the extractor ).
    In the initialization I want to run the pre-fetch module; I have worked out everything regarding that already. Only when the pre-fetch is finished will the actual data loading start.
    What I am not sure about is: is this flag (the i_initflag mentioned above) passed for each InfoPackage that is started?
    Jeroen

  • Data load from BPC into BI 7 cubes for Bex reporting

    Hi,
    Can we pull BPC planning data into BI cubes so that BEx reports can be created for combined data from BPC cubes and BI cubes?
    Also, can we create BEx reports on BPC cubes just like we create them for BI reporting cubes?
    Let me also give an example of the scenario we face.
    We have actuals data in a BI 7 basic cube (the Actuals cube). Planning is done in BPC, so I understand that the planned data gets automatically stored in a BPC cube in the BI 7 system. I want to load the data from this BPC cube to the BI Actuals cube and create a combined BEx report to show actuals and plan data.
    Please let me know.
    Thanks,
    Archana

    As of now, if you report data in the BPC 7 NW cubes through BEx, you may not get the same result as you get from BPC reports. The reason is that BEx won't do the same calculations (for example, sign reversals) that the BPC client does. In addition, you won't be able to report the results of dimension member formulas through BEx. If you have to get data out of BPC cubes into other cubes, you can go through the SQE (shared query engine); that way, your results will match what you get in BPC reports. Once you get the data out through SQE into other cubes, you can use BEx for reporting if you wish.
    More functionality will be available in the near future to accomplish BEx reporting on BPC data.
    Regards
    Pravin

  • BPC NW 7.0: Data Load: rejected entries for ENTITY member

    Hi,
    when trying to load data from a BW InfoProvider into BPC (using UJD_TEST_PACKAGE and process chain scheduling), a number of records are being rejected due to missing member entries for the ENTITY dimension in the application.
    However, the ENTITY members actually do exist in the application. Also, the dimension is processed with no errors. The dimension members are also visible using the Excel client navigation pane for selecting members.
    The error also appears when kicking off the data load from the Excel client for BPC. Any ideas how to analyze this further or resolve it?
    Thanks,
    Claudia Elsner

    Jeffrey,
    this question is closely related to the issue, because there is also a short dump when trying to load the data into BPC. I am not sure whether both problems are directly related, though:
    Short dump with UJD_TEST_PACKAGE
    Problem description from the post:
    When running UJD_TEST_PACKAGE, I get a short dump:
    TSV_TNEW_PAGE_ALLOC_FAILED
    No more storage space available for extending an internal table.
    Other keywords are CL_SHM_AREA and ATTACHUPDATE70.
    When I searched the notes, I found Note 928044 - "BI lock server". Looking at the note and debugging UJD_TEST_PACKAGE leaves me with some questions:
    1. Do I need a BI lock server?
    2. Should the enque/table_size setting be increased on the central instance from 10000 to 25000 or larger?
    Claudia

  • Automate data load from BPC cube to BW cube

    Hi Gurus,
    I've got all my budgeting & forecasting data in a BPC cube. Now I need to load it to a BW cube, combine it with Actuals in another BW cube through a MultiProvider, and build reports on the MultiProvider.
    My question is:
    What is the best way to automate the loading of BPC cube data to a BW cube?
    I should also be able to load the property values of BPC dimensions to the BW InfoObjects.
    The methods I have tried are:
    1. Run the "Export" data package to load BPC data to a CSV file, then run a BW DTP/InfoPackage to process the CSV file into the BW cube. The problem with this is that I cannot automate these two steps, and even if I did, I cannot export property values to a flat file.
    2. Build transformations directly from the BPC cube to an InfoSource and from the InfoSource to the BW cube. The problem with this is that in the transformations I cannot use the rule "Read Master Data". I may have to write a routine, but my ABAP is not good enough.
    Please help with an alternative solution
    Thanks,
    Venkat

    Thanks for the reply. I know I would have more options if I were on BPC 7.5 NW, but I'm on BPC 7.0 NW.
    I managed to pull the attribute values of BPC dimensions using routines. But it's still risky, because the technical names of BPC objects may change if someone modifies the dimensions or runs optimization.
    It's a shame SAP hasn't provided a robust solution for loading BPC cube data into a BW cube.
    I don't want to load BW cube data into the BPC cube and depend on EVDRE reports for my 'Plan vs Actual' reports. I (and the end users) always lean towards BEx reports.

  • When we get the actual data in BPC for planning, is the data consolidated

    I had a question regarding planning. When we get the actual data into BPC for planning, is the data consolidated? Do we need to run the consolidation business rules (IC booking, matching, IC eliminations) on the data before we can use it for planning? Per my understanding, we have to run currency conversion on it, correct?
    Also, where do I get my actuals data from? Does the ECC/source system have data for all entities (CHQ, region and countries)?
    Please help!
    Thanks in advance

    Hi Kimi,
    In a hierarchical structure, the data is always loaded at the base level and is automatically rolled up to its parents.
    The currency conversion also takes place at the base level, and, as mentioned earlier, the converted data is also rolled up as per the hierarchy.
    The heading of this thread says planning, so ideally there won't be any legal (or statutory) consolidation. You might use US elimination, however, for eliminating intercompany transactions, if any.
    The planning can be zero-based (the user enters the planned data manually from scratch) or non-zero-based (the planned data of the previous year is copied and the user changes it as required).
    The flow of events is not something we can prescribe; it has to be discussed with the business to understand how they do their planning.
    Hope this helps.

  • Master Data loading in BPC 7.5 with prefix

    Hi Gurus,
    I need help with master data loading in BPC 7.5.
    Currently we are loading GL accounts with a prefix of A and the controlling area (COA1):
    BI: 0000600000
    BPC: ACOA10000600000
    I have maintained the following logic in the transformation file (Z001 represents the hierarchy node):
    ID=*IF(ID(1:4)=*STR(Z001) then *STR(A)+ID;*STR(A)+0CO_AREA+ID)
    Now the client has come up with a new requirement as below:
                               BI                   BPC
    GL Acc               0000600000     A_600000
    GL Node             Z00113            Z001_13
    GL Node             Z00213            Z002_13
    For the above requirement I maintained the following logic in the transformation file. When running the master data load package, I got this error:
    Line 755: Command failed; end position is out of record index
    ID=*IF(ID(1:4)=*STR(Z001) then *STR(Z001_)+ID(5:12);*STR(A_)+ID(5:12))
    I need your help modifying the logic in the transformation file to handle the above requirement, and also to handle the two hierarchies Z001 and Z002.
    Thanks
    Mahesh

    Hi,
    Your command should look like:
    ID=*IF(ID(1:4)=*STR(Z001) then *STR(Z001_)+ID(5:6);*STR(A_)+ID(5:10))
    Hope this helps.
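    If chained conditions are supported in your BPC version, one mapping along these lines might handle both hierarchies (a sketch only; please verify with a test run of the package):

    ```
    ID=*IF(ID(1:4)=*STR(Z001) then *STR(Z001_)+ID(5:6);ID(1:4)=*STR(Z002) then *STR(Z002_)+ID(5:6);*STR(A_)+ID(5:10))
    ```

    The key point is that the substring end positions must stay within the record length: ID(5:6) for the 6-character node IDs and ID(5:10) for the 10-character GL accounts, which is why the original ID(5:12) raised "end position is out of record index".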

  • How we can automate the data loading from BI-BPC

    Dear Gurus,
    Thanks for watching this thread. My question is:
    How can we load data from BI 7.0 to BPC? My environment is SAP BI 7.0, and BPC 7.5 MS version with SQL Server 2008.
    How can we automate the data loading from BI to BPC in the MS version? Is a manual flat file load mandatory in the MS version?
    Thanks in Advance,
    Srinivasan.

    Here are some options:
    1) Use standard packages and schedule them:
        A) Open Hub the master data into a flat file / the BPC application server and schedule the package "Import Master Data from a Data File" and other relevant packages.
    2) Use custom tasks in custom packages (SSIS):
    Procedure
    From the Microsoft SQL Server Business Intelligence Developer Studio, open the Microsoft SSIS folder.
    Create a new package, or select an existing package to modify.
    Choose Task -> Register Custom Task.
    In the Task Location field, browse for the target .dll file.
    Note: by default, the .dll files are stored in BPC/Websrvr/bin.
    Enter a task description, select an appropriate icon, then click OK.
    Drag the icon to the designer window. Enter data as required.
    Save the package.

  • "Semantic Group" in BPC data loading?

    Hi All:
    I have a Data Manager package that loads data from BW to BPC. During the loading, I use a UJD routine to do the calculation of expenses from cost centers. The issue is that the data volume for one individual cost center is huge, so those records can arrive in multiple packages. I would like to know if there is a mechanism in BPC similar to the "Semantic Group" in BW, so that I can put all records for one cost center into one package.
    Thanks,
    Sean

    Sean,
    You can increase the default package size (the default is 50,000 records) to include all the records for that cost center.
    Nikhil
