BPC:NW - Best practices to load Transaction data from ECC to BW
I have a very basic question about loading G/L transaction data into BPC for a variety of purposes. It would be great if you could point me toward best practices/standard ways of building such interfaces.
1. For Planning
When we are planning cost center expenses and need to make variance reports against the budgets, what would be the source InfoCube/DSO for loading the data from ECC via BW, if the application is -
YTD entry mode:
Periodic entry mode:
What difference does it make to use the 0FI_GL_12 DataSource versus the 0FIGL_C10 cube or the 0FIGL_O14 or 0FIGL_D40 DSOs?
Based on the data entry mode of the planning application, what is the best way to use the 0BALANCE or debit/credit key figures on the BI side?
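For illustration, in YTD mode I would expect the transformation file to map the signed data to the cumulative balance key figure, while in periodic mode the period movement (debit/credit) would be needed instead. This is only a sketch of my understanding, not a working file:

```text
*MAPPING
SIGNEDDATA=0BALANCE
```

For periodic mode, would SIGNEDDATA instead come from the debit/credit key figures, together with the CREDITPOSITIVE option?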
2. For consolidation:
Since we need trading partner information, what are the best practices for loading the actual data from ECC?
What are the typical mappings to be maintained between movement types and flow dimensions?
I have seen multiple threads with different responses, but I am looking for the best practices and the scenarios you use to load such transactions from the OLTP system. I would really appreciate some functional insight into these scenarios.
Thanks in advance.
-SM
For planning, please take a look at the SAP Extended Financial Planning rapid-deployment solution: G/L Financial Planning module. This RDS captures best-practice integration from BPC 10 NW to SAP G/L. The RDS (including content and documentation) is free to licensed customers of SAP BPC, and it leverages the 0FIGL_C10 cube mentioned above.
https://service.sap.com/public/rds-epm-planning
For consolidation, please take a look at the SAP Financial Close & Disclosure Management rapid-deployment solution. This RDS captures best-practice integration from BPC 10 NW to SAP G/L, and the RDS (including content and documentation) is also free to licensed customers of SAP BPC.
https://service.sap.com/public/rds-epm-fcdm
Note: You will require an SAP ServiceMarketplace ID (S-ID) to download the underlying SAP RDS content and documentation.
The RDS documentation discusses the how and why of best-practice integration. You can also contact me directly at [email protected] for consultation.
We are also in the process of rolling out updated free training (2015) on these two RDS. Please register at the link below and you will be sent an invite.
https://www.surveymonkey.com/s/878J92K
If the link is inactive at some point after this post, please contact [email protected]
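Regarding the movement-type-to-flow question above: that mapping is typically maintained as a conversion file on the flow dimension, converting G/L movement types to consolidation flows. An illustrative sketch only - the movement-type values below are hypothetical, the flow members (F00/F20/F30/F99) follow the starter-kit convention, and the actual assignments are configuration-specific:

```text
EXTERNAL	INTERNAL
100	F00
120	F20
130	F30
*	F99
```

The wildcard row sends any unmapped movement type to the closing flow; the RDS documentation mentioned above covers the delivered design in detail.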
Similar Messages
-
Load transaction data from ECC to BPC 10.1 using US GAAP starter kit SP3
I need to understand a bit more about the process of loading transactional data into BPC 10.1. We have US GAAP Starter Kit SP3. Below is a screenshot from the config guide:
It explains how transactional data can be extracted from the ECC system into the SAP NetWeaver BW DataSource 0FI_GL_10 and then transformed to be loaded into BPC. The objects /PKG/FC_C01 and /PKG/FC_DS01 are mentioned here only as a reference, because they come with the RDS for Financial Close and Disclosure Management, which not all companies have.
I believe the upward data flow should be from DataSource 0FI_GL_10 to DataStore Object 0FIGL_O10 and then to InfoCube 0FIGL_R10. There is also a data flow that goes from 0FI_GL_10 directly to InfoCube 0FIGL_R10. Can anyone with experience with the US GAAP starter kit confirm this?
Thank you.
Hello, we were able to load actuals into our environment with the US GAAP Starter Kit SP03. I followed the Operating Guide up to section 5.2 and ran the Consolidation Data Manager Package with no issue. We are using the A20 Input and A20 Consolidation process flows based on flows F00, F10, F20, F30, F99, etc. According to the documentation, the Statement of Cash Flow and Changes in Equity should be automatically calculated and available from the changes in balance sheet accounts once consolidation is completed. However, when I run the corresponding reports, they bring back no data.
We loaded actual data for the whole of 2013 and January 2014. Our intention is to run the first consolidation for January 2014. The closing balance for period 12.2013 in flow F99 was copied to flow F00 of 2014 (opening balance). Flows F20 and F30 were updated with the corresponding balance sheet movements (increases/decreases), according to the delivered controls in the starter kit. However, the cash flow still shows no results.
I found the following text in the Operating Guide, but I am not clear whether I am missing a step. Can you please clarify? Is this telling me that I need to copy my 01.2014 opening balance (F00) to the 12.2013 closing balance (F99, which is done by the Copy Opening data manager package, but in the opposite direction, from Y-1 F00 to Y F99) and, in addition, also copy that balance to 12.2013 F00 (the previous year's opening balance)?
"5.2.2 First Consolidation
When operating the consolidation for a given Scope for the first time in the application, it is necessary to populate and process the prior year-end time period for this Scope, in addition to the first required consolidation period. This is because the flows dedicated to the scope change analysis are properly populated by the consolidation engine provided that at least one automatic journal entry has been detected in the consolidation used as opening consolidation, which by default is the prior year end (December, Y-1). To accomplish this, it is recommended that you copy all input-level opening data of the first required consolidation (flow “F00”) into the closing data (“F99”) and opening data (“F00”) of the prior year-end time period, including intercompany breakdowns.
This breakdown by intercompany in the balance sheet triggers automatic eliminations, which are carried over to the closing position. After the consolidation process, the closing data of the prior year-end time period will contain the appropriate source data for the opening data of the following consolidation, notably the opening automatic journal entries." -
Hi everyone,
Does anyone know if it is possible to load transaction data from ECC (FI, CO, HR, MM...) into BPC 10 directly? There is a document titled "How to Implement Delta Loading of Transaction Data from ECC and BW 7.3 into BPC using Delta Initialization in BPC 10 NW", but it only covers loading data from BW.
Thanks & Regards,
Fernando.
Hi Fernando,
BPC NW 10.0 has DM packages that allow loading master and transaction data directly from an ECC system, but this is very limited, i.e. only a couple of extractors are supported right now.
Regards,
Gersh -
BPC 7.5 NW: Loading transaction data from infocube
Hello, I am trying to load transaction data from an infocube into a BPC application using a package based on /CPMB/LOAD_INFOPROVIDER.
The master data (cost center and cost element) is already loaded. As these InfoObjects are compounded, we have added the CO area to the key (e.g. ID=0CO_AREA+ID).
I am facing an issue where the system adds leading zeros in front of the values of the master data coming from the cube.
So the data looks like CO_AREA00000000000cost_center.
As the data was loaded into the dimensions without the padding (CO_AREAcost_center), the loading fails.
How can we remove those leading zeros?
Thanks!
Hi Anton,
We use this in most of our projects; since auditing is one of the important requirements.
You should enable auditing for selected categories only. A log is kept for each transaction data entry, which can increase the data size a lot and hence impact performance.
If you enable the log, you should also purge the audit data frequently; take a backup of it before purging, however.
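Separately, back to the original leading-zeros question: one common workaround is a conversion file on the compounded dimension that strips the zero padding with a JavaScript expression in the internal column. A sketch only, assuming a 4-character controlling area prefix (adjust the substring lengths to your actual compounded key):

```text
EXTERNAL	INTERNAL
*	js:%external%.substring(0,4)+%external%.substring(4).replace(/^0+/,"")
```

Here %external% is the incoming value (e.g. CO_AREA00000000000cost_center); the expression keeps the controlling area and drops the zero padding before the cost center ID.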
Hope this helps. -
"Error occurs when loading transaction data from other model" - BW loading into BPC
Hi Experts,
I'm having a problem with my data loading from BW, using the standard Load InfoProvider Selections data manager package.
If I run it for a period without data it succeeds (with a warning), but if there is data to be extracted I get the following error:
Task name CONVERT:
No 1 Round:
Error occurs when loading transaction data from other model
model: AIACONS. Package status: ERROR
As it runs OK when there is no data, it appears something is preventing the movement of data out of the cube itself, rather than a validation issue.
Has anyone encountered similar or have any ideas as to the problem?
Best,
Chris
Hi Vadim,
It's not specific to the transformation file, as I have tried others for the same BW cube and get the same result.
We get a warning when we try to validate the transformation file:
"Error occurs when loading transaction data from other model".
This only appears in the validation pop up and doesn't throw up any warnings about the transformation file itself. The validation log says:
Validate and Process Transformation File Log
Log creation time: 3/7/2014 16:09
Result of validation of the conversion file: SUCCESS
Result of validation of the conversion file with the data file: FAIL
Validation option: ValidateRecords = NO
Message: Error occurs when loading transaction data from other model
Reject list: (empty)
I can't find any errors anywhere else.
Best,
Chris -
Can I load transactional data from a generic DataSource without having loaded master data before?
Hello,
I want to load transactional data from a generic DataSource.
I just want to know whether it would be successful, since the 0MATERIAL master data that I loaded into the PSA could not be loaded into the InfoObject (000000XXXXXXX -> material number error) by the DTP (see thread BI - BI general - User arnaud).
If I can load the transactional data without having loaded the master data into the InfoObject, is it possible that it also loads the master data simultaneously?
Thank you very much for helping
AR.
Loading transactions before master data can result in data elements not being populated in the InfoProvider in some cases; hence the "Best Practice" recommendation to load master data first.
For example, if you have update/transformation rules that read master data to get additional data to enhance the transactional data, or you map master data attributes during the transaction load, the master data attributes could be missing values if you don't load master data first. -
Error occurs when loading transaction data from other cube
Hi Gurus,
I'm currently working on a transformation file for loading transactional data from BW to BPC, but an error message is displayed: "Error occurs when loading transaction data from other cube". I have already checked the permissions for my user, double-checked my transformation file and the dimensions, and created all the conversion files needed, but the message has not changed.
Can anybody help me to solve this problem?!
Thanks a lot & Best Regards,
HH
Hi,
Here are the transformation file and the conversion file. I have already tested both with a different InfoCube and they work, but not for the one needed.
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = ,
AMOUNTDECIMALPOINT = .
SKIP = 0
SKIPIF =
VALIDATERECORDS=YES
CREDITPOSITIVE=YES
MAXREJECTCOUNT=
ROUNDAMOUNT=
*MAPPING
Category=*NEWCOL(ACTUAL)
P_0BASE_UOM=0BASE_UOM
P_0BUS_AREA=0BUS_AREA
P_0COSTCENTER=0COSTCENTER
P_0FUNDS_CTR=*NEWCOL(null)
P_0GL_ACCOUNT=0ACCOUNT
P_0LOC_CURRCY=*NEWCOL(MXN)
P_0MATL_TYPE=*NEWCOL(null)
P_0VENDOR=0VENDOR
P_DataSrc=*NEWCOL(UPLOAD)
P_ZMATERIAL1=0MATERIAL
P_ZMATERIAL2=*NEWCOL(null)
P_ZMATL_CLASS=*NEWCOL(null)
P_ZMATL_TESP=*NEWCOL(null)
P_ZRATIO=*NEWCOL(KF_inpmdInt)
Time=0CALMONTH
SIGNEDDATA=0TOTALSTCK
*CONVERSION
Time=Time_conv.xls
EXTERNAL INTERNAL
201101 2011.JAN
201102 2011.FEB
201103 2011.MAR
201104 2011.APR
201105 2011.MAY
201106 2011.JUN
201107 2011.JUL
201108 2011.AUG
201109 2011.SEP
201110 2011.OCT
201111 2011.NOV
201112 2011.DEC
Thank you for taking a glance at the files.
Best Regards,
HH -
Error while loading transactional data from NW BW InfoProvider
Hi,
I am trying to load the transactional data using the delivered "Transactional data from NW BW InfoProvider" package and am getting the error message "Error occurs when loading transaction data from other cube".
Below is the transformation file content
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = TAB
AMOUNTDECIMALPOINT = .
SKIP = 0
SKIPIF =
VALIDATERECORDS=NO
CREDITPOSITIVE=YES
MAXREJECTCOUNT= 9999999999999
ROUNDAMOUNT=
*MAPPING
ACCOUNT = 0ACCOUNT
BUSINESSTYPE = *NEWCOL(NOBTYPE)
BWSRC = *NEWCOL(R3)
COMPANYCODE = 0COMP_CODE
DATASRC = *NEWCOL(R3)
FUNCTIONALAREA = *IF(0FUNC_AREA = *STR() then *STR(NOFA);0FUNC_AREA)
GAME = *NEWCOL(NOGAME)
INPUTCURRENCY = *IF(0CURRENCY = *STR() then *STR(NOCURR) ; 0CURRENCY)
PROBABILITY = *NEWCOL(NOPROB)
PRODUCT = *NEWCOL(NOPROD)
PROFITCENTER = 0PROFIT_CTR
PROJECT = *NEWCOL(NOPROJECT)
TIME = 0FISCPER
VERSION = *NEWCOL(REV0)
WEEKS = *NEWCOL(NOWEEK)
SIGNEDDATA= 0AMOUNT
*CONVERSION
Below is the Error Log
/CPMB/MODIFY completed in 0 seconds
/CPMB/INFOPROVIDER_CONVERT completed in 0 seconds
/CPMB/CLEAR completed in 0 seconds
[Selection]
InforProvide=ZPCA_C01
TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\R3_TD_P&L.xls
CLEARDATA= Yes
RUNLOGIC= No
CHECKLCK= Yes
[Messages]
Task name CONVERT:
No 1 Round:
Error occurs when loading transaction data from other cube
Application: ACTUALREPORTING Package status: ERROR
Has anyone had this issue before, or is there something that needs to be looked at in the BW-delivered task?
Any input will be appreciated.
Sanjay
Hello Guru,
Currently I am getting the below error while loading cost center master data from BW to BPC:
Task name MASTER DATA SOURCE:
Record count: 189
Task name TEXT SOURCE:
Record count: 189
Task name CONVERT:
No 1 Round:
Info provider is not available
Application: ZRB_SALES_CMB Package status: ERROR
Can anybody tell me if I have missed anything?
Regards,
BI NEW
Edited by: BI NEW on Feb 23, 2011 12:25 PM -
Problem loading transactional data from 0MKT_DSO1 (ODS) to 0MKTG_C01
Hi,
I am trying to load lead transaction data into the standard CRM lead management cube from the ODS. There is a problem while loading transaction data from 0MKT_DSO1 (ODS) to the InfoCube 0MKTG_C01, because the field 0STATECSYS2 (CRM status) is set to 10 in the ODS, meaning an incorrect transaction. This field does not exist in the InfoCube.
There is a routine in the cube that deletes data records with 0STATECSYS2 set to 10.
This field is not coming in with the transaction.
So, where can I see the master data in the CRM source system, and why is that field getting set to 10?
thanks in advance!
Thanks for the reply.
I have checked the fact table, which shows:
1. Package dimension
2. Time dimension
3. Unit dimension
I have kept 0CALDAY as the time characteristic.
I loaded sample data from the ODS to the cube.
Sample data in ODS.
Sales order No_____0CALDAY_____AMOUNT
800001___________12/02/2009____15
I have loaded this data in Cube with Full Upload.
Data in Cube.
Sales order No_____0CALDAY_____AMOUNT
800001___________12/02/2009____15
Again i am loading the same data to cube
Data in cube after loading.
Sales order No_____0CALDAY_____AMOUNT
800001___________12/02/2009____15
800001___________12/02/2009____15
The data is duplicated and it is not cumulating.
Am I missing anything here?
Pls help..
Thanks,
Siva. -
Error occurs when loading transaction data from other model
Hello Experts, I am trying to validate my transformation file and I can see peculiar behaviour: whether the transformation file is incomplete or complete with all the mappings, I get the same error as above.
The options, mapping, and conversion sections all validate successfully, and then the above error is thrown.
Incomplete Transformation File
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = ,
AMOUNTDECIMALPOINT = .
SKIP = 0
SKIPIF =
VALIDATERECORDS=YES
CREDITPOSITIVE=YES
MAXREJECTCOUNT= 10
ROUNDAMOUNT=
*MAPPING
CUSTOMER = *NEWCOL (NO_CUST)
Validating the transformation files
Validating options...
Validation of options was successful.
Validating mappings...
Validation of mappings was successful.
Validating conversions...
Validation of the conversion was successful
Creating the transformation xml file. Please wait...
Transformation xml file has been saved successfully.
Begin validate transformation file with data file...
[Start test transformation file]
Validate has successfully completed
ValidateRecords = YES
Error occurs when loading transaction data from other model
Validation with data file failed
I get the same error with the complete transformation file as well. Please let me know where I am making a mistake, or whether it is a system error.
Thanking you
Praveen
Hi,
With
*MAPPING
CUSTOMER = *NEWCOL (NO_CUST)
you want CUSTOMER to receive a fixed string "NO_CUST"?
If so use,
*MAPPING
CUSTOMER = *STR (NO_CUST) -
Query on loading data from ECC to BI
Hello BW guys,
I kept one responsibility in my resume as "Loaded the data from ECC to BI using initialization with delta update and full update methods". What does it mean? Can you please help me out?
Thanks
Amarnath K
Very simple.
This is somewhat like how you updated your resume by transferring content from another one and doing a fake update to it. The difference is that ECC does this through an authorized route to update the data in BI.
PS: If you provide the name of the client you are going to join, it will be easy for us to explain to them directly.
~Andrew
Edited by: Andrew J on Dec 18, 2009 3:44 PM -
Best Practices for Loading Master Data via a Process Chain
Currently, we load attributes, texts, and hierarchies before loading the transactional data, all in one meta chain. Loading the master data takes more than two hours; most of it is full loads. We've noticed that a lot of the master data, especially texts, has not changed or has changed very little since we implemented 18 months ago. Is there a precedent or best practice to follow, such as removing these processes from the chain? If so, how often should they be run? We would really like to reduce the master data loading time. Is there any documentation I can refer to? What are other organizations doing to reduce the time it takes to load master data?
Thanks!
Debby
Hi Debby,
I assume you're loading master data from a BI system? The forums here are related to SAP NetWeaver MDM, so maybe you should ask this question in a BI forum?
Nevertheless, if your data doesn't change that much, maybe you could use a delta mechanism for extraction? This would send only the changed records instead of all the unchanged ones every time. But this depends on your master data and, of course, on your extractors.
Cheers
Michael -
Error while loading transaction data from an InfoProvider - mapping C_ACCT
Hi Gurus,
We are implementing BPC Consolidation for NetWeaver 7.5, and we're facing the following situation:
We have loaded master data from 0RC_ACCOUNT (group account) into our dimension C_ACCT without a problem, but when we run the loading package from the InfoProvider 0FIGL_C10 (transactional data), the following error message appears:
Dimension: Member C_ACCT: 000 not valid
We have already checked the data in the source InfoProvider and there is no record with an account value of 000.
Any suggestions?
Thanks in advance
Best Regards
Abraham Méndez
Rad,
See if SAP Notes 492647 and 849501 are of some help in your scenario. -
Loading transaction data from flat file to SNP order series objects
Hi,
I am a BW developer and I need to provide data to my SNP team.
Can you please tell me more about loading transaction data (sales orders, purchase orders, etc.) from external systems into order-based SNP objects/structures? There is a third-party tool called WebConnect that gets data from external systems and can deliver it as flat files or database tables in whatever format we want.
I know we can use BAPIs, but I don't know how. Can you please send sample ABAP program code that calls a BAPI to read a flat file and write to SNP order series objects?
Please let me know how to get data from a flat file into SNP order-based objects, with options; I will be very grateful.
thanks in advance
Rahul
Hi,
Please go through the following links:
https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4180456
https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4040057
https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=3832922
https://forums.sdn.sap.com/click.jspa?searchID=6094493&messageID=4067999
Hope this helps...
Regards,
Habeeb
Assign points if helpful..:) -
Load Transaction Data From InfoProvider
Hi expert,
I'm trying to upload transaction data from an InfoProvider (using BPC 7.5 NW).
From the Data Manager package I can only select the name of the InfoProvider, but I cannot filter the data (I need to upload only a subset of it).
The strange thing is that when validating the corresponding transformation file, I can see the "Set Selection" button where I can make those selections. Is it a bug that I can't see it in the Data Manager package?
Thanks & regards
Fabiola
Hi Fabiola,
You have two options:
1. Use the SELECTION option in the options section of the transformation file (e.g. SELECTION=0CALMONTH,200901).
2. Maintain a conversion file, and in it write conversion rules to skip records (e.g. external column: 2008, internal column: *skip).
The "Set Selection" button you mentioned is useful while validating the transformation file against the InfoProvider; you can use it to restrict the number of records to validate.
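To make option 1 concrete, a minimal sketch of a transformation file with a SELECTION filter (BPC 7.5 NW syntax; the characteristic 0CALMONTH, the value 200901, and the TIME/SIGNEDDATA mappings are examples only):

```text
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = TAB
SELECTION=0CALMONTH,200901
*MAPPING
TIME=0CALMONTH
SIGNEDDATA=0AMOUNT
*CONVERSION
```

With this in place, the Load InfoProvider package should extract only the records for the selected month instead of the whole cube.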
regards,