Parallel loading of Transaction Data
Hi,
We are trying to load transaction data into Inventory Valuation cube (0COPC_C04).
This cube gets data from multiple DataSources. My question: can the different InfoPackages (from different InfoSources) be scheduled in parallel into the same cube?
This is transaction data. We are loading manually in production based on some selection criteria.
Thanks
Hi,
For the first part, yes, your understanding is correct. Delete all cube indexes before you start any loads and recreate them after all loading has completed; this improves load performance. Make sure that you do recreate the indexes, otherwise you'll have problems in querying.
For the second part, yes: as long as the selections are different and the data loads come from different source systems, you can start another data load in parallel to the one already in progress. This will also reduce your load times.
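If you want to script the index handling around your manually scheduled InfoPackages, a minimal ABAP sketch is below. It assumes the standard BW function modules RSDU_INFOCUBE_INDEXES_DROP and RSDU_INFOCUBE_INDEXES_REPAIR are available in your release — verify their names and signatures in SE37 first.

```abap
" Drop the cube's secondary indexes before starting the parallel loads.
CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_DROP'
  EXPORTING
    i_infocube = '0COPC_C04'.

" ... schedule the parallel InfoPackage loads here ...

" Rebuild the indexes only after ALL loads have finished, otherwise
" query performance will suffer.
CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_REPAIR'
  EXPORTING
    i_infocube = '0COPC_C04'.
```

In a process chain, the same thing is normally done with the "Delete Index" and "Generate Index" process types instead of custom code.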
Cheers,
Kedar
Similar Messages
-
SAP Best Practice: Problems with Loading of Transaction Data
Hi!
I am about to implement SAP Best Practices scenario "B34: Accounts Receivable Analysis".
Therefore I load the data from SAP ERP IDES system into SAP NetWeaver 2004s system.
My problems are:
when I try to load the Transaction data for Infosources 0FI_AR_4 and 0FI_AR_6 I get the following errors/warnings:
when I start the "schedule process", the status stays "yellow": 14:27:31 (194 from 0 records)
On the right side I see some actions that are also "yellow", e.g. "DataStore Activation (Change Log): not yet activated".
As a result I cannot see any data in tcode "RSRT" executing the queries "0FIAR_C03/...".
The problems there
1) Input help of the web template of query don't contain any values
2) no data will be shown
Can some one help me to solve this problem?
Thank you very much!
Jürgen
In the monitor window where you got the issue below:
when I start the "schedule process", the status stays "yellow": 14:27:31 (194 from 0 records)
go to Environment in the menu options, then Transactional RFC -> In the Source System...
Give the logon details, press Enter, then specify the correct target destination as your BI server and execute.
If you find some IDocs pending there, push them manually using F6.
Then come back to your load and refresh.
If it still doesn't turn green, you can manually change the status to red on the Status tab, then go to the Processing tab, expand your processing details, right-click on the data packet that was not yet updated, and select manual update.
It shows a busy status; when that finishes, refresh once again.
Regards -
Loading master/transaction data from BI to NW BPC
hi
What is the best way of loading master and transaction data from BI to BPC?
I know around 6 options for loading master data and 5 options for loading transaction data from BI to BPC.
What is the easiest and recommended method? What is the typical method followed in your NW BPC projects?
Thanks.
Hi Raju,
There is a white paper on loading master data. I've used it previously and both options work quite well. You can use either of them depending on what you are comfortable with. It's available among the How-To white papers. -
Master data and impact of loads from transaction data
If I have the following scenario:
1. Load to cube which contains a dimension which has an info object 0CUSTOMER
2. The transaction data is being loaded from a non SAP system and 0CUSTOMER is loaded on a daily basis from SAP system
3. As part of the transaction load some Customers have a value which is not existing in the 0CUSTOMER info object
Do these new customers also get loaded into the 0CUSTOMER master data, i.e. are SIDs added to the SID table for 0CUSTOMER, or is the value just stored in the cube's dimension?
Thanks
Hi,
They are also automatically loaded to 0CUSTOMER, as the SID of that record is needed in the cube. If there is no relation to your R/3 customers, you may end up with some initial records in your customer master (meaning all attributes as well as the texts are initial).
regards
Siggi -
Dear Sirs,
during loading of master data from a 4.6C SAP system to BI 7.0 SP9 I get the following error during processing (data packet).
<i>"An RFC call-up triggered the exception (ID: RSAR NO: 503) "</i>
The strange thing with this error is that it emerged today, after successfully loading master data and a subset of transaction data yesterday from same source system.
Has anyone else come across this error, and can you tell us where we should look for a possible solution (or a better explanation of the error)?
The RFC connection works fine when tested in SM59, and was functioning yesterday. No relevant dumps in either system.
Hi folks,
I am also interested in this error.
Thanks. -
Error while loading the transaction data ..
Hello Expert
I need your help in resolving this error. I encountered the error below while uploading transaction data from the cube 0PP_C14, even though my transformation file gives accepted records:
/CPMB/MODIFY completed in 0 seconds
/CPMB/INFOPROVIDER_CONVERT completed in 0 seconds
/CPMB/CLEAR completed in 0 seconds
[Selection]
InforProvide = 0PP_C14
SELECTION = <?xml version="1.0" encoding="utf-16"?><Selections xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"><Selection Type="Selection"><Attribute><ID>0EXTMATLGRP</ID><Operator>1</Operator><LowValue>NAFS-DESIGNER
</LowValue><HighValue /></Attribute><Attribute><ID>0EXTMATLGRP</ID><Operator>1</Operator><LowValue>NAFS-PLANK
</LowValue><HighValue /></Attribute><Attribute><ID>0EXTMATLGRP</ID><Operator>1</Operator><LowValue>NAFS-HALF-WIDTH
</LowValue><HighValue /></Attribute><Attribute><ID>0EXTMATLGRP</ID><Operator>1</Operator><LowValue>NAFS-NON-STD-WIDTH
</LowValue><HighValue /></Attribute><Attribute><ID>0EXTMATLGRP</ID><Operator>1</Operator><LowValue>NAFS-RECESS
</LowValue><HighValue /></Attribute><Attribute><ID>0EXTMATLGRP</ID><Operator>1</Operator><LowValue>NAFS-STD-WIDTH
</LowValue><HighValue /></Attribute><Attribute><ID>0BASE_UOM</ID><Operator>1</Operator><LowValue>4FT</LowValue><HighValue /></Attribute><Attribute><ID>0PLANT__0SALESORG</ID><Operator>1</Operator><LowValue>SILD</LowValue><HighValue /></Attribute><Attribute><ID>0PLANT__0SALESORG</ID><Operator>1</Operator><LowValue>SILE</LowValue><HighValue /></Attribute></Selection><Selection Type="FieldList"><FieldID>0EXTMATLGRP</FieldID><FieldID>0FISCPER</FieldID><FieldID>0PLANT</FieldID><FieldID>0PLANT__0SALESORG</FieldID></Selection></Selections>
TRANSFORMATION = \ROOT\WEBFOLDERS\SIL\MIS\DATAMANAGER\TRANSFORMATIONFILES\COST SHEET\0PP_C14_FOR_DOMESTICS_EXPORT.xls
TARGETMODE = Yes
RUNLOGIC = No
CHECKLCK = No
[Message]
Task name CONVERT:
No 1 Round:
An exception with the type CX_ST_MATCH_TYPE occurred, but was neither handled locally, nor declared in a RAISING clause
System expected a value for the type g
model: MIS. Package status: ERROR
Does anyone have a solution for this?
Hi Neeraj,
I think it is caused by incorrect character entries you made in the 'Set Selection' screen when filtering the data. I was able to reproduce the error as follows:
1. I changed my input language to Chinese.
2. In the 'Set Selection' screen, I selected '0company_code' as the attribute, operator '=', and entered the value 'SGA01'.
3. I saved the selection and ran the package; it failed with the same error you got.
In my cube the value 'SGA01' is stored in English, so if it is entered in Chinese in the 'Set Selection' screen, a mismatch occurs and results in the error. If I change my input language back to English and enter the value 'SGA01' again, the error does not occur.
Please check the values you entered in the 'Set Selection' screen.
Hope it solves your issue -
Loading of transaction data from SAP ECC system failed
Hi!
I successfully connected SAP ECC system to SAP BI system.
The following steps have been executed:
- user ALEREMOTE with max. authorization
- RFC destination
- Distributing Data model
- Generated Partner profile
- Maintaining message types in WE20
Now when I try to load any data from the SAP ECC system, the loading process hangs in status "yellow" and never completes.
[0FI_AR_4|http://www.file-upload.net/view-1447743/0FI_AR_4.jpg.html]
The following steps within Load process are yellow:
Extraction (messages): Missing messages
Missing message: Request received
Missing message: Number of sent records
Missing message: Selection completed
Transfer (IDocs and TRFC): Missing messages or warnings
Request IDoc : Application document posted (is green)
Data Package 1 : arrived in BW ; Processing : 2nd processing step not yet finished
Info IDoc 1 : sent, not arrived ; Data passed to port OK
Info IDoc 2 : sent, not arrived ; Data passed to port OK
Info IDoc 3 : sent, not arrived ; Data passed to port OK
Info IDoc 4 : sent, not arrived ; Data passed to port OK
Subseq. processing (messages) : Missing messages
Missing message: Subseq. processing completed
DataStore Activation (Change Log) : not yet activated
Question:
Can some one give me some technical steps (tcode, report) to solve this problem?
Thank you very much!
Holger
Hi!
Many thanks for your answer.
Via BD87 on the BW system I can see that all the IDocs (type RSRQST) are received from the SAP ECC system.
Via tcode SM58 I could not detect any entries.
However, the loading status from yesterday is now set to "red".
The errors are:
Extraction (messages): Missing messages
Data Package 1 : arrived in BW ; Processing : 2nd processing step not yet finished
Info IDoc 1 : sent, not arrived ; Data passed to port OK
Info IDoc 2 : sent, not arrived ; Data passed to port OK
Can you investigate my issue again?
Thank you very much! -
Program/FM to load test transaction data
Friends,
I remember using a program or a function module to load test data to SNP.
It has a simple interface of specifying Product, Location, ATP category and quantity.
Can any of you help me recall this program/FM? I am in the middle of testing a scenario and am stuck creating test data.
-BP
Are you referring to /SAPAPO/SJKTST04?
Look into SAP Note 886075 for SAP's disclaimer.
Somnath -
Changing master data record while loading Transaction data
Hello All,
We have a requirement to change one of the master data fields (FLAG) while loading the transaction data.
We get the material info in the master data, and in the sales order item data we also get the material.
While loading the transaction data, I have to set the FLAG field to 'S' in the material master data based on the key selection:
master data material = transaction data material.
I have written and implemented the code, and I get the correct records, but I face a huge performance issue. Can anyone guide me please?
DATA: itab1 TYPE STANDARD TABLE OF /bi0/pmaterial,
wa_itab1 TYPE /bi0/pmaterial,
w_tabix TYPE sy-tabix.
IF itab1 IS INITIAL.
SELECT * FROM /bi0/pmaterial INTO TABLE itab1.
ENDIF.
LOOP AT result_package ASSIGNING <result_fields>.
READ TABLE itab1 INTO wa_itab1 WITH KEY
material = <result_fields>-material.
IF sy-subrc = 0.
w_tabix = sy-tabix.
IF <result_fields>-/bic/paa1c2033 IS NOT INITIAL.
wa_itab1-FLAG = 'S'.
MODIFY itab1 FROM wa_itab1 INDEX w_tabix TRANSPORTING FLAG .
ENDIF.
ENDIF.
ENDLOOP.
IF itab1 IS NOT INITIAL.
MODIFY /bi0/pmaterial FROM TABLE itab1.
ENDIF.
Here are some performance tips:
Add FOR ALL ENTRIES IN result_package WHERE material = result_package-material to your select statement
After your select statement, add IF SY-SUBRC = 0. SORT itab1 BY material. ENDIF.
In your read statement, add BINARY SEARCH to the end of it
At the end of your end routine, make sure to CLEAR itab1.
You can also increase the number of parallel processes for your DTP, and DSO activation (assuming your target is DSO). -
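Applied together, the end routine would look roughly like this. This is a sketch only — the field names /bic/paa1c2033 and FLAG are taken from the post above, and the field symbol <result_fields> is assumed to be declared as in the original routine. Note also that writing directly to the P table with MODIFY is generally discouraged in favour of loading the attribute through a proper master data flow.

```abap
DATA: itab1    TYPE STANDARD TABLE OF /bi0/pmaterial,
      wa_itab1 TYPE /bi0/pmaterial.

IF itab1 IS INITIAL AND result_package IS NOT INITIAL.
  " Read only the materials that occur in this data package,
  " instead of the whole master data table.
  SELECT * FROM /bi0/pmaterial
    INTO TABLE itab1
    FOR ALL ENTRIES IN result_package
    WHERE material = result_package-material.
  IF sy-subrc = 0.
    SORT itab1 BY material.          " prerequisite for BINARY SEARCH
  ENDIF.
ENDIF.

LOOP AT result_package ASSIGNING <result_fields>.
  READ TABLE itab1 INTO wa_itab1
       WITH KEY material = <result_fields>-material
       BINARY SEARCH.
  IF sy-subrc = 0 AND <result_fields>-/bic/paa1c2033 IS NOT INITIAL.
    wa_itab1-flag = 'S'.
    MODIFY itab1 FROM wa_itab1 INDEX sy-tabix TRANSPORTING flag.
  ENDIF.
ENDLOOP.

IF itab1 IS NOT INITIAL.
  MODIFY /bi0/pmaterial FROM TABLE itab1.
  CLEAR itab1.                       " release memory when done
ENDIF.
```

The FOR ALL ENTRIES restriction plus the sorted BINARY SEARCH read turns the per-row lookup from a linear scan into a logarithmic one, which is usually where the bulk of the runtime goes in routines like this.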
Error while loading Transactional data from NW BW Infoprovider
Hi,
I am trying to load the transaction data using the delivered "Transactional data from NW BW Infoprovider" package and am getting the error message "Error occurs when loading transaction data from other cube".
Below is the transformation file content
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = TAB
AMOUNTDECIMALPOINT = .
SKIP = 0
SKIPIF =
VALIDATERECORDS=NO
CREDITPOSITIVE=YES
MAXREJECTCOUNT= 9999999999999
ROUNDAMOUNT=
*MAPPING
ACCOUNT = 0ACCOUNT
BUSINESSTYPE = *NEWCOL(NOBTYPE)
BWSRC = *NEWCOL(R3)
COMPANYCODE = 0COMP_CODE
DATASRC = *NEWCOL(R3)
FUNCTIONALAREA = *IF(0FUNC_AREA = *STR() then *STR(NOFA);0FUNC_AREA)
GAME = *NEWCOL(NOGAME)
INPUTCURRENCY = *IF(0CURRENCY = *STR() then *STR(NOCURR) ; 0CURRENCY)
PROBABILITY = *NEWCOL(NOPROB)
PRODUCT = *NEWCOL(NOPROD)
PROFITCENTER = 0PROFIT_CTR
PROJECT = *NEWCOL(NOPROJECT)
TIME = 0FISCPER
VERSION = *NEWCOL(REV0)
WEEKS = *NEWCOL(NOWEEK)
SIGNEDDATA= 0AMOUNT
*CONVERSION
Below is the Error Log
/CPMB/MODIFY completed in 0 seconds
/CPMB/INFOPROVIDER_CONVERT completed in 0 seconds
/CPMB/CLEAR completed in 0 seconds
[Selection]
InforProvide=ZPCA_C01
TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\R3_TD_P&L.xls
CLEARDATA= Yes
RUNLOGIC= No
CHECKLCK= Yes
[Messages]
Task name CONVERT:
No 1 Round:
Error occurs when loading transaction data from other cube
Application: ACTUALREPORTING Package status: ERROR
Has anyone had this issue before, or is there something that needs to be looked at in the BW-delivered task?
Any input will be appreciated.
Sanjay
Hello Gurus,
Currently I am getting the error below while loading cost center master data from BW to BPC:
Task name MASTER DATA SOURCE:
Record count: 189
Task name TEXT SOURCE:
Record count: 189
Task name CONVERT:
No 1 Round:
Info provider is not available
Application: ZRB_SALES_CMB Package status: ERROR
Can anybody tell me if I have missed anything?
Regards,
BI NEW
Edited by: BI NEW on Feb 23, 2011 12:25 PM -
Deleting master data after loading transactional data using flat file
Dear All,
I have loaded transaction data into an InfoCube using a flat file. In the DTP I checked the option "load transactional data without master data exists", so the transaction data is loaded even if no master data is there in BW.
While loading the flat file I made a mistake for the DIVISION characteristic: the original master data value is '04000', but I loaded the transaction data with the value '4000'. I realized this later after seeing the data in the InfoCube and deleted the request, then reloaded the data with the value '04000'. Up to here everything is fine.
But when I look at the master data for DIVISION, I can see a new entry with the value '4000'.
My question is how to delete the entry '4000' from DIVISION. I tried deleting this entry manually via master data maintenance, but it does not allow me to do so.
I have also checked whether any transaction data exists for the value '4000'; as I said, I have deleted the transaction data with that value. I even tried to delete the entries from the master data table, but I do not see an option to delete entries there.
Please advise me on this.
Regards,
Veera
Hi,
Go to RSA1, right-click on the InfoObject and select 'Delete Master Data'. This deletes the unused master data existing in the table.
If this master data is not used anywhere else, just delete it completely with the SID option.
If even this doesn't work, you can delete the entire table contents in SE14. But this will wipe out the whole table, so be sure you really want to do this.
Hope this helps
Akhan. -
Error in Transaction Data - Full Load
Hello All,
This is the current scenario that I am working on:
There is a process chain which has two transaction data load (full load) processes into the same cube. In the process monitor everything seems okay (the data loads look fine), but the overall status for both loads failed due to 'Error in source system/extractor', and it says 'error in data selection'.
Processing is set to data targets only.
In Manage on the cube, I found 3 old requests that were red and not set to QM status red. I set them to QM status red and deleted them, and the difference I saw was that the subsequent requests became available for reporting.
This full data load now takes forever. I also do not see an 'initialize delta update' option there; can anyone tell me why?
And, coming to the main question: how do I get the process chain completed? Will I have to repeat the data loads, or what options do I have to get a successfully running process chain, or at least these 2 full loads of transaction data?
Thank you - points will be assigned for helpful answers
- DB
Edited by: Darshana on Jun 6, 2008 12:01 AM
Edited by: Darshana on Jun 6, 2008 12:05 AM
One interesting discovery I just found in R/3 was this job log with respect to the above process chain:
It says that the job was cancelled in R/3 because the material ledger currencies were changed.
The process chain is for inventory management, and the data load processes that get cancelled correspond to jobs cancelled in the source system:
1. Material Valuation: period ending inventories
2. Material Valuation: prices
The performance assistant says the following, but I am not sure how far I can work on the R/3 side to rectify this:
Material ledger currencies were changed
Diagnosis
The currencies currently set for the material ledger and the currency types set for valuation area 6205 differ from those set at conversion of the data (production startup).
System Response
The system does not allow you to post transactions after changing the currency settings to ensure consistency.
Procedure
Replace the current settings with those entered at production start-up.
If you wish to change the currency settings, you must use programs to convert data from the old to the new currencies.
Inform your system administrator
Anyone knowledgable in this area please give your inputs.
- DB -
Need more Info about "Load transactional data when master data not loaded"
Hi,
Can you please explain this option in the InfoPackage: "Load transactional data when master data is not loaded"?
Say i load a transactional data record which has a material no. AAAXX.
In the fact table, the material no. is replaced with the corresp. DIM ID.
Now, assume that there is no entry for this material no. AAAXX in the master data table, so there is no DIM ID for it.
How is it then stored in the fact table ?
Hope i have managed to explain the scenario..
Thanks in advance,
Punkuj
Hello Punkuj,
How are you?
No problem: if the entry for material number "AAAXX" is not in the master data, the system will create SID and DIM IDs for it and the transaction data will be loaded.
Use
Choose this indicator if you want to always update the data, even if no master data for the navigation attributes of the loaded records exists. The master data is generated from the loaded transaction data. The system draws SIDs. You can subsequently load the master data.
Dependencies
The texts, attributes, and hierarchies are not recognized by the system. Load the texts / attributes / hierarchies for the corresponding InfoObjects separately.
This function corresponds to the update. Possible errors when reading the master data into the update rules are not associated with this indicator.
Recommendation
We recommend loading the master data into the productive systems in advance. Select this indicator especially in test systems.
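To make the mechanics concrete, here is a simplified ABAP illustration of the lookup the system performs for each incoming record. This is for illustration only — the real load goes through internal RSDD/RSDM services, not direct SID-table access; /bi0/smaterial is assumed here as the SID table of 0MATERIAL.

```abap
DATA l_sid TYPE rssid.

" Does a SID already exist for the incoming material value?
SELECT SINGLE sid FROM /bi0/smaterial INTO l_sid
  WHERE material = 'AAAXX'.

IF sy-subrc <> 0.
  " No master data record yet: with the indicator set, the system
  " draws a new SID from the number range, inserts it into the SID
  " table, and uses it in the dimension and fact tables. Attributes
  " and texts remain initial until the master data itself is loaded.
ENDIF.
```

So the fact table never stores the characteristic value itself; it always references a DIM ID, which in turn references a SID, and that SID is created on the fly if necessary.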
Best Regards....
Sankar Kumar
+91 98403 47141 -
BPC:NW - Best practices to load Transaction data from ECC to BW
I have a very basic question about loading G/L transaction data into BPC for a variety of purposes. It would be great if you could point me towards best practices/standard ways of building such interfaces.
1. For Planning
When we are doing the planning for cost center expenses and need to make variance reports against the budgets, what would be the source Infocube/DSO for loading the data from ECC via BW, if the application is -
YTD entry mode:
Periodic entry mode:
What difference does it make to use the 0FI_GL_12 DataSource versus the 0FIGL_C10 cube or the 0FIGL_O14 or 0FIGL_D40 DSOs?
Based on the data entry mode of planning application, what is the best way to make use of 0balance or debit_credit key figures on the BI side.
2. For consolidation:
Since we need to have trading partner, what are the best practices for loading the actual data from ECC.
What are the typical mappings to be maintained for movement type with flow dimensions.
I have seen multiple threads with different responses but I am looking for the best practices and what scenarios you are using to load such transactions from OLTP system. I will really appreciate if you can provide some functional insight in such scenarios.
Thanks in advance.
-SM
For planning, please take a look at the SAP Extended Financial Planning rapid-deployment solution: G/L Financial Planning module. This RDS captures best-practice integration from BPC 10 NW to SAP G/L. This RDS (including content and documentation) is free to licensed customers of SAP BPC. It leverages the 0FIGL_C10 cube mentioned above.
https://service.sap.com/public/rds-epm-planning
For consolidation, please take a look at SAP Financial Close & Disclosure Management rapid-deployment solution. This RDS captures best practice integration from BPC 10 NW to SAP G/L. This RDS (including content and documentation) is also free to licensed customers of SAP BPC.
https://service.sap.com/public/rds-epm-fcdm
Note: You will require an SAP ServiceMarketplace ID (S-ID) to download the underlying SAP RDS content and documentation.
The RDS documentation discusses the how/why of best-practice integration. You can also contact me directly at [email protected] for consultation.
We are also in the process of rolling out the updated 2015 free training on these two RDS. Please register at this link and you will be sent an invite.
https://www.surveymonkey.com/s/878J92K
If the link is inactive at some point after this post, please contact [email protected] -
Problem loading transaction data from 0MKT_DSO1 (ODS) to 0MKTG_C01
Hi,
I am trying to load lead transaction data into the standard CRM lead management cube from the ODS. There is a problem while loading transaction data from 0MKT_DSO1 (ODS) to the InfoCube 0MKTG_C01, as the field 0STATECSYS2 (CRM status) is set to 10 in the ODS, meaning an incorrect transaction. This field is not in the InfoCube.
There is a routine in the cube that deletes data records with 0STATECSYS2 set to 10.
This field does not come in with the transaction.
So where can I see the master data in the CRM source system, and why is that field being set to 10?
Thanks in advance!
Thanks for the reply.
I have checked the Fact table which shows
1. packet Dimension
2. Time dimension
3. Unit dimension.
I have kept the 0CALDAY as the time characteristics.
Sample data i have loaded from ODS to Cube.
Sample data in ODS.
Sales order No_____0CALDAY_____AMOUNT
800001___________12/02/2009____15
I have loaded this data in Cube with Full Upload.
Data in Cube.
Sales order No_____0CALDAY_____AMOUNT
800001___________12/02/2009____15
Again i am loading the same data to cube
Data in cube after loading.
Sales order No_____0CALDAY_____AMOUNT
800001___________12/02/2009____15
800001___________12/02/2009____15
The data is duplicated and it is not cumulating.
Am I missing anything here?
Pls help..
Thanks,
Siva.