Incorrect records when loading transaction data
Hi experts:
I just loaded transactional data. The data first came from a CSV file into the PSA, then from the PSA to a DSO, and finally from the DSO to a cube.
The data in the PSA looks correct. After the data was loaded into the DSO, I found many item numbers that are not in the PSA. I deleted all the old requests in the PSA and only have one request now. I don't know where those old item numbers come from; they get loaded every time. Does anyone know?
Thanks!
In that case you must have an update routine that is increasing the number of records in the DSO. Can you please paste here the number of records transferred and added (from DSO Manage) for the data load request you are talking about? Also, let me know what you mean by "I found so many item numbers that are not in the PSA".
Regards
Pradip
Similar Messages
-
"Error occurs when loading transaction data from other model" - BW loading into BPC
Hi Experts,
I'm having a problem with my data loading from BW, using the standard Load InfoProvider Selections data manager package.
If I run it for a period without data it succeeds (with a warning), but if there is data to be extracted I get the following error:
Task name CONVERT:
No 1 Round:
Error occurs when loading transaction data from other model
model: AIACONS. Package status: ERROR
As it runs OK when there isn't data, it appears something is preventing the movement of data out of the cube itself, rather than a validation issue.
Has anyone encountered similar or have any ideas as to the problem?
Best,
Chris
Hi Vadim,
It's not specific to the transformation file, as I have tried with others for the same BW cube and get the same result.
We get a warning when we try to validate the transformation file:
"Error occurs when loading transaction data from other model".
This only appears in the validation pop up and doesn't throw up any warnings about the transformation file itself. The validation log says:
Validate and Process Transformation File Log
Log creation time
3/7/2014 16:09
The result of validation of the conversion file: SUCCESS
The result of validation of the conversion file with the data file: FAIL
Validation Result
Validation Option
ValidateRecords = NO
Message
Error occurs when loading transaction data from other model
Reject List
I can't find any errors anywhere else.
Best,
Chris -
Error occurs when loading transaction data from other model
Hello Experts, I am trying to validate my transformation file and I see some peculiar behaviour. Whether the transformation file is incomplete or complete with all the mappings, I am getting the same error as above.
The options, mapping, and conversion sections validate successfully, yet validation still throws the above error.
Incomplete Transformation File
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = ,
AMOUNTDECIMALPOINT = .
SKIP = 0
SKIPIF =
VALIDATERECORDS=YES
CREDITPOSITIVE=YES
MAXREJECTCOUNT= 10
ROUNDAMOUNT=
*MAPPING
CUSTOMER = *NEWCOL (NO_CUST)
Validating the transformation files
Validating options...
Validation of options was successful.
Validating mappings...
Validation of mappings was successful.
Validating conversions...
Validation of the conversion was successful
Creating the transformation xml file. Please wait...
Transformation xml file has been saved successfully.
Begin validate transformation file with data file...
[Start test transformation file]
Validate has successfully completed
ValidateRecords = YES
Error occurs when loading transaction data from other model
Validation with data file failed
I am getting the same error with the complete transformation file also. Please let me know where I am making a mistake, or whether it is a system error.
Thanking you
Praveen
Hi,
By
*MAPPING
CUSTOMER = *NEWCOL (NO_CUST)
you want CUSTOMER to receive a fixed string "NO_CUST"?
If so use,
*MAPPING
CUSTOMER = *STR (NO_CUST) -
Error occurs when loading transaction data from other cube
Hi Gurus,
I'm currently working on a transformation file for loading transactional data from BW to BPC, but an error message is displayed: "Error occurs when loading transaction data from other cube". I have already checked the permissions for my user, double-checked my transformation file and the dimensions, and made all the conversion files needed, but the message has not changed.
Can anybody help me to solve this problem?!
Thanks a lot & Best Regards,
HH
Hi,
Here are the transformation file and conversion file. I have already tested both with a different InfoCube and they work, but not for the one needed.
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = ,
AMOUNTDECIMALPOINT = .
SKIP = 0
SKIPIF =
VALIDATERECORDS=YES
CREDITPOSITIVE=YES
MAXREJECTCOUNT=
ROUNDAMOUNT=
*MAPPING
Category=*NEWCOL(ACTUAL)
P_0BASE_UOM=0BASE_UOM
P_0BUS_AREA=0BUS_AREA
P_0COSTCENTER=0COSTCENTER
P_0FUNDS_CTR=*NEWCOL(null)
P_0GL_ACCOUNT=0ACCOUNT
P_0LOC_CURRCY=*NEWCOL(MXN)
P_0MATL_TYPE=*NEWCOL(null)
P_0VENDOR=0VENDOR
P_DataSrc=*NEWCOL(UPLOAD)
P_ZMATERIAL1=0MATERIAL
P_ZMATERIAL2=*NEWCOL(null)
P_ZMATL_CLASS=*NEWCOL(null)
P_ZMATL_TESP=*NEWCOL(null)
P_ZRATIO=*NEWCOL(KF_inpmdInt)
Time=0CALMONTH
SIGNEDDATA=0TOTALSTCK
*CONVERSION
Time=Time_conv.xls
EXTERNAL INTERNAL
201101 2011.JAN
201102 2011.FEB
201103 2011.MAR
201104 2011.APR
201105 2011.MAY
201106 2011.JUN
201107 2011.JUL
201108 2011.AUG
201109 2011.SEP
201110 2011.OCT
201111 2011.NOV
201112 2011.DEC
Thank you for taking a glance at the files.
Best Regards,
HH -
Error 3 when loading transaction data into an InfoCube from a flat file
HI friends,
I have created an InfoCube with 2 dimensions (customer and product). I loaded the master data through direct update by preparing 2 Excel sheets, separately for customer and product. For loading the transaction data (through flexible update), I prepared a single Excel sheet covering all values, that is, the master data InfoObjects of customer and product and the key figure values, and took 0UNIT, 0CURRENCY, and 0CALDAY in the transfer rules.
Once I load the data, the "data was requested" status appears, and error 3 comes up when I go to the monitor to check the data.
sai
Hi Ram,
In the Monitor, see the error message by clicking the Error Message button, or go to the Details tab and expand the nodes to see which record is having the problem.
Regards,
Prakash -
How to only update existing records when loading master data ?
Hello experts, I need your lights one more time.
Here is my need :
I have created an infoobject (IO) which is a very simple version of 0material, let's call it Znewmat --> Znewmat has material type and trademark as attributes, those two fields are available in 2 different datasources :
- 0MATERIAL_ATTR for material type (field MTART)
- 0MAT_SALES_ATTR for trademark (field MVGR2)
When loading my new IO from 0MATERIAL_ATTR I use a filter (at DTP level) to get only a few material types (I get something like 1,000 records).
Here is my issue: when I load from 0MAT_SALES_ATTR, the field "material type" is not available to apply the same filter as for 0MATERIAL_ATTR. Existing records are updated with the trademark, but I also get 5,000 records I don't need, and my master data is "polluted" with useless lines.
And my question: is there a way, while performing the second load, to ONLY UPDATE EXISTING RECORDS AND NOT ADD ANY NEW RECORDS? (I didn't find anything in the main options of my DTP.)
(I'd like to avoid the solution of updating the 0MAT_SALES_ATTR DataSource to add the missing field.)
Thanks in advance for any help, points will be distributed.
Guillaume P.
Still no idea?
In the start routine of the transformation from 0MAT_SALES_ATTR to ZNEWMAT, do the following:
SELECT material FROM /bic/pznewmat INTO TABLE i_mat
  FOR ALL ENTRIES IN source_package
  WHERE material EQ source_package-material.
LOOP AT source_package.
  p_ind = sy-tabix.
  READ TABLE i_mat WITH KEY material = source_package-material.
  IF sy-subrc NE 0.
    " material was never loaded by 0MATERIAL_ATTR, so drop the record
    DELETE source_package INDEX p_ind.
  ENDIF.
ENDLOOP.
This way you'll only update records that have previously been loaded by the 0MATERIAL_ATTR DataSource.
loading sequence:
First load ZNEWMAT from 0MATERIAL_ATTR, then activate ZNEWMAT, then load 0MAT_SALES_ATTR to ZNEWMAT.
M. -
Changing master data record while loading Transaction data
Hello All,
We have a requirement to change one of the master data fields (FLAG) while loading the transaction data.
We get the material info in the master data, and we also get the material in the sales order item data.
While loading the transaction data, I have to set the FLAG field to 'S' in the master data material, based on the key selection:
master data material = transaction data material.
I have written and implemented the code. I get the correct records, but I face a huge performance issue. Can anyone guide me, please?
DATA: itab1 TYPE STANDARD TABLE OF /bi0/pmaterial,
wa_itab1 TYPE /bi0/pmaterial,
w_tabix TYPE sy-tabix.
IF itab1 IS INITIAL.
SELECT * FROM /bi0/pmaterial INTO TABLE itab1.
ENDIF.
LOOP AT result_package ASSIGNING <result_fields>.
READ TABLE itab1 INTO wa_itab1 WITH KEY
material = <result_fields>-material.
IF sy-subrc = 0.
w_tabix = sy-tabix.
IF <result_fields>-/bic/paa1c2033 IS NOT INITIAL.
wa_itab1-FLAG = 'S'.
MODIFY itab1 FROM wa_itab1 INDEX w_tabix TRANSPORTING FLAG .
ENDIF.
ENDIF.
ENDLOOP.
IF itab1 IS NOT INITIAL.
MODIFY /bi0/pmaterial FROM TABLE itab1.
ENDIF.
Here are some performance tips:
Add FOR ALL ENTRIES IN result_package WHERE material = result_package-material to your select statement
After your select statement, add IF SY-SUBRC = 0. SORT itab1 BY material. ENDIF.
In your read statement, add BINARY SEARCH to the end of it
At the end of your end routine, make sure to CLEAR itab1.
You can also increase the number of parallel processes for your DTP, and DSO activation (assuming your target is DSO). -
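Putting the tips from the thread above together, a sketch of the end routine might look like the following. This is only an illustration of those suggestions, not a tested implementation; the field /bic/paa1c2033 is carried over from the original post.

```abap
* End routine sketch applying the tips: FOR ALL ENTRIES,
* SORT, BINARY SEARCH, and CLEAR at the end.
DATA: itab1    TYPE STANDARD TABLE OF /bi0/pmaterial,
      wa_itab1 TYPE /bi0/pmaterial,
      w_tabix  TYPE sy-tabix.

IF itab1 IS INITIAL AND result_package IS NOT INITIAL.
* Read only the materials actually present in this data package
  SELECT * FROM /bi0/pmaterial INTO TABLE itab1
    FOR ALL ENTRIES IN result_package
    WHERE material = result_package-material.
  IF sy-subrc = 0.
    SORT itab1 BY material.  " prerequisite for BINARY SEARCH
  ENDIF.
ENDIF.

LOOP AT result_package ASSIGNING <result_fields>.
  READ TABLE itab1 INTO wa_itab1
       WITH KEY material = <result_fields>-material
       BINARY SEARCH.
  IF sy-subrc = 0.
    w_tabix = sy-tabix.
    IF <result_fields>-/bic/paa1c2033 IS NOT INITIAL.
      wa_itab1-flag = 'S'.
      MODIFY itab1 FROM wa_itab1 INDEX w_tabix TRANSPORTING flag.
    ENDIF.
  ENDIF.
ENDLOOP.

IF itab1 IS NOT INITIAL.
  MODIFY /bi0/pmaterial FROM TABLE itab1.
ENDIF.
CLEAR itab1.  " release the lookup table at the end of the routine
```

The FOR ALL ENTRIES restriction and the sorted binary-search read are what remove the full-table scan and linear lookups that caused the performance problem.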
Error while loading Transactional data from NW BW Infoprovider
Hi,
I am trying to load the transactional data using the delivered "Transactional data from NW BW InfoProvider" package and am getting the error message "Error occurs when loading transaction data from other cube".
Below is the transformation file content
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = TAB
AMOUNTDECIMALPOINT = .
SKIP = 0
SKIPIF =
VALIDATERECORDS=NO
CREDITPOSITIVE=YES
MAXREJECTCOUNT= 9999999999999
ROUNDAMOUNT=
*MAPPING
ACCOUNT = 0ACCOUNT
BUSINESSTYPE = *NEWCOL(NOBTYPE)
BWSRC = *NEWCOL(R3)
COMPANYCODE = 0COMP_CODE
DATASRC = *NEWCOL(R3)
FUNCTIONALAREA = *IF(0FUNC_AREA = *STR() then *STR(NOFA);0FUNC_AREA)
GAME = *NEWCOL(NOGAME)
INPUTCURRENCY = *IF(0CURRENCY = *STR() then *STR(NOCURR) ; 0CURRENCY)
PROBABILITY = *NEWCOL(NOPROB)
PRODUCT = *NEWCOL(NOPROD)
PROFITCENTER = 0PROFIT_CTR
PROJECT = *NEWCOL(NOPROJECT)
TIME = 0FISCPER
VERSION = *NEWCOL(REV0)
WEEKS = *NEWCOL(NOWEEK)
SIGNEDDATA= 0AMOUNT
*CONVERSION
Below is the Error Log
/CPMB/MODIFY completed in 0 seconds
/CPMB/INFOPROVIDER_CONVERT completed in 0 seconds
/CPMB/CLEAR completed in 0 seconds
[Selection]
InforProvide=ZPCA_C01
TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\R3_TD_P&L.xls
CLEARDATA= Yes
RUNLOGIC= No
CHECKLCK= Yes
[Messages]
Task name CONVERT:
No 1 Round:
Error occurs when loading transaction data from other cube
Application: ACTUALREPORTING Package status: ERROR
Has anyone had this issue before, or is there something that needs to be looked at in the BW-delivered task?
Any input will be appreciated.
Sanjay
Hello Guru,
Currently I am getting the below error while loading cost center master data from BW to BPC.
Task name MASTER DATA SOURCE:
Record count: 189
Task name TEXT SOURCE:
Record count: 189
Task name CONVERT:
No 1 Round:
Info provider is not available
Application: ZRB_SALES_CMB Package status: ERROR
Can anybody tell me if I have missed anything?
Regards,
BI NEW
Edited by: BI NEW on Feb 23, 2011 12:25 PM -
Need more Info about "Load transactional data when master data not loaded"
Hi,
Can you please explain this option in the InfoPackage: "Load transactional data when master data is not loaded"?
Say i load a transactional data record which has a material no. AAAXX.
In the fact table, the material no. is replaced with the corresp. DIM ID.
Now, assume that there is no entry for this Material no.
AAAXX in the master data table...so no DIM ID for it..
How is it then stored in the fact table ?
Hope i have managed to explain the scenario..
Thanks in advance,
Punkuj
Hello Punkuj K,
How are you?
No, if the entry for that material number "AAAXX" is not there in the master data, then the system will create SIDs and DIM IDs for it, and the transaction data will be loaded.
Use
Choose this indicator if you want to always update the data, even if no master data for the navigation attributes of the loaded records exists. The master data is generated from the loaded transaction data. The system draws SIDs. You can subsequently load the master data.
Dependencies
The texts, attributes, and hierarchies are not recognized by the system. Load the texts / attributes / hierarchies for the corresponding InfoObjects separately.
This function corresponds to the update. Possible errors when reading the master data into the update rules are not associated with this indicator.
Recommendation
We recommended loading the master data into the productive systems in advance. Select this indicator especially in test systems.
Best Regards....
Sankar Kumar
+91 98403 47141 -
Problem in loading transactional data from 0MKT_DSO1 (ODS) to 0MKTG_C01
Hi,
I am trying to load lead transaction data from the ODS to the standard CRM lead management cube. There is a problem while loading transaction data from 0MKT_DSO1 (ODS) to the InfoCube 0MKTG_C01, as the field 0STATECSYS2 (CRM status) is set to 10 in the ODS, meaning an incorrect transaction. This field is not in the InfoCube.
There is a routine in the cube that deletes data records with 0STATECSYS2 set to 10.
This field is not coming in the transaction.
So, where can I see the master data in the CRM source system? And why is that field getting set to 10?
Thanks in advance!
Thanks for the reply.
I have checked the Fact table which shows
1. packet Dimension
2. Time dimension
3. Unit dimension.
I have kept 0CALDAY as the time characteristic.
I have loaded sample data from the ODS to the cube.
Sample data in ODS.
Sales order No | 0CALDAY | AMOUNT
800001 | 12/02/2009 | 15
I have loaded this data in Cube with Full Upload.
Data in Cube.
Sales order No | 0CALDAY | AMOUNT
800001 | 12/02/2009 | 15
Again I am loading the same data to the cube.
Data in the cube after loading:
Sales order No | 0CALDAY | AMOUNT
800001 | 12/02/2009 | 15
800001 | 12/02/2009 | 15
The data is duplicated and it is not cumulating.
Am I missing anything here?
Pls help..
Thanks,
Siva. -
Deleting master data after loading transactional data using flat file
Dear All,
I have loaded transaction data into an InfoCube using a flat file. While loading, I checked the DTP option "load transactional data without master data", so the transactional data is loaded even if no master data exists in BW.
While loading the flat file, I made a mistake with the DIVISION characteristic: the original master data value is '04000', but I loaded the transactional data with the value '4000'. I realized this later after seeing the data in the InfoCube, deleted the request, and then reloaded the data with the value '04000'. Up to here everything is fine.
But when I look at the master data for DIVISION, I can see a new entry with the value '4000'.
My question is how to delete the entry '4000' from DIVISION. I tried deleting this entry manually via "maintain master data", but it is not allowing me to do so.
I have also checked whether any transactional data exists for the value '4000'; as I said earlier, I have deleted the transactional data with that value. I even tried to delete the entries from the master data table, but I do not see an option to delete entries there.
Please suggest me on this.
Regards,
Veera
Hi,
Go to RSA1, right-click on the InfoObject, and select Delete Master Data. This will delete the unused master data existing in the table.
If this master data is not used anywhere else, just delete the master data completely with the SID option.
If even this doesn't work, you can delete the complete table contents in SE14. But this will wipe out the entire table, so be sure you want to do this.
Hope this helps
Akhan. -
What if I load transaction data without loading master data
Hello experts,
What are the consequences if I load transaction data without loading master data? Are there any other factors besides load performance (because of SID generation, etc.) and inconsistencies?
What kind of potential inconsistencies will occur?
The problem here is:
when the transaction load starts, a new master data record, such as employee (x), may have been created in R/3 that does not exist in BW, and hence the transaction load fails.
Thanks and Regards
Uma Srinivasa Rao
Hi Rao,
In case you load the master data after loading the transaction data, and there is any lookup of master data in the update rules, you can delete and reconstruct the requests in the ODS/cube so that the latest master data is pulled into the data target.
Make sure you apply the hierarchy/attribute change before doing the delete and reconstruct.
Bye
Dinesh -
While loading transaction data into a cube, what tables are generated
Hi,
While loading transaction data into a cube, what tables are normally generated?
Hi,
Normally the data is loaded into the 'F' fact table, /BIC/F<cube name> (for example, a cube ZSALES would have /BIC/FZSALES).
When you compress the request, the data is moved into the E table.
Regards,
Siva. -
Error Loading Transactional Data into Cube(0PCA_C01)
Hi Guys,
I am trying to install the following cubes from Business Content: 0PCA_C01 / 0PCA_C02 (Profit Center Analysis). Everything got replicated. I am trying to load transaction data now. I created an InfoPackage and loaded the data. It has been running for a long time and still says "not yet completed/warning". If I try to see the content of the cube, I get the following errors.
"Your user master record is not sufficiently maintained for object Authorization Object3.
System error: RSDRC / FORM AUTHORITY_CHECK USER NOT AUTHORIZED 0PCA_C01 0PCA_C01
System error: RSDRC / FUNC RSDRC_BASIC_CUBE_DATA_GET ERROR IN RSDRC_BASIC_QUERY_DATA_GET 0PCA_C01 64
System error: RSDRC / FORM DATA_GET ERROR IN RSDRC_BASIC_CUBE_DATA_GET 0PCA_C01 64"
Also, if I try to change something in the InfoPackage, it says "Init. select. for field name currently running". I guess that's because the job is still running.
Please let me know if I missed something.
Raj
Hi Raj,
This seems to be an authorization case.
I guess you are in the BW development system.
Go to SU01, enter your user ID, and choose Display. Go to the Roles and Profiles tabs.
Ideally in development, a developer's ID should have access to all development activities. So check with the Basis folks and get access to the relevant profiles. If you can get SAP_ALL, then perfect!
Prakash
Assigning points is a way of saying thanks on SDN! -
Load transaction data from ECC to BPC 10.1 using US GAAP starter kit SP3
I need to understand a bit more about the process to load transactional data into BPC 10.1. We have US GAAP Starter Kit SP3. Below is a screenshot from the config guide:
It explains how transactional data can be extracted from the ECC system to the SAP NetWeaver BW DataSource 0FI_GL_10 and then transformed to get it loaded into BPC. The objects /PKG/FC_C01 and /PKG/FC_DS01 are mentioned here only as a reference, because they come with the RDS for Financial Close and Disclosure Management, which not all companies have.
I believe the upward data flow should be from DataSource 0FI_GL_10 to DataStore Object 0FIGL_O10 and then to InfoCube 0FIGL_R10. There is also a data flow that goes from 0FI_GL_10 directly to InfoCube 0FIGL_R10. Can anyone with experience with the US GAAP starter kit answer this?
Thank you.
Hello, we were able to load actuals into our environment with the US GAAP Starter Kit SP03. I followed the Operating Guide document up to section 5.2 and ran the Consolidation data manager package with no issue. We are using the A20 Input and A20 Consolidation process flows based on flows F00, F10, F20, F30, F99, etc. According to the documentation, the Statement of Cash Flow and Changes in Equity should be automatically calculated and available from the changes in balance sheet accounts once consolidation is completed. However, when I run the corresponding reports, they bring back no data.
We loaded actual data for the whole of 2013 and January 2014. Our intention is to run the first consolidation for January 2014. The closing balance for period 12.2013 in flow F99 was copied to flow F00 of 2014 (opening balance). Flows F20 and F30 were updated with the corresponding balance sheet movements (increase/decrease), according to the delivered controls in the starter kit. However, the cash flow still shows no results.
I found the following text in the operating guide, but I am not clear whether I am missing a step. Can you please clarify? This is telling me that I need to copy my 01.2014 opening balance (F00) to the 12.2013 closing balance (F99, which is done by the Copy Opening data manager package, but in the opposite direction, from Y-1 F00 to Y F99) and, in addition, to also copy that balance to 12.2013 F00 (the previous year's opening balance)?
"5.2.2 First Consolidation
When operating the consolidation for a given Scope for the first time in the application, it is necessary to populate and process the prior year-end time period for this Scope, in addition to the first required consolidation period. This is because the flows dedicated to the scope change analysis are properly populated by the consolidation engine provided that at least one automatic journal entry has been detected in the consolidation used as opening consolidation, which by default is the prior year end (December, Y-1). To accomplish this, it is recommended that you copy all input-level opening data of the first required consolidation (flow “F00”) into the closing data (“F99”) and opening data (“F00”) of the prior year-end time period, including intercompany breakdowns.
This breakdown by intercompany in the balance sheet triggers automatic eliminations, which are carried over to the closing position. After the consolidation process, the closing data of the prior year-end time period will contain the appropriate source data for the opening data of the following consolidation, notably the opening automatic journal entries."