Deleting master data after loading transactional data using flat file
Dear All,
I have loaded transaction data into an InfoCube using a flat file. In the DTP I checked the option "load transactional data without master data exists", so the transactional data was loaded even though no master data existed in BW.
While loading the flat file, I made a mistake with the DIVISION characteristic: the original master data value is '04000', but I loaded the transactional data with the value '4000'. I realized this later after seeing the data in the InfoCube, so I deleted the request and reloaded the data with the value '04000'. Up to this point everything was fine.
But when I look at the master data for DIVISION, I can see a new entry with the value '4000'.
My question is how to delete the entry '4000' from DIVISION. I tried deleting it manually via 'Maintain master data', but the system does not allow me to.
I have also checked whether any transactional data exists for the value '4000'; as I said, I have already deleted the transactional data with that value. I even tried to delete the entries from the master data table directly, but I do not see an option to delete entries there.
Please suggest a solution.
Regards,
Veera
Hi,
Go to RSA1, right-click the InfoObject and select Delete Master Data. This deletes the unused master data entries in the table.
If this master data is not used anywhere else, delete it completely with the 'with SIDs' option.
If even this doesn't work, you can delete the entire table contents in SE14. But this wipes out the whole table, so be sure before you do it.
Hope this helps
Akhan.
Similar Messages
-
Loading international characters using a flat file
Hi All,
We have a requirement to load Japanese and German characters using a flat file. How can we go about it?
When I convert the file to CSV, only '????' is displayed in place of the international characters.
Any help or documentation on how to load international characters would be appreciated and awarded in SDN.
Regards,
Samved
Dear Samved Sinho,
When you create the DataSource for the flat file, select the language-dependent option, and include the corresponding language key (in SAP terminology) in the text table.
Regards
venu -
What if I load transaction data without loading master data
Hello experts,
What are the consequences of loading transaction data without loading master data first? Are there other factors besides load performance (because of SID generation etc.) and inconsistencies?
What kind of potential inconsistencies can occur?
The problem here is:
when the transaction load starts, new master data such as employee X may have been created in R/3 that does not yet exist in BW, and the transaction load then fails.
Thanks and Regards
Uma Srinivasa Rao
Hi Rao,
If you load the master data after loading the transaction data and the update rules contain a lookup on that master data, you can delete and reconstruct the requests in the ODS/cube so that the latest master data is pulled into the data target.
Make sure you run 'Apply hierarchy/attribute change' before the delete and reconstruct.
Bye
Dinesh -
Need more Info about "Load transactional data when master data not loaded"
Hi,
Can you please explain this option in the InfoPackage: 'Load transactional data when master data is not loaded'?
Say I load a transactional data record with material number AAAXX.
In the fact table, the material number is replaced with the corresponding DIM ID.
Now assume there is no entry for material AAAXX in the master data table, and therefore no DIM ID for it.
How is the record then stored in the fact table?
I hope I have managed to explain the scenario.
Thanks in advance,
Punkuj
Hello Punkuj K,
If the entry for material number 'AAAXX' does not exist in the master data, the system creates SIDs and DIM IDs for it, and the transaction data is loaded.
Use
Choose this indicator if you want to always update the data, even if no master data for the navigation attributes of the loaded records exists. The master data is generated from the loaded transaction data. The system draws SIDs. You can subsequently load the master data.
Dependencies
The texts, attributes, and hierarchies are not recognized by the system. Load the texts / attributes / hierarchies for the corresponding InfoObjects separately.
This function corresponds to the update. Possible errors when reading the master data into the update rules are not associated with this indicator.
Recommendation
We recommend loading the master data into productive systems in advance. Select this indicator especially in test systems.
Best Regards....
Sankar Kumar
+91 98403 47141 -
Load transaction data from ECC to BPC 10.1 using US GAAP starter kit SP3
I need to understand the process of loading transactional data into BPC 10.1 a bit better. We have the US GAAP Starter Kit SP3. Below is a screenshot from the configuration guide:
It explains how transactional data can be extracted from the ECC system into the SAP NetWeaver BW DataSource 0FI_GL_10 and then transformed and loaded into BPC. The objects /PKG/FC_C01 and /PKG/FC_DS01 are mentioned here only as a reference, because they come with the RDS for Financial Close and Disclosure Management, which not all companies have.
I believe the upward data flow should be from DataSource 0FI_GL_10 to DataStore Object 0FIGL_O10 and then to InfoCube 0FIGL_R10. There is also a data flow that goes from 0FI_GL_10 directly to InfoCube 0FIGL_R10. Can anyone with experience with the US GAAP starter kit confirm this?
Thank you.
Hello, we were able to load actuals into our environment with the US GAAP Starter Kit SP03. I followed the Operating Guide up to section 5.2 and ran the Consolidation data manager package with no issue. We are using the A20 Input and A20 Consolidation process flows based on flows F00, F10, F20, F30, F99, etc. According to the documentation, the Statement of Cash Flows and Changes in Equity should be automatically calculated and available from the changes in balance sheet accounts once consolidation is completed. However, when I run the corresponding reports, they return no data.
We loaded actual data for the whole of 2013 and for January 2014. Our intention is to run the first consolidation for January 2014. The closing balance for period 12.2013 in flow F99 was copied to flow F00 of 2014 (opening balance). Flows F20 and F30 were updated with the corresponding balance sheet movements (increase/decrease), according to the delivered controls in the starter kit. However, the cash flow still shows no results.
I found the following text in the operating guide, but I am not clear whether I am missing a step. Can you please clarify? This tells me that I need to copy my 01.2014 opening balance (F00) to the 12.2013 closing balance (F99, which is done by the Copy Opening data manager package, but in the opposite direction, from Y-1 F00 to Y F99) and in addition also copy that balance to 12.2013 F00 (previous year opening balance)?
"5.2.2 First Consolidation
When operating the consolidation for a given Scope for the first time in the application, it is necessary to populate and process the prior year-end time period for this Scope, in addition to the first required consolidation period. This is because the flows dedicated to the scope change analysis are properly populated by the consolidation engine provided that at least one automatic journal entry has been detected in the consolidation used as opening consolidation, which by default is the prior year end (December, Y-1). To accomplish this, it is recommended that you copy all input-level opening data of the first required consolidation (flow “F00”) into the closing data (“F99”) and opening data (“F00”) of the prior year-end time period, including intercompany breakdowns.
This breakdown by intercompany in the balance sheet triggers automatic eliminations, which are carried over to the closing position. After the consolidation process, the closing data of the prior year-end time period will contain the appropriate source data for the opening data of the following consolidation, notably the opening automatic journal entries." -
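The first-consolidation copy described in the quoted guide can be sketched as a simple data movement between flow members. This is a minimal illustration only, assuming a flat list-of-records model; the period and flow names mirror the thread, while the account name and amount are made up:

```python
# Sketch: populate the prior year-end period before the first consolidation.
# Input-level opening data (flow F00) of the first consolidated period is
# copied to both the closing (F99) and opening (F00) flows of December, Y-1.

def seed_prior_year_end(records):
    """records: list of dicts with keys period, flow, account, amount."""
    seeded = list(records)
    for rec in records:
        if rec["period"] == "2014.01" and rec["flow"] == "F00":
            for flow in ("F99", "F00"):
                seeded.append({**rec, "period": "2013.12", "flow": flow})
    return seeded

data = [{"period": "2014.01", "flow": "F00", "account": "CASH", "amount": 100.0}]
result = seed_prior_year_end(data)
# 2013.12 now holds the same balance on both F99 and F00, so the consolidation
# engine finds an opening consolidation to base the scope-change flows on.
```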
Hi Experts,
We monitor loads on a daily basis. As part of testing in the BW development system, we are loading data into the cube 0PM_C01. We want to load the current year's data (2007-2008), but whenever we load through the InfoPackage "Load Transactional Data Current Year: 0PM_OM_OPA_1", it does not delete the previous load requests. It was initially configured with this option not selected; we recently changed it, but it still does not delete. The only difference we see between this load and others that do delete is that this one is executed from a process chain rather than an InfoPackage group. What do we need to do to make it delete previous load requests?
Please help me in this regard. Points will be assigned.
thanks
Sekhar
Hello,
Since you are running this in a process chain, you need to add the step "Delete Overlapping Requests from InfoCube" after the load step so that the previous request is deleted.
regds,
Shashank -
This year's master data, previous year's transaction data load
hi guys,
I support some InfoCubes where financial data is stored and reported on. Our source systems are financial databases.
We load transaction data at every month-end throughout the year. We load master data (full load) from flat files whenever there are changes. Whenever master data changes, we include those changes in the transaction data InfoPackage selections, so the transaction data related to the new master data is also loaded.
Now my situation is:
It is now June 2010. My boss said I have to reload the transaction data for 2009, because some of the rates on which the 2009 data was calculated have changed. New rates came in, so the business recalculated the 2009 transaction data and sent it to us. Now we have to load this new 2009 data and delete the OLD 2009 data from the cubes.
My question: as I am going to reload the 2009 transaction data, but the master data in the system is up to date as of June 2010 (it is not the same master data as in 2009; there have been a few additions and renamings since then), what do we need to do in this type of situation?
Also, as I said above, whenever we have changes in master data, we change the transaction data InfoPackage selections so they also load the transaction data related to the new master data. This means today's InfoPackage selections do not reflect the InfoPackage selections of 2009, yet we are still trying to reload the 2009 transaction data with today's selections.
How should we handle this? Do we just go ahead and load the 2009 data with the 2010 master data and the 2010 InfoPackage selections?
Thanks a lot for your patience.
Rgds.
We actually have time-dependent master data: one version of master data for each year. For 2005 we have version 2, for 2006 version 3, and so on. The master data version and the year it is linked to are maintained in a custom table.
So, coming back to my previous post:
For 2009 we have version 4, and the transaction data was loaded then.
For 2010 we have version 5, and transaction data is loaded at the end of every month. We have also decided to reload the 2009 data, because the rates on which the 2009 data was calculated changed, so the business recalculated that data and sent it to us, and we have to load the new 2009 data and delete the OLD 2009 data.
Now my questions: if we reload the 2009 data, today's master data is that of 2010 (version 5), which is meant for the 2010 transaction data loads. Doesn't this create inconsistencies, or am I overthinking it and should just go ahead and load?
Also, today's InfoPackage selections do not match the InfoPackage selections of 2009, because there were additions to the master data that we included in the selections. Do we need to change the InfoPackage selections back to the 2009 ones until the 2009 reload is over?
I hope I explained it better this time.
Thanks a lot. -
Can we use LSMW to load Transaction data
Hi.
Can we use LSMW to load transactional data, or is it used only for master data?
With Best Regards
Mamatha
Hi,
You can upload transactional data using:
standard programs,
batch input recording,
the BAPI method,
IDocs.
best regards,
Thangesh -
How to rectify the errors in master data loads & transactional data loads?
Hi,
Can anyone tell me how to rectify errors in master data loads and transactional data loads?
Thanks,
Ravi
Hi,
Please post specific questions in the forum.
Please explain the error you are getting.
-Vikram -
Why do we load master data first before loading transaction data
Hi Experts,
Why do we load master data first before loading transaction data? Are there specific reasons for it? Is it mandatory to load master data first?
I will allocate points to those who help me in detail. My advance thanks to whoever responds to my query.
Edited by: Nagireddy Pothireddy on Mar 10, 2008 8:17 AM
Hi Nagireddy,
I hope this helps....
The bottom line for building cubes is to view facts against dimensions. Facts are the key figures, i.e. sales volume, sales VAT etc., measured against characteristics such as sales area, cost center or plant.
Basically, characteristics are those against which key figures are measured, like cost center, plant, material etc.
Dimensions are groupings of related characteristics. So a cube has a central fact table with dimensions associated to it in a relational schema. Imagine you want to view the key figure sales volume against the dimension plant. A plant has a distribution channel, purchasing organization, company code, sales area, region etc. associated with it. These form the attributes of plant, and each also has descriptions (texts) and possibly a hierarchy. So first we load the master data, and then the transaction data follows. -
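The fact-table-plus-dimensions idea described above can be sketched with plain dictionaries. This is a toy illustration of the star-schema lookup only, not BW's actual storage; all table contents and names here are invented:

```python
# Toy star schema: the fact table stores dimension IDs, not characteristic
# values; reporting joins facts back to characteristics via the dimension.
dim_plant = {1: {"plant": "P100", "company_code": "C001", "region": "EMEA"},
             2: {"plant": "P200", "company_code": "C002", "region": "APAC"}}
fact_table = [
    {"dim_plant": 1, "sales_volume": 500},
    {"dim_plant": 1, "sales_volume": 250},
    {"dim_plant": 2, "sales_volume": 100},
]

def sales_by_region(facts, dim):
    # Resolve each fact row's DIM ID to its plant attributes, then aggregate
    # the key figure by one of those attributes (region).
    totals = {}
    for row in facts:
        region = dim[row["dim_plant"]]["region"]
        totals[region] = totals.get(region, 0) + row["sales_volume"]
    return totals

totals = sales_by_region(fact_table, dim_plant)
```

This also shows why the dimension (master data) entry must exist, or be created on the fly, before a fact row referencing it can be stored.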
Can we load transaction data without loading master data? Explain w.r.t. SIDs
Can we load transaction data without loading master data? If so, can you explain how and when surrogate IDs / dimension IDs get generated?
Hi,
We can load transaction data without loading master data.
But for load performance reasons, it is recommended not to load the transaction data first.
Every InfoObject in BW is identified by a SID. When we load transaction data (consider an InfoCube scenario), the system identifies each InfoObject value by its existing SID in BW and correspondingly creates DIM IDs.
Now consider the scenario where master data is not loaded and transaction data is being loaded.
The system needs a SID to identify each InfoObject value and to create the DIM ID (which is a combination of the SIDs of the characteristics in it).
As the SIDs are not yet created, i.e. there is no master data to reference, the system first creates the SID for each value, effectively creating a skeleton copy of the master data in that InfoObject, then creates the DIM IDs based on it and loads the data.
This additional task of creating SIDs during the transaction load hinders loading performance.
I hope this is clear now.
Janardhan Kumar -
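The mechanism described above, drawing new SIDs on the fly when transaction data arrives before master data, can be sketched as follows. This is a deliberate simplification (a real SID table carries more than a value-to-SID mapping), and the DIVISION values are taken from the opening question:

```python
# Sketch: loading transaction data when master data does not exist yet.
# Unknown characteristic values get a SID drawn on the fly; this extra work
# is what slows the load compared with loading master data first.
sid_table = {"04000": 1}   # pre-loaded master data: value -> SID
next_sid = 2

def get_or_create_sid(value):
    global next_sid
    if value not in sid_table:      # master data missing: draw a new SID
        sid_table[value] = next_sid
        next_sid += 1
    return sid_table[value]

loaded = [get_or_create_sid(v) for v in ("04000", "4000", "4000")]
# "4000" now exists in the SID table even though it was never loaded as
# master data: exactly the stray DIVISION entry from the opening question,
# which is why it later has to be removed with Delete Master Data.
```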
Loading transaction data using an attribute field in the BW cube
I am trying to load transaction data into BPC. All but one of my dimensions reference technical fields. One field references an attribute of a technical field: the field is 0MAT_PLANT and its attribute is 0PROFIT_CTR. In the mapping section of my transformation file I set the PROFIT_CTR dimension to 0MAT_PLANT__0PROFIT_CTR. The transformation file validates and processes fine, but when I try to import I get the error message: "0MAT_PLANT__0PROFIT_CTR is not a valid command or column 0MAT_PLANT__0PROFIT_CTR does not exist in source".
Does anyone know how to complete this successfully?
THANK YOU!
Karen B. Thibodeau
Hi,
If it is mapped correctly and you are still getting the same errors, check whether the InfoObject attribute option is checked in the data package options.
Regards -
Problem loading transactional data from 0MKT_DSO1 (ODS) to 0MKTG_C01
Hi,
I am trying to load lead transaction data into the standard CRM lead management cube from an ODS. There is a problem loading transaction data from 0MKT_DSO1 (ODS) to the InfoCube 0MKTG_C01, because the field 0STATECSYS2 (CRM status) is set to 10 in the ODS, meaning an incorrect transaction. This field is not in the InfoCube.
There is a routine in the cube that deletes data records with 0STATECSYS2 set to 10.
This field does not come in with the transaction.
So where can I see the master data in the CRM source system, and why is that field getting set to 10?
thanks in advance!
Thanks for the reply.
I have checked the fact table, which shows:
1. Data package dimension
2. Time dimension
3. Unit dimension
I have kept 0CALDAY as the time characteristic.
I loaded sample data from the ODS to the cube.
Sample data in the ODS:
Sales order No_____0CALDAY_____AMOUNT
800001___________12/02/2009____15
I have loaded this data into the cube with a full upload.
Data in the cube:
Sales order No_____0CALDAY_____AMOUNT
800001___________12/02/2009____15
Again I am loading the same data to the cube.
Data in the cube after loading:
Sales order No_____0CALDAY_____AMOUNT
800001___________12/02/2009____15
800001___________12/02/2009____15
The data is duplicated and it is not cumulating.
Am I missing anything here?
Please help.
Thanks,
Siva. -
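The duplication above follows from the cube's request-append behavior: a cube appends every loaded request to its fact table, while an ODS/DSO overwrites records with the same semantic key. A minimal sketch under simplified assumptions (real DSOs use change logs and activation; the order and amount values mirror the thread):

```python
# Sketch: why a repeated full upload duplicates rows in a cube but not in
# an ODS. The cube appends every request; the ODS overwrites on its key.
def load_to_cube(cube_rows, request):
    cube_rows.extend(request)               # requests are simply appended

def load_to_ods(ods_rows, request, key=("order", "calday")):
    for rec in request:                     # overwrite by semantic key
        ods_rows[tuple(rec[k] for k in key)] = rec

request = [{"order": "800001", "calday": "2009-02-12", "amount": 15}]
cube, ods = [], {}
for _ in range(2):                          # the same full upload run twice
    load_to_cube(cube, request)
    load_to_ods(ods, request)
# cube now holds two identical rows; ods still holds exactly one.
```

In practice the usual remedies are a delta load, loading through an overwriting ODS first, or a "Delete Overlapping Requests" step in the process chain before or after the full load.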
Enhancing the package 'Load transactional data for BW InfoProvider UI'
Dear,
for project reasons, we would like to enhance the package 'Load transactional data for BW InfoProvider UI'. More precisely, we want to add some dimensions to the data transfer method 'replace & clear data values'. Currently the system clears the data that matches each entity/category/time/datasource combination. With only those dimensions we cannot be precise enough, so in our case the system clears too much data. We want to be able to extend this with additional dimensions.
Is there any way this can be adapted?
thx.
Hi Wouter,
For a more precise delete, you should first execute a Clear and afterwards import using the Merge option.
Kind regards
Roberto -
Can I load transaction data through flat files without loading master data?
Hi all,
Is it possible to load history data into a sales cube using flat files without having loaded any master data?
thanks
Hi
You can load the transaction data without loading the master data. You have to check the option in the update rules ('update also if no master data exists').
In reporting you will not be able to see the attribute and text data. It is always advisable to load the master data first, run 'apply hierarchy/attribute change', and then load the transaction data.
AHP: Nice to see you after a long time
REGards
Rak