Master data extractor vs Transaction data extractor
Hi all,
I am creating a generic (custom) extractor.
What is the difference between creating it as a transaction data extractor versus a master data attributes extractor?
Hi,
If you are interested in extracting master data such as material or customer, go for master data extraction, whereas if you are interested in transactional data such as purchasing or finance, go for a transaction data extractor.
E.g. some master DataSources are not available in BI Content (Z objects in ECC); then go for a master data extractor to get attributes as well as texts. Note that you will not have key figure values for a master data load.
Also, some transactional data is stored in Z tables in ECC, and some data, like purchase requisitions, although stored in standard ECC tables, has no DataSource provided by SAP BI, so you have to create a transaction data extractor.
The transaction code to create these extractors in ECC is RSO2.
Thank you.
Regards,
Vinod
Similar Messages
-
This year master data-previous year transaction data load
hi guys,
I support some InfoCubes where financial data is stored and reported on. Our source systems are financial databases.
We load transaction data every month-end throughout the year. We load master data (full load) from flat files whenever there are changes. Whenever we change master data, we include those changes in the transaction data InfoPackage selections, so the transaction data related to the new master data is also loaded.
now my situation is:
It is now June 2010. My boss said I have to reload the 2009 transaction data, as there were some changes in the rates on which the 2009 data was calculated. Some new rates came in, so the business recalculated the 2009 transaction data and sent it to us. Now we have to load this new 2009 data and delete the old 2009 data from the cubes.
My question: I am going to reload the 2009 transaction data, but the master data in the system is up to date, i.e. as of June 2010 (it is not the same master data as in 2009; there have been a few additions and renamings since then). What do we need to do in this type of situation?
Also, as I said above, whenever we have changes in master data we change the transaction data InfoPackage selections so they also load the transaction data related to the new master data. That means today's InfoPackage selections do not reflect the InfoPackage selections of 2009, yet we are still trying to reload the 2009 transaction data with today's InfoPackage.
What do we need to do in this situation? Do we just go ahead and load the 2009 data with the 2010 master data and the 2010 InfoPackage selections?
Thanks a lot for your patience.
Regards. We actually have time-dependent master data: one version of master data for each year. For 2005 we have version 2, for 2006 we have version 3, and so on. Which master data version is linked to which year is maintained in a customized table.
so coming back to my previous post...
For 2009 we have version 4, and the transaction data was loaded then.
For 2010 we have version 5, and transaction data will be loaded at the end of every month. We have also decided to reload the 2009 data, because the rates on which the 2009 data was calculated changed, so our business recalculated that data and sent it to us; we have to load the new 2009 data and delete the old 2009 data.
Now my question: if we reload the 2009 data, today's master data is of 2010 (version 5), which is meant for the 2010 transaction data loads. Doesn't that create inconsistencies, or am I overthinking it and should just go ahead and load?
Also, today's InfoPackage selections do not match the InfoPackage selections of 2009, as there were additions in master data that we included in the selections. Do we need to change the InfoPackage selections back to the 2009 ones until the 2009 reload is over?
Hope I explained it better this time.
Thanks a lot. -
Master data tables and Transaction data Tables
Hello Gurus,
Please let me know how to tell which tables belong to master data and which tables belong to transaction data, for the FICO module.
Does anyone have specific material relating to master data tables and transaction data tables?
Thanks
Edited by: Manu Rathore on Jan 18, 2012 4:38 AM
Hi Manu,
Find attached a table relation diagram by Christopher Solomon. It is one of the most comprehensive charts on this topic.
deleted
Warm regards,
Murukan Arunachalam -
Data Type field read only in Data source for transaction data (PC_FILE)
Hi folks,
I need to change the data type for some of the fields on the "Fields" tab in a DataSource for transactional data. It became read-only after I activated the DataSource. I need help making it editable. All fields currently have data type CHAR.
Thanks
Never mind, folks. I got it.
-
Problem in data sources for transaction data through flat file
Hello Friends,
While creating a DataSource for transaction data through a flat file, I am getting the following error: "Error 'The argument '1519,05' cannot be interpreted as a number' while assigning character to application structure", message no. RSDS016.
If any one come across this issue, please provide me the solution.
Thanks in Advance.
Regards
Ravi
Hello,
just for information.
I had the same problem.
I changed the field type from CURR to DEC and set the format to external instead of internal.
Then the import from the flat file worked fine.
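For reference, the error comes from the comma decimal separator in '1519,05'. Outside the SAP toolchain, the same normalization can be sketched in Python (purely illustrative; the function name and the format assumptions are mine):

```python
def parse_external_number(value: str) -> float:
    """Parse a number written in external format with a comma as the
    decimal separator (and optionally '.' as a thousands separator),
    e.g. '1519,05' or '1.519,05' -> 1519.05."""
    # Drop thousands separators first, then turn the decimal comma
    # into a decimal point so float() can interpret it.
    cleaned = value.strip().replace(".", "").replace(",", ".")
    return float(cleaned)

print(parse_external_number("1519,05"))    # 1519.05
print(parse_external_number("1.519,05"))   # 1519.05
```

Switching the field to external format in the DataSource does an analogous conversion on the SAP side.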
Thank you. -
Deleting master data after loading transactional data using flat file
Dear All,
I have loaded transaction data into an InfoCube using a flat file. In the DTP I checked the option "load transactional data without master data exists", so the transactional data is loaded even if no master data exists in BW.
While loading the flat file I made a mistake for the DIVISION characteristic: the original master data value is '04000', but I loaded the transactional data with the value '4000'. I realized this later after seeing the data in the InfoCube, deleted the request, and then reloaded the data with the value '04000'. Up to here everything is fine.
But when I look at the master data for DIVISION, I can see a new entry with the value '4000'.
My question is how to delete the entry '4000' from DIVISION. I tried deleting this entry manually via "maintain master data", but it is not allowing me to do so.
I have also checked whether any transactional data exists for the value '4000'; as I said, I deleted the transactional data with that value. I even tried to delete the entries from the master data table, but I do not see an option to delete entries there.
Please suggest me on this.
Regards,
Veera
Hi,
Go to RSA1, right-click on the InfoObject and select "Delete Master Data". This deletes the unused master data existing in the table.
If this master data is not used anywhere else, just delete the master data completely with the SID option.
If even this doesn't work, you can delete the entire table in SE14. But this will wipe out the whole table, so be sure you really want to do that.
Hope this helps
Akhan. -
Dependent master data for the Transaction data
Hi,
Wish u Happy New Year to All.
I schedule some 250 master data loads daily, and then I run the transaction data loads. Because of this, system performance is affected. I want to know where I can find the dependent master data objects for a particular transaction data load.
Thanks in Advance
Regards,
Siva
Dear Siva,
In the InfoSource overview for the particular data target, the dependent master data objects can be shown.
Regards
Saravanan.ar -
Dates in Master data is correct & Transaction data date one day in advance
Dear All
In master data, the date is the current date, say 16.01.2009 (CS01, CA01, SU01 etc.).
In transaction data, say MIGO, F-53 etc., the system takes 17.01.2009.
In the operating system the current date is 16.01.2009.
In SU01 the user settings are:
system time zone is CET; time format is 24-hour.
Kindly advice me how to resolve this issue.
Regards
Jayshankar.A
Hi Ruchit,
All users have the same settings, and we tried logging out and logging in twice; it is not that problem. As per your advice we checked the SAP Note, and per its instructions we ran the program TZCUSTDISP in SE38. This was the output:
Kindly check and advice
Display System Parameters for Time Zones
current date: 00.00.0000
current time: 00:00:00
timezone:
Time Zone Information of System
sy-tzone : 28800-
sy-dayst :
sy-zonlo : CET
sy-datlo : 17.01.2009
sy-timlo : 03:57:05
sy-datum : 16.01.2009
sy-uzeit : 18:57:05
Setting in Operation System
Diff. from UTC 28,800
Time Zone Text Paci
R/3 Time Diagnostic Program on sarita
Universal Time Coordinated UTC....: 1232161025
Date and time of database.........: 16.01.2009 18:57:05
Date and Time of R/3-Kernel.......: 16.01.2009 18:57:05
Date and Time of ABAP-Processor...: 16.01.2009 18:57:05
ABAP Timezone Setup ..............: 28800-
Date and Time / localtime ........: 16.01.2009 18:57:05 -
Master Data vs. Transactional Data
Hi,
I'm creating a report (Daily Sales Report) where I am retrieving most of the values from DataSource 2LIS_11_VAITM (SD Sales: Document Item).
Some values (e.g. Sales Org., Distribution Channel, Sales Office, Customer Group, Plant) could be taken from Customer or Material Master Data instead of from the DataSource 2LIS_11_VAITM.
Could anyone explain to me the benefits of taking the transactional data (i.e. 2LIS_11_VAITM) instead of the master data?
Are there any disadvantages?
Thanks,
Tobias
Hi,
You can of course take the master data values from the customer or material master data instead of from the DataSource 2LIS_11_VAITM. But at item level, one purchase order or sales order can have multiple materials attached to it, so for one entry in the master table there can be multiple entries in the item table. So while accessing data from the item table you must clearly specify two things: first, the sales order number / purchase order number; second, the item number, in order to find the unique records you require. Otherwise there can be ambiguity when you retrieve data from the item table.
So I can say that when accessing header-level data you can access the master tables, and when accessing item-level data you should access the item table with the proper combination of keys.
There is no advantage or disadvantage in accessing the master or the item table; it depends entirely on the data required.
I think your doubt is clear now; if not, please reply and I will explain in more detail.
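To illustrate the point about composite keys with a toy example (hypothetical order and material names, not from the thread): keyed by order number alone, an item-level lookup is ambiguous; the (order number, item number) pair identifies a unique record.

```python
# Hypothetical item-level data: one order header can own many items,
# so the item table needs a composite key.
items = {
    # (order_number, item_number) -> material
    ("4500000001", 10): "MAT-A",
    ("4500000001", 20): "MAT-B",  # same order, different material
    ("4500000002", 10): "MAT-C",
}

# Looking up by order number alone returns multiple rows -- ambiguous:
matches = [mat for (order, _item), mat in items.items() if order == "4500000001"]
print(matches)  # ['MAT-A', 'MAT-B']

# The composite key identifies exactly one record:
print(items[("4500000001", 20)])  # MAT-B
```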
Reward points if useful.
regards,
JAY. -
What's the difference? (Generic Master DataSource vs Transaction DataSource)
Hi Gurus,
I have been wondering what the difference is when creating <b>GENERIC DataSources:</b>
1.) Transaction Data
2.) Master Data
Please don't give an answer like (<i>A master data generic DataSource is used to populate master data InfoObjects and transaction DataSources are used to populate ODSs and cubes.</i>)
This definition is <b>FALSE</b>, since I can use a <b>"Master Data (DataSource)"</b> to populate my ODS.
They only differ in process chains, since the action when loading data using a <i>generic master data DataSource</i> is an <b>Attribute Change Run</b>, whereas loading an ODS using a transaction DataSource uses "<b>Activate ODS Data</b>".
Hope Somebody will clarify this...
--Jkyle
Hi Jkyle,
At the moment I think it is more or less grouping information in BW, and additionally information for you, your users, etc. about what type of data is being loaded through the DataSource.
By grouping I mean the type information used for the list that opens when you assign a DataSource to an InfoSource.
By type of data I mean some sort of classification of the data for a particular application. I can imagine that the same data can be used as transactional data as well as master data, depending on the application. A good example of this is header data: for some applications you have header (transactional) DataSources providing a header counter, and additionally you have the document number (basically the header information) with some attributes as master data.
Maybe there is a difference in the function modules that are called to extract the data; I am not sure about that.
Hope this helps a bit.
regards
Siggi -
Changing master data related to transactional data
I have master data, for example partner. I load master data monthly. Partners will change.
Master Data
Load 1 (Jan)
PARTNER Date
partner1 Jan
partner2 Jan
partner3 Jan
Load2 (Feb)
PARTNER Date
partner1 Feb
partner2 Feb
partner4 Feb
For every load there is also transactional data.
Load1 (Jan)
Partner1 Jan = 50 Dollar
Partner2 Jan = 60 Dollar
Partner3 Jan = 60 Dollar
Load2 (Feb)
Partner1 Feb = 30 Dollar
Partner2 Feb = 20 Dollar
PArtner4 Feb = 40 Dollar
Now I have the whole history data, but I have to load all the master data. Is there a way to connect to the fact table more easily than by loading all the master data monthly?
Do I store all the master data in the InfoObject, or is there a possibility to use master data from a DSO if the date is not current?
We get lots of master data if we create it like this; I think in 2 years it gets really slow.
Please give some details of the information you are storing in master data.
is it a time dependent master data?
Regards,
Gaurav -
Return latest transaction data, based upon transaction dates.
I appreciate I'm being a little dense here; I have searched, read, and tried out a few different solutions I've seen given around the place. However, I think I'm struggling more with the naming conventions and logic of other people's queries than you might in understanding mine (here's hoping!).
I have a huge table, which contains a record for every transaction which has an effect on our inventory (yup - BIG table!)
For a given transaction type 'CHC' (CHange Costs) I want to return the Part code, Transaction Date and Transaction cost of the LAST TWO changes.
Because its to be used for tracking updates to the cost of materials, and further for calculating the ongoing effect of these, I just need the two for now.
So,
Table is I50F
Columns required are
I50PART - Part Code
I50TDAT - Transaction Date
I50UCOST - Price changed to
I50TRNS - Transaction Type (we just want CHC)
Sample Data (Including just columns we are interested in)
I50PART I50TDAT I50UCOST I50TRNS
BACCA001 08/03/2006 07:34:51 0.08829 CHC
BACCA001 25/07/2007 08:26:30 0.10329 CHC
BACCA001 10/04/2008 16:29:02 0.10639 CHC
BACCA003 20/06/2006 12:22:30 0.16814 CHC
BACCA003 25/07/2007 08:26:54 0.17024 CHC
BACCA003 10/04/2008 13:30:12 0.17535 CHC
BACCA004 28/08/2007 15:46:03 0.06486 CHC
BACCA004 28/08/2007 15:49:15 0.06328 CHC
BACCA004 30/10/2008 09:22:40 0.06952 CHC
BACCA004 13/01/2009 09:09:07 0.06867 CHC
BACCA005 25/07/2007 08:27:24 0.06715 CHC
BACCA005 10/04/2008 15:45:14 0.06916 CHC
BACCA005 30/10/2008 09:05:17 0.07453 CHC
BACCA005 13/01/2009 09:06:49 0.07275 CHC
To take a part in isolation, BACCA005:
I'm interested in the last two records.
It makes sense for there to be two records output per part at this stage, as it may be that the powers that be decide they want the last 3, or 4, or whatever (I'm sure everybody has similar experiences with beancounters).
Is it A) easy, and B) relatively efficient? There are 2.4m records in the table.
If I've been stupid and not included enough info, please do [metaphorically] poke me in the eye, and I'll pad out a bit.
Thanks ever so much for reading - and even more so if you can help!
Cheers
J
Analytic functions FTW!
with I50F as (select 'BACCA001' I50PART, to_date('08/03/2006 07:34:51', 'dd/mm/yyyy hh24:mi:ss') I50TDAT, 0.08829 I50UCOST, 'CHC' I50TRNS from dual union all
select 'BACCA001' I50PART, to_date('25/07/2007 08:26:30', 'dd/mm/yyyy hh24:mi:ss') I50TDAT, 0.10329 I50UCOST, 'CHC' I50TRNS from dual union all
select 'BACCA001' I50PART, to_date('10/04/2008 16:29:02', 'dd/mm/yyyy hh24:mi:ss') I50TDAT, 0.10639 I50UCOST, 'CHC' I50TRNS from dual union all
select 'BACCA003' I50PART, to_date('20/06/2006 12:22:30', 'dd/mm/yyyy hh24:mi:ss') I50TDAT, 0.16814 I50UCOST, 'CHC' I50TRNS from dual union all
select 'BACCA003' I50PART, to_date('25/07/2007 08:26:54', 'dd/mm/yyyy hh24:mi:ss') I50TDAT, 0.17024 I50UCOST, 'CHC' I50TRNS from dual union all
select 'BACCA003' I50PART, to_date('10/04/2008 13:30:12', 'dd/mm/yyyy hh24:mi:ss') I50TDAT, 0.17535 I50UCOST, 'CHC' I50TRNS from dual union all
select 'BACCA004' I50PART, to_date('28/08/2007 15:46:03', 'dd/mm/yyyy hh24:mi:ss') I50TDAT, 0.06486 I50UCOST, 'CHC' I50TRNS from dual union all
select 'BACCA004' I50PART, to_date('28/08/2007 15:49:15', 'dd/mm/yyyy hh24:mi:ss') I50TDAT, 0.06328 I50UCOST, 'CHC' I50TRNS from dual union all
select 'BACCA004' I50PART, to_date('30/10/2008 09:22:40', 'dd/mm/yyyy hh24:mi:ss') I50TDAT, 0.06952 I50UCOST, 'CHC' I50TRNS from dual union all
select 'BACCA004' I50PART, to_date('13/01/2009 09:09:07', 'dd/mm/yyyy hh24:mi:ss') I50TDAT, 0.06867 I50UCOST, 'CHC' I50TRNS from dual union all
select 'BACCA005' I50PART, to_date('25/07/2007 08:27:24', 'dd/mm/yyyy hh24:mi:ss') I50TDAT, 0.06715 I50UCOST, 'CHC' I50TRNS from dual union all
select 'BACCA005' I50PART, to_date('10/04/2008 15:45:14', 'dd/mm/yyyy hh24:mi:ss') I50TDAT, 0.06916 I50UCOST, 'CHC' I50TRNS from dual union all
select 'BACCA005' I50PART, to_date('30/10/2008 09:05:17', 'dd/mm/yyyy hh24:mi:ss') I50TDAT, 0.07453 I50UCOST, 'CHC' I50TRNS from dual union all
select 'BACCA005' I50PART, to_date('13/01/2009 09:06:49', 'dd/mm/yyyy hh24:mi:ss') I50TDAT, 0.07275 I50UCOST, 'CHC' I50TRNS from dual)
select I50PART, I50TDAT, I50UCOST, I50TRNS
from (select I50PART, I50TDAT, I50UCOST, I50TRNS, row_number() over (partition by I50PART order by I50TDAT desc) rn
from I50F
where I50TRNS = 'CHC')
where rn <= 2
order by I50PART, I50TDAT desc;
I50PART I50TDAT I50UCOST I50
BACCA001 10/04/2008 16:29:02 .10639 CHC
BACCA001 25/07/2007 08:26:30 .10329 CHC
BACCA003 10/04/2008 13:30:12 .17535 CHC
BACCA003 25/07/2007 08:26:54 .17024 CHC
BACCA004 13/01/2009 09:09:07 .06867 CHC
BACCA004 30/10/2008 09:22:40 .06952 CHC
BACCA005 13/01/2009 09:06:49 .07275 CHC
BACCA005 30/10/2008 09:05:17 .07453 CHC -
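For comparison, the same "last two rows per key" idea behind the ROW_NUMBER query can be sketched outside SQL; here in Python with a small hypothetical subset of the data (ISO date strings, so lexicographic order matches chronological order):

```python
from collections import defaultdict

# A few sample (part, transaction date, unit cost) rows, dates in ISO format.
rows = [
    ("BACCA005", "2007-07-25 08:27:24", 0.06715),
    ("BACCA005", "2008-04-10 15:45:14", 0.06916),
    ("BACCA005", "2008-10-30 09:05:17", 0.07453),
    ("BACCA005", "2009-01-13 09:06:49", 0.07275),
    ("BACCA001", "2006-03-08 07:34:51", 0.08829),
    ("BACCA001", "2008-04-10 16:29:02", 0.10639),
]

# Group by part, sort each group by date descending, keep the first two --
# the same effect as ROW_NUMBER() OVER (PARTITION BY part ORDER BY date DESC) <= 2.
by_part = defaultdict(list)
for part, tdat, cost in rows:
    by_part[part].append((tdat, cost))

latest_two = {part: sorted(hist, reverse=True)[:2] for part, hist in by_part.items()}
print(latest_two["BACCA005"])
```

The database does this far more efficiently over 2.4m rows, of course; the sketch is only meant to make the partition-and-rank logic concrete.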
Data (Master Data Changes and Transaction Data) from SAP BW to SAP BPC 5.1
Hi guys
I have seen posts on this forum describing data transfers from SAP R/3 to SAP BPC. I assume the procedure for data transfers from SAP BW to SAP BPC 5.1 should be the same i.e. using SSIS packages.
However I have some unique requirements -
1. DATA AT DIFFERENT AGGREGATED LEVELS - I need data from SAP BW at different levels - Some data comes at Product level while other at Customer level and some at Project Level. The current procedure takes BW queries output in excel sheets (6 files) and then use the data manager package to load the data in SAP BPC 5.1 using appropriate transformation and conversion files. This procedure is highly manual and I am looking at using SSIS package to do this. However, because of having data at different levels, it becomes a little tricky. How can we achieve this using SSIS?
2. UPDATING MASTER DATA - I need to update the master data (dimension members) in SAP BPC 5.1 at the start of every month. The current procedure compares (in MS ACCESS) the data from the queries mentioned in 1 to the dimension members in SAP BPC 5.1 and spits a file with the new entries which needs to be manually updated in the appropriate dimensions using Admin Console. I am looking at automating this task. I cannot just replace all the contents of a dimension with the members coming from SAP BW since the dimension members contains some dummy members which are used for planning.
3. HIERARCHY CHANGES - What is the best way to capture the hierarchy changes in SAP BW into SAP BPC 5.1?
Please advise.
Thanks,
Ameya Kulkarni
Hi Ameya,
How did you solve the described problems? Can you give some hints about uploading master data and updating the hierarchy?
BR, André -
Relation between master and transaction data
Hi experts,
I have a basic question about how the relation between master data and transaction data is maintained in R/3. As I understand it, in a typical R/3-based scenario we extract master data first and transaction data later.
1. How does BW know that the transaction data is related to master data in the same sense as in R/3?
I am involved in an enhancement of an R/3 extractor. I need to derive 3 more fields and map them to InfoObjects in BW. Could anyone explain to me (with an example, please) how these alignments are to be made? How can I establish the relation between the derived fields and the transaction data during extraction?
regs
D Bret
Hi Dave.
Let's see if I can get all of your questions answered. First, when transactional data is loaded, BW checks whether a SID exists for each characteristic field and value in the load. If one doesn't, it either creates a SID (surrogate ID) in the master data tables (the default action) or the load fails (if the enforce-referential-integrity flag is set). It's better to load master data first in either case, as it improves the performance of the transactional load.
In terms of R/3, the data has similar relationships as in BW. There are master data tables and transactional data tables. The difference is that while R/3 uses roughly one master data table per field, BW uses more. The same goes for transactional data.
As for your enhancement, if everything you need for the derivation of the extra fields is in R/3, then you can enhance the extractor in RSA6 and do the mapping in R/3. Once you replicate the DataSource to BW and reconnect the transformation and update rules, the load should work. If you can give a bit more information on what sort of enhancement you are attempting, I can give a better example.
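The SID behaviour described above can be sketched as a toy model (hypothetical Python, not actual BW internals; class and method names are mine):

```python
class SidTable:
    """Toy model of BW's surrogate-ID (SID) handling during a load.

    If enforce_integrity is set, unknown characteristic values fail the
    load; otherwise a new SID is generated on the fly (the default)."""

    def __init__(self, enforce_integrity: bool = False):
        self.sids = {}          # characteristic value -> SID
        self.next_sid = 1
        self.enforce_integrity = enforce_integrity

    def lookup_or_create(self, value: str) -> int:
        if value in self.sids:
            return self.sids[value]
        if self.enforce_integrity:
            raise ValueError(f"no master data for {value!r}: load fails")
        sid = self.next_sid          # mint a new surrogate ID
        self.sids[value] = sid
        self.next_sid += 1
        return sid

table = SidTable()
print(table.lookup_or_create("MAT-001"))  # new SID created: 1
print(table.lookup_or_create("MAT-001"))  # same SID reused: 1
```

Loading master data first simply means most lookups hit the first branch, which is why it speeds up the transactional load.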
Cheers,
Adam -
Master Data/transactional Data Loading Sequence
I am having trouble understanding the need to load master data prior to transactional data. If you load transactional data and there is no supporting master data, then when you subsequently load the master data, are the SIDs established at that time, or will they not sync up?
I feel that in order to do a complete reload of new master data, I need to delete the data from the cubes, reload the master data, then reload the transactional data. However, I can't explain why I think this.
Thanks, Keith
A different approach is required for different data target scenarios. Below are just two scenarios out of many possibilities.
Scenario A:
The data target is a DataStore object with the indicator 'SIDs Generation upon Activation' set in the DSO maintenance, using a DTP for data loading.
The following applies depending on the indicator 'No Update without Master Data' in DTP:
- If the indicator is set, the system terminates activation if master data is missing and produces an error message.
- If the indicator is not set, the system generates any missing SID values during activation.
Scenario B:
Data target has characteristic that is determined using transformation rules/update rules by reading master data attributes.
If the attribute is not available during the data load to the data target, the system writes the initial value to the characteristic.
When you later reload the master data with attributes, you need to delete the previous transaction data load and reload it, so that the transformation can re-determine the attribute values that are written to the characteristics in the data target.
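Scenario B can be illustrated with a toy sketch (hypothetical Python with made-up names, not actual transformation-rule code): a rule derives a characteristic by reading a master data attribute, and a missing attribute yields the initial (blank) value until the master data arrives and the load is redone.

```python
# Hypothetical master data attribute table: customer -> region.
master_attrs = {"CUST-1": "REGION-NORTH"}

def derive_region(customer: str) -> str:
    """Transformation-rule sketch: read the region attribute from
    master data; fall back to "" (SAP's initial value) if missing."""
    return master_attrs.get(customer, "")

print(derive_region("CUST-1"))  # REGION-NORTH
print(derive_region("CUST-2"))  # "" -- stays blank until master data is loaded and the transaction data is reloaded
```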
Hope this helps you understand.