Error handling for master data with direct update
Hi guys,
For master data with flexible update, error handling can be defined in the InfoPackage, and if the load is performed via PSA there are several options - clear so far. But what about direct update?
My specific question is: if an erroneous record (e.g. invalid characters) occurs in a master data load using direct update, the request is set to red. But what does this mean for the other, correct records of the request? Are they written to the master data tables, so that they are present once the master data is activated, or is nothing written to the master data tables if a single record is erroneous?
Many thanks,
/ Christian
Hi Christian -
The difference between flexible upload and direct upload is that direct upload does not have update rules; direct upload still has a PSA as usual, and you can do testing in the PSA.
On the second part: when you load master data and an error occurs, all the records for that request number get the status 'error', so activation will have no effect on them, i.e. no new records from the failed load will be available.
hope it helps
regards
Vikash
Similar Messages
-
Change communication structure for master data with direct update
Hi All,
I am having a problem with a change I want to make to some master data. I have added the attribute to the characteristic, but when I go to change the communication structure, it's not possible (the 'add line' button is greyed out).
I can see the new infoobject in the datasource/transfer structure, but not in the comms structure (and yes, it is in change mode).
The master data uses direct update, and I have read that this causes some hassles in changing the comms str.
Can someone please give me some steps in doing this??
Thanks
Ryan
Hi Ryan,
Why do you have nothing to map? Your datasource transfer structure from the source system has nothing new.
1) On your InfoObject you can only write a routine, give a formula, or use any of the other transfer rule options that appear when you click on the IO icon.
2) Enhance the structure of the datasource that is coming from the R/3 source system.
3) Replicate it.
Only then will you be able to find the extra field in the transfer structure, to map to the IO which has been added.
hope it helps
regards
AK -
Loading Master Data with Flexible Update
Hi,
I have created an InfoObject - Business Partner (a master data bearing characteristic). Its attributes are Region, Sales Person and Industry Code, which are master data bearing characteristics as well.
I need to load data to the business partner from a csv file.
The layout of the CSV file is -
BP number, BP text (long Text), Region Code, Region Desc (Med Text), Sales Person Code, Sales Person Desc (Med Text), Industry Code, Industry description (Med Text).
How do I define the InfoSource to load this data?
Appreciate any help with this.
Thank You,
Prashanth
Hi,
First of all, note that there are two kinds of InfoSources: with direct and with flexible update.
If you choose a direct IS, then when creating the IS you just enter the name of the InfoObject into which you are going to load data. The system will create the IS (communication structure). Open this IS for changing. The system will propose the communication structure; click on the bottom icon 'Transfer structure/Transfer rules' and choose your flat file system as the source system. Agree when the system asks you to save assignments (up to 3 times). Activate the transfer rules. Then click in the Datasource field. You may see several datasources there (for texts, attributes and hierarchies). Choose the other datasources one by one and activate them too.
Now you can create an InfoPackage. For the load you can choose what kind of package it is going to be: for loading texts, attributes or hierarchies.
Note that in this case the structure of the flat file is proposed by the system, and you just need to prepare the flat file corresponding to the proposed structure (different for each of the 3 possible datasources). Execute the InfoPackage.
If you use a flexible IS, then you may insert into the communication structure the fields that you think you may need in the master data. Note that here you may have not only attributes but TEXTS as well. Save the communication structure. Assign a flat file source system and activate the transfer rules.
After that, on the RSA1 data providers tab, right-click, choose 'Insert IO as a data target' and pick your IO. Refresh the screen. You'll see up to three data targets. Create update rules for each of them. In the URs, map the fields in the IS to the fields in the URs.
Best regards,
Eugene -
OpenForm method for master data with two key fields
Hi Experts
I'm wondering how to open the Batch Details form using the Application.OpenForm method.
As you all know, BatchDetails has two keys: ItemCode and SysNumber.
Please help.
Best Regards
After the attribute load, did you activate your master data? Every time you load master data, you have to activate it to see the new data.
Ok - Do the following:
RSA1 -> find your InfoObject -> right-click and choose 'Activate Master Data'.
OR
If you want to activate all master data in one shot, go to RSA1 -> Tools -> Apply Hierarchy/Attribute Change -> Execute.
Now check the data, you will be able to see all the loaded master data.
Reward points if this helps.
Regards,
Ashok -
Is it possible to make a delta load for master data with a standard DS
I have a full load bringing huge data for master data with standard datasource.
I want to run a delta due to the huge number of records, but when I create a new InfoPackage it doesn't give an option for delta update.
Are delta loads specific only to standard or customized DSs, or is there some other reason behind that?
I kind of understand what you are asking about, but I am unclear as to how it pertains to our BO SDK.
You are wanting to find the differences between a large dataset and another large dataset.
I am not sure what an Infopackage is.
Are you using the BO Enterprise SDK or some other product?
Jason -
36 duplicate records found -- error while loading master data
Hello BW Experts,
Error while loading master data
(green light) Update PSA (50000 records posted): no errors
(green light) Transfer rules (50000 → 50000 records): no errors
(green light) Update rules (50000 → 50000 records): no errors
(green light) Update (0 new / 50000 changed): no errors
(red light) Processing end: errors occurred
Processing 2 finished
⇒ 36 duplicate records found. 15718 recordings used in table /BIC/PZMATNUM
⇒ 36 duplicate records found. 15718 recordings used in table /BIC/XZMATNUM
This error repeats with all the data-packets.
What could be the reason for the error, and how can it be corrected?
Any suggestions appreciated.
Thanks,
BWer
BWer,
We have exactly the same issue when loading the infoobject 0PM_ORDER, the datasource is 0PM_ORDER_ATTR.
The workaround we have been using is a 'manual push' from the details tab of the monitor. Weirdly, we don't have this issue in our test and dev systems, and even in production it doesn't occur some days. We have all the necessary settings enabled in the InfoPackage.
Did you figure out a solution for your issue? If so, please let me know, and likewise I will.
-Venkat -
I am trying ABAP for a master data lookup in an update rule.
Here is the scenario ---
There is a master data object MDABC with attributes A1 and A2. I need to map IO1 to A1 and IO2 to A2.
What should the start routine and update routine be? Please help with working code.
Thanks
>
sap_newbee wrote:
> Thanks Aashish ,
> Here is the code I am usind but Its not populating any result May be you could help me out in debugging
>
>
> Start Routine -
>
> DATA: BEGIN OF ITAB_MDABC OCCURS 0,
> MDABC LIKE /BIC/PMDABC-/BIC/MDABC,
> A1 LIKE /BIC/PMDABC-/BIC/A1,
> A2 LIKE /BIC/PMDABC-/BIC/A2,
> END OF ITAB_NMDABC.
>
> SELECT
> /BIC/MDABC
> /BIC/A1
> /BIC/A2
> FROM /BIC/PMDABC INTO TABLE ITAB_MDABC
> FOR ALL ENTRIES IN DATA_PACKAGE
> WHERE /BIC/MDABC = DATA_PACKAGE-/BIC/MDABC.
> ENDSELECT.
>
>
> In Update Routine for Infoobject IO1 The code Iam using is
>
>
> READ TABLE ITAB_MDABC WITH KEY
> MDABC = COMM_STRUCTURE-/BIC/MDABC
> BINARY SEARCH.
> IF sy-subrc = 0.
> RESULT = ITAB_MDABC-A1.
> ENDIF.
> RETURNCODE = 0.
>
> ABORT = 0.
>
> Please help.
> Thanks
Please use an internal table in the SELECT statement - i.e. SELECT ... INTO TABLE, without ENDSELECT (the modifications were marked in bold in the original post).
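For reference, a cleaned-up sketch of the routine pair above, using the poster's names (MDABC, A1, A2, IO1). It removes the stray ENDSELECT (not allowed together with SELECT ... INTO TABLE), fixes the ITAB_MDABC / ITAB_NMDABC name mismatch, restricts the lookup to active records, and sorts the table before the BINARY SEARCH read. Note that for the table to be visible in the update routines it must be declared in the global part of the update rules, not locally in the start routine. This is an untested sketch against the poster's hypothetical objects:

```abap
* Global part of the update rules (so both routines see the table):
DATA: BEGIN OF ITAB_MDABC OCCURS 0,
        MDABC LIKE /BIC/PMDABC-/BIC/MDABC,
        A1    LIKE /BIC/PMDABC-/BIC/A1,
        A2    LIKE /BIC/PMDABC-/BIC/A2,
      END OF ITAB_MDABC.

* Start routine: one SELECT per data package instead of one per record.
IF NOT DATA_PACKAGE[] IS INITIAL.
  SELECT /BIC/MDABC /BIC/A1 /BIC/A2
    FROM /BIC/PMDABC
    INTO TABLE ITAB_MDABC
    FOR ALL ENTRIES IN DATA_PACKAGE
    WHERE /BIC/MDABC = DATA_PACKAGE-/BIC/MDABC
      AND OBJVERS = 'A'.
* BINARY SEARCH is only reliable on a table sorted by the read key.
  SORT ITAB_MDABC BY MDABC.
ENDIF.

* Update routine for IO1 (for IO2 return A2 instead of A1):
READ TABLE ITAB_MDABC WITH KEY
     MDABC = COMM_STRUCTURE-/BIC/MDABC
     BINARY SEARCH.
IF SY-SUBRC = 0.
  RESULT = ITAB_MDABC-A1.
ENDIF.
RETURNCODE = 0.
ABORT = 0.
```

Table and field names (/BIC/PMDABC etc.) follow the naming used in the question, not a real system.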
Edited by: Ashish Gour on Oct 17, 2008 2:57 PM -
Master data attributes with direct update... it's very urgent
Hi all,
Could anyone tell me how to load the master data attributes with direct update in the InfoPackage?
Please provide the steps to create master data attributes and how to load them.
Thanks,
Manjula
Hi Manjula,
Flexible Uploading
Transaction code RSA1 leads you to Modelling.
1. Creation of Info Objects
In left panel select info object
Create info area
Create an info object catalog (characteristics & key figures) by right-clicking the created info area
Create new characteristics and key figures under respective catalogs according to the project requirement
Create required info objects and Activate.
2. Creation of Data Source
In the left panel select data sources
Create application component(AC)
Right click AC and create datasource
Specify the data source name, source system, and data type (transaction data)
In the general tab give the short, medium, and long descriptions.
In the extraction tab specify the file path, header rows to be ignored, data format (CSV) and data separator ( , )
In the proposal tab load example data and verify it.
In the field tab you can give the technical names of the info objects in the template, and then you do not have to map them during the transformation - the server will map them automatically. If you do not fill in the field tab, you have to map manually during the transformation in the InfoProviders.
Activate the data source and read the preview data under the preview tab.
Create an info package by right-clicking the data source, and in the schedule tab click start to load the data to the PSA. (Make sure to close the flat file during loading.)
3. Creation of data targets
In left panel select info provider
Select the created info area and right-click to create an ODS (DataStore object) or cube.
Specify a name for the ODS or cube and click create.
From the template window select the required characteristics and key figures and drag and drop them into the DATA FIELDS and KEY FIELDS.
Click Activate.
Right-click on the ODS or cube and select create transformation.
In the source of the transformation, select the object type (datasource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
Activate created transformation
Create a data transfer process (DTP) by right-clicking the master data attributes.
In extraction tab specify extraction mode ( full)
In update tab specify error handling ( request green)
Activate DTP and in execute tab click execute button to load data in data targets.
4. Monitor
Right-click the data target, select Manage, and in the contents tab click Contents to view the loaded data. There are two tables in an ODS, a new table and an active table; to move data from the new table to the active table, you have to activate the loaded data after selecting it. Alternatively, the monitor icon can be used.
honor with points if this helps,
Sudhakar -
Characteristics - Master Data tab - InfoSource with direct update
Dear Experts,
Please explain me in Laymans Language as i am new to BI (but am an ABAP consultant).
In Info Object Characteristics, Master data Tab there is infosource with direct update with an application Component.
1. Now my doubt is: when I specify the application component, this InfoSource will appear under the application component in Modeling -> InfoSources.
Then on this InfoSource I would right-click and create the InfoPackage and then the data transfer process.
Is that correct?
2. Suppose for this characteristic I don't specify the InfoSource with direct update; then it won't appear under InfoSources, so how can I load data in this case?
Regards
BI Learner
Hi,
Using an InfoSource with direct updating, master data (characteristics with attributes, texts, or hierarchies) of an InfoObject can be updated directly (without update rules, only using transfer rules) into the master data table. To do this you must assign it an application component. The system displays the characteristic in the InfoSource tree in the Data Warehousing Workbench. You can assign DataSources and source systems to the characteristic from there. You can then also load master data, texts, and hierarchies for the characteristic.
You cannot use an InfoObject as an InfoSource with direct updating if:
i. The characteristic you want to modify is characteristic 0SOURSYSTEM (source system ID).
ii. The characteristic has neither master data nor texts nor hierarchies. It is therefore impossible to load data for the characteristic.
iii. The characteristic that you want to modify turns out not to be a characteristic, but a unit or a key figure.
To generate an export DataSource for a characteristic, the characteristic must also be an InfoSource with direct updating.
1. Now my doubt is, when I specify the Application Component, this InfoSource would appear under the application component in Modeling -> InfoSources. Then on this InfoSource I would right-click and create the InfoPackage and then the data transfer process.
The characteristic will be available in the InfoSource tree; you can assign a datasource -> assign a source system -> create a package -> load data.
2. Suppose for this characteristic I don't specify the InfoSource with direct update; then it won't appear under InfoSources, so how can I load data in this case?
For a characteristic, if you don't specify 'InfoSource with Direct Updating', then it won't appear in the InfoSource tree. -
Use of "with master data & with out master data" at DTP update level
Hello experts,
In DTP, I check "with out master data". When I try to send corresponding transactional data, It is showing SID related error. Can anybody suggest what is the use of "with master data & with out master data" at DTP level.
Thanks in advance,
Zakir.
Hi,
In the DTP, if you set this indicator, the system terminates the update of the request if no values are available for a data record.
Load the relevant master data before you load the transaction data.
If you set this indicator, the system terminates activation if master data is missing and produces an error message.
If you do not set this indicator, the system generates any missing SID values during activation.
In DataStore maintenance, if you do not set the SIDs Generation upon Activation indicator, the No Update without Master Data indicator in the DTP has no effect.
thx
vijju -
Options for error handling for DTP: 'No update, no reporting' and 'Deactivated'
Hello Gurus,
Regarding the options for error handling for the DTP - 'No update, no reporting' and 'Deactivated' - please give some explanation and an example for them.
Many thanks,
On the Update tab page, specify how you want the system to respond to data records with errors:
a. No update, no reporting (default)
If errors occur, the system terminates the update of the entire data package. The request is not released for reporting. However, the system continues to check the records.
b. Update valid records, no reporting (request red)
This option allows you to update valid data. This data is only released for reporting after the administrator checks the incorrect records that have not been updated and manually releases the request by setting the overall status on the Status tab page in the monitor (QM action).
c. Update valid records, reporting possible
Valid records can be reported immediately. Automatic follow-up actions, such as adjusting the aggregates, are also carried out.
http://help.sap.com/saphelp_smehp1/helpdata/en/42/fbd598481e1a61e10000000a422035/content.htm
Hope it helps.
rgds, Ghuru -
Error while creating Data Source for master data attributes
Hi BI Experts,
Well its been some time for me that I have been part of Extraction in BI.I primarily handled reporting in my last assignments.
I was trying extraction with flat files in SAP BI 7(new to sap bi 7 but very much familiar with BW3.5) but failed in the activity during master data attributes and text upload in infoobject (say IOSP_Mat).
Here is the procedure I followed after creating the characteristic IOSP_Mat. I created a source system for flat file, followed by a data source for master data attributes, and selected all the parameters correctly, i.e. CSV file format, data separator as ','
and other settings. Now when I try to look at the proposed data in the next tab using 'Load example data', it does not show the desired result. The columns I have maintained in the flat file are MAT_NUMBER and MAT_NAME (with, say, 100 records in the file).
The result is the same when I try to load the text data; the columns maintained are
(LANGUAGE, MAT_NUMBER, Short Description) (the same 100 records).
Then I used the RSA1OLD transaction to upload the file via the 3.5 flow; I created an InfoSource for master data/texts/hierarchies for IOSP_Mat.
Now when I try to upload it using an InfoPackage for master and text data, I observe that the data is not maintained in the characteristic IOSP_Mat.
When I monitored it, I saw the data had not even been uploaded to the PSA level.
Can you BI experts tell me the answer for this.
Thanks,
Srijith
Apologies to all of you for the late response -
I was busy with some other activities.
I don't remember the exact message, but I remember it was not loaded even to the PSA level. I will try it again and post the exact message.
Thanks again for your quick response.
Once again sorry to all of you for my late response
Thanks,
Sri -
Can a routine replace the "master data attribute of" update rule for performance?
Hi all,
We are working on CRM-BW data modeling. We have to look up agent master data (agent level and position) for each transaction record, so currently we are using the "master data attribute of" update rule. Can we use a routine instead? Will it improve loading performance? We have to load about 100,000 (1 lakh) transaction records, while we have 20,000 agent details in the agent master data. My understanding is that for each record in the data package the system has to go to the master data table, fetch the agent details and store them in the cube. Say one agent created 10 transactions: the "master data attribute of" option will then read the agent master data 10 times, even though we are pulling the same details for all 10 transactions. If we use a routine, we can pull the agent details into an internal table, removing all duplicates, and in the update routine read that internal table.
Will this way improve performance?
let me know if you need further info?
Thanks in advance.
Arun Thangaraj
Hi,
your thinking is absolutely right!
I don't recommend using the standard attribute derivation, since it performs a SELECT against the database for EACH record.
Better to fill a sorted internal table in your start routine: SELECT <fields> FROM <master_data_table> INTO TABLE itab FOR ALL ENTRIES IN DATA_PACKAGE WHERE OBJVERS = 'A' etc.
In your routine, perform a READ TABLE itab ... BINARY SEARCH. I believe you won't be able to go faster.
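A minimal sketch of this approach, with hypothetical names (an agent InfoObject ZAGENT with P table /BIC/PZAGENT and attributes ZLEVEL and ZPOSIT - none of these come from the thread): declare the buffer as a sorted table in the global part of the update rules, fill it once per data package in the start routine, and read it in the characteristic routines:

```abap
* Global part: a sorted table gives binary-search reads
* without an explicit SORT.
TYPES: BEGIN OF ty_agent,
         agent    TYPE /BIC/OIZAGENT,
         level    TYPE /BIC/OIZLEVEL,
         position TYPE /BIC/OIZPOSIT,
       END OF ty_agent.
DATA: gt_agent TYPE SORTED TABLE OF ty_agent
               WITH UNIQUE KEY agent.

* Start routine: one database hit per data package.
IF NOT DATA_PACKAGE[] IS INITIAL.
  SELECT /BIC/ZAGENT /BIC/ZLEVEL /BIC/ZPOSIT
    FROM /BIC/PZAGENT
    INTO TABLE gt_agent
    FOR ALL ENTRIES IN DATA_PACKAGE
    WHERE /BIC/ZAGENT = DATA_PACKAGE-/BIC/ZAGENT
      AND OBJVERS = 'A'.
ENDIF.

* Characteristic routine (agent level; position is analogous):
DATA: ls_agent TYPE ty_agent.
READ TABLE gt_agent INTO ls_agent
     WITH TABLE KEY agent = COMM_STRUCTURE-/BIC/ZAGENT.
IF sy-subrc = 0.
  RESULT = ls_agent-level.
ENDIF.
RETURNCODE = 0.
```

Compared with "master data attribute of", which derives each record separately, this reads the master data once per package; for 100,000 records against 20,000 agents the difference is typically substantial. The unique key is safe here because the P table holds one active row per characteristic value.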
hope this helps...
Olivier. -
Effect of 'No update' for master data texts
Hello all,
I am using flexible update - update rules - for master data texts. I want to confirm the effect of 'No update' in the following scenario.
In the first set of update rules - from the first source system - I have mapped both short text and long text:
Key | short text| long text
......| overwrite| overwrite
K1 | some SHTX| some LGTX
In the second set of update rules, I have selected 'No update' for Long text
Key | short text | long text
......|overwrite| NO UPDATE
K1 | some SHTX |
My loading sequence is first source system followed by second source system
what should be my ultimate long text and why?
Many thanks in advance!
Regards
Sanjyot
Hi Surekha,
Your long text will not get changed.
Take the example you mentioned
Key | short text.....| long text
........| overwrite......| overwrite
K1 | some SHTX.. | some LGTX
In the second set of update rules, I have selected 'No update' for Long text
Key | short text... | long text
........|overwrite......| NO UPDATE
K1 | some SHTX. |
In the second set of update rules you specified 'No update', so when a record with the same key is found in the master data, no change is made to the long text field.
Whereas the short text will get overwritten.
Take some data for ex with the above mentioned type
data from source 1 : 2000 Maggi Noodles
data from source 2 : 2000 Feasters
When the data from the first source system arrives, the data will be: 2000 Maggi Noodles.
When the data from the second source system arrives, since a record with the same key (2000) already exists, the system checks the other fields:
the short text will be overwritten with the value from the second system (according to the rules it is in overwrite mode);
the long text will not be updated, as the rule says no update.
hope this helps,
Cheers,
Srinath. -
Error in creating RSDS using transfer rules for master data datasources
Hi ,
When I am creating the transformations (RSDS) using transfer rules for master data datasources, while migrating the datasource, it gives the error message 'An error occurred during generation of the transformation', and in the taskbar it shows the message 'Log is being loaded'.
My question is: how can we see the log?
And please give me the steps to migrate master datasources.
Thanks in advance.
Hi Ankit,
You have to load data to the cube. For that you need update rules.
Steps to load data to the cube
1. Replicate the datasource.
2. Assign an InfoSource.
3. Create transfer rules.
4. Maintain the update rules for the cube.
5. Right-click on the datasource and schedule an InfoPackage for it.
Execute the Infopackage and pull the data to the cube.
Hope this helps
Regards
Karthik