Extra data records in ODS
Hi Friends,
We are loading the customer hierarchy into the BW system via a customer extractor from R/3. We are not using the real hierarchy table in BW (the H table).
The extractor is of the type "based on transparent table or DB view"; it is based on table KNVH.
Delta process: delta only via full upload.
So we load this full load into the ODS daily, without deleting the previous data from the ODS.
Today, in the active table of the ODS, the data for one particular customer looks like this:
Cust No   ValidFrom   ValidTo     HigherCustomer
Cust1     01012007    31129999    HighCust1
Cust1     01062007    31129999    HighCust2
Here the validity periods overlap, which is not correct.
In the R/3 system, RSA3 shows only one record:
Cust No   ValidFrom   ValidTo     HigherCustomer
Cust1     01012007    31129999    HighCust1
In R/3 table KNVH we can also see only one record, which matches the RSA3 data.
We cannot find out how this extra data record got into the ODS table. We are facing the same problem with many customers.
Any ideas where the problem is in this case?
Thanks
Tony
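A likely mechanism for the extra record can be sketched outside of BW. This is a hypothetical illustration only (the field names are invented, and it assumes ValidFrom is part of the ODS key): a full upload taken after R/3 changed the record's ValidFrom date inserts a second row instead of overwriting the first, and since previous data is never deleted from the ODS, both rows survive.

```python
# Hypothetical sketch: the ODS active table as a dict keyed on (customer, valid_from).
# A full upload in overwrite mode replaces rows with the same key and adds rows
# with a new key; it never deletes rows that are absent from the new load.
ods = {}

def full_load(records):
    for rec in records:
        ods[(rec["cust"], rec["valid_from"])] = rec

# Day 1: R/3 (KNVH) holds one hierarchy record for Cust1.
full_load([{"cust": "Cust1", "valid_from": "01012007",
            "valid_to": "31129999", "higher": "HighCust1"}])

# Later: the same KNVH record is changed in R/3 (new ValidFrom, new parent).
# RSA3 still shows exactly one record, but its key has changed.
full_load([{"cust": "Cust1", "valid_from": "01062007",
            "valid_to": "31129999", "higher": "HighCust2"}])

# The ODS now carries both versions, with overlapping validity.
for (cust, valid_from), rec in sorted(ods.items()):
    print(cust, valid_from, rec["valid_to"], rec["higher"])
```

If this matches your setup, either reinstate the deletion of the ODS contents before each full load, or take ValidFrom out of the ODS key so that the new record overwrites the old one.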
SELECT in field routine:
==========================================================
  DATA src_doc_no(10)  TYPE c.
  DATA src_item_num(3) TYPE c.
  DATA src_year(5)     TYPE c.
  DATA src_doc_typ(2)  TYPE c.

  IF ( COMM_STRUCTURE-AC_DOC_TYP = 'DD' OR
       COMM_STRUCTURE-AC_DOC_TYP = 'LD' ) AND
     STRLEN( COMM_STRUCTURE-REF_KEY3 ) = 17.

    CONCATENATE COMM_STRUCTURE-REF_KEY3+0(4) '%' INTO src_year.
    src_item_num = COMM_STRUCTURE-REF_KEY3+15(3).
    src_doc_no   = COMM_STRUCTURE-REF_KEY3+4(10).

*   First look up the document type in the NEW data table of the ODS
    SELECT AC_DOC_TYP INTO src_doc_typ
      FROM /BI0/AFIAR_O0340
      WHERE COMP_CODE = COMM_STRUCTURE-COMP_CODE
        AND DEBITOR   = COMM_STRUCTURE-DEBITOR
        AND FISCVARNT = COMM_STRUCTURE-FISCVARNT
        AND AC_DOC_NO = src_doc_no
        AND ITEM_NUM  = src_item_num
        AND FISCPER LIKE src_year.
    ENDSELECT.

    IF src_doc_typ IS INITIAL.
*     Not found in the new data - fall back to the ACTIVE data table of the ODS
      SELECT AC_DOC_TYP INTO src_doc_typ
        FROM /BI0/AFIAR_O0300
        WHERE COMP_CODE = COMM_STRUCTURE-COMP_CODE
          AND DEBITOR   = COMM_STRUCTURE-DEBITOR
          AND FISCVARNT = COMM_STRUCTURE-FISCVARNT
          AND AC_DOC_NO = src_doc_no
          AND ITEM_NUM  = src_item_num
          AND FISCPER LIKE src_year.
      ENDSELECT.
    ENDIF.

    RESULT = src_doc_typ.
  ENDIF.
============================================
Similar Messages
-
Error on activation of data records from ODS object
hi,
Kindly help with my BW Problem, the error occured on the
activation of ODS. The error says "<i>Request
REQU_43MNOPW29W4F5M037J7OAFP52 in ODS ZPUR_O01 must have QM
status green before it is activated</i>" and "<i>Activation of data
records from ODS object ZPUR_O01 terminated</i>"
I tried to check the data in the ODS object by choosing Administrator Workbench > Data Targets > context menu on ODS object ZPUR_O01, but the status is green, and when I try to activate it the log says:
"ODS object ZPUR_O01 was built incorrectly. Cannot update request REQU_43MODQ3GF69X17B71ZKQMN85I(20,154)
ODS object ZPUR_O01 was built incorrectly. Cannot update request REQU_43MQ8A8F0L1US5COQTUUWX3BA(20,155)
ODS object ZPUR_O01 was built incorrectly. Cannot update request REQU_43MQ8OQQTY21YQ37RMA1WMO06(20,156)
Data to be activated successfully checked against archiving objects
Activation of data records from ODS object ZPUR_O01 terminated"
Please help me with this problem.
Hi Jay,
You need to update these requests to the ODS:
REQU_43MODQ3GF69X17B71ZKQMN85I(20,154)
REQU_43MQ8A8F0L1US5COQTUUWX3BA(20,155)
REQU_43MQ8OQQTY21YQ37RMA1WMO06(20,156)
1) Delete the current request.
2) Reconstruct the three requests above.
3) Then load the current request (REQU_43MNOPW29W4F5M037J7OAFP52).
Only then will you be able to activate the ODS. This is a serialization issue.
Hope it helps.
Regards
AK
Assign points if useful -
Error:Activation of data records from ODS object terminated
Hi All
I got an error while loading the data: "Activation of data records from ODS object terminated".
Could you please tell me how I can solve this issue?
Regards
vasu.
- First check the error messages from the context menu of the activation step. Also check ST22 and SM21. This will tell you where the error is. Sometimes you might get duplicate records.
- Next, use the "Manage" context menu for the ODS in RSA1. Check whether any of the previous requests are red. A red request means that the load or activation of a previous request failed, and that needs to be corrected before you correct the current one.
- If all of the above are OK, simply repeat the activation step.
Hope this helps.
Regards
Anujit Ghosh -
ODS activation error"Activation of data records from ODS object terminated"
Hi All,
While activating an ODS request I am getting the following error:
"Activation of data records from ODS object KFI02O50 terminated".
The data load is successful for the ODS, but activating the request gives the error.
I tried to change the status to green manually and then activated the request, but the problem remains.
I also tried to delete the request, load it again from the PSA, and then activate it, but still the same problem.
If anybody has a solution, please let me know.
Thanks in advance
Regards
Sonal
Hi Sonal,
Sometimes, when there are erroneous records, activation of the request fails. To check this, go to the Manage tab of the ODS; under Log, click the log icon and check why the activation failed.
Assign points if its helpful
Regards,
vid. -
Loading ODS - Data record exists in duplicate within loaded data
BI Experts,
I am attempting to load an ODS with the "Unique Data Records" flag turned on. The flat file I am loading is a crosswalk with four fields; the first three fields are being used as key fields in order to make the records unique. I have had this issue before, but gave up in frustration and added an ascending count field simply to create a unique key. This time I would like to solve the problem if possible.
The errors come back referring to two data rows that are duplicate:
Data record 1 - Request / Data package / Data record: REQU_4CNUD93Q3RCC80XFBG2CZXJ0T/000003/ 339
Data record 2 - Request / data package / data record: REQU_4CNUD93Q3RCC80XFBG2CZXJ0T/000003/ 338
And below here are the two records that the error message refers to:
3 338 3902301480 19C* * J1JD
3 339 3902301510 19C* * J1Q5
As you can see, the combination of my three key fields should not be creating a duplicate: (3902301480, 19C(asterisk), (asterisk)) and (3902301510, 19C(asterisk), (asterisk)). I replaced the *'s because they turn bold!
Is there something off with the numbering of the data records? Am I looking in the wrong place? I have examined my flat file and cannot find duplicates, and the records that BW says are duplicates are not. I am really having a hard time with this; any and all help is greatly appreciated!!!
Thank you for the response, Sabuj.
I was about to answer your questions, but I wanted to try one more thing, and it actually worked. I simply moved the MOST unique key field to the TOP of my key field list. It was at the bottom before.
FYI for other people with this issue:
Apparently the ORDER of your key fields is important when trying to avoid creating duplicate records.
I am using four data fields, and was using three of them as the key fields. No combination of all three would have a duplicate; however, when BW finds that the first two key fields match, it apparently sometimes doesn't care about the third one, which would make the row unique. By simply changing the order of my key fields I was able to stop getting the duplicate row errors.
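As a sanity check, the uniqueness of a composite key can be verified outside BW. This is a rough, hypothetical sketch (the field names f1-f4 are invented) of counting composite keys in a flat file:

```python
from collections import Counter

def duplicate_keys(rows, key_fields):
    """Return the composite key values that occur more than once."""
    counts = Counter(tuple(row[f] for f in key_fields) for row in rows)
    return [key for key, n in counts.items() if n > 1]

# The two rows the error message pointed at (records 338 and 339):
rows = [
    {"f1": "3902301480", "f2": "19C*", "f3": "*", "f4": "J1JD"},
    {"f1": "3902301510", "f2": "19C*", "f3": "*", "f4": "J1Q5"},
]

# All three key fields together are unique, so no duplicate is reported;
# using only the last two fields, both rows collapse to the same key.
print(duplicate_keys(rows, ["f1", "f2", "f3"]))
print(duplicate_keys(rows, ["f2", "f3"]))
```

A check like this on the full flat file tells you whether the duplicates BW reports really exist in the data or stem from how the key is evaluated.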
Lesson - If you KNOW that your records are unique, and you are STILL getting errors for duplicates, try changing the ORDER of your key fields. -
ODS not picking one particular master data record
Hi All,
The problem is that when we run a query on the ODS, it does not display one particular data record. We checked the ODS, and that particular record's values are not there, but they are in the master data tables.
My question is, what could be the reason for this? We are using generic extraction.
Thanks
Hi,
Check the primary keys of the master data tables against the key fields of the ODS. I think there is a mismatch, which is why some records are getting overwritten.
Also, in the ODS request, check the number of records transferred and the number of records added.
The ODS key fields should be exactly the same as the primary key of the master data table; otherwise there is a chance of losing data.
Hope that helps.
Regards
Mr Kapadia
Assigning points is the way to say thanks in SDN. -
ODS - Number of Active Data Records count?
Hi,
I have an ODS. In the Manage screen of the ODS, I can see the following record counts:
Transferred records: 352053
Added records: 1836
But the number of entries in the active data table of the ODS is only 471.
My question is: why are the added records (1836) not equal to the number of active data records (471)? What am I missing, or is this correct?
Best Regards,
Venkata.
Hi,
Have you completed the activation of the ODS data after uploading? If not, look at the entries in the New Data table of the ODS.
If you have already activated, there can still be a different count, because records with the same key field values may exist in different packets.
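The gap between transferred and active records can be illustrated with a small sketch (not actual BW code; the field names are invented). Activation keeps one row per key, so many transferred rows can collapse into far fewer active ones:

```python
def activate(transferred, key_fields):
    """One row per key survives activation; later rows overwrite earlier ones."""
    active = {}
    for rec in transferred:
        active[tuple(rec[f] for f in key_fields)] = rec
    return active

transferred = [
    {"doc": "1000", "amount": 10},
    {"doc": "1000", "amount": 15},  # same key, arrives in a later packet
    {"doc": "1001", "amount": 20},
]
active = activate(transferred, ["doc"])
print(len(transferred), "transferred ->", len(active), "active rows")
```

The same collapsing happens between "added" and "active" counts when several added records share a key across packets.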
With rgds,
Anil Kumar Sharma .P
Message was edited by:
Anil Kumar -
Activation Error in ODS - Record mode 0 unknown (data record REQU_4DPGJKCKN
I created a generic extractor in R/3 with a generic delta. In BW, my delta init loads with no errors, but when I activate the data in the ODS, I get the activation error:
"Record mode 0 unknown (data record REQU_4DPQ31IK7WM9OTK26S5JMJ6LM/000001/"
I searched SDN but didn't find anything, and I searched the help but only found general information about 0RECORDMODE.
Can someone provide a solution and/or some more detailed information about the error message?
Thanks
Hello,
When you created the InfoSource for the generic DataSource, did you add the field 0RECORDMODE to the InfoSource? When loading data to an ODS, 0RECORDMODE is essential.
The InfoObject is usually mapped to the ROCANCEL field (Indicator: Cancel Data Record) of the extractor. This field is usually present in standard extractors. But even if the field is not present in your extractor, add 0RECORDMODE to the InfoSource communication structure and leave it unmapped in the transfer rules.
I think this might solve the problem.
Regards,
Pankaj -
Unable to load data from an ODS to a Cube
Hi all,
I am trying to load data from an ODS to a cube, and I'm getting the following message at the bottom of the monitor screen: "No corresponding requests were found for request REQU_BZUH9ZPYNL5XLQNB7K8IQDKFS in the ODS/Cube", message no. RSBM043.
I am unable to load the data. The QM status is yellow. When the process starts, the left-hand side of the monitor shows "0 of 84408 records", which is OK. I have loaded with this package before and it worked well.
Can you help me what I have to do? I tried to wait but it was the same: no progress and no data in the cube.
Thank you very much and kind regards,
MM.
Maybe this helps:
When I look at the status, it says that I have a short dump in BW. It was CALL_FUNCTION_REMOTE_ERROR, and the short text is "The function module 'SUBST_TRANSFER_UPGEVAL' cannot be used for 'remote'". This short dump occurs well before I start the upload.
Thanks again.
Hello MM,
Can you do the following:
Make the total status red and delete the request from the cube.
Go to the ODS, remove the data mart status, and try loading it again.
The error message you are getting is common when we run an InfoPackage on a source object that has no new request (meaning all the requests available in the source have already been moved to the targets). So please try the steps given above.
Hope it will help! -
How to find the number of records in ods?
How do I find the number of records in an ODS?
Please suggest a solution other than going to the table of the ODS and viewing the number of records.
Is there a program or function module to see the number of records in an ODS?
For example, SAP_INFOCUBE_DESIGNS is a program which gives the details (number of records) of InfoCubes.
Hi,
I was looking at this and found the following tables that may be of help.
One of these tables will include a summary of the record count of all the tables in your system, based on the last time each table had its database statistics calculated:
DBSTATTADA
DBSTATTDB2
DBSTATTDB4
DBSTATTDB6
DBSTATTINF
DBSTATTMSS
DBSTATTORA
We run on an Oracle database, so the table record counts can be taken from DBSTATTORA. Type AZ in the table selection field in SE16 to restrict the output to ODS (or DSO) tables only.
The record count is as of the time indicated in the timestamp field. Obviously this is not real time, but it should not be too out of date; if it is, you may be having performance issues and should get your DBA/Basis team to run a full refresh of the DB statistics.
Hope this helps. Although not real time, the table should give you a decent indication of the size of all your ODS objects (or any other table, for that matter!) -
InfoCube Modelling-Adding data from different ODS's on to the Infocube
Hi Experts,
I am new to SAP BI. I have a basic question about modelling the InfoCube.
In our requirement, I have to populate data from 9 custom SAP tables into 9 ODSs, and then take this data into InfoCubes.
They want to reduce the number of cubes as much as possible, so I have to combine the data from the different ODSs and build 2-3 InfoCubes.
For example:
I am going to combine the data of 5 ODSs into 1 cube based on delivery number.
There are 5 ODSs with the common key Delivery Number. Suppose I have added one set of fields from ODS1.
Now, when I add another set of fields from the second ODS, WHAT WILL HAPPEN TO THE 'Delivery Number' field?
I will make it clear.
The cube already contains a record with the fields Delivery No, field_a, field_b, field_c, field_d, where Delivery No = 11233. This record comes from ODS1.
Now I want to add data from ODS2, containing the fields Delivery No, field_e, field_f, field_g, field_h.
What happens to the already existing record in the cube with Delivery No = 11233?
Will the value in this InfoObject get overwritten?
Or will it combine the data from both ODSs and show it as ONE record?
Please advise. How should I solve this scenario?
Thanking You in Advance
Shyne Sasimohanan
Here is the answer to your question, and a suggestion.
The data will look like this:
Delivery no  field_a  field_b  field_c  field_d  field_e  field_f  field_g  field_h
11233        1        1        1        1        0        0        0        0
11233        0        0        0        0        1        1        1        1
But the best way, according to design standards, is to create another DSO on top of all the DSOs, combine all the data in that DSO, and send the data from there to the InfoCube. Then the data will be shown as below:
Delivery no  field_a  field_b  field_c  field_d  field_e  field_f  field_g  field_h
11233        1        1        1        1        1        1        1        1
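The difference between the two designs can be sketched roughly (hypothetical field names; this is not BW code). A cube simply stores both rows side by side, while a consolidating DSO keyed on the delivery number merges them into one record:

```python
def consolidate(key, *sources):
    """Merge records from several sources into one record per key value,
    the way a DSO keyed on `key` would combine them."""
    combined = {}
    for source in sources:
        for rec in source:
            combined.setdefault(rec[key], {}).update(rec)
    return combined

ods1 = [{"delivery": "11233", "field_a": 1, "field_b": 1}]
ods2 = [{"delivery": "11233", "field_e": 1, "field_f": 1}]

merged = consolidate("delivery", ods1, ods2)
print(merged["11233"])  # one record carrying the fields from both ODSs
```

This is the effect of the consolidating DSO Siva recommends: each source contributes its fields to the single record for delivery 11233, and the cube then receives one complete row.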
Regards,
Siva A -
Difference between key fields and data fields in ODS
Hi team,
What is the difference between data fields and key fields in an ODS?
Pl let me know.
Regards,
Senthil
Hi,
Key fields:
Key fields uniquely identify a record: the uniqueness of a record in the ODS is maintained by how you define the key fields. The key fields of an ODS are equivalent to the primary key of a table.
When you activate the ODS, it generates a table with the primary key you defined in the ODS.
Data fields:
Data fields are the remaining fields; they can contain characteristics as well as key figures.
http://help.sap.com/saphelp_nw04/helpdata/en/a6/1205406640c442e10000000a1550b0/frameset.htm
Regards,
Senthil Kumar.P -
Key Fields and Data Fields in ODS
Hi Guru's ,
Can you explain what key fields and data fields are in an ODS, and how they work?
Thanks in advance.
Hi Pradeep,
The key fields of an ODS form a unique combination of characteristics according to which the data in the ODS is aggregated. Data fields are additional fields: characteristics or key figures.
For example, suppose you have the following key fields:
Doc No and Doc Line Item (the item number in the document), and you load the following data:
Doc# Item# Amount CALDAY
1234 1 100 26.09.2005
1234 1 150 27.09.2005
1234 2 300 27.09.2005
1235 1 400 27.09.2005
The first two rows have the same key fields (1234-1), so they will be aggregated, and an amount of 250 will go into the ODS.
This is true if the Add option is set for the Amount key figure.
But if you set the Overwrite option, the second record will overwrite the first one, and only an amount of 150 will go into the ODS!
And if CALDAY is also a key field, all the records go into the ODS without any aggregation, because the key field combination differs for every record.
So the choice of ODS key fields and the key figure settings are very important for data aggregation in the ODS.
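Eugene's example can be replayed with a small sketch (illustrative only, not BW code) to confirm the three outcomes: 250 with the Add option, 150 with Overwrite, and all four rows kept when CALDAY joins the key:

```python
def load(records, key_fields, mode):
    """Simulate ODS aggregation: 'add' sums the amount per key,
    'overwrite' keeps only the last record per key."""
    ods = {}
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        if mode == "add" and key in ods:
            ods[key]["amount"] += rec["amount"]
        else:
            ods[key] = dict(rec)
    return ods

data = [
    {"doc": "1234", "item": "1", "amount": 100, "calday": "26.09.2005"},
    {"doc": "1234", "item": "1", "amount": 150, "calday": "27.09.2005"},
    {"doc": "1234", "item": "2", "amount": 300, "calday": "27.09.2005"},
    {"doc": "1235", "item": "1", "amount": 400, "calday": "27.09.2005"},
]

added = load(data, ["doc", "item"], "add")
print(added[("1234", "1")]["amount"])        # 250

overwritten = load(data, ["doc", "item"], "overwrite")
print(overwritten[("1234", "1")]["amount"])  # 150

per_day = load(data, ["doc", "item", "calday"], "overwrite")
print(len(per_day))                          # 4 - no aggregation at all
```

The three calls correspond exactly to the three cases described above.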
Best regards,
Eugene -
DTP Error: Duplicate data record detected
Hi experts,
I have a problem loading data from a DataSource into a standard DSO.
In the DataSource there are master data attributes with a key containing id_field.
In the end routine I perform some operations that multiply the lines in the result package and fill a new date field, defined in the DSO (and also in the result_package definition).
I.e.:
Result_package before the end routine:
Id_field  attr_a  attr_b ...  attr_x  date_field
1         a1      b1          x1
2         a2      b2          x2
Result_package after the end routine:
Id_field  attr_a  attr_b ...  attr_x  date_field
1         a1      b1          x1      d1
2         a1      b1          x1      d2
3         a2      b2          x2      d1
4         a2      b2          x2      d2
The date_field (date type) is one of the key fields in the DSO.
When I execute the DTP I get an error in the "Update to DataStore Object" section: "Duplicate data record detected. During loading, there was a key violation. You tried to save more than one data record with the same semantic key."
As far as I know, the result_package key contains all fields except fields of type i, p, and f.
In simulation mode (debugging) everything is correct and the status is green.
In the DSO the "Unique Data Records" checkbox is unchecked.
Any ideas?
Thanks in advance.
MG
Hi,
In the end routine, try:
DELETE ADJACENT DUPLICATES FROM RESULT_PACKAGE COMPARING XXX YYY.
Here XXX and YYY are key fields, so that you can eliminate the extra duplicate records.
Or you can even try:
SORT itab_XXX BY field1 field2 field3 ASCENDING.
DELETE ADJACENT DUPLICATES FROM itab_XXX COMPARING field1 field2 field3.
This can be done before you loop over your internal table (in case you are using internal tables and loops); itab_XXX is the internal table.
field1, field2 and field3 may vary depending on your requirement.
By using the above lines, you can get rid of the duplicates coming through the end routine.
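For anyone who wants to see the effect of SORT plus DELETE ADJACENT DUPLICATES outside ABAP, here is a rough equivalent sketch (field names invented). Note that only the first row of each key group survives:

```python
def sort_and_dedupe(rows, key_fields):
    """Rough Python equivalent of ABAP's
       SORT itab BY k1 k2 ... ASCENDING.
       DELETE ADJACENT DUPLICATES FROM itab COMPARING k1 k2 ...
    After sorting, only the first row of each key group is kept."""
    keyfunc = lambda r: tuple(r[f] for f in key_fields)
    result = []
    for rec in sorted(rows, key=keyfunc):  # stable sort keeps first occurrence first
        if not result or keyfunc(result[-1]) != keyfunc(rec):
            result.append(rec)
    return result

rows = [
    {"id": 2, "attr": "a2", "date": "d1"},
    {"id": 1, "attr": "a1", "date": "d1"},
    {"id": 1, "attr": "a1", "date": "d1"},  # duplicate semantic key
]
print(len(sort_and_dedupe(rows, ["id", "date"])))  # 2
```

This also shows the caveat of the approach: deduplicating silently drops data, so make sure the duplicates really are redundant before removing them in the end routine.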
Regards
Sunil
Edited by: Sunny84 on Aug 7, 2009 1:13 PM -
Data Extraction and ODS/Cube loading: New date key field added
Good morning.
Your expert advice is required on the following:
1. A data extract was previously done from a source with a full upload to the ODS and cube. An event is triggered from the source when data is available, and then the process chain first clears all the data in the ODS and cube and then reloads, activates, etc.
2. In the ODS, the 'Forecast Period' field has now been moved from the data fields to the key fields, as the user would like to report per period in future. The source will in future only provide the data for a specific period, not all the data as before.
3. Data must be appended in future.
4. The current InfoPackage for the ODS is a full upload.
5. The 'old' data in the ODS and cube must not be deleted, as the source cannot provide it again. They will report on the data per forecast period key in future.
I am not sure what to do in BW as far as the InfoPackages are concerned, loading the data and updating the cube.
My questions are:
Q1) How do I ensure that BW appends the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading. Was that the right thing to do?
Your assistance will be highly appreciated. Thanks
Cornelius Faurie
Hi Cornelius,
Q1) How do I ensure that BW appends the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
-->> Try loading the data into the ODS with a full update in overwrite mode, as before (this adds new records and changes existing records to the latest values). Push the delta from this ODS to the cube.
If the existing ODS is loaded in addition mode, introduce one more ODS with the same granularity as the source, load it in overwrite mode (delta if possible, otherwise full), and push only the delta onward.
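The overwrite-then-delta pattern can be sketched as follows (hypothetical field names, not BW code): a full upload in overwrite mode only records changed or new rows in the change log, and that change log is what feeds the cube as a delta, so repeated full loads do not duplicate data downstream:

```python
def overwrite_full_load(active, new_records, key_fields):
    """Full upload in overwrite mode. Returns the change-log entries
    (new or changed rows) that would be pushed on to the cube as a delta."""
    changelog = []
    for rec in new_records:
        key = tuple(rec[f] for f in key_fields)
        if active.get(key) != rec:   # unchanged rows produce no delta
            changelog.append(rec)
            active[key] = rec
    return changelog

active = {}
# First load: one forecast period arrives; one row goes to the change log.
d1 = overwrite_full_load(active, [{"period": "001", "qty": 10}], ["period"])
# Next load repeats period 001 unchanged and adds period 002:
# only the genuinely new row is delta'd on to the cube.
d2 = overwrite_full_load(active, [{"period": "001", "qty": 10},
                                  {"period": "002", "qty": 5}], ["period"])
print(len(d1), len(d2))  # 1 1
```

This is why the old data survives in both the ODS and the cube even though every load is a full upload: nothing is deleted, and only changes travel onward.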
Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading. Was that the right thing to do?
--> Yes, that is correct. Otherwise you would lose the historic data.
Hope it Helps
Srini