Double records in InfoCube
Hi experts,
I am getting double records in my InfoCube. Data flows from a DSO to the InfoCube. In the DSO the data is fine, but when I run the DTP from the DSO to the InfoCube I get double records.
Note: the DSO is running one delta load (daily) and one full load (weekly).
Please advise how this can be resolved.
Regards
Wasem
Hi,
"Note: the DSO is running one delta load (daily) and one full load (weekly)."
That's why you get duplicates. If you run a daily delta, why do you need a full load every week?
Please explain in detail how the loads are performed from the DSO to the cube, and also give some examples of the duplicates you're seeing in the cube.
Besides, what are the key fields of the DSO?
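A minimal Python sketch (order numbers and quantities invented) of why a daily delta plus a weekly full load doubles the cube but not the DSO: a DSO overwrites records by key, while an InfoCube only ever adds them.

```python
def load_into_dso(dso, records):
    # DSO semantics: a record with the same key overwrites the old one.
    for key, qty in records:
        dso[key] = qty
    return dso

def load_into_cube(cube, records):
    # InfoCube semantics: requests are additive, there is no overwrite.
    for key, qty in records:
        cube[key] = cube.get(key, 0) + qty
    return cube

delta = [("4711", 10)]
full = [("4711", 10)]   # the same order arrives again in the weekly full load

dso, cube = {}, {}
for load in (delta, full):
    load_into_dso(dso, load)
    load_into_cube(cube, load)

print(dso["4711"])   # 10 -> the DSO still looks fine
print(cube["4711"])  # 20 -> the cube shows double values
```

This is why the usual fix is either to drop the redundant full load or to delete the overlapping requests from the cube before the full load arrives.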
Regards,
Suhas
Similar Messages
-
Duplicate Records in InfoCube: Is Full Repair without DSO possible?
Hello Gurus,
I have a critical situation in BI production. we have more then 3 years of data in inventory infocubes (PSA->ODS->CUBE). everything was working fine till december 2010. after that in January 2010 and March -2010 we were getting double records in infocubes. in this scenarion we don't have ODS in between.
The solution which i find is delete all data from Jan -2010 till today in infocube and also from PSA. on R/3 side delete set up tables and fill setup tables for Jan-2010 to Mar-2010. and create a infopackage for Full Repair request . but i have below questions for these solution
(1) For Full Repair Info Package or Full Repair Request we don't need to delete INIT request .. correct me if i am wrong.
(2) For full repair request do we need to run Initialize Delta With Out Data Transfer first and then we have to do Full Repair ?
(3) If we don't have DSO in these scenario then also can we solve this with full repair request ?
Regards,
Komik Shah
Hi Venu,
We have data in the PSA since 13/04/2010 because the process chain has been failing for the last 15 days, so we didn't get new records in the PSA either. We are using BI 7.0.
The whole scenario is like this:
Data has been in the Inventory Cube for the last 3 years, but nobody monitored the process chain for the last 15 days and nobody analyzed any reports after Dec-09. Now they analyzed the reports in April-10 and for some months they are getting double records. The process chain was failing for the last 15 days, and the reasons behind that were indexing as well as wrong records in the PSA.
So my plan was to delete the data from Jan-2010 to April-2010 and fill the setup tables for Jan-2010 to April-2010. I will get the data into the PSA, but when I load the data to the cube I will get double records, whether it's a full repair or not.
Regards,
Komik Shah
Edited by: komik shah on Apr 30, 2010 12:38 PM -
How to delete double records connected to one or more tables in SQL 2008?
Hi
Can anyone please help me with an SQL query? I have a table called People with the columns personno., lastname, firstname and so on. The personno. column has duplicate records, so I have written "double" at the beginning of the numbers of all the duplicate records. I tried deleting these double records, but they are linked to one or more other tables. I have to find all the tables blocking the deletion of a double person, and then create SELECT statements which generate UPDATE statements to replace the current id of the double person with a substitute id. (The personno. values serve as ids in the database.)
Thanks
You should not write "double" into the personno. Once we change it, it will no longer be possible to join or relate to the other tables. Keep the id as it is and use another field (STATUS) to mark duplicates. We will also need another field (PRIMARYID) against those duplicate rows, i.e. the main or primary personno.
SELECT * FROM OtherTable a INNER JOIN
(SELECT personno, status, primaryid FROM PEOPLE WHERE status = 'Duplicate') b
ON a.personno = b.personno

UPDATE a SET a.personno = b.primaryid
FROM OtherTable a INNER JOIN
(SELECT personno, status, primaryid FROM PEOPLE WHERE status = 'Duplicate') b
ON a.personno = b.personno
NOTE: Please take backup before applying the query. This is not tested.
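For illustration, here is a minimal Python/sqlite3 sketch of the same remapping (table and column names follow the discussion above; the data is invented). A correlated subquery is used instead of UPDATE ... FROM so it also runs on older SQLite versions:

```python
import sqlite3

# Hypothetical minimal tables mirroring the scenario: People holds
# duplicates flagged via STATUS plus a PRIMARYID pointing at the
# surviving record; OtherTable references personno.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE People (personno TEXT PRIMARY KEY, status TEXT, primaryid TEXT);
CREATE TABLE OtherTable (id INTEGER PRIMARY KEY, personno TEXT);
INSERT INTO People VALUES ('P2', 'Duplicate', 'P1'), ('P1', NULL, NULL);
INSERT INTO OtherTable (personno) VALUES ('P2'), ('P1');
""")

# Remap every reference to a duplicate person onto its primary id.
cur.execute("""
UPDATE OtherTable
SET personno = (SELECT p.primaryid FROM People p
                WHERE p.personno = OtherTable.personno
                  AND p.status = 'Duplicate')
WHERE personno IN (SELECT personno FROM People WHERE status = 'Duplicate')
""")
conn.commit()

rows = [r[0] for r in cur.execute("SELECT personno FROM OtherTable ORDER BY id")]
print(rows)  # ['P1', 'P1'] -- both rows now point at the primary person
```

After the remapping, the duplicate People rows no longer block deletion, since nothing references them anymore.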
Regards, RSingh -
Hi Gurus,
I have done a full repair load from source to target, which I do every month, and after that load I used to get exactly matching results in BW.
But this time, after loading the data (full repair), I am getting double records compared to the R/3 report.
R/3 report:
material no   Qty
100000        2
100030        6
100040        8
110000        5
Total         21

BW report:
material no   Qty
100000        4
100030        12
100040        16
110000        10
Total         42
Where is the issue and how do I solve it?
Regards...
Hi Saha,
Instead of doing a selective deletion, right-click the cube --> Delete Data (including dimension tables),
then load the data.
Regards
ReddY A -
Delete double records by coding in Infoset
Hello,
I join 7 tables in an InfoSet and later get double records in my DataSource. My question:
Is it possible to delete these double records (I need only 1 record) by coding in the InfoSet? And if so, how can I do this? I'm an ABAP novice.
Thanks.
Michael
Thanks for the fast answers. The example is very good, but I still have two questions about it.
1. I also have characteristic values; can I use MIN and MAX here?
2. Only records with the same key of "KUNNR" and "BUKRS" should be aggregated; how can I do this? (A small example would be helpful.)
Many Thanks.
Michael -
How to resolve double records entry
Hi Gurus
Can anybody tell me how I should answer if I am asked how to solve issues regarding double records?
thanks in advance
murali
Dear Murali,
This discussion has come up many times in our forums. Here are some of the links for easy access:
duplicate records and Issues regarding double records
duplicate records error?
Re: Duplicate Records in Employee MD
Thanks,
Raj -
Remove double records during data upload from one InfoCube to another
Dear Experts
We have transactional financial data available in an InfoCube, including cumulated values by period. Some companies have 12 reporting periods (0FISCPER3) and some companies have 16 periods (but all 16 periods are not always filled). The data must be prepared for a consolidation system which expects only 12 periods. Therefore I built a routine with the following logic:
If period > 12, result = 12, else result = source field.
But as the data target is (and must be) an InfoCube, the new values with reporting period 12 are not overwritten but summarised instead. This means both the original records with period 12 and the new records remain - see example:
Records before transformation:
Period Amount
12 100
13 120
Records after transformation in the InfoCube:
Period Amount
12 100
12 120
This would lead to the following aggregation:
Period Amount
12 240
But as the values per period are cumulated, the consolidation system only needs the last period. So there should be only one record left in the InfoCube:
Period Amount
12 120
Is it possible to delete duplicate records, or do you have any other idea how to keep only the record with the last period (e.g. period 13) and assign it the period value 12?
Thanks a lot in advance for your help!
Regards
Marco
Hi,
You have two options here. First, you can put a DSO between the DataSource and the InfoCube and load the delta using the change log.
Second, use deletion of overlapping requests from the InfoCube; it will delete the previous requests and load the new request.
Check the below article:
[Automatic Deletion of Similar or Identical Requests from InfoCube after Update|http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/e0431c48-5ba4-2c10-eab6-fc91a5fc2719]
Hope this helps...
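A rough Python sketch of the "keep only the last period per key, then clamp it to 12" logic the question is after (field names and values taken from the example above; this is an illustration of the idea, not transformation-routine code). A DSO keyed without the period achieves the same effect via overwrite, provided the data arrives sorted by period:

```python
records = [
    {"company": "C1", "period": 12, "amount": 100},
    {"company": "C1", "period": 13, "amount": 120},
]

# Later periods overwrite earlier ones, exactly like a DSO whose key
# does not include the period.
latest = {}
for rec in sorted(records, key=lambda r: r["period"]):
    latest[rec["company"]] = rec

# Only now clamp the surviving record's period to 12.
result = [{**rec, "period": min(rec["period"], 12)} for rec in latest.values()]
print(result)  # [{'company': 'C1', 'period': 12, 'amount': 120}]
```

The key point is that the clamp must happen after the overwrite step; clamping first (as in the original routine) creates two period-12 records that the cube then sums.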
Rgs,
Ravikanth -
Double Records in Update from DSO to CUBE
Hi,
I have a standard flow of DataSource - DSO - InfoCube - Multi Provider
The data in DSO is correct BUT the data in INFOCUBE is DOUBLED.
Question: could this be due to a mismatch of key fields?
I only have a subset of DSO Key Fields in my InfoCube.
Could this be the reason?
Hi,
Once the DSO has loaded a request into the InfoCube, the DSO is not allowed to load the same request a second time; it transfers only '0' records. Please check the requests in the InfoCube.
If the cube has double data, you can delete data in the InfoCube after compression based on the request ID: we can go for reverse posting. We can reverse-post the data only when we process the data through the PSA.
If you reverse-post the data for a compressed request, the data for that request is taken from the PSA, the key figure values are multiplied by -1, and the result is loaded into the data target. The data in the data target is then available in the F table with negative key figure values. On compression of the InfoCube the data moves from the F table to the E table and nullifies the data in the InfoCube for that request ID.
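The reverse-posting arithmetic can be sketched in Python (material numbers and quantities invented):

```python
# The bad request is re-posted with its key figures multiplied by -1,
# so that after compression the two requests cancel out to zero.
bad_request = [("MAT1", 100), ("MAT2", 50)]
reversal = [(mat, -qty) for mat, qty in bad_request]

totals = {}
for mat, qty in bad_request + reversal:
    totals[mat] = totals.get(mat, 0) + qty

print(totals)  # {'MAT1': 0, 'MAT2': 0} -- the request is nullified
```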
Note: to delete the zero key figure values, check the zero elimination option in the Collapse tab. -
Locate and remove duplicate records in an InfoCube
Hi!!
We have found that the InfoCube 0PUR_C01 contains duplicate records for April 2008; approx. 1.5 lakh records were extracted into this InfoCube, and similar situations may be occurring in the subsequent months.
How do I locate these records and remove them from the InfoCube?
How do I ensure that duplicate records are not extracted into the InfoCube?
All answers/ links are welcome!!
Yours Truly
K Sengupto
First:
1. How do I locate duplicate records in an InfoCube, other than downloading all the records into an Excel file and using Excel functionality to locate them?
This is not really possible, since an exact duplicate record would not exist as such - records are sent to a cube with + and - signs and summarized accordingly.
Your search for duplicate data would therefore become that much more troublesome.
If you have a DSO to load it from: delete the data for that month and reload if possible. This would be quicker and cleaner than removing duplicate records.
If you had
ABC|100 in your DSO and it got doubled
it would be
ABC|+100
ABC|+100
against different requests in the cube - and on top of this will be your correct deltas as well. -
Making double records single in a query for a particular period in BW
Hi
In my ODS some of the records got doubled from Mar 17th to May 31st, but from the 1st of June the data is correct. These are open orders; I don't want to reload the data to that ODS because, being open orders, they will close after some time and automatically become "0".
So I want to calculate in the query itself to make the values single.
Please can someone suggest how I can divide by 2 for that particular period only?
for example :
Open Orders (its In ODS)
120
140
20
10
50
I need the output to be
60
70
10
5
25
Note : BW 3.1 Version
Thanks,
Gal
Hi,
You can create a formula variable with replacement path on the period (replaced with the key of Posting Date/Calday) and use it for comparison in the formula like below:
if posting date is between date1 and date2,
order value = order value / 2,
else
order value = order value.
In the BEx Query Designer you will have the below formula, using the fact that a true condition evaluates to 1 and a false one to 0:
(posting date formula variable > date1 AND posting date formula variable < date2) * (OrderValue / 2) + (NOT (posting date formula variable > date1 AND posting date formula variable < date2)) * OrderValue
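A small Python sketch of this date-window logic (the window dates are taken from the question; the function name is invented). Like the BEx formula, it relies on a boolean condition evaluating to 1 or 0:

```python
from datetime import date

# Doubled window from the question: Mar 17th to May 31st, 2010.
D1, D2 = date(2010, 3, 17), date(2010, 5, 31)

def corrected(order_value, posting_date):
    in_window = D1 <= posting_date <= D2
    # BEx-style boolean arithmetic: cond * (v/2) + (NOT cond) * v
    return in_window * (order_value / 2) + (not in_window) * order_value

print(corrected(120, date(2010, 4, 1)))  # 60.0 -> halved inside the window
print(corrected(120, date(2010, 6, 5)))  # 120  -> untouched outside it
```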
But also note that when the order is closed, the order value will not become zero, since the value is doubled up in the DSO. When the delta records come into the DSO the order will be closed, but the value will still not become zero, since the reverse image only subtracts the single (actual) order value.
For example, if your order value is 100, your DSO will currently show the value 200 (doubled up). In the delta the reverse value comes as -100, so the final value will still be 100 (200 - 100).
So the best solution will be to do selective deletion
Thanks,
Praveen
Edited by: Praveen kumar kamineni on Jun 11, 2010 11:06 AM -
Hi,
I am using 8i version and SQL*Plus 8.1.
When spooling a SQL statement to a file the number of records in
the file is doubled (the statement returns 27800, for example,
while the file contains 55600).
Why? How can I prevent it?
By the way, I am very new to oracle, so please be gentle...
Thanks.
Are you using both a semicolon ';' and a slash '/' to terminate
your SQL statement?
In SQL*Plus a SELECT statement can be terminated with either one.
If you subsequently use a '/', then the statement will be run a
second time.
E.g. do:
select * from dept;
and don't do:
select * from dept;
/
If this is not your problem, can you confirm that the records are
repeated? Or is each record wrapped over two lines?
- CJ -
How to delete a request with 1 million records in an InfoCube
Hi All,
Our extraction from ODS to InfoCube led to an error (time-out). The total data transfer is more than 1 million records. We tried to delete the request, but to no avail; it's still there.
Is there another, safer way to remove this request, so that we can reconstruct the data? There are no errors in the data itself.
Any helps will be rewarded with points.
Thank you.
Regards,
Azlan.
Hi All,
Thanks for the reply. Points rewarded for everyone. The scenario was: there was a time-out during the loading from ODS to InfoCube and the job was terminated. A short dump was also generated. The time-out was set to 7 hrs. We will investigate further what caused the error. The immediate resolution was:
1) Delete the error RequestID (mark with red icon)
2) Wait until finish
3) Refresh; the request is still there. Select it and delete again
4) Wait again. Refresh, and now it's gone.
5) Go back to the ODS and reload the appropriate request to the InfoCube
6) Second time round, everything OK.
Hope this will help others.
Thanks everyone. -
Unable to delete double records from internal table
Hi all,
The internal table is like this
begin of ta_itab1 occurs 0,
mark type c,
cnt_hedg type c,
kunnr like vbak-kunnr,
vbeln like vbak-vbeln,
posnr like vbap-posnr,
matnr like vbap-matnr,
kwmeng like vbap-kwmeng,
h_kwmeng like vbap-kwmeng,
spart like vbap-spart,
werks like vbap-werks,
component like bom_item_api01-component,
comp_qty like bom_item_api01-comp_qty,
comp_qty1 like bom_item_api01-comp_qty,
base_quan like stko_api02-base_quan,
comp_unit like bom_item_api01-comp_unit,
base_unit like bom_item_api01-comp_unit,
bukrs_vf like vbak-bukrs_vf,
end of ta_itab1.
and used the syntax:
sort ta_itab6 by kunnr vbeln.
DELETE ADJACENT DUPLICATES FROM ta_itab6 comparing COMP_QTY COMP_QTY1.
but I'm unable to delete the duplicate records.
Thank You.
anu
Hi,
You need to include in the SORT statement the fields on which you want to perform DELETE ADJACENT DUPLICATES:
sort ta_itab6 by kunnr vbeln COMP_QTY COMP_QTY1.
DELETE ADJACENT DUPLICATES FROM ta_itab6 comparing COMP_QTY COMP_QTY1. -
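Why the extra fields in the SORT matter can be sketched in Python (field values invented): DELETE ADJACENT DUPLICATES only removes duplicates that end up next to each other, so the table must be sorted by the very fields used for comparison.

```python
def delete_adjacent_duplicates(rows, key):
    # Keep a row only if its comparison key differs from the previous row's,
    # mimicking ABAP's DELETE ADJACENT DUPLICATES ... COMPARING.
    out = []
    for row in rows:
        if not out or key(row) != key(out[-1]):
            out.append(row)
    return out

rows = [("K1", 5), ("K2", 7), ("K3", 5)]
k = lambda r: r[1]  # compare on the quantity field only

unsorted_result = delete_adjacent_duplicates(rows, k)
sorted_result = delete_adjacent_duplicates(sorted(rows, key=k), k)

print(len(unsorted_result))  # 3 -> the two qty-5 rows are not adjacent, nothing removed
print(len(sorted_result))    # 2 -> after sorting on qty, the duplicate is removed
```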
WebInterfaces for Millions of records - Transactional InfoCube
Hi Gerd,
Could you please suggest which one I should use when I'm dealing with millions of records (large amounts of data):
displaying data from planning folders, or the Web Interface Builder?
Right now I'm using the Web Interface Builder for planning where the user is allowed to enter values - for millions of records, such as revenue forecast planning on sales orders.
Thanks in advance,
Thanks for your time,
Saritha.
Hello Saritha,
Well - technically there is no big difference between using Web interfaces and planning folders. All data has to be selected from the database, processed by the BPS, transmitted to the PC, and displayed there. So both front ends should have roughly the same speed.
Sorry, but one question - is it really necessary to work with millions of data records online? The philosophy of the BPS is that you should limit the number of records you use online as much as possible - it should be an amount the user can also handle online, i.e. manually working with every record (which is probably not possible when handling 1 million records). If a large number of records is to be calculated or manipulated, this should be done in a batch job, i.e. a planning sequence that runs in the background. This prevents the system from terminating the operation due to a long run time (the usual time until a time-out occurs for an online transaction is about 20 min) and also gives you more opportunities to control memory use or the parallelization of processes (see note 645454).
Best regards,
Gerd Schoeffl
NetWeaver RIG BI -
Hi Experts,
Last week, I encountered an odd case in my customer's SBO application, one I had never seen before.
Sometimes, when he adds a document as an Invoice, the document is created but the journal entry is created twice. The two journal entries get different journal numbers, but they have the same origin number.
This has happened several times. I am wondering what could be the reason and how I can solve it.
Thank you in advance
Only on documents such as Incoming Payment, etc., where cancelling the payment is an option, would there be a possibility that 2 JEs were created referencing the same DocNum.
I can't think that this could happen with Invoices.
When you open the Invoice and click on the link next to Journal Remark, whichever JE it points to is the JE which it created.
Please compare all the columns of the JE (JDT1 table) and then get back.
Suda