Deletion of duplicate request by routine in infopackage
I am loading data from a flat file without any selection conditions. The load is a YTD load, i.e. in April the load contains data for January to March, and in May it contains data for January to April. So I want to delete the previous request when the new one is loaded.
The problem: in January 2008 the data loaded covers January 2007 to December 2007, and in February 2008 the load covers January 2008. I do not want the February load to delete the request holding January to December 2007, i.e. no deletion of last year's data in February.
How to write a routine for this.
Thanks.
Hi
Search for the base table where requests are stored for the InfoPackage.
In the routine you can read this table and delete the corresponding requests.
Let me know if you need help
Thnx
Gaurav
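Building on Gaurav's hint, here is a minimal sketch of the year check such a deletion routine could make. Everything below is an assumption, not from the thread: it assumes the standard deletion-routine interface with a table l_t_request_to_delete of structure RSREQDELSTRUC, and that RSREQDONE-DATUM holds the load date; verify both in SE11 before use.

```abap
* Hedged sketch -- field and structure names are assumptions,
* verify RSREQDELSTRUC and RSREQDONE in SE11.
DATA: l_s_reqdone TYPE rsreqdone.

LOOP AT l_t_request_to_delete.
  SELECT SINGLE * FROM rsreqdone INTO l_s_reqdone
         WHERE rnr = l_t_request_to_delete-rnr.
* Requests loaded in a previous calendar year are removed from the
* deletion table, which means they will NOT be deleted from the cube.
  IF sy-subrc = 0 AND l_s_reqdone-datum(4) <> sy-datum(4).
    DELETE l_t_request_to_delete.
  ENDIF.
ENDLOOP.
```

The idea: the February 2008 load finds only the January 2007 to December 2007 request, sees it was loaded in a different year, and leaves it alone; from March onward the previous YTD request of the same year is deleted as intended.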
Similar Messages
-
How to delete the duplicate requests in a cube after compression.
Hi experts,
1. How to delete the duplicate requests in a cube after compression?
2. How to show a characteristic and a key figure side by side in a BEx query output?
Regards,
Nishuv.
Hi,
You cannot delete the request once it is compressed, as all the data has been moved to the E table.
If you have duplicate records you can use selective deletion.
Check this thread ..
How to delete duplicate data from compressed requests?
Regards,
shikha -
Unable to delete the last request in full load infopackage
Hi All,
I have a full load InfoPackage with many requests in green status, and their generated request ID is 0.
Because of the last failed request I am unable to activate the DSO.
I set the last failed request to green and triggered activation, but it was not successful.
I set it to red and tried deleting it, with no success; the deletion gives a dump.
Now I am also unable to delete the first request in green status.
Can I delete the whole data in the DSO and do a full load again? How do I check whether there are other InfoPackages loading this DSO?
All these issues came up while triggering the process chain. Please let me know your answers.
Thanks,
Venkat
Hi Venkatesh,
I tried deleting through RSODSACTREQ, but the delete option was disabled in the table entry tab.
The last request is the failed one, in red and not transferred with full records.
All the other requests in green status do not have the symbol generated for reporting purposes, and
the request ID generated upon activation is zero.
When I change the first request's status to red and try to delete it, I get:
QM action on PSA Z[DSONAME]: checked to see if automatic activation of the M version should be started. The M version is then activated if necessary.
Req. 0001439786 in DataStore Z[DSONAME] must have QM status green before it is activated.
Request 0001427251 is not completely activated. Please activate it again.
But I am unable to find either of these requests under the PSA or in the administration data target tab; there are some other requests with red status in the PSA.
When I try to delete the failed request it gives a dump.
Thanks,
Venkat. -
How can I debug a routine created in Deletion of similar requests?
Hello Experts,
I need some help from you. I need to delete overlapping requests and the common
settings you can set are not suitable, so I decided to write my own ABAP routine.
But how can I debug a routine, which is implemented in the deletion of similar requests?
Hope you can help me.
Cheers
Daniel Weilbacher
Please go to the routine code.
In the menu options you will find an option Breakpoint > Set.
The line where the breakpoint is set will be highlighted and you will also see a STOP symbol.
Then come out of the routine and run the data load.
This ABAP routine for overlapping request deletion runs before the scheduled data load,
so the run should stop at the set breakpoint and show the routine code in debugger mode.
cheers,
Vishvesh -
How to debug routine in process type "Deletion of overlapping requests"
Hi all,
I created a process chain including a process of the type "Deletion of overlapping requests from InfoCube".
In this process I created a routine to decide which requests to delete. Now I would like to debug this routine, but do not know how.
Merely setting a breakpoint does not seem to help.
Does anyone have a hint how to debug this routine?
Many thanks,
Stefan
Hi,
Put a "BREAK-POINT." statement into your code (don't forget to remove it afterwards). Then activate your process chain and go to the Execution menu at the top of RSPC. There choose "Execute synchronous to Debugging". The process chain should then start and stop at the point where you put the breakpoint.
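As a minimal illustration of this technique (the table name below is the standard deletion-routine interface; treat it as an assumption):

```abap
* Inside the overlapping-request deletion routine:
LOOP AT l_t_request_to_delete.
  BREAK-POINT. " with "Execute synchronous to Debugging",
               " execution halts here for each request
ENDLOOP.
```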
Hope it helps.
Stefan -
Delete similar request option in the infopackage
Hello BW Experts,
I have to load Period 01 many times a month. Each time it loads to the cube it has to delete the similar request from the cube. Where can I do this setting in the InfoPackage?
Please advice.
Thanks,
BWer
Hi BWer,
You can do this on the Data Targets tab of the InfoPackage, in the third column from the right (Automatic Deletion of Similar Requests): click it and select "Delete Existing Requests > Same or more comprehensive".
Hope this helps... -
How to delete cube old request by using abap code?
Dear experts,
Here is one thing need your help:
We have one cube from which we need to delete the old request. This cube is loaded daily, and we only need to delete the requests of the current month.
I know there is a "Delete Overlapping Request" process type in the process chain, but my concern is: in our DTP filter, the selection has different values on different days, say A and B, and A and B do not overlap at all.
So my question is:
1. Can I still use "Delete Overlap Request" to delete the old request?
2. If so, I see it can be implemented by abap routine, can anyone give me some sample code? Like how to delete request only in this month. And how to delete request whose selection is equal to a specific value.
Note: one of the source cube is real time cube so can not delete the request from infopackage.
Any post will be appreciated and thank you all for your time in advance!
Regards.
Tim
Here is the initial code generated when I choose "Request Selection through Routine". Please help me with how to choose the specific requests and delete them in this routine.
program conversion_routine.
* Type pools used by conversion program
* Global code used by conversion rules
*$$ begin of global - insert your declaration only below this line -
TABLES: ...
DATA: ...
*$$ end of global - insert your declaration only before this line -
* InfoCube = ZPC_DEL_REQ_ZBPS_C01
form compute_ZPC_DEL_REQ_ZBPS_C01
  tables l_t_request_to_delete structure rsreqdelstruc
  using l_request like rsreqdone-rnr
  changing p_subrc like sy-subrc.
* Insert source code to decide if requests should be deleted.
* All requests in table l_t_request_to_delete will be deleted
* from InfoCube ZPC_DEL_REQ_ZBPS_C01.
* Add new requests if you want to delete more (from this cube).
* Remove requests you do not want to be deleted.
*$$ begin of routine - insert your code only below this line -
* Hedged example (an assumption, not generated code): keep only
* requests loaded in the current month in the deletion table.
* RSREQDONE-DATUM as the load date should be verified in SE11.
  data: l_s_reqdone type rsreqdone.
  loop at l_t_request_to_delete.
    select single * from rsreqdone into l_s_reqdone
           where rnr = l_t_request_to_delete-rnr.
*   Requests not loaded this month are removed from the deletion
*   table, i.e. they will NOT be deleted from the cube.
    if sy-subrc <> 0 or l_s_reqdone-datum(6) <> sy-datum(6).
      delete l_t_request_to_delete.
    endif.
  endloop.
  clear p_subrc.
*$$ end of routine - insert your code only before this line -
endform. -
Regarding the routine in infopackage
Hi Experts,
I have an InfoPackage with a full load update only, and I am loading a daily load into the target with it. What I want is to create a routine in the InfoPackage so that it deletes the previous day's data automatically, so that when I load today's data it won't be duplicated. Can anyone give me the code and step-by-step instructions to do that? I would appreciate the answers and assign maximum points for the valuable ones.
regards,
Rk
If it is going to be a full load, you have some options:
a. At InfoPackage level: under the Data Targets tab, there is a checkbox for deleting the target's contents before loading. If you check it, the data will be deleted automatically before loading.
But beware that ALL data will be deleted and only today's data will remain.
b. Process chains: if your selections are the same, you can use the process type "Delete Overlapping Request" in the process chain, and it will delete the contents before reloading them.
Hope this helps.
Cheers,
Sumit -
How to delete the compressed Request ID in the info cube... ?
Hi BW Gurus,
One of the InfoPackages uploads the request into three data targets and failed due to an error with duplicate records. I am able to delete the bad request in two data targets, but I am facing a problem with one data target because the request there is green and already compressed.
The reason I am unable to delete the request in that data target is that it has already been compressed and rolled up.
I tried selective deletion based on the same request ID and that was successful, but the request is still shown in the data target. I checked in LISTCUBE and found no data for that request ID.
One more question: the request has been set to status 'not OK' in the monitor and selective deletion has been done. Will this cause missing data for that particular data target? Please advise.
can any one help on this. Thanks in Advance.
Venkat.
Hi Venkat
You have one way to delete the compressed request,
but this is only possible if you still have the request in the PSA.
If the request is in the PSA, do a reverse posting.
This will nullify the particular bad request by reversing the sign (+ becomes - and - becomes +) for all the records that went into the cube with that request.
Regards
N Ganesh -
How to delete the duplicate cards in ALUI
Hi,
How to delete the duplicate cards in knowledge directory
Thanks & Regards
Dheeraj
Edited by: dheeraj on Sep 17, 2008 2:31 PM
Not sure if there is an automated way to do this, but I get around it by just searching on (2), (3), (4).
Not practical for large installs, but it works in our small environment.
You might be able to use the Smart Sort utility to find files that have () characters in them? Not positive though.
It would make a great enhancement request to add dupes to either a log file or better yet an admin utility to track the living documents to ensure they are fixed!
(I'm going to file such an enhancement right now!) -
How to delete a specific request out of the InfoProvider
Hi all,
we have a generic datasource to extract our CO-PA data. Unfortunately this datasource can not deliver any delta at the moment. This is why we need to have a full upload. In InfoPackage we select the single period 001, 002, or 003.
Now in March we only need to load the period 003 daily. In my process chain I would like to have a function that deletes only the request of this infopackage and not also the requests of the previous months, so I don't need to load all everyday, but only the current month.
Do you know such a process type?
Thanks in advance!
Hi Sandra,
1. Try the "Delete Overlapping Requests from InfoCube" process type.
2. You can also do this in the InfoPackage: go to the Data Targets tab, "Automatic Loading of Similar/Identical Requests".
These two are only possible for a Cube.
Anup. -
ABAP to delete a specific request (last one) from a Cube
Hello Experts,
In a process chain, I would like to delete the last request in a cube before proceeding through with the rest of the PC. For that purpose, I have an ABAP program which will retrieve the "last request ID" from the cube, then call FM RSSM_DELETE_REQUEST to delete that request using that ID and the cube name.
Once in a while, this deletion doesn't work: the request status is set to erroneous and the PC goes on with its life.
I am currently investigating this issue and have no means to easily test it, including getting the actual error code from the deletion FM if there is one, so I'm looking for leads or ideas on the possible causes of such a problem: the cube is fairly simple and does not use compressed requests or aggregates. When I delete the request manually, there are no problems.
Is there any other FM more appropriate to delete a specific request such as the last one? Or some best practices associated with this particular task... I'm very new to SAP Netweaver so don't hesitate to point out the obvious.
Regards,
Guillaume
Hello Matthew,
I may be wrong, but it seems that "Delete Overlapping Request" won't be enough to obtain the specific behavior I am looking for.
To give more context: this cube is used as a history cube. Until a specific date, the latest request, and only that one (not the previous ones from the same DTP), must be deleted and then replaced with the new data loaded in the process chain. The data is a value that gets updated every day, but after a specific date (read from a table) it is no longer updated and is considered "history", so the new data no longer replaces the previous request but is added to the cube. Not sure if that was clear.
Here is an example.
The cube contains three requests, A, B and C.
At D1, a new request, D, is loaded and since D1 isn't the "specific day", C is deleted (being the last request) and replaced with D. => The cube contains A, B, D.
At D2, a new request, E, is loaded, and this time D2 is the "specific day", so D isn't deleted and E is loaded into the cube, which now contains A, B, D, E.
I hope this cleared up the actual purpose of this routine.
Do you think there is a way (some conditions or something like that?) to set a "Delete Overlapping Request" to achieve this? This would be much simpler than the current solution.
However, it would still be nice if I could understand what is wrong with the straightforward ABAP program we use in the first place.
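For reference, a minimal sketch of the kind of program described above. Everything here is an assumption to be verified: ZHISTCUBE is a placeholder cube name, RSICCONT as the requests-per-target table and the RSSM_DELETE_REQUEST interface should both be checked in SE11/SE37 on your system.

```abap
REPORT zdel_last_request.

DATA: l_rnr TYPE rsiccont-rnr.

* Newest request in the cube; ordering by TIMESTAMP is an assumption.
SELECT rnr FROM rsiccont INTO l_rnr
       UP TO 1 ROWS
       WHERE icube = 'ZHISTCUBE'
       ORDER BY timestamp DESCENDING.
ENDSELECT.

IF sy-subrc = 0.
* Delete that request from the cube.
  CALL FUNCTION 'RSSM_DELETE_REQUEST'
    EXPORTING
      request  = l_rnr
      infocube = 'ZHISTCUBE'.
ENDIF.
```

One possible cause of the intermittent failures worth checking: if the program runs while another load or rollup on the cube is still active, the deletion can fail with a lock; the FM's exceptions (if any) should be handled rather than ignored.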
Regards,
Guillaume -
Delete overlapping/duplicate records from cube
Hi All,
Kindly let me know how to delete overlapping requests from a cube. The cube is loaded from various InfoSources, but there are records which get duplicated and are not wanted. How can the duplicate records be deleted from the cube?
Regards,
dola
I think what Arun said is perfectly right:
use a DSO for consolidation of the various requests from different InfoSources,
then load from the DSO to the cube. It is very much possible, though it will require a little work.
The 'delete duplicate records' option is usually used for master data; with transaction data I don't think it is advisable.
Regards,
RK -
Repeating successful uploads - procedure to delete all the requests/packets
Hi friends,
I have a sort of hypothetical question, so never mind if this makes sense in real life (actually, I can think of a few situations where it makes sense, but only as a last resort).
Let's say I want to repeat the upload that was successful (request is green in the cube). Or another point of view, let's say I want to delete all successful uploads except the initial one. The load is delta load and straight from R/3 into the cube (no DSOs or other objects in between).
What do I need to do/check? I obviously need to delete the requests in the cube, but where else can the data be stored? PSA is probably the first place to check, but it seems to be empty most of the times. How do I delete all the data from the PSA? The BI delta queue is always empty when I look at it from the Administration menu. Again, must I delete it and how? I guess I should delete the delta queue in R/3 as well (through SBIW)?
How do I do the same thing when there ARE other objects (like DSOs) before the cube?
Thanks in advance for your advice/help.
Kind regards,
K.
Probably it does not :)
Anyway, I was thinking of a hypothetical situation where an administrator accidentally repeats an upload at some point and thus doubles the figures, then does not notice it immediately while the system keeps loading requests for some time.
I actually wanted to prepare a solution to this problem in advance, since it would be difficult (if not impossible) to find the duplicate records, especially after compression. I thought that simply deleting all the requests and reloading through a normal delta would solve this hypothetical situation. But again: where and how do I delete everything (and be sure I deleted everything) to ensure that the records will not be doubled?
K.