ABAP report to read request details in data-target and delete
Hi,
Is there any ABAP report that I can run in the background to read all request details (request ID, date) in a data target?
And is there any ABAP report that can delete a particular request?
Regards
Vikrant
Hi Vikrant,
You can check tables RSREQDONE and RSREQICODS for all the details about requests.
For deletion, you can try the standard function in the InfoPackage (Data Targets tab) that deletes similar/identical requests when uploading.
Ciao.
Riccardo.
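To make the pointer above concrete, a minimal report along these lines could list request IDs and load dates from RSREQDONE. The field names used here (RNR for the request number, DATUM for the date) are assumptions based on the usual layout of the BW request administration table; verify them in SE11 on your release. For deleting a particular request programmatically, the function module RSSM_DELETE_REQUEST is frequently mentioned on SDN, but check its interface in SE37 before relying on it.

```abap
* Hedged sketch: list request number and load date from RSREQDONE.
* Field names RNR and DATUM are assumptions - check the table in SE11.
REPORT z_list_requests.

PARAMETERS p_max TYPE i DEFAULT 100.

TYPES: BEGIN OF ty_req,
         rnr   TYPE c LENGTH 30,
         datum TYPE d,
       END OF ty_req.

DATA lt_req TYPE STANDARD TABLE OF ty_req.

SELECT rnr datum FROM rsreqdone
  INTO CORRESPONDING FIELDS OF TABLE lt_req
  UP TO p_max ROWS.

LOOP AT lt_req INTO DATA(ls_req).
  WRITE: / ls_req-rnr, ls_req-datum.
ENDLOOP.
```

Scheduled in the background (SM36), this writes the list to a spool that you can review later.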
Similar Messages
-
No of requests in a Data Target
Hi
How/where can I see the count of requests loaded into a data provider in BI 7.0?
Thanks
Hi,
RSICCONT is the table used to check the requests at the data-target level (cube and DSO). If you are not able to delete a request from a data target in production (sometimes you will not have authorization to delete requests), you can also delete it via this table.
Regards
Ram. -
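As a sketch of the suggestion above, the number of requests sitting in a data target can be read from RSICCONT. The field name ICUBE for the data target is an assumption; confirm the table layout in SE11 before use.

```abap
* Hedged sketch: count requests recorded in RSICCONT for one data target.
* Field name ICUBE is an assumption - verify the table layout in SE11.
REPORT z_count_requests.

PARAMETERS p_tgt TYPE c LENGTH 30 OBLIGATORY. " cube or DSO name

DATA lv_count TYPE i.

SELECT COUNT(*) FROM rsiccont
  INTO lv_count
  WHERE icube = p_tgt.

WRITE: / 'Requests in', p_tgt, ':', lv_count.
```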
PSA request updated to data targets but not showing
Hi All,
This must be a basic question for BW support activities, but I would appreciate it if someone could help me out, as I was unable to find a similar post.
In the PSA, the request shows as successfully updated to the data target, but I can't see the delta request in the data target (cube) under Manage.
This is a delta request. Can someone tell me how to load the data back into the cube? Do I need to delete the request from the cube and run it again? I guess the deltas would then be lost.
Please give me all the steps in detail for fixing this.
Thanks,
B
Hi,
There are two options.
1. In the PSA, find how many records exist for your request. Note down or download the records to Excel if the count is manageable; the delta is likely to be small. Then load them to the PSA and on to the target through a repair full request, using proper selections so that only the required records are loaded.
2. The PSA request is stored in a back-end table. If you can change its 'updated further' status in that table, you can push the PSA request to the target through a delta DTP.
Thanks -
Error "cannot load request real time data targets" for new cube in BI 7.
Hi All,
We have recently upgraded our SCM system from 4.1 to SCM 7.0, which incorporates BI 7.0.
I am using BI 7.0 for the first time and have the following issue:
I created a new InfoCube and a flat-file DataSource, and successfully created the transformation and Data Transfer Process. Everything looked fine. I added the flat file and checked the preview, and could see data. But when I start the job to load data into the InfoCube, the following error is shown: "cannot load request real time data targets".
I checked the cube type in the InfoCube settings; it shows as Standard. When I double-clicked on the error, the following message showed up:
You are trying to load data into a real-time InfoCube using a DTP.
This is only possible if the correct load settings have been defined for the InfoCube.
Procedure
In the object tree of the Data Warehousing Workbench, call Load Behavior of Real-Time InfoCube from the context menu of the InfoCube. Switch load behavior to Transactional InfoCube can be loaded; planning not allowed.
I did not understand what this means or how to change the setting. Can someone advise and walk me through it?
Thanks
KV
Hi Kverma,
Real-time InfoCubes can be filled with data using two different methods: using the transaction for entering planning data, or using BI staging (in which case planning data cannot be loaded simultaneously). For a real-time cube you can select the update method:
- Real-Time Data Target Can Be Loaded with Data; Planning Not Allowed
- Real-Time Data Target Can Be Planned; Data Loading Not Allowed
You can change this behaviour by right-clicking the cube and selecting 'Change real-time load behaviour', then choosing the first option. You will then be able to load the data.
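If you need to flip this setting programmatically (for example as a step in a process chain), the function modules RSAPO_SWITCH_TRANS_TO_BATCH and RSAPO_SWITCH_BATCH_TO_TRANS are commonly cited for exactly this purpose. The parameter name below is an assumption; check the actual interfaces in SE37 first.

```abap
* Hedged sketch: switch a real-time InfoCube to 'can be loaded' mode,
* run the load, then switch back to planning mode.
* The parameter name I_INFOCUBE is an assumption - verify in SE37.
REPORT z_switch_load_behaviour.

PARAMETERS p_cube TYPE c LENGTH 30 OBLIGATORY.

* Allow loading via DTP; planning is not possible while in this mode.
CALL FUNCTION 'RSAPO_SWITCH_TRANS_TO_BATCH'
  EXPORTING
    i_infocube = p_cube.

* ... trigger the DTP load here ...

* Switch back so that planning is possible again.
CALL FUNCTION 'RSAPO_SWITCH_BATCH_TO_TRANS'
  EXPORTING
    i_infocube = p_cube.
```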
Regards,
Kams -
Difference Between data target and infoprovider
Hi Experts,
I am new to BW
What is the difference between data target and infoprovider
Thanks and Regards,
Saveen
Hi Saveen,
An InfoProvider is an object on which BEx queries are created. It provides information (data) to the queries when they are executed. InfoProviders may contain data (like a cube) or may not contain data (like a MultiProvider or an InfoSet).
A data target is an object into which you load data, like a cube, ODS or InfoObject. But not every data target is an InfoProvider: you may have ODS objects that are not enabled for reporting, but to which you load data for staging purposes.
Hope this helps... -
Hello. I own a 5th-generation iPod touch and get the error "iTunes was unable to load provider data from sync services". I have tried all of Apple's tutorials, including resetting sync data and reinstalling iTunes after deleting all of its components, including its system registry entries. My system is Windows 7 64-bit.
UPDATE: The actual error is:
"iTunes was unable to load data class information from sync services"
None of the topics I found helped.
Sync works on my wife's laptop. -
Query data being picked from which data targets and dimension tables.
Hi Guys,
I need help from you people.
My question is: if we execute any query, I want to know from which data targets and from which dimension tables the data is being read at run time. Is there any program or table to find this?
Thanks in advance.
Regards
Prasad
Hi Prasad,
You can get the data-target information at query level on the Information tab.
If you also want dimension-table information, you need to use the technical business content (BW statistics) cubes and customize them for the required information. I think the standard statistics cubes do not provide dimension-table information; that needs to be customized.
Hope this helps.
Thanks,
Chandra -
Scheduling a background abap report to run on a specific date
Hello Experts,
I have an ABAP report I want to schedule in another program.
Basically, I want to schedule it so that it runs the next day (sy-datum + 1). I have used FM JOB_OPEN followed by SUBMIT and JOB_CLOSE, but the program is executed immediately.
Below is the code snippet:
v_datenext = sy-datum + 1.

CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    delanfrep = ' '
    jobgroup  = ' '
    jobname   = jobname
  IMPORTING
    jobcount  = jobcount.

SUBMIT z_prg_spinterf
  WITH i_year  EQ v_subyrnext
  WITH i_month EQ app->g_str_header-f_month
  WITH i_dc    EQ 'B'
  USER sy-uname
  VIA JOB jobname NUMBER jobcount
  AND RETURN.

starttime-sdlstrttm = sy-uzeit + 60.

* Close job. The start date/time must be passed here, and STRTIMMED must
* be blank: if it is 'X', the job starts immediately, which is exactly
* the behaviour described above. PRDDAYS = 1 would make the job periodic
* (daily); omit it for a one-off run.
CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobcount     = jobcount
    jobname      = jobname
    sdlstrtdt    = v_datenext
    sdlstrttm    = starttime-sdlstrttm
    strtimmed    = ' '
    targetsystem = host.
The report z_prg_spinterf is executed immediately, which I don't want; it should run the next day.
Can someone help me find the source of the error? Maybe I am missing something.
Thanks,
Nitish.
Hi,
1. Go to transaction SM36.
2. Select the Job Wizard and press Continue.
3. Enter a job name, e.g. ABC, and select a job class (priority), e.g. 'High priority'; press Continue.
4. Select the ABAP program step and press Continue.
5. Enter the correct program name, e.g. Z_REPORT, and the correct variant name, e.g. V1; press Continue.
6. For the job start condition, select the 'Date/Time' radio button and press Continue.
7. Enter the scheduled date (the date on which the report is to be executed) and the desired time; press Continue, then Complete.
8. If successful, the report will be executed at that time.
To see the output of the report:
1. Go to transaction SM36.
2. Select 'Own jobs'.
3. Find your job name.
4. Press 'Spool'; you will find the output.
Help to read a table with data source and convert time stamp
Hi Gurus,
I have a requirement to write an ABAP program. When I execute the program, it should ask me to enter a DataSource name; the code should then read a table with this DataSource as key, sort by timestamp, and display the DataSource and timestamp as output.
As follows:
Enter Data Source Name:
The user enters: 2lis_11_vahdr
The output should then be "Data source:" plus the date, e.g. 10-15-2008.
The timestamp format in the table is 20050126031520 (YYYYMMDDhhmmss). I have to display this as 01-26-2005. Any help would be appreciated.
Thanks,
Ram
Hi Jayanthi Babu Peruri,
I tried extracting YEAR, MONTH and DAY separately and writing them using an EDIT MASK.
There is probably a standard conversion routine for this, but I have no idea which one.
DATA : V_TS   TYPE TIMESTAMP,
       V_TS_T TYPE CHAR16,
       V_YYYY TYPE CHAR04,
       V_MM   TYPE CHAR02,
       V_DD   TYPE CHAR02.

START-OF-SELECTION.
  GET TIME STAMP FIELD V_TS.
  V_TS_T = V_TS.
  CONDENSE V_TS_T.
  V_YYYY = V_TS_T(4).
  V_MM   = V_TS_T+4(2).
  V_DD   = V_TS_T+6(2).
  V_TS_T(2)   = V_MM.
  V_TS_T+2(2) = V_DD.
  V_TS_T+4(4) = V_YYYY.
  SKIP 10.
  WRITE : /10 V_TS,
          /10 V_YYYY,
          /10 V_MM,
          /10 V_DD,
          /10 V_TS_T USING EDIT MASK '__-__-__________'.
If you want the date alone, just declare V_TS_T with length 10.
Regards,
R.Nagarajan.
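The standard conversion alluded to above does exist: the CONVERT TIME STAMP statement splits a timestamp into a date and a time in one step, which avoids the manual substring work. A minimal sketch follows; the 'UTC' time zone is an assumption, so pick the zone that matches how the timestamp was written.

```abap
* Sketch: turn a YYYYMMDDhhmmss timestamp into a date and print it
* as MM-DD-YYYY. The 'UTC' time zone is an assumption.
DATA: lv_ts   TYPE timestamp VALUE '20050126031520',
      lv_date TYPE d,
      lv_time TYPE t,
      lv_out  TYPE c LENGTH 10.

CONVERT TIME STAMP lv_ts TIME ZONE 'UTC'
  INTO DATE lv_date TIME lv_time.

* lv_date now holds 20050126; rearrange it as MM-DD-YYYY.
CONCATENATE lv_date+4(2) lv_date+6(2) lv_date(4)
  INTO lv_out SEPARATED BY '-'.

WRITE: / lv_out. " 01-26-2005
```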
-
Displaying the data of each and every request in the data target
Hi All,
Can we display the data of a single request alone in data targets such as a DSO, cube, etc.?
If yes, please let me know the procedure.
Thanks in advance.
Regards
Geetha
You can do that:
For a DSO: SE16 -> enter the change log table name of the DSO (the change log data of the DSO must not have been deleted).
For a cube: LISTCUBE -> enter the request number (the request must not have been compressed).
If those conditions are satisfied, you can display that particular request. -
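Both lookups above can also be scripted. Here is a generic, hedged sketch that counts one request's rows in any change log table addressed by name; the REQUEST key field is an assumption based on the usual change log layout, so check the table in SE11 first.

```abap
* Hedged sketch: count the rows of a single request in a change log
* table, addressed dynamically by table name. The key field REQUEST is
* an assumption - verify the change log table's layout in SE11.
REPORT z_show_request_rows.

PARAMETERS: p_tab TYPE c LENGTH 30 OBLIGATORY, " change log table name
            p_req TYPE c LENGTH 30 OBLIGATORY. " request number

DATA lr_tab TYPE REF TO data.
FIELD-SYMBOLS <lt_tab> TYPE STANDARD TABLE.

CREATE DATA lr_tab TYPE STANDARD TABLE OF (p_tab).
ASSIGN lr_tab->* TO <lt_tab>.

SELECT * FROM (p_tab) INTO TABLE <lt_tab>
  WHERE request = p_req.

DATA(lv_lines) = lines( <lt_tab> ).
WRITE: / 'Rows for request', p_req, ':', lv_lines.
```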
Details about Data Target
Hi,
Infoproviders
http://help.sap.com/saphelp_nw04/helpdata/EN/02/bafc3f9ec4e669e10000000a155106/content.htm
Infoobjects
http://help.sap.com/saphelp_nw70/helpdata/EN/80/1a63f9e07211d2acb80000e829fbfe/frameset.htm
Infoproviders
The different types of InfoProviders are: InfoCube, DSO, VirtualProvider, InfoSet and MultiProvider. In this link you will find all the details:
http://help.sap.com/saphelp_nw70/helpdata/EN/80/1a63f9e07211d2acb80000e829fbfe/frameset.htm
Master Data
http://help.sap.com/saphelp_nw70/helpdata/EN/04/cce63736544d4de10000009b38f8cf/frameset.htm
Hope this helps you.
Regards,
Debjani. -
Filtering request in Manage Data Targets
Hi,
Is it possible to filter all the requests loaded into a data target (InfoCube) by some criteria (InfoSource, DataSource, source system...)?
I need this filtering to locate all the requests to delete (delete requests from one source system, delete all requests from a particular InfoSource, etc.).
Thanks and regards, Tomaz
Hi,
At the InfoPackage level, under the Data Targets tab, there is a column titled "Automatic loading of similar/identical requests from InfoCube". If you click on it, you are presented with a window that allows you to delete existing requests based on various criteria, such as whether the InfoSources are the same, the DataSources are the same, the source systems are the same, etc.
Check and see if this functionality meets your requirement.
Assign points if useful.
regards,
Pinaki -
Goldengate Extracts reads slow during Table Data Archiving and Index Rebuilding Operations.
We have configured OGG on a near-DR server. The extracts are configured to work in ALO Mode.
During the day, the extracts work as expected and stay in sync, but during any daily maintenance task they start lagging and read the same archives very slowly.
This usually happens during table data archiving (DELETE from production tables, INSERT into history tables) and during index rebuilds on those tables.
Points to be noted:
1) The Tables on which Archiving is done and whose Indexes are rebuilt are not captured by GoldenGate Extract.
2) The extracts are configured to capture DML operations. Only INSERT and UPDATE operations are captured; DELETEs are ignored by the extracts. DDL extraction is not configured either.
3) There is no connection to PROD or DR Database
4) System functions normally all the time, but just during table data archiving and index rebuild it starts lagging.
Q 1. As mentioned above, the extract lags even though the tables are not part of the capture. What are the possible reasons for the lag?
Q 2. I understand that an index rebuild is a DDL operation; how does it still induce a lag into the system?
Q 3. We have been trying to find a way to overcome the lag, which ideally shouldn't have arisen. Is there any extract parameter or workaround for this situation?
Hi Nick.W,
The amount of redo logs generated is huge. Approximately 200-250 GB in 45-60 minutes.
I agree that the extract has to parse the extra object-id's. During the day, there is a redo switch every 2-3 minutes. The source is a 3-Node RAC. So approximately, 80-90 archives generated in an hour.
The reason for mentioning this was that while reading those archives, the extract would also be parsing extra object IDs, as we are capturing data for only 3 tables. The effect of parsing extra object IDs should therefore have been visible during the day as well: the archive size is the same, the amount of data is the same, and the number of records to be scanned is the same.
The extract slows down and reads at half the speed. If it would normally take 45-50 seconds to read an archive log from normal daytime operation, it takes approximately 90-100 seconds to read the archives from the activities mentioned.
Regarding the 3rd point,
a. The extract is a classic extract, the archived logs are on local file system. No ASM, NO SAN/NAS.
b. We have added "TRANLOGOPTIONS BUFSIZE" parameter in our extract. We'll update as soon as we see any kind of improvements. -
Unable to delete the Requests in the data target---Process chain
Hello Gurus!
Can anyone please help me with this issue?
<b>Let me explain the whole scenario:</b>
While monitoring the process chains, one of them has a sequence of variants like:
load InfoPackage1 -> ODS1 activation -> load InfoPackage2 -> load InfoPackage3...
Due to some problems they are scheduling it manually. All the InfoPackages load data into one ODS (ODS1). Suddenly InfoPackage3 failed.
So when I looked into the data target (ODS1 -> Manage -> Requests):
<b>I found that the InfoPackage1 request status was RED,
the InfoPackage2 request status was YELLOW,
and the InfoPackage3 request status was RED.</b>
All three are full updates.
One of our colleagues then deleted all three requests by setting the InfoPackage2 request to red.
When we loaded InfoPackage1 again, it failed. We deleted it once more (removing the request from the ODS). When we then tried to load InfoPackage2 as a priority, the red InfoPackage1 requests reappeared in the ODS. How do we solve this issue and load these three InfoPackages?
I hope the scenario is clear. Please suggest how to proceed.
Thanks in advance!
Dilip
Hi,
Your loads are running in the background; you have to kill those processes in SM50 or SM51. Check which jobs are running under your user name and under ALEREMOTE (if jobs are running under ALEREMOTE, you can verify in SM37 whether it is your load or somebody else's). Kill your jobs in SM50 or SM51, set the requests of your three loads to red, delete the requests from the ODS, and repeat the loads; they will then go through without any issues. As I understood your question, when your colleague deleted the requests without killing the process (he simply set them to red and deleted them), that was wrong: he should have killed the process first and then set them to red.
Regards
Sankar -
Alv data upload and delete in database table
Hi,
I can save data to the database, but update and delete are not working. I have written this code:
FORM save_data.
  CALL METHOD cont_editalvgd->check_changed_data.
  IF lt_display EQ it_city.
    MESSAGE s002(00) WITH 'No data changed'.
  ELSE.
    CLEAR gd_answer.
    CALL FUNCTION 'POPUP_TO_CONFIRM'
      EXPORTING
        text_question  = 'Save data?'
      IMPORTING
        answer         = gd_answer
      EXCEPTIONS
        text_not_found = 1
        OTHERS         = 2.
    IF sy-subrc NE 0.
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    IF gd_answer = '1'. " user answered yes
      lt_display = it_city.
      DELETE ADJACENT DUPLICATES FROM it_city.
*     MODIFY inserts new rows and updates existing ones in one go,
*     so a separate UPDATE is not needed.
      MODIFY zmg_city1 FROM TABLE it_city.
*     To remove rows deleted in the ALV grid, collect them in a
*     separate table and pass it to: DELETE zmg_city1 FROM TABLE ...
      COMMIT WORK.
      CLEAR it_city.
    ELSE.
      MESSAGE s001(00) WITH 'Action cancelled by user'.
    ENDIF.
  ENDIF.
ENDFORM. "save_data
Update and delete are not working. Please advise.