ABAP report /VIRSA/ZVFATBAK runs very long on backend
Hello experts,
For the ABAP report /VIRSA/ZVFATBAK, which runs in the backend system, how long does it normally take to finish? The report has now been running for more than 2,000 seconds in our test system and is still not done, while the same report scheduled in our development system finishes in 1 or 2 seconds.
Any idea why it is taking that long in the test system?
Appreciate the replies, thank you in advance.
Thank you for the note, Sabita! It's really helpful.
One question regarding the SAP Note: it mentions a STAT collector job. Is this a standard job?
If yes, below are the standard collector jobs scheduled in the system; which one would it be?
SAP_COLLECTOR_FOR_JOBSTATISTIC
SAP_COLLECTOR_FOR_NONE_R3_STAT
SAP_COLLECTOR_FOR_PERFMONITOR
Similar Messages
-
Discoverer report is taking a very long time
Hi All,
I need help on the below Discoverer issue.
The Discoverer report is taking a very long time to retrieve rows on export when it is run for India, and it is required for month end. For some reason only 250 rows are retrieved at a time, and retrieval is so slow that it takes 10 minutes to bring back 10,000 rows.
Regards
Kumar

Please post the details of the application release, database version and OS along with the Discoverer version.
I need help on the below Discoverer issue: the report is taking a very long time to retrieve rows on export when run for India, and it is required for month end. For some reason only 250 rows are retrieved at a time, and retrieval is so slow that it takes 10 minutes to bring back 10,000 rows.

Please see these links.
https://forums.oracle.com/forums/search.jspa?threadID=&q=Discoverer+AND+Long+AND+Time&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
https://forums.oracle.com/forums/search.jspa?threadID=&q=Discoverer+AND+Performance&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
https://forums.oracle.com/forums/search.jspa?threadID=&q=Discoverer+AND+Slow&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
Thanks,
Hussein -
Query is running very long when I bring in an attribute
I am running into an issue when I bring in an attribute from the filter: it takes a very long time to display. When I run the same query against the DSO I copied from, it runs fast, but not against the copy. What could be the issue?
So, you copied an existing report to a new report. Both the reports are on a DSO. Old report runs fine, but the new report runs very long. Is all this correct?
How is the new report different? You said something about an attribute. Are you just displaying this attribute or are you filtering the results using this new attribute?
Without understanding your question clearly, here is my speculation: Your selection filters on the new query are not in any DSO index.
If I misunderstood your question and scenario, then please clarify. -
Crystal report with prompts takes a very long time to open in InfoView BOXI 3.1
A Crystal report with prompts takes a very long time to open in InfoView BOXI 3.1. Is there any way to improve the performance?

Ramsudhakar,
There are several things that could cause these slowdowns. Without knowing more about the way your environment is set up, I could cause more problems by giving out performance tips. You would need to be more specific in your post.
What we know: BOXI 3.1
What we don't know:
O/S
App. server
Hardware specs
etc.
I see that this post has been out here for some time. So if this is still a problem for you I'll try and help, if you provide more information.
Thanks
Bill -
How to schedule an ABAP report program to run every day, week etc.
Hi,
I want to schedule an ABAP report program to run every day, week, fortnight or month, with the output redirected to the printer. How do I achieve this?
thanks

Hi,
Go to transaction SM36 and enter the name of the program you want to execute in the background. If you want, you can make the job periodic, depending on your requirement. In SM37 you can then check the status and print directly from there.
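If it ever needs to be done in code rather than through SM36, here is a rough sketch using the standard JOB_OPEN / JOB_CLOSE function modules; the report name ZMY_REPORT, variant MY_VARIANT, job name and start time are all placeholders, not anything from the original post:

```abap
DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'ZDAILY_REPORT',
      lv_jobcount TYPE tbtcjob-jobcount.

* Open a new background job
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount
  EXCEPTIONS
    OTHERS   = 1.

* Add the report as a job step; print parameters could additionally be
* set up via GET_PRINT_PARAMETERS and SUBMIT ... TO SAP-SPOOL
SUBMIT zmy_report USING SELECTION-SET 'MY_VARIANT'
       VIA JOB lv_jobname NUMBER lv_jobcount
       AND RETURN.

* Release the job: start today at 06:00 and repeat daily (prddays = 1);
* prdweeks / prdmonths would give weekly or monthly periods instead
CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobname   = lv_jobname
    jobcount  = lv_jobcount
    sdlstrtdt = sy-datum
    sdlstrttm = '060000'
    prddays   = 1
  EXCEPTIONS
    OTHERS    = 1.
```

This is essentially what SM36 does behind the scenes, so for a one-off schedule the transaction itself is the simpler route.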
Hope this is clear.
Please reward if it is useful... -
Inventory aged reports are taking a very long time to run
We are using the standard delivered extractors for Inventory. We have built an aged report, and it is taking a very long time to run as more and more data is added. We put the inventory in buckets 0-30, 31-60, ..., >365 days. We age based on a batch date the user enters. The problem is that the report has to go through every record to recalculate, because the key figures are non-cumulative.
Any ideas/suggestions on how to make this more efficient? A new design?

Hi MM,
We can use a snapshot of the monthly data from the query and store it in a DSO at month level.
We used an APD on the query and then stored the data in DSO1 (write-optimized) -> DSO2 (standard) -> Cube -> report based on the snapshot.
From the new query, calculate the age.
Rgds
SVU -
OCDM - Sample Report installation issue - runs too long with no progress
Hello,
As part of the OCDM environment setup for practice, we successfully installed the following:
* Supported Enterprise Linux
* Oracle DB 11g R2 (OLAP & MINING option)
* OWB was already installed as a part of Oracle DB 11gR2 - just unlocked the OWB accounts.
* Installed OBIEE 11g R2 - installation successful and able to open the URLs (Analytics, BI Publisher etc.) - issue with the login account.
* Installation of Communication Data Model (First Installation type from OCDM Installer)
When we try to install the Sample Reports, at the same stage the screen remains open and runs too long with no progress.
The Installation process for OCDM Sample Reports is given in the attached doc with screenshot and the log file is also included in the attached document.
Please let us know about the following two issues:
1) How do we log in to Analytics after OBIEE 11gR2 is installed on Linux? (In our case we used the user Administrator and tried a password the same as the username; we even checked with the user weblogic and others, but no success. Please point us to documentation covering the administration of OBIEE 11gR2 on Linux.)
2) The issue with the installation of the OCDM Sample Reports, even though the OCDM Data Model itself installed successfully (first installation type from the OCDM installer).
Thanks & Best Regards,
Amol Thite
I think I can't attach the file here, and the character limit is 30K per post, so below are a few lines of the log file content:
I can email the screen shots docs and complete log file on request if needed.
It is stuck in the Configuration Assistants screen -
After more than 1 hour it is the same screen - no progress. I tried 3 times but hit the same issue. The log file is also not growing; it seems to be waiting for some dependent step that needs to happen before it can proceed.
=====start few lines of Log File=======================================
The file oraparam.ini could not be found at /u01/app/oracle/product/11.1.0/db_1/oui/bin/oraparam.ini
Using paramFile: /u01/app/oracle/product/11.1.0/db_1/oui/oraparam.ini
Checking swap space: must be greater than 500 MB. Actual 1027 MB Passed
Checking monitor: must be configured to display at least 256 colors. Actual 65536 Passed
The number of files bootstrapped for the jre is 689.
The number of files bootstrapped for the oui is 77.
Using the umask value '022' available from oraparam.ini
=====end few lines of Log File==================================
INFO: Copying Aggr XML for: Oracle Communications Data Model
INFO: The Top level Aggreage File = /u01/app/oracle/product/11.1.0/db_1/inventory/ContentsXML/ConfigXML/oracle.ocdm.11_2_3_0_0.xml
INFO: deleted all the required instance files
INFO: OUI_CAPlugIn is not found in XML
INFO: cf session will be created for OH: /u01/app/oracle/product/11.1.0/db_1/ TLAggr: oracle.ocdm instancePath: inventory/ContentsXML/ConfigXML/
INFO: cf session for OH: /u01/app/oracle/product/11.1.0/db_1/ TL Aggr: [oracle.ocdm] instancePath: inventory/ContentsXML/ConfigXML/
INFO: aggr ref length : 2
INFO: cf session hashcode: 22855989
INFO: cf session saved with key: OraDb11g_home1 oracle.ocdm
INFO: cf session is ok
INFO: created and saved cf session for oh: OraDb11g_home1
INFO: passing params to cf
INFO: Handling the storing of variables for aggr name oracle.ocdm
INFO: This variable sl_ASMSelectableDiskGroups is not added to the global context map
INFO: This variable s_scratchPath is not added to the global context map
INFO: exitonly tools to be excuted passed: 0
INFO: Starting to execute configuration assistants
INFO: Command = oracle.ocdm.OCDMCfgPlugIn /u01/app/oracle/product/11.1.0/db_1/ocdm/ocdm_install.sh ${s_dbSysPasswd}
OCDMCfgPlugIn: Starting OCDM configuration...
OCDMCfgPlugIn: OCDM configuration initialized, waiting for script response...
OCDMCfgPlugIn: Receiving SYSTEM password...
OCDMCfgPlugIn: Passing SYSTEM credentials to configuration script...
OCDMCfgPlugIn: SYSTEM credentials passed to configuration script.
OCDMCfgPlugIn: SYSTEM password received.
OCDMCfgPlugIn: Initializing config parameteres...
OCDMCfgPlugIn: Creating log folder...
OCDMCfgPlugIn: ocdm log folder exist
OCDMCfgPlugIn: Exporting config env...
OCDMCfgPlugIn: Performing OWB check...
OCDMCfgPlugIn: Installing OWB...
OCDMCfgPlugIn: WARNING: OWB OWNER already exists.
OCDMCfgPlugIn: Archive: /u01/app/oracle/product/11.1.0/db_1/ocdm/pdm/relational/sample_schema/ocdm_sample.dmp.zip
OCDMCfgPlugIn: inflating: /u01/app/oracle/product/11.1.0/db_1/ocdm/install_tmp/ocdm_sample.dmp
OCDMCfgPlugIn: 2011_02_28_11_24_30 10 % complete
OCDMCfgPlugIn: Importing OCDM sample schema
==================end of log File ========================================

Hi, please check my answers:
1) How do we log in to Analytics after OBIEE 11gR2 is installed on Linux? (In our case we used the user Administrator and tried a password the same as the username; we even checked with the user weblogic and others, but no success. Please point us to documentation covering the administration of OBIEE 11gR2 on Linux.)
The current OCDM version does not support BIEE 11g, so you need to install BIEE 10g. In next release(will be available soon), OCDM will support BIEE 11g.
2) The issue with the installation of the OCDM Sample Reports, even though the OCDM Data Model itself installed successfully (first installation type from the OCDM installer).
The import will take a long time. You can log in to the database as ocdm_sample_sys/ocdm_sample_sys and use "select count(*) from tabs;" to monitor the import progress. -
ABAP report needs to run only once in a day
Can I restrict my ABAP report so that it runs only once in a day? Is there any authorization mechanism, or any SAP-provided concept? How can I approach this?
hi,
There are no authorizations for that. If you want, try this code; I think it will help you:

* SPA/GPA parameters live only in the user's session memory, so this
* check works per user and logon; for a watertight check across all
* users, store the last run date in a custom table instead.
DATA: last_run TYPE sy-datum.

GET PARAMETER ID 'DAT' FIELD last_run.

IF last_run = sy-datum.
  MESSAGE e000(zmsgtab). "You cannot execute it more than once in a day
ENDIF.

* Remember today's date so the next run on the same day is blocked
SET PARAMETER ID 'DAT' FIELD sy-datum.

WRITE: / 'this is my program'.
Feel free to ask any queries. -
Launching ASCP plan takes a very long time
Hi,
I launched a constrained ASCP plan and it is taking a very long time. I launched the plan on Friday and it still has not finished as of Monday; Memory Based Snapshot & Snapshot Delete Worker are still running, and Loader Worker With Direct Load Option is still in the pending phase. MSC: Share Plan Partitions has been set to Yes.
When I run query below :
select table_name,
partition_name
from all_tab_partitions
where table_name like 'MSC_NET_RES_INST%'
OR table_name like 'MSC_SYSTEM_ITEMS%'
order by substr(partition_name,instr(partition_name,'_',-1,1)+1);
The results are:
MSC_SYSTEM_ITEMS        SYSTEM_ITEMS_0
MSC_NET_RES_INST_AVAIL  NET_RES_INST_AVAIL_0
MSC_SYSTEM_ITEMS        SYSTEM_ITEMS_1
MSC_SYSTEM_ITEMS        SYSTEM_ITEMS__21
MSC_NET_RES_INST_AVAIL  NET_RES_INST_AVAIL__21
MSC_NET_RES_INST_AVAIL  NET_RES_INST_AVAIL_999999
MSC_SYSTEM_ITEMS        SYSTEM_ITEMS_999999
Please help me with how to increase performance when launching the plan. Is changing MSC: Share Plan Partitions to No the only way to improve plan-run performance?
Thanks & Regards,
Yolanda

Hi Yolanda,
a) So does it mean that the plan was working fine earlier, but you are facing this issue only recently? If so, what have you changed on the server side, or have you applied any recent patches?
b) If the plan has never completed even once,
I suggest running data collection in complete-refresh mode for one organization that has relatively little data. Further, you can modify the plan options to reduce the planning calculation load, for example:
- disable pegging
- remove any demand schedule / supply schedule / global forecast etc
- enable only single organization which having relatively small demand and supply picture
- disable forecast spread
Once one plan run has completed, expand your collection scope to the other organizations and also re-enable the settings mentioned above.
There are lots of points to consider for a performance issue, such as server configuration, hardware configuration, number of demands, etc. So you can raise an SR in parallel while working on the above points.
Thanks,
D -
Reporting solution for a very long report
Hi All,
We have a requirement to print a very long report (about 150,000 pages folio size, something like an account statement). The database server is in a remote location, connected to the report's client via a 64 kbps VPN.
What is the best solution for such requirement ?
Some thoughts are :
- using Crystal Report / Oracle Report , display the report on browser and print it
- Query the data from client side, save the data locally and create report from it
or is there any best solution for this ?
Thank you for any help,
xtanto

Create an output file on the location where the database is located, zip it, and transfer it to the client. 150,000 pages is an awful lot, but usually mostly spaces.
-
Calc scripts running very Long time
Hi All,
Recently I migrated the objects from Production to the Test region. We have 5 applications, and each application has a set of calc scripts.
In the Test region they are running a really long time, whereas in Production they run in less time.
In the Test region each calc script is taking 10 times longer than in Production.
No dimension was added and no script was updated; there is no difference in objects between TEST and PROD.
Please suggest me, why is this difference.
Thanks
Mahesh

The obvious first question would be whether the hardware is different. You would expect prod to be a more powerful server and therefore to perform better. I'm seeing a lot of virtualized test servers (who knows, really, what power the box has) and real prod servers. That can make a huge difference in performance.
It makes benchmarking tough -- yes, you can see how long something will take relative to another process, but there isn't any way to know how it will perform in production until you sneak it over there and benchmark it. It can be a real PITA for Planning.
And yes, the theory is that dev and prod are similar so that the above isn't an issue, but that seems to be a more theoretical than actual kind of thing.
Regards,
Cameron Lackpour -
ABAP report WRITE: how to output long lines without additional formatting?
I am developing a program which dumps objects as XML, and I need to create one big XML file at the end. I would like to start it as a background job and get this XML as spooled output. The problem is that the XML comes from CALL TRANSFORMATION as one big string without any CR LF, and the report's spooled output contains additional "formatting" like headers, pages etc.:
#COL0N#COL0H07.12.09 Programm EHFND_GENERATE_BO_DOCU 1
#<bopf_bo><item><KEY>gB4L/AeoHd6tl1YoR5AdIg==</KEY><PARENT_KEY/><ROOT_KEY>gB4L/A
#<bopf_bo><item><KEY>gB4L/AeoHd6tl2lLeQUKOQ==</KEY><PARENT_KEY/><ROOT_KEY>gB4L/A
#<bopf_bo><item><KEY>gB4L/AeoHd6tmEQlByTUEg==</KEY><PARENT_KEY/><ROOT_KEY>gB4L/A
#<bopf_bo><item><KEY>gB4L/AeoHd6xjRciawUXeQ==</KEY><PARENT_KEY/><ROOT_KEY>gB4L/A
#<bopf_bo><item><KEY>gB4L/AeoHe6t31MGbgZYFg==</KEY><PARENT_KEY/><ROOT_KEY>gB4L/A
P
|#COL0N#COL0H07.12.09 Programm EHFND_GENERATE_BO_DOCU 2
|#######################################################################################
|#<bopf_bo><item><KEY>gCFaXDIYHe60jgYnuB9agQ==</KEY><PARENT_KEY/><ROOT_KEY>gCFaXD
Is it possible to make the WRITE statement automatically insert CR LF so that I do not lose the XML data and also do not output any additional symbols? My goal is to get clean XML data at the end, which can e.g. be loaded directly into a browser.
I already have a solution which fills a DB table with all the XMLs, and finally I create the file with cl_gui_frontend_services=>gui_download, but I would like a simpler solution using only the ABAP report output.
Edited by: Rob Burbank on Dec 8, 2009 10:56 AM

A better option would be writing this XML string to a file on the application server using OPEN DATASET ... TRANSFER ... CLOSE DATASET. You can provide an option in the same program to download this application server file to the presentation server.
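A minimal sketch of that OPEN DATASET approach; the file path and the XML variable below are placeholders, not names from the original program:

```abap
DATA: lv_file TYPE string VALUE '/tmp/bo_docu.xml', " placeholder path
      lv_xml  TYPE string.                          " result of CALL TRANSFORMATION

* Write the XML string to the application server file unchanged,
* with no page headers or other list formatting
OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING UTF-8.
IF sy-subrc = 0.
  TRANSFER lv_xml TO lv_file.
  CLOSE DATASET lv_file.
ENDIF.
```

The file can then be inspected with AL11 or pulled down to the front end when needed, so the spool never touches the XML.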
Regards,
Naimesh Patel -
SELECT SQL running very long when made to select many columns from a table
Hi,
I am using Oracle DB 10g. I have a table with 250 columns, and I need to select 200 of them. The table holds around 0.1 million rows, but when I run the select SQL it takes 15 minutes to return the 0.1 million rows, whereas if I select only 10 or 15 columns the SQL runs in less than a minute, returning the same number of rows. My SQL looks like below:
select p.col1,p.col2,.......,p.col200
from table Parent p;
The table also has a primary key index, but it does not seem to be used even when I forced an index hint. Could you please help?

961796 wrote:
I am using an Oracle DB 10g. I have a table with 250 columns and I need to select 200 columns out of them. The table holds around 0.1 million rows. But when I run the select sql it takes 15 mins to return .1 million rows. Where as if I select only 10 or 15 columns the sql runs in less than a minute returning same number of rows. My sql looks like below:
As Sven points out, it is likely that most of your time is network (and client) time. You are sending 20 times as much data (based on column counts, at any rate) across the network, and concerned that it's taking 15 times as long.
If you're testing from SQL*Plus then setting the arraysize to a value larger than the default might help.
If you have control over the SQL*Net settings then you may get some benefit by adjusting the SDU sizes at both ends of the link.
If you have control over the tcp configuration (transmit and receive buffer sizes) then you may get some benefit by adjusting these.
Simple test, by the way, if you're on SQL*Plus
set autotrace traceonly statistics
select ...

This will dump your data in the bit bucket as it arrives, giving you
a) the database time plus network time
b) some statistics including the volume of data down the network and the number of network round-trips that Oracle saw.
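Putting Jonathan's suggestions together, a hedged SQL*Plus sketch (the arraysize value of 500 is just a starting point to experiment with, and the column list stands in for the real 200-column select):

```sql
-- Fetch more rows per network round-trip (SQL*Plus default is 15)
set arraysize 500

-- Suppress row output but keep statistics, so the elapsed time reflects
-- database time plus network time rather than client rendering
set autotrace traceonly statistics

-- Re-run the wide query and compare elapsed time and the
-- "SQL*Net roundtrips to/from client" statistic against the narrow query
select p.col1, p.col2, p.col200
from parent p;
```

If the round-trip count drops sharply with the larger arraysize but elapsed time barely moves, the bottleneck is data volume on the wire rather than fetch overhead.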
Regards
Jonathan Lewis
http://jonathanlewis.wordpress.com
Author: <b><em>Oracle Core</em></b> -
UTLRP.SQL runs very long - how to check?
Hello all
I need your help and advice on a problem I am facing now.
I recompiled all the objects in Oracle using
@UTLRP.SQL in $ORACLE_HOME\rdbms\admin
The SQL has been running for about 4 hours and is still going, but I actually don't know whether it is hung or still running.
My questions are:
- Is this normal, given that it has already run for 4 hours?
- How do I check whether it is still running or hung? I did check in Task Manager, and oracle.exe keeps moving.
- Can I cancel it? Any problem later on if I cancel it?
- Is there any log to check?
Sorry so many question, need all of your advice
Thanks in advance
Cheers
Surya

You can check the decreasing number of invalid objects using the query below:
select count(*) from dba_objects where status = 'INVALID';
You can also check session waits for the session that is running utlrp.sql.
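For the session-wait check, a hedged sketch; you first need to find the SID of the session running utlrp.sql (e.g. from v$session), supplied here as the substitution variable &utlrp_sid:

```sql
-- Current wait event of the utlrp.sql session; if the event and
-- seconds_in_wait keep changing across repeated runs, the session is
-- still working rather than hung
select sid, event, state, seconds_in_wait
  from v$session_wait
 where sid = &utlrp_sid;
```

Re-running both this and the invalid-objects count every few minutes gives a clear picture of whether the recompilation is making progress.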
With kind regards
Krystian Zieja
http://www.projectenvision.com -
Run /VIRSA/ZVFATBAK to generate missed logs
Hello experts, we had an upgrade a couple of weeks ago and we had to put the /VIRSA/ZVFATBAK job on hold. Because of this we have a couple of FF logons missing data, with the "BACKGROUND JOB WAS NOT SCHEDULED/LOG & FILE NOT YET GENERATED" error. Usually I can run the job immediately after the upgrade and gather the missed data; however, this time we forgot, and now we are trying to go back 3 weeks.
I thought that since I can still see the data in STAD (ST03), I should be able to get /VIRSA/ZVFATBAK to run and find the data. This is not the case this time. Can anyone confirm it is too late to generate these missed logs?
Thanks
Dave Wood

Hi David,
If you see that logs are missing for a certain date and time, you can run the report /VIRSA/ZVFATBAK with the date and time for which logs were missing; see SAP Note 1142312 (read it to understand what start & read times should be given when running the report).
Please be informed that the report will fetch data from STAT and from CDHDR/CDPOS only if the data exists there for the given date and time. If the data has been purged from STAT for that period, then it is not possible to get the logs.
Also note this point: as per the design, a manual FF log update considers only the firefighter IDs assigned to the user who triggers the update through the Firefighter dashboard, whereas the scheduled background job considers all FFIDs for a particular time period.
Best Regards,
Sirish Gullapalli.