Need to improve performance for BEx queries
Dear Experts,
We have BEx queries built on a BW InfoSet; the InfoSet itself is built on 2 DSOs and 4 InfoObjects.
We have built secondary indexes on the two DSOs assuming this would improve performance, but query execution time is still very long.
Could you please advise?
Thanks in advance,
Mannu
Hi,
Thanks for the response.
But as I have mentioned, the InfoSet is based on DSOs and InfoObjects, so aggregates are not an option.
In RSRT I have already checked the query's read mode; it is set to 'X', which is also valid since the query needs to fetch a huge amount of data.
Could you please look into other possible areas to improve this?
Thanks in advance,
Mannu
Similar Messages
-
Create a VC Model for BEx queries with Cell Editor
Hello All,
I am trying to create a VC Model for BEx queries with Cell Editor.
BW Development Team has created a BEX query with complex restrictions and calculations using Cell Editor feature of BI-BEX.
The query output in BEx analyzer is correct where all values are being calculated at each Cell level and being displayed.
But, while creating the VC model, the system is not displaying the Cells. Thus, no VC model can be created.
I have executed below steps:
1. Created a VC Model for BEx Query - ZQRY_XYZ
2. Create Iview -> Create a dataService -> Provide a Table from the Output
In the Column field system is not showing any of the Cells (present in Cell Editor)
Please help me to solve this issue.
Thanks,
Siva

Hi,
If the Cell Editor has been used, then the query must contain a structure. You have to select that 'structure' object in your VC table.
If you select it, you will get the required result in the output. This will be the structure in which the cell references are defined in the BI query; select that structure's name.
Regards
Sandeep -
To improve performance for report
Hi Expert,
I have generated an open sales orders report that fetches data from VBAK. It takes a long time executing in the foreground and goes into a dump; I have also executed it in the background, but it dumps there as well.
SELECT vbeln
       auart
       submi
       vkorg
       vtweg
       spart
       knumv
       vdatu
       vprgr
       ihrez
       bname
       kunnr
  FROM vbak
  APPENDING TABLE itab_vbak_vbap
  FOR ALL ENTRIES IN l_itab_temp
*BEGIN OF change 17/Oct/2008.
  WHERE erdat IN s_erdat
    AND submi = l_itab_temp-submi
*End of Changes 17/Oct/2008.
    AND auart = l_itab_temp-auart
    AND vkorg = l_itab_temp-vkorg
    AND vtweg = l_itab_temp-vtweg
    AND spart = l_itab_temp-spart
    AND vdatu = l_itab_temp-vdatu
    AND vprgr = l_itab_temp-vprgr
    AND ihrez = l_itab_temp-ihrez
    AND bname = l_itab_temp-bname
    AND kunnr = l_itab_temp-sap_kunnr.
  DELETE itab_temp FROM l_v_from_rec TO l_v_to_rec.
ENDDO.
Please give me suggestions for improving the performance of this program.

Hi,
You can try like this:
DATA:BEGIN OF itab1 OCCURS 0,
vbeln LIKE vbak-vbeln,
END OF itab1.
DATA: BEGIN OF itab2 OCCURS 0,
vbeln LIKE vbap-vbeln,
posnr LIKE vbap-posnr,
matnr LIKE vbap-matnr,
END OF itab2.
DATA: BEGIN OF itab3 OCCURS 0,
vbeln TYPE vbeln_va,
posnr TYPE posnr_va,
matnr TYPE matnr,
END OF itab3.
SELECT-OPTIONS: s_vbeln FOR vbak-vbeln.
START-OF-SELECTION.
SELECT vbeln FROM vbak INTO TABLE itab1
WHERE vbeln IN s_vbeln.
IF itab1[] IS NOT INITIAL.
SELECT vbeln posnr matnr FROM vbap INTO TABLE itab2
FOR ALL ENTRIES IN itab1
WHERE vbeln = itab1-vbeln.
ENDIF. -
Need to improve Performance of select...endselect query
Hi experts,
I have a query in my program like the one below, with an inner join of 3 tables.
The program uses nested SELECT ... ENDSELECT statements (a SELECT ... ENDSELECT inside another SELECT ... ENDSELECT).
While executing in production it takes a lot of time to fetch records. Can anyone suggest how to improve the performance of the query below?
Your help is greatly appreciated.
SELECT MVKE~DWERK MVKE~MATNR MVKE~VKORG MVKE~VTWEG MARA~MATNR
       MARA~MTART ZM012~MTART ZM012~ZLIND ZM012~ZPRICEREF
  INTO (MVKE-DWERK, MVKE-MATNR, MVKE-VKORG, MVKE-VTWEG, MARA-MATNR,
        MARA-MTART, ZM012-MTART, ZM012-ZLIND, ZM012-ZPRICEREF)
  FROM ( MVKE
         INNER JOIN MARA
           ON MARA~MATNR = MVKE~MATNR
         INNER JOIN ZM012
           ON ZM012~MTART = MARA~MTART )
WHERE MVKE~DWERK IN SP$00004
AND MVKE~MATNR IN SP$00001
AND MVKE~VKORG IN SP$00002
AND MVKE~VTWEG IN SP$00003
AND MARA~MTART IN SP$00005
AND ZM012~ZLIND IN SP$00006
AND ZM012~ZPRICEREF IN SP$00007.
%DBACC = %DBACC - 1.
IF %DBACC = 0.
STOP.
ENDIF.
CHECK SP$00005.
CHECK SP$00004.
CHECK SP$00001.
CHECK SP$00002.
CHECK SP$00003.
CHECK SP$00006.
CHECK SP$00007.
clear Check_PR00.
select * from A004
where kappl = 'V'
and kschl = 'PR00'
and vkorg = mvke-vkorg
and vtweg = mvke-vtweg
and matnr = mvke-matnr
and DATAB le sy-datum
and DATBI ge sy-datum.
if sy-subrc = 0.
select * from konp
where knumh = a004-knumh.
if sy-subrc = 0.
Check_PR00 = konp-kbetr.
endif.
endselect.
endif.
endselect.
CHECK SP$00008.
clear Check_ZPR0.
select * from A004
where kappl = 'V'
and kschl = 'ZPR0'
and vkorg = mvke-vkorg
and vtweg = mvke-vtweg
and matnr = mvke-matnr
and DATAB le sy-datum
and DATBI ge sy-datum.
if sy-subrc = 0.
select * from konp
where knumh = a004-knumh.
if sy-subrc = 0.
Check_ZPR0 = konp-kbetr.
endif.
endselect.
endif.
endselect.
CHECK SP$00009.
clear ZFMP.
select * from A004
where kappl = 'V'
and kschl = 'ZFMP'
and vkorg = mvke-vkorg
and vtweg = mvke-vtweg
and matnr = mvke-matnr
and DATAB le sy-datum
and DATBI ge sy-datum.
if sy-subrc = 0.
select * from konp
where knumh = a004-knumh.
if sy-subrc = 0.
ZFMP = konp-kbetr.
endif.
endselect.
endif.
endselect.
CHECK SP$00010.
clear mastercost.
clear ZDCF.
select * from A004
where kappl = 'V'
and kschl = 'ZDCF'
and vkorg = mvke-vkorg
and vtweg = mvke-vtweg
and matnr = mvke-matnr
and DATAB le sy-datum
and DATBI ge sy-datum.
if sy-subrc = 0.
select * from konp
where knumh = a004-knumh.
if sy-subrc = 0.
ZDCF = konp-kbetr.
endif.
endselect.
endif.
endselect.
CHECK SP$00011.
clear masterprice.
clear Standardcost.
select * from mbew
where matnr = mvke-matnr
and bwkey = mvke-dwerk.
Standardcost = mbew-stprs.
mastercost = MBEW-BWPRH.
masterprice = mBEW-BWPH1.
endselect.
ADD 1 TO %COUNT-MVKE.
%LINR-MVKE = '01'.
EXTRACT %FG01.
%EXT-MVKE01 = 'X'.
EXTRACT %FGWRMVKE01.
ENDSELECT.
best rgds..
hari..

Hi there.
Some advice:
- Why go to MVKE first and then MARA? You will find n rows in MVKE for 1 MATNR, and then read the same MARA record n times. Do the opposite, i.e. go to MARA first (once per MATNR) and then to MVKE.
- Avoid SELECT *; you will save time.
- Trace and measure performance with transactions ST05 (SQL trace) and SE30 (runtime analysis).
- replace:
select * from konp
where knumh = a004-knumh.
if sy-subrc = 0.
Check_ZPR0 = konp-kbetr.
endif.
endselect.
by
select * from konp
where knumh = a004-knumh.
Check_ZPR0 = konp-kbetr.
exit.
endselect.
Here, if I understood correctly, you only need to assign the KBETR value to Check_ZPR0 when a row is found. You don't need the IF, because inside the SELECT loop SY-SUBRC is always 0, and you don't need to read several rows for the same A004-KNUMH, hence the EXIT.
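The same first-row read can also be expressed without a SELECT ... ENDSELECT loop at all. A minimal sketch, assuming only KBETR is needed (note that KONP's full key is KNUMH plus KOPOS, so without the full key this picks an arbitrary matching row, exactly like the EXIT pattern above):

```abap
DATA lv_kbetr TYPE konp-kbetr.

* Read just one matching condition amount; no loop needed.
SELECT SINGLE kbetr
  FROM konp
  INTO lv_kbetr
  WHERE knumh = a004-knumh.
IF sy-subrc = 0.
  check_zpr0 = lv_kbetr.
ENDIF.
```

SELECT SINGLE also saves the open/fetch/close overhead of the loop when only one value is needed.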
Hope this helps.
Regards.
Valter Oliveira.
Edited by: Valter Oliveira on Jun 5, 2008 3:16 PM -
Need Different Selection screen for different Queries in a Workbook
Hi,
I have created a workbook with Multiple tabs in BI 7.0. Each Tab has different Queries and each query has different Selection screens (Variable Selections).
When I open the workbook and refresh it, the selection screen appears for only one query. All the queries are refreshed from this single selection screen, even though each query has different variable selections. What I need is a separate selection screen, i.e. a separate variable selection dialog for each query when I refresh it.
Is it possible to do this? If anybody has tried this, please help me solve this issue. Thanks for your time.
Regards,
Murali

Murali,
If you un-check 'Display Duplicate Variables Only Once', this WILL solve your problem.
When you Refresh, you should be presented with a single variable selection dialog box, but it should contain an area for each Query (DataProvider) that is embedded in the Workbook.
This is the case if the queries are all on the same tab, or on different tabs.
However, if you have multiple tabs each with a query on it, each query must have its own DataProvider. If all queries are based on the same DataProvider, it will not work, as the workbook only 'sees' one query for which it needs variable input.
If you REALLY want multiple variable selection dialog boxes, then maybe the best way to do this is to have the queries in separate Workbooks.
If you don't want the User to have to open 5 queries manually, you could use a Macro in each Workbook that runs on opening, to open the next Workbook in the sequence.
I hope this makes sense!
Regards
Steve -
TimesTen to improve performance for search results in Oracle eBS
Hi ,
We have various search scenarios in our ERP implementation using Oracle Apps eBS, for example searching for an item. Oracle Apps does provide item search, but performance is not great. We have about 30 million items, and so to improve search performance we thought TimesTen may help.
Can anyone please clarify whether TimesTen can be used to improve performance on the eBS database and, if yes, how.

Vikash,
We were thinking along the same lines (using TimesTen for massive item search in e-Business Suite). In our case massive Item / parametric search leveraging the Product Information Management application. We were thinking about setting up a POC on a Linux Server with a Vision Instance. We should compare notes?
SParker -
BIA to improve performance for BPS Applications
Hi All,
Is it possible to improve the performance of BPS applications using BIA? Currently we are running applications on BI-BPS which, because of a huge range of periods, have a performance issue.
Could you please share whether BIA would help the read and write operations of BPS, and to what extent performance can be increased?
An early reply would be appreciated, as the system is in really bad shape and users are grappling with poor performance.
Rgds,
Rajeev

Hi Rajeev,
If the performance issue you are facing is while running the query on the real-time (transactional) InfoCube used in BPS, then BIA can help. The closed requests from the real-time cube can be indexed in BIA. At query runtime, the analytic engine reads data from the database for the open request and from BIA for closed, indexed requests. It combines this data with the plan buffer cache to produce the result.
Hence, if you are facing an issue with query response time, BIA will definitely help.
Regards,
Praveen -
Multiple Variable selection filtering for BEx queries
I created a web template which accesses three BEx queries. I have the same variable in all three queries, so when I run the report it pulls the data from all three queries correctly. But when I change the filtering criteria within the report, only one of the query results changes, not the other two. My question is: is there a way to link all three queries so I can change the filtering criteria only once and have it affect all three?
Having one query variable's values reflected in the other queries comes under personalization.
This link gives some idea:
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/help/sdn_nw04/saphelp_nw04/helpdata/en/01/42c73c13b0ce5be10000000a114084/content.htm
Regards,
Vijay -
Need to Improve Performance - PPCS4 For PC
I've done numerous searches for this topic within this forum and I got zero returns - so please no flaming if my question has been asked a zillion times.
Okay, I've got a decently strong system as spec'd here:
-Intel i980x 3.33GHz Hexacore CPU
-24GB 2000MHz DDR3 RAM
-Dual 256GB SATA 3 Corsair SSD's running Raid 0
-Dual AMD HD6950 Video cards (flashed to 6970 specs) Crossfire
-W7U x64
-CS4 Suite w/all current updates
This system is immaculately clean and except for CS4 it has no extra apps loaded. It will run ANY game at max settings and rarely gets to 60% CPU usage.
Open up PP, drop a 30-sec AVCHD clip into the timeline, and the application will barely play back anything in the timeline - with the timeline marker cruising along, the video that is displayed might show 2 or 3 frames max. Project settings are correct.
I can take that same 30 sec clip and open up 100 instances of it in various media players and they all play back nicely at the same time. Drop it into PP - nothing...
CPU usage while playing a clip in PP is minimal.
I haven't used Premiere in a while, but I was an early adopter way back when (15 years ago) and most iterations of Premiere worked well for me.
Any ideas?

PPro, even the current CS5.5 version, does not work with Crossfire... and, in fact, having Crossfire may actually cause problems
>Dual 256GB SATA 3 Corsair SSD's running Raid 0
If that is all you have, you are below drive specifications for editing video
You need OS and all software on your boot drive, and AT LEAST one other drive for video files
My 3 hard drives are configured as... (WD = Western Digital)
1 - 320G WD Win7 64bit Pro and all program installs
2 - 320G WD Win7 swap file and video project files
3 - 1T WD all video files... input & output files
Search Microsoft to find out how to redirect your Windows swap file
http://search.microsoft.com/search.aspx?mkt=en-US&setlang=en-US
Trying to use only ONE Hard Drive for Video Editing
You are a music conductor, with a baton that you use to point to various parts of the orchestra... this is like Windows pointing to various parts of the hard drive to do Windows housekeeping or to load program segments for various functions
Now, at the same time and with the same hand... while still using the baton to conduct the orchestra... pick up a bow and play a fiddle... this would be doing something with your video file at the same time as all the other work
You as a person cannot do both at the same time with the same hand
A computer is a LITTLE better, in that it can switch from one kind of task to another very quickly... but not quickly enough for EASY video editing
You need AT LEAST two hard drives (separate drives, never a partition http://forums.adobe.com/thread/650708?tstart=0 for more) with Windows (or Mac OS) and software on your boot drive, and video files on a 2nd drive so the boot drive is not slowed down by trying to do everything
I find that the three drives I use work very well for me, for editing AVCHD video... some people use a 4th drive, so video INPUT files are on drive three and all OUTPUT files are on drive four... I only bought a mid-tower case instead of a full tower case (my bad... but had to fit in the space available on my office desk!) so I use the three drives that will fit
Depending on your exact hardware (motherboard brand & model AND USB2 enclosure brand & model AND external hard drive brand & model) AND the type of video file, you may... or may NOT... be able to use an external USB2 hard drive for video editing
Steve Grisetti in the Premiere Elements forum http://forums.adobe.com/thread/856208?tstart=0 and Jim Simon in the Premiere Pro forum http://forums.adobe.com/thread/856433?tstart=0 use USB externals for editing
A USB3 hard drive connected to a motherboard with USB3 is supposed to be fast enough for video editing (I don't have such, so don't know) but eSata DOES have a fast enough data transfer for video editing... I have not used this eSata Dock... for reference only, YMMV and all the usual disclaimers
http://www.amazon.com/Thermaltake-BlacX-eSATA-Docking-Station/dp/B001A4HAFS/ref=cm_cmu_pg_ t -
How to improve performance for bulk data load in Dynamics CRM 2013 Online
Hi all,
We need to bulk update (or create) contacts into Dynamics CRM 2013 online every night due to data updated from another external data source. The data size is around 100,000 and the data loading duration was around 6 hours.
We are already using the ExecuteMultiple web service to handle the integration; however, the 6-hour integration duration is still not acceptable and we are seeking advice for further improvement.
Any help is highly appreciated. Many thanks.
Gary

I think Andrii's referring to running multiple threads in parallel (see
http://www.mscrmuk.blogspot.co.uk/2012/02/data-migration-performance-to-crm.html - it's a bit dated, but should still be relevant).
Microsoft do have some throttling limits applied in Crm Online, and it is worth contacting them to see if you can get those raised.
100,000 records per night seems a large number. Are all these records new or updated, or are some unchanged, in which case you could filter them out before uploading? Or are there useful ways to summarise the data before loading?
Microsoft CRM MVP - http://mscrmuk.blogspot.com/ http://www.excitation.co.uk -
Improving performance for SM35
Hi all,
Are there any ways to improve the performance (time taken to load data) of SM35?
We are aware of executing the session in the background, but due to the high data volume (more than 10,000 records) per file, the time taken is still slow (about 3 hours per file).

Hi Raj,
The previous posters gave already all the information you need, but since the question is still open, let me try to summarize it.
You're getting almost 1 transaction processed per second, which might be OK depending on the application area and the complexity of the executed transaction. So, as Hermann initially pointed out, you should first profile the transaction you're running and check for any inefficiencies (custom coding in exits/BAdIs is often a source of slow-downs). If you find any problems, tune your transaction/application (not SM35).
If your application is fast enough (i.e. you cannot find any easy measures for making your transaction faster), you can compare application/transaction processing time versus total time taken in SM35. I personally doubt that you'll find any worthwhile discrepancy there (i.e. process time taken up by SM35, which is not due to the called transaction). Thus you should be left with Hermann's initial point of running several BDC's in parallel - meaning that you'll have to split your input file (you can automate that if you have to run such loads regularly). Without parallel processing you will always encounter unacceptable processing times when running huge data loads (even with optimal coding throughout the application).
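Harald's splitting idea can be sketched roughly as follows; `ty_record`, the package size, and the session-handling comment are all illustrative, not a definitive implementation:

```abap
CONSTANTS c_pack_size TYPE i VALUE 2000.          " illustrative package size

DATA: lt_all  TYPE STANDARD TABLE OF ty_record,   " full input file
      lt_pack TYPE STANDARD TABLE OF ty_record,
      lv_from TYPE i VALUE 1.

WHILE lv_from <= lines( lt_all ).
  CLEAR lt_pack.
  APPEND LINES OF lt_all FROM lv_from TO lv_from + c_pack_size - 1
         TO lt_pack.
  " Build one batch-input session per package (BDC_OPEN_GROUP /
  " BDC_INSERT / BDC_CLOSE_GROUP) and schedule the sessions so
  " they run in the background at the same time.
  lv_from = lv_from + c_pack_size.
ENDWHILE.
```

With n sessions running in parallel, total elapsed time approaches 1/n of the serial run, as long as the database and number of background work processes keep up.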
Kind regards, harald -
SLOW Query ... Need help improving performance
Database: Oracle 8i
Note: I don't have a whole lot of experience with writing queries, so please forgive me for any dumb mistakes I most likely made.
I have a query in which I have a SUM in two levels. I think this is probably the root cause of the very slow performance of the query. However, I just don't see any way around it, and can't come up with any other ways to speed up the query. The query itself only returns one line, the summary line. And, by slow, I mean it can take up to an hour or two. This is a query I need to run multiple times, based on some parameters that I cannot query from a database.
The query basically calculates the current unit cost of a part. It has to sum up the issue cost of the part (cost of material issued to the part's order), the actual dollars put into a part (labor, etc.), and the burden dollars associated with the part. This sum has to be divided by the total quantity of parts completed on the part's order to get the unit cost. I have to account for the possibility that the quantity complete is 0, so that I don't end up dividing by 0.
Below is my query, and sample data for it:
SELECT a.part_nbr
, a.mo_nbr
, a.unit_iss_cost
, CASE
WHEN a.qty_complete_ind ='Nonzero'
THEN SUM(a.act_dlrs/a.qty_complete)
ELSE 0
END AS unit_dlrs
, CASE
WHEN a.qty_complete_ind ='Zero'
THEN SUM(a.act_dlrs)
ELSE 0
END AS qty_0_dlrs
FROM ( SELECT act.part_nbr AS part_nbr
, act.order_nbr || '-' || act.sub_order_nbr AS mo_nbr
, ic.unit_iss_cost AS unit_iss_cost
, SUM (
act.act_dlrs_earned +
act.act_brdn_dls_earned +
act.tool_dlrs_earned +
act.act_fix_brdn_dls_ea
) AS act_dlrs
, ord.qty_complete AS qty_complete
, CASE
WHEN ord.qty_complete <>0
THEN 'Nonzero'
ELSE 'Zero'
END AS qty_complete_ind
FROM ACT act
, ISSUE_COST ic
, ORD ord
WHERE ord.ord_nbr =act.order_nbr AND
ord.sub_ord_nbr =act.sub_order_nbr AND
ord.major_seq_nbr =act.maj_seq_nbr AND
ic.ord_nbr =act.order_nbr AND
ic.sub_ord_nbr =act.sub_order_nbr AND
(act.order_nbr =LPAD(?,10,'0')) AND
(act.sub_order_nbr =LPAD(?,3,'0')) AND
(act.activity_date <=?)
GROUP BY act.part_nbr
, act.order_nbr || '-' || act.sub_order_nbr
, act.maj_seq_nbr
, ord.qty_complete
, ic.unit_iss_cost
) a
GROUP BY a.part_nbr
, a.mo_nbr
, a.unit_iss_cost
, a.qty_complete_ind
CREATE TABLE ACT (
creation_date date
, c_id number (5,0)
, part_nbr varchar(25)
, order_nbr varchar(10)
, sub_order_nbr varchar(3)
, maj_seq_nbr varchar(4)
, act_dlrs_earned number (15,2)
, act_brdn_dls_earned number (15,2)
, tool_dlrs_earned number (15,2)
, act_fix_brdn_dls_ea number (15,2)
, activity_date date
CONSTRAINT ACT_PK
PRIMARY KEY (creation_date, c_id)
);
--Please note, ISSUE_COST is actually a view, not a table, but by itself it runs very quickly
CREATE TABLE ISSUE_COST (
unit_iss_cost number(15,2)
, ord_nbr varchar(10)
, sub_ord_nbr varchar(3)
);
--Please note, the ORD table has a couple of foreign keys that I did not mention
CREATE TABLE ORD (
ord_nbr varchar(10)
, sub_ord_nbr varchar(3)
, major_seq_nbr varchar(4)
, qty_complete number (13,4)
);
Sample tables:
ACT
creation_date c_id part_nbr order_nbr sub_order_nbr maj_seq_nbr act_dlrs_earned act_brdn_dls_earned tool_dlrs_earned act_fix_brdn_dls_ea activity_date
01/02/2000 12345 ABC-123 0000012345 001 0010 10.00 20.00 0.00 0.00 01/01/2000
01/02/2000 12345 XYZ-987 0000054321 001 0030 100.00 175.00 10.00 10.00 01/01/2000
01/03/2000 12347 ABC-123 0000012345 001 0020 25.00 75.00 5.00 1.00 01/02/2000
01/03/2000 12348 ABC-123 0000012345 001 0020 75.00 120.00 25.00 5.00 01/02/2000
01/03/2000 12349 XYZ-987 0000054321 001 0050 50.00 110.00 0.00 0.00 01/02/2000
01/04/2000 12350 ABC-123 0000012345 001 0030 25.00 40.00 0.00 0.00 01/03/2000
ISSUE_COST
unit_iss_cost ord_nbr sub_ord_nbr
125.00 0000012345 001
650.00 0000054321 001
ORD
ord_nbr sub_ord_nbr major_seq_nbr qty_complete
0000012345 001 0010 10
0000012345 001 0020 10
0000012345 001 0030 0
0000054321 001 0030 20
0000054321 001 0050 19If insert statements are needed for the sample tables, let me know and I'll go re-figure out how to write them. (I only have read-only access to the database I'm querying, so creating tables and inserting values aren't things I ever do).
Thanks in advance!

For diagnosing where the time of your query is being spent, we don't need create table and insert statements. If we execute your query with only a handful of rows, the query will be very fast. What we do need to know is the plan the optimizer takes to compute your result set, and the cardinalities of each step.
Please read When your query takes too long ... carefully and post the full explain plan and tkprof output.
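For reference, on 8i (where DBMS_XPLAN does not yet exist) the plan Rob is asking for can be captured roughly like this, assuming PLAN_TABLE was created with the standard utlxplan.sql script; the STATEMENT_ID and the shortened statement after FOR are placeholders:

```sql
-- Record the optimizer's plan for the slow statement
EXPLAIN PLAN SET STATEMENT_ID = 'unit_cost' FOR
SELECT * FROM ord;  -- placeholder: paste the full unit-cost query here

-- Show the recorded plan as an indented tree
SELECT LPAD(' ', 2 * level) || operation || ' ' || options
       || ' ' || object_name AS plan_step
FROM plan_table
START WITH id = 0 AND statement_id = 'unit_cost'
CONNECT BY PRIOR id = parent_id AND statement_id = 'unit_cost';
```

tkprof output additionally shows the actual row counts per step, which is what reveals where the time really goes.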
Regards,
Rob. -
Need to Improve Performance on a Spatial Boundary Crossing Calculator
I am attempting to compare a series of geometries to calculate a number of statistics where they overlap. Essentially I have a table of 50,000 lines and another table of 1000 circles. I need to determine which lines overlap each circle, and for each intersection, I need to determine how much time and distance each line spends in each circle.
I have a PL/SQL program that performs this operation now and it works. The problem is that it takes far too long.
Here is a summary of how the job runs:
1) For each LINE, determine which CIRCLES it overlaps with
2) For each LINE/CIRCLE pair, determine the intersection points
3) Insert the intersection points in a temporary table
4) Once you have all the points, pair them up as Entry/Exit points for each circle
5) Calculate duration (time) and distance between entry and exit points
6) Return to step 1 for next LINE
There are multiple loops here:
1-6 is the outer loop performed once for each of the 50,000 lines.
2-3 is performed once for each line/circle pair (probable avg of 5 circles per line)
4-5 is performed once again for each line/circle pair
Even if the process only takes a couple of seconds per LINE, we are still taking more than 24 hours to process, which is not acceptable.
This original process was written with 9i, and I am now running 10gR2, so I know there are new features that should help. For starters, I think I can use SDO_JOIN in place of the original outer loop to generate a complete list of geometry interactions in one query. Of course, I am still concerned about how long that might take.
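The SDO_JOIN idea mentioned above would look roughly like this in 10gR2; LINES, CIRCLES, and GEOM are placeholder table/column names standing in for the real schema:

```sql
-- One set-based pass returning every LINE/CIRCLE pair that interacts,
-- replacing the per-line outer loop (steps 1 and 6).
SELECT /*+ ORDERED */ l.rowid AS line_rid, c.rowid AS circle_rid
FROM TABLE(SDO_JOIN('LINES',   'GEOM',
                    'CIRCLES', 'GEOM',
                    'mask=ANYINTERACT')) j,
     lines l,
     circles c
WHERE j.rowid1 = l.rowid
  AND j.rowid2 = c.rowid;
```

Each returned pair could then be fed to SDO_GEOM.SDO_INTERSECTION to get the entry/exit points, as in steps 2-3 above.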
Even more troubling is that, even if that works, I still don't see how to improve the rest of the calculations. Any suggestions would be appreciated.

No, I don't mind providing it.
Here it is:
-- cre_&stab._bndxing.sql
--Procedure definition of bndxings
def stab=&1
CREATE OR REPLACE PROCEDURE Find_&stab._bndxings
(theDate IN DATE, theStr IN VARCHAR2) IS
--Select flights from table
CURSOR FCursor IS
SELECT new_Flight_Index,
Acid,
New_Act_Date,
Dept_Aprt,
Dep_Time,
Arr_Aprt,
Arr_Time,
Acft_Type,
Physical_Class,
User_Class,
Nrp,
d_lat,
d_lon,
a_lat,
a_lon,
flight_track
FROM jady.Flight
WHERE new_act_date = theDate
AND flight_track IS NOT NULL
AND substr(acid,1,1) = theStr
--AND acid in (select acid from name_temp)
--AND acid = 'AAL1242'
ORDER BY acid,new_flight_index;
--Temp vars for storing flight info
fi_var NUMBER;
acid_var VARCHAR2(7);
dep_time_var DATE;
arr_time_var DATE;
F_Rec FCursor%ROWTYPE;
--Temp vars for flight
tcnt INTEGER;
cur_lat NUMBER;
cur_lon NUMBER;
last_lat NUMBER;
last_lon NUMBER;
--Temp vars for airspace and xing geometries
aname VARCHAR2(20);
bxings MDSYS.SDO_GEOMETRY;
bxcnt INTEGER;
--Select xings made from temp bndxing table
CURSOR XCursor IS
SELECT Act_Date,
Name,
Lon,
Lat,
Alt,
Time,
OPS
FROM bndxing_tmp
WHERE Flight_Index = fi_var
AND Acid = acid_var
ORDER BY Name,Time;
--Temp vars for paired in/out xings
ad date;
ilon NUMBER;
ilat NUMBER;
ialt NUMBER;
isec NUMBER;
iops NUMBER;
olon NUMBER;
olat NUMBER;
oalt NUMBER;
osec NUMBER;
oops NUMBER;
gcr NUMBER;
dist NUMBER;
dura NUMBER;
ops VARCHAR2(1);
i INTEGER;
i_aname VARCHAR2(20);
o_aname VARCHAR2(20);
names_match BOOLEAN;
theSeq NUMBER;
same_airport_no_tzdata BOOLEAN;
-- Cursor and variables for bndxing sequencing
CURSOR BCursor IS
SELECT * FROM bndxing
WHERE act_date = theDate
AND Acid = acid_var
AND Flight_Index = fi_var
ORDER BY in_time
FOR UPDATE;
BRec BCursor%ROWTYPE;
--Error logging variable
strErrorMessage VARCHAR2(255);
BEGIN --Start of Main Loop
--Loop for each flight in table
OPEN FCursor;
FETCH FCursor INTO F_Rec;
-- FOR f IN FCursor LOOP
WHILE FCursor%FOUND LOOP
fi_var:= F_Rec.new_Flight_Index;
acid_var := F_Rec.acid;
arr_time_var := F_Rec.arr_time;
dep_time_var := F_Rec.dep_time;
last_lat := -10000; --initialization
last_lon := -10000; --initialization
-- DEBUG STATEMENT
/* Insert into bnd_error values (err_seq.NEXTVAL,
sysdate,
F_Rec.Acid,
F_Rec.new_Flight_Index,
'Checkpoint 1'); */
--Add departing xing to temp table if in US airspace
DECLARE
CURSOR DepCur IS
SELECT Name
FROM &stab.
WHERE SDO_RELATE(Airspace,
MDSYS.SDO_GEOMETRY(2001,8307,
MDSYS.SDO_POINT_TYPE(F_Rec.d_lon,F_Rec.d_lat,null),
null, null),
'mask=CONTAINS querytype=WINDOW') = 'TRUE';
BEGIN -- Start of Departing Airspace Loop
FOR c in DepCur LOOP
INSERT INTO Bndxing_Tmp VALUES (F_Rec.new_Flight_Index,
F_Rec.acid,
F_Rec.New_Act_Date,
c.name,
2,
F_Rec.d_lon,
F_Rec.d_lat,
0,
(F_Rec.Dep_Time-F_Rec.New_Act_Date)*86400);
END LOOP;
EXCEPTION
WHEN NO_DATA_FOUND THEN NULL;
WHEN OTHERS THEN
strErrorMessage := SQLERRM;
INSERT INTO bnd_error VALUES (err_seq.NEXTVAL,
sysdate,
F_Rec.Acid,
F_Rec.new_Flight_Index,
'Exception from Departing Airspace loop: ' || strErrorMessage);
COMMIT;
END; -- End of Departing Airspace Loop
--Add arrival xing to temp table if in US airspace
DECLARE
CURSOR ArrCur IS
SELECT name
FROM &stab.
WHERE SDO_RELATE(Airspace,
MDSYS.SDO_GEOMETRY(2001,8307,
MDSYS.SDO_POINT_TYPE(F_Rec.a_lon, F_Rec.a_lat, null),
null, null),
'mask=CONTAINS querytype=WINDOW') = 'TRUE';
BEGIN -- Start of Arrival Airspace Loop
FOR c IN ArrCur LOOP
INSERT INTO Bndxing_Tmp VALUES (F_Rec.new_Flight_Index,
F_Rec.acid,
F_Rec.New_Act_Date,
c.name,
1,
F_Rec.a_lon,
F_Rec.a_lat,
0,
(F_Rec.Arr_Time - F_Rec.New_Act_Date)*86400);
END LOOP;
EXCEPTION
WHEN NO_DATA_FOUND THEN NULL;
WHEN OTHERS THEN
strErrorMessage := SQLERRM;
INSERT INTO bnd_error VALUES (err_seq.NEXTVAL,
sysdate,
F_Rec.Acid,
F_Rec.new_Flight_Index,
'Exception from Arrival Airspace loop: ' || strErrorMessage);
COMMIT;
END; -- End of Arrival Airspace Loop
--DEBUG STATEMENT
/* Insert into bnd_error values (err_seq.NEXTVAL,
sysdate,
F_Rec.Acid,
F_Rec.new_Flight_Index,
'Checkpoint 4'); */
--Find all intersections between the flight track and airspace boundaries and insert into temp table
DECLARE
--Find airspace boundaries that interact with the flight track
CURSOR CCursor IS
SELECT Name, Boundary
FROM &stab.
WHERE SDO_RELATE(boundary,F_Rec.flight_track,'mask=OVERLAPBDYDISJOINT querytype=WINDOW')='TRUE';
BEGIN
FOR c IN CCursor LOOP
bxings := SDO_GEOM.SDO_INTERSECTION(c.boundary,F_Rec.flight_track,10);
bxcnt:=bxings.sdo_ordinates.count;
LOOP
INSERT INTO bndxing_tmp VALUES (F_Rec.new_Flight_Index,
F_Rec.acid,
F_Rec.New_Act_Date,
c.name,
0,
bxings.sdo_ordinates(bxcnt-3),
bxings.sdo_ordinates(bxcnt-2),
bxings.sdo_ordinates(bxcnt-1),
SDO_LRS.FIND_MEASURE(F_Rec.flight_track,
MDSYS.SDO_GEOMETRY(2001,8307,NULL,
MDSYS.SDO_ELEM_INFO_ARRAY(1,1,1),
MDSYS.SDO_ORDINATE_ARRAY(bxings.sdo_ordinates(bxcnt-3),
bxings.sdo_ordinates(bxcnt-2)))));
bxcnt := bxcnt - 4;
EXIT WHEN (bxcnt < 1);
END LOOP;
END LOOP; -- end CCursor LOOP
EXCEPTION
WHEN OTHERS THEN
strErrorMessage := SQLERRM;
INSERT INTO bnd_error VALUES (err_seq.NEXTVAL,
sysdate,
F_Rec.Acid,
F_Rec.new_Flight_Index,
'Exception from bndxing loop: ' || strErrorMessage);
COMMIT;
END;
--DEBUG STATEMENT
/* Insert into bnd_error values (err_seq.NEXTVAL,
sysdate,
F_Rec.Acid,
F_Rec.new_Flight_Index,
'Checkpoint 6'); */
--After all xings for a flight have been collected sort Xings by name and time and grab pairwise
theSeq :=0;
OPEN XCursor;
BEGIN -- Start of Stats Loop
LOOP
FETCH XCursor INTO ad, i_aname, ilon, ilat, ialt, isec, iops; --CHANGED CODE
EXIT WHEN XCursor%NOTFOUND ;
FETCH XCursor INTO ad, o_aname, olon, olat, oalt, osec, oops; --CHANGED CODE
EXIT WHEN XCursor%NOTFOUND ;
names_match := (i_aname = o_aname); --NEW CODE
WHILE not names_match LOOP --NEW CODE
i_aname := o_aname; --NEW CODE
ilon := olon; --NEW CODE
ilat := olat; --NEW CODE
ialt := oalt; --NEW CODE
isec := osec; --NEW CODE
iops := oops; --NEW CODE
FETCH XCursor INTO ad, o_aname, olon, olat, oalt, osec, oops; --NEW CODE
EXIT WHEN XCursor%NOTFOUND; --NEW CODE
names_match := (i_aname = o_aname); --NEW CODE
END LOOP; --NEW CODE
--Calculate stats
BEGIN -- Start of In Values Loop
i:=4;
IF (iops<>2) THEN
-- Did not depart from this airspace, calculate entry altitude into airspace.
LOOP
i:=i+4;
EXIT WHEN F_Rec.flight_track.sdo_ordinates(i)>isec;
END LOOP;
IF ( F_Rec.flight_track.sdo_ordinates(i-1) = F_Rec.flight_track.sdo_ordinates(i-5) ) THEN
ialt := F_Rec.flight_track.sdo_ordinates(i-1);
ELSE
ialt:=SDO_LRS.FIND_MEASURE(
MDSYS.SDO_GEOMETRY(3302,8307,NULL,
MDSYS.SDO_ELEM_INFO_ARRAY(1,2,1),
MDSYS.SDO_ORDINATE_ARRAY(F_Rec.flight_track.sdo_ordinates(i-7),
F_Rec.flight_track.sdo_ordinates(i-6),
F_Rec.flight_tracK.sdo_ordinates(i-5),
F_Rec.flight_track.sdo_ordinates(i-3),
F_Rec.flight_track.sdo_ordinates(i-2),
F_Rec.flight_track.sdo_ordinates(i-1))),
MDSYS.SDO_GEOMETRY(2001,8307,NULL,
MDSYS.SDO_ELEM_INFO_ARRAY(1,1,1),
MDSYS.SDO_ORDINATE_ARRAY(ilon,ilat)));
END IF;
END IF;
EXCEPTION
WHEN OTHERS THEN
strErrorMessage := SQLERRM;
INSERT INTO bnd_error VALUES (err_seq.NEXTVAL,
sysdate,
F_Rec.Acid,
F_Rec.new_Flight_Index,
'Exception from In Values section: ' || strErrorMessage);
COMMIT;
END; -- End of In Values Loop
BEGIN -- Start of Out Values Loop
i:=4;
IF (oops<>1) THEN
-- Did not arrive in this airspace, calculate departure altitude from airspace.
LOOP
i:=i+4;
EXIT WHEN F_Rec.flight_track.sdo_ordinates(i)>osec;
END LOOP;
--Find alt at this time
IF ( F_Rec.flight_track.sdo_ordinates(i-1) = F_Rec.flight_track.sdo_ordinates(i-5) ) THEN
oalt := F_Rec.flight_track.sdo_ordinates(i-1);
ELSE
oalt:=SDO_LRS.FIND_MEASURE(
MDSYS.SDO_GEOMETRY(3302, 8307, NULL,
MDSYS.SDO_ELEM_INFO_ARRAY(1,2,1),
MDSYS.SDO_ORDINATE_ARRAY(F_Rec.flight_track.sdo_ordinates(i-7),
F_Rec.flight_track.sdo_ordinates(i-6),
F_Rec.flight_track.sdo_ordinates(i-5),
F_Rec.flight_track.sdo_ordinates(i-3),
F_Rec.flight_track.sdo_ordinates(i-2),
F_Rec.flight_track.sdo_ordinates(i-1))),
MDSYS.SDO_GEOMETRY(2001,8307,NULL,
MDSYS.SDO_ELEM_INFO_ARRAY(1,1,1),
MDSYS.SDO_ORDINATE_ARRAY(olon,olat)));
END IF;
END IF;
EXCEPTION
WHEN OTHERS THEN
strErrorMessage := SQLERRM;
INSERT INTO bnd_error VALUES (err_seq.NEXTVAL,
sysdate,
F_Rec.Acid,
F_Rec.new_Flight_Index,
'Exception from Out Values loop: ' || strErrorMessage);
COMMIT;
END; -- End of Out Values Loop
BEGIN -- Start of Finish Loop
--Find GCR, actual distance and duration in airspace
gcr := SDO_GEOM.SDO_DISTANCE(MDSYS.SDO_GEOMETRY(2001,8307,
MDSYS.SDO_POINT_TYPE(ilon,ilat,NULL),NULL,NULL),
MDSYS.SDO_GEOMETRY(2001,8307,
MDSYS.SDO_POINT_TYPE(olon,olat,NULL),NULL,NULL),
10,'unit=naut_mile');
--DEBUG STATEMENT
/* Insert into bnd_error values (err_seq.NEXTVAL,
sysdate,
F_Rec.Acid,
F_Rec.new_Flight_Index,
'In Finish Loop: isec: ' ||isec||' osec: '||osec
||' airspace: '||i_aname); */
dist := SDO_GEOM.SDO_LENGTH(SDO_LRS.CLIP_GEOM_SEGMENT(F_Rec.flight_track,isec,osec),10,'unit=naut_mile');
dura := (osec - isec);
--Set OPS Flag
iops := iops + oops;
IF (iops=3) THEN
ops := 'B';
ELSIF (iops=2) THEN
ops := 'D';
ELSIF (iops=1) THEN
ops := 'A';
ELSE
ops := 'O';
END IF;
theSeq := theSeq + 1;
--Insert into Bndxing table
INSERT INTO Bndxing VALUES (F_Rec.Acid,
F_Rec.new_Flight_Index,
F_Rec.New_Act_Date,
theSeq,
F_Rec.Dept_Aprt,
F_Rec.Arr_Aprt,
i_aname,
round(ilon,3),
round(ilat,3),
ialt,
isec/86400 + ad,
NULL, -- IN_SPEED (TBD)
round(olon,3),
round(olat,3),
oalt,
osec/86400 + ad,
NULL, -- OUT_SPEED (TBD)
gcr,
dist,
dura,
ops, -- CHANGED CODE
nvl(F_Rec.Acft_Type,'----'),
NULL, -- IF_FLAG (NULL)
nvl(F_Rec.Physical_Class,'-'),
nvl(F_Rec.User_Class,'-'),
F_Rec.Nrp,
NULL, -- FFS_FLAG (NULL)
NULL, -- ER_SG (NULL)
NULL, -- ER_TI (NULL)
NULL, -- ER_ZT (NULL)
NULL, -- ER_DU (NULL)
NULL, -- ER_SP (NULL)
NULL); -- ER_BD (NULL)
DELETE FROM bndxing_tmp
WHERE acid=F_Rec.Acid and flight_index=F_Rec.new_Flight_Index;
COMMIT;
EXCEPTION
WHEN OTHERS THEN
strErrorMessage := SQLERRM;
INSERT INTO bnd_error VALUES (err_seq.NEXTVAL,
sysdate,
F_Rec.Acid,
F_Rec.new_Flight_Index,
'Exception from Finish loop: ' || strErrorMessage);
COMMIT;
END; -- End of Finish Loop
END LOOP;
EXCEPTION
WHEN OTHERS THEN
strErrorMessage := SQLERRM;
INSERT INTO bnd_error VALUES (err_seq.NEXTVAL,
sysdate,
F_Rec.Acid,
F_Rec.new_Flight_Index,
'Exception from Stats loop: ' || strErrorMessage);
COMMIT;
END; -- End of Stats Loop
--Reset cursor and track geometry
CLOSE XCursor;
F_Rec.flight_track.sdo_ordinates.delete;
-- delete from hist_bndxing_tmp
-- where acid=acid_var and new_flight_index=fi_var;
FETCH FCursor INTO F_Rec;
END LOOP;
--DEBUG STATEMENT
/* INSERT INTO bnd_error VALUES (err_seq.NEXTVAL,
sysdate,
acid_var,
fi_var,
'Checkpoint 7'); */
CLOSE FCursor;
theSeq := 1;
OPEN BCursor;
LOOP
FETCH BCursor INTO BRec;
IF BCursor%NOTFOUND THEN
EXIT;
END IF;
UPDATE bndxing
SET segment = theSeq
WHERE CURRENT OF BCursor;
theSeq := theSeq + 1;
END LOOP;
CLOSE BCursor;
EXCEPTION
WHEN OTHERS THEN
strErrorMessage := SQLERRM;
Insert into bnd_error values (err_seq.NEXTVAL,
sysdate,
acid_var,
fi_var,
'Exception from main: ' || strErrorMessage);
COMMIT;
CLOSE FCursor;
END; -- End of Main Loop
SHOW ERRORS;
--exit;
-
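For readers untangling the geometry calls above: SDO_LRS.FIND_MEASURE interpolates the altitude (stored as the LRS measure) at the crossing point, and SDO_GEOM.SDO_DISTANCE returns the great-circle distance between the entry and exit fixes. Here is a rough Python sketch of the same three computations (planar projection and haversine respectively, which are simplifications of Oracle's geodetic routines; all names are illustrative and not part of the original package):

```python
import math

EARTH_RADIUS_NM = 3440.065  # mean Earth radius in nautical miles

def great_circle_nm(lon1, lat1, lon2, lat2):
    """Haversine distance, a stand-in for SDO_GEOM.SDO_DISTANCE(..., 'unit=naut_mile')."""
    rlat1, rlat2 = math.radians(lat1), math.radians(lat2)
    dlat = rlat2 - rlat1
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(rlat1) * math.cos(rlat2) * math.sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_NM * math.asin(math.sqrt(a))

def interpolate_measure(x1, y1, m1, x2, y2, m2, px, py):
    """Planar stand-in for SDO_LRS.FIND_MEASURE: project (px, py) onto the
    segment (x1, y1)-(x2, y2) and linearly interpolate the measure m."""
    dx, dy = x2 - x1, y2 - y1
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0:
        return m1  # degenerate segment: both endpoints coincide
    t = ((px - x1) * dx + (py - y1) * dy) / seg_len2
    t = max(0.0, min(1.0, t))  # clamp the projection onto the segment
    return m1 + t * (m2 - m1)

def ops_flag(iops, oops):
    """Mirror of the iops + oops mapping: B=both, D=departure, A=arrival, O=overflight."""
    return {3: 'B', 2: 'D', 1: 'A'}.get(iops + oops, 'O')
```

Note that the PL/SQL version operates on geodetic coordinates (SRID 8307), so the planar projection above is only a reasonable approximation for short track segments.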
Feasibility of SAP BO Publication For BEx Queries
Hi,
We have a sizeable number of workbooks that are broadcast every day to users/the portal using the precal server. However, we notice that the performance of the precal servers is not fail-safe; it requires a lot of manual intervention in the production box from both the Basis team and the BW support team to process and distribute the files.
In this respect, we are thinking of using SAP BO's Publication feature to distribute the workbooks. We would give them the BEx query, which they would use to develop their universe and, thereafter, the report for distribution.
Please let me know if anyone has used BO Publication in this scenario, or whether this would be a recommended and scalable approach in the future.
I have similar experience with Cognos on top of SAP BW. I guess it works the same way.
I'm talking about using BW as the data warehouse and BO as a viewer on top: you are not moving data to another DB.
You'll be using the OLAP BAPI and OLE DB for OLAP interfaces. Those interfaces issue MDX queries that use the OLAP engine of BW. Usage of aggregates, the OLAP cache, security features, etc. is transparent to the user.
I don't know about BO-specific connectors or drivers. Regarding performance, writing efficient MDX queries will keep it acceptable; I don't know how BO deals with that.
What is CAD?
Tomer. -
How to improve performance for Custom Extractor in BI..
HI all,
I am new to BI and have been working on it for a couple of weeks. I created a custom extractor (data view) in the source system, and pulling data takes a lot of time. Can anyone suggest how to improve the performance of my custom extractor? Please do the needful.
Thanks and Regards,
Venugopal
Dear Venugopal,
Use transaction ST05 to check that your SQL statements are optimal and that you do not have redundant database calls. You should use "bulking" as much as possible, which means fetching the required data with one request to the database rather than with multiple requests.
Use transaction SE30 to check whether you are wasting time in loops and, if so, optimize the algorithm.
Best Regards,
Sylvia
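Sylvia's "bulking" point can be illustrated with a small Python/sqlite3 sketch (the table, column names, and keys here are made up for the demo): fetch all required rows in one round trip instead of issuing one query per key.

```python
import sqlite3

# In-memory demo table with 100 rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)", [(i, f"v{i}") for i in range(100)])

keys = list(range(10))

# Anti-pattern: one database round trip per key.
rows_slow = [conn.execute("SELECT val FROM t WHERE id = ?", (k,)).fetchone()[0]
             for k in keys]

# Bulking: a single round trip fetching every required row at once.
placeholders = ",".join("?" * len(keys))
rows_fast = [r[0] for r in conn.execute(
    f"SELECT val FROM t WHERE id IN ({placeholders}) ORDER BY id", keys)]

assert rows_slow == rows_fast  # same data, far fewer round trips
```

On a networked database the bulked version saves one network round trip per key, which is usually where the time goes; the same principle applies to extractor code reading from SAP tables.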