Ref Cursors / throwing data into a Ref Cursor as data is fetched
I was wondering if anyone has executed a SQL statement and, as each row is fetched back, done some data checks and processing to decide whether the row is valid to return or not, based on the values being fetched.
For example, I'm tuning a SQL statement that has an EXISTS clause in the WHERE clause containing a nested sub-query with some parameters passed in. I am attempting to move that sub-query into a function in a package, which is called in the SELECT statement. As I fetch each row back, I want to check some of the selected values; if the criteria are met, I want to execute the function to see if the data exists. If it does exist, I want the fetched row returned in the ref cursor. If the criteria are met but the function finds no row, I don't want the fetched row returned.
Right now, the data has to go into a REF cursor because it is returned to the Java application as a result set.
I've found many examples of taking a SELECT statement and putting the results in a ref cursor. But I want to go a step further: before I put each fetched row in the ref cursor, I want to do some processing to decide whether it belongs there.
If someone has a better idea to accomplish this, I'm all ears. Like I say, I'm only doing it this way for the sake of database tuning, and I think it will speed things up. The EXISTS clause works and runs fast from an end-user standpoint, but when it processes on the database with the nested subquery, it is slow.
Here's an example of something that might be a problem (notice the nested subquery). I moved the nested subquery into a function in a database package and call that function from the SELECT statement. As I process each row, I want to check some values before the function call executes. If a row meets the criteria and the function finds data, the record is OK to fetch and return in the ref cursor. If it meets the criteria but the function returns no data, I don't want the fetched row from the main query returned:
SELECT EMPNO,
       FIRST_NAME,
       LAST_NAME
  FROM EMP E,
       DEPT D
 WHERE E.DEPTNO = D.DEPTNO
   AND EXISTS (SELECT 'X'
                 FROM MANAGER M
                WHERE M.MANAGER_ID = E.MANAGER_ID
                  AND MANAGER_TYPE IN (SELECT MANAGER_TYPE
                                         FROM MANAGER_LOOKUP ML
                                        WHERE ML.MANAGER_TYPE = M.MANAGER_TYPE))
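To make the shape concrete, here is a rough, untested sketch of the pipelined-function approach I'm considering, so each row is checked before it is ever visible to the cursor. The types and the package function MGR_PKG.MANAGER_EXISTS are hypothetical stand-ins for my real nested subquery:

CREATE OR REPLACE TYPE emp_row_t AS OBJECT
( empno      NUMBER
, first_name VARCHAR2(50)
, last_name  VARCHAR2(50)
);
/
CREATE OR REPLACE TYPE emp_tab_t AS TABLE OF emp_row_t;
/
CREATE OR REPLACE FUNCTION filtered_emps RETURN emp_tab_t PIPELINED IS
BEGIN
  FOR r IN (SELECT e.empno, e.first_name, e.last_name, e.manager_id
              FROM emp e, dept d
             WHERE e.deptno = d.deptno)
  LOOP
    -- only pipe the row out when the criteria check passes
    IF MGR_PKG.MANAGER_EXISTS(r.manager_id) = 'Y' THEN
      PIPE ROW (emp_row_t(r.empno, r.first_name, r.last_name));
    END IF;
  END LOOP;
  RETURN;
END;
/
-- the ref cursor handed to Java would then simply be:
--   OPEN p_rc FOR SELECT * FROM TABLE(filtered_emps);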
Any help or ideas of other things to try are appreciated. Keep in mind that I am returning this data to the Java application, so putting the data in a ref cursor in PL/SQL is the ideal method.
Chris
Ref cursors are neither required nor desirable when writing a Java database application. Cursors are mentioned only once in the JDBC documentation reference guide, in the section "Memory Leaks and Running Out of Cursors".
In a word, cursors are just plain ridiculous, and in fact I never used them in my 15+ years of application development practice:
http://vadimtropashko.wordpress.com/cursors/
Similar Messages
-
I wish to insert data into a table only when the value of the inserted data has changed. Thus, in a time series, if the value of the data at time t-1 is 206, then data to be inserted at time t with value 206 is skipped (not entered).
If the value at time t+1 is 206, it is skipped also, until the value changes; so if the value at t+1 was 205, that would be inserted, and if at time t+2 the data is 206 again, it would be inserted too.
What is the best way to do it without increasing overheads?
This view works:
SELECT i.IDNO, i.[Date], i.[Level]
FROM mytable i
INNER JOIN mytable d
        ON d.IDNO = i.IDNO - 1
WHERE i.[Level] <> d.[Level]
on the mytable below. A trigger could be quite useful here, although I am cautious about using them. However, I wish to avoid the overhead of a temp table (which could be sizable). Against the mytable below, the result should give 3 lines. The IDNO is an identity column.
IDNO  Item  Date            Level
1     X24   12/23/13 10:41  22996
2     X24   12/23/13 10:41  22996
3     X24   12/23/13 9:21   23256
4     X24   12/23/13 9:21   23256
5     X24   12/23/13 9:22   23256
6     X24   12/23/13 9:22   23256
7     X24   12/23/13 9:22   22916
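A hedged alternative, assuming this is SQL Server 2012 or later (table and column names as above): LAG() reads each row's previous level directly, avoiding the self-join, and also returns the very first row, giving the 3 expected lines:

SELECT IDNO, Item, [Date], [Level]
FROM (SELECT IDNO, Item, [Date], [Level],
             LAG([Level]) OVER (ORDER BY IDNO) AS prev_level
      FROM mytable) t
WHERE prev_level IS NULL OR [Level] <> prev_level;
-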
Loading data into multiple tables - Bulk collect or regular Fetch
I have a procedure to load data from one source table into eight different destination tables. The 8 tables have some of the columns of the source table with a common key.
I have run into a couple of problems and have a few questions where I would like to seek advice:
1.) The procedure with and without the BULK COLLECT clause took the same time for 100,000 records. I thought I would see an improvement in performance when I included BULK COLLECT with LIMIT.
2.) Updating the Load_Flag in source_table happens only for a few records, not all. I had expected all records to be updated.
3.) Are there other suggestions to improve the performance? or could you provide links to other posts or articles on the web that will help me improve the code?
Notes:
1.) 8 Destination tables have at least 2 Million records each, have multiple indexes and are accessed by application in Production
2.) There is an initial load of 1 Million rows with a subsequent daily load of 10,000 rows. Daily load will have updates for existing rows (not shown in code structure below)
The structure of the procedure is as follows
DECLARE
  TYPE dest_type IS TABLE OF source_table%ROWTYPE;
  dest_tab dest_type;
  iCount NUMBER := 0;
  CURSOR source_cur IS SELECT * FROM source_table FOR UPDATE OF load_flag;
BEGIN
  OPEN source_cur;
  LOOP
    FETCH source_cur      -- BULK COLLECT
      INTO dest_tab;      -- LIMIT 1000
    EXIT WHEN source_cur%NOTFOUND;
    FOR i IN dest_tab.FIRST .. dest_tab.LAST LOOP
      <Insert into app_tab1 values key, col12, col23, col34 ;>
      <Insert into app_tab2 values key, col15, col29, col31 ;>
      <Insert into app_tab3 values key, col52, col93, col56 ;>
      UPDATE source_table SET load_flag = 'Y' WHERE CURRENT OF source_cur;
      iCount := iCount + 1;
      IF iCount = 1000 THEN
        COMMIT;
        iCount := 0;
      END IF;
    END LOOP;
  END LOOP;
  COMMIT;
END;
Edited by: user11368240 on Jul 14, 2009 11:08 AM
Assuming you are on 10g or later, the PL/SQL compiler generates the bulk fetch for you automatically, so your code is the same as (untested):
DECLARE
  iCount NUMBER := 0;
  CURSOR source_cur IS SELECT * FROM source_table FOR UPDATE OF load_flag;
BEGIN
  FOR r IN source_cur
  LOOP
    <Insert into app_tab1 values key, col12, col23, col34 ;>
    <Insert into app_tab2 values key, col15, col29, col31 ;>
    <Insert into app_tab3 values key, col52, col93, col56 ;>
    UPDATE source_table SET load_flag = 'Y' WHERE CURRENT OF source_cur;
    iCount := iCount + 1;
    IF iCount = 1000 THEN
      COMMIT;
      iCount := 0;
    END IF;
  END LOOP;
  COMMIT;
END;
However most of the benefit of bulk fetching would come from using the array with a FORALL statement, which the PL/SQL compiler can't automate for you.
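For illustration, a rough, untested sketch of that explicit pattern, keeping the hypothetical app_tab columns from the pseudocode above and updating by key instead of WHERE CURRENT OF, so the batch commit doesn't invalidate a lock cursor. Note that referencing record fields like dest_tab(i).col12 inside FORALL needs 11g; on 10g you would need one collection per column:

DECLARE
  TYPE dest_type IS TABLE OF source_table%ROWTYPE;
  dest_tab dest_type;
  CURSOR source_cur IS SELECT * FROM source_table WHERE load_flag = 'N';
BEGIN
  OPEN source_cur;
  LOOP
    FETCH source_cur BULK COLLECT INTO dest_tab LIMIT 1000;
    EXIT WHEN dest_tab.COUNT = 0;
    -- one FORALL per target table: each runs as a single bulk SQL call
    FORALL i IN 1 .. dest_tab.COUNT
      INSERT INTO app_tab1
      VALUES (dest_tab(i).key, dest_tab(i).col12, dest_tab(i).col23, dest_tab(i).col34);
    FORALL i IN 1 .. dest_tab.COUNT
      INSERT INTO app_tab2
      VALUES (dest_tab(i).key, dest_tab(i).col15, dest_tab(i).col29, dest_tab(i).col31);
    FORALL i IN 1 .. dest_tab.COUNT
      UPDATE source_table SET load_flag = 'Y' WHERE key = dest_tab(i).key;
    COMMIT;  -- one commit per 1000-row batch
  END LOOP;
  CLOSE source_cur;
END;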
If you are fetching 1000 rows at a time, purely from a code simplification point of view you could lose iCount and the IF...COMMIT...END IF and just commit each time after looping through the 1000-row array.
However I'm not sure how committing every 1000 rows helps restartability, even if your real code has a WHERE clause in the cursor so that it only selects rows with load_flag = 'N' or whatever. If you are worried that it will roll back all your hard work on failure, why not just commit in your exception handler?
-
Cannot enter data into table with column named DATE
I have a table as follows:
CREATE TABLE PACKINGLINE (
SNO VARCHAR2(13),
"DATE" DATE DEFAULT SYSDATE,
PRDORDNO VARCHAR2(12),
NOW DATE DEFAULT SYSDATE,
CONSTRAINT PACKINGLINE_PK PRIMARY KEY ("SNO", "DATE"));
Note the "cleaverly" named DATE and NOW columns (yes, that is the name of the column... our company was bessed with a very clear SQL server DBA in the past)
This SQL statement works:
INSERT INTO PACKINGLINE (SNO, PRDORDNO) VALUES ('1000808972', '100080897');
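For what it's worth, an insert that references the reserved-word column directly also works, as long as it is quoted exactly as in the DDL (a hypothetical example):

INSERT INTO PACKINGLINE (SNO, "DATE", PRDORDNO) VALUES ('1000808973', SYSDATE, '100080897');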
However, when I try to use SQL Developer as follows, I cannot enter data:
1. Open SQL Developer
2. Click on "Tables" then on my table named PACKINGLINE
3. Click the "Data" tab
4. Click the [+] "Insert Row" Icon
5. Enter the same data as above
6. Click Commit
7. The following error is shown in the log window and I cannot commit the data change.
Error message
==================
INSERT INTO "DBO"."PACKINGLINE" (SNO, PRDORDNO) VALUES ('1000808971', '100080897')
One error saving changes to table "DBO"."PACKINGLINE":
Row 5: ORA-06550: Line 1, Column 25:
PL/SQL: ORA-06552: PL/SQL: Compilation unit analysis terminated
ORA-06553: PLS-320: the declaration of the type of this expression is incomplete or malformed
ORA-06550: Line 1, Column 7:
PL/SQL: SQL Statement ignored
==================
I can copy and paste the above query into a normal SQL statement window and the data commits fine.
It's so flaky, there really must be something wrong, Mr Adobe support person....
If I load a PDF, do nothing, gently save it, reopen it, and then wait for my trial dialogue to wake up so that I can click on continue trial... then save as extended Reader blah.... then close, I have finally saved one that works!!!! But how difficult is that????
WHAT IS WRONG ADOBE??
-
How to load data into Planning/Essbase from multiple data column
Dear All,
I have a interface file which contains multiple data column as follows.
Year,Department,Account,Jan,Feb,Mar,Apr,May,Jun,Jul,Aug,Sep,Oct,Nov,Dec
FY10,Department1,Account1,1,2,3,4,5,6,7,8,9,10,11,12
FY10,Department2,Account1,1,2,3,4,5,6,7,8,9,10,11,12
FY10,Department3,Account1,1,2,3,4,5,6,7,8,9,10,11,12
I created a data load rule to load this interface file.
I want to use ODI to upload this interface. I tried to specify the rule name in ODI and run the interface,
but it came out with the following error:
2010-02-22 11:40:25,609 DEBUG [DwgCmdExecutionThread]: Error occured in sending record chunk...Cannot end dataload. Analytic Server Error(1003014): Unknown Member [FY09,032003,910201,99,0,0,0,0,0,0,0,0,0,0,0,0] in Data Load, [1] Records Completed
Any idea how to fix the columns? I am sure the member names are correct, as I can load data from the data load rule correctly.
Thanks
Dear John,
I updated the data load rule delimiter to "," and found a different error message as follows.
'A910201.99','HSP_InputValue','HKDepart','No_WT','032003','NO_Lease','FY09','Actual','Final','Local','0','0','0','0','0','0','0','0','0','0','0','0','Cannot end dataload. Analytic Server Error(1003014): Unknown Member [0] in Data Load, [1] Records Completed'
It seems that the data load rule can recognize the members except for the figures for Jan to Dec.
Thanks for your help
-
Hello All,
I have a job to load data from SQL Server to SAP BW. I have followed the steps from the SAP wiki to do this.
1. I have created an RFC connection between the two servers (SAP and the BODS Job Server).
When I schedule and start the job immediately from SAP BW, I get this error and it aborts the RFC connection:
"Error while executing the following command: -sJO return code:238"
Error message during processing in BI
Diagnosis
An error occurred in BI while processing the data. The error is documented in an error message.
System Response
A caller 01, 02 or equal to or greater than 20 contains an error message.
Further analysis:
The error message(s) was (were) sent by:
Scheduler
Any help would be appreciated......
Thanks
Praveen...
Hi Praveen,
I want to know which version of BODS you are using for your development of ETL jobs.
If it's BODS 12.2.2.2, then you will get this type of problem frequently, as in BODS 12.2.2.2 only two connections can be created while having an RFC between BW and BODS.
So I suggest that if you are using BODS 12.2.2.2, you upgrade it to BODS 12.2.0.0 with Service Pack 3 and Fix Pack 3, as in BODS 12.2.3.3 we have the option of ten parallel connections at a time, which helps in resolving this issue.
Please let me know what your BODS version is and, if you have upgraded your BODS to SP3 with FP3, whether your problem is resolved or not.
All the best..!!
Thanks ,
Shandilya Saurav -
Combining Data into one Cube from two Data-sources..
Dear Experts,
I am pulling data from two data sources and trying to combine them in one InfoCube. The data is like:
Data-Source 01
1. GUID --Common
2.Document No ( User Entry)
3.Dist. Channel
4.Transaction Type
5.Date and Quantity
Data-Source 02
1.GUID -- Common
2.Billing Document ( If User drill down according to Document No , Billing Document should come in the report )
3.Billing date
4.Net Value
Of these, the GUID is common between the 2 data sources. I was thinking that the data would fall into place accordingly, and that if I select the Document No in the report, it would automatically fetch all the data like transaction type, distribution channel, billing document no, and billing date.
The problem is, in the report the data is not coming out as I was thinking.
And another problem: in the future I need to create a MultiProvider between the above-mentioned InfoCube and one ODS, and DOCUMENT NO is common to the cube and the ODS.
Please Suggest,
How can I proceed for the following requirement.
Thanks,
Sanjana
Hi Sanjana,
In your case the cube will create a problem, because it will have multiple records. For example:
Data-Source 01 :
1. GUID -- 101
2.Document No - 999
3.Dist. Channel - DL
4.Transaction Type - GPRO
5.Date and Quantity - 20.02.2011 & 20
Data-Source 02
1.GUID -- 101
2.Billing Document - 6000
3.Billing date - 03.03.2011
4.Net Value - 500
Your cube will have 2 records, and your requirement is to show the above two records as 1 record in the report.
Why don't you make an ODS in between, where you can put GUID as the key field and all the other fields as data fields? Create 2 transformations to this DSO from the 2 datasources, and let it get updated one by one. Your DSO will have 1 record only. Now either do your reporting on this DSO or take the data on to the cube.
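To illustrate with the sample values above, the DSO keyed on GUID would then hold a single merged record, something like this (a sketch, not actual system output):

GUID  Document No  Dist. Channel  Tran. Type  Date / Quantity  Billing Doc  Billing Date  Net Value
101   999          DL             GPRO        20.02.2011 / 20  6000         03.03.2011    500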
Hope the above reply was helpful.
Kind Regards,
Ashutosh Singh
Edited by: Ashutosh Singh on May 19, 2011 1:34 PM -
How can I load iCal data into Calendar and Address Book data into Contacts?
I just bought a MacBook Pro running Yosemite. Migration Assistant could not read the Backup of my old, dead iMac, so I have been transferring data manually. So far Contacts and Calendar are the only Apps I have not been able to set up.
What OS was the iMac running? I ask because where data is stored can change.
-
Stored Procedure: Extract data into second cursor
This is a ready-to-run script with a commented-out, INCOMPLETE section of code where I need to select from a cursor.
--== PDF Document Table of Contents ==--
--== The PDF report contains period-to-date condiment and bun sales for a region,
--== for each week of a 4-week period, with week 6 representing PTD and week 7 representing
--== YTD. Data should be gathered in rowsets and pivoted on the client.
--==
--== Basic Functionality:
--== Write a stored procedure that performs the following:
--== 1) Select the report data into cursor 1, with the resultset ordered
--==    the way it will be printed to the report.
--== 2) Create a list of unique stores in a separate cursor.
--== Approach:
--== Create the Type object of the fields needed to build the table of contents for each row. Add a sort field
--==    to enable restoring the original order of the data after any sorting done on the client.
--== Create a table type of the row objects.
--== Declare 2 cursors:
--==    a) The first cursor holds the data for the PDF report to be pivoted by the client.
--==    b) The second should contain a table of contents (unique storenbr) in the
--==       same order as the stores in the first cursor.
--== Oracle version 10g v2 on W2K3
begin execute immediate 'drop type TYP_TBL_CWSR_TOC'; exception when others then null; end;
begin execute immediate 'drop type TYP_CWSR_TOC'; exception when others then null; end;
begin execute immediate 'drop procedure Create_Rpt_and_TOC'; exception when others then null; end;
create or replace TYPE TYP_CWSR_TOC AS OBJECT
( sortcol   number         --== probably not needed, just in case
, storenbr  varchar2(100)
, storename varchar2(200)
);
/
create or replace TYPE TYP_TBL_CWSR_TOC AS TABLE OF TYP_CWSR_TOC;
/
create or replace procedure create_rpt_and_toc
( pc_report_data OUT sys_refcursor
, pc_TOC         OUT sys_refcursor
)
AS
v_tblTOC TYP_TBL_CWSR_TOC;
v_rec TYP_CWSR_TOC := TYP_CWSR_TOC(NULL,NULL, NULL);
BEGIN
OPEN pc_report_data FOR
with sample_data as
( select 22 storeid , 1 week_nbr, 15942 net_sales, 372 buns, 176 condiments from dual union all
select 22 storeid , 6 week_nbr, 15942 net_sales, 372 buns, 176 condiments from dual union all
select 22 storeid , 7 week_nbr, 15942 net_sales, 372 buns, 176 condiments from dual union all
select 23 storeid , 1 week_nbr, 25302 net_sales, 481 buns, 221 condiments from dual union all
select 23 storeid , 6 week_nbr, 25302 net_sales, 481 buns, 221 condiments from dual union all
select 23 storeid , 7 week_nbr, 25302 net_sales, 481 buns, 221 condiments from dual union all
select 24 storeid , 1 week_nbr, 29347 net_sales, 598 buns, 238 condiments from dual union all
select 24 storeid , 6 week_nbr, 29347 net_sales, 598 buns, 238 condiments from dual union all
select 24 storeid , 7 week_nbr, 29347 net_sales, 598 buns, 238 condiments from dual union all
select 25 storeid , 1 week_nbr, 17637 net_sales, 360 buns, 165 condiments from dual union all
select 25 storeid , 6 week_nbr, 17637 net_sales, 360 buns, 165 condiments from dual union all
select 25 storeid , 7 week_nbr, 17637 net_sales, 360 buns, 165 condiments from dual union all
select 27 storeid , 1 week_nbr, 22010 net_sales, 405 buns, 172 condiments from dual union all
select 27 storeid , 6 week_nbr, 22010 net_sales, 405 buns, 172 condiments from dual union all
select 27 storeid , 7 week_nbr, 22010 net_sales, 405 buns, 172 condiments from dual union all
select 31 storeid , 1 week_nbr, 16836 net_sales, 345 buns, 168 condiments from dual union all
select 31 storeid , 6 week_nbr, 16836 net_sales, 345 buns, 168 condiments from dual union all
select 31 storeid , 7 week_nbr, 16836 net_sales, 345 buns, 168 condiments from dual union all
select 38 storeid , 1 week_nbr, 28244 net_sales, 524 buns, 247 condiments from dual union all
select 38 storeid , 6 week_nbr, 28244 net_sales, 524 buns, 247 condiments from dual union all
select 38 storeid , 7 week_nbr, 28244 net_sales, 524 buns, 247 condiments from dual union all
select 39 storeid , 1 week_nbr, 21011 net_sales, 407 buns, 238 condiments from dual union all
select 39 storeid , 6 week_nbr, 21011 net_sales, 407 buns, 238 condiments from dual union all
select 39 storeid , 7 week_nbr, 21011 net_sales, 407 buns, 238 condiments from dual union all
select 41 storeid , 1 week_nbr, 18026 net_sales, 430 buns, 179 condiments from dual union all
select 41 storeid , 6 week_nbr, 18026 net_sales, 430 buns, 179 condiments from dual union all
select 41 storeid , 7 week_nbr, 18026 net_sales, 430 buns, 179 condiments from dual union all
select 42 storeid , 1 week_nbr, 24821 net_sales, 466 buns, 212 condiments from dual union all
select 42 storeid , 6 week_nbr, 24821 net_sales, 466 buns, 212 condiments from dual union all
select 42 storeid , 7 week_nbr, 24821 net_sales, 466 buns, 212 condiments from dual union all
select 65 storeid , 1 week_nbr, 13356 net_sales, 281 buns, 136 condiments from dual union all
select 65 storeid , 6 week_nbr, 13356 net_sales, 281 buns, 136 condiments from dual union all
select 65 storeid , 7 week_nbr, 13356 net_sales, 281 buns, 136 condiments from dual union all
select 66 storeid , 1 week_nbr, 15421 net_sales, 337 buns, 155 condiments from dual union all
select 66 storeid , 6 week_nbr, 15421 net_sales, 337 buns, 155 condiments from dual union all
select 66 storeid , 7 week_nbr, 15421 net_sales, 337 buns, 155 condiments from dual union all
select 67 storeid , 1 week_nbr, 28064 net_sales, 625 buns, 283 condiments from dual union all
select 67 storeid , 6 week_nbr, 28064 net_sales, 625 buns, 283 condiments from dual union all
select 67 storeid , 7 week_nbr, 28064 net_sales, 625 buns, 283 condiments from dual union all
select 68 storeid , 1 week_nbr, 22875 net_sales, 493 buns, 238 condiments from dual union all
select 68 storeid , 6 week_nbr, 22875 net_sales, 493 buns, 238 condiments from dual union all
select 68 storeid , 7 week_nbr, 22875 net_sales, 493 buns, 238 condiments from dual union all
select 70 storeid , 1 week_nbr, 26434 net_sales, 562 buns, 248 condiments from dual union all
select 70 storeid , 6 week_nbr, 26434 net_sales, 562 buns, 248 condiments from dual union all
select 70 storeid , 7 week_nbr, 26434 net_sales, 562 buns, 248 condiments from dual union all
select 71 storeid , 1 week_nbr, 14259 net_sales, 297 buns, 133 condiments from dual union all
select 71 storeid , 6 week_nbr, 14259 net_sales, 297 buns, 133 condiments from dual union all
select 71 storeid , 7 week_nbr, 14259 net_sales, 297 buns, 133 condiments from dual union all
select 82 storeid , 1 week_nbr, 24446 net_sales, 469 buns, 210 condiments from dual union all
select 82 storeid , 6 week_nbr, 24446 net_sales, 469 buns, 210 condiments from dual union all
select 82 storeid , 7 week_nbr, 24446 net_sales, 469 buns, 210 condiments from dual union all
select 83 storeid , 1 week_nbr, 13959 net_sales, 280 buns, 104 condiments from dual union all
select 83 storeid , 6 week_nbr, 13959 net_sales, 280 buns, 104 condiments from dual union all
select 83 storeid , 7 week_nbr, 13959 net_sales, 280 buns, 104 condiments from dual union all
select 181 storeid , 1 week_nbr, 13140 net_sales, 273 buns, 136 condiments from dual union all
select 181 storeid , 6 week_nbr, 13140 net_sales, 273 buns, 136 condiments from dual union all
select 181 storeid , 7 week_nbr, 13140 net_sales, 273 buns, 136 condiments from dual union all
select 221 storeid , 1 week_nbr, 27347 net_sales, 546 buns, 289 condiments from dual union all
select 221 storeid , 6 week_nbr, 27347 net_sales, 546 buns, 289 condiments from dual union all
select 221 storeid , 7 week_nbr, 27347 net_sales, 546 buns, 289 condiments from dual union all
select 222 storeid , 1 week_nbr, 16456 net_sales, 379 buns, 148 condiments from dual union all
select 222 storeid , 6 week_nbr, 16456 net_sales, 379 buns, 148 condiments from dual union all
select 222 storeid , 7 week_nbr, 16456 net_sales, 379 buns, 148 condiments from dual union all
select 223 storeid , 1 week_nbr, 20611 net_sales, 439 buns, 165 condiments from dual union all
select 223 storeid , 6 week_nbr, 20611 net_sales, 439 buns, 165 condiments from dual union all
select 223 storeid , 7 week_nbr, 20611 net_sales, 439 buns, 165 condiments from dual union all
select 224 storeid , 1 week_nbr, 21537 net_sales, 420 buns, 173 condiments from dual union all
select 224 storeid , 6 week_nbr, 21537 net_sales, 420 buns, 173 condiments from dual union all
select 224 storeid , 7 week_nbr, 21537 net_sales, 420 buns, 173 condiments from dual union all
select 260 storeid , 1 week_nbr, 19329 net_sales, 380 buns, 196 condiments from dual union all
select 260 storeid , 6 week_nbr, 19329 net_sales, 380 buns, 196 condiments from dual union all
select 260 storeid , 7 week_nbr, 19329 net_sales, 380 buns, 196 condiments from dual union all
select 280 storeid , 1 week_nbr, 20692 net_sales, 512 buns, 202 condiments from dual union all
select 280 storeid , 6 week_nbr, 20692 net_sales, 512 buns, 202 condiments from dual union all
select 280 storeid , 7 week_nbr, 20692 net_sales, 512 buns, 202 condiments from dual union all
select 294 storeid , 1 week_nbr, 26522 net_sales, 481 buns, 252 condiments from dual union all
select 294 storeid , 6 week_nbr, 26522 net_sales, 481 buns, 252 condiments from dual union all
select 294 storeid , 7 week_nbr, 26522 net_sales, 481 buns, 252 condiments from dual union all
select 362 storeid , 1 week_nbr, 20611 net_sales, 317 buns, 221 condiments from dual union all
select 362 storeid , 6 week_nbr, 20611 net_sales, 317 buns, 221 condiments from dual union all
select 362 storeid , 7 week_nbr, 20611 net_sales, 317 buns, 221 condiments from dual union all
select 501 storeid , 1 week_nbr, 28337 net_sales, 518 buns, 273 condiments from dual union all
select 501 storeid , 6 week_nbr, 28337 net_sales, 518 buns, 273 condiments from dual union all
select 501 storeid , 7 week_nbr, 28337 net_sales, 518 buns, 273 condiments from dual union all
select 521 storeid , 1 week_nbr, 26118 net_sales, 438 buns, 257 condiments from dual union all
select 521 storeid , 6 week_nbr, 26118 net_sales, 438 buns, 257 condiments from dual union all
select 521 storeid , 7 week_nbr, 26118 net_sales, 438 buns, 257 condiments from dual union all
select 524 storeid , 1 week_nbr, 31929 net_sales, 582 buns, 247 condiments from dual union all
select 524 storeid , 6 week_nbr, 31929 net_sales, 582 buns, 247 condiments from dual union all
select 524 storeid , 7 week_nbr, 31929 net_sales, 582 buns, 247 condiments from dual
, store_data as
(
select 27 storeid, 'County Gate' storename , '5601' storenbr , 'R1-Roosevelt' regionname , 'D11-Wilcox' districtname , 'VMS' companyname from dual union all
select 67 storeid, 'N. Jackson' storename , '0177' storenbr , 'R1-Roosevelt' regionname , 'D14-Sandus' districtname , 'VMS' companyname from dual union all
select 68 storeid, 'Dyersburg' storename , '0277' storenbr , 'R1-Roosevelt' regionname , 'D14-Sandus' districtname , 'VMS' companyname from dual union all
select 280 storeid, 'Poplar Ave.' storename , '3080' storenbr , 'R1-Roosevelt' regionname , 'D12-Smart' districtname , 'VMS' companyname from dual union all
select 294 storeid, 'Goodman Rd' storename , '5702' storenbr , 'R1-Roosevelt' regionname , 'D12-Smart' districtname , 'VMS' companyname from dual union all
select 25 storeid, 'Germantown' storename , '5094' storenbr , 'R1-Roosevelt' regionname , 'D11-Wilcox' districtname , 'VMS' companyname from dual union all
select 181 storeid, 'Mendehall' storename , '4090' storenbr , 'R1-Roosevelt' regionname , 'D12-Smart' districtname , 'VMS' companyname from dual union all
select 31 storeid, 'Winchester' storename , '2684' storenbr , 'R1-Roosevelt' regionname , 'D11-Wilcox' districtname , 'VMS' companyname from dual union all
select 41 storeid, 'Washington' storename , '4190' storenbr , 'R1-Roosevelt' regionname , 'D13-Bowser' districtname , 'VMS' companyname from dual union all
select 42 storeid, 'Cordova' storename , '4393' storenbr , 'R1-Roosevelt' regionname , 'D13-Bowser' districtname , 'VMS' companyname from dual union all
select 70 storeid, 'S. Jackson' storename , '0679' storenbr , 'R1-Roosevelt' regionname , 'D14-Sandus' districtname , 'VMS' companyname from dual union all
select 221 storeid, 'Jackson' storename , '5500' storenbr , 'R1-Roosevelt' regionname , 'D14-Sandus' districtname , 'VMS' companyname from dual union all
select 223 storeid, 'Highway 51' storename , '3485' storenbr , 'R1-Roosevelt' regionname , 'D14-Sandus' districtname , 'VMS' companyname from dual union all
select 66 storeid, 'New Summer' storename , '2980' storenbr , 'R1-Roosevelt' regionname , 'D15-Rickard' districtname , 'VMS' companyname from dual union all
select 82 storeid, 'Navy Road' storename , '1476' storenbr , 'R1-Roosevelt' regionname , 'D15-Rickard' districtname , 'VMS' companyname from dual union all
select 224 storeid, 'New Covington' storename , '5397' storenbr , 'R1-Roosevelt' regionname , 'D15-Rickard' districtname , 'VMS' companyname from dual union all
select 501 storeid, 'Kirby Quince' storename , '6504' storenbr , 'R1-Roosevelt' regionname , 'D11-Wilcox' districtname , 'VMS' companyname from dual union all
select 22 storeid, 'Wchstr/Good' storename , '2385' storenbr , 'R1-Roosevelt' regionname , 'D11-Wilcox' districtname , 'VMS' companyname from dual union all
select 23 storeid, 'Union Ave' storename , '1275' storenbr , 'R1-Roosevelt' regionname , 'D13-Bowser' districtname , 'VMS' companyname from dual union all
select 24 storeid, 'West Poplar' storename , '4290' storenbr , 'R1-Roosevelt' regionname , 'D11-Wilcox' districtname , 'VMS' companyname from dual union all
select 222 storeid, 'Thomas St.' storename , '1977' storenbr , 'R1-Roosevelt' regionname , 'D15-Rickard' districtname , 'VMS' companyname from dual union all
select 362 storeid, 'Wolfchase' storename , '5802' storenbr , 'R1-Roosevelt' regionname , 'D13-Bowser' districtname , 'VMS' companyname from dual union all
select 524 storeid, 'Houston Levee' storename , '6705' storenbr , 'R1-Roosevelt' regionname , 'D15-Rickard' districtname , 'VMS' companyname from dual union all
select 521 storeid, 'G-Town/I-40' storename , '6604' storenbr , 'R1-Roosevelt' regionname , 'D15-Rickard' districtname , 'VMS' companyname from dual union all
select 38 storeid, 'Horn Lake' storename , '4994' storenbr , 'R1-Roosevelt' regionname , 'D12-Smart' districtname , 'VMS' companyname from dual union all
select 39 storeid, 'Macon/Syc' storename , '2885' storenbr , 'R1-Roosevelt' regionname , 'D13-Bowser' districtname , 'VMS' companyname from dual union all
select 65 storeid, 'Poplar/Fenwick' storename , '2581' storenbr , 'R1-Roosevelt' regionname , 'D13-Bowser' districtname , 'VMS' companyname from dual union all
select 71 storeid, 'Humboldt' storename , '0785' storenbr , 'R1-Roosevelt' regionname , 'D14-Sandus' districtname , 'VMS' companyname from dual union all
select 83 storeid, 'Mt. Moriah' storename , '1174' storenbr , 'R1-Roosevelt' regionname , 'D15-Rickard' districtname , 'VMS' companyname from dual union all
select 260 storeid, 'Getwell' storename , '1576' storenbr , 'R1-Roosevelt' regionname , 'D12-Smart' districtname , 'VMS' companyname from dual
)
select decode(gc,0,companyname, 'VanderbiltFoods') as companyname
, decode(gr,0,regionname,decode(gc,0,companyname, 'VanderbiltFoods')) as regionname
, decode(gd,0,districtname,decode(gr,0,regionname,decode(gc,0,companyname, 'VanderbiltFoods'))) as districtname
, decode(gs,0,storenbr,decode(gd,0,districtname,decode(gr,0,regionname,decode(gc,0,companyname, 'VanderbiltFoods')))) as storenbr
, decode(gs,0,storename,decode(gd,0,districtname,decode(gr,0,regionname,decode(gc,0,companyname, 'VanderbiltFoods')))) as storename
, net_sales
, buns
, condiments
from ( select companyname
, grouping(companyname) gc
, regionname
, grouping(regionname) gr
, districtname
, grouping(districtname) gd
, storenbr
, grouping(storenbr) gs
, max(storename) storename
, sum(net_sales) net_sales
, sum(buns) buns
, sum(condiments) condiments
from store_data stdata
inner join sample_data sampdata on sampdata.storeid = stdata.storeid
group by rollup(companyname, regionname, districtname, storenbr), week_nbr
order by companyname nulls first, gc desc, regionname nulls first, gr desc, districtname nulls first, gd desc, storenbr nulls first, gs desc
);
/* --== INCOMPLETE CODE --
--== GET TABLE OF CONTENTS In same order as first cursor
open pc_report_data for
select rownum as sortcol
, storenbr
, storename)
BULK COLLECT INTO pc_TOC
*/
END create_rpt_and_toc;
I don't know SQL Developer well enough to view cursor results from a stored procedure,
but here is test code from the debugger window:
DECLARE
PC_REPORT_DATA sys_refcursor;
PC_TOC sys_refcursor;
BEGIN
CREATE_RPT_AND_TOC(
PC_REPORT_DATA => PC_REPORT_DATA,
PC_TOC => PC_TOC
);
-- Modify the code to output the variable
-- DBMS_OUTPUT.PUT_LINE('PC_REPORT_DATA = ' || PC_REPORT_DATA);
-- Modify the code to output the variable
-- DBMS_OUTPUT.PUT_LINE('PC_TOC = ' || PC_TOC);
END;
I am currently doing this in the presentation layer, but it is a lot cleaner and easier to maintain if handled in the DB.
I'd googled pages that suggested this was possible. I had decided on the FETCH, but I've been trying to avoid FETCH and LOOP wherever possible.
One example I found (I always try so much stuff, I forget where I got the idea) was:
-- pseudoscript
FETCH outer cursor.
Select outer cursor
open second cursor for select into and CLOSE Cursor.
LOOP
I also found this:
create or replace procedure testproc(c_test out sys_refcursor) is
begin
open c_test for select first_name, last_name, email from employees where rownum < 10;
end;
Here though, it's a simple select from a table vs. a cursor.
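For the TOC cursor itself, a rough, untested sketch of the shape I have in mind: a ref cursor can't be a BULK COLLECT target, so the procedure would just open pc_TOC over a second query. Here store_data stands for the same inline view the first cursor builds, so the with-clause would have to be repeated or materialized:

OPEN pc_TOC FOR
  SELECT ROWNUM AS sortcol, storenbr, storename
    FROM (SELECT DISTINCT storenbr, storename
            FROM store_data
           ORDER BY storenbr);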
I thought it was worth asking the question.
-
How to insert more than 32k xml data into oracle clob column
how to insert more than 32k xml data into oracle clob column.
The XML data is coming from the Java front end.
If we cannot use a CLOB, then what are the different options available?
Are you facing any issue with my code?
A string literal size error will come when you try to insert the full XML in string format.
public static boolean writeCLOBData(String tableName, String id, String columnName, String strContents) throws DataAccessException {
    boolean isUpdated = true;
    Connection connection = null;
    try {
        connection = ConnectionManager.getConnection();
        //connection.setAutoCommit(false);
        PreparedStatement PREPARE_STATEMENT = null;
        String sqlQuery = "UPDATE " + tableName + " SET " + columnName + " = ? WHERE ID =" + id;
        PREPARE_STATEMENT = connection.prepareStatement(sqlQuery);
        // converting string to reader stream
        Reader reader = new StringReader(strContents);
        PREPARE_STATEMENT.setClob(1, reader);
        // returns false after updating the clob data in the DB
        isUpdated = PREPARE_STATEMENT.execute();
        PREPARE_STATEMENT.close();
    } catch (SQLException e) {
        e.printStackTrace();
    } finally {
        return isUpdated;
    }
}
Try this JAVA code.
-
How to put the data into cache and distribute to nodeusing oracle coherence
Hi Friends,
I have some random-number data being written into a file. From that file I am reading the data, and I want to put it into a cache. How can I put the data into the cache and partition it across different nodes (machines) to calculate things like the S.D., variance, etc. (or how can I implement Monte Carlo using Oracle Coherence)? If anyone knows, please suggest a flow.
Thank you.
regards
chandra
Hi Robert,
I have some bulk data in an ArrayList or object format that I want to put into the cache, but I am not able to.
I am using the put method like cache.put(Object key, Object value), but it's not allowing me to put the data into the cache.
Can you please help me? I'm sending my code; please go through it and tell me where I made a mistake.
package lab3;
import com.tangosol.net.CacheFactory;
import com.tangosol.net.NamedCache;
import com.tangosol.net.cache.NearCache;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.PrintWriter;
import java.util.ArrayList;
import java.util.List;
import java.util.Random;
import java.util.Scanner;

public class BlockScoleData {
    /*
     * @param args
     * s = the spot market price
     * x = the exercise price of the option
     * v = instantaneous standard deviation of s
     * r = risk-free instantaneous rate of interest
     * t = time to expiration of the option
     * n = number of MC simulations
     */
    private static String outputFile = "D:/cache1/sampledata2.txt";
    private static String inputFile = "D:/cache1/sampledata2.txt";
    NearCache cache;
    List<Credit> creditList = new ArrayList<Credit>();

    public void writeToFile(int noofsamples) {
        Random rnd = new Random();
        PrintWriter writer = null;
        try {
            writer = new PrintWriter(outputFile);
            for (int i = 1; i <= noofsamples; i++) {
                double s = rnd.nextInt(200) * rnd.nextDouble();
                //double x = rnd.nextInt(250) * rnd.nextDouble();
                int t = rnd.nextInt(5);
                double v = rnd.nextDouble();
                double r = rnd.nextDouble() / 10;
                //int n = rnd.nextInt(90000);
                writer.println(s + " " + t + " " + v + " " + r);
            }
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } finally {
            writer.close();
            writer = null;
        }
    }

    public List<Credit> readFromFile() {
        Scanner scanner = null;
        Credit credit = null;
        try {
            scanner = new Scanner(new File(inputFile));
            while (scanner.hasNext()) {
                credit = new Credit(scanner.nextDouble(), scanner.nextInt(),
                                    scanner.nextDouble(), scanner.nextDouble());
                creditList.add(credit);
            }
            System.out.println("read the list from file:" + creditList);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } finally {
            scanner.close();
            credit = null;
            scanner = null;
        }
        return creditList;
    }

    // public void putCache(String cachename, List<Credit> list) {
    //     cache = CacheFactory.getCache("VirtualCache");
    //     List<Credit> rand = new ArrayList<Credit>();
    // }

    public Object put(Object key, Object value) {
        cache = (NearCache) CacheFactory.getCache("mycache");
        String cachename = cache.getCacheName();
        List<Credit> cachelist = new ArrayList<Credit>();
        // cachelist = (List<Credit>) cache.put(creditList, creditList);
        cache.put(creditList, creditList);
        System.out.println("read to the cache list from file:" + cache.get(creditList));
        return cachelist;
    }

    public static void main(String[] args) throws Exception {
        NearCache cache = (NearCache) CacheFactory.getCache("mycache");
        new BlockScoleData().writeToFile(20);
        //new BlockScoleData().putCache("Name", ...);
        System.out.println("New file \"myfile.csv\" has been created in the current directory");
        CacheFactory.ensureCluster();
        new BlockScoleData().readFromFile();
        System.out.println("data read from file successfully");
        List<Credit> creditList = new ArrayList<Credit>();
        new BlockScoleData().put(creditList, creditList);
        System.out.println("read to the cache list from file:" + cache.get(creditList));
        //System.out.println("name of cache is :" + mycache.getCacheName());
        //System.out.println("value in cache is :" + mycache.get("Name"));
        //System.out.println("cache services are :" + mycache.getCacheService());
    }
}
regards
chandra
-
I can not put the data of a field(LONG RAW) consulted into a item of a data block
I want to query a field that is LONG RAW (it's an image) and put the data into an item of a data block in Forms Builder. When I write ":BLOCK.FOTO := vfoto" I receive the error "bad bind variable ....". How can I take the data from a field of my DB and put it into an item of a data block in Forms Builder?
I can store an image in a table of my DB, but I want to query an image stored in my DB and put it into another table, all of this in Forms Builder.
You have to base the image item on a base table and use EXECUTE_QUERY on that block. You can't do a direct select in PL/SQL into the item.
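A minimal sketch of that suggestion, assuming the image item sits on a base-table block named FOTO_BLOCK (the block and trigger names are hypothetical):

-- e.g. in a WHEN-BUTTON-PRESSED trigger:
GO_BLOCK('FOTO_BLOCK');
EXECUTE_QUERY;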
-
How to Get the Current data into Planning Layout from the Planning cube
Hi,
I have a problem in BPS. I am selecting the data from the cube based on month and org details, but I want to see the latest data. How can I get this data into the planning cube?
Like this data:
Tran  Cal Month  Org  Amt
1     jan        a    100
1     feb        a    200
If I want to read based on Tran and Org as input values, I should get the data below, but I am getting the previous data:
Tran  Cal Month  Org  Amt
1     feb        a    200
Kindly help me in this regard
Thanks
Naveen
Naveen,
Are you having an issue where, when you save something in the layout, the data doesn't appear in the listcube? Or an issue where the latest data you see in the cube doesn't appear in the layout?
For the former, please look at your listcube selections; for the latter, please check your planning level definition and make sure all the restrictions you have applied are valid for this latest data to be presented in the layout.
Hope this helps.
Cheers
Srini -
Hello all,
I want to create a BSP page with 2 radio buttons.
If the 1st is selected, I want to read 2 tables and display the data on another BSP page.
If the 2nd one is selected, I want to download the 2 tables' data into an Excel file.
Can anyone help with how to download the data into an Excel file and display the data on another BSP page?
Thanks,
Regards,
Venkat
For downloading to a spreadsheet:
Convert your fetched data to a string, convert that to XSTRING format, and finally download it to a spreadsheet.
Sample: I selected data from vbak say vbeln and kunnr in an internal table.
data : v_string type string, v_xstring type xstring.
<header portion - to have the heading in the spread sheet>
concatenate 'Order Number' 'Customer'
cl_abap_char_utilities=>cr_lf into v_string
separated by cl_abap_char_utilities=>horizontal_tab.
<Line item entries from my internal table say i_vbak>
loop at i_vbak into wa_vbak.
concatenate v_string wa_vbak-vbeln wa_vbak-kunnr
cl_abap_char_utilities=>cr_lf into v_string
separated by cl_abap_char_utilities=>horizontal_tab.
endloop.
All the data is now in string format. Calling the FM to convert to XSTRING
call function 'SCMS_STRING_TO_XSTRING'
exporting
text = v_string
mimetype = 'APPLICATION/MSEXCEL; charset=utf-16le' (probably create this in a variable and call here)
importing
buffer = v_xstring.
concatenate cl_abap_char_utilities=>byte_order_mark_little v_xstring into v_xstring in byte mode.
Now we have it in XSTRING format - this can be downloaded to a spread sheet.
v_appl = 'APPLICATION/MSEXCEL; charset=utf-16le'.
runtime->server->response->set_header_field( name = 'content-type' value = v_appl ).
Eliminating the cache problems when loading Excel Format
runtime->server->response->delete_header_field( name = if_http_header_fields=>cache_control ).
runtime->server->response->delete_header_field( name = if_http_header_fields=>expires ).
runtime->server->response->delete_header_field( name = if_http_header_fields=>pragma ).
Start excel in a separate window
runtime->server->response->set_header_field( name = 'content-disposition' value = 'attachment; filename=Order_list.xls' ).
Displaying the data in excel.
v_len = xstrlen( v_xstring ).
runtime->server->response->set_data( data = v_xstring length = v_len ).
navigation->response_complete( ).
All the above code can be written in onInputprocessing event (probably your loop/selection can be in a method of your appl class).
I believe you are triggering the event based on a click (say radio button or a button after selecting the radiobutton).
In the other screen you can use tableview to display your data - probably two sub screens(page fragment) to display each table.
Regds,
Krish
-
Importing Data into Sql Server 2012 from Excel Data
Hi,
I got errors like this when I was importing data into SQL Server from Excel. Can you please help us?
- Executing (Error)
Messages
Error 0xc020901c: Data Flow Task 1: There was an error with Source - demotable$.Outputs[Excel Source Output].Columns[Comment] on Source - demotable$.Outputs[Excel Source Output]. The column status returned was: "Text was truncated or one
or more characters had no match in the target code page.".
(SQL Server Import and Export Wizard)
Error 0xc020902a: Data Flow Task 1: The "Source - demotable$.Outputs[Excel Source Output].Columns[Comment]" failed because truncation occurred, and the truncation row disposition on "Source - demotable$.Outputs[Excel Source Output].Columns[Comment]"
specifies failure on truncation. A truncation error occurred on the specified object of the specified component.
(SQL Server Import and Export Wizard)
Error 0xc0047038: Data Flow Task 1: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on Source - demotable$ returned error code 0xC020902A. The component returned a failure code when the pipeline engine called PrimeOutput().
The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
(SQL Server Import and Export Wizard)
Are you attempting to import into a newly made table or into an existing table? It looks like it's trying to insert data where it cannot be inserted (an invalid column or insufficient data size in your column).
Try the following:
1) In your Excel sheet, highlight the whole sheet, make sure the cells are in 'text' form, and try re-importing.
2) Save the document as MS-DOS text and import it as a text document.
3) Double-check that your columns are correct for the data. For example, if you have a column that holds a string of 100 characters and your column is NVARCHAR(90), that might cause the error; if so, correct the data type in your column.
4) If that doesn't work and you're inserting into a new table, try importing everything as strings first and writing a query to insert the columns that should be float/integer or whatever. You may want to convert float text to a BIGINT first rather than going string to float, as that can cause problems if I remember correctly.
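For point 3), a hypothetical example of widening the target column before re-running the import (the table and column names echo the error message above and are assumptions):

ALTER TABLE demotable ALTER COLUMN Comment NVARCHAR(MAX);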