Create table issue
Hi,
I am creating a table from a SELECT statement. The row order from the SELECT query changes after the table is created. For example, the 5th record in the SELECT query appears as the 6th record in the newly created table.
Could anyone help me?
Thanks
Hi,
849592 wrote:
Hi,
I am creating a table from a SELECT statement. The row order from the SELECT query changes after the table is created. For example, the 5th record in the SELECT query appears as the 6th record in the newly created table.
Could anyone help me?
Thanks
Sorry, I can't figure out what you're asking.
Whenever you have a question, post a complete test script that the people who want to help you can run to re-create the problem and test their ideas. Include a little sample data (CREATE TABLE and INSERT statements, relevant columns only) for all the tables involved, and the results you want from that data.
Explain, using specific examples, how you get those results from that data.
Always say what version of Oracle you're using.
See the forum FAQ {message:id=9360002}
What do you mean by "the row number order in select query"? The order of rows in a query's output is determined by its ORDER BY clause. If you don't have an ORDER BY clause, then you can't be sure of the order of the output. There is no way to set a default order for a table. (You can do this for a view, but it's usually not a very good idea.)
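To make the point concrete (table and column names here are illustrative, not from the original post): rows in a table have no guaranteed order, so the only reliable ordering is the one you request at query time:

```sql
-- Row order of the source query is NOT preserved in the new table.
CREATE TABLE emp_copy AS
  SELECT * FROM emp;

-- A deterministic order must be asked for explicitly, every time:
SELECT * FROM emp_copy
ORDER BY empno;
```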
Similar Messages
-
Hi all,
I'm brand new to SQL, so I'm working my way through Sam's Teach Yourself SQL in 24 hours.
I'm using Microsoft SQL Server Management Studio
12.0.2000.8
I'm in Chapter 3 of the book, where we are supposed to create our first table, and I'm having issues.
Here is the statement we're supposed to use:
Create table EMPLOYEE_TBL
( emp_id       varchar(9)   not null,
  emp_name     varchar(40)  not null,
  emp_st_addr  varchar(20)  not null,
  emp_city     varchar(15)  not null,
  emp_st       char(2)      not null,
  emp_zip      integer(5)   not null,
  emp_phone    integer(10)  null,
  emp_pager    integer(10)  null )
It gives me this error:
Msg 2716, Level 16, State 1, Line 1
Column, parameter, or variable #6: Cannot specify a column width on data type int.
This seems to be saying that I have an INT data type on Line 1 that I'm trying to set a width on, but I don't see that.
What am I doing wrong?
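For reference, the error is pointing at the sixth column, emp_zip: SQL Server's integer type takes no width specifier, so integer(5) and integer(10) are rejected. A corrected sketch (assuming the book's column list; character types are arguably better for zip and phone numbers anyway, since those can have leading zeros):

```sql
CREATE TABLE EMPLOYEE_TBL (
  emp_id      VARCHAR(9)  NOT NULL,
  emp_name    VARCHAR(40) NOT NULL,
  emp_st_addr VARCHAR(20) NOT NULL,
  emp_city    VARCHAR(15) NOT NULL,
  emp_st      CHAR(2)     NOT NULL,
  emp_zip     CHAR(5)     NOT NULL,  -- int takes no width; CHAR keeps leading zeros
  emp_phone   CHAR(10)    NULL,
  emp_pager   CHAR(10)    NULL
);
```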
Thanks!
I'm not sure why the forum isn't displaying that statement correctly. It should be:
Create table EMPLOYEE_TBL
( emp_id       varchar(9)   not null,
  emp_name     varchar(40)  not null,
  emp_st_addr  varchar(20)  not null,
  emp_city     varchar(15)  not null,
  emp_st       char(2)      not null,
  emp_zip      integer(5)   not null,
  emp_phone    integer(10)  null,
  emp_pager    integer(10)  null ) -
Performance issue Create table as select BLOB
Hi!
I have a performance issue when moving BLOBs between tables (the image files range from 2 MB to 10 MB in size).
I'm using the following statement, for example:
"Create table tmp_blob as select * from table_blob
where blob_id = 333;"
Are there any hints I can give when moving data like this, or is Oracle 10g better with BLOBs?
Did you find a resolution to this issue?
We are also having the same issue, and wondering if there is a faster mechanism to copy LOBs between two tables. -
CREATE TABLE AS - PERFORMANCE ISSUE
Hi All,
I am creating a table CONTROLDATA from existing tables PF_CONTROLDATA & ICDSV2_AutoCodeDetails as per the below query.
CREATE TABLE CONTROLDATA AS
SELECT CONTROLVALUEID, VALUEORDER, CONTEXTID, AUDITORDER, INVALIDATINGREVISIONNUMBER, CONTROLID, STRVALUE
FROM PF_CONTROLDATA CD1
JOIN ICDSV2_AutoCodeDetails AC ON (CD1.CONTROLID=AC.MODTERM_CONTROL OR CD1.CONTROLID=AC.FAILED_CTRL OR CD1.CONTROLID=AC.CODE_CTRL)
AND CD1.AUDITORDER=(SELECT MAX(AUDITORDER) FROM PF_CONTROLDATA CD2 WHERE CD1.CONTEXTID=CD2.CONTEXTID);
The above statement takes around 10 minutes to create the table CONTROLDATA, which is not acceptable in our environment. Can anyone please suggest a way to improve the performance of the above query so that CONTROLDATA is created in under a minute?
PF_CONTROLDATA has 15,000,000 (15 million) rows, a composite index (XIF16PF_CONTROLDATA) on the CONTEXTID and AUDITORDER columns, and one more index (XIE1PF_CONTROLDATA) on the CONTROLID column.
ICDSV2_AutoCodeDetails has only 6 rows and no indexes.
After the CREATE TABLE statement, CONTROLDATA will have around 1,000,000 (1 million) records.
Can someone give any suggestion to improve the performance of the above query?
Oracle version is 10.2.0.3.
Tkprof output is:
create table CONTROLDATA2 as
SELECT CONTROLVALUEID, VALUEORDER, CONTEXTID, AUDITORDER, INVALIDATINGREVISIONNUMBER, CONTROLID, DATATYPE, NUMVALUE, FLOATVALUE, STRVALUE, PFDATETIME, MONTH, DAY, YEAR, HOUR, MINUTE, SECOND, UNITID, NORMALIZEDVALUE, NORMALIZEDUNITID, PARENTCONTROLVALUEID, PARENTVALUEORDER
FROM PF_CONTROLDATA CD1
JOIN ICDSV2_AutoCodeDetails AC ON (CD1.CONTROLID=AC.MODTERM_CONTROL OR CD1.CONTROLID=AC.FAILED_CTRL OR CD1.CONTROLID=AC.CODE_CTRL OR CD1.CONTROLID=AC.SYNONYM_CTRL)
AND AUDITORDER=(SELECT MAX(AUDITORDER) FROM PF_CONTROLDATA CD2 WHERE CD1.CONTEXTID=CD2.CONTEXTID)
call count cpu elapsed disk query current rows
Parse 1 0.00 0.03 2 2 0 0
Execute 1 15.25 593.43 211688 4990786 6617 1095856
Fetch 0 0.00 0.00 0 0 0 0
total 2 15.25 593.47 211690 4990788 6617 1095856
Misses in library cache during parse: 1
Optimizer mode: ALL_ROWS
Parsing user id: 40
********************************************************************************
Explain plan output is:
Plan hash value: 2771048406
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
| 0 | CREATE TABLE STATEMENT | | 1 | 105 | 3609K (1)| 14:02:20 |
| 1 | LOAD AS SELECT | CONTROLDATA2 | | | | |
|* 2 | FILTER | | | | | |
| 3 | TABLE ACCESS BY INDEX ROWID | PF_CONTROLDATA | 178K| 9228K| 55344 (1)| 00:12:55 |
| 4 | NESTED LOOPS | | 891K| 89M| 55344 (1)| 00:12:55 |
| 5 | TABLE ACCESS FULL | ICDSV2_AUTOCODEDETAILS | 5 | 260 | 4 (0)| 00:00:01 |
| 6 | BITMAP CONVERSION TO ROWIDS | | | | | |
| 7 | BITMAP OR | | | | | |
| 8 | BITMAP CONVERSION FROM ROWIDS| | | | | |
|* 9 | INDEX RANGE SCAN | XIE1PF_CONTROLDATA | | | 48 (3)| 00:00:01 |
| 10 | BITMAP CONVERSION FROM ROWIDS| | | | | |
|* 11 | INDEX RANGE SCAN | XIE1PF_CONTROLDATA | | | 48 (3)| 00:00:01 |
| 12 | BITMAP CONVERSION FROM ROWIDS| | | | | |
|* 13 | INDEX RANGE SCAN | XIE1PF_CONTROLDATA | | | 48 (3)| 00:00:01 |
| 14 | BITMAP CONVERSION FROM ROWIDS| | | | | |
|* 15 | INDEX RANGE SCAN | XIE1PF_CONTROLDATA | | | 48 (3)| 00:00:01 |
| 16 | SORT AGGREGATE | | 1 | 16 | | |
| 17 | FIRST ROW | | 1 | 16 | 3 (0)| 00:00:01 |
|* 18 | INDEX RANGE SCAN (MIN/MAX) | XIF16PF_CONTROLDATA | 1 | 16 | 3 (0)| 00:00:01 |
Predicate Information (identified by operation id):
2 - filter("AUDITORDER"= (SELECT MAX("AUDITORDER") FROM "PF_CONTROLDATA" "CD2" WHERE
"CD2"."CONTEXTID"=:B1))
9 - access("CD1"."CONTROLID"="AC"."MODTERM_CONTROL")
11 - access("CD1"."CONTROLID"="AC"."FAILED_CTRL")
13 - access("CD1"."CONTROLID"="AC"."CODE_CTRL")
15 - access("CD1"."CONTROLID"="AC"."SYNONYM_CTRL")
18 - access("CD2"."CONTEXTID"=:B1)
Note
- dynamic sampling used for this statement
********************************************************************************
I also tried changing the above logic to an INSERT statement with the APPEND hint, but it still takes the same time.
Please suggest.
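One rewrite worth testing (a sketch only, not validated against this schema): compute the per-CONTEXTID maximum with an analytic function, so PF_CONTROLDATA is read once instead of probing the correlated MAX subquery through the index for every candidate row:

```sql
CREATE TABLE controldata AS
SELECT controlvalueid, valueorder, contextid, auditorder,
       invalidatingrevisionnumber, controlid, strvalue
FROM (
       -- One pass over PF_CONTROLDATA; carry the group maximum along.
       SELECT cd.*,
              MAX(auditorder) OVER (PARTITION BY contextid) AS max_auditorder
       FROM   pf_controldata cd
     ) cd1
JOIN icdsv2_autocodedetails ac
  ON cd1.controlid IN (ac.modterm_control, ac.failed_ctrl, ac.code_ctrl)
WHERE cd1.auditorder = cd1.max_auditorder;
```

With only 6 rows in ICDSV2_AutoCodeDetails, a full scan of PF_CONTROLDATA with a hash join may well beat the nested-loops-plus-bitmap plan shown above; compare both with tkprof before committing to either.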
Edited by: 867546 on Jun 22, 2011 2:42 PM
Hi user2361373,
I tried using NOLOGGING as well, but it still takes the same amount of time. Please find the tkprof output below.
create table CONTROLDATA2 NOLOGGING as
SELECT CONTROLVALUEID, VALUEORDER, CONTEXTID, AUDITORDER, INVALIDATINGREVISIONNUMBER, CONTROLID, DATATYPE, NUMVALUE, FLOATVALUE, STRVALUE, PFDATETIME, MONTH, DAY, YEAR, HOUR, MINUTE, SECOND, UNITID, NORMALIZEDVALUE, NORMALIZEDUNITID, PARENTCONTROLVALUEID, PARENTVALUEORDER
FROM PF_CONTROLDATA CD1
JOIN ICDSV2_AutoCodeDetails AC ON (CD1.CONTROLID=AC.MODTERM_CONTROL OR CD1.CONTROLID=AC.FAILED_CTRL OR CD1.CONTROLID=AC.CODE_CTRL OR CD1.CONTROLID=AC.SYNONYM_CTRL)
AND AUDITORDER=(SELECT MAX(AUDITORDER) FROM PF_CONTROLDATA CD2 WHERE CD1.CONTEXTID=CD2.CONTEXTID)
call count cpu elapsed disk query current rows
Parse 1 0.03 0.03 2 2 0 0
Execute 1 13.98 598.54 211963 4990776 6271 1095856
Fetch 0 0.00 0.00 0 0 0 0
total 2 14.01 598.57 211965 4990778 6271 1095856
Misses in library cache during parse: 1
Optimizer mode: ALL_ROWS
Parsing user id: 40
********************************************************************************
Edited by: 867546 on Jun 22, 2011 3:09 PM
Edited by: 867546 on Jun 22, 2011 3:10 PM -
Primary Key Issue With Creating Tables
Hi,
I have the following scripts to create my two tables in Oracle 10g. I was wondering whether or not I had correctly set up my primary key, or if there is a better way of doing it than the way I have.
Here are my two scripts for my tables:
CREATE TABLE CUSTOMER (
fname varchar2(15) NOT NULL,
lname varchar2(20) NOT NULL,
age varchar2(3) NOT NULL,
custAreaCode number(5) NOT NULL,
custNumber number(6) NOT NULL,
constraint cust_pk PRIMARY KEY (custAreaCode),
constraint cust_pk2 PRIMARY KEY (custNumber),
constraint age_chk CHECK (age >=18 AND age <150));
CREATE TABLE PHONECALL (
startDateTime smalldatetime NOT NULL,
custAreaCode number(5) NOT NULL, ---Reference PK From Customer Table
custNumber number(6) NOT NULL, ---Reference PK From Customer Table
dialledAreaCode number(5) NOT NULL,
dialledNumber number(6) NOT NULL,
crgPerMinute number(3,1) NOT NULL,
endDateTime smalldatetime NOT NULL);
I am not sure if i have referenced the primary keys correctly in the PHONECALL table.
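For what it's worth, a table can have only one PRIMARY KEY constraint, so the two PRIMARY KEY lines above cannot both stand. Since a phone number is really the pair (area code, number), one option is a composite key. A sketch (smalldatetime replaced with DATE, which Oracle actually has; age changed to a number, since the original VARCHAR2(3) forces implicit conversion in the CHECK):

```sql
CREATE TABLE customer (
  custareacode NUMBER(5)    NOT NULL,
  custnumber   NUMBER(6)    NOT NULL,
  fname        VARCHAR2(15) NOT NULL,
  lname        VARCHAR2(20) NOT NULL,
  age          NUMBER(3)    NOT NULL,
  CONSTRAINT cust_pk PRIMARY KEY (custareacode, custnumber),
  CONSTRAINT age_chk CHECK (age >= 18 AND age < 150)
);

CREATE TABLE phonecall (
  custareacode    NUMBER(5)   NOT NULL,
  custnumber      NUMBER(6)   NOT NULL,
  startdatetime   DATE        NOT NULL,
  dialledareacode NUMBER(5)   NOT NULL,
  diallednumber   NUMBER(6)   NOT NULL,
  crgperminute    NUMBER(3,1) NOT NULL,
  enddatetime     DATE        NOT NULL,
  -- The composite FK references the composite PK as a unit.
  CONSTRAINT phonecall_cust_fk FOREIGN KEY (custareacode, custnumber)
    REFERENCES customer (custareacode, custnumber)
);
```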
Thanks in advance :)Hi,
You want like this below ? I think that smalltime data type is not a valid type. Other thing, this is not a rule, but I advice you to put the primary key columns as the first columns of your table. One question: There is no PK on the phonecall table ?
SGMS@ORACLE10> create table customer (
2 custareacode number(5) not null,
3 custnumber number(6) not null,
4 fname varchar2(15) not null,
5 lname varchar2(20) not null,
6 age varchar2(3) not null,
7 constraint cust_pk primary key (custareacode),
8 constraint cust_uk unique (custnumber),
9 constraint age_chk check (age >=18 and age <150)
10 );
Table created.
SGMS@ORACLE10> create table phonecall (
2 custareacode number(5) not null constraint fk_phone_cusarecode_customer references customer,
3 custnumber number(6) not null constraint fk_phone_custnumber_customer references customer,
4 startdatetime date not null,
5 dialledareacode number(5) not null,
6 diallednumber number(6) not null,
7 crgperminute number (3,1) not null,
8 enddatetime date not null);
Table created.
SGMS@ORACLE10> select table_name,constraint_name,constraint_type from user_constraints
2 where table_name in ('CUSTOMER','PHONECALL') and constraint_type in ('P','U','R');
TABLE_NAME CONSTRAINT_NAME C
CUSTOMER CUST_PK P
CUSTOMER CUST_UK U
PHONECALL FK_PHONE_CUSARECODE_CUSTOMER R
PHONECALL FK_PHONE_CUSTNUMBER_CUSTOMER R
Cheers -
Create table select option Issue
I have created a copy of an exisitng table in the database as -
create table test1_temp as select * from test1 where 1=2;
The table was created w/o data.
But the problem is that the indexes and primary key of table "test1" are not copied to "test1_temp".
Since the table DDL is copied dynamically, I want to copy the indexes and primary keys dynamically as well. How should I achieve this?
Please help. Thanks.
user10525117 wrote:
Thanks for your suggestions.
Actually there are 2 problems: 1) the SQL needs to be run automatically using a script, so no manual intervention is possible.
In that case, have your script search for the table name and replace it with the new table name (sed in Unix).
2) the customer has adviced this approach of creating the table.
In that case you have to explain to your customer that creating indexes is not possible with CTAS.
Based on these facts, pls suggest.
Thanks.
I hope you are creating temporary tables and will be dropping them later. If that is the case (i.e. you do not mind which tablespace the indexes are created in), you can write a PL/SQL block that runs dynamic SQL to create the indexes, fetching their definitions from the user_constraints view.
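Another option (a sketch; it assumes the copy lives in the same schema and that the literal table name appears only where expected in the DDL) is to pull the index DDL with DBMS_METADATA and retarget it:

```sql
-- Generate CREATE INDEX statements for TEST1, rewritten for TEST1_TEMP.
SELECT REPLACE(DBMS_METADATA.GET_DDL('INDEX', index_name),
               'TEST1', 'TEST1_TEMP') AS index_ddl
FROM   user_indexes
WHERE  table_name = 'TEST1';
```

Each returned statement can then be run with EXECUTE IMMEDIATE from the same script that does the CTAS.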
Regards
Anurag -
30EA3 - Edit Table/Create Table(Advanced) - Table Name, Text Box Size
I am not sure if it's the same on non-Linux versions, but in the edit table dialog, the table name text box is tiny. It's only wide enough that you can see approx 7 chars.
Is it possible to increase this? The size of the text box in the simple create table dialog seems like a good size. As far as I can tell, it is not tiny to conserve space for other items.
Anyway, it's just a minor issue that won't prevent me from doing work; I just think it'd be better a bit bigger. I'm sure it didn't use to be this small.
Ta,
Trent
Oh, interesting.
Are you using OpenJDK?
$ /usr/lib/jvm/java-6-sun-1.6.0.22/bin/java -version
java version "1.6.0_22"
Java(TM) SE Runtime Environment (build 1.6.0_22-b04)
Java HotSpot(TM) 64-Bit Server VM (build 17.1-b03, mixed mode)
Ta,
Trent -
Hi Experts,
I have created a page in Apex where I'm using a file browse item, say :P13_FILEBROWSE (for CSV files), along with other items. On click of the Submit button I call a package which dynamically and successfully creates a table from the browsed CSV file.
Now my requirement is to create a DBMS_SCHEDULER job for the existing package (as above), so that if I upload a very large file it runs in the background as a DBMS job. If I call the package directly from Apex, it works successfully. But if I create a job for that package, I get an error because the job cannot identify the Apex file browse item.
In the package I use wwv_flow_files, from which I extract the file and create the table. Through the job it is unable to recognise the Apex file browse item. For more information, please go through the following code.
I call the package from Apex, which successfully creates the table:
htmldb_sid_50.parse_file(:P13_FILEBROWSE,'P13_COLLECTION' ,'P13_HEADINGS','P13_COLUMNS','P13_DDL');
htmldb_sid_50.parse_file(:P13_FILEBROWSE,'P13_COLLECTION' ,'P13_HEADINGS','P13_COLUMNS','P13_DDL','gdqdata.gdq_table1');
Following is the code of the package:
create or replace PACKAGE BODY htmldb_sid_50
AS
TYPE varchar2_t IS TABLE OF VARCHAR2(32767) INDEX BY binary_integer;
-- Private functions --{{{
PROCEDURE delete_collection ( --{{{
-- Delete the collection if it exists
p_collection_name IN VARCHAR2)
IS
BEGIN
IF (htmldb_collection.collection_exists(p_collection_name))
THEN
htmldb_collection.delete_collection(p_collection_name);
END IF;
END delete_collection; --}}}
PROCEDURE csv_to_array ( --{{{
-- Utility to take a CSV string, parse it into a PL/SQL table
-- Note that it takes care of some elements optionally enclosed
-- by double-quotes.
p_csv_string IN VARCHAR2,
p_array OUT wwv_flow_global.vc_arr2,
p_num IN VARCHAR2 DEFAULT null,
p_separator IN VARCHAR2 := ',')
IS
l_start_separator PLS_INTEGER := 0;
l_stop_separator PLS_INTEGER := 0;
l_length PLS_INTEGER := 0;
l_idx BINARY_INTEGER := 0;
l_new_idx BINARY_INTEGER := 0;
l_quote_enclosed BOOLEAN := FALSE;
l_offset PLS_INTEGER := 1;
p_csv_string_test varchar2(4000);
cell_value varchar2(400);
BEGIN
l_length := NVL(LENGTH(p_csv_string),0);
IF (l_length <= 0)
THEN
RETURN;
END IF;
LOOP
l_idx := l_idx + 1;
l_quote_enclosed := FALSE;
-- p_csv_string
p_csv_string_test := replace(p_csv_string,'""','@@');
--p_csv_string := p_csv_string_test;
IF SUBSTR(p_csv_string_test, l_start_separator + 1, 1) = '"'
THEN
l_quote_enclosed := TRUE;
l_offset := 2;
l_stop_separator := INSTR(p_csv_string_test, '"', l_start_separator + l_offset, 1);
ELSE
l_offset := 1;
l_stop_separator := INSTR(p_csv_string_test, p_separator, l_start_separator + l_offset, 1);
END IF;
IF l_stop_separator = 0
THEN
l_stop_separator := l_length + 1;
END IF;
if l_idx = 1 then
p_array(l_idx) := p_num;
l_idx := l_idx + 1;
end if;
if(l_idx < 50) then
cell_value := (SUBSTR(p_csv_string_test, l_start_separator + l_offset,(l_stop_separator - l_start_separator - l_offset)));
cell_value := replace(cell_value, '@@', '"');
p_array(l_idx) := cell_value;
end if;
EXIT WHEN l_stop_separator >= l_length;
IF l_quote_enclosed
THEN
l_stop_separator := l_stop_separator + 1;
END IF;
l_start_separator := l_stop_separator;
END LOOP;
END csv_to_array; --}}}
PROCEDURE get_records(p_blob IN blob,p_records OUT varchar2_t) --{{{
IS
l_record_separator VARCHAR2(2) := chr(13)||chr(10);
l_last INTEGER;
l_current INTEGER;
BEGIN
-- Sigh, stupid DOS/Unix newline stuff. If HTMLDB has generated the file,
-- it will be a Unix text file. If user has manually created the file, it
-- will have DOS newlines.
-- If the file has a DOS newline (cr+lf), use that
-- If the file does not have a DOS newline, use a Unix newline (lf)
IF (NVL(dbms_lob.instr(p_blob,utl_raw.cast_to_raw(l_record_separator),1,1),0)=0)
THEN
l_record_separator := chr(10);
END IF;
l_last := 1;
LOOP
l_current := dbms_lob.instr( p_blob, utl_raw.cast_to_raw(l_record_separator), l_last, 1 );
EXIT WHEN (nvl(l_current,0) = 0);
p_records(p_records.count+1) := utl_raw.cast_to_varchar2(dbms_lob.substr(p_blob,l_current-l_last,l_last));
l_last := l_current+length(l_record_separator);
END LOOP;
END get_records; --}}}
-- Utility functions --{{{
PROCEDURE parse_textarea ( --{{{
p_textarea IN VARCHAR2,
p_collection_name IN VARCHAR2)
IS
l_index INTEGER;
l_string VARCHAR2(32767) := TRANSLATE(p_textarea,chr(10)||chr(13)||' ,','@@@@');
l_element VARCHAR2(100);
BEGIN
l_string := l_string||'@';
htmldb_collection.create_or_truncate_collection(p_collection_name);
LOOP
l_index := instr(l_string,'@');
EXIT WHEN NVL(l_index,0)=0;
l_element := substr(l_string,1,l_index-1);
IF (trim(l_element) IS NOT NULL)
THEN
htmldb_collection.add_member(p_collection_name,l_element);
END IF;
l_string := substr(l_string,l_index+1);
END LOOP;
END parse_textarea; --}}}
PROCEDURE parse_file( --{{{
p_file_name IN VARCHAR2,
p_collection_name IN VARCHAR2,
p_headings_item IN VARCHAR2,
p_columns_item IN VARCHAR2,
p_ddl_item IN VARCHAR2,
p_table_name IN VARCHAR2 DEFAULT NULL)
IS
l_blob blob;
l_records varchar2_t;
l_record wwv_flow_global.vc_arr2;
l_datatypes wwv_flow_global.vc_arr2;
l_headings VARCHAR2(4000);
l_columns VARCHAR2(4000);
l_seq_id NUMBER;
l_num_columns INTEGER;
l_ddl VARCHAR2(4000);
l_keyword varchar2(300);
l_records_count number;
l_count number;
l_filename varchar2(400);
BEGIN
--delete from test_mylog;
-- insert into t_log(slog) values('p_file_name '||p_file_name||'p_collection_name '||p_collection_name||'p_headings_item '||p_headings_item||'p_columns_item '||p_columns_item||'p_ddl_item '||p_ddl_item||'p_table_name '||p_table_name);
-- commit;
insert into testing_job values(p_file_name,p_collection_name,p_headings_item,p_columns_item,p_ddl_item,p_table_name);commit;
IF (p_table_name is not null)
THEN
BEGIN
execute immediate 'drop table '||p_table_name;
EXCEPTION
WHEN OTHERS THEN null;
END;
l_ddl := 'create table '||p_table_name||' '||v(p_ddl_item);
htmldb_util.set_session_state('P13_DEBUG',l_ddl);
execute immediate l_ddl;
l_ddl := 'insert into '||p_table_name||' '||
'select '||v(p_columns_item)||' '||
'from htmldb_collections '||
'where seq_id > 1 and collection_name='''||p_collection_name||'''';
htmldb_util.set_session_state('P13_DEBUG',v('P13_DEBUG')||'/'||l_ddl);
execute immediate l_ddl;
RETURN;
END IF;
BEGIN
/*select blob_content into l_blob from wwv_flow_files
where name=p_file_name;*/
select the_blob into l_blob from t_batch_attachments
where blob_filename=p_file_name;
-- insert into testing_job values (l_blob);commit;
EXCEPTION
WHEN NO_DATA_FOUND THEN
raise_application_error(-20000,'File not found, id='||p_file_name);
END;
get_records(l_blob,l_records);
IF (l_records.count < 2)
THEN
raise_application_error(-20000,'File must have at least 2 ROWS, id='||p_file_name);
END IF;
-- Initialize collection
htmldb_collection.create_or_truncate_collection(p_collection_name);
-- Get column headings and datatypes
csv_to_array(l_records(1),l_record,'sid');
--l_records_count:=l_record.count+1;
--csv_to_array(l_records(l_records_count),l_datatypes);
l_num_columns := l_record.count;
if (l_num_columns > 100) then
raise_application_error(-20000,'Max. of 100 columns allowed, id='||p_file_name);
end if;
-- Get column headings and names
FOR i IN 1..l_record.count
LOOP
l_record(i) := trim(l_record(i));
l_record(i) := replace(l_record(i),' ','_');
l_record(i) := replace(l_record(i),'-','_');
l_record(i) := replace(l_record(i),'(','_');
l_record(i) := replace(l_record(i),')','_');
l_record(i) := replace(l_record(i),':','_');
l_record(i) := replace(l_record(i),';','_');
l_record(i) := replace(l_record(i),'.','_');
l_record(i) := replace(l_record(i),'"','_');
l_headings := l_headings||':'||l_record(i);
l_columns := l_columns||',c'||lpad(i,3,'0');
END LOOP;
l_headings := ltrim(l_headings,':');
l_columns := ltrim(l_columns,',');
htmldb_util.set_session_state(p_headings_item,l_headings);
htmldb_util.set_session_state(p_columns_item,l_columns);
-- Get datatypes
FOR i IN 1..l_record.count
LOOP
l_datatypes(i):='varchar2(400)';
l_ddl := l_ddl||','||l_record(i)||' '||l_datatypes(i);
END LOOP;
l_ddl := '('||ltrim(l_ddl,',')||')';
htmldb_util.set_session_state(p_ddl_item,l_ddl);
-- Save data into specified collection
FOR i IN 1..l_records.count
LOOP
csv_to_array(l_records(i),l_record,i-1);
l_seq_id := htmldb_collection.add_member(p_collection_name,'dummy');
FOR i IN 1..l_record.count
LOOP
htmldb_collection.update_member_attribute(
p_collection_name=> p_collection_name,
p_seq => l_seq_id,
p_attr_number => i,
p_attr_value => l_record(i));
END LOOP;
END LOOP;
-- DELETE FROM wwv_flow_files WHERE name=p_file_name;
END;
BEGIN
NULL;
END;
Now I'm stuck when I call the package through the following job:
DBMS_SCHEDULER.create_job (
job_name => 'New_Testing1',
job_type => 'PLSQL_BLOCK',
job_action => 'begin
GDQ.htmldb_sid_50.parse_file ('''||:P13_FILEBROWSE||''',''P13_COLLECTION'',''P13_HEADINGS'',''P13_COLUMNS'',''P13_DDL'');
htmldb_sid_50.parse_file('''||:P13_FILEBROWSE||''',''P13_COLLECTION'' ,''P13_HEADINGS'',''P13_COLUMNS'',''P13_DDL'',''gdqdata.gdq_table1'');
end;',
start_date => NULL,
repeat_interval => NULL,
end_date => NULL,
enabled => TRUE,
comments => 'UPLOAD FILE INTO TABLE IN GDQ.');
I believe there is a security issue in Apex related to wwv_flow_files which is causing the problem. Please can anybody assist me with this issue?
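For reference, a likely root cause: a scheduler job runs in a fresh database session with no APEX session state, so rows in wwv_flow_files scoped to the APEX session, and v('ITEM') lookups, are not visible there. A sketch of a workaround, using the names already in this thread (persist the BLOB while session state still exists, then hand the job plain string parameters):

```sql
DECLARE
  l_action VARCHAR2(4000);
BEGIN
  -- 1) Persist the upload while APEX session state still exists
  --    (t_batch_attachments is the table the package already reads from).
  INSERT INTO t_batch_attachments (blob_filename, the_blob)
  SELECT name, blob_content
  FROM   wwv_flow_files
  WHERE  name = :P13_FILEBROWSE;

  -- 2) Bake the item values into the job action as plain strings.
  l_action := 'begin htmldb_sid_50.parse_file('''
           || :P13_FILEBROWSE
           || ''',''P13_COLLECTION'',''P13_HEADINGS'','
           || '''P13_COLUMNS'',''P13_DDL'',''gdqdata.gdq_table1''); end;';

  DBMS_SCHEDULER.create_job(
    job_name   => 'NEW_TESTING1',
    job_type   => 'PLSQL_BLOCK',
    job_action => l_action,
    enabled    => TRUE);
END;
```

Note that the package's own v() and htmldb_util.set_session_state calls would need the same treatment, since they also read and write APEX session state that the job session does not have.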
Thanks in Advance
Danalaxmi
Please post the details of the application release, along with the complete error message and the steps you followed to reproduce the issue.
Thanks,
Hussein -
Source sys Restore "error while creating table EDISEGMENT "
Dear All,
I am a Basis person, and recently we refreshed the data of a Test BI system from a Development BI system. We normally carry out these refreshes, but in this case we also changed the hostname and SID of the Test BI system.
We have done all the BW refresh steps as per the guide, and during restore of the source system we are getting errors during datasource activation: "error while creating table EDISEGMENT".
We have checked the RFC and partner profiles, and they are working fine.
We have 2 clients connected from the source system to our Test BI box. The strange thing is that one source system activated without any errors, while for the second source system we are getting the above-mentioned error.
We have reviewed notes 339957 and 493422, but our BI functional team is not sure whether those apply to our OLTP system, as one source system from the same OLTP system activated successfully while the source system for the other client gives this issue.
Please help us out with this problem.
We are running on the BI 7.0 platform, with ECC 6.0 as the source.
Regards,
Rrcheck the relevant profiles in We20 t code and also in sm59 if the remote connection with autorisation is sucssessfull, connection is ok otherwise you need to check th paramters.
hope this helps
regards
santosh -
Error while creating table 'EDISEGMENT' entry for Transfer Structures
Hi Guyz...
I've been facing a few issues regarding activation of transfer structures.
I'll explain the whole process...so it makes sense to you guys.
I initially activated the transfer structures in BI Content for master data. Later on we faced some issues with the DEV server, so we had to transport all the objects and store them locally in a temporary location. When the BI DEV server was up and running, we transported all the objects back to the server.
Now, while moving the objects back to the server, we had errors in the Master Data request (Log - Method Execution). After going into the log, this is what I found:
Error while creating table 'EDISEGMENT' entry '/BIC/CIIA0DIVISION_TEXT'
Error while creating table 'EDISEGMENT' entry '/BIC/CIIA0DISTR_CHAN_TEXT'
Error while creating table 'EDISEGMENT' entry '/BIC/CIIA0MATL_TYPE_TEXT'
Error while creating table 'EDISEGMENT' entry '/BIC/CIIA0SALESORG_TEXT'
Error while creating table 'EDISEGMENT' entry '/BIC/CIIA0SALES_GRP_TEXT'
I hope, I am clear here.
Any suggestions, guys...
Thanks,
Regards,
G
Hi,
Please check this link:
Error while creating table EDISEGMENT entry
Hope it helps.
Thanks and Regards,
MuraliManohar. -
Error while creating table 'EDISEGMENT' entry '/BIC/CIBD0BILBLK_DL_TEXT'
Hi All,
I am getting the error "Error while creating table 'EDISEGMENT' entry" when running the program RS_TRANSTRU_ACTIVATE_ALL to activate the transfer rules after we reconnected our source system.
We are on a 2004s server at SP level 17. I understand that this issue arises because the transfer rules use IDoc as the transfer method, and I cannot manually change the transfer method to PSA and activate, as the number of transfer rules with this issue is huge.
Please suggest the suitable solution
Thank you in advance
krishna
Edited by: krishna dvs on Feb 17, 2009 1:24 PM
Hi,
Please check note 493422. The reports listed in the note should correct the inconsistencies; afterwards you can re-import the request, and it should then work.
Regards,
Srikanth -
SQL Synatx error while creating table dynamically
I am getting an error when trying to create a table at runtime:
Declare @FileName varchar(100)
Declare @File varchar(100)
set @FileName='brkrte_121227102828'
SET @File = SUBSTRING(@FileName,1,CHARINDEX('_',@FileName)-1)
--=select @File
Declare @table_name varchar(100)
Declare @ssql varchar(1000)
SET @table_name = 'DataStaging.dbo.Staging_'+ @File
SET @sSQL = 'CREATE TABLE ' + @table_name + ' ( ' +
' [COL001] VARCHAR (4000) NOT NULL, ' +
' [Id] Int Identity(1,1), ' +
' [LoadDate] datetime default getdate() ' +
' ) '
Exec @sSQL
Error message:
Msg 203, Level 16, State 2, Line 16
The name 'CREATE TABLE DataStaging.dbo.Staging_brkrte ( [COL001] VARCHAR (4000) NOT NULL, [Id] Int Identity(1,1), [LoadDate]
datetime default getdate() )' is not a valid identifier.
Please help me to resolve above error
Thankx & regards, Vipin jha MCP
Got the answer:
the issue was with the Exec; it needs to be Exec (@sSQL)
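For reference: without parentheses, EXEC treats everything after it as a stored procedure name, which is exactly what Msg 203 is complaining about ("is not a valid identifier"). Corrected:

```sql
Exec (@sSQL)
-- Alternatively, sp_executesql (note it requires an NVARCHAR string):
-- DECLARE @nSQL nvarchar(max) = @sSQL;
-- EXEC sp_executesql @nSQL;
```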
thanks
Thankx & regards, Vipin jha MCP -
Create table in Database Under Creating Direct Database Request in Answers
When creating the table, I got the following error:
State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43093] An error occurred while processing the EXECUTE PHYSICAL statement. [nQSError: 17001] Oracle Error code: 955, message: ORA-00955: name is already used by an existing object at OCI call OCIStmtExecute: CREATE TABLE AGG_PRODUCTS_CATEGORY_D AS (SELECT DISTINCT PROD_CATEGORY_ID ,PROD_CATEGORY ,PROD_TOTAL_ID ,PROD_TOTAL FROM PRODUCTS) . [nQSError: 17011] SQL statement execution failed. (HY000)
SQL Issued: EXECUTE PHYSICAL CONNECTION POOL "SH"."Connection Pool" CREATE TABLE AGG_PRODUCTS_CATEGORY_D AS (SELECT DISTINCT PROD_CATEGORY_ID ,PROD_CATEGORY ,PROD_TOTAL_ID ,PROD_TOTAL FROM PRODUCTS)
SQL Statement:
CREATE TABLE AGG_PRODUCTS_CATEGORY_D
AS
(SELECT DISTINCT PROD_CATEGORY_ID
,PROD_CATEGORY
,PROD_TOTAL_ID
,PROD_TOTAL
FROM PRODUCTS);
Connection Pool: "SH"."Connection Pool"
The above SQL statement works when creating the table in SQL*Plus. Is there a syntax error when trying to do it in Answers? I don't use the semicolon at the end in Answers. Thanks.
Edited by: whoknows on May 24, 2010 1:55 PM
Look at your physical layer in the RPD; you'll find that the table does exist (easy to find: it's represented by a red icon), and you need to drop it before running your aggregate.
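ORA-00955 simply means the name is already taken, so the table left over from an earlier run has to go first. A sketch using the same direct database request mechanism as above:

```sql
EXECUTE PHYSICAL CONNECTION POOL "SH"."Connection Pool"
DROP TABLE AGG_PRODUCTS_CATEGORY_D
```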
Fiston -
Query to find out the sessions trying to create table.
Hi Folks,
I wanted to find the sessions that are trying to create tables. I'm trying to use a query for this, but I'm not getting the correct results.
These are the three versions of the query I wrote:
SELECT A.SID,A.USERNAME,A.SQL_ID,B.SQL_TEXT FROM V$SESSION A,V$SQLTEXT B
WHERE A.STATUS='ACTIVE'
AND A.SQL_ID=B.SQL_ID
AND B.SQL_TEXT LIKE '%INSERT%' OR B.SQL_TEXT LIKE '%insert%'
SELECT A.SID,A.USERNAME,A.SQL_ID,B.SQL_TEXT FROM V$SESSION A,V$SQLTEXT B
WHERE A.SID IN (SELECT SID FROM V$SESSION WHERE STATUS='ACTIVE')
AND A.SQL_ID=B.SQL_ID
AND B.SQL_TEXT LIKE '%INSERT%' OR B.SQL_TEXT LIKE '%insert%'
select sid, serial#, status, username, osuser,machine,sql_id from v$session
where sql_id in (select sql_id from v$sqltext where SQL_TEXT like '%INSERT%'
union
select sql_id from v$sqltext where SQL_TEXT like '%insert%')
But it's not giving me the correct results. Can anyone help me with a query to catch the sessions trying to create tables?
Thanks
Karthik
I'm not sure why this needs further explanation. A database is a conceptual and logical whole; it is not a dumping ground. Databases are designed prior to their creation; after creation, only the content of the database changes, not the database itself.
Having end-users create tables on the fly means you have no control over the database, and also no control over its integrity and consistency.
Basically this means your database has been converted into a garbage dump.
End-users creating tables on the fly is a big no-no in my book, and whoever develops an application that creates tables on the fly should be shown the door to unemployment.
You would either set up auditing and issue AUDIT TABLE as explained before, or revoke the CREATE TABLE privilege from all end-users. Personally I prefer the latter approach, and I wouldn't even consider the former.
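If you do go the auditing route, a minimal sketch (it assumes the audit trail is enabled, e.g. AUDIT_TRAIL=DB, and privileges to query DBA_AUDIT_TRAIL):

```sql
-- Audit table DDL (covers CREATE, DROP and TRUNCATE TABLE by any user):
AUDIT TABLE;

-- Later, see who created tables and when:
SELECT username, action_name, obj_name, timestamp
FROM   dba_audit_trail
WHERE  action_name = 'CREATE TABLE'
ORDER  BY timestamp;
```

Unlike polling V$SESSION/V$SQLTEXT, this catches every CREATE TABLE, not just the ones that happen to be in flight when the query runs.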
Sybrand Bakker
Senior Oracle DBA -
JPA - TroubleShooting - Error in Datatypes used in Creating Tables
h1. JPA - TroubleShooting - Error in Datatypes used in Creating Tables
h2. Error Description
The error appears when JPA tries to automatically generate tables from entities. It uses types it shouldn't, because they aren't supported by the database engine. If you examine the createDDL.jdbc file, you will find unsupported data types. You can run the statements listed there directly against your database engine and find that they don't run, producing syntax errors like the ones posted below.
h3. Query example (statement)
CREATE TABLE SEQUENCE (SEQ_NAME VARCHAR(50), SEQ_COUNT NUMBER(19))
h3. PostgreSQL error when trying to execute the statement
CREATE TABLE SEQUENCE (SEQ_NAME VARCHAR(50),
SEQ_COUNT NUMBER(19))": org.postgresql.util.PSQLException: ERROR: syntax
error near "("
h3. MSSQL 2000 Error when trying to execute the statement:
[CREATE - 0 row(s), 0.578 secs] [Error Code: 2715, SQL State: HY000] [Microsoft][SQLServer 2000 Driver for
JDBC][SQLServer]Column or parameter #1: Cannot find data type NUMBER
h3. FireBird error when trying to execute the statement
Dynamic SQL Error
SQL error code = -104
Token unknown - line 1, char 62
Statement: CREATE TABLE SEQUENCE (SEQ_NAME VARCHAR(50),
SEQ_COUNT NUMBER(19))
h2. TroubleShooting
Looks like it wants "NUMERIC", not "NUMBER".
[http://www.postgresql.org/docs/8.1/interactive/datatype.html#DATATYPE-NUMERIC]
I had this problem using Firebird. I found out it had to do with the database's JDBC driver. I tried a different database engine, Microsoft SQL Server 2000 with the corresponding driver, and it worked as it should. I suspect the JDBC driver has to supply, in some way, the datatypes it supports to the JPA TopLink library.
Edited by: juanemc2 on Apr 1, 2008 7:39 AM
In Hibernate you can supply the "dialect" to use for SQL generation. I assume TopLink has similar functionality.
This is not a generic JPA issue, but rather something with the specific implementation you're using.
Or rather it is a generic SQL issue, caused by different database engines using slightly (or not so slightly) different SQL syntax which causes problems when generating SQL automatically.
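Concretely, all three engines above are rejecting the same thing: NUMBER is Oracle-specific, and the portable spelling is NUMERIC. Hand-corrected (precision kept from the generated DDL; note that SEQUENCE may itself be a reserved word on some engines and need quoting), the statement would read:

```sql
CREATE TABLE SEQUENCE (SEQ_NAME VARCHAR(50), SEQ_COUNT NUMERIC(19))
```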