TABLES statement
Hi all,
In a report, please tell me what the statements below do and what the use of these two statements is:
DATA:
TABLES:
Regards
Ajay
Hi,
Here is an example.
TABLES : ZRPM_ITEM_SUM.
*------ DECLARATION OF INTERNAL TABLES ------*
DATA : ITAB1 TYPE TABLE OF ZRPM_ITEM_SUM,
       ITAB2 TYPE TABLE OF TY_ITAB2.   " TY_ITAB2 is a local type declared elsewhere
DATA : WA_ITAB TYPE ZRPM_ITEM_SUM.     " work area: a single row, not an internal table
So, as we can see, the DATA statement is used to declare variables, work areas, internal tables and so on, while the TABLES statement declares a work area for the database table used in your report.
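A minimal hedged sketch of the difference (ZRPM_ITEM_SUM stands in for any ABAP Dictionary table; the gv_/gs_/gt_ names are illustrative):

```abap
* TABLES declares a table work area named after the Dictionary table;
* it is needed for classic screens, logical databases and SELECT-OPTIONS.
TABLES: zrpm_item_sum.

* DATA declares program-local objects of your own choosing.
DATA: gv_count TYPE i,                        " elementary variable
      gs_item  TYPE zrpm_item_sum,            " structure (work area)
      gt_items TYPE TABLE OF zrpm_item_sum.   " internal table
```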
Thanks
Similar Messages
-
URGENT : select from table statement in ABAP OO
Hi all,
I am an absolute ABAP OO beginner and need some quick help from an expert. How can I make a selection from an existing table (e.g. MARA) in a BAdI that is programmed according to ABAP OO principles?
In classic ABAP you could put a TABLES statement at the beginning and then do a SELECT * FROM, but this does not work in ABAP OO.
How should I define such simple selections from existing tables? Anyone?
Thanks a lot,
Eric Hassenberg
* define internal table
data: i_mara like standard table of mara.
*select to this table
select * from mara into table i_mara.
Also you have to define work area for this internal table in order to use it.
data: w_mara like line of i_mara.
-
How to get the number of hits ("returned rows") in read table statement
Hi Experts
I have the statement shown below, which seems not to work as I would like it to. My problem is that I would like to do two different things depending on whether a READ TABLE statement results in 0 hits or in 1 (or more) hits.
READ TABLE g_ship_item
  WITH KEY l_ztknum = DATA_PACKAGE-/bic/ztknum
  BINARY SEARCH.
IF sy-subrc IS INITIAL.
* a hit was found, so clear the error flag
  DATA_PACKAGE-/BIC/ZSTAGEERR = 0.
ELSE.
* no hit was found, so set the error flag
  DATA_PACKAGE-/BIC/ZSTAGEERR = 1.
ENDIF.
Hope someone can help me out with my problem...
Thanks in advance, regards
Torben
Hi,
As you are using a READ statement with BINARY SEARCH, check whether the internal table g_ship_item is sorted by the field l_ztknum. If it is not sorted, the result of this READ statement is not reliable.
Below is the corrected code:
SORT g_ship_item BY l_ztknum.
READ TABLE g_ship_item WITH KEY l_ztknum = DATA_PACKAGE-/bic/ztknum BINARY SEARCH.
Thanks,
Satya -
Creating a better update table statement
Hello,
I have the following UPDATE TABLE statement that I would like to make more efficient. This thing is taking forever. A little background: the source table/views are not indexed, and the larger of the two only has 150k records. Any ideas on making it more efficient would be appreciated.
Thanks.
Ryan
Script:
DECLARE
V_EID_CIV_ID SBI_EID_W_VALID_ANUM_V.SUBJECT_KEY%TYPE;
V_EID_DOE DATE;
V_EID_POE SBI_EID_W_VALID_ANUM_V.POINT_OF_ENTRY%TYPE;
V_EID_APPR_DATE DATE;
V_CASE_CIV_ID SBI_DACS_CASE_RECORDS.CASE_EID_CIV_ID%TYPE;
V_CASE_DOE DATE;
V_CASE_POE SBI_DACS_CASE_RECORDS.CASE_CODE_ENTRY_PLACE%TYPE;
V_CASE_APPR_DATE DATE;
V_CASE_DEPART_DATE DATE;
V_SBI_UPDATE_STEP SBI_DACS_CASE_RECORDS.SBI_UPDATE_STEP%TYPE;
V_SBI_CIV_ID SBI_DACS_CASE_RECORDS.SBI_CIV_ID%TYPE;
CURSOR VALID_CIV_ID_FROM_EID IS
SELECT EID.SUBJECT_KEY,
TO_DATE(EID.PROCESS_ENTRY_DATE),
EID.POINT_OF_ENTRY,
TO_DATE(EID.APPREHENSION_DATE),
DACS.CASE_EID_CIV_ID,
TO_DATE(DACS.CASE_DATE_OF_ENTRY,'YYYYMMDD'),
DACS.CASE_CODE_ENTRY_PLACE,
TO_DATE(DACS.CASE_DATE_APPR,'YYYYMMDD'),
TO_DATE(DACS.CASE_DATE_DEPARTED,'YYYYMMDD'),
DACS.SBI_UPDATE_STEP,
DACS.SBI_CIV_ID
FROM SBI_EID_W_VALID_ANUM_V EID,
SBI_DACS_CASE_RECORDS DACS
WHERE DACS.CASE_NBR_A = EID.ALIEN_FILE_NUMBER;
BEGIN
OPEN VALID_CIV_ID_FROM_EID;
SAVEPOINT A;
LOOP
FETCH VALID_CIV_ID_FROM_EID INTO V_EID_CIV_ID, V_EID_DOE, V_EID_POE,
  V_EID_APPR_DATE, V_CASE_CIV_ID, V_CASE_DOE, V_CASE_POE,
  V_CASE_APPR_DATE, V_CASE_DEPART_DATE, V_SBI_UPDATE_STEP, V_SBI_CIV_ID;
DBMS_OUTPUT.PUT_LINE('BEFORE');
EXIT WHEN VALID_CIV_ID_FROM_EID%NOTFOUND;  -- %FOUND here would exit as soon as the first row is fetched
DBMS_OUTPUT.PUT_LINE('AFTER');
UPDATE SBI_DACS_CASE_RECORDS
SET SBI_DACS_CASE_RECORDS.SBI_CIV_ID = V_CASE_CIV_ID,
SBI_DACS_CASE_RECORDS.SBI_UPDATE_STEP = 1
-- note: this WHERE clause (and those of the updates below) tests only local
-- variables, so whenever it is true the UPDATE modifies every row of the
-- table; a join condition on a key column is presumably missing
WHERE V_CASE_CIV_ID IS NOT NULL
AND V_CASE_CIV_ID <> 0;
UPDATE SBI_DACS_CASE_RECORDS
SET SBI_DACS_CASE_RECORDS.SBI_CIV_ID = V_EID_CIV_ID,
SBI_DACS_CASE_RECORDS.SBI_UPDATE_STEP = 2
WHERE V_SBI_CIV_ID IS NULL AND V_SBI_UPDATE_STEP = 0
AND V_EID_DOE = V_CASE_DOE
AND V_EID_POE = V_CASE_POE
AND V_EID_APPR_DATE = V_CASE_APPR_DATE
AND V_EID_APPR_DATE = V_CASE_DEPART_DATE;
UPDATE SBI_DACS_CASE_RECORDS
SET SBI_DACS_CASE_RECORDS.SBI_CIV_ID = V_EID_CIV_ID,
SBI_DACS_CASE_RECORDS.SBI_UPDATE_STEP = 3
WHERE V_SBI_UPDATE_STEP = 0
AND V_EID_DOE = V_CASE_DOE
AND V_EID_POE = V_CASE_POE
AND V_EID_APPR_DATE = V_CASE_APPR_DATE;
UPDATE SBI_DACS_CASE_RECORDS
SET SBI_DACS_CASE_RECORDS.SBI_CIV_ID = V_EID_CIV_ID,
SBI_DACS_CASE_RECORDS.SBI_UPDATE_STEP = 4
WHERE V_SBI_UPDATE_STEP = 0
AND V_EID_DOE = V_CASE_DOE
AND V_EID_POE = V_CASE_POE
AND (V_EID_APPR_DATE - V_CASE_APPR_DATE) > -4
AND (V_EID_APPR_DATE - V_CASE_APPR_DATE) < 4 ;
UPDATE SBI_DACS_CASE_RECORDS
SET SBI_DACS_CASE_RECORDS.SBI_CIV_ID = V_EID_CIV_ID,
SBI_DACS_CASE_RECORDS.SBI_UPDATE_STEP = 5
WHERE V_SBI_UPDATE_STEP = 0
AND V_EID_DOE = V_CASE_DOE
AND V_EID_POE <> V_CASE_POE
AND V_EID_APPR_DATE = V_CASE_APPR_DATE;
UPDATE SBI_DACS_CASE_RECORDS
SET SBI_DACS_CASE_RECORDS.SBI_CIV_ID = V_EID_CIV_ID,
SBI_DACS_CASE_RECORDS.SBI_UPDATE_STEP = 6
WHERE V_SBI_UPDATE_STEP = 0
AND V_EID_POE = V_CASE_POE
AND V_EID_APPR_DATE = V_CASE_APPR_DATE;
UPDATE SBI_DACS_CASE_RECORDS
SET SBI_DACS_CASE_RECORDS.SBI_CIV_ID = V_EID_CIV_ID,
SBI_DACS_CASE_RECORDS.SBI_UPDATE_STEP = 7
WHERE V_SBI_UPDATE_STEP = 0
AND V_EID_APPR_DATE = V_CASE_APPR_DATE;
UPDATE SBI_DACS_CASE_RECORDS
SET SBI_DACS_CASE_RECORDS.SBI_CIV_ID = V_EID_CIV_ID,
SBI_DACS_CASE_RECORDS.SBI_UPDATE_STEP = 8
WHERE V_SBI_UPDATE_STEP = 0
AND V_EID_DOE = V_CASE_DOE
AND (V_EID_APPR_DATE - V_CASE_APPR_DATE) > -4
AND (V_EID_APPR_DATE - V_CASE_APPR_DATE) < 4;
UPDATE SBI_DACS_CASE_RECORDS
SET SBI_DACS_CASE_RECORDS.SBI_CIV_ID = V_EID_CIV_ID,
SBI_DACS_CASE_RECORDS.SBI_UPDATE_STEP = 9
WHERE V_SBI_UPDATE_STEP = 0
AND V_EID_DOE = V_CASE_DOE
AND V_EID_POE = V_CASE_POE;
UPDATE SBI_DACS_CASE_RECORDS
SET SBI_DACS_CASE_RECORDS.SBI_CIV_ID = V_EID_CIV_ID,
SBI_DACS_CASE_RECORDS.SBI_UPDATE_STEP = 10
WHERE V_SBI_UPDATE_STEP = 0
AND (V_EID_APPR_DATE - V_CASE_APPR_DATE) > -4
AND (V_EID_APPR_DATE - V_CASE_APPR_DATE) < 4;
END LOOP;
CLOSE VALID_CIV_ID_FROM_EID;
COMMIT;
END;
That's it. Thanks for your help.
Ryan
Please use [ code] or [ pre] tags to format code before posting.
Peter D. -
How to specify tablespace for a primary key index in a create table statement
How to specify the tablespace for a primary key index in a create table statement?
Is the following statement right?
CREATE TABLE 'GPS'||TO_CHAR(SYSDATE+1,'YYYYMMDD')
("ID" NUMBER(10,0) NOT NULL ENABLE,
"IP_ADDRESS" VARCHAR2(32 BYTE),
"EQUIPMENT_ID" VARCHAR2(32 BYTE),
"PACKET_DT" DATE,
"PACKET" VARCHAR2(255 BYTE),
"PACKET_FORMAT" VARCHAR2(32 BYTE),
"SAVED_TIME" DATE DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "UDP_LOG_PK" PRIMARY KEY ("ID") TABLESPACE "INDEX_DATA"
TABLESPACE "SBM_DATA";
Thank you
As orafad indicated, you'll have to use the USING INDEX clause from the documentation, i.e.
SQL> ed
Wrote file afiedt.buf
1 CREATE TABLE GPS
2 ("ID" NUMBER(10,0) NOT NULL ENABLE,
3 "IP_ADDRESS" VARCHAR2(32 BYTE),
4 "EQUIPMENT_ID" VARCHAR2(32 BYTE),
5 "PACKET_DT" DATE,
6 "PACKET" VARCHAR2(255 BYTE),
7 "PACKET_FORMAT" VARCHAR2(32 BYTE),
8 "SAVED_TIME" DATE DEFAULT CURRENT_TIMESTAMP,
9 CONSTRAINT "UDP_LOG_PK" PRIMARY KEY ("ID") USING INDEX TABLESPACE "USERS"
10 )
11* TABLESPACE "USERS"
SQL> /
Table created.
Justin -
A Select statement with Appending table statement in it.
Hi,
How can I use a SELECT statement with an APPENDING TABLE addition in it?
SELECT DISTINCT <field Name>
FROM <DB table name>
APPENDING TABLE <itab>
WHERE <fieldname> EQ <Itab1-fieldname>
AND <fieldname> EQ <itab2-fieldname>.
Can I use the above SELECT statement? If I'm using this, how does it work?
Regards
Dharmaraju
Hi, Dharma Raju Kondeti.
I found this in the SAP online help, hope this can help you.
Specifying Internal Tables
When you read several lines of a database table, you can place them in an internal table. To do this, use the following in the INTO clause:
SELECT ... INTO|APPENDING [CORRESPONDING FIELDS OF] TABLE <itab>
[PACKAGE SIZE <n>] ...
The same applies to the line type of <itab>, the way in which the data for a line of the database table are assigned to a table line, and the CORRESPONDING FIELDS addition as for flat work areas (see above).
The internal table is filled with all of the lines of the selection. When you use INTO, all existing lines in the table are deleted. When you use APPENDING; the new lines are added to the existing internal table <itab>. With APPENDING, the system adds the lines to the internal table appropriately for the table type. Fields in the internal table not affected by the selection are filled with initial values.
If you use the PACKAGE SIZE addition, the lines of the selection are not written into the internal table at once, but in packets. You can define packets of <n> lines that are written one after the other into the internal table. If you use INTO, each packet replaces the preceding one. If you use APPENDING, the packets are inserted one after the other. This is only possible in a loop that ends with ENDSELECT. Outside the SELECT loop, the contents of the internal table are undetermined. You must process the selected lines within the loop.
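The packet behaviour described above can be sketched as follows (SPFLI is used here as an illustrative table; it is not from the original thread):

```abap
DATA: lt_spfli TYPE TABLE OF spfli.

* APPENDING ... PACKAGE SIZE fetches the result set in packets of 100 rows;
* each pass of the SELECT loop adds one packet to lt_spfli.  With INTO
* instead of APPENDING, each packet would replace the previous contents.
SELECT carrid connid
       FROM spfli
       APPENDING CORRESPONDING FIELDS OF TABLE lt_spfli
       PACKAGE SIZE 100.
* process the rows fetched so far here, inside the SELECT/ENDSELECT loop
ENDSELECT.
```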
Regards,
feng.
-
Table statement Obslete while using FM
Hi Experts,
While creating a function module, I am aware that the use of the TABLES statement is obsolete in ECC 6.
But I want to know the reason behind it.
I am creating an FM for an inbound interface. I just wanted to know if the use of TABLES parameters will affect it in future.
Thanks in advance!!!
By using TABLES parameters, you are creating the table with a HEADER LINE. So, from your program / FM / method, if you pass an internal table without the header line, it would create an empty header line while processing the FM.
Even though this is just the warning, I would suggest to use the IMPORTING / CHANGING parameters instead of the TABLES. You can create a Table Type for your structure and use it as the types to define this parameter.
From the message long text of "TABLES parameters are obsolete":
TABLES parameters are table parameters. Table parameters are obsolete CHANGING parameters that are typed as internal standard tables with a header line. If an internal table without a header line or a table body is passed as an actual parameter to such a formal parameter, an empty header line is generated in the function module. If an internal table with a header line is used as an actual parameter, both the table body and the header line are passed to the function module. In the case of formal parameters defined with TABLES, no value transmission is possible.
Formal parameters defined with TABLES can be replaced by formal parameters defined with CHANGING. A local work area can be created in the function module for the internal table using the addition LIKE LINE OF itab of the DATA statement in the function module.
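A hedged sketch of that replacement (the names Z_PROCESS_ITEMS, ZTT_ITEM and the signature are illustrative, not from the original thread):

```abap
* Old-style, obsolete signature:
*   TABLES  it_items STRUCTURE zs_item          " table with header line
* Replacement: a CHANGING parameter typed with a Dictionary table type.
FUNCTION z_process_items.
*"  CHANGING
*"     REFERENCE(CT_ITEMS) TYPE ZTT_ITEM        " table type over ZS_ITEM
  DATA: ls_item TYPE LINE OF ztt_item.  " local work area replaces the header line
  LOOP AT ct_items INTO ls_item.
*   process ls_item here ...
  ENDLOOP.
ENDFUNCTION.
```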
Regards,
Naimesh Patel -
Why we use Tables statement in case of using SELECT-OPTIONS:
hi all,
Why do we use the TABLES statement in the case of the following coding in an ABAP program?
tables: vbak.
SELECT-OPTIONS: s1 for vbak-vbeln.
Here, if we don't provide the TABLES statement, why does it not work?
Please answer.
Hi
This statement is not allowed in classes and declares a data object table_wa as a table work area whose data type is adopted from the identically named structured data type table_wa from the ABAP Dictionary. table_wa must be defined as a flat structure in the ABAP Dictionary. You can specify database tables or Views for table_wa.
Work table areas declared with TABLES are interface work areas and should only be declared in the global declaration section of a program for the following purpose:
The statement TABLES is required for exchanging data between screen fields that were defined in a program screen when transferring from the ABAP Dictionary and the ABAP program. For the screen event PBO, the content of the table work area is transferred to identically named screen fields; for PAI, the system adopts the data from identically named screen fields.
In executable programs, flat table work areas can be used for adopting data that were provided for the event GET table_wa from a linked logical database. TABLES is synonymous with the statement NODES for this purpose.
Work table areas declared with TABLES behave like the data declared with the addition COMMON PART, meaning the data are used by the programs of a program group.
Table work areas declared with TABLES can be declared in subroutines and
function modules. However, this is not recommended. A table work area declared in a procedure is not local but belongs to the context of a framework program. The table work area can be viewed starting from the declaration in the framework program and lives as long as the framework program. In contrast to normal program-global data, the content of the table work areas declared in subroutines and function modules is stored temporarily when these subroutines and function modules are called. Value assignments that were made during runtime of the procedure are preserved until the procedure is completed. When exiting the procedure, the table work areas are filled with the contents that they contained when the procedure was called. Table work areas declared in procedures behave like global data to which the statement LOCAL is applied in the procedure.
The form TABLES * is obsolete. -
How to gather table stats for tables in a different schema
Hi All,
I have a table present in one schema and I want to gather stats for the table in a different schema.
I gave GRANT ALL ON SCHEMA1.T1 TO SCHEMA2;
And when I tried to execute the command to gather stats using
DBMS_STATS.GATHER_TABLE_STATS (OWNNAME=>'SCHEMA1',TABNAME=>'T1');
The function is failing.
Is there any way we can gather table stats for tables in one schema from another schema?
Thanks,
MK.
You need to grant ANALYZE ANY to schema2.
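A sketch of the suggested fix (the grant is run by a DBA; the schema and table names are the ones from the thread):

```sql
-- Allow SCHEMA2 to gather statistics on other schemas' tables
GRANT ANALYZE ANY TO schema2;

-- Then, connected as SCHEMA2:
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(ownname => 'SCHEMA1', tabname => 'T1');
END;
/
```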
SY. -
Why Oracle not using the correct indexes after running table stats
I created an index on the table and ran the a sql statement. I found that via the explain plan that index is being used and is cheaper which I wanted.
Later I ran stats on all tables and found, again via the explain plan, that the same SQL is now using a different index and a more costly plan. I don't know what is going on. Why is this happening? Any suggestions?
Thx
I just wanted to know the cost using the index.
To gather histograms use (method_opt is the one that causes the package to collect histograms)
DBMS_STATS.GATHER_SCHEMA_STATS (
ownname => 'SCHEMA',
estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
block_sample => TRUE,
method_opt => 'FOR ALL COLUMNS SIZE AUTO',
degree => 4,
granularity => 'ALL',
cascade => TRUE,
options => 'GATHER'
); -
TABLES statement in report program
Hi,
Is there any performance issue in using the TABLES statement in a report program? I ask because I have got a review point from QA insisting that I avoid the TABLES statement. Is it recommended by SAP to avoid the TABLES statement in report programs?
Thanks in advance.
Regards,
Balaji Viswanath.
Balaji,
If you use the TABLES statement, it provides a work area for the given table.
Example: declaring fields of a table on the selection screen without declaring the table name in a TABLES statement will give an error.
This will give an error:
SELECT-OPTIONS : s_matnr FOR mara-matnr.
This will work:
TABLES: mara.
SELECT-OPTIONS : s_matnr FOR mara-matnr.
It means the TABLES statement reserves a memory area for the work area.
-
Question on Exporting Table Stats to another database
I have a question exporting Table Stats from one schema in database A to another schema in another database B.
Currently running Oracle 9.0.2.6 for unix in both prod and dev.
Currently table stats are gathered using the ANALYZE TABLE command. We currently don't use the DBMS_STATS package to gather table statistics.
Question:
If I execute DBMS_STATS.EXPORT_TABLE_STATS in database A, can I import them to database B if I'm only using ANALYZE TABLE to gather table stats? Do I need to execute DBMS_STATS.GATHER_TABLE_STATS in database A prior to executing DBMS_STATS.EXPORT_TABLE_STATS?
The overall goal is to take table stats from Production in its current state and import them into a Development environment to be used for testing data / processes.
Yes we will be upgrading to Oracle 10 / 11g in near future.
Yes we will be changing our method of gathering table stats by using the DBMS_STATS package.
Thanks,
Russ D
Hi,
"If I execute DBMS_STATS.EXPORT_TABLE_STATS in database A, can I import them to database B if I'm only using ANALYZE TABLE to gather table stats?" You need to use the DBMS_STATS package for the gather and export steps if you want to migrate the stats to another database.
"Do I need to execute DBMS_STATS.GATHER_TABLE_STATS in database A prior to executing DBMS_STATS.EXPORT_TABLE_STATS?" Yes, you need to execute DBMS_STATS.GATHER_TABLE_STATS first.
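A hedged sketch of the full flow (the schema, table and stat-table names are illustrative, not from the thread):

```sql
-- In database A: gather fresh stats, create a user stat table, export into it
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(ownname => 'APP', tabname => 'ORDERS');
  DBMS_STATS.CREATE_STAT_TABLE(ownname => 'APP', stattab => 'MY_STATS');
  DBMS_STATS.EXPORT_TABLE_STATS(ownname => 'APP', tabname => 'ORDERS',
                                stattab => 'MY_STATS');
END;
/
-- Move the MY_STATS table to database B (e.g. with export/import), then:
BEGIN
  DBMS_STATS.IMPORT_TABLE_STATS(ownname => 'APP', tabname => 'ORDERS',
                                stattab => 'MY_STATS');
END;
/
```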
Good luck.
Regards. -
Calculated column in a Create Table statement
This is my create table statement:
CREATE TABLE DTPartInv
( partinv_partnbr VARCHAR2(10) NOT NULL,
partinv_prodname VARCHAR2(25),
partinv_desc VARCHAR2(25),
partinv_manufact VARCHAR2(25),
partinv_instock INTEGER NOT NULL,
partinv_category VARCHAR2(20),
partinv_purchdate DATE,
partinv_loc VARCHAR2(15),
partinv_price NUMBER(6,2),
partinv_vendor VARCHAR2(20),
partinv_reorder INTEGER NOT NULL,
partinv_serial VARCHAR2(20),
partinv_flag AS (CASE WHEN partinv_instock < partinv_reorder THEN 'X' ELSE 'O' END), -- calculated (virtual) column
CONSTRAINT DTPartInv_partinv_partnbr_pk
PRIMARY KEY (partinv_partnbr)
);
and these are my INSERT INTO table statements:
INSERT INTO DTPartInv VALUES('XT40010E',Null,'Exhaust Pipe','TMC Inc',2,'Pipes',TO_DATE('11-APR-10','DD-MON-RR'),Null,45.95,'Oracle Auto Parts',1,Null);
INSERT INTO DTPartInv VALUES('CH9260',Null,'Oil Filter','Mechanical Parts',5,'Fuild Filters',TO_DATE('15-Jan-10','DD-MON-RR'),Null,20.00,'Sink Auto P',2,Null);
INSERT INTO DTPartInv VALUES('15W40',Null,'Oil','Sink Oil',20,'Auto Fuilds',TO_DATE('10-Feb-11','DD-MON-RR'),Null,10.00,'Oracle Auto Parts',5,Null);
INSERT INTO DTPartInv VALUES('C9262',Null,'Fuel Filter','Mechanical Parts',2,'Fuild Filters',TO_DATE('20-Oct-10','DD-MON-RR'),Null,35.95,'Sink Auto Parts',1,Null);
INSERT INTO DTPartInv VALUES('PS7716',Null,'Fuel/Water Separator','Mechanical Parts',4,'Fuild Filters',TO_DATE('09-Dec-10','DD-MON-RR'),Null,50.00,'Sink Auto Parts',1,Null);
INSERT INTO DTPartInv VALUES('800142',Null,'PPI Valve','Beink Pipes Inc',10,'Valves',TO_DATE('01-Jun-11','DD-MON-RR'),Null,20.00,'Oracle Auto Parts',2,Null);
INSERT INTO DTPartInv VALUES('TTS400',Null,'Butt Clamp','Beink Pipes Inc',10,'Valves',TO_DATE('31-Oct-11','DD-MON-RR'),Null,15.95,'Oracle Auto Parts',2,Null);
INSERT INTO DTPartInv VALUES('TBA400',Null,'Lap Clamp','Beink Pipes Inc',10,'Valves',TO_DATE('10-Nov-11','DD-MON-RR'),Null,30.00,'Oracle Auto Parts',2,Null);
INSERT INTO DTPartInv VALUES('SC16650',Null,'Brake pads','CostVB Mechanical',5,'Mechanical Parts',TO_DATE('15-May-11','DD-MON-RR'),Null,60.00,'Adosql Auto Parts',1,Null);
INSERT INTO DTPartInv VALUES('OB46613',Null,'Emergency Door Latch','CostVB Mechanical',3,'Mechanical Parts',TO_DATE('01-Sep-11','DD-MON-RR'),Null,45.95,'Adosql Auto Parts',1,Null);
And this is a sample of the error I'm geeting:
INSERT INTO DTPartInv VALUES('XT40010E',Null,'Exhaust Pipe','TMC Inc',2,'Pipes',TO_DATE('11-APR-10','DD-MON-RR'),Null,45.95,'Oracle Auto Parts',1,Null)
ERROR at line 1:
ORA-00947: not enough values
I need to figure out what it is that I am missing here. partinv_flag is supposed to be calculated based on partinv_instock and partinv_reorder.
You need to name the columns:
1 INSERT INTO DTPartInv
2 (partinv_partnbr, partinv_prodname, partinv_desc, partinv_manufact, partinv_instock, partinv_category, partinv_purchdate,
3 partinv_loc, partinv_price, partinv_vendor, partinv_reorder, partinv_serial)
4* VALUES('XT40010E',Null,'Exhaust Pipe','TMC Inc',2,'Pipes',TO_DATE('11-01-10','DD-MM-RR'),Null,45.95,'Oracle Auto Parts',1,Null)
SQL> /
1 row created.
SQL> select * from dtpartinv;
PARTINV_PARTNBR  : XT40010E
PARTINV_PRODNAME :
PARTINV_DESC     : Exhaust Pipe
PARTINV_MANUFACT : TMC Inc
PARTINV_INSTOCK  : 2
PARTINV_CATEGORY : Pipes
PARTINV_PURCHDATE: 11/01/10
PARTINV_LOC      :
PARTINV_PRICE    : 45,95
PARTINV_VENDOR   : Oracle Auto Parts
PARTINV_REORDER  : 1
PARTINV_SERIAL   :
PARTINV_FLAG     : O
-
I want to know when we issue truncate table statement in oracle .
I want to know: when we issue a TRUNCATE TABLE statement in Oracle, no log will be written to the redo log, but we can recover the data using flashback or an SCN. I want to know where the data recovered after a truncate is actually stored in the Oracle database. Please explain to me in detail, step by step.
Hi,
I have truncated a table and after that restored the data. See the example below. I want to know where it was restored from.
From which log file was it restored?
create table mytab (n number, x varchar2(90), d date);
alter table mytab enable row movement;
Table altered.
SQL> insert into mytab values (1,'Monsters of Folk',sysdate);
1 row created.
SQL> insert into mytab values (2,'The Frames',sysdate-1/24);
1 row created.
SQL> commit;
Commit complete.
SQL> select CURRENT_SCN from v$database;
CURRENT_SCN
972383
SQL> select * from mytab;
(the output wraps, one column per line, because of the default linesize; the same query is repeated below with a wider line)
SQL> set lines 10000
SQL> /
N X D
1 Monsters of Folk 30-DEC-12
2 The Frames 30-DEC-12
SQL> select to_char(sysdate,'yyyymmdd hh24:mi:ss') from dual;
TO_CHAR(SYSDATE,'
20121230 09:29:24
SQL> set timing on
SQL> truncate table mytab;
Table truncated.
Elapsed: 00:00:15.75
SQL> select * from mytab as of timestamp TO_TIMESTAMP('20121230 09:29:24','yyyymmdd hh24:mi:ss');
N X D
1 Monsters of Folk 30-DEC-12
2 The Frames 30-DEC-12
Elapsed: 00:00:00.28
SQL> insert into mytab select * from mytab as of timestamp TO_TIMESTAMP('20121230 09:29:24','yyyymmdd hh24:mi:ss');
2 rows created.
Elapsed: 00:00:00.01
SQL> -
Create a CREATE TABLE statement dynamically
I am trying to create an SP that returns a "CREATE TABLE" statement dynamically, using a table called "Employee" in the database. How can I create a dynamic CREATE TABLE
statement using sys.tables? The CREATE TABLE statement should contain all the columns from the Employee table. I am using SQL Server 2008 R2.
Hi SSAS_5000,
If you don't care about the constraints and dependencies of the table Employee, and what you'd like is just the table structure, you may use the statements below instead of executing a generated CREATE statement:
SELECT TOP 1 * INTO desiredTable FROM Employee;
TRUNCATE TABLE desiredTable;
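A common variant of the same trick (desiredTable is the illustrative name from the reply) copies only the structure in one step by selecting zero rows, so no TRUNCATE is needed:

```sql
-- WHERE 1 = 0 matches no rows, so only the column definitions are copied
SELECT * INTO desiredTable FROM Employee WHERE 1 = 0;
```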
If you have question, feel free to let me know.
Eric Zhang
TechNet Community Support -
Create table statement from DatabaseMetaData
Hi Experts,
Do you know if it's possible to retrieve a CREATE TABLE statement based on DatabaseMetaData (without looping through column names/types)?
The idea is to get the DatabaseMetaData from one db server, and execute the Create Table statements on a different db server.
One obvious solution would be to loop through each table's column name/type and construct the create table statement manually, but I'd like to know whether this can be automated.
Thanks in advnace,
Sid
Bigger databases provide a way to access most schema information from the database itself. That doesn't mean that the JDBC meta information is sufficient, nor the best way to do that.
There are existing tools that allow for migrations as well. Especially if the migration is a one to one mapping.