Alternative for TABLES statement

Hi friends,
When I use dictionary fields on a screen, I declare the corresponding structure with a TABLES statement. But I think the TABLES statement is obsolete; is that right? If so, what is the alternative to the TABLES statement, and how do I move field values between the screen and the ABAP program? Please help me.
Thanks in advance

Fields are moved from your program to the screen when the field names match. There is no explicit transport command needed.
I do not think that the TABLES statement is considered obsolete when used in screen programming; that is the only time I use it. There was some training material indicating that the TABLES statement is required for proper transport between the program and the screen. I do not know whether this has changed with 4.7.

Similar Messages

  • Alternate for oracle statement in sybase

    What is the alternative to the following Oracle statement in Sybase:
    insert /*+ APPEND */ into MYTABLE nologging
    I need to bulk insert into MYTABLE (in Sybase), reading from a CSV file or another database; more than 0.4 million records.
    I am using Java as the programming language.

    Let me get this straight:
    1) you're asking a Sybase question
    2) and using Java as your language
    And this is:
    1) an Oracle forum
    2) where SQL and PL/SQL are the languages
    I think you're asking your question in the wrong place.

  • How to use bind variable value for table name in select statement.

    Hi everyone,
    I am having a tough time using the value of a bind variable as the table name in a select statement. I tried &p37_table_name., :p37_table_name and v('p37_table_name'), but none of them worked.
    Following is the sql for interactive report:
    select * from v('p37_table_name') where key_loc = :P37_KEY_LOC and
    to_char(inspection_dte,'mm/dd/yyyy') = :P37_INSP_DT AND :p37_column_name is not null ;
    I am setting value of p37_table_name in previous page which is atm_state_day_insp.
    Following is error msg:
    "Query cannot be parsed, please check the syntax of your query. (ORA-00933: SQL command not properly ended) "
    Any help would be highly appreciated.
    Raj

    Interestingly enough, I always had the same impression that you had to use a function to do this, but I found out from someone else that all you need to do is change the radio button from "Use Query-Specific Column Names and Validate Query" to "Use Generic Column Names (parse query at runtime only)". APEX will then substitute your bind variable for you at run-time (something you can't normally do in PL/SQL without using dynamic SQL).

  • How to specify tablespace for a primary key index in create table statement

    How to specify the tablespace for a primary key index in a create table statement?
    Is the following statement right?
    CREATE TABLE 'GPS'||TO_CHAR(SYSDATE+1,'YYYYMMDD')
                ("ID" NUMBER(10,0) NOT NULL ENABLE,
                "IP_ADDRESS" VARCHAR2(32 BYTE),
                "EQUIPMENT_ID" VARCHAR2(32 BYTE),
                "PACKET_DT" DATE,
                "PACKET" VARCHAR2(255 BYTE),
                "PACKET_FORMAT" VARCHAR2(32 BYTE),
                "SAVED_TIME" DATE DEFAULT CURRENT_TIMESTAMP,
                 CONSTRAINT "UDP_LOG_PK" PRIMARY KEY ("ID") TABLESPACE "INDEX_DATA"
                 TABLESPACE "SBM_DATA";
    Thank you

    As orafad indicated, you'll have to use the USING INDEX clause from the documentation, i.e.
    CREATE TABLE GPS
                ("ID" NUMBER(10,0) NOT NULL ENABLE,
                "IP_ADDRESS" VARCHAR2(32 BYTE),
                "EQUIPMENT_ID" VARCHAR2(32 BYTE),
                "PACKET_DT" DATE,
                "PACKET" VARCHAR2(255 BYTE),
                "PACKET_FORMAT" VARCHAR2(32 BYTE),
                "SAVED_TIME" DATE DEFAULT CURRENT_TIMESTAMP,
                 CONSTRAINT "UDP_LOG_PK" PRIMARY KEY ("ID") USING INDEX TABLESPACE "USERS"
                )
                TABLESPACE "USERS";
    Table created.
    Justin

  • Error when executing statement for table/stored proc

    Hi All,
    I am getting this error when executing an IDOC to JDBC (stored procedure) scenario.
    In my stored procedure I have three insert statements that insert rows into 3 tables.
    The stored procedure was working fine with two insert statements: for that I created a data type for the stored procedure with 10 elements, executed the scenario, and it ran successfully.
    When I added the 3rd insert statement to the stored procedure, i.e. when I added 5 more elements to the data type (15 elements in total), it started giving the error below in Message Monitoring.
    Exception caught by adapter framework: Error processing request in sax parser: Error when executing statement for table/stored proc. 'COGRP_TMP_PROC_1' (structure 'Statements'): java.sql.SQLException: General error
    Delivery of the message to the application using connection JDBC_http://sap.com/xi/XI/System failed, due to: com.sap.aii.af.ra.ms.api.RecoverableException: Error processing request in sax parser: Error when executing statement for table/stored proc. 'COGRP_TMP_PROC_1' (structure 'Statements'): java.sql.SQLException: General error
    Note: I have run the stored procedure with three insert statements in SQL Server, and also by calling it from an external program, and it worked fine.
    Note: Is there any structure that needs to be followed when working with IDOC to stored procedure?
    I am stuck with this error; can anybody resolve this issue?
    Thanks in Advance,
    Murthy

    Hi Narasimha,
    This seems to be an error due to incorrect query formation. In your receiver JDBC channel, set the parameter logSQLStatement = true; you can find this parameter in the advanced mode. With this parameter you will be able to see the SQL query that is generated at runtime in the audit log in RWB.
    Regards,
    Pragati

  • File to JDBC :Error when executing statement for table/stored proc.

    Hi,
    I am getting the following error when I try to insert data into a Z-table using the JDBC receiver adapter.
    Error while parsing or executing XML-SQL document: Error processing request in sax parser: Error when executing statement for table/stored proc. 'ZTEST' (structure 'STATEMENT'): java.sql.SQLException: [Microsoft][SQLServer 2000 Driver for JDBC][SQLServer]Invalid object name 'ZTEST'.
    But the database table named 'ZTEST' exists in the system.
    XML structure:
    <?xml version="1.0" encoding="UTF-8"?>
    <ns0:MT_RECR xmlns:ns0="http://urn:srini/FileToJDBC">
       <STATEMENT>
          <TEST action="INSERT">
             <table>ZTEST</table>
             <access>
                <ROLLNO>123</ROLLNO>
                <FIRSTNAME>ABC</FIRSTNAME>
                <LASTNAME>XYZ</LASTNAME>
             </access>
          </TEST>
       </STATEMENT>
    </ns0:MT_RECR>
    Regards,
    Srinivas

    Hi,
    I have changed my message type structure but am still getting the same error. Is it possible to insert/update a Z-table using the JDBC adapter?
    XML structure:
    <?xml version="1.0" encoding="UTF-8"?>
    <ns0:MT_RECR xmlns:ns0="http://urn:srini/FileToJDBC">
       <STATEMENT>
          <ZTEST action="INSERT">
             <access>
                <ROLLNO>123</ROLLNO>
                <FIRSTNAME>abc</FIRSTNAME>
                <LASTNAME>XYZ</LASTNAME>
             </access>
          </ZTEST>
       </STATEMENT>
    </ns0:MT_RECR>
    Regards,
    Srinivas

  • How to gather table stats for tables in a different schema

    Hi All,
    I have a table in one schema and I want to gather stats for that table from a different schema.
    I granted: GRANT ALL ON SCHEMA1.T1 TO SCHEMA2;
    And when I tried to execute the command to gather stats using
    DBMS_STATS.GATHER_TABLE_STATS (OWNNAME=>'SCHMEA1',TABNAME=>'T1');
    the call fails.
    Is there any way to gather table stats for tables in one schema from another schema?
    Thanks,
    MK.

    You need to grant analyze any to schema2.
    SY.
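    For reference, a minimal sketch of the sequence that suggestion implies (SCHEMA1 and T1 are the names from the post; run the grant as a suitably privileged user):
    -- As a privileged user: allow SCHEMA2 to gather stats on objects it does not own
    GRANT ANALYZE ANY TO schema2;
    -- Then, connected as SCHEMA2, gather stats on the table owned by SCHEMA1
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(ownname => 'SCHEMA1', tabname => 'T1');
    END;
    /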

  • SQL*Loader-929: Error parsing insert statement for table

    Hi,
    I get the following error with SQL*Loader:
    Table MYTABLE loaded from every logical record.
    Insert option in effect for this table: INSERT
    Column Name Position Len Term Encl Datatype
    IDE FIRST * ; CHARACTER
    SQL string for column : "mysequence.NEXTVAL"
    CSI_NBR 1:10 10 ; CHARACTER
    POLICY_NBR 11:22 12 ; CHARACTER
    CURRENCY_COD 23:25 3 ; CHARACTER
    POLICY_STAT 26:27 2 ; CHARACTER
    PRODUCT_COD 28:35 8 ; CHARACTER
    END_DAT 44:53 10 ; CHARACTER
    FISCAL_COD 83:83 1 ; CHARACTER
    TOT_VAL 92:112 21 ; CHARACTER
    SQL*Loader-929: Error parsing insert statement for table MYTABLE.
    ORA-01031: insufficient privileges
    I am positive that I can SELECT the sequence and INSERT into the table with the user invoking sql*loader.
    Where does that "ORA-01031" come from?
    Regards
    ...

    Options:
    1) you are wrong about privileges OR
    2) you have the privilege only when you connect via SQL*Plus (or whichever other tool you used to test the insert).
    Is it possible that during your test you enabled the role which granted you the INSERT privilege - and that SQL*Loader doesn't do this?
    Can you see the table in this list?
    select *
    from user_tab_privs_recd
    where table_name='MY_TABLE'
    and owner='table owner whoever';

    select *
    from user_role_privs;

    Any roles where DEFAULT_ROLE is not YES?
    HTH
    Regards Nigel
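    If option 2 does turn out to be the cause, one hedged possibility (not something stated in the thread) is to grant the object privileges directly to the loading user rather than through a role, so they apply in every session context. The owner SCOTT and user LOADER below are placeholders:
    -- Hypothetical names: replace SCOTT and LOADER with the real table owner and loading user.
    -- Direct grants (not via a role) are visible regardless of which roles are enabled.
    GRANT SELECT ON scott.mysequence TO loader;
    GRANT INSERT ON scott.mytable TO loader;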

  • How to find the tables which are the candidates for gathering stats in 10g

    Hi,
    In 10g, how can we find the tables and partitions that are candidates for gathering stats?
    I want to find those tables and partitions and gather stats on them on a daily basis.
    Thanks,
    Mahi

    The problem you describe has been posted about before. There are known issues with the default dbms_stats parameter settings in some environments.
    What you can do is just go ahead and manually submit dbms_stats commands with appropriate parameters for the tables that have not been analyzed, or modify the maintenance window to give the job more time.
    I would rather just generate the missing statistics myself; then the job can continue to run in the normal maintenance window. Since the job should be looking only for tables that meet the staleness requirements, once everything is analyzed the job can likely keep the statistics updated in the default time allowed.
    Be warned that the default parameter settings do not work well for some tables in some environments, and if that proves true for your shop you can generate workable statistics for a table and then lock them so the nightly job skips regenerating the statistics for that table. You would then manually unlock, regenerate, and lock the statistics for that table as necessary.
    HTH -- Mark D Powell --
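    As a concrete starting point, a minimal sketch of one way to list candidates (assuming 10g table monitoring, which is on by default, and access to DBA_TAB_STATISTICS; the owner filter is only a placeholder):
    -- Tables/partitions never analyzed, or flagged stale by table monitoring
    -- (DBMS_STATS.FLUSH_DATABASE_MONITORING_INFO forces the monitoring data to be flushed first)
    SELECT owner, table_name, partition_name, last_analyzed, stale_stats
    FROM   dba_tab_statistics
    WHERE  owner = 'YOUR_SCHEMA'
    AND    (stale_stats = 'YES' OR last_analyzed IS NULL)
    ORDER  BY owner, table_name, partition_name;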

  • Error when executing statement for table/stored proc  DB2 - Data Truncation

    Hi,
    I have a call to an SP in XI with n input parameters and two output parameters.
    When I run the interface, it gives the following error:
    com.sap.aii.af.ra.ms.api.DeliveryException: Error processing request in sax parser: Error when executing statement for table/stored proc. 'SPSAPAR9' (structure 'Statement'): java.sql.SQLException: The number of parameter values set or registered does not match the number of parameters
    Thanks for your help
    Ximena

    My error has changed:
    Error while parsing or executing XML-SQL document: Error processing request in sax parser: Error when executing statement for table/stored proc. 'SAPPRG.SPSAPAR9' (structure 'Statement'): java.sql.DataTruncation: Data truncation
    This is after changing the data type for the SP:
    <?xml version="1.0" encoding="UTF-8" ?>
    <ns1:AlistReqDet2_MT xmlns:ns1="urn:proxl:tmuc:proxl01:AlistReqItems">
    <Statement>
    <SPSAPAR9 action="EXECUTE">
      <table>SAPPRG.SPSAPAR9</table>
      <ISAPNU1 isInput="TRUE" type="STRING">0080000353</ISAPNU1>
      <ISAPEM1 isInput="TRUE" type="STRING">'LU'</ISAPEM1>
      <ISAPC05 isInput="TRUE" type="STRING">15353</ISAPC05>
      <ISAPSEC isInput="TRUE" type="STRING">10</ISAPSEC>
      <ISAPLOT isInput="TRUE" type="STRING">'lats'</ISAPLOT>
      <ISAPCA1 isInput="TRUE" type="STRING">10</ISAPCA1>
      <ISAPCA2 isInput="TRUE" type="STRING">10</ISAPCA2>
      <ISAPKIL isInput="TRUE" type="STRING">10</ISAPKIL>
      <ISAPES1 isInput="TRUE" type="STRING">'T'</ISAPES1>
      <ISAPSW isOutput="TRUE" type="STRING" />
      </SPSAPAR9>
      </Statement>
      </ns1:AlistReqDet2_MT>

  • Gather table stats taking longer for Large tables

    Version : 11.2
    I've noticed that gathering stats (using dbms_stats.gather_table_stats) is taking longer for large tables.
    Since the row count needs to be calculated, a big table's stats collection would understandably take slightly longer (running SELECT COUNT(*) internally).
    But for a non-partitioned table with 3 million rows, it took 12 minutes to collect the stats? Apart from row count and index info, what other information is gathered for gather table stats?
    Does table size actually matter for stats collection?

    Max wrote:
    Version : 11.2
    I've noticed that gathering stats (using dbms_stats.gather_table_stats) is taking longer for large tables.
    Since the row count needs to be calculated, a big table's stats collection would understandably take slightly longer (running SELECT COUNT(*) internally).
    But for a non-partitioned table with 3 million rows, it took 12 minutes to collect the stats? Apart from row count and index info, what other information is gathered for gather table stats?
    09:40:05 SQL> desc user_tables
    Name                            Null?    Type
    TABLE_NAME                       NOT NULL VARCHAR2(30)
    TABLESPACE_NAME                        VARCHAR2(30)
    CLUSTER_NAME                             VARCHAR2(30)
    IOT_NAME                             VARCHAR2(30)
    STATUS                              VARCHAR2(8)
    PCT_FREE                             NUMBER
    PCT_USED                             NUMBER
    INI_TRANS                             NUMBER
    MAX_TRANS                             NUMBER
    INITIAL_EXTENT                         NUMBER
    NEXT_EXTENT                             NUMBER
    MIN_EXTENTS                             NUMBER
    MAX_EXTENTS                             NUMBER
    PCT_INCREASE                             NUMBER
    FREELISTS                             NUMBER
    FREELIST_GROUPS                        NUMBER
    LOGGING                             VARCHAR2(3)
    BACKED_UP                             VARCHAR2(1)
    NUM_ROWS                             NUMBER
    BLOCKS                              NUMBER
    EMPTY_BLOCKS                             NUMBER
    AVG_SPACE                             NUMBER
    CHAIN_CNT                             NUMBER
    AVG_ROW_LEN                             NUMBER
    AVG_SPACE_FREELIST_BLOCKS                   NUMBER
    NUM_FREELIST_BLOCKS                        NUMBER
    DEGREE                              VARCHAR2(10)
    INSTANCES                             VARCHAR2(10)
    CACHE                                  VARCHAR2(5)
    TABLE_LOCK                             VARCHAR2(8)
    SAMPLE_SIZE                             NUMBER
    LAST_ANALYZED                             DATE
    PARTITIONED                             VARCHAR2(3)
    IOT_TYPE                             VARCHAR2(12)
    TEMPORARY                             VARCHAR2(1)
    SECONDARY                             VARCHAR2(1)
    NESTED                              VARCHAR2(3)
    BUFFER_POOL                             VARCHAR2(7)
    FLASH_CACHE                             VARCHAR2(7)
    CELL_FLASH_CACHE                        VARCHAR2(7)
    ROW_MOVEMENT                             VARCHAR2(8)
    GLOBAL_STATS                             VARCHAR2(3)
    USER_STATS                             VARCHAR2(3)
    DURATION                             VARCHAR2(15)
    SKIP_CORRUPT                             VARCHAR2(8)
    MONITORING                             VARCHAR2(3)
    CLUSTER_OWNER                             VARCHAR2(30)
    DEPENDENCIES                             VARCHAR2(8)
    COMPRESSION                             VARCHAR2(8)
    COMPRESS_FOR                             VARCHAR2(12)
    DROPPED                             VARCHAR2(3)
    READ_ONLY                             VARCHAR2(3)
    SEGMENT_CREATED                        VARCHAR2(3)
    RESULT_CACHE                             VARCHAR2(7)
    09:40:10 SQL>
    > Does table size actually matter for stats collection?
    yes
    Handle:     Max
    Status Level:     Newbie
    Registered:     Nov 10, 2008
    Total Posts:     155
    Total Questions:     80 (49 unresolved)
    why so many unanswered questions?
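    As a side note, a quick way to see how much of that information the last gather actually recorded for a table is to query the populated columns themselves (MYTABLE is a placeholder name):
    -- Row count, block count, average row length, sample size and timestamp from the last gather
    SELECT num_rows, blocks, avg_row_len, sample_size, last_analyzed
    FROM   user_tables
    WHERE  table_name = 'MYTABLE';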

  • Error when executing statement for table/stored proc.  : ORA-00911

    Hi All,
    I am posting IDOC -> XI -> JDBC, approx 5000 IDocs.
    But a few messages are giving the following error in SXMB_MONI in XI.
    Can anyone tell me the cause of this error? I checked the whole IDoc data and am not able to see a bad character. Can anyone tell me which characters are bad for posting XML data into Oracle, so that I can search for them in the XML, and how to avoid this error?
    "com.sap.aii.af.ra.ms.api.MessagingException: Error processing request in sax parser: Error when executing statement for table/stored proc. 'HRP1001' (structure 'INSERT_PAD34'): java.sql.SQLException: ORA-00911: invalid character"

    > I am talking about following IDOC.<ZRMD_A06> ->
    > <E1PLOGI SEGMENT="1">-> <E1PITYP SEGMENT="1"> ->
    > <E1PAD34 SEGMENT="1"> -><PROZT>0.00 #</PROZT>.
    > Can we remove this # during message mapping in XI??
    Sure, you can remove it using the Replace function or by writing a UDF.
    As I can see, # is the last character, so replace # with " " (a single space) and then use the TRIM function.
    Thanks
    Farooq.
    Reward points if you find it useful.

  • Performance for ALTER TABLE statements

    Hi,
    I'd like to improve performance for scripts running several ALTER TABLE statements. I have two questions regarding this.
    This is the original code:
    ALTER TABLE CUSTOMER_ORDER_DELIVERY_TAB ADD (
    QTY_TO_INVOICE NUMBER NULL );
    ALTER TABLE CUSTOMER_ORDER_DELIVERY_TAB ADD (
    QTY_INVOICED NUMBER NULL );
    1. Would I gain any performance by making the following changes?
    ALTER TABLE CUSTOMER_ORDER_DELIVERY_TAB ADD (
    QTY_TO_INVOICE NUMBER NULL,
    QTY_INVOICED NUMBER NULL );
    These columns are later on filled with values and then made NOT NULL.
    2. Would I gain anything by making these columns NOT NULL with a DEFAULT value in the first statement and then inserting the values later on?
    /Roland Bali

    1. It would definitely improve performance to add both columns in a single ALTER TABLE.
    2. You can only add a NOT NULL column without a DEFAULT if the table is empty; with a DEFAULT value in the same statement, the column can be added as NOT NULL even when the table already contains rows.
    Naveen
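    Putting points 1 and 2 together, a hedged sketch of what the combined statement might look like (the DEFAULT of 0 is only an assumption; use whatever default fits the data):
    -- A single ALTER TABLE adds both columns in one pass, and the DEFAULT value
    -- lets NOT NULL be enforced even though the table already contains rows
    ALTER TABLE CUSTOMER_ORDER_DELIVERY_TAB ADD (
      QTY_TO_INVOICE NUMBER DEFAULT 0 NOT NULL,
      QTY_INVOICED   NUMBER DEFAULT 0 NOT NULL );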

  • Alternate shading for tables

    Hi,
    I need a script for alternate table shading (see the screenshot), with an input value (e.g. shading starting from the 4th, 5th, or 6th row, etc.).
    Once a cell style is applied, the InDesign alternate shading option does not work, so I have applied the shading manually for all the tables. Please suggest.
    Regards,
    Velu


  • Alternate for inner join to improve performance

    Hi all,
    I have used an inner join query to fetch data from five different tables into an internal table, with WHERE clause conditions.
    The execution time is almost 5-6 minutes for this particular query (I have a lot of data in all five DB tables: more than 10 million records in every table).
    Is there any alternative to the inner join to improve performance?
    TIA.
    Regards,
    Karthik

    Hi All,
    Thanks for all your interest.
    SELECT  a~object_id a~description a~descr_language
                a~guid AS object_guid a~process_type
                a~changed_at
                a~created_at AS created_timestamp
                a~zzorderadm_h0207 AS cpid
                a~zzorderadm_h0208 AS submitter
                a~zzorderadm_h0303 AS cust_ref
                a~zzorderadm_h1001 AS summary
                a~zzorderadm_h1005 AS summary_uc
                a~zzclose_date     AS clsd_date
                d~stat AS status
                f~priority
                FROM crmd_orderadm_h AS a INNER JOIN crmd_link AS b ON  a~guid = b~guid_hi
                INNER JOIN crmd_partner AS c ON b~guid_set = c~guid
                INNER JOIN crm_jest AS d ON objnr  = a~guid
                INNER JOIN crmd_activity_h AS f ON f~guid = a~guid
                INTO CORRESPONDING FIELDS OF TABLE et_service_request_list
                WHERE process_type IN lt_processtyperange
                AND   a~created_at IN lt_daterange
                AND   partner_no IN lr_partner_no
                AND   stat IN lt_statusrange
                AND   object_id IN lt_requestnumberrange
                AND   zzorderadm_h0207 IN r_cpid
                AND   zzorderadm_h0208 IN r_submitter
                AND   zzorderadm_h0303 IN r_cust_ref
                AND   zzorderadm_h1005 IN r_trans_desc
                AND   d~inact = ' '
                AND   b~objtype_hi = '05'
                AND   b~objtype_set = '07'.
