Explain the SQL Query execution

Hi,
select replace('123,45,6,7',',',chr(10)) from dual;
REPLACE('1
123
45
6
7
Thanks,

I get something different on DB version 10.2.0.4:
SQL> select replace('123,45,6',',',chr(8)) from dual;
REPLACE(
123456
1 row selected.
SQL> select dump(replace('123,45,6',',',chr(8)))  from dual;
DUMP(REPLACE('123,45,6',',',CHR(8)
Typ=1 Len=8: 49,50,51,8,52,53,8,54
1 row selected.
Anyway: this time all the commas are replaced by chr(8)...
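For comparison, applying the same dump technique to the original chr(10) example makes the substitution explicit: each comma becomes a linefeed byte (10), and SQL*Plus then renders those linefeeds as line breaks, which is why the first result is shown on several lines. A hedged check (the expected output assumes an ASCII-based database character set; it was not captured from the original post):
SQL> select dump(replace('123,45,6,7',',',chr(10))) from dual;
-- expected: Typ=1 Len=10: 49,50,51,10,52,53,10,54,10,55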

Similar Messages

  • Pinning the sql query in cache

    Hi All,
    I want to pin the sql query in the cache because the physical reads are very high. Can anyone tell me the steps to pin the sql query in the cache. Current version 10.2.1.0, OS: Windows.
    Physical Reads  Executions  Reads per Exec  %Total  CPU Time (s)  Elapsed Time (s)  SQL Id
    19,836          38          522.0           2.1     25.40         50.00             1r0wh3v6bayyk
    With regards
    kccrga

    Oscar,
    I've read Cary's paper and a good paper it is.
    The point I was trying to get across is that there should be no rules of thumb (bar this one, of course ;) ).
    It all depends. Should one concentrate on reducing cpu usage on a disk-bound system? Is doing, say, 100 single-block reads from disk faster than doing 100 current mode gets?
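    Note that pinning a cursor in the shared pool (DBMS_SHARED_POOL.KEEP) keeps the parsed statement, not the data blocks, so it would not by itself reduce physical reads; looking at the plan of the reported statement is usually the first step. A minimal sketch on 10.2, using the sql_id from the report (the format option is just one choice):
    select * from table(dbms_xplan.display_cursor('1r0wh3v6bayyk', null, 'ALLSTATS LAST'));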

  • How is SQL query parsing processed inside the SQL/PLSQL engine?

    Hi all,
    Can you explain how SQL query parsing is processed inside the SQL/PLSQL engine?
    Thanks,
    Sankar

    Sankar,
    Oracle Database Concepts, Chapter 24.
    You will find the explanation you need under the heading "Parsing".
    http://download-west.oracle.com/docs/cd/B19306_01/server.102/b14220/sqlplsql.htm

  • Storing the SQL Query results as XML

    Hi All,
    I have a proc to select all the data from emp table and insert into another table. For some reason it inserts only one column and one record.
    I don't have any row limit or skip ...
    CREATE OR REPLACE PROCEDURE Convert_Data_Into_XML1
    AS
      -- Variable declarations
      qryCtx DBMS_XMLGEN.ctxHandle;
      result CLOB;
    BEGIN
      qryCtx := DBMS_XMLGEN.newContext('SELECT * from emp');
      -- set the row header to be EMPLOYEE
      DBMS_XMLGEN.setRowTag(qryCtx, 'Employee-ROW');
      DBMS_XMLGEN.setRowsetTag(qryCtx, 'Employee-rowset');
      LOOP
        -- now get the result
        result := DBMS_XMLGEN.getXML(qryCtx);
        DBMS_OUTPUT.PUT_LINE('In the Loop');
        EXIT WHEN DBMS_XMLGEN.getNumRowsProcessed(qryCtx) = 0;
        INSERT INTO temp_clob_tab VALUES (result);
      END LOOP;
      COMMIT;
      -- close context
      DBMS_XMLGEN.closeContext(qryCtx);
    EXCEPTION
      WHEN OTHERS THEN
        DBMS_OUTPUT.PUT_LINE(TO_CHAR(SQLERRM));
    END;
    Basically I got this sample proc from the Oracle website. According to the website, I should be able to store all the records from the emp table.
    How can I debug this?
    Here is the output from SQLPLUS>
    SQL> select result from temp_clob_tab;
    RESULT
    <?xml version="1.0"?>
    <Employee-rowset>
    <Employee-ROW>
    <EMPNO>7369</EMPNO>
    >>>>>>>>>>
    Thanks in advance
    Siva
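    If the generated XML is actually complete and only the SQL*Plus display is cut short, the usual suspects are the LONG and pagesize settings; a minimal check (values are illustrative) would be:
    SET LONG 1000000
    SET LONGCHUNKSIZE 1000000
    SET PAGESIZE 0
    select result from temp_clob_tab;
    Also note that if the intent of the loop was to insert the rows in chunks, DBMS_XMLGEN.setMaxRows(qryCtx, n) must be called before the loop; otherwise getXML returns everything on the first call and the loop runs only once.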

    I am able to convert the SQL query output and store it in an XMLTYPE column.
    But the code doesn't work when I take the same code and try to implement it inside a trigger. The reason I am doing this is that, when there is an update or delete, I want to store the whole record as an XMLTYPE doc (before and after). The code compiles fine, but when I try to update a record it gives an error.
    CREATE OR REPLACE TRIGGER INS_UPDDEV_EMP3_F
    BEFORE UPDATE OR DELETE ON EMP
    FOR EACH ROW
    DECLARE
      V_DBUSER     VARCHAR2(50);
      V_CHANGETYPE VARCHAR2(20) := 'INSERT';
      QRYCTX       DBMS_XMLGEN.CTXHANDLE;
      OLDVALUE     XMLTYPE;
      NEWVALUE     XMLTYPE;
    BEGIN
      IF :OLD.EMPNO <> :NEW.EMPNO OR DELETING THEN
        -- old value
        QRYCTX := DBMS_XMLGEN.NEWCONTEXT('SELECT :OLD.EMPNO, :OLD.ENAME, :OLD.JOB, :OLD.MGR, :OLD.HIREDATE, :OLD.SAL, :OLD.COMM, :OLD.DEPTNO FROM EMP WHERE EMPNO=:OLD.EMPNO ');
        DBMS_XMLGEN.SETROWTAG(QRYCTX, 'DEPTROW');
        DBMS_XMLGEN.SETROWSETTAG(QRYCTX, 'DEPTSET');
        OLDVALUE := XMLTYPE(DBMS_XMLGEN.GETXML(QRYCTX));
        -- new value
        QRYCTX := DBMS_XMLGEN.NEWCONTEXT('Select * WHERE EMPNO=:NEW.EMPNO '); -- WHERE :OLD.EMPNO
        DBMS_XMLGEN.SETROWTAG(QRYCTX, 'DEPTROW');
        DBMS_XMLGEN.SETROWSETTAG(QRYCTX, 'DEPTSET');
        NEWVALUE := XMLTYPE(DBMS_XMLGEN.GETXML(QRYCTX));
        V_CHANGETYPE := 'UPDATE';
        INSERT INTO ADM_RECAUDITHISTORY VALUES ('table emp', 'fIELD all', OLDVALUE, OLDVALUE,
                                                V_CHANGETYPE, SYSDATE, 'TIGER TRIG');
        COMMIT;
      END IF;
    END;
    I am getting the following error while updating the record.
    >>>>
    ERROR at line 1:
    ORA-19206: Invalid value for query or REF CURSOR parameter
    ORA-06512: at "SYS.DBMS_XMLGEN", line 83
    ORA-06512: at "SCOTT.INS_UPDDEV_EMP3_F", line 13
    ORA-04088: error during execution of trigger 'SCOTT.INS_UPDDEV_EMP3_F'
    >>>>>
    Does anyone know why I am getting this error?
    Thanks
    Siva
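    The ORA-19206 is most likely raised because the string passed to DBMS_XMLGEN.NEWCONTEXT contains the literal text :OLD.EMPNO, which the package cannot bind; the :OLD/:NEW values exist only inside the trigger body. A hedged alternative (column list and tag names follow the trigger above; the audit insert stays as it is) is to build the XML directly from the :OLD values with SQL/XML functions instead of re-querying EMP, which also avoids the mutating-table problem:
    -- inside the trigger body, replacing the DBMS_XMLGEN calls for the old image
    SELECT XMLELEMENT("DEPTSET",
             XMLELEMENT("DEPTROW",
               XMLFOREST(:OLD.EMPNO    AS "EMPNO",
                         :OLD.ENAME    AS "ENAME",
                         :OLD.JOB      AS "JOB",
                         :OLD.MGR      AS "MGR",
                         :OLD.HIREDATE AS "HIREDATE",
                         :OLD.SAL      AS "SAL",
                         :OLD.COMM     AS "COMM",
                         :OLD.DEPTNO   AS "DEPTNO")))
      INTO OLDVALUE
      FROM dual;
    The same pattern with the :NEW values would give NEWVALUE for the update case.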

  • Excel ADODB SQL query execution taking hours when manipulating Excel tables

    Hello All 
    I have 28,000 records with 8 columns in a sheet. When I convert the sheet into an ADODB database and copy it to a new
    Excel file using the code below, it executes in less than a minute:
    Set Tables_conn_obj = New ADODB.Connection
    Tables_conn_str = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" & Table_Filename & ";Extended Properties=""Excel 12.0;ReadOnly=False;HDR = Yes;IMEX=1"""
    Tables_conn_obj.Open Tables_conn_str
    First_Temp_sqlqry = "Select * INTO [Excel 12.0;DATABASE=C:\Prod Validation\Database\Second Acat Table.xlsb].[Sheet1] Table [first - Table$];"
    Tables_conn_obj.Execute First_Temp_sqlqry
    But when I change the query to manipulate one column in the current table based on another table in the same Excel file,
    and then try to copy the results into another Excel file, it takes more than one hour. Why does it take this much time when both queries return the same number of rows and columns? I have spent almost a week on this and still cannot resolve the issue.
    I even tried CopyFromRecordset, GetRows(), GetString(), and looping over each recordset's fields; all of them take
    the same amount of time. Why is there such a huge difference in execution time?
    Important note: without the INTO clause, even the query below executes in a few seconds.
    select ( ''''''manipulating first column based on other table data''''''''''''''
    iif(
    [Second - Table$].[Policy Agent] = (select max([ACAT$].[new_Agent_number]) from [ACAT$] where [ACAT$].[new_Agent_number] = [Second - Table$].[Policy Agent] and [ACAT$].[ACAT_EffectiveDate] = ( select MAX([ACAT$].[ACAT_EffectiveDate] ) from [ACAT$] where [ACAT$].[new_Agent_number] = [Second - Table$].[Policy Agent]and [ACAT$].[ACAT_EffectiveDate] > '2014-10-01') ) , (select max([ACAT$].[Old_Agent_number]) from [ACAT$] where [ACAT$].[new_Agent_number] = [Second - Table$].[Policy Agent] and [ACAT$].[ACAT_EffectiveDate] = ( select MAX([ACAT$].[ACAT_EffectiveDate] ) from [ACAT$] where [ACAT$].[new_Agent_number] = [Second - Table$].[Policy Agent]and [ACAT$].[ACAT_EffectiveDate] > '2014-10-01')) ,
    iif( [Second - Table$].[Policy Agent] = (select max([ACAT$].[Old_Agent_number]) from [ACAT$] where [ACAT$].[Old_Agent_number] = [Second - Table$].[Policy Agent] and [ACAT$].[ACAT_EffectiveDate] = ( select MAX([ACAT$].[ACAT_EffectiveDate] ) from [ACAT$]where [ACA T$].[Old_Agent_number] = [Second - Table$].[Policy Agent]and [ACAT$].[ACAT_EffectiveDate] <= '2014-10-01') ), (select max([ACAT$].[new_Agent_number]) from [ACAT$] where [ACAT$].[Old_Agent_number] = [Second - Table$].[Policy Agent] and [ACAT$].[ACAT_EffectiveDate] = ( select MAX([ACAT$].[ACAT_EffectiveDate] ) from [ACAT$] where [ACAT$].[Old_Agent_number] = [Second - Table$].[Policy Agent]and [ACAT$].[ACAT_EffectiveDate] <= '2014-10-01')) ,
    [Second - Table$].[Policy Agent] ))) as [Policy Agent],
    ''''''summing up all other columns''''''''''''''
    (iif(isnull(sum([Second - Table$].[Auto BW-Line Of Business Detail])),0,sum([Second - Table$].[Auto BW-Line Of Business Detail]))) as [Auto BW-Line Of Business Detail],(iif(isnull(sum([Second - Table$].[Auto Farmers])),0,sum([Second - Table$].[Auto Farmers]))) as [Auto Farmers],(iif(isnull(sum([Second - Table$].[MCA])),0,sum([Second - Table$].[MCA]))) as [MCA],(iif(isnull(sum([Second - Table$].[CEA])),0,sum([Second - Table$].[CEA]))) as [CEA],(iif(isnull(sum([Second - Table$].[Commercial P&C])),0,sum([Second - Table$].[Commercial P&C]))) as [Commercial P&C],(iif(isnull(sum([Second - Table$].[Comm WC])),0,sum([Second - Table$].[Comm WC]))) as [Comm WC],(iif(isnull(sum([Second - Table$].[Fire Farmers])),0,sum([Second - Table$].[Fire Farmers]))) as [Fire Farmers],(iif(isnull(sum([Second - Table$].[Flood])),0,sum([Second - Table$].[Flood]))) as [Flood],(iif(isnull(sum([Second - Table$].[Kraft Lake])),0,sum([Second - Table$].[Kraft Lake]))) as [Kraft Lake],(iif(isnull(sum([Second - Table$].[Life])),0,sum([Second - Table$].[Life]))) as [Life],(iif(isnull(sum([Second - Table$].[Foremost])),0,sum([Second - Table$].[Foremost]))) as [Foremost],(iif(isnull(sum([Second - Table$].[Umbrella])),0,sum([Second - Table$].[Umbrella]))) as [Umbrella],(iif(isnull(sum([Second - Table$].[MCNA])),0,sum([Second - Table$].[MCNA]))) as [MCNA]
    INTO [Excel 12.0;DATABASE=C:\Prod Validation\Database\Second Acat Table.xlsb].[Sheet1]
    from [Second - Table$] group by [Second - Table$].[Policy Agent] ;

    Hi Fei,
      Thank you so much for the reply. I just executed the same SQL above without the INTO statement and assigned the result to an ADODB recordset as below. If the time difference were due to the SQL query, the statement below should also take hours,
    but it executes in seconds. Copying the recordset back to Excel, however, takes hours. I tried CopyFromRecordset,
    GetRows(), GetString(), and looping over each recordset's fields, and all of them take the same amount of time. Please let me know why there is such a delay for this small amount of data.
    I even tried to typecast all columns to double or string in the SQL, and the execution time still did not decrease.
    First_Temp_Recordset.Open sql_qry, Tables_conn_obj, adOpenStatic, adLockOptimistic ''' OR SET First_Temp_Recordset = Tables_conn_obj.Execute sql_qry


  • How to view the sql query?

    Hi,
    How do I view the SQL query formed from the XML structure in the receiver JDBC adapter?

    You can view SAP Note at
    http://service.sap.com/notes
    But you require SMP login ID for this which you should get from your company. The content of the notes are as follows:
    Reason and Prerequisites
    You are looking for additional parameter settings. There are two possible reasons why a feature is available via the "additional parameters" table in the "advanced mode" section of the configuration, but not as documented parameter in the configuration UI itself:
    Category 1: The parameter has been introduced for a patch or a SP upgrade where no UI upgrade and/or documentation upgrade was possible. In this case, the parameter will be moved to the UI and the documentation as soon as possible. The parameter in the "additional parameters" table will be deprecated after this move, but still be working. The parameter belongs to the supported adapter functionality and can be used in all, also productive, scenarios.
    Category 2. The parameter has been introduced for testing purposes, proof-of-concept scenarios, as workaround or as pre-released functionality. In this case, the parameter may or may not be moved to the UI and documentation, and the functionality may be changed, replaced or removed. For this parameter category there is no guaranteed support and usage in productive scenarios is not supported.
    When you want to use a parameter documented here, please be aware to which category it belongs!
    Solution
    The following list shows all available parameters of category 1 or 2. Please note:
    Parameter names are always case-sensitive! Parameter values may be case-sensitive; this is documented for each parameter.
    Parameter names and values as documented below must always be used without quotation marks ("), unless explicitly stated otherwise.
    The default value of a parameter is always chosen so that it does not change the standard functionality.
    JDBC Receiver Adapter Parameters
    1. Parameter name: "logSQLStatement"
       Parameter type: boolean
       Parameter value: true for any string value, false only for empty string
       Parameter value default: false (empty string)
       Available with: SP9
       Category: 2
       Description: When implementing a scenario with the JDBC receiver adapter, it may be helpful to see which SQL statement is generated by the JDBC adapter from the XI message content for error analysis. Before SP9, this can only be found in the trace of the JDBC adapter if trace level DEBUG is activated. With SP9, the generated SQL statement will be shown in the details page (audit protocol) of the message monitor for each message directly.
       This should be used only during the test phase and not in productive scenarios.
    Regards,
    Prateek

  • How to get cm:search to use the max attribute when creating the SQL query?

    When we use the max attribute in the cm:search tag, it does not seem to honor the max attribute when creating the SQL query. However, the result returned from the tag is limited to the number specified by the max attribute. So the tag seems to work as intended, but performance will be suboptimal when the SQL query returns unnecessary rows to the application.
    We use the cm:search tag to list the latest news (ordered by date), and with the current implementation we have to expect a decrease in performance over time as more news is published. We can't live with that; we need to apply the constraint in the SQL query, not in the application.
    The sortBy attribute of cm:search is translated to “order by” in the SQL query, as expected.
    Is it possible to get cm:search to generate the SQL query with an addition of “where rownum <= maxRows”?
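    For illustration, the kind of SQL the poster wants the tag to generate (Oracle syntax; table, columns, and limit are hypothetical) wraps the ordered query so the ROWNUM filter is applied after the ORDER BY:
    select *
      from (select id, title, publish_date
              from news
             order by publish_date desc)
     where rownum <= 10;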

    Hi Erik,
    The behavior of a repository in regards to the search tag's max results parameter is dependent on the underlying repository's implementation. That said, the OOTB repository in WLP does augment the generated SQL to limit the number of rows returned from the database. This is done in the parsing logic. This behavior may differ with other repository implementations.
    -Ryan

  • How to find out which Tables have been accessed without looking at the SQL query ?

    Hi,
    I would like to know if there is a way to find out which queries have been executed, and on which tables, without looking at the SQL query itself.
    I have some old C++ code which calls library functions to access the Oracle database. The source code for the library is not available to me yet. The functions select/update/delete and insert based on input parameters I give. I do not know which tables they affect. How do I find out the actual SQL query and/or the tables it accesses? I was told about the V$SQL view, which holds the most recently executed queries, but I didn't see any queries connected to my process.
    Could anybody help me with this?
    Thanks
    Jagdeep
    [email protected]

    PRECISE/SQL can help you if you have access to it.
    A second option is to turn on SQL_TRACE, run the executable of your C++ program (it will create a trace file in user_dump_dest), and then use TKPROF to see all the queries and their plans as well.
    HTH
    Gagan Deep Singh
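    A minimal sketch of the SQL_TRACE route (trace file and output file names are illustrative):
    -- in the session used by the C++ program
    alter session set sql_trace = true;
    -- ... run the library calls ...
    alter session set sql_trace = false;
    -- then, at the OS prompt, format the trace file written to user_dump_dest:
    -- tkprof ora_12345.trc traced_queries.txt sys=no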

  • In JDBC Sender Adapter , the server is Microsoft SQL .I need to pass current date as the input column while Executing stored procedure, which will get me 10 Output Columns. Kindly suggest me the SQL Query String

    In the JDBC sender adapter, the server is Microsoft SQL. I need to pass the current date as the input column while executing a stored procedure, which will return 10 output columns. Kindly suggest the SQL query string for executing the stored procedure with the current date as the input.
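    A hedged T-SQL sketch of such a query string (procedure and parameter names are made up, and whether the sender channel accepts a multi-statement batch like this should be verified). Since EXEC does not accept a function call as an argument value, GETDATE() is captured in a variable first:
    DECLARE @run_date DATETIME;
    SET @run_date = GETDATE();
    EXEC dbo.usp_GetDailyRecords @as_of_date = @run_date;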

    Hi Srinath,
    The below blog might be useful
    http://scn.sap.com/community/pi-and-soa-middleware/blog/2013/03/06/executing-stored-procedure-from-sender-adapter-in-sap-pi-71
    PI/XI: Sender JDBC adapter for Oracle stored procedures in 5 days
    regards,
    Harish

  • How to display the output of the sql query in a text file using forms

    I want to display the output of the SQL query in a text file using Forms 6.0. The same could be done using the spool command in SQL*Plus, but I want to do it using Forms... Fiaz

    Have a look at the text_io package:
    http://www.oracle.com/webapps/online-help/forms/10g/state?navSetId=_&navId=3&vtTopicFile=f1_help/oraini/c_text_io.html&vtTopicId=
    cheers
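    A minimal Forms-side sketch with TEXT_IO (file path, table, and columns are just examples); note that TEXT_IO writes the file on the client machine running Forms:
    DECLARE
      out_file TEXT_IO.FILE_TYPE;
    BEGIN
      out_file := TEXT_IO.FOPEN('c:\temp\emp_report.txt', 'w');
      FOR rec IN (SELECT empno, ename FROM emp ORDER BY empno) LOOP
        TEXT_IO.PUT_LINE(out_file, rec.empno || ',' || rec.ename);
      END LOOP;
      TEXT_IO.FCLOSE(out_file);
    END;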

  • Find out the sql query for special characters.

    Hi,
    I have the emp table with a column named first_name. In that table I have a lot of records where first_name includes the special characters mentioned below.
    ¡ , ¿, ,Ä,Å,ä,ª,À,Á,Ã,à,á,ã,å,Æ,æ,Ç,ç,È,É,Ê,Ë,è,é,ê,ë,Ì,Í,Î, Ï,ì,í,î,ï, Ñ,ñ, ô, º, Ò, Ó, Ô, Õ, Ö, Ø, ò, ó, õ, ö, ø, ß, Û, Ù, Ú, Ü,ù, ú, û, ü, ÿ,
    I am looking for the records whose first_name includes the special characters mentioned above.
    Can you please give me the SQL query to find those records?
    Thanks&Regards
    N.Sivaraman

    "I am looking for the records whose first_name includes the special characters mentioned above."
    One way would be:
    select emp.*
      from emp,
           table (
             sys.odcivarchar2list ('¡',
                                   'Ä',
                                   'Å',
                                   'ä',
                                   'ª',
                                   'À',
                                   'Á',
                                   'Ã',
                                   'à',
                                   'á',
                                   'ã',
                                   'å',
                                   'Æ',
                                   'æ',
                                   'Ç',
                                   'ç',
                                   'È',
                                   'É',
                                   'Ê',
                                   'Ë',
                                   'è',
                                   'é',
                                   'ê',
                                   'ë',
                                   'Ì',
                                   'Í',
                                   'Î',
                                   'Ï',
                                   'ì',
                                   'í',
                                   'î',
                                   'ï',
                                   'Ñ',
                                   'ñ',
                                   'ô',
                                   'º',
                                   'Ò',
                                   'Ó',
                                   'Ô',
                                   'Õ',
                                   'Ö',
                                   'Ø',
                                   'ò',
                                   'ó',
                                   'õ',
                                   'ö',
                                   'ø',
                                   'ß',
                                   'Û',
                                   'Ù',
                                   'Ú',
                                   'Ü',
                                   'ù',
                                   'ú',
                                   'û',
                                   'ü',
                                   'ÿ'
                                   ))
    where instr (ename, column_value) > 0;
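    If the real goal is simply "any character outside plain ASCII" (which covers every character in the list above), a shorter alternative is to compare against ASCIISTR, here using the question's first_name column:
    select *
      from emp
     where first_name <> asciistr(first_name);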

  • The SQL query is not executing

    Hi
    I have the following situation: in a project we designed our reports to call a stored procedure that exists in a MS SQL Server 2000 database. The stored procedures work fine, and when they are used in the report everything works perfectly.
    The reports are built with CR Designer 11; when the designer finishes them, he passes them to me and we put them in our Java web application. I open them and even preview them from the Crystal Reports perspective of Eclipse and I can see the data, so everything up to this point is OK.
    The problem comes when I change the connection. I'm trying to connect every report to the same host and database, and when the report is displayed in the browser there is no data. I profiled the SQL commands that are executed when the report is requested and found that the report is not executing the stored procedure.
    I guess that because I'm connecting the report to the same database and host that were used when the report was created, and I'm also passing exactly the same stored procedure parameters, the report thinks it doesn't have to execute again because it would be the same information.
    So, I wonder if there is a way to ask the report to execute the SQL query every time I have to display it.
    thanks for any help.

    What happens when you try to view the report using a simple viewer.jsp, without changing the connection?
    2009-05-25 14:06:09,250 ERROR com.businessobjects.reports.sdk.JRCCommunicationAdapter -  detected an exception: Error de conexión: [SQLServer 2000 Driver for JDBC]Error establishing socket.
    But the database server is OK, so I think the rpt needs more information to connect to the database.
    Also what happens if you dont use the data bean and give everything there only?
    The same, the rpt is not executing the stored procedure.
    Did anything change on the RDBMS side? Additional packages, changes to the schema, ownership, etc?
    No, the server is ok and the database is ok.
    If you configure Log4J logging to DEBUG, you should see the database invokes from the Crystal Java engine, specifically the queries sent and the number of rowsets returned. Do you notice any errors?
    I have a problem here with log4j. In my project I'm using Spring, and with the help of Spring I configure log4j; basically, with Spring it is easier to configure log4j.
    I'm telling you this because I have log4j working, but when my application reaches the code that changes the connection of the report, log4j dies and stops sending any messages to stdout or to a log file. I haven't seen this behavior in any other app or library. So I suspect I have something wrong with my log4j, or maybe with log4j+Spring.
    So I removed all the Spring code referring to log4j, but the problem with log4j wasn't solved.
    What I will try now is to remove Spring from the project.
    This is my log4j.properties:
    log4j.rootLogger=DEBUG, stdout
    log4j.appender.stdout=org.apache.log4j.ConsoleAppender
    log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
    log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - %m%n

  • Give me the sql query which calculates the table size in oracle 10g ecc 6.0

    Hi expert,
    Please give me the SQL query which calculates the table size in Oracle 10g, ECC 6.0.
    Regards

    Orkun Gedik wrote:
    select segment_name, sum(bytes)/(1024*1024) from dba_segments where segment_name = '<TABLE_NAME>' group by segment_name;
    Hi,
    This possibly delivers wrong data in MCOD installations.
    Depending on the Oracle version and patch level, dba_segments does not always have the correct data at any given time, especially for indexes right after being rebuilt in parallel (even in DB02, because it is using USER_SEGMENTS).
    It takes a day to get the data back in line (I never found out who did the correction at night; could it be RSCOLL00?).
    Use the above statement with "OWNER = " in the WHERE clause for MCOD, or connect as the schema owner and use USER_SEGMENTS.
    Use it with
    segment_name LIKE '<TABLE_NAME>%'
    if you would like to see the related indexes as well.
    For partitioned objects, a join from dba_tables / dba_indexes to dba_tab_partitions / dba_ind_partitions to dba_segments
    might be needed, especially for hash-partitioned tables, depending on how they were created (partition names SYS_xxxx).
    Volker
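    Putting Volker's suggestions together, an owner-qualified version that also picks up the related indexes might look like this (the schema owner and table name are placeholders):
    select owner, segment_name, segment_type,
           sum(bytes)/1024/1024 as size_mb
      from dba_segments
     where owner = 'SAPSR3'
       and segment_name like '<TABLE_NAME>%'
     group by owner, segment_name, segment_type;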

  • ADF BC / Why bind variables are mandatory in the sql query

    I got this error during view object execution in the component browser:
    (oracle.jbo.SQLStmtException) JBO-27122: SQL error during statement preparation. Statement: SELECT * FROM (Select distinct(socialgroup.socialgroup_i) from socialgroup, socialgroupmember, lodgingallocation, lodge
    where socialgroup.socialgroup_i = socialgroupmember.socialgroup_i and socialgroupmember.t_socialgrouprole_i = t_rolegroup_ipar.fgetflextypologyclassitem_i(t_rolegroup_ipar.fisbeneficiary) and socialgroupmember.datefrom <= :DateTo and nvl(socialgroupmember.dateto, :DateFrom) >= :DateFrom and socialgroupmember.requester_i = lodgingallocation.requester_i and lodgingallocation.datefrom <= :DateTo and nvl(lodgingAllocation.DateTo, :DateFrom) >= :DateFrom and lodgingallocation.lodge_i = lodge.lodge_i) QRSLT ORDER BY "SOCIALGROUP_I"
    ----- LEVEL 1: DETAIL 0-----
    (java.sql.SQLException) Attempt to set a parameter name that is not in the SQL: T_SICategory_I
    The bind variable T_SICategory_I is not yet used in the SQL query but will be used later, so I defined it for the view object.
    Is that the reason the runtime checks this? It would be more convenient to be able to define bind variables early, when defining the view object, and add them to the query later during development iterations.

    Design-time defined named bind variables can be marked as required, or not.
    This is decided by their use in the where clause or in a view criteria. In my case the bind variable was used neither in the where clause nor in a view criteria, and that caused the error.
    Maybe it would be nice to add a check box (a flag) to enable or disable this check for a bind variable, so that it can stay defined even if the part of the query using it has been removed, or if it is defined ahead of a part of the query that is not yet written.
    In my current case I fully defined the bind variable but will refine my query later; as it is, I would have to remove this bind variable and lose its whole definition just to run this view object.
    Sorry for my English ...
