Executing DDL over a DB Link

Hi there gurus...
Version: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
As a test of something I want to do, I'm submitting the following job (whose PL/SQL block issues a CREATE TABLE) to a remote schema over a DB link:
declare
    v number;
begin
    dbms_job.submit@GL_UPDATE_DBA(v, '
        declare
            v varchar2(2000);
        begin
            execute immediate ''create table tempcb2 (x number)'';
        exception
            when others then
                v := SQLERRM;
                insert into tempcb1 (x) values (v);
                commit;
                raise;
        end;');
    commit;
end;

The DB link GL_UPDATE_DBA connects to a remote DB instance using a schema called SUPPORT_DBA.
When I run this code, the job is submitted fine in the SUPPORT_DBA schema on the remote DB. However, an exception is captured and written to the table TEMPCB1 stating: ORA-01031: insufficient privileges.
When I check the job details on the remote DB using
SELECT * FROM USER_JOBS
in the SUPPORT_DBA schema, the LOG_USER, PRIV_USER and SCHEMA_USER columns all show the expected value: SUPPORT_DBA.
From the documentation description of PRIV_USER, this job should be running with the privileges of SUPPORT_DBA and therefore, to my feeble mind at least(!), should be able to create a table within that schema.
Why is the job unable to create a table in the SUPPORT_DBA schema? Have I misinterpreted the way jobs (and their associated privileges) work?
Thanks in advance...

Hi,
Anonymous PL/SQL blocks always run with invoker's rights, but since you're using a fixed-user DB link the invoker should be the fixed user anyway, which makes me wonder. I wish I had another instance here to test this properly.
Anyway, I believe that for this to work you will need to grant the relevant privileges to SUPPORT_DBA directly, NOT through roles.
Users connect using the username and password referenced in the link. For example, if Jane uses a fixed user link that connects to the hq database with the username and password scott/tiger, then she connects as scott. Jane has all the privileges in hq granted to scott directly, and all the default roles that scott has been granted in the hq database.
http://download.oracle.com/docs/cd/E11882_01/server.112/e10595/ds_concepts002.htm
If it doesn't work regardless of the direct grants, could you post your CREATE DATABASE LINK statement please (I don't want to see the password :p)?
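As a minimal first check (a sketch, assuming the ORA-01031 comes from CREATE TABLE being held only through a role), a DBA on the remote instance could grant the privilege directly and resubmit the test:

    -- On the remote database, as a privileged user:
    GRANT CREATE TABLE TO support_dba;

    -- Then resubmit the job from the local side and check the outcome again:
    SELECT job, what, failures, broken FROM user_jobs;   -- run as SUPPORT_DBA on the remote DB

If the table then appears, the missing direct grant was the cause.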

Similar Messages

  • What is the best way to submit a Concurrent Request over a DB Link?

    Hi,
    We have a requirement to submit a Concurrent Request over a DB Link. What is the best way to do this?
    What I've done so far: I've created a function in the EBS instance that executes FND_GLOBAL.APPS_INITIALIZE and submits the Concurrent Request, and I call this function remotely from our non-EBS database. It seems to work fine, but I found out from MetaLink article ID 466800.1 that this is not recommended.
    Why are Concurrent Programs Calling FND_GLOBAL.APPS_INITIALIZE Using DBLinks Failing? [ID 466800.1]
    https://support.oracle.com/epmos/faces/ui/km/SearchDocDisplay.jspx?_afrLoop=11129815723825&type=DOCUMENT&id=466800.1&displayIndex=1&_afrWindowMode=0&_adf.ctrl-state=17dodl8lyp_108
    Can anyone suggest a better approach?
    Thanks,
    Allen

    What I've done so far: I've created a function in the EBS instance that executes FND_GLOBAL.APPS_INITIALIZE and submits the Concurrent Request, and I call this function remotely from our non-EBS database. It seems to work fine, but I found out from MetaLink article ID 466800.1 that this is not recommended.
    Why are Concurrent Programs Calling FND_GLOBAL.APPS_INITIALIZE Using DBLinks Failing? [ID 466800.1]
    https://support.oracle.com/epmos/faces/ui/km/SearchDocDisplay.jspx?_afrLoop=11129815723825&type=DOCUMENT&id=466800.1&displayIndex=1&_afrWindowMode=0&_adf.ctrl-state=17dodl8lyp_108
    Can anyone suggest a better approach?
    Please log an SR and ask Oracle Support for a better (alternative) approach. You can mention in the SR that your approach works properly and ask what the implications of using it would be (even though it is not recommended).
    Thanks,
    Hussein
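    For reference, a rough sketch of the kind of wrapper function described above; every name and ID here is hypothetical, and FND_REQUEST.SUBMIT_REQUEST takes many more optional arguments than shown:

        create or replace function submit_remote_request return number
        is
            l_request_id number;
        begin
            -- Hypothetical user/responsibility IDs; look them up in FND_USER / FND_RESPONSIBILITY.
            fnd_global.apps_initialize(user_id => 1234, resp_id => 5678, resp_appl_id => 101);
            l_request_id := fnd_request.submit_request(
                                application => 'XXCUST',       -- hypothetical application short name
                                program     => 'XXCUST_PROG',  -- hypothetical concurrent program short name
                                description => NULL,
                                start_time  => NULL,
                                sub_request => FALSE);
            commit;
            return l_request_id;
        end;

        -- From the non-EBS database (a PL/SQL call rather than a SELECT, since the function does DML):
        declare
            l_id number;
        begin
            l_id := submit_remote_request@ebs_link;   -- ebs_link is a hypothetical DB link name
        end;

    As the note above says, this pattern is not supported by Oracle, so treat it as an illustration of the approach being questioned rather than a recommendation.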

  • Insert, update and delete trigger over multiple Database Links

    Hello guys,
    first of all I'll explain my environment.
    I've got a master DB and n slave databases. Insert, update and delete are only possible on the master DB (in my opinion this was the best way to avoid data inconsistencies due to locking problems) and should be propagated to the slave databases by a trigger. All slave databases are attached via DB links. In addition, I'd like to create a job that merges the master DB into all slave DBs every x minutes to restore consistency if any error (e.g. a network crash) occurs.
    What I want to do now is iterate over all DB links in my trigger and issue the insert/update/delete against all attached databases (see the sketch after this post).
    This is possible with EXECUTE IMMEDIATE, but it requires me to build SQL strings with textually encoded field values for the above commands.
    What I would like to know is whether there are better ways to provide these functions. It is important to me that all DB links are read dynamically from a table and that I don't have to do unnecessary string generation, which might hurt performance.
    I'm thankful for every Idea.
    Thank you in advance,
    best regards
    Christoph
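    A minimal sketch of the trigger loop described above; table and column names are hypothetical, and bind variables keep the string building down (only the DB link name itself has to be concatenated):

        create or replace trigger trg_sync_slaves
        after insert or update or delete on master_table              -- hypothetical replicated table
        for each row
        begin
            for l in (select db_link_name from slave_db_links) loop   -- hypothetical link registry table
                if inserting then
                    execute immediate
                        'insert into master_table@' || l.db_link_name || ' (id, val) values (:1, :2)'
                        using :new.id, :new.val;
                elsif updating then
                    execute immediate
                        'update master_table@' || l.db_link_name || ' set val = :1 where id = :2'
                        using :new.val, :old.id;
                else
                    execute immediate
                        'delete from master_table@' || l.db_link_name || ' where id = :1'
                        using :old.id;
                end if;
            end loop;
        end;

    Note that this makes every DML statement on the master part of a distributed transaction across all slaves, so one unreachable slave will block DML on the master; that is the trade-off behind the consistency requirements discussed below.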

    Well, I've been using MySQL for a long time, yes, but I thought this approach would be the best fit for my requirements.
    Materialized views don't work for me, because I need real-time updates of the slaves.
    So, sorry for asking so generally, but what would be the best technology for the following problem:
    I've got n globally spread systems. Each of them can update records in the database. The easiest way would be to provide one central DB, but that doesn't work for me, because when the WAN connection fails the system isn't available any longer. So I need to provide core information locally at every system (connected via LAN).
    Very important to me is that data remain consistent. That means it must not happen that two systems update the same record in two different databases at the same time.
    I hope you understand what I'd need.
    Thank you very much for all your replies.
    best regards
    Christoph
    PS: I forgot to mention that the databases won't be very large, just about 20k records and about 10 queries per second during peak times, and only one table needs to be synced.
    Edited by: 907142 on 10.01.2012 23:14

  • Calling a Procedure and Function over a db link

    I am experiencing some errors with the following and would greatly appreciate the advice of some experts here.
    My use case is to insert some records into a table via a database link. The records to be inserted are queried from an identical table in the local database. Everything works, but occasionally I get a unique constraint violation if I try to insert a duplicate record, so I wrote a simple function to check for that scenario. My issue is that I can run my procedure using the DB link and I can run my function using the DB link, but I can't use them both together without getting errors.
    My test case just uses the standard emp table:
    create or replace procedure test_insert(p_instance varchar2)
    IS
        l_sql varchar2(4000);
    BEGIN
        l_sql := 'insert into EMP@'||p_instance||' (EMPNO, ENAME, JOB, MGR, SAL, DEPTNO) (Select EMPNO, ENAME, JOB, MGR, SAL, DEPTNO from EMP)';
        execute immediate l_sql;
    END;

    BEGIN
        test_insert('myLink');
    END;
    This works fine and the insert occurs without any issues.
    If I run the same procedure a second time I get:
    00001. 00000 -  "unique constraint (%s.%s) violated" which is what I expect since EMPNO has a unique constraint. So far so good.
    Now I create a function to test whether the record exists:
    create or replace function record_exists(p_empno IN NUMBER, p_instance IN varchar2) return number
    IS
        l_sql   varchar2(4000);
        l_count number;
    BEGIN
        l_sql := 'select count(*) from EMP@'||p_instance||' where empno = '||p_empno;
        execute immediate l_sql into l_count;
        IF l_count > 0 THEN
            return 1;
        ELSE
            return 0;
        END IF;
    END;
    I test this as follows:
    select record_exists(8020, 'myLink') from dual;
    RECORD_EXISTS(8020,'myLink')
                                              1
    That works ok, so now I will add that function to my procedure:
    create or replace procedure test_insert(p_instance varchar2)
    IS
        l_sql varchar2(4000);
    BEGIN
        l_sql := 'insert into EMP@'||p_instance||' (EMPNO, ENAME, JOB, MGR, SAL, DEPTNO) (Select EMPNO, ENAME, JOB, MGR, SAL, DEPTNO from EMP WHERE record_exists( EMPNO, '''||p_instance||''') = 0)';
        execute immediate l_sql;
    END;
    I test this as follows:
    BEGIN
    test_insert('myLink');
    END;
    Result is:
    Error report:
    ORA-02069: global_names parameter must be set to TRUE for this operation
    ORA-06512: at "FUSION.TEST_INSERT", line 6
    ORA-06512: at line 2
    02069. 00000 -  "global_names parameter must be set to TRUE for this operation"
    *Cause:    A remote mapping of the statement is required but cannot be achieved
               because global_names should be set to TRUE for it to be achieved
    *Action:   Issue alter session set global_names = true if possible
    I don't know why I am getting this. The function works, the procedure works, but when I combine them I get an error. If I set the global names parameter to true and then rerun this I get:
    02085. 00000 -  "database link %s connects to %s"
    *Cause:    a database link connected to a database with a different name.
               The connection is rejected.
    *Action:   create a database link with the same name as the database it
               connects to, or set global_names=false.
    Any advice is much appreciated. I don't understand why I can run the procedure and the function each separately over the db link but they don't work together.
    thank you,
    john

    The proper approach depends on how you define failure and what it should mean: error back to the caller, log and continue, or just continue. Constraints are created to ensure invalid data does not get into a table. Generally they provide the most efficient mechanism for checking for invalid data, and they raise useful exceptions which the caller can handle. By adding your own check you are also doubling the workload of the uniqueness check.
    In general I'd say use exceptions, they are your friend.
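    A minimal sketch of the exception-based approach, using the names from the test case above and letting the constraint do the work:

        create or replace procedure test_insert(p_instance varchar2)
        IS
            l_sql varchar2(4000);
        BEGIN
            l_sql := 'insert into EMP@'||p_instance||' (EMPNO, ENAME, JOB, MGR, SAL, DEPTNO) (Select EMPNO, ENAME, JOB, MGR, SAL, DEPTNO from EMP)';
            BEGIN
                execute immediate l_sql;
            EXCEPTION
                WHEN dup_val_on_index THEN
                    NULL;   -- duplicates already exist remotely: log, ignore, or re-raise as needed
            END;
        END;

    Note that with a single multi-row INSERT the whole statement rolls back on the first duplicate, so if the non-duplicate rows must still be inserted, a MERGE or a row-by-row loop over the source rows would be needed instead.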

  • Cannot execute report over portal  (WWC-51000)

    Hello forum,
    i created a report in Report Builder and then tried to deploy it into our 10g AS Portal (a new installation with one infrastructure machine and one portal machine). I registered the Reports Server and created report definition file access for this file. When I try to run the report via the "run as portlet" link I get the following error message:
    The preference path does not exist: ORACLE.WEBVIEW.PARAMETERS.1109700064110 (WWC-51000)
    If I try to execute the report via the "run" link in the RDF access section, I see the following error message:
    The requested URL /pls/portal/testprovider.RPT_testreport.show was not found on this server.
    I placed the RDF file in the reports/samples/demo directory of our portal instance machine. So what is wrong here? Is the Reports Server configured incorrectly, and if so, where?
    thanks for your help!!
    katharina

    Hi Jaya,
    Have you got your web template in an iView?  Do you get an error when trying to access it from the portal?
    Cheers,
    Nick

  • View for Last executed DDL ?

    Hi,
    Is there any view to find out the last executed DDL?
    I need to know the last change that happened in the database.
    Thanking You
    Jeneesh

    From asktom :
    Tom,
    Can I have my last executed SQL retrieved from the database, just by passing the SID? I want the SQL executed by my application, not the ones executed by Oracle itself.
    Followup   November 12, 2003 - 5pm US/Eastern:
    you'll get the last sql executed by joining v$session to v$sql. if the last sql executed was
    recursive SYS sql -- you'll get that (that would be the last sql).  we really don't discriminate
    between them
    Forgot to tell you: this view will age out, so there is no absolute guarantee you will find all the SQL executed since the database instance was started. The other option is auditing. For more information refer to the Oracle manual.
    Regards
    Ra
    Message was edited by:
    s.rajaram
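    A minimal sketch of the v$session / v$sql join mentioned above (the PREV_* columns give the last statement a session executed; on 10g and later a join on SQL_ID works as well):

        select s.sid, s.username, q.sql_text
          from v$session s
          join v$sql     q
            on q.address    = s.prev_sql_addr
           and q.hash_value = s.prev_hash_value
         where s.sid = :sid;   -- the session you are interested in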

  • Getting an error while executing ddl commands using dblink

    Hi,
    I am using Oracle 9iR2.
    I have created the procedure below to execute DDL commands on a remote database through a DB link using DBMS_SQL.
    CREATE OR REPLACE PROCEDURE run_remote_ddl (p_dblink VARCHAR2, qry VARCHAR2)
    AS
        c_handle NUMBER;
        feedback INTEGER;
        stat     VARCHAR2 (2000);
    BEGIN
        stat := 'select DBMS_SQL.open_cursor' || p_dblink || ' from dual';
        EXECUTE IMMEDIATE stat INTO c_handle;

        stat := 'begin DBMS_SQL.parse' || p_dblink || ' (' || c_handle || ',''' || qry || ''', DBMS_SQL.v7); end;';
        EXECUTE IMMEDIATE stat;

        stat := 'select DBMS_SQL.EXECUTE' || p_dblink || '(' || c_handle || ') from dual';
        EXECUTE IMMEDIATE stat INTO feedback;

        stat := 'declare x integer; begin x := :1; DBMS_SQL.close_cursor' || p_dblink || '(x); end;';
        EXECUTE IMMEDIATE stat USING c_handle;
    END;
    When I run this procedure like this:
    begin
        run_remote_ddl ('@dblink', 'create table scott.ttt(num number)');
    end;
    I get an error:
    ORA-06553: PLS-103: Encountered the symbol ".2" when expecting one of the following:
    . ( * @ & = - + ; < / > at in is mod not rem
    <an exponent (**)> <> or != or ~= >= <= <> and or like
    between ||
    The symbol ". was inserted before ".2" to continue.
    ORA-06512: at "RUN_REMOTE_DDL", line 9
    ORA-06512: at line 2
    Please tell me how to resolve this.
    Thanks in advance.

    Hi,
    >
    ORA-06553: PLS-103: Encountered the symbol ".2" when expecting one of the following:
    . ( * @ & = - + ; < / > at in is mod not rem
    <an exponent (**)> or != or ~= >= <= <> and or like
    between
    >
    Hope you are not typing 2 instead of @ as both are on the same key
    Can you run the following and see what happens:
    CREATE OR REPLACE PROCEDURE run_remote_ddl (p_dblink VARCHAR2, qry VARCHAR2)
    AS
        c_handle NUMBER;
        feedback INTEGER;
        stat     VARCHAR2 (2000);
    BEGIN
        dbms_output.put_line(p_dblink);
        stat := 'select DBMS_SQL.open_cursor@dblink from dual';
        --stat := 'select DBMS_SQL.open_cursor from dual';
        EXECUTE IMMEDIATE stat INTO c_handle;
    END;

    exec run_remote_ddl('@dblink', 'create table scott.ttt(num number)');
    Regards
    Edited by: yoonus on Feb 20, 2013 3:47 AM
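    As an aside, a simpler way to run DDL on the remote database (a sketch, assuming the remote user has the required privilege granted directly) is to call DBMS_UTILITY.EXEC_DDL_STATEMENT through the link:

        begin
            dbms_utility.exec_ddl_statement@dblink('create table scott.ttt (num number)');
        end;

    This avoids building the DBMS_SQL calls as dynamic strings altogether.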

  • Analysis Services Execute DDL Task Internal error

    Hi all,
    I need help in solving this sporadic problem... the dimension below (blanked out as xxxx) is based on a view. The cube processes fine many times, but fails abruptly about once a week. It is being called from a SQL job.
    Any ideas?  Thanks in advance!!
    Error: 2010-06-28 15:45:17.69     Code: 0xC1000007     Source: xxxxxxx Analysis Services Execute DDL Task     Description: Internal error: The operation terminated unsuccessfully.  End Error 
    Error: 2010-06-28 15:45:17.69     Code: 0xC11F000D     Source: xxxxxxx Analysis Services Execute DDL Task     Description: Errors in the OLAP storage engine: An error occurred while the 'xxxxx' attribute
    of the 'xxxxx' dimension from the 'xxxxxx' database was being processed.  End Error  Error: 2010-06-28 15:45:17.69     Code: 0xC11F0006     Source: xxxxxxx Analysis Services Execute DDL Task    
    Description: Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation.  End Error  Error: 2010-06-28 15:45:17.69    
    Code: 0xC11C0002     Source: xxxxxx Analysis Services Execute DDL Task     Description: Server: The operation has been cancelled.  End Error  DTExec: The package execution returned DTSER_FAILURE (1). 
    Started:  3:45:00 PM  Finished: 3:45:17 PM  Elapsed:  17.328 seconds.  The package execution failed.  The step failed.
    Harsh B

    (From http://msdn.microsoft.com/en-us/library/cc966526.aspx)
    "ExternalCommandTimeout is a server property that is used to set the number of seconds that SSAS should wait to time out when issuing commands to external data sources, such as relational and other OLAP sources."
    "ExternalConnectionTimeout is a server property that is used to set the number of seconds, by default, that SSAS should wait to time out when connecting to external data sources, such as relational and other OLAP sources."
    What you set them to is entirely up to you. But as I see in your log, your task failed after 17 and 9 seconds, so I think you could multiply the values by 10 and the situation probably wouldn't change. Still, it is worth checking at least.
    So... how often does your fact table get new (degenerate) dimension reference values? Is it a Process Update or a Process Full command? I really would like to see it :)
    -- Zoltán Horváth
    -- MCITP SQL Server Business Intelligence Developer 2005, 2008
    -- Please mark posts as answered or helpful where appropriate.

  • Analysis Service Execute DDL Task throwing error with SourceType Variable

    Hi,
    I have configured the Analysis Services Execute DDL Task to use a Variable and Process Data (XMLA script) as shown below:
    When I execute this task I get the below error message:
    [Analysis Services Execute DDL Task] Error: The -->
    text node at line 23, column 3 cannot appear inside the DataSource element (namespace http://schemas.microsoft.com/analysisservices/2003/engine) under Envelope/Body/Execute/Command/Batch/Parallel/Process. This element can
    only have text nodes containing white-space characters.
    Can anyone please let me know how to resolve this.

    If I run it using the SourceType "Direct Input", the Analysis Services Execute DDL Task runs fine, but if I use the SourceType "Variable" it throws the error. Below is the XMLA script.
    Here is my package layout and the XMLA script; it is failing at the "ProcessAdd" Analysis Services Execute DDL task:
    "SELECT '<Batch xmlns=\"http://schemas.microsoft.com/analysisservices/2003/engine\">
    <ErrorConfiguration xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns:ddl2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2\" xmlns:ddl2_2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2/2\" xmlns:ddl100_100=\"http://schemas.microsoft.com/analysisservices/2008/engine/100/100\" xmlns:ddl200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200\" xmlns:ddl200_200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200/200\" xmlns:ddl300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300\" xmlns:ddl300_300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300/300\" xmlns:ddl400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400\" xmlns:ddl400_400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400/400\">
    <KeyNotFound>IgnoreError</KeyNotFound>
    </ErrorConfiguration>
    <Parallel>
    <Process xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns:ddl2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2\" xmlns:ddl2_2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2/2\" xmlns:ddl100_100=\"http://schemas.microsoft.com/analysisservices/2008/engine/100/100\" xmlns:ddl200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200\" xmlns:ddl200_200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200/200\" xmlns:ddl300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300\" xmlns:ddl300_300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300/300\" xmlns:ddl400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400\" xmlns:ddl400_400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400/400\">
    <Object>
    <DatabaseID>IIS_Version2</DatabaseID>
    <DimensionID>Application</DimensionID>
    </Object>
    <Type>ProcessAdd</Type>
    <DataSource xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns:ddl2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2\" xmlns:ddl2_2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2/2\" xmlns:ddl100_100=\"http://schemas.microsoft.com/analysisservices/2008/engine/100/100\" xmlns:ddl200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200\" xmlns:ddl200_200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200/200\" xmlns:ddl300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300\" xmlns:ddl300_300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300/300\" xmlns:ddl400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400\" xmlns:ddl400_400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400/400\" xmlns:dwd=\"http://schemas.microsoft.com/DataWarehouse/Designer/1.0\" xsi:type=\"RelationalDataSource\" dwd:design-time-name=\"1a3cb292-9bce-4c59-a182-177d6b3506ff\" xmlns=\"http://schemas.microsoft.com/analysisservices/2003/engine\">
    <ID>IISDW</ID>
    <Name>IISDW</Name>
    <ConnectionString>Provider=SQLNCLI11.1;Data Source=CO1MSFTSQLHKT02;Integrated Security=SSPI;Initial Catalog=IISDW</ConnectionString>
    <Timeout>PT0S</Timeout>-->
    </DataSource>
    <DataSourceView xmlns:xsd=\"http://www.w3.org/2001/XMLSchema\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns:ddl2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2\" xmlns:ddl2_2=\"http://schemas.microsoft.com/analysisservices/2003/engine/2/2\" xmlns:ddl100_100=\"http://schemas.microsoft.com/analysisservices/2008/engine/100/100\" xmlns:ddl200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200\" xmlns:ddl200_200=\"http://schemas.microsoft.com/analysisservices/2010/engine/200/200\" xmlns:ddl300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300\" xmlns:ddl300_300=\"http://schemas.microsoft.com/analysisservices/2011/engine/300/300\" xmlns:ddl400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400\" xmlns:ddl400_400=\"http://schemas.microsoft.com/analysisservices/2012/engine/400/400\" xmlns:dwd=\"http://schemas.microsoft.com/DataWarehouse/Designer/1.0\" dwd:design-time-name=\"b0b61205-c64d-4e34-afae-6d4d48b93fb3\" xmlns=\"http://schemas.microsoft.com/analysisservices/2003/engine\">
    <ID>IISDW</ID>
    <Name>IISDW</Name>
    <DataSourceID>IISDW</DataSourceID>
    <Schema>
    <xs:schema id=\"IISDW_x0020_1\" xmlns=\"\" xmlns:xs=\"http://www.w3.org/2001/XMLSchema\" xmlns:msdata=\"urn:schemas-microsoft-com:xml-msdata\" xmlns:msprop=\"urn:schemas-microsoft-com:xml-msprop\">
    <xs:element name=\"IISDW_x0020_1\" msdata:IsDataSet=\"true\" msdata:UseCurrentLocale=\"true\" msprop:design-time-name=\"72037318-e316-469d-9a45-a10c77709b39\">
    <xs:complexType>
    <xs:choice minOccurs=\"0\" maxOccurs=\"unbounded\">
    <xs:element name=\"Application\" msprop:design-time-name=\"7f579e7e-e8b7-4a9d-8a93-a255fccbbfbe\" msprop:IsLogical=\"True\" msprop:FriendlyName=\"Application\" msprop:DbTableName=\"Application\" msprop:TableType=\"View\" msprop:Description=\"\" msprop:QueryDefinition=\"SELECT a.Application, DATEADD([hour], DATEDIFF([hour], 0, a.[Timestamp]), 0) AS [Timestamp], a.ServerName, CAST(a.ServerName AS char(3)) AS DataCenter, a.CS_URI_Stem, CAST(HashBytes(''MD5'', &#xD;&#xA; a.Application + a.ServerName + a.CS_URI_Stem + CAST(DATEADD([hour], DATEDIFF([hour], 0, a.[Timestamp]), 0) AS varchar(24))) AS uniqueidentifier) AS Server_URI_Identity&#xD;&#xA;FROM IIS_6_OLD AS a LEFT OUTER JOIN&#xD;&#xA; Dimension_Pointer AS b ON a.Application = b.Application&#xD;&#xA;WHERE (b.ProcessedFlag = 0) AND (a.Application IN ("+(DT_WSTR,100) @[User::strDistinctApplication]+"))&#xD;&#xA;GROUP BY a.Application, DATEADD([hour], DATEDIFF([hour], 0, a.[Timestamp]), 0), a.ServerName, a.CS_URI_Stem\" msprop:QueryBuilder=\"SpecificQueryBuilder\">
    <xs:complexType>
    <xs:sequence>
    <xs:element name=\"Application\" msprop:design-time-name=\"f3074e98-4a82-4bc5-a818-916203f7758b\" msprop:DbColumnName=\"Application\" msprop:FriendlyName=\"Application\" minOccurs=\"0\">
    <xs:simpleType>
    <xs:restriction base=\"xs:string\">
    <xs:maxLength value=\"255\" />
    </xs:restriction>
    </xs:simpleType>
    </xs:element>
    <xs:element name=\"Timestamp\" msdata:ReadOnly=\"true\" msprop:design-time-name=\"2662e3a8-8b1a-4d77-aecb-575329b84dc1\" msprop:DbColumnName=\"Timestamp\" msprop:FriendlyName=\"Timestamp\" type=\"xs:dateTime\" />
    <xs:element name=\"ServerName\" msprop:design-time-name=\"ced26d49-cd6e-4073-a40c-ff5ef70e4ef1\" msprop:DbColumnName=\"ServerName\" msprop:FriendlyName=\"ServerName\" minOccurs=\"0\">
    <xs:simpleType>
    <xs:restriction base=\"xs:string\">
    <xs:maxLength value=\"255\" />
    </xs:restriction>
    </xs:simpleType>
    </xs:element>
    <xs:element name=\"DataCenter\" msdata:ReadOnly=\"true\" msprop:design-time-name=\"4583e15a-dcf1-45a2-a30b-bd142ca8b778\" msprop:DbColumnName=\"DataCenter\" msprop:FriendlyName=\"DataCenter\" minOccurs=\"0\">
    <xs:simpleType>
    <xs:restriction base=\"xs:string\">
    <xs:maxLength value=\"3\" />
    </xs:restriction>
    </xs:simpleType>
    </xs:element>
    <xs:element name=\"CS_URI_Stem\" msprop:design-time-name=\"10db5a79-8d50-49d2-9376-a3b4d19864b9\" msprop:DbColumnName=\"CS_URI_Stem\" msprop:FriendlyName=\"CS_URI_Stem\" minOccurs=\"0\">
    <xs:simpleType>
    <xs:restriction base=\"xs:string\">
    <xs:maxLength value=\"4000\" />
    </xs:restriction>
    </xs:simpleType>
    </xs:element>
    <xs:element name=\"Server_URI_Identity\" msdata:DataType=\"System.Guid, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089\" msdata:ReadOnly=\"true\" msprop:design-time-name=\"018ede0a-e15e-47c2-851b-f4431e8c839c\" msprop:DbColumnName=\"Server_URI_Identity\" msprop:FriendlyName=\"Server_URI_Identity\" type=\"xs:string\" />
    </xs:sequence>
    </xs:complexType>
    </xs:element>
    </xs:choice>
    </xs:complexType>
    <xs:unique name=\"Constraint1\" msprop:IsLogical=\"True\" msdata:PrimaryKey=\"true\">
    <xs:selector xpath=\".//Application\" />
    <xs:field xpath=\"Server_URI_Identity\" />
    <xs:field xpath=\"Timestamp\" />
    </xs:unique>
    </xs:element>
    </xs:schema>
    <IISDW_x0020_1 xmlns=\"\" />
    </Schema>
    </DataSourceView>
    <WriteBackTableCreation>UseExisting</WriteBackTableCreation>
    </Process>
    </Parallel>
    </Batch>' as XMLAScript_ProcessData"

  • Executing ddl statment in pl/sql block

    Hi all,
    Could anyone please suggest a method for executing a DDL statement in a PL/SQL block other than EXECUTE IMMEDIATE?

    could anyone pls help in method for executing ddl statment in pl/sql block other than 'execute immediate' ?
    On newer DB versions you have more options:
    SQL> desc t
    Error: object t does not exist
    SQL> exec sys.dbms_prvtaqim.execute_stmt('create table michael.t (a integer)')
    PL/SQL procedure successfully completed.
    SQL> desc t
    TABLE t
    Name                                      Null?    Type                       
    A                                                  NUMBER                     
    SQL> exec sys.dbms_utility.exec_ddl_statement('drop table michael.t')
    PL/SQL procedure successfully completed.
    SQL> desc t
    Error: object t does not exist
    ;)
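    Another option on any supported version is DBMS_SQL, where DDL is executed at parse time (a minimal sketch):

        DECLARE
            c INTEGER := DBMS_SQL.OPEN_CURSOR;
        BEGIN
            -- For DDL, the statement runs during PARSE; no call to DBMS_SQL.EXECUTE is needed.
            DBMS_SQL.PARSE(c, 'create table t (a integer)', DBMS_SQL.NATIVE);
            DBMS_SQL.CLOSE_CURSOR(c);
        END;
        /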

  • How to test issue with accessing tables over a DB link?

    Hey all,
    Using APEX 3.1.2 on XE, I have a little app. The database schema for this app contains only views onto the actual tables, which reside over a database link in a 10.1.0.5.0 DB.
    I ran across an issue where a filter I made on a form refused to work (see [this ApEx thread| http://forums.oracle.com/forums/message.jspa?messageID=3178959] ). I verified that the issue only happens when the view points to a table across a DB link, by recreating the table in the local DB and pointing the view at it. When I do this, the filter works fine. When I change the view back to the remote table, it fails. And it only fails in the filter; every other report and every other tool accessing the remote table via the view works fine.
    Anyone know how I can troubleshoot this? For kicks, I also tried a 10.2.0.3.0 DB for the remote link, but got the same results.
    TIA,
    Rich
    Edited by: socpres on Mar 2, 2009 3:44 PM
    Accidental save...
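    For reference, the setup being described boils down to local views over the link, roughly (hypothetical names):

        -- In the local XE schema:
        create or replace view emp_v as
            select * from emp@remote_link;   -- remote_link is a hypothetical DB link to the 10.1.0.5.0 DB

    Recreating the underlying table locally and repointing such a view is how the poster isolated the problem to the DB link.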

    ittichai wrote:
    Rich,
    I searched metalink for your issue. This may be a bug in 3.1 which will be fixed in 4.0. Please see Doc ID 740581.1 Database Link Tables Do NoT Show Up In Table Drop Down List In Apex. There is a workaround mentioned in the document.
    I'm not sure why I never thought of searching MetaLink, but thanks for the pointer! It doesn't match my circumstances, however. The bug smells like a view not being queried in the APEX development tool itself -- i.e. the IDE's code needs changing, not necessarily the apps created with the IDE.
    I'm working on getting you access to my hosted app...
    Thanks,
    Rich

  • TE over ospf virtual-link does not work

    Hi Expert,
    I want to practise the TE over ospf virtual-link. The topo is like this one:
    R1   R2
    |    |
    R3---R4
    |    |
    R5   R6
    all links are in area 0 except link between R3 and R4.
    rt5_1#ro
    router ospf 1
    router-id 1.1.1.1
    log-adjacency-changes
    network 0.0.0.0 255.255.255.255 area 0
    mpls traffic-eng router-id Loopback0
    mpls traffic-eng area 0
    interface Tunnel16
    ip unnumbered Loopback0
    tunnel destination 6.6.6.6
    tunnel mode mpls traffic-eng
    tunnel mpls traffic-eng autoroute announce
    tunnel mpls traffic-eng priority 5 5
    tunnel mpls traffic-eng bandwidth 5
    tunnel mpls traffic-eng path-option 10 dynamic
    rt5_2#ro
    router ospf 1
    router-id 2.2.2.2
    log-adjacency-changes
    network 0.0.0.0 255.255.255.255 area 0
    mpls traffic-eng router-id Loopback0
    mpls traffic-eng area 0
    rt5_3#ro
    router ospf 1
    router-id 3.3.3.3
    log-adjacency-changes
    area 1 virtual-link 4.4.4.4
    network 3.3.3.3 0.0.0.0 area 0
    network 10.10.13.0 0.0.0.255 area 0
    network 10.10.34.0 0.0.0.255 area 1
    network 10.10.35.0 0.0.0.255 area 0
    mpls traffic-eng router-id Loopback0
    mpls traffic-eng area 0
    mpls traffic-eng interface Ethernet1/0/1 area 0
    rt5_4#ro
    router ospf 1
    router-id 4.4.4.4
    log-adjacency-changes
    area 1 virtual-link 3.3.3.3
    network 4.4.4.4 0.0.0.0 area 0
    network 10.10.24.0 0.0.0.255 area 0
    network 10.10.34.0 0.0.0.255 area 1
    network 10.10.46.0 0.0.0.255 area 0
    mpls traffic-eng router-id Loopback0
    mpls traffic-eng area 0
    mpls traffic-eng interface Ethernet1/1 area 0
    rt5_5#ro
    router ospf 1
    router-id 5.5.5.5
    log-adjacency-changes
    network 5.5.5.5 0.0.0.0 area 0
    network 10.10.35.0 0.0.0.255 area 0
    mpls traffic-eng router-id Loopback0
    mpls traffic-eng area 0
    rt5_7#ro
    router ospf 1
    router-id 6.6.6.6
    log-adjacency-changes
    network 6.6.6.6 0.0.0.0 area 0
    network 10.10.46.0 0.0.0.255 area 0
    mpls traffic-eng router-id Loopback0
    mpls traffic-eng area 0
    Tunnel 16 from R1 to R6 does not come up.
    rt5_1#sh mpls traffic-eng tunnels t16
    Name: rt5_1_t16 (Tunnel16) Destination: 6.6.6.6
    Status:
    Admin: up Oper: down Path: not valid Signalling: Down
    path option 10, type dynamic
    Config Parameters:
    Bandwidth: 5 kbps (Global) Priority: 5 5 Affinity: 0x0/0xFFFF
    Metric Type: TE (default)
    AutoRoute: enabled LockDown: disabled Loadshare: 5 bw-based
    auto-bw: disabled
    Shortest Unconstrained Path Info:
    Path Weight: UNKNOWN
    Explicit Route: UNKNOWN
    History:
    Tunnel:
    Time since created: 1 hours, 7 minutes
    Path Option 10:
    Last Error: PCALC:: No path to destination, 6.6.6.6
    Does anyone know the cause of the problem?
    thanks,

    Hi,
    There is one virtual link between R3 and R4.
    rt5_1#sh mpls traffic-eng topology
    IGP Id: 3.3.3.3, MPLS TE Id:3.3.3.3 Router Node (ospf 1 area 0)
    link[0]: Broadcast, DR: 10.10.35.5, nbr_node_id:24, gen:224
    frag_id 1, Intf Address:10.10.35.3
    TE metric:10, IGP metric:10, attribute flags:0x0
    link[1]: Broadcast, DR: 10.10.13.1, nbr_node_id:25, gen:224
    frag_id 0, Intf Address:10.10.13.3
    TE metric:10, IGP metric:10, attribute flags:0x0
    link[2]: Broadcast, DR: 10.10.34.3, nbr_node_id:-1, gen:224
    frag_id 3, Intf Address:10.10.34.3
    TE metric:10, IGP metric:invalid, attribute flags:0x0
    sh mpls traffic-eng topology brief
    IGP Id: 3.3.3.3, MPLS TE Id:3.3.3.3 Router Node (ospf 1 area 0)
    link[0]: Broadcast, DR: 10.10.35.5, nbr_node_id:31, gen:106
    frag_id 1, Intf Address:10.10.35.3
    TE metric:10, IGP metric:10, attribute flags:0x0
    link[1]: Broadcast, DR: 10.10.13.1, nbr_node_id:32, gen:106
    frag_id 0, Intf Address:10.10.13.3
    TE metric:10, IGP metric:10, attribute flags:0x0
    link[2]: Broadcast, DR: 10.10.34.3, nbr_node_id:-1, gen:106
    frag_id 3, Intf Address:10.10.34.3
    TE metric:10, IGP metric:invalid, attribute flags:0x0
    IGP Id: 4.4.4.4, MPLS TE Id:4.4.4.4 Router Node (ospf 1 area 0)
    link[0]: Broadcast, DR: 10.10.46.6, nbr_node_id:34, gen:104
    frag_id 0, Intf Address:10.10.46.4
    TE metric:10, IGP metric:invalid, attribute flags:0x0
    link[1]: Broadcast, DR: 10.10.24.2, nbr_node_id:19, gen:104
    frag_id 1, Intf Address:10.10.24.4
    TE metric:10, IGP metric:10, attribute flags:0x0
    link[2]: Broadcast, DR: 10.10.34.3, nbr_node_id:-1, gen:104
    frag_id 2, Intf Address:10.10.34.4
    TE metric:10, IGP metric:invalid, attribute flags:0x0
    IGP Id: 5.5.5.5, MPLS TE Id:5.5.5.5 Router Node (ospf 1 area 0)
    link[0]: Broadcast, DR: 10.10.35.5, nbr_node_id:31, gen:94
    thanks,
    guoming

  • Wb_rt_api_exec over a database link

    When I call wb_rt_api_exec over a database link I receive:
    ORA-20001: Task not found - Please check the Task Type, Name and Location are correct.
    ORA-06512: at "OWB_10_2_0_1_31.WB_RT_API_EXEC", line 704.
    When I run the command on the other database it works.
    Is there a way to do this?
    Thanks,
    Joe

    We had the same error when starting a package with WB_RT_API_EXEC.RUN_TASK from JBoss middleware. It ran fine when started directly from SQL Developer, but threw the error when started from JBoss. Adding the pragma solved the problem; we also had to put a commit; at the end of the procedure.

  • Analysis Services execute ddl task

    Hi,
    I want to run the Analysis Services execute ddl task to execute a cube creation via xmla script.
    The cube could already exist, so I want to check whether it exists before running the task, or else it will fail.
    What would be the best way to perform this if-exists check?
    Dani

    You can use the Alter element, as in this example:
    <Alter ObjectExpansion="ObjectProperties" xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
    <Object>
    <DatabaseID>Adventure Works DW Multidimensional 2012</DatabaseID>
    <DataSourceID>AdventureWorksDW2012</DataSourceID>
    </Object>
    <ObjectDefinition>
    <DataSource xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:type="RelationalDataSource">
    <ID>AdventureWorksDW2012</ID>
    <Name>AdventureWorksDW2012</Name>
    <ConnectionString>Data Source=fr-dwk-02;Initial Catalog=AdventureWorksDW2008R2;Integrated Security=True</ConnectionString>
    <ManagedProvider>System.Data.SqlClient</ManagedProvider>
    <Timeout>PT30S</Timeout>
    </DataSource>
    </ObjectDefinition>
    </Alter>
    More info on "Alter Element" on MSDN 
    http://technet.microsoft.com/en-us/library/ms186630.aspx

  • EEM -automatic shut down or switch over of WAN link in OSPF when packet drop increase

    Hi,
    Need help..
    Can anyone tell me how EEM could help to automatically shut down, or switch over, a WAN link in OSPF when packet drops rise above a predefined level?
    I have a setup with different branches connected together. OSPF is the routing protocol, and I need to communicate with two branches via hub locations.
    I need to shut the link, or switch some percentage of the traffic from primary to backup, when packets are being dropped on the link.

    I am not sure EEM can do what you want.
    Another option could be to use SLA tracking/monitoring. But you would fail over to the new route when you lose some percentage of pings; you can't switch only part of the traffic.
    I hope it helps.
    PK
