Support for Progress Database

Hi,
I have a requirement to pull data into HANA from an MFG PRO system that runs on a Progress database. Does SLT support this? Please advise. Thanks.

Saurabh,
Very simple answer: SLT supports all SAP-supported databases, and I don't think the Progress database is among them, so it would not be supported by SLT.
Still, I would suggest you check the PAM (Product Availability Matrix) on the Service Marketplace.

Similar Messages

  • Lack of support for FIM database mirroring

    The official line is that database mirroring is not a supported architecture for the FIM deployment. I am not proposing using this; however, I'd like to understand 1) what the issues really would be with a mirrored database deployment, and 2) whether support will ever be added for this, and whether it will come in the form of SQL AlwaysOn.
    Really appreciate help and input.
    Rgds,
    David

    Database mirroring has two modes regarding transactions: synchronous or asynchronous.
    Synchronous requires that the data be committed in both places before releasing the transaction. This has a big performance impact on the FIM Service database and, to a lesser extent, on the FIM Sync database.
    Asynchronous means that data isn't committed in both places at the same time; the mirror can fall behind, and after a failover you could be behind. In order to have automatic failover with mirroring, you have to be able to modify the connection string to include the failover partner, or the client has to support getting that data at first logon. While you can modify the FIM database connection strings, it is not clear whether FIM is using database clients that support mirroring; I believe it is. Even with asynchronous mode you still take a performance hit for copying every transaction to the mirror.
    SQL AlwaysOn combines the best of mirroring and clustering to let you group databases together into an availability set and then automatically fail over the whole group to another server. It should be noted that AlwaysOn uses a similar underlying mechanism to mirroring to copy the data -- this is evident when you read that AlwaysOn also has asynchronous and synchronous modes. You will most likely run into the same performance quandary.
    Will the product group add support for it? My guess is that it depends on if they find a good way to address the performance issues.
    David Lundell, Get your copy of FIM Best Practices Volume 1 http://blog.ilmbestpractices.com/2010/08/book-is-here-fim-best-practices-volume.html
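    As a concrete illustration of the two modes David describes (a hedged sketch only: the database and server names below are hypothetical placeholders, and mirroring the FIM databases is, as stated above, not a supported architecture), the mode of an existing mirroring session is controlled by its SAFETY setting, and the failover partner can be named in the client connection string (e.g. Data Source=SQL1;Failover Partner=SQL2;Initial Catalog=FIMService;Integrated Security=True):
    -- Hedged sketch: assumes a mirroring session is already established for a
    -- hypothetical FIMService database; run on the principal server.
    ALTER DATABASE FIMService SET PARTNER SAFETY FULL; -- synchronous (high-safety) mode
    ALTER DATABASE FIMService SET PARTNER SAFETY OFF;  -- asynchronous (high-performance) mode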

  • Is it possible to add support for new database type in Data Modeler?

    Hi,
    I see that Data Modeler v4 supports different versions of Oracle, DB2 and MS SQL. Is it possible to add support for a new database family,
    PostgreSQL for example? I hoped that the RDBMS Site editor could do it, but so far I don't see any way to add XML files with metadata for a new RDBMS.
    I did this previously in PowerDesigner, where it is possible to add and modify definitions for new relational databases.
    Thank you,
    Sergei

    There is a discussion option as an out-of-the-box feature. Check this: BI launch pad 4.0: Participate in a discussion about a document

  • Support for 10g database target

    When can we expect OWB 10g with support for the Oracle 10g database as a target? Is the only option for using OWB with a 10g target simply to wait for this release? Patience may be a virtue, but it is an expensive virtue.

    Yes, the upcoming 10g Warehouse Builder release will support the 10g database as a target. Expect Warehouse Builder 10g to be released approximately one week after the 10g database for Windows is released.
    Sorry you have to wait.
    Thanks,
    Mark.

  • Is ORE supported for Oracle database 12.1.0.2

    HI,
    We plan to migrate to the new Oracle 12c 12.1.0.2 database, and I would like to know if ORE is supported for this version.
    Also, can we install ORE now against CDB/PDB databases?

    Oracle R Enterprise 1.4.1 does support the Multitenant Container Database (CDB) feature of Oracle Database 12.1.0.1.
    Oracle R Enterprise is not yet supported under Oracle Database 12.1.0.2.
    See the Oracle R Enterprise 1.4.1 Installation and Administration Guide for more information.
    Sherry
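    As a quick way to confirm what is installed on the database side, the ORE server components create a configuration table that can be queried from SQL. A minimal sketch, assuming the ORE server components are already installed in the target database (if the table is not present, see the installation guide referenced above):
    -- Hedged sketch: lists the Oracle R Enterprise server configuration,
    -- including the installed server version, if ORE is installed.
    SELECT * FROM sys.rq_config;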

  • JDBC Driver for Progress Database

    Hi experts,
    We need to connect our SAP PI 7.0, which is running on Windows Server 2003 x64 Standard/Enterprise Edition SP1, to a Progress database version 9.1D on Unix. The problems are:
    1. What JDBC driver should I use?
    2. We tried to copy the Progress JDBC driver JAR files from the Progress database installation directory onto my Windows machine and tried to run it, but it fails with the following error:
    Error during database connection to the database URL 'jdbc:JdbcProgress:T:156.5.31.65:inodbc2' using the JDBC driver 'com.progress.sql.jdbc.JdbcProgressDriver': 'com.sap.aii.adapter.jdbc.sql.DriverManagerException: Unable to locate a suitable JDBC driver to establish a connection to URL 'jdbc:JdbcProgress:T:156.5.31.65:inodbc2''
    3. Does anyone know what connection architecture I should use in such an environment?
    Thanks
    Charu

    Hi,
    To install the JDBC driver, follow the how-to guide:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/xi/xi-how-to-guides/how%20to%20install%20and%20configure%20external%20drivers%20for%20jdbc%20and%20jms%20adapters.pdf
    Configuration of JDBC Adapter for SQL Server
    JDBC Driver = com.microsoft.jdbc.sqlserver.SQLServerDriver
    Connection = jdbc:microsoft:sqlserver://hostname:<port>;DatabaseName=<DBName>
    UserID and Password.
    If the connection is not working, find the correct port number.
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/40b92770-db81-2a10-8e91-f747188d8033
    JDBC- X I -  R/3 Scenario
    /people/bhavesh.kantilal/blog/2006/07/03/jdbc-receiver-adapter--synchronous-select-150-step-by-step
    /people/sap.user72/blog/2005/06/01/file-to-jdbc-adapter-using-sap-xi-30
    Please check the driver path as mentioned below.
    JDBC Driver : sun.jdbc.odbc.JdbcOdbcDriver
    Connection:jdbc:odbc:Driver={Microsoft Access Driver (*.mdb)};DBQ=//location of DB table.mdb;
    No JDBC driver required.
    Receiver JDBC scenario MS access - /people/sameer.shadab/blog/2005/10/24/connecting-to-ms-access-using-receiver-jdbc-adapter-without-dsn
    follow this thread
    Re: Problem when connecting to MS Access through JDBC Adapter.
    SAP Note 850116 has details
    Thanks,
    Satya Kumar
    Reward Points If it is Useful..
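    For the Progress database itself, the driver class and URL from the error message above suggest that the adapter entries would take the following shape. This is only a sketch inferred from that error text, not a verified configuration; host, port and database name are placeholders, and the exact URL format should be checked against the Progress 9.1D documentation.
    JDBC Driver = com.progress.sql.jdbc.JdbcProgressDriver
    Connection = jdbc:JdbcProgress:T:<host>:<port>:<dbname>
    The 'Unable to locate a suitable JDBC driver' error usually means the Progress JDBC JARs have not been deployed into the PI J2EE engine as described in the how-to guide above; copying them onto the Windows file system alone is not enough.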

  • Support for Logical Databases PNP/PNPCE

    On this topic I would like to know:
    1. Will SAP continue to support PNP, or will support for PNP eventually be phased out?
    2. Where can I find the official SAP document that covers the introduction of PNPCE?
    3. Where does SAP officially state the recommendation that all new development use PNPCE?
    Edited by: John McKee on Oct 14, 2009 11:17 AM

    Hi John,
    I cannot make an official SAP statement here, but given that there are several thousand reports based on PNP from SAP and even more from customers, the answer is:
    1. Will SAP continue to support PNP?
    Yes
    1. Will support for PNP eventually be phased out?
    No
    2. Where can I find the official SAP document that covers the introduction of PNPCE?
    3. Where does SAP officially state the recommendation that all new development use PNPCE?
    Go to transaction SE36, enter PNPCE as the LDB, and read the thorough documentation.
    PNPCE simply has a nicer selection screen and many more options to predefine it in the report class.
    If you have no CE (Concurrent Employment), you can simply use it in the old PNP mode; for details, see the extensive documentation.
    Hope that answers your questions,
    Michael
    Edited by: Michael Fruechtl on Oct 14, 2009 11:47 AM

  • Parameterized support for multiple database instances in CR 4 Eclipse

    Issue with Database Schemas and Crystal Reports
    When building a report with Crystal Reports, the user is required to enter database connectivity information.  This database information allows the user to select which tables the information for the report will come from.  When choosing the tables, the user must navigate through the available database schemas in the database connection to each table that is required.  The database structure is described below:
    Database Connector 1
        Schema1
        Schema2
            Table1
            Table2
            Table3
        Schema3
    After the user has selected the appropriate tables, they are allowed to begin building the report.  The user puts the necessary table fields onto the report.  Crystal Reports uses only the required fields on the report to build a query to the database for retrieving the data.  Thus, the fewer the fields on the report, the less information required.  A single query is built by Crystal Reports to get all the data from the database.  This query can be viewed from within the Crystal Reports application, but it cannot be changed by the user.
    When viewing the query generated by Crystal Reports, you will notice that the tables will have the schema name attached to them.  This is done as part of the fully qualified name of the table.
    Processing a Crystal Reports report in a batch environment requires that the database connector that is stored inside the report be changed.  This change is necessary due to the setup of our system.  The name of the database server, the database username and the database password are different in the different execution environments.  Our current implementation is to programmatically change these values before running the report.
    The issue arises in that changing the database connector does not change the schema of the tables used in the query. The schema information in the query is hardcoded. Thus, when moving the reports from one environment to another, the original schema is propagated to the new environment, where it is not appropriate.
    Changing the database connector will, however, change the schema of the tables used in the field selector.  Any field that is used from the field selector on the report or in formulas will change appropriately.  But the fields used to create the initial query will not change.
    The result of the schema not being changed appropriately in the query is that, when the report attempts to populate in the Java environment, an exception is thrown and the entire process of populating the report is terminated.
    Anyone know a solution to this?
    Erik Bleifield

    Well, CSV would not be a preferred option.
    We need to get the output to a report archive that only supports ascii text. We are hoping to preserve the output formatting of columns, etc. CSV would not preserve the formatting.
    We are currently looking at outputting to PDF and then running a 3rd party product to convert from PDF to text. This leads to a good amount of formatting issues as well.
    Why would exporting to text not be supported in Java, when one can execute this manually in CR 2008 for Windows?
    I wonder if this is an intentional product management choice, or a defect.
    Thanks for further consideration.
    Erik

  • BizTalk Adapter Pack support for Oracle Database 11.2

    Hi All,
    Is there a BizTalk Adapter Pack release that officially supports Oracle Database 11.2.0.3?
    Thanks
    Steve

    Hi,
    The Oracle adapter in the BizTalk 2010 Adapter Pack also supports 11.2.
    See also:
    BizTalkServer 2010 Oracle 11g Release2
    http://social.msdn.microsoft.com/Forums/en-US/508063f5-b370-406c-a703-842db787b2cf/biztalkserver-2010-oracle-11g-release2?forum=biztalkr2adapters
    Kind regards,
    Tomasso Groenendijk
    Blog | Twitter
    MCTS BizTalk Server 2006, 2010
    If this answers your question please mark it accordingly

  • Support for Appleworks databases?

    Hi, I notice Numbers is supposed to handle spreadsheets, but are AppleWorks databases supported in Numbers '08?
    thanks,
    Dave

    I recently posted a question in the Address Book forum about moving addresses from AppleWorks (DB) to the OS X Address Book (http://discussions.apple.com/thread.jspa?threadID=1105599&tstart=0). That doesn't seem like a good replacement for what I am doing with AW.
    I saw in this Numbers forum thread your suggestion to do a copy and paste from the AW DB to Numbers. I saved the AW address DB as text, then opened it, selected all, copied, and pasted into a blank Numbers document. This worked fine.
    Using Sort and Filter I was able to organize entries similar to what I did in AW.
    So for this simple DB, Numbers seems OK.
    Owen

  • FCP studio 3 support for 'progressive segmented frames'

    Has anyone got the new version of FCP? Do you happen to know if it handles 'progressive segmented frames' as anything other than interlaced?
    My Sony HVR-V1E shoots what it calls 'progressive'. This is explained by Sony as 'scanning progressively' yet laying it down to tape as interlaced. The result is that FCP won't accept it as 'progressive' and you have to treat it as interlaced.
    Peter

    Hi Andy, the problem is that FCP 6 does not have a capture or edit preset for PsF. It offers either interlaced or progressive presets. PsF is neither true progressive nor true interlaced. So if I'm aiming at a progressive product, say for Blu-ray, and I want to shoot using PsF, which 'scans' progressively, I am stuck with having to ingest the 'progressively scanned' footage into FCP as interlaced, which it isn't.
    Yes, I can capture it as anything I want and render the arse off it, but that's getting it wrong from the get-go.
    Just as a comparison: for a long while I was shooting 'progressively', as I thought, making a progressive m2v for SD DVD and importing it into DVDSP, all the while ignorant that DVDSP ONLY does interlaced. DVDSP would accept the progressive footage into the interlaced track without complaint, but the results were always degraded because of incorrect source material.
    I don't want to make those same mistakes again with the Blu-ray format; I want my source material to be handled correctly from capture to disc. And at the moment it's a toss-up: do I shoot progressively, record it to tape as interlaced (although not true interlaced), then import it into FCP as, what? interlaced or progressive? and render the arse off it before I start working with it?
    That's my conundrum; hope it's clear.
    Peter

  • In-place upgrade of Enterprise Manager for RAC databases is not supported

    I am upgrading a 2-node RAC from 10.2.0.1 to 10.2.0.3 on Windows 2003 64-bit. I got the following:
    In-place upgrade of Enterprise Manager for RAC databases is not supported in this release
    Now I cannot log in to Database Control as sys with the sysdba role; I get an "invalid username/password" error.
    I can log in as a DBA user.
    I also get a Java error after I log in, something about a class not found, but then I'm in and I can do everything that I try, although I haven't tried scheduling a backup yet.
    Can I upgrade EM to 10.2.0.3?
    If so, how?

    On the patchset readme...
    11.3 Upgrade of Oracle Enterprise Manager Not Supported
    When you start Database Upgrade Assistant to upgrade a 10.2.0.1 database to the 10.2.0.3 patch set, the following error occurs:
    In-place upgrade of Enterprise Manager for RAC databases is not supported in this release
    Workaround:
    Ignore warning.
    You could drop and recreate the EM repository:
    1) emca -deconfig dbcontrol db -repos drop
    2) emca -config dbcontrol db -repos create
    Be careful: doing this puts the database in QUIESCE mode.
    Regards,
    Rodrigo Mufalani

  • Does Veridata have Agent for oracle database 11gr2 on linux?

    Hi,
    Does Veridata have an agent for Oracle Database 11gR2 on Linux? On edelivery.oracle.com, the latest version I found is 10.2. When I installed the 10.2 agent against an Oracle 11gR2 database, the agent needed libnnz10.*. Does the 10.2 agent support an 11gR2 database? Thank you.

    http://www.oracle.com/technetwork/middleware/goldengate/downloads/index.html
    PS: Google still works. ;-)

  • CDC for Progress

    Hi,
    We use ODI 11.1.1.7 and are looking to implement CDC. Our source tables are in Progress. The standard KMs do not include a JKM for the Progress database. Can someone please let me know whether CDC is possible with a Progress database, and which KM is needed to achieve it?
    Regards,
    Parag

    Arthur, thank you for responding.  I appreciate it! :)
    I followed the instructions from Rakesh Parida's video on Technet.  Below are the individual steps I followed for creating the CDC Service and Instance.
    CDC Service Setup:
    I made sure that supplemental logging was enabled on my Oracle server.  I also made sure that my Windows account had DBA and "execute_catalog_role" permissions on the Oracle database.
    From the CDC Service Configuration MMC, I ran Action > Prepare SQL Server, which created the MSXDBCDC database.
    I then ran Action>New Service and entered the following information:
    Service name: OracleCDCService1
    Service Account: "This Account" radio button with my Windows username and password information
    Associated SQL Server: <My SQL 2012 Server>
    Authentication: Windows authentication
    I also provided a CDC Service master password.
    CDC Instance Setup:
    Connected to <My SQL 2012 Server> using the CDC Designer MMC
    Went through the steps of creating the Oracle CDC Instance
    Oracle CDC Instance Name = CDC_TEST_SOURCE
    Connected to my Oracle DB using Windows authentication
    Selected a single table that I wanted to try CDC on and left CDC Gating Role blank
    Ran the Oracle Logging script that enables table logging and supplemental log groups on the Oracle DB
    Ran the Status Check step and all passed
    Finished the wizard and started the instance
    A few minutes after the instance was started I added six records to the table in Oracle.
    Please let me know if you need more info.
    Thanks!

  • Selective XML Index feature is not supported for the current database version, SQL Server Extended Events, optimizing reading from an XML column datatype

    Team, thanks for looking into this.
    As a last resort in optimizing my stored procedure (below), I wanted to create a selective XML index (normal XML indexes don't seem to be improving performance as needed), but I keep getting this error within my stored proc: "Selective XML Index feature is not supported for the current database version." However,
    EXECUTE sys.sp_db_selective_xml_index; returns 1, stating that selective XML indexes are enabled on my current database.
    Is there ANY alternative way I can optimize the stored proc below?
    Thanks in advance for your response(s)!
    /****** Object: StoredProcedure [dbo].[MN_Process_DDLSchema_Changes] Script Date: 3/11/2015 3:10:42 PM ******/
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    -- EXEC [dbo].[MN_Process_DDLSchema_Changes]
    ALTER PROCEDURE [dbo].[MN_Process_DDLSchema_Changes]
    AS
    BEGIN
    SET NOCOUNT ON -- Doesn't have an impact (maybe this won't, with SQL Server Extended Events sessions being created on the server(s)/DBs)
    SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED
    select getdate() as getdate_0
    DECLARE @XML XML , @Prev_Insertion_time DATETIME
    -- Staging Previous Load time for filtering purpose ( Performance optimize while on insert )
    SET @Prev_Insertion_time = (SELECT MAX(EE_Time_Stamp) FROM dbo.MN_DDLSchema_Changes_log ) -- Perf Optimize
    -- PRINT '1'
    CREATE TABLE #Temp
    (
    EventName VARCHAR(100),
    Time_Stamp_EE DATETIME,
    ObjectName VARCHAR(100),
    ObjectType VARCHAR(100),
    DbName VARCHAR(100),
    ddl_Phase VARCHAR(50),
    ClientAppName VARCHAR(2000),
    ClientHostName VARCHAR(100),
    server_instance_name VARCHAR(100),
    ServerPrincipalName VARCHAR(100),
    nt_username varchar(100),
    SqlText NVARCHAR(MAX)
    )
    CREATE TABLE #XML_Hold
    (
    ID INT NOT NULL IDENTITY(1,1) PRIMARY KEY , -- PK necessity for Indexing on XML Col
    BufferXml XML
    )
    select getdate() as getdate_01
    INSERT INTO #XML_Hold (BufferXml)
    SELECT
    CAST(target_data AS XML) AS BufferXml -- Buffer storage from SQL Extended Event(s); looks like there is a limitation on XML size? Needs research.
    FROM sys.dm_xe_session_targets xet
    INNER JOIN sys.dm_xe_sessions xes
    ON xes.address = xet.event_session_address
    WHERE xes.name = 'Capture DDL Schema Changes' -- Ryelugu : 03/05/2015 Session being created within SQL Server Extended Events
    --RETURN
    --SELECT * FROM #XML_Hold
    select getdate() as getdate_1
    -- 03/10/2015 RYelugu : Error while creating XML Index : Selective XML Index feature is not supported for the current database version
    CREATE SELECTIVE XML INDEX SXI_TimeStamp ON #XML_Hold(BufferXml)
    FOR
    (
    PathTimeStamp = '/RingBufferTarget/event/timestamp' AS XQUERY 'node()'
    )
    --RETURN
    --CREATE PRIMARY XML INDEX [IX_XML_Hold] ON #XML_Hold(BufferXml) -- Ryelugu 03/09/2015 - Primary Index
    --SELECT GETDATE() AS GETDATE_2
    -- RYelugu 03/10/2015 - Creating a secondary XML index doesn't make a significant improvement at the query optimizer; instead, creation takes more time. Only a primary should be good here
    --CREATE XML INDEX [IX_XML_Hold_values] ON #XML_Hold(BufferXml) -- Ryelugu 03/09/2015 - Primary Index , --There should exists a Primary for a secondary creation
    --USING XML INDEX [IX_XML_Hold]
    ---- FOR VALUE
    -- --FOR PROPERTY
    -- FOR PATH
    --SELECT GETDATE() AS GETDATE_3
    --PRINT '2'
    -- RETURN
    SELECT GETDATE() GETDATE_3
    INSERT INTO #Temp
    (
    EventName ,
    Time_Stamp_EE ,
    ObjectName ,
    ObjectType,
    DbName ,
    ddl_Phase ,
    ClientAppName ,
    ClientHostName,
    server_instance_name,
    nt_username,
    ServerPrincipalName ,
    SqlText
    )
    SELECT
    p.q.value('@name[1]','varchar(100)') AS eventname,
    p.q.value('@timestamp[1]','datetime') AS timestampvalue,
    p.q.value('(./data[@name="object_name"]/value)[1]','varchar(100)') AS objectname,
    p.q.value('(./data[@name="object_type"]/text)[1]','varchar(100)') AS ObjectType,
    p.q.value('(./action[@name="database_name"]/value)[1]','varchar(100)') AS databasename,
    p.q.value('(./data[@name="ddl_phase"]/text)[1]','varchar(100)') AS ddl_phase,
    p.q.value('(./action[@name="client_app_name"]/value)[1]','varchar(100)') AS clientappname,
    p.q.value('(./action[@name="client_hostname"]/value)[1]','varchar(100)') AS clienthostname,
    p.q.value('(./action[@name="server_instance_name"]/value)[1]','varchar(100)') AS server_instance_name,
    p.q.value('(./action[@name="nt_username"]/value)[1]','varchar(100)') AS nt_username,
    p.q.value('(./action[@name="server_principal_name"]/value)[1]','varchar(100)') AS serverprincipalname,
    p.q.value('(./action[@name="sql_text"]/value)[1]','Nvarchar(max)') AS sqltext
    FROM #XML_Hold
    CROSS APPLY BufferXml.nodes('/RingBufferTarget/event')p(q)
    WHERE -- Ryelugu 03/05/2015 - Perf optimize - Filtering the buffered XML so as not to look up previously loaded records in the stage table
    p.q.value('@timestamp[1]','datetime') >= ISNULL(@Prev_Insertion_time ,p.q.value('@timestamp[1]','datetime'))
    AND p.q.value('(./data[@name="ddl_phase"]/text)[1]','varchar(100)') ='Commit' --Ryelugu 03/06/2015 - Every Event records a begin version and a commit version into Buffer ( XML ) we need the committed version
    AND p.q.value('(./data[@name="object_type"]/text)[1]','varchar(100)') <> 'STATISTICS' --Ryelugu 03/06/2015 - May be SQL Server Internally Creates Statistics for #Temp tables , we do not want Creation of STATISTICS Statement to be logged
    AND p.q.value('(./data[@name="object_name"]/value)[1]','varchar(100)') NOT LIKE '%#%' -- Any stored proc which creates a temp table within it Extended Event does capture this creation statement SQL as well , we dont need it though
    AND p.q.value('(./action[@name="client_app_name"]/value)[1]','varchar(100)') <> 'Replication Monitor' --Ryelugu : 03/09/2015 We do not want any records being caprutred by Replication Monitor ??
    SELECT GETDATE() GETDATE_4
    -- SELECT * FROM #TEMP
    -- SELECT COUNT(*) FROM #TEMP
    -- SELECT GETDATE()
    -- RETURN
    -- PRINT '3'
    --RETURN
    INSERT INTO [dbo].[MN_DDLSchema_Changes_log]
    (
    [UserName]
    ,[DbName]
    ,[ObjectName]
    ,[client_app_name]
    ,[ClientHostName]
    ,[ServerName]
    ,[SQL_TEXT]
    ,[EE_Time_Stamp]
    ,[Event_Name]
    )
    SELECT
    CASE WHEN T.nt_username IS NULL OR LEN(T.nt_username) = 0 THEN t.ServerPrincipalName
    ELSE T.nt_username
    END
    ,T.DbName
    ,T.objectname
    ,T.clientappname
    ,t.ClientHostName
    ,T.server_instance_name
    ,T.sqltext
    ,T.Time_Stamp_EE
    ,T.eventname
    FROM
    #TEMP T
    /** -- RYelugu 03/06/2015 - Filters are now being applied directly while retrieving records from the BUFFER or on the XML
    -- Ryelugu 03/15/2015 - More filters are likely to be added on further testing
    WHERE ddl_Phase = 'Commit'
    AND ObjectType <> 'STATISTICS' -- Ryelugu 03/06/2015 - SQL Server may internally create statistics for #Temp tables; we do not want the CREATE STATISTICS statement to be logged
    AND ObjectName NOT LIKE '%#%' -- Any stored proc which creates a temp table within it: the Extended Event captures this creation-statement SQL as well; we don't need it though
    AND T.Time_Stamp_EE >= @Prev_Insertion_time -- Ryelugu 03/05/2015 - Performance optimize
    AND NOT EXISTS ( SELECT 1 FROM [dbo].[MN_DDLSchema_Changes_log] MN
    WHERE MN.[ServerName] = T.server_instance_name -- Ryelugu: server name needs to be added to the XML (events in session)
    AND MN.[DbName] = T.DbName
    AND MN.[Event_Name] = T.EventName
    AND MN.[ObjectName] = T.ObjectName
    AND MN.[EE_Time_Stamp] = T.Time_Stamp_EE
    AND MN.[SQL_TEXT] = T.SqlText -- Ryelugu 03/05/2015 - This is a comparison metric as well, but needs deciding on the
    -- performance factor here; will take advice from Lance on whether comparison on varchar(max) is a viable idea
    )
    **/
    --SELECT GETDATE()
    --PRINT '4'
    --RETURN
    SELECT
    top 100
    [EE_Time_Stamp]
    ,[ServerName]
    ,[DbName]
    ,[Event_Name]
    ,[ObjectName]
    ,[UserName]
    ,[SQL_TEXT]
    ,[client_app_name]
    ,[Created_Date]
    ,[ClientHostName]
    FROM
    [dbo].[MN_DDLSchema_Changes_log]
    ORDER BY [EE_Time_Stamp] desc
    -- select getdate()
    -- ** DELETE EVENTS after logging into Physical table
    -- NEED TO identify if this @XML can be updated in the physical system table such that previously loaded events are left untouched
    -- SET @XML.modify('delete /event/class/.[@timestamp="2015-03-06T13:01:19.020Z"]')
    -- SELECT @XML
    SELECT GETDATE() GETDATE_5
    END
    GO
    Rajkumar Yelugu

    @@VERSION output:
    Microsoft SQL Server 2012 - 11.0.5058.0 (X64)
        May 14 2014 18:34:29
        Copyright (c) Microsoft Corporation
        Developer Edition (64-bit) on Windows NT 6.2 <X64> (Build 9200: ) (Hypervisor)
    (1 row(s) affected)
    The compatibility level is set to 110.
    One of the documented limitations states: XML columns with a depth of more than 128 nested nodes.
    How do I verify this? Thanks.
    Rajkumar Yelugu
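    A small T-SQL sketch that may help narrow this down (this is only a guess at where to look, not a confirmed diagnosis): the error reads like a per-database check, and since #XML_Hold is a temporary table it lives in tempdb, so it may be tempdb rather than the user database whose compatibility level and selective-XML-index state matter here.
    -- Hedged sketch: compare the user database and tempdb.
    SELECT name, compatibility_level
    FROM sys.databases
    WHERE name IN (DB_NAME(), 'tempdb');
    -- Called with no arguments, this reports whether selective XML indexes
    -- are enabled in the current database (1 = enabled), as noted above.
    EXECUTE sys.sp_db_selective_xml_index;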
