SQL Server system tables

I am trying to convert a SQL Server 2000 database to Oracle 10g. One of the issues I am running into is that a number of the stored procedures reference system tables or procedures in the master database that SQL Server maintains:
1. sysprocesses table: the SP accesses this table to find out the session id and login time for the specific session. I am aware that the session id can be mimicked by select sys_context('USERENV','SESSIONID'), but how about login time?
2. Other procedures reference sp_OACreate, sp_OADestroy, etc., which are extended procedures in the master db. The Migration Workbench either does not detect them, or does not select them for conversion.
If anybody has input on how these items can be resolved, I would really appreciate it. Thanks.
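For the login-time half of item 1, one possible Oracle-side equivalent is reading LOGON_TIME from V$SESSION. This is a sketch only: it assumes the querying user has been granted SELECT on V_$SESSION, and it matches the current session via its auditing session id (AUDSID), which is what SYS_CONTEXT('USERENV','SESSIONID') returns:

```sql
-- Sketch: login time of the current session (Oracle 10g).
-- Assumes SELECT privilege on V_$SESSION has been granted.
SELECT s.logon_time
FROM   v$session s
WHERE  s.audsid = SYS_CONTEXT('USERENV', 'SESSIONID');
```

Note that AUDSID is 0 for sessions logged in as SYS, so this matching approach only works reliably for ordinary user sessions.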


Similar Messages

  • Interfacing to a third-party system through a shared SQL Server DB Table

    Guys, I have been given the task of interfacing to a third-party system through a shared SQL Server database table. I have never come across such an implementation. Can anyone please let me know the methodologies involved?
    Thanks,
    Jack.

    This line:
    stmt.executeQuery(query);
    should be:
    stmt.executeUpdate(query);

  • How to delete a row from a SQL Server CE Table with multiple JOINs?

    I want to delete a record from a SQL Server CE table.
    There are 3 tables: scripts, options and results. I would like to remove a record from the results table. The WHERE clause contains dynamic information which is retrieved via other queries to different tables in the same database. These queries work fine and deliver the desired data.
    The Compact server is a clone of a remote table created using the sync framework. The same query to the remote table works fine.
    The error I get is:
    There was an error parsing the query. [ Token line number = 1,Token line offset = 10,Token in error = from ]
    The code that throws the exception is as follows:
    Dim connLoc As SqlCeConnection = New SqlCeConnection(My.Settings.ConnectionString)
    connLoc.Open()
    Dim strDel As String = "Delete r from ResultsTable r inner join OptionsTable o ON o.TestName=r.TestName inner join ScriptTable c ON r.TestName=c.TestName WHERE r.TestName = '" & ds1Loc.Tables(0).Rows(0)(1) & "' AND [Index] = '" & lstIndex & "'"
    Dim cmdDel As SqlCeCommand = New SqlCeCommand
    cmdDel.CommandText = strDel
    cmdDel.Connection = connLoc
    cmdDel.ExecuteNonQuery()
    The values held in ds1Loc.Tables(0).Rows(0)(1) and lstIndex are
    correct so should not be the problem.
    I also tried using parameterised queries
    Dim strDel As String = "Delete r from [ResultsTable] r inner join [OptionsTable] o ON o.TestName=r.TestName inner join [ScriptTable] c ON r.TestName=c.TestName WHERE r.TestName = @TestName AND [Index] = @lstIndex"
    Dim cmdDel As SqlCeCommand = New SqlCeCommand
    cmdDel.CommandText = strDel
    With cmdDel.Parameters
        .Add(New SqlCeParameter("@TestName", ds1Loc.Tables(0).Rows(0)(1)))
        .Add(New SqlCeParameter("@lstIndex", lstIndex))
    End With
    cmdDel.Connection = connLoc
    cmdDel.ExecuteNonQuery()
    I have tried replacing the "=" with "IN" in the WHERE clause but this has not worked.
    Is it the join that is causing the problem? I can do a select with the same search criteria and joins from the same database.
    Also this query works with SQL Server. Does SQL CE perhaps not support DELETE the same way as SQL Server 2008 does? I have been looking at this for a while now and cannot find the source of the error. Any help would be greatly appreciated.

    Hello,
    In SQL Server Compact, we can use a JOIN in the FROM clause. The DELETE statement failure may be caused by a FOREIGN KEY constraint.
    Please refer to:
    DELETE (SQL Server Compact)
    FROM Clause (SQL Server Compact)
    Regards,
    Fanny Liu
    TechNet Community Support
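    If the joined form keeps failing to parse, one workaround sometimes used is to rewrite the delete with IN subqueries, which express the same existence checks without a JOIN in the DELETE itself. This is a sketch reusing the table and parameter names from the question, and it assumes SQL Server Compact accepts subqueries in this position:

    ```sql
    -- Sketch: same delete without DELETE ... FROM ... JOIN syntax.
    -- @TestName and @lstIndex are the parameters from the original code.
    DELETE FROM ResultsTable
    WHERE TestName = @TestName
      AND [Index] = @lstIndex
      AND TestName IN (SELECT TestName FROM OptionsTable)
      AND TestName IN (SELECT TestName FROM ScriptTable);
    ```

    Since the original joins only match on TestName, the IN subqueries preserve the filtering effect of the joins here.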

  • INSERTING DATA INTO A SQL SERVER 2005 TABLE, WHICH HAS A IDENTITY COLUMN

    Hi All,
    I have to insert the data into a SQL SERVER 2005 Database table.
    I am able to insert the data into a normal SQL Server table.
    When I am trying to insert the data into a SQL Server table which has an identity column (i.e. an auto-increment column in Oracle), I am getting an error saying that a value can't be inserted explicitly when IDENTITY_INSERT is set to OFF.
    Had anybody tried this??
    There are some SRs on this issue; Oracle agreed that it is a bug. I am wondering if there is any workaround from any one of you (refer to Insert in MS-SQL database table with IDENTITY COLUMN).
    Thanks
    V Kumar

    Even I had raised an SR on this in October 2008, but didn't get any solution for a long time; finally I removed the identity column from the table. I can't do that now :).
    I am using 10.1.3.3.0 and MS SQL Server 2005. They said it is working for an MS SQL Server table if the identity column is not a primary key, and asked me to refer to note 744735.1.
    I followed that note, but it is still not working for me.
    But my requirement is that it should work for an MS SQL Server 2005 table which has the identity column as its primary key.
    Thanks
    V Kumar
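    For reference, the switch that the error message refers to is controlled per table and per session on the SQL Server side. A minimal sketch, where dbo.TargetTable and its columns are hypothetical names, not from the thread:

    ```sql
    -- Sketch: explicitly inserting into an identity column (T-SQL).
    -- Must run in the same session as the INSERT; only one table per
    -- session can have IDENTITY_INSERT ON at a time.
    SET IDENTITY_INSERT dbo.TargetTable ON;

    INSERT INTO dbo.TargetTable (Id, Name)   -- identity column listed explicitly
    VALUES (42, N'example');

    SET IDENTITY_INSERT dbo.TargetTable OFF;
    ```

    Whether a middleware layer can issue these statements around its generated INSERT in the same session is the crux of the bug discussed above.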

  • MS SQL Server system stored procedures can't be migrated into Oracle 11g

    During database migration from MS SQL Server 2008 to Oracle 11g R2, if an application stored procedure invokes MS SQL Server system stored procedures (for example sp_getapplock, sp_releaseapplock, ...), these system stored procedures can't be transferred. See the following migrated Oracle application stored procedure for example:
    create or replace
    PROCEDURE spPwSysID_GetNextID
    v_ID OUT NUMBER,
    iv_SysType IN NVARCHAR2 DEFAULT NULL ,
    iv_Cnt IN NUMBER DEFAULT NULL
    AS
    v_SysType NVARCHAR2(50) := iv_SysType;
    v_Cnt NUMBER(10,0) := iv_Cnt;
    v_result NUMBER(10,0);
    BEGIN
    --SQL Server BEGIN TRANSACTION;
    utils.incrementTrancount;
    v_Systype := UPPER(v_Systype) ;
    IF v_Cnt < 1 THEN
    v_Cnt := 1 ;
    END IF;
    v_result :=sp_getapplock(v_Resource => v_Systype,
    v_LockMode => 'Exclusive') ;
    IF v_result >= 0 THEN
    BEGIN
    SELECT ID
    INTO v_ID
    FROM PWSYSID
    WHERE SysType = v_SysType;
    IF SQL%ROWCOUNT = 1 THEN
    UPDATE PwSysID
    SET ID = ID + v_cnt
    WHERE SysType = v_SysType;
    ELSE
    BEGIN
    INSERT INTO PwSysID
    ( ID, SysType )
    VALUES ( v_cnt + 1, v_SysType );
    v_ID := 1 ;
    END;
    END IF;
    v_result :=sp_releaseapplock(v_Resource => v_Systype) ;
    END;
    ELSE
    BEGIN
    raise_application_error( -20002, 'Lock failed to acquire to generate Cityworks Id.' );
    END;
    END IF;
    utils.commit_transaction;
    END;

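    Oracle has no direct counterpart to sp_getapplock/sp_releaseapplock, but DBMS_LOCK can provide similar application-lock semantics and is a common manual replacement after migration. A sketch, assuming EXECUTE privilege on DBMS_LOCK; the lock name is hypothetical:

    ```sql
    -- Sketch: DBMS_LOCK as a replacement for sp_getapplock (PL/SQL).
    DECLARE
      v_handle VARCHAR2(128);
      v_result INTEGER;
    BEGIN
      -- Map a resource name to a lock handle (name chosen for illustration).
      DBMS_LOCK.ALLOCATE_UNIQUE(lockname => 'PWSYSID_LOCK', lockhandle => v_handle);
      -- Exclusive mode with a 10-second timeout, released on commit,
      -- mirroring sp_getapplock's 'Exclusive', transaction-owned behaviour.
      v_result := DBMS_LOCK.REQUEST(lockhandle        => v_handle,
                                    lockmode          => DBMS_LOCK.X_MODE,
                                    timeout           => 10,
                                    release_on_commit => TRUE);
      IF v_result <> 0 THEN   -- 0 = success; 1 = timeout; 2 = deadlock; ...
        raise_application_error(-20002, 'Lock failed to acquire.');
      END IF;
    END;
    /
    ```

    With release_on_commit => TRUE, the explicit sp_releaseapplock call in the migrated procedure becomes unnecessary; the commit at the end releases the lock.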

  • Migration from sql server 2005 tables to oracle tables.

    Hi,
    Kindly give the steps to migrate from sql server 2005 tables to oracle tables.
    Kindly advise
    Oracle database version:
    Oracle Database 10g Release 10.2.0.1.0 - Production
    PL/SQL Release 10.2.0.1.0 - Production
    "CORE 10.2.0.1.0 Production"
    TNS for 32-bit Windows: Version 10.2.0.1.0 - Production
    NLSRTL Version 10.2.0.1.0 - Production
    Edited by: 873127 on Jul 18, 2011 9:46 PM

    Are you migrating or taking continual updates?
    If migrating it might be worth considering The SQLDeveloper Migration Workbench (which moves more than just data)..
    http://www.oracle.com/technetwork/database/migration/sqldevmigrationworkbench-132899.pdf
    Cheers
    David

  • Using Oracle Heterogenous services to access sql server database table

    I have created a dblink 'POC_HS' from oracle to sql (implemented heterogeneous services) and I am able to successfully pull out data from the default database that the DSN(for sql server) is connected to.
    So this 'select * from Test@POC_HS' is working perfectly fine on the Oracle database as 'Test' table resides in the default database (which the System DSN is connected to).
    But when I do 'select * from Abc.Test@POC_HS', where the Test table resides in the 'Abc' database which is not the default database, it throws an error as follows:
    ORA-00942: table or view does not exist [Generic Connectivity Using ODBC][Microsoft][ODBC SQL Server Driver][SQL Server]Invalid object name 'Abc.Test'.[Microsoft][ODBC SQL Server Driver][SQL Server]Statement(s) could not be prepared. (SQL State: S0002; SQL Code: 208)
    I have also tried this 'select * from Abc.dbo.Test@POC_HS' but oracle throws this exception "ORA-00933: SQL command not properly ended".
    The dblink user and System DSN account has access to the 'Abc' database.
    Thoughts?

    Thanks for the info.
    But suppose we have a DB link 'POC_HS' where POC_HS is a dblink between Oracle servers. I can do the following:
    1. select * from Abc.Test@POC_HS
    2. select * from Def.Test@POC_HS
    where Abc and Def are schemas which the dblink user has access to. I can execute the above perfectly fine.
    I wanted to achieve the same functionality from Oracle to SQL Server, where the database keeps changing dynamically. So according to you that's not possible, right?
    We will have to keep changing the ODBC connection to a different database, or create a new odbc/listener/tnsentry each time a query uses a different database, right?
    Edited by: 878753 on Aug 11, 2011 1:29 AM
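    One partial workaround sometimes used with Heterogeneous Services is DBMS_HS_PASSTHROUGH, which ships the statement text to the gateway verbatim, so the three-part SQL Server name never reaches the Oracle parser. A sketch; POC_HS is the dblink from the question, and the column name is hypothetical:

    ```sql
    -- Sketch: querying Abc.dbo.Test through the gateway via passthrough.
    DECLARE
      c   INTEGER;
      n   INTEGER;
      val VARCHAR2(4000);
    BEGIN
      c := DBMS_HS_PASSTHROUGH.OPEN_CURSOR@POC_HS;
      DBMS_HS_PASSTHROUGH.PARSE@POC_HS(c, 'SELECT SomeColumn FROM Abc.dbo.Test');
      LOOP
        n := DBMS_HS_PASSTHROUGH.FETCH_ROW@POC_HS(c);
        EXIT WHEN n = 0;
        DBMS_HS_PASSTHROUGH.GET_VALUE@POC_HS(c, 1, val);
        DBMS_OUTPUT.PUT_LINE(val);
      END LOOP;
      DBMS_HS_PASSTHROUGH.CLOSE_CURSOR@POC_HS(c);
    END;
    /
    ```

    This still goes through the single DSN connection, but the database qualifier inside the passed-through text can change dynamically without a new listener or TNS entry.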

  • How to map SharePoint list columns to SQL Server data table columns programmatically

    Hi,
    I have one Verification list in SharePoint with 10 columns. In SQL Server we have one data table, Verification_Table, with 25 columns. My requirement is to move all the list data into the SQL data table (whichever columns map from the list to the data table get that data; the remaining columns stay NULL), programmatically, not with BCS.

    Hello,
    You can create a SQL connection and use a DataReader to read from SQL. First create a connection string and put it in the web.config file of the web application for your SharePoint site.
    Now use the code below to read your connection string in your web part.
    SqlConnection con = new SqlConnection(System.Configuration.ConfigurationManager.AppSettings["ConnectionString"]);
    Here is link to read the data from SQL:
    http://www.akadia.com/services/dotnet_data_reader.html
    Here is one MSDN link to read SP list data:
    http://msdn.microsoft.com/en-us/library/dd490727%28v=office.12%29.aspx
    Let me know if you have any doubt
    Hemendra:Yesterday is just a memory,Tomorrow we may never see
    Please remember to mark the replies as answers if they help and unmark them if they provide no help

  • Create a table in SQL Server, Export tables from Microsoft Excel to Microsoft SQL Server, Populate the created table

    Hello team,
    I have a project that I need to do, what is the best approach for each step?
    1- I have to create a table in Microsoft SQL Server.
    2- I have to import data/ tables from Microsoft Excel or Access to Microsoft SQL Server. Should I use Microsoft Visual Studio to move data from Excel or Access?
    3-I should populate the created table with the data from the exported data.
    4-How should I add the second and third imported table to the first table? Should I use union query?
    After I learn these, I will bring up the code to make sure what I do is right.
    Thanks for all,
    Guity
    GGGGGNNNNN

    Hello Naomi,
    I have imported all the tables into SQL Server,
    I created a table:
    CREATE TABLE dbo.Orders
    Now I want to populate this table with the values from imported tables, will this code take care of this task?
    INSERT INTO dbo.Orders(OrderId, OrderDate)
    SELECT OrderId, OrderDate
    FROM Sales.Orders
    UNION
    SELECT OrderId, OrderDate
    FROM Sales.Orders1
    UNION
    SELECT OrderId, OrderDate
    FROM Sales.Orders2
    If not, what is the code?
    Please advise me.
    GGGGGNNNNN
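    One note on the query above: UNION removes duplicate rows across the three sources, so an order appearing in more than one source table is inserted only once. If every row should be kept (and de-duplication handled elsewhere), UNION ALL is the usual choice, and it is also cheaper since it skips the implicit sort. A sketch reusing the names from the post:

    ```sql
    -- Sketch: keep all rows from the three imported tables.
    INSERT INTO dbo.Orders (OrderId, OrderDate)
    SELECT OrderId, OrderDate FROM Sales.Orders
    UNION ALL
    SELECT OrderId, OrderDate FROM Sales.Orders1
    UNION ALL
    SELECT OrderId, OrderDate FROM Sales.Orders2;
    ```

    If OrderId is the primary key of dbo.Orders, genuine duplicates would make UNION ALL fail with a key violation, which is a useful signal that the sources overlap.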

  • SQL Server log table sizes

    Our SQL Server 2005 (Idm 7.1.1 (with patch 13 recently applied), running on Win2003 & Appserver 8.2) database has grown to 100GB. The repository was created with the provided create_waveset_tables.sqlserver script.
    In looking at the table sizes, the space hogs are:
    Data space:
        log       7.6G
        logattr   1.8G
        slogattr 10.3G
        syslog   38.3G
    Index space:
        log       4.3G
        logattr   4.3G
        slogattr 26.9G
        syslog    4.2G
    As far as usage goes, we have around 20K users, we do a nightly recon against AD, and have 3 daily ActiveSync processes for 3 other attribute sources. So there is a lot of potential for heavy-duty logging to occur.
    We need to do something before we run out of disk space.
    Is the level of logging tunable somehow?
    If we lh export "default" and "users", then wipe out the repo, reload the init, default and users what will we have lost besides a history of attribute updates?

    Hi,
    I just fired up my old 7.1 environment to have a look at the syslog and slogattr tables. They looked safe to delete, as I could not find any "magic" rows in there. So I did a shutdown of my appserver and issued
    truncate table syslog
    truncate table slogattr
    from my SQL tool. After restarting the appserver everything is still working nicely.
    The syslog and slogattr tables store technical information about errors, such as "unable to connect to resource A" or "ActiveSync against C is not properly configured". They do not store provisioning errors; those go straight to the log/logattr tables. So from my point of view it is OK to clean out syslog and slogattr once in a while.
    But there is one thing which I think is not OK: having so many errors in the first place. Before you truncate your syslog you should run a syslog report to identify some of the problems in the environment.
    Once they are identified and fixed you shouldn't have many new entries in your syslog per day. There will always be a few, network hiccups and the like, but not as many as you seem to have today.
    Regards,
    Patrick

  • SQL Server system setup

    I want to connect 5 computers for a database system using SQL Server. I will make one of the computers the database server and the other 4 will be client machines. Which operating system should I use for the clients and the server? Do I need to install Windows Server 2008 on the database server, or is Win 7 Pro enough for all machines, both server and clients?

    It depends on the version and edition of SQL Server that you want to install.
    check this link -
    http://msdn.microsoft.com/en-us/library/ms143506.aspx#pmosr
    It contains the different operating systems supported for different editions of SQL 2014.
    Regards, Ashwin Menon My Blog - http:\\sqllearnings.com

  • Storing the file in to the sql server 2005 table

    Hi everybody,
    In my application I need to store files.
    I have searched the net, and all I found is that a BLOB (Binary Large Object) is the solution.
    How do I use this BLOB technique?
    Can anyone send some sample code to explain how to insert a file and how to retrieve the inserted file from SQL Server 2005?
    Thanks in advance.

    Yes, you're right, those two methods are available, but I didn't understand them. The setBinaryStream() method accepts 3 parameters; the last parameter is the size of the file, of int type, right?
    Can you give some more information regarding this?
    Thanks.

  • SQL Server partitioned tables

    I'm trying to work out the most efficient way of sharing a table between different stored procedures in a thread in SQL 2005. This will be addressed directly by table parameters in 2008 I realise.
    The constraints are that this is a concurrent thread application, i.e. at any one time there may be multiple threads using their own data. So my thoughts are:
    1. A single simple permanent table with some sort of thread key field. That's fine, but at the end of the thread I want to remove this data. The number of rows to be deleted at that time might conceivably be 500,000+. So I'm a bit worried about DELETE FROM performance.
    2. I could create a thread-specific table. I've done that with dynamic SQL (exec (@sql_string)). Removing the data then just involves dropping the table, but my dynamic SQL is starting to get cumbersome. I've considered accessing the data on each thread by taking a copy of the thread-specific table into a temporary table within each relevant SP. This is actually a lot more efficient than recalculating the table completely, but still not very clever.
    3. Partitioned tables. If I create a partition with the relevant thread key as its partition function then my delete step should be pretty efficient. But then I'm not sure whether the overhead of partition function management is an issue
    My questions are:
    1. Any thoughts on other approaches? I suspect my description above doesn't really explain everything - sorry!
    2. What's the efficient partition management approach for this? The thread key is a single int, so I just want a simple partition function that creates a new partition for every int. And when I drop the partition I want to remove its definition from the function (I think, to avoid hitting any boundary value limits). Can this function be created as a one-off, or does it have to be altered each time I enter a new thread? It seems like this is the simplest partition scheme in the world: "Please create a new partition for every new integer value." Is this a simple approach or am I making complications?
    Any thoughts appreciated.
    Thanks,
    - Rob.

    Dynamic partitioning doesn't make much sense if the physical files aren't spread across multi-disc hardware.
    There is no point (or at least little point) in separating your data into e.g. 10 data files if you just have ONE single disc. There is a point in doing that for LOG files, but if you really want a performance gain you need to have them on different physical discs.

  • Efficient way to do the Purge / Delete activity on a SQL Server Heap table

    Hi,
    I have a huge heap table (sql 2008) on a staging database which is used to store log history for an application.
    The application is not directly using this heap table.
    The table has a Date column, and we have a purge plan to remove the records that are older than 1 year.
    In this scenario, which one will help us expedite the purge process: creating a clustered index or a non-clustered index?
    Of course, I am planning to use the following script in order to avoid log file bloat and get rid of the blocking.
    Can someone help in this regard by providing a suggestion?

    I personally wouldn't create a clustered index on the table.  Adding a clustered index has two problems in your scenario.
    Adding a clustered index will be time consuming and resource intensive.  Talk about log file bloat...
    A clustered index will result in poorer insert performance when compared to leaving the table as a heap and adding a non-clustered index.
    I would add the non-clustered index to the table on the date column you refer to, then purge data in small batches.  Although purging data in small batches might not be quite as fast as purging the data in a single batch, it won't be much slower and
    will allow you to have total control over your log file.
    The non-clustered index on the date column will be small since even the largest date datatypes only consume 10 bytes of space.  So for a table containing 5 billion records the non-clustered index would be only about 90 GB in size.
    As stated above you could then purge data in small batches and perform log backups between batches to control log file bloat or simply switch the database to simple recovery model.
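    The batched-purge approach described above can be sketched like this. The table and column names are hypothetical, and the batch size of 10,000 is just a starting point to tune:

    ```sql
    -- Sketch: purge rows older than 1 year in small batches (SQL Server 2008).
    -- Assumes a non-clustered index on LogDate so each batch seeks, not scans.
    WHILE 1 = 1
    BEGIN
        DELETE TOP (10000)
        FROM dbo.LogHistory
        WHERE LogDate < DATEADD(YEAR, -1, GETDATE());

        IF @@ROWCOUNT = 0 BREAK;

        -- In FULL recovery, take a log backup between batches to keep the
        -- log file in check; in SIMPLE recovery the log truncates on its own.
    END
    ```

    Each DELETE TOP batch commits as its own transaction, which is what bounds the log growth and shortens the duration of any blocking.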

  • How to Import Excel 2007 worksheet that has more than 255 columns into SQL Server 2008 table using SSIS 2008.

    I am using Excel source which uses Microsoft ACE 12.0 OLE DB provider, this only allows the first 255 columns to be imported and not the rest.
    I need to import all the columns into the table.
    Any pointers to this is greatly appreciated

    If you can use third-party solutions, check the commercial COZYROC Excel adapters:
    Excel Connection Plus Manager
    Excel Source Plus
    Excel Destination Plus
    Excel Task
    The enhanced components can be used under both 32-bit and 64-bit modes and don't exhibit the 255-column limitation.
    SSIS Tasks Components Scripts Services | http://www.cozyroc.com/
