SQL Server Bulk Copy
The performance optimization guide shows you how to switch between the SQL Server ODBC bulk copy API and the SQLBulkOperations API, but it does not recommend which one performs better. Since the default is the SQL Server ODBC bulk copy API, I assume that is the better option, but I would like to test that assumption. Is it true?
Edited by: Richard Sherman on Oct 3, 2008 11:53 AM
Hello,
We have to use the bulk options with Data Services and have the same question. Does anyone know where to find a document comparing the two options?
Thank you very much, Silvia
Similar Messages
-
SQL Server bulk insert BLOB filename parameter
The SQL script I am having problems with:
declare @filepath varchar(100)
set @filepath = 'E:\foto\1.jpg'
INSERT INTO [dbo].[MsQuestions] ([TestCategoryID], [LevelID], [TestTypeID], [QuestionText], [QuestionImg])
select 1 , 1, 8, 'data gambar',BulkColumn FROM OPENROWSET(BULK @filepath , SINGLE_BLOB)
thanks.
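One common workaround, since OPENROWSET(BULK ...) only accepts a string literal for the file path, is to build the statement dynamically. A minimal sketch using the table and values from the post:

```sql
-- OPENROWSET(BULK ...) cannot take a variable directly, so build the
-- INSERT ... SELECT as dynamic SQL and execute it.
DECLARE @filepath varchar(100) = 'E:\foto\1.jpg';
DECLARE @sql nvarchar(max) =
    N'INSERT INTO [dbo].[MsQuestions]
        ([TestCategoryID], [LevelID], [TestTypeID], [QuestionText], [QuestionImg])
      SELECT 1, 1, 8, ''data gambar'', BulkColumn
      FROM OPENROWSET(BULK ''' + @filepath + N''', SINGLE_BLOB) AS img;';
EXEC sp_executesql @sql;
```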
<%@ Page Language="C#" %>
<%@ Import Namespace="System.Data" %>
<%@ Import Namespace="System.Data.SqlClient" %>
<%
string sConn = @"server=.; database=OnlineTesting; Integrated Security=True";
SqlConnection objConn = new SqlConnection(sConn);
objConn.Open();
string sTSQL = "exec sp_filenamea";
SqlCommand objCmd = new SqlCommand(sTSQL, objConn);
objCmd.CommandType = CommandType.Text;
SqlDataReader dr = objCmd.ExecuteReader();
if (dr.Read())
{
    Response.ContentType = "image/jpeg"; // adjust to the stored image type
    Response.BinaryWrite((byte[])dr["QuestionImg"]);
}
dr.Close();
objConn.Close();
%>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<script runat="server">
</script>
<html xmlns="http://www.w3.org/1999/xhtml">
<head id="Head1" runat="server">
<title>Exec SP</title>
</head>
<body>
<form id="form1" runat="server">
<div>
</div>
</form>
</body>
</html>

Perhaps this will help:
http://dimantdatabasesolutions.blogspot.co.il/2009/05/how-to-uploadmodify-more-than-one-blob.html
Best Regards,Uri Dimant SQL Server MVP,
http://sqlblog.com/blogs/uri_dimant/
-
Hi
I have a DataTable (in a .NET application) which holds a huge amount of data.
I need to update/insert the data into the SQL Server database table
(update if the row exists in the table, insert if it does not).
I just wanted to know: is there any option for bulk insert or update, rather than checking each row (using a cursor in a stored procedure) and inserting/updating?
Samproo

Hi Samproo, you should never use a cursor to check whether a record exists in order to decide between update and insert. It will cost you a lot when huge data starts flowing in.
Just use the Exists statement as below.
IF EXISTS(SELECT 1 FROM Table_Name WHERE condition = Value)
BEGIN
......UPDATE DATA Statement.....
END
ELSE
BEGIN
.......INSERT DATA Statement.......
END
And yes, you can use this inside any stored procedure exactly as written.
Or, if you are matching between two tables, you can use MERGE as well.
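A minimal MERGE sketch (table and column names here are hypothetical) that performs the same update-or-insert in one set-based statement:

```sql
MERGE dbo.TargetTable AS t
USING dbo.SourceTable AS s
    ON t.ID = s.ID
WHEN MATCHED THEN
    UPDATE SET t.Value = s.Value                 -- row exists: update it
WHEN NOT MATCHED BY TARGET THEN
    INSERT (ID, Value) VALUES (s.ID, s.Value);   -- row missing: insert it
```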
Please mark as answer if this has helped you solve the issue.
Good luck :) .. visit www.sqlsaga.com for more T-SQL code snippets and BI-related how-to articles. -
SQL SERVER BULK FETCH AND INSERT/UPDATE?
Hi All,
I am currently working with C and SQL Server 2012. My requirement is to bulk fetch the records and insert/update them into the other table with some business logic applied.
How do I do this?
Thanks in Advance.
Regards
Yogesh.B

> Is there a possibility that I can do a bulk fetch and place it in an array, even inside a stored procedure?
You can use temporary tables or table variables, and have indexes on them as well.
> After I have processed my records, tell me a way that I will NOT go record by record, even inside a stored procedure?
As I said earlier, you can perform your updates on these temporary tables or table variables, and finally insert into / update your base table.
>Arrays are used just to minimize the traffic between the server and the program area. They are used for efficient processing.
In your case you would first have to populate the array (using some of your queries from the server), which means you would load the array, do some updates, and then send the data back to the server, engaging the network in both directions.
So I just gave you some thoughts I feel could be useful for your implementation. As we say, there are many ways, so pick the one that works well for you in the long run with good scalability.
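The staging approach described above can be sketched as follows (table names, columns, and the business-logic step are hypothetical):

```sql
-- 1) Stage the fetched rows in a temp table and index it.
CREATE TABLE #Staging (ID int NOT NULL, Amount money NULL);
CREATE CLUSTERED INDEX IX_Staging_ID ON #Staging (ID);

INSERT INTO #Staging (ID, Amount)
SELECT ID, Amount FROM dbo.SourceTable;

-- 2) Apply the business logic set-based, not row by row.
UPDATE #Staging SET Amount = Amount * 1.1 WHERE Amount IS NOT NULL;

-- 3) Write the processed rows to the base table in one statement.
INSERT INTO dbo.BaseTable (ID, Amount)
SELECT ID, Amount FROM #Staging;
```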
Good luck! Please mark this as answer if it solved your issue, and vote it as helpful if it helped. -
Import Excel file into SQL, using bulk copy - date issues
Hello. I have a VB project where I need to import multiple Excel files with lots of rows and columns into SQL 2012. Currently the import is set up using OleDbConnection and insert commands, and it takes up to 10 minutes to process all the spreadsheets.
I'm trying to switch the code to use SqlBulkCopy, but ran into issues with the date columns. Some rows have null dates, and those are interpreted as strings and won't import into the SQL tables. Is there a way to format the column programmatically prior to import?
Any advice is appreciated.
Note -
If I add column mapping and exclude all date columns, the import works fine. All Excel files have date fields; the Excel files are reports from another vendor and change on a weekly basis, so manual formatting of the Excel files prior to import is out of the question.
The code is just basic:
Public Sub ImportFormExcelSample()
    Dim ExcelConnection As New System.Data.OleDb.OleDbConnection("Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\MyExcelSpreadsheet.xlsx;Extended Properties=""Excel 12.0 Xml;HDR=Yes""")
    ExcelConnection.Open()
    Dim expr As String = "SELECT * FROM [Sheet1$]"
    Dim objCmdSelect As OleDbCommand = New OleDbCommand(expr, ExcelConnection)
    Dim objDR As OleDbDataReader
    Dim SQLconn As New SqlConnection()
    Dim ConnString As String = "Data Source=MMSQL1;Initial Catalog=DbName; User Id=UserName; Password=password;"
    SQLconn.ConnectionString = ConnString
    SQLconn.Open()
    Using bulkCopy As SqlBulkCopy = New SqlBulkCopy(SQLconn)
        bulkCopy.DestinationTableName = "TableToWriteToInSQLSERVER"
        Try
            objDR = objCmdSelect.ExecuteReader
            bulkCopy.WriteToServer(objDR)
            objDR.Close()
            SQLconn.Close()
        Catch ex As Exception
            MsgBox(ex.ToString)
        End Try
    End Using
End Sub
The error I get is System.InvalidOperationException: The given value of type String from the data source cannot be converted to type date of the specified target column. System.FormatException: Failed to convert parameter value from String to a DateTime...
Thank you!
Alla Sanders

Hi Alla,
This issue might be caused because the field contains a NULL value, but the date/time columns in your table do not allow NULL values. Furthermore, please also take a look at the "Data Type Issues" section in the article below:
http://odetocode.com/blogs/scott/archive/2013/02/08/working-with-sqlbulkcopy.aspx
Here is a similar thread about this topic for your reference, please see:
http://social.technet.microsoft.com/Forums/sqlserver/en-US/2d99181c-fc2b-4caf-9530-3bd6ae1745f1/sqlbulkcopy-column-validation-not-working?forum=sqldataaccess
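If changing the .NET side is awkward, another option (assuming SQL Server 2012, where TRY_CONVERT is available) is to bulk copy into a staging table whose date columns are varchar, then convert on the server so empty strings become NULL. Table and column names here are hypothetical:

```sql
-- Stage the raw spreadsheet values as text, then convert on insert:
INSERT INTO dbo.FinalTable (ItemName, ReportDate)
SELECT ItemName,
       TRY_CONVERT(date, NULLIF(ReportDate, ''))  -- '' and bad values -> NULL
FROM dbo.StagingTable;                            -- ReportDate staged as varchar
```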
If you have any feedback on our support, please click
here.
Regards,
Elvis Long
TechNet Community Support -
SQL Server 2012 Express bulk insert of a 1-million-row flat file with "" as text qualifier
Hi,
I wanted to see if anyone can help me out. I am on SQL Server 2012 Express. I cannot use OPENROWSET because my system is x64 and my Microsoft Office suite is x86 (Microsoft.Jet.OLEDB.4.0).
So I used the Import Wizard, and it is not working either.
The only thing that lets me import this large file is:
CREATE TABLE #LOADLARGEFLATFILE
(
    Column1 varchar(100),
    Column2 varchar(100),
    Column3 varchar(100),
    Column4 nvarchar(max)
)

BULK INSERT #LOADLARGEFLATFILE
FROM 'C:\FolderBigFile\LARGEFLATFILE.txt'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR = '\n'
)
The problem with CREATE TABLE and BULK INSERT is that my flat file comes with text qualifiers ("" around some fields). Is there a way to prevent the quotes from being loaded during the bulk insert? Below is the data.
Column1	Column2	Column3	Column4
"Socket Adapter"	8456AB	$4.25	"Item - Square Drive Socket Adapter | For "
"Butt Splice"	9586CB	$14.51	"Item - Butt Splice"
"Bleach"	6589TE	$27.30	"Item - Bleach | Size - 96 oz. | Container Type"
Ed,
Edwin Lopera

Hi lgnusLumen,
According to your description, you use BULK INSERT to import data from a data file to the SQL table. However, to be usable as a data file for bulk import, a CSV file must comply with the following restrictions:
1. Data fields never contain the field terminator.
2. Either none or all of the values in a data field are enclosed in quotation marks ("").
In your data file, the quotes aren't consistent. If you want to prevent the quotes from being loaded during the bulk insert, I recommend you use the SQL Server Import and Export Wizard, which is available in the SQL Server Express edition; it allows you to strip the double quotes from columns.
In other SQL Server versions, we can use SQL Server Integration Services (SSIS) to import data from a flat file (.csv) while removing the double quotes. For more information, you can review the following article.
http://www.mssqltips.com/sqlservertip/1316/strip-double-quotes-from-an-import-file-in-integration-services-ssis/
In addition, you can create a function to convert a CSV to a usable format for Bulk Insert. It will replace all field-delimiting commas with a new delimiter. You can then use the new field delimiter instead of a comma. For more information, see:
http://stackoverflow.com/questions/782353/sql-server-bulk-insert-of-csv-file-with-inconsistent-quotes
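For reference, SQL Server 2017 and later let BULK INSERT handle text qualifiers natively, so on a newer version (not the 2012 Express in the question) the file could be loaded without pre-processing:

```sql
BULK INSERT #LOADLARGEFLATFILE
FROM 'C:\FolderBigFile\LARGEFLATFILE.txt'
WITH (
    FORMAT = 'CSV',          -- RFC 4180 style parsing (SQL Server 2017+)
    FIELDQUOTE = '"',        -- strip the "" text qualifier
    FIRSTROW = 2,
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR = '\n'
);
```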
Regards,
Sofiya Li
TechNet Community Support -
Error during DatabaseCopy using SAP tools for SQL Server (v7.12)
I'm trying to perform a system copy from PRD to QAS. The database has been mounted to QAS via detach and attach method. I did a DBCC on the database and everything looks OK. I'm running SAPINST (SAP Tools for MSSQL) and selecting the Database Copy option. The source database has a schema of DBO. Both PRD and QAS are running SAP 4.7/Basis 620. The target schema is QAS. SAPINST fails in step 2 (Define Params) with:
"This service cannot be used for a system with SAP ABAP release 620"
Is this message misleading? Has anyone received this message before?
UPDATE:
I downloaded the latest version of SAP Tools for MSSQL, and now it fails during the execution phase at the step "Convert DB objects to new schema".
Here is the log:
Process environment
===================
Environment Variables
=====================
=::=::\
=C:=C:\Program Files\sapinst_instdir\MSS\CPY
ALLUSERSPROFILE=C:\Documents and Settings\All Users
APPDATA=C:\Documents and Settings\r3tadm.STAG\Application Data
CLIENTNAME=TRAIMONOTEBOOK
ClusterLog=C:\WINDOWS\Cluster\cluster.log
CommonProgramFiles=C:\Program Files\Common Files
COMPUTERNAME=STAG
ComSpec=C:\WINDOWS\system32\cmd.exe
DBMS_TYPE=MSS
FP_NO_HOST_CHECK=NO
HOMEDRIVE=C:
HOMEPATH=\Documents and Settings\r3tadm.STAG
LOGONSERVER=
STAG
MSSQL_DBNAME=R3T
MSSQL_SERVER=stag
NUMBER_OF_PROCESSORS=4
OS=Windows_NT
Path=C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\Program Files\Dell\SysMgt\RAC5;C:\Program Files\Dell\SysMgt\oma\bin;C:\Program Files\Dell\SysMgt\oma\oldiags\bin;C:\Program Files\Microsoft SQL Server\80\Tools\BINN;C:\usr\sap\R3T\SYS\exe\run
PATHEXT=.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH
PROCESSOR_ARCHITECTURE=x86
PROCESSOR_IDENTIFIER=x86 Family 6 Model 15 Stepping 6, GenuineIntel
PROCESSOR_LEVEL=6
PROCESSOR_REVISION=0f06
ProgramFiles=C:\Program Files
SAPINST_EXEDIR_CD=C:/STM/I386
SAPINST_JRE_HOME=C:/WINDOWS/TEMP/3/sapinst_exe.3012.1234409473/jre
SAPLOCALHOST=stag
SESSIONNAME=RDP-Tcp#25
SystemDrive=C:
SystemRoot=C:\WINDOWS
TEMP=C:\WINDOWS\TEMP\3
TMP=C:\WINDOWS\TEMP\3
USERDOMAIN=STAG
USERNAME=r3tadm
USERPROFILE=C:\Documents and Settings\r3tadm.STAG
windir=C:\WINDOWS
User: STAG\r3tadm, Id: S-1-5-21-2727398557-1322528747-1943968026-1019
Working directory: C:/Program Files/sapinst_instdir/MSS/CPY
Current access token
====================
Could not get thread token. Last error: 1008. I assume that no thread token exists.
Got process token.
Privileges:
Privilege SeBackupPrivilege, display name: Back up files and directories, not enabled.
Privilege SeRestorePrivilege, display name: Restore files and directories, not enabled.
Privilege SeShutdownPrivilege, display name: Shut down the system, not enabled.
Privilege SeDebugPrivilege, display name: Debug programs, not enabled.
Privilege SeAssignPrimaryTokenPrivilege, display name: Replace a process level token, not enabled.
Privilege SeSystemEnvironmentPrivilege, display name: Modify firmware environment values, not enabled.
Privilege SeIncreaseQuotaPrivilege, display name: Adjust memory quotas for a process, not enabled.
Privilege SeChangeNotifyPrivilege, display name: Bypass traverse checking, enabled.
Privilege SeRemoteShutdownPrivilege, display name: Force shutdown from a remote system, not enabled.
Privilege SeTcbPrivilege, display name: Act as part of the operating system, not enabled.
Privilege SeUndockPrivilege, display name: Remove computer from docking station, not enabled.
Privilege SeSecurityPrivilege, display name: Manage auditing and security log, not enabled.
Privilege SeTakeOwnershipPrivilege, display name: Take ownership of files or other objects, not enabled.
Privilege SeLoadDriverPrivilege, display name: Load and unload device drivers, not enabled.
Privilege SeManageVolumePrivilege, display name: Perform volume maintenance tasks, not enabled.
Privilege SeSystemProfilePrivilege, display name: Profile system performance, not enabled.
Privilege SeImpersonatePrivilege, display name: Impersonate a client after authentication, enabled.
Privilege SeSystemtimePrivilege, display name: Change the system time, not enabled.
Privilege SeCreateGlobalPrivilege, display name: Create global objects, enabled.
Privilege SeProfileSingleProcessPrivilege, display name: Profile single process, not enabled.
Privilege SeIncreaseBasePriorityPrivilege, display name: Increase scheduling priority, not enabled.
Privilege SeCreatePagefilePrivilege, display name: Create a pagefile, not enabled.
Groups:
Group count: 14
\LOCAL S-1-2-0 Attributes: SE_GROUP_MANDATORY SE_GROUP_ENABLED_BY_DEFAULT SE_GROUP_ENABLED
BUILTIN\Administrators S-1-5-32-544 Attributes: SE_GROUP_MANDATORY SE_GROUP_ENABLED_BY_DEFAULT SE_GROUP_ENABLED SE_GROUP_OWNER
\Everyone S-1-1-0 Attributes: SE_GROUP_MANDATORY SE_GROUP_ENABLED_BY_DEFAULT SE_GROUP_ENABLED
BUILTIN\Users S-1-5-32-545 Attributes: SE_GROUP_MANDATORY SE_GROUP_ENABLED_BY_DEFAULT SE_GROUP_ENABLED
STAG\SAP_R3T_LocalAdmin S-1-5-21-2727398557-1322528747-1943968026-1021 Attributes: SE_GROUP_MANDATORY SE_GROUP_ENABLED_BY_DEFAULT SE_GROUP_ENABLED
STAG\SAP_LocalAdmin S-1-5-21-2727398557-1322528747-1943968026-1023 Attributes: SE_GROUP_MANDATORY SE_GROUP_ENABLED_BY_DEFAULT SE_GROUP_ENABLED
STAG\None S-1-5-21-2727398557-1322528747-1943968026-513 Attributes: SE_GROUP_MANDATORY SE_GROUP_ENABLED_BY_DEFAULT SE_GROUP_ENABLED
NT AUTHORITY\INTERACTIVE S-1-5-4 Attributes: SE_GROUP_MANDATORY SE_GROUP_ENABLED_BY_DEFAULT SE_GROUP_ENABLED
NT AUTHORITY\NTLM Authentication S-1-5-64-10 Attributes: SE_GROUP_MANDATORY SE_GROUP_ENABLED_BY_DEFAULT SE_GROUP_ENABLED
NT AUTHORITY\Authenticated Users S-1-5-11 Attributes: SE_GROUP_MANDATORY SE_GROUP_ENABLED_BY_DEFAULT SE_GROUP_ENABLED
\ S-1-5-5-0-36828865 Attributes: SE_GROUP_MANDATORY SE_GROUP_ENABLED_BY_DEFAULT SE_GROUP_ENABLED SE_GROUP_LOGON_ID
NT AUTHORITY\REMOTE INTERACTIVE LOGON S-1-5-14 Attributes: SE_GROUP_MANDATORY SE_GROUP_ENABLED_BY_DEFAULT SE_GROUP_ENABLED
NT AUTHORITY\This Organization S-1-5-15 Attributes: SE_GROUP_MANDATORY SE_GROUP_ENABLED_BY_DEFAULT SE_GROUP_ENABLED
STAG\SAP_R3T_GlobalAdmin S-1-5-21-2727398557-1322528747-1943968026-1018 Attributes: SE_GROUP_MANDATORY SE_GROUP_ENABLED_BY_DEFAULT SE_GROUP_ENABLED
ERROR 2009-02-11 19:31:46.529 [sixxcstepexecute.cpp:984]
FCO-00011 The step MoveSchema with step key |SAPMSSTOOLS|ind|ind|ind|ind|0|0|MssSysCopy|ind|ind|ind|ind|4|0|MssSchemaMove|ind|ind|ind|ind|2|0|MoveSchema was executed with status ERROR .
TRACE 2009-02-11 19:31:46.544
Call block:CallBackInCaseOfAnErrorDuringStepExecution
function:CallTheLogInquirer
is validator: true
WARNING 2009-02-11 19:31:46.544 [iaxxejshlp.cpp:150]
Could not get property IDs of the JavaScript object.
ERROR 2009-02-11 19:31:46.544 [iaxxejsctl.cpp:492]
FJS-00010 Could not get value for property .
TRACE 2009-02-11 19:31:46.544
A problem occurs during execution the inquirer callback. SAPinst will switch back to the standard behaiviour.
TRACE 2009-02-11 19:31:46.544 [iaxxgenimp.cpp:707]
CGuiEngineImp::showMessageBox
<html> <head> </head> <body> <p> An error occurred while processing service SAP Toools for MS SQL Server > Database Copy. You may now </p> <ul> <li> choose <i>Retry</i> to repeat the current step. </li> <li> choose <i>View Log</i> to get more information about the error. </li> <li> stop the task and continue with it later. </li> </ul> <p> Log files are written to C:\Program Files/sapinst_instdir/MSS/CPY. </p> </body></html>
TRACE 2009-02-11 19:31:46.544 [iaxxgenimp.cpp:1245]
CGuiEngineImp::acceptAnswerForBlockingRequest
Waiting for an answer from GUI
Edited by: Tony Raimo on Feb 12, 2009 4:37 AM

This looks like a permissions issue on the source folder.
Copy the source to your local drive and try again.
I had the same issue and was able to resolve it after copying the EXP1, EXP2, and EXP3 folders to the local C: drive with Everyone given full access.
Yogesh -
Hello,
I am in charge of moving an existing SAP NW BI (SPS20) system from a MS SQL Server 2005 database on a Windows Server 2003 (x64/64-bit) platform to a MS SQL Server 2008 database on a Windows Server 2008 (x64/64-bit) platform.
The SID of the SAP system will remain the same.
The thing is that I have always worked so far in Oracle/Unix environments; Windows/MSSQL is an entirely new world for me, so I am quite worried.
I have read many SAP notes, installation guides, and forum posts regarding this operation. I will describe here my understanding of the technical operations that I will need to perform.
Could you correct me if I am wrong, as well as give me your comments?
1/ I will need to perform a SQL Server database copy within a homogeneous system copy.
2/ First, I'll have to install the SQL Server 2008 database on the target environment. I'll install the SQL Server database software with SQL4SAP (no need to install the database server software manually, right?). Afterwards, I'll install SAP NetWeaver 7.0 SR3 on the target environment with a modified Installation Master DVD.
3/ On the source environment, I'll perform a database backup of the SQL Server 2005 system and restore the database on the target environment. Since I chose to perform a restore of the database, there is no need to follow all the steps described in SAP Note 151603: determine the file structure of the source database, create the directory structure for the target database, etc.
4/ Once the restore is finished, I'll use SAPINST and finish the homogeneous copy: additional software -> system copy -> target system -> central system -> based on AS ABAP -> central installation.
Your comments are welcome.
Thank you for your attention
Best Regards.
Raoul

Hi Raoul,
you also need a modified Kernel DVD for the installation on SQL 2008 (see SAP Note 1152240).
And don't forget to install and patch the Visual C++ runtime (also described in that note).
I would also recommend restoring the backup first, then installing SAP NW 7.0 SR3 and using the restored SQL 2008 database.
So the basic steps are:
1. Perform backup and restore to SQL 2008
1.1 Install SQL 2008 with SQL4SAP
1.2 Backup old database
1.3. Restore backup on SQL 2008
2. Install VC runtime on Win 2008
3. Create modified Kernel DVD
4. Create modified Master DVD
5. Perform the system copy target installation (additional software -> system copy -> target system -> central system -> based on AS ABAP -> central installation) and use the SQL 2008 database
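The backup and restore in step 1 can be sketched in T-SQL. The database name, paths, and logical file names below are hypothetical; check the real logical names with RESTORE FILELISTONLY before restoring:

```sql
-- On the SQL Server 2005 source:
BACKUP DATABASE [BWP] TO DISK = 'D:\backup\BWP.bak' WITH INIT;

-- On the SQL Server 2008 target (MOVE relocates each logical file):
RESTORE DATABASE [BWP]
FROM DISK = 'D:\backup\BWP.bak'
WITH MOVE 'BWPDATA1' TO 'E:\BWP_DATA\BWPDATA1.mdf',
     MOVE 'BWPLOG1'  TO 'E:\BWP_LOG\BWPLOG1.ldf';
```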
You could also use SQL Server 2008 R2, which was released last month.
Please also check Note 1476928
Best,
Sebastian Dusch -
SQL Server CLR SharePoint Interface for Sharepoint 2013
Hello everybody.
Is the project
http://archive.msdn.microsoft.com/SqlClrSharePoint
actual to use with Sharepoint 2013 + SQL Server 2012 SP1?
Sergey Vdovin

Have you read the readme that's included within the archive? I'll quote it below for ease of reference:
Considering it's for SharePoint and SQL instances that are several versions below what's widely used or supported now, I'd be surprised if this worked. The db schema changes alone might make this tricky to get working.
1) Enable CLR in SQL Server:
sp_configure 'clr enabled',1
reconfigure
a) Create your target database if it doesn't exist.
b) Mark the database as TRUSTWORTHY, to allow you to run EXTERNAL_ACCESS CLR code.
alter database MyDB set trustworthy on
c) Make sure your database is owned by a sysadmin or the database owner has the EXTERNAL
ACCESS ASSEMBLY privilege.
2) Build the assembly SqlClrSharePointInterface.dll using VisualStudio (or MsBuild)
3) Pre-Generate the XML Serialization Assembly
XML Serializer Generator Tool (Sgen.exe)
http://msdn2.microsoft.com/en-us/library/bk3w6240(vs.80).aspx
eg:
sgen /a:c:\mycode\SqlClrSharePointInterface\bin\release\SqlClrSharePointInterface.dll /f
4) Install the Assemblies in SQL Server
a)Copy the XML Serialization assembly and SqlClrSharePointInterface.dll somewhere where your SQL Server
can read the files.
b) run CREATE ASSEMBLY on SqlClrSharePointInterface.dll, marking it as EXTERNAL_ACCESS
CREATE ASSEMBLY [SqlClrSharePointInterface]
FROM 'c:\deploy\SqlClrSharePointInterface.dll'
WITH PERMISSION_SET = EXTERNAL_ACCESS
c) run CREATE ASSEMBLY on the XML Serialization assembly
CREATE ASSEMBLY [SqlClrSharePointInterfaceXML]
FROM 'c:\deploy\SqlClrSharePointInterface.XmlSerializers.dll'
5) Register the functions by running CREATE FUNCTION on each UDF.
Something like:
CREATE FUNCTION [dbo].[GetListCollection](@siteUrl [nvarchar](4000))
RETURNS TABLE (
[Title] [nvarchar](max) NULL,
[Description] [nvarchar](max) NULL,
[Name] [uniqueidentifier] NULL,
[ItemCount] [int] NULL
) WITH EXECUTE AS CALLER
AS
EXTERNAL NAME [SqlClrSharePointInterface].[ListFunctions].[GetListCollection]
GO
CREATE FUNCTION [dbo].[GetListItemsTable](@siteUrl [nvarchar](4000), @listName [nvarchar](4000), @viewName [nvarchar](4000))
RETURNS TABLE (
[ID] [int] NULL,
[ModifiedBy] [nvarchar](200) NULL,
[Title] [nvarchar](200) NULL,
[ContentType] [nvarchar](100) NULL,
[Created] [datetime] NULL,
[Modified] [datetime] NULL,
[EncodedAbsUrl] [nvarchar](400) NULL
) WITH EXECUTE AS CALLER
AS
EXTERNAL NAME [SqlClrSharePointInterface].[ListFunctions].[GetListItemsTable]
GO
CREATE FUNCTION [dbo].[GetListItems](@siteUrl [nvarchar](4000), @listName [nvarchar](4000), @viewName [nvarchar](4000))
RETURNS [xml] WITH EXECUTE AS CALLER
AS
EXTERNAL NAME [SqlClrSharePointInterface].[ListFunctions].[GetListItems]
GO
Then query SharePoint. Something like
select * from dbo.GetListCollection('http://MySharePointSite');
to get the list of SharePoint lists available. Then retrieve the items for one of the lists.
The GetListItems function returns a single XML document containing all of the items. So to
make use of the data, you would typically use an XML-shredding query like this:
with ListItems as
(
    select dbo.GetListItems('http://MySharePointSite','Site Collection Documents',null) AllListItems
)
select
    Item.value('@ows_Title', 'varchar(50)') Title,
    Item.value('@ows_EncodedAbsUrl','varchar(max)') Url,
    Item.query('.') Item
from ListItems cross apply ListItems.AllListItems.nodes('/*/*') Items(Item)
There is also an example of a higher-performance solution that shreds the XML in CLR code
and returns a relational result to SQL Server. But you will need to customize the code
to return the fields that are relevant in your list.
Select * from dbo.GetListItemsTable('http://MySharePointSite','Site Collection Documents',null)
Steven Andrews
SharePoint Business Analyst: LiveNation Entertainment
Blog: baron72.wordpress.com
Twitter: Follow @backpackerd00d
My Wiki Articles:
CodePlex Corner Series
Please remember to mark your question as "answered" if this solves (or helps) your problem. -
Convert SQL Server stored procedure to Oracle stored procedure
Hi, I want to convert SQL Server stored procedures to Oracle stored procedures.
I need a converter; if anybody knows an application that does this, please tell me.

Hi Hoek,
I have tried this:
1. Get the DATETIME into a VARCHAR2 variable.
2a. Then I use a TIMESTAMP variable and TO_TIMESTAMP like this:
v_tmst := TO_TIMESTAMP(r_typ.date1, 'DD.MM.YYYY HH24');
2b. Or the function CAST:
v_tmst := CAST(r_typ.date1 AS TIMESTAMP);
Same result: I lose the HH:MI:SS.FFF the moment the DATETIME from SQL Server is copied into an Oracle VARCHAR2 variable.
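If the fractional seconds are actually present in the VARCHAR2 value, extending the format mask should preserve them. A hedged sketch, assuming the string looks like '30.08.2012 16:27:05.123':

```sql
v_tmst := TO_TIMESTAMP(r_typ.date1, 'DD.MM.YYYY HH24:MI:SS.FF3');
```

The mask 'DD.MM.YYYY HH24' does not describe minutes, seconds, or fractional seconds, so it cannot carry them through the conversion; FF3 maps three fractional digits, matching SQL Server's DATETIME precision.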
I also read the article you linked (I read it before I wrote the first posting, but not in deep detail).
I'm afraid this cannot help me, or at least I'm not able to find out how.
They talk about ODI, but I'm not using that tool.
Moreover, they talk about creating a table, but they don't say anything regarding the conversion itself and how to get the milliseconds.
I don't know whether there is something I can take from there or not.
Anyway, thank you for the reply.
Francisco.
Edited by: FranBlanes on 30.08.2012 04:27 -
Hello everybody,
Can anyone please tell me how I can configure JBoss with SQL Server 2000 to access its tables?
Regards
sonia

To configure JBoss 4.0 with a MS SQL Server database, the MS SQL Server driver classes are required on the classpath:
1. Copy the MS SQL Server JDBC driver jar files (mssqlserver.jar, msbase.jar, msutil.jar) to the server/default/lib directory.
2. To configure a non-XA MS SQL Server datasource, copy /docs/examples/jca/mssql-ds.xml to the /server/default/deploy directory; to configure a MS SQL Server XA datasource, copy /docs/examples/jca/mssql-xa-ds.xml there instead.
3. Modify the mssql-ds.xml configuration file. The driver class and connection URL settings for the MS SQL Server JDBC drivers are:
Driver Class: com.microsoft.jdbc.sqlserver.SQLServerDriver
Connection URL: jdbc:microsoft:sqlserver://localhost:1433;DatabaseName=MyDatabase
To configure the XA JDBC driver for MS SQL Server, modify the mssql-xa-ds.xml configuration file instead and use:
Driver Class: com.microsoft.jdbcx.sqlserver.SQLServerDataSource
4. The standardjbosscmp-jdbc.xml configuration file is configured for the Hypersonic database by default. To configure the JBoss server for MS SQL Server, modify /server/default/conf/standardjbosscmp-jdbc.xml and set the <datasource> and <datasource-mapping> elements:
<jbosscmp-jdbc>
  <defaults>
    <datasource>java:/MSSQLDS</datasource>
    <datasource-mapping>MS SQLSERVER2000</datasource-mapping>
  </defaults>
</jbosscmp-jdbc>
5. Modify the login-config.xml configuration file with the MS SQL Server database settings by adding the following <application-policy/> element:
<application-policy name="MSSQLDbRealm">
  <authentication>
    <login-module code="org.jboss.resource.security.ConfiguredIdentityLoginModule" flag="required">
      <module-option name="principal">sa</module-option>
      <module-option name="userName">sa</module-option>
      <module-option name="password"></module-option>
      <module-option name="managedConnectionFactoryName">jboss.jca:service=LocalTxCM,name=MSSQLDS</module-option>
    </login-module>
  </authentication>
</application-policy>
By modifying mssql-ds.xml, standardjbosscmp-jdbc.xml, and login-config.xml, the JBoss 4.0 server is configured to be used with a MS SQL Server database.
LabVIEW & MS SQL SERVER Architect and Developer
Have keen interest and expert-level know-how in architecting and developing databases geared to automated measurement and control systems that directly interface with LabVIEW.
Here is the specific problem-solving expertise brought to projects:
Knowledge-Generating & Flexible Database Schema for Measurement and Control Systems
In 8 years I have gone through ~15 iterations of database schema and pre-defined queries to finally arrive at guidelines and principles for generating them for automated control and measurement systems. Re-use of such schema, along with the pre-designed database queries, has shortened integration time from months to weeks.
Fast Data Inserts from LabVIEW to Databases
LabVIEW applications can generate large amounts of data that cannot be inserted fast enough into databases. However, there are certain data-insert techniques that can overcome this challenge. MS SQL Server 'Bulk Insert' is a technique that I have mastered well; it can upload up to 35,000 sample points in one second, making databases viable in many automated test and measurement scenarios.
LabVIEW Real-Time & MS SQL Server Interfacing
ADO (ActiveX Data Objects) cannot run in LabVIEW real-time systems. However, robust re-usable TCP/IP client-server communication modules bridge the gap between LabVIEW Real-Time and MS SQL Server. These modules run fast enough and can handle complex projects when used in conjunction with LabVIEW FTP and the bulk-insert methodology.
Reporting Using Feature-Rich Data Grids
Reporting data with comprehensive search, filter, and hierarchical organization is accomplished using third-party data grids. Have full mastery of one advanced data grid for this, namely ComponentOne VSFlexGrid 8.0.
For further info please visit Company web-site at: http://www.mezintel.com
Regards
Anthony L.

Hi:
I wrote out a reply but could not post because this web site says that my message was more than 10,000 characters and that this exceeds the allowed message length.
. . . But my message was only 2,300 characters!
Anyway, I pasted the reply into a Word document and have attached it here. I have also attached samples of a 'Format File' and a 'Data File'.
Should you wish to create a dynamic bulk insert stored procedure, here it is:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROC [dbo].[AC_Import_SamplesBulkCopy]
@DataFile nvarchar(150),
@FormatFile nvarchar(150),
@RowsPerBatch nvarchar(100) = '5000'
AS
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED
SET NOCOUNT ON;
DECLARE @StrgSQL nvarchar(2000)
SET @StrgSQL = 'BULK INSERT [LOG_ResultsValues] FROM ''' + @DataFile +
''' WITH (FORMATFILE = ''' + @FormatFile + ''', ROWS_PER_BATCH = ' + @RowsPerBatch + ')'
EXEC (@StrgSQL)
GO
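A hypothetical call of the procedure above, using file names like the attachments mentioned (the paths are assumptions):

```sql
EXEC [dbo].[AC_Import_SamplesBulkCopy]
     @DataFile     = N'C:\import\SampleData.txt',
     @FormatFile   = N'C:\import\SampleFormat.txt',
     @RowsPerBatch = '10000';
```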
Regards
Anthony L.
Attachments:
SQL Bulk Insert Tips.docx 16 KB
SampleFormat.txt 2 KB
SampleData.txt 102 KB -
Downgrade SQL Server 2008 Enterprise to SQL Server 2008 Standard Edition
I would like to know from forum users what steps I should take for the above-mentioned activity; please also let me know the best suitable option for production.
SNS

You can't do an in-place downgrade. You need to go with the backup -> uninstall Enterprise -> install Standard -> bring it to the same patch level -> restore databases approach.
You need to make sure that no Enterprise-only feature is in use, else you can't restore the database on Standard.
Balmukund Lakhani | Please mark solved if I've answered your question, vote for it as helpful to help other user's find a solution quicker
This posting is provided "AS IS" with no warranties, and confers no rights.
My Blog |
Team Blog | @Twitter
Balmukund's answer above is the Microsoft-supported route for downgrading an instance of SQL Server. However, before you do this, you need to query the sys.dm_db_persisted_sku_features DMV in each of your user databases to make certain that you don't have any Enterprise features in use that would block your ability to restore a database onto Standard Edition.
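The check itself is a one-line query, run in the context of each user database (the database name below is hypothetical):

```sql
USE MyUserDatabase;  -- repeat for each user database
-- An empty result means no Enterprise-only features block the downgrade.
SELECT feature_name
FROM sys.dm_db_persisted_sku_features;
```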
No matter what you are going to have to do an uninstall and reinstall of the SQL Server instance to downgrade the SKU. However, you can save yourself some time and the headache of trying to restore the system databases if you are careful about what
you do. I have done a couple of SKU downgrades in the past and the easiest way to do it, and I am not saying this is the Microsoft supported way but that it works if done correctly, is to:
1. Take a good backup of all of your databases (system and user).
2. Run SELECT @@VERSION and note the specific build number of SQL Server that you are currently on.
3. Shut down your existing instance of SQL Server.
4. Copy the master, model, and msdb database files (both mdf and ldf) from the current location to a new folder that you mark as read-only (copy them, don't move them).
5. Uninstall SQL Server from the system.
6. Reboot the server.
7. Install SQL Server Standard Edition.
8. Apply the necessary Service Pack and/or Cumulative Updates to bring the instance up to your previous build number.
9. Shut down SQL Server.
10. Copy the master, model, and msdb database files (both mdf and ldf) from the folder you saved them in to the correct location for the new install, remove the read-only flag from the files, and change the file ACLs so that the SQL Server service account has Full Control over them.
11. Start up SQL Server. If you did it correctly, it will start up exactly where you were before you made any changes, with all of your user databases online, and you should be ready to let applications connect and resume operations.
If you screw something up in the above, you still have your backups, and you can run setup to rebuild the system databases and then follow the Microsoft-supported path for restoring the system databases and then the user databases to bring the instance online. Essentially the file copy is no different than what would occur through attach/detach; you are just doing it with system databases, which is not supported, but it does work. The key is to take your backups before you do anything, so you have the supported route available if you encounter an issue. The only issue I have ever had doing this set of steps is that I didn't set the file ACLs correctly, and the database engine threw Access Denied errors and failed to start until I fixed the ACLs. This can save you many hours of frustration and downtime trying to restore everything, since the database files are already there and it is just some small copy operations to put them where they need to be.
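For step 1 above, one way to script the "good backup of all of your databases" is to generate the BACKUP statements from sys.databases. A sketch (the target path is an assumption; adjust it and review the generated statements before running them):

```sql
-- Generate one full backup statement per database, excluding tempdb
-- (which cannot be backed up); run the output in a separate batch
SELECT 'BACKUP DATABASE ' + QUOTENAME(name)
     + ' TO DISK = N''D:\Backups\' + name + '.bak'''
     + ' WITH CHECKSUM, INIT;'
FROM sys.databases
WHERE name <> 'tempdb';
```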
Balmukund, feel free to beat me over the head for suggesting a non-supported path. :-)
Jonathan Kehayias | Senior Consultant,
SQLSkills.com
SQL Server MVP | Microsoft Certified Master: SQL Server 2008
Feel free to contact me through
My Blog or
Twitter. Become a
SQLskills Insider!
Please click the Mark as Answer button if a post solves your problem! -
MS SQL server 2008 - Bulk copy from XML to DB table throws bcp_init error
I have MS SQL server 2008 installed ,
Windows version - Windows 7 Professional with SP1
Doing a bulk copy using the SQL library function bcp_init in C++ throws an XML error, and the data is not inserted into the tables.
Error received ,
XML Datatransfer error: XML data or another error occurred while reading file 'd:\temp\scripts\dbtoolscripts\table_data.xml':
But this works on other machines with the same Windows version and SQL version. We are using the same SQL lib function bcp_init; we have written a separate class to load ODBC32.dll and sqlncli.dll and use bcp_init and the other SQL functions from it. As I mentioned earlier, we all use the same application build, but it is not working on my machine; on the other machines it works.
There is no provision to load XML files when you use the BCP interface in sqlncli.dll. You can of course load XML files, but the BCP API does not know that it is XML, it only sees a number of bytes.
So that error message is not coming from the BCP API, but somewhere else. Maybe your own code in reaction to some error from the BCP API. But without any clue of that error message, we can't help you.
I think you will need to do some debugging or by some other means improve your diagnostics.
Erland Sommarskog, SQL Server MVP, [email protected] -
Bulk Insert from SQL Server to Oracle
I have to load around 20 million rows from a SQL Server table to an Oracle table over a network link. I wrote the following code using BULK COLLECT, which works but takes too long (about 5 hours).
I also tried setting the table to parallel degree 8, which didn't help (the Oracle table is also set to NOLOGGING mode).
Is there a better way to do this? I appreciate any help in this regard.
Script :
CREATE OR REPLACE PROCEDURE INSERT_SQLSERVER_TO_ORACLE
IS
  TYPE v_ARRAY IS TABLE OF TARGET_CUST%ROWTYPE INDEX BY BINARY_INTEGER;
  ins_rows v_ARRAY;
  CURSOR REC1 IS
    SELECT COL1, COL2, COL3, COL4
    FROM SOURCE_SQLSERVER_CUST;
BEGIN
  OPEN REC1;
  LOOP
    FETCH REC1 BULK COLLECT INTO ins_rows LIMIT 5000;
    -- exit before the FORALL: a FORALL over an empty collection raises an error
    EXIT WHEN ins_rows.COUNT = 0;
    FORALL i IN 1 .. ins_rows.COUNT
      INSERT INTO TARGET_CUST VALUES ins_rows(i);
  END LOOP;
  COMMIT;
  CLOSE REC1;
END;
Thanks in Advance.

887204 wrote:
I have to load around 20 million rows from a SQL Server table to an Oracle table using a network link. I wrote the following code using BULK COLLECT, which is working but taking too long (5 hrs).

I would not pull that data via a network link and use standard SQL insert statements. Bulk processing is meaningless in this context; it does nothing to increase the performance, as context switching is not the issue here.
The biggest factor is pulling 20 million rows' worth of data via a database link across the network. This will be slow by its very nature.
I would use bcp (Bulk Copy export) on SQL-Server to write the data to a CSV file.
Zip that file. FTP/scp/sftp it to the Oracle server. Unzip it.
Then do a parallel direct load of the data using SQL*Loader.
This will be a lot faster than pulling uncompressed data across the network a couple of rows at a time (together with the numerous moving parts on the Oracle side that use an HS agent as the interface between SQL Server and the Oracle database).
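On the Oracle side, the SQL*Loader step described above can also be done with an external table plus a direct-path parallel insert. A sketch with assumed names (the data_dir directory object, the CSV file name, and the four column definitions are all assumptions; adjust them to the real file layout):

```sql
-- External table over the exported CSV (add a date mask to the access
-- parameters if the file contains formatted dates)
CREATE TABLE target_cust_ext (
  col1  NUMBER,
  col2  VARCHAR2(100),
  col3  VARCHAR2(100),
  col4  VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('target_cust.csv')
);

-- Direct-path parallel load into the target table
ALTER SESSION ENABLE PARALLEL DML;
INSERT /*+ APPEND PARALLEL(t, 8) */ INTO target_cust t
SELECT * FROM target_cust_ext;
COMMIT;
```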