Incomplete Crawl of SQL Server Table Source

We are currently experiencing a problem with a Database-based crawler in SES 10.1.8.2 where scheduled crawls produce different results than crawls that are executed manually. Crawler configuration is as follows:
- Database Source Type pointing at a rollup table in SQL Server.
- Rollup table is deleted and re-created each evening prior to SES indexing. Rollup table includes a "KEY" column that is the primary key.
- Schedule properties include a recrawl policy of "Process All Documents" and a crawling mode of "Automatically Accept All URLs for Indexing"
The symptom is an incomplete crawl of the source. For example, a source table that contains 1,734 rows translates into only 567 indexed documents in SES. This coincides with the Browse counts when using the out-of-the-box version of SES search. If I then manually launch the schedule, the proper number of documents is processed and the Browse feature reflects the proper count.
One other piece of information: examining the log for a scheduled crawl, a document that was not indexed does appear in the log as queued, but we never see a log line indicating it was processed.
The bottom line is a scheduled crawl fails to index all database table rows. A manual crawl consistently indexes everything properly.
Finally, we have made sure that we have minimal overlap of scheduled crawls in case this is a resource issue.
Any assistance would be greatly appreciated.

1. What is the use of table compression?
Save disk space and sometimes also gain performance.
2. When do we need to compress a table?
Need is maybe not the best word, but when we want to reduce disk space and/or make performance gains, we would consider compression.
And, not to forget, only if we are on Enterprise Edition. Compression is not available in other editions.
3. If I compress a table, what will be the performance impact?
There are two levels of compression: ROW and PAGE. ROW is basically a different storage format, which gives a more compact format for most data profiles, not least if you have plenty of fixed-length columns that are often NULL. ROW compression has a fairly low CPU overhead. Since compression means that the data takes up less space, a scan of the full table will be faster. This is why you may gain performance.
PAGE compression is more aggressive and uses a dictionary. You can make a bigger gain in disk space, but the CPU overhead is fairly considerable, so it is less likely that you will make a net gain.
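As a minimal sketch (the table name is a placeholder), either level is enabled with a rebuild:
ALTER TABLE dbo.YourBigTable REBUILD WITH (DATA_COMPRESSION = ROW);   -- compact format, low CPU overhead
ALTER TABLE dbo.YourBigTable REBUILD WITH (DATA_COMPRESSION = PAGE);  -- bigger space gain, higher CPU cost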
To find out how your system is affected, there is a stored procedure, of which I don't recall the name right now, which can give you estimated space savings. But if you also want to see the performance effects, you will need to run a test with your workload.
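(The procedure referred to is presumably sp_estimate_data_compression_savings; a usage sketch, with a placeholder table name:)
EXEC sp_estimate_data_compression_savings
    @schema_name = 'dbo',
    @object_name = 'YourBigTable',
    @index_id = NULL,          -- all indexes
    @partition_number = NULL,  -- all partitions
    @data_compression = 'PAGE';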
There is also columnstore, which is also a form of compression, and which for data warehouses can give enormous performance gains.
Erland Sommarskog, SQL Server MVP, [email protected]

Similar Messages

  • How can I load a .xlsx File into a SQL Server Table using a Foreach Loop Container in SSIS?

    I know I've REALLY struggled with this before. I just don't understand why this has to be soooooo difficult.
    I can very easily do a straight Data Pump of a .xlsX File into a SQL Server Table using a normal Excel Connection and a normal Excel Source...simply converting Unicode to DT_STR and then using an OLE DB Destination of the SQL Server Table.
    If I want to make the SSIS Package a little more flexible by allowing multiple .xlsX spreadsheets to be pumped in by using a Foreach Loop Container, the whole SSIS Package seems to go to hell in a hand basket. I simply do the following...
    Put the Data Flow Task within the Foreach Loop Container
Add the Variable Mapping Variable User::FilePath that I defined as a Variable and a string within the Foreach Loop Container
    I change the Excel Connection and its Expression to be ExcelFilePath ==> @[User::FilePath]
    I then try and change the Excel Source and its Data Access Mode to Table Name or view name variable and provide the Variable Name User::FilePath
    And that's when I run into trouble...
    Exception from HRESULT: 0xC02020E8
Error at Data Flow Task [Excel Source [56]]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
    Error at Data Flow Task [Excel Source [56]]: Opening a rowset for "...(the EXACT Path and .xlsx File Name)...". Check that the object exists in the database. (And I know it's there!!!)
I don't understand why adding a Foreach Loop Container, to try and make this as efficient as possible, has caused such an error, unless I'm overlooking something. I have even tried delaying my validations and that doesn't seem to help.
    I have looked hard in Google and even YouTube to try and find a solution for this but for the life of me I cannot seem to find anything on pumping a .xlsX file into SQL Server using a Foreach Loop Container.
Can ANYONE please help me out here? I'm at the end of my rope trying to get this to work. I think the last time I was in this quandary, trying to pump a .xlsX File into a SQL Server Table using a Foreach Loop Container in SSIS, I actually wrote a C# Script to write the contents of the .xlsX File into a .csv File and then actually used the .csv File to pump the data into a SQL Server Table.
    Thanks for your review and am hoping and praying for a reply and solution.

    Hi ITBobbyP,
    If I understand correctly, you want to load data from multiple sheets in an .xlsx file into a SQL Server table.
If this is your scenario, please refer to the following tips:
    The Foreach Loop container should be configured as shown below:
    Enumerator: Foreach ADO.NET Schema Rowset Enumerator
Connection String: the OLE DB connection string for the Excel file (see the example after these steps).
    Schema: Tables.
    In the Variable Mapping, map the variable to Sheet_Name, and change the Index from 0 to 2.
The connection string for the Excel Connection Manager is the original one; we needn't make any change.
    Change Table Name or View name to the variable Sheet_Name.
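For reference, a typical ACE OLE DB connection string for an .xlsx file looks like the following (the provider version and file path are assumptions):
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Files\Source.xlsx;Extended Properties="Excel 12.0 XML;HDR=YES";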
    If you want to load data from multiple sheets in multiple .xlsx files into a SQL Server table, please refer to following thread:
    http://stackoverflow.com/questions/7411741/how-to-loop-through-excel-files-and-load-them-into-a-database-using-ssis-package
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • Data Pump .xlsx into a SQL Server Table and the whole 32-Bit, 64-Bit discussion

    First of all...I have a headache!
    Found LOTS of Google hits when trying to data pump a .xlsx File into a SQL Server Table. And the whole discussion of the Microsoft ACE 64-Bit Driver or the Microsoft Jet 32-Bit Driver.
    Specifically receiving this error...
    An OLE DB record is available.  Source: "Microsoft Office Access Database Engine"  Hresult: 0x80004005  Description: "External table is not in the expected format.".
    Error: 0xC020801C at Data Flow Task to Load Alere Coaching Enrolled, Excel Source [56]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.  The AcquireConnection method call to the connection manager "Excel Connection Manager"
    failed with error code 0xC0202009.
Strangely enough, if I simply data pump ONE .xlsx File into a SQL Server Table utilizing my SSIS Package, it seems to work fine. If instead I am trying to be pro-active and allowing for multiple .xlsx Files by using a Foreach Loop Container and a variable @[User::FileName], it's erroring out...but not really, because it is indeed storing the rows in the SQL Server Table. I did check all my DelayValidation settings.
    Why does this have to be sooooooo difficult???
Can anyone help me out here in trying to set up an SSIS Package in a rather constrictive environment to pump a .xlsx File into a SQL Server Table? What in God's name am I doing wrong? Or is all this a misnomer? But if it's working, how do I disable the error so that it stops erroring out?

    Hi ITBobbyP,
According to your description, when you import data from an .xlsx file into a SQL Server database, you get the error message.
    The error can be caused by the following reasons:
The Excel file is locked by another process. Please resave the file under a different name to see if the issue is fixed.
The ACE (Access Database Engine) is not up to date, as Vaibhav mentioned. Please download the latest ACE and install it from the link:
    https://www.microsoft.com/en-us/download/details.aspx?id=13255.
The bitness of Office and the server do not match. To solve the problem, please refer to the following document:
    http://hrvoje.piasevoli.com/2010/09/01/importing-data-from-64-bit-excel-in-ssis/
    If you have any more questions, please feel free to ask.
    Thanks,
    Wendy Fu
    TechNet Community Support

  • How to delete rows in the SQL Server table based on the contents of Excel file

    Hello, everyone,
I have an Excel file which contains data for certain dates. I need to load it into a SQL Server table. But before doing that, I need to check whether the table already contains data for the dates in the Excel file. If it does, I need to delete that data first. I am not sure what the best and most efficient way to do this is. Your help and guidance would be much appreciated.
    Thank you in advance.

There are multiple ways of doing this.
The fastest method would be as below:
1. Have a Data Flow Task using an Excel source, then add an OLE DB destination to dump the data to a staging table.
2. Have an Execute SQL Task to delete data from your actual table based on the staging table data. The query would look like:
DELETE t
FROM YourTable t
WHERE EXISTS (SELECT 1
              FROM StagingTable s
              WHERE s.DateField = t.DateField)
3. Have another Execute SQL Task to do the final insert, like:
INSERT INTO YourTable (Col1, Col2, ...)
SELECT Col1, Col2, ...
FROM StagingTable
The above operations are set-based and therefore faster. A combined variant is sketched below.
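If you want steps 2 and 3 to run atomically, a sketch combining them in a single Execute SQL Task batch (table and column names are placeholders):
BEGIN TRAN;
-- remove rows for the dates that are about to be reloaded
DELETE t
FROM YourTable t
WHERE EXISTS (SELECT 1 FROM StagingTable s WHERE s.DateField = t.DateField);
-- reload those dates from staging
INSERT INTO YourTable (Col1, Col2)
SELECT Col1, Col2
FROM StagingTable;
COMMIT;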
Visakh (http://visakhm.blogspot.com/)

  • Writing a stored procedure to import SQL Server table data into a Oracle table

    Hello,
As a new DBA, I have been tasked with writing a stored procedure to import SQL Server table data into an Oracle table. I have been given many suggestions on how to do it from the SQL Server side, but I just need to write a stored procedure to run it from the Oracle side. Suggestions/guidance on where to start would be greatly appreciated! Thank you!
    I started to write it based on what I have but I know this is not correct :/
    # Here is the select statement for the data source in SQL Server...
    SELECT COMPANY
    ,CUSTOMER
    ,TRANS_TYPE
    ,INVOICE
    ,TRANS_DATE
    ,STATUS
    ,TRAN_AMT
    ,CREDIT_AMT
    ,APPLD_AMT
    ,ADJ_AMT
    ,TRANS_USER1
    ,PROCESS_LEVEL
    ,DESCRIPTION
    ,DUE_DATE
    ,OUR_DATE
    ,OUR_TIME
    ,PROCESS_FLAG
    ,ERROR_DESCRIPTION
      FROM data_source_table_name
    #It loads data into the table in Oracle....   
    Insert into oracle_destination_table_name (
    COMPANY,
    CUSTOMER,
    TRANS_TYPE,
    INVOICE,
    TRANS_DATE,
    STATUS,
    TRANS_AMT,
    CREDIT_AMT,
    APPLD_AMT,
    ADJ_AMT,
    TRANS_USER1,
    PROCESS_LEVEL,
    DESCRIPTION,
    DUE_DATE,
    OUR_DATE,
    OUR_TIME,
    PROCESS_FLAG,
    ERROR_DESCRIPTION)
    END;

Posting CREATE TABLE statements would have been better, as MS-SQL and Oracle don't have the same data types.
    OUR_DATE, OUR_TIME will (most likely) be ONE column in Oracle.
    DATABASE LINK
    Personally, I'd just load the data over a database link:
    insert into oracle_destination_table_name ( <column list> )
    select ... <transform data here>
    from data_source_table@mssql_db_link
    As far as creating the database link from Oracle to MS-SQL ... that is for somebody else to answer.
    (most likely you'll need to use an ODBC driver)
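To wrap the database-link approach in the stored procedure the poster asked for, here is a minimal sketch, assuming a working link named mssql_db_link and gateway-exposed (quoted, case-sensitive) SQL Server column names; the column lists are abbreviated:
CREATE OR REPLACE PROCEDURE load_from_mssql AS
BEGIN
  -- pull rows over the database link and insert into the Oracle table
  INSERT INTO oracle_destination_table_name
    (company, customer, trans_type /* ...remaining columns... */)
  SELECT "COMPANY", "CUSTOMER", "TRANS_TYPE" /* ...remaining columns... */
  FROM data_source_table@mssql_db_link;
  COMMIT;
END load_from_mssql;
/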
    EXTERNAL TABLE
If the data from MS-SQL is in a CSV file, just use an external table.
Same concept:
    insert into oracle_destination_table_name ( <column list> )
    select ... <transform data here>
    from data_source_external_table
    MK

  • Loading a SQL Server table

    Hi All,
I am developing a straight source-to-target interface from a SQL Server table to a SQL Server table. While executing the interface, I get an error message:
    " Cannot insert explicit value for identity column in table 'xxx' when IDENTITY_INSERT is set to OFF."
    After investigating, it appears I have to execute -
    SET IDENTITY_INSERT xxx ON
    GO
This execution is session-specific, i.e. I have to execute this statement each time ODI inserts into the target table. Is there a way I can have the tool run this statement?
    TIA
    MN
    ODI Version - 11g
    Source - SQL Server 2008 R2
    Target- SQL Server 2008 R2

In the IKM, before the "Insert new rows" step:
Create a new command; call it IDENTITY_INSERT_ON.
In "Command on Target", paste the following statement:
SET IDENTITY_INSERT <%=snpRef.getTable("L","TARG_NAME","A")%> ON
GO
This way, before inserting, the required command is executed on the target table. Also, please use the same properties (like technology and transaction) as specified in "Insert new rows".
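For illustration, if the target table were dbo.TRG_CUSTOMER (a hypothetical name), the command would resolve at run time to:
SET IDENTITY_INSERT dbo.TRG_CUSTOMER ON
GO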
    Hope this helps.

  • Export SQL Server Table into Multiple Sheets in Excel

    I'm trying to understand the example here.
    http://visakhm.blogspot.in/2013/09/exporting-sqlserver-data-to-multiple.html
    Basically, I'm up to step #5.
    5. Data Flow Task to populate the created sheet with data for that subject. The Data Flow looks like below
    I don't understand this part.  Has anyone worked with this sample before?  Has anyone gotten this to work?  I'm trying to learn SSIS better, but I'm finding it hard to get started with this stuff.  I guess if I get a couple projects under
    my belt, I'll be fine.  The hardest part is getting started.
If anyone feels really ambitious today, maybe you can assist me with two other projects as well.
    #1)
    http://visakhm.blogspot.in/2011/12/simulating-file-watcher-task-in-ssis.html
    #2)
    http://sqlage.blogspot.in/2013/12/ssis-read-multiple-sheets-from-excel.html
    http://beyondrelational.com/modules/24/syndicated/398/Posts/18163/ssis-how-to-loop-through-multiple-excel-sheets-and-load-them-into-a-sql-table.aspx
    I'd greatly appreciate any help I can get with this.
    Thanks!!
    Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.

    Hi Ryguy72,
    The solution introduced in Visakh’s blog does work. I have written the detailed steps for you:
    1. Create an Excel file (Des.xls) as the destination file, rename the Sheet1 to “Excel Destination”, and add four columns: ID, Name, Subject, and Marks.
    2. Create an Object type variable Subject, a String type variable SheetName, and a String type variable WorkSheetQuery.
3. Set the value of variable SheetName to “Excel Destination$”, and set the expression of variable WorkSheetQuery to the following (an evaluated example appears after these steps):
"CREATE TABLE `" + @[User::SheetName] + "` (
`ID` Long,
`Name` LongText,
`Subject` LongText,
`Marks` Long
)"
    4. In the Execute SQL Task outside the Foreach Loop Container, set its General tab as follows:
    ResultSet: Full result set
    ConnectionType: OLE DB
    Connection: LocalHost.TestDB (e.g. an OLE DB Connection to the source SQL Server table)
    SQLSourceType: Direct input
    SQLStatement: SELECT DISTINCT [Subject] FROM [TestDB].[dbo].[DynamicExcelSheetDemo]
    5. In the Result Set tab of this Execute SQL Task, map result “0” to variable “User::Subject”.
    6. In the Collection page of the Foreach Loop Container, set the Enumerator option to “Foreach ADO Enumerator”, and set the “ADO object source variable” to “User::Subject”.
    7. In the Variable Mapping page of the container, map variable “User::SheetName” to index “0”.
    8. Create an OLE DB Connection Manager to connect to the destination Excel file. Choose the provider as Native OLE DB\Microsoft Jet 4.0 OLE DB Provider, specify the fully qualified path of the Excel file (such as C:\Temp\Des.xls), and set the Extended Properties
    option of the Connection Manager to “Excel 8.0”. Click “Test Connection” button to make sure the connection is established successfully.
    9. Set the General page of the Execute SQL Task inside the container as follows:
    ResultSet: None
    ConnectionType: OLE DB
    Connection: Des (e.g. the OLE DB Connection to the destination Excel file we create above)
    SQLSourceType: Variable
    SQLStatement: User::WorkSheetQuery
    10. In the Data Flow Task, add an OLE DB Source component, set the connection manager to “LocalHost.TestDB”, set “Data access mode” to “SQL command”, and set the “SQL command text” to:
    SELECT * FROM [TestDB].[dbo].[DynamicExcelSheetDemo] WHERE [Subject]=?
    11. Click the “Parameters…” button, and map Parameter0 to variable “User::SheetName”.
    12. Add an Excel Destination component, setup an Excel Connection Manager to the destination Excel file, set the data access mode to “Table name or view name variable”, and select variable name as “User::SheetName”. In this way, the OLE DB Provider can get
    the Excel metadata and enable us to finish the column mappings correctly.
    13. Since the Microsoft Jet 4.0 has only 32-bit driver, please make sure to set the Run64BitRuntime property of the IS project to False.
In addition, please note that there will be one useless worksheet “Excel Destination” in the Excel file. We can remove it manually or use a Script Task to delete this worksheet programmatically (this requires installing the .NET Programmability Support feature during Office installation).
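As an illustration of step 3, when the loop assigns the (hypothetical) subject “Maths” to SheetName, the WorkSheetQuery expression evaluates to this Jet SQL statement:
CREATE TABLE `Maths` (
`ID` Long,
`Name` LongText,
`Subject` LongText,
`Marks` Long
)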
    Regards,
    Mike Yin
    TechNet Community Support

  • How to insert into SQL server table form oracle forms

I created a form with Oracle as my database. But there is one trigger where I need to insert the data into a SQL Server table.
Is this possible? If so, can anyone help me out?
    Thanks in advance.
    Asha

    Hi,
You can insert into a SQL Server database using the following steps.
Note: Check whether you are using Forms 32-bit drivers. If not, the ODBC data source will not work.
Step 1: Create an ODBC data source for SQL Server (one-time creation).
Step 2: Log out from Oracle and log in to SQL Server, giving the user name, password and host string as odbc:<odbc datasource name>.
Step 3: Use EXEC SQL statements to insert the values into SQL Server, and then log out and log in again to your Oracle database.
Second method:
Check the SQL Server documentation for inserting values using command-line parameters. Then you can call the Host command to execute this.
Third method:
Write a VB exe to enter the values into SQL Server, making two connections, one to Oracle and another to SQL Server, getting values from Oracle and putting them into the SQL Server database. You can call this exe using the Host command.
    Hope this will help You.
    Regards
    Gaurav Thakur

  • How to select data from 3rd row of Excel to insert into Sql server table using ssis

    Hi,
I have Excel files with headers in the first two rows. I want to skip those two rows and select data from the 3rd row to insert into a SQL Server table using SSIS; the 3rd row has the column names.

Layout of the Excel file:
Row 1 (merged cells):  CUSTOMER DETAILS
Row 2 (merged cells):  REGION
Row 3 (column names):  COL1  COL2  COL3  COL4  COL5  COL6  COL7  COL8  COL9  COL10  COL11
Row 4:                 1     XXX   yyyy  zzzz
Row 5:                 2     XXX   yyyy  zzzzz
Row 6:                 3     XXX   yyyy  zzzzz
Row 7:                 4     XXX   yyyy  zzzzz
The first two rows have merged cells with headings in Excel; I want to skip the first two rows, select the data from the 3rd row, and insert it into SQL Server using SSIS.
Set the range within the Excel command as per below.
    See
    http://www.joellipman.com/articles/microsoft/sql-server/ssis/646-ssis-skip-rows-in-excel-source-file.html
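A sketch of the range trick from that article: set the Excel source's data access mode to “SQL command” and query a range that starts at row 3, so the two merged header rows are skipped (sheet name and end row are hypothetical):
SELECT * FROM [Sheet1$A3:K65536]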
    Visakh
    My MSDN Page
    My Personal Blog
    My Facebook Page

  • Using MS-SQL Server tables to build Universes

My team has been tasked to create Universes so users can create WEBI reports. We don't have a business warehouse, but that appeared not to be a problem, as our understanding is that we can create Universes directly from MS-SQL Server tables.
On attempting to use Microsoft SQL Server Management Studio (2005), we find that once we expand the list of tables for SAP, performance slows down to a 5-to-10-second response time between mouse clicks.
Is there a setting to work around this? Are we approaching this the wrong way? Is there documentation that addresses this?
    Thanks,
    Leo

    Hi Leo,
    There are a couple of options which can help improve the performance of your designer.
1) Under Tools > Options, on the Graphics tab, if you turn off Show Row Counts, then the db will not be queried for the number of rows.
2) Under Tools > Options, General, ensure "Automatic parse upon definition" and "Check universe integrity at opening" are turned off.
It is not clear from your message, but you mentioned SAP as your source. I am not sure if you intended that you were connecting directly to SAP or not.
The number of tables can also have an impact on the performance of the designer tool. You can limit the number of tables by tailoring the table strategies. This means you can limit the query that returns the list of tables to a specific set.
    Hope this helps
    Alan

  • Moving Access table with an autonumber key to SQL Server table with an identity key

    I have an SSIS package that is moving data from an Access 2010 database to a SQL Server 2008 R2 database.  Two of the tables that I am migrating have identity keys in the SQL Server tables and I need to be able to move the autonumber keys to the SQL
    Server tables.  I am executing a SQL Script to set the IDENTITY_INSERT ON before I execute the Data Flow task moving the data and then execute a SQL Script to set the IDENTITY_INSERT OFF after executing the Data Flow task.
    It is failing with an error that says:
    An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 10.0"  Hresult: 0x80040E21  Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was
    done.".
    Error: 0xC020901C at PGAccountContractDetail, PGAccountContractDetail [208]: There was an error with input column "ID" (246) on input "OLE DB Destination Input" (221). The column status returned was: "User does not have permission to
    write to this column.".
    Error: 0xC0209029 at PGAccountContractDetail, PGAccountContractDetail [208]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "input "OLE DB Destination Input" (221)" failed because error code 0xC020907C occurred, and the
    error row disposition on "input "OLE DB Destination Input" (221)" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information
    about the failure.
    Error: 0xC0047022 at PGAccountContractDetail, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "PGAccountContractDetail" (208) failed with error code 0xC0209029 while processing input "OLE DB
    Destination Input" (221). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted
    before this with more information about the failure.
    Any ideas on what is causing this error?  I am thinking it is the identity key in SQL Server that is not allowing the update.  But, I do not understand why if I set IDENTITY_INSERT ON.  
    Thanks in advance for any help/guidance provided.

I suspect it is the security, as specified in the message, e.g. your DBA set the ID columns so that no user can override values in them.
And I suggest you first put the data into a staging table, then push it to the destination. This does not resolve the issue, but it ensures better processing.
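A sketch of that staging approach in T-SQL; because SET IDENTITY_INSERT is session-scoped, running it in the same Execute SQL Task batch as the insert keeps it in effect (the staging table name is hypothetical):
SET IDENTITY_INSERT dbo.PGAccountContractDetail ON;
-- copy rows, including the identity values, from staging
INSERT INTO dbo.PGAccountContractDetail (ID /* , other columns */)
SELECT ID /* , other columns */
FROM dbo.Staging_PGAccountContractDetail;
SET IDENTITY_INSERT dbo.PGAccountContractDetail OFF;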
    Arthur
    MyBlog
    Twitter

  • How to convert from SQL Server table to Flat file (txt file)

I need to ask how to convert from a SQL Server table to a flat file (txt file).

    Hi
1. Import/Export wizard
    2. Bcp utility
    3. SSIS 
    1.Import/Export Wizard
    First and very manual technique is the import wizard.  This is great for ad-hoc and just to slam it in tasks.
    In SSMS right click the database you want to import into.  Scroll to Tasks and select Import Data…
For the data source we want our zips.txt file. Browse for it and select it. You should notice the wizard tries to fill in the blanks for you. One key thing here with this file I picked is there are “ ” qualifiers. So we need to make sure we add “ into the text qualifier field. The wizard will not do this for you.
    Go through the remaining pages to view everything.  No further changes should be needed though
    Hit next after checking the pages out and select your destination.  This in our case will be DBA.dbo.zips.
    Following the destination step, go into the edit mappings section to ensure we look good on the types and counts.
    Hit next and then finish.  Once completed you will see the count of rows transferred and the success or failure rate
    Import wizard completed and you have the data!
    bcp utility
    Method two is bcp with a format file http://msdn.microsoft.com/en-us/library/ms162802.aspx
    This is probably going to win for speed on most occasions but is limited to the formatting of the file being imported.  For this file it actually works well with a small format file to show the contents and mappings to SQL Server.
To create a format file, all we really need is the type and the count of columns for the most basic files. In our case the qualifier makes it a bit difficult, but there is a trick to ignoring it: throw a field into the format file that references the qualifier but is ignored in the import process.
    Given that our format file in this case would appear like this
9.0
9
1 SQLCHAR 0 0 "\"" 0 dummy1 ""
2 SQLCHAR 0 50 "\",\"" 1 Field1 ""
3 SQLCHAR 0 50 "\",\"" 2 Field2 ""
4 SQLCHAR 0 50 "\",\"" 3 Field3 ""
5 SQLCHAR 0 50 "\"," 4 Field4 ""
6 SQLCHAR 0 50 "," 5 Field5 ""
7 SQLCHAR 0 50 "," 6 Field6 ""
8 SQLCHAR 0 50 "," 7 Field7 ""
9 SQLCHAR 0 50 "\n" 8 Field8 ""
    The bcp call would be as follows
C:\Program Files\Microsoft SQL Server\90\Tools\Binn>bcp DBA..zips in "C:\zips.txt" -f "c:\zip_format_file.txt" -S LKFW0133 -T
    Given a successful run you should see this in command prompt after executing the statement
    Starting copy...
    1000 rows sent to SQL Server. Total sent: 1000
1000 rows sent to SQL Server. Total sent: 2000
...
    1000 rows sent to SQL Server. Total sent: 29000
    bcp import completed!
    BULK INSERT
    Next, we have BULK INSERT given the same format file from bcp
CREATE TABLE zips (
Col1 nvarchar(50),
Col2 nvarchar(50),
Col3 nvarchar(50),
Col4 nvarchar(50),
Col5 nvarchar(50),
Col6 nvarchar(50),
Col7 nvarchar(50),
Col8 nvarchar(50)
)
GO
INSERT INTO zips
SELECT *
FROM OPENROWSET(BULK 'C:\Documents and Settings\tkrueger\My Documents\blog\cenzus\zipcodes\zips.txt',
FORMATFILE='C:\Documents and Settings\tkrueger\My Documents\blog\zip_format_file.txt'
) as t1;
GO
    That was simple enough given the work on the format file that we already did.  Bulk insert isn’t as fast as bcp but gives you some freedom from within TSQL and SSMS to add functionality to the import.
    SSIS
    Next is my favorite playground in SSIS
We can use many methods in SSIS to get data from point A to point B. I'll show you the Data Flow Task and the SSIS version of BULK INSERT.
First, create a new Integration Services project.
Create a new flat file connection by right-clicking the connection managers area. This will be used in both methods.
    Bulk insert
You can use the format file here as well, which is beneficial for moving methods around. This essentially calls the same processes, with format file usage. Drag over a Bulk Insert Task and double-click it to go into the editor.
    Fill in the information starting with connection.  This will populate much as the wizard did.
    Example of format file usage
    Or specify your own details
    Execute this and again, we have some data
    Data Flow method
    Bring over a data flow task and double click it to go into the data flow tab.
    Bring over a flat file source and SQL Server destination.  Edit the flat file source to use the connection manager “The file” we already created.  Connect the two once they are there
    Double click the SQL Server Destination task to open the editor.  Enter in the connection manager information and select the table to import into.
Go into the mappings and connect the dots, so to speak.
A typical type-conversion issue is Unicode to non-Unicode.
We fix this with a Data Conversion task or an explicit conversion in the editor. Data Conversion tasks are usually the route I take. Drag over a Data Conversion task and place it between the connection from the flat file source to the SQL Server destination.
    New look in the mappings
    And after execution…
    SqlBulkCopy Method
Since we're in the SSIS package, we can use that awesome Script Task to show SqlBulkCopy. Not only fast but also handy for those really “unique” file formats we receive so often.
Bring over a Script Task into the control flow.
Double-click the task and go to the Script page. Click Design Script to open up the code-behind.
Ahsan Kabir (http://www.aktechforum.blogspot.com/)

  • Error when insert data in Sql Server table(DateTime data type)

    Hello all,
I have created a database link in Oracle 11g to SQL Server 2008 using the SQL Server gateway for Oracle. Oracle runs on Linux and SQL Server runs on a Windows platform.
I have queried a table and it fetches rows from the target table.
I am using this syntax to insert a row into the SQL Server table.
    Insert into Prod@sqlserver (NUMITEMCODE, NUMPREOPENSTOCK, NUMQNTY, NUMNEWOPENSTOCK, DATPRODDATE , TXTCOMPANYCODE, "bolstatus", NUMRESQNTY )
    Values (1118 , 1390.0 , 100.0 ,1490 , '2012-06-23 12:37:58.000','SFP' ,0 , 0 );
but it gives me an error on DATPRODDATE. The data type of the DATPRODDATE column in SQL Server is DATETIME.
My question is how I can pass the date values in an INSERT statement for the SQL Server DATETIME data type.
    Regards

    Just as with Oracle, you have to specify the date using the to_date() function or use the native date format for the target database (if you can figure out what that is). This is good practice anyway and a good habit to get into.
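Applied to the statement above, the date literal becomes an explicit to_date() call, for example:
Insert into Prod@sqlserver (NUMITEMCODE, NUMPREOPENSTOCK, NUMQNTY, NUMNEWOPENSTOCK, DATPRODDATE, TXTCOMPANYCODE, "bolstatus", NUMRESQNTY)
Values (1118, 1390.0, 100.0, 1490, to_date('2012-06-23 12:37:58', 'YYYY-MM-DD HH24:MI:SS'), 'SFP', 0, 0);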

  • ORA-02070: Error when updating a SQL Server table thru an Oracle View

    I have a SQL Server table TIMESHEET which contains a number of VARCHAR and NUMERIC columns plus a DATETIME column.
    Only the DATETIME column is giving me trouble.
    On the ORACLE side I have a view which selects from the SQL Server table but in order to get the SELECT to work, I had to either put a CAST or TO_DATE function call around the DATETIME field
    Below is the relevant part of the 2 view definitions I have tried
    create view TIMESHEET as
    SELECT
    "TsKeySeq" as TS_KEY_SEQ,
    "EmployeeNo" as EMPLOYEE_NO,
    CAST("PeriodEnding" AS DATE) as PERIOD_ENDING,
    . . . (more columns - not relevant)
FROM TIMESHEET@OLEMSQLPSANTDAS6;
    An update to the view generates this message
    ORA-02070: database OLEMSQLPSANTDAS6 does not support CAST in this context
    create view TIMESHEET as
    SELECT
    "TsKeySeq" as TS_KEY_SEQ,
    "EmployeeNo" as EMPLOYEE_NO,
    TO_DATE("PeriodEnding") as PERIOD_ENDING,
    . . . (more columns - not relevant)
FROM TIMESHEET@OLEMSQLPSANTDAS6;
    An update to the view generates this message
    ORA-02070: database OLEMSQLPSANTDAS6 does not support TO_DATE in this context
    If I don't include either the TO_DATE() or CAST() then I get
    Select Error: ORA-28527: Heterogeneous Services datatype mapping error
    ORA-02063:preceding line from OLEMSQLSANTDAS6
    Does anyone have any idea how to update a SQL Server DATETIME column thru an ORACLE view?

You can't cast across heterogeneous databases, and there is no need to. HSODBC treats a SQL Server DATETIME column as DATE. For example, I have this SQL Server table:
    CREATE TABLE [Ops].[T_JobType](
         [JobType] [varchar](50) NOT NULL,
         [JobDesc] [varchar](200) NULL,
         [InsertDt] [datetime] NOT NULL CONSTRAINT [InsertDt_00000006]  DEFAULT (getdate()),
         [InsertBy] [varchar](128) NOT NULL CONSTRAINT [InsertBy_00000006]  DEFAULT (user_name()),
         [LastUpdated] [datetime] NOT NULL CONSTRAINT [LastUpdated_00000006]  DEFAULT (getdate()),
         [LastUpdatedBy] [varchar](128) NOT NULL CONSTRAINT [LastUpdatedBy_00000006]  DEFAULT (user_name()),
CONSTRAINT [T_JobType_PK] PRIMARY KEY CLUSTERED
(
     [JobType] ASC
) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 100) ON [DATA01FG]
) ON [DATA01FG]
Now on the Oracle side I do:
    SQL> desc "Ops"."T_JobType"@pbods
    Name                                      Null?    Type
    JobType                                   NOT NULL VARCHAR2(50)
    JobDesc                                            VARCHAR2(200)
    InsertDt                                  NOT NULL DATE
    InsertBy                                  NOT NULL VARCHAR2(128)
    LastUpdated                               NOT NULL DATE
    LastUpdatedBy                             NOT NULL VARCHAR2(128)
    SQL> select "InsertDt" from "Ops"."T_JobType"@pbods;
    InsertDt
    18-AUG-08
    09-OCT-08
    22-OCT-09
    18-AUG-08
    19-NOV-08
SQL>
SY.
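Following the same logic, an update through the link can assign a plain Oracle DATE directly, with no CAST or TO_DATE wrapper; a sketch against the poster's table (the key value is hypothetical):
UPDATE TIMESHEET@OLEMSQLPSANTDAS6
SET "PeriodEnding" = SYSDATE
WHERE "TsKeySeq" = 1;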

  • Create material Master by connecting SAP Tables to SQL server tables

We would like to use data in SQL Server tables to upload the material master into ECC. We have a number of rules embedded in the SQL Server tables and would like the LSMW program to pick up records directly from the SQL table, without passing through a flat file format. Can anyone guide us on how to accomplish this? We are on SAP AFS ECC 5.0. Thanks in advance.

Does anyone have any ideas on this...?
