How to reduce size of SQL server table?

Hi experts,
  I have a table where 99.9% of the size (9 GB) is unused space. How do I reduce it?
I have tried the commands below, but they did not work.
ALTER INDEX [VBDATA~0] ON qa2.VBDATA REBUILD;

DBCC CLEANTABLE (QA2, "qa2.VBDATA", 0)
WITH NO_INFOMSGS;
GO

ALTER INDEX ALL ON qa2.VBDATA REBUILD;

Hi deepakkori,
  Thanks for your help. I have found the solution:
"Also, another option which you could try is:
      ALTER INDEX Index_Name ON Table_Name REORGANIZE WITH (LOB_COMPACTION = ON)"
You can check the full discussion here:
http://social.msdn.microsoft.com/Forums/en-US/sqldatabaseengine/thread/68af32bb-eefa-494a-b62d-1ebd1387d105
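For reference, here it is applied to the table from the question (a minimal sketch; LOB compaction frees pages inside the data file, so handing disk space back to the OS may still require a shrink afterwards):

    ALTER INDEX ALL ON qa2.VBDATA REORGANIZE WITH (LOB_COMPACTION = ON);
    -- Optional, and only if the freed space must be returned to the OS;
    -- the logical file name QA2_Data is an assumption:
    -- DBCC SHRINKFILE (QA2_Data);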

Similar Messages

  • How to reduce size of a huge table in Access

We have an Access mdb file that is getting too large. It is about 1.7 GB and will probably hit 2.0 GB in two or three months. It has already been compacted. There are only two tables, and 99% of the size comes from just one of them, which has about 8.4 million records. What are some strategies I could use to reduce the file size? I've already done the database splitting strategy; this is one of three back-end files in the split set. Some users might have Access 2003, so a solution unique to Access 2007/2010 may not be viable.

Here it is. Your question prompted me to change the datatype on Account and Fiscal Year, which were previously Text. That reduced the size by 0.1 GB, but I definitely need further improvement. The fields and their types are:
Field                       Data Type
-----------------------     -------------
Company Code                Text
Cost Center                 Text
Account                     Long Integer
Item Number                 Text
Item Description            Text
Charge Code                 Text
Qty                         Double
Unit of Measure             Text
Cost Per Unit of Measure    Double
Total Expense               Double
PO Number                   Text
Vendor Name                 Text
Vendor Number               Text
Source Type                 Text
Fiscal Year                 Integer
Fiscal Period               Long Integer
Journal Type                Text
Transaction Date            Date/Time
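A sketch of the datatype change described above, in Access DDL (the table name ExpenseDetail is an assumption; in Access SQL, INTEGER maps to Long Integer and SMALLINT to Integer):

    ALTER TABLE ExpenseDetail ALTER COLUMN [Account] INTEGER;
    ALTER TABLE ExpenseDetail ALTER COLUMN [Fiscal Year] SMALLINT;

Note the file does not shrink until the next Compact & Repair.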

  • How to move from one SQL Server table to another

    Suppose I have two tables:
    Employee1:
    EmpName varchar(100)
    EmpAddress varchar(200)
    And
    Employee2:
    EmpID int,
    EmpName varchar(100),
    EmpAddress varchar(200)
    Note: EmpID is not an identity field
How do I move all the data from the Employee1 table to the Employee2 table, with EmpID values of 1, 2, 3, ... and so on?
Does anyone have an idea?

INSERT INTO [Employee2] (EmpID, EmpName, EmpAddress)
SELECT ROW_NUMBER() OVER (ORDER BY EmpName), EmpName, EmpAddress
FROM Employee1;
should work
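If Employee2 might already contain rows (an assumption beyond what the question states), the same pattern can seed the numbering from the current maximum:

    INSERT INTO [Employee2] (EmpID, EmpName, EmpAddress)
    SELECT COALESCE((SELECT MAX(EmpID) FROM Employee2), 0)
           + ROW_NUMBER() OVER (ORDER BY EmpName),
           EmpName, EmpAddress
    FROM Employee1;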

  • How to select data from a SQL Server 2005 database table into an Oracle database table

    Hi,
I have a table text1 in a SQL Server database and text2 in an Oracle database (11g). How do I move data from the SQL Server table into the Oracle table? Please help me with how to do it.
    Thanks a lot in advance.
    rk
    OS: Windows 7 professional

    Hi,
You can use the Export/Import Wizard and specify SQL Server as the source and Oracle as the destination.
    I hope this is helpful.
    Elmozamil Elamir Hamid
    MCSE Data Platform
    MCITP: SQL Server 2008 Administration/Development
    MCSA SQL Server 2012
    MCTS: SQL Server Administration/Development
    MyBlog

  • How to calculate the HFM Cube size in SQL Server 2005

    Hi
How do I calculate the HFM Cube size in SQL Server 2005?
The query below is used for Oracle. What is the equivalent query for SQL Server?
SQL> select sum(bytes/1024/1024) from dba_segments where segment_name like 'FINANCIAL_%' and owner='HFM';
SQL> select sum(bytes/1024/1024) from dba_segments where segment_name like 'HSV FINANCIAL%' and owner='HFM';
    Regards
    Smilee

    What is your objective? The subcube in HFM is a concept which applies to the application tier - not so much to the database tier. The size of the subcube is the unique number of data strips (data values for January - December inclusive, for example) for the given entity, currency triplet or Parent.Child node. You have to account for parent accounts and customs which don't exist in the database but are generated in RAM in the application tier.
So, if your objective is to find the largest subcubes, you could do this by querying the database and counting the number of records per entity/value combination (DCE tables) or parent.child entity combination (DCN tables). I'm not versed in SQL, but I think the script you posted would just tell you the schema size and not the subcube sizes.
    Check out Accelatis.com for a third party software product that can do this for you. The feature is called the Subcube Analyzer and was written by the same team that wrote HFM, so they ought to know how this works :-)
--chris
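For reference, a rough SQL Server counterpart to the Oracle dba_segments query above (a sketch; like that query it measures schema size, not subcube size, and it assumes the HFM tables keep the FINANCIAL% naming convention):

    SELECT SUM(ps.reserved_page_count) * 8 / 1024.0 AS size_mb
    FROM sys.dm_db_partition_stats ps
    JOIN sys.objects o ON o.object_id = ps.object_id
    WHERE o.name LIKE 'FINANCIAL%';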

  • How can I load a .xlsx File into a SQL Server Table using a Foreach Loop Container in SSIS?

I know I've REALLY struggled with this before. I just don't understand why this has to be soooooo difficult.
I can very easily do a straight data pump of a .xlsx file into a SQL Server table using a normal Excel Connection and a normal Excel Source, simply converting Unicode to DT_STR and then using an OLE DB Destination for the SQL Server table.
If I want to make the SSIS package a little more flexible by allowing multiple .xlsx spreadsheets to be pumped in by using a Foreach Loop Container, the whole SSIS package seems to go to hell in a handbasket. I simply do the following...
    Put the Data Flow Task within the Foreach Loop Container
Add the Variable Mapping for User::FilePath, which I defined as a string variable, within the Foreach Loop Container
    I change the Excel Connection and its Expression to be ExcelFilePath ==> @[User::FilePath]
    I then try and change the Excel Source and its Data Access Mode to Table Name or view name variable and provide the Variable Name User::FilePath
    And that's when I run into trouble...
    Exception from HRESULT: 0xC02020E8
Error at Data Flow Task [Excel Source [56]]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
    Error at Data Flow Task [Excel Source [56]]: Opening a rowset for "...(the EXACT Path and .xlsx File Name)...". Check that the object exists in the database. (And I know it's there!!!)
I don't understand why adding a Foreach Loop Container to try and make this as efficient as possible has caused such an error, unless I'm overlooking something. I have even tried delaying my validations and that doesn't seem to help.
    I have looked hard in Google and even YouTube to try and find a solution for this but for the life of me I cannot seem to find anything on pumping a .xlsX file into SQL Server using a Foreach Loop Container.
Can ANYONE please help me out here? I'm at the end of my rope trying to get this to work. I think the last time I was in this quandary, trying to pump a .xlsx file into a SQL Server table using a Foreach Loop Container in SSIS, I actually wrote a C# script to write the contents of the .xlsx file into a .csv file and then used the .csv file to pump the data into a SQL Server table.
    Thanks for your review and am hoping and praying for a reply and solution.

    Hi ITBobbyP,
    If I understand correctly, you want to load data from multiple sheets in an .xlsx file into a SQL Server table.
If that is the scenario, please refer to the following tips:
    The Foreach Loop container should be configured as shown below:
    Enumerator: Foreach ADO.NET Schema Rowset Enumerator
    Connection String: The OLE DB Connection String for the excel file.
    Schema: Tables.
    In the Variable Mapping, map the variable to Sheet_Name, and change the Index from 0 to 2.
The connection string for the Excel Connection Manager is the original one; we needn't make any change.
    Change Table Name or View name to the variable Sheet_Name.
    If you want to load data from multiple sheets in multiple .xlsx files into a SQL Server table, please refer to following thread:
    http://stackoverflow.com/questions/7411741/how-to-loop-through-excel-files-and-load-them-into-a-database-using-ssis-package
    Thanks,
Katherine Xiong
    TechNet Community Support
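For the multiple-files (rather than multiple-sheets) variant the asker describes, the setup usually hinges on two properties (a sketch; the sheet name Sheet1$ is an assumption): keep the file path in the connection manager's expression, and keep the source pointed at the sheet rather than at the path variable:

    Excel Connection Manager -> Properties -> Expressions:
        ExcelFilePath = @[User::FilePath]
    Excel Source -> Data access mode: Table or view -> [Sheet1$]

Passing the file path variable as the "table name or view name variable" makes the provider look for a rowset named after the path, which matches the "Opening a rowset for ..." error quoted in the question.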

  • In a SQL query which has a join, how to reduce multiple instances of a table

In a SQL query which has a join, how can I reduce multiple instances of a table?
Here is an example. I am using Oracle 9i.
Is there a way to reduce the number of person instances in the following query, or can I optimize this query further?
    TABLES:
    mail_table
    mail_id, from_person_id, to_person_id, cc_person_id, subject, body
    person_table
    person_id, name, email
    QUERY:
SELECT p_from.name from_name, p_to.name to_name, p_cc.name cc_name, subject
FROM mail, person p_from, person p_to, person p_cc
WHERE from_person_id = p_from.person_id
AND to_person_id = p_to.person_id
AND cc_person_id = p_cc.person_id
Thanks in advance,
    Babu.

    SQL> select * from mail;
            ID          F          T         CC
             1          1          2          3
    SQL> select * from person;
           PID NAME
             1 a
             2 b
             3 c
--Query with only one instance of the PERSON table
    SQL> select m.id,max(decode(m.f,p.pid,p.name)) frm_name,
      2         max(decode(m.t,p.pid,p.name)) to_name,
      3         max(decode(m.cc,p.pid,p.name)) cc_name
      4  from mail m,person p
      5  where m.f = p.pid
      6  or m.t = p.pid
      7  or m.cc = p.pid
      8  group by m.id;
            ID FRM_NAME   TO_NAME    CC_NAME
             1 a          b          c
--Explain plan for the "one instance" query
    SQL> explain plan for
      2  select m.id,max(decode(m.f,p.pid,p.name)) frm_name,
      3         max(decode(m.t,p.pid,p.name)) to_name,
      4         max(decode(m.cc,p.pid,p.name)) cc_name
      5  from mail m,person p
      6  where m.f = p.pid
      7  or m.t = p.pid
      8  or m.cc = p.pid
      9  group by m.id;
    Explained.
    SQL> select * from table(dbms_xplan.display);
    PLAN_TABLE_OUTPUT
    Plan hash value: 902563036
    | Id  | Operation           | Name   | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT    |        |     3 |   216 |     7  (15)| 00:00:01 |
    |   1 |  HASH GROUP BY      |        |     3 |   216 |     7  (15)| 00:00:01 |
    |   2 |   NESTED LOOPS      |        |     3 |   216 |     6   (0)| 00:00:01 |
    |   3 |    TABLE ACCESS FULL| MAIL   |     1 |    52 |     3   (0)| 00:00:01 |
    |*  4 |    TABLE ACCESS FULL| PERSON |     3 |    60 |     3   (0)| 00:00:01 |
    PLAN_TABLE_OUTPUT
    Predicate Information (identified by operation id):
       4 - filter("M"."F"="P"."PID" OR "M"."T"="P"."PID" OR
                  "M"."CC"="P"."PID")
    Note
       - dynamic sampling used for this statement
    --Explain plan for "Normal" query
    SQL> explain plan for
      2  select m.id,pf.name fname,pt.name tname,pcc.name ccname
      3  from mail m,person pf,person pt,person pcc
      4  where m.f = pf.pid
      5  and m.t = pt.pid
      6  and m.cc = pcc.pid;
    Explained.
    SQL> select * from table(dbms_xplan.display);
    PLAN_TABLE_OUTPUT
    Plan hash value: 4145845855
    | Id  | Operation            | Name   | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT     |        |     1 |   112 |    14  (15)| 00:00:01 |
    |*  1 |  HASH JOIN           |        |     1 |   112 |    14  (15)| 00:00:01 |
    |*  2 |   HASH JOIN          |        |     1 |    92 |    10  (10)| 00:00:01 |
    |*  3 |    HASH JOIN         |        |     1 |    72 |     7  (15)| 00:00:01 |
    |   4 |     TABLE ACCESS FULL| MAIL   |     1 |    52 |     3   (0)| 00:00:01 |
    |   5 |     TABLE ACCESS FULL| PERSON |     3 |    60 |     3   (0)| 00:00:01 |
    PLAN_TABLE_OUTPUT
    |   6 |    TABLE ACCESS FULL | PERSON |     3 |    60 |     3   (0)| 00:00:01 |
    |   7 |   TABLE ACCESS FULL  | PERSON |     3 |    60 |     3   (0)| 00:00:01 |
    Predicate Information (identified by operation id):
       1 - access("M"."CC"="PCC"."PID")
       2 - access("M"."T"="PT"."PID")
       3 - access("M"."F"="PF"."PID")
    PLAN_TABLE_OUTPUT
    Note
       - dynamic sampling used for this statement
    25 rows selected.
No indexes created...

  • How to load a comma-separated text file which contains addresses into a SQL Server table using an SSIS package

    Hi,
I want to load a file which is comma-separated and contains addresses. The problem is that the address itself is comma-separated, so how do I differentiate whether a comma is used as a column separator or is part of the address?
For example, one person has an address like:
"c/o AB corp,156 cross lane,USA"
    Thanks.....

    Hi SR_MCTS,
Based on your description, you want to distinguish whether a comma is used as a column separator or is part of the address column in a text file, and then load the data from the text file into a SQL Server table.
As per my understanding, if you can replace the comma column separator with another delimiter like a semicolon (;), just do it. Then we can select Semicolon {;} as the column delimiter for the Flat File Connection Manager. Alternatively, ensure all columns are enclosed in double quotes ("). Then we can set the double quote (") as the text qualifier for the Flat File Connection Manager, and the commas will be loaded as part of the string fields.
If you can't have that done, then because computers don't know the context of the data, you would have to come up with some kind of rules that decide when a comma represents a delimiter and when it is just part of the text. I think a custom script component would be necessary to pre-process the data, identify where a comma is part of an address, and then treat that as one field.
    Thanks,
Katherine Xiong
    TechNet Community Support
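For reference, when the address fields are already wrapped in double quotes as in the example above, newer SQL Server versions can parse them natively (a sketch, assuming SQL Server 2017+ and hypothetical table and path names):

    BULK INSERT dbo.Addresses
    FROM 'C:\data\addresses.csv'
    WITH (FORMAT = 'CSV', FIELDQUOTE = '"');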

  • How to provide joins between oracle tables and sql server tables

    Hi,
I have a requirement where I need to generate a report from two different databases, i.e. Oracle and SQL Server.
How do I provide joins between Oracle tables and SQL Server tables? Any help on this?
    Regards,
    Malli

user10675696 wrote:
I have a requirement that i need to generate a report form two different data base. i.e Oracle and Sql Server.
Bad idea most times. Heterogeneous joins do not exactly scale, and performance can be severely degraded by network speed and bandwidth availability. And there is nothing you can do in the application and database layers to address a performance issue at the network level in this case - your code's performance is simply at the mercy of network performance. With a single glaring fact: network performance is continually degrading. All the time. Always. Until it is upgraded. When the performance degradation starts all over again.
Unless the tables are small (a few 1000 rows each) and row volumes are static, I would not consider doing a heterogeneous join. Instead I would rather go for a materialised view on the Oracle side, use a proper table and index structure, and do a local database join.
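A sketch of that materialised view approach on the Oracle side (the database link name mssql_link and the table names are assumptions; the link itself would go through a heterogeneous gateway such as Database Gateway for ODBC):

    CREATE MATERIALIZED VIEW sales_mv
      BUILD IMMEDIATE
      REFRESH COMPLETE START WITH SYSDATE NEXT SYSDATE + 1/24
    AS SELECT * FROM sales@mssql_link;

    -- Reports then join locally, against a plain Oracle segment:
    SELECT o.order_id, s.amount
    FROM orders o JOIN sales_mv s ON s.order_id = o.order_id;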

  • How to convert from SQL Server table to Flat file (txt file)

I need to ask how to convert from a SQL Server table to a flat file (txt file).

    Hi
1. Import/Export Wizard
2. bcp utility
3. SSIS
    1.Import/Export Wizard
First and very manual technique is the Import Wizard. This is great for ad-hoc and just-slam-it-in tasks.
In SSMS right click the database you want to import into. Scroll to Tasks and select Import Data…
For the data source we want our zips.txt file. Browse for it and select it. You should notice the wizard tries to fill in the blanks for you. One key thing here with this file I picked is that there are quote (") qualifiers, so we need to make sure we add " into the text qualifier field. The wizard will not do this for you.
Go through the remaining pages to view everything. No further changes should be needed though.
Hit next after checking the pages out and select your destination. This in our case will be DBA.dbo.zips.
Following the destination step, go into the edit mappings section to ensure we look good on the types and counts.
Hit next and then finish. Once completed you will see the count of rows transferred and the success or failure rate.
Import wizard completed and you have the data!
    bcp utility
    Method two is bcp with a format file http://msdn.microsoft.com/en-us/library/ms162802.aspx
    This is probably going to win for speed on most occasions but is limited to the formatting of the file being imported.  For this file it actually works well with a small format file to show the contents and mappings to SQL Server.
    To create a format file all we really need is the type and the count of columns for the most basic files.  In our case the qualifier makes it a bit difficult but there is a trick to ignoring them.  The trick is to basically throw a field into the
    format file that will reference it but basically ignore it in the import process.
    Given that our format file in this case would appear like this
9.0
9
1 SQLCHAR 0 0  "\""    0 dummy1 ""
2 SQLCHAR 0 50 "\",\"" 1 Field1 ""
3 SQLCHAR 0 50 "\",\"" 2 Field2 ""
4 SQLCHAR 0 50 "\",\"" 3 Field3 ""
5 SQLCHAR 0 50 "\","   4 Field4 ""
6 SQLCHAR 0 50 ","     5 Field5 ""
7 SQLCHAR 0 50 ","     6 Field6 ""
8 SQLCHAR 0 50 ","     7 Field7 ""
9 SQLCHAR 0 50 "\n"    8 Field8 ""
    The bcp call would be as follows
C:\Program Files\Microsoft SQL Server\90\Tools\Binn>bcp DBA..zips in "C:\zips.txt" -f "C:\zip_format_file.txt" -S LKFW0133 -T
    Given a successful run you should see this in command prompt after executing the statement
    Starting copy...
    1000 rows sent to SQL Server. Total sent: 1000
    1000 rows sent to SQL Server. Total sent: 2000
    1000 rows sent to SQL Server. Total sent: 3000
    1000 rows sent to SQL Server. Total sent: 4000
    1000 rows sent to SQL Server. Total sent: 5000
    1000 rows sent to SQL Server. Total sent: 6000
    1000 rows sent to SQL Server. Total sent: 7000
    1000 rows sent to SQL Server. Total sent: 8000
    1000 rows sent to SQL Server. Total sent: 9000
    1000 rows sent to SQL Server. Total sent: 10000
    1000 rows sent to SQL Server. Total sent: 11000
    1000 rows sent to SQL Server. Total sent: 12000
    1000 rows sent to SQL Server. Total sent: 13000
    1000 rows sent to SQL Server. Total sent: 14000
    1000 rows sent to SQL Server. Total sent: 15000
    1000 rows sent to SQL Server. Total sent: 16000
    1000 rows sent to SQL Server. Total sent: 17000
    1000 rows sent to SQL Server. Total sent: 18000
    1000 rows sent to SQL Server. Total sent: 19000
    1000 rows sent to SQL Server. Total sent: 20000
    1000 rows sent to SQL Server. Total sent: 21000
    1000 rows sent to SQL Server. Total sent: 22000
    1000 rows sent to SQL Server. Total sent: 23000
    1000 rows sent to SQL Server. Total sent: 24000
    1000 rows sent to SQL Server. Total sent: 25000
    1000 rows sent to SQL Server. Total sent: 26000
    1000 rows sent to SQL Server. Total sent: 27000
    1000 rows sent to SQL Server. Total sent: 28000
    1000 rows sent to SQL Server. Total sent: 29000
    bcp import completed!
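Note the walkthrough above covers the import direction; for the export the original question asks about, the same bcp utility works with the out keyword (a sketch; the output path is an assumption, the server name follows the example above):

    bcp DBA.dbo.zips out "C:\zips_export.txt" -c -t, -S LKFW0133 -T

Here -c exports in character format, -t, sets a comma field terminator, and -T uses a trusted connection.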
    BULK INSERT
    Next, we have BULK INSERT given the same format file from bcp
CREATE TABLE zips (
Col1 nvarchar(50),
Col2 nvarchar(50),
Col3 nvarchar(50),
Col4 nvarchar(50),
Col5 nvarchar(50),
Col6 nvarchar(50),
Col7 nvarchar(50),
Col8 nvarchar(50)
);
GO
INSERT INTO zips
SELECT *
FROM OPENROWSET(BULK 'C:\Documents and Settings\tkrueger\My Documents\blog\cenzus\zipcodes\zips.txt',
FORMATFILE='C:\Documents and Settings\tkrueger\My Documents\blog\zip_format_file.txt'
) as t1;
GO
    That was simple enough given the work on the format file that we already did.  Bulk insert isn’t as fast as bcp but gives you some freedom from within TSQL and SSMS to add functionality to the import.
    SSIS
    Next is my favorite playground in SSIS
    We can do many methods in SSIS to get data from point A, to point B.  I’ll show you data flow task and the SSIS version of BULK INSERT
    First create a new integrated services project.
    Create a new flat file connection by right clicking the connection managers area.  This will be used in both methods
    Bulk insert
    You can use format file here as well which is beneficial to moving methods around.  This essentially is calling the same processes with format file usage.  Drag over a bulk insert task and double click it to go into the editor.
    Fill in the information starting with connection.  This will populate much as the wizard did.
    Example of format file usage
    Or specify your own details
    Execute this and again, we have some data
    Data Flow method
    Bring over a data flow task and double click it to go into the data flow tab.
    Bring over a flat file source and SQL Server destination.  Edit the flat file source to use the connection manager “The file” we already created.  Connect the two once they are there
    Double click the SQL Server Destination task to open the editor.  Enter in the connection manager information and select the table to import into.
Go into the mappings and connect the dots, so to speak.
    Typical issue of type conversions is Unicode to non-unicode.
We fix this with a data conversion or explicit conversion in the editor. Data conversion tasks are usually the route I take. Drag over a data conversion task and place it between the connection from the flat file source to the SQL Server destination.
    New look in the mappings
    And after execution…
    SqlBulkCopy Method
Since we're in the SSIS package, we can use that awesome "script task" to show SqlBulkCopy. Not only fast but also handy for those really "unique" file formats we receive so often.
    Bring over a script task into the control flow
    Double click the task and go to the script page.  Click the Design script to open up the code behind
    Ref.
Ahsan Kabir - http://www.aktechforum.blogspot.com/

  • How to delete rows in the SQL Server table based on the contents of Excel file

    Hello, everyone,
I have an Excel file which contains data for certain dates. I need to load it into a SQL Server table. But before doing that, I need to check if the table already contains data for the dates in the Excel file. If it does, I need to delete that data first. I am not sure what the best and most efficient way to do this is. Your help and guidance would be much appreciated.
    Thank you in advance.

There are multiple ways of doing this. The fastest method would be as below:
1. Have a data flow task using an Excel source, then add an OLE DB destination to dump the data to a staging table.
2. Have an Execute SQL Task to delete data from your actual table based on the staging table data. The query would look like:
DELETE t
FROM YourTable t
WHERE EXISTS (SELECT 1
              FROM StagingTable
              WHERE DateField = t.DateField)
3. Have another Execute SQL Task to do the final insert, like:
INSERT INTO YourTable (Col1,Col2,...)
SELECT Col1,Col2,..
FROM StagingTable
The above operations will be set-based and faster.
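If the delete and insert must succeed or fail together, the two Execute SQL Task statements can also be wrapped in one transaction (a sketch; the table and column names follow the hypothetical ones above):

    BEGIN TRAN;
    DELETE t
    FROM YourTable t
    WHERE EXISTS (SELECT 1 FROM StagingTable s WHERE s.DateField = t.DateField);
    INSERT INTO YourTable (Col1, Col2)
    SELECT Col1, Col2 FROM StagingTable;
    COMMIT;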
Visakh - http://visakhm.blogspot.com/ - https://www.facebook.com/VmBlogs

  • How to insert into a SQL Server table from Oracle Forms

I created a form with Oracle as my database. But there is one trigger where I need to insert the data into a SQL Server table.
Is this possible? If so, can anyone help me out?
    Thanks in advance.
    Asha

    Hi,
You can insert into a SQL Server database using the following steps.
Note: Check whether you are using Forms 32-bit drivers. If not, the ODBC data source will not work.
Step 1: Create an ODBC data source for SQL Server (one-time creation).
Step 2: Log out from Oracle and log in to SQL Server, giving the user name, password, and host string as odbc:<odbc datasource name>.
Step 3: Use an EXEC SQL statement to insert the values into SQL Server, then log out and log in again to your Oracle database.
Second method:
Check the SQL Server documentation for inserting values using command-line parameters. Then you can call the HOST command to execute this.
Third method:
Write a VB exe to enter the values into SQL Server, making two connections, one to Oracle and another to SQL Server, then getting the values from Oracle and putting them into the SQL Server database. You can call this exe using the HOST command.
    Hope this will help You.
    Regards
    Gaurav Thakur

  • How to select data from the 3rd row of Excel to insert into a SQL Server table using SSIS

    Hi,
I have Excel files with headers in the first two rows. I want to skip those two rows and select data from the 3rd row to insert into a SQL Server table using SSIS. The 3rd row contains the column names.

The sheet looks like this (rows 1 and 2 contain merged heading cells; row 3 holds the column names):

    CUSTOMER DETAILS                                                  <- row 1, merged cells
    REGION                                                            <- row 2, merged cells
    COL1  COL2  COL3  COL4  COL5  COL6  COL7  COL8  COL9  COL10  COL11
    1     XXX   yyyy  zzzz
    2     XXX   yyyy  zzzzz
    3     XXX   yyyy  zzzzz
    4     XXX   yyyy  zzzzz
Set the range within the Excel command, as in the sketch below. See:
    http://www.joellipman.com/articles/microsoft/sql-server/ssis/646-ssis-skip-rows-in-excel-source-file.html
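A sketch of that range trick (the sheet name and end row are assumptions): set the Excel Source's data access mode to "SQL command" so the read starts at row 3, which makes that row supply the column names:

    SELECT * FROM [Sheet1$A3:K65536]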
Visakh

  • SQL Server table capture using Oracle SQL Developer

    Hello,
I need to know how to set up Oracle SQL Developer to capture a SQL Server table with the correct field precision. For some reason, all NVARCHAR field precision sizes are double the original precision size after capturing the table.
    Example:
    SQL Server table 'A' with the following field:
    name NVARCHAR( 50)
    after table capture becomes:
    name NVARCHAR2(100)
which is double the original precision.
    Any feed back or input is greatly appreciated.
    Thank you.

I'm sorry,
--corrected sql
merge into md_columns m
using (
  select md_col.precision as prec, md_col.column_name, md_col.rowid as row_id
  from md_schemas md_s
  join md_tables md_tab on md_tab.schema_id_fk = md_s.id
  join md_catalogs md_cat on md_cat.id = md_s.catalog_id_fk
  inner join md_connections con on md_cat.connection_id_fk = con.id
  join md_columns md_col on md_col.table_id_fk = md_tab.id
  where --md_s.created_on = (select max(created_on) from md_catalogs)
  con.name like '%(corr%)'
  and md_col.column_type in ('nchar','nvarchar')
) merge_clause on (m.rowid = merge_clause.row_id)
when matched then update set m.precision = merge_clause.prec/2
That works, as long as you don't forget to commit like I did....
Hope it helps.

  • Data Pump .xlsx into a SQL Server Table and the whole 32-Bit, 64-Bit discussion

    First of all...I have a headache!
    Found LOTS of Google hits when trying to data pump a .xlsx File into a SQL Server Table. And the whole discussion of the Microsoft ACE 64-Bit Driver or the Microsoft Jet 32-Bit Driver.
    Specifically receiving this error...
    An OLE DB record is available.  Source: "Microsoft Office Access Database Engine"  Hresult: 0x80004005  Description: "External table is not in the expected format.".
    Error: 0xC020801C at Data Flow Task to Load Alere Coaching Enrolled, Excel Source [56]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.  The AcquireConnection method call to the connection manager "Excel Connection Manager"
    failed with error code 0xC0202009.
Strangely enough, if I simply data pump ONE .xlsx file into a SQL Server table utilizing my SSIS package, it seems to work fine. If instead I am trying to be pro-active and allowing for multiple .xlsx files by using a Foreach Loop Container and a variable @[User::FileName], it's erroring out...but not really, because it is indeed storing the rows in the SQL Server table. I did check all my Delay Validation settings.
    Why does this have to be sooooooo difficult???
Can anyone help me out here in trying to set up an SSIS package in a rather restrictive environment to pump a .xlsx file into a SQL Server table? What in God's name am I doing wrong? Or is all this a misnomer? But if it's working, how do I disable the error so that it stops erroring out?

    Hi ITBobbyP,
According to your description, you get the error message when you import data from a .xlsx file to a SQL Server database.
    The error can be caused by the following reasons:
The Excel file is locked by another process. Please resave this file under a different file name to see if the issue is fixed.
The ACE (Access Database Engine) provider is not up to date, as Vaibhav mentioned. Please download the latest ACE and install it from the link:
    https://www.microsoft.com/en-us/download/details.aspx?id=13255.
The bitness of Office and the server is not the same. To solve the problem, please refer to the following document:
    http://hrvoje.piasevoli.com/2010/09/01/importing-data-from-64-bit-excel-in-ssis/
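A related workaround when the 64-bit runtime cannot load a 32-bit Jet/ACE provider (a sketch; the package path is hypothetical and the version folder 100 varies by SQL Server release): run the package with the 32-bit dtexec, or set Run64BitRuntime to False under Project Properties > Debugging in BIDS/SSDT.

    "C:\Program Files (x86)\Microsoft SQL Server\100\DTS\Binn\DTExec.exe" /f "C:\Packages\LoadExcel.dtsx"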
    If you have any more questions, please feel free to ask.
    Thanks,
Wendy Fu
    TechNet Community Support
