RBS Migration Speed

I am in the middle of migrating our SharePoint content to RBS. Everything looked good and I kicked off the Migrate() function. It has been a couple of days now and content is still migrating out of a large 130 GB site collection, but the data is only moving at about 1 GB per hour. Is that normal, or could I be having IO issues? I have been tracking progress for about a day by watching the size of the RBS volume on the SQL box. Below is the chart I am using to estimate when the RBS migration for the content DB should be done.
Total GB: 130
Avg. ETA: 1/30/15 10:26 AM

Size (GB)   Time               Difference (GB)   Time Span (HH:MM)   GB Per Hour   Hours Remaining   ETA                % Complete
83.0        1/28/15 10:30 AM   -                 -                   -             -                 -                  64%
86.0        1/28/15 11:55 AM   3.0               1:25                2.12          20.78             1/29/15 8:41 AM    66%
86.7        1/28/15 12:23 PM   0.7               0:28                1.49          29.03             1/29/15 5:24 PM    67%
88.2        1/28/15 2:00 PM    1.5               1:36                0.93          45.05             1/30/15 11:02 AM   68%
89.4        1/28/15 3:48 PM    1.2               1:48                0.66          61.35             1/31/15 5:10 AM    69%
90.0        1/28/15 4:33 PM    0.6               0:44                0.82          49.03             1/30/15 5:35 PM    69%
91.1        1/28/15 5:33 PM    1.1               1:00                1.10          35.36             1/30/15 4:54 AM    70%
91.8        1/28/15 6:27 PM    0.7               0:54                0.77          49.31             1/30/15 7:46 PM    71%
103.0       1/29/15 9:50 AM    11.2              15:22               0.73          37.08             1/30/15 10:54 PM   79%
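
For anyone tracking progress the same way, here is a rough sketch of the measurement and projection behind the chart; the store path and total are assumptions you would adjust:

# Sample the RBS blob store size twice and project an ETA, like the chart above.
$store   = 'R:\BlobStore'        # hypothetical path to the RBS FILESTREAM store
$totalGB = 130                   # total content to migrate, per the chart
$t1 = Get-Date
$gb1 = (Get-ChildItem $store -Recurse | Where-Object { -not $_.PSIsContainer } |
        Measure-Object -Property Length -Sum).Sum / 1GB
Start-Sleep -Seconds 1800        # wait between samples
$t2 = Get-Date
$gb2 = (Get-ChildItem $store -Recurse | Where-Object { -not $_.PSIsContainer } |
        Measure-Object -Property Length -Sum).Sum / 1GB
$rate = ($gb2 - $gb1) / ($t2 - $t1).TotalHours      # GB per hour
$eta  = $t2.AddHours(($totalGB - $gb2) / $rate)
'{0:N2} GB/h, {1:P0} complete, ETA {2}' -f $rate, ($gb2 / $totalGB), $eta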

1 GB per hour is only about 280 KB/s. I'd say that's unlikely to be due to IO contention unless you've got a lot else going on on the box as well. It's simply too slow even for consumer-grade hardware.
What hardware is the migration running on, and are there any other resource spikes (CPU, RAM, etc.)?
You can use PerfMon to check the counters for IOPS if you want to rule it out.
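
For example, from PowerShell (adjust the PhysicalDisk instance to the volume that hosts the RBS store):

# Sample IOPS, throughput, and latency every 5 seconds for one minute.
Get-Counter -Counter @(
    '\PhysicalDisk(_Total)\Disk Transfers/sec',     # IOPS
    '\PhysicalDisk(_Total)\Disk Bytes/sec',         # throughput
    '\PhysicalDisk(_Total)\Avg. Disk sec/Transfer'  # latency per IO
) -SampleInterval 5 -MaxSamples 12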

Similar Messages

  • RBS Migration and Data Store Expansion

    I'm seeking some insight on whether (and how) remote blobs can be migrated. For example, if I've configured RBS for SharePoint 2010 but I'm approaching the storage maximum of the hardware hosting my remote blob store, how would I go about moving the blobs elsewhere and pointing SQL Server and SharePoint at the new location? In addition, if I were to simply add another storage location, how does one reconfigure RBS to store blobs in the new/additional location?
    TIA.
    -Tracy

    1. Live SharePoint 2010 environment with SQL 2008 R2
       a. Take a backup from the live 2010 server.
          i. Open Management Studio on the SQL server.
          ii. Take a backup of the live application's content database.
    2. QA SharePoint 2010 environment with SQL 2008 R2
       a. Restore the SQL backup
          i. Open Management Studio on the SQL server.
          ii. Restore the database.
       b. Create a web application
          i. Open the SharePoint server.
          ii. Open Central Administration.
          iii. Create a web application with classic authentication.
       c. Dismount the database attached to the existing application
          i. Open the SharePoint PowerShell console on the SharePoint server.
          ii. Run the command below (change the database name):
          Dismount-SPContentDatabase <Database name>
       d. Mount the restored database to the existing application
          i. Open the SharePoint PowerShell console on the SharePoint server.
          ii. Run the command below (change the database name and web application URL):
          Mount-SPContentDatabase <Database name> -DatabaseServer <Database server name> -WebApplication <Web application>
          iii. Open SharePoint Designer, change the master page, and publish it.
          iv. Set the test page as the home page.
          v. Test user login: log in with 2-3 different users and verify that they can sign in.
       e. Configure RBS
          i. Enable FILESTREAM on the database server
             Open SQL Server Configuration Manager on the SQL server.
             In the left panel, click SQL Server Services.
             In the right panel, select the SQL Server instance on which you want to enable FILESTREAM.
             Right-click the instance and then click Properties.
             In the SQL Server Properties dialog box, click the FILESTREAM tab.
             Select the Enable FILESTREAM for Transact-SQL access check box.
             If you want to read and write FILESTREAM data from Windows, click Enable FILESTREAM for file I/O streaming access and enter the name of the Windows share in the Windows Share Name box.
             If remote clients must access the FILESTREAM data stored on this share, select Allow remote clients to have streaming access to FILESTREAM data.
             Click Apply, then OK.
          ii. Set the FILESTREAM access level
             Open SQL Server Management Studio and connect to the SQL database instance.
             Right-click the database instance and open Properties.
             Click Advanced in the left panel.
             Find the "Filestream Access Level" property and set the value to "Full access enabled".
             Click OK and exit the window.
          iii. Set the FILESTREAM access level with T-SQL
             Open a query window at the root and execute the following query:
             EXEC sp_configure filestream_access_level, 2
             RECONFIGURE
             Restart the SQL services.
             Note: You will get the message "Configuration option 'filestream access level' changed from 2 to 2. Run the RECONFIGURE statement to install."
          iv. Provision a BLOB store for each content database
             Click the content database for which you want to create a BLOB store, and then click New Query.
             Execute the following query (change the database name; you should get a "Command(s) completed successfully." message):
             use [<Database name>]
             if not exists
             (select * from sys.symmetric_keys
             where name = N'##MS_DatabaseMasterKey##')
             create master key encryption by password = N'Admin Key Password !2#4'
             Then execute the following (change the database name; again expect "Command(s) completed successfully."):
             use [<Database name>]
             if not exists
             (select groupname from sysfilegroups
             where groupname = N'RBSFilestreamProvider')
             alter database [<Database name>]
             add filegroup RBSFilestreamProvider contains filestream
             Then execute the following (change the database name and store path):
             use [<Database name>]
             alter database [<Database name>]
             add file (name = RBSFilestreamFile, filename = '<E:\SQL\Data\PetroChina>')
             to filegroup RBSFilestreamProvider
             Note: If you get the message "FILESTREAM file 'RBSFilestreamFile' cannot be added because its destination filegroup cannot have more than one file.", the filegroup already has its file; ignore it.
          v. Install the RBS client library on each web server
             To install the RBS client library on the first web server:
             Open the SharePoint web server and open a command prompt.
             Execute the following command (change the database name and instance name; the DB instance should be <server name\instance name>, and download the RBS.msi for your SQL version):
             msiexec /qn /lvx* rbs_install_log.txt /i RBS.msi TRUSTSERVERCERTIFICATE=true FILEGROUP=PRIMARY DBNAME=<Database name> DBINSTANCE=<Database server> FILESTREAMFILEGROUP=RBSFilestreamProvider FILESTREAMSTORENAME=FilestreamProvider_1
             To install the RBS client library on all additional web and application servers:
             Open the SharePoint web server and open a command prompt.
             Execute the following command (change the database name and instance name as above):
             msiexec /qn /lvx* rbs_install_log.txt /i RBS.msi DBNAME=<Database name> DBINSTANCE=<Database server> ADDLOCAL=Client,Docs,Maintainer,ServerScript,FilestreamClient,FilestreamServer
          vi. Enable RBS for each content database
             You must enable RBS on one web server in the SharePoint farm; it does not matter which web server you select. Perform this procedure once for each content database.
             Open the SharePoint web server and open the SharePoint PowerShell console.
             Execute the script below (change the database name):
             $cdb = Get-SPContentDatabase <Database name>
             $rbss = $cdb.RemoteBlobStorageSettings
             $rbss.Installed()
             $rbss.Enable()
             $rbss.SetActiveProviderName($rbss.GetProviderNames()[0])
             $rbss
          vii. Test the RBS installation
             On the computer that contains the RBS data store, browse to the data store directory and note its size.
             On the SharePoint farm, upload a file that is at least 100 kilobytes (KB) to a document library.
             On the computer that contains the RBS data store, check the size of the data store directory again; it must be larger than before.
          viii. Test user login
             Log in with 2-3 different users and verify that they can sign in.
       f. Migrate the BLOBs from RBS back into the SQL database and completely remove RBS
          i. Migrate all content from RBS to SQL and disable RBS for the content DB (see the state-check sketch after these steps):
             Open the SharePoint server and open the SharePoint Management PowerShell console.
             Execute the script below (change the database name; Migrate() might take some time depending on the amount of data in your RBS store):
             $cdb = Get-SPContentDatabase <Database name>
             $rbs = $cdb.RemoteBlobStorageSettings
             $rbs.GetProviderNames()
             $rbs.SetActiveProviderName("")
             $rbs.Migrate()
             $rbs.Disable()
             Note: If PowerShell prints messages like "Could not read configuration for log provider <ConsoleLog>. Default value used." (and the same for <FileLog>, <CircularLog>, <EventViewerLog>, and <DatabaseTableLog>), just wait; it takes a while for the migration to start.
          ii. Change the default RBS garbage collection window to 0 on your content DB:
             Open the SQL server and open SQL Server Management Studio.
             Select your content DB and open a new query window.
             Execute the SQL queries below one at a time; each should return "Command(s) completed successfully.":
             exec mssqlrbs.rbs_sp_set_config_value 'garbage_collection_time_window','time 00:00:00'
             exec mssqlrbs.rbs_sp_set_config_value 'delete_scan_period','time 00:00:00'
          iii. Run the RBS Maintainer (and disable the task if you scheduled it):
             Open the SharePoint server and open a command prompt.
             Run the command below (all on one line):
             "C:\Program Files\Microsoft SQL Remote Blob Storage 10.50\Maintainer\Microsoft.Data.SqlRemoteBlobs.Maintainer.exe" -ConnectionStringName RBSMaintainerConnection -Operation GarbageCollection ConsistencyCheck ConsistencyCheckForStores -GarbageCollectionPhases rdo -ConsistencyCheckMode r -TimeLimit 120
          iv. Uninstall RBS:
             Open the SQL server and open SQL Server Management Studio.
             On your content DB, run the SQL query below:
             exec mssqlrbs.rbs_sp_uninstall_rbs 0
             Note: If you get the message "The RBS server side data cannot be removed because there are existing BLOBs registered. You can only remove this data by using the force_uninstall parameter of the mssqlrbs.rbs_sp_uninstall stored pro", then run:
             exec mssqlrbs.rbs_sp_uninstall_rbs 1
             You should get a "Command(s) completed successfully." message.
          v. Uninstall SQL Remote Blob Storage from Add/Remove Programs.
             I found that there were still FILESTREAM references in my DB, so remove those references:
             Open the SQL server and open SQL Server Management Studio.
             Run the SQL queries below on your content DB, one at a time:
             ALTER TABLE [mssqlrbs_filestream_data_1].[rbs_filestream_configuration] DROP column [filestream_value]
             ALTER TABLE [mssqlrbs_filestream_data_1].[rbs_filestream_configuration] SET (FILESTREAM_ON = 'NULL')
          vi. Now you can remove the file and filegroup for FILESTREAM:
             Open the SQL server, open SQL Server Management Studio, and open a new query window.
             Execute the SQL query below (change the database name):
             ALTER DATABASE <Database name> REMOVE FILE RBSFilestreamFile;
             Note: If you get the message "The file 'RBSFilestreamFile' cannot be removed because it is not empty.", remove all tables prefixed with "mssqlrbs_" from your database and execute the query again. This query takes time proportional to your database size (up to about 30 minutes). You should get the message "The file 'RBSFilestreamFile' has been removed."
             Then execute the SQL query below (change the database name):
             ALTER DATABASE <Database name> REMOVE FILEGROUP RBSFilestreamProvider;
             You should get the message "The filegroup 'RBSFilestreamProvider' has been removed." If instead you get "Msg 5524, Level 16, State 1, Line 1 Default FILESTREAM data filegroup cannot be removed unless it's the last FILESTREAM data filegroup left.", ignore it.
          vii. Remove the blob store installation
             Open the SharePoint server, run the RBS.msi setup file, choose the Remove option, and finish the wizard.
          viii. Disable FILESTREAM in SQL Server Configuration Manager
             Disable FILESTREAM in SQL Server Configuration Manager for your instance (if you do not use it anywhere aside from this single content DB with SharePoint), restart SQL, run an IIS reset, and test.
          ix. Test that RBS was removed
             On the computer that contains the SQL database, note the size of the SQL database (.mdf file).
             On the SharePoint farm, upload a file that is at least 100 kilobytes (KB) to a document library.
             On the computer that contains the SQL database, check the size of the SQL database again; it should be larger than before. If there is no difference, just verify that the blob store is no longer being written to.
          x. Test user login
             Log in with 2-3 different users and verify that they can sign in.
       g. Convert classic-mode web applications to claims-based authentication
          i. Open the SharePoint server.
          ii. Open the SharePoint PowerShell console.
          iii. Execute the script below:
          $WebAppName = "<URL>"
          $wa = Get-SPWebApplication $WebAppName
          $wa.UseClaimsAuthentication = $true
          $wa.Update()
          $account = "<Domain name\User name>"
          $account = (New-SPClaimsPrincipal -identity $account -identitytype 1).ToEncodedString()
          $wa = Get-SPWebApplication $WebAppName
          $zp = $wa.ZonePolicies("Default")
          $p = $zp.Add($account,"PSPolicy")
          $fc = $wa.PolicyRoles.GetSpecialRole("FullControl")
          $p.PolicyRoleBindings.Add($fc)
          $wa.Update()
          $wa.MigrateUsers($true)
          $wa.ProvisionGlobally()
          iv. Test user login
          Log in with 2-3 different users and verify that they can sign in.
       h. Take a SQL backup from the QA server
          i. Open the SQL server.
          ii. Open Management Studio on the SQL server.
          iii. Select the content database.
          iv. Take a backup of the content database.
          Information: this SQL backup no longer contains RBS content.
    3. New SharePoint 2013 environment with SQL 2012
       a. Restore the SQL backup
          i. Open the SQL server.
          ii. Open SQL Server Management Studio.
          iii. Restore the SQL database using the *.bak file.
       b. Dismount the database attached to the existing application
          i. Open the SharePoint server.
          ii. Open the SharePoint Management PowerShell console.
          iii. Execute the script below (change the database name to the one bound to the existing application):
          Dismount-SPContentDatabase <Database name>
       c. Mount the restored database to the existing application
          i. Open the SharePoint server.
          ii. Open the SharePoint Management PowerShell console.
          iii. Execute the script below:
          Mount-SPContentDatabase <Database name> -DatabaseServer <Database server name> -WebApplication <URL>
          Note: Change the database name to the newly restored database, the database server name (in the form "DB server name\DB instance name"), and the web application URL. This command takes some time.
       d. Upgrade the site collection
          i. Open the SharePoint server.
          ii. Open the new site.
          iii. You will see a message at the top: "Experience all that SharePoint 15 has to offer. Start now or Remind me later".
          iv. Click "Start now".
          v. Click "Upgrade this Site Collection".
          vi. Click "I'm ready".
          vii. After some time you will get the message "Upgrade Completed Successfully".
          viii. Test user login.
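
    As mentioned in step f.i, here is a minimal post-Migrate() sanity check, a sketch using only the properties and methods the steps above already use (replace the placeholder with your database name):

    # Run in the SharePoint Management Shell after Migrate()/Disable().
    $cdb  = Get-SPContentDatabase WSS_Content   # placeholder database name
    $rbss = $cdb.RemoteBlobStorageSettings
    $rbss.Installed()          # still True until the RBS client library is uninstalled
    $rbss.Enabled              # should be False after Disable()
    $rbss.GetProviderNames()   # providers still registered on this database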

  • EBS to RBS migration with SQL Enterprise

    All of our customers are currently using EBS for SharePoint 2007/2010 with a custom provider that we provide as part of our solution.
    We built an RBS provider for the SharePoint 2013 version of our product, but just found out that it cannot be used on SQL Server Standard.
    So my question for Microsoft is: how can you deprecate EBS for SharePoint 2013 and force all of our customers using SQL Server Standard to pay to upgrade their database to SQL Enterprise (not a small fee) just so they can continue to externalize their SharePoint content on a UNC path or SAN? This is essentially telling customers that they must pay, in some cases, $18,000 to upgrade to SharePoint 2013. I understand that you may want to charge non-SharePoint SQL customers for moving SQL BLOBs to a network path or drive. But it is not right to suddenly charge customers for doing what they are already doing with SharePoint 2010 just because they want to move to SharePoint 2013. I am sure we will have customers that will consider not using our product, or SharePoint, if this policy persists, and I don't think either of us wants that.
    What options can you provide now, and is there anything we can do to raise this up the chain of command at MS?
    Thanks,
    Bill

    Our customers typically have millions of documents in SharePoint. The reason we have always externalized is that we don't want the database size to reach terabytes. I'm not even sure what the maximum supported size is, but I don't really want to deal with restoring DBs of that size if it's ever needed.
    One of the ways that legacy ECM solutions have always sold against Microsoft SharePoint is by telling customers that they would have to store documents in the database. Our and other partners' EBS/RBS solutions attempt to address this concern for true enterprise-wide content management applications. We are not storing collaborative marketing material in SharePoint; we are storing transactional documents like vendor invoices, purchase orders, student records, etc. Do you think shredded storage alleviates the concern of storing millions of blobs in SQL, or that storing that many blobs is not really a problem?
    Thanks,
    Bill

  • Migrated data from RBS to datafiles but data still remains in FileStream

    I have a content database with RBS enabled on SQL Server 2012 FILESTREAM. I migrated the data from FILESTREAM to SQL with this script:
    $cdb = Get-SPContentDatabase <database_name>
    $rbs = $cdb.RemoteBlobStorageSettings
    $rbs.SetActiveProviderName("")
    $rbs.Migrate()
    $rbs.Disable()
    I saw the data being loaded into SQL (the data files grew after the migration), and everything checks out:
    $rbs.Installed() # shows True
    $rbs.Enabled # shows False
    I then ran this on the back-end SQL database:
    exec mssqlrbs.rbs_sp_set_config_value 'garbage_collection_time_window','time 00:00:00'
    exec mssqlrbs.rbs_sp_set_config_value 'delete_scan_period','time 00:00:00'
    However, the data is still in the FILESTREAM store.
    How do I get rid of the FILESTREAM store data (since it was migrated to the SQL data files)?

    Check this link and make sure you followed the steps:
    http://alipka.wordpress.com/2010/06/19/how-to-disable-rbs-in-sharepoint-2010/
    or Step K on this blog:
    http://blogs.technet.com/b/pramodbalusu/archive/2011/07/09/rbs-and-sharepoint-2010.aspx
    Thanks -WS MCITP (SharePoint 2010, 2013) Blog: http://wscheema.com/blog
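
    Those steps boil down to: zero the garbage collection window (already done above) and then actually run the RBS Maintainer, which is what physically deletes the orphaned blobs from the FILESTREAM store, before removing the FILESTREAM file and filegroup. A condensed sketch, reusing the Maintainer invocation from the long procedure earlier on this page; the install path and connection string name are whatever your RBS setup created:

    # The Maintainer performs the deletes once the GC window is 00:00:00.
    # Path and -ConnectionStringName vary by install; these match the steps above.
    & 'C:\Program Files\Microsoft SQL Remote Blob Storage 10.50\Maintainer\Microsoft.Data.SqlRemoteBlobs.Maintainer.exe' `
        -ConnectionStringName RBSMaintainerConnection `
        -Operation GarbageCollection `
        -GarbageCollectionPhases rdo `
        -TimeLimit 120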

  • RBS questions on SharePoint 2013

    I need some advice on an RBS setup for SharePoint 2013. I am following the Microsoft article on setting up RBS, where you run some SQL queries to enable the SQL FILESTREAM on the content databases you want RBS set up on. Then I download the RBS_amd64.msi file, install it on one web front-end server, and enable RBS for each content database.
    My first question is that I have other web servers, app servers, and DB servers: do I need to install the RBS msi on every other server in the farm, or just the web and app servers? That is, do we run that msiexec command on each server?
    Also, how does RBS work for migrated or existing sites which already have documents stored? If we enable RBS for these sites with a 5 MB threshold, would that apply only to newly uploaded files over 5 MB, or also to already-stored files over 5 MB? (A threshold-setting sketch follows this question.)
    Also, does shredded storage affect the file sizes being evaluated as well? If we configure files over 5 MB to be externalized, will this apply to files just over that limit, or do files have to be much larger than 5 MB, such as 8 or 9 MB, to have RBS applied to them?
    Can I be advised on these questions?
    Thanks
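
    On the threshold part: the RBS minimum blob size is a per-content-database setting in PowerShell, and as far as I know it only applies to blobs written after it is set; documents already in the database stay where they are until they are rewritten or a Migrate() is run. A minimal sketch, assuming the MinimumBlobStorageSize property on RemoteBlobStorageSettings (present from SharePoint 2010 SP1 onward) and a hypothetical database name:

    # Hedged sketch: externalize only blobs larger than 5 MB for this database.
    $cdb  = Get-SPContentDatabase WSS_Content      # hypothetical database name
    $rbss = $cdb.RemoteBlobStorageSettings
    $rbss.MinimumBlobStorageSize = 5MB             # blobs at or under 5 MB stay in SQL
    $cdb.Update()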

    I suggest you not use RBS if you have plans to migrate to newer versions later. Still, I will give you the steps for migrating RBS on SharePoint 2010 to 2013. Note that when I migrated to 2013 it was not done via RBS: I had to remove RBS from the 2010 server, convert the content back into a database file of more than 600 GB, and then migrate that DB file and restore it on SharePoint 2013 with SQL 2012. Believe me, the copying alone took more than 15 days, so better not to use RBS if you have large data.
    The steps are the same migration procedure posted in full above, under "RBS Migration and Data Store Expansion" (steps 1 through 3).

  • I want to transfer files from my MacBook to my new MacBook Pro; what cable will provide the fastest transfer rate?

    I want to transfer files and software from my old MacBook to a new MacBook Pro. What cable will provide the fastest migration speed?

    FireWire 800 if it is present on your MacBook, otherwise FireWire 400.

  • Content Database large after removing BLOB

    My content database was 3 GB before removing the BLOBs. Removing the BLOBs required a merge, which moved all of the files from the BLOB store into my content database and made it 32 GB. I then deleted the site containing all of the documents, but it didn't reduce the size of my content database.
    How can I clean up and get my content database back down to the original 3 GB?

    Thanks for your replies and interest in helping. I've been researching this for days.
    In order to remove the BLOBs I had to migrate the data into my content database. I then deleted the list and the subsite and emptied the recycle bin. After doing all the cleanup I could find on the SharePoint side, I started researching the SQL side.
    I followed the instructions in the links you posted, but the shrinking (second link) was basically the same as what I've already tried; the same steps, just from a different site.
    I don't know if I should be looking at SharePoint, SQL, or both. The migrate (removing the BLOBs) was done in SQL, so I'm thinking it should be removed using SQL.
    This is the link to the steps I used to remove the BLOB:
    https://alipka.wordpress.com/2010/06/19/how-to-disable-rbs-in-sharepoint-2010/
    This is the step I did that increased the DB size:
    Migrate all content off RBS to SQL and disable RBS for the content DB:
    $cdb = Get-SPContentDatabase <ContentDbName>
    $rbs = $cdb.RemoteBlobStorageSettings
    $rbs.GetProviderNames()
    $rbs.SetActiveProviderName("")
    $rbs.Migrate()  # note: this might take some time depending on the amount of data in your RBS store
    $rbs.Disable()
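
    Two things usually explain this. First, in SharePoint 2010 a deleted site collection is emptied by the Gradual Site Delete timer job, so the rows can remain in the content database for some time after the delete. Second, even once the rows are gone, SQL Server does not hand the free space inside the data file back to the OS until the file is shrunk. A minimal sketch with hypothetical server and database names (shrinking is slow and fragments indexes, so rebuild indexes afterwards):

    # Hypothetical names; requires the SQL Server PowerShell module (Invoke-Sqlcmd).
    # First see how much of the file is actually unallocated:
    Invoke-Sqlcmd -ServerInstance 'SQLSRV' -Database 'WSS_Content' -Query 'EXEC sp_spaceused;'
    # Then shrink the data file (logical file name, target size in MB):
    Invoke-Sqlcmd -ServerInstance 'SQLSRV' -Database 'WSS_Content' -Query 'DBCC SHRINKFILE (N''WSS_Content'', 4000);'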

  • Clustered volume question

    I've recently taken over the administration of a network that has the core file shares on 3 physical, clustered NetWare 6.5 machines. They are extremely old servers, and the Fibre Channel SAN on which the file shares reside is at end-of-life. The combined size of these file shares is around 1 TB. I need to migrate these volumes to some other platform.
    I'm much more familiar with SLES and OES2 than I am with NetWare. I need to move these file shares to new EqualLogic SAN volumes. I'd like to create a clustered, or at least an HA, environment, and I'd like the hosts to be VMs on ESX 3.5 (soon to be upgraded to ESXi 4.0).
    What I'm looking for are some suggestions from the community as to how to proceed. Reliability is an issue, as is speed; the NetWare clustered-volume environment has been extremely reliable.
    We are a private university with approximately 600-800 concurrent users. Any suggestions would be helpful.

    Novell does provide migration tools. Personally I like to do the work manually.
    Moving the data shouldn't be a problem: you can export the volumes on NetWare via NFS and rsync the data directly to an OES server (you could use CIFS or NCP as well; I just find NFS faster). If you use the rsync -a option it preserves most file information (there are other options).
    Back up the trustees on NetWare (using trustee.nlm). You will then need to convert this file to the OES format; again, I do this via a script that turns each line into a RIGHTS import command (rights is used on OES Linux).
    If you have moved home drives you will need to re-apply file permissions, either via the trustee export or, as I do, by running an 'ls' in a for loop on the users' folders in Linux and applying the result with the rights command. You will also need to re-add the homeDirectory attribute; for that I use a tool a colleague wrote, called ldapdo, which is on Cool Solutions.
    We run a 5-node cluster on physical hardware. I am not sure how well supported clustering is on ESX; personally I would use physical hardware, but then I am not a fan of ESX.
    Personally, I don't find OES Linux as good as NetWare; we had the same number of resources on fewer servers. That said, with the latest updates stability has vastly improved, and so have the cluster migrate speeds.
    We currently have up to 2000 concurrent users on a busy day (with up to 80-90 concurrent logins for labs). We provide access via NCP, CIFS and NFS for Linux labs, although we are looking to remove NFS since with NSS you can only export the file system with NO_ROOT_SQUASH.
    Our data usage is about 14 TB across 30 resources. Our university is trying to centralise all IT onto Windows but failing to provide a solution that is stable and able to support multi-protocol access; in the meantime, our system limps on, on old hardware.
    If you want any of the scripts let me know; they really were thrown together but they do the job. I have to migrate data quite often onto new (bigger) disks as our data requirements grow. We use some pretty reliable but cheap SATA SANs from Infortrend, as our budget is minimal.

  • Upgrade and plug-in migration? Speed changes and transitions to Motion 3

    I have the FCS2 upgrade on its way, soon I hope.
    Will my Nattress plug-ins and others migrate when I go from FCS1 to 2?
    I am mid-project; I have done all my captures but haven't got into the nitty-gritty of the edit yet. Do you reckon it's a bit risky to upgrade?
    I am super keen to get the ability to easily do ramps (in Motion 3 or in FC 6) and 3D logo fly-arounds. Looking forward to being able to throw transitions and speed changes into Motion 3; I hear that it can deal with that now. Is this true?

    Thanks, that's all good.

  • Poor SSD disk IO speed in Oracle Linux 6.3 (Windows migration)

    Hello,
    I am trying to migrate from Windows to Oracle Linux, but I'm seeing very poor disk IO speeds. It's probably a tuning thing, but I'm relatively new to Oracle Linux and could use some detailed advice.
    I took one physical server and migrated it from Windows 2008 R2 to Oracle Linux 6.3 while maintaining the same Oracle version (11.2.0.3 Enterprise with ASM) and the same hardware (quad-CPU 48-core HP DL585 G7 with 128 GB RAM, 7 LSI 9200-8e HBAs, 28 Samsung SSD drives). Disk IO performance, as measured using Oracle IO Calibration, was ~7,800 MB/second and 440K IOPS under Windows but fell to ~2,400 MB/second and 250K IOPS under Linux.
    Oracle Linux and the DB were installed using default values. The Oracle tools seem to have done a great job setting all of the obvious IO tuning parameters like the scheduler, but I figure there are other important IO-related OS or DB parameters that I have failed to configure properly.
    My goal for the migration is sequential read IO speed, and I would have bet money that Linux would provide better performance than Windows; I still think it should. What basic IO tuning should I do for Oracle Linux using ASM and SSD drives?
    Thank you!
    Some details:
    Oracle DB 11.2.0.3 enterprise installed via the GUI with the "Data Warehousing" template
    ASM - single disk group, 28 SSD disks, AU=4MB
    Oracle memory: Automatic memory management, 64GB allocated
    Non-default Oracle params: filesystemio_options=setall, disk_asynch_io=true
    Edited by: 975524 on Dec 7, 2012 8:56 AM

    Thanks "dude" for the advice. Unfortunately, I am still seeing low IO speeds.
    The default scheduler for OEL 6.3 with the DB pre-install package is deadline, which seemed like a far better choice than CFS. Based on your advice, I tried noop this morning and got the same results. I also tested with and without hugepages and saw only a small difference - at least in IO speed - I did not test overall DB performance. Lastly, I understand the /dev/shm issue, but even with the default configuration I'm getting 64MB allocated to Oracle, which is far more than is needed to test sequential IO - in fact I can get better results by using less RAM.
    To answer your questions, I am testing using Oracle IO Calibration, which is an IO testing feature of the Oracle DB that is similar to the standalone Oracle Orion tool. I also performed a few tests using IOMeter, but found that the Linux version of that product was not giving me consistent data. The overall trend was the same however - IO on the Linux version far lower than the same hardware running Windows. The system is functioning very well, so I assume that everything has been installed correctly, but I do not think that it was installed optimally - thus my cry for help.
    I am so surprised that Linux is showing slow IO!
    Edited by: 975524 on Dec 7, 2012 9:22 AM

  • File opening throws RBS exception after migrating to a new farm with sql 2012

    We have successfully migrated the SP 2010 / SQL 2008 R2 WSS_Content database to a new SP 2010 farm with SQL 2012. The old and new farms are both configured for RBS FILESTREAM.
    In the new farm, when we try opening a document we see the exceptions below in Event Viewer. The documents open fine for admin users but fail for others. I have verified all the permissions on the blob store file location and it looks good. Users have no issue uploading new files, whether over or below the set file size limit.
    Any suggestions on how to debug this would be great. Do we need to rerun .Migrate() to realign any of the blob file references? We played around with setting the max file size a couple of times.
    Event Viewer on the DB server:
    FILESTREAM file named with GUID '00311100-0000-51eb-f3a8-0b387bc5c5ce' that belongs to FILESTREAM data file ID 0x10001 does not exist or cannot be opened.
    Event Viewer on the front end:
    Message ID:20, Level:ERR , Process:10580, Thread:36
    Blob store <FilestreamProvider_1> threw this exception:
    Operation: Unknown
    BlobStoreId: 1
    Log Time: 5/5/2014 8:00:49 PM
    Exception: Microsoft.Data.SqlRemoteBlobs.BlobStoreException: There was a generic database error. For more information, see the included exception. ---> System.Data.SqlClient.SqlException: A transport-level error has occurred when receiving results from
    the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.)
       at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
       at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
       at System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, UInt32 error)
       at System.Data.SqlClient.TdsParserStateObject.ReadSni(DbAsyncResult asyncResult, TdsParserStateObject stateObj)
       at System.Data.SqlClient.TdsParserStateObject.ReadNetworkPacket()
       at System.Data.SqlClient.TdsParserStateObject.ReadBuffer()
       at System.Data.SqlClient.TdsParserStateObject.ReadByteArray(Byte[] buff, Int32 offset, Int32 len)
       at System.Data.SqlClient.TdsParserStateObject.ReadInt64()
       at System.Data.SqlClient.TdsParserStateObject.ReadPlpLength(Boolean returnPlpNullIfNull)
       at System.Data.SqlClient.TdsParser.ProcessColumnHeader(SqlMetaDataPriv col, TdsParserStateObject stateObj, Boolean& isNull)
       at System.Data.SqlClient.SqlDataReader.ReadColumnHeader(Int32 i)
       at System.Data.SqlClient.SqlDataReader.GetBytesInternal(Int32 i, Int64 dataIndex, Byte[] buffer, Int32 bufferIndex, Int32 length)
       at System.Data.SqlClient.SqlDataReader.GetBytes(Int32 i, Int64 dataIndex, Byte[] buffer, Int32 bufferIndex, Int32 length)
       at Microsoft.Data.BlobStores.FilestreamBlobStore.FilestreamStoreLibraryBase.ReadBlob(Byte[] storePoolId, Byte[] storeBlobId)
       --- End of inner exception stack trace ---
       at Microsoft.Data.BlobStores.FilestreamBlobStore.FilestreamStoreLibraryBase.ReadBlob(Byte[] storePoolId, Byte[] storeBlobId)
       at Microsoft.Data.BlobStores.BlobStore.ReadBlobInternal(Request request)
    BlobStoreException Code: OperationFailedAuthoritative
    -Mash "http://sharepointxperiments.wordpress.com"

    Migration: it's a content DB backup/restore/mount to the new farm with SQL 2012 from SQL 2008 R2.
    OK, here are the results after we tried to narrow the issue down to a few steps.
    The issue happens on documents that already existed in the content DB prior to migration. We picked two docs from one library, DOC A and DOC B, of close to the same size.
    DOC A: file size 64 KB, RBS max size set to 1 MB.
    Regular user and SP admin can both open it fine and check it out and in successfully.
    No entries are made in Event Viewer on the front end or the DB server.
    DOC B: file size 64 KB, RBS max size set to 1 MB.
    SP admin can check it out and open it fine, but cannot check it back in (file not available error).
    Regular user cannot open the file (file not available at the URL).
    Both users can view the doc properties at the library level.
    Errors found in Event Viewer on the DB server:
    "FILESTREAM file named with GUID '00311100-0000-51eb-f3a8-0b387bc5c5ce' that belongs to FILESTREAM data file ID 0x10001 does not exist or cannot be opened."
    and
    "The operating system returned the error '2(The system cannot find the file specified.)' while attempting 'FsDoHandler::CreateFile' on 'H:\Blob_Store\a06953ae-6f49-43b1-9c2d-fefb13bda9fe\1c581a4a-b450-4143-bc3d-8d419b49b0fb\0000eb20-000066ff-0002' at 'fstrman.cpp'(2075)."
    I am inclined to believe that the blob store references for some documents were corrupted during the migration. Any new docs uploaded to the content DB or to the blob store are all accessible fine.
    I'm not sure running .Migrate() will realign the blob file references. Any ideas on how to effectively get this resolved?
    -Mash "http://sharepointxperiments.wordpress.com"
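
    Given the "system cannot find the file specified" error above, one quick check is whether the blob file the provider is asking for actually exists on the store volume, followed by an RBS Maintainer consistency check against the store. A sketch, reusing the path from the event-log entry and the same Maintainer tool and flags that appear in the removal steps earlier on this page (the install path and connection string name depend on your setup):

    # Does the blob the FILESTREAM provider wants actually exist on disk?
    Test-Path 'H:\Blob_Store\a06953ae-6f49-43b1-9c2d-fefb13bda9fe\1c581a4a-b450-4143-bc3d-8d419b49b0fb\0000eb20-000066ff-0002'
    # RBS Maintainer consistency check, flags as shown in the steps above.
    & 'C:\Program Files\Microsoft SQL Remote Blob Storage 10.50\Maintainer\Microsoft.Data.SqlRemoteBlobs.Maintainer.exe' `
        -ConnectionStringName RBSMaintainerConnection `
        -Operation ConsistencyCheck ConsistencyCheckForStores `
        -ConsistencyCheckMode r -TimeLimit 120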

  • Speed up the process of migrating Interactive Reporting version 8.3.2 to 9.3.1

    Is there a way to speed up the process of migrating Hyperion Performance Suite 8.3.2 to Interactive Reporting 9.3.1?
    What I tried, and what helped, was:
    The SQL repository transfer was sped up by copying (SELECT INTO) the old DB tables into the new repository.
    Transferring the repository files is done by a manual xcopy script from server A to server B.
    But the process of renaming and converting the BQY files into the new version is very, very slow, and we have to convert about 8000 files.
    I found a Java heap size setting in run.xml, but I am not aware of any improvement after increasing that value.
    It is brioqry_s.exe which takes most of the time to open, convert, and save the individual files.
    Is it just a matter of adding extra CPU or memory? I am not too familiar with that kind of thing, but if anyone has an idea we can work on it.
    Hope to hear from you, Detlev

    We had too much job output, so we deleted it in the source; that helped a lot.
    The USAGE events were also deleted directly from the SQL tables; that is useless information in a new environment, so why bother migrating it. Now it runs within 13 hours. That's acceptable.

  • How do i speed up my migration from my old mac to the new one?

    How do i speed up my migration from my old mac to the new one?

    What method are you using: FireWire, Ethernet, wireless? I'm assuming you are doing this via Migration Assistant. You can't speed it up once it is in progress, but you can use a faster method if one is available to you. http://support.apple.com/kb/HT4413

  • Transferring/copying documents from iMac to MacBook Pro: Migration Assistant or high-speed file share cable, which would be faster?

    I started Migration Assistant to copy documents from my iMac to my new MacBook Pro. It started off saying "About 53 hours and 40 minutes remaining", then went to "About 64 hours and 47 minutes remaining". I have a high-speed file share cable; should I have used that?

    If you mean a FireWire cable, yes.
    Your best bet is to start over. See the green box in Problems after using Migration Assistant.

  • Speed also at 200 kbps, Informed of migration? Adv...

    Hi all, and thanks in advance for all your help.
    After having Sky installed last Saturday (3rd April), my internet connection has been limited to 200 kbps.
    I rang BT today and was informed of a "migration" that will take another 10 days before I get an appropriate connection speed. Is this normal practice? How can I keep track of progress on this migration? I've been told not to ring back about the issue, but I have no way of knowing what's going on.
    Just on another issue: my broadband is capped monthly, but after having the Sky box connected I'm worried I'm going to exceed this limit. When talking to a Sky salesperson while purchasing the product, they said that Sky will have no effect on downloads. Is this correct? If it is, why does it need to be connected?
    Thanks
    matt

    Are you migrating to BT from another ISP, or away from BT?
    Is your Sky box filtered at the phone socket?
    Get your line stats from your router with and without the Sky box plugged into the phone socket and see if the downstream figures change (connection rate, SNR/noise margin): http://www.kitz.co.uk/adsl/frogstats.php
    While you're at it, stick the appropriate figures into this estimator and get an idea of what speeds you should be getting: http://www2.farina1.com/adsl/
    Also make sure your internal wiring is up to scratch: http://www.kitz.co.uk/adsl/socket.htm
    Sky insist on the box being connected for the first year, I believe, so they can check where it's being used; I could be wrong though.
