Question on Filestreams

Hi All,
I have a question on Filestreams.
I know that FILESTREAM was introduced in SQL Server 2008 and is used to store BLOBs on the NTFS file system rather than in the database file. Now, the question is: what is the benefit of doing so? Why do we need to store on the file system and not in the database file?
Appreciate your help.
Thank you.
Sam

Storing and retrieving BLOB data from the file system is faster than storing and retrieving it from within SQL Server. SQL Server is good at handling structured data, but not so good at handling large unstructured data.
It is worth pointing out that this applies if you use the special OpenSqlFilestream API and then regular Win32 API operations for reading and writing data. If you only move the data from a regular BLOB column to FILESTREAM, you don't win that much.
Also, the data will have to be of some size for FILESTREAM to be a winner. 1 MB is often given as the tipping point. If your blobs generally are smaller, don't bother.
Erland Sommarskog, SQL Server MVP, [email protected]
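
A minimal sketch of the pattern Erland describes, with hypothetical table and column names: a FILESTREAM column plus the query that hands a client the path name and transaction context that the OpenSqlFilestream / SqlFileStream APIs need:

    -- Requires a database with a FILESTREAM filegroup.
    CREATE TABLE dbo.Documents (
        DocId   uniqueidentifier ROWGUIDCOL NOT NULL UNIQUE DEFAULT NEWID(),
        DocName nvarchar(260)    NOT NULL,
        DocData varbinary(max)   FILESTREAM NULL
    );

    -- Win32 streaming access only works inside a transaction:
    BEGIN TRANSACTION;
    SELECT DocData.PathName() AS FilePath,
           GET_FILESTREAM_TRANSACTION_CONTEXT() AS TxContext
    FROM dbo.Documents
    WHERE DocName = N'example.pdf';  -- hypothetical row
    -- ...open FilePath with OpenSqlFilestream / SqlFileStream and stream the data...
    COMMIT TRANSACTION;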

Similar Messages

  • Filestream Creation Unable to Open Physical File Operating System Error 259

    Hey Everybody,
    I have run out of options supporting a customer who gets an error when creating a database with a filestream. The error displayed is: unable to open physical file, operating system error 259 (No more data is available). We're using a pretty standard creation SQL script that we haven't had issues with for other customers:
    -- We are going to create our data paths for the filestreams.  
    DECLARE @data_path nvarchar(256);
    SET @data_path = (SELECT SUBSTRING(physical_name, 1, CHARINDEX(N'master.mdf', LOWER(physical_name)) - 1)
                      FROM master.sys.master_files
                      WHERE database_id = 1 AND file_id = 1);
    -- At this point, we should be able to create our database.  
    EXECUTE ('CREATE DATABASE AllTables
    ON PRIMARY
        (NAME = AllTables_data
        ,FILENAME = ''' + @data_path + 'AllTables_data.mdf''
        ,SIZE = 10MB
        ,FILEGROWTH = 15%),
    FILEGROUP FileStreamAll CONTAINS FILESTREAM DEFAULT
        (NAME = FSAllTables
        ,FILENAME = ''' + @data_path + 'AllTablesFS'')
    LOG ON
        (NAME = AllTables_log
        ,FILENAME = ''' + @data_path + 'AllTables_log.ldf''
        ,SIZE = 5MB
        ,FILEGROWTH = 5MB)');
    GO
    We are using SQL Server 2014 Express. Filestreams were enabled when SQL Server was installed. The instance was created successfully and we are able to connect to the database through SSMS. The user's drive is encrypted with Sophos.
    We have tried the following:
    1. Increasing the permissions so that the SQL Server service account has full access to the folders.
    2. Attempting a restore of a blank database, which also failed.
    There doesn't seem to be any knowledge base articles on this particular error and I am not sure what else I can do to resolve this.  Thanks in advance for any help!

    Hi Ryan,
    1) SQL Server (any version) can't be installed on encrypted drives. Please see a similar scenario in the following link:
    https://ask.sqlservercentral.com/questions/115761/filestream-and-encrypted-drives.html
    2) I don't think there is any problem with permissions on the folder if the user can create a database in the same folder, but I am not too sure. Also see the article by
    Jacob on configuring FILESTREAM for SQL Server, which describes how to configure the FILESTREAM access level and create a FILESTREAM-enabled database.
    Hope this helps,
    Thanks
    Bhanu 
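
    For reference, a minimal sketch of the instance-level setting that article walks through (FILESTREAM must also be enabled for the instance in SQL Server Configuration Manager first); level 2 allows both T-SQL and Win32 streaming access:

        EXEC sp_configure 'filestream access level', 2;
        RECONFIGURE;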

  • Permission to FileStream Directory on MSDN question

    On technet you have listed - http://technet.microsoft.com/en-us/library/bb933993(v=sql.105).aspx
    Only the account under which the SQL Server service account runs is granted NTFS permissions to the FILESTREAM container. We
    recommend that no other account be granted permissions on the data container.
    Why is this the case? What if you want to allow your IIS app pool account access to read these files? We are using a PDF API that takes two minutes or more to generate the PDF when streaming, whereas if we can read from the directory directly it takes milliseconds. Can you provide more evidence on why the app pool identity cannot access this directory? Again, why the recommendation?
    In MSDN you contradict yourself on how to use IO to Read/Write to the file tables - http://msdn.microsoft.com/en-us/library/gg492089.aspx#accessing
    Moojjoo MCP, MCTS
    MCP Virtual Business Card
    http://moojjoo.blogspot.com

    Tibor, I am writing a custom application for the web,
    where WebConfigurationManager.AppSettings["WebDocuments"] = the filestream directory.
    INSERTs
    public void UploadFiles(List<UploadFileModel> uploadedFile)
    {
        string path = WebConfigurationManager.AppSettings["WebDocuments"];
        foreach (UploadFileModel file in uploadedFile)
        {
            // Skip missing or empty uploads.
            if (file == null) continue;
            if (file.FileName == null) continue;
            if (file.File.ContentLength == 0) continue;
            string savedFileName = Path.Combine(
                path,
                Path.GetFileName(file.FileName));
            file.File.SaveAs(savedFileName);
        }
    }
    DELETEs
    public static void DeleteFilesByWebSiteId(int webSiteId)
    {
        string path = WebConfigurationManager.AppSettings["WebDocuments"];
        //string path = @"C:\_Temp\"; Used with Upload
        // Delete every file whose name starts with the web site id.
        string filesToDelete = webSiteId.ToString() + "*";
        string[] fileList = Directory.GetFiles(path, filesToDelete);
        foreach (string file in fileList)
        {
            System.IO.File.Delete(file);
        }
    }
    Again this would require the app pool identity.  Is this a security problem and why?  It would only require read/write capability.
    Moojjoo MCP, MCTS
    MCP Virtual Business Card
    http://moojjoo.blogspot.com
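
    For what it's worth, the documented reason is that FILESTREAM data is secured by SQL Server's own permission model: the supported streaming path (SqlFileStream over the instance's FILESTREAM share) authorizes the caller against the table's SQL permissions, so the app pool identity would normally get SQL grants rather than NTFS ACLs on the data container (and, since SqlFileStream requires integrated security, it must be a Windows login). A hedged sketch with hypothetical object and account names:

        -- Hypothetical names: authorize the IIS app pool identity at the SQL level;
        -- streaming access is then checked by SQL Server itself, so no extra NTFS
        -- permissions on the FILESTREAM data container are needed.
        CREATE LOGIN [DOMAIN\AppPoolAccount] FROM WINDOWS;
        CREATE USER [DOMAIN\AppPoolAccount] FOR LOGIN [DOMAIN\AppPoolAccount];
        GRANT SELECT, INSERT, UPDATE ON dbo.WebDocuments TO [DOMAIN\AppPoolAccount];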

  • Fairly certain that FileStream.writeObject() and FileStream.readObject() do not function - at all -.

    I've struggled with this since Jan 9th, 2013 (if not longer) and the only conclusion I can come to is that this simply does not function.  No matter what I try and no matter what resource (and I'm finding precious few) I follow to try to implement this within Flash Builder 4.7, Flex SDK 4.6.0 (Build 23201), AIR SDK 3.5, I only succeed in creating a file (with the correct name) that is 123 bytes in size and that reads back in as NULL.
    I've tried using ByteArray.writeObject()/readObject() as an intermediary with FileStream.writeBytes()/readBytes(), with no luck.
    I've tried instantiating an object, setting properties and then using that.  I've tried instantiating my correctly formed ValueObject (including the remoteClass alias metadata tag).
    I've tried using -verbatim- the example provided in the top most suggested 'Community Help' resource http://www.switchonthecode.com/tutorials/adobe-air-and-flex-saving-serialized-objects-to-file It is worth noting that this solitary example of the procedure/SDK usage is dated to Flex SDK 3.0 and at least 04/04/2009 (first comment on the article).
    My frustrating hell (one version of many methods attempted) is detailed on StackOverflow (including -all- mxml, as, and trace output), but so far, no assistance has been forthcoming, alas.  This is a severely stripped down and simplified version of what had been a far more complex attempt:
    http://stackoverflow.com/questions/14366911/flex-air-actionscript-mobile-file-writeobject-readobject-always-generates-null-w
    An earlier post* detailing a far more complex attempt iteration, with, alas, just as little help forthcoming (guess this isn't a hot-button topic):
    http://stackoverflow.com/questions/14259393/flex-actionscript3-filestream-writeobject-fails-silently-in-ios-what-am-i-doin
    * I previously suspected that it was only failing from within iOS on an iPad, but the first example (the stripped-down version) made it evident that it didn't work in the AIR mobile device simulator (iPad) in the Windows environment, and indeed didn't work in a non-mobile project in the Windows AIR launcher either.
    I'm at a loss, upset, frustrated, in major trouble with my supervisor/deadlines, etc.
    I would very much appreciate any suggestions/help/confirmation/etc.
    Just to move ahead with development I've opted for a far less preferable solution of writing out both an XML file and a JPG file.  I very much do not like this and very much want to store encapsulated serialized objects locally in the same way I assume will work for storing remotely with AMFPHP (if the project ever gets to that point *sigh*).
    Again.  Would be so grateful for any help.

    I want to add to this post, as I marked it as "The Answer" though it does not contain the answer directly, for those who come looking for similar solutions.
    harUI prompted me to realize that my metadata term needed to be capitalized (RemoteClass instead of remoteClass). As metadata tags may be user defined, the compiler throws no errors (or warnings *grumble*).
    package vo
    {
        import flash.display.BitmapData;

        // [remoteClass(alias="PTotmImageVO")]  <-- incorrect, silently ignored
        [RemoteClass(alias="PTotmImageVO")]
        public class PTotmImageVO
        {
            // ...
        }
    }

  • Access Denied error while reading from filestream

    Hi Everyone.
    I have an intranet application that stores files in SQL filestream.
    On my dev machine, everything works like a charm.
    I'm able to upload and store files into SQL filestream (AjaxUpload) and able to download them.
    On the live server, I'm able to upload files and delete them, but when I try to download a file from filestream, I get the following error:
    Access is denied
    Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
    Exception Details: System.ComponentModel.Win32Exception: Access is denied
    Source Error:
    An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.
    Stack Trace:
    [Win32Exception (0x80004005): Access is denied]
    System.Data.SqlTypes.SqlFileStream.OpenSqlFileStream(String path, Byte[] transactionContext, FileAccess access, FileOptions options, Int64 allocationSize) +1465594
    System.Data.SqlTypes.SqlFileStream..ctor(String path, Byte[] transactionContext, FileAccess access, FileOptions options, Int64 allocationSize) +398
    System.Data.SqlTypes.SqlFileStream..ctor(String path, Byte[] transactionContext, FileAccess access) +27
    quotes_GetFileStream.quotes_GetFileStream_Load(Object sender, EventArgs e) +740
    System.Web.UI.Control.LoadRecursive() +71
    System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +3048
    The application pool is set to integrated, v4.0, and I use a domain user for the identity that authenticates back to SQL.
    I gave that user db_owner rights to the SQL database for that application.
    I even tried giving it all the SQL Server roles, though I still get the above error.
    When I change the identity username to mine (I have Domain Admin rights), everything works flawlessly on the live server.
    What rights am I missing for that user so it can read from SQL filestream properly?
    Here is the block of code that gets the file from filestream and pushes it to the browser; maybe I'm missing something here (though once I modify the user it works great).
    Dim RecId As Integer = -1
    If Not IsNothing(Request.QueryString("ID")) AndAlso IsNumeric(Request.QueryString("ID")) Then
        RecId = CType(Request.QueryString("ID"), Integer)
    End If
    Dim ConString As String = ConfigurationManager.ConnectionStrings("ConnectionString").ToString
    Using Con As SqlConnection = New SqlConnection(ConString)
        Con.Open()
        ' FILESTREAM access requires an open transaction for the transaction context.
        Dim txn As SqlTransaction = Con.BeginTransaction()
        Dim Sql As String = "SELECT FileData.PathName(), GET_FILESTREAM_TRANSACTION_CONTEXT() as TransactionContext, [FileName], [FileExtension] FROM [QAttach] WHERE RecId = @RecId"
        Dim cmd As SqlCommand = New SqlCommand(Sql, Con, txn)
        cmd.Parameters.Add("@RecID", Data.SqlDbType.Int).Value = RecId
        Dim Rdr As SqlDataReader = cmd.ExecuteReader()
        While Rdr.Read()
            Dim FilePath As String = Rdr(0).ToString()
            Dim objContext As Byte() = DirectCast(Rdr(1), Byte())
            Dim fname As String = Rdr(2).ToString()
            Dim FileExtension As String = Rdr(3).ToString()
            Dim sfs As SqlFileStream = New SqlFileStream(FilePath, objContext, FileAccess.Read)
            Dim Buffer As Byte() = New Byte(CInt(sfs.Length) - 1) {}
            sfs.Read(Buffer, 0, Convert.ToInt32(Buffer.Length))
            ' Close the stream before Response.End, which aborts the thread.
            sfs.Close()
            Response.Buffer = True
            Response.Charset = ""
            Response.Cache.SetCacheability(HttpCacheability.NoCache)
            Response.ContentType = FileExtension
            Response.AddHeader("content-disposition", "attachment;filename=" & fname)
            Response.BinaryWrite(Buffer)
            Response.Flush()
            Response.End()
        End While
    End Using
    Thanks.
    Oren

    @William Bosacker:
    Please accept our apologies for any mistreatment.  While there's certainly no legal recourse for posts on an open forum like this, we do take steps to try to keep the forums a pleasant and friendly place to visit, and toward this end, two other moderators
    have already cleaned the thread and have begun to address the abuse.
    Not to defend the manner in which it was addressed, I must still point out that necro-posting to a thread (this thread is from late 2013) and proposing your own post as an answer are both generally discouraged. Again, I cannot condone the way in which it was presented, but I can understand why the other contributors thought something should be said.
    The recommended way to contribute information like this would be to create a new Discussion thread with content something like:
    "I was experiencing XYZ issue and while searching for a resolution I found this thread [link to old thread].  But after trying A, B, and C, I found that the following actually resolved my problem [code snippet].  I thought this might be helpful
    for anyone else with this issue... yada yada"
    In the case of this original thread, the issue was most certainly permission related.  While the underlying network permissions would certainly need to allow that user to access the server, the root problem may well have been within the SQL table permissions
    themselves.  The OP of the original thread really didn't provide enough context to know if they had the internal database permissions set correctly.
    The information you've provided essentially shows one way to set the table permissions, but it isn't necessarily the only way. It's also possible that the issue could be resolved by modifying permission entries within the SQL manager rather than through a particular script file. So while this information may indeed be helpful to someone in the future, it does not necessarily answer the question of this thread. Only the OP has enough information to know if this can be applied to their situation; and since the thread is several years old and was originally closed by a moderator, there is very little chance that the OP will be back to respond.
    Hopefully this clears the air a little and will allow us all to get back to trying to help the VB development community within the guidelines of the forum.
    Reed Kimble - "When you do things right, people won't be sure you've done anything at all"

  • When to use Filestream partitions?

    We have a web site where we do a lot of document management. We currently have a table with 370,000 records in it. When uploading a new file we check its size, and if it is below 2 GB we store it in a varbinary(max) BLOB column. We now want to alter that table, add a FILESTREAM column, and transfer the data as shown below. As you can see, we are only creating one file folder, and the query will probably run for six hours or so.
    We are also thinking about adding up to 5 million audio files stored in a different area. We could conceivably end up with several terabytes of file data. Should we partition, and if so, how many files should we store in each partition? We are using SQL Server 2012 and Windows Server 2012 R2.
    --Create a ROWGUID column
    USE CUR
    ALTER Table documents
    Add DocGUID uniqueidentifier not null ROWGUIDCOL unique default newid()
    GO
    --Turn on FILESTREAM
    USE CUR
    ALTER Table documents
    SET (filestream_on=FileStreamGroup1)
    GO
    --Add FILESTREAM column to the table
    USE CUR
    ALTER Table documents
    Add DocContent2 varbinary(max) FILESTREAM null
    GO
    -- Move data into the new column
    UPDATE documents
    SET DocContent2=DocContent
    where doccontent is not null and  doccontent2 is null  
    GO
    --Drop the old column
    ALTER Table documents
    DROP column DocContent
    GO
    --Rename the new FILESTREAM column to the old column name
    Use CUR
    GO
    EXEC sp_rename 'documents.DocContent2', 'DocContent', 'Column'
    GO

    Hi tomheaser,
    Quote: Should we partition and if so how many files should we store in each partition?
    Yes, if your database contains very large tables, you may benefit from partitioning those tables onto separate filegroups. SQL Server can then access the drives of each partition at the same time, which can reduce data load time considerably.
    If you only want to reduce query time by increasing the number of filegroups, note that the limit on the maximum number of partitions is 15,000 in SQL Server. But in order to maintain a balance between performance and the number of partitions, you need to consider other things such as memory, partitioned index operations, DBCC commands, and queries. So please consider all those things first, then choose a reasonable number of partitions. For more information about performance guidelines for table partitioning, please refer to the following article:
    http://msdn.microsoft.com/en-us/library/ms190787(v=sql.110).aspx
    If you have any questions, please feel free to let us know.
    Regards,
    Jerry Li
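
    To make the suggestion above concrete, here is a minimal hedged sketch (filegroup and boundary names are hypothetical) of partitioning a large table across filegroups; a FILESTREAM column would additionally need a FILESTREAM_ON clause pointing at a matching partition scheme built from FILESTREAM filegroups:

        -- Assumes filegroups FG1..FG4 already exist in the database.
        CREATE PARTITION FUNCTION pfAudioId (int)
            AS RANGE RIGHT FOR VALUES (1000000, 2000000, 3000000);

        CREATE PARTITION SCHEME psAudioId
            AS PARTITION pfAudioId TO (FG1, FG2, FG3, FG4);

        -- Row data spreads across FG1..FG4 by AudioId range.
        CREATE TABLE dbo.AudioFiles (
            AudioId   int            NOT NULL,
            AudioData varbinary(max) NULL
        ) ON psAudioId (AudioId);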

  • Memory leak in fileStream.readMultiByte?

    Hi everybody,
    after a long session of bug hunting in my iPad application because of a memory leak, I think I found a memory bug in the fileStream class.
    I am using the fileStream class to load XML and CSS files in my application for initial data etc.
    I parsed the fileStream to a string using readMultiByte(), but there seems to be a small (<1 KB) memory leak using this method.
    After switching to fileStream.readUTFBytes() the memory leak seems to be gone.
    Can someone confirm this for me, so that we can submit it to the Adobe bug database?
    Greetings,
    Kriz

    Hi Hank,
    how are you using the fileStream to open your files? If you use fileStream.open(), your application stops everything and waits for the file to be completely loaded before continuing; instead you can use fileStream.openAsync() to open an asynchronous connection and add listeners to the fileStream that execute on completion.
    For your next question, try building your own tweens using Event.ENTER_FRAME and frame counters instead of a tween engine like TweenLite (tween engines have a lot of handles that are still in use even when you are not using them). Also try to use Bitmaps, or cacheAsBitmap items, for GPU rendering. There are a lot of threads in this forum about this question, and the method used really depends on the type of animation.
    Hope that answers your questions,
    Kriz

  • RBS questions on sharepoint 2013

    I need some advice on an RBS setup on SharePoint 2013. I am following the Microsoft article on setting up RBS, where you run some SQL queries for the SQL filestream on the content databases you want RBS to be set up on. Then I download the RBS_amd64.msi file, install it on one web front-end server, and enable it for each content database.
    My first question: I have other web servers, app servers, and DB servers; do I need to install the RBS .msi on every other server in the farm, or only on the web servers and the app servers? That is, do we run that msiexec command on each server?
    Also, how does RBS work for migrated or existing sites which already have documents stored? If we enable RBS for these sites for files over 5 MB, does that apply only to newly uploaded files over 5 MB, or also to files over 5 MB that are already stored?
    Also, does shredded storage impact the file sizes being uploaded as well? If we configure files over 5 MB to be offloaded, does this apply to files just over this limit, or do the files have to be much larger than 5 MB, such as 8 or 9 MB, to have RBS applied to them?
    Can I be advised on these questions?
    Thanks

    I suggest you not use RBS if you have further plans to migrate to newer versions. Still, I will give you the steps for migrating RBS on SharePoint 2010 to 2013. Note that when I migrated to 2013 it was not done by way of RBS: I had to remove RBS from the 2010 server, convert it into a database file of more than 600 GB, and then migrate this DB file to restore on SharePoint 2013 with SQL 2012. Believe me, it took more than 15 days just to copy and paste. So, better not to use RBS if you have large data.
    See the steps below.
    1. Live SharePoint 2010 environment with SQL 2008 R2
       a. Take a backup from the 2010 live server.
          i. Open Management Studio on the SQL server.
          ii. Take a backup of the content database of the live application.
    2. QA SharePoint 2010 environment with SQL 2008 R2
       a. Restore the SQL backup.
          i. Open Management Studio on the SQL server.
          ii. Restore the database.
       b. Create a web application.
          i. Open the SharePoint server.
          ii. Open Central Administration.
          iii. Create a web application with classic authentication.
       c. Dismount the database bound to the existing application.
          i. Open the SharePoint PowerShell on the SharePoint server.
          ii. Run the command below.
             Dismount-SPContentDatabase <Database name>
             Note: Change the database name.
       d. Mount the restored database to the existing application.
          i. Open the SharePoint PowerShell on the SharePoint server.
          ii. Run the command below.
             Mount-SPContentDatabase <Database name> -DatabaseServer <Database server name> -WebApplication <Web application>
             Note: Change the database name and web application URL.
          iii. Open SharePoint Designer, change the master page, and publish it.
          iv. Set the test page as the home page.
          v. Test user login: log in with 2-3 different users and verify that they can log in.
       e. Configure RBS.
          i. Enable FILESTREAM on the database server.
             Open SQL Server Configuration Manager on the SQL server.
             In the left panel, click SQL Server Services.
             In the right panel, select the instance of SQL Server on which you want to enable FILESTREAM.
             Right-click the instance and then click Properties.
             In the SQL Server Properties dialog box, click the FILESTREAM tab.
             Select the "Enable FILESTREAM for Transact-SQL access" check box.
             If you want to read and write FILESTREAM data from Windows, click "Enable FILESTREAM for file I/O streaming access". Enter the name of the Windows share in the Windows Share Name box.
             If remote clients must access the FILESTREAM data that is stored on this share, select "Allow remote clients to have streaming access to FILESTREAM data".
             Click Apply and OK.
          ii. Set the FILESTREAM access level.
             Open SQL Server Management Studio and connect to the SQL database instance.
             Right-click the database instance and open Properties.
             Click Advanced in the left panel.
             Find the "Filestream Access Level" property and set the value to "Full access enabled".
             Click OK and exit the window.
          iii. Set the FILESTREAM access level via T-SQL.
             Open a query window on the root.
             Execute the following query:
             EXEC sp_configure filestream_access_level, 2
             RECONFIGURE
             Restart the SQL services.
             Note: You will get the message "Configuration option 'filestream access level' changed from 2 to 2. Run the RECONFIGURE statement to install."
          iv. Provision a BLOB store for each content database.
             Click the content database for which you want to create a BLOB store, and then click New Query.
             Execute the following queries:
             use [<Database name>]
             if not exists
                 (select * from sys.symmetric_keys
                  where name = N'##MS_DatabaseMasterKey##')
             create master key encryption by password = N'Admin Key Password !2#4'
             Note: Change the database name. You get a "Command(s) completed successfully." message.
             use [<Database name>]
             if not exists
                 (select groupname from sysfilegroups
                  where groupname = N'RBSFilestreamProvider')
             alter database [<Database name>]
             add filegroup RBSFilestreamProvider contains filestream
             Note: Change the database name. You get a "Command(s) completed successfully." message.
             use [<Database name>]
             alter database [<Database name>]
             add file (name = RBSFilestreamFile, filename = '<E:\SQL\Data\PetroChina>')
             to filegroup RBSFilestreamProvider
             Note: Change the database name and store path.
             If you get the message "FILESTREAM file 'RBSFilestreamFile' cannot be added because its destination filegroup cannot have more than one file.", ignore it.
          v. Install the RBS client library on each web server.
             To install the RBS client library on the first web server:
             Open the SharePoint web server.
             Open a command prompt.
             Execute the following command:
             msiexec /qn /lvx* rbs_install_log.txt /i RBS.msi TRUSTSERVERCERTIFICATE=true FILEGROUP=PRIMARY DBNAME=<Database name> DBINSTANCE=<Database server> FILESTREAMFILEGROUP=RBSFilestreamProvider FILESTREAMSTORENAME=FilestreamProvider_1
             Note: Change the database name and database instance name. The DB instance should be <server name\instance name>. Download RBS.msi for the respective SQL version.
             To install the RBS client library on all additional web and application servers:
             Open the SharePoint web server.
             Open a command prompt.
             Execute the following command:
             msiexec /qn /lvx* rbs_install_log.txt /i RBS.msi DBNAME=<Database name> DBINSTANCE=<Database server> ADDLOCAL=Client,Docs,Maintainer,ServerScript,FilestreamClient,FilestreamServer
             Note: Change the database name and database instance name. The DB instance should be <server name\instance name>.
          vi. Enable RBS for each content database.
             You must enable RBS on one web server in the SharePoint farm; it does not matter which web server you select for this activity. You must perform this procedure once for each content database.
             Open the SharePoint web server.
             Open the SharePoint PowerShell.
             Execute the script below:
             $cdb = Get-SPContentDatabase <Database name>
             $rbss = $cdb.RemoteBlobStorageSettings
             $rbss.Installed()
             $rbss.Enable()
             $rbss.SetActiveProviderName($rbss.GetProviderNames()[0])
             $rbss
             Note: Change the database name.
          vii. Test the RBS installation.
             On the computer that contains the RBS data store, browse to the RBS data store directory and note its size.
             On the SharePoint farm, upload a file that is at least 100 kilobytes (KB) to a document library.
             On the computer that contains the RBS data store, browse to the RBS data store directory again and confirm its size. It must be larger than before.
          viii. Test user login: log in with 2-3 different users and verify that they can log in.
       f. Migrate BLOBs from RBS to the SQL database and completely remove RBS.
          i. Migrate all content from RBS to SQL and disable RBS for the content DB:
             Open the SharePoint server.
             Open the SharePoint Management PowerShell.
             Execute the script below:
             $cdb = Get-SPContentDatabase <Database name>
             $rbs = $cdb.RemoteBlobStorageSettings
             $rbs.GetProviderNames()
             $rbs.SetActiveProviderName("")
             $rbs.Migrate()
             $rbs.Disable()
             Note: Migrate() might take some time depending on the amount of data in your RBS store. Change the database name.
             If you get messages on the PowerShell like "PS C:\Users\sp2010_admin> $rbs.Migrate()
             Could not read configuration for log provider <ConsoleLog>. Default value used.
             Could not read configuration for log provider <FileLog>. Default value used.
             Could not read configuration for log provider <CircularLog>. Default value used.
             Could not read configuration for log provider <EventViewerLog>. Default value used.
             Could not read configuration for log provider <DatabaseTableLog>. Default value used."
             then wait a while; it takes some time for the migration to start.
          ii. Change the default RBS garbage collection window to 0 on your content DB:
             Open the SQL server.
             Open SQL Server Management Studio.
             Select your content DB and open a new query window.
             Execute the SQL queries below, one at a time:
             exec mssqlrbs.rbs_sp_set_config_value 'garbage_collection_time_window', 'time 00:00:00'
             exec mssqlrbs.rbs_sp_set_config_value 'delete_scan_period', 'time 00:00:00'
             You will get a "Command(s) completed successfully." message.
          iii. Run the RBS Maintainer (and disable the task if you scheduled it):
             Open the SharePoint server.
             Open a command prompt.
             Run the command below:
             "C:\Program Files\Microsoft SQL Remote Blob Storage 10.50\Maintainer\Microsoft.Data.SqlRemoteBlobs.Maintainer.exe" -connectionstringname RBSMaintainerConnection -operation GarbageCollection ConsistencyCheck ConsistencyCheckForStores -GarbageCollectionPhases rdo -ConsistencyCheckMode r -TimeLimit 120
          iv. Uninstall RBS:
             Open the SQL server.
             Open SQL Server Management Studio.
             On your content DB, run the SQL query below:
             exec mssqlrbs.rbs_sp_uninstall_rbs 0
             Note: If you get the message "The RBS server side data cannot be removed because there are existing BLOBs registered. You can only remove this data by using the force_uninstall parameter of the mssqlrbs.rbs_sp_uninstall stored pro", then run "exec mssqlrbs.rbs_sp_uninstall_rbs 1".
             You will get a "Command(s) completed successfully." message.
          v. Uninstall SQL Remote Blob Storage from Add/Remove Programs.
             I found that there were still FILESTREAM references in my DB, so remove that reference:
             Open the SQL server.
             Open SQL Server Management Studio.
             Run the SQL queries below on your content DB, one at a time:
             ALTER TABLE [mssqlrbs_filestream_data_1].[rbs_filestream_configuration] DROP column [filestream_value]
             ALTER TABLE [mssqlrbs_filestream_data_1].[rbs_filestream_configuration] SET (FILESTREAM_ON = "NULL")
          vi. Now you can remove the file and filegroup for filestream:
             Open the SQL server.
             Open SQL Server Management Studio.
             Open a new query window on the root.
             Execute the SQL query below:
             ALTER DATABASE <Database name> REMOVE FILE RBSFilestreamFile;
             Note: Change the database name.
             If it gives the message "The file 'RBSFilestreamFile' cannot be removed because it is not empty.", then remove all tables prefixed with "mssqlrbs_" from your database and execute the SQL query again.
             This query takes time depending on your database size (almost 30 min).
             You will get a "The file 'RBSFilestreamFile' has been removed." message.
             Execute the SQL query below:
             ALTER DATABASE <Database name> REMOVE FILEGROUP RBSFilestreamProvider;
             Note: Change the database name.
             You will get a "The filegroup 'RBSFilestreamProvider' has been removed." message.
             Or, if you get the message "Msg 5524, Level 16, State 1, Line 1 Default FILESTREAM data filegroup cannot be removed unless it's the last FILESTREAM data filegroup left.", ignore it.
          vii. Remove the BLOB store installation.
             Open the SharePoint server.
             Run the RBS.msi setup file and choose the Remove option.
             Finish the wizard.
          viii. Disable FILESTREAM in SQL Server Configuration Manager.
             Disable FILESTREAM in SQL Server Configuration Manager for your instance (if you do not use it anywhere besides this single content DB with SharePoint), restart SQL, run an IIS reset, and test.
          ix. Test whether RBS was removed.
             On the computer that contains the SQL database, note the size of the SQL database (.mdf file).
             On the SharePoint farm, upload a file that is at least 100 kilobytes (KB) to a document library.
             On the computer that contains the SQL database, confirm the size of the SQL database. It must be larger than before. If there is no difference, just check that the store is no longer in SQL.
          x. Test user login: log in with 2-3 different users and verify that they can log in.
       g. Convert classic-mode web applications to claims-based authentication.
          i. Open the SharePoint server.
          ii. Open the SharePoint PowerShell.
          iii. Execute the script below:
             $WebAppName = "<URL>"
             $wa = get-SPWebApplication $WebAppName
             $wa.UseClaimsAuthentication = $true
             $wa.Update()
             $account = "<Domain name\User name>"
             $account = (New-SPClaimsPrincipal -identity $account -identitytype 1).ToEncodedString()
             $wa = get-SPWebApplication $WebAppName
             $zp = $wa.ZonePolicies("Default")
             $p = $zp.Add($account,"PSPolicy")
             $fc = $wa.PolicyRoles.GetSpecialRole("FullControl")
             $p.PolicyRoleBindings.Add($fc)
             $wa.Update()
             $wa.MigrateUsers($true)
             $wa.ProvisionGlobally()
          iv. Test user login: log in with 2-3 different users and verify that they can log in.
       h. Take a SQL backup from the QA server.
          i. Open the SQL server.
          ii. Open Management Studio on the SQL server.
          iii. Select the content database.
          iv. Take a backup of the content database.
          Information: This SQL backup does not contain the RBS content.
    3. New SharePoint 2013 environment with SQL 2012
       a. Restore the SQL backup.
          i. Open the SQL server.
          ii. Open SQL Server Management Studio.
          iii. Restore the SQL database using the *.bak file.
       b. Dismount the database bound to the existing application.
          i. Open the SharePoint server.
          ii. Open the SharePoint Management PowerShell.
          iii. Execute the script below:
             Dismount-SPContentDatabase <Database name>
             Note: Change the database name to the one bound to the existing application.
       c. Mount the restored database to the existing application.
          i. Open the SharePoint server.
          ii. Open the SharePoint Management PowerShell.
          iii. Execute the script below:
             Mount-SPContentDatabase <Database name> -DatabaseServer <Database server name> -WebApplication <URL>
             Note: Change the database name to the newly restored database name.
             Change the database server name in the form "DB server name\DB instance name".
             Change the URL of the web application.
             This command takes some time.
       d. Upgrade the site collection.
          i. Open the SharePoint server.
          ii. Open the new site.
          iii. You will find a message on top: "Experience all that SharePoint 15 has to offer. Start now or Remind me later".
          iv. Click "Start".
          v. Click "Upgrade this Site Collection".
          vi. Click "I Am ready".
          vii. After some time you will get the message "Upgrade Completed Successfully".
          viii. Test user login.

  • File Upload - SQL 2008 FileStream

    Hi,
    In SQL 2008, there is something called FILESTREAM.  It is a new feature of SQL Server 2008, which allows storage of and efficient access to BLOB data using a combination of SQL Server 2008 and the NTFS file system.
    Has anyone used ColdFusion to upload a document into a database table that has filestream columns?
    I want to know how I can upload a document from ColdFusion and get it into the database.
    Any help appreciated
    -ws

    Here are the basic steps for uploading a file and storing it in your database.
    1. Create a page with a form that includes a file tag. This form will submit to an 'action.cfm' page. Note that the name of the .cfm page is not important; 'action.cfm' is an example.
    2. On your 'action.cfm', use the CFFILE tag with action="upload" to save the file to your filesystem.
    3. Use the CFFILE tag with action="readbinary" to read the contents of the uploaded file into a variable. You may also want to use CFFILE to delete the uploaded file after it has been read into a variable.
    4. Use CFQUERY with CFQUERYPARAM to insert the file contents.
    Please reply if you have more questions.
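
    If it helps, here is a hedged T-SQL sketch (table and column names hypothetical) of a FILESTREAM table the step-4 CFQUERY could target; the bytes read in step 3 are bound to the insert as a BLOB-typed parameter via CFQUERYPARAM:

        -- Hypothetical target table; the database needs a FILESTREAM filegroup,
        -- and FILESTREAM tables require a ROWGUIDCOL column.
        CREATE TABLE dbo.UploadedFiles (
            FileId   uniqueidentifier ROWGUIDCOL NOT NULL UNIQUE DEFAULT NEWID(),
            FileName nvarchar(260)    NOT NULL,
            FileData varbinary(max)   FILESTREAM NULL
        );

        -- The statement the CFQUERY would run; both values are supplied
        -- by CFQUERYPARAM (the file bytes as a binary/BLOB parameter).
        INSERT INTO dbo.UploadedFiles (FileName, FileData)
        VALUES (@FileName, @FileData);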

  • SQL Server 2012 Database with FileStream enabled tables

    Hi,
    I have some questions concerning the SQL Server 2012 FileStream feature.
    In a database combining both filestream-connected tables and non-filestream tables, it is obviously possible to specify the root path for a FILESTREAM filegroup. It's also possible to create a primary data file (.mdf), several optional secondary data files (.ndf), and multiple log files.
    Suppose I have two filestream-connected tables, each of which in turn has a couple of referenced (non-filestream) tables. Is it possible to put filestream filegroup 1 (e.g. filestream-connected table 1) and its referenced non-filestream tables, their data, indexes etc., on one physical data file, and the other filestream-connected table and its referenced tables on another physical file (.ndf)? If this is possible and recommended, how do I declare such a CREATE DATABASE statement? For example, when keeping tables for both the non-archived state and the archived state in the same database. Or is the best solution to split the two (and their referenced tables) into separate databases?
    SWEDEV

    Hello,
    Filegroups are just containers for objects; you can, for example, split one table over several filegroups/secondary files using partitioning. And a FILESTREAM table is also just a table. Tables can reference each other independent of filegroups / filestream.
    Olaf Helper
    [ Blog] [ Xing] [ MVP]
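
    Regarding the CREATE DATABASE declaration: note that FILESTREAM data always lives in its container directories, never inside the .mdf/.ndf files, so what you can co-locate on a chosen .ndf is the tables' row data and indexes. A hedged sketch (all names and paths hypothetical) with a regular and a FILESTREAM filegroup per "state":

        CREATE DATABASE ArchiveDemo
        ON PRIMARY
            (NAME = ArchiveDemo_data, FILENAME = 'D:\Data\ArchiveDemo.mdf'),
        FILEGROUP FG_Current
            (NAME = Current_data,     FILENAME = 'D:\Data\ArchiveDemo_current.ndf'),
        FILEGROUP FG_Archive
            (NAME = Archive_data,     FILENAME = 'E:\Data\ArchiveDemo_archive.ndf'),
        FILEGROUP FS_Current CONTAINS FILESTREAM
            (NAME = Current_fs,       FILENAME = 'D:\FS\ArchiveDemo_current'),
        FILEGROUP FS_Archive CONTAINS FILESTREAM
            (NAME = Archive_fs,       FILENAME = 'E:\FS\ArchiveDemo_archive')
        LOG ON
            (NAME = ArchiveDemo_log,  FILENAME = 'F:\Log\ArchiveDemo_log.ldf');

        -- Each table then picks its filegroups, e.g.:
        -- CREATE TABLE dbo.CurrentDocs (...) ON FG_Current FILESTREAM_ON FS_Current;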

  • When to use FileTable, and DB design questions

    I have a kind of interesting scenario. Let's say I have a list of people I wish to store in the database, so I create a table 'Person'. Now let's say each person has 2000 large image files (say high-res photos or something), each image > 5 MB. What I need to do at the end of the day is retrieve a person and, upon selection, display a selected photo (or photos) in my C# WPF app.
    I was looking at FileTable to store the data on disk, as this would be preferable to storing all these images as BLOBs. But I wasn't sure how to associate the Person with their images through the FileTable.
    Can I add additional columns to a FileTable (like an FK to Person)? Not sure if that is the right approach. Maybe FILESTREAM is better? Is it better in a case like this to just store a path in the database? Which leads to the next question...
    I also need to store other kinds of data, like Deep Zoom data. This is interesting stuff, as it's about 600 MB of data and exists in a single directory structure with multiple levels of data (and images). Apps able to read this need extremely fast access to the data to render it quickly. I think in this case I would need to just store the location of the DZ data on disk. But that seems fragile. Is there a standard accepted way to store file locations in the DB?
    Thoughts and advice appreciated!

    Okay, filestream it is.
    What about storing file locations? Also, if I write a 10 MB file to a filestream, do I write it to the DB as if it were a BLOB? Is it slower to write to the DB than directly to the file system using filestream?
    I read the whitepaper on filestream and hunted around but couldn't seem to find this info...
    Please read the URL below; it clearly explains the answer to your question, and a script example is also given there:
    http://technet.microsoft.com/en-us/library/gg492083.aspx#HowToMigrateFiles
    Loading or Migrating Files into a FileTable
    The method that you choose for loading or migrating files into a FileTable depends on where the files are currently stored:
    1. Files are currently stored in the file system, and SQL Server has no knowledge of them. Since a FileTable appears as a folder in the Windows file system, you can easily load files into a new FileTable by using any of the available methods for moving or copying files. These methods include Windows Explorer, command-line options including xcopy and robocopy, and custom scripts or applications. You cannot convert an existing folder to a FileTable.
    2. Files are currently stored in the file system, and SQL Server contains a table of metadata that contains pointers to the files. The first step is to move or copy the files by using one of the methods mentioned above. The second step is to update the existing table of metadata to point to the new location of the files. For more information, see Example: Migrating Files from the File System into a FileTable in this topic.
    Please use Marked as Answer if my post solved your problem and use Vote As Helpful if a post was useful. By ganeshk
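
    On the original FK question, a hedged sketch (names hypothetical): a FileTable's schema is fixed, so you cannot add your own columns, but its stream_id column is unique, and a companion table can link people to files through it:

        -- Requires FILESTREAM with non-transacted access enabled on the database.
        CREATE TABLE dbo.PersonImages AS FILETABLE
            WITH (FILETABLE_DIRECTORY = 'PersonImages');

        -- Companion table associating a person with an image in the FileTable.
        CREATE TABLE dbo.PersonImageLink (
            PersonId      int              NOT NULL,  -- FK to your Person table
            ImageStreamId uniqueidentifier NOT NULL   -- matches PersonImages.stream_id
        );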

  • Migrated data from RBS to datafiles but data still remains in FileStream

    I have a content database with RBS enabled on SQL Server 2012 filestream. I migrated the data from filestream to SQL with the script:
    $cdb = Get-SPContentDatabase <database_name>
    $rbs = $cdb.RemoteBlobStorageSettings
    $rbs.SetActiveProviderName("")
    $rbs.Migrate()
    $rbs.Disable()
    I have seen the data being uploaded into SQL (the data files increased after migration),
    and checked that all is OK:
    $rbs.Installed() # shows true
    $rbs.Enabled # shows false
    Then I ran the following in the SQL back-end database:
    exec mssqlrbs.rbs_sp_set_config_value 'garbage_collection_time_window','time 00:00:00'
    exec mssqlrbs.rbs_sp_set_config_value 'delete_scan_period','time 00:00:00'
    However, the data is still in the filestream store.
    How do I get rid of the filestream store data (since it was migrated to the SQL data files)?

    Check this link and make sure you followed the steps:
    http://alipka.wordpress.com/2010/06/19/how-to-disable-rbs-in-sharepoint-2010/
    or Step K on this blog:
    http://blogs.technet.com/b/pramodbalusu/archive/2011/07/09/rbs-and-sharepoint-2010.aspx
    Please remember to mark your question as answered and vote helpful if this solves/helps your problem. Thanks -WS MCITP (SharePoint 2010, 2013) Blog: http://wscheema.com/blog

  • Is filestream filegroup size considers in database size?

    While calculating database size, is it necessary to consider the filestream filegroup size?
    The sp_spaceused stored procedure returns the size of the database, but it considers only the data and log files.
    My question is: should we consider the filestream filegroup size while calculating database size, and is there any other way which already considers the filestream filegroup size?

    Hi priyanka,
    Since you can get the size of the filestream files via a T-SQL statement and sys.database_files, I will mark that post as the answer to your question. That way, other community members can benefit from this sharing. Thanks for your understanding.
    Regards,
    Sofiya Li
    TechNet Community Support
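
    A short sketch of the query referred to above: sys.database_files lists every file in the database, including the FILESTREAM container(s), with size reported in 8 KB pages (for FILESTREAM containers it reflects the space currently used):

        -- Per-file sizes, FILESTREAM included:
        SELECT name, type_desc, size * 8.0 / 1024 AS size_mb
        FROM sys.database_files;

        -- Rolled up by file type (ROWS, LOG, FILESTREAM):
        SELECT type_desc, SUM(size * 8.0 / 1024) AS size_mb
        FROM sys.database_files
        GROUP BY type_desc;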

  • Bytearray uncompress filestream wirtebytes error

    Hi,
    I'm developing an app for iPad in which I'm trying to load a zip file from a remote URL, save that zip file on the device, and then try to unzip it.
    It works fine both on the laptop (while testing as an AIR app) and also on the device (iPad) if the zip file size is around 20-25 MB, with its internal file being around 140-170 MB. But if the zip file size is larger (beyond 50 MB) then it crashes while uncompressing the zip file on the device. It still works fine on the laptop though.
    I'm compiling the code with Flex 4.6 overlaid on AIR 3.6.
    The iPad iOS version is 6.0.1.
    Using following code :
    private function unzipFile():void
    {
        progress = progress + fileName + ' unzipFile started\n';
        saveLog();
        var zStream:FileStream = new FileStream();
        var bytes:ByteArray = new ByteArray();
        var fileName:String = new String();
        var flNameLength:uint;
        var xfldLength:uint;
        var offset:uint;
        var compSize:uint;
        var uncompSize:uint;
        var compMethod:int;
        var signature:int;
        progress = progress + fileName + ' before zStream.open\n';
        saveLog();
        zStream.open(fileLocal, FileMode.READ);
        progress = progress + fileName + ' after zStream.open\n';
        saveLog();
        bytes.endian = Endian.LITTLE_ENDIAN;
        while (zStream.position < fileLocal.size)
        {
            progress = progress + fileName + ' before zStream.readBytes(bytes, 0, 30)\n';
            saveLog();
            // Read the 30-byte local file header of the next zip entry.
            zStream.readBytes(bytes, 0, 30);
            progress = progress + fileName + ' after zStream.readBytes(bytes, 0, 30)\n';
            saveLog();
            bytes.position = 0;
            signature = bytes.readInt();
            if (signature != 0x04034b50)
                break;
            bytes.position = 8;
            compMethod = bytes.readByte();
            offset = 0;
            bytes.position = 26;
            flNameLength = bytes.readShort();
            offset += flNameLength;
            bytes.position = 28;
            xfldLength = bytes.readShort();
            offset += xfldLength;
            zStream.readBytes(bytes, 30, offset);
            bytes.position = 30;
            fileName = bytes.readUTFBytes(flNameLength);
            bytes.position = 18;
            compSize = bytes.readUnsignedInt();
            bytes.position = 22;
            uncompSize = bytes.readUnsignedInt();
            progress = progress + fileName + ' before zStream.readBytes(bytes, 0, compSize)\n';
            saveLog();
            zStream.readBytes(bytes, 0, compSize);
            progress = progress + fileName + ' after zStream.readBytes(bytes, 0, compSize)\n';
            saveLog();
            if (compMethod == 8)
            {
                try
                {
                    progress = progress + fileName + ' before bytes.uncompress\n';
                    saveLog();
                    bytes.uncompress(CompressionAlgorithm.DEFLATE);
                    //bytes.uncompress(CompressionAlgorithm.LZMA);
                    progress = progress + fileName + ' after bytes.uncompress\n';
                    saveLog();
                    //outFile(fileName, bytes);
                }
                catch (error:Error)
                {
                    progress = progress + fileName + ' bytes.uncompress catch\n';
                    saveLog();
                }
            }
            //write bytes to a file locally here
        }
    }
    It fails on this line:
    bytes.uncompress(CompressionAlgorithm.DEFLATE);
    and execution ends up inside the catch block.
    To avoid this problem, I also tried to load the file that was inside the zip directly from a remote URL (the file size and download time will be greater), but in that case, after loading the file and reading its ByteArray data, it crashes again when I try to write the ByteArray to a FileStream!
    Just wanted to know: is there any file size limit on mobile/iOS devices while unzipping a file and while writing ByteArray data to the filesystem, or am I doing something wrong here?
    Kindly help, as I'm stuck with this and cannot really proceed on this project using Flex AIR if this doesn't work.
    -Deepak

    Hi Brent,
    Yes, for unzipping we really have to split the zip files, as uncompress doesn't work for larger zip files (it crashes on the device).
    Since my file size would range from 200-400 MB, for now I'm planning to load the raw file directly (running out of time to deliver it :| ). This too was failing when I tried readBytes/writeBytes after loading the file completely (since it runs out of memory). But I came across a solution right here:
    http://stackoverflow.com/questions/14583247/air-as3-download-large-file-trough-ipad-application
    It basically writes data to the disk as and when data gets downloaded in chunks. I felt it was a great idea! Apparently there won't be any memory issues with that approach either.
    And yes, as you mentioned, the future plan would be to load and unzip zip files.
    Thanks Brent, that helped too.

  • Monitoring Filestream database files

    Hello everyone,
    I've got two MS SQL 2012 databases - each one with three database files.
    In SCOM 2012 R2, every database file is monitored correctly except the one with file type "FILESTREAM Data".
    Here the green tick is missing under Health Explorer ("Rows Data" and "Log" are monitored and have a green tick). All monitors are enabled.
    Is there a possibility to monitor database files with file type "FILESTREAM Data" in SCOM?
    Thanks in advance!
    Best regards,
    Hermann

    The SQL MP monitors DB data files and DB logs by default. For data files of the "FILESTREAM Data" type, if we want to monitor them from the SCOM side, we need to create a custom management pack, which will require involving your development team and department. Thanks for your understanding.
    Please remember to click “Mark as Answer” on the post that helps you, and to click “Unmark as Answer” if a marked post does not actually answer your question. This can be beneficial to other community members reading the thread.
