Searching a Remote Data Store

We have a document storage system that we would like to integrate into our portal search. We seem to have two major issues to overcome:
- the documents (metadata and data) are stored in a database
- the security of the document management system is extremely complex, and any search results would have to be passed through the external system, which would then apply security
Has anyone done this type of integration? We can access a web service from the external system to get the full text of a document, but I don't know whether I can make the portal build a full-text index of that data. Then, when users run a search in the portal, it would search the Knowledge Directory and ask the external system whether the user is allowed to see each result.
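The flow described above (index centrally, then security-trim each hit through the external system) can be sketched as follows. This is a minimal illustration in Python; `portal_search` and `external_system_allows` are hypothetical placeholders, not real portal or web-service APIs:

```python
def portal_search(query):
    # Placeholder for the portal's Knowledge Directory search.
    # Returns a list of document IDs (illustration only).
    return ["doc-1", "doc-2", "doc-3"]

def external_system_allows(user, doc_id):
    # Placeholder for the external document system's security check,
    # e.g. one web-service call per candidate result.
    return doc_id != "doc-2"

def secure_search(user, query):
    """Search the portal index, then security-trim the results by
    asking the external system about each hit before showing it."""
    hits = portal_search(query)
    return [d for d in hits if external_system_allows(user, d)]

print(secure_search("alice", "contract"))  # doc-2 is filtered out
```

One per-result call can be slow for large result sets, so a real integration would typically batch the security checks.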

Toby,
At this moment, there is no way that bytecode or scripting languages can perform better than nicely implemented C++ code. An interpreter or a virtual machine is still an extra step between the code and the processor. I'd love to see this change, and I really hope you're right!

This is what HotSpot is all about. It optimizes your code on the fly for the particular processor you are running on.
For example, if I write a C++ program, I'm not going to optimize it for Pentium IIIs. I probably won't even optimize it for Pentium IIs. My Java program, however, can be optimal for whatever platform it runs on.
The idea that there is a whole lot of interpretation going on is a throwback to the old days of JDK 1.1. Ever since the JIT compiler was introduced in 1.2, Java bytecode ends up being compiled down to native instructions.
The truth is that, given an infinite amount of time, you could always write a program in assembly that would be faster than the same program written in any other high-level language. The more important truth is that nobody has that much time.
God bless,
-Toby Reyelts

Similar Messages

  • Search across multiple data stores

    Hello,
    I have 1 Endeca Server with 2 different data stores in it. I would like to search, for example, for "Marco" in the first data store and also get back the records from the other data store where "Marco" appears in the text. Is that possible?
    Thanks.
    Marco
    Edited by: Marco Snels (AortaBI)(NL) on 9 May 2012 2:21

    Hi Marco -
    The search box component can be configured for use against multiple search interfaces, potentially from multiple datastores. In this case, the expectation is that during configuration you would specify a target page for each search interface's datastore in use, and those pages would have components focused on that datastore. The end user can then toggle between the different search interface options, and when they submit their search they will be pushed to a page with breadcrumbs, a results list, etc. focused on that datastore. All other components are tied to a single datastore and do not merge results from multiple datastore instances. The more common case is that the search box is configured with a single datastore in use.
    Jason

  • Wildcard search in data store

    I'm trying to search using a wildcard in an AOP data store.
    This produces a result:
    Back EMF Date: 2012-08-20 Time: 14:16:04
    But this does not
    Back*
    How can I use wildcards in a data store search?  

    I would assume all of the following should return correct results:
    dim query : set query = store.CreateQuery
    query.ReturnType = "Measurement"
    query.Conditions.Add "Measurement", "name", "=", "Back*"
    store.Search(query)
    store.GetElementList("Measurement", "name=Back%", true)
    store.GetElementList("Measurement", "name=Back*", false)
    store.GetElementList("Measurement", "name=Back*")
    Please take into account that AOP does not support case-insensitive queries.
    I assume that you are using an AVL Puma. You might need to set a switch in the Advanced Dialog of the AOP connection: "Use Like instead of MATCH".
    In the connection parameters this shows up as
    <santorinlike>YES</santorinlike>
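    The MATCH-vs-LIKE switch above comes down to two wildcard syntaxes: `*` (glob-style) versus `%` (SQL LIKE). A small sketch of the difference, in Python for illustration; the `translate_wildcard` helper is hypothetical, not part of any AOP API:

    ```python
    import fnmatch

    def translate_wildcard(pattern):
        """Convert a '*'-style (MATCH-like) pattern to a SQL LIKE pattern."""
        return pattern.replace("*", "%").replace("?", "_")

    names = ["Back EMF", "Back Pressure", "Front EMF"]

    # Glob-style matching, as the query condition "Back*" intends:
    hits = [n for n in names if fnmatch.fnmatchcase(n, "Back*")]
    print(hits)                         # ['Back EMF', 'Back Pressure']

    print(translate_wildcard("Back*"))  # 'Back%' -- what a LIKE backend expects
    ```

    If the backend only understands LIKE, a `Back*` pattern sent as-is matches nothing, which is consistent with the behaviour reported above.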

  • LDAP Authentication in a separate data store

    I am running AM 7.0 in Legacy Mode. I want to have a sub-organization (or realm) do its authentication in a different LDAP from the one AM uses as its data store. I have done the same successfully in AM6.1.
    I modified the LDAP Authentication module in the realm to point to the other LDAP. I can now log in to the sub-organization/realm against the secondary LDAP. However, when AM searches for attributes after the login, it uses the search dn that is specified for the LDAP module. What I want it to do is to use the main AM repository for attributes, roles, etc., and only validate credentials against the remote LDAP.
    In 6.1 this was the default behavior. Modifying LDAP Authentication did not affect other behaviors of AM, only the authentication.
    Advice?

    do you have "Return User DN to Authenticate" enabled on the LDAP module? If so, turn it off and see what happens.

  • What do you recommend to use as an offline data store, since SQL CE support is not in VS 2013?

    A few years back I was architecting an occasionally connected .Net desktop application. 
    VS 2010 was offering full support for Microsoft Sync Framework and SQL CE with Entity Framework. 
    This seemed like the perfect marriage, so I ran with it, and the resulting software solution is still successfully running in production, years later. 
    Jump forward to today, and I am architecting a new occasionally connected .Net desktop application. 
    I was really looking forward to taking advantage of the advances made by Microsoft in using the tools built into VS 2013. 
    However, what I discovered has dumbfounded me.  VS 2013 has no designer support for Sync Framework, and worse, built-in support for SQL CE has been completely removed, including the ability to generate Entity Framework models from a CE database using the designer.
    My question to the community is, what tools should I be using to solve the problem of offline storage in my brand new .Net application? 
    I am aware of ErikEJ’s SQL Server Compact Toolbox, which brings back some support for these features in VS 2013, but it is not as fully featured as the VS 2010 native support was, plus it does not have the institutional “Microsoft” stamp on it. 
    I am building a multimillion dollar corporate solution that I will have to support for many years.
     I would like to have some comfort that the technologies I select, today, will still be supported 5 years from now, unlike the way Microsoft has discontinued supporting Sync Framework and CE in the most recent VS. 
    I can accept open source technologies, because there is a community behind them, or off the shelf corporate solutions, since they will be driven by financial gain, but I have trouble committing to a solution that is solely supported by an individual,
    even if that person is a very talented Microsoft MVP.
    Some of the features of SQL CE that I would like to keep are:
    - Built-in encryption of the file on disk
    - Easy querying with an ORM, like Entity Framework
    - Tools to easily sync up the offline data store with values from SQL Server (even better if this can be done remotely, such as over WCF)
    - Does not require installation of additional software on the client machine, as SQL Express would
    Please, provide your feedback to let me know how you have achieved this, without resorting to simply using an older version of VS or Management Studio. 
    Thank you.

    Hello,
    Based on your description, you can try to use SQL Server 2012 Express LocalDB.
    LocalDB is created specifically for developers. It is very easy to install and requires no management, but it offers the same T-SQL language, programming surface, and client-side providers as the regular SQL Server Express.
    SQL Server LocalDB can work with Entity Framework and the ADO.NET Sync Framework. However, there is no built-in encryption feature in LocalDB to encrypt the database, so you should encrypt/decrypt data on your own, for example using Cryptographic Functions.
    Reference: SQL Express v LocalDB v SQL Compact Edition
    Regards,
    Fanny Liu
    TechNet Community Support

  • Why do i get a run time error box pop up and then close itunes when I search in the iTunes store?

    When I search in the iTunes Store, a run-time error pops up and then closes my iTunes. Why is it doing this? I have to go through Power Search to get around the problem, which is not as convenient. Thanks, Jamboy72

    This issue can be caused by the presence of leftover files in the Firefox program folder (defaults\pref) like a file firefox.js that overrides the update URL with a wrong link.
    Do a clean reinstall and be sure to remove the Firefox program folder to remove the file(s) that cause the problem.
    Your bookmarks and other profile data are stored elsewhere in the Firefox Profile Folder and won't be affected by a reinstall, but make sure that you do not select to remove personal data if you uninstall Firefox.
    * http://kb.mozillazine.org/Profile_folder_-_Firefox
    See:
    * [/questions/826858]

  • I upgraded to ios 6 and every time i try to search apps the app store crashes

    I upgraded to iOS 6, and every time I try to search apps the App Store crashes. Help!

    Download them again in the App Store purchased tab. As long as you use the same ID that you bought the apps with, you do not have to pay for them again.
    This will recover your apps, but all of your data will be gone.

  • Best Practices for Remote Data Communication?

    Hello all
    I am developing a full-fledged website in Flex 3.4 and the Zend Framework (PHP). I am using the Zend_AMF class in the Zend Framework for communicating data to the remote server.
    I will be communicating with the database in the following ways:
    - get data from the server
    - send form data to the server
    - send requests to the server to get data in response
    Right now I have created just a simple login form which sends two fields, username and password, to a method in the service class on the remote server.
    Here is a little peek into how I did that...
    <?xml version="1.0" encoding="utf-8"?>
    <mx:Application xmlns:mx="http://www.adobe.com/2006/mxml">
      <mx:RemoteObject id="loginService" fault="faultHandler(event)" source="LoginService" destination="dest">
        <mx:method name="doLogin" result="resultHandler(event)" />
      </mx:RemoteObject>
      <mx:Script>
        <![CDATA[
          import mx.rpc.events.FaultEvent;
          import mx.rpc.events.ResultEvent;
          import mx.controls.Alert;
          private function resultHandler(event:ResultEvent):void {
            Alert.show("Welcome " + txtUsername.text + "!!!");
          }
          private function faultHandler(event:FaultEvent):void {
            Alert.show("Login failed: " + event.fault.faultString);
          }
        ]]>
      </mx:Script>
      <!-- Login Panel -->
      <mx:VBox>
        <mx:Box>
          <mx:Label text="LOGIN"/>
        </mx:Box>
        <mx:Form>
          <mx:FormItem>
            <mx:Label text="Username"/>
            <mx:TextInput id="txtUsername"/>
          </mx:FormItem>
          <mx:FormItem>
            <mx:Label text="Password"/>
            <mx:TextInput id="txtPassword" displayAsPassword="true" width="100%"/>
          </mx:FormItem>
          <mx:FormItem>
          <mx:Button label="Login" id="loginButton" click="loginService.doLogin(txtUsername.text, txtPassword.text)"/>
          </mx:FormItem>
        </mx:Form>
      </mx:VBox>
    </mx:Application>
    This works fine. But if I create a complicated form with many fields, it would be almost unbearable to send each field as a separate argument of a function.
    Another method is to use HTTPService, which supports XML-like requests and responses.
    I want to ask: what are the best practices in Flex for remote data communication on a large scale? Perhaps using some classes or objects that store the data? Can somebody guide me on how to approach data storage?
    Thanks and Regards
    Vikram

    Oh yes, I have studied Cairngorm, though I haven't really applied it. I understand that it helps in separating the data models, presentation, and business logic into various layers.
    Although what I am looking for is something about data models, maybe?
    Thanks and Regards
    Vikram

  • Generic LDAPv3 data store help

    hey,
    We are running Access Manager 7.1 patch 1 on Sun App Server 8.2/Solaris 10 sparc. We are receiving the following error in the console when we try to add a Generic LDAPv3 data store:
    "The attribute name sun-idrepo-ldapv3-ldapv3Generic does not match the service schema"
    the console error log says:
    "2008-10-01 10:00:50"   "/|testldap|The attribute name sun-idrepo-ldapv3-ldapv3Generic does not match the service schema"       amConsole.error CONSOLE-644 dc=example,dc=com      e4a92e04b361e73401      SEVERE  id=amadmin,ou=user,dc=example,dc=com   XX.XX.XX.XX     "cn=dsameuser,ou=DSAME Users,dc=example,dc=com" amhostany ideas on the issue?

    SysHex wrote:
    Hi there
    Using AM7.1 RTM
    Been having a bit of trouble configuring a data store for Active Directory. I actually got it working, just not the way I want it to work.
    I got an LDAP tree in this AD that looks somewhat like this :
    dc=company
    ou=section1,dc=company
    ou=section2,dc=company
    ou=users,ou=section1,dc=company
    ou=users,ou=section2,dc=company
    inside each ou=users I got several users such as :
    cn=aaa1,ou=users,ou=section1,dc=company
    cn=aaa2,ou=users,ou=section1,dc=company
    cn=aaa3,ou=users,ou=section1,dc=company
    and
    cn=bbb1,ou=users,ou=section2,dc=company
    cn=bbb2,ou=users,ou=section2,dc=company
    cn=bbb3,ou=users,ou=section2,dc=company
    What I'm looking for is to configure a DataStore that has all these users in it.
    I have managed to create a data store that contains either all users in section1 or all users in section2, but no data store with all users from section1 and section2.
    When I use
    LDAP Organization DN: ou=section2,dc=company
    or
    LDAP Organization DN: ou=section1,dc=company
    and
    LDAP People Container Naming Attribute : ou
    LDAP People Container Value: users
    I get the users from each of the sections.
    So, till here everything is working the way it should.
    Now, trying to configure the data store to contain the users of both, I do
    LDAP Organization DN: dc=company
    and I set
    LDAPv3 Plug-in Search Scope: SCOPE_SUB
    and it doesn't show any of the users.
    Am I doing anything wrong?
    Thanks for your help.

    If you do the ldapsearch for the same scope and base DN with the bind DN, does it return any result?
    What is your use case? If you are going to do mostly read/search/mod/del operations, then you can create 2 datastore instances, one pointing to section1 and the other pointing to section2.
    Rp

  • RBS Migration and Data Store Expansion

    I'm seeking some insight on whether (and how) remote blobs are migrated.  For example, if I've configured RBS for SharePoint 2010 but I'm approaching the storage maximum on the hardware of my remote blob location, how would I go about moving the blobs elsewhere and 'pointing' SQL Server and SharePoint to the new locations?  In addition, if I were to simply add another storage location, how does one go about re-configuring RBS to store blobs in a new/additional location?
    TIA.
    -Tracy

    1. Live SharePoint 2010 environment with SQL 2008 R2
       a. Take a backup from the 2010 live server.
          i. Open Management Studio on the SQL server.
          ii. Take a backup of the content database of the live application.
    2. QA SharePoint 2010 environment with SQL 2008 R2
       a. Restore the SQL backup.
          i. Open Management Studio on the SQL server.
          ii. Restore the database.
       b. Create a web application.
          i. Open the SharePoint server.
          ii. Open Central Administration.
          iii. Create a web application with classic authentication.
       c. Dismount the database from the existing application.
          i. Open the SharePoint PowerShell on the SharePoint server.
          ii. Run the command below.
          Dismount-SPContentDatabase <Database name>
          Note: Change the database name.
       d. Mount the restored database to the existing application.
          i. Open the SharePoint PowerShell on the SharePoint server.
          ii. Run the command below.
          Mount-SPContentDatabase <Database name> -DatabaseServer <Database server name> -WebApplication <Web application>
          Note: Change the database name and web application URL.
          iii. Open SharePoint Designer, change the master page, and publish it.
          iv. Set the test page as the home page.
          v. Test user login: log in with 2-3 different users and verify that they can log in.
       e. Configure RBS.
          i. Enable FILESTREAM on the database server.
             - Open SQL Server Configuration Manager on the SQL server.
             - From the left panel, click SQL Server Services.
             - From the right panel, select the instance of SQL Server on which you want to enable FILESTREAM.
             - Right-click the instance and then click Properties.
             - In the SQL Server Properties dialog box, click the FILESTREAM tab.
             - Select the "Enable FILESTREAM for Transact-SQL access" check box.
             - If you want to read and write FILESTREAM data from Windows, click "Enable FILESTREAM for file I/O streaming access" and enter the name of the Windows share in the Windows Share Name box.
             - If remote clients must access the FILESTREAM data that is stored on this share, select "Allow remote clients to have streaming access to FILESTREAM data".
             - Click Apply and OK.
          ii. Set the FILESTREAM access level.
             - Open SQL Server Management Studio and connect to the SQL database instance.
             - Right-click the database instance and open Properties.
             - Click Advanced in the left panel.
             - Find the "Filestream Access Level" property and set the value to "Full access enabled".
             - Click OK and exit the window.
          iii. Set the SharePoint-configured FILESTREAM access level.
             - Open a query window at the root.
             - Execute the following query:
             EXEC sp_configure filestream_access_level, 2
             RECONFIGURE
             - Restart the SQL services.
             Note: You will get the message "Configuration option 'filestream access level' changed from 2 to 2. Run the RECONFIGURE statement to install."
          iv. Provision a BLOB store for each content database.
             - Click the content database for which you want to create a BLOB store, and then click New Query.
             - Execute the following query:
             use [<Database name>]
             if not exists
             (select * from sys.symmetric_keys
             where name = N'##MS_DatabaseMasterKey##')
             create master key encryption by password = N'Admin Key Password !2#4'
             Note: Change the database name. You get a "Command(s) completed successfully." message.
             use [<Database name>]
             if not exists
             (select groupname from sysfilegroups
             where groupname=N'RBSFilestreamProvider')
             alter database [<Database name>]
             add filegroup RBSFilestreamProvider contains filestream
             Note: Change the database name. You get a "Command(s) completed successfully." message.
             use [<Database name>]
             alter database [<Database name>]
             add file (name = RBSFilestreamFile, filename =
             '<E:\SQL\Data\PetroChina>')
             to filegroup RBSFilestreamProvider
             Note: Change the database name and store path. If you get the message "FILESTREAM file 'RBSFilestreamFile' cannot be added because its destination filegroup cannot have more than one file.", ignore it.
          v. Install the RBS client library on each web server.
             To install the RBS client library on the first web server:
             - Open the SharePoint web server.
             - Open a command prompt.
             - Execute the following command:
             msiexec /qn /lvx* rbs_install_log.txt /i RBS.msi TRUSTSERVERCERTIFICATE=true FILEGROUP=PRIMARY DBNAME=<Database name> DBINSTANCE=<Database server> FILESTREAMFILEGROUP=RBSFilestreamProvider FILESTREAMSTORENAME=FilestreamProvider_1
             Note: Change the database name and database instance name. The DB instance should be <server name\instance name>. Download RBS.msi for the respective SQL version.
             To install the RBS client library on all additional web and application servers:
             - Open the SharePoint web server.
             - Open a command prompt.
             - Execute the following command:
             msiexec /qn /lvx* rbs_install_log.txt /i RBS.msi DBNAME=<Database name> DBINSTANCE=<Database server> ADDLOCAL=Client,Docs,Maintainer,ServerScript,FilestreamClient,FilestreamServer
             Note: Change the database name and database instance name. The DB instance should be <server name\instance name>.
          vi. Enable RBS for each content database.
             You must enable RBS on one web server in the SharePoint farm. It is not important which web server you select for this activity. You must perform this procedure once for each content database.
             - Open the SharePoint web server.
             - Open the SharePoint PowerShell.
             - Execute the script below:
             $cdb = Get-SPContentDatabase <Database name>
             $rbss = $cdb.RemoteBlobStorageSettings
             $rbss.Installed()
             $rbss.Enable()
             $rbss.SetActiveProviderName($rbss.GetProviderNames()[0])
             $rbss
             Note: Change the database name.
          vii. Test the RBS installation.
             - On the computer that contains the RBS data store, browse to the RBS data store directory and note its size.
             - On the SharePoint farm, upload a file that is at least 100 kilobytes (KB) to a document library.
             - On the computer that contains the RBS data store, browse to the RBS data store directory and confirm its size. It must be more than before.
          viii. Test user login: log in with 2-3 different users and verify that they can log in.
       f. Migrate BLOBs from RBS back to the SQL database and completely remove RBS.
          i. Migrate all content from RBS to SQL and disable RBS for the content DB.
             - Open the SharePoint server.
             - Open the SharePoint Management PowerShell.
             - Execute the script below:
             $cdb=Get-SPContentDatabase <Database name>
             $rbs=$cdb.RemoteBlobStorageSettings
             $rbs.GetProviderNames()
             $rbs.SetActiveProviderName("")
             $rbs.Migrate()
             $rbs.Disable()
             Note: Migrate() might take some time depending on the amount of data in your RBS store. Change the database name.
             If you get the following messages on the PowerShell prompt:
             "PS C:\Users\sp2010_admin> $rbs.Migrate()
             Could not read configuration for log provider <ConsoleLog>. Default value used.
             Could not read configuration for log provider <FileLog>. Default value used.
             Could not read configuration for log provider <CircularLog>. Default value used.
             Could not read configuration for log provider <EventViewerLog>. Default value used.
             Could not read configuration for log provider <DatabaseTableLog>. Default value used."
             then wait a while; it takes some time for the migration to start.
          ii. Change the default RBS garbage collection window to 0 on your content DB.
             - Open the SQL server.
             - Open SQL Server Management Studio.
             - Select your content DB and open a new query window.
             - Execute the SQL queries below, one at a time:
             exec mssqlrbs.rbs_sp_set_config_value 'garbage_collection_time_window', 'time 00:00:00'
             exec mssqlrbs.rbs_sp_set_config_value 'delete_scan_period', 'time 00:00:00'
             Note: You will get a "Command(s) completed successfully." message.
          iii. Run the RBS Maintainer (and disable the task if you scheduled it).
             - Open the SharePoint server.
             - Open a command prompt.
             - Run the command below:
             "C:\Program Files\Microsoft SQL Remote Blob Storage 10.50\Maintainer\Microsoft.Data.SqlRemoteBlobs.Maintainer.exe" -connectionstringname RBSMaintainerConnection -operation GarbageCollection ConsistencyCheck ConsistencyCheckForStores -GarbageCollectionPhases rdo -ConsistencyCheckMode r -TimeLimit 120
          iv. Uninstall RBS.
             - Open the SQL server.
             - Open SQL Server Management Studio.
             - On your content DB, run the SQL query below:
             exec mssqlrbs.rbs_sp_uninstall_rbs 0
             Note: If you get the message "The RBS server side data cannot be removed because there are existing BLOBs registered. You can only remove this data by using the force_uninstall parameter of the mssqlrbs.rbs_sp_uninstall stored pro", then run "exec mssqlrbs.rbs_sp_uninstall_rbs 1" instead.
             You will get a "Command(s) completed successfully." message.
          v. Uninstall SQL Remote Blob Storage from Add/Remove Programs.
             I found that there were still FILESTREAM references in my DB, so remove those references:
             - Open the SQL server.
             - Open SQL Server Management Studio.
             - Run the SQL queries below on your content DB, one at a time:
             ALTER TABLE [mssqlrbs_filestream_data_1].[rbs_filestream_configuration] DROP column [filestream_value]
             ALTER TABLE [mssqlrbs_filestream_data_1].[rbs_filestream_configuration] SET (FILESTREAM_ON = "NULL")
          vi. Now you can remove the file and filegroup for FILESTREAM.
             - Open the SQL server.
             - Open SQL Server Management Studio.
             - Open a new query window at the root.
             - Execute the SQL query below:
             ALTER DATABASE <Database name> Remove file RBSFilestreamFile;
             Note: Change the database name.
             If it gives the message "The file 'RBSFilestreamFile' cannot be removed because it is not empty.", then remove all tables prefixed with "mssqlrbs_" from your database and execute the SQL query again.
             This query takes time depending on your database size (almost 30 min).
             You will get the message "The file 'RBSFilestreamFile' has been removed."
             - Execute the SQL query below:
             ALTER DATABASE <Database name> REMOVE FILEGROUP RBSFilestreamProvider;
             Note: Change the database name.
             You get the message "The filegroup 'RBSFilestreamProvider' has been removed." Or, if you get the message "Msg 5524, Level 16, State 1, Line 1 Default FILESTREAM data filegroup cannot be removed unless it's the last FILESTREAM data filegroup left.", ignore it.
          vii. Remove the BLOB store installation.
             - Open the SharePoint server.
             - Run the RBS.msi setup file and choose the Remove option.
             - Finish the wizard.
          viii. Disable FILESTREAM in SQL Server Configuration Manager.
             Disable FILESTREAM in SQL Server Configuration Manager for your instance (if you do not use it anywhere aside from this single content DB with SharePoint), run a SQL reset and an IIS reset, and test.
          ix. Test that RBS has been removed.
             - On the computer that contains the SQL database, note the size of the SQL database (.mdf file).
             - On the SharePoint farm, upload a file that is at least 100 kilobytes (KB) to a document library.
             - On the computer that contains the SQL database, confirm the size of the SQL database. It must be more than before. If there is no difference, ignore it; just check that the store is no longer in SQL.
          x. Test user login: log in with 2-3 different users and verify that they can log in.
       g. Convert classic-mode web applications to claims-based authentication.
          i. Open the SharePoint server.
          ii. Open the SharePoint PowerShell.
          iii. Execute the script below:
          $WebAppName = "<URL>"
          $wa = get-SPWebApplication $WebAppName
          $wa.UseClaimsAuthentication = $true
          $wa.Update()
          $account = "<Domain name\User name>"
          $account = (New-SPClaimsPrincipal -identity $account -identitytype 1).ToEncodedString()
          $wa = get-SPWebApplication $WebAppName
          $zp = $wa.ZonePolicies("Default")
          $p = $zp.Add($account,"PSPolicy")
          $fc=$wa.PolicyRoles.GetSpecialRole("FullControl")
          $p.PolicyRoleBindings.Add($fc)
          $wa.Update()
          $wa.MigrateUsers($true)
          $wa.ProvisionGlobally()
          iv. Test user login: log in with 2-3 different users and verify that they can log in.
       h. Take a SQL backup from the QA server.
          i. Open the SQL server.
          ii. Open Management Studio on the SQL server.
          iii. Select the content database.
          iv. Take a backup of the content database.
          Information: This SQL backup does not contain RBS content.
    3. New SharePoint 2013 environment with SQL 2012
       a. Restore the SQL backup.
          i. Open the SQL server.
          ii. Open SQL Server Management Studio.
          iii. Restore the SQL database using the *.bak file.
       b. Dismount the database from the existing application.
          i. Open the SharePoint server.
          ii. Open the SharePoint Management PowerShell.
          iii. Execute the script below:
          Dismount-SPContentDatabase <Database name>
          Note: Change the database name to the one bound to the existing application.
       c. Mount the restored database to the existing application.
          i. Open the SharePoint server.
          ii. Open the SharePoint Management PowerShell.
          iii. Execute the script below:
          Mount-SPContentDatabase <Database name> -DatabaseServer <Database server name> -WebApplication <URL>
          Note:
          Change the database name to the newly restored database name.
          Change the database server name, in the form "DB server name\DB instance name".
          Change the URL of the web application.
          This command takes some time.
       d. Upgrade the site collection.
          i. Open the SharePoint server.
          ii. Open the new site.
          iii. You will find a message at the top: "Experience all that SharePoint 15 has to offer. Start now or Remind me later".
          iv. Click "Start".
          v. Click "Upgrade this Site Collection".
          vi. Click "I Am ready".
          vii. After some time you will get the message "Upgrade Completed Successfully".
          viii. Test user login.
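    The before/after size checks in the test steps above (compare the BLOB store directory size, upload a >100 KB file, compare again) can be scripted. A small sketch in Python; the store path is a placeholder, not the real RBS location:

    ```python
    import os

    def directory_size(path):
        """Total size in bytes of all files under `path`, recursively."""
        total = 0
        for root, _dirs, files in os.walk(path):
            for name in files:
                total += os.path.getsize(os.path.join(root, name))
        return total

    # Hypothetical usage: compare the store size before and after an upload.
    # before = directory_size(r"E:\SQL\Data\BlobStore")
    # ... upload a file of at least 100 KB to a document library ...
    # after = directory_size(r"E:\SQL\Data\BlobStore")
    # assert after > before   # RBS is storing the blob on disk
    ```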

  • Unity Data Store for unity 8

    Guys,
      I'm installing Unity 8.0.3, but am unfortunately stuck at "Cisco Data Store". When I searched Google, it says to install "SQL Server 2005 SP3" or "SQL Server Express 2005 SP3". I installed Express, but the Unity assistant still shows the result as "failed".
    Any help from you guys will be appreciated.

    You should take a look at the "Read Me" file as it states that the Unity Data Store DVD for Unity 8 is NOT available for download.  Refer to the following:
    Do Not Use the Cisco Unity Data Store 2000 Disc from Cisco Unity 7.x and Earlier to Upgrade to Cisco Unity 8.0
    If you are upgrading to Cisco Unity 8.0 and reinstalling software on the Cisco Unity server (for example, because you are replacing the server or upgrading to Windows Server 2003), you must use the Data Store 2000 disc that ships with Cisco Unity 8.0. The Data Store 2000 disc that shipped with earlier versions of Cisco Unity has a different directory structure, and the Cisco Unity System Setup Assistant, which guides you through installing and configuring Cisco Unity 8.0, will not recognize the Data Store 2000 disc that shipped with Cisco Unity 7.x and earlier. (If you paid for an upgrade to SQL Server 2005, we will ship you a Data Store 2005 disc with the rest of your Cisco Unity software.)
    The Cisco Unity 8.0 version of the Data Store 2000 disc is not available for download on Cisco.com. To perform an upgrade that requires reinstalling software on the Cisco Unity server, you must either wait for the Cisco Unity 8.0 installation discs to arrive or manually install SQL Server 2000 or MSDE 2000. Manually installing the application is not documented and, therefore, not supported.
    There's your answer.  You need to order the upgrade kit from Cisco in order to get the proper DVD.
    Hailey
    Please rate helpful posts!

  • How to search on creation Date to get specific date records

    Hi,
    I have created a search page based on a VO which contains a creation date field. The creation date field stores the date with a time component in the database.
    af:query is created on this VO. Suppose I have created 10 records today and want to query them by providing today's date (6/21/2011) in the CreationDate field. On clicking Search, it does not return any records, because the values in the database are timestamps (date plus time) and do not match a plain date.
    How can I achieve this functionality, where the user can search by a specific date?

    799794 wrote:
    Hi,
    I have created a search page based on VO which contains creation Date field. Creation Date field stores date with time into database.
    af:query is created on this VO. Suppose I have created 10 records today and want to query that by providing today's date(6/21/2011) in CreationDate field. On click of search, it does not return any record.
    Since records in database are timestamp based, it does not return any record.
    How to achieve this functionality where user can search by specific date?
    Did you create a view criteria on this VO? Please share more details with us.
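    One common approach, independent of any particular ADF API: instead of comparing the timestamp column for equality with the entered date, query a half-open range from midnight of that day to midnight of the next day. A minimal sketch (class and method names are illustrative, and java.time is used for brevity):

    ```java
    import java.time.LocalDate;
    import java.time.LocalDateTime;

    // Illustrative helper: turn a user-entered date into [start, end) timestamp
    // bounds, so "6/21/2011" matches every record created at any time that day.
    public class DayBounds {
        public static LocalDateTime start(LocalDate day) {
            return day.atStartOfDay();             // 2011-06-21T00:00 (inclusive)
        }
        public static LocalDateTime end(LocalDate day) {
            return day.plusDays(1).atStartOfDay(); // 2011-06-22T00:00 (exclusive)
        }
        public static void main(String[] args) {
            LocalDate d = LocalDate.of(2011, 6, 21);
            // Bind these two values to a query such as:
            //   WHERE creation_date >= ? AND creation_date < ?
            System.out.println(start(d) + " .. " + end(d));
        }
    }
    ```

    In ADF terms, this maps to a view criteria with a Between (or >= and <) condition on the creation date rather than an equals condition.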

  • How to read from and write to remote data file?

    Hi there
    I have two variables that I would like to store in a remote data file which can be accessed by my Flash file. It is for a voting system, so the first variable would be a counter that is incremented each time a user votes, and the second would be the total rating. The average rating would then be calculated from these.
    How would one store, retrieve, and then update these variables from the data file?
    I can create the system within a single Flash file for a single session, but obviously I'd like multiple users opening the file simultaneously to be able to access the data and update it.
    I'm using AS2 in CS3. Any help would be appreciated.

    Right, I've made significant progress since I started this topic. I now have my Flash and php files set up, and the communication between them is working. I'm having difficulty now retaining my variables in the php and updating them for the next session. The script that I have is as follows:
    <?
    $totalRating1 += $_POST['latestRating1'];
    $ratingCount1 ++;
    $averageRating1 = $totalRating1 / $ratingCount1;
    echo "&averageRating=$averageRating1";
    ?>
    The first line receives the variable "latestRating1" from the SWF. The new average rating is calculated and returned to the SWF as "averageRating". When I run it, however, the variables are reset each time. The average rating is always the same as the latest rating. How can I retain the variables so that they are updated and available for the next user?
    Any advice would be appreciated. I am very new to php, and don't have any experience with other scripting languages. I'm using ActionScript2.
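    The reason the variables reset is that PHP variables do not persist between requests; each POST runs the script fresh, so the running count and total must be saved somewhere between calls — a file (e.g. with file_get_contents/file_put_contents) or, better, a database. A minimal sketch of the read-update-write pattern, shown here in Java for concreteness (the file name and "count,total" format are assumptions):

    ```java
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;

    // Sketch of persisting a vote count and running total between requests.
    // "ratings.txt" is an assumed store holding "count,total" on one line.
    public class RatingStore {
        static double addVote(Path store, double latestRating) throws IOException {
            long count = 0;
            double total = 0;
            if (Files.exists(store)) {               // read previous state, if any
                String[] parts = Files.readString(store).trim().split(",");
                count = Long.parseLong(parts[0]);
                total = Double.parseDouble(parts[1]);
            }
            count++;
            total += latestRating;
            Files.writeString(store, count + "," + total); // persist for next request
            return total / count;                          // new average rating
        }
        public static void main(String[] args) throws IOException {
            Path p = Path.of("ratings.txt");
            Files.deleteIfExists(p);
            System.out.println(addVote(p, 4.0)); // 4.0
            System.out.println(addVote(p, 2.0)); // 3.0
        }
    }
    ```

    With many simultaneous voters a plain file needs locking to avoid lost updates, which is one reason a database is usually the safer choice.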

  • OIF : Federation Data Store

    I have configured RDBMS as federation data Store.
    Now when I navigate to OIF Instance->Administration->Identities->Federated Identities and click on search, I get the following error
    Unable to perform search.
    The following error occured while performing the operation above:
    javax.management.RuntimeMBeanException: An unexpected error occurred while performing the federation record search: oracle.security.fed.admin.search.exceptions.SearchDatastoreException: Error ocurred while querying database
    Also, datastore.xml is missing from my OIF server. The RCU schemas are already installed in the DB. Please let me know what I did wrong.
    Regards,
    RA
    Edited by: R_A on Jan 4, 2012 1:23 AM

    I hope you know that eDirectory is not supported - http://www.oracle.com/technology/software/products/ias/files/idm_certification_101401.html#BABHFBCC
    Like any LDAP-compliant server, eDirectory has its own idiosyncrasies, and you might hit issues because OIF using eDir as the federation data store has not been QA'd by Oracle.
    -Vinod

  • To Stop ATG search  in  Commerce Reference Store

    Hi Guys,
    Does anybody have an idea how to stop ATG Search in Commerce Reference Store?
    I want to run Endeca search instead. My data is already indexed, and I want to see the indexed data in the Commerce Reference Store storefront. Does anybody have an idea how I can do this?

    CRS version 10.1.2 requires Endeca and includes an Endeca assembler-based integration, and does not support ATG Search.
    CRS version 10.1.1 includes optional Endeca indexing modules to index CRS data, but does not include an Endeca integrated CRS storefront. CRS version 10.1.1 supports optional ATG Search modules, but can be run without them (in which case simple Repository-based searching is used).
    ATG and Endeca integrations can either follow the CRS 10.1.2's example of using a Nucleus-configured assembler and extending ATG Endeca integration platform features, or can do an integration as they would have before there was an ATG Endeca integration (for example, use a Spring-configured assembler and follow the model shown by Endeca's Discover Electronics).
