RBS Migration and Data Store Expansion

I'm seeking some insight into whether (and how) remote blobs can be migrated. For example, if I've configured RBS for SharePoint 2010 but I'm approaching the storage maximum on the hardware of my remote blob location, how would I go about moving the blobs elsewhere and 'pointing' SQL Server and SharePoint to the new locations? In addition, if I were to simply add another storage location, how does one go about reconfiguring RBS to store blobs in a new/additional location?
TIA.
-Tracy

1. Live SharePoint 2010 environment with SQL 2008 R2
a. Take a backup from the 2010 live server.
i. Open Management Studio on the SQL server.
ii. Take a backup of the content database of the live application.
2. QA SharePoint 2010 environment with SQL 2008 R2
a. Restore the SQL backup
i. Open Management Studio on the SQL server.
ii. Restore the database.
b. Create a web application
i. Open the SharePoint server.
ii. Open Central Administration.
iii. Create a web application with classic authentication.
c. Dismount the database bound to the existing application
i. Open the SharePoint Management Shell on the SharePoint server.
ii. Run the command below.
Dismount-SPContentDatabase <Database name>
Note: Change the database name.
d. Mount the restored database to the existing application
i. Open the SharePoint Management Shell on the SharePoint server.
ii. Run the command below.
Mount-SPContentDatabase <Database name> -DatabaseServer <Database server name> -WebApplication <Web application>
Note: Change the database name and web application URL.
iii. Open SharePoint Designer, change the master page, and publish it.
iv. Set the test page as the home page.
v. Test user login
Log in with 2-3 different users and verify that they can log in.
e. Configure RBS
i. Enable FILESTREAM on the database server
Open SQL Server Configuration Manager on the SQL server.
In the left panel, click SQL Server Services.
In the right panel, select the SQL Server instance on which you want to enable FILESTREAM.
Right-click the instance and then click Properties.
In the SQL Server Properties dialog box, click the FILESTREAM tab.
Select the Enable FILESTREAM for Transact-SQL access check box.
If you want to read and write FILESTREAM data from Windows, click Enable FILESTREAM for file I/O streaming access and enter the name of the Windows share in the Windows Share Name box.
If remote clients must access the FILESTREAM data stored on this share, select Allow remote clients to have streaming access to FILESTREAM data.
Click Apply, then OK.
ii. Set the FILESTREAM access level
Open SQL Server Management Studio and connect to the SQL database instance.
Right-click the database instance and open Properties.
Click Advanced in the left panel.
Find the "Filestream Access Level" property and set it to "Full access enabled".
Click OK and close the window.
iii. Set the FILESTREAM access level for SharePoint
Open a query window against the root.
Execute the following query:
EXEC sp_configure filestream_access_level, 2
RECONFIGURE
Restart the SQL services.
Note: You will get the message "Configuration option 'filestream access level' changed from 2 to 2. Run the RECONFIGURE statement to install."
iv. Provision a BLOB store for each content database
Click the content database for which you want to create a BLOB store, and then click New Query.
Execute the following query:
use [<Database name>]
if not exists
(select * from sys.symmetric_keys
where name = N'##MS_DatabaseMasterKey##')
create master key encryption by password = N'Admin Key Password !2#4'
Note:
Change the database name.
You will get a "Command(s) completed successfully." message.
use [<Database name>]
if not exists
(select groupname from sysfilegroups
where groupname=N'RBSFilestreamProvider')
alter database [<Database name>]
add filegroup RBSFilestreamProvider contains filestream
Note:
Change the database name.
You will get a "Command(s) completed successfully." message.
use [<Database name>]
alter database [<Database name>]
add file (name = RBSFilestreamFile, filename =
'<E:\SQL\Data\PetroChina>')
to filegroup RBSFilestreamProvider
Note:
Change the database name and store path.
If you get the message "FILESTREAM file 'RBSFilestreamFile' cannot be added because its destination filegroup cannot have more than one file.", ignore it.
v. Install the RBS client library on each web server
To install the RBS client library on the first web server:
Open the SharePoint web server.
Open a command prompt.
Execute the following command:
msiexec /qn /lvx* rbs_install_log.txt /i RBS.msi TRUSTSERVERCERTIFICATE=true FILEGROUP=PRIMARY DBNAME=<Database name> DBINSTANCE=<Database server> FILESTREAMFILEGROUP=RBSFilestreamProvider FILESTREAMSTORENAME=FilestreamProvider_1
Note:
Change the database name and database instance name.
The DB instance should be in the form <server name\instance name>.
Download RBS.msi for the respective SQL version.
To install the RBS client library on all additional web and application servers:
Open the SharePoint web server.
Open a command prompt.
Execute the following command:
msiexec /qn /lvx* rbs_install_log.txt /i RBS.msi DBNAME=<Database name> DBINSTANCE=<Database server> ADDLOCAL=Client,Docs,Maintainer,ServerScript,FilestreamClient,FilestreamServer
Note:
Change the database name and database instance name.
The DB instance should be in the form <server name\instance name>.
vi. Enable RBS for each content database
You must enable RBS on one web server in the SharePoint farm. It does not matter which web server you select for this activity. You must perform this procedure once for each content database.
Open the SharePoint web server.
Open the SharePoint Management Shell.
Execute the script below:
$cdb = Get-SPContentDatabase <Database name>
$rbss = $cdb.RemoteBlobStorageSettings
$rbss.Installed()
$rbss.Enable()
$rbss.SetActiveProviderName($rbss.GetProviderNames()[0])
$rbss
Note: Change the database name.
vii. Test the RBS installation
On the computer that contains the RBS data store:
Browse to the RBS data store directory.
Note the size of the RBS data store directory.
On the SharePoint farm, upload a file that is at least 100 kilobytes (KB) to a document library.
On the computer that contains the RBS data store:
Browse to the RBS data store directory.
Check the size of the RBS data store directory again.
It must be larger than before.
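The before-and-after size comparison above can also be scripted instead of eyeballed in Explorer. This is an illustrative sketch only, not part of the original procedure: it just totals file sizes under a directory and compares two measurements; the store path you pass in would be your actual RBS data store directory.

```python
import os

def directory_size(path):
    """Total size in bytes of all regular files under `path`, recursively."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if os.path.isfile(fp):
                total += os.path.getsize(fp)
    return total

def rbs_store_grew(size_before, size_after):
    """True if the blob store grew, i.e. RBS intercepted the uploaded blob."""
    return size_after > size_before
```

Measure `directory_size()` on the store path before the upload, upload the >100 KB document, measure again, and `rbs_store_grew()` should come back True.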
viii. Test user login
Log in with 2-3 different users and verify that they can log in.
f. Migrate BLOBs from RBS back to the SQL database and completely remove RBS
i. Migrate all content from RBS to SQL and disable RBS for the content DB:
Open the SharePoint server.
Open the SharePoint Management Shell.
Execute the script below:
$cdb=Get-SPContentDatabase <Database name>
$rbs=$cdb.RemoteBlobStorageSettings
$rbs.GetProviderNames()
$rbs.SetActiveProviderName("")
$rbs.Migrate()
$rbs.Disable()
Note:
Migrate() might take some time depending on the amount of data in your RBS store.
Change the database name.
If PowerShell prints "PS C:\Users\sp2010_admin> $rbs.Migrate()
Could not read configuration for log provider <ConsoleLog>. Default value used.
Could not read configuration for log provider <FileLog>. Default value used.
Could not read configuration for log provider <CircularLog>. Default value used.
Could not read configuration for log provider <EventViewerLog>. Default value used.
Could not read configuration for log provider <DatabaseTableLog>. Default value used.", then wait a while; it takes some time for the migration to start.
ii. Change the default RBS garbage collection window to 0 on your content DB:
Open the SQL server.
Open SQL Server Management Studio.
Select your content DB and open a new query window.
Execute the SQL queries below:
exec mssqlrbs.rbs_sp_set_config_value 'garbage_collection_time_window', 'time 00:00:00'
exec mssqlrbs.rbs_sp_set_config_value 'delete_scan_period', 'time 00:00:00'
Note:
Run the SQL queries one at a time.
You will get a "Command(s) completed successfully." message.
iii. Run the RBS Maintainer (and disable the task if you scheduled it):
Open the SharePoint server.
Open a command prompt.
Run the command below:
"C:\Program Files\Microsoft SQL Remote Blob Storage 10.50\Maintainer\Microsoft.Data.SqlRemoteBlobs.Maintainer.exe" -connectionstringname RBSMaintainerConnection -operation GarbageCollection ConsistencyCheck ConsistencyCheckForStores -GarbageCollectionPhases rdo -ConsistencyCheckMode r -TimeLimit 120
iv. Uninstall RBS:
Open the SQL server.
Open SQL Server Management Studio.
On your content DB, run the SQL query below:
exec mssqlrbs.rbs_sp_uninstall_rbs 0
Note:
If you get the message "The RBS server side data cannot be removed because there are existing BLOBs registered. You can only remove this data by using the force_uninstall parameter of the mssqlrbs.rbs_sp_uninstall stored pro", then run: exec mssqlrbs.rbs_sp_uninstall_rbs 1
You will get a "Command(s) completed successfully." message.
v. Uninstall SQL Remote Blob Storage from Add/Remove Programs.
I found that there were still FILESTREAM references in my DB, so remove those references:
Open the SQL server.
Open SQL Server Management Studio.
Run the SQL queries below on your content DB:
ALTER TABLE [mssqlrbs_filestream_data_1].[rbs_filestream_configuration] DROP column [filestream_value]
ALTER TABLE [mssqlrbs_filestream_data_1].[rbs_filestream_configuration] SET (FILESTREAM_ON = "NULL")
Note:
Run the SQL queries one at a time.
vi. Now you can remove the file and filegroup for FILESTREAM:
Open the SQL server.
Open SQL Server Management Studio.
Open a new query window against the server.
Execute the SQL query below:
ALTER DATABASE <Database name> REMOVE FILE RBSFilestreamFile;
Note:
Change the database name.
If you get the message "The file 'RBSFilestreamFile' cannot be removed because it is not empty.", then remove all tables prefixed with "mssqlrbs_" from your database and execute the SQL query again.
This query takes time depending on your database size (roughly 30 minutes).
You will get a "The file 'RBSFilestreamFile' has been removed." message.
Execute the SQL query below:
ALTER DATABASE <Database name> REMOVE FILEGROUP RBSFilestreamProvider;
Note:
Change the database name.
You will get a "The filegroup 'RBSFilestreamProvider' has been removed." message.
If instead you get the message "Msg 5524, Level 16, State 1, Line 1 Default FILESTREAM data filegroup cannot be removed unless it's the last FILESTREAM data filegroup left.", ignore it.
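When the "not empty" case forces you to drop the leftover "mssqlrbs_"-prefixed tables first, it can help to generate the DROP statements from a table list rather than typing them. A hedged sketch of that idea; the table names below are hypothetical examples, and in practice you would pull the real list from sys.tables in your content DB:

```python
def drop_statements(table_names, prefix="mssqlrbs_"):
    """Build a DROP TABLE statement for every table carrying the RBS prefix."""
    return [f"DROP TABLE [{name}];" for name in table_names if name.startswith(prefix)]

# Hypothetical example list -- query sys.tables in your content DB for the real one.
tables = ["mssqlrbs_rbs_internal_tables", "AllDocs", "mssqlrbs_blob_details"]
for stmt in drop_statements(tables):
    print(stmt)
```

Review the generated statements before running them; dropping anything outside the RBS prefix would damage the content database.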
vii. Remove the BLOB store installation
Open the SharePoint server.
Run the RBS.msi setup file and choose the Remove option.
Finish the wizard.
viii. Disable FILESTREAM in SQL Server Configuration Manager
Disable FILESTREAM in SQL Server Configuration Manager for your instance (if you do not use it anywhere besides this single content DB with SharePoint), restart the SQL services, run an IIS reset, and test.
ix. Test whether RBS has been removed
On the computer that contains the SQL database:
Note the size of the SQL database (.mdf file).
On the SharePoint farm, upload a file that is at least 100 kilobytes (KB) to a document library.
On the computer that contains the SQL database:
Check the size of the SQL database again.
It should be larger than before. If there is no difference, ignore it; just check that the blob store is no longer in SQL.
x. Test user login
Log in with 2-3 different users and verify that they can log in.
g. Convert classic-mode web applications to claims-based authentication
i. Open the SharePoint server.
ii. Open the SharePoint Management Shell.
iii. Execute the script below:
$WebAppName = "<URL>"
$wa = get-SPWebApplication $WebAppName
$wa.UseClaimsAuthentication = $true
$wa.Update()
$account = "<Domain name\User name>"
$account = (New-SPClaimsPrincipal -identity $account -identitytype 1).ToEncodedString()
$wa = get-SPWebApplication $WebAppName
$zp = $wa.ZonePolicies("Default")
$p = $zp.Add($account,"PSPolicy")
$fc=$wa.PolicyRoles.GetSpecialRole("FullControl")
$p.PolicyRoleBindings.Add($fc)
$wa.Update()
$wa.MigrateUsers($true)
$wa.ProvisionGlobally()
iv. Test user login
Log in with 2-3 different users and verify that they can log in.
h. Take a SQL backup from the QA server
i. Open the SQL server.
ii. Open Management Studio on the SQL server.
iii. Select the content database.
iv. Take a backup of the content database.
Information: This SQL backup does not contain RBS.
3. New SharePoint 2013 environment with SQL 2012
a. Restore the SQL backup
i. Open the SQL server.
ii. Open SQL Server Management Studio.
iii. Restore the SQL database using the *.bak file.
b. Dismount the database bound to the existing application
i. Open the SharePoint server.
ii. Open the SharePoint Management Shell.
iii. Execute the script below:
Dismount-SPContentDatabase <Database name>
Note: Change the database name to the one bound to the existing application.
c. Mount the restored database to the existing application
i. Open the SharePoint server.
ii. Open the SharePoint Management Shell.
iii. Execute the script below:
Mount-SPContentDatabase <Database name> -DatabaseServer <Database server name> -WebApplication <URL>
Note:
Change the database name to the newly restored database name.
Change the database server name, in the form "DB server name\DB instance name".
Change the URL of the web application.
This command takes some time.
d. Upgrade the site collection
i. Open the SharePoint server.
ii. Open the new site.
iii. You will see a message at the top: "Experience all that SharePoint 15 has to offer. Start now or Remind me later".
iv. Click "Start".
v. Click "Upgrade this Site Collection".
vi. Click "I'm ready".
vii. After some time you will get the message "Upgrade Completed Successfully".
viii. Test user login.

Similar Messages

  • Data Source and Data Store

    Are both Data Source and Data Store the same in BI? If not can someone explain what each one of these terms mean.
    Thanks for the help

    A Data Source, or Persistent Staging Area, is a transparent database table or initial store in BI. In this table, the requested data is saved unchanged from the source system.
    DataStore Objects are the primary physical database storage objects used in BI. They are designed to store very detailed transaction-level records.
    Thanks

  • BW Analytical Authorisations and Data Store Objects

    Hello All
    I am in the process of trying to figure out how BW analytical authorisations work, as I have to build some authorisations for a new BW project.
    I understand the concept of BW analytical authorisations. I have created an object linked to hierarchies via an InfoProvider, assigned it to a user, and it works great. The problem is that I then ran a generation for hierarchies and specified the Z InfoProvider that my analytical authorisation object was linked to. Now I find that all users on the system have access to my object, and I need to remove this. Even new users on the system automatically get this access.
    I have read note 1052242, which explains that I can remove the authorisations using DataStore objects (DSOs). The thing is that I do not know how to maintain these DSOs.
    Can anyone help with this? Once I know how to maintain the DSO, I can add in the required D_E_L_E_T_E entry and re-run the generation; hopefully this will solve my problem.
    Thank You In Advance
    Best Regards

    Hi Anwar,
    if your question is how to update data into a DSO, then I recommend you read the documentation.
    http://help.sap.com/saphelp_nw70/helpdata/en/f9/45503c242b4a67e10000000a114084/frameset.htm
    You require basic BW knowledge for that.
    If your background is more ABAP then think about making the DSO a DSO for direct update.
    That way you do not need BW knowledge and you can use ABAP instead to modify the data in the DSO.
    These Function modules of the API can be used:
    ●      RSDRI_ODSO_INSERT: inserts new data (with keys not yet in the system).
    ●      RSDRI_ODSO_MODIFY: inserts data with new keys; for data with keys already in the system, the data is changed.
    ●      RSDRI_ODSO_UPDATE: changes data with keys already in the system.
    ●      RSDRI_ODSO_DELETE_RFC: deletes data.
    More information about these function modules is here:
    http://help.sap.com/saphelp_nw70/helpdata/en/c0/99663b3e916a78e10000000a11402f/frameset.htm
    However, if that doesn't solve your original problem with the authorizations, here are some useful links that I found helpful when implementing BW Analysis Authorizations.
    SDN area for Analysis Authorizations
    http://wiki.sdn.sap.com/wiki/display/BI/AuthorizationinSAPNWBI#AuthorizationinSAPNWBI-Differencebetweenrssmandrsecadmin
    Marc Bernard session
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/media/uuid/ac7d7c27-0a01-0010-d5a9-9cb9ddcb6bce
    SAP release note for new Analysis Authorizations
    http://help.sap.com/saphelp_nw04s/helpdata/en/80/d71042f664e22ce10000000a1550b0/frameset.htm
    Best,
    Ralf

  • Audit Vault 10.2.3.2 and data store db version

    Hi,
    once the installation of AV 10.2.3.2 is completed, what will be the database version of the deployed datastore?
    Is it possible to upgrade the audit vault database repository to a different version?
    Where is this info described in the official docs?
    many thanks
    Angelo

    Hi:
    The underlying database for 10.2.3.2 Audit Vault is at version 10.2.0.4 (EE). It is NOT possible to upgrade just the database component independent of the rest of the tech stack, other than to apply patches and CPUs.
    Regards.

  • Low-Cost data migration and ETL tools

    Hi, I'm building a database for my company. We are a rather small book company with a lot of references and still growing.
    We have a MySQL database here and are trying to find some good tools to use it at its best. Basically, we are just starting up the database after dealing with Excel: we had a size problem... So I'm trying to find a program that will allow us to do two different things: the migration of our data from the old system to the new one, and a specialized piece of software to perform ETL (Extract, Transform and Load) on our database.
    About the price of the tools: if this were one year ago, the accounting department would have been pretty relaxed about it. But today we have some budget restrictions and therefore need a low-cost tool. So could you give me some advice on a good data migration and ETL tool at a low cost?
    Thanks for your help.

    Hi,
    Some companies have budget problems and are having to reduce their cost of operation.
    Have you ever heard of open source tools? They can do the job easily and less expensive than proprietary solutions that cost a lot of money.
    You can have a look at a good open source ETL program called Talend Open Studio: it is user-friendly but also has advanced features intended for technical users (Java debugger, code injection...). It can perform data migration and ETL as you wrote in your first post.
    The website is [here|http://www.talend.com/solutions-data-integration/data-migration.php] to download the open source program. They have a forum and documentation you can read. Tell us what you think about the software.
    For an ETL benchmark: [ETL Benchmark|http://blogs.sun.com/aja/entry/talend_s_new_data_processing]

  • How to migrate iPhone apps and data to another apple ID

    OK, hopefully someone can help me out with this situation: I have had my daughter's iPhone associated with my Apple ID for several years now because A) she was too young to have an Apple ID of her own, and B) I wanted to be able to monitor her App Store purchases and such. Well, now that she is older and has proven herself to be a responsible young lady, I want to get her her own Apple ID, as she will be getting a new MacBook as a graduation present. My question is this: is it possible to migrate all the apps and data from her phone to a new Apple ID without having to download all those apps again? I back up her phone to my computer on a regular basis. Any help is appreciated.

    Hold on a second. I went through almost exactly the same situation with my daughter, as she got a new MacBook Pro and a new iPhone 4. She had an iPod touch 4G that she synced to my MacBook Pro and Apple ID for apps. With the help of wjosten (the guy who wrote the article randers4 linked), we transferred purchases from her iPod touch to her new iTunes on her new MacBook with her new Apple ID. Except for those apps that she already had, she started all fresh. I figured at some point she'll be moving out and off to college or whatever, so we might as well start fresh, but she didn't want to lose the data she had in some of the old apps. The only issue is that when there is an update for the old apps on her new MacBook Pro, I have to sign in with my Apple ID to update them. I don't know if this info will be of any help or not, but just throwing it out there.

  • TS3297 when I press the iTunes button on my ipod touch I get the message 'cannot connect to iTunes store' .  My wifi is working fine, I can connect to safari & you tube, no parental setting in place, and time and date are correct. Can anyone help please?

    When I press the iTunes button on my iPod touch I get the message 'cannot connect to iTunes store'. My wifi is working fine, I can connect to Safari and YouTube, no parental settings are in place, and the time and date are correct. Can anyone help please? I have restored my iPod to factory settings and reset it.

    I also tried moving the date forward by a year and then moving it back to normal, and it still doesn't work. I can't find an automatic update of time zones on my iPod touch to turn this off.

  • Purchase date of my iphone and what store it is bought from

    How can I check the purchase date of my iPhone and what store it was bought from? My iPhone is broken and I need a receipt for the insurance company to fix it. Thanks!

    https://selfsolve.apple.com/agreementWarrantyDynamic.do

  • What are the pre- and post-database-copy activity lists, and the pre- and post-migration activity lists, from SAP BW 7.0 to SAP BW 7.4 SPS6?

    BW on HANA: pre- and post-database-copy activity lists, and pre- and post-migration activity lists, from SAP BW 7.0 to SAP BW 7.4 SPS6.
    We are trying to copy the database from SAP BW 7.0 to SAP BW on HANA 7.4 SPS6, so we are searching for the list of steps or activities during the database copy, both pre and post.
    Along with the above, we are searching for the pre- and post-migration steps once the database has been transferred successfully from Oracle to HANA on 7.4 SPS6.
    Kindly help us in getting the exact course of action as requested.
    Thanks and Regards,
    Lavina Joshi

    Hi Lavina,
    try this link for starters: Upgrade and Migration - BW on HANA | SAP HANA
    Points to remember are:
    Preparation:
         -- Hardware Sizing
         -- Preparation of Data Centres
         -- HANA Hardware preparation
         -- System Landscape Readiness (upgrade software downloads, system readiness checks, etc)
         -- House Keeping activities on BW system (data clean up, etc)
    Post Installation:
         -- Sanity checks / Preparation and License checks
         -- JAVA Configurations
         -- Infoprovider conversions 
    Overall Stages are described below:
    # Environmental setup (HANA box)
         -- Initial system checks and Building Activities (system copy, Appln server setups, etc)
    # System readiness
                   - ZBW_HANA_COCKPIT Tool
                   - ZBW_HANA_CHECKLIST Tool
                   - ZBW_ABAP_ANALYZER Tool
                   - ZBW_TRANSFORM_FINDER Tool
                   - SIZING Report
                   - System Clean up Activities
                   - Impact of 7.4 on source system checks
                   - Java Upgrade for portal
    # DMO Stages
                   - Preparation & Pre Migration checks
                   - Execution / Migration
                   - Post Migration Activities
    # Testing Phase
                   - Source system checks/Activities
                   - System and Integration Testing
                   - End to End Testing
                   - Performance testing
                   - Reports
                   - BO reports / Interfaces
    Do let me know if you require any further information.
    Regards,
    Naren

  • Problem with concatenated data store and VACHAR(4000) field

    We have a concatenated data store built from 18 columns in one table. The main column for the index is a VARCHAR(4000) field. When we insert more than 3215 characters into that field, the row disappears from the index; 3215 characters or fewer works fine.
    Does anyone know why?

    hi,
    If you want to display them in an input field, you will need to use an expression box with the formula store@datastorefield.
    If you want to display it as plain text, go to the Display tab in the properties of the plain text control and use the same formula there; in plain text you can only display the values present in the datastore.
    Regards,
    Rk.

  • Issue Migrating Character Data Using a Full Export and Import

    Hi There;
    I have a database on my local machine that doesn't support Turkish characters. My NLS_CHARACTERSET is WE8ISO8859P1; it must be changed to WE8ISO8859P9, since that supports the full set of Turkish characters. I would like to migrate the character data using a full export and import, and my strategy is as follows:
    1- create a full export to a location on the network,
    2- create a new database on the local machine whose NLS_CHARACTERSET is WE8ISO8859P9 (I would like to change NLS_LANGUAGE and NLS_TERRITORY as well),
    3- and perform a full import into the newly created database.
    I've implemented the first step, but I couldn't implement the second step. I attempted it using the Toad editor by clicking Create -> New Database, but I cannot connect to the new database. I must connect to the new database in order to perform the full import. How can I do this?
    Thanks in advance.
    Technical Details
    NLS_LANGUAGE.....................AMERICAN
    NLS_TERRITORY.....................AMERICA
    NLS_CURRENCY.....................$
    NLS_ISO_CURRENCY..............AMERICA
    NLS_NUMERIC_CHARACTERS    .,
    NLS_CHARACTERSET.............WE8ISO8859P1
    NLS_CALENDAR.....................GREGORIAN
    NLS_DATE_FORMAT................DD-MON-RR
    NLS_DATE_LANGUAGE...........AMERICAN
    NLS_SORT...............................BINARY
    NLS_TIME_FORMAT.................HH.MI.SSXFF AM
    NLS_TIMESTAMP_FORMAT......DD-MON-RR HH.MI.SSXFF AM
    NLS_TIME_TZ_FORMAT............HH.MI.SSXFF AM TZR
    NLS_TIMESTAMP_TZ_FORMAT..DD-MON-RR HH.MI.SSXFF AM TZR
    NLS_DUAL_CURRENCY............ $
    NLS_COMP...............................BINARY
    NLS_LENGTH_SEMANTICS.......BYTE
    NLS_NCHAR_CONV_EXCP........FALSE
    NLS_NCHAR_CHARACTERSET...AL16UTF16
    NLS_RDBMS_VERSION............10.2.0.1.0

    First, if your applications run on Windows, do not use WE8ISO8859P9. Use TR8MSWIN1254.
    Second, if you create a new database, the database is not necessarily immediately accessible to outer world. I do not know Toad and I have no idea if it performs all necessary steps required for the new database to be visible.  For example, in the Toad itself, I assume you should create a new connection that references the new SID of the newly created database and use this new connection to connect. However, connections without a connection string use the ORACLE_SID setting in Registry to tell connecting applications which instance (database) to use.  To change the database accessed with an empty connection string you need to modify Registry (unless Toad has an option to do this for you). If you want to connect without changing Registry, you need a connect string. This requires setting up Oracle Listener to serve the new database (unless default configuration is used and the database registers itself with the default listener). It also requires changing tnsnames.ora file to create an alias for the new database. Net Manager and/or Net Configuration Assistant can help you with this.
    I wonder if Database Configuration Assistant would not be a better tool to create new Oracle databases.
    Thanks,
    Sergiusz

  • Session Facade and Access to a Non SQL Based Persistent Data Store

    Hi,
    We are currently using jDeveloper 10.1.3.5 and Oracle Application Server 10.1.3.5. We develop all our applications as Java portlets using Oracle PDK and they are exposed through Oracle Portal.
    In our environment, the persistent data is stored on a combination of an Oracle database and a non SQL based persistent data store.
    The way we access the non SQL persistent data store is by posting a URL and receiving an XML document back in response. This mechanism is used both for enquiry and update of the persistent store.
    We have to create a new XML schema for each entity that we need to access and there are software changes on both our environment (Java) and the non SQL based persistent data store.
    In an attempt to shorten development times we are looking to start using ADF faces and EJB3.
    We have downloaded the SRDemo tutorial and made it work but there are some challenges.
    1. The SRDemo seem to have a very minimal implementation of a business layer. From what I can see, it is essentially some straightforward wiring between database attributes and their viewable representation. Is there a demo/tutorial containing a bit more meat in the business layer that you are aware of?
    2. Given our non SQL based persistent data store, how would you go about implementing EJB3 for such scenario. Is it recommended at all? How would you go about integrating the rest of the application (business layer and representation layer) to data arriving from such source?
    3. SRDemo is not intended to be exposed as a portlet. Is there a tutorial that we can use incorporating JSR168, ADF Faces and EJB3 in the same application? I also understand that there is a JSF-JSR168 bridge available. Can you provide some pointers here? Where can we find it? Would we be able to use it in jDeveloper 10.1.3.5?
    Regards

    Matt,
    The only way to associate an "x-axis" with a signal in the Write Data VI would be to feed it waveforms, which are constrained to use time as the x-axis unit. There is really no way around this, so in my opinion, the best solution for you would be to use the "rows are channels" conversion and write the frequency and amplitude values to the file independently. Then when you read the file in DIAdem, take the two channels and build a graph out of them there.
    Regards,
    E. Sulzer
    Applications Engineer
    National Instruments

  • Backup and Data Migration Questions????

    Sorry for the multiple posts; I posted too fast and placed it in the wrong thread.
    Anyway... I am considering getting the Droid X (Android 2.1) and I will be changing over from an Imagio (WM 6.5). Since I am new to the whole Android OS, I was wondering what the best way is to transfer all my data from the Imagio to the Droid X once I receive it? I have looked at an application (Sprite Migration) that seems to make the migration process rather simple, which can be found at http://www.spritesoftware.com/products/migrate/how-does-it-work- but I was wondering if anyone has used this application and how well it worked?
    Second, I was also looking at the Sprite Backup application for my system backups of the device. I used the Windows Mobile version before on my Imagio and it worked well, but I decided to stay with SPB Backup because I was already used to the program, since I have used it for a number of years.
    Are Sprite Migration and Sprite Backup quality products to use on an Android device? I want to back up the stock image into a file before I start modifying my device and learning the ins and outs of the unit.
    Any feedback would be appreciated

    Hi Wildman,
    Yes you are correct, Sprite Backup offers a much more complete backup.
    We backup everything that can be done without root access.
    We are also now on many handsets in ROM, with even more access.  We hope to see more manufacturers, (4 so far, but not Motorola yet) bundle us on their devices (approx 1.2 million devices with us in on Android).
    If you already have moved to your new Droid then it's probably too late.  However, Migration is a great solution for moving your data painlessly across to the android device.
    I appreciate that you used SPB, a fine product on Windows Mobile. I am sure you are aware of our 18 million Windows Mobile sales of Sprite Backup (in the other camp, the majority are actually bundled on devices there also).
    We have a strong support staff, so if you have any questions please fire them off to us.
    http://spritesoftware.crmdesk.com/
    To give you an idea of what Android Backup covers, please read this
    http://www.spritesoftware.com/getmedia/0e592ab4-61bb-4f6d-a218-b6df82669a18/Android-Backup-v2.0-User-Guide.aspx
    Migration is soon coming out as an OTA service, for a carrier or two
    As you have pointed out, contacts sync (like for winmo active sync) is great, but there is a lot more on a device.
    Many thanks
    Julian@Sprite

  • Data Migrator  and lsmw

    hi
    I want to know where I can find material on Data Migrator and LSMW.
    How do they work? Any examples?
    thanks
    have a nice day

    Hi balan,
    Check this <a href="http://www.sap-img.com/sap-data-migration.htm">Link</a> for a few tips on the same..
    also<a href="http://www.sap-img.com/basis/setting-up-lsmw-on-mini-basis.htm">http://www.sap-img.com/basis/setting-up-lsmw-on-mini-basis.htm</a>
    Also check this step by step example..
    <a href="http://www.sap-img.com/basis/setting-up-lsmw-on-mini-basis.htm">Doc on LSMW</a>
    regards
    satesh

  • Migrating CMSA Assets and Data from one server to another

    Hello,
    Is there an official way to migrate the data from one CMSA installation to another one?  In the near future we're going to need to move from our development machine into the production environment, and it would be great if there was a way to get all the assets we've created in dev over to production without the need to manually re-create them all.
    Is this possible?
    Cheers,
    Kristian

    Check out http://help.adobe.com/en_US/LiveCycle/9.5/AssetComposerTechGuide/WS2bacbdf8d487e5824fb2720612b944351b6-8000.html for this.
