Geo-Replication in SQL Azure
We are looking to move from geographically distributed and replicated physical/virtual database machines to SQL Azure.
We would like to have synced/replicated Azure SQL DB instances in multiple regions. I would appreciate it if you could answer some questions about how to achieve this and what to expect:
1. Is Azure SQL Sync the correct approach for geo-replicating Azure SQL Databases?
2. What is the performance SLA that I can expect for the synchronization? I would like to know the minimum and maximum delay possible for the synchronization. (I can see that the minimum sync frequency is 5 minutes. What would be the sync delay when the sync is triggered every 5 minutes?)
Hello,
1. SQL Data Sync can be used for synchronization across SQL Database instances or between SQL Database and an on-premises SQL Server instance. Note, however, that SQL Data Sync is currently available only as a Preview, is meant only for product feedback for future releases, and should not be used in production environments. There are currently limits in SQL Data Sync.
2. Windows Azure SQL Database keeps triple redundancy within the Azure data center, combined with the 99.9% availability SLA. SQL Database does not currently provide any SLA for performance or for security. As for SQL Data Sync, I cannot find any document in BOL about an SLA for SQL Data Sync.
Reference: SQL Data Sync (Preview) Best Practices
Regards,
Fanny Liu
If you have any feedback on our support, please click here.
TechNet Community Support
Similar Messages
-
Geo Replication question?
I have a quick question about geo-replication: is it possible to use the same data center, or does it have to be a different DC, when we set up Active Geo-Replication for the Azure DB?
Yes, you can choose the same region as the source database. I verified the scenario on my personal account.
-
Azure geo-replication does not work - Feature is disabled
When I try to add geo-replication to a database (S0) which is on the east coast (I tried both the Azure management console and the Azure portal), it creates the west coast server, and I guess the next step is to create the replication database, but it fails. The error I get is "Feature is disabled" and it does not create the database.
Any idea what feature needs to be enabled for this to work? The entire process seems pretty straightforward; no idea why it would fail.
Edit: details on error
OPERATIONNAME: Update SQL database
Status: Failed
SUBSTATUS: BadRequest (HTTP Status Code: 400)
Level: Error
PROPERTIES: statusCode:BadRequest
statusMessage:{"code":"45150","message":"Feature is disabled.","target":null,"details":[{"code":"45150","message":"Feature is disabled.","target":null,"severity":"16"}],"innererror":[]}
Hi,
Thanks for posting here.
I suggest you check these links.
Standard Geo-Replication for Azure SQL Database:
http://msdn.microsoft.com/en-us/library/azure/dn758204.aspx
http://blogs.technet.com/b/blainbar/archive/2014/08/12/step-by-step-azure-sql-database-introduces-geo-restore-standard-geo-replication-and-auditing.aspx
Peer-to-peer replication is not supported on Standard edition, but bi-directional replication is. Here is a tutorial on how to make this work:
http://sqlblog.com/blogs/hilary_cotter/archive/2011/10/28/implementing-bi-directional-transactional-replication.aspx
Hope this helps you.
Girish Prajwal -
My company is planning to move some of our applications to the cloud using the Azure platform. We are also planning to use SQL Azure as our production database. My immediate concern is how we replicate the production data (SQL Azure) to our reporting server (currently SQL Server 2005 Standard; it will be SQL Server 2008 Enterprise within 3 months), since we keep only a minimal amount of data on the production server but keep everything on our reporting server. Our current setup uses SQL Server replication for this purpose.
Also, I have seen this article on the internet and would like to get answers on the issues raised, for example backup/restore, reporting/analysis services, and of course replication.
Thanks.
AMF
Hi AMF,
SQL Azure Data Sync CTP2 seems like it would provide the data synchronization you need between SQL Server and SQL Azure.
http://www.microsoft.com/en-us/SQLAzure/datasync.aspx
Feel free to email me directly (Liam.Cavanagh AT microsoft.com) if I can be of help here.
Liam
Sr. Program Manager, SQL Azure and Sync Framework - http://msdn.microsoft.com/sync/ -
Connection between two SQL Azure Databases
We have a requirement to move data (partial data in a table based on policy conditions) between two SQL Azure Databases. Want to know the best possible way to do this.
We are not looking at Data Sync Framework - as this is only a Preview version and we have to use this in an ongoing basis in Production and the volume of data is quite high.
The option that we have currently is to use an on premise stored procedure - that will have two link servers to the source and target SQL Azure Databases and do the data movement in one transaction.
Are there any other better options to do this? Any pointers will be helpful.
Hi Kothai Ramanathan,
According to your description, if you just want to move part of the data in a large table between two different SQL Server databases, you can use SQL Server replication to sync that subset of data via articles. However, SQL Azure does not support SQL Server replication; if you want to migrate a database, as noted in the other post, you can use Data-Tier Application Import and Export or other methods.
In addition, for moving just part of the data, you can also create two linked servers between a local SQL Server database and the two different SQL Azure databases. For example, you can get the changed data from the first linked server, save it to the local database, and then insert the data into the other Azure database via the second linked server.
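As a sketch of the linked-server approach described above (all server, database, table, column, and login names here are placeholders, not from the original thread; run on the local SQL Server instance, which must allow outbound connections to Azure):

```sql
-- Define a linked server pointing at the first Azure SQL database.
-- Azure requires connecting to a specific catalog (database).
EXEC sp_addlinkedserver
    @server     = N'AzureSource',
    @srvproduct = N'',
    @provider   = N'SQLNCLI',
    @datasrc    = N'sourceserver.database.windows.net',
    @catalog    = N'SourceDb';

EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'AzureSource',
    @useself     = 'false',
    @rmtuser     = N'appuser',
    @rmtpassword = N'<password>';

-- After defining a second linked server (AzureTarget) the same way,
-- the policy-based subset can be copied via the local server:
INSERT INTO AzureTarget.TargetDb.dbo.Orders (OrderId, Amount, ModifiedDate)
SELECT OrderId, Amount, ModifiedDate
FROM AzureSource.SourceDb.dbo.Orders
WHERE ModifiedDate > '2014-01-01';  -- replace with your policy condition
```

Wrapping the INSERT in an explicit transaction gives the single-transaction behavior mentioned above, at the cost of holding locks across the two remote connections for the duration of the copy.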
For more information, see:
http://azure.microsoft.com/blog/2012/09/19/announcing-updates-to-windows-azure-sql-database/
http://blogs.msdn.com/b/sqlcat/archive/2011/03/08/linked-servers-to-sql-azure.aspx
Regards,
Sofiya Li
TechNet Community Support -
Configure SQL Azure database as subscriber
Hi All,
I have an in-house database and a SQL Azure database (database only). I need to configure replication between these two databases (with the SQL Azure database as subscriber). But as far as I know, replication requires the actual subscriber SQL Server name. Is there a way I can achieve this task?
Regards
sufian
Mohd Sufian www.sqlship.wordpress.com Please mark the post as Answered if it helped.
Hi,
Thanks for posting here.
http://www.windowsazure.com/en-us/manage/services/sql-databases/getting-started-w-sql-data-sync/
http://msdn.microsoft.com/en-us/library/hh456371.aspx
I think the best bet is to use SQL Data Sync; it gives you bidirectional sync, and we currently use it to sync data around the world across data centers plus one local on-premises database. It will only give you 5-minute sync timing, but this will probably do; otherwise the next best option is to use SQL Server VMs and do it the old-fashioned way. We have found SQL Azure Data Sync to be reasonably reliable and have been running it for a good six months, syncing across 4 databases in four data centres in Azure.
Some problems with it, though:
It uses triggers.
It will obviously add load and connections to your current SQL database.
It was in preview last time I looked, so it might not be 100% suitable for you.
Ref:
http://www.mssqltips.com/sqlservertip/3007/move-an-on-premise-sql-server-database-to-the-sql-azure-cloud/
http://azure.microsoft.com/en-us/documentation/articles/sql-database-get-started-sql-data-sync/
Hope this helps you.
Girish Prajwal -
SQL Azure down (5-5-2014)?
I think things are down. I can't access a DB in the South East region. It looks like portal.azure.com is full of 520 errors.
Hi Dariel,
I want to respond to your questions, and allow me to start with the second. Since geo-replication uses an asynchronous replication method, for obvious reasons we have to recognize the possibility of data loss after a failover. Because we replicate committed transactions immediately, the loss will be small in most cases, but not zero. Different applications have different tolerance to data loss, so if the failover decision were made by us, it would be a last resort, and we would do an exhaustive investigation of the root cause before pulling the plug, as such a decision would impact thousands of databases. As a result, the RTO of an automatic failover could not be great. Some applications would rather not wait for our analysis if they could restore availability ASAP; yet other applications would prefer recovery of the primary, if there is any chance, over losing data. Hence our approach of letting the application decide what's best. You are correct that this means monitoring the replication state. We provide two indicators of failure:
1. The "interlink" status (up or down). We set it to down if replication between the two regions is blocked for more than 10 minutes, regardless of the cause (e.g., a network or internal failure).
2. The last replicated transaction timestamp (which allows you to monitor the transaction lag).
Depending on your data loss tolerance and the application SLA, you can decide to fail over immediately or wait. If you decide to wait, you can use the readable secondary to support read access to the data while waiting.
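The transaction-lag indicator can be monitored from T-SQL. A sketch, assuming the sys.dm_geo_replication_link_status DMV is available on your service version (the DMV and its columns vary between SQL Database releases; earlier versions exposed similar data through sys.dm_continuous_copy_status):

```sql
-- Run against the primary database; one row per geo-replication link.
SELECT partner_server,
       partner_database,
       replication_state_desc,   -- e.g. CATCH_UP when the link is healthy
       last_replication,         -- timestamp of the last replicated transaction
       replication_lag_sec       -- current transaction lag in seconds
FROM sys.dm_geo_replication_link_status;
```

An application deciding whether to fail over can poll this view and compare replication_lag_sec against its own data-loss tolerance.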
That said, we also recognize the value of automatic failover and are looking at a combination of application control and automatic failover for those who decided to wait.
Re the cost concern, as Bill mentioned, we are working on additional DR options for the Basic and Standard editions, but those are not yet available.
Regards,
Sasha -
I'm getting an error on a line in the middle of a larger sql script, only in SQL Azure.
IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'foouser')
CREATE USER [foouser] FOR LOGIN [foouser] WITH DEFAULT_SCHEMA=[dbo]
GO
Error: "The CREATE USER statement must be the only statement in the batch."
I don't actually understand what 'the only statement in the batch' means.
What is a batch? Is it a SQL file? Is it related to a 'GO' statement or an 'IF' statement? What is the reason for the error? And how do I avoid it?
Thanks,
Tim
>IF...ELSE imposes conditions on the execution of a Transact-SQL statement
I understand the general purpose of an IF statement. I could also let go of our disagreement over how statements are counted, except that because of the error I'm stuck.
It's less important for CREATE USER, but what really puzzles me now is a very similar issue: how am I supposed to do a safe version of CREATE LOGIN when I don't know whether the login has previously been created on the server or whether I am setting up the database on a clean server?
IF NOT EXISTS (SELECT * FROM sys.server_principals WHERE name = N'foouser')
CREATE LOGIN [foouser] WITH PASSWORD = 'asdfasdf'
GO
If I try and execute this script, it throws the same error as above.
The first unworkable workaround idea is to omit the IF statement:
CREATE LOGIN [foouser] WITH PASSWORD = 'asdfasdf'
GO
But if the login already exists on the server (because a similar script was already run), then the script throws an error.
The second unworkable workaround idea is to do
DROP LOGIN [foouser]
GO
CREATE LOGIN [foouser] WITH PASSWORD = 'asdfasdf'
GO
Obviously this throws an error in the second block if the login doesn't already exist on the server.
The third workaround idea I have is to go conditional by putting an IF condition around DROP instead of CREATE:
Unfortunately that doesn't work for me either!
"The DROP LOGIN statement must be the only statement in the batch"
(This is despite the fact that 'DROP LOGIN' is listed on the supported commands page, not the partially supported one, which disagrees with the notes on this page.)
Anyway, the real question I am interested in addressing is: is there actually a way to have a 'create/drop login if exists' operation which is SQL Azure-compatible and doesn't throw error messages (which mess with the SQL execution tool I am using)?
If there is no way, I would like to believe it's because it would be a bad idea to do this. But in that case why is it a bad idea?
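One workaround that is consistent with the "only statement in the batch" restriction (a sketch, not from this thread; the login name and password are the question's own placeholders) is to wrap the statement in dynamic SQL, so that the EXEC call is the guarded statement while CREATE LOGIN runs in its own inner batch:

```sql
-- Run in the master database; [foouser] and the password are placeholders.
IF NOT EXISTS (SELECT * FROM sys.sql_logins WHERE name = N'foouser')
    EXEC sp_executesql N'CREATE LOGIN [foouser] WITH PASSWORD = ''asdfasdf''';
GO

-- Same pattern for the database user, run in the user database:
IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'foouser')
    EXEC sp_executesql
        N'CREATE USER [foouser] FOR LOGIN [foouser] WITH DEFAULT_SCHEMA = [dbo]';
GO
```

The same EXEC wrapping works for a conditional DROP LOGIN, since the restriction is on the batch containing the statement, not on the statement itself.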
Tim -
How to add description of a column of a table in SQL Azure
Hi
I have some tables in my application database where there are descriptions added against certain columns. Needless to say they were done by using sp_addextendedproperty.
Now I am trying to migrate the Database to SQL Azure. SQL Azure does not support sp_addextendedproperty.
Hence I am not able to figure out how to add descriptions to those columns.
Any help would be much appreciated.
Thanks
Soumyadeb
Hello,
As Latheesh posted above, Windows Azure SQL Database does not support extended stored procedures. That's one of the limitations of SQL Database, and I don't know of another way to achieve the same on Azure.
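In the absence of sp_addextendedproperty, one common workaround is to keep column descriptions in an ordinary application-owned table instead. A sketch (the table and sample values are illustrative, not part of any Azure feature):

```sql
-- Application-owned substitute for extended properties.
CREATE TABLE dbo.ColumnDescriptions (
    TableName   sysname       NOT NULL,
    ColumnName  sysname       NOT NULL,
    Description nvarchar(500) NOT NULL,
    CONSTRAINT PK_ColumnDescriptions PRIMARY KEY (TableName, ColumnName)
);

INSERT INTO dbo.ColumnDescriptions (TableName, ColumnName, Description)
VALUES (N'Orders', N'StatusCode', N'0 = open, 1 = shipped, 2 = cancelled');
```

Tools that read descriptions via fn_listextendedproperty won't see these, but your own documentation and migration scripts can query the table directly.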
Regards,
Fanny Liu
TechNet Community Support -
Cannot refresh data in Excel Services with SQL Azure databases
I am using Excel Services on a SharePoint Online.
I get my data from SQL Azure. When I create my Excel report with Excel 2013 Pro I have no problem. So I upload my file to my SharePoint site and try to refresh the data.
Connection: Power Query - RPT_Event_ByEventType
Error: On-premise site error: Sorry, the data source for this data connection isn't registered for Power BI. Ask your Power BI admin to register the data source in the Power BI admin center.
I do not understand why I get that error; my data source is on Azure, so why does it say "OnPremise"?
Hi,
>> this button of excel gets just the address of the web and has a button for importing it
I tested it with a REST API project, but it doesn't work; do you know how it works?
Do you mean that you don't know how to get the table? You may input the site address into the address box, click the Go button nearby, and select the table you want to import into Excel. Then click the Import button. That also works for a REST API, and your REST API should return the data that you want.
By the way, this forum is for discussions about Excel development (VBA, customization); for questions about Excel features it would be better to go to the TechNet forum for Excel, so that you can get more professional help.
Best Regards
Lan
-
SQL Azure - Intermittent The wait operation timed out
I have a website engine which runs a few hundred "white label" sites. It's hosted on Azure with a SQL Azure Business database. Generally everything is fine - it all works and runs at a good speed.
However, throughout the day I get maybe 40 or 50 of the error:
System.ComponentModel.Win32Exception: The wait operation timed out
Please don't refer me to the connectivity blog at http://blogs.msdn.com/b/sqlazure/archive/2010/03/22/9982979.aspx as that seems to cover problems where you just can't connect. My problem is that it's fine most of the time, but I still get these errors intermittently.
This is sometimes on the main database, but we're also using a database for sessions and this gets the errors too. Both databases are on the same server.
I also get errors like:
An existing connection was forcibly closed by the remote host
and:
System.Data.SqlClient.SqlException: The service has encountered an error processing your request. Please try again. Error code 40143. A severe error occurred on the current command. The results, if any, should be discarded.
and, when evil bots are hammering the site:
System.Data.SqlClient.SqlException: Resource ID : 1. The request limit for the database is 180 and has been reached. See 'http://go.microsoft.com/fwlink/?LinkId=267637' for assistance.
Each website can potentially have a Google footprint of around 10,000 pages. The result it that bots are hitting the sites regularly, indexing lots of pages for hundreds of sites. I also have some worker roles doing data work. The database is clearly busy!
I am hoping to add 2 or 3 times the number of sites that I currently have to the "engine".
I am looking at efficiency where possible, but the sites are clearly under a fair load from bots and visitors.
My question is: will one of the upgrades from Business to S2, P1, P2 or P3 resolve these problems? The prices of these database tiers vary greatly, so I wouldn't want to upgrade and find I'm left with the same problems but paying many times more each month.
Thank you in advance.
Hello,
For a Web/Business edition database, the maximum limit on concurrent requests is 180. Beyond this limit, you will receive errors.
The max worker threads for Standard (S2) is 100; you should upgrade your database to the Premium tier.
The concurrent request limit of a Premium database varies depending on its reservation size. For example, for P1, the max worker threads is 200.
Reference: Azure SQL Database Resource Governance
Azure SQL Database Service Tiers and Performance Levels
Regards,
Fanny Liu
TechNet Community Support -
Performance is too slow on SQL Azure box
Hi,
Performance is too slow on our SQL Azure box (located in Europe).
The query below returns 500,000 rows in 18 minutes on the SQL Azure box (connected via SSMS from India):
SELECT * FROM TABLE_1
Whereas on a local server it returns 500,000 rows in about 30 seconds.
SQL Azure configuration:
Service Tier/Performance Level : Premium/P1
DTU : 100
MAX DB Size : 500GB
Max Worker Threads : 200
Max Sessions : 2400
Benchmark Transaction Rate : 105 transactions per second
Predictability : Best
Any suggestion would be highly appreciated.
Thanks,
Hello,
Can you please explain the scenario you are testing in a little more detail? Are you comparing a SQL Database in Europe against a SQL Database in India, or a SQL Database against a local, on-premises SQL Server installation?
In the first scenario, the round-trip latency of the connection to the datacenter might play a role.
If you are comparing against a local installation, please note that you might be running on completely different hardware and without network delay, resulting in very different results.
In both cases you can use the blog post below to assess the resource utilization of the SQL Database during the operation:
http://azure.microsoft.com/blog/2014/09/11/azure-sql-database-introduces-new-near-real-time-performance-metrics/
If the DB utilization reaches 100%, you might have to consider upgrading to a higher performance level to achieve the throughput you are looking for.
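The resource utilization can also be checked directly from T-SQL via the sys.dm_db_resource_stats DMV (a sketch, assuming the newer service tiers where this DMV is available; run in the user database itself):

```sql
-- Utilization averages over recent 15-second windows;
-- only about an hour of history is retained.
SELECT TOP (20)
       end_time,
       avg_cpu_percent,
       avg_data_io_percent,
       avg_log_write_percent
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;
```

Sustained values near 100% in any column during the slow query indicate the performance level itself is the bottleneck rather than network latency.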
Thanks,
Jan -
Link a Crystal Report report with an SQL Azure database
Hi,
I want to use my database on SQL Azure in Crystal Report. So I want to link my reports with data contained not in a local db, but in a SQL Azure one.
Inside Crystal Reports I created a new ADO connection to my SQL Azure database, providing the server, database, user, and password, and Crystal Reports recognized the database. But when I go to the Database Expert and try to set this ADO connection inside my report, I receive this error:
"Not Implemented
Source ADODB.Connection
L'operazione richiesta non è supportata dall'oggetto o dal provider (operation not supported by the object or by the provider)"
Why? How can I use my SQL Azure data in my Crystal Reports?
Thanks
Hi Delfins,
Please create a UDL file to test the connection, ensure the connection is fine and then use the same connection string in your Crystal Report.
For UDL file, you can refer to:
http://msdn.microsoft.com/en-us/library/e38h511e(VS.71).aspx
Hope this helps,
Raymond
Raymond Li - MSFT -
Unable to save a report that includes a datasource of "Microsoft SQL azure" type
I have installed SSRS in Azure using the following instructions (http://msdn.microsoft.com/en-us/library/dn449661.aspx) and all seems to work fine. However, when I create a report in Report Builder 2014 (in this case empty) that includes a Microsoft SQL Azure datasource type, I am unable to save the report and get the following error message (even though the connection test succeeds):
"The report definition was saved, but one or more errors occurred while setting the report properties"
Reports with standard sql datasources work fine.
I have also tried creating the report using Visual Studio 2013 and get a similar error message.
I have tried this using SQL 2012 and SQL 2014 and get the same error.
Does anybody know how I can create a report with Microsoft SQL Azure datasource type?
Hi jamesla,
Based on my research, the issue can be caused by a deleted shared data source that still exists under the Data Sources list in Report Builder. For more details about this scenario, we can refer to the following thread:
http://social.msdn.microsoft.com/Forums/sqlserver/en-US/7170dbee-048c-4298-89ba-df4d42924c8e/the-report-definition-was-saved-but-one-or-more-errors-occurred-while-setting-report-properties?forum=sqlreportingservices
Since the error message has no detail, we can try rendering the report to see the detailed error message. We can also check the log file. The SQL Reporting Services log files are found on the Reporting Services point server, in the folder %programfiles%\Microsoft SQL Server\<SQL Server Instance>\Reporting Services\LogFiles.
For more information about how to use Microsoft SQL azure as the data source of a report, please see:
http://msdn.microsoft.com/en-IN/library/ff519560.aspx
http://programming4.us/database/2158.aspx
Thanks,
Katherine Xiong
Katherine Xiong
TechNet Community Support -
Create a web site in Visual Studio - fails with SQL Azure V12
Creation of Microsoft Azure Website failed. <Error xmlns="Microsoft.SqlServer.Management.Framework.Web.Services" xmlns:i="http://www.w3.org/2001/XMLSchema-instance"><Message>The service objective 'Web' specified is invalid.</Message><InnerError i:nil="true"/><SqlErrorCode>40804</SqlErrorCode><Severity>16</Severity></Error>
I receive this error after connecting to a database using the Preview edition of SQL Azure V12 with a service level of 'Basic'.
'Web' may need to be changed to 'Basic' or 'Standard' depending on the service level. How can I do this?
Regards
David
Hi,
Thanks for posting here.
Upgrading Web and Business Databases
Upgrading Web or Business databases to a new service tier/performance level does not take the database offline. The database continues to work through the upgrade operation. At the time of the actual transition to the new performance level, temporary dropping of connections to the database can happen for a very short duration (typically measured in seconds). If an application has transient fault handling for connection terminations, that is sufficient to protect against dropped connections at the end of the upgrade.
Upgrading a Web or Business database to a new service tier involves the following steps:
Determine service tier based on feature capability
Determine an acceptable performance level based on historical resource usage
Why does existing performance for my Web or Business database map to the higher Premium levels?
Tuning your workload to fit a lower performance level
Upgrade to the new service tier/performance level
Monitor the upgrade to the new service tier/performance level
Monitor the database after the upgrade
Refer:
http://azure.microsoft.com/en-us/documentation/articles/sql-database-upgrade-new-service-tiers/
https://msdn.microsoft.com/en-us/library/azure/dn741336.aspx
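The service objective itself can be changed from T-SQL. A sketch (the database name is a placeholder, and the exact syntax accepted depends on the server version; run while connected to the master database of the logical server):

```sql
-- Move a database off the legacy 'Web' objective to Standard/S0.
ALTER DATABASE [MyDb]
MODIFY (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S0');
```

The same change can be made from the Azure portal's scale blade; the ALTER DATABASE statement returns quickly and the transition completes asynchronously.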
Hope this helps you.
Girish Prajwal