SQL Azure Sync

I have created a sync between two Azure databases, and the hub database has a size of 20 GB. How long will the sync process take to complete? Could anyone help me with this?

Hi,
I am sure you have the answer by now; if not, please have a look at this similar thread:
http://social.msdn.microsoft.com/Forums/windowsazure/en-US/cab00a4f-603f-43f2-9a22-e0406db19a77/many-problems-with-sql-azure-data-sync
Regards,
Mekh.

Similar Messages

  • Geo-Replication in SQL Azure

    We are looking to move from geographically distributed and replicated physical/virtual data machines to SQL Azure.
    We would like to have synced/replicated Azure SQL DB instances in multiple regions. I would appreciate it if you could answer some questions related to how to achieve this and what to expect:
    1. Is Azure SQL Sync the correct approach for geo-replicating Azure SQL Databases?
    2. What is the performance SLA that I can expect for the synchronization? I would like to know the minimum and maximum delay possible for the synchronization. (I can see that the minimum sync frequency is 5 minutes. What would be the sync delay when the sync is triggered every 5 minutes?)

    Hello,
    1. SQL Data Sync can be used for synchronization across SQL Database sites or between SQL Database and an on-premise SQL Server instance. But also note: SQL Data Sync is currently available only as a Preview, is meant only for gathering product feedback for future releases, and should not be used in production environments. Currently, there are limits in SQL Data Sync.
    2. Windows Azure SQL Database keeps triple redundancy within the Azure data center, combined with the 99.9% availability SLA. SQL Database does not currently provide any SLA for performance or for security. As for SQL Data Sync, I cannot find any documentation in BOL about an SLA for SQL Data Sync.
    Reference: SQL Data Sync (Preview) Best Practices
    Regards,
    Fanny Liu
    TechNet Community Support

  • SQL Azure sync service-- absurdly slow and fails after a few days

    Hello. We have been trying to use Azure Data Sync to replicate an on-premise MSSQL database to a SQL Azure database for read-only access by a customer. This was working for a while, but stopped syncing after a couple of months (12-hour auto-sync schedule) with no errors in the log. I had to re-create the sync group, but now it takes even longer than originally to sync, and never actually completes, as it gets interrupted by bi-weekly server restarts. It used to take a few hours to sync the new data in our database (which is appended to daily), but this time it fails after 4+ days. It was unacceptably slow initially (IMO), but now it's clearly unusable. The original initialization of the data when I first set it up took less than 2 days of syncing.
    It seems there is a throttle on the Azure sync service. Is this true? Would it be best to clear the SQL Azure database now and re-sync? Is there a way to pre-load the SQL Azure database with MSSQL on-premise data via a SQL backup file or something?
    Please advise. Thank you.

    When you re-created the sync group, did the member databases/hub database have pre-existing data?
    When syncing a sync group for the first time, make sure the databases don't contain the same set of data; otherwise you will run into conflicts, which will dramatically slow down your sync.
    I deleted the initial sync group because it wasn't syncing (auto or on-demand), nor creating a log entry with an error indicating why.
    So I simply deleted the sync group and re-created it with the exact same databases and settings. I did not delete all the data in the SQL Azure database -- I was under the assumption that the sync service, with its tracking tables, was smart enough not to get confused by pre-existing data, but apparently that's not how this works?
    I obviously can't delete the data in the source database (MSSQL on-premise), but I could delete the tables in the SQL Azure database if that's supposed to fix the problem -- then we'll just have to wait multiple days for it to be completely re-initialized, hopefully without error. Is there a way to seed the data in some way to prevent this extremely long first sync?
    Thank you for your help.

  • "The CREATE USER statement must be the only statement in the batch" in SQL Azure - why? what to do?

    I'm getting an error on a line in the middle of a larger SQL script, only in SQL Azure.
    IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'foouser')
    CREATE USER [foouser] FOR LOGIN [foouser] WITH DEFAULT_SCHEMA=[dbo]
    GO
    Error: "The CREATE USER statement must be the only statement in the batch."
    I don't actually understand what 'the only statement in the batch' means.
    What is a batch? Is it a SQL file? Is it related to a 'GO' statement or an 'IF' statement? What is the reason for the error? And how do I avoid it?
    Thanks,
    Tim

    >IF...ELSE imposes conditions on the execution of a Transact-SQL statement
    I understand the general purpose of an IF statement. I could let go of our disagreement over how statements are counted, too, except that because of the error I'm stuck.
    It's less important for CREATE USER, but what I am really puzzled over now is a very similar issue: how am I supposed to do a safe version of CREATE LOGIN when I don't know whether a login has previously been created on the server or whether I am setting up the database on a clean server?
    IF NOT EXISTS (SELECT * FROM sys.server_principals WHERE name = N'foouser')
    CREATE LOGIN [foouser] WITH PASSWORD = 'asdfasdf'
    GO
    If I try and execute this script, it throws the same error as above.
    The first unworkable workaround idea is to omit the if statement
    CREATE LOGIN [foouser] WITH PASSWORD = 'asdfasdf'
    GO
    But if the login already exists on the server (because a similar script was already run), then the script throws an error.
    The second unworkable workaround idea is to do
    DROP LOGIN [foouser]
    GO
    CREATE LOGIN [foouser] WITH PASSWORD = 'asdfasdf'
    GO
    Obviously this throws an error in the second block if the login doesn't already exist on the server.
    The third workaround idea I have is to go conditional by putting an IF condition around DROP instead of CREATE:
    Unfortunately that doesn't work for me either!
    "The DROP LOGIN statement must be the only statement in the batch"
    (This is despite the fact that 'drop login' is listed on the
    supported commands page, not the partially supported page..?! Which disagrees with the notes on
    this page.)
    Anyway, the real question I am interested in addressing is: is there actually a way to have a 'create/delete login
    if exists' operation which is SQL Azure compatible and doesn't throw error messages (which mess with the SQL execution tool I am using)?
    If there is no way, I would like to believe it's because it would be a bad idea to do this. But in that case why is it a bad idea?
    Tim
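    One workaround that is often used on SQL Azure (a sketch, not official guidance): keep the IF check in the outer batch and wrap the CREATE statement in dynamic SQL with EXEC, so the CREATE runs as its own batch. For the login (run against the master database):
    IF NOT EXISTS (SELECT * FROM sys.sql_logins WHERE name = N'foouser')
        EXEC (N'CREATE LOGIN [foouser] WITH PASSWORD = ''asdfasdf''');
    GO
    And for the user in the target database:
    IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'foouser')
        EXEC (N'CREATE USER [foouser] FOR LOGIN [foouser] WITH DEFAULT_SCHEMA = [dbo]');
    GO
    Because the dynamic string is executed as its own batch, the "must be the only statement in the batch" check is satisfied, and the same pattern works for a conditional DROP LOGIN.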

  • How to add description of a column of a table in SQL Azure

    Hi
    I have some tables in my application database where there are descriptions added against certain columns. Needless to say they were done by using sp_addextendedproperty.
    Now I am trying to migrate the Database to SQL Azure. SQL Azure does not support sp_addextendedproperty.
    Hence I am not able to figure out how to add descriptions to those columns.
    Any help would be much appreciated.
    Thanks
    Soumyadeb

    Hello,
    As Latheesh posted above, Windows Azure SQL Database does not support extended stored procedures. That's one of the limitations of SQL Database, and I don't know of another way to achieve the same thing on Azure.
    Regards,
    Fanny Liu
    TechNet Community Support
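    If the goal is only to keep the column descriptions somewhere queryable, one possible workaround (a minimal sketch; the table and the sample row below are hypothetical) is to store them in an ordinary user table instead of extended properties, since plain tables are fully supported:
    CREATE TABLE dbo.ColumnDescriptions
    (
        TableName   sysname       NOT NULL,
        ColumnName  sysname       NOT NULL,
        Description nvarchar(500) NOT NULL,
        CONSTRAINT PK_ColumnDescriptions PRIMARY KEY (TableName, ColumnName)
    );
    INSERT INTO dbo.ColumnDescriptions (TableName, ColumnName, Description)
    VALUES (N'Customers', N'Address', N'Postal address of the customer');
    The descriptions won't show up in SSMS tooling the way extended properties do, but they survive the migration and can be joined against INFORMATION_SCHEMA.COLUMNS for documentation purposes.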

  • Cannot refresh data in Excel Services with SQL Azure databases

    I am using Excel Services on SharePoint Online.
    I get my data from SQL Azure. When I create my Excel report with Excel 2013 Pro I have no problem. So I upload my file to my SharePoint site and try to refresh the data.
    Connection: Power Query - RPT_Event_ByEventType
    Error: Site error (OnPremise): Sorry, the data source for this data connection isn't registered for Power BI. Ask your Power BI
    admin to register the data source in the Power BI admin center.
    I do not understand why I get that error; my data source is on Azure, so why does it say "OnPremise"?

    hi,
    >> this button of Excel just takes a web address and has a button for importing it
         I tested it with a REST API project, but it doesn't work; do you know how it works?
    Do you mean that you don't know how to get the table? You may input the site address into the address box, click the Go button nearby, select the table you want to import into Excel, and then click the Import button. That also works for a REST API, and your REST API should return the data that you want.
    By the way, this is the forum for discussions about Excel development (VBA, customization); it is better to go to the TechNet forum for Excel for questions about Excel features, so that you can get more professional help.
    Best Regards
    Lan

  • SQL Azure - Intermittent The wait operation timed out

    I have a website engine which runs a few hundred "white label" sites. It's hosted on Azure with a SQL Azure Business database. Generally everything is fine - it all works and runs at a good speed.
    However, throughout the day I get maybe 40 or 50 of the error:
    System.ComponentModel.Win32Exception: The wait operation timed out
    Please don't refer me to the connectivity blog at http://blogs.msdn.com/b/sqlazure/archive/2010/03/22/9982979.aspx as this seems to refer to problems where you just can't connect. My problem is that it's fine most of the time, but I still get these
    intermittently.
    This is sometimes on the main database, but we're also using a database for sessions and this gets the errors too. Both databases are on the same server.
    I also get errors like: 
    An existing connection was forcibly closed by the remote host
    and:
    System.Data.SqlClient.SqlException: The service has encountered an error processing your request. Please try again. Error code 40143. A severe error occurred on the current command. The results, if any, should be discarded.
    and, when evil bots are hammering the site:
    System.Data.SqlClient.SqlException: Resource ID : 1. The request limit for the database is 180 and has been reached. See 'http://go.microsoft.com/fwlink/?LinkId=267637' for assistance.
    Each website can potentially have a Google footprint of around 10,000 pages. The result is that bots are hitting the sites regularly, indexing lots of pages for hundreds of sites. I also have some worker roles doing data work. The database is clearly busy!
    I am hoping to add 2 or 3 times the number of sites that I currently have to the "engine".
    I am looking at efficiency where possible, but the sites are clearly under a fair load from bots and visitors.
    My question is: will one of the upgrades from Business to S2, P1, P2 or P3 resolve these problems? The financial costs of these database tiers differ greatly, so I wouldn't want to upgrade and find I'm left with the same problems but am paying many times more each month.
    Thank you in advance.

    Hello,
    For a Web/Business edition database, the maximum limit of concurrent requests is 180. Beyond this limit, you will receive an error.
    The max worker threads for Standard (S2) is 100, so you should upgrade your database to the Premium tier.
    The concurrent request limit of a Premium database varies depending on the reservation size of the database. For example, for P1, the max worker threads is 200.
    Reference: Azure SQL Database Resource Governance
    Azure SQL Database Service Tiers and Performance Levels
    Regards,
    Fanny Liu
    TechNet Community Support
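    Before paying for a higher tier, it can also help to confirm the database really is hitting its limits. A rough sketch, assuming the database is on (or has been moved to) one of the newer tiers that expose this DMV:
    SELECT TOP (20)
        end_time,
        avg_cpu_percent,
        avg_data_io_percent,
        avg_log_write_percent,
        max_worker_percent,
        max_session_percent
    FROM sys.dm_db_resource_stats   -- 15-second snapshots covering roughly the last hour
    ORDER BY end_time DESC;
    If the worker or session percentages sit near 100% while the bots are hammering the sites, the request-limit errors above are expected, and either a higher performance level or query/indexing work is needed.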

  • Performance is too slow on SQL Azure box

    Hi,
    Performance is too slow on SQL Azure box (Located in Europe)
    The query below returns 500,000 rows in 18 minutes on the SQL Azure box (connected via SSMS, located in India)
    SELECT * FROM TABLE_1
    Whereas, on the local server it returns 500,000 rows in 30 seconds.
    SQL Azure configuration:
    Service Tier/Performance Level : Premium/P1
    DTU       : 100
    MAX DB Size : 500GB     
    Max Worker Threads : 200          
    Max Sessions     : 2400
    Benchmark Transaction Rate      : 105 transactions per second
    Predictability : Best
    Any suggestion would be highly appreciated.
    Thanks,

    Hello,
    Can you please explain in a little more detail the scenario you are testing? Are you comparing a SQL Database in Europe against a SQL Database in India? Or a SQL Database against a local, on-premise SQL Server installation?
    In the first scenario, the round-trip latency for the connection to the datacenter might play a role.
    If you are comparing against a local installation, please note that you might be running against completely different hardware specifications and without network delay, resulting in very different results.
    In both cases you can use the below blog post to assess the resource utilization of the SQL Database during the operation:
    http://azure.microsoft.com/blog/2014/09/11/azure-sql-database-introduces-new-near-real-time-performance-metrics/
    If the DB utilization reaches 100%, you might have to consider upgrading to a higher performance level to achieve the throughput you are looking for.
    Thanks,
    Jan 
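    One quick way to separate server-side execution time from network transfer time (a sketch using the table name from the original post):
    SET STATISTICS TIME ON;
    -- Returns a single row, so almost nothing crosses the network;
    -- the elapsed time is dominated by server-side work.
    SELECT COUNT(*) FROM TABLE_1;
    SET STATISTICS TIME OFF;
    If this completes in seconds while SELECT * takes 18 minutes from India, the bottleneck is shipping 500,000 rows across the WAN rather than the P1 database itself.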

  • Link a Crystal Report report with an SQL Azure database

    Hi,
    I want to use my database on SQL Azure in Crystal Reports. So I want to link my reports with data contained not in a local db, but in a SQL Azure one.
    Inside Crystal Reports I have created a new ADO connection to my SQL Azure database, providing server, db, user, and password, and Crystal Reports has recognized the database. But when I go to the Database Expert and try to set this ADO connection inside my report,
    I receive this error:
    "Not Implemented
    Source ADODB.Connection
    L'operazione richiesta non è supportata dall'oggetto o dal provider (the requested operation is not supported by the object or by the provider)"
    Why? How can I use my SQL Azure data in my Crystal Reports reports?
    Thanks

    Hi Delfins,
    Please create a UDL file to test the connection, ensure the connection is fine and then use the same connection string in your Crystal Report.
    For UDL file, you can refer to:
    http://msdn.microsoft.com/en-us/library/e38h511e(VS.71).aspx
    Hope this helps,
    Raymond
    Raymond Li - MSFT
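    For reference, a UDL file is just a small text file you can double-click to test an OLE DB connection before pasting the same connection string into Crystal Reports. A sketch of what its contents might look like for SQL Azure (server, database, user and provider names are placeholders; use whichever SQL Server OLE DB provider is installed on your machine):
    [oledb]
    ; Everything after this line is an OLE DB initstring
    Provider=SQLNCLI11;Data Source=tcp:yourserver.database.windows.net,1433;Initial Catalog=yourdatabase;User ID=youruser@yourserver;Password=yourpassword;Use Encryption for Data=True;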

  • Unable to save a report that includes a datasource of "Microsoft SQL azure" type

    I have installed SSRS in Azure using the following instructions (http://msdn.microsoft.com/en-us/library/dn449661.aspx) and all seems to work fine. However, when I create a report in Report Builder 2014 (in this case empty) that includes a Microsoft SQL Azure
    data source type, I am unable to save the report and get the following error message (even though when I test the connection it succeeds).
    "The report definition was saved, but one or more errors occurred while setting the report properties"
    Reports with standard SQL data sources work fine.
    I have also tried creating the report using Visual Studio 2013 and get a similar error message.
    I have tried this using SQL 2012 and SQL 2014 and get the same error.
    Does anybody know how I can create a report with the Microsoft SQL Azure data source type?

    Hi jamesla,
    Based on my research, the issue can be caused by a deleted shared data source that still exists under the Data Sources list in Report Builder. For more details about this scenario, we can refer to the following thread:
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/7170dbee-048c-4298-89ba-df4d42924c8e/the-report-definition-was-saved-but-one-or-more-errors-occurred-while-setting-report-properties?forum=sqlreportingservices
    Since the error message lacks detailed information, we can try to render the report to see the detailed error message. We can also check the log file. The SQL Reporting Services log files are found on the Reporting Services point server, in
    the folder %programfiles%\Microsoft SQL Server\<SQL Server Instance>\Reporting Services\LogFiles.
    For more information about how to use Microsoft SQL azure as the data source of a report, please see:
    http://msdn.microsoft.com/en-IN/library/ff519560.aspx
    http://programming4.us/database/2158.aspx
    Thanks,
    Katherine Xiong
    Katherine Xiong
    TechNet Community Support

  • Create a web site in Visual Studio - fails with SQL Azure V12

    Creation of Microsoft Azure Website failed. <Error xmlns="Microsoft.SqlServer.Management.Framework.Web.Services" xmlns:i="http://www.w3.org/2001/XMLSchema-instance"><Message>The service objective 'Web' specified is invalid.</Message><InnerError
    i:nil="true"/><SqlErrorCode>40804</SqlErrorCode><Severity>16</Severity></Error>
    I receive this error after connecting to a database using the Preview Edition of SQL Azure V12 with a service level of 'Basic'
    'Web' may need to be changed to 'Basic' or 'Standard' depending on the service level. How can I do this?
    Regards
    David

    Hi,
    Thanks for posting here.
    Upgrading Web and Business Databases
    Upgrading Web or Business databases to a new service tier/performance level does not take the database offline. The database will continue to work through the upgrade operation. At the time of the actual transition to the new performance level, connections to the database
    can be dropped temporarily for a very short duration (typically measured in seconds). If an application has transient fault handling for connection terminations, then that is sufficient to protect against dropped connections at the
    end of the upgrade.
    Upgrading a Web or Business database to a new service tier involves the following steps:
    Determine service tier based on feature capability
    Determine an acceptable performance level based on historical resource usage
    Why does existing performance for my Web or Business database map to the higher Premium levels?
    Tuning your workload to fit a lower performance level
    Upgrade to the new service tier/performance level
    Monitor the upgrade to the new service tier/performance level
    Monitor the database after the upgrade
    Refer:
    http://azure.microsoft.com/en-us/documentation/articles/sql-database-upgrade-new-service-tiers/
    https://msdn.microsoft.com/en-us/library/azure/dn741336.aspx
    Hope this helps you.
    Girish Prajwal
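    For the database-side half of this ("how can I change 'Web' to 'Basic' or 'Standard'?"), one option besides the portal is T-SQL against the logical server's master database. A minimal sketch with a placeholder database name:
    ALTER DATABASE [MyDatabase]
    MODIFY (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S0');
    Whether the Visual Studio website-creation wizard then stops requesting the retired 'Web' objective is a separate question, so treat this only as a way to move an existing database onto a valid tier, not as a guaranteed fix for the tooling error.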

  • Unable to drop SQL azure table

    Hi,
    I'm trying to drop a SQL Azure table. I can delete the data inside the table; however, when I use the DROP command, it takes a long time processing and finally gives this error message: 'Connection Failed'. Please help me. Thanks

    That sounds like an internal error in SQL Server which may be due to corruption. I don't really know how you deal with that in Azure, but if DBCC CHECKDB is available, run it and see what happens.
    Also, do you have any DDL triggers on the database? In that case, disable the trigger, in case it is the trigger that is failing.
    Erland Sommarskog, SQL Server MVP, [email protected]
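    Following up on the DDL trigger suggestion, a quick sketch of how to check for database-scoped DDL triggers and for blocking while the DROP is hanging:
    -- Database-scoped DDL triggers that could fire on DROP TABLE
    SELECT name, is_disabled
    FROM sys.triggers
    WHERE parent_class_desc = 'DATABASE';
    -- From another session: is the DROP blocked by someone else?
    SELECT session_id, blocking_session_id, wait_type, wait_time, command
    FROM sys.dm_exec_requests
    WHERE blocking_session_id <> 0;
    A DROP TABLE needs a schema-modification lock, so a long-running or uncommitted transaction that is still reading the table can make the DROP appear to hang until the client times out, which may be what surfaces as 'Connection Failed'.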

  • SQL Server 2012 Deploy Database to SQL Azure...

    I have spent many days trying to copy a simple test database from my PC to SQL Azure but with little success.  Then I installed SQL Server 2012 and see a new task:
    Deploy Database to SQL Azure...  It looks like an automated extraction to DAC and then creating the Azure database all in one step.
    I was elated, but got hit with the following three types of errors.
    One or more unsupported elements were found in the schema used as part of a data package.
    Error SQL71564: The element Extended Property: [dbo].[Accounts].[Address].[MS_Description] is not supported when used as part of a data package (bacpac).
    Error SQL71564: Table Table: [dbo].[Activities] does not have a clustered index.  Clustered indexes are required for inserting data in this version of SQL Server.
    Error SQL71564: Element User: [NT AUTHORITY\NETWORK SERVICE] has an unsupported property AuthenticationType set and is not supported when used as part of a data package.
     (Microsoft.SqlServer.Dac)
    May I know what I can do on the on-premise database to get rid of the first and third errors?
    Thanks.

    Hello,
    1) Remove the Extended Property from the (?) column Address in table Accounts.
    BTW, for small databases it's sometimes easier to generate a script for the database (and modify it slightly) instead of using the Wizard.
    Olaf Helper
    Blog
    Xing
    What is an Extended Property? What is this MS_Description? I can't see it in the table designer or the Column Properties. (I don't know whether this was corruption caused by the Data Sync I used earlier.) How do I remove it?
    Of all the methods of copying SQL Server to Azure SQL, this one gives the fewest errors. If I can solve this Extended Properties issue, hopefully I can migrate the database over.
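    To answer the follow-up directly: MS_Description is the name of the extended property that tools use to store column descriptions (the ones added with sp_addextendedproperty, as in the earlier thread). A sketch of how to list and remove it on the on-premise database before building the bacpac (schema, table and column names are taken from the error message above):
    -- List extended properties on the columns of dbo.Accounts
    SELECT objtype, objname, name, value
    FROM sys.fn_listextendedproperty(NULL, 'schema', 'dbo', 'table', 'Accounts', 'column', default);
    -- Drop the description on the Address column
    EXEC sys.sp_dropextendedproperty
         @name       = N'MS_Description',
         @level0type = N'SCHEMA', @level0name = N'dbo',
         @level1type = N'TABLE',  @level1name = N'Accounts',
         @level2type = N'COLUMN', @level2name = N'Address';
    -- For the third error, one option (only if the Windows user isn't needed) is:
    -- DROP USER [NT AUTHORITY\NETWORK SERVICE];
    Repeat the drop for each column the export complains about, or script it across sys.extended_properties if there are many.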

  • Can't create a new SQL Azure Standard Tier db

    I'm having an issue creating a new SQL Azure Standard tier db. Here are the basic steps I've followed to migrate a Web tier db.
    Register for the sql preview programme
    https://account.windowsazure.com/PreviewFeatures
    Perform a db export to blob storage
    use the + (new) button to create a new SQL db
    select the import option
    browser to your saved export
    hopefully see the new tiers as options in the dialog.
    The new tiers require a separate server from Web. You can create this from the import dialog.
    I have 2 independent Azure accounts. The above process worked for my test account, but for the live account, where I was also experimenting after success on the test account, I hit an issue. I requested to create a new server whilst importing. This step seems
    to work, but then the actual import fails with this msg:
    "Error encountered during the service operation. 
     Could not import package.
     Error SQL72014: .Net SqlClient Data Provider: Msg 40823, Level 16, State 1, Line 1 Invalid value provided for parameter EDITION. Please provide a value that is valid on server version 1.0.
     Error SQL72045: Script execution error. The executed script:
     CREATE DATABASE [$(DatabaseName)] COLLATE SQL_Latin1_General_CP1_CI_AS
     (EDITION = 'Standard', MAXSIZE = 1 GB)"
    Tried a couple of times but no joy. Using the portal I can browse to the new SQL server and it looks okay, other than that the list of enabled
    reservation sizes only has P1, P2, P3. I'm requesting a Standard (S1) db, not Premium (P1, P2, P3). On my test server I also see S1 and
    S2 in this list. As you can see in the error message, I'm requesting Edition = Standard. I get the feeling the newly created server is not accepting Standard tier dbs?
    Now that the server is created, if I try an import I see the server in the list of available servers, but when I select a tier of Basic or Standard the new server is grayed out; not so if I select Premium or the older Web or Business. Interestingly, my current live
    SQL server shows up as supporting the new Premium?
    Thanks
    Wayne 

    Hello,
    Glad to hear that the issue is resolved, and thanks for sharing.
    Regards,
    Fanny Liu
    TechNet Community Support

  • SQL Azure Reporting - There was an exception running the extensions specified in the config file. --- Maximum request length exceeded.

    I am trying to deploy an RDL file (5 MB) to the SQL Azure Reporting server in South Central US using the deploy function in SQL Server Data Tools, but am facing the following error during deployment to the Azure Reporting server.
    "There was an exception running the extensions specified in the config file. ---> Maximum request length exceeded."
    Is there any limit on the size of RDL files which can be deployed to the Azure Reporting server? I have seen some online posts which talk about increasing the maxRequestLength in the httpRuntime element of the Reporting server's web.config. But in the case of the Azure Reporting server,
    how can we make modifications to this configuration?
    I have tried to upload it directly to the SQL Azure Reporting server from the Management Portal --> Upload Report function, which still resulted in an error.
    Thanks & Regards, Deep

    Thanks for your question. Unfortunately we are in the process of deprecating SQL Reporting services.  Full details are available at http://msdn.microsoft.com/en-us/library/gg430130.aspx
    Thanks Guy
