Does SQL Azure charge cross region traffic cost?

Say, I have a SQL DB on West US, and have two cloud services, one hosted on West US and the other hosted on East US.
Assuming the two cloud services have exactly the same read/write throughput on the DB, is the cost the same? I am wondering if the East US service costs more, since it needs cross-region traffic.

Hi,
Is data transfer between Azure services located within the same region charged?
No. For example, traffic between a cloud service and an Azure SQL database in the same region does not incur any additional data transfer cost.
Is data transfer between Azure services located in two regions charged?
Yes. Outbound data transfer is charged at the normal rate and inbound data transfer is free. So in your scenario, the West US service incurs no data transfer charges, while traffic between the East US service and the West US database crosses regions and its outbound leg is billed at the normal rate.
Reference :
http://azure.microsoft.com/en-us/pricing/details/data-transfers/
Regards,
Mekh.

Similar Messages

  • Does SQL Azure support XML index?

    Hi
    I read in the doc linked below that Azure supports selective XML indexes. Is that true? I couldn't find any other documentation. I know it previously didn't support full XML indexes; has this changed?
    http://msdn.microsoft.com/en-us/library/azure/dn387405.aspx
    Does anyone know? Thanks,

    Hi Jilim,
    As Olaf said, the feature is not supported in the current version of Azure SQL Database.
    If you have any concerns about this behavior, you can submit feedback at
    http://connect.microsoft.com/SQLServer/Feedback in the hope that it is addressed in a future service pack or product release. Your feedback enables Microsoft to make software and services the best that they can be, and Microsoft might consider adding this feature
    in a following release after official confirmation.
    Thank you for your understanding.
    Regards,
    Charlie Liao
    TechNet Community Support

  • Cost of bandwidth to/from SQL Azure database? Free?

    Does anyone know if the SQL Azure or RemoteApp services charge for bandwidth used between them? It's just Microsoft's LAN, right?

    Hi,
    If the Azure RemoteApp collection and the Azure SQL Database that it is connecting to are located in the same Azure Region then there is no charge for bandwidth.  For example, if you create a RemoteApp collection in West US that is accessing
    an Azure SQL Database that is also in West US, then there is no charge for bandwidth.
    If the Azure SQL Database is located in a different region than the RemoteApp collection then you will pay for outgoing data transfer from the Azure SQL Database to the RemoteApp.
    -TP 

  • SQL firewall rule to restrict traffic from only one Azure PaaS website

    Hi,
    I have been asked to configure the firewall on the SQL PaaS instance to only allow traffic from a specified PaaS website within the same subscription. I can't see any way to set a static internal IP for the website; is there a way to identify it
    for the purpose of the SQL Database firewall rule?
    Thanks,
    Karina

    Hi Karina,
    If you used an Azure VM, you could set a static internal IP for it and host your website on the VM: https://msdn.microsoft.com/en-us/library/azure/dn630228.aspx
    But for the Azure Websites service, I don't think you can set an internal IP. You can, however, try adding the website's outbound IP addresses to your allow list if you use a Basic or Standard mode website, as sketched below.
    BTW, I suggest you post this issue on the SQL Azure forum for more help:
    https://social.msdn.microsoft.com/forums/azure/en-US/home?forum=ssdsgetstarted
    Regards,
    Will
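    For what it's worth, once the website's outbound IP addresses are known (they are shown in the portal for the website), a database-level firewall rule can be added with sp_set_database_firewall_rule. Below is a minimal sketch from C#; the server, database, rule name and IP range are placeholders, not values from this thread.
        using System;
        using System.Data.SqlClient;

        class AddFirewallRule
        {
            static void Main()
            {
                // Connect to the user database itself; database-level firewall rules
                // are stored in the database, not in master.
                var connectionString =
                    "Server=tcp:yourserver.database.windows.net,1433;" +
                    "Database=yourdb;User ID=youruser;Password=...;Encrypt=True;";

                using (var conn = new SqlConnection(connectionString))
                using (var cmd = conn.CreateCommand())
                {
                    conn.Open();

                    // Placeholder range: replace with the outbound IPs reported for the website.
                    cmd.CommandText =
                        "EXEC sp_set_database_firewall_rule " +
                        "@name = N'MyWebsiteOutbound', " +
                        "@start_ip_address = '203.0.113.5', " +
                        "@end_ip_address = '203.0.113.5';";
                    cmd.ExecuteNonQuery();
                }
            }
        }
    Note that the outbound IPs of an Azure Website are shared and can change, so the rule may need to be kept up to date.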

  • Why does a SQL Azure DACPAC upgrade (via a PowerShell script) consistently take 30min to complete

    I created a PowerShell script to upgrade a SQL Azure instance with my latest DACPAC (taken from http://msdn.microsoft.com/en-us/library/ee634742.aspx).
    What I have experienced when running my PowerShell script is that it consistently takes approximately 30min to execute. The script is idle for almost half an hour waiting on $dacstore.IncrementalUpgrade($dacName,
    $dacType, $upgradeProperties) to return from execution and nothing is printed out on the PowerShell console window. Only right at the end of the half hour does the incremental update start spitting out console messages which inform me that
    the upgrade is taking place (essentially it appears that the script has hung for 30min until it finally comes back alive and the script does this consistently every time).
    Does it usually take this long for the IncrementalUpgrade to complete and is there supposed to be a 30min period of inactivity/waiting?
    Note that I am running the PowerShell script from my local machine which is external to the Azure network.
    Thanks for any insight you can give on this; I am hoping that I can reduce this incremental upgrade process to substantially less than 30 minutes so that my continuous integration build doesn't take so long.

    According to Microsoft Support this is a known issue and will be fixed in SQL Server 2012 (code named Denali). Here are the details from Microsoft Support:
    It's a known issue that using SSMS 2008 or PowerShell to update a DAC on SQL Azure is very slow. SQL Server 2008 uses the old extraction engine, which runs a query for every column and small object. This works well against an on-premises server and meets
    SQL Server 2008's original design target. However, when managing a SQL Azure database, each query has to be transferred over the internet, and the network latency makes the old extraction approach inefficient, especially when the network is not good.
    The SQL product team is aware of this issue and designed a new extraction engine to fix it. The new engine is integrated into SQL Server 2012 (code name Denali). Unfortunately, some of the new engine's behavior would introduce breaking changes for SQL Server 2008. We tried
    different approaches but could not remove the regression barrier when applying the new engine to SQL Server 2008. Therefore, there is currently no plan to deliver the new extraction engine as a hotfix for SQL Server 2008, as that would impact current on-premises
    users and operations.

  • SQL Azure - Intermittent The wait operation timed out

    I have a website engine which runs a few hundred "white label" sites. It's hosted on Azure with a SQL Azure Business database. Generally everything is fine - it all works and runs at a good speed.
    However, throughout the day I get maybe 40 or 50 of the error:
    System.ComponentModel.Win32Exception: The wait operation timed out
    Please don't refer me to the connectivity blog at http://blogs.msdn.com/b/sqlazure/archive/2010/03/22/9982979.aspx as this seems to refer to problems where you just can't connect. My problem is that it's fine most of the time, but I still get these
    intermittently.
    This is sometimes on the main database, but we're also using a database for sessions and this gets the errors too. Both databases are on the same server.
    I also get errors like: 
    An existing connection was forcibly closed by the remote host
    and:
    System.Data.SqlClient.SqlException: The service has encountered an error processing your request. Please try again. Error code 40143. A severe error occurred on the current command. The results, if any, should be discarded.
    and, when evil bots are hammering the site:
    System.Data.SqlClient.SqlException: Resource ID : 1. The request limit for the database is 180 and has been reached. See 'http://go.microsoft.com/fwlink/?LinkId=267637' for assistance.
    Each website can potentially have a Google footprint of around 10,000 pages. The result it that bots are hitting the sites regularly, indexing lots of pages for hundreds of sites. I also have some worker roles doing data work. The database is clearly busy!
    I am hoping to add 2 or 3 times the number of sites that I currently have to the "engine". 
    I am looking at efficiency where possible, but the sites are clearly under a fair load from bots and visitors.
    My question is, will one of the upgrades from Business to S2, P1, P2 or P3 resolve these problems? The prices of these database tiers differ greatly, so I wouldn't want to upgrade and find I'm left with the same problems but paying many times
    more each month.
    Thank you in advance.

    Hello,
    For a Web/Business edition database, the limit on concurrent requests is 180. Beyond this limit, you will receive errors.
    The max worker threads for Standard (S2) is 100, so you should upgrade your database to the Premium tier.
    The concurrent request limit of a Premium database varies depending on its reservation size. For example, for P1 the max worker threads is 200.
    Reference: Azure SQL Database Resource Governance
    Azure SQL Database Service Tiers and Performance Levels
    Regards,
    Fanny Liu
    TechNet Community Support
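    As a general mitigation for the transient errors quoted above (error 40143, the 180-request limit, forcibly closed connections), retry logic with a short back-off usually helps regardless of the tier chosen. A rough sketch, not production code; the error-number list is only a sample of codes commonly treated as transient on Azure SQL Database, and the helper name is made up for illustration.
        using System;
        using System.Data.SqlClient;
        using System.Threading;

        static class SqlRetry
        {
            // Sample of error numbers commonly treated as transient on Azure SQL Database.
            static readonly int[] TransientErrors = { -2, 40143, 40197, 40501, 10928, 10929 };

            public static void ExecuteWithRetry(string connectionString, string sql, int maxAttempts = 4)
            {
                for (int attempt = 1; ; attempt++)
                {
                    try
                    {
                        using (var conn = new SqlConnection(connectionString))
                        using (var cmd = new SqlCommand(sql, conn))
                        {
                            conn.Open();
                            cmd.ExecuteNonQuery();
                            return;
                        }
                    }
                    catch (SqlException ex) when (attempt < maxAttempts &&
                                                  Array.IndexOf(TransientErrors, ex.Number) >= 0)
                    {
                        // Exponential back-off before retrying: 1s, 2s, 4s, ...
                        Thread.Sleep(TimeSpan.FromSeconds(Math.Pow(2, attempt - 1)));
                    }
                }
            }
        }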

  • SQL Azure: More Intermittent Timeouts

    Hi guys,
    We have a set of 5 online auction systems running on Windows Azure & SQL Azure. Each system consists of a single worker role and one or more web roles. Each system uses ASP.NET MVC 3 and Entity Framework, with the Repository pattern and StructureMap.
    The worker role is responsible for housekeeping and runs two groups of processes. One group is run every ten seconds, the other every second. Each process will likely run a database query or stored procedure. These are scheduled with Quartz.net
    The web role serves the public interface and back office. Among other basic crud functionality, both of these provide screens which, when open, will repeatedly call controller methods which will result in execution of stored procedure read-only queries.
    The frequency of repetition is about 2-3 seconds per client. A typical use case would be 5 back office windows open, and 25 end user windows open – all hitting the system repeatedly.
    For a long time we have been experiencing intermittent SQL timeout errors. Three of the most common ones are:
    System.Data.SqlClient.SqlException: A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.)
    System.Data.SqlClient.SqlException: A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The semaphore timeout period has expired.)
    System.Data.SqlClient.SqlException: Timeout expired.  The timeout period elapsed prior to completion of the operation or the server is not responding.
    The only predictable scenario is during an auction where a specific controller -> sproc starts to timeout during the event (presumably due to load). All other times the errors appear to be completely random and come in singles, two’s, and three’s etc.
    even during periods of user inactivity. For example the system will go 18 hours without an error and then could be 5 – 10 errors from different housekeeping methods, or perhaps a user logged on and viewed their account.
    Other info:
    I have tried running the affected queries/sprocs on SQL Azure using both local SSMS and the Azure web-based query tool – all seem to execute quickly, 1 second max. The query plans don't show anything too suspicious, although I am by no means a SQL query performance
    expert, or any other kind of expert for that matter. :)
    We have wrapped all affected areas in Azure SQL Transient Fault Handling Blocks – but as is discussed here
    http://social.msdn.microsoft.com/Forums/en-US/ssdsgetstarted/thread/7a50985d-92c2-472f-9464-a6591efec4b3, they do not catch timeouts, and according to “Valery M” this is for good reason.
    We are not storing any session information in the database, although asp.net membership information is stored in the database.
    We use 1 “SQL Azure server instance” which hosts all 5 databases, two for staging and three for production. All 5 systems are generally active at the same time although it is unlikely that more than one will be in a state of live load use at any given time.
    All web roles, worker roles and the SQL Azure server reside in the same Azure Geographical Region.
    Any thoughts on where we should be looking? Would it help to give each system its own SQL Azure server? ... Failing a solution by ourselves, is it possible to get Microsoft to open a support ticket and take a look under the hood at what's going on with
    our application – how does one go about this?
    Thanks in advance.
    Ilan

    Ditto.
    New website/database (only in production for a week or so). The model we have is pretty simple: an Azure website, MVC, EF 6.1.1, using a single database in the same region as the website. I have monitored for throttled connections and deadlocks and have never found any.
    What I have tried so far
    I bumped the timeout in the connection string to 60 seconds
    I changed the web site to Basic and configured the auto scale to max instance size and count
    That seemed to help - but then just recently during points of high volume usage (client requests, data in and out, CPU time) I started seeing this error. As I narrow the focus to look at metrics during the exact time of the error it doesn't reflect
    the exact peak. For example peak usage may be at 3:45, but the errors occur at 4:30 when the usage is less than 3:45 by a fair amount.
    To try to force recreate and see if it was caused by transaction data table locks I ran a query in a transaction block with a WAITFOR delay and then went to the web site and tried to access data that would be locked by that query. That test did generate
    an error - but not this specific "semaphore timeout period has expired" - so I don't think it is caused by table locks.
    I did notice that when the error happens - it generally happens to one user and then happens multiple times to that one user over the course of what can be a minute or so and in different queries (views). The last analysis showed that pattern for
    one user with 2 other users getting the error but just once (and in different views/queries).
    USER A - got about 9 of these errors in about 8 different views over the course of a little more than a minute (20:26 to 21:35)
    USER B - got 1 (in the same timespan)
    USER C - got 1 (in the same time span)
    Analysis of the error logs showed it isn't unique to any single view/query or scenario. The only clue that makes it less of an intermittent situation is that it does correspond with usage. However, it doesn't have to be peak usage at the
    exact moment - just within a window (an hour or so). It makes me wonder whether it is the load itself or just the fact that more usage increases the odds (I will monitor this relationship some more and note the exact metrics).
    I am now bumping the timeout in the connection string to 90 seconds, and I have changed the database DTU setting to 50.
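    For reference, the two timeouts mentioned here live in different places: the connection string's Connect Timeout only governs how long the client waits to establish a connection, while CommandTimeout governs each individual query. A small illustration; the server, database and table names are placeholders.
        using System.Data.SqlClient;

        class TimeoutExample
        {
            static void Run()
            {
                // "Connect Timeout" only covers establishing the connection.
                var connectionString =
                    "Server=tcp:yourserver.database.windows.net,1433;" +
                    "Database=yourdb;User ID=youruser;Password=...;" +
                    "Encrypt=True;Connect Timeout=90;";

                using (var conn = new SqlConnection(connectionString))
                using (var cmd = new SqlCommand("SELECT COUNT(*) FROM dbo.SomeTable", conn))
                {
                    // Query execution time is limited separately, per command (in seconds).
                    cmd.CommandTimeout = 90;
                    conn.Open();
                    var count = (int)cmd.ExecuteScalar();
                }
            }
        }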

  • Best way to Insert Millions records in SQL Azure on daily basis?

    I am maintaining millions of records in SQL Server 2008 R2 and now intend to migrate them to SQL Azure.
    In the existing SQL Server 2008 R2 system, a few SSIS packages and stored procedures first truncate the existing records and then insert into the table, which holds
    approximately 26 million records, in about 30 minutes on a daily basis (as the system demands).
    When I migrate this to SQL Azure, I am unable to perform these operations as quickly as I did in SQL 2008, and I sometimes get a request timeout error.
    While searching for a faster way, many people suggest batch processing or BCP, but batch processing is not suitable in my case because it takes too long to insert the records. I need a faster, more efficient approach on SQL Azure.
    Hoping for some good suggestions.
    Thanks in advance :)
    Ashish Narnoli

    +1 to Frank's advice.
    Also, please upgrade your Azure SQL Database server to
    V12 as you will receive higher performance on the premium tiers.  As you scale-up your database for your bulk insert, remember that
    SQL Database charges by the hour. To minimize costs, scale back down when the inserts have completed.
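    If it helps, the usual pattern for loading millions of rows from .NET is SqlBulkCopy with a batch size, which behaves much like BCP. A minimal sketch only; the destination table name is a placeholder and 'source' stands for any IDataReader over the rows to load (for example, a reader against the on-premises database).
        using System.Data;
        using System.Data.SqlClient;

        class BulkLoad
        {
            static void Load(string azureConnectionString, IDataReader source)
            {
                using (var bulk = new SqlBulkCopy(azureConnectionString, SqlBulkCopyOptions.TableLock))
                {
                    bulk.DestinationTableName = "dbo.TargetTable";  // placeholder name
                    bulk.BatchSize = 10000;        // commit every 10k rows to keep transactions small
                    bulk.BulkCopyTimeout = 0;      // no overall timeout for the copy
                    bulk.WriteToServer(source);    // stream rows from the reader to Azure
                }
            }
        }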

  • SQL Azure: Query Analyzer VS Web Application - Calling Stored Procedure

    I have a stored procedure in SQL Azure.
    Calling this stored procedure normally would take 30 minutes.
    I need to call this Stored procedure multiple times (18 times, with different input)
    Scenario 1: When I call this asynchronously from the Web Application, all 18 calls run concurrently so the whole process take about 30 minutes.
    Scenario 2: When I call this same stored procedure from Microsoft SQL Server Management Studio, (each process in a different TAB) they seem to be running very slowly, it has already taken more than 5 hours.
    Is there a reason for this ?
    Is there any difference in calling multiple stored procedures from different tabs ?
    Is this process running Asynchronously ?
    What is the best way to achieve this scenario without going via the front end?

    Is there a reason for this ?
    --Whether you run the query from the application or from SSMS, please check sys.dm_exec_requests and the session status in SQL Server and see if there is any difference.
    Maybe it is caused by blocking when running in SSMS. We need to dig into it.
    I am not sure how you call the procedure asynchronously in the application, but if the calls are sent to SQL Server on different connections/sessions, then it should behave the same as SSMS.
    Is there any difference in calling multiple stored procedures from different tabs ?
    --No.
    Is this process running Asynchronously ?
    --When you start the procedure in SSMS, the session is synchronous.
    However, the slowness should not be related to whether the call is synchronous or not.
    Per my understanding, calling asynchronously from the application simply means that the application goes on executing other code without waiting for SQL Server.
    This does not affect the actual time spent in SQL Server.
    What is the best was to achieve scenario without going via the front end ?
    --We need to understand what causes the issue first. Normally they should be almost the same.
    Right now the procedure takes 30 minutes to complete, which is really too long. I think you'd better adjust the code or logic to tune the performance first.
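    For comparison with scenario 1, an asynchronous fan-out from application code typically looks something like the sketch below: each call gets its own connection/session, so the 18 executions run in parallel on the server, much as 18 SSMS tabs would. The procedure and parameter names here are placeholders, not the poster's actual code.
        using System.Data;
        using System.Data.SqlClient;
        using System.Linq;
        using System.Threading.Tasks;

        class ParallelProcCalls
        {
            static Task RunAllAsync(string connectionString, int[] inputs)
            {
                var tasks = inputs.Select(async input =>
                {
                    using (var conn = new SqlConnection(connectionString))
                    using (var cmd = new SqlCommand("dbo.MyLongRunningProc", conn))  // placeholder name
                    {
                        cmd.CommandType = CommandType.StoredProcedure;
                        cmd.CommandTimeout = 0;  // the procedure runs for ~30 minutes
                        cmd.Parameters.AddWithValue("@Input", input);
                        await conn.OpenAsync();
                        await cmd.ExecuteNonQueryAsync();
                    }
                });
                // Start all calls and wait for every one of them to finish.
                return Task.WhenAll(tasks);
            }
        }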

  • Use SQL Azure with local web app?

    Hello all-
    I have a web app running on a local-ish (our Data Center) server which runs ColdFusion and SQL Server on the same physical hardware.  The load for the app is not too great, so running both services on the same box is not a problem.
    I need to upgrade SQL Server, and I would prefer not to put more resources onto our local servers if possible.  Essentially, I would like to move to the cloud.  BUT at this point I don't want to move my code to the cloud due to file storage and
    a few other issues.
    Is it feasible to use SQL Azure as my datasource for my local web app?  I am at a top-tier University on the west coast, and our connection speeds are about as good as you can possibly get.
    I don't mind a minor slowdown.  For instance, if the round-trip takes 30 milliseconds, I won't be sweating that.  But I'd rather not have a relatively snappy app turn into something agonizing for users.
    Also, we have a perimeter firewall on campus which normally blocks all SQL Server traffic.  Is it possible to change the port that SQL Azure uses?
    Any guidance would be appreciated.
    Thanks-
    Karl

    Hello,
    At this time I have 4 virtual servers on Azure (2 of them SQL Server instances) in a high availability configuration, and it only
    costs 13.35 per day. It is very cheap. Incoming traffic is free.
    Regarding how to connect to SQL Azure, the following sample application may help:
    http://code.msdn.microsoft.com/windowsazure/ASPNET-MVC-connect-with-1f40770f
    What is the size of the database?
    About the default port and the organizational firewall, please read the following article:
    http://www.dotnetsolutions.co.uk/blog/connecting-to-sql-azure-without-changing-your-firewall
    Hope this helps.
    Regards,
    Alberto Morillo
    SQLCoffee.com
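    One note on the port question: Azure SQL Database only listens on TCP 1433 and the port cannot be changed, so the campus perimeter firewall would need to allow outbound 1433 to Azure. The connection itself is otherwise the same pattern as for a local SQL Server; shown here from .NET as a minimal sketch, with placeholder server, database and credential names, but the same server/port/encryption settings apply from any client (including a ColdFusion JDBC data source).
        using System.Data.SqlClient;

        class AzureConnectionTest
        {
            static void Main()
            {
                // Azure SQL Database only accepts connections on TCP port 1433.
                var connectionString =
                    "Server=tcp:yourserver.database.windows.net,1433;" +
                    "Database=yourdb;User ID=youruser@yourserver;Password=...;" +
                    "Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;";

                using (var conn = new SqlConnection(connectionString))
                {
                    conn.Open();  // fails if outbound 1433 is blocked by the perimeter firewall
                }
            }
        }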

  • Geo-Replication in SQL Azure

    We are looking to move from geographically distributed and replicated physical/virtual data machines to SQL Azure.
    We would like to have synched/replicated Azure SQL DB instances in multiple regions. I would appreciate if you could answer some questions related to how to achieve this and what to expect:
    1. Is Azure SQL Sync the correct approach for geo-replicating Azure SQL Databases?
    2. What is the Performance SLA that I can expect for the synchronization? I would like to know the minimum and maximum delay possible for the synchronization. (I can see that the minimum sync frequency is 5 minutes. What would be the sync delay when the
    sync is triggered every 5 minutes?)

    Hello,
    1. SQL Data Sync can be used for synchronization across SQL Database sites or between SQL Database and an on-premises SQL Server instance. Note, however, that SQL Data Sync is currently available only as a Preview, is meant only for product feedback for future releases, and should not be used in production environments. Currently, there are
    limits in SQL Data Sync.
    2. Windows Azure SQL Database keeps triple redundancy within the Azure data center, combined with the 99.9% availability SLA. SQL Database does not currently provide any SLA for performance or for security. As for SQL Data Sync, I cannot find any documentation in BOL
    about an SLA for it.
    Reference: SQL Data Sync (Preview) Best Practices
    Regards,
    Fanny Liu
    TechNet Community Support

  • Alternatives to CONTEXT_INFO on SQL Azure

    Hi,
    We use CONTEXT_INFO in an application we are currently looking to move to SQL Azure; however, CONTEXT_INFO is already used for troubleshooting purposes by SQL Azure. Are there any built-in alternatives in SQL Azure?
    Thanks,

    Hmm, still no answer since August of 2010?  I hope MS has something for those of us that use this.  I'll give an example of how I use CONTEXT_INFO in an application.
    I use CONTEXT_INFO to prevent users from modifying a table using SQL Studio.  I do this by setting CONTEXT_INFO in my C#/VB code after a begin transaction; along with other T-SQL commands, I send this little guy:
        DECLARE @x varbinary(4); SET @x = cast(-1 as varbinary(4)); SET CONTEXT_INFO @x
    update dbo.CusCharges set PaymentAmount=5 where CusChargeAmount=10 and CustomerId=30
    So then in a particular table's trigger (after update, insert) I have the piece of code shown below. This allows my application to update the table, but casual T-SQL users who don't know about this won't be able to update the table directly. This works great, except in Azure-land where sysprocesses is not available. So how can I pass down a special variable to a trigger in the context of a transaction? Of course, someone could still use SQL Studio, but they would really need to examine the trigger and know what they are doing. This just prevents the beginning DBA from wiping out customer charges for the past decade in a table (it's happened, which is why I put it in).
        declare @SessionNid int
        -- Reads CONTEXT_INFO via sysprocesses; sysprocesses is not available on SQL Azure
        -- (the CONTEXT_INFO() built-in function is one way to read the value without it).
        select @SessionNid = cast(substring(CONTEXT_INFO,1,4) as int) FROM master.dbo.sysprocesses WHERE spid = @@SPID
        if @SessionNid = 0 begin
            raiserror('You cannot directly modify dbo.CusCharges.', 16, 1)
            rollback transaction
            return
        end
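    For completeness, sending SET CONTEXT_INFO from application code inside the transaction, as described above, can look roughly like the sketch below; the table, column names and values are taken from the example in this post, and the connection string is a placeholder.
        using System.Data.SqlClient;

        class UpdateWithContextInfo
        {
            static void Update(string connectionString)
            {
                using (var conn = new SqlConnection(connectionString))
                {
                    conn.Open();
                    using (var tran = conn.BeginTransaction())
                    using (var cmd = conn.CreateCommand())
                    {
                        cmd.Transaction = tran;
                        // Set the marker and run the update on the same session,
                        // so the trigger sees the non-zero CONTEXT_INFO value.
                        cmd.CommandText =
                            "DECLARE @x varbinary(4); SET @x = cast(-1 as varbinary(4)); SET CONTEXT_INFO @x; " +
                            "UPDATE dbo.CusCharges SET PaymentAmount = 5 WHERE CusChargeAmount = 10 AND CustomerId = 30;";
                        cmd.ExecuteNonQuery();
                        tran.Commit();
                    }
                }
            }
        }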

  • How to add description of a column of a table in SQL Azure

    Hi
    I have some tables in my application database where there are descriptions added against certain columns. Needless to say they were done by using sp_addextendedproperty.
    Now I am trying to migrate the Database to SQL Azure. SQL Azure does not support sp_addextendedproperty.
    Hence I am not able to figure out how to add descriptions to those columns.
    Any help would be much appreciated.
    Thanks
    Soumyadeb

    Hello,
    As Latheesh posted above, Windows Azure SQL Database does not support extended properties (sp_addextendedproperty). That's one of the limitations of SQL Database, and I don't know of another way to achieve the same thing on Azure.
    Regards,
    Fanny Liu
    TechNet Community Support

  • Unable to save a report that includes a datasource of "Microsoft SQL azure" type

    I have installed SSRS in Azure using the following instructions (http://msdn.microsoft.com/en-us/library/dn449661.aspx) and all seems to work fine; however, when I create a report in Report Builder 2014 (in this case empty) that includes a Microsoft SQL Azure
    data source type, I am unable to save the report and get the following error message (even though the test connection succeeds).
    "The report definition was saved, but one or more errors occurred while setting the report properties"
    Reports with standard SQL data sources work fine.
    I have also tried creating the report using Visual Studio 2013 and get a similar error message.
    I have tried this using SQL 2012 and SQL 2014 and get the same error.
    Does anybody know how I can create a report with Microsoft SQL Azure datasource type?

    Hi jamesla,
    Based on my research, the issue can be caused by a deleted shared data source that still exists under the Data Sources list in Report Builder. For more details about this scenario, we can refer to the following thread:
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/7170dbee-048c-4298-89ba-df4d42924c8e/the-report-definition-was-saved-but-one-or-more-errors-occurred-while-setting-report-properties?forum=sqlreportingservices
    Since the error message has no detail information, we can try rendering the report to see the detailed error message. Besides, we can check the log files. The SQL Reporting Services log files are found on the Reporting Services point server, in
    the folder %programfiles%\Microsoft SQL Server\<SQL Server Instance>\Reporting Services\LogFiles.
    For more information about how to use Microsoft SQL Azure as the data source of a report, please see:
    http://msdn.microsoft.com/en-IN/library/ff519560.aspx
    http://programming4.us/database/2158.aspx
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • Create a web site in Visual Studio - fails with SQL Azure V12

    Creation of Microsoft Azure Website failed. <Error xmlns="Microsoft.SqlServer.Management.Framework.Web.Services" xmlns:i="http://www.w3.org/2001/XMLSchema-instance"><Message>The service objective 'Web' specified is invalid.</Message><InnerError
    i:nil="true"/><SqlErrorCode>40804</SqlErrorCode><Severity>16</Severity></Error>
    I receive this error after connecting to a database using the preview edition of SQL Azure V12 with a service level of 'Basic'.
    It seems 'Web' may need to be changed to 'Basic' or 'Standard' depending on the service level. How can I do this?
    Regards
    David

    Hi,
    Thanks for posting here.
    Upgrading Web and Business Databases
    Upgrading a Web or Business database to a new service tier/performance level does not take the database offline; the database continues to work through the upgrade operation. At the moment of the actual transition to the new performance level, connections to the database
    may be dropped briefly (typically for a few seconds). If an application has transient fault handling for connection terminations, that is sufficient to protect against the dropped connections at the
    end of the upgrade.
    Upgrading a Web or Business database to a new service tier involves the following steps:
    Determine service tier based on feature capability
    Determine an acceptable performance level based on historical resource usage
    Why does existing performance for my Web or Business database map to the higher Premium levels?
    Tuning your workload to fit a lower performance level
    Upgrade to the new service tier/performance level
    Monitor the upgrade to the new service tier/performance level
    Monitor the database after the upgrade
    Refer:
    http://azure.microsoft.com/en-us/documentation/articles/sql-database-upgrade-new-service-tiers/
    https://msdn.microsoft.com/en-us/library/azure/dn741336.aspx
    Hope this helps you.
    Girish Prajwal
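    If the goal is simply to move the database off the 'Web' service objective, one general mechanism (subject to the target tier being available on the server) is to run ALTER DATABASE against the logical server, for example from a small helper like the sketch below. This is only an illustration; the server, database and credential names are placeholders, and the actual fix for the Visual Studio tooling error may differ.
        using System.Data.SqlClient;

        class ChangeServiceObjective
        {
            static void Main()
            {
                // Connect to the logical server's master database.
                var connectionString =
                    "Server=tcp:yourserver.database.windows.net,1433;" +
                    "Database=master;User ID=youruser;Password=...;Encrypt=True;";

                using (var conn = new SqlConnection(connectionString))
                using (var cmd = conn.CreateCommand())
                {
                    conn.Open();
                    // Move the database to the Basic tier; use 'Standard'/'S0' etc. as needed.
                    cmd.CommandText =
                        "ALTER DATABASE [yourdb] MODIFY (EDITION = 'Basic', SERVICE_OBJECTIVE = 'Basic');";
                    cmd.ExecuteNonQuery();
                }
            }
        }
    The statement returns quickly and the tier change completes asynchronously in the background.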
