Reservation Size unavailable - SQL Azure Premium

Hello,
Currently we're testing the new preview functionality "Premium for SQL Azure", but we're running into an issue while setting it up.
We've requested a SQL Azure Premium beta license, created a new SQL Database server, and requested a Premium quota. One Premium database quota is now available.
But when we create a Premium SQL Database, we need to specify the reservation size, and we can't choose between P1 and P2. What is the reason, and how can we solve it?
Thank you in advance!

Hello,
The problem solved itself. After a few days I could choose between the different reservation sizes.
Thank you!

Similar Messages

  • How to Print SQL Azure V12 current DB Utilization SIZE.

    Hi,
    After upgrading the SQL Azure server from V11 to V12, I am not able to print the DB size.
    # Retrieve the database object via the server connection context and print its size in MB
    $db = Get-AzureSqlDatabase -ConnectionContext "Server credential"
    Write-Output $db.SizeMB
    In V11 I was able to print the currently utilized size in MB, but after upgrading to V12 I am not.

    1. You may have to disable the triggers, as they really slow down performance.
    2. It depends on your network bandwidth.
    You can learn more in Books Online (BOL):
    https://msdn.microsoft.com/en-us/library/hh456371.aspx
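
    If the PowerShell property no longer reports the size after the V12 upgrade, a minimal T-SQL sketch run inside the database can give an approximate figure (it only assumes the standard sys.dm_db_partition_stats DMV; pages are 8 KB each):

    -- Approximate used space in MB, computed from reserved pages (8 KB per page)
    SELECT SUM(reserved_page_count) * 8.0 / 1024 AS UsedSpaceMB
    FROM sys.dm_db_partition_stats;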

  • SQL Azure - query with row_number() executes slow if columns with nvarchar of big size are included

    I am linking my question from Stack Overflow here. The link: http://stackoverflow.com/questions/27943913/sql-azure-query-with-row-number-executes-slow-if-columns-with-nvarchar-of-bi
    Appreciate your help!
    Gorgi

    Hi,
    Thanks for posting here.
    I suggest you check these links and optimize your query on SQL Azure:
    http://www.sqlusa.com/articles/query-optimization/
    http://sqlblog.com/blogs/paul_white/archive/2011/02/23/Advanced-TSQL-Tuning-Why-Internals-Knowledge-Matters.aspx
    Also check this thread, which discusses a similar issue:
    https://social.msdn.microsoft.com/Forums/en-US/c1da08b4-265d-4ec8-a252-8d7090234e3e/simple-select-query-takes-long-time-to-execute-with-nvarchar-columns?forum=transactsql
    Girish Prajwal
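
    A pattern that often helps with this particular symptom is to compute ROW_NUMBER() over the narrow key columns only, restrict to the requested page, and only then join back to fetch the wide nvarchar columns. A minimal sketch (table and column names below are hypothetical):

    -- Page on narrow columns first, then fetch the wide nvarchar columns for just that page
    ;WITH Keys AS
    (
        SELECT Id,
               ROW_NUMBER() OVER (ORDER BY CreatedDate DESC) AS rn
        FROM dbo.Articles
    )
    SELECT a.Id, a.Title, a.BodyText
    FROM Keys AS k
    JOIN dbo.Articles AS a ON a.Id = k.Id
    WHERE k.rn BETWEEN 51 AND 100
    ORDER BY k.rn;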

  • SQL Azure - Intermittent The wait operation timed out

    I have a website engine which runs a few hundred "white label" sites. It's hosted on Azure with a SQL Azure Business database. Generally everything is fine - it all works and runs at a good speed.
    However, throughout the day I get maybe 40 or 50 of the error:
    System.ComponentModel.Win32Exception: The wait operation timed out
    Please don't refer me to the connectivity blog at http://blogs.msdn.com/b/sqlazure/archive/2010/03/22/9982979.aspx as this seems to refer to problems where you just can't connect. My problem is that it's fine most of the time, but I still get these errors intermittently.
    This is sometimes on the main database, but we're also using a database for sessions and this gets the errors too. Both databases are on the same server.
    I also get errors like: 
    An existing connection was forcibly closed by the remote host
    and:
    System.Data.SqlClient.SqlException: The service has encountered an error processing your request. Please try again. Error code 40143. A severe error occurred on the current command. The results, if any, should be discarded.
    and, when evil bots are hammering the site:
    System.Data.SqlClient.SqlException: Resource ID : 1. The request limit for the database is 180 and has been reached. See 'http://go.microsoft.com/fwlink/?LinkId=267637' for assistance.
    Each website can potentially have a Google footprint of around 10,000 pages. The result is that bots are hitting the sites regularly, indexing lots of pages for hundreds of sites. I also have some worker roles doing data work. The database is clearly busy!
    I am hoping to add 2 or 3 times the number of sites that I currently have to the "engine". 
    I am looking at efficiency where possible, but the sites are clearly under a fair load from bots and visitors.
    My question is, will one of the upgrades from Business to S2, P1, P2 or P3 resolve these problems? The financial costs of these database instances vary greatly, so I wouldn't want to upgrade and find I'm left with the same problems but paying many times more each month.
    Thank you in advance.

    Hello,
    For a Web/Business edition database, the maximum limit of concurrent requests is 180. Beyond this limit, you will receive errors.
    The max worker threads for Standard (S2) is 100, so you should upgrade your database to the Premium tier.
    The concurrent request limit of a Premium database varies depending on its reservation size. For example, for P1 the max worker threads is 200.
    Reference: Azure SQL Database Resource Governance
    Azure SQL Database Service Tiers and Performance Levels
    Regards,
    Fanny Liu
    TechNet Community Support
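
    To see how close the database is to the concurrent request limit while the errors occur, one simple check (a sketch; it only assumes the standard sys.dm_exec_requests DMV) is:

    -- Number of requests currently executing against this database
    SELECT COUNT(*) AS ActiveRequests
    FROM sys.dm_exec_requests;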

  • Can't create a new SQL Azure Standard Tier db

    I'm having an issue creating a new SQL Azure Standard tier DB. Here are the basic steps I've taken to migrate a Web tier DB:
    1. Register for the SQL preview programme: https://account.windowsazure.com/PreviewFeatures
    2. Perform a DB export to blob storage.
    3. Use the + (new) button to create a new SQL DB.
    4. Select the import option.
    5. Browse to your saved export.
    6. Hopefully see the new tiers as options in the dialog.
    7. The new tiers require a separate server from Web; this can be created from the import dialog.
    I have 2 independent Azure accounts. The above process worked for my test account, but for the live account, where I was also experimenting after the success on the test account, I hit an issue. I requested creation of a new server whilst importing. That step seems to work, but then the actual import fails with this message:
    "Error encountered during the service operation. 
     Could not import package.
     Error SQL72014: .Net SqlClient Data Provider: Msg 40823, Level 16, State 1, Line 1 Invalid value provided for parameter EDITION. Please provide a value that is valid on server version 1.0.
     Error SQL72045: Script execution error. The executed script:
     CREATE DATABASE [$(DatabaseName)] COLLATE SQL_Latin1_General_CP1_CI_AS
     (EDITION = 'Standard', MAXSIZE = 1 GB)"
    I tried a couple of times but no joy. Using the portal I can browse to the new SQL server and it looks okay, other than the list of enabled reservation sizes only contains P1, P2, P3. I'm requesting a Standard (S1) DB, not Premium (P1, P2, P3); on my test server this list also includes S1 and S2. As you can see in the error message, I'm requesting EDITION = 'Standard'. I get the feeling the newly created server is not accepting Standard tier DBs.
    Now that the server is created, if I try an import I see the server in the list of available servers, but when I select a tier of Basic or Standard the new server is grayed out; not so if I select Premium or the older Web or Business. Interestingly, my current live SQL server shows up as supporting the new Premium.
    Thanks
    Wayne 

    Hello,
    Glad to hear that the issue is resolved, and thanks for sharing.
    Regards,
    Fanny Liu
    TechNet Community Support
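
    For reference, on a server that does accept the new tiers, the statement generated by the import corresponds to something like the following sketch (the database name is hypothetical; SERVICE_OBJECTIVE selects the performance level within the tier):

    -- Create a Standard (S1) database explicitly, keeping the collation and MAXSIZE from the failing script
    CREATE DATABASE [MyImportedDb] COLLATE SQL_Latin1_General_CP1_CI_AS
        (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S1', MAXSIZE = 1 GB);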

  • Performance is too slow on SQL Azure box

    Hi,
    Performance is too slow on our SQL Azure box (located in Europe).
    The query below returns 500,000 rows in 18 minutes on the SQL Azure box (connected via SSMS from India):
    SELECT * FROM TABLE_1
    whereas on a local server it returns 500,000 rows in about 30 seconds.
    SQL Azure configuration:
    Service Tier/Performance Level: Premium/P1
    DTU: 100
    Max DB Size: 500 GB
    Max Worker Threads: 200
    Max Sessions: 2400
    Benchmark Transaction Rate: 105 transactions per second
    Predictability: Best
    Any suggestion would be highly appreciated.
    Thanks,

    Hello,
    Can you please explain the scenario you are testing in a little more detail? Are you comparing a SQL Database in Europe against a SQL Database in India, or a SQL Database against a local, on-premises SQL Server installation?
    In the first scenario, the round-trip latency of the connection to the datacenter might play a role.
    If you are comparing against a local installation, please note that you might be running on completely different hardware specifications and without network delay, resulting in very different results.
    In both cases you can use the blog post below to assess the resource utilization of the SQL Database during the operation:
    http://azure.microsoft.com/blog/2014/09/11/azure-sql-database-introduces-new-near-real-time-performance-metrics/
    If the DB utilization reaches 100%, you might have to consider upgrading to a higher performance level to achieve the throughput you are looking for.
    Thanks,
    Jan 
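
    As a sketch of what that blog post describes, the per-database resource usage can also be read directly from a DMV inside the database (the view below is available on Premium and V12 databases):

    -- Recent resource utilization, one row per ~15-second interval
    SELECT TOP (20) end_time, avg_cpu_percent, avg_data_io_percent, avg_log_write_percent
    FROM sys.dm_db_resource_stats
    ORDER BY end_time DESC;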

  • Performance too Slow on SQL Azure box

    Hi,
    Performance is too slow on our SQL Azure box:
    The query below returns 500,000 rows in 18 minutes on the SQL Azure box (connected via SSMS):
    SELECT * FROM TABLE_1
    whereas on a local server it returns 500,000 rows in about 30 seconds.
    SQL Azure configuration:
    Service Tier/Performance Level: Premium/P1
    DTU: 100
    Max DB Size: 500 GB
    Max Worker Threads: 200
    Max Sessions: 2400
    Benchmark Transaction Rate: 105 transactions per second
    Predictability: Best
    Thanks,

    Hello,
    Please refer to the following document too:
    http://download.microsoft.com/download/D/2/0/D20E1C5F-72EA-4505-9F26-FEF9550EFD44/Performance%20Guidance%20for%20SQL%20Server%20in%20Windows%20Azure%20Virtual%20Machines.docx
    Hope this helps.
    Regards,
    Alberto Morillo
    SQLCoffee.com

  • Create a web site in Visual Studio - fails with SQL Azure V12

    Creation of Microsoft Azure Website failed. <Error xmlns="Microsoft.SqlServer.Management.Framework.Web.Services" xmlns:i="http://www.w3.org/2001/XMLSchema-instance"><Message>The service objective 'Web' specified is invalid.</Message><InnerError
    i:nil="true"/><SqlErrorCode>40804</SqlErrorCode><Severity>16</Severity></Error>
    I receive this error after connecting to a database that uses the preview edition of SQL Azure V12 with a service level of 'Basic'.
    'Web' may need to be changed to 'Basic' or 'Standard' depending on the service level. How can I do this?
    Regards
    David

    Hi,
    Thanks for posting here.
    Upgrading Web and Business Databases
    Upgrading a Web or Business database to a new service tier/performance level does not take the database offline; the database continues to work through the upgrade operation. At the time of the actual transition to the new performance level, connections to the database can be dropped temporarily for a very short duration (typically measured in seconds). If an application has transient fault handling for connection terminations, that is sufficient to protect against dropped connections at the end of the upgrade.
    Upgrading a Web or Business database to a new service tier involves the following steps:
    1. Determine the service tier based on feature capability.
    2. Determine an acceptable performance level based on historical resource usage (the documentation also explains why existing performance for a Web or Business database may map to the higher Premium levels).
    3. Tune your workload to fit a lower performance level, if needed.
    4. Upgrade to the new service tier/performance level.
    5. Monitor the upgrade to the new service tier/performance level.
    6. Monitor the database after the upgrade.
    Refer:
    http://azure.microsoft.com/en-us/documentation/articles/sql-database-upgrade-new-service-tiers/
    https://msdn.microsoft.com/en-us/library/azure/dn741336.aspx
    Hope this helps you.
    Girish Prajwal
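
    If you just need to move the database off the 'Web' service objective, the edition and performance level can also be changed from T-SQL. A minimal sketch (the database name is hypothetical; this is typically run while connected to the logical master database):

    -- Move the database to the Basic tier; use EDITION = 'Standard', SERVICE_OBJECTIVE = 'S0' (and so on) for Standard
    ALTER DATABASE [MyDb] MODIFY (EDITION = 'Basic', SERVICE_OBJECTIVE = 'Basic');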

  • Getting "The remote server returned an error 503 server unavailable" in azure web jobs

    I have created one web job (on-demand schedule) under an Azure web site. This web job contains a .exe/.cmd, i.e. a console application.
    I am retrieving data from a SQL Azure database and uploading it to SharePoint Online lists; I upload the data to several (7) lists in each subsite, and I have 3 subsites.
    I am getting the error "The remote server returned an error 503 server unavailable" while uploading the data into the lists.
    Full Error Message:
    Message - The remote server returned an error 503 server unavailable.
    StackTrace -    at System.Net.HttpWebRequest.GetResponse()
    > cc525c: INFO]    at Microsoft.SharePoint.Client.SPWebRequestExecutor.Execute()
    > cc525c: INFO]    at Microsoft.SharePoint.Client.ClientRequest.ExecuteQueryToServer(ChunkStringBuilder sb)
    > cc525c: INFO]    at Microsoft.SharePoint.Client.ClientRequest.ExecuteQuery()
    > cc525c: INFO]    at Microsoft.SharePoint.Client.ClientRuntimeContext.ExecuteQuery()
    > cc525c: INFO]    at Microsoft.SharePoint.Client.ClientContext.ExecuteQuery()
    This does not occur every time; sometimes I get no error and the data is uploaded to the SharePoint Online lists successfully.
    In total it takes about 4 hours to upload the data into the lists for all 3 subsites.
    Does anyone know how to resolve this?
    Thanks,
    A.Ramu

    Hi,
    Per my understanding, there is an issue when uploading data from a SQL Azure database to a SharePoint Online list.
    To narrow down the issue, I suggest you create a console application in Visual Studio that does not access the SQL Azure database and uploads some sample data to the same SharePoint list, to see whether the issue still occurs intermittently.
    Thanks
    Patrick Liang
    TechNet Community Support

  • SQL Azure Reporting - There was an exception running the extensions specified in the config file. --- Maximum request length exceeded.

    I am trying to deploy an RDL file (5 MB) to a SQL Azure Reporting server in South Central US using the deploy function in SQL Server Data Tools, but I am facing the following error during deployment to the Azure Reporting server:
    "There was an exception running the extensions specified in the config file. ---> Maximum request length exceeded."
    Is there any limit on the size of RDL files which can be deployed to the Azure Reporting server? I have seen some online posts which talk about increasing maxRequestLength in the httpRuntime section of the Reporting server's web.config, but in the case of the Azure Reporting server how can this configuration be modified?
    I have also tried to upload the report directly to the SQL Azure Reporting server from the Management Portal (Upload Report function), which still resulted in an error.
    Thanks & Regards, Deep

    Thanks for your question. Unfortunately we are in the process of deprecating SQL Reporting services. Full details are available at http://msdn.microsoft.com/en-us/library/gg430130.aspx
    Thanks, Guy

  • SQL Azure: More Intermittent Timeouts

    Hi guys,
    We have a set of 5 online auction systems running on Windows Azure and SQL Azure. Each system consists of a single worker role and one or more web roles. Each system uses ASP.NET MVC 3 and Entity Framework, the repository pattern, and StructureMap.
    The worker role is responsible for housekeeping and runs two groups of processes: one group runs every ten seconds, the other every second. Each process will likely run a database query or stored procedure. These are scheduled with Quartz.net.
    The web role serves the public interface and back office. Among other basic CRUD functionality, both of these provide screens which, when open, repeatedly call controller methods that execute read-only stored procedure queries.
    The frequency of repetition is about 2-3 seconds per client. A typical use case would be 5 back office windows open and 25 end user windows open, all hitting the system repeatedly.
    For a long time we have been experiencing intermittent SQL timeout errors. Three of the most common ones are:
    System.Data.SqlClient.SqlException: A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.)
    System.Data.SqlClient.SqlException: A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The semaphore timeout period has expired.)
    System.Data.SqlClient.SqlException: Timeout expired.  The timeout period elapsed prior to completion of the operation or the server is not responding.
    The only predictable scenario is during an auction, where a specific controller -> sproc starts to time out during the event (presumably due to load). At all other times the errors appear to be completely random and come in singles, twos, threes, etc., even during periods of user inactivity. For example, the system will go 18 hours without an error and then there could be 5-10 errors from different housekeeping methods, or perhaps from a user logging on and viewing their account.
    Other info:
    I have tried to run the affected queries/sprocs on SQL Azure using both local SSMS and the Azure web-based query tool; all seem to execute quickly, 1 second max. The query plans are not showing anything too suspicious, although I am by no means a SQL query performance expert, or any other kind of expert for that matter.
    We have wrapped all affected areas in Azure SQL Transient Fault Handling Blocks – but as is discussed here
    http://social.msdn.microsoft.com/Forums/en-US/ssdsgetstarted/thread/7a50985d-92c2-472f-9464-a6591efec4b3, they do not catch timeouts, and according to “Valery M” this is for good reason.
    We are not storing any session information in the database, although asp.net membership information is stored in the database.
    We use 1 “SQL Azure server instance” which hosts all 5 databases, two for staging and three for production. All 5 systems are generally active at the same time although it is unlikely that more than one will be in a state of live load use at any given time.
    All web roles, worker roles and the SQL Azure server reside in the same Azure Geographical Region.
    Any thoughts on where we should be looking? Would it help to give each system its own SQL Azure server? Failing a solution by ourselves, is it possible to get Microsoft to open a support ticket and take a look under the hood at what's going on with our application, and how does one go about this?
    Thanks in advance.
    Ilan

    Ditto.
    New website/database (it has only been in production for a week or so). The model we have is pretty simple: an Azure website, MVC, EF 6.1.1, using a single database in the same region as the website. I have monitored for, and have never found, any throttled connections or deadlocks.
    What I have tried so far:
    I bumped the timeout in the connection string to 60 seconds.
    I changed the web site to Basic and configured the auto scale to the maximum instance size and count.
    That seemed to help, but then just recently, during periods of high-volume usage (client requests, data in and out, CPU time), I started seeing this error again. As I narrow the focus to look at metrics during the exact time of the error, it doesn't reflect the exact peak. For example, peak usage may be at 3:45, but the errors occur at 4:30 when the usage is less than at 3:45 by a fair amount.
    To try to reproduce it and see whether it was caused by transaction/table locks, I ran a query in a transaction block with a WAITFOR delay and then went to the web site and tried to access data that would be locked by that query. That test did generate an error, but not this specific "semaphore timeout period has expired" error, so I don't think it is caused by table locks.
    I did notice that when the error happens, it generally happens to one user, and then happens multiple times to that same user over the course of a minute or so, across different queries (views). The last analysis showed that pattern for one user, with 2 other users getting the error just once (and in different views/queries).
    USER A - got about 9 of these errors in about 8 different views over the course of a little more than a minute (20:26 to 21:35)
    USER B - got 1 (in the same timespan)
    USER C - got 1 (in the same time span)
    Analysis of the error logs showed it isn't unique to any single view/query or query scenario. The only clue that makes it less of an intermittent situation is that it does correspond with usage. However, it doesn't have to be peak usage at that exact moment, just within a window (within an hour). It makes me wonder whether it is the load itself or just the fact that more usage increases the odds (I will monitor this relationship some more and note the exact metrics).
    I am now bumping the timeout in the connection string to 90 seconds, and I have changed the database DTU setting to 50.
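
    One more place to look when correlating these errors with server-side events: Azure SQL Database exposes a sys.event_log view in the logical master database that records connection failures, throttling and deadlock events. A minimal sketch (run while connected to master):

    -- Recent connectivity/throttling events recorded for the databases on this server
    SELECT *
    FROM sys.event_log
    WHERE start_time > DATEADD(HOUR, -24, GETUTCDATE())
    ORDER BY start_time DESC;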

  • Best way to Insert Millions records in SQL Azure on daily basis?

    I am maintaining millions of records in SQL Server 2008 R2 and now intend to migrate them to SQL Azure.
    In the existing system on SQL Server 2008 R2, a few SSIS packages and stored procedures first truncate the existing records and then perform the insert operation on the table, which holds approximately 26 million records, in about 30 minutes on a daily basis (as the system demands).
    When I migrate these to SQL Azure, I am unable to perform these operations as quickly as I did on SQL 2008, and sometimes I get a request timeout error.
    While searching for a faster way, many people suggest a batch process or BCP, but batch processing is not suitable in my case because it takes too long to insert those records. I need a faster, more efficient approach on SQL Azure.
    Hoping for some good suggestions.
    Thanks in advance :)
    Ashish Narnoli

    +1 to Frank's advice.
    Also, please upgrade your Azure SQL Database server to V12, as you will receive higher performance on the premium tiers. As you scale up your database for the bulk insert, remember that SQL Database charges by the hour; to minimize costs, scale back down when the inserts have completed.
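
    As a sketch of that scale-up/scale-down pattern in T-SQL (the database name and the chosen performance levels are hypothetical):

    -- Before the nightly load: scale up for more throughput
    ALTER DATABASE [MyDb] MODIFY (EDITION = 'Premium', SERVICE_OBJECTIVE = 'P2');
    -- ... run the bulk insert ...
    -- After the load completes: scale back down to keep the hourly cost low
    ALTER DATABASE [MyDb] MODIFY (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S2');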

  • Use SQL Azure with local web app?

    Hello all-
    I have a web app running on a local-ish (our data center) server which runs ColdFusion and SQL Server on the same physical hardware. The load for the app is not too great, so running both services on the same box is not a problem.
    I need to upgrade SQL Server, and I would prefer not to put more resources onto our local servers if possible. Essentially, I would like to move to the cloud, BUT at this point I don't want to move my code to the cloud due to file storage and a few other issues.
    Is it feasible to use SQL Azure as my datasource for my local web app?  I am at a top-tier University on the west coast, and our connection speeds are about as good as you can possibly get.
    I don't mind a minor slowdown.  For instance, if the round-trip takes 30 milliseconds, I won't be sweating that.  But I'd rather not have a relatively snappy app turn into something agonizing for users.
    Also, we have a perimeter firewall on campus which normally blocks all SQL Server traffic.  Is it possible to change the port that SQL Azure uses?
    Any guidance would be appreciated.
    Thanks-
    Karl

    Hello,
    At this time I have 4 virtual servers on Azure (2 of them SQL Server instances) in a high-availability configuration, and it only costs 13.35 per day. It is very cheap. Incoming traffic is free.
    About how to connect to SQL Azure, the following application may help:
    http://code.msdn.microsoft.com/windowsazure/ASPNET-MVC-connect-with-1f40770f
    What is the size of the database?
    About the default port and the organizational firewall, please read the following article:
    http://www.dotnetsolutions.co.uk/blog/connecting-to-sql-azure-without-changing-your-firewall
    Hope this helps.
    Regards,
    Alberto Morillo
    SQLCoffee.com
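
    One additional note: SQL Azure only listens on TCP port 1433, so the campus perimeter firewall would need to allow outbound traffic on 1433; the port cannot be changed on the Azure side. The web server's public IP also has to be allowed through the SQL Azure server firewall, which can be done with a sketch like the following (the rule name and IP range are hypothetical; run it in the master database):

    -- Allow the on-premises web server's public IP through the SQL Azure server firewall
    EXEC sp_set_firewall_rule
         @name = N'CampusWebServer',
         @start_ip_address = '203.0.113.10',
         @end_ip_address = '203.0.113.10';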

  • Can I migrate from sql azure business database edition (old) to sql azure standard database edition (new in preview)?

    I have an export .bacpac file from a SQL Azure Business edition database, and I want to import that file into a SQL Azure Standard edition database.
    I have tried through the Azure portal, but I think that is not supported yet.
    I have also tried through Microsoft SQL Server Management Studio 2012, but the wizard only shows me the old editions, "Web" and "Business", for the destination database, and those are incompatible...
    Thanks.

    I've done this today.
    You'll need to restore your database onto Azure as a Business/Web edition: either use your existing database or New > SQL Database > Import.
    Then you need to sign up to the SQL preview at https://azure.microsoft.com/en-us/services/preview/.
    Once you've done that, in the Azure portal under DB > Scale you'll have Web, Business, Basic, Standard and Premium to choose from.
    Obviously this is still in preview mode.

  • Need a better way than killing connections in SQL Azure - Governor

    I've been pushing the boundaries of SQL Azure (S0 and S1) recently.
    I'm at a point where normal T-SQL code needs to be optimized for SQL Azure. I was surprised that all Microsoft has done is enhance the governor process to kill the connection rather than slow it down; I guess this was the easy path to take.
    Simple T-SQL statements like MERGE need to be rewritten to support large data volumes (1 million rows). Given that you need to batch core T-SQL commands, you have to ask yourself whether this is the correct approach. I also had the same issue using SQL Bulk Copy, but tweaked a few settings to get around it.
    S0 and S1 still don't have the same IO as the Web/Business edition, so Microsoft needs to fix this ASAP.
    Is SQL Azure worth the extra hassle compared with other DB engines?
    Micatio Software Free IIS Azure Web Log App

    Hi Jan,
    It's not the command timeout (which is set to 0 in SQL Management Studio, for example).
    I've seen it many times where the last wait type is LOG GOVERNOR.
    A few minutes later the status is set to KILLED/ROLLBACK. I can get around the issue by batching the MERGE statement, and it works fine.
    I would understand if the MERGE had been running for 1-2 hours, and I've seen the same thing occur when using a bulk insert command in C# (resolved by limiting the batch size, streaming, etc.).
    The Azure version is Microsoft SQL Azure (RTM) - 11.0.9230.176.
    The log governor does kill connections; this is documented on a few sites and on MSDN. LOG GOVERNOR was a SQL Server Enterprise feature in the standalone product, except Microsoft, in their wisdom, ported it across into SQL Azure to stop users running bad queries on the shared infrastructure.
    Micatio Software Free IIS Azure Web Log App
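
    For anyone hitting the same KILLED/ROLLBACK behaviour, a minimal sketch of the batching approach described above (table, key and column names are hypothetical; the idea is that each MERGE only touches a bounded key range, so no single statement generates enough log to trip the governor):

    -- Merge a staging table into the target in fixed-size key ranges
    DECLARE @batch INT = 50000, @minId BIGINT = 0, @maxId BIGINT;
    SELECT @maxId = MAX(Id) FROM dbo.StagingTable;

    WHILE @minId <= @maxId
    BEGIN
        MERGE dbo.TargetTable AS t
        USING (SELECT Id, Payload FROM dbo.StagingTable
               WHERE Id > @minId AND Id <= @minId + @batch) AS s
            ON t.Id = s.Id
        WHEN MATCHED THEN
            UPDATE SET t.Payload = s.Payload
        WHEN NOT MATCHED THEN
            INSERT (Id, Payload) VALUES (s.Id, s.Payload);

        SET @minId = @minId + @batch;
    END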
