Azure Storage Used by non-Azure Website in PHP

Dear Sirs,
We are trying for the first time to use Azure video and image storage with a non-Azure PHP website. How can we access, save, and retrieve files in an Azure blob storage account from a non-Azure PHP website?
Can somebody please help me with this?
Thanks & Regards,
Pedro

Hi Pedro,
Thanks for your post!
In your scenario, two approaches could meet your requirements:
1. You could download the Azure SDK for PHP, as Magamalare said (a minimal sketch follows below). You could also see this blog post and sample:
http://blogs.msdn.com/b/brian_swan/archive/2010/07/08/accessing-windows-azure-blob-storage-from-php.aspx
2. You could use the Azure Storage REST API to work with the storage directly:
http://msdn.microsoft.com/en-us/library/azure/dd135733.aspx
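For reference, here is a minimal sketch of approach 1 using the Azure SDK for PHP. The container name 'media', the file paths, and the account name/key are placeholders you would replace with your own, and the container is assumed to already exist:

require_once 'WindowsAzure/WindowsAzure.php'; // adjust to wherever the SDK is installed

use WindowsAzure\Common\ServicesBuilder;
use WindowsAzure\Common\ServiceException;

// Placeholder credentials - replace with your storage account name and key
$connectionString = 'DefaultEndpointsProtocol=https;AccountName=<youraccount>;AccountKey=<yourkey>';
$blobRestProxy = ServicesBuilder::getInstance()->createBlobService($connectionString);

try {
    // Save: upload a local image as a block blob into the 'media' container
    $content = fopen('/path/to/picture.jpg', 'rb');
    $blobRestProxy->createBlockBlob('media', 'picture.jpg', $content);

    // Get: download the blob back to a local file
    $blob = $blobRestProxy->getBlob('media', 'picture.jpg');
    file_put_contents('/path/to/downloaded.jpg', stream_get_contents($blob->getContentStream()));
} catch (ServiceException $e) {
    echo $e->getCode() . ': ' . $e->getMessage();
}

Since the website runs outside Azure, nothing else is needed beyond outbound HTTPS access from your web server to *.blob.core.windows.net.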
Regards,
Will

Similar Messages

  • Azure Storage Used by non-Azure Website

    I inherited an MVC EF website that was later converted to use Azure for everything. We are now taking the Azure Website and Azure SQL Server off Azure and hosting both on our own servers; however, we still want to use Azure Blob storage.
    Apparently, on my development environment I have to install the Azure SDK for Visual Studio and configure settings using a command prompt. On my local environment I was getting (Azure Reader?) errors relating to http://127.0.0.1:10002. When I uploaded the production site to the new server I got the same errors. What do I have to configure to get rid of that?
    Thank you

    Thank you so far!
    I fixed that in the web.config file to point to our blob storage account:
    <add name="AzureReader2" connectionString="DefaultEndpointsProtocol=http;AccountName=blobnuqa;AccountKey=<our account key>" endpoint="http://blobnuqa.blob.core.windows.net" prefix="~/azure/" />
    and I don't seem to be getting an error.
    However, I have another key in the web.config file, and the settings above generate errors when I point it to table storage:
    Original:
    <add key="TableStorageConnectionString" value="UseDevelopmentStorage=true" />
    New:
    <add key="TableStorageConnectionString" value="DefaultEndpointsProtocol=http;AccountName=blobnuqa;AccountKey=<our Key>; endpoint="http://blobnuqa.tabel.core.windows.net"  />

  • Best practice to use the Slot on Azure Website

    Hi Team,
    I want to understand the website slot feature a little more deeply. Can anyone suggest how we should plan when we have testing, staging, and production environments on Azure? (Development might be done from the Visual Studio Azure emulator.)
    Take an example: we have ToDoApp. How should we use the Azure Website and slot features to make it available for Testing, Staging, and Production?
    Q1) Should we create two different sites (one for testing and staging, a second for production), or should we use a single site with three slots?
    Q2) If we create two different sites as described in Q1, how should we move the package from staging to production once it is done with staging?
    Q3) If we have one single site instead of two, can we have a role-based feature where whoever swaps the slot from testing to staging can only go up to that level, while no one except the product admin can swap from staging to production?
    Q4) If none of the above is recommended, then what is, and how should we use the swapping feature there to move the package to production?
    Regards, Brijesh Shah

    Hi Brij,
    Deployment slots were developed with the whole development lifecycle in mind. A good practice is to have a source control branch for each stage of the environment so you know what the changes are. There are generally two types of development:
    One branch - always fix forward and go to production at least once per day. The build might get deployed to a staging environment (slot) to run full functional validation before taking live traffic, and then be deployed to Production. With Azure Websites this can be as simple as swapping the staging slot with the Production slot. After the swap, the (new) staging slot is ready to take the next build and run another set of validation. So: one site with many slots here.
    Multiple branches - when deploying to Production is slower, you need multiple branches, multiple slots, and a stabilization phase. This means you can have a Dev branch containing vNext, pointing to a vNext slot (you can set up continuous deployment here so every check-in is deployed); a Main branch, where only selected changes are integrated or where work lands when it is time for a major release or hotfix, pointing to a Stage slot; and a Release branch pointing to a PreRelease slot, which is later swapped with Production after deployment to avoid a cold start. It is better to stay with one site and many slots here as well, but the current limitation is 5 slots.
    That said, swap is the easiest way to move code through stages :). There are several features that can be used:
    During a swap, some settings stay with the slot while others move with the code (this is available only in PowerShell and the SDK at the moment).
    Auto swap - to avoid a cold start you can deploy to the PreRelease slot and auto swap with the Production slot right after the deployment finishes (this is available only in PowerShell and the SDK at the moment).
    TiP (testing in production) - redirect only a small percentage of traffic to the PreRelease slot (vNext or your choice) to expose, say, 1% of customers to the new code base.
    Video
    On Q3: RBAC (role-based access control) allows you to do exactly this. Only the admin can have access to the Production slot, and separate groups can have access to the other slots. Thus only the admin can swap with Production.
    Hope that helps,
    Galin Iliev [MCSD.NET, MCPD], http://www.galcho.com

  • Azure Search for non-azure websites?

    I am looking into Azure Search as a possible replacement for a dedicated server running SearchBlox and crawling several domains.
    This might be painfully obvious to others, but is Azure Search designed to index non-Azure websites? I am unable to figure out how to set up an index for a domain. Does the service not actually 'crawl' in the sense that Google Search or Bing Search would?
    Thanks,

    Hi,
    You can add documents to your search index by using the REST API (https://msdn.microsoft.com/en-us/library/azure/dn798930.aspx). We are also working on automatically indexing SQL and DocumentDB data. Currently, Azure Search doesn't support crawling websites - Azure Search's focus is on searching app-provided data; Bing and Google generally do a great job for web pages. However, feel free to add your vote for this feature on our UserVoice site:
    http://feedback.azure.com/forums/263029-azure-search/suggestions/6328653-support-for-crawling-html-websites
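    For illustration, uploading documents to an index from PHP is a single REST call against that API. This is only a rough sketch; the service name, index name, admin key, and document fields are placeholders that must match your own index schema, and the api-version value is an assumption based on the preview API available at the time:

    // Hypothetical service name, index name, and admin key
    $serviceName = '<your-search-service>';
    $indexName   = '<your-index>';
    $apiKey      = '<your-admin-api-key>';
    $apiVersion  = '2014-07-31-Preview'; // assumed preview api-version; check the documentation for the current value

    $url  = "https://$serviceName.search.windows.net/indexes/$indexName/docs/index?api-version=$apiVersion";
    $body = json_encode(array('value' => array(
        // '@search.action' => 'upload' inserts or replaces the document with this key
        array('@search.action' => 'upload', 'id' => '1', 'title' => 'Sample page', 'content' => 'Text extracted from a crawled page'),
    )));

    $ch = curl_init($url);
    curl_setopt_array($ch, array(
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => $body,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPHEADER     => array('Content-Type: application/json', 'api-key: ' . $apiKey),
    ));
    echo curl_exec($ch); // the response reports per-document success or failure
    curl_close($ch);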
    HTH,
    Eugene

  • How to upload 100 Mb+ video on Azure storage using PHP

    Dear Sirs,
    I have uploaded a 1.2 MB video to Azure storage through PHP code with no problem. When I tried to upload a 90 MB video I got the following error message:
    Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 90363852 bytes) in C:\xampp\htdocs\projetos\ajao\azure-sdk-for-php-master\WindowsAzure\Blob\BlobRestProxy.php on line 1319
    Can somebody please instruct me on how to upload files of 100 MB+ into Azure storage using PHP code?
    Thanks in advance for your help.
    Best regards,
    Luiz Doi

    Hi Manu, I'm having the same problem as Luiz and getting the same type of errors. I'm not a dev, so I thought I'd ask for more details to clear things up.
    I have a Joomla site that was configured through Azure hosting, but I can't upload anything larger than 8 MB into Joomla. So basically, having Azure storage blobs and Media Services for Joomla is useless if I can't move larger files into the site, or out of the site into my Azure storage/Media Services account.
    I even manually changed the upload_max_filesize and post_max_size parameters in the admin and then directly in the site code, and even added a php.ini file at the root to override everything, as I read here, but still nothing worked.
    Therefore the code you posted above seems to be the solution, but I just wanted to be sure it will solve my issue.
    1. I need my Azure-hosted site to allow larger files, up to 500 MB.
    2. I need the files people upload on my site's front end through a form (see here) to be sent to my Azure blob. What code do I need to paste into Joomla's forms admin? So basically, files first go into the site and then to storage.
    3. After I install the PHP SDK and the HTTP_Request2 PEAR package and paste your code into the SDK, how does the code recognize my Azure site and my Azure blob/containers? Where in the code do I plug in my info?
    Kindly advise.
    Karin
    kd9000
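    The memory error typically happens because the whole file is buffered in memory when it is uploaded in a single call. A minimal sketch of a chunked upload with the Azure SDK for PHP, which keeps memory usage low by staging the file as 4 MB blocks and then committing them, might look like this. The container name 'videos', the file path, and the account name/key are placeholders, and the container is assumed to already exist:

    require_once 'WindowsAzure/WindowsAzure.php'; // adjust to wherever the SDK is installed

    use WindowsAzure\Common\ServicesBuilder;
    use WindowsAzure\Blob\Models\Block;

    // Placeholder credentials - replace with your storage account name and key
    $connectionString = 'DefaultEndpointsProtocol=https;AccountName=<youraccount>;AccountKey=<yourkey>';
    $blobRestProxy = ServicesBuilder::getInstance()->createBlobService($connectionString);

    $container = 'videos';                 // assumed container name
    $blobName  = 'bigvideo.mp4';
    $handle    = fopen('/path/to/bigvideo.mp4', 'rb');
    $chunkSize = 4 * 1024 * 1024;          // 4 MB per block
    $blocks    = array();
    $counter   = 0;

    while (!feof($handle)) {
        $content = fread($handle, $chunkSize);
        if ($content === false || $content === '') {
            break; // nothing left to upload
        }
        // Block IDs must be base64-encoded strings of equal length
        $blockId = base64_encode(str_pad($counter, 6, '0', STR_PAD_LEFT));
        $blobRestProxy->createBlobBlock($container, $blobName, $blockId, $content);

        $block = new Block();
        $block->setBlockId($blockId);
        $block->setType('Uncommitted');
        $blocks[] = $block;
        $counter++;
    }
    fclose($handle);

    // Commit the staged blocks as a single blob
    $blobRestProxy->commitBlobBlocks($container, $blobName, $blocks);

    The account name and key go into the connection string above; that is how the code finds your storage account and container, independent of where the website itself is hosted.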

  • Azure Websites continuous deployment using Visual Studio Online doesn't work

    Hello!
    I have an issue with continuous deployment in Azure. I have two subscriptions for my project. I created the team project in Subscription 1 and set up automatic deployment using Git continuous deployment. Everything was fine, but after I changed the subscription I couldn't do the automatic deployment anymore because the team project can't be found. The question is: how can I do the deployment? I get the following error:
    An attempted http request against URI https://management.core.windows.net/....../publishxml returned an error: (404) Not Found.
     Exception Message: One or more errors occurred. (type AggregateException)
    Exception Stack Trace:    at Microsoft.TeamFoundation.Deployment.Workflow.Activities.RestApiAsyncCodeActivity`1.ProcessAggregateException(AsyncCodeActivityContext context, AggregateException ag)
       at Microsoft.TeamFoundation.Deployment.Workflow.Activities.RestApiAsyncCodeActivity`1.EndExecute(AsyncCodeActivityContext context, IAsyncResult result)
       at System.Activities.AsyncCodeActivity`1.System.Activities.IAsyncCodeActivity.FinishExecution(AsyncCodeActivityContext context, IAsyncResult result)
       at System.Activities.AsyncCodeActivity.CompleteAsyncCodeActivityData.CompleteAsyncCodeActivityWorkItem.Execute(ActivityExecutor executor, BookmarkManager bookmarkManager)
    Inner Exception Details:
    Exception Message: One or more errors occurred. (type AggregateException)
    Exception Stack Trace:    at System.Threading.Tasks.Task`1.GetResultCore(Boolean waitCompletionNotification)
       at System.Threading.Tasks.Task`1.InnerInvoke()
       at System.Threading.Tasks.Task.Execute()
    Inner Exception Details:
    Exception Message: An attempted http request against URI https://management.core.windows.net/.../publishxml returned an error: (404) Not Found. (type HttpRequestWithUriException)
    Exception Stack Trace:    at Microsoft.TeamFoundation.Deployment.Workflow.Activities.GetAzureWebsitePublishProfile.<>c__DisplayClass2.<GetApiTask>b__0(Task`1 returnTask)
       at System.Threading.Tasks.ContinuationResultTaskFromResultTask`2.InnerInvoke()
       at System.Threading.Tasks.Task.Execute()
    If I check the Azure website in Subscription 2, I can't find the team project from Subscription 1 either, even though I can see subscriptions 1 & 2 under services in Visual Studio Online.
    Does anyone have a solution for this?
    Regards,
    Maryam

    Note that the better forum to directly reach the Visual Studio Online experts is here. A forum admin may be able to move the post to the right place.

  • Can't use punycode domain on azure website. Getting 404 Not Found.

    I attached a punycode domain to an Azure website using PowerShell (Set-AzureWebsite).
    The domain is shown in the output of Get-AzureWebsite and in the Azure portal.
    But when I try to access the website using that domain, I get the Azure 404 Not Found page.
    When I go to the azurewebsites.net domain, I see the default website page (OK).

    I found a partial answer - moving the index.html file did make the site accessible - but at a more convoluted URL - so I have edited my question and reposted for further clarification!

  • Using AzureContinuousDeployment.11.xaml template to deploy to Azure Website not working

    I have a Visual Studio Online account and have set up continuous deployment using the AzureContinuousDeployment.11.xaml template, with the Azure Website property set to my publish profile. However, when the build runs, it completes but never deploys.
    If however, I use the TFVCContinousDeployment.12.xaml (http://azure.microsoft.com/en-us/documentation/articles/cloud-services-continuous-delivery-use-vso/)
    then it does deploy as expected.
    Any ideas?

      <connectionStrings>
        <add name="MyWhesEntities" connectionString="metadata=res://*/MyWhesModel.csdl|res://*/MyWhesModel.ssdl|res://*/MyWhesModel.msl;provider=System.Data.SqlClient;provider connection string=&quot;data source=localhost;initial catalog=mywhes;integrated security=True;MultipleActiveResultSets=True;App=EntityFramework&quot;" providerName="System.Data.EntityClient" />
        <add name="MyWhesContext" connectionString="tcp:ngw4ziez6g.database.windows.net,1433;Database=mywhes;UserID=********;Password=********;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;" providerName="System.Data.SqlClient" />
      </connectionStrings>
    Hello James,
    Your Entity Framework model's "MyWhesEntities" connection string is still pointing to a local server, not to SQL Azure; or do you change it at runtime in your code?
    And you should add the "Data Source=" property name to your second connection string; it's missing.
    Olaf Helper

  • Can the new Azure File Service be used from Azure WebSites?

    The title pretty much says it all...
    Microsoft just launched the new File Services on Azure. (http://blogs.msdn.com/b/windowsazurestorage/archive/2014/05/12/introducing-microsoft-azure-file-service.aspx).
    Can I use it from within an Azure WebSite?
    Or is it limited to use from VMs and CloudServices due to "net use" restrictions?
    Thanks

    Hi,
    Mounting a file share using SMB in Azure Websites is not supported. If you'd like to see this feature, I suggest you propose it here:
    http://feedback.azure.com/forums/169385-web-sites
    For now, you can use the REST API to do this, just as Name-Dis said.
    Best Regards,
    Jambor

  • .svclog file is not created in the cloud when the cloud service is deployed to an Azure website

    I have created a WCF cloud service which is deployed to the cloud through a Bitbucket repository.
    I want to create a .svclog file to trace logs in my Azure local storage.
    For that, I have referred to many posts and finally configured my solution as below:
    ServiceConfiguration.Cloud.cscfg:
    <Role name="MyServiceWebRole">    <Instances count="1" />    <ConfigurationSettings>      <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"                value="DefaultEndpointsProtocol=https;AccountName=StorageName;AccountKey=MyStorageKey" />    </ConfigurationSettings>    <Certificates>      <Certificate name="Certificate" thumbprint="certificatethumbprint" thumbprintAlgorithm="sha1" />    </Certificates>  </Role>
    ServiceConfiguration.Local.cscfg:
    <Role name="MyServiceWebRole">
        <Instances count="1" />    <ConfigurationSettings>      <!--Also tried with value = "UseDevelopmentStorage=true"-->      <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"               value="DefaultEndpointsProtocol=https;AccountName=StorageName;AccountKey=MyStorageKey" />    </ConfigurationSettings>    <Certificates>      <Certificate name="Certificate" thumbprint="certificatethumbprint" thumbprintAlgorithm="sha1" />    </Certificates>  </Role>
    ServiceDefinition.csdef:
    <WebRole name="MyServiceWebRole" vmsize="Small">    <Sites>      <Site name="Web">        <Bindings>          <Binding name="Endpoint1" endpointName="Endpoint1" />        </Bindings>      </Site>    </Sites>    <Endpoints>      <InputEndpoint name="Endpoint1" protocol="http" port="80" />    </Endpoints>    <Imports>      <Import moduleName="Diagnostics" />    </Imports>    <LocalResources>      <LocalStorage name="MyServiceWebRole.svclog" sizeInMB="1000" cleanOnRoleRecycle="false" />    </LocalResources>    <Certificates>      <Certificate name="Certificate" storeLocation="LocalMachine" storeName="My" />    </Certificates>  </WebRole>
    web.config (MyServiceWebRole project):
    <system.diagnostics>
      <trace autoflush="false">
        <listeners>
          <add name="AzureDiagnostics"
               type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=2.2.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
        </listeners>
      </trace>
    </system.diagnostics>
    ............
    <system.serviceModel>
      <diagnostics>
        <messageLogging maxMessagesToLog="3000"
                        logEntireMessage="true"
                        logMessagesAtServiceLevel="true"
                        logMalformedMessages="true"
                        logMessagesAtTransportLevel="true" />
      </diagnostics>
    ............
    <runtime>
      <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
        <dependentAssembly>
          <assemblyIdentity name="Microsoft.WindowsAzure.Diagnostics" publicKeyToken="31bf3856ad364e35" culture="neutral" />
          <!--<bindingRedirect oldVersion="0.0.0.0-1.8.0.0" newVersion="2.2.0.0" />-->
        </dependentAssembly>
      </assemblyBinding>
    </runtime>
    WebRole.cs (MyServiceWebRole project):
    public override bool OnStart()
    {
        //Trace.Listeners.Add(new DiagnosticMonitorTraceListener());
        Trace.Listeners.Add(new AzureLocalStorageTraceListener());
        Trace.AutoFlush = false;
        Trace.TraceInformation("Information");
        Trace.TraceError("Error");
        Trace.TraceWarning("Warning");
        TimeSpan tsOneMinute = TimeSpan.FromMinutes(1);

        // To enable the AzureLocalStorageTraceListener, uncomment the relevant section in the web.config
        DiagnosticMonitorConfiguration diagnosticConfig = DiagnosticMonitor.GetDefaultInitialConfiguration();

        // Transfer logs to storage every minute
        diagnosticConfig.Logs.ScheduledTransferPeriod = tsOneMinute;

        // Transfer verbose, critical, etc. logs
        diagnosticConfig.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;

        // Start up the diagnostic manager with the given configuration
        DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", diagnosticConfig);

        // For information on handling configuration changes
        // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.
        return base.OnStart();
    }
    AzureLocalStorageTraceListener.cs (MyServiceWebRole project):
    public class AzureLocalStorageTraceListener : XmlWriterTraceListener
    {
        public AzureLocalStorageTraceListener()
            : base(Path.Combine(GetLogDirectory().Path, "MyServiceWebRole.svclog"))
        {
        }

        public static DirectoryConfiguration GetLogDirectory()
        {
            try
            {
                DirectoryConfiguration directory = new DirectoryConfiguration();
                // SHOULD I HAVE THIS CONTAINER ALREADY EXIST IN MY LOCAL STORAGE?
                directory.Container = "wad-tracefiles";
                directory.DirectoryQuotaInMB = 10;
                directory.Path = RoleEnvironment.GetLocalResource("MyServiceWebRole.svclog").RootPath;
                var val = RoleEnvironment.GetConfigurationSettingValue("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString");
                return directory;
            }
            catch (ConfigurationErrorsException ex)
            {
                throw ex;
            }
        }
    }
    I also tried commenting out the element in the ServiceDefinition.csdef file, but then I get a build-time error (the XML specification is not valid).
    In my case, I push all source code to a Bitbucket repository and from there it is deployed to the Azure "WebSite". Here are more details:
    I need help to understand:
    Why is my service not creating the .svclog file locally?
    Why is it also not doing so even after it has been deployed to Azure?
    In which location (container) in local storage can I find the .svclog file?
    Please suggest the correct way or a modification so that I can overcome this issue. Please reply fast.
    Thanks.

    Hello _Adian,
    Thanks for the response.
    I uploaded all my code to a Bitbucket repository and configured a website on the portal using "Integrate source control" (please refer to: http://azure.microsoft.com/en-in/documentation/articles/web-sites-publish-source-control/).
    (NOTE: This is the way my client is following.)
    Here is the structure of my solution:
    1. a WCF service application (.svc)
    2. a few class library projects
    3. an Azure cloud service (with project 1 as the web role)
    Now, whenever I push my updated code to Bitbucket, it is automatically deployed to Azure.
    So please suggest how I can create a separate .svclog file in local storage (using the above environment).
    I hope this info is helpful for answering.

  • There are no more endpoints available on Azure Website

    I am just trying to set up a simple web app that pings an IP address and host it on Azure Websites. I have a Standard website (not the free tier) and I have web sockets turned ON.
    I am just sending a simple ping using the System.Net.NetworkInformation.Ping part of the .NET API. It runs fine within VS.NET on my machine when deployed locally; however, when I publish to my Azure website and run it there, I get:
    There are no more endpoints available from the endpoint mapper
    I've looked at the Azure configuration; the only obvious thing I noticed was making sure web sockets were turned on (and they are).
    Any ideas? I'm sure it is some simple configuration and I'm just missing something. Or are there certain things that are not allowed from an Azure Website?
    Thanks a bunch for looking at my question!
    --Ryan

    Hi,
    I am afraid Azure Websites doesn't support ping. If you need inbound and outbound ping, please vote for this feedback item:
    http://feedback.azure.com/forums/217313-networking-dns-traffic-manager-vpn-vnet/suggestions/3346609-icmp-support-for-azure-websites-roles-cloud-serv
    Best Regards,
    Jambor

  • PDF Generator failing on Azure Website but not development machine

    I am using EVO PDF for testing HTML-to-PDF conversion.
    http://idealpi.azurewebsites.net/defaultpdf.aspx
    On local development it works fine in full trust. When run on an Azure website in Reserved mode, with full trust in web.config, I get an error:
    Could not get conversion result header. Data receive error. Could not receive
    data. Error code: 109
    A similar post on the DiscountASP.NET site
    http://community.discountasp.net/showthread.php?t=14232
    indicates this is a full trust issue, but I have full trust defined in web.config
    <configuration><system.web> <trust level="Full" />
    Is full trust not full trust? Is there another way to define Full Trust?
    Greg
    PS
    The EO company responded with the following:
    For scalability and stability reasons, EO.Pdf performs the actual conversion in a separate worker process instead of inside your Web application process. That is probably what's failing in your environment, since a Web-only environment usually won't allow you to spawn your own worker process.
    Any workarounds using WAWS?
    Any workarounds using WAWS?

    I have solved the problem with large files (~150 pages).
    My EVO PDF version is 5.17, running under a 64-bit process (IIS 7.5).
    Making this change in web.config works OK:
      <security>
        <requestFiltering>
          <requestLimits maxAllowedContentLength="1000000000" /> <!-- Important for transferring large files -->
        </requestFiltering>
      </security>
    Regards,
    Jose Antonio.

  • Linking Azure website with Remote mysql database in Cpanel

    I am creating an Azure website, e.g. calvynlee.azurewebsites.net.
    In my hosting cPanel account, I created a MySQL database and user. In order to link the Azure website with this MySQL database, I had to place a "%" in the Remote MySQL section.
    It works, but when I remove the % from the Remote MySQL section, the Azure website shows a connection error with my database.
    Question:
    I understand that placing % allows all remote connections to my database. For security purposes, placing % is not recommended, and placing the website's IP address is the more suitable practice.
    Unfortunately, placing the website IP (obtained by pinging the website) in the Remote MySQL section does not work.
    My cPanel service provider mentioned that the IP I obtained is not the right IP, and that there is another host server IP.
    What is the actual IP address I should put in, and how do I find this right IP?

    Hi,
    Based on your description, your MySQL database is created in cPanel and you want the Azure website to connect to this database. I am not familiar with cPanel; in general, the Azure website's request may be blocked by something such as a firewall. As far as I know, Azure Hybrid Connections may meet your requirement: you can connect a website on Microsoft Azure to any on-premises resource that uses a static TCP port, such as SQL Server, MySQL, HTTP Web APIs, Mobile Services, and most custom Web Services. Refer to
    http://azure.microsoft.com/en-us/documentation/articles/web-sites-hybrid-connection-get-started/ for more details. Hope this helps; if not, please feel free to let me know.
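    Once the cPanel server accepts connections from the website (whether via the Remote MySQL whitelist or a Hybrid Connection), the PHP side is just an ordinary remote MySQL connection. A minimal sketch, where the host name, database, and credentials are all placeholders for your own cPanel values:

    // Placeholder values - use your cPanel server's hostname and the MySQL user/database you created
    $host = 'your-cpanel-server.example.com';
    $user = 'cpaneluser_dbuser';
    $pass = '********';
    $db   = 'cpaneluser_mydb';

    $mysqli = new mysqli($host, $user, $pass, $db);
    if ($mysqli->connect_errno) {
        // A failure here usually means the Remote MySQL whitelist does not allow the website's outbound IP
        die('Connection failed: ' . $mysqli->connect_error);
    }
    echo 'Connected as ' . $user;
    $mysqli->close();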
    Best Regards,
    Jambor 

  • Azure Websites suspended - why, and how do I reactivate it? Cannot submit tech support request

    Hello,
    Our company has a Microsoft Action Pack subscription. I have an MSDN Subscription allocated to me. Through the MSDN Subscription, I have a monthly $70 Azure credit. I have been using this Azure credit to test Azure Websites.
    I have noticed this month (March 2015) that two of my websites are showing "Suspended" in the portal.
    I suspect this probably happened in January 2015, when we had a particularly busy testing month and I used up all of my Azure credit for the month; however, the current status leaves me puzzled:
    1. I can't find a specific record of when, or why, the websites were suspended. Can someone tell me how to discover this?
    2. If it was due to exhausting my monthly credit, why did they not become re-enabled the following month when my account went back into credit?
    3. How do I raise a tech support ticket against this subscription for this problem, or other problems I may have with Azure during my development and testing?
    I have had various fruitless phone and email exchanges with Microsoft representatives, often suggesting I purchase technical support for this subscription in order to submit a ticket. As an MSDN subscriber, I shouldn't need to, according to this blog post
    from Microsoft:
    http://blogs.msdn.com/b/mast/archive/2013/10/24/windows-azure-technical-support-for-msdn-technet-or-mpn-users-and-partners.aspx
    However, the process described is not working for me: when it asks for my Contract ID and Access ID, it ignores them and redirects to the same screen without an error message.
    As a software developer, it is still challenging to recommend Azure hosting to clients when services become suspended without warning or explanation, and technical support channels are unclear or broken.
    Any insight appreciated.
    Michael

    Hi Cristhian,
    Thank you for re-enabling the websites. So, my immediate issue is resolved in that the websites are running again, but I do have a question about the root cause of this issue.
    If I use up my monthly credit before the end of the month and the websites get suspended again, will I need to repeat this process to get them re-enabled, or would you expect them to automatically re-enable when my account is back in credit?
    I'm still having trouble lodging a technical support request for Azure under my MSDN subscription, but I will include the detail in a reply email to your colleague Sudha.
    Thanks,
    Michael

  • For my Azure Website, I don't see instrumentation for its dependent Azure database

    I followed the instructions found at Application Insights Azure WebSite extensions but I don't see any instrumentation for the
    Azure SQL database that the Azure website uses.
    For the windows "Application Health / Overview timeline"
    I see graphs for Client Process time, Send Request Time, Server Response Time, Server Requests and Failed Requests, but no SQL
    For the windows "Diagnostics / Diagnostics timeline"
    I see graphs for Server Exceptions, Browser Exceptions, Failed Requests and "DEPENDENCY FAILURES" has as text box over it stating "Learn how to collect dependent resource data".  It seems that it should be able to see the Sql Server,
    but it doesn't 
    Any thoughts how to fix this,
    John Marsing http://MyHebrewBible.com/

    Hi John,
    Thanks for confirming your AI NuGet version is 0.12. What I meant by "choose site extension" is that, if you follow the process in the blog post that you reference (Application Insights Azure WebSite extensions), step #2 describes how to enable the Application Insights extension from the Azure Websites blade: click the Extensions part, click +, Add Site Extension, Choose Site Extension, Application Insights Extension, wait for it to be installed, and verify that it appears in your Installed Extensions list. Can you please confirm that you have enabled it for your website?
    Thanks!
    Alex
    Application Insights team is hiring: https://careers.microsoft.com/jobdetails.aspx?jid=166735. Interested? Please email your resume to albulank at microsoft.com.
