Cloud service - Uploading package files results in certificate error

Hi,
I want to upload a Cloud Service (Visual Studio 2013) to Azure but get the error "The certificate
with the thumbprint 4bbc8b9188d0e321198a9069b8a0a1c06709e6cb was not found." (in German "Das Zertifikat mit dem Fingerabdruck 4bbc8b9188d0e321198a9069b8a0a1c06709e6cb wurde nicht gefunden").
What I have done:
1. Created a valid Azure subscription (successful).
2. On the Azure Portal, created a cloud service (successful).
3. In Visual Studio, created a new project of type "Windows Azure Cloud Service" > "WCF Service Web Role" (successful). Hint: I didn't change the predefined settings (HTTP endpoint) and I didn't change any config file!
4. In Visual Studio, built the solution (successful).
5. In Visual Studio, created the package files (successful).
6. On the Azure Portal, started to upload the package files generated before (not successful!).
--> The upload terminates with the error message mentioned above.
Do I need a certificate even though it's an HTTP endpoint?
Where does the thumbprint (4bbc8b9188d0e321198a9069b8a0a1c06709e6cb) in the error message come from? It seems that Azure needs a certificate with exactly this thumbprint. But why?
Can anybody help me?
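For background on where such a thumbprint comes from: Visual Studio records certificate references in ServiceConfiguration.cscfg (a <Certificates> section is added when, for example, Remote Desktop or HTTPS is configured), and at deployment Azure looks for an uploaded service certificate whose thumbprint matches. The thumbprint itself is just the SHA-1 hash of the DER-encoded certificate bytes; a minimal sketch (the input bytes below are dummies, not a real certificate):

```python
import hashlib

def thumbprint(der_bytes):
    """A certificate thumbprint (thumbprintAlgorithm="sha1") is the
    SHA-1 hash of the DER-encoded certificate, rendered as hex."""
    return hashlib.sha1(der_bytes).hexdigest()

# Dummy bytes stand in for a real DER-encoded certificate:
print(thumbprint(b"not a real certificate"))  # 40 hex characters
```

So the error means the deployment references a certificate by hash that was never uploaded to the cloud service's Certificates tab.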

Hi,
Did you try to log on with your Azure account, or import your Azure subscription into VS? I am afraid your package lacks the Windows Azure Tools certificate, so you could try the methods described here:
http://blogs.msdn.com/b/avkashchauhan/archive/2012/05/10/downloading-windows-azure-publish-settings-subscription-configuration-file.aspx
Also, you could try the "Publish" method to deploy your project to Azure.
Please try it.
Regards,
Will

Similar Messages

  • BT Cloud wont upload Large Files

    Hi
    New to the forum so please be nice!
    I've recently started to use my BT Cloud storage and today increased my storage to 50GB.
    I have a few files I want to upload that are over 2GB in size each, so I set the upload going via the PC client application (the web version will not allow files over 1GB to be uploaded).
    My files show progress and then it stops around a quarter of the way (25%) and says upload complete, but my available space is still 50GB and the file is not there, although it shows in the client. I cannot download it or do any action on it.
    I tested with a smaller file (100MB) and that uploads fine.
    Is there a limit on the file size you can backup/upload? I've tried numerous times and it's the same thing each time, even with a different file of a similar size (over 2GB).
    Driving me a little nuts!
    Any advice/help appreciated.

    Hi guys,
    Sorry for the problems you're having with BT Cloud and also for the late reply.
    Click on my profile and in the "about me" section you'll see the link to "contact the mods".  We'll check this out and give you a shout back.
    Thanks a million,
    Robbie
    BTCare Community Mod
    If we have asked you to email us with your details, please make sure you are logged in to the forum, otherwise you will not be able to see our ‘Contact Us’ link within our profiles.
    We are sorry that we are unable to deal with service/account queries via the private message(PM) function so please don't PM your account info, we need to deal with this via our email account :-)
    If someone answers your question correctly please let other members know by clicking on ’Mark as Accepted Solution’.

  • Using the cloud service with .psd files

    I just purchased Photoshop Touch for my Galaxy Tab 2. Everything I read says I should be able to transfer .psd files from the cloud service to Photoshop Touch, but I can't get it to work. Anyone else have this issue?

    You can only open .psd files through the "add local image" and it flattens the image(no layers). It's the biggest flaw of Touch.

  • Cannot package file. Getting link error.

    My document has no Errors reported, and everything is accounted for.
    When I go to Package, I get this error.  https://www.dropbox.com/s/nkj0e4tzsr2ue25/Screenshot%202013-12-23%2011.08.52.png
    How can I troubleshoot this file when there is no specific indication of the error?
    When I package the file I'm not including fonts, just image links.
    I'm running InDesign CC.
    Thanks.

    Hi. Thanks for answering. I checked all the links. Only one file had an unusual character, and that has been corrected. No other unusual characters, other than hyphens or underscores.
    I just tried to repackage, and I'm still getting the same error.
    I wish InDesign could log which file it's having trouble with when an error like this occurs.
    Any other suggestions on how to troubleshoot this file for packaging?
    Thanks.

  • Implementing CPUID assembly codes in 12.4 Beta C++ file results in iropt Error

    Hi,
    In BOOST 1.56, the CPUID implementation is not available for Solaris 11.2 when compiling the file "libs/log/src/dump.cpp". So I have tried to implement the CPUID-equivalent code below:
    private:
        static void cpuid(uint32_t& eax, uint32_t& ebx, uint32_t& ecx, uint32_t& edx)
        {
            __asm__(
                "cpuid;"                                         /* assembly code */
                : "=a" (eax), "=b" (ebx), "=c" (ecx), "=d" (edx) /* outputs */
                : "a" (eax)                                      /* input: leaf selector */
                                                                 /* clobbers: none */
            );
        }
    and compiling the file "libs/log/src/dump.cpp" results in some weird iropt errors:
    /opt/SolarisStudio12.4-beta_jul14-solaris-x86/lib/compilers/iropt'quita+0xa4 [0x8285914]
    /opt/SolarisStudio12.4-beta_jul14-solaris-x86/lib/compilers/iropt'libsunir_error_callback+0xdb [0x8285c9b]
    /opt/SolarisStudio12.4-beta_jul14-solaris-x86/lib/compilers/sys/libsunir.so'0xdd3e [0xfe61dd3e]
    /opt/SolarisStudio12.4-beta_jul14-solaris-x86/lib/compilers/sys/libsunir.so'0x43f28 [0xfe653f28]
    /opt/SolarisStudio12.4-beta_jul14-solaris-x86/lib/compilers/sys/libsunir.so'0x4482a [0xfe65482a]
    /opt/SolarisStudio12.4-beta_jul14-solaris-x86/lib/compilers/sys/libsunir.so'0x2462a [0xfe63462a]
    /opt/SolarisStudio12.4-beta_jul14-solaris-x86/lib/compilers/sys/libsunir.so'0x2646a [0xfe63646a]
    /opt/SolarisStudio12.4-beta_jul14-solaris-x86/lib/compilers/sys/libsunir.so'0x2644c [0xfe63644c]
    /opt/SolarisStudio12.4-beta_jul14-solaris-x86/lib/compilers/sys/libsunir.so'0x26889 [0xfe636889]
    /opt/SolarisStudio12.4-beta_jul14-solaris-x86/lib/compilers/sys/libsunir.so'0x27dc8 [0xfe637dc8]
    /opt/SolarisStudio12.4-beta_jul14-solaris-x86/lib/compilers/sys/libsunir.so'ir_proc_write+0x70 [0xfe625e40]
    /opt/SolarisStudio12.4-beta_jul14-solaris-x86/lib/compilers/iropt'write_irfile+0x1be [0x832746e]
    /opt/SolarisStudio12.4-beta_jul14-solaris-x86/lib/compilers/iropt'0x2d8983 [0x8328983]
    /opt/SolarisStudio12.4-beta_jul14-solaris-x86/lib/compilers/iropt'main+0x7b2 [0x832db22]
    /opt/SolarisStudio12.4-beta_jul14-solaris-x86/lib/compilers/iropt'_start+0x72 [0x80946e2]
    compiler(iropt) error:    Iropt internal error calling libsunir.
    CC: Fatal error in /opt/SolarisStudio12.4-beta_jul14-solaris-x86/lib/compilers/iropt : Abort
    My guess is that Solaris Studio C++ does not support the CPUID feature. Does anyone know about these iropt errors and whether there has been a workaround / fix for the CPUID feature? Kindly advise. Thanks.
    Regards,
    Brian

    Hi Alexander,
    I have finally reproduced the weird iropt error. The test code when compiled with -xO4 optimization level triggers the weird iropt error.
    Test_CPUID.cpp:
    #include <sys/types.h>
    #include <sys/stat.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <string.h>
    #include <errno.h>
    #include <stdio.h>
    static int cpuid(uint32_t& eax, uint32_t& ebx, uint32_t& ecx, uint32_t& edx)
    {
        __asm__(
            "cpuid;"                                         /* assembly code */
            : "=a" (eax), "=b" (ebx), "=c" (ecx), "=d" (edx) /* outputs */
            : "a" (eax)                                      /* input: leaf selector */
                                                             /* clobbers: none */
        );
        return 1;
    }

    int main()
    {
        uint32_t eax = 0, ebx, ecx, edx;  /* leaf 0 */
        cpuid(eax, ebx, ecx, edx);
        return 0;
    }
    Run it with this command (with or without -noex does not matter):
    CC -xannotate=no -noex -mt -xO4 -Qoption iropt -Rloop_reform   -c -KPIC  -o Test_CPUID.o Test_CPUID.cpp
    If the command below is run at the -xO1 optimization level, as the suggested workaround, it throws a different error:
    CC -xannotate=no -mt -xO1 -Qoption iropt -Rloop_reform -c -KPIC  -o Test_CPUID.o Test_CPUID.cpp
    assertion failed in function fwAsmStmtArg() @ iexp1.c:415
    assert(ex_op_(actual) == EOPRVAL)
    CC: Fatal error in /opt/SolarisStudio12.4-beta_jul14-solaris-x86/lib/compilers/ube : Segmentation Fault
    The compilation is successful if no optimization levels are specified (-xO1 or -xO4 omitted).
    Regards,
    Brian

  • Is there a way to convert a WCF service into an Azure cloud service without dealing with project files?

    Here is my situation.
    We provide a platform and SDK to other people so they can build WCF services and deploy them on site. We have many ISVs who have built WCF services, packaged them, and shipped them.
    Packaging consists of a couple of manifest files and the .dll files that make up the WCF service. Now we want to take our platform to the cloud.
    Long story short: I have all the DLLs for my WCF service but no project file.
    Is there a way to generate a cloud service package from these DLL files?

    Hi,
    You can't simply package your DLLs and deploy them as an Azure cloud service deployment package - you will also need your project and the service definition file, which defines your role and instances.
    You can read more about packaging an application by using the CSPack command-line tool here - http://msdn.microsoft.com/en-us/library/azure/gg433133.aspx
    Another way to get to the cloud platform is by making use of Azure Web Sites; however, you will still need your service files to deploy.
    If you have the DLLs and the service files with you, then you can simply use the web site publish wizard and deploy to an Azure web site.
    Read more about it here - http://azure.microsoft.com/en-us/documentation/articles/web-sites-deploy/
    Bhushan | Blog |
    LinkedIn | Twitter

  • Cloud Service Not Running

    Hi There,
    I am new to Azure but I hope someone can help me. I have an ASP.NET web application that I am currently trying to move to Azure as a cloud service. I have it set up in Visual Studio and converted it to a Windows Azure cloud service. This all works fine and I can build and run the application in Visual Studio 2013 with no problem using the Azure emulator. Two days ago I deployed this to Windows Azure and it worked just fine.
    Now today, I have made changes to my application and I am trying to publish from Visual Studio once again. This is failing every time, whether I try to publish to staging or production. So to try and fix this I deleted everything from my Windows Azure account through the portal so I could start again.
    Now when I create a new cloud service it creates okay, but under production and staging there is a dash. I believe it is meant to say "running here". I am unable to publish to my cloud service; my application uploads from Visual Studio but then I get an error telling me that I cannot publish (sorry, I don't have the exact error message, but it is not very informative anyway, something like "unable to publish"). I have been googling and people are mentioning certificates, storage accounts and editing the config files.
    I cannot understand why it worked for me on my first attempt and now I cannot successfully create a cloud service.
    Any help would be greatly appreciated. I would normally give much better information to troubleshoot something, but having little knowledge of Azure, I have no idea what other information you might need to help me, so please ask and I will tell you. Oh, I am on the free trial, and I do still have credits.
    Thank You
    David

    Hi,
    I have solved the problem and I want to inform you how I did it.
    The problem was that after I created my first Azure cloud service and successfully published it, I then made some changes on my computer with Visual Studio, created a new project and copied over the files from the old project to the new project.
    Somewhere along the way my certificates got lost. I now know that cloud services will create in Azure without certificates, but they will never run and deployments will fail.
    To fix the issue I right-clicked on my application in Visual Studio, clicked Publish, and then clicked the Previous button twice to go back to the sign-in, where it showed me my subscription. I clicked on the down arrow and clicked Manage. I forget exactly how, but I managed to sign out and delete the existing subscriptions from my computer. Then I had to sign in again and click Manage once more, go to Certificates and remove the existing certificate, then click Import; you are then given a link to download the certificate from Azure. I downloaded it to my desktop, imported that file, and then proceeded to publish. Before I clicked Finish it was still remembering my old storage account, so in the profiles drop-down menu at the top I had to delete my profile (maybe I had to click Manage or something first); once I deleted my profile it searched again and found the correct storage account. I have also learned that without an already created storage account in Azure, the publish will fail too.
    So for me, I had to fix my certificates, and then I had to make sure that I had a storage account in Azure.
    I hope this information helps someone else. I have been in IT and software development for many years and I must say, I am not finding Azure cloud to be very easy.

  • Error while uploading WSDL file in Interactive Form Data Connection!

    I have created a web service to return some data based on user input.
    I am trying to link this web service to an interactive Adobe form. While creating a new data connection and uploading the WSDL file, I am receiving an error: Invalid File.
    Please help me in resolving this issue.
    I created this WSDL file by copy/pasting the XML code generated from the "Open WSDL document for selected binding" link in SOAMANAGER.
    Regards,
    Naveen.I

    Hello,
    This is a Webservice created for the FM : HRXSS_PER_READ_EMERGENCY_AR
    Here is the sample of the WSDL file generated, as asked by you.
    <?xml version="1.0" encoding="utf-8" ?>
    <wsdl:definitions targetNamespace="urn:sap-com:document:sap:soap:functions:mc-style" xmlns:wsdl="http://schemas.xmlsoap.org/wsdl/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:soap="http://schemas.xmlsoap.org/wsdl/soap/" xmlns:http="http://schemas.xmlsoap.org/wsdl/http/" xmlns:mime="http://schemas.xmlsoap.org/wsdl/mime/" xmlns:tns="urn:sap-com:document:sap:soap:functions:mc-style" xmlns:wsp="http://schemas.xmlsoap.org/ws/2004/09/policy" xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd" xmlns:n1="urn:sap-com:document:sap:rfc:functions">
      <wsdl:documentation>
        <sidl:sidl xmlns:sidl="http://www.sap.com/2007/03/sidl" />
      </wsdl:documentation>
      <wsp:UsingPolicy wsdl:required="true" />
      <wsp:Policy wsu:Id="BN_BN_ZHR_READ_EMERGENCY">
        <saptrnbnd:OptimizedXMLTransfer uri="http://xml.sap.com/2006/11/esi/esp/binxml" xmlns:saptrnbnd="http://www.sap.com/webas/710/soap/features/transportbinding/" wsp:Optional="true" />
        <saptrnbnd:OptimizedXMLTransfer uri="http://www.w3.org/2004/08/soap/features/http-optimization" xmlns:saptrnbnd="http://www.sap.com/webas/710/soap/features/transportbinding/" wsp:Optional="true" />
        <wsp:ExactlyOne xmlns:wsp="http://schemas.xmlsoap.org/ws/2004/09/policy" xmlns:sapsp="http://www.sap.com/webas/630/soap/features/security/policy" xmlns:sp="http://docs.oasis-open.org/ws-sx/ws-securitypolicy/200702" xmlns:wsa="http://www.w3.org/2005/08/addressing" xmlns:wsu="http://schemas.xmlsoap.org/ws/2002/07/utility">
          <wsp:All>
            <sp:TransportBinding>
              <wsp:Policy>
                <sp:TransportToken>
                  <wsp:Policy>
                    <sp:HttpsToken />
                  </wsp:Policy>
                </sp:TransportToken>
                <sp:AlgorithmSuite>
                  <wsp:Policy>
                    <sp:TripleDesRsa15 />
                  </wsp:Policy>
                </sp:AlgorithmSuite>
                <sp:Layout>
                  <wsp:Policy>
                    <sp:Strict />
                  </wsp:Policy>
                </sp:Layout>
              </wsp:Policy>
            </sp:TransportBinding>
          </wsp:All>
        </wsp:ExactlyOne>
      </wsp:Policy>
      <wsp:Policy wsu:Id="IF_IF_ZHR_READ_EMERGENCY">
        <sapsession:Session xmlns:sapsession="http://www.sap.com/webas/630/soap/features/session/">
          <sapsession:enableSession>false</sapsession:enableSession>
        </sapsession:Session>
        <sapcentraladmin:CentralAdministration xmlns:sapcentraladmin="http://www.sap.com/webas/700/soap/features/CentralAdministration/" wsp:Optional="true" />
      </wsp:Policy>
      <wsp:Policy wsu:Id="OP_IF_OP_HrxssPerReadEmergencyAr">
        <sapcomhnd:enableCommit xmlns:sapcomhnd="http://www.sap.com/NW05/soap/features/commit/">false</sapcomhnd:enableCommit>
        <sapblock:enableBlocking xmlns:sapblock="http://www.sap.com/NW05/soap/features/blocking/">true</sapblock:enableBlocking>
        <saptrhnw05:required xmlns:saptrhnw05="http://www.sap.com/NW05/soap/features/transaction/">no</saptrhnw05:required>
        <saprmnw05:enableWSRM xmlns:saprmnw05="http://www.sap.com/NW05/soap/features/wsrm/">false</saprmnw05:enableWSRM>
      </wsp:Policy>
      <wsdl:types>
        <xsd:schema attributeFormDefault="qualified" targetNamespace="urn:sap-com:document:sap:rfc:functions">
          <xsd:simpleType name="char1">
            <xsd:restriction base="xsd:string">
              <xsd:maxLength value="1" />
            </xsd:restriction>
          </xsd:simpleType>
          <!-- More simple types -->
            <xsd:pattern value="\d*" />
            </xsd:restriction>
          </xsd:simpleType>
        </xsd:schema>
        <xsd:schema attributeFormDefault="qualified" targetNamespace="urn:sap-com:document:sap:soap:functions:mc-style" xmlns:n0="urn:sap-com:document:sap:rfc:functions">
          <xsd:import namespace="urn:sap-com:document:sap:rfc:functions" />
          <xsd:complexType name="Bapiret2">
            <xsd:sequence>
          <!-- More element names -->
          <xsd:complexType>
            <xsd:sequence>
              <xsd:element name="Messages" type="tns:Bapirettab" />
              <xsd:element name="Records" type="tns:HcmtBspPaArR0006Tab" />
              <xsd:element name="Records2" type="tns:HcmtBspPaArR0021Tab" />
            </xsd:sequence>
          </xsd:complexType>
          </xsd:element>
        </xsd:schema>
      </wsdl:types>
      <wsdl:message name="HrxssPerReadEmergencyAr">
        <wsdl:part name="parameters" element="tns:HrxssPerReadEmergencyAr" />
      </wsdl:message>
      <wsdl:message name="HrxssPerReadEmergencyArResponse">
        <wsdl:part name="parameter" element="tns:HrxssPerReadEmergencyArResponse" />
      </wsdl:message>
      <wsdl:portType name="ZHR_READ_EMERGENCY">
        <wsp:Policy>
          <wsp:PolicyReference URI="#IF_IF_ZHR_READ_EMERGENCY" />
        </wsp:Policy>
        <wsdl:operation name="HrxssPerReadEmergencyAr">
          <wsp:Policy>
            <wsp:PolicyReference URI="#OP_IF_OP_HrxssPerReadEmergencyAr" />
          </wsp:Policy>
          <wsdl:input message="tns:HrxssPerReadEmergencyAr" />
          <wsdl:output message="tns:HrxssPerReadEmergencyArResponse" />
        </wsdl:operation>
      </wsdl:portType>
      <wsdl:binding name="ZHR_READ_EMERGENCY" type="tns:ZHR_READ_EMERGENCY">
        <wsp:Policy>
          <wsp:PolicyReference URI="#BN_BN_ZHR_READ_EMERGENCY" />
        </wsp:Policy>
        <soap:binding transport="http://schemas.xmlsoap.org/soap/http" style="document" />
        <wsdl:operation name="HrxssPerReadEmergencyAr">
          <soap:operation soapAction="" style="document" />
          <wsdl:input>
            <soap:body use="literal" />
          </wsdl:input>
          <wsdl:output>
            <soap:body use="literal" />
          </wsdl:output>
        </wsdl:operation>
      </wsdl:binding>
      <wsdl:service name="service">
        <wsdl:port name="ZHR_READ_EMERGENCY" binding="tns:ZHR_READ_EMERGENCY">
          <soap:address location="http://cieh4-srvr.collabera.com:8000/sap/bc/srt/rfc/sap/zhr_read_emergency/900/zhr_read_emergency/zhr_read_emergency" />
        </wsdl:port>
      </wsdl:service>
    </wsdl:definitions>
    Cheers,
    Remi

  • Getting an error when trying to upload an XML file into a Data Template

    Hi,
    I am getting an error when trying to upload an XML file into a Data Template. Error: "The uploaded file XXSLARPT.xml is invalid. The file should be in XML-DATA-TEMPLATE format." Please, can anybody help me?
    Thanks,
    Prasad.

    Hi,
    Can anybody help, please?
    thx,
    Prasad
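    For anyone hitting the same error: the XML-DATA-TEMPLATE format expects the file's root element to be dataTemplate, so uploading plain report output (e.g. a ROWSET dump) will be rejected. A quick sanity check along those lines (a sketch; the sample XML strings below are invented for illustration):

```python
import xml.etree.ElementTree as ET

def looks_like_data_template(xml_text):
    """Return True if the root element is <dataTemplate>,
    which is what the Data Template upload expects."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError:
        return False
    return root.tag == "dataTemplate"

print(looks_like_data_template('<dataTemplate name="XXSLARPT"></dataTemplate>'))  # True
print(looks_like_data_template('<ROWSET><ROW/></ROWSET>'))                        # False
```

    If the check fails, the file is report data rather than a data template definition.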

  • ERROR WHILE UPLOADING TIFF FILE.

    Dear Sir/Madam,
             While I am trying to upload a TIFF file I get the error below. I cannot understand where I made a mistake; please guide me to solve this problem.
    Load File
    C:\Documents and Settings\dastagiri\Desktop\PARU.tiff
    The file contains      2,798  bytes
    This is a TIFF file with INTEL byte order
    First IFD offset:                                    2,612
    Reading IFD from offset      2,612  Number of Tags         15
    ImageWidth:                                            176
    ImageLength:                                           148
    BitsPerSample levels:                                    3
    BitsPerSample - level 1:                                 8
    BitsPerSample - level 2:                                 8
    BitsPerSample - level 3:                                 8
    Compression:                                             5
    Photometric Interpretation:                              2
    Number of StripOffsets:                                  7
    SamplesPerPixel:                                         3
    RowsPerStrip:                                           23
    Number of StripByteCounts:                               7
    XResolution:                                            96  /          1
    YResolution:                                            96  /          1
    ResolutionUnit:                                          2
    TIFF format error: No baseline TIFF 6.0 file
    Thanks in Advance,
    D@st@giri.

    Dear Vijay,
    I have one TIFF 6.0 image which I am passing to the program "RSTXLDMC".
    I have not done any conversion or changes to the file.
    Regards,
    Sagar Sontakke
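    A note on the log above: it fails right after reporting Compression: 5. Compression value 5 is LZW, which is a TIFF extension rather than part of baseline TIFF 6.0, so re-saving the image uncompressed (or with a baseline-allowed scheme) usually satisfies RSTXLDMC. For illustration, the header/IFD walk that produces such a dump can be sketched like this (a simplified little-endian reader, not SAP's actual code):

```python
import struct

def read_tiff_header(data):
    """Parse the 8-byte TIFF header: byte order, magic number 42, first IFD offset."""
    order = {b"II": "<", b"MM": ">"}[data[:2]]  # II = Intel, MM = Motorola byte order
    magic, ifd_offset = struct.unpack_from(order + "HI", data, 2)
    if magic != 42:
        raise ValueError("not a TIFF file")
    return order, ifd_offset

def read_first_ifd(data, order, ifd_offset):
    """Return {tag_id: value} for the first IFD (inline SHORT/LONG values only)."""
    (count,) = struct.unpack_from(order + "H", data, ifd_offset)
    tags = {}
    for i in range(count):
        entry = ifd_offset + 2 + i * 12
        tag, typ, n = struct.unpack_from(order + "HHI", data, entry)
        if typ == 3:   # SHORT: stored in the first two bytes of the value field
            (value,) = struct.unpack_from(order + "H", data, entry + 8)
        else:          # treat everything else as LONG for this sketch
            (value,) = struct.unpack_from(order + "I", data, entry + 8)
        tags[tag] = value
    return tags

# A minimal little-endian TIFF with a single Compression tag (259):
tif = (struct.pack("<2sHI", b"II", 42, 8)        # header, first IFD at offset 8
       + struct.pack("<H", 1)                    # one IFD entry
       + struct.pack("<HHIHH", 259, 3, 1, 5, 0)  # Compression = 5 (LZW)
       + struct.pack("<I", 0))                   # no next IFD
order, offset = read_tiff_header(tif)
print(read_first_ifd(tif, order, offset))  # {259: 5}
```

    Compression 5 in that tag dictionary is exactly what the loader log reports, and what makes the file non-baseline.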

  • ITunes Match has stopped uploading - every file errors and says waiting - I have tried to delete the files and use other formats etc.  I have had the service since Day 1 and NEVER had an issue.  It didn't start until the Delete from Cloud switch to Hide

    iTunes Match has stopped uploading - every file errors and says waiting - I have tried to delete the files and use other formats etc.  I have had the service since Day 1 and NEVER had an issue.  It didn't start until the Delete from Cloud switch to Hide from cloud - the files that do not upload show grayed out on my other devices.

    Have you confirmed that you successfully purged iTunes Match by also looking on an iOS device? If so, keep in mind that Apple's servers may be experiencing a heavy load right now. They just added about 19 countries to the service, and I've read a few accounts this morning that suggest all's not running perfectly right now.

  • .svclog file is not created on the cloud when the cloud service is deployed to an Azure website

    I have created a WCF cloud service which is deployed to the cloud through a Bitbucket repository.
    I want to create a .svclog file to trace logs in my Azure local storage.
    For that, I have referred to many posts and finally configured my solution as below:
    ServiceConfiguration.Cloud.cscfg:
    <Role name="MyServiceWebRole">
      <Instances count="1" />
      <ConfigurationSettings>
        <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
                 value="DefaultEndpointsProtocol=https;AccountName=StorageName;AccountKey=MyStorageKey" />
      </ConfigurationSettings>
      <Certificates>
        <Certificate name="Certificate" thumbprint="certificatethumbprint" thumbprintAlgorithm="sha1" />
      </Certificates>
    </Role>
    ServiceConfiguration.Local.cscfg:
    <Role name="MyServiceWebRole">
      <Instances count="1" />
      <ConfigurationSettings>
        <!--Also tried with value = "UseDevelopmentStorage=true"-->
        <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
                 value="DefaultEndpointsProtocol=https;AccountName=StorageName;AccountKey=MyStorageKey" />
      </ConfigurationSettings>
      <Certificates>
        <Certificate name="Certificate" thumbprint="certificatethumbprint" thumbprintAlgorithm="sha1" />
      </Certificates>
    </Role>
    ServiceDefinition.csdef:
    <WebRole name="MyServiceWebRole" vmsize="Small">
      <Sites>
        <Site name="Web">
          <Bindings>
            <Binding name="Endpoint1" endpointName="Endpoint1" />
          </Bindings>
        </Site>
      </Sites>
      <Endpoints>
        <InputEndpoint name="Endpoint1" protocol="http" port="80" />
      </Endpoints>
      <Imports>
        <Import moduleName="Diagnostics" />
      </Imports>
      <LocalResources>
        <LocalStorage name="MyServiceWebRole.svclog" sizeInMB="1000" cleanOnRoleRecycle="false" />
      </LocalResources>
      <Certificates>
        <Certificate name="Certificate" storeLocation="LocalMachine" storeName="My" />
      </Certificates>
    </WebRole>
    web.config (MyServiceWebRole project):
    <system.diagnostics>
      <trace autoflush="false">
        <listeners>
          <add name="AzureDiagnostics"
               type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=2.2.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
        </listeners>
      </trace>
    </system.diagnostics>
    ............
    <system.serviceModel>
      <diagnostics>
        <messageLogging maxMessagesToLog="3000"
                        logEntireMessage="true"
                        logMessagesAtServiceLevel="true"
                        logMalformedMessages="true"
                        logMessagesAtTransportLevel="true" />
      </diagnostics>
    ............
    <runtime>
      <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
        <dependentAssembly>
          <assemblyIdentity name="Microsoft.WindowsAzure.Diagnostics" publicKeyToken="31bf3856ad364e35" culture="neutral" />
          <!--<bindingRedirect oldVersion="0.0.0.0-1.8.0.0" newVersion="2.2.0.0" />-->
        </dependentAssembly>
      </assemblyBinding>
    </runtime>
    WebRole.cs (MyServiceWebRole project):
    public override bool OnStart()
    {
        //Trace.Listeners.Add(new DiagnosticMonitorTraceListener());
        Trace.Listeners.Add(new AzureLocalStorageTraceListener());
        Trace.AutoFlush = false;
        Trace.TraceInformation("Information");
        Trace.TraceError("Error");
        Trace.TraceWarning("Warning");
        TimeSpan tsOneMinute = TimeSpan.FromMinutes(1);

        // To enable the AzureLocalStorageTraceListener, uncomment the relevant section in the web.config
        DiagnosticMonitorConfiguration diagnosticConfig = DiagnosticMonitor.GetDefaultInitialConfiguration();
        // Transfer logs to storage every minute
        diagnosticConfig.Logs.ScheduledTransferPeriod = tsOneMinute;
        // Transfer verbose, critical, etc. logs
        diagnosticConfig.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
        // Start up the diagnostic manager with the given configuration
        DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", diagnosticConfig);
        // For information on handling configuration changes
        // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.
        return base.OnStart();
    }
    AzureLocalStorageTraceListener.cs (MyServiceWebRole project):
    public class AzureLocalStorageTraceListener : XmlWriterTraceListener
    {
        public AzureLocalStorageTraceListener()
            : base(Path.Combine(GetLogDirectory().Path, "MyServiceWebRole.svclog"))
        {
        }

        public static DirectoryConfiguration GetLogDirectory()
        {
            try
            {
                DirectoryConfiguration directory = new DirectoryConfiguration();
                // SHOULD I HAVE THIS CONTAINER ALREADY EXIST IN MY LOCAL STORAGE?
                directory.Container = "wad-tracefiles";
                directory.DirectoryQuotaInMB = 10;
                directory.Path = RoleEnvironment.GetLocalResource("MyServiceWebRole.svclog").RootPath;
                var val = RoleEnvironment.GetConfigurationSettingValue("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString");
                return directory;
            }
            catch (ConfigurationErrorsException ex)
            {
                throw ex;
            }
        }
    }
    I also tried commenting that element out in the ServiceDefinition.csdef file, but then I get a build-time error (The XML specification is not valid).
    In my case, I am pushing all source code to a Bitbucket repository and from there it is deployed to the Azure "WebSite". Here are more details:
    I need help to know:
    Why is my service not creating the .svclog file, either locally or on Azure?
    Why is it also not doing so even after it has been deployed to Azure?
    In which location (container) can I find the .svclog file in local storage?
    Please suggest the correct way or a modification so that I can overcome this issue.
    Thanks.

    Hello _Adian,
    Thanks for response.
    I uploaded all my code on bitbucket repository and configured a website on portal using "Integrate source control" (please refer:  http://azure.microsoft.com/en-in/documentation/articles/web-sites-publish-source-control/).
    (NOTE: This is the way my client is following.)
    Here is the structure of my solution:
    1. a wcf service application (.svc)
    2. few class library projects
    3. Azure cloud service (with Project 1 as web role).
    Now whenever I push my updated code to Bitbucket, it is automatically deployed to Azure.
    So, please suggest how I can create a separate .svclog file in local storage (using the above environment).
    I hope this info is helpful for an answer.

  • How do I upload a file into my acrobat cloud?  I am needing to do this in order to convert the pdf to excel.

    How do I upload a file into my acrobat cloud?  I am needing to do this in order to convert the pdf to excel.

    Hi wesm34245063,
    Here's a quick tutorial on using the ExportPDF online service to convert PDF files to Excel: Getting Started with ExportPDF | Adobe Community.
    I think that you'll find it's pretty easy, but if you do run into questions/issues, please let us know.
    Best,
    Sara

  • WEB Service Upload File

    Dear all,
    I have some files (50 files).
    The web service has an operation like the following:
    upload(String fileName, byte[] content);
    But if a file is more than 10MB, how do I upload it to the web service?
    Please show me a solution to upload 50 files or more with the web service.
    Thanks and regards.

    It looks like you just invoke that method with an array of bytes. If you have 50 files, do it 50 times (say, in a loop). You can get the array of bytes with a FileInputStream and a ByteArrayOutputStream, but I'll bet something in the java.nio package can do it more smoothly. (Check the docs; your googling will be as good as mine for that part.)
    The details will depend a lot on the web service itself; for that, talk to the provider of the service.

  • What are the performance implications moving apps using cloud drive to Azure File Services?

    I run a number of cloud services, each with 5 or more nodes using cloud drives. Cloud Drive is scheduled to be deprecated in 2015, so I am thinking of replacing the cloud drives with the Azure Files service.
    For each cloud service I am using one storage account to create all the VHDs/cloud drives. Some people, back when Cloud Drive first appeared, told me that to get better performance I should create only one VHD/cloud drive under each storage account. For example, if I have five instances under a worker role, then I should create 5 storage accounts and create one VHD/cloud drive under each storage account, to be used by each node. I didn't follow that route because I was satisfied with the performance of the apps under cloud services having all VHDs/cloud drives under one storage account.
    My question is: if I replace the cloud drives with the Azure Files service, will my apps perform well having all shares under one storage account, or should I create one storage account for each share?
    Thanks,
    @nazik_huq

    Thanks Obama for replying.
    Here is the comment from @jaiharidas of MSFT if anyone's interested:
    @Naziq, it is better to have multiple shares under a single storage account and there are no perf implications. However, please ensure that your ingress/egress and requests/sec are within the limits of a single storage account (see msdn.microsoft.com/.../dn249410.aspx), and use multiple storage accounts if you need to scale beyond the limits.
    See the original comment  on Azure Storage Team here: http://ow.ly/ChPNf 
    @nazik_huq
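    That advice can be turned into a quick capacity check: keep shares in one account while their summed request rate stays under the per-account limit, and split accounts only when it doesn't. A sketch (the 20,000 requests/sec figure is an assumption based on the limits of that era; check the current scalability targets page for the real number):

```python
ACCOUNT_REQUEST_LIMIT = 20000  # assumed requests/sec per storage account

def accounts_needed(share_request_rates):
    """Estimate how many storage accounts are needed so that no account
    exceeds the per-account request-rate limit (greedy bin fill)."""
    accounts = 1
    current = 0
    for rate in sorted(share_request_rates, reverse=True):
        if current + rate > ACCOUNT_REQUEST_LIMIT:
            accounts += 1   # open a new account for this share
            current = rate
        else:
            current += rate
    return accounts

print(accounts_needed([8000, 7000, 6000, 5000]))  # 2
```

    This is only an estimate; greedy packing is not optimal in general, but it errs on the safe side for planning.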
