List of Blobs in Storage Container

Hi,
I am aware I can get an enumeration page which provides the list of blobs in a container in XML format.
The issue is, I can't parse this into HTML with JavaScript. Ideally I want a file.html or file.php which will list the entire contents of the storage container and give a hyperlink to each resource, much like a web server's index page.
The problem is that when I use JavaScript and issue an xmlhttp.open GET request to https://myaccount.blob.core.windows.net/mycontainer?restype=container&comp=list, it does not work; I get the following error: No
'Access-Control-Allow-Origin' header is present on the requested resource.
Thus it seems I cannot retrieve the information in this manner.
Is there any way of achieving this at all?

Web browsers use a security restriction called the "same-origin policy" that prevents a web page from calling APIs in a different domain. You must set up
CORS on your Azure Storage account so that web browsers will recognize the cross-origin request as authorized. For more information, see the following MSDN page:
http://msdn.microsoft.com/en-us/library/azure/dn535601.aspx
and the following blog post: http://blogs.msdn.com/b/windowsazurestorage/archive/2014/02/03/windows-azure-storage-introducing-cors.aspx.
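
As a rough illustration, here is a minimal C# sketch of adding a CORS rule to the Blob service properties with the Azure Storage Client Library for .NET (assuming version 3.x or later; the connection string and allowed origin below are placeholders). Once a rule like this is in place, the browser's GET to the ?comp=list URL should come back with the Access-Control-Allow-Origin header and the XML can be parsed client-side:

using System.Collections.Generic;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;

class EnableBlobCors
{
    static void Main()
    {
        // Placeholder connection string; use your own account name and key.
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=...");
        CloudBlobClient client = account.CreateCloudBlobClient();

        // Fetch the current Blob service properties and append a CORS rule.
        ServiceProperties properties = client.GetServiceProperties();
        properties.Cors.CorsRules.Add(new CorsRule
        {
            AllowedOrigins = new List<string> { "http://www.mysite.com" }, // placeholder origin
            AllowedMethods = CorsHttpMethods.Get,
            AllowedHeaders = new List<string> { "*" },
            ExposedHeaders = new List<string> { "*" },
            MaxAgeInSeconds = 3600
        });

        // Save the updated properties; the rule may take a short while to take effect.
        client.SetServiceProperties(properties);
    }
}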

Similar Messages

  • List blobs in a container based on time condition

    Hi, Azure team,
Is it possible to provide an API that returns only the list of blobs in a container within some time window (e.g., one hour, one day, etc.) from the current timestamp? I found some related links on this topic, [I cannot post the link here, please search for the keywords "Filtering
and deleting blobs in a container based on date and extension?" if you are interested in the link I found :)]
And I found something in [I cannot post the link here, please search for the keywords "List Blobs (REST API)" if you are interested in the link I found :)] but none of them discussed how to get blobs in a container within some time window.
From the perspective of implementation, I assume there is some index for the blobs in the container; the index is probably hash-based. Is there a plan to add another index based on ordering (a tree-structure-based index) to support this kind of query?
    Best,
    Aden

    Hi,
I suggest you create a support ticket to submit this issue to the Azure experts; I think you will get professional advice. Below is the channel:
http://azure.microsoft.com/en-us/support/options/
Hope this helps.
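
There is no service-side filter on modification time in the List Blobs operation today (as far as I know), so a common workaround is to list the container and filter client-side on each blob's Last-Modified property. Below is a minimal C# sketch assuming the Azure Storage Client Library for .NET (3.x or later); the connection string and the container name "mycontainer" are placeholders, and note that this still enumerates the whole container:

using System;
using System.Linq;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class ListRecentBlobs
{
    static void Main()
    {
        // Placeholder connection string; use your own account name and key.
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=...");
        CloudBlobContainer container = account.CreateCloudBlobClient().GetContainerReference("mycontainer");

        // Flat listing of every blob, keeping only those modified within the last hour.
        DateTimeOffset cutoff = DateTimeOffset.UtcNow.AddHours(-1);
        var recent = container.ListBlobs(null, true)
                              .OfType<ICloudBlob>()
                              .Where(b => b.Properties.LastModified.HasValue &&
                                          b.Properties.LastModified.Value >= cutoff);

        foreach (ICloudBlob blob in recent)
        {
            Console.WriteLine("{0}  {1}", blob.Properties.LastModified, blob.Uri);
        }
    }
}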

  • Read Only Access to Storage Container

Is it possible to give read-only access to a particular storage container without adding someone to the subscription and without providing them the access key, without anonymous requests, and without going through the SAS route?

    By default, a container and any blobs within it may be accessed only by the owner of the storage account. If you want to give anonymous users read permissions to a container and its blobs, you can set the container permissions to allow public access. Anonymous
    users can read blobs within a publicly accessible container without authenticating the request.
This link gives full details:
    https://msdn.microsoft.com/en-us/library/azure/dd179354.aspx
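
For illustration, a minimal C# sketch of switching an existing container to public blob-level read access (assuming the Azure Storage Client Library for .NET, 3.x or later; the connection string and container name are placeholders). Anonymous users can then read individual blobs by URL, but cannot list the container or write to it:

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class MakeContainerPublic
{
    static void Main()
    {
        // Placeholder connection string; use your own account name and key.
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=...");
        CloudBlobContainer container = account.CreateCloudBlobClient().GetContainerReference("mycontainer");

        // Allow anonymous read access to blobs, but not to the container listing.
        container.SetPermissions(new BlobContainerPermissions
        {
            PublicAccess = BlobContainerPublicAccessType.Blob
        });
    }
}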
    Frank

  • Access blobs in private container

    Hi,
How can we access blobs in the container using the account name and key if the container is marked as private?
If the container is public we can directly download the blob from the URL.
Is there any such way to download blobs from a private container using a URL?
    Regards,
    Kishan.

    Hi Kishan,
You could consider using Shared Access Signatures to access blobs in private containers.
You can refer to the following links for details on how to create and use SAS (Shared Access Signatures) with blob storage:
    http://azure.microsoft.com/en-in/documentation/articles/storage-dotnet-shared-access-signature-part-2/
    https://msdn.microsoft.com/en-us/library/azure/jj721951.aspx
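
As a rough sketch, a read-only SAS URL for a single blob in a private container could be generated like this with the .NET storage client (3.x or later assumed; the connection string, container and blob names are placeholders). The resulting URL can be used to download the blob directly until the signature expires:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class MakeBlobSasUrl
{
    static void Main()
    {
        // Placeholder connection string; use your own account name and key.
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=...");
        CloudBlobContainer container = account.CreateCloudBlobClient().GetContainerReference("mycontainer");
        CloudBlockBlob blob = container.GetBlockBlobReference("report.pdf"); // placeholder blob name

        // Grant read-only access for one hour.
        string sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Read,
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
        });

        // Append the token to the blob URI to get a downloadable link.
        Console.WriteLine(blob.Uri + sasToken);
    }
}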
    Regards,
    Malar.

  • Accessing blobs in private container without Shared Access Secret key

Is there any way to access blobs in a private blob container without a Shared Access Signature (SAS) key? I mean any user/role-based security or domain-level security, i.e. only our domain should be able to access blobs in the private container, etc.
Actually I don't want to append a SAS key after each blob URL to access it; I want my container to be private and also want to access each blob in that container without a SAS key.
Is any such option currently available or planned in a future release?

    Hi Yazeem,
> That main page loads successfully but the js, css, xml files which this page accesses are unable to load because the SAS key is not appended to their URL automatically.
If the main page is served by an HTTP handler and the js, css, xml files are linked using relative addresses, these files will also be served by the HTTP handler. For example, if the HTTP handler serves a page at the address
http://xxx.cloudapp.net/blobproxy/index.html and the page links to a script file using the tag
<script src="myscript.js"></script>, the browser will actually use the address
http://xxx.cloudapp.net/blobproxy/myscript.js to access the script file. So the solution is to create an HTTP handler to serve all requests to the address
http://xxx.cloudapp.net/blobproxy/*.
For testing purposes, I made this sample. Please add a class file BlobProxy.cs to your web role project:
using System;
using System.Web;
using Microsoft.WindowsAzure.StorageClient;
using Microsoft.WindowsAzure;

namespace WebApplication2
{
    public class BlobProxy : IHttpHandler
    {
        // Please replace this with your blob container name.
        const string blobContainerName = "files";

        public bool IsReusable
        {
            get { return false; }
        }

        public void ProcessRequest(HttpContext context)
        {
            // Get the requested file name from the URL path.
            string fileName = context.Request.Path.Replace("/blobproxy/", string.Empty);

            // Get the blob from blob storage.
            var storageAccount = CloudStorageAccount.DevelopmentStorageAccount;
            var blobStorage = storageAccount.CreateCloudBlobClient();
            string blobAddress = blobContainerName + "/" + fileName;
            CloudBlob blob = blobStorage.GetBlobReference(blobAddress);

            // Write the blob content to the response.
            context.Response.Clear();
            try
            {
                blob.FetchAttributes();
                context.Response.ContentType = blob.Properties.ContentType;
                blob.DownloadToStream(context.Response.OutputStream);
            }
            catch (Exception ex)
            {
                context.Response.Write(ex.ToString());
            }
            context.Response.End();
        }
    }
}
Then please add this HTTP handler to the web.config file:
<configuration>
  <system.webServer>
    <handlers>
      <add name="BlobProxy" verb="*" path="/blobproxy/*" type="WebApplication2.BlobProxy"/>
    </handlers>
  </system.webServer>
</configuration>
Before running the project, please replace blobContainerName with your own blob container that contains both the HTML page and its related files. Then start debugging the Azure service project and you can use the following address to access the page:
http://127.0.0.1:[port number]/blobproxy/[page name]
If the above sample does not work for you, please let me know.
    Thanks.
    Wengchao Zeng

  • What are the list of .stock by storage location

Hi, I want to develop a report of stock by storage location. Please give me a description of a report that lists stock by storage location. What are the functional specs for this report, and what tables and fields do we have to use for this report? Please give me a sample report.

    Dear Radha Krishna,
    You can either use program RMMMBESTN (from SE38) or TCode MMBE. However, you can copy the above program into a 'Z' program and change the logic if your requirement is not served by the standard report or you want to enhance the same.
    Go through standard Reports:
    MMBE Stock Overview
    MB52 List of Warehouse Stocks on Hand
    MB53 Display Plant Stock Availability
    MB54 Consignment Stocks
    MB5B Stocks for Posting Date
    MB5K Stock Consistency Check
    MB5L List of Stock Values: Balances
    MB5T Stock in transit CC
    MB5W List of Stock Values
    MBBS Display valuated special stock
    MBLB Stocks at Subcontractor
    MBSF Release Blocked Stock via Mat. Doc.
    MBW1 Special stocks via WWW
    MC.1 INVCO: Plant Anal. Selection: Stock
    MC.5 INVCO: SLoc Anal. Selection, Stock
    MC.9 INVCO: Material Anal.Selection,Stock
    MC.D INVCO: MRP Cntrllr.Anal.Sel. Stock
    MC.H INVCO: Business Area Anal.Sel. Stock
    MC.L INVCO: Mat.Group Analysis Sel. Stock
    MC.P INVCO: Division Analysis Sel. Stock
    MC.T INVCO: Mat.Type Anal.Selection Stock
    MC48 INVCO: Anal. of Current Stock Values
    MC49 INVCO: Mean Stock Values
    MC50 INVCO: Analysis of Dead Stock
    MC8M Read Opening Stocks
    MCB) INVCO: Long-Term Stock Selection
    MCC4 Set Up INVCO Info Structs.from Stock
    MCH: RIS: STRPS/Mvmts + Stock - Selection
    MCKJ Selection version tree: Stock
    MCKR User-spec. sel. vers. tree: Stock
    MCNB BW: Initialize Stock Balances
    MCSK Call Standard Analyses of Stocks
    MD04 Display Stock/Requirements Situation
    ME27 Create Stock Transport Order
    ME2O SC Stock Monitoring (Vendor)
    MF65 Stock Transfer for Reservation
    MI35 Batch Input: Post Zero Stock Balance
    MIQ1 Batch Input: PhInvDoc. Project Stock
    MM73 Special Stocks: Preparation
    MM74 Archive Special Stocks
    MM75 Display Archive of Special Stocks
    MMBE_OLD Stock Overview
    MMN1 Create Non-Stock Material &
    MS04 Planning Scenario: Stock/Reqmts List
    MS29 Calculate Sim. Initial Stock
    Tables:
    MKPF Header: Material Document
    MSEG Document Segment: Material
    IKPF Header: Physical Inventory Document
    ISEG Physical Inventory Document Items
    SBSE Stock Mngmt Levels for Inventory Sampling
    SKPF Header Data: Inventory Sampling
    SLGH Elements of Stock Population
    SSCH Strata of Inventory Sampling
    T156 Movement Type
    MSKA Sales Order Stock
    MSKAH Sales Order Stock: History
    MSKU Special Stocks with Customer
    MSKUH Special Stocks at Customer: History
    MSLB Special Stocks with Vendor
    MSLBH Special Stocks at Vendor: History
    MSPR Project Stock
    MSPRH Project Stock: History
    MSSAH Total Sales Order Stocks: History
    MSSL Total Special Stocks with Vendor
    MSSQ Project Stock Total
    MSSQH Total Project Stocks: History
    http://goldenink.com/abap/files_in_sap.html
    http://www.erpgenie.com/abap/tables.htm
    Regards,
    Naveen.

  • List of stock by storage bin

    Hi,
I need to display a list of materials by storage bin. How can I display in this list the quantity and valuation of stock for each material and for each storage bin?
    Thanks for your comments and suggestions.

I am not in front of the system, but if I am right, MBEW has a field called valuation area, which is the plant, and MATNR you should find in both tables.
Check by doing a join on the said fields; if the system does not allow joins on the MBEW table, then use ABAP statements to read the values.
Hope the above helps.

  • List of PT TABLES which contain digital certificate

    Team,
We have implemented a self-signed certificate in our environment.
So, can someone share the list of PT tables which contain the digital certificate?
Because we are going to refresh the environment and I don't want to lose the digital certificate after the refresh.
    Thanks

To be on the safer side, I am taking an export of the below tables too. Please have a look.
    -- PROCESS SERVERS
    EXPORT PS_SERVERCLASS;
    EXPORT PS_SERVERDEFN;
    EXPORT PS_SERVERDEFN_LNG;
    EXPORT PS_SERVERNOTIFY;
    EXPORT PS_SERVERMESSAGE;
    EXPORT PS_SERVEROPRTN;
    EXPORT PS_SERVERCATEGORY;
    EXPORT PS_SERVERSTAT;
    --Report Node
    EXPORT PS_CDM_DIST_NODE;
    -- URL DEFINITIONS
    EXPORT PSURLDEFN;
    EXPORT PSURLDEFNLANG;
    EXPORT PS_PT_URL_PROPS;
    -- DIRECTORY
    EXPORT PSDSDIR;
    EXPORT PSDSSRVR;
    EXPORT DSCONNECTID;
    EXPORT PSDSEXT_INSTALL;
    EXPORT PSDSSECMAPMAIN;
    EXPORT PSDSSECMAPSRVR;
    EXPORT DSUSRPRFLMAP;
    EXPORT PSDSUSERPRFL;
    EXPORT PSDSSECROLERULE;
    EXPORT DSSRCH_SBR;
    EXPORT DSSRCHATTR;
    EXPORT DSSECFILTER;
    EXPORT PT_WF_NOT_DSCFG;
    -- WEB Data
    EXPORT PSWEBPROFBROW;
    EXPORT PSWEBPROFCOOK;
    EXPORT PSWEBPROFDEF;
    EXPORT PSWEBPROFILE;
    EXPORT PSWEBPROFPROP;
    EXPORT PSWEBPROFNVP;
    EXPORT PSGATEWAY;
    --Digital Certificates
    EXPORT PSCERTISSUER;
    EXPORT PSCERTDEFNDEL;
    EXPORT PSCERTDEFN;
    EXPORT PSCERTDB;
    -- Search Attribute
    EXPORT PSPTSF_ATTRS;
    EXPORT PSPTSF_ADD_PRMS;
    EXPORT PSPTSF_ATT_PROP;
    -- Search Definition
    EXPORT PSPTSF_SD;
    EXPORT PSPTSF_SD_ATTR;
    EXPORT PSPTSF_SD_ATTRH;
    EXPORT PSPTSF_SD_DCATR;
    EXPORT PSPTSF_SD_DCACL;
    EXPORT PSPTSF_SD_LANG;
    EXPORT PSPTSF_SD_PNLGP;
    EXPORT PSPTSF_SD_SRACL;
    -- Search Category
    EXPORT PSPTSF_CATADVFD;
    EXPORT PSPTSF_CATDSPFD;
    EXPORT PSPTSF_CATFACET;
    EXPORT PSPTSF_CAT_LANG;
    EXPORT PSPTSF_SRCCAT;
    EXPORT PSPTSF_SRCCATAT;
    -- Search Context
    EXPORT PSPTUS_CTX;
    EXPORT PSPTUS_CTX_DET;

  • Listing of Channels in a container

I am having some trouble and need some help...
    I want to make a JSPProvider channel that can display all of the channels that are located in another Container.
    For example I have a TableContainer that contains a JSP channel and another Table Container
    RootTableContainer
    ----> JSP Channel
    -----> TableContainer
Within the JSP Channel I want to provide a list of all of the channels available in the Table Container. Essentially this will work similarly to the Frame Layout.
    I have tried variations of the following with no success:
    <dt:obtainContainer container="$CONTAINERNAME">
    <dtcpc:containerProviderContext>
    <jx:forEach var="channel" items="$channels">
      <dtcpc:obtainChannelFromContainer channel="channel">
      <jx:declare id="channel" type="java.lang.String"/>
      <dt:getTitle id="title" scope="request" silentException="true"/>
      <jx:declare id="title" type="java.lang.String"/>
      Title: <%=title%>
      Channel: <%=channel%>
      </dtcpc:obtainChannelFromContainer>
    </jx:forEach>
    </dtcpc:containerProviderContext>
</dt:obtainContainer>
Can anyone provide some help/feedback?
    -Thanks

Only an idea:
Listing all channels within one container lists only the channels available in there, so you need to "get a hook" upwards to the parent container (from JSPChannel -> Root), which should be able to provide you with a list of the contained containers (JSPChannel + TableContainer). Then you get the hook to the TableContainer and can ask this one for its content.
-> This means going up and down the hierarchy.
Another option could be to extend the RootContainer (maybe you don't have to develop a new one and extending the JSP is enough) and in this you go downwards to the TableContainer, fetch the information and send it as a parameter list from Root to your JSPChannel provider.
    .. only 2 ideas
    /u

  • Quering on blob fields that contain XML

    Hi All,
    I need to query on blob fields that contain XML data. So I wanna say give me XML docs where ELEMENT[@ATTRIBUTE='VALUE']. So, I need to be able to query on attributes. Is there any way to do that?
    Thanks.
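
As a generic client-side sketch (not tied to any particular database): once an XML document has been read out of the BLOB field into the application, it can be tested against an XPath expression such as //ELEMENT[@ATTRIBUTE='VALUE']. A minimal C# example, assuming the XML is already available as a string; the element and attribute names are placeholders:

using System;
using System.Linq;
using System.Xml.Linq;
using System.Xml.XPath;

class XmlAttributeFilter
{
    static void Main()
    {
        // Hypothetical XML content as it might be stored in a BLOB field.
        string xml = "<ROOT><ELEMENT ATTRIBUTE=\"VALUE\">hello</ELEMENT></ROOT>";

        // Parse the document and test it against the attribute condition.
        XDocument doc = XDocument.Parse(xml);
        bool matches = doc.XPathSelectElements("//ELEMENT[@ATTRIBUTE='VALUE']").Any();

        Console.WriteLine(matches ? "document matches" : "no match");
    }
}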

Thanks for the attempt, but I tried what you suggested and it will only pull records where the asterisk is the ONLY character in that data field.
Maybe I wasn't clear enough.
I have in excess of 200,000 records in KNA1, out of which some have one or more * values in some of the fields, e.g. Company ABC At XXXX road Workshop is stored in the Name 1 field. I want all records like this where * is one of the characters in that field.
    May have to download the entire lot and use Excel to filter.

  • Interim storage bins creation, List of standard interim storage bins

    Hi,
We need to create interim storage bins as part of the cutover. Can someone provide me a list of the standard interim storage bins to be created? Which transaction is used to create interim storage bins?
    Thank you

    Hi,
    You can use the transaction LX20 to create interim bins.
The following interim bins are available in the standard system:
    001     901    WE-ZONE
    001     902    WE-ZONE
    001     910    WA-ZONE
    001     917    QUALITAET
    001     920    UML-ZONE
    001     921    UML-ZONE
    001     922    U-ZONE
    001     980    DUMMY
    001     998    AUFNAHME
    001     999    DIEBSTAHL
    001     999    SCHROTT
    Thanks & Regards,

  • HT4859 my iphone is listed twice in manage storage under backups.  can i erase the old one?

    my iphone is listed 2x in manage storage under backups.  i think it's because verizon replaced a broken phone, but they helped me set the new replacement one up so i would think the old one shouldn't appear at all.  can i erase the old one? 
    this is on my iphone under: settings-icloud-storage&backup-manage storage-
    it shows the old phone latest back up date the day that i sent it back to verizon (6.7GB). they recommend a back up before you send it to them.  the current iphone is 7.2GB, so yes i have been having storage problems.  i just want to make sure it will be ok to erase the old one iphone.

    dhmarvin wrote:
    why would someone need the old backup?  isn't everything from the old back up on the new iphone backup?
    I just wanted to be clear that if you delete it, it's permanently gone.  So make sure you check your phone to be sure it has all your data before deleting it.  If it does, go ahead.
    when verizon helped me set up the replacement phone, it transferred the 6.7GB to the new one right?
    I don't know what Verizon did; judging from the size of your new backup it would seem that they did.  Check your phone.  If it has all your data, go ahead and delete the old backup.

  • Displaying List of BLOB images

    Hello everyone,
I successfully and easily used the steps described in http://forum.java.sun.com/thread.jspa?threadID=5047085 for displaying an image from a BLOB column in a database.
This was the first step in the process of developing a page that displays a list of images. Imagine, for example, a picture album.
Ideally, I'd like to display a list of images for a given user_id. I do not want to use Creator's table component because its layout, appearance and functionality are not useful for any of my purposes.
Is there any way to dynamically add image components to a JSC page? Can I somehow create the number of image components to match the number of rows in a resultset (containing all images for a user)?
    I'd greatly appreciate any suggestions.
    Thanks.
    - P

    This could be an issue with image caching in the renderers. Try the SuperImage control from here
http://www.quietlyscheming.com/blog/2007/01/23/some-thoughts-on-doubt-on-flex-as-the-best-option-orhow-i-made-my-flex-images-stop-dancing/

  • Azure Sql DB Export to Storage Container fails with "An error occurred while sending the request"

I've built a new VM from which I'm running PowerShell scripts to back up my databases.  It had worked before on an old server for several months, and worked once on the new server, then I upgraded my Azure PowerShell cmdlets and haven't been able to
get it to work again.  The new version is 0.8.10.1.
Below is my source code, with sensitive stuff replaced with ?'s.  When I display $stctx and $dbctx, they seem to have reasonable values.  I added the IP address of the server as an exception to the db firewall, and I've installed SQL Server
Management Studio and verified that I can connect to the database.  I have a feeling there's something simple I've overlooked.
Here are both error messages:
    Start-AzureSqlDatabaseExport : An error occurred while sending the request.
    At C:\Users\Public\PublicCmds\test.ps1:29 char:1
    + Start-AzureSqlDatabaseExport -SqlConnectionContext $dbctx -StorageContext $stctx ...
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
        + CategoryInfo          : NotSpecified: (:) [Start-AzureSqlDatabaseExport], HttpRequestException
        + FullyQualifiedErrorId : Microsoft.WindowsAzure.Commands.SqlDatabase.Database.Cmdlet.StartAzureSqlDatabaseExport
    Start-AzureSqlDatabaseExport : Error while copying content to a stream.
    At C:\Users\Public\PublicCmds\test.ps1:29 char:1
    + Start-AzureSqlDatabaseExport -SqlConnectionContext $dbctx -StorageContext $stctx ...
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
        + CategoryInfo          : NotSpecified: (:) [Start-AzureSqlDatabaseExport], HttpRequestException
        + FullyQualifiedErrorId : Microsoft.WindowsAzure.Commands.SqlDatabase.Database.Cmdlet.StartAzureSqlDatabaseExport
    Here is the source code:
    param($dbname)
if ($dbname -eq $null) {
    write-host "Database code must be specified"
    return
}
    $password = "????"| ConvertTo-SecureString -asPlainText -Force
    $servercredential = new-object System.Management.Automation.PSCredential("????", $password) 
    $dbsize = 1
    $dbrestorewait = 10
    $dbserver = "????"
    $stacct = $dbname
    $stkey = "????"
    $stctx = New-AzureStorageContext -StorageAccountName $stacct -StorageAccountKey $stkey
    $dbctx = New-AzureSqlDatabaseServerContext -ServerName $dbserver -Credential $servercredential 
    $dt = Get-Date
    $timestamp = "_" + $dt.Year + "-" + ("{0:D2}" -f $dt.Month) + "-" + ("{0:D2}" -f $dt.Day) + "-" + ("{0:D2}" -f $dt.Hour) + ("{0:D2}" -f $dt.Minute)
    $bkupname = $dbname + $timestamp + ".bacpac"
    write-host "db context"
    $dbctx
    write-host "storage context"
    $stctx
    write-host "Backup $dbname to $bkupname"
    Start-AzureSqlDatabaseExport -SqlConnectionContext $dbctx -StorageContext $stctx -StorageContainerName databasebackup -DatabaseName $dbname -BlobName $bkupname

    Hi Brad,
The mentioned script, with appropriate values, works on my system;
I'm able to export an Azure SQL database to blob storage. I am using version 0.8.10.1 of the cmdlets, the same version mentioned in this problem description.
Can you please try using Add-AzureAccount and check if that helps? This is indicated in a third-party blog post:
http://answers.flyppdevportal.com/categories/azure/azuretroubleshooting.aspx?ID=8aee89fe-430e-45fe-af54-7c8ed3ac60e1
Does it work from a different machine with newly downloaded credentials?
Does it work for a newly created database (so with a minimal database size)?
If the above do not work, we may require additional details like the RequestID, StorageAccountName, and ServerName, so an MS support ticket may be more appropriate.
    Girish Prajwal

  • How to schedule a WebI Report for the list of users that it contains in it

I have a WebI report in BO XI R2 which contains the list of employee details (Employee Name, Mail ID, Emp Code) for those who missed their timesheet for the week. Please let me know: is there a way that I can send a mail notification through BO report scheduling to the list of employees who are listed in the report? Thanks in advance!

    Hi,
In BO XI 3.1 you can do this using a publication object. Unfortunately, in XI R2 this works only for DeskI documents. You may want to consider using the BOE SDK for this. I must admit that I do not have a specific solution for you, but a workflow may look like this:
1) Schedule your WebI report with the user list to be exported to a network drive.
2) Build a Java program that reads in the file, connects to the BOBJ repository and schedules the report using the recipient list.
    Regards,
    Stratos
