[Java] Blob Storage access using AAD application

  • Question

  • Hi,

I would like to be able to use an application registration with the Reader role (or a similarly scoped role) to read the following information:

    • List of Storage Accounts
    • The location for each Storage account
    • A list of Blob Containers underneath each Storage Account
• A list of Blobs under each Blob Container

    I would like to do this WITHOUT needing access to the Storage Account Keys.

As it stands, I can list the storage accounts (provided I have the Reader role set on my AAD application registration) and obtain the location for each of these. However, beyond this I can only list the blob containers and blobs if I also give this Application Registration the STORAGE ACCOUNT KEY OPERATOR SERVICE ROLE. To me this is unacceptable, since granting such a role gives full access to the Blob Containers and blobs - meaning I get CREATE, READ, WRITE and DELETE permissions.

The reason I need to do this is that I am trying to build an application that routinely analyses the data I have stored in a number of Storage Accounts, each of which may contain many Blob Containers and, underneath those, many Blobs. A SAS is limited by an expiry date and is scoped per Storage Account (I understand that never-expiring SASs are possible but not recommended). Ideally I should receive one set of credentials and be able to list all Storage Accounts, all Blob Containers and all Blobs.

Is it possible to generate a SAS against the App Registration to allow READ ONLY permissions on the Blob Containers/Blobs without needing access to these keys? Or is there some other mechanism that allows me to do the above? I'm using the Java SDK to do this.
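For reference, the two kinds of calls involved look roughly like this. The management-plane call is what the Reader role covers; the data-plane call is the one that currently forces me onto keys or SAS. The endpoint formats and api-version here are my own assumptions from the docs, not verified:

```java
// Sketch of the two endpoints involved: the ARM call that the Reader role
// covers, and the data-plane call that currently needs a key or a SAS.
// The api-version and endpoint formats are assumptions taken from the docs.
public class StorageEndpoints {

    // Management plane: listing accounts works with just an AAD app that has Reader.
    public static String listAccountsUrl(String subscriptionId) {
        return "https://management.azure.com/subscriptions/" + subscriptionId
                + "/providers/Microsoft.Storage/storageAccounts?api-version=2016-12-01";
    }

    // Data plane: listing containers needs a key or SAS on top of this URL.
    public static String listContainersUrl(String accountName) {
        return "https://" + accountName + ".blob.core.windows.net/?comp=list";
    }

    public static void main(String[] args) {
        System.out.println(listAccountsUrl("{subscriptionId}"));
        System.out.println(listContainersUrl("myaccount"));
    }
}
```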




• Edited by sebeard Tuesday, June 13, 2017 3:45 PM
    • Edited by Aaron Chen - MSFT Thursday, June 15, 2017 1:55 AM edited title
    Tuesday, June 13, 2017 3:42 PM

All replies

One set of credentials for all your storage accounts in the form of a SAS is not possible. You can, however, have multiple SAS URIs depending on what you're setting out to retrieve in your different storage accounts (containers, blobs etc.) and ensure read-only access for a given resource by specifying the permissions sp=r within the SAS URI. Refer to this document for more information on using SAS, its best practices and some examples.
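As a rough sketch, a read-only container SAS is just a signature over a string-to-sign, computed with the account key, with sp=r as the permission field. The field layout below follows my reading of the service SAS docs for sv=2015-04-05 and is untested - treat it as illustrative, not as a verified implementation:

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class ReadOnlySas {

    // Builds a container-level service SAS with sp=r (read only).
    // String-to-sign layout follows my reading of the SAS docs for
    // sv=2015-04-05; treat this as a sketch, not a verified implementation.
    public static String build(String account, String base64Key,
                               String container, String expiryUtc) throws Exception {
        String version = "2015-04-05";
        String stringToSign = String.join("\n",
                "r",                                  // signedPermissions: read only
                "",                                   // signedStart (unused)
                expiryUtc,                            // signedExpiry, e.g. 2017-12-31T00:00:00Z
                "/blob/" + account + "/" + container, // canonicalizedResource
                "", "", "",                           // identifier, IP, protocol
                version,
                "", "", "", "", "");                  // rscc, rscd, rsce, rscl, rsct
        Mac hmac = Mac.getInstance("HmacSHA256");
        hmac.init(new SecretKeySpec(Base64.getDecoder().decode(base64Key), "HmacSHA256"));
        String sig = Base64.getEncoder().encodeToString(
                hmac.doFinal(stringToSign.getBytes(StandardCharsets.UTF_8)));
        return "sv=" + version + "&sr=c&sp=r&se=" + expiryUtc
                + "&sig=" + java.net.URLEncoder.encode(sig, "UTF-8");
    }

    public static void main(String[] args) throws Exception {
        String fakeKey = Base64.getEncoder().encodeToString("not-a-real-key".getBytes());
        System.out.println(build("myaccount", fakeKey, "mycontainer", "2017-12-31T00:00:00Z"));
    }
}
```

Note that producing the signature still requires the account key - which is the trade-off being discussed in this thread.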
    -----------------------------------------------------------------------------------------------------

    Do click on "Mark as Answer" on the post that helps you, this can be beneficial to other community members.

    • Proposed as answer by Md Shihab Wednesday, June 14, 2017 4:22 AM
    • Unproposed as answer by sebeard Wednesday, June 14, 2017 7:32 AM
    Wednesday, June 14, 2017 4:22 AM
I'm not necessarily looking for a single SAS for all Storage Accounts. I'm fine with that - and I understand that article. What it doesn't tell me, though, is how users of the application should input credentials.

Let's take an extreme example. Say I have 200 Storage Accounts - that means I have 400 Storage Keys (2 per account), and I can create SAS URIs for each of these, which would be totally independent. I can then access all my storage accounts at a read-only level using the SAS URIs. Good. However, this doesn't fix the front-end problem. I now need to create and input 200 different SAS URIs to access all Storage Accounts, and when they expire I have to replace them. From a security aspect this looks good, but from a usability standpoint it's very painful - even if my example were just 10 Storage Accounts this approach seems cumbersome.

It also doesn't provide me with all the data my application requires. The below link and JSON are only accessible using an AAD Application or User:

    https://docs.microsoft.com/en-us/rest/api/storagerp/srp_json_get_storage_account_properties

{
    "id": "/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Storage/storageAccounts/{accountName}",
    "name": "accountName",
    "location": "account geo region",
    "tags": {
        "key1": "value1",
        "key2": "value2"
    },
    "type": "Microsoft.Storage/StorageAccount",
    "properties": {
        "provisioningState": "status",
        "encryption": {
            "services": {
                "blob": {
                    "enabled": true,
                    "lastEnabledTime": "dateTime"
                }
            },
            "keySource": "Microsoft.Storage"
        },
        "primaryEndpoints": {
            "blob": "blob endpoint",
            "queue": "queue endpoint",
            "table": "table endpoint",
            "file": "file endpoint"
        },
        "primaryLocation": "primary geo region",
        "statusOfPrimary": "available|unavailable",
        "lastGeoFailoverTime": "dateTime",
        "secondaryLocation": "secondary geo region",
        "statusOfSecondary": "available|unavailable",
        "secondaryEndpoints": {
            "blob": "secondary blob endpoint",
            "queue": "secondary queue endpoint",
            "table": "secondary table endpoint"
        },
        "creationTime": "dateTime",
        "customDomain": {
            "name": "user domain"
        },
        "accessTier": "Cool|Hot"
    },
    "sku": {
        "name": "Standard_LRS|Standard_ZRS|Standard_GRS|Standard_RAGRS|Premium_LRS",
        "tier": "Standard|Premium"
    },
    "kind": "Storage|BlobStorage"
}

    If I use an AAD application registration then I can access the above data. If I also provide the application with the Storage Key Operator Service Role then I can access the Storage Account keys and generate SAS URIs to access all storage accounts. So in this case I get all the data I need with a single credential (clientId/applicationId, tenantId/directoryId, secret), however I also get full permissions on Blob Storage since I have access to the keys. Usability good, Security bad.
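To be concrete about what "a single credential" means here: the AAD app does a client-credentials token request once, and then sends the returned access token as a Bearer header on every ARM call. This sketch only builds the request pieces (the endpoint and field names follow the v1 client-credentials grant as I understand it; the placeholder values are obviously illustrative):

```java
public class AadTokenRequest {

    // Single-credential flow: POST this form body to the AAD token endpoint,
    // then put the returned access_token in an "Authorization: Bearer ..."
    // header on ARM calls. Endpoint and field names follow the v1
    // client-credentials grant; treat this as a sketch.
    public static String tokenEndpoint(String tenantId) {
        return "https://login.microsoftonline.com/" + tenantId + "/oauth2/token";
    }

    public static String formBody(String clientId, String secret) throws Exception {
        return "grant_type=client_credentials"
                + "&client_id=" + enc(clientId)
                + "&client_secret=" + enc(secret)
                + "&resource=" + enc("https://management.azure.com/");
    }

    private static String enc(String s) throws Exception {
        return java.net.URLEncoder.encode(s, "UTF-8");
    }

    public static void main(String[] args) throws Exception {
        System.out.println(tokenEndpoint("{tenantId}"));
        System.out.println(formBody("{clientId}", "{secret}"));
    }
}
```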

What I'm trying to understand or ask is: is there a way to get access to all Storage Accounts and the above data using just an AAD Application Registration? I don't mind having to handle 200 SAS URIs internally, but I don't want to compromise security or usability in this aspect. I want a way to keep my permissions at a read-only level throughout. Is this possible? Can I get a handle/reference to 200 different SAS URIs using just an AAD Application Registration? If not, is there some other mechanism I should be using?

    SAS URI seems limited for my purpose. Storage Account Keys give away too much control.

    Wednesday, June 14, 2017 7:32 AM
There seem to be references to listing Account/Service SAS URIs here: https://docs.microsoft.com/en-us/rest/api/storagerp/storageaccounts#StorageAccounts_ListServiceSAS

But I can't find any reference in the Java SDK - so I'm guessing it's not there? Furthermore, while this may give me some SAS URIs to use, I'm not 100% sure that they will list anything if a user hasn't generated any SAS URIs.
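In case it helps anyone reading later: from the REST reference it looks like listServiceSas actually *generates* a SAS on demand rather than listing existing ones, which - if I'm reading it right - would let an AAD app mint read-only SAS tokens without ever touching the account keys. This is just the request shape I'd expect, with field names taken from the docs and completely untested:

```java
public class ListServiceSasRequest {

    // Request shape for the ARM listServiceSas operation linked above.
    // It appears to generate a SAS on demand (not enumerate existing ones).
    // Field names and api-version are taken from the REST docs; untested.
    public static String url(String subscriptionId, String resourceGroup, String account) {
        return "https://management.azure.com/subscriptions/" + subscriptionId
                + "/resourceGroups/" + resourceGroup
                + "/providers/Microsoft.Storage/storageAccounts/" + account
                + "/listServiceSas?api-version=2016-12-01";
    }

    public static String body(String account, String container, String expiryUtc) {
        return "{"
                + "\"canonicalizedResource\": \"/blob/" + account + "/" + container + "\","
                + "\"signedResource\": \"c\","   // container-level SAS
                + "\"signedPermission\": \"r\"," // read only
                + "\"signedExpiry\": \"" + expiryUtc + "\""
                + "}";
    }

    public static void main(String[] args) {
        System.out.println(url("{subscriptionId}", "{resourceGroup}", "myaccount"));
        System.out.println(body("myaccount", "mycontainer", "2017-12-31T00:00:00Z"));
    }
}
```

The POST would be authorised with the AAD Bearer token, so in principle the app registration alone would be the single credential.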

    Wednesday, June 14, 2017 8:16 AM
You may open a support ticket with our technical team for a more detailed analysis of the issue: http://azure.microsoft.com/en-us/support/options/

    Tuesday, July 4, 2017 1:28 PM