Not-so-secret keys and passwords

  • General discussion

  • This applies to Live Framework as well as the lower-level Azure stuff, but I was in an "Azure" kinda mood when I thought of it so I put the post here.

    Basically what I'm worried about is the security of my keys. Let's say I'm building a Mesh-Enabled Web Application that accesses Azure storage. I've got my storage key in my configuration file... but anybody who knows where to find the local cache of Silverlight apps from the Mesh can dig through my unprotected configuration file, find the key, and gain unlimited access to my Azure Storage account - totally compromising my application.

    The same is true for a WPF application. If I'm writing a WPF application that talks to Azure Storage, how do I secure the key? I could certainly embed the key in the code instead of the config file, but any newbie C# programmer can glean the key in 5 minutes using ILDASM or Reflector. If I store an encrypted version of the key, then I have to store the key that decrypts the key - which can also be obtained using ILDASM or Reflector, and only delays a potentially malicious coder by about 5 more minutes.
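    To make the "it only delays the attacker" point concrete, here is a tiny sketch of why client-side key obfuscation can't work: both the obfuscated key and whatever decrypts it must ship inside the binary, so anyone who disassembles the app can simply replay the same decryption. (The XOR scheme below is a deliberately trivial stand-in for any client-side "encryption".)

```python
# Both of these ship inside the application binary, so an attacker with
# ILDASM/Reflector sees them side by side:
obfuscated = bytes(b ^ 0x5A for b in b"my-storage-key")  # the "encrypted" key
xor_key = 0x5A                                           # the key to the key

# The attacker just repeats the same decryption the app itself performs:
recovered = bytes(b ^ xor_key for b in obfuscated)
# recovered == b"my-storage-key"
```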

    Is there some way that you folks have found, some best practice that I can use, that will protect my storage account key from prying developers when building apps that have a non-server footprint?

    The .NET Addict - http://dotnetaddict.dotnetdevelopersjournal.com
    • Moved by DanielOdievich Tuesday, September 28, 2010 9:28 PM forum migration (From:Windows Azure)
    Friday, December 5, 2008 7:43 PM

All replies

  • I would be interested to hear a way around this problem as well.
    Friday, December 5, 2008 8:54 PM
  • One way to solve the problem is to host a "gate-keeper" service that serves as the holder of the private information. Your Silverlight application would then only access public data, or data through the 'gate-keeper'. The SL/WPF app would use standard authentication against your 'gate-keeper' (WLID, cookies, forms authentication, etc.), which would then proxy the requests to the underlying storage.
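    A minimal sketch of that pattern (all names here - STORAGE_KEY, user_tokens, gatekeeper_fetch, the dict standing in for storage - are hypothetical, not a real Azure API; the point is only that the storage key lives on the gate-keeper and never reaches the client):

```python
# The secret that must never ship to clients; held only by the gate-keeper.
STORAGE_KEY = b"server-side-secret-never-shipped-to-clients"

# Stand-in user store: token -> user (in practice: WLID, cookies, forms auth).
user_tokens = {"tok-abc123": "alice"}

# Stand-in for the real storage account, keyed by (user, blob name).
fake_storage = {("alice", "notes.txt"): b"hello"}

def gatekeeper_fetch(token: str, blob_name: str) -> bytes:
    """Authenticate the caller, then perform the storage read on their behalf."""
    user = user_tokens.get(token)
    if user is None:
        raise PermissionError("unknown token")
    # A real gate-keeper would now sign a request to storage with STORAGE_KEY
    # and forward it; we simulate that with a dict lookup.
    data = fake_storage.get((user, blob_name))
    if data is None:
        raise KeyError("no such blob for this user")
    return data
```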

    We are definitely interested in trying to improve this scenario and looking at different approaches for it.
    Friday, December 5, 2008 9:50 PM
  • all my work so far has been predicated on this 'gate-keeper' pattern.

    i built a thin server-side sds-proxy service that holds the actual login data for SDS and can also, if needed, do its own auth-n/z work before talking to SDS. this also lets me convert the SDS plain-old-XML into HTML, Atom, JSON, CSV, etc. as needed for the client.

    i just started w/ Azure Storage this week and expect to do the same thing. esp. since the Azure auth process is so customized and varied that no standard browser or http client can be used to access WAS directly anyway.

    my suggestion to the Azure team is that all the HTTP-aware services (Azure storage/queues, SDS, Mesh) provide a standard HTTP Auth proxy to allow non-.NET code and utilities to easily interact. without that, this cool tech could end up hitting a serious adoption hurdle for non-Windows/.NET users.
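    For context on why a plain browser can't talk to Azure Storage directly: each request must carry a SharedKey signature, a base64-encoded HMAC-SHA256 over a canonicalized request string. The sketch below shows the shape of that signing; the real canonicalization rules cover many more headers than the simplified string shown here.

```python
import base64, hashlib, hmac

def sign_request(account: str, key_b64: str, string_to_sign: str) -> str:
    """Compute a SharedKey-style Authorization header value:
    base64(HMAC-SHA256(account key, canonicalized request string))."""
    key = base64.b64decode(key_b64)
    sig = base64.b64encode(
        hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    ).decode("ascii")
    return f"SharedKey {account}:{sig}"

# Simplified string-to-sign (method, date, canonicalized resource); the real
# format includes content headers and all x-ms-* headers as well.
string_to_sign = "GET\n\nWed, 10 Dec 2008 00:00:00 GMT\n/myaccount/mycontainer"
header = sign_request(
    "myaccount", base64.b64encode(b"secret").decode("ascii"), string_to_sign
)
```

    Since every request needs this computed header, generic HTTP tooling can't hit the service without a signing shim in front of it - which is exactly the adoption hurdle described above.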

    Mike Amundsen [http://amundsen.com/blog/]
    Friday, December 5, 2008 10:03 PM
  • I would not recommend allowing the Mesh application or the WPF application direct access to your Azure storage.  The best architectural model is to publish a web service (hosted in Azure) that your Mesh or WPF application can call.  That way your web service can implement your data access control and ensure that data integrity and any other business rules are enforced before actually updating Azure storage.  Shipping the keys and giving raw data access to your Azure storage is not an architecture that should even be remotely considered.  Think of Azure as part of your service-oriented architecture.

    Shan McArthur
    Friday, December 5, 2008 10:10 PM
  • This is the conclusion I'd come to a long time ago (don't let the keys off the server). I was hoping I wasn't going crazy in thinking it was a bad idea. I was just curious whether people were using the Access Control stuff as a way around it, but for my needs, involving Access Control is just way too complicated.

    So my basic plan is to use Azure Storage, implement the ASP.NET providers on top of Azure Storage, and then use user-based authentication to my web service. That way I can immediately identify the role to which the calling user belongs, which lets me (the owner) perform more operations than the users of my application.
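    The role-based gate described above could look something like this (the role names, user names, and operation sets are invented for illustration; in the actual plan the role lookup would come from the ASP.NET role provider backed by Azure Storage):

```python
# Hypothetical role store: authenticated user -> role.
roles = {"owner-user": "owner", "regular-user": "user"}

# Operations each role is permitted to perform against the web service.
allowed_ops = {
    "owner": {"read", "write", "delete", "admin"},
    "user": {"read", "write"},
}

def authorize(user: str, op: str) -> bool:
    """Return True if the calling user's role permits the requested operation."""
    role = roles.get(user)
    return role is not None and op in allowed_ops.get(role, set())
```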

    The .NET Addict - http://dotnetaddict.dotnetdevelopersjournal.com
    Saturday, December 6, 2008 1:01 AM
  • Unless you have some form of secure storage on your local client where your secret keys can be held (such as a USB key fob) for an appropriate cert exchange, there's no true security unless you move the keys off the client.  And hence the "gatekeeper" security pattern discussed above.

    Trusting code on the client side is a Hard Problem - unless, of course, you move the code off the local client.

    George Moore
    Saturday, December 6, 2008 6:04 PM
  • Take something simple like storage, where you want to upload blocks and then assemble them into a blob.  Azure lets you do this directly, but because of the authentication scheme I would have to wrap that service in a shim layer.

    Perhaps another layer of security that wasn't so locked down would be acceptable for some services.  For instance, a securely generated session id would allow a client to upload to a blob.  My service layer could do the auth and hand the session down to the client.  The client does its thing and lets the service know when it's done.

    The service can still validate what is coming in.  A hijacked session id would be of little value beyond harassment.  And considering the session id had to be created by a trusted client in the first place, it would be an easy choke point to control.
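    The session-id scheme sketched above might look like this - the service layer authenticates the user, then hands down a signed, short-lived token scoped to one blob; a hijacked token expires quickly and can't be forged without the service secret. (All names here are hypothetical; this is one possible shape for the idea, not an Azure feature.)

```python
import base64, hashlib, hmac

# Known only to the service layer; never shipped to clients.
SERVICE_SECRET = b"known-only-to-the-service-layer"

def issue_session(user: str, blob: str, now: float, ttl: int = 300) -> str:
    """After authenticating the user, mint a token tied to one blob and an
    expiry time, signed with the service secret."""
    expiry = int(now) + ttl
    payload = f"{user}|{blob}|{expiry}"
    mac = hmac.new(SERVICE_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(f"{payload}|{mac}".encode()).decode()

def validate_session(token: str, now: float) -> bool:
    """Reject tokens that are malformed, tampered with, or expired."""
    try:
        user, blob, expiry, mac = (
            base64.urlsafe_b64decode(token).decode().rsplit("|", 3)
        )
    except Exception:
        return False
    payload = f"{user}|{blob}|{expiry}"
    expected = hmac.new(SERVICE_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(mac, expected) and int(now) < int(expiry)
```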

    But your point is well taken.  The application's web service should be handling the details for lots of reasons.  Not the least of which is the ability to move between different cloud providers.
    Tuesday, January 13, 2009 4:47 AM