Ingesting assets to Media Services from another blob storage account

    Question

  • Is there a way to ingest an asset into Azure Media Services from another Azure blob storage account? I see examples of ingesting content from a local folder, but I cannot find one that creates an asset from media that already exists in another blob storage account.
    Friday, June 22, 2012 5:26 PM

Answers

All replies

  • In Preview, we don't have any helper methods for automatically moving a blob from one storage account to the one that is connected to your Media Services account. This is a common request, and we are planning to add support for it in a later release.

    A possible workaround for you is to create an empty Asset in WAMS using the API, and then get a Write access SAS Locator for the Asset.  You can then use the standard Azure blob storage client library APIs to copy the blob into the Asset, and then call Publish on the Asset to commit it.
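    In code, the workaround above might look roughly like this. This is only a sketch against the preview-era SDK: the method names (CreateEmptyAsset, CreateSasLocator, Publish) follow the preview samples and may differ in later releases, `_context` is assumed to be an authenticated CloudMediaContext, and `sourceBlobName` is a placeholder for your blob's name in the other account.

    ```csharp
    using System;
    using Microsoft.WindowsAzure.MediaServices.Client;
    using Microsoft.WindowsAzure.StorageClient;

    // 1) Create an empty asset in the Media Services account.
    IAsset asset = _context.Assets.CreateEmptyAsset("MyIngestedAsset", AssetCreationOptions.None);

    // 2) Get a write-access SAS locator for the asset's container.
    IAccessPolicy writePolicy = _context.AccessPolicies.Create(
        "WritePolicy", TimeSpan.FromDays(1), AccessPermissions.Write);
    ILocator sasLocator = _context.Locators.CreateSasLocator(asset, writePolicy);

    // 3) Use the standard storage client library against the SAS URL to copy
    //    the source blob into the asset's container. Note that a plain
    //    within-account copy is not enough here; cross-account copies need
    //    either a download/upload round trip or the 1.7.1 server-side copy
    //    discussed later in this thread.
    CloudBlobContainer assetContainer = new CloudBlobContainer(sasLocator.Path);
    CloudBlob destBlob = assetContainer.GetBlobReference(sourceBlobName);
    // ...copy the source blob's contents into destBlob here...

    // 4) Commit the asset so Media Services picks up the new file.
    asset.Publish();
    ```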

    Saturday, June 23, 2012 4:54 PM
  • I have tried the solution you've mentioned and it doesn't work. I managed to create the empty asset and copy my file from blob storage into the asset, but afterwards the job for that particular asset always ends up in State.Error. If I create the asset from exactly the same file, but from my local file system, then the job completes successfully.

    Have you actually tried this yourself and confirmed that it works?

    PS: The error details of the job report the ever-mysterious:

    "Media Processor" : "Task failed"

    • Edited by John Ropas Tuesday, July 10, 2012 10:16 PM addition
    Tuesday, July 10, 2012 10:14 PM
  • Hi John,

    Can you please confirm if you are doing the following:

    1) Create empty asset

    2) Copy files over to asset container

    3) call asset.Publish() on the IAsset object

    4) Run the job with this asset

    Please let us know. Thanks.

    Tuesday, July 24, 2012 4:43 AM
  • I followed the steps mentioned above and then checked the container using Azure Storage Explorer; the copied file was present in the container. But when I tried to encode the same asset, after job.Submit() the status went queued -> processing -> error.

    My code looks like this:

    assetToBeProcessed = RefreshAsset(_context, assetToBeProcessed.Id);

    IJob job = _context.Jobs.Create("My encoding job");
    IMediaProcessor processor = GetMediaProcessor(_context, "Windows Azure Media Encoder");
    ITask task = job.Tasks.AddNew("My encoding task", processor, "H.264 256k DSL CBR", TaskCreationOptions.ProtectedConfiguration);

    task.InputMediaAssets.Add(assetToBeProcessed);
    task.OutputMediaAssets.AddNew("Output asset", true, AssetCreationOptions.StorageEncrypted);
    job.Submit();

    bool finished = false;

    while (!finished)
    {
        // Re-query the job to get its latest state.
        var tjob = _context.Jobs.Where(j => j.Id == job.Id).FirstOrDefault();
        Debug.Write(tjob.State);
        switch (tjob.State)
        {
            case JobState.Finished:
            case JobState.Error:
                finished = true;   // stop polling on both terminal states
                break;
            default:
                break;
        }
        if (!finished)
            Thread.Sleep(30000);
    }

    where 'assetToBeProcessed' is the newly published asset and 'RefreshAsset()' does the following ...

    private static IAsset RefreshAsset(CloudMediaContext _context, String asset_id)
    {
        var assets = from a in _context.Assets
                     where a.Id == asset_id
                     select a;
        return assets.FirstOrDefault();
    }
    • Edited by b.raja Saturday, September 08, 2012 2:41 PM code indent
    Saturday, September 08, 2012 2:39 PM
  • This thread http://social.msdn.microsoft.com/Forums/en-US/MediaServices/thread/9aeeee31-515c-4b5a-a7aa-041e01577f5e has some code on how to move content from an existing blob to an IAsset without making a round trip to the client.
    Monday, September 17, 2012 5:53 PM
  • Hey John!

    Thanks for all your work on WAMS!

    I was curious what progress the WAMS team has made on this functionality.

    (Ingesting assets to Media Services from another blob storage account)

    I / we could really use this capability.

    Thanks again!

    Robert

    Saturday, December 15, 2012 5:18 AM
  • Hi Robert,

    I posted on how to copy from an existing blob into a new asset in this post previously.

    Go down to the last message and read the code there, Nick updated it to match the last release of the WAMS SDK.

    http://social.msdn.microsoft.com/Forums/en-US/MediaServices/thread/be486bac-ac37-4984-87d0-20931fcb1328

    We also added this feature to the new Portal UX.  You can now ingest from another blob storage account directly in the Portal.

    Monday, December 17, 2012 4:22 PM
  • Hi John,

    I went to the link but the source file is hard coded.

    //Get and validate the source blob, in this case a file called FileToCopy.mp4:
    CloudBlob sourceFileBlob = sourceContainer.GetBlobReference("FileToCopy.mp4");

    RE: "You can now ingest from another blob storage account directly in the Portal."

    Can I do this with the API? (Programmatically)

    I'd like users to be able to upload Videos to BLOB storage which simultaneously creates a Queue message. Then, a WorkerRole listens for Queue messages and processes the BLOB with WAMS ingestion code.

    I can't ingest from the Portal manually. I really need to use a BLOB as the source instead of a local file.

    Thanks,

    Robert







    • Edited by Rob Vig Tuesday, December 18, 2012 4:25 AM
    Tuesday, December 18, 2012 4:13 AM
  • Hi Rob,

    Let's assume that you can get your users to drop files into some "Inbox" container in your storage account.  There are three ways of doing this:

    1. They are doing a big POST to your server, and the server is putting it there.  Not great because there are size limitations to a POST and you become the bottleneck for all uploads.

    2. You give them a SAS URL and they do a PUT directly into the container from their client.  Watch out for cross-domain issues here; you may need to use Silverlight or Flash as a cross-domain-capable upload shim.  See this thread, and this blog post.

    3. You are not using a web-browser as a client, and you can PUT things right into storage without any of these issues.

    So at this point, you have some file in an upload container in your storage account.  For the sake of this example, let's say it is:

    yourStorageAccount.blob.core.windows.net/UploadInbox/someFile_someGuid.mp4

    And when your client app was done uploading, it added the following to your queue:

    {sourceContainerName="UploadInbox", sourceFileBlobName="somefile_someGuid.mp4"}

    When your worker sees this go by, it simply parses the message as JSON and uses the code found in the last post in:

    http://social.msdn.microsoft.com/Forums/en-US/MediaServices/thread/be486bac-ac37-4984-87d0-20931fcb1328

    But you've modified it to:

    public void UseAzureStorageSdkToUpload(string sourceContainerName, string sourceFileBlobName)
    {
    ...
        if (copyFromExistingBlob)
        {
            //
            // Specific things you'll need to set:
            // this is now an argument: var sourceContainerName = "uploads";
            // this is now an argument: var sourceFileBlobName = "FileToCopy.mp4";
    ...
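    Put together, the worker-role side of this might look like the sketch below. It assumes the queue message body is proper JSON (the `{sourceContainerName=...}` form shown above would need to be serialized as real JSON), `UseAzureStorageSdkToUpload` is the method from the linked post, the queue name is a placeholder, and the types come from the 1.7-era Microsoft.WindowsAzure.StorageClient and System.Web.Extensions assemblies.

    ```csharp
    using System.Threading;
    using System.Web.Script.Serialization;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    // Shape of the message the upload client drops on the queue.
    public class IngestMessage
    {
        public string sourceContainerName { get; set; }
        public string sourceFileBlobName { get; set; }
    }

    public void PollUploadQueue(CloudStorageAccount account)
    {
        CloudQueueClient queueClient = account.CreateCloudQueueClient();
        CloudQueue queue = queueClient.GetQueueReference("upload-notifications"); // placeholder name
        var serializer = new JavaScriptSerializer();

        while (true)
        {
            CloudQueueMessage message = queue.GetMessage();
            if (message == null)
            {
                Thread.Sleep(10000); // no work yet; back off and retry
                continue;
            }

            // Parse the JSON payload and kick off the blob-to-asset copy.
            var ingest = serializer.Deserialize<IngestMessage>(message.AsString);
            UseAzureStorageSdkToUpload(ingest.sourceContainerName, ingest.sourceFileBlobName);

            // Only delete the message once the copy has been started successfully.
            queue.DeleteMessage(message);
        }
    }
    ```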

    Regards,

    -Nick

    Tuesday, December 18, 2012 7:04 PM
  • Hi Nick! Thanks for posting.

    I am actually 98% of the way there with this solution (browser client upload to blob, set queue, listening Worker Role that ingests) and it's all working. The only catch is using a BLOB instead of a local video file as the source for the ingestion code. (I use that term loosely.)

    So without re-doing the logic, I just wanted to know if the WAMS team was going to have this capability done soon. (see the initial post)

    Thanks,

    Robert

    ps: Know of any good articles or walkthroughs on showing WAMS video in a Silverlight player? http://social.msdn.microsoft.com/Forums/en-US/MediaServices/thread/f099ef41-19eb-4778-80dd-ccae837459e1

    Tuesday, December 18, 2012 10:16 PM
  • Has any progress been made on a OOTB solution for this? (BLOB Source for Media Services Ingestion)
    • Edited by Rob Vig Wednesday, January 23, 2013 4:32 PM
    Thursday, January 10, 2013 4:37 PM
  • Please see:

    http://social.msdn.microsoft.com/Forums/en-US/MediaServices/thread/be486bac-ac37-4984-87d0-20931fcb1328

    For updates on asset.Publish(), which is now deprecated.

    Wednesday, January 16, 2013 5:24 PM
  • Just to round out this thread, we just published a new article on Copying Blobs on MSDN.

    How-to: Copy an Existing Blob into a Media Services Asset

    http://msdn.microsoft.com/en-us/library/jj933290.aspx
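    For reference, the server-side cross-account copy that the article builds on (introduced in storage client 1.7.1) is along these lines. This is a sketch: the connection strings, container names, and file name are placeholders, and the exact StartCopyFromBlob overloads may differ between releases.

    ```csharp
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    // Source and destination live in *different* storage accounts.
    CloudStorageAccount sourceAccount = CloudStorageAccount.Parse(sourceConnectionString);
    CloudStorageAccount destAccount = CloudStorageAccount.Parse(destConnectionString);

    CloudBlobContainer sourceContainer =
        sourceAccount.CreateCloudBlobClient().GetContainerReference("uploadinbox");
    CloudBlobContainer destContainer =
        destAccount.CreateCloudBlobClient().GetContainerReference(assetContainerName);

    CloudBlob sourceBlob = sourceContainer.GetBlobReference("FileToCopy.mp4");
    CloudBlob destBlob = destContainer.GetBlobReference("FileToCopy.mp4");

    // StartCopyFromBlob asks the destination blob service to pull the bytes
    // server-side; the source blob must be readable by the destination
    // (public, or addressed through a SAS URL).
    destBlob.StartCopyFromBlob(sourceBlob);
    ```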

    Thursday, February 07, 2013 8:57 PM
  • John,

    I downloaded the SDK 1.7.1 binaries. How do I build them to get the new Microsoft.WindowsAzure.StorageClient DLL?

    There are 3 separate folders in the download:

    Reference: http://msdn.microsoft.com/en-us/library/jj933290.aspx

    "The ability to copy blobs between different storage accounts was introduced in the Windows Azure SDK for .NET version 1.7.1. This version is only available through GITHUB. To use the Windows Azure SDK version 1.7.1, you need to get the binaries from GITHUB and build the Microsoft.WindowsAzure.StorageClient DLL. Then, replace the 1.7 version of the Microsoft.WindowsAzure.StorageClient.dll (this dll is added by the windowsazure.mediaservices Nuget package) with the DLL that you built. "

    Also, isn't this DLL included in the 2.0.1.0 SDK?

    http://nuget.org/packages/windowsazure.mediaservices

    Thanks again for your work. This really helps us out!

    Robert



    • Edited by Rob Vig Monday, February 11, 2013 4:56 PM
    Friday, February 08, 2013 6:24 PM
  • Rob,

    I performed the following git commands after I cloned the https://github.com/WindowsAzure/azure-sdk-for-net/tree/sdk_1.7.1 repo.

    1. $ cd azure-sdk-for-net
    2. $ git remote add upstream git@github.com:WindowsAzure/azure-sdk-for-net.git
    3. $ git fetch upstream (not sure if I had to perform this step)
    4. $ git fetch upstream sdk_1.7.1:my_sdk_1.7.1
    5. $ git checkout my_sdk_1.7.1

    In Visual Studio, I opened the StorageClient project located in \Documents\GitHub\azure-sdk-for-net\microsoft-azure-api

    1. Compiled.
    2. Copied Microsoft.WindowsAzure.StorageClient.dll that was generated.
    3. Added it to the project that copies blobs (instead of the 1.7 version added by the NuGet package)

    Please, let me know if that doesn't work for you,

    -Julia


    This posting is provided "AS IS" with no warranties, and confers no rights.

    Thursday, February 14, 2013 11:21 PM
  • Julia! Thank you for your wonderful Valentine's Day present / instructions!

    Your list worked like a charm! I was able to replace 1.7 with 1.7.1.

    Now a question for John Deutscher...

    John, I'm having trouble getting this to compile:

    http://msdn.microsoft.com/en-us/library/jj933290.aspx

    Is there a chance I could get a .zip of the entire project so I can take a look?

    Friday, February 15, 2013 6:35 PM
  • Rob,

    You can send me an email, and I will reply and send you the zip file.

    juliako@micrsoft.com

    thank you,

    Julia


    This posting is provided "AS IS" with no warranties, and confers no rights.

    Friday, February 15, 2013 6:38 PM
  • Ooops! My email to you from my Hotmail account got kicked back....

    ===========

    This is an automatically generated Delivery Status Notification.

    Delivery to the following recipients failed.

           juliako@micrsoft.com
     


    • Edited by Rob Vig Friday, February 15, 2013 9:18 PM
    Friday, February 15, 2013 9:17 PM
  • sorry :)

    it is juliako@microsoft.com


    This posting is provided "AS IS" with no warranties, and confers no rights.

    Saturday, February 16, 2013 12:25 AM
  • You can find the code here: http://code.msdn.microsoft.com/How-to-Copy-an-Existing-5ccaac3e/view/SourceCode

    Please, let me know if something is not working.

    thank you,

    Julia


    This posting is provided "AS IS" with no warranties, and confers no rights.

    Monday, February 18, 2013 8:18 PM
  • Hi John Ropas,

    Did you find a solution for this? I am facing the same issue.

    If you found a solution, please share it with us.

    Thanks in advance,

    Deepak B S

    Thursday, February 21, 2013 12:01 PM
  • Julia,

    Thank you so much for assisting us on this!

    I downloaded the project and made the adjustments to the app.config accounts (not sure if I have them right).

    Anyway, I am getting this error when I try to compile:

    ============

    Error    1    The name 'HttpUtility' does not exist in the current context    

    C:\Users\Administrator\Documents\Azure\WAMS\Copy BLOB into WAMS\C#\CopyFromExistingBlobToAsset\Program.cs    
    ============

    Here's the line of code:

    string fileName = HttpUtility.UrlDecode(Path.GetFileName(sourceBlob.Uri.AbsoluteUri));

    Thursday, February 28, 2013 2:58 AM
  • Needed to add a reference to the System.Web assembly, even though it was already listed in the using directives.
    Friday, March 01, 2013 2:57 PM