Friday, February 01, 2013 8:38 AM
I am trying to download an Azure Storage blob using the Casablanca library. Below is the code:
azure::storage::cloud_blob_client blob_client(blob_uri, store_cred);
azure::storage::cloud_blob_container container = get_blob_container(blob_client, U("mytest"), true);

// Get the directory to download the file into
char result[512];
DWORD a = GetCurrentDirectory(MAX_PATH, result);

success = download_file(azure::storage::cloud_blob(container, outfilename), StringToWString(result));
I observed that memory grows by about 30 MB after the blob download. Is there any way to clean up this memory, or is this a problem with the library?
Thanks for the inputs.
EDIT - Actually, the memory increase matches the blob size. Memory grows after this line:
std::vector<unsigned char> bytes = blob.get().get().data();
Is there any way to clean up this memory?
Friday, February 01, 2013 3:05 PM (Owner)
We'll look into this right away.
Friday, February 01, 2013 8:15 PM
You should be able to delete the vector once the data is written to the file, which should release the memory. We are currently adding support for streaming data from the blob for our next release. This would eliminate the need to hold the entire blob contents in memory before writing them to disk.
Wednesday, February 06, 2013 1:59 PM
Thanks for your input.
On further analysis, I found that the memory leak happens in this line:
file_buf.putn(&bytes[0], (size_t)blob_size).wait();
not in the const std::vector<unsigned char> bytes = blob.get().get().data(); line.
How do we take care of the memory here?
Saturday, February 09, 2013 1:29 AM (Owner)
Yes, we had a memory leak in the file buffer implementation; it has been identified and addressed. Our next release will include the fix, but until then there is nothing you can do except use some other file I/O library (such as iostreams) for now.
- Proposed As Answer by Artur Laksberg (Owner), Thursday, February 28, 2013 4:48 AM