Experimentation Service -> Remote compute in an on-premises Linux VM

  • Question

  • I was just wondering whether running a remote Docker container on an on-premises Linux VM is supported as a compute target for the experimentation service in Azure ML Workbench. I have seen a couple of pieces of documentation (extracts below): one suggests it should be possible, while the other suggests the Docker container would need to run in a Linux VM on Azure rather than in an on-premises Linux VM.

    Documentation Suggesting It is Supported 

    Link

    https://docs.microsoft.com/en-gb/azure/machine-learning/desktop-workbench/experimentation-service-configuration

    Extract

    Remote VM should satisfy the following requirements:

    • Remote VM needs to be running Linux-Ubuntu and should be accessible through SSH. 
    • Remote VM needs to have Docker running.

    Documentation Suggesting It is Not Supported

    Link

    https://docs.microsoft.com/en-us/azure/machine-learning/desktop-workbench/overview-general-concepts

    Extract

    • Compute Target: A compute target is the compute resource that you configure for executing your experiments. It can be your local computer (Windows or macOS), Docker container running on your local computer or in a Linux VM on Azure, or an HDInsight Spark cluster.
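
    For context, the two requirements quoted from the first page (SSH access and a running Docker engine) can be checked against an on-premises VM with a small Python script. The sketch below is only illustrative: the hostname and username are placeholders, and it assumes the OpenSSH client is available locally with key-based authentication to the VM already set up.

        # Minimal sketch: verify that an on-premises Linux VM meets the two documented
        # requirements for a remote Docker compute target, i.e. it is reachable over SSH
        # and has Docker running. "azureuser@my-onprem-vm" is a placeholder.
        import subprocess

        VM = "azureuser@my-onprem-vm"  # placeholder: replace with your own user@host

        def ssh_run(command: str) -> bool:
            """Run a command on the remote VM over SSH; return True if it exits with 0."""
            result = subprocess.run(
                ["ssh", "-o", "BatchMode=yes", VM, command],
                capture_output=True,
                text=True,
            )
            return result.returncode == 0

        if __name__ == "__main__":
            reachable = ssh_run("true")                       # requirement 1: SSH access
            docker_ok = reachable and ssh_run("docker info")  # requirement 2: Docker running
            print(f"SSH reachable: {reachable}")
            print(f"Docker running: {docker_ok}")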

    Wednesday, May 23, 2018 2:50 PM

Answers

All replies