Integration Run-time on Azure VM RRS feed

  • Question

  • Good day forum. 

    I have a question that I feel should be easy for those of you more experienced in Azure. My network admin and I have been in a disagreement about setting up an Integration Runtime for Data Factory on an Azure VM. They are stating that the IR needs to be installed on hardware in our data center. I am stating that we can set up an Azure VM with VPN access to our network and install the IR there.

    Can anyone confirm or deny that an IR can be set up on an Azure VM to retrieve data from our on-premises sources?

    Thank you for your time.


    Wednesday, October 31, 2018 4:23 PM

All replies

  • Hi Matt,

    Here is the documentation you need to set up the 'Hosted in Azure' scenario; it also covers the on-premises setup to some degree.

    Create and configure a self-hosted integration runtime

    Please let us know if you have additional questions. ~Mike

    Wednesday, October 31, 2018 6:12 PM
  • Hi,

    Yes, you can set it up on an Azure VM. You can link him to this doc ;) Setting up a self-hosted IR on an Azure VM by using an Azure Resource Manager template


    Alberto Vega

    Wednesday, October 31, 2018 10:21 PM
  • Hi Matt,

    You can set up the IR on an Azure VM or on an on-premises server.

    If you set it up on-premises, you do not need to worry about opening port 1433 to your source systems (since the IR machine is hosted in your on-premises data center); that matters when the sources are in a secure zone and opening that port is against organization policy. In that case you only need to open outbound ports 80 and 443 on the IR server. If you install the IR on an Azure VM, you need to take care of opening all the ports required to reach your data sources.
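    Whichever placement you choose, it is worth verifying from the IR machine that the required ports are actually reachable before registering the runtime. A minimal sketch (the hostnames below are placeholders, not real endpoints from this thread; substitute your own source and service addresses):

    ```python
    import socket

    def port_reachable(host, port, timeout=3.0):
        """Return True if a TCP connection to host:port succeeds within timeout."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:  # refused, timed out, or DNS failure
            return False

    # The self-hosted IR needs outbound HTTPS (443); a SQL Server source
    # additionally needs 1433 reachable from the IR machine.
    checks = {
        ("myonpremsql.example.local", 1433): "SQL Server source (placeholder host)",
        ("management.azure.com", 443): "Azure control plane",
    }
    for (host, port), label in checks.items():
        status = "open" if port_reachable(host, port) else "blocked"
        print(f"{label}: {host}:{port} -> {status}")
    ```

    Running this on the candidate IR host (Azure VM or on-premises server) shows quickly which firewall rules still need to be opened.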


    Thursday, November 1, 2018 1:19 AM
  • Are there any metrics available on performance/data transfer implications of using an Azure VM for the self-hosted IR?

    We are in the process of getting ExpressRoute set up, and my thought was to create a couple of Azure VMs as IR nodes, peered to the virtual network that routes to our on-premises infrastructure. We plan on opening the appropriate ports to the data sources on the on-premises firewall, along with IP whitelisting.

    The goal is to move large amounts of data and make it available to Data Factory/Databricks services.
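    I have not seen published per-scenario metrics, but a back-of-envelope estimate helps when sizing the link. A small sketch (the 70% efficiency factor is an assumption, not a measured number; real throughput depends on the IR node count, source system, and copy settings):

    ```python
    def transfer_hours(gigabytes, link_gbps, efficiency=0.7):
        """Rough hours to move `gigabytes` over a link of `link_gbps`.

        efficiency is an assumed fraction of nominal bandwidth actually
        achieved end to end (protocol overhead, source read speed, etc.).
        """
        gigabits = gigabytes * 8                # data volume in gigabits
        effective_gbps = link_gbps * efficiency # usable throughput
        return gigabits / effective_gbps / 3600

    # e.g. 1 TB over a 1 Gbps ExpressRoute circuit at the assumed 70%:
    print(f"{transfer_hours(1000, 1):.1f} hours")
    ```

    Even a crude estimate like this makes it easier to decide whether one IR node on the circuit is enough or whether the copy needs to be parallelized.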

    Tuesday, December 11, 2018 6:15 PM