webRTC with signalR on browsers live chat

  • Question

  • User1252623935 posted

    Is it possible to use SignalR on top of WebRTC so that browsers without WebRTC support still receive the video data, just with no sound?

    Those browsers would not be able to broadcast, only receive video data over the SignalR WebSocket and display it on an HTML5 canvas.

    So basically I need one regular WebRTC upload stream that the SignalR server splits into two kinds of broadcast: one regular WebRTC stream, and one stream of video data over WebSocket to be drawn on an HTML5 canvas.

    Can SignalR give me that solution?

    Saturday, December 3, 2016 7:49 PM

All replies

  • User283571144 posted

    Hi cheinan,

    Can SignalR give me that solution?

    As far as I know, SignalR is built on standard transports (WebSockets, long polling, forever frame, etc.) that only carry text-based JSON messages.

    If you want to send video data to the client, you need to convert the data to JSON first and then send it.

    If you want to use SignalR to achieve your requirement, I suggest you transmit the frames as Data URIs.

    For more details, you could refer to the following link:
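As a rough sketch of the Data-URI idea on the sending browser: draw each frame of a `<video>` element onto a canvas, encode it with `toDataURL`, and push the string through SignalR. The hub name `videoHub`, the server method `sendFrame`, and the element id `local` below are assumptions for illustration, not names from this thread or the SignalR API:

```typescript
// Pure helper: wrap a Data-URI frame in a small JSON envelope so the
// receiver can drop late or out-of-order frames by sequence number.
function toFrameMessage(dataUri: string, seq: number): string {
  return JSON.stringify({ seq, frame: dataUri });
}

// Browser-only part (requires a page with a <video id="local"> element
// and the jQuery + SignalR client scripts loaded).
function startBroadcast(): void {
  const doc = (globalThis as any).document;
  const $ = (globalThis as any).$; // jQuery + SignalR client from script tags
  const video = doc.getElementById("local");
  const canvas = doc.createElement("canvas");
  const ctx = canvas.getContext("2d");
  const hub = $.connection.videoHub; // hypothetical hub name
  let seq = 0;

  $.connection.hub.start().done(() => {
    setInterval(() => {
      canvas.width = video.videoWidth;
      canvas.height = video.videoHeight;
      ctx.drawImage(video, 0, 0);
      // JPEG keeps the Data URI far smaller than PNG for camera frames.
      const dataUri = canvas.toDataURL("image/jpeg", 0.5);
      hub.server.sendFrame(toFrameMessage(dataUri, seq++));
    }, 100); // ~10 fps; tune against bandwidth
  });
}
```

Note the trade-off the thread goes on to discuss: each frame travels as a full Base64 string, so this is far heavier than a real WebRTC media stream.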


    Best Regards,


    Monday, December 5, 2016 8:00 AM
  • User1252623935 posted

    Thanks Brando for your reply.

    The link you gave me points to a primitive, inefficient way of doing video chat: it takes image data from the user as heavy Base64-encoded strings

    and uses SignalR to upload and broadcast them.

    It does not use WebRTC.

    What I need is for the server side to transform the incoming WebRTC stream into image-data strings that SignalR will broadcast.
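Whatever produces those image-data strings on the server, the receiving side described earlier in the thread (a non-WebRTC browser drawing broadcast frames on an HTML5 canvas) could look roughly like this. The hub name `videoHub`, the client callback `receiveFrame`, and the element id `remote` are again assumptions for illustration:

```typescript
// Pure helper: unpack the JSON envelope carrying one Data-URI frame.
function fromFrameMessage(msg: string): { seq: number; frame: string } {
  return JSON.parse(msg);
}

// Browser-only part (requires a page with a <canvas id="remote"> element
// and the jQuery + SignalR client scripts loaded).
function startViewer(): void {
  const doc = (globalThis as any).document;
  const $ = (globalThis as any).$; // jQuery + SignalR client from script tags
  const canvas = doc.getElementById("remote");
  const ctx = canvas.getContext("2d");
  const hub = $.connection.videoHub; // hypothetical hub name
  let lastSeq = -1;

  hub.client.receiveFrame = (msg: string) => {
    const { seq, frame } = fromFrameMessage(msg);
    if (seq <= lastSeq) return; // drop late or out-of-order frames
    lastSeq = seq;
    const img = new (globalThis as any).Image();
    img.onload = () => ctx.drawImage(img, 0, 0, canvas.width, canvas.height);
    img.src = frame; // the Data URI itself acts as the image source
  };

  $.connection.hub.start();
}
```

Decoding the incoming WebRTC media (SRTP/VP8) into such frames on the server is the hard part and is outside what SignalR itself provides.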

    Tuesday, December 27, 2016 10:12 AM