Technical concepts of Headwind Remote
To control a remote device, the software must solve two tasks:
- Display the device screen
- Deliver control gestures to the device and replay them
Another important requirement is usability: the remote access session must be established without any network configuration. The user should not need to know about IP addresses, hosts, routes, protocols, or other underlying technical details.
Finally, Headwind Remote should be fully open, including the server part, so that any user can set up a self-hosted solution.
The Headwind Remote system is delivered as a set of containers managed by Docker Compose. This simplifies installation and makes the Headwind Remote server compatible with virtually any Linux distribution.
Screen sharing and delivery of gestures to devices are powered by the Janus media server.
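As a rough illustration, a Docker Compose deployment of such a system could look like the sketch below. The service names, images, and ports here are hypothetical and do not reflect the actual project configuration:

```yaml
# Illustrative sketch only; service and image names are assumptions,
# not the real Headwind Remote compose file.
services:
  janus:
    image: example/janus-gateway:latest    # WebRTC media server
    network_mode: host                     # Janus needs a wide UDP port range
  web:
    image: example/remote-web:latest       # administrator's web application
    ports:
      - "443:443"
    depends_on:
      - janus
```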
The administrator’s application for remote control of Android devices is a web application. It can be opened in any web browser supporting WebRTC, for example, Chrome, so the web part is cross-platform.
The mobile agent can be installed on an Android device. Mobile platforms other than Android are not supported by Headwind Remote.
How do we share the screen?
Most remote control software uses the VNC (Virtual Network Computing) protocol for accessing remote devices.
We found this protocol unsuitable for controlling mobile devices: VNC runs the Remote Frame Buffer (RFB) protocol over TCP, which can be slow in mobile networks and cause high latency. Therefore, we separated the tasks of screen mirroring and delivery of the supervisor’s gestures.
The screencast is broadcast as an RTP stream over UDP. The mobile agent sends the video stream to a Janus inbound socket; Janus converts it into a secure WebRTC stream and forwards it to the administrator’s web application.
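To give an idea of what the agent puts on the wire, here is a minimal sketch in Java of building a 12-byte RTP packet header (RFC 3550) before sending encoded video over UDP. The payload type 96 is a common dynamic value for video; the actual agent’s field choices and packetization may differ:

```java
import java.nio.ByteBuffer;

// Sketch of an RTP header as a mobile agent might build it before
// sending encoded video frames over UDP to the Janus inbound socket.
// Payload type 96 is an assumption (a typical dynamic video type).
public class RtpPacket {
    public static byte[] header(int seq, long timestamp, int ssrc, boolean marker) {
        ByteBuffer buf = ByteBuffer.allocate(12);
        buf.put((byte) 0x80);                       // version 2, no padding/extension/CSRC
        buf.put((byte) ((marker ? 0x80 : 0) | 96)); // marker bit + payload type 96
        buf.putShort((short) seq);                  // sequence number
        buf.putInt((int) timestamp);                // RTP timestamp (90 kHz clock for video)
        buf.putInt(ssrc);                           // synchronization source identifier
        return buf.array();
    }
}
```

The header is then prepended to each encoded video chunk and sent with a plain `DatagramSocket` to the Janus UDP port; Janus handles the DTLS-SRTP encryption on the WebRTC side.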
The mobile agent captures the Android device screen using the MediaProjection API. This API is part of Android (5.0 and above) and doesn’t require any firmware customization or rooting. The video is encoded by the device’s built-in low-level encoders, keeping CPU usage low.
Using this method to mirror the screen of an Android phone, we achieved a secure, low-latency screencast even in slow mobile networks (3G and above).
How do we deliver control gestures?
Control gestures (taps and swipes) are sent from the web application to the mobile agent through a Janus data channel. Data channels run over UDP, which provides faster delivery and lower latency.
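A gesture message over such a channel can be very compact. The wire format below is a hypothetical illustration, not the actual Headwind Remote protocol:

```java
// Hypothetical text encoding for control gestures sent over the
// WebRTC data channel; the real protocol may differ.
public class GestureMessage {
    // A tap is a point: "tap,x,y"
    public static String tap(int x, int y) {
        return "tap," + x + "," + y;
    }
    // A swipe is two points plus a duration: "swipe,x1,y1,x2,y2,durationMs"
    public static String swipe(int x1, int y1, int x2, int y2, int durationMs) {
        return "swipe," + x1 + "," + y1 + "," + x2 + "," + y2 + "," + durationMs;
    }
}
```

On the device side, the agent parses each message and replays it as a gesture; because the messages are tiny, even a lossy mobile link adds little delay.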
The Android device “plays” these gestures using accessibility services. This is a standard Android API available in Android 7 and above, and it requires neither rooting nor firmware customization.
To enable accessibility services for the Headwind Remote agent, the user must explicitly give consent. The Headwind Remote agent makes this process as easy as possible.
Now that you have read about the technical concepts and underlying technologies of Headwind Remote, you can study its source code on GitHub, or install the software and set up remote access to your company’s mobile devices.