Android: Implementing Screen Sharing and Remote Control Between Two Mobile Phones

Updated: April 22, 2025 11:45:54 Author: Katie.

1. Project Overview

In scenarios such as remote assistance, online teaching, and technical support, the ability to view another mobile device's screen in real time and operate it remotely is extremely valuable. This project implements screen sharing and remote control between two Android phones. Its core functions include:

  • Controller (main control terminal): captures the screen, encodes frames in real time, and sends them over the network; it also listens for the user's touch, swipe, and key input on the controller side and forwards these events to the controlled end.

  • Receiver (controlled end): receives screen data, decodes it, and renders it to the local UI in real time; it also receives and parses input events from the controller and simulates touches and key presses through system interfaces to operate the controlled device.

With this solution, the user sees the controlled device's screen in real time and interacts with it by touching and swiping on the controller, effectively "remote controlling" the other phone. The core difficulties are keeping the image stream real-time and sharp, and simulating input events accurately and promptly.

2. Related knowledge

2.1 MediaProjection API

  • Overview: the screen recording and projection API introduced in Android 5.0 (API 21). After obtaining user authorization through MediaProjectionManager, you can create a VirtualDisplay and deliver the screen content to a Surface or an ImageReader.

  • Key classes

    • MediaProjectionManager: requests screen-capture permission

    • MediaProjection: performs the screen capture

    • VirtualDisplay: a virtual display whose output goes to a Surface

    • ImageReader: retrieves the screen content frame by frame as Image objects

2.2 Socket Network Communication

  • Overview: bidirectional streaming communication over TCP, suitable for stable transmission of large blocks of data.

  • Key classes

    • ServerSocket / Socket: server-side listening and client connection

    • InputStream / OutputStream: reading and writing data

  • Notice: a simple, efficient application protocol must be designed: prepend a frame header (containing, for example, the payload length) to each image frame so that the receiver can split the byte stream back into complete frames.
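The framing idea can be sketched in plain Java. DataOutputStream/DataInputStream are used here for brevity (the services later in the article write raw byte arrays instead), and a bare 4-byte length prefix stands in for the full frame header:

```java
import java.io.*;

public class Framing {
    /** Write one frame: a 4-byte big-endian length header followed by the payload. */
    static void writeFrame(DataOutputStream out, byte[] payload) throws IOException {
        out.writeInt(payload.length);   // length header (network byte order)
        out.write(payload);             // frame body
        out.flush();
    }

    /** Read one frame: first the 4-byte length, then exactly that many payload bytes. */
    static byte[] readFrame(DataInputStream in) throws IOException {
        int len = in.readInt();
        byte[] payload = new byte[len];
        in.readFully(payload);          // loops internally until len bytes arrive
        return payload;
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        writeFrame(out, "frame-1".getBytes("UTF-8"));
        writeFrame(out, "frame-2".getBytes("UTF-8"));

        DataInputStream in = new DataInputStream(new ByteArrayInputStream(buf.toByteArray()));
        System.out.println(new String(readFrame(in), "UTF-8")); // frame-1
        System.out.println(new String(readFrame(in), "UTF-8")); // frame-2
    }
}
```

Because TCP delivers a byte stream with no message boundaries, the receiver must rely on the length prefix to know where each frame ends; readFully is essential, since a plain read() may return fewer bytes than requested.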

2.3 Input event simulation

  • Overview: non-system apps cannot inject events directly through InputManager; doing so requires an AccessibilityService or system-signature permissions.

  • Key technologies

    • An AccessibilityService injects the touch events

    • Use GestureDescription to construct a gesture and trigger it with dispatchGesture

2.4 Data compression and transmission optimization

  • Image encoding: convert each Image frame to JPEG or H.264 to reduce bandwidth usage.

  • Data sharding: split large frames into segments to avoid a single blocking write or an OutOfMemoryError.

  • Network buffering and retransmission: TCP handles retransmission itself, but the send rate must be throttled to avoid congestion.
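The sharding point above can be sketched as follows; the 4096-byte chunk size and per-chunk flush are illustrative choices, not from the original:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;

public class ChunkedWriter {
    /** Write data in fixed-size chunks so one huge write cannot block the socket for long. */
    static void writeChunked(OutputStream out, byte[] data, int chunkSize) throws IOException {
        for (int off = 0; off < data.length; off += chunkSize) {
            int n = Math.min(chunkSize, data.length - off);
            out.write(data, off, n);   // hand one segment to the stream
            out.flush();               // push it out before the next segment
        }
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        byte[] frame = new byte[10_000];
        writeChunked(sink, frame, 4096);           // three chunks: 4096 + 4096 + 1808
        System.out.println(sink.size() == 10_000); // true
    }
}
```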

2.5 Multithreading and asynchronous processing

  • Overview: screen capture and network transmission are time-consuming and must run on dedicated threads or a HandlerThread; otherwise the UI will freeze.

  • Key classes

    • ThreadPoolExecutor: manages the capture, encoding, and send tasks

    • HandlerThread with a Handler: processes IO callbacks

3. Implementation Approach

3.1 Architecture Design

+--------------+                                 +---------------+
|              |--(request authorization)------->|               |
| MainActivity |                                 | RemoteActivity|
|              |<-(service started, connected)---|               |
+------+-------+                                 +-------+-------+
       |                                                 |
       | capture screen -> MediaProjection -> ImageReader| receive screen -> decode -> SurfaceView
       | encode (JPEG/H.264)                             |
       | send -> Socket OutputStream                     |
       |                                                 | receive events -> AccessibilityService
       |<--touch event packets---------------------------|                    -> dispatchGesture
       | simulate touch => AccessibilityService          |
+------+-------+                                 +-------+-------+
| ScreenShare  |                                 | RemoteControl |
|   Service    |                                 |   Service     |
+--------------+                                 +---------------+

3.2 Protocol and data format

  • Frame header structure (16 bytes; the timestamp is sent as an 8-byte long, matching the code below)

    • 4 bytes: frame type (0x01 = image, 0x02 = touch event)

    • 4 bytes: payload length N (network byte order, big-endian)

    • 8 bytes: timestamp (milliseconds)

  • Image frame data: [Frame Header][JPEG Data]

  • Touch event data

    • 1 byte: Event type (0: DOWN, 1: MOVE, 2: UP)

    • 4 bytes: X coordinate (float)

    • 4 bytes: Y coordinate (float)

    • 8 bytes: timestamp
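The touch-event layout above (1 + 4 + 4 + 8 = 17 bytes) can be packed and unpacked with java.nio.ByteBuffer, whose default byte order is big-endian, i.e. network byte order. TouchEventCodec is a hypothetical helper name for illustration:

```java
import java.nio.ByteBuffer;

public class TouchEventCodec {
    /** Pack a touch event per the protocol: 1-byte action, 4-byte x, 4-byte y, 8-byte timestamp. */
    static byte[] encode(byte action, float x, float y, long ts) {
        ByteBuffer buf = ByteBuffer.allocate(17);   // big-endian by default
        buf.put(action).putFloat(x).putFloat(y).putLong(ts);
        return buf.array();
    }

    /** Read the X coordinate back from offset 1. */
    static float decodeX(byte[] data) { return ByteBuffer.wrap(data, 1, 4).getFloat(); }

    /** Read the Y coordinate back from offset 5. */
    static float decodeY(byte[] data) { return ByteBuffer.wrap(data, 5, 4).getFloat(); }

    public static void main(String[] args) {
        byte[] frame = encode((byte) 0 /* DOWN */, 120.5f, 640.0f, System.currentTimeMillis());
        System.out.println(frame.length);   // 17
        System.out.println(decodeX(frame)); // 120.5
    }
}
```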

3.3 Screen Capture and Coding

  1. The controller calls MediaProjectionManager.createScreenCaptureIntent() and launches the returned intent to request authorization.

  2. Once authorization is granted, obtain the MediaProjection, create the VirtualDisplay, and bind the ImageReader's Surface to it.

  3. On a dedicated thread, continuously obtain raw Image frames via ImageReader.acquireLatestImage().

  4. Convert each Image to a Bitmap, then encode it with Bitmap.compress(Bitmap.CompressFormat.JPEG, 50, outputStream).

  5. Prepend the protocol frame header to the JPEG bytes and send them to the controlled end.

3.4 Network transmission and decoding

Main control terminal

  • Use a singleton SocketClient to manage the connection.

  • Write the encoded frame data to a BufferedOutputStream and call flush() when needed.

Controlled end

  • Start ScreenReceiverService, which listens on the port and accepts connections.

  • With a BufferedInputStream, first read the fixed-size frame header, then read the payload according to its length field.

  • Decode the JPEG data with BitmapFactory.decodeByteArray() and update the SurfaceView.

3.5 Input Event Capture and Simulation

Main control terminal

  • In MainActivity, override onTouchEvent(MotionEvent) to capture touch events and extract the event type and coordinates.

  • Package them into event frames according to the protocol and send them to the controlled end.

Controlled end

  • After RemoteControlService receives an event frame, it builds a gesture through the accessibility API:

Path path = new Path();
path.moveTo(x, y);
GestureDescription.StrokeDescription stroke =
        new GestureDescription.StrokeDescription(path, 0, 1);
  • Build a GestureDescription from the stroke and call dispatchGesture(gesture, callback, handler) to inject the touch.

4. Complete code

/**************************  MainActivity.java  **************************/
// NOTE: the package name was missing from the original; "com.example.screencast" is a placeholder.
package com.example.screencast;
 
import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.graphics.PixelFormat;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.ImageReader;
import android.media.projection.MediaProjection;
import android.media.projection.MediaProjectionManager;
import android.os.Bundle;
import android.util.DisplayMetrics;
import android.view.MotionEvent;
import android.widget.Button;
 
import java.io.BufferedOutputStream;
import java.net.Socket;
 
/*
  * MainActivity is responsible for:
  * 1. Requesting screen-capture permission
  * 2. Starting ScreenShareService
  * 3. Capturing touch events and sending them
  */
public class MainActivity extends Activity {
    private static final int REQUEST_CODE_CAPTURE = 100;
    private MediaProjectionManager mProjectionManager;
    private MediaProjection mMediaProjection;
    private ImageReader mImageReader;
    private VirtualDisplay mVirtualDisplay;
    private ScreenShareService mShareService;
    private Button mStartBtn, mStopBtn;
    private Socket mSocket;
    private BufferedOutputStream mOut;
 
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        mStartBtn = findViewById(R.id.btn_start);
        mStopBtn = findViewById(R.id.btn_stop);
 
        // Click Start: request authorization and start the service
        mStartBtn.setOnClickListener(v -> startCapture());
        // Click Stop: stop the service and disconnect
        mStopBtn.setOnClickListener(v -> {
            if (mShareService != null) mShareService.stop();
        });
    }
 
    /** Request screen-capture authorization */
    private void startCapture() {
        mProjectionManager = (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
        startActivityForResult(mProjectionManager.createScreenCaptureIntent(), REQUEST_CODE_CAPTURE);
    }
 
    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_CODE_CAPTURE && resultCode == RESULT_OK) {
            mMediaProjection = mProjectionManager.getMediaProjection(resultCode, data);
            // Initialize the ImageReader and VirtualDisplay
            setupVirtualDisplay();
            // Start the sharing service
            mShareService = new ScreenShareService(mMediaProjection, mImageReader);
            mShareService.start();
        }
    }
 
    /** Initialize the virtual display used for screen capture */
    private void setupVirtualDisplay() {
        DisplayMetrics metrics = getResources().getDisplayMetrics();
        mImageReader = ImageReader.newInstance(metrics.widthPixels, metrics.heightPixels,
                                               PixelFormat.RGBA_8888, 2);
        mVirtualDisplay = mMediaProjection.createVirtualDisplay("ScreenCast",
                metrics.widthPixels, metrics.heightPixels, metrics.densityDpi,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
                mImageReader.getSurface(), null, null);
    }
 
    /** Capture touch events and send them to the controlled end */
    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (mShareService != null && mShareService.isRunning()) {
            mShareService.sendTouchEvent(event);
        }
        return super.onTouchEvent(event);
    }
}
 
/**************************  ScreenShareService.java  **************************/
// NOTE: the package name was missing from the original; "com.example.screencast" is a placeholder.
package com.example.screencast;
 
import android.graphics.Bitmap;
import android.media.Image;
import android.media.ImageReader;
import android.media.projection.MediaProjection;
import android.os.Handler;
import android.os.HandlerThread;
import android.util.Log;
import android.view.MotionEvent;
 
import java.io.BufferedOutputStream;
import java.io.ByteArrayOutputStream;
import java.net.Socket;
import java.nio.ByteBuffer;
 
/*
  * ScreenShareService is responsible for:
  * 1. Establishing the Socket connection
  * 2. Getting screen frames from the ImageReader
  * 3. Encoding and sending them
  * 4. Packaging and sending touch events
  */
public class ScreenShareService {
    private MediaProjection mProjection;
    private ImageReader mImageReader;
    private Socket mSocket;
    private BufferedOutputStream mOut;
    private volatile boolean mRunning;
    private HandlerThread mEncodeThread;
    private Handler mEncodeHandler;
 
    public ScreenShareService(MediaProjection projection, ImageReader reader) {
        mProjection = projection;
        mImageReader = reader;
        // Create a background thread for encoding and network IO
        mEncodeThread = new HandlerThread("EncodeThread");
        mEncodeThread.start();
        mEncodeHandler = new Handler(mEncodeThread.getLooper());
    }
 
    /** Start the service: connect to the server and begin capturing and sending */
    public void start() {
        mRunning = true;
        mEncodeHandler.post(this::connectAndShare);
    }
 
    /** Stop the service */
    public void stop() {
        mRunning = false;
        try {
            if (mSocket != null) mSocket.close();
            mEncodeThread.quitSafely();
        } catch (Exception ignored) {}
    }
 
    /** Whether the service is currently running */
    public boolean isRunning() {
        return mRunning;
    }
 
    /** Establish the Socket connection and loop capturing and sending */
    private void connectAndShare() {
        try {
            // The controlled end's LAN address; adjust for your network
            mSocket = new Socket("192.168.1.100", 8888);
            mOut = new BufferedOutputStream(mSocket.getOutputStream());
            while (mRunning) {
                Image image = mImageReader.acquireLatestImage();
                if (image != null) {
                    sendImageFrame(image);
                    image.close();
                }
            }
        } catch (Exception e) {
            Log.e("ScreenShare", "Connect or send failed", e);
        }
    }
 
    /** Send one image frame */
    private void sendImageFrame(Image image) throws Exception {
        // Convert the Image to a Bitmap, then compress it to JPEG.
        // NOTE: this assumes rowStride == width * 4; real devices may add row padding.
        Image.Plane plane = image.getPlanes()[0];
        ByteBuffer buffer = plane.getBuffer();
        int width = image.getWidth(), height = image.getHeight();
        Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
        bmp.copyPixelsFromBuffer(buffer);
 
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        bmp.compress(Bitmap.CompressFormat.JPEG, 40, baos);
        byte[] jpegData = baos.toByteArray();
 
        // Serialize writes so the capture loop and the UI thread never interleave on the socket
        synchronized (this) {
            // Write the frame header: type = 1, length, timestamp
            mOut.write(intToBytes(1));
            mOut.write(intToBytes(jpegData.length));
            mOut.write(longToBytes(System.currentTimeMillis()));
            // Write the image data
            mOut.write(jpegData);
            mOut.flush();
        }
    }
 
    /** Send a touch event as an event frame */
    public void sendTouchEvent(MotionEvent ev) {
        try {
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            baos.write((byte) ev.getActionMasked());
            baos.write(floatToBytes(ev.getX()));
            baos.write(floatToBytes(ev.getY()));
            baos.write(longToBytes(ev.getEventTime()));
            byte[] data = baos.toByteArray();
 
            // Serialize writes with the image-frame sender
            synchronized (this) {
                mOut.write(intToBytes(2));
                mOut.write(intToBytes(data.length));
                mOut.write(longToBytes(System.currentTimeMillis()));
                mOut.write(data);
                mOut.flush();
            }
        } catch (Exception ignored) {}
    }
 
    // ... (int/long/float <-> byte[] conversion helpers, omitted)
}
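The conversion helpers omitted above could be implemented with java.nio.ByteBuffer, which is big-endian (network byte order) by default. This is a minimal sketch; the method names match the calls in the services, while the ByteUtils class name is an assumption:

```java
import java.nio.ByteBuffer;

/** Big-endian (network byte order) conversion helpers used by the frame protocol. */
public class ByteUtils {
    static byte[] intToBytes(int v)     { return ByteBuffer.allocate(4).putInt(v).array(); }
    static byte[] longToBytes(long v)   { return ByteBuffer.allocate(8).putLong(v).array(); }
    static byte[] floatToBytes(float v) { return ByteBuffer.allocate(4).putFloat(v).array(); }

    static int   bytesToInt(byte[] b, int off)   { return ByteBuffer.wrap(b, off, 4).getInt(); }
    static long  bytesToLong(byte[] b, int off)  { return ByteBuffer.wrap(b, off, 8).getLong(); }
    static float bytesToFloat(byte[] b, int off) { return ByteBuffer.wrap(b, off, 4).getFloat(); }

    public static void main(String[] args) {
        byte[] b = intToBytes(0x01020304);
        System.out.println(bytesToInt(b, 0) == 0x01020304); // true
    }
}
```

Because both ends use the same fixed byte order, a frame encoded on the controller can be decoded byte-for-byte on the controlled end regardless of the device's native endianness.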
 
/**************************  RemoteControlService.java  **************************/
// NOTE: the package name was missing from the original; "com.example.screencast" is a placeholder.
package com.example.screencast;
 
import android.accessibilityservice.AccessibilityService;
import android.accessibilityservice.GestureDescription;
import android.graphics.Path;
import android.view.accessibility.AccessibilityEvent;
 
import java.io.BufferedInputStream;
import java.net.ServerSocket;
import java.net.Socket;
 
/*
  * RemoteControlService (extends AccessibilityService)
  * 1. Starts a ServerSocket to accept the controller's connection
  * 2. Loops reading the frame header and payload
  * 3. Distinguishes image frames from event frames and dispatches them
  */
public class RemoteControlService extends AccessibilityService {
    private ServerSocket mServerSocket;
    private Socket mClient;
    private BufferedInputStream mIn;
    private volatile boolean mRunning;
 
    @Override
    public void onServiceConnected() {
        super.onServiceConnected();
        new Thread(this::startServer).start();
    }
 
    /** Start the server socket */
    private void startServer() {
        try {
            mServerSocket = new ServerSocket(8888);
            mClient = mServerSocket.accept();
            mIn = new BufferedInputStream(mClient.getInputStream());
            mRunning = true;
            while (mRunning) {
                handleFrame();
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
 
    /** Process one data frame */
    private void handleFrame() throws Exception {
        // Read the full 16-byte header; a single read() may return fewer bytes
        byte[] header = new byte[16];
        int got = 0;
        while (got < header.length) {
            got += mIn.read(header, got, header.length - got);
        }
        int type = bytesToInt(header, 0);
        int len = bytesToInt(header, 4);
        // long ts = bytesToLong(header, 8);
 
        byte[] payload = new byte[len];
        int read = 0;
        while (read < len) {
            read += mIn.read(payload, read, len - read);
        }
 
        if (type == 1) {
            // Image frame: decode and render to the SurfaceView
            handleImageFrame(payload);
        } else if (type == 2) {
            // Touch event: simulate it
            handleTouchEvent(payload);
        }
    }
 
    /** Decode the JPEG and update the UI (communicated via Broadcast or Handler) */
    private void handleImageFrame(byte[] data) {
        // ... (omitted: decode to a Bitmap and post it to the SurfaceView)
    }
 
    /** Parse the event per the protocol and inject it with dispatchGesture */
    private void handleTouchEvent(byte[] data) {
        int action = data[0];
        float x = bytesToFloat(data, 1);
        float y = bytesToFloat(data, 5);
        // long t = bytesToLong(data, 9);
 
        Path path = new Path();
        path.moveTo(x, y);
        GestureDescription.StrokeDescription sd =
                new GestureDescription.StrokeDescription(path, 0, 1);
        dispatchGesture(new GestureDescription.Builder().addStroke(sd).build(),
                        null, null);
    }
 
    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {}
 
    @Override
    public void onInterrupt() {}
 
    // ... (bytesToInt/bytesToFloat/bytesToLong helpers, omitted)
}
<!-- AndroidManifest.xml -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.screencast"> <!-- placeholder package name -->
    <uses-permission android:name="android.permission.INTERNET"/>
    <uses-permission android:name="android.permission.FOREGROUND_SERVICE"/>
    <uses-permission android:name="android.permission.SYSTEM_ALERT_WINDOW"/>
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
    <application
        android:allowBackup="true"
        android:label="ScreenCast">
        <activity android:name=".MainActivity">
            <intent-filter>
                <action android:name="android.intent.action.MAIN"/>
                <category android:name="android.intent.category.LAUNCHER"/>
            </intent-filter>
        </activity>
        <service android:name=".RemoteControlService"
                 android:permission="android.permission.BIND_ACCESSIBILITY_SERVICE">
            <intent-filter>
                <action android:name="android.accessibilityservice.AccessibilityService"/>
            </intent-filter>
            <meta-data
                android:name="android.accessibilityservice"
                android:resource="@xml/accessibility_service_config"/>
        </service>
    </application>
</manifest>
<!-- activity_main.xml -->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:orientation="vertical" android:layout_width="match_parent"
    android:layout_height="match_parent" android:gravity="center">
    <Button
        android:id="@+id/btn_start"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Start Screen Sharing"/>
    <Button
        android:id="@+id/btn_stop"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Stop Service"/>
    <SurfaceView
        android:id="@+id/surface_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent"/>
</LinearLayout>

5. Code interpretation

  1. MainActivity

    • Requests and handles user authorization, then creates and binds the VirtualDisplay;

    • Starts ScreenShareService, which is responsible for capturing and sending;

    • Overrides onTouchEvent and forwards touch events to the service.

  2. ScreenShareService

    • Establish a TCP connection in a background thread;

    • Loops reading frames from the ImageReader, converts them to Bitmaps, compresses them, and sends them over the Socket;

    • Listens for the controller's touch events, packaging them into event frames and sending them.

  3. RemoteControlService

    • Starts as an accessibility service and listens on a port for incoming data;

    • Reads the frame header and payload, dispatching to image handling or touch handling by frame type;

    • During touch handling, uses dispatchGesture to inject the gesture trajectory, realizing remote control.

  4. Layout and permissions

    • AndroidManifest.xml declares the necessary permissions and the accessibility service;

    • activity_main.xml is a simple layout containing the buttons and a SurfaceView for rendering.

6. Project Summary

Through this project we have implemented screen sharing and remote control between two devices on the Android platform, making combined use of the following key technologies:

  • MediaProjection API: Native screen capture and virtual display creation;

  • Socket Programming: Design frame protocol to achieve efficient and reliable bidirectional transmission of images and events;

  • Image encoding/decoding: Compress the screen frame to JPEG to balance the clarity and bandwidth;

  • Accessibility service: inject touch events via dispatchGesture to complete the remote control;

  • Multithreading: use HandlerThread to keep capture, encoding, and transmission real-time without blocking the UI.

This solution has the following expansion directions:

  1. Audio synchronization: Transfer microphone or system audio while screen sharing.

  2. Video codec optimization: Introduced hardware H.264 encoding for lower latency and higher compression rates.

  3. Cross-platform support: Implement corresponding clients on platforms such as iOS and Windows.

  4. Security enhancement: Add TLS/SSL encryption to prevent man-in-the-middle attacks; verify device identity.

The above is the detailed content of implementing screen sharing and remote control between two Android phones.
