Unity simple client-server video streaming using RPC calls to send WebCamTexture

I have created a simple application in Unity that acts as both a client and a server. The server runs on an Android device while the client runs on a laptop/desktop. The client captures a photo from its device webcam, converts it to a JPG byte array, and sends it to the server via an RPC call; the server then renders the texture. This happens quickly and repeatedly, simulating a live video stream. The problem is that I am getting a poor framerate, around 8 frames/s. Can someone help me get it right and improve the framerate to around 30 or above? Here is the algorithm:

1) The client captures a shot from the webcam (using WebCamTexture.GetPixels() in Unity).

2) It converts the frame to JPG using Texture2D.EncodeToJPG().

3) It sends the resulting byte array, which averages 4 KB and never exceeds 4.5 KB.

4) The server loads the texture using Texture2D.LoadImage(textureBytes).

Here is the code (only the relevant portion is included; note it is untested and might not run):

    using System;
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.UI;

    public class Media : MonoBehaviour
    {
        [SerializeField]
        private RawImage videoPanel;        // the RawImage on which the video stream is displayed

        private WebCamTexture liveRec;
        private bool routineRunning;
        private bool texelsLoaded = true;
        private bool noSignal;
        private Texture2D image;
        private byte[] texelsBinary;        // encoded JPG bytes on the client
        private byte[] textureBytes;        // received JPG bytes on the server
        private int qualityControl;         // JPG encoding quality
        private bool asyncLoadRunning;
        private int rpcBufferCount;

        internal static int width;
        internal static int height;

        public int bufferLimit = 60;        // clear buffered RPCs after this many frames

        [SerializeField]
        private Toggle serverswitch;        // assigned in the Inspector
        [SerializeField]
        private Toggle clientswitch;        // assigned in the Inspector

        // Use this for initialization
        void Start()
        {
            texelsBinary = new byte[2];
            textureBytes = new byte[2];

            qualityControl = 70;

            liveRec = new WebCamTexture();
            liveRec.requestedHeight = 1;    // request the lowest resolution the device supports
            liveRec.requestedWidth = 1;
            liveRec.anisoLevel = 1;
            liveRec.mipMapBias = 1.5f;

            // Only the client (laptop/desktop) captures from the webcam
            if (Application.platform != RuntimePlatform.Android) { liveRec.Play(); }

            width = liveRec.width;
            height = liveRec.height;
            image = new Texture2D(liveRec.width, liveRec.height);
        }

        void Update()
        {
            // The application is acting as a server: decode and display incoming frames
            if (serverswitch.isOn && ProceduralLink.isConnected())
            {
                if (!asyncLoadRunning)
                {
                    asyncLoadRunning = true;
                    StartCoroutine(serverAsyncLoader());
                }
            }

            // The application is acting as a client: take photos and send them to the server app on Android
            if (clientswitch.isOn && !routineRunning)
            {
                routineRunning = true;
                StartCoroutine(TakePhoto());
            }
        }

        IEnumerator TakePhoto()
        {
            yield return null;    // wait one frame

            // RPC calls seem to buffer up, so clear them periodically; otherwise
            // the video stream on the server falls far behind the live stream
            if (rpcBufferCount > bufferLimit)
            {
                rpcBufferCount = 0;
                Network.RemoveRPCs(ProceduralLink.clientViewId);
            }

            makeNextShotReady();
            sendTexels();
            rpcBufferCount += 1;

            routineRunning = false;
        }

        // Server-side RPC: receives one JPG frame from the client
        [RPC]
        void loadTexels(byte[] buffer)
        {
            if (!noSignal)
            {
                textureBytes = buffer;
            }
        }

        private void sendTexels()
        {
            try   { GetComponent<NetworkView>().RPC("loadTexels", RPCMode.Server, texelsBinary); }
            catch { return; }
        }

        private void makeNextShotReady()
        {
            image.SetPixels32(liveRec.GetPixels32());       // copy the current webcam frame
            texelsBinary = image.EncodeToJPG(qualityControl);
        }

        IEnumerator serverAsyncLoader()
        {
            yield return null;    // wait one frame
            image.LoadImage(textureBytes);                  // decode the received JPG bytes
            videoPanel.texture = image;
            asyncLoadRunning = false;
        }
    }
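
As an aside on pacing: restarting the coroutine from Update() every frame adds overhead. One possible alternative is a single long-lived coroutine paced at a fixed rate. A minimal sketch, reusing the makeNextShotReady()/sendTexels() helpers above (the 30 fps target is my own assumption, not from the original code):

    IEnumerator CaptureLoop()
    {
        // Hypothetical alternative: one long-lived coroutine paced at a target rate
        var wait = new WaitForSeconds(1f / 30f);    // 30 fps target is an assumption
        while (clientswitch.isOn)
        {
            makeNextShotReady();    // GetPixels32 + EncodeToJPG (must stay on the main thread)
            sendTexels();           // RPC the JPG bytes to the server
            yield return wait;
        }
    }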

Some facts:

1)> I have lowered the camera resolution to the lowest possible setting, and the final encoded JPG is always around 4 KB, so size is not an issue (even at 30 frames/s that is only about 4 KB × 30 ≈ 120 KB/s, roughly 1 Mbit/s).

2)> The server and client apps are connected over LAN, with bandwidth permanently dedicated to them by my home router.

3)> I have searched the internet, and people say Texture2D.LoadImage() is very slow and to use WWW.LoadImageIntoTexture() instead, but I can't use that since I am not uploading the photo from the client to a URL (web server).

4)> I also tried Texture2D.LoadRawTextureData(); it had the same effect: poor performance.

5)> People said to use threads, but I have never used them before. Will they work on an Android device, and how would they be used in this case? (See the sketch after this list.)

6)> Please consider that I am a Bachelor's student in Computer Science (currently in my 5th semester) with no practical experience; this is my first networking project, so I can't afford any plugins. Whatever you explain, please remember that I am a student, not a professional.
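
On point 5: plain System.Threading threads do run on Android in Unity, but the UnityEngine API (Texture2D, WebCamTexture, EncodeToJPG, LoadImage) may only be touched from the main thread, so a worker thread can only handle plain data such as byte[]. A minimal producer/consumer sketch (the FrameWorker class and its names are hypothetical, not from the original post):

    using System.Collections.Generic;
    using System.Threading;
    using UnityEngine;

    // Hypothetical sketch: a background worker that processes raw frame bytes.
    // Only plain data (byte[], Color32[]) may cross the thread boundary;
    // Texture2D.EncodeToJPG / LoadImage cannot be called off the main thread.
    public class FrameWorker : MonoBehaviour
    {
        private readonly Queue<byte[]> pending = new Queue<byte[]>();
        private Thread worker;
        private volatile bool running;

        void Start()
        {
            running = true;
            worker = new Thread(WorkLoop) { IsBackground = true };
            worker.Start();    // System.Threading works on Android (Mono and IL2CPP)
        }

        // Called from the main thread to hand a frame to the worker
        public void Enqueue(byte[] frame)
        {
            lock (pending) { pending.Enqueue(frame); }
        }

        private void WorkLoop()
        {
            while (running)
            {
                byte[] frame = null;
                lock (pending) { if (pending.Count > 0) frame = pending.Dequeue(); }
                if (frame == null) { Thread.Sleep(1); continue; }

                // CPU-only work goes here, e.g. custom compression or a raw socket send.
            }
        }

        void OnDestroy()
        {
            running = false;
            if (worker != null) worker.Join();
        }
    }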

Thank you for your time! Much obliged!

FMETP STREAM is an all-in-one game view streaming solution for almost all platforms.

Tested on: iOS/Android/Mac/PC/VR/AR/WebGL/HTML/Linux/HoloLens/MagicLeap…etc

Asset Store: FMETP STREAM CORE(V1) | Packs | Unity Asset Store

PS: All source code is written in C# and is free to modify for specific cases.

I found the solution to my question. The old Unity legacy networking system had flaws and didn't provide much control over what you do. I switched to UNET (the newer networking architecture in Unity), tweaked some parameters, and I was done. One fact to note: some cameras can't achieve good framerates in low-light conditions because of exposure timings; that problem can't be solved by software tweaking of any sort, unless you can tweak the hardware driver. Here are some of the important parameters you should tweak in the UNET host topology definition. Below is what I've set up for my purposes.

    NetworkTransport.Init();
    ConnectionConfig cc = new ConnectionConfig();

    byte reliableChannel             = cc.AddChannel(QosType.Reliable);
    byte unreliableChannel           = cc.AddChannel(QosType.Unreliable);
    byte fragmentedReliableChannel   = cc.AddChannel(QosType.ReliableFragmented);
    byte fragmentedUnreliableChannel = cc.AddChannel(QosType.UnreliableFragmented);

    cc.PacketSize = 1470;                 // just under the usual 1500-byte Ethernet MTU
    cc.MinUpdateTimeout = 1;              // flush internal send buffers every 1 ms
    cc.SendDelay = 10;                    // delay (ms) before a message is sent
    cc.BandwidthPeakFactor = 100;
    cc.InitialBandwidth = 1536000;        // this is the most important parameter
    cc.FragmentSize = 220;                // fragment size for the fragmented channels
    cc.MaxSentMessageQueueSize = 250;
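
For completeness, a sketch of how such a ConnectionConfig gets wired into a host with the UNET LLAPI (the port number, address, and connection count below are my assumptions, not values from the original setup):

    // Hypothetical wiring (port, address, and connection count are assumptions)
    HostTopology topology = new HostTopology(cc, 1);           // allow a single client
    int hostId = NetworkTransport.AddHost(topology, 7777);     // server listens on port 7777

    // Client side: connect, then push each JPG frame over the fragmented channel,
    // since a ~4 KB frame exceeds the 1470-byte packet size
    byte error;
    int connectionId = NetworkTransport.Connect(hostId, "192.168.1.2", 7777, 0, out error);
    NetworkTransport.Send(hostId, connectionId, fragmentedUnreliableChannel,
                          texelsBinary, texelsBinary.Length, out error);

On the fragmented channels the LLAPI splits each message into FragmentSize-byte fragments and reassembles them on the receiver, which is what lets a frame larger than one packet go through without TCP-style head-of-line blocking.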