[Release] FMETP STREAM 6: GameView | Mic | Audio | Remote Desktop Control | CrossPlatforms: Vision Pro, HoloLens 2, AR, VR, WebGL and more

+ Please try out our latest major update → FMETP STREAM 6

With FMETP STREAM 6, you can integrate the Live Stream feature into your projects immediately.

  • Setup and testing take just 5 minutes.

  • Encoder Module is the simplest way to understand how it works:

Encode (sender) → Network System (send bytes) → Decode (receiver)
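As a rough illustration of that Encode → Network → Decode flow (hypothetical names, not the actual FMETP API; an in-memory queue stands in for the network layer):

```python
import queue
import struct

# Stand-in for the network layer (FM Network / FM WebSocket would go here).
network = queue.Queue()

def encode_and_send(label: int, payload: bytes) -> None:
    """Encoder side: prefix the payload with a label so receivers can route it."""
    packet = struct.pack("<H", label) + payload
    network.put(packet)

def receive_and_decode() -> tuple[int, bytes]:
    """Decoder side: strip the label and hand the payload to the matching decoder."""
    packet = network.get()
    (label,) = struct.unpack("<H", packet[:2])
    return label, packet[2:]

# A JPEG frame starts with the 0xFFD8 marker; the rest is elided here.
encode_and_send(1001, b"\xff\xd8...jpeg frame bytes...")
label, frame = receive_and_decode()
```

The label is what lets one receiver host several decoders (video, audio, custom bytes) side by side.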


What’s New in FMETP STREAM 6?

  • This is a major update from FMETP STREAM 4.0: the core has been rewritten in C++ and fully optimised for Unity 6 URP.

  • The Core SDK is compiled for almost every platform, including the latest visionOS and UWP (ARM64), and is fully optimised for WebGL (WASM).

Here are the key changes:

  • FM-VP8 video stream support on all platforms

  • FM-OPUS audio stream support on all platforms

  • FM-YUV GPU support on all platforms

  • libturbojpeg 3.x for MJPEG support, enabling further options for compression quality and speed, plus WebGL build support

  • FM Desktop 2: major improvements in latency and performance

  • FM Network 6: optimisations and bug fixes

  • FM WebSocket 6: major improvements on WebGL

  • MainCam Capture Mode now supports URP; you no longer need RenderCam mode for VR!

  • Tested on Vision Pro and HoloLens 2


Flexible networking systems for many use cases

  • FM Network: the best low-latency networking system for local networks

  • FM WebSocket: host your own game server with customised room names

Stream your data via single command: SendToOthers()

  • 3rd-party networking system support: PUN2/Photon Fusion (RPC), Mirror, UNet, and any networking system that supports byte streaming!

Benefits of our SDK

  • A solid solution for many projects since 2018.

  • Widely integrated into public exhibitions, museums, art galleries, future cinema systems and AR/VR/MR/interactive projects.

  • Saves you 6~12+ months of development time; great for AR/VR startups, indie developers, researchers and students.


+ More templates for FMETP STREAM

  • Quest 2 Template updated (stable 120 FPS in game) in Unity 2022.3.8f1 with Oculus SDK v57
  • HoloLens Template

mrtk3: GitHub - frozenmistadventure/fmetp_tutorial_hololens2_mrtk3

mrtk2: GitHub - frozenmistadventure/fmetp_tutorial_hololens2_stream


+ Tutorials (3.0)

  • FM Network Stream(Basic)

youtu.be/_sj_jUzSuwo

  • FM WebSocket Stream(Basic)

youtu.be/RNBhnx1T-c8

+ Tutorials (2.0)

Old testing videos in FMETP STREAM V2:

Macbook Pro → M1 iMac: youtu.be/NKZNuVSSmms

Tested devices in our public projects:

Apple Vision Pro (2024)

Apple MacBook (2017)

Apple MacBook Pro (2015)

Apple MacBook Pro (2018)

Apple MacBook Pro (M1 Max)

Apple MacBook Pro (M3 Max)

Apple Mac Studio (M1 Ultra)

Apple Mac Studio (M2 Ultra)

Apple iMac (M1)

iPhone 5

iPhone 6s

iPhone XS

iPhone 12 Pro

iPhone 14 Pro

iPhone 15 Pro

iPhone 16 Pro

iPhone SE (2022)

iPad Air 2

iPad Pro (2017)

iPad Pro (2018)

iPad Pro (2019)

iPad Pro (M2)

iPad mini (2nd gen)

iPad mini (6th gen)

Samsung S4

Samsung S6

Samsung S9+

PC with GTX1080, RTX 2080

Razer Blade 15(2021) RTX3060

Windows 7

Windows 10

Windows 11

Ubuntu 21, 22

HTC Vive Pro

HTC Vive Focus

Oculus Go

Oculus Rift

Hololens 1 & 2

AVIE 360 Projection System

Future Cinema System

Oculus Quest 1 & 2

Stereo Pi (v1 & v2)

*HoloLens & Magic Leap (requires build with .NET 4.x (IL2CPP))

*Pico VR

*Nreal Light

*Huawei Android Pad

Supported platforms:

iOS, Android, Mac, PC, WebGL, Linux

######################

*tested by customers

######################

This asset uses websocket-sharp under the MIT License (MIT); SharpZipLib under the MIT License (MIT); libjpeg-turbo under the Modified BSD License (this software is based in part on the work of the Independent JPEG Group); plus libvpx and libopus. See the Third-Party Notices.txt file in the package for details.


We just submitted an update to the Asset Store, and the new feature will be available in v1.02 very soon.
[NEW!] Network Action (Beta)

  • Custom RPC networking solution with TCP and UDP
  • Sync / Remote trigger for multiple clients
  • Simple Example Scene provided


Hi, do you have any experience streaming from a mobile VR headset? We are looking for a solution that can do this without compromising the framerate on the headset; we are currently using an Oculus Quest.

Thanks for reaching out!
I assume you want to mirror the headset view to other screens (PC/tablet). We mainly test between iOS/Android/Mac/Win10, and even an iPhone 5 can stream the webcam/game view without problems.

For performance, it depends on a few things:

  1. hardware side: CPU / network card / Wi-Fi router
  2. image quality: resolution / compression (file size)

We tried GearVR with a Samsung S9+ and it works fine. Unfortunately, we don't have an Oculus Quest in our lab right now, but we will order one soon.
To address your concern, we will run some tests with an Oculus Go in the coming few days.

We ran game view streaming on an Oculus Go at a resolution of 1024x576 (streaming FPS: 20).
It costs about 5 ms of headset frame time. I believe the Oculus Quest has a better processor than the Go, so performance should be better too.

The headset view streams wirelessly to an iPhone XS / MacBook Pro almost in real time, without noticeable delay.
The latency with my plugin is much, much lower than the "2~4 second delay" of Oculus Go wired mirroring stated on their official website:
https://developer.oculus.com/blog/oculus-go-wired-mirroring-how-to/

I will release a GameViewStreaming demo very soon, tested with the Oculus Go.

I have a similar question regarding VR streaming: in my case, I want to stream the content of a desktop Unity app to an instance of another Unity app running on a mobile VR headset (Go or Quest); will your plugin satisfy this use case? More specifically, does your plugin provide a way to render the stream on an arbitrary surface?

We just submitted the version 1.03 update to the Asset Store; hopefully it will be available soon.
-Added Game View Streaming; tested VR headset: mirroring the Oculus Go game view to a MacBook Pro & iPhone XS
-Optimised connection (server)
-Raw byte[] can now be sent in Network Action (custom RPC networking solution)

Any questions from you guys help us understand your needs! We will test and take action immediately where possible.

It should work. For your use case, I need to test the performance on an Oculus Go in the office next week.
What target streaming resolution & FPS would satisfy you?

PS: As long as your server & clients are on Mac/PC/iOS/Android, they should work, but image quality and performance depend on your hardware. Usually, decoding the stream on the client side demands more resources.

I am looking to stream content on an Oculus Go/Quest at 640x720, 60 FPS at least. I have written a contrived setup that allows me to stream still frames over the net and I get around 20-ish frames per second, but it wastes a lot of bandwidth.

I personally don't believe the Oculus Go can achieve 60 FPS game view streaming, as it's a very cheap VR headset. Maybe the Quest can, but I don't own one at the moment.

Would you mind sharing your current bandwidth/data size per frame?

Hi, I wonder if the network solution can also be used for AR applications. Is the RTMP protocol supported during Live Stream? For example, can I use a mobile broadcast to transfer to Wowza or Dacast servers? Or what else do I have to do for this? I would be very happy if you could help me with this subject. Will you also prepare a video sample for the new network feature?

I own them both, and as far as I can tell there's no huge improvement; it's also true that I haven't tested my system on the Quest yet.

Each frame is 160,000 bytes long tops, packed as JPEGs with 80% compression quality at 640x720. Theoretically a second should be around 9.6 MB of data (60 frames per second), and a full 30-minute session around 17 GB; of course this is a ridiculous amount of data that should (could?) be compressed further with a proper codec. The biggest bottleneck so far has been the Texture2D.Apply() function, which is obviously needed to move data from CPU space to GPU space.
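For reference, redoing the arithmetic from the figures above (160,000-byte frames, the stated maximum, at 60 fps):

```python
# Worst-case bandwidth for uncompressed-over-the-wire MJPEG frames.
frame_bytes = 160_000        # ~160 kB per JPEG frame, tops
fps = 60
session_seconds = 30 * 60    # a 30-minute session

bytes_per_second = frame_bytes * fps               # 9,600,000 B/s, i.e. ~9.6 MB/s
session_bytes = bytes_per_second * session_seconds
session_gb = session_bytes / 1e9                   # ~17.3 GB per session
```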

We have two separate demos: "live webcam/game view streaming" and "custom RPC networking (experimental feature)".
However, those features target local networks in our development.
Technically speaking, they are all built on UDP+TCP. As all our code is written in C#, it might be a good starting point for your subject.
We would like to do some research on the RTMP protocol; it's an interesting topic.

PS: please feel free to discuss and hopefully we both can learn something new.

Edited: the WebSocket demo is available now; you can live stream via the Internet with node.js server hosting.

I agree that it needs a proper codec for compression. In general, heavy encoding and decoding will affect performance, which is another challenge for all-in-one headsets.

I suggest targeting 30 FPS for the Oculus Quest, with balanced image quality.
To make the most of your device's hardware limits, your streaming FPS should match your game FPS; it should never be higher than your game FPS.
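The idea of capping the streaming FPS at a target rate can be sketched as a simple throttle (an illustration only, with hypothetical names; not the plugin's implementation):

```python
# Skip encoding a frame unless at least 1/target_fps seconds have elapsed,
# so streaming FPS can never exceed the rate the game actually renders at.
class FrameThrottle:
    def __init__(self, target_fps: float):
        self.interval = 1.0 / target_fps
        self.next_time = 0.0

    def should_send(self, now: float) -> bool:
        """Call once per rendered frame; returns True when a frame may be streamed."""
        if now >= self.next_time:
            self.next_time = now + self.interval
            return True
        return False

throttle = FrameThrottle(target_fps=30)
# Timestamps of rendered frames (seconds); only some pass the throttle.
sent = [t for t in (0.0, 0.01, 0.034, 0.05, 0.07) if throttle.should_send(t)]
```

In a real capture loop `now` would come from the engine clock, and frames failing the check are simply not encoded, which also saves CPU.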

In your case, I imagine your biggest bottleneck is the VR headset, as you can always upgrade your computer with a faster CPU for Texture2D.Apply() if budget allows.

(I've bought your plugin in the meantime; if it doesn't fit this project, I still have another use planned for it.)

Yes, my biggest bottleneck is the VR headset, because Texture2D.Apply() takes too much time to update the screen. It should be possible to stream a hi-def video though, since the guys from Cubicle Ninjas (Guided Meditation VR) are able to play a 5K strips-format 360 video in Unity without any stuttering on the Oculus Go (okay, the source is different since they stream from persistentDataPath, but the texture still has to be updated fast enough somehow).
I will take a look at your code in the meanwhile; maybe there's something I'm missing.
Thank you for your replies!

Loading video from persistentDataPath is a totally different story, as it doesn't need to render images and transfer data over Wi-Fi in real time.

To share a little of my experience:
I had a project last year using AVPro Video with the Oculus Go for normal 5K video playback at 60~72 FPS. If you do not need to render the texture wirelessly in real time, this might be an option for you.

I'm just curious: does the Oculus Quest have a sleep mode after 5 minutes? I gave up on the Oculus Go because of its annoying sleep mode.

Recently, there has been a project for an immersive 360° cylindrical projection system, which requires a video player solution that syncs 6~11 computers. I was using UNet as the networking solution, but it will be deprecated soon, so I started creating a custom RPC networking method to upgrade the player. I wonder if anyone would be interested in a multi-video sync player demo?

And please be reminded that the latest version, 1.03, was submitted just a few hours ago; it will take a few days to get approval on the Asset Store. If it's urgent, you can send me an email with your invoice number.

[Good News!]
As requested by some customers, we plan to add an Audio Streaming demo in future updates, which allows you to capture in-game audio and stream it between any platforms (Mac/PC/iOS/Android).

PS: We are now in the debugging stage, trying to optimise it; hopefully it can be included in the coming updates ASAP.

[v1.04] updated:
A basic audio streaming demo is available on the Asset Store.

[v1.05] submitted & will be available soon:
Optimised the audio streaming demo for different sample rates & channel counts between server & clients.
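Matching audio formats between server and clients generally means two steps: downmixing channels and resampling to the client's rate. A minimal illustrative sketch (assumed names, not the plugin's actual implementation):

```python
def downmix_to_mono(interleaved: list[float], channels: int) -> list[float]:
    """Average interleaved channel samples into a single mono channel."""
    return [
        sum(interleaved[i:i + channels]) / channels
        for i in range(0, len(interleaved), channels)
    ]

def resample_linear(samples: list[float], src_rate: int, dst_rate: int) -> list[float]:
    """Naive linear-interpolation resampler from src_rate to dst_rate."""
    if src_rate == dst_rate:
        return samples[:]
    n_out = int(len(samples) * dst_rate / src_rate)
    out = []
    for j in range(n_out):
        pos = j * src_rate / dst_rate   # fractional position in the source
        i = int(pos)
        frac = pos - i
        nxt = samples[min(i + 1, len(samples) - 1)]
        out.append(samples[i] * (1 - frac) + frac * nxt)
    return out

mono = downmix_to_mono([0.2, 0.4, 0.6, 0.8], channels=2)    # stereo -> mono
half = resample_linear([0.0, 1.0, 0.0, 1.0], 48000, 24000)  # 48 kHz -> 24 kHz
```

Production code would use a proper windowed-sinc resampler, but this shows why mismatched sample rates and channel counts need explicit handling on one side of the stream.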

[v1.061] submitted
-Optimised for Windows 7
-Optimised for older CPUs with fewer than 4 threads
-Testing an async solution on devices to reduce thread usage
-Streaming experiments on VR headsets