NatCorder - Video Recording API

You could consider writing a native plugin to perform the concatenation. I don’t have any plans for building an FFmpeg wrapper.

Yes, I wrote NatCorder. Why do you ask?

Did you try out the iOS library I shared? If so, can you confirm that you are no longer losing frames?

When I try to record video with audio, I get an invalid MP4 file (about 80 KB). If I set audioChannels to zero, it records just fine. Here is my code:

        // Create the media recorder
        mediaRecorder = new MP4Recorder(filename, Screen.width, Screen.height, 30, 44100, 2, recordingEnded, 5909760 * 2, 3);
        // Create a camera input to record the main camera
        cameraInput = new CameraInput(mediaRecorder, clock, MainCamera);
        // Create an audio input to record the scene's AudioListener
        audioInput = new AudioInput(mediaRecorder, clock, Audio);

Any ideas?

What device is this happening on? What is your Audio variable? Also, it looks like you have modified MP4Recorder (I highly, highly recommend against this). I also recommend never recording at screen resolution, since screen resolutions can be pretty high, and you shouldn’t hard-code the sample rate and channel count when recording from Unity. Get the values from Unity itself (see the AudioSettings class).
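As a sketch of that advice, the hardcoded 44100/2 in the snippet above can be replaced with the values Unity actually uses (the constructor shape mirrors the user's code and may differ between NatCorder versions):

```csharp
using UnityEngine;

// Query Unity's actual audio configuration instead of hardcoding values.
// AudioSettings.outputSampleRate is the engine's output sample rate, and
// AudioSpeakerMode casts to the channel count (Stereo == 2, Mono == 1).
int sampleRate = AudioSettings.outputSampleRate;   // e.g. 48000 on many Android devices
int channelCount = (int)AudioSettings.speakerMode; // e.g. 2 for stereo

// Constructor arguments follow the user's snippet above; verify against
// your NatCorder version before using.
mediaRecorder = new MP4Recorder(filename, Screen.width, Screen.height, 30, sampleRate, channelCount, recordingEnded);
```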

I’m using the ReplayCam example.

I record a video using the Android camera, and the path gets saved. Then I record a second video, and that path gets saved.

I want to append the second video to the first video. So I decide to start recording from a texture, play the first video to a texture, then play the second video to the texture, then stop recording.

This should record both videos into one video, appending the second video to the end of the first.

But when I stop recording, it crashes. Do you have any idea what I’m doing wrong?

Here is my code below. The code before this is the same as came with the package.

After it records the second video, I call PlayAndRecord(firstPath, secondPath) below.

It goes through that whole function and then crashes after it leaves StopWholeRecord, as you can see from the crash log at the bottom.

Thanks

        public IEnumerator PlayAndRecord(string path1, string path2)
        {
            Debug.Log("PLAY AND RECORD:v1="+path1 + " v2="+path2);
            yield return new WaitForEndOfFrame();
            yield return new WaitForSeconds(0.5f);
            yield return new WaitForEndOfFrame();

            StartWholeRecord();                  // START RECORDING
            yield return new WaitForEndOfFrame();
            yield return new WaitForSeconds(0.5f);
            yield return new WaitForEndOfFrame();
            Debug.Log("RECORD STARTED");

            PlayVideo(path1);                  // PLAY BACK THE FIRST VIDEO
            yield return new WaitForEndOfFrame();
            yield return new WaitForSeconds(0.5f);
            yield return new WaitForEndOfFrame();
            Debug.Log("FIRST VIDEO DONE");

            PlayVideo(path2);              // PLAY BACK THE SECOND VIDEO
            yield return new WaitForEndOfFrame();
            yield return new WaitForSeconds(0.5f);
            yield return new WaitForEndOfFrame();
            Debug.Log("SECOND VIDEO DONE");

            StopWholeRecord();      // STOP RECORDING
        }

        public void StartWholeRecord()
        {
            Debug.Log("START WHOLE VIDEO RECORD");
            recordingClock = new RealtimeClock();
            videoRecorder = new MP4Recorder(
                videoWidth,
                videoHeight,
                30,
                recordMicrophone ? AudioSettings.outputSampleRate : 0,
                recordMicrophone ? (int)AudioSettings.speakerMode : 0,
                OnRenderedWholeVideo
            );
            // Create recording inputs
            renderInput = new RenderTextureInput(videoRecorder, recordingClock);
            if (recordMicrophone) {
                StartMicrophone();
                audioInput = new AudioInput(videoRecorder, recordingClock, microphoneSource, true);
            }
        }

        public void PlayVideo(string path)
        {
            Debug.Log("PLAY VIDEO IN:" + path);
#if UNITY_EDITOR
            EditorUtility.OpenWithDefaultApp(path);
#elif UNITY_IOS
            Handheld.PlayFullScreenMovie("file://" + path);
#elif UNITY_ANDROID
            Handheld.PlayFullScreenMovie(path);
#endif
            Debug.Log("PLAY VIDEO OUT:" + path);
        }

        public void StopWholeRecord()
        {
            Debug.Log("STOP WHOLE VIDEO RECORD IN");
            // Stop the recording inputs
            if (recordMicrophone) {
                Debug.Log("stopping mic...");
                StopMicrophone();
                Debug.Log("...mic stopped");
                audioInput.Dispose();
                Debug.Log("...audio disposed");
            }
            Debug.Log("disposing renderer...");
            renderInput.Dispose();
            Debug.Log("...renderer disposed");
            // Stop recording
            videoRecorder.Dispose();
            Debug.Log("STOP WHOLE VIDEO RECORD OUT");
        }

        private void OnRenderedWholeVideo(string path)
        {
            Debug.Log("WHOLE VIDEO RENDERED:" + path);
            lastPath = path;
            currentPath = null;
        }

(Filename: ./Runtime/Export/Debug.bindings.h Line: 45)
08-25 20:17:20.652 15045-15060/? I/Unity: STOP WHOLE VIDEO RECORD OUT

(Filename: ./Runtime/Export/Debug.bindings.h Line: 45)
08-25 20:17:20.661 15045-15251/? V/Unity: NatRender: Released GLRenderContext
08-25 20:17:20.661 15045-15250/? V/Unity: NatCorder: MP4 video encoder changed output format: {csd-1=java.nio.HeapByteBuffer[pos=0 lim=8 cap=8], mime=video/avc, frame-rate=30, width=720, height=1280, color-standard=1, color-range=2, bitrate=5909760, csd-0=java.nio.HeapByteBuffer[pos=0 lim=21 cap=21], color-transfer=3, max-bitrate=5909760}
08-25 20:17:20.669 15045-15250/? V/Unity: NatCorder: MP4 video encoder encountered EOS
08-25 20:17:20.704 15045-15250/? E/Unity: NatCorder Error: Failed to stop MP4Recorder
java.lang.IllegalStateException: Failed to stop the muxer
at android.media.MediaMuxer.nativeStop(Native Method)
at android.media.MediaMuxer.stop(MediaMuxer.java:254)
at com.yusufolokoba.natcorder.MP4Recorder$10.run(MP4Recorder.java:287)
at java.lang.Thread.run(Thread.java:761)


Greetings, there are two topics that I would like to ask about.

The first is whether there is any update on the audio problems, where recording with audio results in corrupted files and the editor crashing?

My other concern is better understood with the attached image. As you can see, a green line appears at the bottom when recording. I’ve run several tests and can’t find out what is causing it. Do you have any insight into this issue?

Thanks for your help.

You can’t use Handheld.PlayFullScreenMovie to play back the videos, because the Unity engine gets suspended and a native view plays the video. To record with NatCorder, you have to play the videos back inside Unity, using a VideoPlayer or something similar. The error you get happens because you start and stop recording without ever recording any frames (Unity is paused while Handheld.PlayFullScreenMovie plays the video).
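A minimal sketch of that approach, assuming a VideoPlayer rendering into the same RenderTexture the recorder reads from; StartWholeRecord and StopWholeRecord are the helpers from the question above, and the `player`/`recordTexture` fields are illustrative:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Video;

// Assumed to be set up elsewhere in the same script as the question's helpers.
VideoPlayer player;          // a VideoPlayer component in the scene
RenderTexture recordTexture; // the texture the recorder's RenderTextureInput reads

// Play one file inside Unity so the engine keeps rendering
// (and the recorder keeps receiving frames) during playback.
IEnumerator PlayClip(string path)
{
    player.renderMode = VideoRenderMode.RenderTexture;
    player.targetTexture = recordTexture;
    player.source = VideoSource.Url;
    player.url = "file://" + path;
    player.Prepare();
    while (!player.isPrepared)   // wait until the clip is ready
        yield return null;
    player.Play();
    while (player.isPlaying)     // wait for the clip to finish
        yield return null;
}

public IEnumerator PlayAndRecord(string path1, string path2)
{
    StartWholeRecord();               // begin recording (from the question)
    yield return PlayClip(path1);     // first clip, rendered in-engine
    yield return PlayClip(path2);     // second clip, appended
    StopWholeRecord();                // stop; frames flowed the whole time
}
```

The key difference from the crashing version is that Unity never suspends, so the RenderTextureInput keeps committing frames between start and stop.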

What platform are you seeing this on?

Can I record at 720×1280 or 1080×1920?

Yes. The orientation doesn’t matter, so you can record at w×h or h×w.

  1. Audio holds the (only) AudioListener in the scene
  2. I just changed how MP4Recorder handles the filename
  3. I tried using AudioSettings; same results (80 KB file)
  4. I tried reducing the resolution; same results

I attached a Unity VideoPlayer to the preview GameObject in the ReplayCam demo, and it’s working now. Thanks!


What device is this happening on?

Any news about the black frames we get on the Samsung S7 Edge when using ARCore and not Vuforia?

Thanks,

I use this simple code, and when I start recording the app freezes for 1–1.5 seconds. Device: Samsung A30.

clock = new RealtimeClock();
videoRecorder = new MP4Recorder(1080, 1920, 30, 0, 0, OnRecordStop);
cameraInput = new CameraInput(videoRecorder, clock, Camera.main);
audioInput = new AudioInput(videoRecorder, clock, Camera.main.GetComponent<AudioListener>());

Can you email me? I’d like to send an updated Android library with a potential fix.

Can you upload the full, unfiltered logs from logcat in a .txt file?

EDIT: Also, you are creating an audio input even though you have not specified a sample rate or channel count.
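For reference, a hedged sketch of what the setup might look like with audio actually enabled; the constructor shape is taken from the snippet above, and the AudioSettings values follow the earlier advice in this thread (verify against your NatCorder version):

```csharp
using UnityEngine;

// If an AudioInput is attached, the recorder should be given a real
// sample rate and channel count up front, rather than 0/0.
clock = new RealtimeClock();
videoRecorder = new MP4Recorder(
    1080, 1920, 30,
    AudioSettings.outputSampleRate,   // engine's actual sample rate, e.g. 48000
    (int)AudioSettings.speakerMode,   // channel count, e.g. Stereo == 2
    OnRecordStop
);
cameraInput = new CameraInput(videoRecorder, clock, Camera.main);
audioInput = new AudioInput(videoRecorder, clock, Camera.main.GetComponent<AudioListener>());
```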

Hi,
I’m using NatCorder and NatShare. Both seem to work well. Could you tell me if you know how to compress the recorded GIFs (without modifying dimensions or frame skip)?
Thanks a lot

I don’t think this is possible. We don’t offer any ‘compression’ parameters.