Lib Graf
May 2018

A library that takes a continuous media source [camera, video, GStreamer byte arrays], applies OpenGL filters [vertex + fragment shaders], lets you configure how the input is rendered in 3D space (shape, size and rotation), and directs the result to one or multiple Surfaces on Android. The output can then be streamed with GStreamer, or any other streaming library, from an offscreen Surface's callback.

The best explanation of how and why this was made is in the library's README, hosted internally on Tonbo's GitHub.

Flow Chart


           MediaSource
                 |
   Attach [Multiple] Output Surfaces
                 |
         Provide a Filter
                 |
         Provide a Renderer
                 |
           Send to Display
                 |
     Callback from Surface (if any)
                 |
   Send data from callback to GStreamer

Each surface is an output destination to which the media source's frames are rendered. You can provide custom filters and renderers.

Media Source Types

A Media Source is any data source that writes to a SurfaceTexture.

  • CameraV1Source
  • CameraV2Source
  • MediaPlayer
  • Any data source writing to a SurfaceTexture. This could be a custom data source created by you; to write its frames you must use glTexImage2D (see the sketch after this list).
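
As an illustration of the last bullet, here is a minimal sketch of a custom data source that uploads raw RGBA frames to a GL texture with glTexImage2D. The class name ByteBufferSource is hypothetical, it assumes a current EGL context, and wiring the resulting texture into the library is not shown.

    import android.opengl.GLES20
    import java.nio.ByteBuffer

    // Hypothetical custom data source: uploads raw RGBA frames to a GL texture.
    class ByteBufferSource(private val width: Int, private val height: Int) {

        // Texture id holding the most recently uploaded frame.
        var textureId: Int = 0
            private set

        fun createTexture() {
            val ids = IntArray(1)
            GLES20.glGenTextures(1, ids, 0)
            textureId = ids[0]
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId)
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR)
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR)
        }

        // Call once per frame with width * height * 4 bytes of RGBA data.
        fun uploadFrame(rgba: ByteBuffer) {
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId)
            GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
                width, height, 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, rgba)
        }
    }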

Output Surfaces

The library provides support for the following output surfaces by default:

  • SurfaceView: you can attach/detach SurfaceViews at runtime to increase or decrease the number of surfaces being written to.
  • MediaCodec (see the sketch after this list)
    • MediaCodec Surface with MediaMuxer: use this when you want to record a video to a file.
    • MediaCodec Surface with a data callback: use this when you want a callback with the encoded data as the MediaCodec Surface encodes the media source's frames.
  • Offscreen Surface: useful when you want to render to an offscreen surface, i.e. any surface that isn't backed by a View or a MediaCodec. The library allows you to asynchronously read frames back from this surface internally using a PBO.
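
For context on the two MediaCodec options, below is a minimal sketch of the standard Android setup for an encoder that exposes an input Surface the library can render into. This is stock MediaCodec API, independent of the library; draining the encoded output into a MediaMuxer or a data callback is omitted.

    import android.media.MediaCodec
    import android.media.MediaCodecInfo
    import android.media.MediaFormat
    import android.view.Surface

    // Creates an H.264 encoder whose input Surface can be used as an output destination.
    fun createEncoderSurface(width: Int, height: Int): Pair<MediaCodec, Surface> {
        val format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height).apply {
            setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface)
            setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000)
            setInteger(MediaFormat.KEY_FRAME_RATE, 30)
            setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1)
        }
        val codec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC)
        codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
        val inputSurface = codec.createInputSurface()   // frames rendered to this Surface get encoded
        codec.start()
        return codec to inputSurface
    }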

Filters

These are OpenGL ES shader programs. Each filter has a vertex and a fragment shader. The library ships with a list of ready-made filters, several of which are re-used from Brad Larson's GPUImage library.
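
Under the hood, a filter's vertex and fragment shaders are compiled and linked into a standard GLES program. The sketch below uses only the stock android.opengl.GLES20 calls and is independent of the library's own filter classes.

    import android.opengl.GLES20

    // Compiles a vertex + fragment shader pair into a GL program, which is what a
    // filter boils down to. Error handling is reduced to simple exceptions.
    fun buildProgram(vertexSource: String, fragmentSource: String): Int {
        fun compile(type: Int, source: String): Int {
            val shader = GLES20.glCreateShader(type)
            GLES20.glShaderSource(shader, source)
            GLES20.glCompileShader(shader)
            val status = IntArray(1)
            GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, status, 0)
            if (status[0] == 0) throw RuntimeException(GLES20.glGetShaderInfoLog(shader))
            return shader
        }

        val program = GLES20.glCreateProgram()
        GLES20.glAttachShader(program, compile(GLES20.GL_VERTEX_SHADER, vertexSource))
        GLES20.glAttachShader(program, compile(GLES20.GL_FRAGMENT_SHADER, fragmentSource))
        GLES20.glLinkProgram(program)
        val linked = IntArray(1)
        GLES20.glGetProgramiv(program, GLES20.GL_LINK_STATUS, linked, 0)
        if (linked[0] == 0) throw RuntimeException(GLES20.glGetProgramInfoLog(program))
        return program
    }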

Creating your own Filters
  • If you only want to change the fragment shader, create a class that extends BaseFragmentShader, which provides a mediaTextureId: the texture id of the data source's current frame.
  • If you want to change both the fragment and vertex shaders, extend BaseVertexShader.
  • If you want to use the default renderer, your fragment shader must refer to the mediaTextureId sampler as sTexture in the program (see the example after this list).
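
A minimal example of a custom fragment shader that follows the sTexture naming convention, written as a Kotlin string constant. It assumes the media texture is a GL_TEXTURE_EXTERNAL_OES texture (typical for SurfaceTexture-backed sources), and the varying name vTextureCoord is an assumption that must match your vertex shader; the grayscale weights are just an example effect.

    // Grayscale fragment shader sampling the media frame through the sTexture name
    // expected by the default renderer. Assumes an external OES media texture.
    const val GRAYSCALE_FRAGMENT_SHADER = """
        #extension GL_OES_EGL_image_external : require
        precision mediump float;
        varying vec2 vTextureCoord;          // assumed name, passed in from the vertex shader
        uniform samplerExternalOES sTexture; // the media source's current frame
        void main() {
            vec4 color = texture2D(sTexture, vTextureCoord);
            float gray = dot(color.rgb, vec3(0.299, 0.587, 0.114));
            gl_FragColor = vec4(vec3(gray), color.a);
        }
    """.trimIndent()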

Renderers

A renderer controls how the frame is drawn to the surface. The library provides a Sprite3d helper class that lets you move the video sprite in 3D using simple builder functions.
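
Sprite3d's own builder API is not shown here, but moving a sprite in 3D ultimately comes down to building a model matrix. The helper below is illustrative only (not the library's API) and uses the stock android.opengl.Matrix class.

    import android.opengl.Matrix

    // Illustrative only: builds a model matrix that positions, rotates and scales a
    // quad in 3D. Sprite3d's builder functions presumably express this kind of transform.
    fun buildModelMatrix(
        x: Float, y: Float, z: Float,
        angleDegrees: Float,
        scaleX: Float, scaleY: Float
    ): FloatArray {
        val model = FloatArray(16)
        Matrix.setIdentityM(model, 0)
        Matrix.translateM(model, 0, x, y, z)                 // position in 3D space
        Matrix.rotateM(model, 0, angleDegrees, 0f, 0f, 1f)   // rotate around the z axis
        Matrix.scaleM(model, 0, scaleX, scaleY, 1f)          // size of the sprite
        return model
    }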

Creating your own Renderer
  • Extend BaseRenderer.
  • Provide your Filter with the mediaTextureId in the onSurfaceCreated callback.
  • Define your rendering defaults in the onSurfaceChanged callback.
  • Set your viewport and draw your sprite in onDraw; this is called every time the media source provides a frame (see the sketch below).
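
Putting the three callbacks together, here is a rough sketch of a custom renderer. BaseRenderer's exact signatures are assumptions, so the class below stands alone rather than extending it; the commented calls mark where the library's Filter and Sprite3d objects would be used.

    import android.opengl.GLES20

    // Rough sketch of the renderer callbacks described above; not the library's
    // actual BaseRenderer contract.
    class SimpleRenderer {

        private var width = 0
        private var height = 0

        // Surface exists: hand the media texture id to your Filter here.
        fun onSurfaceCreated(mediaTextureId: Int) {
            // filter.setMediaTextureId(mediaTextureId)  // hypothetical Filter call
        }

        // Surface size is known: record it and set rendering defaults.
        fun onSurfaceChanged(newWidth: Int, newHeight: Int) {
            width = newWidth
            height = newHeight
            GLES20.glClearColor(0f, 0f, 0f, 1f)
        }

        // Called every time the media source provides a frame.
        fun onDraw() {
            GLES20.glViewport(0, 0, width, height)
            GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT)
            // sprite.draw(filter)  // hypothetical: draw the textured quad with the filter's program
        }
    }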