AlphaVideoPlayer Technical Design

Published: October 1, 2022
Tags: FED, WeChat Mini Program
Description: A transparent-video playback solution for WeChat Mini Programs

Background

The WeChat Mini Program transparent video player (AlphaVideoPlayer) solves the problem that WeChat cannot play videos carrying an alpha channel (WebM and similar formats).
To support transparent video playback inside Mini Programs, this design draws on the alpha-channel blending scheme of ByteDance's AlphaPlayer and the implementation principles of Penguin E-sports' VAP (Video Animation Player).

Technical Approaches

The VideoDecoder-based decoding approach

Overall architecture:
[architecture diagram]
 
Note: this approach dates from a time when Mini Programs did not support obtaining a VideoContext through selector queries, and handing a VideoContext to Canvas2D still had many problems.
 
Integration steps (a minimal sketch follows this list):
1. Use WeChat's VideoDecoder capability to obtain decoded video frame data (decoding by PTS to keep audio and video in sync).
2. Send each decoded frame to WebGL and blend its alpha channel with a custom shader (see the appendix).
3. Send the decoded audio data to a MediaAudioPlayer.
4. Destroy the corresponding resources when the component is destroyed.
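
A minimal sketch of steps 1 and 4, assuming WeChat's wx.createVideoDecoder API; renderFrame and localVideoPath are hypothetical stand-ins for the WebGL pipeline and a downloaded local file path:

// Sketch of pulling frames by PTS and releasing the decoder afterwards.
declare function renderFrame(pixels: Uint8Array, w: number, h: number): void; // stands in for the WebGL pipeline
declare const localVideoPath: string; // a downloaded local file path

const decoder = wx.createVideoDecoder();

decoder.on('start', () => {
  const tick = () => {
    const frame = decoder.getFrameData(); // returns null until a frame is ready
    if (frame) {
      // frame.data holds RGBA pixels; frame.pkPts is the presentation
      // timestamp used to keep audio and video in sync.
      renderFrame(new Uint8Array(frame.data), frame.width, frame.height);
    }
    setTimeout(tick, 1000 / 24); // poll at roughly the target frame rate
  };
  tick();
});

decoder.on('ended', () => decoder.remove()); // step 4: release the decoder

decoder.start({
  source: localVideoPath, // must be a local file path
  mode: 0, // 0 = decode by PTS (keeps A/V sync), 1 = decode as fast as possible
});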
 
Related APIs: wx.createVideoDecoder (VideoDecoder), wx.createMediaAudioPlayer (MediaAudioPlayer).
Known issues:
  1. Strict requirements on the source video (the bitrate must not be too high; pinning a specific encoder is recommended).
  2. Decoding is demanding on hardware: devices heat up noticeably, and decoding fails outright on some low-end devices.
  3. Requires a recent base library (>= 2.11.0; as of 2022-04-08, about 0.35% of users were below 2.11.0).
  4. Starting the video decoder frequently makes it fail persistently (an H5 fallback is currently used).
  5. Manual seek is not supported.
  6. The video decoder has no Pause; it is simulated with Stop + Start + Seek (see the sketch after this list), which carries a large performance cost.
  7. Same-layer Canvas rendering behaves noticeably differently on some clients.
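
A hedged sketch of the workaround in issue 6, assuming the decoder's stop/start/seek methods and that the last presentation timestamp is tracked while playing:

// Emulating pause/resume on a decoder that has no pause() method.
// lastPts is assumed to be recorded from getFrameData().pkPts during playback.
let lastPts = 0;

function pauseDecoder(decoder) {
  decoder.stop(); // tears down the whole decode session
}

function resumeDecoder(decoder, source: string) {
  // Restarting re-opens the file and re-primes the pipeline, which is
  // exactly why this workaround is expensive.
  decoder.start({ source, mode: 0 });
  decoder.seek(lastPts); // jump back to where playback stopped (milliseconds)
}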
 

The VideoContext-based implementation approach

Given how many problems the approach above has (mostly centered on the Mini Program video decoder), this approach obtains a VideoContext through NodesRef and hands it to a Canvas for drawing. AlphaVideoPlayer can then avoid VideoDecoder and MediaAudioPlayer, the two most resource-hungry APIs.
 
Overall architecture:
[architecture diagram]
Notes:
  1. When plain Canvas2D is enough, use Canvas2D directly (e.g. adding a simple watermark, grabbing video frames).
  2. Complex scenarios need WebGL: use Canvas2D to read the frame pixels out of the VideoContext, then forward them to a WebGLRenderingContext and process them with a custom shader (e.g. alpha-channel blending, complex video filters).
  3. Stick to standard APIs to reduce the user's mental load: a transparent video player is still essentially a Video component, so its API and props should stay consistent with the Mini Program <video /> component.
 
Integration steps (a minimal sketch follows this list):
  1. Obtain the VideoContext instance via wx.createSelectorQuery.
  2. Pass the VideoContext instance to a Canvas for drawing; the drawing context depends on the use case:
    1. CanvasRenderingContext2D: canvas.drawImage(video, …)
    2. WebGLRenderingContext: gl.texImage2D(…, video). Mini Programs do not support passing a VideoContext instance in directly; the pixel data must first be read through a CanvasRenderingContext2D and then handed to the WebGLRenderingContext.
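
A minimal sketch of this pipeline, assuming the element ids and a render() helper like the one in the appendix:

// VideoContext → Canvas2D → WebGL. render() mirrors the helper in utils.ts.
declare function render(pixels: Uint8Array, w: number, h: number): void;

wx.createSelectorQuery()
  .select('#video')
  .context(({ context: video }) => {
    wx.createSelectorQuery()
      .select('#canvasHelper')
      .node(({ node: canvas }) => {
        const ctx = canvas.getContext('2d');
        const drawFrame = () => {
          // 1) Rasterize the current video frame onto the 2D canvas.
          ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
          // 2) Read the pixels back and upload them as a WebGL texture,
          //    where the custom shader performs the alpha blending.
          const { data } = ctx.getImageData(0, 0, canvas.width, canvas.height);
          render(new Uint8Array(data), canvas.width, canvas.height);
          canvas.requestAnimationFrame(drawFrame);
        };
        drawFrame();
      })
      .exec();
  })
  .exec();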
Additional notes:
  • Seeking to any point in time is supported.
  • Since base library 2.10.0 (as of 2022-04-08, about 0.18% of users were below 2.10.0), CanvasRenderingContext2D's drawImage accepts a video object obtained through SelectorQuery.
Use cases (a usage sketch follows this list):
  1. Video filters, video texturing
  2. Video watermarks (without resorting to CoverView)
  3. Picking video covers, video editing, etc.
  4. Transparent video playback
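
As a usage sketch for case 2, the onDraw hook exposed by AlphaVideoPlayerCore (see the appendix) can stamp a watermark onto each frame before the pixels are uploaded to WebGL; the src URL, text, and offsets here are illustrative:

// Hypothetical usage of AlphaVideoPlayerCore's onDraw hook.
<AlphaVideoPlayerCore
  src='https://example.com/alpha-video.mp4'
  autoplay
  onDraw={(canvas, ctx, width, height) => {
    // Draw on the 2D helper canvas; the result is read back via
    // getImageData and fed to the WebGL renderer.
    ctx.font = '32px sans-serif';
    ctx.fillStyle = 'rgba(255, 255, 255, 0.6)';
    ctx.fillText('watermark', width - 220, height - 40);
  }}
/>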

Afterword

No technical solution is set in stone. As the technology matures and the platform's underlying capabilities improve, the solution has to keep pace.

Appendix

Alpha-channel blending GLSL (the source video carries RGB in its left half and the alpha matte, encoded as brightness, in its right half):
/*
 * Copyright (c) 2021. J <info_together@aliyun.com> All Rights Reserved.
 */
export const vertexShader = `
  precision mediump float;
  precision mediump int;

  attribute vec3 aPos;
  attribute vec2 aVertexTextureCoord;
  uniform vec2 u_scale;
  varying vec2 vUv;

  void main(void) {
    vUv = aVertexTextureCoord;
    vec3 scaledPosition = aPos * vec3(u_scale, 1.0);
    vec3 t = scaledPosition + vec3(0, 0, 0);
    gl_Position = vec4(t, 1);
  }
`

export const fragmentShader = `
  precision mediump float;

  uniform sampler2D texture;
  varying vec2 vUv;

  vec3 rgb2hsv(vec3 c) {
    vec4 K = vec4(0.0, -1.0 / 3.0, 2.0 / 3.0, -1.0);
    vec4 p = mix(vec4(c.bg, K.wz), vec4(c.gb, K.xy), step(c.b, c.g));
    vec4 q = mix(vec4(p.xyw, c.r), vec4(c.r, p.yzx), step(p.x, c.r));
    float d = q.x - min(q.w, q.y);
    float e = 1.0e-10;
    return vec3(abs(q.z + (q.w - q.y) / (6.0 * d + e)), d / (q.x + e), q.x);
  }

  void main(void) {
    vec2 v = vec2(vUv.xy);
    v.x = v.x / 2.0;                          // sample the left half: RGB
    vec2 mask = vec2(vUv.x / 2.0 + 0.5, v.y); // sample the right half: alpha matte
    vec3 hsv = rgb2hsv(texture2D(texture, mask).rgb);
    // Use the matte's brightness (HSV value) as the alpha of the RGB sample.
    vec4 color = vec4(texture2D(texture, v).rgb, hsv.z);
    gl_FragColor = color;
  }
`
Batch video transcoding script
#!/bin/bash
# High-bitrate video stutters when decoded in the Mini Program; transcode
# with ffmpeg first.
# Single video: ffmpeg -i input.mp4 output.mp4

echo 'Batch transcoding started'
find ./ -name '*.mp4' -exec sh -c 'ffmpeg -i "$0" "${0%%.mp4}.small.mp4"' {} \;
echo 'Batch transcoding finished'
AlphaVideoPlayerCore source code
// index.tsx
/*
 * Copyright (c) 2022. J <info_together@aliyun.com> All Rights Reserved.
 */
import Taro from '@tarojs/taro';
import classnames from 'classnames';
import { Video, View, Canvas } from '@tarojs/components';
import { VideoProps } from '@tarojs/components/types/Video';
import Status from '../Status';
import { createRenderer } from './utils';
import styles from './index.module.scss';

export enum VideoStatusEnum {
  INIT = 'init',
  READY = 'ready',
  PLAYING = 'playing',
  PAUSE = 'pause',
  SEEKING = 'seeking',
  WAITING = 'waiting',
  ENDED = 'ended',
  ERROR = 'error'
}

interface VideoMetaData {
  width: number
  height: number
  duration: number
}

interface ComponentQueryDetail {
  width: number
  height: number
}

interface P extends Pick<VideoProps,
  | 'onTimeUpdate'
  | 'onEnded'
  | 'onPlay'
  | 'onError'
  | 'onWaiting'
  | 'onPause'
  | 'onLoadedMetaData'
  | 'onProgress'
  | 'autoplay'
  | 'muted'
  | 'poster'
  | 'loop'> {
  id?: string
  src: string
  fps?: number
  debug?: boolean
  onReady?: () => void
  onStatusChange?: (status: VideoStatusEnum) => void
  onDraw?: (canvas, ctx, width, height) => void
}

interface S {
  canvasDetail: {
    width: number
    height: number
  } | null
  videoStatus: VideoStatusEnum
  videoMuted: boolean
}

const DPR = wx.getSystemInfoSync().pixelRatio;

export default class AlphaVideoPlayerCore extends Taro.Component<P, S> {
  constructor(props) {
    super(props);
    this.state = {
      canvasDetail: null,
      videoStatus: VideoStatusEnum.INIT,
      videoMuted: props.muted,
    }
    this.FPS = props.fps || 24;
  }

  FPS = 24;
  videoContext;
  canvasNode;
  requestAnimationFrameId;
  canvas2DContext;
  glRenderFunc;
  destroyGL;
  videoMetaData?: VideoMetaData;
  componentQueryDetail?: ComponentQueryDetail;
  videoStatus: VideoStatusEnum;
  lastRunTime = Date.now();
  lastLoopTime = Date.now();
  timer;

  // Measure the component's bounding rect, then size the canvases.
  selectorComponentQuery = () => {
    // eslint-disable-next-line @typescript-eslint/no-this-alias
    const that = this;
    wx.createSelectorQuery()
      .in(that.$scope)
      .select('#alphaVideoPlayer')
      .boundingClientRect((res) => {
        this.componentQueryDetail = {
          width: res.width,
          height: res.height,
        }
        that.calculateCanvas();
      })
      .exec();
  }

  // Fit the canvas inside the component while keeping the aspect ratio of
  // half the video width (the source is a side-by-side RGB/alpha layout).
  calculateCanvas = () => {
    const { width: componentWidth, height: componentHeight } = this.componentQueryDetail!;
    const { width: videoWidth, height: videoHeight } = this.videoMetaData!;
    let base = [componentWidth, componentHeight];
    const videoRatio = (videoWidth / 2) / videoHeight;
    const resultHeight = componentWidth / videoRatio;
    base = [componentWidth, resultHeight];
    if (resultHeight > componentHeight) {
      const resultWidth = componentHeight * videoRatio;
      base = [resultWidth, componentHeight];
    }
    this.setState({
      canvasDetail: {
        width: Math.floor(base[0]),
        height: Math.floor(base[1]),
      }
    }, () => {
      this.onPlay();
    })
  }

  onPlay = () => {
    // eslint-disable-next-line @typescript-eslint/no-this-alias
    const that = this;
    wx.createSelectorQuery()
      .in(that.$scope)
      .select('#video')
      .context(({ context }) => {
        that.videoContext = context;
        wx.createSelectorQuery()
          .in(that.$scope)
          .selectAll('#canvasHelper, #webglCanvas')
          .node(res => {
            const { width, height } = that.state.canvasDetail!;
            that.canvasNode = res[0].node;
            that.canvas2DContext = res[0].node.getContext('2d')
            res[0].node.width = width * DPR
            res[0].node.height = height * DPR
            res[1].node.width = width * DPR / 2
            res[1].node.height = height * DPR / 2
            const { render, destroy } = createRenderer({
              canvas: res[1].node,
              width: width,
              height: height,
            });
            that.glRenderFunc = render;
            that.destroyGL = destroy;
            that.setVideoStatus(VideoStatusEnum.READY);
            that.getFirstFrame();
            that.onComponentReady();
          })
          .exec()
      }).exec();
  }

  onComponentReady = () => {
    const { onReady, autoplay } = this.props;
    if (autoplay) {
      this.timer = setTimeout(() => {
        this.play();
      }, 1000);
    }
    onReady && onReady();
  }

  // Download the poster (if any) and draw it as the first frame.
  getFirstFrame = () => {
    // eslint-disable-next-line @typescript-eslint/no-this-alias
    const that = this;
    const { poster } = this.props;
    if (!poster) {
      return;
    }
    const posterImage = this.canvasNode.createImage();
    wx.downloadFile({
      url: poster,
      success: (res) => {
        posterImage.onload = () => {
          that.draw(posterImage)
        }
        posterImage.src = res.tempFilePath;
      },
      fail: (err) => {
        console.log('getFirstFrame err', err);
      }
    })
  }

  // Frame loop, throttled to this.FPS.
  loop = (status?: VideoStatusEnum) => {
    const videoStatus = status ? status : this.videoStatus;
    if (
      videoStatus == VideoStatusEnum.PLAYING ||
      videoStatus == VideoStatusEnum.READY ||
      videoStatus == VideoStatusEnum.WAITING
    ) {
      this.requestAnimationFrameId = this.canvasNode.requestAnimationFrame(() => {
        const now = Date.now();
        if (now - this.lastLoopTime >= (1000 / this.FPS)) {
          this.lastLoopTime = now;
          this.draw();
        }
        this.loop()
      });
    } else {
      this.canvasNode.cancelAnimationFrame(this.requestAnimationFrameId);
    }
  }

  play = () => {
    this.playOrPause(VideoStatusEnum.PLAYING);
  }

  pause = () => {
    this.playOrPause();
  }

  stop = () => {
    this.videoContext.stop();
  }

  playbackRate = (rate: number) => {
    this.videoContext.playbackRate(rate);
  }

  playOrPause = (status?: VideoStatusEnum.PLAYING | VideoStatusEnum.PAUSE) => {
    console.info('playOrPause', status);
    const currentTime = Date.now();
    const protectTime = 100; // protective debounce in ms; keep it >= 50, 100+ recommended
    if ((currentTime - this.lastRunTime) < protectTime) {
      Taro.showToast({
        title: '操作过于频繁',
        icon: 'none',
      })
      console.info('点击操作过于频繁,已被忽略');
      return; // two calls too close together; ignore this one
    }
    if (
      status === VideoStatusEnum.PLAYING ||
      this.videoStatus !== VideoStatusEnum.PLAYING
    ) {
      console.info('playOrPause', 1);
      this.videoContext.play();
      this.setVideoStatus(VideoStatusEnum.PLAYING);
    } else {
      console.info('playOrPause', 2);
      this.videoContext.pause();
      this.setVideoStatus(VideoStatusEnum.PAUSE);
    }
    this.lastRunTime = Date.now();
  }

  /**
   * video seek
   * @param percent
   */
  seek = (percent: number) => {
    if (this.videoMetaData) {
      this.pause();
      this.setVideoStatus(VideoStatusEnum.SEEKING);
      const position = percent * this.videoMetaData.duration;
      this.videoContext.seek(position);
      const timer = setTimeout(() => {
        this.play();
        clearTimeout(timer);
      }, 300)
    }
  }

  // Draw the current frame onto the 2D canvas, run the user's onDraw hook,
  // then push the pixels to the WebGL renderer for alpha blending.
  draw = (initData?: string) => {
    if (!this.state.canvasDetail) {
      return;
    }
    const { width: w, height: h } = this.state.canvasDetail;
    this.canvas2DContext.drawImage(initData || this.videoContext, 0, 0, w * DPR, h * DPR);
    // this.canvas2DContext.fillStyle = 'rgba(0, 0, 0, 0.5)';
    // this.canvas2DContext.fillRect(0, 0, w * DPR, h * DPR);
    this.props.onDraw && this.props.onDraw(this.canvasNode, this.canvas2DContext, w * DPR, h * DPR)
    this.canvas2DContext.restore();
    this.glRenderFunc(
      new Uint8Array(this.canvas2DContext.getImageData(0, 0, w * DPR, h * DPR).data),
      w * DPR,
      h * DPR
    )
  }

  onVideoLoadedMetaData = (e) => {
    console.log('video meta data', e.detail);
    this.videoMetaData = e.detail;
    this.selectorComponentQuery();
    this.props.onLoadedMetaData && this.props.onLoadedMetaData(e);
  }

  setVideoStatus = (status: VideoStatusEnum) => {
    return new Promise<VideoStatusEnum>((resolve) => {
      // if (status == this.videoStatus) {
      //   return;
      // }
      this.setState({
        videoStatus: status,
      }, () => {
        console.log('videoStatus', this.state.videoStatus);
      })
      this.videoStatus = status;
      resolve(status);
      this.props.onStatusChange && this.props.onStatusChange(status);
    })
  }

  onVideoPlay = (e) => {
    this.setVideoStatus(VideoStatusEnum.PLAYING)
      .then((status) => {
        this.loop(status);
        this.props.onPlay && this.props.onPlay(e)
      })
  }

  onVideoPause = (e) => {
    this.setVideoStatus(VideoStatusEnum.PAUSE)
      .then(() => {
        this.props.onPause && this.props.onPause(e);
      })
  }

  onVideoEnded = (e) => {
    const { loop } = this.props;
    this.setVideoStatus(VideoStatusEnum.ENDED)
      .then(() => {
        this.props.onEnded && this.props.onEnded(e);
        if (loop) {
          this.play();
        }
      })
  }

  onVideoTimeUpdate = (e) => {
    this.props.onTimeUpdate && this.props.onTimeUpdate(e)
  }

  onVideoWaiting = (e) => {
    this.props.onWaiting && this.props.onWaiting(e);
  }

  onVideoError = (e) => {
    this.setVideoStatus(VideoStatusEnum.ERROR)
      .then(() => {
        this.props.onError && this.props.onError(e);
      })
  }

  onVideoProgress = (e) => {
    this.props.onProgress && this.props.onProgress(e);
  }

  destroy = () => {
    this.canvasNode.cancelAnimationFrame(this.requestAnimationFrameId);
    this.destroyGL();
    clearTimeout(this.timer);
    this.timer = null;
  }

  componentDidMount() { }

  componentWillUnmount() {
    this.destroy();
  }

  render() {
    const { src, debug = false } = this.props;
    const { canvasDetail, videoMuted } = this.state;
    return (
      <View
        id='alphaVideoPlayer'
        className={classnames(styles.alphaVideoPlayer, {
          [styles.debug]: debug,
        })}
      >
        <Video
          id='video'
          src={src}
          muted={videoMuted}
          className={styles.videoHelper}
          controls={false}
          objectFit='contain'
          onPlay={this.onVideoPlay}
          onPause={this.onVideoPause}
          onEnded={this.onVideoEnded}
          onTimeUpdate={this.onVideoTimeUpdate}
          onWaiting={this.onVideoWaiting}
          onError={this.onVideoError}
          onProgress={this.onVideoProgress}
          onLoadedMetaData={this.onVideoLoadedMetaData}
        />
        {
          canvasDetail ? (
            <View className={styles.canvasCont}>
              <Canvas
                id='canvasHelper'
                type='2d'
                className={styles.canvas2D}
                canvasId='canvasHelper'
                disableScroll
              />
              <Canvas
                type='webgl'
                className={styles.canvasGL}
                id='webglCanvas'
                canvasId='webglCanvas'
                style={{
                  width: `${canvasDetail.width}px`,
                  height: `${canvasDetail.height - (debug ? 4 : 0)}px`,
                }}
              />
            </View>
          ) : (
            <View className={styles.loading}>
              <Status type='loading' smallIcon />
            </View>
          )
        }
      </View>
    );
  }
}
// utils.ts
/*
 * Copyright (c) 2022. J <info_together@aliyun.com> All Rights Reserved.
 */
import { vertexShader as vertexShaderAlpha, fragmentShader as fragmentShaderAlpha } from './shader/alphaPlayer';

const buffers: any = {};

// A full-screen quad covering clip space, drawn as two triangles.
const vertex = [
  -1, -1, 0.0,
  1, -1, 0.0,
  1, 1, 0.0,
  -1, 1, 0.0,
];

const vertexIndices = [
  0, 1, 2,
  0, 2, 3,
];

const texCoords = [
  0.0, 0.0,
  1.0, 0.0,
  1.0, 1.0,
  0.0, 1.0,
];

export const createShader = (gl, src, type) => {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, src);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    console.error("Error compiling shader: " + gl.getShaderInfoLog(shader));
  }
  return shader;
}

const systemInfo = wx.getSystemInfoSync()
const pixelRatio = systemInfo.pixelRatio;

export const createRenderer = (ops) => {
  const { canvas, width, height } = ops;
  const scale = [1, 1];
  const vertexShader = vertexShaderAlpha;
  const fragmentShader = fragmentShaderAlpha;
  const gl = canvas.getContext('webgl', {
    alpha: true,
    depth: true,
    stencil: true,
    antialias: true,
    premultipliedAlpha: true,
    preserveDrawingBuffer: false,
    powerPreference: 'default',
    failIfMajorPerformanceCaveat: false,
    xrCompatible: true
  });
  if (!gl) {
    throw new Error('Unable to get webgl context');
  }
  gl.canvas.width = width * pixelRatio;
  gl.canvas.height = height * pixelRatio;
  gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);

  const _vertexShader = createShader(gl, vertexShader, gl.VERTEX_SHADER);
  const _fragmentShader = createShader(gl, fragmentShader, gl.FRAGMENT_SHADER);
  const program = gl.createProgram();
  gl.attachShader(program, _vertexShader);
  gl.attachShader(program, _fragmentShader);
  gl.linkProgram(program);
  if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
    throw new Error('Unable to initialize the shader program');
  }
  gl.useProgram(program);

  const texture = gl.createTexture();
  gl.activeTexture(gl.TEXTURE0);
  gl.bindTexture(gl.TEXTURE_2D, texture)
  gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  gl.bindTexture(gl.TEXTURE_2D, null);

  buffers.vertexBuffer = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, buffers.vertexBuffer);
  gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(vertex), gl.STATIC_DRAW);

  buffers.vertexIndiceBuffer = gl.createBuffer();
  gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, buffers.vertexIndiceBuffer);
  gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint16Array(vertexIndices), gl.STATIC_DRAW);

  const aVertexPosition = gl.getAttribLocation(program, 'aPos');
  gl.vertexAttribPointer(aVertexPosition, 3, gl.FLOAT, false, 0, 0);
  gl.enableVertexAttribArray(aVertexPosition);

  const scaleLocation = gl.getUniformLocation(program, "u_scale");
  gl.uniform2fv(scaleLocation, scale);

  buffers.trianglesTexCoordBuffer = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, buffers.trianglesTexCoordBuffer);
  gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(texCoords), gl.STATIC_DRAW);

  const vertexTexCoordAttribute = gl.getAttribLocation(program, 'aVertexTextureCoord');
  gl.enableVertexAttribArray(vertexTexCoordAttribute);
  gl.vertexAttribPointer(vertexTexCoordAttribute, 2, gl.FLOAT, false, 0, 0);

  // The fragment shader's sampler is named `texture`; bind it to unit 0.
  // (The original looked up 'uSampler', which does not exist in the shader;
  // it only worked because unit 0 is the default.)
  const samplerUniform = gl.getUniformLocation(program, 'texture');
  gl.uniform1i(samplerUniform, 0);

  return {
    // eslint-disable-next-line no-shadow
    render: function (arrayBuffer, width, height) {
      // console.warn('render', arrayBuffer, width, height);
      gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);
      gl.bindTexture(gl.TEXTURE_2D, texture);
      gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0, gl.RGBA, gl.UNSIGNED_BYTE, arrayBuffer);
      gl.drawElements(gl.TRIANGLES, 6, gl.UNSIGNED_SHORT, 0);
    },
    destroy: function () {
      gl.deleteTexture(texture);
      gl.deleteShader(_vertexShader);
      gl.deleteShader(_fragmentShader);
      buffers && buffers.trianglesTexCoordBuffer && gl.deleteBuffer(buffers.trianglesTexCoordBuffer);
      buffers && buffers.vertexBuffer && gl.deleteBuffer(buffers.vertexBuffer);
      buffers && buffers.vertexIndiceBuffer && gl.deleteBuffer(buffers.vertexIndiceBuffer);
      gl.deleteProgram(program);
    }
  }
}
// shader/alphaPlayer.ts: identical to the "Alpha-channel blending GLSL" listing above (omitted here to avoid duplication).
// index.module.scss
/*!
 * Copyright (c) 2022. J <info_together@aliyun.com> All Rights Reserved.
 */
.alphaVideoPlayer {
  position: relative;
  width: 100%;
  height: 100%;
  box-sizing: border-box;
}

.videoHelper {
  position: absolute;
  top: -999999px;
  left: -99999px;
}

.canvas2D {
  position: absolute;
  top: -999999px;
  left: -99999px;
}

.canvasGL {
  box-sizing: border-box;
}

.canvasCont {
  width: 100%;
  height: 100%;
  display: flex;
  flex-direction: column;
  justify-content: center;
  align-items: center;
}

.loading {
  width: 100%;
  height: 100%;
  display: flex;
  flex-direction: column;
  justify-content: center;
  align-items: center;
}

.debug {
  border: 1px dashed red;

  .canvasGL {
    border: 1px dashed green;
  }
}