
How to implement the HTML5 recording function


This article explains how to implement the HTML5 recording function. Many people run into questions about this in day-to-day work, so the editor consulted a variety of materials and organized them into a simple, practical approach. Hopefully it helps answer those questions; follow along with the editor to work through it.

Origin

Due to project requirements, we needed to implement recording on the web side. We initially found two approaches: one through an iframe, and one through HTML5's getUserMedia API. Since our recording feature does not need to be compatible with IE, we chose HTML5's getUserMedia without hesitation. The basic idea was to combine the official API documentation with solutions found online into something that met the project's needs. However, because the recording had to work on both the pad and the PC at the same time, there were a few pitfalls along the way. The following retraces that process.

Step 1

The new API is navigator.mediaDevices.getUserMedia, which returns a promise, while the old API was navigator.getUserMedia, so a compatibility shim is needed. The code is as follows:

// Old browsers may not implement mediaDevices at all, so set an empty object first
if (navigator.mediaDevices === undefined) {
    navigator.mediaDevices = {};
}

// Some browsers partially support mediaDevices. We cannot assign getUserMedia
// directly to the object because that could overwrite existing properties.
// Only add it when the getUserMedia property is missing.
if (navigator.mediaDevices.getUserMedia === undefined) {
    let getUserMedia = navigator.getUserMedia ||
        navigator.webkitGetUserMedia ||
        navigator.mozGetUserMedia ||
        navigator.msGetUserMedia;

    navigator.mediaDevices.getUserMedia = function (constraints) {
        // Some browsers do not implement it at all - return a rejected promise
        // with an error so the interface stays consistent
        if (!getUserMedia) {
            return Promise.reject(new Error('getUserMedia is not implemented in this browser'));
        }

        // Otherwise, wrap the old callback-style navigator.getUserMedia in a Promise
        return new Promise(function (resolve, reject) {
            getUserMedia.call(navigator, constraints, resolve, reject);
        });
    };
}
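With the shim in place, old and new browsers can be called through the same promise-based interface. A minimal usage sketch follows; the handler bodies are only illustrative and are not part of the original code.

// Request microphone access through the unified promise interface
navigator.mediaDevices.getUserMedia({ audio: true })
    .then(function (stream) {
        // 'stream' is a MediaStream carrying the microphone audio;
        // it can be handed to an AudioContext or to the HZRecorder below
        console.log('microphone ready', stream);
    })
    .catch(function (err) {
        // e.g. NotAllowedError, NotFoundError
        console.error('could not open the microphone:', err);
    });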

Step 2

The following method is widely circulated online: it encapsulates a HZRecorder, and it is used here essentially as-is. Call HZRecorder.get to invoke the recording API. It takes a callback function; after a HZRecorder is constructed with new, the callback is executed and receives the instantiated HZRecorder object. Recording can then be started, paused, stopped, and played back through the methods of that object.

var HZRecorder = function (stream, config) {
    config = config || {};
    config.sampleBits = config.sampleBits || 8;            // sample bits: 8 or 16
    config.sampleRate = config.sampleRate || (44100 / 6);  // sample rate (1/6 of 44100)

    // create the audio context
    var audioContext = window.AudioContext || window.webkitAudioContext;
    var context = new audioContext();

    // feed the microphone stream into this context
    var audioInput = context.createMediaStreamSource(stream);

    // volume (gain) node
    var volume = context.createGain();
    audioInput.connect(volume);

    // buffer size used to cache the sound
    var bufferSize = 4096;

    // script processor node; the second and third arguments of createScriptProcessor
    // mean both input and output are two-channel
    var recorder = context.createScriptProcessor(bufferSize, 2, 2);

    var audioData = {
        size: 0,                               // length of the recording
        buffer: [],                            // recording cache
        inputSampleRate: context.sampleRate,   // input sample rate
        inputSampleBits: 16,                   // input sample bits: 8 or 16
        outputSampleRate: config.sampleRate,   // output sample rate
        oututSampleBits: config.sampleBits,    // output sample bits: 8 or 16
        input: function (data) {
            this.buffer.push(new Float32Array(data));
            this.size += data.length;
        },
        compress: function () {
            // merge
            var data = new Float32Array(this.size);
            var offset = 0;
            for (var i = 0; i < this.buffer.length; i++) {
                data.set(this.buffer[i], offset);
                offset += this.buffer[i].length;
            }
            // compress
            var compression = parseInt(this.inputSampleRate / this.outputSampleRate);
            var length = data.length / compression;
            var result = new Float32Array(length);
            var index = 0, j = 0;
            while (index < length) {
                result[index] = data[j];
                j += compression;
                index++;
            }
            return result;
        },
        encodeWAV: function () {
            var sampleRate = Math.min(this.inputSampleRate, this.outputSampleRate);
            var sampleBits = Math.min(this.inputSampleBits, this.oututSampleBits);
            var bytes = this.compress();
            var dataLength = bytes.length * (sampleBits / 8);
            var buffer = new ArrayBuffer(44 + dataLength);
            var data = new DataView(buffer);
            var channelCount = 1; // mono
            var offset = 0;
            var writeString = function (str) {
                for (var i = 0; i < str.length; i++) {
                    data.setUint8(offset + i, str.charCodeAt(i));
                }
            };
            // resource exchange file identifier
            writeString('RIFF'); offset += 4;
            // total bytes from the next address to the end of the file, i.e. file size - 8
            data.setUint32(offset, 36 + dataLength, true); offset += 4;
            // WAV file flag
            writeString('WAVE'); offset += 4;
            // waveform format flag
            writeString('fmt '); offset += 4;
            // filter bytes, usually 0x10 = 16
            data.setUint32(offset, 16, true); offset += 4;
            // format category (PCM sampled data)
            data.setUint16(offset, 1, true); offset += 2;
            // number of channels
            data.setUint16(offset, channelCount, true); offset += 2;
            // sample rate, samples per second, i.e. the playback speed of each channel
            data.setUint32(offset, sampleRate, true); offset += 4;
            // byte rate (average bytes per second): channels x sample rate x bits per sample / 8
            data.setUint32(offset, channelCount * sampleRate * (sampleBits / 8), true); offset += 4;
            // block align, bytes occupied by one sample: channels x bits per sample / 8
            data.setUint16(offset, channelCount * (sampleBits / 8), true); offset += 2;
            // bits per sample
            data.setUint16(offset, sampleBits, true); offset += 2;
            // data identifier
            writeString('data'); offset += 4;
            // total size of the sampled data, i.e. total size - 44
            data.setUint32(offset, dataLength, true); offset += 4;
            // write the sampled data
            if (sampleBits === 8) {
                for (var i = 0; i < bytes.length; i++, offset++) {
                    var s = Math.max(-1, Math.min(1, bytes[i]));
                    var val = s < 0 ? s * 0x8000 : s * 0x7FFF;
                    val = parseInt(255 / (65535 / (val + 32768)));
                    data.setInt8(offset, val, true);
                }
            } else {
                for (var i = 0; i < bytes.length; i++, offset += 2) {
                    var s = Math.max(-1, Math.min(1, bytes[i]));
                    data.setInt16(offset, s < 0 ? s * 0x8000 : s * 0x7FFF, true);
                }
            }
            return new Blob([data], { type: 'audio/wav' });
        }
    };

    // start recording
    this.start = function () {
        audioInput.connect(recorder);
        recorder.connect(context.destination);
    };

    // stop
    this.stop = function () {
        recorder.disconnect();
    };

    // end
    this.end = function () {
        context.close();
    };

    // resume
    this.again = function () {
        recorder.connect(context.destination);
    };

    // get the audio file
    this.getBlob = function () {
        this.stop();
        return audioData.encodeWAV();
    };

    // playback
    this.play = function (audio) {
        audio.src = window.URL.createObjectURL(this.getBlob());
    };

    // upload
    this.upload = function (url, callback) {
        var fd = new FormData();
        fd.append('audioData', this.getBlob());
        var xhr = new XMLHttpRequest();
        if (callback) {
            xhr.upload.addEventListener('progress', function (e) { callback('uploading', e); }, false);
            xhr.addEventListener('load', function (e) { callback('ok', e); }, false);
            xhr.addEventListener('error', function (e) { callback('error', e); }, false);
            xhr.addEventListener('abort', function (e) { callback('cancel', e); }, false);
        }
        xhr.open('POST', url);
        xhr.send(fd);
    };

    // audio capture
    recorder.onaudioprocess = function (e) {
        audioData.input(e.inputBuffer.getChannelData(0));
        //record(e.inputBuffer.getChannelData(0));
    };
};

// throw an error
HZRecorder.throwError = function (message) {
    throw new function () { this.toString = function () { return message; }; };
};

// whether recording is supported
HZRecorder.canRecording = (navigator.getUserMedia != null);

// get a recorder
HZRecorder.get = function (callback, config) {
    if (callback) {
        navigator.mediaDevices
            .getUserMedia({ audio: true })
            .then(function (stream) {
                let rec = new HZRecorder(stream, config);
                callback(rec);
            })
            .catch(function (error) {
                HZRecorder.throwError('Unable to record, please check the device status');
            });
    }
};

window.HZRecorder = HZRecorder;

The code above already covers most needs. But we also have to support the pad, and on our pads two problems had to be solved: the recording format must be mp3 before it can be played, and window.URL.createObjectURL throws an error on the pad when it is given blob data, so the conversion fails. The following is how these two problems were solved.

Step 3

Below is my solution for recording in mp3 format and for the error that window.URL.createObjectURL raises on the pad when given blob data.

1. Modify the audioData object code in HZRecorder, and pull in lamejs.js, a js file shared by a developer online.

const lame = new lamejs();

let audioData = {
    samplesMono: null,
    maxSamples: 1152,
    mp3Encoder: new lame.Mp3Encoder(1, context.sampleRate || 44100, config.bitRate || 128),
    dataBuffer: [],
    size: 0,                               // length of the recording
    buffer: [],                            // recording cache
    inputSampleRate: context.sampleRate,   // input sample rate
    inputSampleBits: 16,                   // input sample bits: 8 or 16
    outputSampleRate: config.sampleRate,   // output sample rate
    oututSampleBits: config.sampleBits,    // output sample bits: 8 or 16
    convertBuffer: function (arrayBuffer) {
        let data = new Float32Array(arrayBuffer);
        let out = new Int16Array(arrayBuffer.length);
        this.floatTo16BitPCM(data, out);
        return out;
    },
    floatTo16BitPCM: function (input, output) {
        for (let i = 0; i < input.length; i++) {
            let s = Math.max(-1, Math.min(1, input[i]));
            output[i] = s < 0 ? s * 0x8000 : s * 0x7fff;
        }
    },
    appendToBuffer: function (mp3Buf) {
        this.dataBuffer.push(new Int8Array(mp3Buf));
    },
    encode: function (arrayBuffer) {
        this.samplesMono = this.convertBuffer(arrayBuffer);
        let remaining = this.samplesMono.length;
        for (let i = 0; remaining >= 0; i += this.maxSamples) {
            let left = this.samplesMono.subarray(i, i + this.maxSamples);
            let mp3buf = this.mp3Encoder.encodeBuffer(left);
            this.appendToBuffer(mp3buf);
            remaining -= this.maxSamples;
        }
    },
    finish: function () {
        this.appendToBuffer(this.mp3Encoder.flush());
        return new Blob(this.dataBuffer, { type: 'audio/mp3' });
    },
    input: function (data) {
        this.buffer.push(new Float32Array(data));
        this.size += data.length;
    },
    compress: function () {
        // merge
        let data = new Float32Array(this.size);
        let offset = 0;
        for (let i = 0; i < this.buffer.length; i++) {
            data.set(this.buffer[i], offset);
            offset += this.buffer[i].length;
        }
        // compress
        let compression = parseInt(this.inputSampleRate / this.outputSampleRate, 10);
        let length = data.length / compression;
        let result = new Float32Array(length);
        let index = 0;
        let j = 0;
        while (index < length) {
            result[index] = data[j];
            j += compression;
            index++;
        }
        return result;
    },
    encodeWAV: function () {
        let sampleRate = Math.min(this.inputSampleRate, this.outputSampleRate);
        let sampleBits = Math.min(this.inputSampleBits, this.oututSampleBits);
        let bytes = this.compress();
        let dataLength = bytes.length * (sampleBits / 8);
        let buffer = new ArrayBuffer(44 + dataLength);
        let data = new DataView(buffer);
        let channelCount = 1; // mono
        let offset = 0;
        let writeString = function (str) {
            for (let i = 0; i < str.length; i++) {
                data.setUint8(offset + i, str.charCodeAt(i));
            }
        };
        // resource exchange file identifier
        writeString('RIFF'); offset += 4;
        // total bytes from the next address to the end of the file, i.e. file size - 8
        data.setUint32(offset, 36 + dataLength, true); offset += 4;
        // WAV file flag
        writeString('WAVE'); offset += 4;
        // waveform format flag
        writeString('fmt '); offset += 4;
        // filter bytes, usually 0x10 = 16
        data.setUint32(offset, 16, true); offset += 4;
        // format category (PCM sampled data)
        data.setUint16(offset, 1, true); offset += 2;
        // number of channels
        data.setUint16(offset, channelCount, true); offset += 2;
        // sample rate, samples per second, i.e. the playback speed of each channel
        data.setUint32(offset, sampleRate, true); offset += 4;
        // byte rate (average bytes per second): channels x sample rate x bits per sample / 8
        data.setUint32(offset, channelCount * sampleRate * (sampleBits / 8), true); offset += 4;
        // block align, bytes occupied by one sample: channels x bits per sample / 8
        data.setUint16(offset, channelCount * (sampleBits / 8), true); offset += 2;
        // bits per sample
        data.setUint16(offset, sampleBits, true); offset += 2;
        // data identifier
        writeString('data'); offset += 4;
        // total size of the sampled data, i.e. total size - 44
        data.setUint32(offset, dataLength, true); offset += 4;
        // write the sampled data
        if (sampleBits === 8) {
            for (let i = 0; i < bytes.length; i++, offset++) {
                const s = Math.max(-1, Math.min(1, bytes[i]));
                let val = s < 0 ? s * 0x8000 : s * 0x7fff;
                val = parseInt(255 / (65535 / (val + 32768)), 10);
                data.setInt8(offset, val, true);
            }
        } else {
            for (let i = 0; i < bytes.length; i++, offset += 2) {
                const s = Math.max(-1, Math.min(1, bytes[i]));
                data.setInt16(offset, s < 0 ? s * 0x8000 : s * 0x7fff, true);
            }
        }
        return new Blob([data], { type: 'audio/wav' });
    }
};

2. Modify the audio-capture callback in HZRecorder.

// audio capture
recorder.onaudioprocess = function (e) {
    audioData.encode(e.inputBuffer.getChannelData(0));
};

3. The getBlob method of HZRecorder.

this.getBlob = function () {
    this.stop();
    return audioData.finish();
};

4. The play method of HZRecorder: convert the blob into a base64 data URL.

this.play = function (func) {
    readBlobAsDataURL(this.getBlob(), func);
};

function readBlobAsDataURL(data, callback) {
    let fileReader = new FileReader();
    fileReader.onload = function (e) {
        callback(e.target.result);
    };
    fileReader.readAsDataURL(data);
}

So far, the above two problems have been solved.
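For reference, here is a minimal sketch of how these pieces might be wired together on a page. The button ids and the audio element are hypothetical and not part of the original code; it only relies on HZRecorder.get, start, and the modified play described above.

let recorder = null;

// ask for the microphone and receive a recorder instance in the callback
HZRecorder.get(function (rec) {
    recorder = rec;
});

document.querySelector('#start').onclick = function () {
    if (recorder) recorder.start();
};

document.querySelector('#stop').onclick = function () {
    if (!recorder) return;
    // play() now hands back a base64 data URL instead of a blob URL,
    // which is what works on the pad
    recorder.play(function (dataUrl) {
        document.querySelector('#player').src = dataUrl;
    });
};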

Step 4

This step covers how to animate the recording indicator.

Based on the incoming volume, circular arcs expand and contract dynamically.

// create an analyser node to get audio time and frequency data
const analyser = context.createAnalyser();
audioInput.connect(analyser);

const inputAnalyser = new Uint8Array(1);
const wrapEle = $this.refs['wrap'];
let ctx = wrapEle.getContext('2d');
const width = wrapEle.width;
const height = wrapEle.height;
const center = { x: width / 2, y: height / 2 };

function drawArc(ctx, color, x, y, radius, beginAngle, endAngle) {
    ctx.beginPath();
    ctx.lineWidth = 1;
    ctx.strokeStyle = color;
    ctx.arc(x, y, radius, (Math.PI * beginAngle) / 180, (Math.PI * endAngle) / 180);
    ctx.stroke();
}

(function drawSpectrum() {
    analyser.getByteFrequencyData(inputAnalyser); // get frequency domain data
    ctx.clearRect(0, 0, width, height);
    for (let i = 0; i < 1; i++) {
        let value = inputAnalyser[i] / 3; // scale the sampled volume
        // NOTE: the original listing breaks off at this point; the remaining lines
        // are an assumed completion that draws an arc whose radius grows with the
        // volume and then schedules the next animation frame.
        drawArc(ctx, 'rgba(0, 139, 69, 0.8)', center.x, center.y, 52 + value, 0, 360);
    }
    requestAnimationFrame(drawSpectrum);
})();
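If the animation should stop once recording ends, one approach (a sketch, not from the original article) is to keep the requestAnimationFrame handle and cancel it, for example from the recorder's stop logic:

let rafId = null;

function drawSpectrum() {
    analyser.getByteFrequencyData(inputAnalyser);
    // ... draw the arcs as above ...
    rafId = requestAnimationFrame(drawSpectrum);
}
drawSpectrum();

// call this when recording stops to halt the animation loop
function stopAnimation() {
    if (rafId !== null) {
        cancelAnimationFrame(rafId);
        rafId = null;
    }
}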
