Monitor Sound Wave Decibels While Recording on iOS

You’re going to laugh at this.  My brother-in-law, David, pitched me an app idea he wanted me to build YEARS ago.  I had only written one crappy Android app, but once you’ve written an app, people start throwing their ideas at you like crazy.

His idea was funny, but it was beyond my ability, so I never really got going on it.  Now that NativeScript is here and I’m spending more time in the app world, I decided to give it a go.  His idea? RankMyStank…

Farts

He wanted an app where you fart into the phone and it ranks your fart based on an array of factors: length, depth, variety, pitch, and so on.  I’m still working out that algorithm…

What I love about this idea is that it made me really dig into NativeScript and figure out how to make it happen.  That’s how I found the nativescript-audio plugin, which boasts the ability to record and play back sounds easily in NativeScript.

At first I couldn’t get it to work, so I set the project aside.  After recently releasing my first app to iTunes, I had learned so much more about the framework and decided to give it another go.

I got the plugin working pretty quickly.  As I toyed with it, I realized there was no way to track the sound wave or get any details about the structure of the sound being recorded.  You could only record and then play.

Contributing

I’ve always wanted to contribute to an open source project but never felt like I was talented enough.  As I dug into how to accomplish this task, I realized that if I could figure it out, I could probably add it to the plugin.  That was motivation.  So I set forth on my venture, and here’s what I figured out.

Native Functionality

The cool thing about NativeScript is that you have access to all the native functionality of iOS and Android.  Usually you don’t tap into it directly because there is already such a great library of plugins.  And there’s no way I could build a plugin like this from scratch at this point, so altering the nativescript-audio plugin was my best option.
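
To give a sense of what that native access looks like (a minimal sketch of my own, not code from the plugin): on iOS, the NativeScript runtime exposes Objective-C classes such as AVAudioSession as globals you can call straight from JavaScript.

// Minimal sketch, assuming an iOS device or simulator: Objective-C classes
// like AVAudioSession are available as JavaScript globals in NativeScript.
var session = AVAudioSession.sharedInstance();
console.log("Sample rate: " + session.sampleRate);
console.log("Output volume: " + session.outputVolume);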

Inside the nativescript-audio plugin is a directory called src/ios/, and in there you find the functionality that communicates with iOS’s native audio APIs.  The file recorder.js was my best bet, so I popped it open.  There I saw:

TNSRecorder.prototype.start = function (options) {

As well as

TNSRecorder.prototype.stop = function () {

Among others, these told me I had found the spot where the functions I’d been calling from my JavaScript code were being defined.

Monitoring Recording

So I spent quite some time learning about the class being used here: AVAudioRecorder.  This is the iOS class used to record sound.  I found out exactly how to use it to get the dB of the sound waves being recorded in real time.  The method is called averagePowerForChannel(), though there is also peakPowerForChannel().  From what I understand, averagePowerForChannel() is closer to what the human ear hears, so I went with that.
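
As an aside (my own note, not part of the plugin): both methods return decibels full scale, roughly -160 for near silence up to 0 for a full-scale signal.  If you ever want a linear 0-to-1 level instead of dB, the usual conversion is 10^(dB/20).

// Rough aside, not plugin code: convert the dB value that averagePowerForChannel()
// returns (about -160 .. 0) into a linear 0..1 amplitude.
function dbToLinear(db) {
    return Math.pow(10, db / 20);
}
console.log(dbToLinear(-160)); // ~0, near silence
console.log(dbToLinear(0));    // 1, full scale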

I also discovered that in order to use these functions you must turn metering on.  It’s processor intensive, so it is not on by default.  In the end, this is all the code I added to the plugin to give us two new functions… one to detect whether recording is happening and one to get the current dB of the sound being recorded.

TNSRecorder.prototype.isRecording = function () {
    var _this = this;
    // AVAudioRecorder exposes a "recording" flag while a recording is in progress.
    return _this._recorder.recording;
};
TNSRecorder.prototype.getMeters = function (channel) {
    var _this = this;
    // Metering is off by default because it costs CPU, so switch it on first.
    if (!_this._recorder.meteringEnabled) {
        _this._recorder.meteringEnabled = true;
    }
    // Refresh the meter values, then return the average power (in dB) for the channel.
    _this._recorder.updateMeters();
    return _this._recorder.averagePowerForChannel(channel);
};

I wound up not even using the isRecording function, but I figured it could be useful for others, so I left it in after testing it.

Rewriting in TypeScript

The original plugin is written in TypeScript, and a compiler creates the .js files from it.  Knowing this, I had to write these functions in TS before submitting a pull request to the original plugin.  Fortunately, that wasn’t too difficult.

public isRecording() {
  // AVAudioRecorder exposes a "recording" flag while a recording is in progress.
  return this._recorder.recording;
}

public getMeters(channel: number) {
  // Metering is processor intensive and off by default, so enable it on first use.
  if (!this._recorder.meteringEnabled) {
    this._recorder.meteringEnabled = true;
  }
  this._recorder.updateMeters();
  return this._recorder.averagePowerForChannel(channel);
}

Using In NativeScript

So here’s what I have now, just demoing the functions and showing that they work.  I update a Label view’s width to show the current dB coming in.  It’s fairly straightforward and a decent example of its use.

// Modules and state this snippet relies on (standard NativeScript modules plus the plugin).
var timer = require("timer");
var fs = require("file-system");
var audio = require("nativescript-audio");

var documents = fs.knownFolders.documents();
var recorder = new audio.TNSRecorder();
var isRec = false;
var timeRec;
var page;

exports.loaded = function(args) {
    page = args.object;
};

exports.onTap = function(args) {
    var recorderOptions = {
        filename: documents.path + "/recording.caf",
        infoCallback: function() {
            console.log();
        },
        errorCallback: function() {
            console.log();
        }
    };
    if (isRec) {
        recorder.stop();
        isRec = false;
        timer.clearInterval(timeRec);
    } else {
        recorder.start(recorderOptions);
        isRec = true;
        timeRec = timer.setInterval(function() {
            // getMeters(0) returns roughly -160 (silence) to 0 (full scale) dB,
            // so map it onto a 0-100% width for the Label.
            var barWidth = Math.round((160 + Number(recorder.getMeters(0))) / 160 * 100);
            page.css = ".soundlevel { color: #FFF; padding: 10px; background-color: Brown; width: " + barWidth + "%; }";
        }, 10);
    }
};
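
For completeness, here’s a guess at the minimal page markup behind that code (not from the original project): a button wired to onTap and a Label carrying the soundlevel class whose width the interval updates.

<!-- Guessed minimal markup: a record/stop button and the Label whose width tracks the incoming dB level. -->
<Page loaded="loaded">
  <StackLayout>
    <Button text="Record / Stop" tap="onTap" />
    <Label class="soundlevel" text="Sound Level" />
  </StackLayout>
</Page>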
