Hi guys, I have a data processing function similar to the structure below. The processing slows down when I switch the data source to another one. It's an audio data processing function.

button1.addEventListener('click', function() {
    ...
    var arr = [], buffer = e.outputBuffer.getChannelData(0);
    for (var i = 0; i < arr.length; i++) {
        arr[i] = buffer[i];
        processing(arr[i]);
    }
}); // source one

button2.addEventListener('click', function() {
    ...
    var arr = [], buffer = e.outputBuffer.getChannelData(0);
    for (var i = 0; i < arr.length; i++) {
        arr[i] = buffer[i];
        processing(arr[i]);
    }
}); // source two

function processing(j) {
    var this.j = j
      , node = context.createBufferSource()
      , buffer = context.createBuffer(1, 4096, context.sampleRate)
      , data = buffer.getChannelData(0);
    for (var i = 0; i < 4096; i++) {
        data[i] = j;
        ...
    }
    ...
}
Code (JavaScript):

Everything works fine with one data source (one song), but if you shuffle between two songs or add more data sources by clicking, the music quality becomes choppy, gets worse and worse, and my browser eventually crashes. I heard the for loop is bad. Does it have anything to do with this? Any idea how I can manage this kind of array source would be greatly appreciated. Thank you,
I don't understand the other parts, but this one perhaps:

button1.addEventListener('click', function() {
    ...
    var arr = [], buffer = e.outputBuffer.getChannelData(0);
    for (var i = 0; i < arr.length; i++) {
        ...
Code (JavaScript):

A zero-length, uninitialized arr is created on every click of button1, so the condition i < arr.length is false from the very first iteration and the for() loop body never actually executes?
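To make that concrete, here is a minimal sketch; the copyChannel helper and the Float32Array destination are only my suggestion, not something from your code:

// The problem in isolation: an array literal starts at length 0,
// so a loop bounded by arr.length runs zero times.
var arr = [];
for (var i = 0; i < arr.length; i++) {
    console.log('never reached');        // 0 < 0 is false, so this never runs
}

// One possible fix: bound the loop by the *source* buffer's length instead,
// and pre-size the destination to match it.
function copyChannel(audioBuffer) {          // audioBuffer: a Web Audio AudioBuffer
    var src = audioBuffer.getChannelData(0); // Float32Array of samples
    var copy = new Float32Array(src.length); // pre-sized, no length-0 trap
    for (var i = 0; i < src.length; i++) {
        copy[i] = src[i];
    }
    return copy;
}
Code (JavaScript):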
Thanks for the help. I can sense what you mean, but I don't understand why the for() loop never actually executes. How can I solve this? According to the internet, var arr = []; is supposed to be better than var arr = new Array(4096);, so I followed those instructions.
Maybe they meant var arr = new Array(4096) and not just var arr = []? The latter creates an empty array (i.e., an array with zero length), and since your for() loop depends on that length... you know what I mean.
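For what it's worth, the various forms behave quite differently; a quick sketch:

var a = [];                     // empty array, a.length === 0
var b = [4096];                 // ONE element (the number 4096), b.length === 1
var c = new Array(4096);        // 4096 empty slots, c.length === 4096
var d = new Float32Array(4096); // 4096 zero-filled samples; a typed array is
                                // usually the better fit for audio data anyway
console.log(a.length, b.length, c.length, d.length); // 0 1 4096 4096
Code (JavaScript):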
Is there some reason you are making an extra copy of the array values?!? Is there some reason you couldn't just pass your entire array to your "processing" function? Seems like you're doing a lot of array stuff for nothing -- but it's hard to say without seeing the rest of the code or having more info on the data source that's being processed.
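Something along these lines is what I mean -- only a rough sketch, assuming that `context`, `button1`, and `e.outputBuffer` are the same objects as in your snippet and that processing() really only needs the samples:

// Pass the whole Float32Array to processing() once, instead of calling it
// once per sample and rebuilding an intermediate arr.
function processing(samples) {                 // samples: Float32Array
    var node = context.createBufferSource();
    var buffer = context.createBuffer(1, samples.length, context.sampleRate);
    buffer.getChannelData(0).set(samples);     // one bulk copy, no per-element loop
    node.buffer = buffer;
    // ... connect and start the node, as in your original code
}

button1.addEventListener('click', function (e) {
    // no intermediate arr, no per-sample calls
    processing(e.outputBuffer.getChannelData(0));
});
Code (JavaScript):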
Be careful when working with very large arrays in the browser: all browsers still, to this day, have very crappy memory cleanup routines, leading to "memory leaks", which is exactly the kind of problem the OP is reporting. Make as few copies as possible, keep an eye on the memory-consumption meters (Windows: Ctrl+Shift+Esc --> advanced), and if you see the browser eating up more and more memory, you know you've got a memory leak. It would also help if you all read the Google hits for "memory leak browser".
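One concrete way to cut down on the copies, purely as a sketch: it reuses the 4096-sample block size and the `context` from the OP's code, and is only safe once any previous source node is done with the buffer.

// Allocate the scratch AudioBuffer once and overwrite it in place,
// instead of calling context.createBuffer() on every processing() call.
var scratchBuffer = null;

function getScratchBuffer() {
    if (!scratchBuffer) {
        scratchBuffer = context.createBuffer(1, 4096, context.sampleRate);
    }
    return scratchBuffer;
}

function processing(samples) {                       // samples: Float32Array
    var buffer = getScratchBuffer();
    // Overwrite the existing channel data rather than allocating new memory.
    buffer.getChannelData(0).set(samples.subarray(0, 4096));
    // ... use `buffer` as before
}
Code (JavaScript):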
Thank you @seductiveapps.com, that was what I suspected. My web app plays animation at 60 frames per second and audio frames at a much higher rate. The higher the resolution, the choppier the quality.
Then you need to be super careful about your variable assignments. Basically, every var x = someDataStoredInBrowser.some.sub.variable.path statement is bad, because each one keeps another reference to data the browser is already holding, and those references stop it from being freed. Start there and your problems should show up much later.
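Roughly what I mean, as a sketch; bigData here is just a hypothetical stand-in for whatever large structure is stored in the browser:

// Holding a long-lived reference into a large structure keeps that data
// alive for the garbage collector, even after you stop using it.
var bigData = { stats: { frames: new Float32Array(10000000) } };

// Risky: `frames` stays referenced for as long as this variable is reachable,
// so the ~40 MB array can never be reclaimed.
var frames = bigData.stats.frames;

// Safer: copy out only the small value you actually need,
// then drop the references so the memory can be collected.
var firstSample = bigData.stats.frames[0];
bigData = null;
frames = null;
Code (JavaScript):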
Thanks for the suggestion. Sounds like you're familiar with this stuff; I've only seen Google engineers working on this.
I don't do audio or visual processing, but I sometimes work with very large JSON databases, which suffer from the same problem as soon as you try to visualize them in the browser.