I'm trying to use frequency data from a microphone to dynamically alter the scale and/or position of a GLTF model in A-Frame. Here is what I have so far:
<script>
AFRAME.registerComponent('voicecontrol', {
  init: function () {
    console.log("voicecontrol called");
  },
  tick: function (time, timeDelta) {
    // the component is attached to <a-scene>, so this.el is the scene itself
    var sceneEl = this.el;
    // get the model entity
    var mandalaEl = sceneEl.querySelector('#interactiveMandala');
    // get the current average microphone frequency
    var currentFreq = freqAvg();
    var positionChange = ((currentFreq / 16) + 1);
    //console.log("position change factor " + positionChange);
    mandalaEl.setAttribute('animation', 'to', {x: positionChange, y: 0, z: positionChange});
    console.log(mandalaEl.getAttribute('animation', 'to'));
  }
});
</script>
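(For reference, freqAvg() is a helper that averages the frequency bins of a Web Audio AnalyserNode connected to the microphone. A simplified sketch of it, with the exact wiring in my page trimmed down:)

<script>
// Simplified sketch of the freqAvg() helper used above -- details may differ.
navigator.mediaDevices.getUserMedia({ audio: true }).then(function (stream) {
  var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
  var analyser = audioCtx.createAnalyser();
  analyser.fftSize = 256;
  audioCtx.createMediaStreamSource(stream).connect(analyser);
  var bins = new Uint8Array(analyser.frequencyBinCount);
  window.freqAvg = function () {
    analyser.getByteFrequencyData(bins);           // fill bins with 0-255 magnitudes
    var sum = 0;
    for (var i = 0; i < bins.length; i++) sum += bins[i];
    return sum / bins.length;                      // average level across the spectrum
  };
});
</script>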
And here's the relevant A-Frame/AR.js markup:
<a-scene embedded arjs voicecontrol>
  ...
  <a-marker type='pattern' url='https://raw.githubusercontent.com/DaveyDangerpants/youarehere/master/images/pattern-marker.patt'>
    <a-entity rotation="90 0 0">
      <a-entity position="0 0 0" scale=".1 .1 .1"
                animation="property: rotation; to: 0 0 360; loop: true; easing: linear; dur: 45000;">
        <a-entity id="interactiveMandala" gltf-model="#mandala" scale="0.15 0.15 0.15" position="0 0 0"
                  animation="property: position; to: 0 0 0; dur: 100;">
        </a-entity>
      </a-entity>
    </a-entity>
  </a-marker>
</a-scene>
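(#mandala points at an a-asset-item in the scene's <a-assets> block, along these lines; the src path here is just illustrative:)

<a-assets>
  <!-- illustrative path; the real src differs -->
  <a-asset-item id="mandala" src="models/mandala.gltf"></a-asset-item>
</a-assets>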
I can see the microphone data streaming in, and the console seems to indicate that I'm setting the animation component's 'to' values correctly, but the model isn't moving. Is this an AR.js issue? I've never made anything with A-Frame or AR.js before, so I'm a bit lost. Help!