renders

Exciting times! Lots of things going on at WVU, both related to my dissertation, and otherwise. With midterms having just concluded, we enter the final stages of planning for our virtual percussion ensemble concert, student juries, and subsequently my final recital… luckily, I seem to have some apps that work!

The past month has been a lot of keeping my nose to the grindstone and iterating on the development of the final movement of Touching Light, ‘Synecdoche’ (or ‘the one with cubes on’ as it has been called). I’ll include a video at the end of this post that shows the app for the movement in action.

The artistic goal for this movement was to explore musical interactions that were unique to MR; while movements 1 and 2 engage MR to broaden the possibilities of live performance, ultimately both the holo-mixer and the carousel could be achieved via other means. The weightless projections and interactions of the holographic objects in movement 3 are a different story.

There were three main things that I wanted to do with Synecdoche. The first was to make ‘primitives’ (in this case, cubes) somehow the ‘star of the show.’ I started my 3D modeling experience with Blender, and so the ‘meme’ about deleting the default cube may have directly inspired this in more ways than one. So, the first step was to figure out what I could do to make the cube(s) interesting.

As I’ve shown before, I colored the cubes so that they corresponded to the three primary colors of the RGB color model, with some edge-lighting to make them a bit more abstract, and then made them weightless.

For the final version, I’ve spent a lot of time on the HandMenu, building out specific controls for each cube individually, as well as some global controls.

The Mute All command forcibly mutes the Audio Sources on all of the cubes by toggling the Mute boolean on each cube’s Audio Source component.
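
A minimal sketch of the idea (the tag-based lookup and the class and method names here are illustrative assumptions, not the exact implementation):

using UnityEngine;

// Illustrative global mute: flips the mute boolean on the AudioSource of
// every object tagged "Cube". The tag and class name are placeholders.
public class MuteAll : MonoBehaviour
{
    public void MuteAllCubes()
    {
        foreach (GameObject cube in GameObject.FindGameObjectsWithTag("Cube"))
        {
            AudioSource source = cube.GetComponent<AudioSource>();
            if (source != null)
            {
                source.mute = true;
            }
        }
    }
}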

Reset All will reset all of the cubes to the origin, without removing any inertia that they may be carrying, using a custom script called SetToOrigin:

using UnityEngine;

public class SetToOrigin : MonoBehaviour
{
    public Vector3 pos;
    public Quaternion rot;

    // This component stays disabled until the Reset All button enables it.
    // On the next frame it snaps the cube back to its stored position and
    // rotation, then disables itself again until the next reset.
    void Update()
    {
        transform.SetPositionAndRotation(pos, rot);
        this.enabled = false;
    }
}
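
Because the script only overwrites position and rotation, the Rigidbody’s velocity is untouched, which is what lets the cubes keep drifting on their existing inertia after a reset.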

Boundaries generates the cage around the performer, which helps keep the cubes within a reasonable performance space.
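
A sketch of the command, assuming the cage is a pre-built parent object of wall colliders that simply gets switched on (an assumption about the scene setup, not the confirmed implementation):

using UnityEngine;

// Illustrative boundary command: activates or deactivates a pre-built cage
// of colliders surrounding the performance space.
public class BoundaryToggle : MonoBehaviour
{
    public GameObject boundaryCage; // parent object holding the cage geometry

    public void ToggleBoundaries()
    {
        boundaryCage.SetActive(!boundaryCage.activeSelf);
    }
}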

CREATE

The Create command enables the Mesh Renderer on each of the cubes (it is disabled by default).
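
In code terms, that is a one-line toggle; a minimal sketch (the one-script-per-cube wiring is an assumption):

using UnityEngine;

// Illustrative Create command: reveals a cube by enabling its MeshRenderer,
// which starts disabled so the cube stays invisible until created.
public class CreateCube : MonoBehaviour
{
    public MeshRenderer cubeRenderer;

    public void Create()
    {
        cubeRenderer.enabled = true;
    }
}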

RELEASE

The Release commands activate a few different functions, unique to each cube.

Release RED
When released, the Red Cube activates the ObjectPuller custom script, which pulls it toward a designated object (the Blue Cube) whenever that object is within range:

using UnityEngine;

public class ObjectPuller : MonoBehaviour
{
    public GameObject attractedTo;              // the object this cube is pulled toward
    public float strengthOfAttraction = 0.01f;
    public float radiusOfAttraction = 1.0f;

    Rigidbody rb;
    float distance;

    void Start()
    {
        rb = GetComponent<Rigidbody>(); // cache the Rigidbody instead of fetching it every physics step
    }

    void FixedUpdate()
    {
        distance = Vector3.Distance(transform.position, attractedTo.transform.position);
        if (distance < radiusOfAttraction)
        {
            // apply a force along the vector pointing from this cube toward the target
            Vector3 direction = attractedTo.transform.position - transform.position;
            rb.AddForce(strengthOfAttraction * direction);
        }
    }
}
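
One side effect of leaving the direction vector unnormalized: the force scales with the distance between the cubes, so the pull is gentlest when the Red and Blue Cubes are nearly touching and strongest just inside the radius of attraction.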

Release BLUE
When released, the Blue Cube activates the ObjectPusher custom script, which pushes it away from a designated object (the Green Cube) whenever that object is within range:

using UnityEngine;

public class ObjectPusher : MonoBehaviour
{
    public GameObject attractedTo;              // the object this cube is pushed away from
    public float strengthOfRepulsion = 0.01f;
    public float radiusOfRepulsion = 1.0f;

    Rigidbody rb;
    float distance;

    void Start()
    {
        rb = GetComponent<Rigidbody>(); // cache the Rigidbody instead of fetching it every physics step
    }

    void FixedUpdate()
    {
        distance = Vector3.Distance(transform.position, attractedTo.transform.position);
        if (distance < radiusOfRepulsion)
        {
            // apply a force along the vector pointing away from the target
            Vector3 direction = transform.position - attractedTo.transform.position;
            rb.AddForce(strengthOfRepulsion * direction);
        }
    }
}

Release GREEN
When released, the Green Cube is affected by gravity, causing it to accelerate downward at approximately the same rate as an object on Earth’s moon. No custom scripts are required for this interaction; instead, the button toggles ‘Use Gravity’ on the cube’s Rigidbody.
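
The moon-like rate itself doesn’t come from any script; it presumably comes from lowering the project’s global gravity, which can be set in the physics settings or from code. A sketch, assuming a lunar value of roughly 1.62 m/s²:

using UnityEngine;

// Illustrative setup: lowers the global gravity vector to approximately
// lunar gravity (~1.62 m/s² downward), so any Rigidbody with Use Gravity
// enabled falls at a moon-like rate.
public class LunarGravity : MonoBehaviour
{
    void Awake()
    {
        Physics.gravity = new Vector3(0f, -1.62f, 0f);
    }
}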

SING

Each of the cubes has a different function when ‘singing,’ though RED and GREEN are functionally the same: each cube’s Audio Source is simply unmuted (it begins muted).

This allows the tracks they play to remain in sync, since both begin playing on load and have merely been muted until now.

For the BLUE cube, the ‘SING’ button activates the ImpactTrigger custom script, which plays a randomized diatonic note whenever the cube detects a collision:

using UnityEngine;

public class ImpactTrigger : MonoBehaviour
{
    // one AudioSource per diatonic note, assigned in the Inspector
    public AudioSource source0;
    public AudioSource source1;
    public AudioSource source2;
    public AudioSource source3;
    public AudioSource source4;
    public AudioSource source5;
    public AudioSource source6;

    AudioSource[] notes;

    void Start()
    {
        // gather the sources once instead of rebuilding the array on every collision
        notes = new AudioSource[] { source0, source1, source2, source3, source4, source5, source6 };
    }

    private void OnCollisionEnter(Collision collision)
    {
        // pick one of the seven notes at random and play it
        notes[Random.Range(0, notes.Length)].Play();
    }
}

Those sources are assigned through the Impact Trigger (Script) component, and live as separate Audio Sources on children of Blue_Cube.

SILENCE

The Silence function is essentially the reverse of the Sing function, muting the various Audio Sources, and disabling ImpactTrigger.

FREEZE

The Freeze function is essentially the reverse of the Release function, putting the Rigidbody components on the cubes to sleep and disabling the ObjectPuller, ObjectPusher, and gravity interactions.
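
A sketch of one cube’s share of that freeze (the class and method names are placeholders; only the sleeping Rigidbodies and disabled scripts are confirmed above):

using UnityEngine;

// Illustrative freeze handler for a single cube: halts its motion, puts the
// Rigidbody to sleep, and switches off the interaction scripts.
public class CubeFreezer : MonoBehaviour
{
    public Rigidbody rb;
    public ObjectPuller puller; // null on cubes without a puller
    public ObjectPusher pusher; // null on cubes without a pusher

    public void Freeze()
    {
        rb.useGravity = false;          // undo the Release GREEN gravity toggle
        rb.velocity = Vector3.zero;     // drop any residual inertia
        rb.angularVelocity = Vector3.zero;
        rb.Sleep();                     // put the Rigidbody to sleep

        if (puller != null) puller.enabled = false;
        if (pusher != null) pusher.enabled = false;
    }
}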

FIND

The Find function generates a floating orb (one without a Rigidbody or collider) that always points toward its corresponding cube, disappearing once the cube is in view.
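
A sketch of the orb’s point-and-hide behavior (the viewport test against the main camera is an assumption about how ‘in view’ gets detected):

using UnityEngine;

// Illustrative Find orb: rotates to face its target cube each frame and
// hides itself once the cube falls within the camera's view.
public class FindOrb : MonoBehaviour
{
    public Transform targetCube;
    public Renderer orbRenderer;

    void Update()
    {
        transform.LookAt(targetCube); // always point toward the cube

        // hide the orb when the cube is inside the camera's viewport
        Vector3 vp = Camera.main.WorldToViewportPoint(targetCube.position);
        bool inView = vp.z > 0f && vp.x > 0f && vp.x < 1f && vp.y > 0f && vp.y < 1f;
        orbRenderer.enabled = !inView;
    }
}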

PING-PONG

Finally (and this was more a happy accident than an intentional design), because of the way I built the Hand Menus, they can be used like paddles to hit the cubes and send them floating off.

Overall, I’m pleased with the way that things are shaping up… recital is on May 1 at 10 AM EST!

sliders

I’ve spent the last two weeks banging my head against the proverbial wall, re-learning class and function declarations in C#, the programming language that underpins the Unity Engine (Unity), trying to get the PinchSlider assets from Microsoft’s Mixed Reality Toolkit (MRTK) to talk to the AudioMixer class that’s native to Unity.

With some help from a few professional programmers in my network, as well as tutorials and manuals developed by both Unity and Microsoft, as of this weekend I have a functional skeleton for the first ‘movement’ of Touching Light, which involves independent audio faders that adjust the volume of loop-based original music.

This will function as an augmentable backing track alongside which the performer can improvise freely, or engage with some pre-written melodic and harmonic material that will be presented in traditional staff notation.

Final testing will occur in the next day or two to ensure that device deployment works as I intend, after which I’ll take a break from app development for the remainder of the week to shore up chapters 2 and 3 before sending them along to my research advisor for preliminary comments.

Included below is a technical overview of the Unity assets and scripts involved at this juncture, which are subject to change as I optimize:

In-Engine Render

Here you’re seeing a collection of 3D objects that have been adapted from the PinchSlider pre-fabricated (prefab) assets provided in the MRTK; each fader is interactable with the HoloLens 2 (HoloLens) “pinch” gesture (hence, “PinchSlider”), and the ‘thumb’ (the knob that slides) moves vertically along its track, returning a value between 0 and 1 depending on where it sits.

The context menu near the bottom is a profiler asset that allows me to track the CPU usage of different interactions in real time, keeping an eye on whether things are in danger of causing lag and freezing or crashing the program; so far, we’re in the green.

PinchSlider Inspector

The connection between the PinchSlider asset and the audio mixer lives within the ‘Events’ section: whenever the value of the slider (the number between 0 and 1) changes, the PinchSlider returns that value, which can then be collected by other scripts (programs) and used to alter things like the volume of specific sounds, loops, etc.

I wrote the MixLevels.cs script (referenced in the On Value Updated event) to take that slider value and apply it to the volume for the appropriate track.

MixLevels.cs

using UnityEngine;
using UnityEngine.Audio;
using Microsoft.MixedReality.Toolkit.UI;

public class MixLevels : MonoBehaviour
{
    public AudioMixer masterMixer; // reference to the in-engine audio mixer

    private float dB; // holds the raw 0-1 slider value before it is mapped to decibels
    private string param;

    [SerializeField] // ask for input in the Inspector
    private string exposedParam = null;

    public void SetMusicLvl(SliderEventData eventData) // take an input of slider-event type
    {
        param = exposedParam;
        dB = eventData.NewValue;
        if (dB != 0) // the logarithmic fader conversion breaks with an input of 0, so check for it
        {
            // map the slider's 0-1 range onto the mixer's -80 to 0 dB fader range
            masterMixer.SetFloat(param, Mathf.Log10(dB) * 20);
            Debug.Log(dB); // print the value to confirm that it is being changed
        }
        else
        {
            masterMixer.SetFloat(param, -80); // if input is 0, set the fader to -80 dB
            Debug.Log(dB); // print the value to confirm that it is being changed
        }
    }
}
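
To make the mapping concrete: a slider value of 1 gives Log10(1) * 20 = 0 dB (full volume), 0.5 gives roughly -6 dB, and 0.1 gives -20 dB, so the linear slider motion tapers the way a real mixing fader does.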

A fairly ‘simple’ script in the grand scheme of what’s possible, this program lets the user specify which fader’s volume (an “exposed parameter,” essentially meaning it is visible and available for other programs to edit) this specific copy of the MixLevels.cs script should control.

I’ve color-coded the code above: anything in sage green is a ‘comment,’ a note within the program written in English for other programmers, which the computer ignores when running it. Commenting is an important way to communicate with anyone who will look at your code, helping them understand what your program is doing and how.

Master Fader

While there is likely a more efficient and elegant way to create these connections, this is the final piece of the puzzle: the place where the user designates which audio track the slider controls. You can see here that in the ‘Exposed Param’ field (which the script above marks as a ‘Serialized Field,’ thus prompting for an input) I’ve designated ‘masterVol’ which, as you might guess, references the Volume of the Master Fader (the one that controls the total overall volume of all of the tracks).
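
(For anyone reproducing this: a mixer parameter becomes ‘exposed’ by selecting the mixer group, right-clicking its Volume property in the Inspector, choosing to expose it to script, and then renaming it, here to masterVol, in the Audio Mixer window’s Exposed Parameters list.)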

By multiplying this five or ten times over, I’ve then generated the necessary faders to control all 11 of the individual tracks (10, plus the master fader) that make up the complete original music.

Screen recording while rendering 3D and audio in real-time is a bit taxing
(hence the red on the profiler)