So far we haven’t really explored the interactive potential here; we’ve just defaulted to basic controls. This is a great way to get started and test things, but maybe it’s now time to think about interactivity a little and prototype some approaches.
Currently, dragging the position slider moves the red position indicator in the top WaveView, which raises the question: why not drag the indicator itself? And since a drag spans two axes, we can combine the two position sliders and have the vertical axis control jitter while the horizontal controls position.
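As a rough sketch of that two-axis idea, a drag location can be normalized against the view’s size so that x maps to position and y maps to jitter. The type and property names here are assumptions for illustration, not GrainSwift’s actual API:

```swift
// Sketch: mapping a two-axis drag to position (x) and jitter (y).
// DragMapping and its members are hypothetical names, not from the app.
struct DragMapping {
    let width: Double
    let height: Double

    // Normalize the touch location to 0...1 on each axis,
    // clamping so drags outside the view stay in range.
    func apply(x: Double, y: Double) -> (position: Double, jitter: Double) {
        let position = min(max(x / width, 0), 1)
        let jitter = min(max(y / height, 0), 1)
        return (position, jitter)
    }
}
```

The clamping matters because a DragGesture keeps reporting locations even after the finger leaves the view’s bounds.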
We can also play with emitting sound only while the view is touched, rather than having it on nonstop, the behavior seen in that old dinosaur MegaCurtisBig. Rather than having the sound turn on and off abruptly, it would be nice to have it gradually fade in and out. For that we could write some fading code as we did with the grain ramp, but it’s probably a good idea to abstract this concept out and start building modulators: GrainSwift/Modulators.swift.
Readers familiar with synthesis may wish to skip the next couple of paragraphs, as we pause to explain another cool-sounding word for a simple concept: modulation. Modulation can be thought of as a way to automate controls. Instead of manually moving a volume control or a position slider, we can have a signal modulate it, i.e. wiggle it for us. One common type of modulator is the envelope. As we did with the grain ramp, an envelope modulator typically increases a signal level, like amp volume, and then at some point decreases it again.
There are many types of envelopes, and we need not dwell on them here. We simply want an envelope that fades in our sound when we touch the wave and fades it out when we let go. An ASR envelope will work nicely, and you can see one implemented here.
attackTime will control how long it takes for the sound to rise, and releaseTime will control how long it takes for the sound to fall. The envelope will also need a hold mode, so that when we tweak the old sliders, the app behaves as before, with the sound playing nonstop. When the wave view is touched, however, we’ll want hold turned off, so that the envelope can smoothly fade our sound in and out.
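To make the idea concrete, here is a minimal per-sample sketch of an ASR envelope with the hold mode described above. attackTime, releaseTime, and hold come from the text; everything else (the type name, the sample-based stepping) is an assumption about how the audio state might advance, not GrainSwift’s real implementation:

```swift
// Minimal ASR (attack-sustain-release) envelope sketch with a hold mode.
struct ASREnvelope {
    var attackTime: Double = 0.1   // seconds for the sound to rise
    var releaseTime: Double = 0.2  // seconds for the sound to fall
    var sampleRate: Double = 44_100
    var hold = false               // when true, stay at full level (sliders behave as before)

    private(set) var level: Double = 0
    private var gateOn = false

    mutating func touchBegan() { gateOn = true }
    mutating func touchEnded() { gateOn = false }

    // Advance one sample and return the current amplitude (0...1).
    mutating func next() -> Double {
        if hold { level = 1; return level }
        if gateOn {
            level = min(1, level + 1 / (attackTime * sampleRate))
        } else {
            level = max(0, level - 1 / (releaseTime * sampleRate))
        }
        return level
    }
}
```

Multiplying each output sample by `next()` fades the sound in while touching and out on release, with no clicks from abrupt level jumps.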
It was already a little awkward when we reused the WaveView inside the GrainView and set the position indicator to a negative value by default in order to hide it. This worked fine in practice but already had a bit of a stench: arbitrary values with overloaded meanings like “this isn’t really a valid state” or “this means hide the indicator” have a tendency to creep in and cause problems later.
A cleaner approach would be to remove position from the WaveView entirely. We can instead make a dedicated component that indicates position and also lets us adjust it: remove position.
I’m frankly not sure what the ideal way to compose components in SwiftUI is yet, but in this case simply making a ZStack with a position control on top of the wave view seemed like the cleanest approach, with a very clear separation of concerns. To this end, we can create a PositionControlView which displays position but also handles touches using a DragGesture. A DragGesture gives us an onEnded but no onBegan, so in order to discern new touches we can add a touching boolean to keep track of touch state. We can make it a @Binding to hoist this state up to the parent, where we’ll need it to turn the amp envelope hold back on when a user drags the old position slider control: if not touching, turn on hold. Notice that we’ve added an onDrag closure parameter to the ControlSliderView so it can trigger actions whenever it’s changed: onDrag.
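The begin-detection trick is simple enough to sketch outside of SwiftUI. Since DragGesture only reports changes and an end, the first change of a gesture can stand in for a “began” event. The TouchTracker name and callback wiring below are hypothetical; in the real view, touching would be a @Binding and the handlers would live in the gesture modifiers:

```swift
// Sketch: deriving an onBegan from DragGesture's onChanged/onEnded pair.
final class TouchTracker {
    private(set) var touching = false  // would be a @Binding in the real view
    var onBegan: () -> Void = {}
    var onEnded: () -> Void = {}

    // Call from DragGesture.onChanged: the first change of a
    // gesture is treated as the "began" event.
    func dragChanged() {
        if !touching {
            touching = true
            onBegan()
        }
    }

    // Call from DragGesture.onEnded.
    func dragEnded() {
        touching = false
        onEnded()
    }
}
```

The parent can then flip the envelope’s hold off in onBegan and back on (or start the release) in onEnded.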
The rest of the PositionControlView code is fairly self-explanatory, courtesy of SwiftUI’s declarative syntax. As a little extra, we can easily visualize position jitter by thickening and blurring the position indicator. It only takes a couple of lines of code:
.stroke(Color.red, lineWidth: 1 + CGFloat(audio.grainControl.positionJitter * 8.0))
.blur(radius: CGFloat(audio.grainControl.positionJitter * 10.0))