“Sarah” : Video Manipulation in p5.js


This sketch was created using a video clip from “The Puppeteer,” a solo I choreographed on one of my advanced contemporary students, Sarah Takash. The piece is a 5.35-minute-long introspective journey through movement.

To generate this sketch, I used three overlays of the same 14-second video clip. I changed the tint of two of them, modified the alpha, and increased the speed. I decided to use a zoomed-in shot of the video in order to abstract the overall visual, and I liked having the shadows of her body as the center point of the viewer’s gaze.

In order to hear only one audio stream at a time, I set the volume of two of the videos to zero. This allows the viewer to hear the audio linearly while watching the video nonlinearly. I also played with pixel manipulation, but in the end I found that distorting the pixels was more of a distraction than an enhancement, so I decided to keep it simple. The code for this sketch is posted below.
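The setup can be sketched roughly as follows; the tint values, speeds, and filename are illustrative stand-ins, not the originals:

```javascript
// Sketch of the three-overlay approach. Each entry holds the tint,
// playback speed, and volume for one copy of the clip; only the first
// copy keeps its audio, so a single stream is heard at a time.
const overlays = [
  { tint: [255, 255, 255, 255], speed: 1.0, volume: 1 }, // audible copy
  { tint: [255, 80, 120, 120], speed: 1.5, volume: 0 },  // muted, tinted
  { tint: [80, 120, 255, 90], speed: 2.0, volume: 0 },   // muted, tinted
];

let vids = [];

// p5.js lifecycle functions; defined here, called by the p5 runtime.
function setup() {
  createCanvas(640, 360);
  for (const o of overlays) {
    const v = createVideo('sarah.mp4'); // hypothetical filename
    v.hide();          // draw the frames ourselves, not the DOM element
    v.speed(o.speed);
    v.volume(o.volume);
    v.loop();
    vids.push(v);
  }
}

function draw() {
  for (let i = 0; i < vids.length; i++) {
    tint(...overlays[i].tint);           // includes alpha for layering
    image(vids[i], 0, 0, width, height); // zoomed/cropped as desired
  }
}

// exactly one overlay should carry audio
const audibleCount = overlays.filter((o) => o.volume > 0).length;
```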


Leather Nun

I created Leather Nun while testing and learning about buttons and sliders in an ICM help session. There is no rhyme or reason to the chosen images, aside from them being two images I found on the internet that I really like (well, actually, the truth is I sat next to a nun on the train a few days ago, and she reminded me how religious I was when I was younger, which brought me to question my religious stance now and how different my spiritual views are).

The purpose of this sketch was more to learn about functionality than aesthetics, so the overall aesthetic concept is currently lacking. When you click the “change picture” button, the image changes to one of outer space, and when clicked again, it changes back to the leather nun. The “add tint” button enables a slider to appear on the bottom left-hand side, which activates an HSB tint spectrum. You can move the slider through a 360-degree range of hues to tint the images as you please.
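A rough sketch of that wiring, assuming hypothetical filenames; the 0–360 slider range matches the hue spectrum described:

```javascript
let imgs = [];
let current = 0;     // index of the image being shown
let tintOn = false;
let hueSlider;

// pure helper: flip between the two images
function nextImage(i) {
  return (i + 1) % 2;
}

function preload() {
  imgs[0] = loadImage('leather-nun.jpg'); // hypothetical filenames
  imgs[1] = loadImage('outer-space.jpg');
}

function setup() {
  createCanvas(600, 600);
  colorMode(HSB, 360, 100, 100); // hue runs 0-360

  createButton('change picture').mousePressed(() => {
    current = nextImage(current);
  });

  hueSlider = createSlider(0, 360, 180);
  hueSlider.position(10, height - 30); // bottom left-hand side
  hueSlider.hide();                    // hidden until "add tint" is clicked

  createButton('add tint').mousePressed(() => {
    tintOn = !tintOn;
    tintOn ? hueSlider.show() : hueSlider.hide();
  });
}

function draw() {
  if (tintOn) {
    tint(hueSlider.value(), 100, 100); // sweep the full hue circle
  } else {
    noTint();
  }
  image(imgs[current], 0, 0, width, height);
}
```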

My apologies for the deceptive line of “This is a paragraph.” It is not a paragraph.

All Good Things Come to an End

This sketch, “All Good Things Come to an End,” is meant to represent the realm of the ephemeral. It’s a comment on one aspect of the roller coaster I’ve been on for (I’d like to say the past few weeks, but truthfully) my whole life. Specifically, it’s about the “ups” of the roller coaster and how they always come down. What goes up comes down.

The sketch is meant to evoke desire. The “for loop” of ellipses is specifically set to a random rainbow fill, to glitter and shine in the eye of the viewer, drawing the viewer in and exciting them. Within just a few seconds, the ellipses’ opacity fades to 0 and you are left with a blank screen, crushing the desire the sketch attempted to induce just a few seconds earlier.

The irony here is that you can constantly reset the sketch by clicking the mouse. So although the ellipses representing desire, sweet things, “highs,” etc., are gone within seconds and out of your grasp, they are technically available at your command by simply clicking the mouse. Although not yours to keep, they are readily available at your choosing.
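The fade-and-reset behavior can be modeled with a small pure function of frameCount; the frame counts and ellipse count here are illustrative, not the originals:

```javascript
// opacity falls linearly from 255 to 0 over fadeFrames, then stays at 0
const fadeFrames = 180; // roughly three seconds at 60 fps (illustrative)

function alphaAt(frame) {
  return Math.max(0, 255 * (1 - frame / fadeFrames));
}

let start = 0; // frame at which the current "life" of the sketch began

function draw() {
  background(255, 192, 203); // pink
  const a = alphaAt(frameCount - start);
  for (let i = 0; i < 50; i++) {
    fill(random(255), random(255), random(255), a); // random rainbow fill
    ellipse(random(width), random(height), 30, 30);
  }
}

// clicking resets the fade, bringing the ellipses back within your grasp
function mousePressed() {
  start = frameCount;
}
```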

Delving more into tech than concept, I would like to figure out how to set the sound to play only during the pink-background sketch, so that when the sketch changes to a black fill with the question mark, the viewer is also denied the music. I think the denial of the music, which your ear grows accustomed to after a few seconds, is really the most painful part of the viewing experience.
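One way to do this, assuming the track is loaded with p5.sound (the filename and state variable are hypothetical), is to pause and resume the SoundFile as the sketch flips between its two states:

```javascript
let song;
let showingQuestionMark = false; // flips once the ellipses are gone

function preload() {
  song = loadSound('track.mp3'); // hypothetical filename (p5.sound)
}

// pure decision: the music should only accompany the pink phase
function audioActionFor(questionMarkShown) {
  return questionMarkShown ? 'pause' : 'play';
}

function draw() {
  if (audioActionFor(showingQuestionMark) === 'play') {
    if (!song.isPlaying()) song.play();
    background(255, 105, 180); // pink phase with the ellipses
  } else {
    if (song.isPlaying()) song.pause(); // deny the music too
    background(0);                      // black fill + question mark
  }
}
```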


Lounge Visual Cont’d

Oct 13, 2016

This week I created an extension of last week’s time-based sketch. I really did try to break away from the “scripted” type of sketch, but something was driving me to it, and before I knew it I was waist-deep choreographing every step of my code.

As I got a few steps further, the process started to get very frustrating. When I would save and run the sketch in the editor, everything would be synced up accordingly. When I would run the HTML file in the browser, however, the sketch would move much faster at certain points and much slower at others. It lost sync with the music, and when a visual is clearly intended to express musical synchronicity, it is almost painful to watch it out of sync.

Again, this brings me back to the realization that p5 is meant to be interaction-based, rather than time-based. Regardless, the *finished (*actually very unfinished) product is shown below.

Lounge Visual

The following visual is one I made to a favorite song of mine: Moon Tattoo. I wanted to create a time-based sketch, in other words, one that would operate without the prompt of a mouse. I used the built-in variable “frameCount” to do this. I have my sketch running at 40 frames per second. In order to assign “if” statements to my sketch in a time-based manner, I assigned each statement to a frame (i.e. “if (frameCount > 1270) { background(0); }”). This was working very well until I went to upload it in the browser.
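A small helper makes those cue points readable; the 1270 in the example above is just 31.75 seconds times 40 frames per second:

```javascript
// Convert a time in the song to the matching frame number.
const FPS = 40; // frameRate(40) in setup()

function frameFor(seconds) {
  return Math.round(seconds * FPS);
}

// e.g. a cue at 31.75 seconds into the song:
// if (frameCount > frameFor(31.75)) { background(0); }
```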

When uploaded, something delayed the sketch after a certain point. About halfway through, the frames started running slower than 40 per second, even though they were set to that speed, and this made the visual lose sync with the music. This was upsetting because the whole point was to create a moving image that captured the beat of the music. So when, for example, the strobe light hits the screen after the beat drops, the effect is null. All this being said, there has to be an easier way of syncing a time-based sketch up with music. How, though?
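One possible answer, assuming the song is loaded with p5.sound: instead of counting frames (which drift whenever the browser can’t hold 40 fps), read the audio clock each frame and derive the cues from it, so the visuals can never fall behind the music. The cue times and filename here are illustrative:

```javascript
let song;

function preload() {
  song = loadSound('moon-tattoo.mp3'); // hypothetical filename (p5.sound)
}

// pure helper: which scene is active at a given time in seconds
// (cue times are made up for illustration)
function sceneAt(t) {
  if (t < 10) return 'intro';
  if (t < 31.75) return 'build';
  return 'drop'; // strobe section
}

function draw() {
  // song.currentTime() tracks the audio itself, not the frame counter
  const scene = sceneAt(song.currentTime());
  if (scene === 'drop') {
    background(frameCount % 2 === 0 ? 255 : 0); // strobe stays on the beat
  }
  // ...draw the other scenes...
}
```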

Another side note: every 30 seconds of this sketch required about three hours. THIS TOOK A LONG TIME. What felt like 2 minutes of code actually amounted to much less. I think this also had to do with the intensity of the song. This particular song, “Moon Tattoo” by Sofi Tukker, has a very driving beat with a suspenseful beginning, and I felt I had to match that in my sketch.

I’d really like to add actual photos into the sketch and have them come in and out by filling the screen with a black rectangle of varying opacity, but when I uploaded the photo, the sketch wasn’t able to load. Was this just too much data for the sketch to handle? Is that possible? I figure that shouldn’t be the case, but I wasn’t sure what the problem was.

All of these issues aside, here is the unfinished finished product:

Creating an Arduino Melody

Oct 5, 2016

When I first set out to create a melody through Arduino code, I thought it would be extremely difficult. I came to understand that it is pretty straightforward, especially if you have prior experience with a musical instrument. Granted, my code only generates a very short, simple melody, and the possibilities for generating music span way further than what I have done this week. I understand that. However, I was pleasantly surprised when the sounds coming out of the speaker actually sounded like a melody I once made up on the piano.

The Arduino library of musical notes was puzzling at first. Each note is written with the word “NOTE,” an underscore, the note itself (e.g. C, B, A), and a number (1–8). If the note is a sharp, the letter “S” is placed between the letter and the number. I came to discover, through trial and error, that the number represents the octave in which the note is played: the higher the octave, the higher the number. Each of these notes represents a different frequency value. Arduino has a pre-constructed library of pitches that a coder can use so they don’t have to go through the trouble of creating a name for each frequency. This is what I used.
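That octave-to-number relationship is a doubling of frequency per octave, which can be sanity-checked with a quick calculation (written here in JavaScript for convenience; the rounded results match the constants in Arduino’s pitches.h, e.g. NOTE_C4 = 262 and NOTE_A4 = 440):

```javascript
// Equal temperament: frequency doubles every octave (12 semitones),
// anchored at the standard reference A4 = 440 Hz.
function noteFreq(semitonesFromA4) {
  return 440 * Math.pow(2, semitonesFromA4 / 12);
}

// noteFreq(0)   -> A4 = 440 Hz
// noteFreq(12)  -> A5 = 880 Hz (one octave up, number goes up)
// noteFreq(-9)  -> C4 = ~262 Hz (middle C)
```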


So I started with a mid-range octave: 4 (i.e. NOTE_“letter”4). There was variance between 3 and 4 for some notes, but the majority were played at the 4th octave. The next step was to replicate the string of notes I am used to playing on the piano: C, G, F, E, D, C | C, G, F, E, D, C | C, G, F, E, D, C, D, E, F, G, F, E, D, C | C, B, A, B, C | C, B, A, B, C | C, B, A, B, C, D, E, D, C, B, A, B, C, D, E, D, C, C. These are the opening notes to one of my favorite songs that I play on the piano, a song I made up a few years ago.

The next step was to set the duration of each note: in other words, how long each note should be played. Through trial and error I came to discover that the higher the number, the shorter the duration of the note, and the lower the number, the longer the duration. For instance, a note with a duration value of 32 will be a very fast sound, whereas a duration value of 1 will be drawn out longer. This was important for my melody because without varying note durations, the melody is lost; it just sounds like a bunch of notes running in a marathon. The duration variance gives the melody its dynamics.
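This is the convention in Arduino’s standard toneMelody example: the duration number divides a whole note, so tone() plays each note for 1000/duration milliseconds. A quick sketch of the arithmetic (in JavaScript for convenience):

```javascript
// Duration numbers are note divisions: 1 = whole note, 4 = quarter,
// 32 = thirty-second. Bigger number -> shorter sound.
function noteMs(duration, wholeNoteMs = 1000) {
  return wholeNoteMs / duration;
}

// noteMs(1)  -> 1000 ms, drawn out
// noteMs(4)  -> 250 ms, a quarter note
// noteMs(32) -> 31.25 ms, a very fast sound
```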


The next step was to create corresponding lights. The entire melody consisted of the notes A–G, which is seven notes. I decided to place the lights in order, via a parallel digital OUTPUT circuit, to represent each note in the same order it would appear on a piano (i.e. when watching the video, the green LED all the way to the right is A, the yellow LED to its left is B, the red LED to its left is C, and following are D, E, F, and G). So after creating an Arduino code that mapped each note to an LED, the LEDs lit up according to the note being played. So if one were to watch the LEDs’ light pattern, they could translate it directly to a piano and play the exact melody (probably with a nicer, more soothing tone). This would be an easy way to read music.
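The mapping itself is just a lookup from note letter to LED pin; a sketch of the idea (in JavaScript for convenience, and the pin numbers are hypothetical):

```javascript
// One LED per note letter, wired in piano order (hypothetical pins).
const ledPinFor = { A: 2, B: 3, C: 4, D: 5, E: 6, F: 7, G: 8 };

// The note name's first character picks the LED, so NOTE_C4 and
// NOTE_C5 light the same "C" LED regardless of octave.
function ledForNote(noteName) {
  return ledPinFor[noteName[0]];
}
```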

A side note: I didn’t create a switch to initiate the melody, so when going through all of my troubleshooting and having to play the melody repeatedly at ITP, some people actually began humming the melody without thinking. That’s my song they’re singing!!!! How cool!!!! 🙂 This made me happy.

In the future, with more time, I would like to create a switch for this circuit, and also a variable resistor that could manipulate the duty cycle and allow the user to control the volume.

(In the following aerial view video, the note durations are a bit off towards the end.)

After finally getting my hands on some “Bare Conductive” Electric Paint, I was able to begin the kind of experimentation I am very interested in: physicality in circuitry. I volunteered my sister to be my guinea pig. I told her the chances of her getting electrocuted were only slim, and so she agreed to the role. I put a dab on her pointer fingers and thumbs and then painted a small connecting line between them. She then held both ends of the wires, which are usually connected via the graphite of the drawing. IT WORKED!!!! The LEDs lit!!!!! Well… barely… but it worked! I think the amount of conductive material (paint) was too small for this circuit, because the LEDs only lit up dimly; however, it was really exciting to see that the possibility of lighting an LED with the body definitely exists!

I did do some research prior to experimenting on my sister, and I found that there was actually no chance of electrocuting the user as long as the voltage was five volts or less. This came as a disappointment to me, however, because does that mean I can’t use any circuits that run on more than five volts? I would LOVE to one day choreograph a piece where the dancer moves about the stage and connects various circuits that light up different stage lights or perhaps sound different musical notes. Would this be possible? Or would the dancer run the risk of being electrocuted? This is probably an overreach right now, but one day achieving something like that would be a dream.

Lastly, as per the lab completed in class, below is a video of an input and output analog circuit. I used a potentiometer as a variable resistor to produce an input value. I had to map the standard analog input range of 0–1023 to 0–180 because a Servo moves based on angles, generally within a range of 180 degrees.
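That mapping is a straight linear rescale, the same arithmetic Arduino’s map() performs (though Arduino’s version truncates to integers). A quick sketch in JavaScript:

```javascript
// Linearly rescale v from [inMin, inMax] to [outMin, outMax].
function rescale(v, inMin, inMax, outMin, outMax) {
  return outMin + ((v - inMin) * (outMax - outMin)) / (inMax - inMin);
}

// potentiometer reading 0-1023 -> servo angle 0-180
// rescale(0, 0, 1023, 0, 180)    -> 0 degrees
// rescale(1023, 0, 1023, 0, 180) -> 180 degrees
```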