Sound in Multimodal Texts

The two multimodal texts I examined were Jason Nelson’s “Conversation: Injury Analysis” and Young-Hae Chang’s “The Last Day of Betty Nkomo”.

These two texts use sound very differently. Jason Nelson’s “Conversation” is multi-layered, letting you stack different narratives on top of each other. It gives you control over each narrative’s volume, including separate left and right stereo levels, and lets you turn any narrative off entirely. If you want, you can listen to one story at a time, or play all of them together. The use of sound is definitely more complex in this piece, with the added control over the stereo channels.

Young-Hae Chang’s “The Last Day of Betty Nkomo” is fully controlled by the artist: once you click play, you have no control over how the story is told. The most interesting thing about this piece is its rhythm. The words flick in and out in time with the background music, creating an odd cadence that makes it easier to immerse yourself in the words and the story. The story almost follows the music here, and the words seem to act as a complement to it.

The two pieces differ in how much control we get as the audience, and also in their use of music. “Conversation” doesn’t use music at all, relying only on voices and the technical controls it offers. In “The Last Day of Betty Nkomo”, by contrast, we get almost no control, and the piece relies heavily on the music to carry the story.

