
The AutoSpiel: Computer Engineering Students Merge Disciplines To Automate Music

Computer Science and Engineering students merged software and hardware to develop a self-playing glockenspiel to perform songs that include the "Aggie War Hymn," "Silent Night" and "Sandstorm" by Darude.
By Felysha Walker, Texas A&M University College of Engineering June 3, 2019

The AutoSpiel’s circuit board and microcontroller.

Jonathan Westerfield


Three students from the Department of Computer Science and Engineering at Texas A&M University applied their creativity, experience and interdisciplinary classroom education toward developing a self-playing glockenspiel.

In their microcomputer systems course, students were given $100 each and challenged to create a group project that successfully merged software and hardware.

In anticipation of this course, computer engineering senior Jonathan Westerfield had already orchestrated a project idea.

“I can’t play piano, but I like to listen to piano music,” said Westerfield. “I thought, what if I made an automatic piano player so I can listen to the songs that I want to hear?”

With a total budget of $300 and only a month to develop their project, Westerfield and fellow computer engineering seniors Alejandra Sandoval and Kenneth Obkirchner scaled back their ambitions and applied their idea to the smaller, more cost-effective glockenspiel.

Turning an idea into innovation

Taking the lead on the hardware side of the project, Westerfield used solenoids as the physical mechanism for playing the AutoSpiel, the name the team gave their project.

“A solenoid is basically a small coil of copper wire with a rod in the middle of it,” explained Westerfield. “When you apply electricity through it, it shoots. Then a spring puts it back into place.”

In his design, he aligned a solenoid above each of the glockenspiel’s 32 keys so that, when activated, each solenoid would strike the key beneath it and play a note. Putting this physical component into action required Westerfield to draw on his classroom education in electrical engineering courses.
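In practice, a single strike is just a short pulse of current: the coil is energized long enough for the rod to hit the key, then released so the spring can pull the rod back before the next note. The sketch below illustrates that idea in Python; the channel driver, channel numbering and pulse length are illustrative placeholders rather than the team’s actual values.

import time

STRIKE_MS = 15  # hypothetical pulse length: long enough to strike the key, short enough for the spring to reset

def energize(channel):
    # Placeholder for whatever driver circuit pulls one solenoid channel high.
    print("channel", channel, "coil on")

def release(channel):
    # Placeholder for de-energizing the coil so the spring retracts the rod.
    print("channel", channel, "coil off")

def strike(channel):
    # Fire one solenoid: a brief pulse strikes the key, then the spring pulls the rod back.
    energize(channel)
    time.sleep(STRIKE_MS / 1000)
    release(channel)

# One solenoid per key, so 32 keys means channels 0 through 31.
strike(0)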

As a Texas A&M engineering student, Westerfield utilized resources in the new SuSu and Mark A. Fischer ’72 Engineering Design Center to design and fabricate a circuit board. He then added a socket to the board to hold the project’s microcontroller (a compact, single-chip computer) and built a platform out of an old ping pong table to station the Raspberry Pi (a credit card-sized, barebones computer) and the display.

Solenoids above each of the glockenspiel’s keys.

Jonathan Westerfield

Converting chords to code

Focusing on the first phase of the project’s software, Sandoval developed user-friendly programs for the AutoSpiel that allow anyone to insert a USB drive loaded with songs and press play. Doing so required knowledge of programming and familiarity with the Raspberry Pi.

“The Raspberry Pi is the (mini) computer that controls everything,” explained Sandoval.

Skilled in the programming language Python, Sandoval programmed the Raspberry Pi to handle two tasks: functional controls and music conversion.

Her coding allowed the Raspberry Pi to register when the play, pause, previous and next buttons were pressed, act on that command and display the corresponding song title on the small LCD. Her programming also initiated the process of playing a song by converting the song’s notes and times to data (think ones and zeros) and sending that data over a wire to the microcontroller.
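As a rough illustration of that hand-off, the Python sketch below packs each note into a small binary frame and writes it to a serial link with the pyserial library. The three-byte frame layout, port name and baud rate are assumptions made for the example, not the team’s actual protocol.

import struct
import serial  # pyserial; assumes a UART link from the Raspberry Pi to the microcontroller

# Hypothetical song data: (key index 0-31, note duration in milliseconds).
SONG = [(0, 250), (4, 250), (7, 500)]

def send_song(port="/dev/serial0", baud=9600):
    # Convert each (note, duration) pair to bytes and stream it to the microcontroller.
    # Opening the port will fail unless the hardware is actually attached.
    with serial.Serial(port, baud, timeout=1) as link:
        for note, duration_ms in SONG:
            # One frame per note: an unsigned byte for the key index,
            # then an unsigned 16-bit duration, big-endian.
            frame = struct.pack(">BH", note, duration_ms)
            link.write(frame)

send_song()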

The AutoSpiel’s display.

Jonathan Westerfield

Making sounds with software

Heading the second phase of the project’s software, Obkirchner blended the software with the hardware to automate the music.

Working with the microcontroller, the brain of the circuit board, Obkirchner wrote a complex C program that allowed the microcontroller to read the data like sheet music. Running his embedded code, the microcontroller coordinated which notes were played and when to play them.

When it was time for a note to be played, Obkirchner’s code switched on a transistor, letting current flow through Westerfield’s circuit board and activate the appropriate solenoid to strike the key below it.
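The team’s firmware was written in C, but its control flow can be sketched in Python for readability: read one frame at a time, fire the matching solenoid channel, and hold for the note’s duration before moving on. The frame layout mirrors the assumed three-byte format above, and the channel driver and pulse length are again placeholders.

import struct
import time

STRIKE_MS = 15  # hypothetical pulse length for a single strike

def fire_channel(channel):
    # Placeholder for switching on the transistor that drives one solenoid.
    print("strike channel", channel)

def play_stream(data):
    # Read three-byte frames like sheet music: which key to strike and how long to hold before the next note.
    for offset in range(0, len(data), 3):
        note, duration_ms = struct.unpack_from(">BH", data, offset)
        fire_channel(note)                        # pulse the solenoid over that key
        time.sleep(STRIKE_MS / 1000)              # let the strike land and the spring reset
        time.sleep(max(duration_ms - STRIKE_MS, 0) / 1000)  # wait out the rest of the note

# Demo: three frames in the same assumed format as the Raspberry Pi sketch above.
demo = struct.pack(">BH", 0, 250) + struct.pack(">BH", 4, 250) + struct.pack(">BH", 7, 500)
play_stream(demo)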

To hear how their work came together, listen to songs from the team’s demo video.

Kenneth Obkirchner, Alejandra Sandoval and Jonathan Westerfield.

Photo courtesy of Jonathan Westerfield

The music makers

Kenneth Obkirchner, embedded programming

“What I really liked about the course was that, since we’re all computer engineering majors, this is the course where we actually combined all of our electrical engineering classes and software from our computer science classes into a very unique project.”

Alejandra Sandoval, music conversion and functionality

“I learned that, after three years, I can do stuff like hardware and software at the same time, working as a team to build a project without any instructions.”

Jonathan Westerfield, hardware design and project manager

“I enjoyed watching the project go from just an idea in my head to something that not only worked but was also such high quality. And then the people aspect, this project would not have worked without my team.”

This article by Felysha Walker originally appeared on the College of Engineering website.
