I love finding mentions of JFugue in articles that are not music-based. Duncan Mac-Vicar has been playing with Scala and Xtend lately, and while using JFugue's Rhythm API he discovered that Xtend does not yet have character literals.
(BTW, Martin Odersky's Scala class on Coursera ("Functional Programming Principles in Scala") is in its fourth week now, and the class is excellent - I highly recommend it!)
Also posted to my idea2product blog
Monday, October 8, 2012
Saturday, October 6, 2012
Higgs boson data turned into music
Domenico Vicinanza has converted data from positrons passing through cloud chambers into music. He took the images of charged particles moving through deuterium (such as this one) and placed those images onto musical staves (see this page for more explanation). Later, composer Ben McCormack took the notes and adjusted some of the timing in the music, wrote countermelodies, and composed the following piece.
Tuesday, October 2, 2012
Today at JavaOne: Rewriting an Open Source Music Program in Scala
If you're at JavaOne this year, don't miss Brian Tarbox's talk, "Rewriting an Open Source Music Program in Scala." Brian will talk about his experience converting Log4JFugue to Scala. (Of course, Log4JFugue, which uses JFugue to let you listen to your log files, won the Duke's Choice Award at JavaOne 2010.) Brian's talk should prove to be intriguing: "The majority of the session is a hands-on, code-on-the-fly re-creation of the Scala version from scratch. You will see the differences between the languages and get a feel for coding in the functional paradigm. You will also understand that Scala need not be scary. No background in Scala is required."
Brian's talk is scheduled for Tuesday, October 2, 3:00pm - 4:00pm, in the Hilton San Francisco - Golden Gate 6/7/8
Wednesday, September 5, 2012
Experimenting with the EchoNest API
I've finally made it to a Boston Python User Group meeting - Sept Project Night! For my project tonight, I decided to experiment with the Echo Nest API, including the Echo Nest Remix API. If you're interested in following along, here are some steps:
- Make sure you have Python installed. If you want to play with the Echo Nest Remix API, it currently requires Python 2.6.
- Don't forget that you will also need setuptools to install Python packages.
- Get an API key from The Echo Nest. I'm on Windows, and I also set up an environment variable, ECHO_NEST_API_KEY, with my API key.
- There is a Python wrapper for the Echo Nest API. Get it from Github and install it using python setup.py install.
- Install the native installer for The Echo Nest Remix API from Google Code.
- Have fun!
# List artists similar to a given artist
from pyechonest import config, artist
config.ECHO_NEST_API_KEY = "YOUR API KEY"

bk = artist.Artist('boney m')
print "Artists similar to: %s:" % (bk.name,)
for similar_artist in bk.similar:
    print "\t%s" % (similar_artist.name,)

# List the hottest ("hottt") artists and their "hotttnesss" scores
from pyechonest import artist
for hottt_artist in artist.top_hottt():
    print hottt_artist.name, hottt_artist.hotttnesss
Friday, February 17, 2012
Randomness is not creativity
If you ever come across a system that purports to be creative but is, under the hood, driven by randomness, it is not creative. Randomness is a stand-in for creativity, a shortcut we take because we don't always know what it means to be creative, or what aspects of a topic a system can be creative about. Bach's "Musikalisches Würfelspiel" - sorry, not creative. Well, Bach was creative in composing the piece - fun idea (and I'm convinced that Bach would have loved JFugue) - but the music that comes out of the game has no special creativity beyond its initial creation. Aleatoric, yes. Ah, aleatoric - a more beautiful way to say "random."
Okay, now that I've put that in the open, let me state that, in the interest of science, you may see me produce some music that does use randomness as I seek to discover which aspects of musical creativity this generative system can actually be creative about.
Look at you, watching science in the making. Isn't this fun?
Experiments with generative beats
One of my goals is to create a fully generative radio station that people would love to listen to. So much research into generative music has focused on classical and jazz composition. I'm aiming more for rock, pop, hip hop, and electronica.
I've been putting a lot of thought into how a computer program might generate decent music. Part of this has involved hours listening to the structure and composition of modern music to figure out what makes it tick.
Without making too simplistic an assumption, a lot of music is a unique combination of simple elements. For example, the basic beat of most music is rather simple. There are notable exceptions - see Rush and Red Hot Chili Peppers for examples.
Many 4/4 beats use the template "1 and 2 and 3 and 4 and", where the 1 and 3 beats are frequently bass drums, and the 2 and 4 beats are often hand claps or something lighter. There are switch-ups to this pattern, for example the addition of a cymbal crash or the lack of 2, 3, and 4 beats. And there are stylistic tricks like turning a single bass strike into two quicker strikes (is that what drummers call it when you hit a drum? My music vocabulary lags my experimentation).
I've hand-programmed this into a JFugue Rhythm. I still have such a fun time with the Rhythm class in JFugue:
Rhythm r = new Rhythm();
// 1-a-2-a-3-a-4-a-
r.setLayer(1, "Q ....q.q.......");
r.setLayer(2, "....H ......H ..");

Where "Q" and "q" are bass drums (eighth and sixteenth durations), and "H" are hand claps (which are specified by calls to r.setSubstitution() that I've left out of this example for brevity).
And this one, as an alternate (OK, what do real musicians call that?):
// 1-a-2-a-3-a-4-a-
r.setLayer(1, "Q ....q.q.q.....");
r.setLayer(2, "....H ......H ..");
And this one, as a switch-up (again, vocabulary):
// 1-a-2-a-3-a-4-a-
r.setLayer(1, "Q ..............");
r.setLayer(2, "....H ......H ..");
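If you want to hear one of these beats on your own machine, here's a minimal sketch of how you might play a Rhythm. It assumes the JFugue 4-era API, where Rhythm has a getPattern() method and Player has play(); it also assumes you've added the setSubstitution() calls I left out above, and the class name is just illustrative:

import org.jfugue.Pattern;
import org.jfugue.Player;
import org.jfugue.Rhythm;

public class BeatDemo {
    public static void main(String[] args) {
        Rhythm r = new Rhythm();
        // 1-a-2-a-3-a-4-a-
        r.setLayer(1, "Q ....q.q.......");
        r.setLayer(2, "....H ......H ..");
        // ... r.setSubstitution() calls for 'Q', 'q', 'H', and the rest characters go here ...

        Pattern beat = r.getPattern();  // turn the rhythm layers into a playable Pattern
        beat.repeat(4);                 // loop the one-bar beat a few times

        Player player = new Player();
        player.play(beat);
    }
}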
You can listen to a composition of these beats (decent automated composition is another aspect of my generative music experiments - but I'm not focusing on that today).
The question I'm trying to figure out is how a program would determine that this is an acceptable, good-sounding beat. It could be that a program simply has a huge list of acceptable beats; my shock at how modern music borrows from other music certainly demonstrates this as a valid, if disappointing, approach (see WhoSampled's "The 10 Most Sampled Breakbeats of All Time"). But there's also something of a heuristic in here - the ability to turn a single long beat into two quicker beats, for example.
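To make that heuristic concrete, here's a rough sketch of how a program might apply the "turn one long hit into two quicker hits" trick directly to a rhythm layer string. It assumes the same character conventions as the beats above ("Q" = eighth-note bass drum, "q" = sixteenth-note bass drum, '.' and ' ' = rests), and the class and method names are just illustrative:

public class BeatHeuristics {

    // Replace an eighth-note bass hit (plus the rest slot after it)
    // with two sixteenth-note hits, keeping the layer the same length.
    static String splitLongHits(String layer) {
        return layer.replace("Q ", "qq").replace("Q.", "qq");
    }

    public static void main(String[] args) {
        String base = "Q ....q.q.......";
        System.out.println(base);                 // Q ....q.q.......
        System.out.println(splitLongHits(base));  // qq....q.q.......
    }
}

A generator could apply a handful of transformations like this to a seed beat and keep only the variations that still sound acceptable - which brings back the question above of how a program decides what "acceptable" means.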
Wednesday, January 25, 2012
Player.play() stops after 17 calls: A grueling, long-standing JFugue bug
If you use JFugue, it's possible that you've come across a bug where JFugue's Player.play() method stops playing after it has been called around 17 times. Brian Tarbox and I have been trying hard to track this one down... and we think the problem lies within the javax.sound.midi code.
It seems that Java's Sequencer.open() and Sequencer.close() methods might be happening asynchronously. As we were debugging this, we found that we'd get an IllegalArgumentException from a Sequencer that should have just been opened. It's possible that it was opened, but then the close() that was called shortly beforehand finally took effect.
The bug is Issue 49 on Google Code, in case anyone would like to lend a hand!
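In the meantime, one way to sidestep the repeated open()/close() cycle while debugging is to open a single Sequencer up front and reuse it for every play. Here's a minimal, standalone sketch of that idea in plain javax.sound.midi (this is not JFugue's actual code, just an illustration of the workaround):

import javax.sound.midi.MidiSystem;
import javax.sound.midi.Sequence;
import javax.sound.midi.Sequencer;

public class ReusedSequencerSketch {
    public static void main(String[] args) throws Exception {
        Sequencer sequencer = MidiSystem.getSequencer();
        sequencer.open();                        // open once, up front
        try {
            for (int i = 0; i < 30; i++) {       // well past the ~17-call failure point
                Sequence sequence = new Sequence(Sequence.PPQ, 120);
                sequence.createTrack();          // an empty track is enough for this sketch
                sequencer.setSequence(sequence);
                sequencer.start();
                while (sequencer.isRunning()) {  // crude wait for playback to end
                    Thread.sleep(20);
                }
                sequencer.stop();
            }
        } finally {
            sequencer.close();                   // close once, at the very end
        }
    }
}

If open() and close() really are completing asynchronously, keeping one Sequencer open for the life of the program takes that race out of the picture.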
Friday, January 13, 2012
Explore music and color with the Synesthesia Project
Andrew Hagerty has been working on the Synesthesia Project, which he describes as "A microtonal exploration that explores the link between music and color using the golden ratio (Phi)."
(and it so happens that phi is my favorite number - you can even see its central role in the Great Dodecahedron)
I'm so thrilled when I see people using JFugue for a project like this!
xpoles from Andrew Hagerty on Vimeo.
Log4JFugue 4 Scala
Brian Tarbox has been re-writing his award-winning* Log4JFugue project in Scala. Awesome!
You can read more about Log4JFugue in this article in PragPub Magazine: "...And Your Bugs Can Sing"
* - winner of the 2010 Duke's Choice Award for "Innovative Java for Developers"
New Project: jfugue-scala
There's a new JFugue-related project on Google Code: jfugue-scala
Earlier this year, Joshua Gooding (aka skavookie) started to create a Scala-based parser for JFugue's MusicStrings. There has been a lot of good work on this side project, but there's also a fair amount more to do. The work had been done under the main JFugue project, but given the project's infancy, I'd like to move it to its own project for now. When the Scala-based parser is more solid, I'll be happy to promote it to the main JFugue project.
In the meantime, if you're interested in JFugue and Scala, I invite you to check out and contribute to this project!
Introducing my music blog!
Welcome! This is my blog for all things related to my experiences with JFugue, my Generative Radio project (What's that? Keep an eye here to learn more), and my general experiments with music programming.
If you also run a blog or site about music programming, generative music, algorithmic music, or anything of a related vein, let me know and I'll put you in my blog roll.