Tunatic: Yet Another Form of User Generated Metadata
In our [Automatic Metadata Generation Framework](http://ariadne.cs.kuleuven.be/amg), we try to generate metadata automatically, so that people do not have to fill in electronic forms – because ‘electronic forms must die’ 🙂
[Tunatic](http://www.wildbits.com/tunatic/) represents another promising approach to obtaining metadata without making people’s lives more complicated. It is a small, free application that identifies any song you make it “listen” to over your computer’s microphone. Basically, it generates a fingerprint from 30 seconds of the song and matches that against its database. It is quite effective at this, as reported by [Macworld](http://www.macworld.com/weblogs/macgems/2005/12/tunatic/index.php?lsrc=mwrss).
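Tunatic’s actual fingerprinting algorithm is not public, but the general idea behind acoustic fingerprints can be sketched in a few lines: reduce the audio to a sequence of robust spectral features (here, the dominant frequency bin per frame, via a deliberately naive DFT) and hash that sequence into a compact key you can look up in a database. This is a toy illustration, not Tunatic’s method:

```python
import cmath
import hashlib
import math

def spectral_peaks(samples, frame_size=64):
    """Naive per-frame DFT; return the dominant frequency bin of each frame."""
    peaks = []
    for start in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[start:start + frame_size]
        mags = []
        for k in range(frame_size // 2):
            s = sum(x * cmath.exp(-2j * cmath.pi * k * n / frame_size)
                    for n, x in enumerate(frame))
            mags.append(abs(s))
        peaks.append(max(range(len(mags)), key=mags.__getitem__))
    return peaks

def fingerprint(samples):
    """Hash the peak sequence into a compact, comparable fingerprint."""
    return hashlib.sha1(bytes(spectral_peaks(samples))).hexdigest()

# A pure sine tone as a stand-in for 30 seconds of recorded audio.
def tone(freq, sample_rate=8000, length=512):
    return [math.sin(2 * math.pi * freq * n / sample_rate) for n in range(length)]
```

The same input always yields the same fingerprint, so two recordings of the same tone match, while a tone at a different frequency hashes to a different key. Real systems add a lot on top of this (noise robustness, partial matching, time alignment), which is what makes identification over a microphone hard.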
So, where does Tunatic get its fingerprints from? Well, the Tunatic database is fed through [Tunalyzer](http://www.wildbits.com/tunatic/tunalyzer.html). This program scans your computer for music: when it finds a song that is not in Tunatic’s database yet, it analyzes it and sends its audio fingerprint and metadata (title, artist, etc.) to the Tunatic server. Thus, that song can later be identified by other Tunatic users.
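The scan-and-submit loop described above can be sketched as follows. Everything here is a hypothetical stand-in – the real Tunatic server and its protocol are not public – but it shows the division of labour: Tunalyzer feeds the database, Tunatic queries it:

```python
import hashlib

# Hypothetical in-memory stand-in for the Tunatic server's database.
server_db = {}

def fingerprint(audio_bytes):
    # Stand-in: a real system hashes acoustic features, not raw bytes.
    return hashlib.sha1(audio_bytes).hexdigest()

def tunalyze(library):
    """Scan a {filename: (audio, metadata)} library and submit songs
    the server has not seen yet, as Tunalyzer does."""
    submitted = []
    for name, (audio, meta) in library.items():
        fp = fingerprint(audio)
        if fp not in server_db:      # new to the database?
            server_db[fp] = meta     # send fingerprint + metadata
            submitted.append(name)
    return submitted

def identify(audio_bytes):
    """The Tunatic side: match a fingerprint and return the metadata."""
    return server_db.get(fingerprint(audio_bytes))

library = {"song.mp3": (b"riff", {"title": "Example", "artist": "Someone"})}
tunalyze(library)
```

After one user’s Tunalyzer has submitted a song, `identify` succeeds for every other user; running `tunalyze` again submits nothing, since the song is already known.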
This is very cool and can easily be extended to learning object metadata: a LOMalyzer could generate fingerprints for all learning objects in your LMS that are not yet in its repository. It could then generate metadata from the context in which each learning object is deployed in your LMS – we can already do that for Blackboard, Moodle and Ines – and feed that into its repository. LOMatic could similarly suggest learning objects from that same context while you are working in your LMS, either as a learner or as a teacher.
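To make the LOMalyzer/LOMatic idea concrete, here is a minimal sketch. All names and metadata fields are hypothetical – this is not the AMG framework’s API or any LMS interface – but it shows how deployment context can double as both the source of metadata and the key for suggestions:

```python
# Hypothetical stand-in for the LOM repository: fingerprint -> metadata.
repository = {}

def lomalyze(fingerprint, lms_context):
    """Register a learning object, deriving metadata from the context
    in which the LMS deploys it (course title, resource name, and so on)."""
    if fingerprint not in repository:   # only objects not yet in the repository
        repository[fingerprint] = {
            "title": lms_context["resource_name"],
            "course": lms_context["course_title"],
        }

def suggest(lms_context):
    """The LOMatic side: suggest objects deployed in a similar context."""
    return [m["title"] for m in repository.values()
            if m["course"] == lms_context["course_title"]]

lomalyze("fp-1", {"resource_name": "Intro slides", "course_title": "Physics 101"})
lomalyze("fp-2", {"resource_name": "Lab manual", "course_title": "Physics 101"})
```

With those two objects registered, anyone working in a Physics 101 course would see both suggested, while an unrelated course yields nothing – the metadata came entirely from where the objects were deployed, with no forms filled in.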
Drop me a note if you know of any other such applications or if you want to work with us on exactly this idea…