This is not simply a "build an app" project: serious work is also needed on how to format and store the data, and on how tagging/annotation should work.
For example, when you record an audio file today, the document itself is tagged with a 'created' time, which may be the moment recording started or the moment the file was saved. The timecode inside the file carries nothing about real time, only the progression of the recording. And that is only the start of the problem: the recording clock frequency varies from one device to another, so if two recordings of the same event are not running at exactly the same rate, they will drift apart. It sounds crazy, but this comes up routinely when matching separately recorded audio to video, where the audio is captured at high quality and the video carries only reference-quality audio. This means we will have to develop a 'wrapper' for the raw audio (and later, video) recordings: one that leaves the raw data untouched but maps each second of it to real-world time.
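To make the drift concrete: a device clock that runs 100 ppm (0.01%) fast accumulates roughly 0.36 seconds of error per hour, easily enough to break audio/video sync. A minimal sketch of the wrapper idea might look like the following; the names (`RecordingWrapper`, `TimeAnchor`) and the linear-interpolation scheme are illustrative assumptions, not a settled design.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

@dataclass
class TimeAnchor:
    """Ties one position in the raw recording to a real-world instant."""
    offset_seconds: float   # seconds from the start of the raw file
    real_time: datetime     # wall-clock time at that position

@dataclass
class RecordingWrapper:
    """Hypothetical wrapper: the raw audio stays untouched; anchors
    relate positions in the recording to real-world time."""
    raw_path: str
    nominal_sample_rate: float              # what the device claims, e.g. 48000.0
    anchors: list = field(default_factory=list)

    def add_anchor(self, offset_seconds: float, real_time: datetime) -> None:
        self.anchors.append(TimeAnchor(offset_seconds, real_time))

    def real_time_at(self, offset_seconds: float) -> datetime:
        """Interpolate real-world time between the two nearest anchors,
        which also corrects for a clock that runs uniformly fast or slow."""
        a = sorted(self.anchors, key=lambda t: t.offset_seconds)
        lo, hi = a[0], a[-1]
        for i in range(len(a) - 1):
            if a[i].offset_seconds <= offset_seconds <= a[i + 1].offset_seconds:
                lo, hi = a[i], a[i + 1]
                break
        frac = (offset_seconds - lo.offset_seconds) / (hi.offset_seconds - lo.offset_seconds)
        return lo.real_time + (hi.real_time - lo.real_time) * frac

# Usage: a device clock running 0.01% fast means 3600 s of recorded audio
# actually spanned only 3599.64 s of real time.
w = RecordingWrapper("take1.wav", nominal_sample_rate=48000.0)
start = datetime(2024, 1, 1, 12, 0, 0, tzinfo=timezone.utc)
w.add_anchor(0.0, start)
w.add_anchor(3600.0, start + timedelta(seconds=3599.64))
mid = w.real_time_at(1800.0)
```

With only two anchors this assumes the drift is linear; in practice more anchors (e.g. periodic sync marks) would let the wrapper correct for a clock rate that wanders over the course of an event.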
We will likely employ graph databases for such problems; one contender is http://neo4j.com