On the Edge of the Dream
This album was recorded in 1994 using a Mac Quadra, Matrix 1000, Korg T1, Korg M1, Korg Wavestation, and various samples and sound effects. The entire performance took place in 1994 over 10 shows at the Calgary Planetarium (Alberta Science Centre), incorporating stars, slides, videos, 3D renders, and QuickTime movies in the early stages of the technology. A two-issue article in EQ Magazine from 1994 or 1995 describes the show. Ohama has recorded many albums, but this is my first one on CD Baby, which I'm running as a test. If things work out okay, I'll be putting more material here. Peace. Thanks.

*** Added Dec 2010: From EQ Magazine, a two-part article, circa 1995. This is part of the rough draft. I don't have a copy of this article, so if you have one, I'd like to see it!

INTRODUCTION

I recorded my latest CD '...on the edge of the dream' using Digital Performer, and designed the cover in Photoshop. The news of Peter Gabriel's XPlora CD-ROM and the recent advances in CD-ROM gave me an idea: why not do a concert created entirely with a Macintosh? During March of 1994 I performed 10 solo shows at the Alberta Science Centre's Planetarium. The performance incorporated QuickTime movies, 3D animations, Director movies, Digital Performer, Sample Cell, and a rack of MIDI modules.

THE ALBERTA SCIENCE CENTRE PLANETARIUM

I chose Calgary's Alberta Science Centre (ASC) Planetarium as a venue because I wanted to use video projectors, and a low ambient light level was of major importance. I wanted to envelop the audience with images and sound, somewhere between the game MYST and a virtual reality machine. The Planetarium is a dome-shaped structure with 360° of seating. There are three video projectors: one that points straight up to the centre of the dome, and two that point horizontally north and south. There are several banks of slide projectors. One bank projects a 12-slide panorama around the base of the dome; another bank projects "all-skies" that cover the entire dome.
There are several solo projectors, some lighting effects, lasers and, of course, the star projector. Three laser disc players, an S-VHS player, and all the slide projectors are controlled by the Omni-Q system (basically a visual sequencer), which is synced to SMPTE timecode recorded on an Otari 1/2" eight-track.

HOW I PLANNED TO DO THE SHOW

I realized that I wouldn't be able to run the show entirely from my Mac, but I wanted all the images to be produced on the Mac. After all, I reasoned, isn't that part of the promise of the multimedia computer? In reality, I knew that my Mac couldn't even play Digital Performer and a QuickTime movie simultaneously, but this was the seed of the idea.

First of all, I am not a graphics artist. I am not an interactive multimedia programmer. I am not a video artist, animator, or anything like that. I have basic skills, but I am mostly a singer/songwriter who has used a Mac-based MIDI studio since 1986.

CD-ROM. Multimedia. Virtual Reality. I set out to create a music project, both on CD and live, that would take advantage of the newest technology. But how much is hype, and how much is real? My experience with QuickTime, 3D rendering, and 32-bit color was limited to watching demos play on my Macintosh over the years. The technology supposedly lets people do things they couldn't do otherwise, and I was about to see how true that was.

I decided to use Digital Performer to play the music, including 800MB of audio, while I played one part on my keyboard and sang live. The Planetarium's Omni-Q system would control still images transferred to slides and moving images transferred to S-VHS and laser disc. Fred Boehli, the Omni-Q programmer, gave me a 1/2" analog 8-track tape with SMPTE on track 8. I was supposed to put a copy of my music on tracks 1 & 2 so that he could program the projectors without needing my equipment to play the music. I decided to put music on tracks 1 & 2 and a guide vocal on track 3.
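For reference, SMPTE timecode positions like the ones striped on track 8 are written hours:minutes:seconds:frames, which makes them straightforward to convert to and from absolute frame counts. Here is a minimal sketch of that arithmetic, assuming 30 fps non-drop-frame code (SMPTE also defines 24, 25, and 29.97 drop-frame rates); the `tc_to_frames`/`frames_to_tc` helpers are my own illustration, not part of the Omni-Q system or Digital Performer:

```python
# Sketch of SMPTE timecode arithmetic, assuming 30 fps non-drop-frame.
# Timecode format: HH:MM:SS:FF (hours, minutes, seconds, frames).

FPS = 30  # assumed frame rate for this illustration

def tc_to_frames(tc: str) -> int:
    """Convert 'HH:MM:SS:FF' to an absolute frame count."""
    h, m, s, f = (int(part) for part in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * FPS + f

def frames_to_tc(frames: int) -> str:
    """Convert an absolute frame count back to 'HH:MM:SS:FF'."""
    f = frames % FPS
    s = (frames // FPS) % 60
    m = (frames // (FPS * 60)) % 60
    h = frames // (FPS * 3600)
    return f"{h:02d}:{m:02d}:{s:02d}:{f:02d}"

# A 30-minute stripe (00:00:00:00 to 00:30:00:00) covers this many frames:
half = tc_to_frames("00:30:00:00")
print(half)                    # 54000 frames at 30 fps
print(frames_to_tc(half - 1))  # last frame of the stripe: 00:29:59:29
```

A visual sequencer locked to this stripe only needs the running frame count to know which cue to fire next.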
Tracks 1 & 2 could act as a backup just in case the sequencer went boom during the show (which it did). I slaved Digital Performer to SMPTE and started recording my audio to tracks 1 & 2. Fine. But at 42 minutes, the audio quit playing. The MIDI data kept playing, but no audio. I backtracked a song and tried again, but had the same problem in the same spot. It turns out (verified by MOTU) that Digital Performer will not play audio after 42 minutes when slaved to SMPTE.

To work around this problem, I laid down my own time code on track 7, going from 00:00:00:00 to 00:30:00:00 and then again from 00:00:00:00 to 00:30:00:00, with no pause between the two chunks of timecode. My CD has NO breaks or silence on it except at the halfway point, made to accommodate cassette production, and I put my live break there. Instead of one long sequence, I created two long sequences, and during the live show I had to go from one sequence to the other manually. A little hairy, but it worked.

FRAME-BY-FRAME or REAL TIME?

Early on, I had decided to record my visuals in real time from the Mac. This decision turned out to be the most important one I made. I went real-time because I didn't own a frame-accurate video recorder or a controller card like a DQ-Animaq. My budget allowed me to get a Video Vision card, and I already owned an S-VHS recorder. Anyhow, it seemed reasonable to me to go real-time from the Mac. After all, I had a Quadra 700, 20MB of RAM, and a 1.2 GB drive. I had seen some fairly smooth QuickTime movies and thought that by using Macromind Accelerator I could play smooth 3D animations. Going real-time offered two benefits: it was less expensive because it used equipment I already had, and I could see what my work looked like on a regular television as I created it.

Creating the stills: The slides used at the planetarium are either masked or.