
Low budget syncing multi-session video and audio with Cubase and GoPro Hero4?


PViking


I hope you audio and video professionals will excuse me for seeking your advice on this forum; if there's a better place for me to ask, please let me know. I'm an amateur musician trying to combine my multi-track song recordings and performances with multiple video angles. I'm learning video and timecode (I'm still a video newbie), and I'm looking for a low-budget way to get video synced with audio in a rather klutzy but unique multi-track situation. My main issue is how to get subsequent takes (other instruments) video-recorded in sync with the initial one while using non-timecode equipment and software. I have some ideas, but I need feedback before spending more money. It's fairly complicated, and I need to explain fully what I'm attempting. I'm open to hearing alternatives even if I can't afford them, but I'm mainly hoping for a low-budget solution I can share with other home recording amateurs.

I currently record audio in Cubase (Artist) 8.5 and play several instruments (guitars, bass, vocals mainly), overdubbing on several tracks, on Windows 10.

I have two GoPro Hero4 Silver cameras and a Smart Remote from GoPro. Some of you might stop reading right now :-)  The cameras pair wirelessly to the remote, and I can start them recording "at the same time," but I don't have specs from GoPro yet on the actual variance in start times, assuming there is one. The synced start time matters a lot, of course. When I play a fast guitar part (I'm not great, but can play fairly fast at times), if the audio and video are out of sync by even a tiny amount it's painfully obvious to me, and it gets worse in slow motion. Sorry I can't quantify this. I've started experimenting with 60 fps for more accuracy, but that hasn't fixed things.

I built a prototype attachment for a guitar that I mount the two GoPros on and record from a player's perspective. Others have done something similar, but it's a fairly new idea. The main reason is to show what it's like to play, i.e. looking down at the guitar and seeing my hands from this player's POV. It might be useful for people making guitar tutorials, and I plan to share the idea on YouTube and anywhere else people might like it. I just started experimenting with this recently.

So I mount the cameras on my guitar, pair them to the Smart Remote, point one at each hand, strap the guitar on, start Cubase recording a track from a mic, and start the cameras recording by pushing the button on the remote. If I get a good take, I import the two MP4s into my cheap consumer video editor (Magix Video Editor 'Pro') and then import the audio from Cubase. Trimming everything into sync is another topic, but it can be done. The two video files are fairly well synced to each other. Of course the video editor snaps to one-frame accuracy when dragging pieces around, which was kind of a shock after the unlimited movement you get in audio. I listen to the camera audio to help find alignment points in the song when dragging video; it's not too bad.

The real issue: for a subsequent instrument track, e.g. bass, I could repeat this process, but the video frames would not line up with the first set of angles, since the GoPros don't have timecode and neither does my editor.

One idea is to use a MIDI signal from Cubase to trigger the cameras at the same point in the song for every shoot (including the initial one), but there is no add-on product for these cameras (that I know of) that will accept timecode or any external control signal, including the not-yet-available SyncBac (which will only record the same timecode on all connected GoPro cameras during a single shoot). Also, I read that the 'trigger' pin on the GoPro 30-pin connector is output-only, or I would look into that.

For a possibly hokey way through this, there's a product from MIDI Solutions called the Relay (about $150) which closes a fast reed relay (about 2 ms) when it receives whatever MIDI message you program it for, and I could control the on/off timing and duration. I could solder fine wires internally to the pushbutton switch in the remote and connect them to the relay. This would allow Cubase to start the two cameras at the same point in the song, for the initial track and for every new-track session. Any latency would hopefully be repeatable, so the second set of angles would be off by the same amount, which is probably alright, even preferred.
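To make the idea concrete, here's a rough Python sketch (using the mido library) of the kind of MIDI message this would hinge on, so the relay and the remote-button mod could be bench-tested outside Cubase. It assumes the relay has been programmed to close its contacts on Note On and release on Note Off; the port name, note number, and hold time are placeholders, not anything specific to MIDI Solutions' defaults.

```python
# Bench-test sketch: send the Note On / Note Off pair the relay would
# respond to, without involving Cubase at all.
# Assumptions: mido plus a backend (e.g. python-rtmidi) is installed, and
# the relay is programmed to close on Note On and open on Note Off.
import time
import mido

PORT_NAME = "My MIDI Interface"  # placeholder; list real names with mido.get_output_names()
NOTE = 60                        # whatever note the relay is programmed for
HOLD_SECONDS = 0.5               # how long to "hold the button" on the Smart Remote

with mido.open_output(PORT_NAME) as port:
    port.send(mido.Message("note_on", note=NOTE, velocity=127))   # close relay -> press button
    time.sleep(HOLD_SECONDS)
    port.send(mido.Message("note_off", note=NOTE, velocity=0))    # open relay -> release button
```

Inside Cubase itself this would just be a note event parked at a fixed bar/beat on a MIDI track routed to the interface that feeds the relay; the sketch is only for testing the hardware chain without the DAW.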

It seems to this newbie that, while not as good as timecode control through the whole song, the important thing here is to get the frame starts of the subsequent shoot to coincide with the frame starts of the first shoot, relative to the song. It could be frame 525 of the first set of angles and frame 491 of the second set where I start playing, but if the frames are all aligned to 'real time,' the video would show my hands playing in sync once dragged to the correct spot in the timeline. The audio would be "in sync" with the overall song tempo as long as I play my instrument correctly while recording in Cubase and listening to the previous track on headphones, but that's external to this and a performance issue. I would not look at any video output while recording, only listen to the prior audio (and maybe a metronome) on headphones.

In case you are wondering, I am avoiding lip-syncing (and the instrument equivalent) in most cases because I would prefer this to be authentic. But arguments in favor are certainly welcome!

I hope you have been able to follow my rambling; I've tried to be clear. Am I right in trying to get the frame starts to align? I hope someone can comment or offer suggestions on other ways to accomplish this. Any ideas about a theoretical unlimited-budget solution would also help, i.e. what do the pros do? It might lead to other ideas. THANKS FOR YOUR TIME!!

 


My initial thought would be to get one of the accessory leads that allow external 3.5 mm mic input into the GoPros. Feed a mono audio mix out of your Cubase system into each camera, using the original take's track for each overdub (so all overdubs have the same audio attached to the picture, not their own). You'll probably need to pad down the audio level and use a very lightweight cable to feed the camera on the guitar. Each overdub will be recorded to a track in Cubase, of course. Use PluralEyes software (not expensive) to line up all the camera takes using the common audio track, assuming you can work out a workflow with your video editing software; I'm not an expert on that. Once the edit's done, import the guide audio track into your audio editor, waveform-match the final mix of the multitrack audio to it (this can't be done in the video NLE, as you've discovered), cut it to exactly the same length, and replace the video edit's soundtrack with your mix. That's the basic workflow. The details depend on the specifics of the DAW and NLE, neither of which I'm familiar with.
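If PluralEyes turns out not to fit the budget, the same idea can be roughed out by hand: export the camera's scratch audio and the Cubase guide track as WAVs and cross-correlate them to find the offset. A minimal Python sketch of that, assuming NumPy, SciPy and the soundfile package are available and with placeholder file names:

```python
# Rough DIY alternative to PluralEyes: find the offset between a camera's
# scratch audio and the Cubase guide track by cross-correlation.
# Assumptions: both files exported as WAV at the same sample rate.
import numpy as np
import soundfile as sf
from scipy.signal import correlate, correlation_lags

guide, sr_g = sf.read("cubase_guide_mix.wav")        # placeholder file names
camera, sr_c = sf.read("gopro_scratch_audio.wav")
assert sr_g == sr_c, "resample one file first if the rates differ"

def prep(x):
    """Mono-ize and normalize so levels don't dominate the correlation."""
    x = x.mean(axis=1) if x.ndim > 1 else x
    return (x - x.mean()) / (np.abs(x).max() + 1e-12)

c = correlate(prep(camera), prep(guide), mode="full")
lags = correlation_lags(len(camera), len(guide), mode="full")
lag = lags[np.argmax(c)]
print(f"best-fit offset: {lag} samples = {lag / sr_g:+.3f} s")
```

The lag sign convention is easy to get backwards, so sanity-check the first result by ear before trusting it for a whole batch of takes.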


Thanks for the reply, nickkreich. That sounds like a good way to align tracks, but you might have missed a point I was asking about. I think if I shoot using your method alone, it would be hit and miss as to whether the frame leading edges (what I called frame starts) would be alignable, because there's no timecode involved. Are you saying that doesn't matter? Using your method, the offset would be randomly distributed. What did you think of the way I mentioned to start the cameras rolling at the same point in the song? I probably didn't describe it correctly because I don't know the right terminology. But I think that's one of the features of using timecode, that frames start together, right? I mean, that not only makes the camera-angle sync points easy to find, but the frames also line up referenced to what I called real time.

If I randomly start shooting a 30 fps second take (no timecode), then the second series of frames could be off, relative to the first take, by as much as half a frame, i.e. half of 1/30th of a second, or roughly 16.7 ms, and I would be limited by that when dragging the clips together on the timeline (either by hand or programmatically). If I could theoretically start the second series of frames so the frame leading edges were aligned in real time, wherever that point falls in the song, it would be much more accurate.
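To put numbers on that (just my own back-of-envelope check), here's the worst-case frame-boundary misalignment at the frame rates I'm using, assuming the second take starts at a random point within one frame period:

```python
# Worst-case frame-boundary misalignment between two free-running takes,
# assuming the start offset is uniformly random within one frame period.
for fps in (30, 60):
    frame_ms = 1000.0 / fps
    print(f"{fps} fps: frame period {frame_ms:.1f} ms, "
          f"worst-case boundary offset {frame_ms / 2:.1f} ms")
# 30 fps -> 33.3 ms frames, up to ~16.7 ms off
# 60 fps -> 16.7 ms frames, up to ~8.3 ms off
```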

But I'm probably missing something.

EDIT: I drew a picture to help explain what I'm thinking.

[Attached image: Frame start alignment.png]

Edited by PViking
Added image

My initial thought is simply to put a 2-pop at the beginning of your session and align each take to that. It would be a simple way of syncing each overdub. I am not familiar with modern incarnations of Cubase, so I don't know what its video editing capabilities are, but if you were to use the audio track of your GoPro as a guide, the pop at the beginning of each session should in theory serve as an easy reference point.
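In case the term is new: a 2-pop is a single short beep of 1 kHz tone, traditionally one frame long, placed two seconds before the start of the program. If you'd rather generate one as a file than record it, here's a minimal Python sketch (NumPy plus the soundfile package; the file name, sample rate and frame rate are just assumptions):

```python
# Minimal 2-pop generator: one video frame of 1 kHz tone followed by silence,
# so the whole clip is 2 seconds long and the program starts right after it.
import numpy as np
import soundfile as sf   # assumption: the soundfile package is installed

SAMPLE_RATE = 48000      # Hz
FPS = 30                 # make the pop one video frame long

t = np.arange(int(SAMPLE_RATE / FPS)) / SAMPLE_RATE
pop = 0.5 * np.sin(2 * np.pi * 1000.0 * t)          # 1 kHz tone, roughly -6 dBFS
tail = np.zeros(2 * SAMPLE_RATE - len(pop))         # pad out to exactly 2 seconds
sf.write("two_pop.wav", np.concatenate([pop, tail]), SAMPLE_RATE)
```

One way to use it: drop it at the head of the Cubase session and make sure it plays through a speaker the cameras can hear, so the pop shows up in each GoPro's scratch audio and every take can be snapped to it.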

