Showing results for tags 'sync'.

Found 23 results

  1. Watch Full Video or read companion article at: https://henrirapp.com/sync-for-video-production/
  2. Hi all! I'm considering buying a single Tentacle Sync device for my sound kit. I know the magic with this device comes from the provided software: it interprets the timecode audio on the footage as timecode data and then syncs that up with my audio. But how do you use it when I'm working for a client who doesn't have the software and isn't really interested in buying it for this project? Does anyone have experience with this? Is there another way for them to sync the Tentacle audio track without the software? I know that higher-end cameras have a timecode input, so this question only applies to cameras without that feature. I hope my question is clear, and thanks in advance! Greetings from a Dutch sound enthusiast!
  3. Dear sound community, As timecode and sync technology is our passion, we have wanted for many years to develop easily accessible, short educational videos that break down and explain these modern production elements. Ambient University provides information and instruction for low-budget productions as well as the most sophisticated setups around the world. Our videos about timecode and synchronization provide in-depth background information, while other videos demonstrate how to use that knowledge in the field. The video library includes step-by-step tutorials illustrating external timecode and sync distribution setups with the most commonly used cameras from ARRI, RED, Sony, Panasonic, Canon, etc. Our goal is to continuously develop and update the channel to provide the most comprehensive pool of knowledge on this topic – one that's always available in your pocket. We hope you like it, and of course we are open to new video suggestions.
  4. I'm not really sure if there is a way to fix this, but that's why I'm posting here. My audio is out of sync with my image: I was shooting with the Sony FS-700 at 24p (lav mic going into the Sony) and forgot to switch my Odyssey 7Q recorder over to 24, so it stayed at 29.97. I transcoded the footage with the audio to 23.98; it's synced now, but the pitch is way off. An audio friend of mine used Pro Tools to try to correct the pitch, but it still sounds a little off. Anybody else have ideas on how to fix it? Below is a Dropbox link to download the footage and audio if you would like to give it a shot. Any help is greatly appreciated! https://www.dropbox.com/sh/xupcrr403xa8vxe/AABZ16gPvSTdcKfJG_T8psaUa?dl=0 Thanks Ryan
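For anyone hitting the same mismatch, the arithmetic is worth writing down (a sketch only, not tied to any particular transcoder): retiming 29.97 fps material to 23.976 fps slows everything by exactly 30/24 = 1.25, so audio that isn't resampled lands almost four semitones flat. Knowing the exact ratio lets you correct it precisely instead of tuning by ear:

```python
from fractions import Fraction
import math

# Exact NTSC-family frame rates: 29.97 = 30000/1001, 23.976 = 24000/1001
fps_recorded = Fraction(30000, 1001)
fps_target = Fraction(24000, 1001)

# Retiming 29.97 material to 23.976 stretches time by this factor
speed_factor = fps_recorded / fps_target   # exactly 5/4 = 1.25

# Pitch shift heard if the audio is slowed down without resampling
semitones = 12 * math.log2(float(speed_factor))

print(speed_factor)          # 5/4
print(round(semitones, 2))   # 3.86 semitones flat
```

So rather than correcting by ear, a pitch shift of exactly +3.86 semitones (ratio 5/4) applied to the slowed-down audio should restore the original pitch while keeping the transcoded sync, assuming the transcode did a straight 29.97-to-23.976 retime.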
  5. Hi all, I have been asked to do sound on a 5D shoot. - Mk2 or Mk3 (TBD). - Sometimes a pair of 5Ds. - It is documentary. - We will shoot in Armenia. - No post house on board yet. - Editor not yet confirmed. - Likely to stay like that until we return from Armenia. - First production meeting Saturday. I have some obvious concerns about sync. - I have only one Lockit. - My pair of Tentacle Sync boxes might not arrive by then. So the plan is: an ERX on each 5D; - 5D left track - scratch mix (for PluralEyes) - 5D right track - TC (for AuxTC Reader) For the interviews this seems fine (they will be boarded). For the event coverage (marches, concert, etc.) where cams will cut & roll at will, I plan to roll a continuous take on the Nomad as much as possible. I've downloaded trial versions of both programs to test the workflow when I get my hands on a camera. Questions: - Could long audio and/or video takes cause problems with this sync setup? - What is the appropriate way to pad down the TC for the 5D's mic-level input? - Any suggestions for improvements to this (provisional) system? - Anyone who's done this before care to share their findings? Many Thanks, B
  6. I've looked and read through many different forums and websites trying to get a straight answer, but I'm not 100% clear on what I need. Here is the situation: I have a multicam interview shoot with one ARRI Alexa and one ARRI Amira. They want timecode to be synced between the cameras and the sound recorder without being tethered. I am recording on a Sound Devices 552 mixer, which doesn't generate timecode. What I need is a straightforward explanation of what to rent to make this happen and how to set it up. Does anyone have any input? As a side note, I do NOT want to record timecode through XLR; I want to use the TA3F connector so it timestamps the audio file. I've never had to sync timecode before, so talk to me as if I'm a complete moron in the matter. PLEASE, any thoughts and references would be much appreciated.
  7. Hi all! The crowdfunding campaign for the Tentacle Sync is ending in less than 24 hours! They still need (only) 15'000 Euros to make it happen. Get yours at https://www.indiegogo.com/projects/tentacle-sync-time-to-sync-different http://tentaclesync.com Cheers, Jürg
  8. NEW QL Timecode Generator: We would like to announce the newest member of our Tig Q family, the QL. It offers all of the accuracy, features, and size of the Q28, but without the Lemo. We know that some users just want a locking connector but don't necessarily need the Lemo; the QL fits those users at a lower price. Intro price: $399. Top feature highlights: proven Tig software design that works well with all types of cameras, recorders, and even "i-devices"; quick and simple setup for easy use; locking 3.5 mm connector; durable design with a center beam (we have run over it repeatedly with a truck and it still works fine); "flow-through audio" to work well with devices not intended for timecode (i.e. any recorder, DSLRs...); 2-year limited warranty. Now for a FREE QL: to introduce the QL we are offering a free QL give-away (drawing) for those who share the QL on Facebook. The drawing will be next Tuesday at 5 pm EST. (The give-away is also open to those in Europe.) So share away... Check out more about the QL at: http://www.mozegear.com/#!tig-ql/cn3s
  9. TIG Timecode Generator Durability Test: I love durability tests and think they are great fun, so we decided to do a video showing the TIG (timecode generator) being driven over by a truck and dropped off the back. It survived very well (especially considering we drove over it about 20 times and dropped it over 30). So here's a TIG and a truck: https://plus.google.com/112281036772159173979/posts Laurie 480-292-9060 lauriew@mozegear.com www.mozegear.com
  10. Fine people of the internet, I pose this scenario to you: I own a MOTU Traveler MK1, not MK3. It claims it will "resolve" to SMPTE via any audio input and send out SMPTE via any output. The software it ships with, the MOTU FireWire SMPTE Console, controls all of the SMPTE options as you'd imagine. The frame rate selection does not include 23.98 fps, only 24, 25, 29.97 DF/NDF, or 30 DF/NDF. Digital Performer 5 has this same limitation, as does my older version of Nuendo. I also own a 664. I have jammed the Traveler from my 664 at 23.98 fps and it seems to hold sync for an hour or so in my brief test, with the Traveler set for true 24 fps SMPTE. I have yet to try to jam the 664 from the Traveler to confirm that it stays 23.98 out of the MOTU, but that is my main question. I know this is not technically correct. I was curious whether, despite the settings, the MOTU actually just accepts the incoming external TC as-is and outputs TC according to the source, or whether it forces a true 24 fps flag on the stream. Basically I'm wondering what actually occurs when one device jams another with TC. Is it the source that sets the stream, or the receiving end that confirms/interprets the stream, or both? Any insight would be really helpful. I've searched Google and this site (via Google) and have not found the specific answer to my question. Thanks!
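One reason the jam can "seem to hold" across the 24/23.98 divide is that the two LTC streams carry identical frame counts; only the physical bit clock differs, and only by 0.1%. A sketch of the arithmetic (this is only the numbers, not a claim about what the MOTU firmware actually does with them):

```python
BITS_PER_LTC_FRAME = 80  # SMPTE linear timecode carries 80 bits per frame

def ltc_bit_rate(fps):
    """Bits per second of an LTC stream at the given frame rate."""
    return fps * BITS_PER_LTC_FRAME

rate_24 = ltc_bit_rate(24.0)             # 1920.0 bits/s
rate_23976 = ltc_bit_rate(24000 / 1001)  # ~1918.08 bits/s

# The two streams differ only in clock speed, by about 0.1%
diff_pct = (rate_24 - rate_23976) / rate_24 * 100
print(round(diff_pct, 3))
```

A receiver that simply copies the incoming frame numbers at jam time cannot distinguish the two rates from the data alone; whether it then free-runs at its own configured rate or resolves to the measured incoming bit clock is exactly the device-specific behavior in question here.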
  11. I am trying to sync two 788Ts, not linked together. I've read through the manual numerous times and come up empty, and I've searched the internet to no avail; starting to get frustrated with this. I have a Lemo-to-BNC and a standard BNC cable. I set my master to Free Run and the slave to Rx Ext, but it didn't work. Any help or suggestions would be greatly appreciated. ced
  12. I have a hypothetical/theoretical question about timecode. I am not asking for any practical reason, and I have no timecode-enabled equipment to test this on myself; it's just a curiosity. I'm pretty sure I understand the difference between timecode and genlock/sync, namely that timecode alone does not force synchronised recording speeds between devices; it only stamps synchronised time numbers over the top of the recordings. E.g., I am aware that if one jam-syncs the timecode of multiple devices at the start of shooting and then disconnects the timecode cables, after a while the timecodes (and thus the recordings themselves) can and will drift out of sync. But what happens when the timecode (NO genlock) is constantly being fed, rather than a single jam-sync? Say, for example, you have a single audio recorder and a single camera, and the recorder is sending timecode to the camera constantly via cable. What happens when the camera 'tries' to drift out of sync? The timecodes cannot drift apart (as they would if only jam-synced at the start of shooting); that is the whole point of having constantly connected timecode cables, right? But at the same time, timecode alone will not force the recordings to sync? So...? I have been reading up on this online, and different websites seem to imply two different possibilities: A) the recording of the slave device (the single camera in my scenario) simply drifts out of sync from the master (the single audio recorder), even though the timecodes remain the same throughout the recordings; or B) as the slave recording 'tries' to go out of sync by a full frame, it 'tries' to take the timecode with it, but cannot, so it 'jump' corrects itself. E.g., as the camera 'tries' to run one frame slower than the timecode it is being fed, it forcibly corrects itself by adding a frame (and/or the opposite, of course).
I have come across references to 'green flashes' in footage as a result of a slave camera 'trying' to drift from its incoming timecode. I guess these green flashes are the result of the camera 'adding' a frame (of green?) to force-correct itself to match the incoming timecode? Or something else entirely?
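To put numbers on "after a while": drift between two free-running generators is governed by the difference in their clock accuracy, quoted in parts per million (ppm). A small sketch of the arithmetic (the 1 ppm figure is an assumption typical of dedicated sync boxes; camera internal clocks are often much worse):

```python
def drift_ms(ppm_error, hours):
    """Accumulated drift in milliseconds for a clock off by ppm_error."""
    return ppm_error * hours * 3600.0 * 1000.0 / 1e6

def hours_to_drift_one_frame(ppm_error, fps):
    """How long until accumulated drift reaches one full frame."""
    frame_ms = 1000.0 / fps
    return frame_ms / (ppm_error * 3.6)  # 1 ppm accumulates 3.6 ms per hour

print(drift_ms(1, 8))                              # 28.8 ms over an 8-hour day at 1 ppm
print(round(hours_to_drift_one_frame(1, 25), 1))   # hours to slip one full PAL frame
```

This is why continuously fed timecode without genlock still leaves the camera's sensor clock free to wander: the TC numbers stay locked while the frames underneath them drift, until the camera force-corrects with a frame jump, as speculated above.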
  13. Hey all, I'm kinda in a pinch, on a small budget. I will be running sound for a single-person interview with multiple takes this Sunday. I have a 788T; a Lemo-out-to-BNC is the only cable I have to use. Obviously the 5D does not have a BNC in, and I don't have it in the budget for a Denecke time slate or a Timecode Buddy. Does anyone have suggestions on a way to sync my 788T (master) to the 5D without potential for a lot of drift... the good ol' hand clap? Any suggestions would be greatly appreciated. thanks ced
  14. Hey all, Getting into timecode and worried about whether I'm totally #$@%ing things up or not. What a few mixers have told me (and some searching on JW) suggests that I could jam from the recorder to the camera and it should be fine, without a lockit box. However, the Alexa manual is kind of brief and doesn't mention this particular timecode behavior. What I want to know is what the blinking Alexa TC display means. After I jam from the recorder, the TC numbers/digits stay solid for a while, but then go back to blinking again. A recent production with the Alexa also had a lockit box, and when I used it the display would generally stay solid (although sometimes it would go back to blinking unless I babysat the camera a bit to make sure it stayed solid). The AC indicated to me that the TC staying solid is a good thing. But is he right? Anyways, thanks for the help.
  15. Greetings, I was working with a RED Scarlet today, the same one I worked with a week ago with no problems, and had the audio return (monitor) go to very loud white noise. I lost jam sync as well, and it never returned. In the morning, prior to talent being on set, audio return from the camera was normal and sync was green during camera setup. Right before roll, the audio return went to very loud white noise. Audio levels on camera appeared to bounce as they should, and playback of the RED files had clean audio. The only change to the camera setup was the addition of an HDMI video tap out of the back, tightly squeezed next to the sync and headphone return. When I disconnected all audio cables to the camera, the white noise remained. I was rolling on a Nomad, so I wasn't too stressed over the lack of "good" RED tracks. After the first set-up the camera lost the HDMI video tap, but still would not produce audio return or jam sync. We restarted the camera more than once, and no, I don't know what firmware build he was running. I've never seen this behavior before and am wondering if anyone else has? Cheers, JB
  16. http://vimeo.com/71959025 With firmware 3.1 for the Tiny Lockit and the Lockit we added a few exceedingly powerful features. "Use your Lockit to send the camera timecode to your recorder without the fear of wireless dropouts!" The new software adds a TC transceiver mode to the units using the already existing ACN network hardware. So without additional equipment your Lockit is now a fully functional timecode transmitter or receiver. Thanks to the internal generator running in the background, TC dropouts are a relic of the past. "No need to tune your Lockits to the rental ones anymore!" Using the already well-accepted ACN sync broadcast we added a nifty invisible feature. The new sync broadcast doesn't stop after the first broadcast; it resends smaller timestamps every four seconds. All receiving units compare their internally generated timecode with the highly accurate timestamp and pull the generator speed up or down until they are completely matched. This is done imperceptibly and without having to press extra buttons. The only thing you'll notice is that you never get bothered again by post about drift or offset problems when using Lockits from various sources. What's more, this makes the system suitable for line-sync online editing. "Sync your Pro Tools system with all cameras during a concert shoot!" When you connect your Lockit or Tiny Lockit to your PC or Mac it will immediately be recognized as an Audio/MIDI device. Just select your unit in your DAW as the timecode source and you're ready to rock. Now you can get the camera TC into your software, or use your session TC on your cameras. If you own a Lockit you can even wordclock-sync your audio interface to the camera video sync. This means that you can shoot a whole festival and trust that your cameras and your audio stay in sync. Lockit Firmware Tiny Lockit Firmware Installation Guide Video Enjoy your new features Your Ambient Team
  17. Hello, our mixer is generating timecode from a Fostex 824, jamming Denecke boxes and slates twice a day, and using a 702T as a backup recorder. Our video and audio seem to be out of sync about 75% of the time, usually by between 0.25 and 1 frame. Theoretically the Denecke boxes should hold near-perfect sync, though. Any ideas what's going on here? We are syncing dailies with Scratch and slipping audio to match video timecode. Wishing the slate clap would sync perfectly every time.
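For what it's worth, up to a frame of wander is roughly what free-running generator accuracy predicts between jams. A back-of-envelope sketch (the 2 ppm clock disagreement is an assumption for illustration, not a spec for these exact units):

```python
def drift_frames(ppm_difference, hours_since_jam, fps):
    """Frames of drift between two generators whose clocks disagree by ppm_difference."""
    seconds = hours_since_jam * 3600.0
    return ppm_difference * 1e-6 * seconds * fps

# Jamming twice a day can leave ~6 hours between jams. If the recorder's
# clock and a sync box's clock disagree by 2 ppm at 29.97 fps:
print(round(drift_frames(2, 6, 30000 / 1001), 2))   # ~1.29 frames
```

If the error is consistently in one direction, tuning the boxes to the master generator or jamming more often should shrink it; a random scatter of a quarter frame either way points more at the dailies-sync stage than at the clocks.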
  18. Tri-Level Sync in a Bi-Level World, by Dave Pincek, Vice President of Product Development
The advent of HDTV has brought a number of new concepts and technologies with it. One of the concepts put into practice is tri-level sync, which solves some traditional problems found with bi-level sync. Although tri-level sync is preferable with the new television system, we still find ourselves interfacing with systems capable of handling only bi-level sync. Therefore, the need occasionally exists to convert from tri-level to bi-level sync. This Tech Corner will acquaint the reader with the tri-level sync format and its relationship to bi-level sync.
Bi-Level Sync
Bi-level sync has been the standard synchronization signaling method for all forms of video, including computer video, composite video, S-video, and component video. Bi-level refers to two levels: for sync, this means a pulse having two voltage levels (a high and a low level, relatively speaking), hence the name. Systems using bi-level sync are edge-triggered. Typically, the negative-going leading edge of the pulse triggers the synchronization process (Figure 1). Display systems must "look" for this negative-going edge in order to identify the moment in time to re-sync the raster scan process. Most will recall that computer graphics cards sometimes output positive-going sync; positive-going sync signals the display that the graphics line rate has changed to a new format. Looking for the sync pulse has always been one of the "trickiest" tasks for the display signal processor. It requires careful biasing of the sync processing circuitry so that the sync pulse is made as distinguishable as possible from the other voltage levels within the video signal. As part of the video signal, bi-level sync introduces an unwanted DC component (Figure 2).
In the processing of composite, S-video, or component video, the DC component is not too troublesome and can easily be managed as part of the normal sync separation routine. When bi-level sync is introduced onto RGB video channels, the process is more complex. In some systems, sync is introduced on the green channel only (sync is typically imposed on the green channel in RGsB systems), while high-definition component video signals contain sync on each channel. Green-only sync requires that the sync separation process be ultra clean; in most cases, however, it is not, and a very narrow sync pulse usually remains. Residual sync results from incomplete removal of the sync information from a video processing channel: depending on the performance characteristics of the DC restoration circuitry within the channel, some or all of the sync pulse may not be removed from the green channel. Residual sync causes the green channel to bias incorrectly with respect to red and blue at the display CRT, causing a color shift. Even in RGB systems where sync is introduced on all three channels, there is some difficulty in maintaining consistent processing between the three channels; again, small DC shifts in the black level caused by residual sync can disturb the color balance or gains of the video channels. A significant amount of power is used by the broadcast transmitter to send the sync pulse, and the polarity of the video signal is designed to minimize the amount of power used to transmit sync. And while we have not transmitted analog versions of high-definition television terrestrially, early testing done during HDTV development demonstrated a need to improve the management of synchronization in the new television system. Tri-level sync eliminates the DC component and provides a more robust way to identify the arrival of synchronization in the signal chain.
Tri-Level Sync
Tri-level sync was introduced with the SMPTE 240 analog HDTV standard.
Previous to that, the early HDTV 1125/60 systems used various synchronization waveforms, as provided by the various 1125/60 equipment manufacturers. The creators of the later SMPTE 240 HDTV standard searched for a standard sync waveform that would ensure system compatibility. The goal was to provide more precise synchronization and relative timing of the three component video signals: HDTV component video has sync present on all three channels, Y, Pb, and Pr. In addition, the sync structure needed to be resilient enough to endure multigenerational recording and other noisy situations. Tri-level sync met the requirements. Figure 3 shows a graphic representation of a tri-level sync signal. As defined by the SMPTE 240 standard, the pulse starts at zero volts (the specified black level) and first transitions negative, to -300 mV (+/- 6 mV). After a specified period, it transitions positive to +300 mV (+/- 6 mV), holds for a specified period, and then returns to zero, or black level. The display system "looks" for the zero crossing of the sync pulse. Each half of the tri-level sync pulse is defined to be 44 samples (reference clock periods) wide, for a total sync pulse width of 88 samples. The rise time is defined to be four samples wide, +/- 1.5 samples. This symmetry of design results in a net DC value of zero volts, which is one major advantage of tri-level sync: it solves the problem of a bi-level signal introducing a DC component into the video signal, and the elimination of DC offset makes signal processing easier. Within our new digital television system, the unique excursions of the sync yield numerical values that are easily coded and easily recognized within the digital transmission channel.
Converting Tri-Level to Bi-Level Sync
There are times when it is necessary to convert tri-level sync to bi-level sync, such as when component HDTV is converted to RGBHV.
A format converter, like Extron's CVC 200, will perform the conversion of tri-level to bi-level sync as part of the component HDTV to RGB conversion process, so traditional displays and projectors not capable of handling tri-level sync will "see" sync information in the traditional way. Any time signals are converted from one format to another, the relative timing of the conversion is of prime importance: timing error, once introduced into a signal channel, is difficult to repair. The positioning of tri-level sync with respect to active video, and the wider excursion from peak negative (-300 mV) to peak positive (+300 mV) provided by this format, allow easier sync detection and more consistent triggering through the use of the zero crossing. When converting to bi-level sync, the leading edge of the bi-level pulse should be aligned with the zero crossing of the tri-level sync. By doing so, the bi-level sync pulse will provide a leading-edge trigger at the proper point and correct timing will be maintained. Figure 4 shows the relationship of a tri-level sync signal to a properly timed bi-level sync signal. Anyone involved in interfacing video signals will, at some point, encounter the need to convert tri-level sync to bi-level sync. As time progresses, a growing group of displays and projectors will be designed to cope directly with these format differences. In the meantime, technicians should be aware of the differences in sync construction and the proper timing relationship for conversion between these two common formats. http://goo.gl/PWhHf
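The pulse geometry described above can be sanity-checked numerically. A sketch that builds an idealized tri-level pulse from the SMPTE 240 figures quoted in the article (44 samples per half at -300/+300 mV, ignoring the finite rise time for simplicity) and confirms the zero net DC and the midpoint zero crossing that the article highlights:

```python
# Idealized SMPTE 240 tri-level sync pulse: 44 samples at -300 mV,
# then 44 samples at +300 mV (finite rise times ignored for simplicity).
HALF_WIDTH = 44   # samples per half-pulse
LEVEL_MV = 300    # excursion in millivolts

pulse = [-LEVEL_MV] * HALF_WIDTH + [+LEVEL_MV] * HALF_WIDTH

# Net DC component is zero, unlike a bi-level pulse
print(len(pulse))   # 88 samples total, as specified
print(sum(pulse))   # 0 mV net

# The receiver triggers on the negative-to-positive zero crossing
crossing = next(i for i in range(1, len(pulse)) if pulse[i - 1] < 0 <= pulse[i])
print(crossing)     # sample index 44, the midpoint of the pulse
```

The symmetric negative-then-positive shape is exactly why the waveform carries no DC offset, and the steep crossing through zero gives the display an unambiguous trigger point regardless of any level shift in the video.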
  19. I posted just recently about importing from an AAF/OMF where the files were read-only; I fixed that issue (or I should say, you all helped me resolve that problem). Now I've just realized (after editing for a while) that the sync is drifting. What's bizarre is that if I look at the video reference in QuickTime, the inspector window says the frame rate is 23.98, playback also 23.98. So I import it into Pro Tools, but Pro Tools claims it's true 24. The mixed-down reference audio matches the video and stays in sync. However, when I import the AAF/OMF file, it claims it's 23.98, and all those tracks fall out of sync. Has anyone experienced this before, and what did you do to resolve your situation? Thanks for your guidance. jgbsound
  20. Hi there, First post here. I'm about to start filming a contemporary dance film which includes a large section of sync playback, where the dancing has to move along with the music. This is my first time doing this kind of production and I'm looking for some wisdom on reproducing an accurate master playback dupe (approx. 1 hour running time) with timecode through my 788T SSD, or on recording a reference guide track into the 788T from a CD player, in order to make synchronization possible in the editing room. The idea is to use the same master playback dupe for the shooting and the editing. I wonder which option suits the occasion best: using the 788T as the playback machine, or recording a reference guide track. A single Arri Alexa will be used, running at 25 fps. Due to the single-camera workflow, the production will be very paused and accurate playback reproduction is a must. I need some advice on making the master playback dupe with timecode. I understand that I have to make a mono version of the music on one track, with timecode recorded on the other. My questions are: 1. Can I record timecode directly from the timecode output of the 788T into Pro Tools, add the mono version of the music, trim the timecode track to the duration of the music, and finally make a stereo bounce or export both regions as an interleaved file? 2. If I play back the master dupe from a CD player and record the music and timecode on separate ISO tracks, will the 788T recognize the timecode from the CD? How can I get the timecode from the CD out of the 788T and into a digital slate? 3. On the other hand, if I play back the master dupe from the 788T in order to get a precise reproduction of the music, will the 788T recognize the timecode embedded in the master playback dupe, or will it generate its own, depending on the timecode mode?
Will the digi slate recognize the timecode coming from the master dupe if I assign its channel to an output of the 788T? Should I route the outputs of the 788T (used for playback) into another recorder, or is it enough to feed the camera from the 788T? It's possible I'm missing other important options or questions to keep in mind. Hoping for some advice as soon as possible. Thanks to you all! Sincerely, Santiago Fumagalli
  21. Happy birthday to you! 20 years of Lockit!! We are celebrating the event with an extraordinary limited aluminum special edition and new accessories... A good 20 years ago Guenter Knon worked for the German candid-camera TV show. After countless episodes he was getting tired of lining up the Nagras and cameras every morning. So he and Chris Price sat down together, and after one year and several prototypes the first ACL Lockit was born. Over the following 20 years this fellow became more and more famous and prolific, and rapidly became the head of the well-known Clockit family. This clan was so powerful that they were able to take over the leadership of the international portable sync and timecode market. Their strength was based on their internal accuracy, their ability to be tuned to match all situations, and most importantly, on a network of good friends and supporters worldwide. We want to celebrate this with an extraordinary special edition of our recent ACL 204. The ACL 204 anniversary edition is a limited run of 200 units with a "full metal jacket": a machined, pearl-blasted and anodized aluminum body. Thanks to the more robust material we could reduce the wall thickness and therefore exactly maintain the weight and price. Together with the anniversary Lockit we are proud to announce a brand new Lockit accessory. The ACL Mount is a quick-release mount with a fixable 3/8" male thread. This makes using Velcro to attach your Clockit device to the camera a relic of the past. The ACM 204 was specially designed for the new aluminum body; for older versions there will be a different model, the ACM-TL, a mount with an adhesive plate for the ACN-TL or older ACLs. I hope you like the new products! Best Regards Timo
  22. Hi All, Just for general interest, I thought some of you might be interested in knowing that PluralEyes 3.0 is available. Link and video: http://www.redgiants...all/pluraleyes/ I've used the demo version of PluralEyes 2.0 on a few hours' worth of footage from 2 Canon DSLRs and a Tascam HD-P2. When shooting I had to ensure that I got good scratch audio levels on the cameras - it's probably best to feed the DSLRs a direct feed from a mixer rather than relying on the internal DSLR mic. All in all it worked well - a real time saver. I found it very interesting to have actually used PluralEyes because it helped me experience the DSLR audio workflow firsthand and influenced my audio setup on subsequent DSLR shoots. Apparently the 3.0 version is more efficient and much faster - eager to try it. Thanks for your interest.... cheers. Dave P.S. I have no connection to Red Giant & PluralEyes... just an end user :-))
  23. Intro: Hi guys, tuning into this awesome forum for some of the usual life-saving help. I just finished production mixing on my second feature-length movie, and I landed the audio post as well, which will be a first (except for the mix). A daunting task to say the least, but living in the Dominican Republic, people don't really have a myriad of options, and even though I am fairly inexperienced in the film world, I have been doing commercial and documentary work for over 7 years. Anyways, I digress. While I was doing the production audio for this movie, I made a point of having my recorder jammed from the Panasonic HD Varicam every chance I got (since I was running a hardwired mix to camera 85% of the time, it was pretty easy to just run a second TC cable). It was easier to jam my 744T from the camera because that way I didn't have to fidget with the camera. The camera was set to free run and we used time-of-day timecode. I did this believing I would be saving the production the trouble of manually syncing my audio with the dailies. Is this correct? I did not use a TC slate or lockits, but I was constantly jamming my 744T from the Varicam. Isn't there a way for FCP to read the timecode of the audio files and video files and automatically line them up? I would think there is; is there something I'm missing here? The video editor hasn't a clue what I'm talking about. A step-by-step would be awesome. I'd appreciate some input on this... Thanks in advance guys!!!