Jeff Wexler

Everything posted by Jeff Wexler

  1. The Stereo Nagra (Nagra IV-S) was released, I believe, in 1971 and was used for many years before the IV-STC (timecode) model came out. For most of its use, primarily music recording, timecode was not needed. I used the Nagra IV-S in production in 1975 on "Bound For Glory" for some of the musical scenes. I remember that the transfer facility (I believe it was Todd-AO) required advance notice that I would be using the Stereo Nagra, and they had to rent a Nagra IV-STC to do the transfers for those specific scenes. Again, timecode was not an issue, as the majority of film production had not yet utilized timecode or smart slates or any of that stuff.
  2. Fascinating -- I love this history (had no knowledge of the early SQN or the connection to the Nagra SN). The model I was familiar with and used a few times (never owned one) was the model C pictured below.
  3. I had an Astro van for several years that I used as my work vehicle. My daughter read some crash reports and immediately told me I had to stop driving the Astro and get rid of it. At that time I wasn't really using it for much anyway (doing mostly big movies, with the transportation department taking care of all my vehicle needs), and so I did sell it eventually. Never got another van, though I did briefly consider the Ford Transit (and I might have gone for the Mercedes like several of my friends doing commercials had -- but it wouldn't fit in my small garage at home).
  4. Thank you "The Documentary Sound Guy" (sorry, I don't know your name) for taking the time to post your thoughts on this subject. I agree with much of what you are saying here, but I also strongly disagree with you on certain things. Thank you as well for pointing out that "Iron Film" lives in New Zealand, something which I had not considered with some of my comments to him. I would take the time to respond to your post, point by point, but I feel that it might be something that doesn't really interest that many people, and I would not want to dominate the site (though I would rather not have you have the last word).
  5. Apple had to settle with Creative Technology (a company that had an MP3 player with a user interface it had patented) for $100 million (US) to allow the Apple iPod to be produced without violating Creative's valid patent. We can all debate whether Creative should have been allowed to patent their user interface, but they applied for a patent and it was granted. From Wikipedia: Apple's application to the United States Patent and Trademark Office for a patent on "rotational user inputs",[93] as used on the iPod interface, received a third "non-final rejection" (NFR) in August 2005. Also in August 2005, Creative Technology, one of Apple's main rivals in the MP3 player market, announced that it held a patent[94] on part of the music selection interface used by the iPod line, which Creative Technology dubbed the "Zen Patent", granted on August 9, 2005.[95] On May 15, 2006, Creative filed another suit against Apple with the United States District Court for the Northern District of California. Creative also asked the United States International Trade Commission to investigate whether Apple was breaching U.S. trade laws by importing iPods into the United States.[96] On August 24, 2006, Apple and Creative announced a broad settlement to end their legal disputes. Apple will pay Creative US$100 million for a paid-up license to use Creative's awarded patent in all Apple products. As part of the agreement, Apple will recoup part of its payment if Creative is successful in licensing the patent. Creative then announced its intention to produce iPod accessories by joining the Made for iPod program.
  6. So, are you the one who decides whether something is patentable or not? Are you aware of all the abuses of the patent/copyright procedures that have been tried over the years? Do you know that Microsoft wanted to patent the use of a "window" in computer operating systems? All of the "look and feel" litigation that went on (Apple, Microsoft, etc.). How about the famous Betamax lawsuits, where the motion picture studios tried to get Congress to outlaw personal home video taping! Regarding our industry and the gear we use, do you know one of the primary reasons the Nagra recorder was the only recorder used for sync sound for picture? (Kudelski patented the neopilot sync system that was introduced with the Nagra III and became the standard to which all recordings had to adhere; other manufacturers were prohibited from producing recorders that utilized this patented system.) I could go on and on but I won't. If you have a problem with the patent system, do something to try and change it -- don't attack the companies that are obeying the law and give a free pass to those companies who break the law.
  7. To IronFilm: ask yourself, if you had produced something and had it patented (whether you think the patent system is a good thing or a bad thing), wouldn't you want to defend your patent rights? If you are actually interested in the patent system in the US and how it has affected so much of our everyday lives regarding the gear that we use, I would be happy to discuss some of this history with you.
  8. Thank you jBond, I approved this post without realizing it was a sales post. I will remove it.
  9. This Zaxcom panel discussion was recorded in Atlanta on Sunday, September 25, 2022. Seated from left to right, Glenn Sanders of Zaxcom was joined by Kevin Cerchiai, Whit Norris, CAS, and Michael Clark, CAS.
  10. Zaxcom Ecosystem. This Zaxcom panel discussion was recorded in Atlanta on Sunday, September 25, 2022. Seated from left to right, Glenn Sanders of Zaxcom was joined by Kevin Cerchiai, Whit Norris, CAS, and Michael Clark, CAS. In the first segment, Michael Clark discusses the Zaxcom ecosystem on Stranger Things and how he uses Zaxcom digital recording wireless. Next, the panel discusses all things Zaxcom IFB and the ZMT4. In the following segment, Whit and Kevin discuss their workflow and how they use the RX-8 with Dante. Cujo Cooley then joins to share some of his extensive antenna/RF knowledge. After that, Kevin discusses the importance of transmitters being as light as possible at the end of your boom pole. In the final segment, Glenn discusses the Nova 2 and Aria control surface.
  11. Well, I hadn't really thought to post the comments from Ron to try and get more people to donate, but I do want to thank everyone, globally, who has been so generous in their support. The support donations are certainly welcomed, and the overall support from so many of our members helps sustain the site. I will post comments from time to time, like the most recent donation from Wyatt Tuzo: Jeff, Best to you and yours this holiday season! Thank you for this platform which allows us to share our experiences, and learn from others in the field. This site has done much to foster an enduring global community, and our craft is better for it! Grateful, Wyatt
  12. I appreciate all of those who have donated in support of the JWSOUND site over the years; it has helped a lot to keep this online community going, a valued site for our sound community that has stood the test of time. Below is a really nice comment accompanying a generous donation from Ron, who has been a long-time supporter. Thank you to all! Thank you for your contribution to our production sound community throughout these many decades... I speak for those of us who were made welcome to your platform; to learn from & to pass our production sound experiences to others; to our production sound innovators & companies who created new ways & the means that simplified our craft... laissez le bon temps rouler Ron Scelza
  13. As Olle has said, working with others who trust us is key -- one of the skills that we learn over the years is to be able to hear the spoken dialog as if we are hearing it for the first time. This gives us a perspective on how most everyone else hears the same dialog. The Director has probably spent hours listening to the dialog in pre-production read-throughs, rehearsals, consultation with the writer, etc. Of course they can understand every word. The audience, however, has only one shot at it, and often they really miss a lot. Additionally, regarding the article, it is true that there was not much attention paid to the production techniques that have changed over the years, and it is my firm belief that the majority of problems begin on the set, on the day, and are then exacerbated later during the mix and all the processing that goes on for the distribution and delivery of content.
  14. Doug, I have been reading with great interest your experiences selling (and, I imagine, giving away) sound gear. I still have a shop full of stuff, having only sold a few items since retiring (my last Deva, my Zaxcom wireless, and a few Schoeps microphones and accessories). So much of what was vital to my production career is no longer needed, as production procedures have changed so dramatically over the years. I do have miles of mic cable (XLR), adapter cables to accommodate connectors that do not even exist on current gear, and cases full of clamps, goosenecks, etc. for plant mics (when there are whole sound teams that never even use plant mics anymore!). Then there are all the things I really just want to throw out, but to be environmentally responsible, much of it would require hazardous waste pickup; I can't just throw it in the trash. I'm not sure how to proceed. At the very least there might be some value in the fifty or sixty Pelican cases I still have that hold all this other useless stuff.
  15. I extracted the text and removed the advertising. Below is the complete text of the article for those who may have been having some difficulty (a toy numerical sketch of the dynamic-range reduction the article describes follows after this list of posts).

Everything sounds bad, and there’s nothing we can do about it
https://www.avclub.com/television-film-sound-audio-quality-subtitles-why-1849664873

Television today is better read than watched—and frankly, we don’t have much of a choice in the matter. Over the last decade, the rise in streaming technology has led to a boom in subtitle usage. And before we start blaming aging millennials with wax in their ears, a study conducted earlier this year revealed that 50 percent of TV viewers use subtitles, and 55 percent of those surveyed find dialogue on TV hard to hear. The demographic most likely to use them: Gen Z.

Mounting audio issues on Hollywood productions have been exacerbated in the streaming era and made worse by the endless variety of consumer audio products. Huge scores and explosive sound effects overpower dialogue, with mixers having their hands tied by streamer specs and artist demands. There is very little viewers can do to solve the problem except turn on the subtitles. And who can blame them?

“It’s awful,” said Jackie Jones, Senior Vice President at the Formosa Group, an industry leader in post-production audio. “There’s been so much time and client money spent on making it sound right. It’s not great to hear.” Formosa is one of the many post-production houses struggling to keep dialogue coherent amid constant media fracturing. “Every network has different audio levels and specs,” Jones told The A.V. Club over Zoom. “Whether it’s Hulu or HBO or CBS. You have to hit those certain levels for it to be in spec. But it really is how it airs and how it airs is out of our control.” After it leaves a place like Formosa, the mix might go through an additional mix at the streamer and another mix, so to speak, by the viewer’s device.

Of course, this is the last thing they want in the audio industry. “Dialogue is king,” sound editor Anthony Vanchure told us. “I want all the dialogue to be as clean and clear as possible, so when you hear that people are struggling to hear that stuff, you’re frustrated.” And yet, we still end up with the subtitles on. If we’re just going to read an adaptation of The Sandman on Netflix, why even bother making it? “Everybody’s very unhappy about it,” said David Bondelevitch, associate professor of Music and Entertainment Studies at the University of Denver. “We work very hard in the industry to make every piece of dialogue intelligible. If the audience doesn’t understand the dialogue, they’re not going to follow anything else.”

Streamers and devices make terrible music together

With all this technology at our fingertips, dialogue has never been more incoherent, and the proliferation of streaming services has made the landscape impossible to navigate. Aside from the variety of products people watch media on, no two streamers are alike. Each one may have a different set of requirements for the post-production house. As far as streamers go, editors say Netflix is the best for good sound and has even published its audio specs publicly, but the service is an outlier. “They have put an awful lot of money into setting up their own standards, whereas some of the other streamers seem to have pulled them out of their asses,” Bondelevitch said.
“With some of these streamers, editors get like 200 pages of specifications that [they] have to sit there and read to make sure that they’re not violating anything.” Not all streamers are so forgiving. “I was at lunch with a couple of friends recently off of a mix, and they were at lunch answering emails because they did the mix, completed the mix, and everybody’s happy,” said Vanchure. “And then the director got like a screener or was able to watch it at home, you know, being whatever streaming service he was using. And he was like, ‘Hey, this sounds completely different.’”

Today, sound designers typically create two mixes for a film. The first is for theatrical, assuming that the film is getting a theatrical release. The other is called a “near-field mix,” which has less dynamic range (the difference between loud and quiet parts of a mix), making it more suitable for home speakers. But just because the mixes are getting better doesn’t mean we’ll be able to hear them.

“‘Near field’ means that you’re close up on the speakers, like you would be in your living room,” said Brian Vessa, the Executive Director of Digital Audio Mastering at Sony Pictures. “It’s just having a speaker near you so that what you’re perceiving is pretty much what comes out of the speakers themselves and not what is being contributed by the room. And you listen at a quieter level than you would listen to in the cinema.” “What the near-field mix is really about is bringing your container in a place where you can comfortably listen in a living room and get all of the information that you’re supposed to get, the stuff that was actually put into the program that might just kind of disappear otherwise.”

Vessa wrote the white paper on near-field mixes, creating the industry standard. He believes a big part of the problem is “psycho-acoustic,” meaning we simply don’t perceive sound the same way at home and at the theater, so if a good near-field mix isn’t the baseline, audiences are left to fend for themselves. Complicating matters, where things end up has never been more fluid. “In TV we anchor the dialogue so it is always even and clear and build everything else around that,” said Andy Hay, who delivered the first Dolby Atmos project to Netflix and helped develop standards for the service. “In features we let the story drive our decisions. A particularly dynamic theatrical mix can be quite a challenge to wrestle into a near-field mix.” With so many productions being dumped on streaming after a movie is complete, audio engineers might not even know what format they’re mixing for.

And then there is the home to deal with. Consumer electronics give users a number of proprietary options that “reduce loud sounds” or “boost dialogue.” Sometimes they simply have stupid marketing names like “VRX” or “TruVol,” but they are “motion smoothing” for sound. Those options, which may or may not be on by default from the manufacturer, attempt to respond to noise spikes in real time, usually trying to grab and “reduce” loud noises, like explosions or a music cue, as they happen. Unfortunately, they’re usually delayed and end up reducing whatever’s following the noise. It’s not just the speakers that are the problem. Rooms, device placement, and white noise created by fans and air conditioners can all make dialogue harder to hear. A near-field mix is supposed to account for that, too.
“I listen very intently and very quietly, because that way all of these other factors, the air conditioner, the noise next door, all the other stuff that could be clattering around and stuff starts to matter. And if I lose something, we got to bring that up.”

The long road to bad sound

The sound issues we’re experiencing today are the result of decades of devaluing the importance of clear audio in productions. Bondelevitch cites the move from shooting on sound stages with theatrical actors as the first nail in the coffin. Sound stages provide an isolated place to pick up clear dialogue, usually with the standard boom mic “eight feet above the actors.” The popularity of location shooting made this impossible, leading to the standardizing of radio mics in the ’90s and 2000s, which present their own problems. Cloth rustle, for example, is tricky to edit and leads to more ADR, which actors and directors alike hate because it diminishes the performance given on set.

In the early days of cinema, when most actors were theatrically trained for the stage, performers would project toward the microphone. Method acting, however, allowed for more whispering and mumbling in the name of realism. This could be managed if more time were put into rehearsal, where actors could practice the volume and clarity of their lines, but very few productions have that luxury. One name that keeps being brought up by sound editors for this shift is Christopher Nolan, who popularized a growly acting style through his Batman movies. The problem remained consistent throughout his Dark Knight trilogy, with Batman’s and Bane’s voices being two consistent complaints even among fans of the movies. When Bane’s voice was totally ADR’d following the film’s disastrous IMAX preview, it overpowered the rest of the movie. “The worst mix was The Dark Knight Rises,” he said. “The studio realized that nobody could understand him so at the last minute they remixed it and they made him literally painfully loud. But the volume wasn’t the problem. [Tom Hardy’s] talking through the mask, and he’s got a British accent. Making it louder didn’t fix anything. It just made the movie less enjoyable to sit through.”

Volume is an ongoing war not just among sound editors but inside the government. In 2010, the Federal Communications Commission passed the Commercial Advertisement Loudness Mitigation (CALM) Act to lower the volume of commercials. Instead, networks simply raised the volume of the television shows and compressed the dynamic range, making dialogue harder to hear. “They’re trying to compress things so much that they can keep getting louder,” said Clint Smith, an assistant professor of sound design at the University of North Carolina School of the Arts School of Filmmaking, who previously worked as a sound editor at Skywalker Ranch. Smith has been teaching audio engineering for five years and encourages his students to embrace subtitles and work them into the narrative of a film in more creative ways. “What does it look like? Ten years down the road, 20 years down the road where subtitles become more prevalent because I don’t see them as going away,” Clint asked his students. “I was kind of just curious about…how can we actually have the subtitles be part of the filmmaking process. Don’t try to run away from them.” As unintelligible dialogue becomes more common, we’ll have no choice but to embrace the subtitle.
But at what point are studios and streamers not even bothering to mix sound properly and assuming viewers will just read the dialogue? With subtitles being an option for every streamer, soon, “we’ll fix it in post” could become “they’ll fix it at home.”

Sound you can feel

There are some things that we can do. For instance, there’s always buying a nice sound system. Even more important is setting it up properly. Most of the sound mixers interviewed recommended having professional help but also mentioned that many soundbars today come with microphones for home optimization. None sounded too convinced by soundbars, though. “If you’re using a soundbar,” Bondelevitch said, “get the best soundbar you can afford. And if you’re listening on your earbuds or headphones, get good headphones. If it’s a noisy environment, get over-the-ear headphones. They do really isolate sound much better, and do not use noise-canceling headphones because those really screw up the audio quality.”

But more than anything, they emphasized how this is a selling factor for movie theaters. If you want good sound, there’s a place that has “sound you can feel.” “It’s a bummer because you want the theater experience,” said Vanchure. “People aren’t going out to theaters as much nowadays because everything’s just streaming. And that’s how you want people to hear these things. You’re doing this work so you can hear this loud and big.”
  16. This is a great article! One of the main contributors to this article is our esteemed Cinema Audio Society Board member David Bondelevitch.
  17. Thank you, Al, for this --- wonderful! As you know, I am fascinated by the whole process of creating music, and of course, I'm a huge Beatle fan.
  18. JBond's Nagra section is one of the best things the JWSOUND site has to offer. I appreciate the incredible indexing you have done -- what a lot of work! I do hear from so many people that one of the things they really like about JWSOUND is that it is a great archive. I wish the forum structure had better search routines and indexing functions, but sadly it does not. I continue to explore other, more modern platforms but haven't yet discovered anything that would significantly improve the site (and I worry that we might lose a lot of stuff doing any sort of conversion or migration to a different platform). Also, the basic model for social media, as we all know, is Facebook, TikTok, Instagram, etc., and all of these are miserable in terms of searching for historical posts -- they are all about immediate and flashy posts that capture the limited attention span of most people these days.
  19. Maybe I'm missing something here, but I have only one question: why? I see no reason to go to the effort of pulling a telephone apart and placing a transmitter and a mic in the handset -- maybe for some specialty project, a surveillance video or hidden camera sort of thing, but if it is a regular scripted narrative that happens to have an actor on a phone, why not just mic the scene as you always would? I have, of course, had to deal with scenes in movies where we need to make the actual phone set practical (so that two actors can converse, sometimes in two different locations), but that doesn't seem to be what is being discussed. One simple reason why a mic placed in the telephone handset is a bad idea: will there be any other actors who speak in the scene with the main actor making a phone call? You know that a lav mic placed in the phone, however it is done, will have a very specific "color" and tone to the voice -- what happens when that same actor finishes his or her phone call, puts the phone down and starts talking to the other actor? In this case, you will mic the scene as you always do, possibly booming both actors -- the difference in tonality will be very apparent, and the audience won't know why, but the scene will not feel authentic.
  20. Looks good, Phil, and the covers and mods you did seem to do a good job adapting the Innovative to our production environment. It really shines as a "stand up" working cart (something I could never get into -- all of my carts were definitely designed for sitting). I'm putting your pictures up on the Gallery of Sound Carts.
  21. We have been blessed all these years to have JBond as a core member of our online community, representing the history of the Nagra recorder that was so much a part of most all of our lives. The wealth of information available here is immeasurable --- we all owe a debt of gratitude for your tireless efforts. We hail you and your commitment and dedication (and all the hard work that goes into your posts).
  22. Simple answer: No. As said by others, you may find a 2.4 GHz wireless that performs quite well in some situations but is totally useless in other situations and environments, and sometimes even in the same environment on the same day. So, you could use these low-cost 2.4 GHz devices if you have to, but you will get royally burned at some point and it could cost you your job. None of the high-end digital wireless systems operate in the 2.4 GHz range.
  23. I think possibly Zaxcom felt that these products are so far out of the professional arena that it wasn't worth the time and expense to defend the patent regarding these specific devices. I may not be correct on this, but it is quite possible that is how they "dodged" having a legal patent dispute. It could as well relate to whether these devices have SMPTE timecode or not, etc., etc.
  24. For anyone able to be in Atlanta on Sunday, September 25th: Glenn Sanders and Colleen will be doing a full Zaxcom workshop going through the whole Zaxcom ecosystem. Should be a great event.
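The "near-field mix" idea quoted in the article above (less dynamic range so quiet dialogue survives home playback) can be made concrete with a toy calculation. The sketch below is purely illustrative: the element levels, the -20 dBFS threshold, and the 3:1 ratio are assumptions of mine, not anything taken from the article, from any real mixing workflow, or from any streamer spec.

```python
def compress(level_db: float, threshold_db: float = -20.0, ratio: float = 3.0) -> float:
    """Very simple static downward compressor: levels above the threshold
    are pulled toward it by the given ratio; levels below it pass unchanged."""
    if level_db <= threshold_db:
        return level_db
    return threshold_db + (level_db - threshold_db) / ratio

# Hypothetical element levels (dBFS) of a wide, theatrical-style mix.
theatrical = {"dialogue": -27.0, "music": -14.0, "explosion": -3.0}

# A crude stand-in for a near-field pass: squeeze everything above the threshold.
near_field = {name: compress(db) for name, db in theatrical.items()}

for name, db in theatrical.items():
    print(f"{name:9s}  theatrical {db:6.1f} dBFS   near-field {near_field[name]:6.1f} dBFS")

spread_before = theatrical["explosion"] - theatrical["dialogue"]
spread_after = near_field["explosion"] - near_field["dialogue"]
print(f"dialogue-to-explosion spread: {spread_before:.1f} dB -> {spread_after:.1f} dB")
```

Running this, the quiet dialogue stays where it is while the explosion comes down, shrinking the spread from 24 dB to about 12.7 dB. With the loud elements pulled down, the whole mix can then be turned up to a comfortable living-room level without the peaks becoming painful, which is roughly the trade the engineers in the article describe (at the cost of the wide dynamics a theatrical room can support).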