Profile Information

  • Location
  • About
    Bilbaina Jazz Club
  • Interested in Sound for Picture

  1. As far as I know (I own two units of the previous generation), the newest units (UTX-40, etc.) offer some additional options:
     - A "gain" mode in the microphone attenuation menu
     - An occupancy-sensor mode to prevent ultrasound from occupancy sensors hitting the microphones and affecting the compander
     - A digital shoe interface for Sony cameras
     Always check the frequencies you intend to use. Do a scan to make sure they are free. If you run into problems and you don't have a spectrum analyzer, do a scan on a different channel group. And if you are not using an RF Explorer or frequency coordination software, Sony at the very least offers advice on frequency usage depending on the number of channels you want. Check the frequency list for the relevant models (it's more or less the same for the latest UWP-D and the previous generation), paying attention to the "Grouping for multi-channel system" sections. https://pro.sony/ue_US/support-resources/utx-b40/manual
     Also, somewhere else in this forum @Atanas posted a firmware download to add the occupancy-sensor immunity to the compander (one of the advantages of a digital-domain implementation), but I haven't tried it. My units were purchased second hand and I am trying to get an answer from Sony about this beta firmware.
     And, last but not least: knowing that the included mics are really poor, I ordered two second-hand ECM77-BMP microphones from eBay. Surprise: they have a ferrite clamp installed close to the jack. Is this a hint that these transceivers might be susceptible to RF interference picked up through the microphone connector? As far as I know these ferrite clamps are installed by Sony themselves.
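The point of those "Grouping for multi-channel system" tables is avoiding third-order intermodulation: the 2A-B product of any two transmit frequencies must not land on (or near) another carrier. A minimal sketch of that check, with illustrative frequencies and guard band that are NOT taken from Sony's tables:

```python
# Third-order intermodulation check, the idea behind multi-channel
# frequency grouping tables. All figures below are illustrative.
from itertools import permutations

def im3_products(freqs_mhz):
    """Third-order products 2*A - B for every ordered pair of carriers."""
    return {2 * a - b for a, b in permutations(freqs_mhz, 2)}

def is_compatible(freqs_mhz, guard_mhz=0.1):
    """True if no 2A-B product lands within guard_mhz of any carrier."""
    return all(abs(p - f) > guard_mhz
               for p in im3_products(freqs_mhz) for f in freqs_mhz)

# A deliberately bad set: 2*566.0 - 565.5 = 566.5, right on a carrier.
print(is_compatible([565.5, 566.0, 566.5]))  # False
print(is_compatible([565.5, 566.1, 567.0]))  # True
```

Real coordination software also checks higher-order products and receiver-dependent spacing rules, but this is the core of why an arbitrary set of "free" frequencies can still fail.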
  2. WRC-23 (the World Radiocommunication Conference) is ongoing, and it seems PMSE (programme making and special events) will tend to be protected. At least that is the European position. This is interesting: https://accesspartnership.com/access-alert-what-impact-might-wrc-23-have-on-the-content-and-events-industry/ Quoting: "A recent study from the European Commission showed that demand for PMSE spectrum was increasing in 50% of the EU Member States, with Spain showing the biggest increase at 20% a year[3]."
     At least they are considering PMSE as an incumbent application that should be protected. And while they are considering a proposal for mobile service (beware: in radio-regulation gobbledygook, "mobile service" is _not_ cell phones, which are called "IMT", and IMT is the real threat to PMSE), it is interesting to note that, while broadcasting usage is decreasing in some countries, they are taking into account the growth in PMSE, i.e., wireless microphones and associated applications.
     And https://www.itu.int/dms_pub/itu-r/oth/0A/0A/R0A0A0000150001PDFE.pdf Slide 14: "CEPT supports the continuation and development of the incumbent usage by PMSE (SAP/SAB) (in accordance with existing RR No. 5.296)."
     So maybe there is hope! Sometimes crazy proposals are made at World Radiocommunication Conferences only to be discarded. I remember the French put out a poorly written proposal to reassign the Amateur 144-146 MHz spectrum to aeronautical communications, which was rejected. They are still insisting, though, and the International Amateur Radio Union is still on guard.
     And, adding some more information, the Radio Spectrum Policy Group of the European Commission recommends "long term regulation stability" for PMSE and free-to-view television. I know I am talking about Europe (Region 1), but I hope the US authorities get a clue.
     Who fights to protect PMSE usage, anyway? It seems to be mostly broadcasting organizations (such as the EBU), but maybe other representatives of the entertainment (or sports) industry should participate somehow. A coalition of wireless equipment manufacturers and industry associations such as AMPS (the MPAA, even!) could help make a difference by joining forces with the broadcasting associations.
  3. Interesting. What chipset does it use? In my experience Realteks are a big no-no; the AX88179A chipset (or similar, I guess) works, although not as well as the Thunderbolt-attached Broadcom.
  4. Ouch! There are several Thunderbolt to Ethernet adapters sold by Sonnet. I think the one sold as "AVB" has a Broadcom chip (well supported by Apple and others) and would work as well, without needing to chain dongles. But I haven't tried it. The Mac Mini option is of course the best one. The thing with Dante is, the protocol is not that complicated. But the market is littered with real networking junk which works well enough for the low-end peecee market (or doesn't work at all, and unsuspecting customers don't know who is to blame).
  5. Last summer I tried several USB-C to Ethernet interfaces with a MacBook Pro. I was going to do something rather straightforward: recording a concert (16 channels over Dante from the stage preamps). I only found one combination that really worked as it should: the old Apple Thunderbolt GbE adapter chained to a Thunderbolt 4 to Thunderbolt 2 adapter. The rest were a mixed bag, and some of them were outright catastrophic.
  6. There is something called "institutional knowledge": skills that are difficult to document and that pass from generation to generation. If you break the cycle through a layoff, or by letting people retire without passing on that knowledge through apprenticeship, you are toast. That happened, it's been said, to the French nuclear industry, and it backfired spectacularly when building the Olkiluoto reactor in Finland. https://www.power-eng.com/news/lessons-learned-from-olkiluoto-3-plant/#gref
  7. Actually Deity licensed several patents from Zaxcom, so they can record and transmit without legal trouble.
  8. borjam

    Rode mics

    I understand his frustration with overly enthusiastic marketing departments, but he doesn't seem to understand very well how digital audio works. Dynamic range (let's say, macrodynamics) is different from resolution. When he says his voice can be recorded with 8 bits, I think he is talking about envelopes, ranges of amplitudes, not faithful audio recording. So much for 8-bit audio; even good old digital telephone audio used 12 bits! (Yes, non-linear, A-law or µ-law, but it's still 12.) Resolution is related to S/N ratio because low resolution means quantization noise. So when you record, you aim to make good use of the A/D dynamic range while leaving some headroom to prevent clipping. Of course everyone here knows that there is a chain of elements with different dynamic ranges involved in audio recording:
    - Air (yes, it can become non-linear at loud enough levels!)
    - Microphone
    - Preamp
    - ADC
    And it turns out many beginners (at whom many Rode products are aimed!) tend to fail at setting recording levels, and clip. So, what's wrong with making their lives easier? As for controlling all of the circumstances or not, well, it depends. For people shooting guerrilla style, or documentaries, or even doing nature recordings, all kinds of unpredictable stuff can happen! Imagine you are recording a distant bird (with lots of preamp gain!) and suddenly an interesting bird comes close and sings. So yes, it won't capture 32 bits of audio. Fine. But it will be much more lenient with recording levels. So what's wrong? YouTube videos. UGH!
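The "resolution means quantization noise" point can be put in numbers: a textbook figure (not specific to any Rode product) is that an ideal N-bit converter digitizing a full-scale sine has a theoretical SNR of about 6.02N + 1.76 dB:

```python
# Theoretical SNR of an ideal N-bit ADC for a full-scale sine wave.
# (32-bit float keeps ~24 bits of mantissa at any signal level,
# which is why float recording is so forgiving about gain setting.)
def ideal_snr_db(bits):
    return 6.02 * bits + 1.76

for bits in (8, 12, 16, 24):
    print(f"{bits:2d} bits: {ideal_snr_db(bits):6.2f} dB")
```

Which makes the gap obvious: about 50 dB at 8 bits versus about 146 dB at 24 bits, before any analog limitations.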
  9. Well, maybe that Zaxnet system is way too naive. That approach can work in regulated spectrum with all the players following similar rules. But in ISM bands, where several radically different schemes can be used… Remember that Deity mentioned a clever mechanism to advise nearby WiFi equipment to leave free radio time. How old is Zaxnet, anyway? Also, there might be other problems. How is its front-end filtering? I mean, despite using one end of the available band, how well would it resist off-channel interference? On ISM bands your transmission system should be designed with the assumption that it will have to coexist. And actually I am probably wrong about the 2.4 GHz band. Maybe it has gone through a sort of transition period with less pollution. But now wireless headphones and other non-WiFi 2.4 GHz devices are on the rise. So my assumption is indeed likely inaccurate; I was only thinking about WiFi, sorry.
  10. Yes, the 6 GHz WiFi band is more or less the same (radio-wise) as the 5.8 GHz band. The advantages are:
     - There is much more bandwidth than at 2.4 GHz.
     - It is less tolerant to obstacles.
     The second may be shocking for some, but it is a real advantage. Whenever I give a talk about WiFi networks to customers, I point out the main mistake made in many WiFi deployments. Whenever I hear something like "I use high-power network gear, I can connect even from the other side of the parking lot!" I answer "You botched it!" Why? Because WiFi equipment adapts to network conditions. When your signal is great, equipment chooses the high-throughput signal codings. With poor conditions, it falls back to the slowest modes, which can work with a marginal radio signal. So, yes, it connects from the parking lot! What many people fail to grasp is that there is a precious resource in your wireless network, and it's called "time". If you have a far-away user connected in a low-performance mode, it will use a lot of on-air time, which will slow down the whole "cell" (all users of the same access point).
     So why is 5-6 GHz better? It makes you increase the access point density in order to cover a wide area, and the higher attenuation will help prevent distant users from connecting to the wrong AP. Also, less obstacle tolerance means less interference from neighbors. So it can be a blessing.
     Anyway, with WiFi users migrating to 5 GHz whenever possible, the 2.4 GHz band should be less polluted now than it was several years ago. Except, of course, for other equipment such as follow focus, video transmitters, etc. As I told a customer the other day: the two kinds of wireless hells on Earth are schools and movie sets.
     As for 5G, it can use several different bands. The lowest frequencies (700 MHz) have much better wall penetration, so it should work pretty well inside buildings, while there are provisions for high-density deployments at 2.6 or even 26 GHz, which would be adequate for places such as sports stadiums.
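The "time is the precious resource" argument is easy to quantify. With made-up but typical PHY rates, the airtime cost of moving the same payload at a slow fallback rate versus a healthy link looks like this:

```python
# Airtime occupied when moving the same payload at different Wi-Fi
# PHY rates. Rates are illustrative; real throughput is lower due to
# protocol overhead, but the ratio is what matters here.
def airtime_ms(payload_bytes, phy_rate_mbps):
    """Milliseconds of channel time the payload occupies at a PHY rate."""
    return payload_bytes * 8 / (phy_rate_mbps * 1e6) * 1000

payload = 1_000_000  # 1 MB
for rate_mbps in (6, 54, 866):  # slow fallback, 802.11g top, 802.11ac-class
    print(f"{rate_mbps:4d} Mb/s: {airtime_ms(payload, rate_mbps):8.1f} ms")
```

The parking-lot client at 6 Mb/s occupies the channel well over a hundred times longer than an 866 Mb/s client moving the same data, starving everyone else on that access point.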
  11. Properly implemented spread spectrum can be really robust. Of course it can be vulnerable to receiver overload, but the legal limits on transmission power should avoid that. There is no magic involved, although some spread spectrum schemes that also use FEC (Forward Error Correction), like LoRaWAN, can achieve unbelievable results. I am not suggesting they are using LoRa (it's a low data rate modulation, so that would be impossible), but modern modulation techniques can be amazing. At the cost of latency, if using FEC.
     Note that it includes timecode, they claim. That means, you may think, that they can cope with more latency and the audio files will still be usable. There is a limit to that, of course, but (assuming it is frequency hopping) if the hops are really random they shouldn't expect many collisions. As for other devices, it depends. If they are using frequency hopping, I wonder about the front-end filtering of the chips they employ. I guess they keep latency more or less under control, maybe by having the equipment make decisions about interference avoidance. This is old tech; even the Telebit Trailblazer modems of the 80s did something like that over telephone lines. Instead of the classic approach of one carrier, they used many low-bandwidth carriers, and if the error rate was high for some carriers, those frequencies were blacklisted, if I remember correctly.
     As I said above, I imagine they are using some off-the-shelf chipset that implements this. I have hints, because a customer in a completely unrelated field is developing a spread spectrum transmission system and I am pretty sure they aren't developing their own modulation schemes. I guess it's good enough for modest, prosumer-level equipment, but I am pretty sure they are not trying to compete with the high-end professional stuff all of you use. The RF front ends alone must be really expensive. My brother, who works for a stage equipment company, told me he was sure that Lectrosonics cheated with power levels, because he couldn't understand how they managed to work otherwise. I had to explain to him that, well, that's not possible, and not all RF circuits are created equal!
     WiFi imposes several limitations, including collision avoidance measures if you intend to send large packets. Also, it doesn't have an arbitration protocol. Maybe they are using WiFi chipsets with a somewhat tailored implementation of the protocols? But I doubt it. If you develop your own scheme, you can make the important decisions yourself and trade off latency, error correction, total channel capacity, etc., with more flexibility. There is buffering too, simply because you are packing a bunch of samples into a packet and computing an error detection/correction code. So, latency. WiFi sends packets, of course; you can say that access points do the packet switching. And the use cases for TCP and UDP are different. UDP is mostly appropriate if your priority is real time (say, a journalist broadcasting live via a satellite link) and you can tolerate a glitch or two. TCP, with congestion control, error detection, and retransmission, offers data integrity but sacrifices latency. It can be a middle-ground solution anyway.
     I think Rode are no fools. As a matter of fact, three years ago the marketing dept wanted to buy a wireless microphone to record some videos at trade shows. They were going to buy some utterly crappy stuff and I convinced them to, as a bare minimum, buy a Rode set. So far it has worked even in very busy trade shows with WiFi activity all over the place. In typical Rode fashion, I imagine it will be very good equipment for the money, but surely no match for the established high-end names.
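The "random hops shouldn't collide often" intuition is easy to simulate. Assuming (purely as an illustration, not any vendor's actual scheme) k independent hoppers, each picking one of n channels per hop:

```python
# Monte Carlo sketch of collision probability for random frequency
# hopping: a hop "collides" when two hoppers pick the same channel.
# k=4 hoppers over n=79 channels are illustrative numbers only.
import random

def collision_rate(k=4, n=79, hops=100_000, seed=1):
    """Fraction of hops in which at least two hoppers share a channel."""
    rng = random.Random(seed)
    collisions = 0
    for _ in range(hops):
        picks = [rng.randrange(n) for _ in range(k)]
        if len(set(picks)) < k:  # at least one duplicate channel
            collisions += 1
    return collisions / hops

print(f"{collision_rate():.3f}")  # ~0.07 for these numbers (birthday problem)
```

A per-hop collision probability of a few percent is exactly why such systems also lean on FEC and retransmission rather than on hopping alone.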
  12. Maybe they are using some spread spectrum modulation which, as long as the receiver is not saturated by a really strong signal (such as a nearby microwave oven), can be unbelievably resilient. I have heard of some frequency hopping applications working at 2.4 GHz that, I guess, are based on (maybe?) some new chipset implementing it. For certain applications spread spectrum can seem like magic. And with digital signal processing power becoming so affordable and energy efficient, you can use techniques that were unthinkable several years ago.
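For what it's worth, the "magic" of spread spectrum has a standard figure of merit: processing gain, the ratio of spread bandwidth to data rate. A direct-sequence example (the rates are illustrative, and nothing here implies these products actually use DSSS):

```python
import math

# Processing gain of a DSSS system: Gp = 10*log10(chip_rate / data_rate).
# After despreading, a narrowband interferer is suppressed by roughly Gp.
def processing_gain_db(chip_rate_hz, data_rate_bps):
    return 10 * math.log10(chip_rate_hz / data_rate_bps)

# Spreading a 1 kb/s data stream with a 1 Mchip/s code:
print(f"{processing_gain_db(1e6, 1e3):.0f} dB")  # 30 dB
```

That 30 dB of interference rejection, for free in terms of transmit power, is why a well-designed spread spectrum link can shrug off narrowband neighbors that would wipe out a conventional FM channel.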
  13. For reference (sorry about the post avalanche), I would avoid this one for now. If you see Realtek as the vendor ID, run for your life. I think there are better Realtek chips (maybe the D-Link for which I cancelled the order was better), but this one is unacceptable.
  14. Indeed, Apple have been quite stupid (in my opinion). They should keep an up-to-date, proper Ethernet interface. Maybe not cheap, but at least with a minimum expectation of being solid. I can imagine countless situations in which someone goes to record a concert with a $4000 laptop and a $25 crappy Ethernet adapter only to find that it won't work, while others (maybe with a built-in Ethernet port) are capable of doing it. Dodgy hardware can damage reputation more than non-existent products. And I got the Belkin adapter from the Apple Store together with the laptop; it was the recommended model. At least the hub I have just tried is not terrible, but it was a bit more troublesome to set up than the TB-GbE. And it probably means that other dongles based on the same chipset should work.
     One more test. With the Thunderbolt adapter running a full-duplex iperf3, kernel CPU consumption was about 44% at full bandwidth. With the Ugreen hub, 70-75% CPU, with good performance. Not sure about its buffering strategy; it took some seconds to reach full bandwidth, while the Thunderbolt topped it almost instantly. With the crappy Belkin, 92% CPU (!!!!) and appalling performance:
     [ ID][Role] Interval          Transfer     Bitrate
     [  5][TX-C] 0.00-30.00 sec   2.75 GBytes  787 Mbits/sec  sender
     [  5][TX-C] 0.00-30.00 sec   2.75 GBytes  787 Mbits/sec  receiver
     [  7][RX-C] 0.00-30.00 sec   1.11 GBytes  319 Mbits/sec  sender
     [  7][RX-C] 0.00-30.00 sec   1.11 GBytes  319 Mbits/sec  receiver
     As far as I know that is not 44% (or 72%) of the whole CPU power, but of one core.
  15. @Vincent R. It works, it seems. That said, when moving it from one USB-C port to another I was forced to delete the interface and recreate it; otherwise it was unable to achieve clock lock. I did the test connected to a 2010 Mac Pro running Mojave through a direct cable, on a dedicated interface (no switches messing around). Latency is good, and iperf throughput is as it should be. But it was trickier to make it work, being USB-C; I had to do some voodoo macumba, deleting and recreating the interface whenever I changed USB ports, or it would not lock. Curiously, latency with the Thunderbolt adapter is a bit higher. But it seems to be more solid.