Codecs Serve Increasingly Diverse Needs

April 5, 2026 at 2:00 p.m.

This is one in a series about trends and best practices in codecs for radio.

Chris Crump

Chris Crump is senior director of sales and marketing for Comrex. He has experience as a radio producer and remote broadcast engineer, and has held technical sales roles for several manufacturers.

Radio World: Chris, can you give us your perspective on the most important current or recent trend in codecs?

Chris Crump: I don’t think we are seeing just one trend but maybe a few. 

As we see younger broadcast talent entering the industry, they’re wanting to depend more on their personal mobile phones whenever and wherever possible. There’s an increasing dependence on apps and, of course, social media as an extension of their terrestrial broadcast. 

On the corporate side, we are being asked for large-scale virtualization to address centralized infrastructure or disaster recovery plans on an enterprise level. So while we see talent wanting the freedom that mobility offers, we also see a need for the cost savings that centralization can offer. 

RW: How has the expanding use of the cloud changed the role and use of codecs?

Crump: “Use of the cloud” always kind of makes me chuckle because it basically just means “somebody else’s computer that’s connected to the internet.” 

But some of our biggest customers require a large-scale, virtualized codec that can live “in the cloud” to address their need for cost savings, DRP and ease of routing programming within and between very large facilities. Several have moved or will be moving to our ACCESS VM platform for centralized distribution of programming and streaming content. This is especially important in scenarios where having 100 or more hardware codecs is no longer tenable in terms of both cost and rack space. 

RW: How well do today’s codecs integrate with today’s AoIP networks and infrastructures; what issues do they present?

Crump: Luckily, or perhaps out of necessity, standards exist that help facilitate these integrations. Most professional audio codecs on the market support some or all of the various proprietary flavors of AoIP — Livewire, WheatNet, Dante, Ravenna — or the AES67 standard for AoIP interoperability. 

We’ve also pulled some standards from the video side of our business, such as SMPTE ST 2022-7 seamless protection switching and NMOS, which are critical for our key distribution customers that provide both audio and video content. 

Our company philosophy has always been to support free, unlicensed, open-source standards and platforms to allow for easier integration of products with AoIP systems and to keep the costs of our products reasonable for our customers. 

For example, we love the idea of AES70 for control and monitoring of media devices over AoIP networks, but it’s unlikely that we’ll see it implemented by console manufacturers because that’s really their “secret sauce,” if you will.

RW: What considerations should be taken into account to allow radio talent to do shows using their phones?

Crump: Mobile phones have improved drastically as processors have gotten faster and storage more efficient. But today’s smartphones are very personal objects, and users have their own unique ways of using them. 

Using a phone for reliable broadcast requires users to understand that they need to turn off resource-draining background apps and take measures to ensure an uninterrupted broadcast, perhaps even by using a specific phone configured solely for broadcast use.

There’s so much that can go wrong if someone is running a bunch of background apps and if they forget to put the device in “Do Not Disturb” or Airplane mode before they go on air. 

As developers, we must make sure our apps work on about a gazillion mobile devices, with new devices being introduced all the time. As with any broadcast, having a backup plan is key. We really like the concept of apps, but for reliability, we still encourage the use of our purpose-built, hardware codecs like the ACCESS NX Portable.

Comrex has developed several products and applications that take advantage of mobile phones. Our free FieldTap can be used to connect to our ACCESS and BRIC-Link codecs using a wireless internet connection like 4G/5G or Wi-Fi, and it can also be used with our new FieldLink sideline reporter codec on a private Wi-Fi connection. 

Our Gagl + Hotline subscription-based service utilizes a web browser on a mobile device, but it also allows users to call a 10-digit phone number in the U.S. from a Verizon, AT&T or T-Mobile device. This special phone line maintains near-studio HD Voice quality all the way to the hardware codec in the studio. 

Our Opal IP Audio gateway uses the same WebRTC technology from a mobile device’s web browser to a dedicated hardware device in the studio. We’ve also seen customers having success using USB-C microphone/headphone interfaces from Shure, IK and others with mobile devices, to make the experience more professional and broadcast-like.

RW: Can you tell us about a recent installation or application for codecs that you found notable?

Crump: We recently shipped our very first FieldLink Sideline reporter codec, which uses mobile phones to get audio from courtside or the sidelines up to the press box. FM station KPGZ(LP) in Kearney, Mo., was the first to use it, at the Missouri High School Football state championships, with great success. 

This product was developed for our customers who were requesting a simple and affordable way to do sideline reporting. So, for it to deliver such great results right out of the box and to generate comments like, “This thing is friggin’ cool” was a great feeling for everyone at Comrex.

Read more on this topic in the free ebook “Trends in Codecs 2026.”

The post Codecs Serve Increasingly Diverse Needs appeared first on Radio World.

Tieline Will Debut ViA Duo Codec

March 30, 2026 at 2:23 p.m.
The Tieline ViA Duo portable codec

Tieline will debut its ViA Duo codec at the NAB Show.

“ViA Duo is an ultra-portable broadcast platform that unifies IP workflows by delivering an all-in-one solution for remote broadcasts, commentary and off-tube broadcasting,” the manufacturer said in a release.

Charlie Gawley, VP Sales APAC/EMEA, said the ViA Duo bridges a gap between field reporting and commentary. It operates as an IP codec or AoIP commentary node, or facilitates off-tube broadcasting through support for audio and a video feed.

“ViA Duo supports HDMI video out, multiple AoIP protocols, plus IP streaming over multiple interfaces.”

Applications include reporters with a guest; two commentators broadcasting sports events; announcers working from home; talk show and other radio show hosts on the road; and off-tube commentary or a sports commentary unit over AES67, ST 2110-30, Livewire, Ravenna or Dante. An AoIP and Dante card is optional.

Tieline says the ViA Duo is lightweight and fits in the palm of a hand.

Features include two XLR mic/line inputs and headphone outputs. It can send bidirectional stereo or dual mono IP audio from a remote location to the studio using cellular, Wi-Fi or Ethernet connections, with no need for additional gear.

“Use ViA Duo as an AoIP node in a stadium commentary booth and connect over a LAN to a remote truck, or over a WAN to a mixing console at a studio REMI hub. Off-tube commentary is also supported so your commentary team can receive a live video feed and call a game from wherever they are in the world.”

Remote control is provided using the embedded Toolbox Web-GUI or optional Cloud Codec Controller.

NAB Show Booth: C2246

The post Tieline Will Debut ViA Duo Codec appeared first on Radio World.

Path Redundancy Is Now Very Cost-Effective

March 25, 2026 at 2:55 p.m.
Credit: Yurichiro Chino/Getty Images

This is one in a series about trends and best practices in codecs for radio.

Robbie Green is product manager, communications products for Telos Alliance. He is the former senior director, enterprise technology for Audacy and has held engineering management roles at Cumulus and Clear Channel, among other broadcast companies.

Robbie Green

Radio World: Robbie, what is the most important trend in the design or use of codecs for radio broadcasting?

Robbie Green: Hands down, it’s path redundancy. To ensure 100% uptime, you need to send your critical audio across more than one path. The good news is IP path redundancy is very cost-effective these days. You could pair two different wired ISPs, or an ISP and a low-cost IP radio option. As long as one link remains up, you are still on the air.
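
The idea can be sketched in a few lines: a receiver fed duplicate numbered packets over two links plays the first copy of each sequence number it sees, so audio survives as long as at least one path delivers each packet. This is a generic illustrative model, not any vendor's implementation:

```python
class DualPathReceiver:
    """Toy model of hitless dual-path reception: identical numbered
    packets are sent down two links; the receiver keeps the first
    copy of each sequence number and discards the duplicate."""

    def __init__(self):
        self.buffer = {}  # sequence number -> payload

    def receive(self, seq, payload):
        if seq not in self.buffer:  # first copy wins; duplicates dropped
            self.buffer[seq] = payload

    def play_out(self):
        # Play in sequence order, regardless of arrival order.
        return [self.buffer[s] for s in sorted(self.buffer)]

rx = DualPathReceiver()
# Path A loses packet 2; path B loses packet 3. Combined, nothing is lost.
for seq, data in [(1, "s1"), (3, "s3")]:   # packets that survived path A
    rx.receive(seq, data)
for seq, data in [(1, "s1"), (2, "s2")]:   # packets that survived path B
    rx.receive(seq, data)
print(rx.play_out())  # ['s1', 's2', 's3']
```

The same logic underlies standards such as SMPTE ST 2022-7 seamless protection switching mentioned elsewhere in this series.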

RW: More and more parts of the broadcast air chain now are performed in software rather than hardware. How has this affected broadcast codecs?

Green: If anything, it’s made codecs even more ubiquitous. As air chains have become increasingly virtualized and coding algorithms standardized, deploying codecs has become much easier. 

Telos offers several virtual codec choices tailored to different tasks; all of them integrate into modern studio systems and AoIP networks quite nicely. Things have come a long way since the days of ISDN and 66 blocks!

RW: How well do today’s codecs integrate with today’s AoIP networks and infrastructures; what issues do they present?

Green: I’d say they integrate extremely well. As the people who invented AoIP for broadcast, we’d better have codec integration nailed down! Telos Zephyr Connect and iPort codecs integrate directly with Axia Livewire and AES67 networks. It’s very seamless.

RW: How widespread are IP-based systems for STL applications now?

Green: I’d say they’re very widespread. In many parts of the world, IP-based STLs are more the rule than the exception these days.

RW: What tools are available for sending audio to multiple locations at once?

Green: Telos Zephyr Connect and iPort are designed to send audio to over 64 locations using both a main and a backup path for each link. Because packets travel on both paths, audio stays seamless on the air as long as each packet arrives on at least one of them. These paths could be two different internet connections, or a combination of some sort of wired or fiber path and an IP radio link.

RW: How are manufacturers assuring reliable transmission with low delay over marginal IP networks?

Green: The gold standard is really path diversity. When it comes to any RF or wired link, it’s not a matter of if the link will fail, but when. Eventually, a backhoe will cut the fiber, or some other radio in your vicinity will become spurious and disrupt your microwave link. 

Fortunately, these days, there are very cost-effective options to achieve path redundancy. You can pair a business-class cable modem or fiber link with inexpensive IP radios to create an STL that won’t go down, even if you experience a fiber cut or interference issues with a microwave link. 

RW: How can an engineer protect codecs and their related infrastructure from cyber attacks?

Green: Three words: “change the defaults.” These are crimes of opportunity, and factory logins are there to get you started, not for long-term use. 

Changing usernames and passwords should be a routine part of any new equipment installation. There are lots of excellent password manager apps out there; pick one and let it generate unique secure passwords for your gear. It’s just good practice.
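
As a minimal sketch of the practice, Python's standard `secrets` module can generate a unique, cryptographically random password per device (a generic illustration; any reputable password manager does the same job):

```python
import secrets
import string

def device_password(length: int = 20) -> str:
    """Generate a cryptographically random password for one device."""
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*"
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(device_password())  # unique on every run; never reuse across gear
```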

Ideally, any codec that sends audio over the public internet should also sit behind a firewall, with its traffic sent through a VPN tunnel. 

By creating a VPN, you effectively create a link extension from location A to location B over the public internet that nobody else can access; think of it as a very long Ethernet cable. If cost and complexity are an issue, Ubiquiti offers inexpensive solutions like the Gateway Lite and Gateway Max that feature easy-to-use setup wizards.

RW: Is availability of parts for legacy codecs a serious problem? 

Green: Although many old hardware codecs soldier on reliably, there are units out there that are 10 to 15 years old, sometimes older. I don’t think parts availability for units this old is an issue, because advances in algorithms, connectivity and link reliability, plus the migration to software codecs for many applications, make replacing these old devices with new, modern solutions a wiser monetary choice than repairing them.

RW: What misconceptions do people have about codecs that you’d like to dispel?

Green: That the public internet isn’t ready for prime time when it comes to audio transport, and that 950 MHz STLs are the only way to guarantee reliability.

While 950 MHz links have traditionally performed well, they have several single points of failure — a dish can be damaged by falling ice or water intrusion over time, large coax cables are an attractive target for copper thieves, etc.

Inexpensive IP links offer transport redundancy that 950 MHz links just can’t beat.

Read more on this topic in the free ebook “Trends in Codecs 2026.”

The post Path Redundancy Is Now Very Cost-Effective appeared first on Radio World.

Codecs Go Far Beyond Point-To-Point

March 17, 2026 at 8:01 p.m.
Tony Gervasi in his workshop in Delray Beach, Fla.

A Radio World ebook explores trends and best practices in codecs for radio broadcasting applications.

Tony Gervasi is Intraplex sales manager at GatesAir, which he joined in 2018. He is former senior VP of engineering and technology for Nassau Broadcasting Partners and has held roles as a DOE, chief engineer and DJ. He is also an award-winning restaurateur with his son Chef Michael.

Radio World: Tony, what would you say is the most important trend in the design or use of broadcast codecs?

Tony Gervasi: Distribution — it’s more than just point-to-point. It’s NOC to multi-city and satellite replacement distribution. I am currently working with multiple large broadcasting companies, both radio and TV, on designing new public and private IP distribution systems.

RW: How widespread are IP-based systems for STL applications now?

Gervasi: Most stations have some type of IP-based codec for STLs. We pair our 950 MHz STL system with our IPL codecs for the most flexibility. With the HDL-IPL pairing, you can transport baseband FM with RDS and E2X data over the 950 radio using one of the three WAN ports on the IPL unit, and have a backup path over the public internet.

RW: How do today’s codecs avoid problems with dropped packets?

Gervasi: Intraplex offers multiple ways to make up for lost or dropped packets:

  • Forward Error Correction (FEC) with or without interleave — The IPL units offer four choices of FEC but at the expense of additional bandwidth. FEC is very effective for random/isolated losses.
  • Secure Reliable Transport (SRT) — This open-source transport protocol is similar to TCP; however, it operates at the application layer using UDP as the underlying transport. SRT supports retransmission of lost packets while maintaining low latency, 120 ms by default, and it also supports AES encryption. SRT does require a bidirectional UDP path.
  • Dynamic Stream Splicing — This is exclusive to GatesAir Intraplex. DSS sends grouped streams via path diversity, time diversity or both, providing “hitless” operation.
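
To illustrate the first bullet, here is a generic XOR-parity sketch of how FEC recovers a single lost packet in a group at the cost of one extra packet of bandwidth. It is not the specific scheme the IPL units implement, just the underlying principle:

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def parity(group):
    """One parity packet protects a group of equal-length packets."""
    p = bytes(len(group[0]))
    for pkt in group:
        p = xor_bytes(p, pkt)
    return p

def recover(received, p):
    """Rebuild the single missing packet (marked None) from the rest."""
    for pkt in received:
        if pkt is not None:
            p = xor_bytes(p, pkt)
    return p

group = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]
fec = parity(group)                            # sent alongside the group
damaged = [b"pkt0", None, b"pkt2", b"pkt3"]    # one packet lost in transit
print(recover(damaged, fec))                   # b'pkt1'
```

As the interview notes, this protection costs additional bandwidth (here, one extra packet per group) and handles isolated losses; burst losses are why interleaving is offered as an option.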

RW: What are the implications of FM-MPX and microMPX to the way the radio industry chooses and deploys codecs?

Gervasi: The GatesAir Intraplex IPL series supports uMPX as well as our proprietary FM-MPX, which transports uncompressed baseband MPX with RDS in 1.64 Mbps.

The current IPL line allows you to select from analog audio, AES3 audio and AES192 as I/O. This provides the most flexibility, as you can deploy audio today and then when you’re ready, convert to MPX / uMPX.

Something unique to the IPL series codec is the ability to transport MPX/RDS with E2X data for HD. By “marrying” the E2X data to the MPX data, once time alignment is set it will not drift. We have stations that are sending this MPX/RDS/E2X bundle to multiple transmitter sites via one IPL unit. And they don’t have to worry about having time alignment hardware for each site. The audio processing and importer/exporter are located at the studio, allowing for easy deployment and maintenance.

RW: What tools are available for sending audio to multiple locations at once?

Gervasi: The Intraplex IPL series codec allows for multi-coding of each audio input, with transport to up to 12 unique destinations, meaning you can send uncompressed audio to Location A, Opus to Location B, and AAC to Location C. There are some limitations due to sample rates and coding overhead.

RW: Can you tell us about a recent installation or application for codecs that you found notable?

Gervasi: Currently I’m working with three major groups on centralized NOC distribution direct to transmitter sites, using the Intraplex Ascent Server — our high-density audio codec with up to 24 channels of inputs — and Ascent Media Gateway, our multi-point distribution system.

We are providing the transport layer not only for audio but for metadata for RDS/HD, GPIO for local firing of elements and VGPIO for insertion of local IDs that are embedded in the IPL codec. We will be expanding the local insertion option for extended audio as well as local ad insertion.

RW: How can an engineer protect codecs and their related infrastructure from cyberattacks?

Gervasi: Protect the codec behind a firewall and only open the ports that are required.

Change the default passwords. I know this sounds basic, but there are many devices out there that still have the default login. Also disable any protocols that are not being used. The IPL series allows the end user to disable HTTP/HTTPS, FTP/SFTP, SNMP, etc.

If your codec allows, you may also want to whitelist IP addresses for management access.

Read more on this topic in the free ebook “Trends in Codecs 2026.”


The post Codecs Go Far Beyond Point-To-Point appeared first on Radio World.

These Developments Are Reshaping FM Infrastructure

March 3, 2026 at 2:00 p.m.

This is one in a series about trends in codecs for radio broadcasting. The author is APT product manager with WorldCast Systems.

With the modernization of broadcasters’ distribution chains, there is a clear trend across the industry toward unified, IP-centric infrastructures. 

The transition from hardware-based broadcast systems to converged, software-defined environments mirrors developments seen across various European Broadcasting Union initiatives. IP-based networks, virtualized media functions and automated orchestration are becoming core components of modern, future-proof media workflows.

These developments reflect a broader shift: production, contribution and transmission are increasingly aligned around the same principles of scalability, resilience and centralized control. 

Within FM radio distribution, this transition manifests most visibly in two areas. 

First, receiver functionality at transmitter sites is changing as IP audio decoding becomes a native feature of modern FM transmitters. This removes the need for external devices and simplifies the last-mile signal path. 

Second, MPX generation and distribution are shifting into virtualized, centralized server environments. This allows broadcasters to manage processing, redundancy and distribution from a unified control domain.

Together, these trends illustrate how a fully IP-based workflow — from central MPX creation to IP-native transmitter input — can streamline operations, improve signal consistency and reduce overall cost. 

With this context in mind, we begin by examining the integration of IP audio decoders directly into FM transmitters and the operational impact of this development.

Integrated IP decoding at the transmitter site

Integrated IP decoding at the transmitter site.

One of the most notable developments in modern FM infrastructures is the integration of IP audio and MPX decoding directly into the transmitter. 

Instead of relying on separate STL decoder hardware, current transmitters embed this functionality directly in their processing architecture. This reduces signal transitions and aligns the RF end of the chain with the IP-native workflows used in studios and control rooms. 

Removing clock-domain crossings and other unnecessary conversions improves the accuracy and stability of the MPX signal at the end of the chain. 

A clear example is the Ecreso AiO transmitter series, which includes a software-based APT IP decoder. It receives both audio and MPX-over-IP feeds and delivers them directly to the digital modulator. 

This integration reduces the need for external equipment, lowers maintenance requirements and consolidates configuration and monitoring into a single interface. It enables a streamlined, fully IP-based FM transmission chain.

The last mile to the analog FM power amplifier

APTmpX preserves the FM multiplex structure while reducing the required bandwidth.

One of the fundamental architectural decisions in an all‑IP FM workflow concerns the transport format of the composite/MPX signal. 

Broadcasters must decide whether to transmit a fully linear MPX signal, which preserves the multiplex in its raw form but requires several megabits per second, or to use a compression format that delivers a secure, high-quality MPX feed at a fraction of the bandwidth.

This choice directly affects the feasibility and cost of MPX distribution on the last IP‑based segment before the analog FM power amplifier.

An uncompressed MPX signal typically occupies 3–4 Mbps. A suitable compression scheme significantly reduces the data rate, making MPX-over-IP practical even on networks designed initially for linear or compressed audio.

The latest development in this space is the second‑generation APTmpX format, a near‑transparent transport method specifically engineered for composite/MPX distribution.

APTmpX preserves the FM multiplex structure, including pilot tone, stereo components and RDS, while reducing the required bandwidth to only 300–600 kbps.
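
The quoted figures are easy to sanity-check with simple arithmetic, assuming a composite signal sampled at 192 kHz with 16-bit samples (illustrative assumptions; actual sample rates and word lengths vary by implementation):

```python
# Linear MPX: a mono composite signal sampled well above its ~100 kHz bandwidth.
sample_rate_hz = 192_000   # assumed composite sampling rate
bits_per_sample = 16       # assumed word length

linear_bps = sample_rate_hz * bits_per_sample
print(f"Linear MPX: {linear_bps / 1e6:.2f} Mbps")    # 3.07 Mbps, matching the 3-4 Mbps figure

aptmpx_bps = 600_000  # upper end of the quoted 300-600 kbps range
print(f"Reduction: {linear_bps / aptmpx_bps:.1f}x")  # roughly 5x at minimum
```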

Broadcasters who rely on APTmpX emphasize its consistent low latency, stable network bitrate, high signal fidelity and intact stereo image. 

In terms of robustness, APTmpX behaves similarly to linear PCM, as each IP packet is transmitted independently without forming packet groups. A lost packet affects only that single unit of data, resulting in extremely short and usually inaudible dropouts, while the overall MPX structure remains essentially intact.
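
The audible impact of a single lost packet can be estimated from the bitrate. With an assumed 1,000-byte payload at a 600 kbps stream rate (illustrative values; real packet sizes depend on configuration), one lost packet corresponds to only about 13 ms of audio:

```python
bitrate_bps = 600_000    # assumed APTmpX stream rate, upper end of quoted range
payload_bytes = 1_000    # assumed audio payload per IP packet

ms_per_packet = payload_bytes * 8 / bitrate_bps * 1000
print(f"One lost packet = {ms_per_packet:.1f} ms of audio")  # 13.3 ms
```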

These characteristics make APTmpX particularly well‑suited for wide‑area contribution networks, public IP links or transmitter sites where engineers depend on a predictable, low‑latency MPX feed with stable peak control.

Integrating APTmpX and virtual encoders into the ST 2110 audio core

MPX generation itself is now also shifting to IP-based studio and central structures.

While the transmitter side has thus become fully IP-capable and the distribution format has been established, MPX generation itself is now also shifting to IP-based studio and central structures. 

Within the ST 2110 audio infrastructure, enterprise-grade stereo processors operate as native participants on the studio’s media network fabric, where MPX becomes a regular essence stream rather than a separate STL branch. This integration allows MPX to be handled, monitored and routed like any other time-critical audio essence within the facility’s IP domain.

In the central room, these MPX streams are handed off to virtualized APTmpX encoders, which run as VM images, containerized services or Kubernetes workloads. Integrated into the orchestration and HA logic, encoder instances can be provisioned, monitored or automatically replaced without disrupting ongoing operations.

Convergence into a single IP-based architecture

This convergence finally removes the last structural boundaries between studio, control and distribution. 

Together, these developments merge previously separate domains into a coherent whole. As MPX processing and distribution shift into virtualized media functions, all stages, from studio generation to transmitter input, operate on the same converged IP media fabric that underpins modern ST 2110 workflows.

Instead of separate systems for production, control and FM delivery, broadcasters gain a unified, centrally orchestrated and more resilient signal chain. 

This architectural unification positions FM transmission within the broader shift toward software-driven, fully IP-centric broadcast infrastructures. 

Every stage, from studio processing to RF modulation, now operates within the same integrated, ST 2110-aligned ecosystem.

This aligns FM distribution with the broader transformation of broadcast infrastructure and establishes a future-proof foundation. Innovations in processing, redundancy or network design can be introduced centrally and deployed instantly across the network.

Cost savings at a glance

APTmpX cuts bandwidth requirements from several megabits to only a few hundred kilobits per second, allowing existing STL links to be used without upgrades. 

Integrated IP decoding in the transmitter eliminates the need for external receiver hardware, while virtualized MPX encoders run on shared compute resources with centralized updates. 

Together, these measures reduce equipment, maintenance effort and long-term operating costs across the entire FM chain.

Read more about trends in codecs in a free ebook here.

Contact the author at h.foerster@worldcastsystems.com.

The post These Developments Are Reshaping FM Infrastructure appeared first on Radio World.
