OD Messaging Audio and Bluetooth Fix

For users or potential users.
Post Reply
Justin Shafer
Posts: 597
Joined: Sat Jul 28, 2007 7:34 pm
Location: Fort Worth, TX.

OD Messaging Audio and Bluetooth Fix

Post by Justin Shafer »

I just left an office and they said the messaging feature that plays audio wasn't working with Bluetooth anymore.

Long story short, I had to add 3 seconds of silence to the beginning of the WAV file, and I also used Audacity's Amplify effect to make the WAV louder.

Apparently the Denon Bluetooth receiver handles streaming fine but not very short WAV files; making the file longer fixed the problem. They also use Spotify, so making the sound louder helped too.

Not sure about other Bluetooth devices, but... hope this helps someone else.
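If anyone wants to do the padding in code instead of Audacity, here is a minimal sketch using NAudio (the class name, paths, and the 3-second default are just illustrative); it writes a new WAV with silence up front:

```csharp
using NAudio.Wave;

public static class WavPadder
{
    // Write a copy of the input WAV with `seconds` of silence prepended.
    public static void PrependSilence(string inPath, string outPath, double seconds = 3.0)
    {
        using (var reader = new WaveFileReader(inPath))
        using (var writer = new WaveFileWriter(outPath, reader.WaveFormat))
        {
            int silenceBytes = (int)(reader.WaveFormat.AverageBytesPerSecond * seconds);
            // Round down to a whole sample frame so we don't corrupt alignment
            silenceBytes -= silenceBytes % reader.WaveFormat.BlockAlign;
            writer.Write(new byte[silenceBytes], 0, silenceBytes);
            reader.CopyTo(writer); // then append the original audio
        }
    }
}
```

Called like `WavPadder.PrependSilence("notify.wav", "notify-padded.wav");` it leaves the original file untouched.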

https://www.opendental.com/manual/messa ... ments.html I used the sounds.zip here as a starting point.

Would be really neat if OD used a different sound API that would mute Spotify while OD was playing a message.

Got it: C# / .NET Framework. On Windows, the "mute/duck other apps while mine plays" behavior is CoreAudio ducking, and the practical way to get it is:

1) Play your audio in the "Communications" stream category (so Windows treats it like Zoom/Teams), and

2) Let Windows apply the user's Sound → Communications setting ("Mute all other sounds", "Reduce the volume of other sounds by 80%", etc.)

Important reality check

Your app cannot reliably force-mute other apps from user mode in a clean, supported way. What you can do is trigger ducking; Windows will then lower or mute other audio if the user's setting allows it.

Best approach in .NET Framework: WASAPI + IAudioClient2 (Communications category)

If you use NAudio, you can get WASAPI output easily. To set the stream category you need IAudioClient2 and AudioClientProperties.

1) Install NAudio

NuGet: NAudio
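Either of these pulls the package in (Package Manager Console in Visual Studio, or the dotnet CLI from the project directory):

```shell
# Visual Studio Package Manager Console
Install-Package NAudio

# or, with the dotnet CLI
dotnet add package NAudio
```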

2) Use WASAPI (shared mode) + set category to Communications

Below is a pattern (COM interop included) that plays a WAV and tags the audio stream as Communications, which is what typically triggers ducking. Note that it reaches into NAudio internals via reflection, so verify it against the NAudio version you actually use:

using System;
using System.Reflection;
using System.Runtime.InteropServices;
using NAudio.CoreAudioApi;
using NAudio.Wave;

public static class DuckingWavPlayer
{
    // ---- CoreAudio interop for IAudioClient2 ----

    [ComImport]
    [Guid("726778CD-F60A-4EDA-82DE-E47610CD78AA")]
    [InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
    private interface IAudioClient2
    {
        // IAudioClient methods. We only call SetClientProperties, but the COM
        // vtable requires every preceding method to be declared in order.
        int Initialize(int shareMode, int streamFlags, long hnsBufferDuration, long hnsPeriodicity,
            IntPtr pFormat, ref Guid audioSessionGuid);
        int GetBufferSize(out uint pNumBufferFrames);
        int GetStreamLatency(out long phnsLatency);
        int GetCurrentPadding(out uint pNumPaddingFrames);
        int IsFormatSupported(int shareMode, IntPtr pFormat, out IntPtr ppClosestMatch);
        int GetMixFormat(out IntPtr ppDeviceFormat);
        int GetDevicePeriod(out long phnsDefaultDevicePeriod, out long phnsMinimumDevicePeriod);
        int Start();
        int Stop();
        int Reset();
        int SetEventHandle(IntPtr eventHandle);
        int GetService(ref Guid riid, [MarshalAs(UnmanagedType.IUnknown)] out object ppv);

        // IAudioClient2 additions. Note the real interface ends with
        // GetBufferSizeLimits; the GetSharedModeEnginePeriod family belongs
        // to IAudioClient3, not IAudioClient2.
        int IsOffloadCapable(int category, out int pbOffloadCapable);
        int SetClientProperties(ref AudioClientProperties pProperties);
        int GetBufferSizeLimits(IntPtr pFormat, int bEventDriven,
            out long phnsMinBufferDuration, out long phnsMaxBufferDuration);
    }

    [StructLayout(LayoutKind.Sequential)]
    private struct AudioClientProperties
    {
        public uint cbSize;
        public int bIsOffload;            // Win32 BOOL (4 bytes)
        public AudioStreamCategory eCategory;
        public AudioClientStreamOptions Options;
    }

    private enum AudioClientStreamOptions : uint
    {
        None = 0x0,
        Raw = 0x1
    }

    // AUDIO_STREAM_CATEGORY values
    private enum AudioStreamCategory
    {
        Other = 0,
        ForegroundOnlyMedia = 1,
        BackgroundCapableMedia = 2,
        Communications = 3,
        Alerts = 4,
        SoundEffects = 5,
        GameEffects = 6,
        GameMedia = 7,
        GameChat = 8,
        Speech = 9,
        Movie = 10,
        Media = 11
    }

    public static void PlayWavAsCommunications(string wavPath)
    {
        // Ask for the communications-role endpoint so the stream is routed
        // to the device the user chose for calls.
        var device = new MMDeviceEnumerator()
            .GetDefaultAudioEndpoint(DataFlow.Render, Role.Communications);

        using (var reader = new AudioFileReader(wavPath))
        using (var output = new WasapiOut(device, AudioClientShareMode.Shared, useEventSync: false, latency: 100))
        {
            // Tag the stream as "Communications" BEFORE Init():
            // SetClientProperties must run before IAudioClient::Initialize,
            // and NAudio calls Initialize inside Init().
            SetWasapiCategory(output, AudioStreamCategory.Communications);

            output.Init(reader);
            output.Play();

            while (output.PlaybackState == PlaybackState.Playing)
                System.Threading.Thread.Sleep(20);
        }
    }

    private static void SetWasapiCategory(WasapiOut wasapiOut, AudioStreamCategory category)
    {
        // NAudio keeps its AudioClient wrapper in a private field. The field
        // names used here ("audioClient", "audioClientInterface") are NAudio
        // internals and can change between versions - verify against the
        // NAudio version you ship.
        var wrapperField = wasapiOut.GetType().GetField("audioClient",
            BindingFlags.Instance | BindingFlags.NonPublic);
        var wrapper = wrapperField?.GetValue(wasapiOut);
        if (wrapper == null)
            return;

        // The wrapper holds the raw COM IAudioClient pointer; casting that RCW
        // to our IAudioClient2 declaration performs a QueryInterface.
        var comField = wrapper.GetType().GetField("audioClientInterface",
            BindingFlags.Instance | BindingFlags.NonPublic);
        var audioClient2 = comField?.GetValue(wrapper) as IAudioClient2;
        if (audioClient2 == null)
            return;

        var props = new AudioClientProperties
        {
            cbSize = (uint)Marshal.SizeOf(typeof(AudioClientProperties)),
            bIsOffload = 0,
            eCategory = category,
            Options = AudioClientStreamOptions.None
        };

        audioClient2.SetClientProperties(ref props);
    }
}
Notes

Selecting the Role.Communications endpoint (as the code does) also helps Windows route and treat the stream appropriately.

Whether other audio is muted or merely reduced depends on the user's Windows setting:
Control Panel → Sound → Communications tab → "Mute all other sounds", "Reduce the volume of other sounds by 80%", etc.
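On the "make it louder" part: if re-editing the file in Audacity each time is a pain, NAudio can also apply gain at playback. A hedged sketch (the class name and the 1.5 default gain are illustrative; values above 1.0 can clip, so test with the real alert sound):

```csharp
using NAudio.Wave;

public static class LouderPlayback
{
    // Boost the message audio at playback time instead of editing the file.
    public static void PlayBoosted(string wavPath, float gain = 1.5f)
    {
        using (var reader = new AudioFileReader(wavPath))
        using (var output = new WaveOutEvent())
        {
            reader.Volume = gain; // 1.0f = original level; >1.0f can clip
            output.Init(reader);
            output.Play();
            while (output.PlaybackState == PlaybackState.Playing)
                System.Threading.Thread.Sleep(20);
        }
    }
}
```

This uses the plain WaveOutEvent device rather than WASAPI, so it won't trigger ducking by itself; it's only the volume half of the fix.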