Hello
I have a 24-bit / 192 kHz sound board and I am using WASAPI loopback capture (NAudio's WasapiLoopbackCapture) to get data from it.
When I print out the capture format, it shows 32 bits per sample instead of 24 bits per sample...
```
private void button2_Click(object sender, EventArgs e)
{
    // waveIn is a WasapiLoopbackCapture field declared on the form
    waveIn = new WasapiLoopbackCapture();
    waveIn.DataAvailable += OnDataAvailable;
    waveIn.RecordingStopped += OnRecordingStopped;
    waveIn.StartRecording();

    // Print the format the capture actually uses
    Console.WriteLine(waveIn.WaveFormat.BitsPerSample);
    Console.WriteLine(waveIn.WaveFormat.AverageBytesPerSecond);
    Console.WriteLine(waveIn.WaveFormat.Channels);
    Console.WriteLine(waveIn.WaveFormat.SampleRate);
    Console.WriteLine(waveIn.WaveFormat.Encoding);
}
```
Which prints out...
```
32
1536000
2
192000
Extensible
```
If I inspect the e.Buffer data that the callback delivers, most of the bytes do contain data. I was expecting three data bytes followed by an empty padding byte per sample (24 bits plus a spare byte), but all four bytes carry data.
So, how should I merge those bytes and in which order?
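Side note: from the numbers above, 1,536,000 bytes/s = 192,000 samples/s × 2 channels × 4 bytes, so every sample really does occupy all four bytes. As far as I understand, a shared-mode WASAPI capture hands you the audio engine's mix format, which is normally 32-bit IEEE float rather than padded 24-bit PCM, regardless of the hardware's bit depth. Below is a minimal sketch to verify the sub-format, assuming your NAudio version exposes WaveFormatExtensible.SubFormat; IsIeeeFloat is a hypothetical helper name, and the GUID is the standard KSDATAFORMAT_SUBTYPE_IEEE_FLOAT:
```
using System;
using NAudio.Wave;

// Hypothetical helper: returns true if the capture format is IEEE-float audio.
static bool IsIeeeFloat(WaveFormat format)
{
    // Standard sub-format GUID for IEEE-float audio (KSDATAFORMAT_SUBTYPE_IEEE_FLOAT)
    var ieeeFloat = new Guid("00000003-0000-0010-8000-00aa00389b71");

    var extensible = format as WaveFormatExtensible;
    if (extensible != null)
        return extensible.SubFormat == ieeeFloat;

    // Non-extensible formats expose the encoding directly
    return format.Encoding == WaveFormatEncoding.IeeeFloat;
}
```
If that returns true, each group of four bytes in e.Buffer is one little-endian 32-bit float sample, which answers the byte-order part of the question.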
The Audio Board specs are:
Realtek Semiconductor Corp.
Audio driver 6.0.1.5919
DirectX 11.0
ALC889A
Thanks!
Comments: Yep, as Mark says, the samples come in that way (as 32-bit floats). This is the function I used to sum both channels into one signal:
```
/// <summary>
/// This is called when audio samples are ready
/// </summary>
void OnDataAvailable(object sender, WaveInEventArgs e)
{
    // Each sample is 32 bits (4 bytes) wide
    Int32 sample_count = e.BytesRecorded / (waveIn.WaveFormat.BitsPerSample / 8);
    Single[] data = new Single[sample_count];

    // Read each 4-byte group as a little-endian IEEE float
    for (int i = 0; i < sample_count; ++i)
    {
        data[i] = BitConverter.ToSingle(e.Buffer, i * 4);
    }

    // Sum the interleaved left/right samples into one mono signal
    int j = 0;
    Audio_Samples = new Double[sample_count / 2];
    for (int sample = 0; sample < data.Length; sample += 2)
    {
        Audio_Samples[j] = (Double)data[sample];
        Audio_Samples[j] += (Double)data[sample + 1];
        ++j;
    }
    Data_Available = true;
}
```
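One follow-up thought on that handler: summing left + right can push the mono value outside the nominal ±1.0 float range, so averaging the two channels may be safer if anything downstream expects normalized samples. Here is a sketch of that variant under the same assumptions (Audio_Samples and Data_Available are the same fields as above); it uses NAudio's WaveBuffer, which lets you read the byte buffer as floats without a per-sample BitConverter call:
```
using NAudio.Wave;

void OnDataAvailable(object sender, WaveInEventArgs e)
{
    // WaveBuffer aliases the byte[], so FloatBuffer reads it as 32-bit floats
    var buffer = new WaveBuffer(e.Buffer);
    int sampleCount = e.BytesRecorded / 4; // 4 bytes per float sample

    // Average each interleaved left/right pair into one mono sample
    Audio_Samples = new double[sampleCount / 2];
    for (int i = 0, j = 0; i < sampleCount - 1; i += 2, ++j)
    {
        Audio_Samples[j] = (buffer.FloatBuffer[i] + buffer.FloatBuffer[i + 1]) / 2.0;
    }
    Data_Available = true;
}
```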