Channel: naudio Work Item RSS Feed

Commented Unassigned: multiple wave normalization [16450]

Hi,

I just want to ask: are there any examples of peak normalization and average normalization for multiple waves?

Thanks
Comments: OK, thanks. I think I'll go with a limiter to prevent clipping...
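For reference, peak normalization across several files can be sketched with NAudio's sample providers. This is a minimal example, not something built into the library: the file names, the use of VolumeSampleProvider and the shared-gain strategy (one common gain so the loudest file just reaches full scale and relative levels are preserved) are my own assumptions.

```
using System;
using System.Linq;
using NAudio.Wave;
using NAudio.Wave.SampleProviders;

class PeakNormalize
{
    // Scan a file and return the largest absolute sample value (0..1 for float samples)
    static float FindPeak(string path)
    {
        using (var reader = new AudioFileReader(path))
        {
            float peak = 0f;
            var buffer = new float[reader.WaveFormat.SampleRate * reader.WaveFormat.Channels];
            int read;
            while ((read = reader.Read(buffer, 0, buffer.Length)) > 0)
            {
                for (int i = 0; i < read; i++)
                {
                    peak = Math.Max(peak, Math.Abs(buffer[i]));
                }
            }
            return peak;
        }
    }

    static void Main()
    {
        var files = new[] { "one.wav", "two.wav" };       // placeholder file names
        float loudest = files.Max(FindPeak);              // common peak across all waves
        float gain = loudest > 0 ? 1.0f / loudest : 1.0f; // one gain keeps relative levels

        foreach (var file in files)
        {
            using (var reader = new AudioFileReader(file))
            {
                var normalized = new VolumeSampleProvider(reader) { Volume = gain };
                WaveFileWriter.CreateWaveFile16(file + ".normalized.wav", normalized);
            }
        }
    }
}
```

Average (RMS) normalization works the same way, except the scan accumulates sum-of-squares per file instead of the peak, and, as the comment above notes, it then needs a limiter to prevent clipping.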

Commented Unassigned: Wrong BitsPerSample with WasapiLoopbackCapture [16401]

Hello

I have a 24 bits per sample / 192kHz sound board and I am using the Wasapi Loopback Capture method to get data from it.

When I print out the capabilities of the board, it shows 32 bits per sample instead of 24 bits per sample...

```
private void button2_Click(object sender, EventArgs e)
{
    waveIn = new WasapiLoopbackCapture();
    waveIn.DataAvailable += OnDataAvailable;
    waveIn.RecordingStopped += OnRecordingStopped;
    waveIn.StartRecording();

    Console.WriteLine(waveIn.WaveFormat.BitsPerSample);
    Console.WriteLine(waveIn.WaveFormat.AverageBytesPerSecond);
    Console.WriteLine(waveIn.WaveFormat.Channels);
    Console.WriteLine(waveIn.WaveFormat.SampleRate);
    Console.WriteLine(waveIn.WaveFormat.Encoding);
}
```

Which prints out...

```
32
1536000
2
192000
Extensible
```

If I check the e.Buffer data that the callback delivers, I see that every byte contains data; I was expecting three data bytes followed by an empty byte (24 bits plus a spare byte), but all of the bytes are used.

So how should I merge those bytes, and in which order?

The Audio Board specs are:

Realtek Semiconductor Corp.
Audio driver 6.0.1.5919
DirectX 11.0
ALC889A

Thanks!
Comments: Yeah, but those values are completely expected.
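For context on why those values are expected: WASAPI loopback capture delivers the shared-mode mix format, which is usually 32-bit IEEE float carried in a WAVEFORMATEXTENSIBLE header (hence "Extensible"), not the card's native 24-bit PCM. Below is a minimal sketch, assuming the samples really are 32-bit floats (check the format before relying on this), of reading them from e.Buffer and optionally packing each one down to 24-bit PCM:

```
// Assumes 32-bit IEEE float samples, 4 bytes each, little-endian.
void OnDataAvailable(object sender, WaveInEventArgs e)
{
    for (int i = 0; i < e.BytesRecorded; i += 4)
    {
        float sample = BitConverter.ToSingle(e.Buffer, i);

        // Optional: convert to signed 24-bit PCM, least significant byte first
        int s24 = (int)(Math.Max(-1f, Math.Min(1f, sample)) * 8388607);
        byte b0 = (byte)(s24 & 0xFF);
        byte b1 = (byte)((s24 >> 8) & 0xFF);
        byte b2 = (byte)((s24 >> 16) & 0xFF);
        // write b0, b1, b2 to the 24-bit destination here
    }
}
```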

Created Unassigned: Null Ref Crash With Multiple Play/Stop Calls Back-To-Back [16451]

I am using the Wasapi provider and in __WasapiOut.cs__, the implementation of __Stop()__ has a subtle issue in it. With rapid calls to __Stop()__/__Play()__ back to back (actually, __Play()__ and then __Stop()__ and then __Play()__ again when the __PlaybackStopped__ event is raised), a null reference exception can happen in __Stop()__ where __playThread.Join()__ is called.

A small race exists where playThread is null when a call is made into this method, causing a null reference exception. If execution resumes, the thread continues to pump data and audio will continue to play.

Adding an __object__ member to the class and using it as a __lock__ around the entire bodies of __Stop()__ and __Play()__ resolves the issue. I would also suggest using an event to signal the worker/play thread to exit; that would let the thread fall out of the __WaitAny(...)__ call sooner, allowing for more timely cleanup (see the sketch after the code below).

Added to the top of the class:
```
protected object _locker = new object();
```

__Play()__ and __Stop()__ methods:
```
/// <summary>
/// Begin Playback
/// </summary>
public void Play()
{
    lock( _locker )
    {
        if( playbackState != PlaybackState.Playing )
        {
            if( playbackState == PlaybackState.Stopped )
            {
                playThread = new Thread( new ThreadStart( PlayThread ) );
                playbackState = PlaybackState.Playing;
                playThread.Start();
            }
            else
            {
                playbackState = PlaybackState.Playing;
            }
        }
    }
}

/// <summary>
/// Stop playback and flush buffers
/// </summary>
public void Stop()
{
    lock( _locker )
    {
        if( playbackState != PlaybackState.Stopped )
        {
            playbackState = PlaybackState.Stopped;
            playThread.Join();
            playThread = null;
        }
    }
}
```
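A sketch of the event-based shutdown suggested above, written as a generic pattern rather than the actual WasapiOut code (frameEvent, stopEvent and PlayLoop are illustrative names): the play loop waits on both its frame event and a dedicated stop event, so Stop() can wake it out of WaitAny immediately instead of waiting for the next buffer period.

```
private readonly AutoResetEvent frameEvent = new AutoResetEvent(false);
private readonly ManualResetEvent stopEvent = new ManualResetEvent(false);
private readonly object _locker = new object();
private Thread playThread;

private void PlayLoop()
{
    var handles = new WaitHandle[] { frameEvent, stopEvent };
    // WaitAny returns 1 as soon as Stop() sets stopEvent, so the loop exits promptly
    while (WaitHandle.WaitAny(handles) == 0)
    {
        // fill and submit the next audio buffer here
    }
}

public void Stop()
{
    lock (_locker) // serializes against Play(), closing the null playThread race
    {
        var thread = playThread;
        if (thread == null) return;
        stopEvent.Set();   // wakes PlayLoop out of WaitAny immediately
        thread.Join();
        playThread = null;
        stopEvent.Reset(); // ready for the next Play()
    }
}
```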

Created Unassigned: Issues with recording TAPI devices [16452]

Hi,

I can successfully record audio coming from a TAPI device. I'm using WaveInEvent because I'm doing this in a service, but I have now observed the same issue with the Recording Demo program using WaveIn. When I shut the application down, the process stays resident and can't be killed through Task Manager or even the taskkill command utility. Within Visual Studio, ending the program still leaves it running. The thread debug window shows nothing when doing "Debug -> Break All", so there appears to be nothing running. Something seems to be keeping the process alive, but I can't trace it. If I record from a standard audio card, everything is fine.

Is there any way to trace this type of issue? I guess something in the interop layer is playing up.
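One thing worth ruling out (an assumption on my part, not a confirmed cause of the hang): make sure the capture device is stopped and disposed before the process tries to exit, and do the disposal from the RecordingStopped callback rather than immediately after StopRecording(). A minimal shutdown sketch with WaveInEvent:

```
var waveIn = new WaveInEvent { DeviceNumber = 0 }; // device number is a placeholder
waveIn.DataAvailable += (s, a) =>
{
    // write a.Buffer, 0, a.BytesRecorded to the output file here
};
waveIn.RecordingStopped += (s, a) =>
{
    waveIn.Dispose(); // dispose only after recording has actually stopped
    if (a.Exception != null) Console.WriteLine(a.Exception.Message);
};
waveIn.StartRecording();
// ... later, on shutdown:
waveIn.StopRecording();
```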

Commented Unassigned: amr to wma using NAudio MFT [16421]

Hi Mark, while trying to convert an AMR file to WMA I encountered the following exception:
"Exception from HRESULT: 0xC00D36C4". Is there any way to convert the AMR file format to WMA or MP3?

```
public byte[] ConvertAMRToWMA()
{
    // Fails with "Exception from HRESULT: 0xC00D36C4"
    using (var data = new MediaFoundationReader("..\\amr\\sample.amr"))
    {
        MediaFoundationEncoder.EncodeToWma(data, "..\\sampleamr.wma", 128000);
    }
    // return the encoded file's bytes so the method compiles as declared
    return File.ReadAllBytes("..\\sampleamr.wma");
}
```
Comments: Unable to convert amr

Created Unassigned: MediaFoundationReader reads less data on Windows 7 (on Windows 8.1 - OK) [16453]

I am getting a strange result when using MediaFoundationReader to extract audio on Windows 7 64-bit.

I use MediaFoundationReader to extract the WAVE data to a temp file like this:

```
using (var reader = new MediaFoundationReader(sourceFilename))
{
    WaveFileWriter.CreateWaveFile(copyFilename, reader);
}
```

After that, I read the WAVE data back with WaveFileReader:

```
using (var sourceFileReader = new WaveFileReader(sourceFilename))
{
    Console.WriteLine("Length:\t\t{0}", sourceFileReader.Length);
}

using (var copyFileReader = new WaveFileReader(copyFilename))
{
    Console.WriteLine("Length of copy:\t{0}", copyFileReader.Length);
}
```

On Windows 8.1 I get the same data lengths, but on Windows 7 the copy is trimmed at the end.

Is there a bug in NAudio or in Media Foundation on Windows 7?

A test project is attached.

![Image](http://auxmic.com/sites/default/files/pictures/naudio_test_win7_x64.png)
![Image](http://auxmic.com/sites/default/files/pictures/naudio_test_win8.1_x64.png)

Edited Unassigned: MediaFoundationReader reads less data on Windows 7 (on Windows 8.1 - OK) [16453]

I am getting a strange result when using MediaFoundationReader to extract audio on Windows 7 64-bit.

I use MediaFoundationReader to extract the WAVE data to a temp file like this:

```
using (var reader = new MediaFoundationReader(sourceFilename))
{
    WaveFileWriter.CreateWaveFile(copyFilename, reader);
}
```

After that, I read the WAVE data back with WaveFileReader:

```
using (var sourceFileReader = new WaveFileReader(sourceFilename))
{
    Console.WriteLine("Length:\t\t{0}", sourceFileReader.Length);
}

using (var copyFileReader = new WaveFileReader(copyFilename))
{
    Console.WriteLine("Length of copy:\t{0}", copyFileReader.Length);
}
```

On Windows 8.1 I get the same data lengths, but on Windows 7 the copy is trimmed at the end.

Is there a bug in NAudio or in Media Foundation on Windows 7?

A test project is attached.

![Image](http://auxmic.com/sites/default/files/pictures/naudio_test_win7_x64.png)
![Image](http://auxmic.com/sites/default/files/pictures/naudio_test_win8.1_x64.png)
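One way to narrow this down (a diagnostic sketch, not part of the attached project) is to compare the length MediaFoundationReader reports with the number of bytes it actually delivers on each OS; that separates a reader problem from anything WaveFileWriter might be doing:

```
long reported, delivered = 0;
using (var reader = new MediaFoundationReader(sourceFilename))
{
    reported = reader.Length;
    var buffer = new byte[reader.WaveFormat.AverageBytesPerSecond];
    int read;
    while ((read = reader.Read(buffer, 0, buffer.Length)) > 0)
    {
        delivered += read;
    }
}
Console.WriteLine("Reported: {0} bytes, delivered: {1} bytes", reported, delivered);
```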

Created Unassigned: WaveFormatConversionStream disposes sourceStream [16454]

I think this is a mistake.
Disposing the sourceStream is not the responsibility of the WaveFormatConversionStream class.
I see no point in disposing a resource which was not created by the class itself.

For example: converting a stream to several different formats, or demultiplexing it, succeeds the first time and then throws a NullReferenceException because sourceStream was disposed in the meantime.

Created Unassigned: WaveOut.Stop() pauses playback instead of stopping [16455]

Normally you'd expect Stop() to stop playback and seek to the beginning, but it doesn't.
In the following code:
```
_myWaveOut = new DirectSoundOut(device);
_myAudioFileReader = new AudioFileReader(path);
_myWaveOut.Init(_myAudioFileReader);
_myWaveOut.Play();
// ...
_myWaveOut.Stop();
_myWaveOut.Play();
// ...
```
After Play() is called the second time, the file resumes from the position it was at when Stop() was called. Please change this so playback restarts from the beginning, as it should.
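As a workaround (my suggestion, not existing behaviour), you can rewind the reader yourself before restarting playback, since Stop() only stops the output device pulling data and does not reposition the source stream:

```
_myWaveOut.Stop();
_myAudioFileReader.Position = 0; // or: _myAudioFileReader.CurrentTime = TimeSpan.Zero;
_myWaveOut.Play();               // now starts from the beginning of the file
```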

Created Unassigned: WaveOut.PlaybackStopped is raised when calling WaveOut.Stop() [16456]

The description of WaveOut.PlaybackStopped:
> Indicates that playback has gone into a stopped state due to __reaching the end of the input stream__ or an error has been encountered during playback

However, if I call WaveOut.Stop() the event will still be fired, even though there is no error and playback hasn't finished yet. (I'm using DirectSoundOut.)

Edited Unassigned: WaveOut.PlaybackStopped is raised when calling WaveOut.Stop() [16456]

The description of WaveOut.PlaybackStopped:
> Indicates that playback has gone into a stopped state due to __reaching the end of the input stream__ or an error has been encountered during playback

However, if I call WaveOut.Stop() the event will still be fired, even though there is no error and playback hasn't finished yet. (I'm using DirectSoundOut.)

Commented Unassigned: WaveOut.PlaybackStopped is raised when calling WaveOut.Stop() [16456]

The description of WaveOut.PlaybackStopped:
> Indicates that playback has gone into a stopped state due to __reaching the end of the input stream__ or an error has been encountered during playback

However, if I call WaveOut.Stop() the event will still be fired, even though there is no error and playback hasn't finished yet. (I'm using DirectSoundOut.)
Comments: Yes, that description should be updated. PlaybackStopped will always be raised when playback is stopped for any reason.
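If you need to tell a manual Stop() apart from reaching the end of the stream, one simple approach (a sketch, not an NAudio feature; the member names are illustrative) is to set a flag before calling Stop() and inspect it, together with e.Exception, in the handler:

```
bool stoppedByUser;

void StopButton_Click(object sender, EventArgs e)
{
    stoppedByUser = true; // remember that this stop was requested
    _myWaveOut.Stop();
}

void OnPlaybackStopped(object sender, StoppedEventArgs e)
{
    if (e.Exception != null)
        Console.WriteLine("Stopped because of an error: " + e.Exception.Message);
    else if (stoppedByUser)
        Console.WriteLine("Stopped by user");
    else
        Console.WriteLine("Reached the end of the input stream");
    stoppedByUser = false;
}
```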

Edited Unassigned: WaveOut.PlaybackStopped is raised when calling WaveOut.Stop() [16456]

The description of WaveOut.PlaybackStopped:
> Indicates that playback has gone into a stopped state due to __reaching the end of the input stream__ or an error has been encountered during playback

However, if I call WaveOut.Stop() the event will still be fired; in fact it is fired whenever playback stops for any reason, even though there is no error and playback hasn't finished yet. (I'm using DirectSoundOut.)

Created Unassigned: WaveFileReader reads data to ExtraChunks only [16457]

Hi,

A wave file extracted from a video with ffmpeg does not seem to be read correctly by WaveFileReader.

The wave file can be played in any standard player (MS, VLC, ...), but when it is opened with WaveFileReader the sample count is 0 and all the data goes into ExtraChunks.

I am not very familiar with the wave file format, so maybe I am doing something wrong and it isn't a bug in WaveFileReader.

It would be nice if someone could have a look and confirm this.

Command line for ffmpeg:
```
ffmpeg -i myvideo.mp4 -vn -acodec pcm_f32le -ar 44100 -f wav -ac 1 myaudiofromvideo.wav
```

Call in my source code to open the file:
```
WaveFileReader rd = new WaveFileReader("myaudiofromvideo.wav");
```

If anything else is needed from my side, please ask.

Regards,
T.
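A small diagnostic (not part of the report) that prints the parsed format, the sample count and every chunk that ended up in ExtraChunks can help confirm whether the data chunk is being misidentified:

```
using (var rd = new WaveFileReader("myaudiofromvideo.wav"))
{
    Console.WriteLine("Format: {0}", rd.WaveFormat);
    Console.WriteLine("Sample count: {0}", rd.SampleCount);
    foreach (var chunk in rd.ExtraChunks)
    {
        // identifier and size of each chunk the reader did not treat as audio data
        Console.WriteLine("Extra chunk '{0}', {1} bytes", chunk.IdentifierAsString, chunk.Length);
    }
}
```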

Edited Unassigned: WaveFileReader reads data to ExtraChunks only [16457]

Hi,

A wave file extracted from a video with ffmpeg does not seem to be read correctly by WaveFileReader.

The wave file can be played in any standard player (MS, VLC, ...), but when it is opened with WaveFileReader the sample count is 0 and all the data goes into ExtraChunks.

I am not very familiar with the wave file format, so maybe I am doing something wrong and it isn't a bug in WaveFileReader.

It would be nice if someone could have a look and confirm this.

Command line for ffmpeg:
```
ffmpeg -i myvideo.mp4 -vn -acodec pcm_f32le -ar 44100 -f wav -ac 1 myaudiofromvideo.wav
```

Call in my source code to open the file:
```
WaveFileReader rd = new WaveFileReader("myaudiofromvideo.wav");
```

If anything else is needed from my side, please ask.

Regards,
T.

Edit:
Oh by the way:
OS: Windows 7 x64 (compiling for x86)
IDE: Visual Studio 2013 Desktop Express
NAudio Runtime Version: v2.0.50727
NAudio Version: 1.7.1.17

Commented Unassigned: WaveFileReader reads data to ExtraChunks only [16457]

Hi,

A wave file extracted from a video with ffmpeg does not seem to be read correctly by WaveFileReader.

The wave file can be played in any standard player (MS, VLC, ...), but when it is opened with WaveFileReader the sample count is 0 and all the data goes into ExtraChunks.

I am not very familiar with the wave file format, so maybe I am doing something wrong and it isn't a bug in WaveFileReader.

It would be nice if someone could have a look and confirm this.

Command line for ffmpeg:
```
ffmpeg -i myvideo.mp4 -vn -acodec pcm_f32le -ar 44100 -f wav -ac 1 myaudiofromvideo.wav
```

Call in my source code to open the file:
```
WaveFileReader rd = new WaveFileReader("myaudiofromvideo.wav");
```

If anything else is needed from my side, please ask.

Regards,
T.

Edit:
Oh by the way:
OS: Windows 7 x64 (compiling for x86)
IDE: Visual Studio 2013 Desktop Express
NAudio Runtime Version: v2.0.50727
NAudio Version: 1.7.1.17
Comments: Okay, I just checked and downloaded the latest source code ... The issue seems to have been fixed sometime after the stable 1.7 release ... I will stick with the source code instead of NuGet then. Bye.

Edited Unassigned: [SOLVED] WaveFileReader reads data to ExtraChunks only [16457]

Hi,

A wave file extracted from a video with ffmpeg does not seem to be read correctly by WaveFileReader.

The wave file can be played in any standard player (MS, VLC, ...), but when it is opened with WaveFileReader the sample count is 0 and all the data goes into ExtraChunks.

I am not very familiar with the wave file format, so maybe I am doing something wrong and it isn't a bug in WaveFileReader.

It would be nice if someone could have a look and confirm this.

Command line for ffmpeg:
```
ffmpeg -i myvideo.mp4 -vn -acodec pcm_f32le -ar 44100 -f wav -ac 1 myaudiofromvideo.wav
```

Call in my source code to open the file:
```
WaveFileReader rd = new WaveFileReader("myaudiofromvideo.wav");
```

If anything else is needed from my side, please ask.

Regards,
T.

Edit:
Oh by the way:
OS: Windows 7 x64 (compiling for x86)
IDE: Visual Studio 2013 Desktop Express
NAudio Runtime Version: v2.0.50727
NAudio Version: 1.7.1.17

Commented Unassigned: WaveFormatConversionStream disposes sourceStream [16454]

I think this is a mistake.
Disposing the sourceStream is not the responsibility of the WaveFormatConversionStream class.
I see no point in disposing a resource which was not created by the class itself.

For example: converting a stream to several different formats, or demultiplexing it, succeeds the first time and then throws a NullReferenceException because sourceStream was disposed in the meantime.
Comments: You are right. I designed this class many years ago and would probably make a different decision today. It's convenient in some circumstances and a pain in others. The workaround I use when I need one is to decorate the input with what I call an "IgnoreDisposeStream", which simply ignores calls to Dispose but passes all other stream methods through to the source. I will consider this change for a future NAudio, though.
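A minimal sketch of that kind of decorator for a WaveStream source (the class below is illustrative, not NAudio's own IgnoreDisposeStream): it forwards everything to the wrapped stream except disposal, so `new WaveFormatConversionStream(format, new IgnoreDisposeWaveStream(source))` can no longer tear down a source it does not own.

```
class IgnoreDisposeWaveStream : WaveStream
{
    private readonly WaveStream source;

    public IgnoreDisposeWaveStream(WaveStream source)
    {
        this.source = source;
    }

    public override WaveFormat WaveFormat { get { return source.WaveFormat; } }
    public override long Length { get { return source.Length; } }
    public override long Position
    {
        get { return source.Position; }
        set { source.Position = value; }
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        return source.Read(buffer, offset, count);
    }

    protected override void Dispose(bool disposing)
    {
        // Deliberately leave the wrapped source alone; the caller owns it.
        base.Dispose(disposing);
    }
}
```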

Created Unassigned: Unsupported WaveFormat WasapiLoopbackCapture [16458]

Hi, I have an error when the project starts to record. This is the code:

```
WasapiLoopbackCapture waveInStream;
WaveFileWriter writer;
bool isRecording = false;

private void cmdRecord_Click(object sender, System.EventArgs e)
{
    if (isRecording == false)
    {
        this.Enabled = false;
        saveFileDialog1.Filter = "Wav Files (*.wav)|*.wav";
        DialogResult result = saveFileDialog1.ShowDialog();
        if (result == DialogResult.OK)
        {
            string RutaAGrabar = saveFileDialog1.FileName;
            waveInStream = new WasapiLoopbackCapture();
            writer = new WaveFileWriter(RutaAGrabar, waveInStream.WaveFormat);
            waveInStream.DataAvailable += new System.EventHandler<WaveInEventArgs>(waveInStream_DataAvailable);
            waveInStream.StartRecording(); // the error is thrown here
            isRecording = true;
        }
        this.Enabled = true;
    }
    else
    {
        waveInStream.StopRecording();
        waveInStream.Dispose();
        waveInStream = null;
        writer.Close();
        writer = null;
        isRecording = false;
    }
}

void waveInStream_DataAvailable(object sender, WaveInEventArgs e)
{
    writer.Write(e.Buffer, 0, e.BytesRecorded);
}
```

Thanks for any help.

Commented Unassigned: Unsupported WaveFormat WasapiLoopbackCapture [16458]

Hi, I have an error when the project starts to record. This is the code:

```
WasapiLoopbackCapture waveInStream;
WaveFileWriter writer;
bool isRecording = false;

private void cmdRecord_Click(object sender, System.EventArgs e)
{
    if (isRecording == false)
    {
        this.Enabled = false;
        saveFileDialog1.Filter = "Wav Files (*.wav)|*.wav";
        DialogResult result = saveFileDialog1.ShowDialog();
        if (result == DialogResult.OK)
        {
            string RutaAGrabar = saveFileDialog1.FileName;
            waveInStream = new WasapiLoopbackCapture();
            writer = new WaveFileWriter(RutaAGrabar, waveInStream.WaveFormat);
            waveInStream.DataAvailable += new System.EventHandler<WaveInEventArgs>(waveInStream_DataAvailable);
            waveInStream.StartRecording(); // the error is thrown here
            isRecording = true;
        }
        this.Enabled = true;
    }
    else
    {
        waveInStream.StopRecording();
        waveInStream.Dispose();
        waveInStream = null;
        writer.Close();
        writer = null;
        isRecording = false;
    }
}

void waveInStream_DataAvailable(object sender, WaveInEventArgs e)
{
    writer.Write(e.Buffer, 0, e.BytesRecorded);
}
```

Thanks for any help.
Comments: System.ArgumentException, HResult=-2147024809, Message=Unsupported Wave Format, Source=NAudio. Stack trace: at NAudio.CoreAudioApi.WasapiCapture.InitializeCaptureDevice(), at NAudio.CoreAudioApi.WasapiCapture.StartRecording().
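Since the ArgumentException comes out of InitializeCaptureDevice, a first diagnostic step (a sketch, not a confirmed fix) is to print the loopback mix format before calling StartRecording(), so you can see exactly which encoding, sample rate and bit depth the capture is being asked to handle:

```
var capture = new WasapiLoopbackCapture();
// The loopback format is the device's shared-mode mix format
Console.WriteLine("{0}, {1} Hz, {2} channels, {3} bits",
    capture.WaveFormat.Encoding,
    capture.WaveFormat.SampleRate,
    capture.WaveFormat.Channels,
    capture.WaveFormat.BitsPerSample);
```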