I am developing an application to analyse some very large waveforms, 1-2 GB per waveform. While testing what I have written so far, I encountered an Out Of Memory exception when I invoked the Savitzky-Golay smoothing filter on a waveform with ~7,000,000 data points. The application is built for a 64-bit target environment, and I have set the heap's initial size to 50 MB. The machine I am testing on has 4 GB of RAM, and Resource Monitor reports plenty of available memory at the moment the exception is thrown.
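For context, here is a quick back-of-envelope check (assuming the channel data is held as 8-byte doubles, which is an assumption on my part) showing that even a single copy of the waveform is larger than the 50 MB initial heap, so the filter's working buffers would force the heap to grow several times over:

```python
# Rough memory footprint of one copy of the waveform, assuming 8-byte doubles.
points = 7_000_000
bytes_per_sample = 8          # assumed: double-precision samples
mb = points * bytes_per_sample / 2**20
print(f"~{mb:.0f} MB per array copy")   # a filter typically needs 2-3 such copies
```

So a filter that allocates an input copy plus an output buffer is already well past the initial heap size, even before any intermediate workspace.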
Initially I thought the issue was that there wasn't a large enough contiguous block of memory to satisfy the allocation of the return buffer. But then I tested filtering the same waveform in DIAdem 2012: I was able to load and display the waveform and to invoke the Savitzky-Golay filter, which smoothed it successfully without any Out Of Memory exception.
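For what it's worth, one way a tool could smooth huge channels without one giant contiguous temporary is to filter in overlapping blocks, since a Savitzky-Golay smooth only needs half a window of context on each side of a sample. This is purely a guess at the approach, not NI's actual implementation; the NumPy sketch below (window and polynomial order chosen arbitrarily) illustrates the idea:

```python
import numpy as np

def savgol_coeffs(window, polyorder):
    """Least-squares polynomial-fit weights for the centre of an odd window."""
    half = window // 2
    x = np.arange(-half, half + 1)
    A = np.vander(x, polyorder + 1, increasing=True)
    # Value of the fitted polynomial at x = 0 is its constant coefficient,
    # so the smoothing weights are the first row of the pseudo-inverse.
    return np.linalg.pinv(A)[0]

def savgol_chunked(data, window, polyorder, chunk=100_000):
    """Smooth `data` block by block, carrying half a window of overlap."""
    half = window // 2
    c = savgol_coeffs(window, polyorder)
    out = np.empty(len(data), dtype=float)
    n = len(data)
    for start in range(0, n, chunk):
        stop = min(start + chunk, n)
        lo = max(start - half, 0)          # overlap into the previous block
        hi = min(stop + half, n)           # overlap into the next block
        sm = np.convolve(data[lo:hi], c[::-1], mode="same")
        out[start:stop] = sm[start - lo : start - lo + (stop - start)]
    return out
```

With the half-window overlap, each block's interior result is identical to filtering the whole array at once, but the temporary buffers never exceed the chunk size.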
I have stripped everything out of my application except the code that loads the TDMS file, reads the data from the specified channel, and invokes the smoothing filter on it. This stripped-down app still throws the Out Of Memory exception on the large waveform. Interestingly, once I have caught the exception, the Savitzky-Golay filter throws the Out Of Memory exception on every subsequent invocation, even with small datasets.
Since DIAdem 2012 is able to smooth large waveforms, I know it is possible. I am hoping someone at NI can give me a clue as to how DIAdem does it.
Thanks in advance.