Applying time-variant filter in Python

死守一世寂寞 2021-01-03 04:16

I'm attempting to apply a bandpass filter with time-varying cutoff frequencies to a signal, using Python. The routine I am currently using partitions my signal into equal-length segments, filters each segment with that segment's cutoff frequencies, and then merges the filtered segments back together, which introduces artifacts at the segment boundaries.
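
A minimal sketch of that kind of segment-wise routine (the segment length, filter design, and cutoff values below are all placeholders):

    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 1000.0
    signal = np.random.randn(3000)                 # stand-in for the real signal
    seg_len = 500                                  # equal-length segments
    cutoffs = [(50, 100), (100, 200), (150, 300),
               (200, 400), (100, 200), (50, 100)]  # per-segment (low, high) in Hz

    pieces = []
    for i, (low, high) in enumerate(cutoffs):
        seg = signal[i * seg_len:(i + 1) * seg_len]
        b, a = butter(4, [low, high], btype="band", fs=fs)
        pieces.append(filtfilt(b, a, seg))         # filter each segment on its own

    merged = np.concatenate(pieces)                # discontinuities at the joins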

3 Answers

  •  庸人自扰 2021-01-03 05:11

    Try filtering the whole signal with each filter you're using, then merge the filtered signals appropriately. Roughly, in pseudocode:

        import numpy as np

        # filter the entire signal once with each set of cutoff frequencies
        s1 = filter1(signal)
        s2 = filter2(signal)
        s3 = filter3(signal)

        # keep each filtered copy only on the span where its cutoffs apply
        filtered_signal = np.concatenate([s1[0:t1], s2[t1:t2], s3[t2:t3]])
    

    I think this will avoid the artifacts you're describing that result from chopping up the signal then filtering.
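
    As a concrete version of the above, here is a sketch using scipy.signal, where the Butterworth design, the three cutoff pairs, and the boundary indices t1 and t2 are all placeholder choices:

        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 1000.0                              # sampling rate in Hz (assumed)
        signal = np.random.randn(3000)           # stand-in for the real signal

        def bandpass(x, low, high, fs, order=4):
            # zero-phase Butterworth bandpass applied to the whole signal
            b, a = butter(order, [low, high], btype="band", fs=fs)
            return filtfilt(b, a, x)

        # filter the entire signal once per cutoff pair
        s1 = bandpass(signal, 50, 100, fs)
        s2 = bandpass(signal, 100, 200, fs)
        s3 = bandpass(signal, 200, 400, fs)

        # placeholder boundaries (in samples) between the three regimes
        t1, t2 = signal.size // 3, 2 * signal.size // 3

        # keep each filtered copy only where its cutoffs apply
        filtered_signal = np.concatenate([s1[:t1], s2[t1:t2], s3[t2:]])

    Because each filter sees the whole signal, there are no startup transients at t1 and t2; the only change at the joins is the band itself.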

    Another possibility is using a Short-Time Fourier Transform (STFT); NumPy-based implementations of it are available.

    Basically, you could STFT your signal, filter it by operating on the time-frequency array, and then inverse-STFT the result to get a filtered signal.
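
    A rough sketch of that idea using scipy.signal's stft and istft (the linearly ramping band edges and all parameter values are invented for illustration):

        import numpy as np
        from scipy.signal import stft, istft

        fs = 1000.0
        x = np.random.randn(3000)                # stand-in signal

        # short-time Fourier transform: f in Hz, t in seconds, Zxx complex
        f, t, Zxx = stft(x, fs=fs, nperseg=256)

        # time-varying band edges, one (low, high) pair per STFT frame
        low = np.interp(t, [t[0], t[-1]], [50.0, 200.0])
        high = np.interp(t, [t[0], t[-1]], [100.0, 400.0])

        # keep only the frequency bins inside the band at each frame
        mask = (f[:, None] >= low[None, :]) & (f[:, None] <= high[None, :])

        # zero everything outside the band, then invert
        _, x_filt = istft(Zxx * mask, fs=fs, nperseg=256)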

    You might get even better results using an invertible wavelet transform. There's a (paywalled) paper describing how to do pretty much what you're trying to do with a wavelet transform.
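
    If you want to experiment with that route, PyWavelets (pywt) provides an invertible discrete wavelet transform; in the toy sketch below, the wavelet choice and the decision to zero the finest detail coefficients over half the signal are purely illustrative:

        import numpy as np
        import pywt

        x = np.random.randn(3000)                # stand-in signal

        # multilevel DWT; each detail level spans roughly one octave of frequency
        coeffs = pywt.wavedec(x, "db4", level=5)

        # zero the finest-scale details over the first half of the signal only,
        # i.e. a crude time-varying low-pass
        d1 = coeffs[-1]
        d1[: d1.size // 2] = 0.0

        x_filt = pywt.waverec(coeffs, "db4")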
