I'm attempting to apply a bandpass filter with time-varying cutoff frequencies to a signal, using Python. The routine I am currently using partitions my signal into equal-length segments, filters each segment with its own cutoff frequencies, and stitches the filtered segments back together, which introduces artifacts at the segment boundaries.
Try filtering the whole signal with each of the filters you're using, then splice the filtered copies together, taking from each copy only the time range it's meant to cover. Roughly, in pseudocode:
# each filter acts on the entire signal, one per set of cutoff frequencies
s1 = filter1(signal)
s2 = filter2(signal)
s3 = filter3(signal)
# then keep only the time range each filter is responsible for
filtered_signal = np.concatenate([s1[0:t1], s2[t1:t2], s3[t2:t3]])
I think this will avoid the artifacts you're describing that result from chopping up the signal then filtering.
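If it helps, here's a minimal runnable sketch of that idea using scipy.signal; the sampling rate, cutoff pairs, and boundary samples t1/t2 are made-up placeholders, and filter1/filter2/filter3 become Butterworth bandpass filters here.

import numpy as np
from scipy import signal

fs = 1000.0                                    # sampling rate in Hz (assumed)
x = np.random.randn(3000)                      # stand-in for your signal

cutoffs = [(50, 150), (100, 250), (200, 400)]  # (low, high) in Hz, one pair per segment
t1, t2 = 1000, 2000                            # segment boundaries in samples

filtered = []
for low, high in cutoffs:
    # design a bandpass filter and run it over the *entire* signal
    sos = signal.butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
    filtered.append(signal.sosfiltfilt(sos, x))

s1, s2, s3 = filtered
# stitch together the time range each filtered copy is responsible for
filtered_signal = np.concatenate([s1[:t1], s2[t1:t2], s3[t2:]])

Because every filter sees the full signal, there are no edge transients at t1 and t2 beyond the unavoidable jump between the two filters' outputs; you can crossfade over a few samples around each boundary if that jump is audible or visible.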
Another possibility is a Short-Time Fourier Transform (STFT). Here's an implementation using numpy.
Basically, you take the STFT of your signal, zero or attenuate the unwanted frequency bins in each time slice of the time-frequency array, then inverse-STFT the array to get the filtered signal back.
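As a rough sketch of that route, using scipy.signal rather than a hand-rolled numpy version (the bands and the 1 s switchover time are just illustrative):

import numpy as np
from scipy import signal

fs = 1000.0
x = np.random.randn(3000)                      # stand-in for your signal

f, t, Zxx = signal.stft(x, fs=fs, nperseg=256)

# build a time-varying bandpass mask over the time-frequency array:
# pass 50-150 Hz before 1 s and 200-400 Hz after it, reject everything else
mask = np.zeros(Zxx.shape)
mask[np.ix_((f >= 50) & (f <= 150), t < 1.0)] = 1.0
mask[np.ix_((f >= 200) & (f <= 400), t >= 1.0)] = 1.0

_, filtered_signal = signal.istft(Zxx * mask, fs=fs, nperseg=256)

Since neighbouring STFT frames overlap, the hard switch in the mask at 1 s gets smoothed over the overlap during reconstruction, which tends to soften the transition at the boundary.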
You might get even better results using an invertible wavelet transform. Here's a paywalled paper describing how to do pretty much what you're trying to do with a wavelet transform.
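For instance, with PyWavelets you could do something along these lines; this is not the paper's method, just an illustration of masking wavelet coefficients over particular time ranges. The stationary wavelet transform keeps every coefficient band at full signal length, so a time-varying "filter" is simply zeroing coefficients per band and time range; the wavelet, level, and masking rule below are illustrative.

import numpy as np
import pywt

x = np.random.randn(4096)                      # length must divide evenly by 2**level

coeffs = pywt.swt(x, "db4", level=4)           # [(cA4, cD4), ..., (cA1, cD1)]

masked = []
for level_idx, (cA, cD) in enumerate(coeffs):
    cD = cD.copy()
    if level_idx == 0:
        cD[:2048] = 0.0                        # e.g. suppress the coarsest detail band in the first half
    masked.append((cA, cD))

filtered_signal = pywt.iswt(masked, "db4")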