Does reactive extensions support rolling buffers?

2020-11-30 03:50

I'm using Reactive Extensions to collate data into buffers of 100ms:

this.subscription = this.dataService
    .Where(x => !string.Equals("FOO", x.Key.Source))
    .Buffer(TimeSpan.FromMilliseconds(100))
    .ObserveOn(this.dispatcherService)
    .Where(x => x.Count != 0)
    .Subscribe(this.OnBufferReceived);

4 Answers
  • 2020-11-30 04:30

    With Rx Extensions 2.0, you can answer both requirements with a new Buffer overload accepting a timeout and a size:

    this.subscription = this.dataService
        .Where(x => !string.Equals("FOO", x.Key.Source))
        .Buffer(TimeSpan.FromMilliseconds(100), 1)
        .ObserveOn(this.dispatcherService)
        .Where(x => x.Count != 0)
        .Subscribe(this.OnBufferReceived);
    

    See https://msdn.microsoft.com/en-us/library/hh229200(v=vs.103).aspx for the documentation.
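
    As a rough illustration of how that overload behaves (a minimal, hypothetical sketch; Observable.Interval and the 30 ms / 5-item figures are made up for the example), a buffer is emitted as soon as either the time span elapses or the count is reached, whichever comes first:

    // Values arrive every 30 ms; each buffer closes after 100 ms *or* 5 items,
    // whichever happens first.
    Observable.Interval(TimeSpan.FromMilliseconds(30))
        .Buffer(TimeSpan.FromMilliseconds(100), 5)
        .Take(3)
        .Subscribe(buffer => Console.WriteLine(string.Join(", ", buffer)));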

  • 2020-11-30 04:45

    This is possible by combining the built-in Window and Throttle methods of Observable. First, let's solve the simpler problem where we ignore the maximum count condition:

    public static IObservable<IList<T>> BufferUntilInactive<T>(this IObservable<T> stream, TimeSpan delay)
    {
        var closes = stream.Throttle(delay);
        return stream.Window(() => closes).SelectMany(window => window.ToList());
    }
    

    The powerful Window method did the heavy lifting. Now it's easy enough to see how to add a maximum count:

    public static IObservable<IList<T>> BufferUntilInactive<T>(this IObservable<T> stream, TimeSpan delay, Int32? max=null)
    {
        var closes = stream.Throttle(delay);
        if (max != null)
        {
            var overflows = stream.Where((x,index) => index+1>=max);
            closes = closes.Merge(overflows);
        }
        return stream.Window(() => closes).SelectMany(window => window.ToList());
    }
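
    For instance, a hypothetical call site (the Subject and the 500 ms / 10-item figures are made-up values for illustration) might look like this:

    var source = new Subject<int>();

    // Items are batched until 500 ms pass with no new element,
    // or until 10 items accumulate, whichever comes first.
    source
        .BufferUntilInactive(TimeSpan.FromMilliseconds(500), max: 10)
        .Subscribe(batch => Console.WriteLine("Flushed " + batch.Count + " item(s)"));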
    

    I'll write a post explaining this on my blog. In the meantime, the code is available as a gist: https://gist.github.com/2244036

    Documentation for the Window method:

    • http://leecampbell.blogspot.co.uk/2011/03/rx-part-9join-window-buffer-and-group.html
    • http://enumeratethis.com/2011/07/26/financial-charts-reactive-extensions/
  • 2020-11-30 04:46

    I wrote an extension to do most of what you're after - BufferWithInactivity.

    Here it is:

    public static IObservable<IEnumerable<T>> BufferWithInactivity<T>(
        this IObservable<T> source,
        TimeSpan inactivity,
        int maximumBufferSize)
    {
        return Observable.Create<IEnumerable<T>>(o =>
        {
            var gate = new object();
            var buffer = new List<T>();
            var mutable = new SerialDisposable();
            var subscription = (IDisposable)null;
            var scheduler = Scheduler.ThreadPool;
    
            // Emit the buffered items to the observer and start a fresh buffer.
            Action dump = () =>
            {
                var bts = buffer.ToArray();
                buffer = new List<T>();
                if (o != null)
                {
                    o.OnNext(bts);
                }
            };
    
            // Tear down the upstream subscription and any pending inactivity timer.
            Action dispose = () =>
            {
                if (subscription != null)
                {
                    subscription.Dispose();
                }
                mutable.Dispose();
            };
    
            // On error or completion, flush whatever is buffered and then forward the notification.
            Action<Action<IObserver<IEnumerable<T>>>> onErrorOrCompleted =
                onAction =>
                {
                    lock (gate)
                    {
                        dispose();
                        dump();
                        if (o != null)
                        {
                            onAction(o);
                        }
                    }
                };
    
            Action<Exception> onError = ex =>
                onErrorOrCompleted(x => x.OnError(ex));
    
            Action onCompleted = () => onErrorOrCompleted(x => x.OnCompleted());
    
            // Buffer each element; flush when the size cap is hit,
            // otherwise (re)start the inactivity timer.
            Action<T> onNext = t =>
            {
                lock (gate)
                {
                    buffer.Add(t);
                    if (buffer.Count == maximumBufferSize)
                    {
                        dump();
                        mutable.Disposable = Disposable.Empty;
                    }
                    else
                    {
                        mutable.Disposable = scheduler.Schedule(inactivity, () =>
                        {
                            lock (gate)
                            {
                                dump();
                            }
                        });
                    }
                }
            };
    
            subscription =
                source
                    .ObserveOn(scheduler)
                    .Subscribe(onNext, onError, onCompleted);
    
            return () =>
            {
                lock (gate)
                {
                    o = null;
                    dispose();
                }
            };
        });
    }
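
    A hypothetical call site, mirroring the pipeline from the question (the 100 ms inactivity window and 50-item cap are made-up values), might look like this:

    this.subscription = this.dataService
        .Where(x => !string.Equals("FOO", x.Key.Source))
        .BufferWithInactivity(TimeSpan.FromMilliseconds(100), 50)
        .ObserveOn(this.dispatcherService)
        .Where(batch => batch.Any())                      // the extension yields IEnumerable<T>
        .Subscribe(batch => this.OnBufferReceived(batch.ToList()));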
    
  • 2020-11-30 04:46

    I guess this can be implemented on top of the Buffer method, as shown below:

    public static IObservable<IList<T>> SlidingBuffer<T>(this IObservable<T> obs, TimeSpan span, int max)
    {
        return Observable.Create<IList<T>>(cl =>
        {
            var acc = new List<T>();
            return obs.Buffer(span)
                .Subscribe(next =>
                {
                    if (next.Count == 0) // no activity in the time span
                    {
                        cl.OnNext(acc);
                        acc = new List<T>(); // hand the buffer off rather than clearing it in place
                    }
                    else
                    {
                        acc.AddRange(next);
                        if (acc.Count >= max) // max items collected
                        {
                            cl.OnNext(acc);
                            acc = new List<T>();
                        }
                    }
                },
                err => cl.OnError(err),
                () => { cl.OnNext(acc); cl.OnCompleted(); });
        });
    }
    

    NOTE: I haven't tested it, but I hope it gives you the idea.
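
    For example (a hypothetical sketch; the Subject and the 200 ms / 20-item figures are made up), note that quiet periods cause the accumulator to be emitted even when it is empty, so empty batches may still need to be filtered out downstream:

    var source = new Subject<int>();

    source.SlidingBuffer(TimeSpan.FromMilliseconds(200), 20)
        .Where(batch => batch.Count > 0) // drop the empty flushes emitted while idle
        .Subscribe(batch => Console.WriteLine("Batch of " + batch.Count));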
