At what point are WebSockets less efficient than Polling?
While I understand that the answer to this question depends partly on your application's architecture, I'm mostly interested in very simple scenarios. If my app polls for changes every 5 seconds, or every minute, at roughly what point does the data sent to keep a WebSocket connection open exceed what you would waste by simply polling? Essentially, I'm interested in whether there's a way to quantify how much inefficiency you incur by using frameworks like Meteor when an application doesn't strictly need real-time updates, but only periodic ones.
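
For concreteness, this is the kind of back-of-envelope comparison I have in mind. All of the sizes and intervals below are assumptions I've made up for illustration, not measurements:

```typescript
// Rough comparison of per-hour overhead (all sizes are assumed, not measured).
// Polling: every poll pays full HTTP request + response headers.
// WebSocket: a one-time upgrade handshake, then only small heartbeat frames.

const POLL_INTERVAL_S = 5;        // poll every 5 seconds
const HTTP_OVERHEAD_BYTES = 800;  // assumed headers per request/response pair
const WS_HANDSHAKE_BYTES = 500;   // assumed one-time upgrade cost, amortized over the first hour
const WS_PING_INTERVAL_S = 30;    // assumed heartbeat interval
const WS_PING_BYTES = 6;          // assumed ping + pong frame size (ignoring TCP-level overhead)

function pollingBytesPerHour(intervalS: number): number {
  return (3600 / intervalS) * HTTP_OVERHEAD_BYTES;
}

function webSocketBytesPerHour(): number {
  return WS_HANDSHAKE_BYTES + (3600 / WS_PING_INTERVAL_S) * WS_PING_BYTES;
}

console.log(`Polling every ${POLL_INTERVAL_S}s: ~${pollingBytesPerHour(POLL_INTERVAL_S)} bytes/hour`);
console.log(`WebSocket keepalive:  ~${webSocketBytesPerHour()} bytes/hour`);
```

Is this roughly the right way to frame the comparison, and if so, what are realistic numbers to plug in for the header and frame sizes?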