Read only when the XML data is updated

萝らか妹 submitted on 2019-12-06 13:41:23

You could use hashes for this, in two ways:

  1. To ease updating - When requesting an update, you hash the whole feed and compare the result with the hash from the last time - if they are identical, you know that the feed did not change and can stop before even parsing it.
  2. To identify changes - On parsing, you hash each item and compare it to the hashes stored from previous runs. If it matches one, you know that you've seen it before.

If the feed in question offers guids for its items you could refine this process by storing guid<>hash pairs. This would make the comparison quicker, as you would only compare items to known previous versions instead of comparing to all previous items.

You'd still need some expiration/purge mechanism to keep the amount of stored hashes within bounds, but given that you only store relatively short strings (depending on the chosen hash algorithm), you should be able to keep quite a backlog before getting performance problems.
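The guid<>hash pairing described above can be sketched as follows; the item structure and the `$storedHashes` array (which would come from your database in practice) are assumptions for illustration:

```php
<?php
// Sketch of guid => hash comparison: look each item up by guid and
// only compare against its own previous hash, not all previous items.
$storedHashes = [
    'item-1' => sha1('Old title' . 'http://example.com/1' . 'Old text'),
];

$items = [
    ['guid' => 'item-1', 'title' => 'New title',  'link' => 'http://example.com/1', 'desc' => 'New text'],
    ['guid' => 'item-2', 'title' => 'Fresh item', 'link' => 'http://example.com/2', 'desc' => 'Brand new'],
];

foreach ($items as $item) {
    $hash = sha1($item['title'] . $item['link'] . $item['desc']);

    if (!isset($storedHashes[$item['guid']])) {
        // never seen this guid before: store the item and its hash
        $storedHashes[$item['guid']] = $hash;
    } elseif ($storedHashes[$item['guid']] !== $hash) {
        // known guid, changed content: update the stored copy and hash
        $storedHashes[$item['guid']] = $hash;
    }
    // otherwise the item is unchanged and can be skipped
}
```

The expiration mechanism mentioned above would then prune entries from `$storedHashes` whose guids no longer appear in the feed.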

HTTP Conditional GET is probably as close as you're going to get to what you want.
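With a conditional GET you send back the `ETag` and `Last-Modified` values from the previous response as `If-None-Match` / `If-Modified-Since`; a `304 Not Modified` reply means the feed has not changed and there is nothing to parse. A minimal cURL sketch, where the feed URL and the stored validator values are placeholders:

```php
<?php
// Conditional GET sketch: replay the validators saved from the last fetch.
$feedUrl      = 'http://example.com/feed.xml';          // placeholder URL
$lastEtag     = '"abc123"';                             // ETag from the previous response
$lastModified = 'Mon, 02 Dec 2019 10:00:00 GMT';        // Last-Modified from the previous response

$ch = curl_init($feedUrl);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, [
    'If-None-Match: ' . $lastEtag,
    'If-Modified-Since: ' . $lastModified,
]);

$body   = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

if ($status === 304) {
    // feed unchanged on the server; skip parsing entirely
} elseif ($status === 200) {
    // new content: parse $body, then store the new ETag / Last-Modified
}
```

Note that this only works when the server actually emits `ETag` or `Last-Modified` headers; not every feed host does.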

Because of the diversity of RSS feeds there is no easy solution to the problem you raised. The main issue is how to determine the uniqueness of an RSS item. It could be the guid, the publish time, or the content itself, but it may be tricky to detect that automatically.

Once you know the uniqueness criteria you can persist all 'old' items and compare them to the newest ones you receive.

HTTP Cache-Control and Expires headers could be used as an optimization for the sites that support them, but unfortunately some don't.

@Henrik's solution is correct; however, it might be easiest to supply you with an example of hashing the data:

// hash the three channel variables
$hash = sha1($channel_title . $channel_link . $channel_desc);

// here you should check the currently stored database hashed 
// value against current hash value to see if any channel variables
// have recently changed
if ($database_hash != $hash) {
    // you need to update the channel data in your database
    // including the new hash value
}

foreach ($items as $item) {

    // hash the item values
    $hash = sha1($item['title'] . $item['link'] . $item['description']);

    // here you should check the currently stored database hashed
    // value for this item against the current hash value to see
    // if any item variables have recently changed
    if ($database_hash != $hash) {
        // you need to update the item data in your database
        // including the new hash value
    }

}

Also, if you want to do a quick check to determine if any data in the XML file has changed whatsoever, you could hash the XML as a string. You should store this value and check against it every time you run your cronjob to see if the value has changed (indicating some data within the XML file has changed).

$overall_hash = sha1($xmlDoc->saveXML());

Your clients will always be asking for your feed data, so you cannot necessarily control when they ask. I don't think most feed readers obey HTTP Cache-Control / Expires headers, so you cannot rely on the HTTP spec and leverage HTTP caching.

I think your best bet is to just cache your last response and serve all subsequent requests from the cache, updating the cache whenever changes are made. Effectively this means the cost of responding to each client is close to zero, since you just pull the response from memcache or the filesystem.
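A filesystem version of that cache could look like the sketch below; the cache path and the feed-building function are placeholders, and a memcache-backed version would follow the same shape:

```php
<?php
// Filesystem cache sketch: answer every request from the cached copy
// and only rebuild it when the cache has been invalidated.

function buildFeedXml(): string
{
    // stand-in for the real, expensive feed generation (DB queries, etc.)
    return '<rss><channel><title>Example</title></channel></rss>';
}

function serveFeed(string $cacheFile): string
{
    if (is_file($cacheFile)) {
        // near-zero cost: serve the stored response as-is
        return file_get_contents($cacheFile);
    }
    $xml = buildFeedXml();
    file_put_contents($cacheFile, $xml); // refresh the cache
    return $xml;
}

// whenever the underlying feed data changes, invalidate the cache:
// unlink($cacheFile);
```

The important part is the invalidation: delete (or overwrite) the cached file at the moment the feed data changes, and every client in between pays only the cost of a file read.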
