How can I optimize my FQL to avoid Facebook timeouts?


You should loop through using limit/offset like you said, or cache the friends list up front as puffpio suggested.

You said that it still wasn't working reliably - this is because some users may have many, many links, while others have very few. Note also that you may be retrieving uncached data for some users. I would recommend adding a single retry in your loop for failed queries - it's often the case that the first attempt times out and the second succeeds because the data has just been cached.
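
To make that concrete, here is a rough Python sketch of such a loop - one page per request via LIMIT/OFFSET, with a single retry per page. The graph.facebook.com/fql endpoint and the run_fql / fetch_links_paged helper names are illustrative assumptions, not anything this answer prescribes.

import json
import time
import urllib.parse
import urllib.request

GRAPH_FQL_URL = "https://graph.facebook.com/fql"  # legacy FQL endpoint (assumed)

def run_fql(query, access_token):
    # Hypothetical helper: run one FQL query and return the list of result rows.
    params = urllib.parse.urlencode({"q": query, "access_token": access_token})
    with urllib.request.urlopen(GRAPH_FQL_URL + "?" + params, timeout=30) as resp:
        return json.loads(resp.read().decode("utf-8"))["data"]

def fetch_links_paged(access_token, since, page_size=50):
    # Page through the link table with LIMIT/OFFSET; 'since' is a unix timestamp.
    links, offset = [], 0
    while True:
        query = (
            "SELECT link_id, title, url, owner, created_time FROM link "
            "WHERE created_time > %d AND owner IN "
            "(SELECT uid2 FROM friend WHERE uid1 = me()) "
            "LIMIT %d OFFSET %d" % (since, page_size, offset)
        )
        try:
            page = run_fql(query, access_token)
        except Exception:
            time.sleep(1)                        # brief pause, then a single retry -
            page = run_fql(query, access_token)  # the second attempt often hits freshly cached data
        if not page:
            break
        links.extend(page)
        offset += page_size
    return links

Keeping page_size small makes each individual query cheap enough to finish before the timeout; the retry then covers the occasional cold-cache miss.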

Finally, for posterity, I'm opening a task to optimize the link table to do a better job of being efficient when it's being filtered by time.

Paul Sasik

Some DB engines do not optimize the IN keyword well, or at all; they may execute the IN subquery once for every resulting row of your outer query. Can you join the link and friend tables instead of using IN with a subquery?

You may find this article interesting. (It discusses issues with IN-clause performance on MySQL, and Facebook runs MySQL on the back end.)

puffpio

It would be better to cache the user's friend list and only refresh it occasionally. In other words, run this query:

SELECT uid2
FROM friend
WHERE uid1 = me()

Cache that list of UIDs and then run:

SELECT link_id, title, url, owner, created_time
FROM link
WHERE
    created_time > strtotime('yesterday') AND
    owner IN (/*your user list here*/)
LIMIT 100

This way you are not running the inner query on every request. In practice a user's friend list does not have a high churn rate, so you do not need to refresh it nearly as often as you fetch the shared links.
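
As a rough sketch of that caching step (the TTL value, the get_friend_uids helper, and the graph.facebook.com/fql endpoint are illustrative assumptions):

import json
import time
import urllib.parse
import urllib.request

GRAPH_FQL_URL = "https://graph.facebook.com/fql"  # legacy FQL endpoint (assumed)
FRIEND_CACHE_TTL = 6 * 60 * 60                    # refresh the friend list every 6 hours (arbitrary)

_friend_cache = {"uids": None, "fetched_at": 0.0}

def get_friend_uids(access_token):
    # Return the cached friend UIDs, re-running the friend query only when the cache is stale.
    now = time.time()
    if _friend_cache["uids"] is None or now - _friend_cache["fetched_at"] > FRIEND_CACHE_TTL:
        params = urllib.parse.urlencode({
            "q": "SELECT uid2 FROM friend WHERE uid1 = me()",
            "access_token": access_token,
        })
        with urllib.request.urlopen(GRAPH_FQL_URL + "?" + params, timeout=30) as resp:
            rows = json.loads(resp.read().decode("utf-8"))["data"]
        _friend_cache["uids"] = [row["uid2"] for row in rows]
        _friend_cache["fetched_at"] = now
    return _friend_cache["uids"]

In a real app the cache would more likely live in memcached or a database table keyed by user, but the idea is the same: the friend query runs only occasionally, while the link query runs on every refresh.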

Additionally, architecting it this way allows you to break the second query up into multiple queries, each with a different set of owners, and then use fql.multiquery to run them all simultaneously, as in the sketch below.
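
A rough Python sketch of that chunking, assuming the fql endpoint accepts a JSON dictionary of named queries as a multiquery and returns name / fql_result_set pairs as the legacy API did (the chunked and fetch_links_multiquery helpers are illustrative names):

import json
import urllib.parse
import urllib.request

GRAPH_FQL_URL = "https://graph.facebook.com/fql"  # legacy FQL endpoint (assumed)

def chunked(seq, size):
    # Yield fixed-size slices of a list.
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

def fetch_links_multiquery(friend_uids, since, access_token, chunk_size=50):
    # Build one named link query per chunk of the cached friend list ...
    queries = {}
    for n, chunk in enumerate(chunked(friend_uids, chunk_size)):
        owners = ",".join(str(uid) for uid in chunk)
        queries["links%d" % n] = (
            "SELECT link_id, title, url, owner, created_time FROM link "
            "WHERE created_time > %d AND owner IN (%s) LIMIT 100" % (since, owners)
        )
    # ... and send them all in one request; a JSON dictionary in q runs as a multiquery.
    params = urllib.parse.urlencode({"q": json.dumps(queries), "access_token": access_token})
    with urllib.request.urlopen(GRAPH_FQL_URL + "?" + params, timeout=30) as resp:
        result_sets = json.loads(resp.read().decode("utf-8"))["data"]
    # Flatten the per-query result sets back into a single list of links.
    links = []
    for entry in result_sets:
        links.extend(entry.get("fql_result_set", []))
    return links

Smaller owner chunks keep each individual query cheap, while the multiquery keeps the number of round-trips down.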
