varnish

How to bypass varnish cache on client?

懵懂的女人 submitted on 2020-01-16 03:52:11
Question: I'm trying to get new content from a Joomla website or Yii web application, and we're using Varnish on the server side. How do I bypass Varnish's cache control so that I can get the new content? I tried adding ?cachebuster=9999 to the end of the URL, but it's not working. I can get the new content only when I'm using a proxy. Any idea?

Answer 1: Add the port 8080 to bypass Varnish. Example: www.example.com:8080

Source: https://stackoverflow.com/questions/26447823/how-to-bypass-varnish-cache-on-client
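
The answer assumes the common layout where Varnish listens on port 80 while the real web server (Apache here) listens on port 8080, so a request sent directly to port 8080 never passes through the cache. A minimal sketch of that assumed setup in VCL 4 syntax; the host, the port, and the extra nocache escape hatch are illustrative additions, not part of the original question or answer:

    vcl 4.0;

    # Assumed topology: Varnish on :80 in front, the origin serving the
    # Joomla/Yii application on :8080.
    backend default {
        .host = "127.0.0.1";
        .port = "8080";
    }

    sub vcl_recv {
        # Optional server-side escape hatch: let a special query parameter
        # skip the cache for clients that cannot reach port 8080 directly.
        if (req.url ~ "[?&]nocache=1") {
            return (pass);
        }
    }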

Cache-control: Is it possible to ignore query parameters when validating the cache?

放肆的年华 submitted on 2020-01-15 03:33:11
Question: Is it possible to set a Cache-Control header, when talking to a reverse proxy, that tells it to ignore query parameters when deciding what counts as a unique URI; in short, to treat a cached entry as valid even if some query parameters have changed? Sometimes query parameters have nothing to do with the rendering of the page, at least from a server-side perspective, for instance all the utm_* variables from Google AdWords. These are needed by the JavaScript on your page, so you don't want to strip them away and redirect to a…
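
As far as I know there is no standard Cache-Control directive for this, but on the Varnish side a common workaround is to normalize the URL in vcl_recv so tracking parameters never become part of the cache key; the client still sees the full URL, so the JavaScript keeps its utm_* values. A minimal VCL 4 fragment sketching the idea (the utm_*/gclid list is illustrative):

    sub vcl_recv {
        # Drop tracking parameters so they do not create separate cache objects.
        set req.url = regsuball(req.url, "&(utm_[a-z]+|gclid)=[^&]*", "");
        set req.url = regsuball(req.url, "\?(utm_[a-z]+|gclid)=[^&]*", "?");
        # Tidy up any leftover "?&" or trailing "?".
        set req.url = regsub(req.url, "\?&", "?");
        set req.url = regsub(req.url, "\?$", "");
    }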

FastDFS && Nginx: installation and usage

雨燕双飞 submitted on 2020-01-15 02:28:43
Our company recently needed centralized storage for images. In the internet industry there are two main approaches: a simple hash-based implementation, or one built on a distributed file system (for example Taobao's TFS, Amazon's S3, Google's GFS, and so on). Below we walk through an implementation based on FastDFS; the architecture diagram is as follows:

>> Installing FastDFS

1) Install libevent
# rpm -qa | grep libevent
# rpm -qa | grep libevent | xargs rpm -e --nodeps
# tar -zxvf libevent-2.0.20-stable.tar.gz
# cd libevent-2.0.20-stable
# ./configure
# make
# make install
# ln -s /usr/local/lib/libevent* /lib/
# ln -s /usr/local/lib/libevent* /lib64/

2) Install FastDFS
# tar -zxvf FastDFS_v3.11.tar.gz
# cd FastDFS
# vi make.sh
  WITH_LINUX_SERVICE=1
# ./make.sh
# ./make.sh install
(If FastDFS should start automatically at boot, uncomment the WITH_LINUX_SERVICE=1 line in make.sh before running it.)

3) …

(nginx, Varnish, Squid, Apache TrafficServer): differences and comparison of nginx and Varnish, part 2

a 夏天 submitted on 2020-01-15 02:03:26
In earlier articles we covered the performance and configuration of HAProxy and Varnish in detail. Today we look at the differences between these three open-source proxy servers and which scenarios call for which one: which of them can sustain high availability and high concurrency while staying easy to maintain, and how ops and network administrators should pick a suitable proxy solution from among them. Below we go through the basics of each and then compare them.

About proxy servers: a proxy server sits in front of the backend services; it spreads traffic, allocates resources, and mitigates security attacks such as DDoS, while supporting high availability for web applications.

Varnish: Varnish is a reverse HTTP proxy designed to accelerate high-traffic web applications. Unlike the others, if you want to use it for proxying and load balancing, … Typical Varnish users include Wikipedia and The New York Times; I also deployed Varnish at Okbuy (Okbuy.com), where it kept server performance and availability up. There are plenty of other examples I won't enumerate. Varnish has been in development since 2006.

Nginx: Nginx is probably the top-ranked web server; it combines load balancing, reverse proxying, and web serving in one package, and both its development and its community are very active. Nginx currently comes in an open-source edition and a commercial edition, plus variants such as Taobao's optimized Tengine. It is used by a huge number of high-load sites; most well-known sites at home and abroad use Nginx as their web server, including WordPress, Airbnb, and China's BAT companies, among many others…

Heroku & Rails - Varnish HTTP Cache Not Working

你离开我真会死。 submitted on 2020-01-05 04:03:06
Question: My Heroku website's root page is essentially static; it has some Ruby code in the view when it's generated, but there's nothing specific to a single user, so I'd like to have it cached by Varnish and served up without hitting my dyno (note that there are other pages in the application that are dynamic). Heroku makes it seem very simple here. Just add response.headers['Cache-Control'] = 'public, max-age=300' and it'll cache for 5 minutes before regenerating. To test this I made the change and…

Varnish: purge if I have cookie in hash_data

纵饮孤独 submitted on 2020-01-05 03:33:48
Question: Problem: I couldn't purge my page. After a while I decided to find out how purge works, and found this: "As you can see we have used a new action - return(purge). This ends execution of vcl_recv and jumps to vcl_hash. This is just like how we handle a regular request. When vcl_hash calls return(lookup), Varnish will purge the object and then call vcl_purge. Here you have the option of adding any particular actions you want Varnish to take once it has purged the object." (docs) And then I understood that I…
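
The behavior described above follows directly from how vcl_hash builds the cache key: return(purge) looks up the object under the same hash a normal request would use, so the PURGE request must carry whatever you feed into hash_data. A minimal VCL 4 sketch (the ACL and the choice to hash the whole Cookie header are assumptions for illustration, not taken from the question):

    acl purgers {
        "127.0.0.1";
    }

    sub vcl_recv {
        if (req.method == "PURGE") {
            if (!client.ip ~ purgers) {
                return (synth(405, "Not allowed"));
            }
            # Jumps to vcl_hash; the object found under that hash is purged.
            return (purge);
        }
    }

    sub vcl_hash {
        hash_data(req.url);
        if (req.http.host) {
            hash_data(req.http.host);
        }
        # Because the cookie is part of the hash, a PURGE only removes the
        # object if it sends exactly the same Cookie header as the request
        # that populated the cache; otherwise it hashes to a different
        # (non-existent) object and nothing is purged.
        if (req.http.Cookie) {
            hash_data(req.http.Cookie);
        }
        return (lookup);
    }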

Varnish Cache first time hit

十年热恋 submitted on 2020-01-04 07:02:21
Question: I'm running Varnish on a dedicated server. When I load a page, it is delivered via Apache, and on the second and subsequent hits it is delivered via Varnish Cache (i.e. I can see two timestamps in the X-Varnish header). But when I open up the same page from some other computer, it's again delivered from the backend (Apache) the first time, and on further reloads it comes from Varnish. If a page is already in the Varnish cache, isn't it supposed to be delivered via Varnish even on a new…
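
A plausible cause, though it is not stated in the question, is that the cache key or a Vary header includes something that differs between clients (cookies, User-Agent, and so on), so each browser builds its own cache object and gets its own first miss. If the page really is identical for everyone, one way to share a single object is to drop the per-client data before the lookup; a VCL 4 sketch with an illustrative URL pattern:

    sub vcl_recv {
        # For pages that are the same for every visitor, remove cookies so
        # all clients hit one shared cache object instead of each new
        # machine triggering its own backend fetch.
        if (req.url ~ "^/(about|pricing)") {
            unset req.http.Cookie;
        }
    }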

Setting Cookies via ESI:include, how?

隐身守侯 submitted on 2020-01-02 05:45:18
Question: I'm trying to use ESI to do some ninja caching on my site. The idea is that the site is mostly static; I just need to do fancy stuff depending on whether the user is logged in or not. So I was trying to put an <esi:include> on page A and set triggers in the application at page B. This way I could cache page A in Varnish and let the server deal with the small amount of work that is page B. But the cookies I set on page B were not forwarded to the headers of page A, so it didn't work =/ Is what I'm trying to do possible? I could…
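
Part of the explanation is how Varnish assembles ESI pages: only the body of the included response (page B) is spliced into page A, and the fragment's response headers, including any Set-Cookie it tries to send, are not forwarded to the client. A minimal VCL 4 fragment showing the usual way ESI is switched on for the container page (the /page-a path is illustrative):

    sub vcl_backend_response {
        # Process ESI only on the container page. Fragments are inserted
        # body-only, so a Set-Cookie header returned by the fragment
        # (page B) is discarded rather than passed on to the browser.
        if (bereq.url == "/page-a") {
            set beresp.do_esi = true;
        }
    }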

Varnish and ESI, how is the performance?

喜夏-厌秋 submitted on 2020-01-01 04:44:06
Question: I'm wondering how the performance of the ESI module is nowadays. I've read some posts on the web saying that ESI performance on Varnish was actually slower than the real thing. Say I had a page with over 3500 ESI includes, how would this perform? Is ESI designed for such usage?

Answer 1: We're using Varnish and ESI to embed sub-documents into JSON documents. Basically a response from our app server looks like this: [ <esi:include src="/station/best_of_80s" />, <esi:include src="/station/herrmerktradio" />…
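
The reason this approach can scale is that each fragment is cached as its own object with its own TTL, so a page with many includes is mostly assembled from cache rather than from thousands of backend requests. A VCL 4 sketch of that split, assuming the fragments live under /station/ as in the answer and the container under an illustrative /stations path (the TTLs are also illustrative):

    sub vcl_backend_response {
        if (bereq.url ~ "^/stations") {
            # The JSON container is ESI-processed and can be short-lived.
            set beresp.do_esi = true;
            set beresp.ttl = 1m;
        } else if (bereq.url ~ "^/station/") {
            # Each embedded sub-document is cached independently for longer.
            set beresp.ttl = 1h;
        }
    }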

How do I disable 'Transfer-Encoding: chunked' encoding in Varnish?

拜拜、爱过 submitted on 2020-01-01 02:10:10
Question: Using Varnish 4, I have a set of backends that are responding with a valid Content-Length header and no Transfer-Encoding header. On the first hit from a client, rather than responding to the client with those headers, Varnish drops the Content-Length header and adds Transfer-Encoding: chunked to the response. (Interestingly, the payload doesn't appear to have any chunks in it; it's one contiguous payload.) This causes serious problems for clients like Flash video players that are…
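
One explanation that fits Varnish 4 (an assumption here, since the question is cut off) is streaming: beresp.do_stream defaults to true, so on a cache miss Varnish streams the response while it is still being fetched, and a streamed response is delivered with Transfer-Encoding: chunked rather than a Content-Length. Turning streaming off for the affected URLs makes Varnish buffer the whole body first and send a Content-Length instead; a minimal sketch with an illustrative URL pattern:

    sub vcl_backend_response {
        # Buffer the whole object before delivery so the first (miss)
        # response can carry a Content-Length header instead of
        # Transfer-Encoding: chunked. Scope it to the video files only.
        if (bereq.url ~ "\.(flv|mp4)$") {
            set beresp.do_stream = false;
        }
    }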