I am designing a RESTful API that is intended to be consumed by a single-page application and a native mobile app. Some calls of this API return public results that can be cached.
This is a 5-year-old question from @flexresponsive, with the most recent answer written 3 years ago and commented upon 2 years ago. While I'm sure the OP has by now found a solution, be it within CloudFlare or elsewhere, I will update the solutions in a contemporary (2020) fashion while staying within CloudFlare. Detailed Page Rules are always a good idea; however, for the OP's specific needs, this specific set, in combination with a CloudFlare Workers script, will be of benefit:
1. Edge Cache TTL: (n) time, set to the time necessary for CloudFlare to cache your API content at its "Edge" (which edge node/server-farm location your routes are served from depends on your account plan, with "Free" being of lowest priority and thus more likely to serve content from a location with a higher latency to your consumers).
2. However, with Edge Cache TTL > 0 (basically, using it at all), you will not be able to set the following, which may or may not be of importance to your API:
3. Cache Deception Armor: ON
4. Origin Cache Control: ON, if #3 is being used and you want to do the following:
5. Cache Level: Cache Everything, in combination with a Worker that runs during calls to your API. Staying on-topic, I'll show two headers to use specific to your API's route/address.
addEventListener("fetch", event => {
  event.respondWith(fetchAndReplace(event.request));
});

async function fetchAndReplace(request) {
  // Pass the request through to the origin API.
  const response = await fetch(request);

  // Only rewrite headers on application/* responses (e.g. application/json);
  // anything else is returned untouched.
  const type = response.headers.get("Content-Type") || "";
  if (!type.startsWith("application/")) {
    return response;
  }

  // Response headers are immutable, so copy them before modifying.
  const newHeaders = new Headers(response.headers);
  // Let shared caches (CloudFlare's edge) keep the payload for a day.
  newHeaders.set("Cache-Control", "s-maxage=86400");
  // Tell the consumer's browser to clear its cache on receipt.
  newHeaders.set("Clear-Site-Data", '"cache"');

  return new Response(response.body, {
    status: response.status,
    statusText: response.statusText,
    headers: newHeaders
  });
}
In setting the two cache-specific headers, you are saying "only shared proxies can cache this". It's impossible to fully control how any shared proxy actually behaves, though, so depending on the API payload, the no-transform value may be worth adding. If only JSON is in play, you'd be fine without it unless a misbehaving cache decides to mangle it along the way; but if you'll be serving anything requiring an integrity hash or a nonce, then no-transform is a must to ensure that the payload isn't altered at all, since an altered payload cannot be verified as the file coming from your API. The Clear-Site-Data header with the "cache" value instructs the consumer's browser to essentially clear its cache as it receives the payload. Note that cache needs to be within double quotes in the HTTP header for it to function.
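As a minimal sketch of that stricter variant (reusing the Worker pattern from the script above, with the same header names and the 86400-second s-maxage kept purely as an example value), the header block would look like this:

async function fetchWithNoTransform(request) {
  const response = await fetch(request);
  const newHeaders = new Headers(response.headers);
  // no-transform forbids intermediaries from re-encoding or otherwise
  // altering the body, which would break integrity hashes or nonces.
  newHeaders.set("Cache-Control", "s-maxage=86400, no-transform");
  // Still ask the consumer's browser to clear its cache on receipt.
  newHeaders.set("Clear-Site-Data", '"cache"');
  return new Response(response.body, {
    status: response.status,
    statusText: response.statusText,
    headers: newHeaders
  });
}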
As for running checks to ensure that your consumers aren't experiencing a blocking situation, where the API payload cannot be transmitted directly to them and an hCaptcha kicks in, inspect the final destination of each call for a query string containing a CloudFlare "cf" parameter (I don't recall the exact layout, but it would definitely have "cf" in it, and it would definitely not be where you want your consumers landing). Beyond that, the "normal" DDoS protection that CloudFlare uses would not be triggered by normal interaction with the API. I'd also recommend not following CloudFlare's specific advice to use a security level of anything but "I'm Under Attack"; on that point I must point out that even though the 5-second redirect won't occur on each request, hCaptchas will be triggered on the Low, Medium and High security levels. Setting the security level to "Essentially Off" does not mean no security at all; additionally, the WAF will catch standard violations, and that of course may be adjusted according to what is being served from your API.
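As a rough illustration of that inspection (treat this as a sketch: fetchApi is a hypothetical client helper, and matching on "cf" in the query string is an assumption, since the exact parameter layout isn't pinned down here), a single-page application could verify the final destination of each call:

// Hypothetical client-side helper: fetch the API and make sure the final
// destination doesn't look like a CloudFlare challenge/interstitial page.
async function fetchApi(url) {
  const response = await fetch(url, { redirect: "follow" });
  // response.url is the final URL after any redirects.
  const finalUrl = new URL(response.url);
  // Assumption: a challenge page would carry a CloudFlare "cf" parameter in
  // its query string, which a direct API response would not.
  const looksIntercepted = [...finalUrl.searchParams.keys()]
    .some(key => key.toLowerCase().includes("cf"));
  if (looksIntercepted) {
    throw new Error("Request landed on an unexpected page: " + response.url);
  }
  return response.json();
}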
Hopefully this is of use, if not to the OP, then at least to other would-be visitors.
Yes, CloudFlare can help with DDoS protection, and no, it does not implement caching and rate-limiting for your API. You have to implement those yourself, or use a framework that does.
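As one way of doing that yourself (a minimal sketch only, assuming a Node.js/Express backend with the third-party express-rate-limit package, neither of which is part of CloudFlare):

// Minimal sketch: per-IP rate limiting on an Express API.
const express = require("express");
const rateLimit = require("express-rate-limit");

const app = express();

// Allow at most 100 requests per IP in each 15-minute window.
const apiLimiter = rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 100
});

app.use("/api/", apiLimiter);

app.get("/api/items", (req, res) => {
  res.json({ items: [] });
});

app.listen(3000);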
You can use CloudFlare to protect your API endpoint by using it as a proxy. CloudFlare protects the entire URL, but you can use Page Rules to tweak the settings for your API endpoint.
Example: https://api.example.com/*
To cache API responses, create a page rule like https://api.example.com/*.json
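If you prefer to script that rather than clicking through the dashboard, a page rule like the one above can also be created via CloudFlare's v4 API. This is only a sketch; YOUR_ZONE_ID and YOUR_API_TOKEN are placeholders for your own credentials, and you should check the current API docs for the exact action names your plan supports:

// Sketch: create a "Cache Everything" page rule for the JSON API pattern
// through CloudFlare's v4 API. Zone ID and API token are placeholders.
const ZONE_ID = "YOUR_ZONE_ID";
const API_TOKEN = "YOUR_API_TOKEN";

async function createCachePageRule() {
  const response = await fetch(
    `https://api.cloudflare.com/client/v4/zones/${ZONE_ID}/pagerules`,
    {
      method: "POST",
      headers: {
        "Authorization": `Bearer ${API_TOKEN}`,
        "Content-Type": "application/json"
      },
      body: JSON.stringify({
        targets: [
          {
            target: "url",
            constraint: { operator: "matches", value: "api.example.com/*.json" }
          }
        ],
        actions: [{ id: "cache_level", value: "cache_everything" }],
        status: "active"
      })
    }
  );
  return response.json();
}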
There are many other ways you can protect APIs. Hope this answer has been of help.
Cloudflare has published a list of best practices for using it with APIs.
TL;DR, they recommend setting a page rule that matches all API requests and putting the following settings on it: