Is either GET or POST more secure than the other?

没有蜡笔的小新 2020-11-22 05:13

When comparing an HTTP GET to an HTTP POST, what are the differences from a security perspective? Is one of the choices inherently more secure than the other? If so, why?

27 answers
  • 2020-11-22 05:41

    The difference between GET and POST should not be viewed in terms of security, but rather in their intentions towards the server. GET should never change data on the server - at least other than in logs - but POST can create new resources.

    Well-behaved proxies won't cache POST data, but they may cache GET data from the URL, so in that sense POST could be considered more secure. But POST data would still be visible to proxies that don't play nicely.

    As mentioned in many of the answers, the only sure bet is via SSL.

    But DO make sure that GET methods do not commit any changes, such as deleting database rows, etc.
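
    To make that rule concrete, here is a minimal sketch (Flask-style Python; the routes and the in-memory "database" are made up purely for illustration) of keeping GET handlers read-only and putting the actual change behind POST:

     from flask import Flask, abort, jsonify

     app = Flask(__name__)

     # In-memory stand-in for a real database (illustration only).
     articles = {1: {"id": 1, "title": "Hello"}}

     @app.route("/articles/<int:article_id>", methods=["GET"])
     def show_article(article_id):
         # GET: read-only, so spiders, caches and prefetchers can hit it
         # repeatedly without changing anything.
         if article_id not in articles:
             abort(404)
         return jsonify(articles[article_id])

     @app.route("/articles/<int:article_id>/delete", methods=["POST"])
     def delete_article(article_id):
         # POST: only reached by a deliberate form submission or script,
         # never by something that merely follows links.
         articles.pop(article_id, None)
         return "", 204

    With this split, anything that just follows links can only ever read; changing data requires an explicit submission.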

  • 2020-11-22 05:43

    This isn't security related, but... browsers don't cache POST requests.

  • 2020-11-22 05:44

    As others have already said, HTTPS is what actually brings security.

    However, POST is slightly safer than GET, because GET parameters can end up stored in the browser history.

    Sadly, the choice between POST and GET is sometimes not up to the developer. For example, a hyperlink is always sent as a GET request (unless it is transformed into a POST form using JavaScript).

  • 2020-11-22 05:46

    The GET request is marginally less secure than the POST request. Neither offers true "security" by itself; using POST requests will not magically make your website secure against malicious attacks by a noticeable amount. However, using GET requests can make an otherwise secure application insecure.

    The mantra that you "must not use GET requests to make changes" is still very much valid, but this has little to do with malicious behaviour. Login forms are the ones most sensitive to being sent using the wrong request type.

    Search spiders and web accelerators

    This is the real reason you should use POST requests for changing data. Search spiders will follow every link on your website, but will not submit random forms they find.

    Web accelerators are worse than search spiders, because they run on the client’s machine, and "click" all links in the context of the logged in user. Thus, an application that uses a GET request to delete stuff, even if it requires an administrator, will happily obey the orders of the (non-malicious!) web accelerator and delete everything it sees.

    Confused deputy attack

    A confused deputy attack (where the deputy is the browser) is possible regardless of whether you use a GET or a POST request.

    On attacker-controlled websites GET and POST are equally easy to submit without user interaction.

    The only scenario in which POST is slightly less susceptible is that many websites that aren’t under the attacker’s control (say, a third-party forum) allow embedding arbitrary images (allowing the attacker to inject an arbitrary GET request), but prevent all ways of injecting an arbitrary POST request, whether automatic or manual.

    One might argue that web accelerators are an example of confused deputy attack, but that’s just a matter of definition. If anything, a malicious attacker has no control over this, so it’s hardly an attack, even if the deputy is confused.

    Proxy logs

    Proxy servers are likely to log GET URLs in their entirety, without stripping the query string. POST request parameters are not normally logged. Cookies are unlikely to be logged in either case.

    This is a very weak argument in favour of POST. Firstly, un-encrypted traffic can be logged in its entirety; a malicious proxy already has everything it needs. Secondly, the request parameters are of limited use to an attacker: what they really need is the cookies, so if the only thing they have are proxy logs, they are unlikely to be able to attack either a GET or a POST URL.

    There is one exception for login requests: these tend to contain the user’s password. Saving this in the proxy log opens up a vector of attack that is absent in the case of POST. However, login over plain HTTP is inherently insecure anyway.

    Proxy cache

    Caching proxies might retain GET responses, but not POST responses. Having said that, GET responses can be made non-cacheable with less effort than converting the URL to a POST handler.
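
    For completeness, a minimal sketch of the "make the GET response non-cacheable" option (Flask-style Python; the route and payload are hypothetical):

     from flask import Flask, jsonify

     app = Flask(__name__)

     @app.route("/account/balance", methods=["GET"])  # hypothetical route
     def account_balance():
         response = jsonify({"balance": 42})          # placeholder payload
         # Tell browsers and intermediate caches never to store this response.
         response.headers["Cache-Control"] = "no-store"
         response.headers["Pragma"] = "no-cache"      # legacy HTTP/1.0 caches
         return response

    Well-behaved caches will not retain a response marked no-store.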

    HTTP "Referer"

    If the user were to navigate to a third party website from the page served in response to a GET request, that third party website gets to see all the GET request parameters.

    This belongs to the category of "reveals request parameters to a third party", whose severity depends on what is present in those parameters. POST requests are naturally immune to this; to exploit a GET request, however, a hacker would need to insert a link to their own website into the server’s response.

    Browser history

    This is very similar to the "proxy logs" argument: GET requests are stored in the browser history along with their parameters. The attacker can easily obtain these if they have physical access to the machine.

    Browser refresh action

    The browser will retry a GET request as soon as the user hits "refresh". It might do that when restoring tabs after shutdown. Any action (say, a payment) will thus be repeated without warning.

    The browser will not retry a POST request without a warning.

    This is a good reason to use only POST requests for changing data, but has nothing to do with malicious behaviour and, hence, security.

    So what should I do?

    • Use only POST requests to change data, mainly for non-security-related reasons.
    • Use only POST requests for login forms; doing otherwise introduces attack vectors.
    • If your site performs sensitive operations, you really need someone who knows what they’re doing, because this can’t be covered in a single answer. You need to use HTTPS, HSTS, CSP, mitigate SQL injection, script injection (XSS), CSRF, and a gazillion of other things that may be specific to your platform (like the mass assignment vulnerability in various frameworks: ASP.NET MVC, Ruby on Rails, etc.). There is no single thing that will make the difference between "secure" (not exploitable) and "not secure".

    Over HTTPS, POST data is encoded, but could URLs be sniffed by a 3rd party?

    No, they can’t be sniffed. But the URLs will be stored in the browser history.

    Would it be fair to say the best practice is to avoid placing sensitive data in the POST or GET altogether and to use server-side code to handle sensitive information instead?

    Depends on how sensitive it is, or more specifically, in what way. Obviously the client will see it. Anyone with physical access to the client’s computer will see it. The client can spoof it when sending it back to you. If those matter then yes, keep the sensitive data on the server and don’t let it leave.

  • 2020-11-22 05:46

    Variables sent over HTTP POST are given no greater security than variables sent over HTTP GET.

    HTTP/1.1 provides us with a bunch of methods to send a request:

    • OPTIONS
    • GET
    • HEAD
    • POST
    • PUT
    • DELETE
    • TRACE
    • CONNECT

    Let's suppose you have the following HTML document using GET:

    <html>
    <body>
    <form action="http://example.com" method="get">
        User: <input type="text" name="username" /><br/>
        Password: <input type="password" name="password" /><br/>
        <input type="hidden" name="extra" value="lolcatz" />
        <input type="submit"/>
    </form>
    </body>
    </html>
    

    What does your browser ask? It asks this:

     GET /?username=swordfish&password=hunter2&extra=lolcatz HTTP/1.1
     Host: example.com
     Connection: keep-alive
     Accept: application/xml,application/xhtml+xml,text/html;q=0.9,text/ [...truncated]
     User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) [...truncated]
     Accept-Encoding: gzip,deflate,sdch
     Accept-Language: en-US,en;q=0.8
     Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
    

    Now let's pretend we changed that request method to a POST:

     POST / HTTP/1.1
     Host: example.com
     Connection: keep-alive
     Content-Length: 49
     Cache-Control: max-age=0
     Origin: null
     Content-Type: application/x-www-form-urlencoded
     Accept: application/xml,application/xhtml+xml,text/ [...truncated]
     User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.0; [...truncated]
     Accept-Encoding: gzip,deflate,sdch
     Accept-Language: en-US,en;q=0.8
     Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
    
     username=swordfish&password=hunter2&extra=lolcatz
    

    BOTH of these HTTP requests are:

    • Not encrypted
    • Carrying the credentials in plain text (in the URL in one case, in the body in the other)
    • Open to eavesdropping and MITM attacks
    • Easily reproduced by third parties and scripted bots (see the sketch below)
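
    To show how easily either request can be reproduced, here is a minimal sketch using Python's requests library (the host and credentials are simply the ones from the examples above):

     import requests

     payload = {"username": "swordfish", "password": "hunter2", "extra": "lolcatz"}

     # Reproduce the GET request: the parameters end up in the URL's query string.
     requests.get("http://example.com/", params=payload)

     # Reproduce the POST request: the very same parameters go into the body instead.
     requests.post("http://example.com/", data=payload)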

    Many browsers do not support HTTP methods other than GET and POST in HTML forms.

    Many browsers store the page address (and with it any GET query string) in history, but this doesn't mean you can ignore any of the other issues.

    So to be specific:

    Is one inherently more secure than the other? I realize that POST doesn't expose information in the URL, but is there any real value in that or is it just security through obscurity? What is the best practice here?

    This is correct. The fact that the software you use to speak HTTP tends to store the request variables for one method but not the other only prevents someone from looking through your browser history, or some other naive attack by a 10-year-old who thinks they understand h4x0r1ng, or by a script that checks your history store. Anyone with a script that can read your history store could just as easily have one that checks your network traffic, so this whole bit of security through obscurity only provides obscurity against script kiddies and jealous exes.

    Over HTTPS, POST data is encoded, but could URLs be sniffed by a 3rd party?

    Here's how SSL works. Remember those two requests I sent above? Here's what they look like in SSL: (I changed the page to https://encrypted.google.com/ as example.com doesn't respond on SSL).

    POST over SSL

    q5XQP%RWCd2u#o/T9oiOyR2_YO?yo/3#tR_G7 2_RO8w?FoaObi)
    oXpB_y?oO4q?`2o?O4G5D12Aovo?C@?/P/oOEQC5v?vai /%0Odo
    QVw#6eoGXBF_o?/u0_F!_1a0A?Q b%TFyS@Or1SR/O/o/_@5o&_o
    9q1/?q$7yOAXOD5sc$H`BECo1w/`4?)f!%geOOF/!/#Of_f&AEI#
    yvv/wu_b5?/o d9O?VOVOFHwRO/pO/OSv_/8/9o6b0FGOH61O?ti
    /i7b?!_o8u%RS/Doai%/Be/d4$0sv_%YD2_/EOAO/C?vv/%X!T?R
    _o_2yoBP)orw7H_yQsXOhoVUo49itare#cA?/c)I7R?YCsg ??c'
    (_!(0u)o4eIis/S8Oo8_BDueC?1uUO%ooOI_o8WaoO/ x?B?oO@&
    Pw?os9Od!c?/$3bWWeIrd_?( `P_C?7_g5O(ob(go?&/ooRxR'u/
    T/yO3dS&??hIOB/?/OI?$oH2_?c_?OsD//0/_s%r
    

    GET over SSL

    rV/O8ow1pc`?058/8OS_Qy/$7oSsU'qoo#vCbOO`vt?yFo_?EYif)
    43`I/WOP_8oH0%3OqP_h/cBO&24?'?o_4`scooPSOVWYSV?H?pV!i
    ?78cU!_b5h'/b2coWD?/43Tu?153pI/9?R8!_Od"(//O_a#t8x?__
    bb3D?05Dh/PrS6_/&5p@V f $)/xvxfgO'q@y&e&S0rB3D/Y_/fO?
    _'woRbOV?_!yxSOdwo1G1?8d_p?4fo81VS3sAOvO/Db/br)f4fOxt
    _Qs3EO/?2O/TOo_8p82FOt/hO?X_P3o"OVQO_?Ww_dr"'DxHwo//P
    oEfGtt/_o)5RgoGqui&AXEq/oXv&//?%/6_?/x_OTgOEE%v (u(?/
    t7DX1O8oD?fVObiooi'8)so?o??`o"FyVOByY_ Supo? /'i?Oi"4
    tr'9/o_7too7q?c2Pv
    

    (note: I converted the HEX to ASCII, some of it should obviously not be displayable)

    The entire HTTP conversation is encrypted; the only visible portion of the communication is at the TCP/IP layer (meaning the IP address and connection port information).

    So let me make a big, bold statement here: your website is not given greater security by one HTTP method than by another; hackers and newbies all over the world know exactly how to do what I've just demonstrated here. If you want security, use SSL. Browsers tend to store history, and RFC 2616 section 9.1.1 recommends NOT using GET to perform an action, but to think that POST provides security is flatly wrong.

    The only thing that POST is a security measure towards? Protection against your jealous ex flipping through your browser history. That's it. The rest of the world is logged into your account laughing at you.

    To further demonstrate why POST isn't secure, Facebook uses POST requests all over the place, so how can software such as FireSheep exist?
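
    To put the "use SSL" advice into code, here is a minimal Flask-style sketch (hypothetical; behind a TLS-terminating proxy the is_secure check needs extra configuration, and most frameworks ship an equivalent feature) that forces HTTPS, sends an HSTS header and marks the session cookie Secure, which is the kind of setup that blunts FireSheep-style cookie sniffing:

     from flask import Flask, redirect, request

     app = Flask(__name__)
     app.config.update(
         SESSION_COOKIE_SECURE=True,    # never send the session cookie over plain HTTP
         SESSION_COOKIE_HTTPONLY=True,  # keep it out of reach of page scripts
     )

     @app.before_request
     def force_https():
         # Redirect any plain-HTTP request to its HTTPS equivalent.
         if not request.is_secure:
             return redirect(request.url.replace("http://", "https://", 1), code=301)

     @app.after_request
     def add_hsts(response):
         # Ask browsers to refuse plain HTTP for this host for the next year.
         response.headers["Strict-Transport-Security"] = "max-age=31536000; includeSubDomains"
         return response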

    Note that you may be attacked with CSRF even if you use HTTPS and your site does not contain XSS vulnerabilities. In short, this attack scenario assumes that the victim (a user of your site or service) is already logged in and has a valid cookie, and that the victim's browser is then made to send a request to your (supposedly secure) site. If you do not have protection against CSRF, the attacker can still execute actions with the victim's credentials. The attacker cannot see the server's response, because it is delivered to the victim's browser, but the damage is usually already done at that point.
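
    A minimal sketch of the synchronizer-token defence against CSRF that this note refers to (Flask-style Python; a real application would normally use its framework's built-in CSRF protection rather than rolling its own):

     import secrets
     from flask import Flask, abort, request, session

     app = Flask(__name__)
     app.secret_key = "replace-with-a-real-random-secret"

     def issue_csrf_token():
         # Call this when rendering a state-changing form: remember a random
         # token in the session and embed the same value in a hidden field.
         token = secrets.token_hex(32)
         session["csrf_token"] = token
         return token

     @app.route("/transfer", methods=["POST"])  # hypothetical state-changing endpoint
     def transfer():
         # A forged cross-site POST cannot include the token (the attacker's
         # page cannot read it), so reject requests that fail to echo it back.
         stored = session.get("csrf_token")
         sent = request.form.get("csrf_token", "")
         if not stored or not secrets.compare_digest(sent, stored):
             abort(403)
         return "transfer accepted"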

  • 2020-11-22 05:48
    1. SECURITY as safety of data IN TRANSIT: no difference between POST and GET.

    2. SECURITY as safety of data ON THE COMPUTER: POST is safer (parameters don't end up in the URL or browser history).
