robots.txt

What is the use of the hackers.txt file?

两盒软妹~ Submitted on 2020-06-24 03:04:38
Question: First, no, I am not asking you to teach me hacking; I am just curious about this file and its contents. My journey: when I dived into the new HTML5 Boilerplate I came across humans.txt. I googled it and landed on this site: http://humanstxt.org/. Immediately my attention went to this picture: Do I read this correctly? Hackers.txt? So I resumed my journey in Google and stopped at these articles. When I started reading them I had the feeling that it's about the difference between hackers and…

Multisite TYPO3 v9, distinct robots.txt for multiple domains on one rootpage

扶醉桌前 Submitted on 2020-04-30 06:46:47
Question: For marketing purposes I maintain one identical website with two different domains. In TYPO3 v8 I would simply add a domain record on the root page and create a personalised robots.txt with TypoScript for each site (through realurl)... With v9 I cannot find a way to do this. I tried to enter various annotations in config.yaml manually, but nothing works (i.e. I tried to replicate the annotation for the URL)...

    routes:
      -
        route: robots.txt
        type: staticText
        content: "User-agent: *\r\nDisallow:…
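For context, a complete route of this kind normally lives in the site configuration file. The following is a minimal sketch only; the config path and the Disallow path are assumptions for illustration:

    # config/sites/<identifier>/config.yaml (illustrative path)
    routes:
      -
        route: robots.txt
        type: staticText
        content: |
          User-agent: *
          Disallow: /typo3/

The staticText route type makes the site answer requests for /robots.txt with the given content verbatim, so each site configuration can carry its own robots.txt text.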

Sitemap reference in robots.txt for each TLD

十年热恋 Submitted on 2020-02-05 22:16:07
Question: We are using robots.txt to reference our sitemap index file. Now we are rolling the site out to new countries. Our website under the .de TLD provides a robots.txt containing a reference to our index file. The index file refers to different sitemaps containing our .de links in the loc XML node. Other locales (e.g. for .fr) are listed with xhtml:link below. Example:

    <url>
      <loc>https://xy.de/hallo</loc>
      <xhtml:link>https://xy.fr/hello</xhtml:link>
    </url>

The question is now: should we add a robots.txt…
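For reference, the usual pattern is that each host serves its own robots.txt pointing at a sitemap index via the standard Sitemap directive. A minimal sketch; the host name and file name below are assumptions:

    # robots.txt served at https://xy.fr/robots.txt (illustrative)
    User-agent: *
    Disallow:

    Sitemap: https://xy.fr/sitemap_index.xml

Per the sitemaps.org protocol, a Sitemap line in robots.txt may even reference a sitemap on another host, but a per-TLD robots.txt keeps each domain's reference self-contained.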

Robots.txt file [closed]

孤街浪徒 Submitted on 2020-01-24 00:32:11
Question: I am using this in my robots.txt file:

    User-agent: *
    Disallow:

But one of my competitors is using:

    User-agent: *
    Disallow: /

and his site is performing well in Google, on the first rank, while my site is not ranking. I have checked everything on my site and it is OK. But how is my competitor performing well in…
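For reference, those two directives mean opposite things; this is standard robots.txt semantics:

    # An empty Disallow blocks nothing: all crawlers may fetch every URL.
    User-agent: *
    Disallow:

    # "/" matches every path: compliant crawlers are barred from the whole site.
    User-agent: *
    Disallow: /

So the competitor's file tells crawlers to stay out entirely; whatever is making that site rank, it is not being helped by its robots.txt. Disallowed URLs can still appear in results via external links, but without crawled content.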

robots.txt htaccess block google

和自甴很熟 Submitted on 2020-01-14 12:39:09
Question: In my .htaccess file I have:

    <Files ~ "\.(tpl|txt)$">
        Order deny,allow
        Deny from all
    </Files>

This prevents any text file from being read, but the Google search engine gives me the following error:

    robots.txt Status
    http://mysite/robots.txt
    18 minutes ago  302 (Moved temporarily)

How can I modify .htaccess to permit Google to read robots.txt while prohibiting everyone else from accessing text files?

Answer 1: Use this:

    <Files ~ "\.(tpl|txt)$">
        Order deny,allow
        Deny from all
        SetEnvIfNoCase User-Agent…
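The answer is cut off above. A sketch of where it appears to be heading, assuming Apache 2.2-style mod_access directives (mod_access_compat on Apache 2.4); the bot pattern is an assumption:

    <Files ~ "\.(tpl|txt)$">
        Order deny,allow
        Deny from all
        # Set an env var when the client claims to be Googlebot, then let it through.
        SetEnvIfNoCase User-Agent "Googlebot" goodbot
        Allow from env=goodbot
    </Files>

Note that the User-Agent header is trivially spoofed, and robots.txt is meant to be publicly readable anyway; a simpler variant is to add a second <Files "robots.txt"> section with Allow from all after the pattern block, since later <Files> sections merge over earlier ones and re-open just that one file.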