
.htaccess rewrite multiple urls without redirects

Submitted by 橙三吉。 on 2019-12-14 02:56:23

Question: I have URLs like demo1.domain.net, demo2.domain.net, and demo3.domain.net, and each needs to show the contents of a subfolder. For example, demo1.domain.net should show the contents of domain.net/websites/demo1; in general, every $whatever.domain.net should map to domain.net/websites/$whatever. I have already enabled wildcard DNS, but I can't seem to figure out the rewrite rules. Please keep in mind that I don't want a redirect. Here is what I tried: RewriteEngine on
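A minimal mod_rewrite sketch of the mapping described above, assuming the wildcard vhost already serves domain.net's document root (untested, for illustration):

```apache
RewriteEngine on

# Skip the bare domain and www so only real subdomains are remapped.
RewriteCond %{HTTP_HOST} !^(www\.)?domain\.net$ [NC]
# Capture the subdomain into %1.
RewriteCond %{HTTP_HOST} ^([^.]+)\.domain\.net$ [NC]
# Don't touch paths already inside /websites/ (avoids rewrite loops).
RewriteCond %{REQUEST_URI} !^/websites/
# Internal rewrite only: no [R] flag, so the browser URL is unchanged.
RewriteRule ^(.*)$ /websites/%1/$1 [L]
```

Whether the target folder is websites/demo1 or websites/demo1.domain.net depends on the directory layout; adjust the substitution string accordingly.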

Nofollow for the whole page?

Submitted by 笑着哭i on 2019-12-14 02:14:55

Question: Is it possible to add nofollow for the whole page instead of rel="nofollow" on every link? I've got some profile pages where users can enter their contacts and other content that could potentially be spam, and I'd rather not alter the filters for the WYSIWYG data.

Answer 1: Put this meta tag in the head: <meta name="robots" content="nofollow" /> Reference links: Using the robots meta tag; rel=nofollow; The right use of the NoFollow tag in site links – SEO Tips.

Source: https://stackoverflow.com/questions/12575290
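Where editing the HTML head is awkward, the same page-wide directive can also be sent as an HTTP response header; a sketch for Apache with mod_headers enabled:

```apache
# Sends the same page-wide nofollow as <meta name="robots" content="nofollow">,
# and also works for non-HTML responses.
Header set X-Robots-Tag "nofollow"
```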

Prevent folder redirect in htaccess

Submitted by 做~自己de王妃 on 2019-12-14 02:11:50

Question: The following rules are in an .htaccess file and need to remain: # GENERAL RewriteRule ^([A-Za-z_0-9\-]+)$ /index.php?page=$1 [QSA] RewriteRule ^([A-Za-z_0-9\-]+)/$ /index.php?page=$1 [QSA] RewriteRule ^([A-Za-z_0-9\-]+)/([a-z]+)$ /index.php?page=$1&comp=$2 [QSA] RewriteRule ^([A-Za-z_0-9\-]+)/([a-z]+)/$ /index.php?page=$1&comp=$2 [QSA] However, I need to prevent a specific folder from redirecting; let's call it /folder/. I can't seem to get it to work correctly and hope someone can help. Thanks.
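One common pattern (a sketch, not tested against this exact ruleset) is a pass-through rule placed before the GENERAL block, so requests for /folder/ stop rule processing before the catch-all rules run:

```apache
# Place ABOVE the GENERAL rules. "-" means "leave the URL unchanged",
# and [L] stops mod_rewrite from applying any later rules to the request.
RewriteRule ^folder(/.*)?$ - [L]
```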

Prevent cookieless subdomain for static assets from being indexed by search engines

Submitted by 五迷三道 on 2019-12-13 18:16:01

Question: I've created a new subdomain for all static assets (static.example.com) by creating a new A record pointing at the same server with a new IP address, and then creating a virtual host with the same DocumentRoot as the main www.example.com site. We've pointed all references to static resources at the static subdomain; however, all website resources can be accessed via either static.example.com or www.example.com. The problem is that Google has begun to index HTML files on the static subdomain.
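One common fix (a sketch; the paths below are placeholders) is to send an X-Robots-Tag header from the static vhost only, so crawlers drop the duplicate pages while the assets stay fetchable. This assumes Apache with mod_headers enabled:

```apache
<VirtualHost *:80>
    ServerName static.example.com
    DocumentRoot /var/www/example
    # Crawlers won't index pages served from this host, but will still
    # follow links and fetch the referenced assets.
    Header set X-Robots-Tag "noindex, follow"
</VirtualHost>
```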

HTML codes inside alt tags

Submitted by 本小妞迷上赌 on 2019-12-13 17:43:07

Question: Is it OK to put HTML code inside alt attributes? I have a slider that uses alt attributes for descriptions, and in order to style the description I have to put HTML code in it. My problem is I don't know whether it will harm SEO, or whether there is anything else to consider.

Answer 1: HTML markup is not valid in the contents of the alt attribute. If you need a fancy dialog box, you can easily accomplish this with JavaScript and maybe even plain old CSS. That way your code is valid and you don't run into any potential SEO issues.

Answer 2: Yes,
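A sketch of the workaround Answer 1 points toward: keep alt as plain text (valid and SEO-safe) and carry the styled caption in a separate data-* attribute that the slider script renders as HTML. Here data-caption is a made-up attribute name for illustration, not part of any slider's API:

```html
<img src="slide1.jpg"
     alt="Sunset over the harbor"
     data-caption="&lt;em&gt;Sunset&lt;/em&gt; over the harbor">
```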

Handling SEO Friendly URL with Non-English Characters

Submitted by 我的未来我决定 on 2019-12-13 17:30:22

Question: I have URLs like this: .com/topic.php?id=6 and I can convert them to this: .com/topic/5.html. This works, but now I want to convert to .com/topic/title.html. The "title" is dynamic, for example çağdaş, and can contain non-English characters like Ş, Ğ, or Ü. In this case, I first convert characters to acceptable equivalents, such as Ş to S or Ü to U. If I convert çağdaş to cagdas, my URL looks like this: .com/topic/cagdas.html. I have used cagdas in SQL queries to select the proper row, but in the database
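The transliteration step can be sketched in JavaScript (one approach among many; Turkish dotless ı needs special handling because it has no Unicode decomposition):

```javascript
// Build an ASCII slug from a Turkish title. Unicode NFD normalization
// splits letters like "ç" into a base letter plus a combining mark,
// which we then strip.
function slugify(title) {
  return title
    .replace(/ı/g, "i")               // dotless ı has no combining form
    .normalize("NFD")                 // ç -> c + U+0327, ğ -> g + U+0306, ...
    .replace(/[\u0300-\u036f]/g, "")  // drop the combining marks
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")      // collapse everything else to "-"
    .replace(/^-+|-+$/g, "");         // trim leading/trailing dashes
}

console.log(slugify("çağdaş")); // -> "cagdas"
```

The usual companion to this, as the question hints, is storing the generated slug in its own database column and selecting rows by that column, rather than re-deriving it from the title on every lookup.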

How to disallow service api and multilingual urls in robots.txt

Submitted by 随声附和 on 2019-12-13 16:42:25

Question: I need to disallow the following URLs: the service API (/_s/user, /_s/place, ... everything starting with /_s/) and the save form (/{language}/save, for example /en/save, /ru/save, ...). NOTE: most URLs have a language parameter at the beginning (/en/event, ...); I don't want to block those. It should be something like this (but this is not allowed by the robots.txt format): Disallow: /_s/* Disallow: /:lang/save

Answer 1: In robots.txt, matching is from the left, so a rule matches anything that begins with /pattern. A wildcard like /
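The major crawlers (Google, Bing) support * and $ in robots.txt even though the original specification does not; under that assumption, a sketch:

```
User-agent: *
# Left-anchored match: this already blocks /_s/user, /_s/place, etc.
Disallow: /_s/
# * matches any language segment; $ anchors at the end, so paths
# like /en/saved or /en/save/x stay allowed.
Disallow: /*/save$
```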

Getting links using slug instead of id to be SEO-friendly URLs. Integrating Prismic.io with AngularJS

Submitted by 安稳与你 on 2019-12-13 16:18:29

Question: https://github.com/heshamelmasry77/javascript-angular-starter I am integrating the Prismic.io CMS with AngularJS, and it is working properly, but I am getting the links using the id of each document, which is not SEO friendly. I am trying to find a way to get the documents using their slug instead of the id, but it is very complicated. I used a function in the Prismic.io API called getByUID and tried to get by slug, but it is not working. In the repo I am posting here you will find the
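If the documents expose a uid, the link-building side can be a small resolver-style helper. A sketch follows; the route table and paths are hypothetical, and fetching a document would then use the assumed Prismic call getByUID(type, uid), not the slug directly:

```javascript
// Build SEO-friendly paths from a document's UID (its URL-safe slug)
// instead of its opaque id. `doc` mirrors the usual Prismic response
// shape ({ type, uid, id }); the routes below are made up for illustration.
function documentPath(doc) {
  var routes = {
    "blog-post": function (d) { return "/blog/" + d.uid; },
    "page": function (d) { return "/" + d.uid; }
  };
  var route = routes[doc.type];
  // Fall back to an id-based URL when the uid is missing or the type unknown.
  if (!route || !doc.uid) {
    return "/documents/" + doc.id;
  }
  return route(doc);
}

console.log(documentPath({ type: "blog-post", uid: "my-first-post", id: "WAbc12" }));
// -> "/blog/my-first-post"
```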

Nginx URL Rewrite with Multiple Parameters

Submitted by 梦想的初衷 on 2019-12-13 16:09:26

Question: I am trying to rewrite the following URL via Nginx: http://www.domain.com/script.php?title=LONGSTRING&desc=LONGSTRING&file=LONGSTRING&id=THREELETTERS into something like this: http://www.domain.com/script/LONGSTRING/LONGSTRING/LONGSTRING/LONGSTRING/THREELETTERS.html All I have been able to find so far is how to include a single variable; I have five variables to pass through, each terminated by a "/".

Answer 1: You can access a query parameter called name in nginx through the $arg_name variable
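Going the other direction, from the pretty URL back onto script.php, a rewrite with multiple captures can be sketched like this (untested; the segment order title/desc/file/id is assumed from the query string, and the extra fifth segment in the question's example would simply add one more capture group):

```nginx
# Map /script/<title>/<desc>/<file>/<id>.html onto script.php.
# The trailing "?" stops nginx from appending the original query string.
location /script/ {
    rewrite ^/script/([^/]+)/([^/]+)/([^/]+)/([^/]+)\.html$
            /script.php?title=$1&desc=$2&file=$3&id=$4? last;
}
```

The $arg_name variables mentioned in the answer go the opposite way: they read query parameters, which is useful when generating the pretty links from the old URLs.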

HTML meta “keywords”. Worth including?

Submitted by 妖精的绣舞 on 2019-12-13 14:42:31

Question: Do you use the "keywords" meta tag on your site, knowing that Google does not use it in page ranking (and has no plans to), and perhaps not even in search?

Answer 1: Yes, you do; Google is not the only search engine on the web, although it has the major market share. There are other engines, including Yahoo, which use the keywords meta tag to some extent.

Answer 2: No. I don't want competitors knowing what I am trying to rank for. Keywords are very valuable in some markets. If you found a good keyword phrase that is