Website Transactions (March 17, 2016)

* Validate links: if a link already exists, make sure it actually connects, to prevent a user from entering a link to a nonexistent website.

* Leave control of the robots.txt file to the webmaster: some content management systems let a page author edit the contents of robots.txt. In general, only the webmaster should modify this file, so that a user cannot, through ignorance, block robots from crawling a major part of the site (see the robots.txt sketch after this list).

* Avoid duplicate URLs: search engines are extremely strict about penalizing duplicate content, so we must ensure that each page exists under only one URL. If we still want users to reach the same content through alternative URLs, it is best to use 301 permanent redirects, which search engines do not penalize (see the redirect sketch below).
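As an illustration of why robots.txt edits should stay with the webmaster, the following hypothetical file shows how just two careless lines shut every crawler out of the entire site:

```
# Hypothetical robots.txt: this pair of lines blocks ALL robots
# from crawling ANY page on the site.
User-agent: *
Disallow: /
```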
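The article implies a PHP environment (the PHPSESSID parameter discussed below), so here is a minimal PHP sketch of a 301 permanent redirect from an alternative URL to the canonical one; the canonical address is a hypothetical placeholder:

```php
<?php
// Minimal sketch: answer requests for a duplicate URL with a
// permanent (301) redirect to the single canonical URL, so search
// engines consolidate both addresses instead of penalizing them.
// 'https://www.example.com/canonical-page' is a placeholder.
header('Location: https://www.example.com/canonical-page', true, 301);
exit;
```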

* Avoid session variables in the URL: if our portal performs electronic transactions or some other process that requires maintaining client state, it is preferable to use a session cookie rather than a session variable that appears in the URL. The reason is that if search engines detect a session parameter, they will stop crawling the page rather than index it as many separate pages: successive visits would produce URLs that differ only in the value of the session identifier, and each would be archived as a distinct copy of the same page. For example, a URL carrying a session ID parameter (PHPSESSID) will, if detected by search engines, prevent the page from being crawled for exactly this reason (a minimal PHP sketch follows).
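A minimal sketch, assuming a stock PHP setup (PHP is implied by the PHPSESSID name): these two standard ini settings keep the session ID in a cookie and out of the URL:

```php
<?php
// Keep the session ID out of URLs so crawlers never see PHPSESSID.
ini_set('session.use_trans_sid', '0');    // never rewrite URLs with the ID
ini_set('session.use_only_cookies', '1'); // accept the ID only from a cookie
session_start();                          // state now travels in the cookie
```

With these settings every visitor, and every crawler, sees one clean URL per page, so the content is indexed once instead of being split across session-stamped duplicates.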
