One of the most important search engine optimization (SEO) techniques is using an SEO-friendly URL structure. A good structure helps search engines index your site, yet surprisingly many sites still use unclear, problematic URLs.
Here are several ways to define your own SEO-friendly URL structure.
• Consolidate the www and non-www versions of your domain: Search engines generally treat the www and non-www versions of your domain as two separate sites, and both can end up in their indexes. These can be consolidated without any glitches. Most SEOs use a 301 (permanent) redirect to point one version of the site to the other (or the other way around). Alternatively, when you cannot set up a redirect, you can specify your preferred version in Google Webmaster Tools under Configuration >> Settings >> Preferred Domain.
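The 301 redirect described above can be sketched as an Apache rewrite rule; this is a minimal example assuming Apache with mod_rewrite enabled, with example.com standing in for your own domain:

```apache
# Redirect the non-www version of the domain to the www version
# with a 301 (permanent) redirect. Placed in .htaccess or the vhost config.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

To redirect in the opposite direction (www to non-www), swap the host in the condition and the target in the rule.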
• Avoid dynamic and relative URLs: Depending on your content management system, the URLs it generates may be readable, for example:
https://www.example.com/blog/seo-friendly-urls
or may turn out poorly, for example:
https://www.example.com/index.php?p=123&cat=7
Search engines have no trouble with either form, but static URLs are generally preferable to dynamic ones: they can contain your keywords and are easier for people to understand, since a visitor can often tell what a page is about just by looking at the URL.
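If your CMS produces dynamic URLs, one common way to expose static-looking ones is an internal rewrite. This is a sketch only, assuming Apache with mod_rewrite and a hypothetical index.php script that looks articles up by a slug parameter; adapt the pattern to your own CMS:

```apache
RewriteEngine On
# Internally map the readable URL /blog/some-article-slug
# to the dynamic script index.php?slug=some-article-slug.
# The visitor and the search engine only ever see the static form.
RewriteRule ^blog/([a-z0-9-]+)/?$ index.php?slug=$1 [L,QSA]
```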
• Create an XML sitemap: Do not confuse an XML sitemap with an HTML sitemap. The former is for search engines, while the latter is mainly intended for human visitors. An XML sitemap is essentially a list of your site's URLs that you submit to search engines. It serves two purposes: it helps search engines discover your site's pages more easily, and they can use it as a reference when choosing canonical URLs on your site.
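A minimal XML sitemap, following the sitemaps.org protocol, looks like the sketch below; example.com and the paths are placeholders for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to know about. -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-01-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/seo-friendly-urls</loc>
  </url>
</urlset>
```

The file is typically saved as sitemap.xml in the site root and submitted through Google Webmaster Tools.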
• Block inessential pages with robots.txt: There may be pages on your site that you want to keep out of search engines' reach. These could be your "Terms and conditions" page, pages with confidential information, and so on. It is better not to let these get indexed, since they generally contain none of your target keywords and only dilute the semantic focus of your site. The robots.txt file contains instructions telling search engine crawlers which pages of your site to ignore.
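A robots.txt file that blocks such pages can be as simple as the sketch below; the paths are placeholders for your own pages. The file must live at the root of the domain (e.g. /robots.txt):

```
# Apply to all crawlers.
User-agent: *
# Keep these sections out of search engine indexes.
Disallow: /terms-and-conditions/
Disallow: /private/
```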
• Specify canonical URLs using a special tag: Another way to indicate the canonical URL of a page is the canonical tag. It should be used only for the purpose of helping search engines choose your canonical URL.
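The canonical tag goes in the <head> of the duplicate (or parameterized) page and points at the preferred version; example.com and the path below are placeholders:

```html
<!-- Tells search engines that this page's preferred, canonical URL
     is the one given in href, even if the page was reached under
     a different address (e.g. with tracking parameters). -->
<link rel="canonical" href="https://www.example.com/blog/seo-friendly-urls" />
```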