Search Engine Friendly (SEO) Tips for ASP.Net Sites

By Bala Murugan
Posted On Mar 30, 2012
Category: ASP.Net


Most often, we develop applications to automate a business process, and the majority of our effort goes into architecting and designing how efficiently the application handles the business domain. If your application targets an Internet audience, however, there are some additional technical things to consider beyond the business domain. One of the main sources of visitors for such applications is search engines like Google, Bing and Yahoo. Hence, the end application should not only handle your business problems efficiently but also follow some simple rules so that it performs well on the internet. This article lists some simple guidelines to consider if your ASP.Net application is an internet site.

1.    Add a descriptive and unique page title for every page

Every page on your website should have a unique, descriptive title that tells visitors and search engines what the page offers. You can set the page title either declaratively or in the code-behind file. Refer below,

In ASPX,

<%@ Page Language="C#" AutoEventWireup="true" Title="My Home Page"  CodeFile="Default.aspx.cs" Inherits="_Default" %>

In code behind,

Page.Title = "My Home Page";

 

2.    Links should be hyperlinks; no LinkButton or JavaScript navigation for crawlable links

Make sure all the links on your page are plain hyperlinks. Search engines can crawl a page only if it is linked through a hyperlink (anchor tag). JavaScript navigation is not search engine friendly, since search engine crawlers do not execute script.
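As a sketch, the difference shows up in the rendered HTML: a LinkButton emits a javascript: postback href that crawlers cannot follow, while a HyperLink emits a plain anchor (the control IDs and page names below are illustrative):

```aspx
<%-- Not crawlable: renders href="javascript:__doPostBack('lbProducts','')" --%>
<asp:LinkButton ID="lbProducts" runat="server" OnClick="lbProducts_Click">Products</asp:LinkButton>

<%-- Crawlable: renders a plain anchor, <a href="Products.aspx">Products</a> --%>
<asp:HyperLink ID="hpProducts" runat="server" NavigateUrl="Products.aspx">Products</asp:HyperLink>
```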

 

3.    Use JavaScript navigation for site-related pages that have no search value

Page rank is distributed across the links on your page. Internal website pages such as About Us, Disclaimer, Registration, Login, Contact Us and user profile pages can be navigated through JavaScript so that page rank is not distributed to them. Doing this lets the rest of your crawlable content links benefit.
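One simple way to implement this, assuming you are fine with these pages being invisible to crawlers via this link, is to navigate with window.location instead of an anchor href (the page name below is illustrative):

```html
<!-- Crawlers do not execute script, so no page rank flows to Disclaimer.aspx -->
<span style="cursor:pointer" onclick="window.location='Disclaimer.aspx';">Disclaimer</span>
```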

 

4.    Add Meta Keyword and Description tag for every page

Add Meta keyword and Meta description tags with relevant content. Search engines use these tags to understand what the page offers. You can dynamically set the meta tags from the code-behind file using the code below,

HtmlHead head = (HtmlHead)Page.Header;

HtmlMeta metasearch1 = new HtmlMeta();
metasearch1.Name = "description";
metasearch1.Content = "my personal site";
head.Controls.Add(metasearch1);

HtmlMeta metasearch2 = new HtmlMeta();
metasearch2.Name = "keywords";
metasearch2.Content = "ASP.Net,C#,SQL";
head.Controls.Add(metasearch2);

The above code adds the below meta tags to the output HTML. (Note that the standard tag name is "description"; a misspelling such as "descriptions" would be ignored by search engines.)

<meta name="description" content="my personal site" />

<meta name="keywords" content="ASP.Net,C#,SQL" />

 

In ASP.Net 4.0, Microsoft added 2 new properties to the Page directive (Page object) that let you define the Meta keywords and description declaratively, or dynamically from the code-behind.

In ASPX,

<%@ Page Language="C#" AutoEventWireup="true" MetaKeywords="asp.net,C#" MetaDescription="This is an asp.net site that hosts asp.net tutorials" CodeFile="Default.aspx.cs" Inherits="_Default" %>

In codebehind,

protected void Page_Load(object sender, EventArgs e)
{
    Page.MetaKeywords = "asp.net,C#";
    Page.MetaDescription = "This is an asp.net site that hosts asp.net tutorials.";
}

 

The same thing can be achieved in previous versions of the .NET Framework by using a custom BasePage class. Read the below article to know more.

Adding Custom Property to Page Directive in ASP.Net 2.0
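As a minimal sketch of that approach (the class and property names here are illustrative, not taken from the linked article), pages can inherit from a common base class that injects the meta tags during PreRender:

```csharp
using System.Web.UI;
using System.Web.UI.HtmlControls;

// Hypothetical base class for pre-4.0 projects: derive your pages from it
// and set the two properties in each page's Page_Load.
public class BasePage : Page
{
    public string MetaDescriptionText { get; set; }
    public string MetaKeywordsText { get; set; }

    protected override void OnPreRender(System.EventArgs e)
    {
        base.OnPreRender(e);
        AddMeta("description", MetaDescriptionText);
        AddMeta("keywords", MetaKeywordsText);
    }

    private void AddMeta(string name, string content)
    {
        // Skip if the page has no <head runat="server"> or no content was set
        if (string.IsNullOrEmpty(content) || Header == null) return;
        HtmlMeta meta = new HtmlMeta();
        meta.Name = name;
        meta.Content = content;
        Header.Controls.Add(meta);
    }
}
```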

 

5.    Make descriptive URLs

Make your website URLs descriptive. URLs with lots of query string values and numeric IDs are not descriptive; a descriptive URL provides enough information about what the page offers. For example, http://www.example.com/products.aspx?catid=C91E9918-BEC3-4DAA-A54B-0EC7E874245E is not as descriptive as http://www.example.com/Electronics

Apart from other parameters, search engines also consider the website URL when matching your page to a searched keyword.

Read the below article on codedigest.com to learn how to make search engine friendly URLs in ASP.Net.

Search Engine Friendly URL’s Using Routing in ASP.Net 3.5

You can also use URL rewriting modules for this.
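For example, in ASP.NET 4.0 you can register a page route in Global.asax so that a descriptive URL such as /Electronics serves a physical products page (the route name, URL pattern and page name below are illustrative; ASP.Net 3.5 routing is set up differently, as the linked article describes):

```csharp
using System;
using System.Web.Routing;

// In Global.asax
protected void Application_Start(object sender, EventArgs e)
{
    // Maps http://www.example.com/Electronics to Products.aspx;
    // read the value in the page with Page.RouteData.Values["category"].
    RouteTable.Routes.MapPageRoute(
        "ProductsByCategory", "{category}", "~/Products.aspx");
}
```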

 

6.    Add Alt for images, Title for Anchor

Add ALT text for images and a Title for hyperlinks. The ALT text is displayed when the browser cannot display the image for some reason. Search engines cannot read the image itself, so the ALT text gives a hint about the image that the search engine can use.

<asp:Image ID="imLogo" runat="server" AlternateText="My company Logo" ImageUrl="logo.gif" />

<asp:HyperLink ID="hpHome" runat="server" ToolTip="My Website Home" Text="Home" NavigateUrl="Home.aspx"></asp:HyperLink>

 

The above ASP.Net markup produces the output below,

<img id="imLogo" src="logo.gif" alt="My company Logo" style="border-width:0px;" />

<a id="hpHome" title="My Website Home" href="Home.aspx">Home</a>

 

7.    Handle ViewState properly, don’t overload the ViewState

ViewState is an encoded string populated by ASP.Net to maintain the state of controls across postbacks. This string is saved to a hidden field near the top of every page and transported with the HTML output. The ViewState string is often long and heavy. Since ViewState has no search value, it can be a real hindrance to search engines trying to find the real content in your page, and some search engines may restrict the size of the pages they index.

Hence, handle the ViewState in your pages efficiently. Turn off ViewState for controls that don't require it.

You can set EnableViewState="false" to turn off ViewState at the control level, page level (@Page directive) or config level (<pages> section).
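For reference, the three levels look like this (the control ID and text are only examples):

```aspx
<%-- Control level --%>
<asp:Label ID="lblHeading" runat="server" EnableViewState="false" Text="Welcome" />

<%-- Page level --%>
<%@ Page Language="C#" EnableViewState="false" %>

<%-- Config level, inside <system.web> in web.config --%>
<pages enableViewState="false" />
```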

 

8.    Design your page lighter, with fewer images and less Flash and Silverlight content

Try to design your pages with fewer media elements such as images, Flash objects, Silverlight objects, ActiveX objects, etc. Search engines can only read HTML content. A page built entirely in Flash or Silverlight is not search engine friendly, since the search engine robots cannot find any textual content in it.

 

9.    Do a permanent redirect with the proper return code to retain the page rank

If you have moved a page to a different URL or changed your domain to a new domain, you should redirect to the new location by returning an HTTP status code of 301 (Moved Permanently). This is called permanent redirection, and it ensures the existing page rank is carried over to the new page.
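In ASP.NET 4.0 this is a one-liner; in earlier versions you emit the status and Location header yourself (the target URL below is a placeholder):

```csharp
// ASP.NET 4.0 and later
Response.RedirectPermanent("http://www.example.com/new-page");

// Earlier versions: send the 301 manually
Response.Status = "301 Moved Permanently";
Response.AddHeader("Location", "http://www.example.com/new-page");
Response.End();
```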

Read the below codedigest article, which discusses some scenarios where permanent redirects can be used for search engine optimization.

How to Redirect URLs/domain Without www to With www and vice versa - Permanent Redirection in ASP.Net?

SEO improvements in ASP.Net 4.0

 

10. Add rel=”nofollow” to external links

Add rel="nofollow" to user-contributed links that point outside your site. Sometimes an external link posted by a user poses a security threat (it may download malware that infects your users) or points to a spam-generating site. Marking such links nofollow protects your site from being penalized by search engines.
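For example, a user-contributed link would be rendered like this (the URL is a placeholder):

```html
<a href="http://external-site.example/some-page" rel="nofollow">user submitted link</a>
```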

Previously, when you added rel="nofollow" to an anchor tag, search engines would not pass page rank through that link, which let the remaining links on the page take a larger share of the page rank. The implementation has since changed: the nofollow link still consumes its share of the page rank, but that share is no longer redistributed to the other links on the page. Read here to know more about how nofollow affected page rank previously and now.

 

11. Use Header tags

Use header tags (H1, H2, H3, H4, H5 and H6) wherever appropriate instead of styling text in SPAN tags. Header tags are search engine friendly, and you can use them to organize your page headings and subheadings.

For example, you can put your page's topmost heading in H1, subheadings in H2, sub-subheadings in H3, and so on, to represent a proper hierarchy of your page content.
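The heading structure of an article page might look like the following (the headings are illustrative):

```html
<h1>Search Engine Friendly (SEO) Tips for ASP.Net Sites</h1>
<h2>Handle ViewState properly</h2>
<h3>Turning off ViewState at the control level</h3>
<h2>Build SiteMap</h2>
```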

 

12. No inline CSS styles and JavaScript code

Always keep your CSS styles and JavaScript in separate files. This keeps your pages clean, simple and lightweight, and lets search engines find and index the page content more easily.

               

13. Unique URL for a Page

Search engines like Google treat a page with the URL http://www.example.com/Default.aspx as different from http://example.com/Default.aspx, even though they serve the same page on a website. This can lead the search engine to penalize your website for duplicate content. Hence, always allow a single unique URL to identify a page. You can handle this scenario by doing a permanent redirect to one URL. Read the below article to handle this scenario in ASP.Net.

How to Redirect URLs/domain Without www to With www and vice versa - Permanent Redirection in ASP.Net?

You can also use Google Webmaster Tools to configure this restriction.
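One way to enforce a single host name is a check in Global.asax, sketched below assuming www.example.com is the canonical form (both host names are placeholders):

```csharp
using System;
using System.Web;

// In Global.asax
protected void Application_BeginRequest(object sender, EventArgs e)
{
    HttpContext context = HttpContext.Current;
    if (context.Request.Url.Host == "example.com")
    {
        // Permanently redirect example.com/... to www.example.com/...
        string url = context.Request.Url.ToString()
            .Replace("://example.com", "://www.example.com");
        context.Response.Status = "301 Moved Permanently";
        context.Response.AddHeader("Location", url);
        context.Response.End();
    }
}
```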

 

14. Make SEO friendly pagers

Always construct search engine friendly pager links when displaying a list of items on a summary page, for example a product list or article list page. A link is search engine friendly if it is an anchor tag (<a>) whose href attribute holds a URL reachable through a GET request.

Read the below article to build search engine friendly pager for GridView control in Asp.Net.

Search Engine Friendly Pager for ASP.Net GridView control
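The idea behind a crawlable pager is that each page number is a plain anchor with the page index in the query string, not a postback LinkButton (the page name and parameter below are illustrative):

```html
<!-- Every page of results is reachable via a plain GET request -->
<a href="Articles.aspx?page=1">1</a>
<a href="Articles.aspx?page=2">2</a>
<a href="Articles.aspx?page=3">3</a>
```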

 

15. Limit the number of links per page

Previously, Google would only index a limited number of links per page (around 100). That restriction no longer applies, but it is still advisable to keep the number of links on your pages limited, to avoid any adverse effect on your site's rank, to prevent link spamming and to preserve the page rank.

 

16. Build SiteMap

Always have a sitemap that guides users and search engines through your site's pages. It is good practice to have 2 sitemaps for a site: an XML sitemap file used by the search engines and an HTML sitemap page for the website users. Refer here to know more about creating an XML sitemap for search engines. You can submit your XML sitemap or RSS feed to Google Webmaster Tools.
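A minimal XML sitemap, following the sitemaps.org protocol, looks like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-03-30</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/Electronics</loc>
  </url>
</urlset>
```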

 




Things to be aware of

1.    Design an efficient navigation system where every content page can be reached in a few clicks.

2.    Any content populated dynamically through JavaScript or Ajax calls is not search engine friendly and will not be indexed.

3.    Always use an anchor tag with the target URL in its href attribute to navigate from one page to another. Search engines understand only hyperlink-based navigation; JavaScript navigation is not search engine friendly.

4.    Use http://www.google.com/safebrowsing/diagnostic?site=aspalliance.com (replacing the site parameter with your own domain) to detect malicious activity.

5.    Avoid duplicate content and thin content on your website. A page with very little, low-quality content that does not serve its purpose is called thin content.

6.    Avoid too many ads on a page, and pages created just for ads.

 

Some Useful tools

Google Webmaster Tools

https://www.google.com/webmasters/tools

Google Analytics

http://www.google.com/analytics/

Social plug-in

Integrate some social plug-ins to gain visibility and traffic through social media.

To get the Facebook plug-ins for your site,

http://developers.facebook.com/docs/plugins/

SEO Guides

http://www.seomoz.org/beginners-guide-to-seo

http://www.google.com/webmasters/docs/search-engine-optimization-starter-guide.pdf

http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=35769

 

Conclusion

To conclude, following these tips alone will not make your website popular. The website should have quality, unique content that is useful to your target audience. Since content-based websites mostly depend on search engines for traffic, make sure you never violate any search engine guidelines. Beware of the Google Panda update, rolled out by Google a year ago, which may penalize your website for bad content. And remember, too much optimization may also have an undesirable effect with search engines.
