28 January 2014

How to Get Started with Local SEO

It's very difficult for a small business to compete online. It's really hard to beat the online giants, with their better sites, greater reach and deeper pockets. But you can promote your small business on the one front where even the likes of Amazon.com can't easily follow you: locality.
  1. Local Business Search on Mobile - These days people search for local businesses on mobile. According to a comScore study, mobile search grew by 20%, compared to a 6% fall in desktop search.
  2. G+ Local Page Creation - You can start your local SEO by creating a Google+ Local page for your business.
  3. Local Business Listing - List your small business in local directories.
  4. Local Store Creation - Create local stores and list your products locally.
  5. Local Classified Ads - Local classified ads for your offers, deals and discounts will help you generate business leads locally.
Source Link - http://www.business2community.com/seo/get-started-local-seo-0752312#!tEZqf

01 January 2013

Use Google’s Data Highlighter to Improve SEO

What is Google Data Highlighter?

Data Highlighter is a webmaster tool launched by Google for highlighting event-related data on your website. It lets you tag the data fields on your site with your mouse. Google can then display your data more attractively in the SERPs and in other products such as the Google Knowledge Graph.

Data Highlighter to Improve SEO

Many CMS-based websites cannot be changed structurally, so it's hard to mark up the data on such a site in the way that makes the most of Google's new features. If you cannot edit your site's source to mark up structured data, you can use the Data Highlighter tool in Google Webmaster Tools to tell Google where the important data sits on your pages.

The extra information displayed in search results next to your web page listing is known as a "Rich Snippet".
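For comparison, here is what marking up an event by hand looks like with schema.org microdata - a minimal sketch, with made-up event details, of the kind of tagging Data Highlighter does for you:

<div itemscope itemtype="http://schema.org/Event">
  <span itemprop="name">Winter SEO Meetup</span>
  <time itemprop="startDate" datetime="2013-02-15T19:00">Feb 15, 7:00pm</time>
  <div itemprop="location" itemscope itemtype="http://schema.org/Place">
    <span itemprop="name">City Conference Hall</span>
  </div>
</div>

Either way, Google ends up with the same structured data; Data Highlighter simply saves you from editing the HTML.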

Source: About Google Data Highlighter

29 January 2012

Different Titles in Google SERPs

Google does not give much weight to the meta description as a ranking factor. But these days Google has started treating the page title somewhat like the meta description, showing a different title in its search engine result pages. Google is not even displaying the titles of its own pages properly in the SERPs.

I have noticed that Google is not displaying the exact title tag in the search engine result pages for multiple websites. Google ignores the title tag and shows its own, which can dramatically decrease the CTR of those links. No one is sure why this is happening, but I think Google is testing its own ideas for titles.

11 December 2011

Google AdWords - Keyword Matching Options

If you are using Google AdWords to run a PPC campaign, you should know about keyword matching options. You can use these options to run an effective and successful PPC campaign and to create cost-effective ads.

Google AdWords provides four keyword matching options that determine which Google searches can trigger your ads to appear. These options help you control who sees your ads.

Broad Match – Sport Shoe
Your ads are eligible for search terms that contain either or both words (sport and shoe), in any order.

Phrase Match – "Sport Shoe"
Your ads are eligible for search terms that contain "sport shoe" in the same order.

Exact Match – [Sport Shoe]
Your ads are eligible only for search terms that exactly match the keyword sport shoe.

Negative Match: -used
If your keyword is sport shoe and you add a "-" to "used" (-used), your ad will not appear for any search term that contains the word "used".

02 October 2011

Microsoft's Bing Offers Airport Maps

On 29 September 2011, Microsoft's Bing announced detailed airport maps as the newest addition to its popular venue maps on Bing Maps. The new Bing airport maps give you everything you need to navigate your way through your travels. Airport maps include information on parking garages, ticket counter locations, terminals and gates, baggage claims, currency exchange and more! You'll also see a sortable directory of airlines, cafes and restaurants.

Airport Maps - There are two ways to find them:

1. Just search for the airport (by name, city, or even code) on Bing Maps, and zoom in using the map.

2. Search for your flight status on Bing (for example: CO 1665), and click the map link beside the airport to see its map.

There are maps for 42 airports covering most of the medium and large hub airports in the United States.

Source: Microsoft's Bing Airport Maps

20 September 2011

What is Google Panda?

Google Panda - Google introduced "Panda" to serve better-quality results to its users. Technology makes it very easy to develop a large website in a short time, and many of the sites created every day are poorly developed, so the web has seen a rapid increase in low-quality pages. Google needed a way to detect pages with low-quality content in order to keep delivering quality results to its users. That's why Google introduced Panda: to remove spam results from its SERPs.

Panda is a ranking factor that has been added to the Google ranking algorithm. It is basically a "filter" designed to identify low-quality pages (articles and blog posts simply copied and pasted from other websites) and effectively flag those sites. Panda will help Google reduce spam.

03 September 2011

How to Calculate Maximum CPA Bids?


Your maximum CPA bid is based on your current maximum cost-per-click (CPC) bids and conversion rates.


How to calculate Conversion Rate?


Conversion Rate = Total Conversions / Total Clicks × 100

Example to calculate Maximum CPA Bid

Suppose you are running one ad group with two different keywords: "Burger" and "Veg Burger".

Keyword                  Burger      Veg Burger
Current Max CPC Bid      $1.00       $1.40
Conversion Rate          10%         20%
No. of Conversions       100         50


To get a maximum CPA bid for "Burger"

Maximum CPA Bid = Max. CPC/Conversion Rate = $1.00/10% = $10.00


To get a maximum CPA bid for "Veg Burger"

Maximum CPA Bid = Max. CPC/Conversion Rate = $1.40/20% = $7.00


How do I set up goals and funnels?

In order for Google Analytics to calculate goal conversion metrics, you must create one or more goals. Before setting up a goal, make sure you have the following information ready.

Requirements

* The name of the goal: Specify a name that you will recognize when viewing the goals within each set of your reports. Examples of names you might use include "email sign-up" or "article ABC download."
* The defined funnel: You may specify up to ten pages in a defined funnel. Although funnels are optional, defining one can help you map where visitors drop off during the path to completing a goal.
* The value of the goal: Google Analytics uses an assigned goal value to calculate ROI, Average Score, and other metrics. A good way to value a goal is to evaluate how often the visitors who reach the goal become customers. If, for example, your sales team can close 10% of people who request to be contacted, and your average transaction is $500, you might assign $50 (i.e. 10% of $500) to your "Contact Me" goal. In contrast, if only 1% of mailing list signups result in a sale, you might only assign $5 to your "email sign-up" goal.

Resource Link: Google Analytics Help

30 July 2011

301 Redirect - non-www to www


Apache Server

301 Permanent Redirect - non-www to www (Only for Websites Hosted on Apache Server)

1. Install "Mod_Rewrite" on a server.

2. To create a .htaccess file, open Notepad and save the file as htaccess.txt (Windows will not let you save a file whose name begins with a dot).

3. If you already have a .htaccess file on your server, download it to your desktop for editing.

4. Place this code in your htaccess.txt file:

RewriteEngine On
# Redirect all requests for dipseducation.com to www.dipseducation.com
RewriteCond %{HTTP_HOST} ^dipseducation\.com [NC]
RewriteRule (.*) http://www.dipseducation.com/$1 [R=301,L]

5. Save the htaccess.txt file.

6. Upload this file to the root folder of your server.

7. After uploading the htaccess.txt file to your root folder, rename it to ".htaccess" on the server.

8. Test it by typing in the old non-www address. You should immediately be taken to the www location.

Windows Server

301 Permanent Redirect - non-www to www (Only for Websites Hosted on a Windows Server)

1. Install "ISAPI_Rewrite" on a server.

2. To create a httpd.ini file, open Notepad and save the file as httpd.ini.

3. If you already have a httpd.ini file on your server,
download it to your desktop for editing.

4. Place this code in your httpd.ini file:

[ISAPI_Rewrite]
# Redirect all requests for dipseducation.com to www.dipseducation.com
RewriteCond Host: ^dipseducation\.com
RewriteRule (.*) http\://www\.dipseducation\.com$1 [I,RP]

5. Save the httpd.ini file.

6. Upload this file to the root folder of your server.

7. Test it by typing in the old non-www address. You should immediately be taken to the www location.

01 February 2011

What is Code to Text Ratio?

Code to Text Ratio

The code-to-text ratio is the percentage of actual text on a web page. A content ratio tool extracts the text from paragraphs and the anchor text from the HTML code, then calculates the ratio from this information.

Why Is the Code to Text Ratio Important for SEO?

The code-to-text ratio is used by search engines as one signal of a web page's relevancy. A higher ratio of text to code is considered good for ranking purposes.

An Example to calculate code to text ratio percentage of a web page is given below:

Web Page Size: 17481 bytes ≈ 17 KB

Code Size: 13692 bytes ≈ 13 KB

Text Size: 3789 bytes ≈ 4 KB

Code to Text Ratio = Text Size / Web Page Size × 100

Code to Text Ratio = 3789 / 17481 × 100 = 21.67%

31 January 2011

What is canonical error?

What is Canonical Error?

A canonical error occurs when your home page opens at both "http://www.site.com" and "http://site.com". The canonical URL (the best version of the URL) is sometimes also referred to as the "preferred domain".

Canonicalization (methods to fix this problem)

1. Set your preferred domain (using Google Webmaster Tools).

2. Specify the canonical link for each version of the page.
For example:
<link rel="canonical" href="http://www.canonical.com/" />
Add this extra information to the <head> section of the non-canonical URLs.

3. Indicate your canonical (preferred) URLs by including them in a Sitemap

4. Indicate how you would like Google to handle dynamic parameters. (by using Parameter Handling)

Source: Google Webmaster Central


25 October 2010

How to get a top 10 position in SERPs?


1.    If you don't want too much competition from other SEOs, choose your keywords precisely. For example, instead of the keyword Loan, choose keywords like Bank Loan, Equity Loan, Student Loan, Home Loan, etc.

The order of keywords also matters to search engines. A search engine treats Loan Equity and Equity Loan as different keywords.

2.    A best SEO practice is to get at least one of your primary keywords into the domain or subdomain name of your website. You can use hyphens (-) to separate multiple keywords. For example, seo-service, seo-guidelines and free-seo each cover two keywords.

3.    Get your second or third keywords into your directory names and filenames. For example, http://www.hiddentricks.com/seo/free-tips.html is good for the keywords "free seo tips", "seo hidden tricks" or "free seo tricks".

4.    Keep your web page free from syntax errors: declare the document type at the beginning and validate your HTML and CSS, because search engines don't like pages with too many errors.

5.    Give your page a short title of 3-9 words (60-80 characters maximum) containing your primary keyword. Remember, it will be displayed in search results, so choose wisely.

6.    Try to include your most important keyword phrases in the heading tags on your page if you can, but keep in mind they should not be exactly the same as the title of your page. You can use H1, H2 and H3 tags for anything important. To reduce the displayed size of a heading, use CSS.

7.    Specify meta keywords in the head of the document, limited to 15-20 words. Not all search engines give them importance, but there is no harm in doing it; search engines like Yahoo still give them some weight.

8.    Write an attractive meta description tag containing your keywords, because it will appear on the search engine result pages.

9.    Use text for your navigation menu instead of images or JavaScript.

10.    Try to include your most important keyword in hyperlinked text and in the text that immediately precedes or follows the hyperlink. Don't always use the same keyword; use synonyms in a few places. For example, instead of SEO, I have used "search engine optimization" in many places on this page.

11.    If you are using images, use the "alt" attribute to describe each image with a proper keyword.

12.    One of the best webmaster guidelines is to submit a sitemap of your website, to make sure all of its pages are indexed by search engine crawlers.

13.    Keep the size of your web pages under 50 KB so they download fast and visitors don't have to wait long. For good SEO, the ideal page size is around 15 KB.

14.    Try to avoid putting your content in Flash, frames, images or JavaScript, because crawlers find these very difficult to read; relying on them goes against SEO guidelines.

15.    Don't use dynamic URLs, because they don't contain keywords and so are not search engine friendly. If you are using a script that generates dynamic pages, make sure each URL includes at least one keyword.

16.    Don't spam, and never use methods like cloaking, keyword stuffing or doorway pages. Many SEOs advise having multiple domain names that link to each other, but search engines can penalize you for this. Instead, try to add more quality content to your existing website.

17.    Submit your website only once to Google, Yahoo, AltaVista, the other search engines and the open directory. Don't use any script or website for automatic submission.

18.    If your website's content changes very often, provide visitors with a newsletter and an RSS feed.

19.    Write articles on higher-ranking websites related to yours and leave your website's link.

20.    Get links from other sites related to yours; search engines consider each one a vote in your favour.


Source Link: Hiddentricks

26 June 2010

Cookie Vs Cache


Cookie
A cookie is a piece of information that a website (web server) puts on your hard disk while you browse the site through your browser, so that it can remember something about you, such as your login details, for future use.

Cache
A cache holds copies of web pages stored locally on an Internet user's hard drive or within a search engine's database. When you return to a page you have recently visited, the browser can retrieve it from the cache rather than from the original server.
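
As an illustration, here is a minimal sketch of the HTTP response headers involved in both (the cookie name and values are made up):

HTTP/1.1 200 OK
Content-Type: text/html
Set-Cookie: session_id=abc123; Path=/; Expires=Sat, 25 Jun 2011 12:00:00 GMT
Cache-Control: max-age=3600

On later requests the browser sends the cookie back (Cookie: session_id=abc123) so the server can recognize the user, while Cache-Control tells the browser it may reuse its local copy of the page for up to an hour without contacting the server at all.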

01 April 2010

Robots.txt Vs Robots Meta Tag

Robots.txt

While Google won't crawl or index the content of pages blocked by robots.txt, we may still index the URLs if we find them on other pages on the web. As a result, the URL of the page and, potentially, other publicly available information such as anchor text in links to the site, or the title from the Open Directory Project (www.dmoz.org), can appear in Google search results.

In order to use a robots.txt file, you'll need to have access to the root of your domain (if you're not sure, check with your web hoster). If you don't have access to the root of a domain, you can restrict access using the robots meta tag.

Robots Meta Tag

To entirely prevent a page's contents from being listed in the Google web index even if other sites link to it, use a noindex meta tag. As long as Googlebot fetches the page, it will see the noindex meta tag and prevent that page from showing up in the web index.

When we see the noindex meta tag on a page, Google will completely drop the page from our search results, even if other pages link to it. Other search engines, however, may interpret this directive differently. As a result, a link to the page can still appear in their search results.

Note that because we have to crawl your page in order to see the noindex meta tag, there's a small chance that Googlebot won't see and respect the noindex meta tag. If your page is still appearing in results, it's probably because we haven't crawled your site since you added the tag. (Also, if you've used your robots.txt file to block this page, we won't be able to see the tag either.)

If the content is currently in our index, we will remove it after the next time we crawl it. To expedite removal, use the URL removal request tool in Google Webmaster Tools.

Source Link: Google Webmaster Central

31 December 2009

Viral Marketing

What is Viral Marketing?

Viral marketing is an effective way of advertising or producing brand exposure through social networks. The technique spreads a promotional message rapidly throughout the network by word of mouth or via Internet channels like e-mail, blogs and other services.

Viral Marketing Techniques

  • Building a community.
  • Running an affiliate program.
  • Offering free stuff (like customized screensavers, e-books, tools).
  • Posting video clips.
  • Newsletters.
  • Email campaigns.
  • Article distribution.
  • Press releases.
  • Sharing information (like images, PDFs, PPTs).
  • Signature links.

These are just some viral marketing ideas that I hope will help.

30 December 2009

URL Rewriting



Introduction 

URL rewriting can be one of the best and quickest ways to improve the usability and search friendliness of your site. It can also be the source of near-unending misery and suffering. It is definitely worth playing with carefully - lots of testing is recommended. With great power comes great responsibility, and all that. There are several other guides on the web already that may suit your needs better than this one.

  • Apache URL Rewriting Guide - The best guide around
Before reading on, you may find it helpful to have the mod_rewrite cheat sheet and/or the regular expressions cheat sheet handy. A basic grasp of the concept of regular expressions would also be very helpful.

What is "URL Rewriting"?
Most dynamic sites include variables in their URLs that tell the site what information to show the user. Typically, this gives URLs like the following, telling the relevant script on a site to load product number 7.
http://www.pets.com/show_a_product.php?product_id=7
 
The problems with this kind of URL structure are that the URL is not at all memorable. It's difficult to read out over the phone (you'd be surprised how many people pass URLs this way). Search engines and users alike get no useful information about the content of a page from that URL. You can't tell from that URL that that page allows you to buy a Norwegian Blue Parrot (lovely plumage). It's a fairly standard URL - the sort you'd get by default from most CMSes. Compare that to this URL:
http://www.pets.com/products/7/

Clearly a much cleaner and shorter URL. It's much easier to remember, and vastly easier to read out. That said, it doesn't exactly tell anyone what it refers to. But we can do more:
http://www.pets.com/parrots/norwegian-blue/

Now we're getting somewhere. You can tell from the URL, even when it's taken out of context, what you're likely to find on that page. Search engines can split that URL into words (hyphens in URLs are treated as spaces by search engines, whereas underscores are not), and they can use that information to better determine the content of the page. It's an easy URL to remember and to pass to another person.
Unfortunately, the last URL cannot be easily understood by a server without some work on our part. When a request is made for that URL, the server needs to work out how to process that URL so that it knows what to send back to the user. URL rewriting is the technique used to "translate" a URL like the last one into something the server can understand.
Platforms and Tools
Depending on the software your server is running, you may already have access to URL rewriting modules. If not, most hosts will enable or install the relevant modules for you if you ask them very nicely.
Apache is the easiest system to get URL rewriting running on. It usually comes with its own built-in URL rewriting module, mod_rewrite, enabled, and working with mod_rewrite is as simple as uploading correctly formatted and named text files.
IIS, Microsoft's server software, doesn't include URL rewriting capability as standard, but there are add-ons out there that can provide this functionality. ISAPI_Rewrite is the one I recommend working with, as I've so far found it to be the closest to mod_rewrite's functionality. Instructions for installing and configuring ISAPI_Rewrite can be found at the end of this article.
The code that follows is based on URL rewriting using mod_rewrite.
Basic URL Rewriting
To begin with, let's consider a simple example. We have a website, and we have a single PHP script that serves a single page. Its URL is:
http://www.pets.com/pet_care_info_07_07_2008.php
 
We want to clean up the URL, and our ideal URL would be:
http://www.pets.com/pet-care/
 
In order for this to work, we need to tell the server to internally redirect all requests for the URL "pet-care" to "pet_care_info_07_07_2008.php". We want this to happen internally, because we don't want the URL in the browser's address bar to change.
To accomplish this, we need to first create a text document called ".htaccess" to contain our rules. It must be named exactly that (not ".htaccess.txt" or "rules.htaccess"). This would be placed in the root directory of the server (the same folder as "pet_care_info_07_07_2008.php" in our example). There may already be an .htaccess file there, in which case we should edit that rather than overwrite it.
The .htaccess file is a configuration file for the server. If there are errors in the file, the server will display an error message (usually with an error code of "500"). If you are transferring the file to the server using FTP, you must make sure it is transferred in ASCII mode rather than BINARY. We use this file to perform two simple tasks in this instance - first, to tell Apache to turn on the rewrite engine, and second, to tell Apache what rewriting rule we want it to use. We need to add the following to the file:
# Turn on the rewriting engine
RewriteEngine On

# Handle requests for "pet-care"
RewriteRule ^pet-care/?$ pet_care_info_07_07_2008.php [NC,L]
A couple of quick items to note - everything following a hash symbol in an .htaccess file is ignored as a comment, and I'd recommend you use comments liberally; also, the "RewriteEngine" line should only be used once per .htaccess file (please note that I've not included this line in the code examples from here onwards).
The "RewriteRule" line is where the magic happens. The line can be broken down into 5 parts:
  • RewriteRule - Tells Apache that this line refers to a single RewriteRule.
  • ^pet-care/?$ - The "pattern". The server will check the URL of every request to the site to see if this pattern matches. If it does, then Apache will swap the URL of the request for the "substitution" section that follows.
  • pet_care_info_07_07_2008.php - The "substitution". If the pattern above matches the request, Apache uses this URL instead of the requested URL.
  • [NC,L] - "Flags" that tell Apache how to apply the rule. In this case, we're using two flags: "NC" tells Apache that this rule should be case-insensitive, and "L" tells Apache not to process any more rules if this one is used.
  • # Handle requests for "pet-care" - A comment explaining what the rule does (optional but recommended).
The rule above is a simple method for rewriting a single URL, and is the basis for almost all URL rewriting rules.


Source Link: URL Rewriting

05 December 2009

PPC (Pay Per Click)

The concept of the PPC model was introduced by Bill Gross, the founder of Idealab and Goto.com, in 1998. Goto.com (later Overture) is now part of Yahoo!. Google started search engine advertising in December 1999, but only introduced PPC pricing in 2002; until then, its advertisements were charged at cost-per-thousand impressions. Yahoo! advertisements have been PPC-based since their introduction in 1998.

What is PPC Model?

Pay per click (PPC) is an Internet advertising model used on websites, in which advertisers pay their host only when their ad is clicked. With search engines, advertisers typically bid on keyword phrases relevant to their target market. Content sites commonly charge a fixed price per click rather than use a bidding system.

Pay Per Click Account Setup

The steps for creating and maintaining a pay-per-click account are outlined below:

@Business Analysis
*Challenges
*Goals

@Keyword Research
*Selection
*Bidding

@Ad Writing
*Headline
*Body Copy
*Call to Action

@Landing Pages
*Content
*Call to Action

@Launch

@Monitor
*Ad Position
*CTR
*CPC
*Conversions

What is CPA?

Cost Per Action, or CPA (sometimes known as Pay Per Action or PPA), is an online advertising pricing model in which the advertiser pays for each specified action (a purchase, a form submission, a lead, a sale, a sign-up, etc.) linked to the advertisement.

Fraud Clicks

Google introduced CPA bidding partly to protect its AdWords customers from fraudulent clicks on ads. The AdWords conversion tracking option, available in your AdWords account, tracks conversions by placing a snippet of code on a confirmation page (the thank-you page).

Link: PPC Model

04 December 2009

Google VS Yahoo Search

The search engine results can be quite different because of the different search algorithms each one uses. It’s important to know what these differences are so you can incorporate them into your optimization strategies.

Here are three important differences between Google and Yahoo’s search engine algorithms:

Age of Domain

Google places a much higher emphasis on the age of a domain than Yahoo does. Yahoo favors older sites as well, but the impact is far less pronounced than with Google.

This is why optimizing for multiple search engines is so important. Google should be looked at as a long-term strategy. If your website is relatively new, you will find it easier to rank well on Yahoo. To get around the Google sandbox effect, you can purchase a well-aged domain to host your website on, or get a link from an existing one.

Link Quality

Google's algorithm for determining the quality and reputation of web links is far more advanced and accurate than Yahoo's. Yahoo simply does not put as much effort into screening the quality of links, and as a result suffers from far more spam in its results than Google.

When optimizing for Google, you must understand that the number and quality of incoming links is a huge factor in determining your ranking success. With Yahoo, you can get away with lower-quality links; however, I don't recommend this, as it would hurt your Google rankings.

Meta Tags

Yahoo places much more emphasis on the content of meta tags, such as meta descriptions and keywords, than Google does. Above all else, the description should be optimized for a reader, not a search engine. Clearly describe the content of your page in your meta tags, without keyword stuffing or sounding spammy, and you will find much better conversions and click-throughs to your site.

Resource Link: Google VS Yahoo Search

23 August 2009

7 Advanced SEO Tactics

The top seven advanced SEO tactics are listed below:

1) Syndicating Articles that Link to Your Sitemap
2) Translating Your Website Into Other Languages
3) ROR Site Map
4) Best Keywords Phrases (that convert)
5) GoogSpy
6) Internal Links Within Content
7) Using Your Log Files for SEO

All of these suggested tips can help increase your business online.

03 August 2009

Microsoft and Yahoo's Search Deal

Google is the dominant player in the search market, with a 65 percent market share in June, according to comScore. Yahoo was second with 19.6 percent, while Microsoft was third with 8.4 percent.

On Wednesday (29 July 2009), Microsoft and Yahoo announced a 10-year search deal to take on Google. The deal will give Microsoft access to Yahoo's search technology; in return, Yahoo will receive 88% of the search-generated ad revenue from its own sites for the first 5 years of the 10-year deal.

Under the deal, Yahoo will use Bing (Microsoft's new search engine) for search-related services to capture more of the Internet marketing space, and Yahoo will handle advertising sales using Microsoft's technology.

Are they capable of taking on Google?

17 February 2009

Robots.txt file

Robots.txt is a file that tells search engine spiders which pages of a website they may or may not crawl. The file is normally used to keep search engine spiders away from unfinished pages of a website during its development phase. Many webmasters also use this file to avoid spam. The creation and uses of the robots.txt file are listed below:

Robots.txt Creation:

To keep all robots out:
User-agent: *
Disallow: /

To block certain pages from all crawlers:
User-agent: *
Disallow: /page name/

To block certain pages from a specific crawler:
User-agent: Googlebot
Disallow: /page name/

To block images from a specific crawler:
User-agent: Googlebot-Image
Disallow: /

To allow all robots:
User-agent: *
Disallow:

Finally, some crawlers now support an additional field called "Allow:", most notably, Google.

To disallow all crawlers from your site EXCEPT Google:
User-agent: *
Disallow: /
User-agent: Googlebot
Allow: /


"robots" meta tag

If you want a page indexed but do not want any of the links on the page to be followed, you can use the following instead:
< meta name="robots" content="index,nofollow"/>

If you don't want a page indexed but want all links on the page to be followed, you can use the following instead:
< meta name="robots" content="noindex,follow"/>

If you want a page indexed and all the links on the page to be followed, you can use the following instead:
< meta name="robots" content="index,follow"/>

If you don't want a page indexed and followed, you can use the following instead:
< meta name="robots" content="noindex,nofollow"/>

Invite robots to follow all pages
< meta name="robots" content="all"/>

Stop robots to follow all pages
< meta name="robots" content="none"/>

29 October 2008

Dynamic Website Optimization

A dynamic web page doesn't physically exist on the server as a static HTML page; dynamic pages are generated on the fly when a user triggers an action on the page. Search engines aren't good at reading dynamic URLs, because a search engine can mistake a dynamic website for a never-ending group of links. Many webmasters suggest various solutions for this; some important tips are listed below, with a small example after the list:

@ Create an HTML site map with text links to all pages of your website.
@ Try to get deep inbound links from relevant sites to yours.
@ Convert your dynamic web pages into static-looking web pages with the help of URL-rewriting software.
@ Create a static web page linking to all your dynamic web pages and optimize it for search engines.
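
For the URL-rewriting tip, a minimal .htaccess sketch looks like this (assuming Apache with mod_rewrite enabled; the script name show_a_product.php is hypothetical):

RewriteEngine On
# Serve the static-looking URL /products/7/ from the dynamic script
RewriteRule ^products/([0-9]+)/?$ show_a_product.php?product_id=$1 [L]

The visitor and the crawler both see /products/7/, while the server quietly runs the same dynamic script as before.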

06 June 2008

Increase traffic to your site

Traffic-generating tips are listed below:

@ Article submission.
@ Press releases.
@ Social bookmarking.
@ Social media optimization.
@ Classified ads.

01 May 2008

What is SMO?

Social media optimization (SMO) is a set of methods for generating publicity through social media, online communities and community websites. Methods of SMO include adding RSS feeds, adding a "Digg This" button, blogging and incorporating third party community functionalities like Flickr photo slides and galleries or YouTube videos. Social media optimization is a form of search engine marketing.

Source: SMO

20 February 2008

Social Networking

Look, I don't really think that the MySpaces and Facebooks of the world are that important for the typical small business as they stand today. There may be very practical business reasons for some to actually use these and other so-called social networks for business gain, but most people who have jumped on the social network bandwagon have found themselves left with an "is this all there is?" kind of feeling.

To those, I say this: the value of the current public social networks for business folks is not what you can get out of them today, but what you can learn from them for practical gain tomorrow. That's why SpacebookedIn makes sense for you now.

The Facebooks of the world are busy teaching millions and millions of business folks how social networks work, how social networking works, and how shared applications can be viral and ever-present. The real payoff, in my opinion, is that the wave to come after the Facebook bubble bursts is the "personalized business network." Once every one of your customers and prospects knows how to use easily replicable social networking tools - building profiles, sharing video and connecting based on mutual interests - your job of building your own social business network around your own very specific niche community will get a whole lot easier.

04 December 2007

website optimization

Website optimization is the process of designing your web pages to rank high in the search engines. If you really want to boost your business, then optimizing your website is a must.

Meta tags are rarely used by search engines these days, but the title tag is the most important tag on your site, so try to put all your target keywords in title tags. Sometimes anchor tags are also beneficial for ranking purposes.

Quality content and back links will definitely boost your ranking, but natural links are the best route to good SERPs. Don't you think?

27 November 2007

Link Bait

Link bait means creating something that naturally attracts backlinks to your web page by getting people to talk about it: discussing it on forums, blogging about it, posting it on social networking sites and linking to it from their own sites. A few tips are listed below:

*Write an interesting article

*Test something new that has not been done before

*Be the first to write about the latest news in your niche

*Create an interesting picture

*Make a tool that others can put on their sites but that links back to you

*Give away something valuable for free

20 November 2007

Avoid these for good SERPs

Please avoid the following things for better optimization.

*Don't use Flash for your web site design.
*Avoid frames on your site.
*Limit the number of keywords in your content.
*Don't put more than 20 links on a page.

19 November 2007

Link building

Nowadays Google is giving more importance to one-way natural links. Google also appreciates unique content. So we have to follow natural ways of building back links to earn good SERPs in Google.

15 November 2007

SEO Friendly Flash

Most SEOs would agree that Flash is not really SEO friendly (yet). This is because all the content (text) is inside the Flash movie, and the spiders can't read it, or at least usually choose not to index that content. Supposedly search engines can index some content embedded in Flash, but I have not seen any sites banking on this theory successfully.

The script discussed in the original article allows for a div section named "flashcontent". Anything placed inside this div will not be shown to the user, but your Flash movie will show up as usual. The idea is to place plain text content and/or links inside this div as spider food.
An added bonus of using this script is that it gets rid of the "white lines" shown around a Flash movie before the "control" is activated in IE by clicking on the Flash object.
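
The script itself isn't reproduced here, but the pattern looks roughly like this SWFObject-style sketch (it assumes swfobject.js is loaded on the page; the movie name and dimensions are hypothetical):

<div id="flashcontent">
  <p>Plain-text description of the movie, plus
  <a href="/products/">crawlable links</a>, as spider food.</p>
</div>
<script type="text/javascript">
  // In Flash-capable browsers the script swaps the div's contents
  // for the movie; spiders still read the text above.
  var so = new SWFObject("movie.swf", "movie", "400", "300", "8", "#ffffff");
  so.write("flashcontent");
</script>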


In other SEO friendly Flash related news, another designer emailed me a link to fCMSPro which claims to be an SEO friendly Flash based content management system(CMS). I took a look at their product and while it has made some advances in terms of more SEO friendly flash, it also has some serious drawbacks (IMHO).

Source: webpronews

Javascript: Friend or Foe?

If you frequent different SEO forums you may have received mixed signals on things like Flash or JavaScript. You may be wondering if it’s safe to use these and if so, what the impact will be. In this article, I attempt to address your concerns over JavaScript.

A brief history of JavaScript

In 1995 Netscape developers realized they needed an easy way to make Java applets more accessible to non-Java programmers and web designers. While it was plagued in its early days with problems and errors that could not be diagnosed, the popularity of this simple scripting language has hung on and grown over time.

Because of its cross-browser compatibility, in that most modern browsers now support JavaScript, and the relative ease of use and implementation, JavaScript has become popular among designers looking for a more dynamic edge for their websites.

So, is JavaScript bad?

In a nutshell, no. If it is used properly, and some basic rules are followed, then JavaScript is acceptable.

The biggest flaw with many sites that use JavaScript is that the developer embeds the navigation inside JavaScript menus, which renders the links invisible to search engine crawlers, so the crawlers won't follow those links.

But if you move the navigation out of the JavaScript, as in the sketch below, it then becomes a very powerful scripting tool to help you achieve various effects that HTML cannot.
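
A minimal sketch of the idea (the URLs are hypothetical): keep the links in plain HTML and let JavaScript only add behaviour on top.

<!-- Crawlable: real anchor tags that spiders can follow -->
<ul id="nav">
  <li><a href="/products/">Products</a></li>
  <li><a href="/articles/">Articles</a></li>
  <li><a href="/contact/">Contact</a></li>
</ul>
<script type="text/javascript">
  // Enhancement only, e.g. opening submenus on hover.
  // If this script never runs, the links above still work
  // and crawlers can still follow them.
</script>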

Source: textlinkbroker

14 November 2007

What is a Google Sitemap?

A Google Sitemap is a very simple XML document that lists all the pages in your website, but the Google Sitemaps program is actually much more important than that. In fact, the Sitemaps program provides a little peek inside Google's mind - and it can tell you a lot about what Google thinks of your website!


Why Should You Use Google Sitemaps?...


Until Google Sitemaps was released in the summer of 2005, optimizing a site for Google was a guessing game at best. A website's page might be deleted from the index, and the Webmaster had no idea why. Alternatively, a site's content could be scanned, but because of the peculiarities of the algorithm, the only pages that would rank well might be the "About Us" page, or the company's press releases.


As webmasters we were at the whim of Googlebot, the seemingly arbitrary algorithmic kingmaker that could make or break a website overnight through shifts in search engine positioning. There was no way to communicate with Google about a website - either to understand what was wrong with it, or to tell Google when something had been updated.


That all changed about a year ago when Google released Sitemaps, but the program really became useful in February of 2006, when Google updated it with a couple of new tools.


So, what exactly is the Google Sitemaps program, and how can you use it to improve the position of your website? Well, there are essentially two reasons to use Google Sitemaps:


1. Sitemaps provide you with a way to tell Google valuable information about your website.


2. You can use Sitemaps to learn what Google thinks about your website.


What You Can Tell Google About Your Site


Believe it or not, Google is concerned about making sure webmasters have a way of communicating information that is important about their sites. Although Googlebot does a pretty decent job of finding and cataloging web pages, it has very little ability to rate the relative importance of one page versus another. After all, many important pages on the Internet are not properly "optimized", and many of the people who couldn't care less about spending their time on linking campaigns create some of the best content.


Therefore, Google gives you the ability to tell them on a scale of 0.0 to 1.0 how important a given page is relative to all the others. Using this system, you might tell Google that your home page is a 1.0, each of your product sections is a 0.8, and each of your individual product pages is a 0.5. Pages like your company's address and contact information might only rate a 0.2.


You can also tell Google how often your pages are updated and the date each page was last modified. For example, your home page might be updated every day, while a particular product page might only be updated on an annual basis.
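
Concretely, a minimal sitemap file looks something like this (the URLs, dates and values here are made up for illustration):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-02-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/contact.html</loc>
    <changefreq>yearly</changefreq>
    <priority>0.2</priority>
  </url>
</urlset>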


What Google Can Tell You About Your Site


Having the ability to tell Google all this information is important, but you don't even need to create a sitemap file in order to enjoy some of the perks of having a Google Sitemaps account.


That's because even without a Sitemap file, you can still learn about any errors that Googlebot has found on your website. As you probably know, your site doesn't have to be "broken" for a robot to have trouble crawling its pages. Google Sitemaps will tell you about pages it was unable to crawl and links it was unable to follow, so you can see where these problems are and fix them before your pages get deleted from the index.

Source: ifergan