How to Achieve Top Search Engine Positioning - Essay Example

Summary
This analytical essay discusses search engine optimization and seeks to answer the question of how a website can achieve a top position in search results. At the end of the paper, the author draws conclusions and offers recommendations…
How to Achieve Top Search Engine Positioning


Webmasters, frustrated with what they thought was a well-made site not showing up on search engines, often consider paying a third-party company that touts search engine optimization in order to achieve the results they want. With some research, though, a webmaster can do as well or better without such a service. This paper discusses how to optimize a personal or business web site for search engines. The objective for webmasters is to have their sites placed near the top of the list of results returned in a web search. Knowing how search engines find content and the measures they use to rank sites is essential to understanding how to build a site optimized for placement. To achieve success, the web builder must understand what search engines like and what they don't like, as mistakes in page design are the most common deterrent to placement and are widespread on web pages. Search engine placement is a vast and free form of advertising that organizations often wish to benefit from. Although search engine administrators frequently provide means to purchase more frequent “spidering” of a site, a site's HTML construction and the appearance of its pages will determine whether or not that site is ever seen by visitors searching for whatever product or idea the site is promoting. This paper, researched by topic and best available source, covers ways to optimize a web site by giving a basic overview of the way search engines operate; what they look for in their ranking procedures, including how site design influences rank, the use and overuse of graphics, Flash technology, frames and HTML manipulation; and how webmasters can use that knowledge to their advantage. Search engines look for “meta tags” and appropriate keywords embedded in HTML when “spidering” a site for list placement, as data content and other variables determine how highly a site is ranked.
World Wide Web (WWW) pages are retrieved by a web crawler, or spider, a program which browses the WWW in a methodical, automated manner. The contents of each page are then analyzed to determine how it should be indexed. Words are extracted from the titles, headings, or tags. Web crawlers create a copy of all the visited pages for later processing by a search engine that will index the downloaded pages to provide fast searches. Search engines can store information about a large number of web pages retrieved from the WWW. “Data about web pages is stored in an index database for use in later queries. Some search engines, Google for example, store all or part of the source page (a cache) as well as information about the web pages. Some, Alta Vista for example, store every word of every page they find. Web crawlers use algorithms for searching web site information” (“Search Engine”, 2005). For the purposes of this study, an algorithm is the sequence in which computers process information: a precise list of precise steps in the precise order of computation. A computer program is nothing more than an algorithm that tells the computer which specific steps to perform, in which specific order, to carry out a specified task. “The usefulness of a search engine depends on the relevance of the results it gives back. While there may be millions of Web pages that include a particular word or phrase, some pages may be more relevant or more popular than others. A user makes queries to the search engine by inputting key words. The engine looks up the index and provides a listing of best-matched web pages according to its criteria, usually with a short summary containing the document's title and sometimes parts of the text” (“Search Engine”, 2005). Search engine algorithms weigh links and combine many factors to rank pages. Webmasters who want their sites to achieve a Top 20 position spend a lot of time studying the different algorithms.
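The crawl-then-index pipeline described above can be sketched in a few lines of Python. The page texts and URLs below are invented placeholders, not real crawl data; a real spider would fetch pages over HTTP before indexing them.

```python
import re
from collections import defaultdict

# Toy corpus standing in for pages a crawler has already fetched
# (the URLs and text are hypothetical).
pages = {
    "http://example.com/a": "Puppy Food Guide. The best puppy food for small dogs.",
    "http://example.com/b": "Dog training tips and tricks for new owners.",
}

def build_index(pages):
    """Map each word to the set of pages containing it (an inverted index),
    which is what lets the engine answer keyword queries quickly."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in re.findall(r"[a-z]+", text.lower()):
            index[word].add(url)
    return index

index = build_index(pages)
print(sorted(index["puppy"]))  # -> ['http://example.com/a']
```

A query is then just a lookup in `index`, optionally intersecting the sets for multi-word queries.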
Keywords are considered ‘inside factors’ that influence rank. Web crawlers read various types of ‘tags’ typed inside web site content text (HTML) that contain keywords. “Tags are invisible to the person viewing the web site. The content of a web page appears in the HTML title. Search engines deem the Title Tag the most important, as it is read first and will appear in search engine results and in browser bookmarks. Title Tags appear near the top of the HTML page. When looking at the source code of an HTML page, you can find the title in the following format: <title>The Title/Description of this web site</title>” (Lehoczhy, 2000). Keywords written in invisible tags such as Meta, ALT and Comment appear throughout the HTML document and are especially useful on the main page. “The meta description tag is significant because it's the only tag supported by some engines. Use this tag in the following format: <meta name="description" content="your summary here">. When using keywords, avoid repeating them more than three to seven times in your meta description, and keep in mind that search engines vary in their size preferences for meta keyword tags. MSN, for example, will accept a meta keyword tag up to 1,024 characters long, while HotBot specifies 75 characters as its guideline. Some search engines consider the overuse of keywords to be spam. The meta description tag describes your site's content, giving search engine spiders an accurate summary filled with multiple keywords. No engines penalize sites that use meta tags properly” (Lehoczhy, 2000). Web crawlers do not search images but can detect ALT Tags, which describe images, ensuring search engines recognize all content on the site. Comment Tags are notes from the webmaster to the webmaster and are another method designers use to place more keywords in the HTML text. “There are many issues to consider when placing keywords in the text of site pages. Most search engines index the full text of each page, so it's vital to place keywords throughout your text.
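As a rough illustration (not drawn from the essay's sources), the tags discussed here, title, meta description and ALT, can be pulled out of a page with Python's standard `html.parser` module; the sample page fed in at the end is invented:

```python
from html.parser import HTMLParser

class TagAuditor(HTMLParser):
    """Collect the <title>, meta description, and img ALT text from a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""
        self.alts = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")
        elif tag == "img":
            # An empty string here flags an image with a missing ALT tag.
            self.alts.append(attrs.get("alt", ""))

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

auditor = TagAuditor()
auditor.feed('<html><head><title>Dog Food</title>'
             '<meta name="description" content="Premium puppy food">'
             '</head><body><img src="p.jpg" alt="puppy"></body></html>')
print(auditor.title, auditor.description, auditor.alts)
```

Running the same auditor over a competitor's saved source code is one way to perform the competitive analysis described later in the paper.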
Some engines rank a page high if it has at least 100 words, so make that your minimum. Directories include pages based on the quality of their content, so make sure your pages aren't simply lists of keywords” (Shapiro, 2000). Prominence, proximity, density and frequency are key concepts to keep in mind when placing keywords. Search engines each set differing and specific algorithmic parameters; however, keyword concepts are essentially universal. Keywords should be prominent at the top and bottom of the page, in close proximity to each other instead of scattered throughout a sentence, and densely packed together. Keywords should repeat frequently, three but no more than seven times for every 100 words, and they can be a part of the URL. “The URL name is the part of the URL that comes between ‘www’ and ‘.com’. It's the name of a site. For example, in the case of the URL http://www.searchengines.com, the URL name is ‘searchengines’. Recently, search engines began to prioritize the use of keywords in a site's URL in their ranking formulas. Google and Inktomi are two engines that do this. Google is extremely important because Yahoo! uses it to supplement its search results. Alphabetical priority is also important” (Shapiro, 2000). Some search engines are programmed to prioritize alphabetically, which includes characters. “Simply put, it's why some search engines will list a file named ‘aaa.html’ before a file named ‘bbb.html’ and will rank a file named ‘@ABC’ higher than a file named just ‘ABC’” (Shapiro, 2000). Alphabetical hierarchy is important in naming the URL (web site address) as well as the file names of the pages within it. “Search engines employ editors to physically look for abuses such as naming a site @abcstore.com. Many search engines also use a complex technology to look for themes in web sites for ranking purposes” (Shapiro, 2000). Many of the more popular search engines, such as Google and MSN, use link popularity in their ranking algorithms.
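The frequency rule of thumb quoted above, three to seven repetitions per 100 words, is easy to check mechanically. A minimal sketch (the sample text is a synthetic placeholder):

```python
import re

def keyword_report(text, keyword):
    """Count occurrences of a keyword and its density per 100 words;
    the essay's rule of thumb is roughly 3-7 repetitions per 100 words."""
    words = re.findall(r"[a-z']+", text.lower())
    count = sum(1 for w in words if w == keyword.lower())
    per_100 = 100 * count / max(len(words), 1)
    return count, round(per_100, 1)

# 8 keyword-bearing words plus 92 filler words = exactly 100 words.
sample = ("puppy food " * 4 + "other filler words here " * 23).strip()
count, density = keyword_report(sample, "puppy")
print(count, density)  # -> 4 4.0
```

A density of 4.0 falls inside the suggested three-to-seven band; values far above it are what some engines would flag as keyword spam.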
Linking is considered an outside factor that influences rank. Beyond creating search engine-friendly pages, webmasters must now contend with link analysis, a major new trend in the way search engines calculate relevance. Link analysis is a technique that seeks to establish the credibility and authority of Web sites by analyzing the entire structure of the Web and noting which sites link to others. Links, in essence, are like citations in a book, serving as referrals to ‘preferred’ sites on the Web. If many sites link to a particular site, this ‘peer-review’ process amounts to a vote of confidence that the engines can measure (Sherman, 2000). Link popularity, meaning other sites that link to yours, makes many search engines rank a site higher. The best way to increase link popularity is through a linking campaign. Link exchange programs are comprised of webmasters who agree to link to one another. Free-for-All (FFA) sites are created to capture your e-mail address and offer you a link in return. Attempting to boost a site's link popularity through such means as FFA sites and exchange programs is generally a bad idea, though. Some search engines, such as Google, say that FFA sites are a false and dishonest way to increase link popularity. Google may ban your site for engaging in an FFA or link exchange program. “Many sites develop links through arrangements with each other and insert keywords in the link's description (e.g., a link whose anchor text reads ‘My Keywords’). Link arrangements made by contacting other webmasters are not considered illegitimate ‘spam’ by search engines” (“What”, 2000). DirectHit, a program considered click-tracking technology, is another tool search engines employ to organize search results. “DirectHit's site ranking system, which is based on the concepts of ‘click popularity’ and ‘stickiness,’ is currently used by Lycos, HotBot, MSN, and other search engines. DirectHit is currently owned and incorporated into Ask Jeeves and Teoma” (“Don't miss”, 2000).
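The inbound-link counting that underlies link popularity can be illustrated with a toy link graph; the domain names below are hypothetical, and real engines combine this raw count with many other signals:

```python
from collections import Counter

# Hypothetical link graph: each site mapped to the sites it links out to.
links = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "d.com": ["c.com", "b.com"],
}

def link_popularity(links):
    """Count inbound links for every site: a crude stand-in for the
    'peer-review' vote-of-confidence signal described above."""
    inbound = Counter()
    for source, targets in links.items():
        for target in targets:
            inbound[target] += 1
    return inbound

pop = link_popularity(links)
print(pop.most_common(1))  # -> [('c.com', 3)]
```

Here `c.com` ranks highest because three distinct sites cite it, which is exactly the citation analogy the essay draws.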
DirectHit measures the number of hits a site receives within a search engine's results page, then adjusts ranking accordingly. This system allows users, not machines, to control the rank of sites. In an example given, when 20 internet surfers type the words “puppy food” into their search bar, scan the first 10 results, and all of them elect to visit the Smith Brothers Dog Food site, the click-tracking technology assumes that this site is more relevant to the average surfer than any of the other sites listed in the top ten, even those listed higher. The next time a surfer enters the words “puppy food” into their search bar, that engine will list Smith Brothers higher in the ranking results. “Stickiness measures the time a user stays at a site. It's calculated according to the time that elapses between each of the user's clicks on the search engine's results page” (“Don't miss”, 2000). A well-planned and well-constructed site is essential in determining that site's stickiness: how long a user stays at the site. Many factors influence the user either to continue to browse a site, thus raising the rank of that site, or to search elsewhere. To retain visitors, most experts advise making the content attractive and consistent, making pages load quickly on users' computers, immediately letting visitors know what it is you have to offer, pointing out the benefits of using your site over others, consistently delivering value, using common-sense organization and keeping the site fresh and interactive (Krkosska, 1999). Popular web sites are neat, orderly and attractive; in web language, clean. Clean pages are generally easier to navigate than those with information crammed into every niche of space. Users are likely to quickly navigate from a cluttered, difficult page to a well-organized page, thus reducing the rank of the original site. “Don't distract your visitors with blinking text, scrolling text, animated GIFs or sound files. Animation and sounds are distracting.
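A rough sketch of the click-popularity and stickiness idea, using an invented click log for the essay's own “puppy food” example; DirectHit's actual formula is proprietary, so this only demonstrates the principle:

```python
# Hypothetical click log for one query: each entry is
# (site, seconds the user stayed before returning to the results page).
clicks = [
    ("smithbrothers.com", 120), ("smithbrothers.com", 90),
    ("genericdogfood.com", 5), ("smithbrothers.com", 60),
]

def rerank(results, clicks):
    """Order results by click count (click popularity), breaking ties by
    total dwell time (stickiness): users, not machines, set the order."""
    score = {site: [0, 0] for site in results}
    for site, dwell in clicks:
        if site in score:
            score[site][0] += 1      # click popularity
            score[site][1] += dwell  # stickiness
    return sorted(results, key=lambda s: tuple(score[s]), reverse=True)

print(rerank(["genericdogfood.com", "smithbrothers.com"], clicks))
# -> ['smithbrothers.com', 'genericdogfood.com']
```

Smith Brothers overtakes the site originally listed above it because surfers both click it more often and stay longer, which is precisely the behavior the essay describes.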
How can anyone read what's on your site when they're subjected to the equivalent of someone standing next to them poking them in the shoulder repeatedly? Also, visitors who have dial-up connections … may resent that you wasted their time by forcing them to load animations and sound files against their will” (Bluejay, 2005). Assuring that site users can navigate quickly and concisely throughout the site works in concert with the other steps webmasters take to maximize optimization. “When designing a site, you need to imagine how the visitor will experience your design. There are several styles of site structure and navigation which dictate how the visitor moves from section to section, and this movement is governed by the links and connections you have installed” (Kerr, 2001). Successful search engines also use the clean look on their own websites. Google has the cleanest of sites, and its popularity has changed the way other engine sites appear. Netscape, by contrast, is more cluttered. “Don't make the mistake of using your main page as a cluttered site map. Instead, provide easy navigation to make sure both people and search engines can easily find their way through your site” (“Clean”, 2000). The main page, or portal page, should contain basic navigation, a few basic links and the most pertinent or latest information relating to the site. Users do not like music or having to scroll through several pages to find basic information. “Try to keep your home page limited to two screens at most and your other pages to three to four screens. Don't write your life story on your front page. No one is going to scroll through 70 pages. If you have a lot of material on your site, break it up into sections of no more than four pages each. Give your visitors the option of downloading any big files to their hard drives. If your site is clean, easy to navigate, fast and pleasant, visitors will stay longer and your site will score higher.
Directories, such as Yahoo!, will review your site before accepting it. Directory editors look for sites with good design to add to their indices” (“Clean”, 2000). Site designers must consider the aesthetics of the web site in addition to the content. A viewer will most likely navigate away quickly if displeased by the overall look of the site. Layout, fonts, colors and graphics should work in concert to give a professional, pleasing effect, as search engines critique the site's quality for placement, and quality includes the way the site appears in the browser space. “Just like any newspaper would place its top story above the fold, place the most pertinent information and navigation buttons on the first screen of each page. Use Arial or another sans-serif font. Researchers maintain these fonts are easiest on the eye when read on screen. They're also the most common fonts and are standard on various computer platforms. Avoid using an image for your background. It will probably be large and will slow the page's load time. Also, text printed over background images is difficult for the user to read. The best combination for reading is a white background and black letters” (“Importance”, 2000). Optimization also includes consideration of the download time of the various elements on the page. Viewers and search engine administrators hate to wait. Web surfers will hastily navigate away if the page doesn't download quickly. The design of the site must take this into serious consideration during development. Optimizing load time is crucial to the success of a site. Basically, the more pictures posted on the site, the slower the load time for the end user. “Use no more than four images on your home page. There may be some images you need to use for aesthetic purposes, but use them sparingly … use fancy text or tables with colored backgrounds instead.
If you can't figure out how to reduce the number of images on your site, create a text-only version of the most important pages. Text pages save time for those with slow connections and allow the 25 percent of users who still use text-only browsers to view your site” (“Design fast”, 2000). Technologies such as Flash are fancy and popular, but having to wait until an introduction page finishes can be annoying to the viewer. Designers should provide an option to skip the introduction on doorway pages. Doorway pages are pages that are usually optimized for one search engine and anywhere from one to three keywords. They are also known as gateway, bridge, entry, jump or supplemental pages because they stand on their own, separate from the rest of a site. They usually feature a logo, some text and a link that encourages visitors to enter the site. However, there is a danger in using doorway pages too extensively. “Sites using doorway pages can be penalized if found. The biggest risk is associated with auto-generated cookie-cutters; i.e. doorway pages that are all identical except for their target search terms. They are not difficult for search engines to spot automatically, and I advise against using them” (Craven, n.d.). Something that may be overlooked while optimizing a site for search engine status is checking what the competition has done. Examining what similar successful sites do to achieve a high rank could provide substantial dividends. “Competitive analysis requires a basic knowledge of HTML … There are some software and web-based programs available to help you analyze competing sites. They'll usually put the HTML code in simpler form and give you statistics for other factors that influence rankings” (“Clean”, 2000). A highly ranked competitor's site could prove useful for the webmaster who is studying the technicalities that influence all sites. A website's source code can be found by right-clicking on any page and selecting the ‘view source’ option.
A small percentage of sites use a Java script to block this function, but the number is few, as Java script slows page downloads. The source code can be examined for the HTML title, meta tags, ALT tags, keyword frequency and alphabetical hierarchy. “Look for doorway pages and bait-and-switch techniques. Does the competitor's site include pages with few images that don't look like they belong to the rest of the site? They may be doorway pages, specially optimized to meet a particular search engine's indexing requirements” (“Clean”, 2000). Other factors to consider when building a site better to meet users' needs include browser compatibility. The way a site appears on one browser may be completely different from the way it appears on another. For users trying to decide whether your site is professional or not, spelling and punctuation can carry a lot of additional weight. “Although search engines don't penalize for spelling or syntax errors, someone who finds a spelling mistake on your site may view it as unprofessional. Many people believe such mistakes indicate a lack of cultural authority and credibility. Run a spell check on every page before publishing” (“Clean”, 2000). The failure to check sites for these errors is a widespread WWW deficiency: anyone can post a site without proofreading, editing or double-checking any content. “This is excellent for freedom of speech and for universal access to the medium, but it is very limiting in terms of quality control. A quick search on AltaVista for a simple typing error such as ‘univeristy’ comes up with over 80,000 pages containing the error—and the vast majority of these pages are official institutional pages” (Kerr, 2001). Many sites also have sloppy HTML coding, with broken images, links that don't work, browser incompatibilities and other errors and omissions that demonstrate a poor approach to quality control on the part of the web team, often caused by a lack of resources.
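One slice of the sloppy-coding problem described here, tags that are opened but never closed, can be detected with a short script built on Python's standard `html.parser`. This is a simplified sketch fed a deliberately broken sample page, not a substitute for a full HTML validator:

```python
from html.parser import HTMLParser

# Void elements legitimately have no closing tag.
VOID = {"img", "br", "hr", "meta", "link", "input"}

class TagBalanceChecker(HTMLParser):
    """Track open tags on a stack and flag any left unclosed."""
    def __init__(self):
        super().__init__()
        self.stack = []
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self.stack:
            # Pop back to the matching open tag, flagging anything skipped.
            while self.stack[-1] != tag:
                self.problems.append(f"<{self.stack.pop()}> never closed")
            self.stack.pop()
        else:
            self.problems.append(f"unexpected </{tag}>")

checker = TagBalanceChecker()
checker.feed("<html><body><p>Text<div></body></html>")
print(checker.problems)  # -> ['<div> never closed', '<p> never closed']
```

Older browsers' graceful degradation often hides such errors from the author, which is exactly why an automated check is worth running before submission.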
A web site should reflect favorably on your organization, deliver a message and present a positive image. Few things undermine customer confidence more effectively than sloppy workmanship. “HTML validation ensures that your code meets the formal standards. This should mean that it appears properly in all versions of the main browsers. One feature of the HTML standard is that older browsers will simply ignore any code that they are not able to understand—this is called graceful degradation, or backward compatibility—which is particularly important if you think that a significant part of your readership might be using older browsers. Most HTML editing software has built-in validation, and if you are hand-coding your pages you could use services such as NetMechanic, Web Site Garage or WebLint” (Kerr, 2001, p. 94). In determining whether your site is ready for search engine placement, it is important to check all of the above requirements and then check them again. “Search engines are based in part on the concept of link popularity, ranking pages on more than 150 criteria to determine relevancy via algorithms. This makes it possible for search engines to order their results by how many web sites link to each found page. For these engine companies to survive, they seek to place the most relevant, appealing and well-constructed sites at the higher end of each search list” (“Search Engine”, n.d.). Search engines are to the present and future what the Yellow Pages were to the past. In the Yellow Pages, all a business had to do was send a check to be placed. Now, exposure is monetarily free, while the real cost is the time spent on research and implementation, or a much larger check written to someone who can do it for you. Servers provide free sites and easy-to-use site templates, which allow people of limited web site knowledge to be published on the WWW.
The only visitors those with limited knowledge will bring to their site, though, are probably themselves, family and friends, unless, of course, they are advertising elsewhere. The search engine is more convenient for the user and more comprehensive than the Yellow Pages, the newspaper or other forms of advertising. A high rank is achieved when, in building the site, the webmaster assures that the foundation is solid. Design, HTML integrity with proper keywords embedded in the correct places, external links, useful content and knowing what not to do all work in concert toward the success or failure of a site. These are the building blocks of successful web sites. Experts agree that the type of submission software available on the internet does exactly the same job as the free and low-cost submission services, which is to submit a specific set of data to all of the search engines. The disadvantage is that your site is never optimized and can never score the top ranking required. “Professional search engine consultants aren't cheap but are extremely effective and worth considering if you can afford it. … They evaluate your web site, help design keywords and content, give tips and pointers, and design a search engine submission campaign. They will also submit multiple web pages from your site” (“Submit”, 2002). Although you can hire others to tackle the job, submitting your site to search engines on your own is one of the best ways to obtain a listing. By visiting each search engine separately and manually submitting the information for each web page you want listed, you guarantee you have done everything possible to prepare that page for that engine. Of course, the drawbacks to this method are the time element and the lack of professional counseling, but neither does it pay to use the free submission services widely available. “Free submission services are useless because they submit the same information to every single engine” (“Submit”, 2002).
References
Bluejay, M. (2005, August). Website Design Tips. [Accessed December 5, 2005].
“Clean Design is Good Design”. (2000). SearchEngines.com. [Accessed December 3, 2005].
Craven, P. (n.d.). Search engine optimization spam: Doorway pages. Web Workshop. [Accessed December 5, 2005].
“Don't miss DirectHit – The ins and outs of click popularity and stickiness”. (2000). SearchEngines.com. [Accessed December 3, 2005].
Kerr, M. (2001). Tips and Tricks for Web Site Managers. London: ASLIB-IMI.
Krkosska, B. (1999). Is your website sticky? Home Biz Tools. [Accessed December 4, 2005].
Lehoczhy, E. and Shapiro, Y. (2000). “Using your HTML title effectively”. SearchEngines.com. [Accessed December 2, 2005].
Shapiro, Y. (2000). “Using keywords in the text of your pages”. SearchEngines.com. [Accessed December 2, 2005].
Sherman, C. (2000, October). “Search Engine Strategies 2000”. Information Today, 17(9), p. 1.
“Submit sites to search engines”. (2002). EBizStartups.com. Available from http://www.ebizstartups.com/submit-site-to-search-engine.html. [Accessed December 3, 2005].
“The Importance of Pleasant Design”. (2000). SearchEngines.com. [Accessed December 4, 2005].
“The Importance of Fast Design”. (2000). SearchEngines.com. [Accessed December 4, 2005].
“What can I do to increase my link popularity?” (2000). SearchEngines.com. [Accessed December 2, 2005].
Wikipedia contributors. (2005). “Search engine”. Wikipedia, The Free Encyclopedia. [Accessed December 2, 2005].
(“How to Achieve Top Search Engine Positioning Essay”, n.d.) Retrieved from https://studentshare.org/information-technology/1535483-how-to-achieve-top-search-engine-positioning