Monday Mornings with Madison

What Do Search Engines Value In Websites?

What do search engines value in websites or web pages in order to rank them higher?  This is the million-dollar question.  No doubt anyone who could definitively and conclusively answer it could become an instant millionaire.  But it is basically a trick question, because anyone who could answer it would only be able to explain how one particular search engine's algorithms work, not all of them, and even that is an ever-moving target.  The answer valid today would be obsolete tomorrow… or soon thereafter.  It is a question over which SEO professionals obsess and marketers distress, and one about which few will confess that what is believed is as much supposition and speculation as insight and intelligence.

The truth is that, except for the computer engineers who work at the major search engines such as Google, Yahoo and Bing, most people don't entirely know all the variables or the weighting given to the myriad signals used to determine a website's rank by any search engine.  It's like the secret recipe for a great stew.  There is a clear sense of what the main ingredients are, but not necessarily all of the minor ingredients, the exact measurements for each, or how they come together.  So what are the most important ingredients, and why keep them such a secret?

Secrecy Required

Anyone who is not deeply entrenched in the world of search may wonder why search engines are so secretive about how they do what they do.  Why not just tell everyone how pages are ranked?  The reason is simple.  Search engines keep their algorithms under wraps to prevent (or at least limit) people from cheating, manipulating or skewing search results for their own benefit.  Nevertheless, most SEO gurus agree there are certain basics every website should have in order to rank well.  Does your company's website have them all?

Let's start with a brief history lesson on search engine rankings.  The first search engines began cataloging content on the Internet in 1993.  In the two decades since, search engines have come and gone, but what they all do is basically the same.  Search engines send a computer program (called a spider or crawler) to browse websites and webpages on the Internet in a methodical, automated manner.  The crawler extracts the links a page contains, which are placed into a scheduler to be crawled later, and returns bits of information found on the page for indexing, such as the words it contains, where they are located, and any weight given to specific words.

It did not take long for website developers to recognize the value of having their sites ranked higher in search engine results.  A website on page one of a search is found more often, visited more frequently and – for e-commerce sites – sells more than pages that do not rank well.  Every organization wanted its website to be on page one of a search for keywords important to that site.  By 1997, the first SEO company was born.  (That makes the SEO cottage industry only about 15 years old.)  Strategies were identified that helped push a specific site up in the rankings.

Black Hat vs. White Hat

Leading search engines, such as Google, Bing and Yahoo, quickly determined not to disclose the algorithms they use to rank pages.  By 2004, Google, for example, had incorporated a wide range of undisclosed factors in its algorithms to reduce the impact of link manipulation.  Today, Google says it ranks sites using over 200 different signals, and it periodically changes the weighting of signals in its formula.  Signals that Google considers include such things as how links and backlinks are used, the timeliness/freshness and relevance of information, and whether content is duplicated from other websites and sources.

Some strategies are viewed as legitimate ("white hat") by the search engines.  Other strategies simply seek to circumvent the system ("black hat"); those are viewed negatively by search engines.  The consequence for using black hat practices can be as benign as a slap on the wrist.  For example, a Google message might say:  "We've detected that some of your site's pages may be using techniques that are outside Google's Webmaster Guidelines."  But if a website continues to use black hat techniques, Google punishes it by seriously lowering its ranking or even removing it from the index altogether.  For a website, that is the equivalent of falling into a bottomless black hole.

For example, one of the biggest paid blog networks, Build My Rank, had its blogs de-indexed by Google and has now shut down.  That network was one of the strictest blog networks for submitting content with backlinks to one's site: content had to be unique, or well-spun, with no grammar or spelling mistakes.  Build My Rank tried not to leave any footprint for Google to follow.  Unfortunately, it failed.  De-indexing is the punishment for any site that tries to game the system.

Does that mean all SEO strategies are black hat, since in a sense the focus is always to try to manipulate a site’s rankings?  No.  Black hat techniques manipulate or circumvent the process while white hat strategies work within the system to make a site more valuable to the user and easier to crawl for the spiders.

There are certainly lots of books and articles about SEO, and it is important to stay informed.  Read only the most recent information.  The rule of thumb on SEO is that anything published more than three to four months ago is probably at least partially obsolete.

Google itself offers advice on how to optimize a website so that it ranks well.  Maile Ohye, a Developer Program Tech Lead at Google, recently offered some dos, don'ts and common SEO mistakes in a brief video.  Below is a summary of some of her wisdom, plus that of other SEO 'experts,' on what Google bots consider and reward when ranking a website.  The focus here is only on the soundest and most consistent SEO principles.

Top 18 SEO Signals to Optimize

Develop quality content

Google sees content as a website's value proposition and recommends not focusing on SEO until the website creator has addressed these questions:  What sets the website apart?  Why should someone click on the site?  Why should someone come back to the site?  Why should someone recommend the site to others?  This all relates to one basic premise:  content.  In SEO, content is king!  It is the most important signal for bots.  Quality content can take many forms: product or service descriptions, a glossary of relevant terms, historical information, reports, presentations, press releases, articles, comments, recommendations, case studies, reviews, forms, calendar listings and offers.  In short, provide information or items that benefit site visitors.

Have a blog and guest blog

Because of the way major blog hosting sites work — such as WordPress — blogs have a lot of SEO value.  Blogs directly tie to development of quality content, but a blog is typically a microsite within a website.  It is also good to write posts for other blogs and have others write blog posts in return, as long as the content is relevant.

Display quality video

Any video added must be relevant to the purpose of the business and its website.  Video is also considered content, but it is valued as much as or more than text because there are still few business videos online compared with text.  A properly tagged series of business videos on YouTube can achieve a far higher ranking than a page relying on a lot of 'old style' SEO tactics.

Provide relevant content

Content is key but it has to be relevant content.  The information must relate to the site’s area of focus.  Adding information that has nothing to do with the website’s focus will not work.  One old black hat strategy was for site developers to post thousands of pages of unrelated, irrelevant content on the back end of the site.  Visitors would not see it but the bots did.  Sites now get punished for that.

Provide fresh content

For a website to rank well, it must be continually updated.  The ‘build-it-and-forget-it’ approach adopted by many companies is not rewarded by search engines.  That is why regular blog posts and social media connections are invaluable to SEO.  A continuous stream of new information to a website is a signal valued by bots/crawlers.

Use diverse keywords

It is important to use keywords in the title bar, headings, webpage address, meta tags and text.  Keywords are the words or phrases that visitors will search for to find a site.  The title bar at the top of the browser window is an important place to use keywords.  The page address is another important place.  So are the headlines and subheadings of a page.  Make sure these elements include two or three keywords/phrases, but don't use the same ones for every page.  Diversity is rewarded and keeps you from competing against yourself for ranking.  It is also better to use keyword-rich anchor text for links rather than generic phrases such as "click here," "read more" and "here."
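To make the placement concrete, here is a minimal sketch of the parts of a page where keywords live; the business name, keyword phrases and file names are invented purely for illustration, and each page of a real site would carry its own distinct phrases.

<!-- Hypothetical page for an invented business, "Acme Commercial Realty" -->
<!-- The page address itself can carry a keyword phrase, e.g. /miami-office-space -->
<head>
  <!-- The title tag is what shows in the browser title bar and in search results -->
  <title>Miami Office Space for Lease | Acme Commercial Realty</title>
  <!-- The meta description becomes the snippet searchers see under the title -->
  <meta name="description" content="Class A office space for lease in Miami, with floor plans, pricing and leasing terms.">
</head>
<body>
  <!-- The headline and subheading carry two or three keyword phrases of their own -->
  <h1>Office Space for Lease in Miami</h1>
  <h2>Class A Buildings in Brickell and Downtown</h2>
</body>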

Avoid keyword stuffing

Use relevant keywords in content but don’t overload any page with keywords.  While bots crawl in search of keywords, overloading a page with keywords is punished.  Keyword density should make sense based on the content and focus of the site.

Attract and encourage public buzz

It is important for a website to generate natural links, votes, +1s, follows and other social media signs of support.  Social media support is highly rewarded by crawlers.  Adding Like and +1 buttons to a website's pages permits visitors to generate that buzz.  It also helps if people re-tweet your content, link to it on LinkedIn, share it on Google+ or pin it on Pinterest.  Google is using social signals ever more to determine the value and rankings of websites.

Develop inbound links to the website

Links from other relevant sites are viewed positively.  Each one is treated like a recommendation and rewarded.  For example, every employee of a company should link their personal page on LinkedIn to the company's website.  Purchasing inbound links, however, is punished.  If you wonder how Google can tell whether links are purchased, refer back to the Build My Rank example above.  It's not worth taking that chance.  Also, avoid reciprocal links, as Google punishes 'tit-for-tat' schemes, which are viewed as 'not genuine.'  It is also best to get a variety of links from different sources, such as article submissions, social bookmarking, wiki submissions, press releases and forum profiles.  The more diverse a website's link profile, the better it is in Google's eyes.  Finally, limit the number of outbound links to about one per 100 words of text.

Create internal links

Cross linking between pages of the same website provides more links to the most important pages and improves site visibility.
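As a simple illustration (the page paths and anchor text here are hypothetical), internal links with descriptive anchor text might look like this:

<!-- Descriptive anchor text tells bots (and visitors) what the target page is about -->
<a href="/services/lease-administration">Our lease administration services</a>
<a href="/resources/case-studies">Case studies from past clients</a>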

Extend a domain’s registration

Search engines want to know that a site is committed to being around a long time.  Illegitimate sites come and go, but real sites buy and keep their domains (as part of their brand) indefinitely.  Sites are thus rewarded for being registered for five years or more.  How do the crawlers know for what length of time a given domain is registered?  That information is publicly available in the WHOIS database.

Keep old domains

Older domains (that have been registered a long time) are rewarded for stability and continuity.  It is always better to revamp a website using an existing domain than to create a new website on a new domain.

Fix broken links immediately

Bots crawl websites by following links, so broken links prevent proper indexing.  Provide clear and easy-to-use navigation; bots are able to crawl a site more easily if the navigation is logical.

Avoid ‘invisible’ elements

Minimize or avoid Flash or website information that pulls from a database.  Those elements are not visible to search engines.

Tag all images

Bots cannot see images, but they can read what an image is about if it is tagged.  Ensure that all photos/graphic images have descriptive Alt tags.
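For example (the file name and description are invented), a properly tagged image looks like this:

<!-- The alt attribute tells crawlers and screen readers what the image shows -->
<img src="/images/miami-office-tower.jpg" alt="Exterior of a Class A office tower in downtown Miami">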

Provide a full, accurate site map

Avoid coding work-arounds.  A sitemap tells bots about every page of a site, and new URLs can now be crawled within a week when submitted through Google Webmaster Tools.
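Sitemaps follow the standard sitemaps.org XML format.  A minimal example, with placeholder URLs, looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One url entry per page; lastmod, changefreq and priority are optional hints -->
    <loc>http://www.example.com/</loc>
    <lastmod>2012-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/blog/</loc>
    <changefreq>daily</changefreq>
  </url>
</urlset>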

Don't chase site visitors or search engines

Create a site that has long-lasting value.  It is a better long-term investment.  Google punishes schemes but rewards quality content and site design.

Ensure the site loads quickly

Google looks at page load times and bounce rates as metrics in ranking every website.  Ensure a website loads quickly and engages visitors as soon as they view a page.  For WordPress blogs, a caching plugin improves the loading speed of the site.
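For sites not on WordPress, one common speed tactic is telling browsers to cache static files so that repeat visits load faster.  On an Apache web server, for instance, a few lines in the site's .htaccess file will do it; this is only a sketch, and it assumes the mod_expires module is enabled on the server:

# Cache static assets in the visitor's browser so repeat page loads are faster
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>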

Armed with this information, any website developer can begin the process of optimizing a site for search engine findability.  It is the first step… but it should never be the last.  SEO is a never-ending process.

Quote of the Week

"It's quite complicated and sounds circular, but we've worked out a way to calculate a Web site's importance." Larry Page, Co-Founder, Google

© 2012, Written by Keren Peters-Atkinson, CMO, Madison Commercial Real Estate Services. All rights reserved.
