San Clemente Web University: FREE Web Marketing & Advertising Classes

Posts Tagged ‘SEO’

Is Optimising For Humans The New SEO?

posted by admin @ 1:00 AM
Friday, March 8, 2013
Let's talk about SEO for humans, a great leap in website optimization. Instructions are included.

Turning Google Search Queries Data into Actionable Insights

posted by Michael Garrity @ 1:45 PM
Tuesday, February 26, 2013

Anyone who has ever done SEO work can attest that it’s usually complex, multi-layered and never as easy as tweaking one thing here or there to top the SERPs. That is why successful Web workers turn to multiple datasets to detect everything they need to change to improve their search results.

Google recognizes the importance of providing users with this information. One of the ways it helps is through the Search Queries feature in its Webmaster Tools. This feature offers data about Google Search queries that returned pages from a user’s website. It even allows users to see information about their “top pages,” which were seen most often in search results. Webmasters can also specify the period they would like to see data for by using a set of calendar drop-downs (the default is the last 30 days of data).

Often, Web pros can focus too much on search queries and rankings when there are sometimes bigger gains to be made in cleaning up the way a site is crawled or reducing duplicate content. Still, Google Search Queries can provide a plethora of highly actionable information that can quickly improve a website’s SEO performance; one just has to know what to look for.

Know Goals
To begin, webmasters should shift their focus from simply getting as much traffic as they can to seeking out “qualified” traffic, which is the result of qualified queries: searches conducted by users who have a realistic chance of actually liking the site and, thus, eventually converting. Seeking out this type of qualified traffic is the best way to avoid wasting time or resources on SEO while still improving conversion rates. However, in order to attract qualified queries, webmasters must know what they want.

As with any sort of business marketing campaign, Web professionals must be aware of their website’s goals (e.g. define their conversions), as well as the specific groups they’re targeting, where those users are located, what devices they’re using, what their objectives are and how those align with business goals.

Search Queries helps webmasters better match their site’s content to the queries of their target audience. By selecting “Filter,” users can get a breakdown of all of the countries where visitors to their site are located. If a good amount of visitors are coming from a specific country, webmasters should evaluate whether their site is currently meeting the needs of those users or whether it would be worth the investment to make sure that it does. The Filter also shows the search types (e.g. Web or image) that most frequently include their site in the results, which again enables them to alter their sites’ content to appeal to those searchers.

(Note: For those who are going to do research on individual queries, make sure to do it in a browser that doesn’t store cookies and not when logged into a Google account, so that personalization doesn’t affect the results.)

Click on This
A great way to pull out qualified traffic data is to change one's Google Webmaster Tools homepage to sort by clicks, rather than the default sorting by impressions, which can refer to both qualified and unqualified queries. Clicks, on the other hand, will display those searches that bring one’s site the most traffic, in essence telling them what they’re already doing well before they go back to make any changes.

Modifying the date range when looking at clicks may also bring up different results. Once a webmaster has started using Search Queries, it is a good idea to keep track of the data going forward, as Webmaster Tools only stores information for up to three months.

However, if the goal is to see years of search history to track various trend periods (e.g. holidays), webmasters have to download the information. They should then look at their top query results to see if they’re the queries they would expect, if they come from qualified traffic and how they can improve the display of their page (or pages) in the SERPs to drive even more traffic.

This can be easily done by clicking on the query in Search Queries, which will then show the pages that appear in the results for that query. For example, if they see different URLs with duplicate content, they can improve their rankings by removing that duplication from their sites. When looking at this information, it is important to see where one’s site ranks, how it is displayed and how its competitors rank and look.

Another way to check a site’s performance with qualified queries is to sort by click-through rate (CTR), where a high number shows that a page has a compelling search result appearance, while a low figure says the opposite. Important queries to monitor with CTRs are branded terms and categories that reflect a site’s goals and user intent.

Page Numbers
Finally, when a user sorts his or her Top Pages by clicks, Search Queries will show which pages were most visited by searchers during the filtered period of time, and clicking on a page’s URL will show all of the queries that the page ranks for. It makes sense, then, that users would want to go back and optimize these pages to make sure the content on them is in good shape and each page as a whole makes it easy for visitors to convert.

Webmasters can also sort their Top Pages by impressions, which shows the pages on a website that Google considers the most valuable. With this information, users can link from these Top Pages to some of their lower-ranking pages of high quality to try to help improve those pages’ search rankings, as well.

Site Improvement is a Step Away
Search Queries in Google Webmaster Tools provides a lot of useful information that can help improve a site’s rankings for qualified queries and, as a result, eventually increase conversions. The real secret to making the most of this particular tool is for webmasters to accept that some of the results may not be what they expected (e.g. some of their favorite Web pages may not be the best performing or even the most likely to bring in quality traffic). Those willing to go with the data they’re given and make the necessary adjustments will enjoy a noticeable boost in their SEO efforts.



Bing Sitemap Plugin Accommodates Webmasters

posted by Michael Garrity @ 11:45 AM
Tuesday, February 26, 2013

Last week was a big one for Bing. The Microsoft-owned search engine announced a substantial update that consisted primarily of three major changes.

Two of these refinements are distinctly user-centric, as they provide a new tile-based image search layout (similar to the interface of Windows 8 and Windows Phone devices) that comes with revamped “Friends Photo Search” functionality, as well. The other change consists of a series of autosuggest enhancements meant to make autosuggestions faster and more intuitive with the addition of “ghosting” features: complementary text, already highlighted, appears where a user is typing, so that searchers can either accept the suggestion or keep typing to have the ghosted text change based on the additional input.

However, the third Bing update was made with webmasters in mind; it is a plugin that makes it considerably easier to create an XML sitemap to submit to the search engine. This tool will be especially useful for those website owners who aren’t using a content management system (CMS) to develop XML sitemaps.

As you probably know, XML sitemaps allow webmasters to give search engines a comprehensive and, most importantly, accurate representation of their websites, thus letting them control what the search engines will index, at least in terms of priority. Webmasters can even inform Bing of new or updated content on their sites.

There are many benefits to utilizing sitemaps, the most obvious of which is that users will always have a full, updated list of all of the URLs on their site, plus a list of all the URLs that were recently modified. This makes it much easier for a search engine’s crawlers to prioritize all of a website’s various URLs, which will help keep bot traffic bandwidth down.

Bing’s Sitemap Plugin, which is an open source project that was made available through an Apache license, is compliant with and runs on Microsoft IIS servers, as well as Linux/UNIX servers that run Apache. It can create two different types of sitemaps: The first is a comprehensive sitemap of all the URLs seen in server traffic, and the other is a sitemap that’s dedicated to storing URLs that have recently changed. Furthermore, the plugin doesn’t just generate a list of URLs; it also adds <lastmod> and <priority> tags and values to a sitemap based on a page’s popularity.
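For readers unfamiliar with the format, a minimal sitemap entry of the kind described above might look like this (the URL, date and priority value are illustrative, not output from the Bing plugin itself):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; the plugin sets <lastmod> and
       <priority> values based on server traffic and page popularity. -->
  <url>
    <loc>http://www.example.com/products/blue-widgets</loc>
    <lastmod>2013-02-20</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>
```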

But among the most important aspects of the sitemap plugin is the amount of control that it gives to users. Webmasters that use the Bing plugin to create their sitemaps will be able to control exactly what gets added to them, and the plugin will detect any Disallow and Allow directives inside a site’s robots.txt, allowing it to skip any URL patterns that shouldn’t be added. In addition, it provides greater control through various configuration files with rules that augment pre-existing robots.txt Disallow directives, and webmasters are given total control over selecting those query parameters that the plugin should honor and include in added URLs.

The Bing Sitemap Plugin can be configured to operate in four different site and server scenarios: 1. Single site on a single server, 2. Single site on multiple servers, 3. Multiple sites on a single server or 4. Multiple sites on multiple servers. When the plugin operates across multiple servers, it enables a merge process that generates a single, unified sitemap that can be distributed across all of them.

With the addition of the Bing Sitemap Plugin, the search engine has taken another step toward becoming more webmaster-friendly, which will be important if it ever hopes to take away market share from Google. What do you think about the new plugin? Will it convince you to focus more on optimizing your website specifically for Bing? Let us know in the comments.



Extreme Makeover: SEO Edition

posted by Michael Garrity @ 1:30 PM
Tuesday, February 12, 2013

The winter weather is finally starting to subside, which means that spring is right around the corner and it’s a time for new beginnings. For business owners, webmasters and online marketers, this means doing a little spring cleaning in the form of an SEO makeover for their websites.

Regularly performing a checkup on your website and giving it a bit of a facelift when and where necessary is essential to maintaining a solid SEO strategy. However, it can be easy to overlook at times, and before you know it, a few weeks (or even months) have gone by since you last fixed things up, and by this time, your site may need a more extensive overhaul.

If your site is in dire need of a makeover and you’re wondering where to begin, look no further, because below we’ve listed a handful of nips, tucks and alterations that can help you totally reimagine your website for SEO.

Dust Off the Old Keyword Research Tool
Keyword research is the backbone of any SEO strategy, but it is also one of the easiest things to forget to revisit when tuning up your site. That’s why the first step in an SEO makeover should be visiting the Google AdWords Keyword Tool to find the keywords in your niche that are not only most popular, but also most suitable to your particular Web business. If this new report returns information about new keywords that you’re not currently using, make sure to find a way to work them into your site’s title tags and meta descriptions.

Clean Up URLs
One of the easiest and yet most effective ways to improve your website’s search rankings is by beautifying your URLs to make them easier for search engines to read, which means getting rid of long query strings. Clean, easy-to-understand URLs should include a hypertext transfer protocol, the website’s domain name, the subdirectory of the file or page (if applicable) and the file name of the requested resource, all of which are separated by forward slashes. Moreover, different terms within directory or file names should be separated by dashes (-), rather than underscores (_); the URL should not include auto-generated IDs and/or numbers; and it should remain as short as possible. Luckily, there is a tool called mod_rewrite that helps users rewrite requested URLs on the fly.
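As a rough sketch of how mod_rewrite accomplishes this (the script name and parameter below are invented for illustration), a rule in an Apache .htaccess file can serve a clean URL while the query string stays behind the scenes:

```apache
# Illustrative only: serve /products/blue-widgets from an
# underlying script without exposing its query string.
RewriteEngine On
RewriteRule ^products/([a-z0-9-]+)/?$ product.php?slug=$1 [L,QSA]
```

With a rule like this, visitors and crawlers see only the short, hyphen-separated path, while the server quietly maps it onto the parameterized resource.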

Refine the Site Architecture
This is the big one, as refining your site’s architecture not only makes it easier for search engines to crawl your content, but it also makes the site more user friendly and improves load times, so it’s a great idea all around. For starters, you should make sure the site follows the basic rule of thumb that it takes no more than three clicks to reach your deepest content. Other handy little tweaks include making it possible to access your homepage in one click from any other page on your site (which could include placing a link on your logo image), cross-linking related pages to open up the navigation flow (for both users and search engines) and utilizing text-based links for your menu and page navigation buttons, since text is the only thing that search engine crawlers can read, which is particularly important for the more crucial aspects of your website.

Purge Your Link Collection
If you link back to all of the websites that link to you, that tells Google that your inbound links are not earned, as sites with real authority don’t have to practice link sharing. You may also have outbound or internal links on your site that go to broken pages, downed websites or other places that you may not want them going to for various reasons. An SEO makeover is the perfect opportunity to perform a full-scale link checkup on your website to make sure that your site provides only the best and most necessary links for visitors and search engines. Remember, the number of outbound links you have doesn’t matter, as long as you’re linking to authoritative (and working) sites.

Weed Out Irrelevant Content
Finally, you should also do a site-wide examination of all of your content to ensure relevance and quality and help you move up the search rankings, as better content means more visitors and links and, thus, more attention from Google and Bing. Review your blog posts, Web pages and other copy to suss out anything irrelevant or outdated and make changes as necessary. This includes (but is not limited to) updating titles and headlines to include keywords and adding links to trending topics, which can help you get more relevant content and increase your search engine placement.



Visitor Generated SEO Plugin

posted by Pete Prestipino @ 11:55 AM
Monday, February 11, 2013

Is it possible to put your e-commerce SEO campaigns on autopilot?  

Online store owners using Shopping Cart Elite (a hosted e-commerce solution) now have a means to automate their search engine optimization efforts (at least some of the on-site SEO) thanks to the new Visitor Generated SEO plugin.

When the plugin is activated for a particular store, each time a customer searches for something on the site, the search result page is turned into a static page and added to the website. That page then gets indexed by search engines and the website (conceivably) starts ranking for that particular keyword. 

In the future, the plugin will take keywords used by customers to search for a website on Google, and then create SEO-friendly webpages with products using those keywords as the website title. The company claims the "auto-content generation" is strictly white hat. 


The Difference Between rel=author & rel=publisher

posted by Michael Garrity @ 1:30 PM
Tuesday, February 5, 2013

Google conveniently features two distinctly different tags that can be added to a Web page’s HTML to promote the individuals who authored a piece of content or an entire brand.

These are the two tags known as rel=author and rel=publisher, which automatically link an article or website back to a corresponding Google+ profile from within the search engine results pages (SERPs). This provides those listings with greater authority (as they’ve essentially been vouched for by Google) and helps them stand out among the other results on the page.

In other words, both of these tags can be very useful for search engine optimization (SEO) and Internet marketing, as long as you know the difference between them.


rel=author
You know those little author pictures that appear next to some of the blog posts and other content that you come across on Google’s SERPs? They’re the result of a rel=author tag that has been attached to that page.

These tags are meant to help promote individuals alongside each unique article by tying their work to their personal Google+ profiles. Of course, the tags do more than just make articles stand out on the SERPs (although that certainly helps); they also provide more credibility to a listing and garner significantly more exposure for the writer, as they also provide a “More by” link that collects other content that the writer has published and added the rel=author tag to.

Linking an article to a Google+ profile is a simple two-step process. The first is to add a link from the site or Web page with the content to your G+ profile by adding this tag into the HTML:

<a href="profile_url?rel=author">EXAMPLE TEXT</a>

Obviously, you’ll want to replace profile_url with the URL for your Google+ profile. After that, you need to link to your content from your profile, which can be done by signing into Google+, clicking on “Edit profile” and adding a custom link to the Contributor To section toward the bottom of your profile page. However, you must be sure to link your rel=author tag to an author page that is on the same website as your content.


rel=publisher
The rel=publisher tag has a much broader focus. It will tie a Google+ business page to an entire website, as opposed to claiming just a single article like the rel=author tag. And while the author tag will show a writer’s picture in the search results for a particular article, the publisher tag shows a Google+ page summary in search results for that particular brand.

In doing this, Google is recognizing a brand’s G+ page as its “official” profile, which it will give preference to in the SERPs. It also means they’ll be eligible for Google Direct Connect and aggregates all of the brand’s +1s across their site, G+ page, search results and AdWords ads.

Basically, the rel=publisher tag is Google’s way of providing an incentive to get brands to sign up for Google+ profiles. Right now, however, the capabilities of the tag are pretty limited (just ask any SEO or marketer), largely because it isn’t triggered until someone searches for only a brand’s name without any additional words in their query. That being said, the very existence of the tag is enough to assume that Google will be coming up with new ways to feature it on its search results in the future, meaning implementing it will eventually show greater benefits for marketers.

To manually use the tag, simply add it to the head of your site in the HTML and include the URL of your business’s Google+ page. It should look like this:

<link href="profile_url" rel="publisher"/>

Once you’ve done that, you need to go through the verification process. As with the rel=author tag, you just go to your Google+ page and click “Edit profile,” then add the URL to your brand’s website in the Website section to create a reciprocal link between the two sites.


Track Local Website Rankings with STAT Search Analytics

posted by Pete Prestipino @ 12:20 PM
Wednesday, January 30, 2013

The personalization and localization efforts of search engines are making it increasingly difficult for search engine optimization professionals to understand exactly where websites actually rank for a given search term. An update from STAT Search Analytics, however, is in a position to mitigate the problem.

STAT Search Analytics’ new platform gives SEO professionals the ability to gather daily keyword rankings and other local search analytics by city, state, province, ZIP code, postal code, and even neighbourhood or borough. "We designed our platform for SEO folks who are working with national and global brands. More and more, they need to see how they're ranking in Los Angeles versus New York, or Manchester versus London," said Rob Bucci, the founder of STAT.

"It's also going to be indispensable for SEO agencies serving companies that rely heavily on local search. Things like professional services, real estate, restaurants, and any other business with brick-and-mortar locations. We don't just let them compare cities; they can get right down to the level of postal codes."


The line between social and search is getting blurrier by the day, which has left many SEO professionals scrambling to find a way to incorporate this hot new marketing avenue into their work. Rio SEO, a company that provides software tools for SEO automation and reporting, is trying to make that process easier with the release of a new suite of social media marketing products.

This release, known as the Rio SEO Social Media Suite, follows the company’s acquisition of Meteor Solutions. Rio SEO set about integrating Meteor’s social advertising solutions into its SEO software, and the result is a set of three tools that help digital marketers with social-sharing campaign analysis, social advertising and digital-influencer activation programs.

Rio SEO Social Analyze is an on-demand social analytics platform that helps marketers measure and optimize their social marketing initiatives for scalable and repeatable results in real-time. The Rio SEO Social Advertise software will help marketers deliver digital advertising directly to their “most influential” brand advocates and users. Finally, Rio SEO Social Activate provides an automatic solution for establishing and building upon social engagement.

All of these new features are available in addition to the automation tools already available from Rio SEO, which help SEO pros with website and mobile optimization, as well as keyword discovery and a suite of local search solutions.

“Our aim is to help digital marketers catch the wave of content and discovery marketing by leveraging social marketing activation and measurement tools,” says Rio SEO Vice President of Social Technologies, Ben Straley, who was also the co-founder of Meteor Solutions. “As part of Rio SEO, we are now extremely well positioned to integrate these social tools with search marketing strategies and technologies for a complete discovery marketing solution that delivers maximum consumer engagement and ROI to leading brands.”


Trends in Title Tag Design

posted by Michael Garrity @ 1:00 PM
Tuesday, January 29, 2013

It’s difficult to overstate the importance of title elements when it comes to search engine optimization (SEO). The title of your website or Web page as it appears on a search engine results page (SERP) is an important factor when it comes to how well you ultimately end up ranking.

For those unfamiliar, title elements (or “title tags,” as they’re sometimes called) are the text that describes an online document; they appear at the top of a browser window, in the SERPs and may often be used by other websites as anchor text.

The title element is often referred to as the most important aspect of on-page SEO, as it is the first thing that both search engines and users will see about your page – and we all know that first impressions are everything.

Best Practices for Title Elements

It is necessary to create descriptive, relevant title tags for your website and Web pages. In particular, you should be sure to include as many keywords as possible (while still sounding natural), as search engines will highlight those terms in the SERPs by bolding them, which helps increase visibility and, thus, click-through rates, as well.

However, when coming up with a title, you need to be aware of the length restrictions, as search engines will only show upwards of 70 characters. You should also include keywords as close to the front of the title tag as possible, and put branding (company name, etc.) at the end of the title, so that the keywords are given prominence. There is an exception to this, though; if your brand is well-known, and the name itself could be considered a keyword, you can (and probably should) put the branding up front.
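Putting those guidelines together, a title element following the keywords-first pattern might look like the snippet below (the keywords and brand name are invented examples, not taken from any of the sites discussed here):

```html
<head>
  <!-- Keywords up front, branding at the end, under ~70 characters -->
  <title>Running Shoes &amp; Athletic Footwear | Example Store</title>
</head>
```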

Most of all, it’s important to remember that a good title tag is brand-focused, original and action-oriented. With that in mind, let’s look at the title elements of some of the best brands on the Web over various industries (retail, media, finance, lifestyle, technology and service providers) on both Google and Bing.



Retail
When you’re a huge name in retail like The Home Depot, all you really need to do (apparently) to bring in consumers is make sure that your website is clearly identified with your brand name. However, for a lesser-known company like Vistaprint, it is important to include keywords about the kind of products or services you can offer for users who may not be searching for your brand name.


Media
There are a whole lot of different media websites on the Internet, so the best way to stand out is to use a keyword-focused title tag that tells users exactly what kind of information your site offers, just like TMZ and Forbes.


Finance
Both Capital One and Investopedia know that their brand names are going to be the main way that users search for them on the Internet, so they put those first in their title tags. Then, each of them gives a short, succinct description of the services its site provides, one that includes relevant and highly searchable keywords, especially in Capital One’s case.

Lifestyle offers one of the best title tags, as it places all of its relevant keywords at the front of the title and makes the whole thing a bit more actionable by saying “at,” rather than simply including the site’s name at the end of a string of keywords. The much more popular NFL brand, knowing that its name alone will be searched for often, only has to include its name, but it makes sure to do it twice, just to be safe.


Technology
Major companies like Adobe and Microsoft don’t really have to do much more than include their names in their title tags. Adobe takes a minimalist approach and just puts its branding in there, while Microsoft adds some additional information that indicates to users just which Microsoft website and Web page (out of the hundreds of possibilities) they’re looking at.

Service Providers

The title tag for WebMD is all about branding, as more information-oriented searches will lead users to specific Web pages on the site; so, in lieu of adding all of the different services that WebMD offers, the site has opted for a cleaner title and put that additional information in the meta description area, instead. On the other hand, Kelley Blue Book’s title elements include all of the company’s most important keywords.


Google Dominates 90 Percent of B2B Space

posted by Amberly Dressler @ 8:32 AM
Wednesday, January 23, 2013

From search engine marketing (SEM) and search engine optimization (SEO) to social media and email, Web workers use a variety of tools to build and sustain website traffic. But knowing what is actually working to achieve these goals (Web-wide) is important not only for accurate budgeting of time and money, but also managing stakeholder expectations.

Optify’s recently released 2012 B2B Marketing Benchmark Report can help. The company analyzed more than 27 million visits to nearly 600 B2B sites from various traffic sources throughout 2012. 

“We weren’t shocked at the data, but we were surprised,” said Optify Chief Marketing Officer Doug Wheeler. 

The first takeaway is that organic search serves as the number one driver of traffic to B2B websites, followed by direct traffic (40 percent) and referrals (11.50 percent). The report also finds that despite the increased adoption of social media by B2B companies in 2012, it’s still only a fraction (1.90 percent) of total traffic to B2B websites. It is important to note, however, that the report also found that Google is responsible for nearly all of organic search, making it the single most important referring source of traffic.

“In the B2B space, 90 plus percent is coming from Google,” said Wheeler. “When you are looking at where you are going to spend your money, that’s imperative to know. Second, we are now seeing more than 40 percent of all organic visits from Google coming back as ‘not provided.’

“Google is spending more of its resources, taking that data away and at some point, you have to understand how it is you are going to acquire traffic, customers living in a paid information world. The way to get that traffic is through paid media.”

On the horizon, Wheeler expects analytics around what traffic is coming to your site to move from free to paid very soon. Understanding and optimizing everything about Google is going to be imperative, but Web workers also need to find out if paid search makes financial sense for their companies.

The report states, “For the companies that run paid search campaigns successfully, paid search has the potential to be a sustainable, strong source of leads. But not all companies can, or should, run paid search campaigns. Analyze your paid search potential (price point, average cost per lead, realistic conversion rates, resources, etc.) to find out if this source is for you and how much you can get out of it.”

Optify also looked at sources for converting traffic.

“Don’t put away that checkbook for email,” said Wheeler. “Email is number one in terms of conversion.”

According to Optify, the average conversion rate among all B2B websites in the report, across all sources, was 1.6 percent. Additionally, email shows the highest conversion rate compared to all other sources.

“Surprisingly, organic search, the number one driver of traffic, is at the bottom of the conversion-rate list, second only to social media among the sources with the lowest conversion traffic,” read the report.

And while the report states that only 5 percent of B2B traffic comes from social media, Twitter outperforms Facebook in terms of lead conversion by more than a 9-1 ratio. Why does Twitter convert so well?

“I think it’s because it’s an accepted form of communication, almost like email,” said Wheeler. “It’s just a different style, so people respond to it more instantly. They wouldn’t bother communicating with you if they didn’t want to respond; that’s my pure opinion.”

The full report is available for free download.

“In reading this report, you might discover that you are doing better or worse than the published benchmarks and discover areas of potential for your marketing activities,” said Wheeler. 




Keyword Not Provided in Google Chrome Omnibox Searches

posted by Pete Prestipino @ 2:35 PM
Monday, January 21, 2013

The keyword not provided issue has caused serious problems (some would say damage) for search engine optimization professionals and those responsible for performance analytics, particularly when it comes to keyword-level reporting. And the situation is only going to get worse.

When users are signed in to Google and use the Chrome address bar (the "omnibox") to conduct a query today, Chrome sends those searches over SSL; that's why many have seen their keyword not provided (KNP) data increasing over the past six months. Now, however, starting with Chrome 25 (currently in the Dev and Beta channels), Google will do the same thing for users who are not signed in to Google. That means keyword data for searches from the Chrome address bar will be unavailable (i.e. invisible); that's right, keyword not provided.
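For analytics teams trying to quantify the impact, one quick measure is the share of organic visits reported under the "(not provided)" label, which is the literal keyword label Google Analytics shows when the referring search was encrypted. A minimal sketch, assuming a simple visits-per-keyword report as input:

```python
def not_provided_share(visits_by_keyword):
    """Percentage of organic visits whose keyword was withheld.

    "(not provided)" is the label Google Analytics displays when the
    referring Google search was conducted over SSL.
    """
    total = sum(visits_by_keyword.values())
    if total == 0:
        return 0.0
    return 100.0 * visits_by_keyword.get("(not provided)", 0) / total

# Illustrative numbers only; a real report would come from your analytics export.
report = {"(not provided)": 620, "seo tips": 240, "link building": 140}
print(round(not_provided_share(report), 1))  # 62.0
```

Tracking this percentage month over month makes the trend described above visible in your own data.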

Google introduced Encrypted Search in May 2010 and made encryption the default for signed-in users starting in October 2011. While it may be a good thing for end-users (faster speeds and all), it's causing quite a bit of disruption in the search marketing landscape.

- The Extent of Keyword Not Provided & Some Practical Solutions


Developing Image Search Optimization

posted by Michael Garrity @ 1:30 PM
Tuesday, January 15, 2013

For SEOs, pretty pictures can do far more than just appeal to their sensitive sides; they can also help a site move up in the search rankings and drive traffic to it, for those who know what they're doing.

It’s important to note that optimizing images does more than just make it easier for users to find them in a Google Image Search (although that’s probably the most important reason for doing it); it also increases an image’s shareability and helps trace the picture back to your site, and on today’s highly social Web, that can be beneficial.

But what does it take to optimize images for SEO? Well, just keep reading…

File Names
Since you’ve already done all of the hard work of keyword research (right?), you already know which terms your most sought-after audience is going to be searching for. Use this knowledge to drop those keywords into the image’s file name when you’re saving it to your server. Now, search engines can read those keywords and your images will begin to appear for related search queries. A couple of considerations you should note: if your file name has more than one word, separate the words with hyphens (dashes), not underscores. Also, Google recommends you only use common image file types, such as JPEG, PNG, GIF or BMP.
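As a rough illustration of that naming advice, a few lines of code can turn a keyword phrase into a lowercase, hyphen-separated file name (the helper name and default extension here are our own assumptions, not a prescribed tool):

```python
import re

def seo_filename(phrase, extension="jpg"):
    """Turn a keyword phrase into a lowercase, hyphen-separated file name.

    Hyphens, not underscores, separate the words, per the guidance above.
    """
    # Keep letters and digits; collapse everything else into single hyphens.
    slug = re.sub(r"[^a-z0-9]+", "-", phrase.lower()).strip("-")
    return f"{slug}.{extension}"

print(seo_filename("Red Running Shoes!"))  # red-running-shoes.jpg
```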

Alt Text
Because there isn’t any inherent text content that search engines can scan, images are at a natural disadvantage when it comes to SEO. One of the best ways to combat this problem (and improve the user experience of your site) is to add alt text to an image in a Web page’s HTML code that includes some helpful descriptive information about the content of the picture. If you can slide some keywords in there naturally, all the better, but the alt text certainly shouldn’t just be made up of keywords. Make it as descriptive as possible, and if you’re using a specific image as navigation to link to another page, include text that is relevant to the page it links to, as well.
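A small sketch of what that markup comes down to; the helper below builds an image element with descriptive alt text, escaping the values so stray quotes don't break the HTML (the function name and example values are illustrative):

```python
from html import escape

def img_tag(src, alt):
    """Build an <img> element with descriptive alt text.

    escape() guards against quotes or angle brackets in the description.
    """
    return f'<img src="{escape(src, quote=True)}" alt="{escape(alt, quote=True)}">'

print(img_tag("red-running-shoes.jpg",
              "Pair of red running shoes on a wooden floor"))
```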

Anchor Text
In the same vein, make sure that any text you use to link to an image both targets your keywords and appropriately describes the image. Again, this isn’t just appealing to search engines; it’s also useful for your website visitors.

Page Placement
Once you’ve begun to optimize your images for the search engines, you can make them easier to find by placing the pictures next to or near keyword-related content on a Web page. Since search engines will be more drawn to these specific areas on a site, they’ll be more likely to spot your image when they crawl the page.

Getting a few inbound links that point directly to your image is another great way to garner it some attention from the search engines. These can come from leveraging your own network of Internet properties to link to the images, or from building links from outside websites and/or publishers in the same way that you would seek links to your site and content, so don't be afraid to ask your friends and business partners for links once in a while. Just make sure these links use targeted keywords (or at least a close variation of those keywords).

Rich Snippets
Rich snippets let you associate a single image with your website or business so that it appears anytime someone searches for your brand. So if you have an image that is highly related to or indicative of your company/site, add some snippets. It will help your listings stand out on the SERPs and can even increase your click-through rate.
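One common way to express this kind of markup is schema.org structured data; a minimal JSON-LD sketch is below. The property names follow schema.org's Organization type, but the values are placeholders, so check the current schema.org and search engine documentation for your specific case:

```python
import json

# Minimal Organization markup pointing at a brand logo image.
snippet = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Co.",
    "url": "https://www.example.com",
    "logo": "https://www.example.com/images/example-logo.png",
}

# The serialized object would be embedded in the page inside a
# <script type="application/ld+json"> block.
print(json.dumps(snippet, indent=2))
```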

Social Interaction
The real reason images are quickly becoming a valuable currency on the Web is because of the rise of social media and social networking sites, which make shareability one of the most important qualities of good content. Because of this, you should be distributing your already-optimized images on various social networks like Facebook, Tumblr and Google+, as well as image-hosting services like Flickr, in addition to putting them on your website. Many of these services, particularly Facebook, Tumblr and Flickr, will allow you to tag your images with keywords, descriptions or captions, meaning you can also use this opportunity to get your pictures noticed by targeted users. Just don’t forget to find a way to include a link back to your website alongside the image (probably in the caption/description) so that you can drive traffic from the social network, as well.



The Once-A-Week SEO Checklist

posted by Michael Garrity @ 2:00 PM
Tuesday, January 8, 2013

A new year always brings about new possibilities, which are often predicated on the many resolutions we all make to improve our lives and work during the course of the year.

It’s possible that many hardworking webmasters and website owners have resolved to improve or amp up their search engine optimization (SEO) efforts this year to help them find more relevant consumers and increase conversions. However, many of these same Web workers will quickly find themselves faced with the same problems that plagued them in years past, most notably a lack of time in an already busy schedule.

No need to worry, though, because here’s some good news: it’s possible to maintain a healthy SEO campaign by (mostly) conducting a once-a-week checkup that examines the most important elements of your website for moving up the search engine rankings, allowing you to identify and correct any issues you may be having. And the best part is, once these larger problems are corrected, many other aspects of your overall SEO performance will improve as well.

Just make sure that you regularly follow a version of this SEO checklist once a week, and get ready to watch the inevitable upward progress of your search marketing efforts.

- Use Google Webmaster Tools to check sitemaps

To start, simply sign into your Google Webmaster Tools account (actually, if you don’t have one, the first step is to register one), which can help you quickly identify any issues with your domain. Primarily, you should use this service to make sure your sitemaps don’t have any errors and to review how many of your pages have been indexed. If you find that you have some missing pages, that’s a pretty good indicator that you need to submit a brand new sitemap.xml to the search engines.
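If you do find missing pages and need to submit a fresh sitemap.xml, the format is simple enough to generate in a few lines; a minimal sketch using the standard library (the URL list is illustrative):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml document from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        # <loc> is the only required child element per the sitemap protocol.
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["https://www.example.com/",
                     "https://www.example.com/about"])
print(xml)
```

The resulting file gets uploaded to your site's root and submitted through Webmaster Tools.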

- Don’t forget to look for crawl errors, too

Google Webmaster Tools can also help you spot any crawl errors (pages “not found” or broken links) on your site; if these issues are uncovered, they should be considered top priority fixes. In addition, this tool can help you check up on your site speed, HTML problems, such as short or duplicate metadata, and links to your site.

- Look for (and fix) broken links

Having a bunch of dead links on your website is going to hurt your standing with the search engines, so you should make it a point to regularly look for them by using a tool to crawl your website and point out any hazardous hyperlinks that you are unaware of. And once you know which links are bad, you can easily fix or get rid of them.
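The core of any such tool is simple: fetch each link, record the HTTP status, and flag the failures. Leaving the fetching to whatever crawler you use, the triage step can be sketched like this (the data shape and function name are our own assumptions):

```python
def broken_links(status_by_url):
    """Given {url: HTTP status} from a crawl, return the links to fix.

    4xx and 5xx responses count as broken; everything else passes.
    """
    return sorted(url for url, status in status_by_url.items() if status >= 400)

crawl = {
    "https://www.example.com/": 200,
    "https://www.example.com/old-page": 404,
    "https://www.example.com/api": 500,
}
print(broken_links(crawl))
# ['https://www.example.com/api', 'https://www.example.com/old-page']
```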

- Tune up title tags

If you’ve put any effort into your SEO until now, every page on your site should have its own unique, descriptive title (as indicated in the HTML <title> tags). But as we all know, the more pages one adds to a site, the harder it is to ensure that every page is given an appropriately SEO-friendly title. If you have a somewhat small site, you should be able to check all of your pages manually pretty easily, but for larger sites, Google Webmaster Tools will gather and present this information to you in a “Content Analysis” section that can be found under the “Diagnostics” tab.
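For sites in between, the manual check is easy to automate once you have a list of pages and their titles; a small sketch (the page-to-title mapping is illustrative):

```python
from collections import Counter

def duplicate_titles(titles_by_url):
    """Return each <title> that appears on more than one page of the site."""
    counts = Counter(titles_by_url.values())
    return sorted(title for title, n in counts.items() if n > 1)

pages = {
    "/": "Acme Widgets - Home",
    "/red": "Buy Red Widgets | Acme",
    "/blue": "Acme Widgets - Home",   # duplicate: this page needs its own title
}
print(duplicate_titles(pages))  # ['Acme Widgets - Home']
```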

- Revise meta descriptions (as needed)

Although meta page descriptions don’t have a huge impact on search rankings, they can play a major role in convincing users to click through to your site, so it’s worth giving them a once-over on a regular basis, especially if you add a lot of new pages from week to week. In particular, you should make sure you don’t have any duplicate descriptions on your site. Good descriptions should be between 150 and 160 characters and made up of compelling copy that smartly uses crucial keywords, without using quotation marks or other non-alphabet characters.
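Those guidelines are mechanical enough to check automatically; a minimal audit sketch (the function name and exact checks are our own, based on the rules above):

```python
def audit_description(desc, lo=150, hi=160):
    """Flag common meta-description problems per the guidelines above."""
    problems = []
    if not lo <= len(desc) <= hi:
        problems.append(f"length {len(desc)} outside {lo}-{hi}")
    if '"' in desc:
        problems.append("contains quotation marks")
    return problems

print(audit_description("Too short."))
# ['length 10 outside 150-160']
```

Run it over every page's description, and pair it with a duplicate check across pages.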

- Follow the trends

Using an analytics platform like Google Analytics, check the daily, weekly and long-term search traffic trends to see what users are responding to and what isn’t working. Find out which of your pages have increased search engine traffic and which ones have had the opposite effect, and then figure out the reasons for why this is the case. Ultimately, you should have a solid idea/starting point to look at the problems on your site that need to be addressed, as well as the opportunities you have to increase search traffic based on user data.

- Add internal links when possible

Search engines use internal links to determine which pages the website owners think are the most important on the site, so to help your rankings and show off your best stuff, look around your site for ways to include links to these power pages. This is especially easy (and important) if you are consistently adding new content.

- Seek out your best search phrases and use them a lot

Thanks to – you guessed it – Google Webmaster Tools, webmasters can now find out what search phrases are leading users to their virtual door. By going to the “Statistics” tab and looking at “search queries,” you’ll see the top 20 search queries that your site is appearing in, which can help you assess the performance of your current keyword campaigns and maybe even discover a few new ones you hadn’t thought of. With this information in tow, you can use TrafficZap’s keyword density tool to receive a report about the words and phrases that appear most densely on the page of the URL that you enter; this will help you figure out just how well you’re using your keywords and phrases on your site, and make adjustments accordingly.
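Keyword density itself is a simple calculation, so you can also compute it yourself rather than relying on any particular tool; a rough single-word sketch (a real tool would also handle multi-word phrases):

```python
import re
from collections import Counter

def keyword_density(text, keyword):
    """Percentage of words on the page that match the given keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return 100.0 * Counter(words)[keyword.lower()] / len(words)

page = "Widgets for sale. Our widgets are the best widgets around."
print(round(keyword_density(page, "widgets"), 1))  # 30.0
```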



Top SEO Posts of 2012

posted by admin @ 8:05 AM
Tuesday, December 25, 2012

SEOs have certainly had a rough year, haven't they? Despite Penguin and Panda and the ever more complex and sophisticated tasks required to make a website top the SERPs, SEO remains as important and relevant as ever.

Let's take a look back at the most popular search engine optimization-related posts (as defined by our audience of in-house and agency SEO readers) made available here over the past year by our talented staff of editors (and contributors).

As you can see in the list below, Website Magazine readers are most interested in (all things) Google, as well as the role of social and its influence on SEO, and lest we forget, link building. We've only included 10 of the most popular SEO-related content items from this year below, but hundreds of other posts and articles - both online and in the print magazine - are available that deserve (if not demand) your attention.

Subscribe now to receive a free subscription to Website Magazine and never miss any of the information you need to ensure your social media success.

Get started now - it takes just two minutes!

Top SEO-themed Articles Of 2012 at Website Magazine

New Study Reveals Top Google Ranking Factors
The volume of Facebook and Twitter shares that a Web page generates is closely correlated to how high it ranks in Google searches, while too many ads on a page are likely to have a negative effect on search visibility.

Google SEO: Algorithm Changes - February 2012
Google announced several (40, in fact) search algorithm changes it made in the month of February 2012. The changes (which, as mentioned in previous announcements, now come with codenames for easier reference) address a variety of user experience and functionality features including on related searches, sitelinks and a lot more.

Crafting an SEO-Friendly Facebook Page
Facebook is gradually becoming the standard, go-to destination for consumers that are interested in learning about a brand. In fact, according to an infographic from the marketing research company Lab42, 50 percent of consumers say a brand’s Facebook page is more useful than its website.

SEO Meta Data Mechanics: Titles & Descriptions
Inbound links and personal connections carry the strongest influence in search results listings, and today you will discover several tactics to increase both of those counts. There’s a lot more to securing first-place and first-page rankings, however, than links and personal connections.

Google SEO: 52 New Changes to Know
Google has released a long list of changes that the search engine made in April 2012, and many are directly related to how search engine optimization professionals will engage in their profession post Penguin.

Give Up the SEO Dream?
It’s time to give up the SEO dream. You know, the one where you rank in the first position on Google and Bing for every conceivable keyword and phrase related to your product, service or published material. But if search engines didn’t exist, would your enterprise? Yes. And here’s how.

Big List of Link Building Strategies
Link building should always be top of mind for those who want to grow their Web properties, but to call it a difficult practice is an understatement. It is, in fact, one of the most challenging components of search engine optimization, and one of the trickier areas in all of Web marketing.

The Future of Search: SEO Trends for 2013
Two weeks from today marks the New Year (assuming the world doesn’t end on Friday). And while the Web has changed dramatically over the last half-decade, there remain a handful of constants that seem as if they will always, in 2013 and beyond, play an important role for Web professionals in developing an Internet strategy, regardless of the industry they operate in. One of those areas is search engine optimization.

The SEO Zoo: Panda vs. Penguin
The Web is a wild place, and the principles of survival of the fittest are very prominent in the realm of search engine marketing and SEO. So it makes sense that Google would name its algorithm updates after some of the world’s exotic wildlife.

Link Building via Google Search Parameters
The ability to search is one of the most useful tools on the Internet, and Google Search is the top dog. The good and bad news is that, unknown to many users, most of us aren’t using Google Search to its full potential.


The Future of Search: SEO Trends for 2013

posted by Michael Garrity @ 1:15 AM
Wednesday, December 19, 2012

Two weeks from today marks the New Year (assuming the world doesn’t end on Friday). And while the Web has changed dramatically over the last half-decade, there remain a handful of constants that seem as if they will always, in 2013 and beyond, play an important role for Web professionals in developing an Internet strategy, regardless of the industry they operate in. One of those areas is search engine optimization.

However, just because SEO is still going to be necessary in 2013, that doesn’t mean it’s going to be the same old song-and-dance that Internet marketers are used to. This year saw the emergence of new trends into mainstream SEO practices (at least as “mainstream” as they can be) that will continue well into 2013 and beyond. Let’s take a look at seven of the most important SEO trends for the upcoming year.

Mobile Search

It’s starting to feel as if mobile has been an “emerging trend” since the iPhone was released five years ago. But now that hundreds of millions of smartphone users are being added every year and the growing tablet market is increasingly diversified, mobile search is no longer a novelty that only the most highly trafficked sites have to optimize for. Blogs and websites must be prepared for mobile searches and mobile capabilities, as their owners will start to see more and more of their traffic coming from these devices; this can include practices like mobile-specific Web formatting.

Google+ and Google Authorship

Look, there’s no use in skirting around the fact that Google is the undisputed leader in the search engine market, and as such, it has the ability to force the hands of Internet marketers to appeal to it to improve their overall SEO efforts. In 2013, expect that pandering to be done through Google+ and the Google Authorship program (where authors identify a page or post as their own work by linking it to their G+ accounts), both of which are likely going to become more important in the search engine’s rankings. What this means is that if you’re not on Google+, either as a brand or an individual (especially if you write a lot of content/blog posts), it’s time to seize the opportunity.

Title Tags and Headings

These pieces of data have always been among the most basic and important aspects of SEO. What is different now is the information that Google (and probably other search engines) is going to look for when indexing title tags and headings on articles. In 2013, the focus will be on titles and article headings that specifically relate to the content or its keywords. When applicable, publishers should always break up their content into different sections and have each one CLEARLY address a specific topic.

Quality Content

It seems a bit obvious to say that search engines are on the lookout for “quality content,” which can include traditional content like blog posts, but also links or site design. But what that really means in 2013 is that publishers are not going to get anywhere repurposing, reusing or aggressively distributing the same material over and over again. A large quantity of content is basically useless unless it is lengthy, detailed, specific and, perhaps most importantly, unique. It’s time for Web pros to put their creative hats on, because they’re going to have to produce a lot of new, quality content in 2013 if they want to stay competitive.

Tag Management

Tag management is the practice of managing the metadata keywords assigned to a piece of information or content on the Web, which describe the item and make it easier to find when users search for it. The next step for marketers will be to concentrate their efforts on managing multichannel tags, and thanks to a relatively recent expansion, there are several great tag management services out there that can help them do just that, including BrightTag, Tealium and TagMan.

The Human Touch
For some reason, Google has often been criticized for being too inhuman. Its one-size-fits-all algorithm was susceptible to black hat SEO tactics like keyword stuffing or low-quality content farms for far too long, but as the Web progresses and becomes more social, Google and other search engines are finally able to completely disregard those cheap techniques in favor of a more human element that will determine how websites and pages rank in their search results. Humanized rankings are going to be far more prevalent over the next year, incorporating more quality content and social media data to deliver results in real-time (or closer to it, at least). So, if you were relying heavily on taking advantage of Google’s impersonal algorithm, it’s time to devise a new strategy.

Conversion Rate Optimization (CRO)
Conversion rates are no longer just important to your bottom line. There are plenty of websites out there that are well-optimized for certain niches in search engines, but don’t offer the kind of quality product or service needed to actually make a conversion, meaning some of the higher-ranking sites for these keywords are actually useless for searchers. By relying more on conversion rates than search traffic when ranking websites, search engines will be able to provide more quality (there’s that word again) products or services to interested users. If you need some help getting started with CRO, take a look at Website Magazine’s Master List of Conversion Optimization Software.

2013 is right around the corner, and that means now is a great time to re-orient your Web strategy, reconsider your SEO practices and make the next year the most successful yet for your Web business.


The Marketing Main Event: Search vs. Social Media

posted by Michael Garrity @ 1:38 PM
Tuesday, December 11, 2012

For Internet advertisers, the main event face-off for 2012 is the on-going battle between social media and search for the attention of advertisers.

On the one hand, search has a much longer, more complete history than social media, and it has been ingrained in the fabric of online advertising for virtually as long as it has been a thing on the Web. This means that search advertising is refined and proven, with many well-established best practices that provide a useful template for all marketers. And perhaps most importantly, search is almost always a major component of any online ad campaign from the very beginning, and rightly so.

However, nothing has altered the way that we all use the Web as quickly or dramatically as social media, which evolved from a way for teenagers to kill some time to one of the very pillars of the Internet in just a few years. Now, marketers use Facebook, Twitter, YouTube, Pinterest and a plethora of other, smaller social media platforms to reach out to consumers, expand their audience and intimately engage with users, all of which helps them drive interest and create advocates for their brands.

But which one is better? Where should advertisers be focusing most of their attention? It turns out that Florida-based ad agency MDG Advertising was curious as well, which led the company to create a helpful infographic that compares the two marketing realms in four different categories to see how they stack up against one another, based on various surveys and studies. So who came out on top?

Lead Generation
The first category remains one of the most important goals for many digital marketers: lead generation. Marketers love to gather leads, as leads provide pre-qualified prospects, making it considerably easier to market to them and, ultimately, providing higher conversion rates. According to MDG Advertising’s survey, both B2B and B2C professionals agree that the combination of search and PPC advertising absolutely dominates in terms of generating useful leads for their business. B2C respondents said that SEO makes the biggest impact at 41 percent, with PPC coming in at 24.8 percent and social media marketing clocking in at 34.2 percent. Meanwhile, B2B marketers said that SEO impacts their lead generation by a considerable majority, 57.4 percent, followed by social media marketing at 24.8 percent and PPC advertising at 17.9 percent.

Brand Awareness
While lead generation is dominated by search marketing, it actually ranks second among the top objectives of search engine optimization, below increasing website traffic and above generating brand awareness. However, as the infographic explains, brand exposure is listed as the primary objective of social media, according to Web marketing professionals. So, while search engines may be able to help consumers find you, they won’t help consumers know who you are. If you want to generate interest in your brand, turn to social media.

Local Business Visibility
Despite the Internet, many consumers still like to do much of their shopping locally, although many use the Web to find information about their local businesses, and for the most part, they’re using search engines to do it. That is the case when the consumer is looking for information about a restaurant, bar or club, where they’ll use search engines 38 percent of the time, “specialty” websites 17 percent of the time and social media just three percent of the time. But it’s not just diners and nightclubs that rule the search engines, as search also dominates as the most popular method for users looking for “other” local businesses at 36 percent, followed by specialty sites at 16 percent and social media trailing far behind at one percent. Needless to say, local businesses absolutely must emphasize SEO as part of their online ad campaigns.

Consumer Interaction
I think we all know where this one is going. When it comes to interacting with eager consumers and (hopefully) turning them into brand advocates, social media is the ideal platform, although not by the incredible margins that one might think. A survey of over 600 marketers asked what they preferred as an interactive marketing tool, and social media came out on top at 65 percent. However, 54 percent of respondents said SEO, while only 34 percent said they use paid search to interact with consumers. This isn’t terribly surprising, however, as the nature of social media obviously lends itself to being, uh, social and engaging customers.

And the Winner Is…
…Those marketers that are smart enough to leverage both search and social media as essential aspects of their online ad campaigns. Search engines aren’t dumb, and the last two years have seen a massive push to integrate social media into search algorithms, and it seems that with every Google or Bing update, social has a greater impact on search rankings.

According to MDG Advertising’s infographic, 50 percent of marketers said that social media has had an impact on their search marketing efforts, while just 29 percent have actually taken steps to merge parts of their social and search strategies. This has to change in 2013, as the two are going to grow more reliant on one another, and more integral to the success of online advertisers, as the line between social and search continues to blur.


Blogging Basics – Fighting Writer’s Block and Learning SEO

posted by admin @ 11:45 AM
Monday, December 10, 2012

Almost everyone who starts out blogging is told to write about something they love – it's common advice because it's sensible advice.

But even if you are hugely passionate about a subject, there are going to be days when the thought of sitting down in front of a blank page and filling it with words isn't going to happen. You might be too busy, too distracted, or you might be suffering writer's block. The sooner you face up to the fact that blogging won't always be easy, the better. But there are some tactics you can use to minimize the disruption these periods will bring.

  • Store up ideas from periods when the words are flowing and use them when things dry up.
  • When you start blogging draw up a list of 15-20 ideas and whenever you cover one of them add another to replace it.
  • You might find you write five entries without putting an idea back on the list, but as long as you refresh it when inspiration strikes, you'll never run out of content ideas.
  • You might also be the kind of person who likes to have specific blog titles in place before you start writing, but it doesn't matter if your ideas list is headlines, rough thoughts on a post or a mixture of both – having it in the first place is what matters.

So you've got your blog, you've got your ideas list and maybe you've even written a few posts - it's time to talk SEO.

SEO is as important as writing

If you're writing for the sheer fun of it, then you won't have to worry about search engine optimization. If, however, you want to attract visitors to your blog, then you need to focus on more than just producing content.

SEO is all about making your blog attractive to search engines. If you do that, you're more likely to rank highly for terms relating to your site and attract more visitors. To start, think of your own experiences of using Google – when you search for something how often do you look any further than the first result on a page?

Around 80 percent of people who use Google click on the first result they see – so the benefits of having your site as that result, and hence the benefits of SEO, should be obvious.

The bad news is some SEO tasks can be very boring. The good news is others are a lot of fun. There is so much to learn about SEO, however, that it's important to remember you won't become an expert overnight.

Google wants to help you

A good place to start for an absolute beginner is by taking a look at Google's Webmaster Guidelines. This will give you a solid overview of the kind of thing you should (and equally importantly shouldn't) be doing. There are a few technical terms in there, but it will pay for you to get a grip on these areas early in your blogging life.

As you grow in confidence you may want to start using things like Google Webmaster Tools in order to make SEO improvements to your site. But as promised, there is another side to SEO and one which involves plenty of human interaction.

Links are crucial to boosting your search engine rankings and getting them should be a key focus for any blog. However, you need to be choosy about the kind of links you're getting.

Good links and bad links

Anyone taking their first steps into the world of SEO will find it impossible to avoid ads for services that promise thousands of links for very little money. But like most things in life, if an SEO offer seems too good to be true, it is.

You'll get your links alright, and they may even boost your Google rankings for a short while. But sooner or later the search engine will realize you are buying links (a big no-no in the Webmaster Guidelines mentioned above) and your site will plummet down the rankings.

Exactly why Google doesn't like link buying, and the full horror of getting caught doing it, would fill at least one blog post on their own, but anyone interested in such things can get a rough overview by reading the Wikipedia article on one of Google's most recent algorithm updates.

So if that's the wrong way to get links, what's the right way?

Firstly, you need to make sure your content really stands out – linking to something is a way of endorsing it and people are more likely to do that if your blog is of a high standard.

There are a few ways of doing this but the best one is to try and ensure you bring something new to the debate each time you sit down to write. Not every post needs to be a thesis, but if you draw on your own expertise and experiences you'll find you can bring a fresh spin to almost any topic.

Make the most of the social side of blogging

Great content on its own won't bring in the maximum number of links – to do that you need to get out there and make friends. Blogging is a social activity and whatever you are writing about there will be a community that revolves around that particular subject. Twitter and Google are your friends here – use them to find other bloggers and start a conversation.

If you're both passionate about the same thing this should be easy - by tweeting people and commenting on their articles you'll start to form a relationship which will be beneficial for your blog. This should be two people talking about something they love – there's no need to fake anything. If you're friendly, open and honest, you're far more likely to get a link than if you brazenly ask a stranger for one. Gently making someone aware of your blog will reap all sorts of benefits – if they like what you're doing they'll link to it and tweet about it – you won't have to ask.

Blogging is what you make of it

This is only the beginning though – we haven't even touched on SEO topics like guest posting or linkbait. And that's the beauty of blogging – you can treat it as just a way to get your thoughts down on the page or as a long-term project with specific goals and aims. Professional bloggers do exist and if you want to start on the road to becoming one, following the tips here will make sure you are heading in the right direction.

About the Author: Will Stevens – a journalist and SEO expert who is part of the blog team. The company is the UK's largest accredited provider of domain names.



Every good SEO wants to get the highest-quality links possible, as they serve as affirmations of the quality of their site’s content and, thus, help them be recognized by search engines and ultimately moved up the SERPs. But sometimes, if they’re not careful, even the best SEOs can end up with some bad links that are, at best, a waste of their time and, at worst, a detriment to their optimization efforts.

These bad links can cause problems for your site because Google, Bing and the like are often wary of giving high rankings to sites that have inbound links from questionable or outright unacceptable sites, as they want to discourage users from ending up there. “Bad” links are usually those that come from spam or poor-quality sites, blog networks, paid posts, defunct directories, irrelevant/untrustworthy sites or disproportionate anchor text links. They can also include purchased links, which are frowned upon in their own right.

Needless to say, if you have built bad links, you’re going to want to get rid of them for the sake of your Web business. How can you do this? Well, you first have to start by finding those bad links…

How to Identify Bad Links
For starters, you need to make sure that your SEO problems are a result of bad links and not actually something else, such as duplicate content, bad canonicals or redirects, downtime, latency issues or malware. Fortunately, search engines (specifically Google) will now warn webmasters about their poor or “unnatural” links, so if it’s a problem, you’ll at least know about it before you watch your rankings plummet.

To find these links, you’ll have to do a backlink audit to see where they’re coming from and devise a plan to get rid of them. One of the best ways to conduct an audit is to compare your link profile to those of your competitors. Study their site-wide links, anchor text distribution, article and Web directory links, blog links, link networks, footer links, blog comment/forum links, etc. to get a better picture of what “good” links in your niche or industry should look like. This knowledge will help you better identify those backlinks that don’t meet those criteria.

You can do this manually by checking Google Webmaster Tools to look at your most linked pages. For example, if any of them are highly commercial and feature minimal value-added content (making it odd that they would attract many links), you’ll know where to start looking. Webmaster Tools also lets you see your anchor text distribution. Again, if you see anchor text that has little connection to your website name or URL, your company name or text directly related to your site, that’s a solid indicator that the link may not be desirable.
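As a rough illustration of the anchor text side of such an audit, here is a minimal Python sketch (the URLs and anchor texts are invented) that turns a backlink export into a distribution, so anchors claiming a disproportionate share stand out immediately:

```python
from collections import Counter

def anchor_text_distribution(backlinks):
    """Summarize how often each anchor text appears in a link profile.

    `backlinks` is a list of (source_url, anchor_text) pairs, the kind of
    export a backlink tool produces. Returns each anchor's share of the
    total, most common first.
    """
    counts = Counter(anchor.lower().strip() for _, anchor in backlinks)
    total = sum(counts.values())
    return {anchor: n / total for anchor, n in counts.most_common()}

profile = anchor_text_distribution([
    ("http://blog-a.example", "Acme Widgets"),
    ("http://forum-b.example", "acme widgets"),
    ("http://news-c.example", "http://acme.example"),
    ("http://spam-d.example", "cheap widgets online"),
    ("http://spam-e.example", "cheap widgets online"),
])
# Commercial phrases carrying as large a share as the brand name itself
# is the kind of skew worth investigating.
```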

Other good tools for checking the quality of your backlinks are SEOmoz’s Open Site Explorer and CognitiveSEO. These tools do the lion’s share of the analysis, so the process is quicker and more effective. If you’re especially industrious, or just don’t trust your own work, you can also hire an SEO agency to conduct the audit.

Disavowing Bad Links
Both Google and Bing offer a Disavow Tool as part of their respective Webmaster Tools suites, which allows website owners to quickly alert the search engines about backlinks they’ve found that they don’t trust by submitting page, directory or domain URLs that contain unnatural, spammy or low-quality links to their sites. The search engine in question will then disavow the link so that it will no longer count against you when it assesses your site’s position in the SERPs.

However, while this may seem like a handy quick fix to your link problem, these Disavow Tools aren’t always (or even often) going to be your best option. First, you should try to get rid of these heinous backlinks on your own by contacting the offending website and asking it to remove them. Those who need help finding a contact at the website can try using SpyonWeb, C-Class Checker or regular social media sites to track one down. If you contracted out your search engine optimization, you can contact the provider to ask about the original agreement regarding the link and how to get rid of it.

It is important to check, because the links you’re so quick to get rid of could actually be helping your site. Google says you can eventually get disavowed links back, but it does apparently take a while. For the most part, you should only consider using the Disavow Tools when you have received bad link warnings, been manually penalized, been denied reconsideration after trying to fix a bad link warning or been a victim of negative SEO.
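For reference, the file Google’s Disavow Tool accepts is plain text: one URL per line, a `domain:` prefix to disavow an entire domain and `#` for comments (the domains below are placeholders):

```text
# Contacted the owner of spamdomain1.example to ask for
# link removal but got no response
domain:spamdomain1.example

# Individual spammy pages left in place after outreach
http://spamdomain2.example/page-with-bad-link.html
```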

Dealing with bad links can take time. Luckily, there are a growing number of online tools and services that can help you speed up the process and ensure greater accuracy, such as Website Magazine's Broken Link Checker.



Your SEO Makes What?

posted by Allison Howen @ 11:05 AM
Tuesday, December 4, 2012

As the World Wide Web continues to grow, so does the demand for professionals with SEO skills and experience.

In fact, a recent study from SEO platform Conductor reveals a 112 percent year-over-year increase in LinkedIn profiles that list SEO titles and skills, while SEO job postings have increased by 1,900 percent. But where are these jobs located, and how much are these search professionals being paid? Luckily, a new Conductor infographic has the answers.

Best Cities

It is no surprise that New York, San Francisco and Los Angeles are the top cities for finding SEO jobs; however, some of the other cities positioned within the top 20 list may be a little more shocking. For example, the study found that San Jose and Cleveland both list more SEO jobs than some larger cities, like Phoenix and San Diego.

Top Titles

SEO professionals come with a variety of titles, with the most common being SEO/Marketing Manager. The study found that the next most popular title is SEO Specialist, followed by SEO Analyst/Strategist and Director of Marketing.

Average Salary

Perhaps the most intriguing part of the study reveals the average salaries for SEO professionals. While the most popular SEO job title (SEO/Marketing Manager) is compensated with an average salary of $63,978, professionals that hold the Director of Marketing title average the highest salary at $94,836.


Invalid URLs and Google Rankings

posted by Pete Prestipino @ 10:00 AM
Tuesday, December 4, 2012

It would certainly seem logical for invalid URLs (broken links, etc.) to negatively impact a site’s broader Google rankings, but according to a recent Google Webmaster Help thread, that’s not exactly the case.

The specific question in the very active thread was: if “a website had several thousand internal URLs that are broken, can it be a signal to Google to crawl the website less and reduce its trust in the domain?”

Google’s John Mueller responded much later in the thread that the number of crawl errors on a site “generally” does not affect a site’s crawling, indexing or ranking.

“The number of broken links -- assuming the links point to URLs that don't exist -- generally does not affect your site's crawling, indexing or ranking at all, regardless if it's a handful or millions of them. This does not make us assume that a website is of lower quality (personally, it's more like a sign that the website is technically handling these invalid URLs correctly, which would be a good sign). The number of 404/410 crawl errors would also not negatively affect the crawl rate of the website -- it might even increase the crawl rate since the server can likely respond to these requests a bit faster than to normal requests.”

So, in short, while invalid URLs may not hurt a site’s rankings in general, they do hurt the specific pages involved – a broken URL can’t be accessed and thus won’t rank – and they mean the site’s internal linking opportunities aren’t being fully leveraged.


Crafting an SEO-Friendly Facebook Page

posted by Michael Garrity @ 1:30 PM
Tuesday, November 27, 2012

Facebook is gradually becoming the standard, go-to destination for consumers that are interested in learning about a brand. In fact, according to an infographic from the marketing research company Lab42, 50 percent of consumers say a brand’s Facebook page is more useful than its website.

On top of that, 75 percent said they feel more connected to brands with Facebook, while 35 percent said they feel like brands listen to them more on the social network. Overall, a whopping 82 percent of respondents said Facebook is a “good place” to engage with brands.

This growing preference for interacting with brands through Facebook means that companies need to start putting as much (if not more) effort into optimizing their Facebook Pages for higher search engine rankings as they do for their websites, because it appears that consumers are going to head there first to check out their brands.

Here’s some SEO you can do to catapult your Facebook page to the top of the SERPs:

Claim Your Name
Obviously, the best place to start with your SEO is with your Facebook Page name and vanity URL; you know, all of that branding stuff. Your Facebook Page name will appear to search engines with the prominence of an <h1> headline, so you should name it after your business or brand and stay away from trying to stuff the title with keywords, which could hurt your viral growth on the social network because the Page will seem like spam to users. Putting your company’s name in the title ensures that consumers will be able to find you with little-to-no confusion, especially when they’re looking on search engines. However, once you select a name, don’t change it unless you absolutely have to, as Google (and presumably others) penalizes pages that change their titles.

And it doesn’t just stop there, because you also have to find the best possible vanity URL for your Facebook Page (although this option is only available for Pages that have at least 100 fans). You can select your username on the site, but keep in mind that it won’t just let you pick generic keywords for your URL. While there may be some leeway that allows you to use keywords in conjunction with the name of your brand, you will probably be better off just using your brand name (or the closest unused variant you can come up with). Mostly, just make sure that your URL, like your Page name, contains the name of your brand and is carefully crafted to help you stand out on the SERPs.

Make Your Text Count
We’ve all heard about the importance of writing up original content that provides value to users while also taking advantage of the most important keywords in your industry or niche to help you appear higher for related searches. Well, that practice is just as important, although far more limited, on Facebook.

On the social network, you should incorporate as much SEO-friendly (i.e. keyword-heavy) text as you can in your “About” section. This is the only area of a Facebook Page where the site lets you include a lot of text, and it presents useful opportunities to improve your chances of being noticed by searches for some of your brand’s most important keywords, because in addition to the regular About section (which can include a company overview, general description, etc.), there is also space for basic information, contact information and more.

Don’t Forget Multimedia Content
These days, so much of what is posted and shared on Facebook is multimedia content in the form of images or videos. It’s no secret that search engines have a difficult time crawling this content because it isn’t text-based, but luckily it still can be optimized for the search engines.

This is most easily accomplished by simply giving any images, logos or videos that you post on Facebook a filename that is descriptive and, if applicable, uses keywords. Since video titles are weighted more heavily, video is the easiest type of multimedia content to optimize. Once you’ve got SEO-worthy names, you can add more content to your photos and videos, such as captions or comments, that will also attract search engine bots.

Foster Discussion
It’s easy to lose sight of the fact that Facebook is a social media platform, and that your first task on the site is to be social with your fans. Conveniently, this sort of interaction can actually be beneficial to your SEO efforts, as well. By fostering discussions with other users on your posts, photos and videos, you’re allowing them to generate a wealth of additional content on your page. So, as long as your discussions remain on-topic, all of those user-generated comments could conceivably help improve your rankings for certain keywords. Unfortunately, it’s not known for sure how much weight search engines give to Facebook comments, but we do know that Google at least indexes them.

It’s (Still) All About Links
Links are incredibly valuable for SEO, a fact that remains true even on Facebook. And, as with traditional website optimization, the more links you can accrue, the better. If search engines see your Page getting linked to in numerous places (both on and off the social network), especially from respected or authoritative sources, it will be weighted much more heavily and likely rank higher.

You can start by linking to your Facebook Page on all of your other available Web properties (i.e. your website, Twitter page, YouTube channel, etc.), but you certainly shouldn’t stop there. If you’re out acquiring links, you can also ask sites to link to your Facebook Page in addition to, or in place of, your regular website.

However, the easiest way to get a bunch of links is simply to get more people to Like your Page. Since the site automatically puts links to the Pages that a user Likes right on their profiles, each new fan you get results in one intra-Facebook inbound link to your Page. Then once you foster more discussions and get fans to comment on and Like content on your Page, search engines will see even more reciprocal links in your content stream between you and your fans, since each user’s name on a comment or Like links back to his/her profile.



Google dropped a virtual bomb on search engine optimization professionals (all Web businesses really) when it announced in October 2011 that it would begin encrypting search queries. The result was that visits from organic listings no longer included information about individual queries. As you might imagine, not providing referring keyword data sent shockwaves through the industry and the force of that change has since left many struggling to regain their digital balance.

Just how bad is the problem? One of the most highly cited studies of the past few weeks has come from digital marketing software provider Optify. The company examined 400+ business-to-business websites over the past 11 months, analyzing over 17 million visits from organic search and capturing a total of roughly 7.2 million keywords. As you can see in the graph below, the rate of keyword “not provided” now reaches close to 40 percent of all Google searches – an increase of almost 171 percent since being introduced.

The impact has been massive. According to the Optify study, 81 percent of companies analyzed currently see over 30 percent of their traffic from Google as “not provided” and 64 percent see between 30 and 50 percent of their traffic as “not provided”. For many ‘Net professionals that Website Magazine has spoken with directly, that “not provided” rate is actually higher – in some cases closer to 50 percent.

So, what does this all mean for search engine optimization professionals? What steps can digital-focused businesses take to mitigate this rather substantial data loss? Optify did provide a few suggestions in its report, but as it stands today, the “not provided” problem is expected only to get worse (unless you’re an advertiser, of course). Fortunately, SEOs can make do with what they have. While Google isn’t sharing all the data, it is still managing to share some (roughly 60 percent) – and for many, a sample of the total data is sufficient (or at least enough to get a general understanding of keyword performance).

The best and most practical option is simply to use Google’s Webmaster Tools. As if you needed another reason to use the product/platform, Google graciously gives SEOs access to the top 1,000 daily search queries and top 1,000 daily landing pages for the past 30 days. Since you can export files from GWT, it is possible to compare month-over-month traffic gains/losses on a keyword level. That’s a good thing for SEOs, but it’s just not enough.
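Once you have two monthly exports in hand, the month-over-month comparison itself is a few lines of scripting. A minimal Python sketch, with invented queries and visit counts standing in for the exported data:

```python
# Hypothetical comparison of two exported "top queries" reports
# (query -> visits). The query names and numbers are made up.
october = {"seo tips": 320, "link building": 210, "web design": 90}
november = {"seo tips": 280, "link building": 260, "content marketing": 45}

# Delta per query; a query missing from one month counts as zero visits.
changes = {
    q: november.get(q, 0) - october.get(q, 0)
    for q in set(october) | set(november)
}

# Sort so the biggest losses surface first.
report = sorted(changes.items(), key=lambda kv: kv[1])
```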

The problem with reviewing data in aggregate is that it is impossible to associate referring keywords with website behavior like time on site or page views. Optify suggests using “proxies” such as keyword rank and ranked page to estimate individual keyword performance. Here’s how to do that:

1) Sort leads from organic search by entry page
2) Pull keywords that drive traffic (available from Webmaster Tools) to that page
3) Using rank and known CTR for each keyword, estimate traffic to each page by keyword.
4) Using estimated traffic per keyword and known leads per page, estimate the conversion rate per keyword and tie it to other performance data at the keyword level.
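Under stated assumptions (an invented CTR-by-rank table and made-up page and keyword figures, not numbers from the Optify study), the four steps above might be sketched in Python like this:

```python
# Illustrative click-through rates by organic position (assumed values).
CTR_BY_RANK = {1: 0.36, 2: 0.13, 3: 0.10, 4: 0.08, 5: 0.06}

# Steps 1-2: a landing page with its known organic leads, plus the
# keywords (from Webmaster Tools) that drive traffic to it.
page = {
    "url": "/pricing",
    "monthly_leads": 12,
    "keywords": [
        {"term": "widget pricing", "rank": 2, "searches": 1000},
        {"term": "cheap widgets", "rank": 4, "searches": 500},
    ],
}

# Step 3: estimate traffic per keyword from rank and CTR.
for kw in page["keywords"]:
    kw["est_visits"] = kw["searches"] * CTR_BY_RANK[kw["rank"]]

# Step 4: apply the page's observed conversion rate to each keyword's
# estimated traffic to put a lead estimate on every keyword.
total_visits = sum(kw["est_visits"] for kw in page["keywords"])
page_conversion_rate = page["monthly_leads"] / total_visits
for kw in page["keywords"]:
    kw["est_leads"] = kw["est_visits"] * page_conversion_rate
```

The result is only a proxy (it assumes every keyword converts at the page's overall rate), but it restores a keyword-level view that "not provided" otherwise removes.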


Avoid a Knowledge Graph-tastrophe

posted by Pete Prestipino @ 1:00 PM
Tuesday, November 27, 2012

The search experience at Google has changed dramatically over the past few years.

While the factors used today by the search engine to determine placement on its results pages aim to deliver an enriched user experience, they have made the process of natural or organic search optimization an infinitely more complicated and challenging undertaking for digital marketers.

What Is Google Doing to Us?

When Google announced its Knowledge Graph initiative back in May 2012, search engine optimization professionals (SEOs) across the Web started to worry. They worried that the top organic listings they worked so diligently to establish would be pushed down the results page. They also worried that big brands would ultimately not just control more virtual real estate, but also, in many respects, completely dominate and overshadow their smaller (and perhaps less savvy) competitors. For some, that’s already happening.

But the situation is getting worse. In early Oct. 2012, Google formally released its Product Listing Ads (read more on page 36) and the very next month “spiffed” up its search result pages with a new navigation bar. Both of these announcements, you guessed it, pushed those coveted first-page listings even farther down the page (sometimes below the fold or out of view completely). It is almost as if Google has a virtual vendetta against anyone who ever optimized their site for higher placement on search results.

If Google’s recent set of changes has your SEO team flustered and fearing for the long-term sustainability of your digital enterprise, know that understanding what the Knowledge Graph is and how it works can provide some virtual relief.

Understanding the Knowledge Graph

From a high level, the Knowledge Graph is Google’s attempt to provide information directly, rather than websites alone. As you might imagine, the “knowledge” can come in a variety of forms (from a one-line answer to a micro-biography or an organizational profile). These developments are intended to make search more intelligent and accessible.

With information on hundreds of millions of people, places and things, Google’s Knowledge Graph initiative isn’t just about aggregating, indexing and ranking data, but is also about taking action upon the attributes of data that enables Google to better understand connections and make the information more useful and meaningful to an end-user. For example, if a user searches for “things to do in Chicago,” they will encounter an image carousel that consistently sources information from Wikipedia, an authoritative, well-cited ’Net destination, and contains images of popular destinations in Chicago (see image).

Understand that, today, content needs to be correlated, connected and shared by authorities to top the search results pages.


Solutions for Search Salvation

Every enterprise on the Web is concerned with the constantly shifting state of SEO best practices. It is being transformed right before your digital-loving eyes and in significant, business-altering ways. The strategies employed today are wildly different than those used even a few short years ago. Fortunately, it is possible to, at least in part, avoid a Knowledge Graph-tastrophe. Consider future-proofing your SEO with the following strategies:

Google+ Correlation:

The path toward avoiding a Knowledge Graph-tastrophe moves squarely through Google+. Digital marketing and analytics vendor Fathom recently conducted a study that examined the top-100 brands as defined by Millward Brown’s 2012 BrandZ List. What was found may surprise you. Only 24 percent of the top-100 brands examined triggered a Knowledge Graph entry when searched, and the information for 92 percent of those entries came directly from Google+ (the remainder came from Wikipedia). You’ll be hard-pressed to find more convincing evidence that creating a Google+ page is fundamental to future-proofing your SEO campaigns.

MicroData Inclusion:

If there is one area where Google needs the help and support of SEOs in relation to its Knowledge Graph, it is the use of microdata. Remember, Google’s aspiration is to evolve from a search engine into a knowledge engine, and it can’t do that with sparse levels of data. Google needs Web marketers, designers/developers and, of course, SEOs to use more structured markup, as it enables the knowledge engine to better correlate a Web page to the intent of a user/searcher. This, in turn, makes it easy for Google to determine relevance. To stay alive on the SERPs in 2013 and beyond, you’ll need to support more types of microdata on your digital property, including types for reviews, people, products, events and, yes, even authorship markup.
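As a concrete illustration, here is what schema.org microdata looks like on a hypothetical product page (the product name, rating and price are placeholders):

```html
<!-- A minimal schema.org Product snippet marked up with microdata. -->
<div itemscope itemtype="http://schema.org/Product">
  <h2 itemprop="name">Acme Widget</h2>
  <img itemprop="image" src="/img/widget.jpg" alt="Acme Widget">
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.2</span>/5
    from <span itemprop="reviewCount">87</span> reviews
  </div>
  <span itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    $<span itemprop="price">19.99</span>
  </span>
</div>
```

The `itemscope`/`itemtype` attributes declare what the block is about, and each `itemprop` ties a visible value to a machine-readable property the engine can lift into its knowledge base.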


“The (SEO) strategies employed today are wildly different than those used even a few short years ago. Fortunately, it is possible to, at least in part, avoid a Knowledge Graph-tastrophe.”


Asset Universality:

Let’s hope you’re not just using traditional “content” in your SEO efforts. If there is anything that Google’s SERP shuffles over the past 18 months have taught SEOs, it is that using a variety of assets (blogs, videos, places, books, etc.) proves incredibly useful in securing search positions. One of the more feasible strategies is something we’ve discussed previously within the pages of Website Magazine’s Mastering Search column — re-optimization or content re-purposing. For example, if you have spent resources (time or money) on blogging over the past few years, revisit that content and determine ways to transform it into videos or infographics.

Authoritative Citations:

Perhaps more than any other strategy referenced here, the acquisition of authoritative citations remains the single most important factor in generating competitive positions on the SERPs. All data on the Web is inter-linked and has relationships, and these signals establish your Web presence. How well you refer to other sites with relevant content, and how many sites refer back to your content — considering it a valuable resource — determine the inter-connectivity of data and documents. Social signals, such as shares, likes and +1s, further quantify this inter-connectivity.

Avoid a Knowledge Graph-tastrophe

From Penguin to Panda and every index refresh in between and after, you’ve likely felt the impact of Google’s seemingly constant algorithm changes in the recent past. As the search engine refocuses its energy and resources on providing an improved experience for end-users, SEOs may suffer the most. To avoid a Knowledge Graph-tastrophe, focus on more and better Google+ participation. You should also accelerate the integration of microdata, expand the types and depth of content and creative assets and pursue the most authoritative inbound links possible.


Google Will Rank Shorter Content…If It’s Good

posted by Michael Garrity @ 8:00 AM
Tuesday, November 27, 2012

Do you find yourself spending a lot of time trying to pad out your Web pages with more words just so they will be ranked by Google or other search engines? Well, you might be glad to know you can stop that practice.

A common misconception about search engine optimization (SEO) is that a page must have at least 500 words (or some other arbitrary number) in order to even be considered to rank on the search engine results pages (SERPs), but Google’s John Mueller has recently stepped in to squash this popular, albeit incorrect, theory.

Mueller’s testimony comes from a thread on Google’s Webmaster Help forum entitled, “Is Short Content = Thin Content?” The Google employee stepped in to assure users that “Googlebot doesn’t just count words on a page or in an article.”

According to his post, Google’s focus is on finding and sharing “useful & compelling” content, which even shorter articles or bursts of content (such as tweets) are able to provide. This means that there isn’t a specific number of words or characters that automatically qualifies a Web page for ranking consideration; what matters is quality content.

That being said, one way to help get shorter articles noticed is to use them to generate discussions, as allowing users to share comments on an article is an easy way to include additional information on a page that doesn’t actually require any extra work. On occasion, this can be especially useful because “sometimes users are looking for discussions like that in search.”

Mueller wrapped up his post by reiterating that the best way to get ranked is to create truly unique, high-quality content, rather than material that is simply rewritten or autogenerated.


Get to Know Bing’s Webmaster Guidelines

posted by Michael Garrity @ 1:45 PM
Tuesday, November 20, 2012

The big news in the search engine world this week has been the release of Bing’s Webmaster Guidelines (finally!), which are meant to help webmasters and publishers get their content found and indexed on the Microsoft-owned search engine.

Slowly but surely, Bing has been chipping away at Google’s long-standing dominance in the search engine market, and with the release of these guidelines, the company has taken its biggest shot yet by positioning itself as a worthy alternative to Google that webmasters can get behind.

In many ways, the initial success of Google that led to it becoming the global and cultural phenomenon that it is now was due in large part to its willingness to cater to webmasters right out of the gate by providing them with useful information, tools and programs that they could use to develop and monetize their sites.

Until now, Bing has been slower to fully come around to webmasters, although the search engine has had a set of Webmaster Tools available for some time, which proved to be a good first stop in ingratiating itself with webmasters. With the release of the new Bing Webmaster Guidelines, the company is committing to being a webmaster-friendly service that puts their needs top-of-mind, and although the Guidelines aren’t much different from Google’s, they do provide a great refresher on how Bing crawls, ranks and indexes sites.

Let’s take a look:


Content

First and foremost, the job of a webmaster is to publish content and get it noticed by the search engines, which are, for lack of a better term, too dumb to notice it on their own. Webmasters must provide “clear, deep, easy to find content” on their sites, which makes it more likely to be found, and ultimately indexed and shown in search results. Obviously this includes Web pages, but also images, white papers and videos, among other types. Rich, content-heavy sites that engage users and provide valuable (and evergreen) information are always the most sought after by search engines, and Bing is no exception.


Links

Gathering useful, authoritative links is one of the primary purposes of SEO, as they help Bing (and other search engines) “find new content and establish a vote of confidence between websites.” As part of the Guidelines, Bing offers a resource section on link building that explicitly states that the search engine prefers organic links and says that the best way to build links is by offering unique and engaging content, enabling social sharing and providing copy-and-paste code snippets for visitors.


Social

With Google pushing for personalized search to become the new norm, the social aspect of Bing’s indexing and ranking is now one of the biggest considerations webmasters must make, largely because of the Microsoft/Facebook partnership. That partnership integrates the world’s largest social network with Bing, surfacing a visitor’s Facebook friends in their search results to help influence them and providing positive signals that can impact organic rankings. Specifically, Bing frequently points to social sharing signals as a major way that the search engine measures influence. And since Facebook has significantly more users than Google+, Bing may very well become the standard for socially influenced search results.


Indexation

Bing explains that the best ways to be indexed by the search engine (and thus appear in SERPs) are to either link your content to Bing or use features that come with the Bing Webmaster Tools (e.g. Submit URL or Sitemap Upload) to make it aware of your content. The Webmaster Tools also include a Crawl Control feature that lets you manage how Bing’s bot crawls your content and control when and how quickly your site is crawled (hint: Bing encourages you to let it crawl quickly and deeply).


Technical

This section of the Bing Webmaster Guidelines examines a variety of structural aspects of a website that can affect how it is crawled, indexed and eventually ranked on Bing’s SERPs. These include page load time, robots.txt, sitemaps, site technology, redirects and canonical tags.
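Two of those technical items, robots.txt and the sitemap, take only a few lines to get right. A minimal sketch, with placeholder paths and domain:

```text
# robots.txt -- keep crawlers out of thin or duplicate sections
# and point them at the XML sitemap.
User-agent: *
Disallow: /search/
Disallow: /print/

Sitemap: http://www.example.com/sitemap.xml
```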

Search Engine Optimization

The Guidelines directly distinguish the main areas that webmasters should focus on when optimizing their websites, which include title tags, meta description tags, alt tags, <h1> tags, internal links, outbound links, social sharing, “crawlability” (i.e. XML sitemaps, robots.txt, navigational structure, URL structure, etc.), site structure (i.e. links, clean URLs, content hierarchy, etc.) and rich media warnings.

It also breaks down specific on-page SEO information about head copy, body copy, anchor text, content and links. For instance, it suggests using unique, relevant titles and descriptions that are around 65 and 160 characters, respectively. The Guidelines also inform webmasters that they should use only one <h1> tag per page, base their content on keyword research, avoid using images to house content, use targeted keywords as anchor text to support other internal pages, use rel="canonical" tags to help engines understand which pages they should index and much more.
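To make those head-copy limits concrete, here is a small Python sketch (the limits come from the guideline figures above; the sample title and description are invented) that flags over-long titles and meta descriptions:

```python
# Suggested head-copy lengths from the guidelines above.
TITLE_LIMIT = 65
DESCRIPTION_LIMIT = 160

def check_head_copy(title, description):
    """Return a list of human-readable warnings for over-long head copy."""
    warnings = []
    if len(title) > TITLE_LIMIT:
        warnings.append(f"title is {len(title)} chars (limit {TITLE_LIMIT})")
    if len(description) > DESCRIPTION_LIMIT:
        warnings.append(
            f"description is {len(description)} chars (limit {DESCRIPTION_LIMIT})"
        )
    return warnings

issues = check_head_copy(
    "Acme Widgets | Hand-Built Widgets, Shipped Worldwide",
    "Acme builds durable widgets for the home and office. " * 4,
)
# The repeated description blows past 160 characters, so it gets flagged.
```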

Tactics to Avoid

Presumably just to be nice, the Bing Webmaster Guidelines also spell out a handful of tactics that webmasters should avoid to keep from being penalized by the search engine in a way that would negatively affect their rankings and standing. In particular, the Guidelines warn against cloaking, link building schemes (i.e. link farms, three-way linking, etc.), meta refresh redirects, duplicate content and social media schemes.

While the suggestions laid out in the Bing Webmaster Guidelines aren’t exactly groundbreaking, they do provide a great template and reference source for webmasters as they work to optimize their sites for search engines, particularly if they want to improve their Bing rankings. And since the Bing-Yahoo Network currently accounts for approximately 30 percent of the search market, it’s certainly not something they’re going to want to blow off.


Comments Off

SEO in Action: Optimizing White Papers

posted by Pete Prestipino @ 12:35 PM
Tuesday, November 20, 2012

As search engines expand the range of informational assets they crawl, index and rank, search engine optimization professionals must develop and optimize that content accordingly.

One of the most popular forms of content among B2B brands has traditionally been the white paper, and if you know what's good for your bottom-line website traffic numbers, you'll not just be creating these assets but optimizing them too. If you're not, let's put SEO in action with these tips for optimizing white papers, PDF-based reports and, of course, ebooks.

Research Audience Personas & Keywords: Starting with keyword research is certainly one way to ensure your white papers eventually get discovered, but consider reverse engineering the process and starting with the ideal audience in mind instead. Who are your prospective readers (e.g. prospects or clients), what will they find of value (based on data from a variety of feedback channels), and what terms will they use in their queries to find it? Ultimately, all this information will be useful when producing the content as well as when it is distributed.

Develop & Optimize: When the research is complete, it’s time to get started developing and optimizing. Fortunately, white papers are much like websites in that they share many of the same optimization opportunities – from the file name and document titles to the text itself. SEOs also have the opportunity to embed keyword-based anchor text on links and optimize images with alt tags. Another opportunity to squeeze in a few more keywords into a document is within the headers and footers – each should have a link back to relevant content on a website.
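The file-name opportunity mentioned above is easy to act on. Here's a small sketch (the example titles are hypothetical) that turns a white paper's working title into a keyword-bearing, search-friendly file name:

```python
import re

def seo_filename(title, extension="pdf"):
    """Lowercase a document title, strip punctuation and hyphenate it
    so the target keywords survive in the file name and URL."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    return f"{slug}.{extension}"
```

For example, `seo_filename("2013 B2B Email Marketing Benchmarks")` yields `2013-b2b-email-marketing-benchmarks.pdf`, which carries its keywords everywhere the file is linked or shared.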

Distribution: Before actually distributing, it's important to have not just the content but the meta content – the information that is used to market the product, service or, in this case, white paper. Search engine optimization professionals need clear, concise, keyword-rich executive summaries, and they need landing pages. The keyword research and audience personas again prove useful in this phase. Once these creative assets are in place, it's time to start distributing. Social media posts, media pitches, email newsletters, press releases, infographics and more are opportunities to amplify awareness, so hopefully you've made good decisions when it comes to research and development.

Comments Off

Hashtags, Twitter Trends and Social SEO

posted by Pete Prestipino @ 11:45 AM
Tuesday, November 20, 2012

The role that social media now plays in successful search engine optimization campaigns is immense.

The more pronounced a social media profile, the greater the likelihood that a brand will make its way onto search results pages. The challenge of course is how to amplify awareness, while maintaining some semblance of readability, accessibility and well, fun, with social updates. One of the best ways right now is to use hashtags.

Back in October of this year, Dan Wilkerson of LunaMetrics published a study about the impact of hashtag use on reach, and the results were mind-blowing, to say the least. According to the report, using two hashtags per tweet resulted in 87 percent more retweets and 255 percent more mentions. That alone should get you to include a couple of hashtags (without looking spammy, of course) in your next tweet.

But what hashtags work best? Well, that depends on your vertical, obviously, but what's more important is understanding which specific hashtags are trending and when. For that, it is essential to turn to one of the many resources available for the explicit purpose of researching hashtag data and emerging trends on Twitter. Some of my favorites include:

- Twitter itself: If you're quick to post a status update or tweet, the only resource you really need is Twitter. The platform provides its top ten trending hashtags, and a quick search on Twitter can also yield some interesting insights.
- One service, while accessing just 1 percent of total tweets, does provide some interesting historical data. Check out both the popular hashtags (those that you may want to include on every post) as well as the trending hashtags, which are updated frequently.
- WhattheTrend: The real-time Web presents an excellent opportunity to discover what's trending and when. WhattheTrend updates its list constantly, and it can be filtered to show verified definitions of individual hashtags (and undefined ones) so you know what you'll be tweeting. The service even provides a "trends by location" feature which shows what hashtags are trending around the world.

Another fun way to research hashtags by location is TrendsMap.

Comments Off

Desktop Search On The Rise

posted by Allison Howen @ 8:05 AM
Tuesday, November 20, 2012

Despite mobile’s momentum, desktop search activity has been on the rise.

In fact, a recent comScore study reveals that October's overall desktop search activity saw an increase of 8 percent compared to the previous month – with a total of more than 17.6 billion searches conducted.

Other statistics show that Google gained two-tenths of a percentage point and continues to lead the search industry with 66.9 percent of the market share. However, Microsoft also gained a tenth of a percentage point and now makes up 16 percent of the search market share, followed by Yahoo at 12.2 percent.

This means that even though many Web workers have focused on mobile lately, desktops should still not be overlooked – especially during the holiday season. In the coming weeks, consumers will be conducting a lot of searches, both on mobile devices and on desktops, and the only way to ensure your business is found is by continuously monitoring your brand's placement on the SERPs. For those Web businesses that need a little help doing so, check out Website Magazine's Master List of SEO Tools.




Comments Off

1&1 Shines Spotlight on SEO

posted by Allison Howen @ 3:00 PM
Thursday, October 18, 2012

If your company is having a hard time gaining visibility in the search engines, you might want to check out the new SEO solution from 1&1.

The popular Web hosting company developed and launched SEO Spotlight to help companies with a limited marketing budget acquire visibility in the search results. The solution acts as an interactive advisor and includes a keyword generator that automatically assembles the best search terms for a website.

The new SEO tool also offers tips for back-link strategies and provides support for creating a Google Places entry. Additionally, SEO Spotlight can be used to keep an eye on the competition – users simply need to submit a competitor’s Internet address into the system in order to monitor the competition’s rankings in relation to their own.

"Prior to the development of 1&1 SEO Spotlight, there has been extensive market research," said 1&1 CEO Oliver Mauss. "We found out that most solutions miss the mark on the needs of many small and medium businesses. With our new SEO tool, for the first time there is an offer that fully considers the requirements of beginners and can be used without programming knowledge."

SEO Spotlight is included in the 1&1 MyWebsite package, but can also be purchased starting at $9.99 per month with all 1&1 shared Web hosting, eShop, Dedicated Server, Dynamic Cloud Server or other MyWebsite packages. Furthermore, the company is currently offering a free 30-day trial period for the new tool.

Comments Off

Journey to the Top of the SERPs

posted by Michael Garrity @ 1:00 PM
Tuesday, October 16, 2012

Have you ever wondered why everyone seems to make such a big deal out of search engine optimization (SEO) and reaching that coveted top spot in the search engine results pages (SERPs)?

Well then, new data from Compete may just enlighten you, as a recent study of tens of millions of consumer-generated SERPs from Q4 of 2011 shows us just how massive the difference between a first and second place listing can be to a website.

According to the data, approximately 85 percent of all the listings shown are organic, with 15 percent appearing as paid search listings; they also found that 55 percent of all SERPs have ads.

But the more telling information comes from looking at the clicks on organic results. Compete says that 53 percent of those clicks go to the top organic listing, and the number decreases significantly from there. The second link gets 15 percent of the clicks, while the third gets nine percent, the fourth sees six percent and the fifth is clicked on just four percent of the time.
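Those percentages translate directly into traffic estimates. Here's a quick sketch using the organic click shares reported above; the 10,000-searches-per-month query is a hypothetical example:

```python
# Compete's reported share of organic clicks by SERP position
organic_ctr = {1: 0.53, 2: 0.15, 3: 0.09, 4: 0.06, 5: 0.04}

def expected_clicks(monthly_searches, position):
    """Estimate monthly organic clicks for a listing at a given position."""
    return round(monthly_searches * organic_ctr[position])

# Moving from position 3 to position 1 on a 10,000-search/month query:
gain = expected_clicks(10_000, 1) - expected_clicks(10_000, 3)  # 5300 - 900
```

On those numbers, climbing from third to first is worth roughly 4,400 extra visits a month from a single query – which is exactly why the top spot is such a big deal.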

Being on top is beneficial for paid listings as well, though having the number one spot in the ad block is less important than simply appearing in the block at the top of the page. The study shows that about 61 percent of paid search ads show up on the right-hand sidebar, but they account for only 13 percent of paid search clicks, with the top right-hand listing getting just four percent of all paid search clicks. In contrast, the third listing in the top ad block receives around nine percent of all paid search clicks; in fact, the ads in the top block see an impressive 59 percent of all paid clicks.

The report also shows that around 24 percent of the ads on a SERP appear at the top and make up an impressive 85 percent of the clicks, while 15 percent of them appear at the bottom of the page and get a paltry two percent of the clicks.

In other words, this report wants you to know that it pays off (big time) to invest in getting your site at the top of the SERPs. But as we all know, that's easier said than done.

Just a few weeks ago, Google released a list of 65 new changes to its search engine algorithm from August and September, many of which may have direct implications for how websites and pages will be able to move up in the SERPs. Here's a look at six of the most important updates; webmasters and SEOs take note.

The Role of Authority

One of the updates from August focuses on page quality and intends to help searchers find "more high-quality content" from sources that the search engine trusts. This means that building that kind of trust is crucial to improving your search ranking. You can do this by receiving (and giving) links from trusted and authoritative sites, creating a sitemap, lowering your bounce rate and not "over-optimizing" your content.

Matters of Location

Two updates now look more at location to determine a site's place in the SERPs. The first determines the relevancy of pages for queries containing locations, and the other, "nearby," change improves the precision and coverage of relevant local Web results, helping Google better identify results that are localized for each user (and rank them appropriately). To reach these local searchers, optimize your site for local search by registering with a search engine's local business registry, getting links from local directories, developing location-specific landing pages and adding geographic keywords into landing page content, <h1> and <h2> tags and metadata.

The Return of Keyword Density

The changes also included an update to term-proximity scoring, which means it would behoove you to improve the keyword density on your site. You can do this by inserting keywords into page titles, headlines, ALT text, page URLs, rich footers and, of course, your copy.
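Keyword density itself is just occurrences divided by total words. Here's a minimal sketch; matching is naive single-word comparison, whereas real tools handle phrases, stemming and stop words:

```python
import re

def keyword_density(text, keyword):
    """Return occurrences of `keyword` per 100 words of `text`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words) if words else 0.0
```

Running this across your page copy, titles and ALT text gives a quick read on which keywords are over- or under-represented before you start editing.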

The Importance of Freshness

Being fresh is crucial, and the last two months saw a number of changes aimed at rewarding sites with fresh content. Now, Google will apply a more granular function based on document age, and it also seeks out and favors the latest content from a given site when two or more documents from the same domain are relevant to a search query. So make sure you're continually adding fresh content to your site, and if your busy schedule makes that difficult, don't be afraid to ask for guest contributors or even press releases that you can publish. Anything to keep your site from just sitting there getting stale.

Title Tags in Focus

It's difficult for Google to generate preview snippets for pages that it does not crawl because of robots.txt, so now a replacement snippet will be included that explains that there is no preview because of robots.txt. To prevent this, include titles and descriptions in the metadata of your Web pages.
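You can verify whether a URL is blocked by robots.txt – and therefore at risk of getting the bare replacement snippet – with Python's standard-library parser. The rules and URLs below are purely illustrative:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks a /private/ section
rules = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Blocked pages can only get the generic "no preview" snippet
blocked = not rp.can_fetch("Googlebot", "https://example.com/private/report.html")
allowed = rp.can_fetch("Googlebot", "https://example.com/public/page.html")
```

Sweeping your important landing pages through a check like this catches accidental robots.txt blocks before they cost you a proper snippet in the SERPs.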

Universal Intent

In addition to all of this, Google can now show improved Universal Search results through a better understanding of when a search has strong image intent, local intent, video intent, etc.

Take these key updates into consideration as you continue to optimize your site for the search engines (and maybe take a look at the other 59 changes, as well), and before you know it, you'll be movin' on up the SERPs.

Comments Off

Exact Match Domains Clipped by Google

posted by Pete Prestipino @ 10:16 AM
Tuesday, October 16, 2012

Unless you’ve been living under the proverbial rock the past few weeks, you are likely well aware that Google’s most recent algorithm update caught quite a few Web workers (and domainers) off guard.

What was interesting about this update in particular is that it targeted exact match domains specifically (some would argue that partial match domains were affected as well). Google’s Matt Cutts announced the “minor weather report” on Twitter just before the update actually took place the final weekend of September 2012. The update, which affected approximately 0.6% of all queries to a noticeable degree, took several days to roll out but we can safely assume that the effects will be long lasting.

According to Cutts, the change targeted “low-quality” exact match domains (EMD) – those domains that carry the keyword or phrase that users queried. As you might imagine, the algo update has left many (as usual) scratching their digital heads to find an answer to what “low-quality” actually means (again) as several sites, many with a long history and decent link portfolios, seemed to have been swept up in the rollout.

With some confidence, and I say this only after giving Google a week or two to fine-tune its changes, we can say that the use of exact match domains now carries far less weight than it used to. What do you think? Time to switch domains to a more "brandable" option? What will this do to the upcoming release of additional gTLDs? Share your thoughts by commenting below.


Comments Off

Better SEO Through Content Marketing

posted by admin @ 9:35 AM
Tuesday, September 18, 2012

By Scott Fasser, Director of Customer Experience at Optify

Search Engine Optimization (SEO) and Content Marketing are two critical components of inbound marketing, and they have moved beyond siloed visibility tactics into an integrated strategy for driving high-quality visitors and prospects to your website. Both shape a company's core messaging and dramatically affect how those messages are portrayed, bringing in the right visitors with the highest chance of becoming customers.

Your content marketing strategy will provide the biggest boost to your organic search visibility if you follow these four rules:

1) Know your key personas - intimately
2) Write to all stages of the buy cycle
3) Talk outside-in vs. inside-out
4) Write in the same direction

Know Your Key Personas – Intimately

We all know how important it is to know your customers, core audience and key personas. This insight influences product development, customer service, sales close rates and marketing. It applies directly to the content you create – especially for B2B companies, which usually have a limited content creation budget (time or dollars). The clearer the picture of your personas is, the more targeted your content creation will be, resulting in higher-quality visitors who are closer to your ideal prospect.

In order to make the persona work actionable for your keyword and content strategy, we answer the following questions for each persona:

• What are the main types of business problems the persona typically needs to solve?
• How does your product or service provide solutions to these problems?
• What are some specific tasks the searcher wants to accomplish?
• What are some sample search queries the persona might use?
• What can the site provide that will cause the searcher to accomplish this task?
• What is your business goal for the visitor? Lead gen? Newsletter sign-up? Demo?
• How will the searcher be motivated to complete this business goal? i.e., what’s the offer or incentive? What is the call-to-action?

With this intimate knowledge of your audience in hand, you can craft more relevant keyword and content topics. The following is an example of the matrix we build for each key persona to help develop keyword and content strategies:
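The matrix itself can be pictured as a simple mapping from each persona to the answers above. Everything in this sketch – the persona, problems, queries and offers – is hypothetical, just to show the shape:

```python
# Hypothetical persona-to-keyword matrix in the shape the article describes
persona_matrix = {
    "IT Manager (hypothetical persona)": {
        "business_problem": "too many manual deployments",
        "our_solution": "automated release pipeline",
        "searcher_task": "compare deployment automation tools",
        "sample_queries": ["deployment automation software",
                           "ci cd tools comparison"],
        "site_offer": "vendor comparison whitepaper",
        "business_goal": "lead gen (demo request)",
        "call_to_action": "Download the comparison guide",
    },
}

def content_topics(matrix):
    """Derive deduplicated candidate content topics from sample queries."""
    return sorted({q for persona in matrix.values()
                     for q in persona["sample_queries"]})
```

Collapsing every persona's sample queries into a deduplicated topic list like this is one way to seed the editorial calendar discussed later in the piece.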


Write to All Stages of the Buy Cycle

A lot of B2B websites build content around their brand and products/services. Obviously, this is super important once people discover your brand, as it moves prospects toward a lead and a sale. However, this focus leaves out the much larger potential audience (your key personas): those who don't know your brand and are looking to solve a problem, looking for companies in a category or comparing other brands for consideration.

Here are the four main stages of the buy cycle to build content for and the appropriate types of format for each:

Today's B2B websites must get out of the role of simply a glorified brochure for the company's products and into the role of authority in their industry. This is the best way to achieve organic visibility success. Writing to all stages of the buy cycle helps you achieve this.


Talk Outside-In vs. Inside-Out

How you talk about yourself and market yourself dramatically impacts how well you are found via organic channels – especially SEO. If your website is driven by a brand perspective that invents new phrases, unique to your communication, to describe what you do, you are not creating true differentiation in your product; you are just creating new words for something that prospects don't understand. If you have a large marketing budget to create searches for these new words, great. But most companies – especially smaller companies – don't have this luxury.

We see examples of this tension between what brand marketers tend to want – unique concepts to describe their positioning – versus what direct marketers tend to want – clear language that speaks to the category and to specific solutions that are high-traffic search terms – at all types and sizes of companies. Here are two examples:

A major marketing automation company has positioned themselves as a provider of “Revenue Performance Management” software. This term could mean many different things to different functional perspectives, but the core term for this category of service is marketing automation. “Revenue performance management” has about 590 searches in Google in North America per month while “marketing automation” has 14,800. This tells us that marketing automation is a better known term and more people are looking for this type of solution than “revenue performance management.”

Another recent example is an agency that describes its custom content management system as a "publishing strategy" capability. It ranks very well for the phrase "publishing strategy" but has had zero visits from it, because very few people are searching for that phrase, whereas content marketing is a very hot topic right now, has a much higher number of searches and fits the agency's core capabilities very well.

The lesson here is to review your current and future messaging from the point of view of a persona that does not know your brand, focus on true differentiation/value proposition and create content that they will understand without needing an explanation. Finding the balance between pushing new concepts and terms vs. serving the market where it exists today is an important input into your content marketing planning.

Write in the Same Direction

Content creation and SEO are no longer limited to the marketing department and copywriters. Blogs, social media, press releases, video, podcasting and more have created a plethora of ways to easily publish content to your company's site and other industry-related sites. Profiles on Facebook, LinkedIn, Twitter, SlideShare, Pinterest and Google+ mean that there are MORE places to fill with content. All of this communication impacts your brand and visibility – positively or negatively.

One of the most important things you can do to amplify your content strategy is to get as many people in your organization as possible to understand how their work can impact the SEO program and what they can contribute. Giving them the education and the game plan for key messaging and keywords reinforces the central promise of your business.

Here is the approach we have found successful:

• Get executive sponsorship to back your SEO/Content initiative. Without a strong top-level executive supporting this process, you will run into major barriers. Assign someone in marketing as the leader of the effort.
• Assemble a tiger team of people from different parts of the organization including: Marketing, Editors, Web Development, Social, PR, Sales and Executive. Include them in the persona development, keyword recommendations and content strategy development process.
• Build a list of blog, article, whitepaper and webinar topics that align with the keyword strategy. Create an editorial calendar that aligns the proposed topics with keywords. The team leader should manage this list.
• Open up the opportunity to become an official company blogger to the greater company. There are many people in your organization who are currently writing or would like to write and boost their own profile. Each one of these individuals also has their own social network that they can publish to when they write.
• Establish baseline metrics around organic visits, engagement, leads/business goals by source, size of social networks, individual story syndication and so on. This topic alone is an entire whitepaper!
• Meet with the greater blog team monthly, review results and celebrate those who have seen the most engagement and syndication (we give out Amazon gift certificates).
• Rinse, lather and repeat. Organic visibility through content and SEO must become part of your company DNA to be successful – it is NOT a one-and-done project.


It’s a Marathon, Not a Sprint

Changing the core messaging and approach to content creation is not easy. The larger the organization, the more challenging it is to get approval and buy-in to build a sustainable content marketing program that will significantly move the needle in SEO. However, once the process is defined and rolling and you start to see the results, this type of marketing will be the foundation for sustaining and growing your business for years to come.

About the Author: Scott Fasser, VP of Customer Experience at Optify, has over 20 years of experience launching products in diverse industries such as online marketing, digital media, e-commerce and electronic entertainment at companies such as Optify, RealNetworks, Domain Strategies, Amazon, Avenue A and Sierra On-Line.

Comments Off

Search Marketing Supercharged

posted by admin @ 11:30 AM
Friday, September 14, 2012

Believe it or not, the Web is made up of math - lots and lots of math. And, by looking more closely at the math, we can learn a lot about how a website generates revenue and where there is hidden value. 

The concept is really pretty simple. For example, Acme Website spends money to build a product or service, launches and hopes that people come to buy/use it. Ultimately, the site wants to earn more than it costs to build and operate. However, after months praying to the traffic gods to bless its site with users, and cycling through a few marketing campaigns and PR blitzes, Acme realizes it needs a little help. That’s when companies like Acme typically dip into an "Audience Development" budget to start buying traffic through things like Search Engine Optimization [SEO] and Search Engine Marketing [SEM].

Before you stop reading, let me preface my previous statement with this: one might argue that SEO isn't "buying" traffic, but I believe it is. You're essentially paying an employee, consultant or company to achieve high search engine rankings that drive organic search visitors. SEM, by contrast, is an easier mental exercise, with website publishers simply paying for clicks to have people visit their site. Wash, rinse, repeat until the desired number of views is achieved. The math in this equation is simple: as long as websites make more money from each user's visit than it costs to get them there, they are in the clear (well, mostly).
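That break-even math can be written down directly. Here's a sketch with made-up numbers:

```python
def traffic_roi(visits, revenue_per_visit, cost_of_program):
    """Profit (or loss) from a traffic-acquisition program, SEO or SEM."""
    return visits * revenue_per_visit - cost_of_program

# Hypothetical month: 40,000 acquired visits at $0.12 revenue per visit,
# against a $4,000 SEM budget -> $4,800 revenue on a $4,000 spend
profit = traffic_roi(40_000, 0.12, 4_000)
```

As long as `profit` stays positive, the wash-rinse-repeat cycle keeps paying for itself; the moment cost per visit outruns revenue per visit, it doesn't.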

This cycle must be working because every year, billions of dollars are spent on SEO and SEM. The vast majority of these dollars are paid to SEO companies and consultants, in-house teams, search engines and traffic marketplaces. Simple economics would state that as long as the revenue per page view or profit per action is greater than the cost, the cycle continues.  But why stop there?

By coupling SEO and SEM strategies with other ways to increase pages per visit, website owners can start thinking beyond the race to break-even. Let’s start by looking at what has arguably become the most overlooked and under appreciated element of a web page: text links. 

Text Links – Overlooked and Under Appreciated

When most heat maps of Web pages are analyzed, they reveal the not-so-surprising fact that users still focus on content. The facts speak for themselves. At LinkSmart, we know this to be true because we’ve seen click-through rates on text links placed in articles reach 10 to 20 times the rate seen on other Web page elements, such as banner ads, related content widgets and side-rail elements. 

But, despite the draw and popularity of content, many website operators don’t think about supercharging content with text links to boost engagement. It starts with a conscious understanding of the benefits, and developing a text linking optimization strategy to more efficiently manage and shape traffic flows through in-article text links. Every additional page view or visitor directed to a high-value location positively impacts the ROI of traffic generation programs, especially SEO and SEM.  

Building out a text link optimization strategy can be difficult, and finding the right levers to pull on a website can be time consuming and expensive. With this challenge in mind, here are three things to keep in mind before getting started. 

Links to Content – Finding the Right Ratio: Optimizing content isn't about cramming as many hyperlinks into a page as possible. It's about mastering the art of balancing relevant phrases and keyword text links with content that people naturally find interesting. Over-hyperlinking will actually hinder interaction and the desire to read or click on a certain page or piece of content. Each site, content type and page has an optimal text-link structure that will lead to the highest average click-through rates. It's really about customization and finding what works best for a particular audience. Again, pay attention to the math driving your site.

Engagement is Nirvana: Web publishers have learned that it's not just about the number of people that visit a site or page; it's about the time and engagement each visitor has with it. Deeper engagement remains critical to revenue generation, and not all traffic is created equal. With organic search visitors averaging only two pages per visit, text linking that lifts this metric by even one or two pages can decrease the cost per visit dramatically and increase profit margins on investments like SEO and SEM.
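The cost claim here is simple division: the more pages a paid-for visit yields, the cheaper each page view becomes. A sketch with hypothetical numbers:

```python
def cost_per_pageview(cost_per_visit, pages_per_visit):
    """Effective acquisition cost of each page view within a visit."""
    return cost_per_visit / pages_per_visit

# A $0.50 visit that reads 2 pages vs. the same visit nudged to 3 pages
before = cost_per_pageview(0.50, 2)  # $0.25 per page view
after = cost_per_pageview(0.50, 3)   # about $0.17 per page view
```

Going from two pages per visit to three cuts the effective cost of each page view by a third without buying a single extra visit, which is the whole argument for in-article text links.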

Refreshing High Performance Pages: By linking to high-performance/value pages throughout their content, publishers can drive users back to these overlooked pages by keeping them relevant, both in context and on search engine results pages. Directing readers to high-performance pages is a dependable way to increase yield and boost traffic numbers and SEO. 

Optimizing website pages with relevant links isn't just a great way to boost a publisher's return on their SEO or SEM investments; it's a great way to better serve readers with relevant content that will keep them coming back for more.

About the Author:

Pete Sheinbaum is the Founder and CEO of LinkSmart, a provider of text linking optimization solutions for web publishers. He is also the former CEO of DailyCandy, Inc. 

Comments Off

Winning Strategies for Search Marketing

posted by admin @ 9:15 AM
Thursday, September 6, 2012

In today's global marketplace, a winning online media strategy means remaining aware of the wide array of tools available to savvy business professionals and entrepreneurs looking to expand their Web presence. Digital marketing and social media advertising have become even more popular with the ubiquitous adoption of mobile devices. Such convenience makes it possible for customers to engage with businesses anytime and anyplace. Being aware of key online marketing considerations can help you build a more comprehensive Internet marketing strategy. Consider the following four factors when seeking to expand your Internet presence:

1. Use SEO content and link building strategies to enhance your results!

Search engine algorithms evaluate websites for “trustworthiness” when establishing rank. Creating relevant, engaging and frequent new content can help you drive traffic to your site. Likewise, linking your site to other valuable and trusted sites can also boost your results. Keep in mind:

- Links from other sites that have referenced your content are a valuable way to enhance results.
- Linking to appropriate business directories can increase ranking.
- SEO (Search Engine Optimization) strategies indicate that content in the form of blog posts or "expert" articles embedded with keywords can inform your clients while building a more powerful and effective online marketing campaign for your business.

2. Use Social Media’s “signal and influencing signals” to build a better site!

By this stage in the game, everyone knows that having a social media strategy is important to increase brand recognition and build client confidence. What you might not be aware of is that search engines read social media and website interactions for “signals and influencing signals” when ranking a site; here are the basics:

“Signals” include indicators that rank a site based on relevancy and freshness, which include parameters for reading:
- Traffic that follows reviews that a client posts on your site.
- Customer engagement with your site and links on your site, especially tweets and re-tweets.
- Upstream and downstream data to and from your site.

“Influencing signals” include indicators that provide ranking data through:
- Tracking traffic from specific top ranking sites that then lead to your site.
- Reading data such as videos, photos, and consumer quizzes as a positive factor in relevancy, especially when the data is repeatedly shared.
- Following bookmark usage.

Being aware of the ranking parameters above can help you better utilize tools like widgets. This awareness can also provide insight into how adding content to sites like StumbleUpon may increase your chances of landing on top in search engine results.

3. Use Mobile Marketing strategy to its fullest!

Optimizing your mobile website maximizes the ease with which customers can engage with your business. This is an important factor driving sales in today's changing marketplace. Here are a few tips for getting the most out of your mobile marketing strategy:

- Use metrics to monitor peak usage times so you can add fresh content at those times.
- Expand use of mobile emailing - it makes it easier for your customers to engage with you while they are on the go.
- Maintain and expand an active list of clients who sign up to receive SMS (Short Message Service) communications; make content relevant and engaging.
- Place high priority on contact information and reviewer links to easily engage customers new to your site.

4. Consider “Pay Per Click” (PPC) options!

Using PPC advertising as one component of your overall online marketing strategy can help drive business to your site, which in turn can help boost your search engine ranking. While PPC can get expensive, it doesn’t have to be. In order to squeeze the most out of a PPC campaign, while not breaking the bank in the process, here are a few tips to consider:

- Use PPC to help drive response and generate a buzz about a short term marketing campaign.
- Use PPC once you have your online store up and running; if you are a direct-response business, PPC can easily fill your needs.
- Use PPC to exploit your niche – highly targeted key words can be less expensive and yield positive results.

Just as with all mobile marketing strategies, and Internet marketing campaigns in general, you need to be careful to plan out your goals thoughtfully, set a budget, and then utilize the best strategy to build sustainable success.

The upside of mobile marketing, and online marketing in general, is that once you get your website in place, many of the tools to promote it are free. The downside is that it takes time to filter out which tools are best for you and your business goals. While some business leaders hire consultants, others engage in additional training through online Internet marketing programs. With all the great information out there, it is important to devise a powerful online marketing strategy before you dive into the wide array of tools available. Being aware of key mobile marketing tools and other search engine parameters can help you build a winning search marketing strategy and put you at the head of the search engine results race time and time again.

About the Author: Dean Vella writes for University Alliance on topics pertaining to mobile marketing training and social media courses.

Comments Off

The SEO Zoo: Panda vs. Penguin

posted by Michael Garrity @ 6:30 AM
Tuesday, August 21, 2012

The Web is a wild place, and the principles of survival of the fittest are very prominent in the realm of search engine marketing and SEO. So it makes sense that Google would name its algorithm updates after some of the world’s exotic wildlife.

Google’s biggest changes to its search algorithm over the last two years are a pair of updates known as Panda and Penguin. Both of them had the same basic goal of lowering the rank of low-quality or “thin” websites, and thus increasing the rank of higher-quality sites. However, despite their common allegiance toward improving the quality of Google’s search rankings, Panda and Penguin are very different beasts.


What really sets the Panda update apart from other algorithm changes is that the content of an entire site (or a specific section of a site) has an impact on search rankings, as opposed to just individual pages. In other words, if a significant number of pages on a site are flagged as having terrible content, the whole site can be penalized.

Panda primarily favors unique, original content – especially content backed by clout and authority, like in-depth research reports or thoughtful analysis – over auto-generated content. Websites looking to improve their rankings should separate out and remove auto-generated and low-value content, because having content with little-to-no value can drag down an entire site, even if most of its content is unique and worthwhile.

At the end of the day, Panda is trying to weed out duplicate, overlapping or redundant content that isn’t beneficial to the searcher. Mostly, it aims to take down so-called “content farms," which publish a lot of low-quality articles stuffed with popular keywords to drive traffic and to get links. It also works to stop content scrapers from outranking the original author and content.
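One common defense against the duplicate-content problems Panda targets – a general best practice, not something prescribed in this post – is the canonical link element, which tells search engines which URL is the authoritative version of a page (the URL shown is a hypothetical example):

```html
<!-- Placed in the <head> of a duplicate or near-duplicate page -->
<link rel="canonical" href="http://www.example.com/original-article" />
```

With this in place, ranking signals for the duplicates are consolidated onto the original URL rather than split across copies.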


Penguin, on the other hand, specifically has it out for webspam. With this update, any sites that are found to be violating Google’s webmaster quality guidelines can have their site rankings dropped.

There are a few major offenses that Penguin was designed to combat, including stuffing sites with keywords (particularly low-quality keywords), cloaking, spamming anchor text, purchasing links and more. But even some less obvious techniques could wind up on Penguin’s radar, such as incorporating irrelevant outgoing links into a page of content.

However, Google hasn’t been totally upfront about exactly what Penguin is looking for, saying only that the sites it targets are “doing much more than white hat SEO” and that the company believes them to be “engaging in webspam tactics to manipulate search engine rankings.”

If you’re having a Penguin problem, the best way to start recovering your rankings is by getting rid of your low-quality links and removing any keywords you may have stuffed away on your site. Basically, anything on your site that may toe the line and appear to Google as a black hat SEO tactic should be immediately removed. Fortunately, most of the time, Google will notify you on your Webmaster Tools account if it finds questionable links or other issues with your site, so checking that regularly and immediately fixing those problems should help to prevent your site from dropping in the search rankings.

Time to Adapt

Since Panda and Penguin are algorithm updates, the penalties they enforce are not manual. Therefore, it will do you little good to make a reconsideration request to Google to get your site back up in the search rankings. Instead, you’re going to have to make changes and fix your site on your own, and then wait for Google to come back and re-crawl your content before you’ll see a recovery. Needless to say, it’s a good idea to make sure you’re not on the bad side of either Panda or Penguin. After all, it’s a zoo out there, and you don’t want to get left behind.

Comments Off

86 Google Search Changes; Four Key Trends

posted by Pete Prestipino @ 8:53 AM
Tuesday, August 14, 2012

Google has released another (very) big list of search quality changes (86 in total) that took place in June and July of 2012. Instead of detailing each and every one of the changes (which can be cumbersome and, well, boring), let's try to decode the list and see what is in store for SEOs in the coming months (and probably years) by identifying four key trends.

A relevant side note: Google is no longer using individual code names, opting instead for project codenames, which provides some insight into the major shifts the Google search quality team is spearheading. That actually makes it much easier to understand the more substantial trends in the search marketing industry, so thanks, Google!

Page Quality: Google has always been clear about the role that page quality plays in results ranking, and the list of recently released changes for June and July certainly confirms that focus. There are numerous references to its "Page Quality" project, which follow a theme SEOs are likely familiar with: high-quality content, trusted sources, and unique content. If you were hit hard by Panda or Penguin and still haven't recovered, your site likely features low-quality content (read Google Crushes Spinners and Spammers) or duplicate content. And if you have not dug into the data being provided by Google about the quality of inbound links, you should make that a priority.

Site Clustering: There have been numerous reports over the past few months of an increase in results (on the same page) from a single domain. While I expect this mainly relates to brand-specific search terms, "site clustering" is a term that can be found in the change list repeatedly, and its impact can likely be seen elsewhere. Google indicated that it is working on multiple projects to make its system for clustering Web results "better and simpler," so expect more changes related to site clustering to emerge in the coming months - and keep a virtual eye on the SERPs.

Snippets: Sitelinks are another important element in the list of changes. Falling under the "Snippets" codename, SEOs should expect less boilerplate text in sitelink titles, improved clustering and ranking of links in the expanded sitelinks feature, and more useful text in sitelinks.

Answers: Perhaps the most important, but definitely the least discussed since the announcement, are the search quality changes related to "Answers." Twenty-three of the 86 items on the list belong to the "Answers" project and relate to how Google shows query answers atop search results. In my discussions with other search marketers, many expect that this project is directly related to two very important changes to the way Google plans to return results in the future: the Knowledge Graph and Google Shopping (that's speculation, but consider it speculation of an informed nature).

The full list of Google's search quality changes can be found here.

Comments Off

Google’s Knowledge Graph Going Global

posted by Peter A. Prestipino @ 1:30 PM
Wednesday, August 8, 2012

Google’s Knowledge Graph, a vast database of millions of people, places and things (with billions of attributes and connections included), has taken another rather significant step forward this week. If you’re interested in search marketing or search engine optimization on any level at all, consider Google's recent announcement one of the more exciting developments of the past few years.

The company just indicated that its Knowledge Graph results will now be available across every English-speaking country in the world. WM will examine the Knowledge Graph more closely in upcoming issues, but we’ve included a few highlights from Google's official statement below:

- Google will begin suggesting different real-world entities in the search box as users type (sort of a twist on its auto-suggestion feature). Enter “Rio,” for example: the user may mean the animated movie, the city in Brazil or the hotel in Las Vegas, and Google now offers all of these suggestions as the user types the query.

- Often the best answer to a question is a list of things – not a single entity. Google has begun showing "lists" of grouped or connected things in the SERPs with its latest Knowledge Graph developments. For example, search for “things to do in Paris” and a “list” will appear atop the search results.

- Perhaps the most interesting development is that Google is moving beyond the “public Web” and into your inbox. Google is currently testing the inclusion of Gmail results right from the search box. Signing up is required to test out this feature.

- Smartphones and tablets will enjoy the recent round of Knowledge Graph updates as well. Google has combined its expertise in speech recognition and language understanding with the Knowledge Graph in order to better interpret questions – Google may even speak the answers back in full sentences.

Google is calling these Knowledge Graph developments “baby steps” but without question they provide a rather important glance into the search engine of the future.

Comments Off

Optimizing Social Content for SEO

posted by Michael Garrity @ 1:00 PM
Tuesday, August 7, 2012

As the distinctive Internet marketing areas of search engine optimization (SEO) and social media become more interlaced, Web professionals must be aware of how social content can affect their search marketing efforts.

The traditional conversation about SEO usually revolves around old staples like metadata or link building, but social media has changed all of that. This evolution is marked primarily by the desire of many users to interact and engage with content they find in search results.

Great Expectations

Because of these new expectations, SEO has now become just as much about a brand’s social content as it is about on-page SEO factors. For online marketers of all shapes and sizes, this means trying to balance their ranking optimization and display optimization on search engine results pages (SERPs).

Ranking optimization is merely conventional SEO, in which one wants to procure the highest possible ranking position for their website/page for a specific search query. Display optimization is a less familiar term for some, but refers to the art of making content visible in other sections of SERPs, such as personalized searches, time subsections, or search subsections, all of which are more socially influenced.

With everything on the Web becoming more personal, especially Google searches, social content grows in importance, requiring more of an emphasis on display optimization. Because of this, publishers should be aware of the fact that their content is likely to appear in these other sections of the SERPs based on the amount of social activity it sees.

Creating Socially Aware Content

Any content that can be searched should (obviously) be optimized for relevant search queries, but it should also include additional social elements that increase its potential for virality. This includes adding social signals or share buttons to on-site content (such as a Google +1 or Facebook Like). But adding social elements is no substitute for heavily promoting and linking content on social media properties. Basically, the idea is to display content across a publisher’s primary site and social media channels, and make it easy for others to share it, as well.
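As a rough sketch of what "adding share buttons" looked like at the time of writing, here are the 2012-era embed snippets for the two buttons mentioned above; the page URL in the Like button is a hypothetical placeholder, and button APIs change, so treat this as illustrative rather than current:

```html
<!-- Google +1 button -->
<div class="g-plusone" data-size="medium"></div>
<script type="text/javascript" src="https://apis.google.com/js/plusone.js"></script>

<!-- Facebook Like button (iframe version); href is a hypothetical page URL -->
<iframe src="https://www.facebook.com/plugins/like.php?href=http%3A%2F%2Fwww.example.com%2Fpost"
        scrolling="no" frameborder="0"
        style="border:none; width:450px; height:35px"></iframe>
```

Placing these near the top or bottom of each post keeps sharing one click away without getting in the way of the content itself.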

Web pros can easily produce social content by blogging. Blog posts are among the most shareable types of content on the Web, and those that provide unique value to readers and foster engagement are a goldmine when it comes to SEO. Posts that are especially viral include how-to pieces, resource lists, and interviews, along with many others. Of course, publishers aren’t limited to just text posts, as they can also offer images or video (both of which are probably more likely to be shared, anyway) on their blogs, websites, or social network profiles.

Optimizing content for social search also works hand-in-hand with long tail SEO practices, as the material that one publishes can utilize his or her long tail keywords. Publishing material with an eye on display optimization for long tail terms will increase the likelihood that searchers will look at it, as it provides unique information about the more detailed, specific query they were searching for, thus making it relevant to them. Offering social content for the long tail is a great way to get your stuff in front of niche consumers, and encouraging shares/likes/retweets means that they may share it with other interested parties.

It’s also worth noting that for a little while now, Google has been pushing to make content more social (largely because of Google+), and one of these initiatives has been highlighting individual authors, as opposed to just a website or business. This places an author’s name, picture, and reputation next to their content in the SERPs, and can be a major boon to the SEO efforts of the publisher, as it adds authenticity and authority to their content. By attaching the authorship tag (rel=author) to a blog post, one can include an additional social layer to their content.
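A minimal sketch of the authorship markup described above, as it worked in the Google+ era; the profile URL and author name are hypothetical examples:

```html
<!-- Site-wide: link the page to the author's Google+ profile -->
<link rel="author" href="https://plus.google.com/112345678901234567890" />

<!-- Or inline, in the post's byline -->
<p>By <a rel="author" href="https://plus.google.com/112345678901234567890">Jane Author</a></p>
```

Either form tells Google which profile to associate with the page so the author's name and picture can appear next to it in the SERPs.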

Going beyond keywords and links and finding a balance between traditional SEO and optimizing for social search is a necessity these days for most online marketers. Hopefully, it is looked at as less of a burden and more of an opportunity to reach relevant consumers and grow their audience using nothing more than their own original content.



Comments Off

Google Offers Deep Look at Structured Data

posted by Michael Garrity @ 9:35 AM
Monday, August 6, 2012

With new websites popping up every single day, carefully crafted search engine optimization (SEO) practices are necessary for webmasters to make sure that their sites remain visible and don’t get buried by all of the new content that finds its way online.

When Google, Yahoo, and Bing teamed up like a search engine Justice League to establish Schema.org, they helped open the door for a whole new level of SEO that adds a more intricate layer to Web pages through the universal acceptance of standardized structured data. This helps websites further differentiate themselves from their competitors and helps return more accurate results for some searches.

Recently, Google announced a new feature as part of its Webmaster Tools offering called the Structured Data Dashboard (under the Optimization tab), which allows Web professionals to verify that Google accurately understands new markup on a site, while also detecting problems with existing page markup.

The Dashboard will provide users with three views of the structured data on their sites.

First is a top-level site view that aggregates structured data information by vocabulary schema and root item type, meaning an item that is not an attribute of another item on the same page.

In addition, the Dashboard offers an itemtype-level view that provides per-page details for each item type. Google derives this information by parsing and storing a fixed number of pages for each site and item type, listed in decreasing order by crawl date. The search engine will also keep tabs on all of a site’s structured data markup, and for some item types it will even have specialized preview columns.

Finally, there will also be a page-level view that shows details about all of the attributes of every item type on a given page, and a link to the Rich Snippet testing tool for the page being studied.
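To make the root-item-versus-attribute distinction concrete, here is a small hedged example of Schema.org microdata; the product name and numbers are invented for illustration. The outer Product is a root item type, while the nested AggregateRating is an attribute of it rather than a root item of its own:

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <!-- Nested item: counts as an attribute of the Product, not a root item -->
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.2</span>/5
    based on <span itemprop="reviewCount">87</span> reviews
  </div>
</div>
```

In the Dashboard's top-level view, a page like this would be counted once under the Product root item type.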


Comments Off

Last week, Google began ramping up its distribution of messages to sites with a “pattern of unnatural links pointing to them.” Google has actually been sending notifications for several months, but what do these new messages mean? Here's what you need to know about the new inbound link notifications.

Losing Trust in Your Entire Site

If you’ve been engaging in an ongoing practice of link spam (widgetbait, paid links, blog and guestbook spam, excessive article directory submissions, link exchanges, etc) you may have already received one of the previous notifications which are quite serious in nature. Google will reduce the “trust” of the entire site if there are link spam signals like those previously mentioned, potentially leaving you out of the SERPs altogether.

If you received one of the “original” link notification messages, Google recommends removing those links if possible and submitting a reconsideration request. There are instances however when this is impossible – for example, there have been reports of some blog networks and directories attempting to charge sites for removal of links. Since some unnatural links are simply outside of your control, Google has released a new round of link messages which put the focus on those linking to a site as opposed to those being linked to. Say goodbye to negative SEO! 

Distrusting Some Links to Your Site

The new link messages (which were sent to fewer than 20,000 domains last week) target specific spammy or artificial links, distrusting only those links as opposed to taking action on a site’s overall ranking. Google indicated that these messages address a situation that is not as severe as the original link messages, which indicate that Google is losing trust in your entire site.

Comments Off

One of the most common questions about search engine optimization is whether a page (and website) has been indexed by search engines. Google has just released a new feature dubbed Index Status in Webmaster Tools which shows just how many pages from a site have been included in the Google index.

Located under the Health menu, the Index Status feature shows a graph of up to one year of data.

Google indicated that a steadily increasing number of indexed pages should be enough to confirm that new content is being discovered, crawled and indexed, but if there are issues, webmasters can look even deeper into the data to identify potential causes.

Under the Advanced tab of the Index Status feature, webmasters can access the total number of indexed pages, along with the cumulative number of pages crawled, the number of pages Google knows about but does not crawl (blocked by robots.txt, perhaps), and the number of pages that were not selected for inclusion in the results.
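For context on the "blocked by robots.txt" bucket mentioned above, this is what a minimal robots.txt looks like; the paths are hypothetical examples of sections a site might deliberately keep out of the crawl:

```
# Hypothetical robots.txt: allow crawling generally,
# but exclude admin pages and duplicate-prone internal search results
User-agent: *
Disallow: /admin/
Disallow: /search-results/
```

Pages matching these Disallow rules can still show up in the "known but not crawled" count, which is one reason that number is worth checking against your intentions.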

Comments Off

SEO Integration Comes to Kenshoo

posted by Peter A. Prestipino @ 12:00 PM
Monday, July 23, 2012

Digital marketing software provider Kenshoo has launched a new version (4.7) of its Kenshoo Enterprise product. The platform now includes organic and paid search reporting, and new rules-based bidding policies for Chinese search portal Baidu.

Kenshoo Enterprise 4.7 integrates SEO metrics (through an integration with SEO technology providers including Rio SEO, BrightEdge and Covario) into SEM campaign performance reports. This will provide marketers a way to analyze organic and paid results in side-by-side comparisons, and take action to optimize search programs holistically.

Kenshoo also introduced new rules-based bidding policies for Baidu such as “Increase Profit” and “Control CPA”. The platform upgrade also offers users mobile and tablet device targeting for Yahoo! Japan, additional adCenter keyword match type support for Bing/Yahoo, and quite a bit more that is worth a closer look.

“Marketers are increasingly executing paid and organic search in an integrated fashion to improve overall results,” said William Martin-Gill, general manager, Kenshoo Enterprise. “With Kenshoo Enterprise 4.7, clients can determine the most cost-effective ways to manage individual keywords or portfolios through optimal paid and organic placement.”

Comments Off

8 Ways to Improve Your Site Over the Weekend

posted by Michael Garrity @ 7:00 AM
Friday, July 20, 2012

Everybody may be working for the weekend, but if you run a website, you already know that you’re almost always working on the weekend, as well.

In fact, most of the work of the everyday Web professional has to do with tweaking, analyzing, and generally optimizing the performance of their site(s), which can take up a lot of time.

So, if you don't want to spend your whole weekend working, here are eight quick ways to refine your website in two days, and still have time to go see the new Batman movie.

Offer social proof on your website.

While peer pressure is typically seen as negative, that is not the case in the marketing world. Web workers can display customer testimonials or case studies on their websites to increase engagement and conversions. By doing this, your audience will be able to relate to other customers and learn how effective your services or products are from a trusted opinion. So, find some of the nicest things consumers have said about you (and don't forget, you can encourage them) and share them with the world.

Test email subject lines.

Email marketing campaigns won’t be successful if emails aren’t delivered or opened, which is why it is always important to use best practices, such as testing, when creating email subject lines. One way marketers can quickly test subject lines is with a free subject line scoring tool, which evaluates subject lines and provides users with scores, as well as deliverability and marketing tips and advice. Take some time - 15 minutes or so - to sit down and brainstorm great subject lines.

Generate a Fivesecondtest for your site to assess its usability/readability.

When you don’t have the time to conduct user tests, this handy Web-based tool lets Web workers upload a screenshot of their Web page and then creates two different five-second user tests, one for memory and one for descriptive feedback. And, you know, it only takes five seconds.

Insert dynamic meta descriptions into your HTML.

One of the easiest ways to improve your search marketing efforts is to include useful, compelling meta descriptions on your Web pages. Just be sure that they are relevant to the page’s content and captivating enough to inspire a user to click on the search result.
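A quick sketch of what that tag looks like; the description text here is an invented example, and for dynamic pages the content attribute would be generated from each page's own data (title, summary, and so on) rather than hand-written:

```html
<!-- Goes in the <head> of each page; keep it unique and relevant per page -->
<meta name="description"
      content="Eight quick weekend fixes - social proof, subject line testing, custom 404 pages and more - to improve your website." />
```

Search engines often use this text as the snippet below your title in the SERPs, which is exactly where it needs to earn the click.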

Monitor keywords on Twitter.

It is important to keep an eye on discussions that are relevant to your brand, because it can help social media managers better connect with their audience, as well as help publishers discover possible content ideas. While there are many tools that can be leveraged for monitoring social media mentions, two free Twitter-specific tools worth checking out are Monitter and Twitterfall. Take a few minutes every hour or so to see who is using your keywords in the Twitterverse.

Assess your forms.

Does your email subscription form ask for unnecessary information? If the answer is "yes," you may be scaring off potential subscribers. This is why removing unnecessary or less important fields from your forms can prove beneficial in the long run. In fact, the only information really needed on a newsletter subscription form is a name and email address, especially because more targeted information can always be obtained at a later time. Should you have some time to spare, why not give one (or all) of your forms a review to make sure they're not asking superfluous questions?
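A minimal newsletter form along the lines described above - name and email only; the action URL is a hypothetical endpoint your own backend would provide:

```html
<form action="/newsletter/subscribe" method="post">
  <label for="name">Name</label>
  <input type="text" id="name" name="name" />

  <label for="email">Email</label>
  <input type="email" id="email" name="email" />

  <input type="submit" value="Subscribe" />
</form>
```

Anything beyond these two fields at signup time is friction; richer profile data can be collected later, once the subscriber is already on the list.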

Fix up and customize your 404 page.

Obviously you never want your visitors to land on one of these, but it happens, and in those situations, it’s good to have a custom 404 page that will not only provide them with information, but also offers additional useful content and encourages them to continue exploring your awesome website. If your 404 page is uninformative and boring, why not take a few hours to create one that will be a little more meaningful to your site's visitors?
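A sketch of what such a page might contain, assuming an Apache host and a hypothetical /404.html; the point is to give lost visitors a way forward rather than a dead end:

```html
<!-- In .htaccess (Apache):  ErrorDocument 404 /404.html -->
<!-- /404.html itself: -->
<h1>Sorry, that page doesn't exist.</h1>
<p>Try the <a href="/">homepage</a>, or search the site below.</p>
<form action="/search" method="get">
  <input type="text" name="q" />
  <input type="submit" value="Search" />
</form>
```

Links to popular content or a contact page work just as well as a search box; the goal is simply to keep the visitor exploring.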

Adjust white space to improve readability.

Finding the right balance between too much and too little space around text is one of the essential aspects of a readable website. Remember that your chunks of text content need room to breathe so that your visitors can view them more easily, and they shouldn’t be crowded against other elements, particularly images. Spend a few minutes during your morning coffee looking over your website to make sure it's optimized for readability, and if you see any problems, try increasing your padding and margins.
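The padding-and-margin adjustments above might look like this in practice; the selectors and values are illustrative starting points, not prescriptions:

```html
<style>
  /* Give paragraphs breathing room */
  p { line-height: 1.6; margin-bottom: 1em; }

  /* Keep text from sitting flush against images */
  img { margin: 0 1em 1em 0; }

  /* Pad the main content area away from its edges */
  .content { padding: 1.5em; }
</style>
```

Nudging these values up or down a little at a time, and rechecking readability after each change, is usually enough to find the right balance.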

Comments Off

Breaking: Google Panda Data Refresh

posted by Peter A. Prestipino @ 9:47 AM
Tuesday, June 26, 2012

Google announced via Twitter yesterday that it has pushed out a new Panda algorithm data refresh this week.

The update, which according to Google “noticeably affects only ~1% of queries worldwide”, is the second Panda update this month (the first was around June 8th). While there is little evidence to suggest that any algorithm updates or signal changes were made this time around, now would be a good time to check your SERP positions to see where you stand.

This data refresh likely supports Google’s “high-quality site” initiative, which rewards sites that make optimal consumer experiences available. In early May, Google provided some meaningful guidance as to what constitutes a high-quality site, which is definitely worth a look for anyone who takes their search positions seriously.

Comments Off

Training for the SEO Olympics

posted by Peter A. Prestipino @ 9:46 AM
Tuesday, June 26, 2012

In the summer of 2012, the world will turn its collective attention to London for the Olympic Games. And much like the best search engine optimization professionals, the finely tuned human instruments of speed and power that will represent countries from across the globe next month will need to be in ideal condition.

But for any naturally talented athlete (or SEO) to make their way into the upper echelon, winning gold medals and taking the first position on the ceremonial stage, they must train.

The responsibilities of SEOs today are far more intensive – just like those of an Olympic athlete. Maximizing efficiency and dominating the competition requires careful planning, rigorous training, and perfect execution. So what should your training schedule look like if you want to win gold in the SEO Olympics? Follow along with WM’s Olympic-level SEO training guide and find out.

Understand Strengths & Limitations: Watch the Olympics and you’ll find athletes with abilities that are naturally suited to their task. They have likely chosen their sport because they have characteristics which make them excel at that particular event. Search engine optimization is not much different – some SEOs will gravitate toward the development of content; others, perhaps more sociable in nature, may opt to spend their time building relationships to acquire links; and still others, who may be “mentally mechanical” in nature, will choose to conquer the challenges of on-site optimization. The point is, we’re naturally predisposed to certain tasks – understanding what those strengths (and/or limitations) are is the fastest way to positively influence performance.

Explore Challengers & Competitors: Anyone who has ever been involved in a team sport (baseball, soccer, basketball, etc.) understands the role that research plays in the success of the endeavor. For example, how many times have you heard a coach cite “watching hours of video” as one reason their team was victorious? Likely quite a few. You watch your competitors to find their strengths and weaknesses and ultimately identify a strategy that will work for you and your team and that, in the end, will position you for greater achievement. As an SEO, if you don’t explore the content development and marketing of your challengers and competitors, you will never be able to compete – or win.

Rest & Relax Resources: There’s a reason you won’t find (many) athletes in pubs, bars and dance clubs until the wee hours of the morning before their Olympic event. To ensure their resources (mind and body) are working at maximum efficiency, there needs to be physical rest and mental relaxation. You, your website and its underlying business are no different. Search engine optimization professionals often work on tight deadlines and within limited budgets, not to mention in an environment that demands innovation and an endless stream of creativity. As you might expect, that can take a toll on even the healthiest among us. Rest your mind and you’ll rediscover the creative wellspring and be able to leverage it for more powerful content marketing. Letting your social media audience relax by interacting less frequently at times will provide greater opportunities for meaningful engagement when you do.

Develop Healthy Routines: Ask any former athlete what they remember about their time sweating it out and you’ll likely find a few – myself included – who enjoyed (or at least now appreciate) the structure and routine that was provided. While routine can at times seem monotonous, when a Zen-like (or zone-like) level can be achieved, big things can happen. It’s not uncommon for track and field athletes, for example, to walk through their events and races, mimicking the exact actions they will take during the real event. When it comes to SEO, developing healthy routines is of fundamental importance, too. For example, content marketers need to perform daily (if not real-time) keyword research; link builders need to stay connected to their networks to identify emerging influencers and immersive content to align their Web properties with; and on-site SEOs must continually address the development of their sites and the experience they provide.

Push Limits, Test Strength: In preparation for their event, athletes (and SEOs) need to push their limits, testing their strength and abilities to get the final job – victory – done. Those running marathons (26.2 miles), for example, regularly do a 15-20 mile run less than one week before their event. SEOs need to push their limits regularly as well – from testing site speed to the impact of their chosen metadata structure, from the volume and type of content to the types of links that are pursued and ultimately obtained. Without testing what is possible, it is impossible to realize the full potential of what you're actually doing.

Every athlete, every SEO and every enterprise is different. Available time, resources and natural abilities need to be understood and leveraged in the way that is most rewarding. Keep that in mind, and keep the torch of awesomeness lit.

Comments Off

New SEO Product Launches in the U.S.

posted by Linc Wonham @ 8:00 AM
Friday, June 8, 2012

To help facilitate the U.S. launch of its SEO software product, Hamburg, Germany-based SEOlytics is offering a free starter version for new customers. A full range of features and quality of data empower users to perform in-depth search performance analysis of their own websites and those of their competitors.

Managers, online marketers and webmasters can use SEOlytics to measure the daily visibility of their websites on Google and Bing to easily identify potential areas for improvement. Whether used to analyze competition and market or employed as a professional tool for the day-to-day work of an in-house SEO or agency, SEOlytics gives a complete picture that helps improve search performance over the long term.

The SEOlytics starter version allows for comparing search performance of up to 10 domains based on the unique SEOlytics Visibility Rank (SVR) index. The SVR is calculated based on rankings, search volume and cost per click for a representative reference keyword set of over 1 million U.S. keywords.

The starter version also supports powerful ad-hoc browsing and filtering of this keyword set, as well as daily rank tracking for up to 20 custom keywords. Users can also create their own Daily Visibility Rank (DVR) indices based on the keywords that really matter to their domains.

SEOlytics is available as the free starter version, a PRO version ($99) and an Elite version (starting at $339).


New Study Reveals Top Google Ranking Factors

posted by Linc Wonham @ 8:30 AM
Thursday, June 7, 2012

The volume of Facebook and Twitter shares that a Web page generates is closely correlated to how high it ranks in Google searches, while too many ads on a page are likely to have a negative effect on search visibility. So says a new study from search and social analytics company Searchmetrics.

The research also finds that top brand websites appear to have a natural advantage for ranking highly in searches. Searchmetrics analyzed Google search results for 10,000 popular keywords and 300,000 websites in order to pick out the factors that correlate with a high Google ranking. The correlations were calculated using Spearman's rank correlation coefficient, in which a coefficient of +1 indicates a perfect positive correlation and -1 indicates a perfect negative correlation.
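The Spearman statistic the study relies on is straightforward to reproduce: rank each series, then compare the ranks. Below is a minimal pure-Python sketch; the share counts and positions are made-up toy data, and the simple formula assumes no tied values:

```python
def spearman_rho(xs, ys):
    """Spearman's rank correlation for two equal-length series (no ties)."""
    def ranks(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0] * len(vals)
        for rank, idx in enumerate(order, start=1):
            r[idx] = rank
        return r

    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d_squared = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - (6 * d_squared) / (n * (n ** 2 - 1))

# Hypothetical data: Facebook shares vs. Google ranking position (1 = top).
shares = [950, 720, 500, 260, 40]
positions = [1, 2, 3, 4, 5]
print(spearman_rho(shares, positions))  # -1.0: more shares, better (lower) position
```

With perfectly monotone toy data like this the coefficient hits an extreme (-1.0); real ranking factors, like the study's 0.37 for Facebook shares, fall well inside that range.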

The five key findings of the study are highlighted below:

Social media’s effect on search
Social signals from Facebook and Twitter now correlate very strongly with good rankings in Google’s index. The number of Facebook Shares that a Web page has received appears to have the strongest association (a correlation of 0.37). Twitter is far behind Facebook but is still the sixth strongest factor on Searchmetrics’ list of Google ranking factors with a correlation of 0.25.

Top brands have a ranking advantage
Despite the perception of search as a level playing field, the study found that top brand websites enjoy a ranking advantage. Some of the main factors that are commonly believed to help Web pages rank well, such as the quantity of text on a Web page and having keywords in headlines and titles, have no effect in the case of large, well-known brands.

“Surprisingly, the data show a negative correlation between these factors and rankings – contradicting traditional SEO theory,” explains Marcus Tober, Searchmetrics’ CTO. “So, not having keywords in headlines or having less text on a page seems to be associated with sites that rank higher.

“When we looked deeper at the top 30 results we found that this pattern really starts to emerge with highly ranked pages. And when we looked at sites that are in the top position on page one of Google – the natural position occupied by brands – this is where the negative correlation is strongest. This indicates that strong brands rank highly even without perfectly conforming to common SEO practice.”

Too much advertising hurts rankings
Too many and/or excessively intrusive advertisements were presumed to be a factor in the Google Panda update and its successors, which have tried to lower the search visibility of poor-quality results. The data in this study support that assumption, as all of the analyzed advertisement factors returned a negative correlation (-0.04).

A deeper analysis revealed that this pattern was strongest when there was a high percentage of Google AdSense ads; rankings for pages with more AdSense ad blocks seem to drop sharply. This supports Google’s statements early in 2012, in which the company said that particularly prominent, distracting or above-the-fold ads could lead to ranking problems.

Quality of links is vital
The number of backlinks is still one of the most powerful factors in predicting Google rankings (with a correlation of +0.36). To get the most benefit, however, it appears a site needs to have a spread of links that looks natural – not like it was artificially created by SEO experts.

This means that a site should not simply have a large number of perfectly optimized links that include all of the keywords it wants to rank for in the anchor text. It needs a proportion of 'nofollow' links and links that contain 'stopwords' (such as 'here', 'go' and 'this').

Keyword domains still attract top results
Contrary to reports, websites with keywords in the domain name still often top the rankings (correlation of +0.11). Although Google has repeatedly said that keyword-domain sites will slowly weaken in search power, this does not yet seem to be the case.

“We collated the data for our research in February and March 2012, meaning that it takes into account the impact of Google’s various Panda algorithm updates that have greatly changed the look of search results since early 2011,” explains Tober. “We conducted similar studies in the UK, Germany, France, Spain and Italy and found very similar results across the board, which seem to show that these findings apply internationally.”



Long-Tail Keyword Tips for Affiliates

posted by Michael Garrity @ 9:30 AM
Wednesday, June 6, 2012

Working within a niche industry as an affiliate marketer is very much a system of give-and-take that comes with potentially huge rewards. But it does not come without extraordinary effort on the part of the publisher.

When affiliates operate in highly targeted industries, some of their most useful instruments are long-tail keywords – those more obscure words and phrases that focus on smaller volumes but yield more qualified search results.

Affiliates tend to target these keywords for two reasons: 1) There is less competition for them, and 2) They appeal to users searching for more specific products.

In other words, long-tail keywords emphasize quality over quantity, and finding the best long-tail keywords for your Web property can be a rigorous process of research and testing. Fortunately, there are a multitude of free keyword testing tools widely available on the Web.

Tools for success
Google’s free AdWords keyword tool is used by Web professionals looking to optimize their search engine rankings, but it’s especially useful for affiliates looking to uncover the best long-tail phrases available for their websites. Publishers can use this tool (or others like it, such as Wordtracker, Keyword Discovery, NicheBot and many more) for research, which any successful Web worker will tell you is crucial to one’s success.

To find long-tail keywords with these tools, start by conducting a general search using the two or three keywords that are the most relevant to your site. After getting the results, re-order the “Global Monthly Searches” column so that the lowest number is at the top, and then work down the list to identify all of the long-tail keywords that were returned, and note any that may have special relevance to your particular niche.
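As a rough illustration of that filtering step, the sketch below sorts a hypothetical keyword-tool export by monthly search volume and keeps only multi-word (three-plus-term) phrases; the keywords and volumes are invented:

```python
# Hypothetical export from a keyword tool: (phrase, global monthly searches).
keywords = [
    ("coffee mugs", 450000),
    ("ceramic coffee mugs with lids", 1300),
    ("personalized coffee mugs for dad", 880),
    ("best insulated coffee mug for travel", 720),
]

# Keep phrases of three or more words, sorted lowest volume first,
# mirroring the re-ordered "Global Monthly Searches" column.
long_tail = sorted(
    (kw for kw in keywords if len(kw[0].split()) >= 3),
    key=lambda kw: kw[1],
)

for phrase, volume in long_tail:
    print(volume, phrase)
```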

Identify your goals
Long-tail keyword research can only go so far without a clearly defined goal. Typically, the goal is to entice visitors to click on a merchant's ad, meaning that you want to create copy geared toward actively helping users achieve their own goals. For example, if advertising coffee mugs, effective content might include information about why one type of mug is better than another, or how much money your website visitors can save by making their own coffee as opposed to buying it every day.

Once you have established the purpose you want your keywords to serve, it's important to pay attention to the search volume of each candidate to figure out which options will be worthwhile. The tail of a broad keyword can have hundreds of thousands of potential matches but, realistically, few will be searched for often enough to drive a significant number of relevant consumers to a website.

Do the research
What you want is to research how often people are searching for specific content related to the general topic of your website, and then select those long tails that are pertinent to the site’s goals.

Once you have determined the best long-tail keywords for your website, the process becomes regular SEO. You need to find ways to include these long tails in your URLs and page titles, naturally integrate the phrases directly into the copy of your Web pages, use them in anchor text and add them to a page's HTML using headline tags (e.g., <h1> and <h2>).
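A small sketch of that integration step, using Python to build a URL slug, page title and headline tag from a long-tail phrase (the phrase and site name are hypothetical):

```python
import re

def slugify(phrase):
    # Lowercase the phrase and replace runs of non-alphanumerics with hyphens.
    return re.sub(r"[^a-z0-9]+", "-", phrase.lower()).strip("-")

# Hypothetical long-tail phrase and site name.
phrase = "insulated coffee mugs for travel"

url = f"/blog/{slugify(phrase)}/"            # keyword in the URL
title = f"{phrase.title()} | Example Shop"   # keyword in the page title
headline = f"<h1>{phrase.title()}</h1>"      # keyword in a headline tag

print(url)
print(title)
print(headline)
```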

Long-tail keywords are a big part of the successful affiliate marketer’s arsenal because they allow you to reach out to users when their reason for conducting a search aligns closely with the goals of your website. While it can require more work on the research end of things, the time spent is usually well worth it.


CRM for SEO and Social Media

posted by Peter A. Prestipino @ 8:00 AM
Wednesday, June 6, 2012

Building and maintaining relationships should be the focus of every SEO (link building) and social media campaign, but that's not always the case – even for the most savvy and sophisticated marketers.

Fortunately, many software platforms are catching on to this reality and putting the focus back on the relationships that enterprises need to advance.

Raven Internet Marketing Tools, for example, has just released a contact relationship manager for search engine optimization and social media campaigns. The solution enables marketers to research and communicate with website owners (for link-building efforts) or influential social media users and writers.

Much like other CRM solutions, Raven's CRM integrates leads through third-party vendors such as the Wufoo form builder, and email service providers Campaign Monitor and Aweber. The solution is ideally designed for larger SEO and social media marketing groups where high levels of collaboration and review are required.

"Previously we would facilitate link building with a spreadsheet log and dozens of back and forth emails – not anymore," says James Agate, managing director of Skyrocket SEO in the United Kingdom. "We get to keep everything in one place, track performance and keep a record of the links acquired automatically. Raven Tools' new CRM feature plus their existing Link Manager feature equals a major efficiency win for us."



Google Penguin Update 1.1 (Watch Closely)

posted by Peter A. Prestipino @ 9:54 AM
Tuesday, May 29, 2012

An update to Google's search index arrived prior to the long holiday weekend here in the U.S.

Google's Matt Cutts announced the "data refresh" on Twitter late Friday afternoon. Despite claims otherwise, this is the first official update since the "Penguin" update rolled out in late April 2012. Many SEOs noticed shifts in the SERPs throughout the month of May which were likely just live tests of this current update.

Penguin, an algorithm change that took direct aim at sites that were violating Google's quality guidelines, has caused quite a bit of digital commotion in the past month. This most recent update, now being dubbed Penguin 1.1, refreshes the index based on changes Google made since Penguin arrived. Cutts indicated in his tweet that less than one-tenth of a percent of English-language searches would be affected.

If you were negatively affected by the initial rollout of Penguin, the good news is that sites that were previously penalized (perhaps mistakenly) may see their sites return or recover. The bad news is that it's likely not over yet – Google will likely roll out numerous updates to Penguin over the course of the next few months. Those that believe they escaped a Penguin penalty initially may now lose their positions as Google refreshes its index based on improved/modified filters.

So what should you do? Stay calm, geek on, and watch your ranking positions closely over the course of the next week.


Big List of Link Building Strategies

posted by Michael Garrity @ 1:00 PM
Friday, May 11, 2012

Link building should always be top of mind for those who want to grow their Web properties, but to call it a difficult practice is an understatement. It is, in fact, one of the most challenging components of search engine optimization, and one of the trickier areas in all of Web marketing.

Fortunately, a number of helpful strategies have emerged in recent years, and WM has highlighted some of them below. If you have additional suggestions, we welcome you to share them in the comments section.


Quality Counts
The most effective way to acquire quality links is to produce quality content – something that your visitors and other website owners are going to find worth reading and sharing. The better your content, the more likely it is to be picked up and/or linked to by publishers in your industry. It’s always good to keep in mind that the most useful, sharable content is that which includes lasting, actionable information such as tutorials, lists, industry reports, glossaries, etc.

Most Web users appreciate a well-conceived and attractively designed infographic. Infographics provide practical, useful information in an easy-to-digest format, and they are easy to share – whether through direct links to the infographic or through other sites embedding it in their own content.

Viral Content
Viral content is basically anything that encourages those who see it to look to its source for more information. Typically, this means videos, photos or games that offer an ambiguous message, creating intrigue and driving users back to the source website for more.

Linking Made Easy
If you want people to share your content, especially images or videos, you’d do well to make it easy for them to link to your site. The best way to do this is to provide HTML-ready snippets that people can easily copy and then paste right into their own website/blog/social media post.
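One way to provide such a snippet is to generate the copy-and-paste HTML server-side; the sketch below builds an image embed that links back to the source page (the URLs are placeholders):

```python
from html import escape

def embed_snippet(image_url, page_url, alt_text):
    # HTML a visitor can copy and paste; the image links back to the source page.
    return (
        f'<a href="{escape(page_url)}">'
        f'<img src="{escape(image_url)}" alt="{escape(alt_text)}"></a>'
    )

# Placeholder URLs for illustration.
snippet = embed_snippet(
    "https://example.com/images/seo-infographic.png",
    "https://example.com/seo-infographic",
    "SEO infographic",
)
print(snippet)
```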

RSS Feeds
Most content management systems create RSS feeds for your blog content. If you don’t automatically have that option, it’s a good idea to create one. This allows interested users to add your content to their daily feeds, increasing the likelihood that they will not only see it but also share it with others.
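If your CMS doesn't generate a feed, a bare-bones RSS 2.0 document can be assembled with Python's standard library; the titles and URLs here are placeholders, and a production feed would also include descriptions and publication dates:

```python
import xml.etree.ElementTree as ET

def build_rss(site_title, site_url, posts):
    # posts: iterable of (title, url) pairs, newest first.
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = site_title
    ET.SubElement(channel, "link").text = site_url
    ET.SubElement(channel, "description").text = f"Latest posts from {site_title}"
    for post_title, post_url in posts:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = post_title
        ET.SubElement(item, "link").text = post_url
    return ET.tostring(rss, encoding="unicode")

feed = build_rss("Example Blog", "https://example.com", [
    ("Long-Tail Keyword Tips", "https://example.com/long-tail-tips"),
])
print(feed)
```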

White Papers
One great way to establish authority and gain links is to publish white papers. Not only do white papers help you earn links from interested parties, but they also help to identify you as a thought-leader in your industry or niche.

We live in a visual society, which is why photos are a very valuable type of content to leverage. Photos can be set up to link back to your website, which is an easy way to increase traffic. Images can also be posted and shared on social media sites for even greater visibility.

Videos are another type of content that can be very influential when it comes to building links. Content producers should remember to vary the types of videos so that they don't become too repetitive, such as a mix of tutorials, product announcements, general information and industry news.

Controversy Loves Company
Sometimes the best way to get noticed is to make waves, and online that often comes in the form of bold or occasionally controversial statements. Content that takes a bold stance on a topic, even one that is contrarian to the present popular belief, is very likely to encourage others to link back to your site as the source.


Press Releases
Submitting company news to the top press release engines such as PRWeb, PRNewswire and Businesswire will immediately increase your business’ online visibility and encourage journalists and bloggers to write about your brand and link to your website.

Engaging your audience with contests is a great way to promote your brand online. Post information about the competition on your website and social media channels, as well as within email newsletters, so that your audience can participate and share the promotions with their friends.

Daily Deals
By participating in a daily deal through a website such as Living Social, Groupon or WhereYouShop, merchants will not only increase their conversions but also their online visibility through a link on the daily deal vendor’s website, as well as within the deal’s social media and email promotions.

Widgets and Apps
Valuable widgets and apps will build links through promotions, reviews and social sharing among audience members. Additionally, when another site downloads and uses your widget or app on their website, they will link it back to your website.

Creating an informative e-book for your audience to download can result in linkbacks through reviews, promotions and social sharing.

Set up a marketing campaign that will put your site in front of Web surfers all over the Internet. Try display advertising on social sites such as Facebook or join an affiliate program to increase your brand’s reach.

If your business doesn’t already have a corporate Wikipedia page, create one and link back to your website from the page.


Guest Contributions
This is the most useful form of networking for both parties, and it can go two ways. You can either enlist other prominent industry experts to create content that you will post on your site – which gets links from users as well as the contributor linking to it on their own website and social media profiles. Or you can contribute something to another blog or website and then link back to your own site from there.

Listing Site Directories
One of the easier ways to pursue links is to submit your site to an online website directory. It can be especially useful to include your site on an industry-specific directory (if one exists). Unfortunately, while this may be a great way to begin the link-building process, garnering links from website directories does not do you a lot of favors when it comes to SEO.

Talk to People
In the digital age, it can be easy to forget that sometimes the best way to network is to just go out and form relationships with others. This can be achieved by reaching out on social media to authoritative bloggers in your industry, or linking to their content on your blog or website in an effort to get on their radar.

A great way to show off your expertise is to give interviews to relevant Web publishers and get a link back to your website in return. You can also conduct interviews with others in your niche, and the more you do, the more in demand you will be.

Everyone has an opinion, and for some reason they love to share them on the Internet. This is why message boards and forums have become such a popular medium for discussing pretty much anything. Including a forum on your site can help bring in relevant users who love to talk about whatever it is you do. It can also improve your link building efforts because when an interesting discussion pops up, users may share it with their friends/readers/visitors in an effort to engage more relevant voices.

Like white papers, webinars are a very effective way to show off your authority. And webinars also present an outstanding opportunity to network. You can go out and find other industry experts and put together a joint presentation, and then each party can link to it for their own unique user base.

Social Media

Social Presence
The Web is full of social media networks, and most come with a large, very interactive audience. Website owners can leverage these platforms by choosing a few of the most beneficial sites for their brand to join, such as Facebook, Twitter, LinkedIn, Google+, YouTube, Pinterest, FourSquare or Tumblr. Once your brand has its own page on these sites, administrators should grow the audience and brand’s visibility through posting great content and promotions, as well as interacting with their fans and followers.

Gamification functionality enables website owners to better engage their audience as well as increase their online visibility through rewards and badges. Gamified sites can be set up to reward social sharing among audience members, as well as reward influential consumers with badges that link back to the gamified website.

Social Login
Not only have studies proven that social login increases audience engagement levels (and that customers prefer it), but it also makes it easier for site visitors to share content. A recent Janrain study even revealed that 78 percent of people using social login posted to their social networks about a product or service that they liked and thought others should know about.

Share Buttons
This link-building tactic is pretty obvious, especially since we have already covered the power of social media when it comes to link building. Website owners that provide easy-to-locate social sharing buttons on their websites and within email newsletters will most certainly see an increase in shared content and traffic to their sites.

Email is still king when it comes to marketing, which is why it is also a valuable link-building tool. Include links to content, promotions and sales within the email, and make it easy for your audience to share and link back to that content.


Google SEO: 52 New Changes to Know

posted by Peter A. Prestipino @ 9:14 AM
Monday, May 7, 2012

Google has released a long list of changes that the search engine made in April 2012, and many are directly related to how search engine optimization professionals will engage in their profession post Penguin.

Website Magazine has listed some of the most important Google SEO changes below by generalized category, but do review the full list, which focuses on the search experience for consumers as much as on anything of note for SEO professionals. In line with previous search quality change reports, Google has assigned a codename to each change, which makes it easy to track the impact of the changes in the future.

For the first time in my recent memory, Google noted that it increased the size of its base index by 15 percent. The base search index is Google’s main index for serving search results. Google also introduced a new “index tier”; the company keeps its index in tiers, “where different documents are indexed at different rates depending on how relevant they are likely to be to users.”

And now, on with the Google SEO changes to know!

LOCAL RELATED: Google made some significant modifications related to language relevance and country identification for Web pages. The “Raquel” update takes language into account to help return relevant navigational results, while “Sudoku” improves the systems currently in place to detect when a website, subdomain or directory is relevant to a set of countries, and does so down to the page level. The “ImpOrgMap2” change is likely the most important local-related change in this set, making it more likely that users will find a website from a specific country in the search results pages.

SNIPPET RELATED: Several changes were also made to snippets in April 2012. Google updated its system for generating snippets under the project codename “DSS” to keep it consistent with other infrastructure improvements, and is now more likely to show text from the beginning of a page in snippets when that text is particularly relevant (codename “Solar”).

FRESHNESS RELATED: Perhaps the most noteworthy and actionable of all the changes released in this round are related to freshness. Google indicated that its “Citron” update enables the search engine to better identify fresh documents, while its “NoRot” update modified a classifier to ensure that content identified as low-quality will be excluded – even though it’s fresh.

SPELLING RELATED: Google also released a set of changes related to spelling corrections. The “Potage” update internationalizes one of Google’s algorithms to prevent bad spell corrections; the “Pita” update extends spelling corrections to more than 60 languages; and the “Spelling” update makes it more likely that a query gets a spell correction even if it’s longer than 10 terms.


Everflux SEO & Post Penguin Predictions

posted by Peter A. Prestipino @ 10:30 AM
Tuesday, May 1, 2012

Google’s latest round of algorithm changes, which affects just 3 percent of queries, is now known as the Penguin update. The latest sweep of the SERPs penalized spinners and spammers, and sends a clear message about what the search engine expects from the SEO community.

So, should your SEO strategy change?

Not Just SEO, Everflux SEO
While Google has been battling the most roguish of SEOs since it came into commercial being (see a brief history of Google updates below), many digital marketers still mistakenly believe that black hat shortcuts are better than having a sustainable white hat strategy.

As someone who likely requires some measurable success with SEO, your best tactic in the now post-Penguin era is to invest your energy in Everflux SEO. No, it’s not software or a service; it’s a mindset – one that demands a greater focus on creating genuine value-added experiences for Google’s users and your prospective visitors. While the search indices are constantly in flux, SEOs need to be “ever” vigilant in adopting the many well-known best practices.

To understand what Google – or any search engine, for that matter – wants, you must first understand its history with the SEO community and the changes and modifications it has made over the years to provide users the best possible experience, which led us to today.

A Brief History of Google Updates
The number, scope and depth of algorithm changes made by Google over time is extensive, but even a brief history (we’ve highlighted the “key words”) starts to show what SEOs should be focusing on. For example, Google has always been concerned with link quality. The Cassandra and Dominic updates put the quality of links at the center of discussion in the search engine optimization community, and that conversation continues today.

When the Florida update rolled out in late 2003, the practice of SEO became much more serious. Shortcut SEO techniques such as keyword stuffing, which once provided the most open point of access into the search results, were finally sealed with Florida.

With several loopholes closed, in early 2004 Google introduced an update known far and wide as Brandy, which dramatically improved Google keyword analysis through Latent Semantic Indexing (LSI). This set the stage for the most significant changes to the Google SERPs in the Jagger and Big Daddy update of 2005. Google cracked down on link manipulation with Jagger and introduced the term canonicalization with Big Daddy, which made site quality issues (and some would argue search usability) the focus.

For many, it was the golden age of search engine optimization – and then, out of the blue, it got even better. The Universal Search update in 2007 created a much more immersive experience for search engine users and provided SEOs with many more tools to practice their craft. News content, images and video started Google down a path of introducing new content formats to the user search experience. Google has continued with these integrations and, in some instances, bought companies outright (ITA Software, Zagat) that provide content users can’t or don’t provide. Those it could not purchase, Google partnered with. The best example of this is seen in what I refer to as the “Fire Hose Wars,” when interest in real-time search was at fever pitch. Google started indexing content from a variety of services, including the fire hose (or full feed) of Twitter.

The real-time search update presented some further speed and indexation problems, which were addressed in the Caffeine update of mid-2010. If 2010 was the year of real time and caffeine, then 2011 was the year of the Panda, a broad update which harshly punished “thin content sites.” Then, in mid-2011, Google and other search engines announced support for – again providing something of value for SEOs in terms of a means to provide more information to Google and influence their position on the search results.

As interest in social media rose, so did Google’s reliance on social signals. And, in what remains one of the hottest topics in SEO, Search, plus Your World further changed the already highly dynamic, personalized search results by incorporating user profiles and more social data. Social media optimization is now an essential technique for success with SEO.

By understanding the update history and with a firm belief in the importance of Everflux SEO, we can finally make some post-Penguin predictions.

Post Penguin Predictions

Greater Reliance on Knowledge-Base Optimization (KBO): The Web doesn’t exist without information, and more often than not, the more you have, the better your opportunity to attract website visitors. But you can’t attract those visitors if you don’t first create the content. Knowledge-base optimization puts your whole organization/enterprise to work, developing and sharing information that will by its very nature appeal to those looking for your products or services on the Web.

Deeper Exploration of People-Powered Optimization: The practice of search engine optimization has changed and, for many, it has not been an easy transition. Today’s SEO must be skilled at more than content creation, site architecture and keyword analysis; the best in the business are those who have mastered the art of people-powered optimization. To benefit in this new era, you’ll need to understand the practice of social media optimization and the role that individuals play in the success of not just SEO, but of the entire digital enterprise.



More Top Search Queries in Google Webmaster Tools

posted by Peter A. Prestipino @ 8:50 AM
Monday, April 30, 2012

While Google is putting the virtual smack down on some SEOs, the search engine is also playing nice by providing a set of terrific presents to those donning their white hats in the form of additional search query data in Google Webmaster Tools.

Webmasters and SEOs can now access up to 90 days (three months) of historical data on search queries. The top search query data was previously restricted to just 35 days.

There are two other improvements to Google Webmaster Tools that were just announced and are also worthy of note. Google indicated that basic search query data will be accessible as soon as site ownership is verified, and that it will be collecting data for the top 2,000 queries for which verified sites get clicks.

Comments Off

Google Crushes Spinners and Spammers

posted by Peter A. Prestipino @ 7:05 AM
Wednesday, April 25, 2012

The much anticipated over-optimization penalty has arrived, and the most roguish (or simply the most uninformed) SEOs are in full-on panic mode.

Google announced that it has taken a step to reward high-quality sites, but what it has really done is punish those engaged in techniques and tactics that don't provide a good user experience. Sounds OK to me.

Google has been very vocal about its anti-webspam efforts over the past few months, rolling out some serious Panda changes as well as a page layout algorithm that lowered the ranking of sites emphasizing advertising over content, particularly above the fold. But this algorithm change is different.

This change takes a direct shot at webspam and will (very likely) decrease rankings for sites that Google believes are knowingly violating its existing quality guidelines. While no specific signals that would warrant a ranking drop were mentioned, Google made it clear in the announcement exactly the type of behavior that would be deemed black hat and jeopardize a high ranking. At least it’s a start.

Google referenced two web spam tactics in particular, the first being keyword stuffing - which has long been discredited as a reasonable means to optimize a page for higher rankings. The second example is a little more complex and something that Google has long struggled with in my opinion - article spinning.

The example provided by Google (see image below) showed "unusual linking patterns" - call it link spam - within the content. The links within that content were completely unrelated to the content itself – giving Google an indication that something was amiss. The article was spun, or created automatically with the help of software.

Article spinning essentially takes a piece of content and replaces different parts or elements of the article with “spintax” (a play on the word syntax) which is really just a list of text, but most often keyword-laden links used in an attempt to game the search engine. Many of the algorithm changes we’ve covered here at Website Magazine were focused on the use of anchor text and this may be one manifestation of those changes.

Google indicated that this change in particular will go live for all languages at the same time. While the recent Panda change affected about 12 percent of search queries to a “significant degree”, this change affects 3 percent of search queries in English to a degree that a “regular user might notice.” This algorithm change will affect other languages to varying degrees.


Link Spam Example Provided By Google:

Comments Off

Grammatical Gaffes on the Web are the Worstest

posted by Michael Garrity @ 1:00 PM
Friday, April 20, 2012

They’re maybe one or too things wrong with this sentence your reading, or may be even five – quite possibly more.

Grammatical gaffes can have a devastating effect on Web workers. Content has long been the King of the Internet, but that has never been so true as it is today. Website visitors ultimately make their decisions based on the quality and relevance of the content they consume, so grammatical errors have a profoundly negative effect on overall user experience – and thus, on conversion rates.

Users are not the only ones put off by careless grammatical errors. Search engines do not look kindly on poorly written and/or edited content. The most severe situations will usually involve error-ridden page titles or headlines, which can actually prevent the offending Web page from appearing in search results on Google, Bing and other engines.

So, let's say that you are neither a Pulitzer Prize-winning author nor a university English professor; how do you know what pitfalls lurk in the shadows? Here are a few common mistakes that all Web workers can and should avoid:

Correct Usage of Homonyms
Homonyms are words that are pronounced the same but differ in meaning and also frequently in spelling. Some classic examples of these include the following:

- Their (to show possession), They’re (a contraction of "they" and "are") and There (which refers to a place or acts as a pronoun)

- Your (to show possession) and You’re (a contraction of "you" and "are")

- Other common misuses of homonyms can include "affect/effect" and "then/than"

These mistakes are easy to make and can be hard to catch, but the negative "effects" they have on big brands and small businesses can be significant.

Hyphens and Apostrophes
Making sure that you use punctuation properly can make a big difference in terms of your SEO success. But these things are often determined by preferences in style rather than universal grammatical rules.

The common Web terms "Email/e-mail", "Ecommerce/e-commerce", "eBooks/Ebooks", etc., pose challenges for online content providers, and stylistic standards can change overnight. The best advice when it comes to specific Web-related jargon is to remain consistent; if not throughout your entire site, certainly throughout a given article or post – anything less will be seen as unprofessional by the eagle-eyed members of your audience.

Apostrophes are also often overlooked but vitally important elements to creating great Web content. The most typical infractions occur in the different case uses of the words "its" (possessive) and "it's" (it + is), and in the different versions of commonly used acronyms such as "CEO's" (possessive) and CEOs (plural).

Dangling Modifiers and Subject/Verb Agreement
Dangling modifiers take place when a sentence is structured in such a way that a modifying word or adjectival clause is associated with a word or phrase that is not the one it is supposed to be modifying.

Example: The robber ran from the policeman, still holding the money in his hands.

Subject/verb agreement simply means that the main verb, or action, in a sentence must “agree” in number with the subject, or main noun. In other words, a plural subject (cats) requires a plural verb (ran), while a singular subject (umbrella) requires a singular verb (opens).

Both of these errors can make it hard for readers to decipher the intended meaning of a sentence, thus obscuring your brand’s message.

This Goes Without Saying
You may not have a crack editorial staff at your disposal, but most website owners and content producers do have access to simple spellcheck programs that can save them a lot of headaches and maybe even some business. Avoid using them at your own peril.

Final Word
Did you see what we did here? We purposely planted dozens of grammatical errors in this very post to see who amongst you is really paying attention. Let's have it in the comments section, and we'll be happy to provide a critique.


Comments Off

Rich Snippet Updates Support HTML Testing

posted by Michael Garrity @ 12:00 PM
Wednesday, April 18, 2012

Google has blogged about two big updates to the types of rich snippets the search engine will be crawling. The news is especially useful to online merchants, SEO professionals and, well, pretty much anyone that runs a website.

Rich snippets, of course, are enhanced search results generated from structured data markup (such as microdata or microformats) added to a Web page’s code, giving users more detailed information in the listing.
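The post doesn’t show the markup itself; as a rough illustration (the product name, rating and price below are hypothetical), a product marked up with schema.org microdata, one of the formats Google’s rich snippets can read, might look like this:

```html
<!-- A hypothetical product marked up with schema.org microdata -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    <span itemprop="ratingValue">4.5</span> stars, based on
    <span itemprop="reviewCount">89</span> reviews
  </div>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    $<span itemprop="price">24.99</span>
    <meta itemprop="priceCurrency" content="USD">
  </div>
</div>
```

With markup like this in place, the search engine can surface the rating and price directly in the result snippet.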

The first major change has to do specifically with product rich snippets. Up until now, these were only available in a limited set of locales, meaning products could only be previewed by specific users (based on their locations) viewing your site’s results in Google Search. Now, however, product rich snippets will be supported globally, meaning that users can preview site information about products from basically anywhere in the world.

Take a look at global product rich snippets (from Google's blog):


At the request of many webmasters, Google also announced HTML input support for its rich snippets testing tool. This change will allow users to test their HTML source without having to publish it on the Web. An update like this is a pretty huge deal because webmasters can simply test a selection of HTML code to make sure that it will appear as intended for users, even while it’s still a work in progress.

Here's what HTML input for the rich snippets testing tool looks like:

These changes will help webmasters and SEO professionals to streamline the optimization process since they’ll be able to test their rich snippet coding before a page goes live, which alleviates a lot of the headaches that can come with publishing defective code. It also aids online merchants by helping them promote more detailed information about specific products to a more global audience.


Comments Off

Google Search: More Quality Changes & Codenames!

posted by Peter A. Prestipino @ 12:05 PM
Saturday, January 7, 2012

December's list of Google search quality highlights is out and there are quite a few changes that you may want to pay close attention to. Google has even added codenames to the individual changes – which certainly makes them easier to track (and recognize).

Website Magazine's February 2012 issue recently covered several of the changes in an article titled Inside the Black Box. One of the most important improvements made last month (which Google reiterated in this announcement) concerns related queries. Sometimes Google fetches results for queries that are related but have fewer words. Google has changed its algorithms (codename "Lyndsy") to make results more conservative and less likely to introduce results without query words.

The recently published list is lengthy (30 total items) but only a handful are going to have any significant impact on search optimization professionals, as many are related to auto-suggest, etc. Some of the most noteworthy improvements and modifications include:

- Codename "Simple":
An improvement that analyzes various landing page signals within image search. Google also made improvements to image size signal (codename "matter") which will result in users seeing images with larger full-size versions.

- Codename "Concepts":
More relevant sitelinks may mean Google will show sitelinks specific to a metropolitan region (which can be controlled with location settings within Webmaster Tools).

- Codename "Greencr":
Country-restricted search has arrived. Now, on domains other than .com, users have the option to see results from one particular country.

- Codename "Foby":
More accurate byline dates were announced, which improves how Google decides what date to associate with a document. The result should be more accurate dates annotating search results.

A whole host of other changes were also announced including live results for NFL and college football, better lyric results, improved Hebrew synonyms and more. Perhaps most noteworthy of all the announcements is that the much discussed encrypted search feature is expanding into the UK, Germany and France.

Comments Off

Poor SEO Health for the Pharmaceutical Industry

posted by Michael Garrity @ 11:55 AM
Tuesday, December 13, 2011

It looks like the pharmaceutical industry could use a new SEO prescription; that is, at least, according to a recent study by Covario, which details exactly how Big Pharma is providing the perfect blueprint for what not to do when it comes to SEO.

Covario looked at 16 of the largest pharmaceutical advertisers in the world and compared them using their patented SEO Audit Score to determine just how well their websites were optimized for the highest volume keywords common to both consumers and medical professionals searching for drug-related information on Google, Bing and various other search engines.

"For decades pharmaceutical companies have been effective at using traditional advertising to target their largest brand advocates — doctors. The results of this study show that pharma advertisers have leveraged this expertise and applied it to SEO as a branding channel directed at medical practitioners," says Russ Mann, Covario CEO. "Having said that, pharmaceutical marketers have yet to translate their decades-old success in direct-to-consumer advertising in traditional channels like television into the Web-based organic search channel."

The study showed pharma giant Pfizer coming in first, largely because it has been able to effectively optimize its website properties around the word "pharmaceutical," a point that Covario found critical for brand recognition among medical professionals.

Trailing behind Pfizer was Johnson & Johnson, followed by a three-way tie for fifth place among Eli Lilly, Novartis and Bayer.

Covario states that the big takeaway from this study is the poor returns in organic search results for "more consumer-focused, high-volume keywords like 'medicine,' 'drugs,' and 'healthcare.'" The sites that do show up high in the results for these terms are WebMD, and a variety of universities, hospitals and medical establishments.

"The good news is that there are a number of clear opportunities for pharmas to distinguish their brands by using SEO as a direct-to-consumer branding mechanism," says Mann. "It's clear that consumers are increasingly going to the Internet for information related to their medication needs."

Other brands included in the study were GlaxoSmithKline, Abbott Laboratories, Bristol-Myers Squibb, Allergan, Amgen, Biogen Idec, Mylan, Gilead and Genzyme.

Comments Off

Google published another list of algorithm changes last week, following up on the set of improvements it announced last month. Here’s what Web workers need to know and can expect on the search results pages in the future:

Original Content Signals:
In perhaps the most important of the changes, Google indicated that it has added “new signals” to indicate which page is original and which is not. While Google obviously won’t reveal what those individual signals are, over time you can expect less duplicate content in the search results.

Less “Host Crowding”:
A modification (and additional processing) on the top set of results ensures that Google won’t show too many results from one site (host crowding). Expect a more fair and balanced search results page.

Related Query Results:
A refinement has been made that will ensure that sites/pages which only “partially” match the original query are seen less often. Why Google was in some instances returning only partial matches to a user query is up for debate.

More Long Tail Indexing:
Google has made a change to make more long-tail content available in its index. Expect more of your long-tail content to rank for relevant queries.

Blog Search Results:
The Google blog search index is now more comprehensive and will feature fresher results in the natural search results. If you’re not blogging now, you should be.

Parked Domain Classifier:
In an update that should have happened many years ago, Google finally released a new algorithm for automatically detecting placeholder sites (parked domains).

Google has also made some layout changes to improve usability on tablet devices, made modifications to better determine image freshness, and is now offering more autocomplete predictions.


Comments Off

Drive Higher Engagement with These Content Types

posted by Peter A. Prestipino @ 2:00 PM
Tuesday, November 29, 2011

A Web marketer's job can be made much easier when meaningful and valuable content is available to promote.

There is no shortage of platforms on which to promote content, or channels through which to promote your content, or ways to construct messages about your content.

But where many Web professionals fall short is in the assembling of the content itself. Too often, online workers will recycle dated material, tired messaging and straight-up boring content. On the other hand, when you have content that by its very nature leads to deeper levels of engagement, it will show in the volume of conversions.

So, what types of content drive engagement?

It is likely that this content is already at your disposal and, if not, you should start creating it. There's actually quite a bit of content that you, as a Web marketer and search engine optimization professional, should have in your asset library: videos, images, interviews, product manuals – just check out WM’s article on knowledge-base optimization. But many content types don’t really lead to high levels of engagement (return visits, more pageviews, additional downloads), so it's important to know the real value of your content assets.

Let’s look at a few vital content assets that all marketers should be regularly promoting on social networks and private forums, within email newsletters, on your own website, through display and search-based advertising and, of course, within the Google and Bing natural search results.

Product/Feature Releases:
There seems to be a general feeling in our industry that it’s not wise to be overly self-promotional. I agree in some respects but disagree in many others. It’s hard to argue, particularly when it comes to those with an established audience, that there remains a need to notify users of advancements about your business. Case in point – product/feature releases.

Not only are they a powerful way to keep messaging fresh and compelling, they should also be in your lineup of options when the aim is to drive engagement. People like “new,” and recently added features and products definitely fit the bill. The key to driving engagement with product release-focused content is to carefully select the platforms on which to promote that content. For example, product releases are ideal for social media followers but not great for cold prospects that can be reached elsewhere.

Webinars/White Papers: Marketers jump at the chance to promote a webinar or a white paper. The reason is simple: It’s easy – easier than nearly any other form of content promotion. Production of these content types aside, buyers are naturally drawn to webinars and white papers because they provide meaningful, valuable information that can be immediately used.

That’s one of the things that the three content types featured here all provide – valuable information. Webinars and white papers are perfect for nearly any channel (social media and search) but can be most effectively used in advertising, when the challenge is to educate, entertain and inform in a matter of seconds. When you show up with something as valuable as free information, the likelihood you will generate more clicks than the competition is all but guaranteed.

Feature Articles:
The term "content marketing" is poorly defined. With so many opportunities to promote content and so many formats, it’s not uncommon that marketers opt for the fastest solution, and that is rarely the feature article. If you’re staffing a team of writers, or are skilled at producing/publishing content, then it would be a shame not to leverage feature-ready, long-form articles (starting at 800-1,000 words).

Search engines give preferential treatment to long-form content over short-form – at least in my experience – so dedicating yourself to regularly producing information in this manner will serve you well. Long-form, insight-rich content increases time on site (and even page views) and drives sharing and additional on-site activity, particularly when linking to your own content. With the exception of advertising, feature articles can be used within any promotional channel – particularly search and social media.

Content is king and the level of quality does matter a lot, but marketers can give themselves a leg up by focusing on the types of content that have proved they can deepen engagement and increase conversions.


Comments Off

Six Simple SEO Techniques to Improve your Search Engine Ranking

1. Title Tags

The title tag is contained within the ‘HEAD’ tags of your HTML, before the ‘BODY’ tags. This states the title of the page, and must contain the major keywords of the page. The contents of your title tag do not appear in the text of the page: its purpose is to inform the search engine spiders what the topic of your page is, and what words are important (i.e. your main page keyword). For example, the TITLE tag of a page based on this article would be “SEO Techniques – Improve your Search Engine Ranking”.
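As a minimal sketch, the example title from the paragraph above would sit in the HEAD section like this:

```html
<html>
<head>
  <!-- Tells the spiders the topic and main keywords of the page;
       not rendered in the page body itself -->
  <title>SEO Techniques - Improve your Search Engine Ranking</title>
</head>
<body>
  <!-- page content -->
</body>
</html>
```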

2. Description Tags

The description Meta tag is used by Google, and other search engines, in the search engine listings. I have tested this with them all and Google uses it as is, while Yahoo uses part of it. You should provide a description of what the web page is about, and a simple check of the descriptions in other sites using your keyword on Google will show you how many words you can use to have the whole description included. About 20 words are fine.

3. Keyword Tags

Search engines rarely use the keyword Meta tag: Google ignores it completely. However, it doesn’t hurt, and can help in a small way. Include your brand name and your own name. That way some engines might show your pages if somebody is looking for your name. The other Meta tags have no SEO value, and do not help to improve your search engine ranking whatsoever.
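Taken together, a HEAD section following points 2 and 3 above might look like the sketch below; the description and keyword values are illustrative only:

```html
<head>
  <title>SEO Techniques - Improve your Search Engine Ranking</title>
  <!-- Google may show this description as-is in its listing; keep it to about 20 words -->
  <meta name="description"
        content="Six simple SEO techniques, from title tags to writing style, that can improve your page's search engine ranking.">
  <!-- Rarely used by the engines, but harmless; include your brand name and your own name -->
  <meta name="keywords"
        content="SEO, SEO techniques, search engine ranking, Your Brand, Your Name">
</head>
```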

4. Heading Tags

Heading tags (H1, H2, . . .) are used by Google to determine the importance of the text contained in your headings. Use H1 tags for the main title of your page (you also use it in the TITLE tag, but that isn’t seen by readers, only by the spiders). Put subtitles in H2 tags. You can change the font size of the text within these tags.
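A sketch of that heading structure, using this article’s own titles:

```html
<body>
  <!-- Main title in H1: the reader-visible counterpart of the TITLE tag -->
  <h1>Six Simple SEO Techniques to Improve your Search Engine Ranking</h1>

  <!-- Subtitles in H2; font size within these tags can be adjusted with CSS -->
  <h2>Title Tags</h2>
  <p>The title tag is contained within the HEAD tags of your HTML...</p>

  <h2>Description Tags</h2>
  <p>The description Meta tag is used by Google...</p>
</body>
```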

5. Text Formatting

Bold, italicized and underlined text is seen by the search engines as having greater weight, and so will be used in determining the relevance of your site. Always bold your titles, and it also helps to underline them if doing so doesn’t look out of place.
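In markup, the formatting described above is simply:

```html
<p>
  <b>Bold</b>, <i>italic</i> and <u>underlined</u> text carries extra weight
  with the engines; always bold your titles, and underline where it does not
  look out of place.
</p>
```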

6. Writing Style and Content

Do not write for algorithms (spiders): write for your readers. Always write for humans and you won’t go wrong. If your page content reads well, and has good vocabulary relating to the topic, then it will have a better chance of a higher listing than if you stuffed it full of keywords. I rarely use more than 1.5 percent – the keyword densities of the terms ‘SEO’, ‘SEO techniques’ and ‘search engine ranking’ (the main keywords of this article) are 1.5, 0.87 and 0.87 percent respectively. Too many keywords is bad SEO, and could result in a poor listing for your page – if it is listed at all.

So there you are: six simple SEO techniques to improve your search engine ranking. It is surprising how many experienced webmasters fail to apply all of these: there is no excuse, and they are failing to get the nuts and bolts properly fitted and tightened on their web pages.

Apply these to every page and not only will you improve your SEO, but also your chances of a good search engine ranking. It is amazing how many web pages lack these basic SEO techniques.

SEO: How To Research For Free

posted by Web_University @ 8:00 AM
Tuesday, August 23, 2011

The first step of any online campaign is deciding what you want to be known for, particularly when aiming for a stronger presence on the search engines. The terms or search phrases you wish to be found for are called ‘keywords’. Researching these keywords and implementing them into your website is crucial to its success. It is vital that you look into both the search volume of each keyword and the competition level. By conducting a bit of research first you will gain a good understanding of how your market looks online, and of what keywords you could realistically achieve good positions for.

There are many ways you can conduct keyword research. The internet offers different tools, both free and paid, and there are also some search engine commands that will give you clues. Free tools such as Google’s AdWords Keyword Tool give you a basic look into monthly search volumes and competition levels, though it is sometimes questionable how accurate these figures are. If you want to step it up a notch, you could pay for software such as Wordtracker, which will give you a much more thorough analysis of each word and ultimately a better view of the market.

For most people, using the free tools and search commands are enough to give a good idea of what keywords you should target with your website. Listed below are a few useful search commands that you could use to do a little research:

1. “inurl:keyword”

By typing this into a search engine, replacing ‘keyword’ with your own keyword, you will get a list – and, more importantly, a number – of websites that are using your keyword in their URLs. This will give you a solid idea of how many websites are directly targeting your keyword.

2. “intitle:keyword”

Second in importance to the URL, the meta title is vital to targeting a particular keyword. By using this command you will be able to see how many websites are using your keyword in their meta titles.

3. “intext:keyword”

This command will give you a well-rounded view of the market. It will produce a list of websites that are talking about your keyword. The perfect keyword is one that has a substantial amount of traffic, yet a relatively low competition level. Of course, the keyword must be highly relevant to your website and business to ensure that any traffic that comes from the search engines is looking for exactly what you offer.

You will probably find that when you start to look into keywords, varying a phrase slightly can uncover keywords with high traffic levels and low competition – this is exactly what you are looking for. Also, don’t ignore keywords with low search volumes; if these keywords are relevant to your website, the traffic they bring could be much higher in quality if you are targeting a niche area.

SEO Secrets: Fighting Against the Domain Age Tide

posted by Web_University @ 8:00 AM
Thursday, August 18, 2011

So you’ve bought your Dreamweaver, an eternity later worked out how it works, started to build your site which is targeting your chosen niche or promoting your affiliate product and after what seems like forever you have added quality content. You’ve bust a gut to get this far, but this is where the real work begins. You need to get your site seen by as many people as possible. You need to drive as much traffic as you can whether it be through a pay-per-click campaign or via organic traffic.

Let’s say you decide to target organic traffic. You need to get a high ranking in the search engines. During your website construction you have already been using a keyword research tool and an SEO tool, spending the midnight hours mulling over the keyword phrases your research has indicated you should be targeting on your pages and using in your URL. Your whole site – from headings to meta tags, meta descriptions and anchor text – is optimized, taking account of semantically related words, keyword density and the long tail.

Next you begin your back link campaign spending hours trying to get quality back links to your site. Social bookmarking takes over your life for days on end, you post on relevant blogs and all relevant forums, you set up your own blog, submit articles to article directories and then submit your entire site to SEO friendly website directories.

Yet more analysis follows, as you now study your competitors’ websites. You investigate what keywords they are targeting and study the links that they have developed and you develop strategies to be better than they are. Slowly but surely your site climbs the rankings. You constantly update your site adding quality content and before you know it your site is fast approaching the first page. A steady trickle of traffic flows on a daily basis. At this rate you should soon be top of the rankings and then… the sky’s the limit!

Unfortunately it doesn’t usually quite work like this. Yes it is true that if you have picked some long tail keywords to target you may be able to get somewhere near the top of the rankings or indeed even to the number one spot. In most instances however the traffic won’t be great and you would need to get each page of your website targeting a different long tail phrase and getting to the top of the rankings for that phrase each time for the total traffic to be lucrative. It’s certainly possible but does require a lot more graft.

For very popular keyword phrases it proves incredibly difficult to dislodge the top sites from their positions, even though in theory you may know that you have better back links and better content. Without the top positions the mass traffic will never be yours. It is just so frustrating, as you probably know. I certainly do. So you go away and you research again and you analyze again and you spend days poring over the detail and you work and you work and, you know what, it makes not one jot of difference. You just cannot crack the top spots. Why not? How many times have I asked this of myself?

Now perhaps your tactics aren’t quite right. Perhaps the phrases you are targeting are the wrong ones, or perhaps your back links are not quality back links. Do you have the quality of content on your site that you think you do? Yes, yes, yes, I hear you say. So what is going on? Well, it could be something as simple as the age of your domain.

Google places a lot of trust in back links, especially quality back links to your site, but it also places significant trust in sites which have been around for a significant period of time. This is especially true if they are frequently updated. In many cases the top ranked sites are trusted sites as far as Google is concerned because they have an authority status due to their back links, but also due to the length of time they have been operating. If your site is an equal to a competitor’s site in terms of content and back links, but it is a far newer domain, there will be little chance of you dislodging your competitor from the top slot. The site that has been there for five years serving the web community carries a lot of trust with Google.

To dislodge these sites requires tremendous effort to create quality back links, and sometimes you may never achieve it. It can be done, however; you just need to be aware of what is going on and keep persevering. Some internet marketers have resorted to buying old domains in an attempt to overcome this challenge, building the new website around the aged domain. I’m not entirely sure how Google reacts to this, especially if you are adding new content on a continual basis. Does the domain itself carry an inherent trust because of its age, or was it the content of the old website that carried the trust?

Either way it’s worth exploring as one of the tactics along with keyword analysis and building back links that you could adopt in an attempt to get higher rankings and hence higher rates of traffic.

How Much is Too Much to Pay for SEO?

posted by Web_University @ 8:00 AM
Wednesday, August 17, 2011

How much is too much to pay for SEO? (…or should you try to do it yourself first?)

Yes, Search Engine Optimization (SEO) can be an excellent way of getting leads.

Yes, good SEO can level the playing field between you and competitors.

Yes, you should do some level of SEO.

…but how much is too much?

Too much? Good question.

You’ll find that everybody who uses the word “Internet” is going to suggest that you engage in SEO, and many will make you an offer to do it for you. There’s nothing wrong with that as far as it goes. But what you have to decide is how much you should pay to have it done.

* Some Perspectives On Paying For SEO

Here are a few important tips to help you decide the answer to that question:

(1) First, make sure you really do need outside assistance. If you're looking for better placement on relatively unique or so-called "long tail" keywords (e.g. "pine street rental condominiums"), it might be worth trying it yourself before you involve an SEO consultant or SEO firm.

(2) SEO is not rocket science. Mostly it's monotonous drudgery, so what you pay should not be about hiring "expertise". The SEO effort is more like 90% drudgery, 8% experience, and 2% expertise, and you should compensate accordingly.

(3) The value of SEO boils down to “clicks” – preferably clicks that result in a sale conversion. SEO should be measured on the same cost-per-click basis any search-engine-marketing (SEM) or pay-per-click (PPC) campaign would be – i.e. the basis of ROI. If you don’t know how many clicks or orders you want, do not engage SEO until you do.

(4) SEO is not static and optimization is competitive. You may be on the first page today but your competitors aren’t necessarily going to sit still forever. You could be bumped at any time. So if you’re not prepared to maintain an ongoing and strategic SEO effort – no matter what the competition does – then save your money.

(5) Search engine "secrets" are just that – secret. The search engines aren't telling, and anybody who claims to know the secrets is just guessing. That doesn't mean they can't help, but it's not as if they have some special advantage.

(6) Frankly, from the search engine point-of-view, if your site doesn’t have enough useful and relevant content to be on the first page, ethical SEO notwithstanding, eventually it won’t be. This is the objective of the search engines and there’s little likelihood that the SEO “expert” pitching you is going to out-think Google, Yahoo!, Bing and others in the long run.

* Is Doing SEO Yourself An Option?

It’s almost always worth taking a first crack at SEO yourself. Often only a little effort can make a significant difference. If you do want to make the effort, invest a few dollars in a do-it-yourself SEO guide and try to adhere to the following minimum suggestions:

(1) Focus on keywords that are realistic. You’re not likely to get a good placement with a keyword like “real estate” but you might get first page with a keyword like “Hill street real estate”;

(2) Make sure your keyword is mentioned in the link to your page. Instead of saying “click HERE” make sure the text for the link says something like “for more information about HILL STREET REAL ESTATE”;

(3) Make sure your keyword is mentioned in your page title, your keyword list, your page content, and in bolded page content;

(4) For every keyword you're interested in, make sure you have an appropriate page focused on it (and that it complies with #1, #2, and #3 above);

(5) Register with search engine webmaster accounts so that you can submit your site to them quickly and efficiently (search for “google webmaster”, “bing webmaster”, or “yahoo webmaster” to find the details).
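Tip #2 above – descriptive anchor text instead of "click HERE" – is easy to audit mechanically. The sketch below uses only Python's standard-library HTML parser; the helper name and the list of "generic" phrases are my own, not from the article:

```python
from html.parser import HTMLParser

# Anchor texts that carry no keyword signal (my own illustrative list).
GENERIC = {"click here", "here", "read more", "more", "link"}

class AnchorAudit(HTMLParser):
    """Collects the visible text of every <a>...</a> and flags generic ones."""
    def __init__(self):
        super().__init__()
        self.in_a = False
        self.parts = []
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_a = True
            self.parts = []

    def handle_data(self, data):
        if self.in_a:
            self.parts.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self.in_a:
            self.in_a = False
            # Collapse whitespace so "click   HERE" still matches.
            anchor = " ".join("".join(self.parts).split())
            if anchor.lower() in GENERIC:
                self.flagged.append(anchor)

def generic_anchors(html):
    """Return the generic anchor texts found in an HTML snippet."""
    audit = AnchorAudit()
    audit.feed(html)
    return audit.flagged
```

Run it over your pages and rewrite any flagged links along the lines of "for more information about HILL STREET REAL ESTATE".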

* Don’t Forget Links

Lastly, if you are going to make an initial stab yourself, understand that quality links to your site are a vital factor in your ultimate placement. The more the merrier. To get a head start on building links to your site, do the following:

(1) Enroll in all relevant local or regional directories – (search “free directories” to find lists of these); many will be free, some will want nominal fees or backlinks. You decide.

(2) Ensure that any press releases and announcements you make refer to your site and specific pages within it.

(3) Post pages of your site to social bookmarking sites.

(4) Ask local friends and business acquaintances if they will exchange links with you.

(5) On the other hand, DO NOT sign up for paid links without the guidance of someone experienced in Internet marketing.

* Do these things sound particularly difficult? No.

And it is well worth taking a stab at yourself. From there you can decide whether it's desirable or worth the cost to pay for SEO services from a third party.

* No Matter What You Do…

You need to think in terms of what kind of return you are going to get on your investment. The calculation is simple: divide the total SEO cost (yours or a third party's) by the number of orders/sales you've received as a result of the effort, then compare that cost-per-sale against your margin-per-sale. If you have margin left over, you're in the right territory. If you don't, you've got a problem.
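The arithmetic above can be sketched in a few lines (the function name and the example figures are illustrative, not from the article):

```python
def seo_roi(total_seo_cost, sales, margin_per_sale):
    """Implements the calculation above.
    Returns (cost_per_sale, margin_left_per_sale)."""
    if sales == 0:
        # No attributable sales: the whole spend is unrecovered.
        return float("inf"), -total_seo_cost
    cost_per_sale = total_seo_cost / sales
    return cost_per_sale, margin_per_sale - cost_per_sale

# e.g. $1,200 of SEO effort, 40 resulting orders, $45 margin per order:
cost, left = seo_roi(1200, 40, 45)  # cost-per-sale 30.0, 15.0 of margin left
```

A positive second value means you are "in the right territory"; a negative one means the effort cost more per sale than the sale earns you.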

The bottom line is that when you talk with any SEO service provider, you must think in terms of ROI – not in terms of "secrets" or first pages or top spots, but ROI. (Note: it is theoretically possible to be on the 3rd page and still get a positive ROI – not likely, but possible.) If the ROI doesn't work, then search engine optimization may not be for you, and other Internet marketing methods might yield better results and a better ROI.

Optimize Your Images On Search Engines

posted by Web_University @ 8:00 AM
Tuesday, August 16, 2011

Optimize Your Images On Search Engines

Establishing a competitive advantage is vital for any business, both online and offline. If you run a business organization, you need to distinguish yourself from the competition, especially if you are not selling commodity products – and the internet makes those advantages easier to communicate.

Some companies get their edge from targeted advertising on high-traffic websites; for others, it is social media marketing.

But one relatively easy way to drive relevant traffic to your website is to optimize your images for search engines. Think of each image as a tiny webpage within the website structure: give it descriptive anchor text, descriptive tags, and a sensible URL structure to maximize results.

* Search for the Right Image

Successful bloggers, writers, and website owners know the value of using the right image with their text content. It adds another dimension to articles and makes readers appreciate the page even more. However, many fail to use images for search engine optimization purposes, even though they are in fact a good way to attract backlinks and visitors. There's no need to upload your own pictures: plenty of stock photos are available for free from sites like Flickr and iStockPhoto, among others, and Google Image Search is another good way to find photos. Just make sure you are not violating any copyright (look for Creative Commons licensing).

* Make Proper Use of Keywords

Keywords are an integral part of any search engine optimization effort; they are used to optimize all kinds of digital assets, from videos to podcasts, and your images benefit too when you use keywords wisely. Rename each image file, because having "012345RR.jpg" as the file name isn't going to help. It is a very simple step and can go a long way in helping your website rank better.

* Use Descriptive Text

It is important to use descriptive tags, file names, and alt text for your images, because search engines cannot read images and rely on the surrounding text instead. Take advantage of this by adding keywords to the descriptive text, the anchor text, and any other tags related to the image. Bear in mind that anchor text is one of the most important elements for optimizing an image effectively, so make the most of it.
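As an illustration of the alt-text point (the helper below is my own sketch, not a tool the article recommends), Python's standard-library HTML parser is enough to flag images that ship without it:

```python
from html.parser import HTMLParser

class ImgAltAudit(HTMLParser):
    """Collects the src of every <img> with a missing or empty alt attribute,
    since search engines fall back on alt text and surrounding copy to
    understand an image."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            d = dict(attrs)
            if not d.get("alt"):
                self.missing_alt.append(d.get("src", "?"))

def images_missing_alt(html):
    """Return the src values of images that need alt text."""
    audit = ImgAltAudit()
    audit.feed(html)
    return audit.missing_alt
```

Anything this returns is an image the search engines can only guess at.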

As you can see, image SEO is a straightforward process that pays back the effort many times over. Follow the guidelines outlined above and you'll soon see more traffic coming in from image search.

Meta Title Tags are Gold

posted by Web_University @ 8:00 AM
Sunday, August 14, 2011

Meta Title Tags are Gold

* How to write compelling page titles

From an SEO perspective, the title of the webpage is very important. These are the words that describe what your page is about and are the first words that a search engine sees when it crawls your webpage looking for content to add to its index.

The page title is also what the searcher sees in a search result, so it is very important in describing what the page is about; if the title meets the searcher's criteria, your page is more likely to be clicked on and opened.

It is safe to assume that the majority of searchers these days decide to click or ignore based on the content of the title. It is like your ad in the natural-search section of the search engine results page.

Now that the impact of the title of the webpage is obvious, let me explain how to write an effective and powerful title.

First the basics! The webpage title, aka the title tag, is a synopsis of the content of the web page. Since no two pages on your website are the same, why should their title tags be? As a general rule, the title tag of each page on your website should be unique. This is an added bonus from an SEO perspective, because now you can target many more keywords and spread your reach across search engine indexes.

The second thing to consider is whether to add your company name to the title tag. The answer depends on your branding strategy. If your company name is a known brand, if you want to promote your firm's name as a brand, or if your company name contains keyword(s) you want to target – such as ABC Family Solicitors targeting the keyword "Family Solicitors" – then by all means add your company name to the title tag. If not, use the limited but valuable space for your targeted keywords. If you do decide to add your company name, put it at the end of the title, because you want search engines and visitors to read the targeted keyword(s) for that page first and the company name second.

It is important to remember that since the title tag is the synopsis of the web page's content, you need to make sure it is relevant. For example, the title tag for an about-us page is "About Website Design Company – ECommerce Partners"; the title tag does its job of informing what the page is about. Now, you might have noticed that instead of "About Us – ECommerce Partners", we used "About Website Design Company – ECommerce Partners".

That is because "Website Design Company" is one of the key phrases we want to target, so we replaced "About Us" with "About Website Design Company". This brings out an important point: we need to do a keyword analysis before we can write an effective and powerful title tag.

Keyword mining and analysis is a very important part of writing compelling page titles and is part of the Search Engine Optimization service that First One On provides to its clients.

The next step after keyword analysis will be to write down title tags for each and every web page on your website.

Please be careful when writing title tags, and never, ever stuff the title with keywords. Doing so will undermine the power of the title tag and defeat the purpose of ranking better in the search engines. The title tag is the title of your web page, so it must be relevant and meaningful. Remember, this is the headline of your ad in the natural search listings of the search engine.

* General Suggestion

You cannot promote all of your keywords on one page. Normally, you should promote 3 to 5 keyword phrases per page. The title tag should contain up to 3 important keywords that match the body of the page content. If the keyword you are trying to promote is highly competitive, consider repeating it twice in the first 100 words of the page content.

Limit the length of the title to 65 characters or less, including spaces. There's no reason to have the engines cut off the last word and replace it with a "…". Note that some search engines now accept longer titles, and Google in particular now supports up to 70 characters.
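The 65-character guideline is easy to enforce mechanically. A minimal sketch (the names are mine, and real engines truncate by pixel width rather than character count, so treat this as a rough proxy):

```python
def title_ok(title, limit=65):
    """True if the title fits within the display limit (spaces included)."""
    return len(title) <= limit

def preview(title, limit=65):
    """Rough preview of how an engine might truncate an over-long title:
    cut at the limit, drop the last partial word, append an ellipsis."""
    if len(title) <= limit:
        return title
    cut = title[:limit].rsplit(" ", 1)[0]
    return cut + " ..."
```

Running `preview()` over your site's titles shows at a glance which ones will lose their last words in the results page.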

Use a divider when splitting up the keywords. We generally recommend the "|" symbol, aka the pipe bar; others choose the arrow ">" or the hyphen "-", and both work well.

Re-using the title tag of each page as the H1 heading tag can be valuable from both an SEO keyword-targeting standpoint and a user-experience standpoint. Users who arrive from a search result listing expect to find the title they clicked on, and they are more likely to stay on a page they're reasonably certain fits their intended search query.

Your Website’s Financial Success Will Rely On Search Engine Optimization!

One of the difficulties that internet business proprietors encounter is obtaining enough traffic and revenue for their products and services. The financial success of a site is driven by the amount of targeted traffic it gets, and the chief source of traffic nowadays is the top search engines: Bing, Google and Yahoo!. Search engine optimization can lift your rankings if done properly or sink your online business if done wrong.

It is estimated that around 85%-90% of web surfers reach websites via search engines, and most folks click only on the websites listed on the very first page of results. If you wish to get more targeted traffic, sales and recognition, your goal should be to secure a better page rating for your site and get it included among the leading results for numerous web queries. Having a website without an audience is pointless, so it's vital that you compel individuals to visit your website and see what you are selling. A lot of search engine optimization methods will help you realize this goal.

There are a few basic search engine optimization aspects that you ought to keep in mind when you try to make search engines notice your site. Several website owners feel that the only thing they have to undertake is send their URL to many search engines, then sit back and expect their traffic to improve. While search engine submission could help you get an improved ranking and also put you on page 1 of search engines’ results, it must be supported by other SEO tactics.

The search engine optimization process works as follows: after you submit your web address to the search engines, or allow them to find it themselves, they dispatch crawlers that index and examine your website. All your web pages will be ranked through complex algorithms and a sophisticated combination of factors. These bots' purpose is to find the most appropriate pages to offer their searchers for specific subjects, so they must ensure that every link included in their listings actually contains the information that accurately fits the users' keywords. Search engines will then rate your website based on how valuable they determine it to be.

The ultimate target for your web-based business is to have your website on page 1, not page 10 or 20, so choosing applicable keyword phrases, accurate labels and related content is truly important to your success! Your keywords must be painstakingly planned and researched. It's crucial to find out what words or phrases visitors actually enter when looking for certain products and services. In many instances, it's best to stick to the basics instead of trying to be too creative with your keyword phrases. To illustrate: if you sell pet products, refrain from using words like "canine", because that is not the first word folks use when looking for a dog collar.

Another good SEO technique is to use a web address that reflects your products and services rather than your business name, unless your brand is already established. Using the previous example, the phrase "dog collars" ought to be part of your web address if that is the product you're marketing. Aside from that, you should have corresponding or related words in your webpage titles and throughout your site. Titles and copy that do not correspond with each other will confuse your customers, and you won't secure the positioning you want.

In addition, your page titles should be thoughtfully made in order to mirror the material that appears on your site, and your material has to be unique, helpful as well as enjoyable. Your website cannot look like a twin of other websites; you want to make your own web presence and be known as an expert in your selected niche.

It is tough to trick contemporary search engines, so don't even bother trying! Keyword stuffing – placing popular search phrases on pages that are totally unconnected to those phrases – is a technique employed by internet business owners who want great outcomes without doing search engine optimization correctly. This strategy will lower your page rank or even get you banned from search engines' listings permanently! Search engine optimization ought to be done correctly for it to make your internet business and website successful.

Honing The Art Form of SEO

posted by Web_University @ 8:00 AM
Thursday, August 11, 2011

Honing The Art Form of SEO

In the world of SEO there seems to be an article on everything from “how to get more backlinks” to “getting that #1 ranking on Google.” The more you read up on SEO tactics, the more you realize many people seem to give the same advice: write great content, submit it to social bookmarking sites, and be consistent.

Although all of these are great points, the one thing missing from many of these articles is that it isn't that simple. You can write a great article, publish it and have it go viral – but you also need to focus on the ART form that is SEO. Google is always changing the game for SEO experts, and you have to know how to go with the flow.

The fact is no one knows for sure what Google is looking for. We can guess, run case studies and be pretty sure on many things, but giving the same advice over and over again isn’t going to get any of us anywhere. So what else can we do?

* Build Relationships

We hear this a lot in the social media world, but the more relationships you build with site owners, bloggers and reporters, the more likely you are to get your articles out to the right audience. Creating a funny infographic or article is great, and you could get a lot of link juice from it, but what about the next week, month, or year? The more relationships you build, the more likely you are to get your great content out to the right audience.

* Reach Out To Other Writers

Obviously your company needs to come up with great content-for both your site as well as others. But another way to get back links is to have other people write about your business. One of the reasons you create content is so people will link back to it-so start reaching out to writers and give them information they will want to write about.

Press releases are one great way to do that, but remember reporters get tons of press releases and phone calls from PR pros every day. This is where you need to get creative and figure out a reason why someone would want to write about your company.

* Fresh Ideas

Having a great plan is, well, great, but just as Google is constantly changing, so should your plan. What works great one day could work against you on another day. This is why you need to consistently think of new ideas for backlinks, viral content, etc. That is much easier said than done, however. If it were easy, we would all have great PageRank and wouldn't need to read up on SEO every day.

Mind mapping tools and weekly brainstorming meetings are a great idea. One idea can morph into many, and the next thing you know, you could have the next few months of content ideas from just a few meetings.

* Trial and Error

SEO is also a game of trial and error. You won't know what works until you try it. Don't be afraid to take chances and see if your theory is right. Good or bad, it is a learning experience. The tools are out there, and writing great content and submitting it to social bookmarking sites can work – but there are other ways to get exposure. It just takes some creativity and hard work.

Just like with other art forms, sometimes you have to create a mess before you can start creating masterpieces. This isn’t to say you need to bomb your SEO efforts completely and lose revenue, it just means you can stumble a little on new efforts while you’re hitting home runs with your “tried and true” efforts.

Search Engine Optimization

posted by Web_University @ 8:00 AM
Tuesday, August 9, 2011

Search Engine Optimization: Help Your Business Grow

When someone has a question, many times these days the first thing they turn to for an answer is the Internet.  Just about anything and everything can be found online.  From the most basic answer to the simplest question to in depth research about complex topics — it’s all available on the Internet.

As a business owner, the Internet can be a valuable tool in your marketing efforts. You need to make sure that when people type in a search phrase related to your business, your website comes up on the results pages. The process of search engine optimization (SEO) enhances your website to make it more appealing to the search engine spiders that crawl it and, when done properly, improves your website as a whole.

* SEO as a Business Strategy

The top three search engines are Google, Bing, and Yahoo, so these are the sites to target in your SEO efforts. Since Google is undoubtedly the industry leader, it sets the standards, and it should be your first priority.

If your business is not optimized for the search engines and your competition's website is, then they will rank higher in the search engine results pages (SERPs), and people will be more likely to click through to their site and do business with them. The higher you rank on Google and the other search engines, the more traffic your site will receive; and if you have a good website, the more conversions you will make and the more your profits will increase.

* Editing Your Website For SEO Purposes

In order to properly optimize your website for the search engines, you will need to perform some basic tasks.

First, make sure you pick around 10-15 keywords to use as the meta tags for your website, and focus on 3-5 as your main keywords. Meta tags consist of the title of the page, the description of the page and the keywords of the page.

Search engines use the meta description tag as the description of your website in the SERPs.  The keywords are used as search terms or phrases so that when someone searches for those terms, your website appears as a result.

You want your site to be as friendly to the search engines as possible in order to rank higher for your targeted keywords. Repeat your keywords, but not too much: keep the keyword density to no more than 1 occurrence per 100 words of text.
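That 1-per-100-words rule of thumb can be checked directly. A small sketch (the tokenizer and function name are my own simplifications):

```python
import re

def keyword_density(text, keyword):
    """Occurrences of `keyword` per 100 words of `text` (case-insensitive).
    Multi-word keywords are matched as consecutive word sequences."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    n = len(kw)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    return 100.0 * hits / len(words)

# Per the guideline above, aim for a density of about 1.0 or less.
```

Anything well above 1.0 for a single keyword is a sign the copy is starting to read like stuffing.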

Being at the top of the results pages for targeted keywords is an extremely important part of business in today's world. Make your website both user-friendly and search-engine friendly by performing website editing tasks like regular content updates and proper use of meta tags.

SEO Company Stole My traffic!

posted by Web_University @ 8:00 AM
Monday, January 3, 2011

SEO Company Stole My Traffic!

Believe it or not, this story is true. It happened to a friend of mine. I am not at liberty to name the SEO company, especially since the investigation is still ongoing, but this is what happened.

A couple months ago, my friend hired an expensive SEO company (charged $2500) to reoptimize his website to get maximum exposure for the search engines.

After he paid the fee, he soon learned that they had contracted the job out overseas to a group of strangers, who asked for his website hosting username and FTP password and told him it would be ready in a few days.

They made some changes to his website; he could see many of them, but not all. They told him to wait 3 months before making any other changes and to let their SEO work do the job.

He waited, and his traffic started dropping. He contacted them, and they told him that it was completely normal while his website was being reindexed by Google, and to be patient.

His orders began to suffer and his visits kept decreasing; he barely lasted the 3 months. When he tried to contact them again, they had disappeared.

He hired someone else to go in and take a look at his website to figure out what had gone wrong. This is what they had done:

In his product catalog, some of the product names had hidden JavaScript next to them. When someone went to the main website directly and clicked around, the website behaved normally…

However, if a visitor arrived with a Google referrer in the HTTP request, the JavaScript would activate and send that visitor to a competitor's spammy website selling the same products.

The only way he could have seen this was to visit his website the way a normal visitor arriving from Google would. Instead, he typed his website address in manually, so the JavaScript never activated.
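One defensive habit this suggests: periodically fetch your own pages both directly and with a search-engine Referer header, then compare what comes back. The sketch below (function names and the comparison rule are mine) catches server-side referrer cloaking; for client-side tricks like the one in this story, you would additionally inspect the served HTML for scripts that check document.referrer:

```python
import urllib.request

GOOGLE_REFERER = "https://www.google.com/"

def fetch(url, referer=None):
    """Fetch a page, optionally spoofing the Referer header."""
    req = urllib.request.Request(url)
    if referer:
        req.add_header("Referer", referer)
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

def looks_cloaked(plain_html, referred_html):
    """Crude check: does the page served to 'search traffic' differ from the
    page served to a direct visit? Any difference deserves a manual look."""
    return plain_html.strip() != referred_html.strip()

# Usage (live check; example.com stands in for your own domain):
# cloaked = looks_cloaked(fetch("https://example.com/"),
#                         fetch("https://example.com/", GOOGLE_REFERER))
```

Dynamic pages will show harmless differences (timestamps, session tokens), so treat a mismatch as a prompt to look closer, not as proof of foul play.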

So here is a guy who pays $2500 to an SEO company to help increase his traffic, and all they ended up doing was stealing his money and his traffic.

This is something that everyone needs to be careful about. Don't ever trust an SEO company unless you have investigated them first, and don't just hand over your FTP username and password to someone and say, "go ahead, and do what needs to be done."

…in this case what needed to be done was to hijack his website, steal a nice sum of cash, and run off in the middle of the night. His payment was cashed overseas, and the free mail accounts they had were no longer operational.

A real nice scam. Plus, you have to wonder how much they made off the free traffic they stole over that 3-month period, and how much traffic they stole from other websites caught in their SEO scam.

We're all so desperate to get to the top rankings of the search engines that sometimes we lose our business sense and just hand money over to the first person who promises what we want to hear.

SEO & Social Media Marketing To The Rescue

posted by Web_University @ 8:00 AM
Saturday, October 16, 2010

SEO & Social Media Marketing To The Rescue

Now that the website is live, we just sit back, celebrate at the launch party with a bottle of bubbly and wait for all that traffic and recognition, right? Dream on!

Internet marketing and online promotion require ongoing attention and care. The good news is that this part of the "web-dev" process can be the most fun and bring the most joy – both emotionally and financially. Checking your website stats and seeing ten times the traffic compared to last month, or having to hire additional help to respond to all the contact form submissions, can be a very exciting time for any online business. (Yay! #success) Where do I start, then?

A properly crafted website is already well on its way with this.

During the website development phase, great care and attention were taken to lay the groundwork for a sound SEO (search engine optimization) foundation. Well-thought-out page titles, meta tags, keyword-sensitive copywriting and a solid interlinking system should all have been put in place. Having these items in place will greatly increase your relevance in the eyes of the major search engines.

Web directories and search engine submission

One of the first steps after launching the website should be submitting your site to the top web directories and search engines. Getting listed can sometimes cost a nominal fee and at times can be a long process, so it’s important to get the ball rolling.

Getting your site known and listed on directories can help in numerous ways:

People who use these directories can find your site

Lets the search engines know that your website exists

Provides links back to your site from some top page ranked sources

Link building

Link building is another important step in promoting your website. Having inbound links on highly reputable and relevant websites is probably at the top of Google's long list of criteria for determining your page's rank. Make sure your approach to link building is ethical. (Not-so-good practices, such as using link-farming services, can quickly make your site disappear in the eyes of the major search engines.) Getting your links listed on websites relevant to yours can direct visitors to you as well as make the search engines say: "Hey, if that top-ranking website thinks your site has reputable and relevant information, then it must be so."

Best practices for link building include (but are not limited to) the following:

Writing articles for websites or blogs and including a link back to your site in your author's bio

Writing articles and submitting them to article publication services

Adding a link on your site to a website you hope to get your name on, contacting them and asking for a reciprocal link in return

Having content on your website that is actually inbound link worthy

Keeping your content fresh

A very large mistake many people make after launching a website is never adding new content or updating existing pages. This hurts your visibility in numerous ways: if there is never anything new to see, visitors will never feel the need to return. The same is true for search engine spiders – they want to eat up and index all of your new content.

Top experts also feel that search engines give more weight to sites with frequently updated content, since the information must be time-relevant. Spiders keep track of how often new content is added to your site and use that data to determine how frequently they come back. The more content you produce, the more visits you'll have from all your friendly neighbors on the net.

Harnessing social media

Using social media channels like Facebook and Twitter can help keep you in touch with your fans, tribe and client base. Social Media is a great avenue for building upon old relations and creating new ones as well. The viral nature of social media marketing also provides huge potential for exposing your website to an exponential number of people, friends of theirs and their friends’ friends, and so on ad infinitum.

There are many ways to market your website and get your name out there so that your people find you.

When is an SEO agency NOT an SEO agency?

posted by Web_University @ 8:00 AM
Friday, October 8, 2010

When is an SEO agency NOT an SEO agency?

Let’s face it, almost all SEO agencies, like most online service providers, will outsource some aspects of their work. This is nothing new. In fact, even before the advent of the internet, businesses would traditionally outsource certain tasks – Hence the plethora of ‘temping’ agencies.

Commonly outsourced tasks were things like telesales, recruitment, even accounting and auditing to some extent. But with the massive growth of online business, outsourcing has grown, and indeed actually changed in its very nature.

In the online business world, the term outsourcing has become almost synonymous with paying workers who live in other countries, whereas previously the outsourced workers would be in the same country as the employer, if not the same city!

What about the question we started with? When is an SEO agency NOT an SEO agency?

The answer is not simple, and the question is really only meant to provoke thought. My own feeling is that an SEO agency is not really an SEO agency when it outsources all the work except the customer-facing aspect.

What I mean by this is that some SEO ‘agencies’ merely employ UK-based office staff to do the selling and deliver the silver-tongued sales pitches, then farm out the actual work to cheap overseas labour. The better agencies may have their own database of workers they use on a regular basis and trust – whereas others may simply use freelancer-type websites to ‘pick up’ staff.

Whilst this may not at first seem like much of an issue, it is not exactly honest. The kind of setup mentioned above, to my mind, is NOT an SEO agency, but more of a sales agency that passes the work on to others.
I guess you could argue that they are more than a sales agency; after all, perhaps they plan a campaign prior to the work being handed out to freelancers, and pop a quick report together once it’s been done… Alright, let’s call them an admin agency.

Is it wrong for SEO agencies to outsource at all?

Again, this is down to a matter of opinion. I think it’s fine for any company, whatever its industry, to outsource some of its workload. At kingpin-seo we sometimes outsource bits and pieces – who doesn’t? But the problem perhaps arises when the bulk of the workload is outsourced without the knowledge of the SEO agency’s clients.

The client believes their work is being carried out by UK workers, whereas unbeknownst to them, someone halfway round the world is carrying out the actual work, and for a tiny fraction of the amount the client is paying the agency.

What is it good to outsource, and what shouldn’t be outsourced?

Again, this is my own opinion – but it is founded on background knowledge of what our own clients are happy with.

We feel it is okay to outsource small amounts of manual, repetitive work – things like gathering initial, basic info on competitors (although we prefer to carry out in-depth reports in house!) and some elements of promotion (such as submitting articles that are, of course, written in house).

A basic rule of thumb: if the client would not be comfortable with something being outsourced, then it shouldn’t be outsourced. Simple!

Another little rule of thumb is that if something requires or involves decision making that could have some bearing on the SEO campaign as a whole, it shouldn’t be outsourced.

How do I find a decent SEO agency that isn’t simply an ‘outsource/admin agency’?

Simple – just ask! Seriously, call or email the agency and ask them straight whether they outsource, and to what extent.

If they say they don’t outsource anything at all, either they are lying, or they don’t fully understand the question!

If they say they do outsource, but are fully transparent about what processes and work practices they outsource – that’s a good start. We can define in a single sentence our stance on outsourcing, and we suggest you ask any potential SEO agency to do the same…

“We never outsource any decision making, creative production or administrative workload, everything is planned in house, and the vast majority of our work is carried out in house… any outsourcing we do is isolated to repetitive techniques such as submitting a previously written high quality piece of content to directories & websites”

If an agency is not willing to disclose to what extent they outsource, then I would just walk away. At the end of the day, it is YOUR money being used for this service, and just like a service offered offline, you are entitled to know where it is going!

Resource Section – About Kingpin-seo

Kingpin-seo is a client focused, ethical SEO agency that benefits from Google news approval. Kingpin provides cutting edge ethical link building services and transparent SEO techniques for its SEO clients.

Organic SEO or PPC advertising?

posted by Web_University @ 8:00 AM
Wednesday, October 6, 2010

Organic SEO or PPC advertising?

Sadly enough, there are still many online business entrepreneurs who take it for granted that they can use the old marketing techniques to overtake their competitors. Little do they realize that those old basic techniques, such as including a simple “click here” link, simply don’t cut it anymore. If, for example, you own an online business that is currently in dire need of some changes, then you may need to consider taking advantage of online marketing techniques such as search engine optimisation and pay-per-click advertising.

In fact, I wouldn’t be surprised if you’ve already heard about these techniques from some of the world’s best internet marketers, but the truth is, you might be reluctant to begin integrating them into your existing business. However, there’s nothing to be gained by simply keeping these strategies on hold, so here are a few tips on how to go about using search engine optimisation (SEO) and PPC advertising.

Search Engine Optimisation

Organic search engine optimisation is essentially an online marketing strategy that depends on momentum and long-term commitment. By utilising SEO, you’ll be taking a step in the right direction: building link campaigns, relationships with other webmasters, and even some respectable and desirable publications. Of course, in order to stay in full control, you need to set certain milestones so that you can monitor your progress as you proceed.

For example, you need to ask yourself what it is exactly that you wish to accomplish. You also need to pay attention to your current image and to the level of optimisation regarding your website.

An experienced SEO specialist will be able to help you determine which keywords are best to use, and of course they’ll help you integrate those keywords into your meta tags so that you can restructure your marketing strategies and overcome any negative fallout from previous attempts. Over and above SEO, you could also take advantage of other techniques, such as paid one-way links and link exchanges, providing you do so with reputable websites. But don’t forget – once you start organic SEO, you need to continue with it to maintain the momentum, or else your diligently attained rankings will go down.

Pay-Per-Click Advertising

PPC advertising places much emphasis on keyword usage and the placement of adverts relevant to a specific website. In fact, it’s often said that this form of advertising has revolutionised online advertising, in that it can provide small businesses with the same leverage as the big businesses have. Provided it’s done correctly, PPC advertising can certainly help you stand out from the crowd. If you’re currently considering a PPC campaign, you should also pay attention to the following three questions:

What do I have to offer?

Why will customers want to click on my advert?

How can I hook them with just ten words?

At this point, the most important thing for you to do is to craft an ideal title and ten words that tell potential customers what your business is all about. The most difficult aspect of PPC advertising is that you will be in close proximity to your competitors, both in search engine results and in sponsored positions. Remember, if someone types in a search relevant to your type of website, your advert will appear at the top of the page or on the right-hand side, and it’s vital that your advertisement be powerful enough to trigger an immediate response.

Essentially, in order to get the best results you should ideally consider using SEO and PPC advertising simultaneously, rather than just opting for one of them.

Use SEO Strategies to Increase Web Traffic

posted by Web_University @ 8:00 AM
Monday, October 4, 2010

Use SEO Strategies to Increase Web Traffic

Every new technology adopted widely by society brings about a number of new opportunities. The movable type printing press created affordable print information, the telephone and radio created the concept of instantaneous communication over great distances. Today, the Internet has unified both of these concepts into the information explosion that is the digital age.

Consider this article alone – a mere forty years ago printing even fifty copies of each page would cost either a chunk of change or at least a suspicious look from the boss as you hovered over the office copier. Now the information can be sent to thousands of people within the time it takes to brew a good cup of tea.

Of course with every technology comes a system to make the best marketing use of that advancement. The radio gave rise to the modern commercial advertisement, which was refined by the television and still persists on the Web. The telephone gave us telemarketers and the first concept of communication networking. For making the most of the Internet, the strategy of the day is Search Engine Optimization (SEO).

What is SEO, again?

In short, SEO is the presentation of a webpage in such a way that it consistently ranks highly in particular search engine results. While fads and sensations can quickly boom online from “word of mouth,” they don’t produce the same reliable success as a balanced, systematic approach.

Very few businesses, after all, want one rush of attention that leads to a website crash, followed by an equally quick slide into the various forgotten graveyards of the web. Therefore, SEO uses a combination of elements to make the site increasingly relevant to the various searches that Internet users perform, to bring it up again and again among the best results.

Key SEO Strategies

1. Set goals.

Identify what you want your SEO campaign to accomplish. While any SEO-conscious writing and page design can contribute to a site’s search engine rankings, an unfocused effort will simply waste time and money. After all, a business promoting athletic clothing and footwear may not benefit too much from showing up in searches for evening wear. Is your goal simply to increase your site’s visitor traffic? Do you want to generate more sales of a product? Is it part of an effort to promote your digital brand? Each of these goals benefits from different aspects of SEO technique.

2. Link up.

Link building is one of the cornerstones of any SEO effort. Many search engines are spider-based, meaning they use automated processes to collect and categorize information on various websites. When a large number of websites provide links back to your business, or when a particularly high-traffic site does so, the spiders take notice and increase your site’s relevance in related searches.
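
The idea of spiders tallying links pointing at each page can be sketched with a toy link graph. This is only an illustration of the counting principle; the site names and the simple rule here are hypothetical, not how any real engine weighs links:

```python
from collections import Counter

def inbound_link_counts(link_graph):
    """Count how many distinct sites link to each destination site.
    link_graph maps a source site to the set of sites it links to."""
    counts = Counter()
    for source, destinations in link_graph.items():
        for dest in destinations:
            if dest != source:  # ignore self-links
                counts[dest] += 1
    return counts

# Toy web of four hypothetical sites.
graph = {
    "blog.example":  {"shop.example", "news.example"},
    "news.example":  {"shop.example"},
    "forum.example": {"shop.example", "blog.example"},
}
counts = inbound_link_counts(graph)
# shop.example, linked from three distinct sites, scores highest.
```

Real ranking systems also weight each link by the linking site's own importance, which is why one link from a high-traffic site can outweigh many small ones.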

3. Get the keys.

Keyword writing is consistently stressed as a requirement when websites look for content writers. Keywords are just that: words and phrases chosen for their popularity and relevance to key searches.

There are dozens of theories about keyword writing. In the earlier days of SEO writing, it wasn’t uncommon to see pages that were nothing but long strings of repeated variations on a few keywords. This has evolved into more organic writing that fits in keywords with the article as a whole.

Whichever strategy is chosen, care must be taken to avoid the temptation to abuse keywords. Yes, a proper keyword density will bring up your search rankings over time. However, Google can and does ban pages from its index when it determines them to be a keyword-abusing effort. So consider your keyword choices carefully, and seamlessly integrate them into your entire strategy.

4. Be on the right page.

One aspect occasionally neglected in SEO is the architecture and design of the webpage itself. Search engines and their ranking systems (be they spider or human based) are growing more sophisticated all the time, and look at many different factors in their decisions. A site that buries its keyword-rich articles on interior pages behind dozens of subsidiary links will not perform as well as one with strategic keyword-oriented material right on the front page. Have an SEO-conscious designer look over your page, as well as your articles.

Remember that every business is a multi-faceted whole. Many failures occur when people attempt to compartmentalize too much. You can’t treat SEO as some sort of ‘event’ that you do every so often, just as a business can’t put off routine maintenance of its equipment and expect it to function properly. Integrate your efforts into the entire process, and give them the same focus as any other effort in the business, and they will repay the investment much more reliably and quickly.

Optimize PDF Files for Maximum SEO Performance

posted by Web_University @ 8:00 AM
Saturday, October 2, 2010

Optimize PDF Files for Maximum SEO Performance

A PDF file can take the form of an eBook, a technical document or a brochure. Most search engines can read the content of PDF files and index them. Currently, there are a number of well-optimized PDF files that rank well and drive traffic to their websites. Listed below are some tips for optimizing PDFs:

1. Use a Text Based PDF Creator:
There are a lot of free tools available online, with Adobe Acrobat being the best-known text-based PDF creator. If a PDF document is created in an image-based program, the search engines will completely ignore it. If the PDF is created using a text-based creator like Adobe Acrobat, the search engine robots will read and index the text like any other web page.

2. Update the Document Title:
The title of the PDF file is as important as the title tag of a web page. The PDF title property tells the search engine robots about the type of content. The most important aspect of the title is that Google uses the text in the title field as the link in the search engine result pages. Thus, the title field should be keyword rich and should not contain random text.

3. Complete the document properties:
A PDF file contains many document properties apart from the title field: keywords, description, author info, copyright info, etc. All the fields should be completed with relevant information. The keyword field should not be stuffed with keywords, nor should it remain empty. It has not been proven that the search engines give weight to the keyword field in the document properties, but if they do in the future, your PDF file will have an advantage over other web pages.

4. Link to the PDF File from the Homepage:
The searchbots will not discover and index the PDF file if it is placed too deep within the website. To ensure that the PDF file gets crawled by the search engines, it should be visibly linked from the home page or another page that gets crawled regularly. If your aim is to get the PDF into the top search engine result pages, then you have to lead the searchbots to it.

5. Optimize the content in the PDF File:
The content in a text-based PDF file is similar to the content of a website. This makes content optimization an important aspect of optimizing PDF files. The content should be relevant to the subject matter. Important text should be highlighted by increasing its font size and using the bold and italic features of the PDF format. Keywords should be placed in the first few lines of the content.

6. Place Links in the PDF File:
When a visitor opens a PDF file ranking in the top search engine results, there should be a link in the file back to its original website. This saves the visitor the effort of hunting for the main site. Also, a link from the PDF file can be counted as a backlink by the search engine.

A PDF file is similar to a web page in an assortment of aspects. It should be optimized with as much care as a web page to achieve high rankings.
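
The discoverability point in tip 4 can be made concrete: crawl depth is just the shortest click-path from the home page. A minimal breadth-first sketch over a hypothetical site map (the page names are invented for illustration):

```python
from collections import deque

def click_depth(links, start, target):
    """Minimum number of clicks from start to target, or None if
    unreachable. links maps each page to the pages it links to."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        page, depth = queue.popleft()
        if page == target:
            return depth
        for nxt in links.get(page, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return None

# Hypothetical site map: one PDF linked from the home page,
# another buried behind two interior pages.
site = {
    "home": ["about", "brochure.pdf"],
    "about": ["archive"],
    "archive": ["old-brochure.pdf"],
}
shallow = click_depth(site, "home", "brochure.pdf")      # 1 click
buried = click_depth(site, "home", "old-brochure.pdf")   # 3 clicks
```

The shallower the path, the sooner a crawler working outward from the home page reaches the file – which is exactly why tip 4 says to link the PDF from a regularly crawled page.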

SEO Article Writing: Using Keywords in Article Headlines

posted by Web_University @ 8:00 AM
Monday, September 27, 2010

SEO Article Writing: Using Keywords in Article Headlines

So, you have your list of keywords and you’re wondering how to incorporate them into your article titles. You’re wondering if it’s possible to do SEO article writing that also makes sense to humans.

If you go overboard with your key phrases, then your article has a good chance of being declined by publishers right off the bat.

How can you effectively use keywords in your article titles?

Is it possible to please search engines, publishers, and human readers?

Yes! This article spotlights a few techniques you can implement to effectively and correctly use your keywords in your article titles.

First, let’s lay the ground rule:

*Your title must serve your reader, first and foremost. The purpose of your title is to tell the reader what your article is about. A title is a great place to use your keywords, but the title must still make sense, be grammatically correct with proper spelling, and accurately portray the subject matter of the article.

Now, on to the tips:

  1. Your title must reflect what your article is about. Most of the time, this decline reason comes up when a person writes an article and then tries to squeeze keywords into the title as an afterthought, when the article is not really about those keywords. For example: if your article title is “10 Heart Healthy Soups”, then your article must talk about 10 heart healthy soups. Whatever is promised in the title must be delivered in the article.
  2. Resist the urge to use a minimalist keyword-only title. If you’re extremely focused on your keywords and the impact they can have on your search engine ranking, you might wonder, “Why not just make a title that is totally keyword focused?”

For example: Hiking Boots

What is wrong with that?

Well, first of all, this title is not very specific, nor does it draw a reader in. If you’re using a two-word key phrase, most likely your phrase is extremely general and not specific enough to make a good title.

Your title should specifically indicate what your article is about, and if your article is about a specific aspect of “hiking boots”, then the title should reflect that. For example: “Hiking Boots: Top 5 Best Performers”

If you’re using long-tail keyword phrases (3-5 words long), then the title sometimes almost writes itself. For example, “How To Eat Healthy” may be your long-tail key phrase, and it also works well as a title.

But many long-tail key phrases need extra words added to them in order to make sense. For example, the phrase “Used Car Values” is pretty general, and the article is likely about a more specific topic, such as “Used Car Values: How To Negotiate The Best Price For A Used Car”.

  3. This almost goes without saying, but unfortunately I see this sometimes: Your title should not be a list of keywords.

What would you think if you saw a “title” that looked like this:

Used Car Pricing, Used Car Values, Used Car Deals

This type of title does not make sense, is not helpful to the reader, and was obviously an attempt to get as many keywords in the title as possible. Most publishers would immediately decline an article with a title like that.

The main idea is to write for your human readers first by creating a helpful and specific title that reflects what your article is about. You may use your keywords in the title if they sound natural and make sense.
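
A crude filter for the keyword-list titles described above can be sketched in Python. The heuristic here (three or more short comma-separated fragments) is a rough stand-in for a human editor’s judgement, not any publisher’s actual rule:

```python
def looks_like_keyword_list(title):
    """Flag titles that read as comma-separated keyword fragments.
    Several short fragments with no connecting language is the red
    flag; we approximate that as 3+ fragments of at most 4 words."""
    parts = [p.strip() for p in title.split(",")]
    return len(parts) >= 3 and all(len(p.split()) <= 4 for p in parts)

good = "Used Car Values: How To Negotiate The Best Price"
bad = "Used Car Pricing, Used Car Values, Used Car Deals"
```

A real publisher’s review is of course more nuanced, but even this simple check separates the two example titles from the article.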

Why is SEO So Important for E-Commerce Websites? The Basics

posted by Web_University @ 8:00 AM
Sunday, September 26, 2010

Why is SEO So Important for E-Commerce Websites? The Basics.

In this modern age, it is increasingly important to make sure your websites are optimized for the major search engines through the use of Search Engine Optimization (SEO) practices.  The current top three search engines are Google, Yahoo and Bing.  Let’s face it though; Google is the major player and they set the standards when it comes to the Internet search industry.

There are a few basic practices you can follow to make your site more friendly to the search engines and thus rank higher for certain keyword phrases when users search for them.  You just need to make sure you are conducting these practices properly.  Being at the top of the search engines for your target keywords is a make or break factor for the success of your business.

There are millions of websites in existence and there could be thousands competing for the keywords you are trying to rank  for.  The Internet is a very competitive marketplace so it is vital to stay on top of the current SEO trends.  Not being recognized by the top search engines can be very frustrating, especially after putting a lot of work and money into creating and designing your website.

Each search engine uses a different algorithm to conduct their searches.  These algorithms are kept secret by the companies and are always changing.  However, there are numerous people whose jobs consist of attempting to deconstruct these algorithms to figure out how the engines rank pages.

In order to increase your web page’s rank in the search engine results, take the following advice into consideration:


  • Remember that content will always be king.  Fill your web pages with relevant information that will help visitors – search engines love this.
  • Fill your content with your target keywords but don’t overdo it by stuffing keywords into your pages.  Keep your keyword density to no more than one anchor-text keyword hyperlink per 100 words of content.  If the pages of your site contain keywords relevant to a searcher’s phrase, your site has a higher likelihood of appearing on the results pages.
  • Be specific with your keywords.  If you are selling “Men’s Nike Air Jordans” target that phrase not “Michael Jordan shoes.”
  • When choosing keywords, put yourself in the shoes of the searcher. What would you search (WWYS)?
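
The one-per-100-words guideline above is easy to check mechanically. A minimal density calculator, assuming a plain whitespace-split word count (real tools tokenize more carefully):

```python
def keyword_density(text, phrase):
    """Occurrences of a keyword phrase per 100 words of text -- a
    rough check against the one-per-100-words guideline above."""
    words = text.lower().split()
    target = phrase.lower().split()
    n = len(target)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == target)
    return 100.0 * hits / len(words) if words else 0.0

# A 100-word sample page mentioning the phrase exactly once.
page = "mens nike air jordans " + "filler " * 96
density = keyword_density(page, "Nike Air Jordans")
# density is 1.0: one occurrence per 100 words, right at the guideline
```

Running this over each page of a site quickly surfaces any page that has drifted into keyword-stuffing territory.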

Meta Tags (Title, Description & Keywords):

  • The title tag is the description of the current page that appears in the top bar of your web browser.  For SEO purposes, use your first targeted keyword at the beginning of your title and try to describe what your website is all about in 10-15 words.
  • The keyword tag consists of the list of keywords you have put together in your market research.  Search your competitors’ websites to see what keywords they are using, then rank them in order of relevancy from first to last.
  • The description tag is a short explanation about the content of your website and its topic.  Keep this under 20-25 words and try to use at least three to four target keywords.  This is what users will see on the search engine results pages under the link to your site.
  • For the best SEO results, edit the meta tags for each page of your website to match the contents of that specific page.
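
The tags above can be sketched as a small HTML snippet generator. The word limits encoded here are this article’s suggestions, not HTML requirements, and the shop example is invented:

```python
from html import escape

def meta_block(title, description, keywords):
    """Render the title, description and keywords tags discussed above."""
    # These limits follow the article's 10-15 and 20-25 word advice.
    assert len(title.split()) <= 15, "keep the title to 10-15 words"
    assert len(description.split()) <= 25, "keep the description to 20-25 words"
    kw = escape(", ".join(keywords))
    return "\n".join([
        "<title>%s</title>" % escape(title),
        '<meta name="description" content="%s">' % escape(description),
        '<meta name="keywords" content="%s">' % kw,
    ])

block = meta_block(
    "Mens Nike Air Jordans - Buy Basketball Shoes Online",
    "Shop Mens Nike Air Jordans online with free shipping and easy returns.",
    ["nike air jordans", "basketball shoes", "mens trainers"],
)
```

Generating these per page (rather than copying one set site-wide) keeps each page’s tags matched to its actual content, per the last tip above.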

Submit Your Site to the Search Engines:

  • It is a quick and painless process to submit to the top search engines.  Simply visit the websites of the major search engines and fill out the corresponding submission forms.
  • You should also submit your website to web directories.  It can take a few months to get listed, but it is a great SEO technique, and most sites listed in a directory get higher priority in the search results over sites that are not listed.
  • Patience is a virtue when it comes to submission.  It can take days or weeks for the search engine spiders to crawl your website and index it.

Build Links by Submitting Your Site to Directories and Niche Websites:

  • The more meaningful links from highly ranked websites pointing to your site, the higher your search rating.  However, this does not mean you should practice “link farming” or pay for links.  These are considered negative practices that can hurt your rating and can even get you banned from the search engines.
  • The best way to be featured in directories is to make a profile with a link to your page.
  • Article marketing is another very valuable SEO tactic.  Write articles with keyword hyperlinks pointing to your page.  Other sites are happy to have quality content and you obtain linkage, so both sides win!  You will also see direct traffic from these links which will help your site gain popularity.
  • Link building is a time-consuming process that can take long periods of time to see real results, so again – be patient!

By following these simple SEO tactics, you can raise your ranking in the search engines!

SEO: How To Research For Free

posted by Web_University @ 8:00 AM
Wednesday, September 22, 2010

SEO: How To Research For Free

The first step of any online campaign is deciding what you want to be known for, particularly when aiming for a stronger presence on the search engines. These terms or search phrases that you wish to be seen for are called ‘keywords’. Researching and implementing these keywords into your website is crucial to the success of your website. It is vital that you look into both the search volume of each keyword and also the competition levels. By conducting a bit of research first you will gain a good understanding of how your market looks online, and also what keywords you could realistically achieve good positions for.

There are many ways you can conduct keyword research. The internet offers different tools, both free and paid for, and in addition to that there are some commands on the search engines that will also give you some clues. Free tools such as Google’s Adword’s keyword tool give you a basic look into monthly search volumes and competition levels, however it is sometimes questionable as to how accurate these figures can be. If you want to step it up a notch, you could try paying for software such as Word Tracker. Word Tracker will give you a much more thorough analysis of each word and ultimately give you a better view of the market.

For most people, using the free tools and search commands are enough to give a good idea of what keywords you should target with your website. Listed below are a few useful search commands that you could use to do a little research:

1. “inurl:keyword”

By typing this into a search engine, replacing ‘keyword’ with your own keyword, you will get a list – and more importantly a number – of websites that are using your keyword in their URLs. This will give you a solid idea of how many websites are directly targeting your keyword.

2. “intitle:keyword”

Second in importance only to the URL, the meta title is vital when targeting a particular keyword. This command shows you how many websites are using your keyword in their meta titles.

3. “intext:keyword”

This command will give you a well-rounded view of the market. It produces a list of websites that are talking about your keyword. The perfect keyword is one that has a substantial amount of traffic yet a relatively low competition level. Of course, the keyword must be highly relevant to your website and business to ensure that any traffic that comes from the search engines is looking for exactly what you offer.
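
The three commands above can also be issued programmatically; a minimal sketch that just builds the query URLs (the operator spellings are Google’s search operators; actually fetching results in bulk would need to respect the engine’s terms of service):

```python
from urllib.parse import urlencode

def operator_query(operator, keyword):
    """Build a Google search URL for one of the research operators above."""
    assert operator in {"inurl", "intitle", "intext"}
    # Quote multi-word keywords so the operator applies to the whole phrase.
    phrase = '"%s"' % keyword if " " in keyword else keyword
    return "https://www.google.com/search?" + urlencode(
        {"q": "%s:%s" % (operator, phrase)})

url = operator_query("intitle", "hiking boots")
```

The interesting number for research is the result count the engine reports for each URL – a proxy for how many pages are directly targeting that keyword.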

You will probably find that when you start to look into keywords, varying the phrase slightly can uncover keywords with high traffic levels and low competition… this is exactly what you are looking for. Also, don’t ignore keywords with low search volumes; if these keywords are relevant to your website, the traffic they bring could be much higher in quality if you are targeting a niche area.

Google Instant Means The End Of SEO

posted by Web_University @ 8:00 AM
Saturday, September 11, 2010

Google Instant Means The End Of SEO

Google’s new Instant Search system could mean a major change in how web surfers look for information online. Instead of typing a search query into Google, hitting return, and waiting for a list of results, Googlers now see a dynamic list of results as they type. Google considers this a positive step forward in the development of search. Google claims this new style of response will save between two and five seconds per search query. That potentially means 11 hours are saved every second – but does anyone other than Google really care?
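
The “11 hours saved every second” figure follows from simple arithmetic, assuming on the order of a billion searches per day – the daily-volume figure is my assumption, not stated in the article:

```python
# All three inputs are assumptions for the sake of the arithmetic:
# Google's claim is 2-5 seconds saved per search; daily volume is a guess.
searches_per_day = 1_000_000_000
seconds_saved_per_search = 3.5      # midpoint of the 2-5 second claim
seconds_per_day = 86_400

user_seconds_saved_per_day = searches_per_day * seconds_saved_per_search
# Spread evenly over the day: user time saved per elapsed second.
hours_saved_per_second = user_seconds_saved_per_day / seconds_per_day / 3600
# roughly 11 hours of user time saved per second of the day
```

So the headline number is plausible aggregate arithmetic, even if no individual user ever notices the saving.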

The internet marketing community, however, will never be very enthusiastic about Google Instant. SEO consultants, who try to get sites listed at the top of Google’s organic search rankings, and SEMs, who battle for their clients’ sites to be placed near the top of Google’s Adwords Sponsored Listings, have been blogging and tweeting as if Armageddon is here.

The SEO community is paranoid at the best of times, and perhaps with good cause: a small change in the Google algorithm can determine the future of many websites. In this instance, however, the reaction is unnecessary. Essentially the results are the same; the sole change is that you can see potential results for each word as you type it. So if you are typing ‘Italian restaurant’, you will see everything Italian before getting to the restaurant results, and you will still have to include your location unless you are very flexible about your travel arrangements. Long-tail key phrases, in fact, are far from dead.

So is the latest Google scare ‘much ado about nothing’? There is no denying that Google’s original innovation in search transformed how the Internet worked and made the business of finding things considerably quicker and easier. It also created an enormous market – one Google still dominates – that allowed companies to sell us things based on whatever we had entered in that box. And all was well, for a while.

But then something happened. Social networking, social media, whatever you want to call it… suddenly, content was coming right at us, without us even looking for it. We couldn’t escape it. A few hyperactive egotists in each community began curating content and spewing it out to their friends. People were sharing photos, stories and links, so we found ourselves spending less and less time foraging around for things and more and more time sitting back and letting it wash over us.

Fast forward to 2010, and we’re being assaulted by more stuff than we could possibly consume. Facebook, Twitter and email are shoveling pictures and video down our throats ever more quickly. Feedback loops enabled by sharing and retweeting functions mean that each of us has now become an over-sharer as well as an over-consumer. If you are not confused and overloaded with information, you soon will be.

Eleven Step Guide to Understanding SEO – Part 2

posted by Luigi_M_Scollo @ 8:00 AM
Tuesday, August 31, 2010

Eleven Step Guide to Understanding SEO – Part 2

One piece of ‘new media jargon’ that has the vast majority of business leaders confused is SEO (search engine optimization). Too many people have been charged too much for either inappropriate or ineffective SEO services, often because the supplier does not really understand it either. This two-part article is for people who are not experienced or very knowledgeable when it comes to SEO – defining what it is and what it can do.

The introduction and first five steps to SEO heaven discussed how to get a website ready for an SEO campaign. In part two (steps 6 to 11), we’ll explore the continuous and competitive process of earning a high search engine ranking for an SEO-prepared website.

Step 6 to SEO Heaven – Web Analytics

As with any marketing, but particularly for online
marketing, where the tools and results are so effective, it
is essential to measure and track results. Market behaviour
is very predictable. Accordingly, the effectiveness of each
part of your campaign can be compared and optimized. The
options for web analytics vary from free services to very
expensive and customizable packages.

Whichever you choose, don’t put it off. Measure your results
from day one and use them to improve your site and your
marketing campaigns. That old lament, ‘I know half my
advertising does not work; if only I knew which half, I’d
stop spending on it’, is not true on the internet. You can
and must know.
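
The measurement this step describes can be sketched mechanically. Here is a minimal example, assuming a hypothetical list of (source, converted) visit records rather than any particular analytics vendor's API:

```python
from collections import defaultdict

def conversion_by_source(visits):
    """Aggregate the conversion rate per traffic source.

    `visits` is an iterable of (source, converted) pairs,
    e.g. ("organic", True) for an organic visit that converted.
    """
    totals = defaultdict(lambda: [0, 0])  # source -> [visits, conversions]
    for source, converted in visits:
        totals[source][0] += 1
        totals[source][1] += bool(converted)
    return {s: conv / total for s, (total, conv) in totals.items()}
```

With per-channel numbers like these, you really can know which half of your spending is not working, and stop it.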

Step 7 to SEO Heaven – Content Building

Part of being ‘the best and most relevant’ result is having
the freshest content, and the search engines look for that by
visiting your site regularly and reviewing its progress. They
use a formula, not usually a human being, unless they detect
potential fraud.

Actually, good SEO means a website is never done, and the
fact that it has to change and grow over time gives your
customers a better experience. Search engines reward a
‘natural process’ that adjusts to changes in the market
and your normal business growth.

Providing good quality content that is related to what you
do, but not necessarily aimed at selling something directly,
is a powerful, perhaps the best, opportunity to increase the
traffic to your website and the exposure of your business.
Most people do not link to pages that only serve the purpose
of making a sale.

This leads to the next step in this 11 step process of
successful search engine optimization for your website.

Step 8 to SEO Heaven – Link Building

The internet works through links; it would not be a “net”
without them. A collection of independent pages that are not
connected to each other cannot be found and, for the most
part, that defeats their purpose. Inbound links that people
actually see and click are the kind search engines reward
with a higher ranking for your website, which makes them
vital for SEO.

Inbound links play an important role in virtually every
search engine when it comes to ranking pages in their search
results. In the normal course of business links are added,
and sometimes removed, all the time. This never ending
organic process is monitored and measured by the search
engines as an indicator of importance and relevance – so it
is advisable to be pro-active in acquiring good inbound
links. There are plenty of sites out there that should link
to you, but don’t know you and your content. Help them to
find your content and encourage linking to it.

Step 9 to SEO Heaven – Engagement, Trust and
Community Building

Like it or not, social media is a reality that whole sections
of society participate in for hours daily, and it is a
fundamental indicator of relevance and popularity. Don’t
allow your website to exist in an isolated bubble. Talk to
people and allow them to respond and to interact with you.

People will talk about you with or without your permission.
Much better to seize the initiative and become part of the
discussion. Use it to build trust and deeper relationships
with your customers or potential customers. Use it as
research: listen to what they say and learn about their
wants and their needs. Take note of comments,
especially criticisms, and use them to improve. You can save
the money you might have spent on focus groups and get
feedback free of charge on the internet.

In relation to SEO, social media provides a huge opportunity
to expand your link building. For your business, it increases
your brand exposure for a fraction of the cost of
traditional, more intrusive advertising campaigns that are
usually less effective.

Step 10 to SEO Heaven – Ranking and Traffic Analysis

When you begin, or if you have already started, record where
you are today so that you can track progress and compare it
with future data. Look for trends and evaluate progress
towards your goals: the measurable goals you should specify
before you engage in any type of marketing campaign. If you
view improving your SEO ranking as a measure of your business
success rather than merely an essential step towards it, you
will maintain a high ranking for the long term. Why? Because
the high traffic that comes with it will drive your business.

Does the change in ranking yield the traffic you expected?
Does this traffic actually convert? Which leads us neatly
to step 11 of SEO heaven.

Step 11 to SEO Heaven – Conversion Analysis

All of this effort matters not one jot unless you make your
profit number (or the equivalent in not-for-profit
organizations). It all comes down to one critical factor:
what is your bottom line? Did you make a profit or did you
lose money? Web analytics is part of the process of making
this determination. Focus on the things that work for your
bottom line and stop doing the things that don’t. Work on the
details to increase visitor-to-sale conversion. This requires
testing: don’t roll anything out without testing it first.
The things that work for others might not work for you, and
the same is true the other way around.

Many with experience in the SEO game will tell you that
there is another, more important step that would make this
article 12 steps to SEO heaven: only work with people who can
really explain SEO in plain English. To be blunt, small,
independent, one-man-band web designers rarely get SEO fully,
and their usually well-meaning efforts end up costing you
more than they deliver. They get part of the story, but they
fail you by wasting both your money and your time.

Eleven Steps to SEO Heaven – Part 1

posted by Luigi_M_Scollo @ 8:00 AM
Monday, August 30, 2010

Eleven Steps to SEO Heaven – Part 1

Are you fed up with feeling baffled by search engine
optimization (SEO) because of jargon and poor practitioners? Do
you feel you have been charged too much for less than you were
promised? This two part article sets out to explain the process
and put you back in control.

If you have focused objectives and a clear online strategy then
SEO will almost always be a good cost effective addition to the
marketing tool set. The first thing to understand is that search
engine businesses, like Google, Yahoo and Bing, have customers
to satisfy too. Their customers are searching and they expect to
see the ‘best and most relevant’ search results. I expect
that, like me, you get frustrated if your searches bring
irrelevant results first. No surprise, then, that the methods
used by the search engine operators are designed to deliver
customer satisfaction. They work hard to eliminate bogus SEO
services that aim to game their results.

It is possible for you to make your website ‘the best and most
relevant’ for certain searches and to convince the search
engine operator you are just that too. That is SEO. Each of my
11 steps to SEO heaven is necessary. I assume that you will be
committed to a long term marketing strategy, and to measuring
results with a view to adjusting your activity. The steps
include those of preparation as well as those of continuous,
repeated activities. The early preparatory steps are perhaps the
most important as errors here will frustrate the effectiveness
of the later ones.

Armed with our clear objectives and online strategy:

Step 1 to SEO Heaven – Keyword Research

A vital first step that should not be undertaken lightly.
While experienced pay-per-click advertisers know they can
easily test and change hundreds of keywords in paid search
campaigns, this is not possible for organic search
optimization. It is normally advisable to concentrate on one
to five key phrases for the whole site, built around a core
theme, and then only one to three phrases for each individual
page. For large sites with hundreds of pages it is hard to
optimize every single page, and full-scale SEO effort
produces diminishing returns.

Step 2 to SEO Heaven – Competitive Intelligence

SEO is competitive. There is only one front page and only one
top slot so it is important to know your competition and perform
better. What are they doing? Where do they rank and for which
keywords? Who is linking to their website and why? The less
competitive your industry is online the easier it is for you to
outperform your competition. This is an important determining
factor in the cost and resources necessary to achieve your
desired SEO outcome.

Step 3 to SEO Heaven – Web Design and Development

Like trying to cable an old building for modern communications
or boosting performance of an obsolete machine, fixing a bad
website design is much tougher than building properly from
scratch. When you create a new website, make sure to consider
search engine friendly design and architecture before and during
the actual development of the website. Almost all template-based
websites are tough to re-engineer for SEO. A good design from
the start will save you a lot of time and money. In most cases
it will put you ahead of a considerable number of your
competitors. In most cases a high-performing design for SEO
is also a user-friendly design, but occasionally a compromise
is required.

Step 4 to SEO Heaven – Get Your First Inbound Links

There is no need to pay to submit your website to any search
engine. Just as soon as you create inbound links from other
websites to yours the search engines will find your website.

There are plenty of scam products and services. Avoid them. They
are a waste of your money. No one can guarantee you a number 1
ranking. It must be earned and maintained by being the best and
most relevant.

There are some web directories that are recognized by search
engines and gaining a trade listing there will be a helpful
kick-start to your SEO campaign. Then ask your customers and
suppliers to place a link to your website from theirs. Most will
be pleased for the favor to be returned.

Step 5 to SEO Heaven – Sitemaps

The larger search engines allow webmasters to submit a
sitemap via a webmaster console. Through the same console,
they also provide reports and other useful information, such
as technical problems with your website that you might not be
aware of. Even if you decide against submitting a sitemap, it
is advisable to create an account and register your website,
just for the free reports and statistics, which are
invaluable for your internet marketing efforts.
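
The sitemap itself is a small XML file following the sitemaps.org protocol. A minimal sketch of generating one from a list of page URLs (the URLs used are placeholders):

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        # each page gets a <url><loc>…</loc></url> entry
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = page
    return ET.tostring(urlset, encoding="unicode")
```

The resulting file is what you upload or reference in the webmaster console.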

After completing the first 5 steps, schedule them for
occasional review. The remaining tasks require regular and
repetitive effort. In part two (steps 6 to 11) we take an
SEO-ready website with a ready-to-run campaign and look at
the work needed to claim a high search engine ranking.

SEO Tips: Get More Traffic with These 10 Important Inbound Links

Don’t overlook your inbound linking strategy as you think
about search engine optimization for your site. An inbound
link is a hyperlink back to your site from another web site.
The one constant and reliable strategy in search engine
optimization is that sites with a variety of high quality
backlinks rank higher in the search engine results pages.

Why are these links to your site important? They can…

– bring potential customers to your site when they click
on the link

– increase the number of visitors to your web site

– dramatically improve your search engine rankings

Even though there are software packages on the market that
help automate the linking process, use them sparingly, if at
all. The only way to succeed in linking strategies is (aside
from creating useful content that will encourage inbound
links) by manually creating the links. That’s a hard fact
to swallow, given how I like to automate as much of my
marketing as I can.

Here are the 10 most important inbound links you must have
to your site:

1. Directory Links

Directories are indexes of online sites, typically organized
by category. Links back to your site from directories like
the Yahoo Directory are very valuable. Some directories are
edited by human editors, and while listing is free, it may
take a while for your site to appear. Getting listed in
Yahoo’s Directory costs $299/year.

2. Press Releases

If you’re writing press releases, make sure they are optimized
for keywords that someone would use to find a business like
yours and include links back to your site, as well. Once
written, you can have your press release distributed through
a distribution service, which will create links from
high-traffic news sites back to your site.

3. Article Directories

Writing and distributing articles through high traffic article
directories, like EzineArticles.com, is a great way to get
valuable inbound links from a high traffic site. By crafting
an effective resource box at the close of your article, you can
drive traffic back to your site!

4. Social Bookmarking

Similar to web browser bookmarks, social bookmarking sites
store individual pages (bookmarks) online and allow users to
tag (with keywords), organize, search, and manage bookmarks
of web resources, as well as share them with others. If you
bookmark your own content on these sites, you get a link from
the service. When you produce content that your readers love
and bookmark for their friends, those links grow in SEO
value.

5. Blog Comments

To find blog posts on which to comment, you can use
blog-specific search engines like Google Blog Search. Make sure
these are blogs read by your target market, not your colleagues.
Brand yourself by always using the same name and remember to
link back to your site. Always leave a comment that adds to the
conversation that’s happening within the comments.

6. Social Media

Now, Google also indexes your Twitter updates and your
social networking profiles. Add that to Web 2.0 hub sites
like Scribd or HubPages and you’ve got the option of creating
many, many inbound links in a very short period of time.

7. Blog/Podcast Syndication

Submitting the RSS feed of your blog and podcast to syndication
services will give you a link back to your site. In some cases,
each time you publish a new blog post, the post itself will also
get a link.

8. Video Syndication

YouTube is one of the most visited sites online, and the number
of sites that syndicate videos is growing each day. These sites
often allow you to link to your site either in your video’s
description or on your profile page, or both.

9. .EDU and .GOV Links

Search engines place a great deal of credibility in government
and education web sites, and the links carry a great deal of
weight. Frankly, it isn’t easy to get inbound links from
these sites.

10. Internal Links

Remember, if you have more than one web site, or a web site
and a blog, be sure to link one to the other. You can do this
by linking one article to other related articles, or by
linking to categories or archives of information.

Creating a sound inbound linking strategy is a key component
of your search engine optimization efforts. Try a few of the
strategies listed above and see how your traffic and
rankings improve.

18 Effective Search Engine Optimization Techniques

posted by Luigi_M_Scollo @ 8:00 AM
Thursday, August 19, 2010

18 Effective Search Engine Optimization Techniques

Proper Search Engine Optimization, otherwise known as SEO, has
quickly become a popular topic of conversation among website
owners and entrepreneurs. The difference between having a
successful website, and hosting a flop, is often the difference
between whether or not you’ve incorporated proper keywords and
phrases into your webpages.

Learning proper SEO techniques can seem like a daunting task,
especially to those who are not familiar with the concept. The
following list offers 18 simple SEO techniques you should keep
in mind when developing and marketing your website.

1. Make sure your website is initially designed with your search
engine optimization needs in mind. Search engines look for text,
not flashy graphics and cool layouts. The trendiest web designs
will mean nothing if no one is able to find your site.

2. Every page of your website should have a title tag with text
describing either your site or what is on the page. Be sure the
text includes SEO-type keywords instead of the name of your
website. Unless you’re incredibly popular, no one is going to be
looking for you by searching for your name. They’ll most likely
search for a product or service and the keywords you use will
lead them to your site.

3. Consider canonicalization, or whether or not your website
address includes or excludes the www prefix. If you choose to
use the www version of your website, make sure the non-www
version redirects users to the one you use. Use your
preferred version every time you place a link to your site on
the web. Never use both!
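
The redirect behind this tip is one simple rule: any request for the non-preferred host gets a 301 to the preferred one. Here is a sketch of just the decision logic, with www.example.com standing in for your own preferred host:

```python
def canonical_redirect(host, path, preferred="www.example.com"):
    """Return the 301 redirect target for a request, or None if the
    request already uses the preferred (canonical) host."""
    if host == preferred:
        return None  # already canonical, serve the page normally
    return "https://" + preferred + path
```

In practice this rule usually lives in the web server configuration rather than in application code, but the logic is the same.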

4. When designing your website, be sure to avoid too many
drop-down menus, confusing image maps, and excessive images. If
you must use any of these methods, be sure to include plenty of
text links for the search engine spiders to find and identify.
Without links, the search engines will not pick up your site.

5. It does not matter what type of website extension you use
(i.e. .html, .htm, .asp, .php). Search engines do not look at the
web extension and it will not have any impact at all on search
results or ranking.

6. Every page on your website should include a link to your home
page and your sitemap. Make sure every link is the same.
Home page links should go directly to your domain root. Make
sure your internal links do not include an additional
/index.html or /index.php, as it is not needed.

7. Are you sharing a server with other websites? If so, you’ll
want to conduct a blacklist check to make sure you are not
sharing an IP address with a site that has been banned by
search engines in the past. Being on the same server as a
website with a poor reputation may damage your own.

8. You’ll hear the same phrase over and over again: “Content is
king”. It is imperative that your website have fresh, unique, and
quality content that is updated on a regular basis. Be sure
to include your favorite keyword phrase within the body of
the page.

9. People are more likely to input a phrase instead of a single
word when conducting internet searches. If your business has a
physical location, incorporate the name of your city into the
text as well. For example, you might use “our Philadelphia
location” instead of “our location”. Including your city name
will increase the chances of your site being seen in
location-specific searches.

10. If the information on your company website doesn’t change
regularly, or remains static, you might want to consider
starting a blog. Search engine spiders are always looking for
fresh content. Use your blog as an advertising tool and link
back to your website within each and every post.

11. Write naturally. The worst thing you can do is try to cram
a zillion keywords into your article or blog entry, making it
messy and difficult to read. Search engines are able to determine
whether or not your text is logical and they will ignore content
with ridiculously high keyword density.
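
‘Keyword density’ here is simply the share of the page’s words taken up by occurrences of the phrase. A minimal way to compute it, so you can check you haven’t crammed in a zillion keywords:

```python
import re

def keyword_density(text, phrase):
    """Fraction of the words in `text` accounted for by `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    target = phrase.lower().split()
    n = len(target)
    # count occurrences of the phrase as a run of consecutive words
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return hits * n / len(words)
```

There is no magic threshold, but if a short phrase accounts for much more than a few percent of the text, it will read as stuffed to both humans and search engines.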

12. Building links to your website is essential to its success.
As a matter of fact, links are like the queen to complement your
king’s fresh content. Choose a keyword phrase and network with
other websites, asking them to place links on their pages. Don’t
hurt your ranking by having non-related websites place haphazard
links. While it may seem great to gather 100s of backlinks,
you’re better off limiting your links to related websites. Ten
relevant links stand a better chance than 100 irrelevant links.

13. Links within your own website should be built with keyword
phrases as well. Try to avoid using generic anchor text such as,
“click here”.

14. Don’t place a list of links on your website. Always place a
link within at least two to three lines of related content. The
better your description, the more likely it is someone will click
on the link.

15. Don’t limit your keyword or phrase to text links. You should
also incorporate your keywords into your image alt tag and domain
name, whether it is part of the name itself or contained within
the description.

16. Try to avoid using frames, Ajax, and Flash as much as
possible. None of these technologies is search engine
friendly, and all of them will hurt your SEO results.

17. Before your website can be found by the search engine
spiders, it must be indexed. Search engines such as Google have
regular submission forms, but it can take days or weeks for your
form to be processed. Having a highly ranked website place a
link to your site is a sure-fire way to have your site
indexed.

18. No matter what you hear, don’t be overly concerned with the
Google PageRank of your website. A website that is properly
developed and contains good content can outrank a website with
higher PageRank.

Reverse SEO: Restoring Online Reputations

posted by Luigi_M_Scollo @ 8:00 AM
Wednesday, August 18, 2010

Reverse SEO: Restoring Online Reputations

Reverse SEO fits seamlessly within the context of your online
reputation management (ORM) program. It is the quickest, most
effective solution for dealing with bad press that has
surfaced on the search engines about you or your company. By
pushing negative listings off the front page of Google,
Yahoo, and Bing, reverse SEO shields you from the damaging
commentary of others.

Negative publicity online has become one of the most
frustrating challenges for companies. It is typically
anonymous. Names are often unattached to forum threads, blog
posts, and even entire websites. Therefore, it is difficult to
track and address the source of the complaint. Moreover, the
growing popularity of social networking platforms has made
it easier than ever for anyone with a mild grievance to give
weight to their grudge. If you or your company have been the
target of bad press online, it may be time to launch a
reverse SEO campaign.

In this article, we’ll clarify how negative publicity gains
traction within the search engines, and how it can lead to a
public relations nightmare. We’ll also provide a working
blueprint for executing a reverse SEO campaign and
controlling the damage.

Controlling Bad Publicity With Reverse Search Engine Optimization

To appreciate why reverse SEO is effective, you should
understand how negative press takes root within the top
search listings in the first place. Google, Yahoo, and Bing
rank pages based on a large number of criteria. If a website
and its individual pages satisfy the most important of those
criteria, those pages will rank well.

A lot of the bad press that targets companies (possibly even
your own) is placed on websites that meet key ranking
parameters in the search algorithms. That means the negative
publicity can climb into the top positions and gain exposure.
When people search for you or your company, they’ll see the
bad press. That damages your reputation.

Reverse search engine optimization is an ORM strategy that
pushes negative publicity from the top search positions. By
moving the bad press off the first page of listings, reverse
SEO limits its exposure and stifles its impact.

Ingredients For An Effective Reverse SEO Campaign

Like search engine marketing, reverse SEO uses a methodical,
multi-pronged approach to protect your online reputation. The
first step is to identify the sites and pages that contain
negative publicity about your company and that are ranking for
important keywords. Those keywords might include your name,
that of your company, or key employees.

The second step of reverse SEO is to analyze those sites and
pages for their respective ranking authority. That will help
you determine the effort and tools you’ll need to use in
order to move them from the first page of listings within
Google, Yahoo, and Bing. A negative PR blitz that is
spreading across social networking sites is more difficult
to remove than a single blog post on a non-authoritative
site.

The third step is to gather the necessary tools and execute
your reverse SEO campaign. Such tools might include
optimized press releases, a new network of competing sites
and blogs, social media profiles, and a social bookmarking
program. Reverse SEO may also include heavy content
syndication to build high-quality links. A search engine
marketing specialist will have these tools at their
disposal.

Reverse SEO Begins Before Negative Press Emerges

The best time to launch a reverse SEO campaign is before bad
publicity appears in the search engines. This is due to the
way that the pages link. A page will rank well within the
search engines if there are enough thematic links pointing
toward it. However, once it ranks, it will gain exposure.
That exacerbates the problem.

Negative press can spread rapidly as people attach the press
to their own blogs, sites, forums, and social media accounts.
That creates a growing portfolio of links pointing toward the
damaging press, cementing its position in the top listings.
It becomes more difficult to address. By launching a reverse
SEO campaign upfront, you can prevent the negative publicity
from gaining exposure in the first place.

Protect Your Online Reputation With Reverse SEO

Reverse SEO should play a key role in your online reputation
management program. It is far too easy for unsatisfied
customers, resentful employees, lazy journalists, and
malicious competitors to tarnish your name. And when it
happens, it is usually done under the cover of anonymity.
Anonymity makes the complaint or grievance impossible to
address in private.

Launch your reverse SEO campaign now – before trouble
strikes and the damage begins to gain momentum in the search
engines. In a year’s time, you’ll be glad you did.

DIY SEO with a Foolproof Twist

posted by Luigi_M_Scollo @ 8:00 AM
Wednesday, August 11, 2010

DIY SEO with a Foolproof Twist

As the founder of an independent SEO firm I am asked almost
daily “What is the easiest way to optimize a web site?” It
is harder to answer than you might think. It depends on who
is asking and what their knowledge or experience level is.
But last week I read an article written by the person who
taught me search engine optimization ten years ago, and I
now have an answer… Learn the secret recipe, then use it.

There is no magic bullet that works on every web page like
so many SEO ‘miracle’ books and programs would have us
believe. Each page has its own unique ‘secret recipe’ that
has to be discovered and then applied, because every single
page it is trying to outrank is unique. If this seems
confusing, just remember my new favorite quote that I
borrowed from that author’s website: “Think about it.
Anyone could bake Mrs. Fields famous cookies – possibly
better than Mrs. Fields herself – with the secret recipe.
And that’s all SEO is; knowing the secret recipe for any
given web page.”

And it really is that simple. We will look at how to
discover the secret recipe in a moment, but first there is
one crucial thing to understand about search engine
optimization before you go any further:

SEO means making a web page as highly visible as
possible to search engines… for a given keyword.

Just about everyone knows the first part; but too many
people forget the second.

Remember that your soon to be visitor types in very
specific keywords (search terms) on search engines like
Google, because they are looking for something very
specific.

If you know your most important keyword that is great! If
not, do not waste your money on expensive keyword software
that SEO professionals use. Use a free keyword tool such as
Google’s.

Just enter in the keywords that you think people are
searching for to find your product or service and then
check the search numbers for those and related keyword
phrases.

So what about the secret recipe? Once you know your main
keyword or keywords you need to see how well your site is
optimized for that term already. This is called getting
your SEO Quality Score.

What you do is make a list of the most important
optimization aspects of a webpage, for the search engine of
your choice (because Google looks for different things than
Yahoo or Bing).

Here are ten critical factors for Google, but there are
nearly 140 to be aware of…

* Keyword use in document title
* Keyword use in body text
* Link texts of inbound links
* Global link popularity of web site
* Keyword density
* Keyword position and proximity
* Number of words
* Readability level of web page
* Keyword use in H1 headline texts
* HTML validation of web page to W3C standards
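
A few of the on-page factors in this list can be checked mechanically. Here is a naive sketch; it covers only four of the ten factors above, uses crude regex-based HTML handling, and its 250-word threshold is an arbitrary assumption, not a Google rule:

```python
import re

def seo_quality_score(html, keyword):
    """Naive pass/fail check of a handful of on-page factors."""
    kw = keyword.lower()
    text = re.sub(r"<[^>]+>", " ", html).lower()  # crude tag stripping
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    h1 = re.search(r"<h1[^>]*>(.*?)</h1>", html, re.I | re.S)
    words = re.findall(r"[a-z0-9']+", text)
    return {
        "keyword_in_title": bool(title) and kw in title.group(1).lower(),
        "keyword_in_body": kw in text,
        "keyword_in_h1": bool(h1) and kw in h1.group(1).lower(),
        "enough_words": len(words) >= 250,  # arbitrary threshold for the sketch
    }
```

Run the same checks against the pages holding the top ten positions for your keyword and you have the beginnings of the comparison the article describes next.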

Once you have all of this information for your webpage, you
need to get it for the pages currently occupying the top
ten positions for that keyword and then see how your page
compares.

You can buy software that does a lot of this for you but it
generally costs $500 to $2,500 for the ones that really
work and even then you might need to buy an hour of an SEO
consultant’s time to help you really understand it.

Doing it yourself is very educational. You will be an SEO
pro after a couple of these. So if you have the time, I
would not hesitate.

In my case I did not have the time – or let’s be honest,
the patience – to invest what would be 20 plus hours doing
this manually and not really understanding what I was
looking for. At the time I was a lawn sprinkler installer
looking to optimize my company webpage and knew nothing
about SEO, except that it was too expensive for me to hire
someone else to do it. So I went the cheap and easy route,
paying a company called DotCom Pirates $50 for something they
called their SEO To Go package, which reviewed my site for
all 140 SEO factors and provided plain-English instructions
to fix every aspect of it. I did most of them and hit
number two on Google.

Here’s my dirty little secret… Realizing this was much
easier than breaking my back in irrigation, I picked up a
couple local shops as SEO customers, ordered reports for
those sites, followed the directions and my SEO business
was born. I no longer need their help, but it made for some
easy success when I needed it most.

Knowing the secret recipe is the foolproof answer to DIY
SEO. Once you have the secret recipe it’s easier than you
ever thought possible and you’ll be laughing all the way to
the bank with the money you saved.

SEO Tips to Double Rankings, Traffic and Conversion

posted by Luigi_M_Scollo @ 8:00 AM
Wednesday, July 28, 2010

SEO Tips to Double Rankings, Traffic and Conversion

The only thing better than one search result in the top 3
positions in Google is two search results from a double ranking.
This SEO tip works by pushing a competitor off the first
page, broadening your website’s keyword funnel and thereby
doubling traffic and conversions.

Two Results are Better than One

I read somewhere that 87% of search engine traffic for a
given keyword goes to the Number 1 position in the search
engine results page. If you understand SEO, then this
post will share a quick method to double your SERP positions and
to improve the likelihood of keyword conversions – once you have
reached the Mecca for a specific search term.

SEO is predicated on one simple premise: rankings. In order
for SEO to be effective, it must produce rankings on the
first page of search engines.

Not only is this the crowning achievement of search engine
optimization, but once you achieve a top 10 position, then you
can pull other keywords into the spotlight as a result of
strategic linking. We often refer to this as the buddy system
for lateral linking.

Search engine algorithms pay particular attention to individual
pages capable of offsetting all of the other inconsistencies of
a competitor’s web pages and deem one page worthy above all
others for any given search term.

Obviously the metrics are unique for each market, keyword or
niche, but the reality is the same once you hold a top 3
position, or better still the Number 1 (or Numbers 1, 2 and
3) positions, in Google. I have mentioned before that the
fastest way to get a top 10 position in Google is to get a
link from a website already ranking in the top 10 for that
keyword.

It does not matter whether that link is provided from your own
website or another website. Rankings are earned by the page, and
there is a daisy-chain effect when pages are linked together:
the link fuses the pages through a dynamic give-and-take
relationship based on citation. That citation can provide the
algorithmic equivalent of the trust the newly linked page needs
to jump in line past others duking it out for that keyword.

Profitability and return on investment depend on the
competitiveness of the keyword or key phrase and the thresholds
inherent to the barrier to entry: the time required to initiate
a campaign and create all of the necessary content, inbound
links and citations from other web 2.0 properties, RSS feeds and
social bookmarking sites, weighed against the amount of time you
invest managing or outsourcing the various components involved.

With this in mind, from a tactical perspective, it’s better to
leverage the SERP positioning you already have than to look
outside of your own website for off-page ranking factors. If
you understand the power of a Number 1 position, then you can
replicate this next simple SEO tip.

1. Identify all current Number 1 positions in Google for your keywords.

2. Validate that they still exist.

3. Use keyword research to find “related keywords” based on the Number 1 ranking.

4. Link from the page that ranks to a new page (using similar anchor text or overlapping keywords to promote the new page).

5. Let the new page get indexed, then check the SERPs.

Identify: My favorite tool for this is SEMRush, but if you
don’t want to use it, there are other programs out there; even
Google Webmaster Tools can show you your website’s top-ranking
SERP positions when you log in.

Either way, this is your base, so, identify the keywords which
could represent hub status for your SEO campaign and pass along
the power of ranking to other pages in your website.

Validate: Check to see if you still hold the Number 1 position,
even a top 3 will do, but this tactic works better if you are at
the helm of a particular search phrase.

Keyword Research: You should be able to gauge whether or not the
keyword is profitable for you based on the frequency of hits and
the type of traffic you garner as a result. You can always look
through Google Analytics or whichever analytics package you have
to assess the keywords that represent the highest percentage of
traffic to your website.

Once you know what those keywords are, then use keyword research
to find stemmed semantic variations that also fall under the
same category or keyword cluster. Those related keywords will
become the new focal point for step 4 – linking.

Linking: The closer the shingles (groups of keywords), the more
effective this technique is. You can call this padding the
search results (if you use similar exact-match titles, tags or
content), or you can pass this ranking factor along to help
synonymous terms.

Simply go back and edit the page ranking in the Number 1
position and add a link to the new target page (with the
keyword you intend the target page to rank for as the anchor
text). Then, the authority from the page in the Number 1
position will group the new page under its umbrella and pull
that page into the spotlight.
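In markup terms, the edit can be as small as adding one anchor element to the page that already ranks; the URL and anchor text below are invented placeholders:

```html
<!-- On the page currently holding the Number 1 position -->
<p>
  For related services, see our
  <a href="/dallas-cpa-services.html">Dallas CPA services</a>
  page.
</p>
```

The anchor text is the keyword you intend the new target page to rank for.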

When the new page gets crawled and the old page reveals the
connectivity between the two, a double SERP position typically
occurs, or a double position accompanied by jump links,
breadcrumbs or the [+] with additional search results for that
keyword appears in Google to showcase the degree of relevance
your website has for the term.

You can then build additional deep links from other sites or
additional internal links to the newly dubbed page. As a result,
you should see buoyancy for other pages for multiple keyword
variations related to the parent keyword cluster.

With this simple tip you can double your conversions by
increasing your website’s semantic array of keywords. Obviously
you will know which keywords and traffic are most lucrative for
your business model, but this technique is priceless for
creating controlled keyword stemming if you understand the
implications underlying its premise.

The Tricky Issue of Duplicate Content and Google

posted by Luigi_M_Scollo @ 8:00 AM
Friday, April 2, 2010

Being a full-time online marketer means you have to keep a close watch on how Google is ranking pages on the web… one very serious concern is the whole issue of duplicate content. More importantly, how does having duplicate content on your site and on other people’s sites affect your keyword rankings in Google and the other search engines?

Now, recently it seems that Google is much more open about just how it ranks content. I say “seems” because with Google there are years and years of mistrust when it comes to how they treat content and webmasters. Google’s whole “do as I say” attitude leaves a bitter taste in most webmasters’ mouths. So much so, that many have had more than enough of Google’s attitude and ignore what Google and their pundits say altogether.

This is probably very emotionally fulfilling, but is it the right route or attitude to take? Probably not!

Mainly because, regardless of whether you love or hate Google, there’s no denying they are King of online search and you must play by their rules or leave a lot of serious online revenue on the table. Now, for my major keyword content/pages even a loss of just a few places in the rankings can mean I lose hundreds of dollars in daily commissions, so anything affecting my rankings obviously gets my immediate attention.

So the whole tricky issue of duplicate content has caused me some concern, and I have made an ongoing mental note to find out everything I can about it. I am mainly worried about my content being ranked lower because the search engines think it is duplicate content and penalize it.

My situation is compounded by the fact that I am heavily into article marketing – the same articles are featured on hundreds, sometimes thousands, of sites across the web. Naturally, I am worried these articles will dilute or lower my rankings rather than accomplish their intended purpose of getting higher rankings.

I try to vary the anchor text/keyword link in the resource boxes of these articles. I don’t use the same keyword phrase over and over again, as I am nearly 99% positive Google has a “keyword use” quota – repeat the same keyword phrase too often and your highly linked content will be lowered around 50 or 60 places, basically taking it out of the search results. Been there, done that!

I even like submitting unique articles to certain popular sites so only that site has the article, thus eliminating the whole duplicate content issue. This also makes for a great SEO strategy, especially for beginning online marketers: your own site will take some time to get to a PR6 or PR7, but you can place your content and links on high PR7 or PR8 authority sites immediately. This will bring in quality traffic and help your own site get established.

Another way I combat this issue is by using a 301 re-direct so that traffic and pagerank flows to the URL I want ranked. You can also use your Google Webmaster Tool account to show which version of your site you want ranked or featured: with or without the www.
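As a sketch of the first approach, a permanent (301) redirect from the non-www to the www version of a site can be configured in an Apache .htaccess file; example.com is a placeholder domain:

```apache
# .htaccess: send example.com traffic (and its PageRank) to www.example.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 flag marks the redirect as permanent, which is what tells the search engines to consolidate the two URLs.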

The whole reason for doing any of this has to do with PageRank juice – you want to pass along this ranking juice to the appropriate page or content. This can raise your rankings, especially in Google.

Thankfully, there is the relatively new “canonical tag” you can use to tell the search engines this is the page/content you want featured or ranked. Just add this meta link tag to your content which you want ranked or featured, as in the example given below:
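A typical canonical link element, placed in the head section of each duplicate page and pointing at the version you want ranked (the URL is a placeholder):

```html
<link rel="canonical" href="http://www.example.com/preferred-page.html" />
```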

Anyway, this whole duplicate content issue has many faces and sides, so I like going directly to Google for my information. Experience has shown me that Google doesn’t always give you the full monty, but for the most part, you can follow what they say. Over the last year or so, Google seems to have made a major policy change and is telling webmasters a lot more about how it ranks its index.

So if you’re concerned or interested in finding out more about duplicate content and what Google says about it, try these helpful links. The first is a very informative video on the subject entitled “Duplicate Content & Multiple Site Issues,” presented by Greg Grothaus, who works for Google.

Another great link is this page from Google Webmasters Support Answers by Matt Cutts. It has a lot of helpful information, including a video on the Canonical Link Element.

In yet another post, Matt Cutts discusses the related issue of content scraping and advises webmasters not to worry about it. This is a slightly different matter, other webmasters and unmentionables may use software to scrape your site and place your content on their site. This has happened to me, countless times, including when my content has been reduced to scrambled nonsense. Cutts says not to worry about this matter as Google can usually tell the original source of the material. In fact, having links in this duplicate content may just help your rankings in Google.

“There are some people who really hate scrapers and try to crack down on them and try to get every single one deleted or kicked off their web host,” says Cutts. “I tend to be the sort of person who doesn’t really worry about it, because the vast, vast, vast majority of the time, it’s going to be you that comes up, not the scraper. If the guy is scraping and scrapes the content that has a link to you, he’s linking to you, so worst case, it won’t hurt, but in some weird cases, it might actually help a little bit.”

As a full-time online marketer I am not so easily convinced; my main concern is unscrupulous competitors using these scrapings and duplicate content to undermine one’s rankings in Google by triggering some keyword spam filter. Whether this actually happens, only Google knows for sure, but it is just another indication that, despite the very detailed and helpful information given above, duplicate content and the issues surrounding it will still present serious concerns for online marketers and webmasters in the future.

10-plus SEO Questions – Google Rules

posted by Luigi_M_Scollo @ 8:00 AM
Friday, March 12, 2010

This morning I woke up to someone having submitted a pile of SEO questions using our newsletter question form. At first I thought, “Yikes, that’s kind of pushy to think I have time to answer all those questions!” But then I remembered that this was a newsletter week and I still had no idea what I was going to write about. A second look at the questions made me think that you guys would probably be interested in the answers to many of them, so it worked out perfectly.

Most of these questions have been answered in greater detail in various articles that I’ve written, so if you’d like more info on any of them, I’ve linked to the relevant ones for your convenience.

Thanks to Umair R., who submitted these questions.

1. Is there any fixed rule for Google as far as SEO is concerned? If so, what are the steps?

If only! There are no fixed rules because every website is different and has different needs. There are basic things that all websites need to do in order to improve their chances of showing up in Google search results for relevant phrases, but no magic formula.

See “The Art of SEO” article for more on this.

2. Do the following play important roles in website page ranking and positioning?

• PR

Yes, real PageRank (PR), the kind that only Google knows about, plays a very large part in websites showing up (or not) for search queries that are relevant to them. But toolbar PageRank is another matter entirely. What you see there doesn’t correlate very well with where your page will show up in the search results.

See: “Getting Into Google” (Scroll down to the “Google Still Loves Its PageRank” part.)

• The number of incoming links

Not so much in and of itself. Real PR, as mentioned above, is calculated not only on the number of links, but also on the quality of those links. A handful of links from authoritative, trustworthy, relevant pages should far outweigh hundreds of links from so-so sites.

See the High Rankings Link Building Forum.

• Keyword density

Not in that there’s some special percentage that you need to aim for. Certainly it’s helpful to have the keyword phrases that you’d like to show up being used within the content of your page. But that’s just common sense, if you ask me. Surely, if your page is about a certain something (your keyword phrase), how could that phrase NOT be on the page?

See the various threads on keyword density on the High Rankings Forum.

• Page response time

This is important only because if it takes too long to load, it might not be properly (or completely) indexed.

• Bounce rate

It’s doubtful that this matters, because there’s no way for Google to know the bounce rate of every site. And it wouldn’t be fair for them to only count the bounce rates of those sites that have Google Analytics installed, so my guess is that this is not a factor.

See various High Rankings forum threads.

• Time on site

Like the above answer, they don’t know this number unless the site has Google Analytics installed. That said, they may sometimes incorporate the old trick of seeing if a searcher clicks to another site in the search results after clicking one result, and how long it took them to click another. In other words, if they find that lots of people who clicked to one site in the search engine results pages (SERPs) always end up back at Google to try another site, then perhaps that first site wasn’t a great answer to the search query after all.

• Domain age / Page age

From what I can tell, this can often be a factor. But it doesn’t seem to be as prominent a factor as it was a few years ago.

3. Is there any special technique for content writing?

There’s no special technique, but I highly suggest hiring a professional marketing copywriter. You will see a positive return on your investment very quickly if you do. In addition, the tried and true SEO copyediting techniques in my “Nitty-gritty of Writing for Search Engines” may come in handy if you’re not sure how to integrate your keyword phrases into your professionally written content.

4. Should we cater to code-to-text ratio while developing websites?

There’s not one shred of evidence that this would have an effect on where a page would show up in the search results for a relevant search query.

5. If active scripting is a must for webpage development, how harmful can it be for PageRank and positions?

It’s typically not harmful at all because it’s usually done before a browser (or search engine spider) sees a page. To users and search engines, your dynamically generated pages are just static HTML by the time they get to them. Still, not all dynamically generated pages are created equal. There are some ways of developing your site that are less search friendly than others. For example, some JavaScript menus, some AJAX, etc.
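For example, a menu item whose destination URL exists only inside JavaScript is harder for a search engine spider to follow than a plain anchor; the markup below is an invented illustration:

```html
<!-- Less search friendly: the URL is buried in a script handler -->
<span onclick="window.location='/services.html'">Our Services</span>

<!-- Search friendly: a plain anchor the spider can follow -->
<a href="/services.html">Our Services</a>
```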

See “Diagnosing the SEO Health of Your Website”.

6. If a webpage is ranking top for a specific keyword and we make textual changes to that webpage, is there any chance that we lose the rankings?

Any changes you make to a page’s content can affect how relevant the search engines believe it to be for any particular search query. That doesn’t mean it definitely will change the search results, but it could. The only way to know is to try it and see. Usually, if you’re rewriting your page to be more useful to your site visitors and you don’t remove all the instances of the keyword phrase, you should be fine. Because nothing is permanent with SEO, if you don’t like what you see you can tweak it until you do.

Google’s SEO Report Card… Information Nuggets or Fool’s Gold?

While the SEO Report Card is ostensibly aimed at helping Google target potential weaknesses in its own product pages, and is of no direct use to SEOs, there is nonetheless more than a little gold to be found here if one just examines the document in a little more depth. So while the post at Google’s Webmaster Central Blog is already beginning to bristle with comments lamenting the fact that this isn’t a clear treasure map to the search-ranking mother lode, it’s worth sifting through the Report Card to see what informational nuggets are hidden inside.

Subject I: Search Result Presentation

It’s easy to see why some readers simply dismissed this document out of hand, as the first section starts off being little more than a rehash of the standard “Use Page Titles, Use Meta Descriptions” advice found in any SEO-101 manual. Only by persevering to the part about Google Sitelink triggering does one begin to suspect that there may be a little more to the report card than meets the eye. Here the authors throw out a couple of crumbs about categorizing a website and its link structure, and consolidating a site’s URLs to maximize its informational focus, with the aim of increasing the chances of Google generating Sitelinks.

Even so, it’s nothing most professionals haven’t heard before, and I suspect that by this time a lot of readers had given up, thinking that nothing interesting was in store.

Subject II: URLs and Redirects

This is where we see a little glitter among the rubble, as the section starts off with the statement that: “Google products’ URLs take many different forms. Most larger products use a subdomain, while smaller ones usually use a directory form…”

In itself this is not an exceptional statement, and the chapter continues to give handy, but hardly unique, information about canonicalization, URL structure, and redirects until Page 10, where we find the following declaration:

“Subdomains require an extra DNS lookup, slightly affecting latency, which is very important at Google.”

Page load-speeds are an important factor to Google. There’s been talk and speculation about this ever since Matt Cutts dropped the first hints last year, and these days most SEOs are busily proclaiming that slow websites are now a handicap.

Haven’t they always been?

Be that as it may, this fact is not common knowledge with the average webmaster, as demonstrated by a question I’m regularly confronted with over at the Google Webmaster Help Forum:

“Which is a better way to categorize my site, subdomains or folders?”

The standard answer to this question used to be “Whichever you prefer” before load-times became an issue. Now, however, we find a clear indicator that a folder-based approach is much-preferable unless a category actually contains enough information to merit its own site, which is effectively what a subdomain turns it into.

Subject III: On-Page Optimizations

While at first glance this chapter is more standard SEO-101 fodder, it’s where we find a sizable nugget, as the report talks about semantic markup, and how Google uses it to gauge a page’s content.

“Nothing new here; we all use H1 tags,” you might say, but you’d only be partially right, because this issue not only runs much deeper than H1 headings, it runs beyond heading tags altogether, as I’ll explain shortly. For the moment, however, let’s stay with them.

In the past few years, a great many Optimizers have reached the conclusion that only H1, and, to a degree, H2 are of any promotional value, and that lesser headings (H3 – H6) carry practically no weight at all. But let’s take a look at the following statement, taken from Page 38 of the Report:

“Most product main pages have an opportunity to use one <h1> tag, like the example above, but they’re currently only using other heading tags (<h3> in this case) or larger font styling. While styling your text so it appears larger might achieve the same visual presentation, it does not provide the same semantic meaning to the search engine that an <h1> tag does.”

For starters it’s obvious that the lesser headings are alive and well, and being used by Google. We’re also told that Google does not, or cannot, judge the visual-context meaning of CSS styled text. The conclusion is to use more heading tags instead of CSS styles wherever your content calls for it. However, there’s more to it still. Let’s take another look at part of that statement:

“…but they’re currently only using other heading tags…”

It would appear that Google still places greater value on other semantic markup tags (em, strong, blockquote, etc.) than many professionals give them credit for these days. Otherwise, why would the author specifically note the fact that Google only uses headings and font styles?

I personally know quite a few professionals who have long since abandoned most semantic markup tags in favour of CSS styles, since the prevailing attitude of designers and SEOs has been that making text bold or italic no longer carries much promotional weight, following widespread abuses in the mid-2000s and Google’s consequent algorithm updates.

And although the above statement may be a tentative one, it might just point the way back to a more HTML-based approach to web design. Indeed, if it can be taken at face-value, it’s entirely possible that those SEOs and designers advocating CSS-based, table-less design as the way forward are barking up the wrong tree. Whatever the case may be, there is undoubtedly more to the SEO Report Card than first meets the eye, and at the very least, there is a little gold to be extracted from the mass of standard information. Only by reading the full document will you be able to make an assessment yourself.
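The headings-versus-styling point made above can be illustrated like this; the page title and class name are invented:

```html
<!-- Semantic: tells the search engine this is the page's main heading -->
<h1>Product Overview</h1>

<!-- Visual only: may look identical to a reader, but carries no
     heading semantics for the search engine -->
<span class="big-heading" style="font-size: 2em; font-weight: bold;">Product Overview</span>
```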

What should also be remembered is that the SEO Report Card is not aimed at high-flying SEOs or E-lebrity industry pundits, but at the intermediate webmaster, for whom even the report’s basic information is of immense value if read alongside Google’s SEO Starter Guide.

Discover the Answers to the Top 10 SEO Questions

posted by Luigi_M_Scollo @ 8:00 AM
Friday, February 26, 2010

Any type of online business will strongly benefit from a few SEO techniques. However, everyone and their brother has advice on how to do it. All this ‘expert’ advice can make the simple task of optimizing your site incredibly confusing. Here are some straightforward answers to the most common SEO questions.

1. What is SEO?

SEO stands for search engine optimization. A search engine is a tool many internet users use to find sites that are relevant to their needs. The three biggies when it comes to search engines are Google, Yahoo and MSN. There are, however, hundreds of search engines available to internet users. Search engines work by sending out spiders to crawl the World Wide Web and gather information. If you have the information they’re looking for, in the places they are looking, they’ll find you and place you in their results when a person is looking for your information.

The task of understanding what search engines are looking for, and putting it in the right places on your website and in your content, is the essence of search engine optimization. So now you might be asking: what do search engines look for, and where do they look for it? The answer is keywords and links: keywords in your HTML code, keywords in your page content, and the number of incoming links to your website.
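A simplified page skeleton showing those spots; the keyword phrase “organic dog food” and all content here are invented for illustration:

```html
<html>
<head>
  <!-- Keywords in the HTML code: title and meta tags -->
  <title>Organic Dog Food - Healthy Meals for Your Pet</title>
  <meta name="description" content="Shop organic dog food made from natural ingredients." />
</head>
<body>
  <!-- Keywords in the page content: headings and body text -->
  <h1>Organic Dog Food</h1>
  <p>Our organic dog food is made in small batches from natural ingredients...</p>
</body>
</html>
```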

2. How Important is SEO?

Let’s just put it this way. What’s better, a few visitors who stumble upon your website or hundreds of visitors that go to your website with the direct intention of learning more or making a purchase?

With more and more people searching and shopping online, getting on the first page or two of the search engine results can mean the difference between keeping your day job and becoming an internet millionaire.

3. What are Text Links?

Links are just one of the tools you can use to improve your search engine optimization. The more quality links you have, the better your search engine ranking will be. Text links are links that contain only text. Wikipedia is a great place to examine internal text links. The links are contained within a sentence and when a reader clicks on them they are taken to a different page on the same website. The kind of text links you’re looking for will be text links that will take readers from your article, ebook, or web copy to your website.

An excellent way to generate incoming links is to write copy for online outlets like article directories, blogs, and ezines, and insert text links in the copy. Webmasters will link to the content and thus to your site. Additionally, when you allow free reprints of your copy, provided the links are maintained, you’re encouraging links to your website.

4. What are Link Farms and Link Exchanges?

Search engines don’t accept just any old link. The link has to be from a relevant, quality site, which means you don’t want to participate in link farming. If a search engine suspects your links to be lacking, it will actually penalize you. Link farming or link exchanging is essentially the process of exchanging reciprocal links with websites in order to improve your search engine ranking. A link farm is a web page that is nothing more than a page of links to other sites. Stay away from link farms. When you generate a link from another site, it had better be relevant and come from a real website.

5. What is Duplicate Content?

The definition of duplicate content is web pages that contain substantially the same content. Search engines will penalize you for this. How do you avoid duplicate content? Don’t publish the same article in several locations. There are many tools available online to help you re-write your content so that it is 30%, 40%, and even 50% different. However, the best way to avoid duplicate content is to simply write new content.

6. How do I Find the Right Keywords?

There are several steps to finding the most profitable keywords. The first step is to generally do a bit of brainstorming and come up with a list of keywords you think people will use to find your products. The next step is to research supply and demand for those particular keywords. Supply means how many other websites are using those same keywords and demand is how many people are looking for those particular keywords.

The key is to find keywords with high demand and relatively low supply. There are many effective and useful keyword tools to help you find this information and to generate keyword ideas. Once you decide on a few keywords, it may be useful to do a bit of testing before you commit to them.

7. How do I Optimize My Web Pages?

Placing your keywords in the right location is a good start to optimizing your web pages. Search engines look to the headings, subheadings, domain name, and title of your website. They also look in the content on your page and primarily focus on the first paragraph.

Try to get a domain name with your primary keyword included. When you include your keyword in your URL it tells the search engine spiders immediately what your site is about.

Title Tag. Your title tag is the line of text that appears on search engine results pages that acts as a link to your site. This is a crucial element of your webpage as it describes to your visitors what your page is about.

If you view your source code, your title tag will look something like this: <title>Search Engine Optimization Tips</title>

Keeping your title tags brief, descriptive, up to date, and keyword rich will help to improve the relevance of your site in the eyes of the search engines, as well as giving your potential visitors a good idea of what they can expect from your site.

Meta tags have lost their importance to the search engines; however, it is still helpful to place your keywords in them. In your source code they look something like this:
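A representative pair of meta tags; the keyword and description values are placeholders:

```html
<meta name="keywords" content="search engine optimization, SEO tips, keyword research" />
<meta name="description" content="Straightforward answers to the most common SEO questions." />
```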

8. Do I Need to Submit My Site to the Search Engines?

The simple answer is – no. Search engine spiders are always out there doing their job and collecting information. Every time you update your website, add content, or change your keywords, the search engines capture the information and record it. However, if you want to be listed in a directory, like the DMOZ Open Directory Project, then you will need to submit to those.

9. What are Spiders?

Search engine spiders are also called web crawlers or bots. They’re basically automated programs that scan websites to provide information to search engines, often for the purpose of indexing or ranking them.

10. How does Content Help My SEO?

Content is one of the best tools to improve your search engine ranking. It is a great place to emphasize keywords, encourage linking to your site, and boost traffic. The key to content is to make sure you’re offering quality content and you’re updating your website and your content frequently. Content can be provided in many forms including:

• Blogs

• Forums and chat rooms

• Articles

• Reviews

• Case studies

• Reports

• How to guides

• Tutorials

• e-books and much more.

Search Engine Optimization – Title Tags Revisited

posted by Luigi_M_Scollo @ 8:00 AM
Wednesday, February 24, 2010

What Is a Title Tag?

The title tag has been – and probably will always be – one of the most important factors in achieving high search engine rankings.

In fact, fixing just the title tags of your pages can often generate quick and appreciable differences to your rankings. And because the words in the title tag are what appear in the clickable link on the search engine results page (SERP), changing them may result in more clickthroughs.

Search Engines and Title Tags

Title tags are definitely one of the “big three” in terms of the algorithmic weight given to them by search engines; they are as important as your visible text copy and the links pointing to your pages – perhaps even more so. Yet, even though this has been common knowledge among SEO professionals for at least 10 years, it is often overlooked by webmasters and others attempting to optimize their websites for targeted search engine traffic.

Do Company Names Belong in the Title Tag?

The answer is a resounding YES! I’ve found that it’s fine to place your company name in the title, and (gasp!) even to place it at the beginning of the tag! In fact, if your company is already a well-known brand, I’d say it’s essential. Even if you’re not a well-known brand yet, chances are you’d like to be, right? The title tag gives you a great chance to further this cause.

This doesn’t mean that you should put *just* your company name in the title tag. Even the best-known brands will benefit from a few good descriptive phrases added, because they will enhance your brand as well as your search engine traffic. The people who already know your company and seek it out by name will be able to find you in the engines, and so will those who haven’t heard of you but seek the products or services you sell.

Title Tags Should Contain Specific Keyword Phrases

For example, if your company is “Johnson and Smith Inc.,” a tax accounting firm in Texas, you would want your company’s site to appear in the search engine results for searches on phrases such as “Texas tax accountants” and “CPAs in Texas.” (Be sure to do your keyword research to find the best phrases!) If you prefer to work with people only in the Dallas area, you’d need to be even more specific by adding geographical modifiers to your title tags, such as “Dallas tax accountants.”

Using our Dallas accountant example, you might create a title tag like this one:

Johnson and Smith Tax Accountants in Dallas

or you might try:

Johnson and Smith – Dallas CPAs

However, there’s more than enough space in the title tag to include both of these important keyword phrases. I find that using 10 to 12 words in my title tags works great.

One way to include two keyphrases would be like this:

Johnson and Smith – Dallas Tax Accountants – CPAs in Dallas, TX
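In the page source, that title would sit in the head section like this:

```html
<head>
  <title>Johnson and Smith – Dallas Tax Accountants – CPAs in Dallas, TX</title>
</head>
```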

I’ve always liked the method of separating phrases with a hyphen; however, in today’s competitive marketplace, how your listing appears in the SERPs is a crucial aspect of your SEO campaign. After all, if you have high search engine rankings but your targeted buyers aren’t clicking through, it won’t do you much good.

The idea is to write compelling titles as opposed to simply factual ones, when you can. But it also depends on the page, the type of business, the targeted keyword phrases, and many other factors. There’s nothing wrong with the title tag in my above example. If you were looking for a tax accountant in Dallas and saw that listing at Google, you’d probably click it. (Note: Don’t worry if some of your visible title tag info gets cut off when the search engines display your page’s info; they are still indexing all the words contained within it.)

Still, you could make it a readable sentence like this:

Johnson and Smith are Tax Accountants and CPAs in Dallas, TX

I’m not as thrilled with that one. I had to drop the exact phrase “Dallas Tax Accountants” because it wouldn’t read as well if it said:

Johnson and Smith are Dallas Tax Accountants and CPAs in Dallas, TX

It sounds redundant that way, as if it were written only for the search engines.

In the end, it’s really a personal preference.

Don’t make yourself crazy trying to create the perfect title tag, because there’s just no such thing. Most likely, either of my examples would work fine. The best thing to do is to test different ones and see which bring the most traffic to your website. You might very well find that the second version doesn’t rank as well, but gets clicked on more, effectively making up the difference.
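
If you’d like to automate that testing loop a little, a quick script can sanity-check candidate titles before they go live. This is a minimal sketch in Python; the 12-word cap and the phrase list come from the guidance above and are illustrative assumptions, not a search engine rule.

```python
def check_title(title, keyphrases, max_words=12):
    """Flag a candidate title tag that is too long or missing a
    target keyphrase. The 12-word cap mirrors the article's rule
    of thumb; it is not an official search engine limit."""
    warnings = []
    words = title.split()
    if len(words) > max_words:
        warnings.append(f"Title is {len(words)} words; aim for {max_words} or fewer.")
    lowered = title.lower()
    for phrase in keyphrases:
        if phrase.lower() not in lowered:
            warnings.append(f"Missing keyphrase: {phrase!r}")
    return warnings

# The combined example passes cleanly; a bare title gets flagged.
ok = check_title("Johnson and Smith - Dallas Tax Accountants - CPAs in Dallas, TX",
                 ["Dallas Tax Accountants", "CPAs in Dallas"])
bad = check_title("Home", ["Dallas Tax Accountants"])
```

A check like this won’t tell you which title gets more clicks, of course; it only catches the mechanical mistakes before your traffic test begins.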

Use Your Visible Text Copy as Your Guide

I prefer to create my title tags *after* the copy on the page has been written and optimized. I need to see how the copywriter integrated the keyword phrases into the content to know where to begin. If you’ve done a good job with your writing (or better yet, hired a professional SEO copywriter), you should find all the information you need right there on your page. Simply choose the most relevant keyword phrases that the copy was based on, and write a compelling title tag accordingly. If you can’t seem to get a handle on the most important phrases for any given page, you probably need to rewrite the page content.

I recommend that you *don’t* use an exact sentence pulled from your copy as your title tag. And don’t use the exact wording that’s in your top headline. It’s much better to have a unique sentence or a compelling string of words in your title tag.

You’ll want to watch out for certain website content management systems (CMSes) and blog software that automatically generate the title tag from information you provided elsewhere. Some, in fact, default to the same exact title tag on every page, which is the best way to kill your search engine leads! The good news is that most of today’s CMSes and blog platforms have workarounds so that you can customize your title tags fairly easily. If yours doesn’t, or your developer claims they can’t do this, then you’ll want to find a new developer or CMS as soon as possible!
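
A quick way to catch the duplicate-title problem described above is to collect every page’s title and flag any that repeat. A minimal sketch; the URL-to-title mapping here is hypothetical sample data:

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Given a mapping of URL -> title tag, return only the titles
    shared by more than one page (the duplicate-title trap above)."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title.strip()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Hypothetical site: two pages sharing one title get reported.
pages = {"/": "Acme Widgets", "/about": "Acme Widgets", "/contact": "Contact Acme"}
dupes = find_duplicate_titles(pages)
```

In practice you’d feed this from a crawl of your own site; an empty result means every page carries a unique title.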

The Three SEO Factors That Really Matter

posted by Luigi_M_Scollo @ 8:00 AM
Tuesday, February 23, 2010

The Three SEO Factors That Really Matter

Search for a list of SEO factors and you’ll find that most feature at least 50.

That’s 50+ elements of your website that influence your ability to rank in search engines. Sounds complicated, doesn’t it?

Some SEO Consultants will tell you that ranking in search engines is about applying a precise formula to these 50+ elements – about using “special proprietary techniques” fine-tuned to search algorithms to boost your website above the competition.

Not exactly.

There are actually more like 200+ signals that search engines use when ranking websites.

Imagine trying to reverse-engineer something like that. Sounds impossible, right?

That’s because it is.

The good news: it doesn’t matter.

You don’t need to be a computer engineer to rank well in search engines. A relief, isn’t it?

The truth is that everything boils down to three factors:

1. Search-Friendly Pages
2. Relevant Content
3. A Trusted Website

All of those other factors and elements of SEO? They all fit into one of these three basic categories.

You don’t need to be a search scientist to understand the basics of what’s going on with these three factors and improve them for your website.

1) Search-Friendly Pages
Essentially, this first factor has to do with the technical aspects of how your website and pages work.

Search engines use crawlers (or “bots”) to browse the web by following links. As they browse, these crawlers scan the content they see and store it in databases. These databases form the search engine’s web index – and when a user comes along and enters a search phrase the index is scanned for pages that match.

The basic idea: you want to make sure your pages, and the content that fills them, are visible to search engine crawlers.

There are a few things you should know about crawlers:

• They don’t support JavaScript – so that rollover menu, those drop-down links, etc., might not be visible to search engine crawlers.

• They don’t support Flash (mostly) – while there have been a few developments in this regard recently, Flash websites still aren’t too search engine friendly.

• They can’t “see” – sometimes designers use images instead of HTML text (usually because they want to use a certain font that isn’t web-safe), and search engine crawlers can’t read or index this text. Crawlers can only read code – and if your content isn’t found there it’s essentially invisible to search engines.

• They skimp on resources – it takes a lot of energy, time and money to crawl the web (there are a lot of pages out there), so crawlers are usually programmed to be conservative with how far they’ll dive into a page. If your web pages take a long time to load or feature a tremendous amount of content, crawlers might leave without scanning/indexing everything.

There are some other things crawlers can’t/won’t do. To get a sense of what they can see on your website, try a search engine spider simulator. These tools allow you to enter the address of a web page and see it as search crawlers see it.

The bottom line: you might have the best content in the world, but if crawlers can’t see it you won’t rank for relevant keywords.
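
One way to see your pages the way a crawler might is to strip them down to plain text and links. The sketch below uses Python’s standard html.parser on a static snippet; it is a rough approximation under the assumptions above, and notably, anything a page builds with JavaScript never appears in the raw HTML it reads.

```python
from html.parser import HTMLParser

class CrawlerView(HTMLParser):
    """Collect the visible text and outgoing links a simple crawler
    would extract from raw HTML (script/style content is skipped)."""
    def __init__(self):
        super().__init__()
        self.text, self.links, self._skip = [], [], 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text.append(data.strip())

html = '<h1>Tax Tips</h1><script>var x=1;</script><a href="/cpa">Our CPAs</a>'
view = CrawlerView()
view.feed(html)
# view.text holds the readable words; view.links the crawlable paths.
```

If important content on your page doesn’t show up in an extraction like this, assume the crawlers are missing it too.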

2) Relevant Content
This factor is all about the words on your pages.

As we discussed above, the visible content on your pages is stored and searched every time someone uses a search engine. If the keyword or phrase entered doesn’t occur on your page you probably won’t show up.

There are a few key places where you’ll want to use the right language on your pages:

• Title tags
• Headlines
• Body copy
• Anchor text (links pointing to internal pages)

As you browse the web you’ll probably notice that lots of webmasters have gotten a bit, shall we say, “overzealous” with optimizing their content. Title tags stuffed to the brim with dozens of keyword variations are common. Sometimes even the body copy itself is stuffed with keywords in an attempt to boost rankings.

You might be tempted to do this yourself to try and enhance your chances of ranking for a given keyword.

Don’t do it. Please.

Why not? Try reading a page that’s been stuffed with keywords this way. It’s an awful experience, right? Certainly enough to stop your reading flow and send you to another website, isn’t it?

Don’t sacrifice your users’ reading experience for the sake of ranking for a given keyword. It’s not worth it. All of the traffic in the world won’t mean a thing if the users who land on your pages are turned off and leave. Your competitors are just a few painless clicks away.

To learn about what keywords people use when they search for your products/services/info, try Google’s AdWords Keyword Tool – enter either your website address or a keyword and the tool will return a list of related keywords, including numbers on how many people search for them.

The bottom line: it’s rare to rank for a keyword that doesn’t occur on your pages so use the language your users do when they search. Don’t overdo it and stuff keywords, though, because you’ll annoy your visitors (and search engines don’t like it either – they might flag you as SPAM).
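
As a rough self-audit of the key on-page spots listed above, you can check whether a target phrase actually appears in each. The page structure below is a hypothetical stand-in for your own data, not a real API:

```python
def keyword_coverage(page, phrase):
    """Report which key on-page slots contain `phrase`, matched
    case-insensitively. `page` is a plain dict with 'title',
    'headline' and 'body' strings plus an 'anchors' list."""
    p = phrase.lower()
    return {
        "title": p in page.get("title", "").lower(),
        "headline": p in page.get("headline", "").lower(),
        "body": p in page.get("body", "").lower(),
        "anchors": any(p in a.lower() for a in page.get("anchors", [])),
    }

page = {
    "title": "Dallas Tax Accountants - Johnson and Smith",
    "headline": "Tax Help in Dallas",
    "body": "Our Dallas tax accountants handle returns for individuals.",
    "anchors": ["Meet our CPAs"],
}
coverage = keyword_coverage(page, "Dallas tax accountants")
```

A slot reporting False is a prompt to reconsider the wording there, not a command to stuff the phrase in; the balance point made above still applies.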

3) A Trusted Website
When you’ve got 1) search-friendly pages and 2) relevant content it’s still not time to sit back and let the search traffic pour in.

The truth is that most of your competitors will have looked into these factors already – they’re kind of the “low-hanging fruit” of SEO, because they’re not usually terribly difficult to work out.

Trust is what sets you apart. It is by far the most important of the three factors.

Before Google came onto the scene using PageRank (a measurement of link popularity) to rank websites, search engines generally based their rankings on the first two factors we’ve discussed.

What was the problem with that approach?

Webmasters are greedy. We can’t help ourselves. We love traffic.

Keyword stuffing was rampant, and rarely did webmasters stick to the honest truth about what their website was relevant to. The result: search results littered with SPAM and pages with very little relevance.

The reason links were a better signal for Google was simple – they’re harder to game. While you can control the content/keywords on your own website, it’s a lot harder to control them on someone else’s. It’s pretty tough to get someone to link to you against their will.

The model simply worked – Google’s results were better. The other search engines quickly caught on and looked to signals of trust for sorting through the SPAM.

Some signals that search engines use to determine whether they can trust your website:

• Inbound links – quality is more important than quantity here – that’s why those “500 directory links for $49.95” deals are worthless. The easiest links to get are the least valuable/powerful. A single link from a highly trusted, authoritative site, for example, would outweigh tens of thousands of weaker links – that’s how much quality matters.

• Website age – if your website is new there’s not much you can do about it without a DeLorean and a working flux capacitor (“Marty, the website is in place – now we gotta go back to the future!”). A website that’s been around for a while is simply more trusted by search engines.

• Who you link to – it’s not just about inbound links. Search engines also look at what websites you link to from your pages. If you’re linking out to SPAMMY websites, they might consider you part of that “bad neighborhood” and penalize your website. Be careful who you vouch for.

There are other signals involved, but if you’ve got these three trust factors working in your favor you’re very likely to dominate the competition.

The bottom line: search engines don’t like getting burned by ranking SPAMMY websites. They want to know they can trust your website. Once you’ve got your on-page factors right (#1 and #2 above), you’ll need to build trust signals before your website will rank competitively.

Simple And Successful SEO Strategies – On Page Optimization

SEO doesn’t have to be complex and by following these simple on-page optimization techniques you can give your SEO campaign the perfect start.

SEO is often seen as being a difficult and in-depth process, but the reality is that by following some reasonably common sense guidelines it is possible to get good rankings. That’s not to say that optimization is a simple or quick process; there are, unfortunately, no short cuts. Your SEO efforts should be a concerted and long term endeavour, in order for you to enjoy the best possible results, and should incorporate both on-page and off-page optimization techniques. By following the on-page SEO strategies below you can set a strong foundation for all your SEO work.

Keyword Research

Before you begin penning content and writing title and meta tags you first need to research the keywords you will use on each of your pages. Using the wrong keywords can negatively impact your entire campaign, causing you to lose untold hours and days of work and eventually forcing you to concede that you made the wrong decision and start all over again.

The most appropriate and most beneficial keywords are popular enough that they will enjoy regular searches but without being prohibitively competitive or overly generic. A number of keyword research tools exist and your competitors’ websites are a good place to start your early research. Ensure keywords are targeted specifically to the type of content you will provide as well as the service or product you will be selling. More targeted keywords will result in more targeted visitors and targeted visitors mean greater conversion rates and an improved return on your efforts.

Niche And Semantically Related Keywords

A good strategy is to incorporate a reasonable list of competitive keywords with less competitive ones. The more niche keywords will serve you well during the early days of your website and over time you should be able to start competing for the more challenging of the keywords you use. Also incorporate semantically or topically related keywords into your keyword list because the search engines are placing more and more emphasis on those pages that use related keywords as well as primary keywords.

Accessibility And Standards

Site accessibility is an integral part of good website design, but it should also be considered an important factor in any SEO strategy. Using standards-based code for your website will help to ensure that anybody who wishes to access and view your website will be able to do so. It will also mean that the spiders used by search engines will be able to access and index your pages effectively, ensuring that you get full credit for your site.

Navigation And Intra-Linking

Your navigation menu and internal links should be prominently placed, easy to see, and easy for the spiders to follow. It is good practice to include a text link from the home page to a compliant sitemap on your site, alleviating any potential problems that might arise from broken links or the use of graphical or Flash-based navigation menus. You can also add links into the main body of your content, although too many will make the page difficult to read and diminish their overall effectiveness, so don’t get too carried away.

Title And Meta Tags

While search engines do not use meta tags to assess the value of a page the way they once did, meta tags are still critical to good SEO performance. The title and description tags that you add at the top of a page are used in various ways, including in the compiling and display of Search Engine Result Pages (SERPs). This is the first thing a potential site visitor will see from your site, so this mini listing needs to be as effective as any paid advert or PPC ad. Poorly written titles and descriptions can put many readers off viewing your pages, so a little time and effort here can have a very positive effect.

Using your keywords in the title and the description is good practice because they will be highlighted in the search results if they appear in the search query itself. This will make your result more prominent and instantly identify your page as relevant to the user. Don’t needlessly repeat keywords, however, and don’t throw extra keywords into the description at the cost of a well-written, short ad.

Other Formatting Tags

On-page content should always be written with the visitor in mind, although obviously it can still be optimized for search engines. As such, proper page structure is important to your reader as well as to the engines. H1 and H2 tags are an effective way of breaking up page content, and give readers the chance to skim through a page and determine its relevance.

A page should only contain a single H1 tag at the top of the content but can include multiple H2 and H3 tags. Alt tags on images should also be included, and these, as well as the image’s file path, can include important keywords (but do make sure that they actually make sense and are more than just a keyword thrown in for the sake of SEO).
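
The structural rules above (a single H1 at the top, alt text on images) are easy to verify mechanically. A minimal sketch using Python’s standard html.parser; real audit tools check far more than this:

```python
from html.parser import HTMLParser

class StructureCheck(HTMLParser):
    """Count H1 tags and images lacking alt text in raw HTML."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1
        elif tag == "img" and not dict(attrs).get("alt"):
            self.images_missing_alt += 1

check = StructureCheck()
check.feed('<h1>Guide</h1><h2>Basics</h2><img src="chart.png">'
           '<img src="logo.png" alt="Company logo">')
# You want h1_count to be exactly 1 and images_missing_alt to be 0;
# the sample snippet above deliberately leaves one image without alt text.
```

Running a pass like this over each template of your site catches the multiple-H1 and missing-alt mistakes before they ever reach a crawler.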

Page Content Optimization

Finally, we get to the heart of the page – the content itself. Use the keywords you researched for a page, including semantically related keywords. Write as naturally and appealingly as possible while keeping those keywords in mind and don’t get carried away stuffing or cramming them into the body of the text. Not only is this unappealing to readers but is seriously frowned upon by the search engines.

The reader really is the most important aspect of your content. If the majority of your visitors are coming from the search engines, remember that they arrived using specific keywords. This means that they are searching for equally specific information relating to those keywords – make sure you deliver on the promise that you made in your title and description tags.

How to Recognize a Bad SEO Company

posted by Luigi_M_Scollo @ 8:00 AM
Sunday, February 21, 2010

How to Recognize a Bad SEO Company

Search Engine Optimization (SEO) is about getting potential customers to visit your website. It is also about building a quality website full of great content. It uses keywords appropriately and gets links “naturally” because people love what you have on your site. SEO companies can provide very useful services including keyword research, site review, providing technical advice on your website development and also management of online business marketing campaigns. They can also help with content development, article marketing and article distribution. Although it’s not brain surgery, it is hard to do and usually requires a lot of thought and real work.

Some unethical SEO firms attempt to manipulate search engine results in unfair ways. These practices could get your website ranked lower or even banned. When looking at SEO – either to optimize yourself or if you are looking to hire a company, here are some things to take into account.

Be Cautious Of SEO Firms That Say They Will Get Thousands Of Links To Your Site

It is not the number of sites that make the difference – it’s the quality of the sites. When firms promise huge numbers of links, or say that you will become part of their “network of sites”, it usually means a link farm is involved. A link farm is any group of websites that all hyperlink to every other site in the group. Search engines don’t like this and it can lead to penalties. Instead, practice reciprocal linking with legitimate and related websites for better search engine ranking.

Be Wary Of SEO Firms That Guarantee A High Ranking On Google

No one can guarantee a high ranking on Google. Some SEO companies provide a guarantee on their services. This is fine. What’s not fine is guaranteeing high ranking in an incredibly short period of time. When these unrealistic results fail to happen, the company will balk at giving a refund, suggest other services instead and start to become unreachable or disappear.

Be Cautious Of SEO Firms That Send “Spammy” Emails

These emails are unsolicited and usually begin with “We’ve noticed that you are not listed in some search engines…” You should be searching for a high-ranking SEO company; they will not be searching for you. Spam means scam. You don’t buy your medications from spammers, so why buy SEO services from them?

Be Wary Of SEO Firms That Are Secretive Or Don’t Clearly Explain What They Are Going To Do

Most reputable SEO firms are upfront with their clients and like to share their knowledge. They are confident that even if their clients understand their process, they won’t leave them. If the SEO firm claims it’s too complicated for you to understand, or if they say they have trade secrets and proprietary technology, it’s a sign that they may not be ethical in dealing with your website.

Be Wary Of SEO Firms That Say They Will Submit Your Site To Thousands Of Top Search Engines And Directories

Besides the small fact that there aren’t that many search engines, consider that the guidelines of the search engines themselves tell you that it doesn’t do any good anymore. Search engines are good at what they do – finding sites – and you don’t need to pay someone to submit your site to a search engine. If a firm makes this claim, it will probably use Free For All (FFA) junk sites that might damage your site’s standing.

Be Cautious Of SEO Firms That Say They Can Optimize And Promote Your Site For A Low, Low Monthly Fee

Not all monthly SEO or SEM (Search Engine Marketing) service contracts or monthly fees are a scam. There are real reasons to pay a monthly fee to an SEO expert: when you or someone else is constantly generating new content or new features for your site; when you’re implementing link-building or PPC (Pay Per Click) campaigns; or when you’re starting a brandcasting campaign. Press release distribution, email campaigns and article marketing campaigns could also justify a legitimate monthly fee.

Not-so-legitimate fees could include monthly re-submitting of your site to search engines, “tweaking” your code to keep up with changes and regularly submitting your site to hundreds of useless free-for-all directories. The worthwhile companies that charge a monthly fee will usually be able to tell you exactly how much it costs per month to generate blog entries or to generate and distribute articles or press releases. And it won’t be for the low, low price of $79.95.

Choose Your SEO Company And Services Carefully

Do your research and don’t make the decision lightly. If you were hiring a contractor to remodel your kitchen you would want to see other kitchen projects they’ve done and speak with the owners about the company’s business practices. You should do the same thing when hiring an SEO company. Get referrals and really speak with them.

There are many online tips about choosing and hiring SEO firms that you can check out as well. Remember, SEO is a long-term strategy and you should take the time to do your research before buying or you’ll probably be buying again.

Top 10 Don’ts for SEO Copywriting

posted by Luigi_M_Scollo @ 8:00 AM
Saturday, February 20, 2010

Top 10 Don’ts for SEO Copywriting

Following in the footsteps of Rand Fishkin and Guy Kawasaki, I decided to come up with my own list of don’ts.

There is no shortage of don’ts when it comes to SEO copywriting. It seems this niche got off to a rough start many years ago when early comers somehow misconstrued the core principles of the trade. Allow me to elaborate on how not to write SEO copy.

1. Don’t shove as many keyphrases into the copy as humanly possible. It’s not about the sheer volume of search terms you include. Yes, Google and other engines should be able to follow what the page is about. Yes, engines are looking to match a searcher’s query with search engine optimized content on your web pages, but which pages land at the top is decided through a series of calculations far more complex than any simple ratio. When you overload copy with keyphrases you sacrifice quality and user experience.

2. Don’t lose sight of balance. If SEO copywriting isn’t about the percentage of keywords within the copy, then what is it about? Balance. You have two audiences with SEO copywriting: the search engines and your site visitors. But surprisingly, the balance doesn’t come from serving both masters equally well. The balance comes in how much you cater to the engines. You see, your site visitors always come first. However, if you write with too little focus on the engines, you won’t see good rankings. If you put too much focus on the engines, you’ll start to lose your target audience. Balance. Always balance.

3. Don’t let someone else choose the keywords. If keyword research isn’t a service you offer, an SEO firm, keyword specialist or some other professional that your client hires will have to conduct the research. Don’t just accept keyphrases these folks toss your way. Ask to see the entire list with recommendations as to which terms would be best strategically. Then you, as the professional writer, can decide which will also work best within the copy.

4. Don’t sacrifice flow for numbers. This is a follow-up to number three and is a major issue with bad SEO copywriting. SEOs or clients sometimes insist on using hacked-up search phrases that simply don’t work in a normal sentence. An example? “Candies samples free.” Many copywriters will just grin and bear it, sacrificing quality and flow for the sake of competitive values or other numbers. The result is often some obnoxious sentence like, “If you’re looking for candies samples free, you’ve come to the right place!” Forcing a phrase into the copy at all costs never turns out well.

5. Don’t use keyphrases that don’t apply to the page. If you operate a site about wedding receptions, don’t try to force a search term about wedding dresses into the copy just because it pulls a lot of traffic. (A) Unless you sell, alter or design wedding dresses, it won’t be applicable. (B) Even if you manage to get the page ranked well for the phrase [wedding dresses], once the visitor clicks to your site and realizes you have nothing to do with wedding dresses, they will leave. It’s a waste of time and effort and it creates a poor user experience.

6. Don’t use misspellings and correct spellings on the same page. I fully understand that the misspellings of keyphrases can be valuable search terms. However, to mix correct spellings and misspellings within the same page of copy looks like you’ve got a bunch of typos in the content. It’s just not professional. Some writers will go for the old, “We rent limousines (sometimes spelled limosenes) for the most affordable prices in town.” I don’t care for that approach. It’s just not natural. Would you ever see brochure or newspaper copy that reads that way? I think not.

7. Don’t use keyphrases the exact same way every time. This is how we end up with horrible SEO copy that sounds like a 4th grader wrote it. (See #4.) There are lots of ways to use keywords in copy, not just one. In order to sound natural, you have to get creative with your keyphrase use. One way is to break up phrases using punctuation. Since search engines don’t pay attention to basic punctuation marks, you can easily write something using the search term [real estate Hawaii] that reads like this: “Currently there is an impressive selection of available real estate. Hawaii listings can be browsed by island or price range.” See? “Real estate” is at the end of the first sentence and “Hawaii” is at the beginning of the second sentence. The engines ignore the period so there’s no problem.

8. Don’t use all types of search phrases for every situation. There are many ways in which this “don’t” applies. One quick example is that of an ecommerce site. It wouldn’t be advisable to use specific, long-tail keyphrases on the home page of your site. They are much too specific in most cases and are better suited for individual product pages. Broader terms are typically best for an ecommerce home page. If you don’t understand the best applications for the various types of keywords, you’re likely to have lackluster results.

9. Don’t neglect ALT tags/image attributes. These tags are the ones associated with images on your pages, and they carry a good deal of weight, especially if the image is used as a link. The ALT text counts the same as anchor text in a text-based link. Depending on a few different factors, ALT text may be a good place for those misspellings mentioned in #6.

10. Don’t forget the chain of protocol. There’s a method to the SEO copywriting madness. The idea is not to get as many different keyphrases onto a page as possible. Just the opposite, in fact. Rather than having 12 different search terms used only one time each, you need to use two to four keyphrases (depending on the length of your copy) per page. The title, META tags, ALT tags, other coding elements and on-page copy need to support each other as far as keyphrase use goes. Your goal is to let the engines know that you have original, relevant content about a narrow topic.

Unless you have an exceptional number of back links built up, just mentioning [dark chocolate], [chocolate strawberries], [chocolate chip cookies], [chocolate cake], [chocolate desserts], [organic chocolate] and [chocolate cheesecake] once each on a web page isn’t likely to do a lot of good. Instead, pick two or three terms which are closely related and use them several times each along with mentioning them in your tags.

When you avoid making common mistakes, you’ll find your SEO copywriting flows much better, is more natural-sounding and ranks higher, too.

Top 5 SEO Copywriting Mistakes That Will Cost You Money

posted by Luigi_M_Scollo @ 8:00 AM
Friday, February 19, 2010

Top 5 SEO Copywriting Mistakes That Will Cost You Money

Just as there are different ways of writing for novels, for advertising and for films, there is a way to write for the Internet. To find content on the web we use search engines. To make sure the search engines find our content, we optimize it. Search Engine Optimization (SEO) copywriting is writing content that the reader wants to read and that will be easily found by, and rank well with, search engines.

The object of writing for the Internet is to get the reader to use your content to click through to your website. If they don’t get to your website, they can’t look at your products or services and you will have lost a potential customer. Here are a few mistakes that you’ll want to avoid.

Mistake #1 – Have a Boring Or Vague Title

This is a very important mistake not to make. If they don’t even look at your article, all your time and effort are wasted. If you provide an attention grabbing title, one that makes them curious enough to open your article, you’re halfway there.

Here are just a few ideas to get you thinking:

• Use titles that describe the content of your article but are short and concise.
• Use keywords in your title that people might be searching for.
• People can’t resist articles with lists or tips, such as “Top 10 Copywriting Mistakes” or “Top Tips on Getting Your Articles Read”.
• “How to” articles are popular as well.

The bottom line here is to put some thought into your title. Think about how to get a reader’s attention.

Mistake #2 – Create Bland Content

From beginning to end – try to keep it interesting. Make reading your article a pleasurable experience for your reader. Here are a few suggestions.

Make it fun, relevant and grammatically correct. Nothing pulls the reader out of a story more than bad grammar and misspelled words.

Use short sentences and try to limit paragraphs to two or three lines. Concentrate on writing rich and appropriate copy rather than just practical words.

Have a sense of humor. This gives your articles personality. Don’t give a sales pitch – use a call to action. The purpose of your article is to get your reader to your website. Your writing could include a reason for them to find more information, either from another article that you’ve written or from your website.

“Content is king”. If you keep this in mind, you’ll be ahead of the game. Search engines love well-written and useful content. So do readers.

Mistake #3 – Make Your Article As Hard To Read As Possible

Every post should be easy to scan. That means your reader should be able to easily scan your article and find headings that will tell them what the section is about. You can use numbered lists and bullets to organize your ideas so they are quickly read. If you italicize, bold or underline a word, the search engine assumes that it’s a keyword. You can use this to your advantage. However, if you use these tags a lot or if you use them on non-keywords, you’ll confuse the search engines and lose any advantage you would have gained.

The other thing that makes a page easy to scan is short paragraphs. When you look at your copy on the page, you should see a lot of white space. Looking at a page that’s completely filled with words is intimidating to a reader. You want to make it as friendly and welcoming and as easy to read as possible.

Mistake #4 – Misuse Keywords

Keywords are at the core of writing for the web. You should research and know your keywords. Here are a few suggestions about keywords:

• Target a set of keywords in every post but don’t use them more than three or four times on a page. If you use the same keywords again and again, search engines can tell that the article isn’t very useful.

• Use a wide variety of words that pertain to your topic.

• Use synonyms of your keywords in addition to the keywords.

• Don’t stick to a standard keyword density for every article or post. You want your words to flow naturally, and overuse of keywords makes your copy sound forced.

• Review your keywords every so often. Sometimes your business changes and you want your articles to change also.

If you provide your reader with content that lets them learn or experience something, you’ll have a happy reader. If you provide the search engines with good keywords and a variety of them, you’ll have a happy search engine.
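The three-or-four-uses guideline above is easy to sanity-check with a small script. Here is a minimal sketch in Python; the `keyword_usage` helper and the threshold are illustrative assumptions, not a real SEO metric, and it only handles single-word keywords:

```python
import re

def keyword_usage(text, keywords, max_uses=4):
    """Count how often each single-word keyword appears and flag likely overuse."""
    words = re.findall(r"[a-z']+", text.lower())
    report = {}
    for kw in keywords:
        count = sum(1 for w in words if w == kw.lower())
        report[kw] = {"count": count, "overused": count > max_uses}
    return report

article = ("Copywriting tips help you write better copy. "
           "Good copywriting is clear, and copywriting that rambles loses readers.")
print(keyword_usage(article, ["copywriting", "tips"]))
```

Multi-word phrases would need a different matcher, but for a quick read on whether a draft is leaning too hard on one word, a count like this is usually enough.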

Mistake #5 – Try To Trick the Search Engines

Practicing questionable tactics like cloaking and using hidden text is a bad idea. The last thing you want is to get your site banned. These kinds of tricks will do it. So can using hidden links, link farms, linking to bad sites, distributing viruses and sending spam. Don’t try to trick the search engines and don’t work with any companies that use these techniques.

Overcoming these common mistakes can give you a head start when creating effective content on the Internet. SEO copywriting requires effort. Putting content on your site and distributing it on the web takes time. If you work at it over time and create lots of valuable content, effectively “brandcasting” your site, you’ll be rewarded with more traffic.

Search Engine Optimization (SEO) is No Place for Amateurs

posted by Luigi_M_Scollo @ 8:00 AM
Thursday, February 18, 2010

How come everybody nowadays is an SEO Expert?

Let’s face it: not a day goes by without someone offering their services as a Search Engine Optimization specialist. The strange thing, however, is that many of the people offering such services on the various forums have no runs on the board themselves.

Of course, I’m not saying there aren’t a great many reputable internet marketing services out there, but they are increasingly outnumbered by those with little or no background at all, and it is these people and their companies that are highly unlikely to ever produce satisfactory results for their clients.

Perhaps one should bear in mind that there is no difference between investing your money in internet marketing, and investing your money in a regular market. In both cases you need to measure your results just as you need to target the correct audience. For example, you wouldn’t even consider wasting your money by advertising your product or your service in a newspaper that is completely irrelevant to your target market. Advertising is done for one purpose and one purpose only, and that is to bring in a return on your investment, irrespective of whether the advertising is done online or offline.

Why You Should Avoid the Amateurs

Essentially, you need to bear in mind that while any Tom, Dick, and Harry can learn about search engine optimization, it takes several years of dedication, practice, and careful analysis to fully understand the different techniques and how to apply them to different types of business.

The bottom line is that an SEO campaign is a highly intensive process that starts with intense keyword research to establish which keywords are most likely to produce maximum results for a website. Once the ideal keywords have been established, it can be incredibly tempting to simply spread them around on your website and hope for the best. However, in most cases you’ll find that the most popular keywords also have the most competition.

As such, why bother targeting particular keywords, irrespective of how popular they are, if there’s virtually no chance they are going to help your rankings? In fact, you could end up waiting several months before the major search engines even start recognizing your website.

On the other hand, a specialist who is highly skilled in internet marketing will be aware of which relevant keywords and keyword phrases will help to improve a website’s ranking. Likewise, a true professional will also know where the keywords and keyword phrases should be placed on a website in order for them to have the maximum amount of impact, without being penalized for keyword stuffing.

Onsite optimization of keywords is notoriously time-consuming if it’s done properly. But if your goal is to give a website a boost in search engine rankings, then this optimization process needs to be continued off-site as well. Here again, a competent SEO professional will know exactly how to implement a successful link building campaign, including article marketing, submitting articles to directories, taking advantage of social networking sites, and social bookmarking.

Furthermore, because a professional SEO specialist appreciates the importance of getting a good return on investment, they will also make use of analytics tools to track conversions and monitor the success of an internet marketing campaign. Bear in mind that these tools are essential for fine-tuning any good SEO campaign.

Steer Clear of Internet Marketing Fraudsters

Unfortunately, though predictably, the internet is full of undesirable people who target honorable businesses with false promises of guaranteed results. These people will more often than not guarantee to get your website to the top of the search rankings for specific keywords. However, in most cases they simply use keywords that are searched for so rarely that they sit at the top of the rankings simply because they have no competition – no one uses them.

Obviously, if no one is ever typing that keyword into the search box, why waste money on it? One of the easiest ways to determine whether an internet marketing expert is legitimate is that the legitimate ones never guarantee a top spot in the search results. They know that no one can guarantee such results, for reasons such as algorithms that change continuously.

How to Avoid the Wrong Internet Marketing Service

First and foremost, you need to ask the right questions:

1. You need to determine how long the company has been involved with Internet marketing.

2. You should ask to see testimonials from past clients.

3. You should search online for their services. In fact, you should attempt to find their website by using keywords and keyword phrases which are relevant to the services they provide. Obviously, if you fail to find them on the first page of Google search results, then your alarm bells should start ringing. For example, if you were considering using the services of Sunshine Coast Internet Marketing Company, you could do a search for Sunshine Coast Internet marketing, internet marketing Sunshine Coast, etc.

The most important thing of all is that you acknowledge that going with the wrong internet marketing company can end up costing you a considerable amount of money for nothing. On the other hand, if you choose a reputable company, you can be almost certain that your website will end up ranking much higher than it did before.

Putting SEO Under the Microscope

posted by Luigi_M_Scollo @ 8:00 AM
Wednesday, February 17, 2010

There is not a day that goes by that people recommending search engine optimization (SEO) don’t come up with yet another interesting idea or opinion on a topic in their field. They are all so focused on structures and procedures that they often forget that not everyone agrees with their viewpoints and practices – that is, if their technical mumbo-jumbo can be understood at all.

The following are 5 SEO topics that are frequently discussed and disagreed upon:

1 – The Importance of Content Structure & Keywords

While keywords may add great value from a technical, algorithmic ranking perspective, their presence may not always entice the audience to explore the site they are visiting. The content may seem boring and unappealing rather than gripping and fascinating. In that case, the psychological triggers that tell the reader to continue browsing will be missing, as will the desire to share the information with friends and family.

SEO experts won’t ever agree on which is more important when it comes to keywords and compelling content. In the end, it will be up to the website owner or manager to decide what is more important to him: search engine rankings or sales.

2 – Reciprocal Link Exchange: Pro or Con?

A ‘reciprocal link exchange’ is an effective and efficient way of driving traffic to a website and improving the search engine placement of participating websites. At least, that is what some experts believe, while others are fearful and refuse to swap any kind of link that may refer to their business.

Artificially manipulating links may not be the best SEO idea on the market, but there is definitely nothing wrong with link trading programs that exchange links of companies endorsing a relationship, or business related directories.

If you do decide to participate in a link exchange, check the links regularly and report the dead ones to the webmaster so they can either be fixed or removed.

3 – Should the H1 Headline and Title Tag Match – or Not?

Many SEO consultants are skeptical when they see sites whose H1 header differs from the title tag. One may wonder what the reasoning is, because the mismatch can confuse and upset the audience. Users click on a headline in the search results because they are interested in its content, yet if the H1 on the page does not match the title they clicked, they may be confronted with a completely different message, possibly one they are not interested in. That is very disappointing for the user, even if the mismatch may result in a higher ranking.
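Auditing your own pages for this mismatch is easy to script. Here is a minimal sketch using Python’s standard-library `html.parser`; the `TitleH1Checker` class name and the sample page are hypothetical:

```python
from html.parser import HTMLParser

class TitleH1Checker(HTMLParser):
    """Collect the <title> text and the first <h1> text from an HTML page."""
    def __init__(self):
        super().__init__()
        self._in = None      # tag we are currently collecting text for
        self.title = ""
        self.h1 = ""
    def handle_starttag(self, tag, attrs):
        # Only track <title> and the first <h1> encountered
        if tag in ("title", "h1") and not (tag == "h1" and self.h1):
            self._in = tag
    def handle_endtag(self, tag):
        if tag == self._in:
            self._in = None
    def handle_data(self, data):
        if self._in == "title":
            self.title += data
        elif self._in == "h1":
            self.h1 += data

page = "<html><head><title>Blue Widgets</title></head><body><h1>Blue Widgets</h1></body></html>"
checker = TitleH1Checker()
checker.feed(page)
print(checker.title.strip() == checker.h1.strip())  # True when they match
```

Whether the two should match exactly is, as the post notes, a judgment call; a check like this simply surfaces the pages where they diverge.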

4 – The Relevance of a Website’s Age

Although many web designers believe that the age and history of a website are pertinent, it is not quite clear whether search engines actually use an ‘age’ or an ‘age of links’ metric to inflate incumbent rankings. Search engines check keywords, pay-per-click, link building and other SEO features and don’t necessarily verify when a website was built. All they care about is how user-friendly and well optimized the site is, which means that a younger, highly efficient site should absolutely be able to compete with more mature competitors.

5 – Reporting a Competitor’s Spam Activities

Spam is a reality and spammers should be reported. At least, that is what a number of SEO specialists would argue. Others may disagree and point out that those who are extremely vocal about competitors’ manipulative tactics to enhance search engine ranking are usually the ones abusing it the most. All they are trying to do is shift the focus away from them.

Anyone reporting spam should not publicly announce their actions because, even if spammers are breaking guidelines, the SEO community is vehement about socially shunning those who violate the “code of silence”. As unethical as this blackmail may seem, it should not stop you from warning the search engines about illegal activities and, at the same time, reaping some of the benefits associated with doing so. In the end, you will have to market and protect your site and business.

Here are several arguments in favor of spam reporting:
• Taking out spammers will improve the value of the Internet and help search engines provide more accurate search results.

• Your ranking may improve by eliminating a competitor.

• Removing manipulators will leave more room for your site to achieve better rankings, to boost visibility and to boost your sales.

• You can learn from researching spam activities and tactics. You will learn what is inappropriate, what the engines do/don’t tolerate and what penalties can be expected for which unlawful actions.

• As long as you are clean yourself, reporting spammers can gain you trust with the search engines.

These are a few reasons against it:

• If you are engaged in certain types of spam, or unknowingly benefit from it, you can accidentally hurt your website’s ranking.

• It is unethical to blow the whistle on and hurt other SEO specialists. People have been arguing about ethics for centuries and in the end it will be up to each individual to decide what is more important to them and to their website.

Search Engine Marketing: A Perfect Blend Of Social Media, SEO and SEM PR

Search engine marketing (SEM) has evolved to become the most reliable strategy for reaching your target audience and driving conversions on the internet. It compels your market to visit your website; it boosts your company’s exposure within your space; it positions your product as the solution to their problems. As a result, your sales go up. Your revenue and profit swell. Your ROI rises. And your business enjoys stronger branding and customer loyalty in the process.

Many of your competitors are already using SEM in an attempt to capture a larger portion of your market. There hasn’t ever been a better time to protect and expand your territory. This article will explain why search engine marketing should be a critical piece of your online marketing strategy. You’ll discover the value of hiring an SEM expert versus forging a path yourself. We’ll also describe how SEM PR and SEM social media tactics converge with SEO and PPC to produce a groundswell of momentum.

Why Search Engine Marketing Is Critical

Search engine marketing blends SEO, pay-per-click advertising, and social media strategies to give your company a higher level of visibility within the search engines’ listings. However, visibility without sales defeats the purpose. And therein lies the true value of SEM.

Your marketing efforts must generate conversions in order to justify the investment. Conversions might include a prospect buying your product, signing up for your newsletter, or becoming your affiliate. It might include subscribing to a continuity program that generates monthly revenue. Search engine marketing not only allows your company to approach your audience, but it engages the conversation that is already occurring in their mind. It compels action, which lifts your conversion rate.

Is Hiring A Search Engine Marketing (SEM) Expert Necessary?

Every tactic that is leveraged within a comprehensive search engine marketing deployment can be learned. The problem is, doing so is incredibly time-consuming. The algorithms that govern the search engine’s organic rankings change constantly. The major PPC platforms endure a seemingly endless string of upheavals. Social media sites are still in their infancy; as they mature, so too, will the tactics required to leverage them. Developing proficiency in each area of search engine marketing takes an enormous amount of time.

An SEM expert will design a search engine optimization campaign that pushes your website to the top rankings for your chosen keywords. They can also launch a pay-per-click (PPC) advertising campaign that further improves your exposure. Social media marketing tactics can be integrated to dovetail with the rest of your search engine marketing deployment. Even though you could launch these strategies yourself, do you have the time to learn and apply them?

SEM PR: Melding Search Engine Marketing With Public Relations

SEM PR has its roots in search engine optimization. Years ago, online public relations was managed largely through the creation and distribution of online press releases. This is still effective today. These press releases gain traction in the search engines’ organic listings. That builds your company’s brand while helping to push negative publicity off the first page of results.

Today, online public relations has been incorporated within a broader search engine marketing context that includes PPC, SEO and online reputation management (ORM). For example, a press release can be distributed online in order to gain traction within the natural listings. Then, a PPC campaign can be launched to direct your audience to the press release on your website. Links can be placed throughout the page to other positive coverage. The more points of exposure, the less likely negative press will penetrate the top rankings in the search engines. This is a core element of ORM and by extension, search engine marketing.

Leveraging SEM Social Media Optimization For A Competitive Edge

Social media sites began to enjoy ranking authority in the major search engines a few years ago. That authority has only increased over time, making social media an important cog in search engine marketing. This is the reason SEM social media optimization has become critical for companies that need to reach niche markets.

By establishing a presence on the top social media sites, a search engine marketing agency can develop multiple entry points in the organic listings. That increases your audience’s exposure. It also prevents bad press from infiltrating the top listings for your keywords. These advantages converge to deliver a competitive edge for your company.

The Value Of Hiring A Professional SEO Marketing Consultant

Time is the most valuable commodity of all. Once it expires, it cannot be retrieved. This is why a growing number of companies – including your competitors – are opting to hire a professional SEO marketing consultant. They realize that search engine marketing strategies are complex. The learning curve is steep. What’s more, deploying PPC, SMO and SEO tactics poorly can do more harm than good. Precision in execution is critical.

If you have already mastered each of the strategies that make up search engine marketing, and have refined the systems through which to deploy them, you may not need an SEM expert. Otherwise, you might be fighting an uphill battle. Consider contacting a search engine marketing specialist today.

Video SEO – A Neglected Path To Higher Search Rankings

posted by Luigi_M_Scollo @ 8:00 AM
Sunday, February 14, 2010

Video SEO is an underutilized search engine marketing strategy. Even as videos continue to gain significant traction in the search engines’ natural listings, most companies either ignore them or remain completely unaware of their potency. That oversight represents a valuable edge your company can use to leapfrog your competitors in the organic rankings.

The strategy blends traditional search optimization tactics with a relatively new platform. With the rise of YouTube, Revver, Blip, and similar video sites, consumption patterns have driven the search engines to provide these sites with greater ranking authority. As long as your primary objective is clearly established, a video SEO campaign can have a dramatic effect on your exposure in Google, Yahoo, and Bing.

In this article, we’ll explain why you should integrate video SEO into your current search marketing strategy. We will also provide a few ingredients that will help you avoid potential pitfalls along the way. Last, you will learn what to look out for when choosing a video SEO company that can drive traffic and conversions.

How Video SEO Improves Your Search Exposure

Before Google released its Universal Search platform in May 2007, its natural listings were dominated by text-based pages. Videos were rare in the top spots. Universal Search changed the way Google displayed its primary index. Google, Yahoo, and Bing now include entries from their respective video search platforms. What’s more, popular video-sharing sites have been given higher ranking authority and increased link weight (we’ll describe this latter point in a moment).

Video SEO gives you greater exposure in the search engines through two levers. First, it caters to the algorithm used for Universal Search. By syndicating your videos to authoritative video-sharing sites, you will enjoy more exposure through their increased ranking authority. In effect, those sites will rank higher, drawing more people to your videos.

Second, videos that are placed on your own site (as opposed to syndicated) attract links – both directly and indirectly. As your videos gain popularity, direct links will naturally build, pointing to the pages on your site that host the videos. Indirect links will point from other sites whose owners have embedded your videos. As a result, your inbound link profile will continue to grow and strengthen, lifting your site higher within the search engines’ organic listings.

3 SEO Video Tips To Capture Higher Search Positions

Your video SEO campaign can only be effective if you recognize the limitations of the search engines. First, their algorithms cannot read lips: in order to rank for your target keywords, those keywords must be available to the search engines’ spiders in text form. If you’re placing videos on your site, optimize your titles and surrounding text, and include an edited transcript of the video. If you’re syndicating them, optimize your external titles and tags.
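That first tip, keeping your keywords available as crawlable text around the video, can be sketched as a simple page template. This is a hypothetical Python helper; the function and field names are illustrative, not a standard API, and the actual video embed markup is omitted:

```python
def video_page_markup(title, description, transcript):
    """Assemble the crawlable text that should surround an embedded video:
    an optimized heading, surrounding descriptive text, and a transcript."""
    return (f"<h1>{title}</h1>\n"
            f"<p>{description}</p>\n"
            f"<!-- video embed goes here -->\n"
            f"<h2>Transcript</h2>\n"
            f"<p>{transcript}</p>")

snippet = video_page_markup(
    "How To Repot An Orchid",
    "A five-minute walkthrough of repotting a phalaenopsis orchid.",
    "Hi, today we are going to repot an orchid...")
print(snippet)
```

The point is simply that everything the spiders should rank you for exists as text on the page, not only inside the video file.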

Second, focus on inbound links. An effective video SEO campaign relies on contextually related links pointing from a wide breadth of sites. Videos that spark a groundswell of attention – whether through entertainment, information, or controversy – can achieve this easily.

Third, integrate a social media sharing component. You want viewers to share your videos with their friends on Facebook. You want them to “Tweet” about your videos on Twitter. You want them to bookmark your videos on StumbleUpon, Digg and Delicious. These social media sites can form the backbone of your video SEO campaign, driving waves of inbound links to your site.

Key Factors In Choosing A Video SEO Company

Traditional search optimization is a mature strategy. SEO specialists have honed their craft for more than a decade. By contrast, video SEO is still an evolving science. Even though it leverages the core tenets of a traditional SEO campaign, the rise of social media and video-sharing sites has infused video SEO with enormous complexity. Hiring a video SEO company removes the need to keep up with the roiling landscape. The key is using the right criteria to identify a proficient firm.

A professional video SEO company should have an established track record that shows a keen grasp of the search engines’ organic algorithms. That track record should also demonstrate an ability to evolve as the algorithms change. Many search optimization experts were completely unprepared for the debut of Universal Search. By extension, so too were their clients.

Leveraging Video SEO For More Traffic And Higher Conversions

A carefully executed video SEO campaign can sharply increase your exposure within the search engines’ natural listings. When implemented as a component of a multi-pronged search engine marketing campaign, it can drive more targeted traffic to your site. Targeted traffic translates into higher conversions. If you are not yet utilizing video SEO for your site, your current organic rankings may be more vulnerable than you realize.

The Fundamentals of Search Engine Optimization (SEO)

posted by Luigi_M_Scollo @ 8:00 AM
Friday, February 12, 2010

Why SEO?

Search engines provide the majority of traffic to websites across the Internet, regardless of a website’s focus. Therefore, if your site cannot be properly located and indexed by the leading search engines, you are missing out on the best opportunity to drive targeted visitors and potential revenue.

What is SEO?

Search Engine Optimization, or SEO, is the process by which pages are improved to increase their organic search engine rankings. This is done by assessing what the individual search engines are looking for and providing it. The outcome of an SEO campaign is to create high organic rankings for the keywords/phrases for which the client is indeed an authority. This will ultimately create an increase in targeted traffic. A good SEO campaign includes the following three aspects:

1. Keyword Analysis
2. Onsite Optimization
3. Offsite Optimization

Keyword analysis is the process by which you analyze and select keywords based on traffic, competition, and relevance. If you are not selecting the proper keywords, then the rest of the optimization is a lost cause. The text and theme of the site need to revolve around these keywords and very much define how the site appears to both users and search engines.

Onsite optimization deals with changes made to the site itself. This involves making changes to the text content, architecture of the site, HTML code, and page layout. CSS-based design is often recommended when working to optimize a website, as it helps keep important content at the top of your pages and allows your pages to be easily and efficiently crawled by the search engines. This is the most commonly understood aspect of SEO, but it only accounts for about 40% of a site’s rankings. This is where your keywords are placed throughout the code to show the search engines what your site is about.

Offsite optimization deals with changes made outside the scope of the site. This mainly involves increasing the quantity and quality of inbound links to the site. Approximately 60% of Google’s current ranking algorithm is based on inbound linking. Your goal is to maximize the site’s exposure on the Web and get as many sites as possible to link back to your site.

What is a Good Keyword and What is Not?

This is the ultimate question we have to ask ourselves when judging keywords. There are many variables you have to take into account when selecting exactly which keywords your site will be optimized for. Use the following criteria to determine the viability of a keyword:

* The estimated number of searches for the keyword in a 24-hour period
* The number of sites competing for the keyword
* The quality of the sites competing for the keyword
* The ability of the site to support the keyword
* Relevance between keywords
* The target audience of the site

Keep in mind that your number one goal is to accurately depict what the site is about through the keywords (and the eventual text content). If your site is not properly described by the keywords, then either the site is targeted wrong or you’ve selected the wrong keywords.

Search engines like sites that are targeted to a specific topic. If a site is spread too thin topic-wise, it will be much harder for it to appear as an authority on any one topic. Search engines do favor large sites, but generally it is better to have a smaller, targeted site than a larger, broad site that covers many topics.

It’s not uncommon to discover site theme issues when doing keyword selection. Oftentimes, it leads to a reassessment of the site as a whole (which is a positive). In this way, general marketing, user experience, and SEO overlap. If you do not feel your site is targeted towards the correct keywords and themes, it is important that you re-target the site and its content prior to optimization. You should understand your audience, the purpose of your site, and its themes before even starting an SEO campaign.

It is also common for sites to get caught up in industry jargon. You have to look at your keywords as your target audience would. If you’re targeting the general consumer and you use lots of industry jargon, then you cannot expect much of a return on your investment.

Another thing to watch out for is overly generic keywords. If you are attempting to optimize your site for keywords that can mean many other things, you are bringing in a whole lot of new competition. So, we now have a small list of what to avoid:

* Keywords that are not relevant to each other
* Keywords that do not fit the theme of the site
* Industry jargon, if it is not applicable to the audience
* Keywords that are too generic/overly competitive
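One way to apply the viability criteria above is a simple filter over candidate keywords. In this Python sketch, the search and competition numbers are invented placeholders (real figures would come from a keyword research tool), and the thresholds are arbitrary assumptions you would tune:

```python
def viable_keywords(candidates, min_daily_searches=50, max_competition=500_000):
    """Filter candidate keywords by rough viability thresholds.
    candidates maps keyword -> (estimated daily searches, competing sites)."""
    return [kw for kw, (searches, competing) in candidates.items()
            if searches >= min_daily_searches and competing <= max_competition]

candidates = {
    "seo": (20_000, 90_000_000),          # popular but hugely competitive
    "seo copywriting tips": (120, 40_000),  # modest traffic, modest competition
    "xyzzy marketing": (2, 10),            # no one searches for this
}
print(viable_keywords(candidates))  # ['seo copywriting tips']
```

A filter like this only covers the two quantitative criteria; relevance, site fit and audience still need a human judgment call.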

META Headers

Optimizing the META header is the first and easiest step in onsite optimization. There are four main areas that you should be concerned with:

1. Title
2. Description
3. Keywords
4. Robots

Depending on the keyword selection, the Title should be made up of the first two keywords. This provides high density and prominence for both keywords instead of spending it all on one. Of course, the Title should make sense and be descriptive of the page. The Description borrows the same idea but expands on it a little: ideally, include both the primary and secondary keyword in a short sentence describing the page. The Keywords field is simply a list of the keywords separated by commas with no spaces in between. The Robots tag tells the search engine spiders what to do with the page.
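Putting the four areas together, here is a minimal Python sketch that assembles a head block following the rules above (Title from the first two keywords, Keywords comma-separated with no spaces, a Robots directive); the `build_meta_head` helper name and the sample values are hypothetical:

```python
def build_meta_head(primary, secondary, description, index=True):
    """Build a META header block: Title from the first two keywords,
    a short Description using both, Keywords with no spaces, and Robots."""
    robots = "index,follow" if index else "noindex,nofollow"
    return "\n".join([
        f"<title>{primary.title()} - {secondary.title()}</title>",
        f'<meta name="description" content="{description}">',
        f'<meta name="keywords" content="{primary},{secondary}">',
        f'<meta name="robots" content="{robots}">',
    ])

head = build_meta_head("blue widgets", "widget repair",
                       "Blue widgets and widget repair services.")
print(head)
```

As the article says, the Title still has to read naturally and describe the page; a template like this is a starting point, not a substitute for writing it by hand.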


“Google interprets a link from page A to page B as a vote, by page A, for page B. But Google looks at considerably more than the sheer volume of votes, or links, a page receives; for example, it also analyzes the page that casts the vote. Votes cast by pages that are themselves ‘important’ weigh more heavily and help to make other pages ‘important.’” – Quote from Google’s explanation of PageRank.

Link popularity is one of the most important factors search engines use in determining where you will rank in the search engine results pages (SERPs) for your keywords and phrases, as it helps them determine how important or popular your site is and what its reputation is. Link building, as part of the offsite optimization process, is the process of finding related/relevant websites and receiving a link from them to you. Natural linking occurs when a site has good content that others will link to without being asked. But to get those links, people have to know about you; it is a catch-22. Building links has become quite sophisticated over the last couple of years. Today you need a mixture of links from many sources, including articles, press releases, social media, blogs, directories and more.

10 Things You Need to Know about SEO

posted by Luigi_M_Scollo @ 8:00 AM
Thursday, February 11, 2010

I have compiled a list of 10 vital things – from choosing an expert to instructing your web developer – that every marketer needs to consider when undertaking search engine optimisation as part of their marketing mix.

1. Strategy First
Please don’t ask for a full SEO proposal from an agency until you have set your strategy. Too often, agencies will respond with a full proposal, including lots of articles to be created, sites to be built and links to be implemented, without a clear strategy behind any of it.

Some sites are more straightforward, but others are complex and would benefit from asking a couple of agencies to get involved at the research stage – ask them about the strengths and weaknesses of your site, what they think of your competitors, and what strategic approach they would take with your site.

To get the best advice from this process, expect to pay the agencies involved. A small percentage of your online budget spent on good strategy will save you money in the long term. Even better, pay two agencies for a strategy recommendation and then choose the best one for your business!

2. Choosing a Consultant
You need to work with someone who can communicate about SEO in plain English, someone who can take complicated ideas and techniques and turn them into something you can understand and then make a decision on – especially as there are often many possible solutions to choose from.

Someone with experience in your vertical – such as travel, finance, retail – as well as several other verticals is important. An SEO consultant with experience across multiple types of business, as well as experience that is directly relevant to you, will have better problem-solving skills and more exposure to technologies. Experience in your sector means the consultant will be very helpful in defining your strategy, understanding terminology, and knowing what your competitors are doing.

3. Expectations
What are realistic expectations from your investment in SEO?

Too often, we see marketers defining their keyword set or
setting unrealistic goals for their site without any basis in
how SEO really works.
If you are a law firm, for example, and you want to rank highly
for terms such as “lawyer”, or “barrister”, then you have to
take into account that these are extremely popular and
competitive terms. It might not be achievable, and even if it
is, it’s probably a very hard road to get there.

Be open to advice when setting the goals for your website (which
should be a part of the keyword research period of your SEO
project). If you have a PPC campaign running first, you can use
the keyword data from that campaign to gain an understanding of
what is important for your website.

4. Using the Right Language
Optimising begins with keyword research that helps you
understand the language your customers are using to find your
products and services.

Be realistic. It may sound obvious but, if the words your
customers are using to search are not on your website pages,
then you won’t be found in the search engines for those words.

Similarly, brand words and buzz words are all very nice in
marketing, but if people aren’t using those words to search,
then again you won’t be found.

Be ready to change the language of your site: be open to
conforming your website to the words people actually use.
Optimisation is about including those words in the right areas
of your pages (such as navigation, links, headings, meta tags
and content) so the search engine sees all the right signals to
understand what your site’s pages should be ranked for.
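
As a rough illustration of that on-page check, the sketch below parses some invented example markup and reports whether a target keyword appears in the areas the paragraph mentions (title, headings, body copy). It is a toy, not an audit tool.

```python
# Minimal sketch (example markup is invented): check whether a target
# keyword actually appears in key on-page areas -- title, h1 and body.
from html.parser import HTMLParser

class KeywordAreas(HTMLParser):
    def __init__(self):
        super().__init__()
        self._stack = []                      # open tags, outermost first
        self.areas = {"title": "", "h1": "", "body": ""}

    def handle_starttag(self, tag, attrs):
        self._stack.append(tag)

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        # Accumulate text under each area of interest.
        if "title" in self._stack:
            self.areas["title"] += data
        if "h1" in self._stack:
            self.areas["h1"] += data
        if "body" in self._stack:
            self.areas["body"] += data

page = """<html><head><title>Van Insurance Quotes</title></head>
<body><h1>Cheap Van Insurance</h1>
<p>Compare van insurance quotes from leading UK insurers.</p>
</body></html>"""

parser = KeywordAreas()
parser.feed(page)
keyword = "van insurance"
found = {area: keyword in text.lower() for area, text in parser.areas.items()}
print(found)  # keyword present in title, h1 and body
```

If any of those areas came back False, that would be the first place to put the customer’s language.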

5. Measurement
Rankings are not the only measure of success! For many years,
SEO firms have measured everything on rankings. However, we
recommend measuring SEO the way you would a PPC (paid) search
campaign, using analytics for a more comprehensive measure of
success.

Here is a simple description of how to do that: Take what you
are spending on SEO and put it against traffic and conversions
to work out cost per unique browser, cost per click and cost per
conversion. It’s best to analyse these over a period of six to
twelve months to allow for any changes in SEO to come into
effect. This is because the major difference between SEO and PPC
is the implementation time – for SEO, the results will take
months, rather than days.
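
The arithmetic described above can be worked through in a few lines; all figures below are invented for illustration.

```python
# Worked example of the costing described above: spread SEO spend
# across traffic and conversions over the measurement window.
def seo_unit_costs(spend, unique_browsers, clicks, conversions):
    return {
        "cost_per_unique_browser": spend / unique_browsers,
        "cost_per_click": spend / clicks,
        "cost_per_conversion": spend / conversions,
    }

# e.g. 12,000 spent over six months, with 60,000 unique browsers,
# 48,000 clicks and 400 conversions attributed to organic search:
costs = seo_unit_costs(12_000, 60_000, 48_000, 400)
print(costs)
# {'cost_per_unique_browser': 0.2, 'cost_per_click': 0.25, 'cost_per_conversion': 30.0}
```

Running the same calculation against a PPC campaign’s spend and conversions gives a like-for-like comparison of the two channels.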

6. Moving Variables
There are so many moving variables in SEO that it would be
impossible to find one person who knows everything! But a good
SEO consultant is worth their weight in gold. Their value is not
necessarily in the implementation, but in tapping into their
experience to find the right implementation. One tiny piece of
advice from them which may take 10 minutes to explain could be
worth more than a copywriter producing numerous articles for
your site each month.

7. One Agency or Two?
Some agencies have two separate teams working on SEO and PPC.
Some marketers choose two completely different agencies to
handle their SEO and PPC campaigns.

However, the two are very closely related, and the results from
one can be useful to the other. For instance, the keyword data
from your PPC campaign can help with your SEO keyword research.
On the flip side, optimising pages for SEO will usually give
your PPC campaigns a better quality score. When PPC and SEO
listings are seen together on a search engine, they usually
increase the click-through and conversion rates for both
listings.

They go hand in hand, and each can have a positive effect on the
other if done well. And with one agency on both campaigns, they
will have a greater depth of experience with your business,
which can only help you to succeed.

8. Web Developers are not SEO Experts
Finally, a word on expertise. Most web developers say they are
experts in SEO (
search-engine-optimisation.htm). There is no doubting that many
of them do a reasonable job, but they are not truly specialists
in the area of SEO.

In the same way, I wouldn’t recommend that an SEO specialist
design your website. Web development and SEO are separate
specialist skills, both of which contribute to the success of
your business online.

9. Use of JavaScript
Those pesky robots that the major search engines rely on to rank
web pages have until recently imposed some limitations on web
development. Useful code such as JavaScript can make your
website really functional – a simple example is a loan
calculator, and many websites’ navigation and links – and thus
attractive to users, but the robots often couldn’t follow the
code properly, and so skipped over it. The major problem was
that, commonly, web developers didn’t know that JavaScript
wasn’t being read or followed by the robots.

That has changed recently, with Google updating its technology
so that the robots can read and follow JavaScript. When the
robots can follow a website’s navigation and links properly,
its SEO rankings benefit greatly.
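
A simple way to see the issue: parse the raw markup, without executing any scripts, and list the links a non-JavaScript crawler could follow. The example markup below is invented; the link written by `document.write` only exists after the script runs, so it never shows up.

```python
# Rough sketch of the point above: links written directly into the HTML
# are visible to any crawler, while links created only by JavaScript may
# be missed by robots that don't execute scripts.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

raw_html = """<nav>
  <a href="/loans">Loans</a>
  <a href="/calculator">Loan calculator</a>
</nav>
<script>
  // This only becomes a link when the script executes:
  document.write('<a href="/offers">Offers</a>');
</script>"""

extractor = LinkExtractor()
extractor.feed(raw_html)
print(extractor.links)  # only the two plain HTML links
```

The script’s contents are treated as raw text by the parser, so “/offers” is invisible – which is why script-generated navigation was historically risky for SEO.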

10. Flash
Potentially any Flash file can now be indexed, according to
Google, but it still depends on how that Flash site is
constructed. Generally older Flash sites are not seen in the
most effective way by the search engines, though it depends on
the practices of the Flash developer. Many older Flash sites
have overcome this problem by building an underlying version of
the site in HTML – though this method too has its drawbacks.

Flash sites need to be built like HTML sites, with multiple
files that optimise each keyword. If you are building a new
Flash site, be sure to consult with an SEO expert before the
developer starts on the build.

10 Do’s and Don’ts to Avoid SEO Mistakes

posted by Web_University @ 12:00 AM