
Saturday, April 23, 2011

Google's Cat & Mouse SEO Game

This infographic highlights how Google's cat and mouse approach to SEO has evolved over the past decade.

One of the best ways to understand where Google is headed is to look at where they have been and how they have changed.

Click on it for ginormous version.

Google's Collateral Damage Infographic.

If you would like us to make more of them then please spread this one. We listen to the market & invest in what it values ;)

Feel free to leave comments below if you have any suggestions or feedback on it :)


Is the Huffington Post Google's Favorite Content Farm?

I was looking for information about the nuclear reactor issue in Japan and am glad it did not turn out as bad as it first looked!

But in that process of searching for information I kept stumbling into garbage hollow websites. I was cautious not to click on the malware results, but of the mainstream sites covering the issue, one of the most flagrant efforts was from the Huffington Post.

AOL recently announced that they were firing 15% to 20% of their staff. No need for original stories or even staff writers when you can literally grab a third party tweet, wrap it in your site design, and rank it in Google. In line with that spirit, I took a screenshot. Rather than calling it the Huffington Post I decided a more fitting title would be plundering host. :D

plundering host.

We were told that the content farm update was to get rid of low quality web pages & yet that information-less page was ranking at the top of their search results, when it was nothing but a 3rd party tweet wrapped in brand and ads.

How does Huffington Post get away with that?

You can imagine in a hyperspace a bunch of points, some points are red, some points are green, and in others there’s some mixture. Your job is to find a plane which says that most things on this side of the plane are red, and most of the things on that side of the plane are the opposite of red. - Google's Amit Singhal

If you make it past Google's arbitrary line in the sand there is no limit to how much spamming and jamming you can do.

we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side. - Matt Cutts

(G)arbitrage never really goes away, it just becomes more corporate.

The problem with Google arbitrarily picking winners and losers is the winners will mass produce doorway pages. With much of the competition (including many of the original content creators) removed from the search results, this sort of activity is simply printing money.

As bad as that sounds, it is actually even worse than that. Today Google Alerts showed our brand being mentioned on a group-piracy website built around a subscription model of selling 3rd party content without permission! As annoying as that is, of course there are going to be some dirtbags along the way that you have to deal with from time to time. But now that the content farm update has gone through, some of the original content producers are no longer ranking for their own titles, whereas piracy sites that stole their content are now the canonical top ranked sources!

Google never used to put piracy sites on the first page of results for my books, this is a new feature on their part, and I think it goes a long way to show that their problem is cultural rather than technical. Google seems to have reached the conclusion that since many of their users are looking for pirated eBooks, quality search results means providing them with the best directory of copyright infringements available. And since Google streamlined their DMCA process with online forms, I couldn’t discover a method of telling them to remove a result like this from their search results, though I tried anyway.
... I feel like the guy who was walking across the street when Google dropped a 1000 pound bomb to take out a cockroach - Morris Rosenthal

Way to go Google! +1 +1

Too clever by half.

Google Panda Coming to a Market Near You

If you live outside the United States and were unscathed by the Panda Update, a world of hurt may await soon. Or you may be in for a pleasant surprise. It is hard to say where the chips may fall for you without looking.

Some people just had their businesses destroyed, whereas the Online Publisher Association sees a $1 billion windfall to the winning publishers.

Due to Google having multiple algorithms running right now, you can get a peek at the types of sites that were hit, and if your site is in English you can see if it would have been hit by comparing your Google.com rankings in the United States versus in foreign markets by using the Google AdWords ad preview tool.

In most foreign markets Google is not likely to be as aggressive with this type of algorithm as they are in the United States (because foreign ad markets are less liquid and there is less of a critical mass of content in some foreign markets), but I would be willing to bet that Google will be pretty aggressive with it in the UK when it rolls out.

The keywords where you will see the most significant ranking changes will be those where there is a lot of competition, as keywords with less competition generally do not have as many sites to replace them when they are whacked (since there were fewer people competing for the keyword). Another way to get a glimpse of the aggregate data is to look at your Google Analytics search traffic from the US and see how it has changed relative to seasonal norms. Here is a "look out below" example, highlighting how Google traffic dropped. ;)

What is worse is that on most impacted sites revenue declined faster than traffic, because search traffic monetizes so well & the US ad market is so much deeper than most foreign markets. Thus a site that had 50% profit margins might have just gone to breaking even or losing money after this update. :D

When Google updates the US content farmer algorithm again (likely soon, since it has already been over a month since the update happened) it will likely roll out around other large global markets, because Google does not like running (and maintaining) 2 sets of ranking algorithms for an extended period of time, as it is more cost intensive and it helps people reverse engineer the algorithm.

Some sites that get hit may be able to quickly bounce back *if* they own a well-read tech blog and have an appropriate in with Google engineers, however most will not unless they drastically change their strategy. Almost nobody has recovered and it has been over a month since the algorithm went live. So your best bet is to plan ahead. When the tide goes out you don't want to be swimming naked. :)


Friday, April 22, 2011

Doorway Pages Ranking in Google in 2011?

When Google did the Panda update they highlighted that not only did some "low quality" sites get hammered, but that some "high quality" sites got a boost. Matt Cutts said: "we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side."

Here is the problem with that sort of classification system: doorway pages.

The following Ikea page was ranking page 1 in the search results for a fairly competitive keyword.

Once you strip away the site's navigation there are literally only 20 words on that page. And the main body area "content" for that page is a link to a bizarre, confusing, and poorly functioning Flash tour which takes a while to load.

If you were trying to design the worst possible user experience & wanted to push the "minimum viable product" page into the search results then you really couldn't possibly do much worse than that Ikea page does (at least not without delivering malware and such).

I am not accusing Ikea of doing anything spammy. They just have terrible usability on that page. Their backlinks to that page are few in number & look just about as organic as they could possibly come. But not that long ago companies like JC Penney and Overstock were demoted by Google for building targeted deep links (that they needed in order to rank, but were allegedly harming search relevancy & Google user experience). Less than a month later Google arbitrarily changed their algorithm to where other branded sites simply didn't need many (or in some cases any) deep links to get in the game, even if their pages were pure crap. Google Handling Flash.

We are told the recent "content farm" update was to demote low quality content. If that is the case, then how does a skeleton of a page like that rank so high? How did that Ikea page go from ranking on the third page of Google's results to the first one? I think Google's classifier is flashing a new set of exploits for those who know what to look for.

A basic tip? If you see Google ranking an information-less page like that on a site you own, that might be a green light to see how far you can run with it. Give GoogleBot the "quality content" it seeks. Opportunity abound!


A Thought Experiment on Google Whitelisting Websites

Google has long maintained that "the algorithm" is what controls rankings, except for sites which are manually demoted for spamming, getting hacked, delivering spyware, and so on.

At the SMX conference it was revealed that Google uses whitelisting:

Google and Bing admitted publicly to having ‘exception lists’ for sites that were hit by algorithms that should not have been hit. Matt Cutts explained that there is no global whitelist but for some algorithms that have a negative impact on a site in Google’s search results, Google may make an exception for individual sites.

The idea that "sites rank where they deserve, with the exception of spammers" has long been pushed to help indemnify Google from potential anti-competitive behavior. Google's marketing has further leveraged the phrase "unique democratic nature of the web" to highlight how PageRank originally worked.

But why don't we conduct a thought experiment for the purpose of thinking through the differences between how Google behaves and how Google doesn't want to be perceived as behaving.

Let's cover the negative view first. The negative view is that either Google has a competing product, or a Google engineer dislikes you and goes out of his way to torch your stuff simply because he is holding onto a grudge. Given Google's current monopoly-level marketshare in most countries, such behavior would be seen as unacceptable if Google was just picking winners and losers based on their business interests.

The positive view is that "the algorithm handles almost everything, except some edge cases of spam." Let's break down that positive view a bit.

  • Off the start, consider that Google engineers write the algorithms with set goals and objectives in mind.
    • Google only launched universal search after Google bought Youtube. Coincidence? Not likely. If Google had rolled out universal search before buying Youtube then they likely would have increased the price of Youtube by 30% to 50%.
    • Likewise, Google trains some of their algorithms with human raters. Google seeds certain questions & desired goals in the minds of raters & then uses their input to help craft an algorithm that matches their goals. (This is like me telling you I can't say the number 3, but I can ask you to add 1 and 2 then repeat whatever you say :D)
  • At some point Google rolls out a brand-filter (or other arbitrary algorithm) which allows certain favored sites to rank based on criteria that other sites simply can not match. It allows some sites to rank with junk doorway pages while demoting other websites.
  • To try to compete with that, some sites are forced to either live in obscurity & consistently shed marketshare in their market, or be aggressive and operate outside the guidelines (at least in spirit, if not on a technical basis).
  • If the site operates outside the guidelines there is potential that they can go unpenalized, get a short-term slap on the wrist, or get a long-term hand issued penalty that can literally last for up to 3 years!
  • Now here is where it gets interesting...
    • Google can roll out an automated algorithm that is overly punitive and has a significant number of false positives.
    • Then Google can follow up by allowing nepotistic businesses & those that fit certain criteria to quickly rank again via whitelisting.
    • Sites doing the same things as the whitelisted sites might be crushed for it & get a cold shoulder upon review.

You can see that even though it is claimed "TheAlgorithm" handles almost everything, they can easily interject their personal biases to decide who ranks and who does not. "TheAlgorithm" is first and foremost a legal shield. Beyond that it is a marketing tool. Relevancy is likely third in line in terms of importance (how else could one explain the content farm issue getting so out of hand for so many years before Google did something about it?).


Quick & Dirty Competitive Research for Keywords

There are so many competitive research tools on the market. We reviewed some of the larger ones here but there are quite a few more on the market today.

The truth is that you can really get a lot of good, usable data to give you an idea of what the competition is likely to be by using free tools or the free version of paid tools.

Some of the competitive research tools out there (the paid ones) really are useful if you are going to scale way up with some of your SEO or PPC plans but many of the paid versions are overkill for a lot of webmasters.

Choosing Your Tools

Most tools come with the promises of “UNCOVERING YOUR COMPETITORS BEST _____".

That blank can be links, keywords, traffic sources, and so on. As we know, most competitive research tools are rough estimates at best and almost useless estimates at worst. Unless you get your hands on your competition’s analytics reports, you are still kind of best-guessing. In this example we are looking for the competitiveness of a core keyword.

Best-guessing really isn’t a bad thing so long as you realize that what you are doing is really triangulating data points and looking for patterns across different tools. Keep in mind many tools use Google’s data so you’ll want to try to reach beyond Google’s data points a bit and hit up places like:

The lure of competitive research is to get it done quickly and accurately. However, gauging the competition of a keyword or market can’t really be done with a push of the button as there are factors that come into play which a push-button tool cannot account for, such as:

  • how hard is the market to link build for?
  • is the vertical dominated by brands and thick EMD’s?
  • what is your available capital?
  • are the ranking sites knowledgeable about SEO or are they mostly ranking on brand authority/domain authority? (how tight is their site structure, how targeted is their content, etc)
  • is Google giving the competing sites a brand boost?
  • is Google integrating products, images, videos, local results, etc?

Other questions might be stuff like "how is Google Instant skewing this keyword marketplace" or "is Google firing a vertical search engine for these results (like local)" or "is Google placing 3 AdWords ads at the top of the search results" or "is Google making inroads into the market" like they are with mortgage rates.

People don't search in an abstract mathematical world, but by using their fingers and eyes. Looking at the search results matters. Quite a few variables come into play which require some human intuition and common sense. A research tool is only as good as the person using it; you have to know what you are looking at & what to be aware of.

Getting the Job Done

In this example I decided to use the following tools:

Yep, just 2 free tools.... :)

So we are stipulating that you’ve already selected a keyword. In this case I picked a generic keyword for the purposes of going through how to use the tools. Plug your keyword into Google, flip on SEO for Firefox and off you go!

This is actually a good example of where a push button tool might bite the dust. You’ve got Related Search breadcrumbs at the top, Images in the #1 spot, Shopping in the #3 spot, and News (not pictured) in the #5 spot.

So wherever you thought you might rank, just move yourself down 1-3 spots depending on where you would be in the SERPs. This can have a large effect on potential traffic and revenue so you'll want to evaluate the SERP prior to jumping in.

You might decide that you need to shoot for 1 or 2 rather than top 3 or top 5 given all the other stuff Google is integrating into this results page. Or you might decide that the top spot is locked up and the #2 position is your only opportunity, making the risk to reward ratio much less appealing.

With SEO for Firefox you can quickly see important metrics like:

  • Yahoo! links to domain/page
  • domain age
  • Open Site Explorer and Majestic SEO link data
  • presence in strong directories
  • potential, estimated traffic value from SEM Rush

Close up of SEO for Firefox data:

Basically by looking at the results page you can see what other pieces of universal search you’ll be competing with, whether the home page or a sub-page is ranking, and whether you are competing with brands and/or strong EMD’s.

With SEO for Firefox you’ll see all of the above plus the domain age, domain links, page links, listings in major directories, position in other search engines, and so on. This will give you a good idea of potential competitiveness of this keyword for free and in about 5 seconds.

It is typically better & easier to measure the few smaller sites that managed to rank rather than measuring the larger authoritative domains. Why? Well...

Checking Links

So now that you know how many links are pointing to that domain/page you’ll want to check how many unique domains are pointing in and what the anchor text looks like, in addition to what the quality of those links might be.

Due to its ease of use (in addition to the data being good) I like to use Open Site Explorer from SEOmoz in these cases of quick research. I will use their free service for this example, which requires no login, and they are even more generous with data when you register for a free account.

The first thing I do is head over to the anchor text distribution of the site or page to see if the site/page is attracting links specific to the keyword I am researching:

What’s great here is you can see the top 5 instances of anchor text usage, how many total links are using that term, and how many unique domains are supplying those total links.

You can also see data relative to the potential quality of the entire link profile in addition to the ratio of total/unique domains linking in.

You probably won’t want or need to do this for every single keyword you decide to pursue. However, when looking at a new market, a potential core keyword, or if you are considering buying an exact match domain for a specific keyword you can accomplish a really good amount of competitive research on that keyword by using a couple free tools.

Types of Competitive Research

Competitive research is a broad term and can go in a bunch of different directions. As an example, when first entering a market you would likely start with some keyword research and move into analyzing the competition of those keywords before you decide to enter or fully enter the market.

As you move into bigger markets and start to do more enterprise-level competitive research specific to a domain, link profiles, or a broader market you might move into some paid tools.

Analysis paralysis is a major issue in SEO. Many times you might find that those enterprise-level tools really are overkill for what you might be trying to do initially. Gauging the competitiveness of a huge keyword or a lower volume keyword really doesn’t change based on the money you throw at a tool. The data is the data especially when you narrow down the research to a keyword, keywords, or domains.

Get the Data, Make a Decision

So with the tools we used here you are getting many of the key data points you need to decide whether pursuing the keyword or keywords you have chosen is right for you.

Some things the tools cannot tell you are questions we talked about before:

  • how much capital can you allocate to the project?
  • how hard are you willing to work?
  • do you have a network of contacts you can lean on for advice and assistance?
  • do you have enough patience to see the project through, especially if ranking will take a bit... can you wait on the revenue?
  • is creativity lacking in the market and can you fill that void or at least be better than what’s out there?

Only you can answer those questions :)


Google Update Panda

Google tries to wrestle back index update naming from the pundits, naming the update "Panda". Named after one of their engineers, apparently.

The official Google line - and I'm paraphrasing here - is this:

Trust us. We're putting the bad guys on one side, and the good guys on the other

I like how Wired didn't let them off the hook.

Wired persisted:

Wired.com: Some people say you should be transparent, to prove that you aren’t making those algorithms to help your advertisers, something I know that you will deny.

Singhal: I can say categorically that money does not impact our decisions.

Wired.com: But people want the proof.

This answer, from Matt Cutts, was interesting:

Cutts: If someone has a specific question about, for example, why a site dropped, I think it’s fair and justifiable and defensible to tell them why that site dropped. But for example, our most recent algorithm does contain signals that can be gamed. If that one were 100 percent transparent, the bad guys would know how to optimize their way back into the rankings

Why Not Just Tell Us What You Want, Already!

Blekko makes a big deal about being transparent and open, but Google have always been secretive. After all, if Google want us to produce quality documents their users like and trust, then why not just tell us exactly what a quality document their users like and trust looks like?

Trouble is, Google's algorithms clearly aren't that bulletproof, as Google admit they can still be gamed, hence the secrecy. Matt says he would like to think there would be a time they could open source the algorithms, but it's clear that time isn't now.

Do We Know Anything New?

So, what are we to conclude?

  • Google can be gamed. We kinda knew that....
  • Google still aren't telling us much. No change there....

Then again, there's this:

Google have filed a patent that sounds very similar to what Demand Media does, i.e. it looks for SERP areas that are under-served by content and prompts writers to write for them.

The patent basically covers a system for identifying search queries which have low quality content and then asking either publishers or the people searching for that topic to create some better content themselves. The system takes into account the volume of searches when looking at the quality of the content so for bigger keywords the content would need to be better in order for Google to not need to suggest somebody else writes something

If Google do implement technology based on this patent, then it would appear they aren't down on the "Content Farm" model. They may even integrate it themselves.

Until then....

How To Avoid Getting Labelled A Content Farmer

The question remains: how do you prevent being labelled as a low-quality publisher, especially when sites like eHow remain untouched, yet Cult Of Mac gets taken out? Note: Cult Of Mac appears to have been reinstated, but one wonders if that was the result of the media attention, or an algo tweak.

Google want content their users find useful. As always, they're cagey about what "useful" means, so those who want to publish content, and want to rank well, but do not want to be confused with a content farm, are left to guess. And do a little reverse-engineering.

Here's a stab, based on our investigations, the conference scene, Google's rhetoric, and pure conjecture thus far:

  • A useful document will pass a human inspection
  • A useful document is not ad heavy
  • A useful document is well linked externally
  • A useful document is not a copy of another document
  • A useful document is typically created by a brand or an entity which has a distribution channel outside of the search channel
  • A useful document does not have a 100% bounce rate followed by a click on a different search result for that same search query ;)

Kinda obvious. Are we off-base here? Something else? What is the difference, as far as the algo is concerned, between eHow and Suite101? Usage patterns?

Still doesn't explain YouTube, though, which brings us back to:

Wired.com: But people want the proof

YouTube, the domain, is incredibly useful, but some pages - not so much. Did YouTube get hammered by update Panda, too?

Many would say that's unlikely.

I guess "who you know" helps.

In the Panda update some websites got owned. Others are owned and operated by Google. :D


How Google Destroyed the Value of Google Site Search

Do You Really Want That Indexed?

On-demand indexing was a great value added feature for Google site search, but now it carries more risks than ever. Why? Google decides how many documents make their primary index. And if too many of your documents are arbitrarily considered "low quality" then you get hit with a sitewide penalty. You did nothing but decide to trust Google & use Google products. In response Google goes out of its way to destroy your business. Awesome!

Keep in mind that Google was directly responsible for the creation of AdSense farms. And rather than addressing them directly, Google had to roll everything through an arbitrary algorithmic approach.

<meta name="googlebot" content="noindex" />

Part of the prescribed solution to the Panda Update is to noindex content that Google deems to be of low quality. But if you are telling GoogleBot to noindex some of your content while also using Google for site search, you destroy the usability of their site search feature by making that content effectively invisible to your customers. For Google Site Search customers this algorithmic change is even more value destructive than the arbitrary price jack Google Site Search recently pulled.

We currently use Google Site Search on our site here, but given Google's arbitrary switcheroo styled stuff, I would be the first person to dump it if they hit our site with their stupid "low quality" stuff that somehow missed eHow & sites which wrap repurposed tweets in a page. :D

Cloaking vs rel=noindex, rel=canonical, etc. etc. etc.

Google tells us that cloaking is bad & that we should build our sites for users instead of search engines, but now Google's algorithms are so complex that you literally have to break some of Google's products to be able to work with other Google products. How stupid! But a healthy reminder for those considering deeply integrating Google into your on-site customer experience. Who knows when their model will arbitrarily change again? But we do know that when it does they won't warn partners in advance. ;)

I could be wrong in the above, but if I am, it is not easy to find any helpful Google documentation. There is no site-search bot on their list of crawlers, questions about if they share the same user agent have gone unanswered, and even a blog post like this probably won't get a response.

That is just one more layer of hypocrisy, in which Google states that if you don't provide great customer service then your business is awful, while going to the dentist is more fun than trying to get any customer service from Google. :D

I was talking to a friend about this stuff and I think he summed it up perfectly: "The layers of complexity make everyone a spammer since they ultimately conflict, giving them the ability to boot anyone at will."


Tracking Offline Conversions for Local SEO

We have certainly seen a trend over the last one to two years where Google is focusing on more personalized search and placing an increasing emphasis on providing local results. As you know, a searcher does not even have to be burdened with entering a local modifier anymore.

Google will gladly figure out, for you, whether or not your search has local intent. :)

Google's Investment into Local

Late last year Google moved one of their prized executives, Marissa Mayer, over to local services. Moving Mayer, fresh off Google Instant and a variety of other high profile areas of Google's search development, to head up local strongly reinforces how much attention Google is putting on local and local result quality (or perceived quality).

If you are a business owner who operates locally, say a real estate agent or insurance agent or really any other consumer-based service, then this presents a huge opportunity for you if you can harness the targeting and tracking ability available online.

Merging Offline Marketing with Online Marketing

A lot of small businesses or larger businesses that operate locally still rely quite a bit on offline advertising. It used to be that business owners had to rely on staff nailing down exactly how a lead came to them (newspaper ad? radio ad? special discount ad? and so on).

While it is still good practice to do that, relying solely on that to help gauge the ROI of your advertising campaign introduces a good amount of slippage and is not all that accurate (especially if you sell something online).

As local businesses start to see the light with SEO and PPC campaigns versus dropping 5 figures on phonebook advertising, a big selling point as a service provider or an in-house marketing staff member will be to sell the targeting of online campaigns as well as the tracking of those results.

If you're a business owner, it's equally important that you understand what's available to you as an online marketer.

Types of Offline Advertising to Track

Locally, you are essentially looking at a few different types of advertising options to work into your newfound zest for tracking results:

  • Radio
  • Television
  • Print
  • Billboards

Print is probably the most wide-ranging in terms of branches of advertising collateral because you can get into newspapers, magazines, flyers, brochures, banners, yellow pages, and so on.

While your approach may be different to each marketing type, the core tracking options are basically the same. You can track in your analytics program via:

  • Separate Domains
  • Custom URL's
  • Custom Phone Numbers

The beauty of web analytics, specifically a free service like Google Analytics, is that it puts the power of tracking into the hands of a business owner at no cost outside of perhaps a custom set up and implementation by a competent webmaster. All of these tracking methods can be tracked in Google Analytics as well as other robust analytics packages (Clicky.com, as an example, is a reasonably priced product which can do this as well, save for maybe the phone tracking).

Structuring Your Campaigns

With the amount of offline advertising many businesses do, it is easy to get carried away with separate domains, custom URL's, custom phone numbers, and the like.

What I usually like to do is use a good old fashioned spreadsheet to track the specific advertisements that are running, the dates they are running, and the advertising medium they are using. I also include a column or three for the tracking method(s) used (custom URL, separate domain, special phone number).

In addition to this, Google Analytics offers annotations which you can use to note those advertising dates in your traffic graph area to help get an even better idea of the net traffic effect of a particular ad campaign.

How to Track It

Armed with your spreadsheet of ads to track and notes on how you are going to track them, you're ready to set up the technical side of things.

The tracking is designed to track the hits on your site via the methods mentioned; once visitors get there you'll want to get that traffic assigned to a campaign or a conversion funnel to determine how many of those people actually convert (if you are able to sell or convert the visitor online).

Custom URL's

A custom URL is going to be something like:

yoursite.com/save20 for an advert where you are offering 20% savings
yoursite.com/summer for an advert where you are offering a summer special

You may or may not want to use redirection. You can use a redirect method if you are using something like a static site versus a CMS like WordPress. With WordPress, you could create those URLs as specific pages and just noindex them and ensure they are not linked to internally, so you keep them out of the search engine and out of the normal flow of navigation. This way you know any visit to that page is clearly related to that offline campaign.
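As a rough sketch (the exact markup depends on your theme or SEO plugin, and the page title is purely illustrative), the head of a campaign landing page like the yoursite.com/save20 example above might include:

<!-- hypothetical campaign landing page at yoursite.com/save20 -->
<head>
<title>Save 20% | Your Site</title>
<!-- keep this page out of the index so visits can only come from the offline ad -->
<meta name="robots" content="noindex, follow">
</head>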

A redirect would be helpful where the above is not possible and you need to use Google's URL builder to help track the campaign and not lose referral parameters on the 301.

So you could use the URL builder to get the following parameters if you were promoting a custom URL like yoursite.com/save20:

http://www.yoursite.com/savings.php?utm_source=save20&utm_medium=mail&utm_campaign=bigsave

Then you can head into your .htaccess file (Apache) and insert this code:

(the RewriteRule should be contained on 1 line in your .htaccess file, with mod_rewrite enabled)

RewriteEngine On
RewriteRule ^save20$ /savings.php?utm_source=save20&utm_medium=mail&utm_campaign=bigsave [L,R=301]

When you test, you should see those URL builder parameters on the landing page and then you know you are good to go :)

If you are worried about multiple duplicate pages getting indexed in the search results (with slightly different tracking codes) you can also leverage the rel=canonical tag on your landing page.
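For example, a minimal sketch (assuming the canonical version of the landing page lives at yoursite.com/savings.php, as in the URL builder example above) would be to add this to the landing page's head:

<!-- point all tagged variations of the URL back at the canonical landing page -->
<link rel="canonical" href="http://www.yoursite.com/savings.php" />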

Separate Domains

Some companies use separate domains to track different campaigns. The idea is the same, as is the basic code implementation, with the exception that you apply the redirect to the domain as a whole rather than to a sub-page or directory off the domain as we did in the prior example.

So say you sell snapping turtles (snappingturtles.com), and maybe you also sell turtle insurance, so you buy turtleinsurance.com and want to use that as part of a large campaign to promote this new and innovative product. You could get this from the URL builder:

http://www.snappingturtles.com/?utm_source=national&utm_medium=all&utm_campaign=turtleinsurance

The .htaccess on turtleinsurance.com would look like:

(the RewriteRule should be contained on 1 line in your .htaccess file, with mod_rewrite enabled)

RewriteEngine On
RewriteRule .* http://www.snappingturtles.com/?utm_source=national&utm_medium=all&utm_campaign=turtleinsurance [L,R=301]

This would redirect you to the home page of your main site and you can update your .htaccess with a sub-page if you had such a page catering to that specific market.

Custom Phone Numbers

There are quite a few ways to get cheap virtual numbers these days and Phone.com is a reliable service where you can get a number for roughly $4.88 per month.

I know companies that implemented custom numbers for a bunch of print ads and it was pretty eye-opening in terms of which ads performed better than others and how much money was wasted on untargeted print campaigns.

There certainly is a somewhat intangible brand equity building component to offline ads, but it is still interesting to see which ads carry their weight with traffic and response rates, and the data is really helpful when it comes time to reshape the budget.

Here are a couple of handfuls of providers which offer phone tracking inside of Google Analytics. Most of these providers will require the purchase of a number from them to tie into a specific URL on your site or right into the domain, and they help track those calls alongside the pageviews generated.

Some campaigns are wide-ranging enough that you may want to target them with a custom number or two and a custom URL or domain. Using a spreadsheet to track these measures along with using Google Analytics annotations to gauge traffic spikes and drops offers business owners a deep view into the use of their marketing dollars.

Custom Coupon Codes

If you run a coupon code through Groupon you of course know where it came from. But other channels are also becoming easier to track. Microsoft Office makes it easy to create & track custom coupon codes. There are even technologies to allow you to insert tracking details directly into coupon codes on your own website (similar to online tracking phone numbers via services like IfByPhone or Google's call tracking). Some online coupons offer sophisticated tracking options, and Google wants to get into mobile payments to offer another layer of customer tracking (including coupons).

Finding a Reputable Provider

If you are a business owner who thinks "wow this is awesome, how the heck do I do it?", well here is some advice. If the field of web analytics is mostly foreign to you I would suggest finding a certified Google Analytics provider or ask if your current web company can do this for you. Certainly there are plenty of competent people and companies that are not part of the Google Analytics partner program.

If you are interested in a Google Analytics partner you can search for them here. There is also quite a bit of information in the self-education section of Google Analytics.

I would recommend learning how to do this over a period of time so you can make minor or major changes yourself at some point. Also, it helps to establish a business relationship with someone competent and trustworthy for future tasks that may come up, which you cannot do on your own.

If you are a service provider, start implementing this for some of your local clients and you'll likely be well on your way to establishing yourself as a sought-after marketer in your area.


Thursday, April 21, 2011

SEO For Designers, Developers & Managers

SEO on your own site is straightforward, at least in terms of the politics. SEO'ing a site that a team works on is another matter.

You will come up against barriers. These barriers are often put up by designers, developers, copywriters and management. Frustrating as it is for the SEO, this is the reality of working on a site alongside other people, all of whom have agendas and requirements that may differ markedly from your own.

So how do you navigate this space? How do you ensure your SEO objectives can be met when other people may be resistant to change, or openly try to block you? In this post, we'll take a high-level, conceptual look at the challenges the SEO faces when working on a client site, along with talking points to help explore and clarify concepts.

1. Why Are We Doing SEO At All?

SEO is a pain.

It's complicated. It gets in the way, particularly when it comes to design. Why do we need headings and a lot of text when a picture tells a story? SEO appears to be an arbitrary, dark art with little in the way of fixed rules, and the client probably doesn't care about it anyway.

The thing is - if SEO is done well, a client may throw a whole lot more money at the site in future. Everyone likes to build on success, and that means more business, and more exposure, for everyone involved. On the internet, traffic = success. Traffic = money. A site that few people see, no matter how well executed, will likely fail, just like a site that fails to engage and convert visitors will fail. The client may not know they want SEO now, but you can be certain they'll be asking questions about it after launch.

If SEO is done poorly, the site may not be seen by as many people as it otherwise would. What use is a beautiful design that is seldom seen? What use is great code that is seldom used?

The value proposition of SEO is that it helps get a site seen. It's a powerful marketing channel, because most people use search engines to navigate the web. Sites that deliver what the search engines want stand to gain a lot more traffic than sites that do not undertake SEO. If your competitors are undertaking SEO, this puts your work at a competitive disadvantage. Their site will be seen more often by search visitors. Their web agencies will likely get more business as clients see greater returns on their investment.

That's why we do SEO. To be seen.

Of course, a site can be seen by other means. Word-of-mouth, social media, links, brand awareness, and offline advertising. A site doesn't need SEO, but given that it is a relatively easy win in terms of cheap traffic acquisition, the extra effort involved is negligible compared to the upside benefits. It's like being given a choice of having a shop located on main street vs a location way out in the desert. Much the same effort involved in building, but significantly different traffic potential.

2. SEO Is A Design Element

Just as copywriters require space to insert paragraphs and headings, SEO's require space to do their thing.

If you're a designer, an SEO will likely provide you with a list of their requirements. These requirements need not be onerous, any more so than leaving space for copy is considered onerous.

There are two key aspects where SEO needs to integrate with design. One aspect is the requirement for machine readable text, provided in a format the search engines are able to read, and derive meaning from. Search engines "think" mostly in terms of words, not pictures. Make design allowances for copy that includes a lot of headings and sub-headings, a technique which also dovetails nicely with usability.
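As a simple sketch (the page topic and copy are purely illustrative), the kind of machine readable structure being asked for looks like this:

<!-- hypothetical product page: headings give search engines words to derive meaning from -->
<h1>Red Widgets</h1>
<p>Introductory copy about red widgets...</p>
<h2>Why Choose a Red Widget?</h2>
<p>Supporting copy...</p>
<h2>Red Widget Specifications</h2>
<p>More supporting copy...</p>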

The other key aspect is crawl-ability. A search engine sends out a spider, a piece of code that grabs the source code of your website, and dumps it back in a database. It skips from page to page, following links. If a page doesn't have a link to it, or no crawlable link to it, it is invisible to the search engines. There are various means of making a site easy to crawl, but one straightforward way is to use a site map, linked to from each page on the site. Similarly, you should ensure your site navigation is crawlable, which means using standard hyperlinks, as opposed to scripted/executable links. If you must use scripted links, try and replicate the navigation elsewhere on the page in non-scripted form, or within the body of the text.
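For illustration only (the URLs and the loadPage handler are made up), the difference between a crawlable link and a scripted link looks something like this:

<!-- crawlable: a standard hyperlink a spider can follow -->
<a href="/widgets/red-widgets.htm">Red Widgets</a>

<!-- hard to crawl: navigation hidden behind script, with no real URL to follow -->
<a href="#" onclick="loadPage('red-widgets'); return false;">Red Widgets</a>

<!-- a plain site map link on every page gives the spider a fallback path -->
<a href="/sitemap.htm">Site Map</a>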

For most sites, that's pretty much it when it comes to design considerations. In summary, the inclusion of machine readable text, and a means for a spider to crawl easily from page to page.

An SEO may also wish to specify a page hierarchy and structural issues, where some pages are given more prominent positions than others. Of course, this needs to be weighed against navigation considerations for visitors who arrive at the site via other means.

3. SEO For Developers

Like design, there are two key areas of integration.

One is tagging. SEO's will want to specify title tags, and some meta tags. These need to be unique for each page on the site, as each page is an entry page as far as a search engine is concerned. A search visitor will not necessarily arrive at the home page first.

The title tag appears in search results as a clickable link, so it serves a valuable marketing function. When search visitors consider which link to click, they'll use the title tag and snippet to influence their decision.

The second aspect concerns URL's. Ideally, a URL should contain descriptive words, as opposed to numbers and random letters. For example, acme.com/widgets/red-widgets.htm is good, whilst acme.com/w/12345678&tnr.php is less so. The more often the keyword appears, the more likely it will be "bolded" on a search results page, and is therefore more likely to attract a click. It's also easier for the search engine to determine meaning if a URL is descriptive as opposed to cryptic. For an in-depth look at technical considerations, see "SEO For Designers".
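As a rough sketch of what an SEO might hand over for that example page (the wording is purely illustrative):

<!-- unique head tags for acme.com/widgets/red-widgets.htm -->
<head>
<title>Red Widgets | Acme Widgets</title>
<meta name="description" content="Browse Acme's range of red widgets, with specifications, pricing and delivery options.">
</head>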

One workaround, if the database needs unique codes, is to translate at the URL level, using URL rewriting.

4. SEO Is A Marketing Strategy

The on-page requirements, as dealt with above, are half the picture.

In order to rank well, a page needs to have links from external sites. The higher quality those sites, the more chances your pages have of ranking well. The SEO will look to identify linking possibilities, and point these links to various internal pages on the site.

It can be difficult, near impossible, to get high quality links to brochure-style advertising pages. Links tend to be directed at pages that have reference value. This is a strategic decision that needs to be weighed during site conception. Obviously, few sites strive to be, or want to be, Wikipedia, however there are various ways to incorporate reference information into commercial sites where the primary purpose of the site is not the publication of reference information.

For example, include a blog, a news feed, publish the e-mail newsletter to the site, and/or incorporate a reference section within the site. It doesn't matter if this section isn't viewed by visitors who navigate directly to the site. It provides a means to get a lot of information-rich content into the site without disrupting design and other commercial imperatives. Think of it as a "mini-site" within a site.

Not every page needs to be for the purposes of SEO. SEO can be sectioned off, although this is often less ideal than more holistic integration throughout the site.

5. Strategic Factors For Managers

Concept, design and development can screw up SEO.

Poor integration can result in loss of potential traffic. This traffic will go to competitors. The longer a site doesn't use an SEO strategy, the harder it is to ever catch the competition, as a head-start in link building is difficult to counter.

If your aim, or your client's aim, is to attract as much targeted traffic as possible - as most site owners do - then SEO integration must be taken as seriously as design, development, copy and other media. It may influence your choice of CMS. It may influence your strategic approach in terms of how and what type of information you publish.

Whilst SEO can be bolted-on afterwards, this is a costly and less-effective way of doing SEO, much like re-designing a site is costly and less effective than getting it right in the planning stage. If SEO is integrated in the planning stage, it is reasonably straightforward.

The time to incorporate SEO is during site conception. SEO is a text publishing strategy. Design and development will need to make minor changes to the way they approach a site build. Doing this retrospectively, whilst not impossible, is more difficult, and therefore more costly.

Coda: Flash Workarounds For SEO

There are various workarounds to existing search-unfriendly design, but I'd advise to avoid the problem in the first place.

Flash, whilst a useful tool for embedding within sites, should be avoided for the entire site. Flash is a graphics/animation format, whereas search - and the web in general - is primarily a text format. If you build an entire site using Flash, then your competitors will overtake you in terms of search visitors. The formats simply do not gel.

One workaround is strategic - split the site in two. Use Flash as a brochure site, and create a hub site that is text based. Consider including a "printable" version of the site, which will give the search engines some text to digest. Whilst there are technical and strategic ways around Flash, they are often clumsy and tedious.

The search engines can make sense of most sites, but if you're expecting to get rewarded by search engines, then it pays to stick as close to their technological strengths and weaknesses as possible.


Love & Farming vs Exploitation: Who's Winning?

Value Systems

Many broken belief systems persist because of a misinformed, naively idealistic understanding of how the world works, with various special interests paying to syndicate misinformation that coincides with their current business model in order to foster culturally constructed ignorance - agnotology.

It is not a bubble. This time is different. The internet changes everything

And then of course we had "Real estate always goes up!"

Who was behind that lie? The bankers, the mortgage brokers, the Realtors, bond raters, hedge funds, construction companies, media running real estate ads, local government tax revenues, current home owners who kept seeing their "savings" go up while doing nothing. Some of those people did not intentionally aim to be deceitful, they just believed a convenient lie that fit with their worldview.

"It is difficult to get a man to understand something when his job depends on not understanding it" - Upton Sinclair.

Fraud vs the Stuff Bankers do

And so the bubble grew until one day the fraud was so integrated into society that there was simply nobody left to sell to.

Then the bottom fell out.

The rule of law was SELECTIVELY & arbitrarily enforced against a few, even while companies that made sworn statements admitting to doing literally millions of times more damage were not penalized, but rather bailed out / promoted.

“In mid-2006, I discovered that over 60 percent of these mortgages purchased and sold were defective,” [Citibank's] Bowen testified on April 7 before the Financial Crisis Inquiry Commission created by Congress. “Defective mortgages increased during 2007 to over 80 percent of production.”

The rule of law only applies to those who lack the resources needed to subvert it. Socialism for the rich, capitalism for the rest!

Wachovia was a strong brand. A true pioneer and market leader in the drug money game, which funneled over 1/3 TRILLION Dollars into the hands of drug dealers. For the crime they got a slap on the wrist. There were no bonus clawbacks. There was no jailtime. There was no honest attempt at the rule of law.

Online, Just Like Offline

The same is true online. Those who exhibit desirable characteristics are promoted & those who do not fit such a frame are left to fight amongst other losers in the market.

"we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side." - Matt Cutts

Create an Itch

For a marketer, saying that what is old and steady and boring is effective is not a way to be perceived as relevant (that old coot is still stuck in '97!).

Being grounded is not a way to get positive headlines. Saying that the web is becoming just like the fraud-laced offline world would be considered in poor taste. You have to sell something new... to try to push people to inspire, achieve, gain hope, etc.

If you manufacture evidence that your LinkedIn votes are directly tied to better Google rankings then outsiders who are unaware of the workings of your industry may syndicate that misinformation. Even if you run a public experiment that fails it still shows you are trying new things (are cutting edge), and is a low cost branding exercise. Just like how MLM folks say you can get rich by using the same system they used to get rich. Everyone wants to sell a life worth living, even if they are not living that life, but rather sentenced to life in prison.

Pushing the Boundaries

Most profitable belief systems sell into an existing worldview but with a new hook on it. Most new marketing approaches are all about pushing the boundaries of what exists, probing to find the edges. Some people do it on the legal front, others probe on the ethical front, and yet others are just more creative & try to win by using technologies in unique ways. If you never fall off a cliff and never have any hate spewed your way then you are likely a bland marketer who hasn't done very much.

Google is in the press almost every week, aiming to stretch the boundaries on trademark, copyright, privacy, and so on.

I was lucky enough to chat with Larry one-to-one about his expectations for Google back in 2002. He laid out far-reaching views that had nothing to do with short-term revenue goals, but raised questions about how Google would anticipate the day sensors and memory became so cheap that individuals would record every moment of their lives. He wondered how Google could become like a better version of the RIAA - not just a mediator of digital music licensing - but a marketplace for fair distribution of all forms of digitized content.

Google is seen as an amazing company that does a limitless amount of good for the world. Yet they are up for anti-trust review and carry ad categories for "get rich quick." Google massages how they are viewed. Anytime something bad happens to their brand you can count on a new invention or an in-depth story of a rogue spammer getting torched by "justice."

For a company that is so good at manipulating outside rules & guidelines, they really lean hard on the arbitrary guidelines they create.

And they are willing to buy websites that violate their own guidelines. And they are not against running custom advertorials.

Foundational Marketing vs Public Relations Spin

The web is constantly shifting. Mailing lists, email newsletters, blogs, wikis, Facebook, Twitter, Color, etc. Most of the core infrastructural stuff is boring. But it is essential. If you don't understand email marketing or newsletters you can't create Groupon.

It is the new stuff with some sort of twist that earns the ink, which drives the perceived value, which earns the ink, which builds the actual value. But most people can't tell the difference between real innovation and public relations fluff. And so after a series of failures and burning millions of Dollars of capital it is time to pivot again. Anything to be seen as new and/or relevant.

If you manufacture evidence that your new strategy is better than Google then outsiders who are unaware of the workings of your industry may syndicate that misinformation. Even if you run a public experiment that fails it still shows you are trying new things (are cutting edge), and is a low cost branding exercise.

Sounds familiar, right?

History keeps repeating itself.

Algorithmic Fallout vs Spam

The perfect algorithm is something that does not exist.

Every choice has winners and losers. No matter what happens to the network & how the algorithms evolve people will find ways to exploit them. Many of Google's biggest holes were caused by Google patching old holes.

Which is precisely why Google leans so hard on public relations & shaping market behavior.

It is not the fault of the search engineer when something goes astray, but rather an evil exploitative spammer (even when Google's AdSense is the revenue engine driving the project).

Clean Your ____ Up!

Thinking back to the content farm update (which was never called the content farm update, because it impacted a wide array of websites) the main message that came out of it is that "Google can determine content quality" and "you better increase your content quality." Webmasters who heard that message were stuck in a tough situation if they had hundreds of thousands or millions of pages indexed in Google. How exactly do you *profitably* increase the quality of millions of pages, even while your site is torched to the ground, revenues are off over 50%, and the timetable + certainty for the solution are both unknowns? In many cases it would be cheaper to start from scratch than to repair the broken mess & deal with all the known unknowns.

Based on Google's advice many webmasters decided that as part of their strategy they would improve the quality of some of their best pages & then have a look at some of their worst content sections and try to block and/or remove them from Google. That sounds pretty logical! In response to that overly-logical approach to problem solving, Matt Cutts wrote the following:

What I would not recommend is sending tons (as in, thousands or even tens of thousands) of individual url removal requests to the url removal tool. And I would definitely not recommend making lots (as in, dozens or even more) of Webmaster Central accounts just to remove your own urls. If we see that happening to a point that we consider excessive or abusive, we reserve the right to look at those requests and responding by e.g. broadening, cancelling, or narrowing the requests.

So here you are trying to comply with Google's latest algorithmic approach (after they already torched your website once) and they have to give you another "or else."

Why The SEO Consultant Will NEVER Go Away

It would be nice to know what pages Google thinks are of low quality, but they don't say. It would be nice to know what pages are indexed in Google, but even official data given in Google Webmaster Tools varies widely over time, let alone the data which is shared publicly.

Further, some sites, like forums, are hard to edit to please Google without potentially destroying the flow of the community and enraging its members. Should sites have to delete or de-index their water cooler area because of Google?

What about the pages that GoogleBot arbitrarily creates by putting keywords into search boxes and generating pages that the site owner may not even know are indexed?

The reason so many webmasters are forced to rely on external search advice is that Google's desire to not be manipulated is so strong that they frequently appear dishonest & not worthy of trust. They speak vaguely, distort, and change the numbers as needed to fit the vision. Saying "in an ideal world" doesn't make that ideal world appear. And people don't trust folks like Donald Rumsfeld - at least smart people don't.

And that is why the SEO market will never die.

Corporatocracy

As for the web, it is still teething. We are most alike in the areas where we are vulgar & we are most unique in the areas in which we are refined. Ultimately what happens as Google becomes more corporate is that Google becomes a boring shopping mall.

The search world loses love & farmers. But unfortunately it was the wrong kind of farmers, as eHow lives on.

Categories: 

Google Penalized BeatThatQuote.com

Shortly after we found out that Google was to purchase Beat That Quote we highlighted how Google purchased a website that was deeply engaged in numerous nefarious black hat SEO practices. A friend just pinged me to confirm that Google has penalized the domain by removing it from the search results.

From a competition & market regulation perspective that was a smart move for Google. They couldn't leave it in the search results while justifying handing out penalties to any of its competitors. As an added bonus, the site is building up tons of authoritative links in the background from all the buzz about being bought by Google. Thus when Google allows it to rank again in 30 days it will rank better than ever.

Based on their web dominance, which generates such widespread media buzz, Google adds millions of pounds worth of inbound links to any website they buy.

The message Google sends to the market with this purchase is that you should push to get the attention of Google's hungry biz dev folks before you get scrutiny from their search quality team. After the payday the penalty is irrelevant because you already have cash in hand & the added links from the press mentioning the transaction will more than offset any spam links you remove. Month 1 revenues might be slightly lower, but months 2 through x will be far higher.

Using PPC for Local SEO

We are all aware of the importance Google has been placing on local search over the last couple of years; we touched on it in a recent blog post.

Google also has some interesting statistics on small businesses and local search (20% of searches have local intent).

For a local advertiser, starting an SEO campaign in your local market is typically built on the strength of your keyword research. Say you are an insurance agent: do more people use "car" or "auto" when searching for auto insurance? Do people search "city/town keyword", "city/town, state, keyword", or "zip code keyword"?

Some of these questions can be answered using a tool like Google Trends. Here you can see the results for "Texas Doctor" versus "TX Doctor":

So here you can see that it's pretty close, and volume is pretty close in Google's keyword tool as well:

However, when you get into phrase match the volumes separate a bit:
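If you prefer to pull these comparisons programmatically rather than eyeballing screenshots, here is a minimal sketch using the unofficial pytrends library (a third-party Google Trends wrapper, not an official Google API; the exact method names and parameters may differ between versions, so treat this as an illustration rather than a recipe):

    # A rough sketch of comparing two keyword variations via Google Trends data,
    # using the unofficial pytrends library (pip install pytrends).
    from pytrends.request import TrendReq

    pytrends = TrendReq(hl='en-US')

    # The two phrase variations we want to compare, as in the example above.
    keywords = ["Texas Doctor", "TX Doctor"]
    pytrends.build_payload(kw_list=keywords, timeframe='today 12-m', geo='US')

    # interest_over_time() returns a DataFrame of relative (0-100) interest scores.
    trends = pytrends.interest_over_time()

    # Compare average relative interest for each variation.
    for kw in keywords:
        if kw in trends.columns:
            print(f"{kw}: average relative interest {trends[kw].mean():.1f}")

Keep in mind these are relative interest scores, not absolute search volumes, so they are best used to confirm which variation is favored rather than to size the market.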

Overcoming Keyword Tool Volume Concerns

The other thing you'll want to keep in mind is that sometimes these tools can be off on volume, sometimes by a lot and sometimes not so much. How do you solve this? You can run a PPC campaign to test a few things like:

  • actual search volume of your chosen keywords
  • conversion rates on keywords
  • additional keywords that trigger your ads via the Search Term Report in AdWords

The beauty of starting your campaign with PPC is that you can keep it running if it's profitable for you, rather than treating it as just a proving ground for keywords, while you discover keywords and keyword groups that are profitable and have enough volume to make an investment in SEO worthwhile.

Local search, by definition (since it is roughly a quarter of the search market), is on the lower end of the volume pole, but in comparison to a local business's resources and reach the volume is typically proportional to what a national company sees pursuing non-local keywords country-wide.

Thinking About Campaign Structure

In addition to finding juicy keywords and keyword themes to build on, you can eliminate the poorly performing ones, or the ones which have close to no volume, from your PPC campaign and remove them from your SEO planning. This not only helps your PPC account grow and mature but also helps you avoid wasting time and resources on chasing irrelevant or unworthy keywords.

As we discussed, sometimes local keywords can use a variety of modifiers like the city or town name, the state name, and the zip code in conjunction with the keyword(s) so making sure you are targeting the right mix from an SEO perspective is really helpful in getting quicker and better results. There is no point in optimizing your on-page content and targeting your link building plans on your keyword(s) plus a zip code if your market is searching by city/town and state (and vice versa). In the interest of time and better results, it makes sense to nail down the correct keywords upfront.
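Even before reaching for a dedicated tool, it can help to generate the candidate modifier combinations yourself so you know exactly which patterns you are testing. A minimal sketch (the towns, states, zip codes, and base keywords here are placeholders for your own market):

    from itertools import product

    # Placeholder geo data for illustration -- swap in your own market's
    # towns, state names/abbreviations, and zip codes.
    towns = ["Augusta", "Portland", "Bangor"]
    states = ["Maine", "ME"]
    zips = ["04330", "04101", "04401"]
    base_keywords = ["car insurance", "auto insurance", "life insurance"]

    def local_keyword_combos(keywords, towns, states, zips):
        """Generate common local modifier patterns for each base keyword."""
        combos = set()
        for kw, town in product(keywords, towns):
            combos.add(f"{town} {kw}")                 # "Augusta car insurance"
            for state in states:
                combos.add(f"{town} {state} {kw}")     # "Augusta Maine car insurance"
        for kw, zip_code in product(keywords, zips):
            combos.add(f"{kw} {zip_code}")             # "car insurance 04330"
        return sorted(combos)

    for phrase in local_keyword_combos(base_keywords, towns, states, zips):
        print(phrase)

The output is a simple candidate list you can paste into a keyword tool or a test PPC ad group to see which modifier patterns actually get searched.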

Starting off with Research

Generally, my initial research process goes something like this (we are assuming you've got a live site already):

  • look in analytics to find keywords that you are already receiving traffic for
  • see if there are any trends in that data in terms of language (car vs auto insurance, for example; see the sketch below this list)
  • begin broad keyword research to find terms related to the market (exclude local modifiers for now)
  • use free mindmap software or free site planning apps to visualize the main content areas of the site with those keywords
  • use Google Trends and Insights, in addition to the Google keyword tool and the free SEObook keyword tool, to compare data points on core terms (again, like with car/auto insurance or home versus homeowners insurance)
  • make a list of competitors in my area and check the volume on their brand name

So now I should have a good idea of which keywords I want to look at locally and some notes on any glaring differences in volume between closely related terms.
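For the language-trend step in the list above, a quick way to quantify "car vs auto" style differences is to tally your existing analytics keyword export. A rough sketch, assuming a CSV export with hypothetical "keyword" and "visits" columns (your analytics package will name these differently):

    import csv
    from collections import defaultdict

    # Hypothetical column names -- adjust to match your analytics export.
    KEYWORD_COL = "keyword"
    VISITS_COL = "visits"

    # Variant groups you want to compare, e.g. "car" vs "auto" insurance phrasing.
    variant_groups = {
        "car": ["car insurance", "car ins"],
        "auto": ["auto insurance", "auto ins"],
    }

    totals = defaultdict(int)
    with open("organic_keywords.csv", newline="") as f:
        for row in csv.DictReader(f):
            kw = row[KEYWORD_COL].lower()
            visits = int(row[VISITS_COL] or 0)
            for label, phrases in variant_groups.items():
                if any(p in kw for p in phrases):
                    totals[label] += visits

    for label, visits in sorted(totals.items(), key=lambda x: -x[1]):
        print(f"{label}: {visits} visits")

Even a crude tally like this gives you a data point on which phrasing your existing visitors actually use, before you spend a dime on PPC.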

Going Local

Now it's time to "localize" the data. I like the local keyword tool over at PPCblog.com because it does a really good job of working in all the different local modifiers that can be associated with your local PPC campaign.

That is a paid tool, as part of the PPC blog community and training membership (along with a lot of other quality PPC tools), and it's quite robust and easy to use.

If you are looking for a free tool along those lines, with less on the functionality front, you can use this free tool from 5minutesite.com.

Then I move into searching on some of the core terms in Google's keyword tool and the SEObook keyword tool (powered by Wordtracker). Many times you'll find nothing for some of your local searches, in terms of volume, but you should still keep them around for testing in PPC because keyword tools can be off on local searches based on their traditionally lower volume sets. Also, most keyword tools don't or can't allocate resources to capture every single search.

So now I should have a list of locally modified terms where the keyword portions were driven by non-local keyword research and local modifiers were added via a local keyword tool.

In addition, I should have notes and screenshots of data from Google Trends and Insights showing any language differences (of substance) both nationally and locally (locally when available, sometimes no data exists in the tools). I also should have notes as to any language or keyword trends I found in my analytics or tips I received by talking with employees who deal with customers as well as my own knowledge of the industry.

Working with AdWords

There are different ways of attacking your campaign in AdWords. Initially, I am just doing this for testing on an SEO campaign but if you decide to stick with the PPC campaign you can get into removing the local modifiers and bidding on those broader keywords while targeting searchers geographically.

Google has a few different ways of targeting users based on location:

Locations and Languages offer you the ability to target in 4 ways:

  • Bundles - mostly specific countries (United States, Spain, Canada, etc) and regions (North America, Central America, East Asia, etc)
  • Browse - essentially goes country - state - metro area - specific city or town
  • Search - search for and add just about anything (country, state, town, zip code to find towns or cities)
  • Custom - a nifty point, click, drag interface where you can isolate a specific area where you want your ads shown

You also have some advanced options like the Targeting Method:

Google has a really helpful chart on this here, and below is a screenshot of the information:

I like to leave both on as it helps with gauging not only the potential of your keywords but also the overall level of activity for your services (via keywords) in your market. Plus, the search term report can help you break down the keywords that trigger your ads, and this kind of PPC can help you show up for broad SEO terms that you might not have the resources to compete for.

Another advanced targeting option is the exclusion method:

Google has information on this method here and here's a chart showing the relationship:

I like to use this in some cases where there may be towns that overlap. For example, you could live in Maine and be targeting "Augusta" as a modifier but you'll probably want to exclude Georgia from your targeting as that is another area which can produce searches for that modifier. You can also get around that by adding a state modifier, Augusta Maine Insurance or some such, but you may find many folks use just the city or town name. That is when exclusion methods can be helpful.

Starting off on the Right Foot

Now I'll start to build the PPC campaign and pay attention to some of the core principles of trying to obtain a good quality score and good overall performance for a new account:

  • tight ad groups with keywords that are relevant to the ad group and the query
  • quality landing pages which speak specifically to the intent of the query (don't use a generic insurance template for all the different kinds of insurance you sell)
  • starting off with a manageable number of keywords to help focus on quality of traffic rather than quantity, and to help promote good keywords and remove or isolate bad ones

As an example, you might be selling life insurance in a few different towns. I would consider using town-specific ad groups -> keywords -> landing pages as my structure.

You can make landing pages for a specific town more helpful by talking about things like average family size in the town, average income, and so on, to give residents a more customized experience when shopping for life insurance.

You can also build product-specific ad groups and group your town/city modified keywords in there if that makes more sense for your specific campaign.
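One way to keep that structure straight before building it out in AdWords is to sketch it as data first, so every town-specific ad group maps to its own tightly themed keywords and landing page. A minimal illustration (the towns, URLs, and keywords are placeholders, not recommendations):

    # Placeholder campaign structure: one ad group per town, each pointing at
    # a town-specific landing page with its own tightly themed keyword list.
    campaign = {
        "Life Insurance - Augusta": {
            "landing_page": "https://example.com/life-insurance/augusta-me/",
            "keywords": [
                "augusta life insurance",
                "life insurance augusta maine",
                "augusta me life insurance quotes",
            ],
        },
        "Life Insurance - Portland": {
            "landing_page": "https://example.com/life-insurance/portland-me/",
            "keywords": [
                "portland life insurance",
                "life insurance portland maine",
            ],
        },
    }

    for ad_group, details in campaign.items():
        print(ad_group, "->", details["landing_page"])
        for kw in details["keywords"]:
            print("   ", kw)

Laying it out this way makes it obvious when an ad group is missing a dedicated landing page or is carrying keywords that belong somewhere else.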

Waiting for Results

In about a month or less I should have a pretty good idea of:

  • search volume for my proposed keywords
  • new keywords that I didn't find initially
  • which keywords convert and which don't
  • whether PPC fits into my ongoing marketing efforts
  • what type of SEO investment my search volume calls for

We live in a world and business environment where we want things yesterday and sometimes it can be tough to play the patience game. In my opinion, lack of patience is a leading cause of SEO and PPC failure these days.

If you take the above approach with a new campaign or a new idea, you will thank yourself in the short, mid, and long run. There are few sources of advice better than hard data, whether it tells you what you do or don't want to hear.

Free Google AdWords Coupons

Google is advertising a free $75 coupon for new AdWords advertisers, and offers SEM firms up to $2,000 in free AdWords credits via their Engage program.

Google Shows True Colors With BeatThatQuote Spam

Guidelines are pushed as though they are commandments from a religious tome, but they are indeed a set of arbitrary devices used to hold down those who don't have an in with Google.

When Google nuked BeatThatQuote I guessed that the slap on the wrist would last a month & give BTQ time to clean up their mess.

As it turns out, I was wrong on both counts.

Beat That Quote is already ranking again. They rank better than ever & after only 2 weeks!

And the spam clean up? Google did NOTHING of the sort.

Every single example (of Google spamming Google) that was highlighted is still live.

Now Google can claim they handled the spam on their end / discounted it behind the scenes, but such claims fall short when compared to the standards Google holds other companies to.

  • Most sites that get manually whacked for link-based penalties are penalized for much longer than 2 weeks.
  • Remember the brand damage Google did to companies like JC Penney & Overstock.com by talking to the press about those penalties? In spite of THOUSANDS of media outlets writing about Google's BTQ acquisition, The Register was the most mainstream publication discussing Google's penalization of BeatThatQuote, and there were no quotes from Google in it.
  • When asking for forgiveness for such moral violations, you are supposed to grovel before Google, admitting all past sins & acknowledging their omniscient ability to know everything. This can lead one to over-react and actually make things even worse than the penalty was!
  • In an attempt to clean up their spam penalties (or at least to show they were making an effort) JC Penney did a bulk email to sites linking to them, stating that the links were unauthorized and asking for them to be removed. So JC Penney not only had to spend effort dropping any ill-gotten link equity, but also lost tons of organic links in the process.

Time to coin a new SEO phrase: token penalty.

token penalty: an arbitrary short-term editorial action by Google to deflect public relations blowback that could ultimately lead to a review of anti-competitive monopolistic behaviors by a search engine with monopoly marketshare which doesn't bother to follow its own guidelines.

Your faith in your favorite politician should be challenged after you see him out on the town snorting coke and renting hookers. The same is true for Googlers preaching their guidelines as though they are law while Google is out buying links (and the sites that buy them).

You won't read about this in the mainstream press because they are scared of Google's monopolistic business practices. Luckily there are blogs. And Cyndi Lauper. ;)

Update: after reading this blog post, Google engineers once again penalized BeatThatQuote!

Categories: 

Wednesday, April 20, 2011

Google Wants to Act Like a Start Up

I just saw this Google snippet while trying to find one of our old posts and it was *so* awful that I had to share it.

This is an area where Bing was out in front of Google & had used a more refined strategy for years before Google started playing catch-up last fall.

Google ignored our page title, ignored our on-page header, and then used the 'comments' count as the lead in the clickable link. Then they followed it with the site's homepage page title. The problem here is that if the eye is scanning the results for a discriminating factor to re-locate a vital piece of information, there is no discriminating factor; nothing memorable stands out. Luckily we are not using breadcrumbs & that post at least had a somewhat memorable page URL, otherwise I would not have been able to find it.

For what it is worth, the search I was doing didn't have the word "comments" in it & Google just flat out missed on this one. Given that a huge percentage of the web's pages have the word "comments" on them (according to the number of search results returned for "comments" it is about 1/6th as popular online as the word "the"), one might think that they could have programmed their page title modification feature to never select 'comments' as the lead.

Google has also been using link anchor text sometimes with this new feature, so it may be a brutal way to Google-bomb someone. It sure will be fun when the political bloggers give it a play. ;)

But just like the relevancy algorithms these days, it seems like this is one more feature where Google ships & then leaves it up to the SEOs to tell them what they did wrong. ;)

Categories: 

Majestic SEO Fresh Index

Majestic SEO has long had great link data, but their biggest issue has been usability. They sorta built it with the approach of "let's give them everything" as a default, and then allowed advanced filtering to be done over the top to generate custom reports.

For advanced users this type of set up is ideal, because you are able to slice and dice it in many ways on your own terms. It allows you to spot nepotistic networks, pinpoint strategies quickly, and generally just get a good look at what is going on in ways that you wouldn't be able to if you couldn't get all the data in a table. There are so many valuable edge case uses that can't practically be put in a single interface while keeping usability high for the average user.

But for people newer to the SEO game & those looking for a quick source of data, the level of options can be a bit overwhelming when compared against something like Open Site Explorer. A parallel analogy would be that when I want to spot-check rankings real quick I rely on our rank checker, but if you want a variety of in-depth historical views then something like Advanced Web Ranking can be a quite helpful tool.

In an attempt to improve the "at a glance" style functionality Majestic SEO announced their new site explorer, which puts more data at your fingertips without requiring you to open up an Excel spreadsheet:

How much can you use the Majestic Site Explorer?
The system is designed for silver users and above. Silver subscribers can query upto 10 different domains an HOUR. Gold subscribers can query upto 30 different domains an hour and Platinum subscribers can query upto 100 different domains an hour. All levels are subject to fair use terms.

These allow you to view data on a sitewide basis, at the subdomain level, or drill down to individual pages.

Here is an example of a site level report

and if you wanted data down to the URL level, here is an overview of the top few links (note that the report goes on for numerous pages with data)

This update helped Majestic SEO close the gap a bit with Open Site Explorer, but a couple more things they may want to consider doing are

  • adding result crowding / limit results to x per domain
  • allowing you to filter out internal link data

Those features are available via their advanced reports, but making it easier to do some of that stuff in the "at a glance" interface would allow Majestic SEO to provide a best-in-breed solution for both the "at a glance" function and the "in-depth deep research" options.

Majestic SEO also announced their new fresh index, which allows you to view link data from as recently as the past day. It doesn't require waiting for a monthly update, but offers link data right away. To help spread the word & give everyone a chance to see some of the new features, they gave us voucher codes for a 20% discount on your first month at any level.

If you have any questions about how Majestic SEO works you can sign up & register your own site, which allows you to access many of their features free. As a comparison SEOmoz (which offers Open Site Explorer) is also running a free 1-month trial right now.

Categories: 

 