
Saturday, April 23, 2011

Google's Cat & Mouse SEO Game

This infographic highlights how Google's cat and mouse approach to SEO has evolved over the past decade.

One of the best ways to understand where Google is headed is to look at where they have been and how they have changed.

Click on it for the ginormous version.

Google's Collateral Damage Infographic.

If you would like us to make more of them then please spread this one. We listen to the market & invest in what it values ;)

Feel free to leave comments below if you have any suggestions or feedback on it :)


Is the Huffington Post Google's Favorite Content Farm?

I was looking for information about the nuclear reactor issue in Japan and am glad it did not turn out as bad as it first looked!

But in that process of searching for information I kept stumbling into garbage hollow websites. I was cautious not to click on the malware results, but of the mainstream sites covering the issue, one of the most flagrant efforts was from the Huffington Post.

AOL recently announced that they were firing 15% to 20% of their staff. No need for original stories or even staff writers when you can literally grab a third party tweet, wrap it in your site design, and rank it in Google. In line with that spirit, I took a screenshot. Rather than calling it the Huffington Post, I decided a more fitting title would be plundering host. :D

plundering host.

We were told that the content farm update was to get rid of low quality web pages & yet that information-less page was ranking at the top of their search results, when it was nothing but a 3rd party tweet wrapped in brand and ads.

How does Huffington Post get away with that?

You can imagine in a hyperspace a bunch of points, some points are red, some points are green, and in others there’s some mixture. Your job is to find a plane which says that most things on this side of the plane are red, and most of the things on that side of the plane are the opposite of red. - Google's Amit Singhal

If you make it past Google's arbitrary line in the sand there is no limit to how much spamming and jamming you can do.

we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side. - Matt Cutts

(G)arbitrage never really goes away, it just becomes more corporate.

The problem with Google arbitrarily picking winners and losers is the winners will mass produce doorway pages. With much of the competition (including many of the original content creators) removed from the search results, this sort of activity is simply printing money.

As bad as that sounds, it is actually even worse than that. Today Google Alerts showed our brand being mentioned on a group-piracy website built around a subscription model of selling 3rd party content without permission! As annoying as that feels, of course there are going to be some dirtbags on the way that you have to deal with from time to time. But now that the content farm update has gone through, some of the original content producers are no longer ranking for their own titles, whereas piracy sites that stole their content are now the canonical top ranked sources!

Google never used to put piracy sites on the first page of results for my books, this is a new feature on their part, and I think it goes a long way to show that their problem is cultural rather than technical. Google seems to have reached the conclusion that since many of their users are looking for pirated eBooks, quality search results means providing them with the best directory of copyright infringements available. And since Google streamlined their DMCA process with online forms, I couldn’t discover a method of telling them to remove a result like this from their search results, though I tried anyway.
... I feel like the guy who was walking across the street when Google dropped a 1000 pound bomb to take out a cockroach - Morris Rosenthal

Way to go Google! +1 +1

Too clever by half.

Google Panda Coming to a Market Near You

If you live outside the United States and were unscathed by the Panda Update, a world of hurt may await soon. Or you may be in for a pleasant surprise. It is hard to say where the chips may fall for you without looking.

Some people just had their businesses destroyed, whereas the Online Publisher Association sees a $1 billion windfall to the winning publishers.

Due to Google having multiple algorithms running right now, you can get a peek at the types of sites that were hit, and if your site is in English you can see whether it would have been hit by comparing your Google.com rankings in the United States versus in foreign markets using the Google AdWords ad preview tool.

In most foreign markets Google is not likely to be as aggressive with this type of algorithm as they are in the United States (because foreign ad markets are less liquid and there is less of a critical mass of content in some foreign markets), but I would be willing to bet that Google will be pretty aggressive with it in the UK when it rolls out.

The keywords where you will see the most significant ranking changes will be those where there is a lot of competition, as keywords with less competition generally do not have as many sites to replace them when they are whacked (since there were fewer people competing for the keyword). Another way to get a glimpse of the aggregate data is to look at your Google Analytics search traffic from the US and see how it has changed relative to seasonal norms. Here is a "look out below" example, highlighting how Google traffic dropped. ;)

What is worse is that on most impacted sites revenue declined faster than traffic, because search traffic monetizes so well & the US ad market is so much deeper than most foreign markets. Thus a site that had 50% profit margins might have just gone to breaking even or losing money after this update. :D

When Google updates the US content farmer algorithm again (likely soon, since it has already been over a month since the update happened) it will likely roll out around other large global markets, because Google does not like running (and maintaining) 2 sets of ranking algorithms for an extended period of time, as it is more cost intensive and it helps people reverse engineer the algorithm.

Some sites that get hit may be able to quickly bounce back *if* they own a well-read tech blog and have an appropriate in with Google engineers, however most will not unless they drastically change their strategy. Almost nobody has recovered and it has been over a month since the algorithm went live. So your best bet is to plan ahead. When the tide goes out you don't want to be swimming naked. :)


Friday, April 22, 2011

Doorway Pages Ranking in Google in 2011?

When Google did the Panda update they highlighted that not only did some "low quality" sites get hammered, but that some "high quality" sites got a boost. Matt Cutts said: "we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side."

Here is the problem with that sort of classification system: doorway pages.

The following Ikea page was ranking on page 1 of the search results for a fairly competitive keyword.

Once you strip away the site's navigation there are literally only 20 words on that page. And the main body area "content" for that page is a link to a bizarre, confusing, and poor-functioning flash tour which takes a while to load.

If you were trying to design the worst possible user experience & wanted to push the "minimum viable product" page into the search results then you really couldn't do much worse than that Ikea page (at least not without delivering malware and such).

I am not accusing Ikea of doing anything spammy. They just have terrible usability on that page. Their backlinks to that page are few in number & look just about as organic as they could possibly come. But not that long ago companies like JC Penney and Overstock were demoted by Google for building targeted deep links (that they needed in order to rank, but were allegedly harming search relevancy & Google user experience). Less than a month later Google arbitrarily changed their algorithm to where other branded sites simply didn't need many (or in some cases any) deep links to get in the game, even if their pages were pure crap. Google Handling Flash.

We are told the recent "content farm" update was to demote low quality content. If that is the case, then how does a skeleton of a page like that rank so high? How did that Ikea page go from ranking on the third page of Google's results to the first one? I think Google's classifier is flashing a new set of exploits for those who know what to look for.

A basic tip? If you see Google ranking an information-less page like that on a site you own, that might be a green light to see how far you can run with it. Give GoogleBot the "quality content" it seeks. Opportunities abound!


A Thought Experiment on Google Whitelisting Websites

Google has long maintained that "the algorithm" is what controls rankings, except for sites which are manually demoted for spamming, getting hacked, delivering spyware, and so on.

At the SMX conference it was revealed that Google uses whitelisting:

Google and Bing admitted publicly to having ‘exception lists’ for sites that were hit by algorithms that should not have been hit. Matt Cutts explained that there is no global whitelist but for some algorithms that have a negative impact on a site in Google’s search results, Google may make an exception for individual sites.

The idea that "sites rank where they deserve, with the exception of spammers" has long been pushed to help indemnify Google from potential anti-competitive behavior. Google's marketing has further leveraged the phrase "unique democratic nature of the web" to highlight how PageRank originally worked.

But why don't we conduct a thought experiment for the purpose of thinking through the differences between how Google behaves and how Google doesn't want to be perceived as behaving.

Let's cover the negative view first. The negative view is that either Google has a competing product, or a Google engineer dislikes you and goes out of his way to torch your stuff simply because you are you and he is holding onto a grudge. Given Google's current monopoly-level marketshare in most countries, such behavior would be seen as unacceptable if Google were just picking winners and losers based on their business interests.

The positive view is that "the algorithm handles almost everything, except some edge cases of spam." Let's break down that positive view a bit.

  • Off the start, consider that Google engineers write the algorithms with set goals and objectives in mind.
    • Google only launched universal search after Google bought Youtube. Coincidence? Not likely. If Google had rolled out universal search before buying Youtube then they likely would have increased the price of Youtube by 30% to 50%.
    • Likewise, Google trains some of their algorithms with human raters. Google seeds certain questions & desired goals in the minds of raters & then uses their input to help craft an algorithm that matches their goals. (This is like me telling you I can't say the number 3, but I can ask you to add 1 and 2 then repeat whatever you say :D)
  • At some point Google rolls out a brand-filter (or other arbitrary algorithm) which allows certain favored sites to rank based on criteria that other sites simply can not match. It allows some sites to rank with junk doorway pages while demoting other websites.
  • To try to compete with that, some sites are forced to either live in obscurity & consistently shed marketshare in their market, or be aggressive and operate outside the guidelines (at least in spirit, if not in a technical basis).
  • If the site operates outside the guidelines there is potential that they can go unpenalized, get a short-term slap on the wrist, or get a long-term hand-issued penalty that can literally last for up to 3 years!
  • Now here is where it gets interesting...
    • Google can roll out an automated algorithm that is overly punitive and has a significant number of false positives.
    • Then Google can follow up by allowing nepotistic businesses & those that fit certain criteria to quickly rank again via whitelisting.
    • Sites which might be doing the same things as the whitelisted sites might be crushed for doing the exact same thing & upon review get a cold shoulder.

You can see that even though it is claimed "TheAlgorithm" handles almost everything, they can easily interject their personal biases to decide who ranks and who does not. "TheAlgorithm" is first and foremost a legal shield. Beyond that it is a marketing tool. Relevancy is likely third in line in terms of importance (how else could one explain the content farm issue getting so out of hand for so many years before Google did something about it?).


Quick & Dirty Competitive Research for Keywords

There are so many competitive research tools on the market. We reviewed some of the larger ones here but there are quite a few more on the market today.

The truth is that you can really get a lot of good, usable data to give you an idea of what the competition is likely to be by using free tools or the free version of paid tools.

Some of the competitive research tools out there (the paid ones) really are useful if you are going to scale way up with some of your SEO or PPC plans but many of the paid versions are overkill for a lot of webmasters.

Choosing Your Tools

Most tools come with the promise of "UNCOVERING YOUR COMPETITORS' BEST _____".

That blank can be links, keywords, traffic sources, and so on. As we know, most competitive research tools are rough estimates at best and almost useless estimates at worst. Unless you get your hands on your competition’s analytics reports, you are still kind of best-guessing. In this example we are looking for the competitiveness of a core keyword.

Best-guessing really isn’t a bad thing so long as you realize that what you are doing is really triangulating data points and looking for patterns across different tools. Keep in mind many tools use Google’s data, so you’ll want to try to reach beyond Google’s data points a bit and hit up additional data sources as well.

The lure of competitive research is to get it done quickly and accurately. However, gauging the competition of a keyword or market can’t really be done with a push of the button as there are factors that come into play which a push-button tool cannot account for, such as:

  • how hard is the market to link build for?
  • is the vertical dominated by brands and thick EMD’s?
  • what is your available capital?
  • are the ranking sites knowledgeable about SEO or are they mostly ranking on brand authority/domain authority? (how tight is their site structure, how targeted is their content, etc)
  • is Google giving the competing sites a brand boost?
  • is Google integrating products, images, videos, local results, etc?

Other questions might be stuff like "how is Google Instant skewing this keyword marketplace" or "is Google firing a vertical search engine for these results (like local)" or "is Google placing 3 AdWords ads at the top of the search results" or "is Google making inroads into the market" like they are with mortgage rates.

People don't search in an abstract mathematical world, but by using their fingers and eyes. Looking at the search results matters. Quite a few variables come into play which require some human intuition and common sense. A research tool is only as good as the person using it; you have to know what you are looking at & what to be aware of.

Getting the Job Done

In this example I decided to use the following tools: SEO for Firefox and the free version of Open Site Explorer.

Yep, just 2 free tools.... :)

So we are stipulating that you’ve already selected a keyword. In this case I picked a generic keyword for the purposes of going through how to use the tools. Plug your keyword into Google, flip on SEO for Firefox and off you go!

This is actually a good example of where a push button tool might bite the dust. You’ve got Related Search breadcrumbs at the top, Images in the #1 spot, Shopping in the #3 spot, and News (not pictured) in the #5 spot.

So wherever you thought you might rank, just move yourself down 1-3 spots depending on where you would be in the SERPs. This can have a large effect on potential traffic and revenue so you’ll want to evaluate the SERP prior to jumping in.

You might decide that you need to shoot for 1 or 2 rather than top 3 or top 5 given all the other stuff Google is integrating into this results page. Or you might decide that the top spot is locked up and the #2 position is your only opportunity, making the risk to reward ratio much less appealing.

With SEO for Firefox you can quickly see important metrics like:

  • Yahoo! links to domain/page
  • domain age
  • Open Site Explorer and Majestic SEO link data
  • presence in strong directories
  • potential, estimated traffic value from SEM Rush

Close up of SEO for Firefox data:

Basically by looking at the results page you can see what other pieces of universal search you’ll be competing with, whether the home page or a sub-page is ranking, and whether you are competing with brands and/or strong EMD’s.

With SEO for Firefox you’ll see all of the above plus the domain age, domain links, page links, listings in major directories, position in other search engines, and so on. This will give you a good idea of potential competitiveness of this keyword for free and in about 5 seconds.

It is typically better & easier to measure the few smaller sites that managed to rank rather than measuring the larger authoritative domains. Why? Well...

Checking Links

So now that you know how many links are pointing to that domain/page you’ll want to check how many unique domains are pointing in and what the anchor text looks like, in addition to what the quality of those links might be.

Due to its ease of use (in addition to the data being good) I like to use Open Site Explorer from SEOmoz in these cases of quick research. I will use their free service for this example, which requires no login, and they are even more generous with data when you register for a free account.

The first thing I do is head over to the anchor text distribution of the site or page to see if the site/page is attracting links specific to the keyword I am researching:

What’s great here is you can see the top 5 instances of anchor text usage, how many total links are using that term, and how many unique domains are supplying those total links.

You can also see data relative to the potential quality of the entire link profile in addition to the ratio of total/unique domains linking in.

You probably won’t want or need to do this for every single keyword you decide to pursue. However, when looking at a new market, a potential core keyword, or if you are considering buying an exact match domain for a specific keyword you can accomplish a really good amount of competitive research on that keyword by using a couple free tools.

Types of Competitive Research

Competitive research is a broad term and can go in a bunch of different directions. As an example, when first entering a market you would likely start with some keyword research and move into analyzing the competition of those keywords before you decide to enter or fully enter the market.

As you move into bigger markets and start to do more enterprise-level competitive research specific to a domain, link profiles, or a broader market you might move into some paid tools.

Analysis paralysis is a major issue in SEO. Many times you might find that those enterprise-level tools really are overkill for what you might be trying to do initially. Gauging the competitiveness of a huge keyword or a lower volume keyword really doesn’t change based on the money you throw at a tool. The data is the data especially when you narrow down the research to a keyword, keywords, or domains.

Get the Data, Make a Decision

So with the tools we used here you are getting many of the key data points you need to decide whether pursuing the keyword or keywords you have chosen is right for you.

Some things the tools cannot tell you are questions we talked about before:

  • how much capital can you allocate to the project?
  • how hard are you willing to work?
  • do you have a network of contacts you can lean on for advice and assistance?
  • do you have enough patience to see the project through, especially if ranking will take a bit... can you wait on the revenue?
  • is creativity lacking in the market and can you fill that void or at least be better than what’s out there?

Only you can answer those questions :)


Google Update Panda

Google tries to wrestle index update naming back from the pundits, calling this one "Panda" - named after one of their engineers, apparently.

The official Google line - and I'm paraphrasing here - is this:

Trust us. We're putting the bad guys on one side, and the good guys on the other

I like how Wired didn't let them off the hook.

Wired persisted:

Wired.com: Some people say you should be transparent, to prove that you aren’t making those algorithms to help your advertisers, something I know that you will deny.

Singhal: I can say categorically that money does not impact our decisions.

Wired.com: But people want the proof.

This answer, from Matt Cutts, was interesting:

Cutts: If someone has a specific question about, for example, why a site dropped, I think it’s fair and justifiable and defensible to tell them why that site dropped. But for example, our most recent algorithm does contain signals that can be gamed. If that one were 100 percent transparent, the bad guys would know how to optimize their way back into the rankings

Why Not Just Tell Us What You Want, Already!

Blekko makes a big deal about being transparent and open, but Google have always been secretive. After all, if Google want us to produce quality documents their users like and trust, then why not just tell us exactly what a quality document their users like and trust looks like?

Trouble is, Google's algorithms clearly aren't that bulletproof, as Google admit they can still be gamed, hence the secrecy. Matt says he would like to think there would be a time they could open source the algorithms, but it's clear that time isn't now.

Do We Know Anything New?

So, what are we to conclude?

  • Google can be gamed. We kinda knew that....
  • Google still aren't telling us much. No change there....

Then again, there's this:

Google have filed a patent that sounds very similar to what Demand Media does, i.e. looks for SERP areas that are under-served by content, and prompts writers to write for it.

The patent basically covers a system for identifying search queries which have low quality content and then asking either publishers or the people searching for that topic to create some better content themselves. The system takes into account the volume of searches when looking at the quality of the content so for bigger keywords the content would need to be better in order for Google to not need to suggest somebody else writes something

If Google do implement technology based on this patent, then it would appear they aren't down on the "Content Farm" model. They may even integrate it themselves.

Until then....

How To Avoid Getting Labelled A Content Farmer

The question remains: how do you prevent being labelled as a low-quality publisher, especially when sites like eHow remain untouched, yet Cult Of Mac gets taken out? Note: Cult Of Mac appears to have been reinstated, but one wonders if that was the result of the media attention, or an algo tweak.

Google want content their users find useful. As always, they're cagey about what "useful" means, so those who want to publish content, and want to rank well, but do not want to be confused with a content farm, are left to guess. And do a little reverse-engineering.

Here's a stab, based on our investigations, the conference scene, Google's rhetoric, and pure conjecture thus far:

  • A useful document will pass a human inspection
  • A useful document is not ad heavy
  • A useful document is well linked externally
  • A useful document is not a copy of another document
  • A useful document is typically created by a brand or an entity which has a distribution channel outside of the search channel
  • A useful document does not have a 100% bounce rate followed by a click on a different search result for that same search query ;)

Kinda obvious. Are we off-base here? Something else? What is the difference, as far as the algo is concerned, between eHow and Suite 101? Usage patterns?

Still doesn't explain YouTube, though, which brings us back to:

Wired.com: But people want the proof

YouTube, the domain, is incredibly useful, but some pages - not so much. Did YouTube get hammered by update Panda, too?

Many would say that's unlikely.

I guess "who you know" helps.

In the Panda update some websites got owned. Others are owned and operated by Google. :D


How Google Destroyed the Value of Google Site Search

Do You Really Want That Indexed?

On-demand indexing was a great value added feature for Google site search, but now it carries more risks than ever. Why? Google decides how many documents make their primary index. And if too many of your documents are arbitrarily considered "low quality" then you get hit with a sitewide penalty. You did nothing but decide to trust Google & use Google products. In response Google goes out of its way to destroy your business. Awesome!

Keep in mind that Google was directly responsible for the creation of AdSense farms. And rather than addressing them directly, Google had to roll everything through an arbitrary algorithmic approach.

<meta name="googlebot" content="noindex" />

Part of the prescribed solution to the Panda Update is to noindex content that Google deems to be of low quality. But if you are telling GoogleBot to noindex some of your content, then if you are also using them for site search, you destroy the usability of their site search feature by making your content effectively invisible to your customers. For Google Site Search customers this algorithmic change is even more value destructive than the arbitrary price jack Google Site Search recently did.

We currently use Google Site Search on our site here, but given Google's arbitrary switcheroo styled stuff, I would be the first person to dump it if they hit our site with their stupid "low quality" stuff that somehow missed eHow & sites which wrap repurposed tweets in a page. :D

Cloaking vs. meta noindex, rel=canonical, etc. etc. etc.

Google tells us that cloaking is bad & that we should build our sites for users instead of search engines, but now Google's algorithms are so complex that you literally have to break some of Google's products to be able to work with other Google products. How stupid! But a healthy reminder for those considering deeply integrating Google into your on-site customer experience. Who knows when their model will arbitrarily change again? But we do know that when it does they won't warn partners in advance. ;)

I could be wrong in the above, but if I am, it is not easy to find any helpful Google documentation. There is no site-search bot on their list of crawlers, questions about whether they share the same user agent have gone unanswered, and even a blog post like this probably won't get a response.

That is a reflection of only one more layer of hypocrisy, in which Google states that if you don't provide great customer service then your business is awful, while going to the dentist is more fun than trying to get any customer service from Google. :D

I was talking to a friend about this stuff and I think he summed it up perfectly: "The layers of complexity make everyone a spammer since they ultimately conflict, giving them the ability to boot anyone at will."


Tracking Offline Conversions for Local SEO

We have certainly seen a trend over the last one to two years where Google is focusing on more personalized search and an increasing focus on providing local results. As you know, a searcher does not even have to be burdened with entering a local modifier anymore.

Google will gladly figure out, for you, whether or not your search has local intent. :)

Google's Investment into Local

Late last year Google moved one of their prized executives, Marissa Mayer, over to local services. Moving Mayer, fresh off Google Instant and a variety of other high-profile areas of Google's search development, to head up local is a really strong signal of how much attention Google is putting on local and local result quality (or perceived quality).

If you are a business owner who operates locally, say a real estate agent or insurance agent or really any other consumer-based service, then this presents a huge opportunity for you if you can harness the targeting and tracking ability available online.

Merging Offline Marketing with Online Marketing

A lot of small businesses or larger businesses that operate locally still rely quite a bit on offline advertising. It used to be that business owners had to rely on staff nailing down exactly how a lead came to them (newspaper ad? radio ad? special discount ad? and so on).

While it is still good practice to do that, relying solely on that to help gauge the ROI of your advertising campaign introduces a good amount of slippage and is not all that accurate (especially if you sell something online).

As local businesses start to see the light with SEO and PPC campaigns versus dropping 5 figures on phonebook advertising, a big selling point as a service provider or an in-house marketing staff member will be to sell the targeting of online campaigns as well as the tracking of those results.

If you're a business owner, it's equally important that you understand what's available to you as an online marketer.

Types of Offline Advertising to Track

Locally, you are essentially looking at a few different types of advertising options to work into your newfound zest for tracking results:

  • Radio
  • Television
  • Print
  • Billboards

Print is probably the most wide-ranging in terms of branches of advertising collateral because you can get into newspapers, magazines, flyers, brochures, banners, yellow pages, and so on.

While your approach may be different to each marketing type, the core tracking options are basically the same. You can track in your analytics program via:

  • Separate Domains
  • Custom URL's
  • Custom Phone Numbers

The beauty of web analytics, specifically a free service like Google Analytics, is that it puts the power of tracking into the hands of a business owner at no cost outside of perhaps a custom setup and implementation by a competent webmaster. All of these tracking methods can be tracked in Google Analytics as well as in other robust analytics packages (Clicky.com, as an example, is a reasonably priced product which can do this as well, save for maybe the phone tracking).

Structuring Your Campaigns

With the amount of offline advertising many businesses do, it is easy to get carried away with separate domains, custom URL's, custom phone numbers, and the like.

What I usually like to do is use a good old fashioned spreadsheet to track the specific advertisements that are running, the dates they are running, and the advertising medium they are using. I also include a column or three for the tracking method(s) used (custom URL, separate domain, special phone number).

In addition to this, Google Analytics offers annotations which you can use to note those advertising dates in your traffic graph area to help get an even better idea of the net traffic effect of a particular ad campaign.

How to Track It

Armed with your spreadsheet of ads to track and notes on how you are going to track them, you're ready to set up the technical side of things.

The tracking is designed to capture the hits on your site via the methods mentioned. Once visitors get there, you'll want to get that traffic assigned to a campaign or a conversion funnel to determine how many of those people actually convert (if you are able to sell or convert the visitor online).

Custom URL's

A custom URL is going to be something like:

yoursite.com/save20 for an advert where you are offering 20% savings
yoursite.com/summer for an advert where you are offering a summer special

You may or may not want to use redirection. A redirect method makes sense if you are using something like a static site rather than a CMS like WordPress. With WordPress, you could create those URL's as specific pages, noindex them, and ensure they are not linked to internally, so you keep them out of the search engine and out of the normal flow of navigation. This way you know any visit to that page is clearly related to that offline campaign.
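If you go the WordPress-page route, the noindex piece is just a robots meta tag in the head of each tracking page (many WordPress SEO plugins offer this as a per-page setting, so you may not need to touch the template at all). A minimal sketch for the hypothetical yoursite.com/save20 page:

<!-- in the <head> of the yoursite.com/save20 landing page -->
<meta name="robots" content="noindex" />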

A redirect would be helpful where the above is not possible and you need to use Google's URL builder to help track the campaign and not lose referral parameters on the 301.

So you could use the URL builder to get the following parameters if you were promoting a custom URL like yoursite.com/save20:

http://www.yoursite.com/savings.php?utm_source=save20&utm_medium=mail&utm_campaign=bigsave

Then you can head into your .htaccess file (Apache) and insert this code:

(should be contained on 1 line in your .htaccess file)

RewriteRule ^save20$ /savings.php?utm_source=save20&utm_medium=mail&utm_campaign=bigsave [L,R=301]

When you test, you should see those URL builder parameters on the landing page and then you know you are good to go :)

If you are worried about multiple duplicate pages getting indexed in the search results (with slightly different tracking codes) you can also leverage the rel=canonical tag on your landing page.
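For instance, sticking with the hypothetical savings.php example above, a canonical link in the head of the landing page tells Google to treat the parameter-free URL as the version to index:

<!-- in the <head> of savings.php -->
<link rel="canonical" href="http://www.yoursite.com/savings.php" />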

Separate Domains

Some companies use separate domains to track different campaigns. The idea is the same, as is the basic code implementation, with the exception that you apply the redirect to the domain rather than to a sub-page or directory off the domain as we did in the prior example.

Say you sell snapping turtles (snappingturtles.com), and maybe you also sell turtle insurance, so you buy turtleinsurance.com and want to use that as part of a large campaign to promote this new and innovative product. You could get this from the URL builder:

http://www.snappingturtles.com/?utm_source=national&utm_medium=all&utm_campaign=turtleinsurance

The .htaccess on turtleinsurance.com would look like:

(should be contained on 1 line in your .htaccess file)

RewriteRule .* http://www.snappingturtles.com/?utm_source=national&utm_medium=all&utm_campaign=turtleinsurance [L,R=301]

This would redirect you to the home page of your main site and you can update your .htaccess with a sub-page if you had such a page catering to that specific market.
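For example, if you had a dedicated landing page on the main site for the insurance product (the turtle-insurance.php sub-page below is hypothetical), the same rule could point there instead:

(should be contained on 1 line in your .htaccess file)

RewriteRule .* http://www.snappingturtles.com/turtle-insurance.php?utm_source=national&utm_medium=all&utm_campaign=turtleinsurance [L,R=301]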

Custom Phone Numbers

There are quite a few ways to get cheap virtual numbers these days, and Phone.com is a reliable service where you can get a number for roughly $4.88 per month.

I know companies that implemented custom numbers for a bunch of print ads and it was pretty eye-opening in terms of which ads performed better than others and how much money is wasted on untargeted print campaigns.

There certainly is a somewhat intangible brand equity building component to offline ads but it is still interesting to see ads which carry their weight with traffic and response rates, as well as being really helpful when it comes time to reshape the budget.

There are a couple handfuls of providers which offer phone tracking inside of Google Analytics. Most of these providers will require the purchase of a number from them to tie into a specific URL on your site, or right into the domain, and they help track those calls alongside the pageviews generated.

Some campaigns are wide-ranging enough that you may want to target them with a custom number or two and a custom URL or domain. Using a spreadsheet to track these measures, along with using Google Analytics annotations to gauge traffic spikes and drops, offers business owners a deep view into the use of their marketing dollars.

Custom Coupon Codes

If you run a coupon code through Groupon you of course know where it came from. But other channels are also becoming easier to track. Microsoft Office makes it easy to create & track custom coupon codes. There are even technologies to allow you to insert tracking details directly into coupon codes on your own website (similar to online tracking phone numbers via services like IfByPhone or Google's call tracking). Some online coupons offer sophisticated tracking options, and Google wants to get into mobile payments to offer another layer of customer tracking (including coupons).

Finding a Reputable Provider

If you are a business owner who thinks "wow this is awesome, how the heck do I do it?", well here is some advice. If the field of web analytics is mostly foreign to you I would suggest finding a certified Google Analytics provider or ask if your current web company can do this for you. Certainly there are plenty of competent people and companies that are not part of the Google Analytics partner program.

If you are interested in a Google Analytics partner you can search for them here. There is also quite a bit of information in the self-education section of Google Analytics.

I would recommend learning how to do this over a period of time so you can make minor or major changes yourself at some point. Also, it helps to establish a business relationship with someone competent and trustworthy for future tasks that may come up which you cannot do on your own.

If you are a service provider, start implementing this for some of your local clients and you'll likely be well on your way to establishing yourself as a sought-after marketer in your area.


Thursday, April 21, 2011

SEO For Designers, Developers & Managers

SEO on your own site is straightforward, at least in terms of the politics. SEO'ing a site that a team works on is another matter.

You will come up against barriers. These barriers are often put up by designers, developers, copywriters and management. Frustrating as it is for the SEO, this is the reality of working on a site alongside other people, all of whom have agendas and requirements that may differ markedly from your own.

So how do you navigate this space? How do you ensure your SEO objectives can be met when other people may be resistant to change, or openly try to block you? In this post, we'll take a high-level, conceptual look at the challenges the SEO faces when working on a client site, and talking-points to help explore and clarify concepts.

1. Why Are We Doing SEO At All?

SEO is a pain.

It's complicated. It gets in the way, particularly when it comes to design. Why do we need headings and a lot of text when a picture tells a story? SEO appears to be an arbitrary, dark art with little in the way of fixed rules, and the client probably doesn't care about it anyway.

The thing is - if SEO is done well, a client may throw a whole lot more money at the site in future. Everyone likes to build on success, and that means more business, and more exposure, for everyone involved. On the internet, traffic = success. Traffic = money. A site that few people see, no matter how well executed, will likely fail, just like a site that fails to engage and convert visitors will fail. The client may not know they want SEO now, but you can be certain they'll be asking questions about it after launch.

If SEO is done poorly, the site may not be seen by as many people as it otherwise would. What use is a beautiful design that is seldom seen? What use is great code that is seldom used?

The value proposition of SEO is that it helps get a site seen. It's a powerful marketing channel, because most people use search engines to navigate the web. Sites that deliver what the search engines want stand to gain a lot more traffic than sites that do not undertake SEO. If your competitors are undertaking SEO, this puts your work at a competitive disadvantage. Their site will be seen more often by search visitors. Their web agencies will likely get more business as clients see greater returns on their investment.

That's why we do SEO. To be seen.

Of course, a site can be seen by other means. Word-of-mouth, social media, links, brand awareness, and offline advertising. A site doesn't need SEO, but given that it is a relatively easy win in terms of cheap traffic acquisition, the extra effort involved is negligible compared to the upside benefits. It's like being given a choice of having a shop located on main street vs a location way out in the desert. Much the same effort involved in building, but significantly different traffic potential.

2. SEO Is A Design Element

Just as copywriters require space to insert paragraphs and headings, SEO's require space to do their thing.

If you're a designer, an SEO will likely provide you with a list of their requirements. These requirements need not be onerous, any more so than leaving space for copy is considered onerous.

There are two key aspects where SEO needs to integrate with design. One aspect is the requirement for machine readable text, provided in a format the search engines are able to read and derive meaning from. Search engines "think" mostly in terms of words, not pictures. Make design allowances for copy that includes a lot of headings and sub-headings, a technique which also dovetails nicely with usability.

The other key aspect is crawl-ability. A search engine sends out a spider, a piece of code that grabs the source code of your website, and dumps it back in a database. It skips from page to page, following links. If a page doesn't have a link to it, or no crawlable link to it, it is invisible to the search engines. There are various means of making a site easy to crawl, but one straightforward way is to use a site map, linked to from each page on the site. Similarly, you should ensure your site navigation is crawlable, which means using standard hyperlinks, as opposed to scripted/executable links. If you must use scripted links, try and replicate the navigation elsewhere on the page in non-scripted form, or within the body of the text.
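As a rough illustration (the page names and the loadPage() function below are just placeholders), a standard hyperlink gives the spider a URL to follow, a purely scripted link does not, and a plain site map link repeated on every page gives the spider a second path into the site:

<!-- crawlable: the spider can follow the href -->
<a href="/widgets/red-widgets.htm">Red Widgets</a>

<!-- not crawlable on its own: there is no URL in the markup for the spider to follow -->
<span onclick="loadPage('red-widgets')">Red Widgets</span>

<!-- a plain text site map link, repeated on every page -->
<a href="/sitemap.htm">Site map</a>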

For most sites, that's pretty much it when it comes to design considerations. In summary, the inclusion of machine readable text, and a means for a spider to crawl easily from page to page.

An SEO may also wish to specify a page hierarchy and structural issues, where some pages are given more prominent positions than others. Of course, this needs to be weighed against navigation considerations for visitors who arrive at the site via other means.

3. SEO For Developers

Like design, there are two key areas of integration.

One is tagging. SEO's will want to specify title tags, and some meta tags. These need to be unique for each page on the site, as each page is an entry page as far as a search engine is concerned. A search visitor will not necessarily arrive at the home page first.

The title tag appears in search results as a clickable link, so it serves a valuable marketing function. When search visitors consider which link to click, they'll use the title tag and snippet to inform their decision.
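As a sketch (borrowing the hypothetical acme.com widget page discussed below for URLs), the point is simply that each page carries its own descriptive title and meta description rather than one sitewide boilerplate set:

<!-- unique to acme.com/widgets/red-widgets.htm -->
<title>Red Widgets - Acme Widget Co.</title>
<meta name="description" content="Acme's range of red widgets, with specifications, pricing and delivery options." />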

The second aspect concerns URL's. Ideally, a URL should contain descriptive words, as opposed to numbers and random letters. For example, acme.com/widgets/red-widgets.htm is good, whilst acme.com/w/12345678&tnr.php is less so. The more often the keyword appears, the more likely it will be "bolded" on a search results page, and is therefore more likely to attract a click. It's also easier for the search engine to determine meaning if a URL is descriptive as opposed to cryptic. For an in-depth look at technical considerations, see "SEO For Designers".

One workaround, if the database needs unique codes, is to do the translation at the URL level using URL rewriting.
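A minimal sketch of that translation, in the same .htaccess style as the redirect examples earlier on this page; the script name and product ID here are hypothetical, and most modern CMSs and frameworks have a routing layer that does this for you:

# serve the database-driven script while keeping the descriptive URL in the address bar
RewriteRule ^widgets/red-widgets\.htm$ /w/product.php?id=12345678 [L]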

4. SEO Is A Marketing Strategy

The on-page requirements, as dealt with above, are half the picture.

In order to rank well, a page needs to have links from external sites. The higher quality those sites, the more chances your pages have of ranking well. The SEO will look to identify linking possibilities, and point these links to various internal pages on the site.

It can be difficult, near impossible, to get high quality links to brochure-style advertising pages. Links tend to be directed at pages that have reference value. This is a strategic decision that needs to be weighed during site conception. Obviously, few sites strive, or want, to be Wikipedia; however, there are various ways to incorporate reference information into commercial sites where the primary purpose of the site is not the publication of reference information.

For example, include a blog, a news feed, publish the e-mail newsletter to the site, and/or incorporate a reference section within the site. It doesn't matter if this section isn't viewed by visitors who navigate directly to the site. It provides a means to get a lot of information-rich content into the site without disrupting design and other commercial imperatives. Think of it as a "mini-site" within a site.

Not every page needs to be for the purposes of SEO. SEO can be sectioned off, although this is often less ideal than more holistic integration throughout the site.

5. Strategic Factors For Managers

Concept, design and development can screw up SEO.

Poor integration can result in loss of potential traffic. This traffic will go to competitors. The longer a site doesn't use an SEO strategy, the harder it is to ever catch the competition, as a head-start in link building is difficult to counter.

If your aim, or your client's aim, is to attract as much targeted traffic as possible - as most site owners want to do - then SEO integration must be taken as seriously as design, development, copy and other media. It may influence your choice of CMS. It may influence your strategic approach in terms of how and what type of information you publish.

Whilst SEO can be bolted-on afterwards, this is a costly and less-effective way of doing SEO, much like re-designing a site is costly and less effective than getting it right in the planning stage. If SEO is integrated in the planning stage, it is reasonably straightforward.

The time to incorporate SEO is during site conception. SEO is a text publishing strategy. Design and development will need to make minor changes to the way they approach a site build. Doing this retrospectively, whilst not impossible, is more difficult, and therefore more costly.

Coda: Flash Workarounds For SEO

There are various workarounds to existing search-unfriendly design, but I'd advise to avoid the problem in the first place.

Flash, whilst a useful tool for embedding within sites, should be avoided for the entire site. Flash is a graphics/animation format, whereas search - and the web in general - is primarily a text format. If you build an entire site using Flash, then your competitors will overtake you in terms of search visitors. The formats simply do not gel.

One workaround is strategic - split the site in two. Use Flash as a brochure site, and create a hub site that is text based. Consider including a "printable" version of the site, which will give the search engines some text to digest. Whilst there are technical and strategic ways around Flash, they are often clumsy and tedious.

The search engines can make sense of most sites, but if you're expecting to get rewarded by search engines, then it pays to stick as close to their technological strengths and weaknesses as possible.


 