Mega Session: SEO Vets Take All Comers – Live from SMX Advanced

NOTE: Some of the answers here are tongue-in-cheek. I will attempt to note where this is the case.

Greg Boser, Bruce Clay, Vanessa Fox, Todd Friesen, Rae Hoffman, Stephan Spencer, Brett Tabke

Q: Do .edu links continue to carry a lot of link juice?

In short, .edu links are still quite valuable. For instance, links from student newspapers, alumni newspapers, and similar pages do pass good value. There is some inherent trust in links that come from .edu sites. There were some loopholes, but Google has figured out what they are. If your content would be appropriate in the context of an .edu page, then a link from it is appropriate.

Q: How do I optimize “Silverlight” for SEO?

(Tongue-in-cheek) Conditionally redirect so that Google sees HTML but users see Silverlight. (This can be considered spam.)

Right now, Silverlight is not very searchable. There is a whitepaper on MSDN about how to make Silverlight more searchable.

Q: Based on Matt Cutts' comments that nofollow links were going to be less effective at controlling the spiders, what should we do about sites that have used this technique for link sculpting?

Right now, there is no clear direction on what’s going on. If nofollow sculpting has been used in the past, it may not work as well anymore, but that alone isn’t a reason to change the structure of a site that has been using it for a while. Matt Cutts will write a blog post about this shortly. If you use nofollow to help the bots crawl efficiently, then that’s cool. Alternately, use iframes to contain your links, with the iframe content nofollowed; this will keep the links out of the spiders’ reach.

It’s best to combine rel="nofollow" with a robots meta tag set to noindex. This way, you can manage which pages the search engines see and index, and increase the visibility of the good pages. If you sculpt the site as a whole, it will keep the right pages visible. Ideally, if you set up a robots.txt disallow for a directory, ALSO use a meta noindex on the individual pages within that subdirectory.
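As a rough sketch of auditing that last recommendation, the Python snippet below (the example.com URLs and the /private/ directory are hypothetical, and it assumes the requests library is installed) fetches pages from a disallowed directory and checks whether each one also carries a noindex robots meta tag:

```python
import re
import requests

# Hypothetical pages inside a directory that robots.txt already disallows.
PAGES = [
    "http://www.example.com/private/report-1.html",
    "http://www.example.com/private/report-2.html",
]

# Loose pattern; assumes the name attribute appears before content.
NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

for url in PAGES:
    resp = requests.get(url, timeout=10)
    if NOINDEX_RE.search(resp.text):
        print(f"OK       {url} carries a noindex robots meta tag")
    else:
        print(f"MISSING  {url} has no noindex meta tag; consider adding one")
```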

Q: When the Vince update happened in February, large brands started showing up more frequently in search results. Does this mean that you have to be a big brand to show up?

There is a strong association between searches for your brand and your domain name, which means those sites will likely come up more often. Therefore, if someone searches for a generic phrase that appears on a big brand’s website, the brand site is equated with those generic terms, even if it isn’t optimized for them. Although there is movement showing big brands higher, if you do good SEO you will not lose your page-one positioning.

Smaller sites will continue to have more trouble getting positioning than bigger brands or longer-established sites do. However, Google is trying to provide the most relevant results, and sometimes the brand is the most relevant result, whether or not its site is done well from an SEO standpoint. Big brands are more successful on head terms, but they are not as successful on mid- and long-tail terms. That is the area of opportunity for newer sites or ones with lesser-known brands.

Many big brands concentrate their efforts offline rather than online. That means there is great opportunity in targeted phrases, if pursued well.

Q: On the REI site, most sales come from head terms, but most traffic goes to the long tail. What can we do?

The best way to affect this is to understand where people are breaking down in the purchase process and then optimize the user experience. Some of the keywords may be informational searches rather than commercial-intent queries. Look at what is going on when someone lands on the page: if they are looking for a particular product and arrive at a category page, they may not search any further to reach the product. Try to get them directly to the product page. If the intent is research, then provide content that meets that need.

Alternately (although this is not a good idea), route people to the right page based on browser or referrer URL so they land on the specific page that will help them. This kind of conditional redirect can be good for the user, but it can be seen as a spamming technique. However, if you show an alternate version of a page for a short time to test whether it improves conversion, you should be OK from an SEO standpoint.
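One common way to run that kind of short-term test is deterministic bucketing, so each visitor consistently sees the same version of the page. Here is a minimal Python sketch, assuming a visitor ID from something like a first-party cookie (the ID and experiment name are hypothetical):

```python
import hashlib

def ab_bucket(visitor_id: str, experiment: str = "category-vs-product") -> str:
    """Deterministically assign a visitor to a test variant.

    Hashing the visitor ID keeps the assignment stable across visits,
    so the same person always sees the same version of the page.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "variant" if int(digest, 16) % 2 else "control"

# Hypothetical visitor ID, e.g. read from a first-party cookie.
print(ab_bucket("visitor-12345"))  # stable "control" or "variant" per ID
```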

Stephan Spencer: rule of thumb – if you are willing to show a Google engineer what you are doing and can explain why, then you are in pretty good shape. If you would be uncomfortable explaining it, then you might be a spammer.

One way to identify what type of intent is behind a search phrase is to look at the referral string to see what the search phrase is and what those visitors do once on the site. After you assess intent, you can test the appropriate segments of the site to improve conversion. Also, look at which pages are not performing well (not appearing in search results), and identify ways to give them better visibility through internal and external linking to improve their performance in natural search. You could do template optimizations, internal linking work, etc.
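As a sketch of that kind of referrer analysis, the Python snippet below pulls the search phrase out of a referrer URL (assuming the old-style q parameter that search referrers carried) and roughly classifies its intent; the hint-word lists are hypothetical and would need tuning to your own vocabulary:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical hint words; tune these to your own products and content.
COMMERCIAL_HINTS = {"buy", "price", "cheap", "sale", "review"}
INFORMATIONAL_HINTS = {"how", "what", "why", "guide", "vs"}

def classify_referrer(referrer_url: str) -> str:
    """Roughly classify search intent from the query in a referrer URL."""
    query = parse_qs(urlparse(referrer_url).query).get("q", [""])[0]
    words = set(query.lower().split())
    if words & COMMERCIAL_HINTS:
        return "commercial"
    if words & INFORMATIONAL_HINTS:
        return "informational"
    return "unknown"

print(classify_referrer("http://www.google.com/search?q=buy+hiking+boots"))
# -> "commercial"
```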

Q: We’ve had conflicting reports on XML sitemaps – some say that they are great, and others say that they really hurt. Which is true?

Vanessa Fox – XML sitemaps do not help with ranking. What they DO help with is discovery: if you want Google to see each of those pages without waiting for a crawl, then they help. Should all pages be in the sitemap? Short answer: put the right pages in the XML sitemap and block out the ones you don’t want indexed. Also, Google uses XML sitemaps as a canonicalization signal.

Google Webmaster Tools reports two types of crawl errors: errors on the site and errors in the sitemap. If there are URLs in the sitemap that Google cannot read, there may be a parsing error. Best practice: submit just the pages you want indexed and leave out the ones you don’t.
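To illustrate the "submit only what you want indexed" advice, here is a minimal Python sketch that builds a sitemap from a hand-curated list of canonical URLs (the example.com URLs are hypothetical):

```python
from xml.sax.saxutils import escape

# Hypothetical list of canonical pages you actually want indexed;
# blocked or duplicate URLs never make it into this list.
INDEXABLE_URLS = [
    "http://www.example.com/",
    "http://www.example.com/products/widget",
]

def build_sitemap(urls):
    """Emit a minimal XML sitemap containing only the given URLs."""
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    ]
    for url in urls:
        lines.append(f"  <url><loc>{escape(url)}</loc></url>")
    lines.append("</urlset>")
    return "\n".join(lines)

print(build_sitemap(INDEXABLE_URLS))
```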

Q: I am rebranding a blog. Currently the blog is on a subdomain, but I want to move it to a new domain. What’s the best approach?

Basically, set up the content on the new site, then do page-by-page 301 redirects if a global redirect will not work. Better still, look at which pages are driving traffic and redirect just those. If the page names will be the same except for the domain, just do a global rewrite/301 that points to the new site and pages. If redirecting everything is not an option, the best way to find the redirects that need to be implemented is to download all of the URLs that have external links and redirect each of those to the new site individually.

Good hint: once the 301 redirects are in place, resubmit the old sitemap so that Google finds the new site more quickly.
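As a sketch of verifying a page-by-page move, the Python snippet below (the oldbrand/newbrand URLs are hypothetical, and it assumes the requests library) checks that each old URL returns a 301 pointing at the expected new URL:

```python
import requests

# Hypothetical mapping from old subdomain URLs to new-domain URLs.
REDIRECTS = {
    "http://blog.oldbrand.com/post-1": "http://www.newbrand.com/post-1",
    "http://blog.oldbrand.com/post-2": "http://www.newbrand.com/post-2",
}

for old_url, expected in REDIRECTS.items():
    # Don't follow the redirect; inspect the raw 301 response instead.
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "")
    if resp.status_code == 301 and target == expected:
        print(f"OK   {old_url} -> {target}")
    else:
        print(f"BAD  {old_url}: status={resp.status_code}, Location={target!r}")
```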

Q: How many people think that they can get good ranking with quality content?

When posed to the audience, most said yes.

However, if you have good content and no links, you are not going to get found. SEO rests on three pillars: content, links, and architecture. You need quality content, but you also need good site setup and good inbound links. In an ideal world, you will get good links if you have good content. Sometimes content that is good from a user perspective, such as a mashup, is not algorithmically valuable. With the Big Daddy launch, Google began giving better value to review pages with user-generated content. The challenge is that those reviews are not necessarily good content for the user, and users do not convert on those pages. This leaves a disconnect between what is algorithmically good content and what is good content for users.

If you are doing link bait, sometimes you need a fair amount of history in your blog just to show its legitimacy, whether the content is good quality or just standard.

Q: What is the most important or best practice for SEO?

Top answers were title tags and anchor text on inbound links. Also, silo the hell out of it. From Vanessa Fox: On page, quality content and title tags.

Q: What is a good resource for finding architecture recommendations?

Google has an SEO 101 PDF on how to build sites; search Google for “Google SEO guide”. Also look for a PowerPoint on NetConcepts about SEO site architecture.

Lightning round…

Q: Should we narc on people who are buying links?

A few people said that they would tell on people who bought links, but most said that they wouldn’t.

Todd Friesen – doesn’t like to report paid links. Usually, he just tells the client where to report them.

Greg Boser – doesn’t usually report because nothing usually happens.

Stephan Spencer – sometimes a competitor is not actually being helped by its paid links; something else may be giving the site its boost. Therefore, even if they lose the value of the paid links, they will not necessarily lose their higher position.

Q: What are the new big SEO strategies for 2009?

Vanessa Fox – looking at the data and trying to figure out who the people are who are searching for your products, what they want and building a better experience.

Todd Friesen – regular SEO is still important. On page, content, link building. But the data is a great thing.

Greg Boser – interesting things going on in local SEO.

Rae Hoffman – Google is making it easier to fix sites and get bad URLs out of the index. It would be great to have a global 301 and an easy way to get 404s out of the index.

Brett Tabke – new search tools that include searches of Twitter URLs, what’s going on with Bing, and other new real-time search.

Danny Sullivan – search analytics is really interesting. How to do more with the traffic that is arriving at the site.

Stephan Spencer – understanding what the value is for a target keyword before you actually optimize for it and get traffic. There are some great new tools to measure this.

Bruce Clay – being able to optimize the many different types of content that can be used, like images, video, and audio.

Todd Friesen – microformats and the value that those will bring to search optimization.