SEO
Subject Matter Experts and Their Role in Digital Marketing Strategy – Whiteboard Friday
Posted by Eric Enge
Establishing expertise and thought leadership is key to the success of your digital marketing strategy. In today's Whiteboard Friday, learn how your team can work with (or become!) subject matter experts in your niche, giving consumers of your content a chance to learn from the best.
For reference, here's a still of this week's whiteboard!
Video Transcription
Good morning everybody in Moz land. I'm Eric Enge, I'm the CEO of Stone Temple Consulting. I'm here to do a Whiteboard Friday here for you today. By the way, I'm also co-author of "The Art of SEO" together with the beloved Rand Fishkin.
So I want to talk to you a little bit about subject matter experts and their role in your digital marketing strategy. They play an incredibly important role. I see lots of businesses out there that publish sites and put content out there with no real identity behind it. Identity matters because, at the end of the day, your target audience wants to attach to a person more than they want to attach to a nameless entity. They want to feel like they're interacting with real people.
By the way, your subject matter expert could be subject matter experts, plural, and that's good, but it's incredibly important that you have somebody people can attach to in a material way. And at the end of the day, from my perspective, you have to have an expert or go home. You're just not going to be able to succeed in a big way going forward if you don't have some sort of established expertise for your business. That's my view of it. You just have to have that expert, or you need to go home.
So with that in mind, you run into the next problem. Your experts are human beings. There are 24 hours in a day, right? They have limited time to do what they need to do, and that actually limits how much scale you can get out of their activity. Maybe they only have two hours a day. And if that's the case, then that limits how much content and how much communication of that expertise can happen out in the wild.
So I want to talk now a little bit about how do we scale their efforts so you get more out of your expert, and that's where we lead to a few ideas I have over here. All right.
The best thing to do is see if you can get some smart people, they don't necessarily have to be young, to assist your subject matter expert in a number of different ways. Some great things you can do to help them out, one is you can research article topics. I know for myself, when I get up on Saturday morning, which is when I tend to write my columns, I sometimes spend two hours trying to come up with an idea for what the column is going to be about. It can be very painful, very frustrating. If you have somebody there helping you, coming up with ideas and really giving you a set of things that you can look at and think about for that next column or blog post or whatever it is, it can be a big, big help for you.
You can even potentially have them draft articles for you. You need to be careful about this. I'm actually not a big fan of ghost writing, because keeping in mind that people want to attach to an expert, if the thing is truly ghost-written, well, it's not really the expert that's writing it, and to me that relationship gets weakened. So I think it's very important to have the subject matter expert really be involved in writing the article. But you can have someone draft an article as long as the subject matter expert sort of recuts it and tears it apart, not just simple editing, but actually turns it into their own voice. Can be very helpful though to have that drafted article.
Find influencers. Very, very important thing to do. Who do you want that subject matter expert to build relationships with? That can be a lot of work to figure out too. You can use a variety of tools to figure this out. You can do social media research, just bum around on Twitter, Facebook, Google+, whatever your social site of preference is, or all of them, and help identify people that you want to have the subject matter expert interact with. Figure out how to contact them, do research on what they like, help get that relationship process going. The subject matter expert has to be the one to do the outreach, but you can make it easier for them by doing some research up front.
Next thing, just monitoring social media sites. I'm going to use Twitter as an example. Find tweets by key people, maybe by influencers, maybe just by good friends. Have your assistant, basically, help the subject matter expert by monitoring, in this case Twitter, more frequently and more thoroughly than they can on their own. So that's a very valuable service. So you look at tweets by key people and tweets by others, direct questions that get asked of you, or breaking news, all these sorts of things to allow the subject matter expert to be responsive without having to live in the social media site.
Next up, you can actually draft social media posts, be it a tweet or a Google+ or Facebook update or whatever it is, and then send your subject matter expert proposed things that they can put out on social media. Again, a big time-saver.
You can curate content for them. The assistant can go ahead and research other articles and find things going on and actually suggest comments on those articles.
Creating graphics, I'm lucky enough that I have someone who is able to create graphics for me. So I can walk in, in the morning and say, "Hey I want to do this post today," and I can sketch out a little design, here's what I want to do, I want this, sort of build a little design for them, and then they go off and create it and then two hours later I have a beautiful graphic which I can go ahead and use for my post. I actually end up with a lot of custom graphics in my posts that way, which is really cool.
They can also just edit your articles. Hopefully, that's not too painful for them, because hopefully your subject matter expert is a good writer. But this is another valuable service. It's really great to have that person, that other set of eyes on the article to help you with that.
The big key in all these services that I've talked about, which will help us lead to our happy SME down here in the bottom, is all about the relationship between the assistant and the subject matter expert. The assistant has to be doing things the subject matter expert finds valuable. So, if I'm a subject matter expert and I don't find your curating content for me valuable because I'm just too opinionated or I don't want to put that stuff out there, then having you do that for me doesn't help. So the subject matter expert and the assistant or assistants, as the case may be, have to build a special relationship so that they understand how to work together and really make it work.
So that's some ideas for you on how you take your subject matter expert, you give them a little more time, and help them scale their efforts, leading to a happy subject matter expert and good results for your business.
Thanks for listening to me today, and have a good day.
Video transcription by Speechpad.com
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
How Website Speed Actually Impacts Search Ranking
Posted by Zoompf
Google uses a multitude of factors to determine how to rank search engine results. Typically, these factors are either related to the content of a webpage itself (the text, its URL, the titles and headers, etc.) or are measurements of the authority of the website itself (the age of the domain name, the number and quality of inbound links, etc.). However, in 2010, Google did something very different: it announced that website speed would begin having an impact on search ranking. Now, the speed at which someone could view the content from a search result would be a factor.
Unfortunately, the exact definition of "site speed" remained open to speculation. The mystery widened further in June, when Google's Matt Cutts announced that slow-performing mobile sites would soon be penalized in search rankings as well.
Clearly Google is increasingly acting upon what is intuitively obvious: A poor performing website results in a poor user experience, and sites with poor user experiences deserve less promotion in search results. But what is Google measuring? And how does that play into search engine rankings? Matt Peters, data scientist at Moz, asked Zoompf to help find the answers.
Disclaimer
While Google has been intentionally unclear about which particular aspects of page speed impact search ranking, it has been quite clear in stating that content relevancy remains king. In other words, while we can demonstrate a correlation (or lack thereof) between particular speed metrics and search ranking, we can never outright prove a causal relationship, since other unmeasurable factors are still at play. Still, at a large enough scale, we assume that any discovered correlations are a "probable influence" on search ranking and thus worthy of consideration.
Methodology
To begin our research, we worked with Matt to create a list of 2,000 random search queries from the 2013 Ranking Factors study. We selected a representative sample of queries, some with as few as one search term ("hdtv"), others as long as five ("oklahoma city outlet mall stores"), and everything in between. We then extracted the top 50 ranked search result URLs for each query, assembling a list of 100,000 total pages to evaluate.
Next, we launched 30 Amazon "small" EC2 instances running in the Northern Virginia cloud, each loaded with an identical private instance of the open source tool WebPageTest. This tool uses the same web browser versions used by consumers at large to collect over 40 different performance measurements about how a webpage loads. We selected Chrome for our test, and ran each tested page with an empty cache to guarantee consistent results.
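As a rough illustration of how such a harness might queue a run, WebPageTest exposes a `runtest.php` HTTP endpoint on each instance. The sketch below builds the request URL for a first-view-only (empty cache) Chrome test; the server host name and the "Test:Chrome" location label are placeholder assumptions, not the study's actual configuration:

```python
from urllib.parse import urlencode

# Hypothetical private WebPageTest instance; the host below is a
# made-up placeholder, not part of the study's infrastructure.
WPT_SERVER = "http://wpt.example.internal"

def runtest_url(page_url, runs=1):
    """Build a runtest.php request that queues a first-view-only
    (empty cache) test and asks for a JSON response."""
    params = urlencode({
        "url": page_url,
        "runs": runs,
        "fvonly": 1,              # first view only: always an empty cache
        "location": "Test:Chrome",  # assumed location label on the instance
        "f": "json",              # machine-readable response for the harness
    })
    return f"{WPT_SERVER}/runtest.php?{params}"

print(runtest_url("http://www.example.com/"))
```

A harness would call this once per URL, poll for completion, and then collect the per-run measurement JSON.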
While we'll summarize the results below, if you want to check out the data for yourself you can download the entire result set here.
Results
While we captured over 40 different page metrics for each URL examined, most did not show any significant influence on search ranking. This was largely expected, as (for example) the number of connections a web browser uses to load a page should likely not impact search ranking position. For the purposes of brevity, in this section we will just highlight the particularly noteworthy results. Again, please consult the raw performance data if you wish to examine it for additional factors.
Page load time
When people say "page load time" for a website, they usually mean one of two measurements: "document complete" time or "fully rendered" time. Think of document complete time as the time it takes a page to load before you can start clicking or entering data. All the content might not be there yet, but you can interact with the page. Think of fully rendered time as the time it takes to download and display all images, advertisements, and analytics trackers. This is all the "background stuff" you see fill in as you're scrolling through a page.
Since Google was not clear on what "page load time" means, we examined the effects of both document complete and fully rendered times on search rankings. However, our biggest surprise came from the lack of correlation in these two key metrics! We expected, if anything, these two metrics would clearly have an impact on search ranking. However, our data shows no clear correlation between document complete or fully rendered times and search engine rank, as you can see in the graph below:
The horizontal axis measures the position of a page in the search results, while the vertical axis is the median time captured across all 2,000 different search terms used in the study. In other words, if you plugged all 2,000 search terms into Google one by one and then clicked the first result for each, we'd measure the page load time of each of those pages, then calculate the median and plot it at position 1. Then repeat for the second result, and the third, and so on until you hit 50.
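The median-per-position procedure just described is simple to compute. Here is a minimal sketch using made-up numbers for two of the study's example queries and three result positions, rather than the actual 2,000-query data set:

```python
import statistics

# Hypothetical load times (seconds) for the pages at result
# positions 1..3, for two example queries. Illustrative only.
load_times = {
    "hdtv":                             [2.0, 2.5, 3.0],
    "oklahoma city outlet mall stores": [1.5, 2.0, 2.5],
}

def median_by_rank(samples, n_ranks):
    """For each result position, take the median across all queries."""
    return [
        statistics.median(times[rank] for times in samples.values())
        for rank in range(n_ranks)
    ]

print(median_by_rank(load_times, 3))  # [1.75, 2.25, 2.75]
```

Each value in the output is one plotted point: the median load time for every page that appeared at that result position.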
We would expect this graph to have a clear "up and to the right" trend, as highly ranked pages should have a lower document complete or fully rendered time. Indeed, page rendering has a proven link to user satisfaction and sales conversions (we'll get into that later), but surprisingly we could not find a clear correlation to ranking in this case.
Time to first byte
With no correlation between search ranking and what is traditionally thought of as "page load time," we expanded our search to Time to First Byte (TTFB). This metric captures how long it takes your browser to receive the first byte of a response from a web server when you request a particular URL. In other words, this metric encompasses the network latency of sending your request to the web server, the amount of time the web server spent processing and generating a response, and the amount of time it took to send the first byte of that response back from the server to your browser. The graph of median TTFB for each search rank position is shown below:
The TTFB result was surprising: a clear correlation was identified between decreasing search rank and increasing time to first byte. Sites that have a lower TTFB respond faster and have higher search result rankings than slower sites with a higher TTFB. Of all the data we captured, the TTFB metric had the strongest correlation effect, implying a high likelihood of some level of influence on search ranking.
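To make concrete what the metric covers, TTFB can be approximated from a client with nothing more than the standard library. This sketch (not Zoompf's harness) times the gap between opening the connection and the arrival of the first byte of the response body:

```python
import http.client
import time

def time_to_first_byte(host, path="/", port=80, timeout=10):
    """Approximate TTFB: connection latency, plus server processing
    time, plus delivery of the first response bytes."""
    start = time.monotonic()
    conn = http.client.HTTPConnection(host, port, timeout=timeout)
    try:
        conn.request("GET", path)
        resp = conn.getresponse()  # returns once the status line arrives
        resp.read(1)               # pull the first byte of the body
        return time.monotonic() - start
    finally:
        conn.close()
```

In practice you would repeat the measurement several times and take the median, since a single sample is dominated by network jitter.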
Page size
The surprising result here came from the median size of each web page, in bytes, relative to the search ranking position. By "page size," we mean all of the bytes that were downloaded to fully render the page, including all the images, ads, third-party widgets, and fonts. When we graphed the median page size for each search rank position, we found a counterintuitive correlation of decreasing page size to decreasing search rank, with an anomalous dip in the top 3 ranks.
This result confounded us at first, as we didn't anticipate any real relationship here. Upon further reflection, though, we had a theory: lower-ranking sites often belong to smaller companies with fewer resources, and consequently may have less content and complexity in their sites. As rankings increase, so does the complexity, with the exception of the "big boys" at the top who have extra budget to highly optimize their offerings. Think Amazon.com vs. an SMB electronics retailer vs. a mom-and-pop shop. We really have no proof of this theory, but it fits both the data and our own intuition.
Total image content
Since our analysis of the total page size surprised us, we decided to examine the median size, in bytes, of all images loaded for each page, relative to the search rank position. Other than a sharp spike in the first two rankings, the results are flat and uninteresting across all remaining rankings.
While we didn't expect a strong correlation here, we did expect some correlation, as sites with more images do load more slowly. Since this metric is tied closely to the fully rendered time mentioned above, the fact that it is equally flat supports the finding that page load time is likely not currently impacting search ranking.
What does this mean?
Our data shows there is no correlation between "page load time" (either document complete or fully rendered) and ranking on Google's search results page. This is true not only for generic searches (one or two keywords) but also for "long tail" searches (four or five keywords). We did not see websites with faster page load times ranking higher than websites with slower page load times in any consistent fashion. If page load time is a factor in search engine rankings, it is being lost in the noise of other factors. We had hoped to see some correlation, especially for generic one- or two-word queries. Our belief was that the high competition for generic searches would make smaller factors like page speed stand out more. This was not the case.
However, our data shows there is a correlation between lower time-to-first-byte (TTFB) metrics and higher search engine rankings. Websites with servers and back-end infrastructure that could quickly deliver web content had a higher search ranking than those that were slower. This means that, despite conventional wisdom, it is back-end website performance and not front-end website performance that directly impacts a website's search engine ranking. The question is, why?
TTFB is likely the quickest and easiest metric for Google to capture. Any of Google's various crawlers can take this measurement. Collecting document complete or fully rendered times, by contrast, requires a full browser. Additionally, document complete and fully rendered times depend almost as much on the capabilities of the browser loading the page as they do on the design, structure, and content of the website. Google's use of TTFB as its measure of "performance" or "speed" could perhaps be explained by the increased time and effort required to capture rendering data with its crawler. We suspect that over time, though, page rendering time will also factor into rankings, given how strongly Google has signaled the importance of user experience.
Not only is TTFB easy to calculate, but it is also a reasonable metric to gauge the performance of an entire site. TTFB is affected by 3 factors:
- The network latency between a visitor and the server.
- How heavily loaded the web server is.
- How quickly the website's back end can generate the content.
Tail wagging the dog?
Do these websites rank highly because they have better back-end infrastructure than other sites? Or do they need better back-end infrastructure to handle the load of ALREADY being ranked higher? While both are possible, our conclusion is that sites with faster back ends receive a higher rank, and not the other way around.
We based this conclusion on the fact that highly specific queries with four or five search terms generally do not return results for highly trafficked websites. This long tail of searches typically surfaces smaller sites, run by much smaller companies, about very specific topics that don't receive the large volumes of traffic that necessitate complex environments of dozens of servers. However, even among these smaller sites, fast websites with lower TTFB are consistently ranked higher than slower websites with higher TTFB.
Takeaways
The back-end performance of a website directly impacts search engine ranking. The back end includes the web servers, their network connections, the use of CDNs, and the back-end application and database servers. Website owners should explore ways to improve their TTFB. This includes using CDNs, optimizing your application code, optimizing database queries, and ensuring you have fast and responsive web servers. Start by measuring your TTFB with a tool like WebPageTest, as well as the TTFB of your competitors, to see how you need to improve.
While we have found that front-end web performance factors ("document complete" and "fully rendered" times) do not directly factor into search engine rankings, it would be a mistake to assume they are not important or that they don't affect search engine rankings in another way. At its core, front-end performance is focused on creating a fast, responsive, enjoyable user experience. There is literally a decade of research from usability experts and analysts on how web performance affects user experience. Fast websites have more visitors, who visit more pages, for longer periods of time, who come back more often, and who are more likely to purchase products or click ads. In short, faster websites make users happy, and happy users promote your website through linking and sharing. All of these things contribute to improving search engine rankings. If you'd like to see what specific front-end web performance problems you have, Zoompf's free web performance report is a great place to start.
As we have seen, back-end performance and TTFB directly correlate to search engine ranking. Front-end performance and metrics like "document complete" and "fully rendered" show no correlation with search engine rank. It is possible that the effects are too small to detect relative to all the other ranking factors. However, as we have explained, front-end performance directly impacts the user experience, and a good user experience facilitates the type of linking and sharing behavior which does improve search engine rankings. If you care about your search engine rankings, and the experience of your users, you should be improving both the front-end and back-end performance of your website. In our next blog post, we will discuss simple ways to optimize the performance of both the front and back ends of a website.
The 100 Best Free SEO Tools & Resources for Every Challenge – Interactive
Posted by Cyrus-Shepard
At Moz, we love using premium SEO tools (especially our own). Paid tools are essential when you need advanced features, increased limits, historical data, or online support.
For other tasks, a free tool does the trick.
Below you'll find an interactive list of the 100 best completely free tools, tools with both free and paid options, and free trials. Simply select the checkbox for the area you're working in, and view the tools for that category.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51eb500d730a11.54516003.jpg)
Free Tools
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f70e5e5e4130.25768456.jpg)
1. Anchor Text Over Optimization Tool
http://www.removeem.com/ratios.php
Link Research, Technical SEO
Worried about Google's Penguin algorithm hitting you for over-optimized anchor text? Simply type in your URL for a full report of which links might raise flags.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f70f7187d1f2.65457106.jpg)
2. Bing Webmaster Tools
http://www.bing.com/toolbox/webmaster
Tools Suite, Diagnostic
Similar in function to Google Webmaster Tools, Bing offers a suite of interesting research tools and resources for webmasters.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f7111a7d6733.37015896.jpg)
3. Bitly
Social, Analytics
Most people use Bitly for URL shortening, but the real power of this platform comes from its analytics.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f7264a39e166.51112014.jpg)
4. Boomerang
http://www.boomeranggmail.com/
Email, Productivity
Boomerang lets you follow up on emails, even when you forget. Great for link building or any time you send a lot of emails.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f7265d9099f7.73570898.jpg)
5. Buffer
Social
Optimize your online social media sharing. Buffer allows you to share with your audience at the optimal times for greater visibility.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f7266833bd03.51629459.jpg)
6. BuiltWith
Competitive Intelligence
Use BuiltWith to discover what technology nearly any website was, well, built with. Great for competitive intelligence as well.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f72674a0db66.44190123.jpg)
7. Buzzstream Tools Suite
http://tools.buzzstream.com/link-building
Link Building, Tools Suite, Email
Most people know Buzzstream as an outreach platform, but they also offer a number of free link-building tools. This company gets it.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f726c5823539.63548420.jpg)
8. Caption Tube
http://captiontube.appspot.com/
Video
Free and easy resource used to create captions for YouTube. Helps with usability and offers viewers a readable transcript.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f726d5635fd1.26564975.jpg)
9. CircleCount
Social, Analytics
Google+ analytics ramped up. Free resource to track your followers and analyze your shares. See how many followers you've gained over time.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f729f16970a9.71484024.jpg)
10. Content Strategy Generator Tool
http://seogadget.com/content-strategy-generator-tool-v2-update/
Content
This tool from SEOgadget helps you plan your content strategy intelligently, using keyword research and estimating your audience size.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f729fe50ebf8.06895977.jpg)
11. Convert Word Documents to Clean HTML
Content, Productivity
Despite the rise of Google Docs, Word still dominates much of the world. Copying and pasting has always been a hurdle, but this tool makes it easy.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f72d35bd77f6.54613551.jpg)
12. Copyscape
Content
Copyscape serves both as a plagiarism checker and a duplicate-content checker. Great to use if your content has been distributed across the web.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f72a282e3fe0.86753929.jpg)
13. Domain Hunter Plus
Link Building
This magic extension for Chrome not only helps you find important broken links, but also tells you if the links point to an available domain.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f72a336b5b88.77441128.jpg)
14. Easel.ly
Infographics
Free tools for creating and sharing infographics. The templates allow anyone to create a professional-looking graphic.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73e6fb07b49.33572404.jpg)
15. Email Format
Email, Productivity
Email Format helps you find the proper structure for thousands of companies and organizations across the web.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f72a4c489164.27841518.jpg)
16. FindPeopleonPlus
http://www.findpeopleonplus.com/
Social
The ultimate Google+ directory that's great for research, outreach, and link building. Sort by keywords, profession, country, and more.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f72d77065c75.06443654.jpg)
17. Frobee Robots.txt Checker
http://www.frobee.com/robots-txt-check
Robots.txt, Technical SEO
Many robots.txt files contain hidden errors not easily visible to humans. Run your file through this tool and you never know what you'll discover.
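As a programmatic complement to a visual checker like this one, Python's standard library can parse a robots.txt file and answer fetch questions directly. The rules below are a made-up example file, not from any real site:

```python
from urllib import robotparser

# Made-up example robots.txt. Note: Python's parser applies rules in
# file order, while Google matches by longest path, so edge cases
# can differ between checkers.
rules = """\
User-agent: *
Disallow: /private/
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/private/report.html"))  # False
print(rp.can_fetch("*", "https://example.com/public/index.html"))    # True
```

A quick script like this lets you regression-test your robots.txt against a list of URLs you know should (or should not) be crawlable.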
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f750b0545235.88228668.jpg)
18. GetListed
Local, Moz
This awesome local SEO tool scores your local SEO visibility and gives you actionable next steps to raise your score.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f72a6d1d2aa7.58163455.jpg)
19. Google Keyword Planner
http://adwords.google.com/keywordplanner
Keyword Research
The tool to replace Google's popular keyword tool has been derided by some, but still offers data not available anywhere else.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f72a78788fe4.90751383.jpg)
20. Google Analytics
http://www.google.com/analytics/
Analytics
The most popular of all the analytics tools available, Google Analytics continually innovates and sets the standard.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f72a82e267a7.58556851.jpg)
21. Google Analytics API
https://support.google.com/analytics/answer/1008004?hl=en&ref_topic=1008008
API, Analytics
The Google Analytics API is great for building custom reports and tools, and also for pulling data straight into Excel or Google Docs.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f72aa3af0e89.88581356.jpg)
22. Google Map Maker
http://www.google.com/mapmaker
Local
Among other things, Google Map Maker allows you to contribute to public map information, which may be shared and incorporated into Google Maps.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f72aae34fe40.63009617.jpg)
23. Google PageSpeed Insights
https://developers.google.com/speed/pagespeed/insights
Speed
Tools, data, and insights to improve your page speed. Page speed is correlated with better rankings and user engagement, so this matters.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f72f980ad0c9.08962798.jpg)
24. Google Public Data
http://www.google.com/publicdata/directory
Content
Drawing on vast public databases, Google public data offers a great starting point for content research, infographics, and more.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f72fa711c0b8.89751972.jpg)
25. Google SERP Snippet Optimization Tool
http://www.seomofo.com/snippet-optimizer.html
Technical SEO, CRO
That SEO Mofo! Use this tool to see how your snippet may appear in Google's search results. Add structured data, review stars, and more.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f72fb27569a7.79746990.jpg)
26. Google Structured Data Testing Tool
http://www.google.com/webmasters/tools/richsnippets
Structured Data, Technical SEO
If you use Schema.org microformats or any other type of structured data, this tool will verify your markup.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f72fc6062906.09373239.jpg)
27. Google Trends
Keyword Research
See what's trending in Google search results and view keyword search popularity over time. A must for trends.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f72fd051d546.74392457.jpg)
28. Google Webmaster
http://www.google.com/webmasters/
Tools Suite, Diagnostic
The interface recently received an overhaul, and Google Webmaster remains a must-have resource of diagnostic and health tools for site owners.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f72fd9def088.67187997.jpg)
29. IFTTT
Productivity
IFTTT stands for If This, Then That. The tool allows you to create automatic triggers between various apps, like Gmail and Twitter.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f72fe537b1d4.51481516.jpg)
30. Infogr.am
Infographics
A great free Infographics resource that allows you to easily create graphics and data visualizations.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f72ff9bad002.76117527.jpg)
31. Internet Marketing Ninjas SEO Tools
http://www.internetmarketingninjas.com/tools/
Tools Suite
The Ninjas are some of the best SEOs and online marketers out there, and they've put some of their best tools online for free.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f7300427ebd5.88496913.jpg)
32. Linkstant
Link Building
This nifty analytics tool alerts you anytime someone links to your website. Great for outreach and intelligence gathering.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73010788250.95984012.jpg)
33. Linksy.me Email Guesser
Email, Link Building
Need to send an email, but you don't have the recipient's address? Type in what you know and this nifty tool will help you figure it out.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f737cc654439.87196888.jpg)
34. MailTester.com
Need to send an email to an untested address, but you don't want to spam them? Check it first with this mail tester to verify.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f750cf49ed82.54056642.jpg)
35. MozCast
SERP Tracking, Moz
Want to know if Google is testing its algorithm this week? MozCast gives you a daily weather report to track changes in the SERPs.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73335cd6152.34992045.jpg)
36. MyBlogGuest
Link Building, Content
Guest blogging is still alive and thriving. MyBlogGuest helps you find the good opportunities out there.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73345de4461.44360842.jpg)
37. Panguin Tool
http://www.barracuda-digital.co.uk/panguin-tool/
Analytics
This awesome tool connects with your Google Analytics account to help you see if and when you've been hit by Google Algorithm updates.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f7335130ea14.67741134.jpg)
38. Pingdom
Speed
Pingdom offers an entire suite of speed tools to help analyze page load, DNS issues, and connectivity.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f7335a81a3a3.19113200.jpg)
39. Piwik
Analytics
Piwik is a lightweight web analytics solution, and a great alternative to Google Analytics.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f7336f225fe2.77475069.jpg)
40. Rank Checker for Firefox
http://tools.seobook.com/firefox/rank-checker/
Rank Tracking
This light and easy desktop tool checks rankings with the click of a button. Quick, easy and free.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f7337c974db4.39400496.jpg)
41. Rapportive
Email, Link Building, Productivity
Rapportive works with your Gmail inbox to give you near-instant rich contact information for almost everyone you want to reach. A must-have for marketers.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73387971de5.90961608.jpg)
42. Remove Duplicate Items
http://ontolo.com/tools-remove-duplicates
Productivity
Ontolo offers a suite of link building software and a few helpful productivity tools for link builders. The remove duplicates tool solves a common problem.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f733973b99f1.16590203.jpg)
43. Robots.txt Checker
http://tool.motoricerca.info/robots-checker.phtml
Robots.txt, Technical SEO
Use robots best practices and discover hidden errors in your robots.txt files that may cause search engine crawling problems.
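If you'd rather sanity-check rules programmatically, Python's standard library ships a robots.txt parser that catches the same class of mistakes. A minimal sketch — the rules and paths here are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: block /private/ for all crawlers.
rules = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "/blog/post"))   # True  — allowed
print(rp.can_fetch("Googlebot", "/private/x"))   # False — blocked
```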
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f733a2162035.80986374.jpg)
44. Schema Creator
Structured Data, Technical SEO
Everyone loves using Schema.org, but the microformats are difficult to write by hand. This generator from the folks at Raven simplifies the task.
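For a sense of what such a generator produces, here is a small hand-written schema.org microdata fragment for a hypothetical review — the kind of markup the tool writes for you:

```html
<!-- Hand-written schema.org microdata for a hypothetical review -->
<div itemscope itemtype="http://schema.org/Review">
  <span itemprop="name">Widget Review</span>
  by <span itemprop="author">Jane Doe</span>
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    Rating: <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span>
  </div>
</div>
```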
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f733ac9be794.53961035.jpg)
45. Scraper for Chrome
https://chrome.google.com/webstore/detail/scraper/
Productivity
If you've never scraped a webpage, you're missing out. Scraper for Chrome puts the power of simple web scraping in your hands without the need for code.
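Under the hood, scraping is just parsing HTML and pulling out the bits you care about — the same extraction Scraper does point-and-click. A minimal sketch with Python's standard-library parser (the markup is made up):

```python
from html.parser import HTMLParser

class LinkScraper(HTMLParser):
    """Collect every href value from a page's anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

scraper = LinkScraper()
scraper.feed('<p><a href="/a">A</a> and <a href="/b">B</a></p>')
print(scraper.links)  # ['/a', '/b']
```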
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f733b6c78e64.96528771.jpg)
46. Seer Toolbox
http://www.seerinteractive.com/seo-toolbox/
Tools Suite, Analytics, Link Research
SEER opened up its internal toolbox for everyone in the world to use. These are the same tools used in-house at SEER, and they rock.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f733ca2b0bd3.82220606.jpg)
47. SEO Toolbar
http://tools.seobook.com/seo-toolbar/
Tools Suite, Toolbar, Technical SEO
One of the most popular tools available, the SEO Toolbar puts a ton of information at your fingertips, including backlinks and competitive research.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f733d5c81b99.62120706.jpg)
48. SEO Tools for Excel
http://nielsbosma.se/projects/seotools/
Tools Suite, Analytics, Social
You don't need to be an Excel ninja to use Niels Bosma's SEO Tools for Excel. This plugin does so many things that many SEOs won't work without it.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f733eed10b32.29855918.jpg)
49. SEOgadget Links API
API, Link Research
The SEOgadget Links API lets you easily gather not only backlink data but contact information as well. A huge time saver.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f733fa294059.22269334.jpg)
50. SEOgadget Tools
Tools Suite
This suite of tools from the Gadget lab includes several Excel plugins, a content strategy generator, and more.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73406c6e431.44731562.jpg)
51. SEOQuake
Toolbar, Tools Suite, Technical SEO
More raw data than any other SEO toolbar out there.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73fe3bb3061.26646965.jpg)
52. SharedCount
Social, Analytics
Want to know how any piece of content was shared socially across the major services? This is the tool to use.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f737f0277608.91212417.jpg)
53. SharedCount API
http://www.sharedcount.com/documentation.php
API, Social
Harnessing the combined statistics of Google+, Twitter, Facebook, and more, the SharedCount API puts a ton of social data at your fingertips.
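A rough sketch of how you might call it from Python. Note the endpoint and response fields shown reflect the API as documented at the time and may have changed, and the sample response shape is illustrative only:

```python
import json
import urllib.request

# Endpoint as documented at the time; the service may have changed since.
API = "http://api.sharedcount.com/?url={url}"

def total_shares(data):
    """Sum every numeric count in a SharedCount-style JSON response."""
    total = 0
    for value in data.values():
        if isinstance(value, bool):
            continue
        if isinstance(value, (int, float)):
            total += int(value)
        elif isinstance(value, dict):  # e.g. Facebook returns a nested breakdown
            total += total_shares(value)
    return total

def fetch_counts(page_url):
    """Fetch raw counts for a page (requires network access)."""
    with urllib.request.urlopen(API.format(url=page_url)) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Illustrative response shape only — real field names may differ:
sample = {"Twitter": 120, "GooglePlusOne": 35,
          "Facebook": {"like_count": 50, "share_count": 80}}
print(total_shares(sample))  # 285
```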
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f737fbe8b254.52331119.jpg)
54. Similar Page Checker
http://www.webconfs.com/similar-page-checker.php
Content, Technical SEO
Use this tool to check for duplicate content issues. The Similar Page Checker will give you a score of how closely the HTML of two pages resemble each other.
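A crude stand-in for the same idea, using Python's difflib to score how much of two HTML documents match (the pages here are made up):

```python
import difflib

def similarity(html_a, html_b):
    """Rough duplicate-content score: percentage of matching characters, 0-100."""
    return round(difflib.SequenceMatcher(None, html_a, html_b).ratio() * 100)

page_a = "<html><body><h1>Blue Widgets</h1><p>Our widgets are blue.</p></body></html>"
page_b = "<html><body><h1>Red Widgets</h1><p>Our widgets are red.</p></body></html>"

print(similarity(page_a, page_a))  # 100 — identical pages
print(similarity(page_a, page_b))  # high, but below 100 — near-duplicates
```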
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f738080ce853.87307535.jpg)
55. Sitemap Generators
http://code.google.com/p/sitemap-generators/wiki/SitemapGenerators
Sitemaps
Google offers a slew of free, top-notch sitemap generators. Most of these live on your server and generate new sitemaps automatically.
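If you'd rather roll your own for a handful of URLs, a minimal sitemap per the sitemaps.org protocol is only a few lines (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap per the sitemaps.org protocol."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["http://example.com/", "http://example.com/about"])
print(xml)
```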
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f750e239b8b7.68291210.jpg)
56. Social Authority API
https://followerwonk.com/social-authority
API, Social
How much reach and social authority do your followers have? How about the people you're trying to connect with? The free Social Authority API will tell you.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f738907c2b56.93069572.jpg)
57. Social Crawlytics
Social, Analytics
Social Crawlytics allows you to conduct competitive research by showing you your competitors' most-shared content. Lots of other features as well.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f738aba5dd21.93857214.jpg)
58. Social Mention
Social
Social Mention offers real-time social media search and analysis. Enter a search term and see who's sharing what, right now.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f738b7475303.44530850.jpg)
59. Text Cleaner
Content
Some of the best tools solve the simplest problems. Text Cleaner cleans up all kinds of text formatting when you copy and paste between applications.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f738c3caece1.09558498.jpg)
60. Ubersuggest
Keyword Research
Every SEO loves Ubersuggest for its ease of use and wealth of keyword research ideas. Utilizing the power of Google Suggest, it returns hundreds of potential results.
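Tools like Ubersuggest typically work by fanning a seed keyword out across the alphabet and querying the suggest endpoint for each variant. A hedged sketch — the endpoint shown is the unofficial one such tools queried at the time and may change without notice:

```python
import json
import string
import urllib.parse
import urllib.request

# Unofficial suggest endpoint that Ubersuggest-style tools queried at the
# time of writing; it is undocumented and may change without notice.
SUGGEST = "http://suggestqueries.google.com/complete/search?client=firefox&q={q}"

def expand_seed(seed):
    """Ubersuggest-style expansion: the seed keyword plus each letter a-z."""
    return [seed] + ["{} {}".format(seed, c) for c in string.ascii_lowercase]

def parse_suggestions(payload):
    """The response body is a JSON array: [query, [suggestion, ...]]."""
    return json.loads(payload)[1]

def fetch_suggestions(query):
    """Fetch live suggestions for one query (requires network access)."""
    url = SUGGEST.format(q=urllib.parse.quote(query))
    with urllib.request.urlopen(url) as resp:
        return parse_suggestions(resp.read().decode("utf-8"))

print(expand_seed("link building")[:3])
# ['link building', 'link building a', 'link building b']
```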
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f738cecde7b2.76017453.jpg)
61. URI Valet
Technical SEO
A great tool for digging into server headers and canonical information, analyzing redirect problems, and more.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f738e8890a91.09085778.jpg)
62. Virante SEO Tools
http://www.virante.org/seo-tools
Tools Suite
Virante offers a number of high-quality SEO tools to the public. These are often the same tools developed for the Virante team, opened up for public use.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f738fa4a83f1.94517652.jpg)
63. Wayback Machine
http://archive.org/web/web.php
Competitive Intelligence
Want to see the history of your website or your competitor's site? The Wayback Machine allows you to step back in time and track important changes.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f7391e562f62.93304959.jpg)
64. WebPagetest
Speed
Quick and easy website speed tool. Offers suggestions for improving performance.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73dd6442738.30293773.jpg)
65. Wordle
Content
Create beautiful word clouds. Great for visualizations, graphics, and research.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73de14d6c26.66256461.jpg)
66. Wordstream Free Keyword Tools
http://www.wordstream.com/free-keyword-tools
Keyword Research, Tools Suite
In addition to its paid offerings, Wordstream offers a suite of free keyword tools offering access to thousands of keyword suggestions.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73deb505572.01576496.jpg)
67. Xenu's Link Sleuth
http://home.snafu.de/tilman/xenulink.html
Diagnostic, Technical SEO
Winner of the ugliest-SEO-tool-on-the-planet award, Xenu is also one of the most useful. Crawl entire sites, find broken links, create sitemaps, and more.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73e0409bff7.87469057.jpg)
68. XML-Sitemaps.com
Sitemaps
XML-Sitemaps offers probably the easiest sitemap creation solution anywhere. Great for smaller sites when you need a sitemap in minutes.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73e0e1fe593.58173783.jpg)
69. Yahoo Pipes
Content, Productivity
A great mashup tool that combines different feeds into content and other magical creations. Used for link building and whatever you can dream of.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73e34b335f5.22756192.jpg)
70. Yoast WordPress SEO Plugin
http://yoast.com/wordpress/seo/
Technical SEO
If you could only choose one WordPress plugin for your site, the first would be from Yoast, and so would the second. This one sets the standard.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73e3f4fc6e2.35127739.jpg)
71. YouTube Analytics
https://www.youtube.com/analytics
Video, Analytics
Offers video-specific analytics for YouTube videos. A must-have for YouTube video publishers.
Free and Paid
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73e4c1c4d34.26160837.jpg)
72. Ahrefs
Link Research, Link Building
One of the more popular link research tools, Ahrefs offers a large index and nice anchor text distribution charts. Mostly a paid tool, but they offer some free data.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73e5854ab69.09204056.jpg)
73. Banana Tag
Banana Tag allows you to track your emails after you send them. For example, check your email open rates from Gmail.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73e63eef0d6.97088537.jpg)
74. CloudFlare
Speed
How do they make CloudFlare free? It works both as a CDN and a security service to provide your website with speed and safety.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f750f4b388f0.88516778.jpg)
75. Followerwonk
Social, Analytics, Moz
Perhaps the coolest thing about Followerwonk is the ability to track your followers. Smart SEOs also use it for outreach and research.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73e7e4d8e15.53711730.jpg)
76. Keyword Eye
Keyword Research
Keyword Eye adds a twist to keyword research by adding rich visualizations — essential when you want to move beyond keywords to valuable concepts.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73e92035580.87390008.jpg)
77. KnowEm
Social
KnowEm allows you to check hundreds of social profiles at once for name availability. Looking for the perfect brand name? Check KnowEm first.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73e9b35bfd2.77859194.jpg)
78. Majestic SEO
Link Research, Competitive Intelligence, Link Building
You've probably seen Majestic SEO link charts all over the Internet. Great crawling technology combined with several free options make for great link research.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73ea7798630.80643528.jpg)
79. Majestic SEO API
http://blog.majesticseo.com/general/majestic-seo-api-now-explained/
API, Link Research
Majestic makes much of its backlink data available for free via its API.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f750ff2316a4.47876646.jpg)
80. MozBar
http://moz.com/tools/seo-toolbar
Tools Suite, Toolbar, Moz
The standard SEO toolbar for legions of marketers, the MozBar allows you to perform over 50 key tasks right from your browser. Highly recommended.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f751093a4489.56380894.jpg)
81. Mozscape API
API, Link Research, Moz
Companies everywhere incorporate the Mozscape API into their own products, but it's also available to individuals, and much of the data is free.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73eb59df0b1.80577034.jpg)
82. nTopic
Content
nTopic is one of the few proven methods for giving your content a relevancy score and offering keyword suggestions to improve it.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f75119356867.81823633.jpg)
83. Open Site Explorer
http://www.opensiteexplorer.org/
Link Research, Moz, Competitive Intelligence, Link Building
When Google and Yahoo started removing backlink data from the public, Moz built Open Site Explorer to fill a huge need. See backlinks, anchor text, popularity metrics and more.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73ec152a7f7.39812683.jpg)
84. Piktochart
Infographics
A cute and easy infographic generator. No experience required.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73f661a6254.04080941.jpg)
85. RowFeeder
Social, Analytics
RowFeeder allows you to track social usernames, hashtags and keywords and load that information into Excel for easy social media monitoring.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73f7201bf05.49407017.jpg)
86. Screaming Frog
http://www.screamingfrog.co.uk/
Diagnostic, Technical SEO
A powerful website crawling tool with a ton of features and customizations. A must-have for most serious SEOs.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f73f89d8a0a9.02903647.jpg)
87. Searchmetrics Visibility Charts
http://suite.searchmetrics.com/en/research
SERP Tracking, Competitive Intelligence
Track the search visibility of any website, in addition to tracking winners and losers in Google's search results.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f7421501fff3.09288653.jpg)
88. SEMrush
Tools Suite, Keyword Research, Competitive Intelligence
The paid and organic keyword data offered by SEMrush is often scary good and comprehensive. Also great for researching competitors' ads.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f74228781428.21314925.jpg)
89. SERPmetrics
SERP Tracking, Competitive Intelligence
SERPmetrics flux charts track the flux for US search results across Yahoo, Bing and Google over a 30-day period. A paid API is also available.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f742333f4e38.48073986.jpg)
90. SimilarWeb
Competitive Intelligence
Impressive competitive intelligence across a number of online industries. Competitor website stats are hard to come by, but SimilarWeb does a good job.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f7423e473752.98232353.jpg)
91. StatCounter
Analytics
Free, quick, and lightweight analytics solution. Often used by those who want to avoid using Google Analytics for privacy reasons.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f74249110fd6.78465968.jpg)
92. Trello
Productivity
Project management and tracking made simple. Used and endorsed by Moz.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f74253a21089.12556863.jpg)
93. Whitespark Local Citation Finder
https://www.whitespark.ca/local-citation-finder/
Local
Finding local citations is key to local SEO. Whitespark offers a number of free and paid solutions to find the local citations to rise above the competition.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f7426538c064.58755806.jpg)
94. Whois Lookup
Competitive Intelligence
Find registration, contact, and administrative information for any domain.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f742d9ab44d4.79516851.jpg)
95. Wistia
Video
The king of online video, Wistia offers SEO-friendly solutions for video hosting. Both free and low-cost options available.
Free Trials
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f7512b4eea78.37867927.jpg)
96. Moz Analytics
Tools Suite, Diagnostic, Moz, Rank Tracking, Social
The flagship of the Moz software suite, Moz Analytics offers a dashboard of all your important marketing data in one place with actionable analytics for better marketing.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f742eb6c1cd1.94917088.jpg)
97. Optimizely
A/B Testing, CRO
Easy A/B testing and analytics to help you move toward success in your CRO efforts.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f742f54c1228.99949035.jpg)
98. Raven
Tools Suite, Diagnostic, Content, Social
Raven offers a classic suite of SEO, content, and research tools popular with many marketers.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f7430128b599.66107216.jpg)
99. Visual Website Optimizer
https://visualwebsiteoptimizer.com/
A/B Testing, CRO
Visual Website Optimizer allows you to run A/B tests with a simple online editor that lets you test content without knowing code.
![](http://d1avok0lzls2w.cloudfront.net/uploads/blog/51f7430ab69e72.88076231.jpg)
100. Wordtracker
Keyword Research
A powerful keyword research suite used by many top marketers, Wordtracker offers a generous free trial option.
What's your favorite free tool?
Narrowing a list down to the 100 best SEO tools and resources is no easy task. Although I visited hundreds of webpages to compile this list, these four resources offered particular value:
- Annie Cushing's Must-Have Tools for Marketers
- The Tools page at Inbound.org
- Dr. Pete's APIs for Data-Driven Marketers
- Free SEO Tools - A Curated List
The format for this ultimate interactive post was inspired by Jon Cooper's complete list of link building strategies. You should check it out — it's a great post.
Despite researching hundreds of tools, a few great ones didn't make the list. What's your favorite free SEO tool? Let us know in the comments below.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
Announcing Moz Academy!
Posted by Nick_Sayers
We’re stoked to announce Moz Academy!
Have you ever wanted a resource to learn inbound marketing or a place your team can reference marketing best practices? Well, we hope you do a backflip over Moz Academy. If you have a Moz Subscription, check it out now!
Subscription-based content
At Moz we produce a wealth of free content in the blog, our guides, Q&A, and pretty much everywhere on the site. We want to do something special for Moz subscribers by transforming our free content and reinventing it for Moz Academy. You could probably scour the Moz Blog and other websites to obtain the information in Moz Academy, but we think having it easily digestible and all in one place is a huge win for Moz subscribers. Moz is excited to add the simplicity and power of Moz Academy to the list of Moz subscription benefits.
Why create an inbound marketing school?
Moz is extremely passionate about educating our community. In fact, our entire business started as a blog where people could learn about SEO. Moz Academy gives subscribers the power to be better marketers, which will enable them to use our products in more depth and with greater confidence. We want to provide a hub of marketing knowledge that will create a stronger community where people can teach each other while using the Academy as a frame of reference. One could say that Moz Academy is the Mr. Miyagi of inbound marketing. The key to this project is empowering you to kick even more butt than you already do!
We hope Moz Academy turns into the one-stop-shop for inbound knowledge for Moz subscribers. Everyone on the team is committed to continually refreshing content and adding new lessons. Again, we really want this to be the easiest and most comprehensive place to learn internet marketing on the web.
Furthermore, we’ve designed each lesson with empathy in mind; they will be easily digestible and considerate of your time. That means you can drop in whenever you like and have comfortable breakpoints if your brain is exploding with inbound marketing knowledge.
Wait, how do I use Moz Academy?
Moz Academy is easy to use! Check out these six simple steps:
Step 1: Log into your Moz account.
Step 2: Go to moz.com/academy.
Step 3: Look through the lessons.
Step 4: Click a lesson you find interesting.
Step 5: Enjoy a video and/or read the lesson below it!
Step 6: Crane kick.
What lessons do you have right now?
We're starting with the following lessons:
- Inbound Marketing
- SEO
- Link Building
- Social Media
- Content Marketing
We plan to add a lot more! Look for lessons on local SEO, community management, video marketing, email marketing and web analytics. Yup, it's going to be pretty sweet!
Well, Moz, what's next for Moz Academy?
The future of Moz Academy really depends on how everyone uses it. In the next few months, we want to create a good foundation for beginners and subsequently build up to intermediate-level content. Eventually, we'd like to have sections for beginner, intermediate, and advanced lessons. Keep your eyes peeled, because we’ll be releasing a lot of new stuff! Some of our longer-term goals for Moz Academy are to have interactive quizzes and some sort of gamification. Yes, we know you'd like to track your progress and unlock achievements. That way you can show off how awesome you are at Moz Academy!
Eventually, we want Moz Academy to look more like Treehouse and Code School’s online learning platforms. We have a long way to go, but are excited about the journey to get there. With your help and feedback, we can make Moz Academy something awesome. Thanks in advance, and enjoy!
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
SEO Finds in Your Server Logs, Part 2: Optimizing for Googlebot
Posted by timresnik
This is a follow-up to a post I wrote a few months ago that goes over some of the basics of why server log files are a critical part of your technical SEO toolkit. In this post, I provide more detail around formatting the data in Excel in order to find and analyze Googlebot crawl optimization opportunities.
Before digging into the logs, it’s important to understand the basics of how Googlebot crawls your site. There are three basic factors that Googlebot considers. First is which pages should be crawled. This is determined by factors such as the number of backlinks that point to a page, the internal link structure of the site, the number and strength of the internal links that point to that page, and other internal signals like sitemaps.
Next, Googlebot determines how many pages to crawl. This is commonly referred to as the "crawl budget." Factors that are most likely considered when allocating crawl budget are domain authority and trust, performance, load time, and clean crawl paths (Googlebot getting stuck in your endless faceted search loop costs them money). For much more detail on crawl budget, check out Ian Lurie’s post on the subject.
Finally, the rate of the crawl — how frequently Googlebot comes back — is determined by how often the site is updated, the domain authority, and the freshness of citations, social mentions, and links.
Now, let's take a look at how Googlebot is crawling Moz.com (NOTE: the data I am analyzing is from SEOmoz.org prior to our site migration to Moz.com. Several of the potential issues that I point out below are now solved. Wahoo!). The first step is getting the log data into a workable format. I explained in detail how to do this in my last server log post. However, this time make sure to include the parameters with the URLs so we can analyze funky crawl paths. Just make sure the box below is unchecked when importing your log file.
The first thing that we want to look at is where on the site Googlebot is spending its time and dedicating the most resources. Now that you have exported your log file to a .csv file, you’ll need to do a bit of formatting and cleaning of the data.
1. Save the file with an Excel extension, for example .xlsx
2. Remove all the columns except Page/File, Response Code, and User Agent. The result should look something like this (formatted as a table, which you can do by selecting your data and pressing Ctrl+L):
3. Isolate Googlebot from other spiders by creating a new column and writing a formula that searches for "Googlebot" in the cells of the third column.
4. Scrub the Page/File column for the top-level directory so we can later run a pivot table and see which sections Google is crawling the most.
5. Since we left the parameters on the URLs in order to check crawl paths, we’ll want to remove them here so that the data is included in the top-level directory analysis we do in the pivot table. The URL parameter always starts with "?", so that is the character to search for in Excel. This is a little tricky because Excel treats the question mark as a wildcard. To tell Excel the question mark is literal, precede it with a tilde, like this: "~?"
6. The data can now be analyzed in a pivot table (data > pivot table). The number associated with the directory is the total number of times Googlebot requested a file in the timeframe of the log, in this case a day.
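The same steps — isolate Googlebot, strip parameters, bucket by top-level directory, and count — can also be scripted instead of done by hand in Excel. A minimal stdlib-only sketch (the column layout and sample rows are hypothetical):

```python
from collections import Counter
from urllib.parse import urlparse

def top_level_directory(path):
    """'/qa/view?sort=new' -> '/qa/' (strip the parameter, keep the first segment)."""
    path = urlparse(path).path          # drops the '?...' parameter
    segments = [s for s in path.split("/") if s]
    return "/" + segments[0] + "/" if segments else "/"

def googlebot_crawl_counts(rows):
    """rows: (page, response_code, user_agent) tuples from the exported log."""
    return Counter(
        top_level_directory(page)
        for page, _, agent in rows
        if "Googlebot" in agent
    )

sample = [
    ("/qa/view?sort=new", "200", "Mozilla/5.0 (compatible; Googlebot/2.1)"),
    ("/qa/ask",           "200", "Mozilla/5.0 (compatible; Googlebot/2.1)"),
    ("/users/profile/1",  "200", "Mozilla/5.0 (compatible; Googlebot/2.1)"),
    ("/blog/post",        "200", "Mozilla/5.0 (some other browser)"),
]
print(googlebot_crawl_counts(sample))  # Counter({'/qa/': 2, '/users/': 1})
```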
Is Google allocating crawl budget properly? We can dive deeper into several different pieces of data here:
- Over 70% of Google's crawl budget focuses on three sections, while over 50% goes towards /qa/ and /users/. Moz should look at search referral data from Google Analytics to see how much organic search value these sections provide. If it is disproportionately low, crawl management tactics or on-page optimization improvements should be considered.
- Another potential insight from this data is that /page-strength/, a URL used for posting data for a Moz tool, is being crawled nearly 1,000 times. These crawls are most likely triggered from external links pointing to the results of the Moz tool. The recommendation would be to exclude this directory using robots.txt.
- On the other end of the spectrum, it is important to understand the directories that are rarely being crawled. Are there sections being under-crawled? Let’s look at a few of Moz’s:
In this example, the directory /webinars pops out as not getting enough Google attention. In fact, only the top directory is being crawled, while the actual Webinar content pages are being skipped.
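The /page-strength/ exclusion recommended above would be a one-line robots.txt addition — a hypothetical fragment, to be tested before deploying:

```
# Hypothetical robots.txt rule: keep crawlers out of the tool-results URL
User-agent: *
Disallow: /page-strength/
```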
These are just a few examples of crawl resource issues that can be found in server logs. A few additional issues to look for include:
- Are spiders crawling pages that are excluded by robots.txt?
- Are spiders crawling pages that should be excluded by robots.txt?
- Are certain sections consuming too much bandwidth? What is the ratio of the number of pages crawled in a section to the amount of bandwidth required?
As a bonus, I have done a screencast of the above process for formatting and analyzing the Googlebot crawl.
In my next post on analyzing log files, I will explain in more detail how to identify duplicate content and look for trends over time. Feel free to share your thoughts and questions in the comments below!
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
An Introduction to Integrated Marketing and SEO: How It Works and Why It Matters
Posted by StephanieChang
To say that the SEO industry has changed would be considered a massive understatement. In previous years, for a site to excel in the SERPs (search engine results page), it needed a few key important ingredients:
- A strong technical foundation, with a crawlable and clean information architecture (that also contained a clear internal linking structure)
- The strategic use of target keywords on the page and in the URLs
- Key links with targeted anchor text
Now, the rules have simply changed. Not only are the SERPs displayed differently depending on the user's specific search query (Dr. Pete's Mozcon presentation pointed out 85 distinct features in the SERPs, from knowledge graph to the related search carousel), but our day-to-day roles have changed. We're now supposed to be knowledgeable about UI/UX, branding, PR, responsive design, international considerations, content strategy/design/implementation, social media, structured data, local SEO, authorship markup, CRO, analytics... the list goes on and on. The reality is that it will always be important for marketers to have a high-level understanding of each of these different disciplines and how they should work together. However, it is impossible to be specialized in all of them. Many of the specialties above have been established industries for quite some time, and like SEO, they have improved and matured. In essence, we need to learn how SEO can integrate itself in a meaningful way with other marketing divisions, or in simpler terms, leverage integrated marketing.
Image courtesy of Mozcon
Why Integrated Marketing?
Integrated marketing is the strategy and implementation of leveraging and unifying different marketing activities. The overall purpose is to complement and reinforce the overall impact of each of these marketing methodologies, so that the marketing process is not only more consistent across different mediums, but also more effective in meeting marketing objectives and increasing a business's bottom line.
In terms of the industry, here are some statistics on overall digital marketing spend as compiled by Gartner in 2012.
- Companies in different industries spend an average of 2.5% of their annual revenue on digital marketing.
- Companies spend an average of 25% of their total marketing budget on digital marketing, specifically on these types of marketing activities:
As the image above demonstrates, companies spend, on average, 10.7% of their total digital marketing budget on search marketing (though I'd venture to guess that the vast majority of this percentage goes to paid search). However, when it comes to the activities that marketers view as most attributable to their marketing success, only 8-9% of all companies surveyed rated search marketing (including paid) in their top 3.
Images courtesy of Gartner
This perception of search marketing (much less SEO) directly impacts the amount of budget and, subsequently, head space we receive from companies for our work. Although SEOs are involved in many of the activities companies attribute to their marketing success (like content development, UX/UI of the site, and commerce experience), it can be challenging as a consultant or an in-house marketer to be involved in these types of conversations.
As an industry, we need to broaden our scope and find ways to immerse ourselves into these conversations. Like Wil Reynolds mentioned during his presentation at Mozcon, it's about knowing what to pitch and how to pitch.
- How can we demonstrate and provide value to a company's marketing activities and integrate SEO meaningfully into the process?
- The goal doesn't necessarily have to be for SEOs to become specialized experts in PR, branding, content, etc., but more focused on how we all can leverage our knowledge and provide value to these existing activities, while also integrating ourselves into discussions on overall marketing vision, strategy, and implementation.
- How can we stop viewing marketing as distinct channels and, instead, work with other marketing specializations to reinforce and complement all marketing activities/goals/KPIs?
As the online marketing industry continues to change, it becomes more vital for a company to have a consistent mission and vision across all marketing channels. The purpose of this post is not only to inspire us to think bigger about the direction of our industry, but also in our day-to-day work. I also want to showcase examples of other companies I've researched that have successfully leveraged multiple marketing channels to meet common goals.
Integrated Marketing Examples
PR, Social, and SEO
Being at Distilled has given me the great fortune of being exposed to individuals with specializations beyond SEO, such as PR. Distilled's previous PR/SEO specialist (now at Dynamo PR), Lexi Mills, and our current specialist, Jess Champion, have really inspired me to think about how to make a content piece more compelling to its target audience and the media. For instance, Lexi once shared how critical it is to have enough valuable resources on hand to enhance a piece of content or a story: when making a pitch, make sure you've created enough material for people to credit, material that enhances the value of the story. Once you've built that relationship with the media and they've credited appropriate, legitimate sources, you've essentially accomplished link building without realizing it (receiving links didn't become a primary focus; it became a consequence of achieving bigger goals). Lexi said that, "As a result, the links you may have attained don't just look natural; they truly are natural."
For example, Australia.com's "Best Job in the World" campaign was effective for multiple reasons. It took a different spin on a concept that could traditionally be seen as "boring" (jobs) and created a hook that received significant media attention. From an SEO standpoint, the team did a great job of putting some of the campaign on the actual Australia.com domain (even though the actual competition is on a subdomain), because so many PR campaigns are placed on a separate domain and are never mentioned on the actual company website. Not to mention dominating rankings for the keyword phrases "best job in the world" and "best jobs in the world" (an effective branding play).
From a social media perspective, the only way to apply for the position was via Facebook. As a result of the campaign, several media outlets provided links to both Australia.com and the "Best Job in the World" landing page. The site received 1,462 links from 442 linking root domains (including sites like ABC News, the Daily Mail, the Daily Mirror, etc.). From a social standpoint, the campaign has 483,534 likes and approximately 1,000 user interactions on every post.
Images courtesy of Australia.com
PR alongside SEO doesn't just apply to bigger organizations, but also to start-ups or any organization participating in crowdfunding. Also according to Lexi, when doing PR for a start-up or an organization running a crowdfunding campaign, it's important to make sure that the actual site is receiving link equity (and not just the crowdfunding site). This matters for maintaining the sanctity of the brand, because you still want to sell the product on your own website once the crowdfunding round is complete, and you always want to rank first in search engines for your product name. Hence, leveraging the PR surrounding your crowdfunding round will help get your potentially new site off to a great start.
Or, you can ultimately decide to crowdfund on the product's actual website and reap all the benefits from PR and media coverage directly for your site, as was the case with the Tile App.
Finally, as with many other online marketing channels, it's important to make decisions off the back of data. SEO and PR can support one another: SEOs and PRs can work together to determine the specific keywords they want to target for a campaign (both from a branding and from a search engine/user intent perspective) using tools like Google Trends and the Google AdWords Keyword Tool. We can also work together to establish the sites we want to target, both from a publicity and a link equity standpoint, using metrics such as DA and PA, as well as what types of credit we want to receive (dofollow link vs. image, etc.).
Overall, as SEOs, we also want to ensure that once a PR campaign is complete, the company can still reap long-term benefits from it, particularly from a technical standpoint (while researching this post, I observed countless PR campaign sites containing 3-5 duplicate home pages, non-indexable content in iframes, sites built in Flash, broken links, etc.). In addition, doing so will help ensure that the company will continue ranking for that campaign name in the future, instead of only the PR agency ranking for it (when they publish client case studies).
Offline (Events/Print Advertising/Billboards), Mobile, and SEO
Offline campaigns (like events, print advertising, and billboards) have historically been a powerful marketing medium. At the same time, it can be challenging to figure out how offline channels can work seamlessly with online ones. I found inspiration in this image of a recent American Express campaign seen in London (unfortunately it's a little blurry).
There's so much potential in a campaign like this. Having users search for an easy-to-remember keyword phrase on their mobile devices (in this case, having them search "AMEX Gold Tube") is another opportunity to gather data for a traditionally difficult-to-measure channel. Depending on the brand, it's an opportunity to measure traffic (and some of the keyword data that brought users to the site, with the notable exception of "not provided" and others), as well as some compelling mobile usage data (do your research beforehand, especially as it pertains to iOS 6 and Android 4 search traffic). It's also an opportunity to create a seamless offline-to-online interaction that could result in SERP dominance for specific, brand-based terms. And depending on the search term chosen, it could be an effective medium to immediately convert users from both a PPC and an SEO perspective. The biggest challenge and goal for an SEO is to ensure that the correct landing page for the specific keyword occupies the number one spot in the SERPs while also creating an ideal SERP landscape (alongside improving conversions for that specific landing page).
Another interesting offline trend that has become more and more popular is the emergence of pop-up stores. I found Debenhams' use of virtual pop-up stores particularly fascinating. Debenhams created a tour of London's most famous sites; once shoppers were in the correct location, they could use an app to "try on" different outfits via augmented reality technology, with famous London landscapes as a backdrop. Shoppers could then upload their favorite outfits and receive opinions via social media. If they chose to purchase any of the outfits, they'd automatically receive a 20% discount. Debenhams also implemented SEO best practices in a compelling way by leveraging the press to garner links to key category pages, such as in press releases, and by asking that any articles or media coverage mentioning the virtual pop-up stores give the company proper credit.
PPC, Branding, Content, and SEO
Snickers' ad agency put together an amazingly creative PPC campaign. With Google's help, they compiled a list of the top 500 most commonly misspelled words in search (Google AdWords usually corrects misspellings automatically, and deliberately targeting misspelled words is against the terms of service) and used an algorithm to generate 25,381 different misspellings. They used these terms to create a "You Are Not You When You're Hungry" campaign. Within two days - yes, two days - Snickers received 558,589 impressions with a stunningly high CTR of 1.05%. The three-day campaign resulted in 5,874 visitors to the site. The endearing video below explains the campaign in more detail.
Initially, the campaign was intended solely as a branding exercise and not necessarily designed to generate clicks. However, it's important to be aware that this specific campaign might not have succeeded on the back of the PPC campaign alone. The "You're Not You When You're Hungry" concept has been in the works since at least the 2010 Super Bowl. To build interest, teaser videos showcasing celebrities, notably Betty White, were released alongside PR outreach. The campaign also used print to showcase a variety of celebrities, display ads targeted to relevant audiences, an online video campaign, and social media to engage with the brand's Facebook community. It also appears that the campaign has, at some point, leveraged celebrity tweets, newspaper placements, and Snickers handouts, which resulted in 705,000 additional Snickers bars sold that year compared to the previous year, as well as double-digit growth in sales.
For instance, we can leverage PPC to test out relevant keywords and ad copy before we decide to invest significant resources into targeting them. In addition, at Distilled we strongly believe in the concept of testing. For many of our clients, developing creative content is oftentimes one of the most resource- and budget-intensive aspects of SEO. As a result, we want to be sensitive to the costs they are incurring. Thus, we've used PPC to test out different titles for our creative content pieces to determine which ones generated the highest CTR or the greatest number of conversions. Supplying clients with this data helps develop trust, and consequently builds more buy-in for our ongoing strategy.
Content, Branding, and SEO
Content is one of my greatest passions because I find telling compelling stories and helping my clients build a brand so personally fulfilling. In many ways, content and SEO work seamlessly together, especially in an era where so many individuals have developed the habit of researching information on their own using the Internet. For example, Adria Saracino and I have repeatedly found, whenever we conduct customer surveys, that many individuals decide to purchase a product based on what they read on the Internet. This means that in order to become successful at SEO (not just in the form of rankings, but in conversions), we need to ensure that we're consistently developing content that conveys trustworthiness, authority, and loyalty to our customers, while also remaining vigilant about our online reputation.
There have been so many amazing companies that create content for the benefit of their intended audience and subsequently reap the benefits of it, like the often-mentioned SurveyMonkey survey templates and MailChimp resource guides. However, not all amazing content is in written form. Sometimes content in image form is as effective, if not more so (especially if it's pertinent to your brand).
Take, for instance, Polyvore's vision "to capture the breadth of soft goods and people's tastes better than any other platform thanks to a unique, vibrant community of contributors and cutting edge technology" (more detail about the vision can be found here). Polyvore encapsulates its vision by providing its users with a platform to essentially create their own content using its editor (it's so simple, yet fits so seamlessly with the target audience and vision). The editor has generated 18,664 links from 237 linking root domains. However, Polyvore also created a tool that doesn't limit users to building collages out of products found on its site: you can drag, edit, and link any clip onto your collage using its clipper (and for SEOs, the clipper landing page has generated 18,646 links from 70 domains). Plus, from an SEO standpoint, all the tools are part of the domain, which is an added bonus. Oh, and if you really fall in love with your collage, you can purchase it immediately on the site (content merging with commerce...so many opportunities!).
In the competitive non-profit world, countless organizations clamor for the mind share and resources of the general public, all while facing limited budgets of their own. It's oftentimes difficult to know what type of content to put on a website to immediately and effectively establish an organization's credibility. Having previously been a part of the non-profit world, I was really impressed by the Robin Hood Foundation's website. All the content on the site speaks to their mission of fighting poverty in New York City, and they've carefully invested their resources in organizing and presenting the most relevant data in a clean, visually stimulating format that is incredibly easy for anyone to digest. It has also been effective - their Hurricane Sandy page has garnered links from MTV, Forbes, and Foursquare.
Finally, I was really inspired by Brittan Bright of iAcquire and her presentation at a Moz meetup last year, where she talked about her experience working on AXE's Susan Glenn campaign. Brittan worked closely with a few other large agencies, like Edelman and BBH, on a meme marketing campaign that integrated branding, online reputation management, social media, and SEO to ensure that the term "Susan Glenn" would come to mean the girl who got away but who remains untouchable for the guy who covets her. There were television commercials (see below), a separate website, and domination of the universal search results in the SERPs (with image snippets and video results).
In Conclusion
Going through this process is hard work, requires a collaborative effort between multiple marketing channels, and can often feel as if it takes a long time to accomplish anything. However, relaying a consistent message across all marketing channels and unifying the marketing vision for the company is incredibly powerful. That consistency reinforces the brand's trust and authority to potential consumers. Truly, our end goal as marketers, regardless of channel, remains the same: we're all here to support the organization's vision/mission/values, and to work hard to fulfill and grow the company's bottom line.
I'm extremely hopeful that this industry will continue to propel itself forward, continuously ask itself the right questions (the why's and what's the big picture), and really push ourselves to think outside the box. Only then are we in a position to effect change.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
The Key to Empowering your Marketing Team – Whiteboard Friday
Posted by randfish
What holds marketing teams back from accomplishing great things? In today's Whiteboard Friday, Rand tackles the big challenges many internal marketing teams face, and outlines a way to bring structure and empowerment back to your marketers.
Have something to add? Leave your thoughts and questions in the comments below!
For reference, here's a still of this week's whiteboard.
Video Transcription
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. Today I'm going to be talking to you a little bit about what holds marketing teams back from being able to accomplish great things inside of companies, and for external marketing teams that are on an agency or consulting basis, but really oftentimes internally.
So this, I've got here my six little friends. This one, this guy is kind of awkward. His back is a little out of whack. But that's okay. He's just a stick figure. He's probably feeling just fine.
The challenge for these guys is that they constantly need their work reviewed. They're kind of in the weeds, in the trenches doing marketing activities, building content, trying to get that content shared and linked to, trying to earn rankings and traffic, trying to buy advertising, trying to influence the website and the marketing materials, make the conversion rate higher, do all these things to promote the marketing funnel improving. Yet they're constantly changing course, sometimes daily, sometimes even hourly. Boss comes in, it's sort of like, "No, no, no, don't do that anymore. Focus on this thing. No, wait, I know I told you to do that, but we don't need that anymore. We need this other thing."
They're not empowered to make decisions, not even about their own work. They really have to get constantly reviewed. Someone comes and gives them feedback on everything they do. I've been this marketer myself before. Especially as a consultant, you're oftentimes in this position. You don't have that empowerment to make great decisions.
But there's a way to fix this, and it's an architecture I want to share with you that's been really powerful for me and for a number of other companies that have adopted this and that have shared it too. So the idea is basically that what we want to do is we want to take all the things that the company wants to accomplish today, in the future, in the far, far flung future, and we want to connect that all the way down to what the marketing team is actually working on today, right now. But it takes a little bit of work, and it takes a lot of transparency, and it takes some thinking. If you don't have this architecture yet, you should give it a try. Let me show you what I'm talking about.
A big company vision is a great starting point. I know many small and medium businesses don't even really have a great big company vision. But if you can imagine one, if you can put one on there, "We want to be Cleveland, Ohio's best marketing agency, and we define best as our clients are the happiest, we have the most clients, and we have the highest revenues in the city." Okay, great, now you've got a company vision. Moz's vision, for example, is to help people do better marketing. Tesla's vision is to transform how the world is transported. NASA has an organizational vision to explore space. So you can get a company vision.
So let's say it is, "Help people do better marketing." From that flows things that you're going to do over the next few years. It could be five years, it could be just two or three years, but the mission that you have. I'm going to go back to Tesla again because I love Tesla's five-year mission. Tesla's five-year mission is to "Power the transformation from gas to electric vehicles and to become the world's leading car company by doing that." So become the world's leading company by powering the transformation from gas to electric.
Okay. Then, based on that mission, that thing that you want to accomplish over the next few years, you have a BHAG. A BHAG is Big Hairy Audacious Goal. I know it sounds a little funny, but this acronym is actually quite important, and so are all the letters in there. Big because you want it to be hard to achieve. My favorite thing that people say about a BHAG is,
"It's out of reach, but not out of sight." A goal that is out of reach, I can't see us accomplishing it today. My God, it's almost hard to imagine that we accomplished it, but not completely out of sight. So perhaps Tesla would say that their BHAG is to be the world's number one auto manufacturer in ten years or in five years. That means that they have to build so many cars and sell so many cars that they are the world's leading car company through number of cars on the road. For Moz, our BHAG is one million people subscribing to our platform. For your Cleveland, Ohio consulting agency, it might be successfully keeping and maintaining 100 paying customers at $5,000 a month or more for a full year, nonstop. Whatever it is, it has to be definable, easily definable, easily measurable, and powerful, something that people can get behind.
I'll go back to NASA again. That moon mission that they had, in the 1960s NASA had the moon mission and the BHAG for the moon mission was, "Put a man on the surface of the moon and return him safely to the earth." Super measurable, super definable, incredibly powerful to get behind. If you're doing marketing for that, you can see that big vision and that big goal very clearly. Then from there, from these two, I'm going to take our mission and our BHAG, and I'm going to define a list of strategic goals, things we need to accomplish in order to get these things done. But they're going to be things that we do over the next 6 to 12 months, just 6 to 12 months, just the next little while. This is really powerful because those strategic goals should flow down to everything else that the company does.
So if, for example, I say, "Hey, in order to sell more cars, Tesla needs to open Tesla dealerships in 500 cities over the next 12 months, and here's the list of cities." Okay, that's a strategic goal. Now we've got to go get that done. We need to figure out people who know how to open stores and people who know about real estate, and we need to have a bunch of investment dollars that we can put it in these things. We need to figure out how long it is before we open a dealership before that actually turns into sales for us. We need to hire all the salespeople. We need to build a process for that. Huge list of things that come from those, but the strategic goal is very simple. "Open stores in 500 cities."
At Moz, one of our strategic goals is to increase the retention of our Pro subscribers. Build stuff. Make stuff in the product that makes people want to stick around and use Moz longer. Okay, these are strategic goals.
Then, from there, now we really start to get into the nitty-gritty with the marketing goals being tied to these company goals, and this is such a powerful architecture. It just removes all kinds of barriers, because now I can go and I can build a process like this, right here. So I take a goal that the team is trying to accomplish, and I translate that into what my actual marketing task is around it. Then I have the process and the people that I need for that goal. So actually, I'm going to use my checkboxes that I actually made.
I define my goal, I get the process and people I need, I figure out how we define success, what the measurable elements are. Maybe it's, "Hey, we need to broaden our brand's reach." We want to have more people exposed to the Moz brand, and so therefore, we are going to define a goal as half a million people following our Twitter account and 100,000 people following our Google+ account, and maybe a million people following us on Facebook and whatever those things are.
Then you have those metrics-based targets. So those could be website visitor statistics. They could be conversions. It could be an ROI number. It could be a cost number. Many times a strategic goal will be to reduce cost to a certain amount, and then you have these goals. "Hey, we need to reduce customer acquisition costs. We need to find channels that don't cost as much." Oftentimes, inbound channels don't cost as much, things like SEO and email marketing, opt-in email marketing, community building, and content and those kinds of things, that's a great way to reduce customer acquisition costs. It could be a marketing goal, and you figure out who the process and people are behind that. We may need a writer. We're going to need someone who is a marketing analyst to do all the statistics work. We're going to figure out how we measure success. That's going to be measured through number of people acquired through these lower-cost channels. We're going to have metrics-based targets. We're going to say we want to acquire 20% of our customers through non-paid channels by the end of 2013.
Great. Now you have something so amazing. You have marketers that can see the big picture. They can see all the way. They know everything that's connected here, and that means that they know how their work matters. I can't tell you what a change in attitude you get when you understand how your work matters versus wondering why you're pushing buttons. It's just a remarkable change. Now, those same people can navigate project complexity without needing someone over their shoulder, looking all the time at their work, making sure that they're doing the right thing, reviewing, because they can see that full connection.
You might have someone who reviews the work at the end of the cycle or is in a project planning meeting with them, maybe a manager or a senior leader or something like that, and that's fine and that's a good thing. But you don't need to be in the weeds with your team anymore, and because they're empowered, they can choose how they work best, figure out what makes them most effective, and then they can execute on projects.
I urge you to give this a try. It won't take that long, especially if you've got some of these bigger things already defined, and it can really move the needle on how your marketing team works.
All right, everyone. Hope you've enjoyed this edition of Whiteboard Friday. We'll see you again next week. Take care.
Video transcription by Speechpad.com
Google’s "Multi-Week" Algorithm Update
Posted by Dr-Pete
Back on June 21st, Matt Cutts replied to a tweet about payday loan spam with an unusual bit of information (reported on Search Engine Roundtable):
The exact timeline was a bit unclear, but Matt seemed to suggest a prolonged algorithm update covering as many as three weeks. Four days later, we tracked our highest temperature ever on MozCast, followed by more record highs:
Seven days during the "multi-week" timeline showed temperature spikes near or above 90°, with six of those days exceeding the severity of the original Penguin update.
Was It A MozCast Glitch?
Let me be perfectly honest – Google rankings are a moving target, and tracking day-to-day flux has proven difficult at best. Any given temperature on any given day is prone to error. However, this was a sustained pattern of very high numbers, and we have no evidence to suggest a glitch in the data.
There were some reports that other tools were not showing similar spikes, but some of these reports were based on apples-to-oranges comparisons. For example, if you look at SERPmetrics flux data and isolate just page 1 of Google (which is what MozCast tracks), you'll see this:
Sorry, it's a bit hard to see the dates on the reduced image, but the two spikes equate to roughly June 28th and July 4th, with a smaller bump on June 25th. While they're not an exact match, these two data sets are certainly telling a similar story.
Was It A Large-scale Test?
This is a much harder question to answer. Our beta 10K data set showed similar patterns across multiple C-blocks of IPs, so we have no reason to believe this was specific to one or a very few data centers.
What if Google made a massive change one day, though, and then reverted it? Theoretically, we would see two days of high MozCast temperatures, but if we looked at the two-day flux (instead of two one-day numbers), the temperature would be very low. While this multi-day flux is theoretically interesting, it can be very hard to interpret in practice. Some rankings naturally change, and Google can roll out multiple small updates in any given week.
If we look at the overall flux between the start and end of recorded spikes (June 25 - July 4), we get a MozCast temperature of 120.3°, not much higher than the one-day temperature on June 27th. The average daily temperature for this period was 92.5°. Now, let's look at a similar time period (May 28 - June 6) – the average temperature for that period was 66.8°, and the multi-day temperature across the entire period was 114.7°.
Comparing the two time periods, the overall flux for the period of record temperatures was roughly the same as the peak and about 30% higher than the multi-day average, whereas the overall flux for the quieter period was 72% higher than the average. This is an inexact science at best, and we don't have a good historical sense of multi-day patterns, but my gut feeling is that some of the multi-week update involved changes that Google tested and later rolled back.
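To make the comparison above concrete, here's a quick back-of-the-envelope sketch of the arithmetic, using the MozCast temperatures quoted in the text (an illustration added here, not part of the original analysis):

```python
# Compare a period's multi-day flux against its average daily flux.
# Temperatures are the MozCast figures quoted above (in degrees).

def pct_higher(multi_day_temp, avg_daily_temp):
    """Percentage by which the multi-day flux exceeds the daily average."""
    return (multi_day_temp / avg_daily_temp - 1) * 100

# Record period (June 25 - July 4): multi-day 120.3, daily average 92.5
record_period = pct_higher(120.3, 92.5)

# Quieter period (May 28 - June 6): multi-day 114.7, daily average 66.8
quiet_period = pct_higher(114.7, 66.8)

print(round(record_period))  # 30: start-to-end flux only ~30% above the daily average
print(round(quiet_period))   # 72: start-to-end flux ~72% above the daily average
```

The smaller gap for the record period is the point: day-to-day rankings churned violently, yet the net start-to-end change was comparatively modest, which is consistent with changes being rolled out and then reverted.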
What About PMDs & EMDs?
In my post on the June 25th temperature spike, I reported a noticeable single-day drop in partial-match domain (PMD) influence. That post happened very early in the multi-week update, so let's look at the PMD influence data across a 30-day time period that includes all of the high-temperature days:
While there was a lot of movement during this period, you can see that PMDs recovered some of their initial losses around July 4th. The overall trend is downward, but the June 25th drop doesn't appear to have been permanent.
It's interesting to note, even if not directly relevant to this analysis, that the long-term trend for PMD influence in our data is still decidedly downward. Here's a graph back to the beginning of 2013:
So, how have EMDs fared? They seem to show a similar pattern, but in a much tighter range. Scaled to the same Y-axis as the PMD chart above, we get this:
The EMD data is fairly consistent with Dr. Matt Peters' early report on our 2013 Ranking Factors study. Keep in mind that we are measuring two different things – the correlations show how well PMDs/EMDs ranked compared to other domains, whereas MozCast tracks how many PMDs/EMDs ranked across the data set. If the number of total PMDs drops, but they rank roughly as well, the correlations will remain stable, but the "PMD Influence" metric will drop. In other words, the correlations measure how well PMDs rank, whereas MozCast measures how many PMDs rank.
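The difference between the two metrics can be illustrated with a toy example (the SERP data below is entirely made up for illustration):

```python
# A hypothetical page-1 SERP for one query: (domain, is_pmd) in rank order.
serp = [
    ("best-widgets-store.com", True),   # partial-match domain
    ("bigbrand.com", False),
    ("widget-reviews.net", True),       # another partial-match domain
    ("news-site.com", False),
    ("retailer.com", False),
]

# "PMD influence" (what MozCast tracks): what share of the spots PMDs hold.
pmd_influence = sum(is_pmd for _, is_pmd in serp) / len(serp)

# Correlation-style view (what a ranking-factors study measures):
# how well PMDs rank when they do appear.
pmd_ranks = [rank for rank, (_, is_pmd) in enumerate(serp, start=1) if is_pmd]
avg_pmd_rank = sum(pmd_ranks) / len(pmd_ranks)

print(pmd_influence)  # 0.4 -> PMDs hold 40% of the spots
print(avg_pmd_rank)   # 2.0 -> and rank well when present
```

If half the PMDs dropped out of the results but the survivors kept ranking near the top, `pmd_influence` would fall while `avg_pmd_rank` (and hence the correlations) stayed roughly stable, exactly the divergence described above.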
Which PMDs Lost Long-term?
There's one more question we can ask about the drop and subsequent recovery in PMD influence. Did the PMDs that fell out eventually come back, or were they replaced by different PMDs? The metric itself doesn't tell us, but we can dig deeper and see who lost out long-term.
On the initial drop (between June 25-26), 62 PMDs fell out of our public 1K MozCast query set. New PMDs always enter the mix, so the net drop is smaller, but 62 PMDs that were ranking on June 25th weren't ranking on June 26th. So, let's compare that list of 62 to the data on July 5th – after the apparent recovery. On July 5th, 37 of those PMDs (60%) had returned to our data set. This certainly suggests some amount of legitimate recovery.
So, which losing PMDs failed to recover? Here's the complete list (query keywords in parentheses):
- californiacarshows.org (car shows)
- digital-voice-recorder-review.toptenreviews.com (voice recorder)
- fullyramblomatic-yahtzee.blogspot.com (yahtzee)
- virginiamommymakeover.com (mommy makeover)
- www.appliancepartscenter.us (appliance parts)
- www.appliancepartssuppliers.com (appliance parts)
- www.campagnolorestaurant.ca (campagnolo)
- www.campagnolorestaurant.com (campagnolo)
- www.capitalcarshows.com (car shows)
- www.chicagoweddingcandybuffet.com (candy buffet)
- www.dollardrivingschool.com (driving school)
- www.elitedrivingschool.biz (driving school)
- www.etanzanite.com (tanzanite)
- www.firstchoicedrivingschool.net (driving school)
- www.fitzgeraldsdrivingschool.com (driving school)
- www.monogrammedgiftshop.com (monogrammed gifts)
- www.moscatorestaurant.com (moscato)
- www.newjerseyluxuryrealestate.com (luxury real estate)
- www.ocsportscards.com (sports cards)
- www.phoenixbassboats.com (bass boats)
- www.rvsalesofbroward.com (rv sales)
- www.sri-onlineauctions.com (online auctions)
- www.stoltzfusrvs.com (rvs)
- www.vibramdiscgolf.com (vibram)
It's not my goal to pass judgment on the quality of these domains, but simply to provide data for further analysis if anyone is interested. You can see that there are a few examples of multiple PMDs falling out of a single query, suggesting some kind of targeted action.
How Did The Big 10 Do?
In MozCast, we track a metric called the "Big 10" (I did my grad work at U. Iowa, so I should probably have thought twice about that name) – it's just a count of the total percentage of top 10 ranking positions held by the 10 most prominent sites on any given day. Those sites may change day-to-day, but tend to be fairly stable. Looking back to the beginning of 2013, we see a clear upward trend (this graph starts on January 8th, due to a counting issue we had with YouTube results at the beginning of the year):
The "Big 10" gained almost 2-1/2 percentage points in the first half of the year. Some of the gain across the year represents a shuffling of sites in the mix (Twitter falls in and out of the "Big 10", for example, and the root eBay domain struggled earlier this year), and some of this is a symptom of other changes. As Google gets more aggressive about spam, the sites that already dominate naturally tend to take more spots.
I thought it would be interesting to look at these numbers alongside the year-to-date PMD and EMD numbers, but the "Big 10" doesn't seem to tell us much about the multi-week update. As a group, they moved only a fairly small amount between June 25th and July 5th (from 14.97% to 15.17%). Whatever Google tested and rolled out over this period, it didn't dramatically advantage big brands in our data set.
What Happened, Then?
Unfortunately, the patterns just aren't clear, and digging into individual queries that showed the most movement during the multi-week update didn't reveal any general insights. The volatility during this time period seems to have been real, and my best guess is that while some changes stuck, others were made and rolled back. Google may have been doing large-scale testing of algorithm tweaks and refining as they went, but at this point the exact nature of those changes is unclear. Between the multi-week update and Google's announcement of 10-day Panda roll-outs, it appears that we're going to see more prolonged updates. Whether this is to mitigate the impact of one-day updates or make the update process more opaque is anyone's guess.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
Competitive Link Analysis: Link Intersect in Excel
Posted by mihai.aperghis
This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author's views are entirely his or her own and may not reflect the views of Moz, Inc.
Without a doubt, one of the main steps in creating an SEO strategy is the competitive analysis. Competitor backlinks can offer information on their link building strategies as well as give you opportunities to strengthen your own link profile.
These opportunities are hard to identify manually, especially in competitive niches, where websites tend to have a significant number of backlinks. Although some tools do exist that can ease this process, like the Moz Link Intersect tool, I chose to build my own tool in Excel, which offers greater flexibility in handling the data.
I wrote this guide to explain how you can build your own competitive link analysis in Excel, including a template to help you start right away.
What can you find in this guide:
- What Is Link Intersect Exactly?
- Why Would I Need the Excel Version?
- Got It, Now Show Me the Magic!
- Future Improvements
- Template and Instructions
What Is Link Intersect Exactly?
As you may know, the Moz Link Intersect tool (also known as the Competitive Link Research Tool or Competitive Link Finder), along with other tools of its kind, allowed you to find domains that link to your competitors but aren't linking to you. You can thus find lots of link opportunities, especially on the domains that link to more than one of your competitors, since there's a higher chance they might link to you as well.
The Moz tool is currently unavailable, but I have it on good authority it will be back up down the road.
The Excel version does much the same thing: it shows you these domains, which of your competitors they link to, and the exact URLs the links come from, along with other metrics that will help you decide which ones to contact.
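If the logic is easier to see in code than in prose, here's a tiny Python sketch of the link intersect idea (all the domain names here are made up for illustration):

```python
from collections import Counter

# Domains that already link to us (hypothetical examples).
ours = {"blog-a.com", "news-b.com"}

# For each competitor, the set of domains linking to them.
competitor_links = {
    "competitor1.com": {"blog-a.com", "forum-c.com", "mag-d.com"},
    "competitor2.com": {"forum-c.com", "mag-d.com"},
    "competitor3.com": {"mag-d.com", "wiki-e.org"},
}

# Count, per linking domain, how many competitors it links to,
# keeping only domains that do NOT already link to us.
counts = Counter(
    domain
    for links in competitor_links.values()
    for domain in links
    if domain not in ours
)

# Sort so the domains linking to the most competitors come first.
opportunities = counts.most_common()
```

Here `mag-d.com` ends up at the top because it links to all three competitors but not to us, which is exactly the kind of prospect the tool is meant to surface.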
Why Would I Need the Excel Version?
Here are the advantages of using the Excel version over other tools:
First of all, most tools that find link opportunities from your competitors are part of a bigger platform that usually requires a monthly subscription. Excel is a one-time purchase (though the backlink source is usually a monthly subscription platform itself), and chances are you already have it.
Second, if you have a small SEO business like I do (or are a freelancer) and can't yet afford a developer to build your own tools, Excel might be one of the most useful pieces of software at your disposal. It's great for data analysis and visualization and has lots of nifty plugins that aid you in your day-to-day SEO job. Even more so, almost every major online-marketing platform out there can export to Excel, giving you a centralized location for all your data.
Third, the Excel version will allow you to:
- Use backlink data from ANY provider, not just OSE, as long as it has a Source URL (where the link is posted) and a Target URL (where the link points to); of course, any metrics can help you, but are optional for the functionality of the tool
- Sort the data the way you need it, either by the number of competitors each domain links to, or by one of the metrics that came with your data
- Analyze as many competitors as you want (as long as your computer can handle it)
Got It, Now Show Me the Magic!
OK, if you're still with me, I assume you're interested in this tool, so I'll take a step-by-step approach to explain how to create it. It does include a pivot table, but it's really not that hard to use and I'll use screenshots to show how to implement it.
If you want to skip to the end result, the last chapter includes a template and instructions on how to use it.
Tools of the Trade
Before we start the Excel-fu, here's a list of what you need to have at your disposal:
- You might be shocked by this one, but you will need Excel to make this work :) I used the 2010 32-bit version in my example, but other versions should work just as well.
- Backlink data. You can use Moz OSE, Ahrefs' Site Explorer, MajesticSEO's Site Explorer — basically anything that meets the requirements I mentioned above and can export to Excel. For this example, I've used OSE exports. An alternative would be using an API to get the data; that's up to you.
- 6 to 8 hours of your time. I'm joking :)
Step 1: Export Your Backlink Data (skip if you already know/have this)
This is fairly basic. If you've used Excel for backlink analysis before, you probably already know how to do this. Personally I have a Moz PRO account, so I'll be using OSE for this step.
Since I've just recently launched my company website, I won't be using it as an example. Also, since I am too lazy to pick a random website, I'm going to use seomoz.org as my primary domain.
I'll choose 3 competitors (I mentioned you can choose as many as you want, but since these are fairly big websites, three should be enough for this example). These are: distilled.net, seerinteractive.com and seogadget.co.uk.
Getting backlink data from each of those sites (including the primary one) is straightforward. Go to OSE, enter the domain and click search. Next, you will want to filter the results to include only external links to pages on the root domain or subdomain (the latter if the site's hosted on a subdomain that is fairly separate from the root domain, like a blogspot.com blog).
IMPORTANT NOTE: Getting links to the root domain will usually get you more data, but will require two additional formulas in step 3.
Optionally, you can filter this more to only include dofollow links. Be sure to click the Filter button once you're done.
Next, you'll want to download these links. Now, OSE gives you two options. Either use the "Download CSV" button and get up to 10k links, or use the Advanced Reports module where you have a daily credit limit and can export up to 100k links.
If you use the Advanced Reports module, you'll need to choose the "External linking page" and "Any page on this root domain" (or subdomain, accordingly) options. Everything else can be left as is, though you can choose to filter for links with DA/PA higher than a certain value to reduce the total number of results.
Note that you can queue exports, so you don't have to wait for one to finish before starting the next. You'll get an email when they're done.
Repeat this for all your competitors as well. In the end, we should have four different CSV files (one for our backlink data, three for the data of our competitors).
Step 2: Import It Into Excel (skip if you've already done this and removed the errors)
It's time now to open the magical software that our people refer to as Excel.
To get the data from CSV files, we have two options:
- Either open the CSV files directly, copy the columns we're interested in (this would be the URL, Target URL and any metrics you need) and then paste them into a new worksheet
- Or use the Excel Text Import wizard to import the data into an empty worksheet without opening the CSVs
Both options are fairly simple, though the first is easier (I won't even do screenshots for it). The problem is that the first option doesn't work if your Windows installation is set to a European country.
That's because a CSV contains Comma-Separated Values, the comma being the default list delimiter in the US. For European countries, the default delimiter is usually the semicolon (";"), which means Excel won't read the CSV files correctly.
To resolve this issue, you need to open the Regional and Language Options from the Control Panel in your Windows installation and either set it to English (United States), or keep your current country and, in the Advanced Settings, set the decimal symbol to dot (".") instead of comma, and the list delimiter to comma instead of the semicolon. You can view the exact process here (Solution #3):
Alternatively, you can use the second option. The problem is, due to the way Excel imports data, some of it may be displayed erroneously, which would lead to some extra steps to clean up the data by removing all the errors. Due to this issue I decided not to include a tutorial on how to do this (but you can do it regardless if you prefer not to change your list delimiter).
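Just to make the delimiter problem concrete, here's a small Python sketch (the row data is invented): the same values serialized with commas and with semicolons parse identically once the reader is told which delimiter to expect, which is exactly what the regional setting controls for Excel.

```python
import csv
import io

# The same row serialized with a comma vs. a semicolon delimiter.
us_style = "URL,Target URL,Domain Authority\nhttp://a.com/,http://b.com/,45\n"
eu_style = us_style.replace(",", ";")

# Parsing each with its matching delimiter yields the same rows.
rows_us = list(csv.reader(io.StringIO(us_style), delimiter=","))
rows_eu = list(csv.reader(io.StringIO(eu_style), delimiter=";"))

assert rows_us == rows_eu  # identical once the right delimiter is used
```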
Regardless of your choice, after including data from the first domain, copy the data for the other domains underneath, without including the header row again. This way you'll have a continuous list of data from all the domains with just one header row (the first one).
IMPORTANT NOTE: If you'll be analyzing a large number of backlinks (over 50k), enter only a limited number at first (10-20k), and add the rest (also in batches of 10-20k) after inserting the formula columns from the next step. Depending on your Excel version and system resources, this is necessary to avoid error warnings.
Right, you should now have all the data imported into Excel. This is optional, but I find it much easier to work if this data is in a table. To do that, select all the data so far (click on one of the cells containing data, like A1, and hit CTRL-A), then transform it into a table (hit CTRL-L).
Remember that without a table, you'll have to edit the formulas to include exact cell references (e.g. $A2 instead of [@URL]).
Step 3: Apply the Necessary Formulas
Now that we have all our data in Excel, we need to apply the formulas necessary for the next step.
Our first two formulas will simply take the (source) URL and Target URL column data and strip everything but the subdomains. These nifty formulas are also part of the excellent "Excel for SEO" guide from Distilled.
We'll need to create two new columns to hold this data. We'll name the first one "Source Subdomain", and the second one "Target Subdomain". Since we have a table, we just need to enter the names in the first two adjacent columns, and Excel will attach them to the table automatically.
The first formula is
Source Subdomain
=MID([@URL],FIND("://",[@URL])+3,IFERROR(FIND("/",[@URL],9),LEN([@URL])+1)-(FIND("://", [@URL])+3))
(where [@URL] is the column that contains the Source URL; it might be named differently if you don't use OSE. Thanks to GerardGallegos for pointing out a typo!)
and the second formula is:
Target Subdomain
=MID([@[Target URL]],FIND("://",[@[Target URL]])+3,IFERROR(FIND("/",[@[Target URL]],9),LEN([@[Target URL]])+1)-(FIND("://",[@[Target URL]])+3))
(basically the same, just for the Target URL column).
The formulas basically grab whatever sits after the "://" part and before the first "/" of the URLs (which also handles links from and to secure "https" locations). The IFERROR part ensures you get the right result when the URL doesn't have a trailing slash, like a homepage URL (OSE always adds this slash, but Majestic SEO doesn't).
You just need to enter these inside the first cell of each of the columns, and Excel will auto-populate them for the whole column.
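If you prefer code to formulas, the same extraction can be sketched in Python; `urlsplit`'s `netloc` is precisely the "everything between :// and the first /" piece that the MID/FIND combination pulls out:

```python
from urllib.parse import urlsplit

def source_subdomain(url):
    # Equivalent of the MID/FIND formula: keep what sits between
    # "://" and the first "/" (works for both http and https).
    return urlsplit(url).netloc

assert source_subdomain("http://www.seomoz.org/blog/post") == "www.seomoz.org"
assert source_subdomain("https://seogadget.co.uk") == "seogadget.co.uk"
```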
IMPORTANT NOTE: If you selected the "pages to subdomain" instead of "to root domain" option when getting your data, you won't need to include the next two formulas, since you only have one subdomain for each site (e.g. www.seomoz.org for Moz, seogadget.co.uk for SEO Gadget, etc.). If that's the case, skip to the Unique Domains formula.
Now, you might have noticed that we only extracted the subdomain of the Target URL, and that's an actual issue. It means that seomoz.org and www.seomoz.org will be counted as different sites, which may be a problem further down the line (you could see domains that link to 4 or more competitors, even though you only have 3 competitors in your data).
To fix this, we need to pull the actual root domain from the subdomain. Unfortunately this will be a tad complicated since we have to differentiate between TLDs (Top-Level Domains) and SLDs (Second-Level Domains), because one of our competitors is on a SLD (SEO Gadget), and we don't want to end up with the domain "co.uk" instead of "seogadget.co.uk" (so you can't use the "just grab whatever is after the last dot as TLD" routine).
IMPORTANT NOTE: If you have the SeoTools for Excel plugin by Niels Bosma, you can skip the TLD and Target Root Domain formulas and use just one formula to get the root domain. In this case the formula would be =UrlProperty([@[Target Subdomain]]; "domain"). However, if you intend to use your own custom SLDs (e.g. "blogspot.com" to avoid clumping different blogs from the same domain together), you'll need to use the functions below. Thanks to Roald for reminding me of this function!
First, we need to list all the TLDs and SLDs we expect to encounter in the Target URL column somewhere separated from the table. I chose the Z column for this. Our list will be the following:
.com
.net
.org
.co.uk
Always place the SLDs under the TLDs, so they get detected last. Think of it like a set of rules: the formula checks all of them and returns the last match it finds. For example, if one of your competitors is a blog hosted on ".blogspot.com" (which is not really an SLD, but you'd treat it as one for this analysis, since you're not interested in "blogspot.com" itself as a competitor), you'd place it under the ".com" TLD so it gets matched correctly.
With the list set in place, our next formula will retrieve the TLD/SLD (I will just refer to them as TLD from now on) from the Target Subdomain column. Use it in the next adjacent column to the table, and name the column "TLD". The formula is:
TLD
=LOOKUP(2^15,SEARCH($Z$1:$Z$4,[@[Target Subdomain]]),$Z$1:$Z$4)
The 2^15 value inside the LOOKUP tells the formula to always pick the TLD that occurs latest in the Target Subdomain. If you're curious about when this matters, imagine the subdomain "test.comparison.org". You want to retrieve the ".org" part, as that is clearly the TLD; without the 2^15 part, however, Excel could stop at ".comparison" and match it to ".com", which would be a mistake.
The $Z$1:$Z$4 range references the cells that contain the TLDs.
Now that we have the TLD, let's get the actual root domain. To do this, we basically get the Target Subdomain, strip the TLD, get everything that's after the last dot, and then apply the TLD back on it. That means that if we have "some.thing.example.com", we'll strip the TLD and get "some.thing.example", retrieve everything after the last dot which gets us "example", then finally apply the TLD to get the root domain "example.com".
All of the above is done in one formula, which you'll place in the next column to be named "Target Root Domain":
Target Root Domain
=IFERROR(RIGHT([@[Target Subdomain]],LEN([@[Target Subdomain]])-FIND("|",SUBSTITUTE(LEFT([@[Target Subdomain]],LEN([@[Target Subdomain]])-LEN([@TLD])),".","|",LEN(LEFT([@[Target Subdomain]],LEN([@[Target Subdomain]])-LEN([@TLD])))-LEN(SUBSTITUTE(LEFT([@[Target Subdomain]],LEN([@[Target Subdomain]])-LEN([@TLD])),".",""))))),[@[Target Subdomain]])
Yeah, bit of a long one, I know. However, I wanted to get it all inside one formula, to avoid creating unnecessary columns and to grab the root domain in one go. The IFERROR portion at the beginning handles the case where the Target Subdomain already is the root domain, in which case it just returns it. The rest of the formula does exactly what I described above.
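For readers who'd rather trace the logic in code, here's a rough Python equivalent of the TLD lookup plus the root-domain formula. The TLD list mirrors the Z-column configuration from above; this is a sketch of the idea under the same assumptions as the spreadsheet, not a general-purpose public-suffix parser.

```python
# Rule list, SLDs placed after TLDs so they win (last match is kept).
TLDS = [".com", ".net", ".org", ".co.uk"]

def root_domain(subdomain):
    # Like the LOOKUP formula: take the LAST listed suffix that matches,
    # so ".co.uk" beats a naive "grab whatever is after the last dot".
    tld = ""
    for candidate in TLDS:
        if subdomain.endswith(candidate):
            tld = candidate
    # Strip the TLD, keep only the label after the last remaining dot,
    # then re-attach the TLD -- exactly the three steps described above.
    stem = subdomain[: -len(tld)] if tld else subdomain
    return stem.rsplit(".", 1)[-1] + tld

assert root_domain("www.seomoz.org") == "seomoz.org"
assert root_domain("blog.seogadget.co.uk") == "seogadget.co.uk"
assert root_domain("some.thing.example.com") == "example.com"
```

Note that when the input already is the root domain (e.g. "seomoz.org"), the `rsplit` step simply returns it unchanged, which is the role the IFERROR plays in the Excel version.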
We now have our root domains!
The fifth formula (or third, if you skipped the last two) checks whether a source domain links to a target domain at least once, so you can later see how many of your competitors get links from that source.
In the pivot table that we build in the next step, the formula will have the role of doing somewhat of a "distinct count" of target root domains for each source URL. Unfortunately, there is no way to do this without the formula unless you are using Excel 2013.
I'm saying this because, in order to create this "distinct count", we can actually use one of three formulas. Two of them might be faster than the third, but you might also get error warnings from Excel (at least in the 2010 32-bit version I'm using) on large amounts of links (30k+). I've decided to use the third formula which, although it might be slower, seems to work fine with a lot of links and yields the same result.
Create a new adjacent column called "Unique Domains", and add the following formula:
Unique Domains
=IF(COUNTIFS(INDIRECT(ADDRESS(ROW(Table2[#Headers])+1,COLUMN([Source Subdomain]))&":"&ADDRESS(ROW([@[Source Subdomain]]),COLUMN([Source Subdomain]))), [@[Source Subdomain]],INDIRECT(ADDRESS(ROW(Table2[#Headers])+1,COLUMN([Target Root Domain]))&":"&ADDRESS(ROW([@[Target Root Domain]]),COLUMN([Target Root Domain]))), [@[Target Root Domain]])=1,1,0)
The COUNTIFS function counts how many times a Source Subdomain has the same Target Root Domain associated with it. The IF identifies the first association of this kind and returns the value 1 for it and the value 0 for subsequent associations (kind of like saying "Yes, this source links to this target at least once").
The formula might seem long, but it's actually equivalent to this:
=IF(COUNTIFS($O$2:$O2,$O2,$R$2:$R2,$R2)=1,1,0)
In this case, the O column is the Source Subdomain, while the R column is the Target Root Domain. 2 is the row number where you first introduce the formula (the row right beneath the header row).
I chose the longer version so it can be applied without identifying the columns and rows needed to make it work. This is why I used the INDIRECT and ADDRESS functions, which have the purpose of automatically identifying the necessary references for the formula, regardless of where the table is positioned or how many columns it has.
Remember, if you haven't used the TLD/Root Domain formulas, you will use the Target Subdomain reference instead of the Target Root Domain.
Unfortunately, the COUNTIFS function won't work for Excel 2003, so you need to use a different formula for this issue, which you can find here (the SUMPRODUCT version).
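The first-occurrence trick is easier to see outside of Excel, so here's a short Python sketch with made-up rows:

```python
# Sketch of the "Unique Domains" column: flag only the FIRST time a
# given (source subdomain, target root domain) pair appears.
rows = [
    ("blog-a.com", "seomoz.org"),
    ("blog-a.com", "seomoz.org"),   # second link, same pair -> 0
    ("blog-a.com", "distilled.net"),
    ("forum-c.com", "seomoz.org"),
]

seen = set()
unique_flags = []
for pair in rows:
    # 1 means "yes, this source links to this target at least once";
    # repeats of the same pair get 0 so the pivot sum is a distinct count.
    unique_flags.append(0 if pair in seen else 1)
    seen.add(pair)
```

Summing these flags per source is exactly what the pivot table does in the next step.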
The final formula is rather simple, where we need to check if the link points to us (our primary domain) or not, so we can later filter it. Create an adjacent column named "Link To Us", and enter the following formula (where you replace the "seomoz.org" with your root domain):
Link To Us
=IF([@[Target Root Domain]]="seomoz.org",1,0)
The formula is a basic IF conditional, which returns 1 if the target is our primary root domain and 0 if it's a competitor's.
Remember, if you haven't used the TLD/Root Domain formulas, you'll use Target Subdomain instead of Target Root Domain (in which case, for this example, you would have [@[Target Subdomain]]="www.seomoz.org").
Step 4: Build the Pivot Table
Okay, now that we have our backlink data table set up, the last two steps will be quite simple.
Create a new sheet (or rename one of the default ones) named "Pivot Table". This is where the data visualization happens, and where you'll spend your time analyzing it.
So let's create our pivot table. The process goes something like this:
Click the Insert tab -> click the Pivot Table button -> enter the name of our table containing backlink data (usually Table1, unless you renamed it or created multiple tables) -> hit OK.
You will now have an empty pivot table with a Field List sidebar. Here's how we configure it:
- Drag the Source Subdomain, Target Root Domain (or Target Subdomain accordingly) and URL fields into the Row Labels box (in that order)
- Drag Unique Domains, Link To Us and any metrics you want to have (like Domain Authority) to the Values box (order doesn't matter). All of the fields should be added automatically as a sum ("Sum of..."). For the metrics we actually need averages, so repeat this process for each metric: click on "Sum of Domain Authority (or whatever metric you have)" -> click Value Field Settings -> choose Average instead of Sum, under "Summarize value field by" -> hit OK.
The field configuration should look like this (might differ somewhat depending on your backlink data provider and the metrics you use):
Next, since the data is expanded and we can't really see anything, we need to collapse it under the Source Subdomain fields. To do that, click one of the source subdomains in the pivot table -> make sure you're in the Options tab menu of the Pivot Table -> click Collapse Entire Field.
You can repeat the collapse process for a Target Root Domain as well.
Step 5: Sort It Out and You're Done
Now that we have everything we need inside the pivot table, we only need to sort the data. First of all, since we're trying to get competitor links, we need to filter out the Source Subdomains that already link to us.
To do this, click on the Row Labels dropdown -> Value Filters -> Equals... -> in the new window that opens select "Sum of Link To Us" and enter "0" in the value field -> hit OK.
Basically, you want to see the sites that link to most of your competitors, since, as I mentioned at the beginning of this guide, these are the sites most likely to link to you as well.
To sort it, click the Row Labels dropdown again -> More Sort Options -> choose Descending (Z to A) by -> choose Sum of Unique Domains -> hit OK.
There, we now have a pivot table with domains that link to our competitors but not us, sorted by the amount of competitors they link to.
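To tie it all together, here's a plain Python sketch of what the pivot table ends up computing (the rows are invented): sum the Unique Domains flags per source, drop sources that already link to us, and sort descending.

```python
rows = [
    # (source_subdomain, unique_domains_flag, link_to_us_flag)
    ("mag-d.com", 1, 0),
    ("mag-d.com", 1, 0),
    ("mag-d.com", 1, 0),
    ("forum-c.com", 1, 0),
    ("forum-c.com", 1, 0),
    ("blog-a.com", 1, 1),  # already links to us -> filtered out
]

totals, links_to_us = {}, set()
for source, unique, to_us in rows:
    # Sum of Unique Domains per source (the pivot's value field).
    totals[source] = totals.get(source, 0) + unique
    if to_us:
        links_to_us.add(source)

# Filter "Sum of Link To Us" == 0, then sort descending by the total.
ranked = sorted(
    ((s, n) for s, n in totals.items() if s not in links_to_us),
    key=lambda item: item[1],
    reverse=True,
)
```

Here `ranked` lists the outreach targets in the same order the pivot table shows them, best prospects first.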
You can of course sort it by Domain Authority (or any other metric you chose to include). I've yet to find a proper way to sort by multiple columns (as in sort by Sum of Unique Domains first, then by Domain Authority).
The final version looks something like this:
IMPORTANT NOTE: Before saving the file, to reduce both its size and the time it takes to open, we can replace all the formulas with their values so Excel won't recalculate them. Select all table data (CTRL-A) -> copy it (CTRL-C) -> then paste just the values (Paste Values).
Future Improvements
There are a few things I had in mind that could be added to this tool:
- Ability to sort by Unique Domains first, then by one or more metrics.
- Ability to filter by metric value greater/smaller than x (e.g. Domain Authority greater than 30); you can partially do that by moving the metric to the Report Filter box and then ticking which values you need (this gets tedious really fast).
- Ability to remake the pivot table to see Co-Citation opportunities; what I mean by that is, instead of having Source Subdomain as your primary rows, you would have the URL field, so you can see exactly which pages (as opposed to which site) link to more than one competitor.
If you have any ideas on how any of these could be implemented, write them in a comment below.
Template and Instructions
As I promised, I'll include a template that contains all the necessary formulas and the pivot table, to which you only need to add your backlink data.
I've decided to create two separate files:
- vertify-link-intersect-sample-SUBDOMAIN.xlsx – this can be used when you have backlink data from only one subdomain for each individual website (both yours and your competitor's), and the backlinks point to pages of that subdomain (in this case, domain.com and www.domain.com are considered different sites/subdomains, so you'll have to choose which one you want the data for)
- vertify-link-intersect-sample-ROOTDOMAIN.xlsx – this can be used for the case where you use backlink data to pages of the root domain for at least one website; in this case there will be two extra columns to extract the root domain, which means you'll have to configure the TLD list in the Configuration sheet
If you have any problems with the files or there's anything that you have trouble understanding, please let me know in a comment below. Enjoy my carefully crafted meme:
Editor's note: this post is available in Romanian on the author's own site at http://www.vertify.ro/analiza-link-urilor-competitiei-link-intersect-in-excel/