The very first Buffer blog post—Want to Tweet While You Sleep?—was published in January 2011.

Three-and-a-half years and 595 posts later, we’ve covered a lot of ground, learned a huge number of tips and tricks that make social media easier, and written a ton of helpful, actionable content.

Many others have just as much—probably even more—awesome content sitting in the archives. If you’re just starting out, you’ll get to this point, too. When your archives are bursting with content, it could very well be time for a look back to see what has worked and what could be made even better.

We ran a content audit on the Buffer blog and came away with several key action items and quick wins that we’re excited to implement. Let me show you how we did ours and some of the helpful takeaways we learned.


How to run a content audit on your blog

Before you get to the takeaways, you first need to grab the data. This involves running a handful of reports, and placing the results into a spreadsheet. Here’s how we created our master file of Buffer blog content.

1. Run a Screaming Frog analysis of your blog

The free Screaming Frog SEO spider will pull out all the relevant pages that are indexed on your blog. Download the software and run the report, and you’ll get a data file that you can easily upload into Excel or Google Drive.

Screaming Frog analyzes each of your pages and returns a host of valuable information on the specifics of each page. Here’s what I found most helpful.

  • URL
  • Title
  • Title length (in characters)
  • Meta description
  • Meta description length (in characters)
  • The major H1 and H2 headings on the page
  • Word count

2. Filter out all links that might not be necessary for your audit

The Screaming Frog results were incredibly detailed, which meant there was more information than I really knew what to do with. So I sorted and filtered to get what I needed.

For example, Screaming Frog returned results for all the images and files we had uploaded to WordPress (since each image and file has its own url). I chose to filter these results out so that I would only see the blogposts in the spreadsheet.

To do this:

Select all cells.

Click the Filter icon.


Go to the Address column of your spreadsheet. Click the filter icon at the top of the Address column.

Search for the phrase “wp-content” (which appears in the url of all uploaded files in WordPress).


Click “Clear.”

Done! All urls that contain “wp-content” should be removed from the view. (They’re all still in the spreadsheet, just out of sight for now.)
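If you'd rather script this step, the same filter is easy to express outside the spreadsheet, too. Here's a minimal Python sketch; the URLs and the "Address" column name are just illustrative:

```python
# Drop every URL that points at an uploaded file rather than a post.
# All WordPress uploads live under /wp-content/, so that substring
# is the whole filter.
def filter_posts(rows, url_key="Address"):
    return [row for row in rows if "wp-content" not in row[url_key]]

rows = [
    {"Address": "https://blog.example.com/content-audit"},
    {"Address": "https://blog.example.com/wp-content/uploads/chart.png"},
]
posts = filter_posts(rows)  # only the blogpost URL survives
```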

3. Input all URLs into Shared Count's bulk uploader, export, add to spreadsheet

Now that you have all your relevant blogpost pages in view, select all the urls in the Address column and copy them to your clipboard.

Head over to Shared Count and click on the URL Dashboard from the top menu. We’ll be adding all the URLs we copied into the Shared Count bulk uploader.


Once these URLs have been uploaded, you can run Shared Count to see the results. At the bottom of the results page is an option for exporting these results to a file.

Take this exported file and import into your spreadsheet.

You can match up the imported data with the correct rows in a couple of ways.

  1. Copy and paste. Ideally the data you grabbed from Shared Count should be in the same order as your spreadsheet.
  2. Run a VLOOKUP to find the data.

There are some great tutorials on exactly how VLOOKUP works. Basically, you’re telling your spreadsheet to look for a certain value (the URL) in a selection of cells (the imported data), and if the spreadsheet finds that value (the URL), it should return the corresponding value from the cell you specify (the URL’s Facebook shares, for instance).
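For illustration, here's what that lookup amounts to in Python, with hypothetical URLs and share counts standing in for the imported Shared Count columns:

```python
# A VLOOKUP is essentially a keyed lookup: find the row that matches
# a value (the URL) and return one of its other columns.
def vlookup(key, table, column):
    row = table.get(key)
    return row.get(column) if row is not None else None  # None ~ VLOOKUP's #N/A

# Imported Shared Count data keyed by URL (numbers are made up).
shared_counts = {
    "https://blog.example.com/content-audit": {
        "facebook_total": 120,
        "tweets": 340,
    },
}

vlookup("https://blog.example.com/content-audit", shared_counts, "tweets")  # 340
```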

Here’s a bit more from Google Sheets on the topic:


When the Shared Count information is entered into your spreadsheet, you should have the following data for each URL:

  • Facebook Likes, Shares, Comments, and Total Engagement
  • Tweets
  • +1s
  • Pins
  • LinkedIn Shares
  • StumbleUpon

(Reddit, Digg, and Delicious columns were empty when I ran the tool, so I deleted them from the spreadsheet.)

4. Grab Google Analytics data

Google Sheets has a neat integration with your Google Analytics data. You can run reports from right within your document and then carry over the resulting data to your main sheet.

In Google Sheets, go to Add-ons > Google Analytics > Create a New Report.

This will bring up a dialog box where you can set the basics for your report. I pulled in a pair of reports for this content audit.

  1. Unique visits
  2. Time on page

When creating the report, you can put your desired data in the Metrics field and the associated page in the Dimensions field. Here’s an example:


Once the initial reports are made, you can refine your results with filters and maximum results.

Filters: Tell Google to only show results that meet certain criteria.

For instance, if you only want page results that received 500 or more unique pageviews, you would add this bit to the Filters section on the Report Configuration tab: ga:uniquePageviews>500

Max results: Enter any number here, and Google will only return this certain number of results.
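Taken together, the filter and max-results settings amount to a filter-and-slice over the report rows. A rough Python sketch, with made-up page data (I've added a sort so the cap keeps the biggest pages):

```python
# Mimic the GA add-on's report configuration: keep pages matching the
# filter (here, the equivalent of ga:uniquePageviews>500), then cap
# the output at max_results rows.
def run_report(rows, min_pageviews=500, max_results=100):
    matching = [r for r in rows if r["uniquePageviews"] > min_pageviews]
    matching.sort(key=lambda r: r["uniquePageviews"], reverse=True)
    return matching[:max_results]

pages = [
    {"page": "/post-a", "uniquePageviews": 1200},
    {"page": "/post-b", "uniquePageviews": 300},
    {"page": "/post-c", "uniquePageviews": 800},
]
run_report(pages, min_pageviews=500, max_results=2)  # /post-a, then /post-c
```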

When you’re ready to see the data, click on Add-ons > Google Analytics > Run Reports. This will create two new tabs with report data. You can run the VLOOKUP function to add this data back into your content audit table.

5. Pull in some keyword and search data

Some of the best tools out there for this kind of data come with free trials or plans to get you started.

In general, the results of these SEO tools will mostly look the same. We ran the Buffer blog through SEMrush and Moz for our audit. What was hugely helpful to see from this data was how some of our blogposts ranked in search results pages and which keywords performed best.

Here are the key data points I added back into our audit spreadsheet.

  • Keyword
  • Position
  • Search volume
  • Pagerank
  • Inbound links

With this data, I could see which posts were ranking for keywords, how high they were ranking, and how significant the keyword was in terms of traffic.

To pull these numbers into the spreadsheet, I ran a VLOOKUP for each blogpost URL. One of the limitations with VLOOKUP is that it is only capable of returning one result. So for blogposts that ranked for multiple keywords, I wasn’t able to grab that data. If anyone has ideas on how to improve on this, I’d love to hear in the comments!
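One possible workaround, if you're comfortable stepping outside the spreadsheet, is to group the keyword rows by URL before joining, so each blogpost keeps all of its keywords. A Python sketch with hypothetical field names standing in for the exported SEO report columns:

```python
from collections import defaultdict

# Collect every ranking keyword under its URL, instead of keeping only
# the first match the way a single VLOOKUP does.
def keywords_by_url(rows):
    grouped = defaultdict(list)
    for row in rows:
        grouped[row["url"]].append(row["keyword"])
    return dict(grouped)

rows = [
    {"url": "/post-a", "keyword": "content audit"},
    {"url": "/post-a", "keyword": "blog audit template"},
    {"url": "/post-b", "keyword": "social media tips"},
]
keywords_by_url(rows)
# {"/post-a": ["content audit", "blog audit template"],
#  "/post-b": ["social media tips"]}
```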

Additionally, with a free trial at Moz, I was able to grab full data on the Buffer blog via Open Site Explorer. Specifically, I exported the data from the Top Pages tab, uploaded to my spreadsheet, and ran VLOOKUP to pull in the appropriate data for each page.


6. Add the day of the week that each post was published

Credit to Matthew Barby for this really cool tip. You can import the publish date for each of your posts by using the IMPORTXML formula with an XPath query. The formula searches your page for an element you specify and then returns the result.

=IMPORTXML($A3,"//span[@class='timestamp']")

The //span[@class='timestamp'] portion is the XPath query.

Here’re the instructions from Matthew on how to grab the XPath from your page:

All I did was find where the category is displayed within an article, right-clicked (within Chrome) and selected inspect element. On the line of code that is highlighted within Chrome Developer Tools (i.e. the line of code displaying the category name), I right-clicked and pressed Copy XPath.


Another way I’ve found to grab this information is to enter the CSS class name into the XPath field. For instance, in the example above, our publish date is wrapped in a “timestamp” class. So “timestamp” is what I use in the XPath formula.

=IMPORTXML($A2,"//span[@class='timestamp']")

Once the date is entered, you can run another quick formula in a new column to have Google Sheets find the day of the week from the date.

=CHOOSE(WEEKDAY(A2), "Sun", "Mon", "Tue", "Wed", "Thu", "Fri", "Sat")
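For reference, the same date-to-weekday step looks like this in Python's standard library. Note the numbering difference: Sheets' WEEKDAY numbers Sunday as 1, while Python's date.weekday() numbers Monday as 0.

```python
from datetime import date

# date.weekday() returns 0 for Monday through 6 for Sunday,
# so the label list starts at "Mon".
DAYS = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]

def day_of_week(published):
    return DAYS[published.weekday()]

day_of_week(date(2014, 10, 8))  # "Wed"
```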

What we found with our content audit

Get the complete Buffer content audit spreadsheet.

Running an audit like this can be an eye-opening experience, no matter the results. It’s a similar feeling to planning a content calendar: You get to pick your head up from the daily writing, editing, and publishing, and see your blog from a different perspective.

Keyword insights were fascinating. It’s really great to see which posts are ranking for which keywords—and to learn from what we did well on those posts.

Comparing the success of not only your recent posts but also your posts all-time is a great learning experience, too. Our content audit gave us data from more than three years ago, which I had yet to consider.

And of course seeing all these numbers and figures highlighted some areas where we can grab some quick wins.

Here are some specific takeaways we got from our content audit.

What you can do with the results

1. Compare lengthy posts to shorter posts

Which get shared more?

Our content audit has given us all the data needed to find this. We’ve just got to run a couple quick calculations. First is to place the word count data into a handful of buckets. Here’re the ones I chose:

  • Fewer than 500 words
  • 501 to 1,000 words
  • 1,001 to 1,500 words
  • 1,501 to 2,000 words
  • 2,001 to 2,500 words
  • More than 2,500 words

To get these buckets, I ran a quick nested "if/then" formula that looked like this:

=IF(A2<500,"up to 500",IF(A2<1000,"up to 1000",…))

Then I went to Data > Pivot Table Report and created a pivot table from the data. When choosing the range and data to manipulate, I added the word count bucket column into the Columns field and the Total Social column into the Values field, making sure to summarize by Average.
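The bucket-then-average pivot can also be sketched in plain Python, which may help make the calculation concrete (the post data here is made up):

```python
from collections import defaultdict

# Assign each post to a word-count band, mirroring the nested IF formula.
def bucket(word_count):
    bounds = [(500, "fewer than 500"), (1000, "501-1,000"),
              (1500, "1,001-1,500"), (2000, "1,501-2,000"),
              (2500, "2,001-2,500")]
    for limit, label in bounds:
        if word_count < limit:
            return label
    return "more than 2,500"

# Pivot: average total social shares per bucket.
def average_shares_by_bucket(posts):
    shares_per_bucket = defaultdict(list)
    for word_count, total_shares in posts:
        shares_per_bucket[bucket(word_count)].append(total_shares)
    return {label: sum(s) / len(s) for label, s in shares_per_bucket.items()}

posts = [(2700, 7000), (3100, 6200), (2200, 3200), (450, 900)]
average_shares_by_bucket(posts)
# {"more than 2,500": 6600.0, "2,001-2,500": 3200.0, "fewer than 500": 900.0}
```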

Here is the result:


Our posts that exceed 2,500 words receive an average of 6,600 social shares—far and away the most popular post length on the blog.

Our next most popular post length—2,000 to 2,500 words—receives less than half that amount: 3,200 social shares.

2. Discover the posts that search engines love best, and optimize them

Pulling the search statistics into the content audit reveals which posts might be bringing in long-term traffic from Google. These posts would be great ones to home in on and fully optimize.

To find these quickly, I added conditional formatting to the columns for search volume and for where we rank on the search results page. I added green backgrounds to any cell with more than 1,000 visits from search and a Top 5 page ranking.


Armed with this knowledge, I can quickly see which posts might stand to have a good once-over. For instance, in the above example, I see that “how music affects the brain” gets 1,600 visits from search and that we rank No. 1 for the result. I could hop over to our blogpost and make sure everything is as well-optimized and shareable as possible.

This might include:

  • Creating new, shareable images
  • Updating title tags and SEO information
  • Adding email signup calls to action and Buffer signup calls to action to relevant spots in the article
  • Proofreading
  • Adjusting styles and headings to match our latest posts

Bonus: Optimize posts that get tons of traffic

You can also do this optimization on popular posts. Sort the content audit by unique visits and see which posts get the most traffic.

For these posts especially, it might be a good idea to add a quick call-to-action for an email signup or a product trial since these posts get viewed the most.

3. Learn from Pinterest success

One of the neat bits of data in this content audit is Pinterest pins. We’ve yet to optimize our blog posts for Pinterest, so it’s awesome to see that a few of them have really taken off on that channel.


What do these posts have in common?

A quick glance at the top performers shows that each of these posts has a mini infographic element (the Instagram Stats post is one example).

In general, the images on these posts are a bit more vertical (a Pinterest specialty) than some of the other images we create for blogposts. It’s a good reminder to see what has worked well in the past and to incorporate that into our visual content strategy moving forward.

4. Find the best day of the week to publish new blogposts

Which publish days lead to the most social shares?

Which days lead to the most traffic?

We can grab this information from our content audit by using a pivot table.

For the Columns field, add the day of the week column. For the Values field, add total social shares and/or unique visits. Choose to summarize the Values field by average.

Here’s what we learn when we run the Buffer blog archives through these queries.


Monday is our biggest day for social shares.

Tuesday and Thursday are our biggest days for traffic.

Note: The data here measures shares and traffic all-time, not just the shares that happened and the visits that occurred on that particular day of the week.

5. Shorten the long page titles

The ideal length of a title tag is 55 characters, with a max of 70. If a page title runs too long, it risks being truncated in Google search results.


Our content audit lists the character count of the page title on each blogpost, so I can quickly see which pages might need an edit. We use the Yoast SEO WordPress plugin (one of our favorite plugins!), which allows us to set up custom page titles in our SEO settings.

For instance, this title:

8 Surprising New Instagram Statistics to Get the Most out of the Picture Social Network – – The Buffer Blog


… can be shortened to this …

8 Surprising New Instagram Statistics and Tips for Marketers


To make for an even quicker win, I can sort and filter the table in order to see the too-long title tags compared to the number of unique visits for the page. This’ll tell me which pages to address right away.

Tip: One of my favorite ways to do this is to apply conditional formatting to the title length column so that all values above 70 have a yellow background. Then I'll sort the spreadsheet from most traffic to least traffic (a "Z to A" sort), and I can easily browse the title length column to see which results stand out.
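The same flag-and-sort pass is easy to express in Python if you want to sanity-check the spreadsheet. The titles and visit counts below are hypothetical:

```python
# Flag page titles longer than 70 characters (the truncation danger
# zone) and list the highest-traffic offenders first.
def long_titles(pages, limit=70):
    flagged = [p for p in pages if len(p["title"]) > limit]
    return sorted(flagged, key=lambda p: p["unique_visits"], reverse=True)

pages = [
    {"title": "8 Surprising New Instagram Statistics and Tips for Marketers",
     "unique_visits": 5400},
    {"title": "8 Surprising New Instagram Statistics to Get the Most out of "
              "the Picture Social Network - - The Buffer Blog",
     "unique_visits": 9100},
]
long_titles(pages)  # only the 100+ character title is flagged
```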

Takeaways

Running a content audit on your blog can reveal a huge number of insights into what’s sitting in your archives.

You can discover what’s worked in the past and what to try in the future.

You can see opportunities for quick wins and simple fixes that could have big effects.

I’m excited to try out even more analysis with the data we have from our Buffer blog audit. And I’d love to know what you find out if you run your blog through the same steps. Feel free to add to the discussion here in the comments with any questions, insights, or feedback!

Image sources: Blurgrounds, Icon Finder, Death to the Stock Photo

Written by Kevan Lee

Content crafter at Buffer. You can find me online, tweeting about my writing process, or at home, second-guessing football coaches. Live simply, give generously, beat cancer.

  • Anna Shelton

    Such good information Kevan! As per usual. How long did it take you to run the complete audit?

    • http://blog.bufferapp.com Kevan

      Hi Anna! Thanks so much! 🙂 The audit took about 4 hours to pull everything together!

      • Anna Shelton

        Really? That’s not bad at all! Thanks for the how-to.

  • Terence Mace

    Hi Kevan,

    Great post with lots of actionable stuff here.

    In regards to pulling in the keyword and search data, you might want to look at using a pivot table and then doing your vlookups from that table. Alternatively, the sumif, countif and averageif functions might be useful.

    • http://blog.bufferapp.com Kevan

      Hi Terence! Thanks so much for the comment! That sounds like a great way to go about it. I’m excited to try out your ideas. Thanks for the help! 🙂

  • http://www.outbrain.com/ Will Fleiss

    epic!

    • http://blog.bufferapp.com Kevan

      Thanks, Will!

  • Amanda Gallucci

    Love the idea of tracking day of the week! That’s something I will add to my audits in the future.

    • http://blog.bufferapp.com Kevan

      Thanks, Amanda! Very glad that makes sense for you all. It was an awesome one from Matthew! 🙂

  • Swati Nagpal

    Brilliant work! Thanks. 🙂

    • http://blog.bufferapp.com Kevan

      Happy you found this useful, Swati!

  • http://marketeer.kapost.com/ Andrew J. Coate

    This is amazing (as always). Fantastic insights, Kevan! Such an incredible amount of work and thought you folks put into this.

    Our hope at Kapost is to ease content audits like this in the future with http://contentauditor.com/ This post has given some insights to us as well!

    • http://blog.bufferapp.com Kevan

      Hi there Andrew! Thanks for mentioning contentauditor.com here! I’ve enjoyed playing around with this tool quite a bit!


  • http://about.me/brennakl brennaKL

    Great SEO tip about shortening blog titles, can’t wait to install that plug-in 😉 Keep up the amazing work Kevan!

    • http://blog.bufferapp.com Kevan

      Thanks, Brenna!

  • https://www.sortmoney.com Rajkanwar Batra

    Wow, It looks like you have written one blog every second day. How large is your team which does this?

    • http://blog.bufferapp.com Kevan

      Hi there Rajkanwar! Yes, tons of credit to the writers that came before me. 🙂 Our team right now is two full people (and a couple halves) 🙂

  • http://www.TheSpreadsheetGuru.com/ Chris Macro

    As an Excel blogger, I’m glad to see you put the program to good use! Excellent article and great use of analysis. I plan to do a similar audit at year end. Cheers!

    • http://blog.bufferapp.com Kevan

      Awesome stuff, Chris!

  • http://pandasecurity.com Ana Etxebarria

    Thanks, and again Thanks!! to you and Leo and all the others. This blog is absoultely inspiring. Love the Buffer team and the Buffer culture. You are awesome!

    • http://blog.bufferapp.com Kevan

      Thanks, Ana! 🙂

  • http://www.workreadplay.com Bryan Collins

    Good post Kevan. Kingsumo has a plugin that enables WordPress users to A/B test headlines in real-time. Do you use it here?

    • http://blog.bufferapp.com Kevan

      Good one, Bryan! I’ve heard great things about that plugin. We’ve yet to use it ourselves, choosing to A/B test headlines with tweets.

      What’s been your experience with the plugin? 🙂

  • http://rayedwards.com/about/ Ray Edwards

    This is an incredibly helpful and generous post. Once again Buffer demonstrates why you are one of my very favorite companies. Thanks!


    • http://blog.bufferapp.com Kevan

      Thanks so much, Ray!

  • HathawayP

    Really cool post Kevan, and thanks for mentioning URL Profiler. You can in fact cut down on all the tools and VLOOKUPs but using the integrations within URL Profiler (import from Screaming Frog, import Analytics data, scrape social shares directly) and allows you to keep it all in Excel too.

    I’d love to see a follow up post on what you end up doing with this insight (e.g. re-writing or deleting poor pages, or pimping up new pages etc…)

    Great job!

    • http://blog.bufferapp.com Kevan

      Thanks so much! Awesome to hear that URL Profiler adds all this functionality! Will definitely give this a look for audit 2.0!

  • KathleenHeuer

    Amazing work. I had no idea that generating a spreadsheet like this was even possible! Can’t wait to put one to work for me.

    • http://blog.bufferapp.com Kevan

      Thanks, Kathleen! Excited to hear how it works for you!

  • Angi @SchneiderPeeps

    Hi. I’m working through these steps and have a question. My blog has over 500 posts, when I did the Screaming Frog SEO Spider it said it was crawling 500 urls but the report it generated only has 150 urls on it. Can you tell me what I did wrong?

    • http://blog.bufferapp.com Kevan

      Hmm, that’s a good one, Angi. Sorry I’m not quite sure what might be up with this one! My very un-technical guess would be maybe retrying the spider in case it ran into some snags the first time. Sorry this one is a bit beyond my expertise!

      • Stuart Brameld

        I’d check the Spider configuration settings. Also if you have over 500 pages you’ll probably want to consider the paid version of Screaming Frog- not just so you can crawl the whole site, but it also enables a number of other really useful features.

  • http://www.metisme.com/ Shalin Tejpal Jain

    Kevan, thanks for that helpful post. Glad to see that you guys have taken the lead in elaborating the need of Content Audit. In the last couple of days, several customers have signed up with us primarily for their weekly content audits. But the fact is there still many web publishers who downplay the importance of content audit. We seriously believe that if not weekly, content audit should be done at least monthly. IMO, merely looking at Google Analytics independent of the blog posts at an individual unit level leaves a lot of room for optimisation of content and thereby lesser growth in key metrics.

    Kudos to your efforts on actually putting down the actionables basis the content audit of Buffer’s blog. Will inspire confidence in many to do regular content audit, even if they have just a couple of dozen posts. However, weekly / monthly content audits can be quite cumbersome and tedious for many. That’s the reason why we offer the same as a part of WPAlfred’s Rockstar package [http://www.wpalfred.com]. Do check it out and let me know what you think.

    • http://blog.bufferapp.com Kevan

      Awesome stuff! Thanks!

  • http://blitzen.com/ Blitzen

    Great post – very much appreciate your generosity.

    I was wondering if you’ve made an effort to ask some “why” questions around some of your findings? If so, I’d love to hear about some of those thoughts or perhaps that’s the topic of another helpful post!

    For example, long form content is clearly performing better than short form content. Do you think it’s the length of the post alone that is leading to the increase in social shares? Perhaps longer posts correspond to a certain kind of content (e.g. step by step guide), and it is the content type that is more shareable and not the length itself. Or, if you find users aren’t spending an amount of time on the page that corresponds to the length of the post, maybe people are sharing based on length, the title, the source (in your case, reputable), and some high level scanning. In either case, and in many not mentioned here, the actionable insight differs while still consulting the same data.

    I think this kind of data is an awesome thing to have as it answers a lot of “what” questions. However, digging a little deeper and trying to at least hypothesize around the “why” questions helps us fill in the gaps in the data. Without asking the “why”, the data suggests pumping out more 2500+ word blog posts… but the length might not be the main reason these posts perform so well.

    Just a thought! Either way, love this post!

    Cheers

    • http://blog.bufferapp.com Kevan

      Great thoughts! I love the deep thinking about the 2,500-word posts. My gut is that you’re right about the content weighing significantly into the success of the post. Those longer ones are typically our “complete guides” that people do seem to share often.

      Thanks again for the nudge with “why” questions! It’s a great one to keep in mind as I’m composing these posts!

      • http://blitzen.com/ Blitzen

        Didn’t realize it was the same author for both posts. I was just hoping to get some ideas from one of you… turned out you’re both the same person. Anyway, thanks for replying and if it leads to some future posts, that’s awesome! 🙂

  • Rashini

    Love you 4ever, Buffer! <3

    • http://blog.bufferapp.com Kevan

      Thanks, Rashini!

  • angsuman

    Love your detailed analysis and openness. Keep up the good work!

    • http://blog.bufferapp.com Kevan

      Thanks so much! Glad this was helpful!

  • http://realtyzations.com Realtyzations (aka Ali Zang)

    Hi Kevan, Another great post. What I really like about your posts overall is that your format is really a case study in how to use resources available online. I’m curious, what is your method for finding and evaluating an online service/resource, such as Screaming Frog for instance?

    • http://blog.bufferapp.com Kevan

      Hi Ali! Thanks for the comment! Great question, I think for Screaming Frog one of my considerations was that it is a tool mentioned by someone whose opinion I respect (Moz). If I don’t have a recommendation, I’ll often judge quite a bit based on the professional look of the tool and the ease of use – I can definitely improve here since I know I’ve probably overlooked so many useful ones!

      • http://realtyzations.com Realtyzations (aka Ali Zang)

        Best case scenario is definitely a referral, which ironically is a big reason I read your posts, yet if that isn’t the case I’m wondering how much trial and proofing are reasonable and feasible in order to make a qualified recommendation? Just thinking out loud. Thanks Kevan!

  • Everett Sizemore

    Great article Kevan! I just want to add that a lot of the data-combining and fetching can be done with URL Profiler. We used to do the same thing with our content audits (using the various services and then Vlookups to combine it all at the URL level) but now it’s all done automatically for us by this tool.

    • http://blog.bufferapp.com Kevan

      Thanks, Everett! Great tip there. I’ll give this one a try next time around. 🙂

  • Cami Bird

    I am having trouble with the google analytics report and can’t get the report to run. It keeps telling me I have an “Unknown metric” Suggestions? I followed your walk through to a T

  • http://seovavavoom.com/ Anthony Prive

    Hi Kevan. Thanks for this!

    Looking at your content audit spreadsheet, it looks like you have multiples tags on your posts. Is that on purpose?

    • http://blog.bufferapp.com Kevan

      Hi Anthony! Thanks for the comment! Great eye. 🙂 Yes, I believe our headlines and our main cta are both tags. I’m afraid I don’t have a whole lot of SEO expertise. What’s been your take on multiple tags on a page?

      • Jrm PrvO-st

        I would say Keep this one What We Learned Analyzing 595 Buffer Blogposts: A Complete Content Audit and Spreadsheet Template and place the other in h4. Use h2/h3 for your content in order to set a content hierarchy

      • http://seovavavoom.com/ Anthony Prive

        Hi Kevan, thanks for replying.
        Having multiple tags on a page can confuse Google as it checks heading tags (amongst other things) to understand what the content is about.
        Therefore tags should be organised with a hierarchy h1>h2>h3>… with being the main one.
        While you can have multiple h2, h3,… tags, it is recommended to have a unique heading on a page.

        Try to see tags as a way to organise your content, as you would do in a dissertation, or a white paper for instance.

        Hope that helps,
        Anthony

  • http://bradonomics.com/ Brad West

    For Buffer (and a lot of other companies) views and shares are metrics worth tracking. But I’d rather not optimize for that. Instead I’d rather optimize for impact …only I don’t know of any good ways to track that. Any ideas?

    • http://courtneyseiter.com/ Courtney Seiter

      Hey Brad! Awesome thinking here; makes a lot of sense! How would you define impact, do you think? Feels like once you get that definition in mind, maybe there will be metrics that could align with it?

      • http://bradonomics.com/ Brad West

        That’s the thing. Shares, pageviews and the like are objective. But impact, being subjective and often unreported, is almost impossible to track. There are a few bloggers who’ve written articles that have significantly changed my thinking, and while I might have shared their posts, I didn’t email or comment to let them know.

        I don’t know how to measure something that’s unreported like this, but I think it’s a conversation we need to start having.

  • Jrm PrvO-st

    Thanks for this post, i worked on my own blog audit a few hours and i’m not so enthusiastic.

    I used the same tools and finally get the audit with quick wins as my posts from 1000 to 1500 words perform better and get shared much more.

    The thing is the screaming frog wordcount is absolutely wrong, it does not

    count the article words but the whole page words and especially comments that are below the article.

    And the first result is, the more popular a post is the more comments it gets and makes the wordcount growing.

    http://blog.bufferapp.com/idea-to-paying-customers-in-7-weeks-how-we-did-it is announced 6771 words in your audit but the post is only 1298 words

    • http://blog.bufferapp.com Kevan

      Hi there! Thanks so much for the comment. Ah, definitely looks like the word count may be off. I really appreciate your digging this up for us. Sorry to have thrown off some of the advice/learnings based on these!

      • Jrm PrvO-st

        You’re welcome, but still the audit works perfectly for the title lenght that matters in google news rankings/google search

  • Martin Rotstain

    You have to add CQS to your worksheet. We have developed a tool named CQS (Content Quality Score) checker exactly for this purpose: to make sure your content is okay.

    I have checked yellow.com as it is one of the big losers (lost 80% in traffic) and indeed it got a score of 28 out of 100.

    To be on the safe side, you want a CQS of 70 and up.

    Free registration: http://www.cqscore.com

    Watch a demo on http://www.youtube.com/watch?v=zWF3gVU5dZI

    We are now working to embed Google’s semantically comprehensive wording in the tool.

    Warning: The tool is in its early beta stage. If a feature of CQS is not working as you expect, please bear with us as we work through these issues.

    • http://blog.bufferapp.com Kevan

      Thanks for the tip, Martin!

  • CMG

    This is a great post! Thanks so much for the detailed write-up. I’m curious: what date range did you use when you created the Google Analytics reports as well as the search data reports via Moz? I imagine the data would shift depending on the date range, so I’m curious to know how you thought about this.

    Thanks!

    • http://blog.bufferapp.com Kevan

      Hi there! Thanks for the question! If I remember right, the GA reports covered traffic from all-time since the posts were originally published. I believe the same was true of the Moz data, e.g. the inbound links represent all links since publication. Hope this helps! Please do let me know if I can be clearer on any of this!

  • Chloe Gray

    Great post! This is so useful. I’m curious to know the date ranged you used when pulling data from Google Analytics and the keyword tools. This data will fluctuate over time, so the date range could heavily affect the results of your audit. How did you think about this?

  • http://www.hover.com/blog Michael Keshen

    Very helpful! I’m encountering one problem though. When I run the Google Analytics report in Google Sheets I get the following error:

    Unique Views: Unknown metric(s): ga:contentGroupUniqueViewsXX

    Is that happening to anyone else/anyone know what’s causing that?

    • Adtmin

      That seems to be solved using Universal Analytics 🙂

  • Aki Balogh

    Hi Kevan,

    Great post on how perform a content inventory by looking at data from the tools you have today.

    There’s another approach: you can compare your content to ranking content, and identify the gaps.

    Check out our Site Audit tool at MarketMuse (https://www.marketmuse.com). Given a keyword you want to rank for, we tell you what topical gaps you have that are preventing your site from ranking.

    I co-founded MarketMuse and would love to hear your thoughts and feedback.

    Aki

    • http://blog.bufferapp.com Kevan

      Thanks for sharing this, Aki! Excited to take a look!

      • Aki Balogh

        Thanks, Kevan. Feel free to ping me with any questions or if you’d like a full walkthrough of how it works. (In brief: we use machine learning on the back-end to generate topically-related keywords.)

  • http://framed.io/ Tim Wut

    Always thorough. Thanks for this Kevan – great idea to rope in the Google Analytics data. I’ve seen similar blog auditing posts in the past that left this out.

    Do you guys ever run heat map or read analytics on your longer form posts? Would be really interested in seeing the read::share ratio (and perhaps bounce if people are finding that they need to bookmark and get ot it later).

  • Melissa Reyes

    Risking repetition by not looking for a relevant q&a in the comments, I wonder if when looking at the data of 500+ posts you factored in popularity of the website? I would think for a blogger who is building up a following from scratch, some content from the earliest posts would have lower views and shares than they would if posted more recently. Doesn’t all of this data need to be viewed on some sort of scale based on readership?

  • Magda Baciu

    Hi, Kevan! Thank you for the thorough article, I have already started to put it in practice. Though, there is something that I would like to ask. How have you proceeded with Moved Permanently URLs? There are like 500 URLs within this category and I took a sample of 20 to see what’s the error, but they are actually working, the URLs are available. False negative?

  • Magda Baciu

    I have already put it in practice, thank you for the tips.
    Nevertheless, I would like to know what insights did you get from the ”level”column.

    Does the depth level of an URL affect SEO?


  • JessLHutton

    Kevan, I just realized this afternoon that SEMRush pulls in day and time of day more cleanly than the XPath formula. If you have a paid account, it’s worth it.

    Also, this post is a lifesaver – we’ve shared it around Clearlink about a gajillion times. Thank you!

  • sharedcount

    Hi there, I noticed that you are linking to Sharedcount.com on your article https://blog.bufferapp.com/content-audit . As that service is being discontinued, perhaps it could be useful to replace the link for Sharescount.com, a new working alternative. Keep up the great work!