Here are the two truths I know about modern marketing. First, content has emerged as the best way to attract, compel, and close deals. Second, marketing is now being held accountable for driving real, quantifiable demand.
But while these are both truths, they haven’t always gotten along.
That’s because as the buyer’s journey has changed, marketing teams have had to own the top of the funnel, often holding a lead generation quota much as sales bears a revenue number.
And marketing automation has armed marketers with lead scoring, which signals when a buyer has reached the point where we can declare them a qualified lead.
With lead scoring, marketers know whether they’re hitting their demand gen goals, but they still lack insight into why. Content Scoring, a new methodology, changes that by tracking how individual content assets and campaigns perform in generating leads and opportunities.
Okay, but how does Content Scoring work? Begin by looking at one Buyer’s journey to becoming an MQL (we’ll use MQL, or Marketing Qualified Lead, in our example, but Content Scoring can measure from any point in the Buyer’s journey).
Here’s the hypothetical journey of our Buyer, with each “touch” on the journey described by the content asset with which they interacted:
| Touch # | Description |
| --- | --- |
| 1 | Reads BlogPostA |
| 2 | Watches VideoB |
| 3 | Attends WebinarC |
| 4 | Downloads eBookD |
| 5 | Opens EmailE |
| 6 | Watches ProductVideoF |
|   | BECOMES MQL |
Content Scoring looks at this journey and allocates the value of the MQL back to the various content assets that were touched. In its simplest form, Content Scoring would give each of these content assets an equal amount of credit for the MQL. Since there were six touches, each asset receives ⅙ of that MQL’s value.
But most organizations give more weight to the first and last touch: the first because it serves as the first contact point with the buyer; the last because it led the buyer to become an MQL.
If the first and last touches were given 30% of the credit apiece, that would leave 40% to distribute among the remaining four touches (or 10% each). So the Content Score earned per asset for this sample MQL would be the following:
| Asset | Score Earned from Sample MQL |
| --- | --- |
| BlogPostA | 0.3 |
| VideoB | 0.1 |
| WebinarC | 0.1 |
| eBookD | 0.1 |
| EmailE | 0.1 |
| ProductVideoF | 0.3 |
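To make the mechanics concrete, here’s a minimal sketch of that position-based split in code. The 30/30/40 weighting and the six-touch journey come straight from the example above; the function name `score_journey`, the handling of one- and two-touch journeys, and the rest of the structure are illustrative assumptions, not a description of how Kapost implements the feature.

```python
def score_journey(assets, first_last_weight=0.3):
    """Allocate one MQL's worth of credit across the assets in a journey.

    The first and last touches each get `first_last_weight` of the credit;
    the remainder is split evenly among the middle touches. Journeys with
    one or two touches are simplified edge cases (an assumption here).
    """
    if len(assets) == 1:
        return {assets[0]: 1.0}
    if len(assets) == 2:
        return {assets[0]: 0.5, assets[1]: 0.5}

    middle_weight = (1.0 - 2 * first_last_weight) / (len(assets) - 2)
    scores = {asset: middle_weight for asset in assets[1:-1]}
    scores[assets[0]] = first_last_weight
    scores[assets[-1]] = first_last_weight
    return scores

# The six-touch journey from the table above:
journey = ["BlogPostA", "VideoB", "WebinarC", "eBookD", "EmailE", "ProductVideoF"]
print(score_journey(journey))
# BlogPostA and ProductVideoF earn 0.3 each; the four middle touches earn 0.1 each.
```

Whatever weighting you choose, the scores for a single journey always sum to 1.0, which is what makes the aggregate numbers below add up to the total MQL count.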
That’s the example of a single buyer. The insight becomes richer and more actionable when these values are added up across every MQL generated in a given period. Suppose your marketing team generated 2,000 MQLs in a quarter. You could then look back and analyze the aggregate Content Score of all the assets that touched those MQLs. Totaling the results of the individual journeys (like the table above) would yield something like this:
| Asset | Content Score |
| --- | --- |
| ProductPageA | 238.9 |
| WebinarB | 122.3 |
| VideoC | 97.4 |
| BlogPostD | 89.9 |
| BlogPostE | 76.6 |
| EmailF | 75.9 |
| etc. | . . . |
The sum of the “Content Score” column would equal 2,000, because the credit for each MQL generated has been attributed across the content assets that touched it. Those individual Content Scores represent the number of leads, opportunities, and/or closed deals that a piece of content generated. With them, marketers can now look back at their performance (in this case, generating 2,000 MQLs) and understand clearly what worked and what didn’t.
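Building on the `score_journey` sketch above, aggregation is just a running total across journeys. The three hypothetical journeys below are invented for illustration; the key property is that each MQL contributes exactly 1.0 of credit, so the scores sum to the number of MQLs.

```python
from collections import Counter

def aggregate_content_scores(journeys):
    """Sum per-asset credit across many MQL journeys.

    Because each journey hands out exactly 1.0 of credit, the grand total
    equals the number of MQLs aggregated (2,000 in the example above).
    Uses score_journey from the earlier sketch.
    """
    totals = Counter()
    for journey in journeys:
        totals.update(score_journey(journey))
    return totals

# Three hypothetical MQL journeys (stand-ins for the 2,000 in the example):
journeys = [
    ["BlogPostD", "WebinarB", "ProductPageA"],
    ["WebinarB", "EmailF", "VideoC", "ProductPageA"],
    ["BlogPostE", "VideoC", "ProductPageA"],
]
scores = aggregate_content_scores(journeys)
for asset, score in scores.most_common():
    print(f"{asset}\t{score:.1f}")
print("Total:", round(sum(scores.values()), 1))  # equals the number of MQLs (3)
```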
We reviewed one simple example manually. Imagine doing that across hundreds, even thousands of leads. That’s why we built Content Scoring as a feature within Kapost, giving users the ability to process the millions and millions of buyer-content interactions that modern marketing efforts generate. This gives marketers real-time visibility into their performance and helps them discover patterns.
We also only glanced at one simple approach to Content Scoring. There’s much more to discover: different approaches to weighting touches, different ways to slice and dice results (by persona, buying stage, category, author, etc.). We’re going to continue to evolve Content Scoring, both as a practice and as a feature.
The days of ignorance are over. Marketers now have the visibility into performance that enables success. Lead Scoring made marketers accountable by measuring the activity of their buyer. Content Scoring, the counterpart to Lead Scoring, now makes marketers successful by measuring their efforts to affect their buyer.
See How Content Scoring Fits into the Framework
Explore the entire lifecycle of content creation through to sales. See how Kapost can make this process work for your brand.