Hi all! I am working on a few different publishing sites that all have a lot of outdated, unhelpful content. Traffic to this content has dropped a lot in the last year, so we are working through it and taking a "kitchen sink approach" to turn things around. Internally we have what we call a "content inventory file", which is just a Google Sheet (the most valuable tool in an SEO's stack IMO lol) that we pull every quarter with all the indexable URLs, their metadata, and KPIs. We use it regularly for a lot of different workflows. The KPIs include things like total traffic each URL has received in the last 12 months, total search traffic over the same period, total ranking keywords, etc. We can filter the file by tag to understand topical performance, and by content type to understand template performance.
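For anyone curious what that segmentation looks like in practice, here's a rough sketch. The row structure and field names (`tag`, `type`, `search_traffic_12mo`) are just illustrative stand-ins for our actual sheet columns, not the real headers:

```python
# Minimal sketch of the inventory-file workflow: rows mirror lines in
# the sheet, and we roll up a KPI by any segment field (tag, type, etc.).
inventory = [
    {"url": "/guides/a", "tag": "guides", "type": "evergreen", "search_traffic_12mo": 12000},
    {"url": "/guides/b", "tag": "guides", "type": "listicle",  "search_traffic_12mo": 300},
    {"url": "/news/c",   "tag": "news",   "type": "news",      "search_traffic_12mo": 150},
]

def segment_traffic(rows, key):
    """Sum 12-month search traffic by a segment field."""
    totals = {}
    for row in rows:
        totals[row[key]] = totals.get(row[key], 0) + row["search_traffic_12mo"]
    return totals

print(segment_traffic(inventory, "tag"))   # → {'guides': 12300, 'news': 150}
print(segment_traffic(inventory, "type"))  # template-level view of the same data
```

In the real workflow this is just a pivot table in the sheet, but the logic is the same.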
We would like to get a better idea of performance across multiple URLs: how our traffic is spread across the site, and how efficient our content is. Meaning, I would like to know whether a small set of URLs is driving the majority of search traffic, and I'd like to be able to see the value of different URLs both at the URL level and at a higher level. I've toyed around with adding in cumulative traffic, traffic share/percentage, and stuff like that, but I'm wondering if anyone has helpful tips to share or other ways we should be thinking about this to make sure we're looking at our priority content first and foremost. We've also pulled in YoY and PoP traffic data to get a feel for what content has dropped the most, but I'm wondering if there's a better way to think about that too. Any and all input is welcome! Thanks :)
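To make the cumulative traffic / traffic share idea concrete, this is roughly the Pareto-style calculation I've been toying with: sort URLs by 12-month search traffic, then compute each URL's share of the total and the running cumulative share. Again, field names are illustrative placeholders:

```python
# Sketch of the cumulative-share idea: rank URLs by search traffic and
# track what fraction of total traffic the top N URLs account for.
def cumulative_share(rows, metric="search_traffic_12mo"):
    total = sum(r[metric] for r in rows) or 1  # avoid divide-by-zero on empty segments
    ranked = sorted(rows, key=lambda r: r[metric], reverse=True)
    running = 0
    out = []
    for r in ranked:
        running += r[metric]
        out.append({
            "url": r["url"],
            "share": r[metric] / total,
            "cumulative_share": running / total,
        })
    return out

urls = [
    {"url": "/a", "search_traffic_12mo": 8000},
    {"url": "/b", "search_traffic_12mo": 1500},
    {"url": "/c", "search_traffic_12mo": 500},
]
ranked = cumulative_share(urls)
# Here /a alone is 80% of search traffic, and /a + /b reach 95% —
# a quick way to flag how concentrated (or "efficient") a section is.
priority = [r["url"] for r in ranked if r["cumulative_share"] <= 0.8]
```

Running the same calculation per tag or per content type is what helps us decide which priority buckets to audit first.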