Take advantage of Microsoft’s 2016 Power BI (Business Intelligence) rollout to collect and analyze more data than ever before and find the SEO performance gaps you didn’t even know you had.
If you’re not familiar with Microsoft’s Business Intelligence paradigm, then I heartily recommend you take a walk through their promotional material. The most pertinent part for SEOs is the ability to embed Queries natively in the latest Office 365 suite (without requiring the Power Query add-in for Excel, for example), allowing advanced capabilities baked into Excel since 2013 to be accessed easily by anyone in your corporation (or client-side) on the 365 platform.
Queries are fantastic because they allow you to parse serious quantities of data relatively easily and extend desktop tools to operate more like servers using SQL-like statements. If code is not your bag, the BI suite is designed to be largely WYSIWYG using 365 natively or their desktop BI interface. Of course, you can get “advanced” in your query building and drop “M” code in directly to create your own functions (you will end up doing this… and you will love it).
Imagine running INDEX MATCH or VLOOKUPs on five million rows of data instantly, and you’ll get a feel for the, well, power of Power Queries. In the screen grab above, a dataset of more than 20GB is transformed into a manageable table, made available to an SEO for performance-gap analysis as a Pivot Table. Calculations that would be impossible within Excel due to data size are handled easily by being transformed into data restrictions within the final SQL statements underpinning the data generation process.
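To get a feel for why this kind of lookup scales, here is a minimal sketch in Python of what an INDEX MATCH / VLOOKUP amounts to when done as a single-pass hash join (the field names and sample rows are hypothetical, purely for illustration):

```python
# Illustrative sketch: the lookup INDEX MATCH / VLOOKUP performs, done as a
# hash join. The index is built once, so each lookup is constant-time and
# the approach stays fast on millions of rows. Data is hypothetical.

def hash_join(left_rows, right_rows, key):
    """Join two lists of dicts on `key`, like a VLOOKUP per left row."""
    index = {row[key]: row for row in right_rows}  # build once
    joined = []
    for row in left_rows:
        match = index.get(row[key])
        if match is not None:
            merged = dict(row)
            merged.update(match)
            joined.append(merged)
    return joined

organic = [{"term": "red shoes", "clicks": 120},
           {"term": "blue shoes", "clicks": 80}]
paid = [{"term": "red shoes", "cpc": 0.45},
        {"term": "green shoes", "cpc": 0.60}]

joined_rows = hash_join(organic, paid, "term")
print(joined_rows)  # only "red shoes" appears in both tables
```

The same principle is what lets a query engine answer a join over millions of rows in the time a spreadsheet formula takes to scan a few thousand.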
Refactoring even trivial operations into M can be challenging, especially when you get into the realms of calculating data based on an earlier state of your data transformation process (Doctor Who has nothing on this!). Instead, you can simply run, say, a MIN / MAX across the generated data set to get a value that can be referenced in a pivot table and picked up by an Excel formula. But you should push to move all logic into your data queries, as it allows all outputs to be made available robustly within the Data Model when your Queries run, meaning they become available to all Power Pivots in Excel for further segmentation. And you will inevitably start segmenting within your larger data sets to find the long-tail gold for performance improvements. Just embrace it!
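The "move logic into the query" idea can be sketched in a few lines of Python: compute the aggregate during the transformation step and stamp it onto every row, so every downstream pivot sees the same value rather than each analyst re-deriving it with a spreadsheet formula (the column names here are hypothetical):

```python
# Sketch: precomputing a per-term MIN during the transformation step,
# the query-side equivalent of a MIN formula run after the fact.
# Field names and data are hypothetical.
rows = [{"term": "red shoes", "position": 2.0},
        {"term": "red shoes", "position": 5.0},
        {"term": "blue shoes", "position": 1.0}]

best = {}
for r in rows:
    t = r["term"]
    best[t] = min(best.get(t, float("inf")), r["position"])

# Stamp the per-term minimum onto every row, so it travels with the data.
for r in rows:
    r["best_position"] = best[r["term"]]

print(rows[0])  # {'term': 'red shoes', 'position': 2.0, 'best_position': 2.0}
```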
You can also replace pretty much any VBA scripting you care to think of with M and the BI framework, which means automating your data refresh just became a whole lot easier and more robust. Sharing data and dynamically updating the underlying data are also a cinch.
To see if we could leverage the power of BI for our clients at QueryClick, we recently refactored a Search Console analysis tool to use Power BI and found it allowed us to take granular day-by-day data across 250,000 search terms and match it against the equivalent time data from AdWords… again, instantly.
This allowed us to build on the existing tool to share AdWords’ performance data with its organic twin and better understand the user behavior and revenue value associated with each term, to a level of detail that had previously been impractical to analyze.
In addition, as it is so easy to pull in additional data sources as Queries, we are now able to look at historic performance trends from the likes of Stat and calculate a “volatility” metric for the top positions, again associated with each term — a level of data transformation and number crunching that would cause standard Excel approaches to shut up shop and go home for the day.
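The article doesn’t define its volatility metric, but one natural candidate is the standard deviation of a term’s daily rank over a window; a stable term scores zero, a bouncing one scores higher. A sketch under that assumption (the ranks are made up):

```python
# Sketch of one possible "volatility" metric: the population standard
# deviation of a term's daily rank. This is an illustrative assumption,
# not the metric the article's tool necessarily uses.
from statistics import pstdev

daily_ranks = {"red shoes": [1, 1, 2, 1, 3],
               "blue shoes": [2, 2, 2, 2, 2]}

volatility = {term: pstdev(ranks) for term, ranks in daily_ranks.items()}

print(volatility["blue shoes"])  # 0.0, because the rank never moved
```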
Why would that depth of data be useful? Well, it means we can assess the difficulty of performing in the traffic-driving positions (positions 1–3) and contrast it with the expected return (the term’s historic revenue-per-click behavior from AdWords).
That allows us to laser-focus on terms that provably drive revenue for a business and to quantify clearly the impact of a performance improvement: revenue gained, plus paid spend that can be tested for reallocation once a top position is achieved and SERP CTR behavior is higher than expected for the position (indicating searchers are overwhelmingly satisfied with the organic SERP call to action).
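The “higher CTR than expected for the position” test can be sketched as a comparison against an expected-CTR-by-position curve. The curve below is a made-up placeholder (in practice you would derive it from your own Search Console data), and the terms are hypothetical:

```python
# Sketch: flagging terms whose SERP CTR beats the expected CTR for their
# position. EXPECTED_CTR is a placeholder curve, not real benchmark data.
EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10}

terms = [{"term": "red shoes", "position": 1, "ctr": 0.42},
         {"term": "blue shoes", "position": 2, "ctr": 0.11}]

for t in terms:
    # Ratio > 1.0 means the listing out-performs its position.
    t["ctr_vs_expected"] = t["ctr"] / EXPECTED_CTR[t["position"]]

overperformers = [t["term"] for t in terms if t["ctr_vs_expected"] > 1.0]
print(overperformers)  # ['red shoes']
```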
Of course, we can flip that analysis approach into reverse and use organic data to lead PPC insights. For example, we can show the revenue available for generic search term expansion where paid behavior is better than expected (indicating a tight match between searcher intent and client product offer) and organic behavior is worse than expected (suggesting some level of paid cannibalization).
In our first test data case, for a UK high street retailer, we found £2.3m of additional paid opportunity in this scenario, and that’s just in the first pass! Flipping our data filter around to look exclusively at brand terms (we used Power Query to turn terms entered into a table in the Excel workbook into a brand tagging mechanism for the data), we found £1.8m of spend on brand terms where organic behavior was extremely strong (more than 40 percent above expected CTR in the first rank position in the SERP) and eligible for testing for reallocation in the AdWords account to growth-generating generic terms.
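The brand tagging mechanism can be sketched simply: a small table of brand tokens maintained in the workbook becomes a lookup that labels every search term as brand or generic. The tokens and terms below are hypothetical stand-ins:

```python
# Sketch of brand tagging from a workbook table: any term containing a
# brand token is labeled "brand", everything else "generic".
# Tokens and terms are hypothetical.
BRAND_TOKENS = ["acme", "acme shoes"]

def tag(term):
    """Label a search term as brand or generic by substring match."""
    t = term.lower()
    return "brand" if any(tok in t for tok in BRAND_TOKENS) else "generic"

print(tag("acme running shoes"))  # brand
print(tag("running shoes"))       # generic
```

Once the tag travels with the data in the Data Model, flipping the filter between brand and generic views is a single slicer click rather than a re-run of the analysis.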
“Big data” is truly large data, much larger than the data discussed here, so it isn’t the right terminology; but Power BI allows larger-than-usual data sets to be easily and dynamically analyzed by an SEO with an eye to uncovering the opportunities and inefficiencies in a campaign strategy. I can’t recommend it highly enough, and I look forward to hearing about your favorite BI use cases.
Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land.