Recently, a client sent over a site audit another agency had prepared. The client was concerned; although they were happy with their campaign to date, the report supposedly identified a slew of “critical” issues to fix on their site. What did we think?
At first, I was worried. Had the Ninjas missed something? Would the client doubt our due diligence? I reviewed the audit and within 5 minutes, I wrote back to the Account Management team: tell the Client to relax and stay the course. The critical issues weren’t critical at all. In fact, the audit wasn’t really an audit, nor a report. It was a glorified checklist, churned out by software. I doubt a human wrote a single word of it.
I should have known at first glance. The language and writing style, the templated feel of the sections, the icons and graphics – everything about the audit screamed Automated Reporting. A growing number of agencies and SEOs are churning out little more than a series of checklists with fancy graphics.
They rely heavily on scraped reporting and data dumps from other third-party tools and suites – Webmaster Tools, Google Analytics, BuiltWith, etc. The commentary and analysis, if any, are thin and boilerplate – they often repeat from section to section and populate from a menu of drop-down choices.
Once the APIs and data pulls are set up, this kind of programmatic reporting is VERY cost-effective for agencies to sell. For the clients, however? Poor value.
Checklists are NOT audits
These kinds of Automated Reports do not meet the standards IMN holds for our own audits and reporting.
I cannot stress this enough: a checklist is not an audit. A data dump is not a report.
Audits and reports certainly draw on checklists and data, but the latter can never fully substitute for the former.
Here’s why Automated Reporting often fails both agencies and clients:
Missing or Unclear Action Items:
The Automated Report included a section listing a few social icons and buttons on pages that were missing Alt text attributes. These issues were not really that important (there was no genuine need to optimize Alt text for decorative icons and buttons, particularly for SEO purposes), but more to the point, even if they were, there was nothing actionable about that section.
If the Client was going to put in a request to their developer, what should it be? How should the Alt text read? There were no specifics here. The report didn’t even explicitly say to create the missing Alt text attributes – it just identified them. You can imagine the client’s reaction: “And?”
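By contrast, here’s what an actionable version of that finding could look like. This is only a rough Python sketch – the page URL is a placeholder, and it leans on the `requests` and `beautifulsoup4` packages rather than whatever software generated the original report – but it makes the point: flag the missing attribute AND spell out the fix.

```python
# Rough sketch: find images missing alt attributes and say what to do.
# PAGE_URL is a hypothetical placeholder, not the client's actual site.
import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://example.com/"

html = requests.get(PAGE_URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for img in soup.find_all("img"):
    if img.get("alt") is None:  # attribute missing entirely
        src = img.get("src", "(no src)")
        # Decorative graphics (social icons, spacers) should carry an
        # empty alt="" so screen readers skip them; content images need
        # a short description. Either way, the request to the developer
        # is explicit.
        print(f'MISSING alt on {src}: add alt="" if decorative, '
              "or a short description if the image conveys content.")
```

A developer can act on that output immediately; nobody has to circle back and ask what the report meant.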
Vague Action Items Cost Time and Money
Clients may not have the time or the knowledge needed to communicate to other stakeholders exactly what needs to get done. A decent audit should never just identify a missed checkbox – it should include a specific action or sequence of actions the client can take to rectify the issue.
Otherwise, development teams or other stakeholders will be left on their own to make decisions they aren’t in a position to make. They’ll circle back for more clarification or make mistakes. This wastes client time and money.
No Prioritization:
Other sections of the Automated Report were a little more actionable, but they were not prioritized or stack-ranked against one another in terms of importance. If a client has missing title tags on pages and they also have stacked redirects, which issue should the developers tackle first?
SEO and digital marketing do not exist in a vacuum – they are context-specific. Agencies and SEOs who sell site audits should have an adequate understanding of the client, their organization, their business model, their competition, etc., and they should understand how ALL of these factors fit together.
Otherwise, they will not be able to properly prioritize the Action Items they’ve identified. At minimum, Action Items should be tagged as Low, Medium, or High priority, although numerical or letter grading works as well.
Prioritization is Everything
No Effort Level Grading:
Similar to the lack of prioritization, the Automated Report didn’t describe or characterize the level of effort needed for Action Items. A canonical issue impacting the homepage might be a High priority, but it’s a relatively “quick fix” – a redirect or a line of code can resolve the issue.
Conversely, a hornet’s nest of stacked redirect chains might be a Medium priority, but it might take a bit of heavy lifting from an SEO or developer team to work through them all.
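To make “stacked redirects” concrete, here’s an illustrative sketch – not a tool IMN ships, and the starting URL is hypothetical – that follows a URL’s redirect chain with Python’s `requests` library and flags anything longer than a single hop:

```python
# Rough sketch: detect stacked redirect chains for one URL.
import requests

START_URL = "https://example.com/old-page"  # hypothetical URL to test

resp = requests.get(START_URL, timeout=10)  # redirects followed by default

# requests records every intermediate hop in resp.history.
hops = [r.url for r in resp.history] + [resp.url]
if len(resp.history) > 1:
    print(f"Stacked chain ({len(resp.history)} hops):")
    for url in hops:
        print("  ->", url)
    print("Consider collapsing the chain into a single 301 to the final URL.")
else:
    print("No stacked redirects here.")
```

Checking one URL is trivial; untangling hundreds of chains across a large site is the heavy lifting an effort grade should capture.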
Unfortunately, in digital marketing, and particularly in SEO, the importance of an Action Item does not always correlate with the effort needed to implement it. Some very important action items are easy to fix; some are not. Some low-priority action items could be more fuss than they’re worth to address and hence have a low ROI. One cannot even begin to gauge the level of effort for an action item without a clear understanding of the Client’s resources and budget.
It’s one thing if the Client has an in-house team of 5 developers and another if the Client wears all the hats within the business. If they rely on freelancers or outsource work, then every Action Item is also a bill. SEO audits should demonstrate cognizance of these factors. Automated Reporting will never do that.
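To illustrate, here’s a toy sketch of what priority-plus-effort stack ranking can look like. The items and gradings below are invented for demonstration, and this is not IMN’s actual report format:

```python
# Toy example: rank invented action items by priority, then by effort.
PRIORITY = {"High": 3, "Medium": 2, "Low": 1}
EFFORT = {"Low": 1, "Medium": 2, "High": 3}  # lower effort ranks earlier

action_items = [
    {"task": "Fix homepage canonical tag", "priority": "High", "effort": "Low"},
    {"task": "Untangle stacked redirect chains", "priority": "Medium", "effort": "High"},
    {"task": "Write missing title tags", "priority": "High", "effort": "Medium"},
]

# Highest priority first; among equals, the quicker fix comes first.
ranked = sorted(
    action_items,
    key=lambda item: (-PRIORITY[item["priority"]], EFFORT[item["effort"]]),
)

for n, item in enumerate(ranked, 1):
    print(f'{n}. {item["task"]} '
          f'(priority: {item["priority"]}, effort: {item["effort"]})')
```

Quick, high-impact fixes rise to the top and heavy-lift, middling-priority work sinks – exactly the ordering a stretched developer team needs.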
No Strategy or Timeline:
The Automated Report left our client wondering: “What’s next?” and “How long?” – never mind the concern always on the mind of clients: “How much?” In total, there were fewer than a dozen clear Action Items we could identify from the audit, but there was absolutely no timeline or strategy for how they would sequence or fit together. This, ultimately, might have been the greatest deficiency of the report.
In terms of expectation setting, it’s critical for SEOs and marketers to be able to communicate how long various Action Items, initiatives and campaigns are going to realistically take to complete and how long before they start showing measurable results.
Technical SEO issues can typically be resolved within a few days to weeks if developers have the necessary bandwidth. However, a brand-new content marketing plan is going to take months to execute. Other priorities like link building and social media can be ongoing, never-ending initiatives.
A good site audit should lay out, even roughly, how long the various Action Items are going to take to implement and how long until their impact can be observed or measured. IMN uses tables, Gantt charts, calendars – whatever visualizations our clients prefer – to timeline and sequence the various action items and plans. Automated Reporting is rarely, if ever, timelined.
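As a bare-bones illustration – the tasks and week estimates are made up, and a real plan would be built around the client’s actual resources – even a ranked list can be turned into a rough sequenced timeline:

```python
# Toy example: turn ranked action items into a rough week-by-week plan.
plan = [
    ("Fix homepage canonical tag", 1),        # (task, estimated weeks)
    ("Write missing title tags", 2),
    ("Untangle stacked redirect chains", 4),
]

week = 1
for task, weeks in plan:
    print(f"Weeks {week}-{week + weeks - 1}: {task}")
    week += weeks
```

Even something this crude answers “What’s next?” and “How long?” – the two questions the Automated Report left hanging.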
No Insights, No Analysis
Don’t get me wrong. There is a place for tools, software, and data pulls. IMN uses tools all the time. We LOVE tools here.
Our CEO, Jim Boykin, is always brainstorming and designing new tools and widgets that we can use to make our jobs easier and our clients happier. But tools are not a substitute for actual analysis and insight. This is where Automated Reporting will always fall short.
It’s not enough to programmatically move through a checklist – a trained and experienced analyst might start with a checklist, but they are not going to entirely trust or rely on one either. It’s intellectually lazy, for one, and it’s a risky move as well. Software is only as smart as it’s programmed to be, and there are inevitably going to be deficiencies or things just plain wrong with a report devoid of any human agency.
For example, the Automated Report our client was sent included a list of reported 404s from Search Console. The Report said “Fix these,” but that’s not a very accurate instruction. It implies something needs to be fixed to begin with.
There is nothing inherently wrong with a page returning a 404 status code and in many cases it’s EXACTLY what should happen. There’s nothing to fix, because nothing is wrong. The pages do not exist.
If there are broken internal links pointing to a non-existent page, then the internal links should be updated. That’s an internal linking issue. If it’s an old page that was never redirected properly, then a redirect needs to be implemented. That’s a site architecture issue.
But what about a weird external link from another site that Google followed to the 404 page? There is little a client can immediately do about it. They could 410 the page, sure, but is that really needed? Should they reach out to the linking domain? Maybe. Maybe not. What exactly is “wrong” with the 404s and what did the Automated Report mean when it said “Fix these?” It’s hard to say. No insights. No analysis.
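To show what that missing analysis could look like, here’s an illustrative sketch of 404 triage. The URLs, link counts, and thresholds below are all invented; the point is that the right “fix” depends on where the links to each 404 come from:

```python
# Toy example: triage reported 404s instead of blindly "fixing" them.
# All URLs and link counts are invented for illustration.
reported_404s = {
    "https://example.com/typo-page":       {"internal": 3, "external": 0},
    "https://example.com/retired-product": {"internal": 0, "external": 5},
    "https://example.com/spam-target":     {"internal": 0, "external": 1},
}

for url, links in reported_404s.items():
    if links["internal"] > 0:
        action = "Update the broken internal links pointing here."
    elif links["external"] >= 3:  # arbitrary threshold for this sketch
        action = "Consider a 301 redirect to preserve the link equity."
    else:
        action = "Probably fine as-is; a 404 is the correct response."
    print(f"{url}: {action}")
```

Three 404s, three different answers – and none of them is simply “Fix these.”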
Don’t Mistake Tools for Carpenters:
Thankfully, I get to spend most days leveraging tools AND experience. IMN audits contain TONS of external reports, data pulls, tool runs, etc., but they are organized primarily around solid, clear writing and analysis – not checklists or automation. All of our audits include Executive Summaries for leadership and Section Summaries for specific teams and stakeholders.
All sections come with clear Action Items, which are compiled in one master table and stack-ranked against one another in terms of Priority and Effort. They’re sequenced into at least a 3-to-6-month plan, if not longer. They take into account the client’s budget, experience, vertical, and internal resources. Frankly, they take quite a while to create, but we prefer it that way. Diligence takes time, and slowing down allows us to avoid mistaking the trees for the forest.
If you’ve received an Automated Report or want to see what an industry leading Reporting and Analysis team can produce for YOUR website, consider IMN for your next website audit.