One of the frustrations of doing content consultations is when the organisation resists recommendations to improve the content production process. The consultation is going smoothly. The discovery and gap analysis have been completed, and a set of recommendations has been approved. The stakeholders are on their way to adopting a highly efficient ecosystem, and then the drama begins.
Sources of resistance to content operations
The resistance falls into a few general scenarios.
- Managers whose budget is on the line for the investment are unfamiliar with the need for the technology. They insist that the CMS they already have is enough of a solution to solve their content dilemmas. They would rather accept a partial solution than invest in tooling that goes hand-in-hand with governance and process changes.
- Management talks to procurement, who decide that the new tooling is above some corporate level of spending, and they block the investment. Sometimes, the block is an outright rejection based on cost; other times, they delay the investment while they seek confirmation from outside sources.
- Management or procurement consults the CMS developers, data scientists, computer engineers, or other technologists who are involved at the delivery end of content but have only a passing knowledge of the content production processes upstream. The technologists decide that the tooling isn’t necessary, based on a superficial assessment.
- Less often, content producers themselves are resistant to change. They feel that they’ve managed to optimise their processes using a hodge-podge of tools such as word processors, spreadsheets, process ticketing, and code hosting software, and they don’t want to go from moderate pain back to dire pain.
Assumption-based vs evidence-based decisions
The common factor in the various types of resistance is the method used to make the decision. None of these groups consult the content production department to determine what the real pain points are. No one sends in a business analyst to document processes. There is no Gemba audit or 5S exercise to look for the multiple types of waste throughout the content production processes. There is no investigation because management, procurement, or developers make assumptions about what the production process involves.
The assumptions made are generally way off base. A few of the more dramatic statements I’ve heard as reasons to ignore content operations include:
- Why would you need [that] feature? I mean, how often would you use it – once every couple of months? (How about multiple times a day!)
- They write, get it approved, and move it to the CMS. What’s to improve? (Instead of 3 steps, try over 70 steps with multiple forks, loops, and iterations.)
- The writers we have here wouldn’t be smart enough to use more sophisticated tools so they’ll have to use what they have. (I was so taken aback that I didn’t even know how to respond.)
- We’ve decided it would take an extra sprint to connect the authoring tool to the production system, and we don’t see the value. (Contrasting the cost of an extra sprint with multiple years of productivity seems like a superficial comparison.)
- I’ve worked in this field for 15 years, and I’ve never heard of the tooling you’re describing; sounds like voodoo to me. (Seeing as the tooling had been around for 20 years at that point, I couldn’t help him with his substantial knowledge gap.)
If the decisions not to proceed were evidence-based, I could respect that. But not getting the facts and basing the decisions on gross assumptions? That’s just bad business.
The cost of doing nothing
While the cost of adopting new software and the related change management is relatively straightforward to calculate, no one does a head-to-head comparison of the before and after processes. In other words, the cost of “doing nothing” is never weighed against the cost of adopting new tooling or processes. In the comparison grid, a “keep the status quo” option should be costed, to determine the actual costs incurred on an ongoing basis.
Each organisation is different, of course, but it is not unusual to see anywhere from 40% to 80% improvement in productivity. Here are some case studies that I’ve worked on, that colleagues have worked on, or that clients have presented at conferences.
- A financial services company where improvements on a single piece of content reduced the overall effort from 1-4 hours each for 12 people – writers, translators, subject-matter expert approvers, legal, and developers – spread over a 2-week period, to 5 minutes of one person’s time. No matter which hourly rate you use to calculate costs, the savings were massive.
- A company that produces training guides for multiple trades, across multiple states and in a couple of languages, maintained an elaborate copy-and-paste method across some 12 outputs, tracking that content in almost 50 spreadsheets. Instead of hiring a second manager to add needed capacity, they invested in fit-for-purpose tooling and changed the operating model for their content production, eliminating the spreadsheets and automating multichannel publishing from a single source – all without the second hire, and freeing up the existing manager to do more value-add work.
- Calculations of content production waste for a large healthcare equipment manufacturer uncovered numerous areas where efficiencies could be gained, with customer-facing content re-use hovering around 80% and internal-facing content (maintenance technicians and customer support) over 60%, for an average savings of 70% in the production cycle. More importantly, errors in customer documentation dropped dramatically, improving customer trust and reducing support calls.
- Measuring waste in a government department demonstrated that investing in production-grade software to improve content production would reduce the cost of maintaining content by over 60%, and assist with personalised delivery of content – a problem that had plagued several teams over several years. Meanwhile, doing nothing had already resulted in dozens of temporary contractors being brought in to help meet urgent publishing deadlines, costing taxpayers well over £2M across two years.
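To make the “keep the status quo” column concrete, here is a minimal sketch of that head-to-head comparison, using the effort figures from the financial services example above. The hourly rate, annual update volume, and adoption cost are illustrative assumptions for the sketch, not figures from the engagement.

```python
# Sketch of a "cost of doing nothing" comparison.
# Figures marked ASSUMPTION are illustrative, not from the case study.

HOURLY_RATE = 60.0         # ASSUMPTION: blended hourly rate, in pounds
UPDATES_PER_YEAR = 200     # ASSUMPTION: content updates per year
ADOPTION_COST = 150_000.0  # ASSUMPTION: one-off tooling + change management

# From the case study: 12 people spending 1-4 hours each per update.
# Use the midpoint, 2.5 hours, for the "before" estimate.
status_quo_hours = 12 * 2.5   # effort per update, before
new_process_hours = 5 / 60    # 5 minutes of one person's time, after

def annual_cost(hours_per_update: float) -> float:
    """Annual labour cost for a given per-update effort."""
    return hours_per_update * HOURLY_RATE * UPDATES_PER_YEAR

doing_nothing = annual_cost(status_quo_hours)
with_tooling = annual_cost(new_process_hours) + ADOPTION_COST

print(f"Keep the status quo (annual): £{doing_nothing:,.0f}")
print(f"New tooling (year 1):         £{with_tooling:,.0f}")
print(f"Year-1 saving:                £{doing_nothing - with_tooling:,.0f}")
```

Even with a deliberately generous one-off adoption cost, the status quo loses in the first year under these assumptions – and the adoption cost disappears from every year after that.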
Caveats about quantifying content efforts
Giving in to the temptation to throw some resources at the problem sometimes ends badly. The importance of employing someone who has domain knowledge, has kept up with the range of potential tooling, and understands which tooling matters and which tools to choose in particular situations cannot be emphasised enough.
- Sometimes, the best outcomes aren’t identified because the analyst or consultant doing the discovery does not understand enough about the content production domain to ask the right questions. Or they ask only about the delivery end and not the actual work environment, because they don’t understand the nooks and crannies where the most efficiency could be gained.
- Often, the recommendations are based on using the same old tooling as before. If you’re not tired of hearing the metaphor: when you ask people how they’d get across the Atlantic, those who don’t know that planes exist will describe a boat. When you describe an airplane to them, they will continue to describe different types of boats, but not an airplane. In calculating potential content production efficiencies, a true comparison is not done because of substantial knowledge gaps among the staff, the technologists, and the analyst documenting it all.
- The other quite common situation is professional hegemony. The company, while quite willing to invest in some technologies, simply doesn’t see the value of investing in tech for content operations, and the dominant opinions prevail. Think of the last time a content designer was asked about the suitability of tooling for, say, data scientists, versus how often content designers are coerced into using tooling meant for code, because the developers believed the tools they use for code should work for content too.
As bottlenecks multiply, production costs rise, and the inability to respond to customer demands for information grows, I predict that ignoring the cost of doing nothing will no longer be acceptable. Organisations will be forced to look at their content operations in order to meet the demands of their customer experiences.