Metrics for Continual Improvement Process
Lavacon Dublin – June 6, 2016
Nolwenn Kerzreho, Technical Account Manager, IXIASOFT





Nolwenn Kerzreho

Technical Account Manager – IXIASOFT

10+ years in the tech comm industry, specializing in managing documentation & translation projects

Adjunct teacher at Université Rennes for over 8 years
Contact: @NolwennIXIASOFT #Lavacon @lavacon

Agenda

• What are metrics
• What are useful metrics (hint: actionable)
• Creating the right metrics
• Metrics to confirm, refine, plan
• A couple of questions
• Resources

Metric: a standard of measurement.

Key performance indicator: a metric that demonstrates how effectively a company is achieving key business objectives.

Example of metrics in documentation

Sets up a standard of measurement
• Content: number of passive-voice constructions; complex tenses; synonyms; hazard statements…
• Writers' productivity: number of documents produced
• Time-to-market: delivery date per language/product
• Costs: time spent on each phase of the life cycle
• Information efficiency: down-time; repair time
• Customer satisfaction: perceived-quality surveys

Example of KPIs

Measure progress
• Content: compliance with the new style guide
• Writers' productivity: increase time spent on troubleshooting information
• Time-to-market: 90% of information delivered with the product
• Costs: reduce the time experts spend on editing
• Information efficiency: repair time down 20%
• Customer satisfaction: increase perceived quality by 40%

Why? Typically to solve a problem…
"Keep up with change or be left behind"
or: keep up with a continual improvement process

1) Define your goals
2) Measure your starting point
3) Track your progress

Let’s play…

A short test – aligning the stars

Align the objectives with:
a) Potential solutions
b) Potential KPIs

Test: align tech pub solutions with the organization objectives

Organization:
1. New regional markets
2. Going Agile
3. Repacking with partners' brands
4. Earlier time-to-market
5. Modular product lines with options

Tech pub solutions:
A. Add variants
B. Branding auto-applied
C. SME edit source content
D. Optimized translation workflow
E. Topic-based authoring

Test: align tech pub solutions with the organization objectives

Organization:
1. New regional markets
2. Going Agile
3. Repacking with partners' brands
4. Earlier time-to-market
5. Modular product lines with options

Tech pub solutions:
A. Add variants
B. Branding auto-applied
C. SME edit source content
D. Optimized translation workflow
E. Topic-based authoring

Test: align tech pub solutions with metrics and KPIs

Tech pub solutions:
A. Add variants
B. Branding auto-applied
C. SME edit source content
D. Optimized translation workflow
E. Topic-based authoring

Metrics and KPIs:
1. Translated deliverables
2. % reuse in new map
3. Nbr of branding stylesheets
4. Nbr of deliveries made on-time
5. Number of variants per product

Test: align tech pub solutions with metrics and KPIs

Tech pub solutions:
A. Add variants
B. Branding auto-applied
C. SME edit source content
D. Optimized translation workflow
E. Topic-based authoring

Metrics and KPIs:
1. Translated deliverables
2. % reuse in new map
3. Nbr of branding stylesheets
4. Nbr of deliveries made on-time
5. Number of variants per product

When to track what and other flavors

New KPIs

Refined KPIs

Framework

• Confirm: align with ROI (track team & process maturity)
• Refine: results of previous changes (unexpected and soft benefits)
• Plan: changes from continual improvements

Confirm ROI: Cost-per-Topic

• Idea came from a CIDM Best Practices article by Mike Eleder (“The Illusive, Writing Productivity Metric: Making Unit Cost a Competitive Advantage”)

• Basically:

  cost per topic = (monthly tech writer team cost) / (topics produced monthly)

• This is a unit-cost measure: it produces the monthly average cost of producing topics
• You can estimate the cost of future work based on the result
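As a minimal sketch of the unit-cost calculation (the dollar and topic figures below are hypothetical, not numbers from the talk):

```python
def cost_per_topic(monthly_team_cost: float, topics_per_month: int) -> float:
    """Unit cost: the monthly average cost of producing one topic."""
    return monthly_team_cost / topics_per_month

def estimate_future_work(unit_cost: float, planned_topics: int) -> float:
    """Estimate the cost of upcoming work from the measured unit cost."""
    return unit_cost * planned_topics

# Hypothetical example: a $40,000/month team producing 320 topics
unit = cost_per_topic(40_000, 320)
print(unit)                            # 125.0 per topic
print(estimate_future_work(unit, 50))  # 6250.0 for 50 planned topics
```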

Example Chart Showing Cost-Per-Topic

• The trend line records the average cost-per-topic over the time measured
• This shows an overall downward trend: ideally this is what you would expect over time
• The same cost measures can be applied to published documents, localized content and more

[Chart: cost per topic over two years (Year 1–Year 2), with trend line]

Confirm ROI: Localization Spend

• The move to DITA + a CMS was afterwards justified in terms of localization budget left unspent thanks to efficiencies

[Chart: annual localization budget and spend, Year 1–Year 8]

Continual Improvements Metrics

• DITA metrics can be used to guide managers, information architects and writers on how to improve and refine their content

• Some views to maintain the quality of the content

Continual Improvements Metrics

Provides the ability to:
• Set more accurate project estimates
• Justify the need for more resources (tools/people)
• Understand the quality of production
• It's also an opportunity to measure value

DITA Production Metrics without a CMS

• DITA metrics outside of a CMS are limited to information contained within the DITA files + file system
• You can search for text strings within the XML, and also use date/time info from filenames… and that's about it
• Make no mistake though: there's plenty of information there to be mined
• There are tools that can help you search for text patterns within your content, such as Windows Grep
• Pro tip: if you go this route, choose a tool that uses regular expressions (regex) and learn how to use it!

DITA Production Metrics with a CMS

• DITA metrics within a CMS can capture additional information, which depending on the CMS may include:
  • Author information
  • Workflow status
  • How many times a topic has been modified/versioned
  • Topic/map dependencies
• Expect CMSes to have "canned" search routines that make the task of mining information from your DITA content easier

Samples Used

• Thunderbird sample set available on GitHub, provided by Gnostyx and maintained by the community
  • Software user manual, 1 product
• IXIASOFT documentation set: all user and administration guides for TEXTML and the DITA CMS
  • Administration & user manuals, 2 products

Time and Workflow Metrics

• Again, usually done within a CMS; this tracks who is responsible for which topic production, and whether it is on schedule

Content Types in Various Documents

Looks at the topic types that go into maps
• Why would this matter? It can give you an idea as to whether content is being properly "typed", and it ensures that writers are writing/structuring content properly
• Some examples:
  • A typical "Installation Guide" ought to be made up primarily of task topics
  • APIs ought to have a lot of reference topics
  • You can expect most end-user software docs to have a roughly even mix of concepts and tasks
  • You would generally expect to have more maps than bookmaps

Thunderbird Document Metrics

• I would argue that a user-oriented document ought to have a more even balance of concepts and tasks than we see here

• Direction to the Thunderbird technical writers: check that all possible tasks a user might encounter are explained

Count: 87

Content Types within All Documents Over a Year

• This chart looks at the DITA topic breakdown for all documentation produced by IXIASOFT in 2015

• Documentation consists of User/Admin Guides for our DITA CMS and TEXTML software

• Good ratio of concept to task topics

• When I showed this to our Lead Tech Doc person, she immediately wanted to investigate the 3% of generic topic types
  • A nice practical example of how DITA metrics can improve quality!

Count: 1307

DITA Metrics to Check for Tagging Consistency

• Why? Consistent XML tagging helps promote house style and a common look-and-feel within the doc team

• An example: what if you encourage your writers to add short descriptions to your topics?
  • You could check how many topics contain <shortdesc> with content vs. an empty <shortdesc/>

• You need to be careful in defining your search characteristics in this case:
  • Exclude things that typically do not contain the shortdesc element, such as ditamaps, maps and glossentry topics
  • Depending on how you validate your topics, they may not include an empty <shortdesc/>
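One way to sketch the check is a naive string-level classifier per topic (a simplification: it does not validate the XML or exclude maps and glossentry topics; that filtering would happen when selecting which files to feed it):

```python
import re

def shortdesc_status(topic_xml: str) -> str:
    """Classify a topic's short description: 'filled', 'empty' or 'missing'."""
    m = re.search(r"<shortdesc>([\s\S]*?)</shortdesc>", topic_xml)
    if m:
        return "filled" if m.group(1).strip() else "empty"
    if re.search(r"<shortdesc\s*/>", topic_xml):
        return "empty"
    return "missing"
```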

A Couple of Sample Results

• Clearly Thunderbird is doing something right! ;)
• The list of non-compliant IXIASOFT topics merits further investigation

Tracking Topic Type Usage Over Several Years

• These charts look at several years' worth of semiconductor documents
• While I expected a high percentage of reference topics, I wondered whether there were topics that ought to be tasks but were instead written as references

Tracking Topic Type Usage – Directing Change

• Asked writers to be more diligent about writing task topics where they might be tempted to write them as references instead

• Result was a measurable increase in the percentage of task topics created over the course of the following year

When to track what and other flavors

New KPIs

Refined KPIs

Framework

• Confirm: align with ROI (track team & process maturity)
• Refine: results of previous changes (unexpected and soft benefits)
• Plan: changes from continual improvements

Move Content to DITA 1.3

• DITA 1.3 opens up many new possibilities for structuring and describing content

• Using new elements/features opens up new possibilities

• A couple of easy examples:
  • The new XML Mention domain means you can replace angle-bracket entities around tag names (i.e. &lt; and &gt;) with a pair of <xmlelement> tags. This is the most common example, and there are other elements in this domain for describing attributes, parameters, numeric characters and more
  • With the new Troubleshooting topic type, look for obvious conversion candidates: topics containing the word "troubleshoot"
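The entity-to-<xmlelement> rewrite can be sketched as a single regex pass; this is an approximation for illustration, since not every &lt;…&gt; pair in real content is actually an element mention, so results would need review:

```python
import re

def convert_xml_mentions(text: str) -> str:
    """Rewrite &lt;name&gt; entity pairs as DITA 1.3 <xmlelement> markup."""
    return re.sub(r"&lt;([A-Za-z][\w.-]*)&gt;",
                  r"<xmlelement>\1</xmlelement>", text)

print(convert_xml_mentions("Add a &lt;shortdesc&gt; element."))
# Add a <xmlelement>shortdesc</xmlelement> element.
```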

Results from Search for Angle Bracket Entities

• 40 matches were found in 1502 topics from 2015

Search results in Windows Grep

Sample DITA file full of &lt;*&gt; examples

Results from Search for “trouble*” in Topics

• 38 file matches from 1502 topics; each would need to be investigated• Example above found from the search is a solid candidate

Other DITA 1.3 Possibilities

• Search all maps for the names of keys and look for those that have the same value ("name")
  • The introduction of key scopes in DITA 1.3 allows you to share keys (and their values) across maps; identifying key matches suggests opportunities for key scoping
• Search for instances of MathML or SVG graphics
  • DITA 1.3 has MathML and SVG "baked in", so you can insert code directly or partition it off as referenced topics
  • In most instances, search for content contained within <foreign> tags for likely candidates

Agile and DITA Metrics

• At Scrum meetings, the doc manager can report on the topics assigned to their group and on how "done" they are
• This is typically only feasible within a CMS containing workflow metadata

"Our project managers could track progress of documentation deliverables within our DITA-based CMS on a daily basis." - Jason Owen

[Chart: topic authoring work, 56% of content]

DITA Reuse Metrics

• “Reuse of DITA Topics? What is the Best Metric to Measure the Success of Your Reuse of DITA Topics?” (http://ow.ly/X7mzM) by Bill Hackos

Bill Hackos’ Reuse Formula

• Percent Repository Words Reused in Context = (Words in All Produced Content – Words in the Repository) / (Words in the Repository)


Example Based on IXIASOFT DITA Documents for 2015

Based on 2015 numbers:
• Total number of words in the repository: 268,663
• Words in all produced content: 623,078
• PRWRC = (623,078 - 268,663) / 268,663
• PRWRC = 354,415 / 268,663 = 132% (lower bound)
• How is a 100%+ reuse value possible?
  • Easy: extensive conditional processing
  • A number of publications are created based on a series of DITAVAL values, as many as 21 per bookmap
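The slide's arithmetic, as a quick sketch using the 2015 figures given above:

```python
def prwrc(words_produced: int, words_in_repo: int) -> float:
    """Percent Repository Words Reused in Context (Hackos formula)."""
    return (words_produced - words_in_repo) / words_in_repo

# 2015 figures from the slide
print(f"{prwrc(623_078, 268_663):.0%}")  # 132%
```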

Other Examples for Future Challenges?

• Efficiency of repair manuals
• Compliance with translation style guides
• Mapping products and documentation sets
• Efficiency of product sprints

Conclusions

• Are metrics necessary? Are KPIs better?
• Can you use metrics to go further than costs?
• Is it easier with refined tools?
1. Scout business objectives & align with them
2. Look forward to your future challenges

Resources & Further Reading…

• Mark Lewis, DITA Metrics 101: The Business Case for XML and Intelligent Content, published by Rockley
• Keith Schengili-Roberts, IXIASOFT DITA Specialist [aka the DITAWriter]: Metrics for Continuous Improvement (IXIASOFT SlideShare account)
• What is a KPI? Definition & article: https://www.klipfolio.com/resources/articles/what-is-a-key-performance-indicator
• DITA sample: https://github.com/gnostyx/dita-demo-content-collection

Questions and Answers

Blog: www.ixiasoft.com/en/news-and-events/blog Twitter: @IXIASOFT

DITA Metrics LinkedIn group: www.linkedin.com/groups?gid=3820030