

Overview
The problem
Playbooks from Document Crunch provide key information to construction teams, helping them reduce risk by improving compliance with their legal contracts. Although customers reacted enthusiastically to the feature's potential benefits, actual usage of Playbooks was extremely low, a sign of significant barriers to adoption. It turned out that manually filling out Playbooks was so time-consuming that almost no one actually used the feature.
The solution
Enter Generative AI Playbooks, which write the content automatically for customers: no more manual setup required. All information in an AI Playbook is pulled directly from source documents to combat LLM hallucination, and every answer is cited so that humans can fact-check what the LLM returns. This led to massive time savings for teams using AI Playbooks, plus wider adoption of Playbooks among construction project teams. Our AI solution made it easier and faster for project teams to get quick-reference answers to their most common questions during a construction job.
Outcomes
Within 30 days of rollout, 55% of customers were on a subscription plan with access to this feature, and 6% of unique active users were engaging with it on a weekly basis. Prior engagement was close to 0%, so this was quite a success!
How we got there
Defining the problem space
I started this project by running a collaborative workshop with cross-functional partners (product, design, engineering, and internal subject matter experts). The goals of the workshop were to establish mutual understanding of the problem space and develop a shared vision of how we could solve that problem for our users.

The introduction & agenda for the cross-functional design workshop I ran
To set the stage for the workshop, I created a journey map of the current experience that identified where pain points came up for our two primary end users: back-office attorneys and on-site project managers.

A mapping of the current workflow that users experienced when using our product
With this as context, I framed the problem statement for team discussion:

The problem statement that we all agreed on
We refined the problem statement together and quickly reached consensus. The rest of the workshop dug into identifying knowns and unknowns, defining user tasks and primary goals, and brainstorming ways to refine the experience for our target users. By the end of the session, I had led the team to craft an ideal workflow that would guide users through the new AI-assisted Playbooks feature.




Exercises that I led the team through during our workshop
Rather than forcing attorneys to manually write 20-30 sections of material for project teams, the AI would now do all the writing. Attorneys could streamline their workflow by simply reviewing the AI's work, editing it if needed, and then marking it as ready for project teams to use. This would also benefit project teams: they would no longer receive blank Playbooks, which reduced their need to hunt down answers by reviewing lengthy legal documents or contacting legal representation.
Exploring early concepts
Based on the outputs of my workshop, I drew out the user flow using low-fidelity sketches to drive conversation around the desired experience.


(Top) an overview of the user flow based on the ideal flow that came out of our workshop, (bottom) a zoomed in section of the flow showing interaction notes and unanswered unknowns
After getting feedback from the team, I created two high-fidelity prototypes to test with users.

Overview of the high-fidelity options that I created to test with users
Validating our approach
I interviewed 5 customers and conducted a thematic analysis in Dovetail to extract key insights. We learned how project teams and their back-office counterparts collaborate to understand and mitigate risks, that project managers prefer concise yet informational instructions, and that users wanted a way to mark which AI-generated instructions had been reviewed and approved by a human.

A high-level look at the outputs from the thematic analysis that I did in Dovetail
INSIGHT
Shorter instructions are better.
It's important to strike a balance between details and readability; too long and no one will read it.
INSIGHT
Human approval needs to be explicit.
Users want to distinguish between AI-generated content that's under review vs. approved.
INSIGHT
Processes vary across teams today.
Some project teams own their playbooks, while others digest what they get from reviewers.
Refining the design
After summarizing my research findings, I proceeded with the next set of design iterations, incorporating what we had learned from the conversations with end users. After a couple of rounds of revisions based on team feedback, we landed on a design that presented Playbook topics in a card-based list, where users could drill down to read more details. The designs also included a badge to let users know which content was AI-generated; once a human had reviewed and approved the content, the badge would disappear.


The final designs for the Playbooks feature, with the list of issues (top) and detail view of a single issue (bottom)
Final Thoughts
Reflections
From start to finish, the design work on this project took about three months. One of the main roadblocks was finding research participants and scheduling calls with them; despite help from my product manager and several of our account managers, it took almost a month to hit our customer call goal. With a more robust research practice or a continuous discovery pipeline, I think the research turnaround time would have been significantly shorter.
After launch, despite growth in Playbooks usage, one challenge to adoption remained: the main Document Crunch product primarily serves legal users, like attorneys, while the Playbooks feature targets a different persona entirely, the project manager. Based on my research conversations, many project managers aren't currently in Document Crunch; they receive PDF exports that were previously vetted by the back-office risk team. If more project managers were direct users of Document Crunch, I would expect adoption to increase accordingly.