Art of Lean

A3 Report Example: Apollo 13

An experiment in AI-assisted problem-solving report writing — March 2026

This is an experiment, not a template. In March 2026 I wanted to see what kind of A3-style practical problem-solving report an AI model could produce when given sufficient context and structured guidance.

The Apollo 13 oxygen tank failure was chosen deliberately — it is a well-documented problem with extensive public records, official NASA investigation reports, and decades of analysis. The model did not guess at root causes or invent countermeasures. Everything in this report is derived from existing, publicly available knowledge about the incident.

The report was generated as HTML and relied on specialized skill files to guide the model through each section of a practical problem-solving A3. The skill files defined the structure, the thinking sequence, and the level of rigor expected — the same way a manager would coach a junior engineer through their first A3.

What this shows: LLMs have real limits — they cannot investigate a novel problem, go to the gemba, or generate original insight from observation. But with sufficient context and structured guidance, they can organize known information into a useful problem-solving format. For training, historical case studies, and documented problems, this is a legitimate use case.

What this is not: A replacement for actual problem solving. An A3 is a thinking process, not a document format. The value is in the struggle — the back-and-forth with your manager, the trips to the gemba, the hypotheses that turn out to be wrong. An AI can produce the artifact but not the learning.

Note: This is a scrolling web report, not an actual 11" × 17" A3 sheet. The format was adapted for screen readability.

View the Apollo 13 A3 Report

Opens the full interactive report.