From the Desk of Doc Holiday

How to Automate Release Notes From GitHub Workflows

Explore three approaches to automating release notes on GitHub: native generation, semantic versioning scripts, and AI-powered drafts. Learn the trade-offs and find the right fit for your team.
April 30, 2026
The Doc Holiday Team

Your release is scheduled for tomorrow morning. Someone just pinged the engineering lead asking who is writing the release notes.

Nobody has started. Nobody wants to. The person who did it last time is on vacation, and the person before them left the company six months ago along with the custom script they built to pull PR titles into a Google Doc.

This is the actual problem with release notes. Not that they're hard to write. It's that they require someone to stop what they're doing, sift through two weeks of merged pull requests, decide what matters to users versus what was an internal refactor, and produce something coherent before the release goes out. Documentation tasks consume roughly 11% of developers' work hours. When teams are shipping weekly, that's not a rounding error.

The good news is that if your team uses GitHub, you already have everything you need to automate this. The question is how much automation you actually want, because every approach involves a real trade-off.

What GitHub Gives You Out of the Box

GitHub can generate release notes automatically. You configure a .github/release.yml file that maps pull request labels to categories, and when you draft a release, you click "Generate release notes" and GitHub populates the description with a grouped list of merged PRs, a contributor list, and a link to the full diff.

The configuration is straightforward. You define categories like "Breaking Changes" or "New Features" using label names, exclude bot authors, and exclude PRs tagged internal or dependencies. It takes about thirty minutes to set up.
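A minimal configuration along those lines might look like this. The label names here are illustrative; swap in whatever labels your team already uses:

```yaml
# .github/release.yml — label and author names are examples, not a standard
changelog:
  exclude:
    labels:
      - internal
      - dependencies
    authors:
      - dependabot
  categories:
    - title: Breaking Changes
      labels:
        - breaking-change
    - title: New Features
      labels:
        - feature
    - title: Bug Fixes
      labels:
        - bug
    - title: Other Changes
      labels:
        - "*"    # catch-all for anything not matched above
```

Once this file is on your default branch, the "Generate release notes" button groups merged PRs under these headings automatically.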

What it produces is a structured bullet list. Not a narrative. Not an explanation of why anything changed. Just: here are the PRs, here is who merged them.

That is fine if your release notes are an internal audit trail. It is not fine if your users are trying to understand what changed and why it matters to them. The output quality is also entirely dependent on your team's PR discipline. If engineers are merging PRs titled "fix thing" or "update stuff," that is exactly what will appear in the release notes. The automation faithfully reproduces whatever input it gets.

Webhooks can extend this: the release event fires when a release is published, and the pull_request event fires on each merge, giving you the PR body, labels, and linked issue references. That's the raw material for the next level.

When You Start Writing Scripts

Most teams outgrow the native approach and reach for semantic-release or a custom GitHub Actions workflow. The idea is to enforce a commit message convention (Conventional Commits is the standard: feat:, fix:, BREAKING CHANGE:), then parse those messages automatically to determine the next version number, generate a changelog, and publish the release without anyone clicking anything.

This works well when it works. The changelog is consistent, the versioning is deterministic, and the whole thing runs in CI without human intervention.

The hidden cost is maintenance. A large-scale empirical study of GitHub Actions workflows across 200 mature projects found that workflow files generate significant extra workload for developers, with bug fixing and CI/CD improvements being the major drivers. The researchers called it "the hidden costs of automation." Workflow files need updates when dependencies change, when the CI environment shifts, when someone adds a new label convention that the script doesn't know about.

More practically: these pipelines are usually built by one or two engineers who care about this stuff. When they leave, the team inherits a fragile web of YAML files and bash scripts that nobody fully understands. The bus factor is real.

There are also edge cases that scripts handle poorly. Hotfixes need to be documented immediately, not batched with the next sprint. Rollbacks need to explain what was reverted and why. These require either special branch naming conventions that the script recognizes, or manual intervention, which defeats the purpose.

The deeper issue is that even a well-maintained script still produces a list. It can group changes by type. It cannot explain that the authentication refactor in this release was the prerequisite for the SSO feature shipping next quarter. That context lives in the PR description, in the linked issue, in the conversation between the engineer and the product manager. Scripts don't read those.

Treating GitHub as a Data Source, Not a To-Do List

The third approach flips the framing. Instead of asking "how do we format the PR titles automatically," it asks "what does GitHub actually know about this release, and how do we turn that into something a user can read?"

GitHub knows a lot. It has the commit history, the PR descriptions, the linked issues, the review comments, the labels, the author context. A 2025 study from Peking University built an LLM-powered release note generator that ingests code, commit, and PR details to produce structured, narrative notes. It outperformed traditional tools on completeness, clarity, and organization. A benchmark study analyzing nearly 95,000 release notes found that LLMs consistently outperform traditional baselines, particularly when they have access to structured commit tree information rather than just raw diffs.

The practical version of this looks like: the AI generates a first draft that groups related technical changes into coherent feature narratives, flags edge cases like hotfixes or security patches, and produces something a technical lead can review rather than something they have to write from scratch. A case study involving a large European bank found that when engineers edited and approved AI-generated changelogs, the system learned from those edits and improved over time. The feedback loop is the product.

The trade-off here is that LLMs are probabilistic. They require human review. But the nature of that review is different: a technical lead reading a draft and approving it is a much lighter lift than a technical lead assembling a draft from scratch. The Mercari engineering team's experience with AI documentation is instructive: using AI doesn't eliminate the need to know what you want and why you want it, but it does eliminate the blank page.

For this approach to work, you need clean input from engineers. That means a PR template that requires a "User Impact" field and a linked issue. It means requiring labels. It means treating the PR description as a first-class artifact, not a formality. Research on practitioners' expectations found that release notes should draw on issues (29% of content), pull requests (32%), and commits (19%). That information is already in GitHub. The question is whether your team is disciplined enough to put it there.
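A minimal PR template along those lines might look like this. The section names are suggestions, not a standard:

```markdown
<!-- .github/pull_request_template.md — section names are illustrative -->
## Summary
<!-- What changed, in one or two sentences. -->

## User Impact
<!-- What a user of the product will notice. "None (internal)" is valid. -->

## Linked Issue
<!-- e.g. Closes #123 -->
```

Dropping this file into `.github/` pre-fills every new PR description, which is usually enough to make the habit stick.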

How to Decide Which Approach Is Right

Here is a rough framework:

If your team ships fewer than ten releases a year and has a dedicated technical writer, manual creation with GitHub's native PR list as a starting point is probably fine. The ROI on complex automation isn't there.

If you're shipping weekly and need strict semantic versioning, invest in semantic-release and GitHub Actions. Budget time for workflow maintenance. Treat the pipeline as a first-class engineering artifact, not a side project.

If release notes are a bottleneck, if they're inconsistent across releases, if your technical writers are spending their time assembling information rather than improving it, then building more automation yourself is likely the wrong answer. The ARENA research established over a decade ago that automated generation from version control and issue trackers is feasible. The question now is whether you want to build and maintain that system or adopt one that already has the structure in place.

The goal isn't to remove humans from the process. It's to change what they're doing. A technical lead reviewing a draft is doing editorial work. A technical lead hunting through two weeks of PRs at 11pm before a release is doing data entry. Those are not the same job, and only one of them is worth their time.

Doc Holiday generates release note drafts directly from your GitHub commits and PR history, then gives your technical lead a structured review workflow to validate and publish. The blank page problem goes away. The editorial judgment stays exactly where it belongs.

Time to get your docs in a row.

Join the private beta and start your Doc Holiday today!