I have written a command-line tool in Go called github-export that exports all GitHub issues, pull requests, releases, labels, and milestones from a repository into a local folder as plain markdown files. What makes it different from existing tools is that it syncs incrementally and generates event files that an AI agent (like Claude Code) can pick up and act on. This lets me maintain my open-source projects with AI assistance.
GitHub-Backup and gh2md
There are existing tools for exporting GitHub data, but none of them did what I needed. gh2md is a Python tool that exports issues and pull requests to markdown. It works well for creating a readable archive, either as a single file or one file per issue, but it always does a full export. Every run fetches everything from scratch, with no concept of incremental sync or change detection. GitHub-Backup takes a different approach: it clones all repositories of a user or organization and can optionally include issues, pull requests, and other metadata as JSON. It is designed for disaster recovery backups rather than for working with the data.
Both tools are useful for their intended purposes, but neither supports incremental sync or generates events for changes. I needed a tool that could run repeatedly on a schedule, fetch only what changed, and tell me exactly what happened since the last run. That’s what github-export does.
Incremental sync
On the first run, github-export fetches all data from a repository. On
subsequent runs, it reads the synced_at timestamp from repo.yml and only
re-fetches issues and pull requests that were updated since then. Each touched
issue file is fully rebuilt from the GitHub timeline endpoint, so the local copy
is always complete and accurate.
A typical incremental sync touching 50 updated issues costs around 200 API requests, well within GitHub’s 5,000-requests-per-hour rate limit. The tool also sleeps automatically when the remaining quota runs low.
The event system
The key feature that enables AI-assisted maintenance is the event system. During
each sync, github-export compares the freshly fetched data against what was
previously on disk and generates individual event files in
github-data/events/. This adds no extra API calls; the events are derived
purely from comparing the freshly fetched data with the previous local state.
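The comparison step can be sketched as follows. The struct fields and function here are hypothetical simplifications of whatever github-export actually stores on disk; the point is that each event type falls out of a plain state diff between the old and new copy of an item.

```go
package main

import "fmt"

// Item is a minimal snapshot of an issue or pull request as stored on
// disk. Field names are illustrative, not the tool's actual schema.
type Item struct {
	Number int
	IsPR   bool
	State  string // "open" or "closed"
	Merged bool
}

// eventType derives an event name by comparing the previous local copy
// (nil if the item was not seen before) with the freshly fetched one.
func eventType(prev *Item, curr Item) string {
	kind := "issue"
	if curr.IsPR {
		kind = "pr"
	}
	switch {
	case prev == nil:
		return kind + "_created"
	case prev.State == "open" && curr.State == "closed":
		if curr.IsPR && curr.Merged {
			return "pr_merged"
		}
		return kind + "_closed"
	case prev.State == "closed" && curr.State == "open":
		return kind + "_reopened"
	default:
		return "" // no state change worth an event
	}
}

func main() {
	fmt.Println(eventType(nil, Item{Number: 69, IsPR: true, State: "open"}))
}
```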
Event types include issue_created, issue_closed, issue_reopened,
pr_created, pr_merged, pr_closed, and comment_created. Each event file
is a markdown file with YAML frontmatter containing the event type, issue
number, title, author, state, labels, and a link to the full issue file:
```yaml
---
event: pr_created
number: 69
title: Add whiteblack color
author: lqj01
state: open
labels: []
file: github-data/issues/0069.md
repo: mevdschee/2048.c
url: https://github.com/mevdschee/2048.c/pull/69
exported_at: 2026-04-02T07:53:37Z
---
```
Maintenance with Claude Code
The event files are designed as a handoff point: an agent reads them, acts on them, and deletes them afterwards. Here is an example of how I use this with Claude Code to maintain my 2048.c repository.
After running github-export, I had 6 events: several new pull requests and an issue. I opened Claude Code and pointed it at the events directory. Claude read all the event files and gave me a prioritized summary:
- PR #66 - Spam (phone numbers, no code). Already closed, but no comment.
- Issue #67 + PR #68 - Feature request to add board rotation, with an implementation.
- PR #69 - A new white-to-black color scheme, with screenshots.
- PR #65 and #70 - Already closed (merged + reverted), no action needed.
For each item, Claude suggested an action and I decided what to do. For the spam PR, Claude posted a friendly comment. For the rotation feature, I decided it was out of scope (we want to keep the code minimal) and Claude posted appreciative comments explaining why, then closed both the issue and the PR. For the color scheme PR, I liked it - Claude posted a thank-you comment, merged the PR, pulled the changes, and fixed a minor indentation issue. After everything was handled, Claude cleaned up the event files.
The entire session took a few minutes instead of the usual context-switching
between GitHub notifications, reviewing diffs in the browser, and typing out
responses. Claude handled all the gh commands while I made the decisions.
How it all fits together
The workflow is straightforward:
- Run `github-export mevdschee/2048.c` (either manually or on a schedule)
- Open Claude Code and point it at the events
- Review the suggested actions and approve or adjust them
- Claude executes the maintenance using `gh` commands
Because all the GitHub data is available locally as plain text, the AI agent can read issue histories, review PR diffs, and understand context without needing direct API access. The event files tell it what needs attention, and the full issue files give it the details.
Download
You can find the code on my GitHub:
https://github.com/mevdschee/github-export
The tool requires Go 1.22+ to build and a GITHUB_TOKEN environment variable
for authentication. You can get a token with gh auth token if you have the
GitHub CLI installed.
Enjoy!