Who reviews Linux kernel commits?

I’ve been thinking about code review lately, so I took a little time to look at the Linux kernel git tree to see how many commits are marked with “Reviewed-by” (which indicates that the patch has been reviewed and found acceptable). The short answer: not a whole lot, or at least not consistently, though it is improving.

I iterated over the entire linux-2.6 git tree as of today (~3.7-rc8), counted all commits (excluding merges), and then counted every Reviewed-by tag across those commits. (I realize this isn’t the only measure of review quality; more caveats below.) It’s possible for a commit to carry more than one Reviewed-by tag, but I assume that’s rare overall. I found:

4.7%: 313,848 commits, 14,631 reviews
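
Here’s a rough sketch of the sort of counting involved, in Python for illustration; the review_ratio helper is a hypothetical name of my own, not a script from the kernel tree, and it assumes you run it from inside a kernel checkout:

    import subprocess

    def review_ratio(rev_range="HEAD", paths=None):
        """Count non-merge commits and Reviewed-by tags in rev_range,
        optionally restricted to a list of paths."""
        pathspec = ["--"] + list(paths) if paths else []
        # Total non-merge commits in the range.
        commits = int(subprocess.run(
            ["git", "rev-list", "--no-merges", "--count", rev_range] + pathspec,
            capture_output=True, text=True, check=True).stdout)
        # Raw commit message bodies, scanned for Reviewed-by: lines.
        log = subprocess.run(
            ["git", "log", "--no-merges", "--format=%B", rev_range] + pathspec,
            capture_output=True, text=True, check=True).stdout
        reviews = sum(1 for line in log.splitlines()
                      if line.strip().startswith("Reviewed-by:"))
        return commits, reviews

    c, r = review_ratio()
    print(f"{100 * r / c:.1f}%: {c:,} commits, {r:,} reviews")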

Breaking it down further, here’s what I found for some of the top-level directories (the per-directory counting is sketched just after the list):

 0% sound: 12151 commits, 68 reviews
 1% virt: 451 commits, 8 reviews
 2% crypto: 888 commits, 23 reviews
 3% block: 1643 commits, 61 reviews
 4% security: 1675 commits, 68 reviews
 5% drivers: 149445 commits, 8107 reviews
 7% fs: 26747 commits, 2138 reviews
19% mm: 5858 commits, 1134 reviews
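
With the same hypothetical helper, the per-directory numbers reduce to a loop over pathspecs; a sketch, using floored integer percentages to match the list above:

    for d in ["sound", "virt", "crypto", "block", "security", "drivers", "fs", "mm"]:
        c, r = review_ratio(paths=[d])
        # Floor the percentage, as in the breakdown above.
        print(f"{100 * r // c:>2}% {d}: {c} commits, {r} reviews")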

Changes Over Time

However, things seem to be improving.  Here is the total ratio of “Reviewed-by” tags to commits for each release since 2.6.13:

[Chart: ratio of Reviewed-by tags to commits, per kernel release, v2.6.13 through v3.7-rc8]
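
The per-release series behind that chart can be sketched by walking consecutive release tags with the same review_ratio helper; the tag subset below is just illustrative, and the full list would come from git tag:

    # Illustrative subset of release tags; the real list would come from `git tag`.
    releases = ["v3.4", "v3.5", "v3.6", "v3.7-rc8"]
    for prev, cur in zip(releases, releases[1:]):
        c, r = review_ratio(f"{prev}..{cur}")
        print(f"{cur}: {100 * r / c:.1f}% ({c} commits, {r} reviews)")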

Perhaps Reviewed-by is coming into vogue; I find that interesting, since as far as I know there has been no formal push for using Reviewed-by more often.  What if we look at those top-level subsystems only since v3.0?  (The same sketch handles this; see the snippet after the list.)

 0% sound: 3341 commits, 32 reviews
 1% crypto: 185 commits, 2 reviews
 3% block: 303 commits, 10 reviews
 3% security: 406 commits, 15 reviews
 5% virt: 111 commits, 6 reviews
12% drivers: 40747 commits, 5054 reviews
15% fs: 5362 commits, 814 reviews
27% mm: 1193 commits, 325 reviews
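
This is just the same sketch with a rev range and a pathspec combined; for example:

    c, r = review_ratio("v3.0..HEAD", paths=["mm"])
    print(f"{100 * r // c}% mm: {c} commits, {r} reviews")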

mm/ continues to win, and to improve its reviewed percentage as well.  fs/, drivers/, and virt/ have improved their percentages lately, too.

Filesystems

fs/ is second, and of particular interest to me.  How’s it doing since the beginning of the git tree?

1% fs/affs: 117 commits, 2 reviews
1% fs/bfs: 72 commits, 1 reviews
1% fs/coda: 143 commits, 2 reviews
1% fs/debugfs: 85 commits, 1 reviews
1% fs/fuse: 399 commits, 5 reviews
1% fs/gfs2: 1233 commits, 13 reviews
1% fs/logfs: 124 commits, 2 reviews
1% fs/minix: 91 commits, 1 reviews
1% fs/ncpfs: 121 commits, 2 reviews
1% fs/nfs: 2731 commits, 36 reviews
1% fs/ocfs2: 1578 commits, 29 reviews
1% fs/omfs: 57 commits, 1 reviews
1% fs/pstore: 64 commits, 1 reviews
1% fs/reiserfs: 512 commits, 7 reviews
1% fs/sysv: 95 commits, 1 reviews
2% fs/9p: 369 commits, 8 reviews
2% fs/autofs4: 216 commits, 6 reviews
2% fs/btrfs: 2607 commits, 55 reviews
2% fs/hostfs: 96 commits, 2 reviews
2% fs/hppfs: 46 commits, 1 reviews
2% fs/jbd: 172 commits, 4 reviews
2% fs/nfsd: 1358 commits, 33 reviews
3% fs/devpts: 57 commits, 2 reviews
3% fs/ecryptfs: 403 commits, 16 reviews
3% fs/ext2: 314 commits, 10 reviews
3% fs/ext3: 516 commits, 19 reviews
3% fs/jbd2: 222 commits, 8 reviews
3% fs/lockd: 376 commits, 14 reviews
3% fs/ubifs: 471 commits, 16 reviews
3% fs/udf: 306 commits, 11 reviews
4% fs/nls: 23 commits, 1 reviews
4% fs/quota: 116 commits, 5 reviews
4% fs/ramfs: 70 commits, 3 reviews
5% fs/ext4: 1629 commits, 93 reviews
6% fs/hugetlbfs: 131 commits, 9 reviews
6% fs/proc: 934 commits, 62 reviews
7% fs/ceph: 673 commits, 51 reviews
16% fs/cifs: 1741 commits, 286 reviews
50% fs/xfs: 2376 commits, 1204 reviews

Any fs not listed above had < 1% Reviewed-by tags.

I’m interested primarily in xfs, ext4, and btrfs for my day job.  How have they done over time?

[Chart: Reviewed-by tags per commit over time for xfs, ext4, and btrfs]

(XFS exceeds 100% at times when many commits have more than one Reviewed-by tag.)

The Caveats

Now, granted: all this measures is the number of Reviewed-by tags per commit in the tree, and the absence of a tag does not necessarily mean there was no review.  In many cases the maintainer has done a review but adds only a Signed-off-by; it’s unusual for a maintainer to add both Signed-off-by and Reviewed-by.  And finally, the presence of the tag says nothing about the quality of the review.  So take this with as big a grain of salt as necessary.  If you’re the maintainer of one of these subsystems and Reviewed-by is not what you use, I’d like to know (and to know why).  Meanwhile, this seems like the best measure we’ve got.

Finally, who are the top ten individuals with the most Reviewed-by: tags since the git tree’s inception?  (A sketch of one way to produce such a tally follows the list.)

213     Reviewed-by: Jesse Barnes <jbarnes@virtuousg---.org>
227     Reviewed-by: Chris Wilson <chris@chris-wil---.co.uk>
238     Reviewed-by: Josh Triplett <josh@joshtripl---.org>
245     Reviewed-by: Mike Christie <michaelc@cs.w---.edu>
255     Reviewed-by: Dave Chinner <dchinner@red---.com>
284     Reviewed-by: Michael Chan <mchan@broad---.com>
362     Reviewed-by: Pieter-Paul Giesberts <pieterpg@broad---.com>
435     Reviewed-by: Roland Vossen <rvossen@broad---.com>
478     Reviewed-by: Arend van Spriel <arend@broad---.com>
486     Reviewed-by: Christoph Hellwig <hch@---.de>
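
A tally like this can be sketched with a Counter over the tag lines; the parsing here is my own rough cut, and note that it treats the same person with different email addresses as distinct reviewers:

    import subprocess
    from collections import Counter

    log = subprocess.run(
        ["git", "log", "--no-merges", "--format=%B"],
        capture_output=True, text=True, check=True).stdout
    # Everything after "Reviewed-by:" is the reviewer's name and address.
    tally = Counter(
        line.split(":", 1)[1].strip()
        for line in log.splitlines()
        if line.strip().startswith("Reviewed-by:"))
    # most_common() returns descending counts; the list above reads ascending.
    for reviewer, count in reversed(tally.most_common(10)):
        print(f"{count:>4}    Reviewed-by: {reviewer}")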

hch, somehow I am not surprised. :)
