Editor’s Note: Analyses seeking to understand the effect of public comments must examine the substance of the comments submitted.
For decades, a central question in rulemaking has been the extent to which public comments on proposed rules affect the substance of agency regulations. On the one hand, the notice and comment process has been likened to Kabuki theater. On the other hand, researchers have discovered that, under certain circumstances, comments can have substantial effects on final rules.
A variety of approaches have been utilized in the quest to uncover the impact of comments. Some analysts have focused on quantifiable changes across proposed and final rules, such as dollar amounts of government allocations, investigating the association between comments and the magnitude of these changes. Others have manually coded the content of comments and changes across proposed and final rules. Still others have relied on evidence from regulators themselves, who often point to instances in which comments have fundamentally shaped their agencies’ regulations.
In an intriguing working paper, Andrei Kirilenko, Shawn Mankad, and George Michailidis propose a new approach, dubbed RegRank, to analyzing the impacts of comments. RegRank uses topic modeling to uncover the structure and sentiment of proposed rules, comments, and final rules. Topic modeling is an automated method, developed in the context of natural language processing, that draws on the meaning and tone of the words contained in documents. This method produces scores that quantify each document’s sentiment, for example, the extent to which its content is positive or negative.
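To make the general idea concrete, the sketch below shows the kind of pipeline described above: fitting a topic model over a small set of rulemaking-style documents and computing a simple lexicon-based sentiment score. This is not the authors' RegRank method; the example documents, the word lists, and the scoring rule are invented for illustration, and a standard latent Dirichlet allocation model from scikit-learn stands in for whatever topic model the paper uses.

```python
# Illustrative sketch only: LDA topic modeling plus a toy sentiment score.
# Documents and sentiment lexicons are hypothetical, not from the paper.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "The proposed rule would strengthen disclosure requirements for brokers.",
    "We support the rule; the benefits clearly outweigh the compliance costs.",
    "We oppose the rule; the burden on small firms is harmful and excessive.",
    "The final rule narrows the disclosure requirements after public comments.",
]

# Topic modeling: represent each document as a distribution over latent topics.
vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
topic_weights = lda.fit_transform(dtm)  # shape: (n_docs, n_topics)

# Sentiment scoring: count positive minus negative words from a toy lexicon.
POSITIVE = {"support", "benefits", "strengthen"}
NEGATIVE = {"oppose", "burden", "harmful", "excessive"}

def sentiment(text):
    words = text.lower().replace(";", " ").replace(".", " ").split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

scores = [sentiment(d) for d in docs]
print(scores)  # supportive comments score above zero, critical ones below
```

In a RegRank-style analysis, scores like these could then be compared across proposed rules, comments, and final rules to ask whether the tone of final rules moves toward the tone of the comments.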