The doctored picture above is from Mother Jones’ article this past week attempting to discredit John Lott and the Crime Prevention Research Center.
For several years, Mother Jones, a leftist magazine funded by people such as George Soros, has been in a running feud with John Lott. The magazine is apparently willing to do anything to push for more gun control. Even liberal academics who advocate gun control, such as James Alan Fox, have taken it to task for its misleading use of data. Unable to win the battle of facts, Mother Jones this week tried hitting back in a more personal way.
Julia Lurie, the reporter Mother Jones assigned to do a story on Lott, somehow managed to ask him over 40 detailed questions and still wrote about many issues that she had not asked him about; when she did ask about things, she ignored Lott’s responses. Her piece was filled with simple factual errors. Even a brief look at Lott’s book More Guns, Less Crime or at his original research paper with David Mustard would have caught them.
Lurie somehow couldn’t manage to talk to researchers who have found results similar to Lott’s, and she couldn’t even mention what most of the peer-reviewed studies have found.
Mother Jones has gone after others, such as Bill O’Reilly, with personal attacks. In Lott’s case, they claim that the work produced by the Crime Prevention Research Center isn’t “academic quality,” and they quote Professor Gary Kleck as saying that credible criminologists don’t believe that “with more guns there are less crimes” and that Lott’s research “was garbage in and garbage out.” They also claim that the “National Research Council . . . concluded that the existing research, including Lott’s, provided ‘no credible evidence’ that right-to-carry laws had any effect on violent crime.” These attacks are misleading and out of context.
Take Lurie’s points in order.
1) “The National Research Council, a branch of the National Academy of Sciences, assembled a panel to look into the impact of concealed-carry laws; 15 of 16 panel members concluded that the existing research, including Lott’s, provided “no credible evidence” that right-to-carry laws had any effect on violent crime.”
The National Research Council report actually concluded as follows: “The committee concludes that with the current evidence it is not possible to determine that there is a causal link between the passage of right-to-carry laws and crime rates.” The majority of the panel advocated that more money be made available to academics to fund additional research. Lurie somehow manages not to mention that, despite evaluating every type of gun law that has been studied, the Council found no evidence that any law had any impact.
Right-to-carry laws were actually the only type of law where there was dissent. James Q. Wilson, who at the time was possibly the “most influential criminal justice scholar of the 20th century,” concluded: “I find that the evidence presented by Lott and his supporters suggests that [right-to-carry] laws do in fact help drive down the murder rate.”
2) Were Lott’s results biased by the crack cocaine epidemic of the late 1980s and early 1990s?
Lott’s research, from the very first work with David Mustard, dealt with the crack cocaine issue. As Florenz Plassmann and John Whitley (Stanford Law Review, 2003) summarized the research:
“One of Ayres and Donohue’s greatest concerns is the apparent failure of previous research to account for the differential geographic impact of cocaine on crime. Lott’s book (and the Lott and Mustard paper) reported that including price data for cocaine did not alter the results. Using yearly county-level pricing data (as opposed to short-run changes in prices) has the advantage of picking up cost but not demand differences between counties, thus measuring the differences in availability across counties. Research conducted by Steve Bronars and John Lott examined the crime rates for neighboring counties . . . on either side of a state border. When the counties adopting the law experienced a drop in violent crime, neighboring counties directly on the other side of the border without right-to-carry laws experienced an increase. . . . Ayres and Donohue argue that different parts of the country may have experienced differential impacts from the crack epidemic. Yet, if there are two urban counties next to each other, how can the crack cocaine hypothesis explain why one urban county faces a crime increase from drugs, when the neighboring urban county is experiencing a drop? Such isolation would be particularly surprising as criminals can easily move between these counties. . . . Even though Lott gave Ayres and Donohue the cocaine price data from 1977 to 1992, they have never reported using it.”
Lott’s third edition of More Guns, Less Crime in 2010 also used new data from Fryer et al., published in Economic Inquiry, that attempted to measure the impact of crack cocaine from 1980 to 2000.
The claim that the research supporting right-to-carry laws somehow ignores the potential impact of crack cocaine on crime is simply wrong. Even worse, critics such as Ayres and Donohue, who claim that the results could be explained away by the impact of crack cocaine, have never provided any estimates that include this factor to show that this is true.
Would it have been that difficult for Ms. Lurie to ask about this point if she were going to write about it? Alternatively, Lurie could have just looked at the appendix in More Guns, Less Crime to see all the discussions of crack cocaine.
3) “When [Ayres and Donohue] extended their survey by five years, they found that more guns were linked to more crime, with right-to-carry states showing an eight percent increase in aggravated assault.”
This is a simple counting error. Ayres and Donohue made the false claim, and Lurie never bothered to confirm it. The second edition of More Guns, Less Crime, published in 2000, used data from 1977 to 1996. Ayres and Donohue’s 2003 paper used data from 1977 to 1997. John Lott provided Ayres and Donohue his data from 1977 to 1996, and they added one year to it. Adding that one year to the 20 already being examined didn’t make a difference. They obtained somewhat different results because they used a different specification and misinterpreted their results.
Again, a quick look at either the second or third edition of More Guns, Less Crime would have let Lurie realize that this claim was incorrect.
4) Claim by Gary Kleck that John Lott hadn’t “accounted for missing data,” and that “It was garbage in and garbage out.” The problem is simple: in some counties, not all of the cities report crime data every year. This causes some randomness in the number of crimes reported for those counties. The problem used to be particularly prevalent in low-population counties, but it has improved considerably over time.
Take Georgia, a state that has been singled out by some of those concerned about this problem, over the period from 1980 to 1993, after which the problem had largely disappeared. Of the state’s 159 counties over this period, the 16 least populated ones, with a total of about 1 percent of Georgia’s population, had about 35% of their police departments failing to report. By contrast, the 127 most populated counties, with 97.2% of the total population, averaged an under-reporting rate of 5.6%. Since all the regressions that John Lott reported in his work weighted the county data by population, the counties with the largest reporting problems had little impact on the results.
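The effect of population weighting can be sketched with a small, purely hypothetical example (the numbers below are illustrative, not the actual Georgia figures): a handful of tiny counties with badly under-reported crime rates barely move a population-weighted average, while they noticeably drag down an unweighted one.

```python
# Purely hypothetical numbers for illustration -- not the actual Georgia data.
# Each county is a (population, reported crime rate per 1,000 people) pair.
# The 16 tiny counties under-report, so their measured rate (3.0) sits far
# below the true rate (5.0) shared by the 127 large counties.
large_counties = [(500_000, 5.0)] * 127
small_counties = [(5_000, 3.0)] * 16
counties = large_counties + small_counties

def weighted_rate(data):
    """Population-weighted average crime rate."""
    total_pop = sum(pop for pop, _ in data)
    return sum(pop * rate for pop, rate in data) / total_pop

def unweighted_rate(data):
    """Naive average that counts every county equally."""
    return sum(rate for _, rate in data) / len(data)

print(round(weighted_rate(counties), 3))    # stays very close to 5.0
print(round(unweighted_rate(counties), 3))  # pulled well below 5.0
```

Because the 16 tiny counties hold only a sliver of the total population, their badly measured rates carry almost no weight in the first average, which is the intuition behind weighting the regressions by county population.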
All data contains some errors. The question isn’t whether data has errors; it is whether those errors are random or whether they systematically bias the results. Starting with the first paper by John Lott and David Mustard, this data issue has been dealt with in many different ways.
— The original paper with Mustard first looked at all counties, then just at counties with more than 50,000 people, and then at those with more than 100,000 people. If the small-population counties were creating a bias in favor of right-to-carry laws, removing those small counties and looking only at larger ones should eliminate that result. But that didn’t happen. The results were very similar when just the more populous counties were used.
— The second edition of More Guns, Less Crime studied city, county, and state level data. Even if that particular error existed for county level data, it did not exist for city or state level data. And, again, the results were similar.
— A 2002 paper with John Whitley explicitly examined errors in the county level data and found no evidence of any systematic biases. It was published in 2003 in the Journal of Quantitative Criminology.
All this information was provided to Lurie.
5) “Kleck, who conducted a controversial, yet often-cited survey on defensive gun use, observes, ‘Do I know anybody who specifically believes with more guns there are less crimes and they’re a credible criminologist? No.’”
So does Gary Kleck acknowledge that James Q. Wilson was a credible criminologist? A survey just completed by Gary Mauser of people who published on firearm issues in refereed criminology journals from 2000 to 2014 found that 31% thought that right-to-carry laws lowered murder rates, 15% said that they increased murder rates, 46% said that the laws had no effect, and 5.1% said that they didn’t know. While the largest group of criminologists agrees with Gary Kleck that right-to-carry laws have no impact on crime rates, the second largest group does believe the more guns, less crime hypothesis.
6) What John Lott actually claimed about the views of economists and criminologists was that the vast majority of published peer-reviewed papers looking at the impact of right-to-carry laws on US crime rates found that they reduced violent crime rates, and that the rest of the papers found no effect on murder, rape, and robbery (see also here).
7) “The organization . . . proceeds and publishes ‘academic quality’ reports that have yet to be published in peer-reviewed journals.”
It helps provide some perspective that John Lott has published over 100 peer-reviewed academic journal articles. The CPRC was only started in October 2013, and it takes time to produce research and even more time to go through the peer-review process. Despite that, the CPRC supported “The Impact of Right-to-carry laws on Crime: an Exercise in Replication” by Carlisle Moody, Thomas Marvell, Paul Zimmerman, and Fasil Alemante, published last year in the Review of Economics and Finance. The CPRC also co-authored a paper published in Public Choice, another peer-reviewed journal. In addition, as Lurie was informed, one paper had been revised and resubmitted at a journal, and another paper showing errors in a recent FBI report on active shooters was published in the Academy of Criminal Justice Sciences Today.
Not surprisingly, Mother Jones fails to note the CPRC’s prestigious academic advisory board, with people at the top of their fields from the University of Chicago, Harvard, and the Wharton Business School.
8) “one of the small number of very pro-gun researchers like Gary Kleck or John Lott”
This statement makes two mistakes. First, most economists who have published research on firearms in peer-reviewed journals believe that there is a net safety benefit from people carrying guns. For example, worldwide, 83% of economists who have published on this topic believe that guns are more likely to be used in self-defense than in crime, and 74% believe that concealed handgun laws lower the murder rate. As noted earlier, those who publish in criminology journals are more divided on the issue, and thus they do not take the monolithic position that the article describes for researchers.
It is strange that Kleck is labeled “pro-gun” in the same article where he is quoted as saying: “Do I know anybody who specifically believes with more guns there are less crimes and they’re a credible criminologist? No.” Kleck believes guns have no net effect on crime rates, and thus he doesn’t think that it matters whether guns are banned or licensed or regulated in some other way. Gary Kleck and John Lott clearly have very different views on guns, and it is surprising that the article lumps the two together.
9) “Lott claimed that it was based on a data from a survey he had conducted—but that the data had been lost in a computer crash.”
The hard disk crash was widely documented by people at the time it occurred, on July 3, 1997. The crash destroyed the data for all the papers that Lott was working on up to that point. A number of co-authors also lost data for papers that they had been working on with him (Larry Kenny at Florida State, Richard Manning, who was then at BYU, Jonathan Karpoff at the University of Washington, and David Mustard at the University of Georgia), and others had contemporaneous knowledge of the crash (including Geoffrey Huck, an editor at the University of Chicago Press; Dan Kahan at Yale; and John Whitley, who was at the time at the University of Adelaide in Australia).
10) Mother Jones completely mangles the timeline and seems unable to accurately report the numbers for a follow-up survey, conducted in 2002 before controversy erupted about the first survey, that confirmed the earlier results. The point of the 2002 survey was to see whether there had been any changes in the rate of defensive gun uses since the 1997 survey, but when the earlier results were questioned, it also served as a way of replicating them. The survey was designed differently from other surveys, such as asking people only about crimes over the previous year rather than events that had occurred over the last 10 years or longer. As to the timeline, the survey had begun being prepared by one of Lott’s then research assistants, James Knowles, in June 2002, well before the controversy over the first survey arose at the end of 2002 and the beginning of 2003.
11) “Rosh and Lott shared an internet address.” This is simply false. Lott had a dynamic IP address. Julian Sanchez had put up a post on his blog noting that “maryrosh” had an IP address in Southeastern Pennsylvania and asking for help from anyone who might know who the person was. When Lott saw Sanchez’s post, he emailed Sanchez and told him that he had used his kids’ email address in putting posts up in an internet chatroom. Lott had originally used his own email address in postings in the chatroom, but, unfortunately, some people tried to continue the discussions in unpleasant ways in person. Since the vast majority of people using the chatroom were using pseudonyms, it seemed appropriate to follow that example.
12) The Mother Jones story originally made fun of the fact that Milwaukee Sheriff David Clarke is on the Crime Prevention Research Center’s Board of Directors (the story has since been updated, but without acknowledging the original post). We are very proud of our relationship with David Clarke and believe that he brings an important real-world perspective to the Center. Mother Jones also fails to note that Professor Edgar Browning, who is also on our board, has been one of the top public finance economists in the world.
This article by Mother Jones is part of a pattern of similar attacks. For a decade, Media Matters attacked John Lott in over 110 posts, and it was so unwilling to discuss the claims it made that it systematically removed any responses posted in its comment section. More recently, a new group named “Armed with Reason,” which has connections to Bloomberg’s gun control groups, rehashed old criticisms while completely ignoring the academic responses that Lott had already published. A more productive approach would have been to explain why Lott’s responses were wrong.