john.bullock@yale.edu

By post:
Institution for Social and Policy Studies
Yale University
PO Box 208209
New Haven, CT 06520-8209

By courier:
Institution for Social and Policy Studies
Yale University
77 Prospect Street
New Haven, CT 06511-8955


Peer-Reviewed Articles

Bullock, John G. 2011. Elite Influence on Public Opinion in an Informed Electorate. American Political Science Review 105 (August): 496-515. [More information]

Luskin, Robert C., and John G. Bullock. 2011. “Don’t Know” Means “Don’t Know”: DK Responses and the Public’s Level of Political Knowledge. Journal of Politics 73 (April): 547-57. [More information]

Bullock, John G., Donald P. Green, and Shang E. Ha. 2010. Yes, But What’s the Mechanism? (Don’t Expect an Easy Answer). Journal of Personality and Social Psychology 98 (April): 550-58. [More information]

Bullock, John G. 2009. Partisan Bias and the Bayesian Ideal in the Study of Public Opinion. Journal of Politics 71 (July): 1109-24. [More information]



Other Published Research

Bullock, John G., and Donald P. Green. 2013. Mediation Analysis in the Social Sciences (comment). Journal of the Royal Statistical Society, Series A (January): 38-39.

Bullock, John G., and Shang E. Ha. 2011. Mediation Analysis Is Harder than It Looks. In Cambridge Handbook of Experimental Political Science, ed. James N. Druckman, Donald P. Green, James H. Kuklinski, and Arthur Lupia. New York: Cambridge University Press. [More information]

Green, Donald P., Shang E. Ha, and John G. Bullock. 2010. Enough Already about “Black Box” Experiments: Studying Mediation Is More Difficult than Most Scholars Suppose. Annals of the American Academy of Political and Social Science 628 (March): 200-08. [More information]

Bendor, Jonathan, and John G. Bullock. 2008. Lethal Incompetence: Voters, Officials, and Systems. Critical Review 20 (March): 1-24. [More information]

Sniderman, Paul M., and John G. Bullock. 2004. A Consistency Theory of Public Opinion and Political Choice: The Hypothesis of Menu Dependence. In Studies in Public Opinion: Gauging Attitudes, Nonattitudes, Measurement Error, and Change, ed. Willem E. Saris and Paul M. Sniderman. Princeton, NJ: Princeton University Press. [More information]

Syllabi

Political Preferences and American Political Behavior: Syllabus, 2013 Fall

Political Psychology (undergraduate lecture): Syllabus, 2014 Spring

Public Opinion and Representation in the United States (undergraduate seminar): Syllabus, 2013 Fall



Other Resources

Learning R

“Don’t Know” Means “Don’t Know”: DK Responses and the Public’s Level of Political Knowledge Abstract

Does the public know much more about politics than conventionally thought? A number of studies have recently argued, on various grounds, that the “don’t know” (DK) and incorrect responses to traditionally designed and scored survey knowledge items conceal a good deal of knowledge. This paper examines these claims, focusing on the prominent and influential argument that discouraging DKs would reveal a substantially more knowledgeable public. Using two experimental surveys with national random samples, we show that discouraging DKs does little to affect our picture of how much the public knows about politics. For closed-ended items, the increase in correct responses is large but mainly illusory. For open-ended items, it is genuine but minor. We close by examining the other recent evidence for a substantially more knowledgeable public, showing that it too holds little water.
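The abstract's distinction between genuine and illusory gains can be illustrated with a toy simulation. This is my own sketch, not the article's analysis: the share of knowers, the number of answer options, and the sample size are all invented for illustration. When non-knowers who would otherwise say "don't know" are pushed to answer a four-option item, about a quarter of them are correct by chance, which inflates the measured knowledge rate without any change in actual knowledge:

```python
import random

random.seed(1)

N_OPTIONS = 4      # four-option closed-ended knowledge item (assumed)
N_RESP = 10_000    # simulated respondents per condition (assumed)
P_KNOW = 0.3       # share who genuinely know the answer (assumed)

def share_correct(discourage_dk: bool) -> float:
    """Share of simulated respondents who answer the item correctly."""
    correct = 0
    for _ in range(N_RESP):
        if random.random() < P_KNOW:
            correct += 1                   # knowers answer correctly
        elif discourage_dk:
            # Former DK respondents guess at random among the options.
            correct += random.random() < 1 / N_OPTIONS
        # Otherwise, non-knowers say "don't know" and score no point.
    return correct / N_RESP

standard = share_correct(discourage_dk=False)  # close to P_KNOW
no_dk = share_correct(discourage_dk=True)      # close to P_KNOW + (1 - P_KNOW) / N_OPTIONS
```

Discouraging DKs raises the correct-response rate from about .30 to about .48 here, yet the share of respondents who actually know the answer never changes.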

Article

Published version

Appendices

Bibliographic Information

[BibTeX] [EndNote] [Google Scholar] [RIS]
doi:10.1017/S0022381611000132

Replication Archive
Partisan Bias and the Bayesian Ideal in the Study of Public Opinion Abstract

Bayes’ Theorem is increasingly used as a benchmark against which to judge the quality of citizens, but some of its implications are not well understood. A common claim is that Bayesians must agree more as they learn and that the failure of partisans to do the same is evidence of bias in their responses to new information. Formal inspection of Bayesian learning models shows that this is a misunderstanding. Learning need not create agreement among Bayesians. Disagreement among partisans is never clear evidence of bias. And although most partisans are not Bayesians, their reactions to new information are surprisingly consistent with the ideal of Bayesian rationality.
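The claim that learning need not create agreement can be checked with a two-line application of Bayes' Theorem. This is a toy illustration, not the article's formal model; the priors and the likelihood ratio are invented. Two Bayesians who see the same evidence and interpret it identically can end up further apart:

```python
def posterior(prior: float, likelihood_ratio: float) -> float:
    """Posterior probability of hypothesis H after evidence E, where
    likelihood_ratio = P(E | H) / P(E | not H), by Bayes' Theorem."""
    return prior * likelihood_ratio / (prior * likelihood_ratio + (1 - prior))

# Both observers see the same evidence, which favors H two to one.
prior_a, prior_b = 0.1, 0.5
post_a = posterior(prior_a, 2.0)   # 0.2 / 1.1, about 0.18
post_b = posterior(prior_b, 2.0)   # 1.0 / 1.5, about 0.67

gap_before = prior_b - prior_a     # 0.40
gap_after = post_b - post_a        # about 0.49: the gap widens
```

Both observers update rationally toward H, yet their disagreement grows, so growing disagreement alone cannot demonstrate bias.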

Article

Published version

Appendix

Bibliographic Information

[BibTeX] [EndNote] [Google Scholar] [RIS]
doi:10.1017/S0022381609090914

Replication

[Figure_1.R] [Figure_2.R] [Figure_3.R]

Elite Influence on Public Opinion in an Informed Electorate Abstract

An enduring concern about democracies is that citizens conform too readily to the policy views of elites in their own parties, even to the point of ignoring other information about the policies in question. This article presents two experiments that suggest an important condition under which the concern may not hold. People are rarely exposed to even modest descriptions of policies, but when they are, their attitudes seem to be affected at least as much by those descriptions as by cues from party elites. The experiments also include measures of the extent to which people think about policy, and contrary to many accounts, they suggest that party cues do not inhibit such thinking. This is not cause for unbridled optimism about citizens’ ability to make good decisions, but it is reason to be more sanguine about their ability to use information about policy when they have it.

Article

Preprint and appendix

Bibliographic Information

[BibTeX] [EndNote] [Google Scholar] [RIS]
doi:10.1017/S0003055411000165

Replication Archive
Lethal Incompetence: Voters, Officials, and Systems Abstract

The study of voter competence has made significant contributions to our understanding of politics, but at this point there are diminishing returns to the endeavor. There is little reason, in theory or in practice, to expect voter competence to improve dramatically enough to make much of a difference, but there is reason to think that officials’ competence can vary enough to make large differences. To understand variations in government performance, therefore, we would do better to focus on the abilities and performance of officials, not ordinary citizens.

Article

Bibliographic Information

[BibTeX] [EndNote] [RIS]
doi:10.1080/08913810802316290

Mediation Analysis Is Harder than It Looks Abstract

Mediation analysis is the effort to understand the mechanisms through which some variables affect others. It is increasingly common in political science. But political scientists typically draw inferences about mediation without manipulating mediators, and their analyses are likely to be biased. Recognizing the problem, social scientists are gradually turning to methods that involve experimental manipulation of mediators. This is a step in the right direction, but experiments have little-appreciated limitations of their own. We describe these limitations and conclude that inference about mediation is fundamentally difficult—more difficult than inference about treatment effects, and best tackled by a research program that is specifically designed to speak to the challenges of mediation analysis.
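The bias described in the abstract can be seen in a small simulation. This is my own sketch, not the chapter's example; all coefficients and the confounder are invented. Treatment is randomized, but the mediator is not, so an unobserved variable that affects both mediator and outcome contaminates the standard regression estimate of the mediator's effect:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
b_true = 1.0                          # true effect of mediator M on outcome Y

t = rng.integers(0, 2, size=n)        # randomized treatment
u = rng.normal(size=n)                # unobserved confounder of M and Y
m = 0.5 * t + u + rng.normal(size=n)  # mediator: moved by T and by U
y = b_true * m + 0.5 * t + u + rng.normal(size=n)

# The standard (nonexperimental) mediation regression of Y on M and T.
X = np.column_stack([np.ones(n), m, t])
b_hat = np.linalg.lstsq(X, y, rcond=None)[0][1]

# b_hat converges to b_true + Cov(U, M | T) / Var(M | T) = 1.5, not 1.0.
```

Because U is unobserved, no regression on the measured variables recovers b_true; removing the bias requires manipulating M or invoking strong assumptions, which is the point of the chapter.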

Article

Bibliographic Information

[BibTeX] [Google Scholar]

Enough Already about “Black Box” Experiments: Studying Mediation Is More Difficult than Most Scholars Suppose Abstract

The question of how causal effects are transmitted is fascinating and inevitably arises whenever experiments are presented. Social scientists cannot be faulted for taking a lively interest in “mediation,” the process by which causal influences are transmitted. However, social scientists frequently underestimate the difficulty of establishing causal pathways in a rigorous empirical manner. We argue that the statistical methods currently used to study mediation are flawed and that even sophisticated experimental designs cannot speak to questions of mediation without the aid of strong assumptions. The study of mediation is more demanding than most social scientists suppose and requires not one experimental study but rather an extensive program of experimental research.

Article

Preprint
Published version (gated)

Bibliographic Information

[BibTeX] [EndNote] [Google Scholar] [RIS]
doi:10.1177/0002716209351526

Yes, But What’s the Mechanism? (Don’t Expect an Easy Answer) Abstract

Psychologists increasingly recommend experimental analysis of mediation. This is a step in the right direction because mediation analyses based on nonexperimental data are likely to be biased and because experiments, in principle, provide a sound basis for causal inference. But even experiments cannot overcome certain threats to inference that arise chiefly or exclusively in the context of mediation analysis—threats that have received little attention in psychology. We describe three of these threats and suggest ways to improve the exposition and design of mediation tests. Our conclusion is that inference about mediators is far more difficult than previous research suggests, and best tackled by an experimental research program that is specifically designed to address the challenges of mediation analysis.

Article

Published version

Appendix

Bibliographic Information

[BibTeX] [EndNote] [Google Scholar] [RIS]
doi:10.1037/a0018933

News and Other Information

Eliot Smith, the new editor of Journal of Personality and Social Psychology: Attitudes and Social Cognition, comments on the article.

New standards for mediation analysis in Social Psychological and Personality Science. The editor, Allen McConnell, cites our article while establishing the new standards.

Learning R

General Resources

Efficient R (my own advice)
Quick-R
An Introduction to R [PDF]
R in a Few Hours [PDF]
Paul Johnson’s R Tips
Stack Overflow
R-help Mailing List Search
Yale Statlab workshops
Simon Jackman’s workflow slides (about R and more)

Graphics

R Graphics chapters and code for many figures
Lattice code for many figures (nice interface)

Scoping

Gentleman and Ihaka (2000) [JSTOR]
John Fox’s introduction to scoping in R [PDF]

Style

Hadley Wickham’s R Style Guide
Jonathan Nagler’s Style Guide

Miscellany

Convert columns of dummies to a factor variable
Dropping named columns from a data frame
Get source code for any function
Using R with Eclipse

Viewing help pages in a browser while using StatET

Learning Economics, Math, and Statistics

I list only the free resources that I think most useful for political science students.

Economics

Osborne and Rubinstein’s A Course in Game Theory
Polak’s introductory game theory class
Rubinstein’s lecture notes on micro [PDF]

Math

Teach yourself calculus: single-variable, multivariable
Keisler’s Elementary Calculus
Strang’s Calculus

Osborne’s math-for-econ tutorial
Economist’s Mathematical Manual (very terse)

Statistics

Russ Lenth’s power-and-sample-size calculators
Harvard’s Intro to Probability
Don Green’s lecture notes
American Statistician archives [JSTOR]
Elements of Statistical Learning (not for beginners)

Related

Useful methods books for undergrad and grad students
Mathematical sociology textbook (learn Markov chains)

Miscellany

Data

High School and Beyond: how to import the data:
Original HSB study (1980): school questionnaire
Original HSB study (1980): student questionnaire
First sophomore follow-up (1982)
First senior follow-up (1982)
Local labor market indicators (1982)

Graduate Admissions

Dan Nexon’s advice (best I’ve seen for political science)

LaTeX

apsr2006.bst (the best BibTeX style file for the APSR)
Beamer item indentation
Beamer list of colorable elements
Detexify (draw a symbol, get the name)
LaTeX Previewer (enter code, get picture of equations)

Presenting

Leslie Lamport’s advice

Workflow

Deleting an Amazon S3 bucket that has many files in it
Simon Jackman’s workflow slides (about R and more)

Writing

Against The Elements of Style
Michael Munger on writing habits