February 13, 2015

National Academies: Census survey data should be more user-friendly


The U.S. Census Bureau should be paying more attention to the needs and opinions of the people and organizations that use its data, according to a recent National Academy of Sciences report.

The report, which focused on the American Community Survey, recommended that the bureau sharpen its sometimes confusing advice on evaluating data quality, consider trimming the number of tables it produces, and make it easier for users – both inexperienced and more advanced – to access statistics they need.

The American Community Survey is “an invaluable resource,” but the bureau should do more to make sure the survey meets researchers’ needs, including establishing a formal advisory group of users, according to the panel of data scientists who wrote the report at the Census Bureau’s request. The National Academy of Sciences is a private nonprofit organization, established by Congress, that provides independent advice on science and technology.

The Census Bureau is still studying the National Academy’s recommendations and has not yet decided which ones to pursue, agency spokeswoman Shelly Hedrick said last week in an email.

The review was meant to help the bureau, which is in the midst of its own examination of the American Community Survey. The bureau’s largest household survey – which is based on data from more than 2 million households – provides national and local estimates on education, immigration, housing and other topics that are widely used by government agencies, nonprofit organizations, academic researchers and businesses. The survey also plays a role in allocating $400 billion a year in government funding and in enforcing civil rights laws, among other things.

But people who use this census data say the bureau’s website leaves much to be desired, the report said. For example, inexperienced users “have difficulty navigating the many options,” and more sophisticated users “feel constrained by the limited flexibility and features” that are offered.

Meanwhile, the academy panel said, the job of producing more than 11 billion estimates a year from the survey is stretching the bureau’s capacity. The agency has never evaluated which data products are most useful. The report suggested that the bureau do such an evaluation and consider trimming the number of data tables it publishes.

The Census Bureau took one step in that direction this month by proposing to stop publishing estimates that combine three years of survey data. Combining multiple years yields a larger sample size that lets researchers analyze small areas and groups in more detail; the agency will continue to publish five-year estimates for that purpose.

Dropping some data products would free bureau resources for other projects to help data users, the report said. One such project it recommended was development of a more user-friendly data query system so advanced-level researchers could run customized tables and perform high-level statistical analysis.

The report also told the bureau to offer clearer advice on the limits and quality of its data. For example, although the bureau publishes margins of error for its survey estimates, its guidance on how to use these estimates is sometimes “confusing and uninformative,” the academy panel said. The report also questioned the bureau’s practice of not publishing some data it deems too imprecise, such as estimates about a small population group with a large margin of error. It would be better to publish the data and offer users more help in evaluating its quality, the report said. (The bureau also withholds data to protect respondent confidentiality; the report did not challenge that practice.)
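
To see what clearer guidance might look like, here is a minimal sketch, not drawn from the report, of the conventional arithmetic for a published ACS margin of error: ACS margins of error correspond to a 90 percent confidence level, so the implied standard error is the MOE divided by 1.645. The figures and function name below are hypothetical.

```python
# Minimal sketch: interpreting a published ACS estimate and its margin of error.
# ACS margins of error are published at the 90% confidence level.

Z_90 = 1.645  # z-score matching the 90% confidence level the ACS uses

def describe_estimate(estimate: float, moe: float) -> dict:
    """Turn an estimate and its margin of error into interpretable numbers."""
    se = moe / Z_90              # standard error implied by the MOE
    cv = 100 * se / estimate     # coefficient of variation, in percent
    return {
        "90% interval": (estimate - moe, estimate + moe),
        "standard error": round(se, 1),
        "CV %": round(cv, 1),    # a higher CV means a less precise estimate
    }

# Hypothetical small-area estimate: 1,200 people, with a 400-person MOE.
print(describe_estimate(1200, 400))
# The CV comes out near 20%, the kind of imprecision that currently leads
# the bureau to withhold some estimates.
```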

The academy’s report was commissioned chiefly to study ways to improve the survey’s patchy data about small geographic areas and small population groups as well as to recommend other improvements. In addition to urging more user-friendliness, the report made two dozen recommendations for improvements in sample design and survey-question selection, as well as in data collection, processing and analysis.

The report comes as the bureau is reviewing public comments on its proposal last year to drop several questions from the survey about marriage and other topics. The bureau is to send its recommendations to the Office of Management and Budget, which makes the final decision, by April. Changes would take effect in 2016.

Topics: Federal Government, Population Trends, U.S. Census

D’Vera Cohn is a senior writer/editor focusing on immigration and demographics at Pew Research Center.

1 Comment

  1. SenatorJPO · 2 years ago

    The American Community Survey needs to be more detailed in its reports, not dumbed down. Current means of presenting ACS data over-simplify the statistical picture because they lump starkly different data values into broad bands, obscuring meaning that would be accessible if the microdata were summarized in tables with narrower intervals between bands.

    For example, educational attainment should be broken into smaller age bands, as well as cross-tabulated by tier of educational institution (Ivy League, state university, and private university).

    The ACS lumps together earnings from alumni of these strikingly different institutions under a single broad attainment category, thereby leading citizens to draw false conclusions about their chances of a decent-paying job after attending a state university.

    Many consequently reason, “I’ll attend a local university so that I don’t have to move, and my earnings won’t be hurt,” but then they end up working manual labor for $10 an hour years after graduation. Bearing this out are Consumer Credit Panel data from the Federal Reserve, which indicate that around 40 percent of U.S. college graduates are involuntarily unemployed or under-employed within a year of degree conferral.

    For how many degree holders is that problem only short-term? What jobs are they working in the intermediate term, several years down the line? To what extent does this under-employment dominate their working lives in the long term?

    These are pressing public policy questions that cannot be answered by eliminating data tables. We need more granular summaries, with narrower numerical bands (for ages) and categorical values (for type of university), not fewer!

    As it stands, advocates of higher education can simply say, “See? Those ages 18-64 who earned a 4-year degree or advanced degree earned more on average than those who did not!”

    This ignores the perilous plight of recent U.S. college graduates, whose under-employment rate of 40 percent or more may approach 50 percent as new Consumer Credit Panel data are disclosed by the Federal Reserve.

    Student-consumers and citizen-journalists need smaller age bands of microdata to analyze, so that we may answer specific questions about earnings per education level. Ergo, the 4-year degree group would be segmented into 18-22, 23-26, 27-30 years, etc., to either confirm or refute the CCP findings about those who earned their degree 1-3 years ago (the 23-26 age band of college-educated residents); a rough sketch of such a tabulation appears after this comment.

    That should be easily achievable because the ACS already has the microdata! There’s no excuse not to report data summaries in smaller bands. It’s not as if Congress or the Census Bureau forbids ACS staff from adding greater detail! They merely set minimum standards, which the ACS sadly does not exceed.

    An over-achieving ACS would strive for even greater accuracy, beyond mere fine-tuning of age bands, in distinguishing economic winners from losers among the college-educated. The next step in data richness is to segment each college-educated age cohort by subtype of conferring institution: prestige tier or average cost of attendance at their alma mater.

    This would be a potential blow for state universities, and that’s why their lobbyists vigorously oppose such a measure. “Truth to power” would shrink their enrollment virtually overnight, but the citizenry demands greater transparency about where our alumni are winding up economically. To what extent is our public investment in college ending up a sunk cost without hope for recovery?

    ***We need the aforementioned improvements to ACS instrumentation and reporting to better understand this resource allocation problem.***

    To do otherwise, to embrace the status quo, amounts to saying that the question of how academic inputs translate into outputs isn’t worth answering at local or regional levels. The national picture on the issue is foggy due to the vagueness introduced by broad banding of data segments and the omission of information about the source of one’s degree (state university, Ivy League, liberal arts, polytechnic, etc.).

    Potential students wind up gambling against far higher odds than anticipated, and many would have walked away from the deal if they had known the probable risk of being no better off financially in the long run from their investment of student loans and precious time (you’re only young once, after all!).

    “Save early, save often” is a plan oft-derailed when young adults choose to study full-time instead of working full-time. And though one might argue post-2013 that a manual laborer might wind up with two part-time jobs, such earnings are still more reliable than the purported “college wage premium” that all too commonly never materializes for university graduates!

    This intentional omission of post-intervention information distorts the market because a free market can maximize consumer utility IF, AND ONLY IF, all pertinent benefit-and-cost information is made available to those consumers!

    Those who claim, “We don’t need to improve anything,” wish to leave potential consumers in the dark about the recent median or modal worth of any particular state university’s degrees (in terms of jobs actually attained by its own alumni in the labor market) versus those of another. They stand to gain from inefficient resource allocation, since informed student-consumers might otherwise opt out of the admissions process.
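
For readers curious how the finer banding proposed above might be produced, here is a minimal sketch using made-up PUMS-style microdata. The column names (AGE, DEGREE, EARNINGS, WEIGHT) and all values are illustrative assumptions, not actual ACS variables.

```python
# Sketch: weighted mean earnings for 4-year-degree holders in the narrow
# age bands the comment proposes (18-22, 23-26, 27-30, ...).
import pandas as pd

# Hypothetical microdata: one row per respondent, with a survey weight.
df = pd.DataFrame({
    "AGE":      [22, 24, 25, 28, 31, 24, 26, 29],
    "DEGREE":   ["BA", "BA", "HS", "BA", "BA", "BA", "HS", "BA"],
    "EARNINGS": [21000, 34000, 28000, 52000, 61000, 18000, 30000, 47000],
    "WEIGHT":   [110, 95, 120, 80, 105, 90, 115, 100],
})

# Keep 4-year-degree holders and assign each to a narrow age band.
ba = df[df["DEGREE"] == "BA"].copy()
ba["BAND"] = pd.cut(ba["AGE"], bins=[17, 22, 26, 30, 64],
                    labels=["18-22", "23-26", "27-30", "31-64"])

# Weighted mean earnings per band: sum(earnings * weight) / sum(weight).
ba["WTD_EARN"] = ba["EARNINGS"] * ba["WEIGHT"]
totals = ba.groupby("BAND", observed=True)[["WTD_EARN", "WEIGHT"]].sum()
print(totals["WTD_EARN"] / totals["WEIGHT"])
```

A real tabulation would also carry the survey's replicate weights so each band gets a margin of error, which is exactly where small cells become imprecise.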