- Husbands Fealing, Kaye
- University of Minnesota
- Start date
- End date
- Goals and Objectives

This project directly addresses a key agricultural priority: improving U.S. food safety. The president's 2012 USDA budget called for $1 billion to be spent on food safety, and USDA's research priorities mirror that emphasis. Yet it is difficult to document systematically how much research there is in the field, who is doing the research, and what the results of that research are.

The purpose of the study is to develop frameworks and techniques for measuring outcomes from federally funded research targeted at the agricultural sector in general and food safety in particular. The main question to be addressed is: What are the results of federally sponsored academic research on food safety and security--and how do they affect science, innovation, and broader aspects of social well-being? This study seeks to answer this question in three ways: by using new data, new technologies, and new methods. A core feature of the project is the use of STAR METRICS data, which provide new and very granular information on all participants in all federally funded research projects.

Other key questions that this study addresses are:
1. How do scientists themselves describe their research in food safety and security?
2. How do these definitions vary across funding agencies?
3. What expenditures have been made in food safety and security--not just by USDA but by all federal agencies--and how have these expenditures changed over time?
4. Who is doing research in food safety and security--including principal investigators, graduate students, postdoctoral researchers, and staff scientists?
5. What are the research outputs at the university--journal articles, books, and patents--that are most relevant in the near and longer term to food safety and food security in the United States and abroad? How are they linked to research funding?
6. What is the competitive advantage for specific universities analyzed in our database?
7. How can researchers, administrators, and students facilitate technology transfer without changing the main mission of the university--that is, to educate students and produce high-quality faculty research?
The researchers will examine the potential to trace the efficacy of Federal expenditures on research, development, and innovation activities from inputs to outcomes at 10 major research institutions. The study will thus enable us to better understand the "how" of research-to-practice and research-to-commercialization processes using both quantitative and qualitative evidence.
- More information
A 2009 Pew Research Center survey found that almost three-quarters of Americans agreed that government spending on basic scientific research, as well as on engineering and technology, "usually pays off in the long run." The same survey also found that roughly 60 percent of Americans said that "government investment in research is essential for scientific progress," while almost one-third said that "private investment will ensure that enough scientific progress is made, even without government investment." That year, roughly three percent of total output in the U.S. was spent on research and development (R&D) by private sector firms and government agencies. Federal expenditure on R&D was $133 billion, with about 25 percent of that spent on basic research; almost half of the nondefense R&D budget went to basic research. Arguably, these expenditures advanced science, which in turn affected social outcomes such as national security, health, food safety and security, energy and natural resource use, transportation, communication, and education. Yet estimates of the impact of science, technology, and innovation on society (from both the government and the private sector) are typically based on multipliers and other proximate values.

Economic returns, such as financial earnings from patent licenses, commercialized products, and spinoff companies, are one means of assessing the benefit streams of expenditures on science. Papers generated by researchers and the number of students graduated are also often counted as "returns" on financial expenditures on research activities at universities--e.g., yields from federally sponsored research and from private corporations. However, these measures do not strictly identify the outputs generated by any specific stream of funding. Such gross measures ignore the obvious necessary comparison: what is the additional output from these expenditures beyond what would have occurred under the status quo?
Furthermore, these measures of outputs from research activities do not go far enough to measure the social impacts of research. The general public wants to know how much their tax dollars contribute to improvements (or retrenchments) in social well-being. Assessing the public value of science and technology is therefore a critically important activity, for without such assessments the collective citizenry would not be able to grasp the return on their "investments" in the scientific enterprise.

The purpose of this study, therefore, is to develop frameworks and techniques for measuring outcomes and impacts from federally funded research targeted at the agricultural sector in general and food safety in particular. The main question to be addressed is: What are the results of federally sponsored academic research on food safety and security--and how do they affect science, innovation, and broader aspects of social well-being? We advance understanding of this question in three ways: by using administrative data linked to public records on patents and employment; new text-mining technologies that mine published articles and patents for research outputs that affect food safety and (potentially) food security; and new methods that produce estimates of returns on expenditures on university research.
At the heart of our approach is the recognition that the core outcome of interest for science funders is the creation, transmission, and adoption of scientific ideas. These are generated by social (both scientific and economic) networks; science funding works by enabling those networks to exist and expand. The focus of the empirical exercise will be to estimate (a) measures of the efficiency with which research networks convert their activities into the transmission and adoption of ideas and (b) measures of the efficiency with which funding creates and sustains those networks. This will be done for a variety of measures, sets of outcomes, and specifications, in order to begin a discussion with USDA policy makers and researchers about the utility of such an approach for decision making.

Our conceptual model identifies individual researchers (or the research community consisting of networks of researchers) as the 'engine' that generates ideas. Here, the theory of change is that there is a link between funding and the way in which those networks assemble; in turn, there is a link between research networks and the way in which ideas are created and transmitted--and hence generate scientific, social, economic, and workforce 'products'. These causal links are often long and tenuous: the frequently used approach of simply asking scientists to report on the results of grants will inevitably be systematically flawed, not only because of spotty reporting but also because science is not a slot machine in which funds are dropped and then three years later "a miracle occurs." The model is much closer to the Kuhnian approach, in which scientists have roles as trailblazers, pioneers, settlers, sodbusters, ranchers, and developers of research agendas (Warsh, 2012).
In summary, the STAR METRICS approach to assessing returns to government research funding uses existing administrative records to understand micro-level activities in the research process described above. The STAR METRICS methodology facilitates tracing from inputs to results at a much more granular level. Furthermore, the STAR METRICS framework is better suited to targeted, local policy decisions than to binary fund/do-not-fund macro-level decisions.

We will develop empirical ways of describing research networks and the role of funding in stimulating the creation, transmission, and adoption of ideas through those networks. Ancillary goals include developing more open and transparent measures of research activity and progress by capturing information on who is supported by research funding; what research is done, with whom and where; and the results. Our approach is to use new digital technologies to capture the data needed to demonstrate the broad scientific, social, economic, and workforce results of Federal S&T expenditures. Research institutions are already developing structured information architectures to capture current and more accurate information about their scientists' interests, activities, and accomplishments (e.g., the VIVO Project, http://vivoweb.org, and the Harvard Profiles System). Of course, privacy and confidentiality issues come to the fore here: we will store all data in the NORC data enclave, and strict rules will be followed to ensure that data on individual researchers are protected (Prada et al., 2011).

There are many ways in which these technologies can be applied. The first is to use natural language processing to describe WHAT research is being done, using proposal and award text to identify the research topics in a portfolio. The second is to use administrative records to describe WHO is doing the research (and WITH WHOM).
The third is to use CV data, university websites, and administrative data--for example, a new patents database--along with other sources to describe WHAT RESULTS the funding has generated.

We will examine who is doing research on federally supported grants on food security by using the new STAR METRICS data from 10 CIC institutions, which receive substantial funding from USDA NIFA. The data on the full team of researchers supported by an individual research grant are captured in the STAR METRICS Level I research institution data. This is possible because the data are drawn directly from payroll records, which also carry the occupational classification of each individual employed; it is thus very straightforward to describe the variety of occupational categories directly supported by agency funding. In the case of one university's data, a broad spectrum of occupations is involved in working on a single PI's projects, including clinicians, technicians, undergraduate, graduate, and post-graduate researchers, and research support staff. The data also permit the capture of much more detailed information on time allocation and on the interactions of all staff on projects (Lane & Schwarz, 2012).

It has hitherto been very difficult to describe what research has been done across agencies, because manually created taxonomies and budget-oriented program codes do not fully portray the science done by researchers. Topic modeling is an alternative to pre-existing classification or categorization schemes, and can also be used together with them. Topic modeling creates a unified topic basis for a wide variety of analyses: the topic representation provides an immediate structure with which to compare, contrast, and combine text documents.
Since each document in a collection contains multiple topics, researchers can use cluster analyses to identify aggregate research areas within a portfolio and to visualize the growth and decline of these research communities over time. This is particularly true in the area of food safety and security. What exactly do these topics mean? The following list gives a quick idea of the range of questions one could answer using the topic modeling approach--the same types of questions raised by our opening questions:

1. How much does research agency X spend on food safety?
2. How has spending on food safety changed between 2010 and 2012?
3. What documents are similar to this set of documents funded by USDA?
4. What is the topical makeup of documents categorized as food safety by USDA?
5. What documents are related to a given keyword query?
6. What topics do document sets X and Y have in common?

Although we are very aware that innovation and patenting are two separate activities, we will begin by creating links between researchers funded to do work on food security and their publishing and patent activities. In earlier work we have developed such links using various forms of web scraping as well as machine reading of existing CV data. We will use fuzzy matching algorithms to match CIC researcher names to inventor names in a newly updated USPTO dataset that, as of March 2013, covers inventors, assignees, geography, and coauthorships. Finally, we will draw from our work in other contexts to estimate the conceptual model using the empirical framework we have developed.
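The name-matching step can be illustrated with a minimal sketch using Python's standard-library `difflib`. The names below are invented, and a greedy best-match is a simplification: a production pipeline would also threshold the scores and block candidates on institution or geography before accepting a link.

```python
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Rewrite 'Last, First' as 'first last' and lowercase."""
    if "," in name:
        last, first = name.split(",", 1)
        name = f"{first.strip()} {last.strip()}"
    return name.lower()

def similarity(a: str, b: str) -> float:
    """Edit-based similarity between two normalized names, in [0, 1]."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

# Invented example names -- not actual researcher or inventor data.
researchers = ["Jane A. Smith", "Robert Jones"]
inventors = ["Smith, Jane A", "Jones, Robert T", "Lee, Min"]

# Greedy best-match per researcher.
matches = {r: max(inventors, key=lambda inv: similarity(r, inv))
           for r in researchers}
```

Even this crude score handles the common "Last, First" versus "First Last" formatting mismatch between payroll and USPTO records; middle-initial variants survive because the bulk of the string still aligns.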
We will:

1. develop ways of identifying teams and measures of their composition;
2. develop ways of tracking Principal Investigators and the teams attached to their scientific activities;
3. describe the longitudinal evolution of those measures using descriptive statistics and network analysis;
4. extend the descriptive analysis to incorporate an understanding of different funding and institutional structures for different scientific areas;
5. describe interrelations between different team output and performance measures at different levels of aggregation; and
6. run descriptive regressions to tease out the relationships between expenditures on food safety research and subsequent research results.
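The team-identification and network-analysis steps above can be sketched in miniature: build a co-participation graph from grant personnel records, where an edge weight counts how many grants two people share. The grant IDs and names below are invented, and this plain-dictionary representation stands in for whatever graph tooling the project actually uses.

```python
from itertools import combinations
from collections import defaultdict

# Invented grant-to-personnel records, standing in for payroll data.
grants = {
    "USDA-001": ["PI Alpha", "Postdoc Beta", "Grad Gamma"],
    "USDA-002": ["PI Alpha", "Staff Delta"],
    "NIFA-003": ["PI Epsilon", "Grad Gamma"],
}

# Co-participation graph: edge weight = number of shared grants.
edges = defaultdict(int)
for team in grants.values():
    for a, b in combinations(sorted(team), 2):
        edges[(a, b)] += 1

# Degree = number of distinct collaborators per person,
# a simple measure of a researcher's position in the network.
degree = defaultdict(int)
for a, b in edges:
    degree[a] += 1
    degree[b] += 1
```

Longitudinal versions of these measures (degree, edge weights, component sizes computed per year) are the kind of descriptive statistics that would then feed the regressions relating funding to subsequent research results.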
- Funding Source
- National Institute of Food and Agriculture
- Project source
- Project number
- Accession number
- Food Defense and Integrity
- Bacterial Pathogens