Organizational Science and the Role of Research and Development

This is the first of a two-part series on the role of Research Grade Evaluation (RGE) scientists at the Earth Resources Observation and Science (EROS) Center.

[Figure: Venn diagram highlighting data science. Caption: The significance of Data Science in an era of “data is the new oil.”]

The 13 RGE scientists in the Center are not only involved in scientific research and development, but also help to lead conversations here at the Center, within their scientific communities, and indeed throughout the USGS, on the ways science can better monitor and protect our planet.

This first article, written by Integrated Science and Applications Branch Chief Pete Doucette, provides his thoughts on the role of Research and Development as it pertains to RGE scientists within organizational science. Tomorrow, several RGE scientists at EROS will discuss how the RGE program works, the value and expectations it brings to their individual careers, and what it means to the reputation of the Center as a whole.

P. Doucette

As a career scientist who has worked in federal, industrial, and academic settings, let me begin by submitting that the pursuit of robust organizational science starts with sound business principles. As with any business model, objectives and outcomes should not be discerned in a vacuum. Thoughtful planning, execution, and communication are necessary for science endeavors, or more specifically, for research and development (R&D).

The OPM Research Grade Evaluation Guide (RGEG) portrays R&D as dichotomous: the purpose of research is “extending knowledge and understanding,” while development is about “new or improved products, processes, and techniques.” While obvious overlap exists between these concepts, the RGEG goes on to recognize that research assignments are “difficult to define in terms of expected outcomes and measurable results,” whereas development assignments “have predictable outcomes or measurable results.” The reality is that RGE-assigned scientists regularly find themselves supporting a range of R&D activities that satisfy shifting organizational roles and needs. Still, a fundamental intent of research is to motivate the pursuit of understanding, inspiration, and imagination. Although every useful method, service, or product started with the cultivation of a good idea, general expectations tend to dwell on tangible outcomes. But as Epictetus noted, “No great thing is created suddenly.”

Realizing the return on investment in “research” requires perseverance; the return is reaped through the advancement of scholarship and innovation. Regarding scholarship, RGE science advancement incentives closely parallel those of tenure-track faculty positions in academia. For example, a primary measure of accomplishment and reputation is tied to publication authorship, i.e., the “publish or perish” model. Within that model exists a conventional weighting system that generally measures rigor from non-refereed to fully peer-reviewed works, and maturity across conference proceedings, reports, journals, and books. Significant prestige factors include lead authorship and the stature of a publisher’s title (e.g., as measured by standardized “impact factors” for journals). In actuality, the ability to secure funding is generally the most significant driver for advancement in academia, for which publication productivity and prestige demonstrate the requisite credentials. By contrast, in engineering settings (industry and academia) there can be more incentive to secure intellectual property that generates revenue, e.g., patents, than to publish. The takeaway message is that framing and understanding the drivers of incentive within a science organization is a critical business principle for effective management.

RGE advancement criteria (as in academia) also emphasize community impact in addition to publication output. This is an important consideration, because over-emphasis of the latter can unwittingly reward a quantity-over-quality mentality. Ivory towers of scholarship are not immune to the unintended consequences of the pressure to publish, so advancement incentives must be thoughtfully balanced to consider aspects of community impact. Examples include championing a science agenda on behalf of a community; holding leadership positions in professional societies and conference committees; proactive communication and outreach to diverse audiences; public speaking, teaching, and mentoring; and contributing to organizational science policy development, among others. It must be stated that, regardless of how publishing motivations and practices have evolved over time, the virtuous impact of an open system through which to share research ideas and results is undeniable.

While the RGE scientist is afforded ample freedom to explore new ideas in the quest for understanding, individualized research goals must ultimately be aligned with organizational needs. Perhaps the foremost challenge facing the institution of RGE science is articulating research value when competing interests or perceptions arise regarding organizational priorities. A common case is generating data products versus analysis products (e.g., via publication). The concept of value often becomes amplified by qualities of measurability, to which data products generally lend themselves better. The intangible qualities of value, captured by Oscar Wilde’s observation that “the cynic…knows the price of everything, and the value of nothing,” often prove more vexing to defend. The point is that alternative and even adversarial viewpoints will exist to some degree, and honest communication among researchers, developers, managers, and sponsors is critical to head off counterproductive directions. Expectations must be clearly articulated and understood by all parties concerned, with potential compromise solutions identified and managed accordingly. For example, while temporary emphasis on one end of the R&D spectrum may serve “situational” needs, a persistent emphasis can lead to adverse organizational imbalance over time.

In an era of “data is the new oil” that is fueling a remarkable surge in data-driven analytics, my belief is that an infusion of data science holds the potential to bring a healthy cultural balance to the business of R&D. A fundamental ambition of data science is the pursuit of inference, i.e., turning data into knowledge, through an integrated multidisciplinary approach (e.g., domain knowledge, math and statistics, machine learning, and computer science/IT). As such, data science is hardly a new concept. But consider the sheer enormity of the challenge before us: an analytic environment that can combine massive data across heterogeneous dimensions of observation type and phenomenology (e.g., spectral, spatial, radiometric, temporal, textual, contextual, thematic, etc.), from which to perform inference.
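To make the fusion-for-inference idea concrete, here is a minimal, hypothetical sketch (the data values, class names, and feature dimensions are invented for illustration, not drawn from any EROS product): features from two heterogeneous observation types, spectral and temporal, are concatenated into a single vector per sample, and a simple nearest-centroid rule turns the fused data into a thematic label.

```python
# Illustrative sketch only: fuse heterogeneous features, then infer a
# thematic class with a nearest-centroid rule. All values are toy data.
from statistics import mean

def fuse(spectral, temporal):
    """Concatenate per-sample feature lists from two observation types."""
    return [s + t for s, t in zip(spectral, temporal)]

def centroids(samples, labels):
    """Compute the mean feature vector for each class label."""
    by_label = {}
    for vec, lab in zip(samples, labels):
        by_label.setdefault(lab, []).append(vec)
    return {lab: [mean(dim) for dim in zip(*vecs)]
            for lab, vecs in by_label.items()}

def classify(vec, cents):
    """Assign the label of the nearest centroid (squared Euclidean distance)."""
    def dist(lab):
        return sum((a - b) ** 2 for a, b in zip(vec, cents[lab]))
    return min(cents, key=dist)

# Toy training samples: two spectral bands plus one temporal-trend feature.
spectral = [[0.1, 0.8], [0.2, 0.7], [0.9, 0.1], [0.8, 0.2]]
temporal = [[0.05], [0.10], [0.90], [0.85]]
labels = ["water", "water", "forest", "forest"]

fused = fuse(spectral, temporal)
cents = centroids(fused, labels)
print(classify([0.15, 0.75, 0.08], cents))  # prints "water"
```

The sketch is deliberately trivial; real analytic environments substitute far richer models for the centroid rule, but the structural point stands: inference quality depends on how well heterogeneous observations are integrated before any model sees them.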

To this end, conscientious integration that employs data science principles is needed to close the gap between domain knowledge and a rapidly expanding technological capacity to interrogate it. It is where state-of-the-art meets art-of-the-possible, or, more likely, the unfathomable. Think “cloud” and its fastest area of growth, AI/ML (deep learning), which feasts on data in ways that remain beyond the experience of the typical scientist. What it all boils down to is this: both researcher and developer share in the responsibility to effect cultural change, largely by recognizing that we are not disturbed by change per se, but by the perceptions we form of it. If change is indeed the only constant over time, then embracing adaptation and agility is key to enduring it.