Research data

Making sense of research data

Evaluating organic waste management options and cost comparisons

When personal expertise is lacking, people place their trust in experts to facilitate decisions about everything from home additions to medical care.  Governing boards are no different.  They rely on the knowledge of utility directors, staff engineers, and consultants to help them make informed decisions.

But when presented with an avalanche of numbers – from scientific data to cost and operating projections – how do members of city councils and county commissions know if the information contained in that mountain of reports is accurate and unbiased?  How do they know they’re comparing apples to apples and not apples to oranges?

Placing value on studies specific to waste management can be complex.  One report might compare landfilling, incineration, and anaerobic digestion, but leave composting out of the mix.  Another may include composting, but base assumptions on an antiquated windrow system and not a modern, high-rate technology.  Research could unearth reports about a costly public project but never discover a more efficient, cost-effective commercial system.

This is not to suggest such errors or omissions are intentional.  Sometimes, it’s simply a case of “you don’t know what you don’t know.”  But when combined with the fact that detailed financial or operational data from private-sector owners is rarely made available in public spaces, one begins to understand the difficulty in obtaining good data on which to base conclusions and recommendations when doing composting cost comparisons.

The takeaway?  Assume all research is flawed in some way.  No one knows everything there is to know about every subject.  But there are a handful of questions that members of city councils and town boards can ask to help clarify reported numbers, level the playing field, and present a more accurate picture of construction and operational realities.

Who paid for the research?

Perhaps the most significant influence on any research project is the entity that foots the bill.  Even university research is funded by someone … and it may not be the university.  Non-profits may fund research, but they rely on the support of donors.  Government agencies can be funders, but governments are run by politicians.  When the private sector funds studies, the results may never see the light of day if unfavorable to the funding entity.  Student work may not be funded, but it’s still student work.

Was the research scientifically sound?

Some “research” may not be new research at all, but assumptions or conclusions based on a literature review that includes outdated or invalid findings.  Investigations may have been conducted in a manner that does not reflect “good science,” including a lack of statistically representative sampling.  Some findings are more opinion poll than science.  But when sifting through millions of scientific papers for data, researchers won’t always pick up on these types of flaws.

Also know that statistics can be presented in a manner that makes differences look more (or less) important than they really are.  (See an example in this SlideShare title:  Apples and oranges: comparing waste management technologies)

How old is the research data?

Unfortunately, it’s all too common to discover a case built on multiple levels of citations that eventually trickle down to data or conclusions that may not reflect present day realities.  Knowing the date and technological sophistication of the original study will help decision-makers evaluate the value/validity of the conclusions and recommendations included in the consultant’s report.

Don’t accept a current date on a citation at face value.  Follow the citation trail to the date and circumstances of the original research.

Is data based on full-scale operations using current technologies?

Was the data based on bench scale, pilot scale, field scale, or full scale?  Conclusions reached during early stages of product or system development can fail to “scale up” successfully.  Investigations based on dinosaur technologies of 20 or 30 years ago exclude advancements and enhancements made in recent years, distorting findings.

For composting specifically, ensure that systems and technologies are apples-to-apples comparisons using the most current data available.  If evaluating high-rate systems, include successful private-sector facilities, too, not just municipal ones.  Net expense and revenue values per ton processed can vary widely between different types of operations.

Using old data and processing systems for dollar comparisons could greatly skew conclusions when comparing composting to other waste management technologies.

Sometimes, imperfect data is the only data available for composting cost comparisons

When conducting research in a field like composting, where meaningful research is scant, at best, the imperfect may be all there is.  Knowing and accepting this reality, proactively seeking out the most accurate information, and evaluating results based on a variety of studies and viewpoints can only help decision-makers make better choices for their respective communities.

Read the article:  Valuing composting as an infrastructure investment