Posted on July 14, 2011
My bishop sent a letter to Indiana clergy calling for participation in the Call to Action process. Here is a direct quote from the letter:
This denominational focus on Vital Congregations means that every United Methodist congregation in the world will be setting goals around these five factors (which have been proven through the Call to Action study to be those areas which best measure the vitality of a local congregation):
Disciples worship – average worship attendance
Disciples make new disciples – professions of faith
Disciples grow – number of small groups for faith development
Disciples engage in missions – number of people engaged in mission
Disciples give to mission – amount of money given to mission
It all sounds good, yes?
But there are a couple of problems.
First, the Call to Action study does not prove anything. It provides some statistical evidence for some conclusions. How much evidence it provides we cannot know, because the actual data and statistics that would be needed to make that judgment have not been shared.
Second, the five “factors” listed above come from two different places in the CTA report. Here is a brief summary of the process.
The report first collected a set of measurements that it said were “proxies” for congregational vitality. These included things such as worship attendance, giving by members, and total membership. These were not causes of vitality, but the way we measure it. They were identified in large part because they were numbers the denomination already collects.
The consultants then engaged in a statistical process known as exploratory factor analysis to group these measures of vitality into “factors.” Using these groups, they sorted UM congregations into high, medium, and low vitality based on arbitrary dividing lines.
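To make the mechanics concrete, here is a rough sketch of that kind of analysis in Python, run on made-up data. The proxy measures, the single-factor model, and the one-thirds cutoffs are my illustrative assumptions, not details taken from the CTA report.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_churches = 300

# Hypothetical proxy measures of the kind the denomination already collects.
proxies = np.column_stack([
    rng.lognormal(4, 1, n_churches),   # average worship attendance
    rng.lognormal(3, 1, n_churches),   # professions of faith
    rng.lognormal(10, 1, n_churches),  # member giving (dollars)
])

# Exploratory factor analysis: compress the proxies into one
# underlying "vitality" score per congregation.
fa = FactorAnalysis(n_components=1, random_state=0)
vitality = fa.fit_transform(proxies).ravel()

# Arbitrary dividing lines: split the scores into thirds.
cuts = np.quantile(vitality, [1 / 3, 2 / 3])
labels = np.digitize(vitality, cuts)  # 0 = low, 1 = medium, 2 = high
print(np.bincount(labels))
```

Note that nothing in this step validates the cutoffs; moving `cuts` would reshuffle which congregations count as “high vitality.”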
Then they used another statistical tool, multiple regression, to figure out which of the data we have about congregations best account for some being high vitality and others being medium or low. From this regression, the consultants identified four “drivers,” including things such as the number of small groups and the mix of worship styles at a church.
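That second step might look something like the following sketch. The driver variables, the coefficients, and the data are all hypothetical, invented for illustration; I am only showing what a regression of this kind produces.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 300

# Hypothetical candidate "drivers" for each congregation.
small_groups = rng.poisson(5, n)        # number of small groups
worship_styles = rng.integers(1, 4, n)  # number of worship styles offered
pastor_tenure = rng.uniform(1, 20, n)   # years of pastoral tenure

X = np.column_stack([small_groups, worship_styles, pastor_tenure])

# Simulated vitality score, correlated with small groups by construction.
vitality = 0.5 * small_groups + rng.normal(0, 1, n)

# Multiple regression: estimate how much each driver is associated
# with vitality, holding the others constant.
reg = LinearRegression().fit(X, vitality)
print(dict(zip(["small_groups", "worship_styles", "pastor_tenure"],
               reg.coef_.round(2))))

# The coefficients describe association, not causation: a large
# coefficient says high-vitality churches tend to have more small
# groups, not that adding small groups will cause vitality.
```

The final comment is the crux: regression output like this can suggest where to look, but it cannot by itself establish the “drivers” as causes.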
Aside from my general concern that the statistical tools used in this analysis do not prove anything about the causes of vitality, we also continue to act as if these tools are magic black boxes that spit out truth. What statistical tests actually do, when they are reported in ways that allow it, is provide evidence that can be used to engage in further thinking, research, and action. We treat the report like holy writ either because we do not understand it or because we want to appeal to the authority of numbers.