Planning and Strategizing
To ensure that the principles and operating procedures adopted by the partnership are being followed, and that an effective partnership is being established and maintained, partnerships need to conduct ongoing participatory and formative evaluation of the partnership process.
Such an evaluation involves partners in the design and conduct of the evaluation (e.g., determining the questions to be asked and how data are collected), and provides ongoing feedback of the results to the partners in ways that are understandable and useful (e.g., written reports, verbal presentations). All partners need to be involved in interpreting the findings and applying them to make changes in the partnership process, as appropriate.
It is important to use process evaluation to monitor the health of the partnership. Process evaluation can be done relatively simply and inexpensively; it does not require a full- or part-time evaluator. For example, facilitated reflective discussions can be incorporated into regular board meeting agendas, periodic online surveys can gather anonymous information from partners, and graduate students or consultants can be engaged to conduct annual face-to-face interviews with partners. Even with an informal process, the information gathered can provide valuable insight into the direction of the partnership. For example, an informal evaluation process might entail having the chair of the partnership board interview partners between meetings to assess their satisfaction with the partnership.
Evaluations that identify strengths and areas for growth and improvement will help partnerships make changes that increase their chances for success. Evaluation findings should be presented at least annually to the partnership board (or other governing and advisory bodies) to determine whether changes need to occur within the partnership. The board should allocate time to discuss the value of the evaluations and what response, if any, is needed. Evaluation findings can be used to reflect on and critique the partnership process and relationships.
As partnerships and their membership progress over time, it is especially important to document decisions and their rationale. Documentation helps partnerships create a mutual understanding, and also serves as a record of the decisions made by the partnership, should conflicts arise in the future regarding a particular issue or decision.
Example 7.1.1: Using Evaluation and Indicators of Success
Our partnership has monitored our impact through the evaluation of the Broome Team, the Prevention Research Center, and the individual projects and programs that have been implemented. We have used instruments such as closed-ended questionnaires, monthly reports by each organization, surveys, focus groups, field notes and in-depth interviews. In the early years of our partnership, one evaluator from the University of Michigan was assigned to complete our evaluation. This evaluator used a participatory evaluation model to determine indicators of success. Subsequent evaluators have built on this process, and it is now a collaborative effort where we collectively define our indicators of success:
One of our indicators of success is the integration of our windshield tours into the residency training programs at local hospital systems in our County.
Another indicator of success is the development of an Office of Community-Based Public Health at the University with dedicated staff, whose mission is to connect community and health department partners to faculty and students. A school-wide community-based public health (CBPH) committee was also established to provide policy direction and oversight for the School's CBPH efforts. Our community and institutional partners are supervisors, teachers, and mentors to graduate students inside and outside of the classroom, and they are also involved regularly as classroom presenters.
We must also point to the longevity of our partnership as an indicator of success. Our sustainability even after funding has ended, and the recognition that we will stay at the table despite our differences of opinion, allow us to continue addressing our community's problems. Jokingly, one partner said, "you only get out of this by death." There is some truth in this joke, because a successful partnership requires this level of commitment, a commitment described by one of our founding members as one that goes beyond the 9-5 workday.
We also know that we have been successful because of the increase in the number of community-based organizations that have become engaged in various projects as a result of our team's influence. More community-based organizations now serve on steering committees throughout the community at large.
We also attribute the proliferation of organizations committed to community-based public health to our national work, such as the Prevention Research Center (PRC) National Community Committee, a network of community-based organizations involved in Prevention Research Centers across the country, and the Community-Based Public Health Caucus within the American Public Health Association.
Reprinted by permission from Lippincott Williams and Wilkins http://lww.com
Excerpted from Flint PRC proposal
Example 7.1.2: Using Evaluation for Program Planning
As a result of this formative component of the Detroit Community-Academic Urban Research Center (URC) evaluation, results were presented to the Board in a manner that allowed members to redirect or refocus activities on several occasions. For example, results from the evaluation revealed that many Board members had grown uncomfortable with the URC's stated focus on "maternal and infant health" in its original goals and objectives. The majority of members perceived the actual emphasis of the group to be broader. These results were presented back to Board members, who in turn had a lengthy discussion about the advantages and disadvantages of a more expanded focus for URC interventions. Subsequently, the group decided to change its official focus to "family and community health."
As another example, an issue that arose in the early evaluation results from the in-depth interviews was a possible difference in opinion between academic and nonacademic Board members regarding the types of research in which the URC might be involved. Some of the academic Board members expressed visions of a variety of research endeavors, including research further describing the extent to which specific health problems or their correlates and causes exist in URC communities. The majority of nonacademic Board members, however, clearly stated their belief that the only type of research the URC should be conducting is intervention research. Descriptive or epidemiologic studies were perceived as "research for the sake of research," activities that they felt take away from communities without giving anything in return. Evaluation results regarding this issue were presented back to the Board and some very frank discussions ensued. Subsequently, Board members reached an understanding that the primary work of the URC should be intervention research, or research that provides and evaluates a community-based program.
From Israel BA, Lichtenstein R, Lantz PM, et al. (2001). The Detroit Community-Academic Urban Research Center: lessons learned in the development, implementation and evaluation of a community-based participatory research partnership. Journal of Public Health Management and Practice, 7(5), 1-19.