SOLICITATION NOTICE
R -- Integrating Environmental Conflict Resolution Evaluation Results into Practice
- Notice Date
- 5/1/2003
- Notice Type
- Solicitation Notice
- Contracting Office
- Morris K. Udall Foundation, Office of the Director, 130 S. Scott Avenue, Tucson, AZ 85701
- ZIP Code
- 85701
- Solicitation Number
- HEW-EVAL-1.1
- Archive Date
- 5/31/2003
- Point of Contact
- Pat Mahalish, Sr. Administrative Assistant, Phone 520-670-5299, Fax 520-670-5530; Mindy Brugger, Administrative Assistant, Phone 520-670-5299, Fax 520-670-5530
- E-Mail Address
- mahalish@ecr.gov, brugger@ecr.gov
- Description
- REQUEST FOR QUOTATIONS

Project Title: Integrating ECR Evaluation Results into Practice

This is an announcement that the U.S. Institute for Environmental Conflict Resolution (the U.S. Institute) is soliciting quotations from qualified professionals to analyze and interpret evaluation data for environmental collaborative problem-solving and conflict resolution (ECR) cases. Part of the effort will involve aggregate quantitative and qualitative analysis of new evaluation data; part will involve comparing the results of that analysis with findings in the published ECR literature. The goal of this project is to identify the key ingredients of ECR success and to communicate these findings within the ECR field to improve practice.

Background

The U.S. Institute for Environmental Conflict Resolution is a federal program established by the U.S. Congress to assist parties in resolving environmental, natural resource, and public lands conflicts. The U.S. Institute serves as an impartial, non-partisan institution providing professional expertise, services, and resources to all parties involved in such disputes. See www.ecr.gov for more information on the U.S. Institute.

In 1999, the U.S. Institute and the Policy Consensus Initiative (PCI) began an inquiry into the feasibility of developing program evaluation guidance for state and federal agencies that administer public policy and environmental conflict resolution programs. With funds from the Hewlett Foundation, PCI has extended the initial collaboration involving the U.S. Institute, the Massachusetts Office of Dispute Resolution, and the Oregon Dispute Resolution Commission (ODRC) to additional states. At the same time, the U.S. Institute has been working with federal agencies to enlist their cooperation and involvement. The initial collaborative effort focused on developing a conceptual framework for evaluating ECR programs: a logic model of expected outcomes for each of the agencies' program areas.
With respect to ECR cases, the logic model identifies necessary practice factors and a set of both process and agreement outcomes. In addition, specific questionnaires were developed to collect the data needed to measure the degree to which the practice factors were in place and the outcomes were achieved. The U.S. Institute, ODRC, and the Florida Conflict Resolution Consortium (FCRC) have adopted and are using the same theory-of-practice model and essentially the same questionnaires to evaluate their cases. In addition, the U.S. Institute is working with several other federal agencies to evaluate their ECR cases. This collaborative effort will produce an initial combined Excel database, collected in a consistent manner, for 30-40 cases. Analyzing this aggregate database and interpreting the results should add substantively to the growing body of information from other ECR research and evaluation efforts.

Project Partners

The partners providing cases or case evaluation data for this study include:
- Oregon Dispute Resolution Commission
- Conflict Prevention and Resolution Center, Environmental Protection Agency
- Office of Collaborative Action and Dispute Resolution, Department of the Interior
- Federal Energy Regulatory Commission
- Florida Conflict Resolution Consortium
- Other state programs with available and compatible data sets

Overview of the Project Inquiry

The aggregate analysis component of this evaluation effort will use the theory-of-practice model adopted by the U.S. Institute, ODRC, and FCRC as the basis for this inquiry. (An overview of the U.S. Institute's ECR evaluation system, a list of variables, and a draft ECR evaluation-report template are available under the "What's New / Upcoming Events" section of the U.S. Institute website at www.ecr.gov.) The primary goal of this evaluation effort is to understand the key ingredients of success (i.e., achieving expected process and agreement outcomes).
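Much of the planned inquiry turns on relationships between case-level variables: practice-factor ratings, measures of case complexity, and process and agreement outcomes. Purely as an illustration of the kind of aggregate analysis being solicited, the sketch below uses hypothetical column names and invented values; these are stand-ins, not the evaluation system's actual variables or data:

```python
# Illustrative sketch only: the column names and values below are hypothetical
# stand-ins, not the evaluation system's actual variables or case data.
import pandas as pd

# One row per case, as in the planned combined database of 30-40 cases.
cases = pd.DataFrame({
    "practice_factor_score": [4.2, 2.1, 3.8, 4.9, 2.7],  # mean best-practice rating, 1-5
    "num_parties": [6, 14, 9, 5, 22],                     # one possible complexity measure
    "agreement_reached": [1, 0, 1, 1, 0],                 # binary agreement outcome
})

# Relationship between best-practice factors and reaching agreement.
# With a binary outcome, Pearson's r here is the point-biserial correlation.
practice_r = cases["practice_factor_score"].corr(cases["agreement_reached"])

# Relationship between case complexity and the agreement outcome.
complexity_r = cases["num_parties"].corr(cases["agreement_reached"])

print(f"practice factors vs. agreement: r = {practice_r:.2f}")
print(f"number of parties vs. agreement: r = {complexity_r:.2f}")
```

With only 30-40 cases, simple bivariate measures of this kind, interpreted cautiously and checked for robustness, are likely more defensible than heavily parameterized models; the same statistics are available through standard correlation procedures in SPSS.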
While exploring and explaining variations in the achievement of outcomes, the relationships between variables such as practice factors and specific outcomes will be analyzed. The study will also review the published ECR literature and compare and contrast the findings of the aggregate analysis with the findings in that literature. Findings will then be framed in terms of lessons for program improvement and ECR practitioner training.

The types of questions to be addressed through this ECR evaluation effort (both the aggregate analysis and the review and synthesis of published ECR literature) include, but are not limited to:

1. What aspects of ECR best practices are most important for collaborative problem-solving and conflict resolution processes to be successful?
2. Which practices need to be employed more effectively by ECR practitioners and program managers?
3. What is the relationship between the complexity of cases (duration of the controversy, the number of parties, etc.) and the process and agreement outcomes?
4. What is the relationship between the degree to which best-practice factors are employed and whether or not agreement is reached?
5. Which of the services provided by the neutral(s) are most important in determining the parties' overall satisfaction with the neutral(s) and with the process?
6. Does prior experience of the parties with collaborative processes influence their level of satisfaction with ECR processes?
7. What elements of the process influence participants' overall satisfaction with the process?
8. What factors influence the participants' perspectives on whether the ECR processes are better than any other processes they are aware of?
9. What elements of the agreement (flexibility, the parties' understanding of its key elements, etc.) influence whether the participants feel the agreement will last?
10. To what extent does reaching agreement affect participants' satisfaction with the process?
11. In cases where full agreement is not reached, what factors influence (positively or negatively) satisfaction with the process?

Required Professional Expertise

It is anticipated that a two- or three-member analysis and review team will be assembled to perform the required tasks (one member may be qualified to perform more than one function, and individuals may submit responses for more than one of the described team roles). The team will reflect the following competencies:

* Statistical evaluation: a program evaluation expert who is familiar with evaluation in the ECR field, possesses the methodological expertise needed to design and interpret the results of aggregate analyses of case data, and possesses (or has access to) the analytical skills needed to conduct such analyses;
* ECR research: a recognized expert in research on the effectiveness of ECR and public involvement activities, who is thoroughly familiar with the associated literature and able to assess the rigor and generalizability of the findings of both the aggregate case analysis and other empirical studies; and
* ECR practice: an experienced facilitator of collaborative planning and problem-solving processes and/or a mediator of environmental disputes, who is a recognized leader in the field and has the expertise to assess the utility of the evaluation findings to ECR practice.

Individual statements of interest and quotations will be solicited from, and individual contracts negotiated with, each member of the analysis and review team. The U.S. Institute also encourages interested candidates to identify and assemble a complete team and to submit a combined response.

PART A: STATISTICAL EVALUATION

Scope of Work

The U.S. Institute is seeking an evaluator with statistical expertise to:
1. Conduct aggregate statistical analysis of ECR case data from several similar data sets.
2. Work with the other members of the review team to distill and interpret key findings from the aggregate analysis and the published results of other ECR studies.

The successful candidate will be responsible for the research design, in consultation with the other team members and U.S. Institute staff. The candidate will conduct the aggregate case data analysis and address data reliability and validity issues. The candidate will report results in the form of presentation slides as well as a brief written evaluation report. Both formats should summarize the findings, describe the statistical relationships, and interpret the statistical results in light of the key questions to be addressed and the implications for ECR practice. The candidate will work with the other members of the team to review key findings from the aggregate analysis and compare these results with published results from other studies. The candidate will assist the other team members in preparing a draft report of the combined findings for distribution to a broader review group (i.e., approximately 30 practitioners, program managers from state and federal ECR programs, trainers, and researchers). The candidate will present the study findings at a two-day workshop with the broader review group to be held in Tucson (tentatively scheduled for November 20-21, 2003). The candidate will also participate in workshop discussions, assisting with the identification and prioritization of significant themes or issues for communication to the field to improve practice.

Evaluator Qualifications

Specific qualifications and expertise sought are listed below:
- Training and extensive experience in quantitative research methods and statistical analysis (experience with Excel and SPSS preferred).
- Training and experience with program evaluation.
- Familiarity with ECR.
- Ability to work both independently and collaboratively.
- Ability to communicate professionally and effectively, both orally and in writing.
- Availability from July 2003 through February 2004 to complete project-related work and participate in project-related activities.

The Contractor will be selected by and contract with the U.S. Institute for Environmental Conflict Resolution. The U.S. Institute will provide project oversight.

PART B: RESEARCHER/PRACTITIONER

Scope of Work

The U.S. Institute is seeking two ECR researchers and/or practitioners to help design, organize, and review findings from the aggregate ECR case data analysis (conducted by the team evaluator) and the published results of other ECR studies, state program evaluation efforts, and other recently published research pertinent to improving ECR. The successful candidates will be responsible for providing input to the research design for the aggregate ECR case data analysis. The candidates will conduct and compile a literature review of recent evaluation and research findings pertinent to improving ECR. The candidates will systematically review and synthesize the findings from the aggregate case data analysis and the published results of other studies to distill the key findings on ECR success that will be useful to practitioners and program managers working in the ECR field. The candidates will be responsible for submitting a draft report of findings (including the literature review) to the U.S. Institute for distribution to a broader review group (i.e., approximately 30 practitioners, program managers from state and federal ECR programs, trainers, and researchers). The candidates will also present the preliminary findings at a two-day workshop with the broader review group in Tucson (tentatively scheduled for November 20-21, 2003).

Researcher/Practitioner Qualifications

Specific qualifications and expertise sought are listed below. (Note: we are looking for two team members; each may have strengths primarily as an ECR researcher with knowledge of ECR practice, or as an ECR practitioner with knowledge of research methods. However, a single candidate with strong credentials as both a researcher and a practitioner could qualify for both roles.)
- Extensive knowledge of and experience with environmental conflict resolution (ECR) processes, either from a research or a practice perspective.
- Familiarity with research methods (for an ECR practitioner), or expertise with research methods and familiarity with statistical techniques (for an ECR researcher).
- Experience and ability to work both independently and collaboratively.
- Availability from July 2003 through February 2004 to complete project-related work and participate in project-related activities.

The Contractor will be selected by and contract with the U.S. Institute for Environmental Conflict Resolution. The U.S. Institute will provide project oversight.

Criteria for Selection

Selection of successful candidates for each of the positions will be based on the qualifications described above for each position, availability, any identified limitations, and cost.

Submission of Quotations

The U.S. Institute is accepting expressions of interest and quotations for each of the two types of positions on the review team. Individual candidates may submit a quotation for one or both positions. Teams may self-assemble and submit a quotation for the entire team. Candidates should provide the following information. Responses must be limited to no more than five typed pages, appended by three work samples (reports of relevant and representative projects). Responses must include:
- The position sought (Evaluator and/or Researcher/Practitioner).
- A description of specific skills, expertise, and experience related to the described qualifications (include any support personnel that you propose to use).
- An estimate of the labor hours and total cost to complete the work for the indicated positions.
- Hourly labor rates, including rates for travel.
- Total project costs (not including travel) cannot exceed $40,000.
- Your availability to begin work on this collaborative project with the U.S. Institute from July 2003 through February 2004. Please specify any large periods of time during which you may not be fully available, or any constraints that may influence your ability to complete this project on schedule.
- A description of any other limitations of which the U.S. Institute should be aware.

Funding and Scheduling

Funding available for the analysis and review team is $40,000 plus travel costs, with funding levels for each of the team members to be determined based on negotiated scopes of work. The work will be conducted from July 1, 2003 through February 28, 2004. A tentative project timeline follows.

Tentative Project Timeline

(Note: The project start date may be revised to 9/1/03, necessitating a two-month revision of the project timeline.)

Major Activities, Outputs, and Timeline

Task 1: Conduct Aggregate Analysis and Review Other Findings (Review Team)
Conduct the aggregate analysis, and identify and review findings from other federal and state program evaluation efforts and other recently published research pertinent to improving ECR performance and based on systematic multi-case or meta-analysis studies. The collaborative review group members will recommend to the consulting team additional references they may incorporate into their review.
- Project start date: 7/1/03
- Project planning conference call: July 1, 2003
- Research and review of other evaluation efforts and published research: 7/1/03-9/15/03
- Data analysis: 8/1/03-9/15/03

Task 2: Review Findings and Draft Report (Review Team)
The consulting team will distill from the aggregate analysis and the published results of other studies the key findings on performance that are methodologically robust and will be useful to practitioners and program managers working in the field.
A draft report will be reviewed at a meeting in mid-September in Tucson.
- Meet in Tucson, review findings, and draft report: mid-September 2003

Task 3: Circulate Findings (U.S. Institute and Review Team)
The consulting team will revise their report, and the U.S. Institute will circulate the revised report to the larger collaborative review group of practitioners, program managers from state and federal ECR programs, and researchers. The U.S. Institute will make preparations for the collaborative review group meeting in Tucson in November 2003.
- Circulate findings to larger collaborative review group: 10/15/03-11/1/03

Task 4: November Workshop (U.S. Institute and Review Team)
A two-day workshop will be conducted to focus on the consulting team's findings and to identify and prioritize significant themes or issues that need to be communicated to the field to improve practice.
- Workshop meeting in Tucson: November 2003

Task 5: Prepare Final Report (U.S. Institute and Review Team)
A final report on the research and workshop findings and recommendations will be circulated broadly through the website and email listserv. Publications and presentations will be developed and delivered at the summer 2004 U.S. Institute conference.
- Completed final report: December 2003 or early 2004

Task 6: Develop E-Network (U.S. Institute)
Develop an e-network of ECR trainers, starting with roster members and ACR section members, but soliciting more broadly as well (for example, to university-based professors and curriculum designers). Seek their advice on how they currently address the matters raised in the workshop findings in their training activities, and solicit their recommendations on how the new findings might be integrated into future ECR training programs.
- Initiate training network: January 2004

Task 7: Create Training Module (U.S. Institute)
Identify information and training needs pertinent to the workshop recommendations, and develop an orientation session for trainers and a demonstration-training module for practitioners and program managers to be presented at the summer 2004 U.S. Institute conference.
- Develop demonstration training module: February 2004

RFQ Informational Conference

An informational conference call will be held on May 13 at 11:00 a.m. (MST) to answer questions about the solicitation and the project. Interested parties must contact Patricia Orr at orr@ecr.gov by May 9 to obtain call-in information for the conference.

Distribution of the Announcement

Readers of this announcement are encouraged to distribute it to others who may be interested in responding to the RFQ.

Deadline

In order to be considered, submissions must be received before 5:00 PM (MST) on Friday, May 30, 2003. Submit information, work samples, and references to:

Patricia Orr, Program Evaluation Coordinator
U.S. Institute for Environmental Conflict Resolution
130 South Scott Avenue
Tucson, AZ 85701
Fax: (520) 670-5530
Email: orr@ecr.gov

Submittals by mail, fax, or email are permissible. Quotations will be reviewed the first week of June 2003. We anticipate that a selection will be made and a contract in place by the end of June.

Reference Materials

The following reference materials can be found under the "What's New / Upcoming Events" section of the U.S. Institute website (www.ecr.gov):
- U.S. Institute ECR Evaluation Overview
- U.S. Institute ECR Evaluation Report Template (Draft)
- Evaluation Dataset: List of Common Variables
- Record
- SN00316353-W 20030503/030502074115 (fbodaily.com)
- Source
- FedBizOpps.gov