Upon granting unified accreditation to the University of Maine System in July 2020, our regional accreditor, the New England Commission of Higher Education (NECHE), asked us to prepare a self-study in advance of a Fall 2022 visit by a NECHE-appointed evaluation team. Standard Two: Planning and Evaluation can be found on this page.
Read the full self-study draft

Description
Planning and evaluation in UMS occur at multiple levels. The Board of Trustees (Board) has oversight of UMS initiatives enacted through the Chancellor’s leadership. Those initiatives are currently guided by Board strategic priorities set forth in 2016 and 2018. System-wide committees and work groups with representation from UMS, the universities, and the Law School coordinate planning and evaluation to meet university- and System-level goals.
Central to planning and evaluation is the institutional research (IR) and assessment function supplied by several offices at the UMS and university levels. UMS established its System-level IR office (UMS IR) in 2016 with three full-time staff who report to the Vice Chancellor for Academic Affairs (VCAA). UMS IR provides regular UMS and university reports and data for academic and research needs, data support for UMS committees, and data requested by the Maine legislature.
Five UMS universities maintain institutional research offices, with staffing that ranges from one person (UMF) to four (UM and UMA). University IR supplies reporting and analysis at the university, department, and program levels on retention and graduation, credit hours, and enrollment trends; conducts student surveys; projects enrollment and credit hours; and provides data for academic program reviews.
A third component of UMS institutional research support is the Office of Data, Analytics and Reporting Technology Services (DARTS). DARTS leads and supports data infrastructure, data governance, data literacy, research, and analytics. Together, UMS IR, university IR, and DARTS ground decision-making in valid data suitable for strategic and operational applications.
To ensure System-wide data consistency and integrity, UMS developed the Data Governance Program in 2017. It includes a five-member Council advised by a Data Advisory Committee with roughly 25 representatives from the seven universities. In its first five years, Data Governance has adopted the UMS Data Cookbook (which houses functional and technical enterprise data definitions developed collaboratively across UMS and its functional areas), established consistent definitions for Early College, distance education, and other areas, and created a report certification process.
I. Planning
Strategic planning
Strategic planning in UMS is multi-tiered. While much of it occurs at different levels (e.g. System-wide shared services, collaborative programs, and individual universities), the emphasis is on aligning plans and planning so that UMS priorities remain in step with the mission-specific priorities defined at each university. Strategic planning at all levels informs UMS and university capital and facilities planning, enrollment management, financial planning, academic program development and approval, and student and community service programs. Strategic differentiation is encouraged in the context of UMS priorities designed to meet state and regional needs.
At the highest level, the Board develops and disseminates priorities, goals, and strategies. These are made public through a variety of means, and their content is communicated by university leaders and by the Chancellor and his senior staff. They reflect strategic planning focused on enrollment, student learning, retention, research and economic development, state workforce needs, and related areas.
All UMS universities maintain strategic plans developed in an inclusive manner that engages their stakeholders. University plans are expected to align with UMS priorities while identifying mission differentiation and areas for non-duplicative innovation.
In addition to UMS and university planning, shared services, including IT, finance, and human resources, engage in regular planning processes to ensure they meet the needs of UMS, its universities, and the Law School.
Through unified accreditation, numerous collaborative academic and student-support programs involving two or more UMS universities have been forged. An example is the University of Maine at Fort Kent (UMFK) bachelor of science in nursing (BSN) program offered at the University of Maine at Presque Isle (UMPI). Students begin at UMPI and then continue as UMFK students with no need to relocate, as all BSN classes are offered live on the UMPI campus by UMFK faculty based there. UMFK grants the degree. Planning and operationalization involve administrators from both universities, who annually review staff, faculty, student, curricular, and resource needs and make appropriate changes and investments.
Contingency planning
Planning for unanticipated events originates from the Board and Chancellor in consultation with the presidents. Decisions requiring further development and/or discussion are typically delegated to cross-university functional teams (e.g. finance, IT, HR). Options are reviewed by university leadership and in most cases are routed to the UMS Presidents’ Council for resolution.
Shared services leaders convene regularly, and on an urgent basis as needed, to plan for and respond to unforeseen events, with support and facilitation by UMS senior staff. IT leaders, for example, supply additional technology as needed to support faculty, staff, and students, while the chief business officers work with the Vice Chancellor for Finance and Administration (VCFA) to plan for absorbing an appropriation curtailment or rescission.
Examples of success
The Board’s 2018 strategic priorities included the goal of increasing adult degree completion. Per the recommendations of the June 2018 UMS Adult Degree Completion report, the Adult Degree Completion Committee was established with representation from every UMS university. With support from UMS staff, the committee identified the resources adult completers need. A website provides these resources while also serving as a promotional tool in a statewide marketing campaign. In addition, UMS hired two success coaches who now deliver broad-based services to guide adult learners into UMS and help them make progress in their academic careers.
UMS shared services regularly undertake planning with wide university input. An example is the successful implementation of a new System-wide learning management system (LMS) in 2020. The UMS Educational Technology Advisory Committee (ETAC), which includes faculty and UMS and university staff, shepherded the selection of the new LMS from the request-for-proposals (RFP) stage through final implementation.
The LMS effort was guided by needs and desires expressed by faculty and IT personnel in surveys; that input was fed into the RFP. ETAC led the evaluation of proposals and hosted vendor presentations at the annual UMS Faculty Institute to get further feedback from faculty users. As the new LMS was launched, ongoing support for end-users was delivered through regular trainings, including virtual one-on-one sessions.
II. Evaluation
For the purpose of sustaining a cycle of continuous improvement, evaluation occurs at all levels of the institution, from System reporting on strategic priorities to local academic program review and assessment.
UMS evaluation
System-wide initiatives responsive to Board priorities include those serving first-generation students and adult students and those focused on Early College programs, financial aid, and distance education. Standing committees lead that work and regularly review data informing collaborative planning. For example, the UMS Student Success Steering Committee tracks data on return rates, low/failing grade rates, and stop-outs and shares it with the appropriate university offices and staff for use in supporting students.
Planning efforts for meeting Board priorities are undergirded by standard reporting and topical research (e.g. benchmarking analyses, labor studies) using internal and external data. Examples can be found on the UMS student reports website and dashboard. A set of key performance indicators (KPIs) measuring financial health, enrollment, and student success is monitored regularly by the Board.
In 2018, UMS introduced a Programs for Examination (PFE) process initially designed to “foster broader collaborative discussions among faculty and academic administrators regarding program size in the context of mission, quality, and sustainability.” PFE has since evolved to serve as a continuous improvement process. Through it, the universities use data provided by UMS IR along with information collected from academic units to evaluate the current and projected future state of programs. In spring 2022, PFE was renamed the Annual Academic Program Review (AAPR).
At the end of each AAPR cycle, the chief academic officers present highlights to the Board. Examples of improvements reported in the spring 2021 cycle were the modification of UMA’s Contemporary Music program to permit fully online delivery, which has led to increased enrollments; diversification of undergraduate populations in UMPI’s YourPace competency-based education programs; and the development of an interdisciplinary major at UMF.
Evaluation of shared services
To provide more efficient services and deploy resources strategically, UMS adopted a shared services model in fiscal year 2015. Those services are Finance and Administration; Information Technology; Facilities/Capital Planning and Project Management; Strategic Procurement; Risk/Safety Management; Human Resources; and Equal Opportunity. Each marshals the work of either a mix of UMS and university staff or UMS staff assigned to support a specific university or universities, typically in direct collaboration with university leadership.
University evaluation processes vary to account for local priorities and needs. UMS universities apply a range of assessments to measure progress in achieving university- and program-level strategic goals, including seeking external reviews, conducting campus-level surveys, and tracking student success, financial aid, and university- and departmental-level budget metrics.
Institutional research
As noted above, five UMS universities have institutional research offices. USM has separate IR and assessment offices, while the IR offices at UM and UMA fulfill combined IR and assessment functions. The IR function at UMPI and UMFK is served by a university staff member in another role and by UMS IR staff. University IR provides reporting and analysis at the university, department, and program levels. Regular reports and analyses on low/failing grades and course withdrawal rates, student success, university- and course-level enrollment projections, and admissions trends are among the outputs of university IR, which also supports surveys and provides data for program accreditation and reviews.
Evaluating and informing strategic plans
UMS universities engage in a continuous improvement cycle of strategic planning, implementation, and evaluation. For example, at USM, benchmarking and a campus survey conducted by the Data Innovation Project will guide a community policing model for the university’s Department of Police and Public Safety, while UMPI’s strategic plan outlines a set of key results that will be tracked over the next five years.
At UM, a year-long evaluation of the 2012-2017 Blue Sky strategic plan was followed by the creation of the Strategic Vision and Values framework. In fall 2020, a working group set key indicators to track progress in meeting the goals of that framework. At UMF, a new structure for planning is in development, and the evaluation of elements in the UMF strategic plan completed in 2021 will rely on key performance indicators tied to the plan’s goals, with targets and actuals compared and assessed.
Academic programs
UMS universities regularly track student demand, costs, and revenue associated with academic programs. These and related measures are incorporated in the AAPR process. (In 2017, every UMS university participated in the National Study of Instructional Costs and Productivity. Its structure and definitions did not align well with some universities’ needs, and use of the data was not widespread. At this time, only UM still participates.)
Programs with specialized external accreditation adhere to the evaluation requirements of those accreditors in addition to all NECHE standards. This may entail specialized data management. For example, UMS education programs use TK20 to ensure compliance with the assessment expectations of the Council for the Accreditation of Educator Preparation (CAEP).
Expectations for academic program review differ across universities. For example, USM’s academic program review (APR) requires that the self-study explicitly reflect on the program’s contributions to the university’s Vision 2028 document, and the combined self-studies allow USM to benchmark its progress on that academic vision. All UMS universities use academic program reviews for broader planning purposes, including benchmarking progress on their academic goals.
Collaborative programs
Evaluation of collaborative programs follows the model outlined above. The university granting the degree is primarily responsible for program evaluation, and the general principle applied is that all faculty and courses are evaluated regardless of university affiliation. For example, in the joint UMA-UMPI Cybersecurity program, UMPI is the degree-granting university, and its procedures govern program evaluation.
Another example is the UM Graduate School of Business, a collaboration of UM and USM faculty. Evaluation is conducted at multiple levels: two comprehensive committees, the Curriculum Committee and the Steering Committee, benchmark against other programs to develop suggestions for curriculum improvements and for strengthening the learning outcomes used in the assurance of learning. Direct assessments and surveys of completers are among the means used to assess learning outcomes.
Student success and satisfaction
As noted above, the universities regularly track student success metrics such as retention and graduation rates, low/failing grades and course withdrawal rates, and GPA. Some also share data with external entities such as the Consortium for Retention Data Exchange and the Student Achievement Measure. There has also been growing use of the EAB Navigate tool, which provides data collection and analysis with predictive capabilities for faculty and professional advisors (see Figure 6, p. X).
All UMS universities survey students, faculty, and staff to gauge satisfaction and collect information to inform organizational change. All participate in the National Survey of Student Engagement (NSSE) every three years and in the Great Colleges to Work For survey. Data are shared with university stakeholders and used for planning. The universities also periodically survey students about learning, campus activities, technology tools, dining services, and campus climate. In the first two years of the COVID-19 pandemic, several universities surveyed their constituencies about health and safety contingency planning.
Some UMS universities consistently survey their students at or after graduation. For example, USM surveys seniors when they apply for graduation; those data, along with NSSE data, are used to track progress toward student satisfaction goals. Similarly, UM has conducted its Life After UMaine survey for over 20 years. Those data are used to track the percentage of graduates employed or in graduate school and the extent to which graduates felt prepared for their post-college endeavors. The UM Graduate School surveys its students upon graduation and uses the feedback for planning purposes.
Appraisal
Collaborative strategic planning
As outlined in the Institutional Overview, UMS is currently engaged in a System-wide strategic planning process. The process follows a clear published timeline, includes numerous opportunities for stakeholder engagement, and has been designed to harmonize with ongoing unified accreditation efforts and the development of this self-study.
Unified accreditation has united UMS and its universities in a way that compels intentional strategic planning. Planning processes and the communications attendant to them are evolving accordingly. As evidence of this, numerous UMS initiatives, shared services, and academic programs plan collaboratively.
For example, UMS Title IX processes and compliance operate as a collective, with centralized procedures and shared support for policy implementation and case management, and Title IX offices and their staff support multiple universities. The Title IX hearing officers and advisors comprise an inclusive and consultative group acting in a unified way.
As more functional areas have begun shared planning across the universities, existing roles and responsibilities are shifting. Policies addressing expectations of staff and use of resources are being reviewed and revised to support effective collaboration and cooperation in the unified environment. This includes an updating of UMS Administrative Practice Letters (APLs) led by the VCAA in consultation with the UMS Faculty Governance Council and the Chief Academic Officers.
Currently, the universities and Law School are accountable for meeting individual performance benchmarks, but these benchmarks are not necessarily used for System-level planning. Aggregated benchmarks have not yet been developed to evaluate whether shared services and other System-wide functional areas provide services responsive to specific university-level goals and objectives, nor are those areas yet evaluated in view of such goals.
Contingency planning
In most forms of contingency planning, there is inherent tension between UMS-level and university-level priorities. While the environment is generally collaborative, there are legitimate but sometimes unproductive competitive pressures to preserve and protect individual university priorities. For instance, when planning for high-demand academic and workforce development programs, questions about which university or universities stand to benefit most can invite disagreement and competition.
Assessing adult learners, first-generation students, and related populations
Data are regularly provided to committees tasked with improving student success, degree completion, and access for adult learners and first-generation students. However, a framework for the systematic (global) evaluation of these and related initiatives does not exist. In short, although data and analyses support this work, an assessment cycle of continuous improvement for these populations has not yet been instituted.
Improving coordination of IR activity with UMS- and university-level planning
University IR capacity varies across UMS, which can sometimes be a barrier to System-wide evaluation of programs and priorities. Improvements have been made since the creation of the Data Governance structure and the UMS IR office, but more work is needed to ensure adequate coverage of needs among the universities. For example, under ideal conditions, all UMS universities and the Law School would follow the same or similar schedules for collecting information throughout the student life cycle. Because IR and assessment functions are not staffed equally, additional responsibility sometimes falls to staff in other university functional areas (e.g. student records, Academic Affairs) when data are needed for System-level planning.
Projection
As noted in our response to item 11.b on the Institutional Characteristics Form, UMS universities do not follow a uniform definition of “non-credit activity,” and do not share a single database where non-credit enrollments are entered. We are pursuing avenues to accomplish both.
Multi-university academic program planning, evaluation, and assessment
The evaluation structure for multi-university collaborative academic programs relies partly on processes originally designed for evaluating single-university programs and courses. While all students are assured the chance to complete a student evaluation of teaching in courses delivered jointly by two or more universities, a cycle of assessment tailored for multi-university programs and leading to the continuous improvement of those programs has not yet been developed. (See Standard Eight for an update on our progress in assessing academic programs.)
More effective evaluation may require revisiting current KPIs to confirm that they continue to reflect the missions and strategic plans of the universities and Law School in relation to UMS priorities. KPIs should be associated with defined goals in order to provide an informed and responsive basis for UMS oversight and support for university and Law School strategic plans and planning.