Southern Association of Colleges and Schools Commission on Colleges – Fifth-Year Interim Report

4. Institutional effectiveness: educational programs [CS 3.3.1.1]

Standard 4: The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in the following area: educational programs, to include student learning outcomes. (Comprehensive Standard 3.3.1)

Status: In Compliance

Berea College is committed to intentional, meaningful, and comprehensive assessment in all of its educational programs: academic disciplinary and interdisciplinary programs, the Labor Program, and General Education. In all of these programs and at every level, from individual courses and experiences to the entire program (the major, the four-year general education program, and the four-year well-structured labor experience), the College ensures that there are clearly articulated student learning outcomes, that there is ongoing assessment of the extent to which the outcomes are achieved, and that targeted changes are made on the basis of a wide range of evidence. A regular practice of assessment ensures that the assessment cycle is ongoing and integral at every level. The following narrative includes three major parts describing the expected outcomes, assessment, and improvement of 1) the Academic Programs, 2) the Labor Program, and 3) the General Education Program.

Academic Programs

Annual Departmental Effectiveness Reports (DERs) and Departmental Self-Studies (conducted every ten years) comprise the core of academic program assessment. All academic programs at Berea College identify specific learning outcomes for students; all programs articulate and review regularly the opportunities available for students to achieve these learning outcomes; all programs engage in regular assessment that draws upon a wide range of evidence; and all programs make improvements based upon these findings. In the annual Departmental Effectiveness Reports, programs consider four assessment components:

  1. learning outcomes,
  2. opportunities for students to achieve the learning outcomes,
  3. data collection and assessment methods, and
  4. findings and improvements.

Learning outcomes

Berea’s academic programs began drafting learning outcomes more than twenty years ago. All programs (majors and minors) now have well-articulated learning outcomes that are attentive to:

  • disciplinary and/or interdisciplinary expectations and trends
  • the context of Berea College, its mission, and the wide range of learning opportunities on campus
  • a thoughtful understanding of pedagogy and the complex nature of learning
  • the understanding that learning includes knowledge, skills, behaviors, and habits of mind.

Below are examples of learning goals from the academic programs.

Knowledge Learning Outcomes

  • Biology majors will know “the fundamental concepts of biology, including cell biology, ecology, evolution, genetics, organismal diversity, and organismal structure and function.”
  • Political science majors will “demonstrate knowledge of current political information and political events.”
  • Philosophy majors will understand “significant figures and texts in the areas of philosophy.”

Skills Learning Outcomes

  • Chemistry majors will have “laboratory skills in the field of chemistry including instrumentation, synthesis, and wet lab analysis.”
  • Foreign Language majors will “speak effectively in the target language.”
  • Nursing majors will “prepare written documents with accuracy, clarity, and grammatical proficiency utilizing current APA format.”
  • Sustainability and Environmental Studies minors will “master practical skills for increasing household resilience in one or more areas including food, energy, water, shelter, and health.”

Behavioral Learning Goals

  • Mathematics students will develop “a non-rote approach to math.”
  • Technology and Industrial Arts students will “be prepared to live thoughtfully in our natural and human-made environments.”
  • Art students learn to practice “self-discipline, self-motivation, and critical self-evaluation.”

Habits of Mind Learning Goals

  • In the Religion program, students will learn to “articulate and support with relevant sophistication and subtlety their own views about religious texts and issues.”
  • In the Agriculture and Natural Resources program, students learn to apply “facts and principles to the management of agriculture and natural resources systems.”
  • In the Economics and Business programs, students “will be able to view problems from a variety of perspectives and consider alternative points of view when examining issues and designing solutions.”

Opportunities for students to achieve outcomes

The faculty responsible for each of Berea’s academic programs identify and regularly review the opportunities for achieving the learning outcomes. In addition to the annual Departmental Effectiveness Reports, Departmental Self-Studies (schedule of studies) conducted every ten years provide for a substantial and comprehensive review. According to the Departmental Self-Study Guidelines, the basic purpose of the self-study is to ensure that resources available to the department will continue to be used with maximum effectiveness and in keeping with overall institutional aims. The self-study provides an opportunity for academic programs to reflect on the students they serve, the current state of the discipline, and how best to provide an undergraduate education within the context of Berea College. To achieve that purpose, the self-study should include at least four major components:

  1. a thorough examination of the department’s goals and objectives,
  2. an assessment of departmental effectiveness in meeting those goals,
  3. a departmental review of its major curriculum, and
  4. a plan for departmental development in the future.

In addition, programs are asked to consider:

  • The appropriateness of the curriculum for size of faculty, number of student majors, and necessary frequency of course offerings.
  • Within the major, the course sequences and structures that provide coherence, that enable students to understand the operating assumptions, organizing principles, scholarly issues, and modes of inquiry that characterize the discipline.
  • Departmental strategies for helping students gain critical perspective on the major field, while developing the capacity to make connections within and between major courses, and between the major and other areas of study.
  • The extent to which departmental courses reflect recent developments or current understandings in the field.
  • The ways in which course-taking patterns and the design of experiences in major courses stimulate intellectual growth, and the ways in which the major program builds upon and strengthens general education.
  • The variety of pedagogical approaches and learning situations that students encounter, with proper regard for differences in students’ learning styles.
  • Opportunities for students to pursue study in other disciplines and to have learning experiences in off-campus settings.
  • Effectiveness of student advising, including guidance toward further study and careers.
  • The ways in which the labor program and the academic program are linked to maximize learning.
  • The accessibility of the department’s courses and major(s) to all groups of students, including groups underrepresented in the major.
  • The integration of a diverse body of scholarly work that includes varied perspectives, as appropriate to the discipline.
  • Departmental efforts to develop communication skills so that students become more effective communicators.

The self-study process begins with the faculty responsible for the program drafting a self-study document over a one- or two-year period. This document is shared at various stages with the Academic Vice President and Dean of the Faculty. When it is complete, two or more external reviewers (faculty from other institutions) are selected to review the self-study report; to come to campus to interview faculty in the program, faculty outside the program, students, and administrators; and to write a summary of their findings. The self-study report is also forwarded to the Academic Program Council (APC), the faculty committee responsible for oversight of the curriculum, and the external reviewers meet with two members of this Council. Upon receipt of the external reviewers’ findings, the APC drafts questions for the program chair. The APC as a whole meets with the program chair, reviews all questions and issues raised through the self-study process (the report drafted by the program faculty and the findings of the external reviewers), and concludes the process with a letter sent to the program faculty.

This process is not pro forma. When self-study reports are deemed inadequate, the Academic Program Council may ask a program to return to the assessment process with careful attention to particular areas. The process is robust, rigorous, connected to Berea’s mission, and connected to external contexts (external reviewers play a very important role). It is owned by the faculty as a whole through the role played by the Academic Program Council, an elected body; and it is supported with institutional dollars.

Data Collection and Assessment Methods

The institution collects data for assessment purposes and provides this information to academic programs. Annually, Departmental Data Reports are provided to each program and are very detailed. These include individual departmental data as well as comparative norms for a wide range of information (e.g., grade distributions, academic advising information, graduates by gender and race/ethnicity, and graduate survey results). The institution also conducts national surveys such as the National Survey of Student Engagement (NSSE), the ACT Alumni Survey, and a recent survey of three decades of graduates.

In addition, anonymous, electronic student evaluations (known as the Instructor Evaluation Questionnaire—IEQ) are conducted in every course every semester, and the student feedback is shared with the instructor and with the department chair/program coordinator. This information is used in all faculty evaluations and helps to sustain a culture of valuing and working toward excellence in teaching and learning, the first of four criteria used to evaluate faculty.

Academic programs also collect and evaluate assessment data. Berea’s academic programs use diverse assessment methods to ascertain programmatic strengths and weaknesses and to help identify effective changes. Examples of assessment methods and tools from the Departmental Effectiveness Reports include senior projects and research papers, presentations made to audiences outside the classroom, exit surveys of graduating seniors, and the use of national examinations such as the National Council Licensure Examination (NCLEX) in Nursing, the Educational Testing Service (ETS) Major Field Test in Business, the American Chemical Society (ACS) examinations in Chemistry, and the PRAXIS examinations for teacher certification.

Curriculum and co-curriculum review includes the examination, analysis, and mapping of syllabi and courses, the review of programs at other institutions, and the analysis of assignments and skill development across the program’s curriculum. These are reported in the DERs and the self-studies.

Assessment Findings, Actions Taken, and Improvements Made

Following are examples from recent Departmental Effectiveness Reports and self-studies:

  • Sociology faculty members identify specific courses that focus on deepening students’ quantitative skills and others that focus primarily on research methods. This clarity about the distinct emphases of individual courses allows faculty to develop appropriate syllabi and design courses that contribute to the overarching learning goals, and also provides more precise assessment questions that can be asked about the success of particular courses. (DER)
  • History faculty members have identified the analytic skills that are developed at each course level. In reviewing and clarifying how their courses take students through the developmental stages of becoming a strong researcher and writer about history, the faculty have the opportunity to ensure that the history curriculum moves students intentionally toward the program’s core learning outcomes. (DER)
  • The Art program provides regular museum and other trips to help students practice skills associated with viewing art in museum and other settings. Committed to helping students develop the capacity for a broad and deep engagement with art, the program makes co-curricular programming an intentional part of its effort to help students achieve the program’s core learning outcomes. (DER and Self-Study)
  • The Education Studies faculty has created numerous opportunities for their students to be in K-12 classrooms prior to student teaching. Focused on the core dispositions necessary for successful teaching, these early K-12 classroom experiences help Education Studies students make progress in cultivating these dispositions and become more self-aware of their abilities as future educators. (DER)
  • Biology, Physics, and Chemistry programs require students to create poster presentations, and their students participate in the Kentucky Academy of Science annual conference. These out-of-class experiences enhance the research experiences woven throughout science courses and further support the experience of science as a collaborative work that includes presentation to peers as well as research with peers.
  • About ten years ago, as a result of a self-study, the Biology program decided to move away from “cookbook” laboratory experiences and to embed genuine research activities throughout the curriculum. Such decisions were based on evidence suggesting that students were not linking scientific knowledge to scientific research. Currently, evidence from the senior survey suggests that students feel overwhelmed by the scope of research embedded in courses, and as a result the Biology faculty is diversifying the type of research experiences, making some team-based or class-focused. (DER)
  • The Chemistry faculty has been collecting and analyzing student performance data based on grades in Chemistry courses, the American Chemical Society diagnostic exam, and the general exam for graduating seniors. As a result of this assessment project, which is ongoing, as well as other assessment findings, the faculty has implemented a structured set of study skills tutorials and has begun evaluating course sequencing. (DER and Self-Study)
  • The Technology and Industrial Arts program’s recent self-study discusses the role of internships in helping students gain the capacity to make connections among major courses and to gain a critical perspective on the field as practiced outside the academy. (DER and Self-Study)
  • The Foreign Language programs have, based upon evidence from student performance on assignments and exams, increased the emphasis on presentations, oral tests, skits, and dialogues in an effort to enhance student proficiency in speaking. (Self-Study)
  • In response to senior exit survey data, communication with alumni and advisees, and the assessment of internship experiences, the faculty in the Agriculture and Natural Resources program decided to add staff to the farm programs to further improve the quality of the farm component of the curriculum and to expand the range of the student capstone projects to help improve student communication skills as well as research skills and business planning skills. Given that the Berea College Farm is the nation’s oldest, the Agriculture and Natural Resources program has always included a hands-on component. Eight years ago, based on evidence suggesting that this experience was directly linked to successful student learning of practical skills and the application of knowledge, the faculty increased student responsibility for farm operations by structuring and adding labor experiences on the farm. (DER and Self-Study)
  • Based upon a study of the enrollment in courses and study abroad programs, the Asian Studies program has changed two survey courses to be more country-specific and is seeking new institutional partners in East Asia. (DER)

Future Plans Aimed at Improvements in Assessment Work

Plans for the academic year 2011-2012 include revamping the Departmental Effectiveness Report system to require academic programs to demonstrate more clearly continuous improvement processes in their assessment work. By directing academic programs to focus more intentionally on generating formal evidence of student achievement (regarding learning goals), monitoring and evaluating this evidence over time, and tracking and reporting the improvement of student learning across annual effectiveness reports, our goal is to create a more robust institutional effectiveness system in the academic program area.

Also, Berea College has recently restructured its academic programs so that more than two dozen departments now form six academic divisions. Each academic program has a Program Coordinator who works with colleagues in the academic program and throughout the new division as well as with a division chair; each academic division comprises five or six academic programs (previously academic departments). Current discussion with the newly appointed Division Council, formed in May 2011 and composed of the six division chairs, the Dean of Curriculum and Student Learning (also a new position), and the Academic Vice President and Dean of the Faculty, involves transforming ten-year departmental self-studies into a five-year divisional endeavor to provide more flexible and responsive action for continuous improvement. Such actions, it is hoped, will foster collaboration across the programs in areas such as student learning assessment. Other actions resulting from restructuring will bring self-studies to the scrutiny of the Division Council as well as the Academic Program Council (APC), and plans call for future divisional self-studies to be presented to the entire faculty.

A scenario planning process began as a response to the global financial collapse of 2008-2009 and led to an innovative initiative centered on student learning and curriculum known as “Engaged and Transformative Learning.” During this discussion, sustained attention was given to assessment, and faculty members conveyed a genuine eagerness to make assessment a fundamental part of their practice at every level, from course design to curriculum revision.

Labor Program

Mission

Berea’s Student Labor Program is an integral part of the institution’s educational philosophy and program. The Great Commitments of Berea College include the commitment, “To provide for all students through the Labor Program experiences for learning and serving in community and to demonstrate that labor, mental and manual, has dignity as well as utility.” Though the Labor Program has been central to the Berea experience since its beginning, a 2003 strategic review of the program provided a new unifying vision for labor as student-learning centered, as service to the College and larger community, and as providing necessary work done well.

Description

Berea College, founded in 1855, established the Labor Program under its original charter. By the early 1900s the program was structured into labor departments with staff and faculty work supervisors serving as practical instructors. All students are required to work in Berea’s Labor Program during each term of enrollment except when engaged in an institutionally organized or approved internship, study abroad, or similar program. Continuing students not enrolled in a summer course may participate in a summer work-learning-service practicum.

Students generally work between 10 and 15 hours per week during the academic year in more than 110 labor departments across all programs and operations of the College; they may work more during the summer. Failure to meet the participation requirement or to fulfill position expectations can result in probation or suspension from the College. Positions are classified into Work-Learning-Service levels ranging from entry trainee assignments to positions with significant program direction and supervisory responsibility. Time is reserved each week for labor department training, and other activities are not scheduled during that hour. The Labor Program operates under the administrative oversight of a Dean of Labor. The labor experience has long been cited by students and alumni as one of the most valuable parts of a Berea College education. The College routinely solicits feedback regarding the Labor Program from current and graduating students, and it collaborates with other Work Colleges on shared research and alumni surveys in order to improve its programs and enhance student work-learning. Alumni ratings of educational outcomes are consistently higher for Berea and other Work Colleges than for other institutions.

Work College Definition

In 1991, Congress enacted federal legislation recognizing Work Colleges as a special type of degree-granting institution. This legislation, first adopted as an amendment to the Higher Education Act (HEA) of 1965, defined a Work College as a degree-granting institution with a student work-learning-service program that:

  1. is an integral and stated part of the institution’s educational philosophy and program,
  2. requires participation of all resident students for enrollment and graduation,
  3. includes learning objectives, evaluation, and a record of work performance as part of the student’s college record,
  4. provides programmatic leadership by college personnel at levels comparable to traditional academic programs,
  5. recognizes the educational role of work-learning-service supervisors,
  6. includes consequences for nonperformance or failure in the work-learning-service program similar to the consequences for failure in the regular academic program, and
  7. administers, develops, and assesses comprehensive work-learning-service programs including:
    1. community-based work-learning-service alternatives that expand opportunities for community service and career-related work; and
    2. alternatives that develop sound citizenship, encourage student persistence, and make optimum use of assistance under the Work-Learning-Service program in education and student development.

In 2008, the new Higher Education Opportunity Act (HEOA) amended the federal description of such programs from “comprehensive work-learning program” to “comprehensive work-learning-service program,” recognizing service as a desired work-learning outcome and as intrinsic to a Work College education. Seven colleges are federally recognized as Work Colleges.

Comprehensive Program Review

In 2000, Berea’s Strategic Planning Council (SPC) appointed a subcommittee to review the work program and make recommendations for improvement with special focus on enhancing labor as learning. The Labor Review Committee, co-chaired by two faculty members, reviewed structures and processes of the program, interviewed members of the campus community (students, supervisors, faculty, and staff), analyzed existing program evaluation data, including alumni surveys, and educated itself about the world of work and learning at and beyond Berea.

Its resulting report to the College’s Strategic Planning Council included recommendations for re-visioning, re-vitalizing, and re-structuring the Labor Program. Recommendations were made on program structure, administration, policies, and operations. Further recommendations focused on the need to develop and structure operations, services, and assessments around a central, unifying vision of labor as an integration of learning, service, and work.

Guided by this review, the Strategic Planning Council devised a strategic initiative on “student labor as student and learning centered, as service to the College and community, and as providing necessary work (i.e., work that needs to be done), being done well.” This strategic initiative, approved by the College and General Faculties in 2003 and by the Board of Trustees in 2004, has provided a framework for subsequent program improvements in such areas as articulation and measures of work-learning-service outcomes, assessment of their attainment, and enhanced work supervisor support and training.

Most recommendations resulting from this comprehensive review of the Labor Program have been implemented. Program structure and leadership have been modified; program literature has been rewritten to emphasize the learning dimensions of the program; position allocation processes have been revised to include an emphasis on learning outcomes; and resources have been dedicated to improving the quality of supervision.

Labor Transcript

In 2009-2010 the Student Labor Transcript was revised to provide a permanent record of a student’s work history and quality of work performance. Coupled with a well-written resume, the Labor Transcript supports a student in exploring internship and career opportunities. The transcript contains:

  • a detailed work history including department, position, and supervisor
  • Work-Learning-Service level for each position
  • number of contract hours per week
  • departmental Labor Evaluation scores across five performance indicators ranging from “Unsatisfactory” to “Exceptional Labor Performance”
  • distinctions attained (e.g., labor-related awards, certifications, and/or specialized training)

Training

Ongoing training support is provided for supervisors to enhance workplace development. The Labor Program Office maintains a library of print and audio-visual materials for use by supervisors on a variety of topics such as communication, teamwork, training, diversity, mediation, leadership, and supervision. In addition, a newly designated annual Day of Exploration in Learning and Labor is reserved in the academic calendar for students to explore fields of study, labor positions, learning-labor linkages, and other opportunities for engaged and integrated learning. Finally, specific learning, evaluation, and assessment improvements described below have been devised with input from students, practical supervisors, the Labor Program Council, and others. Piloted over a year, they are now fully in place, and preliminary results will be used to guide future program and outcomes improvements.

Work-Learning Outcomes

Work provides an experiential learning environment through which learning and service outcomes are derived. Labor positions may provide work experience directly related to a major academic field of study and opportunities to explore areas of interest outside the major. Linking academics and work gives students opportunities to reflect on and apply what is being learned in the classroom. Other skills developed through a labor position, such as communication, teamwork, and initiative, also help students become more effective learners and workers.

Learning opportunities are now articulated in written position descriptions that include learning outcomes. Positions are classified according to Work-Learning-Service levels, and learning outcomes progress in step with those levels.

The next phase of program improvement implements continuous evaluation, particularly of program outcomes.

Program Evaluation

Program evaluation and assessment utilize a variety of processes and instruments. Practical instructors evaluate student work performance on seven performance expectations using the Student Labor Evaluation (SLE) tool. Descriptors for each area assist supervisors in connecting these expectations to seven specific Labor Learning Goals and to the College’s Workplace Expectations for all workers. “Position-Specific” performance expectations on the Student Labor Evaluation relate directly to skill and performance requirements specified in the position description. Training materials identify those connections and facilitate supervisors’ conversations with students about the integration of labor and learning.

To assess learning outcomes of the labor experience and to improve work performance, two annual formal evaluations are conducted—one at the mid-point and one at the end. At the mid-point, labor supervisors discuss student performance and make recommendations for improvement. Students receive all evaluation results and may respond and comment before the final score is uploaded into the labor transcript. Students also reflect on their work experience and provide feedback through the Labor Experience Evaluation (LEE) on:

  • learning-through-work experiences
  • the relationships between work and academics
  • four core general educational goals
  • evaluation of the specific work area
  • evaluation of the Labor Program

The 2009-2010 and 2010-2011 pilot assessment reports reveal that supervisors routinely hold systematic discussions with students about work performance. Assessments also suggest that work and learning are being connected in meaningful ways and that the Labor Program is successful in promoting a culture of learning in the workplace through supervision, reflection, and evaluation. A majority of students indicate they experience direct connections between work and learning and that their labor positions support their academic and career objectives. Refer to Figures 1 and 2 below.

Figure 1

Figure 2

Also, students have overwhelmingly indicated on the Labor Experience Evaluation (LEE) that their work fosters the development of Berea’s four core general education goals, i.e., to become better writers, speakers, researchers, and critical thinkers. Refer to Table 1 below.

Table 1

Student Ratings from the Labor Experience Evaluation

My work has provided me with opportunities to become a better:

Always Often Occasionally Rarely Never N/A
Writer 13% 13% 24% 23% 23% 4%
Speaker 20% 28% 26% 11% 11% 4%
Researcher 16% 22% 29% 13% 15% 5%
Critical Thinker 35% 28% 21% 5% 6% 5%

In the summer of 2011 the Labor Program implemented an annual assessment feedback mechanism for labor supervisors and department administrators. Labor supervisors were provided web-based access to the Labor Experience Evaluation (LEE) results from the spring of 2011. From these assessments, two institutional reports were generated and provided to College and department administrators: 1) a Student Labor Evaluation report and 2) a Labor Experience Evaluation report. The Labor Program office also began generating individual Department Assessment Reports that compare an individual department’s performance with institution-wide performance; other sections of the report address items specific to a department. Performance areas assessed include completion rates, quality and frequency of supervisor feedback, departmental rating averages compared to institutional averages, and student feedback about specific work areas and job-specific learning objectives. Selected labor department directors met with members of the Labor Program administration and the Director of Institutional Research and Assessment to review assessment findings and discuss opportunities for improvement. This process will continue for other work areas in the fall. Vice Presidents will also receive assessment reports for the labor departments within their divisions. These assessment cycles will continue for all departments in the future.

Over the next year, as sufficient comparative data become available, the Office of Institutional Research and Assessment will work with the Labor Program to define areas for comparative statistical analysis and longitudinal, ongoing improvement-based research. Initial findings from the past academic year (2010-2011) show that student work improved significantly from midterm to end-of-year assessments, according to supervisor ratings; refer to Table 2 below (an illustrative computation follows the table). These studies will allow us to explore student growth over multiple years within a department, student movement across labor departments, Work-Learning-Service level advancement within departments and across the institution, patterns of secondary positions and movement across the institution, and connections between particular academic areas of study and labor departments. We will also be able to identify departments with exceptional performance and use them as best-practice models for other departments.

Table 2

Supervisors’ Ratings of Labor Students: Mid-Term Compared to End-of-Term

Performance Area Mid-Term Mean End-of-Term Mean Effect Size Significance
Attendance 16.74 17.00 0.12 0.002
Teamwork 8.41 8.61 0.22 0.000
Accountability 8.22 8.43 0.22 0.000
Initiative 8.12 8.32 0.20 0.000
Respect 8.77 8.89 0.15 0.002
Learning 16.76 17.08 0.19 0.000
Job-Specific 17.05 17.30 0.15 0.000
TOTAL SCORE 84.04 85.59 0.20 0.000

Note: Attendance, Learning, and Job-Specific were rated on 20-point scales and Teamwork, Accountability, Initiative, and Respect were rated using 10-point scales.
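
The report does not specify the statistical procedure behind Table 2, but the pattern of means, effect sizes, and significance values is consistent with a paired-samples analysis. As an illustration only, the Python sketch below shows one common way such figures can be computed: Cohen’s d taken from the paired differences, with significance from a paired t-test. The ratings data, the function name, and the choice of effect-size formula are assumptions for the example, not the College’s actual analysis.

  # Illustrative sketch only: paired-samples effect size and significance,
  # in the spirit of Table 2. All ratings below are hypothetical.
  import numpy as np
  from scipy import stats

  def paired_effect_size(mid_term, end_of_term):
      """Cohen's d for paired samples: mean difference / SD of the differences."""
      mid = np.asarray(mid_term, dtype=float)
      end = np.asarray(end_of_term, dtype=float)
      diff = end - mid
      d = diff.mean() / diff.std(ddof=1)           # sample SD of the differences
      t_stat, p_value = stats.ttest_rel(end, mid)  # paired t-test
      return d, p_value

  # Hypothetical supervisor ratings for one performance area (10-point scale):
  mid_term_ratings = [8, 9, 7, 8, 9, 8, 7, 9]
  end_of_term_ratings = [9, 9, 8, 8, 9, 9, 8, 9]
  d, p = paired_effect_size(mid_term_ratings, end_of_term_ratings)
  print(f"effect size = {d:.2f}, p = {p:.3f}")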

Summary and Future Plans

Integrated work-learning is embedded in the history of Berea College. From its earliest days, Berea has enabled students to contribute to the cost of their education while extending their classroom learning, gaining valuable work experience, and serving the College and surrounding communities by performing necessary work well. More recently, beginning with an institution-wide comprehensive review of the program in 2000-2001, the College has undertaken systematic assessment of its work-learning program and its learning outcomes. This review resulted in the adoption of a strategic initiative to re-vision, re-vitalize, and re-structure the program. Subsequently, the College has implemented other program improvements, including systematic program goals and learning outcomes, restructured program operations, and program assessments that have been carefully field-tested and are now fully implemented. Such assessments are used across the program to evaluate student learning and performance and to assess program effectiveness. Going forward, this embedded assessment will provide data for program improvement at the department and institutional levels and support an even more intentional and consistent focus on work-learning outcomes.

General Education Program

Berea College’s General Education Program comprises five interdisciplinary core classes taught by faculty from across our campus: GSTR 110: Critical Thinking and the Liberal Arts; GSTR 210: Identity and Diversity in the United States; GSTR 310: Understandings of Christianity; GSTR 332: Scientific Knowledge and Inquiry; and GSTR 410: Contemporary Global Issues. In addition, students complete courses in six Perspective areas and two practical reasoning courses (at least one of which has a quantitative emphasis); these are departmental courses that have been approved by the Committee on General Education (COGE) to satisfy the requirements. Finally, all students are required to have one credit in Health and Physical Education and an Active Learning Experience.

The General Education Program is:

  • Aligned with the goals and commitments of the College
    The Aims of General Education come directly out of the College’s Great Commitments and the paired learning goals of Berea’s strategic plan, Being and Becoming.
  • Founded on a contemporary understanding of liberal learning
    The General Education Program reflects a vision of liberal learning articulated by the Association of American Colleges and Universities (AAC&U) and the work of such thinkers as Bruce Kimball, Martha Nussbaum, and others.
  • Focused on student learning
    The Aims for General Education and each of the Core Courses and Perspectives areas are framed in terms of student learning outcomes.
  • Centered on critical reflection
    There is a consistent emphasis on critical reflection, a general term used to refer to a number of skills, abilities, and habits related to practical reasoning, critical thinking, moral imagination, and other aspects of aims and outcomes.
  • Committed to both disciplinary and multidisciplinary learning
    Many Perspective requirements are met by taking courses in disciplines, and the core courses incorporate multiple disciplinary approaches to the topics examined.
  • Structured to promote linkages and connections
    All of the courses and experiences in the program are designed to foster connections. For example, there are close skills development connections between the writing seminars, attention given to reason and science in the Christianity course, and the design of a final seminar course that challenges students to bring the knowledge, skills, and habits of mind they have gained in earlier general studies and major courses to bear on a particular issue. In addition, the program links curricular and non-curricular learning by encouraging connections among the classroom, labor, residential life, and convocation programs.
  • Designed to encourage and enable collegial collaboration
    The core’s multidisciplinary emphasis is a natural place for regular, and in some cases extensive, collaboration among students, faculty, and staff.
  • Smaller and more flexible than the previous program
    The current program reduced the number of GSTR, or Core Courses, from eight to five and defined Perspective requirements so that courses outside the core GSTR curriculum may fulfill up to two Perspective requirements.

Learning outcomes in key General Education skills areas are regularly assessed. The results of those assessments are used to modify course goals and structures, monitor student learning, and guide faculty development.

Committee on General Education (COGE)

The Committee on General Education (COGE) is the steering committee for the General Education curriculum and is guided by faculty-approved Aims of General Education. Primary responsibilities of COGE include:

  1. Consideration of issues affecting the substance of the General Education Program, including reviewing and making recommendations on any matters that affect GSTR course guidelines or the General Education curriculum as a whole.
  2. Administration of existing policy within the General Education Program. These matters include (but are not limited to) the development, review, and approval of new sections of General Education core courses, Perspectives, and other components of the General Education Program; and consideration of requests for exceptions within the program.
  3. Systematic and on-going assessment of individual GSTR courses and of the General Education curriculum as a whole.
  4. Planning for faculty development in regard to the General Education Program.
  5. Initiation of proposals for programmatic and/or curricular changes to the General Studies Program, which are forwarded to and acted upon by the Academic Program Council (APC).

COGE consists of eight members—the Dean of Curriculum and Student Learning (formerly, the Director of General Education); the Coordinators of the five General Studies Core Courses; the Coordinator of Perspectives, Practical Reasoning, and Active Learning Experiences; and a student. The Dean of Curriculum and Student Learning serves as a voting member of the Academic Program Council (APC) and functions as a liaison between COGE and APC. Coordinators are appointed by the Dean of Curriculum and Student Learning. Terms are generally three years but may range from two to five years. All members have voice and vote.

Structures for Assessment and Faculty Development

The Committee on General Education (COGE) and the Dean of Curriculum and Student Learning have devised structures for assessing aspects of the General Education program and offering faculty development opportunities aimed at making instruction more effective. Some of these structures are described below:

  • The annual General Studies workshop (1-2 days) each August is an opportunity to share assessment data and other information and to offer instructional workshop sessions.
  • Each summer a 5-day seminar on Teaching and Evaluating Student Writing and Speaking is offered with stipends for participating faculty. Faculty members who are new to the first-year writing seminars are especially encouraged to participate.
  • Coordinators of the General Studies Core Courses develop initiatives and programming to enhance instruction and student learning.
  • The Dean of Curriculum and Student Learning convenes advisory groups as needed to conduct focused assessments of particular aspects of the program.
  • Seminars focused on content and skills in the General Education Core Courses are offered during the summer, with stipends for faculty participants.
  • Assessment projects are conducted during the summer, with stipends for faculty participants.
  • Faculty-led groups interested in projects focused on student learning and assessment are encouraged and supported financially.

Overview of General Education Program Assessment, 2008-2011

During the first two years of Berea’s current General Education Program—inaugurated in the fall of 2006—assessment focused primarily on student and faculty perceptions of the new program and of the extent to which they believed learning outcomes were being met. Our current General Education Program was designed to be experimental in several respects and to include—for our campus—somewhat unusual structural elements: e.g., the common syllabus, text, exam, and weekly plenary lectures in GSTR 310; required team-teaching in GSTR 410; the requirement that all students, regardless of major or transfer credit, take all five core classes; and a required Active Learning Experience. We wanted to give these aspects of the program a chance to succeed over a two-to-three-year period, and we gathered systematic feedback from students and faculty to monitor how the program was working.

Feedback included our regular course evaluations, online surveys of students and faculty, meetings of the faculty in the various Core Courses, and, for some issues, reviews by advisory teams formed by the Dean of Curriculum and Student Learning. Several modifications to the program, including modification to GSTR 410 and alternative ways of meeting the GSTR 332 requirement, were proposed by the Committee on General Education (COGE) and approved by the Faculty. In December of 2010, the Committee on General Education summarized these changes in a report to the Faculty.

Direct assessments of student learning in the General Education program over the past three years have focused on five key areas:

  1. critical thinking and quantitative reasoning,
  2. research skills and information literacy,
  3. writing,
  4. scientific knowledge and inquiry, and
  5. student engagement as measured by the National Survey of Student Engagement (NSSE).

Students’ abilities and knowledge in these areas have been assessed independently using both standardized and local instruments, and in an integrated way through rubric analysis of large samples of research writing produced by first-year and senior students.

1) Critical Thinking and Quantitative Reasoning

Critical Thinking

Critical thinking has been assessed using both standardized instruments and through rubric evaluations of student research projects.

Standardized Instruments

Berea College used two standardized instruments, the California Critical Thinking Disposition Inventory (CCTDI) and the ACT Collegiate Assessment of Academic Proficiency (CAAP) Critical Thinking examination, to measure changes in critical thinking skills and dispositions in 2006-07 and 2009-10.

CCTDI—2006

Early in the 2006 fall semester, 275 students took the California Critical Thinking Disposition Inventory (CCTDI) in their GSTR 110 classes. The CCTDI is designed to measure students’ disposition to be critical thinkers. It is the product of an American Philosophical Association project of the late 1980s, which resulted in the “Delphi Report: Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction.” That report describes a critical thinker as one who “is habitually inquisitive, well-informed, trustful of reason, open-minded, flexible, fair-minded in evaluation, honest in facing personal biases, prudent in making judgments, willing to reconsider, clear about issues, orderly in complex matters, diligent in seeking relevant information, reasonable in the selection of criteria, focused in inquiry, and persistent in seeking results that are as precise as the subject and the circumstances of inquiry permit.” As this definition suggests, critical thinking is comprised of both skills and dispositions. The CCTDI is designed to measure the latter.

The test was given early in students’ Berea education to provide a baseline. Administering it to the same students when they were juniors or seniors provided an important perspective on how their college curriculum and experiences may have influenced their disposition to think critically.

CCTDI scores are not reported in comparison with scores at other institutions. Instead, there is an overall score and scores ranging between 10 and 60 on seven sub-scales:

  • Truth-seeking (disposition of being courageous when asking questions, eager to seek best knowledge in a given context, and honest in the pursuit of inquiry even if the findings do not support one’s interests or one’s preconceived opinions)
  • Open-mindedness (disposition of being open to and tolerant of the expression of divergent points of view with sensitivity to the possibility of one’s own bias)
  • Analyticity (disposition of being alert to potentially problematic situations, anticipating possible results or consequences, and prizing the application of reason and the use of evidence even if the problem at hand turns out to be challenging or difficult)
  • Systematicity (disposition toward organized, orderly, focused, and diligent inquiry)
  • Critical Thinking Self-Confidence (the level of trust one places in one’s own reasoning processes)
  • Inquisitiveness (intellectual curiosity)
  • Cognitive Maturity (disposition to make reflective judgments, particularly under conditions of uncertainty).

In their 2001 article, “A Look across Four Years at the Disposition toward Critical Thinking among Undergraduate Students” (Journal of General Education, Vol. 50, No. 1, pp. 29-55), Carol Ann Giancarlo and Peter A. Facione provide an interpretive framework for CCTDI scores:

For each of the seven scales a person’s score on the CCTDI may range from a minimum of 10 points to a maximum of 60 points. Scores are interpreted utilizing the following guidelines. A score of 40 points or higher indicates a positive inclination or affirmation of the characteristic; a score of 30 or less indicates opposition, disinclination or hostility toward that same characteristic. A score in the range of 31-39 points indicates ambiguity or ambivalence toward the characteristic. An overall score on the CCTDI is computed by summing the seven scale scores. Overall CCTDI scores may range from a minimum of 70 points to a maximum of 420 points. Similar interpretative guidelines are used when looking at overall CCTDI scores: A total score of 280 points or higher indicates a positive overall disposition toward critical thinking, whereas a total score of 210 or lower indicates a negative disposition averring critical thinking. (p. 36).

The mean sub-scale scores for the 275 students who took the CCTDI in 2006 are listed in Table 3 below:

Table 3

2006: CCTDI Subscale Average Ratings

Truth-seeking 36.8
Open-mindedness 44.5
Analyticity 43.5
Systematicity 38.7
Self-confidence 41.8
Inquisitiveness 46.7
Maturity 44.8

These scores indicated that our students were beginning their college education with a mildly positive disposition toward critical thinking in five of the seven areas measured by the test. In two areas, systematicity and truth-seeking, student scores indicated ambivalence. Since these were baseline scores, COGE did not interpret them as indicating the need for specific immediate action.
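
To make the quoted interpretive guidelines concrete, the short Python sketch below (an illustration added here, not part of the original assessment) applies the Giancarlo and Facione thresholds to the 2006 sub-scale means from Table 3. Note that the guidelines were written for individual students’ scores; applying them to group means, as the narrative above does, is a simplification.

  # Illustrative sketch: applying the Giancarlo and Facione (2001) interpretive
  # guidelines quoted above to the 2006 Berea sub-scale means from Table 3.

  def interpret_scale(score: float) -> str:
      """Classify one CCTDI sub-scale score (possible range: 10 to 60)."""
      if score >= 40:
          return "positive inclination"
      if score <= 30:
          return "opposition or disinclination"
      return "ambivalence"  # scores of 31-39

  def interpret_overall(scores: list[float]) -> str:
      """Overall CCTDI score is the sum of the seven sub-scale scores (70 to 420)."""
      total = sum(scores)
      if total >= 280:
          return "positive overall disposition"
      if total <= 210:
          return "negative overall disposition"
      return "ambivalent overall disposition"

  means_2006 = {
      "Truth-seeking": 36.8, "Open-mindedness": 44.5, "Analyticity": 43.5,
      "Systematicity": 38.7, "Self-confidence": 41.8, "Inquisitiveness": 46.7,
      "Maturity": 44.8,
  }
  for name, score in means_2006.items():
      print(f"{name}: {interpret_scale(score)}")
  total = sum(means_2006.values())  # 296.8, consistent with the 2006 cohort mean
  print(f"Overall ({total:.1f}): {interpret_overall(list(means_2006.values()))}")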

CAAP—2007

In the spring semester approximately 280 first-year students took ACT’s Collegiate Assessment of Academic Proficiency (CAAP) test in their GSTR 210 sections. ACT describes the CAAP Critical Thinking Test as follows:

The CAAP Critical Thinking Test is a 32-item, 40-minute test that measures students’ skills in clarifying, analyzing, evaluating, and extending arguments. An argument is defined as a sequence of statements that includes a claim that one of the statements, the conclusion, follows from the other statements. The Critical Thinking Test consists of four passages that are representative of the kinds of issues commonly encountered in a postsecondary curriculum. A passage typically presents a series of sub-arguments in support of a more general conclusion or conclusions. Each passage presents one or more arguments using a variety of formats, including case studies, debates, dialogues, overlapping positions, statistical arguments, experimental results, or editorials. Each passage is accompanied by a set of multiple-choice test items. A total score is provided for the Critical Thinking Test; no sub-scores are provided.

Unlike other CAAP tests (in writing, reading, science, and math), the critical thinking test results are not correlated to ACT scores to provide a snapshot of “value added.” Thus, we have been cautious in drawing inferences from the results of a single testing. Berea students’ mean score of 64.2 on the test puts this cohort of students at the 66th percentile of the national comparison group. It is important to note that the national group scores are derived from students completing their sophomore year, whereas all of the Berea students who took the exam were first-year students. The test is divided into content categories: Analysis of Elements of an Argument, Evaluation of an Argument, and Extension of an Argument (see Figures 3, 4, and 5).

Figure 3

Figure 4

Figure 5

Figures 3, 4, and 5 show that there was essentially no difference between the average score of the top quartile of our students and that of the top quartile of the national cohort, though the average scores for our students in the middle two quartiles and the bottom quartile were significantly higher than the national averages, as were the overall scores. Interpretation of these results is difficult. One reading is that we need to challenge our strongest students to higher levels of achievement without compromising our commitment to all students. If, on the other hand, we believe our students entered Berea College with a different distribution of critical thinking skill than the national cohort, these results indicate that the distribution of critical thinking skills for Berea College students may simply be tighter than the national cohort’s: our bottom quartile is stronger and our top quartile is weaker. In the face of such ambiguity, additional actions were justified.

Actions Taken:

A Critical Thinking Advisory Group (CTAG) was convened in the fall of 2007 to review the results of the standardized assessments and to make recommendations about how best to improve and assess student learning in this fundamental skill (CTAG Report). After reviewing that report, the Committee on General Education (COGE) made the following recommendations:

  1. Administer the CCTDI and CAAP Critical Thinking examination to seniors in 2009-2010, who would overlap substantially with the cohort who began at Berea in 2006.
  2. Schedule meetings of faculty in GSTR 110 to discuss critical thinking texts and approaches to teaching critical thinking.
  3. Address critical thinking in the August General Education workshop and in the August seminar on Teaching and Evaluating Student Writing and Speaking.
  4. Seek ways to assess students’ reasoning in course assignments within the General Education Program.

In the summer and fall of 2009, the CCTDI and the CAAP Critical Thinking test were given to 164 students in our senior capstone course, GSTR 410. Seventy-five of these students had taken the inventory and test in 2006-2007.

CCTDI—2009

The cumulative mean for the first-semester students who took the CCTDI in 2006 was 296.71. The cumulative mean for the seniors who took the CCTDI in 2009 was 306.22. A comparison of scores of the 75 students who took the inventory both as first-year students and as seniors (those who were still at the institution and able to re-take the test) shows a gain of slightly less than 5 points (302.48 in 2006 and 307.36 in 2009). The CCTDI does not have normative data for comparison purposes. However, in a replication of a study by Giancarlo and Facione (2001, Journal of General Education, Vol. 50, No. 1, pp. 29-55), we found that scores for the 75 students who took the CCTDI both in 2006 and in 2009 remained about the same in four of the seven areas and changed significantly in two others. No significant changes occurred for the open-mindedness, analyticity, systematicity, and maturity of judgment scales. However, significant changes (at the p<.05 level) occurred from the first year to the senior year for truth-seeking (it increased) and for inquisitiveness (it decreased). A third area, critical thinking self-confidence, showed a marginally significant increase (p<.06).

Table 4 illustrates the sub-score means for 75 students who took the inventory both as first-year students and as seniors.

Table 4

Sub-scale 2006 2009
Inquisitiveness 47.67 46.19 (statistically significant decrease)
Maturity 45.87 46.04
Open-mindedness 45.63 46.59
Analyticity 44.04 43.99
Confidence 42.47 43.91 (moderately statistically significant increase)
Systematicity 39.37 40.29
Truth-seeking 37.43 40.35 (statistically significant increase)

In the Giancarlo and Facione study, Santa Clara University students’ ratings increased on the truth-seeking scale (as Berea students’ did) and on the critical thinking confidence scale (marginally significant for Berea students). The scores on all seven scales for Berea juniors and seniors were greater than 40, the benchmark for indicating a positive disposition. While the inquisitiveness score decreased, that dimension, along with open-mindedness and maturity, remained among the areas in which our students scored highest, with means above 46 on all three. Results of this type of analysis need to be interpreted with some caution, however. Although 75 students completed the assessment twice, it is difficult to determine the extent to which they are a representative sample of Berea students. Nevertheless, with some caution, we can interpret the results of the longitudinal analysis of the CCTDI as indicating that students enter Berea with a generally positive inclination toward critical thinking and, overall, strengthen these positive inclinations over time.

CAAP—2009

Students in GSTR 410 in the summer and fall of 2009 were given the CAAP Critical Thinking test. The average score for the seniors—like that for our first-year students in 2007—was higher than their national peers (64.8 compared to 62.0; see Table 5). Also, for many content areas, the positive gaps between Berea students and the national cohort were larger in 2009 than in 2007 (see Figures 6, 7, and 8). These results may suggest that Berea College’s teaching, programming, and curricula are comparatively successful in promoting critical thinking.

Table 5

CAAP Critical Thinking Scores all Students: 2007 and 2009

2007 2009
Berea 64.2 64.8
National 62.7 62.0


Figure 6

Figure 7

Figure 8

In addition, the scores of the 69 students who took the test twice served as the basis of a simple longitudinal study. The average score for the group was 65.55 as first-year students and 64.48 as seniors, a statistically significant drop. Although this is of concern, it is important to note that the motivational scores on the pre- and post-tests also differed significantly: students in 2009 indicated they gave less effort to the test as seniors than they had in 2007 as first-year students.

Summary of results of the two administrations of the CCTDI and the CAAP

Berea students who took these exams in both 2006-2007 and 2009 exhibited a few modest gains both in critical thinking dispositions (as measured by the CCTDI) and in critical thinking skills (as measured by the CAAP Critical Thinking exam). The most noteworthy improvement was the statistically significant increase in students’ disposition to truth-seeking. Though there was a corresponding decrease in inquisitiveness, it remained one of the strongest dispositions of our students. The CAAP results suggest that Berea students are strong in the analysis of arguments and somewhat less so in the evaluation and extension of arguments. One promising result emerging from the CAAP was the widening positive gap between Berea students and the national cohort across quartiles in a majority of content areas (6 of a possible 9 comparisons in the breakdown represented in Table 6). In the analysis of an argument, Berea students showed improvements over students in the same quartiles of the national cohort. Nevertheless, the initial advantage Berea students had in the evaluation and extension of arguments content areas (see Figures 4 and 5) was not as pronounced when the CAAP was repeated in 2009 (Figures 7 and 8). Comparing Berea and the national cohort, Table 6 gives the change in the difference across quartiles and content areas; a short illustrative computation follows the table. For example, the ‘+12’ for the first quartile under Analysis of an Argument indicates that the advantage Berea students had in Analysis of an Argument increased by 12 percentage points across the two CAAP administrations (in 2007 Berea students in the first quartile scored 11 percentage points higher than the national quartile, and in 2009 they scored 23 percentage points higher). In addition, Berea seniors scored well above the fourth quartile of the national comparison group, which stands in sharp contrast to the performance of Berea students at the end of their first year.

Table 6

Change in Gap between Berea College and the National Cohort (2007-2009)

Quartile/Content Area Analysis of an Argument Evaluation of Arguments Extension of Arguments
First +12 -6 -14
Second and Third +13 -3 -
Fourth +8 +2 +6
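
To make the arithmetic behind Table 6 explicit, the short sketch below (illustrative only, not the College’s reporting code) recomputes the first-quartile Analysis of an Argument cell described above: an 11-point advantage over the national cohort in 2007 and a 23-point advantage in 2009 yield a change in gap of +12.

    # Each cell of Table 6 is the change, from 2007 to 2009, in Berea's
    # percentage-point advantage over the national cohort.
    def change_in_gap(advantage_2007, advantage_2009):
        return advantage_2009 - advantage_2007

    print(change_in_gap(11, 23))  # +12, matching the first-quartile cell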

Future Actions:

Continue to seek effective ways to use standardized assessments of critical thinking (initial thoughts: course-embedded standardized assessments and focused longitudinal studies of the critical thinking skills of a representative sample of Berea students).

Rubric-based assessment of critical thinking

While there were some positive indicators in the longitudinal studies using the CCTDI and CAAP instruments, we have not measured substantial gains. The conclusion that we need to maintain continued focus on students’ critical thinking skills is reinforced by the results of the Writing, Thinking, and Research Assessment Projects completed in the summers of 2010 and 2011. These projects are described in greater detail below (see Section III: Assessment of Student Writing), but one finding was directly related to critical thinking. Several hundred GSTR 210 and GSTR 410 research projects were evaluated by faculty members, using rubrics, on the following criteria: Purpose/Thesis; Organization/Structure; Reasoning/Evidence; Research/Use of Source Material; Citation/Style Conventions; Language Use (see Table 7). Both our first-year students and our seniors scored lowest on the two criteria most directly related to critical thinking: Purpose/Thesis and Reasoning/Evidence. Defining proficiency as a criterion score of 3 or 4, we find that students demonstrated proficiency in their GSTR 210 research papers at the rates indicated in Table 7. A second iteration of the Assessment of Student Writing, Thinking, and Research was completed in May 2011 and did not show substantially different results (see Section III: Assessment of Student Writing).

Table 7

Percentage of GSTR 210 Projects Assessed as Meeting Proficiency Standards

Criteria Proficient in GSTR 210
Purpose/Thesis 59%
Organization/Structure 67%
Reasoning and Evidence 54%
Research/Source material 80%
Citation/Style Conventions 68%
Language Use 75%
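
As a concrete illustration of how the percentages in Table 7 are derived, the sketch below applies the proficiency definition given above (a criterion score of 3 or 4) to a set of invented rubric scores; these are not the actual GSTR 210 data.

    # Hypothetical rubric scores (1-4 scale) for a single criterion.
    scores = [4, 3, 2, 3, 1, 4, 3, 2, 3, 4]

    proficient = sum(1 for s in scores if s >= 3)  # scores of 3 or 4 count
    print(f"{100 * proficient / len(scores):.0f}% proficient")  # 70% here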

Additional breakdown of the data shows that the distribution of rubric scores (from the GSTR 210, 310, and 410 Writing Assessment Instruments) varied across criteria, as shown in Figure 9. The reasoning skills and marshalling of evidence that students demonstrated in their submitted work were somewhat below what the faculty assessors deemed proficient.

Figure 9

Levels of Proficiency across Criteria for GSTR 210

Moreover, although we have not yet completed a longitudinal analysis of student writing, thinking, and research, Table 8 illustrates that seniors in 2009-2010 and first-year students from the same year did not demonstrate proficiency (on GSTR 410 and GSTR 210 projects, respectively) at noticeably different rates.

Table 8

Comparison between GSTR 210 and GSTR 410 on Projects/Papers

Percentage Assessed as Meeting Proficiency Standards

Criteria Proficient GSTR 410 Proficient GSTR 210
Purpose/Thesis 56% 59%
Organization/Structure 63% 67%
Reasoning and Evidence 59% 54%
Research/Source material 81% 80%
Citation/Style Conventions 73% 68%
Language Use 75% 75%

Data from the CCTDI, the CAAP Critical Thinking examination, and the Writing, Thinking, and Research Assessment Project indicate that we need to continue efforts to assist students in becoming more rigorous and sophisticated thinkers.

Actions Taken:

  1. Shared results of the assessment with Course Coordinators for the General Education Core Courses and faculty at the annual General Studies Workshop in August 2010.
  2. Offered General Studies faculty copies of the Foundation for Critical Thinking’s Miniature Guide to Critical Thinking: Concepts and Tools for each student in their classes.
  3. Offered a 7-day faculty seminar in the summer of 2011 on promoting and teaching critical thinking—Think! Critically, Quantitatively, and Creatively.

Immediate Future Actions:

  1. Convene (October 2011) a new Critical Thinking Advisory Group with the charge of (1) creating a concise handout on critical thinking for use in GSTR courses and, when appropriate, across the curriculum, and (2) creating or modifying an existing critical thinking rubric to assist instructors in targeting students’ specific needs in this area.
  2. Design regular, ongoing faculty development workshops for the General Studies Core Courses, with sessions shaped by the data on our students’ critical thinking skills.

An area that we have not directly assessed since 2004 is quantitative reasoning. This is due in part to a continuing discussion about what, precisely, we expect of our students in this area, with some faculty calling for proficiency in basic mathematics through algebra and geometry and others emphasizing the ability to recognize the proper and improper use of quantitative information more than formal mathematical ability. In an attempt to provide some data to inform this discussion, the Writing, Thinking, and Research Assessment Project included an audit of the quantitative reasoning in the papers assessed. In 35 percent of the GSTR 210 research projects, quantitative information was deemed centrally relevant by faculty assessors, and 58 percent of those projects were judged to make proficient use of quantitative information. In 54 percent of the GSTR 410 research projects, quantitative information was deemed centrally relevant, and of those, 66 percent were judged to make proficient use of quantitative information. The audit was particularly useful in identifying the large number of student projects in which faculty assessors considered quantitative reasoning important, necessary, or useful evidence for defending the stated thesis. These data, along with anecdotal evidence about weak quantitative reasoning among a significant portion of our students, led to the actions described below, which treat quantitative reasoning as an aspect of critical thinking.
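
Because the audit’s proficiency figures are conditional (they are computed only over the projects in which quantitative information was judged centrally relevant), the following worked example shows how the two percentages combine. The total of 200 projects is an assumption for illustration; the actual project count is not reported here.

    # Assumed total of 200 GSTR 210 projects (illustrative, not the real count).
    total = 200
    relevant = round(0.35 * total)       # 35% centrally relevant -> 70 projects
    proficient = round(0.58 * relevant)  # 58% of those proficient -> ~41 projects
    print(relevant, proficient)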

Actions Taken:

  1. Reviewed standardized quantitative reasoning assessment instruments that might be administered to provide baseline data for a fuller assessment of students’ quantitative reasoning skills.
  2. Convened an ad hoc group of faculty from across disciplines to promote quantitative reasoning across and beyond the curriculum. This informal group—the Society of Quantitative Inquiry and Reasoning Learners (SQuIRLs)—was convened by the Director of Institutional Research and Assessment in the fall of 2010. They reviewed Carleton College’s model of promoting quantitative reasoning—QuIRK (Quantitative Inquiry, Reasoning, and Knowledge)—and discussed ways of promoting quantitative reasoning across campus and the possibility of creating a repository of quantitative reasoning assignments for use in courses across the curriculum. In November 2010, SQuIRLs sponsored a campus “happening” to pique student interest in science, engineering, and quantitative reasoning. Bill Cloyd, founder of Newton’s Attic, an organization that inspires young people to embrace science and math, brought his roller coaster device to campus, assembled it on the quad, and invited students and faculty to make quantitative predictions before being launched down the track. This pilot project is still in the experimental stage.
  3. Organized campus presentations and discussions on quantitative reasoning led by Professor Neil Lutsky, one of the founders of Carleton’s QuIRK initiative, on April 12-13, 2011.
  4. Included quantitative reasoning as a significant element in the faculty development seminar summer of 2011: Think! Critically, Quantitatively, and Creatively.

2) Research Skills and Information Literacy

Assessment of the research skills of our students occurs on many levels. In the first instance, we have focused on the assessment of bibliographies from student papers, which should give some indication of our students’ research skills.

In the summer of 2008, more than 200 research papers written by students in GSTR 210: Identity and Diversity in the United States were analyzed by a team of librarians to discern patterns in students’ research. That audit of the papers’ bibliographies revealed that students were generally adept at finding and incorporating high-quality sources into their research; however, it also showed that scholarly journals were used less frequently than faculty expected and desired.

Action Taken:

  1. In response to this study, librarians modified the bibliographic instruction program to provide each student in GSTR 210 with a more focused introduction to, and practice in, locating scholarly journal literature relevant to their research topic in 2008-2009.

A second round of assessing bibliographies of nearly 370 first-year and senior papers in 2009 showed that students used a greater proportion of high-quality research materials, with the use of scholarly journal literature up by 50 percent.

Further assessment of student research skills was conducted in the context of the Writing, Thinking, and Research Assessment Project in the summers of 2010 and 2011. Evidence from the criteria analysis of student papers suggests that bibliographic instruction offered through our five GSTR Core Courses and through dedicated library instruction is effective. Student scores on Research/Source Material were the strongest among all criteria, with 80 percent of both GSTR 210 and GSTR 410 students judged proficient in this area and only about 5 percent rated as seriously deficient. Although Berea students are generally able to find and cite appropriate sources, weaknesses in the use of sources lead faculty assessors to remain concerned about students’ ability to demonstrate critical thinking.

Additional assessment of our students’ research dispositions, understandings, and habits was conducted using the Higher Education Data Sharing (HEDS) consortium’s Research Practices Survey (RPS). HEDS describes the survey as follows:

This 15-minute survey explores the experiences and opinions of college students concerning academic research. Its purposes are to (1) study students’ research habits, (2) use these findings to improve the ways we help students develop their research skills, and (3) determine what changes occur in research abilities as students progress through their academic careers.

The RPS has been promoted by the National Institute for Technology in Liberal Education (NITLE) and has been supported and implemented by several dozen leading liberal arts colleges, including Carleton College, Knox College, Lewis and Clark College, St. Olaf College, Swarthmore College, Trinity University, and Wellesley College. HEDS reports comparative data as well as raw scores to provide participating institutions with a framework for interpreting their scores. The response rate for the spring 2009 administration of the RPS was disappointing, with only 33 first-year students (9 percent response rate) and 29 seniors (8 percent response rate) completing the survey. Those response rates, which ranked among the lowest of all institutions that participated in the RPS survey that semester, were not sufficient to justify any immediate change in our bibliographic instruction program. This survey remains a promising instrument for gathering data that can be used to improve student research and information literacy. Learning from our mistakes, the Director of General Education discussed assessment goals with librarians, and together they planned to work with Course Coordinators of GSTR 110 and GSTR 210 to ensure greater student participation.

Actions Taken:

  1. Assessed first-year students’ research skills and knowledge in 2010-2011 with the Higher Education Data Sharing (HEDS) Consortium’s Research Practices Survey (RPS). The survey is given to first-year students early in the fall semester and again late in the spring semester. This pre-/post-survey format provides a fuller picture of how students’ research skills develop in the first year, informs the bibliographic and research instruction offered in the first-year writing seminars, and allows for comparison with students from other participating colleges.
  2. Expanded assessment of student research in GSTR courses in the summer of 2011 to see if the trend toward greater use of varied, high-quality sources continues. This analysis focused on research essays from GSTR 310: Understandings of Christianity.
  3. Administered the SAILS (Standardized Assessment of Information Literacy Skills) test to transfer students who had waived GSTR 110. The results, once analyzed, will be shared with the coordinator of the GSTR 210 course and used to determine if transfer students who did not take GSTR 110 have adequate research skills.

In accordance with Action 1 above, the Research Practices Survey (RPS) was administered at the beginning of the fall 2010 semester. Recruiting GSTR 110 and GSTR 210 Course Coordinators to communicate to faculty the importance of the survey and to enlist their assistance in encouraging students to participate was generally successful. The response rate was over 60 percent, with 270 students completing the survey; by comparison, in the spring of 2009 only two of the 36 participating institutions had a response rate of more than 50 percent. The response rate for the late-spring 2011 administration of the survey to the same cohort of students was also good. Unfortunately, we are unable to discuss the results in this report because they have not yet been received from the administrators of the RPS, the Higher Education Data Sharing consortium. When they are received, we will use the results to (1) assess changes in students’ research attitudes and abilities during their first year at Berea, and (2) adjust our bibliographic instruction program as necessary to enhance student learning.

The bibliographic analysis of GSTR 310 research essays, referred to in Action 2 above, included an examination of 1,600 citations from 178 GSTR 310 papers submitted during the 2010-11 academic year. Data collected included the age of each source, its format and type, and the biblical translation used. The analysis suggested that students had difficulty accessing and appropriately using relevant source material. In response, the library will purchase new materials to help students meet the requirements of the GSTR 310 research essay. These resources will be tailored to the topics of GSTR 310 projects and will be more appropriate to the level of students’ understandings of Christianity. Resource guides have also been created to increase awareness of what is readily available in existing print, electronic, multimedia, and primary source collections.

Further assessment in the area of research skills and information literacy was undertaken recently with respect to a new cohort of students who enrolled in the fall of 2010 (see Action 3 immediately above). In the fall of 2010, for the first time, transfer students who had earned a B or higher in a composition course taken elsewhere were allowed to waive GSTR 110 and to begin in GSTR 210. Such a waiver raised concerns, as these students would not have had the required training that our own first-year students receive through their enrollment in GSTR 110. Some GSTR 210 instructors noticed that a number of transfer students had little library research experience. Our reference librarians consulted with the GSTR 210 faculty and agreed to assess each transfer student’s research abilities using the Project SAILS instrument (the Standardized Assessment of Information Literacy Skills, developed at Kent State University in 2001), “a knowledge test with multiple-choice questions targeting a variety of information literacy skills.” The test items are based on the Association of College and Research Libraries (ACRL) “Information Literacy Competency Standards for Higher Education.”

Review of the Project SAILS instrument resulted in the identification of 17 questions that measured competencies in the information discovery process and the evaluation of resources, the two primary information literacy concepts that librarians generally introduce to students in the library research training associated with GSTR 110.

The data on these questions indicate that transfer students are not proficient in the information discovery process or the evaluation of resources. Follow-up action on this issue has consisted of relaying the results to the GSTR 210 faculty who teach in the fall (the semester in which that Core Course is especially populated by transfer students) and encouraging these faculty to include additional library training and time dedicated to the development of research skills.

3) Writing

Berea has tried several means of identifying students who may need writing instruction and support beyond our two required first-year writing seminars. From the fall of 2006 to the spring of 2009, a Writing Competency Examination was administered each semester. Students who did not pass the examination by the end of their second semester were required to complete an additional composition class—GST 150: College Composition—by the end of their third semester. Because of concerns about the reliability of exam scoring, the timing of exams, and the investment of time and resources, the Director of General Education charged a review team in the fall of 2008 with a thorough assessment of the writing competency exam and requirement. The Committee on General Education (COGE) reviewed the team’s report and proposed the following to the College Faculty, which approved the proposals in the spring of 2009:

Actions Taken:

  1. Eliminated the Writing Competency Exam and requirement.
  2. Identified students needing additional writing instruction on the basis of standardized test scores received prior to arrival on campus. Preliminary data suggest that earlier identification of students with basic writing needs and placement of those students in GST 150 has a positive impact on retention and, by implication, on student learning.
  3. Implemented an end-of-the-semester assessment of GSTR 110 students’ writing by GSTR 110 instructors based on the rubric developed for the Writing Competency Examination. Included in the assessment are recommendations for additional writing instruction for students receiving a score of 1 or 2 (Sample GSTR 110 writing assessment form).

Rubric-based Assessment of Research Projects in GSTR 210 and GSTR 410

Further analysis and assessment of student writing began in June of 2010. Fourteen faculty members participated in a week-long assessment project in which they refined a rubric and analyzed several hundred student research projects from GSTR 210 and GSTR 410. One finding from this Writing, Thinking, and Research Assessment Project is discussed above in the section on critical thinking.

The study confirmed the high proficiency of Berea students on a number of criteria but also revealed the need for continuing work in helping students become more sophisticated in their reasoning and use of quantitative information. As illustrated in Table 8, students (first-years and seniors) were rated strongest on the identification and use of relevant research material, followed by general language use, use of style conventions, and organization and structure, with formulation of an appropriate thesis and reasoning the weakest for each group (though their order was reversed for GSTR 210 and GSTR 410). The difference in proficiency scores between the two cohorts is never more than 5 percent on any criterion.

Several things are noteworthy about the data in Table 8:

  • Two of the three strongest scores were in language use and the use of style conventions, which is not surprising. Those are skills that are expected in a wide range of classes and in various writing assignments, and thus are areas in which most students get practice and feedback.
  • Students scored the highest on research and source materials, which is likely due to two factors. First, improved access to scholarly materials via the Internet, especially with the development of Google Scholar and Google Books, makes it easier for even inexperienced students to locate high-quality materials. Second, greater ease of access is complemented by Berea’s long-standing commitment to bibliographic instruction tailored to the particular needs of students in both GSTR Core Courses and classes in the major.
  • The average proficiency score for both cohorts of students on the three criteria mentioned above is approximately 75 percent, while the average proficiency score on the other three criteria, which call for focused thinking specific to each project, is under 60 percent. That difference is important and needs additional attention.
  • The biggest surprise—and disappointment—in these data is that if proficiency scores are averaged for all criteria, then the seniors’ average is less than one full percentage point higher than first-year students’ (67.8 for seniors; 67.1 for first-year students), and in two areas—Purpose/Thesis and Organization/Structure—the senior proficiency scores are lower than those of the first-year students.

This finding led to a number of informed theories among faculty-participants of the assessment project:

  • Instructors in GSTR 210 give substantial and sustained attention to the research project with feedback to students at several stages of the process. There tends to be much less of this type of instruction in GSTR 410 where the expectation of teaching faculty is that students have become adept at basic aspects of research writing.
  • GSTR 410 projects are likely to be more ambitious in terms of scope. Students often explore very complex topics that may exceed their ability to handle in a rigorous and effective way.
  • GSTR 410 projects can take the form of creative projects that can be hard to judge using the rubric criteria designed primarily for traditional research essays.
  • Many seniors may lack the motivation to do their best work for a GSTR course in their final year—and often their final semester—of college and perhaps dedicate the bulk of their energies to capstone experiences in their major.

Actions Taken:

  1. Designed a follow-up assessment of student writing, thinking, and research for the summer of 2011 (see overview results below). Like the Writing, Thinking, and Research Assessment Project of 2010, the 2011 version included more than a dozen faculty members from across the disciplines. The 2010 rubric was used nearly without modification, and the 2011 project included research essays from GSTR 310 (the third semester—sophomore level—Core Course in the GSTR core curriculum), as well as essays from GSTR 210 and GSTR 410.
  2. Promoted, and continue to promote, the use of the rubric from the assessment project across campus in both departmental and General Studies courses.
  3. Organized a showcase of GSTR 410 projects for the fall of 2010 and spring of 2011 that included approximately 35 research projects, with 15-20 non-presenters in attendance along with eight faculty and staff. By highlighting student work in this way and others (see point 4 below), more students may be inspired to do their best work and to see how they and others can take advantage of an upper-division General Studies seminar to explore complex and meaningful subjects.
  4. Established an online journal (August 2011)—Future Tense—where the best GSTR 410 projects from each year are published.

We have attempted to evaluate the impact of some of these actions by examining the results of the 2011 iteration of the Writing, Thinking, and Research Assessment Project. In this context, Table 9 gives results to guide the analysis of student skills and suggests modified assessment practices. As indicated above, the 2010 initiative suggested that student skills were perhaps not progressing as much as desired; the assessment project did not clearly show skill development from first-year work to senior work. However, one confounding factor may have been that the 2010 evaluation of GSTR 410 projects was muddied by the inclusion of some non-traditional projects. For the 2011 assessment project, faculty assessors segmented traditional and non-traditional projects and discovered that students’ measured abilities appeared slightly higher for seniors in 2011 than for 2011 first-years. However, Table 9 also shows that students’ scores on “organization and structure,” “reasoning and evidence,” and “language use,” as measured by the faculty assessors, were lower in 2011 than in 2010, even after the actions described above.

Table 9

Nevertheless, such a result must be carefully interpreted, as there is no reason to believe that the faculty assessment group measured student writing in the same way as the previous year’s group. The assessment project has been used both as a faculty development tool and as an assessment instrument. As such, only some faculty functioned as repeat assessors, and training in the use of the rubric is not standardized with the intention of achieving consistent measurement across years. Faculty feedback regarding the developmental aspect of the assessment project has been overwhelmingly positive, so we do not wish to fundamentally alter the project in overly restrictive ways. However, following the second iteration of the project, we still need to determine whether the observed differences in Table 9 are the result of a qualitative difference in student projects, a difference in the evaluative process, or both. For the third iteration of the assessment project, we will include papers and projects from previous years, which will allow us to determine how much the evaluative process differs from year to year.

Table 9 also includes a first-time assessment of research papers from GSTR 310: Understandings of Christianity. Including this additional analysis has proven useful in that we have verified that student performance on this sophomore-level paper is generally below the averages that we find in the first year. Student performance on all six criteria is disappointingly low in the evaluation of GSTR 310 papers. Assessors in 2011 found that many of the papers from GSTR 310 were short and not well developed.

Action Taken:

  1. Informed by the finding of relatively weak papers, the GSTR 310 teaching faculty have restated (1) the importance of intentionally teaching research writing in GSTR 310, and (2) the importance of treating writing in GSTR 310 as the third step in an institution-wide writing instruction program.

Future Actions:

  1. Design a follow-up assessment of student writing, thinking, and research for the summer of 2012. Include (blindly, from the assessors’ perspective) papers from previous years, providing the ability to control for differences across yearly faculty assessment groups and allowing more precise measurement of changes, for better or worse, in student skills.
  2. Promote the use of the rubric used in the 2010 and 2011 projects for assessment of departmental capstone projects. Follow up and collect results.

GSTR 410 was designed as a team-taught, intentionally experimental and innovative course that allows students to satisfy requirements by conducting non-traditional research projects. Students have completed art shows, photo essays, documentaries, and a wide variety of research-based projects. Although the medium varies, we still intend for students to demonstrate, through the completion of a project—traditional or not—a high level of research skill and critical thinking through sustained engagement with difficult and challenging material. Although we have had success in this area, we are still developing instruments to assess effectively the learning associated with this course and its projects. Figure 10 contains information from a brief evaluation in which faculty assessors registered agreement or disagreement with five statements linked to GSTR 410 learning goals; it may provide a basis for developing improved assessment mechanisms.

Figure 10 gives the results of faculty assessors’ evaluations of creative projects across four categories.

Figure 10

Overall, the results from this brief assessment are preliminary, but they indicate that although the research projects are generally centered on issues of global significance, they rarely provide a research-based historical overview. In addition, the creative statements submitted with the projects generally did not describe the nature of the research in detail.

Action Taken:

  1. GSTR 410 faculty instructors discussed the above results and agreed in August 2011 to develop improved rubrics for the evaluation of non-traditional research projects. Further, GSTR 410 faculty reaffirmed a commitment to collect a text-based statement to accompany any non-traditional research project.

4) Scientific Knowledge and Inquiry

GSTR 332: Scientific Knowledge and Inquiry is the chief means by which Berea ensures that all students have a rich and rigorous engagement with the natural sciences. Until recently, assessments of the course focused on students’ knowledge of particular content and showed wide variance in knowledge, often associated with instructors’ areas of expertise and particular emphases. Also, Instructor Evaluation Questionnaire (IEQ) data indicated that students were often dissatisfied with the class: science majors frequently claimed it was repetitious, while non-science majors frequently found the course content too difficult and unengaging. During the 2008-2009 academic year, a group of course instructors met to review the course and to make recommendations regarding the natural science requirement, the structure of the course, and ways of assessing students’ comprehension of both science content and science as a way of knowing.

Actions Taken:

  1. An alternative way of satisfying the General Studies natural science requirement was proposed by the Committee on General Education and approved by the full faculty. Beginning in 2009-2010, students could take two classes focused on the natural sciences (at least one of which is a lab course) in two different disciplines in lieu of GSTR 332. This essentially allows science majors to waive the class and thus should allow instructors to design the course and its assignments for a somewhat narrower range of student interests and backgrounds.
  2. Instructors were offered the chance to pilot a different course structure. This alternative takes a thematic approach, requiring students to consider a complex topic from multiple natural science disciplines without requiring the instructor to cover all the major topics in the original GSTR 332 course description. One instructor opted for the thematic approach in spring 2010 and offered a course focused on the theme of migration.
  3. A new three-part assessment instrument was piloted in spring 2010. That tool included questions designed to measure students’ dispositions toward science, a multiple-choice section on fundamental scientific concepts, and a concise essay asking students to synthesize information from two natural science disciplines.

The assessment instrument mentioned in (3) above was administered at the end of spring 2010. Three faculty members reviewed responses to all the exams and wrote a detailed report of their findings. In general, this assessment suggested that the instrument, while needing some modifications, is a more promising approach to course assessment than the content-based multiple-choice test used in the past. Students in the pilot thematic section did well on the exam relative to other students. The assessment also showed that students had difficulty synthesizing their knowledge of major scientific theories: in the faculty readings of the assessment essays, most students were rated a three on a five-point scale. The data detailed in Table 10 highlight a continued need to develop the means to ensure that students acquire a deep understanding of the major scientific theories and can demonstrate that understanding in novel situations. In short, students were asked to show integrative understanding in the essay but failed to do so at the level we had hoped.

Table 10

2009-2010 Academic Year: Distribution of Evaluations across Three Skills

Distribution of Results Organized by Question

Rating Description Evidence Connection Overall
“Target” 1% 4% 3% 0%
“Acceptable” 24% 19% 16% 22%
“Struggling” 46% 47% 39% 50%
“Limited” 24% 24% 34% 23%
“None” 4% 5% 8% 5%

Actions Taken:

  1. Reported and discussed findings at the annual fall retreat for science faculty.
  2. Offered the thematic section on migration again in 2010-2011 to see if the promising results from that section are replicated.
  3. Modified the instrument to address weaknesses noted in the assessment report and included quantitative reasoning tasks.
  4. Re-administered the assessment instrument in both semesters of 2010-2011.

Results from the 2010-2011 formal assessment of dispositions, theoretical scientific knowledge, and integrative scientific thinking (see Action 4 above) suggest that students are still not demonstrating an ability to understand course content and apply it. The GSTR 332 faculty members reported and discussed findings at the GSTR Annual Workshop in August 2011 and agreed on a new assessment plan for 2011-2012 that includes pre- and post-tests on dispositions using the Epistemological Beliefs Assessment for Physical Science (EBAPS) instrument, a modified knowledge-of-science instrument, and a modified integrative essay.

5) Student Engagement (as measured by the National Survey of Student Engagement)

Berea regularly solicits feedback from its students using both local and national instruments. Our Office of Institutional Research and Assessment is committed to gathering and interpreting those data thoughtfully and rigorously with the intent of changing our practices in light of relevant data. Among the most important survey instruments we use is the National Survey of Student Engagement (NSSE), which was given to Berea first-year students and seniors in the spring of 2003, 2007, and 2010. While NSSE data are used in many ways to shape the College’s thinking and practice, a specific example of their influence on our curriculum is our response to the 2003 survey’s data on academic challenge, which we continued to examine and monitor in 2007 and 2010.

For example, in 2003 Berea’s first-year students’ responses to questions related to the level of academic challenge were cause for concern. In the NSSE benchmark “Level of Academic Challenge,” students respond to questions that address the following characteristics of a school’s curriculum and culture:

  • Preparing for class (studying, reading, writing, doing homework or lab work, etc., related to academic program)
  • Number of assigned textbooks, books, or book-length packs of course readings
  • Number of written papers or reports
  • Coursework emphasizes: Analysis of the basic elements of an idea, experience, or theory
  • Coursework emphasizes: Synthesis and organizing of ideas, information, or experiences into new, more complex interpretations and relationships
  • Coursework emphasizes: Making of judgments about the value of information, arguments, or methods
  • Coursework emphasizes: Applying theories or concepts to practical problems or in new situations
  • Working harder than you thought you could to meet an instructor’s standards or expectations
  • Campus environment emphasizes: Spending significant amount of time studying and on academic work.

In 2003 Berea’s first-year students were at the 86th percentile on Level of Academic Challenge when compared with all other schools participating in NSSE that year. However, Berea’s first-year students were at the 40th percentile when compared with students from liberal arts colleges. That information was made available to the General Education Review Committee that was charged with proposing a new General Education Curriculum.

Action Taken:

  1. The first-year writing seminars—GSTR 110: Critical Thinking and the Liberal Arts and GSTR 210: Identity and Diversity in the United States—were designed to be rigorous introductions to college-level writing, thinking, and research. Critical thinking was highlighted in the first course, and focused attention on the research process was moved from the second year of our previous curriculum to the second-semester Core Course, GSTR 210.

NSSE results in 2007 and 2010 suggest that these changes contributed to greater academic challenge for our first-year students. Berea’s benchmark score for Level of Academic Challenge increased by more than two full points in the first year of the new General Education program and rose again, more modestly, in the 2010 administration of NSSE. In 2010 the Level of Academic Challenge score for our first-year students was significantly higher than that of our Carnegie peers (p < .05). While Berea’s score was slightly lower than the average score of the top decile of NSSE participants, that difference was not statistically significant.

Summary and Future Plans

Intentional and meaningful assessment remains a top priority for Berea College. The period 2008-2011 was one of growth and innovation as Berea College worked to develop meaningful, sustainable, and systematic assessment programs, and the process will continue. Current assessment stakeholders at Berea College are highlighting the potential assessment value of student e-portfolios. We are already collecting work from three of the five Core Courses in the General Education Program. Student e-portfolios might simplify the collection of artifacts demonstrating core skills and support deep assessment of year-to-year student development. Immediate action to lay the foundation for the implementation of such a program is an institutional priority.

Supporting Documents and Evidence

  1. Berea College 2011-2012 Catalog and Student Handbook, Academic Program (Website) <http://www.berea.edu/cataloghandbook/academics/default.asp>
  2. Labor Program Office (Website) <http://www.berea.edu/laborprogramoffice/>
  3. General Education Program (Website) <http://www.berea.edu/generalstudies/>
  4. Departmental Effectiveness Reports, June 2011
  5. Self-Study Reports
  6. Learning Goals for Academic Programs at Berea College, as of August 2011
  7. Schedule of Self-Studies for Academic Majors
  8. Departmental Self-Study Guidelines, July 2010
  9. Berea College Faculty Manual, Academic Program Council (Website) <http://www.berea.edu/facultymanual/>
  10. Departmental Data Reports, Fall 2010
  11. National Survey of Student Engagement (NSSE) (Website) <http://www.berea.edu/ira/documents/surveysstudies/nsse2010report.pdf>
  12. ACT Alumni Survey (Website) <http://www.berea.edu/ira/documents/20090828-alumnisurvey.pdf>
  13. Berea-Specific Alumni Survey, Spring 2010 (Website) <http://www.berea.edu/ira/documents/AlumniSurveyReportSpring2010.pdf>
  14. Instructor Evaluation Questionnaire (IEQ) Instrument
  15. Berea College Faculty Manual, Tenure Review Standards (Website) <http://www.berea.edu/facultymanual>
  16. Academic Department Graduating Senior Survey Instruments
  17. National Council Licensure Examination (NCLEX) (Website) <https://www.ncsbn.org/nclex.htm>
  18. Educational Testing Service (ETS) Major Field Test in Business (Website) <http://www.ets.org/Media/Tests/PRAXIS/pdf/0100.pdf>
  19. American Chemical Society (ACS) in Chemistry (Website) <http://portal.acs.org/portal/acs/corg/content>
  20. PRAXIS for teacher certification (Website) <http://www.ets.org/praxis>
  21. Being and Becoming: The Strategic Plan for Berea College, Revised May 2011; Engaged and Transformative Learning
  22. Berea College 2011-2012 Catalog and Student Handbook, Labor Program (Website) <http://www.berea.edu/cataloghandbook/college/laborprogram/introduction.asp>
  23. Great Commitments of Berea College (Mission) (Website) <http://www.berea.edu/about/mission.asp>
  24. The Labor Program at Berea College (Website) <http://www.berea.edu/laborprogram/>
  25. Berea College 2010-2011 Fact Book, Labor Departments
  26. Work-Learning-Service (WLS) Descriptions
  27. Labor Program Student Survey Feedback (Website) <http://www.berea.edu/ira/documents/topical/LaborProgram.pdf>
  28. Work Colleges Consortium (Website) <http://www.workcolleges.org/>
    1. Shared research (Website) <http://www.workcolleges.org/node/34>
  29. ACT Alumni Outcomes Survey, Administered to Work Colleges (Website) <http://www.berea.edu/ira/documents/surveysstudies/WCC_AlumniOutcomesReport.pdf>
  30. Berea College Faculty Manual, Strategic Planning Council (Website) <http://www.berea.edu/facultymanual/>
  31. The Report of the Labor Review Team to the Strategic Planning Council
  32. Recommendations from the Report of the Labor Review Team
  33. Being and Becoming: The Strategic Plan for Berea College, Revised May 2011; Strategic Initiative on Student Labor
  34. Labor Transcript Sample
  35. Ongoing Training Support from the Labor Program
  36. Berea College Faculty Manual, Labor Program Council (Website) <http://www.berea.edu/facultymanual/>
  37. Sample Written Job Descriptions for the Student Labor Program
  38. Description of the Labor Evaluations
  39. Student Labor Evaluation Performance Expectations
  40. Student Labor Evaluation Form
  41. Connections between Workplace Expectations, Performance Expectations, and Labor Learning Goals
  42. Labor Learning Goals from the Report of the Labor Review Team
  43. Berea College Workplace Expectations
  44. Labor Experience Evaluation Form
  45. Labor Experience Evaluation Reports
    1. Academic Year 2009-2010
    2. Academic Year 2010-2011
  46. Labor Experience Evaluation Report, Supporting the Four Core General Education Goals
  47. Student Labor Evaluation Report, Academic Year 2009-2010
  48. Individual Labor Department Evaluation Report Examples
  49. Berea College 2011-2012 Catalog and Student Handbook, General Studies Courses (Website) <http://www.berea.edu/cataloghandbook/academics/academicprogram/gep/reqs.asp>
  50. Student Learning Outcomes and Descriptions of Required General Studies Courses
    1. GSTR 110: Writing Seminar I: Critical Thinking in the Liberal Arts
    2. GSTR 210: Writing Seminar II: Identity and Diversity in the United States
    3. GSTR 310: Understandings of Christianity
    4. GSTR 332: Scientific Knowledge and Inquiry
    5. GSTR 410: Contemporary Global Issues
  51. Committee on General Education (Website) <http://www.berea.edu/generalstudies/coge.asp>
  52. Berea College 2011-2012 Catalog and Student Handbook, General Studies Requirements
    1. Perspective Areas (Website) <http://www.berea.edu/cataloghandbook/academics/academicprogram/gep/6persp.asp>
    2. Practical Reasoning Courses (Website) <http://www.berea.edu/cataloghandbook/academics/academicprogram/gep/prr.asp>
    3. Health and Physical Education (Website) <http://www.berea.edu/cataloghandbook/academics/academicprogram/gep/lifetimehealth.asp>
    4. Active Learning Experience (Website) <http://www.berea.edu/cataloghandbook/academics/academicprogram/gep/ale.asp>
  53. Aims of General Education (Website) <http://www.berea.edu/generalstudies/aims.asp>
  54. Being and Becoming: The Strategic Plan for Berea College, Revised May 2011
  55. Learning Outcomes Assessment
  56. Committee on General Education Report to the Faculty, December 2010
  57. California Critical Thinking Disposition Inventory (CCTDI) (Website) <http://www.insightassessment.com/Products/Critical-Thinking-Attributes-Tests/California-Critical-Thinking-Disposition-Inventory-CCTDI>
  58. ACT College Assessment of Academic Proficiency (CAAP) (Website) <http://www.act.org/caap/>
  59. Report of the Critical Thinking Advisory Group (CTAG), 2007-2008
  60. Writing Assessment Instruments for GSTR 210, GSTR 310, and GSTR 410
  61. QuIRK (Quantitative Inquiry, Reasoning, and Knowledge) Initiative (Carleton College’s Model of Promoting Quantitative Reasoning (Website) <http://serc.carleton.edu/quirk/>
  62. Newton’s Attic (Website) <http://www.newtonsattic.com/>
  63. Summary of Bibliometric Studies in 2008 and 2009
  64. Higher Education Data Sharing (HEDS) Consortium (Website) <http://www.e-heds.org/>
  65. Research Practices Survey (RPS) Instrument
  66. National Institute for Technology in Liberal Education (NITLE) (Website) <http://www.nitle.org/>
  67. Project SAILS (Standardized Assessment of Information Literacy Skills) (Website) <https://www.projectsails.org/>
  68. Report of the Writing Review Team: A Proposal for a College Writing Expectation
  69. Sample GSTR 110 End-of-the-Semester Writing Assessment Form
  70. Final Report, August 2010: Assessment of Student Writing, Thinking, and Research
  71. Online Journal, Future Tense (Website) <http://community.berea.edu/futuretense/>
  72. Final Report, June 2011: Assessment of Student Writing, Thinking, and Research
  73. Report of the Ad Hoc Committee Reviewing the Standard Assessment for GSTR 332
  74. GSTR 332 Standard Essay Prompt
  75. Epistemological Beliefs Assessment for Physical Science (EBAPS) Instrument (Website) <http://www2.physics.umd.edu/~elby/EBAPS/home.htm>
  76. National Survey of Student Engagement (NSSE)
    1. Berea College Trends for Level of Academic Challenge
    2. 2010 Comparison of Berea College to Carnegie Peers: Level of Academic Challenge
