Understanding the Domains and Defining the Dimensions
Understanding the Domains
The conceptual framework used in this tool kit consists of eight domains.
Seven emerged from the literature review and the national institutional
survey results from Phases I and II. An additional domain, Integration
with Academic Management and Educational Improvement, which represents
how student assessment is linked to these two important institutional processes,
was added during the Phase III case studies. The eight domains are:
- Institutional Context
- External Influences
- Institutional Approaches
- Institution-wide Strategy, Support, and Leadership
- Assessment Management Policies and Practices
- Integration with Academic Management and Educational Improvement
- Institutional Culture
- Uses and Impacts of Student Assessment Data
Uses and Impacts
The Uses and Impacts of Student Assessment Data domain was treated
as the dependent variable in our research and identifies how the institution
uses student assessment data in making academic decisions and how it impacts
institutional behavior or performance. The uses component of this domain examines whether
student assessment information is used to make academic decisions in two
areas: educational and faculty decisions. Educational decisions include
those in areas such as academic program planning, curriculum, academic
support services, and academic resource allocation. Faculty decisions
measure the extent to which assessment information is used in institutional
decisions regarding faculty promotion, tenure, and salary increases or
rewards.
The impact of student assessment was examined in three basic areas: student-related
impacts, faculty-related impacts, and external indicators. Four measures of
the impact of student assessment on students were used: student satisfaction,
retention and graduation rates, grade performance, and achievement on
external examinations. Four items served as measures of the impact of
student assessment on faculty: campus discussions of undergraduate education,
faculty satisfaction, interest in teaching, and changes in teaching methods.
Finally, seven items measured the impact of student assessment information
on external indicators of institutional performance: student applications
or acceptance rates, allocation of state funding, private fund-raising
results, success on grant applications, regional accreditation evaluation,
communications with external constituents, and institutional reputation.
(See description of the dimensions (variables) for this domain.)
Institutional Context
The Institutional Context domain includes dimensions or characteristics
of each institution that the literature suggests influence the
nature of student assessment on campus. These include size,
control (public or private), and Carnegie classification. Analysis of
the Phase II National Institutional Survey Data confirmed substantial
differences among institutions on these dimensions. When the case study
institutions were chosen for Phase III, they were selected to represent
variations on these dimensions. (See description of the dimensions (variables)
for this domain.)
External Influences
The domain of External Influences was included in the model to assess
the extent to which various external groups influenced each institution
at the time the institution initiated its assessment efforts. External
groups included state governments, accreditation agencies, professional
associations, foundations or donors, and business or employer groups.
State influence was reflected in state initiatives, state approaches,
state plans, state reporting requirements, state review methods, state
review criteria, or other state purposes related to student assessment.
Accrediting influence was reflected in the agencies' purposes, types of
requirements, or reporting and follow-up activities. (See description
of the dimensions (variables) for this domain.)
Institutional Approaches to Student Assessment
The Institutional Approaches to Student Assessment domain identifies
several areas that describe the nature of the institution's approach
to student assessment. These include the content or type of student assessment
measures the institutions use; the extent or comprehensiveness with which
they collect data on students' cognitive, affective,
and post-college development or performance; and the timing of the student
assessment measures used by the institution. Three variables measured
methods of student assessment: the number of instruments (comprehensive
tests or inventories) used, use of integrative or performance-based assessment
methods (student-centered methods), and use of methods involving external
constituencies (external methods). The analysis of student assessment
was measured by the type and number of studies of student performance
that were conducted and by the number of levels at which assessment data
are aggregated and reported (number of reports). (See description of the
dimensions (variables) for this domain.)
Institution-wide Strategy, Support,
and Leadership for Student Assessment
The Institution-wide Strategy, Support, and Leadership for Student Assessment
domain includes dimensions that examine broad areas of institutional commitment
to student assessment. Institution-wide Strategy focused on the emphasis
an institution gave to undergraduate education and to assessment in its
mission statement, whether or not it had a formal assessment plan or policy,
and the type of organizational structure for the student assessment effort.
Institution-wide Support for Student Assessment includes dimensions related
to the amount and type of administrative and faculty support for the institution's
student assessment effort and the administrative and governance activities
used to promote student assessment at the institution. Institution-wide
Leadership for Student Assessment includes the breadth and depth of the
leadership support among the administration and the faculty. A final dimension
examined whether an institution has formally or informally evaluated its
assessment process (conducted evaluation). (See description of the dimensions
(variables) for this domain.)
Assessment Management Policies and Practices
The domain of Assessment Management Policies and Practices is included
to assess whether nine areas of institutional management activity
designed to promote student assessment exist in the form of formal
policies and/or practices. The nine areas examined are: 1) whether student
assessment data are incorporated in academic planning and review processes,
2) use of student assessment data for budget decisions, 3) information
system for student assessment, 4) breadth of internal access to student
assessment information on individual students, 5) breadth of distribution
of student assessment reports, 6) policies promoting student involvement
in student assessment, 7) professional development policies on student
assessment for faculty and academic administrators, 8) professional development
policies on student assessment for student affairs personnel, and 9) policies
linking faculty evaluation and rewards to student assessment involvement.
(See description of the dimensions (variables) for this domain.)
Integration with Academic Management and Educational
Improvement
The domain for Integration with Academic Management and Educational Improvement
was not included in the Phase I and II conceptual framework but was added
as a result of the case studies. This domain examines the relationship
between an institution's student assessment efforts and the information collected
within these two processes or activities. Academic management includes
institutional processes such as strategic planning, academic program review,
budgeting, and quality improvement. With respect to educational improvement,
the domain examines whether student assessment
is a central fixture of institutional offices or processes designed for
instructional and teaching improvement, curricular and program design,
learning innovation, and faculty/professional development. (See description
of the dimensions (variables) for this domain.)
Institutional Culture and Climate for Student
Assessment
The final domain, Institutional Culture and Climate for Student Assessment,
is included to broadly reflect overall patterns of institutional approaches
to, support for, and use of student assessment as well as the identifiable
attitudes, rituals, and driving forces or beliefs behind the institution's
student assessment efforts. The focus and strength of the institution's
culture for student assessment is not only a reflection of an institution's
involvement with student assessment but also a useful indicator of its
commitment to use it for institutional, student, and academic improvement.
(See description of the dimensions (variables) for this domain.)
Defining the Dimensions
Each of the eight domains identified in the framework for this tool kit
consists of several dimensions (variables) that operationalize the various
organizational and administrative activities or characteristics within
that domain. Some of these dimensions consist of a single activity
or characteristic, but most include several. Most users of this tool kit
will probably want to review the activities or characteristics individually
by item. However, a factor analysis of data collected in the
national survey (Phase II) made clear that many of these institutional
activities or characteristics were closely related. Thus, several multi-item
dimensions combining similar activities or characteristics were created.
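To illustrate the kind of analysis involved, the following is a minimal,
hypothetical Python sketch of grouping correlated survey items into
multi-item dimensions with factor analysis. The data, item names, and
two-factor structure are invented for illustration and are not the
project's actual analysis; scikit-learn's FactorAnalysis is used here,
though the original work may have relied on different software.

    # Hypothetical sketch: grouping correlated survey items into
    # multi-item dimensions via factor analysis (invented data).
    import numpy as np
    import pandas as pd
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)

    # Simulate 4-point responses for six invented items from 300
    # institutions; items 1-3 and 4-6 share two underlying factors.
    latent = rng.normal(size=(300, 2))
    weights = np.array([[1.0, 1.0, 1.0, 0.0, 0.0, 0.0],
                        [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]])
    raw = latent @ weights + rng.normal(scale=0.5, size=(300, 6))
    items = pd.DataFrame(np.clip(np.round(raw + 2.5), 1, 4),
                         columns=[f"item_{i}" for i in range(1, 7)])

    # Items loading strongly on the same rotated factor are grouped
    # into one multi-item dimension.
    fa = FactorAnalysis(n_components=2, rotation="varimax", random_state=0)
    fa.fit(items)
    loadings = pd.DataFrame(fa.components_.T, index=items.columns,
                            columns=["factor_1", "factor_2"])
    print(loadings.round(2))

In this invented example, item_1 through item_3 load on one factor and
item_4 through item_6 on the other, so each trio would be combined into
a single multi-item dimension.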
Table 2 characterizes the dimensions (variables) by domain. The items
representing institutional activities or characteristics are primarily
from the Inventory of Institutional Support for Student Assessment (ISSA)
survey instrument. Other sources noted include the 1995 Integrated Postsecondary
Education Data System (IPEDS) and the Assessment of Teaching and Learning
for Improvement and Public Accountability: State Governing, Coordinating
Board and Regional Accreditation Association Policies and Practices (SAS) (Cole,
Nettles, & Sharp, 1997). Within each domain, the dimensions are classified
based on the type, source, definition, and scale where:
Type - refers to its categorization as a single item, additive
index, or factorially derived score.
Source - refers the user to the section, subsection, and item
numbers in the national survey (ISSA) used to identify or derive the item,
index, or factor.
Definition - details the activities or characteristics captured
by the item, index, or factor, and
Scale - identifies the possible range of responses for the item,
or items in the index or factor.
Where the dimension is the result of a factor analysis, the Cronbach
alpha signifying the internal consistency of the factor is shown.
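For readers who want to reproduce such reliability figures on their own
data, here is a minimal sketch of the standard Cronbach alpha formula
together with a simple additive index; the column names are hypothetical,
not the actual ISSA variable names.

    import pandas as pd

    def cronbach_alpha(items: pd.DataFrame) -> float:
        # Alpha = k/(k-1) * (1 - sum of item variances / variance of total),
        # for a table with one row per respondent and one column per item.
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_var / total_var)

    # Hypothetical usage: an additive index is simply the item sum;
    # a factorially derived scale would instead use factor scores.
    # cols = ["item_1", "item_2", "item_3", "item_4"]  # illustrative names
    # df["faculty_impacts"] = df[cols].sum(axis=1)
    # print(cronbach_alpha(df[cols]))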
______________________________________________________________________________
Table 2. Characteristics of Domains and Dimensions.
|
Domains and Dimensions
|
Dimension Type, Source, Definition and Scale
|
Institutional Uses and Impacts of Student Assessment
|
Educational decisions
|
Ten item factorially-derived scale. (V A 1-5, 8-12)
Reflects the influence of student assessment information in educational
decisions: revision of undergraduate academic mission or goals; designing
or reorganizing academic programs or majors; designing or reorganizing
student affairs units; allocating resources to academic units; modifying
student assessment plans, policies, or processes; revising or modifying
general education curriculum; creating or modifying student out-of-class
learning experiences; creating or modifying distance learning initiatives;
modifying instructional or teaching methods; modifying student academic
support services (1 = no action or influence known; 2 = action taken,
data not influential; 3 = action taken, data somewhat influential;
4 = action taken, data very influential). Cronbach alpha = .83. |
Faculty decisions
|
Two item factorially-derived scale. (V A 6-7) Reflects the influence
of student assessment information in faculty decisions: deciding
faculty promotion and tenure; deciding faculty salary increases
or rewards (1 = no action or influence known; 2 = action taken,
data not influential; 3 = action taken, data somewhat influential;
4 = action taken, data very influential). Cronbach alpha = .79.
|
Faculty impacts
|
Four item factorially-derived scale. (V B 1-4) Reflects
student assessment impacts on faculty: affected campus discussions
of undergraduate education; contributed to faculty satisfaction; contributed
to faculty interest in teaching; led to changes in teaching methods
used (1 = not monitored, do not know; 2 = monitored, negative impact;
3 = monitored, no known impact; 4 = monitored, positive impact). Cronbach
alpha = .79. |
Student impacts
|
Four item factorially-derived scale. (V B 5-8) Reflects
student assessment impacts on students: contributed to student satisfaction;
affected student retention or graduation rates; affected student grade
performance; affected student achievement on external examinations
(1 = not monitored, do not know; 2 = monitored, negative impact; 3
= monitored, no known impact; 4 = monitored, positive impact). Cronbach
alpha = .82. |
External impacts
|
Seven item factorially-derived scale. (V B 9-15) Reflects
student assessment impacts on external constituents: affected student
applications or acceptance rates; affected allocation or share of
state funding; affected evaluation from regional accrediting agency;
affected private fund-raising results; affected success on grant applications;
affected communications with external constituents; affected institutional
reputation or image (1 = not monitored, do not know; 2 = monitored,
negative impact; 3 = monitored, no known impact; 4 = monitored, positive
impact). Cronbach alpha = .82. |
Institutional Context
|
Enrollment
|
Single item. Reflects number of students enrolled in institution.
Data from IPEDS. |
Control
|
Single item. (1 = public; 0 = private). Data from IPEDS. |
Institutional type
|
Four dummy-coded single items. Reflects the institution's
Carnegie type (Associate of Arts, Baccalaureate, Doctoral, and Research;
Comprehensive was the omitted category; see the coding sketch following
Table 2). Data from IPEDS. |
External Influences
|
State initiative
|
Single item. Reflects whether the state's assessment
initiatives were guided by legislative or other means (1 = no state
plan; 2 = state policy; 3 = state statute; 4 = combination of policy
& statute). Data from SAS. |
State approach
|
Single item. Reflects whether states mandate
common indicators and outcomes (1 = no indicators or outcomes; 2 =
institutional specific; 3 = common for some; 4 = common for all).
Data from SAS. |
Accrediting association
|
Five dummy-coded single items. Reflects the institution's regional
accreditation association membership (Middle States; North Central;
New England; Southern; Western. Northwest region was the omitted region).
Data from IPEDS. |
Development of state plan
|
Single item. (III A 1) Reflects how state plan for student assessment
was primarily developed (1 = state; 2 = joint consultation between
state and institution; 3 = no state plan or requirement). |
State influence
|
Four single items. (III A 2 a-d) Reflect the influence of state requirements
on the institution's assessment activities: a = important reason to
initiate student assessment; b = increased the institution's involvement
in assessment; c = have not been a factor in assessment activities;
d = have been negative influence on assessment activities (1 = yes;
0 = no). |
State reporting requirements
|
Four single items. (III A 3 a-d) Reflect the state's reporting
requirements: a = evidence that assessment plan is in place; b = measurement
of state mandated indicators; c = use of institutionally devised indicators;
d = evidence of institutional use of assessment information (1 = yes;
0 = no). |
State review methods
|
Four single items. (III A 4 a-d) Reflect the method used by state
to review the institution's assessment activities: a = reviewed by
state officials; b = reviewed using external reviewers; c = required
institutional self-review; d = no review occurred (1 = yes; 0 = no). |
State review criteria
|
Five single items. (III A 5 a-e) Reflect the processes included in
the state review of the institution's assessment activities: a = review
of the institution's process itself; b = compare student performance record
with past record; c = compare student performance record with peer
institutions; d = compare student performance record with others in
state; e = other (1 = yes; 0 = no). |
Accrediting influence
|
Four single items. (III B 2 a-d) Reflect the influence of regional
accreditation agency requirements on the institution's assessment activities:
a = important reason to initiate student assessment; b = increased
the institution's involvement in assessment; c = have not been a
factor in assessment activities; d = have been negative influence
on assessment activities (1 = yes; 0 = no). |
Accrediting reporting requirements
|
Five single items. (III B 3 a-e) Reflect the regional
accreditation agency reporting requirements: a = evidence that assessment
plan is in place; b = intended uses of assessment information; c =
results of assessment; d = evidence of actual institutional use of
assessment information; e = unfamiliar with regional accreditation
requirements (1 = yes; 0 = no). |
External sources of support
|
Five single items. (III C 1 a-e) Reflect the sources
of support received to improve student assessment practices: a = FIPSE;
b = other federal agencies; c = state incentive program; d = private
foundation or corporate source; e = no known external grants (1 =
yes; 0 = no). |
Use of external services
|
Four single items. (III C 2 a-d) Reflect the use of
services offered by each of the following types of postsecondary organizations:
a = professional associations; b = regional accrediting association;
c = state-level agency; d = consortium of institutions. Respondents
could choose from the following services offered by each organization:
organization not used or not available; consultation services; assessment
conferences; training workshops; publications or research reports
(1 = used; 0 = not used). |
Internal purposes
|
Four item factorially-derived score. (II B 3-6) Reflects the importance
of internal institutional purposes for undertaking student assessment:
guiding undergraduate academic program improvement; improving achievement
of undergraduate students; improving faculty instructional performance;
guiding resource allocation decisions (1 = no importance; 2 = minor
importance; 3 = moderate importance; 4 = very important). Cronbach
alpha = .79. |
Accreditation purposes
|
Single item. (II B 1) Reflects importance of preparing for institutional
accreditation self-study as a purpose for undertaking student assessment
(1 = no importance; 2 = minor importance; 3 = moderate importance;
4 = very important). |
State purposes
|
Single item. (II B 2) Reflects importance of meeting state reporting
requirements as a purpose for undertaking student assessment (1 =
no importance; 2 = minor importance; 3 = moderate importance; 4 =
very important). |
Institutional Approaches to Student Assessment
|
Academic intentions
|
Single item. (I A 1) Reflects extent to which institutions
collect data on current students' academic intentions or expectations
(1 = not collected; 2 = collected for some students; 3 = collected
for many students; 4 = collected for all students). |
Basic college-readiness skills
|
Single item. (I A 2) Reflects extent to which institutions
collect data on current students' college-readiness skills (1
= not collected; 2 = collected for some students; 3 = collected for
many students; 4 = collected for all students). |
Cognitive assessment
|
Four item factorially-derived scale. (I A 3-6) Reflects
the extent to which institutions collect data on current students'
cognitive performance: competence in major field; general education
competencies; higher-order cognitive skills; vocational or professional
skills (1 = not collected; 2 = collected for some students; 3 = collected
for many students; 4 = collected for all students). Cronbach alpha
= .71. |
Affective assessment
|
Three item factorially-derived scale. (I A 7-9)
Reflects the extent to which institutions collect data on current
students' affective development and satisfaction: experiences
and involvement with institution; satisfaction with institution; personal
growth and affective development (1 = not collected; 2 = collected
for some students; 3 = collected for many students; 4 = collected
for all students). Cronbach alpha = .68. |
Academic progress
|
Single item. (I A 10) Reflects extent to which institutions
collect data on current students' academic progress (1 = not
collected; 2 = collected for some students; 3 = collected for many
students; 4 = collected for all students). |
Post-college assessment
|
Three item factorially-derived scale. (I A 11,12,14)
Reflects the extent to which institutions collect data from former
students: vocational or professional outcomes; further education;
satisfaction and experiences with institution after leaving (1 = not
collected; 2 = collected for some students; 3 = collected for many
students; 4 = collected for all students). Cronbach alpha = .83. |
Civic/social roles
|
Single item. (I A 13) Reflects extent to which institutions
collect data on former students' civic or social roles in the
community (1 = not collected; 2 = collected for some students; 3 =
collected for many students; 4 = collected for all students). |
Timing of data collection
|
Nine item additive index. (I A 1-9) Reflects when institutions collect
data (1 = not collected; 2 = collected at one point in time; 3 = collected
at entry and while enrolled, or while enrolled and at exit; 4 = collected
at entry and at exit; 5 = collected at entry, while enrolled, and
at exit). |
Number of instruments
|
Nine item additive index. (I B 1-9) Reflects student
assessment instruments (institutionally developed, state provided,
and commercially available) used by institution to collect nine types
of assessment information: student plans or expectations; basic college-readiness
skills; higher-order cognitive skills; general education competencies;
competence in major; vocational or professional skills; personal
growth and affective development; experiences or involvement with
institution; satisfaction with institution (1 = instrument used; 0
= instrument not used). |
Student-centered methods
|
Four item factorially-derived scale. (I C 1-4) Reflects the extent
to which institutions use innovative or nontraditional assessment
methods: performance in capstone courses; portfolios or comprehensive
projects; observations of student performance; individual interviews
or focus groups (1 = not used; 2 = used in some units; 3 = used in
most units; 4 = used in all units). Cronbach alpha = .61. |
External methods
|
Two item factorially-derived scale. (I C 8-9) Reflects
the extent to which institutions use assessment methods that collect data
from external constituencies: employer interviews or focus groups;
alumni interviews or focus groups (1 = not used; 2 = used in some
units; 3 = used in most units; 4 = used in all units). Cronbach alpha
= .63. |
Transcript analysis
|
Single item. (I C 5) Reflects extent to which institutions
use transcript analysis to collect student assessment information
(1 = not used; 2 = used in some units; 3 = used in most units; 4 =
used in all units). |
External examination
|
Single item. (I C 6) Reflects extent to which institutions
use external examinations to collect student assessment information
(1 = not used; 2 = used in some units; 3 = used in most units; 4 =
used in all units). |
Interviews of withdrawing students
|
Single item. (I C 7) Reflects extent to which institutions use interviews
with withdrawing students to collect student assessment information
(1 = not used; 2 = used in some units; 3 = used in most units; 4 =
used in all units). |
Student sub-populations
|
Four single items. (I D 1-4) Reflect the use of different
assessment methods for the following different student populations:
a = adult students; b = part-time students; c = minority students;
d = distance education students (1 = different method; 2 = same method). |
Number of studies
|
Nine item additive index. (I E 1-9) Reflects the number
of studies institutions conduct on the relationship between aspects
of students' institutional experiences and performance: course-taking
patterns; exposure to different teaching methods; patterns of student-faculty
interaction; extra-curricular activities; residence arrangements;
financial aid and/or employment; admission standards or policies;
academic advising patterns; classroom, library and/or computing resources
(1 = conduct study; 0 = do not conduct study). |
Number of reports
|
Five item additive index. (I F 1-5) Reflects the levels
of aggregation at which student assessment data are provided as reports:
institution-wide; schools or colleges; academic programs or departments;
special populations or subgroups of students; by course or groups
of courses (1 = report provided; 0 = report not provided). |
Institution-wide Strategy, Support, and Leadership for Student Assessment
|
Mission emphasis
|
Three item additive index. (II A 1 a-c) Reflects the institution's
mission statement emphasis on undergraduate education and its assessment:
emphasizes excellence in undergraduate education; identifies educational
outcomes intended for students; refers to student assessment as important
activity (1 = yes; 0 = no). |
Administrative and governance activities
|
Seven item additive index. (II C 1-7) Reflects the number
of administrative or governance activities used by institutions to
promote student assessment: annual institution-wide assessment forums
or seminars; rewards or incentives for administrators promoting use
of assessment in unit; incentives for academic units to use assessment
information; assessment workshops for administrators; board of trustees
committee addresses assessment; faculty governance committee addresses
assessment; student representation on assessment committees (1 = yes;
0 = no). |
Administrative and faculty support
|
Four item additive index. (II D 2-5) Reflects the degree
to which chief executive officer, academic and student affairs administrators,
and faculty support student assessment (1 = very unsupportive; 2 =
somewhat unsupportive; 3 = neutral or unknown; 4 = somewhat supportive;
5 = very supportive). |
Type of plan or policy
|
Seven single items. (II E 1 a-g) Reflect the institution's
plan or policy for student assessment: a = formally adopted plan or
policy requiring assessment activities for all academic units; b =
formally adopted plan or policy requiring assessment activities for
some academic units; c = formally adopted plan or policy requiring
all academic units to develop their own assessment plan; d = formally
adopted plan or policy stipulating institution-wide activities to
be conducted by central committee, office, or officer; e = has no
formal plan or policy but academic units are encouraged to conduct
their own assessment activities; f = is currently developing plan
or policy; g = does not have an assessment plan or policy (1 = yes;
0 = no). |
Formal centralized policy
|
Single item. (II E 1 a) Reflects institution has formal institutional
plan or policy requiring specified student assessment activities of
all academic units or programs (1 = yes; 0 = no). |
Institution-wide planning group
|
Single item. (II E 2) Reflects institution has institution-wide
group for student assessment planning and policy setting (1 = yes;
0 = no). |
Breadth of assessment planning group
|
Nine item additive index. (II E 3 a-i) Reflects the
number of internal members included in the institution's assessment
planning group: chief executive officer; academic affairs administrator(s)/staff;
student affairs administrator(s)/staff; institutional research administrator(s)/staff;
academic review and evaluation administrator(s) /staff; student assessment
administrator(s)/staff; faculty; students; other. |
Responsibility for planning group
|
Seven single items.
(II E 4 a-g) Reflect the internal members who have executive responsibility
for the institution-wide group responsible for planning or policy-setting
for assessment: a = academic affairs administrator; b = student affairs
administrator; c = institutional research officer; d = academic review
and evaluation officer; e = student assessment officer; f = faculty
member; g = other (1 = yes; 0 = no). |
Approval authority
|
Eleven single items. (II E 5 a-k) Reflect the internal
members who approve any changes to the institution's assessment plan or
policy: a = board of trustees; b = chief executive officer; c = chief
academic affairs officer; d = chief student affairs officer; e = institutional
research officer; f = academic review and evaluation officer; g =
student assessment officer; h = student government; i = academic senate
or other faculty committees; j = faculty union; k = other (1 = yes;
0 = no). |
Operating responsibility
|
Eight single items. (II E 6 a-h) Reflect the internal members who
have operational responsibility for the institution's day-to-day
assessment activities: a = academic affairs administrator; b = student
affairs administrator; c = institutional research officer; d = academic
review and evaluation officer; e = student assessment officer; f =
faculty member; g = other; h = no one (1 = yes; 0 = no). |
Reporting relationship
|
Six single items. (II E 7 a-f) Reflect the individual to whom person
with day-to-day responsibility reports: a = chief executive officer;
b = chief academic affairs officer; c = chief student affairs officer;
d = institutional research officer; e = academic review and evaluation
officer; f = other (1 = yes; 0 = no). |
Conducted evaluation
|
Single item. (II F 1 a-d) Reflects if institution
has formally or informally evaluated its student assessment process
(1 = yes; 0 = no). |
Evaluation elements
|
Eight single items. (II F 2 a-h) Reflect the elements
that were reviewed during the institution's assessment evaluation:
a = student assessment plan or policies; b = structure and responsibility
for assessment; c = achievement of intended objectives; d = reliability
and validity of instruments and methods; e = quality of data analysis;
f = use of information in institutional decision-making; g = problems
encountered; h = comparison of costs and benefits (1 = yes; 0 = no). |
Assessment Management Policies and Practices
|
Budget decisions
|
Two item additive index. (IV A 3-4) Reflects formal use of assessment
information in the budget process: to competitively allocate resources
among academic units; to reward academic units for improvement (1
= yes; 0 = no). |
Computer support
|
Three item additive index. (IV B 2-4) Reflects institutional capacity
to collect and manage student assessment information: computerized
student information system includes student performance indicators;
student information system tracks individual students; student assessment
database integrated with other institutional databases (1 = yes; 0
= no). |
Access to information
|
Five item additive index. (IV C 1-5) Reflects internal accessibility
of assessment information on individual students by: institutional
research or assessment professionals; senior academic administrators;
department chairs or academic program administrators; student affairs
professionals; faculty advisors (1 = yes; 0 = no). |
Distribution of reports
|
Six item additive index. (IV D 1-6) Reflects the number of constituent
groups to whom student assessment reports are regularly distributed:
students; faculty; academic administrators; student affairs professionals;
employers; general public (1 = yes; 0 = no).
|
Student involvement
|
Three item factorially-derived scale. (IV E 1,3,4) Reflects
the extent to which institutions have policies or practices to promote
student involvement in assessment activities: inform students about
assessment purposes and uses; require students to participate in assessment
activities; provide students with individual feedback on assessment
results (1 = not done at all; 2 = done in a few departments; 3 = done
in some departments; 4 = done in many departments; 5 = done in most
departments). Cronbach alpha = .69. |
Professional development
|
Four item factorially-derived scale. (IV F 2-5) Reflects existence
of professional development policies or practices on student assessment
for faculty and academic administrators: provide funds for faculty
to attend or present at assessment conferences; offer student assessment
workshops or consultation for faculty; provide assistance (e.g., paid
leaves, stipends, course reduction) to improve faculty use of student
assessment; provide student assessment workshops for academic administrators
(1 = not done at all; 2 = done in a few departments; 3 = done in some
departments; 4 = done in many departments; 5 = done in most departments).
Cronbach alpha = .77. |
Student affairs training
|
Two item factorially-derived scale. (IV F 6-7) Reflects
existence of professional development policies or practices on student
assessment for student affairs personnel: require assessment training
for student affairs staff; provide student assessment workshops for
student affairs administrators (1 = not done at all; 2 = done in a
few departments; 3 = done in some departments; 4 = done in many departments;
5 = done in most departments). Cronbach alpha = .84. |
Faculty evaluation
|
Five item factorially-derived scale. (IV G 1-5) Reflects
existence of faculty evaluation and reward policies and practices
related to student assessment: promotion evaluation considers evidence
of student performance; salary evaluation considers evidence of student
performance; promotion, tenure or salary reviews consider faculty
participation in student assessment; promotion, tenure or salary reviews
consider scholarship on assessment; public recognition or awards for
faculty use of student assessment (1 = not done at all; 2 = done in
a few departments; 3 = done in some departments; 4 = done in many
departments; 5 = done in most departments). Cronbach alpha = .77. |
Academic planning and review
|
Four item factorially-derived scale. (IV H 1-4) Reflects the incorporation
of student assessment data into academic planning and review processes
for: academic departments or undergraduate programs; general education
or core curriculum; courses; student academic support services (1
= not done at all; 2 = done in a few departments; 3 = done in some
departments; 4 = done in many departments; 5 = done in most departments).
Cronbach alpha = .84. |
Integration with Academic Management and Educational Improvement
|
Strategic planning
|
Qualitative variable used in Phase III. Reflects the integration of
student assessment data into the strategic planning for the institution. |
Academic program review
|
Qualitative variable used in Phase III. Reflects the
integration of student assessment data into the academic program review
at the institutional, divisional, and departmental levels. |
Budgeting
|
Qualitative variable used in Phase III. Reflects the
integration of student assessment data into the budget planning for
the institution. |
Quality improvement
|
Qualitative variable used in Phase III. Reflects the
integration of student assessment data into efforts to improve the institution's
quality of education. |
Instructional and teaching improvement
|
Qualitative variable used in Phase III. Reflects the
integration of student assessment data into instructional and teaching
improvement. |
Curricular and program design
|
Qualitative variable used in Phase III. Reflects the
integration of student assessment data into curricular and program
design. |
Learning innovation
|
Qualitative variable used in Phase III. Reflects the
integration of student assessment data into learning innovation. |
Faculty/professional development
|
Qualitative variable used in Phase III. Reflects the
integration of student assessment data into faculty/professional
development. |
Institutional Culture and Climate for Student Assessment
|
Attitudes toward the institution's student assessment process
|
Qualitative variable used in Phase III. Reflects administrator,
faculty, and staff attitudes toward the institution's student assessment
processes. |
Rituals associated with the institution's student assessment
process
|
Qualitative variable used in Phase III. Characterizes
rituals associated with the institution's student assessment
process. |
Driving forces of the institution's student assessment effort
|
Qualitative variable used in Phase III. Reflects the
forces within the institution that tend to drive or promote its
student assessment effort. |
Beliefs about the institution's student assessment process
|
Qualitative variable used in Phase III. Reflects administrator, faculty,
and staff beliefs about the institution's student assessment processes. |
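Several dimensions in Table 2 (for example, Institutional type and
Accrediting association) are sets of dummy-coded single items with one
omitted reference category. A minimal sketch of that coding convention,
using invented data rather than the actual IPEDS records:

    import pandas as pd

    # Hypothetical Carnegie classifications for five institutions.
    carnegie = pd.Series(["Research", "Baccalaureate", "Comprehensive",
                          "Doctoral", "Associate of Arts"],
                         name="carnegie_type")

    # One 0/1 indicator per category, then drop the reference category
    # ("Comprehensive") so it is absorbed into a model's intercept.
    dummies = pd.get_dummies(carnegie, dtype=int).drop(columns=["Comprehensive"])
    print(dummies)

Dropping one category avoids perfect collinearity among the indicators
when they are used together as predictors.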