Annual Student Learning Assessment Summaries

BAPG SLAR 2021-2022 Summary

Investigation Topic: Students’ ability to grasp and discuss (in writing) the content of some main concepts and theories

●       The objective of this Student Learning Assessment was to create the first basic assessment of students’ ability to

1.      Recognize main concepts of political science and relate them to reality (SLO 1.2)

2.      Discuss these concepts in writing (both in an extensive form and in a short form)

3.      Retain, transfer, and possibly develop their understanding of some basic concepts from a course taken in the Fall semester (PG101 Introduction to Politics and Governance) to a course taken in the Spring semester (PG104 Comparative Politics)


This topic was chosen because this is the first year of the BAPG program. We wanted to start with the most basic question: are students learning what we hope they are learning? We wanted to look at their ability to grasp and use some of the basic concepts and theories that are introduced in the first semester and revisited in many courses in the following semesters and years. SLO 1.2 is a good starting point because it is addressed by courses in both the first and second semesters of the first year, making a comparison possible already in the program’s first year.


Assessment Methods and Schedule:


February 2022

Discussion with the instructors of the two courses that address SLO 1.2 (PG101 Introduction to Politics and Governance and PG104 Comparative Politics) to identify the most comparable assignments.


May-June 2022: Pool all the assignments from both courses. Create a simple rubric/set of criteria to assess demonstrated ability to recognize main concepts (state, nationalism, and others) and theories (if applicable). How are students relating these concepts to reality in the assignments?


July-September 2022: Review of the assignments based on the criteria; adjustments made to the criteria; report prepared.

Description of Assessment Methods:


The mid-term exam from PG101 Introduction to Politics and Governance and three short assignments from PG104 Comparative Politics that explore comparable concepts were reviewed by the Program Chair and two faculty members, using a simple rubric reflecting students’ ability to correctly recognize, describe, and relate to reality basic concepts and theories of political science. The syllabi of the two courses were also reviewed to assess the level of overlap and the extent to which PG104 Comparative Politics builds upon and further advances the material covered in PG101 Introduction to Politics and Governance.


Assessment Team: Assessment was overseen by Yevgenya Paturyan, the Program Chair. Asbed Kotchikian and Uros Prokic participated in the review. The team prepared a report and circulated it to the faculty for input. The report was reviewed during a faculty meeting on October 19, 2022.


What do the results say?


The review of the (take-home, longer narrative) mid-term exam in PG101 Introduction to Politics and Governance shows that first-year students are, on average, quite capable of discussing key political science concepts (state, nation, ethnic group, legitimacy, power, executive, legislative, electoral systems, parliamentary vs. presidential democracies) in sufficient detail. Students provide good descriptions and dive into nuances. Some are also able to successfully compare and contrast concepts and discuss their pros and cons. Some demonstrate basic application to reality by bringing in country examples (though rarely in detail). A few students went beyond expectations, sustaining an elaborate argument and connecting many dots in a narrative of 10+ pages, moving from description to analysis. Common problems identified through the review of this assignment were: shallow or patchy narrative, confusion between key and secondary ideas, and attempts to cram too much information into the assignment.


The review of three short (100-word) in-class assignments from PG104 Comparative Politics shows that students retain a fairly good understanding of the key concepts (state, nation, democracy and its key principles, electoral systems with their pros and cons). Understandably, students found a closed-book exercise more challenging than a take-home exam. Despite that, most students mentioned the key components of the concepts they were tested on, sometimes forgetting one out of 3-4 key components. Despite the limited space and time, many students offered reasonable examples. There were very few completely wrong or confusing answers. The main challenge identified in this assignment was students’ struggle to focus on the most important points in the limited space given. It stands to reason that students do better on longer assignments completed at home than on shorter, closed-book tasks done in class.


Overall, the assessment shows a good level of understanding of the key concepts, with a basic ability to apply them to reality through examples. Students have retained most of the knowledge they acquired in PG101 and transferred it to PG104. They struggled to adapt that knowledge to a different format (short vs. long elaboration).


The comparison of the two syllabi shows some overlap in topics during the first 3-4 weeks. There is potential overlap in reading material: both courses use textbooks by O’Neil, but they are different books (in one case, O’Neil is one of the editors). This issue was looked at specifically because, in private conversations, some students mentioned using “the same textbook” for both classes.


Closing the Loop: What will your program do with the results? What is your action plan?

●       Since this is the very first assessment of the new program and it did not show any glaring shortcomings, it is hard to think of a concrete intervention at this stage. We will keep and use this evidence as a baseline for further assessment of student learning in the program.

●       In general, it is worth reflecting on how much or how little overlap we want to have between various courses in BAPG. Anecdotal evidence (personal conversations with students) suggests the existence of two extreme perceptions. Some students complain the field is too broad (seeing no connections and overlaps between subjects?) so much so that they would like to have some “tracks” to “specialize in.” At the same time some students complain that they are “learning the same thing” in some of the courses (seeing too much overlap between subjects?).


Were there any surprises?

Some students are exceptionally bright; their level is above what we expected of first-year bachelor students.



MPSIA SLAR Summary

Name of Program: MPSIA (Political Science and International Affairs)
Date of Report: September 1, 2021
Student Learning Outcome(s) Reviewed: 1. Identifying and conceptualizing political issues related to Armenia, the region, and the world.

2. Understanding the applicability of various research methods to specific topics.

Assessment Methods and Descriptions: To prepare for the assessment, the evaluation team developed 20+ questions on topics such as research methods, Armenian politics, regional politics, global issues/IR, conceptual frameworks, and comparative politics. The questions were compiled during the May-July 2021 period and circulated among the PSIA faculty to solicit feedback and additions.

Once the questions were finalized, the evaluation team divided them into two sets of 10 questions each and identified the class with the largest enrollment of second-year MPSIA students. To maximize the sample size, the evaluation team waited until the beginning of the semester on August 25 and collected answers from 26 second-year MPSIA students.

Student comprehension was evaluated on the scale used by the program to assess students’ knowledge. The scale uses the following categories:

Not Yet Competent (NYC), Developing (D), Proficient (P), and Advanced (A). Intermediate levels are added to these categories, making the overall rubric consist of seven categories: NYC; NYC/D; D; D/P; P; P/A; A.

Assessment Team (# and Positions. Names Not Necessary): 3 team members: Senior Professor Donald Fuller, Associate Professor Asbed Kotchikian, Assistant Professor Uros Prokic
Summary of Findings: – Just above 25% of the respondents have above-proficient knowledge of research methods (this includes D/P, P, and P/A). Since this is their second-to-last semester before they start working on their MA capstones, this is a bit of a challenge.

– Almost 45% have a developing or less-than-developing knowledge of politics and political processes in Armenia.

– Over 80% have a developing or less-than-developing understanding of IR concepts and global issues.

– The “best” results, i.e. the domains where students are most knowledgeable, are Armenian politics (~55% at the Developing/Proficient level or above), regional politics (~49%), and comparative politics (~38%).

– What is not captured in the statistics is that students are fairly good at making connections between concepts IF they know what the concepts are.

– The students tend to focus on factual analysis rather than providing logical and conceptual argumentation.

Next steps: – While students prepare for their capstones, focus more on research methods.

– Emphasize (political) theory so as to help students become “big” thinkers, readily able to apply concepts and theories to practical issues.

– Make courses more theory-heavy for the purpose of creating problem solvers/critical thinkers.

– Try to synchronize course planning/syllabus organization (when possible) so that the faculty can overlap on main themes and theories in order to increase students’ retention of material.

– Provide the students with a suggested reading list (either throughout the semester or at the end of it) for them to develop their wider knowledge of ideas and concepts.

– Encourage students to read more regional and even local news and to critically reflect on it.

– Commit to becoming systems thinkers as a department so as to help our students adopt a wider picture of politics and where they fit in the body of ideas.

– To achieve the goals above, it is recommended that the PSIA program appoint a task force to evaluate first-year MA students as early as Fall 2021 and start working with them (and subsequent cohorts) to establish and reach benchmarks during their first and second years of study.

BA EC SLAR Summary

Name of Program: BA in English and Communication
Date of Report: August 31, 2021
Student Learning Outcome(s) Reviewed: 1.1: Accurately and precisely communicate, both in speaking and writing, in a variety of contexts and genres.

3.3: Apply theoretical frameworks for literary criticism, linguistic analysis, and communication theory.

Assessment Methods and Descriptions: – Selection and anonymization of 15 student papers written in the first semester of 2016. Comparison of these with the capstone papers written by the same students in 2020.

– Evaluation of student papers according to the rubric by two full-time professors

– Interviews with faculty members who teach first-year students and advise them for capstone projects

– Survey of 2020 & 2021 EC graduates

Assessment Team (# and Positions. Names Not Necessary): The assessment was done by 5 faculty members, overseen by Mica Hilson, the EC Program Chair.
Summary of Findings

The results of the assessment seem to indicate that the main program goals are being accomplished when it comes to writing instruction. With respect to basic writing skills, as articulated in Goal 1.1, there seems to be a marked improvement over the four years, and by the time they write their capstone papers, most students are able to write with an extremely high level of clarity. However, when it comes to more advanced academic writing skills, such as the ability to integrate theoretical frameworks into their writing or to compose an original research paper, some students are still struggling. These results were reported by students and graduates and are supported by the scores given by the evaluators who reviewed the selected student papers.

The interviews with the instructors reveal some key observations about student writing. A particular challenge was reported regarding students with little background in writing instruction from the Armenian school context, who struggled considerably with certain modes of writing, as well as with coming up with their own original arguments and expressing a writer’s voice (as opposed to summarizing other people’s ideas).

Next steps: The findings confirm earlier internal discussions about the curricular redesign of the EC program: since the mandatory core courses are packed into Year 1 and Year 2 of the program, there are not enough mandatory upper-division courses that teach more advanced academic writing skills. The proposed Introduction to Theory class, required for EC juniors, may resolve some of those problems. Additionally, students should get practice writing research papers and performing original research (not just a synthesis of existing research) in the first three years of the EC program, not just their final year. The student surveys reveal the importance of providing students with multiple opportunities for writing feedback (both instructor feedback and peer feedback) throughout a course. One step is to hold more workshops on the importance of substantial feedback and on allowing students to do revisions.


MS CIS SLAR Summary

Name of Program: MS CIS
Date of Report: February 2021
Student Learning Outcome(s) Reviewed: SLO2: Assess students’ ability to exhibit problem-solving skills and to design and implement solutions using development tools in keeping with state-of-the-art technologies
Assessment Methods and Descriptions: – Report of the Academic Program Review Committee, fall 2019. The report includes recommendations in various areas, including those regarding the courses CS326, CS312, and CS323.

– MS CIS student focus group meeting, spring 2020.

– Faculty Survey, spring 2020. The survey was taken by faculty members who are employed in IT in Armenia and actively involved in the fields covered by the three courses.

– Review and evaluation of student projects produced in CS326, CS312, and CS323 to determine whether they lead to SLO2.

Assessment Team (# and Positions. Names Not Necessary): The assessment team was overseen by the CIS Program Chair. The student questionnaire discussed during the focus group was designed by the CIS faculty. Throughout the review process, the CIS faculty were continuously involved and provided input.
Summary of Findings

The Academic Program Review Committee suggested that students interested in data science should complete at least one database course to understand where data comes from and how it is stored, organized, and used. The discussion and suggestions of the student focus group did not reveal any general issues regarding CS326. However, students reported that CS323 did not sufficiently deepen the necessary conceptual knowledge, while CS312’s content was repetitive of CS323’s. They also reported that the duration of both courses could be shortened.

The survey of the faculty and IT experts abroad showed that the content of the CS326 course is up to date with the industry and equips students with the necessary knowledge. However, the topics included in CS312 and CS323 appeared outdated and overlapped with undergraduate courses. The review of student projects, overseen by the instructors of CS326, CS312, and CS323, revealed the following. Projects for CS326 meet SLO2 in content and in the use of state-of-the-art technologies. Projects undertaken in CS312 and CS323 meet SLO2 in content and design solutions; however, these courses did not seem to be up to date technology-wise.

Next steps: The updated MS CIS program presented to the Curriculum Committee includes CS326 as a core course for all students, while CS312 and CS323 will be merged into a single course to reduce redundancies. The content of this new merged course will be updated with the help of the faculty so that it is on par with the requirements of the industry.

MSE SLAR Summary

Name of Program: MSE
Date of Report: 2020-2021
Student Learning Outcome(s) Reviewed: 1.1 To evaluate post-graduation student placement

1.2 To evaluate post-graduation salary profiles

1.3 To evaluate the use of learning outcomes at the workplace

Assessment Methods and Descriptions: An MSE alumni survey was conducted, and the practical use of the learning outcomes was directly evaluated.
Assessment Team (# and Positions. Names Not Necessary): The assessment was overseen by the outgoing MSE program chair. The discussion panel included MSE faculty.
Summary of Findings: The findings evaluate the effectiveness of the program in achieving some of its goals and inform the program about the practical usefulness of its student learning outcomes.
Next steps: Repeat the survey in the future to also understand the long-term impact of the SLOs on graduates’ workplaces and adjust them accordingly, if necessary.

MBA SLAR Summary

Name of Program: MBA
Date of Report: December 10, 2019
Student Learning Outcome(s) Reviewed: Assessment of Student Learning Outcome A3 (SLO A3), which is “Apply up-to-date information technologies in business decision making”
Assessment Methods and Descriptions: Professors teaching courses in which a project is linked to the identified SLO submit up to 3 projects with varying grades. Professors in the field then assess the projects based on an assessment matrix developed for the selected learning outcome. Independently, a survey is conducted among recent MBA graduates and current students, with questions constructed from the same assessment matrix. Information from the professors’ direct assessment and from the perceptions of MBA graduates and students is summarized and compared.
Assessment Team (# and Positions. Names Not Necessary): Assessment is overseen by Aleksandr Grigoryan (Program Chair), with the participation of selected faculty, Meri Buniatyan and Anahit Sargsyan, and support from the Office of Assessment and Accreditation. For direct-evidence assessments, the Program Chair identified a group of faculty to conduct the assessment.
Summary of Findings: When comparing the results of the project assessments and the survey, the key observation is that average values from the project assessments are systematically higher than their survey counterparts. The gap is especially large for the “data processing” and “conclusion” components and smallest for the “data sharing” component.
Next steps: A focus group discussion of the direct assessments and survey results with MBA graduates, employers, and faculty will close the loop of the SLAR exercise.


MA TEFL SLAR Summary

Name of Program: MA TEFL
Date of Report: November 25, 2019
Student Learning Outcome(s) Reviewed: To evaluate the effectiveness of SLO 4.2, where students and graduates are expected to “Exhibit attitudes and behaviors of reflective and lifelong learning practitioners,” with a focus on their behaviors, i.e., their ability to transfer learning across contexts and show independent learning practices.
Assessment Methods and Descriptions: ∙ Early Feb. – Identify a few specific learning outcomes within TEFL courses

∙ Feb – April – Collect data to target specific SLOs that cut across multiple TEFL courses. Examples of data: lesson plans, activity design, teaching episodes observed by faculty. Assess how well the students are able to meet the selected learning outcomes in subsequent courses or learning opportunities.

∙ May – If needed, follow up on the results of the assessment by soliciting students’ opinions via survey, interviews, or focus groups.

Assessment Team (# and Positions. Names Not Necessary): Assessment is performed by all TEFL faculty teaching in Fall 2019, overseen by Irshat Madyarov.
Summary of Findings: Students seem to transfer some knowledge and skills from one course to another in non-teaching contexts, e.g. lesson planning and activity design. This is evident within one year (between Fall and Spring) and even more so across a two-year span. However, students do not transfer their learning in teaching contexts from one lesson to another or from one year to the next. Perhaps this kind of learning takes longer and needs more intensive practice opportunities.
Next steps: All instructors will look for ways to reinforce learning from previous courses by encouraging students to recall and apply previous learning in the current class or task (see examples of activities below).

∙ Have Teaching Internship students review and reflect on their strengths and areas of growth identified in Teaching Practicum and create a plan of action for further development based on this previous experience.

∙ Discuss ways in which Internship and Practicum mentors encourage reflective thinking among TEFL students during post-observation conferencing sessions. Consider video-recording conferences. Identify and address areas of improvement.

MSM SLAR Summary

Name of Program: Master of Science in Strategic Management
Date of Report: August 2019
Student Learning Outcome(s) Reviewed: Applying core theories of management to authentic business projects and organizations (SLO 1.2).
Assessment Methods and Descriptions

Overall, to assess the degree of achievement of SLO 1.2, two major approaches were taken. First, we analyzed the syllabi of the courses to identify course-level learning outcomes that match SLO 1.2. Then, together with the course instructors, we matched the assignments targeted at assessing the practical application of the knowledge obtained. Formally, this was implemented through a joint syllabus review conducted by the program chair and the relevant course instructor. We also considered major assessment items, e.g. midterm and final exam contents and project assignments. The second source of information is student feedback and perceptions. To collect this evidence, we developed and conducted a survey in the summer among graduated MSSM students and those taking the capstone course.

Assessment Team (# and Positions. Names Not Necessary): Program chair and 4 faculty members involved in teaching the relevant courses
Summary of Findings

– Assessment questions, for both projects and exams, should change over time to ensure that various sections of the syllabus are periodically assessed.

– Course-level student learning outcomes for some of the courses analyzed should be revised to make them more suitable for subsequent assessment purposes. SLOs defined in extremely broad terms are difficult to measure or evaluate. In general, instructors should pay more attention to ensuring that SLOs are coherent with the curriculum map and aligned with course assignments.

– Feedback received from the faculty teaching the courses highlights that the differing backgrounds of students often impede proper coverage of graduate-level material. Still, the recommendation is to ensure that the syllabi of graduate-level courses have the rigor required for MS programs.

– Evaluation of students’ perceptions of how “applied” the courses analyzed here are generally gives a positive outlook. Capstone and Business Analytics stand out in this respect, but that was also expected given the highly applied nature of both courses.

– Students were asked to respond to a set of MCQs checking “remaining” knowledge six months or one year after the courses. The questions were suggested by the instructors themselves to make sure they were relevant and properly formulated. The rate of correct responses is around 50%, but there are variations both across courses and across questions within courses. While this exercise is far from representative, it can serve as food for thought for the instructors in terms of what is and is not working well in the course.

Next steps: Revise syllabi (if needed). Take steps to mitigate background differences among students.

LLM SLAR Summary 

Name of Program: LL.M. Program
Date of Report: Student Learning Assessment Report 2019
Student Learning Outcome(s) Reviewed

3.4 Oral Presentation and Advocacy Skills. The objective of the review is to understand how well oral presentation and advocacy skills have been incorporated into the course design, from the perspectives of both instructional methodology and syllabus structure, to meet students’ needs in preparing them for a successful professional legal career.

Assessment Methods and Descriptions

The assessment was conducted using both direct and indirect evidence to reach a conclusion about the effectiveness of the overall course design and instruction methods with regard to oral presentation and advocacy skills as a learning objective among LL.M. program students. To this end, the syllabi of 6 courses incorporating SLO 3.4 skills were examined, and interviews were conducted with students, alumni, and faculty.

Assessment Team (# and Positions. Names Not Necessary): The following LL.M. faculty members prepared the SLAR report: Ms. Lilit Martirosyan, Ms. Tatevik Danielyan, Ms. Monica Pirinyan, and Ms. Lilit Banduryan.
Summary of Findings

The study reveals that the overall course design and structure, as well as the teaching methods, generally incorporate oral presentation and advocacy skills mechanisms that help students develop both theoretical and practical knowledge and relevant skills. The survey results show positive feedback on the courses, which advance students’ oral presentation and advocacy skills in their daily work and legal practice. Both alumni and students appeared more confident in their argumentation and judgment at the end of the courses, since verbal skills are a crucial part of their professional careers. The survey also reveals that lecturers and instructors put special emphasis on techniques for advancing oral presentation and advocacy skills during their courses, and practice them regularly based on the situation and the needs of the class audience, regardless of whether SLO 3.4 objectives are spelled out in the syllabi.

Next steps

The syllabus review, as well as the interviews with alumni and students, reveals six courses that practice SLO 3.4 oral presentation and advocacy skills enhancement exercises during classes. Furthermore, several courses in the program do not state SLO 3.4 objectives in their syllabi but nevertheless practice them in the classroom. It is therefore recommended to remind faculty to revise their syllabi and incorporate the SLO 3.4 objectives they already practice into their courses.

MPH SLAR Summary 

Program: Master of Public Health
Date of Report: September 2019
Investigation Topic: Students’ ability to master learning outcomes corresponding to the core functions of the MPH professional practice paradigm.
SLO(s) Reviewed: Two learning outcomes: 1) Assure the appropriateness and effectiveness of a given public health intervention, and 2) Communicate public health messages to targeted audiences.
Assessment Methods and Descriptions

Integrating Experience Project papers written by the MPH students in 2019 and written products of the course PH 350: Project Development and Evaluation were used for the direct assessment of SLO 1. The papers were graded using the pre-existing grading rubric developed from the MPH program competencies corresponding to this SLO. The criteria were assessed on a numeric scale ranging from 1 (performance criteria not met or missing) to 4 (excellent). The average scores for each criterion and each student were calculated and presented in the assessment report.

The indirect assessment included an analysis of the results of the End of Program Evaluation conducted for the MPH students at the end of their studies in 2019. Students’ responses to the closed-ended question used for the indirect assessment of the SLOs of interest were translated into summary scores and compared with the results from previous years.

Assessment Team (# and broad descriptions. Names not necessary): A visiting faculty member of the Turpanjian School of Public Health (SPH) conducted the assessment and prepared the report. The assessment was overseen by the SPH Dean and reviewed by the SPH faculty.
Findings: The direct assessment showed that the students met all applicable criteria for the achievement of the first SLO at a very high level. The selected papers demonstrated the students’ mastery of all stages of a scientifically rigorous program development and evaluation process and their full understanding of the importance of evaluation principles for the improvement of health services and programs. The indirect assessment documented an increase since 2017 in the students’ self-ratings on both learning outcomes of interest.
Next steps: The assessment report will be circulated among the resident and visiting faculty and discussed at subsequent faculty meetings. Evidence of closed loops on the gaps identified in earlier years, along with any weaknesses in student performance indicated by the current review, will be discussed and followed up in the next assessment cycle. Revisions to teaching methods and syllabi will be introduced as needed.