
Georgetown in Trier Language Program Evaluation

Lincoln Snyder

May 7, 2025

Final Paper – Language Program Evaluation – Georgetown-in-Trier

Context

This evaluation focuses on the “Georgetown-in-Trier” summer study abroad program for undergraduate students of German, offered and operated by Georgetown University (hereinafter GU), which offers a unique context for learning German.

Table 1: Summary of Context – Georgetown-in-Trier Program
Program Structure | Description
Type | Undergraduate summer study abroad program
Duration | 5 weeks
Dates (2025) | June 7 to July 12
Location | Trier, Germany (population 102,727)
Housing | Homestay with local families
Academic Units | 6-7 units (awarded by GU)
Classes/Language | Two classes in German, optional internship
Instructors | Professors from GU German Department, GU German graduate students, guest professors/instructors from Trier
Program Ownership | GU German Department, under the aegis of GU Office of Global Education
Program Administration | Dr. Joseph Cunningham of the German Department serves as program administrator; GU Office of Global Education administers applications, finances, and contracts; University of Trier global programs handles some local items (facilities, etc.)
University Partner | University of Trier (provides classroom space and local administrative support)
Community Partners | Host families, local businesses (for internships)

GU is a private, Ignatian Catholic, and highly selective university founded in 1789 and located in the District of Columbia. GU offers a German-language summer study abroad program that takes place in Trier, Germany. With a population of 102,727 inhabitants, Trier is Germany’s oldest city and is located in the far west of the German state of Rhineland-Palatinate (Citypopulation.de). Georgetown-in-Trier (hereinafter GIT) is a five-week-long study abroad program for undergraduate students, housed at the University of Trier and running from early June to mid-July; in 2025, the dates are June 7 to July 12 (Georgetown University). The program has a long history, dating back to at least 1973; I participated in the program in 1997, and the format remains largely unchanged. Students earn 6-7 academic units from GU through two classes and an optional internship, and the program includes a homestay component, where students live with local families to enhance their immersion experience. Students sign a pledge to only speak German during their time in Trier (Georgetown University, “Student Life”).

The language learners are primarily undergraduate students from GU, although the program occasionally includes students from other universities (J. Cunningham, personal communication, Feb. 20, 2025). The teaching staff consists of a mix of professors from the GU German Department (hereinafter GUGD), graduate students from GU, and guest professors and instructors from Trier. The program is managed by Dr. Joseph Cunningham of the GUGD under the oversight of the GU Office of Global Education. The Office of Global Education (hereinafter OGE) handles the administrative aspects of the program, such as the budget, applications, and contracts; GUGD retains control of the program’s content and instruction. The University of Trier provides classroom space and support through its international program office. Local businesses offer internships, and host families provide accommodation for students (J. Cunningham, personal communication, Feb. 20, 2025).

Purpose – Why is this evaluation happening?

The GU German Department is the key internal stakeholder and thus has a special role in guiding the purpose (Norris & Watanabe, 2007, p. 6), but other stakeholders will find explicit and implicit uses for the evaluation as well. The evaluation will be formative, focusing on program improvement and development. It seeks to provide insights into how well the program fulfills its mission, to better understand the student experience, and to inform decisions related to the program’s budget, pricing, and sustainability.

Table 2 – Context, Purpose, Audience
Audience: GU German Department
Context: Program ownership, delivery of learning
Purpose: Understand mission, “secret sauce,” outcomes

Audience: GU Office of Global Education
Context: Program management, budget
Purpose: Understand program viability, sustainability

Audience: Evaluator
Context: Evaluating as a department student and employee
Purpose: Conduct effective internal evaluation, earn PhD

Audience: Students
Context: Desire to learn German, earn units
Purpose: Understand program efficacy and attraction – why Trier and not another program?

When I proposed the evaluation project, GUGD chair Peter Pfeiffer admitted that there has been no formal program evaluation of the Trier program for some time, and that there would be real value in assessment (P. Pfeiffer, personal communication, Feb. 12, 2025). Dr. Pfeiffer expressed a desire to discern the “secret sauce,” noting that program alumni often describe the program as transformative in terms of their mastery of German. He hoped a program evaluation could better define what aspects of the program result in that outcome.

I also discussed the purpose with program director Dr. Cunningham. Davis et al. list a series of prompts to help answer the question, “the findings of this evaluation will be used to…” (Davis et al., 2018, p. 21); Dr. Cunningham responded to these with two specific answers: “Find out how well the program fulfills its mission” and “Better understand the student experience” (Snyder, photograph, 2025). He noted that the program does not have a clearly defined mission or mission statement, and that it would be good to determine these as part of the process (J. Cunningham, personal communication, Feb. 27, 2025).

Audience

This evaluation falls under Norris’s category of “program development, monitoring, and improvement” (Norris, 2016, p. 174). In identifying the stakeholders for this internal evaluation, I have considered both the “specific users of the report” and the “users’ evaluation uses” (Davis et al., 2018, p. 15). Though GUGD is the evaluation user, the evaluation will be student-centered, in that it considers the learner experience, value, and outcomes while also considering other relevant stakeholder groups.

Table 3: Stakeholder Groups

The students are the ultimate beneficiaries of the study. Nearly all are GU students, and the Trier program offers an immersion experience in a German-speaking country while still being operated directly by GU. (GU students’ other option for study abroad is a full semester or year at a university in Germany or Austria as a fully matriculated student.)

The GU German Department leadership is the primary user and client for the evaluation; because I am an employee of, and graduate student in, the department, this is an internal evaluation, and I will collaborate with the department on evaluation design. The chair supervises the program director and is ultimately responsible for the program’s success; the director of curriculum vets alignment with the overall program. All hold doctorates in German and are tenured professors in the department. The program also represents an employment opportunity for professors and graduate students.

As the office that officially administers the program, oversees its budget, and manages applications and registration, the OGE is a major stakeholder in the program. GUGD designs the program, but as the technical owner of the program and the entity that sets pricing and collects tuition, OGE’s input is important.

Without buy-in and participation from stakeholders in Trier, the program could not exist. The university has an international program office that supports the program – key players include the office director, who is not strongly involved in the day-to-day, and the administrative assistant, who is. Both the past and current university presidents have been invested in the program and the broader relationship with GU. GU has consistently employed instructors from Trier to teach in the program (J. Cunningham, personal communication, Feb. 20, 2025). Host families in Trier are a critical constituency, as they give students exposure to the language and culture in the environment of primary discourse. The program also relies on local businesses for its internship program.

The evaluation team is an implied audience; I will conduct the program evaluation, in collaboration with program director Dr. Cunningham and under the tutelage of Dr. Meg Malone in the context of her class, Language Program Evaluation. Though this is my first language program evaluation outside the K-12 space, I have conducted, directed, and supported program evaluations, school accreditations, and protocols for standards.

Table 4: Stakeholder Uses

Stakeholders of Georgetown-in-Trier program | Evaluation use and purpose
GU German Department (key internal stakeholder: able and in a position to take action based on findings; Norris & Watanabe, 2007, p. 6) | Use: Formative – program improvement and development. Purpose: (1) find out how well the program fulfills its mission; (2) better understand the student experience
Students (past, current, and future) and those paying their tuition | Use: For alumni – summative: offer data on program effects and results; for current and future students and funders – formative: communicate program impact and value proposition
University of Trier community: international programs team, internship providers, local teaching staff, host families | Uni Trier, teaching staff, and internship providers – formative: program improvement; communicate their own needs, perceptions, and goals. Host families – formative: speak to their own and their guest students’ experience and program value
GU Office of Global Education | Use: Formative – data point for programmatic budget, pricing, and decisions. Owns the program budget; accountability stakeholder in understanding program investment and sustainability

Instruments

Table 5: Stakeholders and Instruments

Stakeholder Group | Instrument(s) | Timing | Type
Learners
– Current students | C-Test | June 9 and July 12, 2025 | Quantitative
– Current students | Questionnaire | Before June 7, 2025 | Mixed
– Current students | Entry Interview | June 7-9, 2025 | Qualitative
– Current students | Exit Interview | July 9-12, 2025 | Qualitative
– Alumni | Questionnaire | May 2025 | Mixed
GU German Department
– Leadership | Interview | May 2025 | Qualitative
– Professors | Questionnaire (or interview) | May 2025 | Qualitative
– Grad students | Questionnaire (or interview) | May 2025 | Mixed
GU Office of Global Education
– Leadership, Finance | Interviews | May 2025 | Mixed
– Admissions manager | Interviews | May 2025 | Mixed
Trier Community
– UT Leadership | Interviews | June 2025 | Qualitative
– UT Administrative Staff | Interviews | June 2025 | Qualitative
– Internship providers | Interviews | June-July 2025 | Qualitative
– Host families | Interviews | June-July 2025 | Qualitative
GUGD and Trier – Instructors
– All teaching staff (current) | Interview | June 2025 | Qualitative
– All teaching staff (past) | Questionnaire | May-June 2025 | Qualitative

Note regarding administration: I will administer all instruments, with the exception of the C-Tests, which are administered by the GU German Department.

Guidelines for practice: In discussing ethical data collection and evaluation, Davis et al. note that “program evaluation is not typically regarded as ‘research’ and commonly does not come under regulations for human subjects review. Nevertheless, we encourage readers to adopt best evaluation practices and be mindful of participants’ well-being during and after evaluation activities” (Davis et al., 2018, p. 39). The authors offer a number of helpful guidelines, which I incorporate into a disclaimer that I will include with each instrument I produce, including a handout to distribute at interviews and focus groups. Moreover, given the nature of the project I am conducting, an internal program evaluation that will serve as the basis for a dissertation, I believe it crosses the line from pure evaluation into research. Both Dr. Malone and Dr. Ryshina-Pankova noted that I should submit to the Georgetown Institutional Review Board (IRB) process, given that the research is centered on human subjects. Before beginning research, I will engage the users, in particular the GU German Department leadership, to ensure instrument alignment with their use goals (Davis et al., 2018, pp. 39-40). Data can be quantitative, qualitative, or mixed, and we see all three among these instruments (Isabelli-Garcia & Isabelli, 2020, p. 99).

Interviews: Due to the diversity of stakeholder groups, but also the small number of stakeholders within each group, I propose interviews as the best method for collecting data from the largest number of respondents. Davis et al. note that “interviews are useful in exploring the why and how in a language program,” and also offer “more privacy so that informants can speak more openly and honestly about their opinions” (Davis et al., 2018, p. 58). They also offer greater control of response rates once scheduled; in the case of some stakeholders, such as GU German Department leadership, University of Trier leadership, and teaching staff, we will need census-level response rates to ensure evaluation validity. Electronic survey tools rarely achieve a 100% response rate, and I will have ample time in Trier to conduct interviews.

Per Davis et al., I will use the interview guide approach and come with common questions for each set of stakeholders, while still allowing myself the freedom to ask follow-up questions (Davis et al., 2018, p. 58). Davis et al. offer advice on effective interviewing, including writing questions, sequencing questions, piloting, and conduct (Davis et al., 2018, p. 60). All questions in the addenda are subject to significant piloting and revision before I conduct the interviews this summer; I also expect input from the evaluation users. I anticipate conducting at least two interviews with every student in the program: an intake interview regarding their background, preparation, aspirations, and expectations; and an exit interview regarding their experience, growth, and changed perspectives. For all interviewees, I borrowed liberally from Davis et al. (2018, pp. 67-68) in developing my protocols.

Questionnaires: Questionnaires have the distinct advantages of being simple to create, inexpensive to distribute, and useful for fast compilation of results. However, I am concerned that they are not the best instrument for all stakeholders in this project, given my desire for a high response rate. Still, they will be useful for consistency and ease of distribution with stakeholders with whom I will not have direct contact, as well as for collecting baseline information about the students and the teaching staff, about whom I want to know more as I discern where they are coming from and where they want to head. Per Davis et al., I anticipate using mostly short-answer and specific questions in my questionnaires (Davis et al., 2018, p. 73). Though values from drop-down menus are easier to consolidate in an application like Google Forms, AI can also sort open-ended data efficiently and accurately, and I would rather process open-ended responses from a small group than end up with “other” frequently checked.
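As an illustration of how open-ended responses from a small group might be consolidated into countable categories, the sketch below codes free-text answers with simple keyword matching. The categories, keywords, and sample responses are hypothetical; in practice, the codes would emerge from reading the actual data.

```python
from collections import Counter

# Hypothetical coding scheme: category -> keywords that signal it.
CODES = {
    "immersion": ["homestay", "host family", "immersion", "only german"],
    "academics": ["class", "course", "professor", "credit"],
    "community": ["friends", "family", "city", "trier"],
}

def code_response(text: str) -> list[str]:
    """Return every category whose keywords appear in the response."""
    lowered = text.lower()
    matched = [cat for cat, kws in CODES.items()
               if any(kw in lowered for kw in kws)]
    return matched or ["other"]

def tally(responses: list[str]) -> Counter:
    """Count how often each category appears across all responses."""
    counts = Counter()
    for r in responses:
        counts.update(code_response(r))
    return counts

# Hypothetical sample responses.
sample = [
    "The homestay made me speak only German every day.",
    "My host family and the city of Trier were the highlight.",
    "The courses were rigorous and the professors supportive.",
]
print(tally(sample))
```

A response can carry more than one code, and anything that matches nothing falls into “other,” which makes it easy to see when the coding scheme needs revision.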

C-Tests: GUGD owns a proprietary C-Test developed by John Norris that has been used for student placement for years; I took it myself in the winter of 2023 as part of my doctoral program application. The department has years’ worth of data from students and applicants who have taken the test. As it is a quick and efficient instrument that we can use at no marginal cost, Dr. Pfeiffer proposes administering it at the beginning and at the end of the program as a measure of student progress (P. Pfeiffer, personal communication, Feb. 12, 2025).
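If the C-Test is scored numerically, the pre/post design can be summarized per student and for the cohort. A minimal sketch, assuming scores are matched by anonymized student IDs (the scores shown are invented, and the C-Test’s actual scale is not specified here):

```python
from statistics import mean, stdev

# Hypothetical paired scores keyed by an anonymized student ID.
pre  = {"s01": 54, "s02": 61, "s03": 47, "s04": 70}
post = {"s01": 68, "s02": 73, "s03": 60, "s04": 79}

def gains(pre_scores: dict, post_scores: dict) -> dict:
    """Per-student gain for every student with both a pre and a post score."""
    shared = pre_scores.keys() & post_scores.keys()
    return {sid: post_scores[sid] - pre_scores[sid] for sid in sorted(shared)}

g = gains(pre, post)
print(g)                                    # per-student gains
print(f"mean gain: {mean(g.values()):.1f}")
print(f"sd of gains: {stdev(g.values()):.1f}")
```

Restricting the computation to students with both scores guards against incomplete pairs (e.g., a student who misses the final administration), which matters in a cohort this small.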

What Instruments don’t I include here? More Student Assessments: Dr. Cunningham and I discussed the utility of external assessments that could measure student performance according to an external standard like the Common European Framework of Reference (CEFR) or the ACTFL Proficiency Guidelines. A number of providers offer assessments aligned to these standards. We agreed, however, that we may not have time to research and implement an instrument before June 2025, and this may need to be a research task for 2026 and beyond (J. Cunningham, personal communication, Mar. 31 2025).

Classroom observations and curriculum review: As I noted above, I do not yet propose an instrument for classroom observation or curriculum review, although course analysis and classroom observation will certainly be part of this project. Violeta Ramsay offers strong ideas in these areas, and I include a picture in Appendix A (from Norris et al., Eds., 2009, pp. 163-181; “Table 4,” p. 170). The GU German Department has policies, procedures, and reviews in place, and I will thus need to first review these two areas with Dr. Cunningham and Dr. Ryshina-Pankova before proposing further instruments. The goal is to review the program, not to duplicate reviews that the department has already conducted.

AEA Guidelines

The AEA Guidelines are designed to govern ethical behavior for program evaluations, and all five domains – Systematic Inquiry, Competence, Integrity, Respect for People, and Common Good and Equity – apply at all times (AEA 2018, “Guiding Principles”).

The AEA offers several guidelines that connect to the matter of purpose. These include the need to “explore with primary stakeholders the limitations and strengths of the core evaluations questions and the approaches that might be used for answering those questions,” which gives users a chance to weigh in on purpose (AEA 2018, “A: Systematic Inquiry”); the need to “communicate truthfully and openly with clients and relevant stakeholders concerning all aspects of the evaluation, including its limitations” (AEA 2018, “C: Integrity”), which includes communicating the purpose to clients and stakeholders; and, in determining the evaluation scope, the need to “mitigate the bias and potential power imbalances that can occur as a result of the evaluation’s context. Self-assess one’s own privilege and positioning within that context” (AEA 2018, “E: Common Good and Equity”). My observation above about balancing my experience with mission statements against best practice in assessment illustrates this last point.

The AEA explicitly threads consideration for stakeholders throughout its Principles, including the need to explore limitations of the questions and the approach with the stakeholders as a self-study (AEA 2018, “A. Systematic Inquiry”); seek appropriate expert help and training as the evaluation will be a self-study (AEA 2018, “B. Competence”); communicate openly both the study goals and its inherent nature as a self-study with all stakeholders (AEA 2018, “C. Integrity”); be respectful in considering the views and needs of all stakeholders (AEA 2018, “D. Respect for People”); and balance the interests of each stakeholder group “and the common good while also protecting the integrity of the evaluation” (AEA 2018, “E. Common Good and Equity”). I trust that an open and collaborative dialogue with all stakeholders throughout the evaluation will improve both the quality of the report and its chances of actually being used.

The timeline is ethical in that it provides for efficient collection of data and presents a report to the user that is “timely and available for program decision making” (Davis et al., 2018, p. 12). It follows the AEA Guidelines: Systematic Inquiry (I am including everything and everyone I can think of), Competence (I am working under expert guidance), Integrity (I provide for IRB review and disclosure), Respect for People (participation is voluntary), and Common Good and Equity (the evaluation is for the betterment of a program, and I will work with the department to provide summary findings to stakeholders) (AEA 2018, “Guiding Principles”). It is reasonable in that it spreads IRB approval, data collection, and writing over five months, and I have agreed with GUGD leadership that I will collect more data in Trier in 2026 if we identify gaps.

Results

The Trier program ends on July 12, 2025, and my deadline for completing a draft report is August 26, 2025 – the start of the Fall 2025 semester at GU. I will share the draft report with Dr. Malone and GUGD leadership for review; I would like to incorporate their feedback into a final “year 1” report that I can submit to the department chair by October 15, 2025.

The degree and manner of my presentation of findings to the rest of the department will depend on what I learn; some findings may seem rock-solid, while others may raise new questions that I will need to answer with more data before we feel ready to share those findings. It is my, and their, presumption that my work and the study will not end in October 2025, and I am already planning a return to Trier in 2026. Though this evaluation report provides for the broad collection of data across stakeholders, I anticipate that the findings may raise new questions. I may need new instruments with new audiences, or I may need to pursue different lines of inquiry.

The extra summer will give me a chance to collect more data, or to collect it differently. I have also not forgotten Dr. Cunningham’s original desired purpose for the evaluation, namely understanding and articulating the mission of the Trier program; that exercise will demand its own report protocol and instruments. I plan for this evaluation to culminate in a published dissertation, which will take at least three years to complete but will serve as the definitive public document.

Key Personnel

I will be the lead evaluator. Dr. Malone has advised on project and instrument design, and I am sure I will seek her counsel during the evaluation. Dr. Cunningham will aid in data collection this summer (for example, by administering the C-Tests) and has also promised close collaboration during the process in Trier. I will also consult with my advisor, Dr. Ryshina-Pankova, on the evaluation, especially if any conflicts arise that require navigating. Though he will be on sabbatical next semester, I will submit the completed project to Dr. Peter Pfeiffer and his successor as department chair for their review.

Works Cited

Albert, A., et al. (2023). Chapter 13: Individual variables in study-abroad contexts: Concepts and measurements. In Pérez-Vidal, C., & Sanz, C. (Eds.), Methods in study abroad research: Past, present, and future. John Benjamins Publishing Company.

American Evaluation Association (AEA). (2018). Guiding principles for evaluators. American Evaluation Association. https://www.eval.org/About/Guiding-Principles

Citypopulation.de. (n.d.). Trier. Trier, Rhineland-Palatinate, Germany – Population Statistics, Charts, Map, Location, Weather and Web Information. https://www.citypopulation.de/en/germany/rheinlandpfalz/trier/07211000__trier/

Davis, J. McE., & McKay, T. H. (2018). A Guide to Useful Evaluation of Language Programs. Georgetown University Press.

Georgetown University. (n.d.). Myguabroad. Georgetown in Trier. https://myguabroad.georgetown.edu/index.cfm?FuseAction=Programs.ViewProgramAngular&id=10107

Georgetown University Office of Global Education. (2024, December 3). About Us. Office of Global Education. https://studyabroad.georgetown.edu/about/

Interagency Language Roundtable. (n.d.). ILR homepage. http://www.govtilr.org (accessed May 6, 2025).

Isabelli-Garcia, C. L. & Isabelli, C. A. (2020). Researching Second Language Acquisition in the Study Abroad Learning Environment. Palgrave Macmillan.

Lane, K., Murphrey, T. P., Briers, G., Dooley, L., Lindner, J., & Esquivel, C. (2024). Comparing Influence and Value Based on Study Abroad Program Types. Frontiers: The Interdisciplinary Journal of Study Abroad, 36(1), 624–639. https://doi.org/10.36366/frontiers.v36i1.795

Llosa, L., & Slayton, J. (2009). Using program evaluation to inform and improve the education of young English language learners in US schools. Language Teaching Research, 13(1), 35–54. https://doi.org/10.1177/1362168808095522

Mitchell, R., Tracy-Ventura, N., & McManus, K. (2015). Social Interaction, Identity and Language Learning During Residence Abroad. European Second Language Association.

Norris, J. M. (2016). Language Program Evaluation. The Modern Language Journal, 100, 169-189. DOI: 10.1111/modl.12307

Norris, J. M., Davis, J. McE., et al. (2009). Toward Useful Program Evaluation in College Foreign Language Education. National Foreign Language Resource Center.

Norris, J.M., & Watanabe, Y. (2007). Roles and Responsibilities for Evaluation in Foreign Language Programs. University of Hawaii at Manoa.

Patton, M. Q. (2008). Chapter 1: Evaluation Use: Both Challenge and Mandate in Utilization-Focused Evaluation (4th ed.). Sage Publications.

Pérez-Vidal, C., & Sanz, C. (2023). Methods in study abroad research: Past, Present, and Future. John Benjamins Publishing Company.

Sanz, C., & Morales-Front, A. (2018). The Routledge Handbook of Study Abroad Research and Practice. Routledge.

Snyder, L. (2025, Feb. 16). Photograph of Page 21 from Davis and McKay, Annotated by J. Cunningham. (Photograph). Taken in German Department, Georgetown University.

Snyder, L. (2025, Feb. 18). Photograph of Trier Brochure from 1984. (Photograph). Walsh Building, 3rd floor, Georgetown University.

Venezia, F. (2018). Chapter 4: Identifying Indicators for Evaluation Data Collection. In Davis, J. McE., & McKay, T. H. (Eds.). A Guide to Useful Evaluation of Language Programs. Georgetown University Press.

Vertex 42. (n.d.). Simple Gantt Chart. Simple Gantt Chart for Google Sheets. https://www.vertex42.com/Files/download2/gdrive.php?file=simple-gantt-chart

Appendix 1 – Instruments

I included a separate Excel spreadsheet of my updated instruments at submission of this paper, and you may view the instruments at:

Appendix 2 – Timeline

The reader may view a full Gantt chart at this link, and I attached an Excel copy of the same at project submission: https://docs.google.com/spreadsheets/d/13bJAHSwlzQJb27nb6mRqE0TGfsBL7sizSdMtpsY5DqA/edit?gid=0#gid=0

This year’s Georgetown-in-Trier Program has a posted student arrival date of June 7, 2025, and ends on July 12, 2025 (GU, myGUABROAD, “Overview”).

Pre-Arrival (4/14 – 6/6/2025): Submission of my instruments (see Table 5) to the IRB is a critical next step in this process. I will also collect any extant data on the program from the GU German Department, complete consent forms, and make any necessary revisions to the instruments. I will use the time from May 6 to June 9 to distribute questionnaires to, and conduct pre-arrival interviews with, a variety of stakeholders.

Arrival (6/7 – 6/12/2025): The second phase is the week of students’ arrival, in which I will administer C-Tests, conduct student and instructor interviews, collect all class materials (syllabi, books, etc.), and conduct initial observations of all classes.

During the Program (6/13 – 7/12/2025): For the rest of the program’s duration, I will conduct interviews with host families, internship providers, and other stakeholders in Trier; conduct ongoing observations of both classroom and extracurricular activities; conduct follow-up interviews; and, in the last week, conduct student assessments and exit interviews.

Post-Program (7/13 – 8/26/2025): Once the program is completed, I plan to distribute a follow-up questionnaire to parents; write a draft report, which I will then share with Dr. Malone and GUGD leadership for review; present my findings to the GUGD once the academic year begins; and deliver a public summary report on a date to be determined with the GUGD.

Appendix 3 – Reflection

I received detailed constructive feedback on my project from Dr. Malone, as well as from Dr. Pfeiffer, and I detail my responses in the paragraphs below. Dr. Malone also offered continuous verbal feedback throughout the project; it is easier for me to document my responses to her written comments, but as a general sentiment, I did not dismiss any of her ideas.

In addition to the feedback documented below, Dr. Malone and I held several discussions regarding the Institutional Review Board (IRB). This has turned out to be a more daunting task than I initially thought. There is some discussion in the literature as to whether a language program evaluation like this one requires IRB approval, but in the end, Dr. Malone and Dr. Pfeiffer agreed that it is best to submit and let the IRB make that determination. I have completed GU’s mandatory IRB training and will submit upon completion of this paper.

Feedback from Dr. Malone

Part I: Audience

In her annotated comments, Dr. Malone asked an obvious yet essential question: is the program flourishing at 16 students? This is related to Dr. Cunningham’s broader question about mission. I did not explicitly frame the evaluation instruments to ask and answer this question directly, but I do expect it to come up.

In her graded feedback, Dr. Malone made two major suggestions: include alumni as a stakeholder group, and add more readings from the study abroad space; I did so in subsequent sections. There is a large body of work in study-abroad research journals, but I decided to first explore substantial research in published books and compendia, in particular Isabelli-Garcia & Isabelli; Albert et al.; Pérez-Vidal & Sanz; Mitchell, Tracy-Ventura, & McManus; and Sanz & Morales-Front. I will be taking Morales-Front and Sanz’s study abroad class in Fall 2025 and expect to expand my bibliography in this area quickly.

Part II: Purpose

Dr. Malone’s comments on conflict-of-interest issues had a major bearing beyond the report itself; I have made a point to overcommunicate with GUGD leadership, and my advisor and the program director are not the same person. The note to focus on host families was helpful, as was the encouragement to conduct student pre- and post-interviews; I used those notes in advocating for their inclusion in the instruments.

In her graded feedback, Dr. Malone suggested that the final paper would need a more academic format, which I have attempted to capture; she also suggested more journal articles on study abroad, and I subsequently included multiple books on study abroad research in my reading and bibliography.

Part III: Context

One of Dr. Malone’s comments that I took to heart was her remark that maintaining student (i.e., customer) satisfaction is critical; many of my interview questions are designed to understand satisfaction. I also updated the context box (Table 1) to add the University of Trier administration. The most important comment was essentially a question about Trier’s role: how big is it? This question played a major role in my choosing qualitative instruments for my conversations with Trier.

Part IV: Instruments

Dr. Malone warned that words like “transformative” indicate bias, so I removed them from my instruments.

Part V: Timeline

Dr. Malone raised the question of distribution of my evaluation, which I later included in my timeline and address in the pertinent sections above. She also raised the reasonable question of whether the pace is too intense. I agree that it is, but reducing the number of instruments from my initial draft, in particular the parent interviews and the collection of academic data other than the C-Tests, has simplified my work.

Feedback from Dr. Pfeiffer

Dr. Pfeiffer had a number of comments about the survey and instruments when I reviewed them with him (P. Pfeiffer, personal communication, April 28, 2025). He was concerned about potential complications arising from including the parents of students in the evaluation, so I eliminated them as a group. He also opined that the School of Foreign Service test is not a normed instrument and is taken by only a small number of Trier students each year, so I should drop it. He was also insistent that I clear all my plans with Dr. Cunningham, which I have done.

