Workflow

This plan is intended for shared use by the entire department (faculty members, administration, coordinator, commission). The responsible person or team is specified in each action step.

🚦 Current Status Legend: 🟢 Green = completed | 🔵 Blue = current stage | Gray = upcoming stages. *Stage 7 is active only in the accreditation application year.
🔄 Continuous Improvement Cycle (PDCA)
  • PLAN: Stages 1-2
  • DO: Stages 3-5
  • CHECK: Stage 6 (analysis)
  • ACT: Stage 6 (decision)
  • NEXT CYCLE: Return to Stage 1

Stage 1: Strategic Planning and Infrastructure Check
(Before semester starts)
👨‍🏫 Faculty Members
  • Reviewing the syllabi of the courses to be taught, submitting update suggestions to the commission.
  • Keeping laboratory safety equipment ready for use (lab supervisors).
💼 Administration
  • Criterion 1, 5: Determining and announcing quotas for lateral transfer, vertical transfer, and special students.
  • Criterion 6: Finalizing and approving course assignments.
  • Criterion 7: Initiating laboratory safety inspection, budget planning for deficiencies.
🤝 Coordinator
  • Criterion 5: Checking whether the curriculum covers Annex-1 (Civil Eng.) topics, preparing a report.
  • Criterion 4: Updating the tracking list of improvement decisions from the previous semester.
👥 Commission
  • Criterion 7.5: Applying and reporting the laboratory safety inspection form.
  • Criterion 8: Identifying infrastructure and hardware needs, forwarding them to the administration.
ℹ️ Detail and Information

Why Are We Doing This? Before the semester starts, we must check both the "field" and the "rules". MUDEK requires us to guarantee that education takes place not only in the classroom but also in safe laboratories, following a sound curriculum.

Critical Concepts:

  • Annex-1 (Civil Engineering Criteria): The program-specific topic list MUDEK requires for civil engineering. Our curriculum must cover areas such as Structures, Geotechnics, Hydraulics, and construction site management/bidding processes.
  • Showstopper: Laboratory safety (Criterion 7.5). If auditors do not see "Wear Goggles" warnings or fire extinguishers in the laboratory, they may not grant accreditation.

Relevant MUDEK Articles: Criterion 5 (Curriculum), Criterion 7 (Infrastructure).

Stage 2: Bologna, Organization and Student Expectation Survey
(First 10 business days)
👨‍🏫 Faculty Members
  • Criterion 3: Updating course syllabi, associating course learning outcomes with MUDEK outcomes.
  • Criterion 7.5 (OHS - Critical): Notifying students of Occupational Health and Safety rules in laboratory courses and getting their signatures via the "OHS Notification / Commitment Form" (before the student enters the laboratory).
  • Holding the first meeting with advised students, completing registration approvals.
  • Announcing the "Student Expectation Survey" in their courses, encouraging students to fill it out.
💼 Administration
  • Criterion 9: Making appointments for the accreditation commission and sub-commissions, announcing them.
  • Criterion 1.4: Announcing academic advisor lists to students.
  • Initiating the student representative election process.
🤝 Coordinator
  • Criterion 3: Combining PO matrices from all courses, creating the program-level PO matrix (even if data is collected in the first 10 days, administrative consolidation may extend to the 3rd-4th week).
  • Checking the compliance of course syllabi with the Bologna information package (check/feedback cycle may extend to the 3rd-4th week).
  • Criterion 4 (Pre-Test): Announcing the "Student Expectation Survey" link to all students, tracking participation.
👥 Commission
  • Criterion 5: Auditing course syllabi with a checklist, reporting deficiencies (administrative control/reporting may extend to the 3rd-4th week).
  • Preparing the annual work plan.
  • Conducting a preliminary review of Expectation Survey results, creating a dataset for end-of-semester comparison.
ℹ️ Detail and Information

Basic Definitions:

  • PO (Program Outcomes): 11 basic competencies that a student must acquire upon graduation (Engineering knowledge, design, communication, ethics, etc.).
  • ECTS: European Credit Transfer System based on student workload. The workload calculation in the course syllabus must be realistic.
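As a rough sanity check on the "realistic workload" requirement, the commonly cited rule of thumb that 1 ECTS corresponds to about 25-30 hours of total student workload can be scripted. The course figures, function name, and hour range below are illustrative assumptions, not MUDEK-mandated values:

```python
# Sanity-check that a syllabus's declared ECTS matches total student workload.
# Assumption: 1 ECTS ~ 25-30 hours of workload (adjust to institutional policy).

HOURS_PER_ECTS = (25, 30)  # commonly cited lower/upper bound per credit

def ects_is_realistic(total_workload_hours, declared_ects):
    """Return True if the declared ECTS falls within the plausible workload range."""
    low, high = HOURS_PER_ECTS
    return low * declared_ects <= total_workload_hours <= high * declared_ects

# Hypothetical syllabus entry: 14 weeks x 3 lecture hours + 70 h self-study + 20 h exams
workload = 14 * 3 + 70 + 20  # 132 hours in total
print(ects_is_realistic(workload, 5))  # 5 ECTS -> 125-150 h: True (plausible)
print(ects_is_realistic(workload, 3))  # 3 ECTS -> 75-90 h: False (workload too high)
```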

📊 Pre-Test - Post-Test Logic (Criterion 4 - Continuous Improvement):

  • Beginning of Semester (Expectation Survey): What does the student know when coming to the course? In which subjects do they consider themselves inadequate?
  • End of Semester (Course Process Survey): What did the student learn when leaving the course? Is there a significant increase in achievements?
  • Value-Added Analysis: The difference between the pre-test and post-test proves the value the program adds to the student.

⚠️ OHS Note (Criterion 7.5): Notifying students of OHS rules and obtaining their signatures in laboratory courses is critical evidence for auditors.

Why Are We Doing This? Consistency between the course information package on the website and the practice in the classroom (Bologna Compliance) is the basis of reliability. Also, student participation in the process (Representative election) is required by Criterion 9.

Tip: The first 10 business days are the "preparation + announcement + first implementation" period for faculty members. The coordinator/commission's consolidation-audit tasks may administratively extend to the 3rd-4th week even if data collection is finished early.

Stage 3: External Stakeholders and Advisory Board
(After midterm)
👨‍🏫 Faculty Members
  • Contributing to updating the external stakeholder list through relationships with the sector.
  • Attending the Advisory Board meeting when invited.
  • Contacting their own graduates to encourage participation in the Alumni Survey.
💼 Administration
  • Criterion 2.2-c: Inviting Advisory Board members, organizing the meeting.
  • Criterion 2.3: Determining the meeting agenda (sector needs, graduate performance, PEO update).
  • Updating the alumni database, confirming contact information.
🤝 Coordinator
  • Criterion 2.3: Preparing PEO success data to be presented to the Advisory Board.
  • Survey (External Stakeholder): Sending the External Stakeholder Survey link, tracking participation.
  • Survey (Alumni): Sending the Alumni Survey link to all graduates via email/SMS, tracking participation.
👥 Commission
  • Preparing the Advisory Board meeting minutes, recording decisions taken.
  • Monitoring Alumni Survey participation rates, ensuring reminders are sent.
ℹ️ Detail and Information

Critical Distinction (PEO vs PO):

  • PEO (Program Educational Objectives): Describes the positions our graduates reach in their careers 3-5 years after graduation (e.g., "working as a manager", "pursuing postgraduate education").
  • PO (Program Outcomes): The knowledge they possess on the day they graduate.

Why Are We Doing This? We are obliged to get the opinion of External Stakeholders (Employers and Graduates) to understand whether our education meets the current needs of the sector (Criterion 2).

Evidence Value: The Advisory Board meeting must produce more than conversation; concrete Decision Minutes such as "The sector suggested software X, so we added it to the curriculum" must be recorded.

Stage 4: Assessment, Post-Test and Survey Mobilization
(Before finals: survey mobilization)
👨‍🏫 Faculty Members
  • Criterion 3: Direct assessment through final exams, projects.
  • Preparing answer keys by associating exam questions with POs.
  • Announcing surveys to students in their courses, encouraging participation.
💼 Administration
  • Preparing and announcing the exam schedule.
  • Activating institutional incentive mechanisms to increase survey participation.
  • Creating the graduation candidate list (senior students), forwarding it to the coordinator.
🤝 Coordinator
  • Surveys (Student - Post-Test): Announcing Course Process (Post-Test), Advisor Satisfaction, Student Satisfaction, International Student surveys.
  • Survey (Staff): Announcing the Academic Staff Survey, tracking participation.
  • Survey (Graduation Candidate/Exit): Sending the "Graduation Candidate (Exit) Survey" to students at the graduation stage, measuring the level of achievement of program outcomes.
  • Tracking survey participation rates weekly, sending reminders to groups with low participation.
👥 Commission
  • Ensuring regular backup of survey results.
  • Conducting a preliminary review of Graduation Candidate Survey results.
  • Post-Test data: Preparing Course Process Survey results to be compared with the Expectation Survey in stage 2.
ℹ️ Detail and Information

Post-Test (Course Process Survey): When evaluated together with the Expectation Survey (Pre-Test) applied in stage 2, it scientifically measures the value added by the program to the student.

Graduation Candidate (Exit) Survey (Criterion 3.3): It is one of the indirect measurement tools used to prove that students who have reached the graduation stage have achieved the program outcomes.

Data reliability: If participation remains low, the data loses representativeness. Target at least 30% participation in student surveys.
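The 30% participation target can be tracked with a minimal sketch like the following; the survey names, counts, and the `low_participation` helper are hypothetical examples, not part of any official tooling:

```python
# Flag surveys whose participation falls below the 30% target (Stage 4 tracking).

TARGET_RATE = 0.30  # minimum participation aimed for in student surveys

def low_participation(surveys):
    """surveys maps name -> (responses, invited); returns names needing reminders."""
    return [name for name, (responses, invited) in surveys.items()
            if invited > 0 and responses / invited < TARGET_RATE]

# Hypothetical weekly snapshot: (responses received, students invited)
snapshot = {
    "Course Process (Post-Test)": (95, 240),   # ~39.6% - above target
    "Advisor Satisfaction": (50, 240),         # ~20.8% - below target
    "Graduation Candidate (Exit)": (18, 45),   # 40.0% - above target
}
print(low_participation(snapshot))  # -> ['Advisor Satisfaction']
```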

Stage 5: Course Binder Audit
(Make-up week)
👨‍🏫 Faculty Members
  • Criterion 3: Preparing the Course Binder: exams, answer keys, good-average-poor student examples, course evaluation report.
  • Submitting the file to the specified digital/physical archive.
💼 Administration
  • Announcing the deadline for the file submission process.
🤝 Coordinator
  • Updating the course binder checklist, forwarding it to the commission.
  • Reporting audit results.
👥 Commission
  • Criterion 3, 5: Auditing submitted course binders with the checklist.
  • Identifying missing files, giving feedback to faculty members.
  • Preparing the audit report.
ℹ️ Detail and Information

What is a Course Binder? It is the black box of that course's quality journey. It must contain exam questions, answer key, grade sheet, and most importantly Student Evidence (Examples of papers receiving the best, average, and lowest grades).

Civil Engineering Special Note: In design course files especially:

  • Complex Engineering Problem: Non-standard problems requiring abstract thinking.
  • Realistic Constraints: It must be proven that limitations such as cost, environment, ethics are included in the project.

Why Are We Doing This? When auditors come to the school, they examine these files, not the student.

Stage 6: Analysis, PDCA and Closing
(End of year)
👨‍🏫 Faculty Members
  • Submitting course-based success analyses (PO-based) to the coordinator.
  • Notifying the commission in writing of course improvement suggestions for the next semester.
💼 Administration
  • Criterion 4: Convening the Department Board, formalizing improvement decisions.
  • Criterion 2: Evaluating the need to update PEOs.
  • Formally assigning and convening the Graduation Commission.
🤝 Coordinator
  • Criterion 3.2: Combining PO data from all courses, calculating program-level PO success percentages.
  • Criterion 2.3, 3: Analyzing and reporting all survey results.
  • 📊 Value-Added Analysis (Criterion 4): Comparing the Expectation Survey (Pre-Test) results in stage 2 with the Course Process Survey (Post-Test) results in stage 4, calculating the value added to the student.
  • Criterion 4: Updating the Self-Assessment Report (SAR) draft.
👥 Quality Commission
  • Interpreting PO analyses and survey results, developing improvement suggestions.
  • Value-Added Analysis interpretation: Evaluating the results from the Pre-Test - Post-Test comparison, identifying in which courses/outcomes significant progress was achieved.
  • Criterion 4: Preparing the improvement action plan draft.
🎓 Graduation Commission
  • Criterion 1.6: Examining transcripts of all graduation candidates (course credits, ECTS total, GPA).
  • Criterion 1.6: Checking compliance of internships with regulations in terms of duration, content, and evaluation.
  • Criterion 1.6: Documenting that all requirements for graduation (compulsory courses, elective course groups, internship, etc.) are met.
  • Criterion 1.6: Preparing the graduation approval minutes and presenting them to the Department Board.
ℹ️ Detail and Information

📊 Value-Added Analysis (Pre-Test - Post-Test):

  • Pre-Test (Stage 2): Students' readiness level at the beginning of the semester.
  • Post-Test (Stage 4): Students' achievement level at the end of the semester.
  • Value Added = Post-Test - Pre-Test: This difference is concrete evidence that the program actually adds value to the student. MUDEK auditors view improvement cycles grounded in such data very favorably.
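The Value Added = Post-Test - Pre-Test rule can be computed per program outcome. The PO scores below are hypothetical means on a 5-point scale, and the 0.5 "notable progress" threshold is an illustrative assumption rather than a MUDEK rule:

```python
# Value-added per program outcome: post-test mean minus pre-test mean.
# Scores are hypothetical survey means on a 5-point scale.

pre_test  = {"PO1": 2.8, "PO2": 3.1, "PO3": 2.5}   # Stage 2: Expectation Survey
post_test = {"PO1": 3.9, "PO2": 3.3, "PO3": 3.8}   # Stage 4: Course Process Survey

value_added = {po: round(post_test[po] - pre_test[po], 2) for po in pre_test}
print(value_added)  # -> {'PO1': 1.1, 'PO2': 0.2, 'PO3': 1.3}

# Outcomes with notable progress (0.5 threshold is an assumption, not a MUDEK rule)
significant = [po for po, delta in value_added.items() if delta >= 0.5]
print(significant)  # -> ['PO1', 'PO3']
```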

Criterion 1.6 (Graduation Audit): MUDEK wants to see that the graduation decision is made with "reliable methods". Therefore, graduation commission minutes and checklists must be kept.

PDCA: This stage covers the "Check" and "Act" parts of the cycle; it proves that data is transformed into decisions and actions.
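One common way to combine course-level PO data into the program-level success percentages mentioned in this stage (Criterion 3.2) is a credit-weighted average. The aggregation rule, course names, and figures below are illustrative assumptions, not a MUDEK-mandated formula:

```python
# Credit-weighted aggregation of course-level PO success into program-level figures.
# Courses, ECTS credits, and success percentages are hypothetical.

courses = [
    # (course, ECTS credits, {PO: success %})
    ("Statics",    6, {"PO1": 70.0, "PO2": 60.0}),
    ("Hydraulics", 5, {"PO1": 80.0}),
    ("Design",     8, {"PO2": 75.0}),
]

def program_po_success(courses):
    """Weight each course's PO score by its ECTS credits, then average per PO."""
    totals, weights = {}, {}
    for _name, ects, po_scores in courses:
        for po, score in po_scores.items():
            totals[po] = totals.get(po, 0.0) + ects * score
            weights[po] = weights.get(po, 0) + ects
    return {po: round(totals[po] / weights[po], 1) for po in totals}

print(program_po_success(courses))  # -> {'PO1': 74.5, 'PO2': 68.6}
```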

Stage 7: SAR Submission and Site Visit Preparation *
(SAR in July; site visit in the fall semester)
⚠ This stage is valid only in years when an accreditation application is made.
👨‍🏫 Faculty Members
  • Being prepared for delegation interviews, reviewing course binders again.
  • Responding quickly to possible additional evidence requests.
💼 Administration
  • Making final checks of the virtual/physical evidence room.
  • Creating the delegation visit schedule, completing logistical planning.
  • Preparing opening/closing presentations.
🤝 Coordinator
  • Preparing the final version of the SAR, sending it to the delegation.
  • Confirming that all documents in the evidence room are up-to-date and accessible.
  • Preparing an information note for the delegation.
👥 Commission
  • Making task distribution during the delegation visit.
  • Randomly checking sample course binders and evidence.
ℹ️ Detail and Information

Evidence Room: It is the place where all course binders, meeting minutes, and official letters are exhibited for auditors in a physical or digital environment.

Look Back: Auditors want to see records not only from this year but also from previous years (at least one full cycle). Continuity is essential.

Tip: In delegation interviews, saying "I can check this data from our quality system and get back to you" instead of "I don't know" shows that you trust your system.

📊 MUDEK 3.0 Relation Tables

Survey - MUDEK Criterion Relation

Survey | Related Criterion/Criteria | Application Time
Student Expectation Survey (Pre-Test) | Criterion 4 (Continuous Improvement - Data Basis) | Stage 2 (Beginning of sem.)
Course Process Survey | Criterion 5 (Curriculum) | Stage 4 (Before finals)
Advisor Satisfaction | Criterion 1.4 | Stage 4 (Before finals)
Student Satisfaction (Dept) | Criterion 1, 7, 8 | Stage 4 (Before finals)
International Student | Criterion 1.3 (Mobility) | Stage 4 (Before finals)
Academic Staff | Criterion 6, 8, 9 | Stage 4 (Before finals)
Alumni Survey | Criterion 2.3 (PEO Monitoring) | Stage 3 (After midterm)
Graduation Candidate (Exit) Survey | Criterion 3.3 (PO Achievement) | Stage 4 (Before finals)
External Stakeholder Survey | Criterion 2.2-c, 3 (PO) | Stage 3 (After midterm)

Stage - MUDEK Criterion Relation

Stage | Main Criterion | Sub-Criteria | Key Personnel
1. Strategic Planning | 5, 7, 8, 9 | 1.1, 6.1 | Admin + Commission
2. Bologna & Org. + Expectation Survey | 3, 4, 5, 9 | 1.4 | All Faculty + Commission
3. Ext. Stakeholders | 2 | 2.2-c, 2.3 | Admin + Coordinator (Alumni + Ext. Stakeholder)
4. Assessment & Survey | 1, 3, 6 | 2.3, 3.3 | All Faculty + Coordinator (Grad. Candidate)
5. Course Binder Audit | 3, 5 | - | All Faculty + Commission
6. Analysis & PDCA | 2, 3, 4 | 1.6 | Graduation Commission + Quality Commission
7. Site Visit * | All | - | All Department
💡 Note: Alumni Survey (Criterion 2.3 - PEO monitoring) is applied in Stage 3, while Graduation Candidate (Exit) Survey (Criterion 3.3 - PO achievement) is applied in Stage 4. Both surveys are analyzed together in Stage 6.