The overall goal of learning evaluation is to determine the effectiveness of the SEC learning environment and whether the existing course delivery methods successfully achieve the intended learning outcomes. Learning evaluation is applied to ensure the efficiency of the program specifications, course learning outcomes, course materials, program design, and the learning environment. At SEC, learning evaluation is conducted continuously at the program and course levels through student surveys, tutor surveys, analytical reports on student achievement, and knowledge progress assessments based on learning outcomes.
SEC conducts periodic quality monitoring and reviews covering program content, policy, and requirements. This procedure aims to ensure that SEC achieves the required learning objectives. The review is based on inputs from students' progression and on relevant collected data and evaluation reports. As a result of these ongoing reviews, SEC can improve its learning resources and programs.
SEC utilizes the Plan-Do-Check-Act quality technique, described below, to ensure the delivery of high-quality learning resources. SEC's Quality Assurance/Management Manual documents its methods for assuring stakeholder satisfaction, meeting stakeholder needs, and complying with legislation; the Deming Plan-Do-Check-Act (PDCA) cycle underpins this quality process.
Educational institutions seeking national and worldwide recognition should develop a DPQA; this applies even before an institution has any graduate students. Regardless of certification, every organization must have a quality function, either directly through a quality unit or department or indirectly through other means. A new organization should form a Quality Department. Its responsibilities include:
⦁ Planning, executing, monitoring, and improving SEC's quality activities.
⦁ Planning service improvements.
⦁ Supporting the establishment of quality improvement projects across academic and administrative units, divisions, and departments.
⦁ Developing self-evaluation report procedures.
⦁ Organizing and guiding periodic self-studies for internal and external reviewers.
⦁ Creating a quality-improvement culture.
⦁ Raising awareness of quality assurance and its approaches.
⦁ Developing a quality vision and mission statement.
⦁ Improving the quality of internal departments' services.
⦁ Ensuring the appointment or nomination of quality officers.
⦁ Providing orientation and induction sessions for new staff so that quality assurance activities are understood and supported.
⦁ Involving stakeholders in programs.
⦁ Monitoring progress in quality assurance across divisions and departments.
⦁ Preparing quality assurance reports for the executive and governing boards using KPIs and internal data.
⦁ Conducting institutional and program self-studies under division/department procedures.
⦁ Reporting to the Director of Quality Assurance.
A quality committee should be created, consisting of academic department heads and senior academic administrators. The quality committee advises the school's internal quality assurance committee (IQAC) on improving its quality assurance system and recommends to the senior administration quality improvement plans, and standard forms and documents, for approval and use in quality assurance activities throughout the school. Its duties include:
⦁ Assisting the IQAC with proposals for quality systems, policies, and processes.
⦁ Guiding the preparation of documents.
⦁ Educating and overseeing faculty on quality standards.
⦁ Planning improvements in student performance and growth.
⦁ Researching current and future client satisfaction.
⦁ Conducting periodic self-evaluations.
The processes of planning, monitoring, and evaluation make up the Results-Based Management (RBM) approach, which is intended to aid decision-making toward explicit goals.
Our Measurement Committee sets SEC's future institutional and operational goals, which guide decision-making, program implementation, and evaluation. Its responsibilities include:
⦁ Initiating and implementing techniques to get the whole institution behind the institutional quality effort and its quality management process.
⦁ Handling all quality-related issues through the school's Quality Management Committee.
⦁ Measuring quality to ensure quality management and performance monitoring.
⦁ Encouraging continual improvement through third-party oversight of quality systems, evaluation, and assessment.
⦁ Providing oversight and advice to ensure quality service and support.
Communicating the actions taken helps define our organization's culture: it is fundamental to building success and leads to better-quality work. For this reason, we hold a Lessons Learned meeting to analyze these actions, evaluate what did and did not work, and identify how to improve.
Accreditation is academia's form of self-regulation and peer review; it ensures that higher education meets national and international quality standards. The QAC contributes in this area by:
⦁ Facilitating institutional and program accreditation.
⦁ Simplifying national and international accreditation criteria through this handbook.
⦁ Creating tools and recommendations that help locate evidence of excellent practice and related information.
⦁ Informing institutional and program self-evaluations with standards and KPIs.
⦁ Creating a plan for submitting self-study data and documentation to the relevant national or international accreditation organization.
The Strategic Planning Committee coordinates an annual strategic planning cycle and budget meetings to align SEC's departments and support services with its vision and strategic objectives. These tasks include managing how SEC's senior managers obtain and use management information, including tracking KPIs and benchmarks and collecting and analyzing statistical and database data. The committee:
⦁ Monitors and evaluates strategic plans regularly, keeping the school proactive.
⦁ Analyzes the impact, efficacy, and applicability of department and division actions and recommends improvements.
⦁ Coordinates division and unit strategies to achieve the school's vision and goals.
After their initial administration, SEC intends to conduct the following surveys every year for the foreseeable future:
⦁ Student surveys
⦁ Evaluation surveys
⦁ Employer satisfaction survey
⦁ Student survey (SES)
⦁ Alumni survey
⦁ P&O questionnaire
⦁ Faculty satisfaction survey
⦁ Staff satisfaction survey
Before any further surveys are distributed, the Dean must always grant permission, except in the following cases:
⦁ Non-required surveys approved by the organization, such as employer surveys, comment forms, and training evaluation forms.
⦁ Unless specifically exempt, all surveys must be approved before being distributed, regardless of the distribution method.
SEC's future surveys will include:
⦁ Course evaluation survey (CES)
⦁ Student survey (SES)
⦁ Program evaluation survey (PES)
⦁ Employer/alumni surveys
⦁ Employer survey (ES)
⦁ Faculty and administrative staff survey
⦁ Progress report
⦁ Each year thereafter, investing in and employing new teaching methods: professors can see how others perceive their teaching, which helps them improve their future performance.
⦁ Using result analysis, administrators make summative decisions (e.g., decisions about promotion, tenure, and salary increases).
⦁ The CES is administered in every class. At least once a year, each class, and at least one class taught by each instructor, should take this survey. CES results are used extensively in course reports.
The SES will be conducted annually from now on, with the following goals:
⦁ Enhance education.
⦁ Improve student-specific services and activities.
⦁ Provide students with online resources.
⦁ Improve online services.
⦁ Improve SEC's graduate and undergraduate offerings.
⦁ Help professors and administrators evaluate their programs.
⦁ Assess the programs' merits.
⦁ Provide recommendations on program quality.
⦁ Gather contact, interest, and status information.
⦁ Improve the institution's quality.
⦁ Assess how satisfied previous students are with their education, from the classroom to campus life, extracurricular activities, and technology resources.
⦁ Report the statistical data that accreditation requires.
Employer surveys help determine whether graduates have the abilities employers require; these tools help define the skills needed for business work. SEC intends to use the ES every year from now on, with the following objectives:
⦁ Determine whether graduates are prepared for the industry they are entering.
⦁ Identify shortfalls in graduate training.
⦁ Assess whether graduates have the skills firms want in new workers.
Human Resources distributes the Faculty and Administrative Satisfaction surveys annually, between the 10th and 13th weeks of the Fall Semester. The Deans and School Councils then discuss the results.
The School Council and the Strategic Planning Committee, as the direct sponsors, request this survey.
For the time being, SEC plans to apply its technique for identifying, collecting, analyzing, and summarizing data to assess the Quality Assurance/Management System. Various feedback mechanisms, result monitoring, and activity measurements provide multiple sources of data.
The following stages are carried out every year as part of the school's surveys, data analysis, and commitment to continuous improvement:
⦁ Identifying locations and services.
⦁ Preparing surveys.
⦁ Investigating survey results.
⦁ Preparing the survey report.
⦁ Designing an improvement strategy (by the improvement unit).
⦁ Approving the proposed improvements.
⦁ Implementing the improvement plan and monitoring/evaluating its results.
Quantifiable, objective statistics and other feedback can demonstrate high-quality performance. Performance indicators are pre-selected, consistently applied statistics used to measure changes over time and to draw comparisons with previous outcomes, other departments within the school, or other institutions.
The most crucial of these measures are key performance indicators (KPIs). They may be identified by consensus or by a single institution for use in higher education. The remarks below strive to ensure that KPIs are applied consistently.
Data are needed at every level: KPIs cover individual programs or departments, universities, and institutions. Program-level data should be aggregated, then aggregated again, to obtain the institution's totals. When this is done, program and institution comparisons are straightforward.
Continuous development and decision-making processes rely on performance indicators to analyze educational institutions' quality and performance. The institution measures its key performance indicators, with benchmarking, using appropriate means (opinion polls, statistical data, etc.) according to the nature and purpose of each indicator. For the foreseeable future, the following benchmarking exercises will be conducted every year:
⦁ Self-benchmarking (internal reference comparison).
⦁ External benchmarking (comparison against external reference standards).
A report highlights each indicator's findings and comparisons among branches, campuses, and student genders.
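As an illustration only, the annual self-benchmarking comparison described above can be sketched in code. The indicator names, values, and targets below are hypothetical, not SEC data:

```python
# Hypothetical sketch of annual self-benchmarking: compare each KPI's
# current value against last year's value and an internal target.
# Indicator names and numbers are illustrative only.

def benchmark(kpis):
    """For each KPI, report the year-over-year change and whether the
    internal target was met."""
    report = []
    for name, (previous, current, target) in kpis.items():
        report.append({
            "kpi": name,
            "change": round(current - previous, 2),
            "improved": current > previous,
            "target_met": current >= target,
        })
    return report

# Each entry: (previous year's value, current value, internal target)
sample = {
    "retention_rate": (0.82, 0.85, 0.80),
    "graduation_rate": (0.70, 0.68, 0.75),
}

for row in benchmark(sample):
    print(row)
```

A report generator could then flag any indicator whose `target_met` flag is false for follow-up in the improvement planning process.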
|Standard|KPI|Indicator|Measurement|
|---|---|---|---|
|Mission, vision, and planning|KPI-I-01|Strategic plan performance indicators|The percentage of the institution's strategic plan goals and performance indicators that met the yearly target level in the same year.|
|Governance, leadership, and management|KPI-I-02|Ratio of accredited programs|The proportion of programs in effect accredited by recognized accreditation bodies to the total number of programs.|
|Teaching and learning|KPI-I-03|Students' evaluation of program quality|Final-year students' rating of program quality on a five-point scale in an annual poll.|
|Teaching and learning|KPI-I-04|Student retention rate|The ratio of first-year students who return the following year to the total number of first-year students.|
|Teaching and learning|KPI-I-05|Graduates in employment or graduate school|The percentage of undergraduates who are hired or attending graduate school.|
|Teaching and learning|KPI-I-06|Undergraduate graduation rate within the selected period|The percentage of each batch's undergraduates who completed their programs on time.|
|Teaching and learning|KPI-I-07|Satisfaction with learning resources|Beneficiaries' satisfaction with learning resources on a five-point scale, in terms of adequacy and diversity (references, databases, etc.) and their support services.|
|Students|KPI-I-08|Employers' evaluation of graduates|An annual employer survey of graduate competence on a five-point scale.|
|Students|KPI-I-09|Average spending per student|Total annual student operating expenses per student.|
|Students|KPI-I-10|Students' satisfaction with online services|An annual poll of students' satisfaction with the institution's online services on a five-point scale.|
|Faculty and staff|KPI-I-11|Student-to-faculty ratio|The ratio of total students to full-time faculty, for the school as a whole and for each program.|
|Faculty and staff|KPI-I-12|Doctorate ratio|The percentage of faculty with accredited doctorates at each level.|
|Faculty and staff|KPI-I-13|Faculty attrition rate|The percentage of faculty who leave annually for significant reasons.|
|Institutional resources|KPI-I-14|Institution's self-generated income|Self-generated income as a percentage of overall institutional income.|
|Institutional resources|KPI-I-15|Beneficiaries' satisfaction with technical services|An annual survey of beneficiaries' satisfaction with technical support on a five-point scale, covering accessibility, support, and maintenance.|
|Research and innovation|KPI-I-16|Faculty publication rate|The proportion of full-time faculty who produced at least one research paper during the year.|
|Research and innovation|KPI-I-17|Innovation and excellence awards|Annual institutional patents and innovation excellence awards.|
|Research and innovation|KPI-I-18|Research budget ratio|The ratio of the research budget to the overall budget.|
|Research and innovation|KPI-I-19|Research funding ratio|External financing as a percentage of the annual research budget.|
|Community partnership|KPI-I-20|Community service beneficiaries' satisfaction|An annual survey of beneficiaries' satisfaction with community service programs and initiatives.|
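Several of these indicators reduce to simple ratios. A minimal sketch, using made-up counts rather than institutional data, of how an on-time graduation rate (in the style of KPI-I-06) and a student-to-faculty ratio (in the style of KPI-I-11) could be computed:

```python
# Illustrative KPI ratio calculations; all counts are hypothetical.

def graduation_rate(on_time_graduates, cohort_size):
    # Share of a cohort completing its program on time (KPI-I-06 style)
    return on_time_graduates / cohort_size

def student_faculty_ratio(total_students, full_time_faculty):
    # Total students per full-time faculty member (KPI-I-11 style)
    return total_students / full_time_faculty

print(f"Graduation rate: {graduation_rate(150, 200):.0%}")
print(f"Student-to-faculty ratio: {student_faculty_ratio(1200, 60):.1f}")
```

Computing each figure the same way every year, as the consistency remarks above require, is what makes the resulting comparisons meaningful.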
Indicators are evidence, but on their own they do not prove quality faults. Indicator data are evaluated in the context of what is being considered: a figure is favorable if it improves on previous figures and a concern if it falls. Consistently calculated figures are key data sources for assessing varied scenarios.
Each standard is addressed by choosing one or more KPIs that demonstrate it.
Benchmarking: identifying external standards or internal benchmarks against which to gauge one's performance is an excellent tool for continual growth and breakthroughs.
Retention: retention cohorts can be defined as first-time full-time, first-time part-time, etc., and retention rates can be computed in several ways. Still, the most typical method is to monitor enrollment from the fall semester of matriculation to the succeeding fall semesters.
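The fall-to-fall retention method described above can be sketched as follows; the student IDs and cohort labels are invented for illustration:

```python
# Hypothetical fall-to-fall retention: track which members of a fall
# matriculation cohort are still enrolled the following fall.
# Student IDs below are invented for illustration.

def fall_to_fall_retention(cohort_ids, next_fall_enrolled_ids):
    """Retention rate: returning cohort members / cohort size."""
    cohort = set(cohort_ids)
    returned = cohort & set(next_fall_enrolled_ids)
    return len(returned) / len(cohort)

fall_2023_cohort = ["s01", "s02", "s03", "s04", "s05"]
fall_2024_enrolled = ["s01", "s03", "s04", "s07", "s09"]

rate = fall_to_fall_retention(fall_2023_cohort, fall_2024_enrolled)
print(f"Retention rate: {rate:.0%}")  # 3 of 5 returned -> 60%
```

The same function can be run separately for first-time full-time and first-time part-time cohorts, since the definition of the cohort determines which variant of the rate is reported.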
To guarantee that SEC's Quality Assurance/Management System remains effective now and in the future, SEC employs its quality policy, objectives, audit findings, data analysis, corrective and preventive action, and management review.
Each year, the academic department prepares an Annual Accomplishment Report on the current operational plan and incorporates its findings into the operational plan for the next academic year.
This process helps departments that are not fulfilling their operational plan targets. Departments should use the improvement planning process to request funds, either as part of the monitoring process or as a voluntary self-assessment. Such planning includes several steps, to be carried out every year from now on:
⦁ Revisiting the key performance indicators (for assessing operational plans or project implementation) or program objectives (for program evaluation) set before execution, and comparing them against the outcomes or results.
⦁ Reviewing evaluation data to identify areas for improvement and hypothesize their root causes.
⦁ Determining priorities based on needs, including how data will be collected and assessed.
⦁ Selecting evidence-based tactics to accomplish the written objectives and strategies (actions).
⦁ Defining the parties involved in implementing the recommended activities, the timing, and the costs.
SEC rectifies nonconformities to prevent their recurrence; corrective measures must be proportional to the nonconformities. SEC also identifies preventive measures to avoid potential nonconformities; these must be proportionate to the potential issues. Preventive action includes:
⦁ Determining probable nonconformities and their causes.
⦁ Evaluating the need for action to prevent nonconformities.
⦁ Determining and implementing the needed action.
⦁ Documenting the results of the action taken.
⦁ Reviewing the preventive action taken.