
INTEGRATION OF EDUCATION. Vol. 23, No. 4. 2019 ISSN 1991-9468 (Print), 2308-1058 (Online) http://edumag.mrsu.ru

MODERNIZATION OF EDUCATION

УДК 371:33:005.336.3

DOI: 10.15507/1991-9468.097.023.201904.556-567

Assessing the Teaching Quality of Economics Programme: Instructor Course Evaluations

E. Hysa*, N. Ur Rehman

Epoka University, Tirana, Albania, * ehysa@epoka.edu.al

Introduction. In recent years, measuring the efficiency and effectiveness of higher education has become a major issue. Most developed countries use national surveys to measure teaching and assessment as key determinants of students' approaches to learning, which have a direct effect on the quality of their learning outcomes. In less developed countries, no such national survey exists. This paper aims to propose an original questionnaire for assessing teaching quality. The specific feature of this questionnaire, termed the Instructor Course Evaluation Survey, is that it addresses three main dimensions: Learning Resources, Teaching Effectiveness, and Student Support.

Materials and Methods. The paper opted for an analytic study using 3,776 completed questionnaires. This is a case study of the students enrolled in the economics program at a private university in Albania. The design of the Instructor Course Evaluation Survey was supported by the literature review, which identified the three main dimensions included in the questionnaire. Reliability was tested with Cronbach's alpha and with Confirmatory Factor Analysis. The use of Confirmatory Factor Analysis helps to identify issues of multi-dimensionality in scales.

Results. The paper provides empirical insights into the assessment methodology and proposes a new model of it. The findings suggest that Learning Resources, Teaching Effectiveness and Student Support increase the quality of teaching. Because of the chosen research target group, students of an economics program, the research results may not be generalizable. Therefore, researchers are encouraged to test the proposed statements further.

Discussion and Conclusion. The paper includes implications for the development of a simple and useful questionnaire assessing the quality of teaching. Although the Instructor Course Evaluation Survey was applied specifically to an economics program, the proposed questionnaire can be broadly applied. This paper fulfills an identified need for an original and simple questionnaire that can be used by different universities and programs to measure the quality of teaching.

Keywords: course evaluation, higher education, quality of teaching, economics program, confirmatory factor analysis

For citation: Hysa E., Ur Rehman N. Assessing the Teaching Quality of Economics Programme: Instructor Course Evaluations. Integratsiya obrazovaniya = Integration of Education. 2019; 23(4):556-567. DOI: https://doi.org/10.15507/1991-9468.097.023.201904.556-567

© Hysa E., Ur Rehman N., 2019

The content is available under Creative Commons Attribution 4.0 License.


Introduction

According to the Standards and Guidelines for Quality Assurance in the European Higher Education Area1, 'universities have to review their programs on a regular basis ensuring their compliance with international aims meeting learners' and social needs, especially on quality assurance'. Academic knowledge and skills, accompanied by concrete examples directly linked to the real world, remain crucial elements to be absorbed by and transmitted to students as learning tools and added value [1].

Stergiou and Airey, and Darwin, state that 'the systems for the evaluation of teaching and course quality in higher education institutions have long been established both in the United States and Australia and they have also become increasingly common in the United Kingdom' [2; 3]. Other authors, such as Clayson and Haley, Kuzmanovic et al. and Surgenor, note that they have been established in other countries too [4-6]. Student evaluation of teaching (SET) is the most commonly used method, as such surveys provide rapid feedback [7] and ratings that are easily compared across units and between instructors [8]. These surveys are used to identify problem areas and to set up action plans to address them. The evaluation of both teachers and teaching is an important part of higher education [9] and can be used to help improve teaching quality [10]. They are often an important part of accreditation processes too. Marsh, Paulsen and Richardson suggest that 'student ratings demonstrate acceptable psychometric properties which can provide important evidence for educational research' [11-13].

1 Standards and Guidelines for Quality Assurance in the European Higher Education Area (ESG). Brussels; 2015. Available at: http://enqa.eu/wp-content/uploads/2015/11/ESG_2015.pdf (accessed 01.06.2019). (In Eng.)

Law No. 9741, dated 21.05.2007, on higher education introduced new arrangements in administration, organisation and finance intended to improve the quality of Albanian HEIs in alignment with the European Standards2. Even though further amendments followed, through Law No. 9832, dated 12.11.2007, and Law No. 10 307, dated 22.07.2010, concerns about quality weaknesses in the HEIs remained3.

The new Law No. 80, dated 17.09.2015, "On Higher Education and Scientific Research in Higher Education Institutions of the Republic of Albania", enforces the establishment of internal and external quality-control mechanisms in each institution4. Article 103/3 of this law states that each institution must distribute and collect questionnaires before the final exams of each semester in order to gather data on the quality of teaching within its programs.

In 2014, the Ministry of Education and Sport of Albania and the Quality Assurance Agency for Higher Education (QAA) in the UK signed a Memorandum of Understanding, and during 2016-2017 all 35 HEIs in Albania, both public and private, entered the process of institutional accreditation5. One of the standards that an HEI has to fulfill is that "study programmes are subject to continuous improvement to increase quality", with the following concrete examples of this standard:

1. Lecturers are regularly assessed by institution structures that pursue qualitative implementation of study programmes.

2. Students are involved in evaluation of lecturers and study programme implementation.

3. Outcomes of examinations and competitions are published.

4. Study programmes are improved by taking into account the outcomes of their evaluation by academic staff and students.

5. Study programme quality is also evaluated by statistics on the employment of graduates of the relevant study programme6.

Even though the evaluation of study programs is a requirement, a systematic data collection and evaluation process is not well established in most Albanian universities. Hoxhaj and Hysa stated in 2015 that the main and most difficult challenge for HEIs in Albania is improving the control, monitoring and review of quality assurance in universities. Many public and private universities do not meet the standards required to operate, yet are still active in the education market [14].

2 Law Nr. 9741, date 21.05.2007, 'For the Higher Education in Republic of Albania' amended with laws No. 9832, date 12.11.2007, No. 10307, date 22.7.2010, No. 10493, date 15.12.2011, Nr. 82, date 14.02.2013, abrogated [Electronic resource]. Available at: http://www.aaal.edu.al/dokumente/en/Albania_Law__revised.pdf (accessed 01.06.2019).

3 Project Against Corruption in Albania (PACA). Technical Paper on Corruption in the Albania Education System, prepared by Pellumb Karameta, Council of Europe Expert. Council of Europe/European Union; 2010. Available at: http://rm.coe.int/CoERMPublicCommonSearchServices/DisplayDCTMContent?documentId=09000016806ec8bc (accessed 01.06.2019).

4 Law No. 80/2015, date 07.09.2015, 'On Higher Education and Scientific Research in Higher Education Institutions in the Republic of Albania' (AL) [Electronic resource]. Available at: http://www.aaal.edu.al/dokumente/legjislacioni/LAL_NR_80_2015.pdf (accessed 01.06.2019).

5 Albanian Accreditation Agency for Higher Education. Final Evaluation Report "The Provision of Quality Assurance Expertise to Support the Creation of External Quality Review Materials, Peer Reviewer Training and External Review of Higher Education Institutions in Albania". Albania: PAAHE and QAA; 2018. Available at: https://www.ascal.al/media/documents/publikime/Report_QAA_ASCAL.pdf (accessed 01.06.2019). (In Eng.)

6 Albanian Accreditation Agency for Higher Education. Institutional Review of Higher Education Institutions in Albania, The Handbook 2016-2017. Albania: PAAHE and QAA; 2016. Available at: https://www.aaal.edu.al/accreditation/images/documents/Albanian%20handbook%20FINAL%20VERSI0N_web.pdf (accessed 01.06.2019). (In Eng.)

Motivated by the requirement to measure teaching and course quality and by the lack of instructor evaluation survey analysis in the Albanian higher education system, this study provides a useful starting point for the present investigation. This is the first study of its kind conducted for any Albanian university. Epoka University (EU) is one of the leading universities in Albania and is often included in the list of the top three universities in the country7. This is the main reason for selecting EU as a case study. Secondly, this study can serve as a model of good practice, and the survey can be proposed as a quality measurement tool to other higher education institutions in the region.

More specifically, this study covers the economics program of the first cycle of study. The research begins with a general literature review on the use of different surveys and the variety of dimensions they include. A special part of the literature review covers previous studies that have used similar methods of analyzing student surveys. The second part is devoted to the methodology and data collection for our survey. The next section presents the descriptive statistics, which help to measure reliability and internal consistency using Cronbach's alpha, and the Confirmatory Factor Analysis (CFA) used to investigate the correlation between dimensions of the survey. Finally, the conclusions, discussion and study limitations are presented.

Literature Review

Bassi et al. state that one of the purposes of student surveys is the measurement of the quality of teaching [15]. However, it is difficult to define the quality of something, since it depends on many different elements. 'Different interest groups, or stakeholders, have different priorities' [16].

Spooren et al. state that different surveys have drawn on a great number of instruments available for students to assess teaching [17]. Some examples are given in Table 1 below.

Although some level of consensus regarding the characteristics of effective or good teaching has been reached [17], existing SETs instruments vary widely in the dimensions they try to capture [15].

T a b l e 1. Some typologies of the used questionnaires

Questionnaire Used                                             Author, Year
Instructional Development and Effectiveness Assessment         Cashin and Perrin, 1978
Students' Evaluation of Education Quality                      Marsh, 1982; Marsh et al., 2009
Course Experience Questionnaire                                Ramsden, 1991
Student Instructional Report                                   Centra, 1998
Students' Evaluation of Teaching Effectiveness Rating Scale    Toland and Ayala, 2005
Student Course Experience Questionnaire                        Ginns et al., 2007
Teaching Proficiency Item Pool                                 Barnes et al., 2008
SET37                                                          Mortelmans and Spooren, 2009
Exemplary Teacher Course Questionnaire                         Kember and Leung, 2008

Source: Authors' revision upon the work of Spooren et al. [17].

7 Umultirank, World University Rankings 2018-2019 [Electronic resource]. Available at: http://www.umultirank.org/study-at/epoka-university (accessed 01.06.2019); UniRank, World Universities Search Engine [Electronic resource]. Available at: https://www.4icu.org/al (accessed 01.06.2019); Webometrics, Ranking Web of Universities [Electronic resource]. Available at: http://www.webometrics.info/en/Europe/Albania (accessed 01.06.2019).

In their studies, Marsh [18], Marsh et al. [19] and Coffey and Gibbs [20] employed questionnaires covering a total of nine dimensions, three of which are similar to ours. Most of these works used the reliability test, Cronbach's alpha, and confirmatory factor analysis.

Kember and Leung used the case of 'designing a new course questionnaire to discuss the issues of validity, reliability and diagnostic power in good questionnaire design' [21]. The authors interviewed award-winning teachers about their principles and practices, resulting in nine dimensions of good teaching, which were developed into nine questionnaire scales. Along with testing reliability with Cronbach's alpha and with confirmatory factor analysis, the authors introduced 'the concept of diagnostic power as the ability of an instrument to distinguish between related constructs'.

Barth examined the student evaluation of teaching instrument used in the College of Business Administration at Georgia Southern University, which measured five dimensions: quality of instruction, course rigor, level of interest, grades and instructor helpfulness [22]. Apart from level of interest and grades, these dimensions match those in our survey. The author found that 'the overall instructor rating is primarily driven by the quality of instruction'.

Ginns et al. used the Course Experience Questionnaire to receive the students' perceptions on a number of dimensions, including 'Good Teaching, Clear Goals and Standards, Appropriate Assessment, Appropriate Workload, and Generic Skills development' [23]. 'Confirmatory factor analyses supported the hypothesised factor structure and estimates of inter-rater agreement on SCEQ scales indicated student ratings of degrees can be meaningfully aggregated up to the faculty level' [23].

Entwistle et al. define the teaching and learning environment as the aggregate of four elements: 'course contexts, teaching and assessment of contents, relationship between students and staff, and students and their cultures' [24]. This definition is similar to our survey. Course context corresponds to our learning resource scale. 'Course contexts include, among others, aims and intended learning outcomes for a specific course' [24]. 'Teaching and assessment of contents refer to pedagogical practices that support students' understanding of discipline-specific ways of thinking and reasoning' [25], which is consistent with the teaching effectiveness scale in our survey. 'Relationship between students and staff describes the affective quality of the relationships between students and teachers, such as the provision of flexible instructional support for both cognitively and affectively diverse learners' [26; 27]. This element corresponds to the last scale of our survey, the student support scale. The fourth element is not considered in the ICES.

Usage of Reliability, Validity and Confirmatory Factor Analysis in the Literature. Though the validity and reliability of the instrument are important, they are often not given sufficient attention [20]. Generally, the ratings reported from SET are assumed to be valid indicators of teaching performance [27]. 'There is only limited evidence-based research on the validity and the reliability of SET instruments in the literature' [8]. 'SETs typically contain groupings of items reflecting different dimensions of the student experience of a particular course, referred to as scales' [2].

Both reliability and validity are considered important psychometric properties of surveys. Although reliability may be measured in a number of ways, the most commonly accepted measure is internal-consistency reliability using the alpha coefficient. In their studies, Nunnally8 and Hinkin [28] define 'reliability as being concerned with the accuracy of the actual measuring instrument, and validity referring to the instrument's success at measuring what it purports to measure'.
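To make the alpha coefficient concrete, the following minimal sketch computes Cronbach's alpha from a respondents-by-items score matrix. The function name, variable names and the simulated data are our own illustration, not part of the paper.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Simulated 5-point Likert data (0 = definitely disagree ... 4 = definitely agree):
# a shared "true attitude" per respondent plus item-level noise, so items correlate.
rng = np.random.default_rng(0)
base = rng.integers(2, 5, size=(100, 1))
scores = np.clip(base + rng.integers(-1, 2, size=(100, 3)), 0, 4)
alpha = cronbach_alpha(scores)
```

For perfectly duplicated items with equal variances the formula returns exactly 1, which is a quick sanity check on any implementation.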

Traditionally, the internal structure of a questionnaire is evaluated via Confirmatory Factor Analysis [2; 21; 29-31], 'which tests the theoretically justified measurement model against the data collected with the questionnaire' [32].

8 Nunnally J.C. Psychometric Theory. 2nd ed. Hillsdale, NJ: McGraw-Hill; 1978. (In Eng.)

Methodology and Data Collection

The main objective of this study is to validate the scales of the student evaluation of teaching used in the bachelor program of economics at Epoka University in Albania. The population for the study consisted of students of the above-mentioned program in the academic year 2017-2018. EU has been using its own survey, named the "Instructor Course Evaluation Survey", which was completed electronically; participants were assured that their responses would be kept confidential and anonymous. Students had to complete the form before the final exam period of the fall and spring semesters.

Two categories of students complete the survey: students enrolled in the economics department, and others taking this department's courses as electives. These students are in the first, second and third years of their studies. Survey results are displayed electronically in the university's interactive system. Individual results are reported to each faculty member accordingly. Moreover, the list of the courses offered per program under the department is visible in each department head's account.

The Instructor Course Evaluation Survey was completed for 41 courses in the fall semester and 43 courses in the spring semester, a total of 84 courses for the academic year 2017-2018. These 84 courses represent the collective evaluations of 32 different instructors, based on the surveys of 3,776 students. The response rate to this survey was very high: the lowest response rate per course was 90.00% and the highest 100%.

The ICES used was based on a 14-item instrument grouped into three scales reflecting different dimensions of teaching: the Learning Resources Scale, the Teaching Effectiveness Scale, and the Student Support Scale. Students are required to evaluate the teaching of each course by responding to the questions using a 5-point Likert scale, from 0 for 'definitely disagree' to 4 for 'definitely agree'.

The ICES also included a section in which students could write additional comments. Even though reading all the comments and including this information in the analysis is demanding work, these comments are often rich and much more informative, and they can serve to reinforce the students' quantitative evaluations.

The 14 items are categorized under 3 dimensions (see Table 2), which can be summarized as:

Learning Resources Scale (LRS) - items mostly related to the course type, structure and organization.

Teaching Effectiveness Scale (TES) - items covering the teaching methodology, effectiveness and assessment.

Student Support Scale (SSS) - items comprising the lecturers' readiness to support students and their punctuality.
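The grouping of items into scales can be sketched as follows. This is an illustrative snippet only: the item codes follow the ICES (LRS_1 ... SSS_6), but the response data are simulated, not the paper's.

```python
import numpy as np

# Items per dimension, using the ICES codes
scales = {
    "LRS": ["LRS_1", "LRS_2", "LRS_3"],
    "TES": ["TES_1", "TES_2", "TES_3", "TES_4", "TES_5"],
    "SSS": ["SSS_1", "SSS_2", "SSS_3", "SSS_4", "SSS_5", "SSS_6"],
}
items = [i for group in scales.values() for i in group]

# Simulated 5-point Likert responses (0 = definitely disagree ... 4 = definitely agree)
rng = np.random.default_rng(42)
responses = rng.integers(0, 5, size=(100, len(items)))  # 100 students x 14 items

# Mean score per scale: average the columns belonging to each dimension
col = {name: j for j, name in enumerate(items)}
scale_means = {
    s: responses[:, [col[i] for i in members]].mean()
    for s, members in scales.items()
}
```

With real survey exports, the same dictionary-driven grouping keeps the scale definitions in one place, so adding or reordering items cannot silently shift which columns feed each scale mean.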

Results and Discussion

Descriptive statistics. Table 2 shows the descriptive and reliability statistics of the instructor course evaluation survey (ICES) using the learning resource scale (LRS), teaching effectiveness scale (TES) and student support scale (SSS). These three scales measure the efficiency and effectiveness of teaching. The first column reports the ICES statements obtained from the students of economics. The second column shows the code of each statement (see Table 2); statements are coded as LRS_1, LRS_2, LRS_3, TES_1 and so forth. Mean values and standard deviations are reported in the third and fourth columns. Overall, the mean values are greater than 3, which indicates that most students ranked their teachers' performance as satisfactory. Regarding the reliability and internal consistency of the variables (or statements), the Cronbach's alpha values indicate that the statements related to the learning resource scale (LRS), teaching effectiveness scale (TES) and student support scale (SSS) are highly correlated, and the reliability of all these variables shows excellent (alpha > 0.90) test scores.

T a b l e 2. Descriptive statistics of academic learning feedback (N = 3,776)

Components of instructor course evaluation survey                                          Code    Mean   SD     alpha

Learning Resource Scale (LRS)
The outline and objectives of the course were clearly presented in the syllabus            LRS_1   3.479  0.337  0.993
The textbook and/or reading materials were helpful for understanding the subject matter    LRS_2   3.431  0.366  0.994
The course increased my knowledge and interest in the subject matter                       LRS_3   3.422  0.367  0.993

Teaching Effectiveness Scale (TES)
The methods of teaching in this course were appropriate                                    TES_1   3.412  0.374  0.993
The instructor made appropriate use of course materials (textbooks, supplements etc.)      TES_2   3.454  0.349  0.993
The instructor used the language of instruction effectively                                TES_3   3.454  0.366  0.993
The instructor engaged and motivated the class very well                                   TES_4   3.404  0.398  0.993
The instructor graded my work fairly                                                       TES_5   3.513  0.341  0.994

Student Support Scale (SSS)
The instructor was well prepared for the lectures                                          SSS_1   3.488  0.325  0.993
The instructor was available to give help outside the class                                SSS_2   3.457  0.341  0.994
The instructor came to class on time                                                       SSS_3   3.561  0.271  0.994
The instructor attended the class regularly                                                SSS_4   3.553  0.267  0.994
The instructor had effective dialogue with students during the class                       SSS_5   3.451  0.372  0.993
The instructor demonstrated concern regarding my grade                                     SSS_6   3.412  0.351  0.992

Source: Authors' calculations.

Confirmatory factor analysis of the Instructor Course Evaluation Survey (ICES). To investigate the correlation between LRS, TES and SSS, we used path covariance analysis, also known as confirmatory factor analysis (see Figure). The Figure shows the latent variables in three circles labeled 'LRS', 'TES' and 'SSS'. Each latent (unobserved) variable is linked with its proxies (observed variables). For example, the learning resource scale (LRS) is associated with LRS_1, LRS_2 and LRS_3, each with an error term (in small circles). Similarly, the teaching effectiveness scale (TES) is a latent variable related to TES_1, TES_2, TES_3 and so forth. A one-sided arrow shows the linear relationship (regression) between a latent variable and its proxies. Along each arrow the factor loading values are reported; each factor loading shows the strength of the relationship between a latent and an observed variable. A two-sided arrow presents the correlation (covariance) between latent variables; a higher value means the two variables are more strongly correlated. Regarding the fitness of the factor model, the comparative fit index (CFI) value is 0.90, which suggests that the model is a good fit (see Table 3). Similarly, other model fit statistics, such as the root mean square error of approximation (RMSEA) with a value of 0.05 and the standardized root mean square residual (SRMR) of 0.019, show that our confirmatory factor analysis is appropriate.

T a b l e 3. Fit indices including chi-square, p, CFI for ICES

Fit indices       χ²        p-value    CFI     RMSEA    SRMR
ICES 3 Factors    444.698   0.000      0.90    0.05     0.019

Source: Authors' calculations.
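The fit indices in Table 3 follow standard closed-form definitions, sketched below. Note the hedges: the paper reports χ² = 444.698 and N = 3,776 but not the degrees of freedom or the baseline (independence) model's χ², so the df = 74 (the usual value for a 3-factor, 14-item CFA with free factor covariances) and the baseline figures in the example call are illustrative assumptions, not the authors' numbers.

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """Root mean square error of approximation from the model chi-square."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2_m: float, df_m: int, chi2_b: float, df_b: int) -> float:
    """Comparative fit index: target model vs. baseline (independence) model."""
    d_m = max(chi2_m - df_m, 0.0)            # non-centrality of the target model
    d_b = max(chi2_b - df_b, d_m)            # non-centrality of the baseline model
    return 1.0 - d_m / d_b if d_b > 0 else 1.0

# Illustrative call with assumed df and baseline values (see lead-in)
print(round(rmsea(444.698, 74, 3776), 3))
print(round(cfi(444.698, 74, 12000.0, 91), 3))
```

A model that reproduces the data exactly (χ² = df) yields RMSEA = 0 and CFI = 1, which is a useful boundary check when implementing these formulas.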


F i g u r e. Confirmatory factor analysis using path diagram of ICES

Concerning the statistical relationship between ICES variables scale, the results are reported in Table 4. Learning resource scale is measured with 3 items. The standardized coefficients (or factor loadings) are high and showed significant association (P > 0.90; p-value = 0.000) to learning resource scale (LRS). This outcome indicates that clarity regarding syllabus, textbook and reading materials positively enhance the students learning skills. Regarding teaching effectiveness (TES), the teaching methodology, instructor use of course related knowledge, effective communication and teacher assessment of students' grades are positively correlated (P = 0.9; p-value = = 0.000) with teaching effectiveness scale.

The student support scale (SSS) also shows strong statistical evidence (β > 0.90; p-value = 0.000), rejecting the null hypothesis. This outcome indicates that the instructor's preparation for lectures, punctuality and interaction with students support their learning abilities (see Table 4). Lastly, the three latent (unobserved) variables (see Table 4 or Figure) show strong positive correlations with each other. This finding suggests that using the learning resource scale (LRS), the teaching effectiveness scale (TES) and the student support scale (SSS) will increase the quality of teaching in degree programs. Our results thus confirm the validity of the three ICES scales, which are useful to implement in Albanian higher education institutions.
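The three-factor structure estimated here can be written down as a lavaan-style measurement model, where each `=~` line loads a block of observed items on its latent factor and latent covariances are estimated by default. The snippet below is a sketch of how the ICES model could be specified for an SEM package such as semopy or lavaan; it is not the authors' actual estimation code, and the `semopy` calls in the comment are indicative usage only.

```python
# Lavaan-style specification of the three-factor ICES model:
# each latent variable (left of '=~') loads on its observed items.
ICES_MODEL = """
LRS =~ LRS_1 + LRS_2 + LRS_3
TES =~ TES_1 + TES_2 + TES_3 + TES_4 + TES_5
SSS =~ SSS_1 + SSS_2 + SSS_3 + SSS_4 + SSS_5 + SSS_6
"""

# Indicative usage with semopy (assumes a DataFrame `df` whose
# columns match the item names above):
#   model = semopy.Model(ICES_MODEL)
#   model.fit(df)
#   semopy.calc_stats(model)  # chi-square, CFI, RMSEA, SRMR, ...
```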

T a b l e 4. Confirmatory factor analysis with standardized factor loadings of ICES

Path: Latent → Observed    Standardized factor loadings    Residual variance

Learning Resource Scale
LRS → LRS_1    0.971*    0.056
LRS → LRS_2    0.964*    0.068
LRS → LRS_3    0.988*    0.022

Teaching Effectiveness Scale
TES → TES_1    0.985*    0.031
TES → TES_2    0.976*    0.050
TES → TES_3    0.973*    0.043
TES → TES_4    0.978*    0.043
TES → TES_5    0.961*    0.008

Student Support Scale
SSS → SSS_1    0.978*    0.042
SSS → SSS_2    0.971*    0.062
SSS → SSS_3    0.902*    0.185
SSS → SSS_4    0.920*    0.152
SSS → SSS_5    0.981*    0.035
SSS → SSS_6    0.974*    0.005

Path: Latent ↔ Latent
Cov (LRS, TES)    0.993*
Cov (LRS, SSS)    0.988*
Cov (TES, SSS)    1.000*

* indicates significant at the 1% level.

Source: Authors' calculations.
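Because the loadings in Table 4 are standardized, each residual variance should be close to one minus the squared loading (1 − λ²). The sanity check below replays the table's values; two entries deviate from the implied residual, which may simply reflect rounding or typesetting in the printed table.

```python
# (item, standardized loading, reported residual variance) from Table 4.
rows = [
    ("LRS_1", 0.971, 0.056), ("LRS_2", 0.964, 0.068), ("LRS_3", 0.988, 0.022),
    ("TES_1", 0.985, 0.031), ("TES_2", 0.976, 0.050), ("TES_3", 0.973, 0.043),
    ("TES_4", 0.978, 0.043), ("TES_5", 0.961, 0.008),
    ("SSS_1", 0.978, 0.042), ("SSS_2", 0.971, 0.062), ("SSS_3", 0.902, 0.185),
    ("SSS_4", 0.920, 0.152), ("SSS_5", 0.981, 0.035), ("SSS_6", 0.974, 0.005),
]

for item, lam, res in rows:
    implied = 1.0 - lam ** 2  # residual variance implied by the loading
    flag = "" if abs(implied - res) < 0.02 else "  <-- check"
    print(f"{item}: reported {res:.3f}, implied {implied:.3f}{flag}")
```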

Conclusion

Researchers widely use student ratings of instruction as a metric of instructor performance [22]. 'From the university pedagogics perspective, in order to support students' learning and thinking, it is important to know how students perceive their teaching-learning environments' [25].

This study aimed at validating the scale of students' evaluation of teaching used by Epoka University in Albania, with particular regard to the economics program and the indicators assessing the teaching carried out by instructors of this university. The satisfying results concerning the statistical validity and reliability of the questionnaire lay the foundation for improvements in the quality of teaching and learning processes. The three scales/dimensions used in ICES, relating to learning resources, teaching effectiveness and student support, are found to be highly correlated, and all these variables are reliable and show internal consistency.
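Internal consistency of each ICES scale was assessed with Cronbach's alpha. A self-contained sketch of the computation follows; the Likert responses below are made-up illustration data, not actual survey responses.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    `items` is a list of columns: one list of respondent scores per
    questionnaire item (all columns the same length)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(var(c) for c in items) / var(totals))

# Illustrative 5-point Likert responses for a 3-item scale (made up):
lrs_items = [
    [5, 4, 4, 3, 5, 4],   # LRS_1
    [5, 4, 5, 3, 4, 4],   # LRS_2
    [4, 4, 4, 3, 5, 5],   # LRS_3
]
print(round(cronbach_alpha(lrs_items), 3))
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency for a scale.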

Both the comparative fit index and the root mean square error of approximation show that the model is a good fit and that the confirmatory factor analysis used is appropriate. The three scales are correlated with each other, underlining the fact that together they contribute significantly and positively to the quality of teaching in this program.

Although the results reported here are specific to Epoka University, its economics program and ICES, researchers can use the same survey to measure teaching performance and to find out whether the three dimensions of ICES are reliable and valid for their institution. The use of such surveys and the examination of their dimensions make possible a better understanding of teaching quality and the factors affecting it.

REFERENCES

1. Hysa E. Defining a 21st Century Education: Case Study of Development and Growth Course. Mediterranean Journal of Social Sciences. 2014; 5(2):41-46. (In Eng.) DOI: https://doi.org/10.5901/mjss.2014.v5n2p41

2. Stergiou D.P., Airey D. Using the Course Experience Questionnaire for Evaluating Undergraduate Tourism Management Courses in Greece. Journal of Hospitality, Leisure, Sport & Tourism Education. 2012; 11(1):41-49. (In Eng.) DOI: https://doi.org/10.1016/j.jhlste.2012.02.002

3. Darwin S. Moving Beyond Face Value: Re-Envisioning Higher Education Evaluation as a Generator of Professional Knowledge. Assessment & Evaluation in Higher Education. 2012; 37(6):733-745. (In Eng.) DOI: https://doi.org/10.1080/02602938.2011.565114

4. Clayson D.E., Haley D.A. Are Students Telling Us the Truth? A Critical Look at the Student Evaluation of Teaching. Marketing Education Review. 2011; 21(2):101-112. (In Eng.) DOI: https://doi.org/10.2753/MER1052-8008210201

5. Kuzmanovic M., Savic G., Gusavac B.A., Makajic-Nikolic D., Panic B. A Conjoint-Based Approach to Student Evaluations of Teaching Performance. Expert Systems with Applications. 2013; 40(10):4083-4089. (In Eng.) DOI: https://doi.org/10.1016/j.eswa.2013.01.039

6. Surgenor P.W. Obstacles and Opportunities: Addressing the Growing Pains of Summative Student Evaluation of Teaching. Assessment & Evaluation in Higher Education. 2013; 38(3):363-376. (In Eng.) DOI: https://doi.org/10.1080/02602938.2011.635247

7. Shevlin M., Banyard P., Davies M., Griffiths M. The Validity of Student Evaluation of Teaching in Higher Education: Love Me, Love My Lectures? Assessment & Evaluation in Higher Education. 2000; 25(4):397-405. (In Eng.) DOI: https://doi.org/10.1080/713611436

8. Oon P.T., Spencer B., Kam C.C.S. Psychometric Quality of a Student Evaluation of Teaching Survey in Higher Education. Assessment & Evaluation in Higher Education. 2017; 42(5):788-800. (In Eng.) DOI: https://doi.org/10.1080/02602938.2016.1193119

9. Nasser F., Fresko B. Faculty Views of Student Evaluation of College Teaching. Assessment & Evaluation in Higher Education. 2002; 27(2):187-198. (In Eng.) DOI: https://doi.org/10.1080/02602930220128751

10. Hammonds F., Mariano G.J., Ammons G., Chambers S. Student Evaluations of Teaching: Improving Teaching Quality in Higher Education. Perspectives: Policy and Practice in Higher Education. 2017; 21(1):26-33. (In Eng.) DOI: https://doi.org/10.1080/13603108.2016.1227388

11. Marsh H.W. Students' Evaluations of University Teaching: Research Findings, Methodological Issues, and Directions for Future Research. International Journal of Educational Research. 1987; 11(3):253-388. (In Eng.) DOI: https://doi.org/10.1016/0883-0355(87)90001-2

12. Paulsen M.B. Evaluating Teaching Performance. New Directions for Institutional Research. Special Issue: Evaluating Faculty Performance. 2002; (114):5-18. (In Eng.) DOI: https://doi.org/10.1002/ir.42

13. Richardson J.T. Instruments for Obtaining Student Feedback: A Review of the Literature. Assessment & Evaluation in Higher Education. 2005; 30(4):387-415. (In Eng.) DOI: https://doi.org/10.1080/02602930500099193

14. Hoxhaj J., Hysa E. Comparing ENQA, British, German & Albanian Standards of Quality in Higher Education. European Journal of Sustainable Development. 2015; 4(2):243-258. (In Eng.) DOI: http://dx.doi.org/10.14207/ejsd.2015.v4n2p243

15. Bassi F., Clerici R., Aquario D. Students' Evaluation of Teaching at a Large Italian University: Validation of Measurement Scale. Electronic Journal of Applied Statistical Analysis. 2017; 10(1):93-117. (In Eng.) DOI: https://doi.org/10.1285/i20705948v10n1p93

16. Newton J. What is Quality? In: Embedding Quality Culture in Higher Education. A Selection of Papers from the 1st European Forum for Quality Assurance (23-25 November 2006, Germany) / L. Bollaert, S. Brus, B. Curvale, et al. (Eds.). 2007. p. 14-24. Available at: https://enqa.eu/indirme/papers-and-reports/associated-reports/EUA_QA_Forum_publication.pdf (accessed 11.10.2019). (In Eng.)

17. Spooren P., Brockx B., Mortelmans D. On the Validity of Student Evaluation of Teaching: The State of the Art. Review of Educational Research. 2013; 83(4):598-642. (In Eng.) DOI: https://doi.org/10.3102/0034654313496870

18. Marsh H.W. SEEQ: A Reliable, Valid, and Useful Instrument for Collecting Students' Evaluations of University Teaching. British Journal of Educational Psychology. 1982; 52(1):77-95. (In Eng.) DOI: https://doi.org/10.1111/j.2044-8279.1982.tb02505.x

19. Marsh H.W., Muthén B., Asparouhov T., Lüdtke O., Robitzsch A., Morin A.J. et al. Exploratory Structural Equation Modeling, Integrating CFA and EFA: Application to Students' Evaluations of University Teaching. Structural Equation Modeling: A Multidisciplinary Journal. 2009; 16(3):439-476. (In Eng.) DOI: https://doi.org/10.1080/10705510903008220

20. Coffey M., Gibbs G. The Evaluation of the Student Evaluation of Educational Quality Questionnaire (SEEQ) in UK Higher Education. Assessment & Evaluation in Higher Education. 2001; 26(1):89-93. (In Eng.) DOI: https://doi.org/10.1080/02602930020022318

21. Kember D., Leung D.Y. Establishing the Validity and Reliability of Course Evaluation Questionnaires. Assessment & Evaluation in Higher Education. 2008; 33(4):341-353. (In Eng.) DOI: https://doi.org/10.1080/02602930701563070

22. Barth M.M. Deciphering Student Evaluations of Teaching: A Factor Analysis Approach. Journal of Education for Business. 2008; 84(1):40-46. (In Eng.) DOI: http://dx.doi.org/10.3200/JOEB.84.1.40-46

23. Ginns P., Prosser M., Barrie S. Students' Perceptions of Teaching Quality in Higher Education: The Perspective of Currently Enrolled Students. Studies in Higher Education. 2007; 32(5):603-615. (In Eng.) DOI: http://dx.doi.org/10.1080/03075070701573773

24. Entwistle N. Promoting Deep Learning Through Teaching and Assessment: Conceptual Frameworks and Educational Contexts. [Electronic resource]. 2003. Available at: http://www.leeds.ac.uk/educol/documents/00003220.htm (accessed 11.10.2019). (In Eng.)

25. Entwistle N., McCune V., Hounsell J. Approaches to Studying and Perceptions of University Teaching-Learning Environments: Concepts, Measures and Preliminary Findings. In: Occasional Report, 1. Evaluating faculty performance [Electronic resource]. 2002. Available at: https://www.semanticscholar.org/paper/Approaches-to-Studying-and-Perceptions-of-%3A-%2C-and-Entwistle-McCune/e371ce75da787bb4dfe1c03c9917213d43ad6d36 (accessed 11.10.2019). (In Eng.)

26. McCune V. Final Year Biosciences Students' Willingness to Engage: Teaching-Learning Environments, Authentic Learning Experiences and Identities. Studies in Higher Education. 2009; 34(3):347-361. (In Eng.) DOI: https://doi.org/10.1080/03075070802597127

27. Hativa N., Stanley C. Student Ratings of Instruction: Recognizing Effective Teaching. CreateSpace Independent Publishing Platform; 2013. (In Eng.)

28. Hinkin T.R. A Review of Scale Development Practices in the Study of Organizations. Journal of Management. 1995; 21(5):967-988. (In Eng.) DOI: https://doi.org/10.1016/0149-2063(95)90050-0

29. Beran T., Violato C., Kline D., Frideres J. What Do Students Consider Useful about Student Ratings? Assessment & Evaluation in Higher Education. 2009; 34(5):519-527. (In Eng.) DOI: https://doi.org/10.1080/02602930802082228

30. Lemos M.S., Queirós C., Teixeira P.M., Menezes I. Development and Validation of a Theoretically Based, Multidimensional Questionnaire of Student Evaluation of University Teaching. Assessment & Evaluation in Higher Education. 2011; 36(7):843-864. (In Eng.) DOI: https://doi.org/10.1080/02602938.2010.493969

31. Zhao J., Gallant D.J. Student Evaluation of Instruction in Higher Education: Exploring Issues of Validity and Reliability. Assessment & Evaluation in Higher Education. 2012; 37(2):227-235. (In Eng.) DOI: https://doi.org/10.1080/02602938.2010.523819


32. Utriainen J., Tynjälä P., Kallio E., Marttunen M. Validation of a Modified Version of the Experiences of Teaching and Learning Questionnaire. Studies in Educational Evaluation. 2018; 56:133-143. (In Eng.)

Submitted 15.07.2019; revised 07.10.2019; published online 31.12.2019.


About the authors:

Eglantina Hysa, Lecturer at the Economics Department, Epoka University (Autostrada Tiranë-Rinas, km. 12, Tirana 1000, Albania), Ph. D., Associate Professor, ORCID: https://orcid.org/0000-0002-3429-5738, Scopus ID: 56005581900, Researcher ID: T-7876-2018, ehysa@epoka.edu.al

Naqeeb Ur Rehman, Lecturer at the Economics Department, Epoka University (Autostrada Tiranë-Rinas, km. 12, Tirana 1000, Albania), Ph. D., ORCID: https://orcid.org/0000-0003-1015-2588, Scopus ID: 57190133070, nrehman@epoka.edu.al

Contribution of the authors:

Eglantina Hysa - contributed the analysis of the Albanian environment and the specifics of higher education quality and accreditation processes. In addition, she conceptualized the quality measurement for the economics department.

Naqeeb Ur Rehman - contributed the statistical analysis and the general interpretation of the survey results associated with the specific dimensions of the survey.

All authors have read and approved the final manuscript.

