Document Type : Original Research Paper

Authors

1 Ph.D. Candidate in Architecture, Department of Architecture, Ahvaz Branch, Islamic Azad University, Ahvaz, Iran.

2 Visiting Associate Professor, Department of Architecture, Ahvaz Branch, Islamic Azad University, Ahvaz, Iran / Associate Professor, School of Architecture and Environmental Design, Shahid Beheshti University, Tehran, Iran.

3 Visiting Assistant Professor, Department of Architecture, Ahvaz Branch, Islamic Azad University, Ahvaz, Iran / Assistant Professor, School of Engineering, Shahid Chamran University of Ahvaz, Ahvaz, Iran.

4 Visiting Professor, Department of Educational Sciences, Ahvaz Branch, Islamic Azad University, Ahvaz, Iran / Professor, School of Educational Sciences and Psychology, Shahid Chamran University of Ahvaz, Ahvaz, Iran.

Abstract

Extended Abstract
Background and Objectives: In any leading education system, a continuous and comprehensive learning process depends on sound teaching methods. The evaluation of architectural design projects is intended to judge the designs: it measures how far each design meets the relevant criteria from the desired perspective and then assesses it accordingly. Given the pivotal role of judgment in the architecture curriculum, if the evaluation process is unclear and no atmosphere of productive criticism is provided, personal interpretations or demands unrelated to the educational goals may distort the judgment process and prevent students' development and talents from flourishing. Conversely, if the judgment criteria are known, students progress gradually in both quantitative and qualitative terms, broadening their understanding of the architectural education system and of how to represent their work. The present research was conducted to identify the indicators and criteria affecting the evaluation of university architectural design projects, as part of the students' learning process, in order to provide a more accurate and objective evaluation method. The comprehensive education process is examined in terms of teaching, learning, and assessment, and the study investigates the role of architectural design evaluation in learning and in improving students' scientific knowledge.
Methods: This applied research uses a mixed (qualitative-quantitative) method. The statistical population comprises 15 faculty members at Shahid Beheshti University, the University of Tehran, the Iran University of Science and Technology, and Shahid Chamran University of Ahvaz. A systematic, non-random sampling method was applied, and the samples were selected according to the respondents' educational fields, given the importance of scoring and its direct effect on the research results. Because students are among the most important actors in the evaluation process, their opinions on architectural design evaluation were also considered; accordingly, the master's students of Architectural Design (3) at the Islamic Azad University, Ahvaz Branch, were selected as the student population. Data were collected using a Likert-scale questionnaire. The results were analyzed in SPSS, applying the Spearman correlation test to assess the research model and the Friedman test to prioritize the variables and assess its validity. The experts' grading was considered in the final assessment of the architectural design projects. The results obtained from the questionnaires inform the proposed strategies and the scoring of the criteria and judgment rules for architectural design projects.
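To make the statistical procedure concrete, the following is a minimal sketch (written in Python with SciPy rather than the authors' SPSS workflow) of a Spearman correlation and a Friedman ranking test applied to Likert-scale questionnaire responses; the variable names and the synthetic data are illustrative assumptions only, not the study's data.

```python
# Minimal sketch (not the authors' SPSS workflow): Spearman correlation and
# Friedman ranking on hypothetical 1-5 Likert-scale questionnaire data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical ratings from 30 respondents for each component
final_product       = rng.integers(1, 6, size=30)
technical_knowledge = rng.integers(1, 6, size=30)
design_skill        = rng.integers(1, 6, size=30)
process_development = rng.integers(1, 6, size=30)
initial_knowledge   = rng.integers(1, 6, size=30)

# Spearman correlation of each component with the final product
for name, scores in [("technical knowledge", technical_knowledge),
                     ("design skill", design_skill),
                     ("process development", process_development),
                     ("initial knowledge", initial_knowledge)]:
    rho, p = stats.spearmanr(scores, final_product)
    print(f"{name}: rho={rho:.3f}, p={p:.3f}")

# Friedman test to rank the components across respondents
chi2, p = stats.friedmanchisquare(technical_knowledge, design_skill,
                                  process_development, initial_knowledge)
print(f"Friedman chi-square={chi2:.3f}, p={p:.3f}")
```

Under this kind of analysis, the components' mean ranks (as in a Friedman procedure) can then be compared to prioritize the variables, which is how the study reports its ranking of components and sub-components.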
Findings: The findings show a significant relationship between the four components affecting the final product, although the magnitude of each component's effect differs. Studies and technical knowledge, with a correlation coefficient of 0.535 at a significance level of 0.04, has the strongest effect on the final product and is therefore the most important and effective factor. The other effective factors are, in order, design skills, design process development, and initial knowledge; the initial-knowledge component has the weakest effect on the final product. The Friedman ranking test shows that the sub-components of analysis and interpretation of the final results, presentation technique, and the model (replica) have the highest mean ranks and are thus the most important sub-components affecting the judgment of university projects. They are followed by the design idea, creativity and building form, the subject, and the ability to analyze and present, which also weigh heavily in the final judgment of the designs, whereas planning and functional design and oral presentation have the least impact. Based on the ranking of the sub-components, the weighted index of the five main components affecting the judgment of final designs is, in order: the final product, studies and technical knowledge, design skill, design process development, and initial knowledge.
Conclusion: According to the studies conducted, evaluation holds an important and valuable place in the learning process, and if students are dissatisfied with it, their learning suffers. In this regard, it is suggested that "learner-centered" sessions be held to evaluate the design process during the semester and to increase students' learning. Since the highest scores were given to learning through in-class evaluation, specialization, and roundtable discussion, it is recommended that professors hold their design classes collaboratively and invite professional architects as experts to the initial sessions so that students become more familiar with professional practice. Students should be able to choose their professor from among the studio professors after the initial class sessions, once they are familiar with the professors' viewpoints, in order to reduce confusion. Classroom evaluation sessions are best held in a participatory, roundtable manner, with students of various levels in attendance. This research proposes strategies for professors and decision-makers on the judgment of architectural designs, reducing students' stress and anxiety and increasing their self-confidence in the architectural design studio. Suggestions for architectural evaluation and policy-making are offered to raise the level of architectural education and, ultimately, to train competent students and architects.

Graphical Abstract

The judgment strategies for architectural designs and their role in the students' learning process

Highlights

- Strategies proposed for judging projects include: Step 1: provide students with forms describing the assessment and evaluation steps; Step 2: assess the student's design process during the semester; Step 3: evaluate the final product; Step 4: review the assessment and evaluation results.
- The design process should be "learner-centered" during the semester: class sessions should be held by two or three professors so that students become familiar with different points of view, professional architects should be invited as practitioners, and students should be given the forms describing how the design process is assessed during the semester and how the product is evaluated at its end, so that they become more familiar with the judging of architectural designs.

Keywords

This article is derived from the first author's doctoral dissertation, entitled "Explaining the Assessment Strategies of Architectural Designs Based on the Globalization of Architectural Education", supervised by the second author and advised by the third and fourth authors at the Islamic Azad University, Ahvaz Branch.
