Feedback
Following the collection of responses from students, peers, management and other institutions, my thoughts turned to how these could be collated and used as a proposal for pedagogical change.
The new Inclusive Reviews were conceived as an evolving guide, to be evaluated and revised by each new cohort at the start of every academic year. The feedback proposed that the existing core Learning Outcomes (which currently total 100% and must all be addressed) should account for only 50% of the grade, placing a greater emphasis on social, political and climate concerns in design proposals. The responses also included new criteria for the presentation itself, to ensure the Review remains an immersive creative act. How would these ideas be perceived? I created a Miro board to gather feedback from department peers and management:
The response from the department to date has been positive: they found the desire for grades ‘surprising’ and considered the feedback suggestions generally ‘provocative’. The Stage 1 leader thought awarding marks for the Reviews as proposed was a good idea and suggested we implement it in future.
Implementation and Testing
Following my requests to other institutions for their review guidelines, I was asked to share my own research and findings. At Cambridge they responded by testing peer reviews for Stage 1 for the first time, in line with the UAL students’ suggestion of additional pin-ups to destigmatise the Reviews. The heads of year shared their feedback, confirming:
Students’ responses were ‘what is my to do list, ask the tutors – students felt it was missing from the feedback, they were thankful for being given that platform, it was difficult at the start with silence in the room, felt too floppy – wanted tutor feedback – scaffolding the experience’. This shows that students were still actively requesting structure and guidelines; the plan is to devise a set of guidelines for the department.
Future Pedagogical Implementation?
The research has garnered interest following discussions at APSA (Association for Professional Studies in Architecture) in November; members have invited me to present the feedback at their spring meeting in 2024. In addition, following meetings at UCL, colleagues there have encouraged me to formalise the findings into a report that could be presented at the annual RIBA Education Away Day in 2024.
Conclusion
The design review (crit) is one of the most common pedagogical and feedback methods used in the architectural, arts and design disciplines. This research gave me valuable insight into the experiences and learning opportunities students gain during reviews, and its main focus was identifying opportunities for more productive practices. Reviewing the literature, I learned that students often see design reviews more as a ‘rite of initiation’ than as a learning experience, as an event for unconstructive feedback, as a source of stress and anxiety, and/or as an experience that reveals asymmetrical relations of power: clearly much more work and research is needed to improve design reviews. I agree with Smith (2011, 63) in suggesting that design reviews should be a creative and flexible event in themselves, but would add that a dialogic approach that seeks to mitigate the power imbalance is also an important goal. Other modifications that can help to improve the learning experience during reviews include: [1] a change in the physical arrangement and procedural format, [2] clear, constructive and objective feedback, and [3] a well-established set of rules and codes of practice.
Bibliography