SOTEL

Melbourne CSHE Scholarship of Technology Enhanced Learning – a digital education network Hub

Introducing the Faculty of Arts eTeaching/eLearning unit

The Faculty of Arts eTeaching/eLearning unit supports Faculty of Arts teaching staff to integrate technology into teaching and learning, to enhance student engagement and interactivity, and to develop innovative teaching materials. We also provide local support and administration for Canvas and other learning tools, produce digital media, and manage an equipment loans office for use in Arts subject teaching. We are a small but dynamic team, formed in 2012, that is constantly developing new approaches to augment blended learning, support active learning and enable innovative digital media production.

 

Using a combination of medium-quality production and DIY approaches, we've explored different ways to enhance learning opportunities by creating and applying immersive technologies and VR in forms relevant to the humanities, social sciences and language disciplines. In particular, there are opportunities for interaction with learning materials through authoring learning "experiences" that complement our Faculty's focus on Object Based Learning (OBL).

 

We’ve taken a flexible approach, so that the learning experiences and learning materials have application in a range of teaching contexts:

  • Face-to-Face: in-class active learning, in-class 'incursions', virtual field trips
  • Blended Learning: materials used in class or outside class time, through the LMS
  • Fully Online subjects: virtual tours; exploring places, spaces and objects.

The use of DIY approaches has been intentional. Simple drag-and-drop, no-code tools such as Seekbeak for building VR/360 experiences represent something teaching staff and students can use themselves. Applications include students building interactive field trips or tours, digital storytelling presentations, and alternatives to traditional essay assessment, by importing 180° or 360° photos and adding a series of hotspots comprising narrated voiceovers, videos and text resources (see the Seekbeak example for the online subject In the Heart of the Loire Valley).

 

In 2016 we filmed original interpretations of scenes from Shakespeare's plays for use in the Shakespeare in Performance subject. The same scene from The Taming of the Shrew was filmed in three different presentation styles, including immersive 360 video, using the same text and actors. Using a classroom set of Google Cardboards, students had the agency to face and observe any of the actors in the scene, rather than being presented with a director's interpretation. We began supporting the Screen Studies program with VR technology in 2017, by assisting in 'critical viewing sessions' during class time. Students played short games, watched 360 clips and discussed issues such as 'immersive storytelling' and 'the intersection of gaming culture', taking turns on a solitary Oculus Rift PC VR rig.

These early examples seemed like a natural starting point for using XR EdTech within the Arts and Humanities.

 

While that work continued, in 2017-18, things began to grow in a noticeably different direction. Arts eTeaching began producing teaching assets for the Ancient Worlds Studies program, using various novel mashup techniques and collaborations with other tech/TEL professional staff within Learning Environments and the Library.

 

 

These eye-catching examples were a powerful way to garner interest and support, and in the ensuing period up to 2020 we chalked up about 15 'VR 360 3D' projects. These ranged from in-class demonstrations and learning incursions to student-led exhibitions, Arts Engagement promotions, professional development sessions, custom 6DoF/3DoF/360 learning experiences, and collaborative research projects.

 

While we work with specialists across the whole University, our team comprises:

 

Meredith Hinze, Manager, eLearning/eTeaching (learning design, learning technologies, professional development, digital production, project coordination).

Sam Taylor, Videographer & Editor (high-end 2D & 360 video, 3D modelling, production management).

Mitch Buzza, Educational Technologist, Digital Producer (VR 360 3D, web tools, LMS administration).

Dan Hayward, Documentary Maker & Editor (equipment and facilities, technical consulting).

 

The Faculty of Arts has approximately 50 subject disciplines and 300 teaching staff. This diversity means there are always a few ideas for new projects bubbling up, in various stages of development. Here are three examples showing the diversity of our output.

 

Who is Nature?  (Version 2.0).

Learning intention: Exploring Indigenous cultures, prompting critical discussion. Used in Latin American Studies subjects (undergraduate and a Masters subject). Students wrote reflective essays on the video for final assessment in Semester 1, 2020.

A VR 360 & 2D video multimedia tour of four cultural belief systems:

Afro-Cuban; Mexican; Afrikete Festival, Gold Coast; Beemarra serpent, Pilbara WA

 

Against Erasure – Manus Island  (Version 1.0, 75% complete, close to publication)

Collaborative research project – Criminology, History, Geography, eTeaching

Using satellite imagery, documentary audio/video and primary source interviews to reverse engineer the now dismantled and razed Manus Island Detention Centre as a detailed 3D model with hotspots, containing links and audio commentary.

 

 

VR projects

The Library of Egyptian Stelae  (Version 2.0, 75% complete, close to publication)

Learning intention: Read hieroglyphics, prompting critical discussion

A 6DoF Unity3D-based VR experience – 50 stelae in a museum setting

Complements the 'Tomb of Nefertari', an exquisite, high-quality 3D-scanned simulation of the actual tomb in Egypt, which is now sealed off from visitors.

 

 

Arts eTeaching Unit: arts-eTeaching@unimelb.edu.au


Introducing Alexis Pang

Alexis Pang is a Teaching Specialist at the School of Agriculture and Food, Faculty of Veterinary and Agricultural Sciences. He is interested in the innovative use of technology, particularly mobile learning, to enhance the learning and teaching of earth sciences such as geomorphology, soil science and hydrology. At the Faculty, Alexis led the development of Fieldfriend, a mobile learning app for smartphones to support field-based learning. Alexis was previously an Educational Technology Officer (R&D) at the Ministry of Education, Singapore, where he led projects such as Mixed Reality for Primary Science learning and large-scale semi-formal learning with mobile technology (the Learn@ programme at the Singapore Science Centre). He was the main project officer for the FutureSchools@Singapore programme for large-scale innovative ICT curricular and pedagogical integration in multiple schools. Alexis was also a research collaborator on the National Research Foundation (NRF)-funded educational research project "Voyage to the Age of Dinosaurs" 3DVLE, led by the Learning Sciences Lab (NIE, Singapore).

Relevant publications:

Pang, A. & Weatherley, A. J. (2016). Fieldfriend: A Smartphone App for Mobile Learning in the Field. In Chen, W. et al. (Eds.). Proceedings of the 24th International Conference on Computers in Education (ICCE). India: Asia-Pacific Society for Computers in Education. Link: http://www.et.iitb.ac.in/icce2016/files/proceedings/ICCE%202016%20Main%20Conference%20Proceedings.pdf

Lee, J.W.Y., Pang, A.L.H., Ruffolo, L. & Kim, B. (2010). Designing around preconceptions in earth science. In Z. Abas et al. (Eds.), Proceedings of Global Learn Asia Pacific 2010 (pp. 1217-1222). Association for the Advancement of Computers in Education (AACE). Available from: https://sites.google.com/site/voyagetotheageofdinosaurs/important-documents

Kim, B., Pang, A., Kim, M. & Lee, J. (2009). Designing with Learners for Game-Based Collaborative Learning: An Account of T-Rex Group. CSCL2009 Community Events Proceedings (8th International Conference on Computer Supported Collaborative Learning, June 2009, Rhodes, Greece). Available from: https://sites.google.com/site/voyagetotheageofdinosaurs/important-documents

Kim, B, Wang, X., Tan, L., Kim, M-S, Lee, J. & Pang, A. (2009). Designing with Stakeholders for Learning Innovations: Voyage to the Age of Dinosaurs. Symposium paper presented at the Annual Meeting 2009 – American Educational Research Association. Available from: https://sites.google.com/site/voyagetotheageofdinosaurs/important-documents

Pang, A. L. H. & Phua, Y. C. (2008). Large-scale semi-formal learning activities for Singapore schools – Learn@ programme. In Chan, T. W. et al. (Eds.), Proceedings of ICCE 2008: The 16th International Conference on Computers in Education (pp. 919-924). Jhongli, Taiwan: APSCE. Available from: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.373.1153&rep=rep1&type=pdf

Pang, A. L. H., Phua, Y. C., Wu, W. T., Suriyani, R., Mohd-Noor, M. Y. & Pan, A. (2007). Exploratory Study on the Use of Mixed Reality for Primary Science Learning. In T. Hirashima et al. (Eds.), Supporting Learning Flow Through Integrative Technologies (Proceedings of ICCE 2007) (pp. 449-452). Amsterdam: IOS Press. Available from: http://ebooks.iospress.nl/volumearticle/3838

Pang, A. (2006). Geographical Information Systems (GIS) in Education. Educational Technology Division, Ministry of Education, Singapore. Available at http://www.scribd.com/doc/95965654/Geographical-Information-Systems-in-Education

Pang, A. L. H. (2005). The Educational Effectiveness of Dynamic and Interactive Data Visualization and Exploration in Geographical Education. Paper presented at International Conference on Education – Redesigning Pedagogy: Research, Policy and Practice. May 2005, Singapore. Available at http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.522.7434&rep=rep1&type=pdf


Introducing @aiello_stephen #MESH360 researcher

Paramedicine offers abundant research opportunities as a relatively new research environment. Since starting his research career at Auckland University of Technology (New Zealand) in 2014, Stephen has been working toward the development of design-based research projects that aim to provide more authentic critical care educational experiences and learner-centered pedagogies within the emergent profession of Paramedicine education.

Stephen's aim is to explore the critical care aspects of emergency medicine in relation to paramedic pre-hospital management within an immersive simulation (XR) environment. Stephen is part of the investigation team reviewing the use of virtual reality for paramedic scene orientation and situational awareness. His research investigates virtual environments and paramedic experiential data to guide decision-making, using qualitative research methods and quantitative biometric feedback. This work has expanded to include all health school departments and will act as a catalyst for future work.

To date, the investigation into 360-degree immersive environments and biometric feedback has led to both national and international collaborations. This extensive work is ground-breaking and has been awarded several accolades for innovation and teaching excellence.

 

References

Cochrane, T., Aguayo, C., Aiello, S., & Wilkinson, N. (2019). Enhancing Simulation Training through Immersive Reality: MESH360. Full paper accepted for the upcoming BJET Special Section on Immersive Virtual Reality in Education. United Kingdom.

Cochrane, T., Aguayo, C., Aiello, S., & Wilkinson, N. (2019). Developing a Mobile Immersive Reality Framework for Enhanced Simulation Training: MESH360. Paper to be presented at ASCILITE 2018, Singapore.

Aiello, S., & Cook, S. (2019). I See Real Things. Keynote presentation at SoTEL: Scholarship of Technology Enhanced Learning 2019, Auckland University of Technology, Manukau, New Zealand.

Cochrane, T., Stretton, T., Aiello, S., Britnell, S., Cook, S., & Narayan, V. (2018). Authentic interprofessional health education scenarios using mobile VR. Research in Learning Technology, 26, 2130. doi: http://dx.doi.org/10.25304/rlt.v26.2130

Cochrane, T., Cook, S., Aiello, S., Aguayo, C., Dañobeitia, C., & Boncompte, G. (2018). Designing immersive mobile learning mixed reality for paramedic education. Paper presented at the IEEE-TALE 2018: Education and Technology Conference, University of Wollongong, Australia.

Aguayo, C., Dañobeitia, C., Cochrane, T., Aiello, S., Cook, S., & Cuevas, A. (2018). Embodied reports in paramedicine mixed reality learning. Research in Learning Technology, 26, 2150. doi: http://dx.doi.org/10.25304/rlt.v26.2150


Why would a veterinary and agricultural science lecturer use XR?

It might seem unusual for a lecturer in veterinary and agricultural science to be talking about extended reality (XR): virtual reality (VR) and augmented reality (AR). To give some background on how and why I became involved in XR, I need to compare my experience as a veterinary and agricultural science student with that of the current generation of students.

For me, it doesn't seem all that long ago that I completed my veterinary degree and my agricultural science studies, but it is a few decades. When I completed my tertiary studies, most of my classmates either knew someone who managed a farm or had relatives who knew someone on a property. This situation has now changed, with many students having virtually no contact with rural enterprises and most having no prior experience on properties before starting their degree. This is a good example of the increasing rural-urban divide, with increasing numbers of people living in major metropolitan regions and fewer in rural areas (1). It also reflects a greater diversity of student backgrounds, including international locations where a similar rural-urban divide exists. A lack of primary production information has also been identified in primary and secondary education, with groups such as the Primary Industries Education Foundation of Australia formed to improve food and fibre education (2). This increasing divide means that previous assumptions about students' general understanding of food and fibre production need to be rethought, and there is increasing scope for the use of XR to enable on-site learning and to improve learning when students visit rural enterprises.

We commenced developing virtual reality farms with 360 imaging, initially displayed via computer screens or mobile devices. Towards the end of producing the 4DVirtualFarm site, Oculus Rift development kits became available, so farms could also be viewed in VR using this method (3). Our early VR work used multiple images from a DSLR camera stitched together, but we have now moved to 360 cameras with multiple lenses – initially the Panono and now the Xphase Pro. A range of other projects involving 3D visualisation are in progress, along with augmented reality tools for gamification. These open up a range of different ways for students to visit properties virtually before, during or after visits to real properties, as well as new ways to visualise animals in 3D. A good example of how this can be used was a recent intensive, week-long set of online sessions 'visiting' two VR properties, allowing students to work in small groups to better understand how the properties functioned when they were unable to visit the actual properties due to COVID restrictions.
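As a purely illustrative aside (this is not our production pipeline, and the folder and file names below are hypothetical), the basic step of stitching overlapping DSLR photos into a single panorama can be sketched in a few lines of Python using OpenCV's high-level Stitcher; a full 360 equirectangular workflow adds projection and viewer steps on top of this.

```python
# Minimal sketch: stitch overlapping DSLR photos into one panorama with OpenCV.
# Assumes opencv-python is installed; folder and file names are hypothetical.
import glob
import cv2

# Load the overlapping source photos (e.g. a sweep around the farmyard).
images = [cv2.imread(path) for path in sorted(glob.glob("farm_photos/*.jpg"))]

# The high-level Stitcher handles feature matching, warping and blending.
stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("farm_panorama.jpg", panorama)
else:
    print(f"Stitching failed with status code {status}")
```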
I don't see XR as a solution to all teaching issues, but it is an extra tool in the teaching toolbox that can help increase understanding in a range of areas where it is difficult to get students to particular places due to cost, timing, biosecurity, EHS and other issues. As tools for producing XR become simpler and cheaper, they have significant potential in teaching and learning.


1 https://www.accc.gov.au/system/files/Fn%20118%20-%20Hugo,%20Changing%20patterns%20of%20population%20and%20distribution.pdf (figure 3, page 7)
2 https://www.piefa.edu.au/
3 https://fvas.unimelb.edu.au/research/groups/decommissioned/cattle-and-sheep-research-and-education/research-groups/4d-farms-multimedia-education


#Heutagogy in curriculum design: A framework for rethinking the pedagogy of studio-based design classrooms

Hot off the press: Sinfield, D., & Cochrane, T. (2020). A framework for rethinking the pedagogy of studio-based design classrooms. Pacific Journal of Technology Enhanced Learning, 2(2), 31-44. https://doi.org/10.24135/pjtel.v2i2.77

Keywords: Atelier, social learning environments, rhizomatic learning, ontological pedagogies, heutagogy, design-based research, collaborative curriculum design


#TheNewNormal Webinar Episode 5: Student Engagement – surface vs deep, Part 2

Episode 5 of the #TheNewNormal webinar series is now online: Student Engagement Online Part 2: Surface vs Deep. figshare. Media. https://doi.org/10.6084/m9.figshare.13370333 @cdeneen212 @CatManning @SiewFangLaw1 @briansology @MelbCSHE

In this episode, we discuss how engagement connects with teaching, learning and assessment. Some exemplar cases are offered that involve partnering with students for increased engagement.

See the entire five-part series at: https://melbourne-cshe.unimelb.edu.au/programs/teaching-and-learning/the-new-normal-engaged-teaching-and-learning-webinar-series


#ASCILITEMLSIG Webinar 11 December 2020

Our guest this week is Alexis Pang from the University of Melbourne discussing mobile fieldwork in higher education.

Pang, A., & Weatherley, A. (2016). A Smartphone App for Mobile Learning in the Field. 24TH INTERNATIONAL CONFERENCE ON COMPUTERS IN EDUCATION (ICCE 2016), Indian Inst Technol Bombay, Mumbai, INDIA. https://minerva-access.unimelb.edu.au/bitstream/handle/11343/130084/ICCE2016-main-proc-final-19Nov%20Pang_Weath.pdf

Thar, S. P., Ramilan, T., Farquharson, R. J., Pang, A. & Chen, D. (2020). An empirical analysis of the use of agricultural mobile applications among smallholder farmers in Myanmar. The Electronic Journal of Information Systems in Developing Countries, pp. 14-. doi:10.1002/isd2.12159


Online learning design resources for #COVID19

A brief selection of approaches to #COVID19 online learning design for dual delivery, remote teaching and hybrid teaching: resources, guides and design frameworks.


What’s done is data: Creating datafied feedback loops to inform Creative Industry pedagogy.

We interact with digital devices every day. These devices record a growing variety of actions, at a higher velocity than ever, and in ever-increasing volumes. There is a growing lake of data about the "how, what, when and where" of our lives, and this brings growing potential to explore the "why".

Image of a skeleton looking into an artistic representation of data.
Every action we make with and through a digital device leaves behind a data trail, even if we can't see it. Image from https://www.pxfuel.com/en/free-photo-qkjgk, Pxfuel terms of use.

Feedback loops are one way that our actions-as-data are reflected back to us. Social media companies, for example, are expert at using data to generate feedback loops. These loops are used for a range of purposes, such as platform development and tailoring user content. Another example is smartphones, which now produce feedback loops about your device usage through functions like screen-time reports and apps that limit your use of certain platforms. In most cases, though, the user is the consumer and not the producer of the feedback loop.
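As a small, hedged illustration of producing your own feedback loop (the log file and its columns are assumptions made for this example, not an export format from any particular platform), a few lines of Python can turn a raw usage log into a weekly summary that is reflected back to the user:

```python
# Minimal sketch: turn a (hypothetical) device-usage log into a weekly
# feedback summary. The CSV columns "timestamp" and "minutes" are assumed
# for illustration, not taken from a real platform export.
import pandas as pd

# One row per session: when it started and how many minutes it lasted.
log = pd.read_csv("usage_log.csv", parse_dates=["timestamp"])

# Aggregate minutes of use per calendar week.
weekly = log.set_index("timestamp")["minutes"].resample("W").sum()

# Reflect the data back to the user: this week's total versus the running average.
this_week, average = weekly.iloc[-1], weekly.mean()
print(f"This week: {this_week:.0f} minutes (weekly average: {average:.0f} minutes)")
```

The point is not the code itself but the loop: the same actions that generate the data become visible again, in a form a person can act on.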

In her 2016 book Weapons of Math Destruction, Cathy O'Neil wrote:

Big Data processes codify the past. They do not invent the future. Doing that requires moral imagination. And that’s something only humans can provide.

If data codifies the past, and humans invent the future, then feedback loops are the translators – making our datafied actions human-readable again. This agency through feedback loops is a concept being explored by the #DataCreativities collaboration. #DataCreativities is a research team formed in June 2020 to explore the fast-paced shift to making, living and learning in the creative industries during times of isolation. We have a particular interest in looking at the data created in the creative industries, and in looping this back into creative industry education. Our goal: to view the loop, and where needed break the loop, to invent future creative industry pedagogies.

Image of an infinity sign made up of small pictures of online platforms like email.
Feedback loops can be generated to reflect our digital actions back to us. Image from https://pixabay.com/vectors/infinity-icons-internet-infinite-5556109/, Pixabay licence.

In December 2020, #DataCreativities will be hosting a workshop focused on exploring the data we unwittingly create through digital devices, how we can create our own feedback loops, and the implications of this for the Scholarship of Technology Enhanced Learning. The team comes from a variety of disciplines, with a range of online teaching experience and a spectrum of data science skills. We use this diversity to create a workshop that provides a practical, user-friendly set of tools you can apply to your own research-led teaching practices.

To find out more about the workshop and how to register, visit: https://omeka.cloud.unimelb.edu.au/datacreative/workshop2020. Note: while the live workshop is open to University of Melbourne staff only, stay tuned for workshop outputs which will be shared after the event.


#DIMENSIONSXR2020: Summary of the Congress presentations, 28-30 October

https://dimensionsxr.com/congress-talks/education/

See the Education stream (panel discussions, demos, and networking centred on how immersive technologies are being applied to education) at https://ssvar.ch/dimensionsxr2020-education/

 

