5 Tech Trends that Could Supercharge Education in 2016

[This EdTech article highlights five trends that may make a major impact on both K-12 and higher education in 2016. I’m teaching an Osher course for senior adults at Carnegie Mellon University that will look at six trends that are making a difference – coding, personalized learning, flipped learning, game-based learning, virtual reality, and robotics. It’s interesting to see how the two lists overlap.]

The technologies of tomorrow are already making headway into education, and others are poised for mass distribution in 2016.

Science-fiction author William Gibson once said, “The future is already here — it’s just not very evenly distributed.”

The technologies of tomorrow are already being tested in select classrooms today, sowing the seeds for how students could learn in the future. With 2016 fast approaching, technology analysts have been busy prognosticating the top technology trends. A few of these technologies have already made headway into education, and others are poised for mass distribution, with the promise of ground-shaking change in their wake.

We’ve reviewed a few of these trends through the lens of how they could affect classrooms in both K–12 and higher education.

Read more…

Can Virtual Reality Replace the Cadaver Lab?

[Virtual reality is taking hold thanks to new tools like zSpace. And it’s not only in higher education, the focus of this Center for Digital Education article. Montour Area High School has created a virtual reality lab using zSpace, opening up new opportunities for students without health hazards or the high cost of materials.]

Colleges are starting to use virtual reality platforms to augment or replace cadaver labs, saving universities hundreds of thousands of dollars.
BY JUSTINE BROWN / OCTOBER 15, 2015

Medical students at a growing number of colleges are using virtual reality platforms to augment or replace cadaver labs, providing students with more opportunities to practice skills while saving universities hundreds of thousands of dollars.

According to a survey by the American Association of Anatomists, the nation’s 150 medical schools average about 149 hours of training in first-year gross anatomy, about two-thirds of which is spent on cadaver dissection.

“Cadavers provide a realistic experience for students, but they cause concerns with biohazards, availability, and expense,” said Daniel Buchbinder, professor and chief of the division of maxillofacial surgery in the department of otolaryngology at Mount Sinai Beth Israel in New York.

In response, some medical schools are beginning to leverage virtual reality platforms, which professors say can also provide very realistic experiences without the costs or other downsides of using real cadavers.

Robert Hasel, associate dean of Simulation, Immersion and Digital Learning at the Western University of Health Sciences (WesternU), a private medical school located in Pomona, Calif., has been a proponent of virtual reality for medical training for many years.

“I’ve long been on a mission to make learning as exciting as playing video games, and I’ve been working toward that for years,” said Hasel. “But the technology that was needed to pull this all together has really just fully emerged in the last couple of years.”

Virtual Reality Space Lets Students Experience Big Data

[In the past few years there have been two movements that have not intertwined – Big Data and Redesigning Learning Spaces. In this Campus Technology article you’ll discover a new space that immerses people in Big Data. It’s a fascinating concept that may change the way we look at both learning spaces and data.]

A new facility at Virginia Tech uses large-scale visuals and sound to immerse users in vast amounts of data.

Virginia Tech’s Cube (photo courtesy of Virginia Tech)

Imagine walking through a black room four stories high, 50 feet wide and 40 feet deep, populated with speakers. As you move through the space wearing a head-mounted display (no mouse, keyboard or joystick needed), you’re immersed in vast amounts of data — both visually and aurally — collected from an actual storm that took place a little more than two years ago. As the recorded data shows the formation of a supercell, your ears detect something distinct from every other sound that permeates the space — akin to hearing your name spoken across the room during a lively cocktail party. You turn and move toward the sound to explore it further. Before your eyes a gigantic tornado forms. Your experience — and the exploration sparked by it — could result in a better understanding of how to interpret the data generated by tornadoes such as the one that hit Moore, OK, in May 2013, killing two dozen people and injuring hundreds.

That’s the thinking behind Virginia Tech’s Cube, an adaptable space for research and experimentation housed in the campus’s Moss Arts Center. A joint project of the university’s Center for the Arts and the Institute for Creativity, Arts & Technology (ICAT), the Cube officially opened for business in January but has already hosted numerous performances as well as events in big data exploration and immersive environments.

According to Ben Knapp, director of ICAT, the Cube shows the potential for immersive environments to help remake learning, design and collaboration. “That’s the amazing thing about being immersed in visual and aural and sonic environments — that ability to do what we’ve done for the history of human beings and primates: use our hearing to detect change, to pull things out from patterns and then use our vision to turn around and look and explore that. That’s what the Cube is really able to do.”

Read more…
