2020 EdTech Watch List
Digital Learning & Innovation's EdTech team has compiled a list of technologies to watch in 2020. Some, like 5G, are just around the corner; others are farther out on the horizon. We are actively working with vendors to bring some of these to a classroom near you. As with all technologies, some will be hits and some will be duds, but whatever the cards hold for their future, the EdTech team will be here to help you navigate them all and incorporate them into your courses to meet student learning objectives. — Ernie Perez, Director, Educational Technology
5G
Harry Lawrence, Manager, Educational Platform Administration
Verizon & AT&T both recently announced they will be rolling out 5G services to Boston. The 5th generation of cellular network technology promises a big jump in speed, lower latency, and more reliable connections for compatible devices. This new infrastructure will open doors for enhancements in video streaming, video chat, augmented reality and help deliver a new wave of smart, ever-connected “Internet of Things” devices.
Google Assignments
Amod Lele, Lead Educational Technologist
The use of G Suite tools like Google Docs continues to climb, especially among the K-12 students who will soon become our freshmen. Google's Assignments tool makes it easier for faculty to grade papers in Docs and to link them with Blackboard. It's still in beta, but we hope to see it come to BU before long.
AI – Machine Learning
Maria Afzal, Educational Technologist
The AI market in the US education sector is expected to grow by 48% through 2022 as we move toward a more connected world. Advances in AI, particularly in machine learning (ML), have gained considerable support in education because they hold great potential to personalize learning through data-driven predictions and decisions. One of the first applications of machine learning in education has been to help move quizzes and tests from multiple choice to fill-in-the-blank questions, where natural language processing (NLP) and machine learning are used to evaluate students' free-form answers. Other notable use cases include learning analytics, which builds statistical models of student knowledge to provide computerized, personalized feedback, and content analytics, which organizes and optimizes content items such as assessments, textbook sections, and lecture videos.
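To give a feel for the free-form-answer scoring idea mentioned above, here is a deliberately simplified sketch. Production systems use trained NLP models; this stand-in just measures word overlap (Jaccard similarity) between a student's answer and a reference answer, and all answer text is hypothetical.

```python
# Simplified illustration of automated short-answer scoring.
# Real graders use trained NLP/ML models; this stand-in scores a free-form
# answer by token overlap with a reference answer (Jaccard similarity).

import re

def tokenize(text):
    """Lowercase the text and split it into a set of word tokens."""
    return set(re.findall(r"[a-z']+", text.lower()))

def score_answer(student_answer, reference_answer):
    """Return a similarity score from 0.0 to 1.0."""
    student = tokenize(student_answer)
    reference = tokenize(reference_answer)
    if not student or not reference:
        return 0.0
    return len(student & reference) / len(student | reference)

reference = "Photosynthesis converts light energy into chemical energy"
print(score_answer("Photosynthesis turns light energy into chemical energy",
                   reference))
```

A real system would go much further, weighting key concepts and tolerating paraphrase, but the shape is the same: map free text to features, then to a score.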
Learning remains highly relational for most of us, but it will likely be increasingly informed and guided by big data in the coming years.
Alexa in Education
Jennifer MacLeod, Platform Administrator
Alexa is in our homes, cars, and mobile devices, and is integrated into many aspects of our lives. Alexa Skills—apps that let customers perform everyday tasks or engage with content naturally by voice—are now being developed for the education sector.
A few educational technology companies are currently developing skills to enhance the student experience by providing course communications, assignment due dates, and more. Blackboard has announced its own Alexa Skill, though it is not yet available on our Blackboard installation; we hope to see it arrive in 2020. I can see this technology expanding into other educational technology spaces and continuing to evolve as skills grow more complex and the underlying technology advances.
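For the curious, a skill like the due-date example works roughly like this: Alexa sends the backend a JSON request naming the intent the student invoked, and the backend returns JSON containing the speech to play back. The sketch below uses only plain dictionaries; the intent name, slot, and course data are hypothetical, and a production skill would use the Alexa Skills Kit SDK rather than hand-rolled routing.

```python
# Rough sketch of how an Alexa skill backend routes a request.
# The intent name ("GetDueDateIntent"), the "course" slot, and the
# assignment data are all hypothetical, for illustration only.

ASSIGNMENTS = {
    "biology": "Your lab report is due Friday at 5 PM.",
    "history": "Your essay is due Monday at noon.",
}

def handle_request(event):
    """Map an Alexa-style request dict to a spoken-response dict."""
    intent = event["request"]["intent"]["name"]
    if intent == "GetDueDateIntent":
        course = event["request"]["intent"]["slots"]["course"]["value"]
        speech = ASSIGNMENTS.get(course.lower(),
                                 f"I couldn't find assignments for {course}.")
    else:
        speech = "Sorry, I didn't understand that."
    # Alexa expects speech wrapped in a response envelope like this one.
    return {
        "version": "1.0",
        "response": {"outputSpeech": {"type": "PlainText", "text": speech}},
    }

request = {"request": {"intent": {"name": "GetDueDateIntent",
                                  "slots": {"course": {"value": "Biology"}}}}}
print(handle_request(request)["response"]["outputSpeech"]["text"])
```

The interesting design question for course skills is where that assignment data comes from—ideally the LMS itself, which is exactly what an integration like Blackboard's would provide.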
AI in XR Development 2020 and Beyond
Wendell Seale, Senior Platform Administrator, Video Streaming
The year 2020 will see a proliferation of new and exciting uses of Artificial Intelligence in XR as mobile devices benefit from increases in power and speed along with 5G. AI is reshaping our lives by adding machine learning to nearly everything we interact with, and XR (Extended Reality) is no different from other technology areas where AI already dominates.
Unlike standard virtual reality, which includes three-dimensional environments, user navigation, and basic object interactions, XR simulation can include virtual machinery, cameras and sensors, control software, human avatars, and much more. This allows AI-driven simulations to be represented far more realistically. Forbes predicts that by 2025 at least 90% of new enterprise apps will embed AI, and that by 2024 over 50% of user interface interactions will use AI-enabled computer vision, speech, natural language processing, and AR/VR.
Human-Computer Interaction
Monty Kaplan, Platform Administrator
In 2020, I’m expecting some challenges as human-computer interaction rapidly evolves. Touchscreens have been a staple of the mobile computing era; steady advances in voice recognition have produced new methods for gathering and sharing information; improved AR and VR technology has created new contextual learning opportunities; and robots are becoming more mobile and more commonplace in dining and retail environments. How will we respond to these new experiences (e.g., reactions to Marty the Robot), and what methods will developers use to overcome barriers to successful interaction (e.g., Amazon’s emotion-detection research)?