Digital upskilling in legal: More than just new technology — from legalexecutiveinstitute.com by Bob Dolinsky; with thanks to Gabe Teninbaum for this resource

Excerpt:

How many law firms have digital upskilling programs for their lawyers and staff members? Based on what I hear and read, very few, if any.

Amazon, for example, recently announced a commitment of more than $700 million to its “Upskilling 2025” program, an internal training initiative designed to promote customer satisfaction and worker advancement. Another example is PwC, which has a digital upskilling program to develop its in-house talent pool called “New world. New skills.” In 2019, PwC announced that it would invest $3 billion into job training for its 275,000 employees around the world, enhancing its workforce and client service delivery to better address emerging digital needs.

The goals of these and similar initiatives are to help ensure that employees have the skills in the digital arena to be successful, to position these organizations as preferred employers, and to provide customer and client service excellence.

Also from Gabe:

“Virtual justice” (the preferred, if unsettling, term) is an emergency response to a dire situation. But it is also a vision some judicial innovators had long tried to realize. One leading booster, Michigan Chief Justice Bridget Mary McCormack, told me that going online can make courts not only safer but “more transparent, more accessible, and more convenient.” Witnesses, jurors, and litigants no longer need to miss hours of work and fight traffic. Attorneys with cases in multiple courts can jump from one to another by swiping on their phones.

 

From DSC: Yet another reason for Universal Design for Learning’s multiple means of representation/media:

Encourage faculty to presume students are under-connected. Asynchronous, low-bandwidth approaches help give students more flexibility in accessing course content in the face of connectivity challenges.

— as excerpted from campustechnology.com’s article entitled, “4 Ways Institutions Can Meet Students’ Connectivity and Technology Needs”

 

 

How to Mitigate Accessibility & Digital Inclusion Obstacles for the d/Deaf Community — from inclusionhub.com by Christina Claus
To mitigate accessibility and digital inclusion obstacles for the d/Deaf and hard of hearing, developers must conduct critical research to understand these ongoing hurdles. This guide outlines the many challenges facing this community, shares useful insights, and provides meaningful inclusion solutions.

Excerpt:

Several commonly accepted characterizations include:

  • Deaf: When using the capital D, the individual conveys they communicate with sign language and have either been deaf since birth or shortly after.
  • deaf: The lowercase d is often utilized by those who do not identify as part of Deaf culture and typically become deaf later in life.
  • Hard of Hearing (HoH): Individuals who don’t experience total hearing loss or deafness often identify as hard of hearing.
  • Late-Deafened: This indicates the individual became deaf later in life.
  • Deaf-Blind: In addition to being deaf or hard of hearing, this individual also has a degree of vision loss.

These diversities can impact the individual’s ability to experience digital and online services. To create an inclusive experience for the entire community, developers must understand the obstacles each faces.

 

eLearning has improved in the last 7 years… — from elearningindustry.com by Mary Burns
A student in 2014 and a student in 2021 would probably have the same fundamental online learning experience. Yet online learning has changed dramatically in the last 7 years. In my 50th article, I look back at online learning and enumerate 5 drivers that have led to its improvement.

Excerpt:

COVID/remote learning has expanded our understanding of online learning. Teachers are far more adept at using technology and designing with technology to help students learn online. They know which online learning modalities—synchronous or asynchronous—work best with which eLearning platform. Students too have had a year-long internship in online learning—its tools and pedagogies.

COVID has diversified our understanding of online learning. In 2014, when I began writing for eLearning Industry, eLearning meant one thing—a class held in a virtual learning environment or Learning Management System. In 2021, online learning is any learning we do online, whether via free or fee-based subscription services (like Khan Academy and DreamBox, respectively), web conferencing, or Web 2.0 platforms (Edmodo, YouTube channels)—another overlooked candidate for this list.

And COVID has constrained our understanding of online learning. In 2014, when I began writing for eLearning Industry, eLearning was largely asynchronous and done in an LMS. In 2021, for a plurality of teachers and students across the globe, eLearning means one thing: Zoom and it means learning that is live and synchronous.

 

 

Improved Student Engagement in Higher Education’s Next Normal — from er.educause.edu by Ed Glantz, Chris Gamrat, Lisa Lenze and Jeffrey Bardzell
Five pandemic-introduced innovative teaching adaptations can improve student engagement in the next normal for higher education.

Excerpt (emphasis DSC):

The five teaching enhancements/adaptations discussed above—collaborative technologies for sense-making, student experts in learning and technology, back channels, digital breakout rooms, and supplemental recording—are well positioned to expand the definition of “student engagement” beyond traditional roll call and attendance tracking. Opportunities to include students at a distance have permitted inclusion of students who are reticent to speak publicly, students whose first language is not English, students with disabilities, and students less engaged through “traditional” channels. With these new conceptions of engagement in mind, we are prepared to be more inclusive of all students in the next normal of higher education.

 

Chrome now instantly captions audio and video on the web — from theverge.com by Ian Carlos Campbell
The accessibility feature was previously exclusive to some Pixel and Samsung Galaxy phones

Excerpt:

Google is expanding its real-time caption feature, Live Captions, from Pixel phones to anyone using a Chrome browser, as first spotted by XDA Developers. Live Captions uses machine learning to spontaneously create captions for videos or audio where none existed before, making the web that much more accessible for anyone who’s deaf or hard of hearing.

Chrome’s Live Captions worked on YouTube videos, Twitch streams, podcast players, and even music streaming services like SoundCloud in early tests run by a few of us here at The Verge. Google also says Live Captions will work with audio and video files stored on your hard drive if they’re opened in Chrome. However, Live Captions in Chrome only work in English, which is also the case on mobile.
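From DSC:
For those curious about what live captioning looks like in code, below is a minimal sketch that uses the browser’s Web Speech API (supported in Chrome). To be clear, this is not how Google’s Live Captions is built; Live Captions runs an on-device model against whatever audio a tab is playing, while this sketch only transcribes microphone speech. The #captions element is an assumed placeholder.

```typescript
// Minimal live-captioning sketch using the Web Speech API (Chrome only).
// This illustrates the general idea of turning speech into on-screen text;
// it is not Chrome's Live Captions implementation.

const SpeechRecognitionImpl =
  (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;

const recognition = new SpeechRecognitionImpl();
recognition.lang = 'en-US';        // like Live Captions, English only here
recognition.continuous = true;     // keep listening rather than stopping after one phrase
recognition.interimResults = true; // show partial captions while the speaker is mid-sentence

const captionEl = document.getElementById('captions') as HTMLElement; // assumed element

recognition.onresult = (event: any) => {
  // Concatenate everything recognized so far into one running caption line.
  let text = '';
  for (let i = 0; i < event.results.length; i++) {
    text += event.results[i][0].transcript;
  }
  captionEl.textContent = text;
};

recognition.start();
```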

 

Chrome now instantly captions audio and video on the web -- this is a screen capture showing the words being said in a digital audio-based file

 

ARHT Media Inc.
Access The Power Of HoloPresence | Hologram Technology | Holographic Displays | Hologram Events

Excerpt:

ARHT Media mounted a holographic display at the event in Vancouver and had Sunlife’s executive captured and transmitted live as a hologram to the event from our Toronto studio. He was able to see the audience and interact with them in realtime as if he was attending the event and present in the room.

ARHT Media Inc. launches the Holopod | A hologram of the head of global sales appears on stage.

From DSC:

  • Will holographic displays change what we mean by web-based collaboration?
  • Will this be a part of the future learning ecosystems inside of higher education? Inside of the corporate training world? Inside the world of events and webinars?
  • How will this type of emerging technology impact our communications? Levels of engagement?
  • Will this type of thing impact telehealth? Telelegal?
  • How will this impact storytelling? Media? Drama/acting? Games?
  • Will the price come down to where online and blended learning will use this type of thing?
  • Will streams of content be offered by holographic displays?

 

 

Moving Online Learning from Challenge to Opportunity — from campustechnology.com by Dr. Mark Lombardi
The future of higher education means breaking down classroom walls, embracing digital tools and engaging students with creativity and innovation.

Excerpt (emphasis DSC):

Similarly, Maryville criminology professor Geri Brandt, working in concert with our learning designers and on-site production studio, used 360-degree camera technology to stage a virtual, explorable, interactive crime scene. Extending this online experience further, the criminology department created a “choose your own adventure” virtual experience, where each student response or choice in an investigation leads to a different outcome.

Creativity and innovation on this scale are only possible when faculty work with a talented team of learning design professionals who can translate a vision of interactive and immersive learning into a new student experience. Designing and delivering this active learning ecosystem integrated in online and on-ground learning takes a team of committed professionals and the courage to make it happen.

What if students could choose to learn in environments that are fully immersive, deeply enriching and profoundly meaningful to their courses of study? We encourage our students to explore the world beyond the campus, but what happens when the world is your campus?
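
From DSC:
For those wondering what a “choose your own adventure” learning experience looks like under the hood, here is a minimal, hypothetical sketch of the kind of branching data structure involved. The crime-scene prompts and node names below are purely illustrative; they are not taken from the Maryville course.

```typescript
// A tiny branching-scenario graph: each student choice points to another node,
// so different choices lead to different outcomes. Content is illustrative only.

interface ScenarioNode {
  id: string;
  prompt: string;                              // what the student sees at this step
  choices: { label: string; next: string }[];  // each choice points to another node
}

const scenario: Record<string, ScenarioNode> = {
  start: {
    id: 'start',
    prompt: 'You arrive at the scene. What do you do first?',
    choices: [
      { label: 'Secure the perimeter', next: 'perimeter' },
      { label: 'Interview the witness', next: 'witness' },
    ],
  },
  perimeter: {
    id: 'perimeter',
    prompt: 'The scene is secured and the evidence remains undisturbed.',
    choices: [],
  },
  witness: {
    id: 'witness',
    prompt: 'The witness talks, but the unsecured scene has been contaminated.',
    choices: [],
  },
};

// Advance the scenario based on the student's choice at the current node.
function step(nodeId: string, choiceIndex: number): ScenarioNode {
  const node = scenario[nodeId];
  return scenario[node.choices[choiceIndex].next];
}
```

The same structure scales from a two-branch demo like this to a full investigation with many possible outcomes.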

 

Telemedicine likely to change how we receive health care post-pandemic — from mlive.com by Justin Hicks

A patient sits in the living room of her apartment in New York City during a telemedicine video conference with a doctor. (Mark Lennihan/AP)

Excerpt:

As we look to the post-pandemic future, medical experts believe telemedicine will be here to stay as another option to increase access to care, reduce costs, and free up doctors to spend more time with patients who need in-person care.

“When I think about the pandemic, one thing that didn’t change about our lifestyles is people are busy,” Lopez said. “I think we’ll still see growth in overall visits because of the fact that people want access to care and when you lower the cost, it should go up.”

From DSC:
A friend of mine said that he is now doing most of his practice via telehealth (and has been for many months). Then, recently, when I was at the lab, the knowledgeable woman who assisted me said that she thought virtual health was definitely going to stick. Many doctors and nurses, she said, will be using virtual visits instead of in-person ones.

Expectations come into play here — for education, for the legal field, and for other arenas.

 

Specialties In Instructional Design and What They Do — from teamedforlearning.com
Specialties in instructional design can help both job seekers and hiring managers find the right fit for digital learning courses and programs.

Excerpt:

An instructional designer is anyone who designs and develops digital learning experiences. That may sound straightforward, but within that vague job title nest dozens of specialties. Even more confusingly, instructional designers may also be called learning designers or learning architects. Their work often overlaps with that of instructional technologists and content creators. Specialties in instructional design help both teammates and hiring managers to navigate this evolving position.

Untangling the complexities of the instructional design role can help both job seekers and hiring managers find the right fit. Identifying a specialty can help professionals carve out their own niche in the instructional design ecosystem. Greater clarity around what instructional designers actually do can help team leaders find the right instructional designer for their project.

 

Look at the choice and control possibilities mentioned in the following excerpt from Immersive Reader in Canvas: Improve Reading Comprehension for All Students

When building courses and creating course content in Canvas, Immersive Reader lets users:

  • Change font size, text spacing, and background color
  • Split up words into syllables
  • Highlight verbs, nouns, adjectives, and sub-clauses
  • Choose between two fonts optimised to help with reading
  • Read text aloud
  • Change the speed of reading
  • Highlight sets of one, three, or five lines for greater focus
  • Select a word to see a related picture and hear the word read aloud as many times as necessary

Also see:

All about the Immersive Reader — from education.microsoft.com

The Microsoft Immersive Reader is a free tool, built into Word, OneNote, Outlook, Office Lens, Microsoft Teams, Forms, Flipgrid, Minecraft Education Edition and the Edge browser, that implements proven techniques to improve reading and writing for people regardless of their age or ability.
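
From DSC:
Developers can also embed the Immersive Reader in their own web apps through Microsoft’s JavaScript SDK. The sketch below assumes you have provisioned an Immersive Reader resource in Azure and that your backend exposes a token endpoint; the /api/immersive-reader-token route shown here is a placeholder, not a real Microsoft URL.

```typescript
// Minimal sketch of launching the Immersive Reader over a piece of course text
// using the @microsoft/immersive-reader-sdk package.

import { launchAsync } from '@microsoft/immersive-reader-sdk';

async function openInImmersiveReader(title: string, text: string): Promise<void> {
  // Fetch a short-lived auth token and subdomain from your own server (placeholder endpoint).
  const resp = await fetch('/api/immersive-reader-token');
  const { token, subdomain } = await resp.json();

  // Launch the reader over the supplied content; readers can then resize text,
  // adjust spacing and colors, split words into syllables, and have the text read aloud.
  await launchAsync(token, subdomain, {
    title,
    chunks: [{ content: text, mimeType: 'text/plain' }],
  });
}

// Example: open a page of course content.
openInImmersiveReader('Week 3 Reading', 'Photosynthesis converts light energy into chemical energy...');
```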

 

Learning from the Living [Class] Room: Adobe — via Behance — is already doing several pieces of this vision.

From DSC:
Talk about streams of content! Whew!

Streams of content

I received an email from Adobe entitled “This week on Adobe Live: Graphic Design.” (I subscribe to Adobe Creative Cloud.) Inside the email, I saw and clicked on the following:

Below are some of the screenshots I took of this incredible service! Wow!

 

Adobe -- via Behance -- offers some serious streams of content

 


 


From DSC:
So Adobe — via Behance — is already doing several pieces of the “Learning from the Living [Class] Room” vision. I knew of Behance…but I didn’t realize the magnitude of what they’ve been working on and what they’re currently delivering. Very sharp indeed!

Churches are doing this as well — one device shows the presenter/preacher (such as a larger “TV”), while a second device is used for communicating with one another in real time.


 

 

When the Animated Bunny in the TV Show Listens for Kids’ Answers — and Answers Back — from edsurge.com by Rebecca Koenig

Excerpt:

Yet when this rabbit asks the audience, say, how to make a substance in a bottle less goopy, she’s actually listening for their answers. Or rather, an artificially intelligent tool is listening. And based on what it hears from a viewer, it tailors how the rabbit replies.

“Elinor can understand the child’s response and then make a contingent response to that,” says Mark Warschauer, professor of education at the University of California at Irvine and director of its Digital Learning Lab.

AI is coming to early childhood education. Researchers like Warschauer are studying whether and how conversational agent technology—the kind that powers smart speakers such as Alexa and Siri—can enhance the learning benefits young kids receive from hearing stories read aloud and from watching videos.

From DSC:
Looking at the above excerpt…what does this mean for elearning developers, learning engineers, learning experience designers, instructional designers, trainers, and more? It seems that, for such folks, learning how to use several new tools is showing up on the horizon.
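
As a rough illustration only (and not the actual technology behind the show), here is what a bare-bones “listen, then respond contingently” loop might look like in a browser, pairing speech recognition with a simple keyword matcher and speech synthesis. The keywords and replies are made up for the example.

```typescript
// Illustrative sketch of a contingent-response loop: listen to the viewer's
// spoken answer, match it against expected keywords, and speak a tailored reply.

const Recognition =
  (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;
const recognition = new Recognition();
recognition.lang = 'en-US';

// Map expected answer keywords to contingent replies (made-up content).
const replies: { keywords: string[]; response: string }[] = [
  { keywords: ['water', 'add water'], response: "Adding water is a great idea! Let's try it." },
  { keywords: ['shake', 'stir'], response: 'Hmm, shaking it might make it even goopier!' },
];
const fallback = 'Interesting! What else could we try?';

recognition.onresult = (event: any) => {
  const heard: string = event.results[0][0].transcript.toLowerCase();
  const match = replies.find(r => r.keywords.some(k => heard.includes(k)));
  const reply = match ? match.response : fallback;

  // Speak the contingent reply back to the viewer.
  speechSynthesis.speak(new SpeechSynthesisUtterance(reply));
};

recognition.start();
```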

 

From DSC:
THIS is incredible technology! Check out the Chroma-keying technology and the handwriting extraction feature of the Sony Analytics appliance.

#AR hits the active learning classroom! THIS is incredible technology/functionality! See through your instructor as they write on the board!

From Sony’s website (emphasis DSC):

No matter where the speaker is standing, the Handwriting Extraction feature ensures that any words and diagrams written on a board or screen remain in full view to the audience — via AR (augmented reality).

Even if the speaker is standing directly in front of the board, their ideas, thinking process, and even their animated presentation, are all accessible to the audience. It’s also easy for remote viewers and those playing back the presentation at a later date to become immersed in the content too, as the presenter is overlaid and the content is never compromised.

The chroma keying tech can be useful and engaging as well.

Chroma keying hits the Active Learning Classroom as well
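
For anyone curious about what chroma keying involves at its simplest, here is a bare-bones sketch: pixels close to a key color (green, in this case) have their alpha set to zero so another layer can show through. Products like Sony’s appliance do this far more robustly and in real time; this is only meant to illustrate the core idea.

```typescript
// Bare-bones chroma key: knock out strongly green pixels on a canvas so that
// whatever is layered behind the canvas shows through.

function chromaKey(ctx: CanvasRenderingContext2D, width: number, height: number): void {
  const frame = ctx.getImageData(0, 0, width, height);
  const data = frame.data; // RGBA bytes, four per pixel

  for (let i = 0; i < data.length; i += 4) {
    const r = data[i], g = data[i + 1], b = data[i + 2];
    // Treat pixels that are much greener than they are red or blue as background.
    if (g > 100 && g > r * 1.4 && g > b * 1.4) {
      data[i + 3] = 0; // fully transparent
    }
  }
  ctx.putImageData(frame, 0, 0);
}
```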

 

Grab your audience’s attention and increase their engagement with intelligent video analytics technology.

I saw this at IUPUI’s recent webinar/tour of their new facilities. Here’s further information on that webinar from last Friday, 1/29/21:

Designing Large Active Learning Classrooms webinar/tour on 1/29/21 from the Mosaic Program at Indiana University; also features rooms/staff at IUPUI.

 

From DSC:
I was reviewing an edition of Dr. Barbara Honeycutt’s Lecture Breakers Weekly, where she wrote:

After an experiential activity, discussion, reading, or lecture, give students time to write the one idea they took away from the experience. What is their one takeaway? What’s the main idea they learned? What do they remember?

This can be written as a reflective blog post or journal entry, or students might post it on a discussion board so they can share their ideas with their colleagues. Or, they can create an audio clip (podcast), video, or drawing to explain their One Takeaway.

From DSC:
This made me think of tools like VoiceThread — where you can leave a voice/audio message, an audio/video-based message, a text-based entry/response, and/or attach other kinds of graphics and files.

That is, a multimedia-based exit ticket. It seems to me that this could work in online as well as blended learning environments.


Addendum on 2/7/21:

How to Edit Live Photos to Make Videos, GIFs & More! — from jonathanwylie.com


 
© 2025 | Daniel Christian