The Section 508 Refresh and What It Means for Higher Education — from er.educause.edu by Martin LaGrow

Excerpts (emphasis DSC):

Higher education should now be on notice: Anyone with an Internet connection can now file a complaint or civil lawsuit, not just students with disabilities. And though Section 508 was previously unclear as to the expectations for accessibility, the updated requirements add specific web standards to adhere to — specifically, the Web Content Accessibility Guidelines (WCAG) 2.0 level AA developed by the World Wide Web Consortium (W3C).

Although WCAG 2.0 has been around since the early 2000s, it was developed by web content providers as a self-regulating tool to create uniformity for web standards around the globe. It was understood to be best practices but was not enforced by any regulating agency. The Section 508 refresh due in January 2018 changes this, as WCAG 2.0 level AA has been adopted as the standard of expected accessibility. Thus, all organizations subject to Section 508, including colleges and universities, that create and publish digital content — web pages, documents, images, videos, audio — must ensure that they know and understand these standards.

Reacting to the Section 508 Refresh
In a few months, the revised Section 508 standards become enforceable law. As stated, this should not be considered a threat or burden but rather an opportunity for institutions to check their present level of commitment and adherence to accessibility. In order to prepare for the update in standards, a number of proactive steps can easily be taken:

  • Contract a third-party expert partner to review institutional accessibility policies and practices and craft a long-term plan to ensure compliance.
  • Review all public-facing websites and electronic documents to ensure compliance with WCAG 2.0 Level AA standards.
  • Develop and publish a policy to state the level of commitment and adherence to Section 508 and WCAG 2.0 Level AA.
  • Create an accessibility training plan for all individuals responsible for creating and publishing electronic content.
  • Ensure all ICT contracts, RFPs, and purchases include provisions for accessibility.
  • Inform students of their rights related to accessibility, as well as where to address concerns internally. Then support the students with timely resolutions.

As always, remember that the pursuit of accessibility demonstrates a spirit of inclusiveness that benefits everyone. Embracing the challenge to meet the needs of all students is a noble pursuit, but it’s not just an adoption of policy. It’s a creation of awareness, an awareness that fosters a healthy shift in culture. When this is the approach, the motivation to support all students drives every conversation, and the fear of legal repercussions becomes secondary. This should be the goal of every institution of learning.
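
Regarding the second bullet above (reviewing public-facing websites against WCAG 2.0 Level AA), much of that review is manual, but a few common failures can be caught automatically. Below is a minimal, illustrative sketch, not a full WCAG audit, that uses Python and the BeautifulSoup library to flag images without text alternatives and form fields without labels; the URL is a placeholder for an institution's own pages.

```python
# A minimal spot-check for two common WCAG 2.0 failures: images without text
# alternatives (1.1.1) and form controls without programmatic labels (1.3.1 / 3.3.2).
# Automated tools catch only a fraction of WCAG 2.0 AA issues, so this complements,
# not replaces, a manual review.
import urllib.request
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def spot_check(url):
    html = urllib.request.urlopen(url).read()
    soup = BeautifulSoup(html, "html.parser")
    issues = []

    # Every <img> needs an alt attribute (it may be empty only if decorative)
    for img in soup.find_all("img"):
        if img.get("alt") is None:
            issues.append(f"<img src='{img.get('src')}'> has no alt attribute")

    # Form controls should be associated with a <label> or carry an aria-label
    labeled_ids = {lbl.get("for") for lbl in soup.find_all("label")}
    for field in soup.find_all(["input", "select", "textarea"]):
        if field.get("type") in ("hidden", "submit", "button"):
            continue
        if field.get("id") not in labeled_ids and not field.get("aria-label"):
            issues.append(f"form control '{field.get('name')}' has no label")

    return issues

if __name__ == "__main__":
    # Placeholder URL -- substitute an institution's own public-facing pages
    for problem in spot_check("https://www.example.edu"):
        print(problem)
```

Commercial and open-source scanners, and above all testing with actual assistive technologies and users, are still needed to cover the remaining WCAG 2.0 AA success criteria.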

 

 


Also see:


How to Make Accessibility Part of the Landscape — from insidehighered.com by Mark Lieberman
A small institution in Vermont caters to students with disabilities by letting them choose the technology that suits their needs.

Excerpt:

Accessibility remains one of the key issues for digital learning professionals looking to catch up to the needs of the modern student. At last month’s Online Learning Consortium Accelerate conference, seemingly everyone in attendance hoped to come away with new insights into this thorny concern.

Landmark College in Vermont might offer some guidance. The private institution with approximately 450 students exclusively serves students with diagnosed learning disabilities, attention disorders or autism. Like all institutions, it’s still grappling with how best to serve students in the digital age, whether in the classroom or at a distance. Here’s a glimpse at the institution’s philosophy, courtesy of Manju Banerjee, Landmark’s vice president for educational research and innovation since 2011.

 

 

Program Easily Converts Molecules to 3D Models for 3D Printing, Virtual and Augmented Reality — from 3dprint.com

Excerpt:

At North Carolina State University, Assistant Professor of Chemistry Denis Fourches uses technology to research the effectiveness of new drugs. He uses computer programs to model interactions between chemical compounds and biological targets to predict the effectiveness of the compound, narrowing the field of drug candidates for testing. Lately, he has been using a new program that allows the user to create 3D models of molecules for 3D printing, plus augmented and virtual reality applications.

RealityConvert converts molecular objects like proteins and drugs into high-quality 3D models. The models are generated in standard file formats that are compatible with most augmented and virtual reality programs, as well as 3D printers. The program is specifically designed for creating models of chemicals and small proteins.
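
The article doesn't describe RealityConvert's internals, but the general shape of such a pipeline can be sketched with the open-source RDKit toolkit (an assumption on my part, not the tool's actual code): parse a molecule, generate explicit 3D coordinates, and write them to a standard chemistry file, after which a converter like RealityConvert produces mesh formats (OBJ, STL) for AR/VR viewers and 3D printers.

```python
# Not RealityConvert itself -- just a sketch of the first half of such a pipeline:
# turn a molecule description into explicit 3D coordinates in a standard file.
# Assumes the open-source RDKit toolkit is installed (pip install rdkit).
from rdkit import Chem
from rdkit.Chem import AllChem

# Aspirin, written as a SMILES string
mol = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")
mol = Chem.AddHs(mol)                      # add hydrogens before embedding
AllChem.EmbedMolecule(mol, randomSeed=42)  # generate 3D coordinates
AllChem.MMFFOptimizeMolecule(mol)          # quick force-field cleanup

# Write an SDF file; a tool like RealityConvert then handles the remaining step of
# producing mesh formats (e.g., OBJ/STL) for AR/VR viewers and 3D printers.
writer = Chem.SDWriter("aspirin_3d.sdf")
writer.write(mol)
writer.close()
```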

 

 

 

 

 

 

Mozilla just launched an augmented reality app — from thenextweb.com by Matthew Hughes

Excerpt:

Mozilla has launched its first ever augmented reality app for iOS. The company, best known for its Firefox browser, wants to create an avenue for developers to build augmented reality experiences using open web technologies, WebXR, and Apple’s ARKit framework.

This latest effort from Mozilla is called WebXR Viewer. It contains several sample AR programs, demonstrating its technology in the real world. One is a teapot, suspended in the air. Another contains holographic silhouettes, which you can place in your immediate vicinity. Should you be so inclined, you can also use it to view your own WebXR creations.

 

 

Airbnb is replacing the guest book with augmented reality — from qz.com by Mike Murphy

Excerpt:

Airbnb announced today (Dec. 11) that it’s experimenting with augmented- and virtual-reality technologies to enhance customers’ travel experiences.

The company showed off some simple prototype ideas in a blog post, detailing how VR could be used to explore apartments that customers may want to rent, from the comfort of their own homes. Hosts could scan apartments or houses to create 360-degree images that potential customers could view on smartphones or VR headsets.

It also envisioned an augmented-reality system where hosts could leave notes and instructions to their guests as they move through their apartment, especially if their house’s setup is unusual. AR signposts in the Airbnb app could help guide guests through anything confusing more efficiently than the instructions hosts often leave for their guests.

 

 

This HoloLens App Wants to Kickstart Collaborative Mixed Reality — from vrscout.com by Alice Bonasio

Excerpt:

Now Object Theory has just released a new collaborative computing application for the HoloLens called Prism, which takes many of the functionalities the company has been developing for its clients over the past couple of years and offers them to users in a free Windows Store application.

 

 

 

 

Virtual and Augmented Reality to Nearly Double Each Year Through 2021 — from campustechnology.com by Joshua Bolkan

Excerpt:

Spending on augmented and virtual reality will nearly double in 2018, according to a new forecast from International Data Corp. (IDC), growing from $9.1 billion in 2017 to $17.8 billion next year. The market research company predicts that aggressive growth will continue throughout its forecast period, achieving an average 98.8 percent compound annual growth rate (CAGR) from 2017 to 2021.
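
A quick back-of-the-envelope check of those figures: at a 98.8 percent compound annual growth rate, spending roughly doubles each year, which is consistent with the jump from $9.1 billion to $17.8 billion. The small script below just shows the arithmetic; the 2021 figure it prints is only what the stated CAGR would imply, not a number taken from the IDC report.

```python
# Quick arithmetic check on the figures quoted above (values in billions of USD).
spend_2017 = 9.1
cagr = 0.988          # 98.8% compound annual growth rate

# "Nearly double in 2018": one year of growth at the stated CAGR
spend_2018 = spend_2017 * (1 + cagr)
print(round(spend_2018, 1))   # ~18.1, in line with the $17.8B forecast

# What the stated CAGR alone would imply by 2021 (four growth years from 2017)
spend_2021 = spend_2017 * (1 + cagr) ** 4
print(round(spend_2021, 1))   # ~142 -- roughly doubling each year compounds quickly
```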

 

 

A look at the new BMW i3s in augmented reality with Apple’s ARKit — from electrek.co by Fred Lambert

 

 

 

 

Scope AR brings remote video tech support calls to HoloLens — by Dean Takahashi

Excerpt:

Scope AR has launched Remote AR, an augmented reality video support solution for Microsoft’s HoloLens AR headsets.

The San Francisco company is launching its enterprise-class AR solution to enable cross-platform live support video calls.

Remote AR for Microsoft HoloLens brings AR support for field technicians, enabling them to perform tasks with better speed and accuracy. It does so by allowing an expert to get on a video call with a technician and then mark the spot on the screen where the technician has to do something, like turn a screwdriver. The technician is able to see where the expert is pointing by looking at the AR overlay on the video scene.

 

 

 

 

Virtual Reality: The Next Generation Of Education, Learning and Training — from forbes.com by Kris Kolo

Excerpt:

Ultimately, VR in education will revolutionize not only how people learn but how they interact with real-world applications of what they have been taught. Imagine medical students performing an operation or geography students really seeing where and what Kathmandu is. The world just opens up to a rich abundance of possibilities.

 

 

 

7 Things You Should Know About Video Walls — from library.educause.edu

Excerpt:

What is it?
A video wall is a large-scale ultra-resolution digital display that joins multiple display screens with minimal bezels to create what is essentially one large screen. While the multiple monitors can be tiled to display one large image, the technology also lends itself well to viewing multiple sources at one time. Video walls are linked via software to a computer or other media source. Video walls typically are arrayed as flat-screen planar displays; sometimes the walls are curved. Found in public spaces like stadiums and airports, video walls are increasingly used in higher education as tools for pedagogy and research. Sizes vary. One example consists of 24 HD displays tiled and connected to create a 50-million-pixel screen. Another example combines twelve 55-inch ultra-resolution LED screens. A video wall at Stanford University measures 16 x 9 feet while one at Georgia State University measures 24 feet wide.

What are the implications for teaching and learning?
Video walls help create a dynamic, interactive, hands-on educational environment that enhances research, encourages active learning, and bolsters collaboration between faculty and students. Used for visualization and modeling of large data sets, for example, this technology helps learners and researchers view input from different perspectives, draw new conclusions, perform deeper analyses, and contribute to the development of new knowledge. Through that capacity, coupled with the immersive experience created by large-scale displays, video walls help learners bridge quantitative and qualitative methods, interdisciplinary exploration, and diverse modes of inquiry to address complex questions across the arts and sciences.
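
As a side note on the pixel counts quoted above: 24 tiled panels reaching roughly 50 million pixels works out only if each panel is a full-HD (1920 x 1080) display, which the excerpt doesn't state, so treat that as an assumption. The arithmetic is below.

```python
# Quick check of the pixel counts quoted above, assuming "HD" means
# full-HD (1920 x 1080) panels -- an assumption, since the excerpt doesn't say.
panels = 24
width, height = 1920, 1080

total_pixels = panels * width * height
print(f"{total_pixels:,}")     # 49,766,400 -- about 50 million pixels

# For comparison, a single 4K (3840 x 2160) display:
print(f"{3840 * 2160:,}")      # 8,294,400 -- the wall packs 6x that
```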

 

 

 

 

We Need to Help Our Students Build Solid Online-Based Footprints
I used a tool called VideoScribe to create this piece. The video relays how important it is that our students have solid, sharp, online-based footprints.

 


Robots in the Classroom: How a Program at Michigan State Is Taking Blended Learning to New Places — from news.elearninginside.com by Henry Kronk; with thanks to my friend and colleague, Mr. Dave Goodrich over at MSU, for his tweet on this.

Excerpt:

Like many higher education institutions, Michigan State University offers a wide array of online programs. But unlike most other online universities, some programs involve robots.

Here’s how it works: online and in-person students gather in the same classroom. Self-balancing robots mounted with computers roll around the room, displaying the face of one remote student. Each remote student streams in and controls one robot, which allows them to literally and figuratively take a seat at the table.

Professor Christine Greenhow, who teaches graduate level courses in MSU’s College of Education, first encountered these robots at an alumni event.

“I thought, ‘Oh I could use this technology in my classroom. I could use this to put visual and movement cues back into the environment,’” Greenhow said.

 

 

From DSC:
In my work to bring remote learners into face-to-face classrooms at Calvin College, I also worked with some of the tools shown/mentioned in that article — such as the Telepresence Robot from Double Robotics and the unit from Swivl. I also introduced Blackboard Collaborate and Skype as other methods of bringing in remote students (I hadn't yet tried Zoom, but that's another possibility).

As one looks at the image above, one can’t help but wonder what such a picture will look like 5-10 years from now. Will it show folks wearing VR-based headsets at their respective locations? Or perhaps some setups will feature the following types of tools within smaller “learning hubs” (which could also include one’s local Starbucks, Apple Store, etc.)?

 

 

 

 

 

 

 

 

 

The AI Index 2017 Annual Report — from aiindex.org

Excerpt:

Artificial Intelligence has leapt to the forefront of global discourse, garnering increased attention from practitioners, industry leaders, policymakers, and the general public. The diversity of opinions and debates gathered from news articles this year illustrates just how broadly AI is being investigated, studied, and applied. However, the field of AI is still evolving rapidly and even experts have a hard time understanding and tracking progress across the field.

Without the relevant data for reasoning about the state of AI technology, we are essentially “flying blind” in our conversations and decision-making related to AI.

Created and launched as a project of the One Hundred Year Study on AI at Stanford University (AI100), the AI Index is an open, not-for-profit project to track activity and progress in AI. It aims to facilitate an informed conversation about AI that is grounded in data. This is the inaugural annual report of the AI Index, and in this report we look at activity and progress in Artificial Intelligence through a range of perspectives. We aggregate data that exists freely on the web, contribute original data, and extract new metrics from combinations of data series.

All of the data used to generate this report will be openly available on the AI Index website at aiindex.org. Providing data, however, is just the beginning. To become truly useful, the AI Index needs support from a larger community. Ultimately, this report is a call for participation. You have the ability to provide data, analyze collected data, and make a wish list of what data you think needs to be tracked. Whether you have answers or questions to provide, we hope this report inspires you to reach out to the AI Index and become part of the effort to ground the conversation about AI.
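
For readers wondering what "extracting new metrics from combinations of data series" looks like in practice, here is a toy sketch (not the AI Index's actual pipeline) that derives a share-of-publications metric and a year-over-year growth metric from two small series. The numbers are placeholders, not real data.

```python
# Toy illustration of deriving new metrics from a combination of data series.
# The counts below are placeholders, not figures from the AI Index report.
import pandas as pd

years = [2014, 2015, 2016, 2017]
ai_papers = pd.Series([10_000, 14_000, 20_000, 28_000], index=years)      # placeholder counts
all_cs_papers = pd.Series([200_000, 220_000, 240_000, 260_000], index=years)

metrics = pd.DataFrame({
    "ai_papers": ai_papers,
    "ai_share_of_cs": (ai_papers / all_cs_papers).round(3),   # derived metric 1
    "ai_yoy_growth": ai_papers.pct_change().round(2),          # derived metric 2
})
print(metrics)
```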

 

 

 

How AI-powered enterprise chatbot platforms are transforming the future of work — from chatbotsmagazine.com by Gina Shaw

Excerpts:

WHAT IS AN ENTERPRISE CHATBOT PLATFORM?
To sum it up in a few words, a chatbot platform is a toolset used to build and deploy chatbots. Every organization has its own set of unique challenges that can be overcome by the convenient automation chatbots provide. After establishing a clear-cut chatbot strategy, enterprises can use a bot-builder platform to build, train, and manage customized bots. Before the advent of chatbot platforms, building a bot was a strenuous task that required sophisticated toolsets and advanced coding knowledge. However, with time, several bot-building platforms flooded the chatbot market and led to the creation of safe AI bots that need minimal deployment time and almost zero coding knowledge. Enterprise chatbot platforms also give IT departments complete control over their bots, along with access to bot monitoring.
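
To make the "almost zero coding knowledge" point concrete, here is a deliberately tiny sketch of the keyword-based intent matching that sits at the core of many bots; it is illustrative only, since real enterprise platforms layer NLU models, training interfaces, channel integrations, and monitoring on top of this idea.

```python
# A deliberately tiny sketch of the intent-matching core that chatbot platforms
# wrap behind their point-and-click interfaces. Intents and replies are hypothetical.
INTENTS = {
    "reset_password": {
        "keywords": {"password", "reset", "locked"},
        "reply": "I can help with that. A password reset link is on its way to your email.",
    },
    "it_ticket": {
        "keywords": {"ticket", "laptop", "printer", "broken"},
        "reply": "I've opened an IT ticket for you. A technician will follow up shortly.",
    },
}
FALLBACK = "I'm not sure yet; let me route you to a human colleague."

def answer(message: str) -> str:
    words = set(message.lower().split())
    # Pick the intent whose keywords overlap the user's message the most
    best_intent, best_score = None, 0
    for name, intent in INTENTS.items():
        score = len(words & intent["keywords"])
        if score > best_score:
            best_intent, best_score = name, score
    return INTENTS[best_intent]["reply"] if best_intent else FALLBACK

print(answer("my laptop printer is broken"))            # matches it_ticket
print(answer("I am locked out and need a password reset"))  # matches reset_password
```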

 

From DSC:
It is with some hesitation that I post this article. Why? Because:

  1. I started out my career in a customer service-related position at Baxter Healthcare, and it was one of the most important jobs I’ve had because it taught me the value of a customer. Ever since then, I have treated everyone as my customer — whether they be internal or external to the organization I was working for.
  2. Then there’s the experience of calling a Voice Response Unit (VRU): sometimes it works well, and sometimes I can’t stand it. There are times when I/we simply want to speak to a fellow human being.

So it is with some hesitation that I post this article. But I do so because it is yet another example of:

  • The increased usage of algorithms, software, bots, personal assistants, AI, etc. to obtain answers and information
  • The changing skillset employees will need and job seekers may want to develop (if such things are interesting to them)
  • The massive changes heading our way

 

 

 


From DSC:
Audrey Willis, with Circa Interactive, reminded me that next week is Computer Science Education Week. She wrote to me with the following additional resources:


 

As you may know, Computer Science Education Week starts next week on December 4. This week aims to raise awareness of the need to bolster computer science education around the world by encouraging teachers and students to host computer science events throughout the week. These events can include teacher-guided lesson plans, participating in the Hour of Code, watching computer science videos, or using your own resources to help inspire interest among students. It is for this reason that I wanted to share a few computer science resources with you that were just published by renowned universities. I believe these resources can provide K-12 students with valuable information about the different career fields that an interest in computer science can lead to, from education and health information management to electrical engineering.

Thanks in advance,
Audrey Willis
Circa Interactive

 

 

 

 

Augmented reality will transform city life — from venturebeat.com by Michael Park

Excerpts:

I’ve interviewed three AR entrepreneurs who explain three key ways that AR is set to transform urban living.

  • The real world will be indexed
  • Commuting will be smarter and safer
  • Language will be less of a barrier

 

 

 

Virtual Reality Devices – Where They Are Now and Where They’re Going — from iqsdirectory.com

Excerpts:

The questions now are:

  • What are the actual VR devices available?
  • Are they reasonably priced?
  • What do they do?
  • What are they going to do?

We try to answer those questions [here in this article].

In this early stage, the big question becomes: “What’s next?”

  • Integration of non-VR devices with VR users
  • Move away from needing a top-notch PC (or any PC)
  • Controllers will be your hands

 

 

Alibaba-backed augmented reality start-up makes driving look like a video game — from cnbc.com by Robert Ferris

  • WayRay makes augmented reality hardware and software for cars and drivers.
  • The company won a start-up competition at the Los Angeles Auto Show.
  • WayRay has also received an investment from Alibaba.

 

 

WayRay’s augmented reality driving system makes a car’s windshield look like a video game. The Swiss-based company that makes augmented reality for cars won the grand prize in a start-up competition at the Los Angeles Auto Show on Tuesday. WayRay makes a small device called Navion, which projects a virtual dashboard onto a driver’s windshield. The software can display information on speed, time of day, or even arrows and other graphics that can help the driver navigate, avoid hazards, and warn of dangers ahead, such as pedestrians. WayRay says that by displaying information directly on the windshield, the system allows drivers to stay better focused on the road. The display might appear similar to what a player would see on a screen in many video games. But the system also notifies the driver of potential points of interest along a route such as restaurants or other businesses.

 

 

 

HTC’s VR arts program brings exhibits to your home — from engadget.com by Jon Fingas
Vive Arts helps creators produce and share work in VR.

Excerpt:

Virtual reality is arguably a good medium for art: it not only enables creativity that just isn’t possible if you stick to physical objects, it allows you to share pieces that would be difficult to appreciate staring at an ordinary computer screen. And HTC knows it. The company is launching Vive Arts, a “multi-million dollar” program that helps museums and other institutions fund, develop and share art in VR. And yes, this means apps you can use at home… including one that’s right around the corner.

 

 

 

VR at the Tate Modern’s Modigliani exhibition is no gimmick — from engadget.com by Jamie Rigg
‘The Ochre Atelier’ experience is an authentic addition.

Excerpt:

There are no room-scale sensors or controllers, because The Ochre Atelier, as the experience is called, is designed to be accessible to everyone regardless of computing expertise. And at roughly 6-7 minutes long, it’s also bite-size enough that hopefully every visitor to the exhibition can take a turn. Its length and complexity don’t make it any less immersive though. The experience itself is, superficially, a tour of Modigliani’s last studio space in Paris: a small, thin rectangular room a few floors above street level.

In all, it took five months to digitally re-create the space. A wealth of research went into The Ochre Atelier, from 3D mapping the actual room — the building is now a bed-and-breakfast — to looking at pictures and combing through first-person accounts of Modigliani’s friends and colleagues at the time. The developers at Preloaded took all this and built a historically accurate re-creation of what the studio would’ve looked like. You teleport around this space a few times, seeing it from different angles and getting more insight into the artist at each stop. Look at a few obvious “more info” icons from each perspective and you’ll hear narrated the words of those closest to Modigliani at the time, alongside some analyses from experts at the Tate.

 

 

 

Real human holograms for augmented, virtual and mixed reality — from 8i.com; with thanks to Lisa Dawley for her Tweet on this
Create, distribute and experience volumetric video of real people that look and feel as if they’re in the same room.

 

 

 

Next-Gen Virtual Reality Will Let You Create From Scratch—Right Inside VR — from autodesk.com by Marcello Sgambelluri
The architecture, engineering and construction (AEC) industry is about to undergo a radical shift in its workflow. In the near future, designers and engineers will be able to create buildings and cities, in real time, in virtual reality (VR).

Excerpt:

What’s Coming: Creation
Still, these examples only scratch the surface of VR’s potential in AEC. The next big opportunity for designers and engineers will move beyond visualization to actually creating structures and products from scratch in VR. Imagine VR for Revit: What if you could put on an eye-tracking headset and, with the movement of your hands and wrists, grab a footing, scale a model, lay it out, push it, spin it, and change its shape?

 

 

 

How to be an ed tech futurist — from campustechnology.com by Bryan Alexander
While no one can predict the future, these forecasting methods will help you anticipate trends and spur more collaborative thinking.

Excerpts:

Some of the forecasting methods Bryan mentions are:

  • Trend analysis
  • Environmental scanning
  • Scenarios
  • Science fiction

 

 

 

 

From DSC:
I greatly appreciate the work that Bryan does — the topics that he chooses to write about, his analyses, comments, and questions are often thought-provoking. I couldn’t agree more with Bryan’s assertion that forecasting needs to become more realized/practiced within higher education. This is especially true given the exponential rate of change that many societies throughout the globe are now experiencing.

We need to be pulse-checking a variety of landscapes out there to identify significant trends, forces, and emerging technologies and put them on our radar. The strategy of identifying potential scenarios, and then developing responses to them, is very wise.