HarvardX rolls out new adaptive learning feature in online course — from edscoop.com by Corinne Lestch
Students in MOOC adaptive learning experiment scored nearly 20 percent better than students using more traditional learning approaches.

Excerpt:

Online courses at Harvard University are adapting on the fly to students’ needs.

Officials at the Cambridge, Massachusetts, institution announced a new adaptive learning technology that was recently rolled out in a HarvardX online course. The feature offers tailored course material that directly correlates with student performance while the student is taking the class, as well as tailored assessment algorithms.

HarvardX is an independent university initiative that was launched in parallel with edX, the online learning platform that was created by Harvard and Massachusetts Institute of Technology. Both HarvardX and edX run massive open online courses. The new feature has never before been used in a HarvardX course, and has only been deployed in a small number of edX courses, according to officials.

 

 

From DSC:
Given the growth of AI, this is certainly radar worthy — something that’s definitely worth pulse-checking to see where opportunities exist to leverage these types of technologies. What we now know of as adaptive learning will likely take an enormous step forward in the next decade.

IBM’s assertion rings in my mind:

 

 

I’m cautiously hopeful that these types of technologies can extend beyond K-12 and help us deal with the current need to be lifelong learners, and the need to constantly reinvent ourselves — while providing us with more choice, more control over our learning. I’m hopeful that learners will be able to pursue their passions, and enlist the help of other learners and/or the (human) subject matter experts as needed.

I don’t see these types of technologies replacing any teachers, professors, or trainers. That said, these types of technologies should be able to help do some of the heavy teaching and learning lifting in order to help someone learn about a new topic.

Again, this is one piece of the Learning from the Living [Class] Room that we see developing.

Some brief reflections from DSC:

will likely be used by colleges, universities, bootcamps, MOOCs, and others to feed web-based learner profiles, which will then be queried by people and/or organizations who are looking for freelancers and/or employees to fill their project and/or job-related needs.

As of the end of 2016, Microsoft — with their purchase of LinkedIn — is strongly positioned as being a major player in this new landscape. But it might turn out to be an open-sourced solution/database.

Data mining, algorithm development, and Artificial Intelligence (AI) will likely have roles to play here as well. The systems will likely be able to tell us where we need to grow our skillsets, and provide us with modules/courses to take. This is where the Learning from the Living [Class] Room vision becomes highly relevant, on a global scale. We will be forced to continually improve our skillsets as long as we are in the workforce. Lifelong learning is now a must. AI-based recommendation engines should be helpful here — as they will be able to analyze the needs, trends, developments, etc. and present us with some possible choices (based on our learner profiles, interests, and passions).
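To make the idea of an AI-based recommendation engine concrete, here is a minimal, hypothetical sketch of how such a system might rank modules against a learner profile. All names, tags, and weights below are invented for illustration; a real system would draw on far richer data and models.

```python
# Hypothetical sketch: rank learning modules against a learner profile
# by cosine similarity over shared interest/skill tags.
from math import sqrt

def cosine(a: dict, b: dict) -> float:
    """Cosine similarity between two sparse tag-weight vectors."""
    shared = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in shared)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def recommend(profile: dict, modules: dict, top_n: int = 3):
    """Return the top-N module names most aligned with the learner profile."""
    ranked = sorted(modules, key=lambda m: cosine(profile, modules[m]), reverse=True)
    return ranked[:top_n]

# A learner profile built from interests, passions, and past activity.
learner = {"data-science": 0.9, "python": 0.7, "statistics": 0.4}

# A (tiny) course catalog, each module tagged by topic.
catalog = {
    "Intro to Machine Learning": {"data-science": 1.0, "python": 0.8},
    "Medieval History":          {"history": 1.0},
    "Statistics Refresher":      {"statistics": 1.0, "data-science": 0.3},
}

print(recommend(learner, catalog, top_n=2))
```

Even this toy version surfaces the two relevant modules and ignores the unrelated one; the real leverage comes from feeding the profile with richer signals (trends, job-market needs, assessment results) as described above.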

 

 

Some reflections/resources on today’s announcements from Apple

[Image: tv-app-apple-10-27-16]

[Image: tv-app2-apple-10-27-16]

From DSC:
How long before recommendation engines like this can be filtered/focused down to just display apps, channels, etc. that are educational and/or training related (i.e., a recommendation engine to suggest personalized/customized playlists for learning)?

That is, in the future, will we have personalized/customized playlists for learning on our Apple TVs — as well as on our mobile devices — with the assessment results from the module(s) or course(s) we take being sent to:

  • A credentials database on LinkedIn (via blockchain)
    and/or
  • A credentials database at the college(s) or university(ies) that we’re signed up with for lifelong learning (via blockchain)
    and/or
  • Our cloud-based learning profiles — which can then feed a variety of HR-related systems used to find talent (via blockchain)?
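The “via blockchain” idea in the list above boils down to making a credentials record tamper-evident. Here is a deliberately simplified, hypothetical sketch of just the hash-chaining idea — not a distributed network, consensus protocol, or any production system:

```python
# Hypothetical sketch: a tamper-evident credential log, where each entry
# carries the hash of the previous entry (the core idea behind blockchain).
import hashlib
import json

def add_credential(chain: list, credential: dict) -> list:
    """Append a credential, linking it to the previous entry's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {"credential": credential, "prev_hash": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    chain.append(record)
    return chain

def verify(chain: list) -> bool:
    """Recompute every hash; any edited entry breaks the chain."""
    for i, record in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        body = {"credential": record["credential"], "prev_hash": record["prev_hash"]}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if record["prev_hash"] != expected_prev or record["hash"] != recomputed:
            return False
    return True

chain = []
add_credential(chain, {"learner": "jdoe", "badge": "Statistics I", "issuer": "ExampleU"})
add_credential(chain, {"learner": "jdoe", "badge": "Python Basics", "issuer": "ExampleU"})
print(verify(chain))                        # an untampered chain verifies
chain[0]["credential"]["badge"] = "PhD"     # tamper with the first record...
print(verify(chain))                        # ...and verification now fails
```

The point of the sketch: once a credential is written, nobody — not even the learner — can quietly edit it, which is exactly the property an employer querying such a profile would want.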

Will participants in MOOCs, virtual K-12 schools, homeschoolers, and more take advantage of learning from home?

Will the solid ROIs from having thousands of participants each paying a smaller amount (to take your course virtually) enable higher production values?

Will bots and/or human tutors be instantly accessible from our couches?

Will we be able to meet virtually via our TVs and share our computing devices?

 

[Image: bigscreen_rocket_league]

 

[Image: The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV]

Other items on today’s announcements:

[Image: macbookpro-10-27-16]

 

 

All the big announcements from Apple’s Mac event — from amp.imore.com by Joseph Keller

  • MacBook Pro
  • Final Cut Pro X
  • Apple TV > new “TV” app
  • Touch Bar

 

Apple is finally unifying the TV streaming experience with new app — from techradar.com by Nick Pino

 

 

How to migrate your old Mac’s data to your new Mac — from amp.imore.com by Lory Gil

 

 

MacBook Pro FAQ: Everything you need to know about Apple’s new laptops — from amp.imore.com by Serenity Caldwell

 

 

Accessibility FAQ: Everything you need to know about Apple’s new accessibility portal — from imore.com by Daniel Bader

 

 

Apple’s New MacBook Pro Has a ‘Touch Bar’ on the Keyboard — from wired.com by Brian Barrett

 

 

Apple’s New TV App Won’t Have Netflix or Amazon Video — from wired.com by Brian Barrett

Apple 5th Gen TV To Come With Major Software Updates; Release Date Likely In 2017 — from mobilenapps.com

Coppell ISD becomes first district to use IBM, Apple format — from bizjournals.com by Shawn Shinneman

Excerpt:

Teachers at Coppell Independent School District have become the first to use a new IBM and Apple technology platform built to aid personalized learning.

IBM Watson Element for Educators pairs IBM analytics and data tools such as cognitive computing with Apple design. It integrates student grades, interests, participation, and trends to help educators determine how a student learns best, the company says.

It also recommends learning content personalized to each student. The platform might suggest a reading assignment on astronomy for a young student who has shown an interest in space.

 

From DSC:
Technologies involved with systems like IBM’s Watson will likely bring some serious impact to the worlds of education and training & development. Such systems — and the affordances that they should be able to offer us — should not be underestimated.  The potential for powerful, customized, personalized learning could easily become a reality in K-20 as well as in the corporate training space. This is an area to keep an eye on for sure, especially with the growing influence of cognitive computing and artificial intelligence.

These kinds of technology should prove helpful in suggesting modules and courses (i.e., digital learning playlists), but I think the more powerful systems will be able to drill down far more minutely than that. I think these types of systems will be able to assist with all kinds of math problems and equations as well as analyze writing examples, correct language mispronunciations, and more (perhaps this is already here…apologies if so). In other words, the systems will “learn” where students can go wrong doing a certain kind of math equation…and then suggest steps to correct things when the system spots a mistake (or provide hints at how to correct mistakes).

This road takes us down to places where we have:

  • Web-based learner profiles — including learner’s preferences, passions, interests, skills
  • Microlearning/badging/credentialing — likely using blockchain
  • Learning agents/bots to “contact” for assistance
  • Guidance for lifelong learning
  • More choice, more control

 

[Image: ibmwatson-oct2016]

 

 

Also see:

  • First IBM Watson Education App for iPad Delivers Personalized Learning for K-12 Teachers and Students — from prnewswire.com
    Educators at Coppell Independent School District in Texas first to use new iPad app to tailor learning experiences to student’s interests and aptitudes
    Excerpts:
    With increasing demands on educators, teachers need tools that will enable them to better identify the individual needs of all students while designing learning experiences that engage and hold the students’ interest as they master the content. This is especially critical given that approximately one third of American students require remedial education when they enter college today, and current college attainment rates are not keeping pace with the country’s projected workforce needs [1]. A view of academic and day-to-day updates in real time can help teachers provide personalized support when students need it.

    IBM Watson Element provides teachers with a holistic view of each student through a fun, easy-to-use and intuitive mobile experience that is a natural extension of their work. Teachers can get to know their students beyond their academic performance, including information about personal interests and important milestones students choose to share. For example, teachers can input notes when a student’s highly anticipated soccer match is scheduled, when another has just been named president for the school’s World Affairs club, and when another has recently excelled following a science project that sparked a renewed interest in chemistry.

    The unique “spotlight” feature in Watson Element provides advanced analytics that enables deeper levels of communication between teachers about their students’ accomplishments and progress. For example, if a student is excelling academically, teachers can spotlight that student, praising their accomplishments across the school district. Or, if a student received a top award in the district art show, a teacher can spotlight the student so their other teachers know about it.
 

[Image: ngls-2017-conference]

 

From DSC:
I have attended the Next Generation Learning Spaces Conference for the past two years. Both conferences were very solid and they made a significant impact on our campus, as they provided the knowledge, research, data, ideas, contacts, and the catalyst for us to move forward with building a Sandbox Classroom on campus. This new, collaborative space allows us to experiment with different pedagogies as well as technologies. As such, we’ve been able to experiment much more with active learning-based methods of teaching and learning. We’re still in Phase I of this new space, and we’re learning new things all of the time.

For the upcoming conference in February, I will be moderating a New Directions in Learning panel on the use of augmented reality (AR), virtual reality (VR), and mixed reality (MR). Time permitting, I hope that we can also address other promising, emerging technologies that are heading our way such as chatbots, personal assistants, artificial intelligence, the Internet of Things, tvOS, blockchain and more.

The goal of this quickly-moving, engaging session will be to provide a smorgasbord of ideas to generate creative, innovative, and big thinking. We need to think about how these topics, trends, and technologies relate to what our next generation learning environments might look like in the near future — and put these things on our radars if they aren’t already there.

Key takeaways for the panel discussion:

  • Reflections regarding the affordances that new developments in Human Computer Interaction (HCI) — such as AR, VR, and MR — might offer for our learning and our learning spaces (or is our concept of what constitutes a learning space about to significantly expand?)
  • An update on the state of the approaching ed tech landscape
  • Creative, new thinking: What might our next generation learning environments look like in 5-10 years?

I’m looking forward to catching up with friends, meeting new people, and to the solid learning that I know will happen at this conference. I encourage you to check out the conference and register soon to take advantage of the early bird discounts.

 

 

IBM Foundation collaborates with AFT and education leaders to use Watson to help teachers — from finance.yahoo.com

Excerpt:

ARMONK, N.Y., Sept. 28, 2016 /PRNewswire/ — Teachers will have access to a new, first-of-its-kind, free tool using IBM’s innovative Watson cognitive technology that has been trained by teachers and designed to strengthen teachers’ instruction and improve student achievement, the IBM Foundation and the American Federation of Teachers announced today.

Hundreds of elementary school teachers across the United States are piloting Teacher Advisor with Watson – an innovative tool by the IBM Foundation that provides teachers with a complete, personalized online resource. Teacher Advisor enables teachers to deepen their knowledge of key math concepts, access high-quality vetted math lessons and acclaimed teaching strategies and gives teachers the unique ability to tailor those lessons to meet their individual classroom needs.

Litow said there are plans to make Teacher Advisor available to all elementary school teachers across the U.S. before the end of the year.

 

 

In this first phase, Teacher Advisor offers hundreds of high-quality vetted lesson plans, instructional resources, and teaching techniques, which are customized to meet the needs of individual teachers and the particular needs of their students.

 

 

Also see:

[Image: teacheradvisor-sept282016]

 

Educators can also access high-quality videos on teaching techniques to master key skills and bring a lesson or teaching strategy to life into their classroom.

 

 

From DSC:
Today’s announcement involved personalization and giving customized directions, and it caused my mind to go in a slightly different direction. (IBM, Google, Microsoft, Apple, Amazon, and others like Smart Sparrow are likely also thinking about this type of direction as well. Perhaps they’re already there…I’m not sure.)

But given the advancements in machine learning/cognitive computing — where example applications include optical character recognition (OCR) and computer vision — how much longer will it be before software can remotely or locally “see” what a third grader wrote down for a given math problem (via character and symbol recognition) and check over the student’s work? If the answer is incorrect, the algorithms will likely know where the student went wrong. The software will be able to ascertain what the student did wrong and then show them how the problem should be solved (either via hints or by showing the entire problem to the student — per the teacher’s instructions/admin settings). Perhaps, via natural language processing, this process could be verbalized as well.
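Setting the harder “seeing” (OCR) part aside, the checking part is already sketchable today. Here is a minimal, hypothetical example: given a student’s written steps for a one-variable linear equation — assumed to have already been recognized as text — find the first step that is no longer equivalent to the original equation. The equation format and the helper functions are my own invention for illustration.

```python
# Hypothetical sketch: flag the first step in a student's work on a linear
# equation in x that breaks equivalence with the previous steps.
# Assumes the steps arrive as plain text, e.g. from an OCR front end.
from fractions import Fraction

def solve_linear(equation: str) -> Fraction:
    """Solve an equation linear in x by sampling both sides at x=0 and x=1."""
    lhs, rhs = equation.replace(" ", "").split("=")
    f = lambda side, x: Fraction(eval(side, {"x": Fraction(x)}))  # trusted demo input only
    slope = (f(lhs, 1) - f(lhs, 0)) - (f(rhs, 1) - f(rhs, 0))
    intercept = f(rhs, 0) - f(lhs, 0)
    return intercept / slope

def first_wrong_step(steps: list):
    """Return the index of the first step whose solution differs, else None."""
    answer = solve_linear(steps[0])
    for i, step in enumerate(steps[1:], start=1):
        if solve_linear(step) != answer:
            return i
    return None

# The student slipped at step 1: "2*x = 9" should have been "2*x = 8".
work = ["2*x + 3 = 11", "2*x = 9", "x = 4.5"]
print(first_wrong_step(work))   # → 1 (the slip happened at step index 1)
```

A real tutoring system would go further — classifying the *kind* of error and generating a hint — but even this toy check pinpoints exactly where the work went off the rails, which is the capability the paragraph above is speculating about.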

Further questions/thoughts/reflections then came to my mind:

  • Will we have bots that teachers can use to teach different subjects? (“Watson may even ask the teacher additional questions to refine its response, honing in on what the teacher needs to address certain challenges.”)
  • Will we have bots that students can use to get the basics of a given subject/topic/equation?
  • Will instructional designers — and/or trainers in the corporate world — need to modify their skillsets to develop these types of bots?
  • Will teachers — as well as schools of education in universities and colleges — need to modify their toolboxes and their knowledgebases to take advantage of these sorts of developments?
  • How might the corporate world take advantage of these trends and technologies?
  • Will MOOCs begin to incorporate these sorts of technologies to aid in personalized learning?
  • What sorts of delivery mechanisms could be involved? Will we be tapping into learning-related bots from our living rooms or via our smartphones?

 

[Image: The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV]

Also see:

Virtual reality: The hype, the problems and the promise — from bbc.com by Tim Maughan
It’s the technology that is supposed to be 2016’s big thing, but what iteration of VR will actually catch on, and what’s just a fad? Tim Maughan takes an in-depth look.

Excerpt:

For Zec this is one of VR’s most promising potentials – to be able to drop audiences into a situation and force them to react emotionally, in ways that traditional filmmaking or journalism might struggle to do. “We really cannot understand what the people [in Syria and other places] right now are going through, so I thought maybe if we put the viewer inside the shoes of the family, or near them, maybe they can feel more and understand more rather than just reading a headline in a newspaper.”

 

The aim of Blackout is to challenge assumptions New Yorkers might have about the people around them, by allowing them to tap directly into their thoughts. “You’re given the ability to pick into people’s minds and their motives,” says co-creator Alex Porter. “Through that process you start to realise the ways in which you were wrong about all the people around you, and start to find these kind of exciting stories that they have to tell.”

 

From DSC:
Virtual Reality could have a significant impact in diversity training. (I don’t like the word diversity too much; as in my experience, everybody in the Fortune 500 companies where I worked belonged in the realm of diversity except for Christians, but I’ll use it here anyway.)

The point is…when you can put yourself into someone else’s shoes, it could have some positive impact in terms of better being able to relate to what that person is going through.

 

 

 

Star Trek in VR – Why can’t we do this with VR in education? — from digitalbodies.net by Maya Georgieva

Excerpt:

What if there was a new way to start this journey? What if you walked into the room and boarded a starship instead? What would a school experience be like if we sent our students on a mission, joining a global team to learn and solve our world’s most pressing problems? What if they met in Virtual Reality? For example, literally experiencing the streets of Paris if they were studying French culture or urban planning. Examining first hand the geology of volcanoes or building the next generation transportation? What would happen if they are given a problem they could not answer on their own, a problem that requires collaboration and teamwork with colleagues to find a solution?

Here is how VR and AI can empower the future of learning. The Star Trek: Bridge Crew VR Game gives us a glimpse of how we can engage with our students. Or, as Levar Burton (Geordi La Forge from Engineering) in the video trailer puts it:

There is something different being in a shared virtual environment . . . The team does not succeed unless everybody does their job well.


In the true spirit of Star Trek it is through cooperation rather than competition that we learn best. In VR, you can sit on any of the crew chairs and be the captain, engineer, or doctor and experience events from very different points of view. In Star Trek: Bridge Crew, you are flying the ship but have to work collaboratively with your team. You have to work with your crew to reach goals and accomplish the mission as this is virtual reality as a social experience. It demands that you be fully engaged.

 

 

 

Not just for gamers: CSU launching Virtual Reality Initiative — from source.colostate.edu by Lauren Klamm

Excerpt:

Think “virtual reality,” and it’s probably video gaming that comes to mind. CSU is looking to expand the breadth and depth of this emerging technical field with a campus-wide Virtual Reality Initiative, launching this semester.

The initiative will give students and the science community hands-on experience with virtual reality, for research and educational applications.

Virtual reality (VR) is a way of experiencing virtual worlds or objects – the cockpit of a spaceship, an anatomy lesson, a walk through a historical building – through devices like computers, goggles or headsets designed to immerse someone in a simulated environment. VR touches fields ranging from design to art to engineering.

 

 

VR Learning: How Virtual Reality Will Democratize Learning — from iamvr.co

Excerpt:

In case you haven’t heard, there is a lot of hype right now about virtual and augmented reality. Three months into 2016, investors have already spent 1.1 billion dollars to get a piece of the action.

Still, I am confident that virtual reality will revolutionize how we learn, and the reason is simple. Virtual reality is not just a technology, it’s a medium. And I’ve seen how powerful that medium can be.

Augmented reality has surgical application — from thestack.com by Nicky Cappella

Excerpt:

A Chinese surgeon has discovered a practical application for augmented reality in the medical field. Using the same technology by which a Pokemon character is layered onto a real-life setting, two surgical images can be combined into a single view, eliminating the need for surgeons to watch two separate screens simultaneously.

Catherine Chan Po-ling, a surgeon in Hong Kong and co-founder of MedEXO Robotics, says that the use of augmented reality technology in keyhole, or minimally invasive, surgery can solve one of the biggest problems for surgeons performing these procedures.

 

Currently, surgeons in keyhole procedures must create and view two images simultaneously. In Chan’s example, when checking for cancerous cells in the liver, the surgeon operates a regular camera showing a view of the surface of the liver, and at the same time operates an ultrasound probe to check beneath the surface of the liver.

 

 

Stanford Journalism Program’s Guide to Using Virtual Reality for Storytelling — from storybench.org by Geri Migielicz and Janine Zacharia

Excerpt:

Given the explosion of interest in virtual reality among media organizations, we sought in January to establish best practices and ideal scenarios for using the technology in storytelling through our inaugural immersive journalism class at Stanford University.

During the 10-week course, 12 undergraduate and graduate students evaluated a range of virtual reality experiences published by the New York Times, Wall Street Journal, ABC News and others. We compared commercially available virtual reality headsets (Google Cardboard, HTC Vive, Samsung Gear/VR and Oculus Rift) for ease and quality as well as virtual reality cameras — the (more expensive but expansive) GoPro and the (more affordable) Ricoh Theta S.

 

 

 

12 ways to use Google Cardboard in your class — from ditchthattextbook.com

Excerpt:

Virtual reality used to be the thing of science fiction books and movies. Now, it’s inexpensive, works with the technology we carry in our pockets, and can transform us to real and imaginary places.

 

 

 

These 5 Incredible HoloLens Videos Will Make You A VR/AR Believer — from uploadvr.com

Upload And Make School Graduate Their First Class Of VR Developers — from uploadvr.com

Infographic: IoT and the classroom of tomorrow — from cr80news.com by Andrew Hudson
Student IDs among list of most used smart devices on campus

Excerpt:

The classroom of tomorrow will undoubtedly employ more and more smart devices, and coupled with the Internet of Things (IoT) phenomenon, the way in which students learn could be very different in the not-so-distant future.

A new survey conducted by Extreme Networks reveals that while smart classrooms and schools only represent a small fraction of campuses today, the promise is there for the technology to redefine the academic experience going forward. There are K-12 schools and universities across the country that are already using the IoT to connect smart devices that can “talk” to one another for the purpose of enhancing the learning experience.

From DSC:
I look forward to the time when machine-to-machine communications and sensors will give faculty members the settings that they want set up/initiated as soon as they walk into a room (some of this is most likely already occurring somewhere else…just not on our campus yet!):

  • The front lights lower down 50% (as the professor had requested previously)
  • The front 80″ LCD display — a smart/Internet-connected device — is turned on and brings up that specific course on the screen (the system has already signed into the cloud-based CMS/LMS upon the professor entering the room, having queried the appropriate back-end system to ascertain what that professor teaches at that particular time and place)
  • The window treatments are lowered all the way down for better viewing
  • The speakers play a previously scheduled song, or a spoken poem, or an announcement, or what the students should be doing for the first 5-10 minutes of class
  • Etc.

Also:

  • Attendance is automatic (this clearly is already here today and has been for a while).
  • Students could receive any handouts that the professor wanted to wait to deliver until that particular date and time — again, automatically
  • Students could upload content that they created — automatically to an electronic parking lot, for the professor or other students to review and comment on
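The scenario above is essentially an event handler: a sensor identifies the instructor, the system looks up stored preferences, and commands go out to the room’s devices. A hypothetical sketch of that flow — every device name, preference field, and command string here is invented for illustration:

```python
# Hypothetical sketch: when a room sensor reports which professor walked in,
# look up their stored preferences and emit commands for the room's devices.
ROOM_PRESETS = {
    "prof_smith": {
        "front_lights": 50,                    # percent brightness
        "display": "on",
        "window_shades": "down",
        "audio": "opening-announcement.mp3",   # or a poem, song, instructions, etc.
    },
}

def on_professor_detected(professor_id: str, presets: dict = ROOM_PRESETS) -> list:
    """Translate a professor's stored preferences into (device, command) pairs."""
    prefs = presets.get(professor_id)
    if prefs is None:
        return [("room", "apply_defaults")]
    commands = []
    if "front_lights" in prefs:
        commands.append(("lights/front", f"dim:{prefs['front_lights']}"))
    if prefs.get("display") == "on":
        commands.append(("display/front", "power_on"))
        commands.append(("display/front", "open_lms_course"))  # course found via the schedule
    if prefs.get("window_shades") == "down":
        commands.append(("shades", "lower"))
    if "audio" in prefs:
        commands.append(("speakers", f"play:{prefs['audio']}"))
    return commands

for device, command in on_professor_detected("prof_smith"):
    print(device, command)
```

In a real deployment the command pairs would be published over a messaging protocol such as MQTT and the preferences would live in a campus-wide profile store, but the logic — identify, look up, dispatch — is this simple at its core.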

Also see the infographic, a portion of which is seen below:

[Image: Benefits-of-IoT-Aug2016]

 

How might these enhancements to Siri and tvOS 10 impact education/training/learning-related offerings & applications? [Christian]

From DSC:
I read the article mentioned below.  It made me wonder how 3 of the 4 main highlights that Fred mentioned (that are coming to Siri with tvOS 10) might impact education/training/learning-related applications and offerings made possible via tvOS & Apple TV:

  1. Live broadcasts
  2. Topic-based searches
  3. The ability to search YouTube via Siri

The article prompted me to wonder:

  • Will educators and trainers be able to offer live lectures and training (globally) that can be recorded and later searched via Siri? 
  • What if second screen devices could help learners collaborate and participate in active learning while watching what’s being presented on the main display/”TV?”
  • What if learning taken this way could be recorded on one’s web-based profile, a profile that is based upon blockchain-based technologies and maintained via appropriate/proven organizations of learning? (A profile that’s optionally made available to services from Microsoft/LinkedIn.com/Lynda.com and/or to a service based upon IBM’s Watson, and/or to some other online-based marketplace/exchange for matching open jobs to potential employees.)
  • Or what if you could earn a badge or prove a competency via this manner?

Hmmm…things could get very interesting…and very powerful.

More choice. More control. Over one’s entire lifetime.

Heutagogy on steroids.

Micro-learning.

Perhaps this is a piece of the future for MOOCs…

 

[Image: MoreChoiceMoreControl-DSC]

 

 

[Image: The Living [Class] Room -- by Daniel Christian -- July 2012 -- a second device used in conjunction with a Smart/Connected TV]

 

 

[Image: StreamsOfContent-DSC]

 

 


 

Apple TV gets new Siri features in tvOS 10 — from iphonefaq.org by Fred Straker

Excerpt:

The forthcoming update to Apple TV continues to bring fresh surprises for owners of Apple’s set top box. Many improvements are coming to tvOS 10, including single-sign-on support and an upgrade to Siri’s capabilities. Siri has already opened new doors thanks to the bundled Siri Remote, which simplifies many functions on the Apple TV interface. Four main highlights are coming to Siri with tvOS 10, which is expected to launch this fall.

 


 

Addendum on 7/17/16:

CBS News Launches New Apple TV App Designed Exclusively for tvOS — from macrumors.com

Excerpt:

CBS today announced the launch of an all-new Apple TV app that will center around the network’s always-on, 24-hour “CBSN” streaming network and has been designed exclusively for tvOS. In addition to the live stream of CBSN, the app curates news stories and video playlists for each user based on previously watched videos.

The new app will also take advantage of the 4th generation Apple TV’s deep Siri integration, allowing users to tell Apple’s personal assistant that they want to “Watch CBS News” to immediately start a full-screen broadcast of CBSN. While the stream is playing, users can interact with other parts of the app to browse related videos, bookmark some to watch later, and begin subscribing to specific playlists and topics.

UX to LX: The Rise of Learner Experience Design — from edsurge.com by Whitney Kilgore

Excerpt:

Instructional design is now approaching a similar transition. Most student consumers have yet to experience great learning design, but the commoditization of online learning is forcing colleges and universities to think differently about how they construct digital courses. Courseware is enabling the development of new modalities and pedagogical shifts. An abundance of data now enables instructional designers to decode learning patterns. As a result, we are witnessing the growth of a new field: Learner Experience Design.

Parse higher-education job postings and descriptions, and it’s evident that LX design is, as a discipline, among the fastest growing fields in education. But what exactly makes for great learning design, and how can instructional designers ensure they remain competitive in this new era of student-centric education?

The transition to digital content has made entirely new layers of student data available. Learners now leave a digital footprint that allows designers to understand how students are interacting with course materials and for how long. LX designers can develop course pathways that connect student challenges to specific sections of content. For the first time, faculty have insights into time on task—before, during and after class. Ready access to student behavior data is helping institutions develop powerful predictive analytics, and LX designers are leading the field to make more and better informed choices on content delivery to help students better understand the critical concepts.

The groundswell of data and learning technology shows no sign of slowing down, and a LX designer’s job will grow more complex alongside it. Learner experience designers must rise to the challenge, so universities can deliver online courses that captivate and resonate with each unique student.
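The “time on task” analytics the excerpt describes start from something very ordinary: a log of interaction events. A minimal, hypothetical sketch of the computation, using invented student and resource names:

```python
# Hypothetical sketch: derive per-student "time on task" from a digital
# footprint of (student, resource, opened_at, closed_at) interaction events.
from collections import defaultdict
from datetime import datetime

def time_on_task(events):
    """Sum minutes each student spent on each course resource."""
    totals = defaultdict(float)
    for student, resource, opened, closed in events:
        start = datetime.fromisoformat(opened)
        end = datetime.fromisoformat(closed)
        totals[(student, resource)] += (end - start).total_seconds() / 60
    return dict(totals)

# A tiny event log: one student revisits the reading in the evening.
log = [
    ("s1", "ch1-reading", "2016-10-27T09:00", "2016-10-27T09:25"),
    ("s1", "ch1-quiz",    "2016-10-27T09:25", "2016-10-27T09:40"),
    ("s1", "ch1-reading", "2016-10-27T20:00", "2016-10-27T20:10"),
]
print(time_on_task(log))
```

An LX designer would layer the interesting work on top of aggregates like these — flagging students whose time on task collapses mid-course, or connecting long dwell times on one section to a confusing piece of content.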

 

Ten skills you need to be a UX unicorn — from medium.com by Conor Ward

Excerpt:

So if the discipline of UX is not about improving how things LOOK, but instead how they WORK, then of course UX Design includes a multitude of varied deep specialisms and expertise. How could it not?

Well, that’s where the mythical part of this discussion comes in: many UX designers out there still believe very strongly (and for good reason) that this multi-skilled ‘specialist-generalist’ cannot exist.

They could well be correct in their current circumstances. For example if their company does not work like this, then how could they? Especially if their company is an agency, and their access to users is limited or non-existent.

My own experience is that we must strive towards unicorn-ism. I have created and curated a fantastic team of UX Unicorns (yes a group of unicorns is called a blessing, Google says so) and more importantly together with my colleagues and bosses we have created the environment for them to survive and thrive.

From DSC:
The above two articles get at a piece of what I was trying to relay out at The EvoLLLution.com. And that is, the growing complexities of putting digitally-based materials online — with a high degree of engagement, professionalism, and quality — require the use of specialists. One person simply can’t do it all anymore. In fact, User Experience Designers and Learner Experience Designers are but a couple of the potential players at the table.

 

 

 
© 2016 Learning Ecosystems