2018 TECH TRENDS REPORT — from the Future Today Institute
Emerging technology trends that will influence business, government, education, media and society in the coming year.


The Future Today Institute’s 11th annual Tech Trends Report identifies 235 tantalizing advancements in emerging technologies—artificial intelligence, biotech, autonomous robots, green energy and space travel—that will begin to enter the mainstream and fundamentally disrupt business, geopolitics and everyday life around the world. Our annual report has garnered more than six million cumulative views, and this edition is our largest to date.

Helping organizations see change early and calculate the impact of new trends is why we publish our annual Emerging Tech Trends Report, which focuses on mid- to late-stage emerging technologies that are on a growth trajectory.

In this edition of the FTI Tech Trends Report, we’ve included several new features and sections:

  • a list and map of the world’s smartest cities
  • a calendar of events that will shape technology this year
  • detailed near-future scenarios for several of the technologies
  • a new framework to help organizations decide when to take action on trends
  • an interactive table of contents, which will allow you to more easily navigate the report from the bookmarks bar in your PDF reader



01 How does this trend impact our industry and all of its parts?
02 How might global events (politics, climate change, economic shifts) impact this trend, and as a result, our organization?
03 What are the second, third, fourth, and fifth-order implications of this trend as it evolves, both in our organization and our industry?
04 What are the consequences if our organization fails to take action on this trend?
05 Does this trend signal emerging disruption to our traditional business practices and cherished beliefs?
06 Does this trend indicate a future disruption to the established roles and responsibilities within our organization? If so, how do we reverse-engineer that disruption and deal with it in the present day?
07 How are the organizations in adjacent spaces addressing this trend? What can we learn from their failures and best practices?
08 How will the wants, needs and expectations of our consumers/ constituents change as a result of this trend?
09 Where does this trend create potential new partners or collaborators for us?
10 How does this trend inspire us to think about the future of our organization?




Implications of Learning Theories on Instructional Design — from elearningindustry.com by Jon-Erik Oleyar-Reynolds
Are you interested in becoming an Instructional Designer? Or are you just starting out in the world of learning theories? The focus of this article is to inform the reader of 3 unique learning theories while discussing the implications they have had in the field of Instructional Design (ID).


Behaviorist Learning Theory
Behavioral learning theory can be summarized as learning that occurs through the behavioral response to environmentally sourced stimuli. The foundation of this theory is built upon assumptions that “have little regard for the cognitive processing of the learner involved in the task”.

The focus of behavioral learning theory resides in the use of reinforcement to drive behavior. Instructional Design can benefit from the use of reinforcement as a means to train learners to complete instructional objectives that are presented to them.

Cognitive Learning Theory
The primary focus of learning is on the development of knowledge by the creation of schemas. Schemas are like catalogs of information that can be used to identify concepts or experiences through a complex set of relationships that are connected to one another. In short, the catalogs act like a database of knowledge for the learner. 2 prominent theories that will be discussed are Gestalt theory and information processing theory; these 2 have paved the way for cognitivism and its impact on the field of Instructional Design.

Information processing theory further supports cognitive learning theory. Similar to Gestalt theory, the focus of learning is on the individual. The processing of information by the learner is similar to the way a computer processes information. The memory system is broken into 3 stages based on this approach:

  • Sensory memory
  • Working memory
  • Long-term memory

…the working memory may require more rehearsal to establish a clear connection to the concept and store it in long-term memory.

One of the most widely used strategies is arguably rehearsal. The expression “practice makes perfect” may seem cliché, but it fits very well when discussing cognition and development. As one rehearses, the working memory is exercised.

While working memory is limited to roughly 7 items (plus or minus 2), chunking information increases the amount that can be held and worked with at one time.
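As a small illustration of chunking (my own sketch, not from the article), here is a bit of Python showing how grouping digits the way a phone number is normally read turns eleven separate items into four chunks, well within the 7-plus-or-minus-2 limit:

# Illustrative sketch of "chunking": the same 11 digits become 4 familiar chunks.
raw_digits = "16165551234"          # 11 separate items -- beyond the 7 +/- 2 limit

# Group the digits the way a phone number is normally read.
chunks = [raw_digits[0], raw_digits[1:4], raw_digits[4:7], raw_digits[7:]]

print(len(raw_digits))   # 11 individual items to hold in working memory
print(chunks)            # ['1', '616', '555', '1234'] -- only 4 chunks to hold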

Instructional Design shifted with the rise of cognitivism, adopting a more systematic design approach with a focus on the learner.


Social Learning Theory
Social learning theory focuses on the impact of learning based on factors related to the social environment. In other words, learning occurs in the context of a social situation that the learner is placed in.

Think of the expression “perhaps it rubbed off on me”; it relates directly to social learning theory. Self-efficacy, the belief that one can be successful at particular tasks, can be strengthened by designing lessons that allow learners to watch others of similar ability succeed at instructional tasks. This can be achieved in a number of creative ways, but it is generally most effective in collaborative activities where learners work in small groups.

Collaborative learning groups and the use of peer review are widely used in many settings in which learning occurs.



From DSC:
I wanted to briefly relay an example that relates to Cognitive Learning/Processing Theory, and more specifically to a concept known as Cognitive Load. The other day I was sitting at the kitchen table, trying to read an interesting blog post. At the very same time, the radio was (loudly) relaying an item about Virtual Reality (VR), which also caught my ear and interest.

Which “channel” do I focus on? My visual channel or my auditory channel?

For me, I can’t do both well; perhaps some people can, but our visual and auditory channels can only handle so much at one time. Both channels compete for our attention and processing resources. I ended up getting up and shutting off the radio so that I could continue reading the blog post. I think of it like a traffic jam: there are only so many cars that can simultaneously get through the busy highway that leads downtown.

So another application of this: it’s helpful NOT to have a lot of auditory information going on at the same time as a lot of visual information. If you have PowerPoint slides, use graphics, photos, and/or graphs, and use your audio voiceover to speak to them. Don’t put up a long paragraph of text, simply read that text aloud, and then also ask the learner to absorb other visual information at the same time.




On Change and Relevance for Higher Education — from campustechnology.com by Mary Grush and Phil Long
A Q&A with Phil Long


Mary Grush: You’ve been connected to scores of technology leaders and have watched trends in higher education for more than 30 years. What is the central, or most important concern you are hearing from institutional leadership now?

Phil Long: Higher ed institutions are facing some serious challenges to stay relevant in a world that is diversifying and changing rapidly. They want to make sure that the experiences they have designed for students will carry the next generation forward to be productive citizens and workers. But institutions’ abilities to keep up in our changing environment have begun to lag to a sufficient degree, such that alternatives to the traditional university are being considered, both by the institutions themselves and by their constituents and colleagues throughout the education sector.

Grush: What are a few of the more specific areas in which institutions may find it difficult to navigate?

Long: Just from a very high level view, I’d include on that list: big data and the increasing sophistication of algorithms, with the associated benefits and risks; artificial intelligence with all its implications for good… and for peril; and perhaps most importantly, new applications and practices that support how we recognize learning.



“The pace of change never seems to slow down. And the issues and implications of the technologies we use are actually getting broader and more profound every day.” — Phil Long




2018 Workplace Learning Report — from learning.linkedin.com


The path to opportunity is changing
The short shelf life of skills and a tightening labor market are giving rise to a multitude of skill gaps. Businesses are fighting to stay ahead of the curve, trying to hold onto their best talent and struggling to fill key positions. Individuals are conscious of staying relevant in the age of automation.

Enter the talent development function.
These organizational leaders create learning opportunities to enable employee growth and achievement. They have the ability to guide their organizations to success in tomorrow’s labor market, but they can’t do it alone.

Our research answers the talent developer’s most pressing questions:
* How are savvy talent development leaders adapting to the pace of change in today’s dynamic world of work?
* Why do employees demand learning and development resources but not make the time to learn?
* How do executives think about learning and development?
* Are managers the missing link to successful learning programs?


From DSC:
Even though this piece is a bit of a sales pitch for Lynda.com (a great service, I might add), it’s still worth checking out. I say this because it brings up a very real trend that I’m trying to bring more awareness to: the pace of change has changed. Our society is not ready for this new, exponential pace of change. Technologies are impacting jobs and how we do our jobs, and will likely do so for the next several decades. Skills gaps are real and likely growing larger. Corporations need to do their part in helping higher education revise and develop curricula, they need to offer funds to create new types of learning labs and environments, and they need to offer more internships and opportunities to learn new skills.




A Microlearning Framework — from jvsp.io and Pablo Navarro
This infographic is based on the experience of different clients from different industries in different training programs.


From DSC:
I thought this was a solid infographic, and it should prove useful for Instructional Designers, Faculty Members, and Corporate Trainers alike.

I might also consider adding a “Gotcha!” piece first — even before the welcome piece — in order to get the learner’s attention and to immediately answer the WHY question. WHY is this topic important and relevant to me? When topics are relevant to people, they care and engage a whole lot more with the content that’s about to be presented to them. Ideally, such a piece would stir some curiosity as well.





Who’s Teaching the Teachers? — from chronicle.com by Elizabeth Alsop

Excerpts (emphasis DSC):

Last fall, the academic career coach Jennifer Polk conducted an informal Twitter poll: How many of you, she asked her followers, received any meaningful pedagogical training during graduate school?

Replies ranged from the encouraging to the mostly dispiriting, with one doctoral candidate noting that the only training the program had offered took the form of “trial by fire.” Just 19 percent of the 2,248 respondents said they had received at least “decent” training — a number that, however unscientific, is also symptomatic.

This statistic reflects something that many of us could confirm firsthand: Teaching remains undervalued in the context of doctoral training and the profession at large. The result, by this anecdotal reckoning, is that less than one-fifth of aspiring college teachers are effectively taught how to teach.

The American Association of University Professors estimates that over 70 percent of all faculty positions are non-tenure-track, so these are teaching, not research, appointments.

Ennobling as such rhetorical constructs may be, they obscure not only the very real labor of teaching, but the fact that teaching is teachable: something that results not from divine, Dead Poets Society-like bursts of inspiration, but, as in other career fields, from study, apprenticeship, and practice. There are any number of books — including Ken Bain’s What the Best College Teachers Do, John Bean’s Engaging Ideas, and Cathy Davidson’s The New Education — that offer excellent advice for college instructors.

It’s also worth noting that the resistance to addressing pedagogy in graduate education may be practical, as well as philosophical: Teaching someone to teach is hard. Like writing, teaching is a craft, learned not just in a single class, practicum, or workshop. Rather, it’s a recursive process, developed through trial and error — and yes, by “fire” — but also through conversation with others: a mentor, a cohort, your peers.





4 Things Experts Want You to Know About Blockchain in Higher Ed — from edtechmagazine.com by Meghan Bogardus Cortez
The electronic ledger software holds a lot of potential for university data.


At universities, blockchain is poised to help in aspects of data management, credentialing and research. From boosting security to enhancing access, the e-ledger tool opens up many possibilities.

Here are four ways experts think blockchain has a place in higher education:
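As a side note to make the “electronic ledger” idea more concrete, here is a minimal, hypothetical sketch (mine, not from the article) of a hash-chained credential ledger in Python. Each record stores the hash of the previous record, so earlier entries cannot be quietly altered; the student names and credentials are made-up example data.

# Minimal illustrative sketch of an append-only, hash-chained credential ledger.
import hashlib
import json
import time

def add_credential(ledger, student, credential):
    # Link this record to the previous one by carrying its hash forward.
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    record = {
        "student": student,
        "credential": credential,
        "timestamp": time.time(),
        "prev_hash": prev_hash,
    }
    # Hash the record's contents (including the previous hash) to seal it.
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    ledger.append(record)
    return ledger

ledger = []
add_credential(ledger, "Jane Doe", "B.S. Computer Science, 2018")
add_credential(ledger, "Jane Doe", "Certificate: Data Analysis")
# Tampering with an earlier record would change its hash and break the chain.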




In Move Towards More Online Degrees, Coursera Introduces Its First Bachelor’s — from edsurge.com by Sydney Johnson


These days, though, many MOOC platforms are courting the traditional higher-ed market they once rebuked, often by hosting fully online master’s degrees for colleges and universities. And today, one of the largest MOOC providers, Coursera, announced it’s going one step further in that direction with its first fully online bachelor’s degree.

“We are realizing that the vast reach of MOOCs makes them a powerful gateway to degrees,” Coursera CEO Jeff Maggioncalda said in a statement.

The new degree will be a bachelor of science in computer science from the University of London. The entire program will cost between £9,600 and £17,000 (approximately $13,300 to $23,500), depending on a student’s geographic location. According to a spokesperson for Coursera, the program’s “cost is adjusted based on whether a student is in a developed or developing economy.”



From DSC:
At least a couple of questions come to mind here:

  • What might the future hold if the U.S. Department of Education / the Federal Government begins funding these types of alternatives to traditional higher education?
  • Will Coursera be successful here and begin adding more degrees? If so, a major game-changer could be on our doorstep.




From DSC:
After seeing the article entitled “Scientists Are Turning Alexa into an Automated Lab Helper,” I began to wonder: might Alexa be a tool to periodically schedule and provide practice tests and distributed practice on content? In the future, will there be “learning bots” that a learner can employ to do such self-testing and/or distributed practice?





Scientists Are Turning Alexa into an Automated Lab Helper — from technologyreview.com by Jamie Condliffe
Amazon’s voice-activated assistant follows a rich tradition of researchers using consumer tech in unintended ways to further their work.


Alexa, what’s the next step in my titration?

Probably not the first question you ask your smart assistant in the morning, but potentially the kind of query that scientists may soon be leveling at Amazon’s AI helper. Chemical & Engineering News reports that software developer James Rhodes—whose wife, DeLacy Rhodes, is a microbiologist—has created a skill for Alexa called Helix that lends a helping hand around the laboratory.

It makes sense. While most people might ask Alexa to check the news headlines, play music, or set a timer because our hands are a mess from cooking, scientists could look up melting points, pose simple calculations, or ask for an experimental procedure to be read aloud while their hands are gloved and in use.

For now, Helix is still a proof-of-concept. But you can sign up to try an early working version, and Rhodes has plans to extend its abilities…
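Tying this back to the “learning bots” question above: as a purely speculative sketch (not an existing skill), a practice-test skill for Alexa might look something like the following in Python, using Amazon’s ask-sdk-core library. The intent name and the question bank here are hypothetical; a real skill would pull questions from a course’s content and schedule them on a spaced-practice calendar.

# Speculative sketch of an Alexa "learning bot" that serves a practice question.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name
import random

# Hypothetical question bank; a real skill would draw from course content.
QUESTIONS = [
    ("What are the three stages of the memory system?",
     "Sensory memory, working memory, and long-term memory."),
    ("What is the classic capacity estimate for working memory?",
     "About seven items, plus or minus two."),
]

class PracticeQuestionHandler(AbstractRequestHandler):
    """Handles a hypothetical 'GivePracticeQuestionIntent'."""
    def can_handle(self, handler_input):
        return is_intent_name("GivePracticeQuestionIntent")(handler_input)

    def handle(self, handler_input):
        question, answer = random.choice(QUESTIONS)
        # Keep the expected answer in the session so a follow-up intent could check it.
        handler_input.attributes_manager.session_attributes["answer"] = answer
        return (handler_input.response_builder
                .speak(question)
                .ask("Say your answer when you're ready.")
                .response)

sb = SkillBuilder()
sb.add_request_handler(PracticeQuestionHandler())
handler = sb.lambda_handler()  # entry point when the skill is hosted on AWS Lambda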





