Blockchain Deployment Checklist — from thejournal.com by Sara Friedman
While the technology is still in its nascent stages, blockchain-based education systems have the potential to revolutionize how school districts manage student data.

From DSC:
Unfortunately, the checklist provided in this otherwise solid article is too long and complicated…it needs to be streamlined. But I think it’s likely that we’ll see more products in the future that remove these complexities.

Along these lines, I think we’ll see cloud-based learner profiles in the future. Throughout our lifetimes, we will own the data and direct who can — and can’t — access it.
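
To make the idea of learner-owned data a bit more concrete, here is a minimal, hypothetical sketch of a learner profile that grants and revokes access to specific parties. Every name in this snippet is my own illustration of the concept, not an existing product, API, or standard.

```python
# A minimal, hypothetical sketch of a learner-owned profile that controls access.
# Every name here is illustrative only -- not an existing product, API, or standard.
from dataclasses import dataclass, field

@dataclass
class LearnerProfile:
    learner_id: str
    records: dict = field(default_factory=dict)            # e.g., {"transcript": {...}}
    authorized_viewers: set = field(default_factory=set)   # parties the learner has approved

    def grant_access(self, party: str) -> None:
        """The learner decides who CAN see their data..."""
        self.authorized_viewers.add(party)

    def revoke_access(self, party: str) -> None:
        """...and who no longer can."""
        self.authorized_viewers.discard(party)

    def read(self, party: str, record_name: str):
        """A viewer only gets a record if the learner has authorized them."""
        if party not in self.authorized_viewers:
            raise PermissionError(f"{party} is not authorized by the learner")
        return self.records.get(record_name)

# Usage sketch: the learner, not the institution, directs access over a lifetime.
profile = LearnerProfile("learner-123", records={"transcript": {"Algebra I": "A-"}})
profile.grant_access("community-college-registrar")
print(profile.read("community-college-registrar", "transcript"))
profile.revoke_access("community-college-registrar")
```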

Also see:

Math Visuals — from mathvisuals.wordpress.com

Examples:

For a next gen learning platform: A Netflix-like interface to check out potential functionalities / educationally-related “apps” [Christian]

From DSC:
In a next-generation learning system, it would be sharp/beneficial to have a Netflix-like interface for checking out potential functionalities that you could turn on and off (at will) — as one component of a learning ecosystem whose setup could be located in your living room or office.

For example, put a Netflix-like interface on top of the apps out at eduappcenter.com (i.e., using a scrolling interface at first, then moving to a static page/listing of apps…again, similar to Netflix).
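
To make the “turn functionalities on and off” idea a bit more concrete, here is a hypothetical sketch of a Netflix-style catalog of educationally-related apps grouped into browsable rows. The class and field names are my own illustrations; this is not eduappcenter.com’s actual data model or API.

```python
# A hypothetical sketch of a Netflix-style catalog of educationally-related "apps"
# that a learner (or instructor) can toggle on and off at will. The names below
# are illustrative only -- not eduappcenter.com's actual data model or API.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class LearningApp:
    app_id: str
    title: str
    category: str        # e.g., "Math", "Writing", "Assessment"
    enabled: bool = False

class AppCatalog:
    def __init__(self, apps):
        self.apps = {app.app_id: app for app in apps}

    def rows(self):
        """Group apps by category -- the 'rows' of a Netflix-like browsing UI."""
        grouped = defaultdict(list)
        for app in self.apps.values():
            grouped[app.category].append(app)
        return dict(grouped)

    def toggle(self, app_id: str, enabled: bool) -> None:
        """Turn a functionality on or off within the learning ecosystem."""
        self.apps[app_id].enabled = enabled

# Usage sketch
catalog = AppCatalog([
    LearningApp("graphing", "Graphing Tool", "Math"),
    LearningApp("peer-review", "Peer Review", "Writing"),
])
catalog.toggle("graphing", True)
for category, apps in catalog.rows().items():
    print(category, [(a.title, a.enabled) for a in apps])
```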

Google is bringing translation to its Home speakers — from businessinsider.com by Peter Newman

Excerpt:

Google has added real-time translation capabilities to its Google Home smart speakers, the Home Hub screened speaker, as well as other screened devices from third parties, according to Android Police.
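
From DSC:
For readers wondering what a basic translation call looks like programmatically, below is a minimal sketch using the Cloud Translation API’s Python client. To be clear, this is only an illustration of machine translation in general; it is not how the Home’s interpreter mode is implemented, and it assumes a Google Cloud project with credentials already configured.

```python
# A minimal translation call using Google's Cloud Translation API (v2 client).
# This only illustrates programmatic translation in general -- it is NOT how
# Google Home's interpreter mode works internally.
from google.cloud import translate_v2 as translate  # pip install google-cloud-translate

client = translate.Client()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set

result = client.translate("Where is the library?", target_language="es")
print(result["translatedText"])           # e.g., "¿Dónde está la biblioteca?"
print(result["detectedSourceLanguage"])   # e.g., "en"
```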

Also see:

Why Facebook’s banned “Research” app was so invasive — from wired.com by Louise Matsakis

Excerpts:

Facebook reportedly paid users between the ages of 13 and 35 $20 a month to download the app through beta-testing companies like Applause, BetaBound, and uTest.


Apple typically doesn’t allow app developers to go around the App Store, but its enterprise program is one exception. It’s what allows companies to create custom apps not meant to be downloaded publicly, like an iPad app for signing guests into a corporate office. But Facebook used this program for a consumer research app, which Apple says violates its rules. “Facebook has been using their membership to distribute a data-collecting app to consumers, which is a clear breach of their agreement with Apple,” a spokesperson said in a statement. “Any developer using their enterprise certificates to distribute apps to consumers will have their certificates revoked, which is what we did in this case to protect our users and their data.” Facebook didn’t respond to a request for comment.

Facebook needed to bypass Apple’s usual policies because its Research app is particularly invasive. First, it requires users to install what is known as a “root certificate.” This lets Facebook look at much of your browsing history and other network data, even if it’s encrypted. The certificate is like a shape-shifting passport—with it, Facebook can pretend to be almost anyone it wants.

To use a nondigital analogy, Facebook not only intercepted every letter participants sent and received, it also had the ability to open and read them. All for $20 a month!

Facebook’s latest privacy scandal is a good reminder to be wary of mobile apps that aren’t available for download in official app stores. It’s easy to overlook how much of your information might be collected, or to accidentally install a malicious version of Fortnite, for instance. VPNs can be great privacy tools, but many free ones sell their users’ data in order to make money. Before downloading anything, especially an app that promises to earn you some extra cash, it’s always worth taking another look at the risks involved.

UIX: When the future comes to West Michigan, will we be ready? — from rapidgrowthmedia.com by Matthew Russell

Excerpts (emphasis DSC):

“Here in the United States, if we were to personify things a bit, it’s almost like society is anxiously calling out to an older sibling (i.e., emerging technologies), ‘Heh! Wait up!!!'” Christian says. “This trend has numerous ramifications.”

Out of those ramifications, Christian names three main points that society will have to address to fully understand, make use of, and make practical, future technologies.

  1. The need for the legal/legislative side of the world to close the gap between what’s possible and what’s legal
  2. The need for lifelong learning and to reinvent oneself
  3. The need to make pulse-checking/futurism an essential tool in the toolbox of every member of the workforce today and in the future

Photos by Adam Bird

From DSC:
The key thing that I was trying to relay in my contribution to Matthew’s helpful article was that we are now on an exponential trajectory of technological change. This trend has ramifications for numerous societies around the globe, and it involves the legal realm as well. Hopefully, all of us in the workforce are coming to realize our need to constantly pulse-check the relevant landscapes around us. To help make that happen, each of us needs to tap into the appropriate “streams of content” that are relevant to our careers so that our knowledge bases are as up-to-date as possible. We’re all into lifelong learning now, right?

Along these lines, there is an increasing need for futurism to hit the mainstream. That is, when the world is moving at 120+ mph, the skills and methods that futurists use must be better taught and understood, or many people will be broadsided by the changes that emerging technologies bring. We need to better pulse-check the relevant landscapes, anticipate the oncoming changes, develop potential scenarios, and then design strategies to respond to those scenarios.

Amazon has 10,000 employees dedicated to Alexa — here are some of the areas they’re working on — from businessinsider.com by Avery Hartmans

Summary (emphasis DSC):

  • Amazon’s vice president of Alexa, Steve Rabuchin, has confirmed that yes, there really are 10,000 Amazon employees working on Alexa and the Echo.
  • Those employees are focused on things like machine learning and making Alexa more knowledgeable.
  • Some employees are working on giving Alexa a personality, too.

From DSC:
How might this trend impact learning spaces? For example, I am interested in using voice to intuitively “drive” smart classroom control systems (a rough sketch of how such commands might be routed follows the list below):

  • “Alexa, turn on the projector”
  • “Alexa, dim the lights by 50%”
  • “Alexa, open Canvas and launch my Constitutional Law I class”
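
Below is a rough, hypothetical sketch of how a skill backend (e.g., an AWS Lambda function) might route such voice commands. The intent names, slot names, and the classroom-control functions are placeholders I am assuming for illustration; only the general shape of the request/response JSON follows the Alexa Skills Kit conventions.

```python
# A minimal, hypothetical sketch of an Alexa skill backend (e.g., an AWS Lambda
# handler) that routes classroom voice commands to a room-control system.
# The intent names and control functions are made-up placeholders, not a vendor API.

def set_projector(on: bool):
    print(f"Projector {'on' if on else 'off'}")       # stand-in for a real control call

def dim_lights(percent: int):
    print(f"Dimming lights by {percent}%")            # stand-in for a real control call

def launch_lms_course(course_name: str):
    print(f"Opening LMS course: {course_name}")       # stand-in for a real control call

def lambda_handler(event, context):
    """Route an Alexa intent (from the JSON request envelope) to a room action."""
    request = event.get("request", {})
    if request.get("type") != "IntentRequest":
        return _speak("Welcome to the smart classroom.")

    intent = request["intent"]
    name = intent["name"]
    slots = intent.get("slots", {})

    if name == "TurnOnProjectorIntent":
        set_projector(True)
        return _speak("Turning on the projector.")
    if name == "DimLightsIntent":
        percent = int(slots["Percent"]["value"])
        dim_lights(percent)
        return _speak(f"Dimming the lights by {percent} percent.")
    if name == "LaunchCourseIntent":
        course = slots["Course"]["value"]
        launch_lms_course(course)
        return _speak(f"Opening {course}.")
    return _speak("Sorry, I don't know that classroom command.")

def _speak(text: str):
    """Build a minimal Alexa response envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": True,
        },
    }
```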

What is 5G? Everything you need to know — from techradar.com by Mike Moore
The latest news, views and developments in the exciting world of 5G networks.

Excerpt:

What is 5G?
5G networks are the next generation of mobile internet connectivity, offering faster speeds and more reliable connections on smartphones and other devices than ever before.

Combining cutting-edge network technology and the very latest research, 5G should offer connections that are multitudes faster than current connections, with average download speeds of around 1Gbps expected to soon be the norm.

The networks will help power a huge rise in Internet of Things technology, providing the infrastructure needed to carry huge amounts of data, allowing for a smarter and more connected world.

With development well underway, 5G networks are expected to launch across the world by 2020, working alongside existing 3G and 4G technology to provide speedier connections that stay online no matter where you are.

So with only a matter of months to go until 5G networks are set to go live, here’s our run-down of all the latest news and updates.

From DSC:
I wonder…

  • What will Human Computer Interaction (HCI) look like when ~1Gbps average download speeds are the norm?
  • What will the Internet of Things (IoT) turn into (for better or for worse)?
  • How will Machine-to-Machine (M2M) communications be impacted?
  • What will that kind of bandwidth mean for XR-related technologies (AR, VR, MR)?

State of the E-Learning Industry 2019 — from elearninfo247.com by Craig Weiss
What I have seen so far (covers 2018 into the first week of 2019)…

4 Micro-Learning Trends to Watch in 2019 — from elearninglearning.com by Laura Lynch
Micro-learning is no longer a trend—it’s a permanent e-learning development. Here’s how it will change in the year to come.

Tech and trends at CES 2019 – Ripple Effect Group — from rippleffectgroup.com

Big tech may look troubled, but it’s just getting started — from nytimes.com by David Streitfeld

Excerpt:

SAN JOSE, Calif. — Silicon Valley ended 2018 somewhere it had never been: embattled.

Lawmakers across the political spectrum say Big Tech, for so long the exalted embodiment of American genius, has too much power. Once seen as a force for making our lives better and our brains smarter, tech is now accused of inflaming, radicalizing, dumbing down and squeezing the masses. Tech company stocks have been pummeled from their highs. Regulation looms. Even tech executives are calling for it.

The expansion underlines the dizzying truth of Big Tech: It is barely getting started.

“For all intents and purposes, we’re only 35 years into a 75- or 80-year process of moving from analog to digital,” said Tim Bajarin, a longtime tech consultant to companies including Apple, IBM and Microsoft. “The image of Silicon Valley as Nirvana has certainly taken a hit, but the reality is that we the consumers are constantly voting for them.”

Big Tech needs to be regulated, many are beginning to argue, and yet there are worries about giving that power to the government.

Which leaves regulation up to the companies themselves, always a dubious proposition.

The world is changing. Here’s how companies must adapt. — from weforum.org by Joe Kaeser, President and Chief Executive Officer, Siemens AG

Excerpts (emphasis DSC):

Although we have only seen the beginning, one thing is already clear: the Fourth Industrial Revolution is the greatest transformation human civilization has ever known. As far-reaching as the previous industrial revolutions were, they never set free such enormous transformative power.

The Fourth Industrial Revolution is transforming practically every human activity...its scope, speed and reach are unprecedented.

Enormous power (Insert from DSC: What I was trying to get at here) entails enormous risk. Yes, the stakes are high. 

“And make no mistake about it: we are now writing the code that will shape our collective future.”

Joe Kaeser, President and Chief Executive Officer, Siemens AG

Contrary to Milton Friedman’s maxim, the business of business should not just be business. Shareholder value alone should not be the yardstick. Instead, we should make stakeholder value, or better yet, social value, the benchmark for a company’s performance.

Today, stakeholders…rightfully expect companies to assume greater social responsibility, for example, by protecting the climate, fighting for social justice, aiding refugees, and training and educating workers. The business of business should be to create value for society.

This seamless integration of the virtual and the physical worlds in so-called cyber-physical systems – that is the giant leap we see today. It eclipses everything that has happened in industry so far. As in previous industrial revolutions but on a much larger scale, the Fourth Industrial Revolution will eliminate millions of jobs and create millions of new jobs.

“…because the Fourth Industrial Revolution runs on knowledge, we need a concurrent revolution in training and education.

If the workforce doesn’t keep up with advances in knowledge throughout their lives, how will the millions of new jobs be filled?” 

Joe Kaeser, President and Chief Executive Officer, Siemens AG

From DSC:
At least three critically important things jump out at me here:

  1. We are quickly approaching a time when people will need to be able to reinvent themselves quickly and cost-effectively, especially people with families who are still working in their (still existing) jobs. (Or have we already entered this period of time…?)
  2. There is a need to help people identify which jobs are safe to reinvent themselves toward — at least for the next 5-10 years.
  3. Citizens across the globe — and their relevant legislatures, governments, and law schools — need to help close the gap between what emerging technologies make possible and the decisions about whether those technologies should be rolled out at all, and if so, how and with which features.

What freedoms and rights should individuals have in the digital age?

Joe Kaeser, President and Chief Executive Officer, Siemens AG

Best camera for vlogging 2019: 10 perfect choices tested — from techradar.com by Matthew Richards
Here are our top 10 vlogging camera picks

From DSC:
Also, with a different kind of camera in mind…and with a shout-out to Mr. Charles Mickens (CIO / Associate Dean of Innovation and Technology at the WMU-Cooley Law School), see the amazing Light L16 Camera:

Video: “A Little Bit of Light” from Light on Vimeo.

Facial recognition has to be regulated to protect the public, says AI report — from technologyreview.com by Will Knight
The research institute AI Now has identified facial recognition as a key challenge for society and policymakers—but is it too late?

Excerpt (emphasis DSC):

Artificial intelligence has made major strides in the past few years, but those rapid advances are now raising some big ethical conundrums.

Chief among them is the way machine learning can identify people’s faces in photos and video footage with great accuracy. This might let you unlock your phone with a smile, but it also means that governments and big corporations have been given a powerful new surveillance tool.

A new report from the AI Now Institute (large PDF), an influential research institute based in New York, has just identified facial recognition as a key challenge for society and policymakers.

Also see:

EXECUTIVE SUMMARY
At the core of the cascading scandals around AI in 2018 are questions of accountability: who is responsible when AI systems harm us? How do we understand these harms, and how do we remedy them? Where are the points of intervention, and what additional research and regulation is needed to ensure those interventions are effective? Currently there are few answers to these questions, and the frameworks presently governing AI are not capable of ensuring accountability. As the pervasiveness, complexity, and scale of these systems grow, the lack of meaningful accountability and oversight – including basic safeguards of responsibility, liability, and due process – is an increasingly urgent concern.

Building on our 2016 and 2017 reports, the AI Now 2018 Report contends with this central problem and addresses the following key issues:

  1. The growing accountability gap in AI, which favors those who create and deploy these technologies at the expense of those most affected
  2. The use of AI to maximize and amplify surveillance, especially in conjunction with facial and affect recognition, increasing the potential for centralized control and oppression
  3. Increasing government use of automated decision systems that directly impact individuals and communities without established accountability structures
  4. Unregulated and unmonitored forms of AI experimentation on human populations
  5. The limits of technological solutions to problems of fairness, bias, and discrimination

Within each topic, we identify emerging challenges and new research, and provide recommendations regarding AI development, deployment, and regulation. We offer practical pathways informed by research so that policymakers, the public, and technologists can better understand and mitigate risks. Given that the AI Now Institute’s location and regional expertise is concentrated in the U.S., this report will focus primarily on the U.S. context, which is also where several of the world’s largest AI companies are based.

From DSC:
As I said in this posting, we need to be aware of the emerging technologies around us. Just because we can doesn’t mean we should. People need to be aware of — and involved with — deciding which emerging technologies get rolled out (or not) and/or which features are beneficial to roll out (or not).

One of the things that’s beginning to alarm me these days is how the United States has turned over the keys to the Maserati — i.e., an expensive, powerful thing — to youth who lack the life experience to know how to handle such power, and who often lack the proper respect for it. Many of these youthful members of our society don’t own the responsibility for the positive and negative impacts that such powerful technologies can have (and the more senior execs haven’t taken enough responsibility either)!

If you owned a $137,000+ car like that, would you turn the keys over to your 16-to-25-year-old? Yet that’s what America has been doing for years. And, in some areas, we’re now paying the price.

The corporate world continues to discard the hard-earned experience that age brings…as it shoves older people out of the workforce. (I hesitate to use the word wisdom…but in some cases, that’s also relevant/involved here.) Then we, as a society, sit back and wonder how we got to this place.

Even technologists and programmers in their 20s and 30s are beginning to step back and ask…WHY did we develop this application or that feature? Was it — is it — good for society? Is it beneficial? Or should it be tabled or revised into something else?

Below is but one example — though I don’t mean to pick on Microsoft, as they likely have more older workers than the Facebooks, Googles, or Amazons of the world. I fully realize that all of these companies have some older employees. But the youth-oriented culture in America today has almost become an obsession — and not just in the tech world. Turn on the TV, check out the new releases on Netflix, go see a movie in a theater, listen to the radio, cast but a glance at the magazines in the checkout lines, etc., and you’ll instantly know what I mean.

In the workplace, there appears to be a bias against older employees as being less innovative or tech-savvy — a perspective that is often completely incorrect. Go check out LinkedIn for items re: age discrimination…it’s a very real thing. And many of us over the age of 30 know this to be true firsthand if we’ve lost a job in the last decade or two and have tried to get a new one that involves technology.

Microsoft argues facial-recognition tech could violate your rights — from finance.yahoo.com by Rob Pegoraro

Excerpt (emphasis DSC):

On Thursday, the American Civil Liberties Union provided a good reason for us to think carefully about the evolution of facial-recognition technology. In a study, the group used Amazon’s (AMZN) Rekognition service to compare portraits of members of Congress to 25,000 arrest mugshots. The result: 28 members were mistakenly matched with 28 suspects.

The ACLU isn’t the only group raising the alarm about the technology. Earlier this month, Microsoft (MSFT) president Brad Smith posted an unusual plea on the company’s blog asking that the development of facial-recognition systems not be left up to tech companies.

Saying that the tech “raises issues that go to the heart of fundamental human rights protections like privacy and freedom of expression,” Smith called for “a government initiative to regulate the proper use of facial recognition technology, informed first by a bipartisan and expert commission.”

But we may not get new laws anytime soon.
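
From DSC:
For those curious about the mechanics behind a test like the ACLU’s, here is a minimal sketch of a single face comparison using Amazon Rekognition’s CompareFaces operation via boto3. The bucket and file names are placeholders, and this is not the ACLU’s actual code; their study ran comparisons at a much larger scale, and the choice of similarity threshold matters a great deal (Amazon recommends a far higher threshold for law-enforcement use than the default-level setting shown here).

```python
# A minimal sketch of one face comparison with Amazon Rekognition (via boto3).
# Bucket and object names below are placeholders, not the ACLU's actual setup;
# their study ran this kind of comparison at scale against 25,000 mugshots.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.compare_faces(
    SourceImage={"S3Object": {"Bucket": "my-demo-bucket", "Name": "portrait.jpg"}},
    TargetImage={"S3Object": {"Bucket": "my-demo-bucket", "Name": "mugshot.jpg"}},
    SimilarityThreshold=80,  # 80% cutoff (reportedly the setting the ACLU used); higher values reduce false matches
)

for match in response["FaceMatches"]:
    print(f"Possible match, similarity: {match['Similarity']:.1f}%")

if not response["FaceMatches"]:
    print("No match above the similarity threshold.")
```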

Just because we can does not mean we should.

Smart speakers hit critical mass in 2018 — from techcrunch.com by Sarah Perez

Excerpt (emphasis DSC):

We already know Alexa had a good Christmas — the app shot to the top of the App Store over the holidays, and the Alexa service even briefly crashed from all the new users. But Alexa, along with other smart speaker devices like Google Home, didn’t just have a good holiday — they had a great year, too. The smart speaker market reached critical mass in 2018, with around 41 percent of U.S. consumers now owning a voice-activated speaker, up from 21.5 percent in 2017.

In the U.S., there are now more than 100 million Alexa-enabled devices installed — a key milestone for Alexa to become a “critical mass platform,” the report noted.
