Blockchain: The move from freedom to the rigid, dominant system in learning — from oeb.global by Inge de Waard
In this post Inge de Waard gives an overview of current Blockchain options from industry, looks at their impact on universities, and philosophises on blockchain's future.

Excerpt:

I mentioned a couple of Blockchain certification options already, but an even more advanced example of blockchain in learning has appeared on my radar too. It is a Russian implementation called Disciplina. This platform combines education (including vocational training), recruiting (comparable with what LinkedIn is doing with its economic graph) and careers for professionals. All of this is combined into a blockchain solution that keeps track of each learner’s entire journey. The platform includes not only online courses as we know them but also coaching. After each training, you get a certificate.

TeachMePlease, which is a partner of Disciplina, enables teachers and students to find each other for specific professional training as well as curriculum-related children’s schooling. Admittedly, these initiatives are still being rolled out in terms of courses, but it clearly shows where the next learning will be located: in an umbrella above all the universities and professional academies. At present, the university courses are being embedded into course offerings by corporations that roll out a layer post-university, or post-vocational schooling.

Europe embraces blockchain, as can be seen with its EU Blockchain Observatory and Forum. At the national level, Malta is storing its certifications in a blockchain nationwide as well. We cannot deny that blockchain is getting picked up by both companies and governments. Universities have been piloting several blockchain certification options, and they also harbour some of the leading voices in the debate on blockchain certification.
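
None of the platforms mentioned above publish their internals here, but the core idea behind blockchain-backed certification (chaining tamper-evident hashes of credential records so that no single party can quietly alter an earlier one) can be sketched in a few lines of Python. Everything below, from the CredentialChain class to the field names and sample data, is a hypothetical illustration rather than Disciplina's or Malta's actual design.

```python
# A minimal, illustrative sketch of blockchain-style credentialing:
# every certificate is hashed, and each block's hash covers the previous
# block's hash, so altering any earlier record breaks the chain.
# The field names here are hypothetical, not Disciplina's actual schema.
import hashlib
import json
import time


def block_hash(block: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()


class CredentialChain:
    def __init__(self):
        self.blocks = [{"index": 0, "timestamp": time.time(),
                        "credential": None, "prev_hash": "0" * 64}]

    def add_credential(self, learner_id: str, course: str, result: str) -> dict:
        block = {
            "index": len(self.blocks),
            "timestamp": time.time(),
            "credential": {"learner": learner_id, "course": course, "result": result},
            "prev_hash": block_hash(self.blocks[-1]),
        }
        self.blocks.append(block)
        return block

    def verify(self) -> bool:
        """Recompute the hash links; any tampering is detected here."""
        return all(self.blocks[i]["prev_hash"] == block_hash(self.blocks[i - 1])
                   for i in range(1, len(self.blocks)))


chain = CredentialChain()
chain.add_credential("learner-042", "Vocational training: welding", "pass")
chain.add_credential("learner-042", "Coaching module 1", "completed")
print(chain.verify())  # True, and False if any earlier block is edited
```

Real systems layer digital signatures, distributed consensus and revocation on top of this, but the hash chain is what lets a third party verify that an issued certificate has not been tampered with.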

 

Also see:

AI in education -- April 2019 by Inge de Waard

Future proof learning -- the Skills 3.0 project

 

Also see:

  • 7 blockchain mistakes and how to avoid them — from computerworld.com by Lucas Mearian
    The blockchain industry is still something of a wild west, with many cloud service offerings and a large universe of platforms that can vary greatly in their capabilities. So enterprises should beware jumping to conclusions about the technology.
 
 

10 things we should all demand from Big Tech right now — from vox.com by Sigal Samuel
We need an algorithmic bill of rights. AI experts helped us write one.


Excerpts:

  1. Transparency: We have the right to know when an algorithm is making a decision about us, which factors are being considered by the algorithm, and how those factors are being weighted.
  2. Explanation: We have the right to be given explanations about how algorithms affect us in a specific situation, and these explanations should be clear enough that the average person will be able to understand them.
  3. Consent: We have the right to give or refuse consent for any AI application that has a material impact on our lives or uses sensitive data, such as biometric data.
  4. Freedom from bias: We have the right to evidence showing that algorithms have been tested for bias related to race, gender, and other protected characteristics — before they’re rolled out. The algorithms must meet standards of fairness and nondiscrimination and ensure just outcomes. (Inserted comment from DSC: Is this even possible? I hope so, but I have my doubts especially given the enormous lack of diversity within the large tech companies. A rough sketch of what such a bias test could look like appears after this list.)
  5. Feedback mechanism: We have the right to exert some degree of control over the way algorithms work.
  6. Portability: We have the right to easily transfer all our data from one provider to another.
  7. Redress: We have the right to seek redress if we believe an algorithmic system has unfairly penalized or harmed us.
  8. Algorithmic literacy: We have the right to free educational resources about algorithmic systems.
  9. Independent oversight: We have the right to expect that an independent oversight body will be appointed to conduct retrospective reviews of algorithmic systems gone wrong. The results of these investigations should be made public.
  10. Federal and global governance: We have the right to robust federal and global governance structures with human rights at their center. Algorithmic systems don’t stop at national borders, and they are increasingly used to decide who gets to cross borders, making international governance crucial.
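
Point 4 above raises the practical question of what “tested for bias” might look like in code. One of the simplest pre-deployment checks compares positive-outcome rates across demographic groups (demographic parity). The sketch below is hypothetical and deliberately minimal: real audits use several fairness metrics, confidence intervals, and real decision data rather than the toy records shown here.

```python
# Hypothetical illustration of one simple pre-deployment bias check:
# the demographic parity gap, i.e. the difference in positive-outcome
# rates between groups. Group labels and decisions below are made up.
from collections import defaultdict


def positive_rate_by_group(decisions):
    """decisions: iterable of (group_label, approved: bool) pairs."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        approvals[group] += int(approved)
    return {g: approvals[g] / totals[g] for g in totals}


decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

rates = positive_rate_by_group(decisions)
gap = max(rates.values()) - min(rates.values())
print(rates)
print(f"demographic parity gap = {gap:.2f}")
# A gap near 0 suggests similar treatment across groups; a large gap
# flags the system for closer review before it is rolled out.
```

How small the gap must be is a policy call rather than a purely technical one, which is part of why the article also argues for independent oversight (point 9).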

 

This raises the question: Who should be tasked with enforcing these norms? Government regulators? The tech companies themselves?

 

 

Introduction: Leading the social enterprise—Reinvent with a human focus
2019 Global Human Capital Trends
— from deloitte.com by Volini, Schwartz, Roy, Hauptmann, Van Durme, Denny, and Bersin

Excerpt (emphasis DSC):

Learning in the flow of life. The number-one trend for 2019 is the need for organizations to change the way people learn; 86 percent of respondents cited this as an important or very important issue. It’s not hard to understand why. Evolving work demands and skills requirements are creating an enormous demand for new skills and capabilities, while a tight labor market is making it challenging for organizations to hire people from outside. Within this context, we see three broader trends in how learning is evolving: It is becoming more integrated with work; it is becoming more personal; and it is shifting—slowly—toward lifelong models. Effective reinvention along these lines requires a culture that supports continuous learning, incentives that motivate people to take advantage of learning opportunities, and a focus on helping individuals identify and develop new, needed skills.

 

The Future Today Institute’s 12th Annual Emerging Tech Trends Report — from futuretodayinstitute.com

Excerpts:

At the Future Today Institute, we identify emerging tech trends and map the future for our clients. This is FTI’s 12th annual Tech Trends Report, and in it we identify 315 tantalizing advancements in emerging technologies — artificial intelligence, biotech, autonomous robots, green energy and space travel — that will begin to enter the mainstream and fundamentally disrupt business, geopolitics and everyday life around the world. As of the publication date, the annual FTI Tech Trends Report has garnered more than 7.5 million cumulative views.

Key findings for 2019 (emphasis DSC)

  • Privacy is dead. (DSC: NOT GOOD!!! If this is true, can the situation be reversed?)
  • Voice Search Optimization (VSO) is the new SEO.
  • The Big Nine.
  • Personal data records are coming. (DSC: Including cloud-based learner profiles I hope.)
  • China continues to ascend, and not just in artificial intelligence.
  • Lawmakers around the world are not prepared to deal with new challenges that arise from emerging science and technology.
  • Consolidation continues as a key theme for 2019.

 

 

How MIT’s Mini Cheetah Can Help Accelerate Robotics Research — from spectrum.ieee.org by Evan Ackerman
Sangbae Kim talks to us about the new Mini Cheetah quadruped and his future plans for the robot

 

 

From DSC:
Sorry, but while the video/robot is incredible, a feeling in the pit of my stomach makes me reflect upon what’s likely happening along these lines in the militaries throughout the globe…I don’t mean to be a fear monger, but rather a realist.

 

 

Is this South Africa’s best legal online platform using blockchain? — from techfinancials.co.za
The winners of the event, Kagiso, will progress to the second round of the contest, in which a panel of international judges will decide who attends a grand final in New York.

Excerpt:

The Hague Institute for Innovation of Law (HiiL) and leading global law firm Baker McKenzie have announced the winners of the South African leg of Global Legal Hackathon 2019 (GLH2019).

First prize went to Kagiso, an online mediation platform that provides a cost-effective and fast alternative to lengthy court processes for civil disputes.

Kagiso uses machine learning to match cases with professional mediators who have the most relevant skill sets to be effective – such as subject matter experience or knowledge of local languages – and stores records using blockchain technology.

The second prize was awarded to Bua, a voice-recognition system that allows victims of crime to record their own statements in their own language in a private “safe space” such as a kiosk or on their own phone.

The majority of crimes in South Africa go unreported or prosecutions fail. A leading reason is that victims don’t feel comfortable giving statements in open police stations, and statements are often badly or wilfully mistranslated.
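
Kagiso’s actual matching model isn’t described beyond “machine learning,” so the snippet below is only a hypothetical sketch of the general idea: score each mediator by how well their profile overlaps with a case’s requirements (subject matter, language) and rank the results. The names, skill sets, and the simple overlap scoring are all made up for illustration.

```python
# Hypothetical sketch of case-to-mediator matching: rank mediators by
# the overlap (Jaccard similarity) between a case's required attributes
# and each mediator's profile. A real system would learn a ranking
# model from past mediation outcomes instead of this fixed rule.
def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0


mediators = {
    "Mediator A": {"contract disputes", "isiZulu", "construction"},
    "Mediator B": {"family law", "English", "mediation training"},
    "Mediator C": {"contract disputes", "Afrikaans", "insurance"},
}

case_needs = {"contract disputes", "isiZulu"}

ranked = sorted(mediators.items(),
                key=lambda kv: jaccard(case_needs, kv[1]),
                reverse=True)
for name, skills in ranked:
    print(f"{name}: score = {jaccard(case_needs, skills):.2f}")
# The top-scoring mediator would be proposed to the parties first,
# with the mediation record itself stored on the blockchain layer.
```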

 

 

The Global Legal Hackathon is a non-profit organization that organizes law schools, law firms and in-house departments, legal technology companies, governments, and service providers to drive innovation in the legal industry – across the globe. It brings together the best thinkers, doers and practitioners in law in support of a unified vision: rapid development of solutions to improve the legal industry, worldwide.

 

 

From DSC:
Glancing through the awards likely shows where the future of the legal field is going…at least in part.

 

Also see:

 

 

Acts 10:34-35 New International Version (NIV) — from biblegateway.com

34 Then Peter began to speak: “I now realize how true it is that God does not show favoritism 35 but accepts from every nation the one who fears him and does what is right.

 

Towards a Reskilling Revolution: Industry-Led Action for the Future of Work — from weforum.org

As the Fourth Industrial Revolution impacts skills, tasks and jobs, there is growing concern that both job displacement and talent shortages will impact business dynamism and societal cohesion. A proactive and strategic effort is needed on the part of all relevant stakeholders to manage reskilling and upskilling to mitigate against both job losses and talent shortages.

Through the Preparing for the Future of Work project, the World Economic Forum provides a platform for designing and implementing intra-industry collaboration on the future of work, working closely with the public sector, unions and educators. The output of the project’s first phase of work, Towards a Reskilling Revolution: A Future of Jobs for All, highlighted an innovative method to identify viable and desirable job transition pathways for disrupted workers. This second report, Towards a Reskilling Revolution: Industry-Led Action for the Future of Work, extends our previous research to assess the business case for reskilling and establish its magnitude for different stakeholders. It also outlines a roadmap for selected industries to address specific challenges and opportunities related to the transformation of their workforce.

 

See the PDF file / report here.

 

 

 

 

What does it say when a legal blockchain eBook has 1.7M views? — from legalmosaic.com by Mark A. Cohen

Excerpts (emphasis DSC):

“Blockchain For Lawyers,” a recently-released eBook by Australian legal tech company Legaler, drew 1.7M views in two weeks. What does that staggering number say about blockchain, legal technology, and the legal industry? Clearly, blockchain is a hot legal topic, along with artificial intelligence (AI), and legal tech generally.

Legal practice and delivery are each changing. New practice areas like cryptocurrency, cybersecurity, and Internet law are emerging as law struggles to keep pace with the speed of business change in the digital age. Concurrently, several staples of traditional practice–research, document review, etc.– are becoming automated and/or no longer performed by law firm associates. There is more “turnover” of practice tasks, more reliance on machines and non-licensed attorneys to mine data and provide domain expertise used by lawyers, and more collaboration than ever before. The emergence of new industries demands that lawyers not only provide legal expertise in support of new areas but also that they possess intellectual agility to master them quickly. Many practice areas law students will encounter have yet to be created. That means that all lawyers will be required to be more agile than their predecessors and engage in ongoing training.

 

 

 

What is 5G? Everything you need to know — from techradar.com by Mike Moore
The latest news, views and developments in the exciting world of 5G networks.

Excerpt:

What is 5G?
5G networks are the next generation of mobile internet connectivity, offering faster speeds and more reliable connections on smartphones and other devices than ever before.

Combining cutting-edge network technology and the very latest research, 5G should offer connections that are multitudes faster than current connections, with average download speeds of around 1Gbps expected to soon be the norm.

The networks will help power a huge rise in Internet of Things technology, providing the infrastructure needed to carry huge amounts of data, allowing for a smarter and more connected world.

With development well underway, 5G networks are expected to launch across the world by 2020, working alongside existing 3G and 4G technology to provide speedier connections that stay online no matter where you are.

So with only a matter of months to go until 5G networks are set to go live, here’s our run-down of all the latest news and updates.
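
To make the roughly 1Gbps figure above concrete, here is some back-of-the-envelope arithmetic comparing download times on a typical 4G connection and on 5G. The 2 GB file size and the 30 Mbps 4G figure are assumptions chosen purely for illustration.

```python
# Rough arithmetic on what ~1 Gbps average download speed means in
# practice versus a typical 4G connection. All numbers are illustrative.
FILE_GB = 2.0                                # e.g. a sizeable app or media bundle
speeds_mbps = {"4G (~30 Mbps)": 30, "5G (~1 Gbps)": 1000}

for label, mbps in speeds_mbps.items():
    seconds = FILE_GB * 8 * 1000 / mbps      # GB -> megabits -> seconds
    print(f"{label}: ~{seconds:.0f} s to download a {FILE_GB:.0f} GB file")
# Roughly 533 s on 4G versus 16 s on 5G: the kind of jump that changes
# what IoT, M2M and XR applications can assume about the network.
```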

 

 

From DSC:
I wonder…

  • What will Human Computer Interaction (HCI) look like when ~1Gbps average download speeds are the norm?
  • What will the Internet of Things (IoT) turn into (for better or for worse)?
  • How will Machine-to-Machine (M2M) Communications be impacted?
  • What will that kind of bandwidth mean for XR-related technologies (AR, VR, MR)?

 

 

Google, Facebook, and the Legal Mess Over Face Scanning — from finance.yahoo.com by John Jeff Roberts

Excerpt:

When must companies receive permission to use biometric data like your fingerprints or your face? The question is a hot topic in Illinois where a controversial law has ensnared tech giants Facebook and Google, potentially exposing them to billions of dollars in liability over their facial recognition tools.

The lack of specific guidance from the Supreme Court has since produced ongoing confusion over what type of privacy violations can let people seek financial damages.

 

Also see:

 

 

From DSC:
The legal and legislative areas need to close the gap between emerging technologies and the law.

What questions should we be asking about the skillsets that our current and future legislative representatives need? Do we need some of our representatives to be highly knowledgeable, technically speaking? 

What programs and other types of resources should we be offering our representatives to get up to speed on emerging technologies? Which blogs, websites, journals, e-newsletters, listservs, and/or other communication vehicles and/or resources should they have access to?

Along these lines, what about our judges? Can we offer them some of these resources as well? 

What changes do our law schools need to make to address this?

 

 

 

 

The world is changing. Here’s how companies must adapt. — from weforum.org by Joe Kaeser, President and Chief Executive Officer, Siemens AG

Excerpts (emphasis DSC):

Although we have only seen the beginning, one thing is already clear: the Fourth Industrial Revolution is the greatest transformation human civilization has ever known. As far-reaching as the previous industrial revolutions were, they never set free such enormous transformative power.

The Fourth Industrial Revolution is transforming practically every human activity...its scope, speed and reach are unprecedented.

Enormous power (Insert from DSC: What I was trying to get at here) entails enormous risk. Yes, the stakes are high. 

 

“And make no mistake about it: we are now writing the code that will shape our collective future.”

Joe Kaeser, CEO of Siemens AG

 

 

Contrary to Milton Friedman’s maxim, the business of business should not just be business. Shareholder value alone should not be the yardstick. Instead, we should make stakeholder value, or better yet, social value, the benchmark for a company’s performance.

Today, stakeholders…rightfully expect companies to assume greater social responsibility, for example, by protecting the climate, fighting for social justice, aiding refugees, and training and educating workers. The business of business should be to create value for society.

This seamless integration of the virtual and the physical worlds in so-called cyber-physical systems – that is the giant leap we see today. It eclipses everything that has happened in industry so far. As in previous industrial revolutions but on a much larger scale, the Fourth Industrial Revolution will eliminate millions of jobs and create millions of new jobs.

 

“…because the Fourth Industrial Revolution runs on knowledge, we need a concurrent revolution in training and education.

If the workforce doesn’t keep up with advances in knowledge throughout their lives, how will the millions of new jobs be filled?” 

Joe Kaeser, President and Chief Executive Officer, Siemens AG

 

 


From DSC:
At least three critically important things jump out at me here:

  1. We are quickly approaching a time when people will need to be able to reinvent themselves quickly and cost-effectively, especially those with families and who are working in their (still existing) jobs. (Or have we already entered this period of time…?)
  2. There is a need to help people identify which jobs are safer to reinvent themselves toward, i.e., which roles are likely to remain viable for at least the next 5-10 years.
  3. Citizens across the globe — and their relevant legislatures, governments, and law schools — need to help close the gap between emerging technologies and whether those technologies should even be rolled out, and if so, how and with which features.

 


 

What freedoms and rights should individuals have in the digital age?

Joe Kaeser, President and Chief Executive Officer, Siemens AG

 

 

Facial recognition has to be regulated to protect the public, says AI report — from technologyreview.com by Will Knight
The research institute AI Now has identified facial recognition as a key challenge for society and policymakers—but is it too late?

Excerpt (emphasis DSC):

Artificial intelligence has made major strides in the past few years, but those rapid advances are now raising some big ethical conundrums.

Chief among them is the way machine learning can identify people’s faces in photos and video footage with great accuracy. This might let you unlock your phone with a smile, but it also means that governments and big corporations have been given a powerful new surveillance tool.

A new report from the AI Now Institute (large PDF), an influential research institute based in New York, has just identified facial recognition as a key challenge for society and policymakers.

 

Also see:

EXECUTIVE SUMMARY
At the core of the cascading scandals around AI in 2018 are questions of accountability: who is responsible when AI systems harm us? How do we understand these harms, and how do we remedy them? Where are the points of intervention, and what additional research and regulation is needed to ensure those interventions are effective? Currently there are few answers to these questions, and the frameworks presently governing AI are not capable of ensuring accountability. As the pervasiveness, complexity, and scale of these systems grow, the lack of meaningful accountability and oversight – including basic safeguards of responsibility, liability, and due process – is an increasingly urgent concern.

Building on our 2016 and 2017 reports, the AI Now 2018 Report contends with this central
problem and addresses the following key issues:

  1. The growing accountability gap in AI, which favors those who create and deploy these
    technologies at the expense of those most affected
  2. The use of AI to maximize and amplify surveillance, especially in conjunction with facial
    and affect recognition, increasing the potential for centralized control and oppression
  3. Increasing government use of automated decision systems that directly impact individuals and communities without established accountability structures
  4. Unregulated and unmonitored forms of AI experimentation on human populations
  5. The limits of technological solutions to problems of fairness, bias, and discrimination

Within each topic, we identify emerging challenges and new research, and provide recommendations regarding AI development, deployment, and regulation. We offer practical pathways informed by research so that policymakers, the public, and technologists can better understand and mitigate risks. Given that the AI Now Institute’s location and regional expertise is concentrated in the U.S., this report will focus primarily on the U.S. context, which is also where several of the world’s largest AI companies are based.

 

 

From DSC:
As I said in this posting, we need to be aware of the emerging technologies around us. Just because we can, doesn’t mean we should. People need to be aware of — and involved with — which emerging technologies get rolled out (or not) and/or which features are beneficial to roll out (or not).

One of the things that’s beginning to alarm me these days is how the United States has turned over the keys to the Maserati — i.e., think an expensive, powerful thing — to youth who lack the life experiences to know how to handle such power and, often, the proper respect for such power. Many of these youthful members of our society don’t own the responsibility for the positive and negative influences and impacts that such powerful technologies can have (and the more senior execs have not taken enough responsibility either)!

If you owned the car below, would you turn the keys of this ~$137,000+ car over to your 16-25 year old? Yet that’s what America has been doing for years. And, in some areas, we’re now paying the price.

 

If you owned this $137,000+ car, would you turn the keys of it over to your 16-25 year old?!

 

The corporate world continues to discard the hard-earned experience that age brings…as they shove older people out of the workforce. (I hesitate to use the word wisdom…but in some cases, that’s also relevant/involved here.) Then we, as a society, sit back and wonder how did we get to this place?

Even technologists and programmers in their 20’s and 30’s are beginning to step back and ask…WHY did we develop this application or that feature? Was it — is it — good for society? Is it beneficial? Or should it be tabled or revised into something else?

Below is but one example — though I don’t mean to pick on Microsoft, as they likely have more older workers than the Facebooks, Googles, or Amazons of the world. I fully realize that all of these companies have some older employees. But the youth-oriented culture in America today has almost become an obsession — and not just in the tech world. Turn on the TV, check out the new releases on Netflix, go see a movie in a theater, listen to the radio, cast but a glance at the magazines in the checkout lines, etc., and you’ll instantly know what I mean.

In the workplace, there appears to be a bias against older employees as being less innovative or tech-savvy — such a perspective is often completely incorrect. Go check out LinkedIn for items re: age discrimination…it’s a very real thing. But many of us over the age of 30 know this to be true if we’ve lost a job in the last decade or two and have tried to get a job that involves technology.

 

Microsoft argues facial-recognition tech could violate your rights — from finance.yahoo.com by Rob Pegoraro

Excerpt (emphasis DSC):

On Thursday, the American Civil Liberties Union provided a good reason for us to think carefully about the evolution of facial-recognition technology. In a study, the group used Amazon’s (AMZN) Rekognition service to compare portraits of members of Congress to 25,000 arrest mugshots. The result: 28 members were mistakenly matched with 28 suspects.

The ACLU isn’t the only group raising the alarm about the technology. Earlier this month, Microsoft (MSFT) president Brad Smith posted an unusual plea on the company’s blog asking that the development of facial-recognition systems not be left up to tech companies.

Saying that the tech “raises issues that go to the heart of fundamental human rights protections like privacy and freedom of expression,” Smith called for “a government initiative to regulate the proper use of facial recognition technology, informed first by a bipartisan and expert commission.”

But we may not get new laws anytime soon.
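
For readers curious about the mechanics behind the ACLU test described above: comparing two face images with Amazon Rekognition comes down to a single compare_faces call whose results depend heavily on the similarity threshold passed in. The sketch below is not the ACLU's code; the file names are placeholders, and it assumes AWS credentials are already configured for boto3.

```python
# Sketch of a face comparison with Amazon Rekognition via boto3.
# The key point for the policy debate: what counts as a "match"
# depends on SimilarityThreshold. The ACLU reportedly ran its test at
# the default of 80, while AWS has said law-enforcement use should
# apply a far stricter threshold such as 99.
import boto3


def face_matches(source_path: str, target_path: str, threshold: float = 80.0):
    """Return the similarity scores of faces matched above `threshold`."""
    client = boto3.client("rekognition")
    with open(source_path, "rb") as src, open(target_path, "rb") as tgt:
        response = client.compare_faces(
            SourceImage={"Bytes": src.read()},
            TargetImage={"Bytes": tgt.read()},
            SimilarityThreshold=threshold,
        )
    return [match["Similarity"] for match in response["FaceMatches"]]


# Example (placeholder file names): the same pair of images can "match"
# at threshold=80 and return no matches at threshold=99.
# print(face_matches("portrait.jpg", "mugshot.jpg", threshold=99.0))
```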

 

just because we can does not mean we should

 

Just because we can…

 


 

 

Global IoT technology market to reach $318 billion by 2023, says GlobalData — from which-50.com

Excerpt:

The global market for Internet of Things (IoT) technology, which consists of software, services, connectivity, and devices, reached $130 billion in 2018, and is projected to reach $318 billion by 2023, at a compound annual growth rate (CAGR) of 20 per cent, according to GlobalData.

GlobalData forecasts show that solutions for government, utilities and manufacturing dominate the market, with a total of 58 per cent of the opportunity in 2018 and a slightly smaller 55 per cent of the market in 2023, as others such as travel and leisure and retail grow their respective shares. Energy and transportation are other major verticals, with a combined 15 per cent of the market in both 2018 and 2023.
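
As a quick sanity check on the figures quoted above, growing from $130 billion in 2018 to $318 billion in 2023 does indeed work out to roughly a 20 per cent compound annual growth rate over the five-year span:

```python
# Implied CAGR from the GlobalData figures: $130B (2018) -> $318B (2023).
start, end, years = 130e9, 318e9, 5
cagr = (end / start) ** (1 / years) - 1
print(f"CAGR ≈ {cagr:.1%}")  # ≈ 19.6%, i.e. roughly the 20% cited
```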

 

Also see:

  • As digital technology pervades the utility industry so too does the risk of cyber attacks — from which-50.com by Joseph Brookes
    Excerpt:
    Smart meters and IoT have the potential to optimise performance and maintenance of the billions of dollars worth of infrastructure in Australian utilities. But each new device creates a potential access point to systems that are not designed with cyber security in mind and, in some cases, are already exposed.
 
