Can you make AI fairer than a judge? Play our courtroom algorithm game — from technologyreview.com by Karen Hao and Jonathan Stray
The US criminal legal system uses predictive algorithms to try to make the judicial process less biased. But there’s a deeper problem.

Excerpt:

As a child, you develop a sense of what “fairness” means. It’s a concept that you learn early on as you come to terms with the world around you. Something either feels fair or it doesn’t.

But increasingly, algorithms have begun to arbitrate fairness for us. They decide who sees housing ads, who gets hired or fired, and even who gets sent to jail. Consequently, the people who create them—software engineers—are being asked to articulate what it means to be fair in their code. This is why regulators around the world are now grappling with a question: How can you mathematically quantify fairness? 

This story attempts to offer an answer. And to do so, we need your help. We’re going to walk through a real algorithm, one used to decide who gets sent to jail, and ask you to tweak its various parameters to make its outcomes more fair. (Don’t worry—this won’t involve looking at code!)

The algorithm we’re examining is known as COMPAS, and it’s one of several different “risk assessment” tools used in the US criminal legal system.

 

But whether algorithms should be used to arbitrate fairness in the first place is a complicated question. Machine-learning algorithms are trained on “data produced through histories of exclusion and discrimination,” writes Ruha Benjamin, an associate professor at Princeton University, in her book Race After Technology. Risk assessment tools are no different. The greater question about using them—or any algorithms used to rank people—is whether they reduce existing inequities or make them worse.
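To make the tension concrete, here is a toy sketch of how a single risk threshold can satisfy one fairness measure while violating another. The groups, scores, and outcomes below are invented for illustration; this is not the COMPAS model or its data.

```python
# Toy illustration of competing fairness definitions in risk scoring.
# Scores, labels, and the threshold are hypothetical -- NOT COMPAS.

def rates(scores, reoffended, threshold):
    """Return (flagged_rate, false_positive_rate) for one group."""
    flagged = [s >= threshold for s in scores]
    flagged_rate = sum(flagged) / len(scores)
    # False positives: people flagged "high risk" who did not reoffend.
    negatives = [f for f, r in zip(flagged, reoffended) if not r]
    false_positive_rate = sum(negatives) / max(len(negatives), 1)
    return flagged_rate, false_positive_rate

# Two hypothetical groups with different base rates of reoffense.
group_a_scores = [2, 3, 4, 6, 7, 9]
group_a_labels = [0, 0, 0, 0, 1, 1]   # 1 = reoffended
group_b_scores = [2, 4, 6, 7, 8, 9]
group_b_labels = [0, 0, 1, 1, 1, 1]

threshold = 6  # scores at or above count as "high risk"
print(rates(group_a_scores, group_a_labels, threshold))  # (0.5, 0.25)
print(rates(group_b_scores, group_b_labels, threshold))  # (0.667..., 0.0)
```

With these invented numbers, the two groups differ on both measures at once, and it is a well-known result that when base rates differ, no single threshold can equalize every fairness measure simultaneously. That impossibility is the trade-off the interactive story asks you to wrestle with.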

 

You can also see these changes in the following articles:

 

 
 

Per Jane Hart on LinkedIn:

Top 200 Tools for Learning 2019 is now published, together with:

PLUS analysis of how these tools are being used in different contexts, new graphics, and updated comments on the tools’ pages that show how people are using the tools.

 

 

 

The Global Landscape of Online Program Companies — by Doug Lederman
New trove of data suggests a bigger, more complex, more varied ecosystem of companies that work with colleges to take their academic programs online.

Excerpt:

A new dataset promises to give college leaders, company officials and others involved in the online learning landscape much more information about who offers what programs, how they manage them and where the money is flowing, among other factors.

And the company behind the new data, Holon IQ, published a report today that gives a new name to the large and diversifying category of providers that are working with colleges to take their programs online: OPX, instead of OPM, for online program management companies. (More on that later.)

Also see:


  • Multi-Faculty Collaboration to Design Online General Studies Courses — from facultyfocus.com by B. Jean Mandernach
    Excerpt:
    While this type of autonomy in course design makes sense for the face-to-face classroom, it may be less practical–and less effective–in the context of online education. Simply put, development of a high-quality online course takes considerable time and advanced knowledge of online pedagogy. If multiple faculty members are teaching the same course online (as is often the case with general studies or other high-demand courses), it is not an efficient use of departmental time, resources, or budget to have multiple faculty developing their own online classroom for different sections of the same course.
 

What to expect at IFA 2019, Europe’s colossal tech show — from digitaltrends.com by Josh Levenson

Excerpt:

This week, the world’s leading manufacturers will take to the stage at IFA 2019 in Berlin, Germany, to showcase their latest innovations. Here’s what you need to know about this year’s show, including when it’s set to start, how long it will run for, where it’s held, the schedule, and all the devices we’re expecting to see unveiled by the likes of LG, Sony, Samsung, and more.

 

Amazon, Microsoft, ‘putting world at risk of killer AI’: study — from news.yahoo.com by Issam Ahmed

Excerpt:

Washington (AFP) – Amazon, Microsoft and Intel are among leading tech companies putting the world at risk through killer robot development, according to a report that surveyed major players from the sector about their stance on lethal autonomous weapons.

Dutch NGO Pax ranked 50 companies by three criteria: whether they were developing technology that could be relevant to deadly AI, whether they were working on related military projects, and if they had committed to abstaining from contributing in the future.

“Why are companies like Microsoft and Amazon not denying that they’re currently developing these highly controversial weapons, which could decide to kill people without direct human involvement?” said Frank Slijper, lead author of the report published this week.

Addendum on 8/23/19:

 

Take a tour of Google Earth with speakers of 50 different indigenous languages — from fastcompany.com by Melissa Locker

Excerpt:

After the United Nations declared 2019 the International Year of Indigenous Languages, Google decided to help draw attention to the indigenous languages spoken around the globe and perhaps help preserve some of the endangered ones too. To that end, the company recently launched its first audio-driven collection, a new Google Earth tour complete with audio recordings from more than 50 indigenous language speakers from around the world.

 

 
 

45% of ORs will be integrated with artificial intelligence by 2022 — from healthitanalytics.com by Jessica Kent
Operating rooms will become infused with artificial intelligence in the coming years, with interoperability and partnerships fueling growth.

Excerpt:

Thirty-five percent to 45 percent of operating rooms (ORs) in the US and beyond will become integrated with artificial intelligence and virtual reality technologies by 2022, according to a recent Frost & Sullivan analysis.

AI, virtual reality, and other advanced tools will enable ORs to use intelligent and efficient delivery options to improve care precision. Robotic-assisted surgery devices (RASDs) will play a key role in driving the $4.5 billion US and European hospital and OR products and solutions market to $7.04 billion by 2022, the analysis said.

 

From DSC:
I just ran across this recently…what do you think of it?!

 

 

From DSC:
For me, this is extremely disturbing. And if I were a betting man, I’d wager that numerous nations/governments around the world — most certainly that includes the U.S. — have been developing new weapons of warfare for years that are based on artificial intelligence, robotics, automation, etc.

The question is, now what do we do?

Some very hard questions that numerous engineers and programmers need to be asking themselves these days…

By the way, the background audio on the clip above should either be non-existent or far more ominous — this stuff is NOT a joke.

Also see this recent posting. >>

 

Addendum on 6/26/19:

 

Experts in machine learning and military technology say it would be technologically straightforward to build robots that make decisions about whom to target and kill without a “human in the loop” — that is, with no person involved at any point between identifying a target and killing them. And as facial recognition and decision-making algorithms become more powerful, it will only get easier.

 

 

Blockchain: The move from freedom to the rigid, dominant system in learning — from oeb.global by Inge de Waard
In this post Inge de Waard gives an overview of current Blockchain options from industry and looks at its impact on universities as well as philosophises on its future.

Excerpt:

I mentioned a couple of Blockchain certification options already, but an even more advanced example of blockchain in learning has appeared on my radar too. It is a Russian implementation called Disciplina. This platform combines education (including vocational training), recruiting (comparable with what LinkedIn is doing with its economic graph), and careers for professionals. All of this is combined into a blockchain solution that tracks each learner’s journey. The platform includes not only online courses as we know them but also coaching. After each training, you get a certificate.

TeachMePlease, which is a partner of Disciplina, enables teachers and students to find each other for specific professional training as well as curriculum-related children’s schooling. Admittedly, these initiatives are still being rolled out in terms of courses, but they clearly show where the next layer of learning will sit: in an umbrella above the universities and professional academies. At present, university courses are being embedded into course offerings by corporations that roll out a layer post-university, or post-vocational schooling.

Europe embraces blockchain, as can be seen with their EU Blockchain observatory and forum. And in a more national action, Malta is storing their certifications in a blockchain nationwide as well. We cannot deny that blockchain is getting picked up by both companies and governments. Universities have been piloting several blockchain certification options, and they also harbour some of the leading voices in the debate on blockchain certification.
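The core idea behind blockchain certification can be sketched in a few lines: each certificate record stores a hash of the record before it, so any tampering with an earlier credential breaks the whole chain. This is a toy illustration with invented record fields, not Disciplina’s actual protocol.

```python
# Minimal hash-chain sketch of blockchain-style certificate anchoring.
# Record fields are invented for illustration.
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first record

def add_certificate(chain, cert):
    """Append a certificate, linking it to the previous record's hash."""
    prev_hash = chain[-1]["hash"] if chain else GENESIS
    payload = {"cert": cert, "prev": prev_hash}
    digest = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()
    ).hexdigest()
    chain.append({"cert": cert, "prev": prev_hash, "hash": digest})
    return chain

def verify(chain):
    """Recompute every hash and link; any edit makes this return False."""
    for i, record in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else GENESIS
        payload = {"cert": record["cert"], "prev": record["prev"]}
        digest = hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode()
        ).hexdigest()
        if record["prev"] != expected_prev or record["hash"] != digest:
            return False
    return True

chain = []
add_certificate(chain, {"student": "A. Learner", "course": "Intro to AI"})
add_certificate(chain, {"student": "A. Learner", "course": "Coaching Module"})
print(verify(chain))                        # True
chain[0]["cert"]["course"] = "Forged Course"
print(verify(chain))                        # False -- tampering detected
```

A real deployment (national registries like Malta’s, or platforms like Disciplina) adds distributed consensus, signatures, and revocation on top, but the tamper-evidence property shown here is the part that makes the certificates trustworthy.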

 

Also see:

AI in education — April 2019 by Inge de Waard

Future proof learning — the Skills 3.0 project

 

Also see:

  • 7 blockchain mistakes and how to avoid them — from computerworld.com by Lucas Mearian
    The blockchain industry is still something of a wild west, with many cloud service offerings and a large universe of platforms that can vary greatly in their capabilities. So enterprises should beware jumping to conclusions about the technology.
 
 

10 things we should all demand from Big Tech right now — from vox.com by Sigal Samuel
We need an algorithmic bill of rights. AI experts helped us write one.


Excerpts:

  1. Transparency: We have the right to know when an algorithm is making a decision about us, which factors are being considered by the algorithm, and how those factors are being weighted.
  2. Explanation: We have the right to be given explanations about how algorithms affect us in a specific situation, and these explanations should be clear enough that the average person will be able to understand them.
  3. Consent: We have the right to give or refuse consent for any AI application that has a material impact on our lives or uses sensitive data, such as biometric data.
  4. Freedom from bias: We have the right to evidence showing that algorithms have been tested for bias related to race, gender, and other protected characteristics — before they’re rolled out. The algorithms must meet standards of fairness and nondiscrimination and ensure just outcomes. (Inserted comment from DSC: Is this even possible? I hope so, but I have my doubts especially given the enormous lack of diversity within the large tech companies.)
  5. Feedback mechanism: We have the right to exert some degree of control over the way algorithms work.
  6. Portability: We have the right to easily transfer all our data from one provider to another.
  7. Redress: We have the right to seek redress if we believe an algorithmic system has unfairly penalized or harmed us.
  8. Algorithmic literacy: We have the right to free educational resources about algorithmic systems.
  9. Independent oversight: We have the right to expect that an independent oversight body will be appointed to conduct retrospective reviews of algorithmic systems gone wrong. The results of these investigations should be made public.
  10. Federal and global governance: We have the right to robust federal and global governance structures with human rights at their center. Algorithmic systems don’t stop at national borders, and they are increasingly used to decide who gets to cross borders, making international governance crucial.
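To see what rights 1 and 2 might look like in practice, here is a hypothetical sketch of a decision system that reports which factors it considered and how each was weighted for a specific person. The factor names, weights, and threshold are all invented for illustration.

```python
# Hypothetical transparent scorer: every decision carries its own
# per-factor explanation. Factors and weights are invented.

WEIGHTS = {"income": 0.5, "payment_history": 0.4, "account_age": 0.1}

def decide(applicant, threshold=0.6):
    """Return a decision plus the factor-by-factor breakdown behind it."""
    contributions = {
        factor: weight * applicant[factor]
        for factor, weight in WEIGHTS.items()
    }
    score = sum(contributions.values())
    return {
        "approved": score >= threshold,
        "score": round(score, 3),
        "explanation": contributions,  # which factors, weighted how (rights 1-2)
    }

result = decide({"income": 0.8, "payment_history": 0.5, "account_age": 0.9})
print(result)
```

A linear model like this is easy to explain; the hard cases in the article are deep models whose "factors" are not human-readable, which is exactly why the transparency and explanation rights are contested.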

 

This raises the question: Who should be tasked with enforcing these norms? Government regulators? The tech companies themselves?

 

 

To attract talent, corporations turn to MOOCs — from edsurge.com by Wade Tyler Millward

Excerpt:

When executives at tech giants Salesforce and Microsoft decided in fall 2017 to turn to an online education platform to help train potential users of their products, they turned to Pierre Dubuc and his team.

Two years later, Dubuc’s company, OpenClassrooms, has closed deals with both of them. Salesforce has worked with OpenClassrooms to create and offer a developer-training course to help people learn how to use the Salesforce platform. In a similar vein, Microsoft will use the OpenClassrooms platform for a six-month course in artificial intelligence. If students complete the AI program, they are guaranteed a job within six months or get their money back. They also earn a master’s-level diploma accredited in Europe.

 

 

Introduction: Leading the social enterprise—Reinvent with a human focus
2019 Global Human Capital Trends
— from deloitte.com by Volini, Schwartz, Roy, Hauptmann, Van Durme, Denny, and Bersin

Excerpt (emphasis DSC):

Learning in the flow of life. The number-one trend for 2019 is the need for organizations to change the way people learn; 86 percent of respondents cited this as an important or very important issue. It’s not hard to understand why. Evolving work demands and skills requirements are creating an enormous demand for new skills and capabilities, while a tight labor market is making it challenging for organizations to hire people from outside. Within this context, we see three broader trends in how learning is evolving: It is becoming more integrated with work; it is becoming more personal; and it is shifting—slowly—toward lifelong models. Effective reinvention along these lines requires a culture that supports continuous learning, incentives that motivate people to take advantage of learning opportunities, and a focus on helping individuals identify and develop new, needed skills.

 


© 2019 | Daniel Christian