Everyday Media Literacy — from routledge.com by Sue Ellen Christian
An Analog Guide for Your Digital Life, 1st Edition

Description:

In this graphic guide to media literacy, award-winning educator Sue Ellen Christian offers students an accessible, informed and lively look at how they can consume and create media intentionally and critically.

The straight-talking textbook offers timely examples and relevant activities to equip students with the skills and knowledge they need to assess all media, including news and information. Through discussion prompts, writing exercises, key terms, online links and even origami, readers are provided with a framework from which to critically consume and create media in their everyday lives. Chapters examine news literacy, online activism, digital inequality, privacy, social media and identity, global media corporations and beyond, giving readers a nuanced understanding of the key concepts and concerns at the core of media literacy.

Concise, creative and curated, this book highlights the cultural, political and economic dynamics of media in our contemporary society, and how consumers can mindfully navigate their daily media use. Everyday Media Literacy is perfect for students (and educators) of media literacy, journalism, education and media effects looking to build their understanding in an engaging way.

 
 

Three threats posed by deepfakes that technology won’t solve — from technologyreview.com by Angela Chen
As deepfakes get better, companies are rushing to develop technology to detect them. But little of their potential harm will be fixed without social and legal solutions.

Excerpt:

3) Problem: Deepfake detection is too late to help victims
With deepfakes, “there’s little real recourse after that video or audio is out,” says Franks, the University of Miami scholar.

Existing laws are inadequate. Laws that punish sharing legitimate private information like medical records don’t apply to false but damaging videos. Laws against impersonation are “oddly limited,” Franks says—they focus on making it illegal to impersonate a doctor or government official. Defamation laws only address false representations that portray the subject negatively, but Franks says we should be worried about deepfakes that falsely portray people in a positive light too.

 

Incredible Pictures from Agora’s Contest

 

Per Jane Hart on LinkedIn:

Top 200 Tools for Learning 2019 is now published, together with an analysis of how these tools are being used in different contexts, new graphics, and updated comments on the tools’ pages that show how people are using the tools.

 

 


Watch Salvador Dalí Return to Life Through AI — from interestingengineering.com
The Dalí Museum has created a deepfake of surrealist artist Salvador Dalí that brings him back to life.

Excerpt:

The Dalí Museum has created a deepfake of surrealist artist Salvador Dalí that brings him back to life. This life-size deepfake is set up to have interactive discussions with visitors.

The deepfake can produce 45 minutes of content and 190,512 possible combinations of phrases and decisions taken by the fake but realistic Dalí. The exhibition was created by Goodby, Silverstein & Partners using 6,000 frames of Dalí taken from historic footage and 1,000 hours of machine learning.

 

From DSC:
On the one hand, incredible work! Fantastic job! On the other hand, if this type of deepfake can be done, how can any video be trusted from here on out? What technology or app will be able to confirm that a video really shows that person, actually saying those words?

Will we get to the point of saying, “This is so-and-so, and I approved this video”? Or will we have an electronic signature? Will a blockchain-based technology be used? I don’t know…there always seem to be pros and cons to any given technology. It’s how we use it. It can be a dream, or it can be a nightmare.
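For what it’s worth, the “electronic signature” idea already exists in a basic form for files in general. Below is a minimal sketch, assuming Python’s cryptography package (the file name is hypothetical), of a creator signing a hash of a video so that anyone holding the published public key can tell whether the file was altered:

```python
# A minimal sketch of signing a video file's hash, not any particular product.
# Assumes the `cryptography` package is installed; the file name is hypothetical.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey


def file_digest(path: str) -> bytes:
    """Hash the video file in chunks so large files do not need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.digest()


# The creator generates a key pair and signs the digest of their video.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()
signature = private_key.sign(file_digest("statement.mp4"))

# A viewer who trusts the published public key can verify the same file later.
try:
    public_key.verify(signature, file_digest("statement.mp4"))
    print("Signature valid: the file matches what the creator signed.")
except InvalidSignature:
    print("Signature invalid: the file was altered or signed by a different key.")
```

Of course, this only proves the file matches what the key holder signed; it doesn’t prove the content is truthful, which is why the social and legal questions above remain.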

 

 

The Common Sense Census: Inside the 21st-Century Classroom

21st century classroom - excerpt from infographic

Excerpt:

Technology has become an integral part of classroom learning, and students of all ages have access to digital media and devices at school. The Common Sense Census: Inside the 21st-Century Classroom explores how K–12 educators have adapted to these critical shifts in schools and society. From the benefits of teaching lifelong digital citizenship skills to the challenges of preparing students to critically evaluate online information, educators across the country share their perspectives on what it’s like to teach in today’s fast-changing digital world.

 

 

Exquisite underwater photos to make you love the ocean — from wired.com by Laura Mallonee

 

Job 12:7-10

7 “But ask the animals, and they will teach you,
    or the birds in the sky, and they will tell you;
or speak to the earth, and it will teach you,
    or let the fish in the sea inform you.
Which of all these does not know
    that the hand of the Lord has done this?
10 In his hand is the life of every creature
    and the breath of all mankind.”

 

Glory to GOD in the highest!

 

Facial recognition has to be regulated to protect the public, says AI report — from technologyreview.com by Will Knight
The research institute AI Now has identified facial recognition as a key challenge for society and policymakers—but is it too late?

Excerpt (emphasis DSC):

Artificial intelligence has made major strides in the past few years, but those rapid advances are now raising some big ethical conundrums.

Chief among them is the way machine learning can identify people’s faces in photos and video footage with great accuracy. This might let you unlock your phone with a smile, but it also means that governments and big corporations have been given a powerful new surveillance tool.

A new report from the AI Now Institute (large PDF), an influential research institute based in New York, has just identified facial recognition as a key challenge for society and policymakers.

 

Also see:

EXECUTIVE SUMMARY
At the core of the cascading scandals around AI in 2018 are questions of accountability: who is responsible when AI systems harm us? How do we understand these harms, and how do we remedy them? Where are the points of intervention, and what additional research and regulation is needed to ensure those interventions are effective? Currently there are few answers to these questions, and the frameworks presently governing AI are not capable of ensuring accountability. As the pervasiveness, complexity, and scale of these systems grow, the lack of meaningful accountability and oversight – including basic safeguards of responsibility, liability, and due process – is an increasingly urgent concern.

Building on our 2016 and 2017 reports, the AI Now 2018 Report contends with this central problem and addresses the following key issues:

  1. The growing accountability gap in AI, which favors those who create and deploy these technologies at the expense of those most affected
  2. The use of AI to maximize and amplify surveillance, especially in conjunction with facial and affect recognition, increasing the potential for centralized control and oppression
  3. Increasing government use of automated decision systems that directly impact individuals and communities without established accountability structures
  4. Unregulated and unmonitored forms of AI experimentation on human populations
  5. The limits of technological solutions to problems of fairness, bias, and discrimination

Within each topic, we identify emerging challenges and new research, and provide recommendations regarding AI development, deployment, and regulation. We offer practical pathways informed by research so that policymakers, the public, and technologists can better understand and mitigate risks. Given that the AI Now Institute’s location and regional expertise is concentrated in the U.S., this report will focus primarily on the U.S. context, which is also where several of the world’s largest AI companies are based.

 

 

From DSC:
As I said in this posting, we need to be aware of the emerging technologies around us. Just because we can doesn’t mean we should. People need to be aware of — and involved with — decisions about which emerging technologies get rolled out (or not) and which features are beneficial to roll out (or not).

One of the things that’s beginning to alarm me these days is how the United States has turned over the keys to the Maserati — an expensive, powerful machine — to youth who lack the life experience to know how to handle such power and, often, the proper respect for it. Many of these youthful members of our society don’t own responsibility for the positive and negative impacts that such powerful technologies can have (and the more senior execs haven’t taken enough responsibility either)!

If you owned the car below, would you turn the keys to this $137,000+ car over to your 16- to 25-year-old? Yet that’s what America has been doing for years. And, in some areas, we’re now paying the price.

 

If you owned this $137,000+ car, would you turn the keys over to your 16- to 25-year-old?!

 

The corporate world continues to discard the hard-earned experience that age brings…as it shoves older people out of the workforce. (I hesitate to use the word wisdom…but in some cases, that’s also relevant here.) Then we, as a society, sit back and wonder how we got to this place.

Even technologists and programmers in their 20s and 30s are beginning to step back and ask…WHY did we develop this application or that feature? Was it — is it — good for society? Is it beneficial? Or should it be tabled or revised into something else?

Below is but one example — though I don’t mean to pick on Microsoft, as they likely have more older workers than the Facebooks, Googles, or Amazons of the world. I fully realize that all of these companies have some older employees. But the youth-oriented culture in America today has almost become an obsession — and not just in the tech world. Turn on the TV, check out the new releases on Netflix, go see a movie in a theater, listen to the radio, cast but a glance at the magazines in the checkout lines, etc., and you’ll instantly know what I mean.

In the workplace, there appears to be a bias against older employees as being less innovative or tech-savvy — a perspective that is often completely incorrect. Go check out LinkedIn for items re: age discrimination…it’s a very real thing. Many of us over the age of 30 know this to be true if we’ve lost a job in the last decade or two and have tried to get a new one that involves technology.

 

Microsoft argues facial-recognition tech could violate your rights — from finance.yahoo.com by Rob Pegoraro

Excerpt (emphasis DSC):

On Thursday, the American Civil Liberties Union provided a good reason for us to think carefully about the evolution of facial-recognition technology. In a study, the group used Amazon’s (AMZN) Rekognition service to compare portraits of members of Congress to 25,000 arrest mugshots. The result: 28 members were mistakenly matched with 28 suspects.

The ACLU isn’t the only group raising the alarm about the technology. Earlier this month, Microsoft (MSFT) president Brad Smith posted an unusual plea on the company’s blog asking that the development of facial-recognition systems not be left up to tech companies.

Saying that the tech “raises issues that go to the heart of fundamental human rights protections like privacy and freedom of expression,” Smith called for “a government initiative to regulate the proper use of facial recognition technology, informed first by a bipartisan and expert commission.”

But we may not get new laws anytime soon.
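As a concrete illustration of the kind of test the ACLU ran, here is a minimal sketch (not the ACLU’s actual code) of matching faces against a reference set with Amazon Rekognition via boto3. The collection name and file paths are hypothetical, and the 80 percent similarity threshold shown is Rekognition’s default setting:

```python
# A minimal sketch of a Rekognition face-matching test (not the ACLU's code).
# Collection name and file paths are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition")
COLLECTION_ID = "mugshot-collection"  # hypothetical collection name

# One-time setup: index the reference (mugshot) photos into a collection.
rekognition.create_collection(CollectionId=COLLECTION_ID)
for path in ["mugshots/0001.jpg", "mugshots/0002.jpg"]:  # the study used 25,000
    with open(path, "rb") as f:
        rekognition.index_faces(
            CollectionId=COLLECTION_ID,
            Image={"Bytes": f.read()},
            ExternalImageId=path.split("/")[-1],
        )

# Search: compare one portrait against the whole collection.
with open("portraits/member_of_congress.jpg", "rb") as f:
    response = rekognition.search_faces_by_image(
        CollectionId=COLLECTION_ID,
        Image={"Bytes": f.read()},
        FaceMatchThreshold=80,  # Rekognition's default confidence cutoff
        MaxFaces=5,
    )

# Print any claimed matches and their similarity scores.
for match in response["FaceMatches"]:
    print(match["Face"]["ExternalImageId"], round(match["Similarity"], 1))
```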

 

just because we can does not mean we should

 

Just because we can…

 


NEW: The Top Tools for Learning 2018 [Jane Hart]

The Top Tools for Learning 2018 from the 12th Annual Digital Learning Tools Survey -- by Jane Hart

 

The above was from Jane’s posting 10 Trends for Digital Learning in 2018 — from modernworkplacelearning.com by Jane Hart

Excerpt:

[On 9/24/18], I released the Top Tools for Learning 2018, which I compiled from the results of the 12th Annual Digital Learning Tools Survey.

I have also categorised the tools into 30 different areas, and produced 3 sub-lists that provide some context to how the tools are being used:

  • Top 100 Tools for Personal & Professional Learning 2018 (PPL100): the digital tools used by individuals for their own self-improvement, learning and development – both inside and outside the workplace.
  • Top 100 Tools for Workplace Learning (WPL100): the digital tools used to design, deliver, enable and/or support learning in the workplace.
  • Top 100 Tools for Education (EDU100): the digital tools used by educators and students in schools, colleges, universities, adult education etc.

 

3 – Web courses are increasing in popularity.
Although Coursera is still the most popular web course platform, there are, in fact, now 12 web course platforms on the list. New additions this year include Udacity and Highbrow (the latter provides daily micro-lessons). It is clear that people like these platforms because they can choose what they want to study as well as how they want to study, i.e. they can dip in and out if they want to and no one is going to tell them off – unlike most corporate online courses, which have a prescribed path through them and whose use is heavily monitored.

 

 

5 – Learning at work is becoming personal and continuous.
The most significant feature of the list this year is the huge leap up the list that Degreed has made – up 86 places to 47th place – the biggest increase by any tool this year. Degreed is a lifelong learning platform and provides the opportunity for individuals to own their expertise and development through a continuous learning approach. And, interestingly, Degreed appears both on the PPL100 (at 30) and the WPL100 (at 52). This suggests that some organisations are beginning to see the importance of personal, continuous learning at work. Indeed, another platform that underpins this approach has also moved up the list significantly this year. Anders Pink is a smart curation platform, available for both individuals and teams, which delivers daily curated resources on specified topics. Non-traditional learning platforms are therefore coming to the forefront, as the next point further shows.

 

 

From DSC:
Perhaps some foreshadowing of the presence of a powerful, online-based, next generation learning platform…?

 


Everything you need to know about those new iPhones — from wired.com by Arielle Pardes

Excerpt:

Actually, make that three new iPhones. Apple followed last year’s iPhone X with the iPhone Xs, iPhone Xs Max, and iPhone Xr. It also spent some time showing off the Apple Watch Series 4, its most powerful wearable yet. Missed the event? Catch our commentary on WIRED’s liveblog, or read on for everything you need to know about today’s big Apple event.

 

 

Apple’s latest iPhones are packed with AI smarts — from wired.com by Tom Simonite

Excerpt:

At a glance the three new iPhones unveiled next to Apple’s glassy circular headquarters Wednesday look much like last year’s iPhone X. Inside, the devices’ computational guts got an invisible but more significant upgrade.

Apple’s phones come with new chip technology with a focus on helping the devices understand the world around them using artificial intelligence algorithms. The company says the improvements allow the new devices to offer slicker camera effects and augmented reality experiences.

For the first time, non-Apple developers will be allowed to run their own algorithms on Apple’s AI-specific hardware.
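That last point refers to Core ML, Apple’s framework for running models on-device. As a rough, present-day sketch of the conversion step (assuming Apple’s coremltools Python package and a stock, untrained PyTorch model purely for illustration, neither of which comes from the article):

```python
# A rough sketch of "running your own algorithm" on Apple's AI hardware:
# convert a model to Core ML with Apple's coremltools, then ship it in an app.
# The PyTorch model and names here are illustrative assumptions.
import coremltools as ct
import torch
import torchvision

# Any trained model could go here; an untrained MobileNetV2 keeps the sketch small.
model = torchvision.models.mobilenet_v2().eval()
example_input = torch.rand(1, 3, 224, 224)
traced = torch.jit.trace(model, example_input)

mlmodel = ct.convert(
    traced,
    inputs=[ct.TensorType(name="image", shape=example_input.shape)],
    convert_to="mlprogram",
    compute_units=ct.ComputeUnit.ALL,  # let Core ML use CPU, GPU, or Neural Engine
)
mlmodel.save("MyModel.mlpackage")  # drop into an Xcode project to run on-device
```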

 

 

Apple Watch 4 adds ECG, EKG, and more heart-monitoring capabilities — from wired.com by Lauren Goode

Excerpt:

The new Apple Watch Series 4, revealed by Apple earlier today, underscores that some of the watch’s most important features are its health and fitness-tracking functions. The new watch is one of the first over-the-counter devices in the US to offer electrocardiogram, or ECG, readings. On top of that, the Apple Watch has received FDA clearance—both for the ECG feature and another new feature that detects atrial fibrillation.

 

 


Adobe supercharges Photoshop’s content-aware fill so you have more options, fewer AI fails — from techcrunch.com by Devin Coldewey

Excerpt:

The most important difference is certainly the ability to choose which parts of the image the filling agent samples when it’s looking for stuff to put inside your lassoed area.

No need to be exact — the algorithm is smart enough to work without the handful of pixels that are casualties of overeager mousing. The improved algorithm also lets you tell the algo that it should be liberal with rotation and scaling of the elements it uses, or that it should mirror-image the content it finds to make it fit better.
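Photoshop’s algorithm is proprietary, but the general idea (fill a masked region using the pixels around it) can be sketched with OpenCV’s much simpler inpainting function. This is an illustrative stand-in, not Adobe’s technique, and the file names are assumptions:

```python
# A much simpler open-source cousin of content-aware fill: OpenCV inpainting.
# It fills a masked region from the surrounding pixels, without Photoshop's
# rotation, scaling, or mirroring options. File names here are hypothetical.
import cv2

image = cv2.imread("photo.jpg")                      # source image
mask = cv2.imread("mask.png", cv2.IMREAD_GRAYSCALE)  # white = area to fill

# Telea's fast-marching method with a 3-pixel radius; cv2.INPAINT_NS is the alternative.
result = cv2.inpaint(image, mask, 3, cv2.INPAINT_TELEA)
cv2.imwrite("filled.jpg", result)
```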

 


 

 

 

From DSC:
I wonder what Vincent van Gogh would think of this one!? Wow!!!

“No Blue Without Yellow” by Artist Maciek Janicki — from booooooom.com

Excerpt:

San Francisco-based artist and animator Maciek Janicki takes us into the world of Vincent van Gogh with his latest short film. Created in partnership with the Van Gogh Museum in Amsterdam, “No Blue Without Yellow” offers an immersive 3D tour, constructed using sampled paintings from consequential times in the renowned artist’s life. Click here for previous posts of Janicki’s work. And watch “No Blue Without Yellow” below!

 

No Blue Without Yellow from Maciek Janicki on Vimeo.

 

 

“Milk & Honey” by Photographer Thomas Jordan — from booooooom.com

 

 

 

3 Ways to Bring together Drama, Dance, and the Visual Arts — from theartofed.com by Raymond Yang
If you’re interested in bringing together dance, drama, and the visual arts, I have some insight to share.

 

 

 

Photographer Spotlight: André Terras Alexandre — from booooooom.com

 

 

 

 

How a domino master builds (and destroys) 15,000-piece creations — from wired.com by Ryan Loughlin

 

 

 

How Does Calling Students Artists Impact Growth Mindset? — from theartofed.com

Excerpt:

In short, for students to develop a growth mindset they need to understand how to fail well. They need to believe through hard work they can improve. If students are artists the moment they walk into the art room, how does that impact their growth mindset?

 

 

 

 


© 2019 | Daniel Christian