Google and Microsoft warn that AI may do dumb things — from wired.com by Tom Simonite

Excerpt:

Alphabet likes to position itself as a leader in AI research, but it was six months behind rival Microsoft in warning investors about the technology’s ethical risks. The AI disclosure in Google’s latest filing reads like a trimmed down version of much fuller language Microsoft put in its most recent annual SEC report, filed last August:

“AI algorithms may be flawed. Datasets may be insufficient or contain biased information. Inappropriate or controversial data practices by Microsoft or others could impair the acceptance of AI solutions. These deficiencies could undermine the decisions, predictions, or analysis AI applications produce, subjecting us to competitive harm, legal liability, and brand or reputational harm.”

 

Chinese company leaves Muslim-tracking facial recognition database exposed online — from zdnet.com by Catalin Cimpanu
Researcher finds one of the databases used to track Uyghur Muslim population in Xinjiang.

Excerpt:

One of the facial recognition databases that the Chinese government is using to track the Uyghur Muslim population in the Xinjiang region has been left open on the internet for months, Dutch security researcher Victor Gevers told ZDNet.

The database belongs to a Chinese company named SenseNets, which according to its website provides video-based crowd analysis and facial recognition technology.

The user data wasn’t just benign usernames, but highly detailed and highly sensitive information that someone would usually find on an ID card, Gevers said. The researcher saw user profiles with information such as names, ID card numbers, ID card issue date, ID card expiration date, sex, nationality, home addresses, dates of birth, photos, and employer.

Some of the descriptive names associated with the “trackers” contained terms such as “mosque,” “hotel,” “police station,” “internet cafe,” “restaurant,” and other places where public cameras would normally be found.

 

From DSC:
Readers of this blog will know that I’m generally pro-technology. But especially focusing in on that last article, to me, privacy is key here. For which group of people from which nation is next? Will Country A next be tracking Christians? Will Country B be tracking people of a given sexual orientation? Will Country C be tracking people with some other characteristic?

Where does it end? Who gets to decide? What will be the costs of being tracked or being a person with whatever certain characteristic one’s government is tracking? What forums are there for combating technologies or features of technologies that we don’t like or want?

We need forums/channels for raising awareness and voting on these emerging technologies. We need informed legislators, senators, lawyers, citizens…we need new laws here…asap.

 

 

 

Law firms either keep up with tech or get left behind — from abajournal.com by Gabriel Teninbaum

Excerpts:

I spend a lot of time thinking about a version of that classic interview question where applicants are asked to envision their future. But, instead of thinking about my own future, I think of the legal profession’s future. If you haven’t done it, give it a try: What will legal work look like in 15 years?

There is reason to think it'll look very different from how it does now. E-discovery software now does the work once handled by new associates. Legal process outsourcing (LPO) companies have pulled due diligence work, and much more, to offshore locations (and away from domestic midsize firms). LegalZoom—now valued at $2 billion—is drawing millions of consumers every year who choose to handle legal matters without local attorneys.

If your vision includes the idea that the biggest legal employers may someday not even be law firms, then you’re correct. It’s already happened: The largest private provider of legal services in the world today is no longer a multinational law firm. It’s Deloitte, the Big Four accounting firm. Looming super-technologies—like AI and blockchain—are somewhere on the horizon, with the potential to upend legal work in ways that some believe will be unprecedented.

 

Also see:

Students create immersive videos to enhance criminal justice courses — from news.psu.edu by Emma Gosalvez

Excerpt:

Immersive technologies such as 360-degree videos could revolutionize the future of forensic science, giving police and criminologists a tool to visualize different crime scenes and, ultimately, become better investigators. Through a Berks Teaching & Learning Innovation Partnership Grant, Penn State Berks students in the course CRIMJ 210: Policing in America are learning to create 360-degree videos of crime-scene scenarios.

These videos are viewed by their peers in CRIMJ 100: Introduction to Criminal Justice to learn about topics such as self-defense, defense of others, and defense of property.

“The project transforms student learning on two levels: It allows students to engage in creative collaboration related to a course topic, and students get to ‘experience’ the scenarios presented by the 360-degree videos created by their peers,” said Mary Ann Mengel, an instructional multimedia designer for Penn State Berks’ Center for Learning & Teaching.

 

 

The information below is from Deb Molfetta, Outreach Coordinator at EdDPrograms.org


EdDPrograms.org helps educators and administrators research doctoral education opportunities. Their organization’s work in education began in 2008 with projects ranging from a new teacher survival guide to their own teacher education scholarship program. More recently they realized that there weren’t any websites dedicated to professional development through Doctor of Education (EdD) programs, which is why they created their own: EdDPrograms.org. It covers a lot of ground, including several sections they think administrators will appreciate.

EdDPrograms.org is owned and operated by a group that has been creating post-secondary education resources since 2008. According to Deb, they have a history of providing students with objective, fact-based resources.

 

 

 

AR will spark the next big tech platform — call it Mirrorworld — from wired.com by Kevin Kelly

Excerpt:

It is already under construction. Deep in the research labs of tech companies around the world, scientists and engineers are racing to construct virtual places that overlay actual places. Crucially, these emerging digital landscapes will feel real; they’ll exhibit what landscape architects call placeness. The Street View images in Google Maps are just facades, flat images hinged together. But in the mirrorworld, a virtual building will have volume, a virtual chair will exhibit chairness, and a virtual street will have layers of textures, gaps, and intrusions that all convey a sense of “street.”

The mirrorworld—a term first popularized by Yale computer scientist David Gelernter—will reflect not just what something looks like but its context, meaning, and function. We will interact with it, manipulate it, and experience it like we do the real world.

 

Also see:
Google Maps in augmented reality points you in the right direction — from mashable.com by Sasha Lekach

 

 

Bobst launches augmented reality helpline — from proprint.com.au by Sheree Young

Excerpt:

Swiss packaging and label equipment supplier Bobst has launched a new augmented reality smart headset to help answer customer questions.

Rapid problem solving thanks to a new augmented reality helpline service introduced by Swiss packaging and label equipment supplier Bobst stands to save printers time and money, the company says.

The Helpline Plus AR innovation provides a remote assistance service to Bobst’s customers using a smart headset with augmented reality glasses. The technology is being gradually rolled out globally, Bobst says.

Customers can use the headset to contact technical experts and iron out any issues they may be having as well as receive real time advice and support.

 

 

 

 

C-Level View | Feature:
Technology Change: Closing the Knowledge Gap — A Q&A with Mary Grush & Daniel Christian

Excerpts:

Technology changes quickly. People change slowly. The rate of technology change often outpaces our ability to understand it.

This mismatch has caused a gap between what’s possible and what’s legal. For example, facial recognition is starting to show up all around us — that’s what’s possible. But what’s legal?

The overarching questions are: What do we really want from these technologies? What kind of future do we want to live in?

Those law schools that expand their understanding of emerging technologies and lead the field in the exploration of related legal issues will achieve greater national prominence.

Daniel Christian

 

 

 

For a next gen learning platform: A Netflix-like interface to check out potential functionalities / educationally-related “apps” [Christian]

From DSC:
In a next generation learning system, it would be beneficial to have a Netflix-like interface for checking out potential functionalities that you could turn on and off (at will), as one component of a learning ecosystem that could be set up in your living room or office.

For example, imagine a Netflix-like interface to the apps at eduappcenter.com (i.e., using a rolling interface at first, then moving to a static page/listing of apps, similar to Netflix).

 

A Netflix-like interface to check out potential functionalities / educationally-related apps

 

 

 

135 Million Reasons To Believe In A Blockchain Miracle — from forbes.com by Mike Maddock

Excerpts:

Which brings us to the latest headlines about a cryptocurrency entrepreneur’s passing—taking with him the passcode to unlock C$180 million (about $135 million U.S.) in investor currency—which is now reportedly gone forever. Why? Because apparently, the promise of blockchain is true: It cannot be hacked. It is absolutely trustworthy.

Gerald Cotten, the CEO of a crypto company, reportedly passed away recently while building an orphanage in India. Unfortunately, he was the only person who knew the passcode to access the millions his investors had entrusted to him.
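The lost-passcode story comes down to one-way cryptography: funds are bound to a value derived from a secret key, and the derivation cannot be run backwards. The sketch below is a toy model only — the names are invented, and a bare SHA-256 hash stands in for the real key and signature schemes cryptocurrencies actually use:

```python
import hashlib

# Toy model: an "address" is a one-way hash of the holder's private key.
private_key = "correct horse battery staple"  # known only to the holder
address = hashlib.sha256(private_key.encode()).hexdigest()

def can_spend(candidate_key: str, address: str) -> bool:
    # Spending requires producing a key that hashes to the address.
    # Knowing the address alone tells you nothing about the key.
    return hashlib.sha256(candidate_key.encode()).hexdigest() == address

print(can_spend(private_key, address))   # True: the holder can spend
print(can_spend("guess-1234", address))  # False: no key, no access
```

Because the hash cannot be inverted, no exchange, court, or heir can recover the funds without the passcode itself — the very property that makes the ledger trustworthy also makes the loss permanent.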

This is how we get the transition to Web 3.0.

Some questions to consider:

  • Who will build an easy-to-use “wallet” of the future?
  • Are we responsible enough to handle that much power?

Perhaps the most important question of all is: What role do our “trusted” experts play in this future?

 


From DSC:
I’d like to add another question to Mike’s article:

  • How should law schools, law firms, legislative bodies, government, etc. deal with the new, exponential pace of change and with the power of emerging technologies?

 


 

 

From DSC:
From Mary Grush’s recent article re: Learning Engineering, I learned that back in the late 1960s, Herbert Simon believed there would be value in providing college presidents with “learning engineers” (see his article entitled, “The Job of a College President”).

 

 

An excerpt:

What do we find in a university? Physicists well educated in physics, and trained for research in that discipline; English professors learned in their language and its literature (or at least some tiny corner of it); and so on down the list of the disciplines. But we find no one with a professional knowledge of the laws of learning, or of the techniques for applying them (unless it be a professor of educational psychology, who teaches these laws, but has no broader responsibility for their application in the college).

Notice, our topic is learning, not teaching. A college is a place where people come to learn. How much or how little teaching goes on there depends on whether teaching facilitates learning, and if so, under what circumstances. It is a measure of our naivete that we assume implicitly, in almost all our practices, that teaching is the way to produce learning, and that something called a “class” is the best environment for teaching.

But what do we really know about the learning process: about how people learn, about what they learn, and about what they can do with what they learn? We know a great deal today, if by “we” is meant a relatively small group of educational psychologists who have made this their major professional concern. We know much less, if by “we” is meant the rank and file of college teachers.

 

What is learned must be defined in terms of what the student should be able to do. If learning means change in the student, then that change should be visible in changed potentialities of behavior.

Herbert Simon, 1967

 

From DSC:
You will find a great deal of support for active learning in Simon’s article.

 

 

Why Facebook’s banned “Research” app was so invasive — from wired.com by Louise Matsakis

Excerpts:

Facebook reportedly paid users between the ages of 13 and 35 $20 a month to download the app through beta-testing companies like Applause, BetaBound, and uTest.


Apple typically doesn’t allow app developers to go around the App Store, but its enterprise program is one exception. It’s what allows companies to create custom apps not meant to be downloaded publicly, like an iPad app for signing guests into a corporate office. But Facebook used this program for a consumer research app, which Apple says violates its rules. “Facebook has been using their membership to distribute a data-collecting app to consumers, which is a clear breach of their agreement with Apple,” a spokesperson said in a statement. “Any developer using their enterprise certificates to distribute apps to consumers will have their certificates revoked, which is what we did in this case to protect our users and their data.” Facebook didn’t respond to a request for comment.

Facebook needed to bypass Apple’s usual policies because its Research app is particularly invasive. First, it requires users to install what is known as a “root certificate.” This lets Facebook look at much of your browsing history and other network data, even if it’s encrypted. The certificate is like a shape-shifting passport—with it, Facebook can pretend to be almost anyone it wants.
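A toy model may help show why installing a root certificate is so consequential. This sketch is illustrative only: the names are invented, and real TLS validation verifies cryptographic signatures along the chain, not just who sits at the end of it. But the trust logic is the same — everything reduces to whether the chain terminates at a root the device already trusts:

```python
# Roots the device ships with (a stand-in for the OS trust store).
TRUSTED_ROOTS = {"RealRootCA"}

def is_trusted(cert_chain, roots):
    # Simplified rule: a certificate chain is accepted only if it
    # terminates at a root authority the device already trusts.
    return cert_chain[-1] in roots

# A legitimate site's chain ends at a well-known root: accepted.
site_chain = ["example.com", "IntermediateCA", "RealRootCA"]
print(is_trusted(site_chain, TRUSTED_ROOTS))        # True

# An interceptor forging certificates under its own root: rejected...
mitm_chain = ["example.com", "InterceptorRootCA"]
print(is_trusted(mitm_chain, TRUSTED_ROOTS))        # False

# ...unless the user installs that root, after which every forgery
# the interceptor signs looks legitimate, even for encrypted traffic.
roots_after_install = TRUSTED_ROOTS | {"InterceptorRootCA"}
print(is_trusted(mitm_chain, roots_after_install))  # True
```

Once the attacker’s root is in the trust store, the “shape-shifting passport” works for any site the user visits — which is exactly what makes a root certificate so much more invasive than an ordinary app permission.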

To use a nondigital analogy, Facebook not only intercepted every letter participants sent and received, it also had the ability to open and read them. All for $20 a month!

Facebook’s latest privacy scandal is a good reminder to be wary of mobile apps that aren’t available for download in official app stores. It’s easy to overlook how much of your information might be collected, or to accidentally install a malicious version of Fortnite, for instance. VPNs can be great privacy tools, but many free ones sell their users’ data in order to make money. Before downloading anything, especially an app that promises to earn you some extra cash, it’s always worth taking another look at the risks involved.

 

Amazon is pushing facial technology that a study says could be biased — from nytimes.com by Natasha Singer
In new tests, Amazon’s system had more difficulty identifying the gender of female and darker-skinned faces than similar services from IBM and Microsoft.

Excerpt:

Over the last two years, Amazon has aggressively marketed its facial recognition technology to police departments and federal agencies as a service to help law enforcement identify suspects more quickly. It has done so as another tech giant, Microsoft, has called on Congress to regulate the technology, arguing that it is too risky for companies to oversee on their own.

Now a new study from researchers at the M.I.T. Media Lab has found that Amazon’s system, Rekognition, had much more difficulty in telling the gender of female faces and of darker-skinned faces in photos than similar services from IBM and Microsoft. The results raise questions about potential bias that could hamper Amazon’s drive to popularize the technology.
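The kind of audit the researchers ran can be illustrated in a few lines: group each prediction by demographic, then compare error rates across groups. The data below is invented for illustration and does not come from the MIT study:

```python
# Toy fairness audit: per-group error rates for a classifier.
# Rows are (group, true_label, predicted_label); values are made up.
predictions = [
    ("lighter-skinned male",   "male",   "male"),
    ("lighter-skinned male",   "male",   "male"),
    ("lighter-skinned female", "female", "female"),
    ("darker-skinned female",  "female", "male"),    # misclassification
    ("darker-skinned female",  "female", "female"),
]

def error_rate_by_group(rows):
    totals, errors = {}, {}
    for group, truth, pred in rows:
        totals[group] = totals.get(group, 0) + 1
        if truth != pred:
            errors[group] = errors.get(group, 0) + 1
    # Error rate = misclassifications / total examples, per group.
    return {g: errors.get(g, 0) / totals[g] for g in totals}

for group, rate in error_rate_by_group(predictions).items():
    print(f"{group}: {rate:.0%} error rate")
```

A gap between groups in this simple tally is the red flag; the study’s more careful version of the same comparison is what raised the questions about Rekognition.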

 

 


© 2019 | Daniel Christian