From DSC:
Below are some further items that discuss the need for frameworks, policies, institutes, research, etc. that deal with a variety of game-changing technologies that are quickly coming down the pike (if they aren’t already upon us).  We need such things to help us create a positive future.

Also see Part I of this thread of thinking, entitled “The need for ethics, morals, policies, & serious reflection about what kind of future we want has never been greater!”  So many other items have come out since that posting that I felt I needed to add another one here.

What kind of future do we want? How are we going to ensure that we get there?

As the saying goes, “Just because we can do something doesn’t mean we should.” Another saying also comes to mind: “What could possibly go wrong with this? It’s a done deal.”

While some of the items below should have very positive impacts on society, I do wonder how long it will take the hackers — the ones bent on wreaking havoc — to mess up some of these types of applications, with potentially deadly consequences. Security-related concerns must be dealt with here.


 

5 amazing and alarming things that may be done with your DNA — from washingtonpost.com by Matt McFarland

Excerpt (emphasis DSC):

Venter is leading efforts to use digital technology to analyze humans in ways we never have before, and the results will have huge implications for society. The latest findings he described are currently being written up for scientific publications. Venter didn’t want to usurp the publications, so he wouldn’t dive into extensive detail of how his team has made these breakthroughs. But what he did share offers an exciting and concerning overview of what lies ahead for humanity. There are social, legal and ethical implications to start considering. Here are five examples of how digitizing DNA will change the human experience:

 

 

These are the decisions the Pentagon wants to leave to robots — from defenseone.com by Patrick Tucker
The U.S. military believes its battlefield edge will increasingly depend on automation and artificial intelligence.

Excerpt:

Conducting cyber defensive operations, electronic warfare, and over-the-horizon targeting. “You cannot have a human operator operating at human speed fighting back at determined cyber tech,” Work said. “You are going to need to have a learning machine that does that.” He did not say whether the Pentagon is pursuing the autonomous or automatic deployment of offensive cyber capabilities, a controversial idea to be sure. He also highlighted a number of ways that artificial intelligence could help identify new waveforms to improve electronic warfare.

 

 

Britain should lead way on genetically engineered babies, says Chief Scientific Adviser — from telegraph.co.uk by Sarah Knapton
Sir Mark Walport, who advises the government on scientific matters, said it could be acceptable to genetically edit human embryos

Excerpt:

Last week more than 150 scientists and campaigners called for a worldwide ban on the practice, claiming it could ‘irrevocably alter the human species’ and lead to a world where inequality and discrimination were ‘inscribed onto the human genome.’

But at a conference in London [on 12/8/15], Sir Mark Walport, who advises the government on scientific matters, said he believed there were ‘circumstances’ in which the genetic editing of human embryos could be ‘acceptable’.

 

 

Cyborg Future: Engineers Build a Chip That Is Part Biological and Part Synthetic — from futurism.com

Excerpt:

Engineers have succeeded in combining an integrated chip with an artificial lipid bilayer membrane containing ATP-powered ion pumps, paving the way for more such artificial systems that combine the biological with the mechanical down the road.

 

 

Robots expected to run half of Japan by 2035 — from engadget.com by Andrew Tarantola
Something-something ‘robot overlords’.

Excerpt:

Data analysts Nomura Research Institute (NRI), led by researcher Yumi Wakao, figure that within the next 20 years, nearly half of all jobs in Japan could be accomplished by robots. Working with Professor Michael Osborne from Oxford University, who had previously investigated the same matter in both the US and UK, the NRI team examined more than 600 jobs and found that “up to 49 percent of jobs could be replaced by computer systems,” according to Wakao.

 

 

 

Cambridge University is opening a £10 million centre to study the impact of AI on humanity — from businessinsider.com by Sam Shead

Excerpt:

Cambridge University announced on [12/3/15] that it is opening a new £10 million research centre to study the impact of artificial intelligence on humanity.

The 806-year-old university said the centre, being funded with a grant from non-profit foundation The Leverhulme Trust, will explore the opportunities and challenges facing humanity as a result of further developments in artificial intelligence.

 

Cambridge-Center-Dec2015

 

 

Tech leaders launch nonprofit to save the world from killer robots — from csmonitor.com by Jessica Mendoza
Elon Musk, Sam Altman, and other tech titans have invested $1 billion in a nonprofit that would help direct artificial intelligence technology toward positive human impact. 

 

 

 

 

2016 will be a pivotal year for social robots — from therobotreport.com by Frank Tobe
1,000 Peppers are selling each month from a big-dollar venture between SoftBank, Alibaba and Foxconn; Jibo just raised another $16 million as it prepares to deliver 7,500+ units in Mar/Apr of 2016; and Buddy, Rokid, Sota and many others are poised to deliver similar forms of social robots.

Excerpt:

These new robots, and the proliferation of mobile robot butlers, guides and kiosks, promise to recognize your voice and face and help you plan your calendar, provide reminders, take pictures of special moments, text, call and videoconference, order fast food, keep watch on your house or office, read recipes, play games, read emotions and interact accordingly, and the list goes on. They are attempting to be analogous to a sharp administrative assistant that knows your schedule, contacts and interests and engages with you about them, helping you stay informed, connected and active.

 

 

IBM opens its artificial mind to the world — from fastcompany.com by Sean Captain
IBM is letting companies plug into its Watson artificial intelligence engine to make sense of speech, text, photos, videos, and sensor data.

Excerpt:

Artificial intelligence is the big, oft-misconstrued catchphrase of the day, making headlines recently with the launch of the new OpenAI organization, backed by Elon Musk, Peter Thiel, and other tech luminaries. AI is neither a synonym for killer robots nor a technology of the future, but one that is already finding new signals in the vast noise of collected data, ranging from weather reports to social media chatter to temperature sensor readings. Today IBM has opened up new access to its AI system, called Watson, with a set of application programming interfaces (APIs) that allow other companies and organizations to feed their data into IBM’s big brain for analysis.

 

 

GE wants to give industrial machines their own social network with Predix Cloud — from fastcompany.com by Sean Captain
GE is selling a new service that promises to predict when a machine will break down…so technicians can preemptively fix it.

 

 

Foresight 2020: The future is filled with 50 billion connected devices — from ibmbigdatahub.com by Erin Monday

Excerpt:

By 2020, there will be over 50 billion connected devices generating continuous data.

This figure is staggering, but is it really a surprise? The world has come a long way from 1992, when the number of computers was roughly equivalent to the population of San Jose. Today, in 2015, there are more connected devices out there than there are human beings. Ubiquitous connectivity is very nearly a reality. Every day, we get a little closer to a time where businesses, governments and consumers are connected by a fluid stream of data and analytics. But what’s driving all this growth?

 

 

Designing robots that learn as effortlessly as babies — from singularityhub.com by Shelly Fan

Excerpt:

A wide-eyed, rosy-cheeked, babbling human baby hardly looks like the ultimate learning machine.

But under the hood, an 18-month-old can outlearn any state-of-the-art artificial intelligence algorithm.

Their secret sauce?

They watch; they imitate; and they extrapolate.

Artificial intelligence researchers have begun to take notice. This week, two separate teams dipped their toes into cognitive psychology and developed new algorithms that teach machines to learn like babies. One instructs computers to imitate; the other, to extrapolate.

 

 

Researchers have found a new way to get machines to learn faster — from fortune.com by  Hilary Brueck

Excerpt:

An international team of data scientists is proud to announce the very latest in machine learning: they’ve built a program that learns… programs. That may not sound impressive at first blush, but making a machine that can learn based on a single example is something that’s been extremely hard to do in the world of artificial intelligence. Machines don’t learn like humans—not as fast, and not as well. And even with this research, they still can’t.

 

 

Team showcases how good Watson is at learning — from adigaskell.org

Excerpt:

Artificial intelligence has undoubtedly come a long way in the last few years, but there is still much to be done to make it intuitive to use.  IBM’s Watson has been one of the most well-known exponents during this time, but despite its initial success, there are issues to overcome with it.

A team led by Georgia Tech is attempting to do just that.  They’re looking to train Watson to get better at returning answers to specific queries.

 

 

Why The Internet of Things will drive a Knowledge Revolution — from linkedin.com by David Evans

Excerpt:

As these machines inevitably connect to the Internet, they will ultimately connect to each other so they can share, and collaborate on, their own findings. In fact, in 2014 machines got their own “World Wide Web,” called RoboEarth, in which to share knowledge with one another. …
The implications of all of this are at minimum twofold:

  • The way we generate knowledge is going to change dramatically in the coming years.
  • Knowledge is about to increase at an exponential rate.

What we choose to do with this newfound knowledge is of course up to us. We are about to face some significant challenges at scales we have yet to experience.

 

 

Drone squad to be launched by Tokyo police — from bbc.com

Excerpt:

A drone squad, designed to locate and – if necessary – capture nuisance drones flown by members of the public, is to be launched by police in Tokyo.

 

 

An advance in artificial intelligence rivals human abilities — from todayonline.com by John Markoff

Excerpt:

NEW YORK — Computer researchers reported artificial-intelligence advances [on Dec 10] that surpassed human capabilities for a narrow set of vision-related tasks.

The improvements are noteworthy because so-called machine-vision systems are becoming commonplace in many aspects of life, including car-safety systems that detect pedestrians and bicyclists, as well as in video game controls, Internet search and factory robots.

 

 

Somewhat related:

Novo Nordisk, IBM Watson Health to create ‘virtual doctor’ — from wsj.com by Denise Roland
Software could dispense treatment advice for diabetes patients

Excerpt:

Novo Nordisk A/S is teaming up with IBM Watson Health, a division of International Business Machines Corp., to create a “virtual doctor” for diabetes patients that could dispense treatment advice such as insulin dosage.

The Danish diabetes specialist hopes to use IBM’s supercomputer platform, Watson, to analyze health data from diabetes patients to help them manage their disease.

 

 

Why Google’s new quantum computer could launch an artificial intelligence arms race — from washingtonpost.com

 

 

 

8 industries robots will completely transform by 2025 — from techinsider.io

 

 

 

Addendums on 12/17/15:

Russia and China are building highly autonomous killer robots — from businessinsider.com.au by Danielle Muoi

Excerpt:

Russia and China are creating highly autonomous weapons, more commonly referred to as killer robots, and it’s putting pressure on the Pentagon to keep up, according to US Deputy Secretary of Defense Robert Work. During a national-security forum on Monday, Work said that China and Russia are heavily investing in a roboticized army, according to a report from Defense One.

Your Algorithmic Self Meets Super-Intelligent AI — from techcrunch.com by Jarno M. Koponen

Excerpt:

At the same time, your data and personalized experiences are used to develop and train the machine learning systems that are powering the Siris, Watsons, Ms and Cortanas. Be it a speech recognition solution or a recommendation algorithm, your actions and personal data affect how these sophisticated systems learn more about you and the world around you.

The less explicit fact is that your diverse interactions — your likes, photos, locations, tags, videos, comments, route selections, recommendations and ratings — feed learning systems that could someday transform into superintelligent AIs with unpredictable consequences.

As of today, you can’t directly affect how your personal data is used in these systems.

 

Addendum on 12/21/15:

  • Facewatch ‘thief recognition’ CCTV on trial in UK stores — from bbc.com
    Excerpts (emphasis DSC):
    Face-recognition camera systems should be used by police, he tells me. “The technology’s here, and we need to think about what is a proportionate response that respects people’s privacy,” he says.

    “The public need to ask themselves: do they want six million cameras painted red at head height looking at them?”

 


From DSC:
This posting is meant to surface the need for debates/discussions, new policy decisions, and for taking the time to seriously reflect upon what type of future we want.  Given the pace of technological change, we need to be constantly asking ourselves what kind of future we want and then actively creating that future — instead of just letting things happen because they can happen (i.e., just because something can be done doesn’t mean it should be done).

Gerd Leonhard’s work is relevant here.  In the resource immediately below, Gerd asserts:

I believe we urgently need to start debating and crafting a global Digital Ethics Treaty. This would delineate what is and is not acceptable under different circumstances and conditions, and specify who would be in charge of monitoring digressions and aberrations.

I am also including some other relevant items here that bear witness to the increasingly rapid speed at which we’re moving now.


 

Redefining the relationship of man and machine: here is my narrated chapter from the ‘The Future of Business’ book (video, audio and pdf) — from futuristgerd.com by Gerd Leonhard

.

DigitalEthics-GerdLeonhard-Oct2015

 

 

Robot revolution: rise of ‘thinking’ machines could exacerbate inequality — from theguardian.com by Heather Stewart
Global economy will be transformed over next 20 years at risk of growing inequality, say analysts

Excerpt (emphasis DSC):

A “robot revolution” will transform the global economy over the next 20 years, cutting the costs of doing business but exacerbating social inequality, as machines take over everything from caring for the elderly to flipping burgers, according to a new study.

As well as robots performing manual jobs, such as hoovering the living room or assembling machine parts, the development of artificial intelligence means computers are increasingly able to “think”, performing analytical tasks once seen as requiring human judgment.

In a 300-page report, revealed exclusively to the Guardian, analysts from investment bank Bank of America Merrill Lynch draw on the latest research to outline the impact of what they regard as a fourth industrial revolution, after steam, mass production and electronics.

“We are facing a paradigm shift which will change the way we live and work,” the authors say. “The pace of disruptive technological innovation has gone from linear to parabolic in recent years. Penetration of robots and artificial intelligence has hit every industry sector, and has become an integral part of our daily lives.”

 

RobotRevolution-Nov2015

 

 

 

First genetically modified humans could exist within two years — from telegraph.co.uk by Sarah Knapton
Biotech company Editas Medicine is planning to start human trials to genetically edit genes and reverse blindness

Excerpt:

Humans who have had their DNA genetically modified could exist within two years after a private biotech company announced plans to start the first trials into a ground-breaking new technique.

Editas Medicine, which is based in the US, said it plans to become the first lab in the world to ‘genetically edit’ the DNA of patients suffering from a genetic condition – in this case the blinding disorder ‘leber congenital amaurosis’.

 

 

 

Gartner predicts our digital future — from gartner.com by Heather Levy
Gartner’s Top 10 Predictions herald what it means to be human in a digital world.

Excerpt:

Here’s a scene from our digital future: You sit down to dinner at a restaurant where your server was selected by a “robo-boss” based on an optimized match of personality and interaction profile, and the angle at which he presents your plate, or how quickly he smiles can be evaluated for further review.  Or, perhaps you walk into a store to try on clothes and ask the digital customer assistant embedded in the mirror to recommend an outfit in your size, in stock and on sale. Afterwards, you simply tell it to bill you from your mobile and skip the checkout line.

These scenarios describe two predictions in what will be an algorithmic and smart machine driven world where people and machines must define harmonious relationships. In his session at Gartner Symposium/ITxpo 2016 in Orlando, Daryl Plummer, vice president, distinguished analyst and Gartner Fellow, discussed how Gartner’s Top Predictions begin to separate us from the mere notion of technology adoption and draw us more deeply into issues surrounding what it means to be human in a digital world.

 

 

GartnerPredicts-Oct2015

 

 

Univ. of Washington faculty study legal, social complexities of augmented reality — from phys.org

Excerpt:

But augmented reality will also bring challenges for law, public policy and privacy, especially pertaining to how information is collected and displayed. Issues regarding surveillance and privacy, free speech, safety, intellectual property and distraction—as well as potential discrimination—are bound to follow.

The Tech Policy Lab brings together faculty and students from the School of Law, Information School and Computer Science & Engineering Department and other campus units to think through issues of technology policy. “Augmented Reality: A Technology and Policy Primer” is the lab’s first official white paper aimed at a policy audience. The paper is based in part on research presented at the 2015 International Joint Conference on Pervasive and Ubiquitous Computing, or UbiComp conference.

Along these same lines, also see:

  • Augmented Reality: Figuring Out Where the Law Fits — from rdmag.com by Greg Watry
    Excerpt:
    With AR comes potential issues the authors divide into two categories. “The first is collection, referring to the capacity of AR to record, or at least register, the people and places around the user. Collection raises obvious issues of privacy but also less obvious issues of free speech and accountability,” the researchers write. The second issue is display, which “raises a variety of complex issues ranging from possible tort liability should the introduction or withdrawal of information lead to injury, to issues surrounding employment discrimination or racial profiling.”

    Current privacy law in the U.S. allows video and audio recording in areas that “do not attract an objectively reasonable expectation of privacy,” says Newell. Further, many uses of AR would be covered under the First Amendment right to record audio and video, especially in public spaces. However, as AR increasingly becomes more mobile, “it has the potential to record inconspicuously in a variety of private or more intimate settings, and I think these possibilities are already straining current privacy law in the U.S.,” says Newell.

 

Stuart Russell on Why Moral Philosophy Will Be Big Business in Tech — from kqed.org

Excerpt (emphasis DSC):

Our first Big Think comes from Stuart Russell. He’s a computer science professor at UC Berkeley and a world-renowned expert in artificial intelligence. His Big Think?

“In the future, moral philosophy will be a key industry sector,” says Russell.

Translation? In the future, the nature of human values and the process by which we make moral decisions will be big business in tech.

 

Life, enhanced: UW professors study legal, social complexities of an augmented reality future — from washington.edu by Peter Kelley

Excerpt:

But augmented reality will also bring challenges for law, public policy and privacy, especially pertaining to how information is collected and displayed. Issues regarding surveillance and privacy, free speech, safety, intellectual property and distraction — as well as potential discrimination — are bound to follow.

 

An excerpt from:

UW-AR-TechPolicyPrimer-Nov2015

THREE: CHALLENGES FOR LAW AND POLICY

AR systems change human experience and, consequently, stand to challenge certain assumptions of law and policy. The issues AR systems raise may be divided into roughly two categories. The first is collection, referring to the capacity of AR devices to record, or at least register, the people and places around the user. Collection raises obvious issues of privacy but also less obvious issues of free speech and accountability. The second rough category is display, referring to the capacity of AR to overlay information over people and places in something like real-time. Display raises a variety of complex issues ranging from possible tort liability should the introduction or withdrawal of information lead to injury, to issues surrounding employment discrimination or racial profiling. Policymakers and stakeholders interested in AR should consider what these issues mean for them.  Issues related to the collection of information include…

 

HR tech is getting weird, and here’s why — from hrmorning.com by guest poster Julia Scavicchio

Excerpt (emphasis DSC):

Technology has progressed to the point where it’s possible for HR to learn almost everything there is to know about employees — from what they’re doing moment-to-moment at work to what they’re doing on their off hours. Guest poster Julia Scavicchio takes a long hard look at the legal and ethical implications of these new investigative tools.  

Why on Earth does HR need all this data? The answer is simple — HR is not on Earth, it’s in the cloud.

The department transcends traditional roles when data enters the picture.

Many ethical questions posed through technology easily come and go because they seem out of this world.

 

 

18 AI researchers reveal the most impressive thing they’ve ever seen — from businessinsider.com by Guia Marie Del Prado

Excerpt:

Where will these technologies take us next? Well, to know that, we should determine what’s the best of the best now. Tech Insider talked to 18 AI researchers, roboticists, and computer scientists to see what real-life AI impresses them the most.

“The DeepMind system starts completely from scratch, so it is essentially just waking up, seeing the screen of a video game and then it works out how to play the video game to a superhuman level, and it does that for about 30 different video games.  That’s both impressive and scary in the sense that if a human baby was born and by the evening of its first day was already beating human beings at video games, you’d be terrified.”

 

 

 

Algorithmic Economy: Powering the Machine-to-Machine Age Economic Revolution — from formtek.com by Dick Weisinger

Excerpts:

As technology advances, we are becoming increasingly dependent on algorithms for everything in our lives.  Algorithms that can solve our daily problems and tasks will do things like drive vehicles, control drone flight, and order supplies when they run low.  Algorithms are defining the future of business and even our everyday lives.

Sondergaard said that “in 2020, consumers won’t be using apps on their devices; in fact, they will have forgotten about apps. They will rely on virtual assistants in the cloud, things they trust. The post-app era is coming.  The algorithmic economy will power the next economic revolution in the machine-to-machine age. Organizations will be valued, not just on their big data, but on the algorithms that turn that data into actions that ultimately impact customers.”

 

 

Related items:

 

Addendums:

 

robots-saying-no

 

 

Addendum on 12/14/15:

  • Algorithms rule our lives, so who should rule them? — from qz.com by Dries Buytaert
    As technology advances and more everyday objects are driven almost entirely by software, it’s become clear that we need a better way to catch cheating software and keep people safe.
 

A college completion idea that’s so simple. Why aren’t we doing it? — from huffingtonpost.com by Brad Phillips

Excerpt (emphasis DSC):

This week’s White House “College Opportunity” summit will focus on an overlooked area with enormous potential for student success: K-12 and higher education working together to improve college completion. It sounds so simple and obvious. In fact, many assume it’s already happening. After all, both groups of educators share the same students, just at different points in their education careers. Why wouldn’t they share information about students and coordinate efforts to help students be successful?

The process of closely analyzing high school to college data is eye opening for both K-12 and college educators. Faculty discover that while they both may be calling a subject Algebra or English, what is taught and assigned can be very different, setting up students for a struggle.

In Southern California, high school teachers and college faculty members participating in the English Curriculum Alignment Project (ECAP) shared years of transcript information. Examining student performance over time, educators learned that what was taught in high school English did not align with what was expected in college English.

 

From DSC:
I’ll take that one step further and say that we need stronger continuums between K-12, higher ed, and the corporate/business world.  We need more efforts, mechanisms, tools, communities of practice, and platforms for collaborating with each other.  That’s what I try to at least scratch the surface on via this Learning Ecosystems blog — i.e., touching upon areas that span all three of those worlds.  We need more collaborations/conversations along these lines.

 

 

Higher Education: New Models, New Rules — from educause.com by Louis Soares, Judith S. Eaton, and Burck Smith
What are the new rules that will accompany future new models in higher education? Three essays address this question by exploring state higher education policy, accreditation for non-institutional education, and the disaggregation of the current higher education model.

 

Educause-NewModelsNewRules-Oct72013

 

Excerpt from Burck’s essay:

Accordingly, the government spurs “supply” by paying for colleges and universities and spurs “demand” by paying for students. Accreditors determine who can receive these funds. All of this worked well for sixty years. Until, suddenly, it doesn’t.

 

Alive in the Swamp — from nesta.org.uk by Michael Fullan and Katelyn Donnelly

Excerpt (emphasis and link below from DSC):

The authors argue that we should seek digital innovations that produce at least twice the learning outcome for half the cost of our current tools.  To achieve this, three forces need to come together. One is technology, the other pedagogy, and the third is change knowledge, or how to secure transformation across an entire school system.

The breakthrough in Alive in the Swamp is the development of an Index that will be of practical assistance to those charged with making these kinds of decisions at school, local and system level. Building on Fullan’s previous work, Stratosphere, the Index sets out the questions policymakers need to ask themselves not just about the technology at any given moment but crucially also about how it can be combined with pedagogy and knowledge about system change. Similarly, the Index should help entrepreneurs and education technology developers to consider particular features to build into their products to drive increased learning and achieve systemic impact.

The future will belong not to those who focus on the technology alone but to those who place it in this wider context and see it as one element of a wider system transformation. Fullan and Donnelly show how this can be done in a practical way.

.

 seriously-scary-graphic---daniel-christian-7-24-13

 

Also see:
.

 

 

From DSC:
More innovation from the online world… hopefully these technologies end up in the right hands…

Keeping an eye on online test-takers — from nytimes.com by Anne Eisenberg

Excerpt:

But now eavesdropping technologies worthy of the C.I.A. can remotely track every mouse click and keystroke of test-taking students. Squads of eagle-eyed humans at computers can monitor faraway students via webcams, screen sharing and high-speed Internet connections, checking out their photo IDs, signatures and even their typing styles to be sure the test-taker is the student who registered for the class.

 

Inter-Organizational Task Force on Online Learning: Recommendations | September 20, 2012

From DSC:
What caught my eye:

  • The new “traditional” student: the adult learner

From DSC:
The wetakeyourclass.com website below is very disturbing in terms of the questions it raises — though such questions and activities are certainly not limited to what occurs in the online classroom, as evidenced by The Shadow Scholar (from The Chronicle by Ed Dante) where a man confessed to writing ~5,000 scholarly papers for students, including grad students.

Questions:

  • Should the owners of this site go to jail? Or is this just capitalism gone awry? 
  • Does this business make a profit?  If so, why and what does that say?
  • Is this type of thing happening for just a handful of people out there? How would we know?  Can the turnitin.com’s of the world detect/stop this?
  • If their services are in demand, should that inform or influence any differences in the strategies or pedagogies we utilize within higher education?  Do we need to re-evaluate what’s really being achieved and not achieved? That is, if employers didn’t look to a college degree, would students come to learn anyway or would they be gone by morning?
  • What, if anything, does it say about students’ ethics?  Matters of the heart?

.

WE take your class dot com -- should these people go to jail?

 

Look at some of the “services,” “benefits,” and messages being offered therein:

  • We take your online classes for you and you get an A
    Underlying message: We’ll help you get that piece of paper so you can move on to what really counts.  (Don’t worry about not knowing anything…you won’t have to prove yourself later on…and never mind about your work ethic, this is a one-time-cutting-the-corners type of decision, right? Isn’t it? Isn’t it?!)
  • Life is too short to spend on classes you have no interest in
  • Give us a deadline and we will meet it!
    Underlying message: Corporations outsource many things, why shouldn’t you?  (Never mind that learning should be your core business and should not be outsourced.)


How about you? What questions does this type of thing raise in your mind?

 

States with online course or online experience requirements — from Sevenstar

  1. Alabama (As of 2008, all students must earn one credit in an advanced placement course, an honors course, a dual credit course or a distance learning course)
  2. Florida (For the 2011-2012 academic year, Florida begins requiring its high school students to complete at least one course online in order to graduate. As of 2006, all students must have the option to take an online course if they want one)
  3. Georgia (Starting in 2014, all ninth grade students will have to take at least one online course before graduation)
  4. Michigan (As of 2006, all incoming high school students must complete a course of study delivered via the intranet/Internet; or students will complete 20 hours of structured, sustained, integrated, online experiences.)
  5. Idaho (All students who begin ninth grade in fall 2012, must take 2 online courses to graduate)
  6. New York (As of 2011, public schools in the state of New York must spend significant amounts of time online)
  7. New Mexico (As of 2011, All students must earn one credit in an advanced placement course, an honors course, a dual credit course or a distance learning course)
  8. Virginia (All students who begin ninth grade in fall 2013, must take at least part of a course online to receive a standard or advanced-studies high-school diploma.)

 

Also see:

  • Christian Schools, Finances and Online Courses: Ready for a New Wineskin? — from Sevenstar
    Financial-based decisions are occurring in almost every school. While financial stewardship has always been a hallmark of the Christian school movement, the decisions to cut faculty, reduce the breadth of classes offered, and close schools may lessen the impact of Christian education in the United States. This white paper describes the changed landscape of education and presents opportunities for Christian Schools to thrive financially in this new environment. If your school is concerned about financial sustainability, be sure to read this seminal white paper.

 


Key findings from Executive Excess 2012: The CEO Hands in Uncle Sam’s Pocket — from the Institute for Policy Studies by Sarah Anderson, Chuck Collins, Scott Klinger, Sam Pizzigati


Executive Excess 2012

Key findings:

  • Of last year’s 100 highest-paid U.S. corporate chief executives, 26 took home more in CEO pay than their companies paid in federal income taxes, up from the 25 we noted in last year’s analysis. Seven firms made the list in both 2011 and 2010.
  • The CEOs of these 26 firms received $20.4 million in average total compensation last year. That’s a 23 percent increase over the average for last year’s list of 2010’s tax-dodging executives.
  • The four most direct tax subsidies for excessive executive pay cost taxpayers an estimated $14.4 billion per year—$46 for every American man, woman, and child. That amount could also cover the annual cost of hiring 211,732 elementary-school teachers or creating 241,593 clean-energy jobs.

 

From DSC:
Considering our corporations are sitting on $1.X trillion, where is our nation’s heart? Priorities? Care for our fellow human beings? It seems the “every man for himself” philosophy and manner of living is alive and well here in America. My alma mater would be proud; it’s their philosophy exactly.

 

 

http://www.futurict.eu/


 

Project summary for FuturICT


Also see:

  • Scientists aim to predict the future with $1 billion Earth simulator — from dvice.com
    Excerpt:
    Imagine what would happen if you had a computer program that could take in data from sensors everywhere on Earth and then plug that data into a detailed simulation for the entire Earth all at once. If you’re imagining being able to predict the future, you’re imagining correctly, and E.U. researchers want to make it real. The Living Earth Simulator is a billion-dollar proposal to spend ten years developing a computer environment that can simulate everything. And not just simulate, but also explore predictive models of how everything going on in the world interrelates with everything else, deriving connections and correlations that we never knew existed.

    In order to get that billion dollars, the Living Earth Simulator has to beat out four other future and emerging technologies projects that are all trying to win funding from the European Commission.

 


From DSC:
Readers of this blog will know that I am pro-technology, at least in most areas. However, because our hearts can become hardened and our feet can find themselves on slippery ethical ground, we really need hearts, minds, and consciences that prompt us to care about other people, and to do the right thing as a result of that perspective.

 


Example articles that brought this to my mind recently include:


 

7 sinister technologies from Orwell’s 1984 that are still a threat — from dvice.com by Hal Rappaport

Excerpt (emphasis DSC):

Technology is a wonderful thing, but in the words of Spider-Man’s Uncle Ben, “With great power, comes great responsibility.” If we are not careful, the technology we know and love could be used against us, even subtly. In the year 1984, Apple thought IBM was the bringer of “Big Brother.” In reality, the technology of today better resembles George Orwell’s dystopian vision than a 1980s era PC.

Every day we are in the process of becoming a more connected society. With social networks, cloud computing and even more specific, less-thought-about tech such as Internet-connected home surveillance systems, we may find ourselves in a delicate balance of trust and paranoia.

While we are grateful that we don’t live in a world as bleak as Orwell’s Oceania, it’s clear that the technology now exists to make his world possible if we let it. Keeping our paranoia in check, we should all be mindful of our technology and how it’s used. Security is a good thing and so is saving money, but consider how much of each your personal freedom is worth.

Wiping away your Siri “fingerprint” — from technologyreview.com by David Talbot
Your voice can be a biometric identifier, like your fingerprint. Does Apple really have to store it on its own servers?

Excerpt:

“What I’ve discovered through building and running very targeted online ad campaigns using this data is that users respond favorably to ads that are more targeted, but only if the ads don’t make it clear that I’m targeting sensitive information about them,” he said. “What’s most interesting, and what I’m learning, are which attributes are considered too creepy, and which ones are acceptable.”

“Google Now” knows more about you than your family does – are you OK with that? — from readwriteweb.com by Mark Hachman

Excerpt (emphasis DSC):

Google Now aggregates the information Google already collects about you on a daily basis: accessing your email, your calendar, your contacts, your text messages, your location, your shopping habits, your payment history, as well as your choices in music, movies and books. It can even scan your photos and automatically identify them based on their subject, not just the file name (in the Google I/O demo, Google Now correctly found a picture of the Great Pyramid). About the only aspect of your online life that Google hasn’t apparently assimilated yet is your opinions expressed on Google+. But that’s undoubtedly coming.

Social network privacy settings compared — from techhive.com by Nick Mediati

Excerpt (emphasis DSC):

It should go without saying that protecting your privacy online is kind of a big deal. While people are generally good at not giving out their personal information to just any website that asks for it, those same people can be found filling their Facebook accounts with everything from their birthday to where they live and work. Putting this sensitive information onto a social network not only leaves your data exposed to third-parties (advertisers and so forth), but also to anyone who happens across your profile.

Facebook, Google+, and Twitter all have settings that let you tweak what others can see on your profile—but navigating them can be a bit of a mess. Not all social networks give you complete control over your privacy online, so here’s a quick overview of what Facebook, Google+, and Twitter allow you to do.

U.S. companies lost at least $13 billion to espionage last year — from ieee.org by Robert Charette

Google Glass & Now: Utopia or Dystopia? — from extremetech.com by Sebastian Anthony

Excerpt (emphasis DSC):

If you didn’t watch the Google I/O keynote presented by Vic Gundotra, Hugo Barra, and Sergey Brin, let me quickly bring you up to speed.  Google Now is an Android app that uses your location, behavior history, and search history to display “just the right information at just the right time.”  For example, if you regularly search for a certain sports team, Now will show you a card with the latest scores for that team.  When Now predicts or detects that you’re leaving home in the morning, it will display a card with any relevant traffic information.  If you have a lunch meeting in your Google Calendar, Now will show you the route you need to take to get there — and when you need to leave to get there on time.  If you search Google for an airline flight, Now will show a card with the flight details (and any delays).

Big e-reader is watching you — from PaidContent.org by Laura Hazard Owen

Why privacy is big business for trial lawyers — from technologyreview.com by Antonio Regalado
Tech companies that make privacy mistakes can expect a lawsuit.

Legal discovery: The billboard is imaginary, but the trend is real.
Trial lawyers are ramping up lawsuits over online privacy breaches.
Flickr Creative Commons | AdamL212 and istock/stocknroll


Addendums
On 7/6/12:
  • Your e-book is reading you — WSJ.com by Alexandra Alter
    Digital-book publishers and retailers now know more about their readers than ever before. How that’s changing the experience of reading.

On 7/16/12:

On 7/23/12:

My notes on two presentations from the Learning Without Frontiers Conference, London, 26th January 2012:

My notes for:
Sir Ken Robinson’s talk

Practice <–> Theory <–> Policy

  • People who practice don’t often have time to get the latest and greatest information re: theory
  • Theorists don’t have much time for practice
  • Policy makers don’t know much about either 🙂

Purposes of education:

  • Economic.  Not solely, but there are economic reasons for providing education. Academic vs. vocational programs: Sir Ken doesn’t subscribe to this dichotomy in educational DNA. We need new sorts of education
  • Cultural. Aim to pass on cultural genes – values, beliefs
  • Personal. The most important! In the end, education is ultimately, personal. Too much impersonal testing that students aren’t engaged in.

Key point:

  • There is everything you can do – at all levels; many of us ARE the educational system – at least for the group(s) of students that we are working with. So we can make immediate changes; and collectively this can create a revolution.

Education is not linear, not monolithic. Rather, it’s a complex, adaptive system – many moving parts, like a vortex…not like an undisturbed canal; more like an ocean with different forces tugging this way and that. (From DSC: I agree with what Sir Ken is saying here, and I especially agree with this particular perspective — thus the name of this blog.)

Personalization is key! Education needs to be customized to the communities where it’s taking place.

Principles

  • Curriculum – towards disciplines (skills, processes, procedures) and away from subjects
  • Teaching & Learning – dynamic; flow of knowledge; not static; forms need to tap into streams; move towards collaborative activities; active learning trumps passive learning
  • Assessment – must move from judgment to description

 


My notes (part way) for:
Jim Knight – If Steve Jobs Designed Schools

What if Steve Jobs had re-invented the education system rather than the computer and consumer electronics industries?

Steve Jobs was a contradictory character, combining control freak and Zen Buddhist, technology with design. He had a revolutionary impact on computing, animation, the music industry, printing, and publishing. He and Bill Gates together expressed surprise at how little impact technology had had on schools. Jobs’s wife is an educational reformer, and he himself was a college dropout. But what would it have been like if Steve Jobs had focused on education? What would the Jobs School be like?

How do we make an insanely great school?

  • Must go really deep to create something that’s easy to use (from DSC — I call this “Easy is hard.”) Need to de-clutter the teaching & learning environment, the curriculum, the qualifications, and the people.
  • How does it make me feel when I walk through the doorway of your school?
  • Get to choose who you want to learn with and from
  • Simple, beautiful space; flexible; social; reflective, all year round
  • More seductive, intuitive, enthralling
  • Does it inspire curiosity?
  • “Don’t need instructions”
  • Not just a school – learning doesn’t stop when school bell rings
  • 24×7 thing
  • Curriculum
  • Is there a range of things to interest everyone?
  • Need more choice; selection; more control of their learning
  • All ages
  • Enterprising
  • Creative, technical, practical…but most of all, it would be fun!

More here…


 

Big Brother is watching: Document reveals surveillance of social media, blogs, image-sharing sites — from techland.time.com by Graeme McMillan

Jim Urquhart / Reuters

From DSC:
As a side comment:

  • What occurs in the legislatures and courtrooms across the world lags — sometimes a great deal — behind what technology can do.  With facial recognition, cyberwarfare, trojans, keyloggers, and other items out there these days, privacy is under attack and I don’t think the courts — and the citizens of the world — are keeping up.  Minority Report comes to mind…which is unsettling to me.

 

Recording everything: Digital storage as an enabler of authoritarian governments — by John Villasenor, a nonresident senior fellow in Governance Studies and in the Center for Technology Innovation at Brookings. He is also professor of electrical engineering at the University of California, Los Angeles.

Excerpt (emphasis DSC):

Within the next few years an important threshold will be crossed: For the first time ever, it will become technologically and financially feasible for authoritarian governments to record nearly everything that is said or done within their borders – every phone conversation, electronic message, social media interaction, the movements of nearly every person and vehicle, and video from every street corner. Governments with a history of using all of the tools at their disposal to track and monitor their citizens will undoubtedly make full use of this capability once it becomes available.

The Arab Spring of 2011, which saw regimes toppled by protesters organized via Twitter and Facebook, was heralded in much of the world as signifying a new era in which information technology alters the balance of power in favor of the repressed. However, within the world’s many remaining authoritarian regimes it was undoubtedly viewed very differently. For those governments, the Arab Spring likely underscored the perils of failing to exercise sufficient control of digital communications and highlighted the need to redouble their efforts to increase the monitoring of their citizenry.

Declining storage costs will soon make it practical for authoritarian governments to create permanent digital archives of the data gathered from pervasive surveillance systems. In countries where there is no meaningful public debate on privacy, there is no reason to expect governments not to fully exploit the ability to build databases containing every phone conversation, location data for almost every person and vehicle, and video from every public space in an entire country.

This will greatly expand the ability of repressive regimes to perform surveillance of opponents and to anticipate and react to unrest. In addition, the awareness among the populace of pervasive surveillance will reduce the willingness of people to engage in dissent.

© 2024 | Daniel Christian