ChatGPT remembers who you are — from thebrainyacts.beehiiv.com | Brainyacts #191

OpenAI rolls out Memory feature for ChatGPT
OpenAI has introduced a cool update for ChatGPT (rolling out to paid and free users – but not in the EU or Korea), enabling the AI to remember user-specific details across sessions. This memory feature enhances personalization and efficiency, making your interactions with ChatGPT more relevant and engaging.


Key Features

  1. Automatic Memory Tracking
    • ChatGPT now automatically records information from your interactions, such as preferences, interests, and plans. This allows the AI to refine its responses over time, making each conversation increasingly tailored to you.
  2. Enhanced Personalization
    • The more you interact with ChatGPT, the better it understands your needs and adapts its responses accordingly. This personalization improves the relevance and efficiency of your interactions, whether you’re asking for daily tasks or discussing complex topics.
  3. Memory Management Options
    • You have full control over this feature. You can view what information is stored, toggle the memory on or off, and delete specific data or all memory entries, ensuring your privacy and preferences are respected.
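For readers curious about what a feature like this involves under the hood, here is a minimal, purely illustrative sketch of a per-user memory store that supports the three controls described above: recording details, viewing them, and deleting some or all of them. This is a toy under my own assumptions, not OpenAI's implementation.

```python
# Hypothetical sketch of a per-user memory store -- NOT OpenAI's implementation.
# It illustrates the three controls described above: record, view, and delete.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class MemoryEntry:
    text: str  # e.g. "prefers concise answers"
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class UserMemory:
    def __init__(self) -> None:
        self.enabled: bool = True            # memory can be toggled on or off
        self._entries: list[MemoryEntry] = []

    def record(self, text: str) -> None:
        """Store a detail extracted from a conversation (only if memory is on)."""
        if self.enabled:
            self._entries.append(MemoryEntry(text))

    def view(self) -> list[str]:
        """Let the user see exactly what has been remembered."""
        return [e.text for e in self._entries]

    def forget(self, index: int) -> None:
        """Delete one specific memory entry."""
        del self._entries[index]

    def forget_all(self) -> None:
        """Delete everything."""
        self._entries.clear()


# Example usage
memory = UserMemory()
memory.record("works as a paralegal; asks for plain-language summaries")
print(memory.view())
memory.forget_all()
```

In the real product all of this happens server-side, and deciding what is worth remembering is itself model-driven; the point of the sketch is only that "memory" reduces to data the user can audit and erase.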




From DSC:
The ability of AI-based applications to remember things about us will have major and positive ramifications for us when we think about learning-related applications of AI.


 

Shares of two big online education stocks tank more than 10% as students use ChatGPT — from cnbc.com by Michelle Fox; via Robert Gibson on LinkedIn

The rapid rise of artificial intelligence appears to be taking a toll on the shares of online education companies Chegg and Coursera.

Both stocks sank by more than 10% on Tuesday after the companies issued disappointing guidance, in part because of students using AI tools such as ChatGPT from OpenAI.



Synthetic Video & AI Professors — from drphilippahardman.substack.com by Dr. Philippa Hardman
Are we witnessing the emergence of a new, post-AI model of async online learning?

TLDR: by effectively tailoring the learning experience to the learner’s comprehension levels and preferred learning modes, AI can enhance the overall learning experience, leading to increased “stickiness” and higher rates of performance in assessments.

TLDR: AI enables us to scale responsive, personalised “always on” feedback and support in a way that might help to solve one of the most wicked problems of online async learning – isolation and, as a result, disengagement.

In the last year we have also seen the rise of an unprecedented number of “always on” AI tutors, built to provide coaching and feedback how and when learners need it.

Perhaps the most well-known example is Khan Academy’s Khanmigo and its GPT sidekick Tutor Me. We’re also seeing similar tools emerge in K12 and Higher Ed where AI is being used to extend the support and feedback provided for students beyond the physical classroom.


Our Guidance on School AI Guidance document has been updated — from stefanbauschard.substack.com by Stefan Bauschard

We’ve updated the free 72-page document we wrote to help schools design their own AI guidance policies.

There are a few key updates.

  1. Inclusion of Oklahoma and significant updates from North Carolina and Washington.
  2. More specifics on implementation — thanks NC and WA!
  3. A bit more on instructional redesign. Thanks to NC for getting this party started!

Creating a Culture Around AI: Thoughts and Decision-Making — from er.educause.edu by Courtney Plotts and Lorna Gonzalez

Given the potential ramifications of artificial intelligence (AI) diffusion on matters of diversity, equity, inclusion, and accessibility, now is the time for higher education institutions to adopt culturally aware, analytical decision-making processes, policies, and practices around AI tools selection and use.

 

The Digital Transformation Journey: Lessons For Lawyers Embracing AI — from abovethelaw.com by Olga V. Mack
The journey from the days of leather-bound law books to the digital age — and now toward an AI-driven future — offers valuable lessons for embracing change.

No One Will Miss The ‘Good Old Days’
I have yet to meet a lawyer nostalgic for the days of manually updating law reports or sifting through stacks of books for a single precedent. The convenience, speed, and breadth of digital research tools have made the practice of law more efficient and effective. As we move further into the AI era, the enhancements in predictive analytics, document automation, and legal research will make the “good old days” of even the early digital age seem quaint. The efficiencies and capabilities AI brings to the table are likely to become just as indispensable as online databases are today.

The Way We ‘Law’ Will Change For The Better
The ultimate goal of integrating AI into legal practice isn’t just to replace old methods with new ones; it’s to enhance our ability to serve justice, increase access to legal services, and improve the quality of our work. AI promises to automate mundane tasks, predict legal outcomes with greater accuracy, and unearth insights from vast data. These advancements will free us to focus more on the nuanced, human aspects of law — strategy, empathy, and ethical judgment.


AI to Help Double Legal Tech Market Over Five Years, Gartner Says — from news.bloomberglaw.com by Isabel Gottlieb (behind a paywall)

  • Tech to take up a bigger share of in-house legal spend
  • Generative AI boom has much longer to run

The legal tech market will expand to $50 billion by 2027, driven by the generative artificial intelligence boom, according to an analysis by market research firm Gartner Inc.

That growth, up from about $23 billion in 2022, will be driven by continued law firm spending on AI legal tech, as well as in-house departments allocating more of their overall budgets to technology, said Chris Audet, chief of research in Gartner’s legal, risk and compliance leaders practice. The market size prediction, released publicly on Thursday, comes from a late-2023 analysis for Gartner clients, and the 2022 market size comes from …


Legal Tech Market To See Huge Lift Off Thanks to GenAI — from digit.fyi by Elizabeth Greenberg

The global legal technology market has grown significantly in recent years and generative AI (GenAI) will accelerate this growth, meaning the market will reach $50 billion in value by 2027, according to Gartner.

“GenAI has huge potential for bringing more automation to the legal space,” said Chris Audet, chief of research in the Gartner for legal, risk & compliance leaders practice.

“Rapid GenAI developments, and the widespread availability of consumer tools such as OpenAI’s ChatGPT and Google’s Bard, will quickly increase the number of established legal technology use cases, in turn creating growing market conditions for an increasing number of legal-focused tools.”

“New technologies can fundamentally change the way legal organizations do business, and GenAI has enormous potential to do this,” an analyst at Gartner said.


Revolutionizing Legal Tech in 48 Hours — from law.stanford.edu by Monica Schreiber
At CodeX Hackathon, SLS Students Help Create Award-Winning AI Tools to Help Veterans and Streamline M&A

Disabled veterans seeking to file claims with the Veterans Administration are faced with multiple hurdles and reams of paperwork. Many vets resort to paying third-party companies thousands of dollars to help them with the process.

What if there were a way to streamline the claims process—to condense burdensome information gathering and data inputting into a linear, simplified set of tasks guided by a chatbot? How long would it take to roll out a tool that could accomplish that?

The answer: about 48 hours—at least for an interdisciplinary team of students from Stanford University’s schools of Law, Business, and Computer Science collaborating feverishly during CodeX’s Large Language Model (LLM) Hackathon held recently on campus.


What If Your Law Firm Had A Blank Page For Legal Tech? — from artificiallawyer.com

If law firms had a blank page for legal technology and innovation, what would they do?

While organisations across all sectors are getting to grips with the opportunities and risks posed by genAI, forward-thinking law firm leaders are considering what it means for their businesses – today, tomorrow, and the day after tomorrow.

But some firms remain constrained by yesterday, due to legacy processes, ways of working and mindsets. To create the conditions for change, firms need to adopt a ‘blank page’ approach and review all areas of their businesses by asking: if we were starting afresh, how would we design the organisation to future-proof it to achieve transformative growth with genAI at the core?

From DSC:
This sentence reminds me of the power of culture:

But some firms remain constrained by yesterday, due to legacy processes, ways of working and mindsets.


Fresh Voices on Legal Tech with Sarah Glassmeyer — from legaltalknetwork.com by Dennis Kennedy, Tom Mighell, and Sarah Glassmeyer

What if, instead of tech competence being this scary, overwhelming thing, we showed lawyers how to engage with technology in a more lighthearted, even playful, way? The reality is—tech competency doesn’t have an endpoint, but the process of continuous learning shouldn’t be dull and confusing. Sarah Glassmeyer joins Dennis and Tom to talk about her perspectives on technology education for attorneys, the latest trends in the legal tech world and new AI developments, and growing your knowledge of technology by building on small skills, one at a time.

 


How Legal Technology Can Add Value to an M&A Practice — from lexology.com

Following is a primer on some of the A.I.-driven legal technologies, from contract review and automated due-diligence solutions to deal collaboration and closing-management tools, that can drive productivity and efficiency during the four phases of an M&A transaction, as well as enhance market insight and client service.

 

The Verge | What’s Next With AI | February 2024 | Consumer Survey [infographic]

Microsoft AI creates talking deepfakes from single photo — from inavateonthenet.net


The Great Hall – where now with AI? It is not ‘Human Connection V Innovative Technology’ but ‘Human Connection + Innovative Technology’ — from donaldclarkplanb.blogspot.com by Donald Clark

The theme of the day was Human Connection V Innovative Technology. I see this a lot at conferences, setting up the human connection (social) against the machine (AI). I think this is ALL wrong. It is, and has always been, a dialectic: human connection (social) PLUS the machine. Everyone has a smartphone; most use it for work, comms and social media. The binary between human and tech has long disappeared.


Techno-Social Engineering: Why the Future May Not Be Human, TikTok’s Powerful ForYou Algorithm, & More — by Misha Da Vinci

Things to consider as you dive into this edition:

  • As we increasingly depend on technology, how is it changing us?
  • In the interaction between humans and technology, who is adapting to whom?
  • Is the technology being built for humans, or are we being changed to fit into tech systems?
  • As time passes, will we become more like robots or the AI models we use?
  • Over the next 30 years, as we increasingly interact with technology, who or what will we become?

 

Description:

I recently created an AI version of myself—REID AI—and recorded a Q&A to see how this digital twin might challenge me in new ways. The video avatar is generated by Hour One, its voice was created by Eleven Labs, and its persona—the way that REID AI formulates responses—is generated from a custom chatbot built on GPT-4 that was trained on my books, speeches, podcasts and other content that I’ve produced over the last few decades. I decided to interview it to test its capability and how closely its responses match—and test—my thinking. Then, REID AI asked me some questions on AI and technology. I thought I would hate this, but I’ve actually ended up finding the whole experience interesting and thought-provoking.
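The persona layer described above can be illustrated with a short, hedged sketch: a system prompt grounded in a person's own writings, sent to a chat model via the OpenAI Python SDK. This is not the actual REID AI pipeline; the model name and the excerpts file are illustrative assumptions.

```python
# Minimal sketch of the "persona" layer of a digital twin -- NOT the actual
# REID AI build. The model name and the excerpts file are illustrative.
from pathlib import Path
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# In a real build these excerpts would come from books, speeches, podcasts, etc.
excerpts = Path("reid_excerpts.txt").read_text(encoding="utf-8")

system_prompt = (
    "You are a conversational stand-in for a specific author. "
    "Answer in their voice and draw only on the excerpts below.\n\n"
    f"EXCERPTS:\n{excerpts}"
)

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative choice; any capable chat model would do
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "How should founders think about AI risk?"},
    ],
)
print(response.choices[0].message.content)
```

Everything above covers only the text persona; the published REID AI also layers a generated video avatar (Hour One) and a cloned voice (Eleven Labs) on top, which are separate services.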


From DSC:
This ability to ask questions of a digital twin is very interesting when you think about it in terms of “interviewing” a historical figure. I believe character.ai provides this kind of thing, but I haven’t used it much.


 

Colleges are now closing at a pace of one a week. What happens to the students? — from hechingerreport.org by Jon Marcus
Most never finish their degrees, and alumni wonder about the value of degrees they’ve earned

About one university or college per week so far this year, on average, has announced that it will close or merge. That’s up from a little more than two a month last year, according to the State Higher Education Executive Officers Association, or SHEEO.

Most students at colleges that close give up on their educations altogether. Fewer than half transfer to other institutions, a SHEEO study found. Of those, fewer than half stay long enough to get degrees. Many lose credits when they move from one school to another and have to spend longer in college, often taking out more loans to pay for it.

Colleges are almost certain to keep closing. As many as one in 10 four-year colleges and universities are in financial peril, the consulting firm EY Parthenon estimates.

Students who transfer lose an average of 43 percent of the credits they’ve already earned and paid for, the Government Accountability Office found in the most recent comprehensive study of this problem.

Also relevant:

 

Instructors as Innovators: a Future-focused Approach to New AI Learning Opportunities, With Prompts — from papers.ssrn.com by Ethan R. Mollick and Lilach Mollick

Abstract

This paper explores how instructors can leverage generative AI to create personalized learning experiences for students that transform teaching and learning. We present a range of AI-based exercises that enable novel forms of practice and application including simulations, mentoring, coaching, and co-creation. For each type of exercise, we provide prompts that instructors can customize, along with guidance on classroom implementation, assessment, and risks to consider. We also provide blueprints, prompts that help instructors create their own original prompts. Instructors can leverage their content and pedagogical expertise to design these experiences, putting them in the role of builders and innovators. We argue that this instructor-driven approach has the potential to democratize the development of educational technology by enabling individual instructors to create AI exercises and tools tailored to their students’ needs. While the exercises in this paper are a starting point, not definitive solutions, they demonstrate AI’s potential to expand what is possible in teaching and learning.

 

The AI Tools in Education Database — from aitoolsdirectory.notion.site; via George Siemens

Since AI in education has been moving at the speed of light, we built this AI Tools in Education database to keep track of the most recent AI tools in education and the changes that are happening every day. This database is intended to be a community resource for educators, researchers, students, and other edtech specialists looking to stay up to date. This is a living document, so be sure to come back for regular updates.


Another Workshop for Faculty and Staff — from aiedusimplified.substack.com by Lance Eaton
A recent workshop with some adjustments.

The day started out with a short talk about AI (slides). Some of it is my usual schtick where I do a bit of Q&A with folks around myths and misunderstandings of generative AI in order to establish some common ground. These are often useful both in setting the tone and giving folks a sense of how I come to explore generative AI: with a mixture of humor, concern, curiosity, and of course, cat pics.

From there, we launched into a series of mini-workshops where folks had time to first play around with some previously created prompts around teaching and learning before moving on to prompts for administrative work. The prompts and other support materials are in this Workshop Resource Document. The goal was to get them using one or more AI tools with some useful prompts so they could learn more about the tools’ capabilities.


The Edtech Insiders Rundown of ASU+GSV 2024 — from edtechinsiders.substack.com by Sarah Morin, Alex Sarlin, and Ben Kornell
And more on Edtech Insiders+, upcoming events, Gauth, AI Reading Tutors, The Artificial Intelligence Interdisciplinary Institute, and TeachAI Policy Resources

Alex Sarlin

4. Everyone is Edtech Now
This year, in addition to investors, entrepreneurs, educators, school leaders, university admins, non-profits, publishers, and operators from countless edtech startups and incumbents, there were some serious big tech companies in attendance like Meta, Google, OpenAI, Microsoft, Amazon, Tiktok, and Canva. Additionally, a horde of management consultancies, workforce organizations, mental health orgs, and filmmakers were in attendance.

Edtech continues to expand as an industry category and everyone is getting involved.


Ep 18 | Rethinking Education, Lessons to Unlearn, Become a Generalist, & More — Ana Lorena Fábrega — from mishadavinci.substack.com by Misha da Vinci

It was such a delight to chat with Ana. She’s brilliant and passionate, a talented educator, and an advocate for better ways of learning for children and adults. We cover ways to transform schools so that students get real-world skills, learn resilience and how to embrace challenges, and are prepared for an unpredictable future. And we go hard on why we must keep learning no matter our age, become generalists, and leverage technology in order to adapt to the fast-changing world.

Misha also featured an item re: the future of schooling and it contained this graphic:


Texas is replacing thousands of human exam graders with AI — from theverge.com by Jess Weatherbed

The Texas Tribune reports an “automated scoring engine” that utilizes natural language processing — the technology that enables chatbots like OpenAI’s ChatGPT to understand and communicate with users — is being rolled out by the Texas Education Agency (TEA) to grade open-ended questions on the State of Texas Assessments of Academic Readiness (STAAR) exams. The agency is expecting the system to save $15–20 million per year by reducing the need for temporary human scorers, with plans to hire under 2,000 graders this year compared to the 6,000 required in 2023.
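The article does not describe how TEA's engine actually works, but the general idea of NLP-based scoring of open-ended answers can be made concrete with something as simple as comparing a student response to rubric exemplars. The sketch below uses TF-IDF cosine similarity and assumes scikit-learn is installed; it is a deliberately basic stand-in, not the STAAR scoring system, and the exemplar answers are made up for illustration.

```python
# Illustrative only -- NOT the TEA/STAAR scoring engine. A toy example of
# scoring an open-ended answer by similarity to rubric exemplar answers,
# using TF-IDF cosine similarity (assumes scikit-learn is installed).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical exemplar answers keyed by the score a human rubric assigns them.
exemplars = {
    2: "Photosynthesis converts sunlight, water, and carbon dioxide into glucose and oxygen.",
    1: "Plants use sunlight to make food.",
    0: "Plants grow in the ground.",
}


def score_response(student_answer: str) -> int:
    """Return the score of the exemplar most similar to the student answer."""
    texts = [student_answer] + list(exemplars.values())
    tfidf = TfidfVectorizer().fit_transform(texts)
    sims = cosine_similarity(tfidf[0:1], tfidf[1:]).flatten()
    best = sims.argmax()
    return list(exemplars.keys())[best]


print(score_response("Plants turn sunlight and CO2 into sugar and release oxygen."))
```

A real high-stakes scoring engine would be trained and validated on thousands of human-scored responses and audited for reliability, with humans still in the loop; the point here is only to show the underlying mechanics.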


Debating About AI: An Easy Path to AI Awareness and Basic Literacy — from stefanbauschard.substack.com by Stefan Bauschard
If you are an organization committed to AI literacy, consider sponsoring some debate topics and/or debates next year and expose thousands of students to AI literacy.

Resolved: Teachers should integrate generative AI in their teaching and learning.

The topic is simple but raises an issue that students can connect with.

While helping my students prepare and judging debates, I saw students demonstrate an understanding of many key issues and controversies.

These included—

*AI writing assessment/grading
*Bias
*Bullying
*Cognitive load
*Costs of AI systems
*Declining test scores
*Deep fakes
*Differentiation
*Energy consumption
*Hallucinations
*Human-to-human connection
*Inequality and inequity in access
*Neurodiversity
*Personalized learning
*Privacy
*Regulation (lack thereof)
*The future of work and unemployment
*Saving teachers time
*Soft skills
*Standardized testing
*Student engagement
*Teacher awareness and AI training; training resource trade-offs
*Teacher crowd-out
*Transparency and explainability
*Writing detectors (students had an exaggerated sense of the workability of these tools).

 

Meeting Students’ Needs for Emotional Support — from edutopia.org by Zi Jia Ng
A new survey finds that a large percentage of students don’t feel that they have an adult to turn to at school when they’re troubled.

Only 55 percent of elementary school students (grades three through five), 42 percent of middle school students, and 40 percent of high school students in the United States have an adult at school they can talk to when they feel upset or stressed, according to a survey of more than 200,000 students across 20 different states. At every age, students benefit from a hand to hold, an ear to listen, and a heart to understand them.

Here’s one strategy for helping to ensure that every student has a trusted adult at school.


Getting Middle and High School Students With Low Grades Back on Track — from edutopia.org by Christine Boatman
By sitting down with students and laying out just what they need to do to pass, teachers can give them the tools to succeed.

AN ANTIDOTE TO PROCRASTINATION
There are effective preventive measures that teachers can take to support middle and high school students with time-management and organizational skills. Still, some students inevitably may find themselves behind at the end of the semester and need individualized Tier 2 interventions as a result of their procrastination.

A Tier 2 strategy that teachers can use to support student efforts to pass classes during the end-of-the-semester scramble is the creation of individual PDSA (plan, do, study, act) cycles. A PDSA cycle is a process in which teachers and students work together to create a plan for improvement; implement, or do, the plan; study if the plan’s actions were successful; and act to create long-term improvement actions based on the results of the plan.

In PDSA cycles, teachers work with their students to create plans for success. These plans can be used either with a whole group or on an individual basis. Through working one-on-one with students this way, I’ve seen large gains in student achievement and agency.


A Student’s Perspective on Career and Interview Readiness — from gettingsmart.com by Tyler Robert and Todd Smith

Key Points

  • Sharing experiences in real-world learning is an asset when interviewing for early career opportunities.
  • Building confidence in not only being interviewed but also speaking about your skills in common language is a key part of creating effective pathways.

Asking Students What They Would Do If They Were The Teacher — from thebrokencopier.substack.com by Marcus Luther
one of my favorite practices we’ve normed in our classroom

Though it had been a bit since our previous check-in, the major drop in how students were doing overall was staggering—yet also very much tracked with the “vibe” of the classroom of late: students still feel pretty good about what we’re doing, but overall are exhausted and stressed, each in their own way but collectively as well.

My plan on Monday, then?

To share these results with the entire classroom followed by a simple question:

“If you were the teacher and you saw this feedback, what would you think and, more importantly, what would you do?”

And then I’ll listen to what they have to say.

Reflecting back on my own classroom over the years, though, too often the collecting of the feedback became a dead end as far as how students experienced this: they gave their results and then those results disappeared into the digital ether, in their eyes.


 

 

This week in 5 numbers: Education Department voices concern about OPMs — from highereddive.com by Natalie Schwartz
We’re rounding up our top recent stories, from growing worries about 2U’s finances to falling FAFSA submissions from high school seniors.

BY THE NUMBERS

$1.5 billion
The accumulated deficit that 2U has racked up following years of operating losses, according to its financial statements. Student advocacy groups recently called on the U.S. Department of Education to prepare for the “looming collapse” of the online program management company, though 2U has pushed back against those predictions.


5 ways to support today’s online learner — from insidetrack.org
How to help students feel seen, supported and connected as they pursue their programs online

  1. Make online learning learner-centered, demand-driven and career-advancing
  2. Help cultivate a sense of belonging
  3. Reduce barriers to online learning

 

 

Beyond the Hype: Taking a 50 Year Lens to the Impact of AI on Learning — from nafez.substack.com by Nafez Dakkak and Chris Dede
How do we make sure LLMs are not “digital duct tape”?

[Per Chris Dede] We often think of the product of teaching as the outcome (e.g. an essay, a drawing, etc.). The essence of education, in my view, lies not in the products or outcomes of learning but in the journey itself. The artifact is just a symbol that you’ve taken the journey.

The process of learning — the exploration, challenges, and personal growth that occur along the way — is where the real value lies. For instance, the act of writing an essay is valuable not merely for the final product but for the intellectual journey it represents. It forces you to improve and organize your thinking on a subject.

This distinction becomes important with the rise of generative AI, because it uniquely allows us to produce these artifacts without taking the journey.

As I’ve argued previously, I am worried that all this hype around LLMs renders them a “type of digital duct-tape to hold together an obsolete industrial-era educational system”. 


Speaking of AI in our learning ecosystems, also see:


On Building an AI Policy for Teaching & Learning — by Lance Eaton
How students drove the development of a policy for students and faculty

Well, last month, the policy was finally approved by our Faculty Curriculum Committee and we can finally share the final version: AI Usage Policy. College Unbound also created (all-human, no AI used) a press release with the policy and some of the details.

To ensure you see this:

  • Usage Guidelines for AI Generative Tools at College Unbound
    These guidelines were created and reviewed by College Unbound students in Spring 2023 with the support of Lance Eaton, Director of Faculty Development & Innovation.  The students include S. Fast, K. Linder-Bey, Veronica Machado, Erica Maddox, Suleima L., Lora Roy.

ChatGPT hallucinates fake but plausible scientific citations at a staggering rate, study finds — from psypost.org by Eric W. Dolan

A recent study has found that scientific citations generated by ChatGPT often do not correspond to real academic work. The study, published in the Canadian Psychological Association’s Mind Pad, found that “false citation rates” across various psychology subfields ranged from 6% to 60%. Surprisingly, these fabricated citations feature elements such as legitimate researchers’ names and properly formatted digital object identifiers (DOIs), which could easily mislead both students and researchers.

MacDonald found that a total of 32.3% of the 300 citations generated by ChatGPT were hallucinated. Despite being fabricated, these hallucinated citations were constructed with elements that appeared legitimate — such as real authors who are recognized in their respective fields, properly formatted DOIs, and references to legitimate peer-reviewed journals.
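One practical takeaway is that machine-generated citations need to be verified before use. A lightweight spot-check is to resolve each DOI against the public Crossref API, which returns metadata for registered DOIs and a 404 for ones that do not exist. Here is a minimal sketch assuming the requests library; the DOI string is a placeholder, and a missing DOI is a strong but not conclusive signal of fabrication, since some legitimate works simply lack DOIs.

```python
# Spot-check whether a DOI actually resolves via the public Crossref API.
# A 404 strongly suggests a fabricated citation; a hit returns real metadata
# that can then be compared against the claimed title and authors.
import requests


def check_doi(doi: str) -> dict | None:
    """Return Crossref metadata for the DOI, or None if it is not registered."""
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
    if resp.status_code == 404:
        return None
    resp.raise_for_status()
    return resp.json()["message"]


# Placeholder DOI for illustration; substitute the DOI from the citation being checked.
meta = check_doi("10.1234/placeholder-doi")
if meta is None:
    print("DOI not found in Crossref -- possibly hallucinated.")
else:
    print("Registered title:", meta.get("title", ["<no title>"])[0])
```

Even when a DOI resolves, the returned title and authors should still be compared against the claimed citation, since a real DOI attached to the wrong paper is another common failure mode.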

 

Forbes 2024 AI 50 List: Top Artificial Intelligence Startups  — from forbes.com by Kenrick Cai

The artificial intelligence sector has never been more competitive. Forbes received some 1,900 submissions this year, more than double last year’s count. Applicants do not pay a fee to be considered and are judged for their business promise and technical usage of AI through a quantitative algorithm and qualitative judging panels. Companies are encouraged to share data on diversity, and our list aims to promote a more equitable startup ecosystem. But disparities remain sharp in the industry. Only 12 companies have women cofounders, five of whom serve as CEO, the same count as last year. For more, see our full package of coverage, including a detailed explanation of the list methodology, videos and analyses on trends in AI.


Adobe Previews Breakthrough AI Innovations to Advance Professional Video Workflows Within Adobe Premiere Pro — from news.adobe.com

  • New Generative AI video tools coming to Premiere Pro this year will streamline workflows and unlock new creative possibilities, from extending a shot to adding or removing objects in a scene
  • Adobe is developing a video model for Firefly, which will power video and audio editing workflows in Premiere Pro and enable anyone to create and ideate
  • Adobe previews early explorations of bringing third-party generative AI models from OpenAI, Pika Labs and Runway directly into Premiere Pro, making it easy for customers to draw on the strengths of different models within the powerful workflows they use every day
  • AI-powered audio workflows in Premiere Pro are now generally available, making audio editing faster, easier and more intuitive

Also relevant see:




 

AI RESOURCES AND TEACHING (Kent State University) — from aiadvisoryboards.wordpress.com

AI Resources and Teaching | Kent State University offers valuable resources for educators interested in incorporating artificial intelligence (AI) into their teaching practices. The university recognizes that the rapid emergence of AI tools presents both challenges and opportunities in higher education.

The AI Resources and Teaching page provides educators with information and guidance on various AI tools and their responsible use within and beyond the classroom. The page covers different areas of AI application, including language generation, visuals, videos, music, information extraction, quantitative analysis, and AI syllabus language examples.


A Cautionary AI Tale: Why IBM’s Dazzling Watson Supercomputer Made a Lousy Tutor — from the74million.org by Greg Toppo
With a new race underway to create the next teaching chatbot, IBM’s abandoned 5-year, $100M ed push offers lessons about AI’s promise and its limits.

For all its jaw-dropping power, Watson the computer overlord was a weak teacher. It couldn’t engage or motivate kids, inspire them to reach new heights or even keep them focused on the material — all qualities of the best mentors.

It’s a finding with some resonance to our current moment of AI-inspired doomscrolling about the future of humanity in a world of ascendant machines. “There are some things AI is actually very good for,” Nitta said, “but it’s not great as a replacement for humans.”

His five-year journey to essentially a dead-end could also prove instructive as ChatGPT and other programs like it fuel a renewed, multimillion-dollar experiment to, in essence, prove him wrong.

To be sure, AI can do sophisticated things such as generating quizzes from a class reading and editing student writing. But the idea that a machine or a chatbot can actually teach as a human can, he said, represents “a profound misunderstanding of what AI is actually capable of.” 

Nitta, who still holds deep respect for the Watson lab, admits, “We missed something important. At the heart of education, at the heart of any learning, is engagement. And that’s kind of the Holy Grail.”

From DSC:
This is why the vision that I’ve been tracking and working on has always said that HUMAN BEINGS will be necessary — they are key to realizing this vision. Along these lines, here’s a relevant quote:

Another crucial component of a new learning theory for the age of AI would be the cultivation of “blended intelligence.” This concept recognizes that the future of learning and work will involve the seamless integration of human and machine capabilities, and that learners must develop the skills and strategies needed to effectively collaborate with AI systems. Rather than viewing AI as a threat to human intelligence, a blended intelligence approach seeks to harness the complementary strengths of humans and machines, creating a symbiotic relationship that enhances the potential of both.

Per Alexander “Sasha” Sidorkin, Head of the National Institute on AI in Society at California State University Sacramento.

 

Addressing equity and ethics in artificial intelligence — from apa.org by Zara Abrams
Algorithms and humans both contribute to bias in AI, but AI may also hold the power to correct or reverse inequities among humans

“The conversation about AI bias is broadening,” said psychologist Tara Behrend, PhD, a professor at Michigan State University’s School of Human Resources and Labor Relations who studies human-technology interaction and spoke at CES about AI and privacy. “Agencies and various academic stakeholders are really taking the role of psychology seriously.”


NY State Bar Association Joins Florida and California on AI Ethics Guidance – Suggests Some Surprising Implications — from natlawreview.com by James G. Gatto

The NY State Bar Association (NYSBA) Task Force on Artificial Intelligence has issued a nearly 80 page report (Report) and recommendations on the legal, social and ethical impact of artificial intelligence (AI) and generative AI on the legal profession. This detailed Report also reviews AI-based software, generative AI technology and other machine learning tools that may enhance the profession, but which also pose risks for individual attorneys’ understanding of new, unfamiliar technology, as well as courts’ concerns about the integrity of the judicial process. It also makes recommendations for NYSBA adoption, including proposed guidelines for responsible AI use. This Report is perhaps the most comprehensive report to date by a state bar association. It is likely this Report will stimulate much discussion.

For those of you who want the “Cliff Notes” version of this report, here is a table that summarizes by topic the various rules mentioned and a concise summary of the associated guidance.

The Report includes four primary recommendations:


 

 

 

AWS, Educause partner on generative AI readiness tool — from edscoop.com by Skylar Rispens
Amazon Web Services and the nonprofit Educause announced a new tool designed to help higher education institutions gauge their readiness to adopt generative artificial intelligence.

Amazon Web Services and the nonprofit Educause on Monday announced they’ve teamed up to develop a tool that assesses how ready higher education institutions are to adopt generative artificial intelligence.

Through a series of curated questions about institutional strategy, governance, capacity and expertise, AWS and Educause claim their assessment can point to ways that operations can be improved before generative AI is adopted to support students and staff.

“Generative AI will transform how educators engage students inside and outside the classroom, with personalized education and accessible experiences that provide increased student support and drive better learning outcomes,” Kim Majerus, vice president of global education and U.S. state and local government at AWS, said in a press release. “This assessment is a practical tool to help colleges and universities prepare their institutions to maximize this technology and support students throughout their higher ed journey.”


Speaking of AI and our learning ecosystems, also see:

Gen Z Wants AI Skills And Businesses Want Workers Who Can Apply AI: Higher Education Can Help — from forbes.com by Bruce Dahlgren

At a moment when the value of higher education has come under increasing scrutiny, institutions around the world can be exactly what learners and employers both need. To meet the needs of a rapidly changing job market and equip learners with the technical and ethical direction needed to thrive, institutions should familiarize students with the use of AI and nurture the innately human skills needed to apply it ethically. Failing to do so can create enormous risk for higher education, business and society.

What is AI literacy?
To effectively utilize generative AI, learners will need to grasp the appropriate use cases for these tools, understand when their use presents significant downside risk, and learn to recognize abuse to separate fact from fiction. AI literacy is a deeply human capacity. The critical thinking and communication skills required are muscles that need repeated training to be developed and maintained.

 
© 2025 | Daniel Christian