AI RESOURCES AND TEACHING (Kent State University) — from aiadvisoryboards.wordpress.com

Kent State University's AI Resources and Teaching page offers valuable resources for educators interested in incorporating artificial intelligence (AI) into their teaching practices. The university recognizes that the rapid emergence of AI tools presents both challenges and opportunities in higher education.

The AI Resources and Teaching page provides educators with information and guidance on various AI tools and their responsible use within and beyond the classroom. The page covers different areas of AI application, including language generation, visuals, videos, music, information extraction, quantitative analysis, and AI syllabus language examples.


A Cautionary AI Tale: Why IBM’s Dazzling Watson Supercomputer Made a Lousy Tutor — from the74million.org by Greg Toppo
With a new race underway to create the next teaching chatbot, IBM’s abandoned 5-year, $100M ed push offers lessons about AI’s promise and its limits.

For all its jaw-dropping power, Watson the computer overlord was a weak teacher. It couldn’t engage or motivate kids, inspire them to reach new heights or even keep them focused on the material — all qualities of the best mentors.

It’s a finding with some resonance to our current moment of AI-inspired doomscrolling about the future of humanity in a world of ascendant machines. “There are some things AI is actually very good for,” Nitta said, “but it’s not great as a replacement for humans.”

His five-year journey to essentially a dead-end could also prove instructive as ChatGPT and other programs like it fuel a renewed, multimillion-dollar experiment to, in essence, prove him wrong.

To be sure, AI can do sophisticated things such as generating quizzes from a class reading and editing student writing. But the idea that a machine or a chatbot can actually teach as a human can, he said, represents “a profound misunderstanding of what AI is actually capable of.” 

Nitta, who still holds deep respect for the Watson lab, admits, “We missed something important. At the heart of education, at the heart of any learning, is engagement. And that’s kind of the Holy Grail.”

From DSC:
This is why the vision that I’ve been tracking and working on has always said that HUMAN BEINGS will be necessary — they are key to realizing this vision. Along these lines, here’s a relevant quote:

Another crucial component of a new learning theory for the age of AI would be the cultivation of “blended intelligence.” This concept recognizes that the future of learning and work will involve the seamless integration of human and machine capabilities, and that learners must develop the skills and strategies needed to effectively collaborate with AI systems. Rather than viewing AI as a threat to human intelligence, a blended intelligence approach seeks to harness the complementary strengths of humans and machines, creating a symbiotic relationship that enhances the potential of both.

Per Alexander “Sasha” Sidorkin, Head of the National Institute on AI in Society at California State University Sacramento.

 

AWS, Educause partner on generative AI readiness tool — from edscoop.com by Skylar Rispens
Amazon Web Services and the nonprofit Educause announced a new tool designed to help higher education institutions gauge their readiness to adopt generative artificial intelligence.

Amazon Web Services and the nonprofit Educause on Monday announced they’ve teamed up to develop a tool that assesses how ready higher education institutions are to adopt generative artificial intelligence.

Through a series of curated questions about institutional strategy, governance, capacity and expertise, AWS and Educause claim their assessment can point to ways that operations can be improved before generative AI is adopted to support students and staff.

“Generative AI will transform how educators engage students inside and outside the classroom, with personalized education and accessible experiences that provide increased student support and drive better learning outcomes,” Kim Majerus, vice president of global education and U.S. state and local government at AWS, said in a press release. “This assessment is a practical tool to help colleges and universities prepare their institutions to maximize this technology and support students throughout their higher ed journey.”


Speaking of AI and our learning ecosystems, also see:

Gen Z Wants AI Skills And Businesses Want Workers Who Can Apply AI: Higher Education Can Help — from forbes.com by Bruce Dahlgren

At a moment when the value of higher education has come under increasing scrutiny, institutions around the world can be exactly what learners and employers both need. To meet the needs of a rapidly changing job market and equip learners with the technical and ethical direction needed to thrive, institutions should familiarize students with the use of AI and nurture the innately human skills needed to apply it ethically. Failing to do so can create enormous risk for higher education, business and society.

What is AI literacy?
To effectively utilize generative AI, learners will need to grasp the appropriate use cases for these tools, understand when their use presents significant downside risk, and learn to recognize abuse to separate fact from fiction. AI literacy is a deeply human capacity. The critical thinking and communication skills required are muscles that need repeated training to be developed and maintained.

 

The University Student’s Guide To Ethical AI Use  — from studocu.com; with thanks to Jervise Penton at 6XD Media Group for this resource

This comprehensive guide offers:

  • Up-to-date statistics on the current state of AI in universities and how institutions and students are currently using artificial intelligence
  • An overview of popular AI tools used in universities and their limitations as study aids
  • Tips on how to use AI ethically and how students can maximize its capabilities
  • Current penalties and punishments for cheating with AI
  • A checklist of questions to ask yourself before, during, and after an assignment to ensure ethical use

Some of the key facts you might find interesting are:

  • The market for AI in education is estimated to reach $53.68 billion by the end of 2032.
  • 68% of students say using AI has impacted their academic performance positively.
  • Educators using AI tools say the technology helps speed up their grading process by as much as 75%.
 

What Are AI Agents—And Who Profits From Them? — from every.to by Evan Armstrong
The newest wave of AI research is changing everything

I’ve spent months talking with founders, investors, and scientists, trying to understand what this technology is and who the players are. Today, I’m going to share my findings. I’ll cover:

  • What an AI agent is
  • The major players
  • The technical bets
  • The future

Agentic workflows are loops—they can run many times in a row without needing a human involved for each step in the task. A language model will make a plan based on your prompt, utilize tools like a web browser to execute on that plan, ask itself if that answer is right, and close the loop by getting back to you with that answer.
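
As a concrete illustration of that plan, act, reflect loop, here is a minimal Python sketch. The llm() and browse() helpers are hypothetical placeholders for a language-model call and a web-browsing tool; the point is the loop structure, not any particular vendor's API.

    # Minimal sketch of an agentic loop: plan, act with a tool, self-check, repeat.
    # llm() and browse() are hypothetical placeholders, not a specific vendor's API.

    def llm(prompt: str) -> str:
        """Placeholder for a call to a language model."""
        raise NotImplementedError("wire this to the model of your choice")

    def browse(query: str) -> str:
        """Placeholder for a web-browsing or search tool."""
        raise NotImplementedError("wire this to a browser or search tool")

    def run_agent(task: str, max_steps: int = 5) -> str:
        notes = ""
        for _ in range(max_steps):
            # 1. Plan: decide the next action based on the task and notes so far.
            plan = llm(f"Task: {task}\nNotes so far: {notes}\nWhat should I do next?")

            # 2. Act: use a tool (here, a browser) to carry out that plan.
            observation = browse(plan)
            notes += f"\nPlan: {plan}\nObservation: {observation}"

            # 3. Reflect: ask the model whether the task is answered yet.
            verdict = llm(
                f"Task: {task}\nNotes: {notes}\n"
                "If you can answer, reply 'ANSWER: <answer>'. Otherwise reply 'CONTINUE'."
            )
            # 4. Close the loop: return the answer to the user when it is ready.
            if verdict.startswith("ANSWER:"):
                return verdict[len("ANSWER:"):].strip()
        return "Step limit reached without a final answer."

The design point to notice is that the human appears only at the start (the prompt) and at the end (the answer); every pass through the loop plans, acts, and self-checks on its own.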

But agentic workflows are an architecture, not a product. It gets even more complicated when you incorporate agents into products that customers will buy.

Early reports of GPT-5 are that it is “materially better” and is being explicitly prepared for the use case of AI agents.

 

10 Things You Can Definitely Expect From The Future Of Healthcare AI — from medicalfuturist.com by Andrea Koncz
Artificial Intelligence promises material changes on both sides of the stethoscope, but this revolution won’t unfold on its own.

Key Takeaways

  • From unlocking hidden biomarkers to streamlining administrative burdens, AI will improve patient care and redefine the role of physicians.
  • Technology can serve as a powerful tool, but healthcare remains a fundamentally human endeavor.
  • This technological revolution won't unfold on its own; it requires collaboration between physicians, technologists, regulators, and patients.
 

Nvidia’s AI boom is only getting started. Just ask CEO Jensen Huang — from fastcompany.com by Harry McCracken
Nvidia’s chips sparked the AI revolution. Now it’s in the business of putting the technology to work in an array of industries.

Nvidia is No. 1 on Fast Company’s list of the World’s 50 Most Innovative Companies of 2024. Explore the full list of companies that are reshaping industries and culture.

Nvidia isn’t just in the business of providing ever-more-powerful computing hardware and letting everybody else figure out what to do with it. Across an array of industries, the company’s technologies, platforms, and partnerships are doing much of the heavy lifting of putting AI to work. In a single week in January 2024, for instance, Nvidia reported that it had begun beta testing its drug discovery platform, demoed software that lets video game characters speak unscripted dialogue, announced deals with four Chinese EV manufacturers that will incorporate Nvidia technology in their vehicles, and unveiled a retail-industry partnership aimed at foiling organized shoplifting.


Johnson & Johnson MedTech Works With NVIDIA to Broaden AI’s Reach in Surgery — from blogs.nvidia.com by David Niewolny

AI — already used to connect, analyze and offer predictions based on operating room data — will be critical to the future of surgery, boosting operating room efficiency and clinical decision-making.

That’s why NVIDIA is working with Johnson & Johnson MedTech to test new AI capabilities for the company’s connected digital ecosystem for surgery. It aims to enable open innovation and accelerate the delivery of real-time insights at scale to support medical professionals before, during and after procedures.

J&J MedTech is in 80% of the world’s operating rooms and trains more than 140,000 healthcare professionals each year through its education programs.


GE and NVIDIA Join Forces to Accelerate Artificial Intelligence Adoption in Healthcare — from nvidianews.nvidia.com

  • New generation of intelligent medical devices will use world’s most advanced AI platform with the goal of improving patient care
  • GE Healthcare is the first medical device company to use the NVIDIA GPU Cloud
  • New Revolution Frontier CT, powered by NVIDIA, is two times faster for image processing, proving performance acceleration has begun

Nvidia Announces Major Deals With Healthcare Companies — from cheddar.com

At the GTC A.I. conference last week, Nvidia launched nearly two dozen new A.I.-powered, health care-focused tools and deals with companies Johnson & Johnson and GE Healthcare for surgery and medical imaging. The A.I. company's move into the health care space is an effort that's been under development for a decade.


Nvidia is now powering AI nurses — from gizmodo.com by Maxwell Zeff; via Claire Zau
The cheap AI agents offer medical advice to patients over video calls in real-time

 

How to Make the Dream of Education Equity (or Most of It) a Reality — from nataliewexler.substack.com by Natalie Wexler
Studies on the effects of tutoring–by humans or computers–point to ways to improve regular classroom instruction.

One problem, of course, is that it’s prohibitively expensive to hire a tutor for every average or struggling student, or even one for every two or three of them. This was the two-sigma “problem” that Bloom alluded to in the title of his essay: how can the massive benefits of tutoring possibly be scaled up? Both Khan and Zuckerberg have argued that the answer is to have computers, maybe powered by artificial intelligence, serve as tutors instead of humans.

From DSC:
I’m hoping that AI-backed learning platforms WILL help many people of all ages and backgrounds. But I realize — and appreciate what Natalie is saying here as well — that human beings are needed in the learning process (especially at younger ages). 

But without the human element, that’s unlikely to be enough. Students are more likely to work hard to please a teacher than to please a computer.

Natalie goes on to talk about training all teachers in cognitive science — a solid idea for sure. That’s what I was trying to get at with this graphic:

We need to take more of the research from learning science and apply it in our learning spaces.

But I'm not as hopeful about all teachers getting trained in cognitive science… it should have happened (in the Schools of Education and in the K-12 learning ecosystem at large) by now. Perhaps it will happen, given enough time.

And with more homeschooling and blended programs of education occurring, that idea gets stretched even further. 

K-12 Hybrid Schooling Is in High Demand — from realcleareducation.com by Keri D. Ingraham (emphasis below from DSC); via GSV

Parents are looking for a different kind of education for their children. A 2024 poll of parents reveals that 72% are considering, 63% are searching for, and 44% have selected a new K-12 school option for their children over the past few years. So, what type of education are they seeking?

Additional polling data reveals that 49% of parents would prefer their child learn from home at least one day a week. While 10% want full-time homeschooling, the remaining 39% of parents desire their child to learn at home one to four days a week, with the remaining days attending school on-campus. Another parent poll released this month indicates that an astonishing 64% of parents indicated that if they were looking for a new school for their child, they would enroll him or her in a hybrid school.

 

GTC March 2024 Keynote with NVIDIA CEO Jensen Huang


 


[Report] Generative AI Top 150: The World’s Most Used AI Tools (Feb 2024) — from flexos.work by Daan van Rossum
FlexOS.work surveyed Generative AI platforms to reveal which get used most. While ChatGPT reigns supreme, countless AI platforms are used by millions.

As the FlexOS research study “Generative AI at Work” concluded based on a survey amongst knowledge workers, ChatGPT reigns supreme.

2. AI Tool Usage is Way Higher Than People Expect – Beating Netflix, Pinterest, Twitch.
As measured by data analysis platform Similarweb based on global web traffic tracking, the AI tools in this list generate over 3 billion monthly visits.

With 1.67 billion visits, ChatGPT represents over half of this traffic and is already bigger than Netflix, Microsoft, Pinterest, Twitch, and The New York Times.



Artificial Intelligence Act: MEPs adopt landmark law — from europarl.europa.eu

  • Safeguards on general purpose artificial intelligence
  • Limits on the use of biometric identification systems by law enforcement
  • Bans on social scoring and AI used to manipulate or exploit user vulnerabilities
  • Right of consumers to launch complaints and receive meaningful explanations


The untargeted scraping of facial images from CCTV footage to create facial recognition databases will be banned.


A New Surge in Power Use Is Threatening U.S. Climate Goals — from nytimes.com by Brad Plumer and Nadja Popovich
A boom in data centers and factories is straining electric grids and propping up fossil fuels.

Something unusual is happening in America. Demand for electricity, which has stayed largely flat for two decades, has begun to surge.

Over the past year, electric utilities have nearly doubled their forecasts of how much additional power they’ll need by 2028 as they confront an unexpected explosion in the number of data centers, an abrupt resurgence in manufacturing driven by new federal laws, and millions of electric vehicles being plugged in.


OpenAI and the Fierce AI Industry Debate Over Open Source — from bloomberg.com by Rachel Metz

The tumult could seem like a distraction from the startup’s seemingly unending march toward AI advancement. But the tension, and the latest debate with Musk, illuminates a central question for OpenAI, along with the tech world at large as it’s increasingly consumed by artificial intelligence: Just how open should an AI company be?

The meaning of the word “open” in “OpenAI” seems to be a particular sticking point for both sides — something that you might think sounds, on the surface, pretty clear. But actual definitions are both complex and controversial.


Researchers develop AI-driven tool for near real-time cancer surveillance — from medicalxpress.com by Mark Alewine; via The Rundown AI
Artificial intelligence has delivered a major win for pathologists and researchers in the fight for improved cancer treatments and diagnoses.

In partnership with the National Cancer Institute, or NCI, researchers from the Department of Energy’s Oak Ridge National Laboratory and Louisiana State University developed a long-sequenced AI transformer capable of processing millions of pathology reports to provide experts researching cancer diagnoses and management with exponentially more accurate information on cancer reporting.


 

Also see:

Cognition Labs Blog

 

A View into the Generative AI Legal Landscape 2024 — from law.stanford.edu by Megan Ma, Aparna Sinha,  Ankit Tandon, & Jennifer Richards

Excerpt (emphasis DSC):

Some key observations and highlights:

  1. Emerging technical solutions are addressing the main challenges of using Generative AI in legal applications, such as lack of consistency and accuracy, limited explainability, privacy concerns, and difficulty in obtaining and training models on legal domain data.
  2. Structural impediments in the legal industry, such as the billable hour, lack of standardization, vendor dependence, and incumbent control, moderate the success of generative AI startups.
  3. Our defined “client-facing” LegalTech market is segmented into three broad lines of work: Research and Analysis, Document Review and Drafting, and Litigation. We view the total LegalTech market in the United States to be estimated at ~$13B in 2023, with litigation being the largest category.
  4. LegalTech incumbents play a significant role in the adoption of generative AI technologies, often opting for market consolidation through partnerships or acquisitions rather than building solutions organically.
  5. Future evolution in LegalTech may involve specialization in areas such as patent and IP, immigration, insurance, and regulatory compliance. There is also potential for productivity tools and access to legal services, although the latter faces structural challenges related to the Unauthorized Practice of Law (UPL).

Fresh Voices on Legal Tech with Tessa Manuello — from legaltalknetwork.com by Dennis Kennedy and Tom Mighell

EPISODE NOTES
Creative thinking and design elements can help you elevate your legal practice and develop more meaningful solutions for clients. Dennis and Tom welcome Tessa Manuello to discuss her insights on legal technology with a particular focus on creative design adaptations for lawyers. Tessa discusses the tech learning process for attorneys and explains how a more creative approach for both learning and implementing tech can help lawyers make better use of current tools, AI included.


International Women’s Day: Kriti Sharma Calls for More Women Working in AI, LegalTech — from legalcurrent.com

In honor of International Women’s Day, Sharma discusses on LinkedIn the need for more female role models in the tech sector as AI opens up traditional career pathways and creates opportunities to welcome more women to the space.

Sharma invited Thomson Reuters female leaders working in legal technology to share their perspectives, including Rawia Ashraf, Emily Colbert, and Anu Dodda.


 

This week in 5 numbers: Another faith-based college plans to close — by Natalie Schwartz
We’re rounding up some of our top recent stories, from Notre Dame College’s planned closure to Valparaiso’s potential academic cuts.

BY THE NUMBERS

  • 1,444
    The number of students who were enrolled at Notre Dame College in fall 2022, down 37% from 2014. The Roman Catholic college recently said it would close after the spring term, citing declining enrollment, along with rising costs and significant debt.
  • 28
    The number of academic programs that Valparaiso University may eliminate. Eric Johnson, the Indiana institution’s provost, said it offers too many majors, minors and graduate degrees in relation to its enrollment.

A couple of other items re: higher education caught my eye:

Universities Expect to Use More Tech in Future Classrooms—but Don’t Know How — from insidehighered.com by Lauren Coffey

University administrators see the need to implement education technology in their classrooms but are at a loss regarding how to do so, according to a new report.

The College Innovation Network released its first CIN Administrator EdTech survey today, which revealed that more than half (53 percent) of the 214 administrators surveyed do not feel extremely confident in choosing effective ed-tech products for their institutions.

“While administrators are excited about offering new ed-tech tools, they are lacking knowledge and data to help them make informed decisions that benefit students and faculty,” Omid Fotuhi, director of learning and innovation at WGU Labs, which funds the network, said in a statement.

From DSC:
I always appreciated our cross-disciplinary team at Calvin (then College). As we looked at enhancing our learning spaces, we had input from the Teaching & Learning Group, IT, A/V, the academic side of the house, and facilities. It was definitely a team-based approach. (As I think about it, it would have been helpful to have more channels for student feedback as well.)


Per Jeff Selingo:

Optionality. In my keynote, I pointed out that the academic calendar and credit hour in higher ed are like “shelf space” on the old television schedule that has been upended by streaming. In much the same way, we need similar optionality to meet the challenges of higher ed right now: in how students access learning (in-person, hybrid, online) to credentials (certificates, degrees) to how those experiences stack together for lifelong learning.

Culture in institutions. The common thread throughout the conference was how the culture of institutions (both universities and governments) needs to change so our structures and practices can evolve. Too many people in higher ed right now are employing a scarcity mindset and seeing every change as a zero-sum game. If you're not happy about the present, as many attendees suggested, you're not going to be excited about the future.

 

How a Hollywood Director Uses AI to Make Movies — from every.to by Dan Shipper
Dave Clarke shows us the future of AI filmmaking

Dave told me that he couldn’t have made Borrowing Time without AI—it’s an expensive project that traditional Hollywood studios would never bankroll. But after Dave’s short went viral, major production houses approached him to make it a full-length movie. I think this is an excellent example of how AI is changing the art of filmmaking, and I came out of this interview convinced that we are on the brink of a new creative age.

We dive deep into the world of AI tools for image and video generation, discussing how aspiring filmmakers can use them to validate their ideas, and potentially even secure funding if they get traction. Dave walks me through how he has integrated AI into his movie-making process, and as we talk, we make a short film featuring Nicolas Cage using a haunted roulette ball to resurrect his dead movie career, live on the show.

 

Ecosystems for the future of learning — from thebigidea.education-reimagined.org by Education Reimagined and the History Co:Lab

The intent of this report is to help communities build their capacity for transformation of education, advancing toward what our society needs most—a system that works for young people. It draws on the experiences and insights of innovators across the United States who are already answering this challenge—creating learner-centered, community-based ecosystems.

This report includes:

  • a landscape analysis of select communities creating learning ecosystems;
  • a framework that emerged from the analysis and can be used by communities to consider their readiness and appetite for this transformation;
  • an invitation to communities to explore and discover their own path for reimagining education; and
  • a call for national and regional institutions to listen, learn from, and create the conditions for communities to pursue their visions.

From DSC:
The above item was accessed via the article below:

Where Does Work to Imagine a Learner-Centered Ecosystem Begin? — from gettingsmart.com by Alin Bennett

Key Points

  • The Norris School District in Wisconsin exemplifies how learner profiles and community connections can enhance authentic learning experiences for young people, fostering a culture of belonging and responsibility.
  • Purdue Polytechnic High School demonstrates the importance of enabling conditions, such as creating microschools with access to shared services, to support a learner-centered approach while ensuring scalability and access to a variety of resources.
 

Using Generative AI throughout the Institution — from aiedusimplified.substack.com by Lance Eaton
8 lightning talks on generative AI and how to use it across higher education


The magic of AI to help educators save time. — from magicschool.ai; via Mrs. Kendall Sajdak


Getting Better Results out of Generative AI — from aiedusimplified.substack.com by Lance Eaton
The prompt to use before you prompt generative AI

Last month, I discussed a GPT that I had created around enhancing prompts. Since then, I have been actively using my Prompt Enhancer GPT to produce much more effective outputs. Last week, I did a series of mini-talks on generative AI in different parts of higher education (faculty development, human resources, grants, executive leadership, etc.) and structured each as "5 tips". I included a final bonus tip in all of them; I heard from many afterwards that it was probably the most useful tip, especially because you can only access the Prompt Enhancer GPT if you are paying for ChatGPT.
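
For readers who don't have access to the Prompt Enhancer GPT, the underlying idea can be approximated with two model calls: one that rewrites a rough prompt into a sharper one, and one that answers the improved prompt. Below is a minimal sketch assuming the OpenAI Python client; the enhancer instructions are my own illustrative guess, not Lance Eaton's actual GPT configuration.

    # A minimal "prompt before you prompt" sketch. Assumes the openai Python package
    # is installed and OPENAI_API_KEY is set; the enhancer instructions are illustrative.
    from openai import OpenAI

    client = OpenAI()

    ENHANCER_INSTRUCTIONS = (
        "Rewrite the user's rough prompt so it explicitly states the role, audience, "
        "desired format, and constraints. Return only the improved prompt."
    )

    def enhance_and_ask(rough_prompt: str, model: str = "gpt-4o") -> str:
        # Step 1: ask the model to improve the prompt itself.
        improved = client.chat.completions.create(
            model=model,
            messages=[
                {"role": "system", "content": ENHANCER_INSTRUCTIONS},
                {"role": "user", "content": rough_prompt},
            ],
        ).choices[0].message.content

        # Step 2: send the improved prompt as the real request.
        response = client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": improved}],
        )
        return response.choices[0].message.content

    # Example: enhance_and_ask("help me write a rubric for a first-year essay")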


Exploring the Opportunities and Challenges with Generative AI — from er.educause.edu by Veronica Diaz

Effectively integrating generative AI into higher education requires policy development, cross-functional engagement, ethical principles, risk assessments, collaboration with other institutions, and an exploration of diverse use cases.


Creating Guidelines for the Use of Gen AI Across Campus — from campustechnology.com by Rhea Kelly
The University of Kentucky has taken a transdisciplinary approach to developing guidelines and recommendations around generative AI, incorporating input from stakeholders across all areas of the institution. Here, the director of UK’s Center for the Enhancement of Learning and Teaching breaks down the structure and thinking behind that process.

That resulted in a set of instructional guidelines that we released in August of 2023 and updated in December of 2023. We’re also looking at guidelines for researchers at UK, and we’re currently in the process of working with our colleagues in the healthcare enterprise, UK Healthcare, to comb through the additional complexities of this technology in clinical care and to offer guidance and recommendations around those issues.


From Mean Drafts to Keen Emails — from automatedteach.com by Graham Clay

My experiences match the results of the above studies. The second study cited above found that 83% of those students who haven't used AI tools are "not interested in using them," so it is no surprise that many students have little awareness of their nature. The third study cited above found that, "apart from 12% of students identifying as daily users," most students' use cases were "relatively unsophisticated," like summarizing or paraphrasing text.

For those of us in the AI-curious bubble, we need to continually work to stay current, but we also need to recognize that what we take to be “common knowledge” is far from common outside of the bubble.


What do superintendents need to know about artificial intelligence? — from k12dive.com by Roger Riddell
District leaders shared strategies and advice on ethics, responsible use, and the technology’s limitations at the National Conference on Education.

Despite general familiarity, however, technical knowledge shouldn’t be assumed for district leaders or others in the school community. For instance, it’s critical that any materials related to AI not be written in “techy talk” so they can be clearly understood, said Ann McMullan, project director for the Consortium for School Networking’s EmpowerED Superintendents Initiative.

To that end, CoSN, a nonprofit that promotes technological innovation in K-12, has released an array of AI resources to help superintendents stay ahead of the curve, including a one-page explainer that details definitions and guidelines to keep in mind as schools work with the emerging technology.


 
© 2024 | Daniel Christian