Can new AI help to level up the scales of justice? — from gtlaw.com.au by Peter Waters, Jason Oliver, and David Baddeley

So asks a recent study by two academics from Stanford Law School, David Freeman Engstrom and Nora Freeman Engstrom, on the potential impact of AI on the civil litigation landscape in the US.

It is against this landscape, the study observes, that champions of legal tech have suggested an opportunity to “democratise” litigation and put litigation’s “haves” and “have nots” on a more equal footing: arming smaller firms and sole practitioners with the tools necessary to do battle against their better-resourced opponents, and cutting the cost of legal services to put lawyers within reach of a wider swathe of people.

But is this a real opportunity, and will AI be key to its realisation?

Yet while AI may reduce the justice gap between the “haves” and “have-nots” of litigation, it could also exacerbate existing inequalities.

From DSC:
While this article approaches things from the lawyer’s viewpoint, I’d like to see this question and the use of AI from the common man’s/woman’s viewpoint. Why? In order to provide FAR GREATER access to justice (#A2J) for those who can’t afford a lawyer as they head into the civil law courtrooms.

  • Should I take my case to court? Do I have a chance to win this case? If so, how?
  • What forms do I need to complete if I’m going to go to court?
  • When and how do I address the judge?
  • What does my landlord have to do?
  • How do I prevent myself from falling into a debt-collection mess and/or what options do I have to get out of this mess?
  • Are there any lawyers in my area who would take my case on a pro bono basis?
  • …and judges and lawyers — as well as former litigants — could add many more questions (and answers) to this list

Bottom line:
It is my hope that technology can help increase access to justice.


Also relevant/see:

Virtual Justice? Exploring AI’s impact on legal accessibility — from nortonrosefulbright.com by Chris Owen and Mary-Frances Murphy

Excerpt (emphasis DSC):

A number of products are already under development, or have been launched. One example is a project that Norton Rose Fulbright is working on, together with not-for-profit legal service Justice Connect. The scope is to develop an automated natural language processing AI model that seeks to interpret the ‘everyday’ language used by clients in order to identify the client’s legal issues and correctly diagnose their legal problem. This tool is aimed at addressing the struggles that individuals often face in deciphering legal jargon and understanding the nature of their legal issue and the type of lawyer, or legal support, they need to resolve that problem.
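The article doesn’t describe the implementation, but the core idea — mapping a client’s everyday-language description to a legal problem category — can be sketched. The sketch below is a deliberately trivial keyword-scoring stand-in, not Justice Connect’s actual model (which the article describes as a trained natural language processing system); the categories, keywords, and function name are all hypothetical illustrations:

```python
# Illustrative sketch only: a keyword-based legal triage classifier.
# A production system like the one described above would use a trained
# NLP model; the categories and keywords here are made up for illustration.

LEGAL_CATEGORIES = {
    "tenancy": ["landlord", "rent", "evict", "lease", "bond"],
    "employment": ["fired", "boss", "wages", "unfair dismissal"],
    "debt": ["debt", "owe", "collector", "repay", "bankrupt"],
}

def triage(client_description: str) -> str:
    """Return the category whose keywords best match the description."""
    text = client_description.lower()
    # Score each category by how many of its keywords appear in the text.
    scores = {
        category: sum(word in text for word in keywords)
        for category, keywords in LEGAL_CATEGORIES.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(triage("My landlord says I have to leave but I paid my rent"))  # tenancy
```

The value of such a tool is less the classification itself than the routing it enables: pointing a person toward the right kind of lawyer or legal support before they get lost in jargon.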

 

Unpacking 3 major trends in ed tech and for-profit education — from highereddive.com by Natalie Schwartz
CEOs of major companies recently told investors how they fared in their most recent financial quarters, offering insight into the broader higher ed sector.

Education companies double down on degree programs

These programs allow Coursera users to count open courses they complete on the platform toward credit for degree programs. Students can also be admitted to degree programs based on their performance in these courses, Maggioncalda said.

Coursera recently announced it had built several of these pathways to master’s degrees offered by Illinois Tech. Coursera users can now complete professional certificates offered on the website — including from Google, IBM and Meta — as credit toward these programs.


Report Finds Students Struggling with Being Prepared for Courses and Increasingly Turning to Generative AI, Social Media to Study — from campustechnology.com by Kate Lucariello

In its second annual 2023 “Study Trends Report,” McGraw Hill found that college students were feeling unprepared for their courses but have turned to generative AI and social media to study, and would like more learning resources in a similar format.

The study, conducted by Morning Consult between July 18 and Aug. 11, 2023, surveyed 500 undergraduate college students and 200 college instructors. Some of the key findings include:


The Plot To Kill Shop Class — by Ryan Craig

I suspect College Board may be trying to repent for its original sin: killing vocational education. America’s college-or-bust mentality has long relegated vocational education, now known as career and technical education (CTE), to a shadowy corner of high school.

But make no mistake: the College Board’s fingerprints are on the weapon that killed CTE. College Board launched Advanced Placement courses in 1955 with 500 students across 18 elite schools like Andover, Bronx Science, and Newton High School. The original idea was guiltless: more challenging curricula for gifted and talented students to accelerate the development of leaders and win the Cold War. But it soon became clear that AP’s primary purpose would be to give students a leg up in competitive college admissions; as early as 1960, Exeter worried about “a dangerous tendency to regard advanced placement teachers and students as an elite worthy of special praise.”

When College Board’s primary source of revenue (and profits) is AP courses and demand for AP is driven by a weighted GPA formula that discriminates against all other forms of education, any attempt to create a level playing field between career discovery and college is window dressing: CTE theater. College Board knows which side its bread is buttered on (hint: it’s in its name).


2U, USC Curtail Online Partnership — from insidehighered.com by Doug Lederman
Southern California and the online program manager will part ways on master’s degrees that became a target of scrutiny because of their high price.

Which makes it fitting, perhaps, that on Thursday 2U and USC announced that they would largely wind down their 15-year partnership, which in the eyes of consumer advocates and some journalists had come to exemplify how involving companies intimately in the delivery of education could undermine, rather than expand, access to and affordability of higher education.


edX and Jobs for the Future Offer Free MicroBachelors Programs — from campustechnology.com by Kate Lucariello

Three MicroBachelors programs are currently available:

  • Statistics Fundamentals and Mathematics and Statistics Fundamentals from The London School of Economics;
  • Marketing Essentials and Business and Professional Communication for Success from Doane University; and
  • Full Stack Application Development from IBM.

PROOF POINTS: Professors say high school math doesn’t prepare most students for their college majors — from hechingerreport.org


 

Where a developing, new kind of learning ecosystem is likely headed [Christian]

From DSC:
As I’ve long stated on the Learning from the Living [Class]Room vision, we are heading toward a new AI-empowered learning platform — where humans play a critically important role in making this new learning ecosystem work.

Along these lines, I ran into this site out on X/Twitter. We’ll see how this unfolds, but it will be an interesting space to watch.

Project Chiron’s vision for education: “Every child will soon have a super-intelligent AI teacher by their side. We want to make sure they instill a love of learning in children.”


From DSC:
This future learning platform will also focus on developing skills and competencies. Along those lines, see:

Scale for Skills-First — from the-job.beehiiv.com by Paul Fain
An ed-tech giant’s ambitious moves into digital credentialing and learner records.

A Digital Canvas for Skills
Instructure was a player in the skills and credentials space before its recent acquisition of Parchment, a digital transcript company. But that $800M move made many observers wonder if Instructure can develop digital records of skills that learners, colleges, and employers might actually use broadly.

Ultimately, he says, the CLR approach will allow students to bring these various learning types into a coherent format for employers.

Instructure seeks a leadership role in working with other organizations to establish common standards for credentials and learner records, to help create consistency. The company collaborates closely with 1EdTech. And last month it helped launch the 1EdTech TrustEd Microcredential Coalition, which aims to increase quality and trust in digital credentials.

Paul also links to 1EdTech’s page regarding the Comprehensive Learner Record

 

The Beatles’ final song is now streaming thanks to AI — from theverge.com by Chris Welch
Machine learning helped Paul McCartney and Ringo Starr turn an old John Lennon demo into what’s likely the band’s last collaborative effort.


Scientists excited by AI tool that grades severity of rare cancer — from bbc.com by Fergus Walsh

Artificial intelligence is nearly twice as good at grading the aggressiveness of a rare form of cancer from scans as the current method, a study suggests.

By recognising details invisible to the naked eye, AI was 82% accurate, compared with 44% for lab analysis.

Researchers from the Royal Marsden Hospital and Institute of Cancer Research say it could improve treatment and benefit thousands every year.

They are also excited by its potential for spotting other cancers early.


Microsoft unveils ‘LeMa’: A revolutionary AI learning method mirroring human problem solving — from venturebeat.com by Michael Nuñez

Researchers from Microsoft Research Asia, Peking University, and Xi’an Jiaotong University have developed a new technique to improve large language models’ (LLMs) ability to solve math problems by having them learn from their mistakes, akin to how humans learn.

The researchers have revealed a pioneering strategy, Learning from Mistakes (LeMa), which trains AI to correct its own mistakes, leading to enhanced reasoning abilities, according to a research paper published this week.
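The paper’s pipeline is more involved (it uses GPT-4 to generate the corrections at scale), but the gist — turning flawed solutions into correction-style fine-tuning examples — can be sketched. The record format and field names below are my own illustrative assumptions, not the schema from the LeMa paper:

```python
# Minimal sketch of "learning from mistakes" style fine-tuning data:
# pair a flawed solution with an explanation of the error and a corrected
# solution, so a model can be trained to recognize and repair mistakes.
# Field names and prompt wording are illustrative, not from the paper.

def make_correction_example(question, wrong_solution,
                            error_explanation, correct_solution):
    prompt = (
        f"Question: {question}\n"
        f"A student's solution: {wrong_solution}\n"
        "Identify the mistake and give a corrected solution."
    )
    completion = (
        f"Mistake: {error_explanation}\n"
        f"Corrected solution: {correct_solution}"
    )
    return {"prompt": prompt, "completion": completion}

example = make_correction_example(
    question="What is 15% of 80?",
    wrong_solution="15% of 80 = 80 / 15 = 5.33",
    error_explanation="Taking 15% means multiplying by 0.15, not dividing by 15.",
    correct_solution="0.15 * 80 = 12",
)
print(example["prompt"])
```

A corpus of such records, fine-tuned alongside ordinary correct-solution data, is the kind of mistake-correction signal the LeMa work describes.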

Also from Michael Nuñez at venturebeat.com, see:


GPTs for all, AzeemBot; conspiracy theorist AI; big tech vs. academia; reviving organs ++448 — from exponentialview.co by Azeem Azhar and Chantal Smith


Personalized A.I. Agents Are Here. Is the World Ready for Them? — from nytimes.com by Kevin Roose (behind a paywall)

You could think of the recent history of A.I. chatbots as having two distinct phases.

The first, which kicked off last year with the release of ChatGPT and continues to this day, consists mainly of chatbots capable of talking about things. Greek mythology, vegan recipes, Python scripts — you name the topic and ChatGPT and its ilk can generate some convincing (if occasionally generic or inaccurate) text about it.

That ability is impressive, and frequently useful, but it is really just a prelude to the second phase: artificial intelligence that can actually do things. Very soon, tech companies tell us, A.I. “agents” will be able to send emails and schedule meetings for us, book restaurant reservations and plane tickets, and handle complex tasks like “negotiate a raise with my boss” or “buy Christmas presents for all my family members.”


From DSC:
Very cool!


Nvidia Stock Jumps After Unveiling of Next Major AI Chip. It’s Bad News for Rivals. — from barrons.com

On Monday, Nvidia (ticker: NVDA) announced its new H200 Tensor Core GPU. The chip incorporates 141 gigabytes of memory and offers up to 60% to 90% performance improvements versus its current H100 model when used for inference, or generating answers from popular AI models.

From DSC:
The exponential curve seems to be continuing — a 60% to 90% performance improvement is a huge boost.

Also relevant/see:


The 5 Best GPTs for Work — from the AI Exchange

Custom GPTs are exploding, and we wanted to highlight our top 5 that we’ve seen so far:

 

A future-facing minister, a young inventor and a shared vision: An AI tutor for every student — from news.microsoft.com by Chris Welsch

The Ministry of Education and Pativada see what has become known as the U.A.E. AI Tutor as a way to provide students with 24/7 assistance as well as help level the playing field for those families who cannot afford a private tutor. At the same time, the AI Tutor would be an aid to teachers, they say. “We see it as a tool that will support our teachers,” says Aljughaiman. “This is a supplement to classroom learning.”

If everything goes according to plan, every student in the United Arab Emirates’ school system will have a personal AI tutor – that fits in their pockets.

It’s a story that involves an element of coincidence, a forward-looking education minister and a tech team led by a chief executive officer who still lives at home with his parents.

In February 2023, the U.A.E.’s education minister, His Excellency Dr. Ahmad Belhoul Al Falasi, announced that the ministry was embracing AI technology and pursuing the idea of an AI tutor to help Emirati students succeed. And he also announced that the speech he presented had been written by ChatGPT. “We should not demonize AI,” he said at the time.



Fostering deep learning in humans and amplifying our intelligence in an AI World — from stefanbauschard.substack.com by Stefan Bauschard
A free 288-page report on advancements in AI and related technology, their effects on education, and our practical support for AI-amplified human deep learning

Six weeks ago, Dr. Sabba Quidwai and I accidentally stumbled upon an idea to compare the deep learning revolution in computer science to the mostly lacking deep learning efforts in education (Mehta & Fine). I started writing, and as these things often go with me, I thought there were many other things that would be useful to think through and for educators to know, and we ended up with this 288-page report.

***

Here’s an abstract from that report:

This report looks at the growing gap between the attention paid to the development of intelligence in machines and humans. While computer scientists have made great strides in developing human intelligence capacities in machines using deep learning technologies, including the abilities of machines to learn on their own, a significant part of the education system has not kept up with developing the intelligence capabilities in people that will enable them to succeed in the 21st century. Instead of fully embracing pedagogical methods that place primary emphasis on promoting collaboration, critical thinking, communication, creativity, and self-learning through experiential, interdisciplinary approaches grounded in human deep learning and combined with current technologies, a substantial portion of the educational system continues to heavily rely on traditional instructional methods and goals. These methods and goals prioritize knowledge acquisition and organization, areas in which machines already perform substantially better than people.

Also from Stefan Bauschard, see:

  • Debating in the World of AI
    Performative assessment, learning to collaborate with humans and machines, and developing special human qualities

13 Nuggets of AI Wisdom for Higher Education Leaders — from jeppestricker.substack.com by Jeppe Klitgaard Stricker
Actionable AI Guidance for Higher Education Leaders

Incentivize faculty AI innovation with AI. 

Invest in people first, then technology. 

On teaching, learning, and assessment. AI has captured the attention of all institutional stakeholders. Capitalize on that attention to reimagine pedagogy and evaluation. Rethink lectures, examinations, and assignments to align with workforce needs. Consider incorporating Problem-Based Learning, building portfolios and proof of work, and conducting oral exams. And use AI to provide individualized support and assess real-world skills.

Actively engage students.


Some thoughts from George Siemens re: AI:

Sensemaking, AI, and Learning (SAIL), a regular look at how AI is impacting learning.

Our education system has a uni-dimensional focus: learning things. Of course, we say we care about developing the whole learner, but the metrics that matter (grades, transcripts) that underpin the education system are largely focused on teaching students things that have long been Google-able but are now increasingly doable by AI. Developments in AI matter in ways that call into question large parts of what happens in our universities. This is not a statement that people don’t need to learn core concepts and skills. My point is that the fulcrum of learning has shifted. Knowing things will continue to matter less and less going forward as AI improves its capabilities. We’ll need to start intentionally developing broader and broader attributes of learners: metacognition, wellness, affect, social engagement, etc. Education will continue to shift toward human skills and away from primary assessment of knowledge gains disconnected from skills, practice, and ways of being.


AI, the Next Chapter for College Librarians — from insidehighered.com by Lauren Coffey
Librarians have lived through the disruptions of fax machines, websites and Wikipedia, and now they are bracing to do it again as artificial intelligence tools go mainstream: “Maybe it’s our time to shine.”

A few months after ChatGPT launched last fall, faculty and students at Northwestern University had many questions about the building wave of new artificial intelligence tools. So they turned to a familiar source of help: the library.

“At the time it was seen as a research and citation problem, so that led them to us,” said Michelle Guittar, head of instruction and curriculum support at Northwestern University Libraries.

In response, Guittar, along with librarian Jeanette Moss, created a landing page in April, “Using AI Tools in Your Research.” At the time, the university itself had yet to put together a comprehensive resource page.


From Dr. Nick Jackson’s recent post on LinkedIn: 

Last night the Digitech team of junior and senior teachers from Scotch College Adelaide showcased their 2023 experiments, innovation, successes and failures with technology in education. Accompanied by Student digital leaders, we saw the following:

  • AI used for language learning, where avatars can help with accents
  • Motion-capture suits being used in media studies
  • AI used in assessment and automatic grading of work
  • AR used in design technology
  • VR used for immersive Junior school experiences
  • A teacher’s AI toolkit that has changed teaching practice and workflow
  • AR and the EyeJack app used by students to create dynamic artwork
  • VR use in careers education in Senior school
  • How ethics around AI is taught to Junior school students from Year 1
  • Experiments with MyStudyWorks

Almost an Agent: What GPTs can do — from oneusefulthing.org by Ethan Mollick

What would a real AI agent look like? A simple agent that writes academic papers would, after being given a dataset and a field of study, read about how to compose a good paper, analyze the data, conduct a literature review, generate hypotheses, test them, and then write up the results, all without intervention. You put in a request, you get a Word document that contains a draft of an academic paper.

A process kind of like this one:


What I Learned From an Experiment to Apply Generative AI to My Data Course — from edsurge.com by Wendy Castillo

As an educator, I have a duty to remain informed about the latest developments in generative AI, not only to ensure learning is happening, but to stay on top of what tools exist, what benefits and limitations they have, and most importantly, how students might be using them.

However, it’s also important to acknowledge that the quality of work produced by students now requires higher expectations and potential adjustments to grading practices. The baseline is no longer zero, it is AI. And the upper limit of what humans can achieve with these new capabilities remains an unknown frontier.


Artificial Intelligence in Higher Education: Trick or Treat? — from tytonpartners.com by Kristen Fox and Catherine Shaw

Two components of AI -- generative AI and predictive AI

 

New models and developer products announced at DevDay — from openai.com
GPT-4 Turbo with 128K context and lower prices, the new Assistants API, GPT-4 Turbo with Vision, DALL·E 3 API, and more.

Today, we shared dozens of new additions and improvements, and reduced pricing across many parts of our platform. These include:

  • New GPT-4 Turbo model that is more capable, cheaper and supports a 128K context window
  • New Assistants API that makes it easier for developers to build their own assistive AI apps that have goals and can call models and tools
  • New multimodal capabilities in the platform, including vision, image creation (DALL·E 3), and text-to-speech (TTS)


Introducing GPTs — from openai.com
You can now create custom versions of ChatGPT that combine instructions, extra knowledge, and any combination of skills.




OpenAI’s New Groundbreaking Update — from newsletter.thedailybite.co
Everything you need to know about OpenAI’s update, what people are building, and a prompt to skim long YouTube videos…

But among all this exciting news, the announcement of user-created “GPTs” took the cake.

That’s right, your very own personalized version of ChatGPT is coming, and it’s as groundbreaking as it sounds.

OpenAI’s groundbreaking announcement isn’t just a new feature – it’s a personal AI revolution. 

The upcoming customizable “GPTs” transform ChatGPT from a one-size-fits-all to a one-of-a-kind digital sidekick that is attuned to your life’s rhythm. 


Lore Issue #56: Biggest Week in AI This Year — from news.lore.com by Nathan Lands

First, Elon Musk announced “Grok,” a ChatGPT competitor inspired by “The Hitchhiker’s Guide to the Galaxy.” Surprisingly, in just a few months, xAI has managed to surpass the capabilities of GPT-3.5, signaling their impressive speed of execution and establishing them as a formidable long-term contender.

Then, OpenAI hosted their inaugural Dev Day, unveiling “GPT-4 Turbo,” which boasts a 128k context window, API costs slashed by threefold, text-to-speech capabilities, auto-model switching, agents, and even their version of an app store slated for launch next month.


The Day That Changed Everything — from joinsuperhuman.ai by Zain Kahn
ALSO: Everything you need to know about yesterday’s OpenAI announcements

  • OpenAI DevDay Part I: Custom ChatGPTs and the App Store of AI
  • OpenAI DevDay Part II: GPT-4 Turbo, Assistants, APIs, and more

OpenAI’s Big Reveal: Custom GPTs, GPT Store & More — from news.theaiexchange.com
What you should know about the new announcements; how to get started with building custom GPTs


Incredible pace of OpenAI — from theaivalley.com by Barsee
PLUS: Elon’s Grok


 

 

Accenture Life Trends 2024 — from accenture.com; via Mr. Bob Raidt on LinkedIn
The visible and invisible mediators between people and their world are changing.

In brief

  • The harmony between people, tech and business is showing tensions, and society is in flux.
  • Five trends explore the decline of customer obsession, the influence of generative AI, the stagnation of creativity, the balance of tech benefits and burden, and people’s new life goals.
  • Opportunity abounds for business and brands in the coming twelve months and beyond – read Accenture Life Trends 2024 to find out more.

5 Trends

01 Where’s the love?
Necessary cuts across enterprises have shunted customer obsession down the priority list—and customers are noticing.
02 The great interface shift
Generative AI is upgrading people’s experience of the internet from transactional to personal, enabling them to feel more digitally understood and relevant than ever.
03 Meh-diocrity
Creativity was once about the audience, but has become dependent on playing the tech system. Is this what creative stagnation feels like?
04 Error 429: Human request limit reached
Technology feels like it’s happening to people rather than for them—is a shift beginning, where they regain agency over its influence on daily life?
05 Decade of deconstruction
Traditional life paths are being rerouted by new limitations, necessities and opportunities, significantly shifting demographics.

 

Shocking AI Statistics in 2023 — from techthatmatters.beehiiv.com by Harsh Makadia

  1. ChatGPT reached 100 million users faster than any other app. By February 2023, the chat.openai.com website saw an average of 25 million daily visitors. How can this rise in AI usage benefit your business’s functions?
  2. 45% of executives say the popularity of ChatGPT has led them to increase investment in AI. If executives are investing in AI personally, how will their beliefs affect corporate investment in AI to drive automation further? And how will this affect the number of workers hired to manage AI systems within companies?
  3. eMarketer predicts that in 2024 at least 20% of Americans will use ChatGPT monthly and that a fifth of them will be 25- to 34-year-olds in the workforce. Does this mean more young workers are using AI?
  4. …plus 10 more stats

People are speaking with ChatGPT for hours, bringing 2013’s Her closer to reality — from arstechnica.com by Benj Edwards
Long mobile conversations with the AI assistant using AirPods echo the sci-fi film.

It turns out that Willison’s experience is far from unique. Others have been spending hours talking to ChatGPT using its voice recognition and voice synthesis features, sometimes through car connections. The realistic nature of the voice interaction feels largely effortless, but it’s not flawless. Sometimes, it has trouble in noisy environments, and there can be a pause between statements. But the way the ChatGPT voices simulate vocal tics and noises feels very human. “I’ve been using the voice function since yesterday and noticed that it makes breathing sounds when it speaks,” said one Reddit user. “It takes a deep breath before starting a sentence. And today, actually a minute ago, it coughed between words while answering my questions.”

From DSC:
Hmmmmmmm….I’m not liking the sound of this on my initial take of it. But perhaps there are some real positives to this. I need to keep an open mind.


Working with AI: Two paths to prompting — from oneusefulthing.org by Ethan Mollick
Don’t overcomplicate things

  1. Conversational Prompting [From DSC: i.e., keep it simple]
  2. Structured Prompting

For most people, [Conversational Prompting] is good enough to get started, and it is the technique I use most of the time when working with AI. Don’t overcomplicate things, just interact with the system and see what happens. After you have some experience, however, you may decide that you want to create prompts you can share with others, prompts that incorporate your expertise. We call this approach Structured Prompting, and, while improving AIs may make it irrelevant soon, it is currently a useful tool for helping others by encoding your knowledge into a prompt that anyone can use.
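Mollick doesn’t prescribe a format for Structured Prompting, but the idea — encoding your expertise once into a reusable template that anyone can fill in — can be sketched. The section names (role, task, steps, constraints) and function below are my own illustrative choices, not Mollick’s:

```python
# Sketch of a reusable "structured prompt": the expert writes the template
# once; others supply only the variable parts. Section names are
# illustrative design choices, not a standard.

TEMPLATE = """You are {role}.
Your task: {task}
Follow these steps:
{steps}
Constraints: {constraints}
Input: {user_input}"""

def build_prompt(role, task, steps, constraints, user_input):
    # Render the step list as a numbered sequence the model can follow.
    numbered = "\n".join(f"{i}. {s}" for i, s in enumerate(steps, start=1))
    return TEMPLATE.format(role=role, task=task, steps=numbered,
                           constraints=constraints, user_input=user_input)

prompt = build_prompt(
    role="an experienced writing tutor",
    task="give feedback on a student paragraph",
    steps=["Summarize the main point",
           "Note one strength",
           "Suggest one improvement"],
    constraints="Be encouraging; keep it under 150 words.",
    user_input="The industrial revolution changed everything...",
)
print(prompt)
```

The contrast with conversational prompting is the point: the template does the thinking up front, so the person using it doesn’t need prompting experience at all.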


These fake images reveal how AI amplifies our worst stereotypes — from washingtonpost.com by Nitasha Tiku, Kevin Schaul, and Szu Yu Chen (behind paywall)
AI image generators like Stable Diffusion and DALL-E amplify bias in gender and race, despite efforts to detoxify the data fueling these results.

Artificial intelligence image tools have a tendency to spin up disturbing clichés: Asian women are hypersexual. Africans are primitive. Europeans are worldly. Leaders are men. Prisoners are Black.

These stereotypes don’t reflect the real world; they stem from the data that trains the technology. Grabbed from the internet, these troves can be toxic — rife with pornography, misogyny, violence and bigotry.

Abeba Birhane, senior advisor for AI accountability at the Mozilla Foundation, contends that the tools can be improved if companies work hard to improve the data — an outcome she considers unlikely. In the meantime, the impact of these stereotypes will fall most heavily on the same communities harmed during the social media era, she said, adding: “People at the margins of society are continually excluded.”


ChatGPT app revenue shows no signs of slowing, but some other AI apps top it — from techcrunch.com by Sarah Perez; Via AI Valley – Barsee

ChatGPT, the AI-powered chatbot from OpenAI, far outpaces all other AI chatbot apps on mobile devices in terms of downloads and is a market leader by revenue, as well. However, it’s surprisingly not the top AI app by revenue — several photo AI apps and even other AI chatbots are actually making more money than ChatGPT, despite the latter having become a household name for an AI chat experience.


ChatGPT can now analyze files you upload to it without a plugin — from bgr.com by Joshua Hawkins; via Superhuman

According to new reports, OpenAI has begun rolling out a more streamlined approach to how people use ChatGPT. The new system will allow the AI to choose a model automatically, letting you run Python code, open a web browser, or generate images with DALL-E without extra interaction. Additionally, ChatGPT will now let you upload and analyze files.

 


LEGALTECH TOOLS EVERYONE SHOULD KNOW ABOUT — from techdayhq.com

Enter legaltech: a field that marries the power of technology with the complexities of the law. From automating tedious tasks to enhancing research effectiveness, let’s delve into the world of legaltech and unmask the crucial tools everyone should know.



Should AI and Humans be Treated the Same Under the Law, Under a “Reasonable Robot” Standard? (Ryan Abbott – UCLA) — from Technically Legal – A Legal Technology and Innovation Podcast

If a human uses artificial intelligence to invent something, should the invention be patentable?

If a driverless car injures a pedestrian, should the AI driver be held to a negligence standard as humans would? Or should courts apply the strict liability used for product defects?

What if AI steals money from a bank account? Should it be held to the same standard as a human under criminal law?

All interesting questions, and the subject of a book called The Reasonable Robot by this episode’s guest, Ryan Abbott.


Colin Levy, Dorna Moini, and Ashley Carlisle on Herding Cats and Heralding Change: The Inside Scoop on the “Handbook of Legal Tech” — from geeklawblog.com by Greg Lambert & Marlene Gebauer

The guests offered advice to law students and lawyers looking to learn about and leverage legal tech. Carlisle emphasized starting with an open mind, intentional research, and reading widely from legal tech thought leaders. Moini recommended thinking big but starting small with iterative implementation. Levy stressed knowing your purpose and motivations to stay focused amidst the vast array of options.

Lambert prompted the guests to identify low-hanging fruit legal technologies those new to practice should focus on. Levy pointed to document automation and AI. Moini noted that intake and forms digitization can be a first step for laggards. Carlisle advised starting small with discrete tasks before tackling advanced tools.

For their forward-looking predictions, Carlisle saw AI hype fading but increasing tech literacy, Levy predicted growing focus on use and analysis of data as AI advances, and Moini forecasted a rise in online legal service delivery. The guests are excited about spreading awareness through the book to help transform the legal industry.


You’ll never be solo again — from jordanfurlong.substack.com by Jordan Furlong
Generative AI can be the partner, the assistant, the mentor, and the confidant that many sole practitioners and new lawyers never had. There’s just one small drawback…

In terms of legal support, a terrific illustration of Gen AI’s potential is provided by Deborah Merritt in a three-part blog series this month at Law School Cafe. Deborah explores the use of ChatGPT-4 as an aid to bar exam preparation and the first months of law practice, finding it to be astonishingly proficient at identifying legal issues, recommending tactical responses, and showing how to build relationships of trust with clients. It’s not perfect — it makes small errors and omissions that require an experienced lawyer’s review — but it’s still pretty mind-blowingly amazing that a free online technology can do any of this stuff at all. And as is always the case with Gen AI, it’s only going to get better.

In terms of administrative support, Mark Haddad of Thomson Reuters explains in Attorney At Work how AI-driven chatbots and CRM systems can handle a sole practitioner’s initial client queries, schedule appointments and send reminders, while AI can also analyze the firm’s practice areas and create marketing campaigns and content. Earlier this month, Clio itself announced plans for “Clio Duo,” a built-in proprietary Gen AI that “will serve as a coach, intuitive collaborator, and expert consultant to legal professionals, deeply attuned to the intricate facets of running a law firm.”



GPT-4 Beats the Bar Exam — from lawschoolcafe.org by Deborah J. Merritt

In the first three posts in this series, I used a bar exam question as an example of the type of problem a new lawyer might confront in practice. I then explored how GPT-4 might help a new lawyer address that practice problem. In this post, I’ll work with another sample question that NCBE has released for the NextGen bar exam. On this question, GPT-4 beats the bar exam. In other words, a new lawyer using GPT-4 would obtain better answers than one who remembered material studied for the bar exam.


ABA TECHSHOW 2024 – A Preview from the Co-Chairs — from legaltalknetwork.com by Cynthia Thomas, Sofia Lingos, Sharon D. Nelson, and Jim Calloway

Also see: The ABA TECHSHOW 2024


When It Comes to Legal Innovation Everything is Connected — from artificiallawyer.com by Richard Tromans

Legal tech can sometimes feel like it’s the whole world. We get absorbed by the details of the technology and are sometimes blinded by big investment announcements, but without the rest of the legal innovation ecosystem around it, this sector-specific software is limited. What do I mean? Let me explain.


The Most Significant Updates In The Case Management Sphere — from abovethelaw.com by Jared Correia
Joshua Lenon of Clio and Christopher Lafferty of Caret talk over case management software’s role in today’s law firm operations.


WHAT WAS GARY MARCUS THINKING, IN THAT INTERVIEW WITH GEOFF HINTON? — from linkedin.com by Stephen Downes

Background (emphasis DSC): 60 Minutes did an interview with ‘the Godfather of AI’, Geoffrey Hinton. In response, Gary Marcus wrote a column in which he inserted his own set of responses into the transcript, as though he were a panel participant. Neat idea. So, of course, I’m stealing it, and in what follows, I insert my own comments as I join the 60 Minutes panel with Geoffrey Hinton and Gary Marcus.

Usually I put everyone else’s text in italics, but for this post I’ll put it all in normal font, to keep the format consistent.

Godfather of Artificial Intelligence Geoffrey Hinton on the promise, risks of advanced AI


OpenAI’s Revenue Skyrockets to $1.3 Billion Annualized Rate — from maginative.com by Chris McKay
This means the company is generating over $100 million per month—a 30% increase from just this past summer.

OpenAI, the company behind the viral conversational AI ChatGPT, is experiencing explosive revenue growth. The Information reports that CEO Sam Altman told the staff this week that OpenAI’s revenue is now crossing $1.3 billion on an annualized basis. This means the company is generating over $100 million per month—a 30% increase from just this past summer.

Since the launch of a paid version of ChatGPT in February, OpenAI’s financial growth has been nothing short of meteoric. Additionally, in August, the company announced the launch of ChatGPT Enterprise, a commercial version of its popular conversational AI chatbot aimed at business users.

For comparison, OpenAI’s total revenue for all of 2022 was just $28 million. The launch of ChatGPT has turbocharged OpenAI’s business, positioning it as a bellwether for demand for generative AI.



From 10/13:


New ways to get inspired with generative AI in Search — from blog.google
We’re testing new ways to get more done right from Search, like the ability to generate imagery with AI or create the first draft of something you need to write.

 

Next month Microsoft Corp. will start making its artificial intelligence features for Office widely available to corporate customers. Soon after, that will include the ability to read your emails, learn your writing style, and compose messages on your behalf.

From DSC:
As readers of this blog know, I’m generally pro-technology. I see most technologies as tools — which can be used for good or for ill. So I will post items both pro and con concerning AI.

But outsourcing email communications to AI isn’t on my wish list or to-do list.

 

The Public Is Giving Up on Higher Ed — from chronicle.com by Michael D. Smith
Our current system isn’t working for society. Digital alternatives can change that.

Excerpts:

I fear that we in the academy are willfully ignoring this problem. Bring up student-loan debt and you’ll hear that it’s the government’s fault. Bring up online learning and you’ll hear that it is — and always will be — inferior to in-person education. Bring up exclusionary admissions practices and you’ll hear something close to, “Well, the poor can attend community colleges.”

On one hand, our defensiveness is natural. Change is hard, and technological change that risks making traditional parts of our sector obsolete is even harder. “A professor must have an incentive to adopt new technology,” a tenured colleague recently told me regarding online learning. “Innovation adoption will occur one funeral at a time.”

But while our defense of the status quo is understandable, maybe we should ask whether it’s ethical, given what we know about the injustice inherent in our current system. I believe a happier future for all involved — faculty, administrators, and students — is within reach, but requires we stop reflexively protecting our deeply flawed system. How can we do that? We could start by embracing three fundamental principles.

1. Digitization will change higher education.

2. We should want to embrace this change.

3. We have a way to embrace this change.



US Higher Education Needs a Revolution. What’s Holding It Back? — from bloomberg.com by Tyler Cowen
Not only do professors need to change how they teach, but universities need to change how they evaluate them.

When the revolution in higher education finally arrives, how will we know? I have a simple metric: When universities change how they measure faculty work time. Using this yardstick, the US system remains very far from a fundamental transformation.

But today’s education system is dynamic, and needs to become even more so. There is already the internet, YouTube, and a flurry of potential innovations coming from AI. If professors really are a society’s best minds, shouldn’t they be working to improve the entire educational process, not just punching the equivalent of a time clock at a university?

Such a change would require giving them credit for innovations, which in turn would require a broader conception of their responsibilities. 


Citing Significant Budget Deficits, Several Colleges Face Cuts — from insidehighered.com by Doug Lederman
The affected institutions include Christian Brothers, Delta State, Lane Community College, Miami University, St. Norbert and Shepherd.

Numerous colleges and universities, public and private, announced in recent days that they face significant budget deficits that will require cuts to programs and employees.

Many of the institutions appear to have been motivated by fall enrollment numbers that did not meet their expectations, in most cases representing a failure to recover from record low enrollments during the pandemic. Others cited the lingering effects on enrollment and budgets from COVID-19, exacerbated by the end of federal relief funds.


How universities can adopt a lifelong learning mindset: Lifelong learning that will last — from timeshighereducation.com by various authors
How the traditional university degree can be reimagined as a lifelong educational journey, enabling students to upskill and reskill throughout their lives

The rapid evolution of the workplace and changing skills demands are driving calls for better lifelong learning provision. For universities, this means re-examining traditional teaching practices and course design to ensure that students can benefit from continuing education throughout their careers. It requires more flexible, accessible, bite-sized learning that can be completed in tandem with other professional and personal commitments. But how can this be offered in a coherent, joined-up way without sacrificing quality? From Moocs to microcredentials, these resources offer advice and insight into how lifelong learning opportunities can be developed and improved for future generations.


The College Backlash Is Going Too Far — from theatlantic.com by David Deming; via Matthew Tower who also expresses his concerns re: this article from The Chronicle
Getting a four-year degree is still a good investment. 

American higher education certainly has its problems. But the bad vibes around college threaten to obscure an important economic reality: Most young people are still far better off with a four-year college degree than without one.

Historically, analysis of higher education’s value tends to focus on the so-called college wage premium. That premium has always been massive—college graduates earn much more than people without a degree, on average—but it doesn’t take into account the cost of getting a degree. So the St. Louis Fed researchers devised a new metric, the college wealth premium, to try to get a more complete picture.

But the long-term value of a bachelor’s degree is much greater than it initially appears. If a college professor or pundit tries to convince you otherwise, ask them what they would choose for their own children.

From DSC:
David’s last quote here is powerful and likely true. But that doesn’t mean that we should disregard trying to get the cost of obtaining a degree down by 50% or more. There are still way too many people struggling with student loans — and they have been for DECADES. And others will be joining these same financial struggles — again, for DECADES to come.


Johns Hopkins aims to address teacher shortage with new master’s residency option — from hub.jhu.edu ; via Matthew Tower

The School of Education’s TeachingWell program will provide professional, financial support for applicants looking to start long-term careers in teaching

Students in TeachingWell will earn the Master of Education for Teaching Professionals in four semesters at Johns Hopkins and gain Maryland state teacher certification along with real-world teaching experience—all made stronger by ongoing mentoring, life design, and teacher wellness programs through the university.

“We will focus on teacher well-being and life-design skills that address burnout and mental health concerns that are forcing too many teachers out of the profession,” says Mary Ellen Beaty-O’Ferrall, associate professor at the School of Education and faculty director of TeachingWell. “We want teachers with staying power—effective and financially stable educators with strong personal well-being.”


How to Build Stackable Credentials — from insidehighered.com by Lindsay Daugherty, Peter Nguyen, Jonah Kushner, and Peter Riley Bahr
Five actions states and colleges are taking.

Stackable credentials are a top priority for many states and colleges these days. The term can be used to mean different things, from college efforts to embed short-term credentials into their degree programs to larger-scale efforts to rethink the way credentialing is done through alternative approaches, like skills badges. The goals of these initiatives are twofold: (1) to ensure individuals can get credit for a range of different learning experiences and better integrate these different types of learning, and (2) to better align our education and training systems with workforce needs, which often require reskilling through training and credentials below the bachelor’s degree level.

 

180 Degree Turn: NYC District Goes From Banning ChatGPT to Exploring AI’s Potential — from edweek.org by Alyson Klein (behind paywall)

New York City Public Schools will launch an Artificial Intelligence Policy Lab to guide the nation’s largest school district’s approach to this rapidly evolving technology.


The Leader’s Blindspot: How to Prepare for the Real Future — from preview.mailerlite.io by the AIEducator
The Commonly Held Belief: AI Will Automate Only Boring, Repetitive Tasks First

The Days of Task-Based Views on AI Are Numbered
The winds of change are sweeping across the educational landscape (emphasis DSC):

  1. Multifaceted AI: AI technologies are not one-trick ponies; they are evolving into complex systems that can handle a variety of tasks.
  2. Rising Expectations: As technology becomes integral to our lives, the expectations for personalised, efficient education are soaring.
  3. Skill Transformation: Future job markets will demand a different skill set, one that is symbiotic with AI capabilities.

Teaching: How to help students better understand generative AI — from chronicle.com by Beth McMurtrie
Beth describes ways professors have used ChatGPT to bolster critical thinking in writing-intensive courses

Kevin McCullen, an associate professor of computer science at the State University of New York at Plattsburgh, teaches a freshman seminar about AI and robotics. As part of the course, students read Machines of Loving Grace: The Quest for Common Ground Between Humans and Robots, by John Markoff. McCullen had the students work in groups to outline and summarize the first three chapters. Then he showed them what ChatGPT had produced in an outline.

“Their version and ChatGPT’s version seemed to be from two different books,” McCullen wrote. “ChatGPT’s version was essentially a ‘laundry list’ of events. Their version was narratives of what they found interesting. The students had focused on what the story was telling them, while ChatGPT focused on who did what in what year.” The chatbot also introduced false information, such as wrong chapter names.

The students, he wrote, found the writing “soulless.”


7 Questions with Dr. Cristi Ford, VP of Academic Affairs at D2L — from campustechnology.com by Rhea Kelly

In the Wild West of generative AI, educators and institutions are working out how best to use the technology for learning. How can institutions define AI guidelines that allow for experimentation while providing students with consistent guidance on appropriate use of AI tools?

To find out, we spoke with Dr. Cristi Ford, vice president of academic affairs at D2L. With more than two decades of educational experience in nonprofit, higher education, and K-12 institutions, Ford works with D2L’s institutional partners to elevate best practices in teaching, learning, and student support. Here, she shares her advice on setting and communicating AI policies that are consistent and future-ready.


AI Platform Built by Teachers, for Teachers, Class Companion Raises $4 Million to Tap Into the Power of Practice — from prweb.com

“If we want to use AI to improve education, we need more teachers at the table,” said Avery Pan, Class Companion co-founder and CEO. “Class Companion is designed by teachers, for teachers, to harness the most sophisticated AI and improve their classroom experience. Developing technologies specifically for teachers is imperative to supporting our next generation of students and education system.”


7 Questions on Generative AI in Learning Design — from campustechnology.com by Rhea Kelly
Open LMS Adoption and Education Specialist Michael Vaughn on the challenges and possibilities of using artificial intelligence to move teaching and learning forward.

The potential for artificial intelligence tools to speed up course design could be an attractive prospect for overworked faculty and spread-thin instructional designers. Generative AI can shine, for example, in tasks such as reworking assessment question sets, writing course outlines and learning objectives, and generating subtitles for audio and video clips. The key, says Michael Vaughn, adoption and education specialist at learning platform Open LMS, is treating AI like an intern who can be guided and molded along the way, and whose work is then vetted by a human expert.

We spoke with Vaughn about how best to utilize generative AI in learning design, ethical issues to consider, and how to formulate an institution-wide policy that can guide AI use today and in the future.


10 Ways Technology Leaders Can Step Up and Into the Generative AI Discussion in Higher Ed — from er.educause.edu by Lance Eaton and Stan Waddell

  1. Offer Short Primers on Generative AI
  2. Explain How to Get Started
  3. Suggest Best Practices for Engaging with Generative AI
  4. Give Recommendations for Different Groups
  5. Recommend Tools
  6. Explain the Closed vs. Open-Source Divide
  7. Avoid Pitfalls
  8. Conduct Workshops and Events
  9. Spot the Fake
  10. Provide Proper Guidance on the Limitations of AI Detectors


 
© 2024 | Daniel Christian