As the buzz around ChatGPT and other generative AI has grown, so has scammers’ interest in the tech. In a new report published by Meta, the company says it’s seen a sharp uptick in malware disguised as ChatGPT and similar AI software.
In a statement, the company said that since March of 2023 alone, its researchers have discovered “ten malware families using ChatGPT and other similar themes to compromise accounts across the internet” and that it’s blocked more than 1,000 malicious links from its platform. According to Meta, the scams often involve mobile apps or browser extensions posing as ChatGPT tools. And while in some cases the tools do offer some ChatGPT functionality, their real purpose is to steal their users’ account credentials.
AI Is Reshaping the Battlefield and the Future of Warfare — from bloomberg.com by Jackie Davalos and Nate Lanxon
In this episode of AI IRL, Jackie Davalos and Nate Lanxon talk about one of the most dangerous applications of artificial intelligence: modern warfare.
Excerpt:
Artificial intelligence has triggered an arms race with the potential to transform modern-day warfare. Countries are vying to develop cutting-edge technology at record speed, sparking concerns about whether we understand its power before it’s deployed.
From DSC: I wish that humankind — especially those of us in the United States — would devote less money to warfare and more funding to education.
Technology has become the main driver for increasing access to justice, and there are huge opportunities for legal service providers to leverage both existing and emerging tech to reach new clients. Dennis and Tom welcome Natalie Knowlton to discuss the current state of legal services, the justice gap, and ways technology is helping attorneys provide better and more affordable services to consumers. As always, stay tuned for the parting shots, that one tip, website, or observation that you can use the second the podcast ends.
The survey, conducted in late March by the Thomson Reuters Institute, gathered insights from more than 440 respondent lawyers at large and midsize law firms in the United States, United Kingdom, and Canada. The survey forms the basis of a new report, ChatGPT & Generative AI within Law Firms, which takes a deep look at the evolving attitudes towards generative AI and ChatGPT within law firms, measuring awareness and adoption of the technology as well as lawyers’ views on its potential risks.
The report also reveals several key findings that deserve special attention from law firm leaders and other legal professionals as ChatGPT and generative AI evolve from concept to reality for the vast majority of legal industry participants. These findings include:
There are lots of tropes related to lawyers and law firms that frequently show up in works of fiction. The thing is, those tropes are tropes because they’re sort of old; they’ve been around for a long time. Now, however, modern technology can solve a heck of a lot of those issues. So, for this edition of the “Reference Manual of Lists,” we’re going to relay a trope, offer an example, and talk about how legal tech actually fixes the problem today.
If you made it this far, you should by now understand that ChatGPT is not by itself a search engine, an eDiscovery data reviewer, a translator, a knowledge base, or a legal analytics tool. But it can contribute to these functionalities.
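As a rough illustration of that last point, here is a minimal sketch of how a chat model might contribute to a first-pass relevance screen during document review without acting as the review tool itself. It assumes the openai Python package (the pre-1.0 ChatCompletion interface) and an OPENAI_API_KEY environment variable; the model choice, prompt, and suggest_relevance helper are illustrative only, and every label is meant to go to a human reviewer.

```python
# Hypothetical sketch: a chat model as one component of a first-pass
# relevance screen, not a standalone eDiscovery tool.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

def suggest_relevance(document_text: str, issue: str) -> str:
    """Ask the model whether a document looks relevant to a stated issue."""
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",  # illustrative model choice
        messages=[
            {
                "role": "system",
                "content": (
                    "You assist with first-pass document review. "
                    "Answer with RELEVANT or NOT RELEVANT and one sentence of reasoning."
                ),
            },
            {
                "role": "user",
                "content": f"Issue: {issue}\n\nDocument:\n{document_text[:4000]}",
            },
        ],
        temperature=0,
    )
    # The model's suggestion; a human reviewer confirms or overturns it.
    return response["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(suggest_relevance(
        "Email discussing Q3 vendor contract terms and delivery delays...",
        "breach of the 2021 supply agreement",
    ))
```

Truncating the document text is a crude stand-in for proper chunking; the point is that the model feeds a reviewer’s queue rather than making final determinations.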
April 20, 2023 – Alternative dispute resolution (ADR), a common technique parties can use to settle disputes with the help of a third party, offers several unique benefits over traditional litigation. It is typically more cost-effective and confidential, and is generally a preferred method of resolving disputes. As a result, counsel and their clients often view ADR as a no-brainer. But the once-simple decision to engage in ADR is now complicated by whether to proceed in person, virtually, or with a hybrid approach.
The emergence of ChatGPT comes with tremendous promise of increased automation and efficiency. But at what cost? In this blog post, we’ll explore the potential ethical time bomb of using ChatGPT and examine the responsibility of lawyers in the age of AI.
Every year, I write a year-end wrap-up of the most significant developments in legal technology.
At the end of the past decade, I decided to look back on the most significant developments of the 2010s as a whole. It may well have been the most tumultuous decade ever in changing how legal services are delivered.
Here, I revisit those changes — and add a few post-2020 updates.
The FBI and the Defense Department were actively involved in research and development of facial recognition software that they hoped could be used to identify people from video footage captured by street cameras and flying drones, according to thousands of pages of internal documents that provide new details about the government’s ambitions to build out a powerful tool for advanced surveillance.
From DSC: This doesn’t surprise me. But it’s yet another example of opaqueness involving technology. And who knows to what levels our Department of Defense has taken things with AI, drones, and robotics.
A handful of companies control what PricewaterhouseCoopers called a “$15.7 trillion game changer of an industry.” Those companies employ or finance the work of a huge chunk of the academics who understand how to make LLMs. This leaves few people with the expertise and authority to say, “Wait, why are these companies blurring the distinction between what is human and what’s a language model? Is this what we want?”
…
Bender knows she’s no match for a trillion-dollar game changer slouching to life. But she’s out there trying. Others are trying too. LLMs are tools made by specific people — people who stand to accumulate huge amounts of money and power, people enamored with the idea of the singularity. The project threatens to blow up what is human in a species sense. But it’s not about humility. It’s not about all of us. It’s not about becoming a humble creation among the world’s others. It’s about some of us — let’s be honest — becoming a superspecies. This is the darkness that awaits when we lose a firm boundary around the idea that humans, all of us, are equally worthy as is.
Streamlined and personalized teaching
Some examples of how we’ve seen educators exploring how to teach and learn with tools like ChatGPT:
Drafting and brainstorming for lesson plans and other activities
Help with design of quiz questions or other exercises
Experimenting with custom tutoring tools
Customizing materials for different preferences (simplifying language, adjusting to different reading levels, creating tailored activities for different interests)
Providing grammatical or structural feedback on portions of writing
Use in upskilling activities in areas like writing and coding (debugging code, revising writing, asking for explanations)
Critiquing AI-generated text
While several of the above draw on ChatGPT’s potential to be explored as a tool for personalization, there are risks associated with such personalization as well, including student privacy, biased treatment, and development of unhealthy habits. Before students use tools that offer these services without direct supervision, they and their educators should understand the limitations of the tools outlined below.
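To make the quiz-question item in the list above concrete, here is a minimal sketch assuming the openai Python package (pre-1.0 ChatCompletion interface) and an OPENAI_API_KEY environment variable; the draft_quiz helper, model name, and prompt are illustrative only, and the output is a first draft for the educator to review and edit.

```python
# Hypothetical sketch: drafting quiz questions for a teacher to review,
# matching the "quiz questions" and "different reading levels" items above.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

def draft_quiz(topic: str, reading_level: str, n_questions: int = 5) -> str:
    """Return draft multiple-choice questions for a teacher to edit before use."""
    prompt = (
        f"Write {n_questions} multiple-choice questions about {topic} "
        f"for students at a {reading_level} reading level. "
        "Mark the correct answer after each question."
    )
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",  # illustrative model choice
        messages=[{"role": "user", "content": prompt}],
        temperature=0.7,
    )
    return response["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # The printed questions are a starting point, not a finished quiz.
    print(draft_quiz("photosynthesis", "8th-grade"))
```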
Also relevant/see:
Designing Assignments in the ChatGPT Era — from insidehighered.com by Susan D’Agostino
Some instructors seek to craft assignments that guide students in surpassing what AI can do. Others see that as a fool’s errand—one that lends too much agency to the software.
Excerpt (emphasis DSC):
David Wiley wrote a thoughtful post on the ways in which AI and Large Language Models (LLMs) can “provide instructional designers with first drafts of some of the work they do.” He says “imagine you’re an instructional designer who’s been paired with a faculty member to create a course in microeconomics. These tools might help you quickly create first drafts of” learning outcomes, discussion prompts, rubrics, and formative assessment items. The point is that LLMs can quickly generate rough drafts that are mostly accurate, which humans can then “review, augment, and polish,” potentially shifting the work of instructional designers from authors to editors. The post is well worth your time.
… The question that I’d like to spend some time thinking about is the following: What new knowledge, capacities, and skills do instructional designers need in their role as editors and users of LLMs?
This resonated with me. Instructional Designer positions are starting to require AI and ML chops. I’m introducing my grad students to AI and ChatGPT this semester. I have an assignment based on it.
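A minimal sketch of the first-drafts workflow Wiley describes might look like the following. It assumes the openai Python package (pre-1.0 ChatCompletion interface) and an OPENAI_API_KEY environment variable; the artifact list, draft_course_artifacts helper, and model name are illustrative only, and every draft is meant for an instructional designer to review, augment, and polish.

```python
# Hypothetical sketch: generating first drafts of several course artifacts
# for an instructional designer to refine as an editor rather than an author.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

# Artifact types drawn from the excerpt above; adjust to taste.
ARTIFACTS = [
    "five measurable learning outcomes",
    "three discussion prompts",
    "a grading rubric for a short essay",
    "five formative assessment items",
]

def draft_course_artifacts(course: str) -> dict:
    """Return rough drafts keyed by artifact type; an editor refines each one."""
    drafts = {}
    for artifact in ARTIFACTS:
        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",  # illustrative model choice
            messages=[{
                "role": "user",
                "content": f"Draft {artifact} for an introductory {course} course.",
            }],
            temperature=0.7,
        )
        drafts[artifact] = response["choices"][0]["message"]["content"]
    return drafts

if __name__ == "__main__":
    for name, text in draft_course_artifacts("microeconomics").items():
        print(f"--- {name} ---\n{text}\n")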
From DSC:
Check out the items below. As with most technologies, there are likely going to be plusses & minuses regarding the use of AI in digital video, communications, arts, and music.
DC: What?!? Wow. I should have seen this coming. I can see positives & negatives here. Virtual meetings could become a bit more creative/fun. But apps could be a bit scarier in some instances, such as with #telelegal.
I got lucky—my encounter was with a drone in virtual reality as part of an experiment by a team from University College London and the London School of Economics. They’re studying how people react when meeting police drones, and whether they come away feeling more or less trusting of the police.
It seems obvious that encounters with police drones might not be pleasant. But police departments are adopting these sorts of technologies without even trying to find out.
“Nobody is even asking the question: Is this technology going to do more harm than good?” says Aziz Huq, a law professor at the University of Chicago, who is not involved in the research.
The future looks very different in some parts of the country than in others, and will also vary among national four-year universities, regional universities like Ship, and community colleges. Grawe projects that, despite the overall demographic decline, demand for national four-year universities on the West Coast will increase by more than 7.5 percent between now and the mid-2030s. But in states like New York, Ohio, Michigan, Wisconsin, Illinois, and Louisiana, it will decline by 15 percent or more.
Higher ed’s eight-decade run of unbroken good fortune may be about to end.
Demand for regional four-year universities, per Grawe, will drop by at least 7.5 percent across New England, the mid-Atlantic, and Southern states other than Florida and Texas, with smaller declines in the Great Plains. Community colleges will be hit hard in most places other than Florida, which has a robust two-year system with a large Latino population.
The next generation of higher education leaders will take scarcity as a given and “return on investment” as both sales pitch and state of mind.
Nine out of 10 colleges either exclude or understate the net cost of attendance in their financial-aid offers to students, according to estimates published in a new report by the Government Accountability Office. The watchdog agency recommended that Congress consider legislation that would require institutions to provide “clear and standard information.”
The lack of clarity makes it hard for students to decide where to enroll and how much to borrow.
The report, published on Monday, paints a troubling picture of an industry that makes it difficult for consumers to understand the bottom line by presenting insufficient if not downright misleading information. Federal law does not require colleges to present financial-aid offers in a clear, consistent way to all students.
U.S. higher education faces a stable but deteriorating credit outlook in 2023, Fitch Ratings said Thursday, taking a more pessimistic view of the sector’s future than it had at the same time last year.
Operating performance at colleges and universities will be pressured by enrollment, labor and wage challenges, according to the bond ratings agency. Colleges have been able to raise tuition slightly because of inflation, but the additional revenue they generate generally isn’t expected to be enough to offset rising costs.
While not all institutions can (or should be) saved, most institutional closures reflect the failure of past governing boards to face the fiscal reality of their institution — and to plan accordingly and in a timely manner. Leaders should always consider and, if necessary, pursue potential partnerships, mergers, or consolidations before a school has exhausted its financial and political capital. The inability or unwillingness of many leaders to take such action is reflected in the fact that the number of institutional closures in higher education far outweighs the number of successful mergers.
In fact, the risk of closure can be predicted. In a prior analysis, several coauthors and I reported on a number of risk factors predictive of closure, noting that most schools at risk for closure are small and financially fragile, with declining enrollment and limited resources to mount significant online programs. While there are many clear signs that a school is at risk for closure, the major challenge to mounting a response seems to be the unwillingness of institutional leaders to understand, face, and act on these signs.
Journalism jobs are hard to find. But it’s nice work when you can get it.
That’s the takeaway from a new report from the Georgetown University Center on Education and the Workforce on the payoff of journalism programs. An analysis of federal education and labor data reveals that journalism and communication bachelor’s degrees offer moderate payoff to their graduates, but only 15% of majors end up working in the field early in their careers. Newsroom employment has declined 26% since 2008, and researchers predict it will fall 3% over the next nine years.
Addendum on 12/10/22:
A Sectorwide Approach to Higher Ed’s Future — from insidehighered.com by Sylvia M. Burwell
Institutions must seek ways to differentiate themselves even as they work together to address common challenges facing all of higher education, writes Sylvia M. Burwell.
We have to think differently about the future of higher education. And rather than limit our work to what one type of institution or program can achieve, we should look across the entire higher education sector.
A sectorwide [insert DSC:system-wide] approach is needed because the economics of higher education are not going to hold.
…
To evolve our thinking on these questions, we should focus on the value proposition of higher education and market differentiation.
There’s a certain feeling that happens when a new technology adjusts your thinking about computing. Google did it. Firefox did it. AWS did it. iPhone did it. OpenAI is doing it with ChatGPT.
In 2023, businesses will realise that, in order to get out of FAQ Land, they need to synchronise business systems together to deliver personalised transactional experiences for customers.
“We finally have the technologies to do all the things we imagined 10 years ago.”
From the 45+ #books that I’ve read in the last 2 years, here are my top 10 recommendations for #learningdesigners or anyone in #learninganddevelopment
Speaking of recommended books (but from a more technical perspective this time), also see:
10 must-read tech books for 2023 — from enterprisersproject.com by Katie Sanders (Editorial Team)
Get new thinking on the technologies of tomorrow – from AI to cloud and edge – and the related challenges for leaders.