Newsletter Subject

Weekly Briefing: How ChatGPT changed teaching

From

chronicle.com

Email Address

newsletter@newsletter.chronicle.com

Sent On

Sat, Dec 16, 2023 01:00 PM

Email Preheader Text

One year after the chatbot's debut, many instructors don't have guidance from their institutions.

This is how AI has changed teaching.

Dear Readers: This is the year's last edition of the Weekly Briefing. The newsletter will be back in your inboxes on Saturday, January 6, 2024. On that note, here's a story from this week on a topic that seemed to dominate the entire year.

Only a year after the [release of ChatGPT](, the artificial-intelligence chatbot from the nonprofit OpenAI, college instruction has changed. Colleges are still forging large-scale policies on what role generative AI will play in operations, research, and academic programming. While administrators deliberate, instructors have been forced to act: ChatGPT and other AI tools entered their classrooms quickly, whether they liked it or not.

Some professors welcome the new technology and are training their students to be skilled users, on the assumption that they will need to know how to use it for their jobs. Others have issued an outright ban, and have students submit their notes and other parts of their studying process to ensure they aren't outsourcing their assignments to chatbots.

The Chronicle asked professors about their AI practices, including whether the instructors themselves were brushing up on the technology. Nearly 100 instructors shared their answers. Though they are not representative of all of higher education, they teach at a range of institutions: community colleges, public and private four-year colleges and universities, international institutions, and one for-profit college. These instructors also taught a range of subjects, and many spent time learning about AI.

Though no two respondents are the same, most share a commonality: They changed their classroom policies and assignments because of AI.
Fewer than 10 said their assignments and policies remained the same. Many instructors suggested that they were ahead of their colleagues and their college's leadership in responding to AI. For example, many instructors added language to their syllabi stating what they considered appropriate and inappropriate AI use. Some described ongoing conversations with students about the technology's ethical implications, its problems with bias, and how to cite its use. For most instructors who responded to The Chronicle's inquiry, students' failure to properly cite work produced or shaped by AI was a serious violation.

Some professors changed or eliminated certain types of assignments. Several respondents said that even if they supported AI use in some ways, they significantly changed or had to cut some exercises or assessments. One instructor wrote that she no longer offers take-home exams for one of her courses.

Many professors said they allowed AI use for some assignments but not all. One common AI-aided assignment is to have students use ChatGPT to write an essay and then critique the bot's work. Instructors said this teaches students how AI works, which they said is necessary given its ubiquity. Several instructors were surprised to find that some students know little about the technology.

As for the instructors who want to minimize or eliminate the use of AI, more in-class work is the solution. Some changed assignments to include references to specific points or personal reflections that ChatGPT would not be able to help with.

When faculty members seek guidance from their institutions on difficult issues related to AI, they often find no official policies on the technology, or bare-bones plans. That leads to some questions: If institutions issue guidance on AI, what shape will it take? And will everyone be happy?

[Read Beth McMurtrie and Beckie Supiano's full story here](.
Lagniappe

- Read. Remember the 2016 wildfire that set Fort McMurray, Alberta, ablaze? So many wildfires have superseded that moment that you might not. [Fire Weather,]( by John Vaillant, details the conditions that created the wildfire in an oil town, and how it was an omen of devastating fires to come. (The New York Times)
- Listen. The album [Something Shines,]( by Lætitia Sadier, known for her membership in the band Stereolab, is reminiscent of the '90s band and still scratches the same itch for old fans. (Spotify)
- Watch. [Yosi, the Regretful Spy]( is a fictional TV series about the real events that led to the 1992 bombing of the Israeli Embassy in Buenos Aires, one of the worst terrorist attacks in Argentina's history. (IMDb, Amazon Prime)

—Fernanda and Heidi Landecker

Chronicle Top Reads

ACADEMIC INTEGRITY: [How Bad Are the Plagiarism Allegations Against the Harvard President? It Depends on Whom You Ask.]( By Emma Pettit and Megan Zahneis. Claudine Gay, under fire for comments at a congressional hearing last week, has also been accused of plagiarism. But several scholars she allegedly copied from dispute the charges.

THE REVIEW | OPINION: [Why the Presidents Couldn't Answer Yes or No]( By Rafael Walker. They behaved like academics. That's a good thing.

DATA: [What Do Americans Say About College? That Depends on What — and Whom — You Ask.]( By Brian O'Leary. Explore the nuances of public views of higher ed in 15 findings from Chronicle polling data.

© 2023 [The Chronicle of Higher Education]( 1255 23rd Street, N.W. Washington, D.C. 20037
