Integrating AI in Journalism Education
Journalism professors are acknowledging that AI must be addressed in the classroom. And this technology isn't new. The United States’ top newsrooms have been using AI for certain tasks since 2014.

Executive Summary

It’s no secret that journalism has struggled to adapt to the digital age. In an industry plagued by furloughs, layoffs, and the consolidation of the news-gathering process, anything that reduces the workload of the reporters, editors, and designers who remain is embraced. Some newsrooms have adopted AI tools for tasks like transcribing meetings to save time in the writing process, while others have turned to generative AI to draft articles themselves, which editors then review. 

Universities and organizations worldwide are diligently examining how AI can and should be implemented in modern newsrooms. 

The Generative AI In The Newsroom Project at Northwestern University aims to discover how AI could work in newsrooms today and in the future. It points out that while there are ways generative AI could save time and money, there are very legitimate concerns about accuracy and plagiarism, as well as the sustainability of using AI to create news content. 

In July, OpenAI (the company behind ChatGPT) and the nonprofit American Journalism Project announced a $10 million partnership in which AJP will disburse $5 million in grants to newsrooms, which will then experiment with AI. 

Meanwhile, the Associated Press is allowing ChatGPT to be trained on its archive going back to the 1980s in an attempt to adapt to the situation at hand: AI does not appear to be going away. 

The question on everyone’s minds is how AI can be used to enhance reporting, not replace reporters. 

Including information about AI in journalism education is critical to ensuring future journalists will be able to leverage the technology ethically. 

The “Fourth Estate” is built on journalists’ ethics and their responsibility to report the news objectively and with original work. Newsrooms around the world are trying to determine how to use AI without losing their readers’ trust. With accusations of “fake news” still echoing in the American psyche, reporters are understandably loath to take actions that might cost them the tenuous trust their communities have in them. 

Perhaps, as this article from the International Center for Journalists suggests, AI should be used and taught to automate some repetitive tasks, such as searching for file photos or stories to use for background, so reporters can do the important work of building relationships and finding human stories to connect with their audiences. 

Journalism professors are acknowledging that AI must be addressed in the classroom. And as this article points out, the United States’ top newsrooms have been using AI for certain tasks since 2014. 

Professors anticipate reporters using AI for prompting tasks, such as generating interview questions and drafting outlines. Their responsibility will be to teach aspiring journalists where the ethical line is and how to recognize it. 


Despite the stir ChatGPT created when it entered the market in late 2022, artificial intelligence has been used in some limited functions for nearly a decade among journalism’s most prominent circles. 

According to the Associated Press, the news organization first implemented AI in 2014 to automate corporate earnings stories. It chose to automate these stories not because they were unimportant but because they were time-consuming; using AI for them meant business reporters could focus on “higher-impact journalism.” 
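
The kind of template-driven automation described here can be sketched in a few lines of Python. This is a toy illustration, not the AP's actual system; the company name, figures, and wording are all invented.

```python
# Toy sketch of template-driven earnings coverage: structured filing data
# goes in, a short publishable brief comes out. Field names are illustrative.

def earnings_story(company: str, quarter: str, revenue_m: float,
                   prior_revenue_m: float, eps: float) -> str:
    """Render a short earnings brief from structured filing data."""
    change = (revenue_m - prior_revenue_m) / prior_revenue_m * 100
    direction = "rose" if change >= 0 else "fell"
    return (
        f"{company} reported {quarter} revenue of ${revenue_m:.1f} million, "
        f"which {direction} {abs(change):.1f}% from the prior quarter. "
        f"Earnings per share came in at ${eps:.2f}."
    )

print(earnings_story("Acme Corp", "Q2", 120.0, 100.0, 1.25))
```

Because the input is structured and the output is formulaic, a human editor only needs to spot-check the result, which is precisely why earnings stories were an early automation target.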

Since then, the nonprofit has partnered with startups and utilized AI in various ways. One program monitors social media with language processing tools for breaking news. Another automatically transcribes videos and automates shot lists and story summarizations. 
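
The social-media monitoring idea can be sketched with a simple keyword-weighting scheme. Real systems use full language-processing pipelines; the terms, weights, and threshold below are invented purely to illustrate the triage concept.

```python
# Minimal sketch of breaking-news triage on social posts. Each term carries
# a weight; posts whose combined score clears a threshold are flagged for
# a human editor. All terms and weights here are hypothetical.

BREAKING_TERMS = {"explosion": 3, "evacuation": 3, "shooting": 3,
                  "earthquake": 3, "fire": 2, "crash": 2, "outage": 1}

def score_post(text: str) -> int:
    """Sum the weights of breaking-news terms appearing in a post."""
    words = text.lower().split()
    return sum(weight for term, weight in BREAKING_TERMS.items() if term in words)

def flag_posts(posts: list[str], threshold: int = 3) -> list[str]:
    """Return posts meeting the alert threshold, for human review."""
    return [p for p in posts if score_post(p) >= threshold]

posts = ["huge fire and crash on I-95", "my cat is adorable",
         "earthquake felt downtown"]
print(flag_posts(posts))
```

The key design point survives even in this toy version: the software narrows the firehose, and a journalist makes the final call.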

The AP is in the process of developing a tagging system that will optimize content via image recognition. This will streamline the publishing process and let users find content faster, according to the AP. 

As of August 2023, however, the Associated Press does not use generative AI to create publishable content or images. 

Major publications across the globe have laid out their principles and approaches to using – or not using – AI in their newsrooms. They have expressed concerns about the reliability of AI tools for producing accurate information, the ethics of generative tools such as ChatGPT, which has been accused of training its model on copyrighted material, and the possibility of further straining trust in media organizations. 

Simultaneously, top newspapers are seeking guidance – and opening new positions – in order to lead the charge in the responsible use of GenAI. In September 2023, the New York Times sought a senior editor to work as a guiding hand for the publication’s approach to implementing AI in the newsroom, both in customer-facing and behind-the-scenes applications. 

The journalism industry is attempting to pivot and not be left in the figurative GenAI dust. However, critics point out, not unreasonably, that this situation is still developing, and journalists have very legitimate concerns about GenAI tools.

Significance of AI literacy in journalism careers

How AI can be used to improve journalism is a developing story, but media schools teaching young journalists need to be agile in addressing AI in the curriculum now. There’s a well-worn saying in journalism circles that if your mother says she loves you, you should seek a second source. In the AI age, journalists need to be more suspicious than ever about the “facts” as presented. With just a few keystrokes in ChatGPT, as illustrated in this article, anyone can create an authentic-sounding newspaper staff, website, and articles that are not real. Such fabrications are all too easy to produce, eroding what’s left of the public’s trust in journalism.

Objective of integrating AI in journalism syllabus

Journalists seem to be split into two camps when it comes to AI. The first group is filled with despair, crying out that this is the end of journalism as we know it. The second is a little more hopeful, foreseeing some ways AI can be used to bolster newsgathering rather than replacing the humans who keep it going. 

Journalism and media schools should be at the forefront of the exploration of how AI and GenAI can be used to improve and simplify the day-to-day lives of journalists, allowing them to focus on more hard-hitting topics. 

What AI can do is still up in the air. At this point, educators across all grade levels face a lack of tools to help teach students AI literacy, according to a scholarly article from the Georgia Institute of Technology. 

In an Online News Association article, stakeholders say J-schools should be teaching the limitations of the tool and where a human touch is still irreplaceable. Some studies are already in progress or complete.

Literature Review

Case studies of AI applications in journalism

Though this is a developing situation, educators and researchers have not hesitated to investigate the possible uses of AI in journalism. Here are a few of the case studies and scholarly articles released so far:

• “Imagination, Algorithms and News: Developing AI Literacy for Journalism,” by Mark Deuze of the University of Amsterdam and Charlie Beckett of the London School of Economics

• “Making Artificial Intelligence Work for Investigative Journalism,” by Jonathan Stray

• “Exo Journalism: A Conceptual Approach to a Hybrid Formula between Journalism and Artificial Intelligence,” by Santiago Tejedor

Overview of existing AI educational models in journalism

Because of the significant interest in and concern about the implications of generative AI in journalism, several prominent organizations have already begun to develop educational models and guidance for developing them. 

Partnership on AI, a nonprofit organization comprising “academic, civil society, industry, and media organizations,” is working to create solutions advancing positive results for people and society as a whole. The organization decided to open its Guidance for Safe Foundation Model Deployment up to public comment to help with the process of establishing agreed-upon practices for responsible model development and deployment. 

United Nations Educational, Scientific and Cultural Organization, or UNESCO, in 2023 released “Reporting on artificial intelligence: a handbook for journalism educators.” This handbook is free and available in its entirety online. 

The guide talks about:

  • Figuring out what machine intelligence is and the different kinds of AI
  • Looking into what AI can do and what its pros and cons are
  • Creating different worlds with AI by recognizing the common stories that shape people's minds 
  • Figuring out what role news plays in creating and managing AI debate
  • Finding ways to talk about AI in a way that is complex, realistic, and responsible
  • Making links to different types of journalism, from general news reports to data journalism

Challenges and opportunities in AI education for journalism

The main challenges and opportunities in providing quality AI education to aspiring journalists will be:

  • Keeping pace with the programs available and how they transform over time;
  • Finding options that enable journalists to work more effectively without “taking away their jobs;” and
  • Balancing the ethics journalists work under against the copyright and intellectual property issues that have arisen.

Theoretical Framework

Understanding AI and its applications in journalism

Reporting on Artificial Intelligence: A Handbook for Journalism Educators defines AI as “a collection of tools and technologies that are transforming operations and outcomes in diverse fields and sectors.” Generative AI, specifically, is trained on large bodies of existing work, including books, blogs, websites, social media posts, and more. 

AI is already being used in journalism to analyze business earnings and may similarly be used to understand complicated data sets more quickly. 

Pedagogical approaches for teaching AI in journalism

One of the frustrating parts about instructing students in journalism is the power of on-the-job learning – nothing teaches an aspiring journalist quite like doing the job. This is why internships are so valuable to journalism students and matter so much when it comes to getting a job in the field. 

In preparation for on-the-job work, practical simulations have long been used to teach media skills. In these, students produce different versions of a media piece from a clear brief and then compare and analyze the results. For example, different groups in a class could be given slightly different parameters (such as different target audiences) or be told to give the software different “prompts,” and then examine what comes back. ChatGPT, at least, should be able to show the sources it used, which allows for further questioning (though it’s not clear whether this is still possible). Comparing the results of different programs would also be instructive.

This could lead to a more artistic method with more possibilities, as long as it is done carefully.
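
The prompt-variation exercise described above can be sketched as a small helper that builds variants of one shared brief for different student groups. The brief, audiences, and wording here are hypothetical; the actual generation step would happen in whatever GenAI tool the class is using.

```python
# Sketch of a classroom simulation: one shared reporting brief, several
# prompt variants. Each group feeds its variant to a GenAI tool and the
# class compares the outputs. All example strings are invented.

def build_prompt(brief: str, audience: str, tone: str = "neutral") -> str:
    """Combine a shared brief with per-group variations (audience, tone)."""
    return (f"Write a 150-word news summary of the following brief for "
            f"{audience}, in a {tone} tone. Brief: {brief}")

brief = "City council votes 5-2 to fund a new bus rapid transit line."
variants = [build_prompt(brief, aud) for aud in
            ("daily commuters", "local business owners", "high school students")]
for v in variants:
    print(v)
```

Holding the brief constant while varying one parameter at a time is what makes the comparison meaningful: students can attribute differences in the output to the change they made.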

Ethical considerations of AI in journalism

For a journalist, ethics and integrity are among the most important principles guiding the day-to-day work of newsgathering. It shouldn’t be too surprising, then, that the ethical concerns many people have about how generative AI models were trained on creators’ work, possibly without their consent, have journalists raising objections. 

Journalists have several main concerns regarding the use of Generative AI: 

  1. Is an article written with the assistance of generative AI inherently plagiarism, since it is not the writer’s original work and is based on thousands of others’ works? Furthermore, is it inherently unethical to use current GenAI models, given that they were possibly trained without the authors’ consent?
  2. Suppose other large language models CAN be trained ethically, with the express consent of the works’ authors. How can publications trust that the information generated is correct and unbiased? 
  3. Furthermore, suppose an ethical, factually accurate, and unbiased AI database can be created. What is the danger that it will replace the living, breathing reporters and editors whose hard work and passion fill the pages of newspapers and their digital counterparts?

Journalists are rightfully a bit apprehensive about enabling anything that will lead to the reduction of available careers in the industry since many have faced layoffs, furloughs, and consolidations as the industry has contracted since the 2008 Great Recession. 

If, however, GenAI can be ethically and accurately used to make journalists’ daily workloads a little easier, this is something industry leaders are likely to pursue.

Of course, journalists are not the only ones with ethical concerns about intellectual property and the implications of generative AI. The Council of Europe reviewed the ethical concerns behind the use of AI while researching “The Impact of Artificial Intelligence on the Doctor-Patient Relationship.” While the implications of AI for medical systems differ from those for journalism, the exercise is still instructive; it identified six main concerns: 

  • Inconclusive Evidence: conclusions drawn by an algorithm are inherently uncertain;
  • Inscrutable Evidence: lack of access to the datasets on which AI models are trained means conclusions don’t stand up to scrutiny;
  • Misguided Evidence: “Conclusions can only be as reliable (but also as neutral) as the data they are based on;”
  • Unfair Outcomes: discrimination can occur even when it is based on “conclusive, scrutable and well-founded evidence;”
  • Transformative Effects: we don’t know how the widespread use of AI will change the way the world is organized and conceptualized; and
  • Traceability: “It is difficult to detect harms, find their cause, and assign blame when AI systems behave in unexpected ways.”

The report goes into a great amount of detail about how the opacity of the processes of decision-making within AI can have outsized effects on the populations and individuals who reap the results of those decisions. 

Practical Implementation

At the risk of repetition, educational institutions will have to balance the real-time development of AI tools with the need to produce students who are knowledgeable and think critically about AI and its practical uses. 

Nick Diakopoulos, associate professor of communication studies at Northwestern’s School of Communication, studies the implementation of AI in journalism with his “Generative AI In The Newsroom” project. 

He believes there can be ways AI will improve journalists’ jobs rather than just increasing their output. 

“Whether or not these technologies really impact labor depends on how these technologies are deployed by management,” Diakopoulos said in an article from Northwestern University. “Management could come in and say, ‘You’re going to write ten times as many stories.’ Or they could say, ‘You’re going to write the same number of stories with ten times as many sources because you can do ten times as many interviews in the same amount of time.’ I think option B is an awesome vision for what the future could be if we could improve the quality of the news media, rather than just the quantity.”

Curriculum development

There are many different “AI” tools available right now, from ChatGPT and Anyword to Grammarly and Zapier, used for organization, automation, content creation, and more. Journalism students should be taught to recognize when GenAI tools have been used to create content, so they are not taken in by false sources (fact-checking should also aid in this); to know which tools are ethically neutral and can streamline workflows; and to understand what usage would constitute plagiarism, such as using GenAI tools to create an entire story. The following is a suggestion for courses that could help future journalists grapple with the ongoing development of AI writing tools. 

Suggested courses and modules

Course: Ethical Use of AI Tools in Journalism

  • Module 1: Recognizing AI use in copy 
  • Module 2: When is it okay to use AI?
  • Module 3: When is it unethical to use AI?
  • Module 4: The grunt work AI tools can do for you (so you can get to the good stuff)
  • Module 5: AI tools being used now in newsrooms

Collaboration with industry experts

Fortunately, there is no shortage of career journalists who are already on this story. Nick Diakopoulos is just one associate professor industriously studying how AI is and may be used in real-life newsrooms.  

In the UK, David Caswell has interviewed journalists at more than 40 organizations and is on the optimistic end of the curve when it comes to AI’s impact on journalism. It will not be difficult at the university level to find and collaborate with industry experts who are interested in the impact of GenAI on journalism.

Infrastructure and Resources

Necessary hardware and software

Thankfully, the study of GenAI and journalism will not require more powerful computers than are already part and parcel of a university or college’s media school. J-schools and media schools may be compelled to subscribe to the more popular GenAI tools to provide access to their students. Perhaps, as with tools like Adobe Photoshop and Publisher, the companies will allow more affordable access for students.

Training of faculty

Training of faculty will need to take place hand-in-hand with the development of the specific courses and will need to be updated regularly as tools evolve.

Start by evaluating your current technology systems, including infrastructure and software. Identify areas for optimization or integration with new solutions. Then, choose technologies aligned with your needs, scalability, usability, and mission.

Assessment and Evaluation

Measuring student learning outcomes

Options for measuring student learning outcomes include:

  • Standardized or subject-specific tests and exams, as well as locally developed, course-integrated assessments
  • Portfolios of student work, which can show how much students have learned over time
  • Final assignments, performances, or presentations for a class or program

Continuous improvement of the AI-integrated syllabus

As AI evolves, the syllabus and courses must be updated to adjust for these changes.


Journalists have a healthy skepticism of most things, and this extends to generative AI and its utility for improving their daily workflow. However, for every grizzled reporter convinced this developing technology will take all of their jobs, there is likely another, more optimistic and rational individual who is ready and willing to adopt ethical iterations of generative AI into their workflow. Indeed, major journalistic institutions like the Associated Press have been using AI to simplify and offload some onerous tasks since 2014. 

Including information about AI in the education provided to aspiring journalists is critical to ensuring they will be able to leverage the technology ethically. Young journalists should be trained to recognize the content created by GenAI and also have hands-on experience with the tasks AI can improve, such as generating interview questions, outlines, or summaries. 

Studies, collaborations, and research into AI’s use in journalism are ongoing despite the relative novelty of generative AI. The curriculum will have to be updated as new AI tools evolve and are created. 

The reality regarding AI is that these tools are not fading away, and a working knowledge of how to use them ethically and responsibly will become a valuable asset to aspiring journalists as they enter the job market. 
