
[Inside the Newsroom] Confronting journalism’s Oppenheimer moment

2025/12/14 11:00

Over the weekend, Nieman Lab — known for reporting on the intersection of technology and journalism — published a flurry of thought-provoking predictions for journalism in 2026.

A number of them caused a stir among Filipino journalists. These included one which declared, “Sorry, the explainer is dead,” and another which predicted the rise of a new type of journalism — “one not directed at people, but tailored explicitly to machine compilers of language and information.” 

The writer of the first opinion piece argued that, because people are turning to AI for things like explainers and evergreen content, these formats no longer matter. What works? The author said it is the basics: hyperlocal news, breaking news, scoops, notable first-person narratives, and investigative journalism. “AI,” the expert said, “can’t or won’t summarize this information because it’s too recent or too unique.”

The author who wrote about agentic journalism, meanwhile, noted that “AI systems do not need ledes, nut-graphs, or narrative flows; they need user-relevant, novel, and machine-readable content.”

The role of agentic journalism, he further pointed out, was to write the five Ws, quotes, context, and links to multimedia content. All of these, the author noted, are then assembled and customized at the point of delivery, based on what suits the end user. 

Both pieces troubled me. Thinking about it further now, I realize that it is because these observations, while dire, are true to some degree. 

In fact, what troubled me the most was that these articles did not really go far enough. And they did not delve into real solutions to existential challenges newsrooms around the world are facing today. 

Hello! I am Gemma Mendoza, head of digital services and lead researcher for disinformation and platforms at Rappler. 

Since Rappler was launched, I have been working closely with our tech and data teams in the design and development of systems and platforms that produce content at scale using data. These projects range from the highly interactive election results pages we have launched every election year since 2013 to our GraphRAG-powered chatbot, Rai.

I also lead our research on disinformation and hate in platforms, which began as we observed changes in Facebook’s algorithms towards the 2016 elections. 

As mentioned earlier, I agree with many things that the authors of the Nieman Lab articles raised. For instance, indeed, a better semantic architecture will help these large language models surface information better. At Rappler, we worked on our own knowledge graph before we went on to develop our chatbot Rai.

But I want to talk about the troubling things that we have observed while working on AI systems and their impact on our own platform. 

The author of the first article I mentioned was mistaken, however, when she said that AI cannot summarize unique information.

If you let them, AI systems can easily summarize unique content. In fact, even if you explicitly tell them not to, they will find a way to do this.  

We know this for a fact because we have been observing the AI bots that crawl and harvest content from the Rappler website. Despite the AI-scraping restrictions in our website rules, these chatbots continue to happily churn out data and content that Rappler’s team has painstakingly gathered.

In one case, ChatGPT instantly provided detailed information on the findings of a report written by our campus journalist fellow about an entrenched dynasty in the Bicol region that spent around a million pesos on social media ads to attack its rivals. The information for this story was pieced together over several months by a student journalist who participated in our fellowship program. (See the screenshot below.)

Anybody familiar with scalable systems will tell you that when a search engine queries a database for the first time, there is a lag in delivery. Even Google’s search engine needs time to process and index information. Based on its instant response, ChatGPT clearly had the entire article in its own database even before we queried it.

This is not the only unique piece of content published by Rappler that the bot has chewed on. 

When we added AI scraping restrictions to our robots.txt file, ChatGPT acknowledged that it was aware of these restrictions. However, it still found a way to use our content, initially by claiming that it used publicly available sources or by referencing other sites that also scraped and synthesized our content — also without permission. (See below.)
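For readers curious what those restrictions look like, opt-outs of this kind are expressed in a site’s robots.txt file by listing crawlers under their published user-agent names. Below is a minimal, hypothetical sketch, not Rappler’s actual file; the user-agent names are ones the AI companies themselves publicly document:

```text
# Hypothetical robots.txt excerpt blocking documented AI crawlers
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# Ordinary search crawlers remain allowed
User-agent: *
Allow: /
```

Compliance with robots.txt is voluntary, which is exactly the gap described above: a crawler can acknowledge the rules and still route around them.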

My point is that we are already talking to the machines. We have been for a while. This is nothing new.

This is what we do every time we shift the way we write our stories and social media captions to conform to updates in Facebook and Google’s algorithms. We are making it easier for these machines to understand our content as well as use and monetize what we have gathered, written, and produced. 

Over the years, we were told that adding microtags to stories would help surface our content better. And so we did it. The question really is whether newsrooms and journalists should continue to go in this direction.
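For context, the “microtags” referred to here are structured-data markup such as schema.org JSON-LD, which newsrooms embed in article pages so crawlers can parse a story’s key fields directly. A minimal, hypothetical example follows; the field values are placeholders, not taken from any real story:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example headline",
  "datePublished": "2025-12-14T11:00:00+08:00",
  "author": {"@type": "Person", "name": "Jane Reporter"},
  "publisher": {"@type": "Organization", "name": "Example News"}
}
</script>
```

Markup like this is precisely what makes a story easy for machines to ingest, the double-edged sword at issue here.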

This used to be a symbiotic relationship. Optimizing for these platforms used to reward Rappler with substantial traffic, allowing us to scale. 

Now, with AI, we are being pushed again to “optimize” — this time, for answer engines. 

But here’s the dilemma: these now “agentic” systems are totally different beasts. They are information-hungry machines that want to be know-it-alls. 

Unlike helpful librarians, they parasitically — without asking for permission — extract and suck the juice off content they feed upon.

And they do not want to pay for that content. Believe me, we asked.

OpenAI did pay a handful of publishers. To this day, none of them is from the Global South.

And the problem is that these chatbots cater to an audience that is increasingly being behaviorally engineered toward instant gratification. These audiences do not click on the links so subtly placed within AI-generated answers. Many times, they do not even check whether the links are real.

And the numbers show it. The traffic even the largest news publishers are getting from these parasitic systems is downright pitiful.

What is worse is that journalists, and the news industry itself, are not taking this seriously enough. Some major media networks are taking what they can by inking deals with the AI platforms. Others who have less clout are fading helplessly on the fringes. 

As another article from the Nieman Lab series pointed out: “We’re not learning from the past. Instead, we’re approaching this new era of generative AI much like we did platforms.”

If this continues, the author said, “We guarantee the death of most remaining newsrooms.”

I agree. 

This is journalism’s Oppenheimer moment. 

The question is this: Should journalists and newsrooms swallow this directive once again, hook, line, and sinker, as we did with social media?

And more to the point: Does the public whom we serve ultimately benefit if we continue down this path of surrendering to the platforms? Or, for their sake and ours, should we push back and chart an alternative path, one that is not parasitic but works toward symbiosis?

If journalism is nothing more than a new information harvesting exercise for systems that transform that harvest into this product called “customized content,” what becomes of its soul and its mission?

These questions need to be confronted not just by us journalists, but also by citizens who believe that independent journalism needs to survive as a check on power, abuse, and corruption. 

We need supportive communities to collaborate with us on this journey. 

Over the past couple of years, Rappler has been building a mobile platform that will allow you to directly converse with us and with the rest of our community in a way that is safe and free from manipulation.  

We have also started working with other newsrooms in the Philippines and in the ASEAN region towards building a bottom-up approach to news circulation — a more sustainable path to growing audience and revenue beyond fickle Big Tech algorithms.

If you have not yet done so, download our mobile apps and subscribe to our newsletters so that you can have that direct line with us.

If you have not yet done so, please consider signing up for Rappler Plus, our premium membership program. And please help us convince others.

We need you to help us confront journalism’s Oppenheimer moment. 

Help us carve these pathways to retaining agency so we can sustain our mission in this world of generative AI. – Rappler.com
