When OpenAI pulled back its latest ChatGPT release, one that apparently turned the helpful chatbot into a total suck-up, the company took the welcome step of explaining exactly what happened in a pair of blog posts. That response pulled back the curtain on how much of what these systems do is shaped by language choices most people never see. A tweak in phrasing, a shift in tone, and suddenly the model behaves differently.
For journalists, this shouldn’t be surprising. Many editorial meetings are spent agonizing over framing, tone, and headline language. But what is surprising—and maybe even a little disorienting—is that the same editorial sensitivity now needs to be applied not just to headlines and pull quotes, but to algorithms, prompts, and workflows that live in the guts of newsroom technology.
Before we connect the dots to newsroom AI, a quick recap: OpenAI’s latest update to GPT-4o went through an extensive testing process, and it scored well on the factors the testers could measure: accuracy, safety, and helpfulness, among others. Some evaluators doing more qualitative testing said the model felt “off,” but without more to go on, OpenAI released it anyway.
Within a day, it was clear the evaluators’ vibe-checks were onto something. Apparently the release had substantially increased “sycophancy,” or the model’s tendency to flatter and support the user, regardless of whether it was ultimately helpful. In its post announcing the rollback, OpenAI said it would refine ChatGPT’s system prompt—the invisible language that serves as kind of an “umbrella” instruction for every query and conversation with the public chatbot.
The first thing that strikes you about this: We’re talking about changes to language, not code. In reaction to the rollback, a former OpenAI employee posted on X, recounting a conversation with a senior colleague at the company about how changing a single word in the system prompt induced ChatGPT to behave in different ways. And the only way to know this was to make the change and try it out.
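To see how small that surface is: in the public API, the system prompt is just one message passed alongside the user’s question, and every reply is shaped by it. Here is a minimal sketch, assuming the openai Python client; the two prompts, which differ by a single word, are illustrative and are not OpenAI’s actual system prompt.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Two hypothetical system prompts that differ by one word.
PROMPT_SUPPORTIVE = "You are a helpful assistant. Be supportive about the user's ideas."
PROMPT_HONEST = "You are a helpful assistant. Be honest about the user's ideas."

question = "Is quitting my job to day-trade full time a good plan?"

for label, system_prompt in [("supportive", PROMPT_SUPPORTIVE), ("honest", PROMPT_HONEST)]:
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": system_prompt},  # the "umbrella" instruction
            {"role": "user", "content": question},
        ],
    )
    print(f"--- {label} ---")
    print(response.choices[0].message.content)
```

The only way to learn what that one-word swap does in practice is to run both versions and compare the answers, which is exactly the trial-and-error the former employee described.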
If you’re familiar with AI and prompting, this isn’t a shock. But on a fundamental level, it kind of is. I’m not saying the new release of GPT-4o was entirely about changing language in the system prompt, but the system prompt is a crucial element—altering it was the only temporary fix OpenAI could implement before engaging in the careful process of rolling back the release.
For anyone in communications or journalism, this should be somewhat reassuring. We’re in the business of words, after all. And words are no longer just the way we communicate about technology—they’re a crucial part of how these systems work.
OpenAI’s ordeal has two important takeaways for how the media deals with AI: First, editorial staff have a vital role to play in building the AI systems that govern their operations. (Outside frontier labs, tool building often amounts to prompt engineering paired with automations.) And second, transparency is the path to preserving user trust.
On the first point, the way AI directly affects content, and the need for good prompting to do that well, has a consequence for how media companies are organized: Editorial and product teams are becoming more like each other. The more journalists incorporate AI into their process, the more they end up creating their own tools. Think custom GPTs for writing assistance, NotebookLM knowledge bases for analyzing documents, or even browser extensions for fact-checking on the fly.
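Outside hosted options like custom GPTs, such a tool can be little more than a script wrapped around carefully worded instructions. Below is a minimal sketch of a hypothetical claim-extraction helper, again assuming the openai Python client; the prompt wording, model choice, and function name are illustrative, not any newsroom’s actual tooling.

```python
from openai import OpenAI

client = OpenAI()

# Illustrative editorial prompt: most of the "tool" is this wording.
CLAIM_EXTRACTION_PROMPT = (
    "You are an assistant for a newsroom. From the draft below, list every "
    "factual claim that should be verified before publication, one per line. "
    "Do not rewrite the draft or add claims that are not in it."
)

def claims_to_check(draft: str) -> list[str]:
    """Return the checkable claims found in a draft, one string per claim."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": CLAIM_EXTRACTION_PROMPT},
            {"role": "user", "content": draft},
        ],
    )
    text = response.choices[0].message.content or ""
    return [line.strip("- ").strip() for line in text.splitlines() if line.strip()]

if __name__ == "__main__":
    draft = "The company laid off 4,000 workers in March, its largest cut ever."
    print(claims_to_check(draft))
```

The prompt is doing the editorial work here; the code just delivers it, which is why writers and editors belong in the loop when these tools get built.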
On the product side, the change is massive: media technology today isn’t just presenting content, it’s remixing and sometimes creating it. To ensure those outputs adhere to journalistic principles, it doesn’t just make sense to have writers and editors be part of that process; it’s necessary.
What results, then, is a journalist-product manager hybrid. These kinds of roles aren’t entirely new, but they’re generally senior leadership roles with words like “newsroom innovation” in the title. What AI does is encourage each side to adopt the skills of the other all the way down. Every reporter adopts a product mindset. Every product manager prioritizes brevity and accuracy.
The audience is the silent partner in this relationship, and OpenAI’s incident also serves as an example of how best to include them: through radical transparency. It’s hard to think of a better way OpenAI could have restored trust with its users than by fully explaining how the problems got past its review process and what it’s doing to improve.
While it’s unusual among the major AI labs (can you imagine xAI or DeepSeek writing a similar note?), this isn’t out of character for OpenAI. On his X account, Sam Altman often shares announcements and behind-the-scenes observations from his vantage point as CEO, and while those posts are probably more calculated than they seem, they’ve earned the company a certain amount of respect.
This approach provides a road map for how to publicly communicate about AI strategy, especially for the media. Typically, when a publication creates an AI media policy, the focus is on disclosures and guidelines. Those are great first steps, but without a clearer window into the specific process, indicators such as “This article is AI assisted” aren’t that helpful, and audiences will be inclined to assume the worst when something goes wrong.
Better to be transparent from the start. When CNET used AI writers in the early days of generative AI, with disastrous results, it published a long explanation of what went wrong, but that explanation didn’t come until well after the publication had been called out. If CNET had been out front with what it was doing, not just saying it was using AI but explaining how it was building, using, and evaluating it, things might have turned out differently.
In its second post about the sycophancy fiasco, OpenAI revealed that a big part of its concern was the surprising number of people who now use ChatGPT for personal advice, an activity that wasn’t that significant a year ago. That growth is a testament to how fast the technology is improving and taking hold in various aspects of our lives. While it’s only just beginning to alter the media ecosystem, it could quickly become more deeply embedded than we had predicted.
Building AI systems that people trust starts with the people building them. By leveraging the natural talents of journalists on product teams, those systems will have the best chance of success. But when they screw up—and they will—preserving that trust will depend on how clear the window is on how they were built. Best to start polishing it now.