The cover image shows Zhengbin Fishing Harbor in Keelung, Taiwan. I took the photo on a day trip to Keelung in January 2026. According to the tour guide, before the houses were painted, it was just an insignificant little port along the northeastern coast of Taiwan. Once the houses were painted, however, it became a huge hit and attracted many local and foreign tourists. That speaks of ingenuity and a different way of seeing things, which is the subject of this post.
The “One Question a Day” series is an ongoing experiment where I leverage AI to learn and write (not “learn to write”) more productively. I pick a thought that keeps bugging me (a spark) and explore it with an “AI co‑worker” (the one on my team now is Perplexity) to uncover new knowledge and insights.
Today’s spark
Early this week, I was deeply immersed in preparing a keynote speech titled “Managing Change in the Age of AI”. My research surfaced a core theme: organisations must move from episodic change to continuous change because AI tools keep evolving. It sounded reasonable at first, but the more I reflected on it, the more holes emerged.
“Continuous change” arises because the subject of change—the AI tool itself—now evolves on its own, becoming smarter at the same tasks and better at new ones over time. However, “continuous change” suggests we live in a state of permanent non‑equilibrium. Does that even make sense? When I stress-tested this perspective in the context of an AI-powered newsroom, it didn’t seem to hold water. It made journalists sound like passengers being “managed through” tool changes, instead of people shaping how those tools fit their craft.
My conversation with AI
I asked whether this language of “continuous change management” tends to make workers, like journalists, reactive. The short answer: yes, if change is driven only from the top, people easily become objects of change, not partners in it, and job insecurity goes up even as productivity improves.
New knowledge and perspectives
- “Continuous change” is not the finish line.
- Agency has to be built into the job.
- The same AI lands very differently depending on structure.
Being in constant flux is not a workable end state. A more useful picture is an equilibrium where roles are redesigned so humans own clearly defined domains that AI will not take over—investigation, sourcing, narrative, judgment—while AI sits around that core to handle speed, volume, and formats.
Telling people to “embrace change” is not enough. Organisations need to bake agency into role design: make it explicit where AI is expected to help, where humans choose how to use it, and where only humans can decide. That is how you move from “continuous change happening to me” to “an evolving partnership I can work with.”
In a redesigned newsroom, new roles make AI feel like an extension of human work. In a traditional structure with no role redesign, staff can feel squeezed while gig workers feel empowered. The tool is the same; what changes is the mix of role design, employment model, and incentives.
My next question
If newsrooms can deliberately design “AI‑surrounded human roles” for journalists, what would an equivalent design look like for other domains I care about—consulting, HR, or the public sector?
Sources surfaced by AI for this post
- IBM – Transforming change management with responsible AI (https://www.ibm.com/think/insights/change-management-responsible-ai)
- McKinsey – 5 steps for change management in the gen AI age
- Reuters Institute – How will AI reshape the news in 2026?
- Schibsted – How Schibsted is using AI to boost efficiency for their newsrooms and their readers
- Naviga – The changing role of journalists in the age of AI (https://www.navigaglobal.com/the-changing-role-of-journalists-in-the-age-of-ai/)
- CNA / Today – AI is not wiping out all entry-level jobs, but it’s changing the way juniors learn (https://www.channelnewsasia.com/today/big-read/ai-junior-entry-level-jobs-youn)