
Badlands
‘There are badlands of the Earth, but also badlands of memory – whited-out areas that the mind fills in as best it can.’ By Thomas Meaney
Drones and Decolonization
‘Brody was rich in fresh flowers and fresh grief.’ By William T. Vollmann


Do the Democrats really want reform? by Andrew Cockburn
The puzzle of AI facial recognition by Michael W. Clune
Has the Treasury market started to crack? by Mary Childs
A high school student opens her laptop and types a question: What is Hamlet really about? Within seconds, a sleek block of text appears—elegant, articulate, and seemingly insightful. She pastes it into her assignment, hits submit, and moves on. But something vital is lost: not just effort or time, but a deeper encounter with ambiguity, complexity, and meaning. What if the greatest threat to our intellect isn’t ignorance but the ease of instant answers?

In a world increasingly saturated with generative AI (GenAI), our relationship to knowledge is undergoing a tectonic shift. These systems can summarize texts, mimic reasoning, and simulate creativity with uncanny fluency. But what happens to intellectual inquiry when answers arrive too easily? Are we growing more informed—or less thoughtful?
To navigate this evolving landscape, we turn to two illuminating frameworks: Daniel Kahneman’s Thinking, Fast and Slow and Chrysi Rapanta et al.’s essay Critical GenAI Literacy: Postdigital Configurations. Kahneman maps out how our brains process thought; Rapanta and her colleagues reframe how AI reshapes the very context in which that thinking unfolds. Together, they urge us not to reject the machine, but to think against it—deliberately, ethically, and curiously.
Kahneman’s landmark theory proposes that human thought operates through two systems. System 1 is fast, automatic, and emotional. It leaps to conclusions, draws on experience, and navigates the world with minimal friction. System 2 is slow, deliberate, and analytical. It demands effort—and pays in insight.
GenAI is tailor-made to flatter System 1. Ask it to analyze a poem, explain a philosophical idea, or write a business proposal, and it complies—instantly, smoothly, and often convincingly. This fluency is seductive. But beneath its polish lies a deeper concern: the atrophy of critical thinking. By bypassing the cognitive friction that activates System 2, GenAI risks reducing inquiry to passive consumption.
As Nicholas Carr warned in The Shallows, the internet already primes us for speed, scanning, and surface engagement. GenAI, he might say today, elevates that tendency to an art form. When the answer is coherent and immediate, why wrestle to understand? Yet intellectual effort isn’t wasted motion—it’s precisely where meaning is made.
Rapanta and her co-authors offer a vital reframing: GenAI is not merely a tool but a cultural actor. It shapes epistemologies, values, and intellectual habits. Hence, the need for critical GenAI literacy—the ability not only to use GenAI but to interrogate its assumptions, biases, and effects.
Algorithms are not neutral. As Safiya Umoja Noble demonstrated in Algorithms of Oppression, search engines and AI models reflect the data they’re trained on—data steeped in historical inequality and structural bias. GenAI inherits these distortions, even while presenting answers with a sheen of objectivity.
Rapanta’s framework insists that genuine literacy means questioning more than content. What is the provenance of this output? What cultural filters shaped its formation? Whose voices are amplified—and whose are missing? Only through such questions do we begin to reclaim intellectual agency in an algorithmically curated world.
Kahneman reveals how prone we are to cognitive biases—anchoring, availability, overconfidence—all tendencies that lead System 1 astray. GenAI, far from correcting these habits, may reinforce them. Its outputs reflect dominant ideologies, rarely revealing assumptions or acknowledging blind spots.
Rapanta et al. propose a solution grounded in epistemic courage. Critical GenAI literacy is less a checklist than a posture of reflective questioning, skepticism, and moral awareness. It invites us to slow down and dwell in complexity—not just asking “What does this mean?” but “Who decides what this means—and why?”
Douglas Rushkoff’s Program or Be Programmed calls for digital literacy that cultivates agency. In this light, curiosity becomes cultural resistance—a refusal to surrender interpretive power to the machine. It’s not just about knowing how to use GenAI; it’s about knowing how to think around it.
Interpretation is inherently plural—shaped by lens, context, and resonance. Kahneman would argue that System 1 offers the quick reading: plot, tone, emotional impact. System 2—skeptical, slow—reveals irony, contradiction, and ambiguity.
GenAI can simulate literary analysis with finesse. Ask it to unpack Hamlet or Beloved, and it may return a plausible, polished interpretation. But it risks smoothing over the tensions that give literature its power. It defaults to mainstream readings, often omitting feminist, postcolonial, or psychoanalytic complexities.
Rapanta’s proposed pedagogy is dialogic. Let students compare their interpretations with GenAI’s: where do they diverge? What does the machine miss? How might different readers dissent? This meta-curiosity fosters humility and depth—not just with the text, but with the interpretive act itself.
This reimagining impacts education profoundly. Critical literacy in the GenAI era cannot stop at knowing how to prompt a model; it must extend to interrogating provenance, bias, and the interpretive choices embedded in every output.
Educators become co-inquirers, modeling skepticism, creativity, and ethical interrogation. Classrooms become sites of dialogic resistance—not rejecting AI, but humanizing its use by re-centering inquiry.
A study from Microsoft and Carnegie Mellon highlights a concern: when users over-trust GenAI, they exert less cognitive effort. Engagement drops. Retention suffers. Trust, in excess, dulls curiosity.
Emerging neurocognitive research suggests that overreliance on GenAI may dampen activation in brain regions associated with semantic depth. Early, still-speculative work from the MIT Media Lab points in the same direction: effortless outputs reduce the intellectual stretch required to make meaning.
But friction isn’t failure—it’s where real insight begins. Miles Berry, in his work on computing education, reminds us that learning lives in the struggle, not the shortcut. GenAI may offer convenience, but it bypasses the missteps and epiphanies that nurture understanding.
Creativity, Berry insists, is not merely pattern assembly. It’s experimentation under uncertainty—refined through doubt and dialogue. Kahneman would agree: System 2 thinking, while difficult, is where human cognition finds its richest rewards.
The implications reach beyond academia. Curiosity fuels critical citizenship, ethical awareness, and democratic resilience. GenAI may simulate insight—but wonder must remain human.
Ezra Lockhart, writing in the Journal of Cultural Cognitive Science, contends that true creativity depends on emotional resonance, relational depth, and moral imagination—qualities AI cannot emulate. Drawing on Rollo May and Judith Butler, Lockhart reframes creativity as a courageous way of engaging with the world.
In this light, curiosity becomes virtue. It refuses certainty, embraces ambiguity, and chooses wonder over efficiency. It is this moral posture—joyfully rebellious and endlessly inquisitive—that GenAI cannot provide, but may help provoke.
A flourishing postdigital intellectual culture would treat GenAI not as an oracle but as a provocation: an occasion for slower, more deliberate thought.
In this culture, Kahneman’s System 2 becomes more than cognition—it becomes character. Rapanta’s framework becomes intellectual activism. Curiosity—tenacious, humble, radiant—becomes our compass.
The future of thought will not be defined by how well machines simulate reasoning, but by how deeply we choose to think with them—and, often, against them. Daniel Kahneman reminds us that genuine insight comes not from ease, but from effort—from the deliberate activation of System 2 when System 1 seeks comfort. Rapanta and colleagues push further, revealing GenAI as a cultural force worthy of interrogation.
GenAI offers astonishing capabilities: broader access to knowledge, imaginative collaboration, and new modes of creativity. But it also risks narrowing inquiry, dulling ambiguity, and replacing questions with answers. To embrace its potential without surrendering our agency, we must cultivate a new ethic—one that defends friction, reveres nuance, and protects the joy of wonder.
Thinking against the machine isn’t antagonism—it’s responsibility. It means reclaiming meaning from convenience, depth from fluency, and curiosity from automation. Machines may generate answers. But only we can decide which questions are still worth asking.
THIS ESSAY WAS WRITTEN BY AI AND EDITED BY INTELLICUREAN

How accountable are US intelligence agencies to the president and Congress? By Richard Norton-Taylor
Henry James’s return to the United States By Alicia Rix
New light on ‘Captain’ Warner’s weapon of mass destruction By Trevor Pateman
A poet for yesterday and today By Emma Greensmith

In a series of terse, unsigned orders, the court has often been giving the green light to President Trump’s agenda without a murmur of explanation.
Economists say it will take time for the effects of trade policies to show up in economic data — but acknowledge they aren’t sure how long.
After years of pressing to end Ukraine aid, many Republicans have changed positions now that President Trump is supporting the country against Russian aggression.


Give us, this day, our sustainable daily bread
From eating better-quality meat to buying seasonal and local produce, Jane Wheatley suggests how we can shop smart to aid the environment
Solar, so good
Banks of solar panels covering farmland have sparked much opposition, but, with local input, could they be a force for good, wonders William Kendall
No job too big
Kate Green trumpets the native breeds best suited to grazing Britain’s green and pleasant land, as our farmers walk a fine line balancing food production and biodiversity recovery

‘It’s terrifying, but also an absolute dream’
Henrietta Bredin talks to Errollyn Wallen, Master of the King’s Music, about composing in a lighthouse and going on stage
Liz Fenwick’s favourite painting
The novelist picks a trailblazing nude by the first female RA
A passion for plasterwork
John Goodall discovers a neo-Classical delight when he takes a peek behind the unassuming frontage of a Swansea terrace

The legacy
Kate Green admires Rachel Carson’s seminal Silent Spring
A wing and a prayer
Hannah Bourne-Taylor extols the importance of feeding over the ‘hungry gap’ to help our beleaguered farmland birds

Country Life’s Little Green Book
We all want to shop well, but how to decipher the marketing? Madeleine Silver picks a handful of brands that do what they say

The good stuff
Let those bangles jangle, urges Hetty Lintell, with her bracelet pick
Interiors
Arabella Youens admires the rich refurbishment of a Scottish fishing lodge and laments the scarcity of trusty English oak
True grit
Gravel gardens are becoming ever more popular, but what are the secrets to making them a success, wonders Non Morris

Winging it
The ‘flying barn door’ that is the magnificent white-tailed eagle is returning to our shores. Mark Cocker, for one, is very glad
Arts & antiques
A lost technique is being revived by a Swiss sculptor, as pioneering women of science are celebrated, reveals Carla Passino
War and peace
Tom Young’s intricate, powerful paintings capture the beauty and the heartbreak of Lebanon. Octavia Pollock meets him
All the world on one stage
Michael Billington finds Ralph Fiennes at his brooding best as Sir David Hare’s engrossing new play premieres in Bath

The Consumer Price Index rose 2.7 percent from a year ago, as the global trade war started to bite.
Official figures showed modest growth in the second quarter as exports shifted to other countries and Beijing invested in manufacturing and infrastructure.
Former government employees are finding that perhaps the only thing harder than getting laid off from the federal government is staying that way.
The decision allows President Trump to fire thousands of employees, functionally eliminating an agency created by Congress without legislators’ input.
Pentagon officials said details were still being worked out, and experts doubted President Trump’s threat of huge tariffs for Russian trading partners.

Is the United States truly ready for the seismic shift in modern warfare—a transformation that The New Yorker’s veteran war correspondent describes not as evolution but as rupture? In “Is the U.S. Ready for the Next War?” (July 14, 2025), Dexter Filkins captures this tectonic realignment through a mosaic of battlefield reportage, strategic insight, and ethical reflection. His central thesis is both urgent and unsettling: that America, long mythologized for its martial supremacy, is culturally and institutionally unprepared for the emerging realities of war. The enemy is no longer just a rival state but also time itself—conflict is being rewritten in code, and the old machines can no longer keep pace.
The piece opens with a gripping image: a Ukrainian drone factory producing a thousand airborne machines daily, each costing just $500. Improvised, nimble, and devastating, these drones have inflicted disproportionate damage on Russian forces. Their success signals a paradigm shift—conflict has moved from regiments to swarms, from steel to software. Yet the deeper concern is not merely technological; it is cultural. The article is less a call to arms than a call to reimagine. Victory in future wars, it suggests, will depend not on weaponry alone, but on judgment, agility, and a conscience fit for the digital age.
At the heart of the analysis lies a confrontation between two worldviews. On one side stands Silicon Valley—fast, improvisational, and software-driven. On the other: the Pentagon—layered, cautious, and locked in Cold War-era processes. One of the central figures is Palmer Luckey, the founder of the defense tech company Anduril, depicted as a symbol of insurgent innovation. A virtual-reality prodigy who founded Oculus before turning to defense, he now leads teams designing autonomous weapons that can be manufactured as quickly as IKEA furniture and deployed without extensive oversight. His world thrives on rapid iteration, where warfare is treated like code—modular, scalable, and adaptive.
This approach clashes with the military’s entrenched bureaucracy. Procurement cycles stretch for years. Communication between service branches remains fractured. Even American ships and planes often operate on incompatible systems. A war simulation over Taiwan underscores this dysfunction: satellites failed to coordinate with aircraft, naval assets couldn’t link with space-based systems, and U.S. forces were paralyzed by their own institutional fragmentation. The problem wasn’t technology—it was organization.
What emerges is a portrait of a defense apparatus unable to act as a coherent whole. The fragmentation stems from a structure built for another era—one that now privileges process over flexibility. In contrast, adversaries operate with fluidity, leveraging technological agility as a force multiplier. Slowness, once a symptom of deliberation, has become a strategic liability.
The tension explored here is more than operational; it is civilizational. Can a democratic state tolerate the speed and autonomy now required in combat? Can institutions built for deliberation respond in milliseconds? These are not just questions of infrastructure, but of governance and identity. In the coming conflicts, latency may be lethal, and fragmentation fatal.
To frame the stakes, the essay draws on powerful historical precedents. Technological transformation has always arisen from moments of existential pressure: Prussia’s use of railways to reimagine logistics, the Gulf War’s precision missiles, and, most profoundly, the Manhattan Project. These were not the products of administrative order but of chaotic urgency, unleashed imagination, and institutional risk-taking.
During the Manhattan Project, multiple experimental paths were pursued simultaneously, protocols were bent, and innovation surged from competition. Today, however, America’s defense culture has shifted toward procedural conservatism. Risk is minimized; innovation is formalized. Bureaucracy may protect against error, but it also stifles the volatility that made American defense dynamic in the past.
This critique extends beyond the military. A broader cultural stagnation is implied: a nation that fears disruption more than defeat. If imagination is outsourced to private startups—entities beyond the reach of democratic accountability—strategic coherence may erode. Tactical agility cannot compensate for an atrophied civic center. The essay doesn’t argue for scrapping government institutions, but for reigniting their creative core. Defense must not only be efficient; it must be intellectually alive.
Perhaps the most haunting dimension of the essay lies in its treatment of ethics. As autonomous systems proliferate—from loitering drones to AI-driven targeting software—the space for human judgment begins to vanish. Some militaries, like Israel’s, still preserve a “human-in-the-loop” model where a person retains final authority. But this safeguard is fragile. The march toward autonomy is relentless.
The implications are grave. When decisions to kill are handed to algorithms trained on probability and sensor data, who bears responsibility? Engineers? Programmers? Military officers? The author references DeepMind’s Demis Hassabis, who warns of the ease with which powerful systems can be repurposed for malign ends. Yet the more chilling possibility is not malevolence, but moral atrophy: a world where judgment is no longer expected or practiced.
Combat, if rendered frictionless and remote, may also become civically invisible. Democratic oversight depends on consequence—and when warfare is managed through silent systems and distant screens, that consequence becomes harder to feel. A nation that no longer confronts the human cost of its defense decisions risks sliding into apathy. Autonomy may bring tactical superiority, but also ethical drift.
Throughout, the article avoids hysteria, opting instead for measured reflection. Its central moral question is timeless: Can conscience survive velocity? In wars of machines, will there still be room for the deliberation that defines democratic life?
The closing argument is not tactical, but philosophical. Readiness, the essay insists, must be measured not just by stockpiles or software, but by the moral posture of a society—its ability to govern the tools it creates. Military power divorced from democratic deliberation is not strength, but fragility. Supremacy must be earned anew, through foresight, imagination, and accountability.
The challenge ahead is not just to match adversaries in drones or data, but to uphold the principles that give those tools meaning. Institutions must be built to respond, but also to reflect. Weapons must be precise—but judgment must be present. The republic’s defense must operate at the speed of code while staying rooted in the values of a self-governing people.
The author leaves us with a final provocation: The future will not wait for consensus—but neither can it be left to systems that have forgotten how to ask questions. In this, his work becomes less a study in strategy than a meditation on civic responsibility. The real arsenal is not material—it is ethical. And readiness begins not in the factories of drones, but in the minds that decide when and why to use them.
THIS ESSAY REVIEW WAS WRITTEN BY AI AND EDITED BY INTELLICUREAN.

As the death toll climbs in Texas, the Trump Administration is actively undermining the nation’s ability to predict—and to deal with—climate-related disasters. By Elizabeth Kolbert
With global conflicts increasingly shaped by drones and A.I., the American military risks losing its dominance. By Dexter Filkins
The discomfort of loneliness shapes us in ways we don’t recognize—and we may not like what we become without it. By Paul Bloom

President Trump has earned a reputation for bluffing on tariffs. But he has steadily and dramatically raised U.S. tariffs, transforming global trade.
After years of lavishing praise on the Russian leader, President Trump abruptly changed his posture amid frustration with the lack of a cease-fire.
A group of University of Virginia alumni had long called for eliminating D.E.I., without much success. Then they gained a new ally: President Trump.