
We are approaching a moment that deserves pause — not panic, not outrage — but re-classification.
Some technologies have quietly crossed from product into infrastructure.
AI is one of them.
So are the platforms we now use simply to remain in touch with family, friends, memory, and shared life.
When that happens, the funding model stops being a neutral detail.
It becomes the architecture of behaviour.
This is not a critique of innovation.
It is a proposal for alignment.
Advertising-funded systems behave exactly as they are designed to behave:
- they optimise for attention
- they reward speed and scale
- they treat restraint as a cost
That tension shows up repeatedly — whether in weak moderation, distorted incentives, or the inability of large platforms to say “no” without commercial penalty.
The issue is not that control is impossible.
It is that control competes with growth.
A different model already exists
We already trust one of the world’s most ambitious knowledge projects under a radically different structure.
Wikipedia is:
- open
- donation-supported
- non-advertising
- governed for trust, not engagement
It is not perfect — but it is allowed to slow down, to self-correct, and to refuse commercial capture.
That matters.
A proposal worth testing
What if certain AI systems were treated similarly?
- Open or auditable by design
- Funded by voluntary subscription
- Governed transparently
- Free from advertising pressure
Not all AI.
Not every company.
But at least one civic-grade AI, whose primary obligation is public trust rather than quarterly growth.
The incentive would change:
- from capturing attention
- to retaining confidence
That single shift alters everything downstream.
Public broadcasters and the digital public square
Public broadcasters already carry this responsibility — but increasingly on platforms they do not control.
What if organisations like the BBC hosted:
- their own non-commercial digital platforms
- community spaces without adverts
- interaction ordered by time or editorial judgment, not algorithmic amplification
A digital public square, not a shopping mall.
We already accept that:
- libraries do not sell to visitors
- parks do not display billboards
- public radio does not interrupt thought every few minutes
Why should basic digital connection be different?
If you change the funding model, you change the behaviour.
If you change the behaviour, you change the culture.
This is not nostalgia.
It is systems thinking.
The real question
The question is no longer “Can this be built?”
It is “What are we willing to classify as a public good?”
Once AI and digital connection mediate:
- knowledge
- memory
- social trust
- collective understanding
their governance becomes an ethical choice, not a technical one.
This is an invitation to explore that choice — calmly, openly, and without assuming that advertising is the only way the future can pay for itself.
This piece sits alongside others in a longer work on language, pressure, and the quiet mechanics of power.