Share your experiences
Do you have experience with service flooding? If so, we invite you to share your example via this form. As part of a joint research project between TNO and the Hertie School, we are building a database of examples.
AI as an accelerator of existing tensions
Government services are designed around an expected level of use. In practice, not every citizen makes use of the services and benefits they are entitled to. Some are discouraged by bureaucracy and complex procedures, while others are unaware that they qualify. When AI lowers these barriers, demand can increase suddenly. This exposes a fundamental tension: governments are expected to treat every citizen equally and fairly, while operating with limited processing capacity and budgets.
In addition, government services form part of a country’s critical infrastructure. AI lowers the threshold not only for legitimate requests but also for misuse. For example, a malicious actor may now mass-submit fake calls to emergency services, or freedom-of-information requests designed to overwhelm agencies. When requests are not handled in time, backlogs develop and public dissatisfaction grows, in a context where public service delivery is already under pressure.
Early signals from practice
The impact of developments in generative AI is becoming increasingly visible. Municipalities are seeing a rise in the number of objections submitted, and courts report receiving legal documents that appear to have been drafted with generative AI. In one recent court case, both the municipality and the court spent considerable time refuting arguments that cited non-existent sources, likely hallucinated by ChatGPT.
Similar signals are emerging in the United Kingdom and Germany. The UK’s independent Financial Ombudsman Service reports a sharp increase in complaints drafted using AI, while Germany’s highest social court warns that courts and executive agencies risk becoming structurally overloaded by AI-generated objections and appeals.
‘As AI agents become more widely used by the general public, they may further increase pressure on public services.’
Emerging AI capabilities could soon expose more services
At present, AI applications that support text production are particularly popular. As AI agents become more widely used by the general public, they may further increase pressure on public services. These systems are able to navigate processes independently, remember context and follow up on actions.
Technology that already exists, though it is not yet widely adopted, makes it possible to combine agents with real-time speech synthesis so that they can make telephone calls to service providers. If AI agents are granted access to personal data, they could actively search for benefits someone may be entitled to and fully automate the submission of applications on that person’s behalf. It is therefore likely that more government services will be exposed to service flooding in the future.
How should the government respond?
Governments can respond by deploying AI themselves to process the increased demand, for instance in handling correspondence or evaluating applications. This may increase processing capacity, but it also introduces risks related to vendor lock-in, bias and explainability. Another response is to introduce new barriers, such as transaction fees or additional identity verification. However, this conflicts with the ambition of accessible and human-centred public services and may be legally restricted.
A more sustainable approach is to treat service flooding as a catalyst for further modernisation of public service delivery. By investing in structured service interfaces, digital identity, standardised data formats and proactive services, public services can be made more resilient by design. In some cases, filing an application can be eliminated entirely when benefits are pushed to eligible citizens automatically.
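To make the idea of proactive services concrete: approaches such as "Rules as Code" express eligibility criteria as executable logic rather than prose, so an agency can scan its own registry and grant a benefit without waiting for an application. The sketch below is a minimal illustration only; the benefit name, the income threshold and the age limit are invented for this example and do not correspond to any real scheme.

```python
from dataclasses import dataclass

@dataclass
class Citizen:
    age: int
    annual_income: int  # in euros

# Hypothetical eligibility rule for an invented "housing allowance".
# The thresholds are illustrative, not real policy.
def eligible_for_housing_allowance(c: Citizen) -> bool:
    return c.age >= 18 and c.annual_income < 30_000

# Because the rule is executable, the agency can evaluate it proactively
# over its registry and push the benefit to eligible citizens,
# eliminating the application step entirely.
def proactive_grants(registry: list[Citizen]) -> list[Citizen]:
    return [c for c in registry if eligible_for_housing_allowance(c)]
```

The same machine-readable rule could also be consumed by citizens' own tools, which reduces the incentive to flood the agency with speculative applications in the first place.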
Building resilient and future-proof public service delivery
AI-driven service flooding demonstrates that government services are under pressure in a digital society where interaction is becoming increasingly simple and scalable. Within TNO, we work on a range of technologies and approaches that can contribute to solutions, including Privacy-Enhancing Technologies, digital identity and Rules as Code.
TNO Vector connects these developments to the policy and implementation challenges that organisations face. By looking at systems, laws and technology together, we help public sector organisations move towards service delivery that is human-centred, transparent and resilient. We invite you to share your experiences, challenges and ideas for possible solutions.
This article was developed in collaboration with Chris Schmitz, researcher at the Centre for Digital Governance, Hertie School, Germany.