AI-powered nimbyism could grind UK planning system to a halt, experts warn


The government’s plan to use artificial intelligence to accelerate planning for new homes may be about to hit an unexpected roadblock: AI-powered nimbyism.

A new service called Objector is offering “policy-backed objections in minutes” to people who are upset about planning applications near their homes.

It uses generative AI to scan planning applications and check for grounds for objection, ranking these as “high”, “medium” or “low” impact. It then automatically creates objection letters, AI-written speeches to deliver to the planning committees, and even AI-generated videos to “influence councillors”.

Kent residents Hannah and Paul George designed the system after estimating they had spent hundreds of hours attempting to navigate the planning process when they opposed plans to convert a building near their home into a mosque.

For £45 a time, they are offering the tool to people who, like them, could not afford a specialist lawyer to help navigate labyrinthine planning laws. They said it would help “everyone have a voice, to level the playing field and make the whole process fairer”.

It is a modest enterprise but it is not alone. A similar service, Planningobjection.com, is promoting £99 AI-generated objection letters with the tagline “stop moaning and take action”.

Community campaigners have also used Facebook to encourage supporters to craft objection letters with ChatGPT, claiming it is like having “a planning solicitor at your fingertips”.

One leading planning lawyer warned that such AI tools could “supercharge nimbyism” and, if widely used, cause the planning system to “grind to a halt”, with planning officials potentially deluged with submissions.

Sebastian Charles said his firm, Aardvark Planning Law, had seen AI-generated objections to planning applications that included references to previous cases and appeal decisions that, when checked by a human lawyer, did not exist.

“The danger is decisions are made on the wrong basis,” he said. “Elected members making final decisions could easily believe AI-generated planning speeches made by members of the public, even if they are full of made up case law and regulations.”

Hannah George, a co-founder of Objector, denied the platform was about automating nimbyism.

“It’s just about making the planning system fair,” she said. “At the moment, from our experience, it’s not. And with the government on this ‘build, baby, build’ mission, we see that only going one way.”

Objector has said that, while AI-created errors are a concern, it uses two different AI models and cross-checks the results in an effort to reduce the risk of “hallucinations” – the term used when AIs make things up.

The current Objector system is designed to tackle small planning applications – for example, the repurposing of a local office building or a neighbour’s home extension. A capability to challenge much larger applications, such as a housing estate on green belt land, is in development, George said.

The Labour government has been promoting AI as one solution to clearing planning backlogs. It recently launched a tool called Extract, which aims to speed up planning processes and help the government carry out its mission to build 1.5m new homes.

But there may be an AI “arms race” developing, said John Myers, the director of the Yimby Alliance, a campaign calling for more homes to be built with the support of local communities.

“This will turbocharge objections to planning applications and will lead to people finding obscure reasons [for opposing developments] that they have not found before,” he said.

A new dynamic could emerge “where one side tries to deploy AI to accelerate the process, and the other side deploys AI to stop it,” he said. “I don’t see an end to that until we find a way to bring forward developments people want.”

The government may already have an AI system that could respond to a rise in AI-generated objections. It has launched an AI tool called Consult, which analyses responses to public consultations.

It did so in the expectation that “widespread adoption of large language models [such as that used by Objector] will likely only increase the number of responses that consultations attract”.

Paul Smith, the managing director of the consultancy Strategic Land Group, reported this month on the rising use of AI by people opposing planning applications.

“AI objections undermine the whole rationale for public consultation,” he wrote in Building magazine. “Local communities, we are told, know their areas best … So, we should ask them what they think.

“But if all local residents are doing is deciding they don’t like the scheme before uploading the application documents to a computer to find out why they don’t like it, is there really any point in asking them at all?”
