19.7% vs 51.1%
Approval rate, with and without pre-application advice (cleaned cohort)

Among the small-site applications we’ve coded across London where the officer report unambiguously says whether pre-application advice was taken, the ones that took it are approved at 19.7%. The ones that didn’t, at 51.1%. Which, taken at face value, suggests pre-app more than halves your odds of approval, and ought therefore to be avoided.

That isn’t quite what’s happening. But it isn’t a story about contaminated data either, which is what I assumed when I first looked at the figures. The cleaning didn’t make the gap go away. The headline finding survives a fairly careful pass at separating "officer report mentions pre-app boilerplate" from "officer report says applicant actually took pre-app". So the gap is real, and worth understanding.

What the cleaning did

The pre-app flag in the underlying dataset fires whenever an officer’s report explicitly references pre-application advice. Two phrasings dominate. The first is approving: the applicant engaged in pre-application advice, the scheme was amended accordingly, and the application is now in its revised form. The second is the boilerplate that appears in refusals where pre-app was offered but not taken. It reads, with minor variations, like this:

The Council is ready to enter into discussions with the applicants to assist in the preparation of a new planning application via the Council’s Pre-Application process. The applicant is encouraged to utilise this service prior to the submission of any future application.

This second phrasing is a polite way of telling the applicant: pre-app exists, you didn’t use it, do consider it next time. It refers to a future application that hasn’t happened yet. The flag, in our raw coding, fires for both phrasings, which contaminates the headline correlation.

I built a text classifier to separate them. Of the 471 applications originally flagged as having pre-app mentioned in their officer report, 188 turn out to be cases where pre-app was genuinely taken; 121 are cases where the boilerplate refers to pre-app having not been taken; and 162 mention pre-app in a way that’s ambiguous from the truncated 200-character extract we have, and are excluded from the cleaned comparison. The remaining 845 unflagged applications are combined with the 121 reclassified-as-not-taken to give the cleaned no-pre-app cohort of 966.
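The classifier itself isn’t published here, but the shape of the split can be sketched with rules. This is illustrative only: the phrase patterns below are hypothetical stand-ins for the two dominant phrasings, not the production classifier.

```python
import re

# Hypothetical patterns standing in for the two dominant phrasings.
TAKEN = re.compile(
    r"(pre-application advice (was|has been) (sought|taken|received))|"
    r"(following pre-application (discussions|advice))", re.I)
NOT_TAKEN = re.compile(
    r"(ready to enter into discussions)|"
    r"(encouraged to (utilise|use) this service)|"
    r"(prior to the submission of any future application)", re.I)

def classify(extract: str) -> str:
    """Label a truncated officer-report extract."""
    if NOT_TAKEN.search(extract):
        return "not_taken"   # refusal boilerplate offering pre-app next time
    if TAKEN.search(extract):
        return "taken"       # report says advice was actually sought
    return "ambiguous"       # excluded from the cleaned comparison

print(classify("The applicant is encouraged to utilise this service prior "
               "to the submission of any future application."))
# → not_taken
```

Checking the "not taken" boilerplate first matters: a refusal report can reference pre-app in several places, and the offer-of-future-advice phrasing is the more specific signal.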

The cleaning moves the headline numbers from 24.8% / 53.1% in the raw cohort to 19.7% / 51.1% in the cleaned one. The gap actually widens slightly, from 28.3pp to 31.4pp. Which is the opposite of what I expected.
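The widening is just arithmetic on the published rates:

```python
# Approval rates as (with pre-app, without pre-app), in percent.
raw = (24.8, 53.1)      # raw flag, before reclassifying boilerplate mentions
cleaned = (19.7, 51.1)  # after reclassification

def gap(pair):
    """Percentage-point gap between the two cohorts."""
    return round(pair[1] - pair[0], 1)

print(gap(raw), gap(cleaned))  # → 28.3 31.4
```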

The borough-by-borough picture is much more interesting

The aggregate hides the most useful finding. Among the boroughs with at least five applications in each cohort:

Borough                  With pre-app  Without  n with  n without  Gap
Enfield                  66.7%         53.0%    9       100        +13.7pp
Kensington and Chelsea   72.7%         66.7%    11      36         +6.1pp
Havering                 30.0%         27.5%    10      51         +2.5pp
Lewisham                 32.4%         33.3%    37      6          −0.9pp
Brent                    0.0%          40.3%    5       124        −40.3pp
Croydon                  0.0%          50.7%    100     144        −50.7pp
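The table above can be reproduced from per-borough counts. The approval counts below are backed out of the published rates and cohort sizes (e.g. 53.0% of 100 = 53), so treat them as inferred rather than raw data; the minimum-five filter is the table’s own inclusion rule.

```python
# borough: (approved with pre-app, n with, approved without, n without)
# Approval counts inferred from the published rates, not raw data.
boroughs = {
    "Enfield":                (6, 9, 53, 100),
    "Kensington and Chelsea": (8, 11, 24, 36),
    "Havering":               (3, 10, 14, 51),
    "Lewisham":               (12, 37, 2, 6),
    "Brent":                  (0, 5, 50, 124),
    "Croydon":                (0, 100, 73, 144),
}

for name, (aw, nw, ao, no) in boroughs.items():
    if min(nw, no) < 5:
        continue  # inclusion rule: at least 5 applications per cohort
    gap = 100 * (aw / nw - ao / no)  # percentage points
    print(f"{name:24s} {100*aw/nw:5.1f}% vs {100*ao/no:5.1f}%  gap {gap:+.1f}pp")
```

Computing the gap from the unrounded fractions, rather than the rounded percentages, is what makes the K&C gap come out at +6.1pp rather than +6.0pp.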

Croydon is doing most of the work in the headline figure. One hundred small-site schemes that took pre-app, zero approved. Compared to 144 that didn’t, of which roughly half got through. That isn’t a generic story about pre-app. It’s a Croydon story.

Enfield and K&C, by contrast, show pre-app correlating with slightly better outcomes. The samples are small enough to be cautious about, but the direction is consistent across both, and inconsistent with the headline.

What I think is actually going on

Two interpretations seem more likely than "pre-app harms your application".

The first is selection bias. Schemes that take pre-app are not a random subset. They are disproportionately the schemes the developer or the council suspects might struggle: backland sites, conservation areas, density above the local pattern, sensitive neighbour relationships, schemes the council has already had concerns about. The pre-app conversation is where the officer raises specific issues, and the formal application is what the developer chose to file in response. Whether they addressed the issues or not is a function of how reasonable the requested changes were against the developer’s commercial constraints.

The second is borough culture. In some boroughs (Enfield, K&C in our cohort) pre-app appears to function as a refinement mechanism: officers identify problems, applicants amend, the application progresses. In others (Croydon and Brent in our cohort) pre-app appears to function as a paper trail. Officers raise concerns at pre-app, the developer files anyway, the refusal report cites the unaddressed pre-app concerns as additional grounds. The mechanism is different even though the procedure is nominally the same.

The Croydon zero-approval-out-of-100 figure is striking enough that the selection-bias and borough-culture explanations probably both contribute. Either way, the suggestion that taking pre-app harms your scheme on its merits is harder to defend than the suggestion that pre-app filters and reflects scheme problems differently in different boroughs.

Inside Croydon: a closer look at all 100 cases

Worth pausing on Croydon properly. One hundred small-site schemes took pre-app and were refused. None were approved. The natural question is whether something more specific than "borough culture" is going on.

Of those 100 refusal reports, 81 explicitly state that pre-application advice was given and was not adopted. Sixteen reference a specific pre-app file number (the form is something like 22/00805/PRE), so we know advice was provided in writing. The rest use Croydon’s standard "suggested improvements were not adopted by the applicant" boilerplate. Across all 100, the same picture: officers gave specific advice; the formal application that followed didn’t reflect it.
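Pulling those file references out of report text is mechanical. A sketch, assuming the YY/NNNNN/PRE format seen in the example reference holds throughout the cohort:

```python
import re

# Matches references like 22/00805/PRE: two-digit year, five-digit serial,
# PRE suffix. The format is assumed from the single example in the text.
PRE_REF = re.compile(r"\b\d{2}/\d{5}/PRE\b")

report = ("The suggested improvements set out under pre-application "
          "reference 22/00805/PRE were not adopted by the applicant.")
print(PRE_REF.findall(report))  # → ['22/00805/PRE']
```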

The refusal reasons themselves cluster predictably. Across 497 individual reasons cited on these 100 schemes, the top categories are:

  • Parking, cycle and refuse provision: 20.3% of reasons
  • Overlooking, sense of enclosure, neighbour amenity: 19.9%
  • Scale, massing, bulk: 18.5%
  • Design, character, streetscene: 16.5%
  • Missing or insufficient legal agreement / S106 / CIL: 13.7%
  • Internal space standards (NDSS, head height, storage): 10.1%
  • Biodiversity and tree retention: 4.6%
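The categorisation behind those percentages can be sketched as a keyword tally. The article’s actual coding scheme isn’t published, so the categories and keywords below are illustrative stand-ins; note that one refusal reason can land in several categories, which is why the shares above can sum past 100%.

```python
# Illustrative stand-in for the refusal-reason coding scheme.
CATEGORIES = {
    "parking_cycle_refuse": ("parking", "cycle", "refuse storage"),
    "neighbour_amenity":    ("overlooking", "enclosure", "amenity"),
    "scale_massing":        ("scale", "massing", "bulk"),
    "design_character":     ("character", "streetscene", "design"),
}

def categorise(reason: str) -> list[str]:
    """Return every category whose keywords appear in a refusal reason.
    A single reason can match several categories."""
    text = reason.lower()
    return [cat for cat, words in CATEGORIES.items()
            if any(w in text for w in words)]

print(categorise("Excessive scale and bulk, harmful to neighbour amenity"))
# → ['neighbour_amenity', 'scale_massing']
```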

The qualitative samples are unambiguous. A 5-unit demolish-and-rebuild in Coulsdon, refused on scale, massing, "overengineered appearance, excessive hardstanding"; rear terraces overlooking two neighbours. A 7-unit DMR in Selhurst, refused on insufficient three-bedroom family homes (Policy SP2.7) plus excessive scale and bulk. A 9-unit scheme in Sanderstead, refused on the same housing-mix shortfall plus "visually dominant and out of keeping with the established pattern of development". A 4-unit conversion in a town-centre conservation area where the pre-app reference (22/00805/PRE) is cited explicitly; the council had advised on studio sizing at pre-app, and the formal application was refused on heritage harm and overlooking.

What this looks like at population level is not a story about pre-app being a bad investment. It is a story about Croydon’s pre-app process functioning as designed and the response from developers being the variable. Officers identify specific issues. The applicant has a choice: amend the scheme to address them, or file unchanged in the hope that committee politics or appeal precedent will save the day. In Croydon, the second choice is the dominant one in our cohort, and it is reliably wrong.

Three of the most common pre-app concerns in the Croydon cohort relate to issues a developer can either address upfront or treat as a reason to sell the site on: housing mix above the 3-bed family-housing threshold (Policy SP2.7), scale matching the local development pattern (DM10/SP4), and parking and refuse provision compliant with London Plan T4/T6. Schemes that genuinely amend in line with pre-app advice on these three issues, on the data as it stands, would not be in the cohort that ends up here.

Worth knowing

The recommendation that follows is unromantic. Pre-app advice is most useful in boroughs where, on present evidence, it is associated with better outcomes (Enfield, K&C, Havering, Lewisham in our cohort). It is least useful, and possibly counter-productive in the formal sense of leaving a damaging paper trail, in boroughs where the pre-app process appears to filter for unresolvable issues (Croydon, Brent).

If a scheme isn’t commercially viable at the unit count or design that pre-app advice would suggest, the right call is probably one of: redesign so it is, sell the site to someone whose model fits the borough, or appeal the principle of the lower number with proper evidence. Filing the original scheme through pre-app and into formal application unchanged is the option the cumulative figures argue against.

Most developers don’t notice this borough-specific pattern because they’re working from a sample of one or two recent schemes in one or two boroughs. The aggregate doesn’t tell them anything useful, and the per-borough figure they’d need is hard to assemble without somebody having coded the dataset first.

The dashboards in each borough show the local pattern. The pre-app effect, like most things in London planning, is a borough thing, not a London thing.

Browse all 33 borough dashboards for the local pattern. Or commission a Site Assessment on a specific address.

Related

  • Croydon: 78% of refusals cite design quality
  • 74% of refusal reasons are design-codifiable
  • Inner vs outer London: the 14.5pp refusal gap