It all started with a single complaint from a public sector body in the East Midlands in 2004 about a construction contract. But it quickly became clear something larger was afoot.
The resulting investigation was exhaustive, including site visits, interviews and leniency deals in exchange for information. It took months of combing through records, correspondence and tender applications, but the evidence was there waiting to be found.
And there it was, the smoking gun – an internal document with a small letter ‘c’ scrawled alongside certain bids to indicate the corresponding figure was a cover price – an unrealistically high bid designed to make another competing firm’s inflated price look more plausible.
This was just one aspect of a massive cartel uncovered in connection with the Building Schools for the Future programme. During that period, from 2000-06, bid rigging – a form of cartel behaviour – was so common in the construction sector that company directors later admitted to not thinking twice before picking up the phone to a competitor and asking: “Can you give us a cover?” All this has emerged in a report into the case by the Office of Fair Trading (OFT).
Other incriminating evidence was often also just as mundane, from emails discussing bid-rigging to matching handwriting on different bid envelopes or supposedly competing applications. The scam led to an estimated £200m increase in the price local authorities across the country paid for schools, universities and hospitals. In 2009 more than 100 firms were fined a total of £129m in one of the largest cases of its kind.
But it all only came to light because the OFT decided to pursue that single complaint.
Documentary evidence against bid rigging can be persuasive when found but John Kirkpatrick, a senior director at the OFT’s successor organisation, the Competition and Markets Authority (CMA), said: “Sometimes that kind of thing is there but there’s no reason why you’d necessarily spot it without looking for it.”
Many cartel investigations still only happen because of whistleblowers, leniency deals or incredibly diligent buyers or auditors. That’s why Kirkpatrick has been leading a team developing a tool to give power back to buyers.
Specifically, it is a data science tool that lets buyers scan their own datasets in-house. “Our first objective was just to try and test whether it was possible to design algorithmic tests that one could run on procurement data that might give you a hint that something was amiss,” said Kirkpatrick. The tool flags warning signs including unusual bidding patterns, suspicious pricing and low-endeavour entries and, although it cannot confirm the presence of a cartel, it can quickly sift through a large number of procurements and tell buyers which ones deserve closer investigation.
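The article does not describe the CMA tool's actual tests, but the kind of algorithmic screen it alludes to can be sketched. The example below is an illustrative assumption, not the CMA's method: it flags tenders whose losing bids cluster suspiciously far above the winner – the signature one would expect from cover pricing, where losers deliberately bid high.

```python
# Minimal sketch of a cover-price screen (illustrative only).
# A tender is flagged when the gap between the winning (lowest) bid and
# the losing bids is large relative to the spread among the losers.
from statistics import stdev

def cover_price_screen(bids, gap_threshold=2.0):
    """Return True if a tender's bid pattern looks like cover pricing.

    `bids` is a list of bid prices for one tender; `gap_threshold` is an
    assumed tuning parameter, not a value from the CMA tool.
    """
    if len(bids) < 3:
        return False  # too few bids to compare spreads meaningfully
    ordered = sorted(bids)
    winner, losers = ordered[0], ordered[1:]
    spread = stdev(losers)
    if spread == 0:
        return True   # identical losing bids are themselves suspicious
    gap = losers[0] - winner
    return gap / spread > gap_threshold

# Three losing bids tightly clustered well above the winner -> flagged.
print(cover_price_screen([100_000, 138_000, 140_000, 141_000]))  # True
# Bids spread out naturally -> not flagged.
print(cover_price_screen([100_000, 104_000, 120_000, 141_000]))  # False
```

A screen like this cannot prove collusion – genuinely uncompetitive markets produce similar patterns – which is why, as the article notes, the tool only tells buyers where to look more closely.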
This can mean the difference between breaking a cartel and never knowing a crime was committed.
Enterprising procurement authorities have been able to download a previous version, but this official release carries the government’s digital quality assurance.
Bid-rigging is non-competitive behaviour that can take several forms, including bid suppression, bid withdrawal and cover pricing – where artificially high bids are submitted to make a pre-agreed winner’s price look competitive. Bid-rigging agreements usually include some form of compensation for the losing parties, meaning everyone in the cartel wins but the buyer pays more. The public sector is not uniquely vulnerable but has certain factors that make it a target – tenders are large and the process is transparent and rules-driven. This is good procurement practice, said Kirkpatrick, but it makes the outcomes easier to predict and that creates an opportunity for rigging.
The scale of the problem is unknown because, as with any illegal behaviour, cartels go out of their way to hide what they are doing. What is generally accepted is the practice can lead to the buyer paying up to 30% more than under a competitive tender.
Bid-rigging makes for a good entry point into the world of data science for the CMA because the patterns are visible in the procurement data held by procuring authorities. “That wouldn’t necessarily be true in other areas,” said Kirkpatrick. He describes the tool as “quite sophisticated, but not big data”, as it only uses data sets from individual procurement authorities and the analytic tools would be familiar to most statisticians. “It’s not a panacea for cartel hunting, but we think it’s quite a useful tool for procurers.”
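Kirkpatrick's point that the analytics "would be familiar to most statisticians" can be illustrated with another standard screen – again an assumption for the sketch, not the CMA's actual test. Colluding bidders tend to price less independently, so a tender whose bids vary far less than is typical for the same authority may merit a closer look.

```python
# Illustrative low-variance screen over an authority's own tenders.
# The threshold factor and data are assumptions, not the CMA tool's.
from statistics import mean, stdev

def coeff_of_variation(bids):
    """Bid dispersion scaled by price level, so tenders of different
    sizes can be compared."""
    return stdev(bids) / mean(bids)

def low_variance_screen(tenders, factor=0.5):
    """Return indices of tenders whose bid dispersion falls below
    `factor` times the median dispersion across all tenders."""
    cvs = [coeff_of_variation(b) for b in tenders]
    benchmark = sorted(cvs)[len(cvs) // 2]  # median dispersion
    return [i for i, cv in enumerate(cvs) if cv < factor * benchmark]

tenders = [
    [100, 120, 135],   # normal spread
    [200, 202, 203],   # suspiciously tight clustering
    [80, 95, 110],
]
print(low_variance_screen(tenders))  # [1]
```

Note that this needs nothing beyond data the authority already holds from its own tendering – which is the design point Kirkpatrick makes: "quite sophisticated, but not big data".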
What is novel is that it gives power back to the buyer by relying on activity by the procurer – “in other words by the victim of crime,” said Kirkpatrick. Because it uses data collected through the normal procurement process, buyers can look for signs of bid-rigging without tipping off any potential cartelists, and it may even function as a deterrent if the market knows buyers have access to the tool. “The ideal outcome in our world is not that we detect more cartels, it’s that there aren’t any,” said Kirkpatrick. “We would much rather they were all deterred than we found lots of them.”
Kirkpatrick said it was conceivable similar data science tools could be used to analyse patterns of contract awards to find evidence of market sharing or bid rotation-type arrangements, but that is “materially more complicated because you need a different pattern of data and the participation of many more stakeholders,” he said. “We haven’t gone that far as yet.”
The CMA is now part of what Kirkpatrick described as “an international brotherhood and sisterhood” of competition authorities that are working to use data against cartels. Switzerland has even used quantitative analysis on data held by procuring authorities as evidence to support a successful prosecution. “They brought that case to fruition and those companies were fined some tens of millions of Swiss francs.”
There is no formal timeframe for reviewing the new tool, but Kirkpatrick hopes user feedback will show where it needs improvement. So far it has had more than 100 downloads.
“Do we think we’ve hit the limits of using data science as a detection mechanism? No. Are we pleased we’ve achieved what we think we have so far? Yes,” he said. “Will we think about what more we might do? Well I guess we probably want to wait and see how well adopted, how well used and the results of that use out of this tool before we invest materially in more data science for detection.”