A few months back we began a project examining the online fiscal transparency of Virginia’s 135 cities and counties. What we offer today are our preliminary results. Because the results are indeed preliminary, we will not be naming names, good or bad. We are first giving the municipalities an opportunity to dispute our scores and show us where we missed information; given the general lack of usability found on more than a few of these sites, it’s not out of the realm of possibility that we did in fact miss something.
Before we get there, though, it’s likely instructive to provide an overview of our measurement standards and explain our methodology. We graded each municipality on a 100-point scale using 16 criteria, which were not weighted equally and can, roughly speaking, be broken into four groupings:
Budget Documents Presented (36 Points): This category looked for the inclusion of a Portal Page (5), Archives (5), the Advertised and Adopted Budgets (5 and 10 respectively), Markup Information (5), and the Carryover Package (3). In this section we also looked for localities to include some context (3) so citizens would understand what document they were looking at and how it fit into the larger picture. While the majority of these items can be objectively measured (either the municipalities provided them or they did not), the context and portal page items were obviously more subjective. For the portal page, we felt there should be a single, easily accessible website or webpage where all of a locality’s budget information is posted. Ideally all the relevant information (items like past years’ budgets, contractor information, and expenditure data) would be in one place, but at a minimum it would all be linked from one place. For both the portal page and the context item we used Arlington as the standard against which other sites were judged. For the archives category, scoring full points required archives going back five years or more; anything less garnered half points.
Extent (25): Here we went one step beyond simply having the documents and looked at what exactly was in them. We felt that budget documents should provide data down to the program level (10, with half credit for drilling down only to the department level), allowing citizens to see how much is being spent on each program or office the county runs. We also looked for the documents to be searchable (5, full points if they were indexed in Google, half if you could only search within the PDF itself) and available in easy-to-access formats such as Microsoft Excel files (5, half if there was a disclaimer urging you to contact county staff for the data in that format), allowing citizens to truly drill down, manipulate the data, and make comparisons across time and against other localities. The last thing we tried to evaluate was how timely the posting of documents was. This proved challenging. Some cases were easy, such as the municipalities that haven’t managed to post anything for fiscal year 2010. But for localities that do have information up, it is difficult to tell when it went up, especially relative to that locality’s budget process. This is one area where we will be asking municipalities to provide us with information on their policies to help evaluate this category.
Expenditure Information (25): This is where the vaunted online checkbook (10, with half points for posting the Comprehensive Annual Financial Report) was taken into account. The federal government and many states have already implemented online checkbooks that allow citizens to see everything the government spends money on; there is no reason counties cannot do so as well. Reports should be posted frequently, so timeliness of updates was considered too (5, full points for updates presented monthly or more often, half points for updates less often than monthly but at least every six months). We were also looking for low thresholds to trigger inclusion (5, a $2,500 minimum) and for all agencies to be included (5).
Contract Information (14): Government doesn’t do everything it once did. All sorts of functions that used to be governmental are now contracted out to the private sector, and counties need to release that information. Here we were looking for fairly basic information such as who got contracts, what for, and how much (5, half if only one of those was missing). We also expected contact information for the contractor (3), a $2,500 threshold for reporting (3), and information on performance standards (3). This is one area where we wish we had a better handle on what was already being offered; ideally, we would also have scored having requests for proposals online and having an easy-to-use database of contracting.
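To make the rubric above easier to check at a glance, here is a minimal sketch of it as a data structure. The point values are the ones stated in the category descriptions; the only assumption is the weight of the timeliness criterion in the Extent category, which is not stated explicitly, so the 5 points shown there are inferred to make that category total 25.

```python
# Sketch of the survey rubric. All weights come from the article except
# "timeliness" under Extent, which is an assumed value (see lead-in).
RUBRIC = {
    "Budget Documents Presented": {
        "portal page": 5, "archives": 5, "advertised budget": 5,
        "adopted budget": 10, "markup information": 5,
        "carryover package": 3, "context": 3,
    },
    "Extent": {
        "program-level detail": 10, "searchable": 5,
        "accessible formats": 5, "timeliness": 5,  # assumed weight
    },
    "Expenditure Information": {
        "online checkbook": 10, "update frequency": 5,
        "low reporting threshold": 5, "all agencies included": 5,
    },
    "Contract Information": {
        "who/what/how much": 5, "contractor contact info": 3,
        "reporting threshold": 3, "performance standards": 3,
    },
}

def category_totals(rubric):
    """Sum the criterion weights within each category."""
    return {name: sum(criteria.values()) for name, criteria in rubric.items()}
```

Summing the category maxima (36, 25, 25, 14) recovers the 100-point scale described earlier.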
Results

While we aren’t naming names at this point, we do have some initial findings to report.
- Overall, most localities did very poorly on our 100-point scale. The median score in our study was only in the 30s, and no locality topped 80 points.
- We were surprised by the number of municipalities that don’t have any budget information online. Even setting aside the counties that don’t even have websites, roughly one in seven Virginia localities got zero points on the survey.
- In the documents presented category, most counties did quite poorly. While most did have their adopted budget online, very few had other documents. Advertised budgets were rare and markup information and carryover packages were nearly non-existent.
- The extent category was generally where localities did the best. Most posted information down to the program level, and most documents were searchable, at least within the PDF files themselves. None, however, made the data available on their websites as Excel files.
- Expenditure information was generally lacking. No municipality had a full online checkbook, and only about half had CAFRs posted.
- On contract information, counties also fell short. Significantly more counties had requests for proposals online than had actual information on awarded contracts.
- Check back soon for our full report, where we will name names and recognize the leaders in online transparency, as well as the laggards. We’ll also release our database showing how each county did in each category, so you can see exactly how your county fared on our survey.