With the advent of cloud computing and remote servers, the amount of information people and organizations can collect and store has grown exponentially in a short period of time. While on paper this may sound like a good thing, it can lead to another serious issue: data oversaturation.
Especially for a government agency, having access to data at all times is crucial for efficient and effective operation. But the rapid adoption of technology across the service and maintenance management industry in recent years has left many organizations lost in seas of data. Here are some things to keep in mind when planning your data storage solution.
What's causing bad data?
Governments often find themselves dealing with "bad data" - information that is incomplete, insufficient or inaccessible for the problem at hand. But how can that happen in the Information Age? As it turns out, quite a few factors can undermine the integrity of the information governments collect and store.
The idea that more technology automatically solves data problems is a misconception. In fact, according to a report from Governing, the leading cause of bad data was "technology related problems," with 17 percent of survey respondents citing it as their chief bugbear. Other major causes included poor planning, bad management and a lack of accountability among users and government officials.
The dangers of bad data
These sorts of errors are more than just computer glitches. Improper data can have serious financial or even legal ramifications for governments and the communities they serve. Governing cited one example in which the California State Controller's office accumulated 200,000 hours of improperly logged vacation time due to storage errors - a mistake that ended up costing around $6 million.
This is an extreme example, but not an uncommon one. Poor data can also lead to services being duplicated or forgotten entirely, wasting government resources and leaving some members of the community without access to essential services. Unfortunately, around 70 percent of respondents to the Governing survey reported data problems as a regular occurrence at work.
How can you stop it?
Obviously, any organization wants to put a stop to bad data as quickly as possible to save time and resources. But with so many factors contributing to the problem, how does one go about doing that?
One huge contributor to bad data is that governments so often silo their information. In other words, data collected by one agency is stored in proprietary systems and servers, restricting or complicating other government agencies' access when they need it. As a result, data collection across different services is disorganized and chock full of duplicate information.
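The duplication problem described above can be sketched in a few lines. The agency names and resident records below are invented purely for illustration; the point is that naively merging siloed lists stores the same person more than once, while keying records consistently collapses the repeats.

```python
# Hypothetical record lists kept separately by two agencies (silos).
# All names and addresses here are invented for illustration.
housing_records = [("Jane Doe", "12 Main St"), ("Robert Smith", "4 Oak Ave")]
health_records = [("Jane Doe", "12 Main St"), ("Ann Lee", "9 Elm Rd")]

# Naively combining the silos stores Jane Doe's record twice.
merged = housing_records + health_records
assert len(merged) == 4

# Treating identical records as one entry collapses the duplicate.
deduplicated = set(merged)
assert len(deduplicated) == 3
```

In practice, of course, the hard part is that duplicates across agencies rarely match exactly - which is precisely why the standardization discussed below matters.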
Doing away with this data silo system is easier said than done, especially with concerns abounding about confidentiality and shared information. However, Governing pointed out that in many instances, it's actually less of a legal headache to navigate the confidentiality arrangements involved with sharing data than it is to recover from a mistake caused by improper information sharing.
Another key strategy is for different government services to work together to standardize how they collect and keep their data. Tiny differences in things like naming conventions may not seem major to the people collecting the information, but they can add up to a jumble of partially reconciled data spread across a wide network of government systems. Standardizing the definitions that govern data collection - such as how addresses are recorded and whether systems account for nicknames - is an important but often-overlooked step that can save your government significant headaches down the road.
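The nickname and address examples above can be made concrete with a minimal normalization sketch. The field rules, nickname map and abbreviation table below are assumptions for illustration only, not any agency's actual schema - the point is that two records that look different on paper become identical once both systems apply the same definitions.

```python
# Illustrative lookup tables - real systems would use far larger,
# agreed-upon reference data shared across agencies.
NICKNAMES = {"bob": "robert", "bill": "william", "liz": "elizabeth"}
ABBREVIATIONS = {"st": "street", "ave": "avenue", "rd": "road"}

def normalize_name(name: str) -> str:
    """Lowercase a name and expand common nicknames to a canonical form."""
    parts = name.strip().lower().split()
    return " ".join(NICKNAMES.get(p, p) for p in parts)

def normalize_address(address: str) -> str:
    """Lowercase, strip punctuation, and expand street abbreviations."""
    cleaned = address.strip().lower().replace(",", "").replace(".", "")
    return " ".join(ABBREVIATIONS.get(p, p) for p in cleaned.split())

# Two systems record the same person differently; after normalization
# the records reconcile instead of living on as duplicates.
a = (normalize_name("Bob Smith"), normalize_address("12 Main St."))
b = (normalize_name("Robert Smith"), normalize_address("12 Main Street"))
assert a == b
```

The design choice worth noting is that the rules live in shared tables rather than in each agency's code, which is exactly what cross-service standardization amounts to in practice.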