Imagine that you want to lose weight. You have tried cutting back, but it hasn’t really helped. Maybe you should get a little more exercise. Maybe you should eat less fat. Or is it sugar? Greek yogurt is supposed to help. Am I really drinking enough coconut water? Who knows? So you mention it to your doctor, or make an appointment with a dietician or, perhaps, sign up with a club or clinic that specializes in weight loss. What is the first thing that they ask you to do? Keep a food log. A diary. Write down what you eat, record the exercise you do and then report back in a couple of weeks so they can give you a customized recommendation.
Great! You have made a positive decision to take charge of your health.
The first day you are on top of it—and even pretty honest. That corned beef hash that you accidentally ate at the diner when you stopped in for coffee? In the diary. The late-night bowl of sugary flakes? In the diary. Day two, only the good stuff goes in the diary, and a few days later, you are still making a half-hearted attempt until, finally, you just find yourself scribbling down a bunch of made-up stuff in the waiting room moments before your next appointment.
Sound familiar?
For decades, the self-reported diary has been the primary research tool for studying and measuring our eating, sleeping and other behaviors—the foundation of efforts to help us change those behaviors—but it doesn’t really work. It is a fantasy.
The quantified self
New technology offers a different approach. In the past few years, we have spent millions of dollars on a host of devices and apps that passively track our behaviors: products from Fitbit, Nike, Jawbone, Garmin and others. The theory behind this technology, or movement (called “The Quantified Self” by some), is that more data, and more accurate data, about our behavior will help us understand ourselves better and, thus, provide a foundation and methodology for improving ourselves.
Today’s technology tracks our steps, sleep patterns, communication habits and more. Tomorrow’s technology will automatically log the food we eat, its caloric and nutritional components and its effect on our bodies. This passive tracking of data clearly is a more realistic approach for us fragile, distracted, willpower-exhausted humans. The machine collects the data in a clever way. The algorithms automate the analysis of the data to give us insight into our habits and patterns and help us track our progress towards a goal.
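To make that concrete, here is a minimal sketch, in Python, of the kind of automated analysis such a tracker might run over passively collected data. The step counts, the 10,000-step goal and the summary metrics are hypothetical, illustrative choices, not any vendor's actual algorithm.

```python
# A minimal, hypothetical sketch of tracker-style analysis: summarize habits
# and progress toward a goal from passively collected step counts.
from statistics import mean

DAILY_GOAL = 10_000  # assumed goal; real devices let the user configure this

# Pretend these values were logged automatically by a wearable over two weeks.
daily_steps = [4200, 6100, 8050, 9900, 12300, 3100, 7600,
               8800, 10150, 9700, 11200, 5400, 9950, 10300]

def progress_summary(steps, goal=DAILY_GOAL):
    """Turn raw daily step counts into simple habit and progress metrics."""
    days_goal_met = sum(1 for s in steps if s >= goal)
    first_half, second_half = steps[:len(steps) // 2], steps[len(steps) // 2:]
    return {
        "average_steps": round(mean(steps)),
        "days_goal_met": days_goal_met,
        "goal_hit_rate": round(days_goal_met / len(steps), 2),
        "trend": "improving" if mean(second_half) > mean(first_half) else "flat or declining",
    }

print(progress_summary(daily_steps))
```

The arithmetic is trivial; the point is that none of it requires the user to remember, estimate or confess anything.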
Of course, this approach to problems–any kind of problem–is de rigueur. We know it as Big Data, and it is prescribed as a solution to everything from unemployment to world hunger.
We are bringing the “Quantified Life” philosophy to companies, governments and entire nations. Tomorrow, we will have the quantified organization, with the promise that decisions based on tradition and superstition will be replaced by decisions based on facts and evidence.
The quantified organization
It is easy to be cynical about Big Data. Sometimes I am, but mostly, I get it and believe it. Clearly, it raises a host of business, policy, legal, ethical and societal issues. In any case, it doesn’t matter whether I get it or not: It will be the way that we function as organizations–and, increasingly, as individuals–moving forward.
The idea that we should make decisions based on facts or evidence as opposed to tradition, intuition and superstition, of course, derives from the Enlightenment and the scientific method itself, but even in areas where you might expect that this approach is already baked in, there has been a push to focus on the evidence. In the 1990s, for example, the concept of “evidence-based medicine” (or “evidence-based practice”) was introduced into the medical field and has since taken hold as an operating philosophy in branches of medicine from optometry to dentistry.
Evidence-based practice is defined as:
Applying the best available research results (evidence) when making decisions about healthcare. Healthcare professionals who perform evidence-based practice use research evidence along with clinical expertise and patient preferences. Systematic reviews (summaries of healthcare research results) provide information that aids in the process of evidence-based practice.
If the practice of medicine—which has embraced the scientific method for over a century—can benefit from a heightened focus on evidence-based decisions and policy, then surely there are other practices that could benefit from it as well. Any come to mind?
How about information governance (IG)?
Evidence-based practice and information governance
Today in IG, we make so many decisions, and craft so many policies, based on nothing more than tradition and superstition. This is especially prevalent in the records and information management (RIM) facet of IG, but it exists elsewhere as well. Why do we have 1,000 categories in our records retention schedule? Because that’s the way the last guy did it. Because we inherited the schedule from a company we acquired. Because Janice liked it that way. Because that’s the right way. Because that’s what makes the most sense to me. Because that’s what my old boss told us to do. Because that is what the consulting company sold us.
Where is the evidence?
What is true?
Are these justifications based on anything more than tradition, superstition or office politics?
I propose a new focus for IG practitioners–a focus on evidence-based information governance. This philosophy should be embedded in everything we do in IG. It is egregious that we wave our hands magically and use purely anecdotal evidence to create fear around information risk. Take the risk of a spoliation charge in litigation, for example. How often does it actually happen? What is the real risk of it happening to us? Go look it up for yourself.
We need to bring evidence into the practice of IG. We need evidence to quantify value. To quantify risk. Evidence to make decisions about how much time, money and effort we should put into managing specific kinds of information.
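What might quantifying that risk look like in practice? Here is a minimal sketch, assuming a simple expected-value model (likelihood times impact). The record categories, probabilities and cost figures are invented placeholders, standing in for numbers an evidence-based program would draw from case law research, claims history or audits.

```python
# A minimal, hypothetical sketch of evidence-based risk quantification:
# expected annual cost of an adverse event (such as a spoliation sanction)
# per category of records. Every figure below is a placeholder to be replaced
# with sourced evidence, not a real estimate.

def expected_annual_cost(annual_probability: float, cost_if_it_happens: float) -> float:
    """Classic expected-value calculation: likelihood times impact."""
    return annual_probability * cost_if_it_happens

# Hypothetical categories with placeholder estimates.
categories = {
    "email under litigation hold": {"probability": 0.02,  "cost": 2_500_000},
    "routine HR records":          {"probability": 0.005, "cost": 400_000},
    "expired project files":       {"probability": 0.001, "cost": 50_000},
}

# Rank categories by expected exposure to decide where time and money go.
ranked = sorted(categories.items(),
                key=lambda kv: expected_annual_cost(kv[1]["probability"], kv[1]["cost"]),
                reverse=True)

for name, est in ranked:
    exposure = expected_annual_cost(est["probability"], est["cost"])
    print(f"{name}: expected annual exposure of about ${exposure:,.0f}")
```

Even a crude model like this changes the conversation from “spoliation is scary” to “how often does it happen, what does it cost, and where should we spend our effort?”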
It is shameful that today, in 2014, this is the exception rather than the rule in IG.
Today, we have incredible tools that can easily shed light on our information to give us the visibility and the evidence we need to make good decisions. Let me know what you think.
Barclay T. Blair is an advisor to Fortune 500 companies, software and hardware vendors and government institutions, and is an author, speaker and internationally recognized authority on information governance. This content originally appeared on the Barclay T. Blair Blog: Essays in Information Governance (and more). For more information, visit barclaytblair.com.