Just as oil extraction has led to disastrous consequences, the way we harness big data could prove problematic as well. If we do not take steps to humanize data, Thorp argues, “data spills” and “data pollution” could give more weight to the “data as oil” argument. It is especially important that we tread carefully in our use of personal data, Thorp cautions. “A great deal of the profit that is being made right now in the data world is being made through the use of human-generated information,” he explains. “Our browsing habits, our conversations with friends, our movements and location—all of these things are being monetized.” The issue, he argues, is that this is “deeply human data, though very often it is not treated as such.” Those looking to profit from this vast collection of information see it as dollars and cents compacted into ones and zeros, when in reality it contains important fragments of people's lives.
One way to combat this, he says, is to provide avenues for people to understand the value inherent in the data they produce. When people realize how much personal information that data contains, they will be better able to guard against its misuse. This leads to his next point: despite the fact that companies are profiting from human-generated data, “not a single one has mentioned the rights of the people from whom the data is being extracted.” This, he says, needs to change. We need to see data not as the new oil, but as “a new resource entirely.” As he explains in his eye-opening keynotes, and showcases in his intriguing art projects, data needs to be humanized. Data contains the stories of our lives, he says, and we must see this resource as something deeply personal if we are to harness it effectively and “avoid some of the mistakes that we made with the old oil.”