For the last few years any time anyone has asked me to predict what will be interesting in the future of the social web I have said “seeing patterns, and what we do with the patterns that we see”. I have also argued consistently over the years that what matters is the ownership and interpretation of the data and patterns that we generate.
If our tools create patterns that are visible to all of us, then we all learn and are able to make better decisions. If we generate consumer data, let us all see it so that we can make better informed purchase decisions. If your internal social network generates patterns and data, then feed that data back to the whole network rather than keeping it the preserve of a few managers to analyse.
We are generating data all of the time, whether we like it or not, at home or at work. Our tools are increasingly interpreting that data too. What you get to see on Facebook and Google is determined by algorithms, and those algorithms have been designed by a small number of technologists. It matters that you understand what the algorithms are doing and why. Labelling it Big Data and leaving it to others to worry about isn’t enough. Patterns matter and, if we let them, will increasingly steer our lives.
I talk about the “ideology of algorithms” because they’re not conceived in isolation from political or social perspectives – they can’t be. Vested interests, whether commercial or political, can have huge control over what we get to see, do, and believe. This is why what Edward Snowden did matters so much. Those who determine what happens to our data have to be accountable and within the influence of the law.
Ultimately the control of the data that generates patterns rests with us. We can decide which tools, devices, or services we use. We should decide what people do and don’t get to see. More of us will care about this in the future. It really matters.