Reducing our risk and offloading liability is something most of us seek. So do organisations and companies; some even have teams of lawyers for this singular purpose.
Weirdly, consent becomes a risk reducer. When we sign up to Facebook and agree to their terms, we consent to them using our photos to train their facial recognition software. Of course, it’s not stated that clearly, but we do give up our rights to the images. We can be upset about this, but we ‘consented’ to it. The same goes for websites that track our activity through ‘cookies’ and sell our behaviour to other companies. The phrase I often see is ‘share data with selected partners’.
In discussions about decentralised identity, I tend to get quite excited by how it works and by the possibility of having much greater control over who has access to my data. There is great potential here. However, when I think about how long my data trail is, I wonder if I will have time to do anything else but accept and reject requests for access. The burden and the liability for decision-making lie entirely with me.
On the one hand, there is something good and exciting about this. On the other, there are many things I don’t know, nor do I know their implications, and the future impact is substantial for me and for those connected to me. The companies’ liability, and the risk of me suing them, goes down, while the burden of responsibility and understanding shifts to me.
So what can we do? We can go back to trying to trust the big companies and organisations, but they have shown, and continue to show, questionable ethics and to break trust. Perhaps we need something that sits in between: some sort of collective, with some sort of standards. Perhaps there could be WHO standards for who to share data with, which I could choose to adopt. Or a local organisation I know and trust could say ‘here’s who we have vetted and trust to share data with’, and again we could choose to follow their choices. Or we could even choose to have the same consent settings as a good friend we know.
The liability and responsibility might still lie with us as individuals. Unless, of course, we do this kind of thing in a data trust, where the settings are chosen and monitored by trustees.
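To make the idea a little more concrete, here is a minimal sketch, in Python, of what delegated consent settings might look like: a policy I set myself, copy from a friend, or hand over to a vetted list published by a trusted organisation or a data trust’s trustees. Every name here is hypothetical; nothing below comes from a real standard or product.

```python
from dataclasses import dataclass, field

# Hypothetical sketch only: these names are made up for illustration.

@dataclass
class ConsentPolicy:
    """The parties the data subject allows their data to be shared with."""
    allowed_recipients: set[str] = field(default_factory=set)
    source: str = "personal"  # who chose these settings: me, a friend, an org, a trust

    def permits(self, recipient: str) -> bool:
        return recipient in self.allowed_recipients


def copy_from_friend(friend_policy: ConsentPolicy) -> ConsentPolicy:
    """Adopt the same consent settings as a friend I know and trust."""
    return ConsentPolicy(set(friend_policy.allowed_recipients), source="friend")


def delegate_to_vetted_list(vetted: set[str], steward: str) -> ConsentPolicy:
    """Follow a vetted list published by an organisation or a data trust's trustees."""
    return ConsentPolicy(set(vetted), source=steward)


if __name__ == "__main__":
    # A local organisation I trust publishes who it has vetted.
    local_org_list = {"health-research.example", "city-transport.example"}
    my_policy = delegate_to_vetted_list(local_org_list, steward="local-org")

    print(my_policy.permits("health-research.example"))  # True
    print(my_policy.permits("ad-broker.example"))        # False
```

The point of the sketch is only that the decision-making could be pooled: I still hold the policy, but I can borrow its contents from someone better placed to do the vetting.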
I don’t know. There is clearly more exploring to do.
Photo by Edu Lauton