Showing posts with label GDPR.

Tuesday, 30 April 2019

Is BigTech Still Battling The Entire Human Race, Or Just Some Of Us?

Readers will be familiar with my view that we consumers tend to be loyal to 'facilitators' who focus on solving our problems, rather than 'institutions' who solve their own problems at our expense. Previously trusted service providers can also lose their facilitator status, and I'd argue that Facebook has already done so (owing to privacy, electoral and extremist content scandals) and Google is firmly headed in that direction (through behaviour incurring massive EU fines). Yet, despite announcements designed to suggest increasing transparency, it seems BigTech is actively resisting independent human oversight and the perceived battle between computers and the human race is far from over...

Part of the problem is that 'BigTech' firms still operate as agents of retailers and other organisations who pay them vast amounts of money to exploit our personal data and target advertising at us, rather than as our agents whose job is to find what we need or want while shielding us from exploitation. In fact, this is the year when digital advertising spend will exceed spending on the old analogue 'meat space' channels.

Combine that exploitative role with rogue artificial intelligence (AI) and you have a highly toxic reputational cocktail - particularly because AI based on machine learning is seemingly beyond human investigation and control. 

For instance, Amazon found that an AI programme used for its own recruiting purposes was terribly biased, but could not figure out what was going wrong or how to fix it, so had to simply shut the thing down.  Alarmingly, that suggests other AI programmes that are already notorious for being biased, such as those used for 'predictive policing', are also beyond fixing and should be shut down...

Many BigTech firms are appointing 'ethics boards' to try to avoid their AI programmes heading in inappropriate directions. Trouble is, not only is there doubt about what data scientists might view as inappropriate (the very doubt that drove the appointment of ethics boards in the first place), but these boards are also generally toothless (only CEOs and main boards can decide the actual course of development) and tend to be populated by industry insiders who sit on each other's ethics boards.

It is unclear, for example, whether the recommendations of the ethics committee overseeing the West Midlands police 'predictive policing' algorithm will be followed. Meanwhile, 14 other UK police forces are known to be using such AI programmes...

Another worrying trend is for AI firms to prevent investors from voting on the company's plans, using "dual class" share structures that leave voting control with the founders rather than shareholders. Lyft is the latest to hit the news, but other offenders include Alphabet (Google), Blue Apron and Facebook, while Snap and Pinterest give shareholders zero control. Those firms might argue that stock prices are a check in themselves. But the stock market and investor greed are notorious for driving short-term decisions aimed only at maximising profits, and even giant regulatory fines are subject to appeal and can take a long time to be reflected in share prices. Voting power, on the other hand, is more qualitative and not simply a function of market forces - and the fact that it is being resisted tells you it's a promising tool for controlling BigTech.

Regulation will also be important, since fines for regulatory breaches are a source of revenue for the public sector that can be used to clean up the industry's mess and to send signals to management, investors, competitors and so on. I'm not suggesting that regulatory initiatives like the UK Brexidiot ToryKIP government's heavily ironic "Online Harms" initiative are right in detail or approach, but BigTech certainly cannot keep abdicating responsibility for the consequences and other 'externalities' associated with its services. There has to be legal accountability - and grave consequences - for failing to ensure that AI and the firms themselves are subject to human control.

I guess the real question might be: which humans? 


Thursday, 24 May 2018

If You Need Consent To Process My Personal Data, The Answer Is No

... there are plenty of reasons for businesses and public sector bodies to process the data they hold about you without needing your consent. These apply where the processing is necessary for:
  • performing a contract with you, or taking steps at your request before entering into a contract; 
  • complying with their own legal obligation(s); 
  • protecting your or another person's vital interests (to save your life, basically);
  • performing a task in the public interest or in the exercise of their official authority; 
  • their 'legitimate interests' (or someone else's), except where those interests are overridden by your interests or fundamental rights and freedoms, which require the protection of your personal data. 
The General Data Protection Regulation lists other non-consent grounds that apply where your personal data is more sensitive: where it relates to criminal convictions and offences or related security measures; or reveals racial or ethnic origin, political opinions, religious or philosophical beliefs or trade union membership; or is genetic or biometric data processed for the purpose of uniquely identifying you; or is data concerning health or your sex life or sexual orientation. National parliaments can add other grounds in local laws.
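
For the technically minded, here's one way to picture that structure: a minimal, purely illustrative sketch (in Python) that treats the lawful bases listed above as a simple enumeration and flags when consent would be the only basis left. The names and the simplified logic are my own assumptions for illustration, not an official schema and certainly not legal advice.

```python
from dataclasses import dataclass
from enum import Enum, auto

class LawfulBasis(Enum):
    """Illustrative labels for the non-consent grounds listed above, plus consent."""
    CONSENT = auto()
    CONTRACT = auto()              # performing a contract / pre-contract steps
    LEGAL_OBLIGATION = auto()
    VITAL_INTERESTS = auto()
    PUBLIC_TASK = auto()
    LEGITIMATE_INTERESTS = auto()

@dataclass
class ProcessingActivity:
    purpose: str
    basis: LawfulBasis
    # Only meaningful for LEGITIMATE_INTERESTS: has the balancing test against
    # the individual's interests and fundamental rights been done and passed?
    balancing_test_passed: bool = False

def consent_required(activity: ProcessingActivity) -> bool:
    """Crude rule of thumb from the post: if no non-consent ground is available
    (or legitimate interests fails the balancing test), the controller is back
    to asking for consent."""
    if activity.basis == LawfulBasis.CONSENT:
        return True
    if activity.basis == LawfulBasis.LEGITIMATE_INTERESTS:
        return not activity.balancing_test_passed
    return False

# Example: behavioural ad profiling with no passed balancing test -> consent needed.
profiling = ProcessingActivity("behavioural ad profiling", LawfulBasis.LEGITIMATE_INTERESTS)
print(consent_required(profiling))  # True
```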

These non-consent grounds for processing are all pretty reasonable - and fairly broad. So, if you don't have the right to process my personal data on one of those grounds, why would I want you doing so?

This would seem to herald a new era in which the Big Data behavioural profiling/targeting/advertising model begins to decline, in favour of personal Apps (or open data spiders) that act as your agent and go looking for items in retailers' systems as you need them, without giving away your personal data unless or until it is necessary to do so...
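
As a thought experiment, here is a minimal sketch of what such a personal agent might look like: it queries a retailer's catalogue with nothing but the search terms, and only hands over personal data at the point where it becomes necessary to perform the contract. The URL, endpoints and response fields are invented for illustration; no real retailer API is being described.

```python
import requests  # assumed third-party dependency

# Hypothetical retailer catalogue API - endpoint and fields invented for illustration.
CATALOGUE_URL = "https://retailer.example/api/products"

def find_items(query: str, max_price: float) -> list[dict]:
    """Search the catalogue anonymously: the request carries only the
    search terms, no identifiers or behavioural profile."""
    resp = requests.get(CATALOGUE_URL, params={"q": query}, timeout=10)
    resp.raise_for_status()
    return [item for item in resp.json().get("items", [])
            if item.get("price", float("inf")) <= max_price]

def place_order(item_id: str, delivery_address: str) -> dict:
    """Personal data (the delivery address) is disclosed only here, once it
    becomes necessary to perform the contract."""
    resp = requests.post(f"{CATALOGUE_URL}/{item_id}/orders",
                         json={"address": delivery_address}, timeout=10)
    resp.raise_for_status()
    return resp.json()
```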

