Showing posts with label personal data store. Show all posts

Wednesday, 3 April 2024

EU Countries To Offer Their Citizens Digital Identity Wallets From 2026

The EU is finally pushing forward with a Regulation that requires member state governments to offer their citizens a voluntary European digital identity wallet from 2026.

Under the new law, member states will offer citizens and businesses digital wallets that will be able to link their national digital identities with proof of other personal attributes (e.g., driving licence, qualifications, bank account). Citizens will be able to prove their identity and share electronic documents from their digital wallets simply, using their mobile phones. 

The new European digital identity wallets (EDIWs) will enable all citizens to access online services with their national digital identification, which will be recognised throughout the EU, without having to use private identification methods or unnecessarily share personal data. User control ensures that only information that needs to be shared will be shared.
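The "share only what's needed" principle can be illustrated with a minimal sketch. Everything here (the `Wallet` class, attribute names, the `disclose` method) is hypothetical and invented for illustration; it is not drawn from any EDIW specification:

```python
# Illustrative sketch of selective disclosure: a wallet holds attested
# attributes, and a relying party's request is answered with only the
# attributes it asks for, and only with the user's approval.

class Wallet:
    def __init__(self, attributes):
        self.attributes = attributes  # attribute name -> attested value

    def disclose(self, requested, user_approved):
        """Return only the requested attributes, and only with user consent."""
        if not user_approved:
            return {}
        return {k: v for k, v in self.attributes.items() if k in requested}

wallet = Wallet({
    "age_over_18": True,
    "driving_licence": "B",
    "full_name": "Jane Doe",
})

# An online shop only needs proof of age; nothing else leaves the wallet.
shared = wallet.disclose({"age_over_18"}, user_approved=True)
print(shared)  # {'age_over_18': True}
```

The point of the sketch is that the relying party never sees the name or licence details it did not request, which is the user-control property the Regulation describes.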

For the past 20 years, various players have pushed the idea that you could have a digital identity issued by any number of certified 'trust providers' based on certain agreed standards. This would go hand-in-hand with concepts of 'personal data stores' and access to your transaction data in machine-readable form ('midata') that would allow you to control how your data is monetised. There was speculation that the trust providers would likely be banks and telecoms companies, or perhaps dedicated new entities; but there were always concerns about whether they really had the core competencies required - or the risk/liability appetite - as well as issues relating to security and privacy.

The European Commission explains that:

  • by 2026, each member state must make a digital identity wallet available to its citizens and accept EDIWs from other member states according to the revised regulation 
  • sufficient safeguards have been included to avoid discrimination against anyone choosing not to use the wallet, which will always remain voluntary 
  • the wallet’s business model: issuance, use and revocation will be free of charge for all natural persons 
  • the validation of electronic attestation of attributes: member states are required to provide free-of-charge validation mechanisms only to verify the authenticity and validity of the wallet and of the relying parties’ identity 
  • the code for the wallets: the application software components will be open source, but member states are granted leeway so that, for justified reasons, specific components other than those installed on user devices need not be disclosed 
  • consistency has been ensured between the wallet as a form of eID and the scheme under which it is issued...

Qualified website authentication certificates (QWACs) will ensure that users can verify who is behind a website under well-established eID security rules and standards (which enable open banking service providers to authenticate each other's systems, for example).


Tuesday, 30 April 2019

Is BigTech Still Battling The Entire Human Race, Or Just Some Of Us?

Readers will be familiar with my view that we consumers tend to be loyal to 'facilitators' who focus on solving our problems, rather than 'institutions' who solve their own problems at our expense. Previously trusted service providers can also lose their facilitator status, and I'd argue that Facebook has already done so (owing to privacy, electoral and extremist content scandals) and Google is firmly headed in that direction (through behaviour incurring massive EU fines). Yet, despite announcements designed to suggest increasing transparency, it seems BigTech is actively resisting independent human oversight and the perceived battle between computers and the human race is far from over...

Part of the problem is that 'BigTech' firms still operate as agents of retailers and other organisations who pay them vast amounts of money for exploiting our personal data to target advertising at us, rather than as our agents for the purpose of finding what we need or want while shielding us against exploitation. In fact, this is the year when digital advertising spend will exceed spending on the old analogue 'meat space' channels.

Combine that exploitative role with rogue artificial intelligence (AI) and you have a highly toxic reputational cocktail - particularly because AI based on machine learning is seemingly beyond human investigation and control. 

For instance, Amazon found that an AI programme used for its own recruiting purposes was terribly biased, but could not figure out what was going wrong or how to fix it, so had to simply shut the thing down.  Alarmingly, that suggests other AI programmes that are already notorious for being biased, such as those used for 'predictive policing', are also beyond fixing and should be shut down...

Many BigTech firms are appointing 'ethics boards' to try to avoid their AI programmes heading in inappropriate directions. Trouble is, not only is there doubt about what data scientists might view as inappropriate (which drove the appointment of ethics boards in the first place), but these boards are also generally toothless (only CEOs and main boards can decide the actual course of development), and tend to be populated by industry insiders who sit on each other's ethics boards.

It is unclear, for example, whether the recommendations of the ethics committee overseeing the West Midlands police 'predictive policing' algorithm will be followed. Meanwhile, 14 other UK police forces are known to be using such AI programmes...

Another worrying trend is for AI firms to prevent investors voting on the company's plans, using "dual class" share structures that leave voting control with the founders rather than shareholders. Lyft is the latest to hit the news, but other offenders include Alphabet (Google), Blue Apron and Facebook, while Snap and Pinterest give shareholders zero control. Those firms might argue that stock prices are a check in themselves. But the stock market and investor greed are notorious for driving short-term decisions aimed at only maximising profits, and even giant regulatory fines are subject to appeal and can take a long time to be reflected in share prices. Voting power, on the other hand, is more qualitative and not simply a function of market forces - and the fact that it is being resisted tells you it's a promising tool for controlling BigTech.

Regulation will also be important, since fines for regulatory breaches are a source of revenue for the public sector that can be used to clean up the industry's mess and to send signals to management, investors, competitors and so on. I'm not suggesting that regulatory initiatives like the UK Brexidiot ToryKIP government's heavily ironic "Online Harms" initiative are right in the detail or approach, but Big Tech certainly cannot keep abdicating responsibility for the consequences and other 'externalities' associated with its services and approach. There has to be legal accountability - and grave consequences - for failing to ensure that AI and the firms themselves are subject to human control.

I guess the real question might be: which humans? 


Friday, 5 October 2018

Brits Look Away Now: Free Movement Of Non-Personal Data In the EU

The EU's "Digital Single Market" strategy has been boosted by an agreement to remove requirements for non-personal data to be stored in any one EU member state. The new law, approved in plenary by 520 votes to 81, with six abstentions, is due to be approved by the EU Council of Ministers on 6 November. It will apply six months after its publication in the EU Official Journal.

Restrictions on the free movement of personal data have long been relaxed under the EU data protection framework. The latest move is expected to double the value of the EU data economy to 4% of GDP by 2020.


In summary, the agreement means that:
  • public authorities cannot impose "unjustified" data localisation restrictions;
  • the data remains accessible for regulatory and supervisory control even when stored or processed across borders within the EU;
  • cloud service providers are encouraged to develop, by mid-2020, self-regulatory codes of conduct for easier switching of provider and porting data back to in-house servers;
  • security requirements on data storage and processing remain applicable when businesses store or process data in another Member State or outsource data processing to cloud service providers;
  • single points of contact in each Member State will liaise with their counterparts in other Member States and the Commission to ensure the effective application of the new rules. 


Thursday, 24 May 2018

If You Need Consent To Process My Personal Data, The Answer Is No

... there are plenty of reasons for businesses and public sector bodies to process the data they hold about you, without needing your consent. These are where the processing is necessary for:
  • performing a contract with you, or to take steps at your request before agreeing a contract; 
  • complying with their own legal obligation(s); 
  • protecting yours or another person's vital interests (to save your life, basically);
  • performing a task in the public interest or in the exercise of their official authority; 
  • their 'legitimate interests' (or someone else's), except where those interests are overridden by your legitimate interests or your fundamental rights which require protection of personal data. 
The General Data Protection Regulation lists other non-consent grounds that apply where your personal data is more sensitive: relating to criminal convictions and offences or related security measures; or where it reveals racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership; or it is genetic or biometric data processed for the purpose of uniquely identifying you; or data concerning health or your sex life or sexual orientation. National parliaments can add other grounds in local laws.

These non-consent grounds for processing are all pretty reasonable - and fairly broad. So, if you don't have the right to process my personal data on one of those grounds, why would I want you doing so?
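The logic of the argument can be put in code. This is an illustrative sketch only, treating the six lawful bases of GDPR Article 6(1) as a simple gate that refuses processing unless a valid non-consent ground applies or consent was actually given; the function and set names are mine, not from any legal text:

```python
# The six lawful bases for processing under GDPR Article 6(1).
LAWFUL_BASES = {
    "consent",
    "contract",
    "legal_obligation",
    "vital_interests",
    "public_task",
    "legitimate_interests",
}

def may_process(grounds, consent_given=False):
    """Return True only if at least one valid lawful basis is present.

    Consent only counts as a basis if it was actually given - which is
    the post's point: if none of the other grounds apply, the processor
    is entirely dependent on you saying yes.
    """
    grounds = set(grounds) & LAWFUL_BASES
    if "consent" in grounds and not consent_given:
        grounds.discard("consent")
    return bool(grounds)

print(may_process({"contract"}))                      # True
print(may_process({"consent"}, consent_given=False))  # False
```

In other words, where the only candidate ground is consent, the decision sits entirely with the data subject.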

This would seem to herald a new era in which the Big Data behavioural profiling/targeting/advertising model begins to decline, in favour of personal Apps (or open data spiders) that act as your agent and go looking for items in retailers' systems as you need them, without giving away your personal data unless or until it is necessary to do so...


Wednesday, 21 November 2012

Will Midata Turn Institutions Into Facilitators?

The government's warning shot over Midata presents an interesting challenge for some of the UK's institutions. But will it make them focus on solving consumers' problems - transforming them into 'facilitators'? Or will they merely continue to solve their own problems at consumers' expense?

The government wants the suppliers of energy, mobile phones, current accounts and credit cards to provide each of their consumer and small business customers with the records of what they bought, where and for how much. That transaction data must be released in computer-readable format to enable it to be analysed, either by the customer or the customer's authorised service provider. This would help prevent those suppliers from gaining an unfair pricing advantage over consumers, for example, and make it easier for consumers to figure out the right product for them.
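A minimal sketch of what "computer-readable" transaction data enables: given supplier records as CSV (a hypothetical format, not the actual midata schema), a consumer's own tool can total spend per category as a first step towards comparing tariffs or products:

```python
import csv
import io

# Hypothetical midata-style export; the column names are illustrative.
midata_csv = """date,merchant,category,amount
2012-10-01,EnergyCo,energy,62.40
2012-10-05,MobileCo,mobile,18.00
2012-11-01,EnergyCo,energy,71.10
"""

# Total spend per category - the kind of analysis a customer (or their
# authorised service provider) could run once the data is machine-readable.
totals = {}
for row in csv.DictReader(io.StringIO(midata_csv)):
    totals[row["category"]] = totals.get(row["category"], 0.0) + float(row["amount"])

print(totals)
```

Once the data is in this form, the same few lines could just as easily feed a switching-comparison service, which is precisely the consumer empowerment the programme is after.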

Factors the government might consider in deciding whether to expand the programme to other sectors include: 
  • the market is not working well for consumers, e.g. consumers find it difficult to make the right choice, or their behaviour affects pricing and it's difficult to predict that behaviour;
  • there's a one-to-one, long-term relationship between the business and the customer, with a stream of ongoing transactions;
  • consumer engagement is limited, e.g. low levels of switching or competition; and
  • suppliers don't voluntarily provide transaction/consumption data to customers at their request in portable electronic format.
Yet these factors merely hint at the characteristics that an organisation should display if it is to succeed in the future economic environment. In broad terms, the targeted institutions will need to be organised to solve their customers’ problems, operate openly, adapt well to changing circumstances, remain committed to transparency and take responsibility for the impact of their activities on the wider community and society. I've explained these themes in more detail here.
 
The current targets of this programme have a long way to go!
 
I should add that I am involved in the Midata programme, as a member of the Interoperability Board and on the working groups considering issues related to data transmission and law/regulation.

Wednesday, 12 September 2012

Rethinking Personal Data

As part of its 'midata' initiative to empower consumers, the Department for Business, Innovation and Skills has been consulting on a proposal to give the Secretary of State a general power that "might be exercised broadly or in a more targeted way" to compel suppliers to supply transaction data at a consumer’s request. In the interests of transparency, I've summarised my response to the consultation over on The Fine Print. As previously explained, I should disclose that I've been involved in the midata Interoperability Board from its inception in 2011.