
Friday, 31 January 2014

Will You Share Your NHS Records?

You may have received a letter from your local NHS trust giving you the chance to opt out of the NHS plan to share your health records with Big Pharma and others.

I've found the process incredibly light on detail about how your data will actually be used, and I don't see how any consent given this way can be called fully informed. You can't be expected to give a single 'yes' or 'no' covering all your records in such a wide variety of circumstances.

The issue of consent is not only a question of privacy, but also a question of the value that Big Data derives by exploiting your data without recompense, as explained here. The NHS scheme is just another Big Data play that takes a free ride on your data, and nowhere near the kind of mutually beneficial and trustworthy ecosystem that it's possible to construct today.

For instance, with your own data account you would be able to receive a request to use some of your health records for each specific project. You might choose to 'donate' some of your anonymised data to help find a cure that will be available to everyone at cost price. But you might put a high price on your data if it is to be mined by Big Pharma to create a premium branded drug. 

Hell, for enough dough you might even add your name and a nice photo!
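To make the idea concrete, here's a minimal sketch of how a per-project data-use request might look in your own data account. Everything here is hypothetical - the request fields, prices and project names are illustrative, not any real NHS or vendor API:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataUseRequest:
    project: str                 # who wants the data, and why
    fields: List[str]            # which records are requested
    anonymised: bool             # will identifying details be stripped?
    for_profit: bool             # premium branded drug vs at-cost cure

def price_request(req: DataUseRequest) -> float:
    """Return the price (in pounds) this user asks for the request."""
    if req.anonymised and not req.for_profit:
        return 0.0                       # 'donate' freely to at-cost research
    price = 50.0 if req.for_profit else 5.0
    if not req.anonymised:
        price *= 10                      # identifiable data costs far more
    return price

request = DataUseRequest(
    project="Diabetes cure study (at cost)",
    fields=["hba1c", "medications"],
    anonymised=True,
    for_profit=False,
)
print(price_request(request))  # 0.0 -- a free donation
```

The point is not the particular numbers, but that the decision is made per project, per context - the opposite of the single blanket opt-in/opt-out the NHS letter offers.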

Such a system would not need to be created specifically for your health records, nor paid for by the NHS. In fact, given the NHS's record on technology projects, it would be best developed by others.

At any rate, I plan to opt out of sharing my health records until the NHS cooperates with a more flexible, user-centric system.


Thursday, 30 January 2014

P2P Goes Cloud-to-Cloud


In Part 2 of my response to Google's 'computers vs people' meme, I explained that humans can win the war for economic control of their data by transacting on peer-to-peer marketplaces. That's because the P2P platforms don't derive their revenue primarily by using their users' data as bait to attract advertising revenue. Instead, they enable many participants to transact directly with each other in return for relatively small payments towards the platforms' direct operational costs, leaving the lion's share of each transaction with the parties on either side. This post covers some technological developments which move the P2P front line deep into Big Data territory.

Perhaps the ultimate way to avoid Big Data's free ride on the ad revenue derived from your data is to cut your reliance on the World Wide Web itself. After all, the Web is just the 'human-readable' network of visible data that sits on the Internet - only one of the Internet's many uses. As I've mentioned previously, having your own pet 'open data spider' that gathers information based on your data without disclosing it would transform the advertiser's challenge from using Big Data tools to target you with their advertising, to enabling their product data to be found by your spider as and when you need it.

But that would not necessarily solve the problems that arise where your data has to be shared.

Fortunately, all but the most hardcore privacy lobbyists have finally moved beyond debating the meaning of "privacy" and "identity" to realise two important things. First, 'personal data' (data that identifies you, either on its own or in combination with other data) is just one type of user-related data we should be concerned about controlling in a Big Data world. Second, it's critical to our very survival that we share as much data about ourselves as possible to the right recipient in the right context. The focus is now firmly on the root cause of all the noise: lack of personal control over our own data. 

Perhaps the leading exponents of this turnaround have been those involved in the Privacy by Design initiative. As explained in their latest report, they've become convinced by a range of pragmatic commercial and technological developments which together produce a 'personal data ecosystem' with you at the centre. You are now able to store your data in various 'personal cloud' services. 'Semantic data interchange' enables your privacy preferences to be attached to your data in machine-readable form so that machines can process it accordingly. Contractually binding 'trust frameworks' ensure data portability between personal clouds, and enable you to quickly grant others restricted access to a subset of your data for a set time and revoke permission at will. The advent of multiple 'persistent accountable pseudonyms' supports your different identities and expectations of privacy in different contexts, allowing for a lawful degree of anonymity yet making your identity ascertainable for contractual purposes. You can also anonymise your own data before sharing it, or stipulate anonymity in the privacy preferences attached to it, so your data can be processed in the aggregate for your own benefit and/or that of society.
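Two of those ideas - privacy preferences that travel with the data in machine-readable form, and access grants that are time-limited and revocable at will - can be sketched in a few lines of code. This is an illustrative toy, not the report's architecture or any real personal-cloud API; all the structures and field names are assumptions:

```python
import time

# A record whose machine-readable privacy preferences travel with it.
record = {
    "payload": {"blood_pressure": "120/80", "name": "Jane Doe"},
    "preferences": {
        "anonymise": True,                  # strip identifying fields on release
        "allowed_purposes": ["research"],   # the only uses the owner permits
    },
}

grants = {}  # grantee -> expiry timestamp (absence means no access)

def grant_access(grantee: str, duration_secs: int) -> None:
    """Give a grantee restricted access for a set time."""
    grants[grantee] = time.time() + duration_secs

def revoke_access(grantee: str) -> None:
    """Withdraw permission at will."""
    grants.pop(grantee, None)

def read_record(grantee: str, purpose: str):
    """Enforce the attached preferences before releasing any data."""
    if purpose not in record["preferences"]["allowed_purposes"]:
        return None                          # purpose not permitted
    if grants.get(grantee, 0) < time.time():
        return None                          # no grant, or grant expired/revoked
    data = dict(record["payload"])
    if record["preferences"]["anonymise"]:
        data.pop("name", None)               # anonymise before sharing
    return data

grant_access("research_lab", duration_secs=3600)
print(read_record("research_lab", "research"))  # {'blood_pressure': '120/80'}
revoke_access("research_lab")
print(read_record("research_lab", "research"))  # None
```

The crucial property is that the rules live with the data rather than with the organisation holding it - which is exactly the shift from 'organisation-centric' to user-centric that the report describes.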

All that's missing is a focus on determining the right value in each context. I mean, it should be a simple matter to attach a condition to your data stating that you are to be paid a certain amount whenever Big Data processes it. But how much? And are you to be 'paid' in hard currency, loyalty points or cost savings?

The ability to put a value on your data in any scenario is not as far away as you might think. The Privacy by Design report notes that the personal data ecosystem (PDE) is "explicitly architected as a network of peer-to-peer connectivity over private personal channels that avoid both information silos and unnecessary “middlemen” between interactions."

Sound familiar?

As explained in the previous post, P2P marketplaces already enable you to balance your privacy and commercial interests by setting a value on your data that is appropriate to the specific context. Your account on each platform - whether it's eBay or Zopa or one of many others - is effectively a 'personal cloud' through which you interact with other users' personal clouds to sell/buy stuff or lend/borrow money on service terms that leave most of the transaction value with you and the other participants.

The wider developments in semantic data interchange, trust frameworks and so on that are noted in the Privacy by Design report enable these clouds or marketplaces to be linked with other personal clouds, either directly or through the 'personal information managers' envisaged in the Midata programme.

Ultimately, we could use one or two personal information managers to host and control access to our data and derive income from the use of that data by transacting on different P2P platforms dedicated to discrete activities. Not only would this make it simpler to understand and verify whether the use of our data is appropriate in each context, but it would also enable us to diversify our sources of value - a concept that is just as important in the data world as it is in financial services. You don't want all your data and income streams (eggs) in the one cloud (basket).

The Privacy by Design report claims that "all these advancements mean that Big Privacy will produce a paradigm shift in privacy from an 'organisation-centric' to a balanced model which is far more user-centric".

I agree, but would add a cautionary note.

In the context of the 'computers vs people' meme, I'm concerned by references in the report to "cloud-based autonomous agents that can cooperate to help people make even more effective data sharing decisions". Has Privacy by Design been unwittingly captured by the Singularity folk?

I don't think so. Such 'cloud-based agents' are ultimately a product of human design and control. Whether the technologists at the Singularity University choose to believe it or not, humans are in fact dictating each successive wave of automation. 

At any rate, we should take advantage of technology to keep things personal rather than submit to the Big Data machines.


Wednesday, 29 January 2014

Humans Win In The P2P Economy

There's been a lot of heat rising from Google executive chairman Eric Schmidt's recent assertions about a "race between computers and people" that obliges people to avoid jobs that machines can do. Initially, I suggested this was somewhat disingenuous, given the belief amongst the Silicon Valley elite that machines will achieve the 'Singularity', a state of autonomous superintelligence at which they will outcompete humans to the point of extinction. Merely pushing people into a narrower and narrower range of 'creative' jobs only furthers that cause, since their creative output attracts the vast advertising revenues Big Data needs to build ever smarter machines.

But I also suggested there's an antidote, and today I want to focus more on that.

Not all Internet platforms finance themselves primarily by using free content as bait for advertising revenue. Since eBay enabled the first person-to-person auction in 1995, the 'P2P' model has spread to music and file sharing, voice and data communications, payments, donations, savings, loans, investments and so on. There are now too many such platforms to list. Even political campaigning has become a person-to-person proposition. In Japan a person can offer to care for another person's elderly parents in his city, if someone else will care for his own parents in another.

Like their meat-space counterparts - the 'mutual society' and the 'co-operative' - online P2P platforms enable people to transact and communicate directly with each other in return for relatively small payments towards the platforms' direct operational costs of facilitating the connection. The P2P model vastly limits the need for advertising, since the platform either enables participants to find each other or automatically matches and connects them using the data the participants enter. Through central service terms, each participant agrees with the others how the platform works and how their data is to be used. Typically, every participant has their own data account in which they can view their transaction history. Some platforms will allow that data to be downloaded, along with all the transaction data on the platform, and this is to be encouraged. Low charges make this a high volume business, like Big Data, but platform operators are able to achieve profitability without commanding the lion's share of the margin in each transaction. This helps explain why eBay is solidly profitable but has a lower market capitalisation than, say, Facebook or Google. It's a leaner intermediary - a facilitator rather than institution. That Wall Street attaches a lower value to a comparatively democratic and sustainable business model tells you all you need to know about Wall Street.

Google and Facebook might argue they are a kind of P2P platform. But aside from a few services, like App sales, they don't directly facilitate the negotiation and conclusion of transactions, so they cannot justify a transaction fee. Perhaps they might say they own the web pages and the servers or virtual 'land' on which their advertising is displayed. But that doesn't ring true. They provide the tools for users to create web pages, but if users did not build them there would be no facade on which to display ads, and no one to look at them. Besides, the supply of creative tools is a one-off, while users supply limitless amounts of data in return. Meanwhile, the advertising revenue that was once merely enough to sustain the Big Data ecosystem now dwarfs the value derived by all participants except the platform operators themselves. Any essence of mutuality - and humanity - has been lost in exactly the same way that banks grew from their mutual origins to capture more and more of the 'spread' between savings and loans. And just as banks now allocate most of the money they create to add financial assets to their balance sheets, rather than financing the productive economy, the Big Data platforms are investing in more ways to capitalise on free user data to lure advertising spend, rather than figuring out new ways to leave most of the value with their users.

Dealing with people and businesses over P2P platforms is a good way to use your own data to claw some of that value back.



Friday, 24 January 2014

Google Declares War On The Human Race

Google's executive chairman, Eric Schmidt, finally admitted yesterday something that the likes of Jaron Lanier have been warning us about for some years now: he believes there's actually a race between computers and people. In fact, many among the Silicon Valley elite fervently believe in something called The Singularity. They even have a university dedicated to achieving it.

The Singularity refers to an alleged moment when machines develop their own, independent 'superintelligence' and outcompete humans to the point of extinction. Basically, humans create machines and robots, harvest the world's data until a vast proportion of it is in the machines, and those machines start making their own machines, and so on, until they become autonomous. Stuart Armstrong reckons "there's an 80% probability that the singularity will occur between 2017 and 2112".

If you follow the logic, we humans will never know if the Singularity actually happened. So belief in it is an act of faith. In other words, Singularity is a religion.

Lots of horrific things have been done in the name of one religion or another. But what sets this one apart is that the believers are, by definition, actively working to eliminate the human race.

So Schmidt is being a little disingenuous when he says "It's a race between computers and people - and people need to win," since he works with a bunch of people who believe the computers will definitely win, and maybe quite soon. The longer quote on FT.com suggests he added:
“I am clearly on that side [without saying which side, exactly]. In this fight, it is very important that we find the things that humans are really good at.”
Well, until extinction, anyway.

Of course, the Singularity idea breaks down on a number of levels. For example, it's only a human belief that machines will achieve superintelligence. If machines were to get so smart, how would we know what they might think or do? They'd have their own ideas (one of which might be to look after their pet data sources, but more on that shortly). And there's no accounting for 'soul' or 'free will' or any of the things we regard as human, though perhaps the zealots believe those things are superfluous and the machines won't need them to evolve beyond us. Finally, this is all in the heads of the Silicon Valley elite...

Anyhow, Schmidt suggests we have to find alternatives to what machines can do - the things only humans are really good at. He says:
"As more routine tasks are automated, this will lead to much more part-time work in caring and creative industries. The classic 9-5 job will be redefined." 
Which is intended to focus our attention away from the trick that Google and others in the Big Data world are relying on to power up their beloved machines and stuff them full of enough data to go rogue: by offering some stupid humans 'free' services that suck in lots of data, Big Data can charge other stupid humans for advertising to them. That way, the machines hoover up all the humans' money and data at the same time.

This works just fine until the humans start insisting on receiving genuine value for their data.

Which is happening right now in so many ways that I'm in the process of writing a book about it. 

Because it turns out humans aren't that dumb after all. We are perfectly happy to let the Silicon Valley elite build cool stuff and charge users nothing for it. Up to a point. And in the case of the Big Data platforms, we've reached that point. Now it's payback time.

So don't panic. The human race is not about to go out of fashion - at least not the way Big Data is planning. Just start demanding real value for the use of your data, wherever it's being collected, stored or used. And look out for the many services that are evolving to help you do that.

You never know, but if you get a royalty of some kind every time Google touches your data, you may not need that 9 to 5 job after all... And, no, the irony is not lost on me that I am writing this into the Google machine ;-)

