Mal Fletcher
Chips Under The Skin - "Convenient" But Not Wise

Australians are, if a new report is to be believed, quite open to the possibility of taking the concept of wearable devices to a wholly new level.

A study sponsored by the Visa company suggests that 25 percent of Australians are at least ‘slightly interested’ in having a commercial chip implanted in their skin.

This is, of course, potentially great news for multi-national payment groups like Visa. Their specialty is, after all, devising new ways of getting people to part with money via their particular channels. The more convenient the channel, the greater the profits for Visa et al.

On one level, implants are old news. Several groups overseas have experimented with implanted radio frequency identification (RFID) chips. Most recently, the Epicentre high-tech office block in Sweden has encouraged its workers to undergo chip implantation, which allows them to open security doors, run office machinery and even pay for their lunch at the work canteen.

The notion of implanted chips may seem a convenient way to perform all manner of everyday activities. Dig a little deeper, however, and it becomes apparent that implants may not be such a good idea on a number of fronts.

Firstly, we should consider the very real potential for bio-hacking. Any programmable device is, in theory, subject to hacking. A biochip can be hacked by third parties with nefarious motives in the same way that a computer system can be invaded.

With biochips, the potential for privacy incursions is huge – and not only with regard to society’s criminal elements. Recent debates about the work of official security agencies have highlighted public concerns about spying by governments on their citizens.

The trust contract between government and citizenry is a cornerstone of liberal democracy. Once this compact is broken, anarchy becomes a very real possibility. Current misgivings about privacy intrusions by officials are hardly likely to be allayed if we insert tracking devices into our bodies.

On the business and civic fronts, Big Data Analytics is proving a great boon.

Sophisticated digital devices such as smartphones, satnavs and CCTV units make it possible for us to collect and generate data at unprecedented rates. Every day, the global community adds 2.5 billion gigabytes to the database we call the internet.

Super-computers, like IBM’s famous Watson machine, allow the speedy analysis of all this information and the discovery of patterns within it.

This analysis is used to predict such things as economic shifts, marketing trends and even voting patterns. Big Data is now proving invaluable in the design of furniture, buildings, streets, driverless cars and even entire cities. Civic authorities are consulting it in the development of new crime prevention programmes and prisons.

For all its benefits, however, Big Data remains a form of soft surveillance.

The Samsung company recently warned users of its Smart TVs that the inbuilt voice recognition feature allows private conversations to be recorded and stored in the Cloud. These private conversations are then accessible to third parties. George Orwell would have loved that.

The Samsung warning may describe an isolated case, and Big Data provides too many benefits for us to attempt to turn back the clock. The hacking of biochips, however, would leave personal privacy even more exposed to intrusion.

Implants may appear convenient, but we must consider whether or not we want our bodies to become hackable devices.

Internally ‘worn’ chips also raise other ethical considerations.

One of the most pertinent relates to the line between humanity and technology. As we use nanorobotics and bio-mechanical chips to inject a new breed of prosthetic devices into the human frame, will we lose our sense of differentiation between what is human and what is machine? At what point then might we truly become androids?

Far from being frivolous questions for sci-fi aficionados, these are now subjects undergoing serious debate in major universities – particularly in fast-growing ethics faculties. (What was sci-fi yesterday becomes wi-fi tomorrow.)

Commercially oriented chip implants also raise questions relating to digital debt. The growing number of charities and social enterprises devoted to helping the indebted bears witness to what is already a rapidly spreading problem in modern societies.

The uncoupling of spending from physical cash has doubtless played a key role in boosting personal debt. Paper money and coinage have substance and weight; it is relatively easy to keep track of how much we spend when our money has a physical presence. We know it’s time to wind back on impulse purchases when the wad of cash in our pockets starts to feel a little on the light side.

Credit cards do not change weight when money leaves our accounts. At least, though, the process of filling out a credit card slip – now less and less a part of purchasing, thanks to wave and pay – provides some kind of physical reminder, albeit a tenuous one, that purchases cost us something of real-world value.

The advent of digital currencies such as bitcoin creates a potential for even greater overspending. The ones and zeroes of binary code have no weight at all. Implanted chips will continue to erode the link in human consciousness between spending and real-world value.

Arguably, companies like Visa have little interest in this problem. There are real benefits for them in divorcing the act of a consumer’s spending from any process of forethought.

Subcutaneous spending devices also raise the potential for digital dementia. In 2011, an international study concluded, after ten years of investigation, that the onset of dementia can occur at around the age of 45, rather than at 65 as was previously believed.

At the 2030Plus think tank, we posed an important question linked to this study. If a similar ten-year scientific investigation commenced today, would we find at its conclusion that things we associate with dementia in 2015 had become normal cognitive function?

Would short-term memory loss, declining numeracy skills and feelings of confusion have ceased to be peculiar because we had ceded so many areas of our thinking to machines?

We already rely on gadgets for arithmetic, spelling, navigation and, increasingly, person-to-person interaction. What happens to the parts of our brains responsible for these and other activities if they are no longer called upon on a regular basis?

A few weeks ago, a leading British psychiatrist suggested that children as young as five years of age are exhibiting borderline autism-like symptoms. They are, he said, unable to read the subtle facial signals in normal human conversation because of their engagement with digital screens.

A range of studies, particularly in the USA, suggests that we are forming transactional relationships with machines. We do not remember what we learn on the internet as much as we remember where we found it, relying on the machine to store the details.

This, of course, means that what we read is not stored in long-term human memory and so contributes little to future innovation.

The experimental research of leading neuroscientists such as Baroness Susan Greenfield is building the case for watchfulness when it comes to relying too much on digital devices.

Finally, implants raise important health issues. Research into the impact of chips on the development of certain cancers is ongoing. To this point, studies have been carried out only on laboratory animals. Yet even now, as The Australian reported earlier today, they point to links between chip implants and cancerous growths.

Technology is to be celebrated. There is no point in taking a Luddite approach. Digital technologies have brought and will bring enormous benefits to the human experience.

That fact should not, however, make us oblivious to the potential pitfalls associated with making devices an extension – or an integral part – of the human frame.



Mal Fletcher was quoted on this issue in the Sydney Morning Herald and the Melbourne Age.

Mal Fletcher (@MalFletcher) is the founder and chairman of 2030Plus. He is a respected keynote speaker, social commentator, social futurist, author and broadcaster based in London.
