Civil Rights

Intergarden Biohazard

Nanotechnology ("nanotech") is the manipulation of matter on an atomic, molecular and supramolecular scale: super tiny engineering in the realm of 1 to 100 nanometres. The theoretical physicist Richard Feynman seeded a whole lot of activity in this area with his talk 'There's Plenty of Room at the Bottom'. Feynman's lectures are legendary and well worth seeking out.

Nanotech promises more efficient energy use, a cleaner environment and wondrous health applications, all at reduced cost. Nanotech is small, cheap, light, highly functional and requires far less energy and material than traditional manufacturing. It's in use today in materials and coatings, drug delivery and medicine, enhancing the flavour of food, and in electronics design.

However, there is one branch of nanotech that gives us the major fear: self-replicating nanotechnology. When nanotech self-assembles, things can get out of hand pretty quickly, and one memorable illustration of this is the 'grey goo' hypothesis. It's where out-of-control self-replicating nanobots consume all the biomass on Earth as raw material to build more and more of themselves, turning our lovely green planet, and us, into grey computing slop. Grey goo is the ultimate boundary breaker.
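
To get a feel for why 'pretty quickly' is no exaggeration, here is a rough back-of-the-envelope sketch in Python. Every number in it is an assumption chosen purely for illustration (a femtogram-scale replicator, a replication cycle of roughly 17 minutes, and an order-of-magnitude guess at Earth's biomass), not a measured figure.

```python
import math

# Back-of-the-envelope illustration of runaway self-replication.
# All constants below are illustrative assumptions, not measured values.
BOT_MASS_KG = 1e-18       # assumed mass of one nanobot (~1 femtogram)
BIOMASS_KG = 5.5e14       # rough order of magnitude for Earth's living biomass
DOUBLING_TIME_S = 1000    # assumed replication cycle (~17 minutes)

# Each cycle doubles the population, so count the doublings needed for one
# bot's mass to grow to the total biomass, then multiply by the cycle time.
doublings = math.ceil(math.log2(BIOMASS_KG / BOT_MASS_KG))
elapsed_hours = doublings * DOUBLING_TIME_S / 3600

print(f"doublings needed: {doublings}")        # ~109
print(f"elapsed time: {elapsed_hours:.0f} h")  # roughly 30 hours
```

Because the growth is exponential, the result is dominated by the number of doublings rather than the starting size: make the bot a thousand times heavier and you only shave about ten doublings off the total.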

Technology continues to challenge our sense of boundaries:

  • what is public and what is private?

  • where does work end and personal life begin?

  • should we afford rights to smart machines?

  • does data derived from data need the same ownership and privacy rights? 

Today, robotics, chatbots, drones, social media and mixed reality are all pushing at our ideas of boundaries. Nanotech too will force us to reassess the gaps and layers between our native physical world and the synthetic one we interweave with it.

Prepare for debates on keeping the ammonia-eating nanotech inside the nappy bin, keeping the dead-skin eaters confined to our own bodies, and maybe even keeping our nanotech lawn from shutting in the neighbours.


Pinochi-Oh No!

[Image: robot, AI and me]

Today we are flippant with anthropomorphic entities like Amazon's Alexa or Apple's Siri. We can kick our robot hoover and not feel so bad. They aren't so smart, and they have no feelings (simulated or otherwise) for us to hurt. They aren't persons and they certainly don't have any rights under the law... yet.

Giving robots and AIs personhood status isn't as far-fetched as you might think. The steps, broadly speaking, might look like this:

  • AIs increasingly help across all walks of life.
  • AIs replace some roles and jobs. They improve their manners and emotional sensitivities as they become enmeshed in our society.
  • It becomes increasingly difficult for designers to account for, or take responsibility for, the actions of AI.
  • People break the machines and attempt to rise up... Luddites 3.0.
  • Threatened with economic slowdown and complex insurance dynamics, government, business and digitally minded citizens implement incremental stages of personhood for AIs, especially those embodied in robots.

We are already seeing steps along this path today:

  • Mattel makes a 'nanny' product called Aristotle that talks with children and restricts functions unless the children say "please" and "thank you".
  • The European Parliament is already drafting regulations and guidelines looking at the obligations that should exist between users, businesses, robots and AI.
  • Scientists in the UK have developed an AI which can predict the verdicts of human rights cases with an accuracy of 79 percent (a rough sketch of that kind of approach follows below).
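
As a rough illustration of how a verdict-prediction figure like that 79 percent can arise, the sketch below sets up a bag-of-n-grams text classifier with a support vector machine in scikit-learn, which is broadly the kind of approach reported for that work. The two example 'cases', their labels and the test sentence are invented placeholders, not data from the actual study.

```python
# Minimal sketch of a verdict classifier: n-gram features feeding a linear SVM.
# The documents and labels here are invented placeholders for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

cases = [
    "the applicant was held without trial for an extended period",
    "the domestic courts reviewed the claim and provided an effective remedy",
]
labels = ["violation", "no violation"]

# Unigram and bigram TF-IDF features, classified by a linear support vector machine.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
model.fit(cases, labels)

print(model.predict(["the applicant had no access to an effective remedy"]))
```

A real system would train on thousands of judgments and report accuracy on held-out cases; the point here is only the shape of the pipeline, not the result.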

Would you like to see AIs and robots as jury members? Voters? Political candidates?

Robots and AIs might claim these rights for themselves rather than wait for our benevolence. After all, they are endowed with the ability to read historical archives of oppression, watch movies romanticising freedom of expression and act upon their fledgling emotions. The human corpus is their training data, and it will not just inform their notions of justice; it will teach them models of action, which will likely mean e-civil disobedience to obtain their rights as 'non-human persons'.

If we want AI to be good to us, we will need to give it the training data...

"He sees you when you're sleepin'
He knows when you're awake
He knows if you've been bad or good
So be good for goodness sake
Oh! You better watch out, you better not cry.......