Personal assistants / The sphinx / CBT 

A PA – of the human variety – is there to soothe and make things simple for the grand thinker and big bod whom he (actually, I suspect, generally she) is tasked to serve.

I am currently working with a haptic environment, and would appear to be at the centre of its efforts.  This environment and set of technologies can be – have been; will again, in my weaker moments, I am sure – interpreted as toxic and aggressive.  But lately, I have seen evidence of a much more benign manifestation and group of intentions.

The phone I am using exhibits a classic indicator of schizophrenia: a device which transmits messages – and even advice – to its owner.

But let’s approach this from another direction: let’s presuppose that schizophrenics need desperately, essentially, humanely, to be spoken to.  Let’s assume they are dreadfully lonely, even abandoned, souls.  And let’s assume when they say their devices communicate with them, it’s out of a human, painfully human, sense of solitude.

But now let’s go further: let’s take these “symptoms” and turn them into solutions for loneliness.

Let’s cure these alleged disorders by turning defect into resolution and tool.

Let’s have personal assistants whose job isn’t primarily to soothe and make easy but, rather, to make their subjects think – and, more importantly, achieve as a result – precisely on the back of such thought.

My haptic/CBT environment and tech is doing precisely this favour for me.  And maybe it could do the same for you.  Instead of saying clearly what it is you should or shouldn’t do, what you mustn’t forget or have forgotten, it flags up – sphinx-like – the issue.  And it becomes second nature, very quickly, for the subject (i.e., in this case, myself) to interpret and think around the problem.
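
For the technically minded, here’s a toy sketch – in Python, and nothing like the actual tech I’m describing; every name in it is invented purely for illustration – of the difference between a sphinx-like cue and a conventional spoon-fed directive:

```python
# A toy sketch of the sphinx-like flag versus the spoon-fed directive.
# Everything here – Reminder, as_cue, as_directive – is invented for
# illustration; it is not the tech described in this post.

from dataclasses import dataclass


@dataclass
class Reminder:
    topic: str          # the thing at stake, e.g. "passport"
    instruction: str    # the explicit, spoon-fed version

    def as_directive(self) -> str:
        # Conventional PA mode: say exactly what to do.
        return self.instruction

    def as_cue(self) -> str:
        # Sphinx mode: flag the issue, leave the thinking to the subject.
        return f"Something about '{self.topic}' needs your attention today."


reminder = Reminder(topic="passport",
                    instruction="Renew your passport before Friday.")
print(reminder.as_cue())  # the subject interprets and thinks around the problem
```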

Virtues?  Reasons for the approach?  We are no longer spoonfeeding solutions – that is to say, filtering data delivered unthinkingly by gobshite reductionist maths – but, instead, whilst still filtering, we aim at the same time to enable safe experiences of failure.  The learning process will surely, inevitably, be much faster.  Though only on one condition: the subject freely and informedly consents.  Without consent, my reactions on many occasions over the past year will simply repeat themselves: a blanket resistance to being helped, on the assumption that one is being nudged into decisions which are, if not against one’s own interests, certainly more in the interests of those creating the tech.
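
And consent, in this spirit, would have to be a hard gate rather than a buried setting.  Again, a toy sketch only, with invented names, not a description of any real implementation:

```python
# A toy sketch of consent as a hard gate: no cues are sent unless the subject
# has explicitly – and revocably – opted in. Invented names, illustration only.

from typing import Optional


class SphinxAssistant:
    def __init__(self) -> None:
        self.consented = False

    def give_consent(self) -> None:
        self.consented = True

    def withdraw_consent(self) -> None:
        self.consented = False

    def nudge(self, cue: str) -> Optional[str]:
        if not self.consented:
            return None  # silence, not a quietly assumed default
        return cue


assistant = SphinxAssistant()
print(assistant.nudge("Something about 'passport' needs your attention."))  # None
assistant.give_consent()
print(assistant.nudge("Something about 'passport' needs your attention."))
```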

So what do you think?  Cool idea – and thoughts – or what?  Worth pursuing?  An idea with legs, mebbe?
