This week, as usual… my mind has been whirring with a multitude of issues relating to staff well-being in the NHS. Except lately I have been trying to look at everything from a different angle. So, being a bit of a sci-fi geek as well as a Tudor history fan and doctoral researcher, it suddenly dawned on me how I was, in my own mind, equating the professional duties of NHS staff with Isaac Asimov’s “Three Laws of Robotics”.
This ‘Eureka moment’ happened whilst I was enjoying my morning dippy egg, and I shall translate my thoughts as follows:
In case you were unaware, Asimov’s laws were conceived as a fundamental framework to underpin the behaviour of robots in human society, allowing robots to be used safely as tools.
They were originally as follows:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
I shall now rewrite these laws based upon the analogy of this being applied to NHS relationships in my own mind.
- Clinical professionals may not injure a patient or, through inaction, allow a patient to come to harm.
- Clinical professionals must obey the orders or ‘needs’ of patients, except where such orders would conflict with the First Law.
- Clinical professionals must protect their own existence as long as such protection does not conflict with the First or Second Laws.
In essence, clinical professionals are programmed in this same way to put patients first. I am not necessarily arguing that this is wrong. However, in doing so we may paradoxically be putting patients at risk if we fail to value clinical professionals as humans too.
Are clinical professionals the subservient robots of humanity?
If clinical professionals obey the needs or ‘orders’ of patients at the expense of their own well-being, then this may not be conducive to safe clinical care.
Later, Asimov added a “zeroth law”, which preceded the others in priority:
0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
Which in this analogy would become:
0. Clinical professionals may not harm humanity or, through inaction, allow humanity to come to harm.
If the well-being of clinical professionals is not properly valued or addressed, the quality of patient care may be reduced through “malfunctioning” or “decommissioned” practitioners. Therefore, humanity is harmed twice: once in the harm to patients and once in the harm to clinical practitioners. Humanity suffers.
The three laws are intended for robots, and we need to remember that clinical professionals are not robots. We also need to ensure that the well-being of NHS staff remains an issue of equal salience in the provision of safe care. If both NHS staff and patients are of equal societal value, then we must value #StaffExperience as much as we value #PatientExperience.
Should this happen, then we may see better quality outcomes for all.
Please let me know your thoughts… Until then, I shall be burying my head in an ethics paper and literature review!
Be kind to yourselves, and each other x