Having modelled the steps and interactions of the operating scenario, each of the identified decision points can then be analysed in order to determine the nature of any hazardous scenarios that may arise.
Each of the decision points identified in Activity 3 should be considered in turn. For each, the possible environmental states that may arise within, or emanate from, the operating environment are identified, before enumerating the decisions that the AS could select at that decision point.
A decision point for an autonomous robot moving in a building may correspond to the detection of an object in the robot’s path. The options available to the AS at this point are:
The different possible scenarios and environmental states that relate to the decision can then be identified by considering the real-world state and the belief state of the AS at the point at which the decision is made, along with each of the possible options. The real-world state represents the actual state of the operating environment, whilst the belief state represents the understanding that the AS has of the state of the operating environment.
An example of some enumerated situations for an autonomous robot encountering an object in its path is shown in Table 1. The real-world and belief states are represented as Boolean values, where ‘True’ represents the presence of the object and ‘False’ its absence. Situation 5 in Table 1 therefore represents the situation where there is an object in the path of the robot, but the robot is unaware of the object and continues on its current path at its current speed.
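The enumeration behind a table such as Table 1 can be sketched as a cross-product of the real-world state, the belief state, and the available options. The following is a minimal illustrative sketch; the option names are assumptions for the robot example, not taken from the source method.

```python
from itertools import product

# Assumed option set for the robot decision point (illustrative only).
OPTIONS = [
    "continue on current path at current speed",
    "continue on current path at reduced speed",
    "stop",
]

def enumerate_situations(options):
    """Cross the real-world state, the AS belief state, and each option.

    As in Table 1, True represents the presence of the object and
    False represents its absence.
    """
    situations = []
    for real_world, belief, option in product([True, False], [True, False], options):
        situations.append({"real_world": real_world, "belief": belief, "option": option})
    return situations

situations = enumerate_situations(OPTIONS)
print(len(situations))  # 2 real-world states x 2 belief states x 3 options = 12
```

Each entry can then be examined in turn to determine its outcome, as described below.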
For each of the identified scenarios, the outcome can be defined in order to identify those that may be potentially hazardous.
For an autonomous car turning right at a roundabout, one decision point is the car entering the roundabout. The options for the car at this decision point are:
A possible interaction at this point is with another road user, in this example a cyclist. Relevant scenarios can be identified by considering the real-world presence of the cyclist in combination with the car’s belief that a cyclist is present, and each of the options identified above. For example, one situation is that a cyclist is not present, but the car believes that there is a cyclist and decides to stop and wait. This scenario could lead to a hazardous outcome as an unnecessary and unexpected stop could potentially cause a rear‐end collision.
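One way to screen the enumerated scenarios for potentially hazardous outcomes is to flag combinations where the car's belief diverges from the real world and the chosen option is unsafe given the actual state. This is an illustrative sketch only; the option names and the flagging rules are assumptions based on the two cases discussed above, not an exhaustive analysis.

```python
def potentially_hazardous(cyclist_present, believes_cyclist, option):
    """Flag scenarios from the roundabout example that may be hazardous.

    Covers the two mismatch cases discussed in the text; a full analysis
    would examine every enumerated combination.
    """
    # False negative: a cyclist is present, the car is unaware, and it proceeds.
    if cyclist_present and not believes_cyclist and option == "enter the roundabout":
        return True
    # False positive: no cyclist, but the car stops unexpectedly,
    # risking a rear-end collision.
    if not cyclist_present and believes_cyclist and option == "stop and wait":
        return True
    return False

print(potentially_hazardous(False, True, "stop and wait"))  # True
```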
For an autonomous insulin pump that is monitoring a patient’s blood sugar level, one decision point is whether to alter the infusion rate to the patient. The options for the pump at this decision point are:
Relevant scenarios can be identified by considering the real-world change in the patient’s blood sugar level in combination with the pump’s belief in the current sugar level, and each of the options identified above. For example, one scenario is that the patient’s blood sugar level rises, the pump also has a belief state that the blood sugar level has risen and yet decides to maintain the current rate of insulin infusion. This scenario could lead to a hazardous outcome as the patient's blood sugar level could increase to unsafe levels.
Considering the potentially hazardous scenarios for an autonomous robot encountering an object in its path in Table 1, it is also possible to assign a severity to each outcome, and to consider whether minor or serious injuries (or even fatalities) are likely.
It is also possible to consider further how this severity could be affected by factors pertaining to the operating environment.
An autonomous robot moving in a building that fails to detect an object in its path, and therefore continues on its current path at its current speed, could collide with a static object. If the static object is an adult human, the severity of the impact could be minor (a bruise or laceration); should the static object be a small child, the severity could be major (broken bones). Alternatively, the robot may have detected the static object and decided to maintain its current path at reduced speed. A wet floor surface may reduce braking effectiveness (through wheel slippage), and the resulting higher-than-expected collision speed could increase the severity of the collision.
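The severity reasoning above can be sketched as a base severity determined by the impacted object, escalated by environmental factors. The severity levels and the single-step escalation rule below are illustrative assumptions, not values prescribed by the source method.

```python
# Base severity by impacted object, taken from the robot example in the text.
BASE_SEVERITY = {"adult": "minor", "child": "major"}

# Assumed ordering of severity levels (illustrative only).
SEVERITY_ORDER = ["minor", "major", "fatal"]

def assess_severity(impacted_object, wet_floor=False):
    """Return a severity level, escalated by one step for a wet floor.

    A wet floor reduces braking effectiveness through wheel slippage,
    raising the collision speed and hence the severity.
    """
    severity = BASE_SEVERITY[impacted_object]
    if wet_floor:
        idx = min(SEVERITY_ORDER.index(severity) + 1, len(SEVERITY_ORDER) - 1)
        severity = SEVERITY_ORDER[idx]
    return severity

print(assess_severity("adult", wet_floor=True))  # minor escalates to major
```

In practice the escalation would be justified case by case rather than applied mechanically; the sketch only shows how environmental factors can be recorded alongside the base severity.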
For an autonomous car turning right at a roundabout, the car may fail to detect a dynamic object whose path it would intersect on entering the roundabout. If this dynamic object is a car that is struck by the autonomous vehicle, the occupants may suffer only minor injuries (if any at all); should the dynamic object be a cyclist, the collision may result in a fatality. Should the road surface be icy, the force of impact may displace a struck car onto a fixed object in the environment, thereby increasing the severity of the collision for that car’s occupants.
For an autonomous insulin pump that is monitoring a patient’s blood sugar level, the pump could decrease the insulin infusion rate without medical need. Should the patient be experiencing a spike in blood sugar level at the time the infusion is reduced, the patient may become hyperglycaemic. Should the patient have additional health complications, or should the reduction in insulin infusion go unobserved by a clinician, the severity could increase and result in a fatality.
The analysis undertaken in this activity shall be documented ([WW]).