As I said in the first part, we are not yet further along with AI than the second generation. Our assessment of criminal liability for the behaviour of an AI or a robot will therefore not be that complicated. The question, however, is what we should understand by the term robot or AI in a heat treatment plant.
It can be a robot as such, a robotic loader, or a furnace control system based on SCADA or SIMATIC, as well as DEMIG or STANGE, etc. There are countless systems and variants.
If, for any reason, a robot, a furnace control system, or an AMR (Autonomous Mobile Robot) loader causes us damage or injury, two basic types of assessment will apply:
Objective liability – the wrongdoer (tortfeasor) is liable for the damage regardless of fault
Subjective liability – requires proof of fault, at least in the form of negligence
What you need to know about objective liability
An example is liability for damage caused by dangerous activities or employer liability for work-related injuries.
What you need to know about subjective liability
An example is criminal liability – for someone to be punished for a crime, their fault must be proven.
In the Czech legal system, product liability is incorporated into the Civil Code (Act No. 89/2012 Coll.), specifically in the provisions on special liability for damage caused by a product defect, and in the Consumer Protection Act. The manufacturer, or rather the person who placed the product on the market, is liable regardless of fault if it is proven that the product had a defect that caused the damage.
Objective liability for damage, e.g. for a defective product, has a compensatory, indemnifying function – the aim is to compensate the injured party for the damage. It is based on the principle that the injured party does not have to prove the fault of the wrongdoer; they only have to prove the existence of the defect, the damage, and the causal connection between them.
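The difference in the burden of proof can be summarized in a short illustrative sketch. The class and field names below are my own simplification for illustration, not legal terminology, and the subjective (criminal) branch is reduced to its key distinguishing element, fault:

```python
# Illustrative sketch: what the injured party must (and need not) prove
# under objective product liability (Civil Code, Act No. 89/2012 Coll.),
# versus subjective (criminal) liability. Hypothetical simplification.
from dataclasses import dataclass

@dataclass
class Claim:
    defect_proven: bool         # the product had a defect
    damage_proven: bool         # damage actually occurred
    causation_proven: bool      # the defect caused the damage
    fault_proven: bool = False  # fault of the wrongdoer

def objective_liability(claim: Claim) -> bool:
    # Fault is deliberately ignored: liability attaches regardless of it.
    return claim.defect_proven and claim.damage_proven and claim.causation_proven

def subjective_liability(claim: Claim) -> bool:
    # Criminal liability additionally requires proven fault,
    # at least in the form of negligence.
    return objective_liability(claim) and claim.fault_proven
```

Note that proving defect, damage and causation alone already establishes objective liability, while the same facts without proven fault are not enough for the criminal (subjective) branch.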
This form of liability is also addressed by Directive (EU) 2024/2853 on liability for defective products.
Objective liability in civil law is sufficient to resolve property damage and quickly compensate the injured party. However, it is not sufficient if the action causes severe social damage or danger, and the punishment of the perpetrator is justified from the point of view of protecting public interests.
If the action is significantly socially dangerous and interferes with protected rights and interests to an extent that exceeds the mere property rights of the injured party, a criminal sanction may be appropriate. A typical example is death or serious injury to health caused by a knowingly neglected safety standard, e.g. a fundamental violation of regulations or ignoring warnings. In such a case, it is not just an accident or a product defect, but an action with the potential to endanger the lives or safety of several people, in which case it is appropriate to classify the action as a crime and impose an appropriate sanction.
Cumulation of civil and criminal liability:
In practice, the injured party can claim compensation for damages in civil proceedings while, independently of this, the criminal authorities conduct criminal proceedings against the perpetrator. One line (civil) focuses on compensation, the other (criminal) on possible punishment (fine, imprisonment, prohibition of activity, etc.).
Subjective criminal liability
The basic attributes that serve to determine possible criminal liability are listed in the Criminal Code.
An obligatory part of assessing subjective criminal liability is examining the cause of the harmful behaviour. Three basic scenarios can be distinguished:
1. Abuse of a robot by a human – the relevance of artificial intelligence from the point of view of criminal liability is the same as when using any other tool. This mainly concerns cases in which a human fulfils the objective element of the crime himself, using artificial intelligence under his own control, even if he uses it to perform partial operations that he himself would not be capable of
2. Defect caused by circumstances independent of the human and the given AI – this is not a criminal act if the movement is caused by an external force or another circumstance independent of the human will, unless this event was preceded by the perpetrator's own, usually negligent, conduct. Under §146 of the Criminal Code, it will be necessary to prove that the robot's action was a manifestation of the perpetrator's will in the outside world. The factual elements of the crime could, however, be fulfilled by an omission, ranging from turning on the machine during ongoing maintenance to working with the robot in unsuitable conditions
3. AI acting alone without human causation or fault – this is probably a possibility only for higher generations of AI and robots, and only after options 1 and 2 have been safely ruled out
The key issue will be the assessment of the causal relationship between the action and the consequence. It must be proven that a certain consequence, e.g. death, injury, damage, occurred precisely because of the actions of a specific person, the perpetrator, and not because of another circumstance. Without proving this causal connection, the perpetrator cannot be held responsible for the crime. This is a basic condition for acknowledging guilt.
Possible causes of injury to health caused by AI or second-generation control systems
When assessing the behaviour of AI, it is necessary to take into account the already applicable Regulation (EU) 2024/1689 of the European Parliament and of the Council (the AI Act).
As an example, let’s choose the default situation:
1. The manufacturer designs, tests, manufactures and sells a vacuum furnace with a second-generation control system, or a robot with an AI-based control system, while complying with all the requirements of the AI Act (Regulation (EU) 2024/1689).
2. The user also uses it in accordance with this Regulation and with the instructions supplied by the manufacturer.
3. Nevertheless, the robot behaves unexpectedly in a way that causes damage to property or health, or even costs someone their life.
An investigation will be initiated according to the Criminal Procedure Code, with three options being considered:
A. It will not be determined at all why this happened, and external intervention (influence of actions by a third party) cannot be ruled out.
B. It will be determined that the robot behaved this way because of improper use, i.e. the user actually acted contrary to the documentation (instructions), perhaps intentionally, perhaps out of ignorance or negligence.
C. It will be determined that the robot behaved this way because of a defect caused by the manufacturer (regardless of whether HW or SW or input data).
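The branching of the investigation into outcomes A, B and C can be sketched as a simple decision function. The function, its inputs and the outcome labels are my own illustrative simplification of the text above, not terms from the Criminal Procedure Code:

```python
# Illustrative decision sketch of the three investigation outcomes
# described in the text. Hypothetical simplification, not legal doctrine.
from enum import Enum

class Outcome(Enum):
    A = "cause undetermined; third-party intervention cannot be ruled out"
    B = "improper use by the user, contrary to the documentation"
    C = "defect on the manufacturer's side (HW, SW or input data)"

def classify_incident(cause_found: bool, user_misuse: bool) -> Outcome:
    # If the investigation cannot establish a cause at all, we land in A
    # regardless of any suspicion about the user or the manufacturer.
    if not cause_found:
        return Outcome.A
    # With a cause established, the split is between the user's conduct (B)
    # and a defect attributable to the manufacturer (C).
    return Outcome.B if user_misuse else Outcome.C
```

Each outcome then branches further into the sub-options (A1 to A4, B1 to B3, C1 to C3) discussed below.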
Option A
Option A1: It will not be determined why the robot behaved the way it did
The robot was manufactured according to its documentation, and the documentation corresponds to the state of science and technology and to the requirements of the relevant regulations or technical standards. Nevertheless, the robot does something harmful, e.g. injures someone by moving its body or runs someone over.
In such a case, the police authority could conclude that no criminal offense was committed, because the conditions under Section 13(2) of the Criminal Code are not met: criminal liability requires intentional fault unless the criminal law expressly stipulates that negligence is sufficient. Moreover, no perpetrator will be identified, so criminal prosecution cannot be initiated at all, because under Section 160 of the Criminal Procedure Code prosecution can only be brought against a specific person.
Option A2: It is determined that it was a concurrence of unfortunate circumstances
The reason for the behaviour is determined, but it was not primarily caused by a product defect; rather, it resulted from one or more coincidences or unfortunate circumstances.
In the event of an unfortunate combination of events, e.g. radio interference (the band overwhelmed by surrounding traffic) together with adverse weather confusing the robot's orientation in space, it could be argued that the robot should have been designed to cope with this situation. Whether that is required will depend on expert assessment: is such a requirement imposed by a legal regulation, or is it sufficient that the robot's design meets the state of science and technology, or common sense?
Option A3: Influence or intervention of a third party is detected
If the robot's actions were influenced by a third party, this would be an attack, regardless of whether physical or logical (hacking), and the perpetrator would then be fully criminally liable. In such a case, the robot would be an object used to commit a crime, i.e. an instrument of crime within the meaning of Section 135a of the Criminal Code. It can be assumed that the defence, as happened in the past with attacks on the banking sector by IS employees, would point to the faulty design of the robot, arguing that it should not have been susceptible to such an attack because the perpetrator was only “trying something”, but this would most likely be rejected.
Option A4: The robot is found to have been reconfigured because of self-learning
It is found that the AI control system behaved differently than expected because, as a self-learning system, it reconfigured itself. In the most specific case, everything may be fine when the robot is delivered, but because it is self-learning, it may adopt a way of acting (a reaction to external stimuli) in which, in order to complete a work task, it pushes someone under another machine.
Here we do not know whether the manufacturer could have anticipated this, and it will have to be discussed whether the robot's design corresponded to the state of science and technology or to the requirements of the AI Act. If so, criminal liability will most likely not be considered, since there is no culpable conduct within the meaning of §13(2) of the Criminal Code; not even unconscious negligence can be inferred
Option B
Option B1: The user acted explicitly in violation of the documentation and took prohibited steps. Example: he started the robot's movement even though he knew that people could be moving in its path.
In this case, it could be negligence under Section 16(1) of the Criminal Code, conscious or unconscious, or a higher degree of negligence, i.e. gross negligence, under Section 16(2). Intent also comes into consideration, especially under Section 15(1)(b). It will be possible to initiate criminal proceedings under Section 160 of the Criminal Procedure Code.
Option B2: The user took steps that are not prohibited but nevertheless appear contrary to the usual way of use or so-called common sense – e.g. he started the robot's movement in reduced visibility, when he could have expected that the robot's sensors might not correctly evaluate the surroundings
Here, the volitional component will probably be absent: the perpetrator did not intend (did not want) to cause a violation of, or threat to, an interest protected by the Criminal Code, but there is still a causal relationship between his actions and the criminal consequence. It will probably also be possible to initiate criminal proceedings under §160 of the Criminal Procedure Code.
Option B3: There is an error in the documentation
Even if the wrongful act is caused by the user acting in accordance with faulty documentation, the user may still be held liable. The question is to what extent, given the circumstances and his personal situation, he could have recognized the error.
Option C
Option C1: The responsible person who caused the defect is identified (designer, programmer, assembler, tester, author of the documentation).
In this case, it could be negligence under §16(1), conscious or unconscious, or a higher degree of negligence, i.e. gross negligence, under §16(2). Intent under §15(1) is also possible, whether under letter a), e.g. revenge by an employee, or under letter b), indifference.
In addition to prosecuting a natural person, the criminal liability of the legal person, i.e. the manufacturer, is also possible under Act No. 418/2011 Coll., on the Criminal Liability of Legal Persons and Proceedings Against Them, if the actions of the natural person are attributable to the legal person within the meaning of §8 of that Act.
Option C2: It will not be determined who is specifically responsible, but it will be proven that the defect occurred during design, production or testing, i.e. at the manufacturer
In case C2, it was not possible to determine facts justifying the initiation of criminal prosecution of a specific natural person. However, the criminal liability of the legal entity, the manufacturer, probably comes into consideration under Act No. 418/2011 Coll., on the Criminal Liability of Legal Persons and Proceedings Against Them, because according to Section 8(3) the criminal liability of a legal entity is not prevented if it is not possible to determine which specific natural person acted in the manner specified in that Act. However, there must be no doubt (not merely a hypothetical assumption) that such a natural person exists.
Option C3: It is found that the defect was caused by a subcontractor of some components
In this case, the criminal liability of the natural person acting for the subcontractor, or the subcontractor as a legal entity, cannot be ruled out, and in addition, the criminal liability of the natural person acting for the manufacturer, or the manufacturer as a legal entity.
In both cases, it would most likely be liability for one of the negligent criminal acts (the subcontractor negligently causes the defect and the manufacturer negligently fails to detect it). In such a course of events, the causal connection will not be interrupted, and both natural persons and, depending on the fulfillment of the conditions of attribution, both legal entities will bear criminal liability.
This is where Professor Smejkal's list of options ends. Do we have enough imagination to find an example from practice where this could apply? I will construct one. We invest in robotization and buy a robot for preparing the batch and for loading and unloading the furnace, as shown in the following picture. During batch preparation, the robot's gripping jaws break, and the manipulated product is thrown out at such a speed that it overcomes the barriers around the robot and hits a passing employee, who suffers serious injuries. If the gripping jaws come from the robot manufacturer, then the manufacturer or its subcontractor is at fault. This would therefore be Option C3, with criminal liability of the supplier or subcontractor.
If the gripping jaws are no longer original, it will have to be investigated whether they were purchased from the original supplier (i.e. an original spare part) or from someone else, or whether the user made them himself as a spare part, and also whether disassembly and assembly were carried out according to the instructions.
Since the gripping jaws broke due to a material defect, it will be investigated whether this defect, hidden by its nature, already existed when the robot or the spare part was delivered; the two-year period for claiming a product defect will also apply here.
This liability is objective, i.e. there is no need to prove the manufacturer's fault.
Graphically, this can be shown roughly like this:
The description in Option C1 or C3 will therefore apply to the above case, depending on whether the gripping jaws were designed and manufactured by the robot supplier itself or by its subcontractors.
Since this is a serious injury, the work accident will be handled by the State Labour Inspection Office (SÚIP). If no user intervention in the robot, no misinterpretation of the manufacturer's instructions, and no use of non-original parts are found, then there will be grounds not only for compensation for the product defect but also for criminal prosecution of the manufacturer as a legal entity. And if a specific person at the manufacturer is identified who was responsible for the quality of the gripping jaws, that person will also face criminal prosecution.
However, it will most likely involve liability for some negligent criminal offense (the subcontractor negligently causes a defect, and the manufacturer negligently fails to detect it). In such a course of events, the causal connection will not be interrupted, and criminal liability will be borne by both the natural person who supplies the component and depending on the fulfilment of the conditions of attribution, the legal person, the supplier of the robot.
However, if it is proven that the gripping jaws were not original but were made by the user, then Option B1 or B2 will probably apply; the manufacturer's instructions and recommendations regarding the gripping jaws will also have to be examined. If the instructions are in order, criminal sanctions will fall on the responsible natural person at the user and on the user as a legal entity.
It should be noted here, however, that the State Labour Inspection Office (SÚIP) is not authorized to conduct criminal proceedings. Its task is primarily to investigate whether occupational safety and health (OSH) regulations have been complied with. If violations are found, the inspectorate can order the employer to take measures to eliminate the identified deficiencies and prevent further accidents. In the case of serious or repeated violations, it can initiate administrative proceedings and impose a fine for the offense.
As for criminal prosecution, SÚIP itself does not file motions for prosecution. However, if inspectors discover facts during the investigation indicating that a criminal offence has been committed (e.g. gross negligence on the part of the employer leading to serious injury or death of an employee), they are obliged to inform the law enforcement authorities, i.e. the Police of the Czech Republic or the public prosecutor's office. These authorities then decide whether to initiate criminal prosecution.
This is where my story ends. With the development of robotization and automation and the onset of AI, we will face completely new challenges and problems. And we are still talking only about the second generation of AI. If we enter the third generation, self-learning robots, then we are in for a real mess, because we lose control over the robot and its control system.
The information that our vacuum furnace is equipped with a “self-learning AI system” is a sign that we need to be vigilant. Measured against the definition given by Professor Smejkal in the previous part, this claim is not true, because what we actually have is a system as Professor Mařík describes:
AI today is just a collection of Turing machines, mathematical models and algorithms, handling input and output deterministically. If a program called AI must follow the programmer's commands without the possibility of modifying itself, then we can develop it up to the level of an expert system, but no further.
According to the European AI Act (Regulation (EU) 2024/1689), this is not even possible today. Open, self-learning systems may only be used for data collection and evaluation, not for device control. Any change of function caused by the robot reprogramming itself would lead to a possible collision, destruction of the device or endangerment of people. And if we accidentally allow this, will we put the robot on trial?
Jiří Stanislav
February 9, 2025
(1) – Note: the text in italics is a citation from a lecture by Professor Smejkal