(Master's thesis)
Abstract
Robots have long been used in manufacturing, and they are now becoming increasingly indispensable in everyday life as well. The more autonomous and advanced they become, the more important it is to thoroughly reconsider their legal status and the aspects of criminal liability for their actions. Initiatives have emerged worldwide calling for better regulation of the legal status of robots, which the law currently treats as objects. One option is to treat robots as legal persons; another is to establish a new type of legal subject in the form of an e-personality. The legal status of animals, which are neither legal subjects nor mere objects, could also serve as a model. Opponents of these initiatives argue that establishing a new system is premature and would create more opportunities for abuse. Partly for this reason, the granting of Saudi Arabian citizenship to the robot Sophia, made by Hanson Robotics, has been criticized. Robots already sometimes cause undesirable consequences, such as injuries or deaths of workers in manufacturing, so it is important to define who is criminally liable for their actions. The law currently provides that only natural and legal persons can be perpetrators of criminal offences. Criminal liability for the actions of robots is often limited to the person who used the robot as a means of carrying out their criminal design. The thesis explores alternative possibilities of criminal liability for the actions of robots, such as the robot itself being liable for its own actions. This requires several preconditions, such as independent moral decision-making, the ability to act independently, and the ability to communicate its decisions in an understandable way. If the law allowed for the possibility that a robot is itself liable for its actions, there is a set of suitable and feasible criminal sanctions for robots, but they are meaningful only from the standpoint of general prevention, not from the standpoint of special prevention.
Keywords
artificial intelligence; robots; legal personhood; criminal liability of robots; punishing robots; master's theses
Data
Language: Slovenian
Year of publishing: 2021
Typology: 2.09 - Master's Thesis
Organization: UL PF - Faculty of Law
Publisher: [M. Gregorič]
UDC: 343:004(043.2)
COBISS: 75149571
Other data
Secondary language: English
Secondary title: Aspects of criminal liability for the actions of robots
Secondary abstract: Robots have long been used in manufacturing and it is hard to imagine daily life without them. The more autonomous and advanced they become, the more important it is to think about their legal status and the aspects of criminal liability for their actions. Initiatives have formed around the world to further regulate the legal status of robots, which are currently treated as objects. The possibilities are to treat robots as legal persons or to establish a new form of legal entity, such as e-personality. One possibility is also a similar legal status to animals, which are neither legal entities nor mere objects. Opponents of these initiatives believe that establishing a new system is premature and would lead to more opportunities for abuse. This is one of the reasons for criticizing the granting of Saudi citizenship to the robot Sophia, manufactured by Hanson Robotics. Robots already sometimes cause undesirable consequences, such as injury or death to production workers, so it is important to define who is criminally liable. The law currently states that only humans and corporations can be held liable for criminal acts. Criminal liability for the actions of robots is often limited to the person who used the robot as a means to achieve their criminal aim. This dissertation explores alternative possibilities for criminal liability for the actions of robots, such as robots themselves being held liable for their own actions. This would require some preconditions, such as that they make moral decisions on their own, act independently, and communicate their decisions to humans in an understandable way. If it were legally possible to hold them criminally liable for their actions, there would be a reasonable range of criminal sanctions for robots, but they would only make sense from the standpoint of general prevention and not from the point of view of special prevention.
Secondary keywords: artificial intelligence; robots; legal personhood; criminal liability of robots; punishing robots
Type (COBISS): Master's thesis/paper
Study programme: 0
Embargo end date (OpenAIRE): 1970-01-01
Thesis comment: University of Ljubljana, Faculty of Law
Pages: 36 f.
ID: 13314337