Considering the high level of interest in AI, we are launching a series of alerts about the main legal issues arising from the use of AI in Kazakhstan. In this first edition, we summarize the main aspects of AI that, in our opinion, should be regulated at the legislative level in Kazakhstan:
The concept of AI
Kazakhstan currently lacks a legal definition of AI. We wrote about foreign countries' experience with defining AI in our earlier publication from 2022. In the meantime, based on an analysis of how AI operates, the following characteristics can be identified:
- AI simulates human intelligence
- AI collects and processes data, including personal data
- AI predicts and makes decisions
AI and intellectual property
Here, we highlight two main issues requiring legislative attention:
- Can AI be recognized as the author of works? Under current Kazakhstani legislation, only a physical person whose creative labor has produced the work can be recognized as the author. There have been cases internationally – such as the case of Dr. Thaler (UK) regarding two patent applications naming the computer system DABUS as the inventor – where the courts explicitly concluded that AI cannot hold such status.
- Can works created by AI be protected? Another significant issue arising in practice concerns the possibility of granting intellectual property protection to works created using AI. In some countries, this issue has already been regulated. For example, in Ukraine, non-original objects generated by a computer program are protected by a special kind of right (sui generis).
Liability for the actions of AI
Given that AI is actively used in almost every sphere of activity, the question arises as to who is responsible for potential damage caused by the use of AI. Since AI itself cannot be at fault and, at this point, cannot be held liable, several subjects are potentially responsible for the AI's actions: the user, the author, or the rights holder, or a combination of the three.
AI and personal data
Companies in Kazakhstan use AI in marketing, customer correspondence (chatbots), employee recruitment, and numerous other areas. For example, AI can analyze resumes and select candidates based on the employer's stated requirements. In all such cases, the use of AI involves the collection and processing of personal data. Furthermore, data is needed to train AI systems, and this may include personal data.
In our opinion, Kazakhstan should, at the very least, determine what personal data may be collected and processed by AI, and under what conditions.
We plan to address all of the above issues in subsequent alerts, with reference to foreign experience and current Kazakhstani legislation.