01 February 2017

Austria: Machine Learning: Whom to Credit, Whom to Blame?

Rapid technological progress, artificial intelligence, machine learning – all these advances require a new concept of legal thinking.

Data protection in a connected world

Data protection will become more and more important, thanks in no small part to ever more rapid technological progress. Self-driving cars, the internet of things, automated medical diagnosis – all these things invariably require data processing. Algorithm-based data analysis will play a key role in the future. This will raise certain questions about data protection, such as how to ensure proper data anonymisation, pseudonymisation or aggregation in order to comply with applicable data protection laws.
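To make the compliance point more tangible, here is a minimal Python sketch (purely illustrative, using a hypothetical key and record) of one common pseudonymisation building block: replacing a direct identifier with a keyed hash. Because whoever holds the key can re-link the token to the individual, pseudonymised data generally remain personal data under the General Data Protection Regulation, whereas properly anonymised data no longer relate to an identifiable person.

import hmac
import hashlib

# Hypothetical secret key; in practice it would be stored separately from the dataset.
SECRET_KEY = b"example-pseudonymisation-key"

def pseudonymise(identifier: str) -> str:
    # Replace a direct identifier (eg an email address) with a stable, keyed token.
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Illustrative record only
record = {"email": "max.mustermann@example.at", "diagnosis": "..."}
record["email"] = pseudonymise(record["email"])
print(record)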

In addition, joint data processing and all the questions related to it will become increasingly important. As our world becomes more and more connected, huge amounts of data will need to be exchanged and transferred between a growing number of parties, making it more and more difficult to determine the roles and responsibilities of the data controller, in particular when no proper contracts are in place.

Recent legal developments in Europe will further boost the relevance of data protection, above all the General Data Protection Regulation, which becomes applicable in 2018. It will introduce new concepts into the existing regulatory framework, such as the data protection impact assessment. Overall, the General Data Protection Regulation will push companies towards a higher degree of responsibility. Crucially, it will apply directly and uniformly at Union level in all Member States. Austrian companies will therefore be measured against a Union-wide benchmark, one shaped by globally active corporate groups and their well-established data privacy standards. This is no less challenging when considering the significant increase in fines under the General Data Protection Regulation, foreseen at up to EUR 20 million (or 4 % of worldwide annual turnover, whichever is higher) in the most severe cases of data protection infringement.

Rights to work products of “self-learning” systems

Only natural and legal persons, not machines, can be bearers of rights and duties. Hence one always has to look at the person behind the system when attributing rights in creative endeavours or inventions. If a work (including software code) created by an autonomously learning system would in principle qualify for copyright or patent protection, under Austrian law it would initially have to be attributed to a natural person (ie a human being). Authors in the sense of the Austrian Copyright Act and inventors in the sense of the Austrian Patent Act must always be natural persons. Of course, such authors or inventors can grant third parties, including legal persons, rights to the protected work results.

But what is the author’s or inventor’s position when a self-learning system autonomously produces a work? One could take the position that a work created by a self-learning system is merely a consequence of the creative or inventive efforts of the person who created the logic behind the system, and that this person is therefore also to be credited with the end result. On the other hand, it can be argued that rights must be attributed to the person who provided the impetus for creating the concrete work result, eg by entering certain data. Perhaps both persons are co-authors or joint inventors. Or perhaps nobody can claim rights to such work results, if the human contribution to the end result was so small that a “creative” or “inventive” effort can hardly be discerned because the system developed the work result almost fully autonomously.

Whom to blame for damages?

Questions of liability are complex when it comes to autonomous systems or systems with artificial intelligence. They have been discussed for quite some time now in the context of self-driving cars, and not only in terms of accidents. A central principle of the right to compensation, namely the fault of the party causing the damage, is already questionable in the case of a driver whose self-driving car has caused an accident. Here the issue might be whether the driver could have intervened to prevent the accident. The fault of the manufacturer (for example in the person of the software programmer) will generally be technically difficult and expensive to prove.

However, since the injured party usually has no agreement with the manufacturer, direct contractual liability will not apply. This leaves liability under product liability acts (it is questionable whether these also cover software), liability based on a contract with protective effects in favour of third parties, or tort liability. What is clear, however, is that only persons can be liable, not machines.

For damages caused by self-driving cars, in contrast to other systems with artificial intelligence, the so-called “car owner’s liability” (Halterhaftung) might be a valid basis for a claim. This special liability is based on the mere fact that operating a car poses a risk to the general public and is not based on the principle of culpability. How this will ultimately affect the manufacturers, especially concerning claims for recourse by the car owner’s insurance company, cannot yet be seriously assessed.

Technological developments not only lead to connected data flows; they also require new concepts of connected legal thinking. This is an interesting and demanding challenge for any legal professional.

authors: Michael Woller, Günther Leissler, Wolfgang Tichy
