
21 December 2023
Schoenherr publication
Austria | Poland

to the point: technology & digitalisation | December 2023

Welcome to the December edition of Schoenherr's to the point: technology & digitalisation newsletter!

We are excited to present a selection of legal developments in the area of technology & digitalisation in the wider CEE region.

Our December Double Whopper: GDPR & the AI Act

December has brought a "Double Whopper" in the area of technology & digitalisation. On 5 December 2023, the CJEU ruled on the liability concept of the GDPR in Deutsche Wohnen (C-807/21). The court clarified that a breach of the GDPR alone does not entitle the data protection authorities to impose fines on a company; there is no strict liability. Instead, the data protection authority must first prove intentional or negligent misconduct by representatives, managers or other persons acting on behalf of the breaching entity. As to the degree of such misconduct, the court clarified that the breaching entity is liable if it could not have been unaware of the infringing nature of its conduct, regardless of whether it was aware that it was infringing the GDPR. In establishing this set of rules, the CJEU drew on case law developed under EU competition law. The liability regime under the GDPR has thus moved closer to that under EU competition law.

The second technology & digitalisation milestone in December concerns the EU AI Act. This draft legislation made its breakthrough on 8 December 2023. After several marathon sessions, the trilogue came to a successful end, with the last open items concerning the legitimacy of biometric AI control systems having been resolved. The finalised AI Act can therefore be expected at the beginning of next year. It will define the legal framework for the use of AI systems within the EU. Although there will be a two-year transition period, lessons learned from the implementation of the GDPR show that this is less time than it seems, considering all the preparatory work that must be completed before the AI Act becomes effective, especially as AI technology is the most disruptive innovation of the past few decades. Establishing compliance under the AI Act therefore requires close collaboration between business units, legal counsel, IT departments, compliance officers and, of course, data protection officers. Compliance with the AI Act will be a duty for many companies. Thorough preparation will be crucial to avoiding liability and business disruptions and will be critical to the successful use of AI technology.

A Q&A document on the use and development of AI under the AI Act has been published by the European Commission.

On 15 December 2023, Austria's National Council of the Parliament (lower house) adopted the Flexible Company Act (Flexible-Kapitalgesellschafts-Gesetz, FlexKapGG), which introduces a new company form, the "Flexible Company" (FlexCo) as of 1 January 2024.

As its name suggests, the FlexCo is all about being flexible. It addresses longstanding criticisms directed at Austria's predominant corporate form, the limited liability company (GmbH). The GmbH has faced extensive criticism in the start-up world due to its perceived lack of flexibility and strict formal requirements. The FlexCo combines features from both the conventional GmbH and the stock corporation (AG), but also introduces some totally new concepts to Austrian corporate law, such as company value shares.

But what exactly does the FlexCo bring to the table? To read more, click here.

Well-designed employee incentive programmes are crucial for start-ups seeking to build a motivated and committed workforce. These programmes not only attract top talent but also help motivate and retain the key individuals critical to a start-up's growth. While there are various employee incentive programmes, a notable distinction often arises between real and virtual share programmes. In Austria, virtual share programmes have gained prominence, largely owing to their tax advantages over real share programmes. A new development in the Austrian corporate landscape is "company value shares" (CVS), which may only be issued by the new corporate form called the "Flexible Company" (FlexCo).

Real Shares: Real shares, also known as equity or stock, confer actual ownership in the company to employees. Employees with real shares typically enjoy voting rights in shareholder meetings, potentially influencing critical decisions. Real shares also come with administrative complexities, as notarial deeds are required for their issuance and transfer.

Virtual Shares: Virtual shares, or phantom shares, have become a popular choice in Austria, mostly thanks to their tax advantages compared to real share programmes and other employee incentive programmes. Virtual shares do not represent actual ownership in the company but rather a synthetic form of equity. Holders of virtual shares receive financial benefits (usually cash payments) from the company as if they held real shares. These programmes offer flexibility and simplicity compared to real shares. Employees holding virtual shares typically do not possess voting rights, as these shares are focused on providing economic benefits rather than ownership influence.

Company Value Shares in a "FlexCo": The Austrian start-up landscape is evolving and the introduction of the "FlexCo" has created an innovative share class designed specifically for employee participation – "company value shares" (CVS). CVS are real shares with restricted shareholder rights. They provide participation rights in shareholder meetings without granting voting rights. Exceptions arise in specific scenarios where shareholder resolutions impact the profit and liquidation rights of CVS holders or involve the conversion of CVS into regular shares. This new share class is particularly compelling in combination with favourable tax incentives introduced into Austrian law alongside the new FlexCo legislation.

The choice of incentive structure depends on various factors, including the company's specific goals, financial strategy, the desired level of employee engagement, and tax consequences. It is advisable to plan carefully and to consult with tax and legal advisors during this decision-making process.

A liquidity event refers to a transaction or series of transactions that give investors, founders and employees the opportunity to convert their equity interest in a company into cash. Shareholders' agreements typically define a liquidity event, and the details of these definitions often vary. The parties should pay close attention to them, since a liquidity event triggers the liquidation preference, i.e. the payment hierarchy for shareholders. Naturally, shareholders at the bottom of the liquidation preference waterfall (such as the founders) have an interest in a narrow definition of a liquidity event, while investors (particularly later-stage investors) have an interest in a very broad one.

Common types of liquidity events:

  1. Exit transaction

An exit transaction is an event where shareholders sell their shares in a company, leading to a change in ownership or control. This strategic move often occurs through an initial public offering (IPO). As an alternative to an IPO, an exit transaction is commonly structured through:

  • an exit share transfer, involving the sale, exchange, contribution or disposition of shares to a single entity or a group of commercially related individuals;
  • an exit asset sale, characterised by the sale of company assets followed by the distribution of proceeds to shareholders as dividends; or
  • an exit merger, encompassing any consolidation, merger, business combination or transformation resulting in the initial shareholders collectively holding 50 % or less of the shares, equity rights, voting power or other interests in the surviving entity.
  2. Company liquidation

This pertains to the winding-up of the company, either voluntarily or compulsorily as part of an insolvency proceeding, leading to the distribution of liquidation proceeds to shareholders in the form of dividend payments (if any proceeds are left after the satisfaction of creditors).
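To make the waterfall mechanics described above concrete, here is a deliberately simplified, hypothetical Python sketch. The share classes, amounts and the assumption of a 1x non-participating preference paid in order of seniority are all illustrative; actual shareholders' agreements contain far more nuanced terms.

```python
# Hypothetical, simplified liquidation preference waterfall:
# preferred investors receive their (1x, non-participating) preference
# first, in order of seniority; whatever remains is distributed pro rata
# to the common shareholders (e.g. the founders).

def waterfall(proceeds, preferred, common):
    """proceeds: cash available after satisfaction of creditors.
    preferred: list of (name, preference_amount), most senior first.
    common: list of (name, ownership_fraction) summing to 1.0."""
    payouts = {}
    remaining = proceeds
    for name, preference in preferred:
        paid = min(preference, remaining)  # senior claims absorb cash first
        payouts[name] = paid
        remaining -= paid
    for name, fraction in common:
        payouts[name] = payouts.get(name, 0) + remaining * fraction
    return payouts

# Illustrative EUR 10m exit with two preferred rounds ahead of the founders:
print(waterfall(10_000_000,
                [("Series B", 6_000_000), ("Series A", 2_000_000)],
                [("Founders", 1.0)]))
```

The example shows why definitions matter: if proceeds fall below the aggregate preference, the common shareholders at the bottom of the waterfall receive nothing.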

Prompt engineering is a technique for getting better results from large language models, such as GPT-4, by providing clear and specific instructions, examples and reference texts. Prompt engineering can help reduce the chances of producing undesired or inaccurate content, such as hallucinations, biases or offensive outputs. Prompt engineering can also help improve the model's reasoning and understanding of complex tasks.

OpenAI recently published a guide on how to use prompt engineering with the OpenAI API. The guide covers six strategies for getting better results: write clear instructions, provide reference text, split complex tasks into simpler subtasks, give the model time to "think", use inner monologue or a sequence of queries to hide the model's reasoning process, and use intent classification to identify the most relevant instructions for a user query. The guide also provides some examples of prompts that showcase what GPT models can do.

Prompt engineering is an important skill for anyone who wants to leverage the power of large language models for various applications. By following the best practices and tips from the guide, users can improve their prompt design and get more reliable and useful outputs from GPT models.
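As a concrete illustration of two of these strategies, writing clear instructions and providing reference text, here is a minimal Python sketch. The helper name and the exact wording are our own illustrative choices, not taken from the OpenAI guide, and the actual API call is deliberately omitted:

```python
# Minimal sketch of prompt construction (illustrative, not an official
# OpenAI example): combine an explicit instruction, delimited reference
# text and a fallback rule into a single prompt string.

def build_prompt(question, reference_text):
    """Assemble a prompt following two strategies from the guide:
    'write clear instructions' and 'provide reference text'."""
    return (
        "Answer the question using only the reference text below. "
        "If the answer is not in the text, reply 'I don't know'.\n\n"
        'Reference text:\n"""\n' + reference_text + '\n"""\n\n'
        "Question: " + question
    )

prompt = build_prompt(
    "When was the trilogue on the AI Act concluded?",
    "The trilogue on the EU AI Act came to a successful end on 8 December 2023.",
)
print(prompt)
```

Delimiting the reference text (here with triple quotes) and stating a fallback answer are small touches that measurably reduce hallucinated responses, because the model is told explicitly what to do when the source material is insufficient.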

By now, it should come as no surprise to anyone that ChatGPT uses vast amounts of datasets for training that also contain personal data. However, the fact that it can potentially be made to disclose information about these datasets and other people to users is new.

A group of scientists discovered that when ChatGPT is asked to repeat a word endlessly, it can end up quoting phrases from its training data.[1] In some cases this source data also contains personal information such as names, e-mail addresses and phone numbers. In theory, ChatGPT is designed not to disclose its training data, let alone large amounts of it. However, the scientists were able to extract "several megabytes of ChatGPT's training data". Machine-learning models generally memorise a certain percentage of the data used to train them. The more sensitive or unique this data is, the less desirable it is for parts of it to be made public directly. Some machine-learning models are deliberately designed to reproduce their training data exactly, but for such use cases a generative (language) model such as ChatGPT should not be employed.

How much training data such models actually remember cannot be determined. For this reason alone, the team of scientists is worried that it may be impossible to distinguish between "safe" models and those that merely appear to be safe. The scientists discovered this exploit back in July and reported it to OpenAI, the creator of ChatGPT, in August. When we tested the exploit in December, we were able to replicate the results in some cases and got ChatGPT to disclose some information from its training data (e.g. about the conflict in the Middle East). This shows that even the developers of these models still do not fully understand how they work. Surprises and potentially dangerous situations can still occur during their development.


In September 2020, the European Commission adopted the Digital Finance Strategy to support innovation in the European financial sector and build a single market for digital financial services. Part of this effort is the EU Digital Finance Platform, a collaborative space that connects innovative financial firms and national supervisors and that, among other features, includes the new Data Hub.[1]

Data Hub: In fall 2023, the Data Hub was added to the platform. This project, which will complement national innovation hubs and regulatory sandboxes, as well as private-sector initiatives, is certainly a novelty. For the first time, innovative firms will be able to access supervisory data for testing new applications or training artificial intelligence (AI) and machine learning (ML) models.

But given the EU's strict data privacy requirements, how can public sector data be shared with innovators? To ensure compliance with EU privacy requirements, the Data Hub will host synthetic data sets and thus rely on data synthetisation. But...

…what is data synthetisation? Synthetic data generation is a technique to create artificial ("new") data that closely resembles the original data, but without exposing sensitive or confidential information. It serves as a substitute for actual data, allowing firms to experiment, test use cases, develop algorithms and perform analyses while keeping data safe and private. Synthetic data generation ensures full anonymisation while preserving the characteristics of the original data. Because of this, synthetic data and original data should deliver very similar results, which makes synthetic data highly relevant for testing.
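The underlying idea can be sketched in a few lines of Python. This is a hypothetical toy example, not the Data Hub's actual method: it fits a normal distribution to a single sensitive numeric column and samples fresh, artificial values from it, whereas real synthetisation tools model the joint distribution across many columns.

```python
# Toy illustration of data synthetisation (hypothetical, single-column):
# fit a distribution to the original data, then sample artificial values
# that preserve its statistical profile without repeating any real record.
import random
import statistics

def synthesise(original, n, seed=0):
    """Sample n synthetic values from a normal distribution
    fitted to the original data's mean and standard deviation."""
    mu = statistics.mean(original)
    sigma = statistics.stdev(original)
    rng = random.Random(seed)  # fixed seed for reproducibility
    return [rng.gauss(mu, sigma) for _ in range(n)]

# Illustrative "confidential" figures that never leave the premises:
original = [120.0, 98.5, 143.2, 110.7, 131.9]
synthetic = synthesise(original, 1000)
# The synthetic sample resembles the original statistically, so analyses
# and model training on it should deliver very similar results.
```

Because only the fitted parameters (here, mean and standard deviation) inform the output, no actual record is ever exposed, which is precisely why the approach satisfies the privacy requirements described above.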

For the Data Hub, this means that real data will never leave the authorities' premises and no external user will access actual data. Thus, national supervisors can participate in the project while innovators will be able to access meaningful information. Hence we would expect synthetic data to gain increased traction within AI and ML, as it helps train algorithms that require vast amounts of training data, which can be expensive or come with usage restrictions.

Outlook: The Directorate-General for Financial Stability, Financial Services and Capital Markets Union (DG FISMA), the EU Commission's directorate responsible for this project, is engaging in an intense dialogue with European supervisors to bring as many as possible into this initiative. Following a successful synthetic data pilot with the Bank of Spain, the first data sets are expected to become available as early as the beginning of 2024.[2] While the exact types of data that will be available are not yet public, you can expect them to be relevant, as the industry was consulted earlier this year on potential use cases and the types of datasets it would like to access for testing.



In December 2023, the President of the Polish Office for Personal Data Protection approved the "Code of Conduct for the Health Sector" prepared by the Polish Federation of Hospitals. This document is the first code in Europe covering both public and private entities in the medical sector. According to the DPA, the code complies with the GDPR and provides adequate safeguards in terms of data protection. An important aspect was the development of monitoring solutions for public entities: this is the first such code for the medical sector that allows public hospitals to confirm that their data processing complies with the GDPR. Entities applying the code gain assurance that the solutions it describes have been approved by the DPA as correct, and can rely on supervision of personal data processing through the monitoring mechanisms described in the code. Additionally, when considering imposing a penalty on an entity, the DPA must take into account in each case whether the entity is correctly applying the approved code of conduct.
