Impressions from the eponymous workshop organized by the German Foreign Office and the Alexander von Humboldt Institute for Internet and Society
At the invitation of the German Foreign Office and the Research Area Global Constitutionalism of the Alexander von Humboldt Institute for Internet and Society, almost 50 experts in international law, cyber security and Internet regulation and governance came together last Monday. The participants included researchers and government officials as well as staff of foundations, associations, law firms and companies. The sessions were kicked off by keynote speeches on various aspects of the ‘Public International Law of the Internet‘, including cyber security, human rights implications and alternative regulatory approaches beyond classical international law. Held under the Chatham House Rule, the workshop aimed to discuss the concept of an International Internet Law, with a special focus on identifying the important issues as well as the applicable rules and standards or, respectively, the regulatory gaps.
The term ‘Public International Law of the Internet‘ stems from the 2013 coalition agreement of CDU/CSU and SPD. On page 149 we find the following ‘governmental mandate’, which also prompted this workshop:
“In order to preserve the fundamental rights and freedoms of the citizens in the digital world and to promote the opportunities for democratic participation of the population in the global communications network, we are committed to a law of the Internet in order that the fundamental rights also apply in the digital world. The right to privacy, which is guaranteed in the International Covenant on Civil and Political Rights, is to be adapted to the needs of the digital age.”
The concept of an International Internet Law – the Internet as a cross-cutting legal task
At the beginning of the workshop, doubts were raised as to whether the term ‘Public International Law of the Internet‘ is best suited to describe the problems that the Internet poses to the law. The Internet affects all aspects of life and therefore concerns not only public international law in the narrow sense, but also, for example, civil law. Entirely new legal approaches to problems in the field of the Internet may emerge as a result – one example is the attempt of (European) competition authorities to tackle the power of large global Internet companies like Amazon or Google with the means of competition law. The participants agreed that the concept of an International Internet Law was much better suited to capture the full extent of this cross-cutting issue for legislation and policy-making, which also touches upon non-conventional normative levels (e.g. public international soft law or multi-stakeholder regulation).
The applicability of classical international law principles in cyberspace
Furthermore, the participants discussed whether certain classical principles of public international law could be transferred to cyberspace. The term cyber(space) describes the virtual world created by computers and computer networks; during the workshop it was also often used synonymously with the Internet as a whole. The discussion revolved around a possible analogy with the international law of the sea as well as the territoriality principle and the principle of sovereignty under international law.
It was argued that cyberspace, just like the high seas, could be treated as res communis omnium. This would mean that no state has an absolute right of use and exploitation; rather, a principle of mutually compatible use would apply. This would inter alia prevent an absolute blockage or disruption of the Internet by one state, even if such a disruption did not exceed the thresholds of the prohibition of the use of force and the principle of non-intervention.
On the other hand, it was argued that cyberspace and the high seas are not comparable. The high seas were characterized as a uniform good given by nature, which changes only very slowly; there was also said to be an international consensus that this space must be maintained cooperatively. Cyberspace, it was argued, is just the opposite: it is by no means a uniform space; it was created and is organized by humans, and its physical foundations were built, and are controlled, by states. In addition, technology – and with it cyberspace – changes so quickly that legal regulation has (so far) been unable to keep up. What is more, there is no fundamental consensus on how the Internet should be dealt with or maintained.
Those taking an intermediate position pointed to possible due diligence obligations of states regarding the Internet, referring to comparable duties of care in international environmental law. Thus, states could be held responsible in the event of an attack, for example if the server involved is located on their territory or if their infrastructure was used for the attack.
In the discussion of the applicability of the principle of territoriality, it was first pointed out that one needs to distinguish between the power to regulate on the one hand and a regulation’s binding force on the other. There was broad consensus amongst the participants that such classical rules (on jurisdiction) could, if at all, only be applied in a modified form – focusing on a server’s physical location, for example, would often lead to completely arbitrary results.
Alternative regulatory mechanisms
Moreover, alternative regulatory techniques were discussed. First, it was noted that there was no need to re-invent the wheel in this respect: in areas such as the environment, labor and social affairs there have been numerous experiments with multi-stakeholder regulation, which in general resulted in (only) soft law. The stakeholders were said to be, in general, states representing their citizens, civil society (usually NGOs, not representing citizens but taking a stand only on specific problem areas) and the relevant business community. However, it was pointed out that on other occasions, e.g. at the NETmundial, individual citizens also had the opportunity to comment.
Various security-related topics were addressed, and relevant questions were developed for each area. The discussion of military cyber-cooperation and cyber warfare was particularly detailed.
In the field of military cyber-cooperation, two questions were found to be of special relevance: first, under what conditions a cyber attack triggers the right to self-defense, and second, under what conditions a state can be held responsible for a cyber attack. The latter issue of presumption rules in particular led to a prolonged but inconclusive debate during the drafting of the ‘Tallinn Manual on the International Law Applicable to Cyber Warfare‘, a non-binding study by independent experts, commissioned by the NATO Cooperative Cyber Defence Centre of Excellence, on the applicability of existing international law to this new form of warfare.
In the area of cyber warfare, the distinction between protected civilian objects and military objects is a major problem, whose solution may require a modification of the basic rules. Because private and governmental structural components are amalgamated in cyberspace, it can reasonably be argued on the basis of existing law that all structural components of cyberspace constitute attackable military targets. The non-binding (!) Tallinn Manual tries to address this problem by clarifying that the Internet as a whole must not be regarded as an attackable military object.
In addition, structural cyber security was considered highly relevant. The issue at hand was whether the rules on the prohibition of intervention can, for example, be interpreted or reformulated in a cyber-specific way. In this sense it was proposed to include an element of manipulation, e.g. the overcoming of particularly high security barriers, in the notion of force. It was also suggested to treat the place where the harm arose (‘Erfolgsort’) rather than the place where the event giving rise to the harm occurred (‘Handlungsort’) as the relevant connecting factor.
Human rights in cyberspace
The discussion also focused on the human rights challenges of a ‘Public International Law of the Internet‘. Dr. Helmut Philipp Aust, MLE, gave a keynote speech on this aspect. He outlined various development phases of the Internet, each characterized by differing hopes and expectations for the protection of human rights in Internet law. In particular, he drew attention to the still unsolved problem of Big Data. This keyword, it was stated, describes the processing of large data sets “only” to detect probabilities and correlations between data – dismissing the category of causality, which is (still) one of the prevailing connecting factors in law. Accordingly, the data processor no longer needs to know in advance what the data will be used for. There was broad agreement that traditional data protection law is unable to cope with this new phenomenon. It would therefore have to be clarified whether Big Data is to be held inadmissible per se or whether a new concept (and, if so, which one) has to be found.
In this context, the participants discussed the observation that the traditional separation between private and governmental data collection and data transmission has ceased to exist. It was therefore questioned whether the state-centeredness of legal thinking is still appropriate or whether, given that most data processing is in private hands, its focal point should shift towards those private players.
For the rest, I would like to refer you to the contribution by Dr. Helmut Philipp Aust, which is based on his keynote speech at the workshop and has also been published in this blog.
The discussion has shown that many questions of public international law are still completely unresolved. Particularly on the technically-oriented, (completely) globalized Internet, whose founding concept is or was the open and direct flow of data, the factual circumstances are developing in seven-league boots – while the law is lagging behind. One of the crucial questions in this area in the coming years will be who has the authority to set and enforce the rules for the Internet: whoever does will be able to shape and define one of the central resources of the future. It should be noted that countries such as Germany will hardly be able to muster a sufficient ‘critical mass‘ on the Internet to reach regulatory results single-handedly. Mid-sized states like Germany will be well advised to combine their leverage with other (EU member) states and/or, for example, the United States.
Picture “… Human Rights” by Jeremy Schultz