Tag: GDPR

Continuing my GDPR journey

I’ve already written a couple of blog posts to reflect on my learning during the first two weeks of a Futurelearn course I’m taking on the General Data Protection Regulation (GDPR):

It’s a surprisingly interesting subject, so much so that I’m in danger of, for the first time ever, actually completing an online course that I’m taking voluntarily!

Although it’s my choice, I’m pursuing knowledge in this area because I’m leading Project MoodleNet. However, I’m writing here instead of on the project blog as I’m still coming to grips with all that GDPR means in practice.


Week 3 of the course is all about data controllers and data processors. The quotations I use throughout this post are taken from the course, which I highly recommend (you can sign up for free!).

In brief, data controllers are those who determine the purposes and means of processing personal data. When two or more controllers do so jointly, they are joint controllers. Processors, on the other hand, are those engaged in processing personal data on behalf of controllers. They will follow instructions given by controllers and cannot make decisions on the choice of purposes and means in data processing.

Here’s a more homely metaphor:

To make this more clear: if you visualise a ship and imagine that it is processing data, the controller is the captain and the processors are the sailors. A controller manages and controls the processing of the data (the ship), he determines the purpose (the destination), and the means (or the course of the voyage). A processor is contracted by the controller to carry out data processing for the purpose and with the means determined by the controller. Processors (sailors) act under captain’s instruction and report issues to the controller.

So from a Project MoodleNet perspective, Moodle (the company) is the captain, the data controller, determining both the purpose and the means. The processor is the Project MoodleNet team, which is processing the data. To the end user, the data controller and data processor are effectively one and the same.

Data Controllers

The interesting thing about the GDPR is that you can’t just respect users’ privacy and security; you have to be able to prove that you’re doing so:

First of all, to demonstrate legal compliance is in itself a GDPR obligation. Being able to demonstrate that your organisation is taking compliance measures, both technical and organisational, may save you from potential hazards, such as heavy fines or sanctions. Controllers have to implement appropriate technical as well as organisational measures to make sure that processing of data complies with the GDPR. They have to implement these measures to ensure data protection by design and by default.

One method of doing so is ‘privacy by design’, something covered in a previous week, which allows you to demonstrate that user-respecting privacy safeguards are built into your products and services.

However, things can and do go wrong. GDPR therefore mandates what must happen in the event of a data breach:

In the event of a data breach, controllers have the obligation to notify the supervisory authority of that breach.

The supervisory authority in the UK is, I believe, the Information Commissioner’s Office. Moodle is an Australian company that is setting up an office in Barcelona. Until that’s set up, Moodle is processing EU residents’ data without a legal presence in the EU. I wasn’t sure what that meant in terms of supervisory authority, so I looked it up. Basically, it means that instead of a ‘one-stop shop’ approach, in the event of a data breach, Moodle would have to inform the supervisory authority in each member state individually.

The data controller has a responsibility to help users exercise their GDPR rights:

Finally, a very important obligation for a data controller is the duty to assist data subjects with exercising their rights to privacy and data protection under the GDPR. For example, a controller has the duty to provide data subject with sufficient information when collecting personal data.

Handily, the Futurelearn course (which is put together by the University of Groningen) has a list of the obligations for data controllers:

Controllers’ obligations may include:

• To maintain records of all processing activities (Article 30 GDPR);

• To cooperate and consult with supervisory authorities (Article 31 GDPR);

• To ensure a level of security (Article 32 GDPR);

• To notify the supervisory authorities in the event of a data breach (Article 33 GDPR);

• To conduct a data protection impact assessment (Article 35 GDPR);

• To appoint a data protection officer (Article 37 GDPR);

• Specific obligations as regards transfer of data outside the EU (Chapter V GDPR);

• To assist data subjects with exercising their rights to privacy and data protection (Chapter III GDPR).

In other words, there are a lot of companies that are going to have to get a whole lot more transparent about user data very quickly. I feel that we’re in a pretty good position with Project MoodleNet, as we can design all of this in from the outset.

Data protection by default

Just as the GDPR advocates privacy by design, it also specifies ‘data protection by default’:

Data protection by default means that, by default, technical and organisational measures need to be taken to ensure that only personal data which are necessary for a specific purpose are processed. This obligation covers the amount of data collected, extent of processing, storage period and accessibility. This means that, by default, the less personal data that are processed, the better. This obligation includes that, by default, personal data are not accessible without the data subject’s intervention.

So, for example, I use an app called FullContact to manage my contacts across various accounts and to automatically update their details. It’s great, and I’m a paying subscriber to their service. When I install it on my Android smartphone, I get a screen which prompts me to give the app access to my contacts:

Full Contact

Given the job I’ve asked the app to do, giving it access to my contacts seems reasonable. I’ve seen other apps, however, request access to my microphone, location, and other ways of gaining potentially sensitive information about me, without any obvious reason why they would need to do so. GDPR compliance prevents this.

One thing we’ve been discussing with Project MoodleNet is pseudonymisation. Sometimes on a social network, for a whole variety of reasons, you may want to avoid posting with your ‘regular’ account. In this case, token-based pseudonymisation can help:

An example of an effective measure as mentioned in Article 25 is pseudonymisation. Pseudonymisation substitutes the identity of the data subject in such a way that additional information is required to re-identify a data subject. Such measures may also include anonymisation, which irreversibly destroys any way of identifying the data subject.

So, for example, you might be able to generate a limited number of pseudonymous accounts, linked to your login details, each month. This would mask your identity when it matters but, if you decided to do something illegal, or to troll other members of the network, it would still be possible to figure out who you are.
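To make that idea concrete, here’s a minimal sketch of how token-based pseudonymisation could work, assuming a server-side secret held by the data controller. The names and the keyed-hash approach are my own illustration, not anything specified for Project MoodleNet:

```python
import hmac
import hashlib

# Hypothetical server-side secret; in practice it would live in a secrets manager.
# Only the data controller holds it, so only they can re-link a pseudonym to a user.
SERVER_SECRET = b"replace-with-a-real-secret"

def make_pseudonym(user_id: str, period: str) -> str:
    """Derive a stable pseudonym for a user for a given period, e.g. '2018-05'.

    Without SERVER_SECRET the pseudonym can't be traced back to the account,
    but the operator can recompute it to identify someone who trolls or
    breaks the law: the re-identification property described above.
    """
    digest = hmac.new(SERVER_SECRET, f"{user_id}:{period}".encode(), hashlib.sha256)
    return "anon-" + digest.hexdigest()[:12]

print(make_pseudonym("doug@example.com", "2018-05"))
```

Anonymisation, by contrast, would mean throwing the secret away (or never deriving the pseudonym from the identity at all), so that re-identification becomes impossible.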

All of this is fascinating as, instead of organisations making it all up as they go along, they have to figure a lot of things out in advance in order to satisfy their legal requirements and inform the user:

When collecting personal data directly from data subjects, the controller has to provide the following information to data subjects at the moment of obtaining the data:

  • The controller’s identity and contact details;
  • The contact details of the data protection officer (if applicable);
  • The purposes and legal basis for data processing;
  • The recipients of the personal data;
  • The fact that the controller intends to transfer personal data outside the EU (if applicable).

Furthermore, to ensure fair and transparent processing, the controller needs to provide the following information:

  • The reason why the data subject needs to provide personal data (this could be a statutory or contractual requirement or a requirement to enter into a contract), if the data subject is obliged to do so and what the consequences are for not providing the data;
  • Data storage period;
  • The rights of data subjects (right to access, rectification, erasure, restriction of processing, objection to processing, data portability, the right to withdraw consent; the right to lodge a complaint with a supervisory authority);
  • The existence of automated decision making (including profiling);
  • Any other purposes (if the controller intends to further process the personal data for a purpose other than that for which the data was originally collected).

Over and above this, organisations have to be a lot more secure in their data storage and processing procedures.

Under Article 32, controllers have the obligation to take technical and organisational measures to achieve a level of security appropriate to potential risk. When taking these measures, they need to consider the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons. Examples of such measures include:

  • Pseudonymisation and encryption;
  • Ensuring the ongoing confidentiality, integrity, availability and resilience of processing system and services;
  • The ability to restore the availability and access to personal data in a timely manner in case of physical or technical incident;
  • A process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures to ensure the security of the processing.
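As a small illustration of the ‘pseudonymisation and encryption’ item in that list, here’s a sketch of encrypting a record of personal data at rest using the Python cryptography library. Key management is deliberately glossed over, and this is an example of the general technique rather than anything Project MoodleNet has committed to:

```python
from cryptography.fernet import Fernet

# In practice the key would be held in a key-management service, not generated inline.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b"name=Doug Belshaw;email=doug@example.com"
encrypted = fernet.encrypt(record)   # this is what gets written to the database or disk

# If the stored ciphertext leaks without the key, no personal data is readable,
# which matters for the breach-notification rules discussed in the next section.
assert fernet.decrypt(encrypted) == record
```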

Data breaches

Returning to what happens when and if things go wrong, and user data is compromised, the GDPR makes very specific provisions:

When a data breach occurs, a controller has the obligation under Article 33 to notify the competent supervisory authority within 72 hours after becoming aware of the data breach, unless the breach is unlikely to result in a risk to the rights and freedoms of natural persons. If the supervisory authority is not notified within 72 hours, the controller needs to provide reasons for the delay.

Note the ‘unless the breach is unlikely to result in a risk to the rights and freedoms of natural persons’. In other words, if there’s a data breach but the data is encrypted (as in the case of the LastPass hack) then, as far as I’m aware, while the organisation may choose to notify the supervisory authority, they are not required to do so. Obviously, if personally identifiable information was accessed, then the organisation would need to notify the relevant supervisory authority within 72 hours.

If there’s an elevated risk, then the notification should be immediate. The ‘data subject’ (i.e. user) also needs to be informed, in ways that they can understand:

Furthermore, the controller has the obligation to communicate without undue delay the personal data breach to the data subject under Article 34 if the breach is likely to result in a high risk to the rights and freedoms of natural persons. The communication to the data subject needs to be described in clear, plain and understandable language.

Data Protection Impact Assessment (DPIA)

Interestingly, the GDPR makes provision for new kinds of technologies that may put ‘data subjects’ (i.e. users) at risk. Organisations using new technologies to obtain personally identifiable information are required to carry out a Data Protection Impact Assessment (DPIA):

If there is a chance that a new type of processing (especially when using new technologies) may cause a high risk to the rights and freedoms of natural persons, the data controller needs to carry out a DPIA.

The example in the course is something like using ultrasound to ‘fingerprint’ people. This won’t be a concern for Project MoodleNet, as we’re using pre-existing technologies.

Data Protection Officer (DPO)

Apparently, in earlier drafts of the GDPR, the appointment of a Data Protection Officer (DPO) was mandatory for all organisations that had over 250 employees. However, as I’m sure someone pointed out, when Instagram was purchased by Facebook, it had 27 million users on iOS alone… and only 13 employees.

The final version of GDPR makes no mention of the number of employees an organisation must have before having a DPO is mandatory. Instead, it focuses on the type and scope of the data being processed.

Appointing a DPO is mandatory under certain conditions. Based on Article 37 a controller and processor need to designate a DPO if:

  • The processing is carried out by a public authority or body (with the exception of courts acting in their judicial capacity);
  • The core activities consist of processing operations that require regular and systematic monitoring of data subjects on a large scale;
  • The core activities consist of processing on a large scale of special categories of data (Article 9) or personal data relating to criminal convictions and offences (Article 10).

Data Processors

As we have already seen, data controllers and data processors are different. Data controllers, using the nautical metaphor introduced earlier, are like the ship’s captain, whereas the data processors are like the crew.

Processors process data on behalf of controllers and under controller’s instructions. Processing has to be governed by a contract or other legal act under EU or national law that is binding on the processor. This contract or legal act, among other things, determines certain obligations for processors and how they assist data controllers in fulfilling their GDPR obligations. Some of these obligations are similar to the obligations of data controllers.

Not only are some of the obligations the same, but as with the case of Moodle and Project MoodleNet, the data controller and data processor are one and the same.

Again, data processors have to be able to demonstrate that they are acting within the terms of GDPR:

The most important obligation for both controllers and processors is to demonstrate legal compliance. Concrete technical and organisational measures (such as documentation, records, Data Protection by Design and by Default, etc.) may provide good evidence to demonstrate compliance with the GDPR.

Applying my learning to Project MoodleNet

Finally, the third week of this course asks a few questions:

  1. How will you demonstrate compliance? Do you keep records? Do you have a privacy policy? Does your personnel have clear privacy instructions? Do you have clear agreements between controllers and processors?
  2. Do you need to carry out a DPIA?
  3. Do you need to appoint a DPO or a representative?

The second and third questions are the easiest to answer. As Project MoodleNet does not involve new technologies that access personally identifiable information, we won’t need to carry out a DPIA. In terms of the DPO, Moodle is currently interviewing for a DPO to be based in the new Barcelona office.

Returning to the first question, Moodle has blogged about the organisation’s approach to GDPR in terms of its open source learning platform. With Project MoodleNet, however, the answer to the sub-questions around record-keeping, privacy policies, etc. is “we will have”. As I mentioned earlier, one of the benefits of developing this project as GDPR comes into force is that we can build it from the ground up with these in place!


Image by jesse orrico available under a CC0 license

More on the mechanics of GDPR

Note: I’m writing this post on my personal blog as I’m still learning about GDPR. This is me thinking out loud, rather than making official Moodle pronouncements.


‘Enjoyment’ and ‘compliance-focused courses’ are rarely uttered in the same breath. I have, however, enjoyed my second week of learning from Futurelearn’s course on Understanding the General Data Protection Regulation. This post summarises some of my learning and builds upon my previous post.

This week, the focus was on the rights of data subjects, and started with a discussion of the ‘modalities’ by which communication between the data controller (or processor) and the data subject takes place:

By modalities, we mean different mechanisms that are used to facilitate the exercise of data subjects’ rights under the GDPR, such as those relating to different forms of information provision (in writing, spoken, electronically) and other actions to be taken when data subjects invoke their rights.

Although the videos could be improved (I just use the transcripts) the mix of real-world examples, quizzes, and reflection is great and suits the way I learn best.

I discovered that the GDPR not only makes provision for what should be communicated by data controllers but how this should be done:

In the first place, measures must be taken by data controllers to provide any information or any communication relating to the processing to these individuals in a concise, transparent, intelligible and easily accessible form, using the language that is clear and plain. For instance, it should be done when personal data are collected from data subjects or when the latter exercise their rights, such as the right of access. This requirement of transparent information and communication is especially important when children are data subjects.

Moreover, unless the data subject is somehow attempting to abuse the GDPR’s provisions, the data controller must provide the requested information free of charge.

The number of times my surname is spelled incorrectly (often ‘Bellshaw’), or companies have other details wrong, is astounding. It’s good to know, therefore, that the GDPR focuses on the rectification of individuals’ personal data:

In addition, the GDPR contains another essential right that cannot be disregarded. This is the right to rectification. If controllers store personal data of individuals, the latter are further entitled to the right to rectify, without any undue delay, inaccurate information concerning them. Considering the purpose of the processing, any data subject has the right to have his or her personal data completed such as, for instance, by providing a supplementary statement.

So far, I’ve focused on me as a user of technologies — and, indeed, the course uses Google’s services as an example. However, as lead for Project MoodleNet, the reason I’m doing this course is as the representative of Moodle, an organisation that would be both data controller and processor.

There are specific things that must be built into any system that collects personal data:

At the time of the first communication with data subjects, the existence of the right to object – as addressed earlier – must be indicated to data subjects in a clear manner and separately from other information. This right can be exercised by data subjects when we deal with the use of information society services by automated means using technical specifications. Importantly, the right to object also exists when individuals’ personal data are processed for scientific or historical research or statistical purposes. This is, however, not the case if the processing is carried out for reasons of public interest.

Project MoodleNet will be a valuable service, but not from a scientific, historical, or statistical point of view. Nor will the data processing be carried out for reasons of public interest. As such, the ‘right to object’ should be set out clearly when users sign up for the service.

In addition, users need to be able to move their data out of the service and erase what was previously there:

The right to erasure is sometimes known as the right to be forgotten, though this denomination is not entirely correct. Data subjects have the right to obtain from data controllers the erasure of personal data concerning them without undue delay.

I’m not entirely clear what ‘undue delay’ means in practice, but when building systems we should build them with these things in mind. Being able to add, modify, and delete information is a key part of a social network. I wonder what happens when blockchain is involved, given that it’s immutable?

The thing that concerns most organisations when it comes to GDPR is Article 79, which states that data subjects have legal recourse if they’re not happy with the response they receive:

Furthermore, we should mention the right to an effective judicial remedy against a controller or processor laid down in Article 79. It allows data subjects to initiate proceedings against data controllers or processors before a court of the Member State of the establishment of controllers or processors or in the Member State where they have their habitual residence unless controllers or processors are public authorities of the Member States and exercise their public powers. Thus, data subjects can directly complain before a judicial institution against controllers and processors, such as Google or others.

I’m particularly interested in what effect data subjects having the right “not to be subjected to automated individual decision-making” will have. I can’t help but think that (as Google has already started to do through granular opt-in questions) organisations will find ways to make users feel like it’s in their best interests. They already do that with ‘personalised advertising’.

There’s a certain amount of automation that can be useful, the standard example being Amazon’s recommendations system. However, I think the GDPR focuses more on things like decisions about whether or not to give you insurance based on your social media profile:

There are three additional rights of data subjects laid down in the General Data Protection Regulation, and we will cover them here. These rights are – the right not to be subjected to automated individual decision-making, the right to be represented by organisations and others, and the right to compensation. Given that we live in a technologically advanced society, many decisions can be taken by the systems in an automatic manner. The GDPR grants to all of us a right not to be subjected to a decision that is based only on an automated processing, which includes profiling. This decision must significantly affect an individual, for example, by creating certain legal effects.

Thankfully, when it comes to challenging organisations on the provisions of the GDPR, data subjects can delegate their representation to a non-profit organisation. This is a sensible step, and prevents lawyers from becoming rich on GDPR challenges. Otherwise, I can imagine data sovereignty becoming the next personal injury industry.

If an individual feels that he or she can better give away his or her representation to somebody else, this individual has the right to contact a not-for-profit association– such as European Digital Rights – in order to be represented by it in filing complaints, exercising some of his or her rights, and receiving compensation. This might be useful if an action is to be taken against such a tech giant as Google or any other person or entity. Finally, persons who have suffered material or non-material damage as a result of an infringement of the GDPR have the right to receive compensation from the controller or processor in question.

Finally, and given that the GDPR applies not only across European countries but to any organisation that processes EU citizens’ data, the following is interesting:

The European Union and its Member States cannot simply impose restrictions addressed in Article 23 GDPR when they wish to. These restrictions must respect the essence of the fundamental rights and freedoms and be in line with the requirements of the EU Charter of Fundamental Rights and the European Convention for the Protection of Human Rights and Fundamental Freedoms. In addition, they are required to constitute necessary and proportionate measures in a democratic society, meaning that there must be a pressing social need to adopt these legal instruments and that they must be proportionate to the pursued legitimate aim. Also, they must be aiming to safeguard certain important interests. So, laws adopted by the EU or its Member States that seek to restrict the scope of data subjects’ rights are required to be necessary and proportionate and must protect various interests discussed below.

I learned a lot this week which will stand me in good stead as we design Project MoodleNet. I’m looking forward to putting all this into practice!


Image by Erol Ahmed available under a CC0 license

Social networking and GDPR

Note: I’m writing this post on my personal blog as I’m still learning about GDPR. This is me thinking out loud, rather than making official Moodle pronouncements.


I have to admit to EU directive fatigue when it comes to technology (remember the ‘cookie law’?), so when I heard about the General Data Protection Regulation (GDPR), I didn’t give it the attention it deserved.

The GDPR is actually pretty awesome, and exactly the kind of thing we need in this technologically-mediated world. It has wide-ranging impact, even beyond Europe. In fact, it’s likely to set the standard for the processing of user information, privacy, and security from May 2018 onwards.

So, on the advice of Gavin Henrick, I’m in the midst of Futurelearn’s course on Understanding the General Data Protection Regulation. The content is great but, unlike Mary Cooch‘s excellent videos for the Learn Moodle Basics 3.4 course (which I’m also doing at the moment) I don’t find the videos helpful. They don’t add anything, so it’s a more efficient use of my time to read the transcripts.

All of this is prologue to say that GDPR affects the work I’m leading at the moment with Project MoodleNet. It may be in its early stages, but privacy by design (PDF) means that we need to anticipate potential issues:

The Privacy by Design approach is characterized by proactive rather than reactive measures. It anticipates and prevents privacy invasive events before they happen. PbD does not wait for privacy risks to materialize, nor does it offer remedies for resolving privacy infractions once they have occurred − it aims to prevent them from occurring. In short, Privacy by Design comes before-the-fact, not after.

Project MoodleNet is a social network for educators focused on professional development and the sharing of open content. As such, it’s a prime example of where GDPR can protect and empower users.

Article 5(1) from the official document states:

Personal data shall be:

(a) processed lawfully, fairly and in a transparent manner in relation to the data subject (‘lawfulness, fairness and transparency’);

(b) collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes; further processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes shall, in accordance with Article 89(1), not be considered to be incompatible with the initial purposes (‘purpose limitation’);

(c) adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed (‘data minimisation’);

(d) accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay (‘accuracy’);

(e) kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed; personal data may be stored for longer periods insofar as the personal data will be processed solely for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes in accordance with Article 89(1) subject to implementation of the appropriate technical and organisational measures required by this Regulation in order to safeguard the rights and freedoms of the data subject (‘storage limitation’);

(f) processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures (‘integrity and confidentiality’).

We’re kicking off Project MoodleNet by looking at all of the different components we’ll be building, and really zeroing in on user control. A key part of that is the way(s) in which users can authenticate and are authorised to access different parts of the system. We’re exploring open source approaches such as gluu (which has been GDPR-ready since November) that make things easy for the user while protecting their privacy.

In addition, and as I’ve touched on while writing at the project blog, we’re going to need to ensure that users can, at the very least:

  • see what data is held on them
  • choose whether to revoke consent around storage and processing of that data
  • request a data export
  • ask for any of their personal data to be securely destroyed.

I actually think Google do a pretty good job of most of this with ‘Download your data’ in your account settings (formerly ‘Google Takeout’).
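To show roughly what the data export and erasure items from the list above might look like in practice, here’s a minimal sketch using Flask. The routes, the in-memory store, and the user IDs are all hypothetical, not part of any actual MoodleNet design:

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for a real datastore of personal data, keyed by user id.
USER_DATA = {"42": {"name": "Doug", "posts": ["Hello, MoodleNet!"]}}

@app.route("/users/<user_id>/export", methods=["GET"])
def export_data(user_id):
    """Data portability: return everything held about the user in machine-readable form."""
    return jsonify(USER_DATA.get(user_id, {}))

@app.route("/users/<user_id>", methods=["DELETE"])
def erase_data(user_id):
    """Right to erasure: remove the user's personal data 'without undue delay'."""
    USER_DATA.pop(user_id, None)
    return "", 204
```

The real questions, of course, are about what sits behind endpoints like these: making sure the export genuinely covers everything held on a person, and that erasure propagates to backups and caches.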

One challenge, I think, is going to be global search functionality. To make searching across people, resources, and news reasonably fast, there’s going to be some pre-caching involved. We need to explore to what extent that’s compatible with purpose limitation, data minimisation, and storage limitation. It may be that, as with the authentication/authorisation example above, this is already somewhat of a solved problem.

A related issue is that different functionality may be used to a greater or lesser extent by different users. Some features (e.g. crowdfunding) may not be used by some educators at all. As such, we need to ensure, perhaps through an approach that leans on microservices and APIs, the integrity and confidentiality of user data, while again adhering to the principle of data minimisation.

I’m delighted to be working on this project at such an exciting time for user control and privacy. Organisations that have been wilfully neglecting controls and safeguards around user data, or monetising it in unethical ways, are going to be in for a rough ride. Those, however, that have a commitment to openness and follow the principles of privacy by design are going to find that it’s a competitive advantage!


Image by Sanwal Deen available under a CC0 license
