Blog


May 19, 2020 | Yva Alexandrova Meadway

Data, privacy and new tech in the immigration sector

Understanding the needs and capacities to deal with data, privacy and the use of new technologies in the immigration sector

Please take a moment to answer our survey on the immigration sector needs and capacities to deal with privacy, personal data, digital and technological changes: https://www.smartsurvey.co.uk/s/O150PS/ 

Migration is a process that still takes place very much in the real world: people cross physical borders, travel thousands of miles, swim in ice-cold water, sit in crammed lorries and live in camps, often with little protection from the elements. Increasingly, however, technology is becoming an essential part of their journeys: people use mobile technologies and social media to stay in touch with their families and friends, establish contacts to plan their journeys and find support in communities on arrival.

Governments are also increasingly using technology to manage migration. Personal data is collected upon arrival and when applying for asylum, and fingerprints of asylum seekers are shared among members of the Eurodac system. Border control is increasingly done by means of biometric reading machines: fingerprint, facial and iris scans and electronic gates. Questions around the use of AI in labour migration accompany some of the debates around the future of work.

In the UK, the most notable use of data in the immigration space has been the agreement between the NHS and the Home Office in the context of the Hostile Environment policy. Introduced in 2011, it gives the Home Office sweeping powers and the mandate to access migrants' personal data, such as addresses collected by GPs, hospitals, schools and job centres, and use it to track down individuals for immigration detention or deportation.

The “immigration exemption” in the 2018 Data Protection Act is another such example. A legal challenge brought by Open Rights Group (ORG) and the3million put forward the argument that the exemption, used by the Home Office to deny people access to their personal data, is far too broad and imprecise. As part of the proceedings, the Home Office was pushed to reveal that it used the exemption in 60% of immigration-related requests for data.

New technologies developed in response to the current Covid-19 crisis, such as tracing apps and immunity passports, are seen as important tools in fighting the pandemic. However, they raise serious privacy concerns that affect vulnerable migrants as well as others.

ORG is engaging in an effort to support and empower organisations in the migrant and refugee sector to become better equipped to deal with data and privacy issues, understand new technologies and their impact on immigration policy, and better support their clients. Our work is funded by the Paul Hamlyn Foundation and will include trainings, webinars, policy briefs, methodologies, and thematic and technical explainers. It will also aim to develop an active network for information sharing and signposting, and to build capacity for advocating policy change and undertaking joint campaigns.

To better understand these needs and respond to them, ORG and Privacy International have developed a survey, and we are encouraging as many organisations as possible to take part. The survey can be found here: https://www.smartsurvey.co.uk/s/O150PS/.



May 13, 2020 | Mariano Delli Santi

NHSX tracking app Privacy Assessment: Key Concerns

NHSX finally released the Data Protection Impact Assessment (DPIA) for their contact tracing app at the weekend. DPIAs are risk analyses, meant to identify and address potential privacy and security issues prior to the deployment of a high-risk application like the NHSX app. They also play a fundamental role in fostering public trust, as they ultimately allow us to know whether we can trust the software we would be installing on our phones.

Currently, the documents are unavailable, having been withdrawn after some material was accidentally published, as highlighted by Matt Burgess at Wired.

We have already highlighted how the Government has lacked due diligence so far (link). Although the public release of this DPIA is a positive development, we find that significant shortcomings persist, as outlined below.

Risk assessment is unsatisfactory

Assessing privacy risks is the most important aspect of a DPIA, as it should allow adequate safeguards to be put into place. Unfortunately, what we have read so far leaves cause for concern: some threats have been merely listed and left without appreciable safeguards in place, such as:

  1. re-identification of app users;

  2. re-use of contact tracing data for other purposes;

  3. potential impact of measures taken against individuals as a result of the data which has been collected.

Other risks have been downplayed. For instance, “malicious or hypochondriac” use of the self-diagnosis tool is admittedly at risk of generating false alerts, an occurrence which NHSX claims to prevent by forbidding it in the terms and conditions of the app.

Finally, NHSX decided not to submit this DPIA to the Information Commissioner’s Office for prior consultation, which they are obliged to do if a DPIA reveals significant residual risks. It is clear to us that this is the case, and NHSX could definitely use some help from the ICO.

NHSX claims that the app does not profile you, nor interfere with your liberties

According to the DPIA, contact tracing will not result in any deprivation of your rights and freedom, nor involve profiling or evaluation of your personal aspects. However, this app is designed to:

  • evaluate your likelihood of being infected, by means of a risk-scoring algorithm that profiles your daily encounters;

  • influence your behaviour, e.g. by pushing you to self-isolate; and

  • inform the strategic management of Public Health responses, e.g. by allowing the Government to identify geographic areas to put into quarantine.
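To make concrete why the first of these bullet points amounts to profiling, here is a purely hypothetical sketch of a risk-scoring step. The function, weights and numbers are invented for illustration; they are not NHSX's actual algorithm, which has not been published.

```python
# Hypothetical risk scorer: evaluates each logged encounter and
# aggregates a per-user score. Longer, closer contacts score higher.
def risk_score(encounters):
    """encounters: list of (duration_minutes, distance_metres) pairs."""
    score = 0.0
    for duration, distance in encounters:
        # Weight contact time by inverse distance, floored to avoid
        # division by very small values.
        score += duration / max(distance, 0.5)
    return score

# Two encounters: 15 min at 1 m, 5 min at 2 m.
print(risk_score([(15, 1.0), (5, 2.0)]))  # 17.5
```

Whatever the real formula looks like, any such per-user evaluation of daily encounters is an "evaluation of personal aspects" in the GDPR's sense of profiling.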

Thus, this is yet another example where the NHSX is improperly downplaying the areas of risk associated with the use of their app.

Your rights may be limited by design choice

The contact tracing app will transmit your personal information to a central database managed by NHSX. The GDPR gives you the rights of access, erasure and objection regarding this data, and obliges the NHS to justify any limitation of those rights.

However, the app currently lacks this functionality, and the DPIA states that enabling these to be exercised will depend on technical practicality.

This claim is unfounded: the app could allow you to exercise your rights by design, the same way it allows you to link your medical condition to your data stored on the centralised server.

NHSX continues by stating that, if the decision to limit your rights were taken, it would be based upon an “overriding legitimate interest” and the “fact” that your data is not identifiable. These reasons are groundless: an overriding legitimate interest must be assessed on a case-by-case basis, and cannot be relied upon unconditionally. At the same time, data about your encounters can be shown to you without the need to directly identify yourself, the same way these can be linked to your self-diagnoses. If data can sometimes be presented and linked to you, then it is unclear why there is a technical issue in providing access and deletion rights.

DPIA improperly describes anonymisation, and leaves space for ambiguity

Data is anonymous when it cannot be used to identify a given person. The NHSX app, on the other hand, works with pseudonyms, i.e. identifiers which are unique to individuals, in order to recognise the activity logs generated by a given person.
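The distinction can be shown in a minimal sketch (illustrative only, not the app's real code): a pseudonym contains no name, but it still links every record about one person together, which truly anonymous data would not allow.

```python
import secrets

# A persistent per-install pseudonym: it does not contain your name,
# but it tags everything the app records about you.
install_id = secrets.token_hex(16)

activity_log = [
    {"user": install_id, "event": "proximity", "rssi": -60},
    {"user": install_id, "event": "self_diagnosis", "symptoms": True},
]

# All records share one identifier, so they can be linked into a
# single profile of that installation's activity.
linkable = len({record["user"] for record in activity_log}) == 1
print(linkable)  # True
```

This is why the GDPR treats pseudonymised data as personal data: linkability is preserved, and re-identification remains possible if the pseudonym can ever be tied back to a person.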

It follows that claims such as “The App is designed to preserve the anonymity of those who use it” are inherently misleading, yet the term has been heavily relied upon by the authors of the DPIA. On top of that, many statements leave ambiguities, such as this:

“Data may be fully anonymised for public health planning, research or statistical purposes, if those purposes can be achieved without the use of personal data. Otherwise, data may be linked […] with such additional protection that is required as an output of data protection impact assessment.”

We cannot decipher the meaning of this phrase: will our data be used for research purposes only if, and insofar as, non-personal data can be used? Or will our data be used regardless, and anonymised when possible? Furthermore, the law imposes strict safeguards as a precondition for using data for research purposes, the details (or existence) of which are not covered in the DPIA.

Many questions are left unanswered

There are many other areas of significant concern for people’s privacy and security which are left untouched.

NHSX hit the news for its decision to rely on a centralised approach, contrary to most other European countries (link) and to the design choices of Apple and Google (link). A centralised solution is not a bad option in itself, but it is much more intrusive and must therefore be justified: the Parliament Joint Committee on Human Rights has stated that “Without clear efficacy and benefits of the app, the level of data being collected will not be justifiable” (link). Therefore, NHSX should clearly explain the necessity upon which the decision to rely on a centralised model was taken, as well as the metrics and KPIs by which they plan to measure its added value against the decentralised model.

Additionally, the DPIA mentions only NHS England and NHS Improvement as the entities with whom your data may be shared. Although this would be good news, we also notice that the DPIA points out that data will be shared for research and statistical purposes, but leaves us without a clue as to which entities may receive the data for such purposes.

Finally, will any of the data generated and collected by the NHSX be available to law enforcement and security agencies, such as the Home Office or GCHQ? Indeed, the UK government has recently asked to be granted more investigatory powers (link), and is relying on a surveillance company to analyse health data (link). The NHS will also have broad discretion under GDPR to release data to the police when a crime is being investigated; sometimes they may feel this is justified, for instance in the case of a missing child or terrorism. If the data might be re-identified and released in certain circumstances, this should be spelled out.

Thus, we believe this should be a matter of concern, and a clear answer concerning any possible involvement of such agencies should be put on record.



May 07, 2020 | Yva Alexandrova Meadway

Hostile environment may stop migrants from using NHSX tracker app

This week the Government confirmed that the NHSX app being developed as a response to the Covid-19 pandemic is using a centralised contact matching system. This means the app is more invasive in data collection than the alternative decentralised models such as the Apple/Google API and will store all data in a centralised system, rather than on each individual phone. ORG and a growing number of digital tech and digital rights experts are raising questions about both the invasive data gathering of the centralised approach, as well as the technical workability of the NHSX app (blog).

We have also already written about some of the key concerns that vulnerable people, including migrants, may face such as the lack of access to mobile phones, social exclusion and marginalisation. There is also the major issue of the lack of trust and concerns about data sharing between the NHS and the Home Office.

These concerns stem specifically from the current hostile environment policy of the Home Office, introduced by Theresa May in 2011 and upheld with minor linguistic changes until today. It gave the Home Office sweeping powers and the mandate to access personal data, such as addresses collected by GPs, hospitals, schools and job centres, and use it to track down individuals for immigration detention or deportation. Over the last decade this policy has caused immeasurable injustice and suffering, including Britain’s biggest immigration scandal: Windrush. It has also served to erode trust between ethnic minorities, migrant communities and the Home Office, and has made people fearful or reluctant to seek medical help or contact the police if they have been the victim of a crime, for fear that they will be deported or detained.

Directly, these concerns affect a relatively small number of people whose immigration status is irregular. But they can also affect a much wider set of people who have every legal right to be here but simply do not trust the Home Office and will be reluctant to entrust it with their private data.

This mistrust has become very dangerous during the Covid-19 pandemic, when public trust and mass participation in the NHSX app are fundamental to its success.

For these users, a decentralised model would be likely to address concerns more effectively by reducing data sharing risks. This approach has been found to be more likely to comply with both human rights and data protection laws.

However, if the government persists with the centralised approach, there need to be strong legal safeguards ensuring that the Home Office will not be able to access the data collected through the NHSX app.

Such safeguards can take the form of legal protection, such as the ones proposed in the Coronavirus Safeguards Bill, or a technical “firewall” around the data collected. These safeguards also need to extend beyond the lifetime of the app, to ensure that the data will not be passed on to government or non-governmental actors once the pandemic is over.

If such safeguards are not built into the NHSX app, or any future technology deployed in the context of containing the Covid-19 pandemic, it is unlikely that this technology will be used by large sections of the migrant population who, after a decade of hostile treatment, simply do not trust the Home Office with their privacy and data.

Equally important, if such safeguards are put in place, they need to be accompanied by clear and targeted communication from the government and the NHS to create much-needed trust and confidence in the NHSX app.

For these reasons and others, we believe very strongly that the Joint Committee on Human Rights is pushing in the right direction when asking for a Safeguards Bill. We hope that organisations working with migrants will support this call as well.



April 24, 2020 | Jim Killock

Time for a Coronavirus Safeguards Bill

While we pause our weekly online discussions, we'll keep you informed here on the fast moving debate over digital privacy and Covid-19.

Contact tracing mobile apps

The new Google/Apple API that aids a privacy-conscious, decentralised model for tracking infections has been adopted by several European countries, such as Switzerland and Austria. Others, most notably France and Germany, are sticking to a 'centralised' method of contact matching. The UK's app is also centralised, so as matters stand it will prove impossible to use the Google / Apple API.

This means that centralised apps which don't use the new API will force users to keep their screen unlocked, so that Bluetooth keeps working. This is a security nightmare, and a drain on battery. It would make the apps close to unusable.

Thus we have to suspect that the governments in France, Germany and the UK are hoping that Apple and Google simply back down and let them access their new API. The governments may have clinical reasons to stick to a centralised approach, but without an open conversation about why exactly governments want a particular solution, it is impossible to judge whether their needs are real, or capable of being met another way.

Car crash argument ahead

An open discussion about the advantages and disadvantages of the Apple-Google approach and government needs is critical. Without that, there is a distinct likelihood of an acrimonious argument, where governments accuse ‘big tech’ of getting in the way of helping deal with the pandemic. That is good for nobody, except policy makers seeking to avoid blame.

Nevertheless, the contact tracing discussion is moving on. As we have repeatedly said, any app will be limited in its application and reach, not least in regard to some vulnerable groups, especially the elderly. There are issues with false positives and false negatives, and with ensuring the app is not therefore just generating noise.

Much more fruitful is the discussion around human contact tracing, which will need to be the centre point of the Government's efforts.

Transparency: tell us how and why decisions are made

We need to hear more from governments about their decisions and approaches. More information is crucial.

Maintaining their poor track record for transparency, the UK Government has been quiet lately on tracing apps and we are yet to hear anything substantial about the proposed approach to “immunity passports”. This weakens public trust when it is most needed.

Safeguards needed

No matter what technologies are employed, they require strong legal and technical privacy protections to build public trust, especially for vulnerable groups like migrants.

With Parliament reconvening, the time for codifying privacy protections for tracing apps is now. Open Rights Group (ORG) will be advocating for the draft Coronavirus Safeguards Bill spearheaded by Prof. Lilian Edwards of Newcastle Law School.

We'll be back soon with updates on ORG's work to protect digital rights during the pandemic.



April 21, 2020 | Yva Alexandrova Meadway

Contact tracing apps & vulnerable migrants: key concerns

In our recent blog we discussed the emerging privacy issues around contact tracing and ‘immunity passports’, and the announcement from Google and Apple that they are jointly working on a contact tracing app. Open Rights Group (ORG) calls for both technical and legal safeguards to be introduced to ensure privacy is protected. These safeguards are essential to ensure both take up and trust in the planned measures and ultimately their success in the containment of the Covid-19 virus.

Here we will look at how the emerging privacy concerns around tracing and ‘immunity passports’ affect some vulnerable or marginalised migrants in particular. Their vulnerability may derive from facing significant social exclusion and marginalisation or being afraid to interact with governmental or non-governmental agencies.

The immigration policy context is defined by the hostile environment policy, which has a number of implications for migrants. The most pressing one is in the area of healthcare: treatment for Covid-19 has been made free for everyone, regardless of whether they have paid the migrant healthcare surcharge, but many migrants may not know this. In addition, the Department for Health and Social Care has confirmed it will be sharing data with the Home Office, and as a result vulnerable migrants may be deterred from seeking medical help even if they have Covid-19 symptoms. A number of organisations in the sector have issued a joint letter calling for an end to the Hostile Environment.

With regard to technical safeguards for contact tracing there are two main issues: trust and take-up. Trust is a key factor: the success of the new technologies depends on widespread use and adoption. This in turn depends on ensuring that privacy is as protected as possible, and that general risks around data sharing, re-use of data, mission creep and repurposing of contact mapping data are reduced or removed. For some vulnerable migrants, clarifying who has access to the collected data, how it will be shared between private and government agencies, and what exactly it will be used for will be critical to adoption and use. Guarantees that data shared through these apps will not be used for immigration enforcement or denial of services are particularly important.

The other consideration is take-up and use of the tool. The government has estimated that it needs 50-60% of the population to use the NHSX app for it to provide enough data points, and that means a very large percentage of smartphone users. The Bluetooth LE technology which supports the tracing apps is not available on all phones, especially older and cheaper models often used by people with limited economic means. Among some migrant groups, several people may also share one device, which creates additional problems for tracing. Thus, even with near-universal adoption among users whose phones can support the app, the data gathered may not have wide enough coverage, and pockets of exclusion may appear, leaving certain groups out of the measures.
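A rough calculation makes the coverage gap concrete. The percentages below are invented for illustration; the real shares of Bluetooth LE-capable phones and of adoption are unknown.

```python
# Even near-universal adoption among compatible phones can miss a
# population-wide target if many devices lack Bluetooth LE support.
ble_capable_share = 0.70       # assumed share of phones supporting BLE
adoption_among_capable = 0.90  # assumed adoption among those users
population_coverage = ble_capable_share * adoption_among_capable
print(f"{population_coverage:.0%}")  # 63%
```

On these hypothetical numbers, 90% adoption among capable devices still reaches only 63% of phone users, and the shortfall is concentrated in exactly the groups with older or shared devices.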

We know even less about plans for ‘immunity passports’, but it is clear these could result in a centralised register. There will again be questions of mission creep, potential for abuse, and more appropriate alternatives which remove these risks.

Legal safeguards to privacy and against abuse

In addition to safeguards for data privacy which can be built into the technology, as discussed above, there is also a need for legal guarantees. A group of academics has reviewed the emerging legal challenges around these new technologies and proposed a Coronavirus (Safeguards) Bill 2020 to protect against abuse of contact tracing tools and also ‘immunity passports’. The proposal would mandate that there would be:

  • No sanctions for failing to carry device, install or run application

  • No mandatory requirement to install application or display immunity certificate credential without due safeguards

  • No reuse of evidence or data derived from symptom tracking and contact tracing, and immunity certification, without due safeguards

As the authors say, “Uptake of apps, crucial to their success, will be improved if people feel confident their data will not be misused, repurposed or shared”. For immigrants, ensuring safeguards on data sharing with the Home Office specifically for the purpose of immigration enforcement and access to services is crucial.

Recognising that vulnerable and marginalised migrants, as well as other vulnerable groups, may face a higher risk to their privacy from new technologies developed to contain the spread of Covid-19 is essential for building trust and ensuring the success of these measures.

Background: ORG is implementing a new project aimed at building capacity and empowering the migrant and refugee sector to respond to the growing importance of data governance and new technologies at different stages of people’s migration journeys. We have developed a survey, jointly with Privacy International, which will help us better understand how we can support different types of organisations in the sector, such as frontline services, legal advice, community support, policy and campaigning. The answers will help shape our future activities: workshops, trainings, webinars, discussions, etc.

The survey can be found here: https://www.smartsurvey.co.uk/s/O150PS/ and is open until 4 May.



April 15, 2020 | Jim Killock

Contact tracing and immunity passports must respect privacy

The government’s plans for contact tracing and immunity passports should respect privacy, both at a technical level and backed by legal safeguards. This is essential for trust.

Contact tracing apps: the NHSX plans and Apple-Google changes

The government's plans for contact tracing apps are perhaps more confused as a result of announcements by Google and Apple of a Bluetooth contact tracing API.

Contact tracing can be performed either centrally, or on the device. In the NHS model, proximity contacts are uploaded to the NHS server, which then notifies people who are at risk. In the Google-Apple model, the server distributes 'risky contact IDs' and matching happens on the device. This means the central server doesn't learn who is in contact with whom, otherwise known as the ‘social graph’. The same approach has been suggested by the privacy researchers behind DP-3T.
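The two matching models can be sketched in a few lines of Python. This is an illustration of the structural difference only; names and data are invented and do not correspond to any real protocol.

```python
def centralised_match(contact_graph, infected):
    # The server holds everyone's contacts (the 'social graph')
    # and computes who should be notified.
    at_risk = set()
    for person, contacts in contact_graph.items():
        if person in infected:
            at_risk.update(contacts)
    return at_risk

def decentralised_match(ids_seen_locally, published_infected_ids):
    # The phone downloads the infected IDs and matches locally;
    # the server never learns who met whom.
    return ids_seen_locally & published_infected_ids

graph = {"id-a": {"id-b", "id-c"}, "id-b": {"id-a"}}
print(sorted(centralised_match(graph, {"id-a"})))      # ['id-b', 'id-c']
print(decentralised_match({"id-a", "id-d"}, {"id-a"}))  # {'id-a'}
```

The privacy difference is visible in the function signatures: the centralised matcher needs the whole contact graph as input, while the decentralised one only ever sees one phone's local observations.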

The 'decentralised' model is not without problems. There are potential attacks to disclose the identity of people who are infected, for instance, in both centralised and decentralised models. It arguably deprives the project of some information that may be useful for understanding the spread of the disease.

The prime consideration is take-up and use of the tool. The government has estimated that it needs 50-60% of the population to use it, and that means a very large percentage of smartphone users. Many mobile operating systems are slow to update; older models may never have upgrades; many will not support Bluetooth LE. Thus the government may be aiming for near-universal adoption among those users who can support an app.

Even assuming this is achievable, trust is a key factor: ensuring that privacy is as well protected as possible improves the chances of widespread adoption. Risks around re-use of data, mission creep and later repurposing of contact mapping data can be reduced or removed. For some marginalised groups, these issues may well be critical to adoption.

No approach can remove all personal privacy risks, as people may speculate about whom they came into contact with when notified.

Practicality of NHSX Bluetooth workarounds

A further question arises as to the practicality of the NHSX app’s Bluetooth LE workarounds. Bluetooth LE is meant to stop working when a phone is idle. The government has not explained how it gets around this without causing battery or security issues from the device’s screen being kept on. The Apple-Google API will provide a way for apps to work in the background; this is a key reason for introducing their new approach. But using the new API would require the NHSX app to be reengineered and messily redeployed, an approach which they appear to be contemplating.

Whether any app-based approach works is still very much open to debate, as Professor Ross Anderson points out. Concerns remain that false positives could overwhelm such a tool; most people contacted will probably be at low risk. At the very least, it is likely to work best in a state of very low infections combined with a highly available testing regime, so that people who are alerted can swiftly determine whether any supposed risk resulted in an infection. And in any case, an app is unlikely to supplant the need for rigorous and time-consuming human contact tracing.

Centralised databases

We know even less about plans for immunity passports, but it is clear these could result in centralised registers, though the government could choose otherwise. There will again be questions of mission creep, potential for abuse, and alternatives which remove these risks.

Legal defences against abuse

Defences against abuse of data can of course be either technical or legal. In parallel to the discussion about contact tracing, a group of academics has proposed a Coronavirus (Safeguards) Bill 2020 to protect against abuse of contact tracing tools and also ‘Immunity Passports‘. The proposal would mandate that there would be:

  • No sanctions for failing to carry device, install or run application

  • No mandatory requirement to install application or display immunity certificate credential without due safeguards

  • No reuse of evidence or data derived from symptom tracking and contact tracing, and immunity certification, without due safeguards

As the authors say, “Uptake of apps, crucial to their success, will be improved if people feel confident their data will not be misused, repurposed or shared to eg the private sector (think insurers, marketers or employers) without their knowledge or consent, and that data held will be accurate.”

The authors envisage that there may be times when the government wants to require immunity certificates: and that these must be constrained to specified circumstances where that is necessary and proportionate.

We urge the opposition parties to work with the government to introduce this or a similar Bill to ensure that public trust is maintained, especially if and when immunity certificates are proposed. Data privacy is a condition of success, not an aspect of policy to be treated as an afterthought.



April 07, 2020 | Jim Killock

Contact tracing and immunity passports: questions for the government

We continue to hear bits and pieces about the way that mobile apps may be developed, and about the possibility that Immunity Passports might come with intrusive database projects. We have today produced a briefing.

This sets out questions the Government and ICO need to answer in order for the public to understand what these projects may mean for them.

Mobile contact apps

We think that the government urgently needs to explain its approach to mobile contact tracing apps. These are mooted as potentially important for a post-lockdown world, where infections need to be identified so people can self-isolate. The government needs to explain the clinical basis for its approach: there is some controversy about how well these tools may work. Proximity, even based on relatively accurate Bluetooth connections, will not always be the same as risky contact, for instance.

The government must explain how privacy is protected, not least so that it has a good chance of persuading vast numbers of people to install and run these apps: it needs something like 60% of adults to use the app, and 80% of adults have a smartphone, so most of those will need to install it.

There are different potential approaches, and a series of possible technologies being developed. Some are centralised, others decentralised. The European PEPP-PT project (“Pan-European Privacy-Preserving Proximity Tracing”) appears to be co-ordinating and potentially picking which approach to use. However, the government has made no statement about how it is working with PEPP-PT. 

Related to this, contact tracing, whether using Bluetooth location data or not, will need to work across borders. The PEPP-PT project recognises this; we need to know how the government will work to ensure this can take place, again while protecting privacy.

Immunity passports

The idea of ‘immunity passports’ is being pursued by the government. Potential approaches could preserve privacy, through using 'attestation' from trusted parties. However, other approaches could involve centralised databases, potentially of the whole population, recording their immunity status.

Here we again need clarity from the government about the likely approach, governance model, and so on.

ICO advice

Privacy and data protection continue to be important in the crisis, in order to maintain trust and the rule of law. The ICO has a critical role explaining some of the difficult aspects of law, and also to state the duties of private and non-health government organisations during the crisis.

The ICO should explain when it intends to release advice.

Legal analysis

Our document gives a brief overview of data protection law in this area. In short, data protection laws and protections continue to apply, even when exceptional arrangements are in place. In particular: lawfulness, fairness and transparency; purpose limitation; data minimisation; storage limitation; integrity and confidentiality continue to be required.

Surveillance

Our briefing does not cover surveillance law; here we again need a great deal of clarity about the use of existing government powers. We will follow up with a further briefing. You can also read this blog by Javier Ruiz detailing some of our thoughts.

Full briefing: read here.

[Read more]


April 07, 2020 | Pascal Crowe

Democracy and Covid-19

The Covid-19 epidemic has disrupted our economic and social life unlike anything seen in peacetime. Its impact on the character of our democracy is becoming apparent: the pressure of social distancing is forcing us to rethink how we structure our democratic institutions.

In the midst of all this, does the ‘white heat’ of technology offer some meaningful solutions? After years of ‘techlash’, where technology firms were accused of damaging democracy, could they be its salvation after all? 

Early signs are promising. Select Committees have been using video conferencing software to continue scrutinising the government. Some have suggested this could be a preferable way to interview witnesses in future, given cost and environmental considerations. 

The Speaker of the Commons, Lindsay Hoyle, has written to parliamentary officials to explore digital innovations that can keep the show on the road in these difficult times. Although we often have reason to be sceptical, any tech-driven solutions to shore up our democratic institutions should be an easy win. With this in mind, here is the ORG perspective on what can be done to keep our democracy intact - and even improved - during this unprecedented event.

Digital Democracy

ORG continues to oppose the use of electronic voting in UK statutory elections, for a number of reasons: cybersecurity concerns, cost, and the fact that electronic voting cannot reconcile the principles of verifiability and the secret ballot.

Remote voting for MPs, however, is a useful and essential democratic innovation for these times. Votes in Parliament are never anonymous; indeed, it is essential for holding MPs to account that their voting records are public.

At the same time, recent legislation has brought the biggest curtailment of civil liberties in peacetime onto the statute book. Whilst all of us recognise the purpose of sacrificing some individual rights in the public interest, it is important that this should be proportionate and accountable. For example, the review period for the extraordinary powers in the emergency Covid-19 Bill was shortened from two years to six months after public pressure, including from civil society groups. An up-and-running digital Parliament could have held the government further to account. In the absence of one, it seems likely that there are measures in the legislation that we will all come to regret.

If remote voting for MPs can facilitate proper scrutiny of the government, and return us to normal times more quickly, then it should be introduced as soon as possible.

Information Environments

ORG supports the development of high-quality ‘information hubs’ that inform public debate. Information hubs collate and grade information on a topic. Fact-checking services can function as information hubs by critically evaluating the content and source of information. This is particularly important during the current public health crisis, where the fast-moving situation is liable to foster misinformation (rumours). The recent attacks on 5G masts are one example of this.

In addition, the geopolitical dimensions of the Covid-19 epidemic have seen the spread of disinformation by state actors. There is an unprecedented amount of political capital at stake. For example, the Chinese Communist Party has claimed that the American Government created Covid-19, despite there being nothing in the best available evidence to suggest this. 

A high-quality information hub that can provide assurance, advice and instructions during a crisis such as this would be invaluable. The UCL Constitution Unit has already outlined a framework for how this might look. Some parliamentarians have already begun a cross-party effort to create a database of good-quality information on the pandemic.

Often, digital solutions to these kinds of crisis centre on takedowns and potential censorship. ORG rejects this: it is precisely such impulses that exacerbated the crisis in the first place. The enemies of the pandemic are freedom of information, strong democratic institutions and public trust - not censorship, autocracy and fear.

[Read more]