We need to do more than flatten the curve; we need to demand action

VentureBeat published an opinion piece yesterday titled, “Don’t like dystopian surveillance? Flatten the coronavirus curve.” The author, Khari Johnson, rightly points out that, with the current pandemic, it’s more than our health on the line — it’s also our privacy.

Johnson argues, “If you value privacy and due process or you don’t like the idea of handing federal law enforcement the ability to indefinitely detain people during a pandemic, then do your part to flatten the curve.”

However, Johnson’s recommendation doesn’t go far enough. There is a lot more we need to do than simply protect ourselves from infection. Our leaders now seek powers that no one would think justifiable under normal conditions. In many ways, the United States government’s response to COVID-19 goes beyond the NSA’s surveillance in the years after 9/11, and privacy advocates are right to be concerned. And Big Tech has been tapped to deliver.

When Johnson argues that the singular solution is for us, the general public, to manage our health so that our privacy is not infringed upon, he’s absolving private companies and the government of the responsibility to act within ethically acceptable boundaries. If we wish to maintain our privacy, safety, and democracy, there are five demands and three concrete steps we must take now. Let’s start with those five demands:

1. Forbid these companies from selling or sharing this data with other companies. It should be made illegal to sell any data collected for the purposes of tracking the virus to companies that will leverage that data to drive their bottom line. There are many ways this data could be misused. For instance, insurance companies could use information about who is infected to grant or deny coverage. Credit rating agencies could use the data to determine credit scores, penalizing consumers who might be carrying medical debt. If COVID-19 is later shown to have long-term health effects, employers could decide whether to hire someone based on their infection history. Real estate developers might determine where or where not to build based on outbreak hotspots, which may track and exacerbate existing wealth inequalities. Marketers could use the data to fan the flames of fear among consumers who were previously infected so they can sell more (real or fraudulent) COVID-19 preventative products. Beyond those examples, we can anticipate clever and powerful companies finding new ways to leverage this data. Even with the best intentions, unintended consequences are often unforeseeable. This is precisely why proceeding with caution is so important.


2. Forbid these companies from sharing this data with divisions within their own companies. Google, for instance, is composed of hundreds of departments and tens of thousands of engineers that serve every industry. The data that Google (or one of its subsidiaries) collects in fighting the virus should not be shared with any other divisions within the company that potentially serve other masters. This prevents a situation in which one division stays within guidelines but (unwittingly or not) hands off data to another division that is not beholden to the same rules.

3. Forbid sharing the data with governmental agencies that are not directly responsible for combating the spread of the virus. Nothing should be shared with ICE, for example, which could pair that data with facial recognition software (e.g. from Clearview) to target and arrest people suspected of being in this country illegally.

4. Ensure the data cannot be leveraged for political purposes. For instance, the data could be used to direct coronavirus-related aid to regions where a particular party’s supporters live while devoting fewer resources to regions unsupportive of a particular administration. Data that shows pockets of outbreaks in certain locales could be used in election messaging to point blame — fairly or unfairly — in the direction of candidates.

5. Specify a bipartisan body, including non-governmental subject matter experts on data ethics and privacy, to oversee both the government and private companies to ensure they are responsible stewards of the data they were permitted to collect. This body must be given teeth, too. It should be able to determine and mete out fines and punishments to any company that oversteps or breaks established policy.

To enact those demands, there are three steps we can take as members of the general public:

1. Engage our representatives. By contacting our local, state, and federal representatives, we’ll collectively make our voices heard and show those in power that this is a key concern. The good news is that some of our leaders have already sounded the alarm. The Washington Post reported that five senators — Bob Menendez and Cory Booker of New Jersey, Kamala Harris of California, Richard Blumenthal of Connecticut, and Sherrod Brown of Ohio — wrote a letter to the White House and to Google asking how data will be collected, used, and shared. We need to call these senators to express our support, and call other senators to urge them to join the letter writers.

2. Support and inform larger organizations that can take up the cause. We should marshal the power of organizations like the ACLU — but they’ll need to hear from us in order to be truly effective. In its article, “Can We Trust the Government to Respond to the Coronavirus in a Fair and Effective Manner?” the ACLU does not even mention the words “data” or “privacy.” It’s up to us to make sure it’s attacking these potential issues.

3. Act directly as employees and shareholders where applicable. We often forget that we have the power to effect change in our daily lives. For those of us employed by the companies working on these tracking projects, the time to organize and speak up is now. The tech community is no stranger to internal dissension — just look at the Google walkouts over the past year. Strong, public-facing resistance can have an important impact. Also, for any of the companies that are publicly traded, even more power lies with the shareholders. Quarterly shareholder meetings and calls open the door for public scrutiny of ongoing projects and practices.

Some companies do appear to address the issue, at least on the surface. Verily, a subsidiary of Alphabet, provides COVID-19 testing. Its privacy policy has the veneer of respecting privacy. But when it says, “certain Services (or portions thereof) may only be accessible by creating an account, providing third-party user credentials (e.g., your Google account) for authentication or otherwise disclosing certain PII,” one begins to wonder what is actually going on here.

In times of national crisis, the issue of balancing the rights of individual citizens with the good of the nation as a whole comes to the fore. We saw it when Abraham Lincoln suspended the writ of habeas corpus and when the Patriot Act was passed under the Bush administration. The issue before us is not a new one. But it is a pressing one, given both the emergency on our hands and the stunning breadth of surveillance our government and others are pursuing, which at present is utterly unchecked by the legal and ethical concerns of a scared people desperate for safety.

Reid Blackman is the founder and CEO of Virtue Consultants, a digital ethical risk consultancy. He is a Senior Advisor to Ernst & Young and is a member of their AI Advisory Board, sits on the committee for “Methods to Guide Ethical Research and Design” for the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems, and is a member of the European Union Artificial Intelligence Alliance.
