The last time we wrote about an EU copyright proposal affecting software development, we explained that policymakers in Brussels responded when you took action. We’re fast approaching another vote on that proposal in the EU Parliament.
Here’s an update on the status of the negotiations, along with some ideas for how you can help Members of the European Parliament (MEPs) understand why and how to protect software development. Activism by people concerned about the copyright proposal is part of why it hasn’t passed yet, so keep reading to learn how you can help shape the debate.
The EU’s copyright proposal has the potential to impact the way we develop software. There’s still time to make our stance known on several key issues of the proposal before it becomes law:
Upload filters (Article 13): Automated filtering of code would make software less reliable and more expensive. Upload filters pose larger concerns—like censorship, free speech, privacy, and ineffectiveness—and are problematic for all kinds of content, including software code.
Text and data mining (Article 3): The copyright exception for text and data mining in the EU is too narrow, because it only applies to research organizations for scientific purposes on a not-for-profit basis. This narrow exception would undermine the EU’s efforts on AI and machine learning, as well as software development in the EU that depends on AI and machine learning.
New right (ancillary copyright) for press publishers (Article 11): Requiring a license to post snippets of text that describe links would add overhead to anyone developing software for the web. It would also run counter to copyright exceptions that allow copying for certain purposes, like commenting on a copyrighted work.
We’re focusing on software because that’s where GitHub and software developers can speak with authority.
You may have heard that Parliament “rejected the copyright proposal” on July 5. Actually, Parliament voted against one of its committee’s proposals, but it didn’t permanently reject it. Instead, they voted to open the negotiations to all 751 MEPs, rather than primarily one committee.
That said, the rejection was significant because MEPs used a rare procedure to challenge the committee’s decision, and public advocacy against the proposal contributed to that.
Parliament will vote on September 12 based on amendments received by September 5. This time, it will be the full Parliament—more than 700 more MEPs than when the committee voted on the proposal. And while they’re restarting negotiations, they’re not starting over completely. They’ll vote on amendments to the Commission’s initial proposal, which is the one that kicked off the idea of upload filters in the first place.
Whatever happens in Parliament, it’s important to keep in mind that there are three institutions involved in lawmaking in the EU: Commission, Council, and Parliament. If Parliament adopts a version of the proposal, it will enter into negotiations—or trilogues—with the Commission (based on its original proposal) and Council (based on its negotiating mandate).
But for now, all eyes are on Parliament to see what it might adopt.
Leading up to Parliament’s vote, a Copyright Week of Action is happening September 4-11. Each day in that week is dedicated to a different constituency. September 5 is for developers and open source software.
Because so many developers from the EU work in the San Francisco Bay Area, we’re hosting an event at our San Francisco headquarters that day, where EU developers can contact their MEPs and urge developers back home to do the same.
Many MEPs aren’t familiar with software development or how this proposal can affect it. Developers are in an excellent position to explain to MEPs how essential open source software is to software development overall and to the countless sites, apps, and programs people rely on and enjoy.
It can be a lot to take in, so here are a few things you can share with your MEP to get started:
Ready to take action? Contact your MEPs and tell them to protect software development by:
If you aren’t a citizen of the EU, there are still ways to get involved and speak out for the developer community. Public advocacy has already shaped Parliament’s response, so share your thoughts on the copyright proposal on social media, raise awareness in your community and circles, and stay tuned to what happens next.
As a company that takes our commitment to social responsibility seriously, we’ve created and open sourced policies for others to adapt and use. We’re excited to announce that we’ve added four new policies to the collection.
The Anti-Bribery Statement and the Gifts and Entertainment Policy both address bribery. We designed these policies to promote compliance with anti-bribery laws, including the US Foreign Corrupt Practices Act and the UK Bribery Act.
Beyond our policy commitment to prohibiting bribery, the statement describes the concrete actions we take to back up our words. A key element of ensuring our employees know how to avoid bribery is education. We now provide anti-bribery training to all employees, with additional training for particularly relevant people and teams, like the sales team. We also now require our channel partners and vendors to comply with our anti-bribery statement.
In describing our anti-bribery policies, we note our Code of Ethics, Standards of Conduct, and Gifts and Entertainment Policy. We decided to open source a separate policy on gifts and entertainment to provide more detail on the activities with the highest risk of potential bribery. We include examples of what’s acceptable and what’s unacceptable, explain consequences for violations, and link to the DOJ and SEC’s Resource Guide on the Foreign Corrupt Practices Act.
With GitHub’s annual Universe conference coming up, we realized it was a great time to update and open source our Event Terms and Event Code of Conduct. Both policies aim to create an inclusive, inviting, engaging, and safe place for people to learn and participate.
The code of conduct sets expectations for event speakers, attendees, exhibitors, organizers, and volunteers to show each other respect and courtesy. We make it clear that we are dedicated to providing a positive and harassment-free event experience for everyone—regardless of age, gender, gender identity and expression, sexual orientation, disability, personal appearance, body size, ethnicity, race, religion, nationality, or level of experience—and that we do not tolerate harassment of event participants in any form. We provide examples of acceptable and unacceptable behavior and contact information for reporting incidents. The terms provide more of the legalese, including basic requirements to attend (like agreeing to the code of conduct).
We hope these policies will give users who lack the time or other resources to craft their own a starting point, and a basis for contributing to best practices against bribery and corruption. We welcome you to adapt and reuse them.
Since we last wrote about net neutrality, we’ve seen efforts to step up protections for an open internet in the US and India (some more promising than others). We’ll start with the most encouraging updates, and we’ll also revisit a great resource for developers who want to help policymakers understand the need to save net neutrality.
India recently adopted what might be the world’s strongest net neutrality norms. On July 11, India’s Telecom Commission approved recommendations the Telecom Regulatory Authority of India (TRAI) proposed last November to incorporate principles of non-discriminatory treatment into internet service provider license agreements. Those rules define discriminatory treatment to include “any form” of data discrimination, such as “blocking, degrading, slowing down or granting preferential speeds or treatment to any content,” as well as zero rating. India’s rules do allow exceptions, including for “reasonable traffic management” and “specialised services” (such as emergency services), but only where “proportionate, transient and transparent in nature” and not when provided as a replacement for internet access services.
Meanwhile, in the US, the call for net neutrality is regaining momentum in California, and possibly in Congress.
In California, Scott Wiener’s state senate bill SB 822 almost lost its promise as the US’s strongest net neutrality protections. After initial success in committee, SB 822 was merged with another bill, and then gutted. Fortunately, after much negotiation, Scott Wiener announced that it would be reinstated with essentially all of its key elements. The revived bill is due out August 6, after legislative recess. Be sure to look out for that, along with opportunities to defend and protect it as it progresses through the legislative process.
At the federal level, on July 17 Representative Mike Coffman of Colorado became the first Republican to sign the Congressional Review Act (CRA) discharge petition to undo the FCC’s repeal of net neutrality rules.
On the same day, Representative Coffman introduced his own net neutrality bill, the 21st Century Internet Act, which principally embraces the tenets of the FCC’s 2015 Open Internet Order with the significant exception of creating a new title for broadband internet access services. (The 2015 Open Internet Order classifies broadband under Title II of the Federal Communications Act, corresponding to common carrier-level regulation. In repealing that order, the FCC re-classified broadband under Title I: “light-touch regulation.” Coffman’s bill chooses neither and creates a new title instead.) Proponents of the CRA bill hope that Coffman’s bill will not detract from support for the CRA bill, and that Coffman’s signing of the CRA discharge petition will spur other Republicans to put their weight behind the CRA too.
Developers have an important message to relay. Net neutrality has led to vast opportunity by giving developers the freedom to build and ship software without being potentially blocked, throttled, or tolled by internet service providers. This has meant a more level playing field for launching new products. Without net neutrality protections, we lose trust that the internet is a force for innovation and opportunity.
If you want more detail, check out these comments a group of 190 internet pioneers, technologists, and developers filed with the US Federal Communications Commission (FCC). Although they’re a year old, the explanations are still relevant and can be a great resource in explaining the ramifications of the FCC’s decision for software to policymakers who may not understand it.
Not mincing words, they explain that the authors of the FCC’s decision to repeal net neutrality rules “lack a fundamental understanding” of what the internet’s technology promises to provide, how the internet actually works, which entities in the internet ecosystem provide which services, and what the similarities and differences are between the internet and other telecommunications systems the FCC regulates as telecommunications services.
They also describe risks to innovation that could follow from reclassification of broadband services, and give concrete examples of consumer harm that could have been prevented when broadband services were less regulated (before net neutrality) and of consumer benefit realized when they were more regulated (during net neutrality).
Feeling inspired? We hope you’ll join us in continuing to advocate for an open internet.
The Electronic Frontier Foundation (EFF) publishes an annual “Who Has Your Back Report” to evaluate which companies defend their users when the government comes knocking. Since 2011, the report has focused on government requests for user information. This year, the report takes on a different topic: government requests to take down information—in other words, censorship on online platforms.
As background, EFF explains how the prevalence of HTTPS and mixed-use social media sites have made it harder for governments themselves to directly censor content. As a result, governments are increasingly turning to online platforms to censor for them.
EFF used five criteria to rate how well companies (“some of the biggest online platforms that publicly host a large amount of user-generated content”) protect their users from unwarranted censorship:
Based on EFF’s description of those criteria, GitHub meets each one. As we explain in our contribution to the UN’s free expression expert’s report on content moderation, we minimize censorship on our platform by providing transparency, notice, appeals, and geographically limited blocking when we find a takedown unavoidable.
Among EFF’s observations in the report are that companies that scored well “serve to provide examples of strong policy language for others hoping to raise the bar on content moderation policy” and that helping companies to review each other’s policies around content moderation “can serve as a guide for startups and others looking for examples of best practices.” A strong motivation behind open sourcing our policies is that we hope to contribute to industry best practices while offering those examples to startups and others who are looking for them. We recognize how important transparency is in how we develop our policies. We also recognize that being transparent about how we moderate content is essential to maintaining our community’s trust and our legitimacy as a platform.
We thank EFF for taking on online censorship in this year’s report. Get in touch with us through email or Twitter if you’re interested in collaboration toward raising the standard among companies involved in online content moderation.
Earlier this month, we shared our contribution to a report about content moderation and free expression written by David Kaye, the United Nations Special Rapporteur on freedom of expression. That report is now available.
While the report focuses on social media platforms that see large volumes of hate speech and misinformation, use automation to moderate content, and receive government takedown requests based on their Terms of Service, many of Kaye’s points are relevant to GitHub’s users.
For example, on how to respond to government takedown requests, the report cites GitHub’s contribution where it states:
Companies should ensure that requests are in writing, cite specific and valid legal bases for restrictions and are issued by a valid government authority in an appropriate format.
At GitHub, when we receive a government takedown request, we confirm:
- that the request came from an official government agency;
- that the official sent an actual notice identifying the content; and
- that the official specified the source of illegality in that country.
Here are some other relevant points from the report.
The report’s top recommendation to companies is to recognize human rights law as “the authoritative global standard for ensuring freedom of expression on their platforms.” As we note in our contribution:
GitHub promotes the freedom of expression in our policies and in our application of those policies to specific cases, consistent with international human rights law’s articulation of the right to freedom of expression and its limitations in the International Covenant on Civil and Political Rights (ICCPR).
The ICCPR allows limitations on free expression when provided by law and necessary, including for the respect of others’ rights or where the content constitutes harassment, abuse, threats, or incitement of violence toward others.
The report calls for companies to seek public input and engagement and to use transparent rulemaking processes. We develop the rules on our platform collaboratively with our community.
The report notes that companies can develop “tools that prevent or mitigate the human rights risks” caused when national laws or demands are inconsistent with international standards.
In these situations, we look for ways to comply that are the least restrictive on human rights—for example, by asking the user to remove part of a repository instead of blocking it entirely, and by geo-blocking content only in the relevant jurisdiction.
One of Kaye’s five recommendations for governments responds to a concern we raised in our contribution. We explained that measures like the European Union’s proposal to require upload filters for copyright infringement “are overly broad in their scope and, as applied to GitHub and our user community, could be so cumbersome as to prevent developers from being able to launch their work.”
The report noted that “automated tools scanning music and video for copyright infringement at the point of upload have raised concerns of overblocking,” and made this recommendation:
States and intergovernmental organizations should refrain from establishing laws or arrangements that would require the “proactive” monitoring or filtering of content, which is both inconsistent with the right to privacy and likely to amount to pre-publication censorship.
Thanks to Special Rapporteur Kaye for his in-depth study of how human rights principles apply to content moderation on online platforms.
If you’d like to participate in the development of our policies, watch our site-policy repository, and look out for posts announcing new policies for public comment on our policy blog or follow our Policy Twitter account. To follow takedowns in real time, watch our gov-takedowns and DMCA repositories.