Published on February 25, 2022

A deep dive into proposed online content regulations


It all started when a public interest litigation was filed in the summer of 2020, alleging infringement of constitutional rights due to the government's failure to regulate content on over-the-top (OTT) media services. Subsequently, consistent with a High Court directive issued in January 2021, the Bangladesh Telecommunication Regulatory Commission (BTRC) and the Ministry of Information and Broadcasting started drafting frameworks to regulate online content. A preliminary draft of the regulation—titled Bangladesh Telecommunication Regulatory Commission Regulation for Digital, Social Media and Over-The-Top Platforms, 2021—was recently made available by the BTRC for public comment.

Broadly speaking, the draft regulation appears to have been substantially copied from India's Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. Unfortunately, such a cut-and-paste exercise is problematic for a number of reasons—not least because it fails to reflect on the underlying policy considerations or to account for the differences in the regulatory environment. It also pre-empts an authentic rule-making effort driven by local issues and considerations. All in all, it stands in stark contrast with the draft policy prepared by the information ministry, and uncritically incorporates such requirements as traceability, local registration and content moderation, with no safe harbour provision.

Set out below are the key concerns with the draft regulation.

Enabling traceability

Let's start with what traceability means. Simply put, it means that messaging services—like WhatsApp and Viber—must have the ability to trace the first sender of a message (sent within Bangladesh) and disclose information about said sender to the government authorities, on the basis of an order from a court or the BTRC. WhatsApp explains the concern well: "Requiring messaging apps to trace chats is the equivalent of asking us to keep a fingerprint of every single message sent on WhatsApp, which would break end-to-end encryption and fundamentally undermine people's right to privacy." Evidently, the intention is to curb misinformation spread over messaging services. So, why should the traceability feature concern you, a Bangladeshi citizen?
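Before turning to the specific concerns, it may help to make WhatsApp's objection concrete. Here is a minimal sketch, in Python, of what a naive content-based "fingerprint" could look like; the function name and the hashing scheme are illustrative assumptions, not any platform's actual design.

import hashlib

def message_fingerprint(message_text: str) -> str:
    # Derive a stable fingerprint (a SHA-256 digest) from the message content.
    return hashlib.sha256(message_text.encode("utf-8")).hexdigest()

# To honour a tracing order, a service would have to retain a record such as
# (fingerprint -> first sender) for every message ever sent: the "fingerprint
# of every single message" that WhatsApp warns about.
print(message_fingerprint("Example forwarded rumour"))

The same text always yields the same digest, which is exactly what would turn such a ledger into a permanent record of everyone's conversations.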

Firstly, it dilutes a citizen's right to privacy. At its heart, to mandate traceability is to mandate surveillance, and this would infringe the right to privacy protected under Article 43 of the constitution. The draft regulation does not provide sufficient safeguards to counteract abuse, nor does it clarify what "other less intrusive means" the authorities must explore before a tracing order is issued. Consequently, to comply with such an order, messaging services may have to break end-to-end encryption, a feature which ensures that no one other than the sender and the receiver—not even the service provider—can read messages.
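For readers unfamiliar with how end-to-end encryption achieves this, the following is a minimal sketch using the open-source PyNaCl library; the keys and message here are purely illustrative.

from nacl.public import PrivateKey, Box

# Each endpoint generates its own key pair; private keys never leave the device.
sender_key = PrivateKey.generate()
receiver_key = PrivateKey.generate()

# The sender encrypts using their private key and the receiver's public key.
ciphertext = Box(sender_key, receiver_key.public_key).encrypt(b"meet at 5pm")

# The service provider relays only the ciphertext; lacking either private key,
# it cannot read the message. Only the receiver can decrypt it.
plaintext = Box(receiver_key, sender_key.public_key).decrypt(ciphertext)
print(plaintext)  # b'meet at 5pm'

A tracing mandate would, in effect, require the provider to retain information the system is deliberately designed never to possess.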

Secondly, while the draft regulation states that the content of a message or any other information relating to the individual need not be disclosed pursuant to a tracing order, in reality, this merely pays lip service to privacy, because several laws, including the Digital Security Act (DSA), 2018 and the Bangladesh Telecommunication Regulation Act, 2001, entitle the regulatory authorities to compel disclosure of information.

Thirdly, while tracing orders must be issued for crime detection, prevention, investigation and prosecution purposes, they may also rest on such variably interpretable grounds as "public order" and "sovereignty and integrity of Bangladesh," which are so broad that they leave the door ajar for misuse. As a result, citizens would be reluctant to speak freely, fearing that their private communications—even if encrypted—could be traced and used against them. Besides, cybercriminals could easily use sophisticated tools to impersonate a sender, which would render digital fingerprinting techniques redundant. Overall, this only increases the risks faced by journalists, political activists and the general citizenry for expressing unpopular opinions or dissent.
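The impersonation point can be demonstrated with the same naive hash scheme sketched earlier, again purely as an illustrative assumption: a content-based fingerprint cannot distinguish the true originator from anyone who simply retypes the message.

import hashlib

def message_fingerprint(message_text: str) -> str:
    return hashlib.sha256(message_text.encode("utf-8")).hexdigest()

original = message_fingerprint("Example forwarded rumour")  # the actual first sender
copycat = message_fingerprint("Example forwarded rumour")   # an impersonator retyping it
print(original == copycat)  # True: the trace cannot tell the two apart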

Local registration and resident officers

Under the draft regulation, non-resident intermediaries are required to register locally as well as appoint resident officers and representatives in Bangladesh. Such requirements fail to account for the fact that internet-based services, and the speed and effectiveness of their delivery, do not depend on companies having a local presence in the country. On the contrary, this belief is antithetical to the fluid, cross-border and open nature of the internet and the services provided by technology companies, and, if enforced strictly, could result in internet fragmentation and debilitated connectivity. Moreover, this requirement will likely increase the cost of doing business in Bangladesh, and thereby make internet-based services more expensive for consumers.

From a business continuity perspective, there are substantial risks of registration cancellation and of law enforcement actions against resident officers and representatives. In fact, it is extremely disconcerting that Section 76 of the telecom act appears to reverse the legal burden of proof for individuals, which means that a person is presumed guilty unless proven innocent. On the whole, the requirement for local presence seemingly gives precedence to the government's desire for control over internet companies, and, without a clear safe harbour provision, may deter certain companies from entering the Bangladeshi market altogether.

Absence of "safe harbour"

First and foremost, what is "safe harbour" protection? Fundamentally, it's a legal provision that limits the liability of internet platforms under certain circumstances. For platforms like Facebook, TikTok and YouTube, this protection is important because it shields them from liabilities arising from user-generated content, over which they don't exercise active editorial or curatorial control.

Second, why is such protection important? For several reasons. Since the advent of the internet, intermediary liability protection has undergirded it, enabling platforms to operate at scale, democratising access to information and content creation, and placing responsibility for content where it belongs. Clearly defined protections are therefore more likely to result in a vibrant internet ecosystem, a thriving digital market, and accelerated economic growth. Unfortunately, there is no such protection in the BTRC draft regulation, notwithstanding strong constitutional and enforcement arguments in its favour.

i) A robust intermediary protection enables the exercise of free speech and privacy rights under articles 39 and 43 of the constitution. Put differently, an "unsafe" intermediary is incentivised to remove user-generated content without adequate review or consideration, to avoid penalties; this will not only amount to a direct and unreasonable restraint on protected speech, but could also result in self-censorship, as users would consciously forbear from freely expressing themselves. It would also force intermediaries to become surveillance centres and censorship boards. Where the law also requires user information to be handed over, an intermediary would become little more than a proxy for the government to collect information and surveil citizens.

ii) From an enforcement standpoint, this protection will also shield intermediaries from the menace of heavy-handed fines and detention. Under the telecom act, the BTRC can impose penalties on companies and associated individuals of up to Tk 300 crore (around USD 35 million) and/or imprisonment for up to five years, if content is not removed in line with its orders. And such penalties could be imposed more than once. Such a penalty regime, without a safe harbour, creates significant business continuity risks and could compel a non-resident company to discontinue its services in Bangladesh—or not enter the market at all.

Content moderation

According to the draft regulation, content moderation requests from government agencies must be fulfilled by intermediaries—like social media platforms, data centres and file hosting services—within 72 hours. Not only is the timeline arbitrary, but the requirement itself is disproportionate, raises constitutional concerns, poses significant implementation challenges, and promotes a monoculture of content moderation.

Firstly, as mentioned above, the punishment for noncompliance with this timeline is dangerously disproportionate. In the absence of sentencing guidelines, sufficient due process safeguards or a duty to take a graded approach, the courts have no clear precepts on how to exercise their discretion.

Secondly, content moderation on online platforms constitutes a restraint on free speech. With such a harsh turnaround time prescribed (and in the absence of judicial oversight or a right to appeal), intermediaries will have a perverse incentive to pre-emptively censor even valid and lawful expression, or to excessively remove content without sufficient consideration of its legality. Moreover, it is impractical to expect that intermediaries can comply with the timeline every single time. Content moderation can involve a complex system of review by human reviewers and automated tools, with moderators working through large volumes of requests while taking into consideration nuanced local legal and regulatory exigencies. Hence, the draft regulation should allow more flexibility around lead times, as the sketch below illustrates.
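To illustrate the rigidity of the rule, here is a minimal sketch, with assumed names and fields, of the compliance clock the draft regulation would start for every request, however complex the legal review behind it.

from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

TAKEDOWN_WINDOW = timedelta(hours=72)  # the draft regulation's fixed deadline

@dataclass
class TakedownRequest:
    content_id: str
    received_at: datetime
    needs_human_review: bool  # nuanced legal questions cannot be fully automated

    def deadline(self) -> datetime:
        # The clock runs from receipt, regardless of how hard the question is.
        return self.received_at + TAKEDOWN_WINDOW

request = TakedownRequest("post-123", datetime.now(timezone.utc), needs_human_review=True)
print(request.deadline())

A single fixed window treats a straightforward spam report and a contested political-speech determination identically, which is precisely the inflexibility described above.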

Bottom line?

There's no doubt that regulation is necessary, as intermediaries have been woefully slow to solve the problems they create. But we need a content regulation mechanism that takes into consideration the technical, operational and functional differences between internet-based services—one that is predictable, future-proof and fit for purpose. What we don't need is a regulation inspired by legacy telecommunication and broadcasting standards and lifted from a foreign law that is itself facing constitutional challenges and industry-wide criticism, because that will inevitably produce a patchwork regulation and invite the same challenges here.

While the effort to socialise the draft regulation for public comments is a significant move in the right direction, it's not enough. Extensive consultation with input from government and non-government stakeholders as well as constitutional experts is also essential. In particular, it is important to engage with non-resident intermediaries—like Meta, Google, and Netflix—who will be most impacted by this regulation. It is equally important to undertake impact assessments to gauge the effect, effectiveness and cost implications of the regulation. A forward-leaning and collaborative approach will not only inspire confidence in the regulation, but will also give it real enforcement teeth, and foster a healthy and vibrant digital public sphere.


Shahzeb Mahmood is a barrister and a research associate at the Centre for Governance Studies (CGS).