Although this section discusses various laws, it should not be taken as legal advice.
Unsurprisingly, given the diverse and often divided nature of our world, there is little global consensus on what content is acceptable to host online. Cultural and political norms differ widely from country to country, even to the point of directly contradicting one another, and local laws naturally reflect this. This variability makes it virtually impossible for platforms to tailor a single product or policy to comply with all local laws, much less to satisfy all regional sensitivities simultaneously. On many subjects, companies will inevitably disappoint some of their constituents, no matter what policy choices they make.
Despite this, the consequences of failing to account for these differences can be steep. Products that run afoul of these legal and cultural differences risk losing user trust and constraining future growth in a market; they can even be banned by local authorities. This section gives an overview of the types of regional considerations a policy team must weigh while crafting both global and regional policies.
Regional Variation in Laws
While many countries have broadly comparable legislation relevant to trust and safety policy, there is no single global set of laws that governs internet platforms. This section briefly summarizes how variable the global legal landscape is for platforms, a topic that will be expanded upon in a future chapter of this curriculum.
Platforms may find their services limited or blocked by regional authorities if they are deemed to be non-compliant with local law. Regional events may create pressure for policy changes, too, both from people outside of a company and those within it; the way a product is used within a conflict zone, in the midst of a major emergency, or during an election can have significant real world consequences that platforms cannot always anticipate, but must be responsive to.
Embargoes and sanctions can also limit a platform’s ability to operate in certain regions. For example, current U.S. law prohibits U.S. companies from doing business in Iran; even if a U.S. social media company designed policies friendly enough to the Iranian Government to remain unblocked by its leadership, that company would not be allowed to accept any money from people living in the country as part of its services.
Most platforms have a high-level set of global policies that are more restrictive than the content laws commonly found around the world. For example, as mentioned in the “Considerations while creating policies” section, many countries have laws that require platforms to remove child sexual abuse material, which virtually all mainstream platforms already prohibit on a discretionary basis. Aligning with these legal minimums may not require any changes to a platform’s policies, because the platform’s own policies already match or exceed what the law requires.
Beyond this common baseline, the way platforms approach regional differences is determined by a complex mixture of product design, regulatory threats, financial pressures and, at least in some cases, ideological commitment to the company’s mission and values.
Product design plays a crucial role in shaping platforms’ response to regional differences because it fundamentally limits the policy and enforcement options available. Products that physically operate in specific countries (e.g. Airbnb listings) or which provide country-specific versions of their product (e.g. google.co.uk vs. google.mx) are much easier to adapt to regional considerations than those with a single global design (e.g. Twitter).
Similarly, a company’s stated mission and values, and the strength of its ideological commitment to them, can have extremely significant effects on how the platform will deal with regional differences. Parler and Gab are particularly salient examples of this, but the influence of internal mission and values can be observed, to varying degrees and towards varying ends, in the behavior of nearly all large internet platforms.
For larger platforms, global policies are often set to align with the strictest regulations in the major markets a platform serves when doing so is uncontroversial in the rest of the world and does not present a significant design, values, or business problem. This approach minimizes the complexity and duplication of effort inherent in maintaining multiple approaches to the same topic. For example, many platforms are built to comply with COPPA, a United States law that prohibits platforms from collecting data from children under the age of 13 without parental consent; the EU’s General Data Protection Regulation contains a similar “digital age of consent” provision. Though many other countries do not have similar laws, platforms often prevent users under age 13 from using their service in any country; being in strict compliance with a law in the critical U.S. and EU markets may be deemed more important than the marginal gains the platform may get from under-13 usage in other markets.
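The alignment described above can be illustrated with a minimal sketch. The constant name, function, and age threshold below are illustrative assumptions, not any particular platform's implementation; the point is that one global check, set to the strictest major-market rule, is applied in every country.

```python
from datetime import date

# Hypothetical global minimum age, aligned with the strictest major-market
# rules (COPPA in the U.S., the GDPR's digital age of consent in the EU).
GLOBAL_MINIMUM_AGE = 13

def meets_minimum_age(birth_date: date, today: date) -> bool:
    """Return True if the user is at least GLOBAL_MINIMUM_AGE years old.

    Applied uniformly in every market, so the platform never falls below
    the strictest requirement it has chosen to align with.
    """
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    return age >= GLOBAL_MINIMUM_AGE

# A 12-year-old is blocked everywhere, not only in the U.S. or EU.
print(meets_minimum_age(date(2012, 6, 1), date(2024, 1, 1)))  # False
```

Note that the check takes no country parameter at all: that absence is exactly what makes it a single global policy rather than a regional one.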
For smaller platforms, as well as those with a strong focus on particular markets or with particular ideological commitments, it may not be practical or desirable to align their policies with the legislation of every region, even if that means being blocked in those regions. In a recent example, local U.S. news organizations (whose users are mostly U.S. based) chose to block European users from accessing their content. The reason was that their sites would not have met all the standards of the EU General Data Protection Regulation (GDPR) without expensive redevelopment, and continuing to serve EU-based users would have risked large fines.
In cases where neither of these approaches is suitable, companies may invest in developing location-specific policies. These are policies that block or prohibit specific types of content or behavior, but only in specific locations. These policies, often referred to as geoblocking, can be applied in two ways:
- View blocking – where global users are free to post content that violates a local rule, but users in that location will not be able to see it.
- User blocking – where users from a given location are prevented from taking a particular action, but other users outside of the location are not.
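The distinction between the two modes above can be sketched in a few lines. The rule tables, labels, and country codes here are invented for illustration and not drawn from any real platform's policy:

```python
# Hypothetical geoblocking rules. View blocking maps a viewer's location
# to content labels hidden there; user blocking maps a user's location
# to actions barred there.
VIEW_BLOCKED = {"DE": {"nazi_symbols"}}           # hidden from viewers in DE
USER_BLOCKED = {"FR": {"post_political_ad"}}      # action barred for users in FR

def can_view(viewer_country: str, content_labels: set) -> bool:
    """View blocking: content stays up globally but is hidden locally."""
    return not (VIEW_BLOCKED.get(viewer_country, set()) & content_labels)

def can_act(user_country: str, action: str) -> bool:
    """User blocking: the action itself is barred for users in a location."""
    return action not in USER_BLOCKED.get(user_country, set())

print(can_view("DE", {"nazi_symbols"}))    # False: hidden in Germany
print(can_view("US", {"nazi_symbols"}))    # True: still visible elsewhere
print(can_act("FR", "post_political_ad"))  # False: barred for users in France
```

The key design difference is where the check runs: view blocking is evaluated at read time against the viewer's location, while user blocking is evaluated at write time against the actor's location.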
Regional Variations in Standards, Norms, and Cultural Sensitivities
Platforms that choose to enforce one global set of policies across all regions soon encounter one of trust and safety’s most inescapable challenges: users find all kinds of content objectionable, including some that will never realistically violate the platform’s policies. Sensitivities vary widely across and within every country in the world, and content and behavior that feels normal to one set of users may be considered unacceptable to others.
Because these broader cultural differences grow out of society as a whole, it is impossible for even the largest and most powerful platforms to truly resolve them through policy dictates and prohibitions alone. Instead, platforms must attempt to manage regional sensitivities through variations in the product itself, rather than relying on enforcement outcomes.
For example, users in more conservative countries may object to criticism of religion or depictions of women in revealing attire; instead of banning this content, the platform can mitigate the problem by dialing down the amount of this type of content in those users’ feeds and recommendations. Since individual people’s tastes and sensitivities vary widely even within countries, platforms can also build controls for users that help them adjust what they experience; Google’s “Safe Search” is a well known example of this user-specific control.
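A user-level control of the kind described above might look like the following sketch. The labels and function are hypothetical, in the spirit of a SafeSearch-style toggle: nothing is removed from the platform, the user's own setting simply filters what reaches their feed.

```python
# Hypothetical labels a platform might treat as regionally sensitive.
SENSITIVE_LABELS = {"revealing_attire", "religious_criticism"}

def filter_feed(items: list, hide_sensitive: bool) -> list:
    """Drop items carrying sensitive labels when the user opts out.

    The content itself is untouched; only this user's feed changes.
    """
    if not hide_sensitive:
        return items
    return [i for i in items if not (set(i["labels"]) & SENSITIVE_LABELS)]

feed = [
    {"id": 1, "labels": ["travel"]},
    {"id": 2, "labels": ["revealing_attire"]},
]
print([i["id"] for i in filter_feed(feed, hide_sensitive=True)])   # [1]
print([i["id"] for i in filter_feed(feed, hide_sensitive=False)])  # [1, 2]
```

A default value for `hide_sensitive` could then be set per region while still letting individual users override it, which is how product-level mitigation and user-level control can coexist.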
Not all such differences are related to objectionable content or behavior. For example, because municipal building codes vary drastically from one country to another, a vacation rental platform must take these local norms into account and may not require a ski chalet in Switzerland to have the same features as an overwater bungalow in the Maldives. A single global policy would be all but impossible to align to regulatory requirements and customer expectations in every country at once.
Location-based changes can also be put in place in response to specific events, either reacting to problems or acting preemptively to prevent problems from arising. One example of this is restrictions on political advertising, sometimes used to prevent the threat of election interference and misinformation on topics like voting rules and times. Rather than ban all such advertising in all locations indefinitely, some platforms create restrictions close to the dates of major elections in specific countries, where risks are greatest and where there may not be time to take enforcement action on issues before harm is done.
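A time-and-place restriction like the one described above reduces to a simple window check. The table of blackout windows and its dates below are purely illustrative assumptions:

```python
from datetime import date

# Hypothetical blackout windows: political ads are paused in a country
# only around its election date, not globally or indefinitely.
ELECTION_AD_BLACKOUTS = {
    "BR": (date(2022, 9, 25), date(2022, 10, 9)),  # illustrative dates
}

def political_ads_allowed(country: str, today: date) -> bool:
    """Allow political ads unless the country is inside a blackout window."""
    window = ELECTION_AD_BLACKOUTS.get(country)
    if window is None:
        return True
    start, end = window
    return not (start <= today <= end)

print(political_ads_allowed("BR", date(2022, 10, 1)))  # False: inside window
print(political_ads_allowed("BR", date(2022, 11, 1)))  # True: window has passed
print(political_ads_allowed("US", date(2022, 10, 1)))  # True: no window set
```

Because the restriction is keyed to both location and date, it expires automatically once the high-risk period ends, avoiding a permanent global ban.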
Region-specific policies can only be applied if a platform can reliably identify a user’s present country or region. There are several common methods platforms use to determine a user’s location:
- IP address data
- GPS data, particularly on mobile apps and services
- Self reporting (i.e., user has set their home country)
- Behavior (i.e., user posts or views content mostly associated with one country)
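One simple way to combine the signals listed above is a priority fallback: prefer the stronger signals and fall back to weaker ones when they are missing. The ordering and function here are a hypothetical sketch; real platforms weigh these signals differently and treat all of them as fallible.

```python
from typing import Optional

# Hypothetical priority order, strongest signal first.
SIGNAL_PRIORITY = ["gps", "ip", "self_reported", "behavioral"]

def resolve_country(signals: dict) -> Optional[str]:
    """Return the first available country signal in priority order.

    `signals` maps a source name to a country code, or None if that
    source produced nothing for this user.
    """
    for source in SIGNAL_PRIORITY:
        country = signals.get(source)
        if country:
            return country
    return None

print(resolve_country({"gps": None, "ip": "MX", "self_reported": "US"}))  # MX
print(resolve_country({"self_reported": "US"}))                          # US
```

Note how the two signals can disagree in the first call; which one should win is itself a policy decision, not just an engineering one.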
None of these methods are foolproof. VPNs and similar services often advertise the ability to get around region blocks and restrictions by making it appear as though the user is accessing the platform from another country. Some platforms allow users to self-select their location as a way to both comply with local law and give users more control over what content they can see.