
How a Norwegian Government Report Shows the Limits of CFIUS Data Reviews

Amid growing attention to data and national security threats from China, a recent Norwegian government report sheds light on the limits of a U.S. government process for tackling them: the Committee on Foreign Investment in the United States (CFIUS).

In 2019, CFIUS, which reviews foreign investments in sensitive U.S. companies for national security risks, initiated a review of Chinese company Beijing Kunlun Tech’s 2016 and 2018 investments in dating app Grindr. Beijing Kunlun Tech held a more than 98 percent ownership stake in Grindr. The committee’s logic was that Grindr’s user data, including information on sexual orientation, sexual behavior and health, was too sensitive to risk ending up in the hands of the Chinese government via the Chinese owner. CFIUS went through its review process and, though the specifics of that review are classified, it reportedly asked Beijing Kunlun Tech to sell Grindr in March 2019. The company did so in March 2020, selling the app to San Vicente Acquisition LLC, a U.S.-incorporated group of tech investors and entrepreneurs.

Yet a recent Norwegian government report on Grindr found that the application is sharing data with a range of third parties including data brokers—meaning data on the application’s users is traveling far beyond the bounds of just that company. This all raises the question: Is forcing the sale of a sensitive-data-holding company from a Chinese firm enough to mitigate national security risks when the data can still end up in that Chinese firm’s, or the Chinese government’s, hands?

This post uses Grindr as a case study of how CFIUS reviews of data security risks may be insufficient to fully limit the spread of sensitive data to foreign governments. It examines what happened in 2019, what the Norwegian report found, and how this fits into a broader context of reforming U.S. federal processes to identify and mitigate the spread of sensitive U.S. citizen data. To respond to this issue, we recommend that policymakers take a far more comprehensive view of data brokerage and data sharing when approaching data security risks to the United States and that they make CFIUS tools only one part of a broader U.S. policy toolbox.

The 2019 CFIUS Investigation Into Grindr

CFIUS may halt majority or minority investments by foreign persons into U.S. businesses if the transaction poses a risk to U.S. national security. Following a CFIUS investigation, the committee can recommend that the company in question sell its relevant stake in the U.S. business, and if the company does not comply, the president has the authority to permanently block the investment. Transactions under CFIUS’s purview include those resulting in a foreign person gaining access to “material nonpublic information,” asserting control over decisions of businesses relating to critical technologies or infrastructure, or gaining control over businesses that possess sensitive personal data on U.S. citizens.

The Foreign Investment Risk Review Modernization Act of 2018 (FIRRMA) was the first update to CFIUS authorities in more than a decade. It expanded the interagency body’s authority by explicitly bringing under review investments that relate to the development of cutting-edge technology or involve the risk of sensitive personal data transfers to foreign entities, although it did not define sensitive personal data. FIRRMA also gives CFIUS the authority to discriminate between foreign investors by country, reflecting the belief among lawmakers that firms from certain competitor nations, such as China, pose an outsize risk to U.S. national security.

It was under the aegis of these expanded powers that CFIUS took the step in 2019 of insisting that Beijing Kunlun Tech Co., Ltd. undo an already-completed acquisition of Grindr LLC that took place in two transactions between 2016 and 2018. Beijing Kunlun Tech first bought a 60 percent majority stake in 2016 for $93 million and completed the acquisition in 2018 for another $142 million.

While the specific decision criteria to block the transaction remain classified, Grindr’s collection of sensitive personal data raises serious national security concerns. Information including sexual orientation, HIV status, real-time location and dating habits is central to the use of the app. Grindr’s considerable commercial success—with 27 million users as of 2017—could make it a high-value target for foreign intelligence services, particularly as government employees and military personnel are likely to be among its millions of users. Grindr data has been used against U.S. politicians before: A user of the app outed Randy Boehning, a Republican member of the North Dakota House of Representatives, after he voted against a gay rights bill. Given Chinese legal requirements for Chinese Communist Party access to Chinese companies’ data—a 2017 intelligence law mandates that “any organization … shall support, assist, and cooperate with state intelligence work”—the U.S. government’s view was that the Kunlun acquisition of Grindr posed a significant national security risk. The Trump administration was also greatly concerned about the possibility of the Chinese government combining commercial data it acquired from companies with data it stole in the 2015 hack of the U.S. Office of Personnel Management.

The CFIUS decision to unwind the Grindr acquisition indicates that personal data has become a principal concern of the interagency body. (Even though President Trump’s executive order effectively banning TikTok had strong political drivers behind it, the U.S. government’s push to restructure TikTok’s ownership reflects a similar concern about access to personal data.) After the decision, Sens. Ed Markey and Richard Blumenthal said CFIUS “should continue to draw a line in the sand for future foreign acquisition of sensitive personal data.” However, even if a company is U.S. owned, its data can still be used to compromise American national security. The report discussed below describes the intricate web of Grindr’s data flows and shows the limitations of CFIUS as a tool to effectively mitigate foreign access to sensitive data on U.S. persons.

The 2021 Norwegian Government Report

In January, the Consumer Council of Norway (Forbrukerrådet), the country’s consumer protection agency, published a report detailing “how every time we use our phones, a large number of shadowy entities that are virtually unknown to consumers are receiving personal data about our interests, habits, and behaviour.” The research examined several different mobile applications, but of particular relevance to the 2019 CFIUS investigation are the report’s findings on Grindr. (Detailed technical documentation records how the council tested data flows from mobile apps on Android devices.) “The Grindr app,” the report states, “shares detailed user data with a very large number of third parties, including IP address, GPS location, age, and gender.” It was already known that Grindr, like many apps, collects highly sensitive information about its users, ranging from sexual preferences to HIV status, along with data about the device and the user’s activity. But revelations about the extent of its third-party data-sharing illustrate how the larger data broker and online ad network economy means sensitive U.S. citizen data can, and likely does, spread throughout the world, including possibly to foreign governments, even if the firm holding the data is entirely U.S. owned.

The report documents Grindr’s engagement with MoPub, a U.S.-incorporated advertising network bought by Twitter in 2013 and one of Grindr’s advertising partners. Grindr “communicates extensively” with MoPub, and the app contains MoPub’s software development kit, which “transmits [specific] information directly from [a user’s] device to MoPub’s servers.” Yet it hardly stops there. Grindr “sends a lot of user data”—the report listed data points such as GPS location, IP address, gender and age, though not sexual preferences—to a “large variety” of third parties including AdColony, AppNexus (Xandr), Bucksense, OpenX, PubNative and Smaato, all of which are ad exchanges or marketing companies of one form or another. In total, the Consumer Council observed Grindr communicating with 53 unique domains and 36 different advertising companies, with 11 parties receiving the user’s exact GPS location, four parties receiving the user’s IP and/or MAC address, and seven parties receiving “personal information about the end user, such as age or gender.”

Grindr’s interaction with the MoPub ad network is a telling example of how sensitive U.S. citizen data is often shared with entities far beyond the app that originally collected it. Grindr, according to the report, sends ad requests to MoPub (for example, to get back a user-tailored advertisement) several times per minute. The request itself includes the user’s exact GPS location, app name, gender and age. On the authors’ Android test device, Grindr’s request to MoPub also included the user’s Android advertising ID, which Google assigns to each Android user as a unique identifier for advertisers. Once MoPub receives the request from Grindr, it may then interact with any number of advertising networks to decide which advertisement is returned. And this second step is where many other third parties receive sensitive user information: AppNexus would receive the Android advertising ID and the user’s IP address; Aarki would also receive the user’s gender and year of birth; OpenX would receive, among other data points, a user’s GPS location, gender and associated keywords. Grindr is therefore sending data to MoPub, which then shares data with these and other entities. This same story unfolds multiple times over with other Grindr partners: Grindr shares data with Perfect365, a virtual makeup photo app, which itself communicates with many companies including Amazon, Google and mobile video ad platform Receptiv.
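To make that two-hop flow easier to picture, the sketch below is a minimal, hypothetical Python illustration of an ad-request fan-out. The class, field names and partner-to-field mapping are illustrative assumptions rather than MoPub’s actual API, Grindr’s code or the traffic the Consumer Council captured; only the categories of data (GPS location, IP address, advertising ID, gender, age) mirror what the report describes.

```python
# Hypothetical sketch of the two-hop ad-request fan-out described in the report.
# Field names, partner lists and the fields each partner receives are simplified
# assumptions; this is not MoPub's actual API or the captured traffic.

from dataclasses import dataclass, asdict
from typing import Dict, List, Tuple


@dataclass
class AdRequest:
    """Data a device might attach to a single ad request (per the report)."""
    app_name: str                 # identifies the requesting app
    advertising_id: str           # Android advertising ID, a per-device identifier
    ip_address: str               # user's IP address
    gps: Tuple[float, float]      # exact latitude and longitude
    gender: str
    age: int                      # the report cites year of birth for some partners


# Second hop: subsets of the request an ad network might pass to downstream
# ad exchanges and marketers during a single auction (names from the report,
# field assignments simplified for illustration).
DOWNSTREAM_PARTNERS: Dict[str, List[str]] = {
    "AppNexus": ["advertising_id", "ip_address"],
    "Aarki": ["advertising_id", "ip_address", "gender", "age"],
    "OpenX": ["gps", "gender"],
}


def forward_to_partners(request: AdRequest) -> Dict[str, dict]:
    """Simulate the ad network sharing slices of one request with each partner."""
    payload = asdict(request)
    return {
        partner: {field: payload[field] for field in fields}
        for partner, fields in DOWNSTREAM_PARTNERS.items()
    }


if __name__ == "__main__":
    # Illustrative values only.
    request = AdRequest(
        app_name="example-dating-app",
        advertising_id="38400000-8cf0-11bd-b23e-10b96e40000d",
        ip_address="203.0.113.7",
        gps=(59.9139, 10.7522),
        gender="male",
        age=34,
    )
    for partner, shared in forward_to_partners(request).items():
        print(f"{partner} receives: {shared}")
```

The structural point is that one request leaving the device can result in several companies, none of which the user ever chose to deal with, each holding a different slice of the same profile.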

The Consumer Council also found Grindr in communication with 18 third-party software development kits, including those made by AdColony, Braze/Appboy, Google Crashlytics, Facebook, Google Firebase, OpenX, MoPub and Tencent. Potential harms from these privacy violations abound, but the last of these companies, Tencent, a Chinese technology conglomerate, is what links this report back to the 2019 CFIUS investigation into Grindr and demands a fresh discussion of it. The forced sale of Grindr stopped Beijing Kunlun Tech from taking over the company’s technology, data and operations. Yet, just as one ownership change did not stop sensitive Grindr data on U.S. citizens from spreading to third parties, it certainly did not stop a large Chinese technology conglomerate’s software development kit from being hooked into the application. Without rendering any judgment on the security risks, or lack thereof, of Tencent’s software development kit, the finding highlights the narrowness of the 2019 CFIUS review of Grindr.
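The gap the Tencent finding exposes can also be stated in code. The toy Python sketch below contrasts what an ownership-focused review examines with what is actually compiled into an app; the jurisdiction labels, the pass/fail logic and the function names are illustrative assumptions, not a description of how CFIUS reviews transactions or how the Consumer Council audited Grindr.

```python
# Toy illustration only: the jurisdiction labels and review logic below are
# assumptions for the sake of argument, not how CFIUS actually evaluates deals.

# Post-divestiture ownership of the app's parent company.
APP_OWNER_COUNTRY = "US"

# A few SDK vendors named in the Consumer Council's findings, mapped to the
# home country of each vendor (simplified).
BUNDLED_SDKS = {
    "MoPub": "US",
    "Facebook": "US",
    "Google Firebase": "US",
    "OpenX": "US",
    "Tencent": "CN",
}


def ownership_review_passes(owner_country: str) -> bool:
    """Stand-in for an ownership-focused review: is the acquirer domestic?"""
    return owner_country == "US"


def embedded_foreign_sdks(sdks: dict) -> list:
    """What an ownership review does not look at: third-party code from
    foreign vendors that still ships inside the app and can receive data."""
    return [name for name, country in sdks.items() if country != "US"]


if __name__ == "__main__":
    print("Ownership review passes:", ownership_review_passes(APP_OWNER_COUNTRY))
    print("Foreign-vendor SDKs still embedded:", embedded_foreign_sdks(BUNDLED_SDKS))
```

However crude, the contrast captures the issue: divestiture changes the answer to the first check while leaving the second entirely untouched.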

How Policymakers Can Fill in the Gaps

Clearly, the fact that Grindr is a U.S. company does not prevent the sensitive data it collects from being widely shared with third parties. Nor should this case study be taken to mean that Grindr is an outlier in its heavy data-sharing with third parties: There is a massive and virtually unregulated economy of data brokerage in the United States, in which companies sell, license or otherwise share data on consumers with third parties. Rather, this case study underscores why policymakers must think far more broadly than company ownership when considering the risks of sensitive U.S. citizen data ending up in the hands of unethical firms, cybercriminals or even foreign governments. CFIUS forced Chinese company Beijing Kunlun Tech to sell Grindr ostensibly because the app’s information was too sensitive to risk ending up in the Chinese government’s hands. The sale may have reduced that risk to the United States, but it hardly closed off the many other vectors through which third parties, Beijing included, could still obtain the sensitive information collected by the Grindr application.

CFIUS is still a useful and important mechanism for addressing the national security risks associated with direct foreign access to sensitive U.S. citizen data. But policymakers must recognize that it needs to be complemented with other measures outside the body’s scope. These include Congress bolstering export control authorities for the executive branch to limit data transfers to certain foreign entities; Congress placing new restrictions on data brokerage in U.S. federal privacy law; and the executive branch better accounting for how third-party code, advertising networks and software development kits spread U.S. citizen data throughout the internet ecosystem. These actions will come with their own challenges. For instance, export controls on data flows abroad would need to define terms like “sensitive data” more precisely than FIRRMA does (the law contains no such definition), as well as determine whether limits on some kinds of data exports should take priority over others (for example, health data versus financial transaction data). But the fact remains that policies addressing the privacy and security risks of U.S. data leaks should build and leverage a toolkit much broader than just CFIUS.

The authors thank David Hoffman, Kenneth Rogerson and Harrison Grant of the Duke Privacy and Democracy Project for their feedback on an earlier draft of this post.