
Why AI Is Detaining Sex Workers at the Border—and You May Be Next

FACE OFF

Facial ID is being used to detain and ban sex workers from entering the country—and it’s a harbinger of a rapidly encroaching surveillance state fueled by artificial intelligence.

Photo illustration of a 3D facial map of a woman with red lips
Photo Illustration by Elizabeth Brockway/The Daily Beast/Getty

Sydney knew better than to take her cellphone across the border. After years of hearing rumors of friends and colleagues having their phones seized by border patrol, she chose to leave both of hers—a personal line and a separate work phone—at home when she traveled to the U.S. from her home in Canada. She and her wife regularly flew to Massachusetts and back without incident to visit her in-laws.

Leaving her devices behind is one of many precautions Sydney—whose name has been changed to protect her identity—takes as a Canadian full-service sex worker, the industry term for what’s more commonly known as an escort or a prostitute. Although escorting is technically legal in Canada under the Nordic Model—the Swedish structure that legalizes selling sex but criminalizes paying for it, which was adopted in Canada in 2014—in practice, legalization hasn’t made a dent in police violence against and surveillance of sex workers. Sydney also chooses to be face-in, meaning she sacrifices potential clients for her privacy and hides her face in advertisements.

Her precautions weren’t enough. Sydney and her wife landed in Massachusetts for a week at the beach in Aug. 2014, two months after Massport installed self-service kiosks at customs at Boston’s Logan Airport. When Sydney scanned her passport and looked at her reflection on the kiosk screen, time stopped as a big X appeared over her face. She wasn’t going to make it to the beach after all.


An armed officer took Sydney and her wife to secondary questioning and immediately asked for their phones. (While both of Sydney’s were home, the officer made Sydney’s wife unlock hers.) After her wife was dismissed to continue on to the U.S., the officer took Sydney into a separate, freezing room for interrogation. A pair of latex gloves lay on the desk.

About a year earlier, in Sept. 2013, Sydney’s friend Naomi—a face-out sex worker, whose name has been changed to protect her identity—was on her way to New York City when she was detained at preclearance at Toronto Pearson Airport. “I guess I’ll see you on the other side,” she told her friends who went on ahead, even though she knew she wouldn’t.

After handing Naomi a 10-year travel ban, border patrol agents combed through her flight manifests to find potential associates. They then compared those women with the “duo” colleagues Naomi listed in her advertisements—other sex workers, including Sydney, with whom Naomi offered double sessions with clients. Naomi and Sydney had traveled to the U.S. together twice before. Even without showing her face, border agents connected the dots. Sydney received a five-year ban and hasn’t tried to leave the country since.

Photographs of Naomi
Courtesy of Naomi

Borders have always been a source of terror for sex workers: The Page Act of 1875, the first piece of immigration legislation in the U.S., prohibited East Asian women from entering the country under the assumption that they were traveling “for the purposes of prostitution.” But there’s been an uptick this year of sex workers’ being denied entry to countries across the globe such as the U.S., Canada, England, and Singapore. The general wisdom (and likely culprit for many) is that facial recognition and data sharing are to blame.

While it might be easy to disregard this phenomenon, the truth is that it’s a harbinger of an encroaching surveillance state that threatens everyone’s lives—not just sex workers’.

As a sex worker, I try my best to opt out of facial recognition technology whenever I travel, but I was taken by surprise later this summer when I arrived at Kennedy Airport on a flight from London. Exhausted and irritable, I removed my glasses when the agent asked, eager to get through customs and go home. “Welcome back, Olivia.” He smiled as I handed him my passport. (Note: like most sex workers, including those quoted in this article, I use a pseudonym. The agent addressed me by my government name, which I keep private for obvious reasons.)

Due to my work, I fully expect to be harassed by immigration when I visit a foreign country. As an American citizen, however, I find it easy to forget that the U.S. has an eye on me too. It has an eye on all of us, and has for years, as it built an extensive database of travelers and our faces from passport kiosks and security footage, its accuracy tweaked and refined on its test subjects.

In Feb. 2023, a sex worker named Hex was denied entry to the United States for “prostitution.” Hex’s story was familiar in many ways, from the surveillance to the assumption that every sex worker, including “legal” sex workers like adult-content creators and strippers, is a “prostitute.” Hex creates virtual-reality content and appears as a 3D avatar, making it unlikely that border patrol matched her face.

A dozen sex workers—mostly full-service sex workers, but also some adult-content creators—who had been detained at the United States and Canadian borders over the past decade spoke to The Daily Beast about their experiences. What I expected to hear were stories of facial-recognition technology (FRT) linking face-out sex workers to their advertisements. Instead, every sex worker to whom I spoke shared stories of surveillance that was analog, not algorithmic, and largely indistinguishable from stalking and run-of-the-mill sexual harassment.

All Artificial, No Intelligence

Like Sydney, Jack—a transmasculine non-binary sex worker whose last name has been omitted to protect their identity—knew not to bring the phone he used for work, which he left behind in England, when he traveled to Vancouver to meet a client. He also knew to remove any and all apps from his personal phone that might connect him to his sex-work persona.

Jack told The Daily Beast that he’s not sure if he was flagged by facial recognition, another form of surveillance, or simply a bored Canada Border Services Agency (CBSA) agent—but he suspects it was the latter. He doubts that the agent knew he was a sex worker, but he had enough of an online presence that agents could have pulled up one of his profiles. CBSA perceived a young, single woman who, Jack conceded, was dressed “somewhat revealing,” and assumed the worst. He was detained for about seven hours before CBSA let him pass through customs.

Photograph of Jack
Courtesy of Jack

This wasn’t the first time Jack had been detained. The previous year, at Orlando International Airport, Jack was pulled aside at customs, where an agent asked him each of the ESTA visa-waiver application questions, lingering on the question “Are you coming to the United States to engage in prostitution?” Jack made it through after about half an hour—he had an all-inclusive Disney World package as an alibi—and when I asked what might have set off the agent’s alarm bells, his only guess was that he was, put plainly, “dressed like a whore.”

This scenario is likely familiar to most in-person sex workers. Facial recognition has been a primary concern since before surveillance technologies accelerated the threat. Beyond “anti-trafficking” training and directives to scrutinize women traveling alone, hotels have a long informal history of searching for sex workers’ online advertisements, printing them out, posting them behind the concierge desk, and comparing them against each guest—meaning facial recognition, whether algorithmic or analog, relies on everyone’s data to compare against ours.

Facial recognition has been a regular part of our lives for decades; only recently has it been streamlined by AI or, more accurately, the intricate networks of data, including our faces, woven together by algorithms and surveillance technologies. Despite the term “artificial intelligence,” such technologies are developed on the backs of human workers who label data and curate datasets in ways that make the technology usable.

But a database is only a small part of the picture. Even if facial-recognition technology can accurately identify photos of an individual, a human component is still necessary to confirm that the match was accurate. Sifting through the millions of international tourists to the United States—a figure that reached over 50 million in 2022—is far too massive a dataset to tackle. Sex workers, on the other hand, are a relatively small portion of the population who work to conceal their identities while self-identifying to some extent on ad sites—and we’re a demographic that state officials and law enforcement don’t feel guilty humiliating.
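
To make the scale problem above concrete, here is a minimal, hypothetical sketch (the traveler names, toy three-number “embeddings,” and the 0.8 cutoff are all invented for illustration) of how a one-to-many face-matching system narrows a gallery to a short list that a human reviewer still has to confirm:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def candidates_for_review(probe, gallery, threshold=0.8):
    """Return gallery entries similar enough to the probe image to warrant
    human confirmation. The algorithm only ranks candidates; a person
    still decides whether any match is real."""
    hits = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
    return sorted(
        [(name, score) for name, score in hits if score >= threshold],
        key=lambda pair: pair[1],
        reverse=True,
    )

# Toy 3-dimensional vectors stand in for the hundreds of dimensions a
# real face-recognition model would produce.
gallery = {
    "traveler_a": [0.9, 0.1, 0.2],
    "traveler_b": [0.1, 0.9, 0.3],
    "traveler_c": [0.88, 0.12, 0.25],
}
probe = [0.91, 0.11, 0.21]
print(candidates_for_review(probe, gallery))
```

Against a gallery of millions, even a high threshold leaves a review queue too long to clear; against a small, pre-targeted population, it becomes trivially short, which is the asymmetry the paragraph above describes.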

No sex worker is completely anonymous. Government ID is required by all major adult platforms—from advertising boards to fansites—to ensure that users are over the age of 18. Some, like the escort platform Eros, which was raided by the Department of Homeland Security (DHS) in 2017, have been rumored to share user data with the federal government. Others, like porn platforms OnlyFans and Fansly, require self-disclosure for tax purposes. In-person indoor sex workers—those who work in brothels, dungeons, massage parlors, or strip clubs—have to provide ID to work in a house.

Even street-based full-service sex workers who manage to avoid a paper trail aren’t safe from surveillance. As sex worker and activist Liara Roux told The New Statesman, “unless you’re taking extreme measures—and even then, they can run facial recognition on traffic cameras—there’s nowhere to hide.”

That said, sex-worker surveillance isn’t limited to facial recognition. As with any marginalized community, sex workers are stars in a constellation, forming a relational network like a celestial map legible to researchers, law enforcement, and surveillance technology. If, for example, your mobile device is regularly in close physical proximity to a handful of others, some of which have been identified as belonging to sex workers, then it’s more likely than not that you’re also a sex worker.
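
The guilt-by-proximity logic described above can be sketched in a few lines. Everything here (the device IDs, the sighting data, the two-meeting cutoff) is invented for illustration, not drawn from any real system:

```python
from collections import defaultdict
from itertools import combinations

def co_location_counts(sightings):
    """Count how often pairs of devices appear in the same place at the
    same time. `sightings` maps a (place, hour) key to the set of device
    IDs observed there."""
    counts = defaultdict(int)
    for devices in sightings.values():
        for a, b in combinations(sorted(devices), 2):
            counts[(a, b)] += 1
    return counts

def likely_associates(counts, target, min_meetings=2):
    """Devices repeatedly co-located with `target`: the kind of relational
    inference that can just as easily mislabel a roommate or a neighbor."""
    out = []
    for (a, b), n in counts.items():
        if n >= min_meetings and target in (a, b):
            out.append(b if a == target else a)
    return sorted(out)

# Invented example: device_x is flagged; device_y shares two sightings.
sightings = {
    ("hotel_lobby", "19:00"): {"device_x", "device_y"},
    ("hotel_lobby", "23:00"): {"device_x", "device_y", "device_z"},
    ("cafe", "09:00"): {"device_z"},
}
print(likely_associates(co_location_counts(sightings), "device_x"))
# → ['device_y']
```

The point of the sketch is how little it takes: no faces, no names, just repeated co-presence, which is why this kind of inference sweeps in anyone whose phone keeps the wrong company.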

Such evidence is more likely to contribute to algorithmic bans by social media platforms or payment providers than to action by the state—for now, at least. Geolocation has already been used by law enforcement to indict an Idaho teenager for traveling out of state for an abortion, and as these technologies become more insidious, the data they collect become more valuable—and more dangerous.

Still, curating data requires human intervention, as does training any algorithm to perform surveillance on its own. And as was the case for three other sex workers The Daily Beast spoke to—all detained at the land border outside Buffalo—what better way to spend your shift as a border agent, especially in smaller, more rural areas, than perusing escort ads?

The Unsafe Surveillance State

Naomi believes she’s been profiled by the DHS for a few years now—but not because of sex work. She visited Pakistan around 2010.

“Before that I had never been stopped at the U.S. border for anything, but as soon as I had that fucking stamp on my passport, I got stopped every single time for it,” she said. “That was when they really started asking me more questions at the border.”

In the years after 9/11, the DHS was not particularly shy about its heavy surveillance of Middle Eastern and Muslim travelers. In 2003, however, it quietly launched a program called ADVISE [Analysis, Dissemination, Visualization, Insight, and Semantic Enhancement] to develop mass surveillance technology. Rather than simply collecting data like location, ADVISE would “identify critical patterns in data that illuminate their motives and intentions.”

The same 2000s fervor that justified the PATRIOT Act made such intrusions into privacy an individual sacrifice for national security. But as critics noted then, “even well-intentioned counterterrorism programs could experience mission creep, having their purview expanded to include non-terrorists—or even political opponents or groups.” The program was eventually suspended due to privacy concerns and the availability of less-expensive commercial products.

ADVISE was like a lightning flash, illuminating the state’s motives and intentions as much as those of surveilled citizens: to identify networks between and within people and communities. And that, Naomi says, is precisely what DHS proceeded to do.

“At the time that I was banned, literally everyone I know in the business—in Toronto, some in Vancouver, one girl I knew in Calgary—we all started getting stopped,” she explained. “Everyone got five-year bans. It was in succession.”

Sydney’s still baffled that her movement has been so severely restricted by what appeared to be a handful of bored officers comparing names and passport photos with flight manifests and escort ads, for seemingly no reason other than that they can. She refused to sign a form confessing to her charges, which she said was sloppily thrown together. “They made so many errors that they cut and pasted things from other people’s files… other city names obviously from someone else’s documents.”

For Sydney, and the countless other sex workers who have been detained at the border and banned from legally entering the country, these actions are more than just overzealous and unjust punishments—they’re also hamstringing their basic rights. No longer are they free to travel and move about as they please. Instead, they’re forced to grapple with a Kafkaesque bureaucracy that is dead set on limiting their freedom using emerging technologies that will just as likely be turned on everyone else.

“Every time someone asks me about a Broadway show, or if I’ve been to Dollywood yet, it’s a knife in the heart,” Sydney added, through tears. “And all of this, just because I see men in nice hotel rooms, or a woman cooks me a nice dinner? I’m such a threat to national security? Good job, everyone. You’ve really made us safer.”
