Tech company says its AI can forecast crime through social media policing


A tech company that claims to use artificial intelligence to anticipate crimes is embroiled in a privacy dispute with Meta, formerly known as Facebook, over access to the social media network.

A growing number of law enforcement agencies in the United States and around the world have contracts with Voyager Labs, including two of the biggest police departments in the country: the New York City and Los Angeles police departments.

According to records obtained by the Surveillance Technology Oversight Project (STOP), the New York Police Department signed a nearly $9 million contract in 2018 with Voyager Labs, which asserts that it can use AI to anticipate crimes.

The tech business describes itself as a “world leader” in AI-based analytics investigations that can sift through mountains of data from the dark web and social media to offer insight, identify potential dangers, and foresee future crimes.

But Meta says in a federal lawsuit that Voyager Labs created at least 55,000 fake accounts on Facebook and Instagram to collect personal data “to uncover … behavior patterns,” “infer human behavior” and “build a comprehensive presence” on their target(s).

That figure includes 17,000 fake accounts created even after Meta had revoked Voyager Labs’ access; Meta filed the federal lawsuit on Jan. 12.

Essentially, Voyager Labs can use someone’s social media history to retrace anyone’s steps and potentially predict their next movements, according to Meta.


An NYPD spokesperson told The Guardian that “offenders” increasingly “utilize social media in furtherance of their unlawful activities.”

“Voyager assists the NYPD in preventing victimization and apprehending these offenders,” the NYPD spokesperson said, adding that the department does not use “features that would be described as predictive of future criminality.”

William Colston, a spokesperson for Voyager Labs, said he could not discuss specific cases of how and when clients used the company’s AI, but said the firm is “very proud” of its role in busting child trafficking rings and combating terrorism.

Meanwhile, STOP, a privacy advocacy nonprofit, described Voyager Labs’ tactics as “a new digital form of stop-and-frisk” that targets Black and Latino New Yorkers, according to the group’s communications director, Will Owen.

“This is invasive, it’s alarming, and it should be illegal,” Owen said in a Sept. 8 statement. “Our constitution requires law enforcement to get a warrant prior to searching the public; but increasingly police and prosecutors just buy our data instead.”

“This isn’t just bad policing, it’s not just enabling companies that steal our data, but it’s a flagrant end-run around the Constitution.”
