One of the most powerful ad-buying firms has been warning its clients about the risks of their corporate marketing inadvertently showing up next to pornography on Snapchat.
GroupM, which oversees billions of digital-advertising dollars as the media investing arm of the ad giant WPP, recently sent clients a memo about their ads possibly running before or after “explicit adult content” on Snapchat’s Stories feature, if a user chooses to follow such accounts on the app. Stories are a collection of videos and images uploaded by users that disappear after 24 hours.
GroupM sent the memo after hearing from an advertiser last week that was distressed about its ad’s placement either before or after a pornographic image, said Rob Norman, the firm’s chief digital officer. While the ads do not appear on the same screen as the explicit images, Mr. Norman said that GroupM has been advising clients concerned about the issue to stick to Snapchat’s curated news service, Discover, or its filters and selfie lenses.
Snapchat lenses, which people often use to add virtual masks, such as dog ears, to their faces, can also be purchased by advertisers, as Taco Bell did to great effect last year with a lens that turned people’s heads into tacos.
This week, however, Mr. Norman said, “A note came to us and said, now we have a situation where one of our lenses we paid for on Snapchat has been appropriated by a porn star, and the porn star has used our lens.”
The lens was not used in an explicit manner, but the brand was concerned that it could be, according to Mr. Norman. He added that GroupM has been having similar conversations recently about risks on YouTube and other platforms, noting that the management at Snap Inc., the newly public owner of Snapchat, has been “very responsive” to its concerns.
“We work hard to remove explicit content and protect our community and partners,” Snap said in a statement on Friday. “We will continue to invest in improving our tools and processes to ensure a positive Snapchat experience.” The company says it is currently testing an in-app reporting tool in Australia that may help with a variety of guideline violations.
Advertisers have been focused recently on where their content is showing up online, as the rise in user-generated content on social platforms has led to reports of brands inadvertently funding or being associated with terrorism, anti-Semitic sentiments, fake news and, most recently, sexualized images of children on Facebook.
While Snapchat’s guidelines prohibit accounts from using public Stories “to distribute sexually explicit content,” the prohibition is not foolproof. The platform is also less mature than some of its rivals in developing tools for policing content, Mr. Norman said.
Snap made almost all of its roughly $400 million in revenue last year from advertising. It has sought to distance itself from Snapchat’s early reputation as an app for “sexting,” a perception it acknowledged in its recent filing to go public.
Despite GroupM’s warning, it is difficult to quantify the risk for brands that their ads will show up near pornography on Snapchat. Ad Age reported this week that it had seen ads for household brands appear before and after Snapchat Stories that include nude videos. On Instagram, a search for the hashtag #snapchat shows only a selection of “top posts” and hides an apparent 38 million “recent posts” because of community guideline violations. Reddit has hosted a number of forums devoted to sexual imagery or accounts on Snapchat, often with names like “DirtySnapchat” and “NSFW_Snapchat.” (“N.S.F.W.” means “not safe for work.”)
“We now live in an age, having lived through generations of highly curated media like The New York Times and ABC and NBC, to a world of less curated media or not curated media, and advertisers need to learn what their tolerance for risk is,” Mr. Norman said. “If it’s zero, it leads you down a certain path, and if it’s not there, it leads you down a different path.”
The spotlight on social platforms like YouTube and Facebook will not lessen any time soon, Mr. Norman said. He added, “There’s an expectation from advertisers and regulators and consumers that these companies, which are fairly rich in revenue and certainly rich in profile and total number of users, should make it an extremely high investment priority to keep their platforms safe.”