

Facebook appears to be testing a new tool that prompts users to comment on live video streams - including those involving sensitive situations like shootings and sexual assault - using suggested text and emojis.

On Monday, a handful of Facebook users noticed that the social media platform was offering them preset responses for live videos about a series of news stories. On one stream for MSNBC about an ongoing, officer-involved shooting at a Chicago hospital, NBCUniversal contractor Stephanie Haberman noticed Facebook was prompting her to comment with phrases like "this is so sad" and "so sorry," along with emojis including the prayer hands.

"So I'm just noticing that Facebook has a thoughts and prayers autoresponder on our Chicago Hospital shooting livestream and I have thoughts," Haberman tweeted along with photos of the suggested responses from Facebook.

In testing the product, a BuzzFeed News reporter only had to click a suggested response once for it to appear in the comment feed of a given live video. Once one response was selected, the prompted comment menu disappeared as an option.

While autoreply prompts are not an entirely new concept for Silicon Valley products - Google's Gmail recently unveiled a pre-populated response tool called "Smart Reply," and Instagram sometimes suggests emoji responses - this appears to be the first time Facebook has tested the tool on live video, where content can be sensitive, unpredictable, and sometimes depicts violence. And while Facebook seems to be aiming to improve engagement on live video, critics have called the prompts insensitive and further evidence that the company has not thought through the human impact or consequences of its products.
