DIGITAL LIFE
TikTok Algorithms Encourage Unhealthy Eating Behavior
The Wall Street Journal (WSJ) investigated the potential dangers of TikTok's recommendation algorithms and found that some users were being served content promoting unhealthy eating behaviors. After the publication asked TikTok for comment, the service announced its intention to change these algorithms.
The WSJ investigation began with stories of teenagers who, reportedly because of TikTok, became interested in weight loss, detoxes and potentially dangerous diets. To test the service's potential danger, journalists ran an experiment: they created over 100 TikTok accounts that "watched" videos with minimal human involvement. Twelve of these accounts were registered as belonging to 13-year-olds and "watched" videos about weight loss, alcohol and gambling. When one of the bots stopped engaging with gambling videos and switched to the topic of weight loss, the TikTok algorithm quickly picked up on the new "preference" and began offering material on that topic instead.
Based on the results of the experiment, it was calculated that the bots viewed a total of 225,000 videos, of which 32,700 were related to weight loss according to their descriptions or other metadata. In addition, 11,615 videos had description keywords matching the topic of eating disorders, and 4,402 descriptions matched the normalization of such disorders. In some cases the keywords were deliberately misspelled to avoid TikTok's moderation. When the WSJ informed the service's management about 2,960 potentially dangerous videos, 1,778 of them were promptly removed, though it is unclear whether this was done by their authors or by the platform. After the WSJ contacted TikTok management for comment ahead of publication, the service officially announced its intention to adjust its recommendation algorithms.
The platform noted that watching large quantities of videos on certain topics, in particular weight loss and physical fitness, is not always healthy. Its specialists are now working to limit how the recommendation algorithm promotes content that does not violate the terms of service but can be harmful when consumed in excess. In addition, the company is testing a tool that will let users exclude certain keywords or hashtags from their recommendations.
Image source: antonbe/Pixabay