YouTube Takes Steps to Combat Spam on Shorts by Disabling Links

In response to the growing issue of spam on its short-form video platform, YouTube Shorts, the company is implementing a significant change. Starting August 31st, links in the comments section, video descriptions, and vertical live feed of Shorts will no longer be clickable. The decision is aimed at scammers and spammers who use links to deceive users and direct them to fraudulent schemes.

The move comes as a preventive measure against potential threats posed by spammy links, which could lead users to harmful content such as malware, phishing attempts, and other fraudulent schemes.

While YouTube already employs systems and policies to detect and eliminate spammy links, the platform has decided to take a more drastic approach by disabling these links entirely. The change will be rolled out gradually, meaning that not all links will be disabled immediately on August 31st.

In addition to disabling clickable links, YouTube is also removing the clickable social media icons from desktop channel banners. These icons have been exploited by scammers to mislead users through deceptive links.

However, YouTube acknowledges that legitimate creators often need to include links, especially for purposes such as recommending products and brands to their followers. To address this concern, the platform is introducing alternative solutions for creators to include links safely.

Starting August 23rd, both mobile and desktop YouTube viewers will notice “prominent” clickable links near creators’ channel profiles, close to the ‘Subscribe’ button. Creators can use this space to link to websites, other social profiles, merchandise stores, and any other destinations that adhere to YouTube’s Community Guidelines.

Furthermore, creators who want to direct viewers to their long-form videos using links will still have the ability to do so. YouTube plans to introduce a safer method for these creators to guide Shorts viewers to their other YouTube content by the end of September.

These changes follow YouTube’s ongoing efforts to combat spam across its platform, including enhancements to the systems detecting impersonation channels. From Q1 2022 to Q1 2023, YouTube reported a more than 35% increase in removals and terminations related to impersonation.

The platform has also made improvements to its comment management system, resulting in a significant rise in comments being held for review by creators. Together, these measures aim to improve safety and the overall experience for viewers and creators across the platform.
