YouTube is once again facing backlash for a gruesome video on its platform.
The video, posted by YouTube star Logan Paul, showed the body of an apparent suicide victim in a Japanese forest. It was widely seen before Paul removed it. He has since apologized, saying he "intended to raise awareness for suicide and suicide prevention."
YouTube confirmed to CNN Tech that the video violated its policies, but declined to comment on why it was not taken down sooner.
"Our hearts go out to the family of the person featured in the video. YouTube prohibits violent or gory content posted in a shocking, sensational or disrespectful manner," a YouTube spokeswoman told CNN Tech. "If a video is graphic, it can only remain on the site when supported by appropriate educational or documentary information and in some cases it will be age-gated."
But the incident raises questions about why Google-owned YouTube didn't react sooner.
The platform relies on both human reviewers and technology, such as machine learning algorithms, to monitor for violent and graphic content.
"It's not okay to post violent or gory content that's primarily intended to be shocking, sensational, or gratuitous. If a video is particularly graphic or disturbing, it should be balanced with additional context and information," according to the company's policies.
Other violent content may be age restricted. In addition, YouTube bans videos related to terrorism.
YouTube is no stranger to questions about how it cracks down on content. Late last year, Google was criticized for inappropriate videos on YouTube Kids, the site's child-friendly platform. The videos included profanity and violence.
Videos for YouTube Kids are selected from the main platform by machine learning algorithms. However, some videos, such as cartoons that look age-appropriate but contain disturbing content, can fall through the cracks.
Google recently rolled out new policies to address the issue. For example, videos flagged as inappropriate for kids will have an age-restriction warning on the main YouTube app. Only users logged into the site who are over the age of 18 can see that content.
According to Erna Alfred Liousas, an analyst at research firm Forrester, YouTube continues to face these challenges in part because of its "vague" policies.
Paul's video "underscores the ambiguity that faces all of these platforms when it comes to how to appropriately moderate content," she said. "The issue isn't going away. It's not getting better. It's time to do more digging and investing in people and in the way the technology is surfacing this information."
Liousas said tech companies should be more transparent about their process for handling disturbing content, such as revealing the steps that happen after content is flagged and how they decide what gets taken down.
However, YouTube has also faced backlash for removing certain content, including its removal of videos from war-torn Syria. The move concerned advocates who were trying to maintain digital records of one of the worst humanitarian crises in the world.
Paul, who filmed the video in the Japanese forest, became famous through viral micro-videos on Vine, the Twitter-owned video service that was phased out last year. He now posts daily video blogs on YouTube, where he has more than 15 million subscribers.
CNN Tech's Sara Ashley O'Brien contributed to this report.