"My 10-year-old sweet daughter innocently searched for 'tap dance videos'," one parent wrote.
"Now she is in this spiral of... videos that give her horrible unsafe body-harming and body-image-damaging advice."
This is one of hundreds of accounts describing damage said to have been caused by YouTube's recommendation algorithm.
It's a phenomenon some refer to as "falling down the YouTube rabbit hole", in which users are directed to controversial and potentially dangerous content they might never have stumbled on otherwise.
The accounts have been gathered by Mozilla, the organisation best known for its Firefox web browser, which competes against Google's Chrome. The BBC was unable to corroborate the posts, as the foundation said they had been collected anonymously.
It's impossible to know if all the details are true. But Mozilla says it has shared a representative sample of the messages it received. And some read like horror stories.
"She is now restricting her eating and drinking," the parent continued.
"I heard her downstairs saying, 'Work to eat. Work to drink.'
"I don't know how I can undo the damage that's been done to her impressionable mind."
White supremacists
Mozilla asked the public to share their "YouTube regrets" - videos recommended to users of the video-sharing platform that led them down bizarre or dangerous paths.
"The hundreds of responses we received were frightening: users routinely report being recommended racism, conspiracies, and violence after watching innocuous content," said Ashley Boyd, Mozilla's vice-president of advocacy.
"After watching a YouTube video about Vikings, one user was recommended content about white supremacy.
"Another user who watched confidence-building videos by a drag queen was then inundated by clips of homophobic rants."
YouTube is the second most visited website in the world. Its recommendation engine drives 70% of total viewing time on the site, by tailoring suggestions to keep viewers watching.
Its owner Google has yet to comment on Mozilla's report.
But managers have previously denied suggestions that their algorithms deliberately promote extremist or harmful content because it boosts watch-time or benefits the business in some other way.
And they have added that YouTube has begun tackling videos that contain misinformation and conspiracy theories by showing "warning labels" and "knowledge panels" containing trustworthy information.
Even so, claims persist that its recommendations tend to lead users astray.
"We urge YouTube and all platforms to act with integrity, to listen to stories and experiences of users," said Lauren Seager-Smith, chief executive of children's protection charity Kidscape, which is not involved in Mozilla's campaign.
"[It needs] to reflect on when content may have caused harm - however inadvertently - and to prioritise system change that improves protection of children and those most at risk."
Fear and hate
Mozilla said it received more than 2,000 responses in five languages to its call.
It has published 28 of the anecdotes.
"My ex-wife, who has mental health problems, started watching conspiracy videos three years ago and believed every single one," recalled one contributor.
"YouTube just kept feeding her paranoia, fear and anxiety, one video after another."
"In coming out to myself and close friends as transgender, my biggest regret was turning to YouTube to hear the stories of other trans and queer people," one person wrote.
"Simply typing in the word 'transgender' brought up countless videos that were essentially describing my struggle as a mental illness and as something that shouldn't exist. YouTube reminded me why I hid in the closet for so many years."
The LGBT Foundation - a Manchester-based charity - called for YouTube and other social media companies to take more responsibility for the content promoted by their algorithms.
"Hateful content online is on the rise, and something that is of increasing concern," the foundation's Emma Meehan told the BBC.
"Social media giants have a responsibility for what is shared on their platforms and the real-world impact this may have, and need to work to take a more dedicated approach to combating hate online."
Research challenges
YouTube's recommendation system poses difficulties for outside researchers, because the company does not share its recommendation data.
Since each user is given different suggestions, it is hard to determine why some choices are made and how many others have had the same content promoted to them.
"By sharing these stories, we hope to increase pressure on YouTube to empower independent researchers and address its recommendation problem," Mozilla's Ashley Boyd said.
"While users should be able to view and publish the content they like, YouTube's algorithm shouldn't actively be pushing harmful content into the mainstream."
SOURCE: BBC