Social media companies need to "get serious" about asking for ID to prevent harmful content being shown to teenagers.
Speaking to KFM, David McNamara, Managing Director at CommSec Cyber Security, said social media companies need to act fast to stop young people being exposed to suicide-related content on their platforms.
He said people often share content without understanding the "consequences and how it’s hurting other people.”
He made an urgent call for TikTok to change its algorithm.
He said that, similar to banking apps, social media companies need to start asking for ID.
His comments follow reports on Primetime that children as young as 13 are creating and viewing suicidal content on TikTok.
TikTok is refusing to mention how many videos were taken down or restricted.
McNamara said TikTok's response is not good enough - "by a long shot."
Meanwhile, Facebook and Instagram are being investigated over concerns they encourage addictive behaviour in children.
The European Commission will look at systems which create so-called "rabbit hole" effects.
Meta, which owns both platforms, says it has a number of online tools to protect children.
