Social media companies need to "get serious" about asking for ID to prevent harmful content being shown to teenagers.
Speaking to KFM, David McNamara, Managing Director at CommSec Cyber Security, said social media companies need to act fast to stop young people being exposed to suicide-related content on their platforms.
He said people often share content without understanding the "consequences and how it’s hurting other people.”
He made an urgent call for TikTok to change its algorithm.
He said that, similar to banking apps, social media companies need to start asking for ID.
His comments follow reports on Primetime that children as young as 13 are creating and viewing suicidal content on TikTok.
TikTok is refusing to mention how many videos were taken down or restricted.
McNamara said TikTok's response is not good enough - "by a long shot."
Meanwhile, Facebook and Instagram are being investigated over concerns they lead to addictive behaviour in children.
The European Commission will look at systems which create so-called "rabbit hole" effects.
Meta, which owns both platforms, says it has a number of online tools to protect children.