The portrayal of mental illness in Hollywood films and TV shows doesn't sit well with me. Mental health is a serious issue, and the way it is portrayed can be stigmatising or simply wrong. For example, suggesting that mental illness leads to violent behaviour is grossly inaccurate and stigmatising. I also feel the importance of mental health isn't really emphasised. What do you think?