Are Mental Health Initiatives in the Workplace Making a Difference?

I’ve noticed an increasing number of conversations about mental health in the workplace lately. It seems like companies are starting to recognize the importance of mental wellness. But I can’t help but wonder: is this attention really leading to positive changes for employees? For example, when a company introduces new mental health days or resources, do employees actually feel more supported, or is it just a superficial gesture?

In my experience, working in an environment that actively promotes mental health has created a more open atmosphere, one where we can discuss our struggles without fear of judgment. Still, some coworkers remain skeptical that these initiatives make any real difference. Are companies just going through the motions?

What about you? How have mental health initiatives affected your workplace? Do the announcements lead to real action, or do they feel more like public relations efforts?