Anon 02/16/26, 19:34 No.8832957
Japan is, in general, ironically bad about health. That is to say, on paper they care about it more, with more doctor visits on average, nationalized healthcare, and so on, but in practice the benefits are undercut by their work culture. People routinely overwork themselves, and many health problems, especially stress-induced ones or mental health issues, tend to get overlooked. So people keep working when they shouldn't, and they end up making their conditions worse.