The resource is provided to the majority case where the woman is the victim. It tries to help a demographic that is statistically more in need.
Everyone should receive help, as also stated in the comment, but to fault the algorithm for not providing the resource to the minority case can be compared to asserting that “all lives matter” in a police brutality context. They do, but white people aren’t as often victimized so you can’t fault someone for focusing their support on the black community.
I’m for providing support for male victims of domestic abuse, just as I’m for supporting white victims of police brutality, but you shouldn’t get worked up over people prioritizing help for the demographic that needs it the most.
You are mistaken that the discrepancy is the result of algorithmic bias. The latter image depicts a custom, hard-coded result that appears when one of a preselected set of queries is searched. It was added as part of an anti-domestic-violence drive. The trouble is, adding a copy of the selected queries with substituted gendered language (e.g., replacing “husband” with “wife”, “man” with “woman”, etc.) would have taken all of ten minutes. It’s not surprising that most people are unsympathetic to this excuse.
Except one is “most cases are like this, so let’s help them”, and the other is “most cases are like this, so let’s hurt them”
It isn’t though? The post is advocating that everyone should receive help, while the comment is trying to justify the way it currently is.