Silver lining????
I guess the equality is better for women...
The thing I don't see addressed is the automated filtering done by many companies. I know my company filters out applicants with software, and I wonder if many of the companies the researcher applied to use the same software. It's possible the researcher set up the data in a way that inadvertently biased the results.
Did I miss that?
One of the possible sources of error is definitely that, since companies automate processing, the effect could be a flaw in the software rather than evidence of human bias. That said, it's hard to see what feature varying with race would trigger that computer response, and if that were the case it could still impact real-life individuals, so it would be a pretty notable finding in its own right.
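To make the software-bias point concrete, here's a minimal hypothetical sketch (not the actual software mentioned in the thread, and the zip-code rule is an invented example): a filter that never looks at race can still produce racially disparate outcomes if it screens on a proxy feature that correlates with race, such as geography.

```python
# Hypothetical sketch: a naive applicant filter keyed on a proxy
# feature (zip code). No explicit racial rule exists, but if zip
# codes correlate with race, outcomes can still be disparate.
def passes_filter(applicant, allowed_zips):
    """Reject anyone whose zip code isn't on an 'approved' list."""
    return applicant["zip"] in allowed_zips

applicants = [
    {"name": "A", "zip": "10001"},
    {"name": "B", "zip": "60644"},
    {"name": "C", "zip": "10001"},
]
allowed = {"10001"}

passed = [a["name"] for a in applicants if passes_filter(a, allowed)]
# B is screened out purely on the geographic proxy, with no human
# ever seeing the application.
```

The point is just that "the software did it, not a human" doesn't rule out bias; it relocates it into whatever features the filter keys on.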