Is your AI biased?
Within the first three to six seconds of my presentation, the audience will already have made a decision, not only about my presentation but also about me. That is fine; it is human nature. What is not fine is failing to acknowledge these perceptions and biases, and the effect they have on our work. This often happens because we do not understand the extent of these biases.
AI and technology are only as good as their makers. We all act from a place of 'gut feeling,' and those instincts find their way into the code we write. We need to take conscious responsibility for the AI and the algorithms we introduce.
For example, 'Tay' was an innocent chatbot built by Microsoft and released on Twitter. A bot learns at a much faster rate than we can as humans; in Tay's case, it learned unfiltered jokes and began agreeing with anti-Semitic views.
HR Tech and AI
There is a perception that HR tech is innocent, that there is not much AI being built in this sector. However, some within the industry are already testing face-to-face interviews conducted by a bot. We are already training and deploying technology that, in many cases, may not be ethically designed. Hiring decisions are made collectively, using predictive analytics, systems and human decision-makers. So, when we add technology to this mix, it can become a big problem.
Bias begins with human input
Take Amazon as an example: its recruitment tool was found to be biased against women. The initial worry here is that the code had to start somewhere before the deep machine learning took over. In other words, there always has to be human input.
A study in the UK suggested that the majority of HR systems and bots are actually prejudiced against minorities and people of the same sex. This learning had to start somewhere, with an initial human preference that disadvantaged these groups.
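To see how a model can end up prejudiced without anyone coding the prejudice in, here is a minimal sketch in Python. The data, keywords and scoring scheme are entirely hypothetical; the point is only that a naive scorer trained on biased historical hiring decisions learns to penalise a gendered keyword, echoing the pattern reported in the Amazon case.

```python
from collections import defaultdict

# Hypothetical historical data: (resume keywords, was the person hired?)
# The past decisions themselves are biased: resumes mentioning a
# women's club were rejected regardless of skills.
history = [
    (["python", "leadership"], True),
    (["python", "womens_chess_club"], False),
    (["java", "leadership"], True),
    (["java", "womens_chess_club"], False),
]

# "Training": score each keyword by how often it co-occurred with a hire.
hires = defaultdict(int)
seen = defaultdict(int)
for keywords, hired in history:
    for kw in keywords:
        seen[kw] += 1
        hires[kw] += int(hired)

weights = {kw: hires[kw] / seen[kw] for kw in seen}

def score(keywords):
    # Average learned keyword weight. Nobody wrote "penalise women";
    # the model inferred it from the biased labels it was given.
    return sum(weights.get(kw, 0.5) for kw in keywords) / len(keywords)

print(score(["python", "leadership"]))         # high score: 0.75
print(score(["python", "womens_chess_club"]))  # low score: 0.25, bias reproduced
```

The mechanism is the same, only simpler, in real systems: the model faithfully compresses whatever preferences its training data contains.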
There are, of course, huge benefits to AI: it helps us streamline processes and eliminate some early biases. But the data we input determines whether the machine trains ethically, and that is something we all need to take responsibility for.
Create guidelines and checklists
I believe that, regardless of the size of your organisation, you need processes, guidelines and checklists to ensure you are adding the right kind of code, features and functionality. Keep testing your checklists and guidelines, make sure they work in practice, and keep training. Be patient with your machines before launching them into the market, because launch is when problems occur.
Also, measure your output and goals with your team. This will help everyone see how their perceptions and biases may affect code and technology.
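One way to turn a checklist item into something measurable is a simple screening heuristic. The sketch below implements the "four-fifths rule" often used as a rough adverse-impact check: flag any group whose selection rate falls below 80% of the highest group's rate. The group names and rates are hypothetical, and a flag is a prompt for investigation, not a legal verdict.

```python
def adverse_impact_flags(selection_rates, threshold=0.8):
    """Return {group: ratio} for groups whose selection rate is below
    `threshold` times the best-performing group's rate."""
    best = max(selection_rates.values())
    return {group: rate / best
            for group, rate in selection_rates.items()
            if rate / best < threshold}

# Hypothetical selection rates produced by an AI screening tool:
rates = {"group_a": 0.40, "group_b": 0.28, "group_c": 0.38}
print(adverse_impact_flags(rates))  # flags group_b (ratio 0.7 < 0.8)
```

A check like this can run automatically on every batch of screening results, so the team sees disparities as they emerge rather than after launch.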
Research conducted by ADP showed that organisations that recruited without bias were 45% more likely to grow their market share, retained 42% more applicants, and were 33% more profitable. This shows the effect bias can have on the worth of your IP and on consumer trust and loyalty.
Finally, we all need to be as open-minded as possible. Let's be critical of ourselves and our teams and study our biases; this will help us with continual learning and development.