New Methods for Assessing a Person’s Competencies and Potential: Challenges and Opportunities

D-Teck is at the leading edge of new technology for performing psychometric assessments and fostering innovation in general. This is why our experts work regularly with specialists in a number of leading-edge sectors, including university research, to anticipate tomorrow’s challenges today.

As an organizational psychologist with D-Teck specializing in innovation, I have co-written an article on new methods for assessing a person’s potential with Philippe Longpré, an assistant professor at the Université de Sherbrooke, and Francine Roy, an associate professor at the same institution. Our work was published in the Ordre des psychologues du Québec’s magazine.

This second part of our article focuses on the challenges that arise with this new technology, as well as the opportunities the new tools can provide. The first part of our article is available here.

Opportunities and challenges

These new practices create opportunities for advances in potential and competency assessment, but they also carry risks that researchers and practitioners need to consider and, where possible, mitigate. Although practitioners are always looking for new ways to improve the accuracy of potential and competency assessments, the predictive value of these assessments has stagnated for several decades and remains moderate (Morris, Daisley, Wheeler and Boyer, 2015). It is quite possible that these new methods will provide an opportunity to increase the predictive value of competency assessments. In fact, research shows that using a combination of methods increases the predictive value of potential assessments (Schmidt and Hunter, 1998). That is not to say that new approaches will replace traditional methods. In a few years' time, potential and competency assessments may combine information gleaned from social media and serious games with that obtained through more traditional methods, such as self-reported measures and interviews, allowing for major incremental gains in predicting human behaviour within an organization.
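The incremental gain from combining methods can be illustrated with the standard two-predictor multiple correlation formula. The figures below (validities of .50 and .30, predictor intercorrelation of .20) are hypothetical, chosen only to show the mechanics: the less two methods overlap, the more the combination adds.

```python
import math

def multiple_r(r1: float, r2: float, r12: float) -> float:
    """Multiple correlation of a criterion with two predictors,
    given each predictor's validity (r1, r2) and the predictors'
    intercorrelation (r12)."""
    r_squared = (r1**2 + r2**2 - 2 * r1 * r2 * r12) / (1 - r12**2)
    return math.sqrt(r_squared)

# Hypothetical figures: a traditional method with validity .50,
# a newer method with validity .30, and an intercorrelation of .20.
r_combined = multiple_r(0.50, 0.30, 0.20)
print(f"validity of the traditional method alone: 0.500")
print(f"validity of the combination: {r_combined:.3f}")  # ≈ 0.540
```

With these made-up numbers, adding the second method raises the validity from .50 to about .54; if the two methods were perfectly correlated, the gain would vanish.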

Moreover, research in social psychology and behavioural economics has shown the limitations of human judgement in predicting behaviour (Kahneman, 2012). It has also shown that algorithms can predict job performance more accurately than human intuition (Kuncel, Klieger, Connelly and Ones, 2013). With the creation of new data analytics departments, algorithm-based approaches may be applied more often to human resources data. The resulting improvements in predictive validity could be all the more significant given that the statistical tools developed by practitioners working with big data generally outperform the traditional methods used in the human sciences (Hindman, 2015).
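A minimal sketch of what mechanical (algorithmic) data combination means, as opposed to clinical judgement: predictor scores are weighted by a rule fitted to past performance data, so every candidate is scored by exactly the same formula. The data and weights below are invented for illustration; performance is constructed as an exact linear combination of the two scores so the recovered weights are transparent.

```python
import numpy as np

# Invented historical data: rows are past employees, columns are
# standardized scores on two assessment methods (e.g. a cognitive
# test and a structured interview).
X = np.array([[ 1.2,  0.5],
              [-0.3,  1.1],
              [ 0.8, -0.4],
              [-1.0,  0.2],
              [ 0.4,  0.9]])
# Invented past performance, built here as 0.6 * score1 + 0.4 * score2
# so that the fitted weights are easy to verify.
y = 0.6 * X[:, 0] + 0.4 * X[:, 1]

# Fit regression weights by least squares: the "mechanical" rule.
weights, *_ = np.linalg.lstsq(X, y, rcond=None)

# Every new candidate is scored by the same fixed formula,
# removing the case-by-case inconsistency of human judgement.
candidate = np.array([0.6, 0.7])
predicted_performance = candidate @ weights
print(weights, predicted_performance)
```

The point of the sketch is not the particular model but the consistency: two candidates with identical scores always receive identical predictions, which is precisely where human intuition tends to falter.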

HR analytics may also partially resolve the criterion problem, which is well known in organizational psychology (Chamorro-Premuzic, Winsborough, Sherman and Hogan, 2016). Validation studies of potential and competency assessment tools typically rely on supervisor ratings of performance as the criterion to be predicted (Austin and Villanova, 1992). Such data is problematic in that it depends on human judgement, which can be biased in many ways. These biases directly affect the measurement quality of the criterion, limiting both the scientific understanding of job performance and the predictive validity of the tools. The objective data generated by HR analytics may therefore pave the way for significant advances in this field of study.

These new methods also involve their share of risks. Little independent scientific research has assessed the validity of serious games and cybervetting. For example, a few articles published in professional journals mention the use of serious games in recruiting Generation Y candidates, but they present no data or information on what these methods entail (Polimeni, Burke and Benyaminy, 2009). Given the limited scientific information available, greater vigilance is required for cybervetting, which necessarily involves a subjective judgement as to how well the information gathered predicts a person's behaviour in the workplace.

In addition to issues relating to validity, these new methods may also be subject to discriminatory bias. For example, cybervetting provides an easy way for an employer to discriminate against certain groups of candidates. Users of social media sites, such as Facebook, typically post pictures or state preferences and personal opinions that in no way relate to the skills and aptitudes required for a given position. Yet this information may very well relate to factors that are considered discriminatory, such as religious affiliation, sexual orientation, ethnic origin or marital status (Berkelaar and Buzzanell, 2015). This makes it easy for an employer to eliminate candidates based on these attributes at the start of the assessment process.

The role of the organizational psychologist

These practices, enabled by the emergence of new technology, are already having an impact on the traditional model used to assess an individual's competencies and talents. Yet even as they are barely being implemented within organizations, newer technologies, such as artificial intelligence, connected devices, augmented reality and virtual reality, are emerging. It is harder than ever to get a clear idea of how potential and competency assessments will be conducted in the future. In the face of these ongoing changes, organizational psychologists appear to be the ideal professionals to help organizations navigate these new technologies. Their expertise in psychometrics, research and statistics allows them to exercise sound judgement with respect to these new assessment practices. However, to fully perform this role, they must learn to work effectively with new partners, such as IT and data analytics specialists, and develop their expertise in this field. For organizational psychologists to play this key role, we believe they need to actively use and research new methods for assessing a person's competencies and potential.


  • Alvarez, J. (2007). Du jeu vidéo au serious game : approches culturelle, pragmatique et formelle (Doctoral dissertation, Université de Toulouse II). Accessible via HAL (tel-01240683).
  • Austin, J. and Villanova, P. (1992). The criterion problem: 1917-1992. Journal of Applied Psychology, 77(6), 836-874.
  • Back, M., Stopfer, J., Vazire, S., Gaddis, S., Schmukle, S., Egloff, B. and Gosling, S. (2010). Facebook profiles reflect actual personality, not self-idealization. Psychological Science, 21(3), 372-374.
  • Berkelaar, B. and Buzzanell, P. (2015). Online Employment Screening and Digital Career Capital: Exploring Employers' Use of Online Information for Personnel Selection. Management Communication Quarterly, 29(1), 84-113.
  • Chamorro-Premuzic, T., Winsborough, D., Sherman, R. and Hogan, R. (2016). New Talent Signals: Shiny New Objects or a Brave New World? Industrial and Organizational Psychology.
  • Cossette, M. (2015). Analytique RH: un rôle en plein essor pour les professionnels RH. Effectif, 18(2).
  • Davison, H., Maraist, C., Hamilton, R. and Bing, M. (2012). To Screen or not to Screen? Using the Internet for Selection Decisions. Employee Responsibilities and Rights Journal, 24, 1-21.
  • Galois-Faurie, I. and Lacroux, A. (2014). « Serious games » et recrutement : quels enjeux de recherche en gestion des ressources humaines? @GRH, 1(10), 11-35.
  • Hindman, M. (2015). Building Better Models: Prediction, Replication, and Machine Learning in the Social Sciences. The ANNALS of the American Academy of Political and Social Science, 1, 48-62.
  • Kahneman, D. (2012). Thinking, Fast and Slow. Farrar, Straus and Giroux.
  • Kuncel, N. R., Klieger, D. M., Connelly, B. S. and Ones, D. S. (2013). Mechanical Versus Clinical Data Combination in Selection and Admissions Decisions: A Meta-Analysis. Journal of Applied Psychology, 98(6), 1060-1072.
  • Morris, S. B., Daisley, R. L., Wheeler, M. and Boyer, P. (2015). A Meta-Analysis of the Relationship Between Individual Assessments and Job Performance. Journal of Applied Psychology, 100(1), 5-20.
  • Polimeni, R., Burke, J. and Benyaminy, D. (2009). Using Computer Simulations to Recruit and Train Generation Y Accountants, CPA Journal, 79(5), 64-68.
  • Roth, P., Bobko, P., Van Iddekinge, C. and Thatcher, J. (2016). Social Media in Employee-Selection-Related Decisions: A Research Agenda for Uncharted Territory. Journal of Management, 42(1), 269-298.
  • Schmidt, F. and Hunter, J. E. (1998). The validity and utility of selection methods in personnel psychology: Practical and theoretical implications of 85 years of research findings. Psychological Bulletin, 124, 262-274.
  • Society for Industrial and Organizational Psychology (2013, 2014, 2015). Top 10 Workplace Trends 2014-2015-2016.
  • Waber, B., Olguin Olguin, D., Kim, T. and Pentland, A. (2008). Understanding Organizational Behavior with Wearable Sensing Technology. Presented at The Academy of Management Annual Meeting, Anaheim.
  • Wu, L., Waber, B., Aral, S., Brynjolfsson, E. and Pentland, A. (2008). Mining Face-to-Face Interaction Networks using Sociometric Badges: Predicting Productivity in an IT Configuration Task. Presented at The International Conference on Information Systems, Paris.
  • Zyda, M. (2005). From Visual Simulation to Virtual Reality to Games.