Berkson's, right? A perfect interview process would result in a population where none of the people who are hired as a result of it have any attributes that could be correlated with job performance. i.e. all the information has been 'used up' by the selection filter. If you have a correlation, then you can improve the selection filter, so it can't be perfect.
This can have interesting outcomes. For instance, when Triplebyte published their blog post about which environments get the most hires⁰, it revealed the areas they haven't yet entirely accounted for in their quest to increase matching performance.
I don't follow the reasoning. Even if you have a reliable prediction of performance, what prevents you from hiring some candidates who are exemplary and some who are just well-qualified? Or are we assuming that the best candidates would be given a higher-tier job for which they just meet the requirements?
It's a spherical cow conversation. The cows you hire can be placed on a smooth manifold of job difficulty, and you have a reliable prediction of the job performance distribution.
Hard to call that a "perfect interview process", because from an individual candidate's point of view, some candidates with unusual characteristics could be unfairly disadvantaged (and others unfairly advantaged), even while the overall "neutral" distribution is reached in the end. Short of an omniscient interview process, I'm not sure this can be avoided entirely, so in practice you have to be very careful with the kind of correlation presented here, even if it turns out to be not Berkson's paradox but a property of all programmers.
0: https://triplebyte.com/blog/technical-interview-performance-...