AI Is Eating Juniors — and with Them the Industry’s Breeding Stock
Since 2021, I have been working at the intersection of two industries: digital technology and agriculture. Over that time, the profession has noticeably shaped my thinking, and I increasingly view other fields through an agricultural lens. To people far from agribusiness, the analogy may seem odd. But before you object, try reading to the end.
Last week I wrote about whether programmers will be replaced by vibe coders, agents, and large generative models, and I received plenty of opinions, criticism, and comments in response. Now I want to continue that thought with an unexpected but, in my view, accurate analogy between IT and livestock breeding. May IT people and breeding bulls alike forgive me.
Any breeding operation is sustained not by the entire herd at once, but by its core—a lineage that preserves and reliably passes on the best traits. As long as that core is maintained, quality can be reproduced generation after generation. But if the core is weakened or no longer renewed, the degradation of future generations becomes only a matter of time and ultimately leads to the loss of the breed.
Something very similar, I think, is happening in IT.
The industry’s breeding core is not just strong specialists. It is the entire system of professional reproduction: engineering culture, code review, architectural thinking, discipline, understanding of algorithms, knowledge of the fundamentals, and the ability to take responsibility for the consequences of decisions. Any core must be replenished, refreshed, and continuously improved. And juniors play the most important role in that process.
A junior is not just a cheap pair of hands for simple tasks. Juniors are the layer from which mid-level engineers, seniors, tech leads, and architects grow over time. Through routine work, small tasks, mistakes, reviews, and repetition, a person gradually learns not just to write code, but to understand the system.
And it is precisely this growth mechanism that AI is threatening today.
Yes, AI can generate code faster. Yes, it can often do it more cleanly, more neatly, and without unnecessary questions. But if you hand over to the machine the entire layer of tasks on which people used to learn, a junior never gets the practice, never learns to read other people's code, and never comes to understand why one solution works while another breaks the system. That person will then grow into neither a confident mid-level engineer nor a strong senior.
And that, to me, is the real problem. Not that AI will replace programmers, but that it may begin to displace the very mechanism by which programmers are produced.
The problem is also that what is being automated is not the top of the profession, but its training foundation. In the past, newcomers entered the craft through simple tasks: fixing small things, writing standard code, making mistakes, getting review comments, rewriting, and learning to see the consequences of their decisions. It was not the prettiest path, but it worked.
Today, that very layer is the first candidate for automation. Not responsibility, not architecture, not engineering judgment—but the layer where professional maturity used to develop in the first place.
From a short-term business perspective, the logic is easy to understand. If an agent can do a routine task faster than a junior and does not require senior time for training, the choice looks obvious. But over the long term, that logic begins to consume the industry’s own talent reserves. Strong engineers do not appear on the market fully formed and in the required quantities. They grow out of people who were once allowed to make mistakes, figure things out, and gradually take on more responsibility.
If an industry stops growing entry-level specialists internally, it can live for some time on accumulated reserves. But then, suddenly, it discovers that strong engineers are scarce, expensive, and that almost no new ones are appearing.
That is why a junior is not an annoying burden on the business. A junior is the industry’s replacement stock—the young layer from which the entire strong layer is later renewed. And if AI eats junior-level tasks, the industry risks eating through its own human capital.
But this does not mean we should simply restore the old world and once again put juniors on endless diskette formatting (God, I am old) and on writing simple data-handling procedures. That world is not coming back. If AI has become better than humans at a large chunk of routine work, then the old entry path into the profession will have to be redesigned.
The new junior should grow not as a cheap pair of hands, but as a junior systems-thinking engineer. They must learn not only to write, but also to read. Not only to generate, but also to verify. Not only to admire quickly generated, beautiful code, but also to understand why it works, where it is fragile, and exactly what may break.
If we accept that the problem is real, then metaphor alone is not enough. A breeding core does not preserve itself automatically—it is maintained deliberately. It seems that IT will have to do the same.
First, we need a real entry pipeline for students into the profession through actual practice in IT companies. Not formal internships for the sake of appearances, but work on a limited set of real tasks with mentorship.
Second, employers need incentives to develop juniors. Today, a young specialist often looks like a cost center to a business: they must be trained, reviewed, and protected from mistakes. If the industry wants to preserve the mechanism for reproducing talent, that burden should not fall entirely on employers alone. For example, support measures for IT companies could be partially linked to genuine work on developing young specialists: internship slots, trainee programs, mentorship, and first positions for juniors.
Third, we need to distinguish between products in which AI was merely a tool and products in which human engineering involvement was reduced to a minimum. This is not about banning AI; it is about transparency regarding the level of human responsibility. If, in essence, there was no real engineering acceptance behind a digital product, that should be clear. This criterion should be treated as a risk factor and could, for example, be used to limit the inclusion of such products in registries of specialized software.
The main threat from AI, in my view, is not that it will replace programmers. The main threat is that it may help the industry destroy the mechanism that grows programmers in the first place.
So the key question today is not: will AI replace programmers?
The real question is: can we preserve the system that makes the emergence of strong programmers possible at all?
P.S. For my part, I have always been happy to give the younger generation a chance to gain experience. I have consistently hired young specialists, given them practical tasks, and mentored them. Quite a few of them went on to continue working in the organizations I invited them into.