The other day I had a wonderful conversation with a friend and fellow software enthusiast about fairness when it comes to the remuneration of knowledge workers. Both of us have been through various posts in different technology companies, increasingly concerning ourselves with management and leadership as our careers progressed.
Our conversation homed in on the following question: Is the way people are paid in our industry today fair? Note that I’m not referring only to software developers here—data scientists, UI experts, DevOps heroes, product managers, and all the others who contribute to successful projects are included as well.
Both of us had observed that it is more or less common for “leadership-style” work (think product manager, team lead, software architect, …) to pay significantly better than even the most senior engineering positions. That issue creates a conundrum for many aspiring employees in tech companies: If you’re a developer and want to progress in your career, why does our industry force you into management-type roles instead of honoring your growing technical abilities?
Not only is that problematic on an individual level, it can also be highly counterproductive for organizations: Great developers often feel compelled to move into management even though they would be much happier (and would objectively make better contributions) in engineering. Hence companies end up with mediocre managers who would be better at writing code, and frustrated employees who complain about their leaders appearing unfit for their roles.
Of course, that is not to say that moving on from engineering is per se a bad idea. Personally, I’m grateful for the opportunities I had to leave the trench-work of writing code behind to focus on the bigger picture. Over time, I wanted to concern myself more with the question of “What should we be building?” instead of “How should we build it?”. But clearly, that move is neither possible nor advisable for everyone. We desperately need strong, experienced, and passionate engineers to solve the hard problems of “How?” in the best possible ways. So why are we treating those who explicitly want to do that hard work so badly?
A bit of History
The biggest problems with the way we treat people in software companies today arguably have their origins in the long history of wage work itself: The nature of how we work was institutionalized during the industrial revolution in the 18th and 19th centuries. That’s when ideas like fixed working hours per day/week/month first arose, when individuals started specializing in single aspects of a larger value chain, and when companies first acquired so many laborers that they needed to formally organize them and deliberately manage their respective workloads and responsibilities.
Increasing specialization led to the creation of different types and levels of jobs, and hence the organizational chart was born. Those who had little skill and education had only their physical labor to contribute to their companies, so they ended up on factory floors and in assembly shops. They were far from self-organizing, so the employer had to provide structure—plans for when and how to work, for example, and precise instructions on how to perform one’s duties. Those plans had to be created by someone, though—usually someone with higher education and a very different skill set.
In such an environment, the differences in remuneration between those wearing the blue collar of production and their white-collar superiors were quite easily justified: While the former had little to no special education, the latter needed at least basic intellectual training. More time spent upfront in schools and universities necessitated higher wages, simply to compensate for the fewer working years one could reasonably be expected to perform over a lifetime.
The uniform nature of production-type jobs furthermore made those performing them easily replaceable, as opposed to the more individualistic contributions of university educated white-collar workers. Therefore, the latter were harder to replace and could demand higher salaries as their threats to leave the company carried far more weight.
Another interesting argument is that managing people is inherently “harder” and more stressful than other types of work, and that those levels of stress should be compensated for by higher income. This line of reasoning is thoroughly flawed, though, as studies have shown that the exact opposite is the case: One of the leading causes of low job satisfaction is not too much responsibility, but too little. Managers, who typically enjoy high levels of autonomy, therefore tend to be more satisfied with their work than their subordinates.
Finally, I’d argue that those in management were always one step closer to the ultimate source of the money that companies could distribute, which made it easier for them to claim a larger piece of the pie. Institutions like works councils, trade unions, and collective bargaining agreements tried to mitigate that issue and arguably were quite successful at it for a long time. Unfortunately, though, the trend has worsened again in recent years, with CEO salaries deviating more and more from the mean income of their employees.
Work in the 21st century
Today’s world of work is fundamentally different from what it looked like during and after the industrial revolution. A modern team that creates software needs to be structured totally differently from a production-oriented company that harnesses manual labor to produce physical goods. More importantly, none of the traditional distinctions mentioned above—between those who “only” contributed at the lowest levels of the organization and those higher up the food chain—holds up anymore.
In modern companies, it is not uncommon for engineers to hold multiple PhDs while their CEOs sometimes have no higher education to show for it. Even in less extreme cases, it is obvious that the differences in education between, say, developers, user interface experts, and product managers are negligible when it comes to determining what a fair payment scheme should look like.
When it comes to the replaceability of individual contributors, knowledge work has at best equalized the playing field, and at worst turned it around in an almost comical way: Ideally, everyone in a modern organization contributes his or her individual creativity and experience to a larger goal, so everyone is equally replaceable—or irreplaceable. Sometimes, though, losing a single engineer can be a devastating blow to a team or even an entire company, while personnel changes in the upper echelons of management can go unnoticed—and with little impact on the success or demise of the organization.
During the discussion my friend and I had, we quickly agreed that neither education nor one’s position in a hierarchy should be the key indicator of how much money one makes. But how can we do better?
Value the quality of individual contributions over the number of direct reports
Especially in larger companies, it is still common to judge a person’s status and influence by counting the number of employees he or she “manages”. That metric is as anachronistic as it sounds, as command-and-control management is gradually being replaced by concepts of self-organization. In such an environment, it shouldn’t matter whether there are five, 50, or zero employees “below” you in the hierarchy, because they no longer make up your personal fiefdom.
Instead we should judge a manager’s contributions based on their individual merits: Does he empower his subordinates to contribute at the top of their abilities? Does she foster growth and learning among those reporting to her? Do they bring specific and valuable subject-matter expertise to the table if they’re in a product- rather than people-management role?
If so, they deserve to be paid well for their contributions. But so, equally, does the engineer with many years of experience with a tricky piece of technology, the designer who brings outstanding creativity to the team, or the technical writer who is willing to go the extra mile to cross all the t’s and dot all the i’s in a critical publication.
Value experience over job titles
When I first became a Product Owner, every single individual on my team had more experience with our problem domain than I had. Luckily, they were outstandingly patient with me during the time it took me to finally figure out that despite my fancy job title, I had a lot to learn from them. In discussions with people outside our team, though, it often happened that well-argued opinions were ignored, even ridiculed, just because they were stated by a developer. (I would sometimes repeat the point an engineer had made a moment ago and be astonished by how much more respect the exact same statement got when expressed by me, rather than a developer.)
In many companies, the HiPPO principle—deferring to the “highest paid person’s opinion”—still prevails in decision making: Those with the fanciest job title, or the highest position in the org chart, are expected to magically be right, even though their subordinates may have far more experience and bring better-founded arguments into a discussion. To overcome that problem, we need to judge arguments by their factual correctness, not just by who brought them up.
Value willingness to grow over following a predetermined career path
It is crucial that young developers feel assured that if they choose a career path driven by an honest curiosity about new technologies, and a willingness to grow along the way, they are no worse off financially than by going down the road towards “general management”. Our industry already has more than enough managers who would be better and more productive as developers; let’s make sure not to force even more aspiring engineers down a career path that doesn’t feel right for them.
The world of work has changed significantly over the last 150 years. Unfortunately, the structures and methods we use to organize work and determine who should be paid how much haven’t evolved as quickly. The software industry, being at the forefront of much of the rapid change going on in the world today, needs to also pioneer new ways of structuring work and pay.
Many of the changes required are up to us as individuals: We can choose to respect experience and the quality of contributions more than sheer hierarchical authority. We can also be role models to younger colleagues in order to help them determine what career paths are right for them. And finally, we can demand more openness, honesty, and discussion in the upper echelons of management about the remuneration schemes in our organizations. Clearly, the way we do things today isn’t ideal, but it’s ultimately up to us to propose better alternatives.