While this is probably true, I would really appreciate something concrete about what the difference actually is, beyond "it's harder".
When my cousin hit the workforce in the late 90s, it was truly a case of "if you knew which way was up on a keyboard, you could get a job in tech." They could not hire people fast enough.
When I got into the industry in the mid-2000s, companies would come and conduct interviews at my college campus, and I got a job right out of college.
Nowadays (based on what new devs have told me), you can barely get a recruiter to respond to you if you're a new grad.
This article resonated with me because it succinctly states what I've felt internally for a while. When new developers ask me about breaking into the industry, I pretty frankly tell them that I have no advice to give, because what worked for me absolutely won't work for them. It's a totally different environment now, not only in the ratio of supply vs. demand for developers, but also in what sort of expectations we put on new developers (I was way more of a doofus when I first started than people I see now).
As someone who broke into the industry in 2015, my thoughts are similar, but even by then it’d gotten a lot more difficult than what I’ve heard/read the 00s and 90s were like.
It took quite a number of applications, phone interviews, and in-person interviews before I finally found a company that would give me a chance, and I think the only reason that process wasn’t even more protracted is that I had made a point of moving to a tech hub (SF Bay Area) and was targeting a niche that was in demand but hadn’t yet been flooded due to a high-ish bar to entry (native iOS development). Had I been, say, a newbie front end web dev trying to get hired in an area where tech wasn’t dominant, I’m sure I would’ve had a much harder time.
And yes, I was basically clueless for my first dev job. I knew enough to get along, obviously, but the first several months were the most intense period of learning in my life by a long shot, and a lot of newbies these days would be applying with a considerably higher degree of knowledge than I got hired with at that point.
Talking about the US but I assume the situation is the same in other countries:
I think the main difference between the 90s and now is supply from other countries. Most larger companies have a big presence in India and maybe a few other places that have lower labor costs than the US. This drastically reduces the demand for local labor. As a job applicant it’s very hard to compete with someone who will do your job for 1/3 the price. Especially since businesses don’t properly measure the actual costs of outsourcing because they focus on quarterly goals.
Anyway this has been going on for 30 years but the article talks about the last 10 or so.
A political comment: by the way, tariffs and deportations won’t do anything to reduce the outsourcing of highly skilled technical jobs. If protectionism and being internationally competitive were really the goal, he would weigh these jobs as more important than low-skilled ones and create appropriate incentives and costs. But I haven’t seen Trump express concern about this at all. In fact the tech oligarchs, including Musk, want to increase outsourcing.
To be more subtle: the causes go back as far as 15 years ago, but the effects come in waves, typically tracking the economy:
- The economy is strong: the industry wants to hire anyone and anything. Tech is known for cushy, well-paid white collar jobs.
- Growth slows: companies start to look at ways to cut costs, oftentimes regardless of the productivity hit.
- Layoffs happen, and experiments with cheaper labor occur.
- The economy recovers, or some explosive new tech appears, and people go back to the beginning.
This is definitely a strange time, though. We have an explosive new tech in AI while also being in a weakening economy. And that tech in particular wants to be the outsourcing itself. Many people in the industry know that it won't replace them this decade (so, same as outsourcing), but it's still such a weird vibe overall this time around. The young'uns really are off to a brave new world.
I work in cybersecurity, and I often get asked by younger people (or even more often, their parents - which is an entirely different issue I could go on about) how to get into it. And my path here was strange and twisted. My actual college degree is in Geography, of all things. I worked at a help desk as a student and met people there. Those people got my foot in the door for a low-level sysadmin job, and I just worked my way up from there in a haphazard fashion. I also graduated college in 2002, which was an awful time to graduate but also, as you've mentioned, a great time: the "requirements" to get into technology were much lower back then.
My advice is to just take what you can get and make it work from there, but that's not really helpful to the specific question of "how do I get into this field?"
Their parents?? 💀
Yes, I have had a few conversations with parents that start like this: "My son/daughter really likes computers. You work with computers. How can my child get into this field?"
To which I usually give some version of the story above but with an added "I would be more than happy to talk to your child about this instead!" I've found, for the parents that eventually put me in touch with their kids, the conversation goes much broader than "computers = good job" and I find I can give much more helpful advice that is somewhat tailored to the individual.
Back when I was a teenager, my parents would have said to me "Here's my colleague that works in IT, go talk to them" instead of "Hey, I'll ask the nerd at work about the computer stuff for you." But, I am getting old and wearing an onion on your belt just isn't popular anymore - so what do I know?
I’ve seen a lot of internet posting these past two years that make fun of Computer Science majors, saying that they will be unemployed or working fast food.
When I was in high school during the 2010s, there was this big push from the Obama administration for students to study STEM in college. Even when I attended college from 2016 to 2020, someone being a CS major meant they were going to have a lot of job prospects. And now the major gets treated as if it’s Philosophy or any other sort of “useless” humanities degree.
Yup. It changed roughly in 2022: lots of tech layoffs that year, the same year I was a junior in my CS program... Fun times getting an internship that year.
During the dotcom bubble, it was more like 'do you have a pulse? we can teach you to turn one of these things on'. At one point I was working in a 24x7 NOC where the rest of my team's previous jobs were (in no particular order): construction worker, overnight disc jockey, grocery store stocker, coast guard, another construction worker. To their credit, they all followed explicit directions extremely well, so I just wrote a bunch of run books and scripted them out of as many things as possible.
My current wild-assed theory is that C-level decision makers have decided that AI is going to replace 90% of the tech workforce, so why not go ahead and not hire them now?
1.) The field used to recruit primarily for smart generalists who had a solid foundation of technical skills and were good at figuring stuff out. The field now recruits primarily for people who check the boxes on having experience with specific, highly abstracted frameworks, tools, platforms, and technologies, which makes it harder to get your foot in the door or dabble in multiple things.
2.) There is a LOT of competition because of a glut in hiring and poorly designed training programs/bootcamps. There’s a lot of people now who have nearly 10 years of experience as “engineers” but whose actual engineering skills are lacking because they’ve been able to job hop or coast largely by drafting on other people on their teams. Because of the credentialism in item #1, it’s hard for a candidate to stand out or show that they’ve done good work unless they have a big name on the resume.
3.) Most technical problems now are actually strategy and business problems, but company IT departments are too engineer-brained to admit it. Everyone wants to pretend they’re AWS or Google and solving cutting edge engineering challenges, but for most companies’ use cases the engineering issues have mostly been solved and are matters of implementation. What’s actually holding them back is that the company’s silos don’t collaborate effectively and their data architectures are too chaotic and out of date to do any cool modern stuff with them. They’re hiring for the wrong mix of people/skill sets basically.
4.) There’s just a lot more scamming and spamming in the jobs space. Recruiters are getting scammed with AI applicants and scam placement agencies. Applicants are getting scammed with fake job postings, umpteen million “knowledge checks” before you can talk to anyone, and very dumb resume filtration systems that make no sense and drop good candidates in favor of manipulative SEO people.
There are other issues, but those have been the major ones. The other one is just a pipelining thing. The incoming generations of technical talent simply lack a lot of technical foundations AND basic office/professionalism skills that used to be taken for granted, and teams and management styles have had trouble adapting.
Thank you!
Some things that I have noticed in the dev field that differ from when I got in 15-ish years back. This is all anecdata, and might not apply everywhere.
Cloud infrastructure makes it much easier to spin up proofs of concept.
Made a little game or some other proof of concept? You can host it for like 5 bucks to show that you can do a thing. You can also make the GitHub repository for it public so people looking to hire can check out how you wrote it and maybe what kinds of bugs or whatever you noticed and fixed.
This is a boon for people that like doing projects outside of work. This sucks for people that don't. The barrier to entry for this kind of thing was a lot higher in the late 00s so it was hardly expected that people would have public demos of their work (some still did, of course, because it is fun for some people).
The idea of "rockstars" has fallen out of favor.
For a long while, it seemed like every company wanted the one person who would come in and do the jobs of four developers. This has kind of gone away as everyone has realized that burnout is a real thing and you probably don't want your programs all designed with "get feature out the door as fast as possible" in mind. It's usually better (for non-startups) to have a bit more planning involved. Startups are excluded from this bit because they are kind of obligated to get features out the door in any way possible to exist for more than a year.
What you are expected to know offhand is less important than knowing where to find information.
Code quizzes used to be more normal during interviews. Plenty of "gotcha"s where some weird quirk of the language was used to try to trap you into doing the wrong thing to see if you knew trivia about a given language. This has kind of switched over to a (better, in my opinion) strategy of asking about design patterns that are kind of language agnostic.
This is probably a result of languages evolving quickly and documentation being ever easier to find. In years past I have had quizzes during interviews where I was asked whether something was valid syntax and had to respond with something like "I don't believe it was valid when this quiz was written, but it is now or will be on the next release", which is not a particularly valuable thing to check.
All of this to say that the landscape has changed a lot, and I only know some of these things because I keep in touch with a lot of the interns that I work with after they leave. Even that is cherry-picking, because those people managed to get through the filter of "getting the internship" and, by extension, have previous job experience on their resumes when they hit "real" interviews - which is one further barrier for those that didn't or couldn't do internships for whatever reason.
It's kinda an awkward position for me. I'd love to show my games, but making something in web is a dead end for most Financials. And it isn't necessarily the optimal portfolio to show for industry game programmer positions either.
But making console/PC games means a higher barrier to sharing. And "just make a web export" for such a build is a massive ask for most projects. I do agree with the overall sentiment that the "programmer portfolio" is more and more expected of a new hire, though. Especially for web devs.
From my recent experience, interviews are still all over the spectrum. You can be asked about fundamentals, or about language quirks, system design, or domain-specific trivia. And of course the almighty FAANG algorithm gauntlet endures.
It's quite frustrating because it's the inverse of your point. On the job you're increasingly expected to find information as you need it for each task... but in the interview you'd better be a software encyclopedia, recalling items with rote efficiency.
Before every interview I conduct, I tell the candidate, "This is open book, so if you want to look stuff up or ask me questions, go for it."
I do have candidates take me up on this, but it happens less often than you'd think. Frequently I'm the one who's nudging them to look stuff up: "Hey, so you're absolutely right that this depends on what kind of data structure Python lists are under the hood… how would you find that out?" I wonder if all their past interviews have conditioned candidates to think they're admitting weakness by saying, "I don't know, let me Google it."
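To make that concrete, here's the kind of quick poke-around I'd love to see a candidate do for that Python list question. This is just a sketch of my own, not anything from an actual interview, and the exact byte counts will vary by CPython version and platform:

    import sys

    # CPython lists are over-allocating dynamic arrays, so the reported size
    # stays flat for a while and then jumps when the capacity grows.
    lst = []
    for n in range(1, 21):
        lst.append(n)
        print(n, sys.getsizeof(lst))

Two minutes of that, plus a skim of the docs, tells me more about how a candidate works than any memorized trivia would.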
I guess what I'm trying to say is, if you ever have an open book interview, please—for my sake—use the gargantuan living encyclopedia your computer is plugged into.
Don't have to ask me twice, haha. I'm switching between so many languages that I'm not trying to memorize syntax for every little thing. I don't need much more than docs for that, but I'll never be ashamed of double-checking stuff in the docs.
Is a video of the gameplay perhaps enough of a teaser to get people to download it?
Video will help, but it's an extra degree of effort for the user. It's part of why click-through rates are important. Every extra click a user needs to make to get to your stuff reduces the rate.
While I'd dabbled in IT in one fashion or another since 1998 or so (starting with repairing PCs for side cash in high school), I didn't enter the workforce proper till 2008, landing a government job just before the banking meltdown. US-based, fwiw.
The biggest single difference I've noticed across the board (and this applies outside of tech as well) is that nobody wants the financial "burden" of onboarding less-efficient junior staff. Paired with the "meritocracy" of regular churn and imports of talent from overseas, it's really put a damper on the new generation, which is bright but inexperienced. Even having public repos and projects that look amateurish can hurt as much as help.
The only broad exceptions seem to be in local/state government and education, where the budgets tend to be tighter and the staff at the latter tend to want to foster learning (though the MBAs coming into management from the private sector are blowing this away - that's a big sidebar).
If the ship doesn't turn soon, it's going to really hurt in about 15-20 years when Gen X and the elder Millennials start retiring and there will be a shortage of seniority to help onboard a whole lot of junior staff all at once.
Probably some misaligned incentives once again. When everything is hyper-focused on KPIs and sprint velocity, getting interns or entry-level hires on your team is like getting a go-kart engine for a Formula 1 race. You know you're gonna lose.
Thing is, these things shouldn't be a race to begin with. Interns are more like... well, an investment in your future. But no one seems focused on long-term legacy these days so much as on making next quarter's profits. Our future is in trouble until we incentivise stepping down and passing the torch.
I don't think that's really true - at least, it depends on the scale. All the major tech companies - I wouldn't even say "big", most startups also fall into this category - have pretty extensive internship programs. These often pay well over $60/hr - and believe me, 99% of these interns do not do $60/hr of work. Often most of the intern projects either never go into production or are internal dashboards or tooling. These interns are often hired on as new grads at a 70-80% rate.
There's not much point in these programs other than onboarding junior staff.
Not just anecdata. It also doesn't reflect the newer trend of posting mid-level positions as junior ones to drive down wages as unemployment numbers climb.
Yes, but these internship programs are competitive and targeted towards new grads. We might see a situation where new grads from elite universities do well, but a lot of other people can’t get those jobs? Sort of like the pipeline from universities to law and finance.
What sort of shortages might result seems no easier to predict than figuring out where AI will be in N years. We don’t know what the effect on demand will be. It might be a Jevons paradox thing where increased efficiency increases demand, or maybe not?
The “entry level job requires 5 years experience” dilemma is something I experienced first hand in 2009. In my mind, seeing that complaint surface again is absolutely a sign that the job market is not great.
I know they don’t want people to panic, but I really think the way they’re representing the unemployment rate, regardless of how they calculate it, still isn’t showing us the full picture.
It's not. Quality of the jobs is important too. If most of your working class is partaking in the gig economy and doing Lyfts, your economy isn't sustainable.
I think they came up with a metric for the underemployed, though? I remember that exact thing being a missing metric in 2008, and that was 15 years ago.
No single metric can show the whole picture, but aside from that, what do you think is the issue with the unemployment rate?
I think we need better industry-specific metrics that track how many positions of what specific title are open and at what salary. I don't think the US government tracks that, but I know that info is out there because I use it when I’m job hunting.
Didn't see any mention of ZIRP (zero interest rate policy) here nor in the article, so I'm going to throw that hat in.
During the 2000s the US had very low interest rates, and then for basically the entire 2010s the US had interest rates at 0%. Borrowing money was easy and cheap, so it made sense to borrow lots of money to throw at people to make companies that could make even more money back.
If someone started their software career in 2010 and is giving you advice, take it with a grain of salt: they came in at an era where the money was flowing. Covid put an end to ZIRP for now, and we're looking at a recession or worse. Tech companies don't have the same faucet of zero-interest-rate money anymore; they're doing layoffs, stealth layoffs, and replacing workers with AI (if feasible, good luck!). The job market looks very different than it used to. Someone who had most or all of their career in the good years (the ZIRP years) is accustomed to something much different (and, for their bank account, very pleasant) than what people are facing today, especially people just starting their careers.
Unless you meant the mid-2010s (as in around 2015-2016), I would think the opposite, although I might be biased since I had to find a job in 2010 (not a software one, though). For the "Information" category, unemployment was about 9.7% in 2010, compared to 5.4% now. So, almost double. That doesn't mean it's twice as hard to find a job; it's more than twice as hard. I couldn't even get a job in fast food or retail because there were hundreds of applicants for each job. I lucked out getting a factory assembly job making near minimum wage through a small staffing agency. I say "lucked out" because even though it was a shitty job with shitty pay, I was just glad to have any kind of income. To add on to that, home foreclosures hit their peak in 2010, so it's clear to me that the money was not really flowing yet. I would think someone who started their career in 2010 should have a lot of stories about times of difficulty, whether personal or from friends and family.
Based on my reading of the current job market, it's obviously worse than the crazy boom years of 2021-2022, but it's obviously so much better than the fallout years after the 2008 global financial crisis. I'd say it might be comparable to 2013-2014 in terms of finding a job. Kind of hard, but not impossible. The unemployment numbers were higher then, but the trend was in a more positive direction than it is now.
Note: The Information category is a bit too broad to be well representative of the software industry, but it's better than using the overall unemployment numbers. CompTIA is reporting tech unemployment at 2.3% vs. 4% overall, but I'm not sure how representative that is of the software industry either.
Edit: To clarify, I'm not saying "it's been worse, suck it up," or anything like that. I'm not offering any advice, just trying to clear up some history. Just kind of shocked to see the year with the highest unemployment rate in the past 40 years described as a good time to get a job.