What new skills have you learned on your own time in the last six months?
If you’re like many IT people, you may be deferring to your boss when it comes to career development. Perhaps your company is ready to implement a new SharePoint site and you’ve been immersed in that project for the last six months. Great. But have you been keeping an eye on industry trends—which skills are emerging and which are waning? I suspect most people working in IT would say “no.”
The June 4th edition of Computerworld magazine argues that companies have lots of job openings but many IT workers are still struggling to land jobs. The reason? A skills gap. The jobs are there but workers’ skills are not.
Todd Weinman, a recruiter quoted in the article, said: “You can’t rely on a company for your growth and training anymore…Except for a few enlightened companies, if they’re training you at all, they’re training you for what they need, not necessarily training for what you need to develop your technical skills over the long run.”
Employers are clamoring for candidates with expertise in mobile app development, cloud computing, and business analytics, according to the article. But employees may be tired of the self-development treadmill—especially if they are working more than 50 hours per week.
According to the article, here are some suggestions for thriving in IT in the coming years:
* Create your own training plan. As a recent article in the Harvard Business Review recommends, set aside a few hours each week for career development.
* Pursue training and certifications on your own time and at your own expense. On a related note, I recently read The Six Fundamentals of Success by Stuart R. Levine. Levine writes: “Take control of your continuing education and personal growth. Every six months, give yourself specific learning goals…Too often people wait for their boss to tell them what skills they need. This is a stunning abdication of responsibility.”
* Develop soft skills, like communication.
* Identify different classes of skills. Which technologies are emerging, mainstream, or legacy? Does your current expertise have a long future ahead of it? If not, what certifications and training do you need?
Here’s my take on it. In my roughly 15 years in IT, workers who enjoy learning and are motivated to keep their skills current appear to do better in the workplace. In contrast, employees who decide to go on “career cruise control” become vulnerable during layoffs because their skills are perceived to be “rusty.”
Obvious advice? Perhaps. But many workers don’t seem to follow it. What do you think about this issue?
Check out the Computerworld article.
Yappa says
Hi Robert,
Thanks for the interesting blog. I have a question about this post… why do you think it’s important that employees pay for their own training? If the company will pay, that seems to make more sense.
Robert Desprez says
Hi Yappa,
If you are lucky enough to work for a company that will pay for your training and you think it will benefit your career, great! However, most companies won’t pay for training, especially if they don’t see a benefit to themselves. In that case, I’d recommend you pay for your own education if possible.
Thanks for your comment!
Peter says
I strongly disagree on several points …
“…companies have lots of job openings but there are still many IT workers who are struggling to land jobs. The reason? A skills gap. The jobs are there but workers’ skills are not.”
I think that the key to solving this problem lies in the answer to a simple question: Whose responsibility is it to train workers, educational institutions or employers? Whose responsibility is it to identify real-world job requirements and needs? Are job postings realistic? Do they offer truly fair compensation for the skill sets demanded? In many cases the answer is NO.
Do we have a system in place where employers and educational institutions cooperate with one another to create those skills? NO.
On the other hand, do employers hire fresh graduates from school? I don’t think so. They constantly require years of experience that students don’t have and will never get if they are not hired by someone. They prefer to poach trained professionals from their competitors instead. It’s cheaper, but at the end of the day, SOMEBODY will have to start creating that experience.
They expect students to bear the full weight of an IT education that is increasingly costly, to mortgage their lives with student loans, and when they are done studying what they thought would be relevant, they run face-first into the years-of-experience wall.
Every employer wants a fully trained workforce, but none wants to bear the cost. Reality check. No school in the world will be able to provide relevant skills while they remain divorced from employers. Experience won’t be there until people are hired.
So, who is really responsible for the so-called skills gap? Both the employers and the educational institutions, not the workers. If the institutions are not teaching relevant skills, and the employers are not willing to do it in-house, or fail to communicate properly, how could workers know any better?
“Pursue training and certifications on your own time and at your expense.”
Training and certifications are not cheap. If a money-making company finds it difficult to bear the expenses, what could you expect from (most usually unemployed/underemployed) students?
Robert Desprez says
Hi Peter,
You make some interesting points. I certainly agree with you when it comes to the crushing debt levels that many students face when they graduate. For those individuals, it may simply be impossible to enrol in more courses and certifications.
As far as educational institutions teaching the wrong skills, I believe it’s incumbent on every student to make sure the school is teaching job-ready skills. If not, go elsewhere. More and more schools are offering online courses, enabling students to evaluate a wider cross-section of training.
I should note that “training” doesn’t always mean university or college courses. Most of my learning has been self-directed, through books, blog articles, and trial versions of software. So education doesn’t have to be prohibitively expensive.
To sum up, I realized some time ago that no company, government body, or educational institution will “take care of my career.” For better or worse, it’s up to me to make sure I’m employable.
Thanks for your comment.
Charles Hudson says
I certainly agree with the concept of continuing education and taking responsibility for one’s own development. Over the years I have spent quite a bit of my own money for classes and certification exams.
At the same time, with more and more young people graduating from college with unheard-of debt loads for their education, and the stagnation of the wages and salaries that employers are willing to pay, I don’t see how we can expect them to continuously invest more money in education, obtaining skills that may well be obsolete within just a few years.
I think that employers need to step up and realize that they also have a responsibility to invest in their employees. Back in the post-WWII era, when the US economy was the envy of the world, employers readily accepted that they needed to invest in their workers through on-the-job training, apprenticeship programs, after-hours training classes, and tuition assistance programs. They understood that their employees, along with their skills, were an asset, and just like any physical asset, they had to invest in their maintenance and upkeep. Today, many employers look at those they hire as expendable resources rather than as assets. The results are obvious to anyone who wants to take a look.
Germany, which is, and has been for some time, the strongest economy in Europe, has employers who look at employees as long-term investments and have active apprenticeship programs as well as development programs for new college graduates who are hired into their industrial firms. Our employers tell us that they can’t do this and stay in business, but the Germans do it and remain extremely competitive in the world markets. Perhaps a study of what works is in order rather than just shoving every possible cost off onto the employees.
Robert Desprez says
Hi Charles,
Yes, I agree: It seems that most employers have stopped treating their staff as an asset. I’ve actually heard managers say, “If we can’t get the technical writer to accomplish the work, we’ll just get another,” as if they’re swapping cogs in a machine.
A comparison of companies in Germany and North America would be really interesting.
Thanks for your comment!
Eddie VanArsdall says
Great post, Robert.
Whereas employment opportunities in my geographical area (Washington, DC) used to have relatively standard requirements, I find that employers now look for prospective employees who can wear many hats (e.g., writing, editing, training, design, development). Employers are not likely to fund training related to forward-looking skills unless they see an immediate need for those skills. I try to keep up by maintaining a subscription to a video training service (lynda.com). I can get up to speed quickly on many different subjects. Yes, it can be expensive, but it has already paid off.
WyssWriter says
I think too much emphasis is placed on tools skills and not enough on analytical skills and experience. How is the IT professional supposed to know which tool sets to focus on? One could choose the latest and greatest, but most IT companies are slow to adopt the latest and greatest. To me, the greater skill is the ability to learn and master new tools efficiently (not which specific tools one can list on one’s resume), the ability to manage projects and communicate effectively, the ability to look at a large, complex project and break it down into manageable pieces, the ability to work independently or on a team — all those soft skills.
Businesses would do well to look for the soft skills and acknowledge that these employees are smart enough to learn the tools used for that particular company and to aid them in gaining the skills either through on-the-job training or paying for classroom training.
I’m not saying that one should not learn a new tool on one’s own; one should. It shows potential employers we’re still eager to learn. Choose a tool that you’re genuinely interested in learning. Then you can look for a job where you can use the new tool skills, keeping in mind that you may or may not find that job. Even if you’ve chosen to learn the latest and greatest technology, you may have to continue in your current job or accept a new job that still uses old technology, because the reality is we have to pay the bills.
Tim James says
Good points, especially for those of us on the cusp of our “golden years,” hoping we can coast to the finish line on the skills we presently possess. We need to expand our horizons, whether our companies are footing the bill or not. Like anything else the company spends money on, we need to demonstrate a return on the investment if we wish to have the company pay for our development. Even now, for all the times I was accused of having my head in the clouds, it appears having my head in the “cloud” is exactly where I need to be right now!
Robert Desprez says
Hi Tim,
I agree that everyone should continue learning, even if one is nearing retirement.
Thanks for your comment!
Juan Elevancini says
Hi all,
From my experience, I’m a firm believer in approaching career development like a real entrepreneur. This endeavour includes education, training, and volunteer work. Without going into details, it has worked for me in the long run. And like any other business venture or investment, there are risks and opportunities involved.
The reference to Germany gives me more confidence that I should continue with my German studies and for the long run too.
When some of my friends said to me, “how lucky you are to land that job (or contract)…” I reply with a quote from Pierre Trudeau:
“Luck, that’s when preparation and opportunity meet.”
RDesprez says
Hi Juan,
I like that quote. Thanks for sharing!