"The year of (insert trend here)" sounds familiar to anyone involved in the tech world. A new innovation arrives, it dominates the news cycle, and its popularity is followed by a chorus of talking heads declaring it the product of the year. Reality is quite different: Tech trends don't really follow a yearly cycle, and new computing paradigms usually take much more than a year to become indispensable parts of the industry.
Take cloud computing, for example: We said 2011 was going to be the year the cloud went completely mainstream. Now, in 2017, we can see that while cloud computing is definitely a fundamental piece of the IT puzzle, it still hasn't taken over, and it's not likely to ever do so.
It can be tricky to predict trends, but job search site Indeed thinks it has the answer. Its team put together information from job postings and site searches going all the way back to 2014. Its prediction? 2017 is going to be less about the cloud and all about machine learning.
2017: The year of the learning machines?
If Indeed's numbers are accurate, there's definitely a clear winner in the emerging tech field: machine learning. IoT follows in a close second place, and it's obvious why: Data gathered by IoT devices is best sorted and organized using machine learning techniques.
Big data job postings are still dominant as well: Someone has to deal with all that data before it's sorted by learning machines. Here we find the answer to the growing interest in machine learning: There's too much data for humans to sort.
“More companies know how important it is to log every piece of data and customer interaction, and it’s now simple, with Amazon, to spin up a bunch of nodes and build a model that has predictive power and then shut it down again at a very low expense,” said Doug Gray, SVP of Engineering at Indeed. “We’re now seeing that companies have enough data and processing power to actually use deep learning.”
It's a simple recipe when you break it down: IoT devices are everywhere, and they're producing tons of data. Humans can't sort it efficiently, so we create something that can: learning machines.
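As a toy illustration of that recipe (a program learning what "normal" looks like from the data itself, then sorting the rest automatically), here is a minimal sketch in plain Python. The sensor readings and the 1.5-sigma cutoff are invented for this example, not taken from Indeed's data or any real deployment:

```python
from statistics import mean, stdev

# Hypothetical temperature readings (degrees C) from a fleet of IoT sensors.
readings = [21.0, 21.5, 20.8, 22.1, 21.3, 35.2, 21.1, 20.9, 36.4, 21.6]

# "Learn" what normal looks like from the data itself.
mu = mean(readings)
sigma = stdev(readings)

# Flag readings more than 1.5 standard deviations from the mean as anomalies.
anomalies = [r for r in readings if abs(r - mu) > 1.5 * sigma]
normal = [r for r in readings if abs(r - mu) <= 1.5 * sigma]

print(anomalies)  # the two overheating sensors stand out
```

A real system would use a proper machine learning library and vastly larger datasets, but the shape of the task is the same: the more data IoT devices produce, the more valuable an automated sorter becomes.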
If the Internet of Things continues to grow, machine learning will grow along with it, because it has to. Datasets aren't getting smaller, but machines are getting smarter. Machine learning is a trend that isn't going away anytime soon.
What about the cloud?
Rather than a sign of the cloud's end times, Indeed's numbers more likely show that the cloud's growth is leveling off. The number of searches for cloud computing jobs hasn't moved, a good sign that the industry is saturated.
Cloud computing was a game changer when it first arrived and now it’s a part of daily life for anyone who uses a computer. That doesn’t mean that it’s a panacea for every business ill, however.
“Cloud adoption is nearly ubiquitous, but it’s not now and will not in the foreseeable future be suitable for all workloads,” said SolarWinds CIO Joel Dolisy. There are organizations managing datasets so large that it would take decades to migrate them to the cloud. Those groups won't likely be in the cloud anytime soon, but they'll still need machine learning professionals to make sense of their data.
The cloud isn’t dead, nor is the need for skilled cloud administrators vanishing. It’s been constant, and it will remain constant. We’re simply over the unfounded promises of a cloud-based future and can finally see what’s actually important.
The 3 big takeaways for MBI readers
- Research from job search site Indeed shows that machine learning jobs are growing fast, while cloud computing jobs have leveled off.
- IoT is growing as well, and pairing that with machine learning growth is natural: IoT devices produce huge amounts of data, and machine learning is the best way to sort it.
- Demand for cloud computing professionals has been steady since 2014. Cloud jobs aren’t going away—the industry just knows what it can and can’t do now.