The history of computing can be traced through the popular buzzwords of the day. In fact, at some point we should run a contest where everyone submits their five all-time favorite computer industry buzzwords. There have been dumb terminals, smart terminals, client-server, thin client, peer-to-peer, virtualization, containers, cloud, PaaS, SaaS, IaaS…the list, and the acronyms, stretches to the horizon.
The one constant through all of this is that, regardless of the buzzwords applied, computers are inextricably tied to companies large and small getting work done. And the truth is, the ONLY reason to invest in technology in the first place is that it will do something to save, or even make, money.
It should be obvious that, since their inception, computers have dramatically improved productivity by enabling critical business processes to be automated and thereby completed faster. Think for a moment of the communications process even in the 1980s, and for all the hundreds of years prior. An executive hand-wrote or dictated a “letter.” Someone else transcribed, typed and mailed it…and in a couple of days it reached the person for whom it was intended. It seems quaint now, when one can speak to a mobile device and have it transcribe and send the same message as an email, shrinking that particular business process from days to seconds. The savings in money and time are huge!
The application of technology has had the same profound impact on every business process imaginable: communications, marketing, finance, HR, R&D…everything. So much so that perhaps the most complex business process IN business today is the management and application of the technology itself. As technology has been adopted, it has added countless layers of complexity which, like it or not, essentially fall to the hardy SysAdmin to manage.
Users just want the thing to work. SysAdmins have to ensure not only that it works, but that it works with everything else, that it has the latest software and updates, that it is secure…again, the list goes on and on. The complexities of supporting 1,000 users, each with 50–100 apps on 3–4 devices apiece in a virtual environment, all interfacing with corporate databases at one level or another…well, I don’t need to tell you! But even many of those tasks have been automated, making life at least manageable for SysAdmins, mostly.
So we’ve got these amazing machines that have made businesses and the people in them move thousands of times faster. What have we, the human race, gotten out of these advances, and what’s next?
Many would argue that, in the macro view, we haven’t gotten much! The widely discussed Solow Productivity Paradox holds that despite massive corporate investment in technology nationally, there have not been concurrent gains in “economic productivity” as measured by output at the national level. In fact, The Center for the Study of Income & Productivity, in collaboration with the Federal Reserve and The Economist, projects a 35% delta between actual productivity growth from 1947 to 2012 and where productivity growth would have been had pre-1947 levels simply been maintained (see chart). In other words, we’ve actually LOST ground!
There are numerous reasons, some would say excuses, for this productivity lag, and the very existence of the paradox is itself controversial. However, several after-the-fact and interrelated observations would seem to go a long way toward addressing it. First, the sheer complexity and expense of the automation process discussed above has cut into profitability as well as other measures of productivity gain. Next, productivity overall hasn’t been enhanced because, in great part, the application of technology up until now has been to tasks that already exist. It wasn’t innovation so much as it was automation. It would follow, finally, that since technology was just automating existing “computational” tasks, making them go faster, companies could do the same amount of work with fewer people. So while per capita productivity is enhanced, effectiveness and overall output are not. The same output is achieved by fewer people doing more, faster.
The profitability that could be expected from such per capita gains could be accounted for by the investment in the technology itself: company A has saved $X by automating several tasks, allowing it to lay people off, but it had to buy the technology to automate those tasks. In fact, one argument is that perhaps those economic gains are still in the future, and that the time span over which measurements have been taken is simply too short.
If we agree that business moves faster, that technology continues to advance and that most meaningful “tasks” have already been automated, what’s left to do? What’s next? It would appear that the most recent technological developments will not focus on the automation of already existing tasks. Instead, technology seems to be heading in directions that could fundamentally alter business and “work” entirely.
Doug Balog, General Manager for Power Systems at IBM, likes to talk about implementing the workloads of the future. He talks about proprietary Unix-based systems still being critical, but maintains that they are not where new workloads are being deployed. “The new era of computing will be focused on data: data mining, unstructured data and business analytics; cloud: cloud management, deployment and cloud apps; and engagement: how companies engage with customers.”
Mike Diehl, a Linux Journal writer, SysAdmin and all-around visionary, equates the workload of the future with the WORKPLACE of the future, asserting that where the work gets done is as interesting and important from a management and administrative perspective as what gets done. According to Mike, “technology has enabled complete ‘location ambiguity.’ It doesn’t matter where a worker is now, as long as he or she can log in, access the data and information they need and manipulate it appropriately. This ‘cloud-based work’ will continue to grow as a trend, requiring a lot more virtualization and all the technology that goes along with it.”
It would appear that this trend could have the positive impact on overall productivity that has been missing so far, despite decades of corporate investment in technology. According to census data cited in US News & World Report, remote workers, or telecommuters, put in 5–7 more hours on the job than their office-bound counterparts. A Stanford University experiment concluded that full-time telecommuters are 13% more efficient, with overall performance gains of 13 to 22% over those in the office.
So the workloads of the future will dramatically impact the where, what and how of every company. But the technology has to change accordingly. Mr. Balog describes his job, and IBM’s, as capturing “the newest workloads being deployed. Open Source is where those apps are being written and deployed.” Hadoop, Drupal, Android, MongoDB and a slew of other Open Source advancements prodded him and IBM to found the OpenPOWER Foundation, applying the same principles to the hardware side that Open Source development has applied to software. “The ecosystem and openness drive cloud delivery, innovation and data-rich apps,” said Mr. Balog, “just as has happened in the Open Source software ecosystem.”
When we discuss the workloads of the future, perhaps the 800-pound gorilla is the Internet of Things. To quote a 2008 Berkeley study entitled Workloads of the Future: “the long-predicted world of fully ubiquitous computation and communication is finally emerging….Where today a billion mobile phones are sold per year, in the not so distant future, perhaps upward of a trillion sensory nodes per year will be sold and deployed – with the majority of these being wirelessly connected. This has the potential to fundamentally change the ways we interact with and live in this information-rich world.”
This 2008 study precisely anticipated the Internet of Things before it had a name. Today, along with the advanced “computational tasks” like analytics and unstructured data management mentioned previously, we are tasked with understanding and managing data generated by billions of sensors and actuators all around us. Ken Lutz, a director at the Berkeley Wireless Research Center and one of the study’s authors, talks about exactly that, positing that we are trending toward “content-centric networking, where computing is all about the data…not the IP address. The internet was originally about linking two computers, or many computers, together. Today it is much more about linking data together.”
So building the “Internet FOR the Internet of Things” may be the ultimate workload of the future. Mr. Lutz terms it the “Global Data Plane.” It becomes all about mobility, of both people and data, across multiple devices. Hierarchical storage and cloud live side by side, serving equally critical purposes. Encryption of everything becomes massively important, as every data point from every one of the trillions of actuators needs to be verifiable not only for correctness, but for authenticity.
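To make the authenticity requirement concrete, here is a minimal Python sketch of one common approach: attaching a keyed HMAC tag to each sensor reading so a receiver can verify the message came from a device holding the shared key and was not tampered with in transit. The key, sensor ID and field names are purely illustrative, and real deployments would layer key management and transport encryption on top of this.

```python
import hashlib
import hmac
import json

SHARED_KEY = b"example-shared-secret"  # hypothetical per-device key


def sign_reading(reading: dict) -> dict:
    """Attach an HMAC-SHA256 tag so the reading's origin can be verified."""
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return {"reading": reading, "tag": tag}


def verify_reading(message: dict) -> bool:
    """Recompute the tag over the reading and compare in constant time."""
    payload = json.dumps(message["reading"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])


msg = sign_reading({"sensor_id": "thermostat-42", "temp_c": 21.5})
print(verify_reading(msg))  # True for an untampered message
```

Any change to the reading, even a single digit of the temperature, produces a different tag and the verification fails, which is exactly the property needed when trillions of nodes report data that downstream systems must trust.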
The infrastructure of the “Global Data Plane” is evolving. Obviously the Internet of Things will have a far-reaching impact on both work and life for everyone, at least everyone in first-world countries! The workloads Mr. Balog of IBM describes will also have a profound impact, and those technologies will at least contribute to the building of the Global Data Plane. However, perhaps once again the investments required to build it out will eat up some of the profitability, and the work involved in monitoring and analyzing the data, as well as managing the complexity of the systems, may cut into measurable productivity gains. If that’s the case, perhaps the metrics for measuring productivity gains need to be reanalyzed. If air travel, and even driving, become measurably safer, if more accurate weather prediction saves thousands of lives per year, if surgery becomes more precise and vaccine development faster…those are some of the workloads of the future that may not impact productivity or profitability, but the world is a better place for them.
By John Grogan