
Consumerization of IT?

CIO Magazine hosted a webinar in October about the "Consumerization of IT", a topic that has gotten a lot of press lately. The webinar covered these points:

  1. Why consumerization is a major wave of change, not a fad
  2. The benefits of embracing and supporting consumerization
  3. The benefits of supporting a Bring Your Own Device (BYOD) policy
  4. The role of desktop virtualization in helping IT gain control while users keep their independence
  5. Best practices for rolling out a BYOD initiative

At issue with "Consumerization" is the concept of "BYOD", where faculty and staff prefer not to use the institution's laptops and desktop computers, but would rather bring their own devices. This idea tends to be more popular the younger you are, and is most evident with tablet computing (the iPad) and smartphones.

For example:

  • I have my own iPad, and I love it as a portable email device. I'll take it to meetings when I travel to the Twin Cities campus, and use it during "downtime" to read and respond to email. I don't really use it for games, but I have a few of those installed too.
  • If I'm sharing a ride with someone else, I may use my Android phone to check email from the passenger seat. It can be a challenge to write a long reply on the tiny keyboard, but it's great to stay up to date with my email during that three-hour trip to and from the Twin Cities campus.

Neither device belongs to the University, but I use them for work. This is an example of "BYOD".

While BYOD has worked out great for me, it's a topic that often keeps CIOs and IT Directors up at night. If the University doesn't control that device, how can we guarantee adherence to standards or controls?

A friend and I discussed this "loss of control", and how the IT industry went through the same agony when organizations moved from the mainframe to the PC. They even used the same rhetoric at the time: "Consumerization of IT".

To understand the current trend, let's look at a brief history of business computing:

When businesses started to use computers to organize information and process large amounts of data, everything was neatly stored on a mainframe. This "timeshare" system allowed all the data to be managed "centrally": the equipment could be easily audited, and the organization could control how information was accessed. If you needed to process the data stored on the mainframe, you used a "terminal", not much more than a monitor and keyboard at your desk. But that was only a "view" into the system; the real processing always took place on the mainframe, located in an isolated server room.

In the early 1980s, IBM introduced the IBM PC. This put individual computing within the budget of home users or departments within an organization. With the right software, a worker could process data without having to go through the company's mainframe. Directors and managers could use the "personal computer" as a tool to solve new business problems.

But at the same time, the industry began to worry that technology was slipping out of IT's hands. PCs were not mainframes, and central IT did not know how to control a computer you could buy at a store: the "Consumerization of the desktop". Many in IT scoffed at the PC as a "consumer" desktop, claiming that "personal computers" were too underpowered or unsophisticated to be a suitable replacement for mainframes.

But eventually, the PC pushed aside the mainframe, and IT had to find ways to adapt to the new model, and adopt the PC as a business tool.

Fast-forward to today, 30 years later. We are hearing the same rhetoric about tablets and smartphones. Except this time, it isn't individual departments bringing the new devices into the workplace; it's individual people. Central IT worries about controlling the data on these devices when they aren't managed by the organization. Still others dismiss the tablet and smartphone as work devices, calling them a "fad" that will pass.

IT will ignore the impact of BYOD at its peril. Look around campus, at our students. I haven't seen many with iPads or other tablets, but they are there, and their numbers are growing. Many students look to their smartphone to check email, not a laptop or a lab computer. The era of BYOD is already here.

As an institution, we need to embrace the concept of BYOD, and find ways to leverage it. How do we support these personal devices without putting data security at risk? Cloud computing is a good first step, because the data isn't actually stored on the device, it's in the cloud. But we need to plan ahead for where we need to be in 1 year, in 5 years. How will the IT landscape change with BYOD? We're now working on an "IT Masterplan" effort, and one of our goals is to find ways to adapt to BYOD, mobile technology, and mLearning. I welcome your input.