Technology, and data management systems in particular, continues to undergo rapid and dramatic change. For many non-profit agencies and small, mission-conscious businesses, the task of keeping current with the changing tech environment is daunting, especially given the common lack of adequate technical skills and resources.
Some of the key changes include a continuing shift of emphasis toward outcomes and performance management and away from output measures; the ascendancy of mobile computing and bring-your-own-device approaches; the “Internet of Things” meme; and huge advances in robotics, artificial or machine intelligence, and virtual reality platforms. For many in the non-profit world, those changes are interesting but can seem to have little practical application to day-to-day operations. And even when they are perceived as applicable, their effective implementation can appear far beyond the organization’s operational grasp.
But those changes are central to the fundamental shift in global economies away from the industrial age, with its focus on labor and capital, toward an information or ideas age that focuses on data, information, knowledge, and creativity. So how can non-profits cope with those changes?
I don’t have a unified, overall approach to offer, but I am beginning to see some of the pieces and considerations. First, the new world of technology and data does not require a huge refresh of either software or hardware. That means any organization with a basic IT infrastructure (i.e., desktops and laptops joined in a local network, with widely available Internet access) can leverage that infrastructure to participate in the new world.
Much of the recent shift has involved “The Cloud,” from cloud storage to hosted (cloud-based) applications. Connecting a typical desktop or laptop to the Internet and its resources essentially makes that local workstation a “thin client.” Unlike the traditional thin client (or even VDI) approach championed by Citrix, VMware, Oracle, and others, the new thin client gets its processing power and data from a multitude of servers and providers. Accessing that power and data can be done with simple, inexpensive, or legacy tools. A “general-purpose” computer is no longer needed.
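To make that concrete, here is a minimal sketch of what the “new thin client” amounts to in practice: a few lines of standard-library Python pulling a small result from a hosted service over the Internet. The endpoint URL, token, and field names are hypothetical placeholders, not any particular vendor’s API; the only assumptions are a basic Python install and a working connection.

```python
# Minimal sketch: a lightweight "thin client" asking a hosted service for data.
# The URL and token are placeholders for whatever cloud application an agency
# actually subscribes to; all heavy processing happens on the provider's servers.
import json
import urllib.request

API_URL = "https://api.example.org/v1/clients/summary"  # hypothetical hosted service
API_TOKEN = "replace-with-your-token"                    # issued by the provider

request = urllib.request.Request(
    API_URL,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
)

# The local machine only sends the request and displays the small JSON result;
# an aging desktop or an inexpensive laptop is more than enough for this.
with urllib.request.urlopen(request) as response:
    summary = json.load(response)

print(summary.get("active_clients", 0), "active clients this month")
```

Any machine that can run a browser or a script like this can act as the front end to far more computing power than it owns.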
Obviously, that approach places increasing emphasis on access to reliable, adequate connectivity. Google’s experiments with gigabit broadband and the national debate about “Net Neutrality” illustrate that connectivity is a major concern in the new tech world.
One implication of the shift to distributed systems, to “small pieces loosely joined” (David Weinberger), is that organizations have to pay more attention to figuring out which pieces and tools should be accessible to which workers. (And then, of course, how those pieces get joined or linked together.) Is it really necessary for every field case manager to have access to the entire electronic service record on their mobile phone or tablet? The challenge becomes finding or building systems that follow the small-pieces approach and having the resources to link them together into larger, meaningful, and actionable systems.
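One simple way to picture that question about the field case manager is a role-based “view” of the record: each kind of worker gets only the slice they need on a mobile device. The roles and field names below are invented for illustration and not drawn from any particular case-management system.

```python
# Sketch of the "small pieces" idea: each role sees only its slice of the record.
# Roles and field names are hypothetical examples, not a real system's schema.

FULL_RECORD = {
    "client_id": "C-1042",
    "name": "J. Doe",
    "next_appointment": "2015-06-03",
    "case_notes": "full clinical narrative lives here",
    "billing_history": ["2015-04-01: $120", "2015-05-01: $120"],
}

# Which fields each kind of worker may pull down to a phone or tablet.
ROLE_FIELDS = {
    "field_case_manager": ["client_id", "name", "next_appointment"],
    "billing_clerk": ["client_id", "billing_history"],
    "clinical_supervisor": ["client_id", "name", "case_notes"],
}

def record_view(record, role):
    """Return only the fields this role is permitted to see."""
    allowed = ROLE_FIELDS.get(role, [])
    return {field: record[field] for field in allowed if field in record}

# A case manager's mobile view carries no clinical notes and no billing data.
print(record_view(FULL_RECORD, "field_case_manager"))
```

The design question for agencies is less about the code and more about deciding, role by role, which pieces belong on which devices, and who maintains that mapping as programs change.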
A major barrier in this new world is security. In the early days of computing, security was handled by restricting physical access, including keeping systems turned off unless they were actively being used. That approach worked well in the world of mainframes, mini-computers, PCs, local area networks, and closed wide area networks. But in the always-on, everything-is-open-and-transparent world of the vast and uncontrollable Internet, it becomes much harder to rely on physical or even virtual restrictions.
The challenge of secure computing is probably the biggest issue facing not just non-profit agencies but all users of technology.