Category Archives: Information Systems

Small pieces (yaks) (very) loosely joined

My concept of the value and power of the internet rests largely on the idea that it is basically many small pieces loosely joined. That idea embraces the use of small, single-purpose widgets that can stand alone, which makes it easy to build and test the widgets. The widgets then combine by being joined, but the joins are loose: there are few dependencies on the other included pieces.

The joining makes the overall result very robust and powerful. The small pieces make it very flexible and easy to manage.

That works for the internet itself, and works very well.
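Purely as illustration, the "small pieces loosely joined" idea can be sketched in a few lines of Python: three stand-alone, single-purpose functions that know nothing about each other and are joined only by plain data, not by shared state or heavy dependencies. The function names are invented for the sketch.

```python
# Three stand-alone "small pieces": each does one job and can be
# built and tested on its own.

def fetch(source):
    """Stand-alone piece: produce raw text (here, a canned string)."""
    return "alpha beta gamma"

def tokenize(text):
    """Stand-alone piece: split text into words."""
    return text.split()

def count(words):
    """Stand-alone piece: count the words."""
    return len(words)

# The "loose join": a simple pipeline. Any piece can be swapped out,
# because the only contract between them is plain data in, plain data out.
result = count(tokenize(fetch("anywhere")))
print(result)  # 3
```

The loose join is the pipeline itself; replacing any one piece does not disturb the others, which is exactly why the model is flexible and easy to manage.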

But we seem to have major difficulty making the same model work for other systems. I used to think it was just me and my own limited skills at programming and system design. (The “pieces+joined” is almost the antithesis of “system design” but that’s another story….)

And then I came across this gem in today’s O’Reilly “Four Short Links.”  Jeffries’ experience struck me as my life on a pretty much daily basis.

So why does the internet work, but when Jeffries, or poor slobs like me, try to work with the same model, we end up neck deep in yaks?  I don’t have a ready answer for that, but one possibility that immediately comes to mind is the use of standards.  The internet, and its child the web, work because there are clearly articulated and enforced standards that you have to follow, or else your application won’t work.

But the likes of Google App Engine and Python (substitute your favorite environment tools here) don’t follow the same clear articulation of standards.  Depending on what git repository you use, what options you chose when installing Python and its many libraries, or your own development structure, someone else’s sample code may or may not work right.  You are just as likely to end up with a room full of yaks as a working tool.
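One small defense against that "will this sample code even run here?" problem is to check the environment up front instead of failing halfway through. A minimal sketch, with the version floor and module names invented as stand-ins for whatever a given sample actually needs:

```python
# Check the interpreter version and required libraries before doing any
# real work, so a mismatched environment fails loudly and early.

import importlib.util
import sys

REQUIRED_PYTHON = (3, 8)                 # assumption: minimum version the sample needs
REQUIRED_MODULES = ["json", "sqlite3"]   # stand-ins for real dependencies

def environment_report():
    """Return a list of human-readable problems; an empty list means 'looks OK'."""
    problems = []
    if sys.version_info < REQUIRED_PYTHON:
        problems.append("Python %d.%d+ required, found %s"
                        % (*REQUIRED_PYTHON, sys.version.split()[0]))
    for name in REQUIRED_MODULES:
        # find_spec() checks availability without actually importing the module.
        if importlib.util.find_spec(name) is None:
            problems.append("missing module: " + name)
    return problems

print(environment_report())
```

It doesn’t shave the yak, but at least it tells you the yak is in the room before you’re elbow deep in it.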

Just an early morning thought dump…

RIP hitchBOT, Long Live hitchBOT

So hitchBOT apparently met its demise in the City of Brotherly Love. But it did what it set out to do:

“We want to see what people do with this kind of technology when we leave it up to them,” Frauke Zeller, one of the creators and an assistant professor in professional communication at Toronto’s Ryerson University, told the AP. “It’s an art project in the wild — it invites people to participate.”

And as hitchBOT itself said, “I guess sometimes bad things happen to good robots!”

I fully expect that hitchBOT Jr. will rise soon. After all, that is the pattern of the humans who originally brought hitchBOT into our presence.

Thank you, hitchBOT, for your wonderful journey, and thanks for what has been and what will become!

Exploring now– Bitcoin, Augur

TechCrunch today pointed to Ethereum and Augur, which started my latest tech exploration and, of course, led to Bitcoin.  In spite of its widely recognized role in the underground ’Net world as a means to conduct financial transactions securely and without regulatory oversight (think Silk Road, or marijuana stores circumventing the US ban on access to traditional banks), Bitcoin relies upon some very interesting and capable technology approaches.  Deeply rooted in the open source world, Bitcoin is fully decentralized, relying on peer-to-peer technology and strong cryptographic security.

From what I can see, Ethereum is sort of the “meta” version of Bitcoin. It seeks to provide a framework of decentralized connectivity and strong security upon which tools like Bitcoin could be readily built.  And Augur is one of the first efforts to do that.  In Augur’s case the industry is prediction markets.
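The core trick underneath Bitcoin’s ledger can be sketched in a few lines: each block stores a hash of the previous block, so altering any past entry invalidates everything after it. This is an illustration only; real Bitcoin adds proof-of-work, peer-to-peer consensus, and much more on top of this basic hash chain.

```python
# A toy hash chain: each block commits to all prior history via SHA-256.

import hashlib

def block_hash(prev_hash, data):
    """Hash a block's data together with the previous block's hash."""
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def build_chain(entries):
    """Chain entries together; each hash depends on everything before it."""
    chain = []
    prev = "0" * 64  # genesis placeholder
    for data in entries:
        h = block_hash(prev, data)
        chain.append({"data": data, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain):
    """Re-derive every hash; any tampering anywhere breaks the chain."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block["hash"] != block_hash(prev, block["data"]):
            return False
        prev = block["hash"]
    return True

chain = build_chain(["Alice pays Bob 1", "Bob pays Carol 1"])
print(verify(chain))              # True
chain[0]["data"] = "Alice pays Bob 100"   # tamper with history
print(verify(chain))              # False
```

That tamper-evidence, replicated across thousands of independent peers, is what lets the whole system work without any central regulator or bank.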

My exploration has just started but this whole world of open-source, highly secure, decentralized operations strikes me as one of the fundamental economic shifts ushered in by the Internet (noting that “the Web” is only one part of “the Internet”).

Fascinating stuff.

Practical Considerations for IT Structure

Technology, in particular data management systems, continues to undergo rapid and dramatic change. For many non-profit agencies and small, mission-conscious businesses, the task of keeping current with the changing tech environment is daunting, especially given the common lack of adequate tech skills and resources.

Some of the key factors and changes include a continuing shift of emphasis to outcomes and performance management instead of output measures; the ascendancy of mobile computing and bring-your-own-device approaches; the “Internet of Things” meme; and huge advances in robotics, artificial or machine intelligence, and virtual reality platforms. For many in the non-profit world, those changes are interesting but can seem to have little practical application to day-to-day operations. And when they are perceived as applicable, their actual effective implementation appears to be far beyond the local operational grasp.

But those changes are central to the fundamental shift in global economies away from the industrial age and its focus on labor and capital, to an information or ideas age that focuses on data, information, knowledge, and creativity. So how can non-profits cope with those changes?

I don’t have a unified, overall approach to that, but am beginning to see some of the pieces and considerations.  First, the new world of technology and data does not require a huge refresh of either software or hardware. That means that any organization with a basic IT infrastructure (i.e., desktops and laptops joined in a local network with widely available Internet access) can leverage that infrastructure into the new world.

Much of the recent shift has involved “The Cloud,” from cloud storage to hosted (cloud-based) applications. Connecting a typical desktop or laptop to the Internet and its resources essentially makes that local workstation a “thin client.”  Unlike the traditional thin client (or even VDI) approach championed by Citrix, VMware, Oracle, etc., the new thin client gets its processing power and data from a multitude of servers and providers. Accessing that power and data can be done with simple, inexpensive, or legacy tools. A “general purpose” computer is not needed anymore.

Obviously that approach places increasing emphasis on the access to reliable, adequate connectivity. Google’s experiments with gigabit broadband and national debate about “Net Neutrality” illustrate that connectivity is a major concern in the new tech world.

One implication of the shift to distributed systems, to “small pieces loosely joined” (David Weinberger), is that organizations have to pay more attention to figuring out what pieces and tools should be accessible to various workers. (And then, of course, how those pieces get joined or linked together.) Is it really necessary for every field case manager to have access to the entire electronic service record on their mobile phone or tablet? The challenge becomes finding or building systems that follow the small pieces approach and having the resources to link them together into larger, meaningful and actionable systems.
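The "which pieces should which workers see" question can be made concrete with a small sketch: filter a service record down to role-appropriate fields before it ever reaches a mobile device. The roles and field names below are invented for illustration, not drawn from any real system.

```python
# Map each role to the record fields it is allowed to see.
FIELDS_BY_ROLE = {
    "field_case_manager": {"client_name", "next_appointment"},
    "supervisor": {"client_name", "next_appointment", "case_notes", "billing"},
}

def view_for(role, record):
    """Return only the fields this role is allowed to see; unknown roles see nothing."""
    allowed = FIELDS_BY_ROLE.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

record = {
    "client_name": "J. Doe",
    "next_appointment": "2015-08-10",
    "case_notes": "(full narrative notes)",
    "billing": "(billing detail)",
}
print(view_for("field_case_manager", record))
```

The point is that the policy lives in one small, inspectable piece (the role map), which can be joined to whatever larger record system the agency already runs.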

A major barrier in this new world is the issue of security. In the early days of computing, security was handled by restricting physical access, including keeping systems turned off unless being actively used. That approach worked well in the world of mainframes, mini-computers, PCs, local area networks, and closed wide area networks. But in the always-on, everything-is-open-and-transparent world of the vast and uncontrollable Internet, it becomes harder to use physical or even virtual restrictions.

The challenge of secure computing is probably the biggest issue facing not just non-profit agencies but all users of technology.