Monday, March 18, 2013

Centralized IT - an obsolete notion?

Where do IT decisions now belong?
We are now into the second decade of the 21st century. This is the decade of social media, of a BYOD device in every pocket, of robust and mature open source and proprietary applications, of HTML5, IPv6 and the Internet of Everything. This is the decade when IT is in the plumbing of every organization; it's the tick to their tock. Each organization has a unique need and use for advanced technologies, a requirement to power the business with its own IT DNA. But as we look around, many organizations are still functioning as if it's Prince's 1999.

Technology has not only changed in the last 13 years, it's morphed and evolved into a new ubiquity. A good analogy might be the utility known as electricity. At the beginning of the 20th century, it was common for large businesses to have their own power plants. They had large organizations dedicated to maintaining generators and stringing lines to meet the demands of the manufacturing process. Today such a notion would be considered silly. Electricity became ubiquitous. There is now an outlet in the wall that you seldom give a second thought to.
I’m proposing that we are now in a similar place with IT here in the 21st century.

An obvious example is your Marketing department. Marketing has significantly different IT needs than it did even twelve months ago. Marketing needs to manage the enterprise's web, mobile and social network presence. This is about messaging, branding and customer contact. It's about positioning and market share; it's about outreach and inbound allegiance. It's about communication, both internal and external. It's about being agile and responsive. The fact that these requirements sit upon an IT infrastructure is secondary. To use another utility analogy, IT is the plumbing; Marketing is the elaborate fountain display.

Or, what about Finance? Their IT needs have become as sophisticated and demanding as Marketing's. They have unique requirements for government reporting, risk management, asset management, security and human resource management, as well as shareholder communications and financial management. Finance does not see the need to underwrite its own insurance, and in this age it would seldom consider owning a fleet of vehicles. It's the same as owning your own printing plant just to print the annual report: it might be fun, but it does not make sense.

We could spend time in each department, and I propose that here in the 21st century we'd discover the same in each. Departmental demands are unique and becoming more so. Managing them from a central authority is not only no longer required, but is actually detrimental to being responsive in a fact-paced (and fast-paced) market. I believe being centralized impairs the ability to succeed.

So, here's the point of discussion: instead of a central CIO, perhaps there should be a "Marketing Technology Subject Matter Expert", a "Finance Technology Subject Matter Expert", an "Operations Technology Subject Matter Expert" and so on, each responsible for assuring that departmental IT demands are being met. They also maintain a viewpoint into emerging systems that should be considered for incorporation into the departmental tool set, and are held accountable for the coordination of enterprise-wide technology decisions. Then there is an "Enterprise Security Expert" who is held accountable for maintaining physical and virtual security; this function coordinates with all the departmental SMEs.

I believe that this group of SMEs becomes a "virtual" CIO. The "utility" requirements are then outsourced to the cloud, or to the local electric utility, or the water company, or….


"For first we use machines, then we wear machines, then we become machines."
Kim William Gordon
www.bitnus.com 
www.kimwilliamgordon.com

Sunday, March 17, 2013

A brief history of the interface: We become the ghosts in the machine.


We have undergone numerous stages of interaction between humans and computers; we are getting closer to actually "being inside the machine", as opposed to being on the outside looking in.

The ways in which individuals interact with computers have changed dramatically.

• Keyboard Interface – the initial means of interaction with machines was exclusively through keyboard commands (or secondary keyboard-derived media such as punched tape, punched cards and magnetic tape). Computers at this time were expensive, and interacting with them required extensive training and complex programming knowledge.

• Mouse Interface – at the beginning of the "personal computer" era, keyboard input was augmented with the now common "mouse". This allowed individuals to interact with the machine in a more free-form manner, and enabled computer-based analogs to the real world, such as "folders" and "files". This input interface made possible such disruptive technologies as Photoshop and 3D Computer Aided Design / Computer Aided Manufacturing, as well as the commonly known word processors, spreadsheets and other digital replacements for traditional business, engineering and creative systems.

• Touch Input Interface – now very familiar to most, touch screens allow direct manipulation of the information within the computer via "touch, click, squeeze and drag" maneuvering. Generally there is no mouse component to this interface; however, the ability to use a digital keyboard is common. Recent innovations in this input method include "raised" interfaces and "touch screen modification", where the touch screen changes its physical characteristics (such as becoming "rougher" or "dimpled") based upon the application's requirements.

• Post Touch Input Interface – this is the new horizon for machine interaction. Composed of both voice and gesture recognition, this emerging means of interaction allows the machine to recognize basic human communication and respond to it. Voice recognition systems are part of this environment.

• Wearable Computation Devices – although technically not a human interface method, machines which monitor individual human performance telemetry (tools to measure a person's pace, GPS coordinates, heart rate, blood pressure, etc.) as well as the immediate environmental conditions are already commonplace as add-on apps within many smartphone systems. Radio frequency identification (RFID) tags, face recognition systems and crowd-shared data (best friends, common interests, favorite restaurant, etc.) all add bulk to this "cloud" of data accumulated per person. Think of Pig-Pen, the Peanuts character: the cloud that follows a modern human around is made not of dust and dirt, but of personal data, following the individual 24 hours a day and accessible instantaneously (a minimal sketch of such a personal data cloud follows this list).

• The Internet of Everything – in 2008 the number of "things" connected to the Internet (devices, lights, heating systems, traffic systems; the list is inexhaustible) surpassed the number of humans connected. This vast and growing amount of data can be, and is, captured and included in the human-computer interface.

• Computer Augmented Human Systems – the sheer volume of human augmentation systems coming to market is incredible. Brain implants designed to assist Alzheimer's patients with memory retention, and implants designed to reduce or eliminate epileptic seizures, are here and being integrated into the human cranium. Computer-assisted replacement limbs, optical sight enhancements and computer-directed blood-borne nanotechnology are no longer in the realm of science fiction, but science fact. To quote Arthur C. Clarke: "Any sufficiently advanced technology is indistinguishable from magic."

• Augmented Reality Interaction – combined with the Post Touch Input Interface, wearable computation and the Internet of Everything (above), this interaction places the individual within the computing environment itself. The Google Glass project and the Apple equivalent are charting this new territory.
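
To make the "personal data cloud" idea concrete, here is a minimal sketch in Python of how such a record might be structured. Every name in it (TelemetrySample, PersonalDataCloud, the sample kinds) is a hypothetical illustration, not any real wearable vendor's API:

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class TelemetrySample:
        """One reading from a wearable sensor (heart rate, GPS fix, etc.)."""
        timestamp: datetime
        kind: str       # e.g. "heart_rate_bpm" or "gps"
        value: object   # a number, a (lat, lon) pair, ...

    @dataclass
    class PersonalDataCloud:
        """The 'Pig-Pen cloud': the data that follows one person around."""
        person_id: str
        samples: list = field(default_factory=list)       # raw wearable telemetry
        shared_facts: dict = field(default_factory=dict)  # crowd-shared data: friends, favorite restaurant...

        def record(self, kind, value):
            """Append a timestamped reading to the cloud."""
            self.samples.append(TelemetrySample(datetime.now(timezone.utc), kind, value))

        def latest(self, kind):
            """Instantaneous access: the most recent reading of a given kind."""
            for sample in reversed(self.samples):
                if sample.kind == kind:
                    return sample.value
            return None

    # The cloud follows the individual around, accessible at any moment.
    cloud = PersonalDataCloud(person_id="runner-42")
    cloud.record("heart_rate_bpm", 148)
    cloud.record("gps", (40.7128, -74.0060))
    cloud.shared_facts["favorite_restaurant"] = "Joe's Diner"
    print(cloud.latest("heart_rate_bpm"))  # -> 148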

As we dive deeper into the bowels of the machine, the concept of "reality" will begin to blur. We will be looking at the universe from within the interior of the computer, and we will begin to have difficulty determining where "reality" ends and "augmented reality" begins. At this point, we truly become the Ghosts in the Machine.

"For first we use machines, then we wear machines, then we become machines."
Kim William Gordon
www.bitnus.com 
www.kimwilliamgordon.com

Thursday, March 14, 2013

You Are Here: The Fourth Industrial Age.

The last 600 years have been remarkable times for humanity. When Gutenberg ignited the Communications Age in the 1400s with the invention of movable type, he created the foundation for the next age: the First Industrial Age. The Communications Age provided cheap and (in those times) rapid distribution of ideas and information. It was just a matter of time before the world began to re-engineer itself.

The First Industrial Age launched with the invention of the steam engine in the 1760s. This achievement changed everything. No longer were ocean-going vessels at the whim of the winds and tides; they were "under power". Trips that previously would have taken months could now be completed in weeks, sometimes even faster. We put engines into everything. We put them to work in the textile industry, an industry that had previously been a cottage industry, causing unemployment for families who had woven fabrics for generations. The steam engine powered the new locomotive industries and encouraged new engineering techniques with hydraulics, which directly led to better sanitation in major cities. Better sanitation resulted in less disease, and less disease resulted in bigger and healthier populations. We attained our first billion souls in population around 1804; technology was the catalyst.

The Second Industrial Age began at the end of the 1800s, with the harnessing of the magical power of electricity. Think about it. Prior to the 1900s, the world was a dark place, lit only by candles, kerosene or natural gas. At the 1904 World's Fair, electricity debuted in the "Palace of Electricity", the centerpiece of the event. Electricity was touted as the technology that would change the world. They were right. Electricity powered the telegraph, then radio, and eventually television. The telephone was the centerpiece of the most significant change the world has seen in over a hundred thousand years. Electricity changed everything yet again. It energized the Communications Age pioneered by the Gutenberg press. Suddenly the world was brighter and talking, globally. And we had a lot to say. By 1927 there were 2 billion souls in population.

The Second Industrial Age directly led to the Third Industrial Age: the development of computational engines, computers. The first electromechanical computers appeared in the early 1930s, but it wasn't until World War II that the technology began to evolve. It exploded in the 1980s with the introduction of the personal computer. We started to put computers into everything. And now we carry more computing power in the smartphones in our pockets than existed on the entire planet in the 1950s.
1960 - 3 Billion
1974 - 4 Billion
1987 - 5 Billion
1999 - 6 Billion
2012 - 7 Billion
All of these population increases are directly related to improved technologies in medicine, food production, sanitation, transportation, manufacturing and planning, and all of these improvements can be traced to computers, the logical result of the previous Industrial Ages.
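
To make the acceleration concrete, here is a quick back-of-the-envelope sketch in Python of the shrinking gap between those milestones (using the commonly cited 1804 date for the first billion):

    # Billion-person milestones cited above, as (billions, year) pairs.
    milestones = [(1, 1804), (2, 1927), (3, 1960), (4, 1974), (5, 1987), (6, 1999), (7, 2012)]

    # Print how many years each new billion took to arrive.
    for (b0, y0), (b1, y1) in zip(milestones, milestones[1:]):
        print(f"Billion {b0} -> billion {b1}: {y1 - y0} years")

    # Output: 123, 33, 14, 13, 12, 13 years. Each billion arrived faster
    # than the one before, until the interval settled at roughly a dozen years.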

The Fourth Industrial Age is upon us. Harnessing the vast power of computers from the Third Industrial Age, the limitless imagination of humanity is exploding. We have sequenced the human genetic code, we are building powerful social networks that are changing society and governments, and we are weaving computers into our clothes, our cars and our brains. We are putting computers into everything, including ourselves. We are at the beginning of what promises to be the most creative period of human existence. There is no conceivable limit to the power of human imagination. This is a fascinating time to be alive.

"For first we use machines, then we wear machines, then we become machines."
Kim William Gordon
www.bitnus.com 
www.kimwilliamgordon.com

Wednesday, March 13, 2013

Defining Human.

Technology changes us.

It always has.

From the discovery of fire to the invention of the plow, which ignited the agricultural age and ushered in the beginning of civilization itself, technology changes our society, our culture and us personally.

We evolved along with our technology through the perfection of metallurgy, from the copper age to the bronze age to the iron age; every step of the way, we changed.

We learned how to control the rivers and lakes in the great hydraulic ages and marched on to the revolutionary perfection of the Gutenberg printing press: an invention so powerful that it created the Communications Age, transformed education and provided the foundation for the first industrial age.

And here we are, at the beginning of the fourth industrial age, an age where we once again can use technology to re-define ourselves. Except this time, we are doing it at the personal level. We are weaving technology into our clothes, embedding technology into our brains, connecting our minds to the global network.

"For first we use machines, then we wear machines, then we become machines."
Kim William Gordon
www.bitnus.com
www.kimwilliamgordon.com