Tuesday, September 6, 2011

Mass Extinction of Computers

Computers are undergoing a Mass Extinction more prolific than that of the dinosaurs. Personal and organizational outlooks on computers are unable to keep pace with the speed of the computer revolution. This is causing the IT-LOGJAM.
The CIO is hard pressed to make the Head of IT and the Head of Business recognize this growing disconnect. The uphill task of bringing fundamental changes to attitudes and approaches is no smaller than the re-creation of life after a Mass Extinction event.
To understand what happens during and after such cataclysmic changes, it is necessary to learn a bit about Mass Extinctions. “Speciation” is the formation and spread of new species. Over geological time, speciation has grown more intense, creating ever more diverse life (Fig-1 A). At every Mass Extinction datum (level), the fraction of the existing genera (a higher grouping of species) that became extinct ranges from 30 to 55% (Fig-1 B). Notwithstanding the Mass Extinctions, the number of species continued to grow.
Fig-1 (A)

Fig-1(B)
Figure Caption: The relationship between the percentage of species (genera) that became extinct and the number of species that occupied earth over the geological time. Every Mass Extinction event was followed by a spurt in growth of new species. Often, such new species formed a completely new ecosystem on the planet (Ref: http://en.wikipedia.org/wiki/Extinction_event#Patterns_in_frequency)
From the days of Mainframe Computers in the 1960s, through Mini-Computers, Micro-Computers, Super-Computers, Client-Servers, Multi-Processors, Multi-Core Systems, CISC, RISC, Desktop PCs, Desktop Super-Computers, Clouds, Laptops, Palmtops and iPads to Smart-Phones – computers have seen “speciation” and “extinction”. Very few even remember the darling Mini-Computer of the 1980s – the VAX. A Cray Super-Computer is no more familiar today than Carnotaurus (the meat-eating bull dinosaur). The Smart Phone is more pervasive than the mosquito (Anopheles gambiae)!
Evolution is intrinsically related to Extinction. Every such cycle between them brings in new concepts and principles. Some fundamentals may remain unchanged, but a lot of the details change radically.
·       Life remained stable around the principles of the Cell and genetic transmittal all through the 542 million years of the Figure above.
·       Computers have remained supported by fundamental binary logic and the principles of semiconductors.
·       Life always had a ‘birth’, a ‘living’ and a ‘death’.
·       Computers have a ‘program’, ‘hardware’ and a ‘user’.
The similarities are large. Let us delve into a few.
·       Imagine us living in the world of dinosaurs, or them living in ours.
·       More extreme still, imagine life in the early Proterozoic, when the earth was deficient in oxygen.
·       Imagine living through the most intense Ice Age known to the planet.
·       Imagine living alongside the super-volcanism (the Deccan Traps) that erupted around Mumbai about 65 million years ago.
·       Imagine using a smart phone like a 1970s mini-computer!
·       Imagine using the oldest computer you have ever seen like a smart-phone!
This is essentially what is happening. The power of four VAX-11/780s is resident in a Samsung Galaxy S II smart-phone (or an Apple iPhone 4). Millions of iPads will be sold to augment the existing laptops. Almost all of them will make a 1980s super-computer look meek! The 32GB on a phone is much larger than the central storage of huge computers in 1990. EVOLUTION is rapid and EXTINCTION is faster!
What has happened to the vision, veracity and vigor of computers as they evolved and became extinct? What fundamental changes are required to assimilate the new-generation of computers? How can we measure our effectiveness in deployment of the computers? What are the Paradigm Changes we need to accept?
As computers became an affordable commodity, their individual utilization reached trivial levels. Three types of activities can be identified (generally): 1) Instant Gratifier (e.g. Facebook comments, Twitter etc.), 2) Compulsive (e.g. filing a tax return, checking the value of one’s stock holdings, tallying a bank account, reserving air tickets etc.) and 3) Progressive – concerted efforts in learning and doing towards a more cognitive, long-term goal (e.g. citation analysis, creative music, building a database, ontologies in geology, a new signal-processing method etc.)
Ø  ~99% of computers, including those embedded in smart-phones, are used for “Instant Gratification” functions.
Ø  <0.001% of users are capable of elevating their vision, or their computer’s use, to “Progression”.
Ø  <1% of IT Managers, Business Leaders and Users can recognize, characterize and act to elevate the level of usage from (1) Instant-Gratifier to (3) Progressive.
Ø  Most so-called systems in workplaces (e.g. SAP, EDMS, etc.) are focused on driving (2) Compulsive – often pre-determined – work.
Ø  This LACK OF ADAPTATION in users’ minds to the potential of 50 years of extraordinary scale-up (growth) of compute power has resulted in IT LOCK-DOWN.
Ø  Every EXTINCTION of computers (Mainframes or Minis) has created a diverse and prolific new generation (PCs, iPads), but the overall eco-system of programs and users has remained untouched. THIS IS AN IMBALANCE and A THREAT!

There is a strong contention that we are experiencing, and creating, the Earth’s most prolific Mass Extinction. Essentially it can be seen as the disparate growth of one aspect of the living eco-system – humans and their so-called intelligence. In the same way, there has been disparate growth of “processing power” and “communication infrastructure”.
Without similar growth of VISION, CONCEPTS and SKILLS, computers as originally conceived will become extinct. Humans need a complete rethinking of what computers can and should do.

Tuesday, August 9, 2011

Database - for CIOs

Madnick and Donovan (Operating Systems, pp. 337-8, 1974) listed the following functions of Information Management: “Information Management is quite simple… yet, one of the most important …”.
1.       Keeping track of all information through various tables
2.       Deciding policies on storage and access
3.       Allocating and
4.       De-allocating information resource
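A minimal sketch of these four functions, assuming an invented `FileTable` class and a trivial owner-only access policy (all names here are my own illustration, not Madnick and Donovan's):

```python
class FileTable:
    """Keeps track of all information through a table (function 1)."""
    def __init__(self, capacity):
        self.capacity = capacity          # total storage units available
        self.entries = {}                 # name -> (owner, size)

    def may_access(self, name, user):
        """Deciding policy on storage and access (function 2):
        here, a trivial owner-only policy."""
        owner, _ = self.entries[name]
        return owner == user

    def allocate(self, name, owner, size):
        """Allocating the information resource (function 3)."""
        used = sum(s for _, s in self.entries.values())
        if used + size > self.capacity:
            raise MemoryError("no storage left")
        self.entries[name] = (owner, size)

    def deallocate(self, name):
        """De-allocating the information resource (function 4)."""
        del self.entries[name]

ft = FileTable(capacity=100)
ft.allocate("payroll.dat", owner="alice", size=40)
print(ft.may_access("payroll.dat", "alice"))   # True
print(ft.may_access("payroll.dat", "bob"))     # False
ft.deallocate("payroll.dat")
```

“Quite simple”, as the quote says – yet every operating system and database carries these same four responsibilities at far greater scale.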
The term ‘file-system’ is used where the concern is simple logical organization. A ‘file’ or ‘data-set’ is a single, separate collection. A ‘data management system’ conducts some “structuring”, but NO interpretation. A ‘database system’ addresses both “structuring” and “interpretation”.
The database is now required to carry out two aspects –
1.       Structuring
2.       Interpretation
Until this is accomplished, the collection of data remains a data-set.
i.                     A classical MS-Excel data-set, if organized carefully, can structure the data but cannot interpret it.
ii.                   A database is not defined by data quantity: a small data-set can become a database, and a large data-set may remain a file.
To further the understanding of a database we need to deal with the “structure” and “interpretation” aspects. The term ‘semantics’ (semantics is the study of meaning) was later introduced to replace ‘interpretation’ (meaning is fundamental to interpretation) as the pre-requisite of a database.
Database = Structure + Semantic {Data-set}
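The Structure + Semantics distinction can be illustrated with a small sketch. The well data, column names and vocabulary below are my own invented example:

```python
# A data-set: structured rows, but nothing says what the values mean.
dataset = [
    ("W-101", "OIL", 1450.0),
    ("W-102", "GAS", 2210.5),
]

# Adding semantics: a schema naming each column, its unit, and a
# controlled vocabulary -- the "interpretation" the definition requires.
schema = {
    "columns": ("well_id", "fluid_type", "depth_m"),
    "units":   {"depth_m": "metres"},
    "vocab":   {"fluid_type": {"OIL", "GAS", "WATER"}},
}

def validate(rows, schema):
    """Reject rows whose values fall outside the declared semantics."""
    for row in rows:
        record = dict(zip(schema["columns"], row))
        for col, allowed in schema["vocab"].items():
            if record[col] not in allowed:
                raise ValueError(f"{record[col]!r} not in vocabulary for {col}")
    return True

print(validate(dataset, schema))  # True
```

Without the `schema`, the rows are a well-structured data-set; with it, the meaning of each value is pinned down and machine-checkable.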

Structure

The data structure as a distinctive part of a program was defined by Niklaus Wirth (Algorithms + Data Structures = Programs). Thus, no program is possible without data structures! Over the last 40 years, data structures have developed into a complex, specialized area, with every computer science curriculum having courses addressing them. They are the backbone of every Information System. Ph.D.-level research on data structures is quite popular.
Data structures emphasize the elegance, efficiency and effectiveness of storage, operations and access from a computer-implementation perspective. Any data structure can become the foundation architecture for a database.
Databases also emphasize the intrinsic context and applicability of the data to the world. This is where Interpretation and Semantics come in.
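As a sketch of that implementation-side emphasis, here is a minimal binary search tree – one of the classic data structures – concerned purely with storage and access, with no interpretation of what the values mean:

```python
class Node:
    def __init__(self, key, value):
        self.key, self.value = key, value
        self.left = self.right = None

class BST:
    """Minimal binary search tree: efficient ordered storage and lookup."""
    def __init__(self):
        self.root = None

    def insert(self, key, value):
        def _ins(node):
            if node is None:
                return Node(key, value)
            if key < node.key:
                node.left = _ins(node.left)
            elif key > node.key:
                node.right = _ins(node.right)
            else:
                node.value = value        # overwrite on duplicate key
            return node
        self.root = _ins(self.root)

    def get(self, key):
        node = self.root
        while node is not None:
            if key == node.key:
                return node.value
            node = node.left if key < node.key else node.right
        return None

t = BST()
t.insert("W-102", "gas well")
t.insert("W-101", "oil well")
print(t.get("W-101"))  # oil well
```

The tree stores and retrieves keys efficiently, but nothing in it says what "W-101" means – that is precisely the semantic layer a database must add.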

Semantics

The semantics and associated developments in computer science are by far the most complex and innovative aspects of human discovery. They are applied in many areas of pattern recognition, machine learning and knowledge representation. We will limit the scope of the discussion of semantics to the common and widely used implementations of the ‘database’ and how they constitute a fundamental part of it.
Data structures like trees created hierarchical databases, wherein the interpretation is captured as levels of hierarchy, and the resulting models became database systems. IMS (IBM) was so popular in my programming days that CICS-IMS was a sure-success skill!
Network database systems approached the modeling of interpretation using a network mesh of dependencies over a generalized graph structure. This allowed multiple parent-child dependencies. It acquired standard status through CODASYL and was used to represent the Interpretation of real business interfaces through datasets.
Relational databases developed from the work of Codd (1970) and represented datasets as ‘tables’ in certain unique forms called normalized tables. The business or real-world representation and use of data is modeled through techniques like Entity-Relationship models to capture the interpretation. The resulting design is accompanied by a data dictionary and supporting reference tables – master lists that provide a controlled vocabulary and values for every table. Together, considerable progress has been made in capturing the semantics of data in the model.
Most modern information systems deploy relational database models.
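The relational ideas above – normalized tables, a master list as controlled vocabulary, referential enforcement – can be sketched with SQLite (table and column names are illustrative only):

```python
import sqlite3

# A master list acts as controlled vocabulary; a foreign key makes the
# database reject values outside it -- semantics enforced by structure.
con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")
con.execute("CREATE TABLE fluid_master (fluid TEXT PRIMARY KEY)")
con.executemany("INSERT INTO fluid_master VALUES (?)",
                [("OIL",), ("GAS",), ("WATER",)])
con.execute("""CREATE TABLE well (
                 well_id TEXT PRIMARY KEY,
                 fluid   TEXT NOT NULL REFERENCES fluid_master(fluid))""")
con.execute("INSERT INTO well VALUES ('W-101', 'OIL')")       # accepted
try:
    con.execute("INSERT INTO well VALUES ('W-102', 'LAVA')")  # rejected
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

A data-set (a spreadsheet, say) would happily store 'LAVA'; the database, carrying the interpretation in its design, refuses it.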

Database

1.       A database is a model of the real-world interfaces of the data in a computer information system – providing a structure and a semantic (interpretation).
2.       In the absence of semantic support, a database ceases to be one.
3.       The goodness of any database depends largely on the designed model, which expresses the interpretation captured from the real world.
4.       A relational database is a database structure providing semantics through techniques like E-R models and the data dictionary.

Modern Information Systems and their value are completely dependent on databases. The 4th Paradigm of Science is essentially built on databases. For an organization, a team or the CIO, databases are the most important aspect of IS.
Contents in this post are re-created from publicly available information, as a CIO sees it. For the spirit of this Read-Write culture, see http://blog.ted.com/2007/11/06/larry_lessig/

Monday, August 8, 2011

Oil & Gas Technical Information Systems

The Petroleum Exploration & Production business deals with a variety of sciences, technologies and specializations. Along with all the information systems of a standard business – ERP, EDMS, GIS etc. – the Oil & Gas business typically handles 200-300 technical applications.
This note will examine the experience with technical applications over the last three decades. A SWOT of the application scenario is addressed. In the light of current trends in application organization and delivery, the high-value opportunities are identified. The CIO of an Oil & Gas business has the daunting task of specifying, developing and delivering these gold-mines for E&P.

Saturday, August 6, 2011

Know-All Knowledge in Computers

Thanks to the proliferation of computers into everything from the microwave to the corporate board-room, anyone and everyone believes themselves to be, and claims to be, a computer expert! CIOs are often confronted with the know-all mindset of the populace within their scope of interaction. Dealing with this double-edged knife – the versatile and proliferating knowledge and (mis)understanding about computers, databases, Information Systems etc. – is a challenge for the CIO.
A curious-faced young visitor at my home was in his 7th grade. His father, a Computer Science graduate engineer, was deliberating with me on these know-all issues. I did a small experiment.
Q. What is a Tau Transform and what is a database?  I asked both the father and son.
Neither knew much about the Tau Transform, beyond the engineer recollecting that there was some mention of it in his 6th-semester signal-processing course.
Both had an answer for what a database is. Both were right to some extent. There is probably no educated, English-knowing person who claims a lack of knowledge or understanding of the term ‘database’.
Here is an example of the excessive visibility of a term (database) giving a level of familiarity that leads to a sense of understanding. This is very valuable for the CIO and, at the same time, very dangerous for his work.
There are far fewer people who developed, and are working on, the Tau Transform – say a hundred or so world-wide. In contrast, building on the initial work of Codd (1970), who formulated the relational model of data and laid the foundation of the database architectures that drive the vast Information System space, there are a few hundred thousand professionals working on this simple, all-familiar term – ‘database’. The intricacy, detail and complexity in databases are deeper than those of the Tau Transform.
Yet, all of us know databases! How? In which context? To what extent?
Just about anyone who can use MS-Excel knows databases and has a definition of, and an opinion on, them. This exuberance has led to a complete clouding of what the database is and of what is required to do it right. The consequence is hugely wasted effort and misnomers about the strength of the organization’s database. Improper, Incomplete, Inadequate, Inefficient and Ineffective databases are a bane, and the reason why companies (teams or groups) are unable to leverage the new computing techniques. Facebook, Twitter, YouTube etc. are examples where the database is well designed – KISS (Keeping It Simple and Serviceable) – leading to a transformational collaboration facility.
Database is not a uniformly understood and addressed term. For the “Knowing”, it is an intricate Entity-Relationship model and normalized description of the business’s data. For the “All-Knowing” (which could be even your CIO, CFO, CEO, CxO, President, VP, or just anyone else who matters), it is an all-pervading equivalent of a mobile phone: great to use, nothing to learn, costing as little as $50!
The CIO meets many such cases of conflict between the Knowing and the All-Knowing. For a Head of IT it could be system security, RAM requirements, backup architecture, application needs, standard nomenclature, cluster architecture etc. Aggravated by Bing, Google and others, the CIO often stands in the same position as an experienced physician confronted with a Googled patient holding the latest R&D details, completely interpreted out of context or out of relevance!
The CIO is responsible for bringing correct, complete and competent solutions, with proper understanding, towards Information maturity.
  • A failed CIO is a failed database of the company.
  • A failed database is failed Knowledge Management.
  • Failed KM is the business risk of the 21st century.
What is a database? Try to answer this before reading further and grasping the CIO’s thoughts. I will take it further in an addendum to this post! It was hilarious to note reputed organizations’ sites misrepresenting this popular term.

Thursday, August 4, 2011

Driving the Change

Experience brings a distinct advantage in the form of “real” understanding unmatched by any PPT. The CIO has a responsibility to drive change in the organization. This post looks back at what drove some significant “changes”, what has changed, and where change is being resisted.
1.       1960-70              : Any requirement to telephone involved a workflow. There was always a sweet-voiced Ms. Mary – the telephone operator. Request her for your connection and then you speak. The same process was followed, with some unknown Mr. Operator, to make outstation calls (trunk calls). Not much knowledge was needed from the requester – no need to remember any STD code or extension number. Operators even assisted with telephone-directory look-ups.
THIS GOT CHANGED. There is no need to expand on how calls are made now – speed dialing, voice dialing etc.!
REMOVING Ms. MARY and Mr. OPERATOR from their JOBS has happened. CHANGE led to the extinction of job roles.
2.       1980-90              : As a student in Bombay, planning to travel to Hyderabad for the holidays required an early-morning trip to Bombay-VT. Stand in a queue for 4-6 hrs and get the train-ticket booking. Often, to celebrate the success, it was followed by a nice lunch in Church-Gate and a movie!
THIS GOT CHANGED. Faster than completing this blog post, a ticket can be booked online from any origin to any destination in India!
A QUANTUM CHANGE IN SOLUTION EFFECTIVENESS (1000 times) THROUGH PROPER TECHNOLOGY ADOPTION caused this.
3.       1990-95              : Trading on the Bombay Stock Exchange or the Madras Stock Exchange needed a broker. Paper applications, share documents, etc. were the norm. No two share certificates carried the same spelling of a name! There was no need to worry about Income Tax: unless disclosed, holdings could never be found (many shares were in some ‘Shriikant’s’ name, thanks to the Gujarati influence on the BSE!)
THIS GOT CHANGED. The Income Tax Department issued the unique PAN Card. REGULATIONS requiring a PAN CARD for transactions were strictly enforced. DEMAT accounts came in, and the NAME of the HOLDER is no longer anyone’s fancy! The Income Tax Department can track all transactions, so it is necessary to file income returns on market transactions.
NEW GUIDELINES (LAWS) AND GOVERNANCE did this. TECHNOLOGY assisted the LAW and its GOVERNANCE mechanism.
Any successful long-term Business Transformation, and its consequent change, will entail these 3 things.
i.                     Some tasks becoming redundant and roles removed <What is not needed, expunge>
ii.                   Designing and building new, effective methods with QUANTUM improvements <Make all needed tasks many times more efficient and easy>
a.       Ease of learning and use (adaptation)
b.      Expansion of scope (applicability)
c.       Addressing known, emerging and new aspects (assimilation - integration)
iii.                 Stipulating critical and essential regulatory controls and effective governance of them
If these aspects do not dominate your thought process about CHANGE, think about what else has been a driver of change in your experience and examine it for them.
With every CHANGE, a work practice, job position or service method has undergone a RADICAL transformation. It is imprudent to seek an operator to connect a telephone call today!
In organizational IT, many practices evolved and developed into standard ways of working. CHANGES in computer technology, software architecture, usage and cost structure have severely undermined many of these practices. I will list some for your thought; we can discuss them later.
i.                     Strict review and audit of proposals for workstations, storage and network
ii.                   Keeping a team of “programmers” to help professionals to do standard improvements
iii.                 Preparing a monumental “Requirement Document” and specifications for new information systems
iv.                 Return-on-Investment (Cost-Benefit) justification for Information System proposals
v.                   Swearing by packaged vanilla applications that cost millions (specifically in the Oil & Gas technical space)
vi.                 Creating and storing hundreds of versions of a management presentation as it develops
vii.                Getting excited with every new software for an application
viii.              Trusting the mouse!

Monday, August 1, 2011

Changing World of Knowledge

Knowledge and knowledge representation are undergoing radical changes with the use of computer technologies. The context here is not MS-Office, Facebook or live data feeds. This article is about how the logic of human minds is being replicated with technologies that have significant practical use in all facets of science, including Oil & Gas Exploration and Production (my area of business).

Chess Grandmaster

Deep Blue made history when it beat the reigning world chess champion, Garry Kasparov, in 1997. Winning with a 3½ : 2½ score, it ushered in clear machine dominance over one facet of intelligence. Subsequently, in 1998, REBEL, running on a standard AMD processor, played Viswanathan Anand, then world No. 2, and beat him. By 2009, a chess engine running on a mobile phone had achieved a ‘Grandmaster’ norm! (http://en.wikipedia.org/wiki/Human-computer_chess_matches)

Jeopardy!

A very popular quiz program (http://en.wikipedia.org/wiki/Jeopardy! ) was played by the IBM Watson computer, which convincingly beat the best players (http://en.wikipedia.org/wiki/Watson_Jeopardy ). Unlike a chess program, Jeopardy! involves the complex integration of billions of items of information into a dynamic structure appropriate to the situation at hand. The game is very demanding on search, assimilation and inference.
These, along with the inventions being made using the 4th Paradigm of Science in biotechnology, point to a changing role for computers.

Role of Computers over 5 decades

1.       1960-2011+         : Electronic Data Processing (Science & Commerce)
2.       1980-2011+         : Databases
3.       1985-2011+         : Simulations and Advanced Graphics
4.       2000-2011+         : Communications and social networks
5.       2005-2011+         : Knowledge and Digital Inference (Also 4th Paradigm of Science)
Databases (master data, transaction data or metadata) represent a critical foundation layer, recognized in the 1980s. Designing, developing and operating databases continues to consume the largest share of resources. Correctness, completeness and the capability of databases to meet changing (and growing) applications have become essential. The advent of automated measurements (DCS, SCADA) is expected to radically alter the requirement for humans to ‘curate’ data and build databases.
HPCs (High-Performance Computers), 4GLs and functional development environments (e.g. R, SAS, Mathcad etc.) have grown to place the necessary algorithmic power in the hands of executors, to innovate beyond vanilla software packages. The ability to use these tools has opened new avenues of interpretation in all branches of science and technology.
Numerical data is amenable to simulation and advanced graphics. Yet a large quantum of human information is embedded in unstructured data (MS-Office files, pictures, audio, videos, location details, contexts etc.).
This is where the new technologies of knowledge representation, assimilation and inference come into the fray.

Knowledge in Information Systems

IBM describes Watson as “an application of advanced Natural Language Processing, Information Retrieval, Knowledge Representation and Reasoning, and Machine Learning technologies to the field of open domain question answering”, which is “built on IBM's DeepQA technology for hypothesis generation, massive evidence gathering, analysis, and scoring”.[2]
Designed for complex analytics, Watson comprises cluster computers, 16TB of RAM and a complex network of switches and storage. The hardware is fairly straightforward. What makes Watson different is a) its data organization and b) its inference logic.
Some quips from Wikipedia:
1.       Watson can process 500 gigabytes, the equivalent of a million books, per second. Processing is not the same as reading or loading 500GB/s into memory.
2.       Watson's hardware cost about $3 million. The GDPF used for Seismic Data Processing costs more.
3.       Watson uses SUSE Linux              
4.       Solutions are developed using
a.       Java
b.      C++
c.       Apache Hadoop (for distributed computing)
d.      Apache UIMA (Unstructured Information Management Architecture)
e.      IBM’s DeepQA software
5.       More than 100 techniques are used in analysis (includes, integration, inference and assessment)
6.       Watson also used databases, taxonomies and ontologies. Each of these took over 15 years to develop and is based on a sophisticated semantic architecture.
a.       DBPedia,
b.      WordNet, and
c.       Yago
“We are using information as it exists and making the computer smarter in analyzing that content to compute answers.” Watson works on this principle, which is also the principle of the 4th Paradigm.
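DeepQA itself combines more than 100 techniques; purely as a toy illustration of the stated principle – generate candidate answers, gather evidence from existing content, score and rank – consider this sketch, whose corpus and scoring are entirely invented:

```python
# Toy evidence-scoring QA: nothing like DeepQA's scale, only its shape.
corpus = {
    "VAX":  "The VAX was a popular mini-computer of the 1980s made by DEC.",
    "Cray": "Cray built famous super-computers in the 1980s.",
    "IMS":  "IMS is a hierarchical database system from IBM.",
}

def answer(question):
    """Rank candidate answers by word overlap between the question
    and each candidate's evidence text, returning the best."""
    q_terms = set(question.lower().split())
    scored = []
    for candidate, evidence in corpus.items():
        overlap = q_terms & set(evidence.lower().split())
        scored.append((len(overlap), candidate))   # evidence score
    scored.sort(reverse=True)
    return scored[0][1]

print(answer("which mini-computer was popular in the 1980s"))  # VAX
```

The real system replaces word overlap with massive parallel evidence gathering and learned scoring, but the principle is the same: the computer is made smarter at analyzing content that already exists.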
The new age of knowledge is inherently more dependent on analytical outcomes from vast data. Microsoft Research has been working on a variety of new methods of working with computers that facilitate the transformation into this new world of knowledge. Visualizing, learning and adapting to the new methods of working is going to become the biggest challenge for the CIOs and CTOs of science- and technology-intense industries like Oil & Gas.

All models are WRONG, but some are useful. A radical outlook on the way science has been functioning, and on its limitations, is given by Anderson (1). There are no fundamental organizing principles in Biology – this in turn facilitated the field becoming intensely data-driven (rather than theory-driven). The Earth Sciences that drive the Oil & Gas industry also lack a strong theoretical framework. Unlike Biology, there has been NO collective emphasis on data collection, organization or analysis in the Earth Sciences. Bio-Informatics has become a well-funded and highly researched collaborative science since the 1990s. There is significant learning for the Earth Sciences from the advancements in that field.
These two subjects, Biology and Geology, point to two differing growth patterns of the respective sciences and their impact on scientific discovery.

ARE YOU READY?
1.       READY to UNLEARN the EDP perspective of Computers and Information Systems
2.       READY to organize, structure and semantically define your data
3.       READY for Analysis and Inference
4.       READY to accept the new findings
THE FUTURE IS BEYOND WHAT THE PRESENT IS POINTING TO.  

The CIO is responsible for imparting an understanding of the new technologies and for facilitating the development of skills to adopt the new representation of Knowledge. In Oil & Gas, which is intensely knowledge-driven, the CIO identifies and leads the new studies and projects that will give the business the necessary advantage and Intellectual Property.

Wednesday, June 22, 2011

Express Your Views

The most significant aspect of the 21st century is the proliferation of individual expression to the world at large. Humans have always been enamored of their desire and ability to express. From stone drawings to Tweets, all are manifestations of this quest.
 
The 21st century uniquely places the entire world as the recipient of any and every individual’s view, with great ease and effectiveness. Today is defined by this OPENNESS. If this aspect of life were withdrawn, the 21st century would be far INFERIOR to the 20th (in my perspective – the why belongs elsewhere!).
 
Every occasion of openness has created two contradicting kinds of followers: those who USE and those who ABUSE. Yet the biggest challenge for me (and, I assume, for others like me) is to give a positive response to the reality that Expression is ESSENTIAL.
 
After accepting that there is a NEED to express, the issue of a Secure, Safe and Consistent mechanism comes into the fray.
 
Expression is a TWO-WAY process involving the entire audience and their willingness to accept Expression. This differs between the big world of the internet and an intranet, where it is ridden with subtle issues around Organization Culture.
The choices are:
1.       Resist and Refuse to have any sharable view or opinion
2.       Have a guarded (and acceptable) level of Expression
3.       Everything I feel will be expressed (no holds barred)
 
Why one should Express?
1.       If you are already the ‘value’ creator, your ideas and opinions define the organization. Many may seek to follow your logic and reason. Your success brings followers; you can lead them.
2.       If your ideas are radical, novel or out of place, you need to express them cogently to give them a chance of being pursued.
3.       If your ideas need support, resources and critique, your expression will bring them.
4.       If your ideas are destructive (yet likely to be true), tread carefully, as your expressions will take time to be valued. Galileo faced this.
 
Writing needs greater structure and content. This, by itself, will organize your idea onto a higher plane. If you don’t express an idea, there is really no way of progressing it to the next stage (unless it needs no other support).
 
Organizations require a higher level of relationship to foster collective work systems. These are supported by a set of explicit and implicit conduct and behavioral boundaries. Expression will be governed, and will systematically be expanded to provide increasing scope and strength. The emphasis shall be on maximizing USE and minimizing ABUSE.
 
Some starting guidelines could be:
1.       Encourage expression {All of us have a blog site}
2.       Denounce and Correct any Personal issues and references
3.       Zero tolerance on certain areas like – abuse, explicitness, vilification
4.       High standards of dignity in language and content
5.       Enforce confidentiality with respect to relevant data and detail of the organization
 
Within this, efforts should be made to EXPRESS. Initially, there will be ascriptions of (both) credit and discredit, motive, support, embarrassment or embellishment. This is unavoidable; unless the original expression or its comments transgress the guidelines, it will be a lesson in becoming visible.
 
Try never to react. If a reaction is needed, make it after 48 hours of discussing within yourself whether the ‘reactionary’ view could be right. This will definitely moderate your response. You will mostly not regret your post.

The CIO encourages, facilitates and guides the organization to rightfully 'express' itself.

Four Phases in Growth of IT

I am one of the fortunate few to have lived through the four phases of computer technologies. It has been a great learning experience, recognizing these four phases and the way each made its impact.
1.       The first batches of B.Tech Computer Science graduates only appeared in the 1980s. Everyone, from civil engineer to sociologist, learned how to program and applied computers to their respective problems. Very new insights were found in every branch of Science, Technology and Commerce by using computer processing. The CREATIVE phase of DISCOVERY.
2.       Programming became a “profession”. Early adopters (from phase 1) got fast growth doing generally applicable solutions for their problems. CICS/IMS/COBOL gave foreign job opportunities. Rapid growth in programming training – streets full of Oracle training schools! Every college established a Computer Science/IT course. The world was on course for the Y2K waterfall! Nobody knew why the year had to have 4 digits, or whether the airplanes would fall down on 31-12-1999, one second before 00.00 hrs of the year 2000. PROGRAMMERS were EMPOWERED. The programmer decides – the business endures.
3.       Managers who had spent billions of dollars didn’t enjoy the event-free Y2K transition. The big bust of the IT startups followed. Bad assessment and control by managers stood exposed. MS introduced the almighty weapon – PowerPoint. All IS got refocused on providing support to Management: ERP, Business Graphics and PPTs became the priority. Programming and Computer Science became a ‘commodity’. Professionals abided by Corporate IT Strategy and Security. Everyone forgot programming. Quantity didn’t necessarily bring quality in the use of computers. MANAGEMENT FOCUS.
4.       A new breed of platforms and frameworks kept GOOD programmers away from commodity IT. New RAD platforms came in, silently revolutionizing the art and science of problem solving (programming) on computers. Customizing lost its glamour and came to be somewhat detested. Configurable solutions EMPOWERED knowledgeable users to become CREATIVE on their OWN. EMPOWERMENT is the WORD for ALL. Governance gave room to Control. Managers got to DRILL DOWN on DEMAND and detach from FED-IN PPTs. Professionals rediscovered their ability to create new information. Programmers thrived on creating the tools for the world to evolve.
Information Systems (IS) and Information Technology (IT) are in this 4th phase. Where are you?
1.       Did you create any new solution for what you do regularly on computers, increasing value or productivity?
2.       Do You have a creative new digital information idea that will make You see hitherto unknown?
3.       Can You implement the idea  before week-end?
4.       Do You see all data you "need-to-see" and have "right-to-see"?
5.       Can You differentiate data and documents and can independently verify their conformance?
6.       Do You actively share and collaborate ideas and knowledge?
A Yes to one or more points to your being in the 4th phase. Otherwise, it is time to create a similar set of questions to see if you are still in the 3rd, or stuck further down. {Do I work largely on emails and PowerPoint as my sources of information? +++}
The CIO is responsible in the organization for building the necessary culture, systems and learning to move into the EMPOWERMENT phase. The HIT (Head of IT) will provide the needed infrastructure and governance framework to get it done RIGHT, SAFELY and SECURELY.
Let us think further on what this new IS phase 4 has offered and how it can be leveraged.