Unregulated IT - Beyond Utility Computing

The concept of buying an organization’s computing needs on a pay-as-you-go basis is known as Utility Computing. In this model, organizations pay for the usage of applications, either to Information Technology (IT) departments or to an external technology vendor. This approach is similar to the way consumers pay for utility services like electricity, gas, and water.

Though the Utility Computing concept has been around since the late 1990s, it has only started becoming a reality with recent offerings from vendors including IBM, Sun, HP, and Veritas. However, the current products and services are primarily limited to infrastructural components such as network, storage, and server capacity.

In this paper, we extend the Utility Computing model beyond the infrastructure layer to incorporate information, application, business process, and presentation layers. A true and completely “unregulated” Utility Computing can only be achieved if each of these layers becomes “virtual” to its neighboring layers.

Although Utility Computing is currently perceived as little more than marketing spin on an outsourcing model, it is raising awareness and changing the way IT departments think of themselves.

Holding business users hostage to monopolistic IT departments will push business departments to demand the Next Big Thing from IT: Unregulated IT.

INTRODUCTION TO UTILITY COMPUTING
When we need electricity, we don’t build power plants in our backyards. We use the power from the local utility company. This is the very essence of a utility model: standard service available when we need it, in the amount required, and paid for as consumed.

A similar approach could be taken to meet an organization’s Information Technology (IT) needs. However, in most organizations today, IT needs are met by internal organizations that plan, procure, deploy, and support capacity internally (“building a power plant in their own backyard”) based on forecasted, not actual, demand. This often creates issues as IT infrastructures struggle to respond to changes in the business environment. Utility Computing proposes to provide constantly varying levels of computing power on an as-needed basis (Figure 1). An organization that needs significant capacity once a month (for example, for month-end financial processing) could use (and pay for) that service only during the needed period, making the resource available to others for the remainder of the period.
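
To make the pay-as-you-go arithmetic concrete, here is a minimal Python sketch; the rates and usage figures are entirely hypothetical:

    # Sketch of utility-style metering: pay only for capacity actually
    # consumed, rather than for peak capacity owned year-round.
    # All rates and usage numbers below are hypothetical.
    RATE_PER_CPU_HOUR = 0.50  # hypothetical utility rate, $ per CPU-hour

    def utility_cost(usage_by_day):
        """Bill for actual consumption, the way a power utility meters kWh."""
        return sum(cpu_hours * RATE_PER_CPU_HOUR for cpu_hours in usage_by_day)

    # A burst is needed only for month-end financial processing; the
    # organization pays for that burst alone, and the capacity serves
    # other customers the rest of the month.
    normal_day, month_end_day = 40, 2_000            # CPU-hours consumed per day
    usage = [normal_day] * 27 + [month_end_day] * 3  # 30-day month

    print(f"Metered monthly bill:  ${utility_cost(usage):,.2f}")  # $3,540.00
    print(f"Owned-peak equivalent: ${30 * month_end_day * RATE_PER_CPU_HOUR:,.2f}")  # $30,000.00

The metered bill tracks actual consumption; the owned-capacity figure reflects provisioning for the monthly peak all month long.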

Utility Computing aims to provide varying capacity with flexible pricing to the business user. This leads to reduced IT costs through true pay-as-you-go pricing. The implications for an IT organization are tremendous, freeing it from day-to-day operational responsibilities to focus on the next wave of computing: Unregulated IT.

The Unregulated IT model forces IT to think through the reasons why “IT does matter to business”: the quest for enabling business value.

ENABLING BUSINESS VALUE
Current vendors are successfully implementing Utility Computing models by leveraging technologies in autonomic computing, grid computing, and virtualization. However, this provides only a foundation for further enhancements. To truly enable business value from IT, the Utility Computing journey should continue towards enabling the business user to own and manage their processes. Unregulated IT accomplishes this by building on top of the Utility Computing model. Unregulated IT takes the system-level concept of virtualization (or abstraction) and extends it to an architectural framework (Figure 2) that addresses business process and presentation-level abstraction.

A Service-Oriented Architecture (SOA) allows an organization to disconnect business process from underlying business function, giving business users universal access to common business functionality so that business processes can be quickly refined and implemented to meet changing business needs. Business Process Management (BProM) then facilitates the rearrangement and optimization of business services, accomplished by business personnel, thus reducing the impedance introduced by traditional IT. Finally, abstraction of the presentation layer allows consistent business capability to be delivered to anyone, anywhere, on any device.
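
As a minimal illustration of this decoupling, the following sketch binds a business process to an abstract service contract rather than to any particular implementation; all class and method names are invented for this example:

    # Sketch of the SOA decoupling described above: the business process
    # depends only on an abstract service contract, so the underlying
    # implementation can be swapped without touching the process itself.
    # All class and method names are illustrative.
    from abc import ABC, abstractmethod

    class CreditCheckService(ABC):
        """Abstract business function exposed to process owners."""
        @abstractmethod
        def check(self, customer_id: str) -> bool: ...

    class InHouseCreditCheck(CreditCheckService):
        def check(self, customer_id: str) -> bool:
            return True  # stand-in for an internal scoring system

    class OutsourcedCreditCheck(CreditCheckService):
        def check(self, customer_id: str) -> bool:
            return True  # stand-in for an external bureau call

    def order_approval_process(service: CreditCheckService, customer_id: str) -> str:
        # The business process references the contract, never the vendor.
        return "approved" if service.check(customer_id) else "rejected"

    print(order_approval_process(InHouseCreditCheck(), "C-1001"))
    print(order_approval_process(OutsourcedCreditCheck(), "C-1001"))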

WHAT WILL BE CHANGING?
With this approach, Unregulated IT dramatically increases flexibility and responsiveness, while fundamentally changing the nature of an IT organization.

Through Unregulated IT, value will be created by the business owner but enabled through IT. This model changes the very nature of an IT organization, from constrainer and limiter to enabler. The organizational implications of this change are significant.

Externally, Unregulated IT will change how business perceives IT. Traditional IT requires technical involvement from business users: they are often drawn into hardware configuration, storage requirements, and the like. Unregulated IT releases them from this burden. The “under-the-hood” elements of IT such as hardware, networks, and operating systems, as well as the type of software and applications being used, will become irrelevant to the business user (Figure 3). Business users only see the realization of their business process through the business services that they manage. We can now characterize the IT infrastructure from a business user’s perspective as process management and a supporting set of business services, instead of servers and applications.

Internally, Unregulated IT also presents the opportunity to change how an IT organization perceives itself. World-class IT will be assessed on the standardization, measurement, and management of the value chain. To achieve this, standardization will occur across key operational areas including systems management, security, source code management, and outsourcing. Simultaneously, these operations push IT to redefine its strategy, enterprise architecture, perceived IT services, and the management of its people and organization. This process, by its very nature, turns world-class traditional IT capability into a commodity. With Unregulated IT, the competitive advantage of owning world-class “traditional infrastructure” decreases as business users perceive the higher-level business services and process management as their true infrastructure. Using our utility metaphor, electricity is the true infrastructure to the consumer, not the power plant behind it.

BENEFITS OF UNREGULATED IT
The mantra for CIOs/CTOs during the last decade has been “better, faster, and cheaper” (Figure 4). All the IT efforts such as optimizing IT operations, reducing implementation costs, improving service level agreements, standardizing hardware architectures, and simplifying software license agreements are efforts to realize this goal. Frank Gillett of Forrester Research states that “We think [Utility Computing] is the third major computing revolution - after mainframes and the Internet.”

Unregulated IT can make IT better by simplifying the way IT functions: it creates an environment where IT assets are not an issue, because the assets are built on standards and therefore adaptable to each other. Unregulated IT can make IT faster because IT can promptly provide service to business users, whether to execute new ideas or to change computing needs at the speed of the business. Unregulated IT can make IT cheaper because IT will lower overhead costs and take advantage of flexible pricing and payments, while reducing start-up costs for new initiatives. Realization of the better, faster, cheaper mantra will force IT organizations to think strategically about the Unregulated IT Architecture.

STRATEGIC ARCHITECTURE OF UNREGULATED IT
Though vendors such as IBM and Veritas are producing products and services around the infrastructure in the current Utility Computing model, the model should not be limited to servers and processing power. The idea of virtualization can extend into the higher levels of an IT infrastructure (Figure 5), creating an environment with responsive business processes and information as a true corporate asset.

With today’s Utility Computing focusing on the virtualization of the Infrastructure layer, the next virtualization will be at the Information Stores layer. In Unregulated IT, this will be achieved through a Common Information Model (CIM). CIM allows business users to view information in an industry-standardized format. For example, utilities and telecommunications already have industry-specific CIMs. The concept of the CIM allows decision-making systems such as Corporate Data Warehouses to produce standardized analytics in sync with industry-specific benchmarks, driving operational excellence and creating a competitive advantage during mergers and acquisitions.
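
A minimal sketch of the CIM idea follows: records from two differently shaped source systems are mapped through adapters into one standardized form before analytics see them. The field names are hypothetical and not drawn from any published industry CIM:

    # Sketch of the Common Information Model idea: each source system
    # needs only one adapter into the standardized shape, after which
    # analytics run uniformly across all sources.
    from dataclasses import dataclass

    @dataclass
    class CIMMeterReading:   # the standardized view analytics consume
        meter_id: str
        kwh: float
        read_date: str       # ISO-8601 date

    def from_legacy_billing(row):
        # One adapter per source system is all the integration required.
        return CIMMeterReading(row["MTR_NO"], float(row["USAGE_KWH"]), row["READ_DT"])

    def from_acquired_company(row):
        return CIMMeterReading(row["meter"], row["consumption"], row["date"])

    readings = [
        from_legacy_billing({"MTR_NO": "A17", "USAGE_KWH": "412.5", "READ_DT": "2004-05-01"}),
        from_acquired_company({"meter": "B09", "consumption": 388.0, "date": "2004-05-01"}),
    ]
    # Standardized analytics run uniformly across both sources.
    print(sum(r.kwh for r in readings))

This is how a common model eases mergers and acquisitions: the acquired company’s data needs only a new adapter, not a new analytics stack.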

Similarly, application interactions (usually traveling on an Enterprise Application Integration bus) would use a similar approach to enable the virtualization of the next layer: Service-Oriented Architecture. SOA abstracts (or virtualizes) business functionality from its underlying implementation.

With an SOA infrastructure in place, business users can use business process modeling tools to deploy and update a process, leveraging business functionality exposed through the SOA infrastructure and implemented using evolving standards for execution languages such as the Business Process Execution Language (BPEL).
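
The sketch below illustrates the orchestration principle behind BPEL, reduced to plain Python for readability: the process definition is data, not code, so a process owner can reorder or insert steps without new application development. The service registry and step names are hypothetical:

    # Sketch of declarative process orchestration: a business process is
    # a sequence of abstract service invocations, editable as data.
    SERVICES = {
        "validate_order": lambda order: {**order, "valid": True},
        "reserve_stock":  lambda order: {**order, "reserved": True},
        "invoice":        lambda order: {**order, "invoiced": True},
    }

    def run_process(step_names, order):
        """Execute a declaratively defined process over registered services."""
        for name in step_names:
            order = SERVICES[name](order)
        return order

    # Rearranging the list is a configuration change, not a coding change.
    fulfilment_process = ["validate_order", "reserve_stock", "invoice"]
    print(run_process(fulfilment_process, {"id": "ORD-7"}))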

Finally, the need to access this functionality over multiple platforms will require the architecture to virtualize the presentation layer. This virtualization can be realized through portal suites from vendors such as IBM, Vignette, and others. These systems are capable of delivering consistent information through a wide variety of interface platforms, including Web, voice, PDA, as well as traditional desktop systems.
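
As an illustration of presentation-layer virtualization, the following sketch renders one business result consistently across hypothetical Web, voice, and PDA channels; the renderers are invented for this example:

    # Sketch of presentation-layer virtualization: the business service is
    # unaware of the device, and the portal layer selects the rendering
    # at delivery time.
    balance = {"account": "12345", "amount": 1250.00}

    RENDERERS = {
        "web":   lambda d: f"<p>Account {d['account']}: ${d['amount']:,.2f}</p>",
        "voice": lambda d: f"Your account balance is {d['amount']:.2f} dollars.",
        "pda":   lambda d: f"{d['account']}|{d['amount']:.2f}",
    }

    def present(data, channel):
        # Same business payload, channel-appropriate presentation.
        return RENDERERS[channel](data)

    for channel in RENDERERS:
        print(present(balance, channel))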

CURRENT TRANSITIONS TOWARDS UNREGULATED IT
While the transition towards a more virtualized infrastructure is already underway, it is unrealistic to expect it to happen overnight. Writing off the multi-million-dollar infrastructure investments of the last decade as scrap is not an option. Innovative service agreements, such as the one between IBM and Morgan Stanley, are repurposing that invested infrastructure: instead of paying for dedicated computing power, Morgan Stanley shares processing power, storage, and bandwidth with other IBM customers.

The foundation for virtualizing the other layers is also under way, with several remaining challenges. The biggest of these challenges may lie in the adoption of a granular SOA, and of associated service-based pricing at the application layer, by Independent Software Vendors (ISVs). For example, several ERP vendors provide packaged functionality, but only a small percentage of it gets used by a specific client. The transition needs to occur in both the ISV’s revenue model and the underlying product architecture, providing the needed service as a detachable function with a usage-based price tag. The widespread adoption of this approach to turn business functionality into a commodity may require a set of disruptive players in the ISV space.

Implementation of Unregulated IT will have several economic implications for IT shops (refer to the “Cost Structure” section below). The current problems associated with accurate measurement of Total Cost of Ownership (TCO) will be replaced by value-focused metrics such as Total Value of Ownership (TVO) and/or business value metrics. These metrics will attempt to capture the value of IT as it relates to supporting value-driven business processes. It is reasonable to expect measurements like SLAs (still an important element of the Utility Computing layer) to be augmented with measurements of the availability of business-level processes, rather than of the underlying IT elements.

COST STRUCTURE OF UNREGULATED IT
The cost structure of the Unregulated IT model (Figure 6) will reduce the fixed costs that today’s IT departments face in hardware support staff, hardware depreciation, software procurement, and license maintenance. Hardware support staff will become experts at “shopping” for computing needs; software procurement will be shared between organizations or outsourced completely; and aging hardware will be dynamically repurposed to non-critical services by a service partner skilled at infrastructure management.

Additionally, the variable cost of application and infrastructure maintenance will follow a reduced cost curve as common costs are shared across a larger customer base. The high up-front costs associated with starting a new initiative will be drastically lowered, allowing organizations to venture into business optimization with less risk.
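
A hypothetical worked example of this fixed-to-variable shift, with all dollar figures invented for illustration:

    # Under the traditional model, cost is fixed regardless of business
    # activity; under Unregulated IT, spend tracks the business it supports.
    def traditional_cost(months):
        fixed_monthly = 100_000  # depreciation, licenses, support staff
        return fixed_monthly * months

    def unregulated_cost(monthly_transactions, rate_per_txn=0.04, start_up=25_000):
        # Low start-up cost, then cost varies with actual usage.
        return start_up + sum(t * rate_per_txn for t in monthly_transactions)

    activity = [800_000, 900_000, 400_000]  # the business contracts in month 3
    print(traditional_cost(3))              # 300000, regardless of activity
    print(unregulated_cost(activity))       # 25000 + 84000 = 109000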

However, the history of the utility industry suggests that the current Utility Computing model will encounter consolidation, monopolistic attitudes, and the inevitable regulatory response before it becomes Unregulated IT. The economics will favor recovering cost over creating value. Hence, IT departments may tend to choose outsourcing vendors without serious investigation into supplier costs; business partners’ strategy, resources, and policies; or their ability to provide true business value.

CIOs/CTOs venturing to implement Unregulated IT need to ensure that their efforts are geared towards cost-effective cost collection across all the layers within IT. Otherwise, IT may become a regulated monopoly heading towards arcane procedures for allocating costs against bandwidth and processing power. CIOs/CTOs will start seeing clever accounting techniques that even utility regulators have difficulty grasping, and peak and off-peak pricing for application usage will be difficult to manage and understand – much like in the century-old utility industry of today.

CONCLUSION
Though the Unregulated IT model is in its infancy, it is far more than smoke and mirrors. Utility Computing service models, a foundational element of an Unregulated IT capability, are available today and growing in popularity. Standards to support the higher levels of the architecture are either established or emerging.
Unregulated IT will simplify IT by reducing and masking complexity. It will change much of IT’s fixed costs into variable costs. It will allow IT operating expenses to vary with the expansion and contraction of the business that it supports.

Unregulated IT, when extended beyond the infrastructure layer, could become the “Next Big Thing” in the technology industry. CIOs/CTOs need to reinforce that the model behind Unregulated IT is in its infancy; the key ingredient for its healthy growth is instilling transformative thinking around ‘how we engage’, rather than making immediate changes to the IT infrastructure.

Besides making the virtualization layers a reality, CIOs/CTOs need to facilitate the necessary cost structure. A few lessons from the history of the regulated utility industry could be handy. Automation is a mandatory element in implementing virtualization. Additionally, standardization is critical to low-cost service delivery and the ability to accurately gauge (and bill for) consumption.

Without practical, operational evidence of successful implementations of Unregulated IT across all the IT layers, the business community may continue to wonder whether the model will work as advertised. Setting the right expectations around capabilities and timing will be critical to gaining the trust of business users.

Finally, the implementation of Unregulated IT is not something that CIOs/CTOs can simply outsource or offshore. It needs a strong backbone to survive: a backbone built on a vision supported by senior executives across the organization, for it is as fundamental to the business as it is to the IT organization.

REFERENCES
“The New IT Infrastructure”, Nagesh Anupindi, Michael Bates, and Gerry Coady, Network InterOp, Las Vegas, May 2004.
“Utility Computing: The Next New IT Model”, Graeme Thickins, Darwin Magazine, Apr 2003.
“Tech Wave 1: Utility Computing”, Steve Hamm, BusinessWeek, Aug 2003.
“Why Utility Computing Will Succeed Where ASPs and Outsourcing Failed”, Lief Eriksen, Aug 2003.
“Utility Computing: Legal Challenges and How to Handle Them”, Alan Steel-Nicholson and Peter Brudenall, Simmons & Simmons, Sep 2003.
“Utility Computing: Any Port in a Storm?”, Alan Steel-Nicholson and Peter Brudenall, Simmons & Simmons, Oct 2003.
“The Voodoo Economics Behind Utility Computing”, Michael Schrage, CIO Magazine, Nov 2003.
“Lessons for Utility Computing”, George Chen and Lloyd Switzer, eWeek, Jan 2004.
“Rethinking Utility Computing: Lessons from the Power Industry”, EnterpriseInnovator, Feb 2004.
“Morgan Stanley, IBM Ink Utility-Computing Deal”, Elizabeth Millard, E-Commerce Times, Apr 2004.
“The New, New IT Strategy”, Tom Davenport, CIO Magazine, Jan 2001.

ABOUT THE AUTHORS
Nagesh Anupindi, Ph.D., is an Executive Consultant in Information Technology for Xcel Energy. He is currently involved in developing enterprise-wide IT strategy for transforming the core business and its operations. He focuses on innovatively aligning Enterprise Architecture, Data Integrity, Information Backbone, and Business Performance Management. He received his Bachelors in Electronics & Telecommunications, his Masters in Electrical Engineering from the Indian Institute of Technology (IIT), and his Doctorate in Computer Engineering from the University of Rhode Island. He has over 15 years of Information Technology experience in Data Management and Business Process consulting, and currently consults with medium-to-large corporations on building IT Strategy and discovering Information as a transformation vehicle. Email: Nagesh@Nagesh.com.

Michael Bates is currently an Executive Consultant for government projects and served as the Director of Enterprise Architecture for Xcel Energy. At Xcel Energy, he was responsible for the development and implementation of Enterprise Architecture deliverables and processes for the corporation. He has over 30 years of Information Technology experience in a wide variety of industries, including transportation, utility, telecommunications, engineering, and education. Michael has extensive experience in the development and management of infrastructure capabilities on a global scale. Projects include the integration of over 1,200 systems in 270+ countries to manage information flow for a global transportation company, and the development and integration of customs clearance capabilities in the US, Japan, Europe, Taiwan, and Singapore, linking customs brokers, government agencies, and manifest systems for another global transportation company. As a consultant for IBM, Michael specialized in Enterprise Application Integration and large-system architecture.

Gerry Coady is CIO of Frontier Airlines and served as the Chief Architect and Managing Director of the Strategic Enterprise Solutions Group at Xcel Energy. Prior to joining Xcel, Gerry was a Program Executive with IBM. Coady spent five years at JD Edwards, where he was Vice President and Chief Knowledge Officer (CKO). He has also held numerous management, consulting, and technical positions with Digital Equipment Corporation (DEC). Gerry has a Masters in Business Management, majoring in Innovation and Technology, from Boston University, and is currently in the dissertation phase of his PhD in Knowledge Management. Email: Gerry.Coady@Gmail.com.


William Santos is currently the Director of Innovation and Strategy for IBM, in support of the IBM/Xcel outsourcing partnership. Bill is responsible for identifying transformational opportunities within the context of an outsourcing partnership. Prior to his current role, Bill was a member of IBM’s Watson Research organization, as well as CTO for the software startup 2ce, Inc. He also owned and operated the Atlantec Group, a technology services and consulting firm. Bill holds Bachelor’s and Master’s degrees in Computer Science and Engineering from the Massachusetts Institute of Technology (MIT).

 