A Day in the Life of a CIO
It seems like only yesterday that the IT community was arguing the case for the board-level IT director. Since then, the CIO has come a long way, and probably now feels the weight of the company on his shoulders. Today, the CIO is tasked with delivering more with less: working with reduced budgets whilst under orders to improve overall IT performance and capability.
The tough economic climate drives businesses to seek rapid ROI on all IT investments - old and new. The CEO, CFO and the entire board need to see that IT is linking successfully with the overall business strategy of the organisation, and it's up to the CIO to demonstrate this.
So how has the CIO's role changed over the years?
Firstly, a typical CIO is no longer concerned only with the old IT domain. He is now responsible for a growing number of areas, and is expected to be accountable at all levels of the organisation. For example, the typical CIO or IT director must be adept in the following roles:
The 'Tech-Head': The CIO is responsible for ensuring the purchasing, integration and smooth running of all applications, servers, hardware, software and storage within the organisation.
The Management Guru: The CIO has to manage the complexity of disparate systems, multiple users and numerous service providers.
The Negotiator: Once considered an anti-social department, the IT team now have to deal with a plethora of people, meet customer service level agreements and manage IT supplier relationships.
The Lawyer: The IT department has to ensure that the organisation is protected against email and Web-based threats, and that it meets legal and regulatory requirements.
The Accountant: The pressure is on, and the CIO has to manage IT costs and deliver a return on IT investment.
The Futurologist: The CIO has to keep up with new technology trends, assessing the credibility of emerging technologies for the organisation's IT landscape.
Given the diverse demands and the increased level of responsibility to the business, today's CIOs are looking for a solution that addresses both the business and the technology sides of an increasingly complex IT/business environment. As a result, we have seen the rise and growing viability of the utility computing concept.
Utility computing, a service model in which organisations are provided with computing resources and infrastructure management as needed, has proven to cut IT expenditure while making networks more flexible and scalable. In fact, industry analysts Gartner recently claimed that ignoring utility computing would be a mistake.
Utility computing is no flash in the pan fad, but simply the way things ought to be in the ideal world. Functionally, it is similar to a public utility, like water or electricity: internal customers specify the resources required (processing, storage, networking) as well as the qualities of service (performance, reliability, cost), and the computing utility automatically and transparently delivers and accounts for the resources demanded.
The utility computing model pulls computing resources from across the enterprise together, and because these resources can be shared, the result is higher resource utilisation, greatly simplified management and a lower cost of ownership. For example, instead of purchasing additional hardware to cater for peaks in demand, under-utilised resource from elsewhere in the business is harnessed. It also means better support for the business processes that depend on computing.
This is a far cry from the disparate silos of under-utilised server, storage, network and application resource that plague organisations today, creating hideous complexity and draining already strained budgets. Surely it represents the pill that will cure the CIO's headache?
According to technology consultants The Clipper Group, utility computing is generally associated with five key attributes:
Virtualised: Computing resources are pooled and tailored for simpler management and better utilisation.
Open and heterogeneous: Standards and interoperability allow multiple vendors and even legacy equipment to be incorporated into an infrastructure. Customers benefit from choice and competition.
Unified and centrally managed: All the components that comprise the utility are centrally integrated, coordinated and managed, dramatically reducing operating costs.
Dynamic provisioning: Resources are dynamically and precisely allocated to meet changing business requirements.
Automated: The utility monitors and manages itself and takes action based on user-defined policies.
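The last two attributes, dynamic provisioning and automation, amount to acting on user-defined policies rather than manual intervention. A minimal sketch in Python, with invented names and thresholds chosen purely for illustration:

```python
def evaluate_policy(utilisation, policy):
    """Map a measured utilisation (0.0-1.0) to a provisioning action
    using a simple user-defined threshold policy."""
    if utilisation > policy["scale_up_above"]:
        return "add_capacity"       # demand is outstripping the allocation
    if utilisation < policy["scale_down_below"]:
        return "reclaim_capacity"   # return idle resource to the pool
    return "no_action"

# A hypothetical policy: grow above 85% busy, shrink below 30%.
policy = {"scale_up_above": 0.85, "scale_down_below": 0.30}
print(evaluate_policy(0.92, policy))  # add_capacity
print(evaluate_policy(0.10, policy))  # reclaim_capacity
print(evaluate_policy(0.50, policy))  # no_action
```

In a real utility the monitoring loop, not a human, would call such a policy continuously, which is what distinguishes attribute five from ordinary capacity planning.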
An IT infrastructure that combined all of these attributes in full would be a true computing utility. However, the real world is not quite this ideal yet, and the concept is still developing and will continue to evolve. IDC, the IT research organisation, believes that the market will take off slowly at first and accelerate around 2007 or 2008. Larger companies are dipping their toes in, and the mid-market will follow.
While 2007 or 2008 might seem a long way off, it is not. We are so used to the fast-moving pace of technological hype, often disguised as developments or solutions, that a well-developed business concept raises eyebrows. We must remember that utility computing is a true business vision, and selling it as a quick-fix technology solution does not do it justice.
STORAGEsearch is published by ACSL