Posts Tagged IBM
Recently I talked with several data center managers about their experiences with virtualization. While these managers have different perspectives, they all agreed that server virtualization alone isn’t enough: there is an element missing in their virtualization strategies that makes it hard to meet the increasing demands of the business. By moving beyond server virtualization to a more holistic approach that includes virtualizing storage, network, and other technology assets, these companies are increasing the ROI of their virtualization implementations. They are beginning to reset their virtualization priorities to make sure that all the elements of the IT environment work together – and this includes creating a virtualization environment that automatically allocates resources based on the demands of specific applications and workloads.
IT management needs to focus on application priorities in terms of the performance needed to support the business. If all your applications are treated with the same priority, how can you be assured that your most critical applications always have access to the resources they require? You may be doing a great job monitoring CPU usage and available memory in your server virtualization environment, but still have unexpected performance problems that impact critical customer applications. What’s missing is a way to adjust for variations in business priority when you allocate resources across your virtualized environment.
One way to ensure that your environment operates with an increased awareness of the requirements of each specific application is to implement application infrastructure virtualization. This capability codifies business and technical usage policies to make sure that each of your applications leverages virtual and physical resources in a predictable way. By automating many of the underlying rules and processes, you can optimize both resource utilization and the customer experience at the same time.
There are three main characteristics of application infrastructure virtualization:
- Setting business priorities for applications and automatically adjusting resources to keep customer service levels in balance
- Applying a customer focused approach to the automation of resource optimization so that each application gets the resources it needs based on resource availability and the application’s priority to the business
- Allocating a pooled set of resources to support a group of workloads.
Application infrastructure virtualization ensures that any resource can run any workload. If there are resource limitations, then the application with the lowest business priority at the time is allocated the fewest resources.
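To make the idea concrete, here is a minimal sketch of priority-based allocation over a shared pool, assuming a simple model in which each application declares a business priority and a resource demand. The application names, priorities, and capacity figures are hypothetical and are not drawn from any particular product.

```python
# A minimal sketch of priority-based allocation across a shared resource pool.
# The application names, priorities, and capacity figures are hypothetical.
from dataclasses import dataclass


@dataclass
class App:
    name: str
    priority: int   # higher number = more important to the business
    demand: int     # resource units the application would like to have


def allocate(apps, capacity):
    """Give every app its demand if the pool is large enough; otherwise
    split the constrained pool in proportion to business priority."""
    total_demand = sum(a.demand for a in apps)
    if total_demand <= capacity:
        return {a.name: a.demand for a in apps}

    total_priority = sum(a.priority for a in apps)
    allocation = {}
    remaining = capacity
    # Walk the apps from highest to lowest priority; each gets a share of the
    # pool proportional to its priority, capped at what it actually asked for.
    # (Unclaimed remainder is not redistributed in this simple sketch.)
    for a in sorted(apps, key=lambda x: x.priority, reverse=True):
        share = min(a.demand, int(capacity * a.priority / total_priority))
        share = min(share, remaining)
        allocation[a.name] = share
        remaining -= share
    return allocation


if __name__ == "__main__":
    apps = [
        App("order-entry", priority=5, demand=40),
        App("reporting", priority=2, demand=40),
        App("batch-archive", priority=1, demand=40),
    ]
    # A pool of 80 units cannot satisfy 120 units of demand, so the
    # lowest-priority workload ends up with the fewest resources.
    print(allocate(apps, capacity=80))
```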
I amplified this issue in a white paper I recently wrote for IBM on the topic. The paper, called Creating Application Awareness in IT Virtualization Environments, discusses application infrastructure virtualization and how companies can combine server and application infrastructure virtualization to improve overall performance levels. In addition, the paper describes IBM’s solution for application infrastructure virtualization, IBM WebSphere Virtual Enterprise (WVE).
It is easy to assume that server virtualization by itself is enough to solve resource management issues in the data center. However, it is increasingly clear that virtualization has to be tied to the performance of individual applications, based on the times and situations in which they demand performance. Tying application performance to virtualization creates a more efficient and proactive environment for satisfying customer expectations.
Many business executives are interested in moving to the cloud because of the potential impact on business strategy. Increasingly they are convinced that a cloud model – particularly the private cloud – will give them greater flexibility to change and to manage the uncontrolled expansion of IT. In contrast, from an IT perspective, the ability to virtualize servers, storage, and I/O is often viewed as the culmination of the cloud journey. Of course, the world is always more complicated than it appears. Cloud computing, if implemented in a strategic manner, can help a company experiment and change more easily. Likewise, virtualization, which may seem like an isolated and pragmatic approach, needs to be considered in the context of an overall cloud computing strategy.
However, the challenge for many companies right now is how to transform their virtualized infrastructure into a private cloud that delivers on the promise of on-demand and self-service provisioning of IT resources. Making sure that business leaders gain the desired cost savings and business flexibility, while IT gains the optimization that virtualization can deliver, requires an integrated strategy. At the heart of this strategy is service management of this emerging, highly virtualized environment.
Why is service management important? There are good reasons why a key focus of the virtualization strategy at many companies has been server virtualization. For example, server virtualization helps companies create a faster and more efficient IT provisioning process for users. It also gives users increased operational flexibility, based on the mobility and isolation capabilities of virtual machines.
However, there is a major management problem with the typical virtualization approach in many companies. Often developers satisfy their demands for computing resources by simply spinning up a new virtual machine rather than waiting for the time and budget required to purchase new IT systems. In the beginning, IT management allows this practice because it is easier than trying to control impatient developers. However, there is a price to be paid. Each new virtual machine image requires memory and disk resources. When the number of virtual machines grows out of control, companies end up spending more time and money on disk, storage, and memory resources than they anticipated. Lack of control means lack of management. What is the answer?
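One common response to this problem, sketched below under hypothetical team names, quota figures, and request format, is to put a simple quota check in front of virtual machine provisioning so that sprawl is caught before it consumes memory and disk.

```python
# A minimal sketch of one possible control on virtual machine sprawl:
# check a per-team quota of memory and disk before a new VM is spun up.
# Team names, quota figures, and the request format are hypothetical.

QUOTAS = {"web-team": {"memory_gb": 64, "disk_gb": 500}}

PROVISIONED = {"web-team": {"memory_gb": 48, "disk_gb": 350}}


def can_provision(team, memory_gb, disk_gb):
    """Approve the request only if it fits inside the team's remaining quota."""
    quota, used = QUOTAS[team], PROVISIONED[team]
    return (used["memory_gb"] + memory_gb <= quota["memory_gb"]
            and used["disk_gb"] + disk_gb <= quota["disk_gb"])


if __name__ == "__main__":
    print(can_provision("web-team", memory_gb=8, disk_gb=100))   # True
    print(can_provision("web-team", memory_gb=32, disk_gb=100))  # False: memory quota exceeded
```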
At a recent IBM systems software meeting, Helene Armitage, General Manager of Systems Software, emphasized that truly transforming the data center into an agent of change requires a focus on manageability. Here are the main takeaways from my conversations with Helene and her team.
- Virtualization is not just about server consolidation. Companies need to think differently about virtualization in order to prepare for a future marked by rapid change. For example, storage virtualization can offer big benefits to companies facing explosive demand for data storage from increased use of new technologies like embedded and mobile devices.
- Physical and virtual environments will need to be managed in a unified way and at the same time. Some may think of virtualization as a way of reducing management requirements. After all, with consolidated resources you should have less demand for power, energy, and space. However, management needs to be done right to be effective. It is important to consider how to manage across the platform and to include monitoring and configuration in combination with business service management – delivering IT services in a structured, governed, and optimized way.
- Cloud changes the complexion of what’s important in how virtualization is approached. In the cloud, virtualization takes on a different level of complexity. If elements of virtualization including storage, server, and I/O virtualization are handled as isolated tasks, the cloud platform will not be effectively managed.
- System pooling is key for managing lots of systems. System pooling essentially means that IT treats all resources as a unified set of shared resources to improve performance and manageability (see the sketch after this list).
- Enforcing standardization right from the beginning is a requirement — especially when virtualization is at the core of a cloud environment. The only way to truly manage the virtualized environment is to apply repeatable, predictable, and standardized best practices across the entire computing environment.
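As a rough illustration of the system-pooling point above, the sketch below treats a few hypothetical hosts as one shared pool and places each workload on whichever member of the pool has the most free capacity; host names and capacity figures are invented for the example.

```python
# A minimal sketch of system pooling: physical hosts are treated as one shared
# pool, and workloads are placed on whichever member of the pool has the most
# free capacity. Host names and capacities are hypothetical.
from dataclasses import dataclass


@dataclass
class Host:
    name: str
    capacity: int   # e.g. GB of memory
    used: int = 0

    @property
    def free(self):
        return self.capacity - self.used


class SystemPool:
    def __init__(self, hosts):
        self.hosts = hosts

    def place(self, workload, demand):
        """Place a workload on the pool member with the most headroom."""
        host = max(self.hosts, key=lambda h: h.free)
        if host.free < demand:
            raise RuntimeError(f"pool cannot fit {workload}")
        host.used += demand
        return host.name


if __name__ == "__main__":
    pool = SystemPool([Host("host-a", 128), Host("host-b", 64)])
    print(pool.place("erp", 48))        # lands on host-a
    print(pool.place("analytics", 60))  # host-a still has the most headroom (80 free)
```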
The greatest challenge for companies is to think differently about virtualization. Management will need to realize that virtualization is a foundational element in building a private cloud environment and therefore has to be looked at from a holistic perspective.
It isn’t easy being a small fish in a big pond. How does a small mortgage company leverage its ability to be nimble in a competitive market? I believe that software as a service (SaaS) offerings are initiating a revolution with broad implications for business models and competitiveness. It is becoming increasingly clear that with the advent of sophisticated SaaS environments combined with process as a service, it is possible and even commonplace for a small company to be competitive. I had the opportunity to spend some time at IBM’s Lotusphere with a small mortgage company called Signature Mortgage, based in Ohio. By leveraging a combination of IBM’s LotusLive platform and Silanis Technology’s electronic signature platform, Signature Mortgage has been able to differentiate itself in an extremely complex market dominated by Fortune 500 companies.
Mortgage brokers like Signature Mortgage are involved primarily in the origination process of the loan, bringing mortgage borrowers together with the mortgage lenders who actually fund the loan. Borrowers engage with mortgage brokers early in the mortgage process, when a consumer looking for a mortgage often has a lot of choices. It can be hard at this early stage for a consumer to differentiate between mortgage brokers, and if one broker takes several days or longer to deliver the mortgage application documents for signing, the consumer might switch to another broker. Typically a consumer selects a mortgage broker based on the offered rate, term, and closing costs, and then locks down the rate to protect against a rate change. The broker will facilitate a complex series of steps that must take place in order to move from this initial rate lock to the approval and closing of the mortgage. The consumer must submit financial and personal documentation so the mortgage broker can assess the creditworthiness of the individual and appraise the property.
The process between mortgage origination and closure can take as long as 45-60 days. To make money, the mortgage broker needs to be able to collect all the required information quickly and then be in a position to close the deal with the mortgage lender before rates change. There is a lot at stake for both the mortgage broker and applicant during this time period. For example, missing a deadline for a signature can lead to cancellation of locked-in rates on a mortgage commitment, potentially leading to higher costs for a buyer or lost revenue for a mortgage broker.
Bob Catlin, President of Signature Mortgage, explained to me that he was determined to use his company’s small size as an advantage by quickly implementing innovative technology that might take much longer to adopt in a large company with legacy policies and infrastructure. By streamlining and speeding up the mortgage origination process, he could differentiate from the larger banks, increase profits, and have happier, more satisfied customers.
Signature Mortgage has its customers log in to a portal designed to capture best practices for submitting application documentation, revising documents, receiving status reports, and securing electronic signatures when required. All of the steps in the mortgage process are documented within the Silanis Technology portal, which is based on IBM’s LotusLive collaboration platform. The Silanis electronic signature solution is delivered as a cloud-based service, making it attractive to a small business with a limited IT staff. By implementing Silanis Technology’s solution, Catlin has been able to shave days off the loan origination process because there are no more last-minute surprises resulting from missing data or documents and delays in arranging in-person meetings.
For Signature Mortgage, the first step in tightening the timeline for the mortgage process was to consistently lock in rates in less than 15 days. Catlin is now working on decreasing application processing time to 24 hours and shortening the total time between mortgage application and closing to 10-15 days. Any mortgage borrower who has dealt with requests for last-minute faxes for missing documentation or errors that are discovered as the rate-lock deadline approaches will understand that decreasing the time to close from the industry average of 30-45 days is a big deal. However, speed isn’t the entire benefit; managing all information related to the mortgage applications in one centralized portal improves accuracy and accountability. Using the capabilities of LotusLive and Silanis e-signature, Signature Mortgage has been able to include features that make the online mortgage process intuitive and consistent with the paper-based manual processes that are familiar to many people. For example, customers click to sign where they see a virtual sticky note, similar to the process used at in-person signings. The consistency and repeatability of the process helps the company to maintain compliance with legal and regulatory requirements.
With this predictable foundation, Signature Mortgage has been able to grow quickly, increase profitability and build a strong presence in the community.
I am welcoming my business partner, Judith Hurwitz, as a contributor to my blog. What follows are her observations about the partner ecosystem in the cloud.
I have been spending quite a bit of time these days at Cloud Computing events. Some of these events, like the Cloud Camps, are wonderful opportunities for customers, vendors, consultants, and interested parties to exchange ideas in a very interactive format. If you haven’t been to one, I strongly recommend them. Dave Nielsen, one of the founders of the Cloud Camp concept, has done a great job not just jump-starting these events but participating in most of them around the world. In addition, Marcia Kaufman and I have been conducting a number of half- and full-day Introduction to Cloud Computing seminars in different cities. The most interesting observation from my perspective is that customers are no longer sitting on the sidelines with their arms crossed. Customers are ready and eager to jump into this new computing paradigm. Often they are urged on by business leaders who instinctively see the value in turning computing into a scalable utility. So, for the first time, there is a clear sense that there may well be money to be made.
I am looking forward to attending The Smart Governance Forum (23rd meeting of the IBM Data Governance Council) in California on February 1-3, where I will be a panelist for a session on Smart Governance Analytics. As my panel group started to plan for the event, I did some background research on the Council to understand more about them. What kinds of questions were Council members asking about information governance when they began meeting in 2004 and how are things different today? Have they developed best practices that would be useful to other companies working to develop an information governance strategy?
Information governance refers to the methods, policies, and technology that your business deploys to ensure the quality, completeness, and safety of its information. Your approach to information governance must align with the policies, standards, regulations, and laws that you are legally required to follow. When a group of senior executives responsible for information security, risk, and compliance at IBM customer organizations began meeting in 2004, interest in IT governance was high, but there wasn’t as much attention focused specifically on information governance.
Books like “IT Governance: How Top Performers Manage IT Decision Rights for Superior Results” by Peter Weill and Jeanne W. Ross helped companies understand the benefit of aligning IT goals with the overall goals and objectives of the business. In addition, other publications at this time focused on how to take a balanced scorecard approach to managing business strategy and on best practices for implementing IT governance. These approaches are of critical importance to business success; however, there was also a need to develop a framework for understanding, monitoring, and securing the rapidly increasing supply of business data and content.
And that is what a group of information-focused business leaders, together with IBM and business partner technology leaders, decided to do. The amount of data they needed to collect, aggregate, process, analyze, share, change, store, and retire was growing larger every day. In addition to data stored in traditional databases and packaged applications like CRM (customer relationship management) systems, they were also concerned about information stored and shared in unstructured formats like documents, spreadsheets, and email.
Having more information about your company’s customers, partners, and products creates great opportunity, but more information also means more risk if you don’t manage your information with care. Council members asked each other lots of questions, such as:
- How can we be sure that the right people get access to the right information at the right time?
- How can we make sure that the wrong people do not get access to our private information at any time?
- How can we overcome the risks to data quality, consistency, and security increased by the siloed approach to business data ownership that is so prevalent in our organizations?
- How can we create a benchmarking tool for information governance that will help our businesses to increase revenue, lower costs, and reduce risks?
- How can we improve our ability to meet the security and protection standards of auditors and regulators?
As a result of its discussions, the Council developed a Maturity Model to help you assess your current state of information governance and provide guidance for developing a roadmap for the future. The Model identifies 11 categories of information governance. The categories cover all the different elements of building an information governance strategy, such as understanding who in the business and IT is responsible for what information, what policies you follow to control the flow of information in your company, what your methodologies are for identifying and mitigating risk, and how you measure the value of your data and the effectiveness of governance. I read two IBM white papers on the Model that add insight to the questions you need to ask to begin building a path to better information governance: “The IBM Data Governance Council Maturity Model: Building a roadmap for effective data governance” and “The IBM data governance blueprint: Leveraging best practices and proven technologies“.
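As a rough illustration of how such a maturity self-assessment might be captured, the sketch below scores a handful of illustrative categories (not the Council’s official list of 11) on a hypothetical five-level scale and reports the gaps against a target level.

```python
# A minimal sketch of how a maturity-model self-assessment might be recorded
# and turned into a roadmap gap report. The category names and scores below
# are illustrative, not the Council's official list; maturity is scored on a
# hypothetical 1 (initial) to 5 (optimizing) scale.

CURRENT = {
    "Policy": 2,
    "Stewardship": 3,
    "Data Quality Management": 2,
    "Information Security & Privacy": 4,
    "Audit, Logging & Reporting": 1,
}

TARGET = 4  # the maturity level the business wants to reach in each category


def gap_report(current, target):
    """List the categories that fall short of the target, largest gap first."""
    gaps = {cat: target - level for cat, level in current.items() if level < target}
    return sorted(gaps.items(), key=lambda item: item[1], reverse=True)


if __name__ == "__main__":
    for category, gap in gap_report(CURRENT, TARGET):
        print(f"{category}: {gap} level(s) below target")
```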
So, what’s changed? Financial crises, increasing regulation, high-profile incidents of stolen private data, cloud technology, and other factors have added substance and complexity to the questions you need to ask about information governance. There is much to do. One question we will explore at the conference next week is: how do you measure the effectiveness of your information governance strategy, and what analytical measures are appropriate? For example, some companies are using analytical tools to look for patterns of email communication across the company to gain greater insight into how information is flowing and what needs more review. Look for more on analytics and governance after the conference.
You say you already have a plan in place to guard your company’s data? Are you sure it has you adequately protected? While you certainly understand the need for data security – your sales challenges are tough enough without exposing your customers’ credit card information to a security breach, for example – the chances are good that in 2010 you will consider various options for improving the security of your data. If you are going to protect your company’s most valuable asset — your data — you will begin to view data security as a component of a more comprehensive information governance strategy.
The risks of internal or external threats to your company’s data are becoming more complex as the depth and breadth of your information expands rapidly and your data is shared with business partners, suppliers, and customers. In addition, as companies begin to take advantage of cloud services for some of their workloads, additional complexity is added to the multitude of security concerns. Many companies have deployed a disjointed approach to securing, controlling, and managing their data, making it hard to anticipate and prepare for constantly changing security risks. There are lots of different ways that unauthorized users may enter your network or otherwise steal your data. Many companies typically have a distinct solution to combat each one individually, and no single one of these solutions can protect against all of them. For example, access control, data encryption, network traffic monitoring, vulnerability testing, and auditing may all be monitored with independent applications.
There is a good reason why many companies find they need to deploy lots of different solutions to effectively govern their information. Some of the most innovative solutions have come from emerging companies that have built a niche around a particular vertical market or some segment of the information security market. So you deploy the best solution for your biggest challenges and move on. However, as you begin to think more holistically about your needs for information governance, you will want to ensure that information security solutions are well integrated. This is one reason why emerging companies with an information security solution have become desirable acquisition candidates for larger software vendors.
Guardium, a privately held company based in Massachusetts, is one of the most recent examples of this trend. When IBM announced its acquisition of the company in the last week of November, Guardium moved from a fast-growing startup to one of the pillars of the IBM information governance strategy. The company’s technology helps clients with some of the most challenging issues around unauthorized access to critical data. Its solutions provide secure access to enterprise data across many different database environments, such as IBM, Oracle, Microsoft, Teradata, and others. In addition, customers can reduce operational costs by automating regulatory compliance tasks. While many companies may have the ability to monitor one database at a time, Guardium brings added value by enabling companies with complex environments to monitor databases across their organization.
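As a rough illustration of the kind of cross-database monitoring described above (and not Guardium’s actual interfaces), the sketch below normalizes access events from several hypothetical databases into one stream and flags those that violate a single, centrally defined policy; the event format, table names, and roles are all invented for the example.

```python
# A minimal sketch of cross-database activity monitoring: access events from
# several database platforms are normalized into one stream and checked against
# a single policy. The event format, table names, and policy rules are
# hypothetical and are not any vendor's actual interfaces.

SENSITIVE_TABLES = {"customers", "payment_cards"}
AUTHORIZED_ROLES = {"dba", "billing_service"}


def flag_violations(events):
    """Return events where a sensitive table was touched by an unauthorized role."""
    return [e for e in events
            if e["table"] in SENSITIVE_TABLES and e["role"] not in AUTHORIZED_ROLES]


if __name__ == "__main__":
    events = [
        {"db": "oracle-prod", "role": "dba", "table": "customers", "action": "SELECT"},
        {"db": "db2-finance", "role": "intern", "table": "payment_cards", "action": "SELECT"},
        {"db": "sqlserver-crm", "role": "app_user", "table": "leads", "action": "UPDATE"},
    ]
    for violation in flag_violations(events):
        print("ALERT:", violation)
```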
This acquisition aligns well with IBM’s strategy to provide customers with a well-integrated and comprehensive approach to information management. IBM has spent in the range of $12 billion over the past five years to add software assets that will help companies make more intelligent decisions and realize more business value from their information.