Archive for category IBM
Recently I talked with several data center managers about their experiences with virtualization. While these managers have different perspectives, they all agreed that server virtualization alone isn’t enough: an element is missing from a server-only virtualization strategy that makes it hard to meet the increasing demands of the business. By moving beyond server virtualization to a more holistic approach that includes virtualizing storage, network, and other technology assets, these companies are increasing the ROI of their virtualization implementations. They are beginning to reset their virtualization priorities to make sure that all the elements of the IT environment work together – and this includes creating a virtualization environment that automatically allocates resources based on the demands of specific applications and workloads.
IT management needs to focus on application priorities in terms of performance to support the business. If all your applications are treated with the same priority, how can you be sure that your most critical applications always have access to the resources they require? You may be doing a great job monitoring CPU usage and available memory in your server virtualization environment, but still have unexpected performance problems that impact critical customer applications. What’s missing is a way to adjust for business priority variations when you allocate resources across your virtualized environment.
One way to ensure that your environment operates with an increased awareness of the requirements of each specific application is to implement application infrastructure virtualization. It is a capability that allows for codifying business and technical usage policies to make sure that each of your applications leverages virtual and physical resources in a predictable way. By automating many of the underlying rules and processes, you can optimize both resource utilization and the customer experience at the same time.
There are three main characteristics for application infrastructure virtualization:
- Setting business priorities for applications and automatically adjusting resources to keep customer service levels in balance
- Applying a customer focused approach to the automation of resource optimization so that each application gets the resources it needs based on resource availability and the application’s priority to the business
- Allocating a pooled set of resources to support a group of workloads
Application infrastructure virtualization ensures that any resource can run any workload. If there are resource limitations, then the application with the lowest business priority at the time is allocated the fewest resources.
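The allocation behavior described above can be sketched as a toy scheduler. Everything here is illustrative: the application names, the numeric priorities, and the simple proportional policy are my own assumptions, not the actual mechanism used by WebSphere Virtual Enterprise.

```python
# Toy sketch of priority-based allocation across a shared resource pool.
# App names, priority weights, and the proportional policy are assumed
# for illustration only.

def allocate(pool_units, apps):
    """Split a pooled set of resource units among applications in
    proportion to business priority; when the pool runs short, the
    lowest-priority applications receive the fewest units."""
    total_priority = sum(priority for _, priority in apps)
    allocation = {}
    remaining = pool_units
    # Serve the highest-priority applications first, so any shortfall
    # from rounding or contention lands on the lowest-priority apps.
    for name, priority in sorted(apps, key=lambda a: -a[1]):
        share = min(remaining, round(pool_units * priority / total_priority))
        allocation[name] = share
        remaining -= share
    return allocation

apps = [("order-entry", 5), ("reporting", 2), ("batch-archive", 1)]
print(allocate(16, apps))
# → {'order-entry': 10, 'reporting': 4, 'batch-archive': 2}
```

In a real implementation these priorities would be driven by codified service-level policies rather than static weights, and the scheduler would re-run the allocation as demand changes.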
I amplified this issue in a white paper I recently wrote for IBM on the topic. The paper, called Creating Application Awareness in IT Virtualization Environments, discusses application infrastructure virtualization and how companies can combine server and application infrastructure virtualization to improve overall performance levels. In addition, the paper describes IBM’s solution for application infrastructure virtualization, IBM WebSphere Virtual Enterprise (WVE).
It is easy to assume that server virtualization by itself is enough to solve resource management issues in the data center. However, it is increasingly clear that virtualization has to be tied to the performance of various applications based on the times and situations where they demand performance. Tying application performance to virtualization creates a more efficient and proactive environment to satisfy customer expectations.
It isn’t easy being a small fish in a big pond. How does a small mortgage company leverage its ability to be nimble in a competitive market? I believe that SaaS offerings are initiating a revolution with broad implications for business models and competitiveness. It is becoming increasingly clear that with the advent of sophisticated software as a service (SaaS) environments combined with process as a service, it is possible and even commonplace for a small company to be competitive. I had the opportunity to spend some time at IBM’s LotusSphere with a small mortgage company called Signature Mortgage based in Ohio. By leveraging a combination of IBM’s LotusLive platform combined with Silanis Technology’s electronic signature platform, Signature Mortgage has been able to differentiate itself in an extremely complex market dominated by Fortune 500 companies.
Mortgage brokers like Signature Mortgage are involved primarily in the origination process of the loan, bringing mortgage borrowers together with mortgage lenders who actually fund the money for the loan. Mortgage borrowers engage with mortgage brokers early in the mortgage process, when a consumer looking for a mortgage often has a lot of choices. It can be hard at this early stage for a consumer to differentiate between mortgage brokers, and if one mortgage broker takes several days or longer to deliver the mortgage application documents for signing, the consumer might switch to another broker. Typically a consumer selects a mortgage broker based on the offered rate, term, and closing costs and then locks down the rate to protect against a rate change. The broker will facilitate a complex series of steps that must take place in order to move from this initial rate lock down to the approval and closing of the mortgage. The consumer must submit financial and personal documentation so the mortgage broker can assess the credit worthiness of the individual and appraise the property.
The process between mortgage origination and closure can take as long as 45-60 days. To make money, the mortgage broker needs to be able to collect all the required information quickly and then be in a position to close the deal with the mortgage lender before rates change. There is a lot at stake for both the mortgage broker and applicant during this time period. For example, missing a deadline for a signature can lead to cancellation of locked-in rates on a mortgage commitment, potentially leading to higher costs for a buyer or lost revenue for a mortgage broker.
Bob Catlin, President of Signature Mortgage, explained to me that he was determined to use his company’s small size as an advantage by quickly implementing innovative technology that might take much longer to adopt in a large company with legacy policies and infrastructure. By streamlining and speeding up the mortgage origination process, he could differentiate from the larger banks, increase profits, and have happier, more satisfied customers.
Signature Mortgage has its customers log in to a portal designed to capture best practices for submitting application documentation, revising documents, receiving status reports, and securing electronic signatures when required. All of the steps in the mortgage process are documented within the Silanis Technology portal, which is based on IBM’s LotusLive collaboration platform. The Silanis electronic signature solution is delivered as a cloud-based service, making it attractive to a small business with a limited IT staff. By implementing Silanis Technology’s solution, Catlin has been able to shave days off the loan origination process because there are no more last minute surprises resulting from missing data or documents and delays in arranging in-person meetings.
For Signature Mortgage, the first step in tightening the timeline for the mortgage process was to consistently lock in rates in less than 15 days. Catlin is now working on decreasing application processing time to 24 hours and shortening the total time between mortgage application and closing down to 10-15 days. Any mortgage borrower who has dealt with requests for last minute faxes for missing documentation, or errors that are discovered as the rate-lock deadline approaches, will understand that decreasing the time to close from the industry average of 30-45 days is a big deal. However, speed isn’t the entire benefit; managing all information related to the mortgage applications in one centralized portal improves accuracy and accountability. Using the capabilities of LotusLive and Silanis e-signature, Signature Mortgage has been able to include features that make the online mortgage process intuitive and consistent with the paper-based manual processes that are familiar to many people. For example, customers click to sign where they see a virtual sticky note, similar to the process used at in-person signings. The consistency and repeatability of the process helps the company to maintain compliance with legal and regulatory requirements.
With this predictable foundation, Signature Mortgage has been able to grow quickly, increase profitability and build a strong presence in the community.
I am looking forward to attending The Smart Governance Forum (23rd meeting of the IBM Data Governance Council) in California on February 1-3, where I will be a panelist for a session on Smart Governance Analytics. As my panel group started to plan for the event, I did some background research on the Council to understand more about them. What kinds of questions were Council members asking about information governance when they began meeting in 2004 and how are things different today? Have they developed best practices that would be useful to other companies working to develop an information governance strategy?
Information governance refers to the methods, policies, and technology that your business deploys to ensure the quality, completeness, and safety of its information. Your approach to information governance must align with the policies, standards, regulations, and laws that you are legally required to follow. When a group of senior executives responsible for information security, risk, and compliance at IBM customer organizations began meeting in 2004, interest in IT governance was high, but there wasn’t as much attention focused specifically on information governance.
Books like “IT Governance: How Top Performers Manage IT Decision Rights for Superior Results” by Peter Weill and Jeanne W Ross helped companies understand the benefit of aligning IT goals with the overall goals and objectives of the business. In addition, there were other publications at this time focused on how to take a balanced scorecard approach to managing business strategy and on best practices for implementing IT governance. These approaches are of critical importance to business success, however there was also a need to develop a framework for understanding, monitoring, and securing the rapidly increasing supply of business data and content.
And that is what a group of IT information focused business leaders and IBM and business partner technology leaders decided to do. The amount of data they needed to collect, aggregate, process, analyze, share, change, store, and retire was growing larger every day. In addition to data stored in traditional databases and packaged applications like CRM (customer relationship management) systems, they were also concerned about information stored and shared in unstructured formats like documents, spreadsheets, and email.
Having more information about your company’s customers, partners, and products creates great opportunity, but more information also means more risk if you don’t manage your information with care. Council members asked each other lots of questions such as:
- How can we be sure that the right people get access to the right information at the right time?
- How can we make sure that the wrong people do not get access to our private information at any time?
- How can we overcome the risks to data quality, consistency, and security increased by the siloed approach to business data ownership that is so prevalent in our organizations?
- How can we create a benchmarking tool for information governance that will help our businesses to increase revenue, lower costs, and reduce risks?
- How can we improve our ability to meet the security and protection standards of auditors and regulators?
As a result of its discussions, the Council developed a Maturity Model to help you assess your current state of information governance and provide guidance for developing a roadmap for the future. The Model identifies 11 categories of information governance. The categories cover all the different elements of building an information security strategy, such as understanding who in the business/IT is responsible for what information, what policies you follow to control the flow of information in your company, what your methodologies are for identifying and mitigating risk, and how you measure the value of your data and the effectiveness of governance. I read two IBM white papers on the Model that add insight into the questions you need to ask to begin building a path to better information governance, “The IBM Data Governance Council Maturity Model: Building a roadmap for effective data governance” and “The IBM data governance blueprint: Leveraging best practices and proven technologies“.
So, what’s changed? Financial crises, increasing regulation, high-profile incidents of stolen private data, cloud technology, and other factors have added substance and complexity to the questions you need to ask about information governance. There is much to do. One question we will explore at the conference next week is: how do you measure the effectiveness of your information governance strategy, and what analytical measures are appropriate? For example, some companies are using analytical tools to look for patterns of email communication across the company to gain greater insight into how information is flowing and what needs more review. Look for more on analytics and governance after the conference.
You say you already have a plan in place to guard your company’s data? Are you sure it has you adequately protected? While you certainly understand the need for data security – your sales challenges are tough enough without exposing your customers’ credit card information to a security breach, for example – the chances are good that in 2010 you will consider various options for improving the security of your data. If you are going to protect your company’s most valuable asset — your data — you will begin to view data security as a component of a more comprehensive information governance strategy.
The risks of internal or external threats to your company’s data are becoming more complex as the depth and breadth of your information expands rapidly and your data is shared with business partners, suppliers, and customers. In addition, as companies begin to take advantage of cloud services for some of their workloads, further complexity is added to the multitude of security concerns. Many companies have deployed a disjointed approach to securing, controlling, and managing their data, making it hard to anticipate and prepare for constantly changing security risks. There are many different ways that unauthorized users may enter your network or otherwise steal your data, and companies typically deploy a distinct solution to combat each threat individually, even though no single one of these solutions can protect against all of them. For example, access control, data encryption, network traffic monitoring, vulnerability testing, and auditing may each be handled by an independent application.
There is a good reason why many companies find they need to deploy lots of different solutions to effectively govern their information. Some of the most innovative solutions have come from emerging companies that have built a niche around a particular vertical market or some segment of the information security market. So you deploy the best solution for your biggest challenges and move on. However, as you begin to think more holistically about your needs for information governance, you will want to ensure that information security solutions are well integrated. This is one reason why emerging companies with an information security solution have become desirable acquisition candidates for larger software vendors.
Guardium, a privately-held company based in Massachusetts, is one of the most recent examples of this trend. When IBM announced its acquisition of the company in the last week of November, Guardium moved from a fast growing startup to one of the pillars of the IBM information governance strategy. The company’s technology helps clients with some of the most challenging issues around unauthorized access to critical data. Its solutions provide secure access to enterprise data across many different database environments, such as IBM, Oracle, Microsoft, Teradata, and others. In addition, customers can reduce operational costs by automating regulatory compliance tasks. While many companies may have the ability to monitor one database at a time, Guardium brings added value by enabling companies with complex environments to monitor databases across their organization.
This acquisition aligns well with IBM’s strategy to provide customers with a well-integrated and comprehensive approach to information management. IBM has spent in the range of $12 Billion over the past five years to add software assets that will help companies to make more intelligent decisions and realize more business value from their information.
After just returning from IBM’s Information on Demand (IOD) Conference in Las Vegas, I would like to take this opportunity to virtually whisper just one word in the ear of a current day Benjamin Braddock, “analytics”. Many businesses have spent the past 25 years or so automating and streamlining business processes in order to drive improvements in efficiency and productivity. But now, it is becoming apparent that these businesses expect their future success will increasingly depend on how skillfully they manage, govern, and analyze information. Businesses are applying analytical techniques to business information to help reduce risk and increase the certainty that they are making the right decisions.
IBM has, in fact, spent $12 Billion in software investments (both organic development and multiple acquisitions like SPSS, Cognos, FileNet, iPhrase, and Ascential Software, just to name a few) over the past 4-5 years to ensure it will be able to support its customers in their quest to unlock the business value of information. In addition, in April of 2009 IBM announced a new organization comprised of 4,000 consultants focused on advanced business analytics and business optimization – teams with skills in applying business intelligence technologies like mathematical modeling, simulation, data analytics, and optimization techniques.
In an era of intense competition, tight credit, and cost concerns across global and vertical markets, this focus on getting the most value from the information you have makes a lot of sense. Companies find they are processing more information than ever before, but less of this information is being accurately and adequately used. The quantity of available data that a business needs to manage and understand has skyrocketed along with the increase in instrumented and intelligent products. For example, RFID tags embedded in manufactured products, plants, and animals generate an enormous amount of data in efforts to control inventories and improve security and safety. Trying to make decisions with inadequate, inaccurate, or untimely information is like driving a fast sports car down the highway with a very large blind spot impeding your view of the truck approaching on your side. You need to know about the obstacles that might appear in your pathway before you try to make a “real-time” correction and steer your car (or your business) off a cliff. So, students and business leaders alike, please take note: I see some “analytics” in your future.