mkaufman

Homepage: http://marcia.kaufman@hurwitz.com

What is Service Virtualization?

There has been a lot of discussion lately about “service virtualization,” but the term alone can make your head spin. Are we talking about server virtualization? What types of services are involved? What does virtualization have to do with testing? I’d like to quickly clear up any confusion you may have. Service virtualization simulates the behavior of components in an application so you can perform accurate and timely tests in a world of complex, interrelated applications. Production services that may not be available for integration testing can be virtualized so that testing can take place at the appropriate point in the software development process.

While quality professionals have always needed to test combinations of code, current methods for writing and combining code have changed so much that traditional approaches to testing can’t get the job done at the right price and the right time. There is a fast-growing commercial market for production services that are incorporated as self-contained modules into software applications. Third-party services such as PayPal or a credit-checking service are increasingly used in customer-facing applications.

Use of these third-party services increases the efficiency of software development, but it also makes your application dependent on services that you do not control. Consider, for example, an online retailer with multiple suppliers. The retailer has created a new mobile application for customers that uses a credit check service provided by a third-party vendor. The team can’t test without this dependent component, but it is not available for testing. Without service virtualization, the software development team has some difficult choices to make, and none of the options are good. If the team proceeds without doing the necessary testing, it may introduce errors that are much harder and more costly to fix later on. If the team waits until the third-party service is available, developer productivity will decline and the team may miss production deadlines. And even once the third-party service is available, it can get expensive to test application performance at high usage levels, since the service costs money each time it is executed.

So what does the development team do in this situation? Service virtualization is a new approach to testing that helps organizations eliminate some of the testing bottlenecks that make it hard to bring new high quality applications to market quickly. Here are five key things you should know about service virtualization.

  1. To get started with service virtualization you need to understand your testing methodology and think about where service virtualization can increase team velocity while also helping your team to deliver higher quality software.
  2. Use a cost/benefit analysis to select which services should be virtualized.  Consider the cost to your company when testing is delayed because dependent services or software are not available for testing. How much is spent on staff needed to set up and maintain test environments? How much do you spend to maintain test environments that are not fully utilized? What is the cost for software licenses in the physical test lab environment? What is the cost of third-party service access fees?
  3. Service virtualization can help you find errors in all testing phases, including unit testing, performance testing, integration testing, system testing, system integration testing, user acceptance testing, and operability testing.
  4. Recording a service that already exists is a great way to define the behavior of your virtual component. You can use the recording process to identify the behaviors that will need to be simulated so you can create test cases quickly.
  5. You can’t expect to virtualize all your components. Therefore, you need to be able to easily move back and forth between real components and virtual components while you are testing. You want to maintain consistency across real and virtual components.
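To make the idea concrete, here is a minimal sketch, in Python, of what a virtualized credit check service might look like: a stand-in that returns canned responses and simulates latency so the retailer’s mobile application described above can be tested before the real third-party service is available. The endpoint, request fields, response format, and latency figure are illustrative assumptions rather than the behavior of any particular vendor’s service.

    # A minimal virtual (stubbed) credit check service for testing.
    # The route, request fields, and canned responses are illustrative
    # assumptions; a real third-party credit service would differ.
    import json
    import time
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Canned behaviors, keyed by applicant ID, captured from observing the real service.
    CANNED_RESPONSES = {
        "12345": {"applicant_id": "12345", "score": 720, "decision": "approved"},
        "67890": {"applicant_id": "67890", "score": 580, "decision": "declined"},
    }

    class VirtualCreditCheck(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            request = json.loads(self.rfile.read(length) or b"{}")
            applicant = str(request.get("applicant_id", ""))

            # Simulate the latency of the real service so performance tests stay realistic.
            time.sleep(0.2)

            body = CANNED_RESPONSES.get(
                applicant,
                {"applicant_id": applicant, "score": None, "decision": "unknown_applicant"},
            )
            payload = json.dumps(body).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(payload)))
            self.end_headers()
            self.wfile.write(payload)

    if __name__ == "__main__":
        # The application under test points at this endpoint instead of the real vendor.
        HTTPServer(("localhost", 8080), VirtualCreditCheck).serve_forever()

During testing, the team can swap the real service endpoint for this one and back again, which is the consistency between real and virtual components that item 5 calls for.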

One of the biggest impacts of service virtualization for developers is the ability to validate integrations much earlier in the application life cycle. The software development team can move beyond unit testing and overcome many of the roadblocks that inhibit timely, efficient, and cost effective testing.


Looking for the Small in Big Data

While the buzz in the big data market is around massive amounts of data and what you can do with that data, in truth companies are more interested in the small stuff. Let me explain. One of the greatest benefits of getting hold of lots of data is that a company can see patterns and anomalies that would have gone undetected with a smaller set of data.

Companies are interested in big data not so much because of what that data represents in terms of volume, variety, and velocity, but because of what the analysis of that data can do for the business. As business decision makers begin to recognize the potential of big data, they realize that the biggest insights and hidden gems of knowledge are found when big data actually becomes quite small. However, the analysis must start with very large volumes of data. Exploring the world of big data takes business leaders beyond the data found in their organization’s traditional databases. Researchers, data scientists, and marketers are analyzing very large volumes of unstructured data from previously untapped sources such as e-mails, customer service records, sensor data, and security logs.

Now that companies have the ability to deal with so much more data, they are free to incorporate a greater variety of data into the mix. For example, they are incorporating social media, mobile phone location, traffic, and weather data into more traditional analysis. But what is the goal? Add more data and more elements so that those patterns emerge. Armed with the insight from analysis, business leaders can take a more precise set of data and compare it to data from a data warehouse or a system of record. As a result, the research becomes more targeted and directed to fit in with the context of the business.

Why do companies need to take this targeted approach to leveraging big data? In essence, companies want to use big data analysis to make a personalized offer that is just right for the customer when that customer is ready to buy. In some cases, the answer may be related not to selling but to diagnosing problems with a manufacturing system or a patient with an unexplained illness.

How do you move from big data analysis to small data insights and personalized action? You need to consider three elements: defining your business problem, defining and analyzing your data sources, and integrating and incorporating your big data analysis with your operational data.

Defining your business problem. Companies are beginning to ask the traditional questions about customers, products, and partners in new ways. For example, are you looking to manage your customer interactions armed with in-depth and customized knowledge about each individual customer? Companies with a focus on driving continuous improvement in customer service are asking, “How can I delight this customer and anticipate their specific needs?” These are important goals for businesses competing in today’s fast-paced, mobile-driven market.

Defining and analyzing your data sources. What information do you need to make the right offer to your buyer when he is deciding on a purchase? What information can you glean from outside sources such as social media data? What big data sources do you have available internally that were previously underutilized? For example, can you use text analytics to gain new insight about customers from call center notes, emails, and voice recordings? One important goal with big data analytics is to look for patterns and relationships that apply to your business and to narrow down the data set based on business context. Your big data analysis will help you find the small treasures of information in your big data.
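As a rough illustration of narrowing a large raw data set down to the small, business-relevant subset, here is a sketch in Python. The file name, column names, and filter conditions are hypothetical assumptions; the point is the pattern of scanning a large source and keeping only the records that fit the business context.

    # Illustrative sketch: reduce a large raw event file to the "small data"
    # relevant to one business question (mobile customers who abandoned a cart
    # in the last week). File name, columns, and filters are assumptions.
    import csv
    from datetime import datetime, timedelta

    CUTOFF = datetime.now() - timedelta(days=7)

    def small_data(path="clickstream_events.csv"):
        targeted = []
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                if row["channel"] != "mobile":
                    continue
                if row["event"] != "cart_abandoned":
                    continue
                if datetime.fromisoformat(row["timestamp"]) < CUTOFF:
                    continue
                # Keep only the fields the marketing team will act on.
                targeted.append({"customer_id": row["customer_id"],
                                 "cart_value": float(row["cart_value"])})
        return targeted

    if __name__ == "__main__":
        subset = small_data()
        print(f"{len(subset)} targeted records kept from a much larger raw file")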

Integrating and incorporating the analysis of your big data with your operational data.
After your big data analysis is complete, you need an approach that will allow you to integrate or incorporate the results of your big data analysis into your business process and real-time business actions. This will require some adjustment to the conventional notion of data integration. In order to bring your big data environments and enterprise data environments together, you will need to incorporate new methods of integration that support Hadoop and other nontraditional big data environments. In addition, if you want to incorporate the results of very fast streaming data into your business process, you will need advanced technology that enables you to make decisions in real-time.
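Continuing the sketch above, here is one hedged illustration of that integration step: joining the small, targeted result of the big data analysis with operational system-of-record data so the business process can act on it. The field names, loyalty rule, and offer text are invented for the example.

    # Sketch: fold the targeted big data result back into an operational decision.
    # Field names, the loyalty rule, and the offer wording are assumptions.
    def personalize_offers(targeted_records, crm_lookup):
        """targeted_records: output of the earlier filtering sketch;
        crm_lookup: dict mapping customer_id to the system-of-record entry."""
        offers = []
        for record in targeted_records:
            profile = crm_lookup.get(record["customer_id"])
            if profile is None:
                continue  # no operational record, so nothing actionable
            discount = 10 if profile["loyalty_tier"] == "gold" else 5
            offers.append({
                "customer_id": record["customer_id"],
                "channel": profile["preferred_channel"],
                "offer": f"{discount}% off the abandoned cart",
            })
        return offers

    if __name__ == "__main__":
        crm = {"C100": {"loyalty_tier": "gold", "preferred_channel": "email"}}
        carts = [{"customer_id": "C100", "cart_value": 84.0}]
        print(personalize_offers(carts, crm))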

Ultimately, if you want to make good decisions based on the results of your big data analysis you need to deliver information at the right time and with the right context. In order to make the results of your analysis actionable, you need to focus more on the small – targeted and personalized results of big data – than on the large data volumes.


Getting Your Virtualization Priorities Straight

Recently I talked with several data center managers about their experiences with virtualization. While these managers have different perspectives, they all agreed that server virtualization alone isn’t enough. By moving beyond server virtualization to a more holistic approach that includes virtualizing storage, network, and other technology assets, companies can increase the ROI of their virtualization implementations. Even so, an element is often still missing from the virtualization strategy, and it makes it hard to meet the increasing demands of the business. These companies are beginning to reset their virtualization priorities to make sure that all the elements of the IT environment work together – and this includes creating a virtualization environment that automatically allocates resources based on the demands of specific applications and workloads.

IT management needs to focus on application performance priorities that support the business. If all your applications are treated with the same priority, how can you be assured that your most critical applications always have access to the resources they require? You may be doing a great job monitoring CPU usage and available memory in your server virtualization environment, but still have unexpected performance problems that impact critical customer applications. What’s missing is a way to adjust for business priority variations when you allocate resources across your virtualized environment.

One way to ensure that your environment operates with an increased awareness of the requirements of each specific application is to implement application infrastructure virtualization. It is a capability that allows for codifying business and technical usage policies to make sure that each of your applications leverages virtual and physical resources in a predictable way. By automating many of the underlying rules and processes, you can optimize both resource utilization and the customer experience at the same time.

There are three main characteristics for application infrastructure virtualization:

  • Setting business priorities for applications and automatically adjusting resources to keep customer service levels in balance
  • Applying a customer focused approach to the automation of resource optimization so that each application gets the resources it needs based on resource availability and the application’s priority to the business
  • Allocating a pooled set of resources to support a group of workloads.

Application infrastructure virtualization ensures that any resource can run any workload. If there are resource limitations, then the application with the lowest business priority at the time is allocated the fewest resources.
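Here is a deliberately simplified sketch of that allocation principle. It assumes each application has a numeric business priority and a resource demand, and that a fixed pool is handed out in priority order; products such as IBM WebSphere Virtual Enterprise apply far richer policies, so treat this only as an illustration of the idea that the lowest-priority application absorbs any shortfall.

    # Toy model of priority-aware allocation across a shared resource pool.
    # Priorities, demands, and the pool size are hypothetical numbers.
    def allocate(pool_capacity, applications):
        """Grant each application its demand in priority order (1 = most critical);
        whatever remains after higher priorities are served goes to the rest."""
        allocations = {}
        remaining = pool_capacity
        for app in sorted(applications, key=lambda a: a["priority"]):
            granted = min(app["demand"], remaining)
            allocations[app["name"]] = granted
            remaining -= granted
        return allocations

    if __name__ == "__main__":
        apps = [
            {"name": "customer-checkout", "priority": 1, "demand": 40},
            {"name": "internal-reporting", "priority": 3, "demand": 30},
            {"name": "partner-portal", "priority": 2, "demand": 50},
        ]
        # With only 100 units available, the lowest-priority workload is squeezed first:
        # {'customer-checkout': 40, 'partner-portal': 50, 'internal-reporting': 10}
        print(allocate(100, apps))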

I amplified this issue in a white paper I recently wrote for IBM on the topic. The paper, Creating Application Awareness in IT Virtualization Environments, discusses application infrastructure virtualization and how companies can combine server and application infrastructure virtualization to improve overall performance levels. In addition, the paper describes IBM’s solution for application infrastructure virtualization, IBM WebSphere Virtual Enterprise (WVE).

It is easy to assume that server virtualization itself is enough to solve resource management issues in the data center. However, it is increasingly clear that virtualization has to be tied to the performance of various applications based on the times and situations where they demand performance. Tying application performance to virtualization creates a more efficient and proactive environment to satisfy customer expectations.


IBM’s Vision for Analytics in the Midmarket: gaining deeper business insight

I recently attended an IBM analyst meeting focused on solutions for the midmarket. What caught my attention was the focus on analytics as an important and growing revenue opportunity for IBM. In fact, IBM mentioned during the meeting that 70 percent of midsized firms are looking for analytics solutions. It is clear from this meeting that IBM wants to bring a comprehensive set of analytical tools to midsize companies. Unlike some of IBM’s other packaging, these analytics tools are being packaged specifically for the midmarket so that they are more consumable and affordable.

Analytics is fast becoming a high priority for companies as a result of the explosion in the variety, velocity, and volume of data with a potential impact on business decision-making. Much of this data is unstructured, such as the text included in customer service records, customer sentiment data in social media, or streams of data from instrumented devices. Making good business decisions often requires analysis across multiple sources and types of data. Companies often have independent systems designed to manage business processes ranging from order/inventory to point of sale, marketing research, and customer relationship management. The challenge for many of these companies is that answering the most urgent questions about the business requires analysis across all of these independent systems. Even a small company with a few hundred employees may have a dozen systems that are disconnected and keep the company from having a full picture of the business.

Therefore, it is not surprising that some midsize companies are finding they can benefit from business analytics solutions. Yet, while some midsize companies are finding ways to get the answers they want using analytics, the word needs to spread to other companies still struggling with manual spreadsheet analysis that doesn’t go deep enough.

IBM is going to market through its business partners that typically support midsize companies with a variety of solutions. These business partners are being asked by their clients in the midmarket to help them implement technology solutions that will enable them to make smarter business decisions. They want to find new ways to deepen their understanding of customer expectations and priorities. For example, a midsize retailer might be trying to figure out why certain products are returned while others sell well. The analytics market offers a huge opportunity for IBM and its partners.

The approach IBM is taking with analytics for the midmarket is to offer its partners a pre-configuration of hardware and software into a single system at a price point that is affordable for midsized companies but still has enough margin to make it attractive to the partner channel.

However, the challenge for partners is to change the traditional way they have gone to market. Many partners that have built successful businesses by specializing in hardware sales or a specific category of software such as IBM Rational find that they need to meet a broader set of client requirements. They now need to both learn the new analytics products and be ready to sell and implement solutions differently. Selling analytics to the midmarket requires much more than a technical sell. Partners need to have a thorough understanding of the business context in which the analytics will be used to help customers visualize the potential business value.

One of IBM’s offerings that partners should be looking at is the IBM Smart Analytics System 5710, a database appliance for business intelligence and data analytics targeted at the SMB market. The IBM Smart Analytics System 5710 is based on IBM System x, runs Linux, and includes InfoSphere Warehouse Departmental Edition and Cognos 10 Business Intelligence Reporting and Query. The system is designed to enable partners to get their clients up and running very quickly with a broad set of analytics and business intelligence capabilities. I expect that you will see a lot more of this type of packaging from IBM, in collaboration with its solution business partners.


Is Service Management the Missing Link on the Path from Virtualization to Cloud?

Many business executives are interested in moving to the cloud because of the potential impact on business strategy. Increasingly they are convinced that a cloud model – particularly the private cloud – will give them greater flexibility to change and to manage the uncontrolled expansion of IT. In contrast, from an IT perspective, the ability to virtualize servers, storage, and I/O is often viewed as the culmination of the cloud journey. Of course, the world is always more complicated than it appears. Cloud computing, if implemented in a strategic manner, can help a company experiment and change more easily. Likewise, virtualization, which may seem like an isolated and pragmatic approach, needs to be considered in the context of an overall cloud computing strategy.

However, the challenge for many companies right now is how to transform their virtualized infrastructure into a private cloud that delivers on the promise of on-demand, self-service provisioning of IT resources. Making sure that business leaders gain the desired cost savings and business flexibility, while IT gains the optimization that virtualization can deliver, requires an integrated strategy. At the heart of this strategy is service management of this emerging, highly virtualized environment.

Why is service management important? There are good reasons why a key focus of the virtualization strategy at many companies has been server virtualization. For example, server virtualization helps companies create a faster and more efficient IT provisioning process for users. It also gives users increased operational flexibility based on the mobility and isolation capabilities of virtual machines.

However, there is a major management problem with the typical virtualization approach in many companies. Developers often satisfy their demands for computing resources by simply spinning up a new virtual machine, because they may not have the time or money to purchase new IT systems. In the beginning, IT management allows this practice because it is easier than trying to control impatient developers. However, there is a price to be paid. Each new virtual machine image requires memory and disk resources. When the number of virtual machines grows out of control, companies end up spending more time and money on disk, storage, and memory resources than was anticipated. Lack of control means lack of management. What is the answer?

At a recent IBM systems software meeting, Helene Armitage, General Manager of Systems Software, emphasized that truly transforming the data center into an agent of change requires a focus on manageability. Here are the main takeaways from my conversations with Helene and her team.

  • Virtualization is not just about server consolidation. Companies need to think differently about virtualization in order to prepare for a future marked by rapid change. For example, storage virtualization can offer big benefits to companies facing explosive demand for data storage from increased use of new technologies like embedded and mobile devices.
  • Physical and virtual environments will need to be managed in a unified way and at the same time. Some may think of virtualization as a way of reducing management requirements. After all, with consolidated resources you should have less demand for power, energy, and space. However, management needs to be done right to be effective. It is important to consider how to manage across the platform and to include monitoring and configuration in combination with business service management – delivering IT services in a structured, governed, and optimized way.
  • Cloud changes the complexion of what’s important in how virtualization is approached.  In the cloud, virtualization takes on a different level of complexity.  If elements of virtualization including storage, server, and I/O virtualization are handled as isolated tasks, the cloud platform will not be effectively managed.
  • Systems pooling is key for managing lots of systems. It essentially means that IT treats all resources as a unified set of shared resources to improve performance and manageability.
  • Enforcing standardization right from the beginning is a requirement — especially when virtualization is at the core of a cloud environment.  The only way to truly manage the virtualized environment is to apply repeatable, predictable, and standardized best practices across the entire computing environment.

The greatest challenge for companies is to think differently about virtualization.  Management will need to realize that virtualization is a foundational element in building a private cloud environment and therefore has to be looked at from a holistic perspective.


Can a small business act like a giant with SaaS?

It isn’t easy being a small fish in a big pond.  How does a small mortgage company leverage its ability to be nimble in a competitive market? I believe that SaaS offerings are initiating a revolution with broad implications for business models and competitiveness. It is becoming increasingly clear that with the advent of sophisticated software as a service (SaaS) environments combined with process as a service, it is possible and even commonplace for a small company to be competitive.  I had the opportunity to spend some time at IBM’s LotusSphere with a small mortgage company called Signature Mortgage based in Ohio.  By leveraging a combination of IBM’s LotusLive platform combined with Silanis Technology’s electronic signature platform, Signature Mortgage has been able to differentiate itself in an extremely complex market dominated by Fortune 500 companies.

Mortgage brokers like Signature Mortgage are involved primarily in the origination process of the loan, bringing mortgage borrowers together with the mortgage lenders who actually fund the loan. Borrowers engage with mortgage brokers early in the mortgage process, when a consumer looking for a mortgage often has a lot of choices. It can be hard at this early stage for a consumer to differentiate between mortgage brokers, and if one broker takes several days or longer to deliver the mortgage application documents for signing, the consumer might switch to another broker. Typically a consumer selects a mortgage broker based on the offered rate, term, and closing costs and then locks down the rate to protect against a rate change. The broker will facilitate a complex series of steps that must take place in order to move from this initial rate lock down to the approval and closing of the mortgage. The consumer must submit financial and personal documentation so the mortgage broker can assess the creditworthiness of the individual and appraise the property.

The process between mortgage origination and closure can take as long as 45-60 days.  To make money, the mortgage broker needs to be able to collect all the required information quickly and then be in a position to close the deal with the mortgage lender before rates change.  There is a lot at stake for both the mortgage broker and applicant during this time period. For example, missing a deadline for a signature can lead to cancellation of locked-in rates on a mortgage commitment, potentially leading to higher costs for a buyer or lost revenue for a mortgage broker.

Bob Catlin, President of Signature Mortgage, explained to me that he was determined to use his company’s small size as an advantage by quickly implementing innovative technology that might take much longer to adopt in a large company with legacy policies and infrastructure. By streamlining and speeding up the mortgage origination process, he could differentiate from the larger banks, increase profits, and have happier, more satisfied customers.

Signature Mortgage has its customers log in to a portal designed to capture best practices for submitting application documentation, revising documents, receiving status reports, and securing electronic signatures when required. All of the steps in the mortgage process are documented within the Silanis Technology portal, which is based on IBM’s LotusLive collaboration platform. The Silanis electronic signature solution is delivered as a cloud-based service, making it attractive to a small business with a limited IT staff. By implementing Silanis Technology’s solution, Catlin has been able to shave days off the loan origination process because there are no more last-minute surprises resulting from missing data or documents and delays in arranging in-person meetings.

For Signature Mortgage, the first step in tightening the timeline for the mortgage process was to consistently lock in rates in less than 15 days. Catlin is now working on decreasing application processing time to 24 hours and shortening the total time between mortgage application and closing to 10-15 days. Any mortgage borrower who has dealt with requests for last-minute faxes for missing documentation, or with errors discovered as the rate-lock deadline approaches, will understand that decreasing the time to close from the industry average of 30-45 days is a big deal. However, speed isn’t the entire benefit; managing all information related to the mortgage applications in one centralized portal improves accuracy and accountability. Using the capabilities of LotusLive and Silanis e-signature, Signature Mortgage has been able to include features that make the online mortgage process intuitive and consistent with the paper-based manual processes that are familiar to many people. For example, customers click to sign where they see a virtual sticky note, similar to the process used at in-person signings. The consistency and repeatability of the process helps the company maintain compliance with legal and regulatory requirements.

With this predictable foundation, Signature Mortgage has been able to grow quickly, increase profitability and build a strong presence in the community.


Cashing in on the Cloud

I am welcoming my business partner, Judith Hurwitz, as a contributor on my blog. The following are her observations about the partner ecosystem in the cloud.

Judith Hurwitz

I have been spending quite a bit of time these days at cloud computing events. Some of these events, like the Cloud Camps, are wonderful opportunities for customers, vendors, consultants, and interested parties to exchange ideas in a very interactive format. If you haven’t been to one, I strongly recommend them. Dave Nielsen, who is one of the founders of the Cloud Camp concept, has done a great job not just jump-starting these events but participating in most of them around the world. In addition, Marcia Kaufman and I have been conducting a number of half- and full-day Introduction to Cloud Computing seminars in different cities. The most interesting observation from my view is that customers are no longer sitting on the sidelines with their arms crossed. Customers are ready and eager to jump into this new computing paradigm. Often they are urged on by business leaders who instinctively see the value in turning computing into a scalable utility. So, for the first time, there is a clear sense that there may well be money to be made.


Are you bypassing CIO policies to access cloud services?

I recently spoke with the CIO of a large and highly regulated organization about his company’s experiences with cloud computing. Security and compliance issues are top priorities for this CIO, causing the company’s leadership to move with caution into the cloud. He expects that all cloud implementations throughout the enterprise, from Software as a Service (SaaS) to Infrastructure as a Service (IaaS) and Platform as a Service (PaaS), will receive prior approval from his office. This CIO is implementing the same approach to security and compliance that he has taken with every project undertaken within the company. In other words, security must be implemented following a centralized approach in order to ensure that information governance policies are upheld. The company’s cloud experiences so far have included the on-demand purchase of extra compute power and storage for development and test on two small projects, as well as the use of Salesforce.com in several business unit sales teams. Overall, he feels confident about the level of control he has when it comes to managing cloud security issues and understanding the potential impact of the evolving cost and economic models of cloud computing.

However, is this CIO really as in control of the situation as he thinks? If his experience is in line with what I have heard from CIOs at similar enterprises, then he may well be blindsided. For example, many businesses find that while their centralized governance processes are effective at improving security, there may also be some unintended consequences. While the CIO directs his team to implement policies to monitor the flow of information between internal users, customers, and partners, there may be some people in the company who are undermining his efforts. Tighter control at the corporate level may lead to longer approval processes for IT resources. And departments that need to complete a project quickly have never been very patient. As a result, developers and business unit analysts are leveraging cloud delivery models for quick and cost-effective access to computing resources, even if it means bypassing the governance policies the CIO has instituted. Right now, the usage of cloud computing is small and is not impacting security or the expense structure in any significant way. However, I expect that as his company becomes more involved in cloud computing, this CIO will need to pay more attention to controlling the costs of cloud services and the management of cloud security.

Controlling costs. Cloud computing is fundamentally about the economics of delivering IT resources in a cost efficient, elastic, and secure manner.  But, the price per CPU for compute power or the price to bring the first five users onto a SaaS application is only one element of the overall economic equation.  It can be so inexpensive to access public cloud resources to meet short-term requirements that it is easy for users to enter a corporate credit card number and move ahead with the project. But, over time small projects can grow larger or take longer to complete than expected. For example, a software development team has a tight deadline to evaluate the performance of a new application prior to an upcoming sales promotion.  One of the developers uses a corporate credit card to get the extra compute power needed for this short-term test and spends a lot less money and gets faster results than by requesting additional resources from his company’s data center. Job completed. Deadline met. Cost low. However, what happens when the application requires additional testing under various scenarios and goes into production? The initial payment to Amazon may have gone unnoticed, but when the development team’s use of cloud resources expands significantly the CFO and the CEO suddenly start to ask a lot of questions.
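A back-of-the-envelope sketch shows how quietly those credit-card charges can grow. The hourly rate and usage figures below are made-up assumptions, not any provider’s actual pricing; the point is that the same workload, run across more scenarios for longer, turns a trivial first bill into something the CFO notices.

    # Back-of-the-envelope cloud cost growth. All rates and hours are
    # hypothetical assumptions, not real provider pricing.
    HOURLY_RATE = 0.50          # assumed cost per instance-hour, in dollars

    def cost(instances, hours_per_day, days):
        return instances * hours_per_day * days * HOURLY_RATE

    # The first short-term performance test: a handful of machines for a few evenings.
    initial_test = cost(instances=5, hours_per_day=4, days=3)

    # Months later: more scenarios, a bigger fleet, running most of the working month.
    expanded_use = cost(instances=40, hours_per_day=12, days=22)

    print(f"Initial test:   ${initial_test:,.2f}")   # $30.00 -- easy to miss
    print(f"Expanded usage: ${expanded_use:,.2f}")   # $5,280.00 -- gets noticed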

Security. CIOs identify security concerns as one of the top reasons why they are cautious about cloud computing. In addition to checking out the security policies of the cloud vendors under their control, CIOs worry that you may be accessing cloud-based services without their approval. One big area of concern is the increasing use of social networking applications accessed on mobile devices and used with little or no distinction between business and personal usage. For example, you may use LinkedIn to get help from a business contact to close a deal, and Twitter and Facebook to connect with friends and clients. For many people, there are few boundaries between business and personal conversations conducted in the cloud, and this has some CIOs worried about security and compliance issues.

The bottom line. Unfortunately, these issues and concerns are not going away any time soon. In fact, I expect that the level of oversight will only increase. The CIO will be called to task if various departments begin relying on cloud services for mission-critical projects without any oversight. This is only the tip of the iceberg. And I suspect this is going to be a big iceberg.



Five Steps to Effective SOA Governance

I will be presenting a session titled “Five Steps to Effective SOA Governance” in BrightTALK’s online SOA Governance Summit on Thursday, February 25. The summit includes five webcasts on different aspects of industry best practices for SOA Governance. This summit is sponsored in association with the ITGRC Forum. Please join me for the live session by clicking on this link or check out the webcast online at BrightTALK at a later date.

While I was at the IBM Pulse2010 Conference for Integrated Service Management this week, I spoke with several IBM executives about governance issues. I’ve had SOA Governance on my mind because I was preparing for this webcast and found many opportunities at Pulse2010 to discuss governance as it relates to SOA, cloud, security, and business outcomes.

While there are many layers of complexity around governance, here is one basic truth that we discussed.  There are two types of governance  that can impact your business. One is the governance you need to do to keep out of jail – or at least out of trouble with government and industry regulators – and the second is the governance you want to do to ensure that your company has the flexibility to grow and innovate.  This is an important distinction.  You can establish tightly regulated governance policies with a goal of  passing your required audits, but without the right level of governance around business process issues you can’t anticipate change and find opportunities in unexpected challenges.

If you are hoping to obtain better business outcomes (and who isn’t?), you will need to work hard at improving business and IT collaboration. Implementing a Service Oriented Architecture helps an organization align IT with business goals and succeed in rapidly changing business environments. In order to achieve these benefits of SOA, you will need to implement a governance model. SOA governance is critical for achieving business value from your SOA initiative by ensuring the reusability of business services.

What is SOA Governance? It helps define a methodology for creating, managing, and safeguarding your movement to SOA. It also supports the management of business rules in a standardized way across the business.

How do you create effective SOA Governance for your organization? It is important to fit your governance model to your SOA. You need to start small and grow.

Here are five key steps to effective SOA Governance.

  1. Approach executive management with a justification for SOA governance.
  2. Create a comprehensive plan, with executive support, to build the right business services.
  3. Establish a process for organizational change, since managing change is as important as creating SOA services.
  4. Balance risk with oversight to find the right level of SOA governance for your organization.
  5. Plan for the lifecycle of business services.

SOA governance is all about finding the right balance for your organization. You need to create the right set of business services at the right level of granularity to support the business. If your services are too narrow and technically defined, they may not have the right meaning for the business. In order to achieve business success with SOA, you need to implement an SOA governance model that ensures business service reuse and business value.


Asking the right questions about information governance

I am looking forward to attending The Smart Governance Forum (23rd meeting of the IBM Data Governance Council) in California on February 1-3, where I will be a panelist for a session on Smart Governance Analytics. As my panel group started to plan for the event, I did some background research on the Council to understand more about them. What kinds of questions were Council members asking about information governance when they began meeting in 2004 and how are things different today? Have they developed best practices that would be useful to other companies working to develop an information governance strategy?

Information governance refers to the methods, policies, and technology that your business deploys to ensure the quality, completeness, and safety of its information. Your approach to information governance must align with the policies, standards, regulations, and laws that you are legally required to follow. When a group of senior executives responsible for information security, risk, and compliance at IBM customer organizations began meeting in 2004, interest in IT governance was high, but there wasn’t as much attention focused specifically on information governance.

Books like “IT Governance: How Top Performers Manage IT Decision Rights for Superior Results” by Peter Weill and Jeanne W. Ross helped companies understand the benefit of aligning IT goals with the overall goals and objectives of the business. In addition, there were other publications at this time focused on how to take a balanced scorecard approach to managing business strategy and on best practices for implementing IT governance. These approaches are of critical importance to business success; however, there was also a need to develop a framework for understanding, monitoring, and securing the rapidly increasing supply of business data and content.

And that is what a group of IT information-focused business leaders, together with IBM and business partner technology leaders, decided to do. The amount of data they needed to collect, aggregate, process, analyze, share, change, store, and retire was growing larger every day. In addition to data stored in traditional databases and packaged applications like CRM (customer relationship management) systems, they were also concerned about information stored and shared in unstructured formats like documents, spreadsheets, and email.

Having more information about your company’s customers, partners, and products creates great opportunity, but more information also means more risk if you don’t manage your information with care. Council members asked each other lots of questions such as:

  • How can we be sure that the right people get access to the right information at the right time?
  • How can we make sure that the wrong people do not get access to our private information at any time?
  • How can we overcome the risks to data quality, consistency, and security increased by the siloed approach to business data ownership that is so prevalent in our organizations?
  • How can we create a benchmarking tool for information governance that will help our businesses to increase revenue, lower costs, and reduce risks?
  • How can we improve our ability to meet the security and protection standards of auditors and regulators?

As a result of its discussions, the Council developed a Maturity Model to help you assess your current state of information governance and provide guidance for developing a roadmap for the future. The Model identifies 11 categories of information governance. The categories cover all the different elements of building an information security strategy, such as who in the business and IT is responsible for what information, what policies you follow to control the flow of information in your company, what your methodologies are for identifying and mitigating risk, and how you measure the value of your data and the effectiveness of governance. I read two IBM white papers on the Model that add insight to the questions you need to ask to begin building a path to better information governance: “The IBM Data Governance Council Maturity Model: Building a roadmap for effective data governance” and “The IBM data governance blueprint: Leveraging best practices and proven technologies”.

So, what’s changed? Financial crises, increasing regulation, high-profile incidents of stolen private data, cloud technology, and other factors have added substance and complexity to the questions you need to ask about information governance. There is much to do. One question we will explore at the conference next week is how you measure the effectiveness of your information governance strategy and what analytical measures are appropriate. For example, some companies are using analytical tools to look for patterns of email communication across the company to discover a greater level of insight into how information is flowing and what needs more review. Look for more on analytics and governance after the conference.
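As a small, hypothetical illustration of that kind of analysis, the sketch below counts sender-to-recipient email flows from a simple message log and flags the heaviest ones for closer review. The log format and the review threshold are assumptions made for the example.

    # Sketch: surface the heaviest email communication flows from a simple log
    # with one "sender,recipient" pair per line. The threshold is an assumption.
    from collections import Counter

    def heavy_flows(log_path="mail_log.csv", threshold=100):
        pair_counts = Counter()
        with open(log_path) as f:
            for line in f:
                parts = line.strip().split(",")
                if len(parts) < 2:
                    continue  # skip malformed lines
                pair_counts[(parts[0], parts[1])] += 1
        # Flows above the threshold may deserve a closer governance review.
        return [(pair, count) for pair, count in pair_counts.most_common()
                if count >= threshold]

    if __name__ == "__main__":
        for (sender, recipient), count in heavy_flows():
            print(f"{sender} -> {recipient}: {count} messages")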
