What is Service Virtualization?

There has been a lot of discussion lately about “service virtualization,” but the term alone can make your head spin. Are we talking about server virtualization? What types of services are involved? What does virtualization have to do with testing? I’d like to quickly clear up any confusion you may have. Service virtualization simulates the behavior of components in an application so you can perform accurate and timely testing in a world of complex, interrelated applications. Production services that may not be available for integration testing can be virtualized so the testing can take place at an appropriate time in the software development process.

While quality professionals have always needed to test combinations of code, current methods for writing and combining code have changed so much that traditional approaches to testing can’t get the job done at the right price and the right time. There is a fast-growing commercial market for production services that are incorporated as self-contained modules into software applications. Third-party services such as PayPal or a credit-checking service are increasingly used in customer-facing applications.

Use of these third-party services increases the efficiency of software development, but at the same time it makes your application dependent on services that you do not control. Consider, for example, the scenario of an online retailer with multiple suppliers. The retailer has created a new mobile application for customers. This application uses a credit check service provided by a third-party vendor. The team can’t fully test the application without this dependent component, but the component is not available for testing. Without service virtualization, the software development team has some difficult choices to make, and none of the options are good. If the development team proceeds without doing the necessary testing, they may introduce errors that are much harder and more costly to fix later on. If the team waits until the third-party service is available, developer productivity will decline and the team may miss production deadlines. And even when the third-party service does become available, it can get expensive to test application performance at high usage levels, since the service costs money each time it is called.
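Before looking at the answer, here is a minimal sketch of what a virtualized stand-in for that credit check service might look like, written in Python using only the standard library. The endpoint, request fields, and canned responses are hypothetical stand-ins; a real virtual service would mirror the actual contract defined by the third-party vendor.

    # virtual_credit_service.py -- a sketch of a virtualized credit check service.
    # The /credit-check endpoint, request fields, and canned responses are
    # hypothetical; a real virtual service would mirror the vendor's contract.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Canned behaviors keyed by test customer ID let the team exercise the
    # approval and decline paths without calling the live, pay-per-use service.
    CANNED_RESPONSES = {
        "CUST-001": {"status": "approved", "score": 780},
        "CUST-002": {"status": "declined", "score": 540},
    }

    class VirtualCreditCheck(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            request = json.loads(self.rfile.read(length) or "{}")
            customer = request.get("customer_id", "")
            # Unknown test customers fall back to a generic approval.
            body = CANNED_RESPONSES.get(customer, {"status": "approved", "score": 700})
            payload = json.dumps(body).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(payload)))
            self.end_headers()
            self.wfile.write(payload)

    if __name__ == "__main__":
        # Point the application under test at http://localhost:8099 instead of
        # the real credit check vendor.
        HTTPServer(("localhost", 8099), VirtualCreditCheck).serve_forever()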

So what does the development team do in this situation? Service virtualization is a new approach to testing that helps organizations eliminate some of the testing bottlenecks that make it hard to bring new high-quality applications to market quickly. Here are five key things you should know about service virtualization.

  1. To get started with service virtualization you need to understand your testing methodology and think about where service virtualization can increase team velocity while also helping your team deliver higher-quality software.
  2. Use a cost/benefit analysis to select which services should be virtualized. Consider the cost to your company when testing is delayed because dependent services or software are not available for testing. How much is spent on staff needed to set up and maintain test environments? How much do you spend to maintain test environments that are not fully utilized? What is the cost of software licenses in the physical test lab environment? What is the cost of third-party service access fees?
  3. Service virtualization can help you find errors in all testing phases, including unit testing, performance testing, integration testing, system testing, system integration testing, user acceptance testing, and operability testing.
  4. Recording a service that already exists is a great way to define the behavior of your virtual component. You can use the recording process to identify the behaviors that will need to be simulated so you can create test cases quickly (see the record-and-replay sketch after this list).
  5. You can’t expect to virtualize all your components. Therefore, you need to be able to easily move back and forth between real components and virtual components while you are testing, and you want to maintain consistency across them.
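As an illustration of points 4 and 5, the following sketch records live responses from a real service when it is reachable and replays them when it is not, so tests can flip between the real and virtual component with a single flag. The service URL and the JSON cache format are assumptions made for this example, not any particular product's behavior.

    # record_replay.py -- a sketch of record-and-replay for building a virtual
    # component (point 4) and switching between real and virtual modes (point 5).
    # The cache file name and JSON format are illustrative assumptions.
    import json
    import pathlib
    import urllib.request

    CACHE = pathlib.Path("recorded_responses.json")

    def call_service(url: str, use_real: bool) -> dict:
        """Call the real service and record it, or replay a prior recording."""
        recordings = json.loads(CACHE.read_text()) if CACHE.exists() else {}
        if use_real:
            with urllib.request.urlopen(url) as resp:
                body = json.loads(resp.read())
            recordings[url] = body  # record the behavior for later virtual runs
            CACHE.write_text(json.dumps(recordings, indent=2))
            return body
        # Virtual mode: replay the recorded behavior without touching the network.
        return recordings[url]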

One of the biggest impacts of service virtualization for developers is the ability to validate integrations much earlier in the application life cycle. The software development team can move beyond unit testing and overcome many of the roadblocks that inhibit timely, efficient, and cost-effective testing.

Looking for the Small in Big Data

While the buzz in the big data market is around massive amounts of data and what you can do with that data, in truth companies are more interested in the small stuff. Let me explain. One of the greatest benefits of getting hold of lots of data is that a company can see patterns and anomalies that would have gone undetected with a smaller set of data.

Companies are interested in big data not so much because of what that data represents in terms of volume, variety, and velocity, but because of what the analysis of that data can do for the business. As business decision makers begin to recognize the potential of big data, they realize that the biggest insights and hidden gems of knowledge are found when big data actually becomes quite small. However, the analysis must start with very large volumes of data. Exploring the world of big data takes business leaders beyond the data found in their organization’s traditional databases. Researchers, data scientists, and marketers are analyzing very large volumes of unstructured data from previously untapped sources such as e-mails, customer service records, sensor data, and security logs.

Now that companies have the ability to deal with so much more data, they are free to incorporate a greater variety of data into the mix. For example, they are incorporating social media, mobile phone location, traffic, and weather data into more traditional analysis. But what is the goal? Add more data and more elements so that those patterns emerge. Armed with the insight from analysis, business leaders can take a more precise set of data and compare it to data from a data warehouse or a system of record. As a result, the research becomes more targeted and directed to fit in with the context of the business.

Why do companies need to take this targeted approach to leveraging big data? In essence, companies want to use big data analysis to make a personalized offer that is just right for the customer when that customer is ready to buy. In some cases, the answer may be related not to selling but to diagnosing problems with a manufacturing system or diagnosing a patient with an unexplained illness.

How do you move from big data analysis to small data insights and personalized action? You need to consider three elements: defining your business problem, defining and analyzing your data sources, and integrating and incorporating your big data analysis with your operational data.

Defining your business problem. Companies are beginning to ask the traditional questions about customers, products, and partners in new ways. For example, are you looking to manage your customer interactions armed with in-depth and customized knowledge about each individual customer? Companies with a focus on driving continuous improvement in customer service are asking, “How can I delight this customer and anticipate their specific needs?” These are important goals for businesses competing in today’s fast-paced, mobile-driven market.

Defining and analyzing your data sources. What information do you need to make the right offer to your buyer when they are deciding on a purchase? What information can you glean from outside sources such as social media data? What big data sources do you have available internally that were previously underutilized? For example, can you use text analytics to gain new insight about customers from call center notes, emails, and voice recordings? One important goal with big data analytics is to look for patterns and relationships that apply to your business and narrow down the data set based on business context. Your big data analysis will help you find the small treasures of information in your big data.
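As a simple illustration of narrowing big data down to a small, actionable set, the sketch below scans a collection of call center notes for a few dissatisfaction keywords and keeps only the customers worth an immediate follow-up. The record fields and keyword list are invented for the example; real text analytics would be far more sophisticated.

    # A sketch of "big data to small data": scan a large set of call center
    # notes and keep only the customers whose notes signal dissatisfaction.
    # The field names and keyword list are invented for illustration.
    DISSATISFACTION_KEYWORDS = {"cancel", "frustrated", "refund", "competitor"}

    def at_risk_customers(call_notes):
        """call_notes: iterable of {'customer_id': str, 'note': str} records."""
        at_risk = set()
        for record in call_notes:
            words = set(record["note"].lower().replace(",", " ").split())
            if words & DISSATISFACTION_KEYWORDS:
                at_risk.add(record["customer_id"])
        return at_risk

    notes = [
        {"customer_id": "C-100", "note": "Asked about a refund, very frustrated"},
        {"customer_id": "C-200", "note": "Happy with the upgrade"},
    ]
    print(at_risk_customers(notes))  # {'C-100'} -- the small, actionable subset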

Integrating and incorporating the analysis of your big data with your operational data.
After your big data analysis is complete, you need an approach that will allow you to integrate or incorporate the results of your big data analysis into your business processes and real-time business actions. This will require some adjustment to the conventional notion of data integration. In order to bring your big data environments and enterprise data environments together, you will need to incorporate new methods of integration that support Hadoop and other nontraditional big data environments. In addition, if you want to incorporate the results of very fast streaming data into your business processes, you will need advanced technology that enables you to make decisions in real time.

Ultimately, if you want to make good decisions based on the results of your big data analysis you need to deliver information at the right time and with the right context. In order to make the results of your analysis actionable, you need to focus more on the small – targeted and personalized results of big data – than on the large data volumes.

Getting Your Virtualization Priorities Straight

Recently I talked with several data center managers about their experiences with virtualization. While these managers have different perspectives, they all agreed that server virtualization alone isn’t enough; a server-only strategy leaves out elements that make it hard to meet the increasing demands of the business. By moving beyond server virtualization to a more holistic approach that includes virtualizing storage, network, and other technology assets, these companies are increasing the ROI of their virtualization implementations. They are beginning to reset their virtualization priorities to make sure that all the elements of the IT environment work together – and this includes creating a virtualization environment that automatically allocates resources based on the demands of specific applications and workloads.

IT management needs to focus on application priorities in terms of the performance needed to support the business. If all your applications are treated with the same priority, how can you be assured that your most critical applications always have access to the resources they require? You may be doing a great job monitoring CPU usage and available memory in your server virtualization environment, but still have unexpected performance problems that impact critical customer applications. What’s missing is a way to adjust for variations in business priority when you allocate resources across your virtualized environment.

One way to ensure that your environment operates with an increased awareness of the requirements of each specific application is to implement application infrastructure virtualization. This capability codifies business and technical usage policies to make sure that each of your applications leverages virtual and physical resources in a predictable way. By automating many of the underlying rules and processes, you can optimize resource utilization and the customer experience at the same time.

There are three main characteristics for application infrastructure virtualization:

  • Setting business priorities for applications and automatically adjusting resources to keep customer service levels in balance
  • Applying a customer focused approach to the automation of resource optimization so that each application gets the resources it needs based on resource availability and the application’s priority to the business
  • Allocating a pooled set of resources to support a group of workloads.

Application infrastructure virtualization ensures that any resource can run any workload. If resources are limited, the application with the lowest business priority at the time is allocated the fewest resources, as the sketch below illustrates.
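Here is a minimal sketch of that allocation rule, assuming a deliberately simplified model in which a fixed pool of resource units is divided among applications in order of business priority. The application names, priorities, and unit counts are hypothetical.

    # A sketch of priority-aware allocation from a shared pool: higher-priority
    # applications are satisfied first, so when the pool runs short, the lowest
    # business priority receives the fewest resources. The unit-based model is
    # a simplification for illustration.
    def allocate(pool_units, apps):
        """apps: [{'name': str, 'priority': int, 'demand': int}];
        a higher priority value means more important to the business."""
        allocations = {}
        remaining = pool_units
        for app in sorted(apps, key=lambda a: a["priority"], reverse=True):
            granted = min(app["demand"], remaining)
            allocations[app["name"]] = granted
            remaining -= granted
        return allocations

    apps = [
        {"name": "customer-checkout", "priority": 3, "demand": 40},
        {"name": "order-tracking", "priority": 2, "demand": 30},
        {"name": "internal-reporting", "priority": 1, "demand": 30},
    ]
    print(allocate(80, apps))
    # {'customer-checkout': 40, 'order-tracking': 30, 'internal-reporting': 10}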

I expanded on this issue in a white paper I recently wrote for IBM. The paper, called Creating Application Awareness in IT Virtualization Environments, discusses application infrastructure virtualization and how companies can combine server and application infrastructure virtualization to improve overall performance levels. In addition, the paper describes IBM’s solution for application infrastructure virtualization, IBM WebSphere Virtual Enterprise (WVE).

It is easy to assume that server virtualization by itself is enough to solve resource management issues in the data center. However, it is increasingly clear that virtualization has to be tied to the performance of individual applications at the times and in the situations where they demand it. Tying application performance to virtualization creates a more efficient and proactive environment for satisfying customer expectations.

IBM’s Vision for Analytics in the Midmarket: gaining deeper business insight

I recently attended an IBM analyst meeting focused on solutions for the midmarket. What caught my attention was the focus on analytics as an important and growing revenue opportunity for IBM. In fact, IBM mentioned during the meeting that 70 percent of midsized firms are looking for analytics solutions. It is clear from this meeting that IBM wants to bring a comprehensive set of analytical tools to midsize companies. In a departure from some of IBM’s traditional packaging, these analytics tools are being packaged specifically for the midmarket so that they are more consumable and affordable.

Analytics is fast becoming a high priority for companies as a result of the explosion in the variety, velocity, and volume of data with a potential impact on business decision-making. Much of this data is unstructured, such as the text included in customer service records, customer sentiment data in social media, or streams of data from instrumented devices. Making good business decisions often requires analysis across multiple sources and types of data. Companies often have independent systems designed to manage business processes ranging from order/inventory to point of sale, marketing research, and customer relationship management. The challenge for many of these companies is that answering the most urgent questions about the business requires analysis across all of these independent systems. Even a small company with a few hundred employees may have a dozen systems that are disconnected and keep the company from having a full picture of the business.

Therefore, it is not surprising that some midsize companies are finding they can benefit from business analytics solutions. Yet, while some midsize companies are finding ways to get the answers they want using analytics, the word needs to spread to other companies still struggling with manual spreadsheet analysis that doesn’t go deep enough.

IBM is going to market through the business partners that typically support midsize companies with a variety of solutions. These business partners are being asked by their midmarket clients to help them implement technology solutions that will enable them to make smarter business decisions. The clients want to find new ways to deepen their understanding of customer expectations and priorities. For example, a midsize retailer might be trying to figure out why certain products are returned while others sell well. The analytics market offers a huge opportunity for IBM and its partners.

The approach IBM is taking with analytics for the midmarket is to offer its partners hardware and software pre-configured into a single system at a price point that is affordable for midsized companies but also carries enough margin to make it attractive to the partner channel.

However, the challenge for partners is to change the traditional way they have gone to market. Many partners that have built successful businesses by specializing in hardware sales or a specific category of software such as IBM Rational find that they need to meet a broader set of client requirements. They now need both to learn the new analytics products and to be ready to sell and implement solutions differently. Selling analytics to the midmarket requires much more than a technical sell. Partners need to have a thorough understanding of the business context in which the analytics will be used to help customers visualize the potential business value.

One of IBM’s offerings that partners should be looking at is the IBM Smart Analytics System 5710, a database appliance for business intelligence and data analytics targeted at the SMB market. The IBM Smart Analytics System 5710 is based on IBM System x, runs Linux, and includes InfoSphere Warehouse Departmental Edition and Cognos 10 Business Intelligence Reporting and Query. The system is designed to enable partners to get their clients up and running quickly with a broad set of analytics and business intelligence capabilities. I expect that you will see a lot more of this type of packaging from IBM in collaboration with its solution business partners.

Is Service Management the Missing Link on the Path from Virtualization to Cloud?

Many business executives are interested in moving to the cloud because of the potential impact on business strategy. Increasingly they are convinced that a cloud model – particularly the private cloud – will give them greater flexibility to change and to manage the uncontrolled expansion of IT. In contrast, from an IT perspective, the ability to virtualize servers, storage, and I/O is often viewed as the culmination of the cloud journey. Of course, the world is always more complicated than it appears. Cloud computing, if implemented in a strategic manner, can help a company experiment and change more easily. Likewise, virtualization, which may seem like an isolated and pragmatic approach, needs to be considered in the context of an overall cloud computing strategy.

However, the challenge for many companies right now is how to transform their virtualized infrastructure into a private cloud that delivers on the promise of on-demand, self-service provisioning of IT resources. Making sure that business leaders gain the desired cost savings and business flexibility while IT gains the optimization that virtualization can deliver requires an integrated strategy. At the heart of this strategy is service management of this emerging, highly virtualized environment.

Why is service management important? There are good reasons why a key focus of the virtualization strategy at many companies has been server virtualization. For example, server virtualization helps companies create a faster and more efficient IT provisioning process for users. It also gives users increased operational flexibility based on the mobility and isolation capabilities of virtual machines.

However, there is a major management problem with the typical virtualization approach in many companies. Often developers satisfy their demands for computing resources by simply spinning up a new virtual machine rather than going through the time and expense of procuring new IT systems. In the beginning, IT management allows this practice because it is easier than trying to rein in impatient developers. However, there is a price to be paid. Each new virtual machine image requires memory and disk resources. When the number of virtual machines grows out of control, companies end up spending more time and money on disk, storage, and memory resources than was anticipated. Lack of control means lack of management. What is the answer?

At a recent IBM systems software meeting, Helene Armitage, General Manager of Systems Software, emphasized that truly transforming the data center into an agent of change requires a focus on manageability. Here are the main takeaways from my conversations with Helene and her team.

  • Virtualization is not just about server consolidation. Companies need to think differently about virtualization in order to prepare for a future marked by rapid change. For example, storage virtualization can offer big benefits to companies facing explosive demand for data storage driven by increased use of new technologies like embedded and mobile devices.
  • Physical and virtual environments will need to be managed in a unified way and at the same time. Some may think of virtualization as a way of reducing management requirements. After all, with consolidated resources you should have less demand for power, energy, and space. However, management needs to be done right to be effective. It is important to consider how to manage across the platform and to include monitoring and configuration in combination with business service management – delivering IT services in a structured, governed, and optimized way.
  • Cloud changes the complexion of what’s important in how virtualization is approached. In the cloud, virtualization takes on a different level of complexity. If elements of virtualization including storage, server, and I/O virtualization are handled as isolated tasks, the cloud platform will not be effectively managed.
  • System pooling is key for managing lots of systems. System pooling essentially means that IT treats all resources as a unified set of shared resources to improve performance and manageability.
  • Enforcing standardization right from the beginning is a requirement — especially when virtualization is at the core of a cloud environment. The only way to truly manage the virtualized environment is to apply repeatable, predictable, and standardized best practices across the entire computing environment (see the provisioning sketch after this list).
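To make the standardization and pooling points concrete, here is a minimal sketch, with an invented image catalog and pool quota, of provisioning that honors requests only for approved standard images and only while the shared pool has capacity. It illustrates the principle, not any vendor's implementation.

    # A sketch of standardized, pooled provisioning: requests are honored only
    # for approved standard images, and only while the shared pool has capacity.
    # The catalog, pool size, and memory figures are illustrative assumptions.
    STANDARD_IMAGES = {"web-standard": 4, "db-standard": 8}  # image -> GB memory
    POOL_MEMORY_GB = 64

    used_memory = 0

    def provision(image):
        """Grant the request only if the image is standard and the pool has room."""
        global used_memory
        if image not in STANDARD_IMAGES:
            return False  # non-standard request: rejected up front
        needed = STANDARD_IMAGES[image]
        if used_memory + needed > POOL_MEMORY_GB:
            return False  # pool exhausted: prevents unchecked VM sprawl
        used_memory += needed
        return True

    print(provision("web-standard"))  # True -- standard image, pool has room
    print(provision("custom-build"))  # False -- not in the standard catalog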

The greatest challenge for companies is to think differently about virtualization.  Management will need to realize that virtualization is a foundational element in building a private cloud environment and therefore has to be looked at from a holistic perspective.

Can a small business act like a giant with SaaS?

It isn’t easy being a small fish in a big pond.  How does a small mortgage company leverage its ability to be nimble in a competitive market? I believe that SaaS offerings are initiating a revolution with broad implications for business models and competitiveness. It is becoming increasingly clear that with the advent of sophisticated software as a service (SaaS) environments combined with process as a service, it is possible and even commonplace for a small company to be competitive.  I had the opportunity to spend some time at IBM’s LotusSphere with a small mortgage company called Signature Mortgage based in Ohio.  By leveraging a combination of IBM’s LotusLive platform combined with Silanis Technology’s electronic signature platform, Signature Mortgage has been able to differentiate itself in an extremely complex market dominated by Fortune 500 companies.

Mortgage brokers like Signature Mortgage are involved primarily in the origination process of the loan, bringing mortgage borrowers together with the mortgage lenders who actually fund the loan. Borrowers engage with mortgage brokers early in the mortgage process, when a consumer looking for a mortgage often has a lot of choices. It can be hard at this early stage for a consumer to differentiate between mortgage brokers, and if one broker takes several days or longer to deliver the mortgage application documents for signing, the consumer might switch to another broker. Typically a consumer selects a mortgage broker based on the offered rate, term, and closing costs, and then locks down the rate to protect against a rate change. The broker then facilitates a complex series of steps that must take place in order to move from this initial rate lock to the approval and closing of the mortgage. The consumer must submit financial and personal documentation so the mortgage broker can assess the creditworthiness of the individual and appraise the property.

The process between mortgage origination and closure can take as long as 45-60 days.  To make money, the mortgage broker needs to be able to collect all the required information quickly and then be in a position to close the deal with the mortgage lender before rates change.  There is a lot at stake for both the mortgage broker and applicant during this time period. For example, missing a deadline for a signature can lead to cancellation of locked-in rates on a mortgage commitment, potentially leading to higher costs for a buyer or lost revenue for a mortgage broker.

Bob Catlin, President of Signature Mortgage, explained to me that he was determined to use his company’s small size as an advantage by quickly implementing innovative technology that might take much longer to deploy in a large company with legacy policies and infrastructure. By streamlining and speeding up the mortgage origination process, he could differentiate from the larger banks, increase profits, and have happier, more satisfied customers.

Signature Mortgage has its customers log in to a portal designed to capture best practices for submitting application documentation, revising documents, receiving status reports, and securing electronic signatures when required. All of the steps in the mortgage process are documented within the Silanis Technology portal, which is based on IBM’s LotusLive collaboration platform. The Silanis electronic signature solution is delivered as a cloud-based service, making it attractive to a small business with a limited IT staff. By implementing Silanis Technology’s solution, Catlin has been able to shave days off the loan origination process because there are no more last-minute surprises resulting from missing data or documents and delays in arranging in-person meetings.

For Signature Mortgage, the first step in tightening the timeline for the mortgage process was to consistently lock in rates in less than 15 days. Catlin is now working on decreasing application processing time to 24 hours and shortening the total time between mortgage application and closing to 10-15 days. Any mortgage borrower who has dealt with requests for last-minute faxes of missing documentation, or with errors discovered as the rate-lock deadline approaches, will understand that decreasing the time to close from the industry average of 30-45 days is a big deal. However, speed isn’t the entire benefit; managing all information related to the mortgage applications in one centralized portal improves accuracy and accountability. Using the capabilities of LotusLive and Silanis e-signature, Signature Mortgage has been able to include features that make the online mortgage process intuitive and consistent with the paper-based manual processes that are familiar to many people. For example, customers click to sign where they see a virtual sticky note, similar to the process used at in-person signings. The consistency and repeatability of the process helps the company maintain compliance with legal and regulatory requirements.

With this predictable foundation, Signature Mortgage has been able to grow quickly, increase profitability and build a strong presence in the community.

Cashing in on the Cloud

I am welcoming my business partner, Judith Hurwitz, as a contributor to my blog. What follows are her observations about the partner ecosystem in the cloud.

Judith Hurwitz

I have been spending quite a bit of time these days at cloud computing events. Some of these events, like the Cloud Camps, are wonderful opportunities for customers, vendors, consultants, and interested parties to exchange ideas in a very interactive format. If you haven’t been to one, I strongly recommend them. Dave Nielsen, one of the founders of the Cloud Camp concept, has done a great job not just jump-starting these events but participating in most of them around the world. In addition, Marcia Kaufman and I have been conducting a number of half- and full-day Introduction to Cloud Computing seminars in different cities. The most interesting observation from my view is that customers are no longer sitting on the sidelines with their arms crossed. Customers are ready and eager to jump into this new computing paradigm. Often they are urged on by business leaders who instinctively see the value in turning computing into a scalable utility. So, for the first time, there is a clear sense that there may well be money to be made.
