Tuesday, October 20, 2009

Analysis of SAS Institute and IBM Intelligence Alliance

"At IBM's PartnerWorld 2000 in San Diego this Monday (24Jan00), SAS Institute and IBM will announce a new business intelligence relationship that will include the formation of consulting practices focused on SAS solutions, and further development of e-business intelligence solutions that integrate IBM's DB2 database product family and SAS software.

The announcement between the two business intelligence leaders is the latest in a select group of key strategic relationships forged by IBM as it refocuses its partnering efforts to provide world-class e-business applications. Recent announcements have included partnerships with other leading software providers such as Siebel Systems and SAP AG.

The agreement between IBM and SAS Institute and the planned joint development efforts will result in:

* Creation of a consulting practice in IBM Global Services specializing in SAS solutions. These consultants will work with joint customers to integrate the powerful decision support capabilities of SAS solutions with existing transaction systems and other e-business applications.

* Closer integration of SAS solutions and DB2 Universal Database to enhance performance for all IBM server platforms, including Netfinity, AS/400, RS/6000, NUMA-Q and S/390.

* IBM Global Services' access to a wide range of SAS Institute solutions for business intelligence, data warehousing, and decision support.

The relationship will initially focus on three primary areas where IBM and SAS Institute will offer end-to-end solutions to enterprise customers. IBM Global Services will provide the analytical services, systems integration and industry-specific consulting expertise. SAS Institute will provide software solutions for Customer Relationship Management (CRM), Enterprise Resource Planning (ERP), and Supplier Relationship Management (SRM). IBM and SAS Institute plan to more tightly integrate and thus enhance performance of DB2 Universal Database and SAS software."
The existing customer base for IBM DB2 Universal Database should be strongly interested in this development. The ability to access solutions for customer relationship management and extended supply chain solutions should be especially intriguing. We believe that the combination of SAS's strong business intelligence solutions and IBM's global sales and consulting forces will make a powerful combination. The question for customers will be whether this is just a marketing alliance or an actual combination of powerful products at the code level, allowing customers to seamlessly integrate the products.

Using Predictive Analytics within Business Intelligence: A Primer

Predictive analytics has helped drive business intelligence (BI) towards business performance management (BPM). Traditionally, predictive analytics and models have been used to identify patterns in consumer-oriented businesses, such as identifying potential credit risk when issuing credit cards, or analyzing the buying habits of retail consumers. The BI industry has shifted from identifying and comparing data patterns over time (based on batch processing of monthly or weekly data) to providing performance management solutions with right-time data loads in order to allow accurate decision making in real time. Thus, predictive analytics within BI has become an extension of general performance management functionality. For organizations to compete in the marketplace, taking a forward-looking approach is essential. BI can provide the framework for organizations focused on driving their business based on predictive models and other aspects of performance management.

We'll define predictive analytics and identify its different applications inside and outside BI. We'll also look at the components of predictive analytics, their evolution from data mining, and how the two interrelate. Finally, we'll examine the use of predictive analytics and how it can be leveraged to drive performance management.

Overview of Analytics and Their General Business Application

Analytical tools enable greater transparency within an organization, and can identify and analyze past and present trends, as well as discover the hidden nature of data. However, past and present trend analysis and identification alone are not enough to gain competitive advantage. Organizations need to identify future patterns, trends, and customer behavior to better understand and anticipate their markets.

Traditional analytical tools claim to have a 360-degree view of the organization, but they actually only analyze historical data, which may be stale, incomplete, or corrupted. Traditional analytics can help gain insight based on past decision making, which can be beneficial; however, predictive analytics allows organizations to take a forward-looking approach to the same types of analytical capabilities.

Credit card providers offer a first-rate example of the application of analytics (specifically, predictive analytics) in their identification of credit card risk, customer retention, and loyalty programs. Credit card companies attempt to retain their existing customers through loyalty programs, and need to take into account the factors that cause customers to choose other credit card providers. The challenge is predicting customer loss. In this case, a model that uses three predictors can help predict customer loyalty: frequency of use, personal financial situation, and the lower annual percentage rates (APRs) offered by competitors. The combination of these predictors can be used to create a predictive model. The predictive model can then be applied, and customers can be placed into categories based on the resulting data. Any change in a customer's classification flags that customer for targeting by the loyalty program. Financial institutions, on the other hand, use predictive analytics to identify the lifetime value of their customers. Whether this translates into increased benefits, lower interest rates, or other perks for the customer, classifying and applying patterns to different customer segments allows the financial institutions to best benefit from (and provide benefit to) their customers.
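
The three-predictor loyalty model described above can be sketched as a simple rule-based classifier. The predictor names, thresholds, and categories below are illustrative assumptions for the sketch, not part of any real scoring model:

```python
def classify_loyalty(monthly_uses, debt_to_income, competitor_apr_gap):
    """Place a cardholder in a loyalty-risk category from three predictors.

    All thresholds are illustrative assumptions.
    competitor_apr_gap: our APR minus the best competing APR, in points
    (a large positive gap means a competitor undercuts us).
    """
    risk = 0
    if monthly_uses < 5:          # infrequent use suggests weak attachment
        risk += 1
    if debt_to_income > 0.4:      # strained finances raise attrition risk
        risk += 1
    if competitor_apr_gap > 2.0:  # a much lower competing APR tempts switching
        risk += 1
    return {0: "loyal", 1: "watch", 2: "at-risk", 3: "likely to churn"}[risk]

# A change in classification flags the customer for the loyalty program.
previous = classify_loyalty(12, 0.2, 0.5)   # "loyal"
current = classify_loyalty(3, 0.45, 2.5)    # "likely to churn"
flag_for_loyalty_program = previous != current
```

A production model would learn the thresholds and category boundaries from historical attrition data rather than hand-coding them, but the structure — predictors combined into a classification — is the same.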
Data mining can be defined as an analytical tool set that automatically searches for specific patterns within large datasets across disparate organizational systems. Data mining, text mining, and Web mining are types of pattern identification. Organizations can use these forms of pattern recognition to identify customers' buying patterns or the relationship between a person's financial records and their credit risk. Predictive analytics moves one step further and applies these patterns to make forward-looking predictions. Instead of just identifying a potential credit risk, an organization can identify the lifetime value of a customer by developing predictive decision models and applying these models to the identified patterns. These types of pattern identification and forward-looking model structures can equally be applied to BI and performance management solutions within an organization.

Predictive analytics is used to determine the probable future outcome of an event, or the likelihood of a situation occurring. It is the branch of data mining concerned with the prediction of future probabilities and trends. Predictive analytics is used to automatically analyze large amounts of data with different variables, using techniques such as clustering, decision trees, market basket analysis, regression modeling, neural nets, genetic algorithms, text mining, hypothesis testing, decision analytics, and so on.

The core element of predictive analytics is the predictor, a variable that can be measured for an individual or entity to predict future behavior. Predictors feed the models built on top of them. Descriptive models classify relationships by identifying customers or prospective customers and placing them in groups based on identified criteria. Decision models go beyond the general functionality of a predictive model by considering business and economic drivers and constraints. Statistical analysis helps drive this process: the predictors are the factors that determine the outcomes of the actual model. For example, a financial institution may want to identify the factors that make a valuable lifetime customer.

Multiple predictors can be combined into a predictive model, which, when subjected to analysis, can be used to forecast future probabilities with an acceptable level of reliability. In predictive modeling, data is collected, a statistical model is formulated, predictions are made, and the model is validated (or revised) as additional data becomes available. One of the main differences between data mining and predictive analytics is that data mining can be a fully automated process, whereas predictive analytics requires an analyst to identify the predictors and apply them to the defined models.
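
The collect, formulate, predict, validate cycle just described can be sketched in a few lines. The ordinary least squares fit with one predictor and the tolerance-based validation check are illustrative assumptions, not a prescribed method:

```python
def fit_simple_model(xs, ys):
    """Formulate a statistical model (ordinary least squares, one predictor)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

def predict(model, x):
    a, b = model
    return a + b * x

def validate(model, xs, ys, tolerance):
    """Check the model against newly collected data; a failing check
    signals that the model should be revised."""
    return all(abs(predict(model, x) - y) <= tolerance
               for x, y in zip(xs, ys))

# 1. Collect data, 2. formulate the model ...
history_x, history_y = [1, 2, 3, 4], [2.1, 3.9, 6.0, 8.1]
model = fit_simple_model(history_x, history_y)
# 3. ... make predictions ...
forecast = predict(model, 5)
# 4. ... and validate (or revise) as additional data becomes available.
still_valid = validate(model, [5, 6], [10.0, 12.1], tolerance=0.5)
```

Note the analyst's role highlighted above: choosing the predictor (`x`), the model form, and the validation tolerance is not automated here, which is exactly the distinction drawn between data mining and predictive analytics.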

A decision tree is a technique within predictive analytics that allows the user to visualize the mapping of observations about an item to conclusions about the item's target value. Basically, decision trees are built by creating a hierarchy of predictor attributes: each level tests another factor, and each path through the tree ends in an outcome. This can be compared to nested if-else statements, which identify a result based on whether certain factors meet specified criteria. For example, in order to assess potential bad debt based on credit history, salary, demographics, and so on, a financial institution may wish to identify multiple scenarios, each of which is likely to meet bad debt customer criteria, and use combinations of those scenarios to identify which customers are most likely to become bad debt accounts.
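
The if-else analogy can be made literal with a tiny hand-built tree for the bad-debt example. The predictor attributes and cut-offs here are illustrative assumptions; in practice they would be learned from historical account data:

```python
def bad_debt_risk(credit_defaults, salary, months_employed):
    """Walk a hand-built decision tree to assess potential bad debt.

    Each if-else branch tests one predictor attribute; each path to a
    return statement is one scenario. All cut-offs are assumptions.
    """
    if credit_defaults > 0:        # first split: past credit history
        if salary < 30_000:        # strained income plus defaults
            return "high"
        return "medium"
    else:                          # clean credit history branch
        if months_employed < 6:    # short employment adds uncertainty
            return "medium"
        return "low"

# Combining the paths identifies which customers are most likely
# to become bad debt accounts.
watch_list_candidate = bad_debt_risk(1, 25_000, 24)   # "high"
```

A learned tree (e.g., via CART) would choose the splits and their order automatically, but it evaluates new customers in exactly this nested-conditional fashion.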

Regression analysis is another component of predictive analytics that allows users to model relationships between two or more variables in order to predict the value of one variable from the values of the others. It can be used to identify buying patterns based on multiple demographic qualifiers, such as age and gender, which is helpful in identifying where to sell specific products. Within BI, this is beneficial when used with scorecards that focus on geography and sales.
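
A minimal sketch of such a regression, predicting spend from the two demographic qualifiers mentioned above, can be written with ordinary least squares via the normal equations. The dataset, the 0/1 gender encoding, and the coefficients are all illustrative assumptions:

```python
def solve3(A, b):
    """Solve a 3x3 linear system with Gauss-Jordan elimination (partial pivoting)."""
    m = [row[:] + [v] for row, v in zip(A, b)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [a - f * c for a, c in zip(m[r], m[col])]
    return [m[i][3] / m[i][i] for i in range(3)]

def fit_regression(rows, ys):
    """Ordinary least squares for spend ~ b0 + b1*age + b2*gender,
    solved via the normal equations (X'X) b = X'y."""
    X = [[1.0, age, gender] for age, gender in rows]
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    Xty = [sum(r[i] * y for r, y in zip(X, ys)) for i in range(3)]
    return solve3(XtX, Xty)

# Tiny illustrative dataset: (age, gender) -> observed spend.
demo = [(25, 0), (35, 0), (45, 1), (55, 1)]
spend = [100.0, 120.0, 160.0, 180.0]
b0, b1, b2 = fit_regression(demo, spend)
predicted = b0 + b1 * 40 + b2 * 1   # expected spend for a 40-year-old, gender=1
```

Fed into a geography-and-sales scorecard, coefficients like these would indicate which demographic segments to target when deciding where to sell specific products.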
Practical applications of all of these analytical models allow organizations to forecast financial outcomes, hopefully increasing revenues in the process. Within BI, aside from financial outcomes, predictive analytics can be used to develop corporate strategies throughout the organization. What-if analyses can be performed to leverage the capabilities of predictive analytics to build various scenarios, allowing organizations to map out a series of outcomes of strategic and tactical plans. This way, organizations can implement the best strategy suggested by the scenarios created.

How Predictive Analytics Are Used within BI, and How They Drive an Organization's BPM

Data mining, predictive analytics, and statistical engines are examples of tools that have been embedded in BI software packages to leverage the benefits of performance management. If BI is backward looking, and data mining identifies the here and now, predictive analytics, and their use within performance management, are the looking glass into the future. This forward-looking view helps organizations drive their decision making. BI is known for its consolidation of data from disparate business units, and for its analysis capabilities based on that consolidated data. Performance management goes one step further by leveraging the BI framework (such as the data warehousing structure and extract, transform, and load [ETL] capabilities) to monitor performance, identify trends, and allow decision makers to set appropriate metrics and monitor results on an ongoing basis.

With predictive analytics embedded within the above processes, the metrics set and business rules identified by organizations can be used to identify the predictors that need to be evaluated. These predictors can then be used to shift towards a forward-looking approach in decision making by using the strengths from the areas identified above. Scorecards are one example of a performance management tool that can leverage predictive analytics. The identification of sales performance by region, product type, and demographics can be used to define what new products should be introduced into the market, and where. In general, scorecards can graphically reflect the selected sales information and create what-if scenarios based on the data identified to verify the right combinations of new product distribution.

What-if scenarios can be used within the different visualization tools to create business models that anticipate what might happen within an organization based on changes in defined variables. What-if analysis gives organizations the tools to identify how profits will be affected based on changes in inflation and pricing patterns as well as the impact of increasing the number of employees throughout the organization. Online analytical processing (OLAP) cubes can be created to identify dimensional data, and patterns within changing dimensions can be compared over time to contrast scenarios using a cube structure to automatically view the outcome of the what-if scenarios.
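
The what-if analysis described above, varying inflation, pricing, and headcount to compare profit outcomes, can be sketched as follows. The base figures, scenario variables, and cost model are illustrative assumptions:

```python
def project_profit(base, inflation, price_change, extra_employees,
                   avg_salary=60_000):
    """Project next-year profit under one what-if scenario.

    base: dict with current 'revenue' and 'costs'; the remaining
    arguments are the scenario's changed variables (all assumptions).
    """
    revenue = base["revenue"] * (1 + price_change)
    costs = base["costs"] * (1 + inflation) + extra_employees * avg_salary
    return revenue - costs

base = {"revenue": 5_000_000.0, "costs": 4_000_000.0}
scenarios = {
    "status quo":     dict(inflation=0.02, price_change=0.00, extra_employees=0),
    "raise prices":   dict(inflation=0.02, price_change=0.05, extra_employees=0),
    "grow headcount": dict(inflation=0.02, price_change=0.05, extra_employees=5),
}
# Map out the series of outcomes, then pick the best strategy.
outcomes = {name: project_profit(base, **s) for name, s in scenarios.items()}
best = max(outcomes, key=outcomes.get)
```

In a BI deployment these scenario dimensions would typically live in an OLAP cube, so that the same comparison could be sliced over time and across business units rather than computed over a flat dictionary.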

Marketing and Intelligence, Together at Last

Angara offers an ASP-based service for targeting web site content to unidentified visitors (see article, "Getting Strangers to Take Your Candy"). The company buys online profile data from other websites. These are data that users agree to provide in exchange for receiving newsletters or other offers or are captured from clickstreams by online advertising networks such as MatchLogic.

By arrangement with the websites Angara gets to drop a cookie - but not any data that might identify the user as an individual. When the user later visits an Angara customer, Angara can provide segmentation information such as age, sex, or geographic region. The customer's website uses the segmentation information to serve targeted content to the visitor.

In the case of data from ad agencies, Angara is given access to the cookies dropped by the agencies. In both cases the data only identify broad characteristics of the user, such as sex, interests and responses to categories of advertising. The goal is to make first time visitors more likely to make purchases or return to the site.

Net Perceptions specializes in analysis of existing customers. The company sifts through data on the viewing and purchasing behavior of shoppers and uses its conclusions to make recommendations for personalized offers and targeted ads. Net Perceptions has chosen Angara to complement its own offerings in its new ASP offering, called the Net Perceptions Personalization Network.

The Personalization Network will offer four "channels," each driven from databases compiled by the network:

* The Intelligence Channel provides analytic tools to let companies understand their website visitors.

* The Recommendation Channel makes recommendations for cross-sells and up-sells based on the behavior of previous visitors.

* The Customer Acquisition Channel uses Angara's Converter product to target content to first-time visitors.

* The E-Mail Channel, a strategic partnership with Xchange, Inc., provides clients with the ability to design and target consumer emails.

The move by eCRM firms to embrace ASP offerings is accelerating. In Angara's case an ASP model is a necessity because of the dependence of their solution upon the data they collect. Net Perceptions is one of the very first to move an existing, successful suite to an ASP model. We expect that this will encourage the expansion of the market.

One thing that could hurt this market in the future is a privacy scare. Angara has a good privacy model in that they never get to see information that identifies individuals. One might argue that opting in to receive promotions does not necessarily mean that you want to be identified to a service that tells websites how to serve content to you, but given that the Web is supported by advertising this seems to us like a minimal intrusion, if it is one at all.

Net Perceptions takes no responsibility for the use its customers make of their data; its official policy is "Net Perceptions encourages all of its customers to adopt privacy standards of their own and make those standards freely accessible." We haven't yet seen a privacy policy for the ASP service. We believe that it should contain provisions that each ASP customer's data will be kept isolated from all the other customers, and that data collected through Net Perceptions' applications not become part of the Angara database.

We don't see data merging of this type to be a priori improper - that would depend on the mechanics - but we feel certain that it would ignite the concerns of privacy advocates and the public. Angara assures us that there are in fact no plans for any such data merging.

Enterprise Resource Planning Vendor Gains Connectivity through Acquisition of Plant Intelligence Provider

The acquisition of Lighthammer Software Development Corporation (www.lighthammer.com), a privately-held supplier of enterprise manufacturing intelligence and collaborative manufacturing software, by SAP might indicate that manufacturing operations management (MOM) software systems are becoming ripe for consolidation. MOM software is the Web-based collaborative software layer (with the traits of both integration and analytic applications) for monitoring plant-level trends, establishing operational context, and disseminating plant-level information to the other enterprise-wide constituencies. It is also referred to as enterprise manufacturing intelligence (EMI), manufacturing performance service (MPS), or whichever other acronym some analyst has come up with to make the traditionally not very user-friendly space that includes manufacturing execution systems (MES), plant automation systems, and other plant-centric technologies seem more attractive.

For background information on this acquisition, see The Importance of Plant Level Systems, Multipurpose SAP NetWeaver, and Enterprise Resource Planning Giants Eye the Shop Floor.

In fact, there have been numerous examples of other large plant-centric vendors (including the likes of ABB, Rockwell Automation, General Electric [GE], and Siemens) acquiring an array of companies and products (such as the former Skyva, Systems Modeling, IndX, and Datasweep), thus enabling them to build a broader, integrated, single-source MES scope. SAP's acquisition of Lighthammer might suggest that such manufacturing floor ventures of enterprise applications vendors are more than merely the knee-jerk reaction of a long overdue and much anticipated spending increase in the plant-level software market (see Do Chinese Enterprises Really Need MES and WMS? and The Challenges of Integrating Enterprise Resource Planning and Manufacturing Execution Systems).

Plant floor applications are generally very different from each other, even though their vendors deliver somewhat generic solutions, since continuous flows, discrete piece production rates, temperatures, pressures, and other manufacturing process parameters are common across many manufacturing applications. Still, owing to a dearth of standardized plant-level processes, bundled with a raft of manufacturing styles and industry-specific regulatory compliance (and consequently quality assurance) requirements, user organizations have typically implemented applications on a system-by-system basis. This is in part a response to firefighting requirements defined by department managers, manufacturing engineers, and equipment or process vendors.

This diversity of applications affects one of the major roles of the plant execution system, which is to collect and pool data from the real time processes for delivery to planning level enterprise applications, including enterprise resource planning (ERP) and supply chain management (SCM) systems. This is because, while mainstream ERP vendors have invested in making their products more attractive to specific vertical markets, they cannot really afford to deliver specialized functionality unless there is a large market.
As with its earlier appetizing acquisitions, such as those of TopTier, TopManage, and A2i (see SAP Acquires TopTier to Further Broaden Its Horizons and SAP Bolsters NetWeaver's MDM Capabilities; Part Four: SAP and A2i), the Lighthammer deal should provide SAP with several benefits. For one, the two parties have quite close product DNAs, since Lighthammer has long been a strategic marketing and development partner, with a manufacturing-centric product, which is now delivered as an SAP xApp-like composite application.

Lighthammer, formerly an SAP partner, had worked to create technology integration between its products and the SAP architecture, so reconfiguring Lighthammer as an SAP composite application running on SAP NetWeaver should present no special difficulty. With the acquisition of Lighthammer, SAP gains workflow-based connectivity to virtually any source on the plant floor and analytical functionality with Lighthammer's products for plant intelligence. This meshes well with SAP's recent business intelligence (BI) dashboard forays (see Business Intelligence Corporate Performance Management Market Landscape).

Furthermore, a high percentage (over 85 percent) of Lighthammer's approximately 150 clients are also SAP clients, a fact which should help SAP manage these clients' expectations. In addition, the improved plant-level functionality should make SAP more competitive in non-SAP environments as well. In particular, SAP's existing non-Lighthammer manufacturing clients should benefit, because they should gain greater flexibility in integrating multiple plant floor solutions with SAP. On the flip side of the coin, the vendor has pledged to support existing Lighthammer-only customers for a period of time. However, logically, the value of operating in this mode would decrease if customers are not going to pursue an SAP-centric strategy in the long term.
We concur with AMR Research's finding, in its November 2005 report SAP Plus Lighthammer Equals xMII, that there are ample opportunities for vendors to amplify xApp Manufacturing Integration and Intelligence (xMII) in terms of data historians or operational data stores, industry-based manufacturing models and KPIs, data mining add-ons to enable proactive, model-based decision making, etc. xMII performance management product functionality is moving in the direction of enhanced alert and event management, knowledge management, real time economics, and directed workflows, which will be key to encapsulating the information and capabilities needed to make better and faster decisions at multiple levels within manufacturing. Over time, xMII will leverage selected SAP technologies, such as the new SAP Visual Composer, which reduces the effort required to develop user interfaces (UI), but will present a challenge to users who have to adapt to the change.

Another weak area that SAP acknowledges is their inability to structurally improve manufacturing processes themselves. This is due to the fact that it is incredibly difficult to map what is happening on the shop floor in detail to, for example, the business systems or the costing systems. It is even more difficult across multiple plants, as the vendor has to provide customers with the ability not only to get the workflows right, but to assemble the data needed to do structural improvements. For this, one would need a plant-level analytic server that could unify data from multiple process control systems into a single contextual database in order to capture, process, and transform real time raw data into intelligible monitoring, counting, and measuring information that could be used by planning and other systems.

The Lighthammer acquisition may compound the above problem. So far, Lighthammer's raison d'être has been mostly to provide visibility into disparate plant systems for root cause analysis, or, to put it another way, merely to take raw data and distill it on the screen. Unfortunately, SAP has never owned the complex data models or analytical tools for process discovery. Somewhat ironically, this gap might provide many opportunities for other vendors that sell plant-focused applications at many levels of manufacturing and value-network solutions, and might even allow them to use Lighthammer as the integration toolkit and interface to SAP. These vendors may include Invensys/Wonderware, Rockwell/Datasweep, Camstar, Accumence (formerly Visibility Systems), Visiprise, PEC Info, DSM, Activplant, Informance, OSIsoft, Pavilion Technologies, CIMNET, GE Fanuc, Citect, Siemens, Yokogawa, and PSI-BT, to name only a few.

A further challenge for SAP will be establishing themselves as a trustworthy partner to independent software vendors (ISV) that are afraid of being acquired in order to build the necessary ISV ecosystem (see SAP NetWeaver Background, Direction, and User Recommendations). SAP also has to clarify for potential plant-level ISV partners how to use xMII as an underlying platform for delivering preconfigured industry templates and systems.

Another uncharted area is the proliferation of Lighthammer to the discrete manufacturing industries, since despite having over a hundred joint customers, the focus of this relationship has been predominantly in process manufacturing (e.g., chemicals or life sciences) environments. It makes sense for SAP to have started with the process industries because there was more apparent opportunity. Nonetheless, although the acquisition restricts Lighthammer competitors from further penetrating SAP process manufacturing accounts, the next challenge is for both merging parties to respond to the unique needs of discrete manufacturers for standards-based interoperability and plant-level requirements within the automotive, aerospace, high technology, and other discrete manufacturing industries. In the end, the vendor hopes to achieve the maximum commonality between the two sectors, but that is going to be neither quick nor easy.
Even in light of the acquisition of Lighthammer, and given the natural question of what the acquisition means for other plant-level SAP software partners, SAP maintains that it will remain fully committed to strongly supporting and growing these partner relationships, and that it does not expect that this acquisition will interfere with that. Indeed, SAP may have an industry-wide ethical responsibility to stick to this agreement. Exemplifying this, a group of leading manufacturing companies and software vendors endorsed the Instrumentation, Systems, and Automation Society's (ISA) ISA-95 Enterprise-to-Control System Integration standards and World Batch Forum's (WBF) business to manufacturing markup language (B2MML) at a recent plant-to-business (P2B) interoperability workshop hosted by SAP and ARC Research. Workshop attendees also discussed the establishment of an open vendor and user consortium to share knowledge and best practices for plant floor to business integration and to provide compliance certification for use of B2MML and related standards. In addition to SAP and ARC, participants included representatives from Apriso, Arla Foods, Datasweep, Dow Corning Corporation, DuPont Engineering, Eli Lilly, Emerson Process Management, Empresas Polar S.A., GE Fanuc, General Mills, Invensys-Wonderware, LightHammer, MPDV, MPR de Venezuela, OSIsoft, Procter and Gamble, PSI Soft, Rockwell Automation, Rohm and Haas, SAB Miller, Siemens, and Yokogawa, as well as representatives from ISA and WBF. Ever since this endorsement, the progress in terms of leveraging ISA-95 as a standard and the WBF's B2MML as an appropriate schema for the process industries has been remarkable.

Similarly, two years ago or so, in response to growing customer need, SAP announced the industry wide "manufacturing interoperability" initiative, the aim of which was to dramatically reduce enterprise-to-production systems integration costs using available industry standards. For more details see Multipurpose SAP NetWeaver.

SAP is not to blame for having planning solutions that, in some cases, may not extend deep into the plant floor, since the lack of integration is in part because the involved parties in the software industry have tacitly agreed to divide the software world into disjointed sets of vendors—the automation vendors, the MES vendors, and the ERP or enterprise applications vendors. Viewing things with this old and outgoing mindset has brought many endeavors to a halt, due to the question of where the line between MES and ERP is. As discussed in The Challenges of Integrating Enterprise Resource Planning and Manufacturing Execution Systems, the answer is often that there is no one clear line.

For example, due to the notion of "plant-centric ERP" in the 1990s, vendors, such as the former Marcam in process manufacturing and Baan in discrete manufacturing, had deep manufacturing models that challenged the artificial boundaries between ERP and MES. Along the same lines, Oracle plans to add more built-in plant-level functionality in the upcoming Oracle e-Business Suite Release 12, precluding the need for the typically extensive and painful customization outside of Oracle's toolset. Even smaller ERP vendors have been adding industry-specific functionality, for example Ross Systems (now part of CDC Software) for pharmaceutical and life sciences companies, IQMS for plastic processors, and Plexus Systems for automotive customers.

When it comes to SAP, it has a lot of customers that use SAP functionality to tie directly into low-level shop floor systems. However, there are also SAP customers that at the same time—at another site or division—use SAP in conjunction with an MES system. In the end, SAP will likely compete in the marketplace where it feels its functionality is competitive enough compared to other solutions. But the vendor acknowledges that customers have, for good reasons, installed other solutions, and will continue to do so. Thus, in order to be a trusted platform provider, it will want to be able to integrate with those systems.

Has SAP Nailed Plant Level Leadership with Lighthammer?

At the end of June, SAP announced that it was delivering enhanced connectivity between the plant floor and the enterprise by acquiring Lighthammer Software Development Corporation (www.lighthammer.com), a privately-held supplier of enterprise manufacturing intelligence and collaborative manufacturing software, based in Exton, Pennsylvania (US). Lighthammer and SAP shared a vision of adaptive business networks (ABN), as illustrated by their longstanding partnership, during which Lighthammer was a premier "SAP Powered by NetWeaver" and SAP xApps partner. The company's approximately sixty employees have reportedly remained in their current facilities, and have become a part of SAP America and SAP Labs. Mufson Howe Hunter & Company LLC, a Philadelphia, Pennsylvania (US)-based investment bank, served as financial advisor to Lighthammer on this transaction.

At the time of the announcement, the two merging parties and formerly close partners believed that the acquisition would deliver value through improved manufacturing performance with more rapid time-to-value for SAP's installed base of more than 12,000 manufacturing customers. Lighthammer's Collaborative Manufacturing Suite (CMS), currently used by hundreds of companies worldwide, including more than 100 Fortune 500 manufacturing companies, was to be delivered as an SAP xApps composite application on the SAP NetWeaver platform, so as to provide enterprises with what SAP refers to as adaptive manufacturing (i.e., the ability of a manufacturer to profitably replenish the supply chain while dynamically adapting to unpredictable change). For background information on this acquisition, see The Importance of Plant Level Systems, Multipurpose SAP NetWeaver, and Enterprise Resource Planning Giants Eye the Shop Floor.

Lighthammer CMS has been re-branded as SAP xApp Manufacturing Integration and Intelligence (SAP xMII). Built on a modern, service oriented architecture (SOA)-based foundation, the former Lighthammer CMS provided a broad set of services that were required to relatively quickly assemble operational excellence applications in the areas of performance management, continuous improvement, and operational synchronization. The initial version of xMII is basically the former Lighthammer software, re-released in accordance with SAP software production methodology. Moving forward, the xMII team charter will be to help SAP manufacturing customers achieve better business performance through the synchronization of operations with business functions and continuous improvement. This translates into packaged manufacturing integration and intelligence solutions targeted for real time performance measurement. On the integration front, xMII will maintain a considerable degree of autonomy, but will also be closely associated with SAP NetWeaver, running on the SAP NetWeaver Web Application Server (WAS). This is because autonomy is required to match the unique product needs of manufacturing operations that are non-SAP shops or are driven by limited on-site information technology (IT) resources and skills, both of which can be an obstacle to leveraging the complex NetWeaver stack.

The SAP xMII solution will provide near real time visibility to manufacturing exceptions and performance variances, including root causes and business impacts. This will enable manufacturers and their production personnel to better adapt to change and to more rapidly respond to unforeseen demand and supply events. In addition, this combination reportedly will permit SAP to deliver real time transactional integration between enterprise resource planning (ERP) and plant floor systems. Another potential benefit will be the ability to provide unified, real time analytics and visualization, often referred to as manufacturing intelligence or plant intelligence, out of the box to manufacturing customers. Moreover, with the xMII solution, SAP is also aiming to enable user companies to leverage their current investments at a lower total cost of ownership (TCO). For more information, see Plant Intelligence as Glue for Dispersed Data?.

Using the Instrumentation, Systems, and Automation (ISA)-95 standards for process manufacturing interoperability (an emerging standard for interfacing low level industrial control level [ICL] code to business applications, which aims to further reduce the complexity of building custom connections to shop floor systems and thereby accelerate the time-to-value for the end customer), the Lighthammer and SAP manufacturing solution will exchange data and render them through SAP manufacturing intelligence dashboards, in order to deliver actionable intelligence in the form of alerts, reports, key performance indicators (KPI), and decision support to production personnel for right-time decision making (see Manufacturer's Nirvana—Real Time Actionable Information and SAP NetWeaver Background, Direction, and User Recommendations). The combined solution will thus allow production personnel to identify deviations in real time, provide drill-downs so as to understand the business and financial impact of the exceptions to be managed, and display the workflows so as to resolve them relatively rapidly and cost-effectively. The aim, of course, is improved productivity.
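To make the ISA-95-style data exchange described above more concrete, the following Python sketch assembles a minimal production-performance message. The element names loosely echo the B2MML schemas that render ISA-95 in XML, but they are illustrative only; this is not the actual SAP xMII payload, and the batch identifier and quantities are invented.

```python
import xml.etree.ElementTree as ET

def build_performance_message(batch_id, produced_qty, unit):
    """Assemble a minimal ISA-95-style production-performance message.

    Element names loosely follow the B2MML conventions; this is an
    illustrative sketch, not the actual SAP xMII message format."""
    root = ET.Element("ProductionPerformance")
    ET.SubElement(root, "ID").text = batch_id
    response = ET.SubElement(root, "ProductionResponse")
    segment = ET.SubElement(response, "SegmentResponse")
    data = ET.SubElement(segment, "ProductionData")
    ET.SubElement(data, "Value").text = str(produced_qty)
    ET.SubElement(data, "UnitOfMeasure").text = unit
    return ET.tostring(root, encoding="unicode")

# Hypothetical batch reporting 1,250 liters produced
xml_msg = build_performance_message("BATCH-0042", 1250, "L")
print(xml_msg)
```

A message like this would be generated at the plant level and consumed by the business application (or vice versa for a schedule), which is precisely the custom connection work the standard aims to reduce.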
One idea that has been gaining in popularity lately is the inclusion of a value-adding process layer that can fairly easily link to scattered data sources, retrieve specific data, perform process logic, and deliver a meaningful output. Companies are applying manufacturing (plant) intelligence systems, such as the one supplied by Lighthammer, to aggregate appropriate information from plant-focused data sources into a meaningful context for presentation and analysis. These systems are a combination of integration or middleware platforms and business intelligence (BI) applications, since portals can aggregate and process manufacturing data for specific user communities, and then can share scheduling information across collaborative value chains. On the other hand, manufacturing intelligence systems can collect specific data from plant-focused devices and systems, and then analyze and present the information in dashboards and other KPI tracking systems. For more information, see Plant Intelligence as Glue for Dispersed Data?.
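To illustrate the KPI-tracking side of such a manufacturing intelligence layer, here is a minimal sketch of one metric these systems commonly compute, overall equipment effectiveness (OEE). OEE is a standard industry metric rather than anything specific to Lighthammer, and the figures below are invented.

```python
def oee(planned_min, downtime_min, ideal_rate, total_units, good_units):
    """Overall equipment effectiveness = availability x performance x quality."""
    run_time = planned_min - downtime_min            # minutes actually running
    availability = run_time / planned_min            # uptime share of the plan
    performance = total_units / (run_time * ideal_rate)  # speed vs. ideal rate
    quality = good_units / total_units               # first-pass yield
    return availability * performance * quality

# Hypothetical shift: 480 planned minutes, 60 down, ideal rate of
# 2 units/min, 700 units made, 665 of them good
score = oee(480, 60, 2.0, 700, 665)
print(round(score, 3))
```

A plant intelligence dashboard would compute such figures continuously from plant-floor counters and present them alongside the business context drawn from ERP.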

Integral to Lighthammer is the concept of non-intrusive connectivity, allowing legacy data sources to be integrated into the overall enterprise decision support scheme with minimal effort and no disruption to operations. The product's connectivity is not limited to data sources, as it can deliver information to a broad range of Web devices, including all major browsers, handheld or palmtop devices, Web phones, and enterprise applications. The visualization functionality includes a variety of charting components, support for wireless devices, and a set of wizards for automatic generation of Web page content for users with little or no technical expertise. There is also an animation editor in the Lighthammer technology that enables users to animate objects. For instance, one might want to be able to see a vessel actually filling up and see the level changing.

A comprehensive reporting module allows content from multiple data sources to be aggregated and correlated in a single report, which can be either "live" or static, and displayed in a browser, printed, or disseminated via e-mail. For some time, the product also has provided an "enterprise" option for multi-site views of production and manufacturing operations. This option enables multiple Illuminator (a core component of the former Lighthammer CMS suite that features solid extract, transform, and load [ETL] capabilities) servers throughout the business to provide a single, unified view of enterprise information. This allows, for example, a corporate process engineer to assist plants with process problems, or a production executive to view real time manufacturing results at a number of sites from a single web browser.

The Lighthammer technology connects to the three areas that users need to reach:

1. It connects to the main SAP modules.
2. It connects to the dashboard, so that users have KPIs coming out of both the SAP environment and the manufacturing systems.
3. It connects to a BI platform, which is useful as the data warehouse (i.e., SAP BW) environment is an important source of information. For example, a customer might want to capture information about reason codes for failure, so that when things are not made as they are supposed to be, all that information is captured in a data warehouse.
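The reason-code example in point 3 can be sketched simply: once failure codes land in the data warehouse, a Pareto count over them is the kind of continuous-improvement analysis that feed enables. The codes below are invented for illustration.

```python
from collections import Counter

# Hypothetical reason codes reported with each failed batch; in the
# scenario above these would be captured in the data warehouse (SAP BW)
reason_codes = ["viscosity_high", "temp_drift", "viscosity_high",
                "contamination", "viscosity_high", "temp_drift"]

# Pareto ranking: which failure causes dominate, most frequent first
pareto = Counter(reason_codes).most_common()
print(pareto)
```

Surfacing this ranking back on a production dashboard is one way the loop between the BI layer and the shop floor gets closed.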

The problem is that, while information comes from production operations, goes to a data warehouse, and is viewed by the business, the very people who fed the information in typically do not see the data. In fact, because failures on the shop floor ripple up into the business, sharing information from the BI layer through the manufacturing intelligence dashboards can in some cases be as valuable as getting the information from the production level. For this reason, Lighthammer touts its ability to enable adaptive manufacturing by providing the business context for manufacturing data through event-based integration, in order to close this loop between the business and production levels.
At some SAP events, the two formerly independent partner vendors related a scenario-based example that was modeled around a paint process, which had both process industry characteristics (e.g., using reactors and vessels that handle liquids and fluids) and consumer packaged goods (CPG) industry characteristics (in that material is packaged and ultimately put in a warehouse or on a shelf).

The process that the SAP and Lighthammer teams have developed starts with material being added to a mixing and reaction process, whereby the product is extracted from the reaction, and then filtered, dried, and placed as an intermediate in cans. This particular process is also applicable to the pharmaceutical industry. The product is then packaged, palletized, labeled, and shipped to a distribution center, where quality tests are performed and the ISA-95 integration standard is employed to exchange schedule and performance data between the ERP and plant-level applications. To eliminate any latency or lack of synchronization, the production plan update and associated master data are automatically transmitted to the plant floor via SAP XI using the ISA-95 integration standard. The production plan synchronizes the plant systems, so that performance data, including status, costs, and quality information, are fed back into SAP in real time.

To be precise, the production schedule is sent from mySAP ERP to Lighthammer CMS (now SAP xMII), transmitted to the automated system, and then displayed on the manufacturing dashboard. After the batch is executed, Lighthammer aggregates production performance data and automatically updates the mySAP ERP inventory. Needless to say, the solution also tracks how things are developing throughout the batch, capturing not only the start and end points of a batch, but also the intermediate ones. Thus, based on the data it captures as the batches are being manufactured and on some Six Sigma control analysis, Lighthammer technology detects quality problems, generates alerts, and quarantines the batches in mySAP ERP.
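The quality-detection step can be sketched with a basic control-chart test: flag any reading that falls outside the mean plus or minus three standard deviations of recent in-control batches. This is a simplified illustration of the kind of Six Sigma control analysis mentioned above, not Lighthammer's actual algorithm, and the readings are invented.

```python
import statistics

def out_of_control(history, reading, sigmas=3.0):
    """Flag a reading outside mean +/- 3 sigma of recent in-control
    batches -- the basic control-chart test behind the quarantine step."""
    mean = statistics.mean(history)
    sd = statistics.stdev(history)  # sample standard deviation
    return abs(reading - mean) > sigmas * sd

# Hypothetical in-spec measurements from recent batches (e.g. pigment
# concentration), followed by a clearly anomalous reading
history = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0]
if out_of_control(history, 11.5):
    print("quality alert: quarantine batch in mySAP ERP")
```

In the scenario above, a positive result would generate the alert and trigger the batch quarantine transaction back in the ERP system.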

Quarantining a batch based on an anomaly in the process is the epitome of a closed-loop behavior. Production quality alerts appear in the dashboard, and the production supervisor can then drill down into the alert to perform a rapid root cause analysis. At this point, it is important to have not only the visibility to stop or change the process, but an understanding of why this problem has occurred so as to prevent it from reoccurring. The final stage would thus be the production supervisor initiating a corrective action to fix the problem, resolving the exception before it becomes a customer issue in an effort to have a continuous improvement environment.

Another often presented scenario leverages radio-frequency identification (RFID) technology. In this scenario, one might have paint cans containing a certain color or a certain blend that are moving more quickly than others. RFID-enabled business processes would indicate the pattern of these cans on the floor. In addition, the notification of material available for shipping would occur automatically and immediately. What one would like to be able to do is to respond at the manufacturing level to this change on a "now what?" basis. For example, the sales department might want to rapidly capitalize on this opportunity. In this scenario, the production plan can be re-aligned in real time, based on the actual capability to deliver or the capability to promise (CTP), and the transient opportunity can be successfully realized since one has the ability to respond.

With the above scenario, we are talking once again about a closed-loop application, whereby Lighthammer receives the schedule and master data from SAP, and Lighthammer in turn uses SAP XI to deliver real time alerts and KPIs to the SAP dashboard. The dashboard itself is a composite application consisting of the XI views, the KPIs, and any accompanying alerts. There might be alerts coming out of the SAP environment and out of the external plant level systems as well, in which case Lighthammer would be monitoring conditions, calculating KPIs, and further applying execution logic.

The possible value of this for customers could be multifold. First of all, it is a closed-loop system with real time synchronization—when a plant manager is looking at data from Lighthammer on his or her dashboard, it is live data. Moreover, users have control over how often the data is sent to the screen, which is done automatically in the background. The business implications of quality performance and delivery issues on the shop floor are thereby quantified and made visible, while proactive exception detection is supported to minimize the overall supply chain impact. In addition, production personnel are empowered with a productivity tool that enables them to access all the relevant documentation on one single dashboard or system, in order to manage by exception, leverage the dashboard as a decision support environment, perform tasks assisted by automated workflows, and initiate improvements and monitor their impact with the KPI dashboard.
In May 2005, almost immediately before the acquisition, Lighthammer unveiled CMS 11.0, which was a major upgrade of the flagship product, featuring enhanced scalability, multisite metrics, security, and traceability for regulatory compliance of the composite platform for building manufacturing intelligence applications. The new release also added features that extended the development environment's existing performance management, continuous improvement, and operational synchronization capabilities, which were SOA-based. Importantly, the new capabilities aimed at helping developers to more easily build and deploy applications that can be accessed across the distributed manufacturing enterprise. At least 60 percent of the code in version 11, which had been under development for about a year, had reportedly been rewritten. For end users, this might mean about 15 percent more functionality and a complete upward compatibility with existing applications.

Among the most significant enhancements to version 11.0 was the Security Manager service, which added unified user management and single sign-on capabilities for run-time applications. This means users will be able to access any CMS-built application regardless of the platform on which it runs. Therefore, CMS, which previously operated only on Microsoft Windows-based systems, can now run on other operating environments, such as Linux, Sun Solaris, and HP-UX. The service also allows integration with a wide range of third-party authentication systems, including SAP, lightweight directory access protocol (LDAP) using Active Directory, security assertion markup language (SAML), Windows Domains, Kerberos from Massachusetts Institute of Technology (MIT), and others. These features should allow customers to manage user roles, memberships, and attributes better, as well as to define authentication or authorization services either from existing enterprise user management directories or through the Lighthammer application. This service should thus provide the ability to implement a security strategy that could fit virtually any existing enterprise architecture and should extend "single sign-on" into the domain of plant applications, improving compliance.

Additional compliance and traceability features that were added include an electronic signature service and a multilevel confirmation or challenge capability, which securely controls and documents user actions for regulatory compliance with 21 Code of Federal Regulations Part 11 (21 CFR 11), Sarbanes-Oxley (SOX), and other regulations. The enterprise application integration (EAI) capabilities have also been enhanced, with the addition of new business logic capabilities that take advantage of Web services in SAP NetWeaver to simplify data integration between plant systems and enterprise systems. Last but not least, the generally available CMS version 11.0 laid the groundwork for another upgrade set. Currently, the product is built mostly on Java, but the logic engine is based on Microsoft .NET. The next release, however, will be 100 percent Java-based, which should give customers a much broader choice of development platform.
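To illustrate what an electronic signature service records, here is a sketch of a hash-chained audit entry, one common way to make an action log tamper-evident for 21 CFR Part 11-style compliance. It is not Lighthammer's actual implementation; the user, action, and timestamp are invented, and the timestamp is fixed to keep the example deterministic.

```python
import hashlib
import json

def sign_action(user, action, prev_hash):
    """Record a user action as a hash-chained audit entry: each entry's
    hash covers its content plus the previous entry's hash, so silently
    altering history breaks the chain. A simplified compliance sketch."""
    record = {
        "user": user,
        "action": action,
        "timestamp": "2005-05-20T14:30:00Z",  # fixed for illustration
        "prev": prev_hash,
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["hash"] = hashlib.sha256(payload).hexdigest()
    return record

# First entry in a hypothetical trail; the chain starts from all zeros
entry = sign_action("jsmith", "quarantine BATCH-0042", "0" * 64)
print(entry["hash"])
```

An auditor (or the application itself) can recompute each hash from the stored fields and the previous hash to verify that no entry has been altered after the fact.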

Lighthammer's process manufacturing industry expertise and foresight in developing intelligent manufacturing middleware was helped by its early commitment to open technologies like the ISA-95 standard, Java, extensible markup language (XML), and SOA. Even earlier releases featured Lighthammer's leadership in the deployment of these open technologies as enablers for acquisition, analysis, distribution, and presentation of information from manufacturing systems. Lighthammer CMS functionality has long included built-in transformation of data into any standard XML message structure, such as Microsoft BizTalk, RosettaNet, and others, as well as the ability to interface with peer plant-level or enterprise-level systems using XML as the default data format for both incoming and outgoing data. Back in 2001, Illuminator 8.5 introduced a breakthrough Intelligent Agent subsystem, which could be used to enable inter-application messaging upon detection of production events or exceptions; automated calculation of KPI metrics; automatic transfer of information between XML, database, and e-mail sources; the gathering and conversion of data from external Web sources; and much more.

Friday, October 2, 2009

Microsoft’s Underlying Platform Parts for Enterprise Applications: Somewhat Explained

What About Visualization and User Interface (UI) Technologies?

However, what has somewhat intrigued me is Microsoft’s not-so-vocal touting and promoting of Windows Presentation Foundation (WPF), although it is an intrinsic part of the .NET Framework. In fact, to the best of my knowledge, the tool has not yet been used within the Dynamics set in earnest, although Lawson Software and Verticent are the two independent software vendors (ISVs) that I am aware of deploying it.

Both vendors tout WPF’s rich UIs that support virtually infinite customizations and business process compositions using Microsoft applications. Other Microsoft-centric ISVs either support only a limited number of specific and prescriptive business scenarios, or use a combination of technology products (for example, Microsoft Office Business Applications (OBAs), Visual Studio.NET, and proprietary interfaces and UI tools) to come up with similar custom scenarios. Again, Microsoft currently uses WPF very selectively in Dynamics UIs, for example, in the Dynamics AX graphical view of the organization structure of the business.

With its Smart Office offering, Lawson is not the first to leverage Microsoft Office to deliver not only manager and employee self-service, but much more as well. In fact, I can think of the joint SAP and Microsoft Duet product, Epicor Productivity Pyramid, QAD .NET UI, SYSPRO Office Integration (SOI), IFS Business Analytics, and so on.

However, by leveraging WPF, Lawson embeds manager and employee self-service functionality more directly into Microsoft Outlook than Duet (which is more of an add-on launched from Outlook as an integrated pane) and most other vendors’ OBA solutions. For more details on Lawson Smart Office, see my earlier blog post on the vendor’s CUE 2008 conference and the Gartner Dataquest Insight report by Bob Anderson entitled “Lawson Raises the Bar With Differentiating ERP User Interface.”

Curiously, Lawson has deployed another non-mainstream Microsoft technology, Microsoft Office Groove. It is a peer-to-peer (P2P) collaboration platform, providing an outstanding base for collaboration (document exchange) scenarios that involve teams with sometimes disconnected participants. Microsoft claims that future product releases will improve the alignment for collaboration between Groove and SharePoint.

Lawson’s technology decision was likely owing to Groove’s concept of “shared workspaces” and Lawson’s view that individuals live in a “space” where they do most of their work. For example, a manager really “lives in” Microsoft Outlook, and should be able to do all his/her work from there. An accountant lives in Microsoft Excel and should be able to work from there. A mobile technician lives in the cell phone/personal digital assistant (PDA) metaphor, where a UI similar to that of the Apple iPhone or Palm Treo can come in handy.

Some Other Vendors’ UI Approaches

Still, although WPF provides a visually appealing, familiar and intuitive UI, it comes with some trade-offs, specifically in memory utilization (being hardware intensive), the need to be hooked to the network, and a much greater dependency on Microsoft software. For instance, IFS doesn’t use WPF today for IFS Applications’ UI simply because of hardware needs: running WPF requires quite a hefty PC in terms of memory, and preferably the (possibly still unstable) Windows Vista platform.

We are talking here about IFS’ upcoming next-generation UI, which had for some time been called Aurora, but is now called IFS Enterprise Explorer (IEE). Namely, to prevent any confusion about Aurora being a separate product from IFS Applications, IFS has recently clarified its naming conventions.

Aurora is now a development project that will yield several enhancements to IFS Applications, all with a focus on ease-of-use and user productivity. The first deliverable as part of the Aurora project is IEE, the new graphical user interface (GUI) for IFS Applications. It is important to note that after IEE is released, the Aurora project will continue, yielding future enhancements.

In any case, IEE is interesting, to say the least, for leveraging Microsoft UI technology to create the look (albeit not yet the multi-touch screen, hand gestures, etc. feel) of the Apple iPhone (on top of an Oracle database and Java-based application servers on the back end: some mix of technologies from adversaries, indeed). It is becoming quite obvious that the iPod and iPhone generation is our future workforce, who require well designed tools that they “love” to interact with. At the same time, they accept no excuses for “Why can’t I…?” questions, such as, for instance, “Why can’t I search in the enterprise application in the same way that I search on Google?”

At the end of the day, the design goal is to achieve more with fewer staff members, who thus have broader responsibilities, are able to handle the unexpected, collaborate with colleagues, and be more productive. In other words, the market drivers are the new and engaging design and user productivity. Consumer information technology (IT) and the web are leading the way, and are also becoming quite important for business applications.

To that end, prior to the IEE undertaking, IFS developed a pervasive enterprise search engine that attempts to think the way people think (e.g., “I need that fault report about the fire alarm not working”), and not the way enterprise systems think (i.e., “I want to go into the preventive maintenance module where, in the service request folder, I will start the fault report screen, in which I shall then make a query on the description field containing any words followed by the words ‘fire alarm’ followed by any other words again”).

With built-in security (users can be limited in search authorizations as required), the enterprise search capability promises better results and value without additional cost. For more information, see TEC’s article entitled “Why Enterprise Application Search Is Crucial to Your ERP System.”
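The contrast between people-style and system-style queries, plus the built-in security just described, can be illustrated with a toy free-text search that filters results by the user's roles. All names and records below are invented; real enterprise search engines use far richer indexing and ranking.

```python
# Toy document store; "roles" lists who is authorized to see each record
DOCS = [
    {"id": 1, "text": "fault report: fire alarm not working",
     "roles": {"maintenance"}},
    {"id": 2, "text": "fault report: conveyor belt jam",
     "roles": {"maintenance"}},
    {"id": 3, "text": "payroll adjustment for july",
     "roles": {"hr"}},
]

def search(query, user_roles):
    """Free-text search: match every query term anywhere in the record,
    then keep only records the user is authorized to see."""
    terms = query.lower().split()
    return [d["id"] for d in DOCS
            if d["roles"] & user_roles
            and all(t in d["text"] for t in terms)]

hits = search("fire alarm", {"maintenance"})
print(hits)
```

The point of the sketch is the shape of the query: the user types a phrase, not a module, folder, screen, and field path, and authorization is applied transparently at result time.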

Show Me, Don’t Sell Me

Clearly, the easiest way for a vendor to allay my fears is to marry their feature list to my to-do list, and show me how to use their software for some of the things that I do every day.

And because I’m an inveterate YouTube addict (as are many of us office folk, I think) the best way to show me is to make some videos. Short ones that get right to the point so I can watch them while I’m eating lunch and still make my 1:00 meeting.

Looking around for videos such as these can be frustrating. Many vendor websites don’t have any. Some sites have them, but force you to register if you want to watch them (as if I needed any more email). Still other sites bury their videos so far down that you’re ten clicks away from finding out that they even exist. Ugh.

Hope is not lost

On the other hand, there are some vendors who, at least partly, get it, and the two that stood out in my quick survey were Microsoft and SAP.

The Microsoft Dynamics site’s introductory series of videos presents the Dynamics product line from the point of view of five “typical” department heads and their cartoon staff. It’s an overt marketing piece, but if you hang on through the first minute of each manager’s spiel, you do get a few nuggets of valuable information.

For example, you can see how different Dynamics products integrate with other Microsoft Office products in the context of actual tasks, like tracking orders, generating reports, setting up marketing campaigns, etc. More to the point, you can see the products’ interfaces, which goes a long way towards forming your gut feeling about each product.

Dig deeper into the Dynamics site and you’ll find demo videos for each of the products. These tend to be a strange mixture of PowerPoint-ish presentations and actual walkthroughs. But again, if you can hang on through the benefit statements (and you’re not put off by images of trains, factory floors, and guys in suits shaking hands and sharing laptops), you’ll get genuine task-oriented information that will give you some idea of what it’s like to work with the products.

SAP has a similarly extensive video library for its Business One product. While the videos aren’t quite as easy to find as Microsoft’s are, I thought they did a better job of connecting the dots. For each demo, SAP lists a few capabilities. When you click through to the video, you’ll notice that it explains each of those capabilities in terms of day-to-day tasks.

You still have to put up with “typical user” personas, dull stock photography, and a few marketing-y bullets, but the SAP videos are pretty well focused on the end user. Which is nice. It’s one thing to know that a piece of software “manages customer interactions, from contact data and history to calendaring and tasks.” It’s another thing to see how a real customer service call might be handled using that software. Especially if you work in customer service.

It’s not that hard

“Of course,” you’re saying, “Microsoft and SAP have the money to do that sort of thing.” But the truth is, it’s not that hard. Just search YouTube for any popular software—Photoshop, for example—and you’ll find a wealth of video tutorials produced by lone users in their spare time. Any corporate marketing department worth its budget should be able to do at least that much. And while Microsoft and SAP might have the resources to polish their videos to within an inch of their lives, for the average end user, good content trumps good presentation any day.

And it’s worth it

Making it easy to find nuts & bolts information about day-to-day software use has important benefits for both buyers and vendors of enterprise software.

If you’re in the market for new enterprise software, and you’re following TEC’s sage advice, one of the things you’ll do in the early stages of your selection project is to ask end users exactly what they do with your current software, and what might help them do it better.

Don’t worry, you don’t have to ask them individually. Instead, make sure that your selection team includes a few people who know, or can find the answers to those questions, and help you turn early-stage user feedback into criteria that you can weigh and analyze relative to all of your other requirements.

When they pass their feedback up the chain to the selection team, users who have seen various solutions in action can point to concrete examples of functionality they’d like to have in the new system. Instead of having to fully explain a complete workflow or a missing feature, they can say “I want something that works like that.” The end-user advocates on the selection team can translate “that” into more quantifiable feature lists and requirements.

As the beleaguered user, what’s in it for me is the feeling that my voice has been heard, which means I’ll be more likely to adopt the new system and less likely to turn into the office jerk.

What vendor wouldn’t want that kind of bottom-up support from their potential customers?

Now I know that end users rarely make the final decision, and that software selection projects tend to be fraught with political considerations that can pull a company in one direction or another. But smart companies—the ones that carefully consider the everyday needs of their employees before making a rational selection—are going to get more bang for their enterprise software buck.

Smart vendors are going to do everything they can to get those employees on their side.

Meridian Systems’ “Catch Up” Challenge in the Capital Infrastructure Industry

Meridian, which promotes its business as the Plan-Build-Operate (PBO) technology solutions leader for Project-Based Organizations (another PBO acronym, and thus the “PBO squared” mantra), offers an end-to-end solution for building owners, construction and engineering firms, and public agencies in three flavors. These offerings respectively cater to high-end (Tier One), mid-market, and small market organizations that manage capital building programs and facility assets.

Meridian’s overall focus is to improve customers’ revenue and profit growth by optimizing facilities, and by reducing construction and facility costs. To that end, Proliance, which is a full-fledged infrastructure lifecycle management (ILM) suite on a native Web services-based platform, is aimed at Tier One high-end PBOs with over US$ 1 billion in revenues and over 500 full-time employees. In this market segment, where the competition comes largely from SAP and Oracle, and with the deals valued from $750,000 to $10 million, Meridian typically wins owing to its OBA strategy, ready BIM and enterprise resource planning (ERP) integrations, Web services-based platform, distinct PBO product breadth, and well-attuned business intelligence (BI) tools for PBOs.

Different Strokes for Different Folks

This part will focus more on Meridian’s forerunner Prolog product for smaller organizations, and on the vendor’s upcoming fourth quarter of 2008 (Q408) release of Prolog Connect for the mid-market.

Prolog was originally introduced in 1993 on a client/server platform, and is in use today by more than 4,000 companies that have revenues from $10 million to $500 million, and from 10 to 100 employees. With typical contract values of less than $150,000, the product grew rapidly across small organizations in the architecture, engineering & construction (A/E/C) sector because of Meridian’s micro-vertical expertise and rich understanding of this space, a usable and intuitive user interface (UI), and easy customization by business users (versus information technology [IT] staff).

Prolog is best suited for the “Build” phase of Meridian’s PBO solution set, and includes more than 400 packaged reports. It manages a wide breadth of activities including purchasing/bid management, budgets and cost management, contract and change management, correspondence management, design collaboration management, daily journal entries, jobsite tracking, and safety and quality programs. Usual-suspect competitors are Primavera and Autodesk Constructware (and occasionally e-Builder).

To modernize Prolog, and also to appeal to larger mid-market companies, Meridian is releasing in late 2008 a new mid-market product, Prolog Connect, which provides a Web services and service oriented architecture (SOA) layer atop Prolog’s Project Portfolio Management (PPM)-oriented product set. Featuring an OBA strategy, secure collaboration with internal users and external supply chains, and flexible integration, Prolog Connect is targeted at companies in the $500 million to $1 billion revenue range or with between 100 and 500 full-time employees (FTEs). When sold together, Prolog and Prolog Connect’s typical contract price is expected to be up to $750,000.

Current State of Affairs at Meridian

Lately, Meridian continues to win with its PBO value proposition for ILM, with deals across a broad segment of public and private organizations. Key recent deals include:

* In the federal government sector – The United States General Services Administration (US GSA), two contracts valued at $2.5 million and $10 million respectively, beating or replacing Skire and Primavera;
* In the energy sector – Ontario Power Generation (OPG), contract valued at $2.2 million, beating or replacing Primavera;
* In the transportation sector – The Illinois Tollway, contract valued at $2.2 million, beating or replacing CapitalSoft;
* In the A/E/C sector – Ryan Companies and DMJM H&N/AECOM, contracts valued at $2 million and $432,000 respectively, beating or replacing Oracle and Meridian’s own Prolog product; and
* In the real estate sector – CB Richard Ellis (CBRE), contract valued at $3.9 million, beating or replacing Bricsnet.

Other notable deals for Meridian include the State of Connecticut, Los Angeles World Airports, and the City of Seattle. Also of interest is that the company uses primarily a direct sales and support model for its upper-range Proliance product, and sells largely indirectly through system integrators (SIs) and value added resellers (VARs) in the small and mid-markets.

Meridian does not want to be in the ERP game, rather it wants to “connect in.” Within the Prolog and Prolog Connect solutions the vendor has pre-built hard connections into major project-based ERP leaders including Deltek Systems. Proliance was built on Web services and in Extensible Markup Language (XML) to allow for multiple points of integration with other applications, including ERP, financials/accounting, document management, etc. Proliance includes its own asset management modules, but can also be integrated with other (more powerful) enterprise asset management (EAM) systems as required.

It is also interesting to note that from the beginning both ProjectTalk (the on-demand version of Prolog) and Proliance OnDemand were multi-tenant offerings (i.e., keeping many customers in one environment rather than dedicating one environment to each customer). Meridian determined early on that this was a much more economical way to achieve the economies of scale needed to reach profitability with its offerings. As for customers, there are many using both systems: Haskell, Hathaway Dinwiddie, and many others are on ProjectTalk, while ISTHA and CBRE are on Proliance OnDemand.

Market Opportunity (and Challenges)

While Meridian is based in the US, it works with a wide partner network, including customers that are turning into VARs, and partners that are looking to sell to emerging markets. A Morgan Stanley report entitled “Emerging Markets Infrastructure: Just Getting Started,” published in April 2008, identifies that a sizable boom in infrastructure building is underway. This PBO surge spans power and water, property, ports, and airports, across both the government/public and private sectors.

Morgan Stanley forecasts US $21.7 trillion in infrastructure spending in emerging markets over the next decade (at least before the onset of the global credit crunch). The report identifies a surge in market listings of owners, operators and contractors to build infrastructure/assets in emerging markets, and states the number of listed infrastructure-related entities therein is up from 230 to 354 (54 percent increase) over the last five years. Morgan Stanley sees huge market capitalization — increasing from $146 billion to $1.1 trillion over this same 10 year period.

But what about the big enterprise software competitors who also play in these markets, and who indisputably own the IT departments’ mind share? Meridian’s President and Co-Founder John Bodrozic, quoted in Part 1 of this series, boldly says “bring it on” when queried about competitive consolidation, such as Oracle’s recent acquisition of Primavera.

“Oracle now has two products that do the same thing: Oracle Projects and Primavera, and the real question for installed users will be, “which one lives and which one dies” or will it continue its six year history of letting multiple products do the same thing (e.g., JD Edwards, PeopleSoft, etc.) but that have zero interoperability? Oracle’s published Frequently Asked Questions (FAQ) document on the acquisition states that Oracle plans on integrating Primavera PPM to “Oracle ERP,” but never states which one of Oracle’s ERP products. With nearly 50 acquisitions in a few years, one wonders how any buyers or sellers can make sense of which products can and should work well together in which instances.”

While I can understand the Meridian president’s confidence in his holistic ILM/BIM offering, I certainly would not dismiss the Primavera acquisition. At the very least, I agree with Vinnie Mirchandani’s favorable view of the deal and of Oracle’s vertical industry-based acquisition strategy, backed by a coherent Oracle Fusion Middleware (OFM) strategy. Brian Sommer also has an impressive blog post on the PPM software space and on what the deal might mean for Oracle and for Primavera customers.

In addition to its relatively small size and best-kept-secret status when it comes to brand recognition, Meridian’s major challenge could be the fact that, since the ILM space is still new and evolving, there is a steep learning curve in explaining it to customers. This is why the market still often defaults to more simplistic solutions that don’t do the job as well as Meridian does. It is akin to a customer switching light bulbs to save money on energy when there is no insulation in the walls.

There are so many inefficiencies and even adversarial relationships in the industry that unneeded costs and poor practices are built in and accepted. The challenge for Meridian is to build a greater understanding of the big picture impact of a complete PBO product line so that the market doesn’t continue to default to less complete ILM solutions like Primavera, Oracle, Skire, Tririga, etc.

Also, since the BIM/ILM connection is enabled via the partnership with Horizontal LLC, its competitors can emulate that over time (i.e., they can strike partnerships too). Thus, are there any other frontiers that Meridian could tackle next, in order to be ahead of the curve and continue to challenge the market with the “Catch us if you can” mantra? To that end, other potential areas are things like building new automated solutions for pulling data out of Meridian solutions to make the Leadership in Energy and Environmental Design (LEED) certifications turnkey, and exploring the latest construction delivery methods (e.g., private-public partnerships such as lease-leaseback for school districts).

Final Thoughts

In summary, Meridian offers a well-thought-out approach for small to large companies, with the right technical foundation for the future with a native SOA/Web services platform already in the market for the past five years. Additionally, it has integrated business functionality for managing ILM in the complete PBO spectrum. This scope includes the combination that both the market leaders and market pundits are missing: PPM, Scheduling, Facilities Management, and BI. If you are a project-based organization engaged in holistic capital infrastructure lifecycle management, this is one solution you should certainly consider.

Is One Country Good Enough to Handle Your Outsourcing Business?

The concept of a “portfolio” is very prominent in the finance world: “In finance, a portfolio is an appropriate mix of or collection of investments held by an institution or a private individual.” (Wikipedia) Portfolio management now has many different models; some have become very complicated and require tremendous analysis. Simply speaking, the rationale for investing in different assets, instead of betting all the money on one, is that different assets have different return potentials and different risk exposures. If you build your portfolio appropriately, the diversity of your assets may help you offset individual risks while maintaining an acceptable return.

Let’s take a look at an extremely simplified example: if you have the opportunity to buy a bond (low return but risk free) and a stock (high return but associated with high risk), what is your investment decision? The absolutely risk-averse will only buy the bond, and the opposite extreme (the risk-takers) will only buy the stock. However, most people will take some risk (though not too much) while expecting a higher return than the bond can yield. Thus, a mix of the two assets makes sense, and the proportion of each depends on your return expectation, or in other words, on how much risk you are willing to take.

In the investment area, a “hedge” is a widely used method of managing risk. The main idea of a hedge is to include in one portfolio two different types of assets that are related such that when one tends to go down, the other tends to go up, and vice versa. Hence, no matter what the economic and market situation is, the risk of your investment portfolio will be manageable.
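The bond/stock mix and the hedge effect can be sketched numerically. This is a minimal illustration with made-up numbers (returns, volatilities, and correlations are hypothetical, not from any source), using the standard two-asset formulas: the portfolio return is the weighted average, while the portfolio risk also depends on how the two assets move together.

```python
# Two-asset portfolio math with illustrative, hypothetical numbers.

def portfolio_stats(w_bond, r_bond, r_stock, sd_bond, sd_stock, corr):
    """Return (expected return, standard deviation) for a bond/stock mix.

    w_bond is the fraction invested in the bond; the rest goes to the stock.
    """
    w_stock = 1.0 - w_bond
    exp_ret = w_bond * r_bond + w_stock * r_stock
    variance = (w_bond * sd_bond) ** 2 + (w_stock * sd_stock) ** 2 \
        + 2 * w_bond * w_stock * sd_bond * sd_stock * corr
    return exp_ret, variance ** 0.5

# A 60/40 bond/stock mix: the return lands between the two assets' returns,
# and the risk is lower than holding the stock alone.
ret, risk = portfolio_stats(0.6, r_bond=0.03, r_stock=0.10,
                            sd_bond=0.0, sd_stock=0.20, corr=0.0)
print(round(ret, 3), round(risk, 3))  # 0.058 0.08

# The hedge effect: two equally risky assets that move exactly opposite
# (correlation -1) cancel each other's swings entirely.
_, risk_hedged = portfolio_stats(0.5, 0.05, 0.05, 0.15, 0.15, corr=-1.0)
print(round(risk_hedged, 3))  # 0.0
```

The same arithmetic carries over to the vendor-portfolio argument below: the country-level macro risks play the role of the volatilities and correlations.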

Having provided the two examples above, I hope the idea of vendor portfolios becomes easier to understand. First, let’s look at the risks associated with software outsourcing. Besides quality, delivery, support, and similar issues that relate more to individual vendors, there are also risks from the macro-environment:

* Physical risks: natural disasters (e.g., earthquakes, floods, and tornadoes) that will halt or temporarily impede your vendors’ development activities

* Regulatory risks: regulations (e.g. import/export tariffs, taxes, and employee compensation requirements) that will impact your vendors’ business costs and as a result, your cost

* Economic risks: such as exchange rates, employment levels, and vendor domestic market demands that will influence vendors’ pricing policies

* Societal and political risks: caused by political events, strikes, and culture shifts that will directly or indirectly change your vendors’ ability to provide service.

The vendor-specific risks (or, let’s call them micro risks) vary from vendor to vendor, but the macro risks are more related to the macro-environment in which vendors operate. In many cases, it is convenient to examine these risks at a country level.

If we agree that macro risks exist and that many of them vary from country to country, we may draw a conclusion that too much reliance on one single country is like investing all your money in one stock.

By building a portfolio that includes vendors from different countries, a company should be in a better position to manage macro risks. If there are complementary elements amongst those countries, you may expect a hedging situation. For example, when you discover that outsourcing to a certain country becomes unprofitable due to increased programmer wages, you may find that in another country wages are going down due to the surplus of programmers.

Microsoft’s Underlying Platform Parts for Enterprise Applications: Somewhat Explained

Shedding Some “Northern Star” Light on IEE

For IEE, IFS uses Microsoft ClickOnce, a technology designed for Web-based deployment of rich applications. Basically, the authorized user clicks on a link and the application loads straight from the Web server without needing to be installed and distributed via CDs (like traditional client/server applications). It works similarly to the counterpart Java Web Start and Adobe Flash technologies.

ClickOnce can be used for all Microsoft .NET UI application styles, including Windows Presentation Foundation (WPF), Windows Forms, and Silverlight. Basically, it is the deployment technology for Windows applications. IFS decided not to use WPF for building its UI initially, but plans to do so for its next major update due in a couple of years, when it also expects the availability of Microsoft .NET Framework 4.0, which the vendor believes will serve its needs well. It is also currently possible to mix WPF and Windows Forms in the same application, since the interoperability apparently works very well.

In any case, the current set of tools used by IFS has enabled ergonomic design and easy navigational technologies, such as the adaptable links panel, contextual breadcrumb navigation, and rich media. The adaptable links panel is a panel on the screen that shows all the places “where a user can go from here.” For example, when viewing a customer order, the links panel will show links to customer information, the price agreement, the service level agreement (SLA) contract, and other related information (see figure below).


[Figure: adaptable links panel (booklet-p12-1-small-display1.png)]


The contextual breadcrumb is a context-sensitive navigation menu that helps users visually navigate (and return to the start page, as in the classic fable of Hansel and Gretel) to other application areas/pages that are “near” their current “path” in the application. Windows Vista has a similar feature for folder navigation. Related to this is the Visual Recent Screens capability, a visual navigation history showing all pages in the application visited since the user logged on (see figure below). It is also similar to the feature in Internet Explorer (IE) that shows all open tabs.


[Figure: Visual Recent Screens (booklet-p10-3-recent-screens1.png)]



A good example of breadcrumb navigation can be found in Webcom’s WebSource CPQ product catalog and configurator. The product is written in Java and AJAX UI technologies, but note that this navigation mode is technology-agnostic.

A Webcom user might sell categories such as Software, Hardware, and Services. If a potential buyer clicks on Hardware, the system will open up subcategories like Servers and Printers. By further clicking on Servers, the options can be Web Servers, Storage Servers, File Servers, and so on.

While the user is navigating, the system builds “breadcrumbs” at the top of the screen, so that the user knows how he/she arrived at this place and how to go back. The breadcrumb path might look like:

“Home > Top Level Catalog > Hardware > Servers > …> Current location”
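The drill-down and go-back behavior described above can be sketched as a simple trail of visited categories. This is an illustrative sketch only; the class and method names are mine, not Webcom’s actual API, and the categories come from the example above.

```python
# A minimal breadcrumb-trail sketch: drilling down appends a crumb,
# clicking an earlier crumb truncates the trail back to it.

class BreadcrumbTrail:
    def __init__(self, root="Home"):
        self.trail = [root]

    def drill_down(self, category):
        """User clicks into a subcategory: append it to the path."""
        self.trail.append(category)

    def go_back_to(self, category):
        """User clicks an earlier crumb: cut the path back to that point."""
        self.trail = self.trail[: self.trail.index(category) + 1]

    def render(self):
        return " > ".join(self.trail)

nav = BreadcrumbTrail()
for step in ["Top Level Catalog", "Hardware", "Servers", "File Servers"]:
    nav.drill_down(step)
print(nav.render())  # Home > Top Level Catalog > Hardware > Servers > File Servers

nav.go_back_to("Hardware")  # jump back up the trail
print(nav.render())  # Home > Top Level Catalog > Hardware
```

Because the trail is just the visited path, the pattern works identically in Java/AJAX, WPF, or any other UI technology.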

Making it Stick

As for rich media features, they comprise everything that is not a static HyperText Markup Language (HTML) page, such as RealVideo, Adobe Flash, Microsoft Media Player, Microsoft Silverlight, and so on. Previously, these gadgets could only show videos and play music and animation, but now users can write applications on top of them.

In IFS’ case, the most visible use of rich media is the Sticky Notes feature. Basically, the user can put a sticky (“Post-it”) note (only logically, not really physically, duh!) onto any record in the system (e.g., customers, projects, orders, invoices, etc.). The note “sticks” to the record and will be visible to all other authorized users who look at that same record. Inside the note, users can put any content they can put in a regular “rich text” field in Windows.

This content includes, for example, pictures, hyperlinks, video clips, Object Linking and Embedding (OLE) objects (or any embeddable document type), etc. The sticky note enables data to be kept that is not part of the normal system database (as a sticky note would be on a physical desktop), but that can be searched along with the data in the database. This serves the purpose of capturing knowledge in the organization, and not just with an individual (see figure below). Could this capability also be a first date between user communities and enterprise applications?


[Figure: sticky note on a purchase requisition (booklet-p8-1-purchase-requsition1.png)]
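The sticky-note concept above (annotations stored beside the record rather than in its own fields, yet searchable together with it) can be sketched as a simple keyed note store. This is a conceptual sketch, not IFS’ implementation; the record IDs and note text are invented for illustration.

```python
# A sticky-note store: notes attach to record IDs and are searchable.

from collections import defaultdict

class StickyNotes:
    def __init__(self):
        # Notes live beside, not inside, each record's own data.
        self._notes = defaultdict(list)  # record id -> list of note texts

    def attach(self, record_id, text):
        self._notes[record_id].append(text)

    def notes_for(self, record_id):
        """All authorized users viewing the record see the same notes."""
        return list(self._notes[record_id])

    def search(self, term):
        """Find records whose notes mention the term (case-insensitive)."""
        return [rid for rid, notes in self._notes.items()
                if any(term.lower() in n.lower() for n in notes)]

notes = StickyNotes()
notes.attach("order-1042", "Customer asked to delay shipment until Q2")
notes.attach("invoice-77", "Disputed line item; waiting on credit memo")
print(notes.search("shipment"))  # ['order-1042']
```

The key design point is that the notes are indexed by record, so knowledge surfaces wherever the record is viewed, instead of staying with the individual who wrote it down.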


Silver Lining in Silverlight?

But unless Microsoft Silverlight is used, Windows users are tied to the desktop, which means less reach and portability than in browser-based applications. Silverlight (formerly called WPF Everywhere [WPF/E]) provides a runtime browser-based deployment environment for WPF applications written in Extensible Application Markup Language (XAML). Silverlight is a subset of WPF and is designed to run cross-platform to enable Rich Internet Applications (RIA); WPF has some additional capabilities but assumes it is running on a “Windows box.”

Silverlight and WPF had their share of broad platform announcements at the recent Microsoft Professional Developers Conference (PDC) 2008. Relative to business applications-oriented UI design controls, there has been a panoply of new capabilities in the Silverlight toolkit, such as Charting, TreeView, DockPanel, WrapPanel, ViewBox, Expander, AutoComplete, NumericUpDown, and so on.

These controls should all also be available for WPF, which for its part has received controls like DataGrid, DatePicker, Ribbon, Calendar, and VisualStateManager. These controls were also included in the Silverlight 2.0 announcement, although the Ribbon control from WPF is not in Silverlight yet.

These capabilities are in great part what the Dynamics team has been waiting for before jumping broadly onto the Silverlight/WPF bandwagon. In addition, the Microsoft Developer Division is open-sourcing the Silverlight controls, so we can expect to see lots of advanced controls added by ISVs down the track. Indeed, Microsoft acknowledges that visualization is a key area of investment for Microsoft Dynamics products, and as Silverlight capabilities around data expand, Dynamics products will add Silverlight experiences to their common controls.

This is not to neglect the work Microsoft has done around introducing role-tailored user experience (UX) across the Dynamics products, embedding role-specific and contextual analytics directly in the application UX, and introducing both breadcrumb bar navigation and action panes (the Office ribbon-style interface). Independent of what “plumbing” the company uses, these have been pretty dramatic UX changes, and similar to the abovementioned navigation gadgets in some of the other vendors’ products.

Bottom Line: Win-Win for Microsoft

Coming back to the second issue from the beginning of this blog series, i.e., Microsoft Business Division’s (MBD) Profit & Loss (P&L) statement, at the Convergence 2008 user conference, the giant stated the following stats for Microsoft Dynamics:

* A 26 percent revenue growth in Q2 2008;
* Nearly 300,000 customers worldwide;
* Nearly 10,000 business partners worldwide;
* About 1,700 Dynamics solutions in Solution Finder; and
* Over 14,000 customers and over 625,000 users of Microsoft Dynamics CRM.

Now, some nitpickers might say that Microsoft Dynamics is not a profit generator for Microsoft, if not even bleeding money due to all the ongoing product investment. Well, guess what, Microsoft is certainly not in dire need of cash to squeeze it out of Dynamics’ operations.

As some of you might know, now that Dynamics is part of MBD, which contains Microsoft Office, Dynamics, Exchange, Office Live and Unified Communications, the parent company doesn’t report the Dynamics business separately any longer in terms of revenue and operating income. However, Microsoft still discloses Dynamics customer billings figures every quarter, and here are the three data points it has publicly disclosed:

1. In fiscal 2006, the last time Dynamics was an external P&L entity, it achieved profitability in Q4, and was profitable for the full year;
2. In fiscal 2007, Dynamics crossed an important internal milestone of becoming an over US$ 1 billion business; and
3. For fiscal 2007 and 2008, Dynamics has reported a 21 percent growth in billings in each of those years.

But the thing that represents Dynamics’ “extra” contribution is the sale of all those Microsoft platform components to all of the customers of Dynamics. That is to say that Dynamics creates a “pull” for other Microsoft technologies.

Money for the Caviar

Plus, let’s not forget all the revenue and profits coming from related sales of SQL Server, SharePoint, Office, Exchange, and so on to an army of ISVs (many of which are even fierce Dynamics competitors).

Microsoft has also touted and recruited many ISVs for Office Business Applications (OBAs) as a way in which line-of-business (LoB) systems can be seamlessly integrated with the ubiquitous Microsoft Office productivity tools. Business applications are made possible by key platform capabilities, called OBA Services, in the Microsoft Office 2007 system that cater to the following features: workflow; search; the Business Data Catalog (BDC); a new, extensible UI; Microsoft Office Open XML Formats; and the Web Site and Security Framework.

Another example (staying on the IFS theme) is IFS Business Analytics, the first product from the vendor’s Intelligent Desktop initiative. This product is a business intelligence (BI) solution that extends Microsoft Excel from a desktop productivity tool to a full-fledged, enterprise-scale client for planning, reporting and analysis. In other words, users hereby benefit from using IFS Applications (and its embedded security and authorizations) within an already familiar Excel environment.