Tuesday, June 1, 2010

BAAN Announces "Open World": Business-To-Business Collaboration Over The Internet

Today, manufacturing organizations spend more than 35% of their IT budgets integrating disparate systems such as CRM, ERP, and supply chain applications. Yet the level of integration most organizations achieve is a simple point-to-point data exchange, which may allow the sharing of customer information between front- and back-office applications, or the batch transfer of supply chain planning information to an ERP system.

Baan is painting a vision of true business-to-business collaboration over the Internet. In this vision, business partners seamlessly share information and collaborate on business processes to achieve new levels of responsiveness to customers and business partners.

"For companies to compete effectively in the emerging business to business Internet economy, integration across the various business domains is an absolute requirement," said Mary Coleman, Chairman and CEO, Baan Company. "As consumer expectations of service and support have changed dramatically with the advent of Internet sales - today customers expect 7 x 24 service, next day delivery and more competitive pricing - Baan believes that the changes brought about by the Internet on B-2-B commerce will be even more dramatic. To compete, manufacturing businesses will need to learn to partner with suppliers and customers to deliver custom configured solutions, while maintaining minimal inventory in reduced timeframes."

Baan OpenWorld is an integration framework built on four tiers of exchange:

Data Level: At this first level, applications share or exchange common elements such as customer information, part numbers, and inventory levels. This level of integration includes data migration, replication, and data connectivity.

Application Level: At this second level, applications exchange data in the form of objects at the sub-process level. This includes connectivity between applications and certified integration interfaces for 3rd party applications.

Business Process Level: At this third level, organizations are able to integrate business processes between applications using standards such as XML, over message queuing infrastructure such as IBM MQSeries or MSMQ. At this level, businesses are able to use common process modeling and workflow tools, common user interfaces, and business intelligence systems to seamlessly solve multi-functional business problems such as Available to Promise, order fulfillment, and demand management.

Business Community Level: This top-level of exchange enables true business process collaboration within an enterprise, and across the heterogeneous enterprises of business partners and customers. Adaptable business logic, real-time alerts, and a publish and subscribe communications model will allow organizations to work together, using XML standards-based Internet messaging to react quickly to customer demands.

The first three tiers of the Baan OpenWorld Integration Framework are included in the Baan Enterprise Solutions suite. Beta programs for new products supporting the Business Community level are planned for the first half of 2000.
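
To make the upper two tiers more concrete, here is a minimal, hypothetical sketch of the kind of exchange they describe: an XML business document published to a topic and consumed by a subscribing business partner. It uses only the Python standard library and a tiny in-process stand-in for a messaging backbone (not Baan's actual OpenWorld interfaces), and the topic name and document fields are invented for illustration.

```python
# Minimal publish/subscribe sketch of XML-based business messaging.
# Illustration only -- not Baan OpenWorld code. The topic name, document
# fields, and broker are hypothetical.
import xml.etree.ElementTree as ET
from collections import defaultdict

class Broker:
    """A tiny in-process stand-in for a messaging backbone (e.g., a queue)."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, xml_payload):
        for callback in self._subscribers[topic]:
            callback(xml_payload)

def build_order_xml(order_id, part_number, quantity):
    """Build an XML business document to be shared between partners."""
    order = ET.Element("PurchaseOrder", id=order_id)
    ET.SubElement(order, "PartNumber").text = part_number
    ET.SubElement(order, "Quantity").text = str(quantity)
    return ET.tostring(order, encoding="unicode")

def supplier_listener(xml_payload):
    """A business partner reacting to a published order in near real time."""
    order = ET.fromstring(xml_payload)
    print("Supplier received order", order.get("id"),
          "for part", order.findtext("PartNumber"),
          "x", order.findtext("Quantity"))

broker = Broker()
broker.subscribe("orders.new", supplier_listener)
broker.publish("orders.new", build_order_xml("PO-1001", "GEAR-42", 250))
```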

Accelerating (and Fast-Starting) the SME Business at Oracle (and SAP) – Part 3

Oracle Accelerate is not only a partner program but also Oracle’s go-to-market approach to provide business software solutions to midsize organizations. Part 1 described the main constituent parts of the approach, while Part 2 talked about the program’s current state of affairs. Part 3 of this blog series will analyze the program’s latest partner-enablement developments as well as the inevitable room for improvement.

Oracle Accelerate Partner-Enablement Portal

To further enable its Accelerate partners, Oracle launched a new portal in late September 2009, called Midsize.Oracle.com. The site is a resource where midsize customers and partners can search for geographic- and industry-specific solutions, as well as a platform for collaboration.

The new online storefront is where prospective midsize customers can quickly find solutions to fit their needs based on location, industry, and business requirements. The new Web presence, which features all available Oracle Accelerate solutions in the Oracle Accelerate Marketplace section, houses in-depth information on partners and their products, including user demonstrations, customer testimonials, and pricing.

The marketplace also allows prospective customers to contact Oracle Accelerate solution providers directly. This partner marketplace section is an online marketing portal for Oracle partners to highlight their Oracle Accelerate solutions. It is tied to the Midsize.Oracle.com main page and to the new Oracle Business Accelerators (OBA) pages to aid awareness and demand-generation activities. Partners are able to promote their company, expertise, and related Oracle Accelerate solutions.

Oracle envisions the portal as the go-to place for anyone looking for Oracle applications for midsize companies, and thus as its lead-generation tool. Partners can feature any of their own branded marketing assets (i.e., they do not necessarily need to be Oracle-branded) on their dedicated partner and solution pages.

The Midsize.Oracle.com portal currently supports the following languages: English, French, German, Spanish, Portuguese, and Simplified Chinese. Available Oracle Accelerate solutions and partners can be filtered by country, and then further either by industry or product functionality (process flows). Accompanying featured marketing articles, white papers, and customer success stories are updated monthly.

To prevent (or at least mitigate) redundancy and competition between Oracle Consulting and Accelerate partners, Oracle has imposed a so-called “no fly zone” for its direct sales teams for United States (US) companies that have less than US$100 million in revenues. That limit is US$250 million in Europe, and these demarcation lines vary by region elsewhere.

In summary, the Midsize.Oracle.com portal simplifies and intensifies Oracle’s message to midsize organizations about available regional applications, OBAs, and solutions. The idea is to deliver more leads to the right partners faster. Future improvements will aim to include more industry- and geography-specific content, greater solution granularity, benefit-cost analysis (BCA) capabilities, and even more contextualized customer references.

Saturday, May 1, 2010

The Intelligence of Social Media (Part 2)

In the first part of this blog, I mentioned that sentiment analysis measures the polarity of opinion—positive, negative, or neutral—regarding a subject, a product, a service, etc.

Two main approaches can be used to perform sentiment analysis or text mining: a knowledge-based approach, which uses linguistic models to classify sentiments; and a learning-based approach, which uses machine learning techniques to classify text. The concept of sentiment analysis opens a great number of possibilities and opportunities for introducing BI strategies to analyze the enormous amount of data flowing through the Web.
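
As a minimal illustration of the knowledge-based approach, the sketch below scores a snippet of text against a tiny hand-built polarity lexicon; the word lists are invented, and a learning-based approach would instead train a classifier on labeled examples rather than rely on fixed lists.

```python
# Minimal knowledge-based (lexicon) sentiment sketch -- illustrative only.
# The word lists are hypothetical; real linguistic models are far richer.
POSITIVE = {"great", "love", "excellent", "fast", "helpful"}
NEGATIVE = {"poor", "hate", "terrible", "slow", "broken"}

def score_sentiment(text):
    """Return 'positive', 'negative', or 'neutral' for a snippet of text."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(score_sentiment("The support team was helpful and fast"))     # positive
print(score_sentiment("Shipping was slow and the product broken"))  # negative
```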

In fact, some software solutions have been designed to address this type of analysis. These tools are called “social Web analytics.” According to the definition provided by The Social Web Analytics eBook (2008), by Philip Sheldrake, social Web analytics are “the application of search, indexing, semantic analysis and business intelligence technologies to the task of identifying, tracking, listening to and participating in the distributed conversations about a particular brand, product or issue, with emphasis on quantifying the trend in each conversation’s sentiment and influence.”

Many organizations are aware of the importance of measuring this information and analyzing it. Currently, sentiment analysis has strong potential to be used jointly with BI applications, making it possible to apply traditional BI techniques to visualize what a sentiment-based tool has discovered on the Web. Some vendors are already offering analytics services (Radian6, Sysomos, BuzzLogic, and Attentio) to measure and analyze social media content.

There is also a growing trend among traditional BI providers to address these tasks.

Workforce Analytics – A Blend of Business Intelligence and Human Resources

If you are an HR manager in a company that employs thousands of people, one of your main concerns should be reporting and analytics. Workforce analytics can help your organization determine how efficient its recruiting processes are. It can also help during the hiring process—recruiting the right people at the right cost.

Workforce analytics can give your company a general overview of the activity of the HR department. Depending on the product being used, you can drill down to the next level of detail, build interactive graphs, and export the data to different file formats.

By understanding the demand and supply in HR—as well as the gaps between the two—HR professionals can create and implement better internal procedures for talent management, retention, succession, etc.

This is done by gathering information on your workforce and putting it into a single repository. This information comes from your HR system, enterprise resource planning (ERP) solution, or other business software (e.g. time and attendance, project management, accounting software, etc.).

By using this data, your company can create forecasts and what-if scenarios in order to understand how a change in the activity of the company can impact its HR department and vice-versa. For instance, if you decide to launch a new product line, you can estimate how many people you’ll need and how much it will cost you.
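
As a rough, made-up example of that kind of estimate, the sketch below turns a demand forecast into a headcount and labor-cost figure; the function name and every number in it are hypothetical.

```python
# Hypothetical what-if estimate for a new product line -- all numbers invented.
def staffing_estimate(forecast_units, units_per_employee, avg_salary, overhead_rate=0.3):
    """Estimate headcount and annual labor cost for a given demand forecast."""
    headcount = -(-forecast_units // units_per_employee)  # ceiling division
    labor_cost = headcount * avg_salary * (1 + overhead_rate)
    return headcount, labor_cost

people, cost = staffing_estimate(forecast_units=120000, units_per_employee=8000,
                                 avg_salary=55000)
print(f"Estimated headcount: {people}, annual cost: ${cost:,.0f}")
```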

Most of the vendors in this area (HR, BI, or workforce analytics) offer predefined key performance indicators (KPIs) that your company can use to measure the efficiency of its workforce; they can also help you build new ones and even implement best practices to improve the way people work.
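
Two of the most commonly cited workforce KPIs, annual turnover rate and cost per hire, are simple ratios over that consolidated data. The sketch below uses the standard formulas with invented figures.

```python
# Two illustrative workforce KPIs computed from consolidated HR data.
# The figures are invented; the formulas are the commonly used definitions.
def turnover_rate(separations, avg_headcount):
    """Annual turnover = separations / average headcount."""
    return separations / avg_headcount

def cost_per_hire(total_recruiting_cost, hires):
    """Average recruiting cost per new hire."""
    return total_recruiting_cost / hires

print(f"Turnover rate: {turnover_rate(85, 1200):.1%}")       # ~7.1%
print(f"Cost per hire: ${cost_per_hire(450000, 150):,.0f}")  # $3,000
```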

Quote-to-Order: An Overlooked Software Application

Last year, I met an analyst from another firm and asked him what he thought about quote-to-order (Q2O) solutions, given the relevance of Q2O to the conference I was attending. Not surprisingly, the answer I got was, “this kind of application doesn’t have a future.” The conversation didn’t go any further due to limited time, but I could imagine that his reasoning might have sounded like this: even though activities from quoting to ordering may be taken care of by multiple systems, there’s no need for another system (if there’s good integration in place), which would only make the already complicated enterprise information landscape even more complicated. Certainly, this statement can be true if there is good integration in place. However, the truth is that today’s integration among various information systems is far from perfect. Let’s take a look at the reality of many companies’ Q2O process.

In a real working environment, sales professionals may have existing tools (independently or as part of a customer relationship management [CRM] system) to support their quoting activities. These tools may be quite handy for generating a beautiful quotation document. However, what really counts in the quality of a quotation is the accuracy of the information provided to potential clients. More specifically, a good quotation has to present the correct product/service configuration based on a client’s requirements and what a company is willing to offer. Most of the time, product information is produced by another group of people (usually called a product development department or something similar) using different systems. Given today’s fast-paced product development, relying on printed handbooks, spreadsheets, or even batched updates as the source of product information risks inaccurate quoting.

Inaccurate product information is not the only problem in the Q2O process. Even if a quotation presents what is “technically perfect” (i.e., a correct product configuration), it may not present what is commercially and operationally feasible to deliver. In theory, production, purchasing, and inventory information should all play a role in generating a deliverable quotation. However, in practice, delivery terms are often determined based on experience or rules of thumb. As a matter of fact, in many organizations, only a few individuals hold the so-called “tribal knowledge” of the rules and constraints governing what can or cannot be manufactured and delivered, and with what difficulty.

Disconnected data flow between quoting and ordering is another issue in the Q2O process. When quoting and ordering are handled by different groups of people, a finalized quotation often has to be re-entered (or, in a better case, imported) into the systems that control production and delivery. This non-value-added activity not only consumes resources but also opens the door to errors.

When companies sell via distributors and resellers, the situation only becomes more complicated.

Undoubtedly, integrating systems pair by pair (e.g., product lifecycle management [PLM] and CRM, CRM and enterprise resource planning [ERP], and PLM and ERP) is one way to address the above-mentioned issues. However, a small group of software applications (known as Q2O or configure, price, quote [CPQ] solutions) takes a more focused approach (click here to see the list of Q2O solutions). Q2O solutions use the Q2O process as the main thread that integrates all relevant activities and needed information in one place. In addition, some solutions also provide functionality such as quotation document generation, product information management (PIM) (also called master data management [MDM] for product information), and e-commerce capabilities (e.g., shopping carts, checkout, save for later, etc.) that are often not found in ERP or CRM systems.
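
To illustrate the core idea of keeping configuration, pricing, and quoting in a single thread, here is a deliberately tiny sketch; the catalog, compatibility rule, and prices are all invented, and real Q2O/CPQ products add far richer constraint engines, document generation, and ERP/CRM integration.

```python
# Minimal configure-price-quote sketch -- products, rules, and prices invented.
CATALOG = {"base_unit": 1200.0, "extended_warranty": 150.0, "rack_mount": 80.0}
# A simple compatibility rule: rack mounting requires the base unit.
RULES = [lambda items: "base_unit" in items or "rack_mount" not in items]

def build_quote(items, quantity, discount=0.0):
    """Validate a configuration and return a priced quote, or raise on errors."""
    for rule in RULES:
        if not rule(items):
            raise ValueError("Configuration violates a compatibility rule")
    unit_price = sum(CATALOG[i] for i in items)
    total = unit_price * quantity * (1 - discount)
    return {"items": items, "quantity": quantity, "total": round(total, 2)}

print(build_quote(["base_unit", "rack_mount"], quantity=10, discount=0.05))
```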

Saturday, March 27, 2010

Using Predictive Analytics within Business Intelligence: A Primer

Predictive analytics has helped drive business intelligence (BI) towards business performance management (BPM). Traditionally, predictive analytics and models have been used to identify patterns in consumer-oriented businesses, such as identifying potential credit risk when issuing credit cards, or analyzing the buying habits of retail consumers. The BI industry has shifted from identifying and comparing data patterns over time (based on batch processing of monthly or weekly data) to providing performance management solutions with right-time data loads in order to allow accurate decision making in real time. Thus, the emergence of predictive analytics within BI has become an extension of general performance management functionality. For organizations to compete in the marketplace, taking a forward-looking approach is essential. BI can provide the framework for organizations focused on driving their business based on predictive models and other aspects of performance management.

We'll define predictive analytics and identify its different applications inside and outside BI. We'll also look at the components of predictive analytics and its evolution from data mining, and at how they interrelate. Finally, we'll examine the use of predictive analytics and how it can be leveraged to drive performance management.

Overview of Analytics and Their General Business Application

Analytical tools enable greater transparency within an organization, and can identify and analyze past and present trends, as well as discover the hidden nature of data. However, past and present trend analysis and identification alone are not enough to gain competitive advantage. Organizations need to identify future patterns, trends, and customer behavior to better understand and anticipate their markets.

Traditional analytical tools claim to have a 360-degree view of the organization, but they actually only analyze historical data, which may be stale, incomplete, or corrupted. Traditional analytics can help gain insight based on past decision making, which can be beneficial; however, predictive analytics allows organizations to take a forward-looking approach to the same types of analytical capabilities.

Credit card providers offer a first-rate example of the application of analytics (specifically, predictive analytics) in their identification of credit card risk, customer retention, and loyalty programs. Credit card companies attempt to retain their existing customers through loyalty programs, and need to take into account the factors that cause customers to choose other credit card providers. The challenge is predicting customer loss. In this case, a model which uses three predictors can be used to help predict customer loyalty: frequency of use, personal financial situations, and lower annual percentage rate (APR) offered by competitors. The combination of these predictors can be used to create a predictive model. The predictive model can then be applied and customers can be put into categories based on the resulting data. Any changes in user classification will flag the customer. That customer will then be targeted for the loyalty program. Financial institutions, on the other hand, use predictive analytics to identify the lifetime value of their customers. Whether this translates into increased benefits, lower interest rates, or other benefits for the customer, classifying and applying patterns to different customer segmentations allows the financial institutions to best benefit from (and provide benefit to) their customers.
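
A minimal sketch of how those three predictors could be combined into a scoring model is shown below; the weights, threshold, and customer data are entirely hypothetical and hand-tuned for illustration, whereas a real model would be fit to historical data.

```python
# Hypothetical loyalty-risk scoring model combining three predictors.
# Weights and thresholds are invented for illustration; real models are
# fit to historical data rather than hand-tuned.
def churn_risk(monthly_uses, debt_to_income, competitor_apr_gap):
    """Higher score = higher risk of losing the customer."""
    return (-0.05 * monthly_uses          # frequent users are less likely to leave
            + 2.0 * debt_to_income        # financial strain raises risk
            + 10.0 * competitor_apr_gap)  # a better APR elsewhere raises risk

def classify(score, threshold=0.5):
    return "target for loyalty program" if score > threshold else "stable"

customer_score = churn_risk(monthly_uses=4, debt_to_income=0.35, competitor_apr_gap=0.03)
print(round(customer_score, 2), "->", classify(customer_score))
```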

What's Really Driving Business Intelligence

If you follow the logic of the major analysts covering the business intelligence (BI) market, the market drivers for business intelligence software are based on fairly simple environmental factors. The most commonly cited market drivers are the following:

1. Increasing Regulation—New laws in both the US and Europe are requiring companies to make their external reporting more transparent, forcing businesses to develop better systems for storing and retrieving the most current and detailed information on operations.

2. Information Overload—Having invested heavily in CRM, ERP, and SCM systems, many businesses are awash in data, but short on actionable intelligence. Being able to aggregate, mine, and analyze data in order to prepare for and respond to business and market events is the next step in making IT investments pay off.

3. Demand for Accountability and Metrics—A slow economic recovery has forced many businesses to continue trimming budgets, while requiring greater accountability for every area of spending. Business intelligence, and its associated data-mining, analytics and scorecards, provides the tools necessary to track performance metrics tied directly to strategic corporate goals.

4. Need to Improve Competitive Responsiveness—With markets exposed to increasing competition, customer demand and pricing pressure, businesses need to reduce cycles by accelerating processes that support aggressive competitive strategies. BI initiatives provide real-time information that can help businesses eliminate process delays and streamline management to improve decision-making and market response.

There's nothing wrong with these descriptions of existing market conditions. Each of them is a correct and compelling reason for businesses to support BI initiatives. However, they don't tell the whole story. In fact, none of these market drivers, taken individually or as a whole, is enough to explain the level of investment being made in business intelligence software.

Think about it. Businesses have found ways to skirt or delay the impact of increasing regulations for decades. Why would they suddenly respond so rapidly to new regulations today? While information overload is acute, many businesses took a soaking in IS investments over the past few years. What smart CEO would throw good money after bad to try and rescue a previous investment? Metrics and accountability are certainly in high demand while budgets are tight, but how many businesses would invest millions of dollars just to be confident their million-dollar investments are sound? That kind of long-term thinking doesn't move markets in our quarterly-driven world. And finally, yes, businesses are being compelled to be more efficient and effective in order to compete, but that battle has been shaping up for decades among TQM, Six-Sigma, lean production methodologies—does anyone really believe the end of the rainbow is only a dashboard away?

While each of these market drivers is accurate, they're only symptoms of a much deeper drive—a drive that is shaped by a concern far greater than the threat of regulation, information overload, accountability, or competitive response. It's a drive that reflects the deepest fears of a CFO. It's a drive that shapes the search for CEOs who can move the businesses that move markets. It's a drive that cuts straight to the bottom line of the corporation, because it's about the single, all-important factor that defines the success of every business today.

It's all about how the value of a business is measured.

How BI Addresses the Needs of SOX Compliance

Traditionally, BI software has targeted the needs of financial decision makers. BI tools initially enabled organizations to analyze financial data, to identify trends, and to drill down on report data to reveal operational transactions, as well as to assign tasks to individual employees, in order to give management the ability to implement robust auditing processes. The driver behind these functions is the ability to capture data from several data sources across an organization, and to centralize them in a data warehouse. Aside from data centralization across the organization, data warehouses allow organizations to implement and monitor data quality activities to ensure accurate data. This reduces the potential for accidental data errors.

BI tools help vendors meet the demands of organizations that need to comply with SOX, through reporting, scorecards, and business activity monitoring (BAM). General reporting and analysis functionality permits organizations to take a top-down approach to management, yet still meet SOX compliance. CEOs and CFOs who are responsible for assuring compliance and who are accountable to the SEC often aren't directly responsible for actual report generation or in-depth budgeting. Task assignment and management of processes are internal driving forces within BI, and help companies manage employee tasks and responsibilities for each financial report and function, as well as ensuring data quality. Basically, BI allows the CEO to manage internal processes and data to meet SOX compliance, gives CEOs the ability to micromanage tasks at each level to ensure compliance, and helps identify any potential errors (as well as who made them, and when they were made within the process). If proper data quality processes are implemented, organizations can help ensure that data errors do not occur within the data warehouse itself and that keystroke errors and the like are cleansed as they enter the data warehouse, before financial analyses and reporting functions are performed to meet SOX requirements.
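
As a rough illustration of that kind of load-time data quality check (not any particular vendor's implementation), the sketch below flags records that should be rejected or cleansed before they reach the warehouse; the field names and rules are invented.

```python
# Hypothetical load-time data quality check -- field names and rules invented.
def validate_record(record):
    """Return a list of data quality issues found in a financial record."""
    issues = []
    if record.get("amount") is None or record["amount"] < 0:
        issues.append("amount missing or negative")
    if not record.get("account_id"):
        issues.append("missing account identifier")
    if record.get("currency") not in {"USD", "EUR", "GBP"}:
        issues.append("unrecognized currency code")
    return issues

incoming = [{"account_id": "A-100", "amount": 2500.0, "currency": "USD"},
            {"account_id": "", "amount": -40.0, "currency": "XYZ"}]
clean = [r for r in incoming if not validate_record(r)]
rejected = [(r, validate_record(r)) for r in incoming if validate_record(r)]
print(len(clean), "record(s) loaded;", len(rejected), "rejected:", rejected)
```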

Although, as mentioned above, BI software can help organizations meet SOX compliance, vendors have also taken SOX issues into account when upgrading their product suites to make sure that required standards can be met on an ongoing basis. Even though many other forms of financial reporting software meet SOX compliance requirements, BI solutions have the added bonus of built-in workflow processes and data integration features to ensure long-term compliance. Data within spreadsheets can be changed, and structures are not always put in place to manage those changes. However, BI software suites have built-in task assignment and audit functions for managing, distributing, and auditing data (based on where the data comes from, who has ownership of the data, and how the data has been processed).

Using Business Intelligence Infrastructure to Ensure Compliancy with the Sarbanes-Oxley Act

The US Sarbanes-Oxley Act (SOX) of 2002 was established to protect investors from the potential for fraudulent accounting. After the exposure of several corporate scandals, such as the Enron and WorldCom affairs, the US government was compelled to pass legislation ensuring accurate financial reporting and auditing from organizations publicly traded in the United States. SOX affects any public corporation competing in the American marketplace. As a result of SOX, not only have financial controls and reporting schedules become stricter, but responsibility for accurately reporting financial results has been placed in the hands of organizational heads, namely the chief executive officers (CEOs) and chief financial officers (CFOs), to provide accurate financial and auditing data.

This means that financial departments have had to reevaluate the way they manage their controls and reporting. It is no longer possible for organizations to change data without accounting for these changes to shareholders. Now that the responsibility for accurate financial reporting has been placed on upper management, with heavy fines and potential prison terms being imposed for noncompliance, financial analysis tools, such as those provided by business intelligence (BI) vendors, are becoming increasingly important to the financial auditing process. Ensuring proper data controls, proper reporting and auditing structures, and the accurate capture of the ensuing data, are important aspects of SOX compliance and make up the essential elements of BI solutions.

There are three sections of SOX that deal directly with the use of information technology (IT). Section 302 requires management certification that procedures have been put in place to address accurate financial conditions and disclosure controls for all financial statements. Section 404 requires management certification that effective internal controls and procedures have been developed for financial report preparation. Finally, section 409 requires that timely reports be provided to investors, the US Securities and Exchange Commission (SEC), and other corporate stakeholders.

Wednesday, January 27, 2010

How Predictive Analytics Are Used within BI, and How They Drive an Organization's BPM

Data mining, predictive analytics, and statistical engines are examples of tools that have been embedded in BI software packages to leverage the benefits of performance management. If BI is backward looking, and data mining identifies the here and now, predictive analytics and their use within performance management is the looking glass into the future. This forward-looking view helps organizations drive their decision making. BI is known for its consolidation of data from disparate business units, and for its analysis capabilities based on that consolidated data. Performance management goes one step further by leveraging the BI framework (such as the data warehousing structure and extract, transform, and load [ETL] capabilities) to monitor performance, identify trends, and allow decision makers the ability to set appropriate metrics and monitor results on an ongoing basis.

With predictive analytics embedded within the above processes, the metrics set and business rules identified by organizations can be used to identify the predictors that need to be evaluated. These predictors can then be used to shift towards a forward-looking approach in decision making by using the strengths from the areas identified above. Scorecards are one example of a performance management tool that can leverage predictive analytics. The identification of sales performance by region, product type, and demographics can be used to define what new products should be introduced into the market, and where. In general, scorecards can graphically reflect the selected sales information and create what-if scenarios based on the data identified to verify the right combinations of new product distribution.

What-if scenarios can be used within the different visualization tools to create business models that anticipate what might happen within an organization based on changes in defined variables. What-if analysis gives organizations the tools to identify how profits will be affected based on changes in inflation and pricing patterns as well as the impact of increasing the number of employees throughout the organization. Online analytical processing (OLAP) cubes can be created to identify dimensional data, and patterns within changing dimensions can be compared over time to contrast scenarios using a cube structure to automatically view the outcome of the what-if scenarios.
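
A deliberately simple, hypothetical version of such a what-if model is sketched below: it recomputes projected profit under different inflation and pricing assumptions, with all figures invented.

```python
# Hypothetical what-if scenarios: projected profit under different assumptions.
# All revenue, cost, and rate figures are invented for illustration.
def projected_profit(revenue, cost, price_change=0.0, inflation=0.0):
    """Adjust prices upward by price_change and inflate costs by inflation."""
    return revenue * (1 + price_change) - cost * (1 + inflation)

scenarios = {
    "baseline": (0.00, 0.00),
    "prices +2%, inflation 4%": (0.02, 0.04),
    "flat prices, inflation 6%": (0.00, 0.06),
}
for name, (price_change, inflation) in scenarios.items():
    profit = projected_profit(10_000_000, 8_500_000, price_change, inflation)
    print(f"{name}: projected profit ${profit:,.0f}")
```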

Components of Predictive Analytics

Data mining can be defined as an analytical tool set that searches for data patterns automatically and identifies specific patterns within large datasets across disparate organizational systems. Data mining, text mining, and Web mining are types of pattern identification. Organizations can use these forms of pattern recognition to identify customers' buying patterns or the relationship between a person's financial records and their credit risk. Predictive analytics moves one step further and applies these patterns to make forward-looking predictions. Instead of just identifying a potential credit risk, an organization can identify the lifetime value of a customer by developing predictive decision models and applying these models to the identified patterns. These types of pattern identification and forward-looking model structures can equally be applied to BI and performance management solutions within an organization.

Predictive analytics is used to determine the probable future outcome of an event, or the likelihood of a situation occurring. It is the branch of data mining concerned with the prediction of future probabilities and trends. Predictive analytics is used to automatically analyze large amounts of data with different variables, using techniques including clustering, decision trees, market basket analysis, regression modeling, neural nets, genetic algorithms, text mining, hypothesis testing, decision analytics, and so on.

The core element of predictive analytics is the predictor, a variable that can be measured for an individual or entity to predict future behavior. These predictors are used within models created to exploit the analytical capabilities of predictive modeling. Descriptive models classify relationships by identifying customers or prospective customers, and placing them in groups based on identified criteria. Decision models consider business and economic drivers and constraints that go beyond the general functionality of a predictive model. In a sense, statistical analysis helps to drive this process as well. The predictors are the factors that help identify the outcomes of the actual model. For example, a financial institution may want to identify the factors that make a valuable lifetime customer.

Multiple predictors can be combined into a predictive model, which, when subjected to analysis, can be used to forecast future probabilities with an acceptable level of reliability. In predictive modeling, data is collected, a statistical model is formulated, predictions are made, and the model is validated (or revised) as additional data becomes available. One of the main differences between data mining and predictive analytics is that data mining can be a fully automated process, whereas predictive analytics requires an analyst to identify the predictors and apply them to the defined models.
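
The sketch below walks through a toy version of that workflow on invented data: fit a simple one-predictor linear model by least squares, make predictions, and validate them against held-back observations.

```python
# Hypothetical end-to-end predictive modeling workflow on invented data:
# collect -> fit -> predict -> validate on held-back observations.
train_x = [1, 2, 3, 4, 5, 6]              # e.g., years as a customer
train_y = [120, 150, 195, 230, 260, 305]  # e.g., annual spend
holdout = [(7, 340), (8, 370)]            # observations kept back for validation

def fit_linear(xs, ys):
    """Ordinary least squares fit for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

a, b = fit_linear(train_x, train_y)
errors = [abs((a + b * x) - y) for x, y in holdout]
print(f"model: y = {a:.1f} + {b:.1f}x; mean holdout error = {sum(errors)/len(errors):.1f}")
```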

A decision tree is a predictive model that allows the user to visualize the mapping of observations about an item to conclusions about the item's target value. Basically, decision trees are built by creating a hierarchy of predictor attributes. The highest level represents the outcome, and each sub-level identifies another factor in that conclusion. This can be compared to if-else statements, which identify a result based on whether certain factors meet specified criteria. For example, in order to assess potential bad debt based on credit history, salary, demographics, and so on, a financial institution may wish to identify multiple scenarios, each of which is likely to meet bad-debt customer criteria, and use combinations of those scenarios to identify which customers are most likely to become bad-debt accounts.
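
Expressed as the if-else hierarchy described above, a hypothetical bad-debt tree might look like the sketch below; the predictor attributes and thresholds are invented for illustration.

```python
# Hypothetical decision tree expressed as nested if-else rules.
# Predictor attributes and thresholds are invented for illustration.
def bad_debt_risk(missed_payments, debt_to_income, years_of_history):
    if missed_payments > 2:
        return "high risk"
    if debt_to_income > 0.45:                 # heavy debt load
        return "high risk" if years_of_history < 3 else "medium risk"
    return "medium risk" if years_of_history < 1 else "low risk"

print(bad_debt_risk(missed_payments=0, debt_to_income=0.30, years_of_history=6))   # low risk
print(bad_debt_risk(missed_payments=3, debt_to_income=0.20, years_of_history=10))  # high risk
```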

Regression analysis is another component of predictive analytics; it allows users to model relationships between two or more variables in order to predict the value of one variable from the values of the others. It can be used to identify buying patterns based on multiple demographic qualifiers, such as age and gender, which can be helpful for deciding where to sell specific products. Within BI, this is beneficial when used with scorecards that focus on geography and sales.
