Harnessing Collective Intelligence in Decision Making through Big Data Analytics


Decisions! Decisions! Decisions! What a hazardous and difficult human endeavor decision making is! Those of us who have had to make decisions in personal life, business or profession know that the chance of a decision producing the desired end result is always in doubt. This is so mainly because decisions made today fructify tomorrow. If all decision makers were clairvoyant, no one would make wrong decisions; making decisions would be a routine job. Unfortunately, this is never going to happen. We will keep making wrong decisions, as we do today. However, we could substantially enhance our chances of making the right decision by reviewing everything that is happening now, or has happened in the past, relating to our target area of activity. Of course, this may be a tall order, but in the information age in which we live, it seems feasible. There is so much data available from diverse sources that, if we evolve a scientific and feasible way of analyzing all this data, we will find answers to our queries as we have never been able to before. The ‘way’ we mention above is ‘Big Data Analytics’.

 Genesis of Big Data

Big Data, one of the hottest IT buzzwords of 2012, has emerged as a new technology paradigm to address the volume, velocity and variety of massive data coming from different sources. Social media is one well-known source of big data. A somewhat less known source, but a big one nonetheless, is the data generated by data acquisition systems (DAS) in machinery and structures in the field of engineering. Large volumes of data are also being generated by health monitoring devices of interest to medical professionals. There are many other sources too. Within these heaps of massive data lies a treasure of information that can be extracted to avert major disasters, accidents, outbreaks of epidemics, and so on. In the fields of business and marketing, big data available through social networks is already being proactively used to propel the growth of businesses.

Businesses have been relying on tools such as Business Intelligence (BI) dashboards and reports for decisions based on transactional data stored in relational databases. With the evolution of social media, we started seeing the emergence of non-traditional, less structured data such as weblogs, social media feeds, email, sensor readings, photographs and YouTube videos that can be analyzed for useful information. With the reduction in the cost of both storage and compute power, it is now feasible to store and analyze this data as well for meaningful purposes. As a result, it is important that businesses cast a new look at this extended range of data, i.e. Big Data, for business intelligence and for decision making.

 Sources of Big Data

The major sources of Big Data may be listed as follows:

  • Enterprise application data, which generally includes data emanating from Enterprise Resource Planning (ERP) systems, customer information from Customer Relationship Management (CRM) systems, Supply Chain Management systems, e-commerce transactions, and Human Resource (HR) and payroll transactions.
  • Machine-generated (semantic) data, which comprises Call Detail Records (CDRs) from call centers, weblogs, smart meters, manufacturing sensors, equipment logs and trading-systems data generated by machines and computer systems.
  • Social media data, which includes customer feedback streams, micro-blogging sites like Twitter, and social media platforms like Facebook.

The McKinsey Global Institute [1] estimates that data volume is growing 40% per year, and will grow 44-fold between 2009 and 2020. There are four key characteristics, the 4 Vs of volume, velocity, variety and value, that are commonly used to characterize different aspects of big data.

Characteristics of Big Data

We consider these characteristics in some detail in the following few paragraphs.

Volume

Social media (Facebook, Twitter, LinkedIn, Foursquare, YouTube and many more) is an obvious source of large volumes of data. Machine-generated data, or Semantic Web data, is another large but somewhat less known source. To judge the volume of this type of data, it may be sufficient to know that a single jet aircraft engine can generate 10 TB of data in 30 minutes. With more than 25,000 airline flights per day, the daily volume of just this single data source runs into petabytes (10^15 bytes). Smart meters and heavy industrial equipment like oil refineries and drilling rigs generate similar data volumes.

Velocity

The data comes into the data management system rapidly and often requires quick analysis for decision making. The speed of the feedback loop, taking data from input through to analysis and a decision, is extremely important: the tighter the feedback loop, the greater the usefulness of the data. It is this need for speed, particularly on the web, that has driven the development of key-value stores and columnar databases, optimized for the fast retrieval of pre-computed information. These databases form part of an umbrella category known as NoSQL (Not Only SQL), used when relational models do not suffice. Social media data streams bring a large input of opinions and relationships valuable to customer relationship management in the retail business. Even at 140 characters per Tweet, the high velocity of Twitter data generates large volumes (over 8 TB per day). Much of the data received may be of low value, and analytical processing may be required to transform it into usable form or to derive meaningful information.
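As a toy illustration of the "fast retrieval of pre-computed information" pattern mentioned above: expensive aggregates are computed once in a batch step and then served by simple key lookup. The dict here only stands in for a real key-value store such as HBase or Redis, and the event fields are invented for the sketch.

```python
# Hypothetical raw event stream (e.g. clickstream records).
raw_events = [
    {"user": "u1", "action": "view"},
    {"user": "u1", "action": "buy"},
    {"user": "u2", "action": "view"},
]

# Batch step: pre-compute per-user action counts (the slow part).
precomputed = {}
for event in raw_events:
    key = event["user"]
    precomputed.setdefault(key, {"view": 0, "buy": 0})
    precomputed[key][event["action"]] += 1

# Serving step: fast retrieval by key, with no scanning or joining
# at query time -- the essence of the key-value access pattern.
def lookup(user_id):
    return precomputed.get(user_id, {"view": 0, "buy": 0})

print(lookup("u1"))  # {'view': 1, 'buy': 1}
```

The tight feedback loop the text describes comes from keeping the query-time step down to a single key lookup.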

Variety

Big Data brings a variety of data types, ranging from text, images and video from social networks, and raw feeds directly from sensor sources, to semantic weblogs generated by machines. These data are often not associated with any particular application. A common use of big data processing is to take unstructured data and extract meaningful information for consumption either by humans or as structured input to an application. Big Data brings a lot of data carrying patterns, sentiments and behavioral information that need analysis. Relational Database Management Systems (RDBMS) were not designed to address this sort of data.
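A minimal sketch of the "unstructured in, structured out" step described above. The posts, tag format and fields are purely illustrative assumptions, not any real feed format; the point is only that free text becomes records an application can consume.

```python
import re

# Hypothetical unstructured social media posts.
posts = [
    "Loving my new phone! #electronics rating:5",
    "Terrible battery life on this laptop #electronics rating:2",
]

# Illustrative pattern: a hashtag category followed by a rating token.
pattern = re.compile(r"#(?P<category>\w+)\s+rating:(?P<rating>\d)")

# Extract structured records from the free text.
records = []
for text in posts:
    match = pattern.search(text)
    if match:
        records.append({
            "category": match.group("category"),
            "rating": int(match.group("rating")),
            "text": text,
        })

print(records[0]["rating"])  # 5
```

Real pipelines use far richer techniques (sentiment analysis, entity extraction), but the shape of the transformation is the same: unstructured text in, typed fields out.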

Value

The value of different data varies significantly. Generally, good information lies hidden within a larger body of non-traditional data. Big data offers great value to businesses by bringing real-time market and customer insights, enabling improvement of new products and services. Big data analytics can reveal insights such as peer influence among customers, revealed by analyzing shoppers’ transactions together with social and geographical data. The past decade’s successful web startups are prime examples of big data used as an enabler of new products and services. For example, by combining a large number of signals from a user’s actions and those of their friends, Facebook has been able to craft a highly personalized user experience and create a new kind of advertising business. Recently, the machine tool industry has developed the MTConnect protocol, through which it will be possible to collect and broadcast key performance indicators (KPIs) to interested parties, who can evaluate the operating efficiency of machine tools and anticipate machine problems [2].

Big Data Solutions

With the evolution of the Cloud deployment model, the majority of big data solutions are offered as software-only, as appliances or as cloud-based offerings. As with any other application deployment, big data deployment will depend on several issues such as data locality, privacy and governmental regulations, human resources and project requirements. Many organizations are opting for a hybrid solution, using on-demand cloud resources to supplement in-house deployments.

Big data is messy and needs enormous efforts in cleansing and quality enhancement. The phenomenon of big data is closely tied to the emergence of data science, a discipline that combines math, programming and scientific instinct.

Current data warehousing projects take a long time to offer meaningful analytics to business users, as they depend on extract, transform and load (ETL) processes from various data sources. Big Data analytics, on the other hand, can be defined as the process of parsing large data sets from multiple sources and producing information in real time or near-real time.

Big Data analytics represents a big opportunity. Many large businesses are exploring analytics capabilities to parse web-based data sources and extract value from social media. However, an even larger opportunity is emerging as a data source: the Internet of Things (IoT). Cisco Systems Inc. estimates [3] that there are approximately 35 billion electronic devices that can connect to the Internet. As a matter of fact, any electronic device can be connected to the Internet, and even automakers are building Internet connectivity into vehicles. “Connected” cars will become commonplace in the next few years and generate millions of transient data streams.

The big data market (comprising technology and services) is on the verge of rapid growth, approaching the $50 billion mark worldwide within the next five years. As of early 2012, the big data market stands at just over $5 billion based on related software, hardware and services revenue. Increased interest in, and awareness of, the power of big data and related analytic capabilities to gain competitive advantage and improve operational efficiencies, coupled with developments in the technologies and services that make big data a practical reality, will result in a super-charged compound annual growth rate (CAGR) of 58% between now and 2017.

Of the current market, big data pure-play vendors account for $310 million in revenue. Despite their relatively small share of current overall revenue (approximately 5%), vendors such as Vertica, Splunk and Cloudera are responsible for the vast majority of the new innovations and modern approaches to data management and analytics that have emerged over the last several years and made big data the hottest sector in IT. Wikibon [4] considers big data pure-plays to be those independent hardware, software or services vendors whose big data-related revenue accounts for 50% or more of total revenue.

The big data market includes technologies, tools, and services designed to address these opportunities:

  • Hadoop distributions, software, projects and related hardware;
  • Next-generation data warehouses and related hardware;
  • Data integration tools and platforms as applied to big data;
  • Big data analytic platforms, applications, and data visualization tools.

Pure-plays Vendors Delivering Big Data Innovation

The most impactful innovations in the big data market are coming from the numerous pure-play vendors that own just a small share of the overall market. Hadoop distribution vendors Cloudera and Hortonworks are significant contributors to the Apache Hadoop project, markedly improving the open source big data framework’s performance and enterprise-readiness. Cloudera, for example, contributes significantly to Apache HBase, the Hadoop-based non-relational database that allows for low-latency, quick lookups.

Hortonworks engineers are working on a next-generation MapReduce architecture that promises to increase the maximum Hadoop cluster size beyond its current practical limitation of 4,000 nodes. MapR takes a more proprietary approach to Hadoop, supplementing HDFS (the Hadoop Distributed File System) with its API-compatible Direct Access NFS in its enterprise Hadoop distribution, adding significant performance capabilities. Next-generation vendors such as Vertica, Greenplum and Aster Data are redefining the traditional enterprise data warehouse market with massively parallel, columnar analytic databases that deliver lightning-fast data loading and real-time analytic capabilities.

The latest iteration of the Vertica Analytic Platform, Vertica 5.0, for example, includes new elastic capabilities to easily expand or contract deployments and many in-database analytic functions.

Big Data Analytics Platforms and Applications 

Hadoop-based platforms: A few niche vendors are developing applications and platforms that leverage the underlying Hadoop infrastructure to provide both data scientists and business users with easy-to-use tools for experimenting with big data. These include Datameer [5], which has developed a Hadoop-based business intelligence platform with a familiar spreadsheet-like interface; Karmasphere [6], whose platform allows data scientists to perform ad hoc queries on Hadoop-based data via a SQL interface; and Digital Reasoning [7], whose Synthesis platform sits on top of Hadoop to analyze text-based communication.

Cloud-based applications and services are increasingly allowing small and mid-sized businesses to take advantage of big data without needing to deploy on-premises hardware or software. Tresata’s [8] cloud-based platform, for example, leverages Hadoop to process and analyze large volumes of financial data and returns results via on-demand visualizations for banks, financial data companies, and other financial services companies. 1010data [9] offers a cloud-based application that allows business users and analysts to manipulate data in the familiar spreadsheet format but at big data scale. And the ClickFox [10] platform mines large volumes of customer touch-point data to map the total customer experience with visuals and analytics delivered on-demand.

Non-Hadoop Big Data Platforms: Other vendors contributing significant innovation to the big data landscape include Splunk [11], which specializes in processing and analyzing log-file data to allow administrators to monitor IT infrastructure performance and identify bottlenecks and other disruptions to service; HPCC (High Performance Computing Cluster) Systems, a spin-off of LexisNexis [12], which offers a big data framework competing with Hadoop, built internally by its engineers over the last ten years to help the company process and analyze large volumes of data for its clients in finance, utilities, educational and research institutions and government; and DataStax [13], which offers a commercial version of the open source Apache Cassandra NoSQL database, along with related support services, bundled with Hadoop.

In order to make the best use of big data, businesses must evolve their IT infrastructures to handle the rapid delivery of extreme volumes of data of varying types, which can then be integrated with the organization’s other enterprise data for analysis. When big data is captured, optimized and analyzed in combination with traditional enterprise data, businesses can develop a more thorough and insightful understanding of their business, which can lead to enhanced productivity, a stronger competitive position and greater innovation, all with an impact on the bottom line. For example, in the delivery of healthcare services, management of chronic or long-term conditions is expensive. The use of in-home monitoring devices to measure vital signs and monitor progress is just one way that sensor data can be used to improve patient health and reduce both office visits and hospital admissions.

Manufacturing companies deploy sensors in their products to return a stream of telemetry. The proliferation of smart phones and other GPS devices offers advertisers an opportunity to target consumers when they are in close proximity to a store, a coffee shop or a restaurant. This opens up new revenue for service providers and offers many businesses a chance to target new customers.

Use of social media and web log files from their e-commerce sites can help retailers understand their customers’ buying patterns, behavior, likes and dislikes. This can enable much more effective micro customer segmentation and targeted marketing campaigns, as well as improve supply chain efficiencies. As with data warehousing, web stores or any IT platform, an infrastructure for big data has unique requirements. In considering all the components of a big data platform, it is important to be able to easily integrate big data with enterprise data in order to conduct deep analytics on the combined data set.

The requirements in a big data infrastructure involve data acquisition, data organization and data analysis. Because big data refers to data streams of higher velocity and higher variety, the infrastructure required to support the acquisition of big data must deliver low, predictable latency in both capturing data and in executing short, simple queries; be able to handle very high transaction volumes, often in a distributed environment; and support flexible, dynamic data structures.

NoSQL databases are frequently used to acquire and store big data. They are well suited for dynamic data structures and are highly scalable. The data stored in a NoSQL database is typically of wide variety because the systems are intended to simply capture all data without categorizing and parsing the data. For example, NoSQL databases are often used to collect and store social media data. To allow for use in varying customer applications, underlying storage structures are kept simple.

Instead of designing a schema with relationships between entities, these simple structures often just contain a major key to identify the data point, and then a content container holding the relevant data. This simple and dynamic structure allows changes to take place without costly reorganizations at the storage layer. In classical data warehousing terms, organizing data is called data integration. Because there is such a high volume of data, there is a tendency to organize data at its original storage location, thus saving both time and money by not moving around large volumes of data.  The infrastructure required for organizing big data must be able to process and manipulate data in the original storage location; support very high throughput (often in batch) to deal with large data processing steps; and handle a large variety of data formats, from unstructured to structured.
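The "major key plus content container" layout described above can be sketched as follows. The keys, field names and JSON serialization are illustrative choices for the sketch, not the format of any particular NoSQL product; the point is that the storage layer knows nothing about the content's shape.

```python
import json

store = {}  # major key -> serialized content container

def put(key, content):
    # The store just keeps an opaque blob under the key;
    # no schema is declared and no fields are validated.
    store[key] = json.dumps(content)

def get(key):
    return json.loads(store[key])

# Records with entirely different shapes coexist, and new fields
# can appear without any reorganization at the storage layer.
put("user:1001", {"name": "Alice", "follows": ["u2", "u3"]})
put("tweet:42", {"text": "140 chars or less", "geo": [40.7, -74.0]})

print(get("user:1001")["name"])  # Alice
```

Contrast this with a relational design, where adding the `geo` field would require an explicit schema change before any row could carry it.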

Apache Hadoop is a new technology that allows large data volumes to be organized and processed while keeping the data on the original data storage cluster. HDFS is the long-term storage system for web logs, for example. These web logs are turned into browsing behavior (sessions) by running MapReduce programs on the cluster and generating aggregated results on the same cluster. These aggregated results are then loaded into an RDBMS. Since data is not always moved during the organization phase, the analysis may also be done in a distributed environment, where some data will stay where it was originally stored and be transparently accessed from a data warehouse.

The infrastructure required for analyzing big data must be able to support deeper analytics, such as statistical analysis and data mining, on a wider variety of data types stored in diverse systems; scale to extreme data volumes; deliver faster response times driven by changes in behavior; and automate decisions based on analytical models. Most importantly, the infrastructure must be able to integrate analysis on the combination of big data and traditional enterprise data. New insight comes not just from analyzing new data, but from analyzing it within the context of the old, to provide new perspectives on old problems. For example, analyzing inventory data from a smart vending machine in combination with the events calendar for the venue in which the vending machine is located will dictate the optimal product mix and replenishment schedule for the vending machine.
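A single-process sketch of the weblog-to-sessions aggregation described above: group log entries by user (the map/shuffle step), then split each user's timestamps into sessions wherever a large gap appears (the reduce step). The log entries and the 30-minute session gap are illustrative assumptions; a real job would run as MapReduce across a Hadoop cluster.

```python
from itertools import groupby
from operator import itemgetter

# Hypothetical web log: (user, timestamp in seconds) pairs.
weblog = [
    ("u1", 100), ("u2", 120), ("u1", 160),
    ("u1", 4000),  # u1 returns much later -> a new session
]
SESSION_GAP = 1800  # 30 minutes, an illustrative threshold

# "Map/shuffle": sort records so each user's events are contiguous.
mapped = sorted(weblog, key=itemgetter(0, 1))

# "Reduce": per user, count sessions by splitting at large gaps.
sessions = {}
for user, group in groupby(mapped, key=itemgetter(0)):
    times = [t for _, t in group]
    count = 1
    for prev, cur in zip(times, times[1:]):
        if cur - prev > SESSION_GAP:
            count += 1
    sessions[user] = count

print(sessions)  # {'u1': 2, 'u2': 1}
```

The small `sessions` result is exactly the kind of aggregated output the text describes loading into an RDBMS for downstream reporting.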

Many new technologies have emerged to address the IT infrastructure requirements outlined above.

  • NoSQL solutions: developer-centric specialized systems
  • SQL solutions: the world typically equated with the manageability, security and trusted nature of relational database management systems (RDBMS)

NoSQL systems are designed to capture all data without categorizing and parsing it upon entry into the system, and therefore the data is highly varied. SQL systems, on the other hand, typically place data in well-defined structures and impose metadata on the data captured to ensure consistency and validate data types.

Distributed file systems and transaction (key-value) stores are primarily used to capture data and are generally in line with the requirements discussed earlier in this paper. To interpret and distill information from the data in these solutions, a programming paradigm called MapReduce is used. MapReduce programs are custom written programs that run in parallel on the distributed data nodes.

The key-value stores or NoSQL databases are the OLTP (online transaction processing) databases of the big data world; they are optimized for very fast data capture and simple query patterns. NoSQL databases are able to provide very fast performance because the captured data is quickly stored with a single identifying key rather than being interpreted and cast into a schema. By doing so, a NoSQL database can rapidly store large numbers of transactions.

However, due to the changing nature of the data in the NoSQL database, any data organization effort requires programming to interpret the storage logic used. This, combined with the lack of support for complex query patterns, makes it difficult for end users to distill value out of data in a NoSQL database.

To get the most from NoSQL solutions and turn them from specialized, developer-centric solutions into solutions for the enterprise, they must be combined with SQL solutions into a single proven infrastructure that meets the manageability and security requirements of today’s enterprises.

 References

1. James Manyika, Michael Chui, Brad Brown, Jacques Bughin, Richard Dobbs, Charles Roxburgh and Angela Hung Byers, “Big Data: The Next Frontier for Innovation, Competition and Productivity”, McKinsey Global Institute, May 2011.

2. Kathy Levy, “A New Age in Manufacturing Is At Hand”, The Shot Peener (ISSN 1069-2010), Vol. 26, Issue 2, Spring 2012.

3. John Webster, “Understanding Big Data Analytics”, May 2012.

4. Jeff Kelly, David Vellante and David Floyer, “Big Data Market Size and Vendor Revenues”, wikibon.org, May 2012.

Wisdom lies in Collaborative Power and Intelligence


In my recent blog posts, I shared insights on Predictive Analytics (Will Predictive Analytics at ‘Speed of Thoughts’ Help Businesses?), Real Time Decisions (How critical are Real Time decisions in business today?) and Personalization (Personalization: A Key Tenet of User Engagement), and their significance in our lives in general and in business today. In the current business paradigm shift toward the social business, it is paramount that businesses look for wisdom in collaborative power and intelligence. There is an old saying that five sticks tied together are strong and unbreakable, unlike a single stick. We have recently witnessed the power of ordinary people uniting and fighting collaboratively, using Facebook and Twitter, to topple dictators in Tunisia, Egypt and Libya, and to threaten absolute rule in Syria. And in India, one man’s (Anna Hazare’s) campaign against corruption went viral, bringing thousands to the streets in support.

As anyone who has worked in a sizeable organization knows, there is no guarantee that the organization as a whole will perform efficiently and achieve its goals, even if each employee is individually efficient and every team has a high level of productivity. To achieve enterprise productivity, it is necessary not only for individuals and groups to “do things right” by working productively but also for the enterprise as a whole to “do the right things” – form the right teams, make the right decisions, allocate resources correctly, and effectively coordinate activities across the entire organization.

Most organizations fall short of the optimal level of enterprise productivity because of one or more of these reasons, all at a great cost to the business.

  • They are disconnected from themselves, with various parts of the organization unintentionally working at cross-purposes with each other.
  • Information that exists is not being shared or reused.
  • Human talent is not being applied where it is most needed.
  • The same problems are being solved repeatedly by multiple groups.

Intelligent collaboration through automated business processes has the ability to alter the course of any important business activity, with a potentially dramatic impact on the financial performance of the business. Whether it is a simple email exchange, a physical or virtual meeting, a task force, or a large-scale project, the activity is inherently collaborative.  In fact, collaboration can be defined as the work that takes place among people when a business process is not pre-determining how the work should take place.  Collaboration is many things: information sharing, brainstorming, problem solving, best practice negotiation, innovation, coordination of activity, alignment of purpose, and so forth.  Collaboration is the “white space” between the business processes; it is the glue that holds an organization together, and the lubricant that allows the machinery to keep running.

Real-time search and the collaborative capability of the right people with the right content, supported by defined processes, will provide unparalleled wisdom in the organization in today’s most competitive business environment. Interestingly, technologies such as the Oracle WebCenter platform offer these capabilities in our web-based business transactions and complement the overall collaborative intelligence and power.

Will Predictive Analytics at speed of thoughts help businesses?


Alakh Verma, Director, Platform Technology Solutions, Oracle

In my recent blog posts, I shared insights on real time decisions (How critical are Real Time decisions in business today?) and their significance in our lives in general and in business today. Threats and damage caused by major catastrophes such as earthquakes, tsunamis and air, rail or ship accidents can be minimized, if not averted, with the help of predictive and agile analytics at the speed of thought. Why should business be any different? Not only can product failures, major losses and economic disasters be avoided; they can be turned around with major innovation in the area of in-memory analytics.

Analytics is all about gaining insights from the data for better decision making. The business press is abuzz with examples of leading organizations across the world using data-driven insights for strategic, financial and operational excellence. A recent study on “data-driven decision making” conducted by researchers at MIT and Wharton provides empirical evidence that “firms that adopt data-driven decision making have output and productivity that is 5-6% higher than the competition”.

There are a few solutions offered in this space, such as those from SAP and IBM, as well as specialized players such as QlikTech and Tableau. Oracle has recently announced Exalytics, an engineered system enabling fast and easy ad hoc analysis across large end-user communities using an in-memory processing engine. Speed of thought and instant response are the hallmarks of its functionality, making it highly applicable to a range of ad hoc, what-if analysis, forecasting and real-time planning applications. The in-memory capabilities are key to enabling a highly interactive and visual analytic experience for end users.

Although Oracle Exalytics is a new product, the architecture is built on several existing Oracle products: parallelized versions of its TimesTen in-memory database and Oracle Essbase OLAP Server (a specialized in-memory version), together with an optimized version of the Oracle BI Foundation Suite (OBIEE 11g for standard enterprise-grade BI query, analysis, reporting, dashboarding and other visualizations). Most, if not all, of these software products have been modified for parallel, in-memory data-processing architectures. Together they deliver query optimization, complex multidimensional analysis and planning calculations, and enterprise-wide BI scale, respectively, through a revamped user interface designed for “speed-of-thought” analytics.

These predictive analytics capabilities can easily be leveraged on demand, anywhere, from mobile devices at the speed of thought, not only to avert disasters but to minimize risks and turn them around for growth and prosperity.

7 Best Practices of Web Experience Management


Alakh Verma, Director, Platform Technology Solutions, Oracle

In my recent blog posts, I shared insights on personalization and the personalized care that plays a significant role in offering a pleasant user experience (Personalization: A Key Tenet of User Engagement), and then on the importance of portals and content (If Content is the King, then Portal is the Queen). In this post, my effort is to identify and summarize some of the best practices of Web Experience Management (WEM) for creating a delightful customer experience, based on a recent research report by CITO Research.

The immense success and proliferation of Facebook, Twitter, LinkedIn, Foursquare and mobile/smart devices such as iPhones, BlackBerrys and tablet PCs among consumers has driven the consumerization of IT and a new paradigm shift toward a pleasant web experience. Consumers have become more powerful than ever before, with access to information and to social media at their fingertips to communicate to the world.

1. Social Computing and Social Media are changing the way people interact with each other and with companies online. A recent report from comScore showed that two-thirds of shoppers begin their process online, and the most frequent starting point is the retail site itself. These consumers are engaging, connecting and collaborating. They want to understand what their friends liked, what other options are available, and what people like them ultimately bought. Many of these consumers consider Facebook recommendations when making purchasing decisions.

Jeff Bullas, in a recent blog post, writes about Best Buy’s successful implementation of social media, which energized its employees and customers (How Best Buy energized employees and customers with social media).

2. Consistent Experience Across Multiple Channels and Devices - Customers now have many choices for accessing the Web in new, engaging and collaborative ways, and they want that ease and diversity reflected in their dealings with businesses as well. Consumers change channels and switch devices whenever they need to. They can keep tabs on friends and family on Facebook and follow Twitter regularly. They can find a nearby restaurant with Foursquare geolocation on their phone and then look up reviews of that restaurant on Yelp.

With changing consumer patterns and behavior, it is extremely important to offer targeted, personalized, relevant and consistent experiences across these channels and devices.

3. Mobility: Smart Devices and Tablets - We have witnessed a smart revolution in the mobile space. As per a recent statistics report, the number of mobile devices has increased fivefold from 1 billion to 5 billion, with a resulting escalation in the number of connected people from 400 million to over 2 billion. This unprecedented growth of connectivity is creating an overwhelming range of new possibilities, and tapping, swiping, locating, pinging and socializing are quickly becoming part of normal human behavior. Technology is starting to change people, and these people, whether consumers or employees, will change businesses. As per a recent report by research firm IDC, mobile usage will surpass that of PCs and other wired devices by 2015. Mobile data traffic is expected to increase 26-fold between 2010 and 2015. In this rapid shift, mobility has to be an integral part of solutions and frameworks.

4. Real-Time Decisions and Social Analytics - Recent technologies support better decision making: big data analytics supplies the right information, real-time event messaging delivers it at the right time, mobility makes it available in the right place, and social media provides the right context. I have shared my insights in an earlier post on how critical real-time decisions are in business today: http://t.co/H6Yf0J9k

5. Personalized Experiences - In the connected business environment, consumers expect companies to know them and their likes and dislikes. For a compelling and relevant experience across channels, companies need to target, analyze, and optimize the customer experience. I discussed personalized experiences in an earlier post (Personalization: A Key Tenet of User Engagement).

6. Multiple Stakeholders Need Control - Expectations for system usability and manageability have grown manifold. Marketers and line-of-business executives need website management to be easy so they can run campaigns and publish relevant content on schedule. Nontechnical users need to be empowered to build websites, design layouts, make content changes, set up targeting rules, control user-generated content, and enable the mobile web, all from an intuitive, easy-to-use interface. (Oracle WebCenter Sites, based on the FatWire acquisition, enables this seamlessly.)

7. Integrated Campaigns to Drive Customers - To create a compelling web experience for customers, businesses need the ability to organize and access their enterprise data and leverage it in web interactions. They need a technology platform that supports integrated campaigns encouraging repeat business and suiting the consumer's lifestyle, whether at work, on the road with an iPhone, visiting a physical store, or calling customer service.

Oracle WebCenter is one of the leading technology platforms offering comprehensive web experience management capabilities such as targeting and optimizing content, social computing, and multichannel engagement, all of which help improve customer loyalty, drive web traffic, and target new customer segments. In one integrated suite, it combines an array of complementary capabilities: web experience management (Oracle WebCenter Sites), composite applications and mashups (Oracle WebCenter Portal), social networking and collaboration (Oracle WebCenter Social), and enterprise content management (Oracle WebCenter Content).

How critical are Real Time decisions in business today?


Alakh Verma, Director, Product Management, Oracle

We face decisions every day; some are minor, such as paying bills, and some are major, such as buying a house, investing in stocks, developing a product, or acquiring a company, and the major ones need relevant information in context. Looking back to the era before computers and ready access to data, one wonders how people made such major decisions in daily life, in business, or even in administering nations.

Recently, I watched a movie about the Indian Mughal emperor Akbar that depicted his popularity among citizens, which rested on his real-time decision-making, his intuitive and cognitive intelligence, and the advice of his council of ministers. He used to walk the streets in disguise to listen to citizens' conversations, gathering real-time feedback and sentiment so he could execute effective decisions. In those days there was no supporting technology, and all decisions were based on intuitive judgment. Our brains take in massive streams of sensory data and make the necessary correlations that allow us to form value judgments and make decisions, all in real time.

Relating the above to our current era, we have the support of computing power, memory, and data processing to make real-time decisions. Recent technologies reinforce this: big data analytics supplies the right information, real-time event messaging delivers it at the right time, mobility makes it available in the right place, and social media provides the right context for making the right decisions.

With computing power and cognitive intelligence, we are in a much better position to offer the following for real-time decisions in a business context.

  • Personalized, Optimized, and Context-Centric Decisions - Across the thousands, if not millions, of interactions we have with customers, we can deliver in context the right offer, message, recommendation, treatment, or action, tailored and personalized within the "moment of impact," to deliver unequaled value.
  • Cross-Channel Learning and Decisions - A multichannel customer experience platform enables true cross-channel decisioning: consistent operational decisions on the web, in the contact center, at the point of sale, and across all lines of business. Insights derived automatically in one channel are seamlessly used within and across other channels.
  • A Complete Decision Framework Leveraging Data, Rules, and Predictive Models - A balanced decision management framework that combines business rules with self-learning predictive models supports real-time decisions. It also arbitrates between rules and predictive-model scores in the context of organizational goals/KPIs at the moment a decision is executed.
  • Closed-Loop Business Intelligence and Insight Discovery - Closed-loop real-time learning becomes immediately available for the next prediction, driving adaptive, high-value interactions. Important correlations in the data can be discovered and highlighted automatically in user-friendly reports, and automated data discovery leads users effortlessly to relevant business insights.
  • Business Controls Supporting Collaboration over End-to-End Decision Lifecycles - A rich web-based application for managing, analyzing, and refining decisions over their entire lifecycle helps gather cognitive and collaborative intelligence.
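
The idea of a decision framework that combines business rules with predictive-model scores can be sketched in a few lines. This is a minimal illustration only; all function names, offer fields, and scores below are hypothetical, not any specific Oracle product API.

```python
# Sketch: business rules veto offers, a predictive-model score ranks the
# survivors, and a KPI weight arbitrates in line with organizational goals.

def eligible(customer, offer):
    """Business rules: hard constraints that can veto an offer outright."""
    if customer["credit_score"] < offer["min_credit"]:
        return False
    if offer["region"] not in ("ANY", customer["region"]):
        return False
    return True

def model_score(customer, offer):
    """Stand-in for a self-learning predictive model (propensity to accept)."""
    score = 0.1
    if offer["category"] in customer["interests"]:
        score += 0.5
    return score

def decide(customer, offers, kpi_weight):
    """Pick the rule-eligible offer with the best KPI-weighted model score."""
    candidates = [o for o in offers if eligible(customer, o)]
    if not candidates:
        return None
    return max(candidates, key=lambda o: model_score(customer, o) * kpi_weight(o))
```

Passing, say, `kpi_weight=lambda o: o["margin"]` biases the arbitration toward a margin KPI; in a real system the model score would come from a trained model rather than a hand-written heuristic.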

In today's highly competitive business environment, real-time decisions are critical for effective decision making and for the survival and growth of a business.


If content is the king then Portal is the queen



Alakh Verma, Director, Product Management, Oracle

The World Wide Web (WWW) was conceived by Tim Berners-Lee as a tool for creating and gathering knowledge through human interaction and collaboration. Web 1.0 was merely a presentation web of static HTML pages. Web 2.0 then gradually advanced toward online participation in content creation and social interaction, encompassing e-commerce, e-services, and content strategy, all aspects of the "Transactional Web," through social media such as Facebook, Twitter, and LinkedIn. We are now transitioning into Web 3.0, first described by Dakota Reese Brown as "The Contextual Web." Content and context will now be extremely significant in determining what content we need and with whom we need to collaborate in social business.

It is estimated that by 2020 there will be 4 billion people online, 31 billion connected devices, 25 million applications, 1.3 trillion sensors/tags, and 50 trillion gigabytes of content created in the networked society. So we are moving into a world where content will be king and will help determine the success or failure of any business. Web content has changed, and so has web content management. During the last decade, the focus was on making it easier and more powerful for non-technical people to move content from their desktops to their websites, and on giving enterprises powerful workflows, approval processes, and the ability to integrate with other enterprise tools and applications. The next decade will focus more on providing web experience, or customer experience management, on the web. Content and portal will remain at center stage as we evolve into a new era of computing.

The explosive growth of content, and more specifically of unstructured content such as videos on YouTube, photos and chat messages on Facebook, and email, has given birth to yet another evolutionary paradigm, big data management, as we enter the year 2012. As content remains the king, it is paramount to store, manage, archive, and retrieve it from a unified repository, and Oracle Content 11g does all of this seamlessly. Customers have started seeking a pleasant web experience for their transactions, and businesses need to offer a robust customer engagement platform to meet that expectation.

Content is the King

As content is the king, the web portal, or public portal, which Webopedia defines as a website or service offering a broad array of resources and services such as e-mail, forums, search engines, and online shopping malls, seems to be the queen, rendering context-based content from a unified repository and from any source on the web so that the two complement each other. Without a sound portal framework, users would not be able to get the right content in the context of their business in real time to execute their transactions.

 

 

Portal is the Queen

For example, when a customer visits Best Buy's web portal for product information leading to a subsequent sales transaction, the first thing they will look for is a context-based search interface that surfaces relevant content. The portal interface should also suggest similar or recommended products based on other users' feedback and ratings. Quick references and recommendations, with an option to verify via chat or email, go a long way toward convincing a prospective buyer to make a decision.

Interestingly, technologies such as Oracle WebCenter Portal and Sites offer these capabilities, helping the portal be the queen in web-based business transactions and complementing web content management in overall customer experience management.

Personalization: A Key Tenet of User Engagement in Social Business

Alakh Verma, Director, Platform Technology Solutions, Oracle

In today's constantly changing and highly competitive business environment, consumers seek more personalized care and service in any engagement before they enter into a transaction. It is natural that a relationship begins with humble greetings, a smile, and natural conversation, followed by an understanding of needs and behavior along the way.

User engagement involves a mixture of quantitative and qualitative analysis. Quantitative analysis reveals useful patterns and is generally more scalable and easier to conduct. User engagement also involves contextual study and ethnography, which provide information about who a person is, what their daily routines are, and what needs brought them to the website, physical store, or workplace.

A few weeks back, I walked into a Wells Fargo branch in San Jose to carry out a routine banking chore: depositing a cheque. I was overwhelmed by the warm greeting of its store manager, Robert Borcherding, who offered to fill out the pay-in slip while I waited in the queue for the teller. When it looked like the wait might be long, he came back and arranged for a personal banker to accept my cheque for deposit. He kept talking with me, humbly offering the best care he could while I was in the store.

Last week, when I walked into his store again for another banking transaction, I was pleasantly surprised to be greeted by name, with a smile, and again offered assistance. I could see his urgency to satisfy me fully. This time I was there for a scheduled meeting with his banking manager, Rosa S. Aguirre, about refinancing. When he saw that she was still busy with another customer, he offered to make photocopies of all the documents she might need. While performing this act of extended personalized service, he was winning my trust, and my heart, in the process. By the time Rosa was free, all my documents were ready for the meeting. I would call this a clear display of personalization, one that will go a long way toward building trust and gaining, growing, and retaining business for his store and for Wells Fargo in general. By contrast, I receive numerous aggressive and disturbing call-center calls from a competing bank pitching its services, and I refuse to take them. Many of us have faced similar episodes in our lives, and we appreciate the significance of personalization and personalized care in business settings.

Customer service has been widely used and misused, and businesses now need to re-evaluate the paradigm shift in customers' needs and expectations. With recent technology and tools, we can easily build and deploy virtual web stores where consumers get a near real-time user experience with personalization and personalized care.

Personalization implies that changes are based on implicit data, such as items purchased or pages viewed. On intranets or B2E enterprise web portals, personalization is often based on user attributes such as department, functional area, or role.

Image Source http://www.oshyn.com 

There are three categories of personalization:

  1. Profile/group based
  2. Behavior based
  3. Collaboration based

Web personalization models include rules-based filtering, based on "if this, then that" rules processing, and collaborative filtering, which serves relevant material to customers by combining their own personal preferences with the preferences of like-minded others. Collaborative filtering works well for books, music, videos, etc.
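
Collaborative filtering as described above can be sketched in a few lines: find users whose ratings resemble yours, then recommend items they liked that you have not rated. The ratings, user names, and item names below are made up for illustration; a production recommender would use a proper library and far more data.

```python
# Sketch: user-based collaborative filtering with cosine similarity.
from math import sqrt

ratings = {
    "alice": {"book_a": 5, "book_b": 3, "album_c": 4},
    "bob":   {"book_a": 5, "book_b": 3, "video_d": 5},
    "carol": {"book_b": 1, "video_d": 2},
}

def similarity(u, v):
    """Cosine similarity over the items both users have rated."""
    common = set(ratings[u]) & set(ratings[v])
    if not common:
        return 0.0
    dot = sum(ratings[u][i] * ratings[v][i] for i in common)
    nu = sqrt(sum(ratings[u][i] ** 2 for i in common))
    nv = sqrt(sum(ratings[v][i] ** 2 for i in common))
    return dot / (nu * nv)

def recommend(user):
    """Score unseen items by other users' ratings, weighted by similarity."""
    scores = {}
    for other in ratings:
        if other == user:
            continue
        sim = similarity(user, other)
        for item, r in ratings[other].items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * r
    return sorted(scores, key=scores.get, reverse=True)
```

Here `recommend("alice")` surfaces `video_d`, which like-minded Bob rated highly, which is exactly the "preferences of like-minded others" idea.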

There are three broad methods of personalization:

  1. Implicit
  2. Explicit
  3. Hybrid

With implicit personalization, the web page adapts automatically based on the categories mentioned above. With explicit personalization, the user changes the page using features provided by the system (such as Oracle WebCenter Sites). Hybrid personalization combines the two approaches to leverage the best of both worlds.
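
The implicit/explicit/hybrid distinction can be made concrete with a toy sketch: implicit preferences are inferred from behavior (pages viewed), explicit preferences are set by the user, and the hybrid lets explicit settings override inferred ones. All names here are hypothetical illustrations, not a real WebCenter API.

```python
# Sketch: implicit inference from behavior, with explicit overrides (hybrid).

def implicit_prefs(viewed_pages):
    """Infer interests implicitly: feature the most-viewed category."""
    counts = {}
    for page in viewed_pages:
        counts[page["category"]] = counts.get(page["category"], 0) + 1
    return {"featured_category": max(counts, key=counts.get)} if counts else {}

def hybrid_prefs(viewed_pages, explicit):
    """Hybrid: start from implicit inference, let explicit settings win."""
    prefs = implicit_prefs(viewed_pages)
    prefs.update(explicit)  # user-set values override inferred ones
    return prefs
```

The design choice worth noting is the precedence order: an explicit choice the user made should always beat a behavioral guess, which is why the hybrid applies the explicit dictionary last.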

Personalization is also being used in less overtly commercial applications to improve the online user experience. Facebook recently introduced Instant Personalization. This technology is different from social plug-ins, which many B2B and B2C sites already use. Social plug-ins include features such as Facebook live streams and "Like" buttons and are intended to drive user engagement and make a website more social. With Instant Personalization, Facebook shares data with a handful of non-Facebook websites.

Oracle WebCenter personalizes the online buying experience with a comprehensive, highly scalable user engagement platform and applications. Built on a foundation of proven capabilities, including personalization, business-user control, cross-channel support, and a flexible platform, WebCenter boosts cross-channel business growth, and its advanced features help customers quickly find desired products, learn about new offerings, comparison shop, register for gifts, preorder products, redeem coupons, and easily complete their purchases.
