Articles From Zachary Jarvinen
Article / Updated 07-10-2023
They say if it ain't broke, don't fix it, but anyone with high-value assets, whether a fleet of bucket trucks or drilling rigs, knows preventive maintenance is much more effective than performing repairs reactively. Servicing equipment before it fails reduces costly downtime and extends its lifespan, thus stretching your resources as far as possible. This concept certainly isn't new. Routine equipment checks and preventive maintenance in general have been the responsibility of every maintenance department for decades. But here's the good part. You can use AI and Internet of Things (IoT) sensors to go beyond preventive maintenance to implement predictive maintenance. Preventive maintenance prevents failures with inspections and services performed at predetermined intervals. Predictive maintenance uses large volumes of data and advanced analytics to estimate the likelihood that a specific piece of equipment will fail, based on its history and current status, and recommends service before the failure occurs. How do you like the sound of that? It's the sound of asset performance optimization. This figure traces the evolution of this concept.

Spying on Your Machines
Asset performance optimization (APO) collects the digital output from IoT-enabled equipment and associated processes, analyzes the data, tracks performance, and makes recommendations regarding maintenance. APO allows you to forecast future needs and perform predictive maintenance before immediate action is required. Although some machines run continuously with little need for maintenance, others require much more care and attention to operate at their best level. Determining which equipment needs more frequent servicing can be time-consuming and tedious. Often, maintenance guidelines rely heavily on guesswork. Time frames for tune-ups tend to be little more than suggestions, based on information such as shop manuals and a recommendation from the lead mechanic rather than hard data, such as metrics from the performance history of each piece of equipment, including downtime and previous failures. APO, on the other hand, analyzes both structured and unstructured data, such as field notes, to add context for equipment readings and deliver more precise recommendations. Using IoT devices, APO systems gather data from sensors, EIM systems, and external sources, and then use AI to acquire, merge, manage, and analyze this information, presenting the results in real-time dashboards that can be shared and reviewed quickly for at-a-glance updates. Throughout the lifespan of the equipment, through regular use and maintenance, the system continues to learn and improve its insights.

Fixing It Before It Breaks
APO allows you to take a strategic approach to predictive maintenance by focusing on what will make the greatest difference to your operation. Unforeseen equipment malfunctions and downtime cause disruptions, which can ultimately jeopardize project timelines, customer satisfaction, and business revenue. These are the benefits of APO:
Smoother, more predictable operations: Equipment issues are addressed preemptively instead of after a disruption occurs, leading to greater overall operational efficiency. Implementing APO can help companies eliminate up to 70 percent of unanticipated equipment failures.
Reduced downtime: Predictive maintenance typically reduces machine downtime by 30 to 50 percent.
Boosted productivity: In addition to reducing downtime, predictive maintenance allows you to become more strategic in scheduling maintenance.
This also can uncover any routine maintenance tasks that can be performed at less frequent intervals.
Lowered costs: APO can reduce the time required to plan maintenance by 20 to 50 percent, increase equipment uptime and availability by 10 to 20 percent, and reduce overall maintenance costs by 5 to 10 percent, according to a Deloitte study.
Increased customer satisfaction: Assets nearing failure can compromise production quality, cause service outages, and create other circumstances that ultimately affect the customer. By preventing these issues from happening, APO helps companies achieve and maintain better customer satisfaction.
Improved safety outcomes: Equipment malfunctions can result in serious injury, but often companies don't know a system is about to fail until it's too late. PwC reports that APO and predictive maintenance can improve maintenance safety by up to 14 percent.
Reduced risk of litigation and penalties: With fewer breakdowns and disruptions to service, organizations can minimize their risk of costly fines, lawsuits, and subsequent reputational damage.
Ultimately, in any industry with high-value equipment, or even large numbers of low-cost assets, APO pays off. It directs technicians' efforts to the machines that need attention the most instead of spending time on equipment that doesn't need it. This leads to predictable and seamless operations, improved uptime, increased revenue opportunities, and greater customer satisfaction.

Learning from the Future
APO solutions allow you to enhance your operations by making your machines smarter and sharing that intelligence with your workforce, thereby maximizing the value of both your human teams and your mechanical equipment. As the age of AI advances, the companies that thrive will be those that find the best ways to harness data for improved operational performance.

Data collection
APO continuously collects data on mechanical performance from IoT sensors in virtually any type of device or machine, ranging from hydraulic brake system sensors on a train to temperature monitors inside industrial and medical refrigerators holding sensitive products. The system collects numerous data points from the sensors, blends them with other sources, and analyzes the results. For example, in the case of a hydraulic brake system, APO compares current data to historical performance records, including failure reports, to deliver predictive maintenance insights. When further data inputs are blended with this specific brake data, even richer and more accurate recommendations can be delivered. Additional input samples from internal and third-party sources can be blended to provide context and greater insight; these types of input can be invaluable:
Weather data
Maintenance recommendations from manuals
Supplier quality data
Historical brake maintenance schedules and failure rates
Passenger travel analysis
Heavy or unusual usage data
Using this comprehensive blend of data, you can implement an APO solution to recognize patterns and perform an in-depth analysis in multiple applications, from manufacturing plants to utilities and even healthcare. The data can include metrics such as temperature, movement, light exposure, and more collected from IoT sensors on fleets, plants, pipelines, medical imaging equipment, jets, grids, and any other Internet-enabled device.
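To make the data-blending step concrete, here is a minimal sketch in Python, assuming hypothetical CSV extracts of brake sensor readings, maintenance history, and daily weather; the file and column names are illustrative, not from any particular APO product.

```python
# A minimal data-blending sketch. The CSV files and column names are
# hypothetical examples of IoT sensor readings, maintenance history, and
# daily weather for a fleet of train brake systems.
import pandas as pd

sensors = pd.read_csv("brake_sensor_readings.csv", parse_dates=["timestamp"])
history = pd.read_csv("maintenance_history.csv", parse_dates=["service_date"])
weather = pd.read_csv("daily_weather.csv", parse_dates=["date"])

# Roll raw readings up to one row per asset per day.
sensors["date"] = sensors["timestamp"].dt.normalize()
daily = (sensors.groupby(["asset_id", "date"])
                .agg(avg_pressure=("hydraulic_pressure", "mean"),
                     max_pad_temp=("pad_temperature", "max"),
                     brake_cycles=("brake_cycles", "sum"))
                .reset_index())

# Add context: days since each asset was last serviced, plus the day's weather.
last_service = (history.groupby("asset_id")["service_date"].max()
                       .rename("last_service").reset_index())
daily = daily.merge(last_service, on="asset_id", how="left")
daily["days_since_service"] = (daily["date"] - daily["last_service"]).dt.days
daily = daily.merge(weather, on="date", how="left")

print(daily.head())
```

The same join-and-aggregate pattern applies whether the context comes from weather feeds, supplier quality data, or passenger travel records.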
Analysis
AI uses big data analytics and natural-language processing to derive key data points from structured data as well as unstructured content such as field notes and equipment manuals. You can use AI to analyze this information and relevant historical data to identify patterns and generate questions that help engineers, maintenance supervisors, production managers, and other key personnel make informed and timely decisions. APO solutions use these patterns to make predictive conclusions that help you answer questions in various areas, such as these:
Timing: Am I performing inspections at appropriate intervals? Would shortening the intervals improve overall uptime? Or could I afford to lengthen the intervals to reduce resource expenditure?
Quality: Could defective components be slipping through my inspections and leading to downtime? If so, how can I improve the inspection process to prevent this issue moving forward?
Design: Can my design be modified to reduce future failures?
For example, a predictive conclusion formed by the patterns observed in the case of the train brake system might indicate the need for shorter inspection intervals. This is where humans come in and leverage all of these valuable findings to improve their business.

Putting insights to use
After patterns are identified and their related questions are answered, the predictive conclusions provided by APO solutions can then be implemented. For example, train maintenance workers can schedule inspections more frequently to check a key component in the hydraulic brake system that has shown a tendency to fail. Or perhaps the APO solution discovers that a defective component in the train needs attention. Field engineers can use a digital model of the train to determine a repair strategy. If the part cannot be repaired, the APO solution can trigger a replacement part order through the supply chain network, using automation to streamline the process of getting the train back up and running.
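Continuing the train-brake example, here is a rough sketch of what the analysis step might look like, assuming a blended feature table like the one sketched earlier with a historical failed_within_30d label; the model choice, risk threshold, and inspection message are illustrative assumptions rather than a vendor workflow.

```python
# A rough sketch of the analysis step: learn from history, score the current
# fleet, and flag high-risk assets. File names, features, the label
# failed_within_30d, and the 0.7 risk threshold are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

data = pd.read_csv("blended_brake_features.csv")
features = ["avg_pressure", "max_pad_temp", "brake_cycles", "days_since_service"]

X_train, X_test, y_train, y_test = train_test_split(
    data[features], data["failed_within_30d"], test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))

# Score the current fleet and recommend inspections where risk is high.
current = pd.read_csv("current_brake_features.csv")
current["failure_risk"] = model.predict_proba(current[features])[:, 1]
for _, row in current[current["failure_risk"] > 0.7].iterrows():
    print(f"Schedule inspection for asset {row['asset_id']} "
          f"(risk {row['failure_risk']:.0%})")
```

In a full APO deployment, the flagged assets would feed a work-order system rather than a print statement, and the model would be retrained as new failure reports arrive.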
Article / Updated 07-05-2023
All that data being collected in manufacturing from IoT devices at unprecedented volume and velocity is driving the fourth industrial revolution. The first industrial revolution was powered by steam. The second was powered by electricity. The third was powered by silicon, which enabled unprecedented computing power. And the fourth industrial revolution is being powered by data. In fact, in the last decade, data has emerged as the new currency that operates across all levels of commerce, right down to the consumer, who pays for the use of “free” social media platforms with their personal data, which those platforms exchange with their clients. The combination of AI and analytics can help manufacturers optimize the use of their IoT data for many applications. Consider three related strategies: proactive replenishment, predictive maintenance, and pervasive visibility. The following figure shows the relationship between the method of controlling costs and the AI technique used to accomplish it. Connected supply chain A supply chain connects a customer to the raw materials required to produce the product. It can be as simple as a single link, such as when the customer stops at a roadside stand to buy tomatoes directly from the farmer. Or it can be very complex and involve dozens of links, such as all the steps between a customer driving a car off the lot back to mining the iron ore or bauxite for the steel or aluminum engine block. In a traditional supply chain, each link operates as a black box, connected to each other by a paper-thin link made up of documents such as purchase orders, shipping manifests, invoices, and the like. Each entity does its own planning, forecasting, and ordering while being blind to conditions on either side of it in the chain. You could think of the links in the traditional supply chain as separate continents, each with its own ecosystem but largely isolated and insulated from the ecosystems of other continents. A connected supply chain brings those continents together like the supercontinent Pangea in the Paleozoic era, forming one interconnected ecosystem of partners, suppliers, and customers. In the connected supply chain, the paper-thin connection of the traditional model is replaced with a digital connection that provides full visibility in all directions. A 2017 McKinsey study suggests that companies that aggressively digitize their supply chains can expect to boost annual growth of earnings by an average of 3.2 percent — the largest increase from digitizing any business area — and annual revenue growth by 2.3 percent. In a recent KRC Research survey of manufacturing executives, 46 percent indicated that big data analytics and IoT are essential for improving supply chain performance. They identified the two areas where big data analytics can have the greatest impact on manufacturing to be improving supply chain performance (32 percent) and enabling real-time decisions and avoiding unplanned downtime (32 percent). The study also identified the top three benefits of big data for manufacturers: Enabling well-informed decisions in real-time (63 percent) Reducing wasted resources (57 percent) Predicting the risks of downtime (56 percent) The connected supply chain enables you to react to changing market needs by sharing insights between every node in the value chain. 
There are many other applications for AI in manufacturing, some of which I address later, but the three elements of the connected supply chain — proactive replenishment, predictive maintenance, and pervasive visibility — provide an intuitive case for using AI to revolutionize your business.

Proactive replenishment
Optimizing inventory levels while improving customer experience requires the ability to automate much of the replenishment process. Proactive replenishment leverages analytics to monitor inventory consumption and initiate a purchase order on the business network to the supplier to replenish the stock before an out-of-stock situation occurs. The intelligent and connected supply chain provides real-time inventory visibility. In addition to reporting stock levels, it can indicate the condition of each item — such as the temperature at which it is stored — to ensure the quality of those items. Manufacturers can automate the replenishment of parts from the supplier before they are needed in the production process. They can also query multiple suppliers to determine which one has availability and the best rates, taking into account the shipping times required for just-in-time delivery.

Predictive maintenance
The ability to predict when a part or sub-system of a serviceable product is likely to fail, and to intervene before it does, can save a manufacturer millions of dollars, and thus predictive maintenance is a key investment area for the supply chain. Whether that part is within the production process, within the warehousing environment, or part of a connected vehicle, an IoT network automatically monitors and analyzes performance to boost operating capacity and lifespan. This system intelligently decides whether the part needs to be replaced or repaired and can automatically trigger the correct process, typically reducing machine downtime by 30 to 50 percent and increasing machine life by 20 to 40 percent. Studies by the Federal Energy Management Program of the U.S. Department of Energy found that predictive maintenance provides a 30 to 40 percent spending reduction compared to reactive maintenance, and an 8 to 12 percent spending reduction compared to preventive maintenance. Organizations often make incremental progress toward predictive maintenance, starting with monitoring IoT data in the control room and reacting to problematic readings. In the next stage, they run reports to view recommendations for maintenance. In the final stage, they remove the requirement for human interaction to initiate maintenance by automating the creation of repair tickets.

Pervasive visibility
Pervasive visibility is the ability to see exactly where goods are during their life cycle, providing a view of the current state of all assets across the entire organization and beyond to partners, customers, competitors, and even the impact of the weather on operations and fulfillment. IoT plays a key role in providing that visibility. Current estimates predict that there will be 75 billion connected devices by 2025, nearly ten connected devices for every human on the planet. Most of those devices will be in the manufacturing sector. When you consider that a single device can produce gigabytes of information each minute, the volume and velocity of data within production and operations can easily spiral out of control — or, more often, simply be ignored. While 60 percent of executives that McKinsey surveyed said that IoT data yielded significant insights, 54 percent admitted that they used less than 10 percent of that IoT information.
Making sense of that data is where AI comes in. By combining AI and analytics, you can bring together information from a variety of data sources, identify patterns and trends, and provide recommendations for future actions. It provides the basis for new levels of production and business process automation, as well as improved support for employees in their daily roles and in their decision-making. To achieve these gains in the supply chain, AI has the potential to bring together both structured and unstructured data from a wide range of sources, including IoT devices, plant operations, and external partners to identify patterns linking factors such as demand, location, socioeconomic conditions, weather, and political status. This information forms a basis for a new level of supply chain optimization spanning raw materials, logistics, inventory control, and supplier performance and helps anticipate and react to market changes.
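As one concrete illustration, here is a minimal sketch of the proactive replenishment logic described earlier in this article, combining a classic reorder-point rule with a simple supplier choice; the suppliers, prices, quantities, and thresholds are hypothetical.

```python
# A minimal proactive-replenishment sketch: a classic reorder-point rule plus
# a simple supplier choice. Suppliers, prices, and quantities are hypothetical.
from dataclasses import dataclass

@dataclass
class Supplier:
    name: str
    unit_price: float
    lead_time_days: int
    units_available: int

def reorder_point(daily_usage: float, lead_time_days: int, safety_stock: int) -> float:
    # Expected demand during the supplier's lead time, plus a safety buffer.
    return daily_usage * lead_time_days + safety_stock

def choose_supplier(suppliers, qty_needed, days_until_stockout):
    # Keep suppliers that can ship enough units in time, then take the cheapest.
    viable = [s for s in suppliers
              if s.units_available >= qty_needed
              and s.lead_time_days <= days_until_stockout]
    return min(viable, key=lambda s: s.unit_price) if viable else None

suppliers = [
    Supplier("Acme Hydraulics", 14.50, 5, 800),
    Supplier("Brakewell Parts", 13.90, 12, 2000),
]
on_hand, daily_usage = 280, 40
if on_hand <= reorder_point(daily_usage, lead_time_days=5, safety_stock=100):
    days_left = on_hand / daily_usage
    pick = choose_supplier(suppliers, qty_needed=500, days_until_stockout=days_left)
    if pick:
        print(f"Raise purchase order: 500 units from {pick.name} "
              f"at ${pick.unit_price:.2f} each")
```

In a connected supply chain, the purchase order would be posted to the business network automatically rather than printed, and the usage and lead-time figures would come from live IoT and supplier data instead of constants.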
Article / Updated 09-01-2020
Business organizations look to professional services firms to offload existing processes such as payroll, claims processing, and other clerical tasks. Consequently, rather than push the innovation curve as early adopters of emerging technology, professional services firms have traditionally followed well-established procedures and used conventional tools. However, much of the work they take on involves processes that are well suited for optimization through AI, and many corporations are investigating the benefits of AI for streamlining workflows and cutting operational expenses. A KPMG report predicts that enterprises will increase their spending on intelligent automation from $12.4 billion in 2019 to $232 billion in 2025, almost 19 times as much in just six years. A McKinsey report estimates that 20 percent of the cyclical tasks of a typical finance unit can be fully automated and almost 50 percent can be mostly automated.

Exploring the AI Pyramid
From all appearances, the industries typically served by professional services firms are in the early stages of a tectonic shift that will reverberate throughout the professional services industry. The initial shock will involve adopting new ways of organizing and delivering professional services, but the aftershocks could very well challenge the essence of what professional services firms deliver. The following figure shows the hierarchy of business complexity. AI projects usually start at the base of the pyramid, where the goal is to save costs by optimizing manual processes via human-machine collaboration. As the projects move up the pyramid, they move away from saving costs and focus on increasing revenue by making more informed decisions regarding existing lines of business or launching new lines of business. In a real-life example, one cookware company uses home demonstrations to sell high-end pots and pans using internal financing. They brought in AI to replace a manual workflow based on rules and decision trees with a semi-automated process that streamlines the underwriting decision and reduces acquisition cost. This project was a tactical move to save money by making the process more efficient. On the strength of that success, they moved up the pyramid. They used AI to analyze the historical behavior of accounts that underwriting declined and passed on to a third-tier lender. The model looked for common characteristics of borrowers who had been declined but obtained financing from the third-tier lender and paid the loan in full rather than defaulting. They then applied the model to new loan applications to identify candidates who might not meet the traditional requirements for financing but were still a good risk. This project was a strategic move to expand their market to increase revenue. To begin with, if AI doesn't eventually replace the most fundamental tiers of service delivery, such as paper handling and data entry, it will at the very least optimize them to the point that they can be delivered by a significantly reduced staff through human-machine collaboration. Or it could lead to an increase in staff by freeing up funds through increased efficiency. An Accenture report indicated that AI could boost employment levels by 10 percent if the rest of the world invested in AI and human-machine collaboration at the same level as the top-performing 20 percent. Professional services firms touch many industries, and as technology matures and reshapes those industries, it necessarily affects how professional services firms engage their clients.
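As a rough illustration of the cookware company's second, strategic step, the sketch below trains a simple model on a hypothetical table of declined applicants and their outcomes with the third-tier lender, then scores new applications; the files, features, and review threshold are assumptions for the example only.

```python
# A rough sketch of the strategic step: learn which declined applicants later
# repaid a third-tier lender in full, then score new applications. The data
# files, features, and 0.8 review threshold are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

declined = pd.read_csv("declined_applications.csv")
features = ["income", "debt_to_income", "years_at_address", "prior_purchases"]
# Label: 1 if the third-tier lender financed the applicant and was repaid in full.
target = declined["repaid_third_tier_loan"]

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(declined[features], target)

new_apps = pd.read_csv("new_applications.csv")
new_apps["approval_score"] = model.predict_proba(new_apps[features])[:, 1]

# Route high-scoring applicants that the rule book would have declined to a
# human underwriter for review rather than approving them automatically.
review_queue = new_apps[new_apps["approval_score"] > 0.8]
print(review_queue[["applicant_id", "approval_score"]])
```

Keeping a human underwriter in the loop for the borderline cases is the same human-machine collaboration pattern described at the base of the pyramid.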
AI won’t replace core professional expertise, but it will make you more efficient and thus enable you to increase the value proposition for your clients. However, professionals who do embrace AI will replace those who don’t. Climbing the AI Pyramid The research tells us that enterprises across the board will increasingly turn to AI and big data to reduce costs and errors while improving efficiency and strategic planning. With a history of anticipating the needs of the market and then providing the services, you can use that knowledge to automate your own back-office processes and build on that experience to offer expanded services that relieve your clients of the heavy lifting of creating the architecture for an in-house or out-sourced AI initiative. Many firms focus on helping their clients automate routine tasks as a low-friction entry point with obvious time and cost savings. This simple application also serves as a platform for educating the client on the principles of AI and evaluating use cases for the best fit and results. With a proven win under their belt, clients are more receptive to expanding the role of AI and machine learning in their organizations, allowing them to introduce innovation and differentiation in their product and service offerings and to use the data to tackle tasks at higher levels of the pyramid. By applying AI to your client’s environment, you can also increase your value to your clients by weaning them from reactively correcting when unexpected issues arise to forestalling common issues with preventative practices and ultimately to anticipating outcomes with predictive management. As the application of AI to routine processes relieves employees from attending to mundane tasks, it also frees them to tackle more valuable and interesting tasks, thus enhancing their own career paths and adding more value for the clients. Another byproduct of the cycle of expanding automation upward through the tiers of the complexity pyramid is that, as the capabilities of artificial intelligence grow, the practices of your firm become more specialized until they are distilled to services that are beyond the touch of AI. Or the singularity happens, whichever comes first. But until that time, those who lean into innovation will gain the competitive advantage, but only if they incorporate continuous learning for their employees as a part of the business model. Unearthing the Algorithmic Treasures The uses for AI are as varied as the industries served by professional services firms. Healthcare AI can quickly and economically acquire, classify, process, and route unstructured text to everyone in the information pipeline, increasing accessibility while lowering costs. Natural-language processing can extract targeted information from unstructured text such as faxes, clinical notes, intake forms, and medical histories, to improve end-to-end workflow. The process starts with data capture and classification, and then routes data and documents to the appropriate back-end systems, spotting exceptions, validating edge cases, and creating action items. Content management AI uses machine learning, text mining, and natural-language processing to process content, extracting concepts and entities, such as names, places, dates, and customized elements relevant to the business. AI then uses that information to create metadata and import it into a structured database, accelerating searches and data analysis. 
At the same time, the system automatically classifies the document based on its type and content and either assigns it to the next step in an automated workflow or flags it for review. Compliance AI uses unstructured data mining, robotic process automation, statistical data aggregation, and natural-language processing to read and interpret compliance documents, interpret metadata, and identify roles and relationships, and then uses cognitive-process automation to deliver concise, actionable insights. AI uses supervised and unsupervised learning, natural-language processing, and intelligent segmentation to capture, analyze, and filter possible compliance violations to discard false positives that waste the time of compliance officers. AI uses structured and unstructured data mining and natural-language processing to monitor internal and external records, documents, and social media to detect errors, violations, and trends, allowing the compliance department to be proactive and avoid costly penalties. AI uses robotics-process automation, natural-language processing, and machine learning to identify potential violations of Know Your Customer (KYC) and Anti-Money Laundering (AML) regulations. Law AI uses text mining to process large pools of unstructured data, such as legal documents, emails, texts, and social media to identify key concepts, categorize content, detect subjectivity, isolate behavior patterns, discern the sentiment expressed in the content, and extract phrases and entities such as people, places, and things. AI uses supervised and unsupervised learning based on native or custom taxonomies to classify or characterize large volumes of documents and cull irrelevant documents as required in support of pre- and post-production activities, such as early case assessment and privilege detection. AI uses machine learning and natural-language processing to analyze large amounts of textual content and distill it into short summaries and chronologies, which can display entity and concept trends over time, as well as behavioral patterns of persons of interest. The results can be integrated with data visualization to display the outcome in a consumable and intuitively understandable structure using interactive reports and dashboards. Manufacturing AI uses decision trees and neural networks to establish baseline requirements and then uses real-time data to reveal patterns and relationships to determine demand behavior, which drives optimized inventory levels and replenishment plans. AI uses text mining, data mining, and optimization-planning techniques to integrate suppliers and automate transactions to help clients understand their current business, address issues, and formulate strategies for improved performance. Clients can use supply-chain analytics to compare the performance of trading partners to operational and business metrics to make better decisions about their partnerships. AI uses reinforcement learning to automate repetitive human processes. Robotic-process automation (RPA) combines analytics, machine learning and rules-based software to capture and interpret existing data-input streams to process a transaction, manipulate data, trigger responses, and communicate with other enterprise applications. Oil and gas AI uses predictive maintenance algorithms to achieve optimum uptime. AI uses IoT sensors and machine-learning algorithms to support data-driven decision-making and enable operational excellence for midstream processes, such as storing and transporting oil and gas. 
AI uses text mining, natural-language processing, and machine learning to read legacy exploration and production data to optimize new construction and development projects. AI uses text mining and machine learning to collect, combine, and assess data to improve operational performance, reduce cost, minimize risk, and accelerate time-to-production in well-site development. It also uses those techniques to boost health and safety and to improve environmental performance. Utilities AI uses machine-learning algorithms and data from IoT devices to help energy and utility companies predict energy demand to assist in meeting short- or long-term needs, pinpointing areas of the plant or grid that need maintenance and reducing waste by uncovering inefficiencies.
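Many of the use cases above rest on the same text-mining pattern: extract entities from unstructured text and turn them into searchable metadata. Here is a minimal sketch using the open source spaCy library and its small English model; the sample document text is invented for illustration.

```python
# A minimal entity-extraction sketch with spaCy's small English model
# (install with: python -m spacy download en_core_web_sm). Sample text is invented.
import spacy

nlp = spacy.load("en_core_web_sm")

text = ("On March 3, 2020, Northwind Manufacturing signed a supply agreement "
        "with Contoso Logistics in Cleveland, Ohio, valued at $1.2 million.")

doc = nlp(text)
metadata = {}
for ent in doc.ents:
    # Group extracted entities by type: ORG, GPE, DATE, MONEY, and so on.
    metadata.setdefault(ent.label_, []).append(ent.text)

print(metadata)
# The grouped organizations, places, dates, and amounts can be stored as
# searchable metadata alongside the source document.
```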
Article / Updated 08-20-2020
Chances are good that you’ve sent an email to a customer service department at one point or another. Perhaps your order was late, items were damaged in shipping, or you needed to know how to initiate the return process. You may have found that while some companies are prompt in sending a reply and resolving your issue, you may not hear back from others for days. Although the timeliness of their response may have something to do with the nature of your issue, it’s also likely to be influenced by whether the company is still using manual processes to sort through incoming emails. Retailers that offer a prompt resolution are likely using AI-enhanced advanced capture technology. These solutions offer the ability to quickly process incoming data, but they don’t stop at email. Advanced capture technology can process handwritten notes, snail mail, and even social media. Several technologies come together to make enhanced content capture possible. Capture The workhorse of the capture technology is, of course, its ability to capture data from any source, including handwritten forms, emails, PDF files, Word documents, and more. Advanced OCR technology recognizes both machine- and hand-printed characters in any major language. AI-enhanced capture can also recognize specific forms and can manage complex capture workflows across different departments quickly. Most systems also capture mobile information, such as forms submitted via smartphone. Digitize where needed Based on predetermined configurations, capture technology can convert the information it captures into editable text or a searchable PDF file, depending on your needs. For example, some paperwork-heavy industries, such as medical offices, have begun scanning documents primarily for archival purposes, while others are transforming their entire business processes to become digitized. Process, classify, and extract AI uses machine learning, including natural-language processing and sentiment analysis, to gain a contextual understanding of the data. After it reads and understands content, it applies advanced recognition and auto-classifies it based on these findings. AI-enhanced capture uses two types of technology to drive speed and accuracy: Zonal extraction: This approach uses a template that identifies fields to capture and their locations. It’s most effective for recurring documents, such as claims forms or vendor invoices. Freeform extraction: Using keywords and text analysis, freeform extraction is a flexible solution for retrieving data from documents that come from different sources. For instance, vendors may send your company invoices in multiple formats. AI-enhanced capture uses this technology to apply freeform rules that enable the extraction of key data from the invoices. Together, these technologies automate data extraction to save time and reduce the risk of human error. AI delivers clear, actionable insights and even predictive analytics. It also prioritizes content based on any additional established or learned criteria to trigger a machine-initiated workflow. For example, in the contact form scenario mentioned above, AI can quickly determine whether emails from customers have a positive, neutral, or negative tone. This ability to read and analyze sentiment allows the system to prioritize appropriately, so customer support personnel can deliver answers in a timely manner to the customers who need them most. Similarly, it can detect important differences between internal documents. 
For example, it can appropriately process invoices sent to customers versus invoices received from vendors requiring payment. Validate edge cases Another standout quality of AI-enhanced capture is its ability to help humans focus on challenging tasks. Not only does it reduce the tedious processing of data without the need for manual intervention, but it also brings edge cases to the attention of the right person for validation. For example, an admissions department at a community college may be able to process most transcripts rapidly using capture technology. They extract the information and send the files to the appropriate repository. Yet, in some cases, the system might flag missing information or errors that exceed value thresholds. In these scenarios, these specific transcripts can be brought to the attention of the appropriate admissions officers so they can follow up with students or take other actions as needed. AI-enhanced content capture becomes more intelligent over time. It learns from historical data to determine which cases can be considered normal and which require human intervention. It can also make decisions based on pre-established thresholds to deliver value to your organization right away. Manage AI-enhanced content capture also simplifies document management. With its ability to read and make meaning of data, it routes and indexes information to the appropriate place within the content suite repository. Because it also can extract keywords, it makes your data and content easier to search. You can use AI-enhanced capture to automatically assign metadata from keywords to each piece of content that enters the enterprise, effectively acting as comprehensive translators. Although functions like HR, finance, and sales all have their own unique document types and language, these systems are sufficiently intelligent to understand their specific nuances. They can therefore manage content across the entire organization and link various functions seamlessly through simplified sharing and connections to line-of-business applications. Visualize Finally, AI-enhanced capture offers key analytics via dashboards and reports. It can deliver key performance indicators to help you spot inefficiencies in your business processes.
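As a simplified stand-in for the sentiment-driven triage described above, the following sketch scores incoming customer emails with the open source TextBlob library and assigns a priority; the thresholds and sample messages are illustrative, not part of any capture product.

```python
# A simplified sentiment-triage sketch using TextBlob's polarity score as a
# stand-in for the sentiment component of an AI-enhanced capture system.
# Thresholds and sample emails are illustrative.
from textblob import TextBlob

emails = [
    ("order-1182", "Still no refund after three weeks. This is unacceptable."),
    ("order-2291", "Thanks for the quick replacement, everything arrived fine."),
    ("order-3307", "Can you tell me how to start a return for a duplicate item?"),
]

def priority(text: str) -> str:
    polarity = TextBlob(text).sentiment.polarity  # -1.0 (negative) to 1.0 (positive)
    if polarity < -0.2:
        return "high"    # unhappy customer, route to an agent first
    if polarity < 0.2:
        return "normal"
    return "low"

rank = {"high": 0, "normal": 1, "low": 2}
for order_id, body in sorted(emails, key=lambda e: rank[priority(e[1])]):
    print(priority(body), order_id, "-", body[:60])
```

A production capture system would learn these cut-offs from labeled history rather than fixed thresholds, but the routing idea is the same: the angriest messages reach an agent first.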
Article / Updated 08-20-2020
An intelligent recommendation system analyzes the available information to produce a detailed, individualized picture of each customer and make predictions about their preferences and behavior, specifically their buying propensity. Many offers are irrelevant to consumers or may even strike the wrong note. For example, you aren’t putting your best foot forward by recommending a romantic getaway to someone in the middle of a messy divorce, or by listing the benefits of retirement community units to someone under 30. For some customers, the sheer number of offers could annoy them to the point of deleting them all, unread. Popular product recommendation The simplest method is to recommend popular products. Recommending what’s popular requires very little analysis. For example, a server in a restaurant uses this method when you ask what they recommend, and they say, “Our Spam tartare is very popular.” Popularity-based recommendations do not consult or track the tastes or behavior of the individual customer. If it’s trending, it’s recommended. This approach will likely perform better than recommending a random product from the catalog, but that isn’t an intelligent recommendation. Market-basket analysis A slightly more sophisticated approach, market-basket analysis, takes into consideration what other customers have bought. A common example is Amazon’s recommendation: “Customers who bought [what you’re interested in] also bought [this other product].” For example, a basket that contains disposable diapers might also contain baby food, logically enough. Fed by millions of data points in buyer behavior, this approach can provide strong statistical support for intuitive associations, like shelving tissues with the cold and cough medicines. However, this approach can also be thrown off by short-term events (maybe the diapers and baby food were for a visiting relative) or coincidences that don’t actually reveal significant trends. For example, the Harry Potter books are so massively popular on Amazon that they provide hardly any clues about what other titles not about a boy wizard would appeal to that purchaser. Propensity modeling In this use case, AI creates value by leveraging propensity modeling to help you avoid those pitfalls and instead target the right offer to the right customer. Jerry Strickland, a Senior Data Specialist with USAA, distilled the various types of propensity modeling into six examples: Predicted Customer Lifetime Value (CLV): Instead of looking at the size of individual purchases for big spenders, this metric estimates the total net profit that a given customer will bring to the business for the length of their relationship. It is a prediction of future sales, not a summary of past sales. Predictive modeling evaluates everything you know about the customer, including past sales, demographics, social listening, and other channels, to draw inferences from similarities with the behavior of other customers. Predicted Share of Wallet (SOW): Instead of attempting to grow the percentage of sales in a category (market share) by attracting new customers, growing wallet share focuses on increasing the amount an existing customer spends on your brand at the expense of the competition. Predictive modeling looks for growth customers instead of growth markets; that is, it identifies current customers who are spending more with a competitor than with you, introducing opportunities for marginal increases of existing business. 
Propensity to engage: Instead of blasting out emails or social media posts to every customer, match the channel to the customer. AI uses social listening and other techniques to identify the customers most likely to click, reply, or comment so you can maximize your digital marketing efforts. Propensity to unsubscribe: Conversely, AI can also identify customers who are less responsive to digital marketing push strategies so you can use other strategies for customers with high CLV and a high propensity to unsubscribe. Propensity to buy: You use a different engagement strategy for a customer who is comparison shopping or just looking than for the customer who is ready to buy right now. Propensity modeling uses data from various sources to help you tell the window shoppers from the serious shoppers so you can apply the strategy that encourages a sale without unnecessarily cutting into margins. For example, you might tempt comparison shoppers with a discount, but a customer with a high propensity to buy might be more interested in features and upgrades than discounts. Propensity to churn: Some businesses — Internet and TV service providers come to mind — seem prone to maximizing their margins until a customer threatens to cancel, and only then attempt to keep them from jumping ship. Propensity modeling alerts you to customers who are at risk so that you can proactively reach out to them, especially for customers identified as high value by your system. These predictive marketing approaches date back at least to the mid-twentieth century and the rise of catalog and direct-mail outreach, when punch cards and simple spreadsheets were among the few tools that could crunch the numbers and add statistical support to a merchandising manager’s intuition. The ability to effectively predict and act on buyer behavior came with the arrival of technology that can collect and analyze literally billions of records and transactions, combined with the Internet to facilitate instantaneous communications. Of course, the information used to predict customer behavior resides in enormous volumes and is encoded in a dizzying variety of data types, from structured data in databases to free-form text in documents, emails, scanned images, and social media feeds. It quickly becomes apparent that machines provide the only efficient method of parsing, understanding, and gaining value from the information. Only machines — which is to say, artificial intelligence — can read at the pace required to acquire and process thousands of documents and articles every second, and then merge, aggregate, and persist the information while analyzing it for content and tone. The main tools that AI uses to tackle these huge volumes of structured and unstructured data are data mining and text mining. Of course, these techniques can address a wide range of AI business cases besides intelligent recommendations. But first, let me provide a brief overview of these important terms in enterprise AI. Data and text mining Machine learning uses information acquired through data mining and text mining to make associations and draw inferences. Data mining processes structured data such as is found in corporate enterprise resource planning (ERP) systems or customer databases or in an online shopping cart, and it applies modeling functions to produce actionable information. Text mining uses natural-language processing to extract data elements to populate the structured metadata fields such as author, date, and content summary that enable analysis. 
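Before turning to the recommendation techniques themselves, here is a minimal sketch of one of the propensity models listed above, propensity to churn, trained on a hypothetical customer-history table; the feature names, cut-offs, and file names are assumptions for illustration.

```python
# A minimal propensity-to-churn sketch trained on a hypothetical customer
# history table with a churned flag; feature names, the 0.6 risk cutoff, and
# the $5,000 CLV cutoff are illustrative.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

customers = pd.read_csv("customer_history.csv")
features = ["months_active", "orders_last_90d", "support_tickets", "avg_order_value"]

model = GradientBoostingClassifier(random_state=0)
model.fit(customers[features], customers["churned"])

# Score current customers and surface high-value accounts at risk of leaving.
current = pd.read_csv("current_customers.csv")
current["churn_propensity"] = model.predict_proba(current[features])[:, 1]
at_risk = current[(current["churn_propensity"] > 0.6) &
                  (current["predicted_clv"] > 5000)]
print(at_risk[["customer_id", "churn_propensity", "predicted_clv"]])
```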
Depending on the characteristics of the domain, the quality of available data, and the business goals, recommendation engines apply various data mining techniques such as these:
Collaborative filtering (CF)
Content-based filtering (CBF)
Knowledge-based methods
Case-based methods
The filtering techniques are the most commonly used techniques for intelligent recommendations.

Collaborative filtering (CF)
Because it relies solely on readily available and easily analyzed transaction-level data to find correlations in consumption patterns, collaborative filtering, shown in the following figure, is one of the more popular recommendation techniques. Several types of collaborative filtering algorithms are available:
User-based CF measures the similarity between target users and other customers by computing a similarity score for every customer pair, and then offers products that similar customers have chosen in the past. It answers the question "What did customers with similar tastes find interesting?"
Item-based CF measures the similarity between the items that target customers interact with and other items by computing a similarity score between every product pair. It answers the question "What items are similar to the item this customer finds interesting?"
Context-aware CF adds another dimension to the user-based and item-based CF in the form of contextual information such as time, location, and social information. It answers the question "What else do I know about this customer that might affect the level of interest in an item?"
Because it is based on comparing shopper behavior, user-based CF is very effective. However, because it must compute a similarity score for every customer pair, it is time- and resource-intensive. A database of N customers generates (N * (N - 1)) / 2 unique customer pairs. For example, 100 customers produce 4,950 unique pairs. If you have 5,000 customers, you would have 12,497,500 unique customer pairs. As you can see, for large user bases, this algorithm is hard to implement without a very strong, parallelizable system. Item-based CF does not need to compute similarity scores between customers, only between products, and the number of products is likely to be much smaller than the number of customers. As a result, it is less resource-intensive and takes much less time. In addition, if you have a fixed set of products, you can perform the computation once and you're done. Context-aware CF enhances a recommendation engine with context, expanding the ability to establish relevance.

Content-based filtering (CBF)
This personalization technique builds a profile of the customer's interests based on the item metadata in the customer's buying history and looks for matches with the item metadata for other products. For example, if a customer buys a book on woodworking, the metadata of that purchase could be matched with metadata for related products, such as woodworking tools and protective equipment. The process involves two main tasks:
Identify metadata attributes to be associated with each item.
Build user profiles of items a user has interacted with, giving more weight to metadata attributes that appear more often.
User feedback is critical to fine-tune the profile and subsequent recommendations. The accuracy of CBF relies heavily on the quality of metadata, and the results are usually less accurate than CF methods. Simpler techniques such as popularity-based methods or market basket analysis generally also lack high predictive power or accuracy.
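Here is a minimal sketch of item-based collaborative filtering on a toy purchase matrix, using cosine similarity between product columns; the products and purchase history are invented to keep the example small.

```python
# A toy item-based collaborative filtering sketch: rows are customers, columns
# are products, 1 means the customer bought the product. Cosine similarity
# between product columns drives the "customers also bought" suggestion.
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

products = ["diapers", "baby_food", "coffee", "cough_syrup", "tissues"]
purchases = np.array([
    [1, 1, 0, 0, 0],
    [1, 1, 1, 0, 0],
    [0, 0, 1, 1, 1],
    [0, 0, 0, 1, 1],
    [1, 0, 1, 0, 0],
])

# Item-to-item similarity is the similarity between the matrix's columns.
item_sim = cosine_similarity(purchases.T)

def also_bought(product, top_n=2):
    idx = products.index(product)
    scores = item_sim[idx].copy()
    scores[idx] = 0  # don't recommend the item the customer is already viewing
    best = np.argsort(scores)[::-1][:top_n]
    return [(products[i], round(float(scores[i]), 2)) for i in best]

print(also_bought("diapers"))  # baby_food and coffee rank highest in this toy data
```

Note that the item-item matrix here has only as many rows and columns as there are products, which is why this approach scales better than comparing all N * (N - 1) / 2 customer pairs.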
Cross-validation
Intelligent recommendation systems are typically deployed with the goals of identifying and raising the effectiveness of cross-channel and cross-product sales campaigns, identifying a customer's affinity for specific products, reducing customer churn, and boosting revenue. So how do you assess how well your recommendations perform? Cross-validation is one technique used to test the accuracy of the recommendations by calculating the receiver operating characteristic (ROC) curve to produce a metric called "area under the curve" (AUC). This exercise compares predictions (recommendations) to actual outcomes (purchases) to quantify the likelihood that the items with the highest-ranked recommendations will be purchased.
Historical fact: The oddly named receiver operating characteristic curve dates to WWII, when it was used to measure the effectiveness of a sonar signal in differentiating between a whale and a submarine.
You can customize your intelligent recommendation system to track key performance indicators to optimize performance using these values:
Click-through rate (CTR): How often does the customer explore the recommended offerings?
Conversion rate (CR): How often does the customer purchase an item based on the recommended offerings?
Conversion time: How long does it take to convert a casual shopper into a loyal customer?
Average order value (AOV): Does the average order value go up?
Other business metrics to consider include ROI and customer lifetime value. However, these metrics depend on several unknowns and thus can be difficult to measure at times.

Data visualization
Data visualization is one of the most important elements of any analytics solution. This is especially true for predictive analytics software, where the results provide actionable insight to improve decision-making. Enterprises can't afford to leave the interpretation of the results to data scientists; the results must be easily digestible by the people who will actually work with them: end users and business managers. Data visualization can produce a single view of multi-dimensional data sets to help you with these tasks:
Assimilating a large amount of information at a glance and focusing on what matters most
Recognizing the correlation between consumer behavior and metrics such as customer lifetime value and share of wallet
Identifying trends and connections and preparing strategies in advance
Improving collaboration between teams
Most predictive analytics solutions provide a range of data visualization capabilities, including charts, graphs, reports, and dashboards. The best predictive analytics software will give you easy-to-use, self-service features where users can define their own visualization capabilities to display the results in the way they want.
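Circling back to the cross-validation check described earlier in this article, this short sketch compares recommendation scores with actual purchases, summarizes the ranking quality as AUC, and computes CTR and CR from illustrative campaign counts.

```python
# A small cross-validation check: compare the recommender's scores against
# actual purchases and summarize the ranking quality as AUC. The scores,
# outcomes, and campaign counts are illustrative.
from sklearn.metrics import roc_auc_score

# Scores the engine assigned to ten recommended items ...
scores =    [0.91, 0.85, 0.77, 0.64, 0.58, 0.44, 0.39, 0.28, 0.15, 0.07]
# ... and whether the customer actually bought each one (1) or not (0).
purchased = [1,    1,    0,    1,    0,    0,    1,    0,    0,    0]

auc = roc_auc_score(purchased, scores)
print(f"AUC = {auc:.2f}")  # 1.0 = perfect ranking, 0.5 = no better than chance

# Simple campaign KPIs computed from the same outcome data.
offers, clicks, conversions = 1800, 214, 37
print(f"CTR = {clicks / offers:.1%}, CR = {conversions / offers:.1%}")
```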
Article / Updated 08-20-2020
In addition to not pursuing profit, government agencies and nonprofit organizations often work with constrained budgets and limited resources, and AI can help with both.

Government
In 2002, the world generated 5EB (exabytes) of data. That is the equivalent of a 4-quadrillion (15 zeroes) page text document or 5 billion hours of high-definition YouTube videos. Ten years later, the world was generating that much data every week. Now, the world churns out more data than that every day. The following figure gives you an idea of how quickly data generation has ramped up since the commercialization of the Internet. The data tsunami touches everyone, but it particularly hits government, the ultimate bastion of paperwork. Government offices and officials have been swamped by numbers and paperwork for centuries. For the 1880 U.S. census, it took eight years to process 50 million responses by hand. To address this problem, inventor Herman Hollerith took a hint from Joseph Marie Jacquard's automated loom, which created sophisticated weaving patterns controlled by punched cards. Hollerith developed a punched-card-driven electric tabulating machine. As a result, even though the population grew 25 percent to 62 million, the 1890 census took only six years to process, saving the government $5 million.

Legacy IT systems
For 2022, Gartner has projected the global government and education IT budget will hit $634 billion. Unfortunately, most government IT departments spend nearly 75 percent of their budgets on maintaining legacy systems. Seventy percent of U.S. government workers are over 40 years old, matching the age of many of the IT systems they use to do their jobs. For example, until July 2019, the Strategic Automated Command and Control System that controls the U.S. nuclear arsenal ran on a computer system that required eight-inch floppy disks. That month, the U.S. Department of Defense upgraded the system to highly secure solid-state digital storage. Citizens routinely encounter twentieth-century systems when renewing a driver's license, engaging with elected representatives, or applying for benefits, an experience that can be underwhelming at the least, if not frustrating.

Data silos
Typically, every agency has its own intake process with its own forms and its own databases, which severely hampers any attempt to deliver a seamless experience to the user of government services. In addition, the silos expand across domains, preventing agencies from acting on the valuable information available in social media, websites, forums, and email. This vast pool of unstructured data can provide significant insights into the people that government agencies serve and could significantly enhance the quality of services provided.

Data security
Budget cuts are also having an impact on security. A recent Thales Security report revealed that 70 percent of U.S. government agencies have experienced a data breach, 57 percent in 2018 alone.

Nonprofit
Nonprofits also suffer their share of data breaches. Approximately 15 percent of breaches reported in CyberCrime Magazine's "Cybercrime Diary" target nonprofit and government organizations. Some of these breaches compromise millions of accounts, exposing personally identifiable information, such as names, addresses, tax ID numbers, electronic medical records, and financial information such as bank and credit card numbers.
Consequently, it’s not surprising that in a recent CohnReznick survey 89 percent of nonprofits placed cybersecurity in their top three (37 percent) or top ten (52 percent) concerns. However, only 32 percent of the surveyed organizations had performed a cybersecurity assessment that included penetration testing. A Salesforce survey found that 74 percent of nonprofits list among their top challenges capturing and managing accurate donor data. In addition, 37 percent said their priority was to reduce the costs of program execution, but only 16 percent prioritized reducing manual tasks, a significant contributor to cost reduction. Fraud A global Association of Certified Fraud Examiners report on fraud across all sectors indicated that 25 percent of fraud cases occur in government agencies (16 percent) and nonprofit organizations (9 percent). The study also showed that although the risk stays fairly constant regardless of the size of an organization (between 22 and 28 percent), the median financial loss from organizations with fewer than 100 employees ($200,000) is twice as much as for larger organizations. A McKinsey study found that government programs detect less than half of the losses due to fraud, waste, and abuse, which amounted to $57 billion detected out of $148 billion lost in 2017. Nonprofits had the smallest median loss of $75,000, but given the limited resources of most nonprofits, this amount can deal a death blow to the organization. In fact, another study showed that 25 percent of nonprofits that experienced a publicized fraud incident shut down within three years, with the mortality rate focused on smaller and newer organizations. Artificial intelligence is uniquely suited to address these challenges that nonprofits and government agencies face.
Article / Updated 08-20-2020
The high-value applications of AI are built upon a hierarchy of competencies. This figure shows the hierarchy of competencies required to use artificial intelligence. Data collection Data collection is the foundation of the pyramid, the stage where you identify what data you need and what is available. If the goal is a user-facing product, are all relevant interactions logged? If it is a sensor, what data is coming through and how? Without data, no machine learning or AI solution can learn or predict outcomes. Data flow Identify how the data flows through the system. Is there a reliable stream or extract, transform, and load (ETL) process established? Where is the data stored, and how easy is it to access and analyze? Explore and transform This is a time-consuming and underestimated stage of the data science project life cycle. At this point, you realize you are missing data, your machine sensors are unreliable, or you are not tracking relevant information about customers. You may be forced to return to data collection and ensure the foundation is solid before moving forward. Business intelligence and analytics After you can reliably explore and clean data, you can start building what is traditionally thought of as business intelligence or analytics, such as defining key metrics to track, identifying how seasonality impacts product sales and operations, segmenting users based on demographic factors, and the like. Now is the time to determine: The features or attributes to include in machine learning models The training data the machine will need to learn What you want to predict and automate How to create the labels from which the machine will learn You can create labels automatically, such as the system logging a machine event in the back-end system, or through a manual process, such as when an engineer reports an issue during a routine inspection and the result is manually added to the data. Machine learning and benchmarking To avoid real-world disasters, before the sample data is used to make predictions, create a framework for A/B testing or experimentation and deploy models incrementally. Model validation and experimentation can provide a rough estimate of the effects of changes before you implement them. Establish a very simple baseline or benchmark for performance tracking. For example, if you are building a credit card fraud detection system, create test data by monitoring known fraudulent credit card transactions and compare them to the results of your model to verify it accurately detects fraud. Artificial intelligence After you reach this stage, you can improve processes, predictions, outcomes, and insights by expanding your knowledge, understanding, and experience with new methods and techniques in machine learning and deep learning.
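Here is a minimal sketch of the benchmarking idea in the credit card fraud example above: score a naive baseline and a trained model against the same labeled test transactions before trusting either. The data files, features, and baseline threshold are hypothetical.

```python
# A minimal benchmarking sketch: score a naive baseline and a trained model on
# the same labeled test transactions. Data files, features, and the $1,000
# baseline threshold are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score, recall_score

train = pd.read_csv("transactions_train.csv")  # includes a known is_fraud label
test = pd.read_csv("transactions_test.csv")
features = ["amount", "hour_of_day", "merchant_risk", "distance_from_home"]

# Baseline: flag any transaction over $1,000 as fraud.
baseline_pred = (test["amount"] > 1000).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(train[features], train["is_fraud"])
model_pred = model.predict(test[features])

for name, pred in [("baseline", baseline_pred), ("model", model_pred)]:
    print(f"{name}: precision {precision_score(test['is_fraud'], pred):.2f}, "
          f"recall {recall_score(test['is_fraud'], pred):.2f}")
```

If the model can't beat a one-line rule on held-out data, that finding alone is worth knowing before anything is deployed.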
Article / Updated 08-20-2020
Recently, Gartner analyst Nick Heudecker generated a firestorm of debate when he said a previous Gartner statistic that reported 60 percent of big data projects fail was too conservative and that an 85 percent failure rate is more accurate. Either way, it's a daunting statistic. One way to avoid becoming a statistic is to approach your AI journey using an industry-proven model — the machine learning development life cycle. This figure shows the seven elements of the methodology. This methodology is based on the cross-industry standard process for data mining (CRISP-DM), a widely used open standard process model that describes common approaches used by data mining experts. The table shows the questions that must be answered for each element.

The Machine Learning Development Life Cycle: Elements and Questions
Define the task: What problem or question do you want to address with data?
Collect the data: What data do you have that could answer your questions?
Prepare the data: What do you need to do to prepare the data for mining?
Build the model: How can you mimic or enhance the human's knowledge or actions through technology?
Test and evaluate the model: What new information do you know now?
Deploy and integrate the model: What actions should you trigger with the new information? What needs human validation?
Maintain the model: How has the data changed over time? Do the results reflect current reality?

The process of developing a machine learning model is highly iterative. Often, you will find yourself returning to previous steps before proceeding to a subsequent one. A machine learning project is not considered complete after the first version has been deployed. Instead, the feedback you collect after the initial version helps you shape new goals and improvements for the next iteration. In light of this feedback-and-iterate practice, the methodology is more a life cycle than a linear process, largely because data drives the process rather than a hunch, a policy, a committee, or some immutable principle. You start with a hypothesis or a burning question, such as "What do all our loyal customers have in common?" or flip it to ask "What do all our cancellations have in common?" Then you gather the required data, train a model with historical data, run current data to answer that question, and then act on the answer. The steering group provides input along the way, but the data reflects the actual, not the hypothetical. This principle of data-driven discovery and action is an important part of the life cycle because it assures that the process is defensible and auditable. It keeps the project from going off the rails and down a rabbit hole. Using the life cycle, you will always be able to answer questions such as how and why you created a particular model, how you will assess its accuracy and effectiveness, how you will use it in a production environment, and how it will evolve over time. You will also be able to identify model drift and determine whether changes to the model based on incoming data are pointing you toward new insights or diverting you toward undesired changes in scope. Of the seven steps in the methodology, the first three take up the most time. You may recall that cleaning and organizing data takes up to 60 percent of the time of a data scientist. There's a good reason for that. Bad data can cost up to 25 percent of revenue. However, all that time spent preparing the data will be wasted if you don't really know what you want out of the data.
Define the task What problem or question do you want to address with data? Follow these steps: Determine your business objectives. Assess the situation. Determine your data mining goals. Produce a project plan. Some people think of AI as a magic machine where you pour data into the hopper, turn the crank, and brilliance comes out the other end. The reality is that a data science project is the process of actually building the machine, not turning the crank. And before you build a machine, you must have a very clear picture of what you want the machine to do. Even though the process is data-driven, you don’t start with data. You start with questions. You may have a wealth of pristine data nicely tailored to databases, but if you don’t know what you’re trying to do, when you turn the crank, the stuff that comes out the other end might be interesting, but it won’t be actionable. That’s why you start with questions. If you ask the right questions, you will know what kind of data you need. And if you get the right data, at the end you will get the answers — and likely more questions as well. During the business understanding step, you establish a picture of what success will look like by determining the criteria for success. This step starts with a question. In the course of determining what you need to answer the question, you explore the terminology, assumptions, requirements, constraints, risks, contingencies, costs, and benefits related to the question and assemble an inventory of available resources. For example, your initial question might be “What is causing an increase in customer churn?” This question could be expanded to ask “Can you pinpoint specific sources of friction in the customer journey that are leading to churn?” Pursuing that question may lead you to brainstorming and research, such as documenting the touchpoints in the current customer journey, analyzing the revenue impact of churn, and listing suspected candidates for friction. Collect the data What data do you have that may be able to answer your questions? Follow these steps: Collect initial data. Describe the data. Explore the data. Verify data quality. To get to where you’re going, first you must know where you are. Remember that moment in The Princess Bride when Westley, Inigo, and Fezzik list their assets and liabilities before storming the castle and determine that they will need a wheelbarrow and that a holocaust cloak would come in handy? That was data understanding. During the data understanding step, you establish the type of data you need, how you will acquire it, and how much data you need. You may source your data internally, from second parties such as solution partners, or from third-party providers. For example, if you are considering a solution for predictive maintenance on a train, you might pull information from Internet of Things (IoT) sensors, weather patterns, and passenger travel patterns. To make sure you have the data required to answer your questions, you must first ask questions. What data do you have now? Are you using all the data you have? Maybe you’re collecting lots of data, but you use only three out of ten fields. This step takes time, but it is an essential exercise that will increase the likelihood that you can trust the results and that you aren’t misled by the outcomes. Prepare the data What do you need to do to prepare the data for mining? Follow these steps: Select the data. Clean the data. Construct the data. Integrate the data. Format the data. 
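To make these preparation steps a little more concrete, here is a minimal sketch in Python using the pandas library. The records and column names (store_id, zip_code, amount, and so on) are hypothetical, invented only to illustrate the cleaning, constructing, and integrating steps; the methodology itself doesn't prescribe any particular tool. Each step is described in more detail below.

import pandas as pd

# Hypothetical order and store records, invented only to illustrate the steps
orders = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "store_id": [10, 10, 20, 20],
    "zip_code": ["2139", "02139", None, "94105"],
    "amount": [120.0, None, 80.0, 45.5],
})
stores = pd.DataFrame({"store_id": [10, 20], "manager": ["Kim", "Ortiz"]})

# Clean: establish a uniform notation and fill in missing values
orders["zip_code"] = orders["zip_code"].fillna("00000").astype(str).str.zfill(5)
orders["amount"] = orders["amount"].fillna(0)

# Construct: derive a field (sales region) that isn't in the raw records
zip_to_region = {"02139": "Northeast", "94105": "West"}   # illustrative lookup
orders["region"] = orders["zip_code"].map(zip_to_region).fillna("Other")

# Integrate: aggregate order-level rows to store level, then merge store details
store_sales = orders.groupby("store_id", as_index=False)["amount"].sum()
store_sales = store_sales.merge(stores, on="store_id", how="left")
print(store_sales)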
Select the data: In today's data-rich environment, narrowing the options to identify the exact data you need can pose a challenge. Factors to consider are relevance and quality. In cases that might be sensitive to bias, you must pay close attention to seemingly unrelated fields that might serve as a proxy. In a classic example, a loan approval process excluded race from its model but included ZIP code, which often correlates directly with race, so the process retained the same level of bias as before. Clean the data: The available data for your project may have issues, such as missing or invalid values or inconsistent formatting. Cleaning the data involves establishing a uniform notation for expressing each value and either setting default values or using a modeling technique to estimate suitable values for empty fields. Construct the data: In some cases, you might need a field that can be calculated or inferred from other fields in the data. For example, if you are doing analysis by sales region, detailed order records may not include the region, but that information can be derived from the address. You might even need to create new records to indicate the absence of an activity, such as creating a record with a value of zero to indicate the lack of sales for a product in a region. Integrate the data: You might encounter a situation where you need to combine information from different data sources that store the data in different ways. For example, suppose you are analyzing store sales by region; if you don't have a table for store-level data, you need to aggregate the order information for each store from individual orders to create store-level data. Or you may need to merge data from multiple tables. For example, in the store-sales-by-region analysis, you may combine regional information, such as manager and sales team, from one source with store information from another source into one table. Format the data: The data you need might be trapped in an image, presentation, or graphic, in which case you would have to extract it through some method, such as optical character recognition, and then store the information as structured data.
Build the model How can you mimic or enhance the human's knowledge or actions through technology? Follow these steps: Select an algorithm and modeling techniques. Test the fit. Build the model. Assess the model. This step represents the primary role of a data scientist. Based on the data and the likely best fit, the data scientist selects the most promising algorithm, often from an open-source library such as MLlib. Then the data scientist uses tools available in popular programming languages such as R or Python to build a usable ML model based on the algorithm and the data. The process can take some time, depending on peculiarities in the data or the nuances of your business. In the end, however, by training the algorithm on the sanitized historical data, you get actionable information, such as a prediction or a next best action. By now, the modeling technique to use should be an obvious choice, based on the questions you developed at the beginning and the data you have to work with. After you have trained the model on the training data set, test its accuracy with a held-out test data set. One way of evaluating test results for a classification model is to use a confusion matrix, a simple grid that compares predicted classes against actual classes.
For a simple example, consider a binary classifier that produces a yes or no answer. There are two ways of getting it right (correctly predicting yes or no) and two ways of getting it wrong (incorrectly predicting yes or no). In this case, imagine a recommendation engine offering a yes or no prediction for a specific customer regarding 100 items, compared with the customer's actual responses. This table shows a set of possible results.
Example of Binary Classifier Results (Iterations = 100)
                         AI (Predicted) No    AI (Predicted) Yes
Customer (Actual) No             35                   10
Customer (Actual) Yes             5                   50
A result can be true or false and positive or negative, giving the four possibilities shown in the next table.
Example Results Categories
Prediction   Actual   Category         Percent
Yes          Yes      True positive    50
No           No       True negative    35
Yes          No       False positive   10
No           Yes      False negative    5
In this case, the model has an accuracy rate of 0.85 (the 50 true positives plus the 35 true negatives, out of 100 predictions). That number may be good or bad, depending on your requirements. You may move forward, or you may refine the model and try again. (The short code sketch at the end of this section shows one way to compute these figures.)
Test and evaluate the model What new information do you know now? Follow these steps: Evaluate the results. Review the process. Determine the next steps. In this step, you go back to the beginning and compare your goals with the results to determine whether they provided enough of the right kind of insight to allow you to answer your questions.
Deploy and integrate the model What actions should you trigger with the new information? What needs human validation? Follow these steps: Plan the deployment. Plan monitoring and maintenance. Produce the final report and presentation. Review the project. After you have an acceptable model, it's time to roll out the information using the plan developed during the business understanding stage so your teams can execute on the insight the project has produced. Look for game-changing insights that will alter how you do business. You might change workflows, introduce automation at some points, and establish touchpoints for human intervention. You might introduce confidence scoring, with automated actions for outcomes above or below a window and human review for the middle ground.
Maintain the model Because data has a shelf life, no data science project can run under a set-and-forget philosophy. In the maintenance stage, your model must be retrained regularly on fresh data so the answers reflect the new reality of now. The final report can be as simple as a summary of the life of the project and its outcomes, or it can be an exhaustive analysis of the results, their implications, and your plans for implementing the insights. It's always a good idea to hold a lessons-learned session after any significant effort, particularly if you plan to keep using the model. This meeting can cover rabbit trails you followed and insights into best practices.
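Here is the code sketch mentioned above: a minimal example, assuming the open-source scikit-learn library (not a tool the methodology mandates), that reproduces the confusion matrix and the 0.85 accuracy figure from the tables in the test-and-evaluate discussion.

# Reproduce the numbers in the tables above: 50 true positives, 35 true
# negatives, 10 false positives, and 5 false negatives out of 100 items.
from sklearn.metrics import accuracy_score, confusion_matrix

actual    = ["yes"] * 50 + ["no"] * 35 + ["no"] * 10 + ["yes"] * 5
predicted = ["yes"] * 50 + ["no"] * 35 + ["yes"] * 10 + ["no"] * 5

print(confusion_matrix(actual, predicted, labels=["no", "yes"]))
# [[35 10]
#  [ 5 50]]
print(accuracy_score(actual, predicted))   # 0.85

In practice, you would compare the predictions of your trained model against a held-out test set rather than hand-coded lists.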
Article / Updated 08-20-2020
Artificial intelligence offers significant benefits for a broad range of markets. The most noticeable is optimizing the workforce by increasing their efficiency and reducing the burden of manual tasks. AI is good at automating things you might feel bad about asking someone else to do, either because it is tedious, such as reading through reams of reports, or dangerous, such as monitoring and managing workflow in a hostile environment. In other words, AI can relieve workers from the part of the job that they like the least. In addition, when an algorithm produces results with high accuracy and predictability, mundane processes and routine decisions can be automated, thus reducing the need for human intervention in the paper chase of the typical enterprise and freeing workers to focus on tasks that increase revenue and customer satisfaction. AI thrives on data and excels at automating routine tasks, so those industries with a wealth of digitized data and manual processes are poised to reap large rewards from implementing AI. For these industries, AI can enhance the things you want to increase, such as quality, adaptability, and operational performance, and mitigate the things you want to reduce, such as expense and risk. This article provides a bite-sized overview of industries that can derive specific benefits from implementing AI. Healthcare, a prime target for AI It’s hard to find an industry more bogged down in data than healthcare. With the advent of the electronic health record, doctors often spend more time on paperwork and computers than with their patients. In a 2016 American Medical Association study, doctors spent 27 percent of their time on “direct clinical face time with patients” and 49 percent at their desk and on the computer. Even worse, while in the examination room, only 53 percent of that time was spent interacting with the patient and 37 percent was spent on the computer. A 2017 American College of Healthcare study found that doctors spend the same amount of time focused on the computer as they do on the patients. A 2017 Summer Student Research and Clinical Assistantship study found that during an 11-hour workday, doctors spent 6 of those hours entering data into the electronic health records system. The good news is that AI is changing that equation. Healthcare is a data-rich environment, which makes it a prime target for AI: Natural-language processing can extract targeted information from unstructured text such as faxes and clinical notes to improve end-to-end workflow, from content ingestion to classification, routing documents to the appropriate back-end systems, spotting exceptions, validating edge cases, and creating action items. Data mining can accelerate medical diagnosis. In a 2017 American Academy of Neurology study, AI diagnosed a glioblastoma tumor specimen with actionable recommendations within 10 minutes, while human analysis took an estimated 160 hours of person-time. Artificial neural networks can successfully triage X-rays. In a 2019 Radiology Journal study, the team trained an artificial neural network model with 470,300 adult chest X-rays and then tested the model with 15,887 chest X-rays. The model was highly accurate, and the average reporting delay was reduced from 11.2 to 2.7 days for critical imaging findings and from 7.6 to 4.1 days for urgent imaging findings compared with historical data. Speech analytics can identify, from how someone speaks, a traumatic brain injury, depression, post-traumatic stress disorder (PTSD), or even heart disease. 
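Returning to the first bullet above, here is a rough illustration of how natural-language processing can pull structured facts out of unstructured clinical text. It is a minimal sketch that assumes the open-source spaCy library and an invented sample note; it is not a description of any particular healthcare product or workflow.

# Minimal named-entity extraction sketch with spaCy (illustrative only).
# Requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
note = ("Patient seen on 12 March 2019 at Mercy General Hospital. "
        "Follow-up chest X-ray recommended within 30 days.")

doc = nlp(note)
for ent in doc.ents:
    # e.g., DATE -> '12 March 2019', ORG -> 'Mercy General Hospital'
    print(ent.text, ent.label_)

Extracted entities like these are what downstream workflow systems can use to classify, route, and flag documents automatically.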
Manufacturing and AI If any system is ripe for transferring tedious work to intelligent agents, it's a system of thousands of moving parts that must be monitored and maintained to optimize performance. By combining remote sensors and the Internet of Things with AI to adjust performance and workflows within the plant or across plants, the system can optimize labor costs and liberate the workforce from the tedious job of monitoring instruments so they can add value where human judgment is required. AI can also drive down costs by using sensor data to restock parts automatically instead of relying on inventory logs, and by recommending predictive maintenance rather than reactive, periodic, or preventive maintenance, extending the life of assets and reducing both maintenance costs and total cost of ownership. McKinsey estimated that cost savings could range from 5 to 12 percent from operations optimization, 10 to 40 percent from predictive maintenance, and 20 to 50 percent from inventory optimization.
AI in the energy sector In the energy sector, downtime and outages have serious implications. One study estimated that more than 90 percent of U.S. refinery shutdowns were unplanned. A McKinsey survey found that, due to unplanned downtime and maintenance, rigs in the North Sea were running at 82 percent of capacity, well below the target of 95 percent, in part because, although the rigs generated an abundance of data from 30,000 sensors, operators used only 1 percent of it, and only for immediate yes-or-no decisions about individual rigs. In December 2017, a hairline crack in the North Sea Forties pipeline halted production, costing Ineos an estimated £20 million per day. In contrast, Shell Oil used predictive maintenance and early detection to avoid two malfunctions, saving an estimated $2 million in maintenance costs and downtime. AI can capture data across all rigs and other operations and production systems to apply predictive models that quickly identify potential problems, order the required parts, and schedule the work when physical maintenance is required.
Banking and investments and AI The finance sector is blessed, or cursed, with both a super-abundance of paperwork and a surplus of regulation. I say "blessed" because the structured nature of the data and the tightly defined rules create the perfect environment for an AI intervention. Creditworthiness: AI can process customer data, such as credit history, social media pages, and other unstructured data, and make recommendations regarding loan applications. Fraud prevention: AI can monitor transactions to detect anomalies and flag them for review. Risk avoidance and mitigation: AI can review financial histories and the market to assess investment risks that can then be addressed and resolved. Regulatory compliance: AI can be used to develop a framework to help ensure that regulatory requirements are met and followed. Through machine learning, these systems can be programmed with regulations and rules to serve as a watchdog, spotting transactions that fail to adhere to set regulatory practices and procedures. The result is real-time, automated transaction monitoring for compliance with established rules and regulations.
Intelligent recommendations: AI can mine not just a consumer's past online activity, credit score, and demographic profile, but also the behavior patterns of similar customers, retail partners' purchase histories, and even the unstructured data of a customer's social media posts or comments made in customer support chats, to deliver highly targeted offers.
AI in the insurance industry Some in the industry think that factors unique to insurance — size, sales channel, product mix, and geography — are the fundamental cost drivers for insurers. However, a McKinsey survey notes that these factors account for just 19 percent of the differences in unit costs among property and casualty insurers and 46 percent among life insurers. Most of the cost differences depend on common business challenges, such as complexity, operating model, IT architecture, and performance management. AI can play a significant role in mitigating these costs. Claims processing: Using NLP and ML, AI can process claims much faster than a human and then flag anomalies for manual review. Fraud detection: The FBI estimates the annual cost of insurance fraud at more than $40 billion, which adds $400 to $700 per year to the average U.S. family's premiums. Using predictive analytics, AI can quickly process reams of documents and transactions to detect the subtle telltale markers of potential fraud, or erratic account movements that could be early signs of dementia. Customer experience: Insurance carriers can use AI chatbots to improve the overall customer experience. Chatbots use natural language patterns to communicate with consumers. They can answer questions, resolve complaints, and review claims.
Retail and AI The global economy continues to apply pressure to margins, but AI gives retailers many ways to push back. Reduced customer churn: MBNA America found that a 5-percent reduction in customer churn can equate to a 125-percent increase in profitability. Predictive analytics can identify customers likely to leave, as well as predict the remedial actions most likely to be effective, such as targeted marketing and personalized promotions and incentives. Improved customer experience: A 2014 McKinsey study notes that companies that improve their customer journey can see revenues increase by as much as 15 percent and costs fall by up to 20 percent. AI provides a deeper, contextual understanding of customers as they interact with your brand. In particular, natural-language processing and predictive analytics provide a granular understanding of each customer's product preferences, communication preferences, and which marketing campaigns are likely to resonate with them. Optimized and flexible pricing: Predictive analytics enables a company to implement an optimized pricing strategy, pricing products according to a range of variables, such as channel, location, or time of year. The system creates highly accurate predictive models that study competitor prices, inventory levels, historic pricing patterns, and customer demand to ensure that pricing is correct for each situation, achieving up to a 30-percent improvement in operating profit and increasing return on investment (ROI) by up to 800 percent. Personalized and targeted marketing: A 2016 Salesforce report found that 63 percent of millennials and 58 percent of Generation-X customers gladly share their data in return for personalized offers and discounts.
Retailers are uniquely positioned to collect a range of data on individual customers, including preferences, buying history, and shopping patterns. Predictive analytics help personalize marketing and engagement strategies. A 2017 Segment study found that 49 percent of shoppers made impulse buys after receiving a personalized recommendation and 44 percent become repeat buyers after personalized experiences. Improved inventory management: The days of overstocking inventory are quickly diminishing as retailers realize that optimized stock equals more profit. Predictive analytics gives retailers a better understanding of customer behavior to highlight areas of high demand, quickly identify sales trends, and optimize delivery so the right inventory goes to the right location. The results are streamlined supply chains, reduced storage costs, and expanded margins. Legal system and AI AI is tackling the mountain of paper that characterizes most legal proceedings by providing better and smarter insights from organizational data to detect compliance risks, predict case outcomes, analyze sentiment, identify useful documents, and gather business intelligence to make better-informed decisions. Through automation and the use of predictive analytics, these technologies have significantly helped reduce the time and costs associated with discovery. A 2018 test pitted 20 lawyers with decades of experience against an AI agent three years into development and trained on tens of thousands of contracts. The task? Spot legal issues in five NDAs. The lawyers lost to the AI agent on time (average 92 minutes as opposed to 26 seconds) and accuracy (average of 85 percent as opposed to 94 percent). In one case, a discovery team of three attorneys on a class-action lawsuit had 1.3 million documents to review. They used eDiscovery to code 97.7 percent of the 1.3 million documents as non-responsive, leaving fewer than 30,000 documents for the three-attorney team to review. AI can aggregate and analyze data across a law department’s cases for budget predictability, outside counsel and vendor spend analysis, risk analysis, and case trends to facilitate real-time decision making and reporting. AI can perform document on-boarding and reviews based on continuous active learning to prioritize the most important documents for human review — lowering the total cost of review by up to 80 percent. AI and human resources Another bastion of paperwork, the HR department is a good candidate for streamlining processes using AI. In the 2018 “Littler Annual Employer Survey” of employers, the top three uses for AI were recruiting and hiring (49 percent), HR strategy and employee management (31 percent), and analyzing company policies and practices (24 percent). As the average job opening attracts 250 resumes, the most immediate gains in efficiency are possible in recruiting and hiring. Scanning resumes into an applicant tracking system can reduce the time to screen from 15 minutes per resume to 1 minute. Natural-language processing and intent analysis go beyond keyword searches to find qualified candidates whose wording doesn’t exactly match the job posting. Virtual assistants interact with candidates to schedule meetings, an otherwise time-consuming and tedious task. By automating these and similar tasks, HR personnel have more time to focus on strategic tasks that require an interpersonal approach. 
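As a rough sketch of resume screening that goes beyond a single keyword filter, here is a minimal example that ranks invented resume snippets against a job posting by overall textual similarity, assuming scikit-learn's TF-IDF tools. It illustrates the general technique only and is not a description of any particular applicant tracking system.

# Rank resume snippets against a job posting by textual similarity.
# A toy illustration of scoring overall overlap rather than filtering
# on a single keyword.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

job_posting = "Seeking data engineer with Python, SQL, and cloud pipeline experience"
resumes = [
    "Built ETL pipelines in Python and SQL on a cloud platform",
    "Ten years of retail store management and merchandising",
    "Developed cloud data pipelines; strong SQL and scripting background",
]

vectors = TfidfVectorizer().fit_transform([job_posting] + resumes)
scores = cosine_similarity(vectors[0], vectors[1:]).ravel()

for resume, score in sorted(zip(resumes, scores), key=lambda pair: -pair[1]):
    print(f"{score:.2f}  {resume}")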
AI impacts on supply chain Globalization increases volatility in demand, lead times, costs, and regulatory hurdles, just to name a few factors. The announcement of a new trade tariff or a sudden flare-up of civil unrest can force quick adjustments and decisions. AI and data visualization techniques can accelerate the transition from reactive operations to predictive supply chain management and automated replenishment. It starts with recovering the value locked up in structured and unstructured data to convert a data swamp into a data lake to provide pervasive visibility of the current state of all assets across the entire organization and beyond to partners, customers, competitors, and even the impact of the weather on operations and fulfillment. It ends with streamlined processes, improved customer satisfaction, reduced costs, and an increased bottom line. Transportation and travel and AI Transportation issues have become the many-headed hydra of the twenty-first century, threatening the lifestyle and sustainability of metropolitan life. Addressing traffic is one of the defining challenges of worldwide urban life for this century. Congestion: The cost of congestion in the U.S. reached $305 billion in 2017. AI can process the complex dataset of traffic monitoring to suggest intelligent traffic light models and use real-time tracking and scheduling to mitigate traffic, both on the road and for public transport systems. Maintenance: A single downed truck can cost a fleet up to $760 per day. A grounded plane can cost more than $17,000 a day. Using machine learning and digital twins, you can assess the performance of a vehicle, plane, or train in real-time and trigger notifications or alerts when repairs or preventive maintenance are needed. The system uses automation to order parts and schedule maintenance. Public safety: AI can track real-time crime data to increase public safety and direct law enforcement to developing situations. Freight transport: Predictive analytics can assist in forecasting volume to optimize routes and inventory. Telecom industry and AI With the turn of the millennium and the advent of ubiquitous communications, the era of customer loyalty for a communications provider has passed. Customers churn faster than carriers can roll over minutes. As the network continues to evolve, customer quality-of-experience expectations increasingly dictate consumer behavior. Customer support: AI-powered chatbots are helping many telecoms improve the customer experience while saving support costs. Nokia improved resolution rates by 20 to 40 percent. Vodafone improved customer satisfaction by 68 percent with its chatbot, TOBi. Predictive and preventive maintenance: AI can process performance data at the equipment level to anticipate failure based on historical patterns and recommend tactical or strategic actions. For example, the system could alert a technician, who can use the AI-powered insights to proactively run diagnostics, perform root-cause analysis, and take action at any point in the link, from the set-top box all the way up the chain to the cell tower or network operations center. On the strategic level, these insights can inform network redesign to sustain better quality of experience and provide valuable data to inform development of new services to maintain a competitive edge. 
Network optimization: AI can find patterns at the traffic level and notify the network operations center of anomalies so that a potential issue can be corrected before it affects the quality of service, and it can assist in exploring alternatives for optimizing the existing network.
AI in the public sector In 2017, United States agencies collectively received more than 818,000 freedom-of-information (FOI) requests and processed more than 823,000. In the second quarter of 2017, the U.K. Department for Exiting the EU was able to respond to only 17 percent of FOI requests, and the Department for International Trade fared only slightly better at 21 percent. AI can shorten the time to provide information by automating manual tasks and flagging requests that require special consideration, enabling government workers to focus on high-value tasks instead of tedium. U.S. Citizenship and Immigration Services responds to more than 8 million applications each year. In 2018, Emma, a virtual assistant on its website, responded to 11 million inquiries with a success rate of 90 percent. AI-assisted decision making: Many aspects of governance suffer from a surfeit of information. Separating the important from the mundane is a time-consuming and mind-numbing task for a human, but a simple and appropriate task for predictive analytics. AI can process and analyze enormous amounts and varieties of data to highlight patterns and reveal insights that facilitate efficient and effective decisions. Internet of Things: As cities deploy devices such as traffic cameras, smart traffic lights, smart utility meters, and other sensors, AI can sift through the mountain of data they generate to streamline operations, optimize process control, and deliver better service.
Professional services benefit from AI Professional services firms often focus on high-touch engagements that are essentially human-centric and thus may not seem to be good candidates for AI. However, much of the work they take on involves processes that are quite amenable to AI. Professional services touch many of the industries discussed here, and just as maturing technology affects all industries, of necessity it affects how professional services firms engage their clients. The key takeaway is that AI won't replace core professional expertise, but it will make professional services firms more efficient and thus increase the value proposition for their clients. However, professionals who embrace AI will replace those who don't. The applications span all industries:
Document intake, acceptance, digitization, maintenance, and management
Auditing, fraud detection, and fraud prevention
Risk analysis and mitigation
Regulatory compliance management
Claims processing
Inventory management
Resume processing and candidate evaluation
AI impacts marketing The secret sauce in marketing is not a secret. The ingredients are well known and are used daily all over the world. What is new is the glut of data now available regarding every search, click, and comment your customers make. AI doesn't reinvent marketing. It just simplifies the daunting task of incorporating everything your data tells you about customers so you can anticipate their next move and improve the experience.
With AI, your marketing can accomplish these feats:
Use everything you know about customers, including their order history, browsing path through the website, customer service interactions, and social media posts
Target your candidates and customers down to the individual
Personalize messages according to whatever metric you have tracked, even down to buyer personas
Generate thousands of variations on a message
Schedule messages to maximize engagement
Train messages based on engagement feedback
Customize the customer experience on your website
Optimize customer engagement and reduce churn
Optimize price, even down to the individual if you so choose
Qualify leads automatically
Produce more accurate sales forecasts
Media and entertainment and AI AI obviously plays a big role in movies and video games through CGI, special effects, and gaming engines, but what can it do for the enterprise? Valuing and financing: AI can use predictive analytics to determine the potential value of a script and then identify likely prospects for investment. Personalized content: AI can analyze user data to make intelligent recommendations for streaming media services (see the sketch just after this list). Search optimization: AI can support intelligent search engines for visual content for applications within and outside of the media industry. Film rating: AI can use predictive analytics to process historical rating information to suggest the proper rating for a film.
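Here is the sketch referenced in the personalized-content bullet above: a minimal item-similarity example in Python. The titles and ratings are invented for illustration; real streaming services draw on far richer signals and far larger models.

# Tiny item-based recommendation sketch: cosine similarity between
# columns of a made-up viewer-by-title ratings matrix (illustrative only).
import numpy as np

titles = ["Space Saga", "Cooking Show", "Crime Docs", "Galaxy Wars"]
ratings = np.array([            # rows = viewers, cols = titles (0 = not rated)
    [5, 1, 0, 4],
    [4, 0, 1, 5],
    [1, 5, 4, 0],
    [0, 4, 5, 1],
], dtype=float)

def cosine(a, b):
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

liked = titles.index("Space Saga")
scores = [(cosine(ratings[:, liked], ratings[:, j]), titles[j])
          for j in range(len(titles)) if j != liked]

for score, title in sorted(scores, reverse=True):
    print(f"{score:.2f}  {title}")   # 'Galaxy Wars' scores highest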
Article / Updated 08-19-2020
Innovation never sleeps. New breakthroughs in artificial intelligence (AI) and supporting technologies show up in the headlines every day—if you're reading the right publications, that is. But first, let's get the burning question out of the way. Flying cars? No. Or at least not ones that look like those in Back to the Future. Autonomous cars? They're already here. Now, on to real stuff. Consider this timeline: In the late eighteenth century, the steam engine powered the first industrial revolution. One hundred and twenty years later, at the turn of the twentieth century, commercially available electricity sparked the second industrial revolution. Sixty years after that, the first silicon-based computer triggered the third industrial revolution. Thirty years later, in 1991, the World Wide Web became publicly available, laying the foundation for big data, which — along with computing and storage — is a primary driver of the fourth industrial revolution. Hmm, 120 years between the first and second, 60 years between the second and third, 30 years between the third and fourth. Given that timeline, are you wondering what powered the fifth revolution you evidently overlooked in 2006? Do you think that Tim Berners-Lee, who kicked off the World Wide Web by posting a description of the project to the alt.hypertext newsgroup in 1991, had an inkling that with the click of a mouse he had laid the foundation for the longest AI renaissance, a renaissance that has no end in sight?
Proliferation of AI in the enterprise Information will rule the future, and AI will continue to be leveraged to process that information to solve complex challenges. Today's limits in AI will continue to be surpassed, and applications will double across a broad range of industries. In retail, you'll see AI maximize cross-selling by providing hyper-personalized content through intelligent recommendations, while manufacturers will increase margins through predictive maintenance, which maximizes the usable life of equipment and reduces costly downtime. In real estate, AI will be used to analyze massive amounts of data on past home sales, school districts, transportation, and traffic patterns to accurately project the future value of homes and cost per square foot. In HR and recruiting, AI will accelerate the talent-sourcing process by screening resumes 15 times faster than a human to identify the best candidates. In healthcare, AI will assist in medical, legal, and regulatory review for pharmaceutical companies to verify that the development and marketing of new medications comply with all legal requirements. As these industries rely on more data from users, AI will continue to advance, learn, and innovate across the enterprise.
AI will reach across functions Cross-functional teams, sometimes referred to as centers of excellence (CoEs), will empower organizations to create effective AI projects. These teams will represent the entire organization and will include individuals with business knowledge, IT experience, and specialized AI skills, such as data engineers, data scientists, subject matter experts, and project managers. Often, they will include members embedded in lines of business — like ops, sales, marketing, and R&D — to ensure the work is aligned to deliver on departmental mandates such as reducing costs, growing revenue, or unlocking new business models.
At the same time, the best organizations will have members in a centralized IT or IT-like function to scale learning from different departments across the entire organization and to ensure that compute power and access to data are provisioned. This type of dual hub-and-spoke setup — where the lines of business (LOBs) are the spokes and IT is the hub that helps scale — is widely seen as a best practice for AI CoEs. In either case, these teams will identify use cases and manage a digital platform that supports collaboration on key business initiatives. They must also partner with the right vendor, one with the tools and expertise needed to help the organization kick-start a successful AI journey. Combining internal and external resources will be imperative to building and executing powerful AI projects that see the light of day and provide real business value instead of getting locked away in some corner of the office.
AI R&D will span the globe Currently, the bulk of the new work in AI happens in the traditional loci of technological innovation. But you can't fence in AI. Even now it's escaping the compound and running off to emerging markets like Brazil, Russia, India, and China; and as the required knowledge and deployment tools become more open and available, AI will, and should, continue to expand farther, from Kenya to Kansas, Turkey to Trinidad, and beyond. This trend will increase as people all over the world enlist AI in their efforts to address their unique challenges.
The data privacy iceberg will emerge While regulations such as Europe's General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) have already been established, new regulatory developments regarding data privacy continue to emerge. Although these regulations differ in the details, the fundamental intent of data privacy laws is to give consumers the right to know what types of personally identifiable information (PII) are collected and how, the right to have that information removed, and the right to take legal action if they incur damages from bias or data security breaches. Until now, most organizations have focused their efforts on structured information, but they must also be able to understand what PII is located in textual data and documents. Archived documents are an especially pressing concern for most enterprises. AI-powered solutions will be instrumental in locating sensitive data and managing it through automated workflows. Organizations will also need to establish internal data governance practices to determine who is accountable for data security and enterprise-wide policy, which may include creating teams that blend technical and regulatory expertise and augmenting those teams with AI-powered solutions.
More transparency in AI applications In both the private and public sectors, more organizations will recognize the need to develop strategies to mitigate bias in AI and to explain outcomes. With issues such as amplified prejudices in predictive crime mapping, organizations will build checks into both the AI technology itself and their people processes, ensuring that data samples are sufficiently robust to minimize subjectivity and yield trustworthy insights. Data collection will evolve beyond selective data sets that mirror historical bias to more accurately reflect reality.
In addition, teams responsible for identifying business cases and creating and deploying machine learning models will represent a rich blend of backgrounds, views, and characteristics. Organizations will also test machines for biases, train AI models to identify bias, and appoint an HR or ethics specialist to collaborate with data scientists, thereby ensuring that cultural values are reflected in AI projects. At the time of this writing, AI software solutions and features are emerging that evaluate AI risk in terms of explainability, bias, fairness, and robustness, helping to remediate the very issues the technology enabled in the first place.
Augmented analytics will make it easier With a massive amount of information becoming available to organizations, augmented analytics will become the favored choice for processing data and running business intelligence operations. By embedding AI and ML techniques, augmented analytics will continue to lower the barrier to broader data and analytics use with less training required, through features such as asking questions of data with natural language queries (NLQ), graph and chart recommendations based on the data at hand, and smart data preparation based on associative logic, rules, or models. These smart data discovery features will continue to develop and to optimize the analytics experience. As a result, you'll see major adjustments in the business intelligence market, with an upward trend of enterprise buyers purchasing more of these augmented tools and applications and incorporating them into their data practices. The roles of computer programmers and software developers will shift back to building core features for the business, and data scientists and data engineers will shift away from the business-analyst work they are often pulled into because of a company's overall lack of analytics literacy, back to the more complex data projects and models they were hired to tackle from the start.
Rise of intelligent text mining Organizations will increasingly use sophisticated AI solutions to contextually classify and derive meaning from all types of content, including structured, semi-structured, and unstructured content. Gartner has estimated that as much as 80 percent of enterprise content is unstructured, which leaves a vast pool of information for companies to leverage. Data within emails, customer service transcripts, and other textual documents can provide real business value, as well as insights on which key business decisions can be made. Through intelligent text mining, AI solutions can quickly read and understand huge stores of content to produce accurate synopses and sentiment analysis, allowing organizations to rapidly surface the insights that demand the greatest level of attention.
Chatbots for everyone The average consumer may converse with a chatbot more than they speak with coworkers, family members, or even their spouse, as the demand for an instant response at any time continues to grow. With advanced contextual capabilities that can personalize any experience through deep learning, chatbots will prevail as the next preferred digital interface. As chatbots take on more human interactions than ever, consumer-facing businesses that want to stay competitive will incorporate these human-like AI personas into their service.
Additionally, chatbot implementation will expand into the workplace in new ways to help with recruiting, training (via knowledge assistants), and overall efficiency (via virtual assistants), becoming more intertwined with all facets of life.
Ethics will emerge for the AI generation Children born since 2010 make up the AI generation: those who have never known a world without the daily influence of AI. Yet, because many children will use AI-powered toys, programs, and educational software long before they develop critical thinking skills, it is up to adults to enforce ethical uses of AI. This means helping children develop the habit of questioning the credibility of information and its sources, along with holding companies accountable for products and practices intended for young audiences. Companies must establish transparent policies about how information is collected and used in toys, educational software, games, and apps. In particular, software used in the classroom must be free of bias that could deny children educational opportunities. Parents and educators must also familiarize themselves with the products and programs their children are using, supervise their use, and watch closely for any signs of bias or invasions of privacy.
Rise of smart cities through AI Smart cities are coming of age. The next phase in this evolution will be a significant rollout of smart city AI implementations. Large organizations have long used AI and analytics to turn unstructured data into more actionable insights. Now, artificial intelligence is opening the door for applications and networks outside the workplace to harness big data more intelligently, engage with citizens in new ways, and, as a result, make cities more efficient and sustainable. For example, AI can transform a city's infrastructure and power utilization, make strides in public safety and healthcare, and even make public parking more efficient. Cities will use smart technology to find innovative solutions to some of their most pressing urban challenges, and AI will usher in even greater opportunities to make the smart city dream more of a reality.
Ultimately, advancements in analytics and artificial intelligence have enabled humans to do so many things that weren't possible just a few years ago, and we are just scratching the surface. With the turn of the decade, AI adoption and implementation will continue to soar to new heights, painting a future that is digital first and full of possibilities.