Data Quality Tool Consolidates Duplicates into Single Golden Record of Customer Data; Uniquely Determines Most Accurate Information Based on Objective Data Quality Score

Rancho Santa Margarita, Calif. - April 23, 2014 - Melissa Data, a leading provider of contact data quality and integration solutions, today announced new matching and de-duplication functionality in its MatchUp Component for SQL Server Integration Services (SSIS), uniquely solving the business challenge of duplicate customer data. Based on proprietary logic from Melissa Data, MatchUp determines the best pieces of data to retain versus what to discard - consolidating duplicate records objectively, unlike any other data quality solution. By assessing the quality of individual data fields, MatchUp enables a smart, consistent method for database administrators (DBAs) to determine the best customer contact information in every field.

"The average database contains 8 to 10 percent duplicate records, creating a significant and costly business problem in serving, understanding and communicating with customers effectively. The ideal is a single, accurate view of the customer - known as a golden record - yet this remains one of the biggest challenges in data quality, because common methodologies don't adequately evaluate the content of each data field. As a result, DBAs either overlook duplicates or consistently struggle with determining what information survives in the database and why," said Bud Walker, director of data quality solutions at Melissa Data. "By using intelligent rules based on the actual quality of the data, DBAs are much better positioned to retain all the best pieces of information from two or more duplicate records into a single, golden record that provides valuable insight into user behavior and helps boost overall sales and marketing performance."

MatchUp works in sharp contrast to matching and de-duplication methods that rely solely on subjective principles, such as whether the record is the most recent, most complete or most frequent. Instead, selection criteria for determining a golden record are based on a relevant data quality score, derived from the validity of customer data such as addresses, phone numbers, emails and names. Once the golden record is identified intelligently, MatchUp further references the data quality score during survivorship processes to support creation of an even better golden record; duplicate entries are then collapsed into a single customer record while retaining any additional information that may also be accurate and applicable.

Utilizing deep domain knowledge of names and addresses, survivorship operations with MatchUp can granularly identify matches between names and nicknames, street/alias addresses, companies, cities, states, postal codes, phones, emails, and other contact data components.
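
As a rough illustration of how score-based survivorship can work, consider the sketch below. The field-scoring rules, field names, and sample records are hypothetical stand-ins, not Melissa Data's proprietary logic:

```python
# Hypothetical sketch of score-based survivorship. The scoring rules and
# records are illustrative only, not Melissa Data's actual logic.

def field_score(field, value):
    """Assign a crude quality score to a single field value."""
    if not value:
        return 0
    if field == "email":
        # Prefer addresses with a dot in the domain part.
        return 2 if "@" in value and "." in value.split("@")[-1] else 1
    if field == "phone":
        # Prefer values containing a full 10-digit number.
        digits = [c for c in value if c.isdigit()]
        return 2 if len(digits) == 10 else 1
    return 1  # any non-empty value beats an empty one

def build_golden_record(duplicates):
    """Keep the highest-scoring value for each field across duplicates."""
    fields = {f for rec in duplicates for f in rec}
    return {
        f: max((rec.get(f, "") for rec in duplicates),
               key=lambda v: field_score(f, v))
        for f in fields
    }

dupes = [
    {"name": "Bob Smith", "email": "bob@example", "phone": "555-867-5309"},
    {"name": "Robert Smith", "email": "bob@example.com", "phone": ""},
]
golden = build_golden_record(dupes)
# golden["email"] == "bob@example.com"; golden["phone"] == "555-867-5309"
```

Because each field of the golden record is chosen independently, the surviving record can combine the best email from one duplicate with the best phone from another, rather than simply keeping one whole record.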

Melissa Data will be demonstrating its MatchUp Component for SSIS at booth #46 during Enterprise Data World, April 27-May 1, 2014 at The Renaissance Hotel in Austin, TX. To download a free trial of Melissa Data's MatchUp Component for SSIS, click here or call 1-800-MELISSA (635-4772).


Customer Interactions

By David Loshin

In my last set of posts, I suggested that organizations reconsider the scope of the concept of the "customer" and rethink the relationship between the organization and a customer. More to the point, I wanted to begin to explore how managing the different aspects of the customer relationship can enhance customer centricity, improve the customer experience, and eventually lead to increased profitability.

I summarized with the suggestion that you consider every interaction with any entity (individual or organization) in which there is an exchange of value as a customer interaction, and that is the topic of this week's post.

That suggestion hinges upon two core concepts. The first is that one can effectively identify the scenarios within any business process where two entities interact and there is an identifiable exchange of value. The second is that one can describe and quantify what that exchange of value is.

We can begin with an assessment of the business functions that traditionally are associated with customer interactions and their processes. For example, the marketing function seeks to attract and engage prospects, while the sales function looks to convert prospects into committed purchasers.

There may be a fulfillment function tasked with delivery of the purchased product or service, the finance function to collect payments, and a customer service function to deal with inquiries and complaints. Each of these business functions has some interaction with customers; the challenge is to identify (and document, if necessary) the business processes and then specify where in the process the customer interaction occurs.

Those customer interactions will be the focal point of our next series of postings. Next we will consider the exchange of value, which frames the point of the interaction, and then we will look at the information about the customer that can be used to increase the value of the interaction.

DQ Tools Correct/Standardize Contact Data for More than 240 Countries/Territories; Plus Real-Time Fraud Prevention and Identity Verification for E-Commerce

RANCHO SANTA MARGARITA, Calif., April 2, 2014 - Melissa Data, a leading provider of contact data quality and direct marketing solutions, today announced its COLLABORATE 2014 exhibit will feature a global focus on its data quality (DQ) tools for Oracle. Natively integrating DQ operations such as address cleansing, email verification, record matching and de-duping, Melissa Data's Oracle integration notably includes international cleansing tools in support of global contact data quality.

Using Melissa Data's comprehensive collection of APIs and cloud services, Oracle developers can now capture, validate, and cleanse international addresses for more than 240 countries and territories around the world. Melissa Data's Oracle integration also includes Personator, its integrated DQ Web service designed to provide real-time fraud prevention and identity verification for e-commerce applications.

"Melissa Data's Oracle integration responds directly to the needs of database administrators. We're challenging competitors by reducing costs and eliminating complexity for common yet sophisticated global data quality operations," said Bud Walker, director of data quality solutions at Melissa Data. "Native access to global data cleansing ability, as well as powerful fraud prevention and identity verification, enables DQ tools in a familiar programming construct - eliminating the need for low-level integration processes and resources."

Melissa Data's MVP Program further supports Oracle ACEs and community IOUG (Independent Oracle Users Group) leaders, providing referral incentives, core product training, and complimentary access to Melissa Data's suite of data quality products for cleaning and verifying global data.

Melissa Data's focus on global DQ tools for Oracle developers is integral to the company's overall international growth strategy. Building on nearly 30 years as a global data quality leader, Melissa Data's international expansion enables local sales and support for a global clientele of end-users as well as developers. Supported by a recently opened central London office, Melissa Data conducts business worldwide and also has key locations in Germany and India.

Melissa Data will be demonstrating its global data quality tools for Oracle at booth #1549 during COLLABORATE 14/IOUG, April 7-11, 2014 at The Venetian and Sands Expo Center in Las Vegas. To learn more about the DQ tools included in Melissa Data's Oracle integration, click here or call 1-800-MELISSA (635-4772).

Devising the Strategy, Making the Plan

By David Loshin

The three steps that I suggested in my last post about where to begin with data quality are meant not only to help determine where to begin, but also to guide the development of a longer-term strategy and plan. Let me recall the three steps, but this time put them into the long-term perspective:

  1. Solicit data quality requirements from the business users. The objective of this task is to understand how information contributes to value creation and how the users perceive data usability, with the intent of identifying the key value drivers for data quality and consequently determining key data quality measures as well as the levels of acceptability in meeting the business users' needs.
  2. Perform a data quality assessment. The assessment serves the purpose of evaluating the current state in relation to the defined data quality measures.
  3. Establish a process and repository for data quality incident reporting and tracking. This puts practices and capabilities into place for addressing data issues in a way that continually improves the environment to achieve levels of acceptability for all data quality measures.

The user data quality requirements frame a desired end-state for data usability. Matching the measures collected during the current state assessment against the levels desired by the users provides a gap analysis that will highlight issues in areas in need of specific improvement.

Employing a prioritization strategy to the list of documented issues provides a tactical roadmap for evolving toward the proposed end-state by incrementally addressing the root causes and eliminating the sources of data quality issues or by instituting methods for early validation and error detection.
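
The gap analysis and prioritization described above can be sketched in a few lines; the measure names and acceptability thresholds below are invented for illustration:

```python
# Hypothetical sketch of gap analysis and prioritization. The measure
# names and thresholds are illustrative, not from any real assessment.

required = {  # acceptability levels from the business-user requirements
    "address_validity": 0.98,
    "email_validity": 0.95,
    "duplicate_free_rate": 0.92,
}
measured = {  # scores collected during the current-state assessment
    "address_validity": 0.91,
    "email_validity": 0.96,
    "duplicate_free_rate": 0.80,
}

# Gap analysis: how far each measured score falls short of acceptability.
gaps = {m: required[m] - measured[m] for m in required}

# Tactical roadmap: only measures below acceptability, largest gap first.
roadmap = sorted((m for m in gaps if gaps[m] > 0),
                 key=lambda m: gaps[m], reverse=True)
# roadmap == ["duplicate_free_rate", "address_validity"]
```

Measures already at or above their acceptability level (here, email validity) drop out of the roadmap, so remediation effort concentrates on the largest shortfalls first.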

In other words, the starting steps not only provide immediate value, they also help the data quality practitioner to craft a medium- and long-term plan for data quality improvement.

Server Side Replacement Enables Smooth Data Quality Conversion; Minimizes Need for Recoding Existing Application

Rancho Santa Margarita, Calif. - March 26, 2014 - Melissa Data, a leading provider of contact data quality and data integration solutions, today announced it will provide end-of-life options for customers using SAP's PostalSoft ACE program, an address correction engine designated end-of-life on March 31, 2015. Melissa Data offers a compelling alternative, enabling fast, seamless conversion with its server side replacement solution; Melissa Data's Address Object drops in with minimal changes to existing applications.

"Effectively converting from legacy ACE platforms requires a trusted global leader, offering capabilities proven with thousands of customers worldwide and based on extensive understanding of the mailing industry," said Ray Melissa, president and CEO, Melissa Data. "We are well underway in transitioning data quality systems for a number of PostalSoft customers - meeting this challenge by capitalizing on our nearly 30 years of software development expertise and deep domain knowledge of address correction issues, strategies and technologies."

Melissa Data's Address Object effectively replaces SAP's PostalSoft ACE, providing a USPS CASS™ Certified address correction engine that supports mailing and data quality processes and operations without disruption. SAP users can also access Melissa Data's other data quality services such as matching/consolidation and USPS NCOALink® Move Update processing to improve mail accuracy and reduce the cost and impact of bad data throughout business operations.

For more information about transitioning from SAP's PostalSoft ACE call 1-800-MELISSA (635-4772).

Where to Start with Data Quality

By David Loshin

In one of my previous posts I noted that there were two common questions about a data quality program. In my last post we dealt with the first about developing a business justification, so this week we will look at the other one: where to start?

I believe that there are three key tasks that must be done at the beginning of a data quality initiative, all intended to help laser-focus the plan to best address the most critical needs:

  1. Solicit data quality requirements from the business users. Don't directly ask the users about their data quality rules. Instead, devise a standardized approach to interviewing the business users about the way they rely on information within the context of their day-to-day activities. Document their description of their data use, extract their key dependencies, synthesize some quantifiable measures, and then reflect those measures back to them to validate your analysis.

  2. Perform a data quality assessment. Combine the use of statistical profiling tools and qualitative review to assess the degree to which data sets comply with user expectations and to find potential anomalies that require closer inspection to determine criticality.

  3. Establish a process and repository for data quality incident reporting and tracking. Provide a centralized management process that allows the users to report issues as they are identified, automate data practitioner notification, provide a set of rules for prioritization and evaluation, and then provide a workflow management scheme that ensures that high priority issues are addressed within a reasonable time frame.

In essence, these three steps will allow you to do a current state assessment of the levels of data quality, document the most critical ones, and help prioritize the methods by which those issues are investigated and resolved.
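
The assessment in step 2 typically begins with simple column profiling. The sketch below computes the kind of statistics a profiling tool would report; the data and validation pattern are illustrative only:

```python
import re

# Hypothetical column-profiling sketch for a data quality assessment:
# compute fill rate, distinct-value count, and pattern conformance.

def profile_column(name, values, pattern=None):
    """Profile one column of data; pattern is an optional validity regex."""
    non_empty = [v for v in values if v]
    stats = {
        "column": name,
        "fill_rate": len(non_empty) / len(values) if values else 0.0,
        "distinct": len(set(non_empty)),
    }
    if pattern:
        rx = re.compile(pattern)
        # Count non-empty values that fully match the expected pattern.
        stats["pattern_ok"] = sum(bool(rx.fullmatch(v)) for v in non_empty)
    return stats

zips = ["92688", "92688", "9268", "", "10001"]
report = profile_column("zip", zips, pattern=r"\d{5}")
# fill_rate 0.8, distinct 3, pattern_ok 3 (the 4-digit "9268" fails)
```

Anomalies surfaced this way (here, a truncated ZIP Code and an empty value) are exactly the candidates that require closer inspection to determine criticality.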

Justifying Data Quality Management

By David Loshin

Last week I shared some thoughts about a current client and the mission to justify the development of a data quality program in an organization that over time has evolved into one with distributed oversight and consequently very loose enterprise-wide controls.

The trick to justifying this effort, it seems, is to address some of the key issues impacting the usability of data warehouse data, as many of the downstream business users often complain about data usability, long waits to get the data for their applications, and difficulty in getting answers to the questions they ask.

The issues mostly center on data inconsistency, timeliness of populating the analytical platform, completeness of the data, the rampant potential for duplication of key entity data due to the numerous data feeds, the long time it takes to figure out why data is invalid, and a general inability to satisfy specific downstream customer needs.

Because all of these issues are associated with the context of day-to-day reporting and analysis, we have inferred that the common theme is operational efficiency, and that is the dimension of value that we have chosen as the key for establishing the value proposition. Therefore, our business justification focuses on the data quality management themes that would resonate in terms of improving operational efficiency:

  • Improve proactive detection of invalid data prior to loading into the data warehouse
  • Speed the effort of finding and analyzing the source of data errors
  • Make the time to remediate data issues more predictable
  • Better communicate data errors to the data suppliers
  • Provide business-oriented data quality scorecards to the data users

The goal is to provide reporting with credible statistics that demonstrate the efficiency of validating data, finding errors and fixing them, and delivering measurably high-quality data.
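
As a rough sketch of that kind of reporting, the example below rolls an invented incident log up into per-supplier scorecard statistics such as issue counts and mean time to remediate; the feed names and dates are hypothetical:

```python
from datetime import datetime
from statistics import mean

# Hypothetical incident-log sketch: roll reported data issues up into
# per-supplier scorecard statistics. All feeds and dates are invented.

incidents = [
    {"supplier": "feed_a", "opened": "2014-03-01", "closed": "2014-03-03"},
    {"supplier": "feed_a", "opened": "2014-03-05", "closed": "2014-03-06"},
    {"supplier": "feed_b", "opened": "2014-03-02", "closed": "2014-03-09"},
]

def days_to_fix(rec):
    """Whole days between an incident being opened and closed."""
    fmt = "%Y-%m-%d"
    return (datetime.strptime(rec["closed"], fmt)
            - datetime.strptime(rec["opened"], fmt)).days

def scorecard(incidents):
    """Group remediation times by supplier and summarize each group."""
    by_supplier = {}
    for rec in incidents:
        by_supplier.setdefault(rec["supplier"], []).append(days_to_fix(rec))
    return {s: {"issues": len(d), "mean_days_to_fix": mean(d)}
            for s, d in by_supplier.items()}

card = scorecard(incidents)
# feed_a: 2 issues, mean 1.5 days; feed_b: 1 issue, mean 7 days
```

Tracking mean time to remediate per supplier is one simple way to make remediation time more predictable and to give data suppliers concrete feedback about their errors.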

Deutsche Post Direkt Partners with Melissa Data


Relationship Fuels Melissa Data's International Growth; Adds Value for Direct Advertisers with Free Analysis of International Contact Data

Rancho Santa Margarita, Calif. - February 6, 2014 - Melissa Data, a leading provider of contact data quality and integration solutions, today announced a partnership with Deutsche Post Direkt GmbH, subsidiary of Deutsche Post and veteran developer of address management solutions to optimize direct advertising. The partnership represents continued progress in Melissa Data's global expansion, benefitting customers of Deutsche Post Direkt with proven solutions in international address validation and geocoding.

"Partnering with Deutsche Post Direkt, a national leader in address management and interactive marketing, underlines our capabilities and potential for development and expansion at the international level," said Çağdaş Gandar, Sales Director Europe, Melissa Data.

Melissa Data's global perspective on data quality aligns with Deutsche Post Direkt's core business operations in address cleansing, analysis, enhancement and leasing. Using Melissa Data's Global Address Verification, a component of the company's Global Data Quality Suite, direct advertisers can reduce costs and improve operations through international address verification. Users can analyze addresses from more than 240 countries and territories - correcting and translating them into the official postal format of each geographic region, and enhancing them with location-based information.

Upon request, Melissa Data is providing a free data analysis and data quality report for customers of Deutsche Post Direkt, enabling a benchmark for determining further data quality strategies based on individual customer needs.

For more information about the partnership between Melissa Data and Deutsche Post Direkt, dial +49-30-79788829 to reach Melissa Data's Berlin office; in the U.S., call 1-800-MELISSA (635-4772).

Express Entry® Validates Addresses as They are Entered; Improves Online Experience and Eliminates Possibility of Incorrect Data

Rancho Santa Margarita, Calif. - February 4, 2014 - Melissa Data, a leading provider of contact data quality and integration solutions, today announced free availability of Express Entry®, the company's address auto-completion tool designed to eliminate bad data at the source of entry. Express Entry validates addresses as they are entered, enabling fast and accurate customer data ideal for CRM and call center applications. Data entry is not only correct, but also up to 50 percent faster. Site users have a better experience, with type-as-you-go data correction or fielded data that quickly recognizes and suggests correct addresses based on the ZIP Code provided.

Express Entry is free for up to 1,000 addresses per month when users display an approved Melissa Data logo, readily enabling real-time data quality for the vast majority of small business websites. Larger volume subscriptions are available if website traffic warrants increased usage. Both business and customers benefit from improved, real-time data quality; order entry is easier and faster, and misdirected shipments and shipping costs are reduced.

"We're ensuring that sophisticated data quality is within reach across the business spectrum. The reality is that it's too costly not to integrate data quality tools, especially when they are simple, intuitive and free," said Bud Walker, director of data quality solutions, Melissa Data. "Express Entry provides an essential advantage in this effort, preventing bad data from even entering the system. Cost and complexity are no longer barriers, and businesses can readily access both value and simplicity in data quality operations."

Express Entry is a simple Web service; no extensive IT resources are required and it can be easily integrated into customized applications and platforms. In addition to the Web service, Webmasters and site developers alike can download a JavaScript code snippet for use with e-commerce forms and online guest books. Users begin typing an address into a search bar and Express Entry completes the query with a verified address, including city, state, and ZIP Code.
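
The type-as-you-go behavior can be illustrated with a minimal prefix lookup scoped by ZIP Code. The sketch below is a stand-in for illustration only; the address data and function are hypothetical and are not the actual Express Entry API:

```python
# Hypothetical sketch of ZIP-scoped, prefix-based address completion,
# illustrating type-as-you-go lookup. This is NOT the Express Entry API;
# the address data and function names are invented for this example.

ADDRESSES = {  # verified street addresses keyed by ZIP Code
    "92688": ["22382 Avenida Empresa", "22421 El Toro Rd"],
    "10001": ["350 5th Ave", "1 Penn Plaza"],
}

def suggest(zip_code, prefix, limit=5):
    """Return verified addresses in the ZIP that start with the prefix."""
    candidates = sorted(ADDRESSES.get(zip_code, []))
    p = prefix.lower()
    return [a for a in candidates if a.lower().startswith(p)][:limit]

# suggest("92688", "223") -> ["22382 Avenida Empresa"]
```

Restricting candidates to a verified list keyed by ZIP Code is what lets a type-ahead widget narrow suggestions after only a few keystrokes, so the user never finishes typing an address that the service cannot validate.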

Melissa Data will be demonstrating Express Entry during their webinar slated for Wednesday, February 19 at 11:00 am Pacific/2:00 pm Eastern; click here to register.

To download Melissa Data's Express Entry, click here or call 1-800-MELISSA (635-4772).

Assembling a Data Quality Management Framework

By David Loshin

As easy as it is to discuss singular aspects of data quality as we have over the past few years, there are two main questions that I am asked over and over again when people are trying to put together a cohesive plan and program for data quality management.

The first question is "how can you develop a business justification for a data quality program?" and the second is "how do we get started?" We are currently working on a task with one customer who seems to be ready to commit to instituting a data quality management program, yet they remain somewhat resistant because of the absence of an answer to the first question and confused about planning because of the absence of an answer to the second.

Let me clarify the scenario somewhat: this is a large organization that has over the years empowered their user community with an atypical degree of data freedom. At the same time, they have a widely distributed management structure for information technology development. The result is, as you can imagine, some controlled chaos.

There are data validation routines here and there for extracting data (from one or more sources) and loading the data into a target system. But these routines are completely distinct and non-standardized, to the point where, in the few places where validation scores exist, anyone actually looking at them would be challenged to make sense of them when attempting to assess data quality and usability.

Luckily, there is a new initiative for considering enterprise-level data services, and data quality has emerged as one of the potential foundations of this service strategy. In my upcoming post, we will look at some aspects of the business justification to be used in socializing the value proposition of data quality improvement.