DQ Tools Correct/Standardize Contact Data for More than 240 Countries/Territories; Plus Real-Time Fraud Prevention and Identity Verification for E-Commerce


RANCHO SANTA MARGARITA, Calif., April 2, 2014 - Melissa Data, a leading provider of contact data quality and direct marketing solutions, today announced its COLLABORATE 2014 exhibit will feature a global focus on its data quality (DQ) tools for Oracle. Natively integrating DQ operations such as address cleansing, email verification, record matching and de-duping, Melissa Data's Oracle integration notably includes international cleansing tools in support of global contact data quality.

Using Melissa Data's comprehensive collection of APIs and cloud services, Oracle developers can now capture, validate, and cleanse international addresses for more than 240 countries and territories around the world. Melissa Data's Oracle integration also includes Personator, its integrated DQ Web service designed to provide real-time fraud prevention and identity verification for e-commerce applications.
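To make the workflow concrete, here is a minimal sketch of how an application might call a cloud address verification service of this kind, using the Python requests library. The endpoint URL, parameter names, and response fields are illustrative assumptions, not Melissa Data's actual API.

```python
# Minimal sketch of validating an address through a cloud verification
# service. The endpoint URL, parameter names, and response fields below
# are illustrative assumptions, not Melissa Data's actual API.
import requests

VERIFY_URL = "https://api.example.com/v1/global-address"  # hypothetical

def verify_address(line1: str, locality: str, country: str) -> dict:
    """Send one address to the verification service and return the result."""
    response = requests.get(
        VERIFY_URL,
        params={"a1": line1, "loc": locality, "ctry": country},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

record = verify_address("22382 avenida empresa", "rancho santa margarita", "US")
# A typical service echoes back the standardized form plus result codes
# indicating what was corrected or could not be verified.
print(record.get("formatted_address"), record.get("result_codes"))
```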

"Melissa Data's Oracle integration responds directly to the needs of database administrators. We're challenging competitors by reducing costs and eliminating complexity for common yet sophisticated global data quality operations," said Bud Walker, director of data quality solutions at Melissa Data. "Native access to global data cleansing ability, as well as powerful fraud prevention and identity verification, enables DQ tools in a familiar programming construct - eliminating the need for low-level integration processes and resources."

Melissa Data's MVP Program further supports Oracle ACEs and IOUG (Independent Oracle Users Group) community leaders, providing referral incentives, core product training, and complimentary access to Melissa Data's suite of data quality products for cleaning and verifying global data.

Melissa Data's focus on global DQ tools for Oracle developers is integral to the company's overall international growth strategy. Building on nearly 30 years as a global data quality leader, Melissa Data is expanding internationally, enabling local sales and support for a global clientele of end users and developers alike. Supported by a recently opened central London office, Melissa Data conducts business worldwide and also has key locations in Germany and India.

Melissa Data will be demonstrating its global data quality tools for Oracle at booth #1549 during COLLABORATE 14/IOUG, April 7-11, 2014 at The Venetian and Sands Expo Center in Las Vegas. To learn more about the DQ tools included in Melissa Data's Oracle integration, click here or call 1-800-MELISSA (635-4772).


Devising the Strategy, Making the Plan

By David Loshin

The three steps that I suggested in my last post about where to begin with data quality are meant not only to help determine where to begin, but also to guide the development of a longer-term strategy and plan. Let me recall the three steps, but this time put them into the long-term perspective:

  1. Solicit data quality requirements from the business users. The objective of this task is to understand how information contributes to value creation and how the users perceive data usability, with the intent of identifying the key value drivers for data quality and, consequently, determining key data quality measures as well as the levels of acceptability that meet the business users' needs.
  2. Perform a data quality assessment. The assessment serves the purpose of evaluating the current state in relation to the defined data quality measures.
  3. Establish a process and repository for data quality incident reporting and tracking. This puts practices and capabilities into place for addressing data issues in a way that continually improves the environment to achieve levels of acceptability for all data quality measures.

The user data quality requirements frame a desired end-state for data usability. Matching the measures collected during the current state assessment against the levels desired by the users provides a gap analysis that will highlight areas in need of specific improvement.

Applying a prioritization strategy to the list of documented issues provides a tactical roadmap for evolving toward the proposed end-state, either by incrementally addressing the root causes and eliminating the sources of data quality issues or by instituting methods for early validation and error detection.
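As a rough illustration of this gap analysis and prioritization, consider the sketch below; the measure names, scores, and thresholds are invented for the example.

```python
# Sketch of the gap analysis described above: compare measured data
# quality scores against the acceptability levels gathered from business
# users, then rank the shortfalls to build a tactical roadmap. All
# measure names and numbers are invented for illustration.

# Acceptability thresholds solicited from business users (step 1).
acceptable = {"address_validity": 0.98, "email_deliverability": 0.95,
              "duplicate_free_rate": 0.99}

# Scores observed during the current-state assessment (step 2).
observed = {"address_validity": 0.91, "email_deliverability": 0.96,
            "duplicate_free_rate": 0.88}

# Gap analysis: how far each measure falls short of its target.
gaps = {m: acceptable[m] - observed[m] for m in acceptable}
shortfalls = {m: g for m, g in gaps.items() if g > 0}

# Prioritize: the largest gaps head the improvement roadmap (step 3's
# incident tracking keeps feeding this list as new issues are reported).
for measure, gap in sorted(shortfalls.items(), key=lambda kv: -kv[1]):
    print(f"{measure}: {gap:.0%} below the acceptable level")
```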

In other words, the starting steps not only provide immediate value, they also help the data quality practitioner to craft a medium- and long-term plan for data quality improvement.


Server-Side Replacement Enables Smooth Data Quality Conversion; Minimizes Need for Recoding Existing Applications

RANCHO SANTA MARGARITA, Calif., March 26, 2014 - Melissa Data, a leading provider of contact data quality and data integration solutions, today announced it will provide end-of-life options for customers using SAP's PostalSoft ACE program, an address correction engine designated end-of-life on March 31, 2015. Melissa Data offers a compelling alternative, enabling fast, seamless conversion with its server-side replacement solution; Melissa Data's Address Object drops in with minimal changes to existing applications.

"Effectively converting from legacy ACE platforms requires a trusted global leader, offering capabilities proven with thousands of customers worldwide and based on extensive understanding of the mailing industry," said Ray Melissa, president and CEO, Melissa Data. "We are well underway in transitioning data quality systems for a number of PostalSoft customers - meeting this challenge by capitalizing on our nearly 30 years of software development expertise and deep domain knowledge of address correction issues, strategies and technologies."

Melissa Data's Address Object effectively replaces SAP's PostalSoft ACE, providing a USPS CASS™ Certified address correction engine that supports mailing and data quality processes and operations without disruption. SAP users can also access Melissa Data's other data quality services such as matching/consolidation and USPS NCOALink® Move Update processing to improve mail accuracy and reduce the cost and impact of bad data throughout business operations.
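One way to picture the drop-in replacement: the application keeps calling a single address-correction interface while the engine behind it is swapped. Below is a minimal sketch of that pattern, with hypothetical class and method names rather than the actual Address Object API.

```python
# Sketch of the drop-in idea: the application codes against one address-
# correction interface while the engine behind it is swapped. Class and
# method names are hypothetical, not the actual Address Object API.
from abc import ABC, abstractmethod

class AddressEngine(ABC):
    """Interface the existing application already codes against."""
    @abstractmethod
    def correct(self, raw_address: str) -> str: ...

class LegacyAceEngine(AddressEngine):
    """The engine being retired."""
    def correct(self, raw_address: str) -> str:
        raise RuntimeError("ACE reaches end-of-life on March 31, 2015")

class ReplacementEngine(AddressEngine):
    """Stand-in for the new CASS-certified correction engine."""
    def correct(self, raw_address: str) -> str:
        return raw_address.upper()  # placeholder for real correction logic

def process_mailing_list(engine: AddressEngine, addresses: list[str]) -> list[str]:
    # Application code is untouched; only the engine instance changes.
    return [engine.correct(a) for a in addresses]

print(process_mailing_list(ReplacementEngine(), ["22382 Avenida Empresa"]))
```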

For more information about transitioning from SAP's PostalSoft ACE, call 1-800-MELISSA (635-4772).

Where to Start with Data Quality

By David Loshin

In one of my previous posts I noted that there were two common questions about a data quality program. In my last post we dealt with the first, developing a business justification, so this week we will look at the other one: where to start?

I believe that there are three key tasks that must be done at the beginning of a data quality initiative, all intended to help laser-focus the plan to best address the most critical needs:

  1. Solicit data quality requirements from the business users. Don't directly ask the users about their data quality rules. Instead, devise a standardized approach to interviewing the business users about the way they rely on information within the context of their day-to-day activities. Document their description of their data use, extract their key dependencies, synthesize some quantifiable measures, and then reflect those measures back to them to validate your analysis.

  2. Perform a data quality assessment. Combine the use of statistical profiling tools and qualitative review to assess the degree to which data sets comply with user expectations and to find potential anomalies that require closer inspection to determine criticality (see the profiling sketch after this list).

  3. Establish a process and repository for data quality incident reporting and tracking. Provide a centralized management process that allows the users to report issues as they are identified, automate data practitioner notification, provide a set of rules for prioritization and evaluation, and then provide a workflow management scheme that ensures that high priority issues are addressed within a reasonable time frame.
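As a rough sketch of the statistical profiling in step 2, the example below computes per-column fill rates, distinct counts, and pattern conformance, then flags columns whose profile warrants closer inspection. The sample data and the 90 percent fill-rate threshold are invented for illustration.

```python
# Rough sketch of statistical profiling: for each column, compute fill
# rate, distinct-value count, and pattern conformance, then flag columns
# whose profile suggests anomalies worth closer inspection. Sample data
# and the 90% fill-rate threshold are illustrative.
import re

rows = [
    {"name": "Ada Lovelace", "zip": "90210"},
    {"name": "Alan Turing", "zip": "9021"},   # suspicious ZIP pattern
    {"name": "", "zip": "10001"},             # missing value
]

def profile_column(rows, column, pattern=None):
    values = [r[column] for r in rows]
    filled = [v for v in values if v]
    stats = {
        "fill_rate": len(filled) / len(values),
        "distinct": len(set(filled)),
    }
    if pattern:
        stats["conforming"] = sum(bool(re.fullmatch(pattern, v)) for v in filled)
    return stats

for col, pat in [("name", None), ("zip", r"\d{5}")]:
    stats = profile_column(rows, col, pat)
    flag = " <-- inspect" if stats["fill_rate"] < 0.9 else ""
    print(col, stats, flag)
```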

In essence, these three steps will allow you to do a current state assessment of the levels of data quality, document the most critical ones, and help prioritize the methods by which those issues are investigated and resolved.


Justifying Data Quality Management

By David Loshin

Last week I shared some thoughts about a current client and the mission to justify the development of a data quality program in an organization that over time has evolved into one with distributed oversight and consequently very loose enterprise-wide controls.

The trick to justifying this effort, it seems, is to address some of the key issues impacting the usability of data warehouse data, as many downstream business users complain about data usability, long waits to get the data for their applications, and difficulty in getting answers to the questions they ask.

The issues mostly center on data inconsistency, timeliness of populating the analytical platform, completeness of the data, the rampant potential for duplication of key entity data due to the numerous data feeds, the long time needed to figure out why data is invalid, and a general inability to satisfy specific downstream customer needs.

Because all of these issues arise in the context of day-to-day reporting and analysis, we have inferred that the common theme is operational efficiency, and that is the dimension of value we have chosen as the key to establishing the value proposition. Therefore, our business justification focuses on the data quality management themes that would resonate in terms of improving operational efficiency:

  • Improve proactive detection of invalid data prior to loading into the data warehouse
  • Speed the effort of finding and analyzing the source of data errors
  • Make the time to remediate data issues more predictable
  • Better communicate data errors to the data suppliers
  • Provide business-oriented data quality scorecards to the data users

The goal is to provide reporting with credible statistics that demonstrate the efficiency of validating data, finding errors and fixing them, and delivering measurably high-quality data, along the lines of the sketch below.
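Here is a minimal sketch of a pre-load validation pass feeding a business-oriented scorecard; the feeds, fields, and rules are invented for illustration.

```python
# Sketch of a pre-load validation pass that feeds a per-feed scorecard:
# records failing a rule are held back from the warehouse load and
# reported to the supplying feed. Feeds, fields, and rules are invented.
incoming = [
    {"feed": "crm", "customer_id": "C001", "email": "a@example.com"},
    {"feed": "crm", "customer_id": "", "email": "b@example.com"},
    {"feed": "web", "customer_id": "C003", "email": "not-an-email"},
]

rules = {
    "customer_id_present": lambda r: bool(r["customer_id"]),
    "email_has_at_sign": lambda r: "@" in r["email"],
}

loadable, rejected = [], []
for record in incoming:
    failures = [name for name, rule in rules.items() if not rule(record)]
    (rejected if failures else loadable).append((record, failures))

# Scorecard by feed: share of records passing all rules, plus the
# rejects to route back to each data supplier.
for feed in sorted({r["feed"] for r in incoming}):
    total = sum(1 for r in incoming if r["feed"] == feed)
    passed = sum(1 for rec, _ in loadable if rec["feed"] == feed)
    print(f"{feed}: {passed}/{total} records passed validation")
for rec, failures in rejected:
    print(f"reject from {rec['feed']}: failed {failures}")
```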


Deutsche Post Direkt Partners with Melissa Data


Relationship Fuels Melissa Data's International Growth; Adds Value for Direct Advertisers with Free Analysis of International Contact Data


RANCHO SANTA MARGARITA, Calif., February 6, 2014 - Melissa Data, a leading provider of contact data quality and integration solutions, today announced a partnership with Deutsche Post Direkt GmbH, a subsidiary of Deutsche Post and veteran developer of address management solutions that optimize direct advertising. The partnership represents continued progress in Melissa Data's global expansion, benefiting customers of Deutsche Post Direkt with proven solutions in international address validation and geocoding.

"Partnering with Deutsche Post Direkt, a national leader in address management and interactive marketing, underlines our capabilities and potential for development and expansion of shops at the international level," said Çağdaş Gandar, Sales Director Europe, Melissa Data.

Melissa Data's global perspective on data quality aligns with Deutsche Post Direkt's core business operations in address cleansing, analysis, enhancement and leasing. Using Melissa Data's Global Address Verification, a component of the company's Global Data Quality Suite, direct advertisers can reduce costs and improve operations through international address verification. Users can analyze addresses from more than 240 countries and territories - correcting and translating them into the official postal format of each geographic region, and enhancing them with location-based information.

Upon request, Melissa Data is providing a free data analysis and data quality report for customers of Deutsche Post Direkt, enabling a benchmark for determining further data quality strategies based on individual customer needs.

For more information about the partnership between Melissa Data and Deutsche Post Direkt, dial +49-30-79788829 to reach Melissa Data's Berlin office, or visit www.MelissaData.com. In the U.S., call 1-800-MELISSA (635-4772).


Express Entry® Validates Addresses as They Are Entered; Improves Online Experience and Eliminates Possibility of Incorrect Data


RANCHO SANTA MARGARITA, Calif., February 4, 2014 - Melissa Data, a leading provider of contact data quality and integration solutions, today announced free availability of Express Entry®, the company's address auto-completion tool designed to eliminate bad data at the source of entry. Express Entry validates addresses as they are entered, enabling fast and accurate customer data ideal for CRM and call center applications. Data entry is not only correct, but also up to 50 percent faster. Site users have a better experience, with type-as-you-go data correction or fielded data entry that quickly recognizes and suggests correct addresses based on the ZIP Code provided.

Express Entry is free for up to 1,000 addresses per month when users display an approved Melissa Data logo, readily enabling real-time data quality for the vast majority of small business websites. Larger volume subscriptions are available if website traffic warrants increased usage. Both businesses and customers benefit from improved, real-time data quality; order entry is easier and faster, and misdirected shipments and shipping costs are reduced.

"We're ensuring that sophisticated data quality is within reach across the business spectrum. The reality is that it's too costly not to integrate data quality tools, especially when they are simple, intuitive and free," said Bud Walker, director of data quality solutions, Melissa Data. "Express Entry provides an essential advantage in this effort, preventing bad data from even entering the system. Cost and complexity are no longer barriers, and businesses can readily access both value and simplicity in data quality operations."

Express Entry is a simple Web service; no extensive IT resources are required and it can be easily integrated into customized applications and platforms. In addition to the Web service, Webmasters and site developers alike can download a JavaScript code snippet for use with e-commerce forms and online guest books. Users begin typing an address into a search bar and Express Entry completes the query with a verified address, including city, state, and ZIP Code.
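The type-ahead pattern looks roughly like the sketch below: each partial entry is sent to an autocomplete endpoint that returns verified suggestions. The URL, parameter names, and response shape are assumptions for illustration, not the actual Express Entry API; as noted above, real integrations use the Web service or the JavaScript snippet.

```python
# Minimal sketch of the type-ahead pattern: each keystroke (or a
# debounced batch of keystrokes) sends the partial address to an
# autocomplete endpoint, which returns verified suggestions. The URL,
# parameter names, and response shape are illustrative assumptions.
import requests

SUGGEST_URL = "https://api.example.com/v1/address-autocomplete"  # hypothetical

def suggest(partial: str, max_records: int = 5) -> list[str]:
    """Return verified address suggestions for a partial entry."""
    response = requests.get(
        SUGGEST_URL,
        params={"q": partial, "max": max_records},
        timeout=5,
    )
    response.raise_for_status()
    return response.json().get("suggestions", [])

# As the user types "22382 Avenida Em", the form would display something
# like "22382 Avenida Empresa, Rancho Santa Margarita, CA 92688".
for s in suggest("22382 Avenida Em"):
    print(s)
```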

Melissa Data will be demonstrating Express Entry during its webinar slated for Wednesday, February 19 at 11:00 am Pacific/2:00 pm Eastern; click here to register.

To download Melissa Data's Express Entry, click here or call 1-800-MELISSA (635-4772).

Assembling a Data Quality Management Framework

By David Loshin

As easy as it is to discuss individual aspects of data quality, as we have over the past few years, there are two main questions that I am asked over and over again when people are trying to put together a cohesive plan and program for data quality management.

The first question is "how can you develop a business justification for a data quality program?" and the second is "how do we get started?" We are currently working on a task with one customer who seems ready to commit to instituting a data quality management program, yet remains somewhat resistant for lack of an answer to the first question, and confused about planning for lack of an answer to the second.

Let me clarify the scenario somewhat: this is a large organization that has over the years empowered its user community with an atypical degree of data freedom. At the same time, it has a widely distributed management structure for information technology development. The result is, as you can imagine, some controlled chaos.

There are data validation routines here and there for extracting data (from one or more sources) and loading it into a target system. But these routines are completely distinct and non-standardized, to the point where, even in the few places where anyone actually looks at the validation scores, it would be a challenge to make sense of them when attempting to assess data quality and usability.

Luckily, there is a new initiative for considering enterprise-level data services, and data quality has emerged as one of the potential foundations of this service strategy. In my upcoming post, we will look at some aspects of the business justification to be used in socializing the value proposition of data quality improvement.

Get it Right the First Time

By Elliot King

Every database of a certain size will inevitably contain some unwanted data. There are just so many sources of bad data that it isn't possible to safeguard against all of them. The inexorable decay of data due to life itself--people move, get married, change jobs and so on--means that at some point databases will contain inaccurate, incomplete and obsolete information.

But that doesn't mean that organizations are totally helpless in trying to maintain high quality data right from the start. One of the most significant steps organizations can take is to proactively ensure that the data initially captured in a database is correct the first time. Think about it this way: one of the most time-honored maxims in the information technology universe is "garbage in--garbage out." So it makes sense to try to keep the garbage out in the first place.

Often, the steps that have the biggest impact on data quality are the simplest. Despite all the automation, a lot of information is still entered into databases manually, often through Web forms. And the sad truth is, people make mistakes. They misspell names. They select the wrong entry from a pull-down box. They put the wrong information in the right box or the right information in the wrong box.

Four simple steps can help cut down on those kinds of mistakes. For information being entered internally, training is essential, and people must be given enough time to take their time. Too often, the desire to increase productivity--i.e., to have more data entered more quickly--results in more mistakes.

For data entered both internally and externally, organizations have to pay attention to form design. There is both an art and a science to form design and a good design guides users as they enter data.

The final two steps are developing strong metadata and using real-time validation tools. With strong metadata definitions, organizations can constrain the information that can be entered into a field. And real-time validation can identify at least some errors before they get into the database.
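A minimal sketch of those last two steps, with invented field definitions: the metadata constrains what each field may hold, and a validation pass checks entries before they ever reach the database.

```python
# Sketch of the last two steps: metadata that constrains what a field may
# hold, checked as the form is submitted. Field definitions are invented
# for illustration.
import re

# "Strong metadata": each field carries an allowed value set or pattern.
FIELD_RULES = {
    "state": {"allowed": {"CA", "NY", "TX"}},          # pull-down values
    "zip": {"pattern": r"\d{5}(-\d{4})?"},
    "email": {"pattern": r"[^@\s]+@[^@\s]+\.[^@\s]+"},
}

def validate(form: dict) -> dict:
    """Return a map of field name -> error message for invalid entries."""
    errors = {}
    for field, rule in FIELD_RULES.items():
        value = form.get(field, "")
        if "allowed" in rule and value not in rule["allowed"]:
            errors[field] = f"{value!r} is not an allowed value"
        elif "pattern" in rule and not re.fullmatch(rule["pattern"], value):
            errors[field] = f"{value!r} does not match the expected format"
    return errors

# Real-time check before the record ever reaches the database.
print(validate({"state": "CA", "zip": "9268", "email": "user@example.com"}))
# -> {'zip': "'9268' does not match the expected format"}
```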

These commonsense procedures can be easily implemented by most organizations. Collectively, they represent a couple of ounces of prevention. And you know what that is worth.


RANCHO SANTA MARGARITA, Calif., January 14, 2014 - Melissa Data, a leading provider of global contact data quality and integration solutions, today announced its strategic alliance with Blu Sky to solve growing data management challenges in healthcare markets. Melissa Data offers a comprehensive platform for data integration and data quality, and Blu Sky provides data capture technologies optimized for EpicCare software deployments used to administer mid-size and large medical groups, hospitals, and integrated healthcare organizations. By partnering with Melissa Data and its extensive suite of established data quality solutions, healthcare providers have a comprehensive single source for superior data management and compliance.

"Integrated data quality is essential to advancing universal healthcare options, yet the complexities of healthcare data management are evident in today's headlines," said Gary Van Roekel, COO, Melissa Data. "As government initiatives catalyze change in the market, our alliance with Blu Sky provides a significant technical and competitive advantage for healthcare CTOs - offering a comprehensive, proven resource for data quality, integration, and capture. Improved and integrated patient data quality will not only help providers reduce the cost of care, but also facilitate better diagnosis and treatment options for patients."

With this alliance, Melissa Data provides global data quality solutions that verify, standardize, consolidate, and enhance U.S. and international contact data, in combination with a comprehensive Data Integration Suite in Contact Zone, enabling cleansed and enhanced patient data to be transformed and shared securely within a healthcare network. Blu Sky adds subject matter experts to the equation, with deep expertise in the EpicCare software used extensively in healthcare networks to facilitate a "one patient, one record" approach. Patient data capture, storage, and management are thus assured of compliance with a growing range of healthcare requirements, including CASS certification of address results and HIPAA privacy and security policies.

"Mobile healthcare, connected pharmacy applications, and electronic medical records represent tangible advancements in healthcare accessibility," said Rick O'Connor, President, Blu Sky. "The same advances increase complexity of data management in the context of HIPAA confidentiality and other industry standards. With a single source to address compliance network-wide, providers are poised for healthcare innovations based on secure, high quality patient information."

For more information about the healthcare data management alliance between Melissa Data and Blu Sky, contact Annie Shannahan at 360-527-9111, or call 1-800-MELISSA (635-4772).