Recently in the Data Integration Category
Technology Partnerships Advance Data Quality as Central Business Process; Enable Worldwide Expansion of Both Direct Mail and Data Quality Divisions
Rancho Santa Margarita, CALIF - December 16, 2014 - Melissa Data, a leading provider of contact data quality and address management solutions, today recognized 2014 as a milestone year for company partnerships in both its direct mail and data quality divisions. By working with leaders in ecommerce, data integration and comprehensive mailing solutions, Melissa Data has cemented strategic alliances that further expand its customer base worldwide. Essential to the company's growth strategy, Melissa Data's technology relationships are designed to fuel data quality as a central business process integrated into the earliest stages of master data operations.
"As the world recognizes the scope and value of data as a business asset, Melissa Data's global perspective on contact data quality has increasing impact. By providing sophisticated data quality that is simple to deploy, we're enabling businesses worldwide to embrace the competitive value of correct, validated customer information," said Ray Melissa, CEO, Melissa Data. "Collaborating with other technology leaders is essential in these efforts, and 2014 saw some of our most significant partnerships to date. Working closely with key partners, we're building existing business, growing in new markets, and seeing a greater understanding of data quality as the foundation for optimized master data systems."
Melissa Data's newest direct mail partnerships include Deutsche Post and Royal Mail, integrating global address verification and geocoding into their existing solutions and customer base. Melissa Data is one of only a handful of vendors licensed by Canada Post to provide Canadian NCOA (National Change of Address) processing so mailers can reach customers in both Canada and the U.S.
In the data quality realm, Melissa Data partnered with EXTOL International to enable seamless data quality operations in tandem with EDI. The company's alliance with Scribe Software uniquely enables cleansing, validating and enhancing customer information in a single integration as data moves between CRM, ERP and marketing automation applications. Melissa Data's relationship with Semarchy unites data quality and master data operations; Semarchy users have a fast and easy method for consolidating unstructured data sources and enriching master data, while Melissa Data customers have a strategic path to upgrade data quality projects into more comprehensive data governance initiatives. Melissa Data has increased its already deep support for developers by aligning with Varigence, incorporating data quality in Varigence's business intelligence accelerator tools based on Business Intelligence Markup Language (Biml).
Building data quality into ecommerce applications has also been a priority; by working with NetzTurm GmbH and Crafty Clicks, service providers for the implementation of online stores, Melissa Data has enabled plug-in data quality for users of Shopware and Magento-based ecommerce sites. Support for additional ecommerce platforms is imminent, enabling a greater number of online merchants to verify and correct addresses as they are entered, validating information from more than 240 countries and territories in real-time.
Earlier this year, Melissa Data also partnered with Blu Sky to solve healthcare data management challenges; data quality tools are now integrated into Blu Sky's EpicCare data capture technologies used to administer medical groups, hospitals and integrated healthcare organizations. Furthering its role in healthcare data quality initiatives, Melissa Data has also partnered with TriZetto, integrating data quality into software solutions that support nearly a quarter million doctors and healthcare providers under 350 different healthcare plans nationwide. Melissa Data's recent alliance with V3Gate dramatically expands the company's public sector market, while enhancing V3Gate's big data capabilities with integrated data quality tools and solutions. Together the two firms are actively targeting business growth in federal, state and local government sectors, offering cutting-edge IT solutions that support high-level public agencies in achieving their diverse missions.
Partnerships on the 2015 horizon focus on ecommerce as well as enterprise data quality. Melissa Data conducts business worldwide, supporting its global clientele with sales and service from key locations in the United States, United Kingdom, Germany and India. For information about partnering with Melissa Data, contact firstname.lastname@example.org or call 1-800-MELISSA (635-4772).
Melissa Data Enrichers Enable Clean, Global Contact Data for Semarchy Users; Webinar Demonstrates Best of Breed Strategy for Fast, Optimized MDM Operations
Rancho Santa Margarita, CALIF - November 12, 2014 - Melissa Data, a leading provider of contact data quality and data integration solutions, today formally announced its partnership with Semarchy, a developer of innovative Evolutionary Master Data Management (MDM) software and solutions. Together the two firms are facilitating sophisticated data quality as a key performance enabler of effective MDM operations. Through this partnership, Semarchy offers its users a fast and easy way to perform worldwide address enrichment, standardization, geocoding and verification using Melissa Data's proven data quality tools and services. In turn, Melissa Data enables its users to upgrade data quality projects, moving beyond tactical processes to engage in a more comprehensive strategy combining data quality, data governance and master data management. Data integrators can learn more in a joint webinar presented by the two firms; click here to access the presentation on demand on Melissa Data's website.
"Clean, enhanced contact data is essential to enterprise MDM - maximizing the value of applications, empowering sales and marketing, and assuring trusted information as the basis for analysis and reporting," said Bud Walker, Director of Data Quality Solutions, Melissa Data. "By integrating data quality operations within the Semarchy Convergence™ platform, we're supporting users in easily unlocking this kind of high level business value, increasing the quality of supplier, location and customer data within a single de-duplicated 360° view."
Semarchy Convergence™ for MDM is an integrated platform, available in the cloud or on-premise, enabling ground-up design and implementation of master data management initiatives; using a single tool, developers and data managers can create data models, manage quality, set up match/merge and transformation rules, and build data management applications and workflows. Semarchy Convergence™ now supports Melissa Data Enrichers, validated plug-ins that uniquely enable Semarchy users to verify, clean, correct, standardize, and enrich global customer records including full contact validation for global addresses, phones and emails.
Melissa Data Enrichers enhance the capture of new master records at the point of entry; deployed on-site or via the cloud, these tools eliminate the need for users to build integration between internal MDM systems and data quality processes. Customer records can be enhanced with missing contact data and master data can be appended to include attributes such as geographic coordinates. Users of the Semarchy Convergence™ platform can model any domain, managing data via collaboration workflows, generated user interfaces, or through simple batch processing to clean, standardize and enrich legacy data.
"Our perspective on master data management as an evolving business function recognizes the essential correlation between reliable data and authoritative analytics. By capitalizing on Melissa Data's proven technologies, Semarchy users can more effectively manage increasingly intelligent applications of global data - consolidating unstructured sources and enriching master data across all their business domains," said Matthew Dahlman, Technical Director Americas, Semarchy.
Click here to download a 30-day trial version of Semarchy Convergence™ for MDM, and here to begin your free trial of Melissa Data Enrichers for the Semarchy platform. For more information, contact email@example.com, firstname.lastname@example.org, or call 1-800-MELISSA (635-4772).
PentahoWorld Presentation Details Data Blending Practices to Ensure Meaningful Intelligence
Rancho Santa Margarita, CALIF - October 1, 2014 - Melissa Data, a leading provider of contact data quality and data enrichment solutions, today advocated data quality practices as critical to maintaining the validity and power of Big Data analytics. The firm will provide details in a PentahoWorld training presentation slated for October 10 at 8:00 a.m. as part of the conference's Big Data Today breakout session. Melissa Data's training session will examine the essential correlation between reliable data and authoritative analytics, including the Big Data imperative of standardizing and validating distinct customer records from aggregated, unstructured data.
Analytics is the leading use case for Big Data - it capitalizes on the ability to process huge datasets quickly and economically, yet Big Data simultaneously introduces data quality challenges. If garbage in the form of bad data is fed through an enterprise's Big Data machine, the resulting analytics and insight are flawed, outcomes are compromised and business value negated. Up to 60 percent of IT leaders report a lack of accountability for data quality, with more than 50 percent doubting the overall validity of the data itself.
"Enterprise operations rely on understanding data relationships uncovered by Big Data. Data quality processes must be applied to ensure reliability of the distinct customer records that drive these analytics - fueling improved business intelligence or fraud prevention, understanding customer sentiment or even seeking out medical cures," said Charles Gaddy, director of global sales and alliances, Melissa Data.
Because the unstructured data used for Big Data analytics comes from a variety of sources, the data quality processes supporting it are even more imperative than those required to handle small relational data sets. By matching duplicates from multiple data sources, data managers can create a golden record - a single version of the truth. This information can then be blended with multi-sourced reference data, for instance adding precise lat/long coordinates to a customer address, or demographic data that enriches and heightens insight. "The result is real business intelligence based on validated data, optimized for Big Data analytics and based on a 360° view of the customer," added Gaddy.
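The match-then-blend workflow described above can be sketched in a few lines. This is a minimal illustration only, not Melissa Data's actual matching logic; the field names, the normalization rule, and the geocode lookup are all hypothetical.

```python
# Illustrative sketch: consolidate duplicate records into a "golden record"
# and blend in reference data (here, lat/long by ZIP). All field names and
# matching rules are hypothetical stand-ins for real data quality tooling.

def normalize(record):
    """Normalize fields so trivially different duplicates compare equal."""
    return (record["name"].strip().lower(), record["zip"].strip())

def build_golden_records(records, geocode_lookup):
    """Group records by normalized key, merge each group, then enrich."""
    groups = {}
    for rec in records:
        groups.setdefault(normalize(rec), []).append(rec)

    golden = []
    for dupes in groups.values():
        merged = {}
        for rec in dupes:  # later non-empty values fill gaps left by earlier ones
            merged.update({k: v for k, v in rec.items() if v})
        # Blend in multi-sourced reference data, e.g. coordinates for the address
        merged["latlong"] = geocode_lookup.get(merged.get("zip"))
        golden.append(merged)
    return golden

records = [
    {"name": "Jane Doe",  "zip": "92688", "email": ""},
    {"name": "jane doe ", "zip": "92688", "email": "jane@example.com"},
]
geo = {"92688": (33.64, -117.59)}
print(build_golden_records(records, geo))
# One merged record, with the email filled in and lat/long appended
```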
Melissa Data's training session is titled "Using Reference Data Sources and Data Quality Practices with Big Data." It will briefly explore data quality processes such as entity extraction used to identify customer and other structured data points from unstructured data, and Big Data blending with authoritative customer information.
In the past few entries in this series, we have been looking at an approach to understanding customer behavior at particular contextual interactions, informed by information pulled from customer profiles.
But if the focal point is the knowledge from the profile that influences behavior, you must be able to recognize the individual, rapidly access that individual's profile, and then feed the data from the profile into the right analytical models that can help increase value.
The biggest issue is the natural variance in customer data collected at different touch points, in different processes, for different business functions. A search for the exact representation provided may not always result in a match and, at worst, may lead to the creation of a new record for the same individual, even when one or potentially more records for that individual already exist.
In the best scenario, the ability to rapidly access the customer's profile is enabled through the combination of smart matching routines that are tolerant to some variance along with the creation of a master index.
That master index contains the right amount of identifying information about customers to be able to link two similar records together when they can be determined to represent the same individual while differentiating records that do not represent the same individual.
Once the right record is found in the index, a pointer can be followed to the data warehouse that contains the customer profile information.
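The master index lookup described above can be sketched as follows. The identifying fields, similarity measure, and match threshold here are illustrative assumptions, not a prescription for a production identity resolution system.

```python
# Minimal sketch of a master index with variance-tolerant matching.
# An exact match on a cheap field (ZIP) narrows candidates; a fuzzy
# comparison on the name tolerates spelling variance. Threshold is a guess.
from difflib import SequenceMatcher

class MasterIndex:
    def __init__(self):
        self.entries = []  # (identifying info, pointer into the profile store)

    def add(self, name, zip_code, profile_id):
        self.entries.append(((name.lower(), zip_code), profile_id))

    def resolve(self, name, zip_code, threshold=0.85):
        """Return the profile pointer of the best match above the threshold."""
        best_id, best_score = None, 0.0
        for (idx_name, idx_zip), profile_id in self.entries:
            if idx_zip != zip_code:  # cheap exact field first
                continue
            score = SequenceMatcher(None, idx_name, name.lower()).ratio()
            if score > best_score:
                best_id, best_score = profile_id, score
        return best_id if best_score >= threshold else None

index = MasterIndex()
index.add("Katherine Smith", "10001", profile_id=42)
print(index.resolve("Kathrine Smith", "10001"))  # minor spelling variance → 42
```

Once `resolve` returns a pointer, the application follows it into the data warehouse holding the full customer profile.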
This approach is often called master data management (MDM), and the technology behind it is called identity resolution. Despite the relative newness of MDM, much of the capability has been available for many years in data quality and data cleansing tools, particularly those suited to customer data integration for direct marketing, mergers, acquisitions, data warehousing, and other cross-enterprise consolidation.
In other words, customer profiles and integrated analytics build on a level of master data competency that is likely to already be established within the organization.
Named to DBTA 100: Companies that Matter Most in Data, as well as SD Times 100 for Innovation and Market Presence
Rancho Santa Margarita, CALIF, June 18, 2014 - Melissa Data, a leading provider of contact data quality and integration solutions, today announced it was named for the second consecutive time to the DBTA 100, Database Trends and Applications magazine's list of the companies that matter most in data. SD Times has also recognized Melissa Data with its third consecutive appearance in the SD Times 100, acknowledging firms demonstrating innovation and leadership.
"Melissa Data's comprehensive data quality solutions, both onsite and cloud-based, recognize data quality as a global challenge," said Gary Van Roekel, COO, Melissa Data. "Whether you're a start-up or major international brand, trusted data is a powerful business tool that can accelerate company performance and provide the ideal foundation for global growth. These industry acknowledgements validate the reasoning that for business intelligence and governance initiatives to succeed, there needs to be the proper foundation of data quality, enrichment and accuracy."
Organizations are selected for the "DBTA 100" by DBTA's editorial staff as the top companies in data and enterprise information management, based on their presence, execution, vision, and innovation in delivering products and services to the marketplace. All 100 companies are highlighted in the special June edition of Database Trends and Applications magazine and on the DBTA.com website.
The "SD Times 100" spotlights industry leadership and influence in a range of essential enterprise business categories, and is based on reputation, product leadership and overall innovation. By examining newsmakers of the previous year, SD Times editorial staff reviews company achievements to determine the stand-outs. Each organization's offerings and reputation are considered, as well as industry "buzz" which demonstrates market leadership. Melissa Data has again been recognized in the Database & Database Management category, acknowledging top firms supporting developers; Melissa Data's development tools help create seamless data quality integrations, enabling custom applications to capitalize on clean, enhanced customer data.
For more on Melissa Data and its global contact data quality solutions, visit www.MelissaData.com or call 1-800-MELISSA (800-635-4772).
Business Users Can Seamlessly Cleanse and Enrich Customer Data During Transformation; Reduces Resources Needed for Enterprise Data Quality
Rancho Santa Margarita, CALIF - June 5, 2014 - Melissa Data, a leading provider of contact data quality and data integration solutions, today announced its strategic relationship with EXTOL International Inc., a provider of end-to-end business integration software and services, to enable seamless data quality operations in tandem with electronic data interchange (EDI) and other integrated business processes. This partnership gives EXTOL users the option to access Melissa Data's sophisticated, native data quality functions, adding value by eliminating the need for separate data quality resources and assuring data quality immediately as customer information enters the enterprise system. With Melissa Data's suite of data quality tools, EXTOL users can cleanse, validate and enhance customer information as data is being transformed. This allows more intelligent use of customer contact data for significantly better segmentation, targeting, fraud detection and identity verification. The relationship between EXTOL and Melissa Data expands Melissa Data's market through direct access to EDI, and other data syntax communities, while also enabling business users with ideal, streamlined access to powerful data quality services and operations.
"Anytime you're migrating data from one source to another, cleansing and enriching the information allows better results as the data moves forward into the enterprise," says Gary Van Roekel, COO, Melissa Data. "EXTOL and Melissa Data are solving this data quality challenge for the business user, by offering integrated processes that simplify data quality operations from step one."
Connection to Melissa Data's cloud-based data quality tools and services is available through referral from EXTOL. During data transfers, users can simultaneously validate, cleanse and augment multi-national customer contact information across countries, languages and character sets - including verification of emails and international phone numbers, and geocoding street addresses by adding precise latitude and longitude coordinates.
"By offering EXTOL customers a data quality solution through our strategic partnership with Melissa Data, we are providing a tangible advantage for database administrators to simplify data quality operations for the range of CRM, marketing automation and ERP applications," says Mark Denchy, Director of Worldwide Partner Program at EXTOL International. "Achieving intelligent customer data is faster, painless and more cost-effective, ultimately raising data quality as a true business priority."
Melissa Data and EXTOL have joined together to launch an Educast on how to improve the quality of customer data. The webinar will be held on June 24th at 1 pm EST. To register for free, go to: http://www.extol.com/educast.
Powerful Data Quality Tool Consolidates Duplicates into Single Golden Record; Objective Data Quality Score Uniquely Determines Most Accurate Customer Information
Rancho Santa Margarita, CALIF - May 8, 2014 - Melissa Data, a leading provider of contact data quality and integration solutions, today announced its TechEd 2014 exhibit will feature new matching and de-duplication functionality in the company's MatchUp Component for SQL Server Integration Services (SSIS). Based on proprietary logic from Melissa Data, MatchUp consolidates duplicate customer records objectively, unlike any other data quality solution. Uniquely assessing the quality of individual data fields, MatchUp determines the best pieces of data to retain versus what to discard - enabling a smart, consistent method for data integrators to determine the best customer contact information in every field.
"A single, accurate view of the customer, known as the golden record, is the ideal for any business relying on customer data - reducing waste, optimizing marketing outreach and improving customer service. Yet common methods for matching and eliminating duplicate customer records involve subjective rules that don't consider the accuracy of the data itself," said Bud Walker, director of data quality solutions at Melissa Data. "MatchUp's intelligent rules offer a smarter, more consistent method for determining what information survives in the database and why. It's a critical data quality function that dramatically improves business operations."
MatchUp assesses the content within the customer record, in contrast to matching and de-duplication methods that rely solely on subjective principles, such as whether the record is the most recent, most complete or most frequent. Instead, selection criteria for determining a golden record is based on a relevant data quality score, derived from the validity of customer information such as addresses, phone numbers, emails and names.
Once the golden record is identified intelligently, MatchUp further references the data quality score during survivorship processes to support creation of an even better golden record; duplicate entries are then collapsed into a single customer record while retaining any additional information that may also be accurate and applicable. MatchUp relies on deep domain knowledge of names and addresses for survivorship operations, used to granularly identify matches between names and nicknames, street/alias addresses, companies, cities, states, postal codes, phones, emails, and other contact data components.
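A rough sketch of score-based survivorship follows, under the assumption that each field's value can be assigned a validity score. The scoring rules below are toy stand-ins for real reference-data verification, and this is an illustration of the general technique, not MatchUp's proprietary logic.

```python
# Illustrative sketch of score-based survivorship: for each field, the value
# with the highest validity score survives into the golden record. The toy
# scores here stand in for real address/phone/email verification.
import re

def field_score(field, value):
    """Toy validity score per field; real tools verify against reference data."""
    if not value:
        return 0.0
    if field == "email":
        return 1.0 if re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", value) else 0.2
    if field == "phone":
        digits = re.sub(r"\D", "", value)
        return 1.0 if len(digits) == 10 else 0.3
    return 0.5  # neutral score for fields without a validity rule here

def survive(duplicates):
    """Collapse duplicates, keeping the best-scoring value for every field."""
    fields = {f for rec in duplicates for f in rec}
    return {
        f: max((rec.get(f, "") for rec in duplicates),
               key=lambda v: field_score(f, v))
        for f in fields
    }

dupes = [
    {"name": "J. Smith",   "email": "jsmith@example",     "phone": "555-0100"},
    {"name": "John Smith", "email": "jsmith@example.com", "phone": "(949) 555-0100"},
]
print(survive(dupes))  # the valid email and ten-digit phone survive
```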
MatchUp is part of Melissa Data's Data Quality Components for SQL Server Integration Services (SSIS), a suite of custom data cleansing transformation components for Microsoft SSIS, used to standardize, verify, correct, consolidate and update contact data for the most effective business communications. The suite further includes selected Community Editions for simple data quality procedures, free to developers and downloadable with no license required. Community Editions include Contact Verify CE for address, phone and name parsing, and email syntax correction; MatchUp CE provides simple matching and de-duplication without advanced survivorship operations, for up to 50,000 records using nine basic matchcodes.
Melissa Data will be demonstrating its MatchUp Component for SSIS at booth #1934 during Microsoft TechEd, May 12-15, 2014 at the George R. Brown Convention Center in Houston, TX. To download a free trial of Melissa Data's MatchUp Component for SSIS, click here; to request access to Melissa Data's free Community Editions, click here or call 1-800-MELISSA (635-4772).
Rancho Santa Margarita, CALIF - January 14, 2014 - Melissa Data, a leading provider of global contact data quality and integration solutions, today announced its strategic alliance with Blu Sky to solve growing data management challenges in healthcare markets. Melissa Data offers a comprehensive platform for data integration and data quality, and Blu Sky provides data capture technologies optimized for EpicCare software deployments used to administer mid-size and large medical groups, hospitals, and integrated healthcare organizations. By partnering with Melissa Data and its extensive suite of established data quality solutions, healthcare providers have a comprehensive single source for superior data management and compliance.
"Integrated data quality is essential to advancing universal healthcare options, yet the complexities of healthcare data management are evident in today's headlines," said Gary Van Roekel, COO, Melissa Data. "As government initiatives catalyze change in the market, our alliance with Blu Sky provides a significant technical and competitive advantage for healthcare CTOs - offering a comprehensive, proven resource for data quality, integration, and capture. Improved and integrated patient data quality will not only help providers reduce the cost of care, but also facilitate better diagnosis and treatment options for patients."
With this alliance, Melissa Data provides global data quality solutions that verify, standardize, consolidate, and enhance U.S. and international contact data, in combination with a comprehensive Data Integration Suite in Contact Zone, enabling cleansed and enhanced patient data to be transformed and shared securely within a healthcare network. Blu Sky adds subject matter experts to the equation - with deep expertise in the EpicCare software used extensively in healthcare networks to facilitate a "one patient, one record" approach; patient data capture, storage, and management is assured of compliance with a growing range of healthcare regulations, including CASS certification of address results, and HIPAA privacy and security policies.
"Mobile healthcare, connected pharmacy applications, and electronic medical records represent tangible advancements in healthcare accessibility," said Rick O'Connor, President, Blu Sky. "The same advances increase complexity of data management in the context of HIPAA confidentiality and other industry standards. With a single source to address compliance network-wide, providers are poised for healthcare innovations based on secure, high quality patient information."
For more information about the healthcare data management alliance between Melissa Data and Blu Sky, contact Annie Shannahan at 360-527-9111, or call 1-800-MELISSA (635-4772).
While I have discussed the methods used for parsing, standardization, and matching in past blog series, one thing I alluded to a few notes back was the need for increased performance of these methods as data volumes grow.
Let's think about this for a second. Assume we have 1,000 records, each with a set of data attributes selected to be compared for similarity and matching. In the worst case, if we were looking to determine duplicates in that data set, we would need to compare each record against the remaining records. That means doing 999 comparisons 1,000 times, for a total of 999,000 comparisons.
Now assume that we have 1,000,000 records. Again, in the worst case we compare each record against all the others, and that means 999,999 comparisons performed 1,000,000 times, for a total of 999,999,000,000 potential comparisons. So if we scale up the number of records by a factor of 1,000, the number of total comparisons increases by a factor of roughly 1,000,000!
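The arithmetic above can be checked directly:

```python
# Brute-force duplicate detection compares every record against every other,
# so the comparison count is n * (n - 1) and grows quadratically with n.
def naive_comparisons(n):
    """n records, each compared against the other n - 1."""
    return n * (n - 1)

print(naive_comparisons(1_000))      # 999,000
print(naive_comparisons(1_000_000))  # 999,999,000,000
# Scaling records by 1,000 scales comparisons by ~1,000,000:
print(naive_comparisons(1_000_000) // naive_comparisons(1_000))  # 1,001,000
```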
Of course, our algorithms are going to be smart enough to figure out ways to reduce the computational complexity, but you get the idea - the number of comparisons grows quadratically. And even with algorithmic optimizations, the need for computational performance remains, especially when you realize that 1,000,000 records is no longer considered a large number of records - more often we look at data sets with tens or hundreds of millions of records, if not billions.
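One standard optimization of the kind hinted at here is "blocking": partitioning records by a cheap key so that the expensive pairwise comparisons run only within each partition. A minimal sketch, with a hypothetical ZIP-code blocking key:

```python
# Blocking sketch: instead of comparing all pairs, group records by a cheap
# key (here ZIP code) and generate candidate pairs only within each group.
from collections import defaultdict
from itertools import combinations

def blocked_pairs(records, block_key):
    """Yield candidate pairs within each block instead of across all records."""
    blocks = defaultdict(list)
    for rec in records:
        blocks[rec[block_key]].append(rec)
    for block in blocks.values():
        yield from combinations(block, 2)

records = [
    {"name": "A", "zip": "92688"},
    {"name": "B", "zip": "92688"},
    {"name": "C", "zip": "10001"},
]
# All-pairs comparison would generate 3 candidate pairs;
# blocking on ZIP leaves only 1.
print(len(list(blocked_pairs(records, "zip"))))  # → 1
```

The trade-off is recall: records whose blocking key itself is dirty can end up in different blocks and never be compared, which is one reason the key fields themselves need cleansing first.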
In the best scenario, performance scales with the size of the input. New technologies enable the use of high performance platforms, through hardware appliances, software that exploits massive parallelism and data distribution, and innovative methods for data layouts and exchanges.
In my early projects on large-scale entity recognition and master data management, we designed algorithms that would operate in parallel on a network of workstations. Today, these methods have been absorbed into the operational fabric, in which software layers adapt in an elastic manner to existing computing resources.
Either way, the demand is real, and the need for performance will only grow more acute as more data with greater variety and diversity is subjected to analysis. You can't always just throw more hardware at a problem - you need to understand its complexity and adapt the solutions accordingly. In future blog series, we will look at some of these issues and ways that new tools can be adopted to address the growing performance need.