
Abstracts and Bios

Monday 4 November 2013 — Pre-Conference Tutorials

Morning tutorials 8:30–11:45 am

IQCP Exam Preparation, the Information and Data Quality Core, and Quality Foundations

John R. Talburt, PhD, IQCP, Chief Scientist, Black Oak Partners, LLC, and Coordinator, UALR Information Quality Graduate Program

Abstract

This tutorial covers three topics related to the IQCP credential offered by the IAIDQ.

  1. An overview of the IQCP exam: how it was designed and constructed, what it covers, the preparation resources available, and suggestions for test taking
  2. A refresher on the topics in the Information and Data Quality Core section of the IQCP exam, including the cost, value, and impact of data quality; data governance and stewardship; data quality measurements, requirements, and tools; and the information life cycle
  3. A refresher on the topics in the Quality Foundations section of the IQCP exam, including process improvement, contributions of the quality pioneers, statistical techniques, and team processes

Speaker bio


Dr. John R. Talburt IQCP is Professor of Information Science at the University of Arkansas at Little Rock (UALR) where he is the Coordinator for the Information Quality Graduate Program and the Executive Director of the UALR Center for Advanced Research in Entity Resolution and Information Quality (ERIQ). He is also the Chief Scientist for Black Oak Partners, LLC, an information quality solutions company based in Little Rock, Arkansas.

Prior to his appointment at UALR he led research and development and product innovation at Acxiom Corporation, a global leader in information management and customer data integration. Professor Talburt holds several patents related to customer data integration, has written numerous articles on information quality and entity resolution, and is the author of Entity Resolution and Information Quality (Morgan Kaufmann, 2011). He also holds the IAIDQ Information Quality Certified Professional (IQCP) credential.


Data Profiling: Best Practices by Example

Gian Di Loreto, Owner/Manager, Loreto Services

Abstract

This tutorial is geared towards data quality and data governance professionals who wish to brush up on their data profiling skills.

All too often, data profiling is misused by data quality professionals. Driven by the abundance of excellent data profiling tools, practitioners can generate huge amounts of metadata and reports with a few clicks.

We intend to show how to properly integrate data profiling into a larger data quality context. Further, we show common pitfalls to avoid and, more generally, best practices for profiling data.

Finally, we will demonstrate some of the techniques we discuss using one or more commercially available tools from vendors such as Trillium, DataFlux, and Talend.
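
To make the flavor of these reports concrete, here is a minimal, vendor-neutral profiling sketch in Python. It is not drawn from the tutorial or from any of the tools named above, and the input file name and null conventions are illustrative assumptions. For each column it reports a null rate, a distinct-value count, and the most common value "shapes":

    import csv
    import re
    from collections import Counter, defaultdict

    def profile(path):
        """Report per-column null rate, distinct count, and top value patterns."""
        nulls, rows = Counter(), 0
        distinct = defaultdict(set)
        patterns = defaultdict(Counter)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                rows += 1
                for col, value in row.items():
                    if value is None or value.strip() == "":
                        nulls[col] += 1   # assumed null convention: empty string
                        continue
                    distinct[col].add(value)
                    # Generalize the value to a shape: digits -> 9, letters -> A.
                    shape = re.sub(r"[A-Za-z]", "A", re.sub(r"\d", "9", value))
                    patterns[col][shape] += 1
        for col in sorted(set(nulls) | set(distinct)):
            print(col,
                  "null%%=%.1f" % (100.0 * nulls[col] / max(rows, 1)),
                  "distinct=%d" % len(distinct[col]),
                  "top shapes:", patterns[col].most_common(3))

    profile("customers.csv")  # hypothetical input file

A profile like this makes anomalies, such as unexpected nulls or a stray 999-99-9999 shape in a name field, visible before any cleansing rules are written.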

Speaker bio


Gian Di Loreto is one of the USA's leading authorities on human resource data quality.

Gian holds a Ph.D. in particle physics from Michigan State University. He began his career as an experimental physicist at Chicago's world-renowned Fermi National Accelerator Laboratory (Fermilab), where he spent several years performing statistical analyses and identifying errors in massive databases generated by proton/anti-proton collisions. After his tenure at Fermilab, he leveraged the analytical skills he developed in a scientific context as a software developer for a firm that reconciled and corrected data generated by General Motors' pension funds.

He has since created Loreto Services & Technologies, a consulting and IT outsourcing firm whose mission is to help companies understand what is in their databases, identify errors and discrepancies, integrate disparate databases and, in the process, eliminate historical and ongoing mission critical data errors using proven, scientific and statistical techniques.


Top 5 Artifacts Every Data Governance Program Must Have

Kelle O'Neal, Managing Partner, First San Francisco Partners

Abstract

You have garnered support from the key stakeholders, allocated resources and launched your Data Governance Program. Now what? Are you struggling with how to get started crafting the important deliverables in an efficient way? In this workshop, we will review five key artifacts necessary to progress a Data Governance Program. We will talk about why they are important, how to create them and what content is needed. Attendees will also receive templates they can use in their own organizations. The key artifacts we will discuss are:

  • Charter, including the Vision, Mission and Objectives
  • Enterprise Principles that reflect the Charter
  • Important Policies that codify the Enterprise Principles
  • Metrics to track both progress and impact to the organization
  • Communication Plan to update the organization of progress and changes

Attend this tutorial to learn how to build a new Data Governance Program on a strong foundation, or to discover ways to re-energize an existing effort.

Speaker bio

Having worked with the software and systems providers key to the formulation of Master Data Management (MDM), Kelle O'Neal has played important roles in many of the groundbreaking initiatives that confirm the value of MDM to the enterprise. Recognizing an unmet need for clear guidance and advice on the intricacies of implementing MDM solutions, she founded First San Francisco Partners in early 2007. Under her leadership, First San Francisco Partners immediately established a reputation as the first-call resource for companies looking to tap the value of customer data integration and MDM. Kelle developed her ability to work through organizational complexity, build consensus and drive results in senior roles at companies such as Siperian, GoldenGate Software, Oracle and Siebel Systems. She has worked at the executive level in the U.S., Europe and Asia. Kelle's strong background in customer relationship management, enterprise software, and systems integration enables her to provide expert counsel to any organization seeking to execute an MDM project. Kelle most recently served as General Manager, EMEA for Siperian. She earned her B.A. from Duke University and also holds an M.B.A. from the University of Chicago Booth School of Business.

She can be reached at kelle [at] firstsanfranciscopartners [dot] com and on Twitter at @1stSanFrancisco.

 

Monday Afternoon Tutorials 12:45–4:00 pm

Data Quality and Governance in Projects

Danette McGilvray, IQCP, President and Principal, Granite Falls Consulting, Inc.

Abstract

Organizations make substantial investments in projects. Of course, the more effective the projects, the sooner they realize business results. Historically, many projects have concentrated their efforts on people, processes, and technology; many, however, fail to fully address the data and information aspects of their efforts.

Information quality and governance can make or break projects where data is migrated and integrated. Including them can help a project stay on time and within budget; neglecting them can break the project by inflating costs and timelines through poor-quality data and poorly informed decisions. The team should also look ahead during the project and lay a foundation so that data quality and governance continue to be managed once in production.

Topics include:

  • Information quality and data governance activities throughout the project life cycle, including the solution development life cycle (SDLC)
  • Continuing data governance and quality post project implementation
  • Real-life best practices and lessons

Come prepared to share your project challenges, interact with your fellow attendees, and learn proven methods for optimizing your project results through data quality and governance.

Speaker bio


Danette McGilvray is president and principal of Granite Falls Consulting, Inc., a firm that helps organizations increase their success by addressing the information quality and data governance aspects of their business efforts. She also emphasizes communication and the human aspect of information quality and governance.

Danette is the author of Executing Data Quality Projects: Ten Steps to Quality Data and Trusted Information™ (Morgan Kaufmann, 2008). An internationally respected expert, her Ten Steps™ approach to information quality has been embraced as a proven method for creating and improving data quality in the enterprise. The Chinese-language edition is available and her book is used as a textbook in university graduate programs. Danette was the 2013 recipient of IAIDQ’s Distinguished Member Award in recognition of her outstanding contributions to the field of information and data quality.

She can be reached via email at danette [at] gfalls [dot] com.


Introduction to Entity Resolution and Identity Life Cycle Management in Support of MDM

John R. Talburt, PhD, IQCP, Chief Scientist, Black Oak Partners, LLC, and Coordinator, UALR Information Quality Graduate Program

Abstract

The inability to properly integrate the same information coming from multiple sources is one of the leading causes of poor data quality in an organization. Whether it is the failure to recognize the same customer making transactions through different sales channels or to aggregate sales of the same product, the negative impact on business can be significant. This tutorial provides an introduction to current practices for data matching, record linking, and identity management that are foundational to building an effective strategy to improve data integration and managing master data.

Major topics include:

  • Four major types of entity resolution architectures
  • Creating and analyzing matching rules for record linking
  • Strengths and weaknesses of commonly used approximate match algorithms (see the sketch after this list)
  • How to maintain persistent master data identifiers through entity identity information life cycle management
  • Methods for assessing entity identity integrity
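
As a taste of the second and third topics, here is a minimal sketch of one widely used approximate match measure, normalized Levenshtein edit distance, wired into a toy two-field matching rule. The field names and threshold are illustrative assumptions, not material from the tutorial:

    def levenshtein(a, b):
        """Classic dynamic-programming edit distance between two strings."""
        if len(a) < len(b):
            a, b = b, a
        previous = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            current = [i]
            for j, cb in enumerate(b, 1):
                current.append(min(previous[j] + 1,                 # deletion
                                   current[j - 1] + 1,              # insertion
                                   previous[j - 1] + (ca != cb)))   # substitution
            previous = current
        return previous[-1]

    def similarity(a, b):
        """Normalized similarity in [0, 1]; 1.0 means identical."""
        if not a and not b:
            return 1.0
        return 1.0 - levenshtein(a, b) / max(len(a), len(b))

    # A toy matching rule: link two records when both name fields are close enough.
    def is_match(rec1, rec2, threshold=0.85):
        return (similarity(rec1["last"], rec2["last"]) >= threshold and
                similarity(rec1["first"], rec2["first"]) >= threshold)

    print(is_match({"first": "Katherine", "last": "Smith"},
                   {"first": "Katharine", "last": "Smith"}))  # -> True

In practice, matching rules combine several such comparators across many fields, which is why analyzing their strengths and weaknesses matters.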

Speaker bio


Dr. John R. Talburt IQCP is Professor of Information Science at the University of Arkansas at Little Rock (UALR) where he is the Coordinator for the Information Quality Graduate Program and the Executive Director of the UALR Center for Advanced Research in Entity Resolution and Information Quality (ERIQ). He is also the Chief Scientist for Black Oak Partners, LLC, an information quality solutions company based in Little Rock, Arkansas.

Prior to his appointment at UALR he led research and development and product innovation at Acxiom Corporation, a global leader in information management and customer data integration. Professor Talburt holds several patents related to customer data integration, has written numerous articles on information quality and entity resolution, and is the author of Entity Resolution and Information Quality (Morgan Kaufmann, 2011). He also holds the IAIDQ Information Quality Certified Professional (IQCP) credential.


Become a Power Data Steward: Trim the Fat and Balance Tasks to Operate with Greater Effectiveness and Efficiency

Tina McCoppin, Partner, Ajilitee

Abstract

Data Stewards often juggle too much and fall prey to scattered responsibilities without generating lasting, impactful results.  To strike the right balance of effort for sustained results, we recommend a focus on standards and conformity, metadata, enterprise glossaries and data dictionaries while strengthening areas such as data quality (DQ), enterprise data integration (EDI), and master data management (MDM).

This ‘pump-it-up session’ will pinpoint activities that really count for Data Stewards and offer a plan designed to “trim the fat” (i.e., time wasters) and focus on optimal efficiency and results. We will also cover the tools and frameworks that can help target specific “muscle groups” and (data) movement to support your Power Data Steward training.

This “how to” tutorial has been developed to help structure, implement and maintain a Data Governance program, with particular focus on the Data Steward role. The session will detail:

  • Typical Responsibilities of a Data Steward 
  • Data Steward Health Plan: defining a program and schedule to power through your job
  • Defining and establishing a Data Policy
  • Benchmarking and measuring Data Governance’s impact and results
  • Communication skills and techniques
  • Practical tips for time management
  • Must have tools for transparency and visibility
  • Tailoring the Steward role within your organization

Speaker bio

Before joining LaunchPoint Consulting, Tina served as Engagement & Project Manager for Fortune 1000 companies throughout her tenure at leading IT services companies including HP, Knightsbridge, Forte, Seer, Pansophic and Accenture.

With 25+ years of information technology integration experience, Tina has managed multiple delivery teams with 100+ members, coordinating and managing the efforts of client staff and consultants in locations both onshore and offshore. Tina’s business intelligence and data integration projects include: global “single customer view;” customer call center tracking; householding; campaign tracking and feedback; insurance cross-sell and up-sell data warehouse; data governance and stewardship framework.

As Business Analyst Competency Lead at HP, Tina developed IP material and defined and coached the Business Analyst career track for HP’s BAs.  Tina graduated from Marshall University with a Bachelor in Business Administration (BBA), Georgia Institute of Technology with a Master of Science in Information Technology (MSIM), and Notre Dame with a Master of Arts (MA).

Monday Afternoon Conference Sessions 4:15-5:15 pm

Making the Most of Your Data Quality and Data Governance Dollar$

Karen Way, Principal/Owner, Three Elm Technology Strategies

Abstract

Having difficulty getting budget dollars allocated for your Data Quality and Data Governance efforts? Even so, there’s still the expectation that the work will get done, right? Are you being asked to do more with less? Though there are signs that the economy is starting to recover, many organizations continue to find it difficult to increase, or even sustain, DQ and DG budget allocations. While the ultimate goal may be to purchase a tool or suite of tools, or to hire resources to meet your organization’s DQ and DG program needs, it may not be possible to do so in the near term. This session will provide you with some tips on how to answer these questions and, yes, do more with less.

Topics covered during this session will include:

  • Using standard tools you already have
  • Spending your budget dollars wisely
  • Creating symbiotic relationships
  • Using existing resources to your best advantage

You’ll come away with some ideas of how you can make each budget dollar stretch a little bit further within your organization.

Speaker bio


Karen Way is the Principal and Owner of Three Elm Technology Strategies, a company that specializes in Data Quality, Data Governance and MDM solutions for the healthcare sector. She has extensive experience in healthcare information technology, data quality, data governance, data management, data analytics and master data management from both business and technical implementation perspectives. Her expertise includes business and systems requirements, technical design and development, use case analysis, and business process re-engineering. With an MS in Healthcare Administration, Karen has a proven ability to build and lead effective, successful teams that deliver best in class solutions to improve client satisfaction, organizational cost savings, and workplace productivity.


How to Integrate Total Information Risk Management Techniques into Your Data Quality Job

Alexander Borek, Senior Strategy Consultant, IBM

Abstract

Many talk about the unlimited opportunities of data. But by acknowledging that data and information are key assets, one must also acknowledge that poor data and information quality create risks. Managing risk is a new angle for selling data quality activities to upper managers who do not want to take the time to learn about the value of data quality. Based on the new book "Total Information Risk Management" and on case studies conducted in a number of industries, this presentation will show you how to integrate risk management techniques into your daily job activities as a data and information quality manager. In particular, this talk will show you how to:

  • Complete a business case for data quality
  • Develop a clear link between data quality and your organization's business objectives
  • Identify data quality "pain points" that have the highest impact in your organization
  • Get stronger "buy-in" from senior executives for your data quality initiatives

Speaker bio


Alexander Borek is a thought leader, innovator and educator on how to apply risk management principles and methods to measure and manage the impact of poor data quality and bad information insights. In his current role as Senior Strategy Consultant at IBM Corporate Headquarters, Alex uses data analytics to drive IBM’s worldwide corporate strategy. Alex is a frequent speaker at international data management conferences. He is the author of Total Information Risk Management: Maximizing the Value of Data and Information Assets (Morgan Kaufmann) and of numerous peer-reviewed research articles (e.g., Information & Management). Dataqualitypro.com described him in its February 2012 newsletter as "a rising star in the data quality firmament", and Tom Redman recognized the "significant breakthroughs" his work has brought to the discipline. Alex holds a Ph.D. in Engineering from the University of Cambridge and a first-class M.Sc. in Information Management from the Karlsruhe Institute of Technology.


Monday Afternoon IDQ Mixer 5:30-7:30 pm

Tour of the UALR Emerging Analytics Center

Abstract

UALR’s new George W. Donaghey Emerging Analytics Center™ (EAC) features “first of its kind in the world” data visualization equipment and a unique, campus-wide cross discipline approach to cutting-edge data analytics and data visualization.  Key components of the EAC are its “first ever” EmergiFLEX™ and MobileFLEX™ data visualization equipment specially developed for the EAC, in partnership with the Mechdyne Corporation.  Additional partnerships with HP and Today’s Office have provided other unique assets to the EAC.

Advanced Data Analysis and Data Visualization are the tools needed now. The EAC’s over-arching mission of providing VISIONARY DATA SOLUTIONS FOR ARKANSAS includes its role to:

  • Foster cutting-edge research in data-intensive and experience-based areas across the UALR campus, with an emphasis on three colleges: Donaghey College of Engineering and Information Technology (EIT), College of Business (COB), and College of Science (COS)
  • Create a competitive, nationally recognized new center for research and co-development focused on using advanced data analytics and data visualization to accelerate Arkansas’ economic future, providing remarkable value-added services and broad community engagement to enhance economic development and UALR’s leadership role
  • Provide services for activities in the medical community, particularly UAMS
  • Build on the proven expertise and resources of the Donaghey College of Engineering and Information Technology, which include Computational Research (HPC) and Virtual Reality (VRC) Centers, with a bold, campus-wide, community-wide, and state-wide approach

 

Tuesday 5 November 2013 — Conference Sessions

Session 8:45-9:45 am KEYNOTE

Big Marketing Data: Single View of the Customer Across Channels, Devices, and Applications

Dr. Phil Mui, Chief Product & Engineering Officer, EVP, Acxiom

Abstract

Speaker bio


With more than 15 years of experience in game-changing product creation, execution and innovation, Dr. Phil Mui is well known for his contributions in the world of marketing analytics and product marketing and engineering. He has worked for some of the world’s most innovative and well-known technology companies, and brings that knowledge to Acxiom.

Phil is a visionary yet proven innovator who is uniquely qualified to help Acxiom jump the curve, and the competition. He works closely with Acxiom’s senior leaders and renowned client roster to develop and implement technologies that will change the business of marketing as we know it.

Prior to joining Acxiom, Phil led the strategy formulation and execution of Google Analytics in his role as group product manager at Google. He also led the development of an annotation infrastructure that underlies Google+, Google Maps Reviews and Ratings and Google Bookmarks, among other Google products.

Phil has also served in leadership roles at the Stanford Functional Genomics Facility, Oracle Corporation, Microsoft Corporation, Lycos and a London-based display advertising startup.

Phil has a Ph.D., M.Eng., and S.B. (EECS) from MIT, where he was a Whitaker Fellow, Harvard/MIT Health Sciences and Technology Fellow, and National Institutes of Health Fellow. His dissertation in MIT’s Laboratory for Computer Science was on multi-agent modeling of social networks. He also has an M.Phil. (Management) from Oxford University, where he was a Marshall Scholar. Phil is a member of the following honor societies: Eta Kappa Nu, Tau Beta Pi, Sigma Xi.


 

Session 10:00-11:00 am

Establishing a Simple Information Quality Metric: A Useful Technique and Method to Baseline, Monitor, and Raise Information Quality in Your Organization

Rodney Schackmann, Sr. Staff Information Architect, Intel Corporation

Abstract

Information Quality is core to effective business analysis and decision making. If you can’t consistently measure Information Quality, how can you have appropriate and, hopefully, proactive discussions about information project expectations and outcomes? We all know the “Garbage-In, Garbage-Out” principle. Isn’t understanding and measuring quality key to that determination?

In the industry, we’ve observed that many projects are conducted without Information Quality visibility. It’s not just a Data Quality consideration – other factors are involved.

A simple approach that helps work this problem has been conspicuously absent, so we’ve devised a strategy and method that can help. It is transforming the way we execute projects, and ultimately the way we do business. We’d like to share it with you.
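
The session does not disclose Intel's actual formula. Purely as an illustration of what a simple, baseline-able information quality metric can look like, here is a sketch that rolls hypothetical dimension scores into a single number that can be tracked over time:

    # Hypothetical per-dimension scores for one information asset, each in [0, 1].
    scores  = {"completeness": 0.96, "validity": 0.88,
               "consistency": 0.91, "timeliness": 0.75}
    # Illustrative business weights; they must sum to 1.
    weights = {"completeness": 0.40, "validity": 0.30,
               "consistency": 0.20, "timeliness": 0.10}

    iq = sum(scores[d] * weights[d] for d in scores)
    print("Information Quality score: %.1f%%" % (100 * iq))  # -> 90.5%

Once such a score is baselined, running the same calculation each reporting period makes it possible to monitor trends and set improvement targets.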

Speaker bio


Rodney Schackmann is an information industry veteran whose career began in 1982, with a history in software development, management, consulting, mentoring, and strategic technical leadership. He is a Sr. Staff Information Architect in Intel’s Corporate Quality organization. In his 20+ years at Intel, the majority of them in IT, the last 12+ have largely been focused on key information efforts — educating and establishing Taxonomies, Controlled Vocabularies, Master Data Management, and Data Governance, and driving cross-organizational teams to better solutions with higher Information Quality results. Rodney is now leveraging Intel’s long quality history and world leadership in silicon manufacturing, applying expertise and best-known practices from that domain to its Enterprise Information landscape.

Designing and Piloting Information Governance at Seattle Public Utilities

Duncan Munro, Technology Program Manager, Seattle Public Utilities

Abstract

Seattle Public Utilities (SPU) manages infrastructure assets worth $7.5 billion in the provision of Drinking Water, Drainage and Wastewater and Solid Waste Services to the Seattle metropolitan area. Design and implementation of Information Governance at SPU is centered on:

  • A core business model based on Asset Management principles
  • Business processes that generate and consume information about our assets 
  • A vision that all decisions will be driven by analysis of information of the highest possible quality 
  • A governance model founded within a paradigm of data quality metrics 
  • Emphasizing engagement between stakeholders in information governance and consumers of information
  • Adoption of a metric-centric maturity model to measure progress and drive course corrections

The presentation illustrates examples of challenges, successes and our next steps towards maturity in each of these six areas.

Speaker bio


Duncan Munro is a Technology Program Manager with Seattle Public Utilities (SPU), a department of the City of Seattle. Since completing his PhD in 1992 he has worked as an acquirer, steward and manager of information in support of business processes in organizations ranging from international government agencies to small start-up technology companies. Duncan’s current focus is on leading engagement between business process owners and technology enablers in managing and governing the information life cycle. Coupled to the diversity of assets that SPU manages in the course of service delivery is an attendant diversity in the information required to implement the asset management paradigm. Designing and implementing approaches that ensure that the quality of the information used in decision making is always measurable and visible has allowed SPU to leverage its information assets more deeply and to lower the risk of poor decisions.

Big Data Quality: The Elephant in the Data

Carla Staeben, Data Quality Manager, HERE, a Nokia Business

Abstract

“Big Data” is on everyone’s mind.  What is it?  How do I manage it?  What system should I use to store it?  With so many things to think about, Data Quality inevitably falls to the side.  “It will take care of itself”, “Data quality issues normalize away”, and “Big data contains machine data, there’s more quality in that” are phrases that are often heard to justify not starting quality efforts.

Data Quality software vendors are trying to catch up with the data tsunami, but many of their solutions amount to “extract and report on a sample of the data.” I maintain there has to be a better way. If we can provide statistical analysis on terabytes and petabytes of data, why can’t we do it with a data quality angle? This is my attempt. I will explain how we've built a "BIG" Data Quality solution on our Hadoop cluster, go through the history of our challenge, and describe what we've done to try to overcome it. I’ll cover:

  • Thought processes
  • High level architecture
  • Mistakes made and lessons learned.
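
The talk's actual architecture is its own story; as a minimal illustration of the kind of full-volume check a Hadoop-based data quality solution can run (rather than profiling a sample), here is a Hadoop Streaming sketch in Python that computes per-column completeness over an entire tab-delimited data set. The delimiter and null markers are assumptions for illustration:

    #!/usr/bin/env python
    # mapper.py -- emits (column index, missing flag, row count) for every field.
    import sys

    NULLS = {"", "NULL", "\\N"}  # assumed markers for missing values
    for line in sys.stdin:
        for i, value in enumerate(line.rstrip("\n").split("\t")):
            print("%d\t%d\t1" % (i, 1 if value in NULLS else 0))

    #!/usr/bin/env python
    # reducer.py -- sums the flags and prints completeness per column.
    import sys
    from collections import defaultdict

    missing, total = defaultdict(int), defaultdict(int)
    for line in sys.stdin:
        col, m, t = line.split("\t")
        missing[int(col)] += int(m)
        total[int(col)] += int(t)
    for col in sorted(total):
        pct = 100.0 * (1 - missing[col] / float(total[col]))
        print("column %d: %.2f%% complete" % (col, pct))

Submitted through the standard Hadoop Streaming jar, the same pair of scripts scales from a workstation-sized sample to the full cluster, which is the point: the quality measurement runs where the data lives.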

Speaker bio


Carla S. Staeben has spent the last 15 years of her career advocating for the importance of data and data quality. With employers such as Allianz Life, Carlson Marketing Group, and Target, Carla worked to improve the quality of the data that went into decisions that impacted where millions of marketing dollars were spent. In her most recent position as Data Quality Manager at Nokia, Carla has focused on Big Data. She is currently bringing Data Governance and Data Quality to the big data platform.  She has been with Nokia’s analytics team for two years, and now leads a team that provides data quality recommendations and analysis for the disparate consumer and operational data assets from across Nokia’s applications and services.

 

Session 11:10 am - 12:10 pm

It Is Possible to Agree on an Industry-wide List of Dimensions of Data Quality

Dan Myers IQCP, Manager of Enterprise Data Management, Farmers Insurance

Abstract

This presentation will be hands-on in terms of discussing what six data quality authorities write in their books regarding the dimensions of data quality. We’ll also discuss how these perspectives can be aligned with real-world data quality challenges in organizations. The goal of this presentation is to stimulate thought on what the highest-priority dimensions of information quality are, and how that may provide a foundation for an industry-standard set of conformed dimensions of data quality.

  • Overview of the dimensions of quality as defined by six authors of data quality books.
  • Deep-dive into key concepts in each dimension of quality according to six authors with real-world examples.
  • Presentation of areas of agreement between these six authors and proposal of industry-wide standardized dimensions that accommodate most of each author’s contributions to the field.

Speaker bio


Dan Myers IQCP currently manages enterprise data management initiatives for Farmers Insurance. Dan helped design and implement Farmers’ enterprise-wide metadata and data quality/data profiling programs. In past roles, Dan has managed Web application development, data modeling, and data/functional BI testing teams. Previously Dan worked as an independent Oracle Certified Professional consultant doing application development and data modeling. Dan's fluency in Japanese enabled him to work in both the public and private sectors in Japan. Dan received his MBA from the U.S.C. Marshall School of Business in 2009. He is an IAIDQ Information Quality Certified Professional (IQCP, 7/2013). Dan has authored a series of articles related to data quality dimensions and published them on Information Management.


Building the Business Case for Data: A Case Study in Action

Jody Dyerfox, Client Partner, Southwest, Utopia

Abstract

In this session, we will detail how to build a customer business case for enterprise data management, including the economic impact of data.

Several use cases will be outlined, including the progressive journey of CSL Behring.

Speaker bio


Jody Dyerfox is Client Partner, Data Innovator and Emerging Technology Strategist at Utopia, Inc.



Big Data and Big Data Governance

Kelle O'Neal, Managing Partner, First San Francisco Partners

Abstract

Data sets are growing in volume and with greater speed and variety, as data is acquired through business transactions, eCommerce, radio frequency identification readers, smart phones and meters, social networks and other Internet connected devices.  The purpose of the session is to establish a solid foundation for big data and big data governance, and to equip participants with essential knowledge to govern big data.

Topics include:

  • An Overview of Big Data management
  • How it differs from other Enterprise Data Management
  • Traditional Data Governance vs. Big Data Governance
  • Why and how it is critical to govern Big Data

Attendees will leave with an understanding of Big Data areas of concern, typical Big Data Governance decisions and how to identify the critical starting point with Big Data Governance. We will also discuss how practitioners can extend their existing data governance structures to address Big Data, and how to avoid the inherent risks.

Speaker bio

Having worked with the software and systems providers key to the formulation of Master Data Management (MDM), Kelle O'Neal has played important roles in many of the groundbreaking initiatives that confirm the value of MDM to the enterprise. Recognizing an unmet need for clear guidance and advice on the intricacies of implementing MDM solutions, she founded First San Francisco Partners in early 2007. Under her leadership, First San Francisco Partners immediately established a reputation as the first-call resource for companies looking to tap the value of customer data integration and MDM. Kelle developed her ability to work through organizational complexity, build consensus and drive results in senior roles at companies such as Siperian, GoldenGate Software, Oracle and Siebel Systems. She has worked at the executive level in the U.S., Europe and Asia. Kelle's strong background in customer relationship management, enterprise software, and systems integration enables her to provide expert counsel to any organization seeking to execute an MDM project. Kelle most recently served as General Manager, EMEA for Siperian. She earned her B.A. from Duke University and also holds an M.B.A. from the University of Chicago Booth School of Business.

She can be reached at kelle [at] firstsanfranciscopartners [dot] com and on Twitter at @1stSanFrancisco.

 

 

Session 1:10 - 1:40 pm

Collibra's Data Governance Center v4.1: A Data-Steward-Focused Platform

Stan Christiaens, Co-founder and Operational Director, Collibra

Abstract

Data Governance is operationalized through Data Stewardship. Collibra's Data Governance Center enables your data stewards in their daily work: managing the business glossary, curating reference data, setting up policies and rules, measuring compliance, monitoring quality and resolving issues.

The Data Governance Center supports essential data assets (e.g., data definitions, business glossary, reference data, data domains, …) which are critical instruments in any stage of maturity. Collibra goes beyond and supports more advanced data assets (e.g., business rules, policies, metrics & measures, …). The platform is set up to handle any kind of operating model: different organizational setups (e.g., functional, centralized, federated, …), and out-of-the-box and configurable roles, responsibilities and workflows (e.g., approval, notification, issue management, …).

Because usage and business adoption are critical, Collibra also makes this content and the required stakeholder interactions easily accessible: using Search Everywhere, your data producers and consumers can pull up search results in the context of their work, taking them into the Business User Portal where they can further browse, navigate, search and filter out what they need before engaging with the relevant data stakeholders (e.g., by raising an issue).

Collibra recently released its Data Governance Center (v4). In today's session we introduce a first update (v4.1), with a focus on Data Quality Management in Data Governance.

Speaker bio


Stan Christiaens is co-founder and operational director at Collibra (www.collibra.com), a Data Governance enterprise software company. Stan has global responsibility for all technical pre-sales, implementation and support activities, which gives him a front-seat view of real customer demands, issues and implementation challenges. Prior to founding the company he was a senior researcher at the Vrije Universiteit Brussel (STARLab), a leading semantic research center in Europe, performing application-oriented research in semantics. Stan participated actively in several international (ITEA, FP6, FP7) research projects and conferences (OTM, FIS, ESTC, ...). He has also published various articles in the field of ontology engineering. He is an active DAMA member and a speaker at various events.

Data 2 Insights™

Jody Dyerfox, Client Partner, Southwest, Utopia

Abstract

Better Data. Better Insights. Better Decisions. For over 10 years, clients have entrusted Utopia with their data management challenges to help them drive revenues, shred costs and streamline operations. Customers turn their Data 2 Insights™ through the use of one or more of our proven solutions: data governance, data migration, analytics, big data and data outsourcing. Data is our passion, business processes our second nature – we excel at blending both to achieve bigger, better and bolder business decisions with accurate data, anytime, anywhere – faster. Attend this session to learn more about our products and services!

Speaker bio


Jody Dyerfox is Client Partner, Data Innovator and Emerging Technology Strategist at Utopia, Inc.



Session 1:50 - 2:40 pm

ASK Master Data

Noha Radwan, Master Data Analyst, Schlumberger

Abstract

As a company moves to proactive data quality improvement, it is often necessary to create a framework and methodology to manage and maintain Master Data Quality. Within Schlumberger, Ask Master Data started from the clear understanding of the importance of a centralized tool to gather and act on Master Data issues.

Ask Master Data is a complete framework that covers all the aspects needed to operate a Master Data Help Desk. It is a way to utilize the existing, structured enterprise IT Help Desk, in coordination with the MDM team and the data owners, to address Master Data issues in an efficient and timely way. It also provides a powerful tool to measure the workload and justify the resources needed to maintain the quality of Master Data.

Attendees will discover:

  • How we developed our Master Data Help Desk
  • The type of data we provided for the Help desk
  • What the Help Desk cannot do
  • The levels of support we have defined, and how they interrelate
  • How the data owners interact with the Master Data Help Desk

Speaker bio


Noha Radwan is a Master Data Analyst at Schlumberger, bringing years of experience in IT operations and software development to the evolving Master Data team. As part of the team that launched the Master Data Quality center at Schlumberger, she has implemented data quality metrics and reporting, as well as source-to-consumer alignment strategies. She has worked with the Master Data Management team to establish Master Data Quality center standards, processes, and templates for Schlumberger, contributed to data consumer training programs, and established the basic structure of the Schlumberger Master Data Help Desk (ASK Master Data). Noha holds degrees in communications and electronic engineering from Cairo University.


Using Machine Learning to Automate Data Integration

Melody Penning, BI Reporting, University of Arkansas at Little Rock (UALR)

Abstract

Data integration is a quickly growing issue for businesses. As the availability of data grows, so does the need to integrate that data. Discovering matching elements in structured sources consumes the time of subject matter experts as well as database experts. Machine learning can decrease the time and effort spent identifying matching elements by limiting the number of elements that need to be considered. I propose the use of supervised learning to identify the most likely elements for matching, then verify the success of the classification with an overlap-score comparison against a random element set. This project demonstrates the potential of machine learning techniques to significantly cut the data integration time for complicated schemas.

  • Data integration is a challenge
  • Machine learning can provide automation
  • Finding the best technique: issues and opportunities
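
As a minimal sketch of this supervised approach, the toy example below scores candidate element pairs with a few simple features and a scikit-learn classifier. The feature definitions, training pairs, and model choice are illustrative assumptions, not the project's actual design:

    # Each candidate pair of schema elements is described by simple features;
    # a classifier trained on labeled pairs then ranks the likeliest matches.
    from sklearn.ensemble import RandomForestClassifier

    def jaccard(a, b):
        """Toy character-set similarity between two element names."""
        sa, sb = set(a.lower()), set(b.lower())
        return len(sa & sb) / len(sa | sb)

    def features(a, b):
        """Name similarity, type agreement, and sample-value overlap."""
        return [jaccard(a["name"], b["name"]),
                1.0 if a["dtype"] == b["dtype"] else 0.0,
                len(a["sample"] & b["sample"]) / max(len(a["sample"] | b["sample"]), 1)]

    # Labeled training pairs (1 = the elements match, 0 = they don't) -- invented data.
    train = [
        ({"name": "cust_id", "dtype": "int", "sample": {1, 2, 3}},
         {"name": "customer_id", "dtype": "int", "sample": {2, 3, 4}}, 1),
        ({"name": "cust_id", "dtype": "int", "sample": {1, 2, 3}},
         {"name": "zip", "dtype": "str", "sample": {"72204"}}, 0),
    ]
    X = [features(a, b) for a, b, _ in train]
    y = [label for _, _, label in train]
    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

    # Score a new candidate pair; only high-probability pairs go to the experts.
    pair = ({"name": "custid", "dtype": "int", "sample": {3, 5}},
            {"name": "customer_id", "dtype": "int", "sample": {3, 4}})
    print(clf.predict_proba([features(*pair)])[0][1])  # probability of a match

Only the pairs the classifier scores highly are handed to subject matter experts, which is where the time savings come from.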

Speaker bio


Melody L. Penning is a PhD student in the Information Quality program at the University of Arkansas at Little Rock (UALR). She completed her Master’s Degree in Information Quality in 2012 and received honors for Outstanding Academic Achievement and Best MSIQ Project 2011-12.

She is an employee of the University, working in the Information Technology Department on data analysis and visualization. Ms. Penning is also a member of the OYSTER Entity Resolution research team at the Center for Advanced Research in Entity Resolution and Information Quality (ERIQ).  In 2012 she earned a certification in the Level I ISO 8000-110:2009 Data Quality Standard for Master Data Management and most recently she has received a Graduate Certificate in Statistics. Ms. Penning is also a past president of the UALR Chapter of the IAIDQ.

Session 3:20 - 4:20 pm

Data Alliances: A Foundation for Data Governance Success

Thomas Tong, VP of Client Engagements, Knowledge Transformation Partners

Abstract

The largest obstacle to achieving a profoundly managed state of alignment around data is the “human factor”. Most often, it is people and how they relate to data that determine data value – access, quality and timeliness – not the actual technology systems. A Data Alliance is a specific approach to forming strong relationships among groups for the purpose of co-managing data assets to new levels of business value. A Data Alliance establishes a sustainable set of relationships and defines clear expectations and roles around managing data. Data Alliances are, in effect, a way to implement data governance. They are more profound than “data contracts”, adding a business dimension and driving alignment across executives and management, not just system-to-system “data contracts”. Topics include:

  • Understanding Data Alliances
  • How to create and sustain Data Alliances
  • Key Success factors

Speaker bio


Thomas Tong is an accomplished leader in technology management, knowledge management, learning management and cultural transformation. Through his work, Thomas has generated more than 17 billion in hard returns. Since 2003, Thomas has worked with more than 30 organizations to better manage data, information, and knowledge. Most recently, Mr. Tong has been focusing on data management within the Oil and Gas sector, working with supermajors to design data initiatives that will lead to breakthrough business, process, and technical results. Thomas is keenly aware that challenges must be addressed in an empowering way that facilitates the involvement of many stakeholders, functional groups and business units. A large part of Thomas' methodology involves the deliberate creation of cultures that value alignment and partnership, establishing a sustainable base for data, information, and knowledge management.

Since 1991, Thomas has worked in both the public and private sectors, across many verticals, and within local and enterprise-level initiatives. Thomas is a managing partner with Knowledge Transformation Partners and is accountable for the results generated for its customers. He officially holds the title VP of Client Engagements.

Prerequisites for Analytics: Data Stewardship and Data Sharing

Stan Christiaens, Co-founder and Operational Director, Collibra

Abstract

Visualization and mining algorithms can provide tremendous insights into data, and business users can use those insights to take the right actions and make the right decisions. New functions (e.g., data scientists) and skills (e.g., Hadoop) are essential, but they are also hard to come by. It can be a big disappointment to learn that the data scientists can't figure out what "Customer" means, where that data can be found or shared from, or even who is responsible for it.

So how can you make sure that analytics move from a niche experiment to a structural, competitive differentiator?

In this session we will show how Data Governance and Data Stewardship provide the right level of control and trust in data. Data Stewards are the worker bees who achieve this operationally: managing the business glossary, curating reference data, handling classifications, taxonomies and hierarchies, setting up policies and rules, measuring compliance, monitoring quality and resolving issues, facilitating data sharing, and more. They enable the process of data management.

We will make use of Collibra's Data Governance Center to show how this works in practice, and how a configurable operating model (roles, workflow, organization, ...) drives the right level of adoption and business engagement to establish a sound approach to data maturity, and turn information into a competitive differentiator.

Speaker bio


Stan Christiaens is co-founder and operational director at Collibra (www.collibra.com), a Data Governance enterprise software company. Stan has global responsibility for all technical pre-sales, implementation and support activities, which gives him a front-seat view of real customer demands, issues and implementation challenges. Prior to founding the company he was a senior researcher at the Vrije Universiteit Brussel (STARLab), a leading semantic research center in Europe, performing application-oriented research in semantics. Stan participated actively in several international (ITEA, FP6, FP7) research projects and conferences (OTM, FIS, ESTC, ...). He has also published various articles in the field of ontology engineering. He is an active DAMA member and a speaker at various events.

The Visualization Process: Transforming Data into Information

Ken Benson, Independent Verification & Validation Business Manager, First Data Government Solutions

Abstract

Too many reports or presentations furnish excessive detail instead of providing focused information that people can use for decision making. Most organizational members do not have the time to analyze the data and derive the conclusions needed to make informed decisions. This presentation will delineate a process consisting of defining:

  • Who determines what is to be measured?
  • What is the origin of the data?
  • How is the data collected and stored?
  • How is the data manipulated to transform it into information?
  • What is the best way to present the information?
  • What is done with the final product, such as the report or the presentation?

This is a practical how-to presentation for people interested in the preparation of reports and presentations that may be used by teams of peers, management and senior management in the organization to make informed decisions. Attendees will leave with best practices on how to turn data into actionable information.

Speaker bio


Ken Benson has worked in the field of Quality Assurance for his entire professional career. He graduated with an MS in Information Quality from UALR in 2009; his bachelor’s degree is in Electrical Engineering. Ken is currently employed by First Data Government Solutions as an Independent Verification & Validation Business Manager. Previously, Ken worked at Acxiom Corp. as a Business Improvement | Quality Manager; as Program Manager there, he achieved implementation and appraisal of CMMI v1.3 Maturity Level 3 on the second-largest service contract in 15 months. Formerly, as owner of an Information and Process Quality consulting practice, he managed the successful accomplishment of a CMMI v1.2 ML 3 appraisal in 20 months. For seven years before that, he was a Quality Process Engineer at Acxiom, preceded by roles as Quality Manager for PSINet Consulting, an independent QA/Test consultant, and various positions at Digital Equipment Corp.

 

Session 4:30 - 5:30 pm

How to Define and Implement Effective Data Quality Metrics

Laura Sebastian-Coleman IQCP, Data Quality Architect, Optum

Abstract

Join this highly interactive discussion on the challenges and joys of establishing effective data quality metrics. If you have defined effective metrics, come prepared to share your tips and best practices, and how you have used these metrics to help improve data quality. If you are struggling to define and implement metrics, please bring stories and examples of the obstacles you are facing, and we will identify possible solutions together.

Speaker bio


Laura Sebastian-Coleman IQCP, a data quality architect at Optum, has worked on data quality in large health care data warehouses since 2003. Optum specializes in improving the performance of the health system by providing analytics, technology, and consulting services. Laura has implemented data quality metrics and reporting, launched and facilitated Optum’s Data Quality Community, contributed to data consumer training programs, and has led efforts to establish data standards and to manage metadata. In 2009, she led a group of analysts from Optum and UnitedHealth Group in developing the original Data Quality Assessment Framework (DQAF) which is the basis for her book Measuring Data Quality for Ongoing Improvement (Morgan Kaufmann, 2013).


How to Manage Politics and Conflicts in Data Governance and Information Quality

Gwen Thomas, Senior Operations Officer, IFC

Abstract

Because politics and conflicts in data governance and information quality can be brutal, the DG and IQ leader's ability to navigate and manage them is often the differentiator between success and failure.

What are the typical sources of tension? How are politics and conflicts best handled? What are effective ways to use the organization’s formal and informal structures and systems?

Come share your success stories, lessons learned and current challenges; but most importantly, attend this highly interactive session to learn best practices and practical tips you'll be able to implement right away.

Speaker bio


Gwen Thomas is a Data Governance pioneer. Founder of the vendor-neutral Data Governance Institute (DGI) and primary author of the DGI Data Governance Framework and guidance materials found at www.datagovernance.com, she has influenced hundreds of programs around the globe. In 2013, Gwen joined the International Finance Corporation, part of the World Bank Group.

Gwen has personally helped many organizations build their Data Governance and Stewardship programs.  Before data got “hot”, she spent a dozen years working in the trenches and in management roles as a systems integration consultant and Knowledge Manager. Prior to that, she was a college teacher, a marketing professional, and a publisher of a literary magazine.

Gwen is a frequent presenter at industry events. Information-Management.com has named her one of "17 Women in Technology You Should be Following on Twitter" (she's @gwenthomasdgi) and has included her in their Data Governance Gurus list.


How Do I Become a Big Data Pro or a Data Scientist?

Piyush Malik, Worldwide Leader, Big Data Analytics Center of Excellence, IBM

Abstract

While the Big Data buzz in the media has reached a crescendo, according to Harvard Business Review the honor of “sexiest job title of the 21st century” in the industry today goes to “Data Scientist”. Armed with data and analytical results, a top-tier data scientist can become a powerful organizational change agent. However, this new breed of professional is tough to find and harder to groom.

Join this interactive session to discuss the following:

  • What is Data Science and why are data scientists in high demand today?
  • What does a data scientist do and what is a typical day in the life of a data scientist?
  • What is the link between Data Science and Big Data, if any?
  • Are Data Quality and Governance skills relevant in the era of Big Data?
  • What does it entail to become a Big Data Professional or a Data Scientist? How do you get started?

Come ask your questions and share your perspectives on these current hot topics. You will leave with new and practical insights on how to transform yourself and your organization in the era of Big Data and Data Science. You won’t want to miss this!

Speaker bio


Piyush Malik focuses on strategic initiatives and emerging technologies as part of IBM’s Global Business Analytics & Optimization (BAO) consulting practice. He leads its Worldwide Big Data Analytics Center of Excellence, serving clients and building high-performing teams globally. Specializing in data science, big data, information management strategy and architecture, information quality and governance, business intelligence (BI), master data and advanced analytics, he has over twenty-four years of international consulting, practice building, sales and delivery experience with Fortune 500 clients across multiple industries and continents.

Piyush served as the founding Director of IBM’s Global BAO Center of Competency. Previously, as America’s Global Delivery leader for BI solutions, he was instrumental in the setup and explosive growth of IBM’s global services delivery centers in India, China, Brazil and other emerging markets.

Prior to IBM, Piyush led the Information Integrity consulting practice at PricewaterhouseCoopers Management Consulting Services before it was acquired by IBM in 2002. Piyush is an intrapreneur who is also active in the community, serving on the advisory boards of a number of non-profit and professional organizations, including IAIDQ. He holds a bachelor’s degree in Electronics & Communications Engineering and a master’s in Management of Technology from IIT, Delhi. He is a frequent speaker at industry conferences globally and has authored several articles and papers.


 

Wednesday 6 November 2013 — Conference Sessions

Session 8:35-9:35 am KEYNOTE

Organizational Imperatives in the Unfolding Data Revolution

Thomas Redman, President, Navesink Consulting Group

Abstract

More and more businesses are coming to understand the vast potential that lies in their data: potential to deliver more value, to make better decisions top to bottom, and to make the fundamental “big data” discoveries that could “change everything.” The technology challenges are numerous and complex. After all, data do not give up their secrets easily!

Still, these technology challenges pale in comparison to the organizational challenges. Facing issues that range from poor quality; to a lack of needed analysts, managers, and leaders; to organizational structures that inhibit data sharing; today’s organizations are not up to the rigors of data. Resolving these challenges is the most important management challenge of our times.

In this keynote, Tom Redman, the Data Doc, lays out the challenges. He points out that, within relatively few years, no industry segment, no company therein, no department, indeed no job, will remain untouched.  Tom then describes several “points of light,” age-old principles that can help leaders get in front of the issues.  Finally, he suggests how data-driven companies will resolve the thorniest data challenges.

Speaker bio


Dr. Thomas C. Redman, "The Data Doc," is President of Navesink Consulting Group, in Rumson, NJ. After earning his Ph.D. in Statistics from Florida State, Dr. Redman joined Bell Labs, where he formed the Data Quality Lab in the late 1980s. Tom started Navesink Consulting Group in 1996 and has been fortunate enough to consult with many of the world's leading organizations.

Dr. Redman has helped thousands understand that quality data, and more recently data governance, are top-line business issues. He has developed some of the most powerful methods of data management and holds two patents. His fourth book, Data Driven: Profiting from Your Most Important Business Asset, was published by Harvard Business Press in 2008.

He is an IAIDQ co-founder and winner of its 2011 Distinguished Member award. 


 

Session 10:00-11:00 am

The Data Quality and Data Governance Story at Cedars-Sinai: Implementing a Practical Data Quality Management Solution in a Large Teaching Hospital

Alein Chun PhD, MSPH, IQCP, Manager, Data Quality Management Unit, Cedars-Sinai Health System

Abstract

This presentation describes how we implemented a practical data quality management (DQM) solution at Cedars-Sinai Medical Center, a major academic medical center in Southern California. We present the following components of our DQM solution:

  • Managing data quality incidents
  • Creating a communication channel to the data community
  • Mitigating data reporting risks to external organizations
  • Building early detection of data quality occurrences
  • Measuring impact of data quality management
  • Data Governance structure

Our main objective is to share the story of how our organization has evolved from having individual observations about data anomalies to having an organized, dedicated work unit that has become an instrument of change and a recognized leader in data quality problem solving from both within and outside the hospital.

Speaker bio

About the Author


Alein Chun PhD, MSPH, IQCP has been Manager of the Data Quality Management Unit (DQMU) at Cedars-Sinai Health System since 2005. Prior to his DQMU role, Dr. Chun was Manager of Data and Methods in the Resource & Outcomes Management Department.

Dr. Chun has been involved with the hospital-wide data governance and data quality management function at Cedars since its inception; the function was established to assure the integrity of the institutional information supply chain for decision-making. He currently chairs the institution's Data Quality Management Working Group, which comprises over 25 members from various operational departments.

Dr. Chun holds a Ph.D. in Health Services Research from UCLA School of Public Health. He also holds a Master’s degree in Public Health from UCLA. In addition, Dr. Chun is an IAIDQ Information Quality Certified Professional (IQCP) and an ISO 8000 Master Data Quality Manager.

From the Bottom to the Top: Shell's Customer Master Data Journey

Tom Kunz, Data Manager, Royal Dutch Shell

Abstract

In 2009, Shell's customer master data registered 80% compliance with data quality standards. Data maintenance was buried in the Customer Service Centers, and the connections between data defects and process failures were unknown. Data was a black hole. By the end of 2012, Shell's customer master data processes were rated best in class, and data quality is expected to reach 99.5% compliance by the end of 2013. Continuous improvement projects have resulted in a reduction of over 40 FTEs and tangible bottom-line improvements exceeding $2M. With improved data, an ordering-pattern analytics project is delivering additional value.

This session will discuss: 

  • The causes of our huge shift in focus 
  • The factors that led to the rapid improvement in compliance
  • The new processes that improved productivity 
  • The source of the benchmarking exercise and how it is fueling the insatiable drive for world class data
  • The keys to success for the Analytics project

Ours has been a remarkable journey over the last three years, starting with unresponsive stakeholders and ending with a fully energized Offer to Cash program that understands and values high quality master data. 

This presentation will be valuable to anyone seeking to implement a successful data governance program.

Speaker bio

About the Author


Tom is a 32+ year Shell finance professional who accepted a job with Shell's global master data organization in 2009. Working as the Data Manager for Shell's Downstream businesses, he brought his business and finance knowledge to the base of a huge learning curve about data. Responsible for managing Downstream's customer, product, property and facilities/equipment data processes, his team of approximately 250 data professionals is busy bringing bottom-line improvements to Royal Dutch Shell.

Tom has a degree in business management from Brigham Young University and has pursued continuing education over the last 30 years, including postgraduate courses at the University of Houston and Management Education courses at Wharton. He is also recognized as a Virtual Teamwork expert both inside and outside of Shell, having been featured in several magazine articles, including the Harvard Business Review, for his practical approaches to virtual work, which he uses daily.

Effective Business Metrics for Governance – an IBM Case Study

Christian Walenta IQCP, Program Manager Information Management, IBM

Abstract

Drawing from his work with one of the largest end-to-end information chains inside IBM, Christian will describe ways to mitigate the challenges in developing the right metrics for data governance.

We all know that metrics drive behaviors, and the wrong metrics will lead to "wrong" or unintended behaviors. Based on real-life experiences, Christian shares his insights and lessons learned in establishing governance structures that "stick". He also discusses how to establish focus on information quality and secure organizational support.

Attendees will also learn how to ensure metrics measure the right thing, how to work in a cross-functional environment across teams, and how to establish accountabilities and achieve continuous improvement.
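
The abstract does not prescribe an implementation, but as a hedged sketch of a metric that "measures the right thing" with clear accountability, a governance metric might carry an explicit definition, owner and target. All names and values below are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class GovernanceMetric:
        name: str        # what is measured
        definition: str  # a precise, auditable rule, to avoid unintended behaviors
        owner: str       # the single accountable steward
        target: float    # the agreed threshold, as a fraction

        def is_met(self, observed: float) -> bool:
            return observed >= self.target

    # Hypothetical example: completeness of customer address data
    metric = GovernanceMetric(
        name="customer_address_completeness",
        definition="share of active customer records with a deliverable postal address",
        owner="Customer Data Steward",
        target=0.98,
    )
    print(metric.is_met(0.965))  # False -> trigger corrective action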

Speaker bio

About the Author

Christian Walenta

Christian Walenta IQCP serves as the President on the Board of Directors of the International Association for Information and Data Quality (IAIDQ). He has held a variety of professional and management positions with IBM in Germany, the UK and the USA. His professional experience of more than 20 years includes information management, data governance, management of supply chain systems, business development and project management. Christian is a recognized leader in Information Management (IM) at IBM and currently works in IBM's Product Information organization. Prior to that, he led IBM's Information Management Office, responsible for providing information management strategy, IM programs and Data Governance enterprise-wide. He implemented Information Quality programs for IBM's Supply Chain and has extensive practical experience in all aspects of Information Management, including setting overall IM strategies, implementing data stewardship and governance programs for complex organizations, and driving Information Management as continuous culture change.

Christian has earned various professional certifications, including the IAIDQ Information Quality Certified Professional (IQCP), the PMI Project Management Professional (PMP), and the IBM Executive Project Manager certification. He holds a Master's degree in Computer Science and Business Administration, as well as an MBA in International and Strategic Management.

View my profile on LinkedIn

 

Session 11:10 am - 12:10 pm KEYNOTE

Analysis in Motion

Debbie Gracio, Director, Computational and Statistical Analytics Division, National Security Directorate, Pacific Northwest National Laboratory

Abstract

Speaker bio

About the Author

Debbie Gracio

Ms. Deborah (Debbie) Gracio joined Pacific Northwest National Laboratory in 1990 and is currently the Director for the Computational and Statistical Analytics Division. Since 1990, she has led multiple cross-disciplinary, multi-laboratory projects focused on the basic sciences and national security sectors. Her work has included research, development and management of integrated computational environments for biodefense, computational biology, computational chemistry, and atmospheric modeling. As the director of the Data-Intensive Computing research initiative, Ms. Gracio was responsible for building a program that identified PNNL as the leader in addressing the challenges of high-throughput streaming data and multi-source analytics focused on problems in both fundamental science and national security domains.

Since joining the Laboratory, Ms. Gracio has been involved in a variety of projects that have helped her develop a broad technical background in computer systems integration, software engineering, scientific computing, large-scale data management, and data acquisition. From 2003–2006, she was responsible for the Computational Sciences Product Line at PNNL, a series of programs within the DOE, DOD, and Intelligence Community. From 2000–2004, she worked with the leadership of the Biomolecular Systems Initiative to develop a computational biology and bioinformatics portfolio for PNNL. This work has included developing and coordinating a research agenda and roadmap for bridging computational and biological sciences and building enduring collaborations, and business opportunities. From 1995–2000, Ms. Gracio was the project lead for the Extensible Computational Chemistry Environment (Ecce), a key capability in the Molecular Sciences Software Suite - the flagship computational chemistry suite for the Department of Energy. From 1990–1995, she was responsible for the design, development and management of the Atmospheric Radiation Measurement Experiment Center, providing observational data streams and computational model results to climate modelers and researchers across the globe.

Ms. Gracio received an R&D 100 Award in 1999 and a Federal Laboratory Consortium Award in 2000 for the Molecular Sciences Software Suite, a software product now deployed to institutions worldwide. In 1994, she was recognized by the DOE with a Certificate of Accomplishment for the Atmospheric Radiation Measurement Program, and in 1989 as the DOE Outstanding Woman in Engineering. Ms. Gracio is currently a Senior Member of the Institute of Electrical and Electronics Engineers and served on the IEEE Information Systems Strategy Council. She serves on the executive advisory board for the School of Electrical Engineering and Computer Science at Washington State University, is a member of the University of Arkansas at Little Rock Emerging Analytics Center Data Science Advisory Board, and is a board member of the Columbia Basin College Foundation. She also serves as an executive member of the Habitat for Humanity Board in Richland, WA.

Ms. Gracio holds B.S. and M.S. degrees in Electrical Engineering, both from Washington State University.

View my profile on LinkedIn

 

Session 1:10 - 1:40 pm

Supplier master quality: The key to improving spend visibility, reducing regulatory risk and improving supplier communications

Kevin Cassidy, Principal, Profit Optimization, PRGX
David Giat, Principal, Management Consulting, PRGX

Abstract

The quality of the data in your supplier master has a significant impact on your organization's operations. Learn how we help organizations improve spend visibility, reduce regulatory risk and improve vendor communications by cleansing, standardizing and enriching their supplier master.
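
As a hedged, minimal sketch of the "standardizing" step mentioned above (our illustration, not PRGX's method; the rule set is deliberately tiny), supplier names are typically normalized before duplicates can be found:

    import re

    # Illustrative suffix map; a production rule set would be far larger.
    SUFFIXES = {"incorporated": "inc", "corporation": "corp",
                "company": "co", "limited": "ltd"}

    def standardize_vendor_name(raw: str) -> str:
        """Normalize case, punctuation and common legal suffixes so that
        duplicate supplier records become comparable."""
        name = re.sub(r"[^\w\s]", "", raw.lower())  # strip punctuation
        return " ".join(SUFFIXES.get(t, t) for t in name.split())

    print(standardize_vendor_name("ACME Corporation"))  # acme corp
    print(standardize_vendor_name("Acme Corp., Inc."))  # acme corp inc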

Speaker bio

About the Author

Kevin Cassidy

Kevin Cassidy, Principal, Profit Optimization, PRGX

Kevin has been leveraging data to solve business problems and helping develop systems for companies for over 15 years.  Engagements have included master data process assessment and optimization, strategy development for data modeling groups as they transitioned to SAP, building and analyzing spend cubes, and developing systems to support reference databases.  Prior to PRGX, Kevin worked at XRoads Solutions Group, A.T. Kearney, and Seer Technologies.
 

About the Author

David Giat

David Giat, Principal, Profit Optimization, PRGX

David has been helping companies improve their purchase-to-pay processes since 2000, as a consultant at PRGX and formerly with Deloitte Consulting. In recent years, David has led numerous client engagements in master data cleansing, enrichment and governance, helping clients significantly improve the quality of their supplier masters. Speaking engagements have included the Institute of Financial Operations and the Food Marketing Institute.


Emerging Data Visualization Tools: New Frontiers for Data Quality

Joe Swaty, APR, Director of Operations, UALR Emerging Analytics Center

Abstract

In this session, you will see how advanced data visualization technologies are emerging as possible new solutions to dynamically present abstract data sets. How to convey the importance of data quality and its business impact on the bottom line can be a challenge throughout all levels of business management. Examples will show how businesses and academia are using a combination of advanced presentation technologies and research-driven discovery to offer a “bold new look” in both large scale and portable applications to visualize large data sets. The “hands on” session will open your imagination to the possibilities of future synergy between data quality and advanced data visualization technologies.

Speaker bio

About the Author


Joe Swaty serves as the Director of Operations for the UALR EAC. His career includes serving as Assistant Dean for both the University of Arkansas at Little Rock's College of Engineering and Information Technology (EIT), and Assistant Dean for the University of Colorado at Colorado Springs' College of Engineering and Applied Science (EAS). Specializing in external corporate relations and business development, he coordinates specialized projects in the UALR EAC for clients and academic collaborators from throughout the nation. His development background also includes serving as Director of Development for the United States Air Force Academy's Association of Graduates.

Joe will be joined during the "fast-paced, hands-on" vendor product session by Mechdyne representatives (including Jason Keeney, Mechdyne's Onsite Technical Specialist), the EAC’s Chief Data Scientist Dr. Yassine Belkhouche, along with other faculty and graduate students.

 

Session 1:50 - 2:50 pm

Data Quality Assessment Framework (DQAF): A Comprehensive Approach to Data Quality Assessment

Laura Sebastian-Coleman IQCP, Data Quality Architect, Optum

Abstract

A comprehensive approach to data assessment is fundamental to improving data quality, as well as to successful, long-term data management. Many of us are familiar with the benefits of profiling data during development projects, but few of us establish clear goals and apply consistent measurement activities at other points in the information lifecycle. This presentation will describe a comprehensive approach to data quality assessment, from initial profiling, to in-line controls and measurements, through periodic measurement.

Participants will come away with:

  • An overall approach to data quality assessment
  • Increased awareness of how different forms of data assessment enable successful data management
  • Specific ideas that they can implement in their environments (see the sketch after this list)
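
As a minimal illustration of the in-line measurement idea (our sketch, not the DQAF's prescribed form; the field names are hypothetical), a load process might record column completeness for every batch it processes:

    def column_completeness(rows, column):
        """Fraction of rows with a non-null, non-empty value in `column` --
        one simple in-line measurement taken as each batch is loaded."""
        if not rows:
            return 0.0
        filled = sum(1 for row in rows if row.get(column) not in (None, ""))
        return filled / len(rows)

    # Hypothetical batch of member records:
    batch = [{"member_id": "A1", "dob": "1970-01-01"},
             {"member_id": "A2", "dob": ""},
             {"member_id": "A3", "dob": "1985-06-30"}]
    print(round(column_completeness(batch, "dob"), 3))  # 0.667 -> compare to threshold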

Speaker bio

About the Author

Laura Sebastian-Coleman

Laura Sebastian-Coleman IQCP, a data quality architect at Optum, has worked on data quality in large health care data warehouses since 2003. Optum specializes in improving the performance of the health system by providing analytics, technology, and consulting services. Laura has implemented data quality metrics and reporting, launched and facilitated Optum’s Data Quality Community, contributed to data consumer training programs, and has led efforts to establish data standards and to manage metadata. In 2009, she led a group of analysts from Optum and UnitedHealth Group in developing the original Data Quality Assessment Framework (DQAF) which is the basis for her book Measuring Data Quality for Ongoing Improvement (Morgan Kaufmann, 2013).

View my profile on LinkedIn

Twice as Long to do it Wrong

John Ossege, Data Quality Advisor, ExxonMobil Technical Computing Company
Scott Robinson, Senior Exploration Geologist, ExxonMobil Technical Computing Company

Abstract

There is much discussion within the IQ community about the high cost of poor quality data, but few measurements of it.  In this presentation, we report the results of a pioneering effort to calculate the return on investment of Data Quality.

Our approach was to first quantify the cost of conducting a data-cleanup project that was being undertaken at ExxonMobil to remediate poor data quality. We then sought to determine the "break-even point": the amount of effort that could be spent loading data correctly before it would equal the effort of cleaning it up later. We found that retrieving data after it has been cleaned up takes less time than it did before clean-up, so we used this time savings to calculate "X", our expression of the break-even point. We defined "X" as the number of times the cleaned-up data has to be retrieved for the time savings to add up to the time spent cleaning it up.

In this session, we describe the process we followed, share the value of X we computed, and discuss other findings of our analysis, including how we documented that, if data were to be used even once in the future, it was far more cost-effective to store it correctly the first time than to launch a huge clean-up effort later.
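
Stated in our own notation (the presenters share the actual figures in the session; the numbers below are placeholders), the break-even point reduces to a one-line calculation:

    def break_even_retrievals(cleanup_hours, retrieve_before_hours,
                              retrieve_after_hours):
        """X = the number of retrievals at which the per-retrieval time
        savings pay back the one-time clean-up effort."""
        savings_per_retrieval = retrieve_before_hours - retrieve_after_hours
        if savings_per_retrieval <= 0:
            raise ValueError("clean-up must make retrieval faster")
        return cleanup_hours / savings_per_retrieval

    # Hypothetical figures, for illustration only:
    print(break_even_retrievals(cleanup_hours=500,
                                retrieve_before_hours=2.0,
                                retrieve_after_hours=0.5))  # ~333 retrievals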

The group for whom we did the study is using these findings to strengthen their business case for changing the current best practice for storage of digital and paper documents to include capture of standardized metadata.

Speaker bios

About the Author

John Ossege

John H. Ossege is the Data Quality Advisor for the Data Management Practices and Operations group at ExxonMobil. He has been with ExxonMobil for 33 years, with experience in geologic interpretation, seismic data processing and data management. John has a Master's Degree in Geology from Wright State University in Dayton, Ohio, and has completed all the coursework in the TIQM (Total Information Quality Management) Mastery Series by Larry English. He has presented at the 2008 IDQ Conference and the 2008 and 2012 Petroleum Network Education Conferences. He and his wife have three grown children and one grandson. His hobbies include traveling, exercising and sports.

About the Author

Scott Robinson

Scott E. Robinson got his Master’s in Geology in 1985, and has been noodling around the oil industry since then as an employee of ExxonMobil. He started with Exxon in digital mapping and became a user support geologist, then a data manager, and now works in the Upstream Data Management organization assisting groups with data mapping and integration of geoscience and engineering technical applications and databases. He developed an interest in Information Quality early in his career from seeing firsthand how much poor data quality can cost in time and resources. He and his wife are raising two teenage daughters. His hobbies are reading, going to school events, and (mild) exercising.

The Human Side of Data: Why Individual Contact Information (not Just Company Information) Is More Important Than Ever

Patricia Anton, President & Founder, Anton Consulting, Inc
Carol Marchesani, Director of Global Marketing Operations, SAP

Abstract

In this presentation we will address:

  • Why data that helps you market to "people" not just "companies" is so important.
  • Common pitfalls most organizations face in managing individual contact data and how these can be overcome.
  • The relevance of capturing social media contact data and the challenges with tracking social responses.
  • A case study on how SAP was successful in overcoming a contact data challenge.

Speaker bio

About the Author


Patricia Anton, President & Founder of Anton Consulting, Inc., is an acknowledged thought leader in Customer Data Management with over 20 years of experience in the field. Before founding Anton Consulting, Inc. twelve years ago, Patricia was the SVP of Customer Relationship Management at Digitas in San Francisco, a marketing consulting firm focused on customer loyalty. Prior to joining Digitas, Patricia ran the CRM practice at Tessera, a systems integration company where her teams specialized in improving one-to-one customer relationships through the use of customer data and technology.

Her early experience is rooted in marketing, where she worked for companies such as Colgate-Palmolive, Cigna International, May Company, and Glaxo. Patricia holds a B.S. from the University of Illinois and a Master's in International Business from the University of South Carolina.

View my profile on LinkedIn

 

About the Author


Carol Marchesani is a seasoned marketing executive with over 15 years of data management experience. As the leading practitioner of Marketing Data Quality at SAP, she has successfully delivered game-changing data initiatives that have improved the bottom-line results of direct marketing campaigns and overall sales productivity. With a disciplined eye toward scalable, operational processes and syndication of data best practices, Carol has been instrumental in making data a corporate asset at SAP. Among her global data management responsibilities, Carol is currently spearheading the use of contact data to drive intelligent, automated, real-time interactions across multiple channels to convert unknown prospects into loyal customers.

Carol holds a B.A. in Economics from Washington and Lee University.

Session 3:20 - 4:20 pm

Ensuring the Quality of Health Information: The Canadian Experience

Maureen Kelly, Manager, Data Quality, Canadian Institute for Health Information (speaker)
Heather Richards IQCP, Program Lead, Canadian Institute for Health Information
Christina Willemse, Program Lead, Canadian Institute for Health Information

Abstract

High quality health information is critical for quality health care and for effective and efficient management of the health care system. The Canadian Institute for Health Information (CIHI) has over 30 data holdings that provide crucial information to stakeholders, including health system managers, policy makers and clinicians. This presentation provides an overview of the data quality strategies and programs implemented at CIHI to promote high quality information.
 
Key topics include:

  • CIHI’s Data Quality Strategy
  • CIHI’s five dimensions of data quality: accuracy, timeliness, comparability, usability and relevance (two of these are illustrated in the sketch after this list)
  • Examples of data quality strategies in action: prevention, monitoring, feedback and continuous improvement
  • The evolution of CIHI’s data quality program to meet the changing information needs of stakeholders
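
As a hedged sketch of how two of these dimensions might be quantified (our illustration, not CIHI's methodology; the lag allowance and validation rule are hypothetical):

    from datetime import date

    def timeliness(submission_date, period_end, max_lag_days=90):
        """1.0 when data arrive at period end, decaying linearly to 0.0
        as the submission lag approaches the allowed maximum."""
        lag = (submission_date - period_end).days
        return max(0.0, 1.0 - lag / max_lag_days)

    def accuracy(records, is_valid):
        """Share of records passing a validation rule."""
        return sum(1 for r in records if is_valid(r)) / len(records) if records else 0.0

    records = [{"age": 34}, {"age": -2}, {"age": 71}]          # hypothetical rows
    print(accuracy(records, lambda r: 0 <= r["age"] <= 120))   # 0.666...
    print(timeliness(date(2013, 7, 1), date(2013, 5, 31)))     # ~0.656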

Speaker bio

About the Author

Maureen Kelly

Maureen Kelly is the manager of the Data Quality department at the Canadian Institute for Health Information (CIHI). She leads a team of specialists who provide support to internal and external clients through the development of tools and practices to promote high quality health information. Her team provides data quality leadership through collaboration and knowledge exchange. Prior to managing the Data Quality department, Maureen managed the activities and relationships for CIHI’s Home and Continuing Care department. Prior to joining CIHI in 2003, Maureen held positions with Statistics Canada and the Office for National Statistics in the UK.

Why Remediation Matters: Implementing Corrective and Preventive Action (CAPA) as part of Data Governance

George Acosta, Finance Data Steward Executive, Bank of America

Abstract

One of the biggest challenges data governance efforts face today is showing real business results quickly and in a language the business will understand.  In our practice we typically spend significant time articulating a series of actions (glossaries, stewardship, quality measurement…) with a vague promise of benefits somewhere in the future after upfront investment.

Deming figured this out when he created the Plan-Do-Check-Act (PDCA) cycle. PDCA contains the fundamental piece we so often overlook: Act on what you have Checked. This is so basic to the quality movement that it has been enshrined in ISO standards since the 1980s.
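
As a hedged sketch of that "Act on what you have Checked" step (our illustration of the PDCA idea, not Bank of America's CAPA system; the rates and wiring are hypothetical), a monitoring job might open a corrective action whenever a check fails:

    def pdca_check_act(defect_rate, threshold, open_capa):
        """Check: compare the measured defect rate against its threshold.
        Act: open a corrective/preventive action when the check fails."""
        if defect_rate > threshold:
            open_capa(f"Defect rate {defect_rate:.1%} exceeds threshold "
                      f"{threshold:.1%}; root-cause analysis required")
            return False
        return True

    # Hypothetical wiring: in practice open_capa would create a tracked ticket.
    pdca_check_act(defect_rate=0.034, threshold=0.02, open_capa=print)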

This presentation will discuss how Bank of America is using Corrective and Preventive Action (CAPA) in support of Data Governance.  

Topics include:

  • An overview of the process and how it drives change in culture
  • How CAPA links Data Governance actions to business results directly (real world example)
  • How CAPA links defects to business processes directly.

Speaker bio

About the Author


George Acosta is a Senior Vice President at Bank of America where he leads the Finance Data Management & Governance team and is the Finance Data Steward Executive.

George joined the Bank in 2007. Previously, he held leadership roles in manufacturing operations and engineering at Johnson & Johnson and Smith & Nephew, both leading medical device companies, where he developed strong expertise in process improvement and operations management. George started his career as a naval officer, reaching the rank of Lieutenant Commander and serving as Chief Engineer of the USS McInerney (FFG 8), a guided missile frigate.

George received a B.S. in Aerospace Engineering from the United States Naval Academy at Annapolis, MD.  He also attained an M.B.A. degree with a focus in International Business from Thunderbird School of Global Management.  George was certified as a Six Sigma Green Belt at Johnson & Johnson and was a recognized Lean expert.

View my profile on LinkedIn

The Emerging Synergy between Data Quality and Data Visualization

Dr. Mary L. Good, Special Advisor to the Chancellor for Economic Development, University of Arkansas at Little Rock (UALR)

Abstract

In this presentation, key topics will include:

  • Why data quality and data visualization are developing a new synergy that allows corporations and researchers to garner more knowledge from the ever-growing scope of data being collected in all fields
  • How the time-consuming process of “data wrangling” is being supplemented by interactive visualization
  • Examples of new, immersive 3D and animated data visualization software and hardware as a data quality exploration and transformational analysis tool
  • The unique challenges ahead in the new era of “data wrangling” with advanced data visualization

Speaker bio

About the Author

Mary Good's photo

Dr. Mary L. Good, University of Arkansas at Little Rock's Special Advisor to the Chancellor for Economic Development, is the Founding Dean of UALR's College of Engineering & Information Technology and a former Under Secretary for Technology for the Technology Administration in the U.S. Department of Commerce. She is also a former President of the American Chemical Society and a former Senior Vice President of Allied Signal (now Honeywell International).

Her distinguished career also includes serving as past President of the American Association for the Advancement of Science (AAAS), and she is an elected member of the National Academy of Engineering. She has received many awards, including the National Science Foundation's Distinguished Public Service Award, the American Institute of Chemists' Gold Medal, the Priestley Medal from the American Chemical Society, and the Vannevar Bush Award from the National Science Board. Dr. Good is a member of the Science, Technology and Economic Policy (STEP) Board of the National Research Council of the National Academies. She is an elected member of the Swedish Academy of Engineering, the American Philosophical Society, and the American Academy of Arts and Sciences.

Session 4:45 - 5:45 pm KEYNOTE PANEL

Information and Data Quality: Are We Moving Fast Enough?

Panelists
Ted Friedman, Vice President, Research, Gartner Inc.
Thomas Redman, President, Navesink Consulting Group

Moderator
C Lwanga Yonke, Advisor, IAIDQ

Abstract

This year, our closing panel will look to the future and tackle the BIG issues in data quality, including:

  • What role does data play in current urgent global needs (global economy, healthcare, civil liberties, national security etc.)?
  • How do we become better skilled at articulating the links between data and these issues?
  • How do we position Data Quality to address these needs?
  • What structures need to be put in place so the Data Quality community is better able to think globally, plan regionally and act locally?
  • What can we individually commit to do?

This will be a fast-paced, thought-provoking and insightful conversation, with lots of audience participation. Do not miss it!

Panelist bios

About the Author


As a member of Gartner’s Information Management team, Ted Friedman conducts research focused on data quality, data integration, information governance and information management strategy. He works with Gartner clients on building the business case for investing in and furthering their maturity in these areas. In addition, Ted closely follows the related technology markets, advising clients on vendor and tool selection, price negotiation, and emerging deployment approaches for optimal impact and value. Through his work on concepts such as Gartner's Information Capabilities Framework, Ted assists clients with modernization strategies for their information management programs and related technology infrastructure.

View my profile on LinkedIn

About the Author


Dr. Thomas C. Redman, "The Data Doc," is President of Navesink Consulting Group, in Rumson, NJ. After earning his Ph.D. in Statistics from Florida State, Dr. Redman joined Bell Labs, where he formed the Data Quality Lab in the late 80's. Tom started Navesink Consulting Group in 1996 and has been fortunate enough to consult with many of the world's leading organizations.

Dr. Redman has helped thousands understand that quality data, and more recently data governance, are top-line business issues. He has developed some of the most powerful methods of data management and holds two patents. His fourth book, Data Driven: Profiting from Your Most Important Business Asset, was published by Harvard Business Press in 2008.

He is an IAIDQ co-founder and winner of its 2011 Distinguished Member award. 

View my profile on LinkedIn

Moderator bio

About the Author

C. Lwanga Yonke

C. Lwanga Yonke IQCP is a seasoned information quality and data governance leader with more than 24 years of oil industry experience. He has successfully designed and implemented projects in multiple areas, including information quality, data governance, data management, business intelligence, data warehousing, data architecture and document management. His initial experience is in petroleum engineering and operations.

He is a founding member of the International Association for Information and Data Quality (IAIDQ) and currently serves as an Advisor to the IAIDQ Board of Directors. Lwanga is a member of the Society of Petroleum Engineers (SPE) and a senior member of the American Society for Quality (ASQ).

Lwanga is the recipient of the 2008 SPE Western North America Regional Management and Information Award and the 2011 IAIDQ Distinguished Service Award.

An ASQ Certified Quality Engineer, Lwanga holds a BS degree in petroleum engineering from the University of California at Berkeley and earned an MBA from California State University, Bakersfield.

View Lwanga Yonke's profile on LinkedIn

Session 5:45 - 6:30 pm

International Association for Information and Data Quality (IAIDQ) Meeting

Christian Walenta IQCP, President, IAIDQ
Karen Way, Director Member Services, IAIDQ

Abstract

All are invited to attend this meeting to get the latest update on IAIDQ activities. You will also meet IAIDQ members and others who are interested in the association, and network with others who share a keen interest in information quality. IAIDQ leaders will present a brief overview of current goals and objectives, and will solicit your input and ideas for future plans. Opportunities to volunteer will also be discussed. Add your voice to the conversation!

Speaker bio

About the Author

Christian Walenta

Christian Walenta IQCP serves as the President on the Board of Directors of the International Association for Information and Data Quality (IAIDQ). He has held a variety of professional and management positions with IBM in Germany, the UK and the USA. His professional experience of more than 20 years includes information management, data governance, management of supply chain systems, business development and project management. Christian is a recognized leader in Information Management (IM) at IBM and currently works in IBM's Product Information organization. Prior to that, he led IBM's Information Management Office, responsible for providing information management strategy, IM programs and Data Governance enterprise-wide. He implemented Information Quality programs for IBM's Supply Chain and has extensive practical experience in all aspects of Information Management, including setting overall IM strategies, implementing data stewardship and governance programs for complex organizations, and driving Information Management as continuous culture change.

Christian has earned various professional certifications, including the IAIDQ Information Quality Certified Professional (IQCP), the PMI Project Management Professional (PMP), and the IBM Executive Project Manager certification. He holds a Master's degree in Computer Science and Business Administration, as well as an MBA in International and Strategic Management.

View my profile on LinkedIn

About the Author


Karen Way is the Principal and Owner of Three Elm Technology Strategies, a company that specializes in Data Quality, Data Governance and MDM solutions for the healthcare sector. She has extensive experience in healthcare information technology, data quality, data governance, data management, data analytics and master data management from both business and technical implementation perspectives. Her expertise includes business and systems requirements, technical design and development, use case analysis, and business process re-engineering. With an MS in Healthcare Administration, Karen has a proven ability to build and lead effective, successful teams that deliver best in class solutions to improve client satisfaction, organizational cost savings, and workplace productivity.

View my profile on LinkedIn

 

Thursday 7 November, 2013— Post-Conference Tutorials

Thursday tutorials 8:30 am – 4:30 pm

The Ten Habits of Organizations with the Best Data: a Management System for Data Quality

Thomas C. Redman, President, Navesink Consulting Group

Abstract

This tutorial addresses a simple question: "What is it that companies with the best data do differently than everyone else?" Naturally, there is no simple answer. Demanding questions almost never yield simple answers.

Instead, those with the best data do a number of things tolerably well. For example:

  • They focus a significant fraction of their efforts on finding and eliminating the root causes of error, at their source,
  • They focus their attention on the “most important needs of the most important customers,”
  • They measure against those needs,
  • They assign the right management accountabilities to data creators, customers, and process owners, and
  • Their leaderships demand results.

This fast-paced, interactive, workshop-style tutorial builds on this theme. It describes the "ten habits of those with the best data," including some essential "how-tos," some important "here's whys," and, perhaps most critically, "here's how these habits fit together to form a powerful whole."

Speaker bio

About the Author


Dr. Thomas C. Redman, "The Data Doc," is President of Navesink Consulting Group, in Rumson, NJ. After earning his Ph.D. in Statistics from Florida State, Dr. Redman joined Bell Labs, where he formed the Data Quality Lab in the late 80's. Tom started Navesink Consulting Group in 1996 and has been fortunate enough to consult with many of the world's leading organizations.

Dr. Redman has helped thousands understand that quality data, and more recently data governance, are top-line business issues. He has developed some of the most powerful methods of data management and holds two patents. His fourth book, Data Driven: Profiting from Your Most Important Business Asset, was published by Harvard Business Press in 2008.

He is an IAIDQ co-founder and winner of its 2011 Distinguished Member award. 

View my profile on LinkedIn

Leading Change: the Human Dimension

C. Lwanga Yonke, IQCP, Advisor, IAIDQ

Abstract

Effective change leadership is often the differentiator between successful data governance and information quality efforts and less successful ones.  This tutorial provides helpful frameworks, models and tools that IQ and DG leaders can use to achieve success.

Drawing from lessons learned at the frontline, this interactive tutorial goes beyond classical change management concepts to include best practices from marketing and psychology. 

Participants will learn:

  • The peculiar characteristics of data that increase the complexity of data governance projects
  • The critical success factors of effective change
  • How to leverage and reduce resistance
  • How to understand and manage personal and organizational transitions
  • The elements of successful change communication throughout the change lifecycle

Speaker bio

About the Author

C. Lwanga Yonke

C. Lwanga Yonke IQCP is a seasoned information quality and data governance leader with more than 24 years of oil industry experience. He has successfully designed and implemented projects in multiple areas, including information quality, data governance, data management, business intelligence, data warehousing, data architecture and document management. His initial experience is in petroleum engineering and operations.

He is a founding member of the International Association for Information and Data Quality (IAIDQ) and currently serves as an Advisor to the IAIDQ Board of Directors. Lwanga is a member of the Society of Petroleum Engineers (SPE) and a senior member of the American Society for Quality (ASQ).

Lwanga is the recipient of the 2008 SPE Western North America Regional Management and Information Award and the 2011 IAIDQ Distinguished Service Award.

An ASQ Certified Quality Engineer, Lwanga holds a BS degree in petroleum engineering from the University of California at Berkeley and earned an MBA from California State University, Bakersfield.

View Lwanga Yonke's profile on LinkedIn

Step-by-Step Data Governance Strategies

Gwen Thomas, Senior Operations Officer, IFC

Abstract

What’s your Data Governance strategy? If you can’t answer concisely and confidently, this tutorial is for you.  It’s designed for IQ workers who want insight into DG, DG managers who need to start or freshen their programs, and Stakeholders who need to articulate and communicate strategies. Using the DGI Data Governance Framework, we’ll explore the elements of a DG function and its key activities, looking at how to configure a set of services that align with your unique set of cultural and environmental factors. Then, we’ll use step-by-step procedures to collect our decisions into a Data Governance Strategy.

In this highly interactive session, attendees will:

  • Explore key drivers that prompt the move to formal Data Governance
  • Practice building easy-to-quote value statements
  • Identify key governance activities
  • Review governance models and organizational options
  • Dissect common strategy templates, and practice translating their own decisions into a formal strategy document.

Speaker bio

About the Author


Gwen Thomas is a Data Governance pioneer. Founder of the vendor-neutral Data Governance Institute (DGI) and primary author of the DGI Data Governance Framework and guidance materials found at www.datagovernance.com, she has influenced hundreds of programs around the globe. In 2013, Gwen joined the International Finance Corporation, part of the World Bank Group.

Gwen has personally helped many organizations build their Data Governance and Stewardship programs.  Before data got “hot”, she spent a dozen years working in the trenches and in management roles as a systems integration consultant and Knowledge Manager. Prior to that, she was a college teacher, a marketing professional, and a publisher of a literary magazine.

Gwen is a frequent presenter at industry events. Information-Management.com has named her one of "17 Women in Technology You Should be Following on Twitter" (she's @gwenthomasdgi) and has included her in their Data Governance Gurus list.

View my profile on LinkedIn

 

 

 

Photos © Little Rock Convention & Visitors Bureau, used by permission