From Short Stack to Competitive Advantage: IHOP’s Pursuit of Data Quality

What constitutes a short stack of pancakes?

That’s a pertinent question for a breakfast-focused restaurant chain such as IHOP Corp., based in Glendale, CA. The company, which had 2005 annual revenues of $348 million, operates
International House of Pancakes, the third largest family-style restaurant chain in the country, behind Denny’s and Waffle House. The company has over 1,250 restaurants, of which over 99
percent are operated by franchisees.

Restaurant chains want to know what customers buy across an organization, to study buying patterns, regional variations, the popularity of test items on the menu, and customer satisfaction. If a
“short stack” isn’t the same between two IHOPs—one has three pancakes, the other includes four—or isn’t consistently named, then any related data will lose meaning in
aggregate unless time and resources are spent to standardize such information after the fact.

Such data consistency and quality concerns aren’t limited to the food-service industry. Many organizations struggle with what constitutes a “lead” in their marketing automation
software, or a “prospect” or “customer” in a customer relationship management (CRM) system. Especially in large companies, a customer may exist multiple times in multiple
departments’ unconnected systems; one business division may record revenue from a multi-year consulting contract as a lump sum, while another may amortize it over the life of the contract.

Duplicate information and competing definitions, however, complicate a company’s efforts to create what data management gurus like to call “a single version of the truth.”
Multiple versions of the same data, for example, make it difficult to generate precise sales projections, calculate correct compensation for executives or salespeople, correctly report annual
revenue, demonstrate compliance with regulations, and, above all, accurately gauge the state of the business.

To solve this problem, many organizations are launching data quality initiatives to improve the operational metrics they gather, while also creating—or outsourcing—data quality teams.
These teams, often led by a data steward, manage how data is collected, standardized, and shared across the organization. They specify policies for applications into which data is entered, oversee
automated data feeds, periodically reassess all data for cleanliness, and consolidate duplicate records.

The Push for Reliable Data
The growth of data quality teams parallels organizations’ increasing data quality problems. “Especially now that CRM has become more mainstream—people have had their systems for
five-plus years—the enormous interest in business intelligence and analytics just magnifies the problem,” notes Bernard Drost, chief technology officer of Innoveer Solutions Inc., a
customer strategy and solutions consultancy based in Cambridge, MA. Before, companies might not have seen data problems, but once they begin gathering information in a CRM system, which feeds
financial ERP applications (not to mention data warehouses), slight irregularities can become much more evident. “All of a sudden, you’re making $100 million when you should have made
$50 million, or your [sales] pipeline is only $50 million, when you thought it was $100 million.”

A company’s financial data may not be the only information at risk. “In a finance or accounting system, if you have three duplicate records for John Doe, each with a unique invoice,
it’s not always perceived as a problem, because the primary business issue is to get paid for the three invoices,” says Tom Brennan, president at Stalworth, an enterprise data quality
software vendor based in San Mateo, CA. In a CRM system, however, companies need to know which sales and support records refer to the same person, to prevent duplicate marketing efforts, to know
which products the customer owns to avoid trying to sell them the same thing again, and to have a complete and accurate view of any problems customers experience, to better support them.

Some software, however, may be naturally prone to creating more—not fewer—versions of the same data. For one of Stalworth’s customers, for example, Brennan says whenever the ERP
system received an order, it would check to see if the customer already existed in the CRM system. If it didn’t find an existing record, it created one. Due to poor matching capabilities,
however, it rarely found existing records, and, as a result, 80 percent of the records in the CRM system were duplicates. While that’s an extreme case, it highlights the popular notion that
systems can be great duplicators.
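
Brennan’s example boils down to a matching step that is too literal. The sketch below is purely illustrative (the data, function, and ID scheme are invented, not the actual integration Stalworth’s customer used): an exact-match lookup reuses a record only when the incoming name and address are character-for-character identical, so every small variation quietly spawns a new customer.

    # Hypothetical find-or-create step that matches on exact strings only; this
    # is the flaw that turns an integration into a duplicate generator.
    existing_customers = {
        ("ACME CORPORATION", "100 MAIN ST"): "CUST-001",
    }

    def find_or_create(name: str, address: str) -> str:
        """Return an existing customer ID on an exact match, else create a record."""
        key = (name.upper(), address.upper())   # exact string comparison, no fuzziness
        if key in existing_customers:
            return existing_customers[key]
        new_id = "CUST-{:03d}".format(len(existing_customers) + 1)
        existing_customers[key] = new_id        # a near-identical customer becomes record #2
        return new_id

    # "Acme Corp." at "100 Main Street" doesn't exactly match the record above,
    # so the integration silently creates a duplicate instead of reusing CUST-001.
    print(find_or_create("Acme Corp.", "100 Main Street"))   # prints CUST-002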

Companies increasingly want to eliminate duplicate data, or competing versions of data, both to create a more efficient business and to better comply with regulations. The Sarbanes-Oxley Act, for
example, requires company officers to sign off on the integrity of their financial statements. Other regulations, however, also touch on data quality, including the need to recognize high
risk—such as patient data under HIPAA—and limit access to all copies of that high-risk data. Also, don’t forget state do-not-call lists, which prohibit companies from cold-calling
residents who’ve opted out of such communications. Ironically, companies can continue to contact existing customers, even if they’re on a do-not-call list, unless those customers
specifically opt out. This marketing opportunity creates an imperative to have an authoritative record for each customer.

Organic Growth Complicates Data Consistency
So how many pancakes are in a short stack? Today, at any of IHOP’s locations, a short stack always contains three pancakes. Although the price varies by restaurant, the menu item—down
to the picture and description—is standardized. While this might seem like a common-sense approach to supporting so many franchised locations, enforcing this level of consistency is quite new
and not typical for many restaurants.

The food-service industry is renowned for taking a disconnected, bottom-up approach to technology. When you’re operating with tight margins, working long hours in hot environments, managing a
labor force prone to turnover, and constantly trying to maintain proper inventory levels, general data quality concerns—or indeed, any technology concerns beyond the efficacy of the
kitchen’s automatic fire suppression system, ranges, and refrigeration units—may seem remote.

IHOP was no exception, especially because all but seven locations are franchised. Thus individual owners made their own technology choices, and the resulting hodgepodge had data management
repercussions. By the summer of 2003, besides the 200 manual cash registers still in use, IHOP restaurants were using point-of-sale (POS) systems—the modern, often touch-screen updates to the
venerable cash register—from five different manufacturers, two of whom had gone out of business and stopped updating their software.

The lack of modern POS equipment meant IHOP Corp. could only automatically pull about 20 percent of the point-of-sale data into a centralized database for further analysis. Thus, the company was
operating with incomplete information about what customers ordered, when they ordered it, how much they spent, and other information IHOP needs to operate more efficiently—especially as a
publicly traded company. “Our ability to pull data at the restaurant level was absolutely critical,” says Patrick Piccininno, vice president of IT for IHOP Corp. Yet how could IHOP get
such information?

Finding the Business Case
Despite a widespread need for improved data, companies rarely simply set out with the stated goal of gathering better data. Rather, they need a business case, says Ed Abbo, senior vice president of
CRM products for Oracle Corp, based in Redwood City, CA. “The right approach is to essentially have a business initiative, such as ‘To drive revenue growth, let’s increase the
product density—which is essentially how many products a particular customer has—from 1.5 to 2, or 1.5 to 3.’”

One way to create a business case is to find what directly costs your company money, and how better data, data analysis, and data retrieval might help. Brennan mentions one customer, a well-known
consumer-software vendor, with a 2,000-person-strong call center. In the past, when a customer contacted the call center, employees first had to ascertain whether the customer was already in the
system, and if not, create a new record. Yet creating a new record took 45 seconds less than accessing an existing one and became the favored approach no matter what, since employees’
compensation was based on call-resolution speed.

“The reps did the math and stopped trying to find customer records altogether,” says Brennan. The end result: “It was an automatic duplicate generator.” Yet if a large
company can eliminate duplicates, and also shave 10, 20, or 45 seconds off the average time needed to resolve a customer support call, it can theoretically save tens or even hundreds of thousands
of dollars per year in service representatives’ time, and have cleaner data to boot.
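
As a rough illustration of that math, the sketch below works through the savings for a few per-call reductions; the call volume and hourly labor cost are assumptions made for the example, not figures from Brennan.

    # Back-of-envelope estimate; the volume and hourly cost are illustrative
    # assumptions, not numbers from the article.
    affected_calls_per_year = 500_000   # calls that would otherwise hit the slower lookup
    loaded_cost_per_hour = 25.0         # assumed fully loaded cost of a rep's time, in dollars

    for seconds_saved in (10, 20, 45):
        hours_saved = affected_calls_per_year * seconds_saved / 3600
        print(f"{seconds_saved:2d} sec/call -> about ${hours_saved * loaded_cost_per_hour:,.0f} per year")

Under those assumptions, the savings land between roughly $35,000 and $156,000 a year, squarely in the “tens or even hundreds of thousands of dollars” range.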

Finding IHOP’s Customer Data
At IHOP, Piccininno recalls, the overall business driver was: How do you get an organization to think holistically about a customer? The answer for IHOP was to gather better customer data, and so
the company began by standardizing on POS systems from MICROS Systems Inc. By June 2006, it had rolled out POS systems to 700 restaurants. Next, IHOP implemented analytics, reporting, and directory
systems that rely on customer data. Ultimately, this gives employees tools to analyze what customers are doing and allows managers to judge employee performance based on customer-centric metrics.
All told, he says, “It has really forced behaviors and held people accountable to managing some of those pieces.”

To handle many of the technological underpinnings of this more customer-centric approach, IHOP researched software from several vendors and ultimately short-listed technology from Oracle and
Siebel. (It already used software from both.) The company opted for Oracle, and to date has implemented the Oracle E-Business Suite, including the Oracle TeleService, Property Manager,
Contracts, Project Management, and Project Collaboration modules, along with an Oracle Customer Data Hub, which helps standardize its transactional and analytical environments. IHOP uses the hub
to relay data between the E-Business Suite and other enterprise applications, such as the Lawson software currently used for payroll and for amassing some POS and revenue accounting information.

One of the most useful results of this integration has been “what we affectionately refer to as FRED: Franchise and Restaurant Enterprise Directory,” says Piccininno. “FRED is a
custom, Web-based application built using OC4J (Oracle Containers for J2EE).” It works with Oracle Portal, is deployed via the Oracle Application Server, and also interfaces with Oracle
Customer Hub.

Ultimately, FRED provides “an easy way to view and maintain all key attributes and relationships of the IHOP franchisees, their restaurants, and the IHOP corporate staff,” he says.
Today, it contains roughly 85 percent of the 250–300 data points tracked for any given restaurant, including daily transaction logs, information about franchisees, who is responsible for
maintaining each manually updated data point in FRED, and who manages each automated data feed’s consistency and accuracy.
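
FRED’s internal design isn’t spelled out in detail, but the shape of such a directory record is easy to picture. The sketch below is purely illustrative (the field names, IDs, and values are invented, not IHOP’s schema): each attribute carries the person or team accountable for it, mirroring the point that every manually updated data point and every automated feed has a named owner.

    # Hypothetical directory record in the spirit of FRED; every attribute
    # names who is accountable for keeping it accurate and up to date.
    from dataclasses import dataclass, field

    @dataclass
    class Attribute:
        name: str
        value: str
        owner: str               # person or team accountable for this data point
        source: str = "manual"   # "manual" or "automated feed"

    @dataclass
    class RestaurantRecord:
        restaurant_id: str
        franchisee: str
        attributes: list[Attribute] = field(default_factory=list)

    record = RestaurantRecord(
        restaurant_id="IHOP-0417",
        franchisee="Example Restaurant Group LLC",
        attributes=[
            Attribute("seating_capacity", "120", owner="Franchisee Operations"),
            Attribute("daily_transaction_log", "POS feed 0417", owner="IT",
                      source="automated feed"),
        ],
    )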

As a result, IHOP already has a better view of its business-critical data. By summer 2006, it was able to pull information from over 80 percent of its restaurants’ POS systems automatically,
and Piccininno anticipates full data integration by December 2007.

Controlling the Menu
These data management processes have created other opportunities, such as retooling IHOP’s menu management processes. Previously, when a franchise owner needed new menus— perhaps the
old ones were tattered or prices needed changing—the owner would mark up the old menu and mail it in. The menu department recorded the changes, sent a file to IT, and ordered new menus from
the printer. Changes might also be faxed back and forth. Then, IT would attempt to enter changes into a centralized database, to know what each restaurant had on its menu, and thus decode
individual restaurants’ POS data feeds.

With this approach, however, it was difficult to enforce data consistency or accurately track data, and the manual processes were especially time-consuming and prone to error. “Our old menu
creation process was an absolute disaster,” acknowledges Piccininno. Even so, “Up until last year, that’s the way the process worked for 47-plus years.”

In 2005, however, IHOP implemented an online menu-ordering system, and everything changed. Now franchise owners order their menus twice annually, and only via the online tool. From a data
management perspective, this online process is elegant simply because it limits restaurant owners’ options: the menu tool interfaces with a centralized database, administered by IHOP, that
contains all menu data points. As a result, IHOP can enforce item names and descriptions, require franchises to carry core IHOP items, and let franchisees select from a specific list of
standard menu items, manager’s specials, and regional variations, while still letting them set their own prices. Because IHOP then knows those prices, whatever options and prices owners
select end up not only on their printed menus but also on their restaurants’ POS touch screens.

IHOP abstracted the range of menu possibilities into a set of controlled parameters, says Piccininno. “I’ll liken it to an item master—where you have your menu items locked and
consistently managed—so if you’re rolling up 1,000 restaurants’ menus, it refers to a short stack of pancakes in the same way.”
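
A toy version of that item-master idea (invented identifiers and prices, not IHOP’s actual data model) separates item identity, which is defined once and centrally, from local pricing, which is the only piece each restaurant controls.

    # Illustrative item master: names and descriptions live in one central table;
    # each restaurant attaches only a price to the central item ID.
    ITEM_MASTER = {
        "SHORT_STACK": {"name": "Short Stack", "description": "Three buttermilk pancakes"},
        "COFFEE":      {"name": "Coffee",      "description": "Regular or decaf"},
    }

    # Per-restaurant data is limited to pricing, keyed to master item IDs.
    restaurant_prices = {
        "IHOP-0417": {"SHORT_STACK": 4.99, "COFFEE": 2.29},
        "IHOP-0922": {"SHORT_STACK": 5.49, "COFFEE": 2.49},
    }

    def menu_for(restaurant_id: str) -> list[str]:
        """Roll up one restaurant's menu; a short stack can only mean the central item."""
        return [
            f"{ITEM_MASTER[item_id]['name']}: ${price:.2f}"
            for item_id, price in restaurant_prices[restaurant_id].items()
        ]

    print(menu_for("IHOP-0417"))   # ['Short Stack: $4.99', 'Coffee: $2.29']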

Creating such centralized enforcement, however, wasn’t easy. “Just the notion of our being able to manage a central database of items, and to give you, the franchisee, local control of
items—not to manage the items themselves, but just the pricing—that took a lot of work,” says Piccininno. In particular, IHOP had to standardize on (and implement) new POS
equipment and work with MICROS to make the software changes necessary to accommodate the distributed menu-management process.

Now, however, IHOP can quickly push just menu changes down to its POS systems. “Because we’ve re-architected the menu items from MICROS in a way that allows us to download the new menu
items, not the old, all you have to do is change eight menu items, and you’re done,” says Piccininno. “Now, I hate to say the words out loud, but it’s becoming more and more
mundane, very routine, it’s no big deal, and that’s the best news of all.”
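
The underlying idea, downloading only a delta rather than the whole menu, can be sketched in a few lines with hypothetical data and a hypothetical function (not the actual MICROS interface):

    # Illustrative delta push: compare the menu already deployed on a POS
    # terminal with the new configuration and send only what changed.
    def menu_delta(deployed, updated):
        """Return items whose price changed, were added, or were removed."""
        delta = {}
        for item_id, price in updated.items():
            if deployed.get(item_id) != price:
                delta[item_id] = price          # new or re-priced item
        for item_id in deployed:
            if item_id not in updated:
                delta[item_id] = None           # item dropped from the menu
        return delta

    deployed = {"SHORT_STACK": 4.99, "COFFEE": 2.29, "OMELETTE": 8.99}
    updated  = {"SHORT_STACK": 5.29, "COFFEE": 2.29, "SEASONAL_SPECIAL": 7.49, "OMELETTE": 8.99}

    # Only the two changed items travel to the POS, not the entire menu.
    print(menu_delta(deployed, updated))   # {'SHORT_STACK': 5.29, 'SEASONAL_SPECIAL': 7.49}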

Of course, IHOP also had to sell franchise owners on the time and effort needed to standardize on this POS system. “We’ve tried to craft a value proposition around the new capabilities
of the MICROS system,” says Piccininno, and in particular around essential business metrics. For example, IHOP plans to make the MICROS information available via a portal for its franchise
owners, so they can slice and dice point-of-sale data, including check information, promotion efficacy, and other operational metrics.


Data Quality Team Tasks

A data quality team typically tackles three problems: data duplication; standardization of information (often related to mailing addresses), data definitions, and data formats; and data
completeness.

“Especially if you want to talk data analytics, if no one fills a database column, you’re out of luck,” notes Bernard Drost, chief technology officer of Innoveer Solutions
Inc.

Reconciling duplicate records, however, may be the data quality team’s hardest ongoing challenge. For example, in a typical CRM system, Drost points out, 30 to 40 percent of customer
records are duplicates unless “companies are really, really good and really strict,” in which case it might only be 20 percent. Even so, “especially if you have millions
of records, it becomes a very big problem.” (Note that older mainframe environments with years of never-before-cleaned data may have even worse data quality problems.)

A variety of tools can help organizations prune duplicate records, but they can’t remove them all. Drost says that, in general, about 50 percent of duplicates can be removed more or
less automatically, and another 25 percent with some manual effort, using tools that apply textual logic to the information, from vendors such as Firstlogic Inc. (recently acquired by
Business Objects SA), Group 1 Software, Identity Systems (now part of Nokia, and formerly known as SearchSoftwareAmerica), QAS Systems Ltd, and Stalworth Inc. The remaining roughly 25
percent require vetting each record individually to determine which are duplicates, and whether and how they should be merged, decisions that may come down to a value
judgment. “That’s the problem with data quality, and the reason why you need to identify someone who’s the data owner.”
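
As a rough illustration of how that textual logic works, the sketch below uses nothing more than Python’s standard difflib (not any vendor’s matching engine), with assumed thresholds, to sort candidate pairs into the three tiers Drost describes: merged automatically, sent to manual review, or left as distinct records.

    # Illustrative three-tier triage of candidate duplicate pairs using
    # simple textual similarity; real tools tune this far more carefully.
    from difflib import SequenceMatcher

    def similarity(a: str, b: str) -> float:
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    def triage(record_a: str, record_b: str) -> str:
        score = similarity(record_a, record_b)
        if score >= 0.92:      # thresholds are assumptions for the example
            return "auto-merge"
        if score >= 0.75:
            return "manual review"
        return "distinct"

    pairs = [
        ("John Doe, 12 Elm St, Springfield", "John Doe, 12 Elm Street, Springfield"),
        ("John Doe, 12 Elm St", "John A. Doe, 12 Elm Street"),
        ("John Doe", "Jane Roe"),
    ]
    for a, b in pairs:
        print(f"{triage(a, b):13s} {a!r} vs {b!r}")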

Beyond cleaning existing data, to maintain the integrity of data before it gets into systems, “you want to try and intercept the data you use, and see [if] there are any dupes also
propagating to [the] warehouse,” says Drost. For example, when a customer service representative enters a customer’s first and last name into a customer-support system, the
software might present any matches or suspected matches against customers already in the database. That prevents additional duplicates from being created, and lets
representatives flag any potential duplicates they find for the data quality team.
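
That kind of interception amounts to a search-before-create step. A minimal sketch, built on a hypothetical customer index rather than any specific CRM’s API, might surface likely matches as soon as a name is keyed in:

    # Illustrative search-before-create check: surface close name matches so the
    # rep reuses an existing record (or flags a suspect) instead of creating one.
    from difflib import get_close_matches

    customer_index = {
        "john doe": "CUST-001",
        "johnny doe": "CUST-002",
        "jane roe": "CUST-003",
    }

    def suspected_matches(first: str, last: str, cutoff: float = 0.8) -> list[str]:
        """Return IDs of existing customers whose names closely match the entry."""
        entered = f"{first} {last}".lower()
        names = get_close_matches(entered, customer_index.keys(), n=5, cutoff=cutoff)
        return [customer_index[name] for name in names]

    # Entering "Jon Doe" surfaces the existing John Doe records for the rep to
    # pick from or flag, rather than silently minting a fourth customer record.
    print(suspected_matches("Jon", "Doe"))   # ['CUST-001', 'CUST-002']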

Intercepting data before it gets into the system is critical, especially when the information in question is financial. “If you run an opportunity report on a sales pipeline, and the
same opportunity is in there twice, then your pipeline is going to be off,” Drost warns.

Appointing a Data Steward
Driven by the need to improve operating efficiencies and comply with regulations, many companies, like IHOP, are launching their own data quality initiatives, which typically have two components:
deploying a new system to capture and report on the needed data, and ensuring that data stays clean and consistent.

Increasingly, data quality teams—often led by a data steward—are ensuring cleanliness as well as consistency. “Generally a data steward has data analysts working for them, and
they have a great grasp of data, cut horizontally across several departments, and drive compliance and consistency of format across the organization,” notes Stalworth’s Brennan.

Data stewards are especially useful for defusing questions about where data should live, and who “controls” it, and they refocus the discussion on what’s needed to maintain clean
and consistent data of maximum usefulness to the company as a whole. In particular, notes Abbo, “There’s a need for a central team that defines the terms and essentially maps terms
across the organization, so there’s a common understanding of terms across the organization.”

How IHOP Manages Data Quality
At IHOP, the project management team, led by one person, oversees the data quality initiative—the process of ultimately ensuring each data point is accurate. To help, each functional team in
IHOP—including accounting, legal, franchisee operations, property management, IT, franchisee property development, HR, marketing, and field operations—has a regional data owner who is
responsible (and held accountable) for all relevant attributes in FRED, and for attending the FRED steering meetings that occur at least every six weeks.

Beyond that, “The IT executive steering committee—composed of all the executive stakeholders within the organization that provide oversight into all the cross-functional technology
initiatives in place—holds them accountable,” says Piccininno. The group, which meets at least monthly, always gets a FRED update: “Who didn’t update? Is anyone standing in
their way? Are automated data feeds having issues?”

Because of the data steward’s purview, getting senior-management sign-off for data quality projects is essential. “Where I’ve seen this be successful is if it’s driven from
the top of an organization down, meaning the CEO or the COO is driving an initiative that is looking for the answers to these business questions,” says Abbo. “Where we’ve seen
these initiatives not really get traction is if they’re driven from one [internal] organization, from one IT organization, or from IT itself; it’s much harder to make them
successful.”

Still, feathers occasionally get ruffled. For example, at IHOP, “every once in a while you will observe a little bit of finger-pointing between some of the various
functions—that’s my data, you’re not touching my data,” says Piccininno. By and large, however, thanks to senior management’s backing of the data quality initiatives and
careful oversight, he says, most employees and franchise owners are on board.

Data Quality Is Just the Beginning
While ensuring data is accurate and reliable helps solve current business problems, clean data also lays the foundation for future endeavors. For example, with IHOP’s franchise owners,
“We’re always really good about asking them for information, but we’ve never been, in my opinion, really good about sharing it,” Piccininno admits.

Next year, IHOP plans to make an IHOP.net portal available to franchise owners. With the help of identity management software, it will present them a portal customized for the restaurant or
restaurants they own, and they will be able to view all information IHOP captures about their location as well as business-wide trends.

When sharing these types of business metrics, “data quality, as you can imagine, is going to be pretty damn critical,” says Piccininno; yet it will also create new opportunities.
“To see all this valuable operating data, so they can become more successful restaurant operators—that’s really the next step.”

Mathew Schwartz is a freelance technical writer based in Massachusetts. mat@penandcamera.com
