Dell eyes $50 million in savings; standardizes data platforms and business intelligence tools


Dell is currently consolidating data platforms and BI tools while giving business users more options for self-service and deep analytics.

“When we looked at that $70 million BI spend in the business, 70% of it — or about $50 million — was spent taking data out of the IT environment and putting it back together in their own databases and customer-support systems. We looked at that and asked, ‘How can we reduce that big, $70 million chunk?’” said Rob Schmidt, Executive Director for IT, Analytics, and Business Intelligence.

Dell announced its BI plan two years ago, after surveying more than 130 business executives to assess what was being spent on data inside and outside of IT. The study found that more than $50 million was being spent annually on IT-managed data warehousing, BI, and analytics tools, and that the business units were spending almost $70 million more on independent data management, BI, and analytics initiatives.

One way of reducing this spend is to build a developer-friendly architecture around centralized resources: a 200-plus-terabyte Enterprise Data Warehouse (EDW) and a SAS analytics environment. The idea was to

  • give business users better access to data
  • provide more development and analysis capabilities
  • allow more autonomy within the IT-supported centralized framework

To support this, Dell launched a three-tier architecture 18 months ago.

“We’re moving to a platform where the business develops its own applications, but they migrate them up into our second tier, where we automate and put processes around the applications. The third tier is our production data warehouse, which the business can’t touch.”
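The tier boundaries Schmidt describes can be thought of as a simple promotion workflow. The Python sketch below is purely illustrative; the tier names, the AnalyticsApp structure, and the review rule are assumptions made for the example, not details of Dell's implementation.

```python
from dataclasses import dataclass
from enum import Enum


class Tier(Enum):
    SANDBOX = 1      # tier 1: business units develop their own applications
    MANAGED = 2      # tier 2: IT automates and puts processes around the app
    PRODUCTION = 3   # tier 3: the production warehouse, which the business can't touch


@dataclass
class AnalyticsApp:
    name: str
    owner_unit: str           # e.g. "Marketing" -- hypothetical label
    it_reviewed: bool = False
    tier: Tier = Tier.SANDBOX


def promote(app: AnalyticsApp) -> AnalyticsApp:
    """Move an application up one tier, enforcing the hand-off implied in the
    article: IT review before the managed tier, and nothing beyond production."""
    if app.tier is Tier.SANDBOX:
        if not app.it_reviewed:
            raise ValueError(f"{app.name}: needs IT review before promotion")
        app.tier = Tier.MANAGED
    elif app.tier is Tier.MANAGED:
        app.tier = Tier.PRODUCTION
    else:
        raise ValueError(f"{app.name}: already in production")
    return app
```

In this model, the business unit owns everything in the sandbox tier, and IT's hand-off checks only come into play when an application is promoted.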

This initiative has not changed the technologies drastically: Teradata, SAS, MicroStrategy (for production reporting), and Tableau Software (for ad-hoc data exploration) were already in use. The biggest challenge was business process re-engineering, because business units signing on would no longer be able to extract data and go their own way.

“We’re teaching and certifying each business unit to develop against our data platforms. And there’s a formal agreement between myself and a VP in each business that lays out what you can and can’t do on the platform.”

“Thou shalt not” rules limit data extraction from the warehouse and queries against certain tables that could bring the database down. Certified DBAs and developers within each business unit know that Schmidt can ‘take away access in a heartbeat’ if something impacts the production environment. In exchange for giving up their silos, Schmidt says, business units get access to more data and better capabilities than they ever had in their independent reporting and analytical environments. Among the new capabilities are a Hadoop cluster for high-scale, low-cost storage and a Teradata Aster database for rapid analytical modeling. Semi-structured sources now being handled include social network data, clickstreams from the Dell.com site, and machine sensor data from Dell equipment in the field.
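For semi-structured sources like these, a landing step on a Hadoop cluster might look roughly like the PySpark sketch below. The paths, column names, and partitioning scheme are assumptions for illustration, not Dell's actual pipeline.

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative only: the paths, column names, and layout are assumptions,
# not details from the article.
spark = SparkSession.builder.appName("clickstream-landing").getOrCreate()

# Land raw, semi-structured clickstream events (JSON) on the Hadoop cluster.
clicks = spark.read.json("hdfs:///landing/dell_com/clickstream/*.json")

# Keep the fields needed downstream and partition by date for cheap, scalable storage.
(clicks
    .select("session_id", "customer_id", "url", "referrer", "event_ts")
    .withColumn("event_date", F.to_date("event_ts"))
    .write.mode("append")
    .partitionBy("event_date")
    .parquet("hdfs:///warehouse/clickstream_parquet"))
```

The point of a step like this is simply to keep raw, high-volume data on cheap storage in a queryable format before any modeling happens downstream.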


“We used to run customer scoring in batch, and we would update it every two or three months, but now we’re doing near-real-time scoring.”

According to Schmidt, these new tools let Dell support faster, more frequent, and more pervasive analysis. For example, a marketing customer-propensity-score analysis that previously required pages of SQL and 16 days of data prep on Teradata takes less than five minutes on Aster. Aster is also taking on a complex, high-scale analysis used to spot customers with expiring warranties, a task that used to be outsourced to a third-party service company.

“We paid that vendor a significant amount of money, yet we were only able to do the analysis for the Americas. With these savings alone we have paid for the new environment and we’ve rolled out to all of our regions.”
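The shift from periodic batch scoring to the near-real-time scoring Schmidt describes amounts to evaluating a model per customer as events arrive, rather than re-scoring the whole customer base every two or three months. The sketch below is a hypothetical illustration; the feature names, weights, and logistic form are invented for the example and are not Dell's scoring model.

```python
import math
from typing import Dict

# Hypothetical propensity model: features and weights are invented for the sketch.
WEIGHTS: Dict[str, float] = {
    "recent_site_visits": 0.8,
    "open_support_tickets": -0.4,
    "days_since_last_purchase": -0.01,
}
BIAS = -1.5


def propensity_score(features: Dict[str, float]) -> float:
    """Logistic score in [0, 1]; higher means more likely to buy."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))


def score_event(customer_id: str, features: Dict[str, float]) -> None:
    """Near-real-time path: score one customer as an event arrives,
    instead of re-scoring the whole base in a months-long batch cycle."""
    print(customer_id, round(propensity_score(features), 3))


score_event("cust-001", {"recent_site_visits": 4, "days_since_last_purchase": 30})
```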

Dell has also completed a social media proof-of-concept (POC) project in which social media data captured through the Salesforce.com Radian6 monitoring environment is used to develop a single view of Dell customers and prospects, tying transactional records to social media handles and email addresses. According to Schmidt, this effort will reduce the cost of fragmented data-retention and analysis efforts across departments.
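Conceptually, a single customer view like this is a join of transactional records with social media records on a shared key such as email address. The pandas sketch below is illustrative only; the file names and columns are assumptions, not the Radian6 data model.

```python
import pandas as pd

# Hypothetical inputs: file names and columns are assumptions for the sketch.
transactions = pd.read_csv("transactions.csv")   # customer_id, email, order_id, amount
social = pd.read_csv("radian6_mentions.csv")     # handle, email, mention_text, sentiment

# Normalize the join key (email) before matching.
transactions["email"] = transactions["email"].str.lower().str.strip()
social["email"] = social["email"].str.lower().str.strip()

# Tie transactional records to social media handles via shared email addresses.
single_view = transactions.merge(
    social[["email", "handle", "sentiment"]],
    on="email",
    how="left",
)

# One consolidated record per customer: total spend plus any linked social handles.
summary = (single_view
           .groupby("customer_id")
           .agg(total_spend=("amount", "sum"),
                handles=("handle", lambda s: sorted(set(s.dropna())))))
```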

Further waves of POC projects are aimed at improving customer service through machine data analysis. For example, Dell is optimizing the customer experience on Dell.com by analyzing clickstream data that reveals site navigation patterns and dead ends that turned into service phone calls. A predictive maintenance POC is aimed at spotting failing hard drives and other equipment before they become a problem for customers; this ‘Internet of Things’-style deployment will tap into sensor data from Dell hardware in the field.
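A predictive maintenance check of this kind can be sketched as a simple rule over recent drive telemetry. The field names and thresholds below are hypothetical stand-ins, not whatever model the POC actually uses.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class DriveTelemetry:
    """Hypothetical sensor reading from a drive in the field; the field names
    are illustrative, not Dell's actual telemetry schema."""
    serial: str
    reallocated_sectors: int
    read_error_rate: float
    temperature_c: float


def at_risk(readings: List[DriveTelemetry],
            sector_limit: int = 50,
            error_limit: float = 0.05) -> List[str]:
    """Flag drives whose recent telemetry suggests impending failure so a
    replacement can be scheduled before the customer notices a problem."""
    flagged = []
    for r in readings:
        if r.reallocated_sectors > sector_limit or r.read_error_rate > error_limit:
            flagged.append(r.serial)
    return flagged
```

In practice a POC like this would likely replace the fixed thresholds with a trained model, but the flow — collect field telemetry, score it, and open a service action early — is the same.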

“We’re seeing a massive reduction in data leaving the data warehouse, and we’re also seeing a massive increase in the amount of analytics that can be done. Because the data is shared across the company, there’s very little that I have to do to expand this to other groups.”

The shift to a centralized approach has not been hassle-free. Dell was expected to lay off more than 1,000 employees as it consolidated redundant organizations and maintained fewer independent databases and software instances; a Dell spokesman later said the affected employees were reassigned to avoid layoffs. So far, Dell’s marketing, sales, and service organizations are all on board, and Schmidt is looking to certify more groups and reach the $50 million savings goal within the next year.