
Facebook's big data plans include warehouses, faster analytics

At an industry conference, an engineer explains how the site is working to make its back-end data processing more efficient


Facebook may treasure the data it holds on its one billion-plus users for the advertising returns it brings, but analyzing that data is expected to keep posing challenges over the coming year, an engineer said.

The problems, which Facebook has been forced to grapple with "much sooner than the broader industry," include finding more efficient ways to process user behavior on the site, better accessing and consolidating different types of data across Facebook's multiple data centers, and devising new open source software systems to process that data, Ravi Murthy, who manages Facebook's analytics infrastructure, said Tuesday.

"Facebook is a data company, and the most obvious thing people think of on that front is ads targeting," he said at an industry conference in San Francisco, during a talk on Facebook's back-end infrastructure, data analytics and open source projects.

"But it goes deeper than this," he said.

One major area of behind-the-scenes work relates to Facebook's analytics infrastructure, which is designed to accelerate product development and improve the user experience through deep analysis of all the available data, whether that data consists of actions users take on the site, such as posting status updates, or the applications they use within Facebook on different devices.

Facebook currently uses several open source software systems, known as Hadoop, Corona and Prism, to process and analyze its data; the company will focus on making those systems faster and more efficient over the next six to twelve months, Murthy said.

Many of the company's challenges are tied to what Facebook refers to as its data warehouse, which combines data from multiple sources into a database where user activity can be analyzed in the aggregate, such as by giving a daily report on the number of photos that have been tagged in a specific country, or looking at how many users in a certain area have engaged with pages that were recommended to them.

The analysis is designed to optimize the user experience and find out what users like and don't like, but it is also becoming more taxing as Facebook gains access to more and more data about its users, Murthy said. Currently, the Facebook warehouse takes in 500 terabytes of new data every day, or 500,000 gigabytes. The warehouse has grown nearly 4,000 times in size over the last four years, "way ahead of Facebook's user growth," Murthy said.
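The kind of aggregate report Murthy described, such as a daily count of photos tagged per country, boils down to a group-and-count over event logs. The sketch below is purely illustrative: the record fields and function name are invented for this example and say nothing about Facebook's actual warehouse schema or tooling, which, per the article, is built on Hadoop-based systems.

from collections import Counter
from datetime import date

# Hypothetical, simplified log records standing in for warehouse rows.
# The field names are illustrative only.
tag_events = [
    {"day": date(2013, 2, 26), "country": "GB", "photo_id": 101},
    {"day": date(2013, 2, 26), "country": "GB", "photo_id": 102},
    {"day": date(2013, 2, 26), "country": "US", "photo_id": 103},
    {"day": date(2013, 2, 27), "country": "US", "photo_id": 104},
]

def daily_tags_by_country(events):
    """Count photo-tag events grouped by (day, country)."""
    return Counter((e["day"], e["country"]) for e in events)

for (day, country), n in sorted(daily_tags_by_country(tag_events).items()):
    print(f"{day} {country}: {n} photos tagged")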

To deal with these issues, Facebook has developed its Prism software system, which is designed to perform key analysis functions across the company's data centers worldwide, and split up the analyses into "chunks," Murthy said. That way, performing an analysis on, say, some metric related to users' news feeds won't clog up the warehouse more generally.
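Murthy's description of Prism, running pieces of an analysis in separate data centers and then combining the results, follows the familiar partial-aggregation pattern. The sketch below is a generic illustration of that pattern, not Prism itself; the data-center names, event labels and functions are invented for the example.

from collections import Counter

# Hypothetical per-data-center slices of news-feed event logs.
datacenter_events = {
    "dc-east": ["story_click", "story_click", "story_hide"],
    "dc-west": ["story_click", "story_like"],
    "dc-europe": ["story_like", "story_like", "story_click"],
}

def local_aggregate(events):
    """Run inside one data center: summarize events without shipping raw logs."""
    return Counter(events)

def global_merge(partials):
    """Run centrally: combine the small per-data-center summaries."""
    total = Counter()
    for partial in partials:
        total += partial
    return total

partials = [local_aggregate(ev) for ev in datacenter_events.values()]
print(global_merge(partials))

Because only the compact per-site summaries travel between sites, an analysis of one metric stays contained to its "chunk" rather than loading the warehouse as a whole.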

"We're increasingly thinking about how to capture this data," he said.

The company is also working on a system that takes a completely different approach to querying the warehouse, aiming to return results within a matter of seconds, Murthy said.

Another area Facebook is continually looking to improve is its "transactional infrastructure," which handles the more basic, day-to-day processing of, say, likes, comments and status updates to keep the social network running smoothly. Some of the questions the company's engineers and analysts are looking at include how to forecast the actual growth in this type of data, and how much computing capacity Facebook should really allot for it, Murthy said.

"Can we predict what it's going to be six months from now?" he said.

Meanwhile, Facebook is also involved in a long-term effort to make its physical servers more efficient. The company began its Open Compute Project in 2011, with the goal of designing modularized servers that give customers greater control over the networking, memory, power supplies and other components that go into their servers. It was expanded to incorporate ARM processors in January.


