
Web Analytics is Dependent Upon Valid Data

Can you trust the data that's used in your Web site analytics? comScore released the results of a study analyzing the validity of using cookie-based data to measure the number of unique visitors to individual Web sites or to gauge the number of unique users that were served an advertisement by an ad server.

The study, based on an analysis of 400,000 home PCs included in comScore's U.S. sample during December 2006, examined both first-party and third-party cookies. A cookie is a small text file placed on a user's computer by a Web server, and it is unique to that computer's web browser (e.g., Internet Explorer or Firefox).

Cookies are often used by Web servers to identify users and to authenticate, track, and maintain specific information about them. First-party cookies are those left on a computer by the Web site being visited, while third-party cookies are those left by a domain different from the site being visited, such as an advertising server that has just delivered an ad to the computer, or certain third-party tools used to measure site traffic.
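To make that mechanism concrete, here is a minimal sketch, using only Python's standard library, of how a server deposits a first-party cookie so it can recognize a returning browser. The cookie name visitor_id and the server details are hypothetical illustrations, not drawn from the comScore study:

```python
# Minimal sketch of a server that sets and reads a first-party cookie.
# The "visitor_id" cookie name is a hypothetical example.
import uuid
from http.cookies import SimpleCookie
from http.server import BaseHTTPRequestHandler, HTTPServer

class CookieDemoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Parse any cookie the browser sent back from a prior visit.
        cookie = SimpleCookie(self.headers.get("Cookie", ""))
        visitor_id = cookie.get("visitor_id")

        self.send_response(200)
        if visitor_id is None:
            # No cookie present: the server treats this as a "new" visitor
            # and deposits a fresh identifier. If the user later clears
            # cookies, the next request repeats this branch and the log
            # counts the same person as a second unique visitor.
            new_id = uuid.uuid4().hex
            self.send_header("Set-Cookie", f"visitor_id={new_id}; Path=/")
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        body = "returning visitor" if visitor_id else "new visitor"
        self.wfile.write(body.encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), CookieDemoHandler).serve_forever()
```

Note that the server has no way to tell a genuinely new visitor from a returning visitor who cleared cookies, which is precisely the blind spot the study quantifies.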

For the purposes of this study, comScore analyzed the cookies from one prominent Web property and one third-party ad server, each representative of the total U.S. Internet audience and each reaching well in excess of 100 million Internet users every month. The study examined the degree to which users cleared cookies from their computers, thereby causing servers to deposit new cookies and potentially leading to overstated estimates of unique users when relying on cookie-based server data.

comScore observed that 31 percent of U.S. Internet users cleared their first-party cookies during the month. Within this user segment, the study found an average of 4.7 different cookies per computer for the site. Among the 7 percent of computers with at least 4 cookie resets, comScore counted an average of 12.5 distinct first-party cookies per computer, accounting for 35 percent of all cookies observed in the analysis.

Using the total comScore sample as a basis, an average of 2.5 distinct first-party cookies were observed per computer for the site being examined. This indicates that Web site server logs that count unique cookies to measure unique visitors are likely to be exaggerating the size of the site's audience by a factor as high as 2.5, or an overstatement of 150 percent.
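As a quick back-of-the-envelope check (my own arithmetic, not part of the comScore report), the figures hang together: 7 percent of computers averaging 12.5 cookies each contributes 0.07 × 12.5 ≈ 0.875 cookies per computer, which is 35 percent of the overall 2.5-cookie average. And counting one cookie as one visitor inflates the audience as follows, using a hypothetical 1,000-person audience:

```python
# Back-of-the-envelope check of the comScore figures.
# The 1,000-visitor audience below is hypothetical, for illustration only.
cookies_per_computer = 2.5   # average distinct first-party cookies observed
resetter_share = 0.07        # share of computers with 4+ cookie resets
resetter_cookies = 12.5      # average cookies on those computers

# Serial resetters' contribution to all cookies observed (~35%):
share_of_cookies = resetter_share * resetter_cookies / cookies_per_computer
print(f"Serial resetters account for {share_of_cookies:.0%} of cookies")

# Inflation when a server log counts each cookie as a unique visitor:
true_visitors = 1000
reported_uniques = true_visitors * cookies_per_computer      # 2,500 "uniques"
overstatement = (reported_uniques - true_visitors) / true_visitors
print(f"{reported_uniques:.0f} reported vs {true_visitors} actual "
      f"-> {overstatement:.0%} overstatement")               # 150%
```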

Now, I believe this insight shouldn't be surprising news to anyone who understands that web analytics rely on data sources that may, or may not, be one hundred percent accurate. Moreover, certain web site visitors are more prone to being over-counted than others. For example, experienced Firefox users can easily set the browser to clear all usage data every time the application is closed (i.e., daily, at a minimum).

"While past studies from other research companies have shown a similar proportion of computers that clear their cookies, the comScore study is the first to highlight the disproportionately high percentage of cookies represented by those computers," commented Dr. Magid Abraham, President and CEO of comScore.

"For example, with just 7 percent of computers accounting for 35 percent of all cookies, it's clear that a certain segment of Internet users clears its cookies very frequently. These serial resetters have the potential to wildly inflate a site's internal unique visitor tally, because just one set of eyeballs at the site may be counted as 10 or more unique visitors over the course of a month. The result is a highly inflated estimate of unique visitors for sites that rely on cookies to count their audience."

That said, when compared to other forms of marketing and advertising, online media still offers far greater accountability and measurement validity. In fact, the current state of the art in radio and TV advertising measurement has long raised unanswered questions about the apparent inability to accurately state the true audience size.
