What Happens When Google Meets Statistics?

Austin, Texas – https://www.localsurgemedia.com/

Citations have often been described as the “links” of local SEO: when a business’s name, address, or phone number is mentioned on the web, Google treats the mention as a sign of the business’s trustworthiness and relevance. But a recent Whiteboard Friday at SEOmoz suggested that links are, on some level, being replaced by “co-citation” in the traditional search results.

Back in January of 2009, Google’s chief economist, Hal Varian, told McKinsey Quarterly, “I keep saying the sexy job in the next ten years will be statisticians.” Statistics, if you’re unfamiliar with it, is the discipline of using mathematical methods to sift through data for patterns.

Sometimes we forget that Google has access to virtually the entire internet, not just its link profile. That trove of data is a statistician’s wonderland. Panda’s ability to tell quality pages from junk pages is almost certainly the result of statistical analysis. Google’s Knowledge Graph almost certainly uses patterns in search behavior to identify what else you might be interested in. And most SEOs are now convinced that Google is using patterns in user behavior as part of its algorithm.

Here are some of the ways Google might already be using, or could at some point in the future use, its vast stores of data to rank pages based on patterns other than links:

– As Rand suggested, “co-citation,” in which a company is frequently mentioned, with or without a link, alongside certain keywords (see the first sketch after this list)

– Brands that are often mentioned in conjunction with popular brands

– Companies or websites that users search for after a previous search failed to turn up what they were looking for

– Frequency of word associations, which can tell Google when two words or phrases are essentially interchangeable (see the second sketch after this list)

– Patterns in the structure of language itself, which may allow Google to actually interpret users’ questions in the not-too-distant future

– Use of words or sentence structures that are associated with low-quality or high-quality content

– Analysis of title click-through rates, which could allow Google to predict which kinds of titles are more likely to be clicked, even before any clicks are made

– Patterns in article content or structure, which could be used to identify which articles are “copying” others, so that the sources of original information can be identified even without links (see the third sketch after this list)

– Patterns in the type of content that is most likely to go “viral,” so that these articles hit the top of Google even before they propagate through Twitter and Facebook (Google certainly has an incentive to make this happen)

– Patterns in the way readers rate articles

– Patterns in the way accurate, helpful, funny, enjoyable articles are written
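To make the co-citation idea concrete, here is a minimal sketch of what counting co-mentions might look like: scan a set of pages and tally how often a brand name appears within a few words of a target keyword, link or no link. The pages, brand name, and window size are invented for illustration; whatever signal Google actually computes is of course far more sophisticated.

```python
# A toy "co-citation" counter: how often does a brand appear near a
# keyword across a corpus of pages? All data below is made up.
from collections import Counter

def co_citation_counts(pages, brand, keywords, window=10):
    """Count brand/keyword co-mentions within `window` words of each other."""
    counts = Counter()
    brand = brand.lower()
    keywords = {k.lower() for k in keywords}
    for page in pages:
        words = page.lower().split()
        brand_positions = [i for i, w in enumerate(words) if w == brand]
        for i, word in enumerate(words):
            if word in keywords and any(abs(i - p) <= window for p in brand_positions):
                counts[word] += 1
    return counts

pages = [
    "For plumbing emergencies in Austin, many readers recommend AcmePlumbing highly.",
    "AcmePlumbing fixed our water heater the same day. Great plumbing service.",
]
print(co_citation_counts(pages, "acmeplumbing", {"plumbing"}))
# Counter({'plumbing': 2}) - the brand is "co-cited" with the keyword twice.
```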
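The word-association idea also lends itself to a toy example. In the sketch below, words that keep the same company, like “cheap” and “inexpensive” in the invented queries, end up with similar co-occurrence vectors, and cosine similarity flags them as near-synonyms. This is a textbook distributional-similarity approach, offered only as an illustration of the statistical principle, not as Google’s actual method.

```python
# Distributional similarity: words used in the same contexts get similar
# co-occurrence vectors. The sentences below are invented sample queries.
import math
from collections import Counter, defaultdict

def cooccurrence_vectors(sentences, window=2):
    vectors = defaultdict(Counter)
    for sentence in sentences:
        words = sentence.lower().split()
        for i, w in enumerate(words):
            for j in range(max(0, i - window), min(len(words), i + window + 1)):
                if j != i:
                    vectors[w][words[j]] += 1
    return vectors

def cosine(a, b):
    dot = sum(a[k] * b.get(k, 0) for k in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

sentences = [
    "cheap flights to austin",
    "inexpensive flights to austin",
    "cheap hotels near downtown",
    "inexpensive hotels near downtown",
]
vectors = cooccurrence_vectors(sentences)
# "cheap" and "inexpensive" share all their contexts here, so this prints 1.0.
print(cosine(vectors["cheap"], vectors["inexpensive"]))
```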
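Finally, copy detection can be approximated with word “shingles”: if two articles share a large fraction of their five-word sequences, one probably copied the other. The sketch below measures Jaccard overlap between shingle sets on invented sample text; production systems use far larger documents and hashing tricks, but the statistical idea is the same.

```python
# Near-duplicate detection via word shingles. Sample texts are invented.
def shingles(text, k=5):
    """Return the set of all k-word sequences in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Overlap of two shingle sets: |A & B| / |A | B|."""
    return len(a & b) / len(a | b) if a | b else 0.0

original = "google uses statistical patterns to rank pages without relying on links alone"
copy = "google uses statistical patterns to rank pages and reward original sources"
# Shared five-word runs push the score well above what unrelated texts get.
print(round(jaccard(shingles(original), shingles(copy)), 2))
```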

The list goes on and on. As computational power increases, so does statisticians’ power to sift through data and draw conclusions from it. Google’s entire business is sifting through data, and it has some of the largest computational resources in the world to back it up.

Brace yourselves…and hire a few stat majors.
