Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices. Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's search engine results page.
To do that, you need a consolidated view of customer preferences and expectations across all channels – web, social media, mobile, direct mail, point of sale, and so on. Retailers build this view using omnichannel retail analytics. Marketers can use this information to anticipate customer needs and deliver consistent, coordinated experiences that move customers along in the buying cycle. The deeper your insight into customer behavior and preferences, the more likely you are to engage customers in lucrative interactions.
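At its core, consolidation means keying every channel's events to a single customer record. A minimal sketch of that merge step, with invented channel names and data (not any particular analytics product):

```python
from collections import defaultdict

# Hypothetical per-channel event streams: (customer_id, preference) pairs.
channel_events = {
    "web":           [("c1", "running shoes"), ("c2", "jackets")],
    "mobile":        [("c1", "running shoes")],
    "point_of_sale": [("c2", "boots")],
}

def consolidated_view(channel_events):
    """Merge all channels into one preference profile per customer."""
    profile = defaultdict(set)
    for channel, events in channel_events.items():
        for customer, preference in events:
            profile[customer].add(preference)
    return {customer: sorted(prefs) for customer, prefs in profile.items()}

print(consolidated_view(channel_events))
# {'c1': ['running shoes'], 'c2': ['boots', 'jackets']}
```

Deduplicating with a set per customer is what makes the view "consolidated": the same preference seen on web and mobile counts once in the profile.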
However, despite all of these modern conveniences, in which technology's ever-pervading presence has streamlined even the most basic tasks, such as hailing a ride, ordering food, or conducting commerce instantly and efficiently, many are left in the dark. While all of us have become self-professed experts at consuming content and using the wealth of freely available tools to search and seek out information, we are effectively drowning in a sea of digital overload.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches. Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines. By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.
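To illustrate what those early indexers consumed, here is a minimal sketch using Python's standard-library `HTMLParser` to extract a page's keyword meta tag. The sample page and its keywords are invented; note that the declared keywords need not match the visible content, which is precisely why engines stopped trusting them:

```python
from html.parser import HTMLParser

class KeywordMetaParser(HTMLParser):
    """Collects the content of <meta name="keywords"> tags."""
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "keywords":
            # Split the comma-separated keyword list, as early indexers did.
            self.keywords += [k.strip() for k in a.get("content", "").split(",") if k.strip()]

# Hypothetical page: the meta keywords contradict the body text.
page = ('<html><head><meta name="keywords" content="cheap flights, hotels">'
        '</head><body>Car parts</body></html>')
parser = KeywordMetaParser()
parser.feed(page)
print(parser.keywords)  # ['cheap flights', 'hotels']
```

An engine ranking purely on `parser.keywords` would surface this car-parts page for travel queries, the exact failure mode described above.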
The fee structure is both a filter against superfluous submissions and a revenue generator. Typically, the fee covers an annual subscription for one webpage, which will automatically be catalogued on a regular basis. However, some companies are experimenting with non-subscription based fee structures where purchased listings are displayed permanently. A per-click fee may also apply. Each search engine is different. Some sites allow only paid inclusion, although these have had little success. More frequently, many search engines, like Yahoo!, mix paid inclusion (per-page and per-click fee) with results from web crawling. Others, like Google (and as of 2006, Ask.com), do not let webmasters pay to be in their search engine listing (advertisements are shown separately and labeled as such).
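As a rough sketch of how a mixed fee structure adds up for a webmaster, assuming a per-page annual subscription plus a per-click fee (all figures are hypothetical, not any engine's actual rates):

```python
def paid_inclusion_cost(annual_fee, clicks, cost_per_click):
    """Total yearly cost under a per-page subscription plus per-click fee."""
    return annual_fee + clicks * cost_per_click

# Hypothetical example: a $49/year listing receiving 2,000 clicks at $0.15 each.
print(paid_inclusion_cost(49.0, 2000, 0.15))  # 349.0
```

The subscription component acts as the filter against superfluous submissions; the per-click component scales revenue with the traffic the listing actually receives.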
It is increasingly advantageous for companies to use social media platforms to connect with their customers and create these dialogues and discussions. The potential reach of social media is indicated by the fact that in 2015, the Facebook app averaged more than 126 million unique users per month and YouTube averaged over 97 million.
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches. In markets outside the United States, Google's share is often larger, and Google remained the dominant search engine worldwide as of 2007. As of 2006, Google had an 85–90% market share in Germany. While there were hundreds of SEO firms in the US at that time, there were only about five in Germany. As of June 2008, Google's market share in the UK was close to 90% according to Hitwise. Google achieves a comparable share in a number of other countries.
By using the Facebook tracking pixel or the AdWords pixel, you can define your audience and work to entice those visitors to come back to your site. Say a visitor didn't finish their purchase, left after adding something to their shopping cart, or filled out a lead form and then disappeared: you can re-target those individuals.
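Defining such an audience amounts to filtering visitors by the events the pixel has recorded. A minimal sketch of that selection logic, with invented event names and data (not the actual Facebook or AdWords APIs):

```python
# Hypothetical visitor event log: user id -> set of tracked events.
events = {
    "u1": {"page_view", "add_to_cart"},              # abandoned cart
    "u2": {"page_view", "add_to_cart", "purchase"},  # completed purchase
    "u3": {"page_view", "lead_form"},                # lead, then disappeared
    "u4": {"page_view"},                             # browsed only
}

def retarget_audience(events):
    """Users who added to cart or submitted a lead form but never purchased."""
    return sorted(
        user for user, seen in events.items()
        if ("add_to_cart" in seen or "lead_form" in seen) and "purchase" not in seen
    )

print(retarget_audience(events))  # ['u1', 'u3']
```

Users who merely browsed, or who completed a purchase, fall outside the audience; the re-targeting budget goes only to visitors who showed intent but dropped off.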
Website saturation and popularity, or how much presence a website has on search engines, can be analyzed through the number of the site's pages that are indexed by search engines (saturation) and how many backlinks the site has (popularity). Improving these measures requires that pages contain the keywords people are searching for and that they rank high enough in search engine results. Most search engines include some form of link popularity in their ranking algorithms. The following are major tools measuring various aspects of saturation and link popularity: Link Popularity, Top 10 Google Analysis, and Marketleap's Link Popularity and Search Engine Saturation.
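A minimal sketch of the two measures on a toy link graph may make the distinction concrete. The sites and links below are invented; real tools query the search engines' own indexes rather than a local graph:

```python
# Hypothetical data: pages of our site known to be indexed, and a toy
# web graph mapping each external site to the pages it links to.
indexed_pages = {"site.com/", "site.com/about", "site.com/blog"}
link_graph = {
    "other1.com": ["site.com/", "site.com/blog"],
    "other2.com": ["site.com/"],
    "other3.com": ["elsewhere.com"],
}

def saturation(indexed_pages):
    """Saturation: how many of the site's pages the engine has indexed."""
    return len(indexed_pages)

def link_popularity(link_graph, domain):
    """Popularity: count of external links (backlinks) pointing at the domain."""
    return sum(
        1 for source, targets in link_graph.items()
        for target in targets if target.startswith(domain)
    )

print(saturation(indexed_pages))                # 3
print(link_popularity(link_graph, "site.com"))  # 3
```

Saturation counts your own pages inside the index, while popularity counts other sites' links pointing at you; a site can score high on one and low on the other, which is why the tools report them separately.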
A content marketer, for example, can create a series of blog posts that serve to generate leads from a new ebook the business recently created. The company's social media marketer might then help promote these blog posts through paid and organic posts on the business's social media accounts. Perhaps the email marketer creates an email campaign to send those who download the ebook more information on the company. We'll talk more about these specific digital marketers in a minute.