In addition to helping you find keywords you should be bidding on, thorough keyword research can also help you identify negative keywords – search terms that you should exclude from your campaigns. Negative keywords aren’t terms with negative connotations, but rather irrelevant terms that are highly unlikely to result in conversions. For example, if you sell ice cream, you might want to exclude the keyword “ice cream recipes”, as users searching for ice cream recipes are unlikely to be in the market for your product.
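The exclusion logic described above can be sketched in a few lines. This is a minimal illustration of phrase-style negative keyword matching, not any ad platform's actual API; the function and list names are hypothetical.

```python
# Hypothetical sketch: screening candidate search terms against a
# negative keyword list using simple substring (phrase-style) matching.
NEGATIVE_KEYWORDS = ["recipe", "recipes", "how to make"]

def is_relevant(search_term: str) -> bool:
    """Return False if the term contains any negative keyword."""
    term = search_term.lower()
    return not any(neg in term for neg in NEGATIVE_KEYWORDS)

queries = ["buy ice cream online", "ice cream recipes", "ice cream shop near me"]
relevant = [q for q in queries if is_relevant(q)]
print(relevant)  # ['buy ice cream online', 'ice cream shop near me']
```

Real ad platforms distinguish broad, phrase, and exact negative matching; the substring check here is only the simplest of those.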
A variety of methods can increase the prominence of a webpage within the search results. Cross linking between pages of the same website to provide more links to important pages may improve its visibility.[48] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic.[48] Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL canonicalization of web pages accessible via multiple URLs, using the canonical link element[49] or via 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score.
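The metadata and canonicalization points above can be illustrated with a short HTML fragment. The domain and page are placeholders; the element names (`title`, `meta name="description"`, `link rel="canonical"`) are standard HTML.

```html
<head>
  <!-- Title tag and meta description: shown in search listings,
       so they should describe the page's actual content -->
  <title>Handmade Ice Cream | Example Shop</title>
  <meta name="description"
        content="Small-batch ice cream made daily. Order online or visit our shop.">
  <!-- Canonical link element: consolidates link popularity when the
       same page is reachable via multiple URLs -->
  <link rel="canonical" href="https://www.example.com/ice-cream/">
</head>
```

For URL variants that should not be served at all (for example an old `http://` or non-`www` address), a 301 redirect to the canonical URL accomplishes the same consolidation at the server level.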
Connecting the dots between marketing and sales is hugely important -- according to Aberdeen Group, companies with strong sales and marketing alignment achieve a 20% annual growth rate, compared to a 4% decline in revenue for companies with poor alignment. If you can improve your customers' journey through the buying cycle by using digital technologies, then it's likely to reflect positively on your business's bottom line.
Website saturation and popularity, or how much presence a website has on search engines, can be analyzed through the number of pages of the site that are indexed by search engines (saturation) and how many backlinks the site has (popularity). Improving both requires pages to contain the keywords people are searching for and to rank high enough in search engine results. Most search engines include some form of link popularity in their ranking algorithms. The following are major tools measuring various aspects of saturation and link popularity: Link Popularity, Top 10 Google Analysis, and Marketleap's Link Popularity and Search Engine Saturation.
Every company with a website will have analytics, but many senior managers don't ensure that their teams make time to review and act on them. Once a strategy enables you to get the basics right, then you can progress to continuous improvement of the key aspects like search marketing, site user experience, email and social media marketing. So those are our top 10 problems that can be avoided with a well thought-through strategy.
In my experience, a common challenge is where to start drawing up your digital marketing plan. I think there is a fear that a massive report is required, but we believe that lean planning works best. Your plan doesn't need to be a huge report; a strategy can best be summarized in two or three sides of A4, in a table linking digital marketing strategies to SMART objectives within our RACE planning framework. We recommend creating a lean digital plan based on our 90-day planning templates to implement your digital plan rapidly and gain traction. You can learn more in our free download.
As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. SEO is performed because a website will receive more visitors from a search engine the higher the website ranks in the search engine results page (SERP). These visitors can then be converted into customers.[4]
Just think about any relationship for a moment. How long you've known a person is incredibly important. It's not the be-all-end-all, but it is fundamental to trust. If you've known someone for years and years and other people that you know who you already trust can vouch for that person, then you're far more likely to trust them, right? But if you've just met someone, and haven't really vetted them so to speak, how can you possibly trust them?
Digital strategist Dr Dave Chaffey is co-founder and Content Director of marketing publisher and learning platform Smart Insights. Dave is editor of the 100+ templates, ebooks and courses in the digital marketing resource library created by our team of 25+ digital marketing experts. Our resources are used by our Premium members in more than 100 countries to Plan, Manage and Optimize their digital marketing. Free members can access our free sample templates here. Dave is a keynote speaker, trainer and consultant, and the author of 5 bestselling books on digital marketing including Digital Marketing Excellence and Digital Marketing: Strategy, Implementation and Practice. To learn about his books, see his personal site, Digital marketing books by Dr. Dave Chaffey. In 2004 he was recognised by the Chartered Institute of Marketing as one of 50 marketing ‘gurus’ worldwide who have helped shape the future of marketing. Please connect with him on LinkedIn to receive updates or ask a question.
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.
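A minimal robots.txt along these lines might look as follows. The paths are purely illustrative; what you block depends on your own site's structure.

```
# Served from https://www.example.com/robots.txt
# Illustrative paths only -- adjust for your own site.
User-agent: *
Disallow: /admin/
Disallow: /internal-search/

# Note: a subdomain needs its own file, e.g. served
# from https://blog.example.com/robots.txt
```

Keep in mind that robots.txt controls crawling, not indexing: a blocked URL can still appear in results if other sites link to it, so sensitive pages need stronger measures than robots.txt alone.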

SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and techniques of which search engines do not approve ("black hat"). Search engines attempt to minimize the effect of the latter, which includes spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO or black hat SEO.[50] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned, either temporarily or permanently, once the search engines discover what they are doing.[51]
One way marketers can reach out to consumers and understand their thought process is through what is called an empathy map. An empathy map is a four-step process. The first step is to ask the questions that the consumer would be thinking in their demographic. The second step is to describe the feelings that the consumer may be having. The third step is to think about what the consumer would say in their situation. The final step is to imagine what the consumer will try to do based on the other three steps. This map lets marketing teams put themselves in their target demographic's shoes.[71] Web analytics are also a very important way to understand consumers. They show the habits that people have online for each website.[72] One particular form of these analytics is predictive analytics, which helps marketers figure out what route consumers are on. This uses the information gathered from other analytics, and then creates different predictions of what people will do, so that companies can strategize on what to do next according to people's trends.[73]
The biggest problem that most people have when trying to learn anything to do with driving more traffic to their website or boosting their visibility across a variety of online mediums, is that they try to do the least amount of work for the greatest return. They cut corners and they take shortcuts. Because of that, they fail. Today, if you're serious about marketing anything on the web, you have to gain Google's trust.

Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
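A common way this problem arises is a robots.txt rule that blocks the directories holding CSS, JavaScript, or images. The directory names below are illustrative; the point is simply that rendering resources should remain crawlable.

```
# Illustrative robots.txt sketch: keep rendering resources crawlable
# so Googlebot can see that pages display correctly on mobile.
User-agent: Googlebot
Allow: /assets/css/
Allow: /assets/js/
Allow: /assets/images/
Disallow: /private/
```

Google's mobile-friendly and URL inspection tools can show you a page as Googlebot renders it, which makes blocked resources easy to spot.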
Imagine that you've created the definitive Web site on a subject -- we'll use skydiving as an example. Your site is so new that it's not even listed on any SERPs yet, so your first step is to submit your site to search engines like Google and Yahoo. The Web pages on your skydiving site include useful information, exciting photographs and helpful links guiding visitors to other resources. Even with the best information about skydiving on the Web, your site may not crack the top page of results on major search engines. When people search for the term "skydiving," they could end up going to inferior Web sites because yours isn't in the top results.
For that reason, you're probably less likely to focus on ‘leads’ in their traditional sense, and more likely to focus on building an accelerated buyer's journey, from the moment someone lands on your website to the moment that they make a purchase. This will often mean your product features in your content higher up in the marketing funnel than it might for a B2B business, and you might need to use stronger calls-to-action (CTAs).