Just a few days ago, Google’s own John Mueller felt the need to reiterate that the new generic Top-Level Domains (gTLDs) still do not provide any ranking benefit in search results; they rank much the same way a sub-domain would.
Back when the new gTLDs were gaining momentum in 2012, Matt Cutts made the same point in response to an article claiming that new gTLDs did provide a rankings advantage. John Mueller evidently felt the need to repeat it as even more new gTLDs enter the domain registration market, 400+ as of December 2014.
For those who may not be aware, you can now register domains with an extension other than the familiar .com, .net and .org. Some of the new extensions are useful, but some truly absurd gTLDs have also been approved and made it into the root zone.
These days there’s a lot more to choose from beyond the traditional .com, or .net, but none of them will give you any advantage in the search engines. From what I gather after reading Mueller’s statements, there’s no disadvantage to using these either. “They can perform well in search, just like any other TLD can perform well in search,” says Mueller.
A newly registered gTLD domain acts just like any other domain or sub-domain, and is as distinct in Google’s eyes as the www and non-www versions of a site are. If you do decide to register a domain under one of the new gTLDs, be prepared to put a lot of effort into it, because it’s going to be an uphill battle. For those who have premium .com domains and are just sitting on them: start developing, or someone with a hideous gTLD will come along and start first, and a year from now you’ll wish you had started today!
You should not be swayed by any posts claiming to have data which suggests new TLDs are doing well in search results. If that’s the case, it’s not due to any artificial boost or preference from Google. You can make a great website that performs well in search on any TLD, but obviously the longer you have it public, the better chance you get to play outside of the sandbox.
Recently, Google announced forwarding numbers for its AdWords advertising platform to better bridge the gap between an ad click and a phone call: when someone views a Google ad, clicks through, and then calls a special phone number displayed on the website, the call is attributed to the ad click within Analytics reports.
New Conversion Type uses Google Forwarding Numbers
… many customers also call your business after clicking through a Google ad and learning more about the products and services you offer on your website. That’s why today we’re launching website call conversions, a powerful way for you to identify and measure calls from your website that occur after an ad click.
Basically, Google realizes there’s demand for a better understanding of conversions where a phone call replaces a form fill: for example, someone with a basement full of water at 3 AM searches for “24 hour plumber” and calls straight from the search results rather than filling out a contact form and waiting. The Converg call tracking and call measurement service is best suited to the service industry, where clients would rather call (either in an emergency or just to discuss something personal). Services that work best with our call tracking include (but are not limited to):
- Legal Issues (DUIs, bail bondsman, etc.)
- Water Damage services
- Pest Control Services (Rodents, wasps or urgent issues)
- And many more…
We’ve been providing a custom, dynamic method that picks up where Google’s ad-only tracking leaves off. Our call tracking covers online and offline channels through dynamic and static local numbers for businesses, whether the visit comes from direct navigation (they type your domain name directly), referral traffic (from another website), social mentions or PPC / advertising channels. We’re also integrating our tracking into a plugin for Google Analytics, so soon you’ll be able to see on your dashboard how people found you when they called. If you would like to see what Converg can do for the service industry or want additional information, please contact us and we’ll be happy to help you identify your traffic sources and conversions.
<img src="https://www.google-analytics.com/collect?v=1&tid=UA-999999-1&cid=CLIENT&t=event&ec=Email&ea=Open&el=RECIPIENT&cs=EmailBlast&cm=Email&cn=IO44444" />
| Parameter | Description |
|---|---|
| `v=1` | Protocol version within Google Analytics |
| `tid=UA-999999-1` | Your Google Analytics tracking ID |
| `cid=CLIENT` | A systematic tracking ID for the customer |
| `t=event` | Tells Google Analytics this is an Event hit type |
| `ec=Email` | The Event Category helps segment various events |
| `ea=Open` | The Event Action specifies exactly what happened |
| `el=RECIPIENT` | The Event Label gives a unique identifier for this recipient |
| `cs=EmailBlast` | Campaign Source allows segmentation of campaign types |
| `cm=Email` | Campaign Medium could segment social vs. email, etc. |
| `cn=IO44444` | Campaign Name identifies the campaign to you |
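The pixel URL above can be assembled programmatically before dropping it into an email template. Here is a minimal Python sketch using only the standard library; the tracking ID, client ID and campaign values are placeholders, not real accounts:

```python
from urllib.parse import urlencode

def build_open_pixel(tid, cid, recipient, campaign):
    """Build a Measurement Protocol 'event' hit URL for an email-open pixel."""
    params = {
        "v": "1",            # protocol version
        "tid": tid,          # your GA tracking ID (placeholder here)
        "cid": cid,          # anonymous client ID
        "t": "event",        # hit type
        "ec": "Email",       # event category
        "ea": "Open",        # event action
        "el": recipient,     # event label: which recipient opened
        "cs": "EmailBlast",  # campaign source
        "cm": "Email",       # campaign medium
        "cn": campaign,      # campaign name
    }
    return "https://www.google-analytics.com/collect?" + urlencode(params)

url = build_open_pixel("UA-999999-1", "555", "reader@example.com", "IO44444")
```

Wrap the resulting URL in an `<img>` tag, as shown above, and Google Analytics records an Event hit each time the image is requested.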
NOTE: Converg clients are doing very well with the latest update, and algorithm changes like these only help improve our clients’ positions.
So, who’s who?
| | Panda | Penguin | Hummingbird |
|---|---|---|---|
| First launched | 24 Feb 2011 (last updated in May 2014) | 24 April 2012 (new update expected soon) | 20 Aug 2013 |
| Also known as | Farmer update | Webspam penalty | Semantic search update |
| Type | Penalty based on low quality content | Penalty based on low quality links | A change of the entire algorithm (no penalties issued for separate sites) |
| Goal | To target sites with low-quality content and display them lower in search results. | To target web spam, i.e. sites not following Google guidelines and manipulating Google rankings. | To make Google respond not simply to keywords in a query, but to users’ actual search intent behind those keywords. |
Panda 4.0 — May 21, 2014 – Some of the Panda algorithm changes may have reversed damage done to sites that had great content but were impacted for other on-page reasons. eBay seems to have been hit especially hard.
Payday Loan — May 21, 2014 – Update targeting hacked websites and spammy inbound links in the financial, payday-loan and pharma niches (in general).
Chatter — March 25, 2014 – Major flux reported by algorithm trackers, along with heavy webmaster chatter; speculation that something new or major was coming.
Page Layout #3 — February 6, 2014 – Google “refreshed” their page layout algorithm, also known as “top heavy”. Originally launched in January 2012, the page layout algorithm penalizes sites with too many ads above the fold.
Penguin 2.1 (#5) — October 4, 2013 – After a 4-1/2 month gap, Google launched another Penguin update. Given the 2.1 designation, this was probably a data update (primarily) and not a major change to the Penguin algorithm. The overall impact seemed to be moderate, although some webmasters reported being hit hard.
Hummingbird — August 20, 2013 – Hummingbird has been compared to Caffeine, and seems to be a core algorithm update that may power changes to semantic search and the Knowledge Graph for months to come.
If your site has been subject to an unnatural-link warning in your Webmaster Tools account, you will need to identify which links are of concern and submit a reconsideration request once you have cleaned them up.
Reconsideration requests to Google are only for websites that received an unnatural-link warning. Unnatural-link warnings, if any, can be found in Webmaster Tools under “Manual Actions”.
Example of Google’s unnatural-link warning:
“We’ve detected that some of the links pointing to your site are using techniques outside Google’s Webmaster Guidelines.
We don’t want to put any trust in links that are unnatural or artificial, and we recommend removing any unnatural links to your site. However, we do realize that some links may be outside of your control. As a result, for this specific incident we are taking very targeted action to reduce trust in the unnatural links. If you are able to remove any of the links, you can submit a reconsideration request, including the actions that you took.”
What’s the difference between an algorithmic penalty and a manual penalty, you ask? Google uses both an automatic algorithm and a manual webspam team to find sites with bad backlink profiles. An algorithmic penalty is applied automatically by Google’s ranking software: penalized sites will see a drop in SERP positions and thus a steep decline in traffic. A site hit by an algorithmic penalty will not receive an unnatural-link warning.
A manual penalty is applied by Google’s search-monitoring team, who have manually analyzed the bad links, taken action, and then sent the website owner an unnatural-link warning like the one above.
If you submitted articles with “follow” links and stopped promoting your site, failing to adjust to the changes Google has recommended for article sites, then you may have received a notice like this.
Things to avoid
Press Release Links with Exact Match Anchor Text — Should you disavow? This is a difficult call, but if you have non-branded anchor-text links in a press release, you could be at risk. Branded links in a press release are perfectly natural, and before you take action, these PRs should be reviewed on a case-by-case basis. We ensure that our clients’ press releases are branded and newsworthy.
Poor Quality Article Directories — Poor-quality article directories should be disavowed; niche directories are fine. If you run a remodeling website and are included in a home-remodeling article directory, that is fine (why wouldn’t you want to be in there?). If you’re listed in a generic directory and not in a specific category, I would disavow that source at the domain level, but only if you are unable to log into your article account, update the links to “no-follow” and ensure your category is specific to your industry.
Link Networks — Don’t use them. Never use networks or schemes to help you rank. There also seems to be mass confusion about what link networks are. Simply put, a link (or site/blog/article) network is a group of connected sites. They can be owned by one person or by many, and their connections can be as obvious as a badge proudly identifying the site as a member of X network, or as covert as a footprint uncovered only by lots of digging. Avoid any site or page that lists a ton of other sites. The links may carry anchors ranging from “Friends” or “Partners” to “Proud partner in XYZ Network” or “See our other networked sites”; the wording about networks specifically is the key here. It doesn’t always indicate a network, but it does indicate the need for stricter review.
Footer Links — Footer links are totally legitimate when they appear in natural numbers relative to the distribution of all your backlinks across the visual positions of a page. The footer links to avoid are those that sell links, are involved in link-exchange schemes, or come from widgets or plugins installed for SERP manipulation. Even in situations where Google does not flag footer links, there is no guarantee they are not discounted by Google’s ranking algorithm.
Site-wide links – Many companies get their partners to place site-wide footer links to their domain. This can look unnatural, and again, for peace of mind, I would no-follow the link if possible, or remove it if it’s completely irrelevant to your product/service/industry. If your domain lacks the necessary authority and receives a couple of site-wide links from low-quality sites, it can be subject to a manual penalty.
Blogrolls and Blocks of Links — Blogrolls might also be considered a “link network”, but a public one with no real control over who includes the widget on their site. Too many links that fit these patterns might be viewed as unnatural by the Google algorithm, and you might get penalized for them. Some of those links may be site-wide, some not. Make sure your links from anything like these are “no-follow”.
Forum profile links & Forum Signatures — Forums are abused the same way blogs are. Having people discuss your website is natural, what is not natural is non-relevant signature links and profiles on sites that are not relevant. For example, if you’re on a webmaster forum and post about optimizing your website and have a link in your signature about image optimization, then this would be relevant. If, on the other hand, you have a signature promoting a keyword that is about concrete polishing, then it really has nothing to do with the site you’re on and would not be a positive link for your website.
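Several of the items above mention disavowing a source at the domain level. Google’s disavow file is just a plain-text list of `domain:` lines and full URLs, with `#` comment lines. Here is a hypothetical Python sketch that assembles one; the domain and URL are made-up examples:

```python
def build_disavow(domains, urls):
    """Build the text of a Google disavow file from lists of bad sources."""
    lines = ["# Disavow file generated during link cleanup"]
    lines += ["domain:" + d for d in domains]  # disavow entire domains
    lines += list(urls)                        # disavow individual URLs
    return "\n".join(lines) + "\n"

text = build_disavow(
    ["spammy-directory.example"],
    ["http://example.net/bad-page"],
)
```

The resulting file is what you would upload through the Disavow Links tool before filing a reconsideration request.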
Anchor Text Distribution — Over-optimization of anchor text does not appear natural and will definitely be flagged for review. This still holds today. A natural backlink profile for a domain that has never invested in SEO strategies is likely to have an anchor-text distribution of roughly:
15-20% Primary Keyword Exact Match
This is your primary keywords that you would like to rank in the search engines for. These keywords are exact matches, or the equivalent of doing a search with the term in double quotes. (example: “plumbing supplies“)
12-18% Partial Match / Phrase
Phrase backlinks are anchor text links that include your exact match keyword, plus additional “filler”/prepositional words. Using the “plumbing supplies” exact match example from above, an example phrase would be “the best resource for plumbing supplies in the area“.
23-34% Brand Match
Brand related is segmented into three distinct parts: the brand name by itself, brand name plus exact match keyword, and finally brand name phrases. Expanding on our “plumbing supplies” keyword example, a brand related keyword instance could be “Bob’s Plumbing services”.
20-30% Domain Match
Bare links, or domain-specific links, are the absolute URL of a webpage. For example, “http://www.example.com” would be a bare domain match link.
Other links refer to non-contextual links such as “website”, “click here”, as well as image links.
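To audit your own profile against these rough percentages, you can bucket each backlink’s anchor text into the categories above. This Python sketch uses the “plumbing supplies” examples from the text; the keyword, brand and domain values are illustrative assumptions:

```python
from collections import Counter

def classify(anchor, keyword="plumbing supplies",
             brand="bob's plumbing", domain="example.com"):
    """Bucket one anchor text into exact / brand / domain / phrase / other."""
    a = anchor.lower()
    if a == keyword:
        return "exact"    # exact-match keyword only
    if brand in a:
        return "brand"    # contains the brand name
    if domain in a:
        return "domain"   # bare URL / domain link
    if keyword in a:
        return "phrase"   # keyword plus filler words
    return "other"        # "click here", images, etc.

anchors = [
    "plumbing supplies",
    "the best resource for plumbing supplies in the area",
    "Bob's Plumbing services",
    "http://www.example.com",
    "click here",
]
dist = Counter(classify(a) for a in anchors)
```

Dividing each bucket’s count by the total number of backlinks gives the percentages to compare against the ranges above.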
Based on what Google hints at, the environment changes for all webmasters. You need relevant backlinks to your website to give it importance. Google now looks at no-follow links as a signal that you are promoting your content through articles and article directories without submitting it solely to pass “Google juice” or inflate your total backlink count. Based on what Google has said on this, we have adjusted our client work to ensure that submissions contain no-follow links while still adhering to the unique content and relevant categories of the locations we submit to, as we’ve always done.
The best way to get other sites to create high-quality, relevant links to yours is to create unique, relevant content that can naturally gain popularity in the Internet community. This is what we do for our clients. Creating good content pays off: Links are like editorial votes given by choice, and the more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it.
If your company has been subject to a manual penalty with an unnatural-link warning, or you believe you’re subject to an algorithmic penalty, then contact us: we can help by performing a complete backlink audit of your existing links and submitting link-removal requests to website owners. We will also complete an anchor-text audit, submit a disavow list for problematic links and, finally, file a reconsideration request on your behalf with Google.
By Shane Kinsch
This infographic has everything you need to know about how the Google AdWords program works and why some keyword pricing is more expensive for you than for your competitors.
Original Penguin Update: Tuesday, April 24th, was a day that seemed normal until a reasonably noticeable drop in rankings changed things. What did Google do? Could it have been another, more aggressive version of Panda? Part of the “no ads above the fold” algorithm change? Maybe it was the famed and feared over-optimization penalty that had been looming since Matt Cutts mentioned it a couple of months back?
Apparently, it turned out to be a distinctly new search algorithm targeted at catching webspam, oddly named “Penguin” (black and white like a Panda, although not as large or imposing…) – this is actually the change Matt Cutts was talking about, but he told Search Engine Land that the term “over-optimization” wasn’t entirely accurate, because the algorithm doesn’t target SEO, just webspam. Normally, most people would say webmasters don’t need to panic about this sort of thing, provided their sites aren’t spammy. However, the way Google’s algorithm perceives spam and the way you, as a site owner, perceive it are not the same – so even if you believe your own site would not be affected, it is still plausible that it could be (though Google states this update only affects around 3.1% of searches).
This algorithm boils down to the age-old battle between “white hat” and “black hat” SEO. Its stated goal is to reward high-quality sites and punish black-hat webspam.
Basic Things to Avoid :
Google is on a mission to rank high-quality sites and lower sites that “are participating in webspam tactics to manipulate search engine rankings,” the search algorithm will specifically be attuned to tactics that webmasters could use to get better rankings through questionable schemes, such as the following:
Keyword Stuffing – Avoid repeating your keywords too many times on a page (in fact, try to stay within 2-4 occurrences), and steer clear of throwing keywords into content that is unrelated to them. If you use a keyword or phrase more than once, make sure it makes sense in context and that the article flows with the keywords included. This ensures the reading experience is a useful one and that your content is correctly optimized for search engines.
Link Schemes – As you probably know, Google analyzes the number of sites backlinking to you (the more sites that link back to yours, the more of an “authority” Google considers you) as a means of discerning whether your website is relevant and beneficial to users. Your link profile has a great deal to do with your site’s rankings. The backlinks you build need to be quality ones, and they should be strongly related to your website. Quality over quantity is quickly becoming the industry standard.
Duplicate Content – An honest mistake is OK, but if you are purposely plagiarizing and copying content, don’t expect to rank in the SERPs. And on a related note: don’t post good content, post GREAT content. Obviously, as Google indicates in its Webmaster Guidelines, it’s always better to garner links naturally, simply by creating relevant, high-quality content that people naturally link to. For sites that engage in SEO, this may not always be the easiest thing to do, but regardless of how you garner links, your content should ALWAYS be great: grammatically correct, flowing, easily consumed by the search engines, and not loaded with keywords. As Google concludes: “We want people doing white hat search engine optimization to be able to focus on creating amazing, compelling websites.”
Want more information on search algorithm updates? Need methods for engaging in white-hat SEO? Visit our B2B blog for regular updates, free tips, and more. Have questions specific to your business? Give us a call at 888-888-1022.
The Internet accounted for $684 billion, or 4.7%, of all U.S. economic activity in 2010, Boston Consulting Group found. By way of comparison, the federal government contributed $625 billion, or 4.3%, to the nation’s output.
If it was considered its own separate industry, the Internet would also be larger than America’s education, construction or agricultural sectors.
In the retail sphere alone, e-commerce accounted for 5% of U.S. sales in 2010.
“All businesses are increasingly digital and need to think about how to take advantage of these opportunities,” said Dominic Field, a BCG partner and co-author of the report. “And for policymakers, we would hope they recognize the importance of Internet growth and making sure their countries are taking advantage of these opportunities.”
As a share of gross domestic product, only three countries have larger Internet economies: the United Kingdom, South Korea and China. The U.S. is tied with Japan.
Boston Consulting Group predicts the Internet will grow about 10% a year through 2016 in the Group of 20 nations. It will grow nearly twice as fast in emerging markets as in developed economies, with Argentina and India accounting for the fastest growth, the study said.
“The U.S. is relatively mature as an Internet economy, whereas some of the developing economies are further behind — so their growth rates are higher,” Field said.
Granted, measuring the full impact of the Internet can be a fuzzy matter. In the Boston Consulting Group study, the researchers included the impact of e-commerce, what consumers pay to access the Internet and money spent by businesses and the government on building Internet infrastructure.
Recently, Matt Cutts spoke at SXSW where he announced that Google is working on a search ranking penalty for sites that are “over-optimized” or “overly SEO’ed.”
Matt Cutts said the new over optimization penalty will be introduced into the search results in the upcoming month or next few weeks. The purpose is to “level the playing field,” Cutts said. To give sites that have great content a better shot at ranking above sites that have content that is not as great but do a better job with SEO.
Here is the transcription from SEL of Matt Cutts:
What about the people optimizing really hard and doing a lot of SEO. We don’t normally pre-announce changes but there is something we are working in the last few months and hope to release it in the next months or few weeks. We are trying to level the playing field a bit. All those people doing, for lack of a better word, over optimization or overly SEO – versus those making great content and great site. We are trying to make GoogleBot smarter, make our relevance better, and we are also looking for those who abuse it, like too many keywords on a page, or exchange way too many links or go well beyond what you normally expect. We have several engineers on my team working on this right now.
Back in 2009, Matt did a video on over optimization penalties saying there was no such thing. Here is that video:
What does this mean for search engine optimization in 2012 and beyond? There are two main points to take away: on-site optimization (“over optimization”) and off-site link building.
On-site optimization or “over optimization”
What does this mean? Why would it matter? Well, basically it comes down to whether or not your site or page contains valuable unique content that’s fresh and not full of ads or just a ‘lander’ where you capture a lead with very little content. Lander pages are great for converting an online Ad (Adsense, etc.) but not for SEO. Keep your landers separate from your primary page. In other words, don’t make your home page the “lander” page. Make it useful. If you’re creating a lander for advertising and conversion tracking, then make it separate from your home page.
Are you updating your content often? Weekly? Are you linking to ‘bad neighborhoods’ (i.e. link-sharing programs or anything else that seems a bit unscrupulous)? Are you letting your ‘link juice’ leak to sites that don’t deserve it? Typically you should use no-follow on any links going out of your site, unless it’s a related site to which you want some of your link juice to flow. Don’t sell links on your site, and don’t engage in any on-site activity that will penalize your work.
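As a quick audit of the no-follow advice above, you can scan a page’s HTML for outbound links that still pass link juice. A minimal sketch using Python’s standard-library parser, where `example.com` stands in for your own domain and the links are made-up examples:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundLinkChecker(HTMLParser):
    """Collect external <a> links that lack rel="nofollow"."""

    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.followed_external = []  # outbound links still passing link juice

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        d = dict(attrs)
        host = urlparse(d.get("href", "")).netloc
        # Flag only absolute, external links without a nofollow hint.
        if host and self.own_domain not in host \
                and "nofollow" not in d.get("rel", ""):
            self.followed_external.append(d["href"])

page = ('<a href="http://other-site.example/">partner</a>'
        '<a href="http://ads.example/" rel="nofollow">ad</a>'
        '<a href="/about">about</a>')
checker = OutboundLinkChecker("example.com")
checker.feed(page)
```

Anything left in `followed_external` is a candidate for adding `rel="nofollow"` or removing outright.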
So what to do?
On-site optimization in the very basic form should concentrate around your primary keywords, but also make it valuable for the visitor (aka real humans) that come your way.
Make sure the keyword for your page appears in the title, in H1/H2/H3 tags, bolded somewhere, italicized somewhere, and typically within the last paragraph of the page. Make sure your keyword density isn’t too high and that the page reads naturally, with more real content than navigation and template text. In other words, if the actual content on a page is less than the navigation, footer and other template text combined, you may want to increase your content. Keep it fresh!
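The density check above is easy to approximate in code. Here is a naive Python sketch that counts occurrences of a keyword phrase and estimates its share of the page’s words; the sample text and any implied thresholds are illustrative, not official Google numbers:

```python
import re

def keyword_stats(text, keyword):
    """Return (occurrence count, rough density %) for a keyword phrase."""
    words = re.findall(r"[a-z']+", text.lower())     # tokenize the copy
    hits = text.lower().count(keyword.lower())       # phrase occurrences
    # Density: share of all words taken up by the phrase's words.
    density = 100.0 * hits * len(keyword.split()) / max(len(words), 1)
    return hits, round(density, 1)

body = ("Plumbing supplies for every job. Our plumbing supplies ship fast, "
        "and we stock fittings, valves and pipe for any project.")
hits, density = keyword_stats(body, "plumbing supplies")
```

A high density on a short page is exactly the pattern that reads as stuffed, so a tool like this is most useful for flagging pages to rewrite with more natural copy.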
Don’t overuse multiple keywords on any single page. Use a keyword map that helps you organize your website around the keywords you’re focusing on.
Off-site link building
What’s at issue here, and what will it mean for the future of link building? Who is ultimately affected by this upcoming algorithmic change?
No one can be certain, but we do know Google and Bing are looking for quality content and quality backlinks. The software vendors whose tools try to game the search engines by creating thousands of backlinks, the spammers, and the individuals or companies that promise 20,000 backlinks in one week for $100: those kinds of services are going to be worthless. Although Google won’t penalize you for those hideous no-quality backlinks, they won’t help either.
What works now and in the future is unique, quality content, created and distributed throughout the Internet, written for humans to read and use. Creating unique external content about your company or website that links back to you is key. Try to find websites with articles or content like your own and ask if they would link to you, though this is hard on an individual basis. Put yourself in their position: would you blindly link back to someone without something in return? What would that “something” be? A monetary fee? Why would anyone want to link back to your website, especially if it’s a product page or you’re selling something? If your page is full of ads, I certainly wouldn’t link back to you; why would anyone else?
Converg helps overcome these obstacles by doing all this work for you. We help Fortune 500 companies with their organic search optimization and do it ‘naturally’. No quick fix, no crazy link schemes.
When we do your link building for you and create that buzz about your company or services, it’s just natural for the search engines to follow suit and make your site more prominent over your competition.
Retailers including J.C. Penney, Nordstrom and GameStop closed their Facebook stores within months of launching them, throwing the future of Facebook commerce into doubt, says Forrester Research analyst Sucharita Mulpuru. “There was a lot of anticipation that Facebook would turn into a new destination, a store, a place where people would shop,” Mulpuru said. “But it was like trying to sell stuff to people while they’re hanging out with their friends at the bar.”
“It never did make sense”, said Shane Kinsch of Converg Media. “It was dead before it started. Many companies rightfully want to be where customers are but in this case, like many social media destinations, users just want to be left alone.” Read more at Bloomberg…
The trend is clear: Yahoo!’s search traffic Free Falls.
The search engine wars have always been messy, with each engine trying to reach more visitors and convince them to set it as their homepage. The wars will continue forever, but for the time being the results are clear: for the past three months, Yahoo! has been losing a lot of traffic.
Google is the recipient of most of the traffic lost by Yahoo!, as seen in the numbers below. The agreement between Microsoft and Yahoo! hasn’t stopped the big fish, Google, and probably never will. If you didn’t already know, Microsoft’s Bing search engine powers Yahoo!’s paid search: Microsoft retains all money made from Bing advertising, while Yahoo! keeps about 90% of the ad revenue from searches on its own properties.
The results below cover explicit “core” search queries, where users enter a search term in the search box on one of the three sites. They don’t include other search features that may be part of the search engine. Nor do they include mobile search traffic or searches performed outside the United States, two more areas where Google is even more dominant.
The numbers below cover the last year; notice the RED section showing Yahoo!’s steep decline in traffic.