Category: Testing

What is CTR Manipulation And Does It Work?

Many Search Engine Optimization professionals are looking for additional ways to increase Google rankings, and CTR manipulation has recently been seen as an SEO technique or an extra ranking factor that might gain those few extra Google search engine ranking places. But does it really work?

click through rate (CTR)

So first, let’s discuss what CTR is and how it can be manipulated.

CTR stands for “Click Through Rate” and measures the number of clicks a page receives for a particular search versus the number of times it is presented in Google search results.

So, if Google showed a page of your web property for a particular search term 100 times and it was clicked on 5 times, that would be a CTR of 5%. A fair argument could be that a webpage search term with a CTR of 5% would get a ranking boost over a webpage that only delivers 2%, where everything else is equal. You can see the click-through rates within your Google Search Console.
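As a quick illustration of that arithmetic, here is a minimal sketch of the CTR calculation; the impression and click figures are just the hypothetical numbers from the example above, not real Search Console data.

```python
# Minimal sketch: CTR is clicks divided by impressions, expressed as a percentage.
# The figures below are the hypothetical example from the text, not real data.

def click_through_rate(impressions: int, clicks: int) -> float:
    """Return the click-through rate as a percentage."""
    if impressions == 0:
        return 0.0
    return clicks / impressions * 100

print(click_through_rate(100, 5))  # 5.0 -> the 5% CTR page
print(click_through_rate(100, 2))  # 2.0 -> the weaker 2% page
```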

Testing The CTR Theory

I first tested this theory so many years ago that I can’t quite remember when, except that it was when CTR was introduced as a metric with PPC (Pay Per Click). The better the CTR, the less you paid for a click. In fact, if you got this right, you could be paying a lot less for a PPC campaign than those many positions below you.

I believed that, surely, if it was being used with Google paid search, then it would be a factor in natural search. However, at that time I found no benefit to rankings when organic click-through rate improved with normal search, and I gave up on the theory.

As with all SEO strategies, you should re-test. This time I did see some improvement, but it was so small it was not worth the investment of time or money. So, let’s bring you up to date.

So How Do You Manipulate CTR?

There are four approaches to this:

  • Use automated software (an emulator or CTR bot).
  • Pay someone to visit your site and emulate a real visitor with organic clicks.
  • Use several mobile phones and do the organic CTR yourself.
  • Use multiple VPNs on different devices and do it yourself with varied SEO traffic.

The first option normally requires a subscription, where an automated traffic bot pretends to be human (an emulator) and will visit your website for the agreed amount of time. There are many emulators out there, but very few have ever got any recognition for doing the job. I have tried three different options; the last one came with some recognition, but in my opinion it made no difference. I won’t mention their names for this reason: this was my testing only, and some may argue I did not try it for long enough. After three days, rankings and organic SEO traffic dropped, so I immediately cancelled and removed the click-through rate manipulation bot.

The second option requires employing real people to present what looks like genuine user behaviour to the search engines. This normally involves using micro-workers from countries where the cost of living is lower, so the project cost is affordable. After trying this SEO technique for a few months, I did see a small rise in rankings, but it never rose to the point where I felt it covered the cost. In other words, for the same cost I could add content and more links and get a better improvement in rankings and traffic.

The third option is too complex to cover fully here, but it involves buying many “pay as you go” Android phones and new SIM cards, registering them under different names, and using mobile data only to visit your website.

The fourth involves using several VPN services, or one that has multiple local VPNs, then changing browsers and equipment to mix up the visitor engagement signals.

CTR Manipulation flow chart

Was This Proof It Worked?

No, it wasn’t proof. You see, this was not just about CTR; it was also about the metrics that occurred after the click. Let me explain with a question.
Suppose your search term appears 100 times and visitors click on that page 50 times. This would be a CTR of 50%. Amazing, right?
But just suppose every one of those visitors immediately bounced back to Google from your site without engaging with it. Would you still expect to get a ranking boost from this? Of course not.
I believe that a high bounce rate after a high CTR will actually lose you rankings.
So, in my opinion, click-through rate cannot be measured as a standalone metric; it is also about what happens after the click. This is why I think automated emulators just do not work yet. They may get the click-through rate manipulation right, but fail to produce the engagement signals and organic user behaviour afterwards.
With a high click-through rate followed by high engagement with your site, you should expect some sort of rankings boost. But a high CTR followed by a high bounce rate could give you a rankings loss.
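To make that point concrete, here is a minimal sketch that reads CTR alongside a post-click signal such as bounce rate, rather than treating CTR as a standalone metric; the query data is entirely made up for illustration.

```python
# Minimal sketch: a high CTR means little if visitors bounce straight back,
# so look at CTR and bounce rate together. All query data here is hypothetical.

queries = [
    # (query, impressions, clicks, bounced_visits)
    ("example term a", 100, 50, 48),  # huge CTR, but almost everyone bounces
    ("example term b", 100, 5, 1),    # modest CTR, but visitors stay and engage
]

for query, impressions, clicks, bounces in queries:
    ctr = clicks / impressions * 100
    bounce_rate = (bounces / clicks * 100) if clicks else 0.0
    print(f"{query}: CTR {ctr:.0f}%, bounce rate {bounce_rate:.0f}%")
```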

A Metric Based Summary

When I have gained rankings using CTR, it has been because of the good metrics after the click, where visitors viewed more than one page and explored the site, or completed a form, etc. It is also fair to say that the local business sites I have been working on had a relatively high yield from fewer clicks, and I design pages with conversion in mind, so I don’t need thousands of clicks a day, for example. Sometimes only a handful of clicks per page is enough in these niches. This does mean I would prefer the VPN solution, from a cost point of view, should I do more experiments today.

You could use paid social media clicks, such as Facebook ads, to deliver traffic to your site if you are confident the user behaviour will be positive. But in this case the CTR is not being manipulated; it is just a way of adding further on-page user metrics.

But I go back to a previous point: I do not have any websites that have used up all their opportunities for adding great content or better link building. I believe these basics still deliver a better return than CTR manipulation, but it has taken these SEO experiments to prove their worth.

The only time I could see the benefit of implementing click-through rate manipulation is when you are at position 3 or 4 and are looking for that last bit of extra respect from Google to get you up to position 1.

And then only if the rewards of those few places are worth it.

But if you think it is time to implement this SEO strategy and you are looking for an SEO course, then Craig Campbell, Chris Palmer and Holly Starks all do SEO courses on implementing CTR. Just Google their names.


Relationship Between Main Domain and Sub Domain

If you ask most SEO experts whether they prefer subdomains or subfolders for their content, in most cases the subfolder option is preferred. I agree, but with one exception: lead gen pages, or websites that have a lot of location-based info where there is a risk of thin or similar content.

domains

In these instances, I opt for subdomains to separate the blog posts and higher word-count content from the landing pages.

I have been doing this a long while now and have a system and a minimum unique content level I work to on each local page. However, with one site’s subdomains I am working on now, there seems to be an issue with Google finding new updates on each page. It discovers the pages fine when they are brand new, but takes forever to find any changes to the pages as I have been adding more value to them.

There is one difference to previous projects: the main website is also struggling to have its added-value content discovered, so this week I have been experimenting with the relationship between the main website and its subdomain.

How dependent is the subdomain’s ranking on the main domain, even if the subdomain has quality links going to it?

A Focus on the Main Domain

So, all work on the subdomains has stopped; the focus is on adding more and more quality content to the main domain so it starts to see traffic by itself. In the meantime, I will measure any metrics that change with the subdomain, to see the relationship between the two.

More blog posts and related subfolder content are being added slowly and on different dates. Will this prove that, before working on any subdomain, the main site needs to be sound, with plenty of content and traffic?

It will also be interesting to see where the line is, or whether it will make no difference to a subdomain’s value.

Let’s see!


Page Resources Not Loaded Oxygen WordPress Technical SEO

This week I have been doing some technical SEO investigation, to understand why so few of my pages within a site designed with Oxygen for WordPress are being fully cached by Google.

By using “Inspect URL” in Google’s Search Console, then “More info”, it will show whether all the page resources of a web page are being cached. This can be very important for SEO, especially if critical components of a page are missing, such as CSS and frameworks.

Google Crawl Quotas for a Single Page

In my case, several pages were not being fully cached. Google was only caching about the first 25-30 page resources before deciding not to load the rest. A bit of investigation and the critical term “Crawl Quota” appeared yet again.

I had always believed crawl quotas related to how often Google came to a website, or how many pages it would cache in a particular period, based on the quality of the site. I am now more educated and can say it is also about how many resources per page it will cache before stopping.

non page resources

From Elementor to Oxygen

I have recently been getting to grips with Oxygen, after being disappointed with load times from Elementor template designs. Oxygen has been a saviour for my opinion of WordPress websites, as load speed is much faster.

I had assumed this was because it used far fewer resources to achieve the faster speed. But no, even with faster speeds, the page resource count is about the same as with Elementor for the designs and projects I have been putting together.

What also confused me was that some pages, even in a second site with only 15 page resources, were still not being fully cached by Google. However, I eventually figured out I could easily fix this by “Regenerating the CSS” within Oxygen’s settings menu. After doing this, every page inspection produced a cache of 100% of the page.

It does worry me that I must log in regularly to regenerate the CSS to ensure pages can be cached by Google. This will be on my Oxygen checklist. If it only needs doing after a change or added content, then this will be bearable.

But back to the first website: with page resources well above 25-30, there were issues, especially on one page with 40 resources. I found a solution by not using the CSS in Oxygen and replacing it with a plugin called Autoptimize. It combines many of the CSS and JS resources into far fewer files, in this case from 40 down to 9.
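If you want a rough before-and-after check of how many CSS and JavaScript files a page references, a small script like the sketch below can count them. This is just an illustration, assuming the requests and beautifulsoup4 Python packages; the URL is a placeholder.

```python
# Minimal sketch: count the CSS and JavaScript resources a page references,
# as a rough before/after check when combining files with a plugin such as Autoptimize.
# Assumes the requests and beautifulsoup4 packages; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

def count_page_resources(url: str) -> dict:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    css_files = soup.find_all("link", rel="stylesheet")   # stylesheets referenced in the HTML
    js_files = soup.find_all("script", src=True)          # external scripts referenced in the HTML
    return {"css": len(css_files), "js": len(js_files), "total": len(css_files) + len(js_files)}

print(count_page_resources("https://example.com/"))
```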

all page resources

I have started using this on Elementor sites also, to see if it makes a difference to SEO rankings by improving the quality score of a website.

If you are having ranking issues with a WordPress site, your page load speed is good, and you know you have a good link profile and content, take a few minutes to check how much of each page is being cached by Google. You may be surprised at what you find.


Experimental Test Site – Digital Marketing Las Vegas

I am a regular watcher of Craig Campbell’s and Chris Palmer’s “Ask Me Anything” podcast that goes out on Wednesdays. During one of these episodes in late September, they joked that they should have a competition against each other: create a new webpage or site and use whatever knowledge they had, to see who would rank highest for a particular search term within 60 days. I immediately thought this would be a great experiment and an opportunity to test the strategies I could use and compare them with their results.

Digital marketing Las Vegas

Ten minutes later the competition was opened up to everyone. The search term would be “Digital Marketing Las Vegas”, and no exact or partial match domains could be used. No dirty tactics that hurt other sites, but apart from that pretty much everything was open to test.

I decided right from the start there would be no black hat stuff at my end; in fact, winning didn’t matter. It would be interesting to compare who got where and how.

The Clean Strategy Starting From Nothing

  1. I bought a brand new domain, “Visibiltybam.com”. I don’t know why, except I knew it needed to be a brand. The .com represented that this was to be a USA business, albeit a non-existent one.
  2. I then looked at buying a cityscape image of the city that I could easily alter to create unique images for every article piece. This cost $10 courtesy of a Fiverr advert.
  3. I then looked at finding a free WordPress template to start the site. Every process was to be done as cleanly as possible, as if I genuinely was a Las Vegas digital marketing firm.

Get The Technical SEO Right First

Whenever I start a project, especially using WordPress, I do some due diligence first on the structure. For instance, after adding a bit of content to the first WordPress template I chose, I could not get the mobile version to load very fast. With only 60 days to rank, I did not want to be re-coding stuff just to get an acceptable load time, so that template was binned.

The second template kept showing “Mobile Errors” within the Search Console on every live test. It loaded fine on several mobile phones but, once again, seemed too much trouble, so that was binned also.

The third template loaded at an acceptable speed and showed no mobile errors, so it was chosen, even if I didn’t really like the colour style. But with only 60 days, beggars can’t be choosers, and it was free.

As much as this bit of technical SEO took several hours of testing, getting the speed and mobile stuff right would save much more time down the line.

Keep Within Theme – The Content Strategy

Experience had taught me that the “Las Vegas” part of the search term was as difficult to rank for as the “Digital Marketing” part. So the site needed to have a Vegas theme running right through it, even on the non-marketing pages. This was supported by the fact that so many of the first-page results were dominated by SEO directories with listings of local agencies, including loads of local addresses, zip codes and phone numbers.

  • I also chose to keep all pages within one click of the home page, with one exception that I will mention later, so the website would be the main theme, not the subfolder.
  • I had visited the Las Vegas Pubcon 10 years earlier; this is an annual SEO conference, so the first piece of relevant digital marketing content was a piece about this conference. Although I was disappointed at how little of it was now on YouTube compared to 5 years ago.
  • The “About Page” contained a small amount of local information, as well as local zip codes, hotels and even a picture of my last visit to Vegas 10 years ago. The false name I chose was a bit of a joke: Jamie Claus, the long-lost nephew of Santa Claus.
  • Each page of content would support the main “money page”, so I looked for existing articles published on local sites, re-wrote each article adding more content, and then linked out to the original articles. Every piece would cement the Vegas theme and relate to actual local news.

The exception with the content was creating my own mini directory of other digital agencies within the region. This page would contain several local addresses, phone numbers and zip codes. It was important that Google found the data, but I did not want this page to rank, so it was the only page 2 clicks deep from the homepage.

Links Needed To Be Cheap

Following my own rule of not using any existing resources, I looked for links on Fiverr, acquiring 5 guest posts in total for under $90. I also had one quick video, made by my 12-year-old daughter, published on YouTube.

Three social media accounts were set up: Facebook, Twitter and Pinterest. Out of these three, Pinterest added the biggest link value.

The 60-Day Results

Google position 11

On the morning of the final day, this strategy meant I ranked 11th in the UK but a lowly 34th in the States. One of the guest posts had still to be cached by Google, so I knew it might climb. However, later that day, all results disappeared as Google started a core update right on cue. Only one other person in the competition ranked higher. A few days later, when the update seemed to settle, results returned to position 9 (yes, first page) in the UK, but around 25 in the States, and dare I say higher than anyone else. However, this was 4 or 5 days too late to be the winner, but that was never the plan anyway.

Google position 4

A week or so later, as I write this, the results are position 4 in the UK, but have dropped again down to 35 in the USA. The above is a demonstration of how Google shows different results in different countries. One factor in this is that most metrics on the site are UK-based, so I have been rewarded here, but not in America.

Google USA Results

When the actual cost of $110 is factored in, to compete with other digital marketing specialists in a huge foreign country, even position 35 is a good result. Most SEO professionals would expect to pay into the thousands to take on existing powerful websites at their own game. But to rank 4th in the UK is way beyond my expectations, and I will use this strategy for future projects.
