
Every Algorithm Made by Humans Can Be Bypassed or Bent by Other Humans

I used a few different pieces of software to test CTR manipulation/crowd-search, and while every one of them still has room for improvement, they all worked. Some had horrible interfaces and buggy back-ends and were an overall pain to use, but hey… they got the work done.

I won’t name the software I used because I don’t endorse any of it, but I will talk about strategy and what worked for me personally. I specialize in YouTube SEO and video marketing, but I found this can be used on both Google and YouTube for higher SERP positions (with slight differences).

I’ll try to keep things as basic as possible so more people can follow what I’m talking about.

Let’s start with Google, shall we?

How it all started:

So everybody doing SEO is aware of RankBrain and all the goodies Google updates every year to create a better UX for the end user.
The factor that interested me the most is user interaction.

Now this one is pretty logical to me, and before I even start talking about CTR manipulation, let me say this:

  1. You need to have KILLER content of superb quality on the site you are trying to rank.
  2. Do content research, find the best KWs and create content around them.
  3. On-page SEO. I can’t stress enough how important this is. Take a peek at Matt Diggity’s on-page guide, or maybe there is some kind of guide here in the group.
  4. The domain shouldn’t be brand-spanking new. Let it age a little and start link building progressively, so it doesn’t raise an alarm at Google HQ. 😉
  5. Start a link-building campaign; there needs to be some traction with your site.
    Start slow if the site is new, and use link-building velocity to spread links through the whole month.
  6. Drip-feed social signals over 30 days when doing crowd-search.

Don’t blast — drip-feed.
It looks legit through the whole month when we start sending traffic to the site.
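The drip-feed idea above is just scheduling arithmetic. Here is a minimal sketch of how you might spread a fixed number of signals over 30 days with day-to-day jitter so the pattern isn’t machine-flat; the function name, jitter range, and totals are my own illustration, not anything from a specific tool:

```python
import random

def drip_feed_schedule(total_signals, days=30, jitter=0.3, seed=42):
    """Spread `total_signals` over `days`, varying each day's count
    randomly around the average so no two days look identical."""
    rng = random.Random(seed)
    base = total_signals / days
    # Random weight per day within +/- jitter of the base rate.
    weights = [base * (1 + rng.uniform(-jitter, jitter)) for _ in range(days)]
    scale = total_signals / sum(weights)
    counts = [int(w * scale) for w in weights]
    # Hand out any rounding remainder to random days.
    for _ in range(total_signals - sum(counts)):
        counts[rng.randrange(days)] += 1
    return counts

schedule = drip_feed_schedule(600)  # e.g. 600 social signals in a month
print(sum(schedule), len(schedule))  # → 600 30
```

The exact jitter doesn’t matter much; the point is that the totals per day wobble instead of being a constant 20/day.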


Where Does This Work

  1. I personally used it in combination with Local SEO and affiliate sites.
    It worked beautifully and it still does. The monthly searches for the KWs I tested (so far) didn’t exceed 5,000 searches per month.
  2. It can also drastically reduce bounce rate if you have a problem with that.

TIP: This method won’t skyrocket you from Page 10 to Page 1 on Google.

The best you can do is get to at least page 3; from there you have a realistic shot at reaching the first page for your desired KW.



Here is where most of the paid services go wrong:

    1. The majority of these service providers offer big numbers: up to 1,000,000 visitors (usually around 10k–50k) to your site for relatively cheap. This is the first problem, because there is a manipulation threshold for every KW. If your desired KW is searched approx. 7–9k times per month, the worst thing you could possibly do is send double or even more traffic to your site. Google collects data on all the searches around that KW, and if you send an unusual amount of traffic to rank faster, big G will pick it up — you don’t want to create any kind of suspicious activity and risk a penalty.
    2. You also don’t know the quality of the traffic, because not all traffic is equal in Google’s eyes. I’ll explain that below.
    3. People usually send traffic through one ‘direct’ URL, which is a huge mistake and looks super-unnatural.
    4. When a ‘visitor’ comes to the site, they usually sit dead at the top of the page for some time and then bounce — no movement.
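The "manipulation threshold" point is easy to make concrete. Below is a toy sanity check along the lines the post describes; the 0.5 ceiling is purely my own placeholder — neither the post nor Google publishes such a number:

```python
def max_safe_visits(monthly_searches, ceiling=0.5):
    """Cap planned visits at a fraction of the KW's real monthly search
    volume. The 0.5 ceiling is an assumed placeholder, not a known value."""
    return int(monthly_searches * ceiling)

def is_risky(planned_visits, monthly_searches):
    """The post's red line: sending as much traffic as the KW's real
    search volume (let alone double it) is what gets flagged."""
    return planned_visits >= monthly_searches

# A KW searched ~8,000 times/month: 16,000 visits is the 'double or
# more' mistake; 4,000 stays under the placeholder ceiling.
print(max_safe_visits(8000))    # → 4000
print(is_risky(16000, 8000))    # → True
```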



Now, I tested a lot of services on dummy sites across various marketplaces, and to be honest, very few are actually beneficial to your site.
So I tried to do this myself.

In order to execute this my way, I needed software that can “store” virtual personas (like having real people waiting for my command), so I reached out to a VA and ordered a few thousand profiles created in Excel, including full NAP (full name, address, phone) plus Google accounts, so around 60% of the personas are logged in.
After that, I started examining some of the patents Google holds regarding user interaction; most of the ones I found date from ’99 to 2005.

A lot of people are worried about canvas fingerprints that could leave a trail showing something is being manipulated in the SERPs
(things like which font, browser, desktop resolution, etc. the user has while browsing through Google to reach the desired website).
I never had any problem with this as long as you don’t send big batches of traffic to your site, but just in case you are worried (and for future Google updates):

I used iMacros (programmed, pre-recorded movements), so ‘visitors’ come to the site through various browsers (Chrome, Firefox, Opera…), and some browsers let you install add-ons that spoof/randomize your canvas fingerprint
(there are a bunch of them, just Google it), so you are covered and don’t have to worry about that.

The iMacros paths I used drive traffic to sites via:

  • Organic traffic – visitors search through pages on Google until they find your URL, then click on it
  • Direct traffic – the most popular one; they type your URL into the address bar and go to your site
  • Referral traffic – they click a link on your social media profile, a PBN, or any other external link pointing to your money site (extracting maximum link juice from that link if it’s do-follow, because traffic is flowing through it)
  • Pogo-sticking search – the visitor bounces through a few higher-ranked sites until they reach yours (this signals Google that the sites ranked above yours are not what searchers are looking for)

* All links used in the pathway need to be indexed in Google.
* For organic traffic, the targeted site needs to be in the first 3 pages of Google.
* Approx. 45% of the traffic comes through referral, 30% through direct, and the remaining 25% through organic search.
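That 45/30/25 mix can be sketched as a simple split; the function and numbers below just restate the post’s approximate ratios:

```python
def split_traffic(total_visits):
    """Split a visit budget into the post's approximate mix:
    45% referral, 30% direct, and the remainder (~25%) organic."""
    referral = round(total_visits * 0.45)
    direct = round(total_visits * 0.30)
    organic = total_visits - referral - direct  # remainder keeps the sum exact
    return {"referral": referral, "direct": direct, "organic": organic}

print(split_traffic(1000))  # → {'referral': 450, 'direct': 300, 'organic': 250}
```

Computing organic as the remainder guarantees the three buckets always add back up to the total, whatever the rounding does.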

Now, of course, visitors land on the desired URL, browse, click and do all the good pre-recorded stuff Google values, and then bounce off after 5–7 minutes (dwell time), or as long as you want them to stay.

I personally make sure around 20% of logged-in Chrome users return to the desired URL within 2 weeks, sending Google strong signals about the site and pushing it even higher in the SERPs. (Use sticky residential proxies for this; more about that below.)
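Planning those return visits is, again, just selection arithmetic. A minimal sketch, assuming a hypothetical list of persona IDs (the names and the seed are made up for illustration):

```python
import random

def pick_return_visits(personas, rate=0.20, window_days=14, seed=7):
    """Pick ~20% of the logged-in personas and assign each a random
    return day within the two-week window from the post."""
    rng = random.Random(seed)
    chosen = rng.sample(personas, int(len(personas) * rate))
    return {p: rng.randint(1, window_days) for p in chosen}

personas = [f"persona_{i}" for i in range(100)]  # hypothetical logged-in pool
returns = pick_return_visits(personas)
print(len(returns))  # → 20
```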

* Send visitors over a 30-day period (always drip-feed to look normal).

The main reason people doing this get penalized is that, for it to succeed, you need to use real residential IPs — which come at a price.
And yes, if using this method for Local SEO, try to get zero-spam, geo-targeted residentials from a real ISP
(meaning if you are targeting Germany and using German residential proxies, your Search Console will show ‘real-as-it-gets’ traffic from Frankfurt, Berlin, etc., depending on where the proxies are located in Germany).

Those visitors go through your site, scroll, interact, click through inner pages and links and all the good stuff… and everything is recorded by big G.

It doesn’t get any better than this, and I hope you managed to follow me through the text…

I probably didn’t manage to address everything here… but I want to hear from you in the comments below.
If you’d like me to write up another method similar to this one, like this post.
