One of the questions I am asked most frequently is how I run my projects at such a high links per minute count when using GSA Search Engine Ranker, so I decided to publish this post to explain my process along with the logic behind the various steps.
This particular method is designed to be used as the very bottom tier to help index your upper tiers. With that goal in mind, the focus is on building as many do follow links per day as possible, giving search engine spiders as many paths as possible to find and climb your tiers. In the past I have used this as a tier three layer, but I currently also have a number of test projects where I have demoted this process to tier four. If you wish to try this for indexing, be sure to use the method I explain here to pre-filter the targets and make sure they are do follow only!
As with my ultimate guide to GSA Search Engine Ranker post, I will go over my tool settings and use the free Skitch tool to take screengrabs and add visual aids to them.
As you can see from the screenshot below, implementing these steps is how I am able to run at 538 links per minute across 245 active projects, with 99% of the link output being do follow.
Tweaks At The Hardware Level
Without a doubt, the easiest way to increase your links per minute output is to upgrade the VPS or server your tool is running on. I highly recommend you check out my guide on how I choose, set up and optimize my VPSs to maximize the output of SER. The better the hardware and internet connection the tool has access to, the better it is going to perform; it's as simple as that.
Additionally, freeing up resources on your VPS will also help the tool's performance. For example, if you are running GSA Search Engine Ranker at 1800 threads while GSA Captcha Breaker prioritizes accuracy over speed, Scrapebox scrapes the internet and GSA Platform Identifier processes the Scrapebox harvest, then your links per minute is going to suffer.
You can see the difference upgrading your hardware can make in the screenshot below from a few months back, when I was using a dedicated server and pulling over 944 links per minute.
Back when I was running a larger GSA based operation, I had one server broken into three VPSs: one gathering targets via footprint scraping and link extraction, one processing all the harvested URLs with GSA Platform Identifier, and a third running its own instances of GSA Search Engine Ranker and GSA Captcha Breaker using the filtering method explained in this post to produce filtered lists for the VPSs running live projects.
Although not exactly hardware, I highly recommend you invest in a number of semi-dedicated proxies when running SER in this way, as your VPS or server provider may stop your service if your server IP receives complaints from webmasters. There is no exact number of proxies required for this process, as it depends on the number of links you are attempting to build in a day, but I recommend at least 30.
Many people think that you should have one proxy per 10 active threads, but my testing has shown this to be wrong. For example, I have consistently used 50 semi-dedicated proxies across 1800 threads to build 1.5 million links per day in the past. That is a ratio of one proxy per 36 active threads.
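The ratio above is simple arithmetic, and you can sanity-check it for your own rig. This is just a minimal sketch; the 1800-thread and 1:36 figures come from my own setup, so treat them as assumptions and plug in your own numbers:

```python
# Rough proxy-count estimator based on the two ratios discussed above:
# the common-wisdom 1 proxy per 10 threads versus the 1 per 36 I ran.

def proxies_needed(threads: int, threads_per_proxy: int) -> int:
    """Round up so every thread is covered by a proxy."""
    return -(-threads // threads_per_proxy)  # ceiling division

threads = 1800  # thread count from my setup; adjust for your rig

common_wisdom = proxies_needed(threads, 10)  # 1 proxy per 10 threads
my_observed = proxies_needed(threads, 36)    # 1 proxy per 36 threads

print(common_wisdom)  # 180 proxies under the 1:10 rule
print(my_observed)    # 50 proxies at the ratio I actually ran
```

The point of the comparison is simply that the 1:10 rule would have me buying more than three times the proxies I actually needed.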
Tweaks At The Tool Level
Now we will be taking a quick look at some tool level tweaks you can implement.
Firstly is increasing the active thread count limit you have set for the tool. There is no one size fits all answer to this as it depends on the hardware you are using so you will have to do some testing to work out what your rig can take without crashing.
Next we have unticking the automatic detection of hardware usage. These options may be useful if you are using the tool on a desktop or laptop at home, but on a VPS they are just going to slow you down. My logic when using the tool in this way is that I want it to maximize the hardware it has available to increase performance as much as possible.
Then we have unticking the bandwidth limit. Similar to the above, when using GSA Search Engine Ranker in this way I want it to maximize the resources available to it to increase the daily link output.
Moving On To Captcha Settings
As you can see, when running this type of project I only have GSA Captcha Breaker enabled to solve captchas, and I have retries set to zero. If GSA Captcha Breaker is having a problem solving the type of captcha protecting a target, I want that thread to drop the captcha as soon as possible and get onto the next one.
I have the "Finally ask user if everything else fails" option unticked all the time, but on the off chance some of you have it on, I recommend you turn it off when running these kinds of projects. It will just keep threads tied up on captchas that require you to type them in, when there is a good chance that thread could have dropped the target, picked up the next one and pushed it through Captcha Breaker with no problems.
Moving On To The Indexing Tab
The whole point of building these links is to help with your indexing rates; as you are only building on platforms that should already be indexed in Google, there is no need to submit these links to an indexing service.
Skipping the index service submission reduces the actions required on each submission, meaning GSA Search Engine Ranker can make more submissions in a shorter time frame, as well as saving you index credits that would otherwise be wasted.
The Advanced Tab
Although I always have these options turned off, some users may have them enabled for links touching their money site. Getting PR or the Yandex TIC takes time, no matter how little, and when running at 1800 active threads over the span of a day, week or month, all that time adds up into link loss, so I recommend you disable these options when running this type of project.
Have The Full Installation Completing The Job
I have touched on this in a few other posts, but when you scale up your SER operations I believe it is best to have separate installations completing the various jobs required of SER. For example, this one is set up to run a massive tier three link campaign with the purpose of helping your links index.
To do this you are required to turn off a number of options that may be useful for things such as contextual link building or list filtering, but on the flip side the rig is highly optimized for building out indexing tiers.
When first starting out it can be expensive to have the resources required to run like this. You can use a single VPS instead, re-optimize it every few days and give each day a specific task with a specific set-up. Here is an example.
Monday – Contextual Link Building
Tuesday – Contextual Link Building
Wednesday – Contextual Link Building
Thursday – Indexing Campaigns
Friday – Indexing Campaigns
Saturday – List Filtering
Sunday – List Filtering
The next tweak is the way you set these projects up in the scheduler as shown in the screenshot above. Unfortunately, there is no one size fits all option for this as it depends on your filtered list, your VPS and how you have set your projects up so some testing is going to be required on your part.
I will use the project I am currently building a tier three for as a quick example. It currently has around 50,000 verified URLs on its tier two. I will cover this in the next section, but ordinarily I split my URLs into batches of 100 per project; as this VPS has other things going on, I have had to try batches of 200. This means that I have 250 tier three projects that need to run to meet my daily link output requirement.
As shown in the first screenshot in this post, my current rig is able to run 245 projects at around 540 links per minute, meaning it can kick out around 780,000 submissions per day. My daily link requirement for this project is 1,250,000. If I let all projects run at the same time I fall massively short of this, but by running a limited number of projects in the scheduler, giving each project more system resources when running, my overall links per minute climbs, bringing me closer to my daily link requirement.
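The figures above can be verified with some back-of-the-envelope arithmetic. This is just a sketch using the numbers from my own rig (LPM, URL counts and the daily requirement are my figures, so substitute your own):

```python
# Back-of-the-envelope throughput check for the batch and LPM figures above.

MINUTES_PER_DAY = 24 * 60

lpm = 540                   # links per minute my current rig sustains
daily_requirement = 1_250_000

tier2_urls = 50_000         # verified URLs on the tier two
urls_per_project = 200      # batch size per tier three project

projects = tier2_urls // urls_per_project     # tier three projects needed
daily_output = lpm * MINUTES_PER_DAY          # submissions per day at this LPM
required_lpm = daily_requirement / MINUTES_PER_DAY  # LPM needed to hit target

print(projects)             # 250 tier three projects
print(daily_output)         # 777,600 submissions per day (roughly 780k)
print(round(required_lpm))  # ~868 LPM needed to actually hit 1.25 million
```

This makes the shortfall obvious: at 540 LPM the rig tops out around 780,000 submissions a day, while hitting 1,250,000 would need roughly 868 LPM sustained around the clock.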
I just want to clarify that I am aware I will not meet my daily link requirement with the current VPS I am using, as I require a server to push out that kind of link count, but I am trying to keep costs down until I find a new scalable method, so I am making do with what I have.
Some Project Level Tweaks You Can Implement
When running projects to index your tiers, the most important tweak at the project level is to enable only platforms that should already be indexed in Google and that increase the chance of Google crawling your site. For this reason I recommend you have only Blog Comments, Guestbooks, Image Comments and perhaps Trackbacks enabled, to remove the risk of GSA Search Engine Ranker building links on a platform that does not meet the overall purpose.
The next tweak, shown in the screenshot below, is to optimize your project captchas for speed. I have GSA Captcha Breaker set up as my first captcha service, and I only let SER send captchas for these projects to Captcha Breaker. There are two reasons for this: the first is to keep manual captcha costs down, and the second is to increase speed, as Captcha Breaker is lightning fast for the majority of captcha types running on these engines.
I also tell the project to skip forms that can't be filled. I have no data to back this part up, but my theory is that if it chooses randomly then it's just time wasted; we currently have no way to check the success rate of these random form selections, so I just drop them. Finally, for captcha filtering, tick "Skip hard to solve captchas". The majority of these engines will have easy captchas, but some will have ReCaptcha and similar captcha types running on the domain. GSA Captcha Breaker has a close to zero success rate on hard captchas, meaning it will fail anyway and waste time, so if a hard captcha is detected I just want it dropped.
The next project level tweak is to complete the navigation in the screenshot above and disable any of the engines from the platforms I recommended that require email verification. You will only lose a handful of engines by doing this, but it means your projects don't have to waste time verifying emails, and as I have previously touched on, when using a huge number of projects and active threads this time all adds up.
As shown in the screenshot above, we have the next set of recommendations. As you won't be building any additional links to these link types or sending their verified links to an indexing service, there is no reason to waste time having the projects verify that the links were built successfully.
Changing the "When to verify" option to never prevents the projects from randomly going into verification mode when you want them to be submitting as many links as possible. As I have previously touched on, you won't be sending these links to an indexing service, so disable it here at the project level too.
Ideally you should be using a properly filtered verified target list, meaning you can have your submission retry amount set as low as five to prevent your projects from eventually becoming stuck in a constant loop of dead sites that causes links per minute to drop off.
As shown in the screenshot above, the next tweaks concern where your projects pull their targets from. I turn all search engines off, as they waste threads and resources trying to find targets; when blasting in this way you want all threads actively submitting to the targets from your filtered lists.
The next point is to pull all of your targets from folders you have recently filtered. I see people loading premium lists directly into their folders and then letting GSA Search Engine Ranker pull targets from them. Many list sellers do not perform even basic maintenance on their lists, meaning the majority of the list is essentially dead and slowing you down; that's one of the reasons I developed the filtering method you can implement to increase your links per minute.
Moving on, we come to the project's scheduled posting options, as shown in the screenshot above. Realistically, with this type of project you don't care how long the links built actually stay alive; you just want the search engine spiders to find them and be pushed up your tiers, so you can massively increase the maximum posts per account.
The logic on this is that creating an account on a site takes time, so once that time has been spent, use that account as much as possible before the project has to take time out to create a second account. This is far from advisable when building contextual projects, as if the account is deleted you lose all the posts related to it, but as explained, with these indexing blast projects you don't care how long the accounts stay alive.
As you can see from the screenshot above, I currently run these projects with all filters turned off and don't plan to change that anytime soon. As these projects are producing links to help your tier two and tier one become indexed, I see no reason to filter them. Applying filters massively restricts the targets you have available at any given time, when in my opinion any potential negative effects the unfiltered targets possess would be filtered out by the tier two and tier one links above them.
As shown in the top right of the above screenshot, the next option is the number of URLs you add to each project for link building. There is no one size fits all option here, as it totally depends on the verified list you are using as well as your hardware.
One thing I will say is that 200 is absolutely the upper limit of how I set my projects up with the current VPS I am using; ordinarily I will have around 100 URLs per project. One major problem I have seen when helping people is that they just use the built-in tiered system, with a single project on their tier three trying to post to a crazy number of verified URLs on their tier two. Although I have seen Sven mention a few times that this should not cause any issues, my own personal testing suggests otherwise.
In my opinion, a good filtered list is second only to the hardware you are using as the most important factor in improving your links per minute.
I already have this 6686-word guide on how to properly filter your list as well as this 1998-word case study available on how I took a premium list from 75.77 links per minute up to 763.33 links per minute with list filtering so I won’t be going over it much here.
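To put that case study in perspective, the improvement from list filtering works out to roughly a tenfold gain. A trivial sketch using the two LPM figures quoted above:

```python
# Improvement factor from the list filtering case study quoted above.
before_lpm = 75.77   # premium list loaded as-is
after_lpm = 763.33   # the same list after filtering

improvement = after_lpm / before_lpm
print(round(improvement, 2))  # 10.07, i.e. roughly a 10x gain
```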
The short version of the reasoning behind it is that webmasters take a number of measures to counter automated tool usage on their sites, meaning your verified list will slowly fill up with these targets and slow right down. I also have this post where I explain the full process webmasters use on the back end to counter your verified list.
The Side Effects
As you are essentially trying to push GSA Search Engine Ranker as hard as possible when building these indexing projects, there are a few side effects you should expect.
The first is thread lock-up. This is essentially when a thread becomes stuck on a task and won't release from it. You can check how many locked threads your set-up currently has by pressing stop and then checking the number in the bottom left of the tool near the T. Over time some of these threads may release, but when using SER as I am explaining in this post, a fair few just stay locked.
The next are mainly error messages and warning pop-ups, as shown in the screenshots below. Essentially I just press no and ignore them, and SER will keep going about its business for you.
Due to GSA Search Engine Ranker being a 32-bit program, it has a number of limitations, such as a hard limit on the amount of RAM it can use as well as on the number of active threads the tool has access to.
Currently there is no way around this, so if you require more links than your optimized rig is able to kick out, I suggest you scale up: invest in a new copy of GSA Search Engine Ranker and GSA Captcha Breaker as well as a new VPS or server, and scale your link count that way.
Wrapping It All Up
I hope this post helps people increase their links per minute and improve the overall indexing rates of their tiers. Keep in mind, as I already explained, that setting GSA Search Engine Ranker up as described in this process can cause issues when using it to build contextual links or for list filtering, so I recommend you either use a variation of the daily schedule I outlined earlier in the post or use individual VPSs for different tasks.