Why The Native GSA Search Engine Ranker Web 2.0 Engines Are A Waste Of Time!

Welcome to my case study, where I will investigate the efficiency of the native GSA Search Engine Ranker web 2.0 engines. Ever since I first began using the tool I have avoided them, and while I do recall doing some testing with them back then, my knowledge of the tool and how best to set it up has increased drastically since.

Time For A Prediction

Web 2.0 article platforms are one of the most complex link types automated link builders use, because the tool needs to complete a large number of steps just to create an account on the platform and post an article to it. Additionally, these platforms are usually protected by a stronger captcha system and run anti-automation measures on their backend to detect automated behavior. This is why most people, including myself, choose to use either automated web 2.0 creators or manual web 2.0 submission services.

From my understanding, a standard GSA Search Engine Ranker install has a few issues when it comes to dealing with JavaScript. Now I'm no code monkey and, in all honesty, JavaScript is beyond me, but I have been told by multiple people that the majority of web 2.0 platforms rely on it to build their pages, meaning SER struggles to submit to their engines.
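As I understand it, the core issue is that a plain HTTP client downloads raw HTML without executing any scripts, so form fields a page builds client-side never appear to it. Here is a minimal Python sketch of the idea; the signup URL is purely hypothetical, not a platform SER actually targets:

```python
import requests

# Hypothetical web 2.0 signup page, used purely for illustration.
SIGNUP_URL = "https://example-web20-platform.com/signup"

# Fetch the raw HTML the way a non-browser client such as SER would:
# no JavaScript runs, so anything the page builds client-side
# (form fields, captcha widgets, hidden tokens) is simply absent.
response = requests.get(SIGNUP_URL, timeout=15)

if "<form" in response.text:
    print("Signup form present in raw HTML - scriptable without a browser.")
else:
    print("No form in raw HTML - the page likely builds it with JavaScript,")
    print("which a plain HTTP client never sees.")
```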

Due to these issues, I predict the project I create for this case study is going to be a resource hog, meaning running it could have a negative effect on SER projects running different platforms by taking resources away from them, while the web 2.0 case study project produces little to no links.

How I Plan To Run The Test

I feel a single project will be sufficient to provide enough data to decide whether the native GSA Search Engine Ranker web 2.0 platform and its engines are worth the resource requirements and captcha costs to run on live projects.

To accurately track the project's system resource requirements, I plan to make sure the case study project is the only one running during the test so I can view its resource usage across the bottom of the tool.

I will ensure I am running the current version of GSA Search Engine Ranker so I have the latest script versions for these engines, and I plan to set the project up with all of the engines currently available in the web 2.0 platform enabled, as shown in the screenshot below.

GSA Search Engine Ranker Web 2 Engines

As is standard practice when building web 2.0 platform links with any tool, I will be using 250 real email accounts rather than catch-all emails, a human-solved captcha service, and semi-dedicated proxies in an attempt to increase the chances of link verification as much as possible. When pasting the fresh proxies into SER I will test them to make sure they are alive and usable for submission.
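For anyone who wants to sanity-check proxies outside of SER's built-in tester, the same check is easy to script. Below is a minimal Python sketch; the proxies.txt filename and test URL are placeholders, not part of my actual setup:

```python
import requests

TEST_URL = "https://www.example.com"  # any reliably reachable page works

# Load one "ip:port" proxy per line from a placeholder file.
with open("proxies.txt") as handle:
    proxies = [line.strip() for line in handle if line.strip()]

alive = []
for proxy in proxies:
    proxy_map = {"http": f"http://{proxy}", "https": f"http://{proxy}"}
    try:
        # A short timeout quickly weeds out dead or overloaded proxies.
        response = requests.get(TEST_URL, proxies=proxy_map, timeout=10)
        if response.ok:
            alive.append(proxy)
    except requests.RequestException:
        pass  # dead, banned, or timed-out proxy

print(f"{len(alive)} of {len(proxies)} proxies are alive")
```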

I will enable re-verification of links for the project so that any link GSA Search Engine Ranker verifies but the web 2.0 platform later deletes will be caught on re-verification and removed from the verified targets pane, keeping the final count accurate.

I plan to allow the project to process 1,000 human-solved captchas before changing its status to Active-Verify, forcing it to keep checking through its submitted targets for successfully verified links so I can accurately gauge the link yield of the test.

So What Actually Happened

Out of the seventeen engines enabled for the project, I only saw active submissions being made by three of the engines, with only one producing verified URLs. As you can see from the screenshot below, both the Pastebin and Amazon engines had successful submissions registered but did not produce verified URLs.

GSA Search Engine Ranker Submissions

All verified URLs came from the colourlovers.com engine. Unfortunately, this is a nofollow engine, as shown in the screenshot below, and in my opinion these links are useless for tiered link building. This video by Matt Cutts of Google and this post released by Bing confirm that both search engines essentially process nofollow links the same way: the links are disregarded from the page's link graph by the search engine spider. In my opinion, this means that any links lower in your pyramid pointing to such a page are wasted and offer no benefit to your money site.

GSA Search Engine Ranker Verified URLs
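Rather than relying on the engine listing alone, you can confirm the nofollow status of your own verified URLs with a short script. This is a minimal sketch using the requests and BeautifulSoup libraries; the verified URL and money site domain are placeholders:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder values - swap in a real verified URL and your target domain.
VERIFIED_URL = "https://www.colourlovers.com/lover/example-profile"
MONEY_DOMAIN = "example.com"

html = requests.get(VERIFIED_URL, timeout=15).text
soup = BeautifulSoup(html, "html.parser")

# Inspect every outbound link pointing at the target domain.
for anchor in soup.find_all("a", href=True):
    if MONEY_DOMAIN in anchor["href"]:
        rel = anchor.get("rel", [])  # bs4 returns rel as a list of tokens
        status = "nofollow" if "nofollow" in rel else "followed"
        print(f"{anchor['href']} -> {status}")
```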

As you can see from the screenshot below, I was wrong about the web 2.0 engines being resource hogs.

GSA Search Engine Ranker Web 2 system resources

The active thread count, memory, and CPU usage of the project were surprisingly low. My theory is that this is linked to the number of actions required to create an account on a web 2.0 engine and then post the article: each thread stays caught up on the same page performing many actions, rather than the tool having to reload a bunch of web pages in a very short time, as happens with a blog comment campaign where each action completes almost instantly.

Moving Forward From Here

Although the test only required a small amount of system resources on the VPS I use, the loss of these resources on the lower-end systems used by some users may have a significant impact on the performance of their tool. The fact that 1,025 captcha requests only produced 93 verified links, all of them nofollow, also leads me to believe that these system resources are best allocated to projects doing what GSA Search Engine Ranker does best, such as content management system submissions.
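To put those numbers in perspective, the yield math works out roughly as follows; the captcha price below is an assumed figure for illustration, not what I actually paid:

```python
# Figures from the test above.
captchas_solved = 1025
verified_links = 93

# Assumed human-solved captcha price, for illustration only.
price_per_1000 = 1.50  # USD

verification_rate = verified_links / captchas_solved
total_captcha_cost = captchas_solved / 1000 * price_per_1000
cost_per_link = total_captcha_cost / verified_links

print(f"Verification rate: {verification_rate:.1%}")   # ~9.1%
print(f"Captcha spend: ${total_captcha_cost:.2f}")     # ~$1.54
print(f"Cost per nofollow link: ${cost_per_link:.3f}") # ~$0.017
```

Even at a generous captcha price, a roughly nine percent verification rate of nothing but nofollow links is hard to justify.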

Although these results have the potential to improve in the future if the engine scripts are updated, I still see no reason to use the native web 2.0 platform engines in GSA Search Engine Ranker when there are dedicated web 2.0 creators out there supporting better platforms with much higher verification rates.

At the time of writing, version two of the SEREngines plugin is in the beta testing phase and shows potential to add web 2.0 capability to SER if the remaining few bugs can be corrected, but I feel it is safe to say that having the native web 2.0 platform enabled in your projects will only waste system resources and captcha credits that could be better spent elsewhere.
