Scraping my own list and GSA Threads
2014-07-08, 08:05 PM, (This post was last modified: 2014-07-08, 08:07 PM by luckskywalker.)
#1
Hey guys, sorry for so many questions :P

1.
I have GScraper and Scrapebox. Which do you prefer for scraping your lists for GSA?

Is it possible to "combine" them and use them together to build a better list?

2.
Do you set your GSA threads per proxy or per machine? I have a dedicated server with 16 GB RAM, so that is not a problem, but I only have 10 semi-dedicated proxies from buyproxies.
Reply
2014-07-08, 08:08 PM, (This post was last modified: 2014-07-08, 08:09 PM by Krrish.)
#2
I already told you: GScraper.... Is this not enough? :p

You can run 300 threads easily on that type of machine. Make sure the timeout is set to 150-180.

Reply
2014-07-08, 08:27 PM, (This post was last modified: 2014-07-08, 08:27 PM by amar125.)
#3
The more proxies and the better your specs, the more threads you can run.
From the proxies' point of view, the recommended ratio is:
1 proxy >> 10 threads

For specs, you need to test and see how many threads your VPS can handle without lagging etc. !!
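The ratio above can be turned into a quick sanity check. This is just an illustration of the guideline from this post; the function name is made up, and the 10-threads-per-proxy figure is the rule of thumb stated above, not anything GSA enforces:

```python
# Rough thread budget from the "1 proxy >> 10 threads" guideline above.
THREADS_PER_PROXY = 10

def max_threads(proxy_count, threads_per_proxy=THREADS_PER_PROXY):
    """Upper bound on threads for a given proxy pool size."""
    return proxy_count * threads_per_proxy

# With the OP's 10 semi-dedicated proxies:
print(max_threads(10))  # → 100
```

By this guideline, 10 proxies support about 100 threads, so running 300 threads would mean buying roughly 30 proxies (or accepting more bans per proxy).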
Reply
2014-07-08, 08:31 PM,
#4
(2014-07-08, 08:27 PM)amar125 Wrote: The more proxies and the better your specs, the more threads you can run.
From the proxies' point of view, the recommended ratio is:
1 proxy >> 10 threads

For specs, you need to test and see how many threads your VPS can handle without lagging etc. !!

Hm, I will need to invest more in proxies then :P

Thanks
Reply
2014-07-10, 04:22 AM,
#5
Scraping is all about proxies. Public, not private.

FOR THE LOVE OF ALL THAT IS HOLY! DON'T SCRAPE WITH PRIVATE PROXIES!!

Notice how GScraper sells proxies for ONE-TIME use. There's a reason for that. If you disable the multi-threaded harvester in Scrapebox and look at the individual results with private proxies, chances are you'll see a lot of server error codes. Also, the more specific your footprints are for scraping, the less Google will allow you to search.

Check out the videos by loopline; he explains it better than I can. What I recommend is to use the proxy finder in GSA and save the proxies to a file, then import them into Scrapebox and test them. You will find some anonymous, Google-passing proxies. It's better than buying them and burning them out.

Trust me, I just spent the last week figuring this out for myself. Countless emails between me and Scrapebox support.
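The workflow described here (harvest proxies with GSA's proxy finder, save them to a file, then re-test and keep only the ones Google still answers) can be sketched roughly like this. The file names and the test URL are assumptions for illustration, not anything GSA or Scrapebox actually outputs:

```python
# Hypothetical sketch: filter a harvested proxy list down to the
# proxies that can still load a Google search page.
import urllib.request

def load_proxies(path):
    """Read ip:port lines from a file, skipping blanks and duplicates."""
    seen, proxies = set(), []
    with open(path) as f:
        for line in f:
            p = line.strip()
            if p and p not in seen:
                seen.add(p)
                proxies.append(p)
    return proxies

def passes_google(proxy, timeout=10):
    """Return True if a Google search page loads through this proxy."""
    handler = urllib.request.ProxyHandler(
        {"http": f"http://{proxy}", "https": f"http://{proxy}"})
    opener = urllib.request.build_opener(handler)
    try:
        with opener.open("https://www.google.com/search?q=test",
                         timeout=timeout) as resp:
            return resp.status == 200
    except Exception:  # timeouts, refused connections, blocks, ...
        return False

if __name__ == "__main__":
    good = [p for p in load_proxies("gsa_proxies.txt") if passes_google(p)]
    with open("google_passed.txt", "w") as f:
        f.write("\n".join(good))
```

Scrapebox's own proxy tester does the same job with more polish; the point is only that public proxies get re-tested right before use, since they die quickly.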
Reply
2014-07-10, 04:56 AM,
#6
(2014-07-10, 04:22 AM)Silverdot Wrote: Scraping is all about proxies. Public, not private.

FOR THE LOVE OF ALL THAT IS HOLY! DON'T SCRAPE WITH PRIVATE PROXIES!!

Notice how GScraper sells proxies for ONE-TIME use. There's a reason for that. If you disable the multi-threaded harvester in Scrapebox and look at the individual results with private proxies, chances are you'll see a lot of server error codes. Also, the more specific your footprints are for scraping, the less Google will allow you to search.

Check out the videos by loopline; he explains it better than I can. What I recommend is to use the proxy finder in GSA and save the proxies to a file, then import them into Scrapebox and test them. You will find some anonymous, Google-passing proxies. It's better than buying them and burning them out.

Trust me, I just spent the last week figuring this out for myself. Countless emails between me and Scrapebox support.

But even with low connections, isn't it better to use the private ones?
Reply
2014-07-10, 05:52 AM, (This post was last modified: 2014-07-10, 05:53 AM by Silverdot.)
#7
(2014-07-10, 04:56 AM)Neros Wrote:
(2014-07-10, 04:22 AM)Silverdot Wrote: [...]

But even with low connections, isn't it better to use the private ones?

You missed this part:

"Also, the more specific your footprints are for scraping, the less Google will allow you to search."

No. No, it's not.

If you want to fight me on it and do your own thing, then go ahead. But when you start seeing 402 error codes and no URLs coming back from your scrapes, feel free to come back to this thread and tell me I was right.
Reply
2014-07-10, 06:01 AM,
#8
(2014-07-10, 05:52 AM)Silverdot Wrote:
(2014-07-10, 04:56 AM)Neros Wrote: [...]

But even with low connections, isn't it better to use the private ones?

You missed this part:

"Also, the more specific your footprints are for scraping, the less Google will allow you to search."

No. No, it's not.

If you want to fight me on it and do your own thing, then go ahead. But when you start seeing 402 error codes and no URLs coming back from your scrapes, feel free to come back to this thread and tell me I was right.

I'm just asking, calm down lol
Reply

