[SEO Tool] Web hyperlink extraction tool (Wuyu website link crawler)


A website link crawler does exactly what its name implies: it crawls a website's links.

In other words, you enter a site's home-page address, and the software captures the address of every page on the site and saves the list.
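The core of any such tool is pulling the `href` of every `<a>` tag out of a fetched page and resolving it against the page's URL. Below is a minimal sketch of that step using only Python's standard library; the `LinkExtractor` class name and the inline sample HTML are my own illustration, not part of the tool.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, resolved against a base URL."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Turn relative links into absolute site URLs.
                    self.links.append(urljoin(self.base_url, value))

# Example: extract links from one page (HTML shown inline for brevity;
# a real crawler would fetch it, then feed newly found URLs back in).
html = '<a href="/about">About</a> <a href="post/1.html">Post</a>'
parser = LinkExtractor("https://example.com/")
parser.feed(html)
print(parser.links)
# → ['https://example.com/about', 'https://example.com/post/1.html']
```

A full crawler repeats this on every newly discovered URL until no unvisited links remain.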

What is it useful for?

1. Create a sitemap

Take the TXT file the software saves and upload it directly to the root directory of your website; it then serves as a simple plain-text sitemap.

Once the sitemap is in place, you can submit it for indexing at http://zhanzhang.baidu.com/sitemap/index.
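The plain-text sitemap format described above is just one absolute URL per line. A minimal sketch, assuming the crawled links are already in a list (the `links` values here are placeholders):

```python
# Hypothetical list of crawled URLs; in practice this comes from the crawler.
links = [
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/post/1.html",
]

# A plain-text sitemap is simply one absolute URL per line.
with open("sitemap.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(links) + "\n")
```

Uploading the resulting `sitemap.txt` to the site root makes it reachable at `https://example.com/sitemap.txt`, which is the address you submit to the search engine.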

2. Check for errors

The "errors" here are back-end links you do not want users to see, but that, through carelessness, ended up linked from some page on the site. Use the software to grab every link on the site, then scan the results for sensitive links.
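Scanning the crawl results for sensitive links can be automated with a simple substring filter. The path fragments in `SENSITIVE` below are hypothetical examples; substitute whatever back-end paths your own site uses.

```python
# Hypothetical path fragments for back-end pages that should stay private.
SENSITIVE = ("/admin", "/login", "/backup")

def find_sensitive(links):
    """Return every crawled link that contains a sensitive path fragment."""
    return [url for url in links if any(p in url for p in SENSITIVE)]

crawled = [
    "https://example.com/post/1.html",
    "https://example.com/admin/index.php",
]
print(find_sensitive(crawled))
# → ['https://example.com/admin/index.php']
```

Any URL the filter reports is a candidate for removal from the public pages that link to it.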

3. Check indexing

Check how many of the site's pages Baidu has indexed, then use the software to grab all the links on the site and compare the two counts. If the gap is large, look into whether the hosting is unstable, the article quality is too low, or the pages get too little exposure.

 

How to use

(Note: the tool was updated on May 8, 2017 with a new usage flow; no separate instructions are provided for now.)

1. Enter the homepage address of the website to be crawled.

2. Set the number of threads. (Fewer threads mean slower crawling but lower CPU and network load and a smaller chance of missed links; more threads mean faster crawling but higher CPU and network load and a greater chance of missed links.)

3. Choose where to save the captured links. (Note: if the selected TXT name is abc.txt, the software saves the output as abc_*.txt.)

4. Set the number of entries saved per TXT file. (Continuing the example from step 3: if this is set to 5000, then once abc_1.txt holds 5000 links, subsequent links are automatically saved to abc_2.txt, and so on.)

5. Click Start.
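The split-output behavior from steps 3 and 4 can be sketched as follows. This is not the tool's own code, just a minimal illustration of the `abc_1.txt`, `abc_2.txt`, ... naming scheme; the function name and parameters are my own.

```python
def save_in_chunks(links, base_name="abc", chunk_size=5000):
    """Write links into base_1.txt, base_2.txt, ... with at most
    chunk_size lines per file, and return the file names written."""
    files = []
    for start in range(0, len(links), chunk_size):
        path = f"{base_name}_{start // chunk_size + 1}.txt"
        with open(path, "w", encoding="utf-8") as f:
            f.write("\n".join(links[start:start + chunk_size]) + "\n")
        files.append(path)
    return files

# 3 links with chunk_size=2 → abc_1.txt (2 links) and abc_2.txt (1 link).
print(save_in_chunks(["u1", "u2", "u3"], chunk_size=2))
# → ['abc_1.txt', 'abc_2.txt']
```

Once `abc_1.txt` fills up, the next chunk goes to `abc_2.txt`, matching the behavior described in step 4.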

 

Download address

Lanzou Cloud: https://www.lanzous.com/i13tgze

The tool can also be downloaded from the author's official website:

https://www.wuyublog.com/ruanjianfabu/70.html

 

Statement: this resource was collected from the web and organized by Watson Blog (wosn.net). If it infringes on your rights, please contact Watson Blog at admin@wosn.net for removal.

Watson Blog (wosn.net) --- focusing on PHP technology and resource sharing!

• Published on May 29, 2018 at 00:15:00
• This article was collected and organized by this site; send problem feedback to wosnnet@foxmail.com. When reprinting, please keep the link to this article: https://wosn.net/1234.html
