Bash script: wget a website and save URLs containing a specific string
Hi, I've spent hours trying to figure this out without any luck and would appreciate your help :) I need to "crawl" a website and search for a specific string in the HTML. If the string is found on a page, that page's URL should be saved to a file, so the end result is a file containing a list of URLs. I tried writing a bash script using wget, but my knowledge of Linux is very basic. I am using Cygwin on Windows.
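One possible approach, sketched below under assumptions: mirror the site with wget, then grep the downloaded HTML files for the string and convert the matching local paths back into URLs. `SITE`, `SEARCH`, and `OUTFILE` are placeholder values, not anything from the original question; adjust the recursion depth and protocol to fit the actual site. This should work the same way under Cygwin as on Linux.

```bash
#!/usr/bin/env bash
# Minimal sketch: mirror a site with wget, then search the downloaded pages
# for a string and record the URLs of the pages that contain it.
# SITE, SEARCH, and OUTFILE are hypothetical placeholders.

SITE="example.com"                # starting domain (assumption)
SEARCH="some specific string"     # string to look for (assumption)
OUTFILE="matching_urls.txt"

# Recursively download the site's HTML pages into a local ./example.com/ tree.
#   -r            recursive download
#   -l 5          limit recursion depth (adjust as needed)
#   -np           do not ascend to parent directories
#   -A html,htm   keep only HTML files
#   -q            quiet output
wget -r -l 5 -np -A html,htm -q "http://$SITE/"

# grep -rl lists only the names of files that contain the string.
# wget saves pages under a directory named after the host, so prepending
# "http://" to each local path reconstructs the original URL.
grep -rl -- "$SEARCH" "$SITE" | sed 's|^|http://|' > "$OUTFILE"

echo "Saved $(wc -l < "$OUTFILE") matching URLs to $OUTFILE"
```

For a very large site it may be preferable to avoid keeping a full mirror on disk, e.g. by fetching one URL at a time with `wget -qO-` and piping it straight into `grep -q`, but the mirror-then-grep version above is the simplest to debug.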