Whether you are an IT professional who needs to download 2,000 online bug reports into a flat text file and parse them to see which ones need attention, or a mum who wants to download 20 recipes from a public domain website, you can benefit from knowing the tools that help you download webpages into a text-based file. If you are interested in learning more about how to parse the pages you download, have a look at our Big Data Manipulation for Fun and Profit Part 1 article.
In this tutorial you will learn:
- How to retrieve/download webpages using wget, curl and lynx
- What the main differences between the wget, curl and lynx tools are
- Examples showing how to use wget, curl and lynx