Here's how you can download entire websites for offline reading so you have access even when you don't have Wi-Fi or 4G.
Although Wi-Fi is available everywhere these days, you may find yourself without it from time to time. And when you do, there may be certain websites you wish you could save and access while offline---perhaps for research, entertainment, or posterity.
It's easy enough to save individual web pages for offline reading, but what if you want to download an entire website? Well, it's easier than you think! Here are four nifty tools you can use to download any website for offline reading, zero effort required.
1. WebCopy
Available for Windows only.
WebCopy by Cyotek takes a website URL and scans it for links, pages, and media. As it finds pages, it recursively looks for more links, pages, and media until the whole website is discovered. Then you can use the configuration options to decide which parts to download offline.
The interesting thing about WebCopy is you can set up multiple "projects" that each have their own settings and configurations. This makes it easy to re-download many different sites whenever you want, each one in the same exact way every time.
A single project can also copy many websites, so organize them with a plan in mind (e.g. a "Tech" project for copying tech sites).
How to Download an Entire Website With WebCopy
Install and launch the app.
Navigate to File > New to create a new project.
Type the URL into the Website field.
Change the Save folder field to where you want the site saved.
Play around with Project > Rules… (learn more about WebCopy Rules).
Navigate to File > Save As… to save the project.
Click Copy Website in the toolbar to start the process.
Once the copying is done, you can use the Results tab to see the status of each individual page and/or media file. The Errors tab shows any problems that may have occurred and the Skipped tab shows files that weren't downloaded.
But most important is the Sitemap, which shows the full directory structure of the website as discovered by WebCopy.
To view the website offline, open File Explorer and navigate to the save folder you designated. Open the index.html (or sometimes index.htm) in your browser of choice to start browsing.
2. HTTrack
Available for Windows, Linux, and Android.
HTTrack is better known than WebCopy, and arguably better because it's open source and available on platforms other than Windows. The interface is a bit clunky and leaves much to be desired, but it works well, so don't let that turn you away.
Like WebCopy, it uses a project-based approach that lets you copy multiple websites and keep them all organized. You can pause and resume downloads, and you can update copied websites by re-downloading old and new files.
How to Download a Website With HTTrack
Install and launch the app.
Click Next to begin creating a new project.
Give the project a name, category, and base path, then click Next.
Select Download web site(s) for Action, then type each website's URL in the Web Addresses box, one URL per line. You can also store URLs in a TXT file and import it, which is convenient when you want to re-download the same sites later. Click Next.
Adjust parameters if you want, then click Finish.
Once everything is downloaded, you can browse the site like normal by going to where the files were downloaded and opening the index.html or index.htm in a browser.
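If you prefer the terminal, HTTrack also ships as a command-line program (httrack) on Linux and macOS. Here's a minimal sketch; the URL, output folder, and filter below are placeholder values you'd swap for your own, and exact flags may vary by version:

# Mirror example.com into ./my-mirror, staying on that domain, with verbose output
httrack "https://example.com/" -O "./my-mirror" "+*.example.com/*" -v

Running httrack --continue from inside the output folder should resume an interrupted mirror, matching the pause-and-resume behavior of the GUI.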
3. SiteSucker
Available for Mac and iOS.
If you're on a Mac, your best option is SiteSucker. This simple tool rips entire websites and maintains the same overall structure, and includes all relevant media files too (e.g. images, PDFs, style sheets).
Its interface is clean and could not be easier to use: you literally paste in the website URL and press Enter.
One nifty feature is the ability to save the download to a file, then use that file to download the same exact files and structure again in the future (or on another machine). This feature is also what allows SiteSucker to pause and resume downloads.
SiteSucker costs $5 and does not come with a free version or a free trial, which is its biggest downside. The latest version requires macOS 10.13 High Sierra or later. Older versions of SiteSucker are available for older Mac systems, but some features may be missing.
4. Wget
Available for Windows, Mac, and Linux.
Wget is a command-line utility that can retrieve all kinds of files over the HTTP and FTP protocols. Since websites are served through HTTP and most web media files are accessible through HTTP or FTP, this makes Wget an excellent tool for ripping websites.
While Wget is typically used to download single files, it can be used to recursively download all pages and files that are found through an initial page:
wget -r -p https://www.makeuseof.com
However, some sites may detect and prevent what you're trying to do because ripping a website can cost them a lot of bandwidth. To get around this, you can disguise yourself as a web browser with a user agent string:
wget -r -p -U Mozilla https://www.makeuseof.com
If you want to be polite, you should also limit your download speed (so you don't hog the web server's bandwidth) and pause between each download (so you don't overwhelm the web server with too many requests):
wget -r -p -U Mozilla --wait=10 --limit-rate=35K https://www.makeuseof.com
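Note that -r -p alone leaves the links inside the saved pages pointing at the live site. If you want the copy to be fully browsable offline, a common combination is Wget's mirroring flags; treat the following as a sketch rather than the one true incantation:

# --mirror implies recursive downloading; --convert-links rewrites links to point at
# the local copies; --adjust-extension saves pages with .html extensions
wget --mirror --page-requisites --convert-links --adjust-extension --no-parent -U Mozilla --wait=10 --limit-rate=35K https://www.makeuseof.com

The --no-parent flag keeps Wget from wandering above the starting directory, which matters if you only want one section of a site.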
Wget comes bundled with most Unix-based systems. On Mac, you can install Wget using a single Homebrew command: brew install wget (how to set up Homebrew on Mac). On Windows, you'll need to use this ported version instead.
Which Websites Do You Want to Download?
Now that you know how to download an entire website, you should never be caught without something to read, even when you have no internet access.
But remember: the bigger the site, the bigger the download. We don't recommend downloading huge sites like MakeUseOf because you'll need gigabytes of space to store all of the media files we use.
The best sites to download are those with lots of text and not many images, and sites that don't regularly add new pages or change existing ones. Static information sites, online ebook sites, and sites you want to archive in case they go down are ideal.
If you're interested in more options for offline reading, take a look at how you can set up Google Chrome for reading books offline. And for other ways to read long articles instead of downloading them, check out our tips and tricks.
Most RSS readers recommend RSS feeds or let you search for them. But sometimes you need to find one manually if the site you want to subscribe to doesn't show up as a choice in your favorite RSS reader app.
Here are several ways to help you find a website's RSS feed so that you can stay updated on all the newest content.
Look for the RSS Icon
The easiest way to find an RSS feed is to look for the RSS icon somewhere on the website. If a site has one, it won't be shy about showing it, because it wants you to subscribe.
You can usually find the RSS feed icon at the top or bottom of the site. It's often near a search bar, email newsletter signup form, or social media icons.
Not all RSS links are orange like the standard RSS icon, and they don't necessarily contain that symbol at all. You might find the RSS feed behind a link that reads "Subscribe for updates," or a totally different symbol or message.
Depending on the website, there might be several different RSS feeds you can subscribe to. To find those links, you might need to do a search or locate the specific area of the site you want to be updated on. If there's an RSS feed for that particular type of content, the icon will appear along with the results.
Torrent sites are a prime example of this, since most of them have several categories of information. The Pirate Bay, for instance, has a massive list of RSS feeds.
Edit the URL
Lots of websites serve their RSS feed from a page called feed or rss. To try this, go to the website's home page (erase everything but the domain name) and append /feed or /rss to the URL.
Here's an example:
https://www.lifehack.org/feed
Depending on the website you're on and the browser you're using, what you see next might be a normal-looking web page with a Subscribe button or an XML-formatted page with a bunch of text and symbols.
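If you'd rather not eyeball the result, you can also check a guessed URL from the command line. This is a quick sketch, assuming you have curl installed and reusing the example URL above; some servers answer HEAD requests differently than regular requests, so treat a miss here as a hint rather than proof:

# Print only the Content-Type header of the response
curl -sI https://www.lifehack.org/feed | grep -i '^content-type'

A real feed usually reports application/rss+xml, application/atom+xml, or text/xml, while an ordinary page reports text/html.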
View the Page Source
Another way you might find the RSS feed is to look "behind" the page. You can do this by viewing its source, which is the raw data your web browser translates into a viewable page.
Most web browsers let you quickly open the page source with the Ctrl+U or Command+U keyboard shortcut. Once you see the source code, search through it (with Ctrl+F or Command+F) for RSS. You can often find the direct link to the feed somewhere around that line.
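What you're looking for is usually a <link rel="alternate"> tag in the page's head, which declares the feed's address and type. That also means you can grep for it without opening a browser at all; a rough sketch, assuming curl and grep are available and using makeuseof.com purely as an example:

# Print any <link> tags that advertise an RSS or Atom feed
curl -s https://www.makeuseof.com/ | grep -oiE '<link[^>]+(rss|atom)\+xml[^>]*>'

The href attribute in whatever this prints is the feed URL you can paste into your reader.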
Use an RSS Feed Finder
There are special tools you can install in your web browser to locate a site's RSS feed(s). These add-ons are super easy to install and usually work really well.
If you use Chrome, you might try Get RSS Feed URL or RSS Subscription Extension (by Google). Firefox users have similar options, such as Awesome RSS and Feedbro.
Still Can't Find the Site's RSS Feed?
Some websites simply don't use RSS feeds. But that doesn't mean you're out of luck. There are tools you can use to generate RSS feeds from websites that don't offer them, though they don't always work very well.
Some examples of RSS generators that let you make a feed from nearly any website include FetchRSS, Feed Creator, PolitePol, Feed43, and Feedity.
What to Do After Finding the RSS Feed
After you find the RSS feed you want to subscribe to, you need a program that can read the feed's data and update you when it changes.
First, copy the RSS feed URL by right-clicking it and choosing the copy option. With the address copied, you can paste it into whatever tool you want to use to deliver the news to you. There are online RSS readers, feed readers for Windows, and Mac-supported RSS readers available, plus RSS aggregator tools to join multiple feeds together.
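If you want to sanity-check a feed before committing it to a reader, a crude command-line preview can help. This sketch reuses the example feed URL from earlier and assumes a simple RSS layout; feeds that wrap their titles in CDATA blocks will need a proper XML parser instead:

# Show the first few entry titles in the feed
curl -s https://www.lifehack.org/feed | grep -oE '<title>[^<]+</title>' | head

If that prints recognizable article titles, the feed is live and ready to subscribe to.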