By Admond Lee

Quick Guide to Installing Scrapy on Windows OS


Scrapy is an application framework for crawling web sites and extracting structured data which can be used for a wide range of useful applications, like data mining, information processing or historical archival.


To put it more simply, Scrapy is used to perform web scraping, and web scraping is the process of extracting data from websites.


Data here can be anything: text, images, videos, email addresses, phone numbers, and so on.


Take Google as an example: every time you search for something on Google, under the hood the search engine matches your query against data it has crawled and scraped from websites across the World Wide Web and returns the search results to you.


All of this happens in seconds. Amazing, right?


I’ve always wanted to learn web scraping, and I recently had a project that required this technique. On the recommendation of my close friend, Low Wei Hong, and after reading his article, Scrapy or Selenium?, I started learning Scrapy.


And it’s fun!


In this article, I’ll share some simple yet practical guidance on how to install Scrapy on Windows OS, after struggling with a few technicalities myself. By the end of this article, I hope you’ll find it helpful before stepping in to learn Scrapy.


Let’s get started!


How to Install Scrapy on Windows OS


1. Create a virtual environment


First things first: it is highly recommended to create a virtual environment and install Scrapy inside it. This avoids conflicts with already-installed Python system packages (which could break some of your system tools and scripts).

Conda creates a virtual environment

In the terminal above, I created a virtual environment named virtualenv_scrapy using conda create --name virtualenv_scrapy. Once you’re done with this step, you can use conda to activate the virtual environment.
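The step above boils down to one command (this sketch assumes Anaconda or Miniconda is already installed and on your PATH; the environment name matches the one used in this article):

```shell
# Create a fresh conda environment named virtualenv_scrapy
# (add -y to skip the confirmation prompt)
conda create --name virtualenv_scrapy
```
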


But before we move to the next step, it is always a good idea to check which conda environment you’re currently in by typing conda info --envs.

Conda checks the current environment
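The check looks like this (the paths below are illustrative; conda marks the currently active environment with an asterisk):

```shell
# List all conda environments; the active one is marked with *
conda info --envs
# Typical output resembles:
# # conda environments:
# #
# base               *  C:\Users\you\Anaconda3
# virtualenv_scrapy     C:\Users\you\Anaconda3\envs\virtualenv_scrapy
```
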

Great. Now we know that we’re still in the base conda environment. We can use the same method later to verify that we’ve successfully switched to the virtual environment we created.


2. Activate the virtual environment

Conda activates the virtual environment

To activate the virtual environment, just type conda activate virtualenv_scrapy. We’ve also checked that we’re now using the virtual environment.
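The activation step, as a sketch:

```shell
# Switch into the new environment, then confirm the switch
conda activate virtualenv_scrapy
conda info --envs   # the * should now sit next to virtualenv_scrapy
```
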


The next thing to do is to create a new folder; in this case, I created one called virtualenv_scrapy and changed my directory into it, as shown below.

New folder created
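Creating the folder and moving into it takes two commands (the folder name here simply mirrors the environment name used in this article):

```shell
# Create a working folder for the project and move into it
mkdir virtualenv_scrapy
cd virtualenv_scrapy
```
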

3. Install Scrapy via conda-forge channel


Though it’s possible to install Scrapy on Windows using pip, it is recommended to install Anaconda or Miniconda and use the package from the conda-forge channel, which will avoid most installation issues.


Since we already have Anaconda installed, we can install Scrapy directly with the command: conda install -c conda-forge scrapy
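As a one-liner (run inside the activated environment; add -y if you want to skip the confirmation prompt):

```shell
# Install Scrapy from the conda-forge channel
conda install -c conda-forge scrapy
```
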


Once you’ve typed y to proceed, conda will download and extract all the necessary packages, as shown below.

Downloading and extracting packages to install Scrapy


4. Use Scrapy to create a new project


Bam!


If you’ve reached this stage, congratulations! You’ve successfully installed Scrapy on your local machine (Windows OS).


And guess what? You can create a new project using Scrapy in no time by typing scrapy startproject demo_project as below.

Create a new project using Scrapy
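Running the command scaffolds a project skeleton; the layout sketched in the comments below is what Scrapy generates by default:

```shell
# Scaffold a new Scrapy project called demo_project
scrapy startproject demo_project
# This creates roughly the following structure:
# demo_project/
#     scrapy.cfg            # deploy configuration file
#     demo_project/         # the project's Python module
#         __init__.py
#         items.py          # item definitions
#         middlewares.py    # spider/downloader middlewares
#         pipelines.py      # item pipelines
#         settings.py       # project settings
#         spiders/          # your spiders live here
#             __init__.py
```
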

Up to this point, you can definitely start playing around with Scrapy. Enjoy and have fun!



Final Thoughts

Thank you for reading.


I hope this article gave you a simple guide to installing Scrapy on your local machine (if you’re using Windows OS) in the quickest way possible.


As always, if you have any questions, feel free to leave your comments below. Till then, see you in the next post!


Admond Lee © 2019
