
I have recently been learning web scraping with Python and Scrapy. Since macOS ships with Python 2.7, there are two ways to get Python 3.5: upgrade the system Python, or install 3.5 alongside it.

Upgrading needs no further explanation, so this post covers the side-by-side install.

Because macOS already bundles its own Python, the first worry when installing another one is usually whether the two will conflict.

In practice, installing 3.5 just means downloading the installer from the official site and running it like any other Mac application:

https://www.python.org/downloads/release/python-353/

The installation does not overwrite the system Python; the new version lives alongside it under /Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5.
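
To double-check where the new interpreter landed, you can list the framework's bin directory (the path below assumes the default 3.5 framework install mentioned above):

ls /Library/Frameworks/Python.framework/Versions/3.5/bin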

Typing python in the terminal still runs the 2.7 version:

python 
 
Python 2.7.12 (default, Jun 29 2016, 14:05:02) 
[GCC 4.2.1 Compatible Apple LLVM 7.3.0 (clang-703.0.31)] on darwin 
Type "help", "copyright", "credits" or "license" for more information. 
>>>

Typing python3 runs the 3.5 version:

python3 
 
Python 3.5.3 (v3.5.3:1880cb95a742, Jan 16 2017, 08:49:46) 
[GCC 4.2.1 (Apple Inc. build 5666) (dot 3)] on darwin 
Type "help", "copyright", "credits" or "license" for more information. 
>>>
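
If you ever need to confirm exactly which interpreter a command name resolves to, a quick one-liner using only the standard sys module prints both the version and the executable path:

python3 -c "import sys; print(sys.version); print(sys.executable)"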

With that in place, we can install Scrapy.

Python 3.5 bundles pip, so nothing extra is needed; run pip3 --version in the terminal to check its version and install path:

pip3 --version 
 
pip 9.0.1 from /Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages (python 3.5) 
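
Equivalently, you can invoke pip through the interpreter itself, which guarantees that packages go into the 3.5 installation rather than whichever pip happens to be first on your PATH:

python3 -m pip --version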

Install Scrapy with pip3:

pip3 install Scrapy 

Note that pip treats package names case-insensitively, so pip3 install scrapy works just as well as pip3 install Scrapy. If the installation fails with output like the following, the cause is a network problem (the connection to PyPI was refused), not the spelling of the package name:

Collecting scrapy
 Retrying (Retry(total=4, connect=None, read=None, redirect=None)) after connection broken by 'NewConnectionError('<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x103aa2c88>: Failed to establish a new connection: [Errno 61] Connection refused',)': /simple/scrapy/
 Retrying (Retry(total=3, connect=None, read=None, redirect=None)) after connection broken by 'NewConnectionError('<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x103aa29e8>: Failed to establish a new connection: [Errno 61] Connection refused',)': /simple/scrapy/
 Retrying (Retry(total=2, connect=None, read=None, redirect=None)) after connection broken by 'NewConnectionError('<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x103aa2630>: Failed to establish a new connection: [Errno 61] Connection refused',)': /simple/scrapy/
 Retrying (Retry(total=1, connect=None, read=None, redirect=None)) after connection broken by 'NewConnectionError('<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x103aa2f28>: Failed to establish a new connection: [Errno 61] Connection refused',)': /simple/scrapy/
 Retrying (Retry(total=0, connect=None, read=None, redirect=None)) after connection broken by 'NewConnectionError('<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x103aa2be0>: Failed to establish a new connection: [Errno 61] Connection refused',)': /simple/scrapy/
 Could not find a version that satisfies the requirement scrapy (from versions: )
No matching distribution found for scrapy
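
The output above is a connection failure, so the fix is on the network side: check any proxy settings and retry. If PyPI is slow or unreachable from your network, pip's -i / --index-url option lets you install from a mirror instead; for example (the mirror URL below is just one commonly used option, substitute whichever mirror you prefer):

pip3 install Scrapy -i https://pypi.tuna.tsinghua.edu.cn/simple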

Once the installation succeeds, typing scrapy in the terminal prints the version number and the available commands:

scrapy

Scrapy 1.4.0 - no active project
Usage:
 scrapy <command> [options] [args]
Available commands:
 bench   Run quick benchmark test
 fetch   Fetch a URL using the Scrapy downloader
 genspider  Generate new spider using pre-defined templates
 runspider  Run a self-contained spider (without creating a project)
 settings  Get settings values
 shell   Interactive scraping console
 startproject Create new project
 version  Print Scrapy version
 view   Open URL in browser, as seen by Scrapy
 [ more ]  More commands available when run from project directory
Use "scrapy <command> -h" to see more info about a command

PyCharm has no built-in template for creating a Scrapy project, so create one manually with the scrapy command:

scrapy startproject ArticleSpider

(ArticleSpider here is the project name; substitute your own.)
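
startproject generates a project skeleton roughly like the following (the exact file list varies slightly between Scrapy versions):

ArticleSpider/
    scrapy.cfg
    ArticleSpider/
        __init__.py
        items.py
        middlewares.py
        pipelines.py
        settings.py
        spiders/
            __init__.py

As a minimal sketch of what goes into the spiders/ directory, the spider below collects the text of every link on a page; the spider name, the start URL, and the CSS selector are illustrative placeholders rather than anything generated by Scrapy itself:

import scrapy

class TitleSpider(scrapy.Spider):
    # the name you pass to "scrapy crawl"
    name = "titles"
    # example start page; replace with the site you actually want to crawl
    start_urls = ["http://quotes.toscrape.com/"]

    def parse(self, response):
        # yield one item per link text found on the page
        for text in response.css("a::text").extract():
            yield {"link_text": text}

Save it as ArticleSpider/ArticleSpider/spiders/title_spider.py, then run it from the project root and export the results to JSON:

scrapy crawl titles -o titles.json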

That covers installing Scrapy on a Mac and creating a project; I hope it serves as a useful reference.
