Product Updates
New features and product updates
Now you can have your Scrapy spiders automatically deployed from a GitHub repository to Scrapy Cloud.

You just have to connect your Scrapy Cloud project to a GitHub repo and select the branch that you want to deploy. Then, every time you push changes to that repo, a new deploy will be triggered.



Check out our Help Center to learn more about this feature.
Please give a warm welcome to our new Help Center! It centralizes all help articles and guides for using the Scrapinghub platform (Scrapy Cloud and Crawlera). Previously, some of this content lived in the support.scrapinghub.com Knowledge Base and some in doc.scrapinghub.com, which now hosts only the API reference.

We introduced Scrapy with Python 3 support back in May and now we're happy to announce that Scrapy Cloud also supports Python 3.

To deploy your Python 3 spiders, you'll need to configure your project’s scrapinghub.yml. There, just include a section called stacks with scrapy:1.1-py3 as the stack for your Scrapy Cloud project:

projects:
    default: 99999
stacks:
    default: scrapy:1.1-py3
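
If you manage more than one project from the same repository, the stacks section can also map a stack to each deploy target. A sketch (the second project ID and the py3 target name here are placeholders, not from the original announcement):

```yaml
projects:
    default: 99999
    py3: 88888       # placeholder ID for a second project
stacks:
    default: scrapy:1.1       # default target stays on Python 2
    py3: scrapy:1.1-py3       # the py3 target deploys on the Python 3 stack
```

With this layout, deploying to the py3 target uses the Python 3 stack while the default target keeps the Python 2 one.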

Scrapy Cloud will continue to support Python 2, which will remain the default unless you explicitly set your environment to Python 3.

For more about Python 3 support in Scrapy Cloud, please see the Scrapinghub Blog.

We are proud to introduce a new feature built specifically for developers: the job console. With the job console, you can open a Unix shell directly inside the container where your job is running:


Typical uses include tailing the Scrapy log or accessing the Scrapy telnet console.
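As a sketch of the first use (the log file name below is invented for illustration; inside the job console you would tail the actual job log, and the telnet port is Scrapy's default):

```shell
# Illustrative only: fake a small job log so tail has something to read.
printf 'INFO: Spider opened\nINFO: Closing spider (finished)\n' > job.log

# Follow the log as the job writes to it (tail -f in a real session);
# here -n 1 just prints the most recent line.
tail -n 1 job.log

# The Scrapy telnet console listens on port 6023 by default:
#   telnet localhost 6023
```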
You can now immediately share the results of your Scrapinghub projects as publicly searchable datasets with our new Datasets Catalog.


This is both a great way to collaborate with others and to save time by using others’ datasets in your projects.




For more about Datasets, including how to publish your own, please see the Scrapinghub blog and this Knowledge Base article.
The scraped fields summary has been improved with a new field type selector:


Define the data type for each of your scraped fields and then create schemas that can be re-used with future jobs.

For more details, please see the Knowledge Base article.

We hope you enjoy this new and improved interface! 

We’ve launched a new and improved Scrapy Cloud 2.0! 

Full details are available on the Scrapinghub blog, but here are some of the key features:
  • Improved capabilities: Better CPU performance and more available memory
  • Cheaper cost per job
  • Specify resource needs on a per-job basis: No need to rent a whole server to run high performance jobs



Along with all of the new capabilities, there’s a brand new Scrapy Cloud dashboard to manage your resources. Just allocate your Scrapy Cloud container units among different groups, and then drag and drop projects between those groups. The jobs in each group are isolated from the other jobs running on the platform, so your performance will never be affected by other resource-consuming jobs.

Again, these are just the highlights so be sure to check out the full details!

We’re happy to announce a new and improved interface for Crawlera that will help you better understand how Crawlera is doing for your crawls:



Now you can see which websites you crawl the most and how much, and you get real-time access to the latest requests performed through Crawlera.

There are more improvements and new features on the way!
Starting today, you’ll always be inside one of your Organizations rather than having to choose an Organization from your home page.

Switch your Organization or create a new one right from the top navigation bar: 



We believe this will simplify your experience using the Scrapinghub platform. Let us know what you think!
Good news for all you PayPal fans out there: You can now pay for Scrapinghub services via PayPal!

To take advantage of the new PayPal option, head to the billing portal or select PayPal during your next checkout (credit cards are still supported).  Once set up, subscription payments will appear in your PayPal account as "Scrapinghub".

We hope you enjoy being able to use PayPal with Scrapinghub, and are looking forward to supporting more payment methods in the future!

You can now get notified when a job finishes by simply watching the job:



You can also watch the spider or entire project to get notified when any of their jobs finish.
We now support credit card payment for the Scrapinghub platform and all its products.

You can find more information in this Knowledge Base article.

If you're an existing customer and want to move to credit card payments, please contact support. New customers will have credit card payments enabled by default.

You will see a Billing section in your organization that allows you to subscribe to any of our product plans. Once subscribed, you will see the list of services you're subscribed to (we call them addons). To change the services you are subscribed to (cancel, upgrade, or subscribe to new ones), open the billing portal via the green "Open Billing Portal" button, which appears on the Billing page once you're subscribed to any product.
Button to open billing portal where you can change your subscription
We've added a new page that serves as the home/landing page for organizations. It provides a quick summary of the organization's projects and resources, with links to all the relevant places for more details.
New organization overview page
Want to make your organization stand out from the rest? Well, now you can by uploading that awesome logo of yours!

Just go to the organization profile page and you will see an option to change the logo on the right. If you're feeling lazy, you can also just drag & drop the image there.

Organization profile settings
Didn't you hate having to contact support to get your Splash instances provisioned? No need to do that anymore! When you subscribe to a Splash plan, your Splash instance will be provisioned automatically and will appear in the Splash instances section of your organization page.

Splash instances in organization page
We have rolled out a new Scrapy Cloud sidebar where the items are organized into sections so they're easier to find. We also merged some sections (like Eggs & Deploys) that make a lot more sense together.

New Scrapy Cloud sidebar
You can now sign in and sign up to Scrapinghub with Google.
