I would like to create and deploy a small scraper written in Python. At the moment I run it from my laptop. My starting point is https://www.digitalocean.com/community/tutorials/how-to-scrape-web-pages-with-beautiful-soup-and-python-3.
I would like to deploy it to some web server. Can I use the same shared hosting that I use for my website, or do I need a VPS/AWS or something like that? The tutorial is from DigitalOcean, but I'm not hosting my site there.
Do I need to change the code depending on the type of server that I'm deploying to?
CodePudding user response:
Since Python is an interpreted language, the script itself should work the same on every machine. Regarding the hosting, it depends on what kind of shared hosting you have. If you only get a web server, meaning you can drop HTML files in a directory and see them in your browser, you should get a VPS. But if your shared hosting also comes with compute power (e.g. SSH access and the ability to run your own processes or cron jobs), you could use that, though it would likely be slower.
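To illustrate the "works the same on every machine" point: here is a minimal sketch of scraping logic using only the standard library, so it runs identically on a laptop, shared host, or VPS. The HTML is hardcoded for illustration; a real run would fetch the page first (e.g. with `urllib.request` or `requests`), and the `h2.title` selector is just an example, not taken from the tutorial.

```python
from html.parser import HTMLParser

# Hardcoded sample HTML standing in for a fetched page. On any server the
# parsing code below is identical -- only how you schedule/run it differs.
SAMPLE_HTML = """
<html><body>
  <h2 class="title">First headline</h2>
  <p>Some text</p>
  <h2 class="title">Second headline</h2>
</body></html>
"""

class TitleScraper(HTMLParser):
    """Collects the text of every <h2 class="title"> element."""

    def __init__(self):
        super().__init__()
        self.titles = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "h2" and ("class", "title") in attrs:
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.titles.append(data.strip())

parser = TitleScraper()
parser.feed(SAMPLE_HTML)
print(parser.titles)  # ['First headline', 'Second headline']
```

The same script works unchanged wherever a Python interpreter is available; what changes between hosting types is whether you are allowed to run it at all, and how you trigger it (manually, cron, etc.).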
CodePudding user response:
A VPS is the best option in my opinion; I run a web scraper on one. You get remote access over SSH (with PuTTY, another SSH client, or a plain terminal).
But make sure you know how to navigate a server from the terminal, as there is no graphical interface (although you can first install a desktop environment from the command line and then use remote web access, a sort of VNC or TeamViewer; I have that with IONOS, I don't know about other hosting providers). Also make sure to secure your VPS, as you will encounter intrusion attempts.