
Personally I wasn't convinced until I also tried out the "Sublime SFTP" plugin.

I'm never going back.




Hopefully in this day and age, you are using that plugin to SFTP into a virtual machine locally.


No clue what that is meant to mean.

SFTP is as secure as any other SSH connection and people use those extremely regularly to remotely administer hosts which lack a VPN.

Aside from terrible Windows support, SFTP is likely one of the best secure remote file transfer protocols around (in particular for freeform file transfers, rather than structured ones like AS2 or similar).

Certainly better than FTPS.
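
For illustration: SFTP is just a subsystem multiplexed over a normal SSH session, so a client is only a few lines with something like paramiko. A minimal sketch (host, user and paths are placeholders, auth comes from your SSH agent/default keys):

    # SFTP rides on an ordinary SSH connection, so it gets SSH's
    # authentication and encryption for free.
    import paramiko

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # pin host keys properly in real use
    client.connect("dev.example.com", username="deploy")  # uses your agent/default keys

    sftp = client.open_sftp()
    sftp.put("index.html", "/var/www/site/index.html")  # upload one file
    sftp.close()
    client.close()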


will is probably referring to version control/change tracking, and not to "security".


That's still a silly point. For security reasons a lot of people don't hook their source control directly into production web servers, and instead deploy from a secured environment that is hooked into source control onto production (which would still require something like SFTP for the actual transfer).

Regardless, their point was poorly explained/explored.


Correct, workflow. Even setting "best practice" aside and just looking at the tools themselves, it seems mind-boggling to use SFTP within the editor. With things like SASS, Bower, Grunt, npm and the like, it doesn't even seem like it'd be compatible.


Well, I use a dev virtual server instead of a local VM. I enjoy working in a Windows environment, but would rather not struggle with the discrepancies between running an app on Windows with something like XAMPP vs Apache on CentOS, which is what my web servers run.

I use an SFTP Atom plugin to work on/save to my dev server and just use SSH to do git/etc. right on the server. I don't see how this workflow is very different from working locally and using a command prompt for all of your tools.


I develop from multiple machines depending on where I am. The dev server then does all the version control bits as if it were the local server. It's also a lot beefier than my laptop. Also bear in mind these are my own one-man projects and websites. I make do with what I have, else I'd spend years setting up the perfect system and have created nothing.


I only do web development occasionally so excuse my ignorance, but what is the common way to deploy a website now?


I do, and to be honest with you it is more fractured than it has ever been.

Back in the 90s "everyone" used FTP. Now the methods of deployment vary wildly including but not limited to:

- FTP (over VPN), "Shared Folders" (SMB, over VPN), RDC/RDP (seriously, often not over VPN)

- Git clients (and other source control, controlled over SSH), SFTP, FTPS, CMS (over HTTP/S), SCP, rsync/robocopy (over VPN)

- Virtualisation trickery: cloning, snapshots, and shared data storage.

The "best" way depends on a lot of factors. For example are you doing staging? Do you even have source control? How many servers? Etc. There is no one size fits all solution.

For a small host or personal site, you can likely do what everyone else does: SFTP on Linux/BSD, and FTPS on Windows (via Filezilla Server).
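
If you'd rather script the transfer than click around in a client, here's a rough sketch of the rsync-over-SSH route mentioned above (host and paths are made up):

    # Push a local build directory to the server over SSH using rsync.
    # --delete removes remote files that no longer exist locally, so
    # do a --dry-run first on anything you care about.
    import subprocess

    subprocess.run(
        [
            "rsync", "-avz", "--delete",
            "-e", "ssh",
            "./build/",                               # local build output
            "deploy@web1.example.com:/var/www/site/",
        ],
        check=True,
    )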


We use git and have branches designated as Dev, Test, Prod (etc.) with webhooks or other scripts that automatically deploy the branches based on the workflow. At my last job (small webshop) it was automated, but my current job is more "Enterprisey" and we have separation of duties and other workflow constraints, which mean that I never push anything to production myself (or to test, for that matter).
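
For anyone wondering what that looks like in practice, here's a bare-bones sketch of such a deploy hook (repo path, branch and port are placeholders, and a real one would verify the webhook signature):

    # A push to the watched branch hits this endpoint, which pulls the
    # branch into the web root. No auth/signature checking here.
    import subprocess
    from http.server import BaseHTTPRequestHandler, HTTPServer

    REPO = "/var/www/site"
    BRANCH = "prod"

    class DeployHook(BaseHTTPRequestHandler):
        def do_POST(self):
            subprocess.run(["git", "-C", REPO, "pull", "origin", BRANCH], check=True)
            self.send_response(200)
            self.end_headers()

    HTTPServer(("0.0.0.0", 8080), DeployHook).serve_forever()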


lmao, I actually used to do that once, but the overhead was too much (I save frequently), so I just bought a desktop and I X11-forward over SSH to run Sublime on the remote. SFTP was a really good extension though.


I didn't know this existed, thanks for that!



