Generally, when I work on a website, I maintain a local copy of all the files. Ideally, I use version control (git, svn or whatever), but failing that, I use rsync over ssh to keep my files in sync with the web server's files.
But I'm helping with a local nonprofit's website, and the cheap web hosting plan they chose doesn't offer ssh, just ftp.
While I have to question the wisdom of an ISP that insists that its customers use insecure ftp rather than a secure encrypted protocol, that's their problem. My problem is how to keep my files in sync with theirs. And the other folks working on the website aren't developers and are very resistant to the idea of using any version control system, so I have to be careful to check for changed files before modifying anything.
In web searches, I haven't found much written about reasonable workflows on an ftp-only web host. I struggled a lot with scripts calling ncftp or lftp. But then I discovered curlftpfs, which makes things much easier.
I put a line in /etc/fstab like this:

curlftpfs#user:email@example.com/ /servername fuse rw,allow_other,noauto,user 0 0
Then all I have to do is type mount /servername and the ftp connection is made automagically. From then on, I can treat it like a (very slow and somewhat limited) filesystem.
For instance, if I want to rsync, I can rsync -avn --size-only /servername/subdir/ ~/servername/subdir/ for any particular subdirectory I want to check. A few things to know about this:
- I have to use --size-only because timestamps aren't reliable. I'm not sure whether this is a problem with the ftp protocol, or whether this particular ISP's server has problems with its dates. I suspect it's a problem inherent in ftp, because if I ls -l, I see things like this:

-rw-rw---- 1 root root 7651 Feb 23  2015 guide-geo.php
-rw-rw---- 1 root root 1801 Feb 14 17:16 guide-header.php
-rw-rw---- 1 root root 8738 Feb 23  2015 guide-table.php

Note that the file modified a week ago shows a modification time, but the files modified today show only a day and year, not a time. I'm not sure what to make of this.
- Note the -n flag. I don't automatically rsync from the server to my local directory, because if I have any local changes newer than what's on the server they'd be overwritten. So I check the diffs by hand with tkdiff or meld before copying.
- It's important to rsync only the specific directories you're working on. You really don't want to see how long it takes to get the full file tree of a web server recursively over ftp.
How do you change and update files? It is possible to edit the files on the curlftpfs filesystem directly. But at least with emacs, it's incredibly slow: emacs likes to check file modification dates whenever you change anything, and that requires an ftp round-trip so it could be ten or twenty seconds before anything you type actually makes it into the file, with even longer delays any time you save.
So instead, I edit my local copy, and when I'm ready to push to the server, I cp filename /servername/path/to/filename.
Of course, I have aliases and shell functions to make all of this easier to type, especially the long pathnames: I can't rely on autocompletion like I usually would, because autocompleting a file or directory name on /servername requires an ftp round-trip to ls the remote directory.
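A push helper along those lines is easy to write. This is only a sketch -- the name webpush and the SERVER variable are my own, and it assumes you run it from the top of your local mirror so the relative path matches on both sides:

```shell
# Hypothetical push helper: copy one file from the local tree to the
# same relative path on the mounted ftp filesystem.
SERVER=${SERVER:-/servername}

webpush() {
    cp "$1" "$SERVER/$1"
}
```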
Oh, and version control? I use a local git repository. Just because the other people working on the website don't want version control is no reason I can't have a record of my own changes.
None of this is as satisfactory as a nice git or svn repository and a good ssh connection. But it's a lot better than struggling with ftp clients every time you need to test a file.
Someone on the SVLUG list posted about a shell script he'd written to find core dumps.
It sounded like a simple task -- just locate core | grep -w core, right? I mean, any sensible packager avoids naming files or directories "core" for just that reason, don't they?
But not so: turns out in the modern world, insane numbers of software projects include directories called "core", including projects that are developed primarily on Linux so you'd think they would avoid it ... even the kernel. On my system, locate core | grep -w core | wc -l returned 13641 filenames.
Okay, so clearly that isn't working. I had to agree with the SVLUG poster that using "file" to find out which files were actual core dumps is now the only reliable way to do it. The output looks like this:
$ file core
core: ELF 32-bit LSB core file Intel 80386, version 1 (SYSV), too many program headers (375)
The poster was using a shell script, but I was fairly sure it could be done in a single shell pipeline. Let's see: you need to run locate to find any files with "core" in the name.
Then you pipe it through grep to make sure the filename is actually core: since locate gives you a full pathname, like /lib/modules/3.14-2-686-pae/kernel/drivers/edac/edac_core.ko or /lib/modules/3.14-2-686-pae/kernel/drivers/memstick/core, you want lines where only the final component is core -- so core has a slash before it and an end-of-line (in grep that's denoted by a dollar sign, $) after it. So grep '/core$' should do it.
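A quick sanity check of that pattern on the two example paths -- only the one whose final component is exactly core should survive:

```shell
# Feed the two sample paths from above through the filter.
printf '%s\n' \
    /lib/modules/3.14-2-686-pae/kernel/drivers/edac/edac_core.ko \
    /lib/modules/3.14-2-686-pae/kernel/drivers/memstick/core \
  | grep '/core$'
```

which prints only the memstick path.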
Then take the output of that locate | grep and run file on it, and pipe the output of that file command through grep to find the lines that include the phrase 'core file'.
That gives you lines like
/home/akkana/geology/NorCal/pinnaclesGIS/core: ELF 32-bit LSB core file Intel 80386, version 1 (SYSV), too many program headers (523)
But those lines are long and all you really need are the filenames; so pass it through sed to strip the colon and everything after it.
Here's the final command: file `locate core | grep '/core$'` | grep 'core file' | sed 's/:.*//'
On my system that gave me 11 files, and they were all really core dumps. I deleted them all.
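One caveat: the backquote version falls apart on filenames containing spaces. If your locate supports -0 and -r (GNU locate and mlocate both do), a null-separated variant is a bit more robust. This is a sketch, not the exact command from above, and the corepaths function name is my own invention:

```shell
# The classification step as a small filter: read `file` output on stdin,
# keep only the real core dumps, and print just the paths.
corepaths() {
    grep 'core file' | sed 's/:.*//'
}

# Full pipeline (needs an up-to-date locate database; -0 and -r are
# GNU locate/mlocate options, -r anchors the regex the way the grep did,
# and xargs -r skips running file when nothing matched):
#   locate -0 -r '/core$' | xargs -0 -r file | corepaths
```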
The snow is melting fast in the lovely sunny weather we've been having; but there's still enough snow on the Sangre de Cristos to see the dual snow hearts on the slopes of Thompson Peak above Santa Fe, wishing everyone for miles around a happy Valentine's Day.
Dave and I are celebrating for a different reason: yesterday was our 1-year anniversary of moving to New Mexico. No regrets yet! Even after a tough dirty work session clearing dead sage from the yard.
So Happy Valentine's Day, everyone! Even if you don't put much stock in commercial Hallmark holidays. As I heard someone say yesterday, "Valentine's day is coming up, and you know what that means. That's right: absolutely nothing!"
But never mind what you may think about the holiday -- you just go ahead and have a happy day anyway, y'hear? Look at whatever pretty scenery you have near you; and be sure to enjoy some good chocolate.