9

I was able to set up a Grunt task to SFTP files up to my dev server using grunt-ssh:

sftp: {
    dev: {
        files: {
            './': ['**','!{node_modules,artifacts,sql,logs}/**'],
        },
        options: {
            path: '/path/to/project',
            privateKey: grunt.file.read(process.env.HOME+'/.ssh/id_rsa'),
            host: '111.111.111.111',
            port: 22,
            username: 'marksthebest',
        }
    }
},

But this uploads everything when I run it. There are thousands of files. I don't have time to wait for them to upload one-by-one every time I modify a file.

How can I set up a watch to upload only the files I've changed, as soon as I've changed them?

(For the curious: the server is a VM on the local network. It runs a different OS, and its setup is more similar to production than my local machine's. Uploads should be lightning quick if I can get this working correctly.)

mpen

5 Answers

7

What you need is grunt-newer, a task designed specifically to reconfigure another task so that it runs only on the files that have just changed, and then run it. An example configuration could look like the following:

watch: {
  all: {
    files: ['**','!{node_modules,artifacts,sql,logs}/**'],
    tasks: ['newer:sftp:dev']
  }
}
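For completeness, here is a minimal sketch of how the pieces might be wired together in the Gruntfile. The target names mirror the question's config, and it assumes grunt-newer, grunt-contrib-watch and grunt-ssh are installed as devDependencies:

// Minimal wiring sketch: the "newer:" prefix from grunt-newer filters the
// sftp task down to the files that have changed since the last run.
module.exports = function (grunt) {
    grunt.initConfig({
        sftp: {
            dev: { /* files and options from the question's sftp config */ }
        },
        watch: {
            all: {
                files: ['**', '!{node_modules,artifacts,sql,logs}/**'],
                tasks: ['newer:sftp:dev']
            }
        }
    });

    grunt.loadNpmTasks('grunt-newer');
    grunt.loadNpmTasks('grunt-contrib-watch');
    grunt.loadNpmTasks('grunt-ssh');

    grunt.registerTask('default', ['watch']);
};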
Ben
  • This doesn't seem to work very well in practice... it's uploading *everything* right now, and it's very slow at that. I'm not sure if it will upload only the new stuff the 2nd time around, but I don't think I have the patience to wait a few hours. I think my IDE bundled and uploaded stuff so it was a lot quicker. The watch also takes a long time to kick in, but I think that's SFTP's fault. – mpen Dec 06 '13 at 17:29
  • 3
    Yeah I can see the problem. Have you tried rsync? It only uploads the parts of the files that changed so in comparison to traditional FTP it's super quick. There exists a grunt wrapper, too: https://github.com/jedrichards/grunt-rsync – Ben Dec 06 '13 at 20:30
  • Wow. `rsync` is like a billion times faster at uploading the entire project directory, and it was really easy to set up, but it doesn't play nice with `newer`. I opened a [ticket](https://github.com/tschaub/grunt-newer/issues/21) with them; not sure if this is a fault in newer or I've mucked something up. – mpen Dec 06 '13 at 22:31
5

You can do that using the watch event of grunt-contrib-watch. You basically need to handle the watch event, modify the sftp files config to only include the changed files, and then let grunt run the sftp task.

Something like this:

module.exports = function(grunt) {
    grunt.initConfig({
        pkg: grunt.file.readJSON('package.json'),
        secret: grunt.file.readJSON('secret.json'),
        watch: {
            test: {
                files: 'files/**/*',
                tasks: 'sftp',
                options: {
                    spawn: false
                }
            }
        },
        sftp: {
          test: {
            files: {
              "./": "files/**/*"
            },
            options: {
              path: '/path/on/the/server/',
              srcBasePath: 'files/',
              host: 'hostname.com',
              username: '<%= secret.username %>',
              password: '<%= secret.password %>',
              showProgress: true
            }
          }
        }
    }); // end grunt.initConfig

    // on watch events configure sftp.test.files to only run on changed file
    grunt.event.on('watch', function(action, filepath) {
        grunt.config('sftp.test.files', {"./": filepath});
    });

    grunt.loadNpmTasks('grunt-contrib-watch');
    grunt.loadNpmTasks('grunt-ssh');
};

Note the `spawn: false` option, and the way you need to set the config inside the event handler.

Note 2: this code uploads one file at a time; the grunt-contrib-watch documentation describes a more robust variant that batches the changed files (see the sketch below).
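A sketch of that batching pattern, adapted from the grunt-contrib-watch README, as a drop-in replacement for the grunt.event.on handler above (the sftp.test.files key matches the config above; the 200 ms debounce delay is an arbitrary choice):

// Collect every changed path during a short debounce window, then reconfigure
// sftp.test.files once with all of them instead of one file per event.
var changedFiles = Object.create(null);
var onChange = grunt.util._.debounce(function() {
    grunt.config('sftp.test.files', { './': Object.keys(changedFiles) });
    changedFiles = Object.create(null);
}, 200);
grunt.event.on('watch', function(action, filepath) {
    changedFiles[filepath] = action;
    onChange();
});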

sdecima
3

You can achieve that with Grunt:

  • grunt-contrib-watch

  • grunt-rsync

First things first: I am using a Docker container, and I added a public SSH key to it. So I upload to my "remote" container only the files that have changed in my local environment, with this Grunt task:

'use strict';

module.exports = function(grunt) {

    grunt.initConfig({

        rsync: {
            options: {
                args: ['-avz', '--verbose', '--delete'],
                exclude: ['.git*', 'cache', 'log'],
                recursive: true
            },
            development: {
                options: {
                    src: './',
                    dest: '/var/www/development',
                    host: 'root@www.localhost.com',
                    port: 2222
                }
            }
        },

        sshexec: {
            development: {
                command: 'chown -R www-data:www-data /var/www/development',
                options: {
                    host: 'www.localhost.com',
                    username: 'root',
                    port: 2222,
                    privateKey: grunt.file.read("/Users/YOUR_USER/.ssh/id_containers_rsa")
                }
            }
        },

        watch: {
            development: {
                files: [
                'node_modules',
                'package.json',
                'Gruntfile.js',
                '.gitignore',
                '.htaccess',
                'README.md',
                'config/*',
                'modules/*',
                'themes/*',
                '!cache/*',
                '!log/*'
                ],
                tasks: ['rsync:development', 'sshexec:development']
            }
        },

    });

    grunt.loadNpmTasks('grunt-contrib-watch');
    grunt.loadNpmTasks('grunt-rsync');
    grunt.loadNpmTasks('grunt-ssh');

    grunt.registerTask('default', ['watch:development']);
};

Good Luck and Happy Hacking!

2

I recently ran into a similar issue where I wanted to upload only the files that had changed. I'm only using grunt-exec. Provided you have SSH access to your server, you can do this with much greater efficiency. I also created an rsync.json that is ignored by git, so collaborators can have their own rsync data.

The benefit is that when anyone makes a change, it automatically uploads to their own stage.

    // Watch - runs tasks when any changes are detected.
    watch: {
        scripts: {
            files: '**/*',
            tasks: ['deploy'],
            options: {
                spawn: false
            }
        }
    }

My deploy task is a registered task that compiles scripts and then runs exec:deploy (a sketch of that wiring follows the config below).

   // Showing exec:deploy task 
   // Using rsync with ssh keys instead of login/pass
    exec: {
        deploy: {
            cmd: 'rsync public_html/* <%= rsync.options %> <%= rsync.user %>@<%= rsync.host %>:<%= rsync.path %>'
        }
    }

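For context, the registered deploy task mentioned above could be wired up like this (hypothetical sketch; 'uglify' is just a stand-in for whatever the actual script-compilation step is in this setup):

// Hypothetical wiring: compile scripts first, then push with rsync over ssh.
grunt.registerTask('deploy', ['uglify', 'exec:deploy']);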
See all the `<%= rsync %>` template strings? I use those to grab info from rsync.json, which is ignored by git. I only have this split out because it's a team workflow.

// rsync.json
{
  "options": "-rvp --progress -a --delete -e 'ssh -q'",
  "user": "mmcfarland",
  "host": "example.com",
  "path": "~/stage/public_html"
}

Make sure your rsync.json is loaded in the Gruntfile:

module.exports = function(grunt) {

  var rsync = grunt.file.readJSON('path/to/rsync.json');
  var pkg = grunt.file.readJSON('path/to/package.json');

  grunt.initConfig({
    pkg: pkg,
    rsync: rsync,
    // ... the watch and exec config from above goes here
  });
};
docodemore
0

I don't think it's a good idea to upload every change to the staging server as it happens, and working directly on the staging server isn't a good idea either. You should configure your local machine to match staging/production as closely as possible.

It's better to upload once, when you deploy.

You can archive all the files using grunt-contrib-compress, push the single archive with grunt-ssh, and then extract it on the server; that will be much faster. (A sketch of the push-and-extract half follows the compress example below.)

Here's an example compress task:

compress: {
    main: {
        options: {
            archive: 'build/build.tar.gz',
            mode: 'tgz'
        },
        files: [
            // expand: true is needed for the cwd option to take effect
            {expand: true, cwd: 'build/', src: ['sites/all/modules/**'], dest: './'},
            {expand: true, cwd: 'build/', src: ['sites/all/themes/**'], dest: './'},
            {expand: true, cwd: 'build/', src: ['sites/default/files/**'], dest: './'}
        ]
    }
}

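The push-and-extract half isn't shown in the answer; here is a hedged sketch of it using grunt-ssh's sftp and sshexec tasks. The host, username, key and remote path are placeholders copied from the question, and the target names (deploy, extract) are arbitrary:

// Sketch only: push the single archive, then unpack it on the server.
sftp: {
    deploy: {
        files: { './': 'build/build.tar.gz' },
        options: {
            path: '/path/to/project/',
            srcBasePath: 'build/',   // upload as /path/to/project/build.tar.gz
            host: '111.111.111.111',
            port: 22,
            username: 'marksthebest',
            privateKey: grunt.file.read(process.env.HOME + '/.ssh/id_rsa')
        }
    }
},
sshexec: {
    extract: {
        command: 'tar -xzf /path/to/project/build.tar.gz -C /path/to/project',
        options: {
            host: '111.111.111.111',
            port: 22,
            username: 'marksthebest',
            privateKey: grunt.file.read(process.env.HOME + '/.ssh/id_rsa')
        }
    }
}

These could then be chained, e.g. grunt.registerTask('deploy', ['compress', 'sftp:deploy', 'sshexec:extract']).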
PS: I've never looked at the rsync grunt modules. I understand this might not be what you're looking for, but I decided to post it as a standalone answer.

Rantiev
  • It's not quite staging, it's just a different machine in our office. It was easier for our server admin to manage all the VMs this way, I guess, but otherwise I agree -- I usually develop locally or use Vagrant. This compression idea should come in handy later though, thanks! – mpen Feb 25 '14 at 16:27