16

I have a Laravel installation and have set up three environments with their own corresponding config directories:

  • local
  • staging
  • production

I use php artisan migrate:make create_users_table etc., as described here, to create database migrations.

In my local environment I use Vagrant and a simple MySQL server setup, and on staging & production I use AWS RDS.

To configure database access for the staging environment I have an app/config/staging/database.php file with settings like this:

...
"mysql" => array(
    "driver" => "mysql",
    "host" => $_SERVER["RDS_HOSTNAME"],
    "database" => $_SERVER["RDS_DB_NAME"],
    "username" => $_SERVER["RDS_USERNAME"],
    "password" => $_SERVER["RDS_PASSWORD"],
    "charset" => "utf8",
    "collation" => "utf8_unicode_ci",
    "prefix" => "",
),
...

I use git to deploy the app with git aws.push as described here.

The question is: How do I run the migration on my staging (and later production) EBS server when deploying?

oskarth
  • 927
  • 2
  • 9
  • 18

3 Answers

30

I solved it by creating a new directory in the root of my project named .ebextensions. In that directory I created a script file my-scripts.config:

.ebextensions/
    my-scripts.config
app/
artisan
bootstrap
...

The file my-scripts.config is a YAML file that gets executed when EBS deploys; it looks like this:

container_commands:
    01-migration:
        command: "php /var/app/ondeck/artisan --env=staging migrate"
        leader_only: true

Add the directory and file to git, commit, and run git aws.push and it will migrate.
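Assuming the AWS DevTools git integration is already set up (as in the question), the deploy steps would look something like this (the commit message is just an example):

```
# Stage and commit the new deployment config, then push to Elastic Beanstalk
git add .ebextensions/my-scripts.config
git commit -m "Run artisan migrate on deploy"
git aws.push
```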

Explanations on how stuff in .ebextensions works can be found here.

The path /var/app/ondeck is where your application lives when your script runs; it is copied into /var/app/current afterwards.

The artisan option --env=staging tells artisan which environment it should run in, so that it can find the correct database settings in app/config/staging/database.php.
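For reference, in Laravel 4 the environment is normally detected by hostname in bootstrap/start.php, and --env simply overrides that detection. A minimal sketch (the hostname string is illustrative):

```php
<?php
// bootstrap/start.php (excerpt) -- Laravel 4 environment detection.
// Running artisan with --env=staging bypasses this detection and forces
// the "staging" environment, so app/config/staging/ is loaded.
$env = $app->detectEnvironment(array(
    'local' => array('my-vagrant-hostname'), // hypothetical machine name
));
```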

If you need a quick and dirty way to log why the migrate command fails, you might want to try something like "php /var/app/ondeck/artisan --env=staging migrate > /tmp/artisan-migrate.log" so that you can log into your EC2 instance and check the log.
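Written out as a full container_commands entry, that logging variant might look like this (the log path is just an example, and 2>&1 also captures stderr):

```yaml
container_commands:
    01-migration:
        # Redirect output so failures can be inspected on the instance.
        command: "php /var/app/ondeck/artisan --env=staging migrate > /tmp/artisan-migrate.log 2>&1"
        leader_only: true
```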

oskarth
  • 927
  • 2
  • 9
  • 18
  • so what will happen for the next migrations I am doing? Like for those I cannot add them again to GIT as they are already added. Or I cannot run ```git aws.push``` . So what will be the step for that? - Thanks. – Himel Nag Rana Jan 13 '15 at 12:02
  • @HimelNagRana I don't quite follow, but you may be misunderstanding how migrations (should) work. All your migration scripts should always be in the git repo. Your database isn't supposed to be "reset" on every deployment, the migration should only perform the small changes in your `up` method in the migration script. Check out the migration docs: http://laravel.com/docs/4.2/migrations – oskarth Jan 14 '15 at 13:07
  • I think i was not able to make myself clear. Sorry for that. I understand how migration works (or should work). It was rather a deployment related question which I was able to figure out how. For example, let's say i want to add "last_login" to user entity. Then there i will generate migration and run it. My question was that do i need to do anything exceptional to make the migration run while deploying to ElasticBeanstalk? Later I found that the answer is "no". Thanks anyway. – Himel Nag Rana Jan 15 '15 at 14:11
  • @HimelNagRana I see, like you found out yourself you don't need anything other than the migrate command above. Cheers! – oskarth Jan 16 '15 at 18:20
  • I'm trying to implement the same for an node app with sequelize. Can you explain me, the leader_only flag. Is this an amazon eb option? Is it ensuring that the migration is only run once? – Manuel Oct 24 '16 at 11:50
  • 1
    @Manuel That's what it says, sort of, in the docs: http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/customize-containers-ec2.html (search for leader_only). I have, since writing this SO Q&A, started performing the migrations from a CI environment so I'm not using this method anymore. – oskarth Nov 18 '16 at 13:52
  • One might want to make the environment variables available first, see here https://aws.amazon.com/premiumsupport/knowledge-center/elastic-beanstalk-env-variables-shell/ . – reim Aug 09 '21 at 13:23
6

Since oskarth's answer was written, the way AWS Elastic Beanstalk deploys a new application version has changed. According to the AWS documentation on the container_commands section of .ebextensions, if the "cwd" option is not set, the working directory is the staging directory of the unzipped application. This means that during deployment, commands run from /var/app/staging/, where the extracted source version of the application is. The artisan command can therefore be executed either on its own or prefixed with the /var/app/staging/ path instead of /var/app/ondeck/, like this:

container_commands:
    01-migration:
        command: "php artisan --env=staging migrate"
        leader_only: true

or this

container_commands:
    01-migration:
        command: "php /var/app/staging/artisan --env=staging migrate"
        leader_only: true

I have deployed my project using both of the configurations above. I discovered this after hours of looking at the eb-engine.log file and reading the documentation over and over again; I hope no one else takes as long. The logs can be accessed through the eb logs command in the terminal, in the environment console, or through the S3 bucket associated with the environment. Pretty much everything is explained in the documentation. I would have commented on oskarth's answer, but I'm not allowed to yet!

PS: the /var/app/staging path has no relation to the staging environment in Laravel.
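For example, pulling the deployment logs mentioned above with the EB CLI might look like this (assuming eb is installed and the environment is initialised):

```
# Fetch the recent log tail from the environment's instances
eb logs

# Or SSH into an instance and inspect the engine log directly
eb ssh
less /var/log/eb-engine.log
```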

Matheus Camara
  • 469
  • 6
  • 8
2

Worth mentioning here: if you are running your application in a Docker container on Beanstalk, you will have to run the migration inside the container.

files:
    "/opt/elasticbeanstalk/hooks/appdeploy/post/98_build_app.sh":
        mode: "000755"
        owner: root
        group: root
        content: |
            #!/usr/bin/env bash
            echo "Running laravel migrations" >> /var/log/eb-activity.log
            docker exec $(docker ps -qf name=php-fpm) sh -c "php artisan --env=staging migrate --force || echo Migrations didnt run" >> /var/log/eb-activity.log 2>&1

The problem is the container name changes each time.

The docker ps -qf name=php-fpm part just fetches the ID of a container whose name contains php-fpm, so replace that filter with something that matches the container you want to run the migration in.

Also use --force, because otherwise artisan is going to wait for a confirmation prompt.

Oli Girling
  • 605
  • 8
  • 14