
Well, I'm currently teaching a database course at university. The students have already gone through advanced programming (which includes torturing them through a whole semester with massive projects; they really learn a lot).

In my course there is currently a project in which I give each student an account on a server, and they have to connect to PostgreSQL using PHP, run some queries, solve some problems that SQL alone can't handle using PHP, etc.

But I'm thinking of ways to improve this. Currently they use FTP to connect to the server, upload the PHP files and then see how the changes went. Of course, if they forget a semicolon or something like that, Apache just won't render the page and they won't have any clue as to what went wrong.

What do you think about the technologies used?

In another question I asked, I was told to use Git, but I think Git is slow for quick testing, since you need to SSH into the server and do a git pull after every change.

Do you have any suggestions or ideas?

dawud
  • Take a look at [these](http://serverfault.com/questions/498792/git-prime-hub-website-deployment-documentroot-changes-save-fine-but-permission/498845#498845) [Q/As](http://serverfault.com/questions/505370/how-to-enable-group-writable-suexec-capable-cgi-scripts/505394#505394). They both deal with ways of automating code publishing, as described in detail [here](http://joemaller.com/990/a-web-focused-git-workflow/). – dawud Jun 21 '14 at 15:48
  • 1
    If you're not happy with FTP, why not go to the max and use a PaaS solution like [**OpenShift**](http://openshift.github.io/) which supports uploading the code with **git**? – Cristian Ciupitu Jun 23 '14 at 01:51

1 Answer


`rsync` will be much faster than FTP for synchronising files repeatedly: it simply checks the file size and timestamp on the receiver to decide whether a file should be overwritten, thereby minimising the amount of data transferred on each synchronisation. If there are a lot of files you may want to consider creating a tarball automatically (triggered by, for example, `inotifywait`) and syncing that, for example using `tar -cz . | ssh my-server tar -C /my/app/dir -xz` (untested).
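For comparison, here is a minimal, locally runnable sketch of the `rsync` step. In the real setup the destination would be `my-server:/my/app/dir` over SSH; here both sides are throwaway temporary directories so you can try it without a server:

```shell
#!/bin/sh
# Minimal local demonstration of the rsync sync step. In the real
# setup the destination would be my-server:/my/app/dir over SSH;
# here both sides are temporary directories so the sketch is runnable.
src=$(mktemp -d)
dst=$(mktemp -d)
echo '<?php echo "hello"; ?>' > "$src/index.php"

# --archive preserves permissions and timestamps (which rsync uses to
# skip unchanged files); --delete removes files on the target that
# were deleted locally, keeping both sides identical.
rsync --archive --delete "$src/" "$dst/"
```

Run it twice after changing only one file and you will see rsync skip everything else, which is exactly the speedup over re-uploading via FTP.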

`make` can be used to reduce testing to a single command. Something like this `Makefile` should do the trick (note that you have to use a Tab character for indentation):

test:
    rsync --recursive --progress . my-server:/my/app/dir
    firefox https://my-server/test-page
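The `firefox` line above just opens the page for eyeballing; if you want a quick pass/fail check from the command line instead, you could swap in a `curl` call. A sketch, where `my-server` and the page path are placeholders just like in the Makefile:

```shell
#!/bin/sh
# Sketch of a curl-based smoke test ("my-server" and "/test-page" are
# placeholders for your actual host and page).
check_page() {
    url="$1"
    # --fail makes curl exit non-zero on HTTP errors (4xx/5xx), so a
    # PHP fatal error that results in a 500 fails the check immediately.
    if curl --fail --silent --show-error "$url" > /dev/null; then
        echo "OK: $url"
    else
        echo "FAIL: $url" >&2
        return 1
    fi
}

# In the Makefile you would then call, e.g.:
#   check_page https://my-server/test-page
```

Because the function's exit status propagates, `make test` will stop with an error as soon as the page breaks, which answers the "forgot a semicolon" problem in the question.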

If you want to run some automated tests rather than checking the changes manually every time, you can use Selenium.

PHP has a lot of stupid bugs, and personally (after an MSc + 10 years as a programmer) I believe it is positively brain-damaging. It is so riddled with terrible mistakes that I hope you will consider using a more sane language, such as Java, Python, Ruby or even Perl. Bad habits take a long time to die, and PHP teaches a lot of bad habits.

l0b0