  1. I downloaded and started the authoring environment (crafter-cms-authoring.zip)
  2. Created a site backed by a remote git repo as described in: Create site based on a blueprint then push to remote bare git repository
  3. Created a content type, new page.
  4. Published everything

Now I would expect to see my changes in the remote repo, but all I can see are the initial commits from step 2 above. No new content type, no new page, no "live" branch. (The content items are, however, visible in the local repo.)

What is missing?

Edit: Since Crafter can be set up in many ways, to clarify my deployment scenario I am adding a deployment diagram plus a short description.

(Deployment diagram: headless deployment.) There are 3 hosts, one for each environment, plus a shared git repo.

Authoring
This is where Studio is located and content authors make changes. Each change is saved to the local sandbox git repository. When content is published, the changes are pulled into the local published git repository. These two local repos are not accessible from other hosts.

Delivery
This is what serves published content to the end user/application. The Deployer is responsible for getting new publications to the delivery instance. It does so by polling (periodically pulling from) a specific git repository. When it pulls new changes, it updates the local site git repository and the Solr indexes.

Gitlab
This hosts the site git repository. It is accessible from both the Authoring and Delivery hosts. After its creation, the new site is pushed to this repo. The repo is also polled for new changes by the Deployers of the Delivery instances.

For this setup to work, the published changes must somehow end up in GitLab's site repo, but they do not (the red communication path from the Authoring Deployer to GitLab's site repo).


Solution based on @sumerz's answer

I implemented GitPushProcessor and configured a new deployment target in the authoring Deployer by adding mysite-live.yaml to /opt/crafter-cms-authoring/data/deployer/target/:

target:
    env: live
    siteName: codelists
    engineUrl: http://localhost:9080
    localRepoPath: /opt/crafter-cms-authoring/data/repos/sites/mysite/published
    deployment:
        pipeline:
            - processorName: gitPushProcessor
              remoteRepo:
                  url: ssh://path/to/gitlab/site/mysite
1 Answer

I think you might have confused push with publish.

On Publishing

Authoring (Studio) publishes to Delivery (Engine) after an approval workflow that makes content go live. Authoring is where content (and code, if you like) is managed and previewed safely; it is then published to the live delivery nodes for delivery to the end user.

On DevOps

A site's local git repository can be pushed/pulled to/from remote repositories. This means:

  • Code can flow from a developer's workstation to Studio (via GitHub, GitLab, Bitbucket, etc.) <== this is code moving forward (and it can flow via environments like QA, Load Testing, etc.)
  • Content can flow back, from Studio to the developer's local workstation, in a similar manner <== this is content moving backward (you can have production content on your laptop if you want)

When code flows forward from a developer to Studio, that's when Studio pulls from the remote git repo.

When content flows backward from Studio to the developer, that's when Studio pushes to the remote git repo.
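The two flows above can be sketched with throwaway local repos. This is a minimal, self-contained illustration, not Crafter's actual layout: "remote.git" stands in for GitHub/GitLab, "dev" for a developer's workstation clone, and "studio" for the site repo that Studio works against; all paths and file names are made up for the demo.

```shell
set -e
tmp=$(mktemp -d)
git init -q --bare "$tmp/remote.git"             # shared remote (stand-in for GitHub/GitLab)

git init -q "$tmp/dev"                           # developer workstation repo
cd "$tmp/dev"
git config user.email dev@example.com
git config user.name Dev
echo "new template" > page.ftl
git add page.ftl
git commit -qm "code change"
git remote add origin "$tmp/remote.git"
git push -q origin HEAD:refs/heads/master        # code flows forward to the remote

git clone -q -b master "$tmp/remote.git" "$tmp/studio"   # "Studio" pulls the code in
cd "$tmp/studio"
git config user.email studio@example.com
git config user.name Studio
echo "new article" > article.xml
git add article.xml
git commit -qm "content change"
git push -q origin master                        # content flows backward via the remote

cd "$tmp/dev"
git pull -q origin master                        # developer now has the content too
```

The remote is the hub in both directions; neither side ever pulls directly from the other's working repo.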

Documentation

A good bird's eye view of the architecture of the system relating to publishing can be found here: http://docs.craftercms.org/en/3.0/developers/architecture.html

A good article that explains the DevOps workflow/Git stuff is here: http://docs.craftercms.org/en/3.0/developers/developer-workflow.html


Update based on the expanded question

My new understanding based on your question is: you can't allow the deployers in Delivery to access Authoring's published repo to poll, due to some constraint (even over SSH, and even with limits on the source IP). You'd like to use GitLab as a form of content depot that Authoring pushes to and Delivery pulls from.

If my understanding is correct, I can think of two immediate solutions.

  1. Set up a cron job in Authoring to push to GitLab periodically. You'll need to add GitLab as a remote repo in published and then set up a cron entry like this:

* * * * * git --git-dir /opt/crafter/data/repos/sites/{YOUR_SITE}/published/.git push 2>&1

Test it out by hand first, then cron it.

  2. Write a deployer processor that can push content out to an endpoint upon a change or wait for this ticket: https://github.com/craftercms/craftercms/issues/2017. Once this is built, you'll need to configure another deployer in Authoring that will push to GitLab.

In either case, beware not to update things in GitLab directly, since you're using published and not sandbox. (See the DevOps notes above to learn why.)

sumerz
  • I do not think I did. I expanded the question with deployment details to avoid confusion with a development setup. I have already read the documentation you are referring to. The architecture doc is notably silent about how published changes get from authoring to the (possibly many) delivery instances. My knowledge of the publishing process comes mainly from the source code. Unfortunately, I am still missing some pieces to get the whole picture right. – Maros Ivanco Apr 04 '18 at 12:12
  • I understand the problem better now that you provided a diagram and notes. I updated my answer above to hopefully help. Several Crafter developers hang out on IRC: freenode.net #craftercms, if you have more involved/specific questions, it might be easier to chat there. – sumerz Apr 05 '18 at 15:49
  • I should also note that published changes flow to `published` repo, and then those are pulled by the deployers. You can have an unlimited number of delivery nodes with deployers pulling from `published` (or in your case, GitLab which is in sync with `published`). – sumerz Apr 05 '18 at 15:51
  • My overall goal is to dockerize Crafter into several images. I want to stick with the "1 process per container" strategy, hence my reluctance to either serve git or set up a cron job (suggestion 1). I have just implemented GitPushProcessor (as in suggestion 2), but instead of starting a separate deployer, I added the new target to the existing deployer. See the solution for details. A PushProcessor PR is on the way. – Maros Ivanco Apr 17 '18 at 15:40