So I've been working locally using git to manage Symphony, extensions etc. and recently tried to get my server involved, but it's been dreadful... I must have spent over 20 hours trying to figure out how to get everything up and running. I've read "Getting Git for Symphony," and pretty much every thread on this forum, StackOverflow, and blogs found via Google.

Q1. Is my setup "proper"?

I considered/encountered many problems and tried many solutions.

  • Bare repo in private directory + pull to public directory
  • Bare repo in private directory with post-receive hook to copy to public directory
  • Repo in public directory
  • All the above with different post-receive hook files trying to get submodules to be pushed as well

The only thing that worked for me was simply pushing to a repo in a public directory /home/cooluser/domain.com without submodules using git push origin master. I then ssh into the server and run git checkout -f and git submodule update --init --recursive. I have no idea why these cannot be put into a post-receive hook when running them manually works.
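For what it's worth, a likely reason those two commands fail inside post-receive: git runs hooks with GIT_DIR set in the environment, so any working-tree command inside the hook operates on the repo's git directory instead of your checkout unless you unset it first. A minimal sketch, assuming the layout described above (the paths are examples):

```shell
#!/bin/sh
# post-receive hook - hooks inherit GIT_DIR from git, so unset it
# before running working-tree commands, or they act on the wrong repo.
cd /home/cooluser/domain.com || exit 1
unset GIT_DIR
git checkout -f
git submodule update --init --recursive
```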

Is this setup going to work for me down the road? I am able to push new code without a problem, but it unnerves me that if I push something bad, my site will go down. Otherwise, the extra steps are just annoying but bearable.

Q2. Pushing submodules: is it possible?

Just wanted to make sure there really is no way to push submodules directly, i.e. without logging in to the remote server over ssh and fetching the submodules from their original repos. For example, I modified the facebook_toolkit extension, but I could not push that submodule. I had to remove the submodule and add its files to my main project, so now I won't be able to pull any new changes to the extension.

Q3. How do I deal with the databases and manifest/config.php?

Currently, my whole project is tracked and pushed, except for the contents of manifest/cache, manifest/logs, and manifest/tmp, which I've ignored. Since putting my code on a server, I've changed my config.php to the remote MySQL database. This is nice to test with live data, but when developing new features (messing around with sections etc.), I want to use a local database.

Is DB Synchroniser the proper tool to use to sync? Furthermore, how do you deal with config.php? Should I

  1. Keep config.php tracked in Git, change it to my local db during development, and change it back to the remote db before I push.
  2. Don't track config.php at all. But then extensions that change things in preferences will break.
  3. ???
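One more possibility for option 3, sketched in a throwaway repo below: keep config.php tracked, but mark it skip-worktree on each machine so local edits are invisible to git. This is a generic git trick, not anything Symphony-specific, and it shares option 2's drawback that changes to the file need manual handling:

```shell
# Throwaway demo of git's skip-worktree flag on manifest/config.php.
# On a real install you would only run the two update-index lines.
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git config user.email demo@example.com
git config user.name demo
mkdir manifest && echo "<?php // remote db settings" > manifest/config.php
git add manifest/config.php && git commit -qm "track config"

# Tell git to ignore local changes to this file on this machine:
git update-index --skip-worktree manifest/config.php
echo "<?php // local db settings" > manifest/config.php
git status --porcelain            # prints nothing - the edit is invisible

# Undo it when you do want to commit or pull changes to the file:
git update-index --no-skip-worktree manifest/config.php
```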

Q4. What is your setup?

Just curious to see how others are working.

There are many roads to Rome, but here's what works (well!) for me:

I have cloned Symphony from GitHub to my local development directory and then installed Symphony as usual, to create the needed directories and (config) files. Because I will be maintaining three installations with this one repo (development, production and local), I have copied the manifest directory three times: manifest.dev, manifest.local and manifest.prod. I then removed the original directory and symlinked manifest to the local version.
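In case it helps anyone following along, the manifest shuffle boils down to something like this (sketched in a throwaway directory; on the server you would symlink manifest.prod instead):

```shell
# Demo of the per-environment manifest layout in a scratch directory.
tmp=$(mktemp -d) && cd "$tmp"
mkdir manifest && touch manifest/config.php   # stand-in for a real install

cp -R manifest manifest.dev
cp -R manifest manifest.local
cp -R manifest manifest.prod
rm -rf manifest
ln -s manifest.local manifest   # on the server: ln -s manifest.prod manifest

ls -l manifest                  # shows where the symlink points
```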

This whole bunch is then pushed to a private repo on GitHub, to be able to pull from it on my server, and to have a backup ready when things go bad.

With this set up, I configured Capistrano to do all the heavy lifting for me. Capistrano logs into the server over ssh, pulls the repo into a clean directory outside the webroot, checks out every changed submodule, symlinks workspace/uploads to a shared directory, symlinks manifest to the right version (dev or prod) and finally symlinks the webroot to the new deploy folder.
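For anyone not using Capistrano, the same flow can be approximated with a plain shell script; everything below (paths, repo URL, the choice of manifest.prod) is a made-up example, not the actual config:

```shell
#!/bin/sh
# Rough shell equivalent of the Capistrano flow: clone into a fresh
# release directory, wire up the shared/manifest symlinks, flip the webroot.
set -e
RELEASE=/home/deploy/releases/$(date +%Y%m%d%H%M%S)
git clone --recursive git@github.com:you/yoursite.git "$RELEASE"
ln -sfn /home/deploy/shared/uploads "$RELEASE/workspace/uploads"
ln -sfn "$RELEASE/manifest.prod" "$RELEASE/manifest"
ln -sfn "$RELEASE" /var/www/current    # the symlink flip is near-atomic
```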

I have found that the CDI and database sync extensions do not really work for me; conflicts often occur, so I prefer to do things by hand - doing a dump before I begin, just in case.

The files and database are then up to date, but since I have installed Varnish and APC, I need to restart PHP and flush Varnish for the changes to show up in the browser.
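That restart/flush step depends entirely on your stack; on a Debian-ish box with PHP-FPM it might look something like this (service names and the Varnish ban expression are examples, adjust to your setup):

```shell
# Example only - service names differ per distro and PHP setup.
sudo service php5-fpm restart        # or apache2 if PHP runs via mod_php
varnishadm "ban req.url ~ ."         # invalidate everything in Varnish
```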

If you want I could clean sensitive data from my capistrano config file and share it.

I also use multiple manifests like creativedutchmen describes above, thus keeping the different config.php files under git. The only annoyance/gotcha is that any changes to this file (e.g. when adding a new extension that adds/edits a config block) need to be manually copied across to your other config.php files. I use vim as my editor, so for me it's a simple case of running :diffthis and manually syncing the differences. If your editor/IDE doesn't have diff functionality built in, just look for a diff program to help you out.

For my setups I create a bare git repo (git init --bare) as a remote I call origin. I push from my development clone of this repo and pull into a clone that serves as my /var/www/htdocs folder. I have a post-receive hook set up that automatically pulls from origin, updates any submodules and runs any other commands I might need. My post-receive hook is as follows (note you need to unset GIT_DIR and do a few other things to keep git happy - google "deploy a website using git" for more details; there are some great tutorials I followed to set this all up first time around):

#!/bin/sh
cd /var/www/htdocs/example.com/ || exit
unset GIT_DIR
git pull origin master
echo "Updating git submodules..."
git submodule init
git submodule update
echo "Setting correct permissions for workspace..."
find workspace -type f -exec chmod 644 {} \;
echo "Running git-update-server-info..."
git update-server-info
echo "Post receive hook complete."

Notice I've included an example command to fix the permissions in the workspace; you might not need this, and you might want to include your own commands here - this is just an example.
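To round the picture out, the one-time wiring for this workflow looks roughly like this (hostnames and paths are placeholders; the hook file is the script you save on the server as hooks/post-receive):

```shell
# On the server: create the bare repo that acts as "origin",
# then make its post-receive hook executable once it's in place.
ssh user@server 'git init --bare /home/user/repos/example.git'
ssh user@server 'chmod +x /home/user/repos/example.git/hooks/post-receive'

# Locally: point "origin" at the server and push.
git remote add origin user@server:/home/user/repos/example.git
git push origin master
```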

For any changes that are made to my live server - editors adding new content, images etc. - I ssh to my server, git add anything in /var/www/htdocs, then commit and push back UP to origin. I can then git pull origin master in my development repo to sync with live before starting new development work.

I have the Dump DB extension installed and the sql file under git too, so if I need a copy of the database I simply dump the db on live before I start and push, then pull into development, do a db restore and get to work. The db syncing is the most fragile bit of this setup, but you always have an sql backup under git, and it's a simple workflow that tends to work well. (I've tried to use the CDI extension but could never get it to work.)

The big gotcha with this workflow is that it won't work if you have front-end users making changes to your site, like updating their profiles, since they could be editing while you're working on the development db and things will get out of sync. For that you would definitely need a more complex CDI-type solution, but for simple brochure-type sites this workflow is fine.
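As a concrete sketch of that round trip (hostnames, paths and the sql filename are all placeholders - Dump DB writes the file wherever you've configured it):

```shell
# 1. Commit live content changes and push them UP to origin.
ssh user@server 'cd /var/www/htdocs/example.com \
  && git add -A \
  && git commit -m "content edits from live" \
  && git push origin master'

# 2. Locally, pull the changes down before starting new work...
git pull origin master

# 3. ...and restore the dumped database into your local MySQL.
mysql -u dev -p symphony_dev < workspace/dump.sql
```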

NOTE: I follow this git branching model religiously, so my master branch is only ever in a production-ready state. I HIGHLY recommend following this branching model (or something similar); not doing so while using my setup above could lead to very bad outcomes, since you might end up doing development work in your master branch.

Hope it helps some.

Thanks for that link firegoby.

My attempt can be seen here: database integration manager extension

I'm using it in 2 production sites at the moment and it is working a dream! Any questions about it, let me know.

I work in a more-or-less similar setup. Except that we've got our Server Admin guy to take care of all the git stuff, meaning server setups & checkouts.

We have set up a multi-dev version of CDI (still testing) but seems to be working fine so far. The important thing when using CDI is that you only ever touch blueprints/links to pages etc on the dev/local machine and never on production. Otherwise funny things might start to happen.

What we do otherwise is that whenever we push our master branch, it is checked out automatically by the production server and the following commands are run: git submodule init, git submodule sync and git submodule update. This ensures we only have the latest & correct versions on production.
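For reference, those three steps in order; the sync in the middle matters because it refreshes the submodule URLs recorded in .git/config from .gitmodules, so swapped or forked submodules resolve correctly:

```shell
# Run in the checked-out working copy after each push to master.
git submodule init    # register any newly added submodules
git submodule sync    # refresh submodule URLs from .gitmodules
git submodule update  # check out the recorded commits
```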

I'm sure there are easier ways to set this up, but that's what our admin was happy to work with, & so far I have no complaints whatsoever.

Note: for all the submodules which we edit, we take a public/private fork depending on the situation, so we can pull updates from the original repo later on.
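For anyone wanting to replicate this, the fork-based submodule setup usually looks something like the following (all URLs here are placeholders, and facebook_toolkit is just the example from earlier in the thread):

```shell
# Point the submodule at your fork instead of the original repo:
git submodule add git@github.com:you/facebook_toolkit.git extensions/facebook_toolkit

# Inside the submodule, keep the original repo around as "upstream":
cd extensions/facebook_toolkit
git remote add upstream https://github.com/original/facebook_toolkit.git

git push origin master    # your edits go to the fork...
git fetch upstream        # ...while upstream fixes can still be merged in
git merge upstream/master
```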

Ah, I see. So you fork the submodule extension on GitHub, and then add that fork as a submodule? Then you push changes to your fork and pull changes from the original repo?

BTW to all: CDI has worked for me so far with git managing the files. The only thing is to remember to clear the CDI logs after each sync!

pat, if you want to avoid clearing the CDI logs after each sync, that's possible as well. I've been using this fork without issues so far (I'd still recommend backups, just in case).

Here is what I do to handle multiple databases between local and server:

http://note.io/HfB02j

Fast and easy. Not as fancy as all that. But hey!

Yes, but try saving your config from the Preferences page, and it will be overwritten.

There's months of discussion about it on the repo on github.

^ Or changing the sorted column in an entries view

Mmm... interesting. That's not good.
