
The latest code in the repo (v1.0) is for Symphony 2.3 only (see compatibility info). You need to check out tag 0.9.1 if you're using Symphony 2.2.5, and follow the old instructions in the README of that version to add a line of code to Symphony's class.mysql.php.
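For reference, pinning the extension to the older tag would look something like this (assuming the extension lives in `extensions/db_sync`, as the error paths later in this thread suggest):

```shell
# Pin the extension to the last release compatible with Symphony 2.2.5
cd extensions/db_sync
git fetch --tags
git checkout 0.9.1
```

Then follow that tag's README to add the required line to class.mysql.php.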


Nick, I will be using this great extension in my big project, and I am thinking about the following setup:

  • Local symphony repo with workspace as a submodule, both with three branches: master, production and development
  • Two bare repos on my server, one for symphony, the other for the workspace
  • Pushing the production or development branch of symphony to the server will trigger one of two scripts via the post-receive hook, which will check out the code and update all submodules.
  • Pushing to the workspace repo on the server (again, prod or dev) will have nearly the same effect, but with two extras: dump the database to my backup space and run the sql file provided by your extension.
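A post-receive hook along those lines might look roughly like this. This is only a sketch: the paths, branch names, credentials, and the `db-sync.sql` filename are all assumptions to adapt to your setup.

```shell
#!/bin/sh
# post-receive hook for the bare workspace repo (sketch; all paths,
# credentials and the db-sync.sql filename are assumptions).
WORKTREE=/var/www/symphony/workspace
BACKUP=/backups/symphony

while read oldrev newrev ref; do
    branch=${ref#refs/heads/}
    [ "$branch" = production ] || [ "$branch" = development ] || continue

    # Dump the database to the backup space before touching anything
    mysqldump -u symphony -psecret symphony_db \
        > "$BACKUP/pre-deploy-$(date +%Y%m%d%H%M%S).sql"

    # Check out the pushed branch into the live working tree
    GIT_WORK_TREE="$WORKTREE" git checkout -f "$branch"

    # Run the SQL file written by the extension, if one was committed
    if [ -f "$WORKTREE/db-sync.sql" ]; then
        mysql -u symphony -psecret symphony_db < "$WORKTREE/db-sync.sql"
    fi
done
```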

My workflow will be the following: locally I develop on master, then I merge and push to development for testing on the live server (with its different setup, etc.). If everything still works, I merge again and push to production.

From the readme and the code I wasn't really sure of one thing: can the sql file generated by this extension be run twice on the same database without any problems? This is kinda needed for this workflow, as there will be cases where I have to push to development a few times before pushing to production.

Also, what do you think of this approach? Do you think it might work?

can the sql file generated by this extension be run twice on the same database without any problems

No! That will cause big problems. I suggest trying Remie's CDI extension instead; it is designed to pass these changes upstream as easily as possible.

No!

I was afraid you were going to say that. I don't really like the concept of having to install the extension on the server (as with CDI), so if there is a workaround I'd rather go with that. ;)

So, I've been thinking: what about the following? In the pre-commit hook I git add the sql file generated by this extension, then in the post-commit hook I remove the file (both from git and from the filesystem). Edit: this way the sql file will only contain the changes made within that specific commit.

Then, in the post-receive hook I will go through all commits in chronological order (earliest first), and import the sql file if it is present.
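A minimal sketch of those two local hooks, assuming the extension writes its output to workspace/db-sync.sql (adjust the filename to whatever your install actually uses):

```shell
#!/bin/sh
# Install the two local hooks (sketch; the db-sync.sql path is an assumption).
mkdir -p .git/hooks   # normally present already in any repo

# pre-commit: stage the generated SQL file so the commit carries only the
# queries made since the previous commit
cat > .git/hooks/pre-commit <<'EOF'
#!/bin/sh
[ -f workspace/db-sync.sql ] && git add workspace/db-sync.sql
exit 0
EOF

# post-commit: remove the file from git's index and from disk so the next
# commit starts from a clean slate
cat > .git/hooks/post-commit <<'EOF'
#!/bin/sh
git rm --cached --quiet workspace/db-sync.sql 2>/dev/null
rm -f workspace/db-sync.sql
EOF

chmod +x .git/hooks/pre-commit .git/hooks/post-commit
```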

Sounds possible?

I suppose you could modify the extension so that instead of saving to a single file, you save a new timestamped file for every "event" (every submit action that causes database changes). You then get a stream of .sql files that you must run in sequence. So when new files are found, you apply them. This way you aren't trying to work out which part of the file to run. Just find the files, execute them, and delete them.

You must also add an element of backup — you should dump the main database before you run any .sql files, just in case the execution doesn't work. This extension works, but I can't vouch for it being perfect. You should always take a snapshot and store it, so that you can roll back.
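The find-execute-delete loop plus the snapshot step could be sketched like this. Paths and credentials are hypothetical, and shell globbing already yields the timestamped files in order:

```shell
#!/bin/sh
# apply_pending_sql DIR CMD...: feed each .sql file in DIR to CMD in
# lexicographic (i.e. timestamp) order, deleting files that apply cleanly.
apply_pending_sql() {
    dir="$1"; shift
    for f in "$dir"/*.sql; do
        [ -e "$f" ] || continue          # no pending files
        "$@" < "$f" && rm -- "$f"
    done
}

# Usage (hypothetical paths/credentials) -- snapshot first, then apply:
#   mysqldump -u symphony -psecret symphony_db > "backup-$(date +%s).sql"
#   apply_pending_sql /var/www/manifest/sql-queue \
#       mysql -u symphony -psecret symphony_db
```

Taking the dump before calling `apply_pending_sql` is what gives you the rollback point if one of the files fails.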

Where CDI works well is that it keeps track of what has and hasn't been run between builds. It is my preferred solution.

So when new files are found, you apply them. This way you aren't trying to work out which part of the file to run. Just find the files, execute them, and delete them.

Yes, this is basically what I proposed, but then in a single file that gets deleted after it is committed.

Where CDI works well is that it keeps track of what has and hasn't been run between builds. It is my preferred solution.

Hmm, if you really prefer it I might be overlooking things. CDI has to run on the production server too, correct? And on the production server, does it log each query (much like your extension), or not?

If it does, I don't really (or rather: really don't) want to use the CDI, because of the added overhead.

Maybe I can help out here.

The CDI extension consists of two parts: a log file on the development server and a log database table on the production server.

You can version-control the log file and simply push it to the production server. From there you call an extension content page, which parses the log file and runs the SQL queries sequentially. It checks the database log table to see whether each query has already been executed.
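To illustrate the idempotency check only (this is not CDI's actual code: CDI records executed queries in a database log table, while this sketch uses a plain file as the stand-in ledger):

```shell
#!/bin/sh
# run_once LEDGER STATEMENT CMD...: execute STATEMENT via CMD only if its
# checksum is not already recorded in LEDGER (a file standing in for CDI's
# database log table on the production server).
run_once() {
    ledger="$1"; stmt="$2"; shift 2
    sum=$(printf '%s' "$stmt" | cksum | cut -d' ' -f1)
    if grep -q "^$sum\$" "$ledger" 2>/dev/null; then
        return 0                         # already executed here: skip
    fi
    printf '%s' "$stmt" | "$@" && echo "$sum" >> "$ledger"
}
```

Replaying the same log file a second time then leaves the database untouched, which is exactly what makes repeated pushes to development safe.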

There is not much overhead involved: a database table sits there and waits until the extension content page is triggered. It does not automagically do stuff; you need to trigger the update manually. Only during the update does it use the database table.

There is also no need to make core adjustments on the production server. This only needs to be done on the development server and will be replaced with the new query delegates in the next release of CDI.

If you still consider this too much overhead on the production server, you might be interested to know that CDI also has a Database Synchronizer mode implemented. It was designed to make DBSync obsolete, with the added bonus of keeping the old 'Nick Dunn' twist for those who like it old school.

However, like Nick, I do recommend you consider the CDI implementation: it was designed to ensure that SQL statements are executed in the same sequence they ran in on the development machine. The goal for the next version is to also make it work with multiple sources (teams with multiple development databases that sync into one production server). This is still a challenge, but if it works, it will greatly speed up the development effort of teams working on the same Symphony project.

Give it a try! If you want, I can even give you a demonstration (the added benefit of living in the same country).

with the added bonus of keeping the old 'Nick Dunn' twist for those who like it old school

+4 points.

Remie, thanks very much for this detailed explanation. I think I misunderstood the point of your extension, and I will take a more thorough look at it tomorrow, because what you described sounds like exactly what I need.

I have one question though: is it possible to import the sql from the command line, or would I have to write my own php code for this? I would love to put all the heavy lifting inside a git hook, so that I can't screw things up.

is there a possibility of importing the sql from the command line

Yes, this is possible. I have a CI server running that handles deployment of my projects, including a script that calls the CDI update mechanism. I use cURL to make an HTTP call to www.example.org/symphony/extension/cdi/update/. So whenever I deploy to my production server, it automatically runs the DB update to sync my changes.
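In a deploy script that boils down to a single call (the URL is the one given above; add authentication options if the page is protected):

```shell
# Trigger the CDI update after deploying files; -f makes curl fail loudly
# on an HTTP error so the deploy script can abort.
curl -fsS "http://www.example.org/symphony/extension/cdi/update/"
```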

Perhaps it is also possible to run the PHP script directly. The script responsible for the update can be found in extensions/cdi/content/content.update.php. You can try to see whether it can be run from the command line.

Remie, would you be able to write a tutorial on how you have this all set up and running? Does this include a dev/stage/prod setup?

Should probably continue this in the CDI thread if that's where it's headed :-)

Hello hello, not sure if you are still up for hassling with 0.9.1, but when I have the Search Index extension installed and execute a search on the frontend I get the following error:

Fatal error: Class 'Administration' not found in .../extensions/db_sync/lib/class.logquery.php on line 27

Any ideas? Thanks!

Remie, I'm trying out your CDI. Nice extension, really.

But in my first and second attempts some things broke, so I would like to know whether CDI works fine with Localisation Manager, or has issues with certain extensions.

I will try harder to reproduce the same problem, but first I need to check that I'm following the correct workflow:

1) Installed CDI on Local and Staging.
2) Added the cdi.sql file to my Git deploy workflow.
3) Set Local as Master and Staging as Slave.
4) Updated config.php manually, because config is outside my Git deploy workflow.
5) Made the updates.
6) Deployed the changes (files, cdi.sql, etc.).
7) Ran the URL from the CDI preferences.
8) It returned OK with the number of queries executed and skipped.

Thanks.

@Remie +1 for a tutorial. CDI seems like something that could be of great benefit for me but I find it hard to wrap my head around.

@Remie +1 for a tutorial too please. I'm currently using the DB Dump extension to take a copy of the DB, work on it locally, then upload the whole DB back to production. CDI seems a much more elegant solution, but when I tried it in the past I couldn't get it to work right/as expected; that's certainly down to my usage of it and definitely not any fault in your code!

Symphony 2.4 compatibility issue: Symphony Fatal Error: Call to undefined method Administration::saveConfig()

https://github.com/symphonists/db_sync/issues/5

It needs to be Symphony::Configuration()->write(); I will send a PR to fix this up.
