
What if it were possible to dump the existing database structure of the sym_entries_n tables, then run a comparison that generates the SQL to drop, alter or create tables so that the existing structure matches the structure being imported, using something like this?
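For illustration, the comparison step might look something like this in Python. This is a minimal sketch, assuming each snapshot is just a mapping of table names to column definitions; the table and column names below are invented, not Symphony's actual schema:

```python
# Hypothetical schema diff: given two snapshots of table structure
# (table name -> {column name -> column definition}), emit the SQL
# needed to make the live schema match the one being imported.

def schema_diff(live, target):
    statements = []
    # Tables present live but absent from the import get dropped.
    for table in sorted(set(live) - set(target)):
        statements.append(f"DROP TABLE `{table}`;")
    # Tables only in the import get created.
    for table in sorted(set(target) - set(live)):
        cols = ", ".join(f"`{c}` {d}" for c, d in target[table].items())
        statements.append(f"CREATE TABLE `{table}` ({cols});")
    # Shared tables get column-level ALTERs.
    for table in sorted(set(live) & set(target)):
        l, t = live[table], target[table]
        for col in sorted(set(l) - set(t)):
            statements.append(f"ALTER TABLE `{table}` DROP COLUMN `{col}`;")
        for col in sorted(set(t) - set(l)):
            statements.append(f"ALTER TABLE `{table}` ADD COLUMN `{col}` {t[col]};")
        for col in sorted(set(l) & set(t)):
            if l[col] != t[col]:
                statements.append(f"ALTER TABLE `{table}` MODIFY `{col}` {t[col]};")
    return statements

live = {"sym_entries_data_1": {"id": "INT", "value": "TEXT"}}
target = {"sym_entries_data_1": {"id": "INT", "value": "VARCHAR(255)"},
          "sym_entries_data_2": {"id": "INT"}}
for sql in schema_diff(live, target):
    print(sql)
```

A real tool would also need to diff indexes, keys and table options, which is where most of the complexity lives.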

Nick's Database Synchroniser extension still looks like the safest route.

Yep. Still thinking this through. Should there be more of a manual process to select tables to create on import, something that integrates with Nick's Entity Diagram extension? Or do I go back to work on the Schema Migration extension?

Stephen I went through the same thought process as you, and Database Synchroniser was the result. In my eyes it neatly solves the problem of separating structure from content, and does so at the most atomic level. It doesn't need to understand the concept of a section, a field or an extension; it just tracks the differences to the database.

Are you looking for something more visual, perhaps a UI to perform the migration itself? Database Synchroniser used to have a UI for importing/exporting the SQL file, but I removed it in favour of simplicity, and to encourage people to incorporate this process into a more strict workflow (including git) rather than relying on FTPing SQL files around.

Granted, we've not completely nailed the whole thing, but what requirement do you have that these existing extensions don't satisfy?

I'm thinking about a workflow where I use a combination of both the (updated) Export Ensemble extension and the Dump DB extension to maintain a repository that can be installed from any commit. So, for example, as I build an ensemble from the ground up, you could potentially grab the commit for the Forum ensemble sections before they get populated with data and install the ensemble from that point. The installer files from the Export Ensemble satisfy that requirement.

Once installed, I was wondering whether it would be possible to use the Dump DB extension to rewind or fast forward the database with complete snapshots of the structure and the data by checking out commits at specific stages of development. Or there might even be the possibility of cherry-picking commits used to build the sections and fields (although that sounds rather dangerous).

Maybe I've already got what I need with the Dump DB extension as it is. If I drop all the structural tables, then run an import of the data.sql from a different commit, I am able to rewind the database. Then I can truncate the entries data to have a clean install for an ensemble. I was just wondering whether it might be possible to choose either the complete structure or the data with a single click.
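That rewind could be sketched roughly like this. It is purely illustrative, assuming entries tables follow the sym_entries_data_n naming convention; the function only builds the statements rather than running them against MySQL:

```python
import re

# Rough sketch of the "rewind" workflow: drop the structural tables,
# re-import the snapshot from another commit, then optionally truncate
# the entries tables for a clean ensemble install. SOURCE is a mysql
# client command, not SQL proper -- it stands in for the import step.

def rewind_statements(all_tables, truncate_entries=False):
    is_entries = lambda t: re.match(r"sym_entries_data_\d+$", t)
    structural = [t for t in all_tables if not is_entries(t)]
    stmts = [f"DROP TABLE IF EXISTS `{t}`;" for t in structural]
    stmts.append("SOURCE data.sql;")  # import the other commit's dump
    if truncate_entries:
        stmts += [f"TRUNCATE TABLE `{t}`;" for t in all_tables if is_entries(t)]
    return stmts

for s in rewind_statements(["sym_sections", "sym_fields", "sym_entries_data_1"],
                           truncate_entries=True):
    print(s)
```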

Also, the Database Synchroniser saves data to the manifest directory and may include data that I wouldn't want tracked by an ensemble repository. The Dump DB extension allows for a clean separation between sensitive author data and entries data. What it doesn't do is track structural data separately from the entries data. I could modify my fork to export the structure of the entries tables as well, and that would probably satisfy that requirement. But I was also wondering whether it could also be used as a database migration file, to be able to fast-forward a production database.
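The missing separation could, in principle, be done by post-processing a dump. Here is a naive sketch that assumes semicolon-terminated statements with no semicolons inside string values, which a real parser could not assume:

```python
# Split a single SQL dump into structure (CREATE/DROP/ALTER) and
# data (INSERT/REPLACE) so each can be tracked and restored separately.

def split_dump(dump_text):
    structure, data = [], []
    for stmt in dump_text.strip().split(";"):
        stmt = stmt.strip()
        if not stmt:
            continue
        head = stmt.split(None, 1)[0].upper()
        if head in ("CREATE", "DROP", "ALTER"):
            structure.append(stmt + ";")
        elif head in ("INSERT", "REPLACE"):
            data.append(stmt + ";")
    return structure, data

dump = """
DROP TABLE IF EXISTS `sym_entries_data_1`;
CREATE TABLE `sym_entries_data_1` (`id` INT, `value` TEXT);
INSERT INTO `sym_entries_data_1` VALUES (1, 'hello');
"""
structure, data = split_dump(dump)
```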

I was using Database Synchroniser at one point for our agency site, but I removed it because I found Dump DB to be closer to what I was looking for, and it seemed to simplify the process. But with Dump DB, I can only synchronise data from production to development/staging. I cannot safely migrate data from development/staging to production without destroying live data. I guess what I'm really looking for is a diff tool that can build the SQL to create, alter or drop tables based on the differences between two snapshots of the database structure.

Ah, I understand better. Well, Database Synchroniser will let you migrate structure from development -> testing -> live, and straight SQL dumps of the database let you go in reverse before you begin developing again.

But you're right, none of these methods allow you to increment through or roll back through deltas at any point, while choosing to build/destroy data and/or structure. If you solved that problem, I imagine the developers of every framework and CMS would be buying you beers :-D

Hehe. Yeah, I'm definitely biting off more than I can chew here.

I rewrote my fork to include the structure of the entries data tables. This means restoring from this file will destroy data. It seems to have the same effect as truncating the entries data tables, but without resetting the autoincrement value. I'm not sure how worthwhile this might be, other than being able to diff the schema structure of the production and development/staging databases.
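That autoincrement behaviour is consistent with how mysqldump records the counter: the CREATE TABLE statement in a dump carries an AUTO_INCREMENT=n table option, so a DROP + CREATE restore picks up the old counter, whereas TRUNCATE resets it to 1. A toy illustration of emitting such a statement (table, columns and options here are made up):

```python
# Sketch: a dump-style CREATE TABLE that preserves the autoincrement
# counter, the way mysqldump writes it into the table options.

def create_with_counter(table, columns, next_id):
    cols = ", ".join(f"`{c}` {d}" for c, d in columns.items())
    return (f"CREATE TABLE `{table}` ({cols}) "
            f"AUTO_INCREMENT={next_id} DEFAULT CHARSET=utf8;")

print(create_with_counter("sym_entries_data_1",
                          {"id": "INT NOT NULL AUTO_INCREMENT"}, 42))
```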

Symphony • Open Source XSLT CMS