
Using git, I thought it would be good practice to add the jQuery libraries I use as submodules rather than just cloning them, so I can integrate changes made to them over time.

However, the workspace folder is gitignored by default...

What is your workflow for keeping external libraries up to date? Do you create extensions for each one? (If so, maybe a dummy extension to add JS to the frontend might be useful.) Or do you keep them in another directory altogether?
When selected files are copied from a submodule library directory to workspace/js, will they be able to track back to their originating directory, and then to the remote repository, when updating?...
Do you keep the entire library, including documentation, and somehow filter out the docs and examples on deploying?

To be honest, I never really update JavaScript libraries on existing projects, so I don't have a workflow (if I need a later version of a library I'm usually rebuilding everything anyway). On the rare occasion that I need to, it's quicker to download the file and commit it individually. I'd be interested to see how others go about doing this, though.

For now I am putting all fully documented libraries as submodules in a library directory inside workspace. I then manually copy and commit the parts I need. I like the idea that my local development version contains all its building blocks (plus an origin reference and some basic docs/boilerplates), rather than having to keep them on my computer and write a 'used libraries' note for each project.
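Concretely, that workflow looks roughly like this (the URL and file paths are just illustrative):

```shell
# Add jQuery as a submodule under the library directory inside workspace:
git submodule add https://github.com/jquery/jquery.git workspace/library/jquery

# On a fresh clone of the project, pull the submodule contents in:
git submodule update --init

# Manually copy just the file(s) needed into the public js directory,
# then commit that copy in the parent repo:
cp workspace/library/jquery/jquery.min.js workspace/js/
git add workspace/js/jquery.min.js
git commit -m "Add jquery.min.js from library submodule"
```

The parent repo only records which commit of the submodule you're on, so the copied file and its origin stay loosely linked through the library directory.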

But I guess the ideal would be to use Ant, like HTML5 Boilerplate does, to copy and optimize all the JS and CSS from its development directory to its live directory before deploying?


Beyond the physical paper notes and mockups, where do we archive our inspirations, wireframes, design mockups in Keynote, ...? On our computer, in a mirrored directory of our projects, independent of the repos? Because frameworks sometimes also contain vector files for mockups inside their repo, I like the all-integrated idea.

Then again, of course, many code snippets and libraries are re-used across different projects, so central storage does have its advantages; can we have the best of both worlds?

Or maybe even our inspiration (Dribbble, Pattern Tap, Ember), wireframes (...), mockups (...), snippets (Snipplr) and even tryouts (JSFiddle) all reside in the cloud nowadays...?

Well, the static file extension and this thread sure give a new spin to this question....

Does anyone use Braid to ease the workflow of your vendor branch management? A vendor branch is where you copy a vendor's code (plugins, gems, etc.) inside your own repository/project.

Found it via Stack Overflow. Using Braid is as simple as: braid add

I agree with Nick. Just update the individual file. In general I'd have to say that I think submodules are more trouble than they're worth. If you don't have control over the originating repository then I'm not sure they're a good idea.

Are you sure, Makenosound? That's why Piston and Braid have sprung up: to make it dead easy, and worth it...

Each to their own, but I've found that with my Symphony projects, using submodules has been a pain in the butt. They make it hard to monkey-patch, and if the project moves (like, say, a whole bunch of Alistair's extensions) then you have to find and re-link the source. In general I've found it simpler to include the extension source in my repository and update as needed. I don't find it any harder to pull down and override the folder manually, and it's all in .git anyway so I won't lose anything.

In general I've found it simpler to include the extension source in my repository and update as needed

Ditto. Once the site is live I don't trust submodules being tied to the original. What if the third party developer updates their extension and breaks compatibility with my Symphony version? That's no fault of his, so I'd rather be in control over the repo that is linked to.

I've never been a fan of submodules; there is a lot of work that needs doing on the command to get it right, like an rm option, for instance. That being said, though, they do have their merits...

...I don't trust submodules being tied to the original.

  1. They will never update themselves until you do it manually, so you can stick with a commit that works fine and it will stay that way.
  2. You can change branches easily, check the latest commits, and roll back very easily, instead of manually grabbing and re-uploading code via FTP.
  3. You could always fork them before adding them; that way you have an added layer between you and other developers' work. Also, you can then edit easily if needed.
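For example, pinning (or rolling back) a submodule is just committing a different pointer in the parent repo; the path and tag name here are made up:

```shell
# Inside the submodule, check out whatever you want to pin to:
cd extensions/some_extension      # hypothetical path
git fetch
git checkout v1.2                 # a tag, a branch, or an exact commit hash

# Back in the parent repo, record that exact commit:
cd ../..
git add extensions/some_extension
git commit -m "Pin some_extension at v1.2"
```

Rolling back is the same dance with an older commit; until you commit a new pointer, the submodule stays exactly where you pinned it.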

The main annoyance though is the adding of submodules when you have a big list to get through... I have a nifty solution for that though :) And maybe one for rming submodules too ;D
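As it happens, newer versions of git have made the removal side less painful too; a sketch (path illustrative):

```shell
# Newer git (roughly 1.8.3 onwards) can drop a submodule with plain git rm;
# it removes the working tree and the .gitmodules entry in one go:
git rm workspace/library/jquery
git commit -m "Remove jquery submodule"

# The cached clone lingers under .git/modules; delete it for a full cleanup:
rm -rf .git/modules/workspace/library/jquery
```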

The main annoyance though is the adding of submodules when you have a big list to get through... I have a nifty solution for that though :) And maybe one for rming submodules too ;D

Now .. that sounds simply gorgeous ... ;;)

I asked on Stack Overflow for a quick bash script, for an article I've been writing for an age now...
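For the curious, a batch-add loop of that sort might look something like this (not the actual script from the article; submodules.txt is a made-up file name):

```shell
#!/bin/bash
# Read "url path" pairs from a plain-text list and add each one as a
# submodule. submodules.txt is hypothetical, one pair per line, e.g.:
#   https://github.com/jquery/jquery.git workspace/library/jquery
while read -r url path; do
  [ -z "$url" ] && continue   # skip blank lines
  git submodule add "$url" "$path"
done < submodules.txt
git commit -m "Add vendor submodules"
```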

It's available here if anyone would like to read it. Also, any comments and suggestions are warmly welcomed...

First document only, the second one is old...

@designermonkey

That article surely rocks! Could you advise on what steps should be altered for this scenario:

Rather than having my repo hosted externally (e.g. GitHub), I'd prefer to have it locally on my Mac for now (part of my backup workflow, and truly private). With the export Symphony extension (or SiteSucker) I will be creating a static website that I want to push to a public GitHub repo (a static site has no more info than its source code, and thus no sensitive info), and then have it appear on a Google App Engine domain using DryDrop. So the deployment would only be a push to GitHub.

If I understand you right, you're only going to have to use one repo for the Symphony build, with one remote being the Symphony remote. Basically, you don't need to follow any of the steps for the mywebsite remote repo setup.

Then you're ready to create the static site: just create a folder for it, run git init in that folder to create the repo for your static version, and add the remote for your GitHub repo.

Export your static site into the new repo and then do the usual git stuff, pushing the repo to your remote.
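The steps above, sketched as shell commands (the folder name and remote URL are placeholders):

```shell
# Create the folder and the repo for the static version:
mkdir mysite-static && cd mysite-static
git init

# Add the remote for your GitHub repo (placeholder URL):
git remote add origin git@github.com:you/mysite-static.git

# ...export your static site into this folder, then the usual git stuff:
git add .
git commit -m "Static export"
git push -u origin master
```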

All in all, you're going to have two working repos: one for the Symphony build, that is linked to the Symphony remote, and one for the static site, that is linked to the GitHub remote.

Have I got it right?

You got it right. It might even be simpler not to push to GitHub, and just do a wget to create the static site and then an appcfg.py update . to refresh my Google App Engine site. Maybe put those two commands in a bash script and wrap that in an Automator script and a Mac OS X app. Somehow I would like a 'one-click' publish.
I don't think I can fire a terminal script from a Symphony event, though?

All in all, for this scenario (being a sole developer, and just using git to facilitate the adding of extensions and a few jQuery libraries, not to deploy), would you think using git is even worth the hassle? I could also just use the builder ensemble... My local install can function as a testbed, and GAE as live; it even offers basic version control.

I guess you could run a terminal script from an event; I've not done that before.

I too am a sole developer from time to time, but still use git. IMO it's best practice to version control your code. I have an awful memory...
