
I wanted to throw this question out there to get some opinions and pointers. I built an ad management tool using Symphony for the company I work for. We can upload ads, set their URLs, group them, and grab the needed embed code for them. The tool also tracks the clicks and views for each ad, which is where my question comes in...

Our site is fairly busy, and with a handful of ads on the home page we are looking at hundreds (sometimes 1000+) of inserts every couple of minutes. We are in our slow season right now, and I'm concerned about scalability moving forward. I currently use an event that auto-triggers to add entries to either the Clicks or Views sections respectively. Is there a logical method I could use to avoid these constant small additions and instead do a bulk import each night?

Anyway, just a thought I've had. I understand it may be very hard to give me any help of value, but I figured it couldn't hurt to see what some of the geniuses around here think about this.

Thanks!

Hmmm, you could do a custom event to store these in a file and then batch import them, but what would the gain be?
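For reference, the buffering itself would only take a few lines. Here's a minimal sketch of the file-buffer idea - the log path and field layout are made up, and this isn't Symphony's actual event API, just the general shape of it:

```php
<?php
// Instead of an INSERT per click, append one line per hit to a flat log
// file and let a nightly job import the whole file in one go.
// Path and field layout here are hypothetical.

function log_ad_hit($type, $ad_id)
{
    $line = sprintf("%s\t%s\t%d\n", date('c'), $type, $ad_id);

    // FILE_APPEND + LOCK_EX makes concurrent appends safe; each request
    // blocks only for the microseconds the write takes, not for a DB insert.
    file_put_contents('/var/log/ads/hits.log', $line, FILE_APPEND | LOCK_EX);
}

// e.g. called from the click-tracking endpoint:
log_ad_hit('click', 42);
```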

I don't know your site structure exactly, but the DB tables should be InnoDB; from my knowledge it's far better suited to handling writes and inserts. Most likely, if you used a default installation, you have MyISAM tables (which lock the whole table on every insert).
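As a quick sanity check you can ask MySQL which engine your tracking tables actually use. A sketch with plain mysqli - the credentials and table-name pattern are placeholders, adjust to your install:

```php
<?php
// List the storage engine for each matching table.
$db = new mysqli('localhost', 'user', 'pass', 'symphony');

$result = $db->query("SHOW TABLE STATUS WHERE Name LIKE 'sym_entries_data_%'");
while ($row = $result->fetch_assoc()) {
    // Engine will typically read 'MyISAM' on a default install of that era.
    printf("%s => %s\n", $row['Name'], $row['Engine']);
}
```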

@gunglien, have you used InnoDB tables with Symphony? No issues?

Ok, interesting. I'm no MySQL expert, so it's interesting to see that such a large difference can depend on the storage engine.

I will maybe test with InnoDB on a local install. I had considered the file approach, maybe a CSV with a daily cron import, but wasn't sure if that would really help, as I'm not familiar with file write performance and concurrency limitations in this situation.
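For what it's worth, the nightly import side could look something like this - a rough sketch, assuming the tab-separated log from earlier; the table and column names (ad_hits, logged_at, hit_type, ad_id) are hypothetical and error handling is kept minimal:

```php
<?php
// Nightly cron job: read the day's log, insert everything as a handful of
// multi-row INSERTs, then clean up the imported file.
$db  = new mysqli('localhost', 'user', 'pass', 'symphony');
$log = '/var/log/ads/hits.log';

// Rotate first, so hits arriving during the import go to a fresh file.
rename($log, $log . '.importing');

$rows = array();
foreach (file($log . '.importing', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
    list($ts, $type, $ad_id) = explode("\t", $line);
    $rows[] = sprintf("('%s','%s',%d)",
        $db->real_escape_string($ts),
        $db->real_escape_string($type),
        (int) $ad_id);
}

// One statement per 1000 rows is far cheaper than 1000 single inserts.
foreach (array_chunk($rows, 1000) as $chunk) {
    $db->query('INSERT INTO ad_hits (logged_at, hit_type, ad_id) VALUES '
        . implode(',', $chunk));
}

unlink($log . '.importing');
```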

Thanks for giving it some thought, I appreciate it.

Yes, I have used InnoDB tables without issues. I think it might suffer in some versions with text search, but that's only when using full-text retrieval, and the latest MySQL versions support full-text search on InnoDB, so that wouldn't be a problem.

I was required to use it on a modified Symphony cache system (DS caching in the DB) for a production site (3 web servers, 1 DB server). It had this particular table which grew upwards of 1GB, so I had to use InnoDB so the table wouldn't lock up whenever someone else was saving an entry - which otherwise meant locking out all users visiting the site.

@adam, if the additions are small I doubt this is ever going to be a problem, really. If you've ever used the dump_db extension, you'll find that it runs thousands of inserts in less than a second (at least it does on my small-ish VPS).

Also, Symphony itself will insert session information into the database for every visitor(!), so if your website can handle that, you should be in the clear.

InnoDB is a good thing to use here, though, as it supports row-level locking. If you are worried about the effects InnoDB can have on the rest of your installation, you can always choose to use InnoDB for just the tables that will receive the data. If you go this route, be careful not to rely on engine-specific InnoDB or MyISAM features when you join across the two table types, though.
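Converting just those tables is a one-off ALTER. A sketch, again with hypothetical table names (ad_clicks, ad_views) - note that ALTER TABLE rebuilds the table, so run it in a quiet window:

```php
<?php
// Convert only the hit-tracking tables; the rest of the install stays as-is.
$db = new mysqli('localhost', 'user', 'pass', 'symphony');

$db->query('ALTER TABLE ad_clicks ENGINE=InnoDB');
$db->query('ALTER TABLE ad_views ENGINE=InnoDB');

// A bonus once these are InnoDB: writes can run inside a transaction.
// Beware that only InnoDB tables take part - a MyISAM table written inside
// the same transaction is committed immediately and cannot roll back.
$db->query('START TRANSACTION');
$db->query('INSERT INTO ad_clicks (ad_id, logged_at) VALUES (42, NOW())');
$db->query('COMMIT');
```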
