Closed #547: field.textarea.php throws errors

I don’t know what caused this, but from one day to the next, symphony/lib/toolkit/fields/field.textarea.php started throwing errors.

When I want to show the page on the frontend (line 163):

trim() expects parameter 1 to be string, array given
return preg_replace('/&(?!(#[0-9]+|#x[0-9a-f]+|amp|lt|gt);)/i', '&', trim($value));

and on the backend when I want to edit the entry (line 27):

strlen() expects parameter 1 to be string, array given
$textarea = Widget::Textarea('fields'.$fieldnamePrefix.'['.$this->get('element_name').']'.$fieldnamePostfix, $this->get('size'), '50', (strlen($data['value']) != 0 ? General::sanitize($data['value']) : NULL));

For some reason, $value is set to an array instead of a string. I’ll look into this some more and try to keep you updated.

Alright, it seems that $data['value'] is usually sent as a string (which is correct), but sometimes it gets sent as an array (in displayPublishPanel()).

Also, it only happens when you’re logged in to Symphony, and it doesn’t happen with all entries. I now have 5 entries and only the last 2 show this behaviour.

Hmm, it seems that my textarea was stored twice in the database with the same entry_id, causing default Symphony behaviour to create an array instead of a string.

The question that arises now: how was it even possible for there to be two rows in a textarea’s entry data table with the same entry_id? This should not be possible!
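For anyone wanting to confirm the same symptom, a query along these lines should show up duplicate rows. This is only a sketch: sym_entries_data_567 is a placeholder for whichever data table backs your textarea field, and Symphony::Database()->fetch() is assumed to return the result set as an array of rows.

// Sketch: list entry_ids that appear more than once in one field's data table.
// 'sym_entries_data_567' is a placeholder - substitute your textarea field's table.
$duplicates = Symphony::Database()->fetch(
    "SELECT `entry_id`, COUNT(*) AS `occurrences`
     FROM `sym_entries_data_567`
     GROUP BY `entry_id`
     HAVING COUNT(*) > 1"
);

foreach ($duplicates as $row) {
    echo $row['entry_id'] . ' appears ' . $row['occurrences'] . " times\n";
}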

Very odd. I’ve had this happen a couple of times in my Symphony lifetime, and have sunk countless hours into trying to figure out just how the hell it happened.

In the end, the solution I used was more of a preventative one: adjusting the entry data tables to have unique indexes on the entry_id field.
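In SQL terms the preventative fix is roughly the statement below. The table and index names are placeholders, and note that the ALTER will fail if duplicate rows already exist, so those have to be cleaned up first.

// Sketch: a unique index on entry_id means a second row for the same entry
// simply cannot be inserted. 'sym_entries_data_567' is a placeholder table.
Symphony::Database()->query(
    "ALTER TABLE `sym_entries_data_567`
     ADD UNIQUE KEY `unique_entry_id` (`entry_id`)"
);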

I could implement this on all the core fields, but it’d be bloody awesome if we could figure out exactly why this occurs.

Are there any clues in the two entries?

How were they added? (Backend/Event)

Do they have any special characters?

How often was the entry updated?

Can you get this to reproduce?

it doesn’t happen with all entries. I now have 5 entries and only the last 2 show this behaviour

What extensions do you have installed? Were the entries created in the backend or via an event?

I too have had this problem, and as Brendan suggests, a unique index on these tables works. The problem I had was that a single row exists in sym_entries, but for each field (e.g. a text input) multiple rows are created for an entry. So for entry 123, sym_entries_data_567 for my field might have the value duplicated into two (or more) rows.

When an entry is saved, the data rows are deleted and then re-added. My guess is that some odd edge condition means that the data rows are not purged when the entry is saved, resulting in duplicates.

This is a real edge case and I have never sussed out how or why it occurs. I’ve only seen it two or three times in as many years, so I never gave it further time or thought.

Well, a colleague of mine and I have been bumping into this problem over the last couple of weeks. On three sites! So my guess is that something triggers it.

As for extensions: nothing particularly special is in use. The fields used are two input fields, a textarea (with CKEditor), and a tag list.

What we found:

  • It seems that the problem only occurs with a textarea
  • Not all entries are affected. Some are and some are not
  • In both cases the problem occurs because the entries are stored more than once in the database. In my case it was stored 2 times, whereas in my colleague’s case it was stored 4 times!

This leads me to believe that, as Nick Dunn suggests, something goes wrong when an entry is deleted and re-added.

@nickdunn: Where exactly in which script does this happen?


We started to have these problems after I changed the MySQL setting that sets the minimum word length for matches in FULLTEXT indexes. The textarea field creates such an index. Could this be the source of the problem?

We had some crashed tables we had to repair as well.

We had some crashed tables we had to repair as well.

Yes, I had this too! When a table crashes this error definitely does occur, and changing the minimum word length in MySQL did start to cause tables to crash. (That would be a MySQL bug and not Symphony.) But the fact that this bug occurs when a table crashes is a clue.

It seems that the problem only occurs with a textarea

When this next occurs, could you look in the data table for every field in your section and see if duplicate rows exist? It could be that other fields simply handle this more elegantly (e.g. the field PHP checks for an array and doesn’t trigger an error) even when these duplicate rows still exist. Do the duplicates get added for every field or just the textarea?
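To sweep every field in the section at once, something along these lines would do it. Again just a sketch: it assumes the default sym_ table prefix, a sym_fields table with a parent_section column, and a hypothetical section id of 1.

// Sketch: check every field's data table in one section for duplicate entry_ids.
// The section id (1) and the 'sym_' prefix are placeholders.
$fields = Symphony::Database()->fetch(
    "SELECT `id`, `label` FROM `sym_fields` WHERE `parent_section` = 1"
);

foreach ($fields as $field) {
    $duplicates = Symphony::Database()->fetch(
        "SELECT `entry_id`
         FROM `sym_entries_data_" . $field['id'] . "`
         GROUP BY `entry_id`
         HAVING COUNT(*) > 1"
    );

    if (!empty($duplicates)) {
        echo $field['label'] . ': ' . count($duplicates) . " entries have duplicate rows\n";
    }
}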

Are you able to replicate this on a completely vanilla install without extensions?

@nickdunn: Where exactly in which script does this happen?

The EntryManager::edit() method iterates over each field in an entry, tries to delete its data, then inserts the new data. If it can’t delete the old data, it fails silently. Perhaps you could modify this to dump out the conditions when an exception is triggered. If the SQL triggers an error then something is wrong.

try{
    // Remove any existing data rows for this field before the new data is inserted.
    Symphony::Database()->delete('sym_entries_data_' . $field_id, " `entry_id` = '".$entry->get('id')."'");
}
catch(Exception $e){
    // Discard? The exception is swallowed, so a failed delete goes unnoticed.
}

When this next occurs, could you look in the data table for every field in your section and see if duplicate rows exist?

This was one of the first things I checked, and only the textarea created duplicate entries.

I think that failing silently is never good practice. Perhaps throwing a mysql_error() is a better option?

Given that it is essential that the data is deleted before it is re-added, it seems that failing silently is not very smart in this case.
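Something as simple as the following would at least surface the failure. This is only a sketch: it sticks to plain error_log() and a re-throw rather than Symphony’s own logging, and it reuses the delete call shown above.

try {
    Symphony::Database()->delete('sym_entries_data_' . $field_id, " `entry_id` = '".$entry->get('id')."'");
}
catch (Exception $e) {
    // Record the failed delete and stop, rather than inserting on top of stale rows.
    error_log('Could not purge sym_entries_data_' . $field_id . ' for entry ' . $entry->get('id') . ': ' . $e->getMessage());
    throw $e;
}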

With the textarea field that was failing, what position in the section does it hold?

I’m just trying to see whether, if the table crashes, the fields after the crashed table can still be deleted, or whether everything just halts. If it halts (and your textarea is, say, the 2nd field, i.e. Field B), is the data in Fields C & D being duplicated as well?

If the table is crashing, I’m wondering whether MySQL actually gives an error back, or whether it just times out.

The hard thing here is that if something goes wrong, we should be able to pull back any data that has already been removed, so transactions would really help here. Then again, if MySQL is crashing, transactions are probably useless.

What version of MySQL and PHP are you running?

I’ve committed this, which adds a UNIQUE KEY constraint to all the core fields except Taglist and Select Box.

There is some updater logic that will go through the database tables for all the core fields, and drop and recreate the correct index.
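The general shape of that updater logic would be something like the sketch below. This is not the committed code: the list of field ids is a placeholder (the real updater derives them from the fields table), the data tables are assumed to have their usual auto-increment id column, and the existing plain index is assumed to be named entry_id.

// Sketch of updater-style logic for each core field's data table:
// remove surplus duplicate rows first (a UNIQUE key cannot be added while
// duplicates remain), then swap the plain index for a UNIQUE one.
$field_ids = array(567, 568); // placeholder ids

foreach ($field_ids as $field_id) {
    $table = 'sym_entries_data_' . $field_id;

    // Keep only the lowest-id row per entry_id.
    Symphony::Database()->query(
        "DELETE `a` FROM `" . $table . "` AS `a`
         INNER JOIN `" . $table . "` AS `b`
         ON `a`.`entry_id` = `b`.`entry_id` AND `a`.`id` > `b`.`id`"
    );

    // Swap the old plain index for a UNIQUE one.
    Symphony::Database()->query(
        "ALTER TABLE `" . $table . "`
         DROP INDEX `entry_id`,
         ADD UNIQUE KEY `entry_id` (`entry_id`)"
    );
}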

I’ve had a brief search on the MySQL bug tracker and found a few errors about tables crashing after modifying FULLTEXT and then trying to update/select the field, but they seemed to affect older versions (early 5.1.x).

Looking at EntryManager::edit further, I just wanted to think this out loud.

Should the try block just be extended to cover all of the code in the foreach? The logic I’m seeing here is: if you can’t delete the existing data for the field, then don’t try to add it again, as that’s just going to cause duplicates (well, not anymore, but it could still happen on Select Box/Taglist fields if the problem isn’t isolated to the FULLTEXT index).
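As a rough sketch (not the actual EntryManager code, and the exact shapes of Entry::getData() and Symphony::Database()->insert() are assumed here), the extended block might look something like this:

foreach ($entry->getData() as $field_id => $data) {
    try {
        // If the delete fails we skip the insert for this field entirely,
        // so stale rows are never doubled up.
        Symphony::Database()->delete('sym_entries_data_' . $field_id, " `entry_id` = '".$entry->get('id')."'");

        $data['entry_id'] = $entry->get('id');
        Symphony::Database()->insert($data, 'sym_entries_data_' . $field_id);
    }
    catch (Exception $e) {
        // Ignore, log, or crash out - that is the open question below.
    }
}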

The thing then is, do you just ignore the error, or do you crash out to the user?

Depending on which field errors (i.e. field 3 of 6), the result would be a hybrid entry, with the first 2 fields being the ‘updated’ data and fields 3-6 being the ‘old’ data.

Ignoring the error will isolate the bug to a single field, which will just appear not to have updated, but the user won’t know any better. An error could be added to the Log, but again, the user wouldn’t check there; only a developer would, and only if the user brought the issue to their attention, which probably wouldn’t happen unless it crashed out to an exception.

Perhaps the entire Edit logic needs to be rejigged so that no new data is inserted before all the old data is deleted? This seems risky though, because in the above scenario you would lose all the data from fields 1-2 and keep the old data from fields 3-6.

Bah. I don’t know. It’s such a bloody obscure error; do we just stick with the DUPLICATE KEY preventative measure, at least for the purposes of 2.2?

I think the try block should indeed be extended to actually catch something instead of failing silently. Some error handling should be added: is $field_id empty? Or $entry->get('id')? Just throw that error and stop the script. It might have your client giving you a call, but it prevents inserting duplicate data. And if we can see what error gets thrown, it gives us more insight into where it goes wrong.

That’s not the case; if those values were empty you wouldn’t get duplicate data, because the entry couldn’t be saved :)

There is no try around the insertion of data, so you would get an error in that case anyway.

Is there a solution for this problem? I have updated to 2.1.1 but it still happens.

OK, I found which entries table had my textareas and deleted the 2 rows which had duplicate entry_ids.

Hopefully this won't happen anymore!

I had this problem come up the other day when I rebuilt my server. I upgraded PHP from 5.2.9 to 5.3.3. I couldn't figure out what was going on and then I found this comment in the PHP documentation - the comment dated 8 June 2010 from 'basil'.

Not sure if this is related, but I rolled back to 5.2.9 and all seemed fine again.

http://php.net/manual/en/function.strlen.php

... in 5.2 strlen will automatically cast anything passed to it as a string, and casting an array to a string yields the string "Array". In 5.3, this changed, as noted in the following point in the backward incompatible changes in 5.3 (http://www.php.net/manual/en/migration53.incompatible.php):
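In other words, the same call behaves differently across versions. A minimal illustration:

$value = array('line one', 'line two');

// PHP 5.2: the array is cast to the string "Array", so strlen() returns 5.
// PHP 5.3: strlen() rejects the array, emits
//   "strlen() expects parameter 1 to be string, array given"
// and returns NULL - the exact warning seen in field.textarea.php.
var_dump(strlen($value));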

Ah ha! I think that's definitely got something to do with it!

If I look back, all the times I've had this happen have been on our production server (which is PHP 5.2), while our development server is PHP 5.3.

I'll have a look through the code at the strlen calls.

Thanks @veryphatic!

Had a look over the code; I can't seem to see anything with strlen that would affect the textarea and be consistent with the rest of the comments in this issue.

The UNIQUE KEY constraints on the fields will, IMO, prevent this issue from happening again. It won't be possible to get an array returned for an entry, as there won't be two rows in the database with the same entry_id.

@veryphatic What version of Symphony were you running? Can you reproduce it every time? If you can, could you provide an ensemble and steps to reproduce?

Going with the assumption that this cannot occur anymore, because the UNIQUE KEY index will prevent two records from existing in the database.

Fields such as Taglist and Select Box are already written to support multiple rows, so there wouldn't be an error if somehow duplicates do occur.

The error does seem to be related to FULLTEXT keys, as we haven't seen an instance of this issue with any field that doesn't use them.

We also haven't heard anything since the introduction of UNIQUE KEY, so closing this issue :)

This issue is closed.
