
I'm using the Remote Datasource extension to pull in my Twitter timeline – http://www.twitter.com/statuses/user_timeline/19859892.rss

It only works sporadically, though; half the time the tweets are there, the other half, I get an error code (I've had error code 0 and error code 400).

I'm asking the DS to update the cached result every 30 minutes (long enough not to come close to using up my API calls, I thought). One thing I noticed, though, is that underneath this cached result time section in the datasource settings page, it says, 'Cache has expired or does not exist'. This text appears below any DS I try to create with the Remote DS extension and I'm guessing it isn't the correct behaviour.

I've got a 'cache' folder inside 'manifest', with 777 permissions, and it contains cached items from Dynamic XML datasources (e.g. Flickr) – is this where the Remote DS should be caching its results, and if so, what should I be doing to get it working for me (if, indeed, this is the problem)?

I had a bit of a scrap updating from 2.2.5 to 2.3 and I wonder if there's a folder or file that I should have on my server that hasn't made it to the right place somehow.

I've just had another look at what I had thought were cached results from Dynamic XML datasources, but they seem, instead, to be cached entire pages (I think).

In any event, if I make a Dynamic XML datasource (instead of a Remote DS) for my Twitter timeline, using the same URL as above, it too works initially, but then tends to give me errors (and the result doesn't appear to get cached in the manifest > cache folder).

Any ideas?

Thanks,
D

The cache is also used by the Debug Devkit, so you will have cached results in there too.

Right – that explains what the cached files I'm seeing are; but what's the explanation for the Dynamic/Remote datasources not being cached (edit: or for the error codes I'm seeing – I've got it in my head that these are related, but it wouldn't be the first time I'd been wrong)?

I think we encountered this problem just recently: in certain circumstances (which I can't recall at the moment), when the remote data source times out, the error message you're seeing is shown instead of a timeout notice.

Thanks, Nils – would you recommend bumping up the $dsParamTIMEOUT value, then? If so, what's a reasonable value for this (or is it a question of increasing it until I stop getting errors)?

Am I right in thinking that in the Remote DS settings, I shouldn't always be seeing 'Cache has expired or does not exist' at the bottom of the page? Any ideas on how I can rectify this?

Thanks.

Let me clear something up. The Remote Datasource extension (and the Dynamic XML extension) does not store the resulting cache in the /manifest/cache/ folder. The result is serialised and stored in the database, in the sym_cache table.

However, to do this a temporary file is written to /manifest/tmp/ and then removed once the cache has been created. This file is simply a lock file: it prevents the same cache entry from being written twice if two requests should happen to arrive at exactly the same time.
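The lock-file pattern Brendo describes can be sketched like this (an illustrative Python sketch, not the extension's actual PHP code; the file name and helper are hypothetical):

```python
import os

def write_cache_with_lock(tmp_dir, cache_key, write_cache):
    """Create an exclusive lock file so two simultaneous requests
    can't both rebuild the same cache entry."""
    lock_path = os.path.join(tmp_dir, cache_key + ".lock")
    try:
        # O_CREAT | O_EXCL fails atomically if the lock file already exists
        fd = os.open(lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except FileExistsError:
        return False  # another request is already writing this cache entry
    try:
        write_cache()  # e.g. serialise the result into the cache table
    finally:
        os.close(fd)
        os.remove(lock_path)  # lock file is removed once the cache exists
    return True
```

If /manifest/tmp/ isn't writable, the lock file can never be created, which is why Brendo asks about its permissions below.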

The Debug Devkit does save its cache files in /manifest/cache/, so that explains what those files are doing there.

Am I right in thinking that in the Remote DS settings, I shouldn't always be seeing 'Cache has expired or does not exist' at the bottom of the page? Any ideas on how I can rectify this?

No, this shouldn't appear all the time. Upon saving your datasource, provided there are no dynamic parts in your URL, the URL is fetched and stored in the cache table immediately. What you should see is Cache expires in %d minutes. Clear now?, where %d is the number of minutes until the data that was fetched is considered invalid (how long are you caching your datasources for?).
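That %d is just simple epoch arithmetic, which is also why a skewed server clock matters here. A rough sketch (the actual names in Symphony's code will differ):

```python
import time

def minutes_until_expiry(creation_epoch, cache_minutes, now=None):
    """Return whole minutes until a cached result expires,
    or None if it has already expired (or never existed)."""
    now = time.time() if now is None else now
    expiry = creation_epoch + cache_minutes * 60
    remaining = expiry - now
    if remaining <= 0:
        # the backend would show 'Cache has expired or does not exist'
        return None
    return int(remaining // 60)
```

Note that if the server clock ran an hour fast, a 30-minute cache would be considered expired the moment it was written, which matches the symptom described later in this thread.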

Can you confirm you have a manifest/tmp/ folder, and is it writable?

Something else to consider is that Twitter returns a 400 error code when you are rate limited.
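For reference, the two codes seen in this thread line up roughly like this (my interpretation; code 0 is what cURL-based fetchers typically report when no HTTP response arrived at all):

```python
def describe_fetch_error(status):
    """Rough interpretation of the status codes seen in this thread."""
    if status == 0:
        # no HTTP status at all: DNS failure, connection refused, or a timeout
        return "request never completed (likely a timeout)"
    if status == 400:
        # Twitter's v1 API answered 400 Bad Request when you were rate limited
        return "bad request (Twitter uses this when you are rate limited)"
    return "HTTP %d" % status
```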

That clears a lot up, Brendo – thanks very much.

I've never seen Cache expires in %d minutes. Clear now? – I'm currently attempting to cache my datasources for 30 minutes (which I thought was definitely long enough to avoid Twitter's rate limiting).

I've got a manifest/tmp folder with permissions set to 777, so I don't think this is the problem.

Has the structure of the sym_cache table changed between Symphony 2.2.5 and 2.3? I think I basically copied my old database across when I upgraded – perhaps that needs amending.

I've just checked the epoch time of when the cached result should expire and it looks like it's in the past. Perhaps I should get on to my hosting company about their server clock.

Hmm, that is curious. Does the Dynamic XML Data Source also have this issue?

The Dynamic XML Data Source doesn't say Cache expires in %d minutes. Clear now? or Cache has expired or does not exist, but it does seem to suffer from the same issue, yes.

I think that everything is being put in the cache correctly, but I'm not sure if it's taken out of it correctly. And even on a clean MAMP install, I'm not able to see anything other than Cache has expired or does not exist using a Remote Data Source.

I've asked my hosting company to check the server clock, but I think I might have been too hasty to say that it was incorrect – I think it appeared to be an hour out, but it's probably because my local time is GMT+1 and epoch time is GMT.

I'm stumped.

The Dynamic XML Data Source doesn't say Cache expires in %d minutes. Clear now? or Cache has expired or does not exist, but it does seem to suffer from the same issue, yes.

No, it won't – it doesn't have that feature. I was more interested in whether that datasource works on your frontend, or whether it also returns unpredictable results.

I've asked my hosting company to check the server clock, but I think I might have been too hasty to say that it was incorrect – I think it appeared to be an hour out, but it's probably because my local time is GMT+1 and epoch time is GMT.

You might be onto something here – what happens if you set your cache time to 90 minutes (so it covers the offset)?

I got this from my hosting company: We have checked the time on the database storage server and can confirm that the time is correct and up to date.

But in the meantime, I had the same idea as you and I've upped all the DS cache times to 90 minutes; so far I think they're behaving more predictably. I still see Cache has expired or does not exist, though.

hmmmmmm… about 90 minutes after I changed the cache update time to 90 minutes, they've stopped working again (status code 400).

I think I found the reason (and I think Nils is right).

For some reason (probably worth a Symphony bug report) the value of $dsParamTIMEOUT for new remote datasources is set to 1, instead of 6. So if the request to the remote source needs longer than one second, you get the error message. I still need to check why it's the general error message and not the timeout message.

So you should set the value of $dsParamTIMEOUT to 6 or even higher, and I'll file a bug report about why it isn't set correctly by default.
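The effect of a too-low $dsParamTIMEOUT can be sketched without any network at all (an illustrative Python sketch; the extension's real fetch code is PHP/cURL):

```python
import concurrent.futures

def fetch_with_timeout(fetch, timeout_seconds):
    """Run a fetch callable with a hard timeout, returning (ok, result).
    A timeout here mirrors what happens when $dsParamTIMEOUT is too low."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(fetch)
        try:
            return True, future.result(timeout=timeout_seconds)
        except concurrent.futures.TimeoutError:
            # the remote source took longer than the allowed timeout
            return False, None
```

With a 1-second timeout, any response from Twitter that takes longer than a second lands in the failure branch, which would explain the intermittent behaviour.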

I'd already bumped up $dsParamTIMEOUT to 6, but I've now made it 20. Hopefully that'll help…

…nope – broken again. $dsParamCACHE = 90 and $dsParamTIMEOUT = 20.

Any more ideas?

The 400 error means you are rate limited with Twitter.

Edit So I installed the Remote Datasource extension and attempted to recreate your problem with this datasource. While I did notice a few peculiar things (aka bugs) – such as the default XPath not appearing, the cache time being set to 1 minute by default, and the cache information message not being displayed in the backend – I was able to retrieve the data from the Twitter RSS feed successfully (which is the most important part, right? :P)

Edit 2 A freaking typo is to blame for that information not appearing in the backend.

Edit 3 I've pushed a couple of commits to the Remote Datasource repo, could you try pulling and giving it a whirl?

Amazing, Brendo; that's incredibly generous of you to go to such lengths to troubleshoot my problem.

I've now got Cache expires in %d minutes. Clear now?, which makes me feel good!

I'd been led to believe that a 400 error meant that I was rate limited with Twitter before, but if I understand the caching that's going on correctly (and there's every chance that I don't), then having $dsParamCACHE = 90 should mean that I'm only using one API call every 90 minutes (possibly 30 minutes if the server clock is an hour out) – surely that's not enough for me to hit my limit?

I'll keep an eye on what happens now…

Thanks again,
D

Edit: I'm still getting 400 errors, even if I bump the cache time up to e.g. 600.

Symphony • Open Source XSLT CMS
