Hampton Lintorn Catlin

Attachment_fu Gotchas

I’ve been struggling a bit with some problems uploading files with attachment_fu recently, so I’m going to go through the common problems I’ve hit. Attachment_fu is the best thing I’ve seen out there for upload handling, and hopefully mentioning these gotchas will save some people some time.

1) Mongrel’s Dying with S3

This is really a problem with the AWS-S3 library. The problem is that the library uses persistent connections by default, and the last thing that Mongrel wants to deal with is keeping a persistent connection open to an outside service while it’s trying to serve unrelated web requests. My current theory is that the multi-threading + Mutex means that the persistent connection is ignored and times out, then when the app comes back to try and say hi to the connection, it just waits forever. This is totally unproven, but it’s the theory I’ve got in my brain.

Luckily, this is easy to fix! Just add this to your config/amazon_s3.yml

development:
  bucket_name: mybucket_development
  access_key_id: 1RX1190JQBAV
  secret_access_key: RN2nBEFhYu8k5S3kVXtM
  persistent: false

Obviously, do that for all three environments. Dead mongrels, no more! But we aren’t done with S3….
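For completeness, spelling out all three environments in config/amazon_s3.yml might look like this (bucket names and keys here are placeholders, not real credentials):

```yaml
development:
  bucket_name: mybucket_development
  access_key_id: YOUR_ACCESS_KEY
  secret_access_key: YOUR_SECRET_KEY
  persistent: false

test:
  bucket_name: mybucket_test
  access_key_id: YOUR_ACCESS_KEY
  secret_access_key: YOUR_SECRET_KEY
  persistent: false

production:
  bucket_name: mybucket_production
  access_key_id: YOUR_ACCESS_KEY
  secret_access_key: YOUR_SECRET_KEY
  persistent: false
```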

2) EofError with S3

This generally happens when you have thumbnails. Why does it happen? Because it takes too long for Rails to have multiple POST conversations with S3, and something weird happens with the connections. Yes, even if you turn off persistent connections. This is usually ok for single-file uploads to S3… aka, no thumbnails. It is really unfortunate that we can’t use Thread.new in Rails with any consistency, because doing the uploads to S3 after the request is finished would be really fantastic and would nip this problem in the bud.

The only known solution is to switch to file_system storage. That, or use Merb, which is thread-safe.
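One partial workaround, which I haven’t tested, is to keep the full-size files on S3 but write thumbnails to the file system, so only one POST conversation happens per request. A sketch using attachment_fu’s :thumbnail_class option (the class name and path_prefix are my assumptions, not from this post):

```ruby
# Untested sketch: attachment_fu model DSL, not runnable outside a Rails app.
class Attachment < ActiveRecord::Base
  has_attachment :storage         => :s3,
                 :thumbnails      => { :small => '100x100>' },
                 :thumbnail_class => 'LocalThumbnail'  # assumed class name
end

# Thumbnails live locally, sidestepping the multi-POST problem above.
class LocalThumbnail < ActiveRecord::Base
  has_attachment :storage     => :file_system,
                 :path_prefix => 'public/thumbnails'   # assumed location
end
```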

3) Good Attachments Validate as not having a Size

I’m still a bit unclear about what is going on with this one. It seems to be inconsistent and mostly has to do with whether the upload object given by the OS is a StringIO or a Tempfile. The way to tell this one apart from #4 is that when you use the debugger, the #size attribute isn’t set at all. You’ll get “Size is not included in the list” when you call #save!.

Add this line to your attachment_fu.rb file in the plugin…

def uploaded_data=(file_data)
  return nil if file_data.nil? || file_data.size == 0
  self.size         = file_data.size # <----- THIS LINE
  self.content_type = file_data.content_type
  self.filename     = file_data.original_filename if respond_to?(:filename)
  if file_data.is_a?(StringIO)
    file_data.rewind
    self.temp_data = file_data.read
  else
    self.temp_path = file_data.path
  end
end

That should solve that!
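As a sanity check on why that one-liner is safe, here is a stand-alone sketch (plain Ruby, no Rails) showing that both kinds of upload object respond to #size, so copying it onto the model works in either case:

```ruby
require 'stringio'
require 'tempfile'

# Rails hands you small uploads as a StringIO and larger ones as a
# Tempfile; both respond to #size, so `self.size = file_data.size`
# works no matter which one you got.
small_upload = StringIO.new("tiny file contents")

large_upload = Tempfile.new("upload")
large_upload.write("x" * 50_000)
large_upload.flush

puts small_upload.size  # 18
puts large_upload.size  # 50000
```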

4) Larger files FAIL!

Ok, this one is a gotcha and not at all a bug. In fact, I’m kind of embarrassed that I ever had this problem. Let’s say you have a general file upload area. Not just images… you want attachments put onto something on your site. And let’s also assume that this is a protected section. So, you build your attachment_fu things to look like this.

has_attachment :storage => :file_system

And that’s all you do because you don’t care about type, size, etc. GOTCHA! You may not have specified a max_size, but attachment_fu did! It sets the default max_size to 1.megabyte. DOH! So the error you get is the same one as above: “Size is not included in the list”. That error message could be a touch more useful, IMHO. However, this is also an easy fix…

has_attachment :storage  => :file_system,
               :max_size => 100.megabytes  # Since this is admin, and we don't care

So, there you go. Hopefully these helped some of you out there who have been using this non-released software. I guess that’s what we get for living on the dangerous side!


Comments

Nov 26, 2007
Sean said...
There are some things about the plugin that could deserve to be called "voodoo" rather than "fu". Thanks for pointing out these problems.
Nov 29, 2007
Andy Stewart said...
You could perhaps work around (2) by storing thumbnails on the file system while you store the full-size images at S3. Just specify a different class to use for thumbnails and set its storage to file system. (I must confess I haven't tested this.) It's not ideal but it might suit some circumstances.
Feb 3, 2008
Rohit said...
Have you ever seen attachment_fu stop posting to S3 altogether? Everything was working great for awhile, but now for some reason, files are no longer being uploaded to my bucket, although a record is being added in my DB. And my bucket logs show no POST activity. I tried your :persistent => false fix, but it didn't seem to help me. In any case, thanks for the post.
Feb 20, 2008
Neal said...
Re: issue 2, I am actually suspecting the ruby S3 gem as being the source of the threading issues. I am using Ramaze as my framework, so I AM able to spin off a thread to run the thumbnail generate-and-shove-into-S3 bits, and still, whenever I get a new upload that occurs while that old thread is running -- blammo: EOF error in S3. So the ability to use Thread.new is not the answer to your prayers.
Apr 11, 2008
Musa said...
i've made the changes on my small application, waiting to see the effects! Thanks!
Jan 3, 2009
JC said...
Cheers! I was really getting upset with attachment_fu; thanks to your info I solved the problems. Have a good day. JC
May 1, 2009
DK said...
Hello, thanks for the post. I've tried the solution for problem 3, but it doesn't work... do I need to recompile the code or do something else? Please let me know! Thanks.