I am building a Facebook-style newsfeed. It is assembled from many SQL tables, and each data type has its own layout. It's becoming very heavy to load, and I'd like to make it scale better...

This is what I actually do now:

User model:

  def updates(more_options = {})
    (games_around({}, more_options) +
     friends_statuses({}, more_options).sort { |a, b| b.updated_at <=> a.updated_at }.slice(0, 35) +
     friends_stats({ :limit => 10 }, more_options) +
     friends_badges({ :limit => 3 }, more_options)
    ).sort { |a, b| b.updated_at <=> a.updated_at }
  end

Example for that Badges data:

  def friends_badges(options = { :limit => 3 }, more_options = {})
    # Use merge rather than merge! so the default options hash isn't
    # mutated across calls; find(:all) already returns a flat array,
    # so the intermediate `rewards` variable and flatten were unneeded.
    Reward.find(:all, options.merge(
      :conditions => ["rewards.user_id IN (?)", self.players_around({}, more_options).collect { |p| p.id }],
      :joins      => [:user, :badge],
      :order      => "rewards.created_at DESC"))
  end

Newsfeed View:

<% @current_user.updates.each do |update| %>
    <% if update.is_a?(Status) %>
        <%= render :partial => "users/statuses/status_line", :locals => { :status => update } %>
    <% elsif update.is_a?(Game) %>
        <%= render :partial => "games/game_newsfeed_line", :locals => { :game => update } %>
    <% elsif update.is_a?(Stat) %>
        <%= render :partial => "stats/stat_newsfeed_line", :locals => { :stat => update } %>
    <% elsif update.is_a?(Reward) %>
        <%= render :partial => "badges/badge_newsfeed_line", :locals => { :reward => update } %>
    <% end %>
<% end %>

The options I have considered:

  • Create a "Feed" table and precompute most of the updates for each user with a background job, probably an hourly cron. I'd store the full HTML for each update.
  • Keep the current structure but cache each update individually (right now I have no caching at all).
  • Switch to MongoDB for faster database access.

I must admit I am no expert. Rails made the first steps easy, but at more than 150 SQL queries per page load I feel it's getting out of hand, and I would really value an expert's perspective...

What would you do?

Thanks for your precious help,

Answer

Your code doesn't tell me a great deal; I think it would be helpful if you could lay out your data structure in plain JSON / SQL.

Anyway, I'd serialize each user's stream to MongoDB. I would not store HTML in the database for a number of reasons (at least not at that layer of the application); instead, you should save the relevant data in a (possibly polymorphic) collection. Fetching the newsfeed is then very easy, indexing is simple, etc. The view structure would essentially not change. If you later want to change the HTML, that's simple as well.
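
For illustration, here's a minimal sketch of what such a polymorphic collection could look like with the Mongoid ODM. The FeedItem name and its fields are my assumptions, not something from your post:

  # Hypothetical polymorphic feed collection (Mongoid ODM).
  class FeedItem
    include Mongoid::Document

    field :user_id,      :type => Integer  # owner of this feed entry
    field :subject_type, :type => String   # "Status", "Game", "Stat", "Reward"
    field :subject_id,   :type => Integer  # id of the original SQL row
    field :data,         :type => Hash     # denormalized attributes needed for rendering
    field :created_at,   :type => Time
  end

  # With a compound index on [user_id, created_at], fetching the feed
  # becomes a single query:
  FeedItem.where(:user_id => current_user.id).desc(:created_at).limit(35)

The view would then branch on subject_type instead of on the ActiveRecord class, but would otherwise stay as it is.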

However, this will duplicate a lot of data. If people can have lots of followers, this may become a problem. Using arrays of user ids instead of a single user id can help (if the data is the same for all followers), but it's limited too.
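
The array variant could look like this (again just a sketch with made-up names): one document per event, listing every recipient, so nothing is duplicated per follower:

  class FeedEvent
    include Mongoid::Document

    field :recipient_ids, :type => Array, :default => []  # all users who should see this
    field :subject_type,  :type => String
    field :data,          :type => Hash
    field :created_at,    :type => Time
  end

  # MongoDB matches a scalar against array fields directly, so this
  # finds every event addressed to the user:
  FeedEvent.where(:recipient_ids => current_user.id).desc(:created_at).limit(35)

The trade-off is that recipient_ids itself can grow huge for popular users, which is the limit I mentioned.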

For large association problems, caching is the only answer. The way I understand it, the magic at both Twitter and Facebook is that they don't hit the DB very often and keep a lot of data in RAM. But if you're associating billions of items, doing even that is challenging.
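
In a Rails app, the simplest way to keep hot feed data in RAM is a fetch-through cache such as memcached behind Rails.cache; a minimal sketch (the method name and TTL are arbitrary):

  # In the User model: serve the feed from the cache when possible and
  # rebuild it on a miss. ActiveRecord objects are marshalled into the store.
  def cached_updates
    Rails.cache.fetch("feed/#{id}", :expires_in => 5.minutes) do
      updates.first(35)
    end
  end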

The updates should be written continuously rather than hourly. Suppose you have a lot of traffic and the hourly update takes 30 minutes; the worst case is then a 90-minute delay. If you process changes just-in-time, you can probably cut that to around 5 minutes.
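
A sketch of just-in-time processing, reusing the hypothetical FeedItem from above: fan the event out in an after_create callback the moment it happens (in production you'd move this into a background job):

  class Reward < ActiveRecord::Base
    belongs_to :user
    belongs_to :badge

    after_create :push_to_feeds

    private

    # Write one feed document per follower right away instead of
    # waiting for an hourly batch. `followers` is an assumed association.
    def push_to_feeds
      user.followers.each do |follower|
        FeedItem.create!(
          :user_id      => follower.id,
          :subject_type => "Reward",
          :subject_id   => id,
          :data         => { :badge_name => badge.name },
          :created_at   => created_at)
      end
    end
  end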

You will have to make assumptions at some point and use caching plus some heuristics. Some examples (a small sketch follows the list):

  • The more recent a tweet, the more traffic it will see. It has a higher chance of being retweeted and is viewed far more often. Keep it in RAM.
  • Your Facebook timeline overview page from 1991 is probably not going to change daily, so it's a candidate for long-term output caching.
  • Recent Facebook activity is likely to see lots of writes. Output caching won't help much here; again, the item should be kept in RAM.
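
To make those heuristics concrete, here is a toy helper that picks a cache lifetime from an item's age (all thresholds invented):

  # Old items change rarely and can be output-cached for a long time;
  # fresh items see many writes, so output caching barely helps them.
  def cache_ttl(item)
    age = Time.now - item.created_at
    if age > 1.year
      1.week     # effectively static history
    elsif age > 1.day
      1.hour
    else
      1.minute   # hot item: keep it in RAM instead
    end
  end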