Torsten Bühl
Software Engineer & Founder

Hi, I'm Torsten. I am the founder of Foodlane, Company Signal, and Exceptiontrap (acquired by Scout APM). I strive to build simple and beautiful software products that people love to use. Learn more about them here.

Show and stream multiple production remote logs with Capistrano


Sometimes you want to see what’s happening in your production log files right now. You can use log management tools like Loggly or Papertrail for this, but there is also an easy way to stream your log files with Capistrano.

I found a basic implementation here and improved it a bit.

# config/recipes/logs.rb
namespace :logs do
  desc "tail and stream production log files"
  task :tail, roles: :app do
    file = fetch(:file, 'production') # uses 'production' as default
    trap("INT") { puts 'Interrupted'; exit 0; }
    run "tail -f #{shared_path}/log/#{file}.log" do |channel, stream, data|
      puts  # for an extra line break before the host name
      puts "#{channel[:host]}: #{data}"
      break if stream == :err
    end
  end
end

# config/deploy.rb
load "config/recipes/logs"

The main difference is the trap("INT") call, which lets you abort the task with Ctrl+C. Additionally, file = fetch(:file, 'production') specifies the log file you want to show in the console. fetch checks whether a file argument was provided with the -s flag; if not, it falls back to the default value.
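
If you haven't seen trap before, here is a minimal standalone sketch of the pattern (plain Ruby, outside Capistrano): the block runs when the process receives SIGINT, so pressing Ctrl+C ends the blocking loop cleanly instead of raising an Interrupt.

# Plain-Ruby sketch of the trap("INT") pattern; the loop stands in for
# the blocking `tail -f` command that `run` executes on the servers.
trap("INT") { puts 'Interrupted'; exit 0 }

loop do
  sleep 1 # blocks until Ctrl+C sends SIGINT and the handler above exits
end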

Note: Capistrano handles these arguments in its own way. You cannot check the value of file for nil or set a default with the ||= operator, which is why fetch is used here.
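
Conceptually, fetch behaves roughly like the hypothetical helper below; fetch_with_default and the variables hash are stand-ins for illustration, not Capistrano internals. A variable set with -s wins, and the default only kicks in when the key is missing.

# Illustrative sketch only: fetch_with_default is a hypothetical helper,
# not Capistrano's implementation of fetch.
def fetch_with_default(variables, key, default)
  variables.key?(key) ? variables[key] : default
end

fetch_with_default({ file: 'cron' }, :file, 'production') # => "cron"       (cap logs:tail -s file=cron)
fetch_with_default({}, :file, 'production')               # => "production" (no -s flag given)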

So how do you run the task? It’s pretty easy:

# This will show and stream production.log
$ cap logs:tail

# This will show and stream cron.log
$ cap logs:tail -s file=cron

Any feedback? Just ping me at @tbuehl.