I have a Ruby on Rails app deployed, and I ran into a bottlenecking issue when many users tried to use it. It's a simple game stats application: the user enters a name, the app makes an API call, and it returns that user's stats. It works fine when there are only a few users, but once more users started using it, it creates insufferable lag of 5 minutes per request. So I added Unicorn to my Gemfile, set up a Procfile, and deployed it. Now, if there are 2 simultaneous requests, the app crashes. I thought Unicorn was meant to handle concurrent requests, not destroy them? At least before, requests were still being processed, albeit with a delay. What am I doing wrong here?
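For context, the controller action boils down to something like this (a simplified sketch; the class, route, and API URL are placeholders, not the exact code):

# app/controllers/stats_controller.rb -- simplified sketch, names are placeholders
require 'net/http'
require 'json'

class StatsController < ApplicationController
  def show
    # One blocking HTTP call per request: the process sits and waits
    # until the external stats API responds before rendering anything.
    uri = URI("https://stats-api.example.com/players/#{params[:name]}")
    response = Net::HTTP.get_response(uri)
    @stats = JSON.parse(response.body)
  end
end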
Here is the Procfile I used:
web: bundle exec unicorn -p $PORT -c ./config/unicorn.rb
And here is my config/unicorn.rb:
worker_processes 3   # number of forked worker processes
timeout 30           # workers taking longer than 30s on a request are killed
preload_app true

before_fork do |server, worker|
  Signal.trap 'TERM' do
    puts 'Unicorn master intercepting TERM and sending myself QUIT instead'
    Process.kill 'QUIT', Process.pid
  end

  defined?(ActiveRecord::Base) and
    ActiveRecord::Base.connection.disconnect!
end

after_fork do |server, worker|
  Signal.trap 'TERM' do
    puts 'Unicorn worker intercepting TERM and doing nothing. Wait for master to send QUIT'
  end

  defined?(ActiveRecord::Base) and
    ActiveRecord::Base.establish_connection
end