Postpone commands until the Internet is back online
Last update
2018-11-22
«save commands to disk and run them when the connection is back»
If you need to run a command like `wget` or `mail` now, but your internet connection is down, you can use this script to save your command to disk and run it when the connection is up again:
```sh
wait4inet cmd wget ...   # save a command for online execution
wait4inet check          # runs saved commands only if we are online
wait4inet check-loop     # runs every minute any saved command only if we are online
```
You can add it to your crontab too:

```sh
# m h dom mon dow command
*/5 * * * * /path/to/wait4inet check > /dev/null 2>&1
```
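Each queued command ends up as a small timestamped YAML file in the job directory. A minimal sketch of what gets written, using the field names and filename scheme from the script below (the `wget` URL is just a placeholder):

```ruby
require 'yaml'

# Sketch of a queued job file as save_job (below) would write it;
# the example command is hypothetical.
ts    = Time.now
data  = { args: ['wget', 'http://example.com/file.iso'] }.merge(action: 'cmd', ts: ts)
fname = "#{ts.strftime '%F_%T_%3N'}.w4i"  # e.g. "2018-11-22_10:30:00_123.w4i"

puts fname
puts data.to_yaml
```

Because the filenames sort lexicographically by timestamp, a plain `Dir[...].sort` replays the jobs in the order they were queued.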
Here is the `wait4inet` script:
```ruby
#!/usr/bin/env ruby

Signal.trap('INT') { exit } # graceful exit

module Wait4Inet
  require 'shellwords'
  require 'yaml'

  JOB_DIR     = "/tmp/#{File.basename __FILE__}.jobs"
  MAX_RETRIES = 3

  def self.run(args)
    action = args.shift

    # extract action arguments
    data = case action
      when 'check'
        # read and execute now every job file
        process_files
        exit
      when 'check-loop'
        interval = args[0].to_i == 0 ? 60 : args[0].to_i
        loop { process_files; sleep interval }
        exit
      when 'cmd'
        { args: args }
      else
        puts "USAGE: #{File.basename __FILE__} <action> [params]"
        puts "  * cmd <command> [command params]"
        puts "  * check          # runs queued actions"
        puts "  * check-loop [S] # run \"check\" action every S seconds"
        exit
    end

    save_job action, data
  end # self.run ---------------------------------------------------------------

  def self.process_files
    return unless online?
    Dir.chdir('/tmp') # safe working directory
    Dir["#{JOB_DIR}/*.w4i"].sort.each do |job_file|
      next if File.exist?("#{job_file}.lock")
      File.open("#{job_file}.lock", 'w').close # touch
      data = YAML::load_file job_file rescue {}
      next unless data[:action] && data[:ts] # sanity check
      puts "\n===== #{File.basename job_file}: #{data}"
      next unless data[:retries].to_i < MAX_RETRIES
      case data[:action]
      when 'cmd'
        system %Q| #{data[:args].map(&:shellescape).join ' '}|
        update_job job_file, data, $?
      end
      File.unlink "#{job_file}.lock"
      sleep 1
    end # each job_file
  end # self.process_files -----------------------------------------------------

  def self.save_job(action, data)
    Dir.mkdir(JOB_DIR) unless File.exist?(JOB_DIR)
    File.chmod(0777, JOB_DIR) if File.owned?(JOB_DIR)

    # write job file with timestamp
    ts    = Time.now
    data  = data.merge action: action, ts: ts
    fname = "#{JOB_DIR}/#{ts.strftime '%F_%T_%3N'}.w4i"
    File.open(fname, 'w'){|f| f.write data.to_yaml }
  end # self.save_job ----------------------------------------------------------

  def self.update_job(job_file, data, exit_code, exit_code_ok = 0)
    if exit_code.to_i == exit_code_ok.to_i
      File.unlink job_file
    else
      data[:retries  ] = data[:retries].to_i + 1
      data[:exit_code] = exit_code.to_i
      File.open(job_file, 'w'){|f| f.write data.to_yaml }
      File.rename(job_file, "#{job_file}.failed") if data[:retries].to_i == MAX_RETRIES
    end
  end # self.update_job --------------------------------------------------------

  def self.online?
    `wget -q --spider -T 3 http://google.com`
    $?.to_i == 0
  end # self.online? -----------------------------------------------------------

end # module Wait4Inet ---------------------------------------------------------

Wait4Inet.run ARGV
```
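The `online?` check shells out to `wget --spider`. If you would rather avoid the external dependency, a pure-Ruby variant (a sketch, not part of the original script; host, port, and timeout defaults are assumptions) could open a TCP connection instead:

```ruby
require 'socket'

# Hypothetical pure-Ruby replacement for the script's online? check:
# try to open a TCP connection to a well-known host instead of
# shelling out to wget.
def online?(host = 'google.com', port = 80, timeout = 3)
  Socket.tcp(host, port, connect_timeout: timeout) { true }
rescue StandardError # DNS failure, connection refused, timeout, ...
  false
end
```

The upside is one less external process per check; the downside is that a TCP connect only proves the port answers, while `--spider` also exercises HTTP.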
The script uses lock files to prevent multiple concurrent executions of the same command, so you can safely run long tasks and multiple instances of the script.