An abstract cache store class. There are multiple cache store implementations, each having its own additional features. See the classes under the ActiveSupport::Cache module, e.g. ActiveSupport::Cache::MemCacheStore. MemCacheStore is currently the most popular cache store for large production websites.
Some implementations may not support all methods beyond the basic cache methods of fetch, write, read, exist?, and delete.
ActiveSupport::Cache::Store can store any serializable Ruby object.
  cache = ActiveSupport::Cache::MemoryStore.new
  cache.read("city")   # => nil
  cache.write("city", "Duckburgh")
  cache.read("city")   # => "Duckburgh"
Keys are always translated into Strings and are case sensitive. When an object is specified as a key, its cache_key method will be called if it is defined. Otherwise, the to_param method will be called. Hashes and Arrays can also be used as keys. Their elements will be delimited by slashes, and Hash elements will be sorted by key so that they are consistent.
cache.read("city") == cache.read(:city) # => true
Nil values can be cached.
If your cache is on a shared infrastructure, you can define a namespace for your cache entries. If a namespace is defined, it will be prefixed onto every key. The namespace can be either a static value or a Proc. If it is a Proc, it will be invoked when each key is evaluated so that you can use application logic to invalidate keys.
  cache.namespace = lambda { @last_mod_time }  # Set the namespace to a variable
  @last_mod_time = Time.now                    # Invalidate the entire cache by changing namespace
Caches can also store values in a compressed format to save space and reduce time spent sending data. Since there is some overhead, values must be large enough to warrant compression. To turn on compression, pass :compress => true either to the initializer or to fetch or write. To specify the threshold at which to compress values, set :compress_threshold. The default threshold is 32K.
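As a sketch, assuming a MemoryStore and a hypothetical variable large_html holding a string big enough to exceed the threshold:

  cache = ActiveSupport::Cache::MemoryStore.new(:compress => true, :compress_threshold => 64.kilobytes)
  cache.write("page", large_html)                       # compressed if it exceeds 64K
  cache.write("page", large_html, :compress => false)   # opt out for a single call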
  # File lib/active_support/cache.rb, line 176
  def self.instrument
    Thread.current[:instrument_cache_store] || false
  end
Set to true if cache stores should be instrumented. Default is false.
  # File lib/active_support/cache.rb, line 172
  def self.instrument=(boolean)
    Thread.current[:instrument_cache_store] = boolean
  end
Create a new cache. The options will be passed to any write method calls, except for :namespace, which can be used to set the global namespace for the cache.
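For instance, options given to the constructor become defaults for later write calls (an illustrative sketch using the standard options documented above):

  cache = ActiveSupport::Cache::MemoryStore.new(:namespace => "app_v1", :expires_in => 10.minutes)
  cache.write("greeting", "hello")   # stored under "app_v1:greeting", expires in 10 minutes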
  # File lib/active_support/cache.rb, line 153
  def initialize(options = nil)
    @options = options ? options.dup : {}
  end
Cleanup the cache by removing expired entries.
Options are passed to the underlying cache implementation.
Not all implementations support this method.
  # File lib/active_support/cache.rb, line 425
  def cleanup(options = nil)
    raise NotImplementedError.new("#{self.class.name} does not support cleanup")
  end
Clear the entire cache. Be careful with this method since it could affect other processes if shared cache is being used.
Options are passed to the underlying cache implementation.
Not all implementations support this method.
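A minimal sketch using MemoryStore, which does implement clear:

  cache = ActiveSupport::Cache::MemoryStore.new
  cache.write("city", "Duckburgh")
  cache.clear
  cache.read("city")  # => nil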
  # File lib/active_support/cache.rb, line 435
  def clear(options = nil)
    raise NotImplementedError.new("#{self.class.name} does not support clear")
  end
Decrement an integer value in the cache.
Options are passed to the underlying cache implementation.
Not all implementations support this method.
  # File lib/active_support/cache.rb, line 416
  def decrement(name, amount = 1, options = nil)
    raise NotImplementedError.new("#{self.class.name} does not support decrement")
  end
Deletes an entry in the cache. Returns true if an entry is deleted.
Options are passed to the underlying cache implementation.
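For example (return values sketched for MemoryStore; other stores may differ slightly):

  cache.write("city", "Duckburgh")
  cache.delete("city")  # => true
  cache.read("city")    # => nil
  cache.delete("city")  # => false, nothing left to delete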
  # File lib/active_support/cache.rb, line 371
  def delete(name, options = nil)
    options = merged_options(options)
    instrument(:delete, name) do |payload|
      delete_entry(namespaced_key(name, options), options)
    end
  end
Delete all entries with keys matching the pattern.
Options are passed to the underlying cache implementation.
Not all implementations support this method.
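A sketch assuming MemoryStore, which implements delete_matched with a Regexp matcher:

  cache = ActiveSupport::Cache::MemoryStore.new
  cache.write("city/1", "Duckburgh")
  cache.write("country/1", "Calisota")
  cache.delete_matched(/^city/)
  cache.read("city/1")     # => nil
  cache.read("country/1")  # => "Calisota"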
  # File lib/active_support/cache.rb, line 398
  def delete_matched(matcher, options = nil)
    raise NotImplementedError.new("#{self.class.name} does not support delete_matched")
  end
Return true if the cache contains an entry for the given key.
Options are passed to the underlying cache implementation.
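For example (a sketch continuing the MemoryStore examples above):

  cache.write("city", "Duckburgh")
  cache.exist?("city")     # => true
  cache.exist?("country")  # => false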
  # File lib/active_support/cache.rb, line 381
  def exist?(name, options = nil)
    options = merged_options(options)
    instrument(:exist?, name) do |payload|
      entry = read_entry(namespaced_key(name, options), options)
      if entry && !entry.expired?
        true
      else
        false
      end
    end
  end
Fetches data from the cache, using the given key. If there is data in the cache with the given key, then that data is returned.
If there is no such data in the cache (a cache miss occurred), then nil will be returned. However, if a block has been passed, then that block will be run in the event of a cache miss. The return value of the block will be written to the cache under the given cache key, and that return value will be returned.
cache.write("today", "Monday") cache.fetch("today") # => "Monday" cache.fetch("city") # => nil cache.fetch("city") do "Duckburgh" end cache.fetch("city") # => "Duckburgh"
You may also specify additional options via the options argument. Setting :force => true will force a cache miss:
cache.write("today", "Monday") cache.fetch("today", :force => true) # => nil
Setting :compress will store the cache entry set by the call in a compressed format, provided the value is large enough to warrant compression.
Setting :expires_in will set an expiration time on the cache entry. All caches support auto-expiring content after a specified number of seconds. This value can be specified as an option to the constructor, in which case all entries will be affected. Or it can be supplied to the fetch or write method to affect just one entry.
  cache = ActiveSupport::Cache::MemoryStore.new(:expires_in => 5.minutes)
  cache.write(key, value, :expires_in => 1.minute)  # Set a lower value for one entry
Setting :race_condition_ttl is very useful in situations where a cache entry is used very frequently and is under heavy load. If a cache entry expires under heavy load, several different processes may try to regenerate the data natively and then all try to write it to the cache. To avoid that, the first process to find an expired cache entry bumps the entry's expiration time by the value set in :race_condition_ttl. Yes, this process is extending the time for a stale value by another few seconds. Because of the extended life of the previous entry, other processes will continue to use slightly stale data for just a bit longer. In the meantime, the first process will go ahead and write the new value into the cache. After that, all processes will start getting the new value. The key is to keep :race_condition_ttl small.
If the process regenerating the entry errors out, the entry will be regenerated after the specified number of seconds. Also note that the life of a stale entry is extended only if it expired recently. Otherwise a new value is generated and :race_condition_ttl does not play any role.
  # Set all values to expire after one minute.
  cache = ActiveSupport::Cache::MemoryStore.new(:expires_in => 1.minute)

  cache.write("foo", "original value")
  val_1 = nil
  val_2 = nil
  sleep 60

  Thread.new do
    val_1 = cache.fetch("foo", :race_condition_ttl => 10) do
      sleep 1
      "new value 1"
    end
  end

  Thread.new do
    val_2 = cache.fetch("foo", :race_condition_ttl => 10) do
      "new value 2"
    end
  end

  # val_1 => "new value 1"
  # val_2 => "original value"

  # sleep 10 # First thread extends the life of cache by another 10 seconds
  # cache.fetch("foo") => "new value 1"
Other options will be handled by the specific cache store implementation. Internally, fetch calls read_entry, and calls write_entry on a cache miss. The options will be passed to the read and write calls.
For example, MemCacheStore’s write method supports the :raw option, which tells the memcached server to store all values as strings. We can use this option with fetch too:
  cache = ActiveSupport::Cache::MemCacheStore.new
  cache.fetch("foo", :force => true, :raw => true) do
    :bar
  end
  cache.fetch("foo")  # => "bar"
  # File lib/active_support/cache.rb, line 271
  def fetch(name, options = nil)
    if block_given?
      options = merged_options(options)
      key = namespaced_key(name, options)
      unless options[:force]
        entry = instrument(:read, name, options) do |payload|
          payload[:super_operation] = :fetch if payload
          read_entry(key, options)
        end
      end
      if entry && entry.expired?
        race_ttl = options[:race_condition_ttl].to_f
        if race_ttl and Time.now.to_f - entry.expires_at <= race_ttl
          entry.expires_at = Time.now + race_ttl
          write_entry(key, entry, :expires_in => race_ttl * 2)
        else
          delete_entry(key, options)
        end
        entry = nil
      end

      if entry
        instrument(:fetch_hit, name, options) { |payload| }
        entry.value
      else
        result = instrument(:generate, name, options) do |payload|
          yield
        end
        write(name, result, options)
        result
      end
    else
      read(name, options)
    end
  end
Increment an integer value in the cache.
Options are passed to the underlying cache implementation.
Not all implementations support this method.
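A sketch assuming MemCacheStore, which implements increment and decrement; counters are typically written with :raw => true so the server can operate on the stored number directly:

  cache = ActiveSupport::Cache::MemCacheStore.new
  cache.write("counter", 0, :raw => true)
  cache.increment("counter")      # => 1
  cache.increment("counter", 10)  # => 11
  cache.decrement("counter", 5)   # => 6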
  # File lib/active_support/cache.rb, line 407
  def increment(name, amount = 1, options = nil)
    raise NotImplementedError.new("#{self.class.name} does not support increment")
  end
Silence the logger within a block.
  # File lib/active_support/cache.rb, line 164
  def mute
    previous_silence, @silence = defined?(@silence) && @silence, true
    yield
  ensure
    @silence = previous_silence
  end
Fetches data from the cache, using the given key. If there is data in the cache with the given key, then that data is returned. Otherwise, nil is returned.
Options are passed to the underlying cache implementation.
  # File lib/active_support/cache.rb, line 312
  def read(name, options = nil)
    options = merged_options(options)
    key = namespaced_key(name, options)
    instrument(:read, name, options) do |payload|
      entry = read_entry(key, options)
      if entry
        if entry.expired?
          delete_entry(key, options)
          payload[:hit] = false if payload
          nil
        else
          payload[:hit] = true if payload
          entry.value
        end
      else
        payload[:hit] = false if payload
        nil
      end
    end
  end
Read multiple values at once from the cache. Options can be passed in the last argument.
Some cache implementations may optimize this method.
Returns a hash mapping the names provided to the values found.
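For example, keys with no entry are simply absent from the returned hash (a sketch continuing the examples above):

  cache.write("city", "Duckburgh")
  cache.write("country", "Calisota")
  cache.read_multi("city", "country", "missing")
  # => { "city" => "Duckburgh", "country" => "Calisota" }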
  # File lib/active_support/cache.rb, line 339
  def read_multi(*names)
    options = names.extract_options!
    options = merged_options(options)
    results = {}
    names.each do |name|
      key = namespaced_key(name, options)
      entry = read_entry(key, options)
      if entry
        if entry.expired?
          delete_entry(key)
        else
          results[name] = entry.value
        end
      end
    end
    results
  end
Silence the logger.
  # File lib/active_support/cache.rb, line 158
  def silence!
    @silence = true
    self
  end
Writes the value to the cache, with the key.
Options are passed to the underlying cache implementation.
  # File lib/active_support/cache.rb, line 360
  def write(name, value, options = nil)
    options = merged_options(options)
    instrument(:write, name, options) do |payload|
      entry = Entry.new(value, options)
      write_entry(namespaced_key(name, options), entry, options)
    end
  end
Add the namespace defined in the options to a pattern designed to match keys. Implementations that support delete_matched should call this method to translate a pattern that matches names into one that matches namespaced keys.
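Illustratively, a pattern anchored at the start keeps its anchor after the namespace prefix, while an unanchored pattern is allowed to match anywhere within the key (a sketch of the translation, as called from inside a store implementation):

  key_matcher(/^city/, :namespace => "app")  # => /^app:city/
  key_matcher(/city/, :namespace => "app")   # => /^app:.*city/
  key_matcher(/city/, {})                    # => /city/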
  # File lib/active_support/cache.rb, line 443
  def key_matcher(pattern, options)
    prefix = options[:namespace].is_a?(Proc) ? options[:namespace].call : options[:namespace]
    if prefix
      source = pattern.source
      if source.start_with?('^')
        source = source[1, source.length]
      else
        source = ".*#{source[0, source.length]}"
      end
      Regexp.new("^#{Regexp.escape(prefix)}:#{source}", pattern.options)
    else
      pattern
    end
  end
  # File lib/active_support/cache.rb, line 511
  def instrument(operation, key, options = nil)
    log(operation, key, options)

    if self.class.instrument
      payload = { :key => key }
      payload.merge!(options) if options.is_a?(Hash)
      ActiveSupport::Notifications.instrument("cache_#{operation}.active_support", payload){ yield(payload) }
    else
      yield(nil)
    end
  end
  # File lib/active_support/cache.rb, line 523
  def log(operation, key, options = nil)
    return unless logger && logger.debug? && !silence?
    logger.debug("Cache #{operation}: #{key}#{options.blank? ? "" : " (#{options.inspect})"}")
  end
Prefix a key with the namespace. Namespace and key will be delimited with a colon.
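For instance, following the source below (a sketch of the helper's behavior):

  namespaced_key("city", :namespace => "app")            # => "app:city"
  namespaced_key("city", :namespace => lambda { "v2" })  # => "v2:city"
  namespaced_key("city", {})                             # => "city"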
  # File lib/active_support/cache.rb, line 503
  def namespaced_key(key, options)
    key = expanded_key(key)
    namespace = options[:namespace] if options
    prefix = namespace.is_a?(Proc) ? namespace.call : namespace
    key = "#{prefix}:#{key}" if prefix
    key
  end