Data, Record Size, and Usage Limits
Size limits
Record size limits
Records can’t exceed a certain size limit. This limit may depend on your plan; see the Algolia pricing page for details. If you try to index a record that exceeds the limit, Algolia returns a “Record is too big” error.
If your records are too large, there are techniques for breaking them up into several smaller ones, such as splitting a long document into one record per paragraph, as in the sketch below.
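As an illustration, here’s a minimal sketch of one such technique: splitting a long article into one child record per paragraph, each kept under a byte budget and sharing a common attribute so duplicates can later be grouped (for example with Algolia’s distinct feature). The 10 KB budget, the attribute names, and the commented save_objects call via the algoliasearch Python client are assumptions for the example, not fixed values.

```python
import json

# Hypothetical byte budget per record; check your plan's actual limit.
MAX_RECORD_BYTES = 10_000

def split_article(article):
    """Split one large article record into paragraph-sized child records.

    Each child shares the parent's ``article_id`` so results can be
    deduplicated at query time (e.g., with ``distinct`` on that attribute).
    """
    children = []
    for i, paragraph in enumerate(article["text"].split("\n\n")):
        child = {
            "objectID": f"{article['objectID']}-{i}",
            "article_id": article["objectID"],
            "title": article["title"],
            "content": paragraph,
        }
        # Skip (or further split) paragraphs that still exceed the budget.
        if len(json.dumps(child).encode("utf-8")) <= MAX_RECORD_BYTES:
            children.append(child)
    return children

# Usage sketch with the algoliasearch Python client (v2/v3-style API):
# from algoliasearch.search_client import SearchClient
# client = SearchClient.create("YourAppID", "YourAdminAPIKey")
# index = client.init_index("articles")
# index.save_objects(split_article(big_article))
```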
Index size limits
You only need to worry about index size if your application runs on dedicated hardware, that is, if your plan has the Enterprise add-on. Though there is no strict upper limit on an index’s size, you should keep the total size of your indices below 102 GB. This represents 80% of the RAM capacity (128 GB) of dedicated servers, leaving 20% of the RAM to handle your indexing tasks. If the index size exceeds the 128 GB RAM capacity, performance degrades severely: data is constantly swapped between RAM and disk, which is a costly operation.
There is no limit on the number of records an index can have, only on the memory capacity of the hardware.
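For a rough capacity check, you can estimate how many records fit in that budget from your average record size. The figures below are purely illustrative assumptions, not Algolia limits.

```python
# Back-of-the-envelope capacity check (all numbers are illustrative).
RAM_GB = 128
INDEX_BUDGET_BYTES = 0.8 * RAM_GB * 1024**3   # ~102 GB kept for index data
AVG_RECORD_BYTES = 2_000                      # assumed average record size

approx_max_records = int(INDEX_BUDGET_BYTES / AVG_RECORD_BYTES)
print(f"Roughly {approx_max_records:,} records of ~2 KB fit in the budget")
# -> roughly 55 million records under these assumptions
```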
Indexing usage limits
Maximum indexing operations
Algolia counts the number of indexing operations you perform each month. When you reach your plan’s limit, you’re charged for the extra operations based on your plan’s over-quota pricing.
Indexing rate limit
Algolia delays or rejects indexing operations whenever a server is overloaded. If Algolia determines that indexing operations can negatively impact search requests, it takes action to favor search over indexing. This rate limit exists to protect the server’s search capacity.
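If your pipeline pushes large volumes of updates, it helps to tolerate delayed or rejected indexing calls. The helper below is a generic, illustrative backoff wrapper, not part of the Algolia API; the official clients already ship with their own retry strategy, so treat this purely as a sketch of the idea.

```python
import random
import time

def with_backoff(operation, max_attempts=5):
    """Retry a throttled indexing call with exponential backoff and jitter.

    ``operation`` is any zero-argument callable that performs the indexing
    request. The exact exception raised for a rejected request depends on
    the API client in use, so a broad except is shown here for brevity.
    """
    for attempt in range(max_attempts):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            # Wait 1 s, 2 s, 4 s, ... plus jitter before trying again.
            time.sleep(2 ** attempt + random.random())

# Usage sketch (index and records are assumed to exist):
# with_backoff(lambda: index.save_objects(records))
```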
Query usage
Algolia counts one search operation each time you perform a search. In search-as-you-type implementations, this happens on every keystroke. If you query several indices on each keystroke, one keystroke triggers as many operations as there are queried indices, unless you use the multipleQueries method to query them in a single call.
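As a sketch, here’s how several indices could be queried in a single call with the algoliasearch Python client (v2/v3-style API); the credentials, index names, and query are placeholders.

```python
from algoliasearch.search_client import SearchClient

# Placeholder credentials and index names, for illustration only.
client = SearchClient.create("YourAppID", "YourSearchOnlyAPIKey")

# One round trip for both indices instead of one request per index
# on every keystroke.
results = client.multiple_queries([
    {"indexName": "products", "query": "phone"},
    {"indexName": "articles", "query": "phone"},
])

for result in results["results"]:
    print(result["index"], result["nbHits"])
```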