howto - speed up mysql constantly with increasing table size
The starting point is the following: you have a table with constantly growing content. Because of that, somebody added a timestamp or a created_at column. Now a requirement pops up: "we need a quick overview of the entries per day, and we want to paginate over the results."

The first idea is to add an index on the created_at column. But your table keeps growing, and another index also means more work for the database on every write. To implement the pagination, you reach for a LIMIT with an OFFSET. But the limit doesn't really speed up your pagination once you try to fetch the last pages: MySQL still has to step over all the rows you want to leave out (because of the offset you have to provide), so late pages get slower and slower.

So what to do? Assuming you have a unique key that is an auto-increment value, you can use it to reach your goals (and don't need another created_at column and a write-intensive index). For the per-day pagination you still need a little helper, though. One idea is to create a table that holds the first unique id for each day; another is to create a table per day or month. Since the second idea has too many drawbacks for me, I will explain the first one in a bit more detail.

The idea is to create a helper table with the columns `id`, `date` and `first_id_for_the_day`. With it you can easily answer the question "which ids were entered on a given day": just add a `BETWEEN id_of_search_date AND (id_of_search_next_date - 1)` to your query. Of course you still need the limit, but the database now works on a much smaller scope (bounded by the BETWEEN clause), so it shouldn't be that big a deal for it anymore :-).
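Here is a minimal sketch of how this could look in MySQL. The table and column names (`entries`, `first_id_per_day`, `payload`) and the once-per-day insert are assumptions for illustration; the helper table follows the `id`, `date`, `first_id_for_the_day` layout described above.

```sql
-- Hypothetical main table with an auto-increment primary key.
CREATE TABLE entries (
    id      BIGINT UNSIGNED NOT NULL AUTO_INCREMENT,
    payload VARCHAR(255)    NOT NULL,
    PRIMARY KEY (id)
);

-- Helper table: one row per day, holding the first id for that day.
CREATE TABLE first_id_per_day (
    id                   INT UNSIGNED    NOT NULL AUTO_INCREMENT,
    date                 DATE            NOT NULL,
    first_id_for_the_day BIGINT UNSIGNED NOT NULL,
    PRIMARY KEY (id),
    UNIQUE KEY (date)
);

-- Run once per day (e.g. a cron job right after midnight):
-- the next row inserted into `entries` will get at least MAX(id) + 1,
-- so that value is a safe lower bound for the new day.
INSERT INTO first_id_per_day (date, first_id_for_the_day)
SELECT CURDATE(), COALESCE(MAX(id) + 1, 1)
FROM entries;
```

The pagination query then might look like this (the date values are just examples; for the current day there is no next-day row yet, so you would simply drop the upper bound):

```sql
-- All entries created on 2011-05-10, page 3 with 100 rows per page.
-- The BETWEEN bounds come from the tiny helper table, so MySQL can do
-- a cheap range scan on the primary key; the OFFSET only has to step
-- over rows within that single day.
SELECT e.*
FROM entries e
JOIN first_id_per_day d  ON d.date  = '2011-05-10'
JOIN first_id_per_day dn ON dn.date = '2011-05-11'
WHERE e.id BETWEEN d.first_id_for_the_day
               AND dn.first_id_for_the_day - 1
ORDER BY e.id
LIMIT 100 OFFSET 200;
```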