There are four common ways to implement paging in stored procedures: row counting, cursors, ascending-descending ordering, and subqueries.
I recall seeing a benchmark of these four methods; ordered from best performance to worst, the results were: row counting, cursor, ascending-descending, subquery.
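For orientation, here is a minimal sketch of the row-counting method, the best performer in that ranking. The table and column names (a dbo.Orders table with an indexed OrderID key) are hypothetical, not from a specific source:

```sql
-- Row-counting method (SQL Server 2000 and later):
-- first skip to the key of the page's first row, then read one page.
DECLARE @PageSize int, @PageNum int, @SkipTo int, @FirstID int
SET @PageSize = 50
SET @PageNum  = 3
SET @SkipTo   = (@PageNum - 1) * @PageSize + 1

SET ROWCOUNT @SkipTo                 -- stop scanning once the page start is reached
SELECT @FirstID = OrderID FROM dbo.Orders ORDER BY OrderID

SET ROWCOUNT @PageSize               -- now read exactly one page
SELECT OrderID, OrderDate, CustomerID
FROM dbo.Orders
WHERE OrderID >= @FirstID
ORDER BY OrderID

SET ROWCOUNT 0                       -- reset for subsequent statements
```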
Below is some material I have collected for your reference.
QUOTE:
Original address: /aspnet/
Author: Jasmin Muharemovic
Translator: Tony Qu
download:
Introduction
In web applications, paging through a large database result set is a well-known problem. Simply put, you do not want all the query results displayed on a single page, so a paged display is more appropriate. While this is not a simple task in traditional ASP, in ASP.NET the DataGrid control reduces it to just a few lines of code. Paging in ASP.NET is therefore easy, but the default DataGrid paging event reads every record from the database into the web application. When you have more than a million rows, this causes serious performance problems (if you don't believe it, execute such a query in your application and watch the memory consumption of aspnet_wp.exe in Task Manager). This is why custom paging behavior is needed: it ensures you fetch only the records required by the current page.
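As an aside: on SQL Server 2012 and later (long after this article was written), fetching only the current page is built into the language via OFFSET/FETCH. A minimal sketch, again assuming a hypothetical dbo.Orders table:

```sql
DECLARE @PageSize int = 50, @PageNum int = 3

SELECT OrderID, OrderDate, CustomerID
FROM dbo.Orders
ORDER BY OrderID                        -- OFFSET/FETCH requires an ORDER BY
OFFSET (@PageNum - 1) * @PageSize ROWS
FETCH NEXT @PageSize ROWS ONLY;
```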
There are many articles and posts about this problem online, as well as several mature solutions. My purpose in writing is not to show you a stored procedure that solves every problem, but to optimize the existing methods and give you a test application so you can develop according to your own needs. The article below is a good starting point; it covers many different approaches and gives some performance test results:
"How do I page through a Recordset?"
But I was not very satisfied with most of them. First, half of the methods use classic ADO and were obviously written for "old" ASP. The remaining methods are SQL Server stored procedures, and several of them are too slow to be usable, as the performance results at the end of that article show; still, a few caught my attention.
Generalization
I decided to analyze three of these methods carefully: temporary tables (TempTable), dynamic SQL (DynamicSQL), and row count (Rowcount). In what follows I prefer to call the second method the Asc-Desc method; I don't think "dynamic SQL" is a good name, because dynamic SQL logic can be applied to any of the other methods as well. The common problem with all these stored procedures is that you must specify in advance which columns you will sort by, not just the primary key (PK) columns, and that leads to a series of problems: for every query you want to display with paging, you need a separate paging query per sort column. That means you either write a separate stored procedure for each sort column (regardless of which paging method you use), or you use dynamic SQL to fold this functionality into a single stored procedure. Both approaches have a small performance cost, but they improve maintainability, especially when you need to apply the method to different queries. Therefore, in this article I will try to generalize all the stored procedures using dynamic SQL; but for various reasons only the implementation part can be made generic, so you will still have to write standalone stored procedures for complex queries.
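To make the dynamic SQL idea concrete, here is a sketch of a single procedure that accepts the sort column as a parameter. Procedure, table, and column names are hypothetical; the column is checked against a whitelist because a concatenated ORDER BY is an injection risk. The paging itself uses OFFSET/FETCH for brevity; each of the article's methods would substitute its own paging logic here:

```sql
CREATE PROCEDURE dbo.GetOrdersPage
    @SortColumn sysname,
    @PageSize   int,
    @PageNum    int
AS
BEGIN
    -- Whitelist the sort column: never concatenate raw user input.
    IF @SortColumn NOT IN (N'OrderID', N'OrderDate', N'CustomerID')
    BEGIN
        RAISERROR(N'Unsupported sort column', 16, 1)
        RETURN
    END

    DECLARE @sql nvarchar(4000)
    SET @sql = N'SELECT OrderID, OrderDate, CustomerID
                 FROM dbo.Orders
                 ORDER BY ' + QUOTENAME(@SortColumn) + N'
                 OFFSET (@p - 1) * @s ROWS FETCH NEXT @s ROWS ONLY'

    -- Page number and size stay as real parameters; only the column
    -- name is built into the SQL string.
    EXEC sp_executesql @sql, N'@p int, @s int', @p = @PageNum, @s = @PageSize
END
```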
The second problem with allowing arbitrary sort columns, including the primary key columns, is that if those columns are not properly indexed, none of these methods will help. In every method the paging source must be sorted first, and on a large table the cost of sorting on a non-indexed column is anything but negligible. In that case the response times are so long that none of the stored procedures are usable in practice (response times range from seconds to minutes, depending on the size of the table and on the first record to be fetched). Indexing the other columns introduces its own unwelcome performance cost: for example, if you import a lot of data every day, the extra indexes can slow the import considerably.
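For example (hypothetical table and column names), supporting a frequently requested sort order means adding a nonclustered index, and each such index is one more structure the daily import has to maintain:

```sql
-- Speeds up paging queries sorted by OrderDate, but every row inserted
-- during a bulk import now has to update this index as well.
CREATE NONCLUSTERED INDEX IX_Orders_OrderDate
    ON dbo.Orders (OrderDate);
```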