Pagination is a pain in the arse, right? That's one of the lovely things about using an ORM: it does it for you. But that isn't always possible, so at some point you'll need to use a LIMIT on your SELECT. Now, MySQL has a nice little feature called SQL_CALC_FOUND_ROWS which makes it calculate the number of rows your query would have returned before the LIMIT was applied.
You do something like this in your SQL SELECT statement.
SELECT SQL_CALC_FOUND_ROWS my_table.* FROM my_table,another_table WHERE my_table.id=another_table.id LIMIT 1,10;
Then you can do something like this to find out the total number of rows that would have been returned had you not applied the limit.
SELECT FOUND_ROWS();
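One thing to watch: FOUND_ROWS() is per-connection, so you need to call it on the same connection, straight after the query, before anything else runs on it. Putting the two together (table names as in the example above):

```sql
SELECT SQL_CALC_FOUND_ROWS my_table.*
FROM my_table, another_table
WHERE my_table.id = another_table.id
LIMIT 10, 10;        -- skip 10 rows, fetch the next 10 (page 2 at 10 a page)

SELECT FOUND_ROWS(); -- total matching rows, ignoring the LIMIT
```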
If you wanted to do this in a Perl DBI program you'd do something like the following.
my $query = "SELECT SQL_CALC_FOUND_ROWS *
             FROM shizzle
             WHERE id = ?
             AND is_public = 1
             ORDER BY updated DESC";
if ( $pg && $max ) {
    my $offset = ( $pg * $max ) - $max;   # page 1 → offset 0, page 2 → offset 10, etc.
    $query .= " LIMIT $offset,$max";
}
my $sth = $dbh->prepare($query) or die "Aww crap, I can't prepare $query: " . $dbh->errstr;
$sth->execute( $self->{id} )    or die "Aww crap, I can't run query $query: " . $sth->errstr;
# Grab the unlimited row count on the same handle, straight after the query
my $num_rows_arrayref = $dbh->selectall_arrayref("SELECT FOUND_ROWS()");
my $num_rows = $num_rows_arrayref->[0][0];
while ( my $hash = $sth->fetchrow_hashref() ) {
    # Do stuff with $hash->{field1}, etc …
}
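Once you've got $num_rows back, working out how many pages you need is just the row count divided by the page size, rounded up. A quick sketch (the page_count name is mine, not anything from DBI):

```perl
use strict;
use warnings;
use POSIX qw(ceil);

# Total pages = rows divided by page size, rounded up.
sub page_count {
    my ( $num_rows, $max ) = @_;
    return 0 unless $max;                 # avoid dividing by zero
    return ceil( $num_rows / $max );
}

print page_count( 95, 10 ), "\n";         # 95 rows at 10 a page → 10 pages
```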
It's great to be able to at least make pagination simpler using this technique. It's not quite as cute as using an ORM, but it does the job.