MySQL offset infinite rows

Published 2024-07-07 12:17:36

I would like to construct a query that displays all the results in a table, but is offset by 5 from the start of the table. As far as I can tell, MySQL's LIMIT requires a limit as well as an offset. Is there any way to do this?

Comments (10)

兔小萌 2024-07-14 12:17:37
WHERE .... AND id > <YOUROFFSET>

id can be any autoincremented or unique numerical column you have...
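
For instance, a minimal sketch of a full query (the table name mytable is hypothetical, and this assumes id is a gap-free auto-increment key starting at 1; with gaps, "id > 5" no longer matches skipping exactly 5 rows):

-- Skip the first 5 rows by key value rather than by position
SELECT *
FROM mytable
WHERE id > 5
ORDER BY id;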

寂寞花火° 2024-07-14 12:17:36

From the MySQL Manual on LIMIT:

To retrieve all rows from a certain offset up to the end of the result set, you can use some large number for the second parameter. This statement retrieves all rows from the 96th row to the last:

SELECT * FROM tbl LIMIT 95, 18446744073709551615;
只有影子陪我不离不弃 2024-07-14 12:17:36

As you mentioned, LIMIT is required, so you need to use the biggest limit possible, which is 18446744073709551615 (the maximum value of an unsigned BIGINT).

SELECT * FROM somewhere LIMIT 18446744073709551610 OFFSET 5
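
For reference, the two LIMIT spellings used in this thread are interchangeable; assuming the same placeholder table, both of the following skip the first 5 rows and return everything after them:

SELECT * FROM somewhere LIMIT 18446744073709551610 OFFSET 5;
-- equivalent comma form: LIMIT offset, row_count
SELECT * FROM somewhere LIMIT 5, 18446744073709551610;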
二智少女 2024-07-14 12:17:36

As noted in other answers, MySQL suggests using 18446744073709551615 as the number of records in the limit, but consider this: What would you do if you got 18,446,744,073,709,551,615 records back? In fact, what would you do if you got 1,000,000,000 records?

Maybe you do want more than one billion records, but my point is that there is some limit on the number you want, and it is less than 18 quintillion. For the sake of stability, optimization, and possibly usability, I would suggest putting some meaningful limit on the query. This would also reduce confusion for anyone who has never seen that magical-looking number, and would have the added benefit of communicating at least how many records you are willing to handle at once.

If you really must get all 18 quintillion records from your database, maybe what you really want is to grab them in increments of 100 million and loop 184 billion times.
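
If batching is the route taken, a rough sketch of the idea (my_table, the id ordering, and the batch size of 1,000,000 are all placeholder choices; an indexed keyset approach would scale better than ever-growing offsets):

SELECT * FROM my_table ORDER BY id LIMIT 5, 1000000;        -- first batch, after the offset of 5
SELECT * FROM my_table ORDER BY id LIMIT 1000005, 1000000;  -- next batch, and so on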

安静被遗忘 2024-07-14 12:17:36

Another approach would be to select an autoincremented column and then filter it using HAVING.

SET @a := 0;
-- `table` is a placeholder name and needs backticks because TABLE is a reserved word
SELECT @a := @a + 1 AS counter, `table`.* FROM `table`
HAVING counter > 4;

But I would probably stick with the high limit approach.

想念有你 2024-07-14 12:17:36

As others mentioned, per the MySQL manual, you can use the maximum value of an unsigned BIGINT to achieve that, which is this awful number (18446744073709551615). But to make it a little bit less messy you can use the tilde "~" bitwise operator.

  LIMIT 95, ~0

It works as a bitwise negation. The result of "~0" is 18446744073709551615.
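
The value is easy to verify on its own, since MySQL evaluates bitwise operators over unsigned 64-bit integers:

SELECT ~0;  -- returns 18446744073709551615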

我们只是彼此的过ke 2024-07-14 12:17:36

I ran into a very similar issue when practicing LC#1321, in which I had to select all the dates with the first 6 dates skipped.

I achieved this in MySQL with the help of the ROW_NUMBER() window function and a subquery. For example, the following query returns all the results with the first five rows skipped:

SELECT
    fieldname1,
    fieldname2
FROM(
    SELECT
        *,
        ROW_NUMBER() OVER() row_num
    FROM
        mytable
) tmp
WHERE
    row_num > 5;

You may need to add some more logic in the subquery, especially inside OVER(), to fit your needs. In addition, the RANK()/DENSE_RANK() window functions may be used instead of ROW_NUMBER(), depending on your actual offset logic.
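
For example, a sketch of one such variation (the created_at column is hypothetical; adding ORDER BY inside OVER() makes the row numbering, and therefore the offset, deterministic):

SELECT
    fieldname1,
    fieldname2
FROM (
    SELECT
        *,
        ROW_NUMBER() OVER (ORDER BY created_at) AS row_num
    FROM
        mytable
) tmp
WHERE
    row_num > 5;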

Reference:

MySQL 8.0 Reference Manual - ROW_NUMBER()

扮仙女 2024-07-14 12:17:36

You can use a MySQL prepared statement with LIMIT:

START TRANSACTION;
SET @my_offset = 5;
SET @rows = (SELECT COUNT(*) FROM my_table);
PREPARE statement FROM 'SELECT * FROM my_table LIMIT ? OFFSET ?';
EXECUTE statement USING @rows, @my_offset;
COMMIT;

Tested in MySQL 5.5.44. This way, we avoid inserting the number 18446744073709551615.

Note: the transaction makes sure that the variable @rows agrees with the state of the table at the moment the statement is executed.
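
If the prepared statement is not reused, it can optionally be released afterwards:

DEALLOCATE PREPARE statement;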

浅浅 2024-07-14 12:17:36

Just today I was reading about the best way to get huge amounts of data (more than a million rows) from a MySQL table. One way is, as suggested, using LIMIT x,y where x is the offset and y is the number of rows you want returned. However, as I found out, it isn't the most efficient way to do so. If you have an autoincrement column, you can just as easily use a SELECT statement with a WHERE clause saying from which record you'd like to start.

For example,
SELECT * FROM table_name WHERE id > x;

It seems that when you use LIMIT with an offset, MySQL still reads through all of the skipped rows and only then returns the records past the offset: not the best for performance.
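
A rough way to compare the two approaches, assuming an index on id (table_name, id, and the numbers are placeholders):

EXPLAIN SELECT * FROM table_name ORDER BY id LIMIT 1000000, 10;            -- the server still reads and discards the first million rows
EXPLAIN SELECT * FROM table_name WHERE id > 1000000 ORDER BY id LIMIT 10;  -- with an index on id, the server can seek straight past them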

Source: an answer to this question on the MySQL Forums. Just take note that the question is about 6 years old.

陌路终见情 2024-07-14 12:17:36

I know that this is old but I didn't see a similar response, so this is the solution I would use.

First, I would execute a count query on the table to see how many records exist. This query is fast and normally the execution time is negligible. Something like:

SELECT COUNT(*) FROM table_name;

Then I would build my query using the result I got from count as my limit (since that is the maximum number of rows the table could possibly return). Something like:

SELECT * FROM table_name LIMIT count_result OFFSET desired_offset;

Or possibly something like:

SELECT * FROM table_name LIMIT desired_offset, count_result;

Of course, if necessary, you could subtract desired_offset from count_result to get an actual, accurate value to supply as the limit. Passing the "18446744073709551610" value just doesn't make sense if I can actually determine an appropriate limit to provide.
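
A concrete sketch of the two steps (the table name and the count of 1000 are made up):

SELECT COUNT(*) FROM table_name;               -- suppose this returns 1000
SELECT * FROM table_name LIMIT 1000 OFFSET 5;  -- at most 995 rows can come back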
