Save PL/pgSQL output from PostgreSQL to a CSV file

Published 2024-08-06 09:50:13

What is the easiest way to save PL/pgSQL output from a PostgreSQL database to a CSV file?

I'm using PostgreSQL 8.4 with pgAdmin III and PSQL plugin where I run queries from.

酷炫老祖宗 2024-08-13 09:50:14
import json

# "conn" is assumed to be an already-open database connection (e.g. psycopg2)
cursor = conn.cursor()
qry = """ SELECT details FROM test_csvfile """
cursor.execute(qry)
rows = cursor.fetchall()

value = json.dumps(rows)

with open("/home/asha/Desktop/Income_output.json", "w+") as f:
    f.write(value)
print('Saved to file successfully')
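Note that this snippet writes JSON, not the CSV the question asks for. A minimal sketch of the CSV variant using Python's csv module (the sample rows below stand in for cursor.fetchall(); file name and columns are invented for illustration):

```python
import csv

# Stand-in for rows = cursor.fetchall()
rows = [(1, "alice"), (2, "bob")]

with open("output.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "name"])  # header row
    writer.writerows(rows)
```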
内心激荡 2024-08-13 09:50:14

JackDB, a database client in your web browser, makes this really easy. Especially if you're on Heroku.

It lets you connect to remote databases and run SQL queries on them.

                                                                                                                                                       Source
jackdb-heroku
(source: jackdb.com)


Once your DB is connected, you can run a query and export to CSV or TXT (see bottom right).


[Image: jackdb-export]

Note: I'm in no way affiliated with JackDB. I currently use their free services and think it's a great product.

眸中客 2024-08-13 09:50:14

Per the request of @skeller88, I am reposting my comment as an answer so that it doesn't get lost by people who don't read every response...

The problem with DataGrip is that it puts a grip on your wallet. It is not free. Try the community edition of DBeaver at dbeaver.io. It is a FOSS multi-platform database tool for SQL programmers, DBAs and analysts that supports all popular databases: MySQL, PostgreSQL, SQLite, Oracle, DB2, SQL Server, Sybase, MS Access, Teradata, Firebird, Hive, Presto, etc.

DBeaver Community Edition makes it trivial to connect to a database, issue queries to retrieve data, and then download the result set to save it to CSV, JSON, SQL, or other common data formats. It's a viable FOSS competitor to TOAD for Postgres, TOAD for SQL Server, or Toad for Oracle.

I have no affiliation with DBeaver. I love the price and functionality, but I wish they would open up the DBeaver/Eclipse application more and made it easy to add analytics widgets to DBeaver / Eclipse, rather than requiring users to pay for the annual subscription to create graphs and charts directly within the application. My Java coding skills are rusty and I don't feel like taking weeks to relearn how to build Eclipse widgets, only to find that DBeaver has disabled the ability to add third-party widgets to the DBeaver Community Edition.

Do DBeaver users have insight as to the steps to create analytics widgets to add into the Community Edition of DBeaver?

云雾 2024-08-13 09:50:13

Do you want the resulting file on the server, or on the client?

Server side

If you want something easy to re-use or automate, you can use Postgresql's built in COPY command. e.g.

Copy (Select * From foo) To '/tmp/test.csv' With CSV DELIMITER ',' HEADER;

This approach runs entirely on the remote server - it can't write to your local PC. It also needs to be run as a Postgres "superuser" (normally called "postgres") because Postgres can't stop it doing nasty things with that machine's local filesystem.

That doesn't actually mean you have to be connected as a superuser (automating that would be a security risk of a different kind), because you can use the SECURITY DEFINER option to CREATE FUNCTION to make a function which runs as though you were a superuser.

The crucial part is that your function is there to perform additional checks, not just by-pass the security - so you could write a function which exports the exact data you need, or you could write something which can accept various options as long as they meet a strict whitelist. You need to check two things:

  1. Which files should the user be allowed to read/write on disk? This might be a particular directory, for instance, and the filename might have to have a suitable prefix or extension.
  2. Which tables should the user be able to read/write in the database? This would normally be defined by GRANTs in the database, but the function is now running as a superuser, so tables which would normally be "out of bounds" will be fully accessible. You probably don’t want to let someone invoke your function and add rows on the end of your “users” table…
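The two checks above can be combined in a single wrapper. The following is only a sketch (the directory, table, and function names are invented for illustration), not code from the answer:

```sql
-- Runs as the function owner (a superuser) regardless of the caller.
CREATE FUNCTION export_foo(filename text)
RETURNS void
LANGUAGE plpgsql
SECURITY DEFINER
AS $$
BEGIN
    -- Check 1: only simple names with a .csv extension, confined to
    -- a fixed directory (no slashes, so no path traversal).
    IF filename !~ '^[a-z0-9_]+\.csv$' THEN
        RAISE EXCEPTION 'invalid filename: %', filename;
    END IF;
    -- Check 2: the exported table is hard-coded, never caller-supplied.
    EXECUTE format(
        'COPY (SELECT * FROM foo) TO %L WITH CSV HEADER',
        '/srv/exports/' || filename
    );
END;
$$;
```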

I've written a blog post expanding on this approach, including some examples of functions that export (or import) files and tables meeting strict conditions.


Client side

The other approach is to do the file handling on the client side, i.e. in your application or script. The Postgres server doesn't need to know what file you're copying to, it just spits out the data and the client puts it somewhere.

The underlying syntax for this is the COPY TO STDOUT command, and graphical tools like pgAdmin will wrap it for you in a nice dialog.

The psql command-line client has a special "meta-command" called \copy, which takes all the same options as the "real" COPY, but is run inside the client:

\copy (Select * From foo) To '/tmp/test.csv' With CSV DELIMITER ',' HEADER

Note that there is no terminating ;, because meta-commands are terminated by newline, unlike SQL commands.

From the docs:

Do not confuse COPY with the psql instruction \copy. \copy invokes COPY FROM STDIN or COPY TO STDOUT, and then fetches/stores the data in a file accessible to the psql client. Thus, file accessibility and access rights depend on the client rather than the server when \copy is used.

Your application programming language may also have support for pushing or fetching the data, but you cannot generally use COPY FROM STDIN/TO STDOUT within a standard SQL statement, because there is no way of connecting the input/output stream. PHP's PostgreSQL handler (not PDO) includes very basic pg_copy_from and pg_copy_to functions which copy to/from a PHP array, which may not be efficient for large data sets.
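Some drivers do expose the stream, though. In Python, for instance, psycopg2's cursor.copy_expert(sql, file) writes the server's COPY TO STDOUT output into a local file object. A hedged sketch (connection setup omitted; the function name is invented):

```python
# "cur" is assumed to be a psycopg2 cursor; copy_expert streams the
# server's CSV output directly into the given file object.
def export_query_to_csv(cur, query, path):
    sql = "COPY ({}) TO STDOUT WITH CSV HEADER".format(query)
    with open(path, "w") as f:
        cur.copy_expert(sql, f)
```

Because the rows are streamed rather than materialized in an array, this stays efficient for large result sets.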

独木成林 2024-08-13 09:50:13

There are several solutions:

1 psql command

psql -d dbname -t -A -F"," -c "select * from users" > output.csv

This has the big advantage that you can use it via SSH, like ssh postgres@host command, enabling you to get the output on your local machine.

2 postgres copy command

COPY (SELECT * from users) To '/tmp/output.csv' With CSV;

3 psql interactive (or not)

>psql dbname
psql>\f ','
psql>\a
psql>\o '/tmp/output.csv'
psql>SELECT * from users;
psql>\q

All of them can be used in scripts, but I prefer #1.

4 pgadmin but that's not scriptable.

美人迟暮 2024-08-13 09:50:13

In the terminal (while connected to the db), set the output to a CSV file:

1) Set field separator to ',':

\f ','

2) Set output format unaligned:

\a

3) Show only tuples:

\t

4) Set output:

\o '/tmp/yourOutputFile.csv'

5) Execute your query:

select * from YOUR_TABLE;

6) Output:

\o

You will then be able to find your csv file in this location:

cd /tmp

Copy it using the scp command or edit using nano:

nano /tmp/yourOutputFile.csv
极度宠爱 2024-08-13 09:50:13

CSV Export Unification

This information isn't really well represented. As this is the second time I've needed to derive this, I'll put this here to remind myself if nothing else.

Really the best way to do this (get CSV out of postgres) is to use the COPY ... TO STDOUT command. Though you don't want to do it the way shown in the answers here. The correct way to use the command is:

COPY (select id, name from groups) TO STDOUT WITH CSV HEADER

Remember just one command!

It's great for use over ssh:

$ ssh psqlserver.example.com 'psql -d mydb -c "COPY (select id, name from groups) TO STDOUT WITH CSV HEADER"' > groups.csv

It's great for use inside docker over ssh:

$ ssh pgserver.example.com 'docker exec -tu postgres postgres psql -d mydb -c "COPY groups TO STDOUT WITH CSV HEADER"' > groups.csv

It's even great on the local machine:

$ psql -d mydb -c 'COPY groups TO STDOUT WITH CSV HEADER' > groups.csv

Or inside docker on the local machine?:

docker exec -tu postgres postgres psql -d mydb -c 'COPY groups TO STDOUT WITH CSV HEADER' > groups.csv

Or on a kubernetes cluster, in docker, over HTTPS??:

kubectl exec -t postgres-2592991581-ws2td -- psql -d mydb -c "COPY groups TO STDOUT WITH CSV HEADER" > groups.csv

So versatile, much commas!

Do you even?

Yes I did, here are my notes:

The COPYses

Using \copy effectively executes file operations on whatever system the psql command is running on, as the user who is executing it. If you connect to a remote server, it's simple to copy data files on the system executing psql to/from the remote server.

COPY executes file operations on the server as the backend process user account (default postgres), file paths and permissions are checked and applied accordingly. If using TO STDOUT then file permissions checks are bypassed.

Both of these options require subsequent file movement if psql is not executing on the system where you want the resultant CSV to ultimately reside. This is the most likely case, in my experience, when you mostly work with remote servers.

It is more complex to configure something like a TCP/IP tunnel over ssh to a remote system for simple CSV output, but for other output formats (binary) it may be better to \copy over a tunneled connection, executing a local psql. In a similar vein, for large imports, moving the source file to the server and using COPY is probably the highest-performance option.

PSQL Parameters

With psql parameters you can format the output like CSV but there are downsides like having to remember to disable the pager and not getting headers:

$ psql -P pager=off -d mydb -t -A -F',' -c 'select * from groups;'
2,Technician,Test 2,,,t,,0,,                                                                                                                                                                   
3,Truck,1,2017-10-02,,t,,0,,                                                                                                                                                                   
4,Truck,2,2017-10-02,,t,,0,,

Other Tools

No, I just want to get CSV out of my server without compiling and/or installing a tool.

北城半夏 2024-08-13 09:50:13

New version - psql 12 - will support --csv.

psql - devel

--csv

Switches to CSV (Comma-Separated Values) output mode. This is equivalent to \pset format csv.


csv_fieldsep

Specifies the field separator to be used in CSV output format. If the separator character appears in a field's value, that field is output within double quotes, following standard CSV rules. The default is a comma.
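That quoting rule is the standard CSV convention, and is easy to check with Python's csv module (a quick illustration, not part of psql):

```python
import csv
import io

# Per standard CSV rules, a field containing the separator is emitted
# inside double quotes; other fields are left bare.
buf = io.StringIO()
csv.writer(buf).writerow(["plain", "has,comma"])
print(buf.getvalue().strip())  # plain,"has,comma"
```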

Usage:

psql -c "SELECT * FROM pg_catalog.pg_tables" --csv  postgres

psql -c "SELECT * FROM pg_catalog.pg_tables" --csv -P csv_fieldsep='^'  postgres

psql -c "SELECT * FROM pg_catalog.pg_tables" --csv  postgres > output.csv
原谅过去的我 2024-08-13 09:50:13

If you're interested in all the columns of a particular table along with headers, you can use

COPY table TO '/some_destdir/mycsv.csv' WITH CSV HEADER;

This is a tiny bit simpler than

COPY (SELECT * FROM table) TO '/some_destdir/mycsv.csv' WITH CSV HEADER;

which, to the best of my knowledge, are equivalent.

那支青花 2024-08-13 09:50:13

I had to use the \COPY because I received the error message:

ERROR:  could not open file "/filepath/places.csv" for writing: Permission denied

So I used:

\Copy (Select address, zip  From manjadata) To '/filepath/places.csv' With CSV;

and it is functioning

時窥 2024-08-13 09:50:13

I'm working on AWS Redshift, which does not support the COPY TO feature.

My BI tool supports tab-delimited CSVs though, so I used the following:

psql -h dblocation -p port -U user -d dbname -F $'\t' --no-align -c "SELECT * FROM TABLE" > outfile.csv
吃颗糖壮壮胆 2024-08-13 09:50:13

psql can do this for you:

edd@ron:~$ psql -d beancounter -t -A -F"," \
                -c "select date, symbol, day_close \
                    from stockprices where symbol like 'I%' \
                    and date >= '2009-10-02'"
2009-10-02,IBM,119.02
2009-10-02,IEF,92.77
2009-10-02,IEV,37.05
2009-10-02,IJH,66.18
2009-10-02,IJR,50.33
2009-10-02,ILF,42.24
2009-10-02,INTC,18.97
2009-10-02,IP,21.39
edd@ron:~$

See man psql for help on the options used here.

夜空下最亮的亮点 2024-08-13 09:50:13

In pgAdmin III there is an option to export to file from the query window. In the main menu it's Query -> Execute to file or there's a button that does the same thing (it's a green triangle with a blue floppy disk as opposed to the plain green triangle which just runs the query). If you're not running the query from the query window then I'd do what IMSoP suggested and use the copy command.

抠脚大汉 2024-08-13 09:50:13

If you have longer query and you like to use psql then put your query to a file and use the following command:

psql -d my_db_name -t -A -F";" -f input-file.sql -o output-file.csv
花心好男孩 2024-08-13 09:50:13

I've written a little tool called psql2csv that encapsulates the COPY query TO STDOUT pattern, resulting in proper CSV. Its interface is similar to psql's.

psql2csv [OPTIONS] < QUERY
psql2csv [OPTIONS] QUERY

The query is assumed to be the contents of STDIN, if present, or the last argument. All other arguments are forwarded to psql except for these:

-h, --help           show help, then exit
--encoding=ENCODING  use a different encoding than UTF8 (Excel likes LATIN1)
--no-header          do not output a header
八巷 2024-08-13 09:50:13

I tried several things but few of them were able to give me the desired CSV with header details.

Here is what worked for me.

psql -d dbname -U username \
  -c "COPY ( SELECT * FROM TABLE ) TO STDOUT WITH CSV HEADER " > \
  OUTPUT_CSV_FILE.csv
掩耳倾听 2024-08-13 09:50:13

Since Postgres 12, you can change the output format :

\pset format csv

The following formats are allowed :

aligned, asciidoc, csv, html, latex, latex-longtable, troff-ms, unaligned, wrapped

If you want to export the result of a request, you can use the \o filename feature.

Example :

\pset format csv

\o file.csv
SELECT * FROM table LIMIT 10;
\o

\pset format aligned
兰花执着 2024-08-13 09:50:13

To Download CSV file with column names as HEADER use this command:

Copy (Select * From tableName) To '/tmp/fileName.csv' With CSV HEADER;
好倦 2024-08-13 09:50:13

I found that psql --csv creates a CSV file with UTF8 characters but it is missing the UTF8 Byte Order Mark (0xEF 0xBB 0xBF). Without taking it into account, the default import of this CSV file will corrupt international characters such as CJK characters.

To fix it, I devised the following script:

# Define a connection to the Postgres database through environment variables
export PGHOST=your.pg.host
export PGPORT=5432
export PGDATABASE=your_pg_database
export PGUSER=your_pg_user

# Place credentials in $HOME/.pgpass with the format:
# ${PGHOST}:${PGPORT}:${PGDATABASE}:${PGUSER}:${PGPASSWORD}

# Populate long SQL query in a text file:
cat > /tmp/query.sql <<EOF
SELECT item.item_no,item_descrip,
invoice.invoice_no,invoice.sold_qty
FROM item
LEFT JOIN invoice
ON item.item_no=invoice.item_no;
EOF

# Generate CSV report with UTF8 BOM mark
printf '\xEF\xBB\xBF' > report.csv
psql -f /tmp/query.sql --csv | tee -a report.csv

Doing it this way lets me script the CSV creation process for automation and allows me to succinctly maintain the script in a single source file.
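If the export is driven from Python rather than a shell script, the same BOM fix is a single encoding argument; a sketch with made-up rows standing in for the query result:

```python
import csv

# "utf-8-sig" prepends the UTF-8 byte order mark (0xEF 0xBB 0xBF),
# which helps spreadsheet tools detect the encoding of CJK text.
rows = [("item_no", "item_descrip"), ("1001", "香港製造")]
with open("report_bom.csv", "w", newline="", encoding="utf-8-sig") as f:
    csv.writer(f).writerows(rows)
```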

大海や 2024-08-13 09:50:13

When your query is too long to write inline, you can use a temporary table like this:

CREATE TABLE tmp_table as (

    SELECT *
    FROM my_table mt
    WHERE ...

);

\COPY tmp_table TO '~/Desktop/tmp_table.csv' DELIMITER ';' CSV HEADER;
DROP TABLE tmp_table;
何以畏孤独 2024-08-13 09:50:13

If you are using AWS, such as AWS RDS, then you can use PGAdmin. What I do is create a temporary table with the desired output:

CREATE TABLE export_descriptions AS SELECT description FROM products WHERE id = 406;

Then in PGAdmin, it has an export to CSV Option:

[Image: pgAdmin export-to-CSV option]

There, you can specify where to save it and in what format:

[Image: pgAdmin CSV export dialog]

And then it saves right to your computer. Remember AWS RDS hides the underlying compute it runs on from you, so you do not have access to the underlying server (EC2 or Fargate instance). In other words, you cannot ssh into it. You can access the postgres cli though and connect to it from PGAdmin and with the new PGAdmin interface, it makes it easy to export to csv.
