Unexpected ConvertTo-Json results? Answer: the default -Depth is 2

Published 2025-01-19 07:31:33


Why do I get unexpected ConvertTo-Json results, why do I get values like System.Collections.Hashtable and/or why does a round-trip ($Json | ConvertFrom-Json | ConvertTo-Json) fail?

Meta issue

Stack Overflow has a good mechanism to prevent duplicate questions, but as far as I can see there is no mechanism to prevent questions that have a duplicate cause. Take this question as an example: almost every week a new question comes in with the same cause, yet it is often difficult to mark it as a duplicate because the question itself is slightly different.
Nevertheless, I wouldn't be surprised if this question/answer itself ends up as a duplicate (or off-topic), but unfortunately Stack Overflow offers no way to write an article that prevents other programmers from continuing to write questions caused by this “known” pitfall.

Duplicates

A few examples of similar questions with the same common cause:

Different

So, where does this “self-answered” question differ from the above duplicates?
It has the common cause in the title and with that it might better prevent repeating questions due to the same cause.


Comments (3)

稀香 2025-01-26 07:31:33


Answer

ConvertTo-Json has a -Depth parameter:

Specifies how many levels of contained objects are included in the
JSON representation.
The default value is 2.

Example

To do a full round-trip with a JSON file you need to increase the -Depth for the ConvertTo-Json cmdlet:

$Json | ConvertFrom-Json | ConvertTo-Json -Depth 9

TL;DR

Probably because ConvertTo-Json terminates branches deeper than the default -Depth (2) with a (.NET) full type name, programmers assume a bug or a cmdlet limitation and do not read the help or About topics.
Personally, I think a string with a simple ellipsis (three dots: …) at the end of the cut-off branch would have a clearer meaning (see also GitHub issue 8381).
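The truncation behavior described above can be sketched as follows. This is an illustrative Python analogy, not ConvertTo-Json's actual implementation: the `serialize` helper is hypothetical, and it replaces branches beyond the maximum depth with the value's type name, the way ConvertTo-Json emits strings like `System.Collections.Hashtable` (the proposal linked above would use an ellipsis instead).

```python
# Hypothetical sketch (not ConvertTo-Json's real code): walk a nested dict
# and replace any branch deeper than max_depth with its type name, similar
# to how ConvertTo-Json emits "System.Collections.Hashtable" at the cutoff.
def serialize(value, max_depth, depth=0):
    if isinstance(value, dict):
        if depth >= max_depth:
            return type(value).__name__  # or "..." per the ellipsis proposal
        return {k: serialize(v, max_depth, depth + 1) for k, v in value.items()}
    return value

nested = {"a": {"b": {"c": 1}}}
print(serialize(nested, 2))  # {'a': {'b': 'dict'}}
```

With `max_depth=2`, the innermost dictionary is cut off and replaced by its type name, mirroring the default -Depth 2 output shown in the example below.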

Why?

This issue often ends up in another discussion as well: Why is the depth limited at all?

Some objects have circular references, meaning that a child object could refer to a parent (or one of its grandparents), causing an infinite loop if it were serialized to JSON.

Take for example the following hash table with a parent property that refers to the object itself:

$Test = @{Guid = New-Guid}
$Test.Parent = $Test

If you execute $Test | ConvertTo-Json, it will conveniently stop at a depth level of 2 by default:

{
    "Guid":  "a274d017-5188-4d91-b960-023c06159dcc",
    "Parent":  {
                   "Guid":  "a274d017-5188-4d91-b960-023c06159dcc",
                   "Parent":  {
                                  "Guid":  "a274d017-5188-4d91-b960-023c06159dcc",
                                  "Parent":  "System.Collections.Hashtable"
                              }
               }
}

This is why it is not a good idea to automatically set -Depth to a large value.
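For comparison, a serializer with no depth limit has to detect such cycles explicitly and fail. Python's json module, shown here purely as an analogy from another ecosystem (not PowerShell behavior), raises an error on the equivalent self-referencing structure:

```python
import json

# Build the same self-referencing structure as the $Test hashtable above.
test = {"Guid": "a274d017-5188-4d91-b960-023c06159dcc"}
test["Parent"] = test  # the child refers back to its parent

try:
    json.dumps(test)  # no depth limit, so the cycle must be detected
except ValueError as error:
    print(error)  # prints: Circular reference detected
```

So a serializer must either cap the depth (as ConvertTo-Json does) or detect cycles and refuse (as json.dumps does); silently looping forever is the one option no serializer can afford.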

掀纱窥君容 2025-01-26 07:31:33


Update: PowerShell 7.1 introduced a warning when truncation occurs. While that is better than the quiet truncation that occurred in earlier versions, the solution suggested below seems much preferable to me.


Your helpful question and answer clearly illustrate how much of a pain point the current default ConvertTo-Json behavior is.

As for the justification of the behavior:

While -Depth can be useful to intentionally truncate an input object tree whose full depth you don't need, -Depth defaulting to 2 and quietly truncating the output amounts to quiet de-facto failure of the serialization from the unsuspecting user's perspective - failure that may not be discovered until later.

The seemingly arbitrary and quiet truncation is surprising to most users, and having to account for it in every ConvertTo-Json call is an unnecessary burden.

I've created GitHub issue #8393 containing a proposal to change the current behavior, specifically as follows:

  • Do not depth-limit the serialization of [pscustomobject] and hashtable / dictionary object graphs themselves (a potential hierarchy of what are conceptually DTOs (data-transfer objects, "property bags"), such as returned from Convert*From*-Json, specifically).

    • By contrast, it does make sense to have an automatic depth limit for arbitrary (non-primitive) .NET types, including when they're encountered as leaves in DTO object graphs, as instances of such types can be object graphs of excessive depths and may even contain circular references; e.g., Get-ChildItem | ConvertTo-Json can quickly get out of hand, with -Depth values as low as 4. That said, it is generally ill-advised to use arbitrary .NET types with JSON serialization: JSON is not designed to be a general-purpose serialization format for a given platform's types; instead, it is focused on DTOs, comprising properties only, with a limited set of data types.

    • This distinction between DTOs and other types is, in fact, employed by PowerShell itself behind the scenes, namely in the context of serialization for remoting and background jobs.

  • Use of -Depth is then only needed to intentionally truncate the input object tree at the specified depth (or, mostly hypothetically, in order to serialize to a deeper level than the internal maximum-depth limit, 100).
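The DTO-versus-arbitrary-type distinction drawn above can be illustrated in Python (again only an analogy; the `Job` class is hypothetical): plain dict/list "property bags" serialize fully with no depth concern, while arbitrary objects are rejected unless the caller explicitly opts in with a conversion rule.

```python
import json

# A DTO-style "property bag": nested dicts/lists of primitive values only.
dto = {"name": "job1", "tags": ["a", "b"], "meta": {"retries": 3}}
print(json.dumps(dto))  # serializes fully; no depth limit is needed

# An arbitrary object is not a DTO; the serializer rejects it by default...
class Job:
    def __init__(self):
        self.name = "job1"

try:
    json.dumps(Job())
except TypeError as error:
    print("rejected:", error)

# ...and is only serialized when the caller opts in with a conversion rule.
print(json.dumps(Job(), default=vars))  # {"name": "job1"}
```

This mirrors the proposal: treat property-bag graphs as safe to serialize in full, and require an explicit decision only for arbitrary platform types.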

抹茶夏天i‖ 2025-01-26 07:31:33


.. | ConvertTo-Json -Depth 100 | ..

The maximum depth is 100 in recent PowerShell versions (mine: 7.3.4; PowerShell 7+ was renamed from PowerShell Core).
