Unexpected ConvertTo-Json results? Answer: it has a default -Depth of 2


Question


Why do I get unexpected ConvertTo-Json results?
And why does a round-trip ($Json | ConvertFrom-Json | ConvertTo-Json) fail?

Meta issue

Stack Overflow has a good mechanism to prevent duplicate questions, but as far as I can see there is no mechanism to prevent questions that have a duplicate cause. Take this question as an example: almost every week a new question comes in with the same cause, yet it is often difficult to mark it as a duplicate because the question itself is just slightly different. Nevertheless, I wouldn't be surprised if this question/answer itself ends up as a duplicate (or off-topic), but unfortunately Stack Overflow offers no way to write an article that keeps other programmers from continuing to write questions caused by this “known” pitfall.

Duplicates

A few examples of similar questions with the same common cause:

  • PowerShell ConvertTo-Json does not convert Array as expected (yesterday)
  • Powershell ConvertTo-json with embedded hashtable
  • powershell “ConvertTo-Json” has messed json format output
  • Nested arrays and ConvertTo-Json
  • Powershell ConvertTo-JSON missing nested level
  • How to save a JSON object to a file using Powershell?
  • Cannot convert PSCustomObjects within array back to JSON correctly
  • ConvertTo-Json flattens arrays over 3 levels deep
  • Add an array of objects to a PSObject at once
  • Why does ConvertTo-Json drop values
  • How to round-trip this JSON to PSObject and back in Powershell

Different

So, where does this “self-answered” question differ from the above duplicates?
It has the common cause in the title and with that it might better prevent repeating questions due to the same cause.


Answer 1:


Answer

ConvertTo-Json has a -Depth parameter:

Specifies how many levels of contained objects are included in the JSON representation.
The default value is 2.

Example

To do a full round-trip with a JSON file you need to increase the -Depth for the ConvertTo-Json cmdlet:

$Json | ConvertFrom-Json | ConvertTo-Json -Depth 9
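
As a slightly fuller sketch (the file names input.json and output.json are placeholders), a lossless file round-trip could look like this:

# Read the file as a single string, parse it, and re-serialize deeper than the default -Depth of 2
$Data = Get-Content -Path .\input.json -Raw | ConvertFrom-Json
$Data | ConvertTo-Json -Depth 9 | Set-Content -Path .\output.json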

TL;DR

Probably because ConvertTo-Json truncates branches deeper than the default -Depth (2) by replacing them with the full (.NET) type name, programmers assume a bug or a cmdlet limitation and do not read the help or about_* documentation.
Personally, I think a string ending in a simple ellipsis (three dots: …) at the cut-off branch would convey the truncation more clearly (see also GitHub issue #8381).
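
A minimal sketch of the pitfall with hypothetical data: a hashtable nested four levels deep, serialized with the default -Depth and then with an explicit -Depth that covers every level:

$Object = @{ Level1 = @{ Level2 = @{ Level3 = @{ Level4 = 'value' } } } }
$Object | ConvertTo-Json            # Level3 is rendered as "System.Collections.Hashtable"
$Object | ConvertTo-Json -Depth 4   # all levels are serialized, including "Level4": "value"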

Why?

This issue often ends up in another discussion as well: Why is the depth limited at all?

Some objects have circular references, meaning that a child object could refer to a parent (or one of its grandparents), causing an infinite loop if it were serialized to JSON.

Take for example the following hash table with a parent property that refers to the object itself:

$Test = @{Guid = New-Guid}
$Test.Parent = $Test

If you execute $Test | ConvertTo-Json, it will conveniently stop at a depth of 2 by default:

{
    "Guid":  "a274d017-5188-4d91-b960-023c06159dcc",
    "Parent":  {
                   "Guid":  "a274d017-5188-4d91-b960-023c06159dcc",
                   "Parent":  {
                                  "Guid":  "a274d017-5188-4d91-b960-023c06159dcc",
                                  "Parent":  "System.Collections.Hashtable"
                              }
               }
}

This is why it is not a good idea to automatically set the -Depth to a large amount.
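
A small sketch building on the $Test object above: raising -Depth does not loop forever, it just repeats the same data in ever deeper nesting before the final truncation, and arbitrary .NET object graphs grow far faster than this:

$Test | ConvertTo-Json -Depth 10    # deeper nesting of the same Guid/Parent pairs, still ending in the type name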




Answer 2:


Your helpful question and answer clearly illustrate how much of a pain point the current default ConvertTo-Json behavior is.

As for the justification of the behavior:

While -Depth can be useful to intentionally truncate an input object tree whose full depth you don't need,
-Depth defaulting to 2 and quietly truncating the output amounts to quiet de-facto failure of the serialization from the unsuspecting user's perspective - failure that may not be discovered until later.

The seemingly arbitrary and quiet truncation is surprising to most users, and having to account for it in every ConvertTo-Json call is an unnecessary burden.

I've created this GitHub issue to request changing the current behavior, specifically as follows:

  • Ignore -Depth for [pscustomobject] object graphs (hierarchies of what are conceptually DTOs (data-transfer objects, "property bags"), such as those returned by ConvertFrom-Json), specifically.

    • It does make sense to have an automatic depth limit for arbitrary .NET types, as they can be object graphs of excessive depth and may even contain circular references; e.g., Get-ChildItem | ConvertTo-Json can quickly get out of hand, with -Depth values as low as 4 (see the sketch after this list).

    • Note that nested collections, including hashtables, are not themselves subject to the depth limit; only their (scalar) elements are.

    • This distinction between DTOs and other types is, in fact, employed by PowerShell itself behind the scenes, namely in the context of serialization for remoting and background jobs.

  • Use of -Depth would then only be needed to intentionally truncate the input object tree at the specified depth, or, if needed, to serialize to a depth greater than the internal maximum-depth limit of 100.
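
As a rough sketch of why an automatic depth limit still makes sense for arbitrary .NET types (the exact sizes vary by system; this only illustrates the growth):

# Serialize a single FileInfo object from Get-ChildItem at different depths and compare output sizes
$File = Get-ChildItem -File | Select-Object -First 1
($File | ConvertTo-Json -Depth 2).Length    # default depth
($File | ConvertTo-Json -Depth 4).Length    # typically much larger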

Make your voice heard there, if you'd like to see this change happen (or disagree).



Source: https://stackoverflow.com/questions/53583677/unexpected-convertto-json-results-answer-it-has-a-default-depth-of-2
