I have a script that runs on macOS. It uses curl to fetch JSON describing the files on a SharePoint site. Everything works fine, but when running in bash or sh, the curl response has the Unicode characters written out in \uXXXX format. For example, ö is \u00f6:
"Name":"\u00f6vning.dotx"
But when running with zsh, it displays correctly. Any idea why, and can I get it working in bash or sh?
#!/bin/sh
url="https://company.sharepoint.com/sites/Testfiles"
folder="TestDocuments"
files=$(curl -s "$url/_api/web/GetFolderByServerRelativeUrl('Documents/$folder')/Files" \
  -H "Accept: application/json")
echo "$files"
When running with curl -v:
GET /sites/Testfiles/_api/web/GetFolderByServerRelativeUrl('Documents/TestDocuments')/Files HTTP/1.1
> Host: company.sharepoint.com
> User-Agent: curl/7.64.1
> Accept: application/json
>
< HTTP/1.1 200 OK
< Cache-Control: private, max-age=0
< Transfer-Encoding: chunked
< Content-Type: application/json;odata=minimalmetadata;streaming=true;charset=utf-8
Here's the complete JSON response:
{
"odata.metadata": "https://company.sharepoint.com/sites/Testfiles/_api/$metadata#SP.ApiData.Files12",
"value": [
{
"odata.type": "SP.File",
"odata.id": "https://company.sharepoint.com/sites/Testfiles/_api/Web/GetFileByServerRelativePath(decodedurl='/sites/Testfiles/Documents/TestDocs/\u00f6vning.dotx')",
"odata.editLink": "Web/GetFileByServerRelativePath(decodedurl='/sites/Testfiles/Documents/TestDocs/övning.dotx')",
"CheckInComment": "",
"CheckOutType": 2,
"ContentTag": "{39C1CD78-3674-49F4-9982-214B33FC03BE},2,5",
"CustomizedPageStatus": 0,
"ETag": "\"{39C1CD78-3674-49F4-9982-214B33FC03BE},2\"",
"Exists": true,
"IrmEnabled": false,
"Length": "48725",
"Level": 1,
"LinkingUri": "https://company.sharepoint.com/sites/Testfiles/Documents/TestDocs/\u00f6vning.dotx?d=w11c1cd78361119f49982214b33fd43be",
"LinkingUrl": "https://company.sharepoint.com/sites/Testfiles/Documents/TestDocs/\u00f6vning.dotx?d=w11c1cd78361119f49982214b33fd43be",
"MajorVersion": 1,
"MinorVersion": 0,
"Name": "\u00f6vning.dotx",
"ServerRelativeUrl": "/sites/Testfiles/Documents/TestDocs/\u00f6vning.dotx",
"TimeCreated": "2021-10-14T13:32:16Z",
"TimeLastModified": "2021-10-14T13:32:16Z",
"Title": "",
"UIVersion": 512,
"UIVersionLabel": "1.0",
"UniqueId": "39c1cd78-3674-49f4-9982-214b33fc03be"
}
]
}
CodePudding user response:
In bash, you need to use the -e option to have echo expand the \u escapes to UTF-8 sequences that your terminal can display.
$ x='"\u00f6vning.dotx"'
$ echo "$x"
"\u00f6vning.dotx"
$ echo -e "$x"
"övning.dotx"
First, we have a "coincidence" that JSON, bash and zsh use the same syntax (\u....) for representing arbitrary Unicode code points in plain ASCII. This is not true for POSIX-compliant shells in general, so I don't think you can expect this to work when running sh, regardless of which shell is actually used. (In practice, it does not work with bash 3.2, but does work with later versions of bash.)

zsh's implementation of echo is XSI-compliant, in that it expands various character sequences by default. (\u itself is not defined by XSI, but zsh includes it in the list of sequences expanded by echo.) (POSIX compliance for echo is fairly loose, stating only that arguments containing backslashes can be handled in an implementation-specific manner.)
bash's implementation of echo is not XSI-compliant by default. -e enables the expansion of backslash sequences, as would setting the xpg_echo shell option.
$ shopt -s xpg_echo
$ echo "$x"
"övning.dotx"
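As an alternative that sidesteps echo's expansion rules entirely, a JSON-aware tool such as jq decodes \uXXXX escapes itself and emits UTF-8 by default. A minimal sketch, assuming jq is installed (the sample object below just mirrors the shape of the SharePoint response):

```shell
# Stand-in fragment of the SharePoint response
files='{"value":[{"Name":"\u00f6vning.dotx"}]}'
# jq decodes JSON \uXXXX escapes and prints UTF-8,
# unless -a/--ascii-output is given
printf '%s\n' "$files" | jq -r '.value[].Name'
```

This also works identically in bash, sh, and zsh, since the decoding no longer depends on the shell at all.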
CodePudding user response:
To have the same command work in both bash and zsh:
printf "%b\n" "\u00f6vning.dotx"