I'm trying to work out how "current_utilization" in the raw output of "statistics show -object resource_headroom_cpu -raw" relates (i.e. how it is calculated/converted) to the percentage shown by the standard "statistics show -object resource_headroom_cpu" output.
 
EXAMPLE:
 
cluster1::*> statistics show -object resource_headroom_cpu
Object: resource_headroom_cpu
Instance: CPU_cluster1-n1
Start-time: 11/9/2018 10:23:20
End-time: 11/9/2018 10:31:39
Elapsed-time: 498s
Scope: cluster1-n1
Counter Value
 -------------------------------- --------------------------------
 current_latency 330us
 current_ops 4215
 current_utilization 56%             <<<<<<<<<<<<<
 
...
 
cluster1::*> statistics show -object resource_headroom_cpu -raw
Object: resource_headroom_cpu
Instance: CPU_cluster1-n1
Start-time: 11/9/2018 10:27:43
End-time: 11/9/2018 10:27:43
Scope: cluster1-n1
Counter Value
 -------------------------------- --------------------------------
 current_latency 20267702036037us
 current_ops 53787499981
 current_utilization 36639819423391%                   <<<<<<  (how is this 56%?)
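My working assumption (unconfirmed, which is why I'm asking) is that the -raw values are cumulative totals, and the displayed percentage is derived from the delta between two raw samples divided by the delta of an associated base counter over the same interval. The function and numbers below are placeholders to illustrate that guess, not actual ONTAP counters:

```python
# Sketch of how I assume a "percent"-type counter is derived from -raw
# samples: raw values are cumulative, so the displayed percentage would be
# 100 * (delta of the counter) / (delta of its base counter).
# All values here are made-up placeholders, not from the real output above.
def percent_from_raw(counter_start, counter_end, base_start, base_end):
    return 100.0 * (counter_end - counter_start) / (base_end - base_start)

# Placeholder example: a counter delta of 56 against a base delta of 100
print(percent_from_raw(1000, 1056, 2000, 2100))  # 56.0
```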
 
 
Similarly, how is the -raw optimal_point_utilization value converted to the standard output value?
 
E.g.:
statistics show -object resource_headroom_cpu:
optimal_point_utilization 88
 
statistics show -object resource_headroom_cpu -raw:
optimal_point_utilization 28078837
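Again, my guess (happy to be corrected) is that if this is an average-type counter, the displayed value is the raw delta divided by the delta of a base/samples counter. A sketch with placeholder numbers (the base counter and all values are assumptions, not taken from the output above):

```python
# Assumed conversion for an average-type counter: displayed value =
# (delta of raw counter) / (delta of its base/samples counter).
# All numbers here are placeholders, not from the real output.
def average_from_raw(counter_start, counter_end, base_start, base_end):
    return (counter_end - counter_start) / (base_end - base_start)

# Placeholder example: raw delta of 880 over 10 base samples
print(average_from_raw(0, 880, 0, 10))  # 88.0
```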