GP means string literals. To quote from the spec: "4.3.16 String value: primitive value that is a finite ordered sequence of zero or more 16-bit unsigned integer... Each integer value in the sequence usually represents a single 16-bit unit of UTF-16 text."
JavaScript "strings" are, as the spec says, just arrays of 16-bit integers internally. Since Unicode introduced characters outside the Basic Multilingual Plane (BMP), i.e. those with code points greater than 0xFFFF, it is no longer possible to store every character as a single 16-bit integer. It turns out, though, that a non-BMP character can be stored as a pair of 16-bit integers, a surrogate pair. In a conforming UTF-16 implementation it would be impossible to store one half of a surrogate pair without the other, indexing characters would no longer be O(1), and the length of a string would not necessarily equal the number of 16-bit integers, since it would have to account for the possibility of a two-unit (four-byte) surrogate pair representing a single character. In JavaScript none of these things are true.
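A quick sketch of those points, runnable in Node or a browser console (the pile-of-poo character is just a convenient non-BMP example):

```javascript
const s = "\u{1F4A9}";  // U+1F4A9, a character outside the BMP

// .length counts 16-bit units, not characters:
console.log(s.length);                      // 2

// Indexing is O(1) over 16-bit units, so each index yields half a surrogate pair:
console.log(s.charCodeAt(0).toString(16));  // "d83d" (high surrogate)
console.log(s.charCodeAt(1).toString(16));  // "dca9" (low surrogate)

// And a lone surrogate, invalid in real UTF-16, is a perfectly legal JS string:
const lone = "\uD83D";
console.log(lone.length);                   // 1
```

Newer engines do offer `String.prototype.codePointAt` and iteration by code point, but the underlying model is still a flat sequence of 16-bit units.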
This turns out to be quite a significant difference. For example, it is in general impossible to represent a JavaScript "string" with a conforming UTF-8 implementation, since such an implementation will choke on lone surrogates. If you are building an application that is supposed to interact with JavaScript — for example a web browser — this prevents you from using UTF-8 as the internal encoding, at least for those parts that are accessible from JavaScript.