According to this paper ( https://dcc.ligo.org/LIGO-P150914/public ) they detected the signal first at Livingston, Louisiana and 6.9 ms later at Hanford, Washington. The distance between them according to Wikipedia ( https://en.wikipedia.org/wiki/LIGO ) is 3002 km (OK, the 3002 km distance is measured along the Earth). If the gravitational wave travels at the speed of light, they should detect it 10 ms later (3002 km / 300 000 km/s ≈ 1/100 s = 10 ms). From these data gravity travels at about 435 000 km/s instead of 300 000 km/s. Almost 50% faster than light... Is there any error in my calc?
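Spelling out my arithmetic as a quick Python check (using only the 3002 km and 6.9 ms quoted above):

    d_km = 3002           # Hanford-Livingston separation from Wikipedia
    c_km_s = 299_792      # speed of light in km/s (~300 000 in my rounding)
    dt_s = 0.0069         # measured arrival-time difference (6.9 ms)

    max_delay_ms = d_km / c_km_s * 1000   # ~10.0 ms light-travel time between the sites
    implied_speed_km_s = d_km / dt_s      # ~435 000 km/s, if the wave went straight from one site to the other
    print(max_delay_ms, implied_speed_km_s)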
10 ms is the absolute maximum time difference, and it occurs only if the source lies on the line running through the two detectors. If the source lies on the plane that perpendicularly bisects that line, the difference between the detection times at the two detectors is zero. Any value between those extremes is possible, depending on the geometry.
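As a rough sketch of that geometry (a plane-wave approximation, ignoring the Earth's curvature, with the 3002 km and 6.9 ms from the question): the delay is d/c times the cosine of the angle between the wave's direction of travel and the line joining the detectors, so 6.9 ms corresponds to an angle of roughly 46 degrees.

    import math

    d_km = 3002        # detector separation
    c_km_s = 299_792   # speed of light
    dt_s = 0.0069      # measured arrival-time difference

    # Plane-wave model: dt = (d / c) * cos(theta), where theta is measured
    # between the wave's direction of travel and the Hanford-Livingston line.
    cos_theta = c_km_s * dt_s / d_km
    theta_deg = math.degrees(math.acos(cos_theta))
    print(round(cos_theta, 2), round(theta_deg, 1))   # ~0.69, ~46.4 degrees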
I think your calculation assumes that the waves are traveling parallel to the line connecting Livingston/Hanford. In the diagram below, 's' is the source of the waves.
H-----L-------s
If instead the waves are traveling perpendicular to the line between those two sites, they should be detected at the same time.
   s
  /|\
 / | \
L-----H
Since the measured time difference of 6.9 ms lies between 0 ms and 10 ms, the actual geometry was somewhere in between these two extremes.
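A minimal sketch of the same point (plane-wave approximation, same numbers as in the question): the two diagrams above are the 0-degree and 90-degree extremes, and an arrival angle of about 46 degrees to the detector line reproduces the measured 6.9 ms.

    import math

    d_km, c_km_s = 3002, 299_792   # separation in km, speed of light in km/s

    def delay_ms(theta_deg):
        # Arrival-time difference for a plane wave arriving at angle theta_deg
        # to the line joining the two sites (0 = along the line, 90 = broadside).
        return d_km / c_km_s * math.cos(math.radians(theta_deg)) * 1000

    for theta in (0, 46, 90):
        print(theta, round(delay_ms(theta), 1))   # 10.0 ms, 7.0 ms, 0.0 ms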
Even ignoring the curvature of the Earth, the signal source was not necessarily located on the straight line through the two LIGO sites; it could have been at any angle to that line. For example, if the signal origin had been exactly equidistant from the two LIGO detectors, the time delay would have been zero.
Consider that the black holes merged about 1.3 billion years ago. If gravitational waves travelled 50% faster than the speed of light, they would have passed by Earth long before our species came around, unless the effect of the collision went on for, say, a few hundred million years after the event?
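To put rough numbers on that (a back-of-the-envelope sketch, taking 1.3 billion light-years as the distance and 1.5c as the hypothetical wave speed):

    distance_gly = 1.3    # distance to the merger, in billions of light-years
    speed_factor = 1.5    # hypothetical gravitational-wave speed, as a multiple of c

    travel_time_gyr = distance_gly / speed_factor           # ~0.87 billion years
    lead_over_light_gyr = distance_gly - travel_time_gyr    # ~0.43 billion years
    print(round(travel_time_gyr, 2), round(lead_over_light_gyr, 2))

That is roughly 430 million years of head start over light emitted at the same moment, which is where the "few hundred million years" above comes from.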