In the early days, seafarers did not have the luxury of measuring their ship’s speed with modern GPS (Global Positioning System) devices.
Instead, they measured how fast a ship was sailing by throwing a piece of wood or another floating object over the vessel’s bow and counting the time that elapsed before the ship’s stern passed it.
By the late 16th century, seafarers had begun using a chip log to measure a ship’s speed. Knots were tied at equal intervals along a length of rope, and one end of the rope, with a pie-slice-shaped piece of wood (the “chip”) attached, was thrown behind the ship. As the vessel sailed forward, the line was allowed to pay out freely for a fixed period of time, which was measured with a sandglass. The number of knots that ran out over the stern gave the speed, and a knot came to mean one nautical mile per hour. Thus, a ship travelling at 15 knots is covering 15 nautical miles per hour.
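The chip-log arithmetic can be sketched in a few lines. The 28-second glass and the 47-foot-3-inch spacing between knots are commonly cited historical values (they are not stated in the text above); they were chosen so that each knot counted during one run of the glass corresponds to roughly one nautical mile per hour:

```python
# A sketch of the chip-log arithmetic, assuming the commonly cited
# 28-second sandglass and 47 ft 3 in knot spacing.

NAUTICAL_MILE_FT = 6076.0   # modern nautical mile, in feet
GLASS_SECONDS = 28.0        # running time of the sandglass
KNOT_SPACING_FT = 47.25     # spacing between knots on the log line

def speed_in_knots(knots_counted: float) -> float:
    """Speed implied by the number of knots payed out in one glass."""
    feet_per_second = knots_counted * KNOT_SPACING_FT / GLASS_SECONDS
    return feet_per_second * 3600.0 / NAUTICAL_MILE_FT

print(round(speed_in_knots(15), 2))  # counting 15 knots ≈ 15.0 knots of speed
```

The spacing and glass duration nearly cancel (47.25 ft in 28 s is almost exactly 6,076 ft per hour), which is why sailors could read the count of knots directly as speed.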
For many years, nations disputed the exact length of a nautical mile, which is derived from the circumference of the Earth rather than, like the statute mile used on land, from walking distance. In 1929, the nautical mile was internationally standardized at 1,852 metres, about 6,076 feet.
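The difference between the two miles is easy to quantify. A short sketch, using the 6,076-foot figure from the text and the 5,280-foot statute mile:

```python
# Comparing nautical and statute miles, using the standardized figure
# of 6,076 ft per nautical mile and the 5,280 ft statute mile.

NAUTICAL_MILE_FT = 6076.0
STATUTE_MILE_FT = 5280.0

def knots_to_mph(knots: float) -> float:
    """Convert nautical miles per hour to statute miles per hour."""
    return knots * NAUTICAL_MILE_FT / STATUTE_MILE_FT

print(round(knots_to_mph(15), 1))  # 15 knots ≈ 17.3 mph
```

A nautical mile is thus about 15 percent longer than a statute mile, so a ship making 15 knots is moving a little faster than a car at 15 mph.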