Gigabit Ethernet is a networking technology based on the Ethernet frame format and protocol used in local area networks (LANs). It provides a data transfer rate of one gigabit per second, that is, one billion bits per second. The technology is widely deployed and forms the backbone of many enterprise networks.
Gigabit Ethernet runs over both twisted-pair copper cabling (1000BASE-T) and fiber optic cabling. While the headline speed sounds impressive, the throughput you actually see depends on the capabilities of the computers at each end of the connection.
Even when two computers both have gigabit-capable network interfaces, mechanical hard drives often cannot read and write fast enough to come close to saturating a gigabit connection. Choose solid-state drives over mechanical disk drives to obtain the best performance.
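The comparison above comes down to simple unit conversion: a 1 Gb/s link tops out at 125 MB/s, which a spinning drive may struggle to sustain. A minimal sketch of the arithmetic, using illustrative (assumed) figures for drive throughput:

```python
def link_rate_mb_per_s(gbps: float) -> float:
    """Convert a link rate in gigabits/s to megabytes/s
    (1 gigabit = 1e9 bits; 1 byte = 8 bits; 1 MB = 1e6 bytes)."""
    return gbps * 1e9 / 8 / 1e6

gigabit_mb_s = link_rate_mb_per_s(1)  # 125.0 MB/s ceiling for gigabit Ethernet

# Assumed, representative sustained sequential throughput figures,
# not measurements from any specific drive:
hdd_mb_s = 120.0  # typical 7200 RPM mechanical hard drive
ssd_mb_s = 500.0  # typical SATA solid-state drive

print(f"Gigabit Ethernet ceiling: {gigabit_mb_s:.0f} MB/s")
print(f"HDD can saturate the link: {hdd_mb_s >= gigabit_mb_s}")
print(f"SSD can saturate the link: {ssd_mb_s >= gigabit_mb_s}")
```

With these assumed numbers the SSD comfortably clears the 125 MB/s ceiling while the mechanical drive falls just short, which is why the drive, not the network, is often the bottleneck.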
This causes a lot of confusion about internet connection speeds among some cable company customers. Even though the cabling itself can carry data at that rate, the computers processing the received data often cannot keep up, so real-world transfers feel much slower. 10 Gigabit Ethernet (10GbE) became available in 2002.
Adoption was very slow at first, however, and only began to pick up around 2006. 100 Gigabit Ethernet (100GbE) was first released in 2010, but it has not yet taken off.
The reasons mirror the confusion over speeds described above: most computers are not yet capable of processing data at those rates. Once host hardware catches up with the speed of the cabling, the faster standards should see much wider adoption.