Super-fast internet
Whilst the region is excited about the advent of super-fast broadband, it is interesting to note what is happening behind the scenes to ensure the experience lives up to expectations. The sound of downloading at 100Mbps is very exciting to anyone who uses the Internet, but the speeds actually achieved rely on the rest of the Internet being able to deliver data at the same rate.
When organisations invest in server farms and connections to the Internet, they will need to factor in the extra demand from faster end-user connections. On a much bigger scale, moving data between continents can also create bottlenecks that end users experience.
Satellites may seem like an answer, but geostationary satellites orbit at 35,786km above the Earth, and the two-way nature of Internet traffic does not work well over links with such high latency.
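A back-of-the-envelope calculation shows why. The sketch below assumes signals travel at the speed of light and ignores any processing or queuing delay on the ground or on the satellite, so real figures would be slightly higher.

```python
# Rough latency estimate for a geostationary satellite link.
# Assumes signal speed equals the speed of light in a vacuum and
# ignores processing and queuing delays (a lower bound, in effect).

SPEED_OF_LIGHT_KM_S = 299_792   # km per second
GEO_ALTITUDE_KM = 35_786        # altitude of a geostationary orbit

# One hop: ground -> satellite -> ground.
one_way_ms = 2 * GEO_ALTITUDE_KM / SPEED_OF_LIGHT_KM_S * 1000

# A request/response exchange needs two hops: there and back.
round_trip_ms = 2 * one_way_ms

print(f"One-way latency:    {one_way_ms:.0f} ms")
print(f"Round-trip latency: {round_trip_ms:.0f} ms")
```

Even before any real-world overhead, a simple request and reply costs nearly half a second, which is why interactive Internet use over geostationary links feels sluggish.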
Having a 100Mbps connection from your exchange to your house is a waste of time if the server farm you want information from cannot upload it to you at that speed.
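Put another way, the speed you experience along a path is set by its slowest link. A minimal sketch, with purely illustrative link speeds:

```python
def end_to_end_throughput(link_speeds_mbps):
    """The achievable rate along a path is capped by its slowest link."""
    return min(link_speeds_mbps)

# Hypothetical path: a 100Mbps tail into the house, a very fast
# backbone, and a server farm with a congested 25Mbps upload.
print(end_to_end_throughput([100, 10_000, 25]))  # prints 25
```

No amount of extra capacity on the home connection changes the answer; only upgrading the slowest link does.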
The world seems to like using water analogies for the Internet, so I won’t stray from the concept. Imagine that you have a 20mm water pipe from the water tower to your house.
If you install a 100mm water pipe from your water meter to your shower, it will not allow a sudden increase in the flow of water when you are belting out your morning tunes in the shower. The bottleneck in this case is the water main feeding the house.
What is more common in the Internet world is the assumption that not all users on a link are using it to the maximum at the same time. To stay with the water example, imagine a 100mm water main from a water tower servicing 50 houses, each with a 20mm pipe from the water meter into the house.
If only one house turned on the shower, that person would receive the full volume of water, as there would be ample capacity: a 100mm main flowing into a 20mm pipe. Now imagine everyone having a shower at 7.30am. The 50 houses would be sharing the 100mm main, meaning each potential Pavarotti would receive a dribble through an effective 2mm pipe.
The ‘Faster Consortium’ has just unveiled its latest investment in ensuring the world can communicate: the highest-capacity undersea cable built to date, providing 60 terabits per second (Tbps) of bandwidth between the United States and Japan. The major player in the Faster Consortium is Google, which joins other major multinationals developing undersea cables. Microsoft and Facebook announced last month a joint effort to build another cable across the Atlantic.
The ‘Marea’ system will offer speeds of 160Tbps, with construction to start later this year. These speeds are a far cry from Google’s first investment in undersea cable systems: in 2010 the trans-Pacific Unity cable went online with what then seemed a massive 7.68Tbps of capacity.
Antarctica is the poor cousin in the world of communications. It is the only continent yet to be reached by a submarine telecommunications cable.
All data and communication must be relayed to the rest of the world via satellite links. The continent's low population, combined with its extreme weather conditions, has meant there is no feasible solution. At least, no feasible solution from an economic perspective.
Luckily, Australia has sixteen external connections to the rest of the world, but to keep delivering on the NBN promise, I can see more submarine cables being laid in the near future. Once the NBN rollout has been completed, maybe there will be some work for these installers aboard the ships that lay cabling across the oceans of the world.