I have had a number of ideas as to what this network could be used for, as it presents the opportunity for cheap, effectively limitless data transfer between geographical nodes of the Google data system. My first idea was that it would provide enough bandwidth to allow the creation of a massive global database using technology like MySQL replication. This idea failed because most relational database systems limit the number of servers that replication can manage, and the resources needed to maintain the replication process would outweigh its benefit.
My second idea was to use the network to create a new type of indexing, which I term Swarm Indexing, in which geographical nodes within the search system maintain the primary index for their region, with backups held elsewhere. Software would then build a meta index describing the content of each region (for example, key words or phrases that are geographically contextualised), and this meta index would be transferred to Google's central query-processing hub. As the meta index would be relatively small, it could be transferred over normal data channels; where the dark fibre comes into play is the speed of connection – when a query is identified as region-contextual, it could be passed to the relevant geographic node for processing. That node would process the search, send back a response, and cache the query for future use. Using the dark fibre connections (at speeds that could reach terabits per second if wavelength division multiplexing is used) would make Google's global network not just faster and more reliable than most others, but, because of its size, able to handle massive traffic demands. This could also be extended to reduce costs for YouTube, more details of which are coming in the next article in this series.
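The routing step above can be sketched in a few lines. This is purely illustrative: the class names, the keyword-overlap matching, and the caching strategy are my own assumptions about how such a scheme might hang together, not a description of any real Google system.

```python
# Hypothetical sketch of Swarm Indexing: regional nodes hold the primary
# index, the hub holds only each node's small keyword meta index and
# forwards region-contextual queries (over dark fibre) to the right node.

class RegionalNode:
    """Holds the primary index for one geographic region."""
    def __init__(self, region, meta_keywords):
        self.region = region
        self.meta_keywords = set(meta_keywords)  # summary shipped to the hub
        self.cache = {}                          # query -> cached response

    def search(self, query):
        if query in self.cache:                  # answered before: reuse it
            return self.cache[query]
        result = f"results for '{query}' from {self.region} index"
        self.cache[query] = result               # cache for future queries
        return result

class CentralHub:
    """Routes queries using only the nodes' meta indexes."""
    def __init__(self, nodes):
        self.nodes = nodes

    def route(self, query):
        terms = set(query.lower().split())
        for node in self.nodes:                  # first region whose meta
            if terms & node.meta_keywords:       # index overlaps the query
                return node.search(query)        # forwarded to that node
        return "handled by the global index"     # no regional match

hub = CentralHub([
    RegionalNode("uk", {"tube", "postcode"}),
    RegionalNode("us", {"zip", "subway"}),
])
print(hub.route("nearest tube station"))  # matches the "uk" meta index
print(hub.route("quantum physics"))       # falls back to the global index
```

The point of the sketch is the division of labour: only the tiny meta index ever travels routinely, while the heavyweight query traffic rides the dark fibre on demand.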
Finally, there is a direct commercial interest in holding dark fibre capacity: secure, cheap, long-distance communications. I propose an automated system in which Google's dark fibre extends to major data centres and provides open peering (that is, free transit) to other data networks at those centres. Google could then monitor and sell data transfer across its dark fibre capacity at a price that undercuts other providers when transfer time is critical, as in the broadcast industry. An automated system could monitor the capacity of each fibre trunk and set a price for a time/bandwidth unit, much as AdWords intelligently prices advertising within Google's search results. To instil trust in clients, Google could mandate that connections across its network use an encrypted VPN, such as OpenVPN.
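The pricing idea could look something like the sketch below: price a bandwidth/time unit cheaply while a trunk is idle, raise it as spare capacity runs out, and cap it just under the competition. The function name, the scarcity curve, and the 10% undercut are all assumptions of mine for illustration, not how AdWords or any real system actually prices.

```python
# Hypothetical trunk pricing: scarcity-based, capped below competitors.

def price_per_unit(base_price, capacity_gbps, used_gbps, competitor_price):
    """Return a price for one time/bandwidth unit on a fibre trunk."""
    utilisation = used_gbps / capacity_gbps
    # Scarcity multiplier: near 1x on an idle trunk, climbing steeply
    # as the trunk fills (floored to avoid division blow-up at 100%).
    scarcity = 1.0 / max(1.0 - utilisation, 0.05)
    price = base_price * scarcity
    # Always undercut the competition while any margin remains.
    return min(price, competitor_price * 0.9)

# A lightly loaded trunk sells near the base price...
print(price_per_unit(1.0, 1000, 100, 5.0))
# ...while a nearly full trunk is capped just below the competitor's rate.
print(price_per_unit(1.0, 1000, 950, 5.0))
```

The appeal of automating this, as with AdWords, is that the price tracks real-time supply without anyone setting tariffs by hand.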
So, the final question is: with this type of system in place, would you trust and use Google to shift your data?