The Gigabit App Challenge

A discussion I was having recently reminded me of the gigabit challenge held by the City of Chattanooga in 2012. Advertised as the Gig Tank challenge, the City offered a prize to anybody who could develop an application that required gigabit speeds. The City held this challenge as a way to promote the fact that its fiber network offered gigabit speeds to the public. At the time, the City’s EPB was one of only a handful of ISPs, along with Google, a few other municipal networks, and some small ISPs, that offered residential gigabit. The gig challenge had a second goal of attracting young tech entrepreneurs to the City.

The winner of the $100,000 Gig Tank prize was Banyan, a tech start-up from Tampa that showed how gigabit speeds could enable science researchers to work at home with large data sets. It’s been thirteen years since the challenge, and large data is still the only realistic way to take full advantage of the gigabit connections being purchased by millions of homes.

Over that time, we have developed many commercial and industrial applications that use gigabit and faster speeds in data centers, factories, and banks. But we’ve never found a killer app for homes that uses even a reasonable fraction of a gigabit.

About the fastest application that uses a constant 100 Mbps or more of bandwidth is 8K video. Millions of TVs with 8K capability have been sold, but 8K programming is still not widely available online except for some minor content on YouTube and similar sites. It has never become popular because a viewer needs a huge TV to see the difference between 8K and 4K video.

Immersive online gaming can also consume up to 100 Mbps of bandwidth, but most games use far less, from 25 Mbps to 50 Mbps. Game companies constantly strive to find ways to improve visualization while reducing bandwidth needs.

Even after the thirteen years since the challenge, there are not a whole lot of ways to use gigabit speeds other than sending large data files. I recall that the first two customers who bought gigabit speeds from the municipal utility in Lafayette, Louisiana, were doctors who wanted to view MRI scans and related large files without having to run into the office.

With the proliferation of people who work from home, there are a lot of people who routinely send and receive big data files. Content providers send non-compressed video files to be edited. Engineers, researchers, and scientists routinely work with giant files.

ISPs will tell you that if they find a resident using gigabit speeds routinely, the odds are high the customer is operating a server at home. This could be an ecommerce site, but it is more likely related to pornography or pirate video sharing. I’ve been contacted by ISPs many times over the years asking what to do about such customers. (Simple answer: treat them like a large data user. Most won’t pay the higher rates, but those that do can be great customers.)

I’m sure that industry folks are as flabbergasted as I am about why people are buying home speeds faster than one gigabit. A lot of ISPs tell me they are getting respectable penetration rates for two-, three-, or five-gigabit home products.

If you had asked me in 2012 whether there would be gigabit applications by 2025, I’m positive I would have said yes. But in the ensuing years, we’ve developed compression codecs that allow the transmission of huge amounts of data without needing huge amounts of bandwidth.

With that said, I still want my home holodeck. Maybe in the next thirteen years?

6 thoughts on “The Gigabit App Challenge”

  1. Great perspective on the reality of gigabit speeds! For most households, 100-200 Mbps handles streaming 4K on multiple devices, video calls, gaming, and file downloads perfectly well. While multi-gigabit speeds sound impressive, they often feel like a marketing-driven solution searching for a problem. The real-world applications that truly require those speeds, such as large data transfers for professionals or content creators, represent a small fraction of users. Perhaps the focus should be on reliable, consistent speeds rather than headline-grabbing numbers.

  2. Disney+ has calmed down a bit, but I was tracking 600-700 Mbps buffer fill bursts on 4K Disney+ on Apple TV 4K units.

    We do regularly see game updates hit gigabit services at 900 Mbps+ for decent stretches. Some newer games are 150-200 GB updates, which is 20-30 minutes at 900 Mbps. These massive game updates are basically the only reason we recommend higher speed plans to gamers, who otherwise generally only need great latency. No one loves waiting 4-5 hours for a game download on a 100 Mbps service.

    We have very few >1G services. Exactly 2 gamers on 2 Gbps packages, and when they do updates it’s 1.6-1.8 Gbps for 10-20 minutes or whatever.

    I can’t really think of anything else that is quite that heavy of a use, but there are lots of gamers so this is a ‘1st class’ example if you will, not a corner case.
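    [As a sanity check on the download times quoted above, here is a small sketch; the sizes and speeds are the commenter’s figures, and the function name is my own illustration:]

    ```python
    def download_minutes(size_gb: float, speed_mbps: float) -> float:
        """Minutes to download size_gb gigabytes at speed_mbps megabits/second."""
        size_megabits = size_gb * 1000 * 8  # 1 GB = 8,000 megabits (decimal units)
        return size_megabits / speed_mbps / 60

    # A 150-200 GB game update at 900 Mbps lands in the quoted 20-30 minute range,
    print(round(download_minutes(150, 900)))          # -> 22 minutes
    print(round(download_minutes(200, 900)))          # -> 30 minutes
    # but the same update takes hours on a 100 Mbps plan.
    print(round(download_minutes(200, 100) / 60, 1))  # -> 4.4 hours
    ```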

      • No, it was just a VERY aggressive caching effort by the Apple TV Disney+ app. Other apps also had decent-sized bursts, but Disney+ would aggressively cache BIG chunks of 4K content, and the Apple TV with WiFi 6 is capable enough and has enough RAM to store it.

        Other boxes didn’t hit so hard or so long. I did testing on an Apple TV 4K, Roku Ultra, Fire TV, Nvidia Shield, and a mii Android box, and I did WiFi and Ethernet testing on devices that support it. The conclusion was that the bigger the RAM onboard the streaming box, the more it can cache, and the harder it’ll hit the connection. The Apple TV 4K hit the absolute hardest, then the Nvidia, and everything else was about the same. The Fire TV was exposed as a VERY poor WiFi client, which reflects ‘performance’ complaints about streaming on Fire TV vs. others as well.

  3. I have to respectfully disagree with the 100-200 theorem, as I have a household of 6 TVs and many times we get the slow circle of “wait a minute” on streaming services. I do believe that a higher throughput would eliminate this. But I digress; 100-200 for the general user would probably suffice, though in this ever more digital world, speed is king.

    • If you have 6 TVs and you’re getting buffer waits on a 100M service then you have either a very poor 100M service or you have a poor home network/wifi, or you have poor APs.

      Essentially all streaming services will settle into a 1080p stream very easily in 2025 at well under 10 Mbps, so you should be able to have a buffer-free viewing experience on 6 TVs with even a 50 Mbps service. 25 Mbps would do it, but you’d likely see a couple seconds of buffering because of the apps testing for higher bitrates.

      General rules to make things smooth:
      Have a mesh WiFi unit at no more than 35′ increments; 30′ would be better.
      Don’t use Fire TV sticks plugged directly into the HDMI port.
      Don’t use built-in TV streaming apps.
      Choose the higher-end Roku units, or Apple TV, etc.
      Place the streaming box above or below the TV, i.e., don’t let the TV screen get in the path of the WiFi.
      Do not use a Netgear, ever.
      Don’t use traditional range extenders, ever; only mesh units if you can’t run a wire.
      Look for a router that supports smart shapers, smart queues, CAKE, or fq_codel; these features dramatically improve the smoothness of a connection and keep individual services from pushing others out of the way. The Eero 6/6E/7 line is fantastic, and I’ve heard good things about GL.iNet.
      Worth saying again: never Netgear. Also, any of the ‘gaming’ routers tend to be more about marketing than quality, so steer clear.

      The thing to keep in mind here is that the internet is a chain of services and devices that tends to end in a home with throwaway tech. Yes, the upstream service can be overcommitted, have flaws, etc., but the cheapest piece of equipment is almost always the $29 WiFi router or the Fire/Roku TV, and 9 times out of 10, these things give people the impression that their internet sucks. Well, it’s just the last 1-2 links in the chain that usually suck, so start there.
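      [A quick sketch of the per-stream arithmetic behind the 6-TV claim; the ~10 Mbps-per-1080p-stream budget is the figure quoted above, and the function name is my own illustration:]

      ```python
      def concurrent_streams(plan_mbps: float, per_stream_mbps: float = 10) -> int:
          """How many simultaneous streams fit within a plan's bandwidth."""
          return int(plan_mbps // per_stream_mbps)

      print(concurrent_streams(50))     # -> 5 streams at a full 10 Mbps each
      print(concurrent_streams(50, 8))  # -> 6 streams if each settles nearer 8 Mbps
      print(concurrent_streams(100))    # -> 10 streams, plenty of headroom for 6 TVs
      ```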
