Safer Passwords but Not More Privacy

Google has developed a new device that might save us all from having to remember passwords. I know that passwords are one of my personal bugaboos, and it’s embarrassing to admit how many times I have returned to a website or service and been unable to use it because I could not remember my username and password.

In 2014 Google is going to release a new privacy platform that could be the first big step toward doing away with passwords. The product will be called the YubiKey Neo and will be a USB dongle built by Yubico for Google. The technology involved is called U2F, or Universal Second Factor, and it builds upon earlier work done in the development of smart cards.

The way the YubiKey will work is that when you are using Google Chrome or Gmail, you will log in once to the YubiKey with a username and a PIN. Then, whenever the need for a password arises in those two applications, the YubiKey will verify who you are, and you will no longer need to know passwords. It then becomes impossible for somebody else to pose as you on the Internet unless they have your username, your PIN and physical possession of your YubiKey.
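
As a rough sketch of the challenge-response idea behind such a token (a simplified illustration only — real U2F uses per-site public-key signatures rather than the shared secret used here, and these class names are hypothetical):

```python
import hashlib
import hmac
import os

class Token:
    """Stands in for the hardware key; the secret never leaves the device."""
    def __init__(self):
        self._secret = os.urandom(32)

    def register(self) -> bytes:
        # In real U2F only a public key would be handed to the server.
        return self._secret

    def respond(self, challenge: bytes) -> bytes:
        # Answer the server's challenge using the on-device secret.
        return hmac.new(self._secret, challenge, hashlib.sha256).digest()

class Server:
    """Stands in for the website; it stores what the token registered."""
    def __init__(self, registered: bytes):
        self._registered = registered

    def new_challenge(self) -> bytes:
        return os.urandom(16)

    def verify(self, challenge: bytes, response: bytes) -> bool:
        expected = hmac.new(self._registered, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(expected, response)

token = Token()
server = Server(token.register())
challenge = server.new_challenge()
print(server.verify(challenge, token.respond(challenge)))  # True
```

Without physical possession of the token (and hence its secret), no response will verify, which is why a stolen username and PIN alone are not enough.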

This would be a somewhat limited security platform if it only worked with Google Chrome and Gmail. But Google wants this to be a universal security device and has joined a new coalition called the FIDO (Fast IDentity Online) Alliance. This coalition includes other heavy hitters like MasterCard and PayPal. Google has published the U2F specification and says it is committed to an open source security solution. Google hopes this becomes the standard way to protect your identity.

This kind of technology could make online shopping even safer. And it certainly is a great way to make life easier for people like me who don’t really want to remember the passwords for a hundred different sites and services. But in the end, it really doesn’t help our privacy, just our security.

Let’s face it. We have all bought into a world where we give up our personal data for the ease or enjoyment of using free services like Facebook, LinkedIn or Gmail. It’s a well-known axiom in the industry that the product of all of these free sites is us, their users. These companies make money by using information they gather about you and everybody else on their sites.

They mostly use the data today to feed advertisers, who are using it to get more and more focused in bringing you ads for things you want to buy. But your data is starting to be used in many other new ways. Things you post on Facebook are now searchable on Bing and Google. There are social web connections being made where companies no longer just try to figure out what you like; they also want to know who you know. These large companies are constantly playing around with our data to see if they can find new ways to make money from it, or to make their products more valuable so they can gather even more data.

People just don’t realize, or mostly don’t care, that everything they do and say on free web services is kept, analyzed and used by the web companies to profile them better and know even more about them. The only way to put the genie back in the bottle is to quit social websites cold turkey, and very few people want to do that. So I am certainly glad if the new Google product can do away with passwords, but I don’t take comfort that my privacy is any better protected on the web.

The Explosion of WiFi

WiFi has been around since the mid-90s as a local wireless data connection. WiFi products grew somewhat slowly, with the two primary uses being external WiFi networks used to supply point-to-point data in mostly rural areas, and wireless connections to computers within a home or business. Companies like Cisco, Linksys and others made a decent living selling WiFi transmitters.

But then along came the smartphone and suddenly cellular data offload became a huge business as everybody scrambled to use WiFi data from their landline network rather than pay for more expensive cellular data. All of a sudden WiFi routers became a necessity and most homes that have a landline data connection now also have a WiFi router. In fact, most cable companies, FTTX companies and telcos have built WiFi into their standard data modems.

And as successful as WiFi has been, the spectrum is about to get a lot busier. Consider the following industry trends:

Proliferation of Commercial Hotspots. There has been a proliferation of public WiFi hotspots in recent years. It used to be that when you wanted free WiFi you would head to a Starbucks. But since most businesses now have data connections, many of them have added WiFi for their customers’ convenience. One good indicator of this is the website WeFi.com, which tracks known public WiFi hotspots and conveniently maps them. The site shows many hotspots, but there are many more that do not appear on its maps.

In addition to businesses deploying hotspots, some carriers have started deploying hotspots as part of their business plan. For example, it was recently reported that cable companies have deployed over 300,000 public WiFi hotspots, most of them by Comcast. Comcast is deploying public hotspots in areas where it faces stiff competition from fast landline data, such as areas with Verizon FiOS, and it concentrates hotspots where the public tends to congregate. For instance, Comcast touts that it has completely covered the Jersey shore. Where it can, Comcast sells hotspots to businesses as a money-making venture, but many of its hotspots are free for any Comcast customer to use and have been installed to give the company a competitive marketing advantage over the local competition. Comcast reports that the public is flocking to its hotspots with cell phones and tablets.

Settop Boxes. Many of the settop box makers for cable television are coming out with versions of their boxes that use WiFi to connect and transmit TV from one central hub to other televisions or to tablets, PCs or cell phones. There has already been a trend toward a ‘whole-house’ centralized DVR / settop box that can record and play back multiple shows to any other TV in the home. Settop box manufacturers are counting on the new 802.11ac standard to provide enough bandwidth to transmit cable signals between TVs.

City-wide WiFi networks. A number of municipalities and other entities have been expanding free WiFi networks. Wikipedia now lists 65 US cities that have deployed WiFi networks in some or all parts of the city. For the most part these networks offer free service, although some instead offer WiFi by the hour or day, similar to what is available in airports. I know of cities that do this which are not on the list, so the actual count of cities with some public WiFi coverage is probably quite a bit higher than 65, and I read almost daily about cities that are thinking of adding more. Additionally, many cities have added WiFi networks for first responders and city employees without opening these networks to the public.

The Internet of Things. But the real explosion of WiFi is going to come from the Internet of Things. There are only two reliable ways today for the multitude of IoT devices to communicate with a central hub: WiFi or Bluetooth. It appears that most device makers are leaning toward WiFi as the preferred communications method, since Bluetooth is limited by its short range to the central router. It’s estimated that over the next decade billions of new IoT devices will be deployed and will start sharing the WiFi bandwidth.

There are a lot of concerns that the number of devices that will be using WiFi is going to cause a lot of local interference, which is an issue I will cover in a later blog.

Proposed Changes in Telecom Law

Two bills just made it out of the Communications and Technology subcommittee of the House of Representatives. There are so few bills making it to the floor these days that it’s interesting to see two telecom bills being moved forward, especially since both have bipartisan support.

The first bill is a revised version of the Federal Communications Commission Process Reform Act, H.R. 3675, which replaces an earlier version of the proposed law. This bill would make a number of changes at the FCC. The bill was heralded yesterday as bringing additional transparency to the workings of the FCC.

First, it would change the rulemaking process by requiring all notices of proposed rulemaking (NPRMs) to allow 60 days for public comment before issuance. The FCC currently gives the public an opportunity to comment on the content and structure of an NPRM about one third of the time. Having followed FCC NPRMs for years, this move seems aimed at slowing down the process. The FCC has generally asked for public input before issuing an NPRM on major rule changes, but it also issues a number of NPRMs that are more procedural or that make minor amendments to rules, and this new process would slow down those minor rule changes. Of course, it’s hard to argue against giving the public more input, but in this case the change gives them more input into the document that asks for public input rather than into the actual proceeding. The public has always had the opportunity to respond to any NPRM once issued, but now they would first get a chance to comment on the format and questions asked by an NPRM before it is issued.

The bill would also require a broader review of any rule change that is expected to have an economic impact of $100M or more, with the review coming from other government agencies that want to chime in on the change. This too would add time to the process of making major changes. My reading of the bill is that it sounds good in intent, but my gut tells me it is a backdoor attempt to slow the FCC down from making any major changes. There are those in Congress who have been advocating removing most of the FCC’s responsibilities, and if you can’t stop the FCC, then I guess it’s okay to just slow it down.

The subcommittee also approved H.R. 3674, the Federal Spectrum Incentive Act. This bill is aimed at freeing up more cellular frequency in the lower spectrum ranges. It provides both incentives and processes to move government and other users off of certain frequencies in order to provide more bandwidth for cellular telephone usage.

The main provision of the law is that it would allow government agencies that are currently using spectrum in certain ranges the ability to take a cut of any auction proceeds coming from the sale of that spectrum for commercial use.

A similar bill passed in 2012 gave incentives to TV stations willing to give up their spectrum, using a tool called a reverse auction. This bill would give about 1% of the proceeds of an auction sale to any agency that gives up spectrum, and it handles the mechanics by creating a new Federal Spectrum Incentive Fund that would hold auction proceeds until agencies qualify for the funds.

These incentive funds are a good idea in that they free up frequencies that are badly needed by cellular providers. Most of the frequencies involved are below 1 GHz, and these are the frequencies that can carry a cellular signal a long distance. This spectrum is necessary if the country wants to use cellular frequencies to bring more data to rural areas. In urban areas the carriers can use higher frequencies because the towers there are already fairly close together, but it’s only economical to provide cell coverage in rural areas if the signal can carry a long distance from each transmitter.

The Real Cost of Money

I have often heard it said that municipal bonds are cheaper than bank loans. This is an argument rolled out by incumbent telephone and cable companies all of the time when they are trying to stave off competition by a municipal provider. Many times I’ve heard the argument that governments have an unfair advantage over commercial firms in that they can raise cheaper money through bonds.

But I have been recently working with some municipal entities and also some public / private partnerships and I think that argument is dead wrong. It looks to me like bond money is some of the most expensive money in the market.

It’s always been easy to argue that government money is cheaper because municipal bonds have lower interest rates, and that is true: historically there has been a spread between bond rates and commercial lending rates, and bonds almost always carry the lower rate. But interest rates are not the only cost of money, so comparing the two kinds of borrowing on interest rates alone does not tell the real story.

There are numerous other costs associated with borrowing large amounts of money. It’s easy for the average person to think of loans purely in terms of interest rates, because when somebody uses a credit card or buys a car there are no costs of money other than the interest. But when somebody wants to borrow a large amount, like what is needed for a major telecom project, there are extra costs, much as there are closing costs when you get a mortgage on a house.

The true cost of money is the costs incurred to borrow the money and to administer the payback. Following are examples of some of the extra costs associated with borrowing large amounts of money:

  • Application Fees. For a large borrowing there is typically the requirement for a business plan. But bonds also require an additional document, the equivalent of the offering document prepared when a commercial firm sells securities. These documents can come at a significant cost, sometimes in the hundreds of thousands of dollars.
  • Legal Fees. Both commercial and municipal borrowing include legal fees. But the legal fees associated with bond financing are generally much larger than the costs associated with a commercial loan. Bonds are more complicated, and in some cases can be contested by the public, so there is a lot of additional due diligence done for bonds to make sure they will succeed. And if the bonds are challenged legally there can be a huge legal cost.
  • Referendum Costs. Many kinds of bonds require a vote of the public to be approved and getting a bond question onto a ballot can have a significant cost, particularly if this is not done at the time of a major general election.
  • Capitalized Interest. Bonds generally hand over the entire amount borrowed on day one, and the borrower then has to pay interest on the whole balance from the start of a project. If the revenues associated with a bond don’t start right away (as with telecom projects), it is typical for the borrower to also borrow the first two to four years of interest payments. This can significantly increase the cost of the borrowing. For example, on a $50M project, capitalized interest can range from $5M to $10M, a 10% to 20% adder to the cost of the project. Commercial loans generally use a construction draw, where the borrower only draws the loan as needed, which greatly reduces early-year interest costs.
  • Debt Service Reserve Fund. Many bonds also require a debt service reserve fund. This is an amount of money set aside to pay bondholders in case the borrower is unable some year to make the full bond payments. It’s not unusual for this to be set at a full year’s interest and principal payment, adding another 3% to 5% to the total cost of the borrowing.
  • Bond Insurance. Some bonds also require bond insurance. This is an amount paid up front at closing to an insurance company that will guarantee some payments to bondholders in case of a default. The insurance rates typically run 1% to 2% of the total project.
  • Escrow Fees. Almost all bonds, and some types of commercial loans, require an intermediary escrow company to gather payments monthly in order to make periodic payments to bondholders or lenders. Escrow companies are also used to hold money such as the debt service reserve fund or capitalized interest.
  • Reporting and Administration. Most large loans carry ongoing costs for reporting results to lenders or bondholders in some manner.
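
The capitalized interest arithmetic in the list above is easy to sketch; the 5% bond rate here is an assumption chosen purely for illustration:

```python
def capitalized_interest(principal: float, annual_rate: float, years: float) -> float:
    """Interest on the full balance for the years before project revenues start,
    which bond borrowers typically have to borrow up front as well."""
    return principal * annual_rate * years

project = 50_000_000
for years in (2, 4):
    cap = capitalized_interest(project, 0.05, years)  # assumed 5% bond rate
    print(f"{years} years: ${cap:,.0f} ({cap / project:.0%} of the project)")
# → 2 years: $5,000,000 (10% of the project)
# → 4 years: $10,000,000 (20% of the project)
```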

When you add up all of these costs, it is not unusual for a municipal telecom project to have much higher financing costs than an equivalent commercial project. It’s not hard to find municipal projects where the total cost of financing is 12% to 18% of the project, while it’s rare to find a commercial loan these days where the all-in costs even hit 10%. I’ve recently seen some public / private partnership deals where bringing in commercial money has greatly lowered the cost of borrowing compared to traditional bond financing. So forget interest rates. It’s the whole cost of getting and paying back the money that matters.
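
Putting rough numbers on the cost components listed above shows how a municipal issue lands in that 12% to 18% range. Every figure here is an assumption for illustration, not data from an actual deal:

```python
project = 50_000_000

# Assumed one-time financing costs for a hypothetical $50M bond issue.
costs = {
    "offering document and application fees": 300_000,
    "legal fees": 500_000,
    "capitalized interest (2 years at 5%)": project * 0.05 * 2,
    "debt service reserve (one year P&I, ~4%)": project * 0.04,
    "bond insurance (1.5%)": project * 0.015,
}

total = sum(costs.values())
print(f"all-in financing cost: ${total:,.0f} ({total / project:.1%})")
# → all-in financing cost: $8,550,000 (17.1%)
```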

Faster and Faster Fiber

One thing that is certain is that mankind’s base of knowledge is growing rapidly. One article claims that human knowledge is doubling every thirteen months, and that with the Internet of Things the pace could accelerate to a doubling every twelve hours!

We certainly see evidence of the growth of knowledge because it’s rare any more to go more than a few days without reading about some new breakthrough that is going to improve telecommunications and computing. Following are two new breakthroughs having to do with fiber speeds that I have seen in the last week:

Noise Free Fiber.  A research team led by Xiang Liu of Bell Labs announced that they have been able to use a technique called phase conjugation to greatly decrease the noise, and thus increase the efficiency, of a long-haul fiber transmission. They were able to send a 400 Gigabit per second signal for 12,800 kilometers (nearly 8,000 miles).

This technique is the photonic equivalent of the technology used in noise-cancelling headsets. The scientists send twin beams of light down the same fiber and superimpose the two beams in such a way as to cancel out the distortion and noise that generally interfere with a light transmission. Essentially the second beam of light acts as the inverse of the first; because the two are sent out of phase with each other, the external noise introduced into the light beam is cancelled.
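
A toy numerical model illustrates the cancellation (this is my own simplification, not the Bell Labs implementation): if the twin wave is the phase conjugate of the signal and, to first order, picks up the opposite distortion, then superimposing the two at the receiver recovers a clean signal.

```python
import numpy as np

rng = np.random.default_rng(0)
signal = rng.normal(size=1000) + 1j * rng.normal(size=1000)       # data symbols
distortion = 0.3 * (rng.normal(size=1000) + 1j * rng.normal(size=1000))

# Received copies: the original wave plus distortion, and its phase-conjugate
# twin, which in this toy model accumulates the anti-correlated distortion.
received_wave = signal + distortion
received_twin = np.conj(signal) - np.conj(distortion)

# Coherent superposition at the receiver cancels the distortion entirely
# (exactly here; only approximately in a real fiber).
recovered = (received_wave + np.conj(received_twin)) / 2
print(np.allclose(recovered, signal))  # True
```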

This technique promises to greatly improve the efficiency of long-haul fiber transmissions, allowing much more usable data to make it through the same data path. Sending 400 Gigabits per second over such a long distance is a huge improvement over what is available with current fiber technologies.

Petabit Fiber Networks.  Another set of scientists have taken a different approach to pushing more data through a fiber network. Researchers at the University of Bristol in the UK along with the National Institute for Information and Communications Technology in Japan have reported the creation of a software technique capable of handling huge amounts of data across fiber.

They have been able to control a network of signals running across multiple fibers. This is a huge step because it is the beginning of software-defined networking (SDN) on a big scale. With SDN the amount of bandwidth on the network can be controlled and defined by the software rather than depending only on the lasers, meaning the characteristics of the network can be changed on the fly as demand requires. Current fiber networks are limited by the amount of bandwidth that can be handled by the laser on one fiber pair. But with this new SDN technology many pairs of fiber can be tied together into one big network, much like VDSL bonds together multiple copper pairs to make a bigger data pipe to a home.
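
The bonding idea can be illustrated with a toy striping scheme (a simplification of my own, not the actual SDN control software): a data stream is split round-robin across parallel fiber lanes and reassembled at the far end, so aggregate capacity scales with the number of lanes.

```python
def stripe(data: bytes, lanes: int) -> list[bytes]:
    """Split a byte stream round-robin across `lanes` parallel fibers."""
    return [data[i::lanes] for i in range(lanes)]

def reassemble(chunks: list[bytes]) -> bytes:
    """Interleave the per-lane streams back into the original order."""
    out = bytearray()
    for i in range(max(len(c) for c in chunks)):
        for chunk in chunks:
            if i < len(chunk):
                out.append(chunk[i])
    return bytes(out)

payload = bytes(range(256)) * 4
assert reassemble(stripe(payload, 12)) == payload  # 12 lanes, like NTT's trial
```

By the same arithmetic, twelve fibers at roughly 68 terabits per second each yields the 818 terabit aggregate reported in the NTT trial.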

The SDN technology matters because there have already been trials of combining multiple fibers. Scientists at NTT, the Japanese telecom company, have successfully bonded twelve fiber pairs to create a single network that was able to send a huge data pipe of 818 terabits per second for 450 kilometers (280 miles). That is getting very close to petabit speeds (1,000 terabits per second).

The SDN breakthrough holds out the promise of making fiber networks programmable. This would be similar to what has been done with radios over the last decade in that the frequency of a given transmitter is now programmable instead of being defined strictly by the power source and the antenna structure. Programmable radios can be quickly changed to fit specific bandwidth needs and if the same can be done with fiber networks, network owners will be able to quickly reconfigure networks to meet changing demand.

Comcast and the Cable Industry

Last week I wrote a blog about my poor experience in signing up with Comcast for cable modem service. Within a day after writing that article I saw several web articles about Comcast and the cable industry that I found very interesting.

First, just a few days after I signed up with Comcast they announced their next set of rate hikes. They raised the cost of leasing their cable modems from $7 to $8 per month. I took their cable modem due to the hectic process of moving into a new home, but I plan to replace it soon. It’s hard to justify spending $96 per year to lease a device that I can buy for a little less than that, and the $8 fee is really out of line with Comcast’s costs. Comcast takes advantage of the fact that most people are not technical enough to feel comfortable installing their own cable modem.
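
A quick back-of-the-envelope calculation shows how fast buying pays off (the $90 retail price is my assumption for a comparable modem):

```python
lease_per_month = 8.00   # Comcast's new modem lease rate
retail_price = 90.00     # assumed price of a comparable purchased modem

breakeven_months = retail_price / lease_per_month
print(f"lease cost per year: ${lease_per_month * 12:.0f}")  # $96
print(f"break-even: about {breakeven_months:.0f} months")   # a bit under a year
```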

Comcast also raised the price of basic cable between $1 and $2. Basic cable is the small package that includes the traditional network channels plus some freebies like shopping networks. Comcast is also raising the price of other tiers by $2, meaning an average customer is going to see an increase of $3 – $4.

But what was not included in the announcement is that Comcast is introducing a new $1.50 charge called a ‘broadcast fee’. This supposedly offsets some of the increase in costs they are seeing for basic channels due to retransmission deals with traditional broadcasters. Networks like Fox, NBC, ABC and CBS have been raising rates significantly faster than inflation in recent years, and this charge supposedly offsets some of those fees. However, the real reason Comcast is breaking this out as a separate fee is to be able to advertise lower rates for its cable packages than what customers actually pay. I would hope the FCC will slap them for this practice, because this new fee applies to every cable customer, and not including it as part of basic rates is a huge deception and false advertising.

I also saw a long article from Business Insider that says that “TV is Dying, and Here are the Stats to Prove It.”  It’s an interesting article and worth the read. I won’t repeat all of their statistics, which show in aggregate that the cable industry has tipped over the edge and is now shrinking. Cable companies like Comcast and Time Warner are losing customers a lot faster than the industry as a whole.

Telcos like Verizon FiOS and AT&T UVerse are gaining customers at the expense of the cable companies, but the industry as a whole is losing customers. The article makes a strong case that the problems experienced by the cable industry are mostly economic and that the price of cable is starting to force customers from the market. Some other sources that I have cited in this blog in the past put a lot of the blame on customers having alternative sources of content on the Internet. Certainly it is some of both, but cable is getting so expensive, and continues to see big annual rate increases, that customers are being convinced they need to seek an alternative. Throw in the negative feelings that a lot of customers have about the cable companies’ customer service, and there are more and more people poised to drop the big cable packages.

I remember a time about four or five years ago when you couldn’t turn on a consumer show or open a magazine that didn’t have an article that was telling people to drop their home phones. And people listened to the advice and more than 40% of homes have now dropped landlines. I am now seeing the same sort of discussion about dropping cable TV. Most people are creatures of habit and most households are going to need repeated convincing to get them to drop their cable. But as the media keeps urging them to do so, and as their neighbors tell them it is okay to do so, and as the cable companies keep raising the rates, more and more households will drop cable. Right now the rate of drop-off is relatively slow, but it might well turn into a flood in the same manner that happened with landlines. Many households will never drop cable, but enough of them could do so to transform the industry in a very short period of time.

The current cable model is broken and it’s just a matter of how long it takes for the wheels to come off. The industry is driven by the content providers, who are driven by the demand that their corporate earnings increase year over year. In an environment where the number of subscribers is shrinking, the industry is now at the beginning of a death spiral: rate increases drive away customers, which forces content providers to raise rates again, which drives away more customers, and so on. And death spirals always end with, well, death.
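
The feedback loop can be sketched as a toy simulation; the starting numbers, pass-through markup and churn rate below are illustrative assumptions, not industry data.

```python
subscribers = 100.0      # millions of cable households (assumed)
content_cost = 4_000.0   # total monthly programming cost in $M (assumed)
price = 2 * content_cost / subscribers   # provider marks programming up 2x

for year in range(1, 6):
    content_cost *= 1.08                          # programmers take 8% more per year
    new_price = 2 * content_cost / subscribers    # fixed cost spread over fewer subs
    # assume 0.4% of subscribers cut the cord per dollar of rate increase
    subscribers *= 1 - 0.004 * (new_price - price)
    price = new_price
    print(f"year {year}: ${price:.2f}/month, {subscribers:.1f}M subscribers")
```

Because the shrinking subscriber base must carry the same rising programming bill, each year’s rate increase is larger than the last, which is the spiral the paragraph describes.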

An Update on the Transition to an All-IP Network

I have been closely following the transition of the PSTN to an all-IP network. Every client of CCG who has any voice traffic on their network will be affected by the changes that come out of this proceeding.

On November 19 the new Chairman of the FCC, Tom Wheeler, published a blog talking about his goals for this transition. In it he says he expects the FCC to vote in January on an Order for Immediate Action.

Chairman Wheeler laid out some expectations for the new transition to IP, which he is referring to as the Fourth Network Revolution. Rather than summarize what he said, I quote from his blog:

 “That Order should include recommendations to the Commission on how best to: (i) obtain comment on and begin a diverse set of experiments that will allow the Commission and the public to observe the impact on consumers and businesses of such transitions (including consideration of AT&T’s proposed trials); (ii) collect data that will supplement the lessons learned from the experiments, and (iii) initiate a process for Commission consideration of legal, policy, and technical issues that would not neatly fit within the experiments, with a game plan for efficiently managing the various adjudications and rulemakings that, together, will constitute our IP transition agenda.”

The following day the Technology Transitions Policy Task Force issued a statement giving interested parties instructions on how to meet with the Task Force to supply input.

I get asked by a client every few weeks about the IP transition. I think the Chairman’s blog makes it clear that the process is just starting and that full implementation is still years away. The FCC is now looking at a goal of having the transition fully implemented by 2018.

If the FCC approves the Order for Immediate Action this will authorize the start of a few trials of the IP network. Expect such trials to be between AT&T, Verizon and a few large carriers in just a few markets. Even in those markets it is unlikely that they will want smaller carriers to take part in the trials.

As I have discussed earlier, CCG is very much in favor of the transition to an all-IP network as long as making the change doesn’t trample the rights of our many CLEC clients. Probably the most important right that a CLEC has today is the ability to interconnect with the incumbents at any technically feasible location. This right was established by the Telecommunications Act of 1996 and it is vital that this transition does not change this provision. We fear the large incumbents will use this as an excuse to force everybody to meet them at large statewide or regional IP POPs, at the CLEC’s cost. Such a change could greatly increase the cost to CLECs of interconnection.

We are also concerned that a change to an all-IP network will be an excuse for the RBOCs to try to get out of the historical practice of interchanging local traffic on a bill-and-keep basis. Historically, the calling patterns from rural areas are such that they send a lot more traffic to urban areas than they receive. If such traffic must be brought to IP POPs at the ILEC’s cost, or if it becomes subject to transit charges, the costs to rural ILECs will increase tremendously.

CCG plans to file comments on these two issues with the FCC in this docket, and any of our clients interested in these topics should let us know about any other specific concerns.

Do We Really Need FirstNet?

Many of you probably are not aware that Congress has mandated the creation and construction of a new first responder communications network. The project is being called FirstNet. As I have watched the development of this process I see some fundamental problems with the way that this is being done, which I will discuss below.

The idea for FirstNet came out of the 9/11 Commission. I am sure everybody remembers that during the World Trade Center attacks many of the responders from nearby states were unable to communicate well with the New York City police and fire departments. One of the findings of the 9/11 Commission was a recommendation that first responders be required to coordinate their communications systems.

FirstNet was then created by Congress as part of the Middle Class Tax Relief and Job Creation Act signed in February 2012. This Act established the project, but most of the funding is going to come from proceeds from 2014 spectrum auctions. FirstNet is being overseen by a Board including individuals from public safety; current and former local, state and federal officials; and wireless experts. FirstNet is a new independent entity created within the National Telecommunications and Information Administration (NTIA).

FirstNet will use LTE wireless spectrum for first responder communications. For now FirstNet will provide mobile data and commercial-class voice, but existing systems must be maintained to provide mission-critical voice.

So what are my problems with this idea? After all, being against FirstNet can be made to feel like being against police and firefighters. I certainly think the overall goal of improving coordination among first responders is a great one, but I have the following concerns about it.

Technology. FirstNet has chosen LTE spectrum as the one solution to fit everywhere. However, the topography of the US varies widely, and while LTE might be a good solution for metro areas, it is a pretty poor solution for the hills of West Virginia. Choosing one single nationwide solution overrides the ability of locales to pick the best local engineering solutions.

Further, the LTE solution isn’t guaranteed to be able to supply mission-critical voice, and I don’t know how it could be. Large geographical areas of the US still don’t have adequate cell phone coverage, or even cell phone towers, and the LTE solution won’t work in these rural places. As I have said before, rural cell phone towers were designed to cover roaming cars, not the places where people live, so coverage is often very spotty in exactly the places where first responders will need it.

National Solution to Fix a Regional Problem. Do we really care whether first responders from Arkansas can interface seamlessly with responders in New York or California? As a country we have faced similar issues before and fixed them on a regional basis. For example, when we were seeing increasing electric blackouts, FERC, the federal electric regulatory agency, required the electric grid to fix its problems regionally. It forced electric companies in a given region to cooperate to create the infrastructure that could be used to pinch off blackouts. On 9/11 the issue was that first responders from New Jersey could not communicate with the New York City units. That does not necessarily require a nationwide, one-size-fits-all solution.

Too Costly. This is too costly in several ways. First, it creates a new permanent federal agency to oversee it; long after communications are improved we will still have a new bureaucracy to pay for. Second, it’s not even a complete solution, since it requires localities to maintain their current mission-critical voice systems. Third, it obsoletes a lot of systems that are already working well and forces an expensive upgrade and retrofit on top of already new technology. In the past several swaths of bandwidth have been assigned to first responders, and this plan basically ignores all of the work that has been done with those frequencies. Finally, who pays for all of this in another decade or two when the LTE technology becomes old? It’s bound to happen: as I have discussed in this blog, wireless technology continues to improve, and LTE is not the mobile end game.

Blackmails States into Joining. States are free to do this on their own, but if they elect to do so they must pick up 20% of the cost. Like many other federal programs there is a big stick to coerce states to do things the federal way.

To me this just feels like an expensive big-government approach that uses a hammer to fix a screwdriver problem. Congressmen always complain that it’s hard to balance the federal budget, but it’s hundreds of these kinds of permanent mandates that make it almost impossible to cut federal expenses. Once this new bureaucracy is created it can’t go away. This has the federal government spending huge dollars to take care of something that has always been done locally. If the feds instead created standards for compatibility, this could be done on a local or regional basis with regional tax dollars. I applaud the goal of FirstNet, but I hate the solution.