Home Networking, Verizon, and Lafayette

Food For Thought Department
[What follows is lengthy and starts out with arcana but I think the implications are significant—perhaps especially for Lafayette. I ask that you stay with me…]

According to TelephonyOnline, Verizon is radically upgrading the gateways it installs in homes served by its fiber-to-the-home-based FIOS service. FIOS customers can buy a triple play internet/cable/phone package from Verizon based on technology very similar to that being constructed in Lafayette by the community’s Lafayette Utilities System.

The new in-home devices have a number of interesting characteristics; they will:

  1. bump “speeds over coaxial cable in the home from 75 Mb/s to 175 Mb/s”
  2. “have double the processing power” compared to the current gateways
  3. allow “users to create up to four separate wireless networks, each with different security settings”
  4. allow “remote Verizon technician management”

Understand that an upgrade like this is costly. Customer Premises Equipment (CPE) is expensive. Putting a piece of relatively pricey equipment in every home (on top of the set-top boxes you’ve installed for video and any VOIP equipment) really adds up. CPE is where every company tries to pinch pennies and extend the life of its equipment. So upgrades are rare. And they are never done without a damn good reason.

So why would Verizon invest in new hardware with the hopes of using the new capacities in “the next three to four years?”

My best guess: to ride the wave of big bandwidth in the home… Big bandwidth inside the home has recently emerged as an issue. (I’ve just recently caught on. See my recent post, FTTD (Fiber To The Desk), for some background musing on how really big in-home transfer might be accomplished. What Verizon is doing validates the idea but is pretty small potatoes compared to what is coming. Don’t miss the comments–good stuff there.) Verizon clearly thinks that its current model, which provides 75 Mb/s, will prove inadequate for in-home use in the next 3-4 years. Pause to let that soak in please: the next 3-4 years. Tomorrow.

That is a near-future time frame. Nobody spends the amount of money that Verizon will spend even gradually moving over to new equipment without a very compelling plan to make back their investment. And Verizon knew what it wanted in these boxes. These are not off-the-shelf pieces; they’ve been designed to Verizon’s specs and the company has contracted two independent providers that meet those specs in order to assure itself of supply.

So the difference between the previous standard and the new equipment should strongly hint at what Verizon thinks folks will do that makes the upgrade pay out. Let’s unwrap those specs looking for clues:

The Analysis
—Faster speed, from 75 to 175 Mb/s, means that Verizon is expecting a lot of internal traffic on home networks. That is a lot, especially since Verizon won’t offer you more than 50 megs of connectivity to the internet itself, so it’s not video downloads they’re trying to accommodate (of course not 😉 ).

So for what do you need massive amounts of in-home networking speed? Take a gander at the processing power for a partial answer.

—”Doubling the processing power”–if you dig around a bit (1,2) you’ll see that that phrase refers to moving from a 32 bit chip architecture to a dual core 64 bit chip. That’s the way my fancy laptop is built. That’s real processing power even if the clock speed turns out to be a bit lower. It allows the onboard computer to coordinate more in-home devices. Most obviously, multiple set-top boxes for the cable video service are in the mix; it takes a lot of bandwidth to push HDTV around, especially if one or more set-top boxes is acting as a DVR/video server and pushing video out to secondary screens. In fact the “doubling” phrase clearly understates the added computational capacity. On top of the chip architecture, the gateways can serve out eight (8!) Quality of Service (QOS) controlled channels. At a minimum that means that Verizon can push 8 separate protected streams to multiple TVs. But eight seems like more than homes really need. Of course there are Xboxes, and Wiis, and Apple TVs and the like, in addition to a raft of reasons that users or companies selling to users might want a protected stream… So eight makes sense, if (and only if) you are planning to do something beyond video.
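For the curious, here is a quick back-of-the-envelope sketch of why eight protected streams push past the old 75 Mb/s ceiling. The per-stream bitrate below is my own illustrative assumption, not a Verizon number:

```python
# Back-of-the-envelope check on why 75 Mb/s of in-home coax might not last.
# The per-stream bitrate is an illustrative assumption, not Verizon's figure.

HD_STREAM_MBPS = 20      # assumed bitrate of one MPEG-2 HD stream
PROTECTED_STREAMS = 8    # QOS-controlled channels the new gateway can serve

total_video = HD_STREAM_MBPS * PROTECTED_STREAMS
print(f"8 protected HD streams: {total_video} Mb/s")          # 160 Mb/s
print(f"Fits in the old 75 Mb/s gateway?  {total_video <= 75}")
print(f"Fits in the new 175 Mb/s gateway? {total_video <= 175}")
```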

OK, it’s faster and more powerful…

—What’s with that feature: allow “users to create up to four separate wireless networks, each with different security settings”? Well, that allows you to set up a buncha different networks, some with QOS, some without, some that are slow, some that are public… hmmm, what’s with that? Again, it seems like overkill for current services. Wireless is hard to maintain at the QOS that video requires. They are probably wisely sticking with wires (coax) for that function given the eight protected streams on the wired side. By any measure the capacity for 12 separated streams is pretty astonishing.

Faster, more powerful, many separate streams, eh?…

—Finally: allow “remote Verizon technician management.” That means that Verizon can modify the thing from its headquarters. That alone isn’t too new—most “modems” can be upgraded by the company, or at least reset to clear glitches and given the clearances needed to access the ISP’s network. But in this context “remote management” surely means the ability for Verizon to enable and assign all those streams and to install management software or special access codes on the unit. And to sell that capacity to third parties who would like to use the in-home network that Verizon’s fancy new gateway creates.

Faster, more powerful, many separate streams, that can be controlled by the network owner…extending its control of the last mile into your rooms.

Chew on that for awhile. I did.

The Conclusion
Some folks might think all these bells and whistles are just over-engineering. I can’t believe that a traditional telco like Verizon, one that is already straining its financial capacity to pay for a fiber build, is investing that kind of cash unless it really thinks this amount of capacity will be valuable within 4 years and pay for itself rapidly at that point.

My guess is that Verizon wants to control your home network and all the things that you are shortly going to want to run on it. Things which you might want today if only it weren’t so hard and costly to get the service up and running.

What Verizon wants to sell you directly is only the base. Video and video serving is likely only the beginning from Verizon’s point of view. The corporation has two profit centers currently: data and wireless. (Video shows promise but isn’t there today. Old-style telephone lines are shrinking.) Convincing you to buy more data capacity and their wireless service is a proven cash cow. Wireless’s Achilles heel, and its most persistent irritation, remains coverage. In-house and in-building coverage is a big problem and one that is hard and expensive to solve by popping up more cell towers. The emerging solution is to use “femtocells”—to set up a small base station inside the building that is hooked up to a wired network and provides a mini “tower” that dramatically improves service. An in-home gateway like the ones described could help provision and manage the bandwidth and protocols necessary to easily deploy this service to those who need it. And potentially radically reduce expensive customer turnover.

But, as popular as video and wireless retention has to be with the accountants who like old services and guaranteed returns, the real goal is likely broader: providing a platform for other, secure, protected services. Services which people can be sold but for which each provider currently has to figure out how to provision. A truly capable gateway like the ones described would let a lot of service providers play without installing their own in-home network and/or controller device.

All Verizon would want is, say, 20% off the top.

And providers would probably find that cheap compared to installing their own network.

Here’s an unordered list of things which would be much more commercially viable if the infrastructure/platform were already installed in the home and could be activated and managed remotely:

  • Gaming can eat local bandwidth too.
  • Virtual Private Networks (VPN).
  • Video telephony and intercoms.
  • A local high school sports “network’s” video stream and pay per view.
  • National professional and college sports “channel” versions of streaming video and downloads direct to your DVR.
  • Sophisticated security networks and security cameras.
  • “Telepresence” and other video conferencing/telephony.
  • Allowing all your electrical devices (AC, refrigerator, hot water, etc.) to communicate and lower energy costs.
  • “Smart home” sensor and activity networks.
  • “Satellite radio” channels.
  • A myriad of music rental services could play directly through your connected sound system.
  • And many more… add your own in the comments.

Many of these “long-tail” sorts of uses will be “gotta have it” for some subset of users. The 2-dollar USL sports network will lock in users who will spend the next 200 monthly dollars on Verizon. Suppose only 1% of users just gotta have each of the functions above. That’s 11% of the market right there, locked into your company from the start. I think Verizon could make a pretty penny by controlling the gateway device that makes such home functions easy to buy, install and provide.
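If you like to see the arithmetic spelled out, here is a toy sketch of that lock-in logic. The 1% take rate and the 200 monthly dollars are the assumptions from the paragraph above; the service count is just the length of the list:

```python
# A toy illustration of the "long-tail lock-in" arithmetic: many niche services,
# each a must-have for only a small slice of customers, still add up.
# All figures are the assumptions stated in the text, not real market data.

niche_services = 11          # items on the list above
take_rate = 0.01             # suppose 1% of customers "gotta have" each one
monthly_spend = 200          # dollars a locked-in customer keeps spending

locked_in_share = niche_services * take_rate   # assumes the 1% slices don't overlap
print(f"Share of market locked in: {locked_in_share:.0%}")   # 11%
print(f"Monthly revenue per 10,000 customers: ${10_000 * locked_in_share * monthly_spend:,.0f}")
```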

I think any network could.

The Take Home
Understand that the current incumbents know very well that the only thing that keeps them from becoming cheap, commodified transporters of other people’s expensive bits is their monopoly-based control of the last mile architecture. If we had six connections to the outside world all networks would be running cheaply and competing on how fast and reliably they could provide us with bits. (But that ain’t in the cards… which is why wise communities will follow Lafayette’s lead.) The incumbents’ stranglehold on the last mile is crucial to their profit profile. With it they are Gods of the network age. Without it they are the guys who sweep the roads and fill in potholes—and will be paid appropriately. They’d rather be Gods. That last mile control is the key to being invited in to control the network in your home too, and the key that will give them huge new sources of revenue by controlling the tollbooth that they hope that new gateway will become.

It’s a pretty damn good plan.

Lafayette
No Lafayette Pro Fiber blog post would be complete without a comment on the local implications. IMHO this is another place where Lafayette could lead the way.

Verizon is poised to extend its fiber-based advantage into the home by controlling access to big bandwidth inside the home and by easing the entry of services that critically depend upon accessing a robust home network. In some places the cable company has already partnered with security firms to provide robust networking that rides on the coax installed for cable. Other incumbents surely will see the writing on the wall; they’ll have to follow suit or watch companies like Verizon generate the revenues that will enable them to become more and more dominant. Verizon has the big bandwidth advantage in fiber. But that advantage is purely theoretical until the public can see that the more capable network can provide not only “more of the same” but actually “different and better” services. The gateway can be the key that unlocks that potential.

A gateway or something similar can do the same for Lafayette’s network.

I’ve been hearing a lot of background buzz lately about trying to encourage tech development in and for Lafayette. Meetings in various places, gurus of varying levels flown in to attend the meetings. Dinners in posh private homes. Talk about establishing an “x-prize” for Lafayette. An attempt to organize a meeting for developers. Desultory attempts at secrecy. (My list is surely incomplete.) The usual influentials’ names are bandied about. You know, the works. No one knows whether any of it will come to fruition. But the point is that Lafayette is beginning to wake up to the fact that it will be well served to actually do something to encourage development. A “build it and they will come” attitude only works in the movies. In the real world if you want something to happen you’ve got to do something special to encourage it. Building LITE and LUSFiber and ramping up LCG’s example are great starts but they won’t, alone, be enough to make Lafayette the mecca many of us would like to see it become.

So far most of the Lafayette discussion on this topic has been couched in terms of somehow convincing (or bribing) developers to make us something special. As much as I like the idea—and hope it succeeds—I think we’d have a better chance at success if we instead tried to do something special ourselves.

Like Verizon is evidently doing.

The heart of Verizon’s apparent plan is to make it possible, even easy, for developers to do something great and different. They are poised to eliminate the barriers to “getting things done” by providing the platform over which these things can be accomplished. Verizon lays out all the tools on the table (albeit tools that lock you into their network) and will surely even handle billing for you. But they won’t pay you. Instead they’ll charge you…and your customers. Frankly, that’s a better way. Opening the door is always a better plan than subsidizing the battering ram!

The right box in the house could do for Lafayette what Verizon’s gateway is poised to do in the homes of its FIOS users. But Lafayette’s could be based on ethernet and open IP standards instead of the clunky cable-oriented and proprietary network hardware and protocols that serve Verizon but are unfamiliar to most developers. Lafayette could do a better job of facilitating access to the Local Area Network (LAN) that is the home than any of the competitors is willing to do.

But Lafayette could go further. It could do the same for its MAN (Metropolitan Area Network) by building in the resources that make access easy. Make available storage. Make available the kind of computational power represented by LITE and Abacus. Embed modern protocols. Pack up some servers to enable within network serving of various kinds of data (streaming video, for instance).

In short Lafayette could make its networks, both inside the home and inside the city, playgrounds for the easy, fluid kind of development that developers love.

And we might, eventually even make a few pennies off it. But quickly and surely we could make Lafayette a tech mecca, give LITE a clear purpose in the community, and make LUSFiber a roaring success.

You want to be that shining city on the hill? The path is open.

Net Citizenship and You

Food For Thought: Wouldn’t you rather your master be you?

I’m going to have to lay out an unfamiliar thesis: You, fair reader, are almost certainly not on the internet. Not really. You are a second class citizen who is not allowed to make many of the most basic decisions that full members are free to make; you are a dependent of your modem and the wireline owner it is connected to. Generously: you are a client of AT&T or Cox or ____ (your local duopolist here). Less generously: you are a second class citizen of the internet allowed only the access that Big Daddy allows you. And Big Daddy, as in Tennessee Williams’ play, is more interested in wealth and power than he is the welfare of his dependents.

Full citizenship on the web can be defined simply enough: full citizens can use their connection in any way that they want. They are independent actors who are free to make available or view anything.

That’s not you.

Take a look at your TOS (Terms of Service). Cox and AT&T’s, for instance, do meaningfully differ. But they agree about the essentials that concern us here:

1) You are the client, clients of clients are forbidden; you may not distribute service to others,
2) You can’t talk bad about Big Daddy, (e.g.: Customer is prohibited from engaging in any other activity, whether legal or not, that AT&T determines in its sole discretion, to be harmful to its subscribers, operations, network(s). This includes … or which causes AT&T or the AT&T IP Services to be viewed unfavorably by others.)
3) Free speech? No sucha thing. They get to say what you can say. (e.g.: “Cox reserves the right to refuse to post or to remove any information or materials from the Service, in whole or in part, that it, in Cox’s sole discretion, deems to be illegal, offensive, indecent, or otherwise objectionable.”)
4) No Free Enterprise. You can’t sell things, for that you need the master’s special permission and a (higher-priced) service, regardless of how much traffic you use,
5) It’s not your connection. “Unlimited, always-on” connections are both limited and subject to an abrupt end. AT&T is bizarrely vague while Cox gives clear limits–which are seldom enforced. It’s not your connection; you need to remember that.
6) Your client status is a privilege, not a right. They can kick you to the curb at any time using whatever rationale seems most useful at the moment. (e.g.: Customer’s failure to observe the guidelines set forth in this AUP may result in AT&T taking actions anywhere from a warning to a suspension of privileges or termination of your Service(s). …AT&T’s decisions with respect to interpretation of the AUP and appropriate remedial actions are final and determined by AT&T in its sole discretion.)

7) Lucky 7 Lagniappe clause: Masters don’t have to follow the rules, only clients. (e.g.: AT&T reserves the right, but does not assume the obligation, to strictly enforce the AUP.)

You are in a master-client relationship with your network provider. You are NOT a full citizen of the internet. Your “location,” your IP address, belongs to someone else. They have an assured, static IP. You do not. As long as they own that property you are dependent upon them and they can dictate the terms of that use.

Be aware that this is not the way it was supposed to be. The internet, right down to its IP core, was designed around your freedom to connect.

One way of looking at network citizenship is through the lens of internet protocols and the operation of “the end to end principle.” From wikipedia:

The end-to-end principle is one of the central design principles of the Transmission Control Protocol (TCP) widely used on the Internet as well as in other protocols and distributed systems in general. The principle states that, whenever possible, communications protocol operations should be defined to occur at the end-points of a communications system, or as close as possible to the resource being controlled.

That’s a mouthful. Translated: the internet is designed as a transmission medium that is supposed to be controlled by those on the ends of a communication. You and the person at the other end. A request from one end is simply passed on to the other end—no single positive, centrally-controlled “circuit” exists. No controller stands in the middle. This is in contrast to the underlying design of the phone network with its centralized circuit switching system that designates a circuit for you and holds it open. (We’re talking about protocols now… not physical implementation or the practical experience of users.)
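If you’d like to see the principle in miniature, here is a toy sketch (my own illustration, with made-up addresses): reliability lives entirely in the two endpoints, which keep resending over plain, dumb datagrams until the far end says it got the message. Nothing in the middle is asked to do anything but forward:

```python
# A toy illustration of the end-to-end idea: the two endpoints, not anything in
# the middle, decide what "reliable delivery" means. A sender retransmits over
# plain UDP until the receiver acknowledges. The address and port are arbitrary.
import socket
import threading

ADDR = ("127.0.0.1", 9999)

def receiver():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(ADDR)
    data, peer = sock.recvfrom(1024)      # the network just delivered a datagram
    print("receiver got:", data.decode())
    sock.sendto(b"ACK", peer)             # acknowledging is an endpoint decision
    sock.close()

def sender():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(0.5)
    while True:                           # keep resending until the far end confirms
        sock.sendto(b"hello, other end", ADDR)
        try:
            reply, _ = sock.recvfrom(1024)
            if reply == b"ACK":
                print("sender: delivery confirmed by the other endpoint")
                break
        except OSError:
            continue                      # no ACK yet; retry -- nothing in the middle helps
    sock.close()

t = threading.Thread(target=receiver)
t.start()
sender()
t.join()
```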

Net neutrality battles are raging around the edge of this nascent war. We want to be full citizens of the new order. The incumbents would prefer that we be clients, vassals, and that they be the masters. Right now they are winning. Right now few of us even realize that the current order is not necessary or natural—it was arranged for somebody else’s benefit, not for ours.

It really is that simple.

What we need to recognize is the nature of the war. What we need to be fighting for is ownership of our own connection. For full citizenship. To kill the Master-client relationship that constrains our current access to the network.

Ownership of the network is the most complete solution. Any limits we impose on ourselves are limits that we impose; they are not the dictates of the master. We may start out copying what we know in some ways. But that won’t last.

Lafayette, with its community-owned, fiber-based network utility is a good example of how that will work. From the beginning things will be different here. We’ll have static IP addresses… and a lot of potential will flow from that. We’ll have full access to the speeds and capacity of our own network–that is what the 100 meg intranet is all about. As it becomes more and more obvious that many of the limits imposed by the current owners are not natural and not in the interests of users we’ll change those aspects as well.

That’s the real value of the battle fought and won here in Lafayette.

Worth thinking about…

Laptops in Schools: A tale of two cities

The Gist: Regional cities are getting laptops to school kids. Both in Birmingham, AL, and in Alexandria, LA. I’m envious.

If you are interested in the intersection of computers and education the big news this week is that Birmingham, Alabama has announced its intention to buy 15,000 OLPC (One Laptop Per Child) computers for its elementary and middle school students.

That’s right, the struggling steel city a few states to the east.

The Dream — OLPC and Birmingham
The OLPC program, attuned readers will know, is a product of the fertile imagination of Nicholas Negroponte of the MIT Media Lab. It’s the famous “$100 laptop” that has been widely touted in the media. It’s been grandly promoted as a project to put a computer in the hand of every child in the world. The purpose laid out on the website is only a bit less grandiose:

OLPC is not, at heart, a technology program, nor is the XO a product in any conventional sense of the word. OLPC is a non-profit organization providing a means to an end—an end that sees children in even the most remote regions of the globe being given the opportunity to tap into their own potential, to be exposed to a whole world of ideas, and to contribute to a more productive and saner world community.

It’s not just a nifty computer we’re talking about; it’s a nifty networked computer—which is an entirely different animal. Each machine is capable of using wifi and creating a node in a mesh network—the machines create an ad-hoc network that extends any user’s connection to all the other computers in the neighborhood. That opens up large areas for collaboration with local users and potentially with any internet user world-wide. Spend a moment thinking about that. Of course the reliance on ad hoc mesh networking introduces both speed and reliability issues that the OLPC people don’t talk about. But the integration of networking into the core makes applications which were previously impossible to consider because of the lack of infrastructure pretty easy. Kids won’t need to go offline to work together.

Negroponte’s TED talk is worth a watch if you’d like to get a flavor of the project… and the man. While the ideal of building a machine for every child is a bit grand, less grandly, the OLPC laptop is a tour de force effort to make networked computing technology affordable, durable, power efficient, usable and cheap. In a phrase: a cheap utilitarian commodity. The computing industry hates it. They’re too close to a commodity already.

OLPC also offers a frontal challenge to both the software industry and the educational community. The radical software innovations start with the operating system. In contrast to the “modern” desktop and document metaphor popularized by the Macintosh, the “Sugar” interface operates on a social-activity metaphor (see guidelines) in which the central visual element organizes ongoing activities around the child. (Literally central–the image at right, with the child in the center of their ongoing set of activities, is the equivalent of the desktop in the Sugar interface.) The challenge to the educational community is embodied in that metaphor—the organizing principle of the educational arm of the project is that learning consists not in storing facts but in successfully joining ongoing activities. (Just for the record: this is NOT far out; most modern educational frameworks for learning theory since the 1890’s take a version of this stance. It’s practice that has lagged.)

Looked at in that way one has to wonder whether the florid global ambitions of the OLPC aren’t, in fact, a way to distract observers from the really ambitious project that lurks in the background: to transform modern computation and software so as to drive a fundamental change in educational practices–in learning– in the 21st century. (Now there is a really grandiose, if noble ambition. If that is the hope, then putting the idea that they want to give every child a laptop front and center is a way of being modest.)

That’s what the city down the Interstate is getting into.

The Dream—Alexandria
Now laptops in the schools are not new… Apple, in particular, has a long history of pretty aggressive marketing into schools and once produced a set of rugged laptops (example, eMate 300 at right) tricked out with kid-driven software and extensive online support. Maine was an early adopter and has had a successful laptop program for years. (Negroponte was associated with it in the early years.)

That legacy lives on. Now it has come to Alexandria, Louisiana.

A recent Town Talk editorial lauded a Louisiana/Apple program that has put Macintosh laptops in local schools:

“Turn On” has put laptop computers into the hands of children in 54 of the state’s public schools. In Central Louisiana, Bolton High School students received laptops at the start of the school year. Now Cottonport Elementary School and Mary Goff Elementary School sixth-graders have received them.

Twenty years ago, computer literacy was optional. Not any more. Today it is fundamental to the working world and to an individual’s ability to succeed.

…It is no surprise that Gov. Kathleen Blanco has helped to get the “Turn On” program going in Louisiana. Blanco has been out in front of significant technological initiatives during her tenure, including the Louisiana Optical Network Initiative and the Louisiana Immersive Technologies Enterprise Center.

The Problem
Lafayette prides itself on being a progressive city… going for something like this seems an obvious addition to a city-wide fiber and wireless build. Programs like Maine’s, Birmingham’s, and the one in Louisiana use laptops because they give each child learning tools both at school and at home. Apple’s program requires that schools have a good internet connection in order to be considered—one of its few real requirements. Where these programs run into trouble is with having easy, fast access at home. No school system can mandate that homes have an adequate connection; there is not only the cost, but some homes or apartments in every district simply cannot buy, at any price, a reasonably fast connection.

But bandwidth is essential to the vision. And not having a fast connection available in every home has been THE major stumbling block in pushing the use of network-based learning.

Nation-wide folks like Apple have simply had to compromise the vision. No comprehensive assignments can be made for completion at home. No teacher can assume that learning, practice, and reinforcement are available anywhere but in the school itself. That limitation keeps anyone from seriously designing programs that really encourage the habits of life-long learning that a dynamically changing society has come to demand.

Testing the idea of pervasive, always-on learning hasn’t been possible.

Solutions
OLPC’s ad-hoc mesh networking comes as close as anyone has to proposing a viable solution to the lack of universal, always-on broadband service. A laptop taken home wouldn’t be assured of a connection to either fellow students or the internet. Mobile ad hoc mesh networking only works even half-reliably in the confines of a small area–like a school. Because it implicitly relies on one connection to the larger internet and is limited to dividing that available bandwidth (usually a small fraction of wifi’s potential bandwidth), it is, on its best days, slow. Video “show and tell” using cheap, built-in cameras like those found in Alexandria’s Macintoshes isn’t possible–and a whole range of program and screen sharing capacities are but theoretical dreams given those limits. But the OLPC implementation of networking is the best solution for collaboration that I can imagine without comprehensive support from the surrounding community. After all, the OLPC was designed for use in third world countries where the village simply doesn’t have any way to provide connectivity. Some of the laptop’s most widely praised features result from its not being able to count on reliable electricity; in those places local networking can only come from the computers themselves.

But here, in these United States, electricity isn’t an issue. We could provide robust pervasive wireless access. If we had the will. That is what the wireless municipal dream has been about. (While I have critiqued the simplistic version of that dream it was never the dream I distrusted—only the suitability of the tools to realize it and the unwillingness of some promoters to deal with the weaknesses of their plans.)

A Solution; The Dream — Lafayette
Lafayette will soon have a functional fiber-optic network in every corner of the city. A wireless network hooked into the fiber at every other node will closely follow that build. At the end we’ll see the nation’s first integrated fiber-optic/wifi network with speeds on both sides fed by 100 megs or more of bandwidth. Each wifi node could, if we chose, distribute 50 megs of bandwidth to its local area. That’s more than enough bandwidth for all the kids on the block to use good quality MPEG-4/H.264 video for their collaboration–even at home. Lafayette’s kids could do screen sharing and use whiteboarding applications.

It would be easy to lock a code into the laptops that would give them special speeds and access privileges to school-provided programs. The school system and even individual classes could tunnel their own VPNs (Virtual Private Networks) to provide tools and security. None of this is technically difficult. Access control and provisioning have all been more than adequately developed on university and large corporate campuses.

There’s grant money going begging and imaginative projects that lack grant support only because no one can imagine where the bandwidth to use them will be widely enough available to justify helping out.

With the essential, fast, universal infrastructure in place, the only limits for Lafayette would lie in our imagination and in our willingness to boldly use public assets for the public good.

Worth thinking about, don’t you think?

Good Sense Spreads…Muni Broadband comes of age

Two stories came across my virtual desk yesterday that tell me that the municipal telecom movement is maturing. The time is ripe for Lafayette’s resolution to the disagreements within the camp of those who favor municipal and regional public networks.

Background
The quiet, background argument within that community has been between those who saw WiFi as the obvious way to provide ubiquitous, cheap internet connectivity and those who saw fiber as the only sensible long-term foundation for a municipal telecom utility that would provide public capacity for internet, phone, cable, wireless and other services as they emerged.

I’ve argued that muni networks would need both fiber’s capacity and the mobility of wireless if they hoped to provide a valuable and competitive alternative to the increasingly interlocked camps of private incumbents. The opposition between fiber and WiFi has always been a false one but, for a host of reasons, the only practical way forward is to make the commitment to building a FTTH network and only then build out a wireless network that would piggy-back on the crucial fiber infrastructure. That’s Lafayette’s plan.

With the recent shakeout of the muni wifi market, the hope that cities could get a private provider to build a network without any local risk or investment has been revealed as an impractical one. We’re now getting down to a more realistic appraisal of what cities will have to provide—and when it’s their own money on the line cities appear to be taking a more sophisticated view of what their citizens really need and the crucial role of fiber in providing it. When the “free,” “good enough” alternative evaporates people buckle down and actually think about their needs and how to make sure their investment pays for itself.

In Minnesota and Vermont
In the first of the two stories that indicate that muni telecom is maturing, a city has made the decision to push for a fiber network even though its neighbor is famous for one of the more successful WiFi builds. In the second, a successful fiber build has announced its intention to add wireless.

In Minnesota’s twin cities of Minneapolis and St. Paul, Minneapolis has gotten a lot of publicity for moving forward, apparently on pretty favorable terms, with a WiFi network. Its next-door neighbor, however, isn’t buying in. St. Paul opted for a fiber network:

Minneapolis can keep its Wi-Fi network. St. Paul says Wi-Fi is too slow, and it wants something faster. Much, much, much faster.

On Wednesday, the City Council unanimously approved an advisory committee’s proposal to seek partners for a publicly owned fiber-optic cable network for high-speed Internet access…

St. Paul’s broadband system would be fixed in place, but the 20-member advisory committee said the city could add a Wi-Fi service later through a private provider. That would let the wireless system piggyback on the fiber-optic network, which it would need anyway to connect back to the Internet.

A sidebar succinctly makes the case:

WHY NOT WI-FI?

St. Paul quickly rejected the idea of Wi-Fi, City Council Member Lee Helgen says. Some reasons:

Too slow. Typical Wi-Fi speeds are 1-3 megabits per second, but research indicates average users may need speeds of up to 25 megabits per second by 2012.

It’s flaky. Wi-Fi doesn’t penetrate far into buildings; leaves, rain or snow can interfere with its signal.

In Vermont Burlington’s FTTH system has taken the go-slow approach to success and is now planning its move into wireless.

“We are going to build a wireless network,” said Tim Nulty, BT director, in an interview. “But the best way to build wireless is to build fiber first. That way we already have backhaul [capability] and every telephone pole is a potential antenna site.”

Like many municipalities seeking to deploy their own networks, the challenges in Burlington, the largest city in Vermont with 39,000 residents, were daunting. It had to convince state and city politicians and the town’s voters that the network was a good idea, as well as fend off criticism from established telecom providers. And early financing problems nearly sunk the project.

After picking its way through complicated political and financial minefields, BT developed a city-owned network that will supply Burlington citizens with low-cost triple-play broadband and, when its debt is retired in 15 years, should provide the city with 20% of its general fund.

“BT will be able to pay down its debt very quickly,” said Christopher Mitchell, of the Minneapolis-based Institute for Local Self Reliance. “On the cost side of the equation, Burlington once faced massively growing telecommunications expenditures. It now views the telecommunications sector as an important source of new revenues.”

“…We resisted pressure to do wireless at first,” he [Nulty] said, adding that he expects that BT will one day provide Burlington with a “wireless cloud.” Nulty is beginning to look at various wireless approaches including Wi-Fi, WiMax, mesh, EV-DO, cellular resale, and 700 MHz among others.

BT is reported to have started negotiations with other Vermont cities including Montpelier and Rutland as well as smaller neighboring communities interested in gaining access to the Burlington network.

Dealing realistically with the difficult facts of an endeavor is always a sign of emerging maturity. Muni broadband is coming of age.

VOE: Fixing What is Wrong with Muni WiFi

Voice of Experience Department.

[Lafayette’s decision two years ago in voting to build a Fiber To The Home system rather than a cheaper, less capable wireless system is being validated by current events and the emerging pattern suggests that local citizens might end up owning the nation’s most impressive model of a real, inexpensive, municipal network with modern bandwidth and workable mobility. Read on…]

Business Week picks up on current net buzz on the difficulties encountered by municipal wifi networks and the story does a good job in laying out the current unhappy state of such projects. It’s a sad story for a lot of people in a lot of places.

The static crackling around municipal wireless networks is getting worse.

San Francisco Wi-Fi, perhaps the highest-profile project among the hundreds announced over the past few years, is in limbo. Milwaukee is delaying its plan to offer citywide wireless Internet access. The network build-out in Philadelphia, the trailblazer among major cities embracing wireless as a vital new form of municipal infrastructure, is progressing slower than expected.

My friends in Philly say the network is pretty near useless where it is up—service is beyond spotty and it comes and goes unpredictably. The boards tell a similar story in Corpus Christi where Earthlink, a private provider, had bought the municipal network with a promise of upgrades. Google’s hometown Mountain View network isn’t anything to brag on either. The problem isn’t with public networks; difficulties seem to be hitting private and public muni wifi WANs (Wide Area Networks) pretty much equally.

There has been a lot of doom and gloom about the problems muni wifi networks are encountering (the Business Week article among them) and there has been the inevitable reaction to that on the part of advocates pointing out the immaturity—and naivete—of the original business plans. Business Week does, at the end of the story, note that a more mature business plan relies on the local city government being involved:

To make the business more profitable, Wi-Fi service providers are trying to pass more of the cost to the cities. “There’s no one that I am aware of right now who’d build a network without the city as a paying customer,” says Lou Pelosi, vice-president for marketing at MetroFi, which six months ago stopped bidding for projects unless the city agreed to become the network’s anchor tenant.

Advocates imply that a naive business plan is all that is wrong with the current crop of wide area wifi networks. Would that it were so.

The doom and gloom is overstated. But the truth is the version of muni wireless that emphasized cheap (or free) residential service using a wireless mesh to minimize costs was always a castle built on shaky technical grounds. From the beginning the fundamental concept was that you’d take a single expensive connection to the net and divide it up like the loaves and fishes between many users and still end up with sufficient connectivity to feed the masses. Thinking that way was hoping for the sort of miracle that doesn’t occur in our daily world. An analogy might be taking your home connection and “sharing” it with most of your neighborhood. That might work at times. But service could never be very fast or reliable. (Yes, it’s more complicated; I know–but that’s a fair analogy.) Additional problems having to do with the nature of the spectrum allocated to wifi (short range power and a frequency that has trouble cutting through vegetation or walls) added the limitations of physics to the questionable network design decisions.

Those problems can be overcome. It’s not even a twelve step program. Two steps will do.

Step One is to abandon the idea that a wifi network will ever work well as a person’s primary, reliable, home connection to the full richness of the network.

Rock solid reliability is not in the cards for wifi–and affordable access to a reliable always-on connection is a prerequisite for full participation in the emerging digital culture.

You will need a hardwired, preferably Fiber To The Home connection if you plan to make full, reliable, consistent use of downloadable video, cable TV, Voice over IP, security alarms, medical monitoring and the like. With a fiber connection every individual can easily and cheaply provision their own in-home wifi network if wireless suits their style.

Any community that takes that stand abandons at one blow all the unrealistic demands that wifi technology simply cannot fulfill. Concentrate on ubiquitous local coverage, emphasize mobility and help people understand that cell phone levels of reliability are the best that can be hoped for. (That level of service would be a huge boon even without the unrealistic expectations: with ubiquitous coverage I could get a connection anywhere while on the go. I might not be able to do everything with it I could do at home–but I could do almost anything I can imagine that I would want to do on the run. Including, in the best case, which I’ll get to below, mobile, albeit cell quality, VOIP.)

I do understand, and deeply sympathize with, the hope that cheap wifi could help close the digital divide. There still may be a role for it there if the bandwidth issues can be overcome (again, see below). But the reliability issues, arguably, are fundamental, and the hacked-up solutions necessary are unstable and too technically exacting to expect large populations to manage on their own. Pretending that wireless connectivity is the same as wired connectivity is profoundly misleading—and is a recipe for creating a second-class version of net usage where poorer users simply can’t rely on the net being there and so aren’t able to trust it fully enough to make it as central as their better-off brethren do. Imagine what would have happened to telephone usage in our culture if the well-off got good, reliable, always-on wired phone service but “other people” got cheap, spotty, poor “radio” service on “garbage” bandwidth that might or might not work on any given day or location. That sort of divided service model was avoided in our phone history, and if we try it today it will cause trouble downstream that I, for one, would rather avoid. The real solution to too-expensive wired network connections is cheap wired network connections. And that is the solution that any conscientious community should seek. [I am grateful that that is the solution Lafayette has sought—LUS proposes to narrow the digital divide by making service significantly cheaper.]

With cheap, wired, reliable, big broadband available in every home the threshold moves to making some form of connection available on every corner (ubiquity) and making it available while you are on the move (mobility). That’s what wireless networks are good for—and why cell phones, as unreliable as they are, remain useful and hence popular.

Step Two is to abandon the belief that wireless mesh networks can be used to turn an expensive wired connection into many cheap wireless ones.

It can’t; only Christ could manage the miracle of the loaves and fishes.

Build, instead, on the real virtues of wireless networks: ubiquity and mobility. Do your absolute best to minimize its weaknesses by making it as fast and reliable as possible within the confines set by physics and federal regulation.

Abandoning the idea that one connection to the wired broadband internet can serve many users over a broad area well is the key to succeeding. Instead of designing the wireless network so that each wired connection feeds five, six, or more wifi access points, limit the ratio of access points to internet connections to 1:1. This makes for much less sharing of limited bandwidth among users, greater reliability, and dramatically reduced “latency” (the lag caused by multiple jumps that makes VOIP phones impractical on most muni networks).

Better yet, attach your wifi network directly to a full throttle fiber network. Feed the wireless protocols their entire capacity. (Outside of a few university or corporate campuses very few of us have ever used a wireless network that ran as fast as the wifi itself could go. The usual limiting factor is the wired network that supplies bandwidth to the wifi. If Cox or AT&T only gives me 5 megs of wired bandwidth to my access point then the 54 Mbit/s that is theoretically possible is limited to at most 5 Mbit/s. You’ll never see the other 49 Mbit/s no matter what it says on the side of the box.) A fiber network can easily supply a minimum of 100 Mbit/s to the wifi access point; split that 100 once to a second wifi access point and something close to the full 50 megs of bandwidth that wifi is capable of could actually be seen on the street. Even split among a sizeable group of users on two nodes that would be plenty fast enough to support excellent quality VOIP with no discernible lag, great data connections, and many, many extras. Even if it turned out to be less reliable and a bit slower in use than its wired counterparts, the virtues of ubiquity and mobility would be there, and our willingness to use cell phones proves that we find this trade-off acceptable.
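For those who want the arithmetic, here is a rough sketch of that comparison. All the figures are illustrative assumptions (802.11g’s nominal 54 Mbit/s, roughly 50 Mbit/s of usable throughput, a handful of users per node), not measurements of any real network:

```python
# A rough sketch of the arithmetic behind Step Two. The numbers are illustrative
# assumptions; the point is the ratio between the two designs, not the exact values.

WIFI_REAL = 50               # Mbit/s of usable wifi throughput, roughly

def per_user(backhaul_mbps, aps_sharing_it, users_per_ap):
    """Bandwidth each user sees: the backhaul split across APs, capped by the radio,
    then split again among the users on each AP."""
    per_ap = min(backhaul_mbps / aps_sharing_it, WIFI_REAL)
    return per_ap / users_per_ap

# Mesh-style design: one 5 Mbit/s DSL/cable feed shared by six access points.
print("mesh, 5 Mbit/s feed, 6 APs:", per_user(5, 6, users_per_ap=4), "Mbit/s per user")

# Fiber-fed design: 100 Mbit/s to the access point, split at most once.
print("fiber, 100 Mbit/s feed, 2 APs:", per_user(100, 2, users_per_ap=4), "Mbit/s per user")
```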

A wifi network built this way would be as much superior to its wireless competitors as the fiber network would be to its wireline competitors.

But getting to that dream requires abandoning unrealistic expectations…and starting with a fiber network running down every street.

Lafayette is positioned to realize the ultimate dream: a cheap, blindingly fast, reliable, fiber-optic connection made available to every home and, based on that, a solidly architected, cheap, uniquely fast municipal wireless network that is demonstrably better than any muni wifi network in the nation.

Living large in Lafayette.

(Thanks go out to reader Scott who forwarded the story.)

PhotoSynth & Web 2.0—Worth Thinking About

Sunday Thought Dept.

Warning: this is seriously different from the usual fare here but fits roughly into my occasional “Sunday Thought” posts. I’ve been thinking hard about how to make the web more compelling for users and especially how to integrate the local interests that seem so weakly represented on the internet. As part of that exploration I ran across a research program labeled “PhotoSynth.” It offers a way to integrate “place” into the abstract digital world of the web in a pretty compelling way if your interest is in localism: it automatically recreates a 3 dimensional world from any random set of photographs of a scene and allows tags and links to be embedded in them. Once anyone has tagged a local feature (say the fireman’s statue on Vermillion St.) or associated a review with a picture of Don’s Seafood downtown, everyone else’s images are, in effect, enriched by their ability to “inherit” that information.

But it seems that it is a lot more than just the best thing to happen to advocates of web localism in a long time. It’s very fundamental stuff, I think, with implications far beyond building a better local web portal…. Read On…

—————————-
Photosynth aka “Photo Tourism” encapsulates a couple of ideas that are well worth thinking hard about. Potentially this technical tour de force provides a new, automated, and actually valuable way of building representations of the world we live in.

This is a big deal.

Before I get all abstract on you (as I am determined to do) let me strongly encourage you to first take a look at the most basic technical ideas behind what I’m talking about. Please take the time to absorb a five and a half minute video illustrating the technology. If you’re more a textual learner you can take a quick look at the text-based, photo-illustrated overview from the Washington State/MS lab. But I recommend trying the video first.

(Sorry this video was removed by YouTube).

You did that? Good; thanks….otherwise the rest will be pretty opaque—more difficult to understand than it needs to be.

One way to look at what the technology does is that it recreates a digitized 3D world from a 2D one. It builds a fully digital 3D model of the world from multiple 2D photos. Many users contribute their “bits” of imagery and, together, they are automatically interlinked to yield, out of multiple points of view, a “rounded” representation of the scene. The linkages between images are established on the basis of data inside the image–on the basis of their partial overlap—and ultimately on the basis of their actually existing next to each other—and this is done without the considered decisions of engaged humans.
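If you want a feel for how software can discover that two photos overlap without a human telling it so, here is a minimal sketch using OpenCV’s ORB features. To be clear, this is not PhotoSynth’s actual pipeline (which uses SIFT-style features plus structure-from-motion to recover the 3D geometry); it just shows the core matching idea, and the filenames are placeholders:

```python
# A minimal sketch of matching photos by shared content: count local features that
# appear in both images. Not PhotoSynth's real pipeline, just the core idea.
import cv2

def overlap_score(path_a, path_b, ratio=0.75):
    """Count ORB feature matches that survive the ratio test between two photos."""
    img_a = cv2.imread(path_a, cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread(path_b, cv2.IMREAD_GRAYSCALE)
    orb = cv2.ORB_create(nfeatures=2000)
    _, desc_a = orb.detectAndCompute(img_a, None)
    _, desc_b = orb.detectAndCompute(img_b, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    matches = matcher.knnMatch(desc_a, desc_b, k=2)
    good = [pair[0] for pair in matches
            if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance]
    return len(good)

# Photos with many surviving matches probably show the same scene from nearby viewpoints.
print(overlap_score("statue_front.jpg", "statue_side.jpg"))
```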

Why is that a big deal?

Because it’s not all handmade. Today’s web is stunningly valuable but it is also almost completely hand-made. Each image or word is purpose-chosen for its small niche on a web page or in its fragment of context. The links that connect the web’s parts are (for the most part) hand-crafted as well and represent someone’s thoughtful decision. Attempts to automate the construction of the web, to automatically create useful links, have failed miserably—largely because connections need to be meaningful in terms of the user’s purpose and algorithms don’t grok meaning or purpose.

The web has been limited by its hand-crafted nature. There is information (of all sorts, from videos of pottery being thrown, to bird calls, to statistical tables) out there we can’t get to—or even get an indication that we ought to want to get to. We rely mostly on links to find as much as we do and those rely on people making the decision to hand-craft them. But we don’t have the time, or the inclination, to make explicit and machine-readable all the useful associations that lend meaning to what we encounter in our lives. So the web remains oddly thin—it consists of the few things that are both easy enough and inordinately important enough to a few of our fellows to get represented on the net. It is their overwhelming number and the fact that we are all competent in our own special domains that makes the web so varied and fascinating.

You might think that web search, most notably the big success story of the current web, Google’s, serves as a ready substitute for consciously crafted links. We think Google links us to appropriate pages without human intervention. But we’re not quite right—Google’s underlying set of algorithms, collectively known as “PageRank,” mostly just ranks pages by reference to how many other pages link to those pages and weights those by the links from other sites that those pages receive… and so on. To the extent that web search works it relies on making use of handmade links. The little fleas algorithm.™ It’s handmade links all the way down.
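For the technically inclined, here is a toy version of that idea: a page’s rank is fed by the ranks of the pages that link to it, which is why the whole scheme ultimately rests on handmade links. This is the textbook power-iteration form of the calculation, not Google’s production system:

```python
# A toy PageRank: each page's score is built from the scores of the pages linking
# to it, so the ranking ultimately rests on hand-crafted links.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                     # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                for target in outlinks:          # each link is a "vote" for its target
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# Hypothetical hand-crafted links among four pages.
web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
print(pagerank(web))   # C collects the most votes and ranks highest
```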

Google was merely the first to effectively repackage human judgment. You’ve heard of web 2.0? (More) The idea that underpins that widely hyped craze is that you can go to your users to supply the content, the meaning, the links. That too is symptomatic of what I’m trying to point to here: the model that relies solely on the web being built by “developers” who are guessing their users needs has reached its limits.

That’s why Web 2.0 is a big deal: The folks designing the web are groping toward a realization of their limits, how to deal with them, and keep the utility of the web growing.

It is against that backdrop that PhotoSynth appears. It represents another path toward a richer web. The technologies it uses have been combined to contextually index images based on their location in the real, physical world. The physical world becomes its own index—one that exists independently of hand-crafted links. Both Google and Yahoo have been looking for a way to harness “localism,” recognizing that they are missing a lot of what is important to users by not being able to locate places, events, and things that are close to the user’s physical location.

The new “physical index” would quickly become intertwined with the meaning-based web we have developed. Every photo that you own would, once correlated with the PhotoSynth image, “inherit” all the tags and links embedded in all the other imagery there or nearby. More and more photos are tagged with meta-data, and sites like Flickr allow you to annotate elements of the photograph (as does PhotoSynth). The tags and links represented tie back into the already established web of hand-crafted links and knit them together in new ways. And it potentially goes further: image formats typically already support time stamps and often a time stamp is registered in a digital photograph’s metadata even when the user is unaware of it. Though I’ve not seen any sign that PhotoSynth makes use of time data it would clearly be almost trivial to add that functionality. And that would add an automatic “time index” to the mix. So if you wanted to see pictures of the Vatican in every season you could… or view images stretching back to antiquity.
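As an illustration of how much of that “time index” is already sitting in the files, here is a small sketch that pulls the date/time a camera recorded out of a photo’s EXIF metadata using the Pillow library. The filename is a placeholder:

```python
# A small sketch: most cameras write date/time tags into a photo's EXIF metadata
# whether or not the user ever looks at them. Reads the tag with Pillow.
from PIL import Image, ExifTags

def photo_timestamp(path):
    """Return the date/time recorded in a photo's EXIF metadata, if any."""
    exif = Image.open(path).getexif()
    for tag_id, value in exif.items():
        if ExifTags.TAGS.get(tag_id) == "DateTime":
            return value            # e.g. "2005:04:23 14:31:08"
    return None

print(photo_timestamp("festival_stage.jpg"))
```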

It’s easy to fantasize about how place, time, and meaning-based linking might work together. Let’s suppose you stumble across a nifty picture of an African Dance troupe. Metadata links that to a date and location—Lafayette in April of 2005. A user tag associated with the picture is “Festival International.” From there you get to the Festival International de Louisiane website. You pull up—effectively create—a 3-D image of the Downtown venue recreated from photos centered on the stage 50 feet from where the metadata says the picture was taken. A bit of exploration in the area finds Don’s Seafood, the Louisiana Crafts Guild, a nifty fireman’s statue, a fountain (with an amazing number of available photos) and another stage. That stage has a lot of associations with “Zydeco” and “Cajun” and “Creole.” You find yourself virtually at the old “El Sido’s,” get a look at the neighborhood and begin to wonder about the connections between place, poverty, culture, and music….

The technologies used in PhotoSynth are not new or unique. Putting them all together is… and potentially points the way toward a very powerful way to enhance the web and make it more powerfully local.

Worth thinking about on a quiet Sunday afternoon.

Lots o’ Lagniappe:

TED talk Video — uses a Flickr data set to illustrate how the program can scoop up any imagery. This was the first reference I fell across.

Photo Tourism Video — Explains the basics, using the photo tourism interface. Shows the annotation feature of the program…

Roots of PhotoSynth Research Video—an interesting background bit…seadragon, stitching & virtual tourist, 3D extraction….

PhotoSynth on the Web Video: a version of the program is shown running in a web browser; only available to late version Microsoft users. (Web Site)

Microsoft Interactive Visual Media Group Site. Several of these projects look very interesting—and you can see how some of the technologies deployed in PhotoSynth have been used in other contexts.

Microsoft PhotoSynth Homepage

Thinking in Tucson, AZ: Getting it Right

Muniwireless points to a study that caught my attention, one meant to inform the writing of a request for proposals for Tucson’s prospective wireless network. First, the extent of the research and the detail in the study far exceed what goes into most full proposals, much less the RFP. A large amount of information about broadband usage, digital divide issues, and market questions is in this study—enough to provide plenty of well-researched data to support both public purposes (like economic expansion and bridging the digital divide) and to support a strong marketing plan (it includes current costs of broadband and geographical usage patterns).

Lafayette needs such a public document. Without the baseline it provides it will be difficult to demonstrate the success of the fiber project. You need such a baseline to demonstrate the economic benefits and to document the effects of lower cost broadband on bringing new faces into the broadband world.

But if possible, even more impressive than the original survey research was the quality of thought exhibited. Doing a study like this is a job–and most folks are tempted to do the job to specs even if that is not what is called for by the reality of the situation. CTC, the consultants doing this study, didn’t succumb to that temptation. The job specs, it is clear, were to tell the city how to write an RFP that would get private agencies to provide city-wide wifi without municipal investment. Universal coverage, closing the digital divide and economic development were apparently important parameters given the consultants.

Trouble is, it’s become clear that the private sector simply won’t, and perhaps can’t, fill that wishlist. And CTC, instead of just laying out what would give such an RFP the best chance, more or less told the city it couldn’t have all that without at least committing to be the network’s major anchor tenant. That was responsible, if unlikely to make the clients happy. And on at least two other points (Digital Divide issues and Fiber) they pushed their clients hard.

1) Digital Divide issues:

The interviews indicated that as computers become more affordable, the digital inclusion challenge that needs to be addressed is not as much equipment-based but rather how to overcome the monthly Internet access charge. (p. 18)

Concentrate WiFi provider efforts on low-cost or free access – not the other elements of the digital divide. (p. 17)

Entering the digital community is no longer about hardware; it’s about connectivity. The hardware is a one-time expense that is getting smaller with each passing day. Owning a computer is no longer the issue it once was. Keeping it connected is the real fiscal barrier these days. As their survey work shows, the people most affected know this themselves.

A CTC review of Lafayette’s project would note that we’re doing several things they say most cities neglect to do: 1) LUS has consistently pushed lower prices as its major contribution to closing the digital divide (and we must make sure that there is an extremely affordable lower tier available on both the FTTH and the WiFi components). 2) Ubiquitous coverage is a foregone conclusion; LUS will serve all, something no incumbent will promise (and something they have fought to prevent localities from requiring). 3) Avoiding means-testing. Lafayette’s planned solutions are all available to all…but most valuable and attractive to those with the least. Means-testing works (and is intended to work) to reduce the number of people taking advantage of the means-tested program. If closing the digital divide is the purpose, means-testing is counterproductive.

About hardware, yes: working to systematically lower the cost of hardware and make it more accessible through wise selection, quantity purchasing, and allowing people to pay off an inexpensive computer with a small amount each month on their telecom bill makes a lot of sense and should be pursued. But the prize is universal service and lowering the price of connectivity. Eyes, as is said, on the prize.
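A quick back-of-the-envelope calculation shows why connectivity, not hardware, is the recurring barrier. The numbers below are hypothetical, picked only to illustrate the shape of the argument; they are not figures from the CTC study or from LUS.

```python
# Hypothetical figures for illustration only (not from the CTC study or from LUS).
computer_price = 300      # one-time cost of an inexpensive computer
hardware_payment = 10     # dollars per month toward paying off that computer on the telecom bill
access_charge = 30        # dollars per month recurring Internet access charge
months = 36               # a three-year window

hardware_total = min(computer_price, hardware_payment * months)   # capped once the computer is paid off
connectivity_total = access_charge * months                       # never stops accruing

print(f"Hardware over {months} months:     ${hardware_total}")      # $300
print(f"Connectivity over {months} months: ${connectivity_total}")  # $1080
```

Even with generous assumptions for the hardware, the monthly access charge dominates within the first year, which is exactly the point the CTC interviews make.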

CTC additionally recommends against allowing extremely low speeds for the inclusion tier and for a built-in process for increasing that speed as the network proves itself. It also rejects the walled-garden approach, an approach which, though they are too discreet to say so out loud, turns the inclusion tier into a private reserve that will inevitably be run for the profit of the provider.

Good thinking…

2) The Necessity of Fiber

CTC also boldly emphasized fiber, not wireless, as the most desirable endpoint for Tucson.

We strongly recommend that the City of Tucson view the WiFi effort as a necessary first step, then look at ways to embrace and encourage incremental steps toward fiber deployment to large business and institutions, then smaller business, and eventually to all households. (p. 19)

Although wireless technologies will continue to evolve at a rapid pace, wireless will not replace fiber for delivering high-capacity circuits to fixed locations. In addition, fiber will always be a necessary component of any wireless network because it boosts capacity and speed. (p. 20)

The report explicitly rejects the theory that wireless will ever become the chief method for providing broadband service to fixed locations like businesses or homes. Few in the business of consulting on municipal wireless networking are so forthright in discussing the limitations of wireless technologies and the role of fiber in creating a successful wireless network that is focused on what wireless does best: mobile computing.

Again, good thinking.

Communities would do well to think clearly about what they want, what is possible, and the roles fiber and wireless technologies can play in their futures. CTC has done a real service to the people of Tucson. Too much unsupported and insupportable hype has driven muni wireless projects. That unrealistic start will come back to haunt municipal broadband efforts nationally as the failed assumptions show up in the form of failed projects. But those mistakes were not inevitable. The people of Lafayette should take some comfort in the fact that we haven’t made the sorts of mistakes that Tucson’s consultants warn against and that we are planning to implement the study’s most crucial recommendations.

On Really Getting It

One of the most gratifying things about Thursday night’s fiber forum was watching Lafayette’s leaders (and a nice chunk of the community) exhibit all the signs that they really get it. They understand the potentials of the new technologies and have a good sense of how to milk the most out of them. This, my friends, is extraordinary–and vanishingly rare.

There is evidence that they clearly understand: 1) that great things are coming but what those great things are is unknown; 2) that the best thing to do to encourage unknown great things is to be generous; and 3) that generosity needn’t cost much, or anything at all.

On Great Things being unknown:
At one point in the night Huval broke into an historical analogy. He said that he felt like his predecessor in 1897 must have felt when electricity was being introduced. All the questions were about lighting and light bulbs: “What do we do when the light bulb breaks?” and much concern was shown about the dangers of sticking a finger in the socket. Nobody knew about radio, or TV, or microwave ovens. The idea, of course, is that the hopes and anxieties of the initial stages of a new technology are incomplete and even misleading when viewed in retrospect. The conclusion is that we don’t, and can’t, know all the great things that will result from ubiquitous, really huge bandwidth. That’s wise. To believe otherwise encourages folks to build elaborate edifices for a future that is never realized, and that has been the single greatest danger of “visionary” enterprises. But the danger in the wise recognition that you can’t know the future in detail is that it might lead to inaction: there is a temptation to believe that you can’t encourage that which you do not know. That’s not true, and these guys are NOT making that mistake.

On designing for unknown Great Things:
There is a way to design a system to encourage unknown great things: where possible, choose networks that leave open the most possibilities for users to “do things” with the network. And once you have such a network, don’t put any limits on users that are not absolutely necessary. That can get technical pretty quickly. But the underlying attitude is not complicated: Be Generous. If you have a choice to make about network design: choose the more generous network. If you have a choice to make about what a user is allowed to do with the network: Be Generous. That’s a pretty simple and easy-to-enact principle.

Such a “generous” attitude was exhibited when Huval illustrated how he thinks Lafayette’s network will be different from other networks. Verizon, which has a fiber-to-the-home network with the attendant large capacity, is not offering much of that capacity to its public. It is choosing merely to compete with its cable opponents by offering a little more of the same for a little less. Verizon’s attitude is that if it can’t make a buck off it then it won’t offer it; it won’t give away anything, not even something which costs it nothing. Huval, pointing to Verizon, said “our philosophy is going to be completely different” and that LUS will take the position of offering as much as possible as long as doing so doesn’t create an obvious problem with the business plan. Both the decision to offer symmetrical bandwidth and the decision to allow full intranet bandwidth between customers show what decisions result when you take a generous position.
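To see what symmetry and full intranet speeds buy an ordinary user, here is a rough back-of-the-envelope comparison. The speed tiers and the file size are hypothetical numbers chosen for illustration, not LUS’s published offerings.

```python
# Rough illustration only; the speeds and the file size are hypothetical, not LUS's actual tiers.
def transfer_minutes(size_gb: float, mbps: float) -> float:
    """Minutes to move size_gb gigabytes at mbps megabits per second (ignoring protocol overhead)."""
    return size_gb * 8_000 / mbps / 60   # 1 GB is roughly 8,000 megabits

home_video_gb = 4.0  # roughly an hour of HD home video to send to family across town

print(f"1 Mb/s upload (typical asymmetric tier): {transfer_minutes(home_video_gb, 1):6.0f} minutes")
print(f"10 Mb/s symmetrical upload:              {transfer_minutes(home_video_gb, 10):6.0f} minutes")
print(f"100 Mb/s full intranet speed:            {transfer_minutes(home_video_gb, 100):6.1f} minutes")
```

Sharing an hour of home video across town goes from an overnight chore to a coffee-break task; that is exactly the sort of use nobody can predict in detail but a generous network quietly makes possible.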

On the idea that generosity can be cheap:
Given generous upfront decisions about the initial design of the network, features like symmetrical speeds and full intranet speeds will be very cheap to provide in light of the huge excess capacity the network will have. Making the decision to be generous need not be expensive. This point was made during the discussion Thursday. One man voiced concerns that all the nifty ideas that had been suggested would be expensive and that only some of them could be chosen. Huval seemed genuinely puzzled as he responded that, actually, very few would cost anything. He was right…and his point was that, since the cost is low, he was inclined to do as much of it as he could.

So these guys get it: generosity pays dividends. We’ve always known this, of course, but it is interesting to find the principle showing up so vividly in the esoteric world of fiber-optic networks.

A National Broadband plan? Europe and Eisenhower Show the Way

Everybody from Lafayette’s Mayor Durel, to Jim Baller of Baller-Herbst, to Michael Dell of Dell Computer, to the president of the United States seems to think that we really, really ought to have a working National Broadband plan. We should. And friends, it’s not rocket science.

We’re not nearly as clueless as we think. Some developed Western countries have figured it out; their experience should apply to ours. Filter that through the US’s own success in building a complex, expensive national infrastructure network and you’ve got a pretty detailed outline.

The European Experience:
VuNet reports on the Fiber To The Home market in Europe. While France, Scandinavia and the Netherlands are deploying significant fiber, the rest of Europe is not moving forward. The article notes that:

“In part, this is due to a lack of initiative from utilities and local authorities, but also because markets are dominated by incumbents and cable operators which have no incentive to make hefty investments in brand new infrastructure.”

…Generally speaking, there is less interest in building FTTH networks from conventional national telecoms operators, which argue that the approach is too expensive to carry out on a widespread basis.

…The majority of former state-owned monopolies, for example, have instead committed to fibre-to-the node.

Sound painfully familiar? It should. That could be AT&T they’re talking about. Incumbent duopolies have little incentive to build new systems which would provide abundant bandwidth when they can continue to sell an expensive, scarce resource over a paid-for, if antiquated, network. The US is in exactly the same fix.

What’s the solution? Don’t worry about cajoling the incumbents. Find an infrastructure provider that is differently motivated. Sweden shows how that works:

FTTH is most advanced in Sweden, where the technology is used for 650,000 broadband subscriptions, or over 27 per cent of the country’s 2.3 million residents.

The study pointed out that the 150 municipal networks serving these customers tend not to be owned by conventional telecoms operators, but by utilities or local authorities.

So Sweden, with about 2% of the EU’s population, has more than half of the EU’s fiber-connected homes and a take rate (NOT homes passed, actual subscriptions) of 27% of the country’s broadband connections. That is amazing.

You’d think any country that wanted to figure out how to encourage real broadband and extensive use in a modern Western economy would take a lesson from this. Here’s a proven national strategy: encourage local communities to take on the task. They know what their citizens need. They’re willing to take the longer view. They’ve got no baggage of old networks to protect. And they’re not interested in squeezing the maximum return out of their customers.

The American Experience:
The Eisenhower Interstate Highway System.

No one any longer argues that the Interstate Highway System wasn’t the best economic investment since the Louisiana Purchase. The return on investment has been astronomical, and it is hard to imagine the modern US economy without it.

That system is owned and operated by the states, and the states provide 56% of the funds necessary to build and maintain it. There is an elaborate set of standards and inspections and a significant amount of federal “guidance” in contracting and costing.

An extensive, expensive, successful state-of-the-art national infrastructure has already been built in America. We know how to do it. Just apply the lessons learned:

So here’s a real national strategy in a nutshell: Adapt the Interstate Highway model to a municipal ownership model.

1) Offer a 60-40 local/federal split to communities everywhere for the expensive last mile builds on their locally-owned rights-of-way.
2) Offer the same for the states to build the interconnects within their own states and tie-ins to neighboring states using rights-of-way along state highways (and their interstates).
3) Every community decides how much it wants to spend and the nature of the network it wants; if it accepts the federal money it adheres to federal rules in its construction and maintenance.

Sure there are details. I, for one, would impose traditional common carriage rules on the communities that accept federal money or federally funded interconnects. And I’d want a “no speed limit” clause built into the law. (Yes, that’s a joke.)

But those sorts of things would be extras. They’d not be necessary to accomplish our national goals. The above is all that is critical. In a decade we’d have an “Interstate High Broadband System” that would be the envy of the world.

On Killing the Goose that Lays the Golden Egg

The Advocate carries the odd story, splashed across the front page of its Acadiana section this morning, that retells the tale one Steve Pellessier told the Concerned Citizens for Good Government yesterday. Pellessier wants Lafayette to sell off LUS to pay for current shortfalls in road funding.

Thank heaven that at least some folks have a classical education. Joey Durel responded humorously but basically dismissively to the suggestion by saying that to do so would be like getting “rid of the goose that laid the golden egg.”

The idea of selling off a consistent money-maker, to the tune of $17.2 million and a quarter of the city-parish budget each year, for a one-shot, quick-fix play to meet the parish’s road needs following Katrina and Rita is plain foolish. It has to be one of the purest examples of the lesson of Aesop’s fable concerning “the destructiveness of greed, the virtue of patience.”

First, LUS has historically had lower prices than its private competitors (the current rough parity is unusual) and Pellessier appears to know that. Citizens would end up paying twice: once in the form of 25% higher taxes (the money has to come from somewhere) and once in the form of higher utility bills. Second, and this point appears to have very discreetly not been raised considering the current divisiveness of the issue in the council, it would be a sale of city assets to benefit almost solely suburban needs, while the downstream cost of more expensive electricity would be borne solely by city residents. Politically this should be a major nonstarter. The current push to dissolve the city-parish form of government is mostly based on formless resentment. Any movement in this direction would give that movement a basis in real injustice and a real constituency.

Beyond the foolishness of the idea of killing the goose, you’ve got the fact that this goose is fertile. The goose in the fable is obviously sterile: it lays golden eggs, but those eggs don’t hatch. It is unique. LUS, however, is incubating another goose that promises to lay even larger golden eggs. The mere threat of an LUS telecom network has kept Cox from raising prices. The reality of a cheaper, more capable alternative will save us all a bundle off our monthly bills.

Beyond the cost savings, we should all be aware that the income to the city-parish coffers should be substantial. That $17.2 million LUS gives us comes chiefly from electricity…a low-margin utility. The money coming into the coffers from the telecom division will mostly come from competing in the high-margin cable business. How much do you spend on electricity? How much do you spend on cable, internet, and phone service? Think about it…

If there is anything more foolish than killing the goose that laid the golden egg, it’s killing one that has offspring that also lay golden eggs.

Though the Advocate story doesn’t mention it, Pellessier, in a recent letter, did say that LUS could keep its recently voted-in telecom division. That’s a crock and Pellessier, an opponent of the LUS plan, should know it. Much of what makes the telecom unit economic (and the main reason more cities are not in a position to make the same decision Lafayette has) is that Lafayette already owns and operates the necessary infrastructure, in poles and maintenance crews, to service its electrical division. It is hard not to suspect that a suggestion this far off the mark is motivated in some part by leftover resentments from having lost that fight.

You’d think a “Certified Commercial Investment Member” (someone who specializes in commercial real estate investments) would understand that trading a growing revenue-producing asset for a one-shot wasting asset is always a bad idea. Don Bertrand makes the point more succinctly:

Don Bertrand said he’s glad to have a discussion about how to fund roads, but that LUS is the city’s best asset. Bertrand said there are other options to raise funding without giving up a revenue-producing entity like LUS.

“When we’re done, we’ll have roads, but roads don’t produce money,” Bertrand said.