PhotoSynth & Web 2.0—Worth Thinking About

Sunday Thought Dept.

Warning: this is seriously different from the usual fare here but fits roughly into my occasional “Sunday Thought” posts. I’ve been thinking hard about how to make the web more compelling for users and especially how to integrate the local interests that seem so weakly represented on the internet. As part of that exploration I ran across a research program labeled “PhotoSynth.” It offers a way to integrate “place” into the abstract digital world of the web in a pretty compelling way if your interest is in localism: it automatically recreates a three-dimensional world from any random set of photographs of a scene and allows tags and links to be embedded in them. Once anyone has tagged a local feature (say the fireman’s statue on Vermillion St.), or associated a review with a picture of Don’s Seafood downtown, everyone else’s images are, in effect, enriched by their ability to “inherit” that information.

But it seems that it is a lot more than just the best thing to happen to advocates of web localism in a long time. It’s very fundamental stuff, I think, with implications far beyond building a better local web portal…. Read On…

—————————-
Photosynth aka “Photo Tourism” encapsulates a couple of ideas that are well worth thinking hard about. Potentially this technical tour de force provides a new, automated, and actually valuable way of building representations of the world we live in.

This is a big deal.

Before I get all abstract on you (as I am determined to do) let me strongly encourage you to first take a look at the most basic technical ideas behind what I’m talking about. Please take the time to absorb a five and a half minute video illustrating the technology. If you’re more a textual learner you can take a quick look at the text-based, photo-illustrated overview from the Washington State/MS lab. But I recommend trying the video first.

(Sorry this video was removed by YouTube).

You did that? Good; thanks….otherwise the rest will be pretty opaque—more difficult to understand than it needs to be.

One way to look at what the technology does is that it recreates a digitized 3D world from a 2D one. It builds a fully digital 3D model of the world from multiple 2D photos. Many users contribute their “bits” of imagery and, together, they are automatically interlinked to yield, out of multiple points of view, a “rounded” representation of the scene. The linkages between images are established on the basis of data inside the images themselves–on the basis of their partial overlap, and ultimately on the basis of their subjects actually existing next to each other–and this is done without the considered decisions of engaged humans.
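The overlap-based linking described above can be sketched in miniature. The toy Python sketch below is my own illustration, not PhotoSynth’s actual pipeline (which extracts visual keypoints from the pixels themselves); here each photo is simply a set of hypothetical feature IDs, and any two photos sharing enough features get linked, with no human deciding anything:

```python
# Toy sketch: photos become nodes in a graph, and an edge is added
# whenever two photos share enough "features". Real systems derive
# features from pixels (e.g. distinctive corners and textures); here
# the feature IDs and photo names are invented for illustration.
from itertools import combinations

def link_photos(photo_features, min_overlap=3):
    """Return edges between photos that share at least min_overlap features."""
    edges = []
    for (a, fa), (b, fb) in combinations(photo_features.items(), 2):
        if len(fa & fb) >= min_overlap:  # set intersection = shared features
            edges.append((a, b))
    return edges

photos = {
    "front.jpg":  {1, 2, 3, 4, 5},
    "side.jpg":   {3, 4, 5, 6, 7},   # overlaps front.jpg on features 3-5
    "statue.jpg": {20, 21, 22},      # a different scene entirely
}
print(link_photos(photos))  # [('front.jpg', 'side.jpg')]
```

The resulting graph of overlaps is what lets a pile of unrelated snapshots organize themselves into a navigable scene.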

Why is that a big deal?

Because it’s not all handmade. Today’s web is stunningly valuable but it is also almost completely hand-made. Each image or word is purpose-chosen for its small niche on a web page or in its fragment of context. The links that connect the web’s parts are (for the most part) hand-crafted as well and represent someone’s thoughtful decision. Attempts to automate the construction of the web, to automatically create useful links, have failed miserably—largely because connections need to be meaningful in terms of the user’s purpose and algorithms don’t grok meaning or purpose.

The web has been limited by its hand-crafted nature. There is information (of all sorts, from videos of pottery being thrown, to bird calls, to statistical tables) out there we can’t get to—or even get an indication that we ought to want to get to. We rely mostly on links to find as much as we do and those rely on people making the decision to hand-craft them. But we don’t have the time, or the inclination, to make explicit and machine-readable all the useful associations that lend meaning to what we encounter in our lives. So the web remains oddly thin—it consists of the few things that are both easy enough to put up and important enough to a few of our fellows to get represented on the net. It is their overwhelming number and the fact that we are all competent in our own special domains that makes the web so varied and fascinating.

You might think that web search, most notably the big success story of the current web, Google’s, serves as a ready substitute for consciously crafted links. We think Google links us to appropriate pages without human intervention. But we’re not quite right—Google’s underlying set of algorithms, collectively known as “PageRank,” mostly just ranks pages by how many other pages link to them, weighting those links by the links that the linking sites themselves receive from other sites…and so on. To the extent that web search works it relies on making use of handmade links. The little fleas algorithm.™ It’s handmade links all the way down.
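The recursive idea (“weighted by the links those sites receive…and so on”) is simple enough to sketch. This is a minimal power-iteration PageRank, my own simplified illustration rather than Google’s production system, which layers many refinements on top; the point is that a page’s rank is fed entirely by the human-made links pointing at it:

```python
# Minimal PageRank by power iteration: repeatedly let each page pass
# its rank along its outgoing (hand-made) links until ranks settle.
def pagerank(links, damping=0.85, iters=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        # every page gets a small baseline, plus shares of its linkers' rank
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            if not outs:  # dangling page: spread its rank over everyone
                for q in pages:
                    new[q] += damping * rank[p] / len(pages)
            else:
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(web)
print(max(ranks, key=ranks.get))  # "c": the most linked-to page wins
```

Strip away the hand-crafted links in `web` and the algorithm has nothing to rank with, which is the point of the paragraph above.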

Google was merely the first to effectively repackage human judgment. You’ve heard of web 2.0? (More) The idea that underpins that widely hyped craze is that you can go to your users to supply the content, the meaning, the links. That too is symptomatic of what I’m trying to point to here: the model that relies solely on the web being built by “developers” who are guessing their users’ needs has reached its limits.

That’s why Web 2.0 is a big deal: The folks designing the web are groping toward a realization of their limits, how to deal with them, and keep the utility of the web growing.

It is against that backdrop that PhotoSynth appears. It represents another path toward a richer web. The technologies it uses have been combined to contextually index images based on their location in the real, physical world. The physical world becomes its own index—one that exists independently of hand-crafted links. Both Google and Yahoo have been looking for a way to harness “localism,” recognizing that they are missing a lot of what is important to users by not being able to locate places, events, and things that are close to the user’s physical location.

The new “physical index” would quickly become intertwined with the meaning-based web we have developed. Every photo that you own would, once correlated with the PhotoSynth image, “inherit” all the tags and links embedded in all the other imagery there or nearby. More and more photos are tagged with meta-data and sites like Flickr allow you to annotate elements of the photograph (as does PhotoSynth). The tags and links so represented tie back into the already established web of hand-crafted links and knit them together in new ways. And it potentially goes further: image formats typically already support time stamps and often a time stamp is registered in a digital photograph’s metadata even when the user is unaware of it. Though I’ve not seen any sign that PhotoSynth makes use of time data it would clearly be almost trivial to add that functionality. And that would add an automatic “time index” to the mix. So if you wanted to see pictures of the Vatican in every season you could…or view images stretching back to antiquity.
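A toy sketch of how that tag “inheritance” might work; this is my own illustration, and the coordinates, tags, and 50-meter radius are all invented for the example. A freshly uploaded photo simply picks up the tags of already-annotated photos taken nearby:

```python
# Toy "physical index": a new photo inherits tags from annotated
# photos taken within some radius. Positions are in meters on a
# flat local grid; everything here is a hypothetical example.
import math

def nearby_tags(photo, annotated, radius_m=50):
    """Collect tags from annotated photos within radius_m of `photo`."""
    inherited = set()
    for other in annotated:
        dx = photo["x"] - other["x"]
        dy = photo["y"] - other["y"]
        if math.hypot(dx, dy) <= radius_m:  # straight-line distance
            inherited |= other["tags"]
    return inherited

annotated = [
    {"x": 0,   "y": 0,  "tags": {"fireman's statue", "Vermillion St."}},
    {"x": 30,  "y": 10, "tags": {"Don's Seafood"}},
    {"x": 900, "y": 0,  "tags": {"Festival International"}},  # too far away
]
my_photo = {"x": 5, "y": 5}
print(nearby_tags(my_photo, annotated))
```

Add a time stamp to each record and the same lookup gives you the “time index” for free: filter by date before collecting tags.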

It’s easy to fantasize about how place, time, and meaning-based linking might work together. Let’s suppose you stumble across a nifty picture of an African Dance troupe. Metadata links that to a date and location—Lafayette in April of 2005. A user tag associated with the picture is “Festival International.” From there you get to the Festival International de Louisiane website. You pull up—effectively create—a 3-D image of the Downtown venue recreated from photos centered on the stage 50 feet from where the metadata says the picture was taken. A bit of exploration in the area finds Don’s Seafood, the Louisiana Crafts Guild, a nifty fireman’s statue, a fountain (with an amazing number of available photos) and another stage. That stage has a lot of associations with “Zydeco” and “Cajun” and “Creole.” You find yourself virtually at the old “El Sido’s,” get a look at the neighborhood and begin to wonder about the connections between place, poverty, culture, and music….

The technologies used in PhotoSynth are not new or unique. Putting them all together is…and the combination potentially points the way toward a very powerful means of enhancing the web and making it more powerfully local.

Worth thinking about on a quiet Sunday afternoon.

Lots o’ Lagniappe:

TED talk Video — uses a Flickr data set to illustrate how the program can scoop up any imagery. This was the first reference I fell across.

Photo Tourism Video — Explains the basics, using the photo tourism interface. Shows the annotation feature of the program…

Roots of PhotoSynth Research Video—an interesting background bit…seadragon, stitching & virtual tourist, 3D extraction….

PhotoSynth on the Web Video: a version of the program is shown running in a web browser; only available to late version Microsoft users. (Web Site)

Microsoft Interactive Visual Media Group Site. Several of these projects look very interesting—and you can see how some of the technologies deployed in PhotoSynth have been used in other contexts.

Microsoft PhotoSynth Homepage

Verizon’s fiber-optic payoff | CNET News.com

Food for Thought, Learning from the Big Guys Division:

“Verizon’s fiber-optic payoff;” The premise of this story is that Verizon did right by going with a Fiber To The Home plan…and AT&T/BellSouth messed up by trying to get by on the cheap. Verizon will have the bandwidth and the flexibility to compete more than adequately against the cable companies. But AT&T will not, on the basis of this author’s analysis.

Lafayette can mine Verizon’s experience for insight into how an all-fiber network can compete against the cablecos. Verizon will be several years ahead of LUS in the deployment of a new fiber system and its successes can be followed and its failures avoided. Verizon is, happily for Lafayette, not the incumbent locally. Its enormous numbers will allow suppliers and developers for its products to supply a place like Lafayette almost as an afterthought–and at reasonable prices since the big-ticket purchaser is the giant Verizon. Even better, the also-rans on giant Verizon contracts will be looking for a place to prove their ideas. If Lafayette’s buyers are wise we’ll be able to cut some interesting deals with cutting-edge firms that badly need a place to demonstrate that their ideas are viable before marketing them to the big fellows. A real danger has always been that LUS would be so far ahead of what the market in other places can provide that useful products would have to be untested and hand-crafted for us. That could be exciting…but expensive. Much better to have a big trailblazer proving basic concepts somewhere else. Then we only have to lay in a better implementation.

Apparently Verizon is succeeding. Though its stock has taken a hit because its heavy, long-term investment in fiber has suppressed earnings, its subscriber numbers are very healthy for the same reason. Its bet on a combination of old-style cable technology, fancy new IPTV for the extras, and fiber-to-the-home capacity is allowing the company to succeed in delivering both a high-quality, reliable TV experience and fancy new IP services. AT&T, on the other hand, has high stock prices but low subscriber numbers for its new hybrid fiber/DSL that uses unstable IP for all its services. If I were looking at a long-term investment I know where my money would go.

The ticket for Verizon so far appears to be sticking to the absolutely reliable technology on the cash cow—cable TV—and using the capacity of fiber to deliver more channels; especially to deliver more bandwidth-hungry HDTV. In addition they are mounting an aggressive push into IP-based services. Verizon is clear about the need to migrate to a full IP system as soon as the technology is proven and it, and AT&T’s commitment, ensures that the kinks will be worked out of IPTV sooner rather than later. All in all Verizon’s success validates the similar decisions made by the technical guys at LUS; the bottom line is that it’s good news for LUS.

Is there a downside for Lafayette? Sure. LUS may not compete with Verizon but Cox does. And as Verizon proves that fiber means deadly competition Cox is more likely to feel the pressure to develop effective ways to expand its own bandwidth and competitive delivery of services. But that’s not all bad of course–for the consumer. The catch for Cox is that Verizon is currently succeeding without competing much on price. LUS will, and that fact, together with the local loyalty that LUS has gained (and Cox forfeited), makes LUS a much tougher target locally than Verizon is through most of its footprint.

The real loser? AT&T/BellSouth who will have the least capable system in the city and very little room to really compete on price.

Creating a Lafayette Commons

Here’s something that’s worth the read on a rainy Sunday afternoon. It’s an inspiring essay titled “Reclaiming the Commons” by David Bollier. He bitterly complains that our common resources and heritage—from concrete public property like oil or grazing rights on public land, to more abstract rights to goods created by regulation like the electromagnetic spectrum, to truly abstract (but very real and increasingly valuable) rights to common ideas and intellectual resources—are being taken from us and handed over to the few.

The argument is that we are all poorer for it. And that society would be richer if those things remained in the public domain. He convincingly argues that undue private ownership of ideas stifles the invention of new variants and new ideas.

The point for Lafayette is that we are about to create a new common resource: the Lafayette fiber intranet, and we are creating it as a publicly-owned resource. If Bollier is right then we have a real responsibility to make sure that our common property serves the common good and that it not be “enclosed” by the few.

We here in Lafayette will be in the nearly unique position of commonly owning a completely up-to-date telecommunications infrastructure ranging from a fiber-all-the-way-to-the-home network, to a wifi network using the capacity of that fiber. A citizen who wants to will be able to get all of his telecommunications needs met using local resources, resources that are owned in common.

A lot of what is missing on the web is access to local resources-the church calendar, the schedule for the shrimp truck, what vegetables are available at the farmer’s market, specials at the local restaurants, nearby childcare, adult ed resources, local jobs…and much more aren’t available or are the next best thing to impossible to usefully gather in one place. We could, by acting together, fix that.

Creating a thriving network-based commons is the task that is set before us. Bollier gives us some insight into the magnitude of what is at stake. We can, by the way we participate and what we create, create a truly common and truly valuable resource.

Give the Bollier article a look. Then think a bit about what we can do to make ours a transparently valuable network–one that will encourage all to participate fully.

(Hey, we can be idealists if we want. :-))

Thinking in Tucson, AZ: Getting it Right

Muniwireless points to a study, meant to inform the writing of Tucson’s prospective wireless RFP (request for proposals), that caught my attention. First, the extent of the research and the detail in the study far exceeds that which goes into most full proposals, much less the RFP. A large amount of information about broadband usage, digital divide issues, and market questions is in this study—enough to provide plenty of well-researched data to support both public purposes (like economic expansion and bridging the digital divide) and to support a strong marketing plan (it includes current costs of broadband and geographical usage patterns).

Lafayette needs such a public document. Without the baseline it provides it will be difficult to demonstrate the success of the fiber project. You need such a baseline to demonstrate the economic benefits and to document the effects of lower cost broadband on bringing new faces into the broadband world.

But, if possible, even more impressive than the original survey research was the quality of thought exhibited. Doing a study like this is a job–and most folks are tempted to do the job to specs even if that is not what is called for by the reality of the situation. CTC, the consultants doing this study, didn’t succumb to that temptation. The job specs, it is clear, were to tell the city how to write an RFP that would get private companies to provide city-wide wifi without municipal investment. Universal coverage, closing the digital divide and economic development were apparently important parameters given to the consultants.

Trouble is, it’s become clear that the private sector simply won’t, and perhaps can’t, fill that wishlist. And CTC, instead of just laying out what would give such an RFP the best chance, more or less told the city it couldn’t have all that without at least committing to be the major anchor tenant. That was responsible, if unlikely to make the clients happy. And on at least two other points (Digital Divide issues and Fiber) they pushed their clients hard.

1) Digital Divide issues:

The interviews indicated that as computers become more affordable, the digital inclusion challenge that needs to be addressed is not as much equipment-based but rather how to overcome the monthly Internet access charge. (p. 18)

Concentrate WiFi provider efforts on low-cost or free access – not the other elements of the digital divide. (p. 17)

Entering the digital community is no longer about hardware; it’s about connectivity. The hardware is a one-time expense that is getting smaller and smaller with each day. Owning a computer is no longer the issue it once was. Keeping it connected is the real fiscal barrier these days. As their survey work shows, the people most affected know this themselves.

A CTC review of Lafayette’s project would note we’re doing several things they say most cities neglect to do: 1) LUS has consistently pushed lower prices as its major contribution to closing the digital divide—(and we must make sure that there is an extremely affordable lower tier available on both the FTTH and the WiFi components). 2) Ubiquitous coverage is a foregone conclusion; LUS will serve all–something no incumbent will promise (and something they have fought to prevent localities from requiring). 3) Avoiding means-testing. Lafayette’s planned solutions are all available to all…but most valuable and attractive to those with the least. Means-testing works (and is intended to work) to reduce the number of people taking advantage of the means-tested program. If closing the digital divide is the purpose means-testing is counterproductive.

About hardware, yes, working to systematically lower the cost and increase the accessibility of hardware through wise selection, quantity purchase, and allowing people to pay off an inexpensive computer with a small amount each month on their telecom bill makes a lot of sense and should be pursued. But the prize is universal service and lowering the price of connectivity. Eyes, as is said, on the prize.

CTC additionally recommends against allowing extremely low speeds for the inclusion tier and for a built-in process for increasing that speed as the network proves itself. It also rejects the walled-garden approach, an approach which, as they discreetly don’t say out loud, turns the inclusion tier into a private reserve that will inevitably be run for the profit of the provider.

Good thinking…

2) The Necessity of Fiber

CTC also boldly emphasized fiber, not wireless, as the most desirable endpoint for Tucson.

We strongly recommend that the City of Tucson view the WiFi effort as a necessary first step, then look at ways to embrace and encourage incremental steps toward fiber deployment to large business and institutions, then smaller business, and eventually to all households. (p. 19)

Although wireless technologies will continue to evolve at a rapid pace, wireless will not replace fiber for delivering high-capacity circuits to fixed locations. In addition, fiber will always be a necessary component of any wireless network because it boosts capacity and speed. (p. 20)

The report explicitly rejects the theory that wireless will ever become the chief method for providing broadband service to fixed locations like businesses or homes. Few in the business of consulting on municipal wireless networking are so forthright in discussing the limitations of wireless technologies and the role of fiber in creating a successful wireless network that is focused on what wireless does best: mobile computing.

Again, good thinking.

Communities would do well to think clearly about what they want, what is possible, and the roles that fiber and wireless technologies can play in their communities’ futures. CTC has done a real service to the people of Tucson. Too much unsupported and insupportable hype has driven muni wireless projects. That unrealistic start will come back to haunt municipal broadband efforts nationally as the failed assumptions show up in the form of failed projects. But those mistakes were not inevitable. The people of Lafayette should take some comfort in the fact that we haven’t made the sorts of mistakes that Tucson’s consultants warn against and are planning on implementing its most crucial recommendations.

Just Not True: Cable Rates Not Falling in Texas

Here’s something to think about:

Competition has not led to lower basic cable-rates in Texas. (MultiChannel News)

That’s the long and the short of it according to a study by the Texas chapter of NATOA.

Two years ago Texas’ telephone companies (basically SBC (now AT&T) and Verizon) used the Texas legislature’s famous penchant for failed deregulation to initiate a national push to move control of cable franchises from local communities to the state. The phone companies’ purpose was to avoid the demand that cable companies serve the entire community—and not just the most profitable part–if the companies wanted to use the right-of-way property owned by the community to turn a profit on the people living there. Local counties and municipalities consistently insisted on this principle….and the new state “regulators” (deregulators) do not. Louisiana dodged a bullet last year when Governor Blanco vetoed a similar bill approved by our legislators.

The Texas legislators were promised that they’d see “competition” and dramatic reductions in prices in exchange for removing local government’s ability to determine what is done with local property. And as the phone company juggernaut rolled through state legislatures they promised state after state the same — and told them it had worked in Texas, the first state in the union to enact such a law. They told that story about the success of competition in the halls of Congress as well. It was easy to believe; after all everyone knows that competition brings lower prices.

Except it hadn’t worked. And it still isn’t working….

You can take a look at the data gathered by the Texas chapter of NATOA. What is clearly shown is that the cable companies are not lowering their rates to compete with Verizon and SBC/BS/AT&T’s very limited rollout.

This is very similar to the disappointment that has accompanied the fashionable deregulation of electricity in a number of states. The data shows that electricity is cheaper in the regulated states–and the gap is growing.

What’s wrong with this picture? Why hasn’t “competition” worked? Legislators across the nation have endorsed the politically correct argument raised by the monopoly corporations that owned the electrical and telecom wires and bet their citizens’ money on the faith that deregulation would lead to falling prices. They, and their constituents, have lost that bet. And a whole lot of people are trying to make sure that the public does not notice.

Here is a hard truth: The blind faith that “deregulation” leads to true price competition is a false faith. In natural monopoly markets regulation–or public ownership–is the only real way to establish fair prices. Utility markets are monopoly markets…and giving the monopolists free rein won’t lower prices–quite the contrary. Utility deregulation is a failed experiment.

It is something the country as a whole ought to be thinking through rationally. The honest solution is to reinstitute real regulation where markets don’t work. Most places don’t have the resources to resist the drain on community wealth that private energy utilities and private telecommunications utilities represent. But a few communities, like Lafayette, can choose to opt out of the mistakes that the rest of the country is making. Lafayette has done so and will have its own locally-regulated power and telecom utilities.

Lafayette made the right choice on July 16th two years ago.

Joost

We are in the final days…of TV1.0. The signs are everywhere. Most recently, I received an invite (thanks to a sympathetic reader) to beta test Joost–a combo software client and web-based content library that allows users to see for themselves that the old way of doing things has its days numbered.

TV1.0 is the familiar old broadcast model of one broadcaster sending to many, passive “receivers.” TV stations send their signal out and we sit and watch it. Because spectrum was limited there were only a few channels; shows appeared in their set time slots, for a defined number of minutes less the minutes devoted to the ubiquitous ads. Shows are designed to appeal to the broadest number of people and offend the fewest. Cable changed very little except that it gave you more channels. PBS introduced the idea of voluntary subscription support–but remains in other ways locked into the broadcast model as well.

There’s lots to hate about this model of video. (And I’ve been happy to jump in; see “Die TV. Die! Die! Die!” or “Why You Want Real Bandwidth”.) I’ve called the emerging model “DV” for Downloadable Video. The basic idea is that when bandwidth is no longer scarce (e.g. when we have fiber to the home) and we can download video to our heart’s content, then the reasons for the old, annoying way of getting video will go away and new forms will emerge that cater to our obvious interest in watching shows whenever we want to, unlimited by advertiser-defined time slots, and uninterrupted by ads. Shows can be designed to appeal to the passionate viewer and world-wide, cheap, direct-to-consumer distribution can be counted on to provide an audience to support even the most specialized shows.

Joost plays into this because it has become the most credible contender for long-show, commercially-produced content king. (YouTube has the short-piece, self-produced end of the DV market pretty much sewn up–and in some ways is even further into a DV1.0 world.) Joost first hit the news as the brainchild of the same guys who brought you the telephony-disrupting Skype and terrified the music and video businesses with Kazaa. The trick in all these enterprises was leveraging the unused bandwidth of customers using an idea described as peer-to-peer aka P2P. In return for the downloaded service you get, you let the network use your spare uploading capacity.
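The P2P bargain described above can be sketched crudely. This is my own illustration, not Joost’s actual protocol: every viewer holding a chunk of the video can also serve it, so the more popular a stream, the more upload capacity is available, and the origin server is consulted only when no peer has what you need:

```python
# Toy P2P lookup: prefer fetching a video chunk from a fellow viewer;
# fall back to the origin server only if no peer holds the chunk.
# Peer names and chunk numbers are invented for illustration.
def fetch(chunk, peers, origin="origin-server"):
    """Return the name of the node that will serve `chunk`."""
    for name, have in peers.items():
        if chunk in have:
            return name      # a peer has it: use their spare upload capacity
    return origin            # nobody has it: the origin must serve it

peers = {"alice": {0, 1}, "bob": {1, 2}}
print(fetch(2, peers))  # bob
print(fetch(7, peers))  # origin-server
```

The economics follow directly: each new viewer adds upload capacity as well as demand, which is why Kazaa-style distribution holds down the operator’s bandwidth bill.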

Joost uses this technology as well and so holds down its main operating costs…but the real splash came when they began to sign up real, long-form content and supported it with in-video advertising. That gave them both content credibility and a visible business plan–something no similar competitor has. The jury is still out on whether long-form content has to be supported by advertising that is embedded in the download or whether, like YouTube, advertising can be on the supporting web page or whether, like iTunes, a pay-per-view model is possible.

Part of what is interesting about Joost is that they are setting up to be a very social site. They’ve got chat, you can invite friends, and there is an API for new widgets that could further extend the ability to hook into IP services and RSS feeds. This opens doors. Conceivably one could invite friends from all over the country to watch the same show or sporting event and chat online while it was playing. No doubt “clubs” will arise focused around particular shows and scheduled meetings. RSS will allow for further amalgamation and integration with other services and video feeds.

But the proof is in the pudding; or in this case, the viewing. I recently sat down, played around with the (very slick) interface and actually settled in to watch a commercial TV/now DV show. It played at full screen on my laptop–there was noticeable blockiness but no actual hesitations even though the feed was being relayed over my wifi. Cox had provided me access to the first real, commercial television show I’ve streamed down and watched in its entirety over the internet instead of watching it when it was scheduled to be on cable. It’s a sign. We’re in the final days of TV1.0.

——————–

(Like the idea and have found by clicking through that Joost is still in beta and requires an invite from a user? Happy days: GigaOm’s influential NewTeeVee blog has the pull to get a simple sign-in sheet for its readers. You can use it too.)

Incidentally, there are other, less high-profile startups trying to do something similar. The Joost page on Wikipedia points to several. I’d particularly recommend the Democracy site and player.

Learning & Teaching—and the Library

Here’s something that is a short, fun, watch but deserves a longer, contemplative, consideration.

It’s a roller coaster ride done in a classic Atari program. Go try it, noting the long, long rise at the end where you get to look down on the roller coaster below you.

Go on, this is fun and the rest won’t make sense unless you’ve actually tried it: YouTube – Real Estate Roller Coaster

…………………………………………………………………………………….
…………………………………………………………………………………….
…………………………………………………………………………………….

OK, now the not-so-fun part. That is a video that maps the cost-adjusted price of housing stock since 1890. (Here’s what that looks like in a NYTimes graph–you’ll recognize the “ride.”) Before you cry “boring–the worst of social studies” let me hasten to say that while I do not find the content boring (after all I was a social studies teacher in another life–and own my home) that is not why I’ve posted this for your lazy Sunday consideration.

In the context of this blog, I’m more interested in the very interesting fact that you can learn something from this video that you can’t learn in more standard ways. We learn most usefully from “experience.” Educators mean something pretty specific when they use that term and it doesn’t preclude learning that takes place in schools. It includes things like this video which give you the experience of change over time. This is pretty different from the all-at-once, time-abstracted image you get from the graph.

Long story short: this is a fine learning/teaching tool.

What makes that interesting here is that it was made by a “regular person” using the cheapest of hardware and software to help folks understand something which is otherwise difficult to put across about a very special interest of his or her own. That sort of individually localized “production” of sophisticated material is new…and very encouraging.

If we want more of this sort of thing we should do a couple of things: 1) Supply big, cheap, upload bandwidth–so that people can do video uploads or serve a few videos effectively from their own server. 2) Provide access to sophisticated and flexible software…this video required mating graphs with a 3D game program.

We’ll soon enough have #1 covered in Lafayette, and with the amazing bandwidth that will make available, at least on the local intranet, we’ll have the potential to use increasingly sophisticated programs located on the net that will help with #2. If we choose, we can buy access to amazingly sophisticated programs and offer fast access to them through a local “library” organization. The library here has some technically sophisticated folks; librarians caught on to the value of communications technology early. I see no reason that the Lafayette Public Library couldn’t offer such a “loan” program and occasional classes on the software. (They already offer more basic computer/net classes.)

It is worth really thinking about how we can set the stage for our community to have access to the creative tools they might need to create really interesting products.

An on-net software library might be one way to exploit the utility of our fast intranet and the power of the pooled resource of the community library for everyone’s benefit.

Food For Thought: LUS’ Wireless RFP

A little more than a week ago I posted a piece about LUS’ Wireless RFP (request for proposals) and asked a few questions. Since no one else answered them I decided to go down to City Hall and pick up a copy for myself.

For those who might have missed the story, LUS put out a call for proposals to supply what was described as a wireless network for LUS and city use. No mention of public access was made, though locals familiar with the way that the LUS fiber project evolved from purely utility purposes are reasonably hopeful that a wireless network will evolve in the same way.

The RFP itself is pretty simple as such things go and you have to think that bidders will need to request further specifications. But there is enough there to take a stab at answering the questions I asked earlier.

Note: this is an 802.11 “WiFi” mesh network. That’s the same architecture that is being used in metro wireless installations from Philadelphia to San Francisco. For the technically inclined: the hardware standard described involves two radios operating in two different bands. Specifically, the equivalent of Tropos’ most advanced access points, and its software, is specified. (Tropos is the market leader in metro WiFi.)

1. Does it include a very strong backbone “supply” element?

  • Yes, it is hung directly off LUS’s current fiber ring–it will not be crippled by running off a wireline supply source that has less capacity than it is able to use. (The expense of providing adequate “backhaul”–and sometimes the inability to find it at any price–has been a major limiting factor in most muni WiFi efforts.)

2. Are upgrade “hooks” part of the proposed deal?

  • Yes, the request makes it clear that there will be at least a “phase 2” (official protestations aside) and that proposals should take into account the network’s eventual expansion to full coverage of the 45 square miles of the city. The access point model specified is the first of a new generation from its maker, and future models in the family are promised to be interoperable with these and to support emerging technologies and standards like MIMO and WiMax as well as older public-safety standards.

3. Does it assume ubiquitous fiber?

  • Hmmn…well, maybe, or at least implicitly. Nothing beyond the first layer, “phase 1,” is specified. But assuming that what is described for phase 1 sets the pattern for the future, it looks like the plan is to make full use of the fiber. Wireless mesh networks are built around the ratio between aggregation access points, which are connected to backhaul networks, and simple mesh nodes, which are connected only to other access points via wireless. Commonly accepted ratios are 5 or 6 mesh nodes per aggregation point. All too many systems use larger ratios and put up with the resulting performance issues. A gold-plated system would use a slightly smaller number. The ratio LUS is suggesting for phase 1 is 1:1.3. That is astonishingly low and only makes sense where the wireless owner also owns the backhaul network (in our case fiber). Other users would have to pay per drop for their microwave, WiMax, T1, fiber link, or the like, and such per-drop costs would run up the expenses very quickly. Maintaining such a low ratio would mean deploying a system of pretty astonishing capacity. While policy might limit the bandwidth allowed, nothing in the network itself limits network speeds. They could conceivably run at near the rated speed of the 802.11 protocols that underpin it–currently about 54 megs.

4. Does it use owned spectrum for local backhaul? Or open? Or fiber?

  • Fiber. This is certifiably yummy. See above.

5. Does it use open spectrum for the final connection?

  • Yes. This is a “good thing,” for it means that a multitude of low-cost hardware will be able to access the network. Proprietary spectrum has some advantages for local governments and, generally, some is available to them for various safety functions, but such networks cannot practically be opened for public use.

6. What technologies are specified….WiFi, WiMax, etc…?

  • WiFi is specified. The suggested hardware is software upgradeable.

7. What applications are supported; either explicitly or through the specification of indicative standards?

  • Support for a wide range of applications–including surveillance video, voice, data, mobile communications with seamless roaming, VPN, and meter reading–is in the specs.
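A quick aside on the mesh-ratio arithmetic behind question 3. A common rule of thumb (my own back-of-the-envelope assumption here, not anything stated in the RFP) is that usable throughput in an 802.11 mesh roughly halves with each wireless hop a node sits from a fiber-fed aggregation point. A tiny sketch of what that implies:

```python
# Back-of-the-envelope mesh capacity estimate.
# Assumption (mine, not the RFP's): usable throughput roughly halves
# with each wireless hop between a node and a fiber-fed aggregation point.

RATED_80211_MBPS = 54  # nominal 802.11g signaling rate mentioned above

def per_node_mbps(hops_from_fiber: int, rated: float = RATED_80211_MBPS) -> float:
    """Crude per-node throughput estimate after hops_from_fiber wireless hops."""
    return rated / (2 ** hops_from_fiber)

# At a 5-or-6-to-1 ratio many nodes sit two or more hops out; at LUS's
# 1:1.3 nearly every node is zero or one hop from fiber.
for hops in range(4):
    print(f"{hops} hop(s) from fiber: ~{per_node_mbps(hops):.1f} Mbps usable")
```

The point of the sketch: with almost every node a single hop from fiber, the network keeps most of the rated 54 megs available, which is why such a low ratio implies astonishing capacity.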

Long story short: There is nothing here that would impede using this as the core of a very capable public wireless network. Caveat: there is no particular reason for me to assume that it will be — beyond sheer desire and my own belief that a wireless component will be necessary in the coming competition with AT&T/BS and Cox.

Municipal Campaign Strategy; Learning from Lafayette

So what did we all learn from the battle of Lafayette? I’ve been asked recently and have been thinking about it some…What follows is a first draft which focuses pretty much on the active strategies of the two sides as I see them—on what they tried to accomplish and where they wanted the conversation to go. This ignores some interesting larger factors (like trust in the mayor, or the relaxed southern Louisiana attitude toward government, or Lafayette’s peculiar ways of organizing influence, for instance) that could be considered important but remain background factors. It also mostly ignores the tactical questions–how the strategies were enacted–that are some of the more interesting things to come out of this fight. Instead this is a more bird’s-eye view of what, it seems to me, both sides might have learned from Lafayette’s fight for fiber.

First off, it’s pretty apparent that the incumbents don’t have much new up their sleeves. The campaign they waged here mirrored campaigns they’ve waged in the past. We didn’t see as dramatic a finish as we saw in the Tri-Cities, but that may well be because the battle was already lost for BellSouth and Cox before the end arrived. But that doesn’t mean that their basic idea about what makes for an effective campaign has changed: the basic strategy of sowing fear, uncertainty, and doubt seems pretty constant. The tactics seemed to involve a lot of replays as well…Push polls were used here, albeit pretty counterproductively. We got two last-minute, overexcited direct mail pieces focusing on false claims about taxes, the repeatedly disproven idea that all municipal broadband (or even most) is failing, and silliness about the debt families are supposedly taking on. Too, as in the Tri-Cities, an editorial writer who played a prominent role in the opposition was taken to task for unseemly involvement with the incumbents or their allies. The tactics were mostly retreads; what was different was that the predictable campaign was not fronted entirely by the incumbents themselves but, especially in the last days, by their allies at Fiber 411.

One of the things the incumbents learned here was that long campaigns are bad for them. Given time, and an aggressive willingness to fight back, lies can be disproved, push polls turned to outrage, and promoting fear and insulting the intelligence of the locals begins to sour any possible relationship with the community. In Lafayette the fight went on for too long. The incumbents had to trot out their best weapons too early and pro-fiber partisans were able to correctly label them as FUD and drive home the message that the incumbents were not being truthful—a message that inoculated the public against further last minute lies.

Unfortunately, I think the incumbents also learned that, saved for the last minute and promoted through a local proxy, their FUD (fear, uncertainty, and doubt) approach can still be effective. I agree with Don’s analysis that the last-minute mailers, the full-page ads that simply reprinted a (non)local editorialist’s massively inaccurate take, and the automated phone calls about a new fantasy “debt” issue were effective. They were simply not effective enough. The local pro-fiber groups kept up a dogged insistence, even during the incumbents’ quietest moments, that the incumbents and their allies were not truthful. Radio time remained filled with a recut version of the push poll and Lafayette Coming Together (LCT) was relentless in pushing the issue. LCG and LUS, while toning down this message near the end and moving it away from Terry and Joey, never fully abandoned it.

What the pro-municipal fiber forces learned was probably more valuable: that they can win. The overwhelming economic power of the incumbents can be blunted. Their willingness to leave accuracy and truthfulness aside in the pursuit of their own interests can be turned against them. What it takes is something that most municipal officials will not have the stomach for: a full-throated attack on some of the most powerful corporations in their city. Telephone corporations have a long history of being the most “generous” investors in state election campaigns and the most powerful lobbying force in state legislatures. Cable companies control what politicians understand to be the most powerful media in town. Lafayette was willing to fight with a strong local and populist message that clearly labeled its opposition as “greedy,” “out-of-state” “monopolies.” The spectacle of our Mayor and the head of the utility system “standing up” for Lafayette in a press conference after every bit of misinformation spread by the incumbents, and being uncompromising in calling them on each and every false claim, was crucial to the campaign. Driving home the message that the incumbents’ self-interest and greed were driving this process was invaluable in resisting the final onslaught.

There is little doubt that Lafayette had advantages that might not be available in all locales. The bravery of the leadership and its willingness to call a monopoly a monopoly, and greed greed, has already been noted and was tremendously important. There was also a determined, deliberately broad-based coalition of citizens that made it hard to paint the project as one fostered by wealthy technocrats. The coalition group, Lafayette Coming Together, was also quite sophisticated about the use of both old and new media. But the greatest advantage was a pride of place born out of a realistic belief that the region, and Lafayette as the heart of that region, is unique and not subject to rules imposed on us by outsiders. Its mixture of cultures, its cultural identities, and the ways the people have found to sustain their cultures make it very difficult for outsiders to successfully come in and imply that the locals are incompetent or to introduce effective divisive tactics. (One of the more despicable strategies, used all the way through and culminating in simple lies on Black radio near the end, was to try and split the Creole and black communities away from the rest of the community by using historical resentments which had nothing to do with the issue at hand. Without the aid of community leaders this attempt did not take hold. But the attempt is destined to be one of the longest-remembered stains on the campaign of the incumbents and their allies.) Most communities have never had to develop that sort of resilience in the face of outside disapproval, but the communities of Acadiana are very good at dismissing outsiders.

Other considerations that helped support a victory in Lafayette appear to be a result of the incumbents’ market and national policy worries. Fights like the one in the Tri-Cities can be considered Pyrrhic victories—the cost was high, not necessarily in terms of money, but in terms of their reputation both locally and nationally. The cable and telephone companies simply are regional monopolies in their core business, and maintaining favorable regulatory relations at the state level and franchise agreements at the local level depends upon their being perceived as good, or at least benign, local citizens. It will surely take a decade or more to regain that status in the Tri-Cities; even voters who succumbed to the arguments of the incumbents could not help but notice the fear-based tactics that were used to bring them along. There was no large federal issue ongoing at the time of the fight in Illinois. But major initiatives of both the cable and the phone companies are before statehouses and, more importantly, the Congress. The centrally important 1996 telecom act is up for revision this legislative season, to cite but one example. An ugly, high-profile attack on Lafayette, when the defenders were willing to fight back by identifying the incumbent corporations as “greedy monopolists,” may well have been too much to stomach for those at corporate central who felt they had bigger fish to fry and too much to lose to risk that sort of battle in a single small city.

Finally there is the basic market motivation: too much bad behavior damages the bottom line–if you lose. Surely BellSouth and Cox had done their own polling and could read the writing on the wall as well as anyone. The referendum was going to succeed, and polling no doubt showed that the population’s first reaction to a new round of misinformation would turn more people against them than it gained. If there was any doubt about that, the swift and overwhelmingly hostile reaction to the second push poll this summer proved the point that the usual incumbent tactics had become counterproductive. The hard truth was that BellSouth and Cox still had to compete in Lafayette, and a loss in a full-scale assault would have immediately pushed the likely “take rate” among voters past 50% if corporate behavior turned a “Yes” vote into a vote against Cox and BellSouth. Working through proxies and saving the mail pieces and scare phoning until the end, when they could not be answered, might well have been all that could be done without damaging their market position by turning the referendum into a marketing tool for LUS.

Lafayette’s battle deserves, I believe, to be seen as one model for regaining local control of crucial monopoly infrastructure. The underlying populist message of local self-determination and legitimate anger toward regional monopolies like BellSouth and Cox was what drove the winning argument in Lafayette. People saw nothing wrong with building for themselves a network that the incumbents refused to build for them. Similarly, people do understand that these companies are monopolies whose bottom line has nothing to do with what is best for the communities across the country in which they reside. That is the core upon which electoral success was built. Lafayette’s leadership, her aware citizens’ group, a committed ‘old Lafayette’ leadership, and the way her cultural distinctiveness played out made the message relatively easy to develop and denied the opposition virtually all local assets. Other communities might not share those particular advantages, but the anti-incumbent message that can win has now been established, and future communities can sharpen the message and develop their own resources.

Lafayette can be proud to have developed a winning model and strategy—not without help of course, but with plenty of verve. It will be up to our successors to sharpen the tool and make it more generally useful.

“Die TV. Die! Die! Die!” or “Why You Want Real Bandwidth”

Television is really aggravating. We are so used to it that we forget how irritating it is most of the time, but occasionally something happens to remind us just how bad things are. And we go off on TV (and sometimes even go off it for awhile). But we almost never realize why it is so bad.

We hate our TV because of limited bandwidth.

A fella named Ernest Miller reminded me of this with a post of his called “Die Channel. Die! Die! Die!” Ernest is one of those brilliant men who sit down, locate a problem of real substance, and try to fix it. His area is the intersection of law and technology. He’s at Yale now and is noted for his work on modern copyright issues. But his complaints about having to watch TV on someone else’s schedule and about the artificial lengths of TV shows are what led me to think once again about how irritating TV is.

And I think we hate our TVs because of long-standing bandwidth limits.

Things to be justly irritated by:

  • Your favorite show is scheduled at a fixed time every week. (But your schedule isn’t fixed to match!)
  • Somebody in New York thinks all the good stuff ought to come on while you want to sleep. (And you refuse to change your sleeping habits or job to accommodate that New Yorker!)
  • Apparently there is some “normal” person in Kansas whom all these shows are supposed to please mildly without offending very often. (But this fare pleases you about as well as the food in Kansas…you want something with a little more life!)
  • Someone has made up a rule that TV shows can only be shown in increments of a half-hour. (But you are irritated by shows that have 23 minutes of decent content and 7 minutes of utter fluff!)
  • Every time something dramatic or interesting is about to happen on a TV show, they go off on a commercial break. (Even worse, you suspect that the only reason anything interesting happened was so that you’d hang around till the commercials were over!)
  • 212 channels and they can’t find anything worth watching? (What’s that about? A rerun of the Mary Tyler Moore Show is my best choice? Why?)
  • Not only that–but all that junk is expensive. (I hate paying for stuff I not only don’t like but wouldn’t have in my house if I had a choice!)

All that can be attributed to limited bandwidth — to bandwidth that is rare and therefore expensive. Now nobody much thinks about it this way right now. But that is because you seldom can see what the problem is until it has been solved. And I suspect that the problem with TV is about to be solved.

The solution is Downloadable Video (DV instead of TV). You go to the internet and find the show you want to watch, (pay probably), download it, and watch it.

You can:

  • Watch episode one at 7:12 one Wednesday night and episode two at 2:00 the next Thursday if it suits your schedule.
  • Watch your favorite show at 3:15 in the afternoon every day and sleep when you want, thank you very much.
  • You don’t have to watch anything that guy in Kansas would watch. And you don’t have to eat his food, either.
  • Some episodes of a show are 52 minutes long and some are 68 minutes long and it is all good stuff, ’cause nobody bothers with fluff if it doesn’t have to fit the schedule of some advertising executive.
  • The rhythm of DV shows is not determined by advertising breaks the way that TV shows are. The plot actually drives the show. At first it seems weird but it’s easy to get used to.
  • You’re not limited to 212 channels. Like bass fishing? Download your favorite show from 1982. Have a strange sense of humor? Download 12 Andy of Mayberrys and have a party with an Aunt Bee theme.
  • You pay for what you download. But you only pay for what you want to watch. None of that awful schlock. (Unless you like awful schlock–then you can have as much as you want—there is plenty.)

But you can’t fix TV this way unless you have real, big, bandwidth—cheap. Fiber to the home is the way out of the wasteland. Nothing else will provide adequate bandwidth to do this and everything else you might want to do at the same time. It is the future. Even after we get big bandwidth it will take a while to mature. Only those companies that have capacity to burn will be able to compete. And only those communities that have really big bandwidth will get it early. It will be well worth having, don’t you think? Replace your TV with DV.
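To put a rough number on “real, big bandwidth,” here is a quick back-of-the-envelope calculation. The file size and link speeds below are my own illustrative assumptions, not figures from any provider: call an hour of DVD-quality video about 2 GB, and compare download times across pipes.

```python
# Illustrative download-time arithmetic (all numbers are assumptions
# made for this example, not real provider specs).

FILE_SIZE_GB = 2.0  # assume ~2 GB for an hour of DVD-quality video

def download_minutes(link_mbps: float, size_gb: float = FILE_SIZE_GB) -> float:
    """Minutes to move size_gb over a link rated link_mbps (megabits/sec)."""
    size_megabits = size_gb * 1000 * 8  # GB -> megabits (decimal units)
    return size_megabits / link_mbps / 60

for name, mbps in [("1.5 Mbps DSL", 1.5), ("6 Mbps cable", 6.0),
                   ("100 Mbps fiber", 100.0)]:
    print(f"{name}: ~{download_minutes(mbps):.0f} minutes")
```

On the hypothetical DSL line that is roughly three hours per show; on fiber it is a few minutes: the difference between DV as a curiosity and DV as a TV replacement.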

You can put in an order on July 16th by voting Yes!, For Fiber.