Credit Where Credit’s Due: Cox & Privacy

Credit where credit is due: Cox Communications, according to an entry on Wired’s “Threat Level” blog, is doing right by the public, protecting its customers’ privacy from illegal government intrusion to the fullest extent permissible.

Kudos.

Currently in Lafayette, and much of Louisiana, the choice for telecommunications services is between Cox and AT&T. If protecting your privacy from illegal government surveillance is important to you it appears that you’d be well-served to switch to Cox. (AT&T has been nailed repeatedly for complying with illegal requests.)

The blog entry is pretty much a set of reporter’s notes on a story he wrote for Wired, “Point, Click, Wiretap: How the FBI’s wiretap net operates.” The main story documents a pervasive network of surveillance, with the FBI constantly tied into private providers’ communications centers across the country using a network physically separated from the regular internet. That network, according to the illustration from Wired at right, must run through Lafayette on its way from New Orleans to Beaumont, either on I-10 fiber or up US 90 along the railroad.

The FBI has quietly built a sophisticated, point-and-click surveillance system that performs instant wiretaps on almost any communications device…

The surveillance system, called DCSNet, for Digital Collection System Network, connects FBI wiretapping rooms to switches controlled by traditional land-line operators, internet-telephony providers and cellular companies. It is far more intricately woven into the nation’s telecom infrastructure than observers suspected.

It’s a “comprehensive wiretap system that intercepts wire-line phones, cellular phones, SMS and push-to-talk systems,” says Steven Bellovin, a Columbia University computer science professor and longtime surveillance expert.

DCSNet is a suite of software that collects, sifts and stores phone numbers, phone calls and text messages. The system directly connects FBI wiretapping outposts around the country to a far-reaching private communications network.

The backstory: during the Clinton administration, federal law enforcement agencies, complaining that digital communications made wiretapping increasingly ineffective, asked for a law that would force all private network operators to install only hardware and software that allowed easy, centralized information capture. That law, commonly labeled CALEA, passed and was augmented post-9/11 by the Bush administration. An FCC ruling this year extended CALEA compliance rules to all VOIP providers, facilities-based like AT&T or independent like Vonage. That, in conjunction with elements of 911 compliance, ensures that constant monitoring is possible. (You can, however, personally encrypt your communications, though few do. Carrier-provided encryption must, by law, be trap-doored and the trapdoor made available to governmental agencies that legally request it.)

What the story documents is just how the FBI has implemented this law, just how easily it can be done, and just how extensively such monitoring occurs.

It’s not news that the large telecom corporations, intricately dependent upon federal regulation to protect their competitive positions, extensive subsidies, and spectrum “property,” are pretty cravenly submissive to whatever the Feds ask of them. What is news, in a man-bites-dog sort of way, is when one of them resists giving the administration whatever it wants. Qwest has earned kudos in the past, and now it appears that Cox has also done “the right thing.” From the blog:

Cox Communications lawyer Randy Cadenhead was also key to the story. Among the things that didn’t make it into the final piece is that Cox is the only major telecom company to publicly publish its forms and fees for wiretaps. That documentation, which doesn’t reveal any national secrets, should be on every telecom’s website, in the interests of transparency. Unfortunately, none of the largest wireless carriers do so, nor did they, with the notable exception of AT&T, respond to requests for comments on the story.

Cadenhead also noted that Cox Communications did not participate in, or have any knowledge of, other wiretapping programs that have recently been in the news (read: warrantless wiretapping).

Now it should be noted that this leaves open the possibility that Cox simply was not asked to join the cabal. But as the third-largest cable carrier and a VOIP leader in its field, that seems unlikely. Nor does it mean that Cox hasn’t complied fully with CALEA requirements; it surely has. It could also be that once the FBI is locked into an aggregation point on Cox’s network it wouldn’t have to ask Cox to do anything in order to “wiretap”—illegally or otherwise. In that case Cox’s denial would be disingenuous: the FBI would present a warrant for legal wiretaps and wouldn’t present one, and thus Cox wouldn’t “know about,” any illegal ones.

But that caveat aside, it does appear that the reporter and the Cox representative believe that Cox is not cooperating with illegal wiretaps. And we know that AT&T is. One more reason not to hang up the phone when that annoying guy from Cox calls during dinner trying to push VOIP.

(And, oddly, one more reason to be eager to see LUS enter the market. As a public agency LUS will be no less obligated to obey the law than any private corporation–but it will also, by law, be unavoidably much more transparent than any private corporation. Public agencies can be required to submit records that make much of what they do visible (rightly so). What that means for black-hat operations like those we’ve seen recently is that those running them would be wise to avoid trying to impose their illegalities on utilities like LUS, which cannot hide their interactions from public scrutiny.)

“Speed up, with fiber”

They’re figuring it out in Minnesota….

“The focus has changed. It’s really all about speed,” Garrison said. “Wireless is the icing on the cake. It’s not the cake itself.”

Last year’s must-have, municipal Wi-Fi – a relatively cheap and quick-to-install way for communities to get a broadband fix – is losing some allure.

Instead, this year more cities appear to be asking: Got fiber?

A ‘FUTURE-PROOF’ OPTION

Fiber is the undisputed future of telecommunications, experts say…

Broadband advocates nationwide are realizing that what’s really needed is fiber, something we’ve already acted on here in Lafayette. It’s nice to be out front on something other than obesity and rates of imprisonment.

Read on at “Speed up, with fiber” at the Twin Cities Pioneer Press. Worth the click.

AT&T $10 DSL Vanishes (Again)

AT&T continues to hide its $10 DSL program. The plan, mandated by the FCC when it allowed AT&T to merge with BellSouth, was intended as a sop thrown to consumers for the loss of potential “competition” between the two monopolies. As previously reported here, it is nearly impossible to find a way to apply for the “deal.”

Now the Hear Us Now blog (a project of Consumers Union–the one that doesn’t accept advertising) has a story detailing the difficulty a consumer reporter/advocate in St. Louis had in getting access to the plan. Apparently AT&T told the reporter that it had “fixed” the website to make it more accessible. But, if the changes were ever really implemented, they have vanished again. According to Consumers Union, one person did succeed…but more than 20 were unable to make it work.

Sounds like lawsuit bait to me. The law is also supposed to apply to large corporations…even if they do curry favor with the administration by cooperating in illegal spying on the American people.

The $10 deal is supposed to be available to any new broadband customer that has AT&T service. I’d be very interested in the experiences of any local folks who’ve tried to get the deal.

Update 11:51:
Tennessee’s Regulatory Authority has some questions about it:

Phone company officials also say they’ve made changes to make the $10 Internet easier to find on its Web site. Hicks explained the new six-step process of finding the offer online at Monday’s TRA meeting.

Jones, the TRA director, asked Hicks, “How will citizens who don’t have Internet connections be able to take advantage of the offer if you don’t advertise to them in some medium other than on the Internet?”

“I think there’s been a lot of media coverage about the $10 offer and they would have general knowledge of it,” Hicks said.

He said customers who don’t have Internet access at home could “go to a friend or family member’s computer or the public library computer.”

Cough, cough. Six steps? Media coverage of an alternative you’re hiding? The only way to buy internet access is to already have access to it? AT&T doesn’t pay hacks like Hicks enough, I’m certain. How much is your pride worth?

VOE: Fixing What is Wrong with Muni WiFi

Voice of Experience Department.

[Lafayette’s decision two years ago, in voting to build a Fiber To The Home system rather than a cheaper, less capable wireless system, is being validated by current events. The emerging pattern suggests that local citizens might end up owning the nation’s most impressive model of a real, inexpensive municipal network with modern bandwidth and workable mobility. Read on…]

Business Week picks up on current net buzz about the difficulties encountered by municipal wifi networks, and the story does a good job of laying out the current unhappy state of such projects. It’s a sad story for a lot of people in a lot of places.

The static crackling around municipal wireless networks is getting worse.

San Francisco Wi-Fi, perhaps the highest-profile project among the hundreds announced over the past few years, is in limbo. Milwaukee is delaying its plan to offer citywide wireless Internet access. The network build-out in Philadelphia, the trailblazer among major cities embracing wireless as a vital new form of municipal infrastructure, is progressing slower than expected.

My friends in Philly say the network is pretty near useless where it is up—service is beyond spotty and it comes and goes unpredictably. The boards tell a similar story in Corpus Christi, where Earthlink, a private provider, bought the municipal network with a promise of upgrades. Google’s hometown Mountain View network isn’t anything to brag on either. The problem isn’t limited to public networks; difficulties seem to be hitting private and public muni wifi WANs (Wide Area Networks) pretty much equally.

There has been a lot of doom and gloom about the problems muni wifi networks are encountering (the Business Week article among them) and there has been the inevitable reaction to that on the part of advocates pointing out the immaturity—and naivete—of the original business plans. Business Week does, at the end of the story, note that a more mature business plan relies on the local city government being involved:

To make the business more profitable, Wi-Fi service providers are trying to pass more of the cost to the cities. “There’s no one that I am aware of right now who’d build a network without the city as a paying customer,” says Lou Pelosi, vice-president for marketing at MetroFi, which six months ago stopped bidding for projects unless the city agreed to become the network’s anchor tenant.

Advocates imply that a naive business plan is all that is wrong with the current crop of wide area wifi networks. Would that it were so.

The doom and gloom is overstated. But the truth is that the version of muni wireless that emphasized cheap (or free) residential service using a wireless mesh to minimize costs was always a castle built on shaky technical ground. From the beginning the fundamental concept was that you’d take a single expensive connection to the net, divide it up like the loaves and fishes among many users, and still end up with sufficient connectivity to feed the masses. Thinking that way was hoping for the sort of miracle that doesn’t occur in our daily world. An analogy: take your home connection and “share” it with most of your neighborhood. That might work at times, but service could never be very fast or reliable. (Yes, it’s more complicated; I know–but that’s a fair analogy.) Additional problems having to do with the nature of the spectrum allocated to wifi (short-range power limits and a frequency that has trouble cutting through vegetation or walls) added the limitations of physics to the questionable network design decisions.
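To put rough numbers on that loaves-and-fishes arithmetic, here is a back-of-the-envelope sketch. The 5 Mbit/s backhaul, forty users, and the halve-per-hop rule are illustrative assumptions, not measurements from any real network:

```python
# Rough, illustrative arithmetic for "sharing" one wired backhaul over a
# wireless mesh.  All figures below are assumptions for the sake of example.

def per_user_bandwidth(backhaul_mbps, users, hops=0):
    """Naive estimate: split the backhaul evenly among active users, then
    halve the result for each wireless mesh hop -- a common rule of thumb,
    since each relay must receive and retransmit on the same spectrum."""
    return backhaul_mbps / users / (2 ** hops)

# One 5 Mbit/s wired connection shared with 40 neighbors, two mesh hops
# from the gateway:
print(per_user_bandwidth(5, 40, hops=2))   # 0.03125 Mbit/s -- dial-up territory
```

Even before reliability enters the picture, the division alone pushes each user back toward dial-up speeds.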

Those problems can be overcome. It’s not even a twelve-step program. Two will do.

Step One is to abandon the idea that a wifi network will ever work well as a person’s primary, reliable, home connection to the full richness of the network.

Rock solid reliability is not in the cards for wifi–and affordable access to a reliable always-on connection is a prerequisite for full participation in the emerging digital culture.

You will need a hardwired, preferably Fiber To The Home connection if you plan to make full, reliable, consistent use of downloadable video, cable TV, Voice over IP, security alarms, medical monitoring and the like. With a fiber connection every individual can easily and cheaply provision their own in-home wifi network if wireless suits their style.

Any community that takes that stand abandons at one blow all the unrealistic demands that wifi technology simply cannot fulfill. Concentrate on ubiquitous local coverage, emphasize mobility, and help people understand that cell-phone levels of reliability are the best that can be hoped for. (That level of service would be a huge boon even without the unrealistic expectations: with ubiquitous coverage I could get a connection anywhere while on the go. I might not be able to do everything with it I could do at home–but I could do almost anything I can imagine wanting to do on the run. Including, in the best case, which I’ll get to below, mobile, albeit cell-quality, VOIP.)

I do understand, and deeply sympathize with, the hope that cheap wifi could help close the digital divide. There may still be a role for it there if the bandwidth issues can be overcome (again, see below). But the reliability issues, arguably, are fundamental, and the hacked-up solutions necessary are unstable and too technically exacting to expect large populations to manage on their own. Pretending that wireless connectivity is the same as wired connectivity is profoundly misleading—and is a recipe for creating a second-class version of net usage where poorer users simply can’t rely on the net being there, and so aren’t able to trust it enough to make it as central to their lives as their better-off brethren do. Imagine what would have happened to telephone usage in our culture if the well-off got good, reliable, always-on wired phone service, but “other people” got cheap, spotty “radio” service on “garbage” bandwidth that might or might not work on any given day or in any given location. That sort of divided service model was avoided in our phone history, and if we try it today it will cause trouble downstream that I, for one, would rather avoid. The real solution to too-expensive wired network connections is cheap wired network connections. And that is the solution that any conscientious community should seek. [I am grateful that that is the solution Lafayette has sought—LUS proposes to narrow the digital divide by making service significantly cheaper.]

With cheap, wired, reliable, big broadband available in every home the threshold moves to making some form of connection available on every corner (ubiquity) and making it available while you are on the move (mobility). That’s what wireless networks are good for—and why cell phones, as unreliable as they are, remain useful and hence popular.

Step Two is to abandon the belief that wireless mesh networks can be used to turn an expensive wired connection into many cheap wireless ones.

It can’t; only Christ could manage the miracle of the loaves and fishes.

Build, instead, on the real virtues of wireless networks: ubiquity and mobility. Do your absolute best to minimize its weaknesses by making it as fast and reliable as possible within the confines set by physics and federal regulation.

Abandoning the idea that one connection to the wired broadband internet can serve many users over a broad area well is the key to succeeding. Instead of designing the wireless network so that each wired connection feeds five, six, or more wifi access points, limit the ratio of access points to internet connections to 1:1. This makes for much less sharing of limited bandwidth among users, greater reliability, and dramatically reduced “latency” (the lag caused by multiple jumps that makes VOIP phones impractical on most muni networks).

Better yet, attach your wifi network directly to a full-throttle fiber network. Feed the entire capacity of the wireless protocol. (Outside of a few university or corporate campuses, very few of us have ever used a wireless network that ran as fast as it could. The usual limiting factor is the wired network that supplies bandwidth to the wifi. If Cox or AT&T only gives me 5 megs of wired bandwidth to my access point, then the 54 Mbit/s that is theoretically possible is limited to at most 5 Mbit/s. You’ll never see the other 49 Mbit/s no matter what it says on the side of the box.) A fiber network can easily supply a minimum of 100 Mbit/s to the wifi access point; split that 100 once to a second wifi access point and something close to the full 50 megs of bandwidth that wifi is actually capable of delivering could be seen on the street. Even split among a sizeable group of users on two nodes, that would be plenty fast enough to support excellent-quality VOIP with no discernible lag, great data connections, and many, many extras. Even if it turned out to be less reliable and a bit slower in use than its wired counterparts, the virtues of ubiquity and mobility would be there—and our willingness to use cell phones proves that we find this trade-off acceptable.
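The difference between the two designs is simple arithmetic. A sketch using the round numbers above (the six-access-points-per-backhaul mesh figure is my own illustrative assumption):

```python
# Compare the street-level speed of the two designs discussed above.
# Figures are the article's round numbers plus illustrative assumptions.

WIFI_REAL_WORLD_MAX = 50   # Mbit/s usable out of 802.11g's nominal 54

def street_level_speed(backhaul_mbps, aps_per_backhaul):
    """Bandwidth actually available at each access point: the wired
    backhaul split among the APs it feeds, capped by what the wifi
    radio itself can deliver."""
    return min(backhaul_mbps / aps_per_backhaul, WIFI_REAL_WORLD_MAX)

# Mesh design: one 5 Mbit/s wired line feeding six access points.
print(street_level_speed(5, 6))     # ~0.83 Mbit/s per AP
# Fiber-fed design: 100 Mbit/s split between two access points.
print(street_level_speed(100, 2))   # 50 -- the wifi radio is now the limit
```

In the fiber-fed case the bottleneck finally moves to the wireless protocol itself, which is exactly where you want it.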

A wifi network built this way would be as much superior to its wireless competitors as the fiber network would be to its wireline competitors.

But getting to that dream requires abandoning unrealistic expectations…and starting with a fiber network running down every street.

Lafayette is positioned to realize the ultimate dream: a cheap, blindingly fast, reliable, fiber-optic connection made available to every home and, based on that, a solidly architected, cheap, uniquely fast municipal wireless network that is demonstrably better than any muni wifi network in the nation.

Living large in Lafayette.

(Thanks go out to reader Scott who forwarded the story.)

A Little Panicky in Seattle

There’s an odor of panic out in the world of American broadband advocates, and it extends even to places like Seattle (home of Microsoft)–where a broadband panel has been making achingly slow progress toward creating a fiber plan.

An impatient Seattle Times columnist, Brier Dudley, announces that he’s changing his tune on municipal broadband. He had held out for the city to build a fiber network itself. But he’s getting a little panicky. He’s worried about two things: weak-kneed politicians and malevolent incumbents. A fatal combination.

Dudley worries that the politicians haven’t been able to pass a net neutrality bill and don’t even seem to realize that universal broadband service ought to be national policy—as universal phone service is. And he’s noticed that incumbent AT&T (yes, our AT&T) recently decided to censor a Lollapalooza web broadcast by Seattle homeboys Pearl Jam containing lyrics that attacked George Bush’s political policies. (Didn’t hear about that? Still thinking that maybe the “broadband monopolists” wouldn’t dare censor the internet? Let MTV disabuse you.) But the offense that pushed Dudley over the line was having a friend who used too much bandwidth and had his Comcast broadband terminated. Apparently cable company Comcast didn’t like the number of movies he was downloading. But it wasn’t willing to tell him, or the reporter, how much was too much.

So, apparently people are starting to get a little panicky about the state of US broadband, even—or especially—in tech meccas like Seattle. And they are beginning to be willing to do previously unpalatable things to get out from under a regime that does little to rein in monopoly power and a set of monopolists constitutionally unable to stop abusing their position.

Dudley hopes that:

If Seattle isn’t led astray by its broadband partners, it could build an island of neutrality that would attract Internet companies and set a precedent for universal service.

That’s a big “IF.” Dudley shouldn’t give up on public provision and purely local ownership. Making the broadband provider directly responsible to the public is the only reliable path toward freedom…and reasonable, powerful service…in US broadband.

Seattle needs to follow Lafayette’s precedent. The “Lafayette solution” is the last, best, hope for a free internet in the US.

$10 DSL Revisited

New AT&T CEO Randall Stephenson talks to the Atlanta Journal-Constitution about, among other things, the “hidden” $10 DSL program that I’ve noted recently. The FCC required that AT&T offer this discount program—and an accompanying “naked DSL” program for a bit more—as a condition of allowing BellSouth and AT&T to merge. As it turns out, in my experience and the experience of others, it is inordinately hard to find and order. Not a few people think AT&T is avoiding keeping its word.

Randall says that you don’t really want it. Is that true? If you’ve tried to get it I’d like to hear your experiences. (John2 “at” LafayetteProFiber “dot” com)

In his own words:

Q: Of all the things the AJC has written about AT&T lately, none has caused more reader irritation than AT&T’s $10 a month DSL offer, which was required by the Federal Communications Commission when you bought BellSouth. A lot of folks said they couldn’t find it. It was hard to find on your site. Why?

A: We haven’t made it difficult to find. To be honest with you, that’s not a product that our customers have clamored for. We still have $15 offers out there in the marketplace, even $20 offers, for 1.5 megabit speeds. Those are really kind of the minimum speeds that give a good user experience. So I don’t want to necessarily offer up a product where the user experience is not what I would consider really state of the art. That $10 product is kind of in that mode.

Congressional Policy & Lafayette

Dick Durbin, Senator from Illinois, has been holding a nightly public forum on telecom policy issues online over the last three nights, and tonight’s question is: “What do we need to do to encourage investment in broadband infrastructure?” Lafayette’s network is being featured as an example:

Tonight, I’d like to focus on other ways to provide incentives to build broadband networks. Public/private initiatives like Connect Kentucky have achieved success where the market alone has failed. Other projects like Lafayette, Louisiana’s Fiber for the Future and Utah’s UTOPIA project have also made significant steps.

Durbin also features Lafayette as an example on the video lead-in to the forum:

(Sorry, video no longer exists on YouTube https://www.youtube.com/v/Ca4ioHHaBYs).

Louisiana is being mentioned in the same light as Connect Kentucky and the UTOPIA project—both state-wide efforts that have garnered a lot of positive comment in Washington and on the net. Each night has featured well-known national experts and advocates of broadband. Tonight’s forum features Jim Baller, who aided LUS and Lafayette during the fiber fight, Paul Morris of UTOPIA, and Andrew McNeil of Connect Kentucky.

Lafayette’s partisans might want to attend the forum at 6:00. Durbin is hoping to draft new law on broadband availability, and this discussion is a chance to talk to a major policymaker directly. Federal legislation is one of the few forces that might get AT&T and Cox off LUS’ back. The format is a “live blog” done in what I think of as “Drupal style”–meaning that there is a long string of responses, and responses to responses, and anyone can pitch in with their remarks. The first three nights have been interesting, and this last one, with its exploration of real, in-the-world alternatives, promises to be even more contentious and useful.

NOTE: the active forum has opened up at a new URL. Go to: http://openleft.com/showDiary.do?diaryId=451

PhotoSynth & Web 2.0—Worth Thinking About

Sunday Thought Dept.

Warning: this is seriously different from the usual fare here but fits roughly into my occasional “Sunday Thought” posts. I’ve been thinking hard about how to make the web more compelling for users, and especially how to integrate the local interests that seem so weakly represented on the internet. As part of that exploration I ran across a research program labeled “PhotoSynth.” It offers a way to integrate “place” into the abstract digital world of the web in a pretty compelling way if your interest is in localism: it automatically recreates a three-dimensional world from any random set of photographs of a scene and allows tags and links to be embedded in them. Once anyone has tagged a local feature (say, the fireman’s statue on Vermillion St.) or associated a review with a picture of Don’s Seafood downtown, everyone else’s images are, in effect, enriched by their ability to “inherit” that information.

But it seems that it is a lot more than just the best thing to happen to advocates of web localism in a long time. It’s very fundamental stuff, I think, with implications far beyond building a better local web portal…. Read On…

—————————-
Photosynth aka “Photo Tourism” encapsulates a couple of ideas that are well worth thinking hard about. Potentially this technical tour de force provides a new, automated, and actually valuable way of building representations of the world we live in.

This is a big deal.

Before I get all abstract on you (as I am determined to do) let me strongly encourage you to first take a look at the most basic technical ideas behind what I’m talking about. Please take the time to absorb a five-and-a-half-minute video illustrating the technology. If you’re more a textual learner you can take a quick look at the text-based, photo-illustrated overview from the University of Washington/Microsoft lab. But I recommend trying the video first.

(Sorry this video was removed by YouTube).

You did that? Good; thanks…. Otherwise the rest will be pretty opaque—more difficult to understand than it needs to be.

One way to look at what the technology does: it recreates a digitized 3D world from a 2D one. It builds a fully digital 3D model of the world from multiple 2D photos. Many users contribute their “bits” of imagery and, together, those bits are automatically interlinked to yield, out of multiple points of view, a “rounded” representation of the scene. The linkages between images are established on the basis of data inside the images–on the basis of their partial overlap—and ultimately on the basis of their depicting things that actually exist next to each other—and this is done without the considered decisions of engaged humans.
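A miniature sketch of that linking-by-overlap idea: photos become connected not by hand-made hyperlinks but by shared visual content. Real systems match thousands of automatically extracted image features; the tiny feature-id sets and photo names below are invented purely for illustration:

```python
# Toy sketch: link photos whose (hypothetical) visual-feature sets
# overlap enough to treat them as views of the same scene.  Real
# pipelines extract and match SIFT-style features; here a photo is
# just a named set of feature ids, an assumption to keep things tiny.

from itertools import combinations

def link_photos(photos, min_shared=3):
    """Return pairs of photo names whose feature sets share at least
    `min_shared` features -- the automatic 'links' between images."""
    return [(a, b) for a, b in combinations(sorted(photos), 2)
            if len(photos[a] & photos[b]) >= min_shared]

photos = {
    "statue_front": {1, 2, 3, 4, 5},
    "statue_side":  {3, 4, 5, 6, 7},   # overlaps the front view
    "fountain":     {20, 21, 22},      # a different scene entirely
}
print(link_photos(photos))   # [('statue_front', 'statue_side')]
```

No human decided those two statue photos belong together; the overlap in what they depict did the linking.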

Why is that a big deal?

Because it’s not all handmade. Today’s web is stunningly valuable, but it is also almost completely hand-made. Each image or word is purpose-chosen for its small niche on a web page or in its fragment of context. The links that connect the web’s parts are (for the most part) hand-crafted as well and represent someone’s thoughtful decision. Attempts to automate the construction of the web, to automatically create useful links, have failed miserably—largely because connections need to be meaningful in terms of the user’s purpose, and algorithms don’t grok meaning or purpose.

The web has been limited by its hand-crafted nature. There is information (of all sorts, from videos of pottery being thrown, to bird calls, to statistical tables) out there we can’t get to—or even get an indication that we ought to want to get to. We rely mostly on links to find as much as we do, and those rely on people making the decision to hand-craft them. But we don’t have the time, or the inclination, to make explicit and machine-readable all the useful associations that lend meaning to what we encounter in our lives. So the web remains oddly thin—it consists of the few things that are both easy enough and important enough to a few of our fellows to get represented on the net. It is their overwhelming number, and the fact that we are all competent in our own special domains, that makes the web so varied and fascinating.

You might think that web search, most notably Google’s, the big success story of the current web, serves as a ready substitute for consciously crafted links. We think Google links us to appropriate pages without human intervention. But we’re not quite right—Google’s underlying set of algorithms, collectively known as “PageRank,” mostly just ranks pages by how many other pages link to them, weights those links by the links that the linking sites themselves receive…and so on. To the extent that web search works, it relies on making use of handmade links. The little fleas algorithm.™ It’s handmade links all the way down.
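For the curious, the core of that recursive ranking idea fits in a few lines. This is a toy sketch of the general principle, not Google’s actual implementation; the page names and link data are invented:

```python
# Toy power-iteration PageRank.  Each round, every page's score flows
# out along its hand-made links -- which is why search ultimately rides
# on links that humans decided to make.

def pagerank(links, damping=0.85, iterations=50):
    """`links` maps each page to the pages it links to.  Returns each
    page's share of the total rank after repeated redistribution."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            for target in outgoing:
                new[target] += damping * rank[page] / len(outgoing)
        rank = new
    return rank

# Three pages: A and C both link to B, so B ends up ranked highest.
ranks = pagerank({"A": ["B"], "B": ["A"], "C": ["B"]})
print(max(ranks, key=ranks.get))   # B
```

Notice that the algorithm never looks at what the pages mean; it only counts and weights the links people made.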

Google was merely the first to effectively repackage human judgment. You’ve heard of Web 2.0? (More) The idea that underpins that widely hyped craze is that you go to your users to supply the content, the meaning, the links. That too is symptomatic of what I’m trying to point to here: the model that relies solely on the web being built by “developers” who are guessing at their users’ needs has reached its limits.

That’s why Web 2.0 is a big deal: The folks designing the web are groping toward a realization of their limits, how to deal with them, and keep the utility of the web growing.

It is against that backdrop that PhotoSynth appears. It represents another path toward a richer web. The technologies it uses have been combined to contextually index images based on their location in the real, physical world. The physical world becomes its own index—one that exists independently of hand-crafted links. Both Google and Yahoo have been looking for a way to harness “localism,” recognizing that they are missing a lot of what is important to users by not being able to locate places, events, and things that are close to the user’s physical location.

The new “physical index” would quickly become intertwined with the meaning-based web we have developed. Every photo that you own would, once correlated with the PhotoSynth image, “inherit” all the tags and links embedded in all the other imagery there or nearby. More and more photos are tagged with metadata, and sites like Flickr allow you to annotate elements of a photograph (as does PhotoSynth). The tags and links represented tie back into the already established web of hand-crafted links and knit them together in new ways. And it potentially goes further: image formats typically already support time stamps, and often a time stamp is registered in a digital photograph’s metadata even when the user is unaware of it. Though I’ve not seen any sign that PhotoSynth makes use of time data, it would clearly be almost trivial to add that functionality. And that would add an automatic “time index” to the mix. So if you wanted to see pictures of the Vatican in every season you could…or view images stretching back to antiquity.

It’s easy to fantasize about how place, time, and meaning-based linking might work together. Let’s suppose you stumble across a nifty picture of an African Dance troupe. Metadata links that to a date and location—Lafayette in April of 2005. A user tag associated with the picture is “Festival International.” From there you get to the Festival International de Louisiane website. You pull up—effectively create—a 3-D image of the Downtown venue recreated from photos centered on the stage 50 feet from where the metadata says the picture was taken. A bit of exploration in the area finds Don’s Seafood, the Louisiana Crafts Guild, a nifty fireman’s statue, a fountain (with an amazing number of available photos) and another stage. That stage has a lot of associations with “Zydeco” and “Cajun” and “Creole.” You find yourself virtually at the old “El Sido’s,” get a look at the neighborhood and begin to wonder about the connections between place, poverty, culture, and music….

The technologies used in PhotoSynth are not new or unique. Putting them all together is—and doing so potentially points the way toward a very powerful means of enhancing the web and making it more powerfully local.

Worth thinking about on a quiet Sunday afternoon.

Lots o’ Lagniappe:

TED talk Video — uses a Flickr data set to illustrate how the program can scoop up any imagery. This was the first reference I came across.

Photo Tourism Video — Explains the basics, using the photo tourism interface. Shows the annotation feature of the program…

Roots of PhotoSynth Research Video—an interesting background bit…Seadragon, stitching & virtual tourist, 3D extraction….

PhotoSynth on the Web Video: a version of the program is shown running in a web browser; available only to users running recent Microsoft software. (Web Site)

Microsoft Interactive Visual Media Group Site. Several of these projects look very interesting—and you can see how some of the technologies deployed in PhotoSynth have been used in other contexts.

Microsoft PhotoSynth Homepage

AT&T’s $10 deal: Is it real?

I recently covered a $10 DSL deal and a promise of a $20 “naked DSL” plan from AT&T that ought to be available locally. (It’s got some preconditions; see my post.) David Isenberg dug a little harder and made the case that these plans weren’t actually being “offered” in any real sense of the term. Now he (and I!) would like to know if anyone out there is actually getting this deal.

Background: AT&T agreed, as a condition of its merger with BellSouth, to offer these deals. As we in Lafayette know, the phone company doesn’t necessarily keep its word, and this appears to be a case where the company is skirting pretty close to simply breaking the law. I dug around a bit for this post and discovered yet another very real obligation on Ars Technica: AT&T is supposed to make broadband available to EVERYONE in its footprint; it promised to provide at least 85% of its customers with DSL and to tie the last 15% in with satellite or WiMax. So if they tell you they can’t provision you — ask again!

The question of AT&T keeping its word comes up following a small internet furor over a story popularized on Engadget about a fellow who had a real hassle getting the $10 deal from AT&T. I had similar issues when I tried to see how real the offer was locally.

Has anyone out there tried? What’s your story? Were you successful? Did you eventually give up trying and go for the “good” (but more expensive) “deal” that was easier to get?

I’d love to hear from you in the comments here or via email at John2_AT_lafayetteprofiber_DOT_com

Thoughts on Killer Apps and Community

I’ve been chewing over an informal speech/meeting Monday evening with Geoff Daily of KillerApp, from which I came away pretty impressed. He was speaking on what drives broadband usage—especially usage of high-capacity fiber networks. Daily actually gets it—he’s not so distracted by the technology itself that he doesn’t see that something more is necessary to create real change.

Daily was in town at the behest of Abigail Ransonet (aka fiberina and mistress of Abacus Marketing), who is hosting him here. Geoff, who is on a “tour” of communities with significant fiber-to-the-home networks, is visiting Lafayette with the dual purpose of seeing what we are doing (or planning to do) with our fiber and informing us about what others have done.

What impressed me was that Geoff didn’t succumb to the implications of the name of the business for which he works—nor the mindset that is so popular that the name was an obvious choice for a business focused on broadband. He doesn’t think there is going to be a “KillerApp” that drives full utilization of fiber networks and leads to broadband utopia.

What Daily pointed out Monday was that most of the applications that people expect will drive broadband usage already exist. Some of them don’t really require big broadband if only a few people are using them—and only a few people are. Those that do require a big pipe don’t appear to be widely adopted where the bandwidth is available. The missing element is adoption. Waiting for “the killer app” is just a way of putting off the real work: preparing the community to make use of the many advantages which fiber’s big bandwidth makes available.

Without community education—and providing a way for that education to occur—networks may be fiscally successful. But they will not realize the dreams of their advocates to provide a foundation for accelerated growth, equity, and a markedly better quality of life for citizens.

The “build it and they will come” assumption is insufficient to those goals. Building a community-owned fiber network is, I believe, a necessary precondition to realizing such dreams. Privately-owned networks will never be motivated to serve the needs of the community except indirectly. If any community hopes to get ahead of the curve or to simply control its own destiny, it must own its own tools. That’s true for carpenters and that’s true for cities. Lafayette did the right thing in building its own network. But Geoff Daily reminds us that this is only the beginning. (Check out his blog at KillerApp for relevant ideas.)

Daily pointed to the Utopia project in Utah as one that appeared to him to be built on the “build it and they will come” assumption. In truth, as Daily probably realizes, this attitude was pretty much forced on them by their statehouse: the state of Utah would only allow local communities to build the networks the private providers refused to build if they leased them out to private service providers. In consequence the Utopia project is not, and cannot be, “utopian” in any real sense. The citizens who own the network and have taken the risk in providing it will find themselves with services typical of any private network, since what motivates their providers will be no different from what motivates anyone else’s.

That is better than not having such services at all, I’ll grant, but that is not what Lafayette voted for—we voted for the dream.

One point was unmistakable: Geoff Daily wants that dream too. He wants to see the technology lead to better things for communities and their residents. That leads him to think that we need a visionary success in at least one community to kickstart nation-wide usage. The country needs to see a place where an advanced network accelerates growth, decreases inequality, and results in a markedly better quality of life for all its citizens.

I nominate Lafayette.

But, as Geoff’s presentation and the following discussion made clear, it won’t happen by itself. The only way it will happen is if LCG, LUS, and the community decide to make it happen.